US20040169620A1 - Method of providing images to a person with movement disorder - Google Patents
- Publication number
- US20040169620A1 (Application US10/790,735)
- Authority
- US
- United States
- Prior art keywords
- image
- person
- signals
- movement
- movements
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1101—Detecting tremor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4082—Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/486—Bio-feedback
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6814—Head
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/7445—Display arrangements, e.g. multiple display units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/10—Athletes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
Definitions
- the present invention relates to a closed-loop augmented reality system for assisting people with motion disorders.
- Certain neurological disorders, such as those associated with Parkinson's Disease (PD), are known to cause both motor and visual impairments. These impairments may include tremor, motor fluctuations, and involuntary arm, leg and head movements. In addition, patients with these disorders may have trouble initiating and sustaining movement. Although people with such disorders receive distorted visual feedback, they are even more dependent on such feedback than healthy people.
- U.S. Pat. No. 5,597,309 describes a method for stimulating and sustaining ambulation in Parkinson's patients by creating virtual visual cues. The method, however, is based only on open-loop visual cue presentation, wherein initiating and sustaining cues are given at predetermined speeds using an image-generating device.
- an apparatus for adaptive image generation includes at least one non-radiating sensor, mountable on a body, for detecting body movements and producing signals related to the body movements, and a processor configured to receive the signals and generate an image, wherein the generated image is adapted according to the detected body movements.
- the processor may include a filtering unit for filtering noise from the received signals, the unit having an adaptive filtering element, and an image generator for providing the generated and adapted images from the filtered and received signals.
- the filtering unit may include linear elements and non-linear elements, and may be a neural network.
- the non-radiating sensor is an accelerometer. There may be two sensors for producing signals related to movement of a head and body.
- the generated image may include a geometric pattern, such as a tiled floor or parallel stripes, or it may include a view from real life.
- an apparatus for augmented reality which includes at least one sensor mountable on at least one part of a body for producing signals from movements of a body part, and a processor for adapting an augmented image based only on the produced signals.
- a system for adaptive augmented or virtual reality which includes at least one non-radiating sensor, mountable on at least one part of a body, for detecting body movements and producing signals related to the body movements, a processor configured to receive the signals and generate an image which is adapted according to the detected body movements, and a display for displaying the generated and adapted images.
- the system provides closed-loop biofeedback for adaptation of body movements.
- an apparatus for treating a movement disorder includes at least one sensor, mountable on a body, for detecting body movements and producing signals related to the body movements, and a processor configured to receive the signals and generate an image, wherein the generated image is adapted according to the detected body movements.
- a system and method for reducing involuntary movement artifacts from a signal including a voluntary movement processor for filtering a voluntary movement signal representative of a voluntary movement having involuntary movements therein, an adaptive involuntary movement processor for adaptively filtering a vertical motion signal, and a subtractor for subtracting the involuntary movements from the voluntary movement signal to produce a reduced artifact signal.
- the adaptive involuntary movement processor adapts its processing using the reduced artifact signal.
- Involuntary movement may include tremor or other unwanted movements.
- Voluntary movement may include walking or other full body movements such as turning, running, etc.
- a method for interaction of an image with body movement including the steps of providing an image to a person, receiving signals related to movements of the person, adapting the image according to the received signals, and providing the adapted image to the person, wherein the adapted image enables the person to adjust body movements.
- the steps may be performed repeatedly so as to provide continuous assistance of body movement.
- the image may be virtual or augmented.
- Interaction may include therapy, recreational activities (sports, sex, etc.) or physical assistance.
- a method for treating a movement disorder including the steps of providing an image to a person, receiving at least one signal from the person, filtering unwanted noise from the signal, adapting the image based on the received and filtered signal, and providing the adapted image to the person, wherein the adapted image enables the person to adjust body movements.
- There may be, for example, two signals received from the person: one from the head and one from the body. The step of filtering may be accomplished using a filtering unit having an adaptive filtering element.
- the method may also include the step of measuring a walking parameter.
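The treatment method above (provide an image, receive a signal, filter noise, adapt the image, repeat) can be sketched as a simple control loop. This is an illustrative sketch only; the function names (`smooth`, `adapt_image`, `closed_loop`), the exponential smoother standing in for the adaptive filter, and the time step are assumptions, not part of the patent:

```python
def smooth(samples, alpha=0.5):
    """Toy exponential smoother standing in for the adaptive filtering unit."""
    out, prev = [], 0.0
    for s in samples:
        prev = alpha * s + (1.0 - alpha) * prev  # filter unwanted noise
        out.append(prev)
    return out

def adapt_image(floor_offset, velocity, dt=0.1):
    """Adapt the image: move the virtual floor opposite to the user's motion."""
    return floor_offset - velocity * dt

def closed_loop(velocity_samples, dt=0.1):
    """Repeatedly: receive signal -> filter -> adapt image -> display."""
    offset = 0.0
    for v in smooth(velocity_samples):
        offset = adapt_image(offset, v, dt)  # the adapted image is displayed
    return offset
```

Performing the steps repeatedly, as the text notes, is what makes the assistance continuous: each displayed frame reflects the user's most recent filtered movement.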
- FIGS. 1A and 1B are schematic illustrations of a user wearing one embodiment of the present invention;
- FIGS. 2A and 2B are illustrations of images viewed by the user of FIGS. 1A and 1B;
- FIG. 3 is a block diagram illustration of a processor;
- FIG. 4 is a block diagram illustration of one component of the processor of FIG. 3 in greater detail;
- FIG. 5 is a block diagram illustration of another component of the processor of FIG. 3 in greater detail;
- FIG. 6 is a block diagram illustration of open-loop and closed-loop control; and
- FIG. 7 is a table showing results from tests performed using one embodiment of the present invention.
- the proposed invention creates an adaptive augmented reality of motion over a virtual image, such as a tiled floor.
- the system is portable, and can be used for a variety of therapeutic, healing, assistive, or recreational activities. It uses non-radiating sensors, such as accelerometers, which directly measure movements of the body. It is particularly useful for treating diseases with motion impairment, such as Parkinson's Disease (PD), by providing closed-loop biofeedback for gait initiation, sustainment and stabilization.
- FIGS. 1A and 1B illustrate one embodiment of the system on a user's body.
- FIG. 1A shows an overview of the entire system, and FIG. 1B shows a detailed view of a portion of the system.
- The adaptive augmented reality apparatus, generally referenced 50 , comprises a head-mounted assembly 52 and a body-mounted assembly 54 .
- Head-mounted assembly 52 comprises a sensor 60 A and a display 64 .
- Glasses 40 may be standard eyeglasses, with display 64 and sensor 60 A attached by, for example, clips.
- Sensors 60 A and 60 B are non-radiating sensors, such as accelerometers. Other types of non-radiating sensors may be used as well.
- Display 64 overlays a portion of one lens of glasses 40 , protruding out approximately 1 cm past the lens plane, as shown in FIG. 1B.
- Display 64 is a small (for example, 1 cm × 1 cm) piece, situated directly in front of one eye 41 . In this way, display 64 is close enough to eye 41 to allow the user to see a full-view image on display 64 without obscuring any view of the surroundings.
- Display 64 may be, for example, a liquid crystal display (LCD).
- integrated eyeglasses may be used, where display 64 is already incorporated within glasses 40 .
- Such integrated glasses are available from, for example, i-glasses LC Model #500881, i-O Display Systems, LLC, Menlo Park, Calif. USA; or The MicroOptical Corporation, Westwood, Mass. USA.
- Display 64 , whether located internally or externally to glasses 40 , is equipped with VGA or video connectors (not shown).
- Sensor 60 A is, for example, a tilt sensor such as Model #CXTILT02E or Model #CXTA02, available from Crossbow Technology, Inc., San Jose, Calif. USA, used to measure tilt of the head with respect to the vertical.
- sensor 60 A may be a sensor that can detect other movements as well as head tilt, such as a 3-axis accelerometer.
- Head-mounted assembly 52 is connected to body-mounted assembly 54 by wires 42 , as shown in FIG. 1A.
- Body-mounted assembly 54 comprises a processor 62 and a 3-axis accelerometer 60 B, for example, translational accelerometer Model #CXL04M3 (Crossbow Technology, Inc., San Jose, Calif. USA).
- Body-mounted assembly 54 may be configured in a box of a reasonable size for a person to wear, for example, but not limited to, one having dimensions 7 × 12 × 3 cm. Body-mounted assembly 54 is preferably attached to a belt, but may be connected to the body in any number of ways such as by a chest strap, adhesive, or other connecting device.
- FIGS. 2A and 2B show examples of images viewed by the user while wearing system 50 .
- the image displayed in FIG. 2A is adapted during movement and shown in FIG. 2B.
- FIG. 2A shows a virtual tiled floor image as displayed to the user during normal walk.
- the floor moves as the user walks, in an opposite direction as depicted by arrow 43 , to simulate a real floor as it appears to someone walking. If the user stumbles or falls forward, an image such as the one shown in FIG. 2B is displayed to the user, to simulate the actual view of a real tiled floor during stumble or fall.
- the image is continuously adapted to the motions of the user to create more realistic dynamics of the virtual world viewed by the patient. Consequently, the virtual floor viewed by the user moves only during actual motion, at a rate equal to this motion, as in real life. This enables the user to walk at his or her own pace, which may be variable and governed by individual motion characteristics.
- the tiles as shown in FIG. 2B expand as the user looks down or stumbles, and contract as he picks up his head and looks farther away, as in real life.
- Inner arrows 45 indicate directions of movement of the edges of the tiles in response to the falling motion.
- the floor expands while still in motion.
- when the user turns, the image turns in the other direction, as in real life. This feature is of particular importance for PD patients since a high number of these patients experience considerable difficulties turning around.
- the image is not restricted to tiled floors, and may include other geometric patterns, such as parallel stripes.
- other images may be generated, such as views from real life (e.g., outdoors in a park or the like).
- the image may be a virtual image, in which the outside world is blocked out, or it may be an augmented image, in which the image is superimposed onto the person's view of the real world.
- Processor 62 may be a wearable computer or a microprocessor. Input data to processor 62 is obtained from sensors 60 A and 60 B at input ports 74 A and 74 B, respectively, and output data from processor 62 is sent to display 64 through output port 72 .
- Signals may be, but are not limited to, proportional direct current (DC), and indicate some motion parameter.
- signals may contain acceleration data that is later converted to velocity data.
- signals may relate to an angle of head tilt, or other body movements.
- Signals from processor 62 to display 64 may be analog video signals, for example, PAL or NTSC, or they may be digital (e.g. VGA) signals. Conversion from analog to digital (A/D) or from digital to analog (D/A) may either be performed within processor 62 , or external to processor 62 using a converter.
- Processor 62 includes at least two components: a filtering unit 48 , and an image generator 40 .
- Filtering unit 48 filters signals received at input port 74 B from sensor 60 B. Signals from sensor 60 A relating to movements other than head tilt may be filtered as well, as shown by dashed lines. Filtering eliminates unwanted components from the sensor signals, such as tremor, motor fluctuations and involuntary arm, leg and head movements, as described in further detail below.
- Image generator 40 then incorporates filtered data, as well as signals received directly from sensor 60 A at input port 74 A, and translates the received and filtered proportional signals into rates and degrees of motion of the displayed virtual floor. Image generator 40 then adapts the base image (such as the one shown in FIG. 2A) according to the generated rate and degree of motion information. Adapted images are sent through output port 72 to display 64 to be viewed by the user.
- FIG. 4 is a block diagram illustration of a filtering component 45 of filtering unit 48 , used for filtering tremor and other unwanted motions.
- Each filtering component 45 in filtering unit 48 is used for filtering signals related to motion in one particular axis or direction.
- filtering unit 48 may have one or several filtering components 45 , depending on the number of axes of movement being measured.
- noisy sensor data are generally cleaned by filtering. Signals relating to vertical movement (up/down), representing tremor and other involuntary movements, are then subtracted from signals relating to translational movement (forward/back or side/side) or other voluntary movements. In this way, both signal noise and unwanted motions such as tremor are filtered out.
- Filtering unit 48 has an upper path 47 and a lower path 49 .
- Upper path 47 is used for cleaning signals from voluntary movement. This may include translational, rotational, or other movements, which may be measured, for example, using a 3-axis accelerometer.
- Lower path 49 is used for eliminating tremor and involuntary movement, based on receipt of vertical (up/down) movements. Vertical movements may also be obtained from a 3-axis accelerometer, or by other measuring means.
- in upper path 47 , a linear filtering element 76 is used to clean signals from a voluntary movement in one axis, for example, forward acceleration.
- an adaptive linear filtering element 77 is used in lower path 49 .
- Adaptive linear filtering element 77 is, for example, 5-dimensional, and is similar to the linear adaptive noise canceller proposed by Widrow B. and Winter R. in “Neural nets for adaptive filtering and adaptive pattern recognition”, Computer 21(3): p. 25, 1988, incorporated herein by reference in its entirety.
- the b k are variable weights. K was taken to be 5, but can be any number.
- Linear filter 76 and adaptive linear filtering element 77 both feed into sigmoidal elements 78 .
- output y 2 (i) from adaptive linear filtering element 77 is subtracted from output y 1 (i) from linear filtering element 76 to obtain a final output r(i). Weights b k in adaptive linear filter 77 are then adjusted so as to minimize the squared final output r 2 (i).
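The two-path structure can be sketched as a least-mean-squares (LMS) adaptive noise canceller in the style of Widrow and Winter: the vertical (tremor) reference drives a K-tap adaptive filter whose output y2(i) is subtracted from the voluntary-movement signal y1(i), and the weights b k are nudged to minimize the squared residual r²(i). This is a hedged sketch under stated assumptions: the sigmoidal elements 78 are omitted, the step size `mu` and tap alignment are illustrative, and the function name is hypothetical.

```python
import math

def lms_canceller(primary, reference, K=5, mu=0.02):
    """Adaptive noise canceller sketch after Widrow & Winter (1988):
    a K-tap adaptive linear filter (weights b_k) is driven by the
    vertical/tremor reference; its output y2(i) is subtracted from the
    voluntary-movement signal y1(i) to give the residual r(i), and the
    b_k are adapted by LMS to minimise r^2(i). K = 5 follows the text;
    mu and the zero-padded tap alignment are assumptions."""
    b = [0.0] * K                    # the variable weights b_k
    cleaned = []
    for i in range(len(primary)):
        # most recent K reference samples (zero-padded at the start)
        x = [reference[i - k] if i - k >= 0 else 0.0 for k in range(K)]
        y2 = sum(bk * xk for bk, xk in zip(b, x))   # lower-path output
        r = primary[i] - y2          # reduced-artifact (cleaned) output
        b = [bk + 2.0 * mu * r * xk for bk, xk in zip(b, x)]  # minimise r^2
        cleaned.append(r)
    return cleaned
```

Because a linear filter driven only by the tremor reference can reproduce only tremor-like components, minimizing the residual power removes the tremor leakage while leaving the voluntary movement largely intact, which is how the unit "learns" the user's motions on-line.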
- filtering unit 48 “learns” the user's motions.
- Filtering unit 48 may be considered a neural network.
- Each axis of movement uses its own filtering component 45 .
- the cleaned signal is sent from filtering unit 48 to image generator 40 .
- image generator 40 may simultaneously obtain multiple filtered signals from filtering unit 48 , as well as signals directly from sensor 60 A, such as a tilt sensor.
- FIG. 5 is a block diagram illustration of image generator 40 , used for creating images and adapting the images based on received filtered data.
- an initial image 80 of a tiled floor, or another image, is created using an imaging software package (OpenGL™, Silicon Graphics, Inc., Mountain View, Calif. USA).
- Data from sensors, which may be filtered or unfiltered, are fed into image generator 40 , and are used to make corresponding proportional changes in floor angle and speed of movement of image 80 , resulting in an updated image 80 ′, also provided by the imaging software.
- the filtered acceleration signals are converted into rate of motion data within image generator 40 , typically using an integrator.
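Such an integrator can be sketched as trapezoidal accumulation of the filtered acceleration samples. The patent does not specify the integration scheme, so the trapezoidal rule, the function name, and the fixed time step are assumptions:

```python
def accel_to_velocity(accel, dt):
    """Convert filtered acceleration samples into rate-of-motion
    (velocity) data by trapezoidal integration over time step dt."""
    v, velocity = 0.0, []
    for prev, cur in zip(accel, accel[1:]):
        v += 0.5 * (prev + cur) * dt  # area under one sample interval
        velocity.append(v)
    return velocity
```

For instance, one second of constant 1 m/s² acceleration sampled at 10 Hz integrates to a velocity of 1 m/s.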
- the tilt angle received from sensor 60 A is translated into an inclination angle of the virtual tiled floor so as to create a realistic view of the floor. Tripping or falling motions result in larger angles, and are translated into a proportional outward expansion of image 80 , as in real-life vision.
- Sensors 60 A and 60 B may also detect turning motions, which are translated into counter-turning motions of the virtual floor.
- the rates of motion of the virtual tiled floor are the same as the rates of body motion of the user, occurring in opposite directions so as to create the sensation of a floor fixed in space.
- the tilt of the virtual floor is the same as that of the user's head, as measured by head-mounted sensor 60 A. Parameters such as tile size, color and intensity of the virtual floor are adjustable.
- Because of filtering unit 48 , a forward motion of the tiled floor will not be triggered by leg tremor, and expansion of tile images, indicating a stumble or a fall, will not be caused by head tremor. Learning and filtering are performed on-line, as the patient's dynamic characteristics keep changing in time.
- a prototype of the proposed invention has been developed and systematically tested on PD patients supervised by a team of medical doctors in the Movement Disorders Clinic at the Cognitive Neurology Department of the RAMBAM Medical Center in Haifa, Israel.
- FIG. 6 illustrates the concept of open-loop versus closed-loop control.
- an image generator 40 produces a display 64 for a user 44 to see. User 44 may then react to display 64 , and voluntarily begin to move. This, however, has no effect on image generator 40 .
- the motion of user 44 is sensed by motion sensors 60 , which send signals related to this motion through a filtering unit 48 and back to image generator 40 .
- the closed-loop system incorporates signals from motion sensors 60 into display 64 .
- FIG. 7 is a table showing details about the subjects who participated in the study, and the results obtained with the display off, with open-loop display, and with closed-loop display.
- In the open-loop display tests, no sensors were activated on the subject for measuring movements, resulting in an image displayed at a predetermined speed towards the observer.
- Speed and stride length are listed for each test per subject, and the final two columns list a percentage change for the tested parameters.
- Each test consisted of a subject walking a stretch of 10 meters 4 times. Only results from the last two out of four tests in each category were used, to eliminate the effect of training. At the start of each test, the subject was verbally instructed to start walking. The length of time and the number of steps to completion of the 10-meter stretch were recorded for each test. Speed in meters/second (m/s) and stride length in meters (m) were calculated. In the first test (the reference test) the display was turned off. In the second, the open-loop system was turned on, displaying a virtual tiled floor in perpetual motion towards the observer at the maximal speed level comfortable for the subject. The third test employed the adaptive closed-loop system. The order of the second and the third tests was then reversed and results were averaged, in order to eliminate the effect of training from the comparison.
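The derived quantities in the table can be reproduced from the raw recordings. Here speed is distance divided by time, stride length is taken as distance divided by the number of steps, and the percentage-change columns compare each tested parameter against the display-off reference; the stride-length formula is an assumption about the study's definition, since the patent gives only the recorded quantities:

```python
def gait_parameters(distance_m, time_s, steps):
    """Speed (m/s) and stride length (m) from one 10-meter walk:
    speed = distance / time; stride length = distance / steps
    (assumed definition; the patent does not state the formula)."""
    return distance_m / time_s, distance_m / steps

def percent_change(reference, tested):
    """Percentage change of a tested parameter versus the reference test."""
    return 100.0 * (tested - reference) / reference
```

For example, a subject covering the 10-meter stretch in 20 s with 25 steps walks at 0.5 m/s with a 0.4 m stride.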
- the proposed approach may make it possible to reduce medication and postpone surgical intervention.
- the proposed invention may be useful as treatment, as therapy, or as an assistive device.
Abstract
A method for interaction of an image with full body movement, including the steps of providing an image to a person with a movement disorder from a display device mounted on such person's body, receiving signals related to voluntary and involuntary movements of such person's body, such signals received by a receiving device mounted on such person's body, adapting such image according to such received signals, and providing such adapted image to such person where such adapted image enables such person to adjust such body movement.
Description
- This application is a continuation of prior U.S. application Ser. No. 09/631,292, filed on Aug. 2, 2000, entitled “Closed Loop Augmented Reality Apparatus”, and claims the benefit of U.S. Provisional Patent Application 60/182,026, filed on Feb. 11, 2000, which is incorporated in its entirety by reference herein.
- Reference is now made to FIGS. 1A and 1B, which illustrate one embodiment of the system on a user's body. FIG. 1A shows an overview of the entire system, and FIG. 1B shows a detailed view of a portion of the system. The adaptive augmented reality apparatus, generally referenced50, is portable and generally self-contained, and comprises a head-mounted
assembly 52 and a body-mountedassembly 54. Head-mountedassembly 52, comprising asensor 60A and adisplay 64, is attached to a pair ofglasses 40.Glasses 40 may be standard eyeglasses, withdisplay 64 andsensor 60A attached by, for example, clips.Sensors -
Display 64 overlays a portion of one lens ofglasses 40, protruding out approximately 1 cm past the lens plane, as shown in FIG. 1B.Display 64 is a small (for example, 1 cm×1 cm) piece, situated directly in front of one eye 41. In this way,display 64 is close enough to eye 41 to allow the user to see a full view image ondisplay 64 without obscuring any view of the surroundings.Display 64 may be, for example, a liquid crystal display (LCD). - Alternatively, integrated eyeglasses may be used, where
display 64 is already incorporated withinglasses 40. Such integrated glasses are available from, for example, i-glasses LC Model #500881, i-O Display Systems, LLC, Menlo Park, Calif. USA; or The MicroOptical Corporation, Westwood, Mass. USA.Display 64, whether located internally or externally toglasses 40, is equipped with VGA or video connectors (not shown). -
Sensor 60A is, for example, a tilt sensor such as Model #CXTILT02E or Model #CXTA02, available from Crossbow Technology, Inc., San Jose, Calif. USA, used to measure tilt of the head with respect to the vertical. Alternatively,sensor 60A may be a sensor that can detect other movements as well as head tilt, such as a 3-axis accelerometer. - Head-mounted
assembly 52 is connected to body-mountedassembly 54 bywires 42, as shown in FIG. 1A. Alternatively, wireless connection is possible as well. Body-mountedassembly 54 comprises aprocessor 62 and a 3-axis accelerometer 60B, for example, translational accelerometer Model #CXL04M3 (Crossbow Technology, Inc., San Jose, Calif. USA). - Body-mounted
assembly 54 may be configured in a box of a reasonable size for a person to wear, for example, but not limited to, one having dimensions 7×12×3 cm. Body-mountedassembly 54 is preferably attached to a belt, but may be connected to the body in any number of ways such as by a chest strap, adhesive, or other connecting device. - Reference is now made to FIGS. 2A and 2B, which show examples of images viewed by the user while wearing
system 50. The image displayed in FIG. 2A is adapted during movement and shown in FIG. 2B. FIG. 2A shows a virtual tiled floor image as displayed to the user during a normal walk. The floor moves as the user walks, in the opposite direction, as depicted by arrow 43, to simulate a real floor as it appears to someone walking. If the user stumbles or falls forward, an image such as the one shown in FIG. 2B is displayed to the user, to simulate the actual view of a real tiled floor during a stumble or fall. The image is continuously adapted to the motions of the user to create more realistic dynamics of the virtual world viewed by the patient. Consequently, the virtual floor viewed by the user moves only during actual motion, at a rate equal to this motion, as in real life. This enables the user to walk at his or her own pace, which may be variable and governed by individual motion characteristics. - Thus, the tiles as shown in FIG. 2B expand as the user looks down or stumbles, and contract as he picks up his head and looks farther away, as in real life.
Inner arrows 45 indicate directions of movement of the edges of the tiles in response to the falling motion. Thus, the floor expands while still in motion. Furthermore, as the user turns around, the image turns in the other direction, as in real life. This feature is of particular importance for PD patients, since a high number of these patients experience considerable difficulty turning around. These real-life effects give the patient needed biofeedback signals for stabilization and safer motion. - It will be appreciated that the image is not restricted to tiled floors, and may include other geometric patterns, such as parallel stripes. In addition, other images may be generated, such as views from real life (e.g., outdoors in a park or the like). The image may be a virtual image, in which the outside world is blocked out, or it may be an augmented image, in which the image is superimposed onto the person's view of the real world.
- Reference is now made to FIG. 3, which shows details of
processor 62 located within body-mounted assembly 54. Processor 62 may be a wearable computer or a microprocessor. Input data to processor 62 is obtained from sensors 60A and 60B through input ports 74A and 74B; output data from processor 62 is sent to display 64 through output port 72. - Signals may be, but are not limited to, proportional direct current (DC), and indicate some motion parameter. For example, signals may contain acceleration data that is later converted to velocity data. Alternatively, signals may relate to an angle of head tilt, or to other body movements. Signals from processor 62 to display 64 may be analog video signals, for example, PAL or NTSC, or they may be digital (e.g., VGA) signals. Conversion from analog to digital (A/D) or from digital to analog (D/A) may be performed either within processor 62 or external to processor 62 using a converter. -
Processor 62 includes at least two components: a filtering unit 48 and an image generator 40. Filtering unit 48 filters signals received at input port 74B from sensor 60B. Signals from sensor 60A relating to movements other than head tilt may be filtered as well, as shown by dashed lines. Filtering eliminates unwanted components from the sensor signals, such as tremor, motor fluctuations and involuntary arm, leg and head movements, as described in further detail below. Image generator 40 then incorporates the filtered data, as well as signals received directly from sensor 60A at input port 74A, and translates the received and filtered proportional signals into rates and degrees of motion of the displayed virtual floor. Image generator 40 then adapts the base image (such as the one shown in FIG. 2A) according to the generated rate and degree of motion information. Adapted images are sent through output port 72 to display 64 to be viewed by the user. - Reference is now made to FIG. 4, which is a block diagram illustration of a
filtering component 45 of filtering unit 48, used for filtering tremor and other unwanted motions. Each filtering component 45 in filtering unit 48 is used for filtering signals related to motion in one particular axis or direction. Thus, filtering unit 48 may have one or several filtering components 45, depending on the number of axes of movement being measured. - First, noisy sensor data are generally cleaned by filtering. Signals relating to vertical movement (up/down), representing tremor and other involuntary movements, are then subtracted from signals relating to translational movement (forward/back or side/side) or other voluntary movements. In this way, both noise from the signals and unwanted motions and tremor are filtered out.
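The two-path cancellation scheme just described, together with the FIR elements, tanh saturation and squared-output minimization detailed in the following paragraphs, can be sketched in a few lines of Python. The tap count K = 5 follows the text; the moving-average choice for the fixed weights ak and the learning-rate value mu are illustrative assumptions, not values from the patent.

```python
import math

class FilteringComponent:
    """One-axis sketch of filtering component 45: a fixed FIR filter on the
    voluntary-movement signal (upper path), an adaptively weighted FIR filter
    on the vertical tremor signal (lower path), tanh saturation, and a
    subtractor whose output r(i) drives the weight adaptation."""

    def __init__(self, K=5, mu=0.01):
        self.a = [1.0 / K] * K   # fixed weights a_k (assumed: moving average)
        self.b = [0.0] * K       # variable weights b_k, adapted on-line
        self.v1 = [0.0] * K      # delay line holding v1(i), v1(i-1), ...
        self.v2 = [0.0] * K      # delay line for the vertical signal v2
        self.mu = mu             # assumed learning rate

    def step(self, v1_i, v2_i):
        self.v1 = [v1_i] + self.v1[:-1]
        self.v2 = [v2_i] + self.v2[:-1]
        x1 = sum(a * v for a, v in zip(self.a, self.v1))  # x1(i) = sum a_k v1(i-k)
        x2 = sum(b * v for b, v in zip(self.b, self.v2))  # x2(i) = sum b_k v2(i-k)
        y1, y2 = math.tanh(x1), math.tanh(x2)             # sigmoidal elements
        r = y1 - y2                                       # reduced-artifact output
        # Gradient step on r^2 with respect to each b_k (LMS-style update).
        self.b = [b + 2.0 * self.mu * r * (1.0 - y2 * y2) * v
                  for b, v in zip(self.b, self.v2)]
        return r
```

When the same tremor component appears in both inputs, the lower path gradually learns to cancel it, leaving mainly the voluntary motion in r(i).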
-
Filtering unit 48 has an upper path 47 and a lower path 49. Upper path 47 is used for cleaning signals from voluntary movement. This may include translational, rotational, or other movements, which may be measured, for example, using a 3-axis accelerometer. Lower path 49 is used for eliminating tremor and involuntary movement, based on receipt of vertical (up/down) movements. Vertical movements may also be obtained from a 3-axis accelerometer, or by other measuring means. - In
upper path 47, a linear filtering element 76 is used to clean signals from voluntary movement in one axis, for example, forward acceleration. Output is related to input by the following equation: x1(i) = Σ ak·v1(i−k) for k = 1 . . . K, where v1(i) and x1(i) are the input and output of linear filtering element 76 at time i, respectively, and the ak are fixed weights. - In
lower path 49, an adaptive linear filtering element 77 is used. Adaptive linear filtering element 77 is, for example, 5-dimensional, and is similar to the linear adaptive noise canceller proposed by Widrow B. and Winter R. in "Neural nets for adaptive filtering and adaptive pattern recognition", Computer 21(3): p. 25, 1988, incorporated herein by reference in its entirety. Similar to linear filtering element 76, output is related to input by the following equation: x2(i) = Σ bk·v2(i−k) for k = 1 . . . K, where v2(i) and x2(i) are the input and output of adaptive linear filtering element 77 at time i, respectively. However, as opposed to linear filtering element 76, the bk are variable weights. K was taken to be 5, but can be any number. -
Linear filtering element 76 and adaptive linear filtering element 77 both feed into sigmoidal elements 78. For sigmoidal elements 78, the new outputs y1(i) and y2(i) are related to the inputs x1(i) and x2(i) from linear filtering element 76 and adaptive linear filtering element 77, respectively, by the following equation: yn(i) = tanh(xn(i)) at time i. Since the sigmoidal function is bounded between two predetermined values, the sigmoidal elements attenuate high-amplitude accelerations, which was found to improve performance over the use of linear elements alone. Any combination of linear and sigmoidal elements may be used. For example, the sigmoidal elements may be included in either upper path 47 or lower path 49, or both, or neither. - In
summer 80, output y2(i) from adaptive linear filtering element 77 is subtracted from output y1(i) from linear filtering element 76 to obtain a final output r(i). Weights bk in adaptive linear filtering element 77 are then adjusted so as to minimize the squared final output r²(i). - It should be noted that by adapting the filtering process in this way, filtering
unit 48 "learns" the user's motions. Filtering unit 48 may be considered a neural network. - Each axis of movement (forward/back or side/side, for example) uses its own filtering component 45. For each filtering component 45, the cleaned signal is sent from filtering unit 48 to image generator 40. Thus, image generator 40 may simultaneously obtain multiple filtered signals from filtering unit 48, as well as signals directly from sensor 60A, such as a tilt sensor. - Reference is now made to FIG. 5, which is a block diagram illustration of
image generator 40, used for creating images and adapting the images based on received filtered data. Specifically, an initial image 80 of a tiled floor, or other image, is created using an imaging software package (OpenGL™, Silicon Graphics, Inc., Mountain View, Calif., USA). Data from the sensors, which may be filtered or unfiltered, are fed into image generator 40, and are used to make corresponding proportional changes in the floor angle and speed of movement of image 80, resulting in an updated image 80′, also provided by the imaging software. In the case of acceleration data, the filtered acceleration signals are converted into rate of motion data within image generator 40, typically using an integrator. - Thus, the tilt angle received from
sensor 60A is translated into an inclination angle of the virtual tiled floor so as to create a realistic view of the floor. Tripping or falling motions result in larger angles, and are translated into a proportional outward expansion of image 80, as in real-life vision. -
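A minimal sketch of the integrator step mentioned above, and of the opposite-direction floor motion it drives: filtered forward acceleration is integrated into a rate of motion, and the virtual floor offset is advanced by the same amount in the opposite direction. The frame interval dt and the leaky-integration factor are illustrative assumptions, not values from the text.

```python
class FloorMotion:
    """Sketch of the rate-of-motion bookkeeping inside image generator 40."""

    def __init__(self, dt=0.02, leak=0.995):
        self.dt = dt                 # assumed frame interval in seconds
        self.leak = leak             # mild decay to keep integration drift bounded
        self.velocity = 0.0          # user's forward rate of motion (m/s)
        self.floor_offset = 0.0      # distance the tile pattern has scrolled (m)

    def update(self, accel):
        # Integrate filtered acceleration into velocity (the "integrator").
        self.velocity = self.leak * self.velocity + accel * self.dt
        # The floor moves at the same rate as the user, in the opposite direction.
        self.floor_offset -= self.velocity * self.dt
        return self.floor_offset
```

A stationary user (zero acceleration) therefore sees a stationary floor, while sustained forward acceleration scrolls the tiles towards the observer.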
Sensors - The rates of motion of the virtual tiled floor are the same as the rates of body motion of the user, occurring in opposite directions so as to create the sensation of a floor fixed in space. The tilt of the virtual floor is the same as that of the user's head, as measured by head-mounted sensor 60A. Parameters such as tile size, color and intensity of the virtual floor are adjustable. - Because of
filtering unit 48, a forward motion of the tiled floor will not be triggered by leg tremor, and expansion of the tile images, indicating a stumble or a fall, will not be caused by head tremor. Learning and filtering are performed on-line, as the patient's dynamic characteristics keep changing in time. - The present invention may potentially be used for anything that other virtual reality devices are used for, such as entertainment, industry, science and medicine. The use of accelerometers allows for free movement and is not restricted by location or space. In addition, it allows for adaptation of the image to full body motions. Thus, for example, one embodiment of the invention may include a device that would enable a sport or any other recreational activity (e.g., sexual activity) to be performed with a virtual background scene, outside of an entertainment room, allowing for more body movements. In another embodiment, the device could be connected to the Internet, allowing for direct interaction between patients and doctors or between users. Movement disorders may result from stroke, trauma, PD, or other central nervous system disorders and degenerative diseases, as well as from birth defects or aging.
- A prototype of the proposed invention has been developed and systematically tested on PD patients supervised by a team of medical doctors in the Movement Disorders Clinic at the Cognitive Neurology Department of the RAMBAM Medical Center in Haifa, Israel.
- Reference is now made to FIG. 6, which illustrates the concept of open-loop versus closed-loop control. In an open-loop system, an
image generator 40 produces a display 64 for a user 44 to see. User 44 may then react to display 64, and voluntarily begin to move. This, however, has no effect on image generator 40. In a closed-loop system, the motion of user 44 is sensed by motion sensors 60, which send signals related to this motion through a filtering unit 48 and back to image generator 40. In contrast to the open-loop system, which does not measure or respond to the body motions of user 44, the closed-loop system incorporates signals from motion sensors 60 into display 64. - Reference is now made to FIG. 7, which is a table showing details about the subjects who participated in the study, and the results obtained with the display off, with the open-loop display, and with the closed-loop display. For the open-loop display, no sensors were activated on the subject for measuring movements, resulting in an image displayed at a predetermined speed towards the observer. Speed and stride length are listed for each test per subject, and the final two columns list the percentage change for the tested parameters.
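The contrast drawn in FIG. 6 can be illustrated with a small simulation: the open-loop floor advances at a preset speed regardless of the user, while the closed-loop floor advances only with sensed motion. The preset speed and frame interval used here are illustrative assumptions.

```python
def simulate(user_velocities, dt=0.02, preset_speed=0.8):
    """Return total floor displacement (m) under open- and closed-loop control
    for a sequence of sensed user velocities (m/s), one per frame."""
    open_disp = closed_disp = 0.0
    for v in user_velocities:
        open_disp += preset_speed * dt   # open loop: display ignores the user
        closed_disp += v * dt            # closed loop: display follows sensed motion
    return open_disp, closed_disp
```

For a user standing still for 100 frames, simulate([0.0] * 100) moves the open-loop floor 1.6 m while the closed-loop floor stays put, which mirrors the perpetual-motion display that some subjects found uncomfortable.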
- Fourteen subjects, all clinically diagnosed with idiopathic PD and treated with Dopaminergic medication, participated in the study. The subjects' initials, ages, number of years having the disease (yd) and disease severity on the Hoehn and Yahr (HY) scale (See Hoehn M M and Yahr M D: “Parkinsonism: onset, progression and mortality.” Neurology 17(5):427-42, 1967) are listed in FIG. 7. All subjects had 20/20 visual acuity, with correction when necessary. The tests were always performed at approximately the same time of day, and either following a 12-hour period without medication, or during the “off” state of the disease, which is characterized by severe immobility.
- Each test consisted of a subject walking a stretch of 10
meters 4 times. Only results from the last two out of four tests in each category were used, to eliminate the effect of training. At the start of each test, the subject was verbally instructed to start walking. The length of time and the number of steps to completion of the 10-meter stretch were recorded for each test. Speed in meters/second (m/s) and stride length in meters (m) were calculated. In the first test (the reference test) the display was turned off. In the second, the open-loop system was turned on, displaying a virtual tiled floor in perpetual motion towards the observer at the maximal speed level comfortable for the subject. The third test employed the adaptive closed-loop system. The order of the second and the third tests was then reversed and results were averaged, in order to eliminate the effect of training from the comparison. - The last two columns of FIG. 7 list the percentage changes in the performance parameters obtained for the closed-loop system with respect to the reference test. It can be seen that, in all cases but one, performance was improved significantly with respect to the reference test when the closed-loop system was turned on (higher speed, longer strides).
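The reported parameters can be derived from the recorded time and step count for each 10-meter test as sketched below; treating stride length as distance divided by the number of steps is an assumption about how the listed values were computed.

```python
def gait_parameters(distance_m, time_s, steps):
    """Speed (m/s) and stride length (m) for one walking test."""
    return distance_m / time_s, distance_m / steps

def percent_change(test_value, reference_value):
    """Percentage change of a test result relative to the reference test."""
    return 100.0 * (test_value - reference_value) / reference_value
```

For example, a subject covering 10 m in 20 s with 25 steps walks at 0.5 m/s with 0.4 m strides; improving to 0.625 m/s would be a 25% speed gain.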
- Qualitative results were noted by the testers as well. Improvement in the quality of the steps was observed. Subjects who dragged their feet on the ground in the reference test raised them noticeably higher when the closed-loop system was turned on. Improvement was particularly dramatic in subjects tested during their "off" phase (JS and NM), characterized by severe immobility. These subjects were severely bradykinetic, unable to stand or start walking on their own. When the closed-loop display was turned on, and the subjects were instructed to watch the display, both subjects were able to start walking unaided. The one subject who did not benefit from the closed-loop system, MR, had no walking impairment; as can be seen from his test parameters, he had the best performance during the reference test.
- Comparison of results for the open-loop system and the closed-loop system shows that the average values are similar. However, the standard deviations for the open-loop system are much higher than for the closed-loop system as well as the reference test. This means that the open-loop system affects different individuals in very different ways. The behaviors of JS and NM are particularly noteworthy in this respect. Both subjects improved their performance parameters with respect to the reference test when the closed-loop system was turned on, and both experienced freezing episodes when the perpetual motion display (open-loop system) was turned on. For both subjects, the performance parameters for the open-loop system are even lower than for the reference test. Some subjects reported discomfort, dizziness and nausea caused by the perpetual floor motion of the open-loop system. Most subjects reported relative comfort with the self-activated, closed-loop adaptive system and indicated a clear preference for it over the open-loop system.
- The last two rows in the table show the average performance of the subject group (excluding MR, who, as noted before, had non-gait-related impairment). It can be seen that, on average, the proposed closed-loop system improves performance by about 25% (speed) or 30% (stride length) with respect to the reference test. It should also be noted, however, that the standard deviations of these results are rather high, which implies that the results should be evaluated mainly on an individual basis. Certain PD patients would be helped by the proposed approach to a very significant degree (50%-100%), while others would be helped to a lesser degree. A few, in particular those without walking impairments, would not be helped at all.
- Similar tests done on non-PD patients, such as stroke victims, have shown similar improvements in the walking abilities of these patients using the apparatus as described hereinabove.
- Our study is the first to show the benefit of augmented reality, adapted to a person's own motion, for gait control in PD patients. In particular, we have shown the advantage of a closed-loop adaptive display of a virtual tiled floor as compared to a previously proposed open-loop, non-adaptive, perpetual virtual motion display. Our experiments have shown that adaptive augmented reality can significantly improve the walking abilities of most PD patients without causing the discomfort and the freezing phenomena associated with the open-loop system.
- Finally, it is important to note that the gait parameters most affected by the proposed approach, namely, speed and stride length, also respond, to a similar extent, to antiparkinson medication (See Pedersen S W, Eriksson T and Oberg B: "Effects of withdrawal of antiparkinson medication on gait and clinical score in the Parkinson patient", Acta Neurol. Scand. 84, 7, 1991) as well as to pallidotomy (brain surgery), as reported by Siegel K L and Metman L V: "Effects of bilateral posteroventral pallidotomy on gait in subjects with Parkinson's disease", Arch. Neurol., 57, 198, 2000. However, medication causes involuntary movements, which disturb gait further.
- The proposed approach may make it possible to reduce medication and postpone surgical intervention. The proposed invention may be useful as treatment, as therapy, or as an assistive device.
- It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather, the scope of the invention is defined by the claims which follow:
Claims (4)
1. A method for interaction of an image with full body movement, the method comprising the steps of:
providing an image to a person with a movement disorder from a display device mounted on said person's body;
receiving signals related to voluntary and involuntary movements of said person, said signals received by a receiving device mounted on said person's body;
adapting said image with a processor mounted on said person's body according to said received signals; and
providing said adapted image to said person on said display device,
wherein said adapted images enable said person to adjust said body movement.
2. A method as in claim 1, wherein said steps are performed repeatedly so as to provide continuous interaction of said image with said body movement.
3. A method as in claim 1, wherein said interaction includes therapy.
4. A method as in claim 1, wherein said interaction includes assistance.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/790,735 US20040169620A1 (en) | 2000-02-11 | 2004-03-03 | Method of providing images to a person with movement disorder |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18202600P | 2000-02-11 | 2000-02-11 | |
US09/631,292 US6734834B1 (en) | 2000-02-11 | 2000-08-02 | Closed-loop augmented reality apparatus |
US10/790,735 US20040169620A1 (en) | 2000-02-11 | 2004-03-03 | Method of providing images to a person with movement disorder |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/631,292 Continuation US6734834B1 (en) | 2000-02-11 | 2000-08-02 | Closed-loop augmented reality apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
US20040169620A1 true US20040169620A1 (en) | 2004-09-02 |
Family
ID=32233001
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/631,292 Expired - Fee Related US6734834B1 (en) | 2000-02-11 | 2000-08-02 | Closed-loop augmented reality apparatus |
US10/790,735 Abandoned US20040169620A1 (en) | 2000-02-11 | 2004-03-03 | Method of providing images to a person with movement disorder |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/631,292 Expired - Fee Related US6734834B1 (en) | 2000-02-11 | 2000-08-02 | Closed-loop augmented reality apparatus |
Country Status (1)
Country | Link |
---|---|
US (2) | US6734834B1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040119662A1 (en) * | 2002-12-19 | 2004-06-24 | Accenture Global Services Gmbh | Arbitrary object tracking in augmented reality applications |
US20060140422A1 (en) * | 2004-12-29 | 2006-06-29 | Zurek Robert A | Apparatus and method for receiving inputs from a user |
US20070118044A1 (en) * | 2005-07-18 | 2007-05-24 | Mega Elektroniikka Oy | Method and device for identifying; measuring and analyzing abnormal neurological responses |
WO2016127005A3 (en) * | 2015-02-04 | 2016-10-27 | Aerendir Mobile Inc. | Local user authentication with neuro and neuro-mechanical fingerprints |
CN106693280A (en) * | 2016-12-29 | 2017-05-24 | 深圳市臻络科技有限公司 | Virtual-reality-based Parkinsonism training method, system and device |
CN110946556A (en) * | 2019-12-27 | 2020-04-03 | 南京信息工程大学 | Parkinson resting state tremor evaluation method based on wearable somatosensory network |
WO2022026296A1 (en) * | 2020-07-29 | 2022-02-03 | Penumbra, Inc. | Tremor detecting and rendering in virtual reality |
US20220034680A1 (en) * | 2020-07-22 | 2022-02-03 | Teseo S.R.L. | Method and system of topological localization in a built environment |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10231628B2 (en) * | 2003-07-02 | 2019-03-19 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Method for measuring movements of a person wearing a portable detector |
FR2856913B1 (en) * | 2003-07-02 | 2005-08-05 | Commissariat Energie Atomique | PORTABLE DETECTOR FOR MEASURING MOVEMENTS OF A CARRIER, AND METHOD. |
US20170329401A1 (en) * | 2004-03-02 | 2017-11-16 | Brian T. Mitchell | Simulated training environments based upon fixated objects in specified regions |
FR2868281B1 (en) * | 2004-03-30 | 2023-06-23 | Commissariat Energie Atomique | METHOD FOR DETERMINING THE MOVEMENTS OF A PERSON. |
JP4686681B2 (en) * | 2004-10-05 | 2011-05-25 | 国立大学法人東京工業大学 | Walking assistance system |
US7453984B2 (en) * | 2006-01-19 | 2008-11-18 | Carestream Health, Inc. | Real-time target confirmation for radiation therapy |
US9217868B2 (en) | 2007-01-12 | 2015-12-22 | Kopin Corporation | Monocular display device |
KR101441873B1 (en) * | 2007-01-12 | 2014-11-04 | 코핀 코포레이션 | Head mounted monocular display device |
EP2203896B1 (en) * | 2007-10-26 | 2019-04-24 | Koninklijke Philips N.V. | Method and system for selecting the viewing configuration of a rendered figure |
US8957835B2 (en) | 2008-09-30 | 2015-02-17 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
US10220311B2 (en) | 2008-10-31 | 2019-03-05 | Gearbox, Llc | System and method for game playing using vestibular stimulation |
US20100112535A1 (en) * | 2008-10-31 | 2010-05-06 | Searete Llc | System and method of altering motions of a user to meet an objective |
US9446308B2 (en) * | 2008-10-31 | 2016-09-20 | Gearbox, Llc | System and method for game playing using vestibular stimulation |
US8608480B2 (en) * | 2008-10-31 | 2013-12-17 | The Invention Science Fund I, Llc | System and method of training by providing motional feedback |
US8838230B2 (en) * | 2008-10-31 | 2014-09-16 | The Invention Science Fund I, Llc | System for altering motional response to music |
US8548581B2 (en) * | 2008-10-31 | 2013-10-01 | The Invention Science Fund I Llc | Adaptive system and method for altering the motion of a person |
FR2979447B1 (en) * | 2011-08-29 | 2015-09-25 | Commissariat Energie Atomique | METHOD FOR CONFIGURING SENSOR DETECTION DEVICE, COMPUTER PROGRAM, AND CORRESPONDING ADAPTIVE DEVICE |
US9078598B2 (en) | 2012-04-19 | 2015-07-14 | Barry J. French | Cognitive function evaluation and rehabilitation methods and systems |
US20140142442A1 (en) | 2012-11-19 | 2014-05-22 | Judy Sibille SNOW | Audio Feedback for Medical Conditions |
US10685487B2 (en) | 2013-03-06 | 2020-06-16 | Qualcomm Incorporated | Disabling augmented reality (AR) devices at speed |
US10061352B1 (en) * | 2017-08-14 | 2018-08-28 | Oculus Vr, Llc | Distributed augmented reality system |
US10799196B2 (en) | 2017-11-20 | 2020-10-13 | General Electric Company | System and method for encouraging patient stillness during imaging |
WO2019171216A1 (en) | 2018-03-07 | 2019-09-12 | Elon Littwitz | Augmented reality device and/or system and/or method for using same for assisting in walking or movement disorders |
GB2585241B (en) * | 2019-07-05 | 2021-12-22 | Strolll Ltd | Augmented reality system |
GB2598749A (en) * | 2020-09-10 | 2022-03-16 | Bae Systems Plc | Method for tracking orientation of an object, tracker system and head or helmet-mounted display |
JP2023545623A (en) * | 2020-09-10 | 2023-10-31 | ビ-エイイ- システムズ パブリック リミテッド カンパニ- | Methods for tracking object orientation, tracker systems, and head- or helmet-mounted displays |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5919149A (en) * | 1996-03-19 | 1999-07-06 | Allum; John H. | Method and apparatus for angular position and velocity based determination of body sway for the diagnosis and rehabilitation of balance and gait disorders |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4906193A (en) * | 1988-07-19 | 1990-03-06 | Mcmullen James | Intrinsic perceptual motor training device |
US5597309A (en) | 1994-03-28 | 1997-01-28 | Riess; Thomas | Method and apparatus for treatment of gait problems associated with parkinson's disease |
US5722420A (en) * | 1995-11-28 | 1998-03-03 | National Science Council | EMG biofeedback traction modality for rehabilitation |
EP0959444A4 (en) * | 1996-08-14 | 2005-12-07 | Nurakhmed Nurislamovic Latypov | Method for following and imaging a subject's three-dimensional position and orientation, method for presenting a virtual space to a subject, and systems for implementing said methods |
JP2917973B2 (en) * | 1997-06-23 | 1999-07-12 | 日本電気株式会社 | Simulated bodily sensation device |
US6097927A (en) * | 1998-01-27 | 2000-08-01 | Symbix, Incorporated | Active symbolic self design method and apparatus |
US6176837B1 (en) * | 1998-04-17 | 2001-01-23 | Massachusetts Institute Of Technology | Motion tracking system |
- 2000
  - 2000-08-02 US US09/631,292 patent/US6734834B1/en not_active Expired - Fee Related
- 2004
  - 2004-03-03 US US10/790,735 patent/US20040169620A1/en not_active Abandoned
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040119662A1 (en) * | 2002-12-19 | 2004-06-24 | Accenture Global Services Gmbh | Arbitrary object tracking in augmented reality applications |
US7050078B2 (en) * | 2002-12-19 | 2006-05-23 | Accenture Global Services Gmbh | Arbitrary object tracking in augmented reality applications |
US7580540B2 (en) | 2004-12-29 | 2009-08-25 | Motorola, Inc. | Apparatus and method for receiving inputs from a user |
WO2006071420A2 (en) * | 2004-12-29 | 2006-07-06 | Motorola, Inc. | Apparatus and method for receiving inputs from a user |
WO2006071420A3 (en) * | 2004-12-29 | 2006-09-08 | Motorola Inc | Apparatus and method for receiving inputs from a user |
US20060140422A1 (en) * | 2004-12-29 | 2006-06-29 | Zurek Robert A | Apparatus and method for receiving inputs from a user |
US20070118044A1 (en) * | 2005-07-18 | 2007-05-24 | Mega Elektroniikka Oy | Method and device for identifying, measuring and analyzing abnormal neurological responses |
WO2016127005A3 (en) * | 2015-02-04 | 2016-10-27 | Aerendir Mobile Inc. | Local user authentication with neuro and neuro-mechanical fingerprints |
US9590986B2 (en) | 2015-02-04 | 2017-03-07 | Aerendir Mobile Inc. | Local user authentication with neuro and neuro-mechanical fingerprints |
CN106693280A (en) * | 2016-12-29 | 2017-05-24 | 深圳市臻络科技有限公司 | Virtual-reality-based Parkinsonism training method, system and device |
CN110946556A (en) * | 2019-12-27 | 2020-04-03 | 南京信息工程大学 | Parkinson resting state tremor evaluation method based on wearable somatosensory network |
US20220034680A1 (en) * | 2020-07-22 | 2022-02-03 | Teseo S.R.L. | Method and system of topological localization in a built environment |
WO2022026296A1 (en) * | 2020-07-29 | 2022-02-03 | Penumbra, Inc. | Tremor detecting and rendering in virtual reality |
US20220035452A1 (en) * | 2020-07-29 | 2022-02-03 | Penumbra, Inc. | Tremor detecting and rendering in virtual reality |
US11762466B2 (en) * | 2020-07-29 | 2023-09-19 | Penumbra, Inc. | Tremor detecting and rendering in virtual reality |
Also Published As
Publication number | Publication date |
---|---|
US6734834B1 (en) | 2004-05-11 |
Similar Documents
Publication | Title |
---|---|
US6734834B1 (en) | Closed-loop augmented reality apparatus |
US11273344B2 (en) | Multimodal sensory feedback system and method for treatment and assessment of disequilibrium, balance and motion disorders |
US10716469B2 (en) | Ocular-performance-based head impact measurement applied to rotationally-centered impact mitigation systems and methods |
US10258259B1 (en) | Multimodal sensory feedback system and method for treatment and assessment of disequilibrium, balance and motion disorders |
US9788714B2 (en) | Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance |
US9994228B2 (en) | Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment |
US9370302B2 (en) | System and method for the measurement of vestibulo-ocular reflex to improve human performance in an occupational environment |
WO2004008427A1 (en) | Closed-loop augmented reality apparatus |
Wade et al. | The effect of ocular torsional position on perception of the roll-tilt of visual stimuli |
Yardley | Contribution of somatosensory information to perception of the visual vertical with body tilt and rotating visual field |
JP2007518469A5 (en) | |
US20090240172A1 (en) | Vestibular rehabilitation unit |
CN110381810A (en) | Screening apparatus and method |
Lim et al. | Postural instability induced by visual motion stimuli in patients with vestibular migraine |
IL301283A (en) | Immersive multisensory simulation system |
US20100305411A1 (en) | Control of operating characteristics of devices relevant to distance of visual fixation using input from respiratory system and/or from eyelid function |
Baram et al. | Walking on virtual tiles |
Di Girolamo et al. | Vestibulo-ocular reflex modification after virtual environment exposure |
Writer et al. | Vestibular rehabilitation: An overview |
WO2020174636A1 (en) | Visual information changing device, prism glasses, and method for selecting lens in prism glasses |
Virre | Virtual reality and the vestibular apparatus |
KR102220837B1 (en) | Augmented reality based mirror exercise system for exercise rehabilitation of patients with nervous and musculoskeletal system |
KR101730699B1 (en) | Virtual reality therapy apparatus for pain treatment of physical asymmetry |
Keshner et al. | Postural research and rehabilitation in an immersive virtual environment |
Bhatia et al. | A review on eye tracking technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |