WO2024081781A1 - Rehab and training interactive and tactile projections of sound and light - Google Patents
- Publication number
- WO2024081781A1 (application PCT/US2023/076681)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- patient
- data
- sensor
- camera
- haptic
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0013—Medical image data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/7455—Details of notification to user or communication with user or patient; User input means characterised by tactile indication, e.g. vibration or electrical stimulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/021—Measuring pressure in heart or blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/024—Measuring pulse rate or heart rate
- A61B5/02438—Measuring pulse rate or heart rate with portable devices, e.g. worn by the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/05—Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
- A61B5/053—Measuring electrical impedance or conductance of a portion of the body
- A61B5/0531—Measuring skin impedance
- A61B5/0533—Measuring galvanic skin response
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
- A61B5/14532—Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value ; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue for measuring glucose, e.g. by tissue impedance measurement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/369—Electroencephalography [EEG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/398—Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4082—Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/42—Detecting, measuring or recording for evaluating the gastrointestinal, the endocrine or the exocrine systems
- A61B5/4261—Evaluating exocrine secretion production
- A61B5/4266—Evaluating exocrine secretion production sweat secretion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4869—Determining body composition
- A61B5/4875—Hydration status, fluid retention of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6803—Head-worn items, e.g. helmets, masks, headphones or goggles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/681—Wristwatch-type devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/745—Details of notification to user or communication with user or patient; User input means using visual displays using a holographic display
Definitions
- devices, systems, and methods for emission of electromagnetic waves and/or mechanical waves incorporate mechanical elements to provide non-laser focused mechanical pressure waves in the human audible spectrum (i.e., about 20 Hz to 20 kHz) and/or human non-audible spectrum.
- Mechanical elements may include parametric speaker arrays such as ultrasonic speaker arrays, piezo speakers, or electromagnetic speakers, and the like.
- beam forming and/or beam shaping methods are utilized to focus, direct, or otherwise manipulate waves propagated from the systems and devices disclosed herein.
- the devices and systems incorporate optical elements to provide laser focused mechanical pressure waves in the human audible spectrum and/or human non-audible spectrum.
- Optical elements may also be utilized to provide optical signals in the infrared, near infrared, or visible light spectrum.
- Optical elements may include lasers, light emitting diodes, lenses, mirrors, or a combination thereof.
- devices and systems incorporate thermal elements to alter an ambient temperature.
- thermal elements are utilized to lower an ambient temperature.
- thermal elements are utilized to raise an ambient temperature.
- thermal elements are utilized to adjust an ambient temperature between about 0° C and about 100° C.
- temperature sensors are incorporated to measure temperatures of surfaces or areas which may interact with the thermal elements.
- temperature sensors allow for dynamic adjustment of the thermal elements, as disclosed herein.
- devices and systems include interferometric elements to measure mechanical pressure waves or optical waves.
- interferometric elements are utilized for dynamic adjustment of optical elements, emission of electromagnetic waves, and/or emission of mechanical waves.
- devices and system include optical sensors.
- optical sensors are utilized to dynamically measure mechanical waves, optical waves, and motion/position of objects (e.g., animate and inanimate objects such as people, cars, rocks, etc.).
- an optical sensor is provided to capture images at a rate of 10 Hz to 10,000 Hz. Said captured images may be combined into a video format.
- an optical sensor comprises a camera.
- optical sensors include infrared, near infrared, visible light, ultra-violet spectrum sensors.
- optical sensors comprise three-dimensional (3D) spectroscopic cameras capable of sensing in infrared (IR), near infrared, visible light, and/or ultra-violet spectrum.
- systems utilize multiple stereo infrared (IR) imaging devices.
- systems and devices incorporate one or more computational elements (e.g., a microcontroller, application specific integrated circuit, single board computer, edge computing device, quantum computing device, etc.) to perform data processing and real-time data processing for dynamic output signal conditioning and adjustment based on desired output and measured signal inputs, as disclosed herein.
- systems include closed mesh network elements for self-recognizing interactability with like devices to allow constructive or destructive distributed signal modification.
- systems include open network elements (e.g., 3G, 4G, 5G, long range (LoRa), and the like) to enable connection to the internet, an intranet, or a distributed computing network (cloud computing).
- systems include electrical elements to generate, consume, receive, and transmit power (e.g., solar panels, a rechargeable battery, a battery, wireless energy transmission/reception components, and the like) to provide power to the system and similar devices within a known proximity.
- communication between devices utilizes free space optics communication and has the ability to adjust data transmission bandwidth based on power consumption restrictions.
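For illustration only, the following Python sketch shows one way the bandwidth-versus-power adjustment described above could work: pick the highest data rate whose estimated transmit power fits the current budget. The rate table and the linear power model are assumptions, not part of the disclosure.

```python
# Hedged sketch of the bandwidth-vs-power tradeoff for a free-space-optics link.
RATES_MBPS = [1000, 100, 10, 1]             # candidate data rates, highest first

def power_needed_w(rate_mbps, w_per_mbps=0.002, idle_w=0.5):
    """Assumed linear power model: idle draw plus a per-Mbps cost."""
    return idle_w + w_per_mbps * rate_mbps

def select_rate(power_budget_w):
    """Return the highest rate that fits the power budget, else 0."""
    for rate in RATES_MBPS:
        if power_needed_w(rate) <= power_budget_w:
            return rate
    return 0                                 # no link possible on this budget

print(select_rate(1.0))                      # -> 100 (Mbps) under a 1 W budget
```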
- a system for haptic interaction comprising: a haptic array comprising a plurality of ultrasonic devices; a camera; a light source; a thermal element; and a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a first acoustic field having a first focal point; directing the light source to emit light at or near the first focal point, directing the thermal element to emit heat at or near the first focal point, or both; determining a user motion based on data received by the camera; and based on the user motion, directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a second acoustic field having a second focal point.
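A minimal, self-contained Python sketch of this claimed operation sequence follows; the class and function names (HapticArray, Emitter, user_motion) are illustrative stand-ins, not an API from the disclosure.

```python
class Emitter:
    """Stand-in for a light source or thermal element driver."""
    def __init__(self, name):
        self.name = name
    def point_at(self, focal_point):
        print(f"{self.name} aimed at {focal_point}")

class HapticArray:
    """Stand-in for the ultrasonic haptic array driver."""
    def emit_field(self, focal_point):
        print(f"acoustic field focused at {focal_point}")

def user_motion(camera_frames):
    """Toy user-motion estimate: last detected hand position, if any."""
    return camera_frames[-1] if camera_frames else None

array, light, thermal = HapticArray(), Emitter("light"), Emitter("thermal")
first_focus = (0.0, 0.0, 0.15)          # 15 cm above the array (illustrative)
array.emit_field(first_focus)           # first acoustic field
light.point_at(first_focus)             # light at or near the first focal point
thermal.point_at(first_focus)           # heat at or near the first focal point

frames = [(0.0, 0.0, 0.15), (0.02, 0.01, 0.16)]  # stand-in camera data
second_focus = user_motion(frames)      # user motion from camera data
if second_focus is not None:
    array.emit_field(second_focus)      # second acoustic field follows the hand
```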
- the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
- the camera captures data at a rate of about 10 Hz to 10,000 Hz.
- the user motion is a motion of an appendage of a user.
- the operations further comprise determining a measured position of the focal point based on data received by the camera.
- the operations further comprise directing at least a portion of the plurality of ultrasonic devices based on the measured position.
- the light source comprises a laser, a light emitting diode, a light bulb, or any combination thereof.
- the emitted light has a wavelength of about 10 nm to about 10,000 nm. In some embodiments, the emitted light has a frequency of about 0.3 THz to about 300 THz.
- the system further comprises an interferometric device, wherein the operations further comprise calibrating the haptic array based on data received from the interferometric device.
- the interferometric device comprises a laser doppler vibrometer, a laser interferometer, an acoustic interferometer, or any combination thereof.
- the system further comprises a communication device, wherein the operations further comprise transmitting the user motion, the data received by the camera, or any combination thereof, via the communication device.
- the communication device comprises a cellular device, a Wi-Fi device, a mesh network device, a satellite device, a Bluetooth device, or any combination thereof.
- the system further comprises an energy storage device providing power to the haptic array, the camera, the non-transitory computer-readable storage media, or any combination thereof.
- the energy storage device comprises a battery, a supercapacitor, or any combination thereof.
- the operations further comprise: directing the light source to emit light at or near the second focal point; directing the thermal element to emit heat at or near the second focal point; or both.
- the operations further comprise determining an object position, an object motion, or both, based on data received by the camera.
- the operations further comprise: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a third acoustic field based on the object position, the object motion, or both; directing the light source based on the object position, the object motion, or both; directing the thermal element based on the object position, the object motion, or both; or any combination thereof.
- a computer-implemented method of haptic interaction comprising: directing, by a computer, one or more ultrasonic devices in a haptic array to emit an acoustic field having a first focal point; directing, by the computer, a light source to emit light at or near the first focal point, directing a thermal element to emit heat at or near the first focal point, or both; determining, by the computer, a user motion based on data received by a camera; and based on the user motion, directing, by the computer, at least a portion of the one or more ultrasonic devices in the haptic array to emit a second acoustic field having a second focal point.
- the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof.
- the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof. In some embodiments, the data is received by the camera at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the method further comprises calibrating, by the computer, the haptic array based on data received from an interferometric device.
- the method further comprises determining, by the computer, a measured position of the focal point based on data received by the camera. In some embodiments, the method further comprises directing, by the computer, at least a portion of the plurality of ultrasonic devices based on the measured position. In some embodiments, the method further comprises directing, by the computer, the light source to emit light at or near the second focal point; directing, by the computer, the thermal element to emit heat at or near the second focal point; or both. In some embodiments, the method further comprises determining, by the computer, an object position, an object motion, or both, based on data received by the camera.
- the method further comprises directing, by the computer, at least a portion of the plurality of ultrasonic devices in the haptic array to emit a third acoustic field based on the object position, the object motion, or both; directing, by the computer, the light source based on the object position, the object motion, or both; directing, by the computer, the thermal element based on the object position, the object motion, or both; or any combination thereof.
- One aspect provided herein is a system for assessment of a patient, the system comprising: an ancillary device comprising a biometric sensor configured to measure a biometric data; a monitoring device comprising: a display configured to show a display image; a camera, a time-of-flight sensor, or both, configured to capture a plurality of pose images of the patient; and a haptic array comprising a plurality of ultrasonic devices; and a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit an acoustic field based on the display image; determining two or more patient poses based at least in part on the biometric data and two or more of the pose images; and determining the assessment of the patient based at least in part on the display image and at least a portion of the two or more patient poses.
- the ancillary device is configured to couple to an appendage of the patient.
- the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
- the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array.
- the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
- the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, a thermographic camera, or any combination thereof. In some embodiments, the camera, the time-of-flight sensor, or both, captures data at a rate of about 10 Hz to 10,000 Hz.
- each of the two or more patient poses comprises a position, an orientation, or both, of an appendage of the patient.
- the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm.
- the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
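As a hedged illustration of such a comparison, the following sketch scores patient poses (modeled as joint-angle vectors) by their mean deviation from the nearest labeled healthy pose; this particular scoring rule and the joint-angle representation are assumptions, not the patent's method.

```python
# Hypothetical pose-comparison scoring: lower score = closer to healthy poses.
import numpy as np

def assess(patient_poses, healthy_poses):
    """Return mean per-pose deviation (degrees) from the closest healthy pose."""
    patient = np.asarray(patient_poses, dtype=float)   # (n_poses, n_joints)
    healthy = np.asarray(healthy_poses, dtype=float)   # (m_poses, n_joints)
    # distance from every patient pose to every labeled healthy pose
    d = np.linalg.norm(patient[:, None, :] - healthy[None, :, :], axis=-1)
    return d.min(axis=1).mean()                        # nearest-neighbor average

# toy example: elbow/shoulder/wrist angles for two captured poses
score = assess([[92.0, 45.0, 10.0], [150.0, 40.0, 8.0]],
               [[90.0, 45.0, 10.0], [155.0, 42.0, 9.0]])
print(f"deviation score: {score:.1f} deg")
```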
- the monitoring device further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
- the monitoring device or the ancillary device comprise the non-transitory computer-readable storage media.
- the ancillary device further comprises an ancillary communication device and wherein the monitoring device further comprises a monitoring communication device communicably coupled to the ancillary device.
- the ancillary communication device and the monitoring communication device are wireless communication devices.
- Another aspect provided herein is a computer-implemented method of assessing a patient, the method comprising: showing a display image on a display while: receiving, from an ancillary device, a biometric data; capturing, by a camera, a time-of-flight sensor, or both, a plurality of pose images of the patient; and emitting, by a haptic array comprising a plurality of ultrasonic devices, an ultrasonic haptic based on the display image; determining two or more patient poses based at least in part on the biometric data and two or more of the pose images; and determining the assessment of the patient based at least in part on the display image and at least a portion of the two or more patient poses.
- the biometric data comprises an inertial motion unit data, a photoplethysmography data, a photoacoustic data, an ultrasound data, a glucose data, a bioimpedance data, an electrodermal activity data, a temperature data, a vision shadow capture data, an altitude data, a pressure data, a humidity data, a sweat rate data, a hydration data, a bioacoustics data, a dynamometer data, an electrodermal data, or any combination thereof.
- at least a portion of the plurality of pose images comprise a two-dimensional image. In some embodiments, at least a portion of the plurality of pose images comprise a three-dimensional image.
- at least a portion of the plurality of pose images comprise an infrared image, a near infrared image, a visible light image, an ultra-violet spectrum image, a thermographic image, or any combination thereof.
- each of the two or more patient poses comprises a position, an orientation, or both, of an appendage of the patient.
- the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm.
- the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
- FIG. 1 shows a non-limiting example of a computing device; in this case, a device with one or more processors, memory, storage, and a network interface;
- FIG. 2 shows a non-limiting example of a web/mobile application provision system; in this case, a system providing browser-based and/or native mobile user interfaces;
- FIG. 3 shows a non-limiting example of a cloud-based web/mobile application provision system; in this case, a system comprising an elastically load balanced, auto-scaling web server and application server resources as well as synchronously replicated databases;
- FIG. 4 shows a non-limiting example of a haptic array control system;
- FIG. 5 shows a non-limiting example of a haptic array device;
- FIG. 6 depicts a non-limiting example of a method of using a haptic array device as part of a physical therapy session;
- FIG. 7A shows an image of an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 7B shows an image of a user manipulating a virtual cube with an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 8A shows an image of a user throwing a virtual ball with an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 8B shows an image of a user manipulating one of three balls with an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 9A shows a first image of an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 9B shows a second image of an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 10A shows a third image of an exemplary system for assessing a patient, per one or more embodiments herein;
- FIG. 10B shows an image of an exemplary ancillary device, per one or more embodiments herein;
- FIG. 11 shows an image of a first exemplary assessment of the patient, per one or more embodiments herein;
- FIG. 12 shows an image of a second exemplary assessment of the patient, per one or more embodiments herein;
- FIG. 13 is a flowchart of an exemplary method for individualized patient care, per one or more embodiments herein;
- FIG. 14 is a flowchart of a computer-implemented method of assessing a patient, per one or more embodiments herein.
- the haptic feedback system utilizes a combination of optic and acoustic fields simultaneously.
- generated optic and acoustic fields have no direct interference; however, combining them provides benefits such as multi-resolution haptic images and a synergistic effect on haptic perception.
- the fields are applied simultaneously as elastic waves to stimulate nerve signals.
- the optic field is utilized to simulate or produce a “skin feeling,” or feeling of touch.
- the acoustic field is utilized to apply pressure. Combining two fields of different physical quantities would provide not only the superposition effect proposed above but also synergistic effects such as modification of the feeling.
- FIG. 4 shows a diagram of the components of haptic array device, according to some embodiments.
- FIG. 5 depicts a haptic array device, according to some embodiments.
- the system is parametric.
- the non-linearity of the frequency response produced by multiple ultrasonic frequencies in air is modeled utilizing parametric equations.
- the parametric equations may be utilized in computer and/or machine learning systems to model this non-linearity (and resultingly, the effect is best modeled with parametric equations).
- the system includes Field Programmable Gate Arrays (FPGAs), machine learning, autonomous control systems, fast networking, fast self-healing, interferometer sensors, ultrasonic speaker arrays, and the like.
- the system utilizes laser interferometer technology to measure the response of an environment, one or more objects, or a combination thereof to dynamically change parameters and achieve desired effects.
- a laser interferometer system sends out two laser beams to measure vibration of a surface.
- a laser interferometer is used to receive vibration signals to calibrate the output of the ultrasonic transducer array to effectively beamform the audio waves to focus on one or more points on a subject or object.
- a parametric speaker array is a highly directive speaker that consists of an array of ultrasonic transducers that exploit the nonlinear properties of air to self-demodulate modulated ultrasonic signals with the aim of creating narrow, focused sound waves (audible and inaudible).
- the ultrasonic transducers are piezoelectrically driven.
- the system utilizes one or more parametric speaker/transducer arrays.
- each transducer array comprises multiple transducers.
- the multiple transducers of each array output the same signal which is amplified by constructive interference.
- two or more arrays are configured to further amplify a signal via constructive interference.
- a plurality of speaker arrays may be utilized to precisely direct sound or amplify sound at a precise location.
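The following toy calculation illustrates why coherent combination amplifies the signal as described above: N in-phase emitters sum to N times the single-element amplitude (N² in power), whereas randomly phased emitters grow only as roughly √N. The element count and drive frequency below are illustrative.

```python
# Hedged illustration of distributed coherent gain vs. incoherent summation.
import numpy as np

t = np.linspace(0, 1e-3, 1000)                  # 1 ms time window
element = np.sin(2 * np.pi * 40_000 * t)        # one 40 kHz transducer signal
coherent = 8 * element                          # 8 elements, phase-aligned

rng = np.random.default_rng(0)
incoherent = sum(np.sin(2 * np.pi * 40_000 * t + rng.uniform(0, 2 * np.pi))
                 for _ in range(8))             # 8 elements, random phases

print(f"coherent peak: {coherent.max():.1f}x single element")    # 8.0x
print(f"incoherent peak: {abs(incoherent).max():.1f}x")          # ~sqrt(8)x
```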
- Use of a parametric speaker array may extend the traditional use of broadcasting audio through distributed and coherent beamforming functionality. This approach offers the capability of numerous smaller devices to output the same audio volume as a single large device.
- the system and methods herein allow for high powered acoustic energy signals to be achieved with a system which is relatively compact and has low power requirements.
- the system combines the laser interferometer and parametric speaker array technologies with the distributed coherent beamforming technique through a network capable control system that uses algorithms and/or machine learning (ML) to rapidly tune the audio effect to mitigate destructive environmental noise and to enable effective beam coherence. Therefore, in some embodiments, the system provides autonomous environmental adjustments and distributed coherence beam forming.
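One hedged reading of this self-calibration loop is a simple feedback controller that trims the array's drive gain until the interferometer-measured vibration matches a commanded amplitude; the proportional controller and the measure()/drive() stand-ins below are assumptions for illustration, not the disclosed ML-based tuner.

```python
# Hypothetical closed-loop calibration using interferometer feedback.
def calibrate(drive, measure, setpoint_um=1.0, k_p=0.5, steps=50):
    """Trim array gain until measured vibration matches the setpoint (um)."""
    gain = 1.0
    for _ in range(steps):
        measured = measure(gain)        # interferometer vibration estimate
        error = setpoint_um - measured
        gain += k_p * error             # proportional correction
        if abs(error) < 0.01:           # within 10 nm of the target
            break
    drive(gain)                         # commit the calibrated gain
    return gain

# toy plant: the environment attenuates the array's output by 40 %
final = calibrate(drive=lambda g: None, measure=lambda g: 0.6 * g)
print(f"converged gain: {final:.2f}")   # ~1.67, so 0.6 * gain ~ 1.0 um
```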
- the inventive device combines three fundamental technologies: (1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms, (2) one or more lasers for generating laser haptics, and (3) a network-connected system controller to manage data from both the network and the individual components.
- the inventive device combines four fundamental technologies: (1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms, (2) one or more lasers for generating laser haptics, (3) one or more video capture devices for monitoring at least a portion of a subject, and (4) a network-connected system controller to manage data from both the network and the individual components.
- an individual system functions on its own.
- individual systems are combined in a network that provides a distributed coherent beamforming function.
- the system utilizes digital signal processing, embedded systems, information technology for distributed networking (i.e., Internet of Things (IoT)), and machine learning/artificial intelligence (ML/AI) for device self-calibration.
- a system 400 for providing haptic feedback or stimulation is depicted, according to some embodiments.
- the system 400 is utilized to stimulate or provide haptic feedback to a subject or a portion of a subject (e.g., a hand of a subject 490).
- the system 400 includes network module 405, system controller 410, acoustic payload controller 420, a monitoring controller 425, monitoring sensors 430, acoustic haptic array controller 435, acoustic haptic array 450, optical emission controller 460, optical emitter 465, and recorder 440.
- the functions of the system 400 are controlled by system controller 410.
- the system controller 410 comprises a central processing unit (CPU), as described herein.
- the CPU may comprise one or more programs loaded onto a memory for sending instructions for operating the various components of the system, as described herein.
- the system controller 410 may further comprise a field programmable gate array (FPGA) configurable to provide a logic circuit for specified functions of the system.
- the system controller 410 is in operative communication with a network module 405.
- the network module 405 may be configured to receive instructions, such as programming instructions, parameter inputs, or the like, and transmit said instructions to the system controller 410.
- the network module 405 may communicate with an external network, remote device, user interface, or the like, as disclosed herein.
- mesh networking is utilized.
- mesh networking allows the system to provide distributed coherence.
- mesh networking may allow many small systems to achieve the performance of a much larger system.
- Mesh networking may also allow the system to provide unique and complicated acoustic algorithms (e.g., machine learning) to enable precise spatial audio or ultrasonic feedback.
- the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460. In some embodiments, the acoustic payload controller and the optical emission controller are integrated into a single haptic array controller. In some embodiments, the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460 via one or more control buses 415.
- the acoustic payload controller 420 comprises an application specific integrated circuit (ASIC) that processes one or more signals and provides an output signal to the acoustic haptic array controller 435.
- the acoustic haptic array controller 435 provides an output signal to the acoustic haptic array 450, where the output signal is transformed into a mechanical waveform (e.g., an acoustic, sound, or ultrasonic waveform) by one or more transducers of the acoustic haptic array.
- the haptic array controller comprises an amplifier to amplify the signal prior to output to the haptic array(s).
- the system is connected to a plurality of haptic arrays and the output to each haptic array is varied to produce a desired output.
- the constructive interference of the sonic waves produced by the transducers is utilized to produce one or more focal points.
- production of the focal point is digitally controlled by the haptic payload controller.
- focal points of sonic energy are produced with a resolution of 1/16 of the wavelength (e.g., approximately 0.5 mm for the 40 kHz ultrasound).
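A short sketch of the underlying focusing arithmetic follows: each transducer is given a phase advance that cancels its propagation delay to the focal point, so all wavefronts arrive there in phase. The 18 x 18 grid and 40 kHz frequency follow the disclosure; the 10 mm element pitch and the geometry are assumptions for illustration.

```python
# Hedged sketch of focal-point beamforming for a planar transducer grid.
import numpy as np

C_AIR = 343.0                  # speed of sound in air, m/s
F = 40_000.0                   # drive frequency, Hz
WAVELENGTH = C_AIR / F         # ~8.6 mm; 1/16 wavelength ~0.54 mm, matching
                               # the ~0.5 mm focal resolution stated above

def focus_phases(focus, pitch=0.010, n=18):
    """Per-element phase advances (rad) so all wavefronts meet at `focus`."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch       # element centers, m
    x, y = np.meshgrid(coords, coords)                  # n x n grid at z = 0
    r = np.sqrt((x - focus[0])**2 + (y - focus[1])**2 + focus[2]**2)
    return (-2 * np.pi * r / WAVELENGTH) % (2 * np.pi)  # cancel path delays

phases = focus_phases(focus=(0.0, 0.0, 0.15))           # 15 cm above center
print(phases.shape, round(WAVELENGTH * 1000, 2))        # (18, 18), ~8.58 mm
```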
- the optical emission controller 460 comprises an application specific integrated circuit (ASIC) that processes one or more received signals.
- the optical emission controller 460 receives signals from the system controller 410.
- the optical emission controller 460 receives signals from the system controller 410, the acoustic payload controller 420, the monitoring controller 425, or a combination thereof.
- the optical emission controller 460 directs and controls one or more optical emitters 465.
- the one or more optical emitters 465 comprise at least one light source. In some embodiments, the one or more optical emitters 465 comprise at least one light source coupled to one or more optical elements.
- the optical elements may comprise lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location.
- the system is connected to a plurality of optical emitters and the output to each optical emitter is varied to produce a desired output.
- the light source of the optical emitter is a laser, as described herein.
- the optical emitter produces electromagnetic energy outside of the visible light spectrum.
- the optical emitter may produce electromagnetic waves within the ultraviolet or infrared spectrum.
- the optical emitter is replaced or used in combination with an emitter which generates another type of electromagnetic energy, such as radio emissions.
- the optical emitter is replaced or used in combination with a thermal emitter which generates and transmits heat toward a target location or focal point.
- the system 400 comprises a monitoring controller 425.
- the monitoring controller operates and receives data from one or more monitoring sensors.
- Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490).
- an interferometer is utilized as a monitoring sensor, as disclosed herein.
- the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the acoustic payload controller 420.
- the acoustic payload controller 420 comprises a digitally-programmable potentiometer (DPP) which receives the interferometer data.
- the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the optical emission controller 460.
- the optical emission controller 460 comprises a digitally-programmable potentiometer (DPP) which receives the data generated by the monitoring sensors.
- the monitoring data is sent back to system controller 410.
- the acoustic payload controller 420 may adjust the output signal to the acoustic haptic array controller 435 based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425.
- the optical emission controller 460 may adjust the output signal to the optical emitter based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425.
- the system is configured such that feedback received from the monitoring sensors 430 is utilized to adjust the system, output of the haptic arrays 450, and output of the optical emitters 465. In some embodiments, adjustments are made in real-time to provide a self-calibrating system.
- the system further comprises a recorder 440.
- Recorder 440 may receive and store monitoring data via an input/output (I/O) integrated circuit coupled to the monitoring controller.
- the stored data may be utilized by the system to improve outputs.
- the stored monitoring data is input into a machine learning module to improve the system.
- the system is used for audio recording using an interferometer (i.e., ISR).
- the monitoring data is used to track a target 490.
- the monitoring data is used to monitor the response of a target to the haptic output of the system.
- FIG. 5 depicts an exemplary haptic array device 500, according to some embodiments.
- the haptic array device 500 comprises an array of transducers 550 for producing sonic haptics, as described.
- the array 550 is an ultrasonic transducer array, as disclosed herein.
- the haptic array device 500 further comprises laser systems 511, 512, 513.
- the haptic array device 500 further comprises an integrated monitoring system 520.
- the haptic array device 500 is configured to provide haptic feedback or sensations to an object or focal point 505.
- the object 505 is a portion of a user, such as a hand.
- the laser systems may be configured to produce haptics, three-dimensional (3D) visualizations (i.e., holograms), or both.
- a hologram is produced by two of the laser systems functioning as optical emitters and using constructive interference to produce a 3D rendering.
- a third laser system produces haptic feedback while the other two laser systems produce the hologram.
- laser systems 511 and 512 may produce a hologram while laser system 513 provides haptic feedback to a target area 505.
- monitoring system 520 comprises one or more sensors for monitoring an object or an object's response to the provided haptics, as disclosed herein.
- the one or more sensors of the monitoring system may comprise optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target 505.
- the monitoring system is coupled to a computer system which identifies and tracks the target 505 and/or portions thereof, as disclosed herein.
- While the haptic array device 500 depicted in FIG. 5 has fully integrated components, it should be appreciated that the components may not be integrated or may be separate from the device. Further, it should be appreciated that the device may be supplemented with further components (such as additional ultrasound transducer arrays) or additional haptic array devices of the same or a similar type.
- the system is modular, such that multiple systems can be networked to provide different levels of performance based on user needs.
- An individual system may operate independently for reduced function based on user needs.
- Combined systems may operate together to produce a higher output signal or provide haptic feedback to a larger volume of space.
- As disclosed herein, sonic haptics may be provided to a target (e.g., one or more focal points, a portion of a subject, etc.). In some embodiments, sonic haptic feedback is provided to a target via an array of ultrasonic transducers. In some embodiments, an array of ultrasonic transducers comprises 324 transducers arranged in an 18 x 18 square grid. However, multiple arrangements of the transducers may be provided to better suit various applications. In some embodiments, the transducers are arranged as a planar array.
- the transducers are arranged in a non-planar array. In some embodiments, the transducers are arranged in two or more planar arrays which are provided at an angle to each other. In some embodiments, the transducers are arranged in two or more planar arrays which are orthogonal to each other. In some embodiments, the transducers are open aperture ultrasonic transducers. In some embodiments, the transducers are ceramic transducers (e.g., Nippon Ceramic T4010A1 transducers).
- an array of ultrasonic transducers comprises about 4 transducers to about 1,025 transducers. In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 25 transducers, about 4 transducers to about 64 transducers, about 4 transducers to about 256 transducers, about 4 transducers to about 324 transducers, about 4 transducers to about 576 transducers, about 4 transducers to about 1,025 transducers, about 25 transducers to about 64 transducers, about 25 transducers to about 256 transducers, about 25 transducers to about 324 transducers, about 25 transducers to about 576 transducers, about 25 transducers to about 1,025 transducers, about 64 transducers to about 256 transducers, about 64 transducers to about 324 transducers, about 64 transducers to about 576 transducers, about 64 transducers to about 1,025 transducers, about 256 transducers to about 324 transducers, about 256 transducers to about 576 transducers, about 256 transducers to about 1,025 transducers, about 324 transducers to about 576 transducers, about 324 transducers to about 1,025 transducers, or about 576 transducers to about 1,025 transducers, including increments therein.
- the transducers are capable of producing an ultrasonic focal point having a diameter of about 20 millimeters (mm). In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 100 mm.
- the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 5 mm, about 1 mm to about 10 mm, about 1 mm to about 20 mm, about 1 mm to about 40 mm, about 1 mm to about 50 mm, about 1 mm to about 100 mm, about 5 mm to about 10 mm, about 5 mm to about 20 mm, about 5 mm to about 40 mm, about 5 mm to about 50 mm, about 5 mm to about 100 mm, about 10 mm to about 20 mm, about 10 mm to about 40 mm, about 10 mm to about 50 mm, about 10 mm to about 100 mm, about 20 mm to about 40 mm, about 20 mm to about 50 mm, about 20 mm to about 100 mm, about 40 mm to about 50 mm, about 40 mm to about 100 mm, or about 50 mm to about 100 mm, including increments therein.
- the transducers are capable of producing an ultrasonic focal point having a diameter of at least about 1 mm, about 5 mm, about 10 mm, about 20 mm, about 40 mm, or about 50 mm, including increments therein. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at most about 5 mm, about 10 mm, about 20 mm, about 40 mm, about 50 mm, or about 100 mm, including increments therein.
- the transducer array is capable of providing pressure forces of about 10 millinewtons (mN) to about 20 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 100 mN.
- the transducer array is capable of providing pressure forces of about 1 mN to about 2 mN, about 1 mN to about 5 mN, about 1 mN to about 10 mN, about 1 mN to about 20 mN, about 1 mN to about 50 mN, about 1 mN to about 100 mN, about 2 mN to about 5 mN, about 2 mN to about 10 mN, about 2 mN to about 20 mN, about 2 mN to about 50 mN, about 2 mN to about 100 mN, about 5 mN to about 10 mN, about 5 mN to about 20 mN, about 5 mN to about 50 mN, about 5 mN to about 100 mN, about 10 mN to about 20 mN, about 10 mN to about 50 mN, about 10 mN to about 100 mN, about 20 mN to about 50 mN, about 20 mN to about 100 mN, or about 50 mN to about 100 mN, including increments therein.
- the ultrasonic haptics are based on acoustic radiation pressure, which is not vibrational and presses the skin surface. This can be applied on the skin for a long time, but the sensation is relatively weak. The sensation may be similar to a laminar air flow within a narrow area.
- vibrotactile stimulations are produced by modulation of ultrasonic emission as waveforms.
- vibrotactile stimulations are produced by modulating the ultrasonic emission with 200 Hz and 50 Hz waves.
- the waveforms for producing ultrasonic haptic feedback are sine waves, rectangular waves, triangular waves, or a combination thereof.
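As a hedged illustration, the sketch below generates a 40 kHz carrier amplitude-modulated by a 200 Hz sine wave, the kind of modulated emission described above; the sample rate and modulation depth are illustrative choices, not values from the disclosure.

```python
# Hypothetical vibrotactile drive waveform: 200 Hz envelope on a 40 kHz carrier.
import numpy as np

FS = 500_000                          # sample rate, Hz (well above the carrier)
t = np.arange(0, 0.02, 1 / FS)        # 20 ms of signal
carrier = np.sin(2 * np.pi * 40_000 * t)              # ultrasonic carrier
envelope = 0.5 * (1 + np.sin(2 * np.pi * 200 * t))    # 200 Hz, 100 % depth
signal = envelope * carrier           # waveform sent to the transducer drive
print(signal.shape, round(signal.max(), 3))
```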
- the spatial resolution produced by the transducer array is about 8.5 mm when the array is operating at 40 kilohertz (kHz).
- the haptic array device comprises one or more lasers for providing haptic feedback.
- a laser emits energy at a wavelength of about 10 nm to about 10,000 nm.
- a laser has a frequency of about 0.3 THz to about 300 THz.
- a power output of the laser is about 0.16 watts (W).
- a power output of the laser is about 0.01 W to about 0.5 W.
- a power output of the laser is about 0.01 W to about 0.05 W, about 0.01 W to about 0.1 W, about 0.01 W to about 0.13 W, about 0.01 W to about 0.16 W, about 0.01 W to about 0.2 W, about 0.01 W to about 0.3 W, about 0.01 W to about 0.5 W, about 0.05 W to about 0.1 W, about 0.05 W to about 0.13 W, about 0.05 W to about 0.16 W, about 0.05 W to about 0.2 W, about 0.05 W to about 0.3 W, about 0.05 W to about 0.5 W, about 0.1 W to about 0.13 W, about 0.1 W to about 0.16 W, about 0.1 W to about 0.2 W, about 0.1 W to about 0.3 W, about 0.1 W to about 0.5 W, about 0.13 W to about 0.16 W, about 0.13 W to about 0.2 W, about 0.13 W to about 0.3 W, about 0.13 W to about 0.5 W, about 0.16 W to about 0.2 W, about 0.16 W to about 0.3 W, about 0.16 W to about 0.5 W, about 0.2 W to about 0.3 W, about 0.2 W to about 0.5 W, or about 0.3 W to about 0.5 W, including increments therein.
- a power output of the laser is about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therein. In some embodiments, a power output of the laser is at least about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, or about 0.3 W, including increments therein. In some embodiments, a power output of the laser is at most about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therein.
- low laser power levels prevent damage to the skin of a user.
- the sensation produced by the laser system may be similar to an electric sensation.
- the haptic feedback from the laser causes evaporation from a nonthermal shockwave produced on the skin.
- duration of laser exposure is limited to prevent damage to the skin.
- a haptic laser system comprises at least one laser light source.
- the haptic laser system comprises optical elements such as lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location.
- a haptic laser system comprises galvo-mirrors for precise positioning of the laser energy.
- a laser system comprises a computer-controlled optical phased array comprising pixels that modulate a laser beam’s intensity, phase, or both.
- the haptic array device utilizes a combination of electromagnetic energy and pressure from mechanical waves to produce unique sensations for a user.
- the ultrasonic transducers can produce pressure in larger areas (e.g., about 30 cm areas).
- the laser haptics systems produce sensations in more focused areas (e.g., down to 1 micron). Therefore, a combination of laser and ultrasonic transducer systems may produce focused haptics at different scales simultaneously. For example, if a target is a hand of a user, the ultrasonic haptic system may produce a pressure sensation on the palm of the hand, while the laser haptic system focuses a sensation on a fingertip of the user. Such a configuration may be useful in confirming registration or detection of various parts of the hand when being used in combination with a gesture registration system.
- lasers of the haptic array device are utilized to produce visualizations.
- constructive interference produced by a laser emission system is utilized to generate 3D images or holograms.
- a 3D image or hologram is utilized to help guide a user when the haptic array device is being used as a controller or for gesture recognition.
- a 3D image or hologram is utilized to help guide a user when an external device is being used as a controller or for gesture recognition.
- a 3D image is produced to guide a user’s hand to the center of an image captured by a camera (either incorporated or external to the haptic array device) being utilized for gesture recognition.
- a haptic array device utilizes a laser system to produce both haptic and visual effects.
- the haptic feedback is provided as the user interacts with a 3D image or hologram.
- a 3D image or hologram is utilized to help guide a user through a series of movements as part of a rehabilitation or training program.
- the system 700 comprises a monitoring device 710 and an ancillary device 720.
- the monitoring device 710 comprises a display 711, a haptic array 712, a camera 713, and a non-transitory computer-readable storage media.
- the monitoring device 710 comprises the display 711, the haptic array 712, a time-of-flight sensor 714, and the non-transitory computer-readable storage media.
- the monitoring device 710 comprises the display 711, the haptic array 712, the camera 713, the time-of-flight sensor 714, and the non-transitory computer-readable storage media.
- the monitoring device 710 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
- the display 711 is configured to show a display image.
- FIG. 7B shows an exemplary display image of the user’s hand manipulating a virtual cube. In some embodiments, this display image is shown while the user experiences a sensation of manipulating the virtual cube by pressure waves emitted from the haptic array 712.
- FIG. 8A shows an exemplary display image of the user’s hand throwing a ball.
- FIG. 8B shows an exemplary display image of the user’s hand manipulating one of three displayed balls.
- the ancillary device 720 comprises a biometric sensor.
- the biometric sensor is configured to measure a biometric data.
- the ancillary device 720 is configured to couple to an appendage of the patient.
- the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
- the ancillary device 720 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
- the monitoring device 710 or the ancillary device 720 comprise the non-transitory computer-readable storage media.
- the ancillary device 720 further comprises an ancillary communication device and wherein the monitoring device 710 further comprises a monitoring communication device communicably coupled to the ancillary device 720.
- the ancillary communication device and the monitoring communication device are wireless communication devices.
- the camera 713 is configured to capture a plurality of pose images of the patient. In some embodiments, the plurality of pose images of the patient form a video of the motion of the patient. In some embodiments, the camera 713 comprises a two-dimensional camera. In some embodiments, the camera 713 comprises a three-dimensional camera. In some embodiments, the camera 713 is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, a thermographic camera, or any combination thereof.
- the camera 713, the time-of-flight sensor 714, or both captures data at a rate of about 10 Hz to 10,000 Hz.
- the monitoring device 710 comprises two or more cameras 713, two or more time-of-flight sensors 714, or both. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed to capture the patient from two or more directions. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed about the haptic array 712.
- the haptic array 712 comprises a plurality of ultrasonic devices.
- the haptic array 712 is a planar array.
- the haptic array 712 is a non-planar array.
- the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
- the non-transitory computer-readable storage media is encoded with instructions executable by at least one processor to cause the at least one processor to perform operations.
- the one or more operations comprise: directing at least a portion of the plurality of ultrasonic devices in the haptic array 712 to emit an acoustic field based on the display image; determining two or more patient poses based at least in part on the biometric data and the two or more pose images; and determining the assessment of the patient based at least in part on the display image and at least a portion of the plurality of patient poses.
- each of the two or more patient poses comprise a position, an orientation, or both of an appendage of the patient.
- the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm. In some embodiments, the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
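The disclosure contemplates a machine learning algorithm for this comparison; purely as a stand-in, a nearest-neighbor comparison against labeled healthy poses could be sketched as below, assuming each pose has already been reduced to a vector of joint angles.

```python
import numpy as np

def pose_deviation(patient_pose: np.ndarray, healthy_poses: np.ndarray) -> float:
    """RMS per-joint deviation (degrees) of a patient pose from the
    nearest labeled healthy pose.

    patient_pose: shape (J,), joint angles for J tracked joints.
    healthy_poses: shape (N, J), labeled healthy reference poses.
    """
    dists = np.linalg.norm(healthy_poses - patient_pose, axis=1)
    return float(dists.min() / np.sqrt(patient_pose.size))

# Hypothetical example: wrist/elbow/shoulder angles for one captured frame.
patient = np.array([42.0, 95.0, 160.0])
healthy = np.array([[45.0, 90.0, 165.0],
                    [50.0, 88.0, 170.0]])
score = pose_deviation(patient, healthy)  # lower = closer to healthy
```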
- FIGS. 11 and 12 show exemplary assessments of the patient.
- the assessment of the patient is displayed on the display.
- the assessment comprises a progress indicator, a vital sign indicator, a range of motion improvement indicator, a grip strength improvement indicator, or any combination thereof.
- the assessment allows the patient to record their pain level and provides an indicator of their current and/or past pain levels.
- one or more sensors are provided to monitor interaction with the haptic array device.
- a monitoring device comprising one or more sensors is provided to monitor whether a user position, a user motion, or both are outside a threshold from a set user position, a set user motion, or both.
- Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490).
- an interferometer is utilized as a monitoring sensor, as disclosed herein.
- a monitoring device comprises a camera.
- the camera captures data at a rate of about 10 Hz to 10,000 Hz.
- the camera comprises a two-dimensional camera.
- the camera comprises a three-dimensional camera.
- the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
- the camera is coupled to a central processing unit (CPU) of the system, as disclosed herein.
- the camera may be utilized for gesture recognition.
- haptic feedback is provided by the haptic array device in response to position or movement of a target within the field of view of the camera.
- feature detection and extraction methods are utilized to identify a region of interest on the target.
- regions of interest may include a finger, palm, thumb, fingertip, etc. of a user.
- feature detection and extraction methods comprise computational processing of images to analyze contrasts in pixel brightness to recognize features.
- Feature detection and extraction methods may include edge detection, corner detection, blob detection, ridge detection, and combinations thereof.
- an edge detection algorithm is utilized to identify an outline or border of a target.
- a nearest neighbor, thresholding, clustering, partial differential equation, and/or other digital image processing methods are utilized to identify an outline or border of a target.
- Canny, Deriche, differential, Sobel, Prewitt, and Roberts cross edge detection techniques may be utilized to identify a target or a portion thereof.
- Gaussian or Laplacian techniques are utilized to smooth or improve the accuracy of the identified target or portion thereof.
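As one concrete illustration of this pipeline (not a disclosed implementation), Gaussian smoothing followed by Canny edge detection in OpenCV can recover the border of a target such as a hand; the kernel size and thresholds are placeholder values.

```python
import cv2

def hand_outline(frame_bgr):
    """Outline a target (e.g., a hand) via Gaussian smoothing + Canny edges."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    smoothed = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress sensor noise
    edges = cv2.Canny(smoothed, 50, 150)           # placeholder thresholds
    # Treat the largest external contour as the target border.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None
```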
- the monitoring device comprises a kiosk.
- the monitoring device comprises a thermographic camera, a time-of-flight sensor, a microphone, or any combination thereof.
- the monitoring device further comprises a speaker, a haptic projection unit, an augmented reality projection unit, or any combination thereof.
- the monitoring device comprises a photoplethysmography sensor (PPG), a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
- the monitoring device is reconfigurable based on a patient’s handedness, height, or any combination thereof.
- the monitoring device comprises a wireless communication device, a wired communication device, or both.
- the monitoring device comprises an energy storage device, a wired charge connector, a wireless charge connector, or any combination thereof.
- the haptic feedback comprises a finger haptic, a magneto haptic, an opto-haptic, or any combination thereof.
- the monitoring device comprises a mount for a peripheral device.
- the monitoring devices herein comprise an automated kiosk that quantifies neural activity and physical behavior, to provide a telemedicine solution.
- the monitoring devices herein enable healthcare providers to remotely assess and monitor the neural activity and physical behavior of patients, allowing for early detection of changes that may indicate disease progression.
- the monitoring devices herein provide real-time feedback on neural activity and physical behavior, enabling patients to actively participate in their disease management and make informed decisions about their care.
- the monitoring devices herein employ high-resolution motion capture to assess physical impairments, enabling the quantitative analysis of movement patterns and motor coordination.
- the monitoring devices herein use high resolution motion capture paired with an augmented reality (AR) mid-air haptics interface enabling users to dynamically interact with computer generated objects that have tactile and responsive feedback.
- the haptics interface simultaneously stimulates brain activity associated with sensory response to touch as well as causal response for tracking coordination.
- the monitoring devices herein pair with a biometric wrist-worn wearable that tracks surface electromyography, bio-impedance, electrodermal activity (EDA), volumetric blood flow via photoplethysmography (PPG), motion, or any combination thereof.
- the motion is measured using a 9-DOF inertial motion unit (IMU).
- the monitoring devices herein pair with a head worn wearable able to assess electroencephalography (EEG) signals, electromyography (EMG) signals, electrooculography (EOG) signals, or any combination thereof.
- the signals are correlated to central brain activity, peripheral biometric signals and functional performance.
- the systems and methods herein employ a haptics projector and machine vision cameras that allow individualized care for patients with neurological disorders.
- computer generated objects are projected through the haptics projector and can be manipulated by a patient.
- the machine vision cameras may track how patients interact with the projected objects for the quantification of their behavior versus expected motion.
- the system herein utilize augmented reality (AR) mid-air haptics, which adds a tactile element to the assessment process.
- This technology enables patients to interact with virtual objects in a realistic manner, providing a more immersive and engaging experience.
- the utilization of haptic technology in telemedicine is gaining significant traction due to its ability to provide realistic, tactile experiences for both patients and healthcare providers.
- the system herein enables healthcare providers to remotely assess patients' physical symptoms and provide targeted treatment plans.
- the system herein employ high-resolution motion capture technology to enable precise tracking and analysis of a patient's physical dynamics, providing valuable insights into their functional performance.
- the system herein objectively assess disease progression and monitor the effectiveness of interventions, which is a definitive advantage over more subjective measures of the progression of a disease.
- additional sensors are utilized to enhance or supplement the performance of the haptic array device or monitoring system thereof.
- ancillary sensors comprise wearable sensors which are attached to a user to receive additional data generated by movements or electrical signals (e.g., electromyographic (EMG), electroencephalographic (EEG), etc.) produced by a user.
- a wearable ancillary sensor comprises one or more motion sensors.
- the motion sensors comprise an accelerometer, a gyroscope, or a combination thereof.
- a wearable ancillary sensor array is configured to couple to an appendage, limb, or extremity of a user.
- an existing device comprising one or more motion sensors (e.g., a smart watch) is coupled to the haptic array device to act as an ancillary sensor device.
- additional bioinformatics are acquired by the ancillary sensors such as heart rate, body temperature, blood pressure, or a combination thereof.
- a wearable ancillary sensor array is configured to be worn on a head, a foot, or a wrist of user.
- a wearable ancillary sensor array comprising one or more EEG sensors is configured to place the EEG sensors in proximity to the scalp of a user and receive electric signals produced by the brain of the user.
- the EEG sensors do not require direct contact to the skin (e.g., no need for shaving of the head) or a gel to be applied to the scalp.
- the ancillary sensors are used to confirm or verify actions or gestures made by a user.
- bioinformatic information obtained by the ancillary sensors is recorded and stored in a memory of the system.
- the ancillary sensor is head wearable and comprises a helmet, a visor, glasses, a headband, earbuds, earphones, or any combination thereof.
- the ancillary sensor is head wearable and comprises an electroencephalogram (EEG) sensor, an electrooculography (EOG) sensor, a cerebral blood volume sensor, a facial micro-motion sensor, or any combination thereof.
- the ancillary sensor is wrist and/or hand wearable and comprises a photoplethysmography sensor (PPG), a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
- the wrist ancillary sensor is reconfigurable based on a patient’s handedness.
- the ancillary sensor is hand graspable.
- the wrist ancillary sensor comprises a wireless communication device, a wired communication device, or both.
- the wrist ancillary sensor comprises an energy storage device, a wired charge connector, a wireless charge connector, or any combination thereof.
- the wrist ancillary sensor comprises a finger interface, a haptic feedback, a joystick, a trackpad, a trackball, or any combination thereof.
- the haptic feedback comprises a finger haptic, a magneto haptic, an opto- haptic, or any combination thereof.
- the ancillary sensor is foot-wearable and comprises a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a thermometer, an altimeter, a barometer, a humidity sensor, a sweat rate generation sensor, a hydration sensor, a bioacoustics sensor, or any combination thereof.
- the ancillary sensor comprises an electroencephalogram (EEG) sensor, an electrooculography (EOG) sensor, a cerebral blood volume sensor, a facial micro-motion sensor, a photoplethysmography sensor (PPG), a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
- the methods and systems herein employ a biometric wearable to track, for example, surface electromyography, surface conductance, bio-impedance, electrodermal activity, temperature, heart rate, and other environmental factors like pressure, humidity, and altitude.
- the wearable device comprises a 9-DOF (degrees of freedom) IMU to track patients’ motion with high precision.
- the biometric wearables capture physiological data, such as heart rate and electrodermal activity, allowing for a comprehensive assessment of the patient's physical and cognitive state. By analyzing these biometric signals, the systems herein can extract useful features and provide objective assessments of disease progression and functional performance. The potential for collecting EEG, EMG and EOG data adds another dimension to the ability to gather objective data to be correlated with the self-reported status evaluations.
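A minimal sketch of the kind of feature extraction such biometric streams permit, assuming a raw PPG trace and an EDA trace sampled at a known rate; the function names and thresholds are illustrative only, not the disclosed analysis.

```python
import numpy as np
from scipy.signal import find_peaks

def ppg_heart_rate(ppg: np.ndarray, fs: float) -> float:
    """Estimate heart rate (bpm) from a PPG trace sampled at fs Hz."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))  # ~0.4 s refractory gap
    if len(peaks) < 2:
        return float("nan")
    ibi = np.diff(peaks) / fs        # inter-beat intervals, seconds
    return 60.0 / float(np.mean(ibi))

def eda_features(eda: np.ndarray) -> dict:
    """Coarse tonic/phasic summary of an electrodermal activity trace."""
    return {"tonic_level": float(np.mean(eda)),
            "phasic_range": float(np.ptp(eda))}
```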
- Provided herein, per FIG. 13, is a flowchart of an exemplary method 1300 for individualized patient care.
- patient data 1311 and doctor’s notes 1312 are used to train a machine learning algorithm 1321, which with input from a neurological examination 1322 and/or a hand examination 1323, provides a report 1324 to a patient and/or caregiver.
- the neurological examination 1322 and/or the hand examination 1323 determine a current state 1330 of a patient, wherein the current state is based on an in-clinic visit 1331, a manual visit 1332 (e.g., performed by a family member or nurse), a qualitative result 1333, or any combination thereof.
- the methods 1300 herein employ individualized care 1341, a kiosk 1342 for patient interaction and data collection, and extended reality haptics 1343.
- the method 1300 produces a graphical user interface 1344 to allow the patient and/or caregiver to review and analyze collected data.
- the method 1400 comprises showing a display image 1411, receiving a biometric data 1412, capturing a plurality of pose images of the patient 1413, emitting an ultrasonic haptic based on the display image 1414, determining two or more patient poses based at least in part on the biometric data and the two or more pose images 1421, and determining the assessment of the patient based at least in part on the display image and at least a portion of the plurality of patient poses 1422.
- showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed simultaneously. In some embodiments, two or more of showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed simultaneously. In some embodiments, showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed within a time span of at most about 1 minute.
- two or more of showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed within a time span of at most about 1 minute.
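One way the simultaneous steps of method 1400 might be orchestrated is sketched below; `display`, `haptic_array`, `camera`, and `wearable` are hypothetical driver objects standing in for the components described above, not part of the disclosure.

```python
import threading
import time

def run_assessment_frame(display, haptic_array, camera, wearable):
    """Run the four steps of method 1400 concurrently for one frame."""
    results = {}
    steps = [
        display.show,                                           # show display image
        haptic_array.emit,                                      # emit ultrasonic haptic
        lambda: results.setdefault("poses", camera.capture()),  # capture pose images
        lambda: results.setdefault("bio", wearable.read()),     # receive biometric data
    ]
    threads = [threading.Thread(target=fn) for fn in steps]
    start = time.monotonic()
    for t in threads:
        t.start()
    for t in threads:
        t.join(timeout=60)           # bound the frame to about 1 minute
    results["elapsed_s"] = time.monotonic() - start
    return results
```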
- the biometric data comprises an inertial motion unit data, a photoplethysmography data, a photoacoustic data, an ultrasound data, a glucose data, a bioimpedance data, an electrodermal activity data, a temperature data, a vision shadow capture data, an altitude data, a pressure data, a humidity data, a sweat rate data, a hydration data, a bioacoustics data, a dynamometer data, an electrodermal data, or any combination thereof.
- at least a portion of the plurality of pose images comprise a two-dimensional image. In some embodiments, at least a portion of the plurality of pose images comprise a three-dimensional image.
- At least a portion of the plurality of pose images comprise an infrared image, a near infrared image, a visible light image, an ultra-violet spectrum image, a thermographic image, or any combination thereof.
- each of the two or more patient poses comprise a position, an orientation, or both of an appendage of the patient.
- the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm.
- the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
- the AI algorithms extract relevant features from the data, enabling calculations of human pose estimation and correlation with neural activity.
- the systems herein enable robust and reliable measures of cognitive and physical impairments in individuals.
- the machine learning algorithms herein are trained on collected data and established clinical assessments and existing objective measures.
- the methods herein employ statistical analysis on data collected from both a patient group and the healthy control group as a validation.
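As a sketch of that validation step, a two-sample comparison of some per-subject feature (for example, a mean pose-deviation score) between the patient and healthy control groups might be run as below; the values are hypothetical placeholders, not study data.

```python
import numpy as np
from scipy.stats import ttest_ind

patient_scores = np.array([0.42, 0.55, 0.61, 0.48, 0.70])  # hypothetical
control_scores = np.array([0.21, 0.18, 0.25, 0.30, 0.22])  # hypothetical

# Welch's t-test (unequal variances) on the group difference.
t_stat, p_value = ttest_ind(patient_scores, control_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```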
- data is collected via a personalized assessment activity for a plurality of simulated patient profiles.
- each activity is designed to extract key features that assess cognitive and physical impairments and encompasses a range of tasks to comprehensively evaluate the patients’ motor skills, cognitive functions, and coordination abilities.
- the systems herein employ generative AI algorithms to analyze the data collected from motion capture and biometric wearables and perform human pose estimation calculations. This capability allows for the multi-modal quantification of impairment and provides healthcare professionals with valuable insights into the patient's condition.
- the system herein provide a holistic assessment of both physical dynamics and brain activity.
- the Artificial Intelligence (AI) algorithms are trained with comprehensive data on both physiological and neurological responses during functional tasks.
- Capturing all this information while the patient manipulates the computer-generated object in a gamified way allows quantitative and reproducible assessment of cognition, dexterity, coordination, and other measures used to determine the state of health for a person with a neurological disorder.
- a patient’s signals are captured by an automated and portable monitoring device, in-home or in-clinic. Data can be captured asynchronously (e.g., self-guided by an AI-Assistant) or under the observation of a remote clinical operator via telemedicine.
- the systems herein enable access to care and to monitoring tools, and address challenges to reproducible examination to determine cognitive and physical impairment.
- a patient will undergo an assessment session at a healthcare facility or at home.
- the patient can be fitted with biometric wearables and can interact with virtual objects using AR mid-air haptics.
- the high-resolution motion capture system may track their movements and capture data on their physical dynamics. Throughout the assessment, data will be collected and analyzed using generative AI algorithms. In some embodiments, these algorithms extract features from the biometric signals and motion capture data to perform human pose estimation calculations and provide healthcare professionals with objective assessments of the patient's cognitive and physical impairment, disease progression, and functional performance.
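The disclosure does not name a particular pose estimation library; as one illustrative off-the-shelf choice, MediaPipe Hands returns 21 normalized hand landmarks per frame, which could feed the feature extraction described above.

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(static_image_mode=False,
                                 max_num_hands=1,
                                 min_detection_confidence=0.5)

def hand_landmarks(frame_bgr):
    """Return 21 normalized (x, y, z) hand landmarks, or None if no hand."""
    results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    return [(p.x, p.y, p.z) for p in results.multi_hand_landmarks[0].landmark]
```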
- a rehabilitation and/or training system is provided by the devices and methods disclosed herein.
- a haptic array device is utilized in a rehabilitation system.
- the haptic array device is utilized to carry out an automated or partially-automated physical therapy regimen.
- the haptic array devices provide stimulation to a portion of a subject as a haptic therapy.
- the haptic array device comprises a monitoring system and is utilized for gesture recognition as part of rehabilitation and training methods and systems.
- the monitoring system further identifies features of a portion of a user for monitoring of movements/gestures.
- haptic feedback is provided to help guide the user.
- haptic feedback may be utilized to confirm that the monitoring system has identified a target portion of a user or confirm that a guided movement has been properly completed by the user.
- haptic feedback is utilized to confirm a portion of the user is properly in view of the monitoring system.
- a laser system of the haptic array provides a visualization (e.g., a hologram or 3D image).
- a user may interact with the visualization produced by the laser system to simulate manipulation of an object.
- haptic feedback is provided as a user interacts with the visualization to confirm that the user’s actions are properly registered.
- Coupling of a haptic array device to an external device may be carried out through wired or wireless communication.
- the external device may communicate/transmit data or information obtained by the haptic array device.
- the external device may comprise additional systems for simultaneous communication, such as a camera and microphone for video communication.
- the haptic array device may be modular, and additional haptic arrays or components may be utilized to enhance performance of the system.
- systems and methods enable remote, objective assessment of cognitive and physical impairment for individuals with neurological disorders.
- the systems herein integrate high-resolution motion capture, augmented reality mid-air haptics, and biometric wearables to quantify neural activity and physical behavior and provide consistent, quantitative data that can precisely track outcomes over time.
- the systems and methods herein provide consistent, objective disease progression data across diverse populations.
- the systems and methods herein increase enrollment and access for rural and minority populations through remote assessments.
- the systems and methods herein enable continuous longitudinal tracking of treatment efficacy over long periods and reduce demographic biases in clinical evaluations via quantitative measures.
- the systems and methods herein employ an augmented reality (AR) mid-air haptics interface paired with high-resolution motion capture, a biometric wearable, and a head-worn wearable to correlate central brain activity, peripheral biometric signals, and functional performance.
- Patients can interact with virtual objects and a monitoring device to record their physical performance as well as their cognitive response to the movement and biometric data.
- the data is analyzed with the help of Artificial Intelligence (AI) algorithms and can be interpreted by a neurologist through a telehealth consultation.
- the systems and methods herein are utilized to evaluate performance of an individual.
- the system monitors a user as they perform a series of movements to evaluate a range of motion or correct execution of the movements by the user.
- the system may be utilized in diagnosis of injury, training, or evaluation of performance.
- the system may be utilized to monitor the progression of rehabilitation from an injury or diagnosis of an injury.
- the device projects acoustic and/or visual holograms for a user to interact with.
- the device provides instruction for interacting with the projected holograms.
- the monitoring system detects, tracks, and records the movements made by the user. The movements may be directed to a specific appendage or body part of the user or may be movements made by the whole body of the user.
- the user’s range of motion is evaluated by instructing the user to conduct a series of movements.
- the system may provide a diagnosis of a condition, evaluate severity of a condition, and/or track rehabilitation of a condition. For example, a user may be instructed to perform a series of movements of their forearm to diagnose the severity of lateral epicondylitis.
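A minimal sketch of how a range of motion could be scored from such a guided series, assuming pose estimation yields 3D joint positions per frame; the elbow is used only as an example joint.

```python
import numpy as np

def joint_angle(a, b, c) -> float:
    """Angle (degrees) at joint b formed by 3D points a-b-c."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos_ang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0))))

def range_of_motion(frames):
    """frames: sequence of (shoulder, elbow, wrist) 3D points captured
    during the movement; returns the (min, max) elbow angle observed."""
    angles = [joint_angle(s, e, w) for s, e, w in frames]
    return min(angles), max(angles)
```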
- the system is utilized for training purposes.
- a haptic array device, as disclosed herein, may be utilized to project haptics simulating a musical instrument (e.g., a piano or guitar).
- the device may track the movements of the user during interaction and provide feedback (e.g., visual and/or haptic feedback) to correct the user’s movements as they interact with the projections.
- the user may be trained on proper technique for using the physical instrument. For example, a pianist may be able to practice and receive feedback on their piano playing technique without requiring an entire piano or keyboard. In this sense, the device may provide a portable training tool.
- the systems and methods herein are utilized to provide a fully or partially automated physical therapy regimen. In some embodiments, the systems and methods herein are utilized to provide a guided series of movements for physical therapy. In some embodiments, the system performs a quantitative analysis of at least one portion of a user’s body (e.g., a hand and wrist of a user) using machine/computer vision via visible light and thermographic imaging.
- the device projects acoustic holograms and the user responds accordingly.
- a rehabilitation status may be correlated to the user interaction of the acoustic holograms.
- the haptic array device projects visual holograms for the user to respond to.
- a visual hologram is provided to depict a series of movements to be executed by the user for strengthening or rehabilitation purposes. Sonic haptics, laser haptics, thermal haptics, or a combination thereof may be provided to confirm correct execution of movements or notify the user that the movement was not correctly executed.
- instructions for a series of movements (exercises) to be executed are provided external to the system, while the system confirms correct execution of the movements via haptic feedback.
- the system records the movements of the user and sends them to a physician or physical therapist for analysis.
- the system is utilized during a remote physical therapy session to confirm that the user/patient is correctly performing the physical therapy exercises.
- a physical therapist records a physical therapy exercises utilizing their own system (e.g., a haptic array device, as disclosed herein).
- the user instructions are based on the exercises recorded by the physical therapist.
- the user exercises are monitored by the system and confirmation of correct position is provided if the user performs the exercises similarly to the recorded exercises.
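Confirming that a user's exercise matches the therapist's recording amounts to comparing two movement trajectories; dynamic time warping is one natural choice because it tolerates speed differences. The sketch below is illustrative, not the disclosed method; confirmation could be issued when the distance falls below a tuned threshold.

```python
import numpy as np

def dtw_distance(user: np.ndarray, reference: np.ndarray) -> float:
    """Dynamic-time-warping distance between two trajectories, each an
    (T, 3) array of positions sampled over time."""
    n, m = len(user), len(reference)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(user[i - 1] - reference[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a user sample
                                 cost[i, j - 1],      # skip a reference sample
                                 cost[i - 1, j - 1])  # advance both
    return float(cost[n, m])
```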
- the systems and methods herein are utilized to provide lost-limb simulation or prosthetic training for amputees.
- the system provides haptic feedback to a site of amputation to simulate feelings of the lost limbs or extremities.
- the systems herein monitor a site of amputation and provide visual feedback in response to movement.
- movement at the amputation site is correlated into movement of a simulated appendage or extremity.
- movement at the amputation site is correlated into movement of a simulated prosthetic.
- a system (e.g., a haptic array device, as disclosed herein) monitors muscle groups near a site of amputation.
- the haptic feedback is provided to the muscle groups to simulate feelings of engaging a prosthetic.
- the haptic feedback is provided to the muscle groups to guide a user as to which muscle groups should be engaged to induce a prosthetic to perform an action or motion.
- the system sends data acquired by a monitoring system to be used in designing a prosthetic for a patient.
- the data may include information as to the limit of the movements which may be performed at the amputation site.
- the prosthetic may then be better designed for a patient based on the movements they are able to perform.
- the system guides a patient through a series of movements designed to acquire information as to the range of motion of which the patient is capable. Therefore, customized prosthetics may be better designed based on each patient’s unique limitation of movement.
- the system herein is configured to provide haptic therapy to a portion of a user.
- sonic haptics, laser haptics, thermal haptics, and combinations thereof are directed to a portion of a user for therapeutic benefits.
- sonic haptics are directed toward a portion of a user to stimulate muscles or provide vibrational stimulation (similar to a massage gun). Vibrational stimulation may be generated by oscillation of sonic haptics.
- sonic haptics are directed toward a portion of a user to stimulate muscles or nerves.
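A common way to realize such vibrational stimulation (offered here only as a sketch) is to amplitude-modulate the ultrasonic carrier at a frequency the skin's mechanoreceptors sense well; the 40 kHz carrier and 200 Hz modulation below are illustrative values.

```python
import numpy as np

def am_drive(fs=1_000_000, carrier_hz=40_000, mod_hz=200, seconds=0.5):
    """Amplitude-modulate an ultrasonic carrier so the focal point
    pulses at a rate perceived as vibration (massage-like)."""
    t = np.arange(int(fs * seconds)) / fs
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))  # 0..1
    return envelope * np.sin(2 * np.pi * carrier_hz * t)
```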
- thermal radiation is directed toward a portion of a user to alleviate pain, muscle tension, and/or swelling.
- the system is configured to provide a therapeutic spa program.
- a user places a portion of their body (e.g., a hand or foot) within proximity of a haptic array device, as disclosed herein, and the device commences with a series of haptic stimulations to alleviate pain, muscle tension, swelling, etc.
- a monitoring system identifies portions of the user’s body to ensure the haptic stimulation is properly directed.
- the user is instructed to perform a series of movements under guidance of the system as part of a therapeutic spa program.
- Multiple sclerosis (MS) is a complex chronic disease of the central nervous system, thought to be an autoimmune disorder, in which the immune system mistakenly attacks the protective covering of nerve fibers, affecting nerve conduction and causing a wide range of neurological symptoms.
- Regular monitoring of MS is critical, as it allows healthcare providers to track disease progression, assess the effectiveness of treatments, and make necessary adjustments to the treatment plan. This can help individuals with MS manage their symptoms and maintain a better quality of life.
- MS is characterized by relapses or exacerbations, during which new or worsening symptoms occur. Monitoring allows healthcare providers to detect these relapses early and initiate appropriate interventions to minimize their impact.
- the systems herein provide a telemedicine solution for MS care.
- the system herein comprise an automated monitoring device that quantifies neural activity and physical behavior, to provide a telemedicine solution for MS care.
- the system herein enable healthcare providers to remotely assess and monitor the neural activity and physical behavior of patients, allowing for early detection of changes that may indicate disease progression.
- the system herein improve access to specialized MS care for underserved populations.
- the systems herein bridge the geographical and logistical gaps, allowing individuals to receive the care they need without the burden of travel.
- the system herein provide real- time feedback on neural activity and physical behavior, enabling patients to actively participate in their disease management and make informed decisions about their care.
- the system herein employ high-resolution motion capture technology to enable precise tracking and analysis of a patient's physical dynamics, providing valuable insights into their functional performance.
- the system herein objectively assess disease progression and monitor the effectiveness of interventions, which is a definitive advantage over more subjective measures of the progression of the disease such as the Expanded Disability Status Scale (EDSS).
- the system herein utilize augmented reality (AR) mid-air haptics, which adds a tactile element to the assessment process.
- This technology enables patients to interact with virtual objects in a realistic manner, providing a more immersive and engaging experience.
- the utilization of haptic technology in telemedicine is gaining significant traction due to its ability to provide realistic, tactile experiences for both patients and healthcare providers.
- the system herein enables healthcare providers to remotely assess patients' physical symptoms and provide targeted treatment plans.
- the system herein employ biometric wearables, which capture physiological data, such as heart rate and electrodermal activity, allowing for a comprehensive assessment of the patient's physical and cognitive state. By analyzing these biometric signals, the systems herein can extract useful features and provide objective assessments of disease progression and functional performance.
- the potential for collecting EEG, EMG and EOG data adds another dimension to the ability to gather objective data to be correlated with the self-reported status evaluations. This data will provide critical insights into the cognitive aspect of the progression of MS in the population studied.
- the systems herein employ generative AI algorithms to analyze the data collected from motion capture and biometric wearables and perform human pose estimation calculations. This capability allows for the multi-modal quantification of impairment and provides healthcare professionals with valuable insights into the patient's condition.
- the system herein provide a holistic assessment of both physical dynamics and brain activity.
- a patient will undergo an assessment session at a healthcare facility or at home.
- the patient can be fitted with biometric wearables and can interact with virtual objects using AR mid-air haptics.
- the high-resolution motion capture system may track their movements and capture data on their physical dynamics. Throughout the assessment, data will be collected and analyzed using generative AI algorithms. In some embodiments, these algorithms extract features from the biometric signals and motion capture data to perform human pose estimation calculations and provide healthcare professionals with objective assessments of the patient's cognitive and physical impairment, disease progression, and functional performance.
- the systems herein comprise a monitoring device that employs high-resolution motion capture to assess physical impairments, enabling the quantitative analysis of movement patterns and motor coordination.
- the monitoring device uses high resolution motion capture paired with an augmented reality (AR) mid-air haptics interface enabling users to dynamically interact with computer generated objects that have tactile and responsive feedback.
- the haptics interface simultaneously stimulates brain activity associated with sensory response to touch as well as causal response for tracking coordination.
- the monitoring device pairs with a biometric wrist-worn wearable that tracks surface electromyography, bioimpedance, electrodermal activity (EDA), volumetric blood flow via photoplethysmography (PPG), motion, or any combination thereof.
- the motion is measured using a 9-DOF inertial motion unit (IMU).
- the monitoring device pairs with a head worn wearable able to assess electroencephalography (EEG) signals, electromyography (EMG) signals, electro-oculography (EOG) signals, or any combination thereof.
- the signals are correlated to central brain activity, peripheral biometric signals and functional performance.
- the Artificial Intelligence (AI) algorithms are trained with comprehensive data on both physiological and neurological responses during functional tasks.
- the AI algorithms extract relevant features from the data, enabling calculations of human pose estimation and correlation with neural activity. By analyzing the extracted data using machine learning techniques, the systems herein enable robust and reliable measures of cognitive and physical impairments in individuals with MS.
- a patient’s signals are captured by an automated and portable monitoring device, in-home or in-clinic.
- Data can be captured asynchronously (e.g., self-guided by an AI-Assistant) or under the observation of a remote clinical operator via telemedicine.
- the systems herein enable access to care and to monitoring tools, and address challenges to reproducible examination to determine cognitive and physical impairment.
- Parkinson's disease is a devastating neurodegenerative disorder that severely impacts the quality of life for millions of individuals.
- accurate assessment of motor symptoms is paramount for disease progression tracking and evaluating the efficacy of interventions.
- because few reliable and comprehensive tools are available for the objective assessment of PD-related measures, effective management of the disease and the development of personalized treatments for patients have been challenging.
- the systems herein provide an innovative automated platform specifically designed to address the unmet need for accurate and comprehensive assessment of PD-related measures.
- the systems herein incorporate machine vision tracking, haptics, and biometric wearables, to provide a reliable and reproducible way to measure and monitor patients with Parkinson’s disease.
- One of the hallmark features of Parkinson's is the presence of motor deficits that significantly impact a patient's quality of life.
- Clinicians commonly utilize standardized rating scales like the Unified Parkinson's Disease Rating Scale (UPDRS) to evaluate various aspects of motor function, including tremors, bradykinesia, rigidity, and postural instability. While these assessments provide valuable insights into the severity of motor symptoms, progression of the disease, and response to therapeutic interventions, their qualitative manner often poses challenges. Motor symptoms such as bradykinesia (slowness of movement), rigidity, and tremors can vary in intensity and presentation from one patient to another. Clinicians often rely on their expertise and observational skills to assess these symptoms during a clinical examination.
- the UPDRS and its subscales provide a structured framework for evaluating motor symptoms, but they still involve subjective judgment.
- the systems herein quantify motor deficits in PD by collecting and analyzing objective measurements.
- the systems herein employ haptics technology and computer vision to estimate the motor abilities of Parkinson’s patients when manipulating virtual objects.
- the systems herein enable healthcare professionals to accurately track disease progression, evaluate the impact of various interventions, and tailor treatment plans to the specific needs of individual patients.
- the systems herein are easily implemented, and can be used independently by patients, or with the guidance of a neurologist.
- the systems herein combine behavior and movement evaluation, through haptic object manipulation, with physiological measurements such as surface electromyography, surface conductance, bio-impedance, electrodermal activity, temperature, heart rate, as well as other environmental factors like pressure, humidity, and altitude.
- the systems and methods herein employ a haptics projector, machine vision cameras, and a biometric wearable, with current gold standard clinical measures to achieve a comprehensive and reliable assessment of patients with neurological disorders.
- the systems and methods herein employ a haptics projector and machine vision cameras that allow individualized care for patients with neurological disorders.
- computer generated objects are projected through the haptics projector and can be manipulated by a patient.
- the machine vision cameras may track how patients interact with the projected objects for the quantification of their behavior versus expected motion.
- the methods and systems herein employ a biometric wearable to track, for example, surface electromyography, surface conductance, bioimpedance, electrodermal activity, temperature, heart rate, and other environmental factors like pressure, humidity, and altitude.
- the wearable device comprises a 9-DOF (degrees of freedom) IMU to track patients’ motion with high precision. Capturing all this information while the patient manipulates the computer-generated object in a gamified way allows quantitative and reproducible assessment of cognition, dexterity, coordination, and other measures used to determine the state of health for a person with a neurological disorder.
- the machine learning algorithms herein are trained on collected data and established clinical assessments and existing objective measures commonly utilized in PD-related research.
- the methods herein employ statistical analysis on data collected from both the PD patient group and the healthy control group as a validation.
- data is collected via a personalized assessment activity for a plurality of simulated patient profiles.
- each activity is designed to extract key features that assess cognitive and physical impairments relevant to PD and encompass a range of tasks to comprehensively evaluate the patients' motor skills, cognitive functions, and coordination abilities.
- a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range.
- description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
- a sample includes a plurality of samples, including mixtures thereof.
- the term “rehabilitation” is often used herein to refer to methods of recovery from or prevention of injuries, pain, and/or medical procedures. Methods may be guided or automated using the systems and devices disclosed herein.
- the terms “acoustic,” “sound,” and “sonic” are often used interchangeably herein to refer to mechanical pressure waves. Unless specified, the terms “acoustic” and “sonic” should broadly read on waveforms ranging through all sonic frequency ranges, including audible, inaudible, and ultrasonic frequencies.
- the term “about” a number refers to that number plus or minus 10% of that number.
- the term “about” a range refers to that range minus 10% of its lowest value and plus 10% of its greatest value.
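Both conventions are mechanical enough to state as code; the helper below (a sketch, with an arbitrary endpoint list) expands an "about" value to its ±10% window and enumerates the pairwise subranges disclosed by a list of endpoints.

```python
from itertools import combinations

def about(x: float, tol: float = 0.10) -> tuple:
    """'About x' reads on x +/- 10%: about(0.16) -> (0.144, 0.176)."""
    return (x * (1 - tol), x * (1 + tol))

def subranges(endpoints):
    """All pairwise subranges disclosed by a list of endpoints,
    e.g., [1, 3, 4, 6] -> (1,3), (1,4), (1,6), (3,4), (3,6), (4,6)."""
    return list(combinations(sorted(endpoints), 2))
```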
- Referring to FIG. 1, a block diagram is shown depicting an exemplary machine that includes a computer system 100 (e.g., a processing or computing system) within which a set of instructions can execute for causing a device to perform or execute any one or more of the aspects and/or methodologies of the present disclosure.
- the components in FIG. 1 are examples only and do not limit the scope of use or functionality of any hardware, software, embedded logic component, or a combination of two or more such components implementing particular embodiments.
- Computer system 100 may include one or more processors 101, a memory 103, and a storage 108 that communicate with each other, and with other components, via a bus 140.
- the bus 140 may also link a display 132, one or more input devices 133 (which may, for example, include a keypad, a keyboard, a mouse, a stylus, etc.), one or more output devices 134, one or more storage devices 135, and various tangible storage media 136. All of these elements may interface directly or via one or more interfaces or adaptors to the bus 140.
- the various tangible storage media 136 can interface with the bus 140 via storage medium interface 126.
- Computer system 100 may have any suitable physical form, including but not limited to one or more integrated circuits (ICs), printed circuit boards (PCBs), mobile handheld devices (such as mobile telephones or PDAs), laptop or notebook computers, distributed computer systems, computing grids, or servers.
- Computer system 100 includes one or more processor(s) 101 (e.g., central processing units (CPUs), general purpose graphics processing units (GPGPUs), or quantum processing units (QPUs)) that carry out functions.
- processor(s) 101 optionally contains a cache memory unit 102 for temporary local storage of instructions, data, or computer addresses.
- Processor(s) 101 are configured to assist in execution of computer readable instructions.
- Computer system 100 may provide functionality for the components depicted in FIG. 1 as a result of the processor(s) 101 executing non-transitory, processor-executable instructions embodied in one or more tangible computer-readable storage media, such as memory 103, storage 108, storage devices 135, and/or storage medium 136.
- the computer-readable media may store software that implements particular embodiments, and processor(s) 101 may execute the software.
- Memory 103 may read the software from one or more other computer-readable media (such as mass storage device(s) 135, 136) or from one or more other sources through a suitable interface, such as network interface 120.
- the software may cause processor(s) 101 to carry out one or more processes or one or more steps of one or more processes described or illustrated herein. Carrying out such processes or steps may include defining data structures stored in memory 103 and modifying the data structures as directed by the software.
- the memory 103 may include various components (e.g., machine readable media) including, but not limited to, a random-access memory component (e.g., RAM 104) (e.g., static RAM (SRAM), dynamic RAM (DRAM), ferroelectric random-access memory (FRAM), phase-change random access memory (PRAM), etc.), a read-only memory component (e.g., ROM 105), and any combinations thereof.
- ROM 105 may act to communicate data and instructions unidirectionally to processor(s) 101
- RAM 104 may act to communicate data and instructions bidirectionally with processor(s) 101.
- ROM 105 and RAM 104 may include any suitable tangible computer-readable media described below.
- a basic input/output system 106 (BIOS) including basic routines that help to transfer information between elements within computer system 100, such as during start-up, may be stored in the memory 103.
- Fixed storage 108 is connected bidirectionally to processor(s) 101, optionally through storage control unit 107.
- Fixed storage 108 provides additional data storage capacity and may also include any suitable tangible computer-readable media described herein.
- Storage 108 may be used to store operating system 109, executable(s) 110, data 111, applications 112 (application programs), and the like.
- Storage 108 can also include an optical disk drive, a solid-state memory device (e.g., flash-based systems), or a combination of any of the above.
- Information in storage 108 may, in appropriate cases, be incorporated as virtual memory in memory 103.
- storage device(s) 135 may be removably interfaced with computer system 100 (e.g., via an external port connector (not shown)) via a storage device interface 125.
- storage device(s) 135 and an associated machine-readable medium may provide non-volatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for the computer system 100.
- software may reside, completely or partially, within a machine-readable medium on storage device(s) 135.
- software may reside, completely or partially, within processor(s) 101.
- Bus 140 connects a wide variety of subsystems.
- reference to a bus may encompass one or more digital signal lines serving a common function, where appropriate.
- Bus 140 may be any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures.
- Computer system 100 may also include an input device 133.
- a user of computer system 100 may enter commands and/or other information into computer system 100 via input device(s) 133.
- Examples of an input device(s) 133 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device (e.g., a mouse or touchpad), a touchpad, a touch screen, a multi-touch screen, a joystick, a stylus, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), an optical scanner, a video or still image capture device (e.g., a camera), and any combinations thereof.
- the input device is a Kinect, Leap Motion, or the like.
- Input device(s) 133 may be interfaced to bus 140 via any of a variety of input interfaces 123 (e.g., input interface 123) including, but not limited to, serial, parallel, game port, USB, FIREWIRE, THUNDERBOLT, or any combination of the above.
- computer system 100 when computer system 100 is connected to network 130, computer system 100 may communicate with other devices, specifically mobile devices and enterprise systems, distributed computing systems, cloud storage systems, cloud computing systems, and the like, connected to network 130. Communications to and from computer system 100 may be sent through network interface 120.
- network interface 120 may receive incoming communications (such as requests or responses from other devices) in the form of one or more packets (such as Internet Protocol (IP) packets) from network 130, and computer system 100 may store the incoming communications in memory 103 for processing.
- Computer system 100 may similarly store outgoing communications (such as requests or responses to other devices) in the form of one or more packets in memory 103 and communicated to network 130 from network interface 120.
- Processor(s) 101 may access these communication packets stored in memory 103 for processing.
- Examples of the network interface 120 include, but are not limited to, a network interface card, a modem, and any combination thereof.
- Examples of a network 130 or network segment 130 include, but are not limited to, a distributed computing system, a cloud computing system, a wide area network (WAN) (e.g., the Internet, an enterprise network), a local area network (LAN) (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a direct connection between two computing devices, a peer-to-peer network, and any combinations thereof.
- a network, such as network 130 may employ a wired and/or a wireless mode of communication. In general, any network topology may be used.
- Information and data can be displayed through a display 132.
- Examples of a display 132 include, but are not limited to, a cathode ray tube (CRT), a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display such as a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display, a plasma display, and any combinations thereof.
- the display 132 can interface to the processor(s) 101, memory 103, and fixed storage 108, as well as other devices, such as input device(s) 133, via the bus 140.
- the display 132 is linked to the bus 140 via a video interface 122, and transport of data between the display 132 and the bus 140 can be controlled via the graphics control 121.
- the display is a video projector.
- the display is a head-mounted display (HMD) such as a VR headset.
- suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headset, and the like.
- the display is a combination of devices such as those disclosed herein.
- computer system 100 may include one or more other peripheral output devices 134 including, but not limited to, an audio speaker, a printer, a storage device, and any combinations thereof.
- peripheral output devices may be connected to the bus 140 via an output interface 124.
- Examples of an output interface 124 include, but are not limited to, a serial port, a parallel connection, a USB port, a FIREWIRE port, a THUNDERBOLT port, and any combinations thereof.
- computer system 100 may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute one or more processes or one or more steps of one or more processes described or illustrated herein.
- Reference to software in this disclosure may encompass logic, and reference to logic may encompass software.
- reference to a computer-readable medium may encompass a circuit (such as an IC) storing software for execution, a circuit embodying logic for execution, or both, where appropriate.
- the present disclosure encompasses any suitable combination of hardware, software, or both.
- The various illustrative logic blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
- a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
- An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
- the storage medium may be integral to the processor.
- the processor and the storage medium may reside in an ASIC.
- the ASIC may reside in a user terminal.
- the processor and the storage medium may reside as discrete components in a user terminal.
- suitable computing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, media streaming devices, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles.
- Suitable tablet computers include those with booklet, slate, and convertible configurations, known to those of skill in the art.
- the computing device includes an operating system configured to perform executable instructions.
- the operating system is, for example, software, including programs and data, which manages the device’s hardware and provides services for execution of applications.
- suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®.
- suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®.
- the operating system is provided by cloud computing.
- suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®.
- suitable media streaming device operating systems include, by way of non-limiting examples, Apple TV®, Roku®, Boxee®, Google TV®, Google Chromecast®, Amazon Fire®, and Samsung® HomeSync®.
- suitable video game console operating systems include, by way of non-limiting examples, Sony® PS3®, Sony® PS4®, Microsoft® Xbox 360®, Microsoft Xbox One, Nintendo® Wii®, Nintendo® Wii U®, and Ouya®.
- Non-transitory computer readable storage medium
- the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computing device.
- a computer readable storage medium is a tangible component of a computing device.
- a computer readable storage medium is optionally removable from a computing device.
- a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, distributed computing systems including cloud computing systems and services, and the like.
- the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
- the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same.
- a computer program includes a sequence of instructions, executable by one or more processor(s) of the computing device’s CPU, written to perform a specified task.
- Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), computing data structures, and the like, that perform particular tasks or implement particular abstract data types.
- a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.
- a computer program includes a web application.
- a web application in various embodiments, utilizes one or more software frameworks and one or more database systems.
- a web application is created upon a software framework such as Microsoft® .NET or Ruby on Rails (RoR).
- a web application utilizes one or more database systems including, by way of non-limiting examples, relational, nonrelational, object oriented, associative, XML, and document-oriented database systems.
- suitable relational database systems include, by way of non-limiting examples, Microsoft® SQL Server, MySQL™, and Oracle®.
- a web application in various embodiments, is written in one or more versions of one or more languages.
- a web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side coding languages, database query languages, or combinations thereof.
- a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or extensible Markup Language (XML).
- a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS).
- a web application is written to some extent in a client-side scripting language such as Asynchronous JavaScript and XML (AJAX), Flash® ActionScript, JavaScript, or Silverlight®.
- a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion®, Perl, Java™, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), Python™, Ruby, Tcl, Smalltalk, WebDNA®, or Groovy.
- a web application is written to some extent in a database query language such as Structured Query Language (SQL).
- a web application integrates enterprise server products such as IBM® Lotus Domino®.
- a web application includes a media player element.
- a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe® Flash®, HTML 5, Apple® QuickTime®, Microsoft® Silverlight®, Java™, and Unity®.
- an application provision system comprises one or more databases 200 accessed by a relational database management system (RDBMS) 210.
- RDBMSs include Firebird, MySQL, PostgreSQL, SQLite, Oracle Database, Microsoft SQL Server, IBM DB2, IBM Informix, SAP Sybase, Teradata, and the like.
- the application provision system further comprises one or more application servers 220 (such as Java servers, .NET servers, PHP servers, and the like) and one or more web servers 230 (such as Apache, IIS, GWS, and the like).
- the web server(s) optionally expose one or more web services via application programming interfaces (APIs) 240.
- an application provision system alternatively has a distributed, cloud-based architecture 300 and comprises elastically load balanced, auto-scaling web server resources 310 and application server resources 320 as well as synchronously replicated databases 330.
- Mobile application
- a computer program includes a mobile application provided to a mobile computing device.
- the mobile application is provided to a mobile computing device at the time it is manufactured.
- the mobile application is provided to a mobile computing device via the computer network described herein.
- a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, Java™, JavaScript, Pascal, Object Pascal, Python™, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
- Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator®, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, Android™ SDK, BlackBerry® SDK, BREW SDK, Palm® OS SDK, Symbian SDK, webOS SDK, and Windows® Mobile SDK.
- a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in.
- standalone applications are often compiled.
- a compiler is a computer program(s) that transforms source code written in a programming language into binary object code such as assembly language or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program.
- a computer program includes one or more executable compiled applications.
- the computer program includes a web browser plug-in (e.g., extension, etc.).
- a plug-in is one or more software components that add specific functionality to a larger software application.
- Makers of software applications support plugins to enable third-party developers to create abilities which extend an application, to support easily adding new features, and to reduce the size of an application.
- plugins enable customizing the functionality of a software application.
- plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types.
- the toolbar comprises one or more web browser extensions, add-ins, or add-ons.
- the toolbar comprises one or more explorer bars, tool bands, or desk bands.
- plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, Java™, PHP, Python™, and VB.NET, or combinations thereof.
- Web browsers are software applications, designed for use with network-connected computing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft® Internet Explorer®, Mozilla® Firefox®, Google® Chrome, Apple® Safari®, Opera Software® Opera®, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser. Mobile web browsers (also called microbrowsers, mini-browsers, and wireless browsers) are designed for use on mobile computing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems.
- Suitable mobile web browsers include, by way of non-limiting examples, Google® Android® browser, RIM BlackBerry® Browser, Apple® Safari®, Palm® Blazer, Palm® WebOS® Browser, Mozilla® Firefox® for mobile, Microsoft® Internet Explorer® Mobile, Amazon® Kindle® Basic Web, Nokia® Browser, Opera Software® Opera® Mobile, and Sony® PSP™ browser.
- the platforms, systems, media, and methods disclosed herein include software, server, and/or database modules, or use of the same.
- software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art.
- the software modules disclosed herein are implemented in a multitude of ways.
- a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof.
- a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
- the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application.
- software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on a distributed computing platform such as a cloud computing platform. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
- the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same.
- suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object-oriented databases, object databases, entity-relationship model databases, associative databases, XML databases, and document-oriented databases. Further non-limiting examples include SQL, PostgreSQL, MySQL, Oracle, DB2, Sybase, and MongoDB.
- a database is Internet- based.
- a database is web-based.
- a database is cloud computing based.
- a database is a distributed database.
- a database is based on one or more local computer storage devices.
- Example 1: Automated Physical Therapy of a Wrist Joint
- a user interacts with a haptic array device, as disclosed herein, to perform a series of movements as part of a physical therapy program or exercise.
- Instructions may be provided to the haptic array device to implement the physical therapy program. Instructions may be provided via an external device, downloaded from a web/cloud server, provided by a media storage system, etc.
- Physical therapy sessions may be based on movements recorded by a physical therapist or instructor, as disclosed herein.
- a physical therapy program refers to a series of exercises or movements to be completed by a user.
- a physical therapy program may comprise a series of sessions wherein exercises are completed over a duration of time for purposes of training or rehabilitation.
- while conducting a physical therapy session, a monitoring system may record a user’s actions and be utilized to track the user’s progress.
- the exemplary embodiment herein refers to a physical therapy program which comprises a series of physical therapy sessions, wherein each physical therapy session comprises a series of exercises.
- a physical therapy session begins when a physical therapy program is provided to the haptic array device, at step 610.
- a physical therapy program may include instructions for monitoring and providing haptic feedback to a user as they complete a series of guided exercises.
- the device is ready to guide and monitor gestures made by a user.
- the user is guided to an initial position for an exercise.
- the haptic array device is configured to guide a series of exercises a user makes with one hand as part of a physical therapy program for a wrist joint.
- the haptic array device produces a visualization of where the user should place their hand, fingers, arm, or a combination thereof. The visualization may be a marker to indicate where the user should place the center of their palm or hand.
- the haptic array device provides feedback to confirm the user’s hand is in a proper position.
- the feedback comprises haptic feedback from the transducer array and/or a laser system of the device.
- the sonic haptics may be applied to the palm to indicate proper positioning and the laser haptics may be applied to each fingertip to indicate that the hand and fingers are in proper position and that the device is ready to monitor the user’s gestures.
- the device guides the user to move their hand to perform a physical therapy exercise. Guiding of the user’s hand may comprise visual and/or haptic feedback.
- the user completes a series of gestures or movements which are captured by the monitoring system of the haptic array device, according to some embodiments.
- a series of gestures might include at least two gestures made by the hand, or at least one movement from a first gesture to a second gesture. In some embodiments, the number of gestures is limited by the device. In some embodiments, the number of gestures is chosen by the user.
- successful completion of a prescribed movement is confirmed by haptic feedback provided by the device.
- the user is instructed to repeat the same movement as part of an exercise. After a prescribed number of repetitions, completion of an exercise is confirmed at step 670.
- a physical therapy program comprises a series of exercises, and upon completion of a first exercise a subsequent exercise is provided and steps 620 to 660 are repeated for each subsequent exercise.
- upon completion of a physical therapy session, a user is provided with confirmation that the physical therapy session has been completed.
- data from the monitoring system is compiled to update a user and/or their physical therapist as to the progress made during each session.
- completion of a physical therapy session initiates a haptic therapy or spa program, as described herein.
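For illustration only, the session flow of steps 610 through 670 might be sketched as follows; the HapticArrayDevice methods used here (guide_to_initial_position, capture_gesture, and so on) are hypothetical names, not an API disclosed herein.

```python
# Hypothetical sketch of the physical-therapy session loop (steps 610-670).
# The device API shown here is illustrative, not the actual device interface.
from dataclasses import dataclass

@dataclass
class Exercise:
    name: str
    repetitions: int

def run_session(device, program: list[Exercise]) -> None:
    for exercise in program:                        # step 610: program provided
        device.guide_to_initial_position(exercise)  # step 620: marker for hand placement
        while not device.position_confirmed():      # haptic feedback confirms placement
            device.emit_positioning_haptics()
        for _ in range(exercise.repetitions):       # steps 630-660: guided movements
            device.guide_movement(exercise)
            gesture = device.capture_gesture()      # monitoring system records motion
            if device.movement_complete(gesture):
                device.emit_confirmation_haptics()
        device.confirm_exercise_complete(exercise)  # step 670
    device.confirm_session_complete()               # optional progress report or spa program
```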
- patient Jacob is diagnosed as having multiple sclerosis and is instructed by his physician to employ the systems herein.
- the monitoring device displays an image of three balls and Jacob is instructed to manipulate one of the balls.
- the haptic feedback of the monitoring device provides Jacob with the sensation that his fingers are contacting the ball.
- a camera and a time-of-flight sensor on the monitoring device and his ancillary device record a movement of Jacob’s hand as he interacts with the ball, while the biometric sensor of his ancillary device records his pulse.
- a delay is calculated between the measured orientation and movement of Jacob’s hand and a recorded measurement and orientation of a healthy patient. Jacob’s pain throughout his movement is determined in part based on his recorded pulse. The delay and pain measurements are provided to Jacob in an assessment.
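As a non-authoritative sketch, the delay and pain determinations in this example might be computed as follows, assuming NumPy, a uniformly sampled hand trajectory, and a known resting pulse; the function names and the pain proxy are illustrative.

```python
# Illustrative sketch: estimate the delay between a patient's recorded hand
# trajectory and a healthy reference trajectory via cross-correlation, and
# use elevated pulse as a crude pain proxy. Thresholds are hypothetical.
import numpy as np

def estimate_delay(patient: np.ndarray, healthy: np.ndarray, fs_hz: float) -> float:
    """Return the lag (seconds) at which the patient trace best matches the reference."""
    xcorr = np.correlate(patient - patient.mean(), healthy - healthy.mean(), mode="full")
    lag_samples = xcorr.argmax() - (len(healthy) - 1)
    return lag_samples / fs_hz

def pain_score(pulse_bpm: np.ndarray, resting_bpm: float) -> float:
    """Crude proxy: mean pulse elevation over the resting rate, clipped to [0, 1]."""
    elevation = (pulse_bpm.mean() - resting_bpm) / resting_bpm
    return float(np.clip(elevation, 0.0, 1.0))
```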
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Human Computer Interaction (AREA)
- Physiology (AREA)
- Cardiology (AREA)
- Radiology & Medical Imaging (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Pulmonology (AREA)
- Evolutionary Computation (AREA)
- Data Mining & Analysis (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Computer Networks & Wireless Communication (AREA)
- Mathematical Physics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Dermatology (AREA)
- Neurology (AREA)
Abstract
Described are systems, devices, and methods for providing interactive and tactile projections of sound and light in a rehabilitation and training system. Monitoring devices remotely assess and monitor the neural activity and physical behavior of patients, allowing for early detection of changes that may indicate disease progression. In some embodiments, the monitoring devices herein provide real-time feedback, enabling patients to actively participate in their disease management and make informed decisions about their care. In some embodiments, the monitoring devices herein use high resolution motion capture paired with an augmented reality (AR) mid-air haptics interface enabling users to dynamically interact with computer generated objects that have tactile and responsive feedback.
Description
REHAB AND TRAINING INTERACTIVE AND TACTILE
PROJECTIONS OF SOUND AND LIGHT
CROSS-REFERENCE
[0001] This application claims the benefit of U.S. Provisional Application No. 63/379,381, filed October 13, 2022, which is/are hereby incorporated by reference in its/their entirety herein.
BACKGROUND
[0002] Proper rehabilitation of injuries, including injuries which may cause permanent disabilities in a patient, often require consistent manual therapy and attention. Conventional methods require attention by a trained physical therapist, often requiring the patient or therapist to travel for in-person physical therapy. While virtual therapy via video conference provides a remote option, limited viewing angles and interaction between a therapist and patient may hinder the quality of a therapy session. Provided herein are systems and methods for rehabilitative therapy provided via an interactive system for providing tactile projections of sound and light.
SUMMARY
[0003] Provided herein are devices, systems, and methods for emission of electromagnetic waves and/or mechanical waves. In some embodiments, devices and systems incorporate mechanical elements to provide non-laser focused mechanical pressure waves in the human audible spectrum (i.e., about 20 Hz to 20 kHz) and/or human non-audible spectrum. Mechanical elements may include parametric speaker arrays such as ultrasonic speaker arrays, piezo speakers, or electromagnetic speakers, and the like. In some embodiments, beam forming and/or beam shaping methods are utilized to focus, direct, or otherwise manipulate waves propagated from the systems and devices disclosed herein.
[0004] In some embodiments, the devices and systems incorporate optical elements to provide laser focused mechanical pressure waves in the human audible spectrum and/or human non-audible spectrum. Optical elements may also be utilized to provide optical signals
in the infrared, near infrared, or visible light spectrum. Optical elements may include lasers, light emitting diodes, lenses, mirrors, or a combination thereof.
[0005] In some embodiments, devices and systems incorporate thermal elements to alter an ambient temperature. In some embodiments, thermal elements are utilized to lower an ambient temperature. In some embodiments, thermal elements are utilized to raise an ambient temperature. In some embodiments, thermal elements are utilized to adjust an ambient temperature between about 0° C to about 100° C. In some embodiments, temperature sensors are incorporated to measure temperatures of surfaces or areas which may interact with the thermal elements. In some embodiments, temperature sensors allow for dynamic adjustment of the thermal elements, as disclosed herein.
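A minimal sketch of the dynamic thermal adjustment described above, assuming hypothetical sensor and thermal-element interfaces and a simple proportional rule, might read:

```python
# Minimal sketch of dynamic thermal-element adjustment from temperature-sensor
# feedback. A simple proportional controller; the sensor and element
# interfaces are hypothetical.
def regulate_temperature(sensor, element, target_c: float, gain: float = 0.5) -> None:
    measured_c = sensor.read_celsius()   # temperature of the surface or area
    error_c = target_c - measured_c
    # Positive error -> add heat; negative error -> cool. Clamp to element limits.
    power = max(-1.0, min(1.0, gain * error_c))
    element.set_output(power)            # -1.0 = full cooling, +1.0 = full heating
```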
[0006] In some embodiments, devices and systems include interferometric elements to measure mechanical pressure waves or optical waves. In some embodiments, interferometric elements are utilized for dynamic adjustment of optical elements, emission of electromagnetic waves, and/or emission of mechanical waves.
[0007] In some embodiments, devices and systems include optical sensors. In some embodiments, optical sensors are utilized to dynamically measure mechanical waves, optical waves, and motion/position of objects (e.g., animate and inanimate objects such as people, cars, rocks, etc.). In some embodiments, an optical sensor is provided to capture images at a rate of 10 Hz to 10,000 Hz. Said captured images may be combined into a video format. In some embodiments, an optical sensor comprises a camera. In some embodiments, optical sensors include infrared, near infrared, visible light, and ultra-violet spectrum sensors. In some embodiments, optical sensors comprise three-dimensional (3D) spectroscopic cameras capable of sensing in infrared (IR), near infrared, visible light, and/or ultra-violet spectrum. In some embodiments, systems utilize multiple stereo infrared (IR) imaging devices.
[0008] In some embodiments, systems and devices incorporate one or more computational elements (e.g., a microcontroller, application specific integrated circuit, single board computer, edge computing device, quantum computing device, etc.) to perform data processing and real-time data processing for dynamic output signal conditioning and adjustment based on desired output and measured signal inputs, as disclosed herein.
[0009] In some embodiments, systems include closed mesh network elements for self-recognizing interact-ability with like devices to allow constructive or destructive distributed signal modification. In some embodiments, systems include open network elements (e.g., 3G, 4G, 5G, long range (LoRa), and the like) to enable connection to internet, intranet, distributed
computing network (cloud computing). In some embodiments, systems include electrical elements to generate, consume, receive, and transmit power (e.g., solar panels, rechargeable battery, battery, wireless energy transmission / reception components, and the like) to provide power to the system and similar devices within a known proximity. In some embodiments, communication between devices utilizes free space optics communication and has the ability to adjust data transmission bandwidth based on power consumption restrictions.
[0010] According to some embodiments, provided herein is a system for haptic interaction, the system comprising: a haptic array comprising a plurality of ultrasonic devices; a camera; a light source; a thermal element; and a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a first acoustic field having a first focal point; directing the light source to emit light at or near the first focal point, direct the thermal element to emit heat at or near the first focal point, or both; determining a user motion based on data received by the camera; and based on the user motion, directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a second acoustic field having a second focal point.
[0011] In some embodiments, the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof. In some embodiments, the camera captures data at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the user motion is a motion of an appendage of a user. In some embodiments, the operations further comprise determining a measured position of the focal point based on data received by the camera. In some embodiments, the operations further comprise directing at least a portion of the plurality of ultrasonic devices based on the measured position. In some embodiments, the light source comprises a laser, a light emitting diode, a light bulb, or any combination thereof. In some embodiments, the emitted light has a wavelength of about 10 nm to about 10,000 nm. In some embodiments, the emitted light has
a frequency of about 0.3 THz to about 300 THz. In some embodiments, the system further comprises an interferometric device, wherein the operations further comprise calibrating the haptic array based on data received from the interferometric device. In some embodiments, the interferometric device comprises a laser Doppler vibrometer, a laser interferometer, an acoustic interferometer, or any combination thereof. In some embodiments, the system further comprises a communication device, wherein the operations further comprise transmitting the user motion, the data received by the camera, or any combination thereof, via the communication device. In some embodiments, the communication device comprises a cellular device, a Wi-Fi device, a mesh network device, a satellite device, a Bluetooth device, or any combination thereof. In some embodiments, the system further comprises an energy storage device providing power to the haptic array, the camera, the non-transitory computer-readable storage media, or any combination thereof. In some embodiments, the energy storage device comprises a battery, a supercapacitor, or any combination thereof. In some embodiments, the operations further comprise: directing the light source to emit light at or near the second focal point; directing the thermal element to emit heat at or near the second focal point; or both. In some embodiments, the operations further comprise determining an object position, an object motion, or both, based on data received by the camera. In some embodiments, the operations further comprise: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit a third acoustic field based on the object position, the object motion, or both; directing the light source based on the object position, the object motion, or both; directing the thermal element based on the object position, the object motion, or both; or any combination thereof.
[0012] According to some embodiments, provided herein is a computer-implemented method of haptic interaction, the method comprising: directing, by a computer, one or more ultrasonic devices in a haptic array to emit an acoustic field having a first focal point; directing, by the computer, a light source to emit light at or near the first focal point, direct a thermal element to emit heat at or near the first focal point, or both; determining, by the computer, a user motion based on data received by a camera; and based on the user motion, directing, by the computer, at least a portion of the plurality of ultrasonic devices in the haptic array to emit a second acoustic field having a second focal point.
[0013] In some embodiments, the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments,
the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof. In some embodiments, the data is received by the camera at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the method further comprises calibrating, by the computer, the haptic array based on data received from an interferometric device. In some embodiments, the method further comprises determining, by the computer, a measured position of the focal point based on data received by the camera. In some embodiments, the method further comprises directing, by the computer, at least a portion of the plurality of ultrasonic devices based on the measured position. In some embodiments, the method further comprises directing, by the computer, the light source to emit light at or near the second focal point; directing, by the computer, the thermal element to emit heat at or near the second focal point; or both. In some embodiments, the method further comprises determining, by the computer, an object position, an object motion, or both, based on data received by the camera. In some embodiments, the method further comprises directing, by the computer, at least a portion of the plurality of ultrasonic devices in the haptic array to emit a third acoustic field based on the object position, the object motion, or both; directing, by the computer, the light source based on the object position, the object motion, or both; directing, by the computer, the thermal element based on the object position, the object motion, or both; or any combination thereof.
[0014] One aspect provided herein is a system for assessment of a patient, the system comprising: an ancillary device comprising a biometric sensor configured to measure a biometric data; a monitoring device comprising: a display configured to show a display image; a camera, a time-of-flight sensor, or both, configured to capture a plurality of pose images of the patient; and a haptic array comprising a plurality of ultrasonic devices; and a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising: directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit an acoustic field based on the display image; determining two or more patient poses based at least in part on the biometric data and the two or more pose images; and determining the assessment of the patient based at least in part on the display image and at least a portion of the plurality of patient poses. In some embodiments, the ancillary device is configured to couple to an appendage of the patient. In some embodiments, the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound
sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof. In some embodiments, the haptic array is a planar array. In some embodiments, the haptic array is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprise an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. In some embodiments, the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, a thermographic camera, or any combination thereof. In some embodiments, the camera, the time-of-flight sensor, or both, captures data at a rate of about 10 Hz to 10,000 Hz. In some embodiments, each of the two or more patient poses comprise a position, an orientation, or both of an appendage of the patient. In some embodiments, the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm. In some embodiments, the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses. In some embodiments, the monitoring device further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof. In some embodiments, the monitoring device or the ancillary device comprise the non-transitory computer-readable storage media. In some embodiments, the ancillary device further comprises an ancillary communication device and wherein the monitoring device further comprises a monitoring communication device communicably coupled to the ancillary device. In some embodiments, the ancillary communication device and the monitoring communication device are wireless communication devices.
[0015] Another aspect provided herein is a computer-implemented method of assessing a patient, the method comprising: showing a display image on a display while: receiving, from an ancillary device, a biometric data; capturing, by a camera, a time-of-flight sensor, or both, a plurality of pose images of the patient; and emitting, by a haptic array comprising a plurality of ultrasonic devices, an ultrasonic haptic based on the display image; determining two or more patient poses based at least in part on the biometric data and the two or more pose images; and determining the assessment of the patient based at least in part on the
display image and at least a portion of the plurality of patient poses. In some embodiments, the biometric data comprises an inertial motion unit data, a photoplethysmography data, a photoacoustic data, an ultrasound data, a glucose data, a bioimpedance data, an electrodermal activity data, a temperature data, a vision shadow capture data, an altitude data, a pressure data, a humidity data, a sweat rate data, a hydration data, a bioacoustics data, a dynamometer data, an electrodermal data, or any combination thereof. In some embodiments, at least a portion of the plurality of pose images comprise a two-dimensional image. In some embodiments, at least a portion of the plurality of pose images comprise a three-dimensional image. In some embodiments, at least a portion of the plurality of pose images comprise an infrared image, a near infrared image, a visible light image, an ultra-violet spectrum image, a thermographic image, or any combination thereof. In some embodiments, each of the two or more patient poses comprise a position, an orientation, or both of an appendage of the patient. In some embodiments, the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm. In some embodiments, the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
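As an illustrative sketch of the pose comparison described above (not the disclosed machine learning algorithm), patient poses could be scored against labeled healthy poses by mean per-joint deviation; the array shapes and the scoring rule are assumptions.

```python
# Illustrative sketch of the pose-comparison assessment: score a patient's
# pose sequence against labeled healthy poses by mean per-joint deviation.
# Array shapes and the scoring rule are assumptions.
import numpy as np

def assess_poses(patient_poses: np.ndarray, healthy_poses: np.ndarray) -> float:
    """
    patient_poses, healthy_poses: arrays of shape (n_poses, n_joints, 3),
    giving 3D joint positions for each captured pose.
    Returns a deviation score (lower is closer to the healthy reference).
    """
    per_joint_error = np.linalg.norm(patient_poses - healthy_poses, axis=-1)
    return float(per_joint_error.mean())
```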
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
[0017] FIG. 1 shows a non-limiting example of a computing device; in this case, a device with one or more processors, memory, storage, and a network interface;
[0018] FIG. 2 shows a non-limiting example of a web/mobile application provision system; in this case, a system providing browser-based and/or native mobile user interfaces;
[0019] FIG. 3 shows a non-limiting example of a cloud-based web/mobile application provision system; in this case, a system comprising an elastically load balanced, auto-scaling web server and application server resources as well as synchronously replicated databases;
[0020] FIG. 4 shows a non-limiting example of a haptic array control system;
[0021] FIG. 5 shows a non-limiting example of a haptic array device;
[0022] FIG. 6 depicts a non-limiting example of a method of using a haptic array device as part of a physical therapy session;
[0023] FIG. 7A shows an image of an exemplary system for assessing a patient, per one or more embodiments herein;
[0024] FIG. 7B shows an image of a user manipulating a virtual cube with an exemplary system for assessing a patient, per one or more embodiments herein;
[0025] FIG. 8A shows an image of a user throwing a virtual ball with an exemplary system for assessing a patient, per one or more embodiments herein;
[0026] FIG. 8B shows an image of a user manipulating one of three balls with an exemplary system for assessing a patient, per one or more embodiments herein;
[0027] FIG. 9A shows a first image of an exemplary system for assessing a patient, per one or more embodiments herein;
[0028] FIG. 9B shows a second image of an exemplary system for assessing a patient, per one or more embodiments herein;
[0029] FIG. 10A shows a third image of an exemplary system for assessing a patient, per one or more embodiments herein;
[0030] FIG. 10B shows an image of an exemplary ancillary device, per one or more embodiments herein;
[0031] FIG. 11 shows an image of a first exemplary assessment of the patient, per one or more embodiments herein;
[0032] FIG. 12 shows an image of a second exemplary assessment of the patient, per one or more embodiments herein;
[0033] FIG. 13 is a flowchart of an exemplary method for individualized patient care, per one or more embodiments herein; and
[0034] FIG. 14 is a flowchart of a computer-implemented method of assessing a patient, per one or more embodiments herein.
DETAILED DESCRIPTION
[0035] Provided herein are embodiments of a system for providing haptic feedback comprising a haptic array. In some embodiments, the haptic feedback system utilizes a combination of optic and acoustic fields simultaneously. In some embodiments, generated optic and acoustic fields have no direct interference, however, combining them provides benefits such as multi-resolution haptic images and a synergistic effect on haptic perception.
In some embodiments, the fields are applied simultaneously as elastic waves to stimulate nerve signals. In some embodiments, the optic field is utilized to simulate or produce a “skin feeling,” or feeling of touch. In some embodiments, the acoustic field is utilized to apply pressure. Combining two fields of different physical quantities would provide not only the superposition effect proposed above but also synergistic effects such as modification of the feeling.
[0036] FIG. 4 shows a diagram of the components of a haptic array device, according to some embodiments. FIG. 5 depicts a haptic array device, according to some embodiments. In some embodiments, the system is parametric. In some embodiments, the non-linearity of the frequency response produced by multiple ultrasonic frequencies in air is modeled utilizing parametric equations (and, resultingly, the effect is best modeled with parametric equations). The parametric equations may be utilized in computer and/or machine learning systems to condition and adjust the system output.
[0037] In some embodiments, the system includes Field Programmable Gate Arrays (FPGAs), machine learning, autonomous control systems, fast-networking, fast-self healing, interferometer sensors, ultrasonic speaker arrays, and the like. In some embodiments, the system utilizes laser interferometer technology to measure the response of an environment, one or more objects, or a combination thereof to dynamically change parameters and achieve desired effects. In some embodiments, a laser interferometer system sends out a two-beam laser to measure vibration of a surface. In some embodiments, a laser interferometer is used to receive vibration signals to calibrate the output of the ultrasonic transducer array to effectively beamform the audio waves to focus on one or more points on a subject or object.
[0038] In some embodiments, a parametric speaker array is a highly directive speaker that consists of an array of ultrasonic transducers that exploit the nonlinear properties of air to self-demodulate modulated ultrasonic signals with the aim of creating narrow, focused sound waves (audible and inaudible). In some embodiments, the ultrasonic transducers are piezoelectrically driven.
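To make the self-demodulation principle concrete, a brief numerical sketch follows: an audible signal is amplitude-modulated onto a 40 kHz ultrasonic carrier, and the non-linearity of air recovers the audible envelope. All parameter values below are illustrative.

```python
# Illustrative sketch of the modulation step behind a parametric speaker array:
# an audible signal is amplitude-modulated onto an ultrasonic carrier, and the
# nonlinearity of air self-demodulates the envelope into audible sound.
# Values (40 kHz carrier, 1 kHz tone, 0.8 modulation depth) are illustrative.
import numpy as np

fs = 192_000                    # sample rate high enough for a 40 kHz carrier
t = np.arange(0, 0.01, 1 / fs)  # 10 ms of signal
audio = np.sin(2 * np.pi * 1_000 * t)     # audible 1 kHz tone
carrier = np.sin(2 * np.pi * 40_000 * t)  # 40 kHz ultrasonic carrier
modulated = (1 + 0.8 * audio) * carrier   # AM signal driven to the transducers
```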
[0039] In some embodiments, the system utilizes one or more parametric speaker/transducer arrays. In some embodiments, each transducer array comprises multiple transducers. In some embodiments, the multiple transducers of each array output the same signal, which is amplified by constructive interference. In some embodiments, two or more arrays are configured to further amplify a signal via constructive interference. Further, a plurality of speaker arrays may be utilized to precisely direct sound or amplify sound at a precise location. Use of a parametric speaker array may replace the traditional approach of broadcasting audio
through distributed & coherent beamforming functionality. This approach offers the capability of numerous smaller devices to output the same audio volume as a single large device. In contrast, current acoustic hailing or loudspeaker systems focus on high energy output over focused energy output, requiring large and powerful emitters that are difficult to move and/or emplace. In some embodiments, the system and methods herein allow for high powered acoustic energy signals to be achieved with a system which is relatively compact and has low power requirements.
[0040] In some embodiments, the system combines the laser interferometer and parametric speaker array technologies with the distributed coherent beamforming technique through a network capable control system that uses algorithms and/or machine learning (ML) to rapidly tune the audio effect to mitigate destructive environmental noise and to enable effective beam coherence. Therefore, in some embodiments, the system provides autonomous environmental adjustments and distributed coherence beam forming.
[0041] In some embodiments, the inventive device combines three fundamental technologies:
(1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms,
(2) a laser interferometer to measure environmental noise data (e.g., ambient noise, wind spike, etc.) and record audio, and (3) a network-connected system controller to manage data from both the network and the individual components. In some embodiments, the inventive device combines four fundamental technologies: (1) a small, ultrasonic parametric speaker array for broadcasting focused acoustic waveforms, (2) one or more lasers for generating laser haptics, (3) one or more video capture devices for monitoring at least a portion of a subject, and (4) a network-connected system controller to manage data from both the network and the individual components. In some embodiments, an individual system functions on its own.
[0042] In some embodiments, individual systems are combined in a network that provides a distributed coherent beamforming function. In some embodiments, the system utilizes digital signal processing, embedded systems, information technology for distributed networking (i.e., Internet of Things (IoT)), and machine learning/artificial intelligence (ML/AI) for device self-calibration.
I. Haptic Array Devices
[0043] With reference to FIG. 4, a system 400 for providing haptic feedback or stimulation is depicted, according to some embodiments. In some embodiments, the system 400 is utilized to stimulate or provide haptic feedback to a subject or a portion of a subject (e.g.,
a hand of a subject 490). In some embodiments, the system 400 includes network module 405, system controller 410, acoustic payload controller 420, a monitoring controller 425, monitoring sensors 430, acoustic haptic array controller 435, acoustic haptic array 450, optical emission controller 460, optical emitter 465, and recorder 440.
[0044] In some embodiments, the functions of the system 400 are controlled by system controller 410. In some embodiments, the system controller 410 comprises a computer processing unit (CPU), as described herein. The CPU may comprise one or more programs loaded onto a memory for sending instructions for operating the various components of the system, as described herein. The system controller 410 may further comprise a field programmable gate array (FPGA) configurable to provide a logic circuit for specified functions of the system. In some embodiments, the system controller 410 is in operative communication with a network module 405. The network module 405 may be configured to receive information and instructions, such as programming instructions, parameter inputs, or the like, and transmit said instructions to the system controller 410. The network module 405 may communicate with an external network, remote device, user interface, or the like, as disclosed herein. In some embodiments, mesh networking is utilized. In some embodiments, mesh networking allows the system to provide distributed coherence. In turn, mesh networking may allow many small systems to achieve the performance of a much larger system. Mesh networking may also allow the system to provide unique and complicated acoustic algorithms (e.g., machine learning) to enable precise spatial audio or ultrasonic feedback.
[0045] In some embodiments, the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460. In some embodiments, the acoustic payload controller and the optical emission controller are integrated into a single haptic array controller. In some embodiments, the system controller 410 is operatively coupled to the acoustic payload controller 420 and the optical emission controller 460 via one or more control buses 415.
[0046] In some embodiments, the acoustic payload controller 420 comprises an application specific integrated circuit (ASIC) that processes one or more signals and provides an output signal to the acoustic haptic array controller 435. In some embodiments, the acoustic haptic array controller 435 provides an output signal to the acoustic haptic array 450, where the output signal is transformed into a mechanical waveform (e.g., an acoustic, sound, or ultrasonic waveform) by one or more transducers of the acoustic haptic array. In some embodiments, the haptic array controller comprises an amplifier to amplify the signal prior to output to the
haptic array(s). In some embodiments, the system is connected to a plurality of haptic arrays and the output to each haptic array is varied to produce a desired output. In some embodiments, the constructive interference of the sonic waves produced by the transducers is utilized to produce one or more focal points. In some embodiments, production of the focal point is digitally controlled by the haptic payload controller. In some embodiments, focal points of sonic energy are produced with a resolution of 1/16 of the wavelength (e.g., approximately 0.5 mm for the 40 kHz ultrasound).
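For illustration, the digitally controlled focusing described above can be sketched with a simple free-field time-of-flight model; the array geometry and function names are illustrative rather than the device's actual control code.

```python
# Illustrative sketch of digitally focusing an ultrasonic transducer array:
# each transducer is phased so its wave arrives at the focal point in phase,
# producing constructive interference there. Geometry and values are assumed.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # 40 kHz ultrasound (wavelength ~8.6 mm)

def focal_phases(transducer_xyz: np.ndarray, focal_point: np.ndarray) -> np.ndarray:
    """Return the phase offset (radians) to apply at each transducer so that
    all emissions arrive at the focal point in phase."""
    distances = np.linalg.norm(transducer_xyz - focal_point, axis=1)
    delays = distances / SPEED_OF_SOUND  # time of flight per element
    # Farther elements are phase-advanced so every wavefront arrives aligned.
    return (2 * np.pi * FREQ * delays) % (2 * np.pi)

# Example: a 16 x 16 planar array with 10 mm pitch, focused 20 cm above center.
xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
array_xyz = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])
phases = focal_phases(array_xyz, np.array([0.075, 0.075, 0.20]))
```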
[0047] In some embodiments, the optical emission controller 460 comprises an application specific integrated circuit (ASIC) that processes one or more received signals. In some embodiments, the optical emission controller 460 receives signals from the system controller 410. In some embodiments, the optical emission controller 460 receives signals from the system controller 410, the acoustic payload controller 420, the monitoring controller 425, or a combination thereof. In some embodiments, the optical emission controller 460 directs and controls one or more optical emitters 465.
[0048] In some embodiments, the one or more optical emitters 465 comprise at least one light source. In some embodiments, the one or more optical emitters 465 comprise at least one light source coupled to one or more optical elements. The optical elements may comprise lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location. In some embodiments, the system is connected to a plurality of optical emitters and the output to each optical emitter is varied to produce a desired output. In some embodiments, the light source of the optical emitter is a laser, as described herein.
[0049] In some embodiments, the optical emitter produces electromagnetic energy outside of the visible light spectrum. For example, the optical emitter may produce electromagnetic waves within the ultraviolet or infrared spectrum. In some embodiments, the optical emitter is replaced or used in combination with an emitter which generates another type of electromagnetic energy, such as radio emissions. In some embodiments, the optical emitter is replaced or used in combination with a thermal emitter which generates and transmits heat toward a target location or focal point.
[0050] In some embodiments, the system 400 comprises a monitoring controller 425. In some embodiments, the monitoring controller operates and receives data from one or more monitoring sensors. Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward
a target (e.g., a target area, volume, or a portion of a subject 490). In some embodiments, an interferometer is utilized as a monitoring sensor, as disclosed herein.
[0051] In some embodiments, the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the acoustic payload controller 420. In some embodiments, the acoustic payload controller 420 comprises a digitally-programmable potentiometer (DPP) which receives the interferometer data. In some embodiments, the monitoring controller 425 transmits data from the monitoring sensors 430 to be processed by the optical emission controller 460. In some embodiments, the optical emission controller 460 comprises a digitally-programmable potentiometer (DPP) which receives the data generated by the monitoring sensors. In some embodiments, the monitoring data is sent back to system controller 410. In some embodiments, the acoustic payload controller 420 may adjust the output signal to the acoustic haptic array controller 435 based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425. In some embodiments, the optical emission controller 460 may adjust the output signal to the optical emitter based on data received from the monitoring controller 425 and/or instructions provided by the system controller 410 in response to data received from the monitoring controller 425. In some embodiments, the system is configured such that feedback received from the monitoring sensors 430 is utilized to adjust the system, output of the haptic arrays 450, and output of the optical emitters 465. In some embodiments, adjustments are made in real-time to provide a self-calibrating system.
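One iteration of such a self-calibrating feedback loop might be sketched as follows; the monitor and haptic-array interfaces and the proportional update rule are assumptions for illustration.

```python
# Hypothetical sketch of one iteration of the self-calibrating feedback loop:
# monitoring data (e.g., measured vibration amplitude at the focal point) is
# compared to the commanded output and the drive level is nudged accordingly.
def calibration_step(monitor, haptic_array, target_amplitude: float,
                     gain: float = 0.1) -> None:
    measured = monitor.read_vibration_amplitude()  # e.g., laser interferometer data
    error = target_amplitude - measured
    new_drive = haptic_array.drive_level + gain * error
    haptic_array.set_drive_level(max(0.0, min(1.0, new_drive)))  # clamp to valid range
```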
[0052] In some embodiments, the system further comprises a recorder 440. Recorder 440 may receive and store monitoring data via an input/output (I/O) integrated circuit coupled to the monitoring controller. The stored data may be utilized by the system to improve outputs. In some embodiments, the stored monitoring data is input into a machine learning module to improve the system. In some embodiments, the system is used for audio recording using an interferometer (i.e., ISR). In some embodiments, the monitoring data is used to track a target 490. In some embodiments, the monitoring data is used to monitor the response of a target to the haptic output of the system.
[0053] FIG. 5 depicts an exemplary haptic array device 500, according to some embodiments. In some embodiments, the haptic array device 500 comprises an array of transducers 550 for producing sonic haptics, as described. In some embodiments, the array 550 is an ultrasonic transducer array, as disclosed herein. In some embodiments, the haptic array device 500 further comprises laser systems 511, 512, 513. In some embodiments, the haptic array device 500 further comprises an integrated monitoring system 520. In some embodiments, the haptic array device 500 is configured to provide haptic feedback or sensations to an object or focal point 505. In some embodiments, the object 505 is a portion of a user, such as a hand.
[0054] The laser systems may be configured to produce haptics, 3-dimensional visualizations (i.e., holograms), or both. In some embodiments, a hologram is produced by two of the laser systems functioning as optical emitters and using constructive interference to produce a 3D rendering. In some embodiments, a third laser system produces haptic feedback while the other two laser systems produce the hologram. For example, laser systems 511 and 512 may produce a hologram while laser system 513 provides haptic feedback to a target area 505. [0055] In some embodiments, monitoring system 520 comprises one or more sensors for monitoring an object or an object's response to the provided haptics, as disclosed herein. The one or more sensors of the monitoring system may comprise optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target 505. In some embodiments, the monitoring system is coupled to a computer system which identifies and tracks the target 505 and/or portions thereof, as disclosed herein.
[0056] While FIG. 5 depicts a haptic array device 500 with fully integrated components, it should be appreciated that the components may not be integrated or may be separate from the device. Further, it should be appreciated that the device may be supplemented with further components (such as additional ultrasound transducer arrays) or additional haptic array devices of the same or a similar type.
[0057] In some embodiments, the system is modular, such that multiple systems can be networked to provide different levels of performance based on user needs. An individual system may operate independently with reduced function based on user needs. Combined systems may operate together to produce a higher output signal or provide haptic feedback to a larger volume of space.
A. Ultrasonic Haptics
[0058] As disclosed herein, sonic haptics may be provided to a target (e.g., one or more focal points, a portion of a subject, etc.). In some embodiments, sonic haptic feedback is provided to a target via an array of ultrasonic transducers. In some
embodiments, an array of ultrasonic transducers comprises 324 transducers arranged in an 18 x 18 square grid. However, multiple arrangements of the transducers may be provided to better suit various applications. In some embodiments, the transducers are arranged as a planar array. In some embodiments, the transducers are arranged in a non-planar array. In some embodiments, the transducers are arranged in two or more planar arrays which are provided at an angle to each other. In some embodiments, the transducers are arranged in two or more planar arrays which are orthogonal to each other. In some embodiments, the transducers are open aperture ultrasonic transducers. In some embodiments, the transducers are ceramic transducers (e.g., Nippon Ceramic T4010A1 transducers).
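By way of illustration, focusing such an array at a point is commonly achieved by driving each transducer with a phase offset proportional to its path length to the focal point. The sketch below assumes a 40 kHz carrier, a 10.5 mm element pitch, and an array centered at the origin; these values are illustrative, not limiting.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air (assumed)
FREQ_HZ = 40_000.0      # typical airborne ultrasonic carrier (assumed)
PITCH_M = 0.0105        # center-to-center element spacing (assumed)

def focusing_phases(n=18, focal_point=(0.0, 0.0, 0.15)):
    """Per-element phase offsets (radians) for an n x n planar array,
    centered at the origin in the z = 0 plane, so that all emissions
    arrive in phase at focal_point (meters)."""
    coords = (np.arange(n) - (n - 1) / 2) * PITCH_M
    xs, ys = np.meshgrid(coords, coords)
    fx, fy, fz = focal_point
    dist = np.sqrt((xs - fx) ** 2 + (ys - fy) ** 2 + fz ** 2)
    wavelength = SPEED_OF_SOUND / FREQ_HZ
    # Advance each element's phase by its path length so the wavefronts
    # align (interfere constructively) at the focus.
    return (2 * np.pi * dist / wavelength) % (2 * np.pi)

phases = focusing_phases()
print(phases.shape)  # (18, 18) -> one offset per transducer, 324 in total
```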
[0059] In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 1,025 transducers. In some embodiments, an array of ultrasonic transducers comprises about 4 transducers to about 25 transducers, about 4 transducers to about 64 transducers, about 4 transducers to about 256 transducers, about 4 transducers to about 324 transducers, about 4 transducers to about 576 transducers, about 4 transducers to about 1,025 transducers, about 25 transducers to about 64 transducers, about 25 transducers to about 256 transducers, about 25 transducers to about 324 transducers, about 25 transducers to about 576 transducers, about 25 transducers to about 1,025 transducers, about 64 transducers to about 256 transducers, about 64 transducers to about 324 transducers, about 64 transducers to about 576 transducers, about 64 transducers to about 1,025 transducers, about 256 transducers to about 324 transducers, about 256 transducers to about 576 transducers, about 256 transducers to about 1,025 transducers, about 324 transducers to about 576 transducers, about 324 transducers to about 1,025 transducers, or about 576 transducers to about 1,025 transducers, including increments therein. In some embodiments, an array of ultrasonic transducers comprises at least about 4 transducers, about 25 transducers, about 64 transducers, about 256 transducers, about 324 transducers, or about 576 transducers, including increments therein.
[0060] In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 20 millimeters (mm). In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 100 mm. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of about 1 mm to about 5 mm, about 1 mm to about 10 mm, about 1 mm to about 20 mm, about 1 mm to about 40 mm, about 1 mm to about 50 mm, about 1 mm to about 100 mm, about 5 mm to about 10 mm, about 5 mm to about 20 mm, about 5 mm to
about 40 mm, about 5 mm to about 50 mm, about 5 mm to about 100 mm, about 10 mm to about 20 mm, about 10 mm to about 40 mm, about 10 mm to about 50 mm, about 10 mm to about 100 mm, about 20 mm to about 40 mm, about 20 mm to about 50 mm, about 20 mm to about 100 mm, about 40 mm to about 50 mm, about 40 mm to about 100 mm, or about 50 mm to about 100 mm, including increments therein. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at least about 1 mm, about 5 mm, about 10 mm, about 20 mm, about 40 mm, or about 50 mm, including increments therein. In some embodiments, the transducers are capable of producing an ultrasonic focal point having a diameter of at most about 5 mm, about 10 mm, about 20 mm, about 40 mm, about 50 mm, or about 100 mm, including increments therein.
[0061] In some embodiments, the transducer array is capable of providing pressure forces of about 10 millinewtons (mN) to about 20 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 100 mN. In some embodiments, the transducer array is capable of providing pressure forces of about 1 mN to about 2 mN, about 1 mN to about 5 mN, about 1 mN to about 10 mN, about 1 mN to about 20 mN, about 1 mN to about 50 mN, about 1 mN to about 100 mN, about 2 mN to about 5 mN, about 2 mN to about 10 mN, about 2 mN to about 20 mN, about 2 mN to about 50 mN, about 2 mN to about 100 mN, about 5 mN to about 10 mN, about 5 mN to about 20 mN, about 5 mN to about 50 mN, about 5 mN to about 100 mN, about 10 mN to about 20 mN, about 10 mN to about 50 mN, about 10 mN to about 100 mN, about 20 mN to about 50 mN, about 20 mN to about 100 mN, or about 50 mN to about 100 mN, including increments therein. In some embodiments, the transducer array is capable of providing pressure forces of at least about 1 mN, about 2 mN, about 5 mN, about 10 mN, about 20 mN, or about 50 mN, including increments therebetween.
[0062] The ultrasonic haptics are based on acoustic radiation pressure, which is not vibrational and presses against the skin surface. This pressure can be applied to the skin for a long time, but it is relatively weak. The sensation may be similar to a laminar air flow within a narrow area.
[0063] A direct current (i.e., unmodulated) output of ultrasound may be too weak to be perceivable at low levels. Therefore, in some embodiments, vibrotactile stimulations are produced by modulating the ultrasonic emission as waveforms. In some embodiments, vibrotactile stimulations are produced by modulating the emission with 200 Hz and 50 Hz waves. In some embodiments, the waveforms for producing ultrasonic haptic feedback are sinewaves, rectangular waves, triangular waves,
or a combination thereof. In some embodiments, the spatial resolution produced by the transducer array is about 8.5 mm when the array is operating at 40 kilohertz (kHz).
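The following sketch illustrates the modulation idea: a 40 kHz carrier whose amplitude follows a 200 Hz envelope, so that the radiation pressure varies at a rate the skin can perceive. The sample rate and sinusoidal envelope are illustrative assumptions.

```python
import numpy as np

def modulated_drive(duration_s=0.05, fs_hz=1_000_000,
                    carrier_hz=40_000, mod_hz=200):
    """Amplitude-modulate an ultrasonic carrier with a low-frequency
    envelope. The skin cannot follow the 40 kHz carrier itself, but it
    does perceive the 200 Hz variation in radiation pressure."""
    t = np.arange(int(duration_s * fs_hz)) / fs_hz
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))  # 0..1 sine envelope
    return envelope * carrier

drive = modulated_drive()
print(drive.shape)  # 50,000 samples of the modulated carrier
```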
B. Laser Haptics
[0064] In some embodiments, the haptic array device comprises one or more lasers for providing haptic feedback. In some embodiments, a laser emits energy at a wavelength of about 10 nm to about 10,000 nm. In some embodiments, a laser has a frequency of about 0.3 THz to about 300 THz. In some embodiments, a power output of the laser is about 0.16 watts (W). In some embodiments, a power output of the laser is about 0.01 W to about 0.5 W. In some embodiments, a power output of the laser is about 0.01 W to about 0.05 W, about 0.01 W to about 0.1 W, about 0.01 W to about 0.13 W, about 0.01 W to about 0.16 W, about 0.01 W to about 0.2 W, about 0.01 W to about 0.3 W, about 0.01 W to about 0.5 W, about 0.05 W to about 0.1 W, about 0.05 W to about 0.13 W, about 0.05 W to about 0.16 W, about 0.05 W to about 0.2 W, about 0.05 W to about 0.3 W, about 0.05 W to about 0.5 W, about 0.1 W to about 0.13 W, about 0.1 W to about 0.16 W, about 0.1 W to about 0.2 W, about 0.1 W to about 0.3 W, about 0.1 W to about 0.5 W, about 0.13 W to about 0.16 W, about 0.13 W to about 0.2 W, about 0.13 W to about 0.3 W, about 0.13 W to about 0.5 W, about 0.16 W to about 0.2 W, about 0.16 W to about 0.3 W, about 0.16 W to about 0.5 W, about 0.2 W to about 0.3 W, about 0.2 W to about 0.5 W, or about 0.3 W to about 0.5 W, including increments therein. In some embodiments, a power output of the laser is about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therein. In some embodiments, a power output of the laser is at least about 0.01 W, about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, or about 0.3 W, including increments therein. In some embodiments, a power output of the laser is at most about 0.05 W, about 0.1 W, about 0.13 W, about 0.16 W, about 0.2 W, about 0.3 W, or about 0.5 W, including increments therein.
[0065] In some embodiments, low laser power levels prevent damage to the skin of a user. The sensation produced by the laser system may be similar to an electric sensation. In some embodiments, the haptic feedback from the laser is caused by evaporation from a nonthermal shockwave produced on the skin. In some embodiments, the duration of laser exposure is limited to prevent damage to the skin.
[0066] In some embodiments, a haptic laser system comprises at least one laser light source. In some embodiments, the haptic laser system comprises optical elements such as lenses, mirrors, polarizers, filters, and the like to manipulate light from a light source and direct it to a target location. In some embodiments, a haptic laser system comprises galvo-mirrors for precise positioning of the laser energy. In some embodiments, a laser system comprises a computer-controlled optical phased array comprising pixels that modulate a laser beam's intensity, phase, or both.
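As a rough illustration of galvo-mirror steering, the sketch below converts a target point on a plane into approximate mirror deflection angles, using the fact that a beam deflects through twice the mirror rotation. The co-located-mirror simplification and standoff distance are assumptions for illustration only.

```python
import math

def galvo_angles(x, y, standoff=0.3):
    """Approximate X/Y mirror rotations (radians) to steer a beam to the
    target point (x, y) on a plane `standoff` meters from the mirrors.

    A mirror rotation of theta deflects the beam by 2*theta, hence the
    factor of 0.5. Treating the two mirrors as co-located is a
    simplification; real galvo pairs are slightly offset.
    """
    theta_x = 0.5 * math.atan2(x, standoff)
    theta_y = 0.5 * math.atan2(y, standoff)
    return theta_x, theta_y

print(galvo_angles(0.05, -0.02))  # steer 5 cm right, 2 cm down at 30 cm standoff
```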
C. Cross-Field Haptics
[0067] In some embodiments, the haptic array device utilizes a combination of electromagnetic energy and pressure from mechanical waves to produce unique sensations for a user. In some embodiments, the ultrasonic transducers can produce pressure in larger areas (e.g., about 30 cm areas). In some embodiments, the laser haptics systems produce sensations in more focused areas (e.g., down to 1 micron). Therefore, a combination of laser and ultrasonic transducer systems may produce focused haptics at different scales simultaneously. For example, if a target is a hand of a user, the ultrasonic haptic system may produce a pressure sensation on the palm of the hand, while the laser haptic system focuses a sensation on a fingertip of the user. Such a configuration may be useful in confirming registration or detection of various parts of the hand when being used in combination with a gesture registration system.
[0068] Simultaneous application of pressure from the ultrasound transducers and application of a laser affects the perceived sensation from each haptic system. In some embodiments, application of pressure from the ultrasound transducers reduces the sensitivity of a user to the effects of a laser. This may allow for higher intensity laser application before a user perceives pain.
D. Optical Simulation
[0069] In some embodiments, lasers of the haptic array device are utilized to produce visualizations. In some embodiments, constructive interference produced by a laser emission system is utilized to generate 3D images or holograms. In some embodiments, a 3D image or hologram is utilized to help guide a user when the haptic array device is being used as a controller or for gesture recognition. In some embodiments, a 3D image or hologram is utilized to help guide a user when an external device is being used as a controller or for gesture recognition. In some embodiments, a 3D image is produced to guide a user's hand to the center of an image captured by a camera (either incorporated into or external to the haptic array device) being utilized for gesture recognition.
[0070] In some embodiments, a haptic array device utilizes a laser system to produce both haptic and visual effects. In some embodiments, the haptic feedback is provided as the user interacts with a 3D image or hologram. In some embodiments, a 3D image or hologram is utilized to help guide a user through a series of movements as part of a rehabilitation or training program.
II. Systems and Methods for Assessing a Patient
[0071] Provided herein, per FIGS. 7A-10B, are systems 700 for assessment of a patient. As shown, in some embodiments, the system 700 comprises a monitoring device 710 and an ancillary device 720.
[0072] In one embodiment the monitoring device 710 comprises a display 711, a haptic array 712, a camera 713, and a non-transitory computer-readable storage media. In another embodiment the monitoring device 710 comprises the display 711, the haptic array 712, a time-of-flight sensor 714, and the non-transitory computer-readable storage media. In another embodiment the monitoring device 710 comprises the display 711, the haptic array 712, the camera 713, the time-of-flight sensor 714, and the non-transitory computer-readable storage media. In some embodiments, the monitoring device 710 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
[0073] In some embodiments, the display 711 is configured to show a display image. FIG. 7B shows an exemplary display image of the user’s hand manipulating a virtual cube. In some embodiments, this display image is shown while the user experiences a sensation of manipulating the virtual cube by pressure waves emitted from the haptic array 712. FIG. 8A shows an exemplary display image of the user’s hand throwing a ball. FIG. 8B shows an exemplary display image of the user’s hand manipulating one of three displayed balls.
[0074] In some embodiments, the ancillary device 720 comprises a biometric sensor. In some embodiments, the biometric sensor is configured to measure a biometric data. In some embodiments, the ancillary device 720 is configured to couple to an appendage of the patient. In some embodiments, the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof. In some embodiments, the ancillary device 720 further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
[0075] In some embodiments, the monitoring device 710 or the ancillary device 720 comprises the non-transitory computer-readable storage media. In some embodiments, the ancillary device 720 further comprises an ancillary communication device, and the monitoring device 710 further comprises a monitoring communication device communicably coupled to the ancillary device 720. In some embodiments, the ancillary communication device and the monitoring communication device are wireless communication devices.
[0076] In some embodiments, the camera 713 is configured to capture a plurality of pose images of the patient. In some embodiments, the plurality of pose images of the patient form a video of the motion of the patient. In some embodiments, the camera 713 comprises a two-dimensional camera 713. In some embodiments, the camera 713 comprises a three-dimensional camera 713. In some embodiments, the camera 713 is an infrared camera 713, a near infrared camera 713, a visible light camera 713, an ultra-violet spectrum camera 713, a thermographic camera 713, or any combination thereof. In some embodiments, the camera 713, the time-of-flight sensor 714, or both capture data at a rate of about 10 Hz to 10,000 Hz. In one embodiment the monitoring device 710 comprises two or more cameras 713, two or more time-of-flight sensors 714, or both. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed to capture the patient from two or more directions. In one embodiment the two or more cameras 713, the two or more time-of-flight sensors 714, or both are arrayed about the haptic array 712.
[0077] In some embodiments, the haptic array 712 comprises a plurality of ultrasonic devices. In some embodiments, the haptic array 712 is a planar array. In some embodiments, the haptic array 712 is a non-planar array. In some embodiments, the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof. In some embodiments, at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
[0078] In some embodiments, the non-transitory computer-readable storage media is encoded with instructions executable by at least one processor to cause the at least one processor to perform one or more operations. In some embodiments, the one or more operations comprise: directing at least a portion of the plurality of ultrasonic devices in the haptic array 712 to emit an acoustic field based on the display image; determining two or more patient poses based at least in part on the biometric data and the two or more pose images; and determining the assessment of the patient based at least in part on the display image and at least a portion of the plurality of patient poses. In some embodiments, each of the two or more patient poses comprise a
position, an orientation, or both of an appendage of the patient. In some embodiments, the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm. In some embodiments, the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
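One simple way to realize the pose comparison described above is a per-joint distance score against labeled healthy reference poses, sketched below. The keypoint count, metric, and threshold are illustrative assumptions; the disclosed embodiments may instead use a trained machine learning model.

```python
import numpy as np

def pose_distance(patient_pose, healthy_pose):
    """Mean per-joint deviation (meters) between a patient pose and a
    labeled healthy reference; both are (n_joints, 3) keypoint arrays."""
    return float(np.mean(np.linalg.norm(patient_pose - healthy_pose, axis=1)))

def assess(patient_poses, healthy_poses, threshold_m=0.05):
    """Score each captured pose against its healthy counterpart and flag
    poses deviating by more than threshold_m on average."""
    scores = [pose_distance(p, h) for p, h in zip(patient_poses, healthy_poses)]
    return {"scores": scores,
            "flagged": [i for i, s in enumerate(scores) if s > threshold_m]}

rng = np.random.default_rng(0)
healthy = [rng.random((21, 3)) for _ in range(2)]                # e.g., 21 hand keypoints
patient = [h + rng.normal(0.0, 0.02, h.shape) for h in healthy]  # simulated capture
print(assess(patient, healthy))
```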
[0079] FIGS. 11 and 12 show exemplary assessments of the patient. In some embodiments, the assessment of the patient is displayed on the display. In some embodiments, the assessment comprises a progress indicator, a vital sign indicator, a range of motion improvement indicator, a grip strength improvement indicator, or any combination thereof. In some embodiments, the assessment allows the patient to record their pain level and provides an indicator of their current and/or past pain levels.
A. Monitoring Devices
[0080] In some embodiments, one or more sensors are provided to monitor interaction with the haptic array device. In some embodiments, a monitoring device comprising one or more sensors is provided to monitor whether a user position, a user motion, or both are outside a threshold from a set user position, a set user motion, or both. Monitoring sensors may include optical sensors (e.g., cameras), thermal sensors, audio sensors, vibrational/mechanical sensors, and the like which are directed toward a target (e.g., a target area, volume, or a portion of a subject 490). In some embodiments, an interferometer is utilized as a monitoring sensor, as disclosed herein.
[0081] In some embodiments, a monitoring device comprises a camera. In some embodiments, the camera captures data at a rate of about 10 Hz to 10,000 Hz. In some embodiments, the camera comprises a two-dimensional camera. In some embodiments, the camera comprises a three-dimensional camera. The camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, or any combination thereof.
[0082] In some embodiments, the camera is coupled to a central processing unit (CPU) of the system, as disclosed herein. The camera may be utilized for gesture recognition. In some embodiments, haptic feedback is provided by the haptic array device in response to position or movement of a target within the field of view of the camera.
[0083] In some embodiments, feature detection and extraction methods are utilized to identify a region of interest on the target. In embodiments wherein the system is used for gesture recognition, regions of interest may include a finger, palm, thumb, fingertip, etc. of a user. In some embodiments, feature detection and extraction methods comprise computational processing of images to analyze contrasts in pixel brightness to recognize features. Feature detection and extraction methods may include edge detection, corner detection, blob detection, ridge detection, and combinations thereof.
[0084] In some embodiments, an edge detection algorithm is utilized to identify an outline or border of a target. In some embodiments, a nearest neighbor, thresholding, clustering, partial differential equation, and/or other digital image processing methods are utilized to identify an outline or border of a target. Canny, Deriche, differential, Sobel, Prewitt, and Roberts cross edge detection techniques may be utilized to identify a target or a portion thereof. In some embodiments, Gaussian or Laplacian techniques are utilized to smooth or improve the accuracy of the identified target or portion thereof.
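As an example of one such pipeline, the sketch below applies Gaussian smoothing followed by Canny edge detection (two of the techniques named above) to extract a target outline from a camera frame; the kernel size and thresholds are assumptions that would be tuned to the camera and lighting.

```python
import cv2
import numpy as np

def extract_target_outline(frame: np.ndarray) -> np.ndarray:
    """Return a binary edge map outlining a target (e.g., a hand) in a
    BGR camera frame: Gaussian smoothing to suppress sensor noise,
    then Canny edge detection on the brightness contrasts."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    smoothed = cv2.GaussianBlur(gray, (5, 5), 1.4)
    return cv2.Canny(smoothed, 50, 150)

# Usage on a synthetic frame (a bright square on a dark background):
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[80:160, 120:200] = 200
edges = extract_target_outline(frame)
print(int(edges.sum() > 0))  # 1 -> an outline was detected
```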
[0085] In some embodiments, the monitoring device comprises a kiosk. In some embodiments, the monitoring device comprises a thermographic camera, a time-of-flight sensor, a microphone, or any combination thereof. In some embodiments, the monitoring device further comprises a speaker, a haptic projection unit, an augmented reality projection unit, or any combination thereof.
[0086] In some embodiments, the monitoring device comprises a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
[0087] In some embodiments, the monitoring device is reconfigurable based on a patient’s handedness, height, or any combination thereof. In some embodiments, the monitoring device comprises a wireless communication device, a wired communication device, or both. In some embodiments, the monitoring device comprises an energy storage device, a wired charge connector, a wireless charge connector, or any combination thereof. In some embodiments, the haptic feedback comprises a finger haptic, a magneto haptic, an opto-haptic, or any combination thereof. In some embodiments, the monitoring device comprises a mount for a peripheral device.
[0088] In some embodiments, the monitoring devices herein comprise an automated kiosk that quantifies neural activity and physical behavior to provide a telemedicine solution. In some embodiments, the monitoring devices herein enable healthcare providers to remotely assess and monitor the neural activity and physical behavior of patients, allowing for early detection of changes that may indicate disease progression. In some embodiments, the monitoring devices herein provide real-time feedback on neural activity and physical behavior, enabling patients to actively participate in their disease management and make informed decisions about their care. In some embodiments, the monitoring devices herein employ high-resolution motion capture to assess physical impairments, enabling the quantitative analysis of movement patterns and motor coordination. In some embodiments, the monitoring devices herein use high-resolution motion capture paired with an augmented reality (AR) mid-air haptics interface, enabling users to dynamically interact with computer generated objects that have tactile and responsive feedback. In some embodiments, the haptics interface simultaneously stimulates brain activity associated with sensory response to touch as well as causal response for tracking coordination. In some embodiments, the monitoring devices herein pair with a biometric wrist-worn wearable that tracks surface electromyography, bio-impedance, electrodermal activity (EDA), volumetric blood flow via photoplethysmography (PPG), motion, or any combination thereof. In some embodiments, the motion is measured using a 9-DOF inertial motion unit (IMU). In some embodiments, the monitoring devices herein pair with a head-worn wearable able to assess electroencephalography (EEG) signals, electromyography (EMG) signals, electrooculography (EOG) signals, or any combination thereof. In some embodiments, the signals are correlated to central brain activity, peripheral biometric signals, and functional performance.
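For illustration, one common way to stabilize orientation estimates from such an IMU is a complementary filter that fuses gyroscope rates with accelerometer-derived tilt; the single-axis sketch below, including its blending factor and fake samples, is an assumption-laden simplification of full 9-DOF fusion.

```python
import math

def accel_pitch_deg(ax, ay, az):
    """Pitch estimate (degrees) from the gravity direction; valid when the
    wearer is roughly static."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

def complementary_filter(pitch_prev, gyro_rate_dps, accel_pitch, dt, alpha=0.98):
    """Fuse the integrated gyroscope rate (deg/s) with the accelerometer
    pitch (deg): the gyro dominates short-term, the accelerometer corrects
    long-term drift. alpha is an illustrative blending factor."""
    return alpha * (pitch_prev + gyro_rate_dps * dt) + (1 - alpha) * accel_pitch

# Fake IMU samples (ax, ay, az in m/s^2; gyro rate in deg/s) at 100 Hz:
pitch = 0.0
for ax, ay, az, gx in [(0.0, 0.1, 9.8, 1.5)] * 10:
    pitch = complementary_filter(pitch, gx, accel_pitch_deg(ax, ay, az), dt=0.01)
print(f"pitch estimate: {pitch:.3f} deg")
```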
[0089] In some embodiments, the systems and methods herein employ a haptics projector and machine vision cameras that allow individualized care for patients with neurological disorders. In some embodiments, computer generated objects are projected through the haptics projector and can be manipulated by a patient. The machine vision cameras may track how patients interact with the projected objects for the quantification of their behavior versus expected motion. In some embodiments, the systems herein utilize augmented reality (AR) mid-air haptics, which adds a tactile element to the assessment process. This technology enables patients to interact with virtual objects in a realistic manner, providing a more immersive and engaging experience. The utilization of haptic technology in telemedicine is gaining significant traction due to its ability to provide realistic, tactile experiences for both patients and healthcare providers. The systems herein enable healthcare providers to remotely assess patients' physical symptoms and provide targeted treatment plans.
[0090] In some embodiments, the systems herein employ high-resolution motion capture technology to enable precise tracking and analysis of a patient's physical dynamics, providing valuable insights into their functional performance. By capturing detailed movement data, the systems herein objectively assess disease progression and monitor the effectiveness of interventions, which is a definitive advantage over more subjective measures of the progression of a disease.
B. Ancillary Sensors
[0091] In some embodiments, additional sensors are utilized to enhance or supplement the performance of the haptic array device or monitoring system thereof. In some embodiments, ancillary sensors comprise wearable sensors which are attached to a user to receive additional data generated by movements or electrical signals (e.g., electromyographic (EMG), electroencephalographic (EEG), etc.) produced by a user. In some embodiments, a wearable ancillary sensor comprises one or more motion sensors. In some embodiments, the motion sensors comprise an accelerometer, a gyroscope, or a combination thereof.
[0092] In some embodiments, a wearable ancillary sensor array is configured to couple to an appendage, limb, or extremity of a user. In some embodiments, an existing device comprising one or more motion sensors (e.g., a smart watch) is coupled to the haptic array device to act as an ancillary sensor device. In some embodiments, additional bioinformatics are acquired by the ancillary sensors such as heart rate, body temperature, blood pressure, or a combination thereof.
[0093] In some embodiments, a wearable ancillary sensor array is configured to be worn on a head, a foot, or a wrist of user. In some embodiments, a wearable ancillary sensor array comprising one or more EEG sensors is configured to place the EEG sensors in proximity to the scalp of a user and receive electric signals produced by the brain of the user. In some embodiments, the EEG sensors do not require direct contact to the skin (e.g., no need for shaving of the head) or a gel to be applied to the scalp.
[0094] In some embodiments, the ancillary sensors are used to confirm or verify actions or gestures made by a user. In some embodiments, bioinformatic information obtained by the ancillary sensors is recorded and stored in a memory of the system.
[0095] In some embodiments, the ancillary sensor is head wearable and comprises a helmet, a visor, glasses, a headband, earbuds, earphones, or any combination thereof. In some embodiments, the ancillary sensor is head wearable and comprises an electroencephalogram
(EEG) sensor, an electrooculography (EOG) sensor, a cerebral blood volume sensor, a facial micro-motion sensor, or any combination thereof.
[0096] In some embodiments, the ancillary sensor is wrist and/or hand wearable and comprises a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof. [0097] In some embodiments, the wrist ancillary sensor is reconfigurable based on a patient's handedness. In some embodiments, the ancillary sensor is hand graspable. In some embodiments, the wrist ancillary sensor comprises a wireless communication device, a wired communication device, or both. In some embodiments, the wrist ancillary sensor comprises an energy storage device, a wired charge connector, a wireless charge connector, or any combination thereof. In some embodiments, the wrist ancillary sensor comprises a finger interface, a haptic feedback, a joystick, a trackpad, a trackball, or any combination thereof. In some embodiments, the haptic feedback comprises a finger haptic, a magneto haptic, an opto-haptic, or any combination thereof.
[0098] In some embodiments, the ancillary sensor is foot-wearable and comprises a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a thermometer, an altimeter, a barometer, a humidity sensor, a sweat rate generation sensor, a hydration sensor, a bioacoustics sensor, or any combination thereof.
[0099] In some embodiments, the ancillary sensor comprises an electroencephalogram (EEG) sensor, an electrooculography (EOG) sensor, a cerebral blood volume sensor, a facial micromotion sensor, a photoplethysmography (PPG) sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance (BIA) sensor, an electrodermal activity (EDA) sensor, an inertial motion unit (IMU) sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof. [0100] In some embodiments, the methods and systems herein employ a biometric wearable to track, for example, surface electromyography, surface conductance, bio-impedance,
electrodermal activity, temperature, heart rate, and other environmental factors like pressure, humidity, and altitude. In some embodiments, the wearable device comprises a 9 DOF (degrees of freedom) IMU to track patients' motion with high precision. In some embodiments, the biometric wearables capture physiological data, such as heart rate and electrodermal activity, allowing for a comprehensive assessment of the patient's physical and cognitive state. By analyzing these biometric signals, the systems herein can extract useful features and provide objective assessments of disease progression and functional performance. The potential for collecting EEG, EMG, and EOG data adds another dimension to the ability to gather objective data to be correlated with the self-reported status evaluations.
C. Individualized Patient Care Methods
[0101] Provided herein, per FIG. 13, is a flowchart of an exemplary method 1300 for individualized patient care. As shown, in some embodiments, patient data 1311 and doctor's notes 1312 are used to train a machine learning algorithm 1321, which, with input from a neurological examination 1322 and/or a hand examination 1323, provides a report 1324 to a patient and/or caregiver. In some embodiments, the neurological examination 1322 and/or the hand examination 1323 determine a current state 1330 of a patient, wherein the current state is based on an in-clinic visit 1331, a manual visit 1332 (e.g., performed by a family member or nurse), a qualitative result 1333, or any combination thereof. In some embodiments, the methods 1300 herein employ individualized care 1341, a kiosk 1342 for patient interaction and data collection, and extended reality haptics 1343. In some embodiments, the method 1300 produces a graphical user interface 1344 to allow the patient and/or caregiver to review and analyze collected data.
[0102] Also provided herein, per FIG. 14, is a computer-implemented method of assessing a patient 1400. As shown, the method 1400 comprises showing a display image 1411, receiving a biometric data 1412, capturing a plurality of pose images of the patient 1413, emitting an ultrasonic haptic based on the display image 1414, determining two or more patient poses based at least in part on the biometric data and the two or more pose images 1421, and determining the assessment of the patient based at least in part on the display image and at least a portion of the plurality of patient poses 1422.
[0103] In some embodiments, showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed simultaneously. In some embodiments, two or more of showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose
images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed simultaneously. In some embodiments, showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed within a time span of at most about 1 minute. In some embodiments, two or more of showing the display image 1411, receiving the biometric data 1412, capturing the plurality of pose images of the patient 1413, and emitting the ultrasonic haptic 1414 are performed within a time span of at most about 1 minute.
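A minimal sketch of the simultaneity described above: the four capture steps launched concurrently so they complete within the same window. The step functions are hypothetical placeholders for the operations named in the method.

```python
import threading

# Hypothetical placeholders for the four operations named in the method.
def show_display_image():      print("showing display image")
def receive_biometric_data():  print("receiving biometric data")
def capture_pose_images():     print("capturing pose images")
def emit_ultrasonic_haptic():  print("emitting ultrasonic haptic")

def run_capture_window():
    """Launch the capture steps concurrently so they overlap in time."""
    steps = [show_display_image, receive_biometric_data,
             capture_pose_images, emit_ultrasonic_haptic]
    threads = [threading.Thread(target=step) for step in steps]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # all steps finish within the same capture window

run_capture_window()
```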
[0104] In some embodiments, the biometric data comprises an inertial motion unit data, a photoplethysmography data, a photoacoustic data, an ultrasound data, a glucose data, a bioimpedance data, an electrodermal activity data, a temperature data, a vision shadow capture data, an altitude data, a pressure data, a humidity data, a sweat rate data, a hydration data, a bioacoustics data, a dynamometer data, an electrodermal data, or any combination thereof. [0105] In some embodiments, at least a portion of the plurality of pose images comprise a two-dimensional image. In some embodiments, at least a portion of the plurality of pose images comprise a three-dimensional image. In some embodiments, at least a portion of the plurality of pose images comprise an infrared image, a near infrared image, a visible light image, an ultra-violet spectrum image, a thermographic image, or any combination thereof. In some embodiments, each of the two or more patient poses comprise a position, an orientation, or both of an appendage of the patient.
[0106] In some embodiments, the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm. In some embodiments, the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
[0107] In some embodiments, the AI algorithms extract relevant features from the data, enabling calculations of human pose estimation and correlation with neural activity. By analyzing the extracted data using machine learning techniques, the systems herein enable robust and reliable measures of cognitive and physical impairments in individuals. In some embodiments, the machine learning algorithms herein are trained on collected data, established clinical assessments, and existing objective measures. In some embodiments, the methods herein employ statistical analysis on data collected from both a patient group and a healthy control group as a validation. In some embodiments, data is collected via a personalized assessment activity for a plurality of simulated patient profiles. In some embodiments, each activity is designed to extract key features that assess cognitive and
physical impairments and encompass a range of tasks to comprehensively evaluate the patients' motor skills, cognitive functions, and coordination abilities. In some embodiments, the systems herein employ generative AI algorithms to analyze the data collected from motion capture and biometric wearables and perform human pose estimation calculations. This capability allows for the multi-modal quantification of impairment and provides healthcare professionals with valuable insights into the patient's condition. By combining high-resolution motion capture, augmented reality mid-air haptics, and biometric wearables, the systems herein provide a holistic assessment of both physical dynamics and brain activity. In some embodiments, the Artificial Intelligence (AI) algorithms are trained with comprehensive data on both physiological and neurological responses during functional tasks. [0108] Capturing all this information while the patient manipulates the computer-generated object in a gamified way allows quantitative and reproducible assessment of cognition, dexterity, coordination, and other measures used to determine the state of health for a person with a neurological disorder. In some embodiments, a patient's signals are captured by an automated and portable monitoring device, in-home or in-clinic. Data can be captured asynchronously (e.g., self-guided by an AI assistant) or under the observation of a remote clinical operator via telemedicine. As such, the systems herein enable access to care and monitoring tools and address challenges to reproducible examination to determine cognitive and physical impairment.
[0109] In some embodiments, a patient will undergo an assessment session at a healthcare facility or at home. The patient can be fitted with biometric wearables and can interact with virtual objects using AR mid-air haptics. Meanwhile, in some embodiments, the high-resolution motion capture system may track their movements and capture data on their physical dynamics. Throughout the assessment, data will be collected and analyzed using generative AI algorithms. In some embodiments, these algorithms extract features from the biometric signals and motion capture data to perform human pose estimation calculations and provide healthcare professionals with objective assessments of the patient's cognitive and physical impairment, disease progression, and functional performance.
III. Rehabilitation and Training Methods
[0110] Advances in telemedicine and decentralized clinical trials have created immense potential to transform biomedical research and improve patient outcomes. However, neurology trials face particular challenges with replicating in-clinic assessments remotely and consistently tracking disease progression over time. Neurological conditions like Parkinson’s,
multiple sclerosis (MS), and Alzheimer's impose a tremendous health burden and require ongoing tracking of complex symptoms. Clinical rating scales remain the gold standard for assessment but rely on subjective evaluation during in-clinic visits. This leads to issues with inter-rater variability, assessment infrequency, and inconsistent longitudinal data capture. [0111] Accessing quality neurological care is a significant challenge due to the rarity of, and lack of transportation to, available specialty neurological care providers. Patients who lack access to neurologists may forego regular monitoring and treatment, resulting in accelerated disease progression, more severe symptoms, a diminished quality of life, and inflated costs from emergency interventions. As such, there is a critical need for telehealth care delivery solutions that can expand access to top-quality neurological care.
[0112] In some embodiments, a rehabilitation and/or training system is provided by the devices and methods disclosed herein. In some embodiments, a haptic array device is utilized in a rehabilitation system. In some embodiments, the haptic array device is utilized to carry out an automated or partially-automated physical therapy regimen. In some embodiments, the haptic array devices provide stimulation to a portion of a subject as a haptic therapy.
[0113] In some embodiments, the haptic array device comprises a monitoring system and is utilized for gesture recognition as part of rehabilitation and training methods and systems. In some embodiments, the monitoring system further identifies features of a portion of a user for monitoring of movements/gestures.
[0114] In some embodiments, haptic feedback is provided to help guide the user. For example, haptic feedback may be utilized to confirm that the monitoring system has identified a target portion of a user or confirm that a guided movement has been properly completed by the user. In some embodiments, haptic feedback is utilized to confirm a portion of the user is properly in view of the monitoring system.
[0115] In some embodiments, a laser system of the haptic array provides a visualization (e.g., a hologram or 3D image). In some embodiments, a user may interact with the visualization produced by the laser system to simulate manipulation of an object. In some embodiments, haptic feedback is provided as a user interacts with the visualization to confirm that the user’s actions are properly registered.
[0116] Coupling of a haptic array device to an external device may be carried out through wired or wireless communication. The external device may communicate/transmit data or information obtained by the haptic array device. The external device may comprise additional systems for simultaneous communication, such as a camera and microphone for video
communication. As discussed herein, the haptic array device may be modular, and additional haptic arrays or components may be utilized to enhance performance of the system.
[0117] Provided herein are systems and methods that enable remote, objective assessment of cognitive and physical impairment for individuals with neurological disorders. In some embodiments, the systems herein integrate high-resolution motion capture, augmented reality mid-air haptics, and biometric wearables to quantify neural activity and physical behavior and provide consistent, quantitative data that can precisely track outcomes over time. In some embodiments, the systems and methods herein provide consistent, objective disease progression data across diverse populations. In some embodiments, the systems and methods herein increase enrollment and access for rural and minority populations through remote assessments. In some embodiments, the systems and methods herein enable continuous longitudinal tracking of treatment efficacy over long periods and reduce demographic biases in clinical evaluations via quantitative measures.
[0118] In some embodiments, the systems and methods herein employ an augmented reality (AR) mid-air haptics interface paired with high-resolution motion capture, a biometric wearable, and a head-worn wearable to correlate central brain activity, peripheral biometric signals, and functional performance. Patients can interact with virtual objects and a monitoring device to record their physical performance as well as their cognitive response to the movement and biometric data. In some embodiments, the data is analyzed with the help of Artificial Intelligence (AI) algorithms and can be interpreted by a neurologist through a telehealth consultation.
A. Performance Evaluation and Diagnostics
[0119] In some embodiments, the systems and methods herein are utilized to evaluate performance of an individual. In some embodiments, the system monitors a user as they perform a series of movements to evaluate a range of motion or correct execution of the movements by the user. The system may be utilized in diagnosis of injury, training, or evaluation of performance. For example, the system may be utilized to monitor the progression of rehabilitation from an injury or diagnosis of an injury.
[0120] In some embodiments, the device projects acoustic and/or visual holograms for a user to interact with. In some embodiments, the device provides instruction for interacting with the projected holograms. In some embodiments, as the user interacts with the holograms, the monitoring system detects, tracks, and records the movements made by the user. The
movements may be directed to a specific appendage or body part of the user or may be movements made by the whole body of the user.
[0121] In some embodiments, the user's range of motion is evaluated by instructing the user to conduct a series of movements. By monitoring and tracking the user's movements, the system may provide a diagnosis of a condition, evaluate severity of a condition, and/or track rehabilitation of a condition. For example, a user may be instructed to perform a series of movements of their forearm to diagnose the severity of lateral epicondylitis.
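As a concrete illustration of such an evaluation, the sketch below derives a joint angle from three tracked keypoints and reports the observed range of motion over a movement sequence; the keypoint layout and synthetic angle series are illustrative assumptions.

```python
import math

def joint_angle_deg(a, b, c):
    """Angle (degrees) at keypoint b formed by segments b->a and b->c,
    e.g., an elbow angle from shoulder/elbow/wrist keypoints."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    mag = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

def range_of_motion(angles):
    """Observed range of motion over a tracked movement sequence."""
    return max(angles) - min(angles)

# Synthetic elbow angles over one flexion/extension cycle:
series = [joint_angle_deg((0.0, 0.0), (1.0, 0.0),
                          (1.0 + math.cos(t / 10), math.sin(t / 10)))
          for t in range(32)]
print(f"range of motion: {range_of_motion(series):.1f} degrees")
```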
[0122] In some embodiments, the system is utilized for training purposes. For example, a haptic array device, as disclosed herein, may be utilized to project haptics simulating a musical instrument (e.g., a piano or guitar). The device may track the movements of the user during interaction and provide feedback (e.g., visual and/or haptic feedback) to correct the user's movements as they interact with the projections. In this manner, the user may be trained on proper technique for using the physical instrument. For example, a pianist may be able to practice and receive feedback on their piano playing technique without requiring an entire piano or keyboard. In this sense, the device may provide a portable training tool.
B. Automated Physical Therapy
[0123] In some embodiments, the systems and methods herein are utilized to provide a fully or partially automated physical therapy regimen. In some embodiments, the systems and methods herein are utilized to provide a guided series of movements for physical therapy. In some embodiments, the system performs a quantitative analysis of at least one portion of a user’s body (e.g., a hand and wrist of a user) using machine/computer vision via visible light and thermographic imaging.
[0124] In some embodiments, the device projects acoustic holograms and the user responds accordingly. A rehabilitation status may be correlated to the user's interaction with the acoustic holograms. In some embodiments, the haptic array device projects visual holograms for the user to respond to. In some embodiments, a visual hologram is provided to depict a series of movements to be executed by the user for strengthening or rehabilitation purposes. Sonic haptics, laser haptics, thermal haptics, or a combination thereof may be provided to confirm correct execution of movements or notify the user that a movement was not correctly executed. In some embodiments, instructions for a series of movements (exercises) to be executed are provided external to the system, while the system confirms correct execution of the movements via haptic feedback.
[0125] In some embodiments, the system records the movements of the user and sends them to a physician or physical therapist for analysis. In some embodiments, the system is utilized during a remote physical therapy session to confirm that the user/patient is correctly performing the physical therapy exercises. In some embodiments, a physical therapist records physical therapy exercises utilizing their own system (e.g., a haptic array device, as disclosed herein). In some embodiments, the user instructions are based on the exercises recorded by the physical therapist. In some embodiments, the user's exercises are monitored by the system and confirmation of correct position is provided if the user performs the exercises similarly to the recorded exercises.
C. Lost-Limb Simulation
[0126] In some embodiments, the systems and methods herein are utilized to provide lost-limb simulation or prosthetic training for amputees. In some embodiments, the system provides haptic feedback to a site of amputation to simulate feelings of the lost limbs or extremities. In some embodiments, the systems herein monitor a site of amputation and provide visual feedback in response to movement. In some embodiments, movement at the amputation site is correlated into movement of a simulated appendage or extremity.
[0127] In some embodiments, movement at the amputation site is correlated into movement of a simulated prosthetic. In some embodiments, a system (e.g., a haptic array device, as disclosed herein) provides haptic feedback to muscle groups which are engaged to control a prosthetic. In some embodiments, the haptic feedback is provided to the muscle groups to simulate feelings of engaging a prosthetic. In some embodiments, the haptic feedback is provided to the muscle groups to guide a user as to which muscle groups should be engaged to induce a prosthetic to perform an action or motion.
[0128] In some embodiments, the system sends data acquired by a monitoring system to design a prosthetic for a patient. The data may include information as to the limit of the movements which may be performed at the amputation site. The prosthetic may then be better designed for a patient based on the movements they are able to perform. In some embodiments, the system guides a patient through a series of movements designed to acquire information as to the range of motion of which a patient is capable. Therefore, customized prosthetics may be better designed based on each patient's unique limitations of movement.
D. Haptic Therapy
[0129] In some embodiments, the system herein is configured to provide haptic therapy to a portion of a user. In some embodiments, sonic haptics, laser haptics, thermal haptics, and combinations thereof are directed to a portion of a user for therapeutic benefits. In some embodiments, sonic haptics are directed toward a portion of a user to stimulate muscles or provide vibrational stimulation (similar to a massage gun). Vibrational stimulation may be generated by oscillation of sonic haptics. In some embodiments, sonic haptics are directed toward a portion of a user to stimulate muscles or nerves. In some embodiments, thermal radiation is directed toward a portion of a user to alleviate pain, muscle tension, and/or swelling.
[0130] In some embodiments, the system is configured to provide a therapeutic spa program. In some embodiments, a user places a portion of their body (e.g., a hand or foot) within proximity of a haptic array device, as disclosed herein, and the device commences with a series of haptic stimulations to alleviate pain, muscle tension, swelling, etc. In some embodiments, a monitoring system identifies portions of the user's body to ensure the haptic stimulation is properly directed. In some embodiments, the user is instructed to perform a series of movements under guidance of the system as part of a therapeutic spa program.
E. Multiple Sclerosis
[0131] Multiple sclerosis (MS) is a complex chronic disease of the central nervous system, thought to be an autoimmune disorder, in which the immune system mistakenly attacks the protective covering of nerve fibers, affecting nerve conduction and causing a wide range of neurological symptoms. Regular monitoring of MS is critical, as it allows healthcare providers to track disease progression, assess the effectiveness of treatments, and make necessary adjustments to the treatment plan. This can help individuals with MS manage their symptoms and maintain a better quality of life. In addition, MS is characterized by relapses or exacerbations, during which new or worsening symptoms occur. Monitoring allows healthcare providers to detect these relapses early and initiate appropriate interventions to minimize their impact.
[0132] Access to specialized healthcare, particularly for chronic conditions like MS, can be challenging in remote or rural areas. Limited access to neurologists, disease-modifying therapies, and support services can lead to delayed diagnosis, inadequate treatment, and reduced quality of life for individuals living in these areas. Language barriers, lack of health
literacy, and limited awareness about MS can all contribute to delayed diagnosis and suboptimal disease management. Thus, there is a critical need for improvement of MS monitoring access. As such, in some embodiments, the systems herein provide a telemedicine solution for MS care.
[0133] Traditional methods of diagnosis and disease monitoring often rely on in-person visits to healthcare providers, which can be inconvenient and costly for individuals living in remote areas. Additionally, tracking and quantifying neural activity and physical behavior to assess disease progression can be challenging without the use of advanced technology. The progression of MS is commonly measured using various clinical and imaging-based methods, including Magnetic Resonance Imaging (MRI) and cognitive and functional assessments, but each method presents limitations. Those limitations include clinical variability that makes it challenging to accurately measure and predict disease progression using standardized methods; subjectivity of assessment methods, such as the Expanded Disability Status Scale (EDSS); limitations of imaging techniques, due to the fact that not all lesions seen on MRI necessarily correlate with clinical symptoms; inconsistency of biomarkers' ability to predict disease progression in MS; a lack of longitudinal data describing the long-term progression of the disease; and ethnic and genetic differences that result in measurement methods developed in one population not being directly applicable to others. [0134] In some embodiments, the systems herein comprise an automated monitoring device that quantifies neural activity and physical behavior, to provide a telemedicine solution for MS care. In some embodiments, the systems herein enable healthcare providers to remotely assess and monitor the neural activity and physical behavior of patients, allowing for early detection of changes that may indicate disease progression. In some embodiments, the systems herein improve access to specialized MS care for underserved populations. By providing a telemedicine solution, the systems herein bridge the geographical and logistical gaps, allowing individuals to receive the care they need without the burden of travel. In some embodiments, the systems herein provide real-time feedback on neural activity and physical behavior, enabling patients to actively participate in their disease management and make informed decisions about their care.
[0135] In some embodiments, the systems herein employ high-resolution motion capture technology to enable precise tracking and analysis of a patient's physical dynamics, providing valuable insights into their functional performance. By capturing detailed movement data, the systems herein objectively assess disease progression and monitor the effectiveness of interventions, a definitive advantage over more subjective measures of disease progression such as the EDSS.
[0136] In some embodiments, the systems herein utilize augmented reality (AR) mid-air haptics, which add a tactile element to the assessment process. This technology enables patients to interact with virtual objects in a realistic manner, providing a more immersive and engaging experience. The utilization of haptic technology in telemedicine is gaining significant traction due to its ability to provide realistic, tactile experiences for both patients and healthcare providers. The systems herein enable healthcare providers to remotely assess patients' physical symptoms and provide targeted treatment plans.
[0137] In some embodiments, the systems herein employ biometric wearables, which capture physiological data, such as heart rate and electrodermal activity, allowing for a comprehensive assessment of the patient's physical and cognitive state. By analyzing these biometric signals, the systems herein can extract useful features and provide objective assessments of disease progression and functional performance. The potential for collecting EEG, EMG, and EOG data adds another dimension to the ability to gather objective data to be correlated with self-reported status evaluations. These data provide critical insights into the cognitive aspect of the progression of MS in the population studied.
[0138] In some embodiments, the systems herein employ generative AI algorithms to analyze the data collected from motion capture and biometric wearables and perform human pose estimation calculations. This capability allows for the multimodal quantification of impairment and provides healthcare professionals with valuable insights into the patient's condition. By combining high-resolution motion capture, augmented reality mid-air haptics, and biometric wearables, the systems herein provide a holistic assessment of both physical dynamics and brain activity.
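For illustration only, the following sketch shows one way the multimodal fusion described in paragraph [0138] could be realized; the function name, windowing, and feature choices are hypothetical assumptions, not part of this disclosure.

```python
import numpy as np

def fuse_pose_features(keypoints, imu_window, ppg_window):
    """Hypothetical fusion of camera-derived pose keypoints with wearable
    signals into a single feature vector for downstream AI models."""
    pose_feat = np.asarray(keypoints, dtype=float).ravel()      # (joints x 3) coordinates
    imu_feat = np.concatenate([imu_window.mean(axis=0),         # per-axis mean motion
                               imu_window.std(axis=0)])         # per-axis variability
    ppg_feat = np.array([ppg_window.mean(), ppg_window.std()])  # coarse pulse summary
    return np.concatenate([pose_feat, imu_feat, ppg_feat])
```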
[0139] In some embodiments, a patient will undergo an assessment session at a healthcare facility or at home. The patient can be fitted with biometric wearables and can interact with virtual objects using AR mid-air haptics. Meanwhile, in some embodiments, the high-resolution motion capture system may track their movements and capture data on their physical dynamics. Throughout the assessment, data will be collected and analyzed using generative AI algorithms. In some embodiments, these algorithms extract features from the biometric signals and motion capture data to perform human pose estimation calculations and provide healthcare professionals with objective assessments of the patient's cognitive and physical impairment, disease progression, and functional performance.
[0140] In some embodiments, the systems herein comprise a monitoring device that employs high-resolution motion capture to assess physical impairments, enabling the quantitative analysis of movement patterns and motor coordination. In some embodiments, the monitoring device uses high-resolution motion capture paired with an augmented reality (AR) mid-air haptics interface, enabling users to dynamically interact with computer-generated objects that have tactile and responsive feedback. In some embodiments, the haptics interface simultaneously stimulates brain activity associated with the sensory response to touch as well as the causal response for tracking coordination. In some embodiments, the monitoring device pairs with a wrist-worn biometric wearable that tracks surface electromyography, bioimpedance, electrodermal activity (EDA), volumetric blood flow via photoplethysmography (PPG), motion, or any combination thereof. In some embodiments, the motion is measured using a 9-DOF inertial motion unit (IMU). In some embodiments, the monitoring device pairs with a head-worn wearable able to assess electroencephalography (EEG) signals, electromyography (EMG) signals, electro-oculography (EOG) signals, or any combination thereof. In some embodiments, the signals are correlated to central brain activity, peripheral biometric signals, and functional performance.
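As a non-limiting sketch of how the wearable channels described above might be represented in software (the field names and units are assumptions for illustration, not a device API):

```python
from dataclasses import dataclass

@dataclass
class WearableSample:
    """One hypothetical sample from the wrist-worn wearable described above."""
    t: float                 # timestamp, seconds
    emg: float               # surface electromyography, mV
    eda: float               # electrodermal activity, microsiemens
    bioimpedance: float      # ohms
    ppg: float               # photoplethysmography, arbitrary units
    accel: tuple             # 3-axis accelerometer, m/s^2
    gyro: tuple              # 3-axis gyroscope, rad/s
    mag: tuple               # 3-axis magnetometer, uT (completes the 9-DOF IMU)
```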
[0141] In some embodiments, the Artificial Intelligence (AI) algorithms are trained with comprehensive data on both physiological and neurological responses during functional tasks. In some embodiments, the AI algorithms extract relevant features from the data, enabling calculations of human pose estimation and correlation with neural activity. By analyzing the extracted data using machine learning techniques, the systems herein enable robust and reliable measures of cognitive and physical impairments in individuals with MS.
[0142] In some embodiments, a patient’s signals are captured by an automated and portable monitoring device, in-home or in-clinic. Data can be captured asynchronously (e.g., self-guided by an AI-Assistant) or under the observation of a remote clinical operator via telemedicine. As such, the systems herein address challenges of access to care, access to monitoring tools, and reproducible examination to determine cognitive and physical impairment.
F. Parkinson’s Disease
[0143] Parkinson's disease (PD) is a devastating neurodegenerative disorder that severely impacts the quality of life for millions of individuals. As PD is a progressive condition, accurate assessment of motor symptoms is paramount for tracking disease progression and evaluating the efficacy of interventions. However, because few reliable and comprehensive tools are available for the objective assessment of PD-related measures, effective management of the disease and the development of personalized treatments for patients have been challenging.
[0144] As such, the systems herein provide an innovative automated platform specifically designed to address the unmet need for accurate and comprehensive assessment of PD-related measures. In some embodiments, the systems herein incorporate machine vision tracking, haptics, and biometric wearables to provide a reliable and reproducible way to measure and monitor patients with Parkinson’s disease.
[0145] One of the hallmark features of Parkinson's disease is the presence of motor deficits that significantly impact a patient's quality of life. Clinicians commonly utilize standardized rating scales like the Unified Parkinson's Disease Rating Scale (UPDRS) to evaluate various aspects of motor function, including tremors, bradykinesia, rigidity, and postural instability. While these assessments provide valuable insights into the severity of motor symptoms, the progression of the disease, and the response to therapeutic interventions, their qualitative nature often poses challenges. Motor symptoms such as bradykinesia (slowness of movement), rigidity, and tremors can vary in intensity and presentation from one patient to another. Clinicians often rely on their expertise and observational skills to assess these symptoms during a clinical examination. The UPDRS and its subscales provide a structured framework for evaluating motor symptoms, but they still involve subjective judgment. As such, the systems herein quantify motor deficits in PD by collecting and analyzing objective measurements. In some embodiments, the systems herein employ haptics technology and computer vision to estimate the motor abilities of Parkinson’s patients when manipulating virtual objects. By providing a reliable and comprehensive tool for objective measurement, the systems herein enable healthcare professionals to accurately track disease progression, evaluate the impact of various interventions, and tailor treatment plans to the specific needs of individual patients.
[0146] In some embodiments, unlike currently available Parkinson’s detection devices, the systems herein are easily implemented and can be used independently by patients or with the guidance of a neurologist. Further, the systems herein combine behavior and movement evaluation, through haptic object manipulation, with physiological measurements such as surface electromyography, surface conductance, bio-impedance, electrodermal activity, temperature, and heart rate, as well as other environmental factors like pressure, humidity, and altitude.
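By way of example only, one such objective measurement could be the fraction of wrist-acceleration power in the 4-6 Hz band typical of parkinsonian rest tremor; the sampling rate and band edges below are assumptions for illustration, not values specified in this disclosure.

```python
import numpy as np

def tremor_band_power(accel_z, fs=100.0, band=(4.0, 6.0)):
    """Illustrative objective motor measure: fraction of spectral power
    of a wrist acceleration trace falling in the rest-tremor band."""
    x = np.asarray(accel_z, dtype=float)
    x = x - x.mean()                               # remove gravity/DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum() / spectrum.sum()   # fraction of power in band
```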
[0147] Assessing the state of patients with neurological disorders, such as Parkinson’s disease, is a complex task, requiring accurate and objective measurements. Traditional
clinical measures often lack objectivity and reproducibility, leading to challenges in disease progression monitoring. In some embodiments, the systems and methods herein employ a haptics projector, machine vision cameras, and a biometric wearable, together with current gold-standard clinical measures, to achieve a comprehensive and reliable assessment of patients with neurological disorders.
[0148] In some embodiments, the systems and methods herein employ a haptics projector and machine vision cameras that allow individualized care for patients with neurological disorders. In some embodiments, computer-generated objects are projected through the haptics projector and can be manipulated by a patient. The machine vision cameras may track how patients interact with the projected objects for the quantification of their behavior versus expected motion. In some embodiments, the methods and systems herein employ a biometric wearable to track, for example, surface electromyography, surface conductance, bioimpedance, electrodermal activity, temperature, heart rate, and other environmental factors like pressure, humidity, and altitude. In some embodiments, the wearable device comprises a 9-DOF (degrees of freedom) IMU to track patients’ motion with high precision. Capturing all this information while the patient manipulates the computer-generated object in a gamified way allows quantitative and reproducible assessment of cognition, dexterity, coordination, and other measures used to determine the state of health for a person with a neurological disorder.
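A minimal sketch of the "behavior versus expected motion" quantification follows, assuming tracked 3D hand trajectories resampled to a common length; the RMS metric is one illustrative choice, not the method mandated by this disclosure.

```python
import numpy as np

def trajectory_deviation(observed, expected):
    """Illustrative deviation measure: RMS distance between an observed
    hand path and a reference path after resampling to equal length."""
    observed = np.asarray(observed, dtype=float)   # (N, 3) tracked positions
    expected = np.asarray(expected, dtype=float)   # (M, 3) reference positions
    n = min(len(observed), len(expected))
    idx_o = np.linspace(0, len(observed) - 1, n).astype(int)
    idx_e = np.linspace(0, len(expected) - 1, n).astype(int)
    diffs = observed[idx_o] - expected[idx_e]
    return float(np.sqrt((diffs ** 2).sum(axis=1).mean()))
```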
[0149] In some embodiments, the machine learning algorithms herein are trained on collected data together with established clinical assessments and existing objective measures commonly utilized in PD-related research. In some embodiments, the methods herein employ statistical analysis on data collected from both a PD patient group and a healthy control group for validation. In some embodiments, data is collected via a personalized assessment activity for a plurality of simulated patient profiles. In some embodiments, each activity is designed to extract key features that assess cognitive and physical impairments relevant to PD and encompasses a range of tasks to comprehensively evaluate the patients' motor skills, cognitive functions, and coordination abilities.
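As an illustrative sketch of the validation step, assuming one scalar feature per participant, a Welch's t-test could compare the PD group against healthy controls; the choice of test and significance threshold are assumptions, not prescribed by this disclosure.

```python
from scipy import stats

def validate_feature(pd_values, control_values, alpha=0.05):
    """Hypothetical validation: does an extracted feature differ between
    the PD patient group and the healthy control group?"""
    t_stat, p_value = stats.ttest_ind(pd_values, control_values,
                                      equal_var=False)  # Welch's t-test
    return {"t": t_stat, "p": p_value, "discriminative": p_value < alpha}
```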
IV. DEFINITIONS
[0150] Unless defined otherwise, all terms of art, notations and other technical and scientific terms or terminology used herein are intended to have the same meaning as is commonly understood by one of ordinary skill in the art to which the claimed subject matter pertains. In some cases, terms with commonly understood meanings are defined herein for clarity and/or
for ready reference, and the inclusion of such definitions herein should not necessarily be construed to represent a substantial difference over what is generally understood in the art.
[0151] Throughout this application, various embodiments may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the disclosure. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6, etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
[0152] As used in the specification and claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. For example, the term “a sample” includes a plurality of samples, including mixtures thereof.
[0153] The terms “rehabilitation,” “training,” or “treatment” are often used interchangeably herein to refer to methods of recovery or prevention from injuries, pain, and/or medical procedures. Methods may be guided or automated using the systems and devices disclosed herein.
[0154] The terms “acoustic,” “sound,” or “sonic” are often used interchangeably herein to refer to mechanical pressure waves. Unless specified, the terms “acoustic” and “sonic” should broadly read on waveforms ranging through all sonic frequency ranges, including audible, inaudible, and ultrasonic frequencies.
[0155] As used herein, the term “about” a number refers to that number plus or minus 10% of that number. The term “about” a range refers to that range minus 10% of its lowest value and plus 10% of its greatest value.
[0156] The section headings used herein are for organizational purposes only and are not to be construed as limiting the subject matter described.
Computing system
[0157] Referring to FIG. 1, a block diagram is shown depicting an exemplary machine that includes a computer system 100 (e.g., a processing or computing system) within which a set of instructions can execute for causing a device to perform or execute any one or more of the aspects and/or methodologies of the present disclosure. The components in FIG. 1 are examples only and do not limit the scope of use or functionality of
any hardware, software, embedded logic component, or a combination of two or more such components implementing particular embodiments.
[0158] Computer system 100 may include one or more processors 101, a memory 103, and a storage 108 that communicate with each other, and with other components, via a bus 140. The bus 140 may also link a display 132, one or more input devices 133 (which may, for example, include a keypad, a keyboard, a mouse, a stylus, etc.), one or more output devices 134, one or more storage devices 135, and various tangible storage media 136. All of these elements may interface directly or via one or more interfaces or adaptors to the bus 140. For instance, the various tangible storage media 136 can interface with the bus 140 via storage medium interface 126. Computer system 100 may have any suitable physical form, including but not limited to one or more integrated circuits (ICs), printed circuit boards (PCBs), mobile handheld devices (such as mobile telephones or PDAs), laptop or notebook computers, distributed computer systems, computing grids, or servers.
[0159] Computer system 100 includes one or more processor(s) 101 (e.g., central processing units (CPUs), general purpose graphics processing units (GPGPUs), or quantum processing units (QPUs)) that carry out functions. Processor(s) 101 optionally contains a cache memory unit 102 for temporary local storage of instructions, data, or computer addresses. Processor(s) 101 are configured to assist in execution of computer readable instructions. Computer system 100 may provide functionality for the components depicted in FIG. 1 as a result of the processor(s) 101 executing non-transitory, processor-executable instructions embodied in one or more tangible computer-readable storage media, such as memory 103, storage 108, storage devices 135, and/or storage medium 136. The computer-readable media may store software that implements particular embodiments, and processor(s) 101 may execute the software. Memory 103 may read the software from one or more other computer-readable media (such as mass storage device(s) 135, 136) or from one or more other sources through a suitable interface, such as network interface 120. The software may cause processor(s) 101 to carry out one or more processes or one or more steps of one or more processes described or illustrated herein. Carrying out such processes or steps may include defining data structures stored in memory 103 and modifying the data structures as directed by the software.
[0160] The memory 103 may include various components (e.g., machine readable media) including, but not limited to, a random-access memory component (e.g., RAM 104) (e.g., static RAM (SRAM), dynamic RAM (DRAM), ferroelectric random-access memory (FRAM), phase-change random access memory (PRAM), etc.), a read-only memory
component (e.g., ROM 105), and any combinations thereof. ROM 105 may act to communicate data and instructions unidirectionally to processor(s) 101, and RAM 104 may act to communicate data and instructions bidirectionally with processor(s) 101. ROM 105 and RAM 104 may include any suitable tangible computer-readable media described below. In one example, a basic input/output system 106 (BIOS), including basic routines that help to transfer information between elements within computer system 100, such as during start-up, may be stored in the memory 103.
[0161] Fixed storage 108 is connected bidirectionally to processor(s) 101, optionally through storage control unit 107. Fixed storage 108 provides additional data storage capacity and may also include any suitable tangible computer-readable media described herein. Storage 108 may be used to store operating system 109, executable(s) 110, data 111, applications 112 (application programs), and the like. Storage 108 can also include an optical disk drive, a solid-state memory device (e.g., flash-based systems), or a combination of any of the above. Information in storage 108 may, in appropriate cases, be incorporated as virtual memory in memory 103.
[0162] In one example, storage device(s) 135 may be removably interfaced with computer system 100 (e.g., via an external port connector (not shown)) via a storage device interface 125. Particularly, storage device(s) 135 and an associated machine-readable medium may provide non-volatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for the computer system 100. In one example, software may reside, completely or partially, within a machine-readable medium on storage device(s) 135. In another example, software may reside, completely or partially, within processor(s) 101.
[0163] Bus 140 connects a wide variety of subsystems. Herein, reference to a bus may encompass one or more digital signal lines serving a common function, where appropriate. Bus 140 may be any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures. As an example and not by way of limitation, such architectures include an Industry Standard Architecture (ISA) bus, an Enhanced ISA (EISA) bus, a Micro Channel Architecture (MCA) bus, a Video Electronics Standards Association local bus (VLB), a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCI-X) bus, an Accelerated Graphics Port (AGP) bus, HyperTransport (HTX) bus, serial advanced technology attachment (SATA) bus, and any combinations thereof.
[0164] Computer system 100 may also include an input device 133. In one example, a user of computer system 100 may enter commands and/or other information into computer system 100 via input device(s) 133. Examples of an input device(s) 133 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device (e.g., a mouse or touchpad), a touchpad, a touch screen, a multi-touch screen, a joystick, a stylus, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), an optical scanner, a video or still image capture device (e.g., a camera), and any combinations thereof. In some embodiments, the input device is a Kinect, Leap Motion, or the like. Input device(s) 133 may be interfaced to bus 140 via any of a variety of input interfaces 123 including, but not limited to, serial, parallel, game port, USB, FIREWIRE, THUNDERBOLT, or any combination of the above.
[0165] In particular embodiments, when computer system 100 is connected to network 130, computer system 100 may communicate with other devices, specifically mobile devices and enterprise systems, distributed computing systems, cloud storage systems, cloud computing systems, and the like, connected to network 130. Communications to and from computer system 100 may be sent through network interface 120. For example, network interface 120 may receive incoming communications (such as requests or responses from other devices) in the form of one or more packets (such as Internet Protocol (IP) packets) from network 130, and computer system 100 may store the incoming communications in memory 103 for processing. Computer system 100 may similarly store outgoing communications (such as requests or responses to other devices) in the form of one or more packets in memory 103, to be communicated to network 130 from network interface 120. Processor(s) 101 may access these communication packets stored in memory 103 for processing.
[0166] Examples of the network interface 120 include, but are not limited to, a network interface card, a modem, and any combination thereof. Examples of a network 130 or network segment 130 include, but are not limited to, a distributed computing system, a cloud computing system, a wide area network (WAN) (e.g., the Internet, an enterprise network), a local area network (LAN) (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a direct connection between two computing devices, a peer-to-peer network, and any combinations thereof. A network, such as network 130, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used.
[0167] Information and data can be displayed through a display 132. Examples of a display 132 include, but are not limited to, a cathode ray tube (CRT), a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display such as a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display, a plasma display, and any combinations thereof. The display 132 can interface to the processor(s) 101, memory 103, and fixed storage 108, as well as other devices, such as input device(s) 133, via the bus 140. The display 132 is linked to the bus 140 via a video interface 122, and transport of data between the display 132 and the bus 140 can be controlled via the graphics control 121. In some embodiments, the display is a video projector. In some embodiments, the display is a head-mounted display (HMD) such as a VR headset. In further embodiments, suitable VR headsets include, by way of non-limiting examples, HTC Vive, Oculus Rift, Samsung Gear VR, Microsoft HoloLens, Razer OSVR, FOVE VR, Zeiss VR One, Avegant Glyph, Freefly VR headset, and the like. In still further embodiments, the display is a combination of devices such as those disclosed herein.
[0168] In addition to a display 132, computer system 100 may include one or more other peripheral output devices 134 including, but not limited to, an audio speaker, a printer, a storage device, and any combinations thereof. Such peripheral output devices may be connected to the bus 140 via an output interface 124. Examples of an output interface 124 include, but are not limited to, a serial port, a parallel connection, a USB port, a FIREWIRE port, a THUNDERBOLT port, and any combinations thereof.
[0169] In addition or as an alternative, computer system 100 may provide functionality as a result of logic hardwired or otherwise embodied in a circuit, which may operate in place of or together with software to execute one or more processes or one or more steps of one or more processes described or illustrated herein. Reference to software in this disclosure may encompass logic, and reference to logic may encompass software. Moreover, reference to a computer-readable medium may encompass a circuit (such as an IC) storing software for execution, a circuit embodying logic for execution, or both, where appropriate. The present disclosure encompasses any suitable combination of hardware, software, or both.
[0170] Those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software,
various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality.
[0171] The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general- purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[0172] The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by one or more processor(s), or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
[0173] In accordance with the description herein, suitable computing devices include, by way of non-limiting examples, server computers, desktop computers, laptop computers, notebook computers, sub-notebook computers, netbook computers, netpad computers, set-top computers, media streaming devices, handheld computers, Internet appliances, mobile smartphones, tablet computers, personal digital assistants, video game consoles, and vehicles. Those of skill in the art will also recognize that select televisions, video players, and digital music players with optional computer network connectivity are suitable for use in the system described herein. Suitable tablet computers, in various embodiments, include those with booklet, slate, and convertible configurations, known to those of skill in the art.
[0174] In some embodiments, the computing device includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, which manages the device’s hardware and provides services for execution of applications. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®. Those of skill in the art will recognize that suitable personal computer operating systems include, by way of non-limiting examples, Microsoft® Windows®, Apple® Mac OS X®, UNIX®, and UNIX-like operating systems such as GNU/Linux®. In some embodiments, the operating system is provided by cloud computing. Those of skill in the art will also recognize that suitable mobile smartphone operating systems include, by way of non-limiting examples, Nokia® Symbian® OS, Apple® iOS®, Research In Motion® BlackBerry OS®, Google® Android®, Microsoft® Windows Phone® OS, Microsoft® Windows Mobile® OS, Linux®, and Palm® WebOS®. Those of skill in the art will also recognize that suitable media streaming device operating systems include, by way of non-limiting examples, Apple TV®, Roku®, Boxee®, Google TV®, Google Chromecast®, Amazon Fire®, and Samsung® HomeSync®. Those of skill in the art will also recognize that suitable video game console operating systems include, by way of non-limiting examples, Sony® PS3®, Sony® PS4®, Microsoft® Xbox 360®, Microsoft Xbox One, Nintendo® Wii®, Nintendo® Wii U®, and Ouya®.
Non-transitory computer readable storage medium
[0175] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more non-transitory computer readable storage media encoded with a program including instructions executable by the operating system of an optionally networked computing device. In further embodiments, a computer readable storage medium is a tangible component of a computing device. In still further embodiments, a computer readable storage medium is optionally removable from a computing device. In some embodiments, a computer readable storage medium includes, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, solid state memory, magnetic disk drives, magnetic tape drives, optical disk drives, distributed computing systems including cloud computing systems and services, and the like. In some cases, the program and instructions are permanently, substantially permanently, semi-permanently, or non-transitorily encoded on the media.
Computer program
[0176] In some embodiments, the platforms, systems, media, and methods disclosed herein include at least one computer program, or use of the same. A computer program includes a sequence of instructions, executable by one or more processor(s) of the computing device’s CPU, written to perform a specified task. Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), computing data structures, and the like, that perform particular tasks or implement particular abstract data types. In light of the disclosure provided herein, those of skill in the art will recognize that a computer program may be written in various versions of various languages.
[0177] The functionality of the computer readable instructions may be combined or distributed as desired in various environments. In some embodiments, a computer program comprises one sequence of instructions. In some embodiments, a computer program comprises a plurality of sequences of instructions. In some embodiments, a computer program is provided from one location. In other embodiments, a computer program is provided from a plurality of locations. In various embodiments, a computer program includes one or more software modules. In various embodiments, a computer program includes, in part or in whole, one or more web applications, one or more mobile applications, one or more standalone applications, one or more web browser plug-ins, extensions, add-ins, or add-ons, or combinations thereof.
Web application
[0178] In some embodiments, a computer program includes a web application. In light of the disclosure provided herein, those of skill in the art will recognize that a web application, in various embodiments, utilizes one or more software frameworks and one or more database systems. In some embodiments, a web application is created upon a software framework such as Microsoft® .NET or Ruby on Rails (RoR). In some embodiments, a web application utilizes one or more database systems including, by way of non-limiting examples, relational, nonrelational, object oriented, associative, XML, and document-oriented database systems. In further embodiments, suitable relational database systems include, by way of non-limiting examples, Microsoft® SQL Server, mySQL™, and Oracle®. Those of skill in the art will also recognize that a web application, in various embodiments, is written in one or more versions of one or more languages. A web application may be written in one or more markup languages, presentation definition languages, client-side scripting languages, server-side
coding languages, database query languages, or combinations thereof. In some embodiments, a web application is written to some extent in a markup language such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), or eXtensible Markup Language (XML). In some embodiments, a web application is written to some extent in a presentation definition language such as Cascading Style Sheets (CSS). In some embodiments, a web application is written to some extent in a client-side scripting language such as Asynchronous JavaScript and XML (AJAX), Flash® ActionScript, JavaScript, or Silverlight®. In some embodiments, a web application is written to some extent in a server-side coding language such as Active Server Pages (ASP), ColdFusion®, Perl, Java™, JavaServer Pages (JSP), Hypertext Preprocessor (PHP), Python™, Ruby, Tcl, Smalltalk, WebDNA®, or Groovy. In some embodiments, a web application is written to some extent in a database query language such as Structured Query Language (SQL). In some embodiments, a web application integrates enterprise server products such as IBM® Lotus Domino®. In some embodiments, a web application includes a media player element. In various further embodiments, a media player element utilizes one or more of many suitable multimedia technologies including, by way of non-limiting examples, Adobe® Flash®, HTML 5, Apple® QuickTime®, Microsoft® Silverlight®, Java™, and Unity®.
[0179] Referring to FIG. 2, in a particular embodiment, an application provision system comprises one or more databases 200 accessed by a relational database management system (RDBMS) 210. Suitable RDBMSs include Firebird, MySQL, PostgreSQL, SQLite, Oracle Database, Microsoft SQL Server, IBM DB2, IBM Informix, SAP Sybase, Teradata, and the like. In this embodiment, the application provision system further comprises one or more application servers 220 (such as Java servers, .NET servers, PHP servers, and the like) and one or more web servers 230 (such as Apache, IIS, GWS and the like). The web server(s) optionally expose one or more web services via application programming interfaces (APIs) 240. Via a network, such as the Internet, the system provides browser-based and/or mobile native user interfaces.
[0180] Referring to FIG. 3, in a particular embodiment, an application provision system alternatively has a distributed, cloud-based architecture 300 and comprises elastically load balanced, auto-scaling web server resources 310 and application server resources 320 as well as synchronously replicated databases 330.
Mobile application
[0181] In some embodiments, a computer program includes a mobile application provided to a mobile computing device. In some embodiments, the mobile application is provided to a mobile computing device at the time it is manufactured. In other embodiments, the mobile application is provided to a mobile computing device via the computer network described herein.
[0182] In view of the disclosure provided herein, a mobile application is created by techniques known to those of skill in the art using hardware, languages, and development environments known to the art. Those of skill in the art will recognize that mobile applications are written in several languages. Suitable programming languages include, by way of non-limiting examples, C, C++, C#, Objective-C, Java™, JavaScript, Pascal, Object Pascal, Python™, Ruby, VB.NET, WML, and XHTML/HTML with or without CSS, or combinations thereof.
[0183] Suitable mobile application development environments are available from several sources. Commercially available development environments include, by way of non-limiting examples, AirplaySDK, alcheMo, Appcelerator®, Celsius, Bedrock, Flash Lite, .NET Compact Framework, Rhomobile, and WorkLight Mobile Platform. Other development environments are available without cost including, by way of non-limiting examples, Lazarus, MobiFlex, MoSync, and Phonegap. Also, mobile device manufacturers distribute software developer kits including, by way of non-limiting examples, iPhone and iPad (iOS) SDK, Android™ SDK, BlackBerry® SDK, BREW SDK, Palm® OS SDK, Symbian SDK, webOS SDK, and Windows® Mobile SDK.
[0184] Those of skill in the art will recognize that several commercial forums are available for distribution of mobile applications including, by way of non-limiting examples, Apple® App Store, Google® Play, Chrome WebStore, BlackBerry® App World, App Store for Palm devices, App Catalog for webOS, Windows® Marketplace for Mobile, Ovi Store for Nokia® devices, Samsung® Apps, and Nintendo® DSi Shop.
Standalone application
[0185] In some embodiments, a computer program includes a standalone application, which is a program that is run as an independent computer process, not an add-on to an existing process, e.g., not a plug-in. Those of skill in the art will recognize that standalone applications are often compiled. A compiler is a computer program(s) that transforms source code written in a programming language into binary object code such as assembly language
or machine code. Suitable compiled programming languages include, by way of non-limiting examples, C, C++, Objective-C, COBOL, Delphi, Eiffel, Java™, Lisp, Python™, Visual Basic, and VB.NET, or combinations thereof. Compilation is often performed, at least in part, to create an executable program. In some embodiments, a computer program includes one or more executable compiled applications.
Web browser plug-in
[0186] In some embodiments, the computer program includes a web browser plug-in (e.g., extension, etc.). In computing, a plug-in is one or more software components that add specific functionality to a larger software application. Makers of software applications support plug-ins to enable third-party developers to create abilities which extend an application, to support easily adding new features, and to reduce the size of an application. When supported, plug-ins enable customizing the functionality of a software application. For example, plug-ins are commonly used in web browsers to play video, generate interactivity, scan for viruses, and display particular file types. Those of skill in the art will be familiar with several web browser plug-ins including, Adobe® Flash® Player, Microsoft® Silverlight®, and Apple® QuickTime®. In some embodiments, the toolbar comprises one or more web browser extensions, add-ins, or add-ons. In some embodiments, the toolbar comprises one or more explorer bars, tool bands, or desk bands.
[0187] In view of the disclosure provided herein, those of skill in the art will recognize that several plug-in frameworks are available that enable development of plug-ins in various programming languages, including, by way of non-limiting examples, C++, Delphi, Java™, PHP, Python™, and VB.NET, or combinations thereof.
[0188] Web browsers (also called Internet browsers) are software applications, designed for use with network-connected computing devices, for retrieving, presenting, and traversing information resources on the World Wide Web. Suitable web browsers include, by way of non-limiting examples, Microsoft® Internet Explorer®, Mozilla® Firefox®, Google® Chrome, Apple® Safari®, Opera Software® Opera®, and KDE Konqueror. In some embodiments, the web browser is a mobile web browser. Mobile web browsers (also called microbrowsers, mini-browsers, and wireless browsers) are designed for use on mobile computing devices including, by way of non-limiting examples, handheld computers, tablet computers, netbook computers, subnotebook computers, smartphones, music players, personal digital assistants (PDAs), and handheld video game systems. Suitable mobile web browsers include, by way of non-limiting examples, Google® Android® browser, RIM BlackBerry® Browser, Apple®
Safari®, Palm® Blazer, Palm® WebOS® Browser, Mozilla® Firefox® for mobile, Microsoft® Internet Explorer® Mobile, Amazon® Kindle® Basic Web, Nokia® Browser, Opera Software® Opera® Mobile, and Sony® PSP™ browser.
Software modules
[0189] In some embodiments, the platforms, systems, media, and methods disclosed herein include software, server, and/or database modules, or use of the same. In view of the disclosure provided herein, software modules are created by techniques known to those of skill in the art using machines, software, and languages known to the art. The software modules disclosed herein are implemented in a multitude of ways. In various embodiments, a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof. In further various embodiments, a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof. In various embodiments, the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application. In some embodiments, software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on a distributed computing platform such as a cloud computing platform. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
Databases
[0190] In some embodiments, the platforms, systems, media, and methods disclosed herein include one or more databases, or use of the same. In view of the disclosure provided herein, those of skill in the art will recognize that many databases are suitable for storage and retrieval of information associated with registered movements for a guided rehabilitation and training system. In various embodiments, suitable databases include, by way of non-limiting examples, relational databases, non-relational databases, object-oriented databases, object databases, entity-relationship model databases, associative databases, XML databases, and document-oriented databases. Further non-limiting examples include SQL, PostgreSQL, MySQL, Oracle, DB2, Sybase, and MongoDB. In some embodiments, a database is Internet-
based. In further embodiments, a database is web-based. In still further embodiments, a database is cloud computing based. In a particular embodiment, a database is a distributed database. In other embodiments, a database is based on one or more local computer storage devices.
V. EXAMPLES
[0191] The following examples are included for illustrative purposes only and are not intended to limit the scope of the invention.
Example 1: Automated Physical Therapy of a Wrist Joint
[0192] In some embodiments, a user interacts with a haptic array device, as disclosed herein, to perform a series of movements as part of a physical therapy program or exercise. Instructions may be provided to the haptic array device to implement the physical therapy program. Instructions may be provided via an external device, downloaded from a web/cloud server, provided by a media storage system, etc. Physical therapy sessions may be based on movements recorded by a physical therapist or instructor, as disclosed herein.
[0193] With reference to FIG. 6, an exemplary method of using a haptic array device to guide a user through a physical therapy program is provided, according to some embodiments. In the exemplary embodiment, a physical therapy program refers to a series of exercises or movements to be completed by a user. A physical therapy program may comprise a series of sessions wherein exercises are completed over a duration of time for purposes of training or rehabilitation. While conducting a physical therapy session, a monitoring system, as disclosed herein, may record a user’s actions, and be utilized to track the user’s progress. The exemplary embodiment herein refers to a physical therapy program which comprises a series of physical therapy sessions, wherein each physical therapy session comprises a series of exercises.
[0194] In some embodiments, a physical therapy session begins when a physical therapy program is provided to the haptic array device, at step 610. A physical therapy program may include instructions for monitoring and providing haptic feedback to a user as they complete a series of guided exercises.
[0195] After the instructions are provided to the haptic array device, the device is ready to guide and monitor gestures made by a user. At step 620, the user is guided to an initial position for an exercise. In some embodiments, the haptic array device is configured to guide a series of exercises a user makes with one hand as part of a physical therapy program for a wrist joint.
In some embodiments, the haptic array device produces a visualization of where the user should place their hand, fingers, arm, or a combination thereof. The visualization may be a marker to indicate where the user should place the center of their palm or hand. In some embodiments, once the user places their hand in the proper position, the haptic array device provides feedback to confirm the user's hand is in a proper position. In some embodiments, the feedback comprises haptic feedback from the transducer array and/or a laser system of the device. In some embodiments, the sonic haptics may be applied to the palm to indicate proper positioning and the laser haptics may be applied to each fingertip to indicate that the hand and fingers are in proper position and that the device is ready to monitor the user’s gestures.
[0196] At step 630, the device guides the user to move their hand to perform a physical therapy exercise. Guiding of the user’s hand may comprise visual and/or haptic feedback. At step 640, the user completes a series of gestures or movements which are captured by the monitoring system of the haptic array device, according to some embodiments. A series of gestures might include at least two gestures made by the hand, or at least one movement from a first gesture to a second gesture. In some embodiments, the number of gestures is limited by the device. In some embodiments, the number of gestures is chosen by the user. At step 650, successful completion of a prescribed movement is confirmed by haptic feedback provided by the device. At step 660, the user is instructed to repeat the same movement as part of an exercise. After a prescribed number of repetitions, completion of an exercise is confirmed at step 670.
[0197] In some embodiments, a physical therapy program comprises a series of exercises, and upon completion of a first exercise a subsequent exercise is provided and steps 620 to 660 are repeated for each subsequent exercise. At step 680, upon completion of a physical therapy session, a user is provided with confirmation that the physical therapy session has been completed. In some embodiments, data from the monitoring system is compiled to update a user and/or their physical therapist as to the progress made during each session. In some embodiments, completion of a physical therapy session initiates a haptic therapy or spa program, as described herein.
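For clarity, the session flow of FIG. 6 can be summarized as a plain control loop; the `device` object and its method names below are hypothetical placeholders for illustration, not an actual device API.

```python
def run_session(program, device):
    """Schematic of the FIG. 6 session flow (steps 610-680) as a loop."""
    device.load_program(program)                       # step 610
    for exercise in program.exercises:
        device.guide_to_initial_position(exercise)     # step 620
        for _ in range(exercise.repetitions):
            device.guide_movement(exercise)            # step 630
            gestures = device.capture_gestures()       # step 640
            if exercise.matches(gestures):
                device.confirm_with_haptics()          # step 650
            # step 660: the same movement is repeated each iteration
        device.confirm_exercise_complete()             # step 670
    device.confirm_session_complete()                  # step 680
```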
Example 2: Individualized Patient Care
[0198] In one example, patient Jacob is diagnosed as having multiple sclerosis and is instructed by his physician to employ the systems herein. Per FIG. 8B, the monitoring device displays an image of three balls and Jacob is instructed to manipulate one of the balls. As Jacob reaches out towards the image of the ball, the haptic feedback of the monitoring device
provides Jacob with the sensation that his fingers are contacting the ball. Simultaneously, a camera and a time-of-flight sensor on the monitoring device and his ancillary device record a movement of Jacob’s hand as he interacts with the ball, while the biometric sensor of his ancillary device records his pulse. A delay is calculated between the measured orientation and movement of Jacob’s hand and a recorded measurement and orientation of a healthy patient. Jacob’s pain throughout his movement is determined in part based on his recorded pulse. The delay and pain measurements are provided to Jacob in an assessment.
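A hedged sketch of the delay calculation in this example, assuming sampled hand-position traces and a nominal camera frame rate (an assumption, not a specified value), could use cross-correlation:

```python
import numpy as np

def movement_delay(patient_signal, reference_signal, fs=60.0):
    """Illustrative delay estimate: the lag (in seconds) that best aligns
    a patient's tracked hand trajectory with a healthy reference."""
    p = np.asarray(patient_signal, dtype=float) - np.mean(patient_signal)
    r = np.asarray(reference_signal, dtype=float) - np.mean(reference_signal)
    corr = np.correlate(p, r, mode="full")
    lag = corr.argmax() - (len(r) - 1)   # samples the patient lags the reference
    return lag / fs
```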
[0199] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.
Claims
1. A system for assessment of a patient, the system comprising:
(a) an ancillary device comprising a biometric sensor configured to measure a biometric data;
(b) a monitoring device comprising:
(i) a display configured to show a display image;
(ii) a camera, a time-of-flight sensor, or both, configured to capture a plurality of pose images of the patient; and
(iii) a haptic array comprising a plurality of ultrasonic devices; and
(c) a non-transitory computer-readable storage media encoded with instructions executable by at least one processor to cause the at least one processor to perform operations comprising:
(i) directing at least a portion of the plurality of ultrasonic devices in the haptic array to emit an acoustic field based on the display image;
(ii) determining two or more patient poses based at least in part on the biometric data and the plurality of pose images; and
(iii) determining the assessment of the patient based at least in part on the display image and at least a portion of the two or more patient poses.
2. The system of claim 1, wherein the ancillary device is configured to couple to an appendage of the patient.
3. The system of claim 1, wherein the biometric sensor comprises an inertial motion unit, a photoplethysmography sensor, a photoacoustic sensor, an ultrasound sensor, a glucose sensor, a bioimpedance sensor, an electrodermal activity sensor, a temperature sensor, a vision shadow capture sensor, an altimeter, a barometer, a humidity sensor, a sweat rate sensor, a hydration sensor, a bioacoustics sensor, a dynamometer, an electrodermal sensor, or any combination thereof.
4. The system of claim 1, wherein the haptic array is a planar array.
5. The system of claim 1, wherein the haptic array is a non-planar array.
6. The system of claim 1, wherein the plurality of ultrasonic devices comprises an ultrasonic speaker, a piezoelectric speaker, an electromagnetic speaker, or any combination thereof.
7. The system of claim 1, wherein at least a portion of the plurality of ultrasonic devices have a frequency of less than about 20 MHz.
8. The system of claim 1, wherein the camera comprises a two-dimensional camera.
9. The system of claim 1, wherein the camera comprises a three-dimensional camera.
10. The system of claim 1, wherein the camera is an infrared camera, a near infrared camera, a visible light camera, an ultra-violet spectrum camera, a thermographic camera, or any combination thereof.
11. The system of claim 1, wherein the camera, the time-of-flight sensor, or both, capture data at a rate of about 10 Hz to 10,000 Hz.
12. The system of claim 1, wherein each of the two or more patient poses comprises a position, an orientation, or both of an appendage of the patient.
13. The system of claim 1, wherein the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm.
14. The system of claim 1, wherein the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
15. The system of claim 1, wherein the monitoring device further comprises a speaker, a joystick, a trackpad, a trackball, or any combination thereof.
16. The system of claim 1, wherein the monitoring device or the ancillary device comprises the non-transitory computer-readable storage media.
17. The system of claim 1, wherein the ancillary device further comprises an ancillary communication device and wherein the monitoring device further comprises a monitoring communication device communicably coupled to the ancillary device.
18. The system of claim 17, wherein the ancillary communication device and the monitoring communication device are wireless communication devices.
21. A computer-implemented method of assessing a patient, the method comprising:
(a) showing a display image on a display while:
(i) receiving, from an ancillary device, a biometric data;
(ii) capturing, by a camera, a time-of-flight sensor, or both, a plurality of pose images of the patient; and
(iii) emitting, by a haptic array comprising a plurality of ultrasonic devices, an ultrasonic haptic based on the display image;
(b) determining two or more patient poses based at least in part on the biometric data and the plurality of pose images; and
(c) determining the assessment of the patient based at least in part on the display image and at least a portion of the two or more patient poses.
22. The method of claim 21, wherein the biometric data comprises an inertial motion unit data, a photoplethysmography data, a photoacoustic data, an ultrasound data, a glucose data, a bioimpedance data, an electrodermal activity data, a temperature data, a vision shadow capture data, an altitude data, a pressure data, a humidity data, a sweat rate data, a hydration data, a bioacoustics data, a dynamometer data, an electrodermal data, or any combination thereof.
23. The method of claim 21, wherein at least a portion of the plurality of pose images comprises a two-dimensional image.
24. The method of claim 21, wherein at least a portion of the plurality of pose images comprises a three-dimensional image.
25. The method of claim 21, wherein at least a portion of the plurality of pose images comprises an infrared image, a near infrared image, a visible light image, an ultra-violet spectrum image, a thermographic image, or any combination thereof.
26. The method of claim 21, wherein each of the two or more patient poses comprises a position, an orientation, or both of an appendage of the patient.
27. The method of claim 21, wherein the two or more patient poses, the assessment of the patient, or both are determined at least in part by a machine learning algorithm.
28. The method of claim 21, wherein the assessment of the patient is determined at least in part based on a comparison between the two or more patient poses and two or more labeled healthy poses.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202263379381P | 2022-10-13 | 2022-10-13 | |
US63/379,381 | 2022-10-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024081781A1 true WO2024081781A1 (en) | 2024-04-18 |
Family
ID=90670356
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2023/076681 WO2024081781A1 (en) | 2022-10-13 | 2023-10-12 | Rehab and training interactive and tactile projections of sound and light |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024081781A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11263919B2 (en) * | 2013-03-15 | 2022-03-01 | Nike, Inc. | Feedback signals from image data of athletic performance |
US20170164876A1 (en) * | 2014-07-17 | 2017-06-15 | Elwha Llc | Monitoring body movement or condition according to motion regimen with conformal electronics |
US20190196578A1 (en) * | 2017-12-22 | 2019-06-27 | Ultrahaptics Limited | Tracking in Haptic Systems |
Similar Documents
Publication | Title |
---|---|
JP7091531B2 (en) | Methods for physical gesture interface and projection display |
Hellsten et al. | The potential of computer vision-based marker-less human motion analysis for rehabilitation |
Mousavi Hondori et al. | A review on technical and clinical impact of Microsoft Kinect on physical therapy and rehabilitation |
Veras et al. | Scoping review of outcome measures used in telerehabilitation and virtual reality for post-stroke rehabilitation |
Torner et al. | Multipurpose virtual reality environment for biomedical and health applications |
TWI589274B (en) | Virtual reality system for psychological clinical application |
Athanasiou et al. | Towards Rehabilitation Robotics: Off-the-Shelf BCI Control of Anthropomorphic Robotic Arms |
US20220133589A1 (en) | Systems and methods for thermographic body mapping with therapy |
Tedesco et al. | Design of a multi-sensors wearable platform for remote monitoring of knee rehabilitation |
US20240149060A1 (en) | Electrical stimulation device and electrical stimulation method |
Rechy-Ramirez et al. | Impact of commercial sensors in human computer interaction: a review |
Powell et al. | OpenButterfly: Multimodal rehabilitation analysis of immersive virtual reality for physical therapy |
Rahman | Multimedia environment toward analyzing and visualizing live kinematic data for children with hemiplegia |
Ortiz-Catalan et al. | Virtual reality |
Kapsalyamov et al. | Brain-computer interface and assist-as-needed model for upper limb robotic arm |
CA3185760A1 (en) | Brain injury rehabilitation method utilizing brain activity monitoring |
Athanasiou et al. | Wireless Brain-Robot Interface: User Perception and Performance Assessment of Spinal Cord Injury Patients |
Soccini et al. | The ethics of rehabilitation in virtual reality: the role of self-avatars and deep learning |
Frey et al. | EEG-based neuroergonomics for 3D user interfaces: opportunities and challenges |
KR102220837B1 (en) | Augmented reality-based mirror exercise system for exercise rehabilitation of patients with nervous and musculoskeletal system |
Lim et al. | Impact of mixed reality-based rehabilitation on muscle activity in lower-limb amputees: An EMG analysis |
Gomilko et al. | Attention training game with Aldebaran Robotics NAO and brain-computer interface |
WO2024081781A1 (en) | Rehab and training interactive and tactile projections of sound and light |
Valdivia et al. | Development and evaluation of two posture-tracking user interfaces for occupational health care |
Qu et al. | Understanding the impact of longitudinal VR training on users with mild cognitive impairment using fNIRS and behavioral data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23878223; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |