WO2020003130A1 - System and methods for quantifying manual therapy - Google Patents

System and methods for quantifying manual therapy

Info

Publication number
WO2020003130A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
practitioner
subject
manipulation
sensor
Prior art date
Application number
PCT/IB2019/055351
Other languages
French (fr)
Inventor
Marcin GOSZCZYNSKI
Piotr Goszczynski
Arne DANKERS
Marcin BIELAK
Michael Wilt
Dawid LEWANDOWICZ
Original Assignee
S29 Technologies Inc
Priority date
Filing date
Publication date
Application filed by S29 Technologies Inc
Publication of WO2020003130A1


Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
              • A61B 5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
                • A61B 5/02055 Simultaneously evaluating both cardiovascular condition and temperature
            • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
              • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
                • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
            • A61B 5/40 Detecting, measuring or recording for evaluating the nervous system
              • A61B 5/4058 for evaluating the central nervous system
            • A61B 5/48 Other medical applications
              • A61B 5/4836 Diagnosis combined with treatment in closed-loop systems or methods
            • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
              • A61B 5/6801 specially adapted to be attached to or worn on the body surface
                • A61B 5/6813 Specially adapted to be attached to a specific body part
                  • A61B 5/6814 Head
          • A61B 2505/00 Evaluating, monitoring or diagnosing in the context of a particular type of medical care
            • A61B 2505/09 Rehabilitation or training
        • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
          • A61H 1/00 Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
            • A61H 1/008 Apparatus for applying pressure or blows almost perpendicular to the body or limb axis, e.g. chiropractic devices for repositioning vertebrae, correcting deformation
            • A61H 1/02 Stretching or bending or torsioning apparatus for exercising
              • A61H 1/0292 for the spinal column
                • A61H 1/0296 Neck
          • A61H 2201/00 Characteristics of apparatus not provided for in the preceding codes
            • A61H 2201/12 Driving means
              • A61H 2201/1253 driven by a human being, e.g. hand driven
            • A61H 2201/16 Physical interface with patient
              • A61H 2201/1602 kind of interface, e.g. head rest, knee support or lumbar support
                • A61H 2201/1604 Head
            • A61H 2201/50 Control means thereof
              • A61H 2201/5023 Interfaces to the user
                • A61H 2201/5043 Displays
              • A61H 2201/5058 Sensors or detectors
                • A61H 2201/5061 Force sensors
                • A61H 2201/5071 Pressure sensors
                • A61H 2201/5084 Acceleration sensors
              • A61H 2201/5097 wireless
          • A61H 2230/00 Measuring physical parameters of the user
            • A61H 2230/04 Heartbeat characteristics, e.g. E.C.G., blood pressure modulation
              • A61H 2230/06 Heartbeat rate
            • A61H 2230/20 Blood composition characteristics
              • A61H 2230/207 partial O2-value
            • A61H 2230/30 Blood pressure
            • A61H 2230/40 Respiratory characteristics
            • A61H 2230/50 Temperature
    • G PHYSICS
      • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
            • G16H 20/30 relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
          • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H 40/60 for the operation of medical equipment or devices
              • G16H 40/63 for local operation
              • G16H 40/67 for remote operation
          • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
            • G16H 50/30 for calculating health indices; for individual health risk assessment

Definitions

  • the present invention relates to methods for assessing a manipulation by a practitioner on a first body part of a subject, wherein the practitioner has (e.g., wearing or using) at least one (e.g., one or more) practitioner edge device that includes at least one (e.g., one or more) practitioner sensor and the subject has (e.g., wearing or using) at least one (e.g., one or more) subject edge device that includes at least one (e.g., one or more) subject sensor on a second body part.
  • a manipulation device or tool having an edge device embedded therein can be worn or used by the practitioner and/or subject. For example, the practitioner can wear a strap having an edge device and/or use a towel having an edge device when performing the manipulation/contact.
  • the practitioner performs the manipulation to the first body part of the subject, wherein the first body part and the second body part are the same body part or different body parts, and wherein the practitioner sensor and the subject sensor each comprise a (e.g., one or more) gyroscope, accelerometer, force sensor, magnetometer, or combination thereof, and optionally, at least one (e.g., one or more) central nervous system sensor.
  • the steps of the inventive method include transmitting practitioner data of the manipulation from the at least one practitioner sensor to a network and transmitting subject data of the manipulation from the at least one subject sensor to the network.
  • the next step involves processing the practitioner data and the subject data to assess the manipulation.
  • the central nervous system sensor includes, for example, a heart rate monitor, a blood pressure monitor, a thermometer, an oxygen saturation monitor, a respiratory monitor, a blood lactate monitor, a blood glucose monitor, a vascular flow monitor, a brain wave monitor, a sweat and saliva monitor, a pain sensor, or a combination thereof, and optionally a microphone or a video camera.
  • a “monitor” is also a sensor.
  • the practitioner data or subject data of the manipulation each include, in an aspect, position data representative of position or movement of the body part (e.g., angular velocity), acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain data, and any combination thereof.
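  • As a purely illustrative aid (not part of the patent), the kinds of practitioner/subject data listed above could be carried in a record like the following Python sketch; the class name, field names and units are assumptions introduced here for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ManipulationSample:
    """One time-stamped reading from a practitioner or subject edge device (hypothetical schema)."""
    timestamp_ms: int                                  # milliseconds since the session started
    source: str                                        # "practitioner" or "subject"
    orientation_deg: Tuple[float, float, float]        # roll, pitch, yaw (position/movement data)
    angular_velocity_dps: Tuple[float, float, float]   # gyroscope output, degrees per second
    acceleration_ms2: Tuple[float, float, float]       # accelerometer output, m/s^2
    force_mv: Optional[float] = None                   # raw force-sensor reading (mV), if present
    heart_rate_bpm: Optional[float] = None             # optional central nervous system data
    spo2_pct: Optional[float] = None                   # oxygen saturation
    temperature_c: Optional[float] = None              # temperature
```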
  • the practitioner manipulation device having an edge device attached thereto or embedded therein, can be in the form of a band or strap (e.g., a body band), a ring, a towel, a glove and the like.
  • the practitioner manipulation device having an edge device attached thereto or embedded therein, can be placed on the practitioner’s hand, arm, foot, knee, leg, hip, torso, back, head, or combination thereof.
  • the subject manipulation device having an edge device includes a band (e.g., a body band), a strap, a headband, a bracelet, a ring, and the like.
  • the subject edge device can be placed on the subject’s body part, such as a head, hand, arm, foot, knee, leg, hip, torso, back, or combination thereof.
  • the method further includes, in one embodiment, transmitting the practitioner data and the subject data wirelessly to the network (e.g., having a server that is local, on a cloud, or both).
  • the method can further include the step of storing the practitioner data and the subject data, and processing the practitioner data and the subject data to assess the manipulation. In an embodiment, these data can be compiled to obtain indicia of the manipulation. In another embodiment, processing the practitioner data and the subject data to assess the manipulation involves comparing the practitioner data with the subject data to one another or to a standard.
  • the inventive methodology can further include transmitting practitioner data from more than one time point, transmitting subject data from more than one time point, and compiling or comparing the data from those time points (e.g., prior to the manipulation, during the manipulation, and after the manipulation).
  • the method can further include displaying the data in a representation such as a line graph, a three-dimensional representation of the movement, video (e.g., interactive video) and/or written description(s).
  • the inventive method of assessing a manipulation includes the practitioner performing the manipulation to the first body part of the subject; transmitting practitioner data of the manipulation from the practitioner sensor to a network; transmitting subject data of the manipulation from the subject sensor to the network; processing the practitioner data and the subject data to assess the manipulation to thereby obtain a manipulation indicia; and providing an output of the manipulation indicia.
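  • A minimal sketch of that processing step, assuming the hypothetical ManipulationSample records from the sketch above; the summary statistics, the comparison to a single standard value, and the tolerance are illustrative assumptions, not the patent's algorithm.

```python
from statistics import mean

def assess_manipulation(practitioner_samples, subject_samples,
                        standard_peak_force_mv, tolerance=0.15):
    """Compile the two data streams and compare them to a standard to obtain a manipulation indicia."""
    # Compile: reduce each stream to a few summary quantities (assumes a non-empty practitioner stream).
    peak_force = max((s.force_mv or 0.0) for s in practitioner_samples)
    peak_accel = max(max(abs(a) for a in s.acceleration_ms2) for s in practitioner_samples)
    heart_rates = [s.heart_rate_bpm for s in subject_samples if s.heart_rate_bpm is not None]

    # Compare: deviation of the applied peak force from the chosen standard.
    deviation = abs(peak_force - standard_peak_force_mv) / standard_peak_force_mv
    return {
        "peak_force_mv": peak_force,
        "peak_acceleration_ms2": peak_accel,
        "subject_mean_heart_rate_bpm": mean(heart_rates) if heart_rates else None,
        "within_standard": deviation <= tolerance,
    }
```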
  • the present invention also relates to systems for assessing a manipulation by a practitioner on a first body part of a subject, wherein the practitioner has (e.g., wearing or using) at least one practitioner edge device that includes at least one practitioner sensor and the subject has (e.g., wearing or using) at least one subject edge device that includes at least one subject sensor on a second body part.
  • the practitioner performs the manipulation to the first body part of the subject, wherein the first body part and the second body part are the same body part or different body parts.
  • the system encompasses the practitioner edge device having at least one practitioner sensor that obtains and transmits practitioner data; wherein the at least one practitioner sensor comprises a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; wherein the practitioner sensor further includes a wireless communication component.
  • the system also includes at least one subject edge device having at least one subject sensor that obtains and transmits subject data; wherein the subject sensor comprises a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; wherein the subject sensor further has a wireless communication component; wherein, when in use, the subject wears at least one sensor on the second body part.
  • the system further includes a network that has at least one memory unit for storing processor-executable instructions, the practitioner data and the subject data; and a processing unit for accessing the at least one memory and executing the processor-executable instructions, wherein the processing unit receives and processes the practitioner data and the subject data to assess the manipulation and thereby obtain a manipulation indicia.
  • the system includes a communication module that receives the practitioner data and subject data; and an output device for providing the manipulation assessment.
  • the system in an embodiment includes a hub having a second communication module that receives the practitioner data and subject data and transmits said practitioner data and subject data to the network.
  • the practitioner edge device and/or the subject edge device can each include a sensor, an analog to digital converter, an actuator, a digital to analog converter, a communication module (e.g., a bus), a processor, wireless connectivity, memory, a power supply (e.g., a battery), or a combination thereof.
  • in an embodiment, the processing unit's processing of the practitioner data and the subject data to assess the manipulation includes compiling the practitioner data and the subject data.
  • in another embodiment, this processing includes comparing the practitioner data with the subject data, or comparing the practitioner data and the subject data to a standard.
  • the processing unit can also process the practitioner data from more than one time point and the subject data from more than one time point, and compile or compare the data from those time points (e.g., prior to the manipulation, during the manipulation and after the manipulation).
  • the output device includes a display providing the data in a graphical representation, e.g., a line graph or a three-dimensional representation of the movement.
  • the present invention has a number of advantages, as follows.
  • the present invention provides for a quantifiable, objective characterization of manipulations performed on a patient at one or more timepoints.
  • the methods and systems of the present invention provide specific and detailed data acquired from treatments, a cumulative time line of treatment data, treatment data for each subject, treatment data for each practitioner, cross analysis of treatment data, collected treatment data utilization in manual therapy research, and collected treatment markers and fundamental physics equations.
  • An additional advantage of this device and system is that it provides instantaneous feedback about a performed manipulation.
  • a practitioner can improve their quality of treatment by repeating certain manipulations if their original objective(s) aren’t achieved.
  • Such data, obtained from both the practitioner and the subject, improves the health of the patient and allows for better education, uniformity in the applied techniques, and development of medical treatment standards. Subjects who utilize various practitioners can now acquire data from each of them.
  • An additional advantage of this invention is that it will have the ability in the future, through acquired data, to create an automated machine to assist a practitioner in performing manual therapy techniques and/or replace the practitioner altogether. This would be advantageous to patients, animals, students, educators, insurance companies and regulatory boards when experts/practitioners cannot be present to provide care.
  • Fig. 1 is a schematic showing the flow of data from the subject edge devices and practitioner edge devices via a network to a hub, if present, or otherwise directly to the cloud, where the data analysis occurs.
  • the Figure also shows output devices from the hub and/or the cloud.
  • storage can also occur on the sensors themselves using a memory card. Data transfer can occur, e.g., wirelessly or over a hardwired connection via a micro-USB cord.
  • Fig. 2 is a schematic showing the steps of the present invention involving receiving data from the subject edge device and the practitioner edge device, compiling and/or comparing the data to provide an indication of the quality or effectiveness of the manipulation, and then displaying, transmitting and/or storing the indicia, and optionally compiling and/or comparing two or more stored indicators and displaying that compilation/comparison.
  • Fig. 3 is a schematic of an edge device used in the present invention, wherein the edge device includes a sensor board (e.g., an analog sensor, analog to digital converter, actuator, and digital to analog converter), a communication module (e.g., a serial bus), and edge computer (e.g., a processor, wireless connectivity, memory and battery).
  • the edge device in this embodiment is a sandwich of the sensor board and the edge computer. In this figure, the edge device communicates with the hub computer.
  • FIG. 4A shows schematics of various manipulation devices having an edge device including straps and towels for use by the practitioner and/or patient.
  • Fig. 4B is a schematic showing a strap or band manipulation device (referred to as a “hand strap”), as placed on a hand for use.
  • FIG. 4C shows a schematic of a perspective view of a headband manipulation device having an edge device.
  • Fig. 4D is a schematic showing a detailed perspective view of the hand strap manipulation device shown in Fig. 4B.
  • Fig. 4E is a schematic of the hand strap in Figs. 4B and 4D showing the parts under the silicone cover.
  • FIG. 5 is a schematic showing a patient wearing a headband of the present invention shown in Fig. 4C with the practitioner wearing the hand strap shown in Fig. 4B, 4D and 4E.
  • Figs. 6A-C are a series of schematics showing a cervical spine manipulation by movement of the head of the subject, wherein both the practitioner and subject are wearing sensors (the practitioner’s edge device is only shown in Fig. 6C, since the practitioner’s edge device on their hand is under the subject’s head in Figs. 6A and 6B).
  • the figures show the manipulation as the head is rotated during the contact/manipulation.
  • Figs. 6A-C also illustrate the steps of using this invention. The pre-assessment can be performed by checking the range of motion of rotation of the head of the subject.
  • Fig. 7A is a line graph showing data from an accelerometer during the cervical manipulation of Fig. 6.
  • the units are m/s² on the y-axis and ms on the x-axis.
  • Also shown in Fig. 7A is the data from the gyroscope, in m/s² on the y-axis and ms on the x-axis.
  • These two sensors are taking data from the practitioner’s edge device.
  • Fig. 7B shows a three-dimensional graphical representation showing the side-flexion of the head and degrees of side-flexion in the sagittal axis. These data are shown as two snapshots but can be presented in video format of the movement in three dimensions. The following is shown: range of motion of the joint for the pre- and post-assessments, maximum acceleration (m/s²) of the practitioner’s hand performing the manipulation, the maximum pressure applied by the practitioner performing the manipulation, and the heart rate of the patient during the time from the pre-assessment to after the post-assessment, or, more simply, the time of the complete treatment.
  • a line graph provides bio-indicators over time, including heart rate (in beats per minute).
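  • The quantities reported in Figs. 7A and 7B (pre/post range of motion, maximum hand acceleration, maximum applied pressure, and the patient's heart rate over the complete treatment) could be derived from the recorded streams roughly as in the sketch below; this again assumes the hypothetical ManipulationSample schema and treats yaw as the rotation angle of interest, both of which are assumptions.

```python
def session_summary(pre, manip, post):
    """pre, manip, post: lists of ManipulationSample records for each treatment phase (illustrative).

    Assumes practitioner data is present during the manipulation phase.
    """
    def range_of_motion(samples):
        yaw = [s.orientation_deg[2] for s in samples if s.source == "subject"]
        return max(yaw) - min(yaw) if yaw else 0.0

    practitioner = [s for s in manip if s.source == "practitioner"]
    heart_rates = [s.heart_rate_bpm for s in pre + manip + post
                   if s.source == "subject" and s.heart_rate_bpm is not None]

    return {
        "rom_pre_deg": range_of_motion(pre),
        "rom_post_deg": range_of_motion(post),
        "max_acceleration_ms2": max(max(abs(a) for a in s.acceleration_ms2) for s in practitioner),
        "max_pressure_mv": max((s.force_mv or 0.0) for s in practitioner),
        "heart_rate_bpm_range": (min(heart_rates), max(heart_rates)) if heart_rates else None,
    }
```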
  • FIG. 8 shows a general architecture for a computer system, according to the principles herein, including the network/cloud 812, communications interface/hub 806, the output device 808, input device 810, processor 802 and memory 804.
  • Fig. 9 shows graphs providing pre-assessment orientation data for manipulation of the cervical spine by relative angle of rotation during the pre-assessment over time (seconds) (e.g., right side flexion). Additionally, the lower graph displays force/pressure in mV applied over time (seconds) as applied during the pre-assessment.
  • Fig. 10 shows a three-dimensional graphical representation showing the pre-assessment orientation data and post-assessment data for manipulation of the cervical spine by side flexion of the head in degrees of rotation.
  • Fig. 11 shows line graphs providing the manipulation data during the manipulation of the cervical spine by right side flexion of the head.
  • Fig. 11, top line graph, shows the radial speed (degrees/second) during the manipulation over the pressure (mV) applied during the manipulation.
  • Fig. 11 bottom line graph shows pressure/force in mV over time in seconds.
  • Fig. 12 is a screenshot of the software used in the present invention that shows and confirms that both sensors (at least one sensor from the practitioner and at least one sensor from the subject) are connected to the system and are streaming live data.
  • Fig. 13 is a screenshot of the software used in an embodiment of the present invention that shows live streaming data from the practitioner hand strap on that screen.
  • the present invention relates to systems and methods for assessing and evaluating the quality of manipulations or contact(s) performed by a practitioner (e.g., a clinician or a manual therapist such as Chiropractors, Doctors, Performance Therapists, Physical Therapists, Osteopaths, Osteopath Physicians, Naprapaths, Massage Therapists, Athletic Therapists, Cranial Sacral Therapists, Ki-Hara Practitioners, Reflexologists, and Rolfing Practitioners) on a subject (e.g., patient, human, animal, non-human model) by obtaining data from both the practitioner and the subject, and optionally from supporting therapeutic structures/implements (e.g., treatment tables) and auxiliary evidence recording equipment (e.g., video camera(s)).
  • This novel system and methodology obtains measurements from both the subject and the practitioner to determine the nature, quality, and/or effectiveness of manipulation(s).
  • the present invention serves to assess the quality of the manipulation, the outcome of the manipulation (e.g. if it works and how well), and the technique of the practitioner.
  • the system advantageously allows one to quantify and characterize the manipulation, which allows one to understand what techniques work and which techniques are most effective. Additionally, the feedback about the manipulation can be provided during the treatment session in real time.
  • This inventive methodology allows the practitioner to learn effective techniques and allows one to more easily reproduce the manipulation.
  • the system also allows one to assess mistakes made or ineffective techniques so as to avoid them, thereby improving treatments and technique(s) and learning from manipulations that work and those that do not.
  • the system of the present invention assesses the manipulation/contact by measuring both the practitioner’s and the subject’s movements using edge devices.
  • “manipulation” and “contact” are used interchangeably and refer to the act or process of manually adjusting a body part or manually treating a body part, both of which can be done directly or via structures in relation/communication to that body part.
  • the present invention is used in manual therapy, which refers to the application of force to the subject’s body, typically administered by the practitioner’s body but also applicable through various implements such as needles and aptly constructed instruments, to realize optimal health.
  • Various devices are referred to herein; they include an edge device, a hub device, a user console or dashboard device, and the cloud or cloud infrastructure.
  • the edge device refers to an input/output device (mechanics plus electronics) of minimal size or profile (e.g., wearable or usable during a manual adjustment) where interaction between the system and the ambient physical world happens via an appropriate set of sensors and actuators. Sensors input ambient-world data to the system, and actuators execute the implications of output data from the system. This is a device on the “interface” between the ambient physical world and the inner workings of the entire system.
  • a hub device refers to a device that collects and distributes system data to the cluster of edge devices, maintains connectivity and synchronization with cloud infrastructure and if necessary, can locally substitute for cloud functionality.
  • the user console/dashboard device, in one aspect, is a device capable of running software, which allows monitoring of all aspects of the system’s behavior (ultimately sensors) and influencing all aspects of the system’s behavior (ultimately actuators).
  • the cloud or cloud infrastructure refers to, for example, underlying computational and storage resources that can be applied to multiple local systems.
  • the system in an embodiment, can include a foot pedal or other input device(s) to turn the system on or off, or indicate that the practitioner is ready to begin or end a manual adjustment.
  • the edge device in one embodiment, is a device that measures and translates real-world information into digital information.
  • the edge device has an input/output device that includes one or more sensors, one or more analog to digital converters, and optionally one or more actuators, and one or more digital to analog converters.
  • the edge device further includes a communication module (e.g., one or more buses), and optionally one or more controllers (e.g., passive and/or active controllers).
  • the bus is a serial bus and forms a “sandwich” design between the sensor board and the edge computer.
  • the edge device includes an edge computer having a processor, wireless connectivity, memory and a power supply (e.g., battery).
  • Sensor types of the edge device include any sensor that can measure a condition to which it is subjected, as it relates to the manipulation or health of the subject/or practitioner.
  • sensor types include one or more of the following: gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor.
  • Central nervous system sensors include one or more of the following: heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, or combination thereof. Additional optional sensors include, microphones, video cameras, blood lactate monitors, blood glucose monitors, vascular flow monitors, brain wave monitors, sweat and saliva monitors, and pain sensors.
  • the sensors described herein can be made or purchased from commercially available sources.
  • the following sensors can be made or purchased separately or in combination with another sensor(s) for use with the present invention, as follows:
    • gyroscope (e.g., Bosch Sensortec BMI160 and BNO055 (Bosch Sensortec GmbH, Reutlingen/Kusterdingen, Germany); STMicroelectronics LSM9DS0 and LSM9DS1 (STMicroelectronics, Crolles Cedex, France); TDK InvenSense MPU-9150 (TDK InvenSense Inc., San Jose, California));
    • accelerometer (e.g., Bosch Sensortec BMI160 and BNO055; STMicroelectronics LSM9DS0 and LSM9DS1; TDK InvenSense MPU-9150);
    • force sensor (e.g., FlexiForce A201, A401 and A502 (Tekscan, Inc., Boston, Massachusetts); StretchSense, a force and stretch sensor system);
    • magnetometer (e.g., Bosch Sensortec BMI 150 and BNO055; TDK InvenSense MPU-9150);
    • heart rate monitor (e.g., Maxim Integrated MAX30100 and MAX30101 (Maxim Integrated, San Jose, California));
    • blood pressure monitor (e.g., Nokia BPM+ Blood Pressure Monitor (Nokia Technologies Ltd.));
    • thermometer;
    • oxygen saturation monitor (e.g., Maxim Integrated MAX30100 and MAX30101);
    • electrocardiogram;
    • respiratory monitor (e.g., B01MOODCL5, Beddit 3 Sleep Tracker (Apple Inc., Cupertino, California));
    • blood lactate monitor (e.g., ALP10101 Lactate Plus meter (Nova Biomedical, Waltham, Massachusetts));
    • blood glucose monitor (e.g., 71387 Freestyle Precision Neo blood glucose meter (Abbott Diabetes Care Inc.));
    • vascular flow monitor (e.g., a vascular flow monitor using a portable touch-less device (video) to monitor entire flow in patients, University of Waterloo, Waterloo, Ontario);
    • brain wave monitor (e.g., ICM-20948 brain wave headset monitor (EMOTIV, San Francisco, California));
    • sweat and saliva monitors (e.g., Kenzen Patch, a sweat analysis patch providing real-time feedback (Kenzen, San Francisco, California); a wearable salivary uric acid mouth guard biosensor with integrated wireless electronics (University of California, San Diego));
    • microphone (e.g., ICDPX440 Sony Digital Voice Recorder (Sony Corporation, Minato, Tokyo, Japan));
    • video camera (e.g., CHDHS-501-CA GoPro Hero5 Session (GoPro, San Mateo, California));
    • ambient environment sensors, e.g., temperature and humidity (Bosch Sensortec BME) and barometric pressure sensors (Bosch Sensortec BMP280 and BMP180, STMicroelectronics LPS22HB); and
    • pain sensors (e.g., a “chemical pain sensor”; see, for example, Jin, Hye Jun et al., Biosensors and Bioelectronics, vol. 49: 86-91 (15 November 2013)).
  • Other devices, such as power management and battery charging devices, can be included (e.g., IDT P9025AC wireless power receiver (IDT, San Jose, California) and Analog Devices LTC4120 wireless power receiver (Analog Devices, Norwood, Massachusetts)).
  • exemplary edge devices include the MbientLab MetaWear CPRO and MetaMotionR platforms, which are integrated systems containing a connectivity module (e.g., BLUETOOTH connectivity) and a variety of motion and ambient environment sensors; Espressif Systems ESP32-based modules, which are dual-core microcontrollers with a communication/connectivity protocol such as BLUETOOTH communication and Wi-Fi connectivity and which allow for integration of a variety of motion, ambient environment, CNS and force sensors and actuators; and the StretchSense platform (StretchSense, Penrose, Auckland, New Zealand), which is an integrated system containing a BLUETOOTH connectivity module and a dedicated microcontroller for processing raw data from custom force and stretch sensors.
  • EMF radiation protection materials (e.g., Defender Shield, Florida, USA) can surround the edge devices to shield subjects from exposure(s).
  • exemplary hub devices used with the present invention include the Raspberry Pi 3 and Raspberry Pi 3+, which are single-board computers with a connectivity module (e.g., BLUETOOTH connectivity), Wi-Fi, and wired Ethernet network connectivity. A two-switch foot control pedal can be connected to the hub device, e.g., an iKKEGOL USB double switch pedal (FS2016 B2, iKKEGOL, China). Examples of cloud infrastructure used in the present invention are Amazon Web Services and Google Cloud Platform.
  • 3D motion camera capture technology allows for the recording of sensor data from the subject and/or practitioner. See, for example, WO2019095055A1.
  • System 100 shows that information from subject edge device 102 and practitioner edge device 104 may or may not travel through hub 108 but is compiled and analyzed by a processor in cloud 110 to provide output 112 that characterizes the nature or quality of the contact/manipulation.
  • the subject and the practitioner information can be compiled to generate a three- dimensional output or graph to characterize or replicate the manipulation (e.g., motion, force, direction, position, time of the manipulation).
  • the information can also be compared to a standard to assess the nature of the manipulation.
  • the standard can be data that represents the desired character of the manipulation, data obtained from a patient population on which the manipulation worked or did not work, or data obtained from the specific patient at previous time points. Additionally, the data for the standard can be from industry-accepted standard(s) set by experts in the field.
  • state of the joint can be compared before, during and after the manipulation.
  • the state can be defined, for example, as the range of motion of the joint along its axis (axes) of rotation, the average force required to move the joint through the full range, the force required to move the joint through the full range along its axis (axes) of rotation, etc.
  • the practitioner can compare the state before and after a manipulation by evaluating these numbers. The practitioner can decide in real time that a manipulation was successful if, for instance, the range of motion increased and less force was required to move the joint along the full range of motion, as sketched below.
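  • A sketch of that real-time decision, with the joint state reduced to the two example quantities named above; the field names and the success rule (more range of motion, less force) are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class JointState:
    range_of_motion_deg: float   # full range along the evaluated axis of rotation
    avg_force_mv: float          # average force required to move the joint through that range

def manipulation_successful(before: JointState, after: JointState) -> bool:
    """Success if the range of motion increased and less force was needed to cover the full range."""
    return (after.range_of_motion_deg > before.range_of_motion_deg
            and after.avg_force_mv < before.avg_force_mv)

# Example: pre-assessment versus post-assessment of a cervical rotation (made-up numbers)
print(manipulation_successful(JointState(55.0, 420.0), JointState(68.0, 360.0)))  # True
```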
  • the output can include a compilation of data, or a comparison and assessment of the subject’s and practitioner’s information, as compared to the standard.
  • the types of output are further described herein.
  • the output can be provided immediately to the practitioner and/or patient, and the practitioner can adjust the next manipulation accordingly (e.g., use more or less force, change the angle of the torque, the course of the movement/rotation, and the like).
  • Fig. 1 shows the system 100; the flow of data goes from subject edge device 102 and practitioner edge device 104 and, if a hub is present (decision 106), flows to the hub 108.
  • Hub 108 receives and stores data from one or more manipulations/contacts and transmits it to cloud 110. If hub 108 is not present, then the data can be transmitted directly to cloud 110 wirelessly or through a wired connection via a computer.
  • the decision 106 as to whether a hub is present depends on the system architecture. For example, a manipulation may be done out of the office, where only the edge devices are present; the communication module in the devices then allows the devices to communicate directly with the cloud, as sketched below.
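  • The routing choice at decision 106 (through hub 108 when present, otherwise directly to cloud 110) might look like the following sketch; the send_to_hub and send_to_cloud callables are hypothetical placeholders, since the patent does not define a transport API.

```python
import json
import queue

outbound = queue.Queue()  # readings produced by the edge devices

def forward_readings(hub_available: bool, send_to_hub, send_to_cloud):
    """Drain queued readings and route them per the Fig. 1 decision (illustrative only)."""
    while not outbound.empty():
        payload = json.dumps(outbound.get())
        if hub_available:
            send_to_hub(payload)     # hub 108 stores the data and later syncs it to cloud 110
        else:
            send_to_cloud(payload)   # out-of-office case: edge devices talk to the cloud directly
```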
  • the cloud is in communication with or has a server, memory for storing data, a processor to compile and analyze the data, and a communication module to transmit the data to an output device.
  • Cloud 110 analyzes data obtained before, during, and after contact/manipulation.
  • the data of the present invention that flows from both the subject and the practitioner includes position data representative of position or movement of the body part, acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, ECG data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain indication data, and any combination thereof.
  • One or more of these data types can be used; using more data types results in a more robust output and characterization of the manipulation.
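  • One hedged way to picture how additional data types make the characterization more robust is a weighted combination of whatever normalized features happen to be available; the feature names, weights and scoring below are purely illustrative assumptions, not a method defined by the patent.

```python
def robust_characterization(features, weights):
    """Combine the available normalized features (0..1) into a single score (illustrative only)."""
    available = {k: v for k, v in features.items() if v is not None and k in weights}
    if not available:
        raise ValueError("no usable data types")
    total_weight = sum(weights[k] for k in available)
    return sum(weights[k] * v for k, v in available.items()) / total_weight

# More data types (non-None features) contribute to the score; missing ones are simply left out.
score = robust_characterization(
    features={"rom_gain": 0.8, "force_consistency": 0.6, "heart_rate_stability": None},
    weights={"rom_gain": 0.5, "force_consistency": 0.3, "heart_rate_stability": 0.2},
)
```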
  • diagnostic test results can be embedded into the manual therapy recording system, including X-rays, ultrasounds, MRIs, CAT scans, VNG eye recordings, balance recordings (e.g., computerized dynamic posturography), complete blood profiles, urine analysis, stool analysis and genetic profiles.
  • subjective questionnaire data and third-party application data could be embedded into the system.
  • an application which monitors daily sleep phase activity could be either manually inputted or linked automatically into this system.
  • Methodology 200 begins with receiving step 202 in which data (at least one measure of data) from each of the subject edge device and the practitioner edge device are obtained pre-manipulation, during manipulation and/or post-manipulation.
  • the data is communicated to a processor, where step 204 compiles the data measurements and/or compares the data measurements to a threshold value or standard to provide an indication of the quality or effectiveness of the manipulation/contact.
  • the indicia (e.g., a data representation, compilation or comparison) can be displayed in step 208 by an output device, e.g., a monitor, display, printer, projector, speaker, headphones and the like.
  • the display of the indicia can be tailored for the subject, the practitioner or other interested party, as desired.
  • the indicia can be transmitted in step 210 to the practitioner and/or the subject (e.g., via email, text, or by wire or wirelessly). In an embodiment, the indicia are transmitted back to the edge device to alert the subject and/or practitioner that the manipulation was successful or unsuccessful.
  • Various types of output of the edge devices are further described herein.
  • Step 206 allows the indicia to be stored, e.g., in memory or on a storage drive. Two or more stored indicators from 206 can be further compiled or compared in step 212 and displayed in step 214. Compiling two or more stored indicators allows for comparison of manipulations. For example, data can be compiled and/or compared from one or more time points of the same manipulation (pre-manipulation, during manipulation and post-manipulation).
  • Pre- or post-assessment data is obtained along with data during the contact/manipulation. If the data is obtained during the pre- or post-assessment, then the software processes the data to estimate/determine the state of the joint and the subject. If the data is from the manipulation, then the software estimates/determines data points to quantify the manipulation. If there is no event, then the software does nothing.
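  • The dispatch just described (pre/post data drives a joint-state estimate, manipulation data drives quantification, no event means no action) can be expressed as a small routing function; the event labels and handler callables are assumptions, not names from the patent.

```python
def process_event(event_type, samples, estimate_joint_state, quantify_manipulation):
    """Route incoming data according to the event type described in the text (illustrative sketch)."""
    if event_type in ("pre_assessment", "post_assessment"):
        return estimate_joint_state(samples)      # state of the joint and the subject
    if event_type == "manipulation":
        return quantify_manipulation(samples)     # data points that quantify the manipulation
    return None                                   # no event: the software does nothing
```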
  • the range of motion of a particular joint from before and after the manipulation can be compared.
  • the post manipulation data of the same manipulation performed on the same subject on more than one occasion can be compared and associated with the outcome.
  • the data obtained during the actual manipulation can be compared too.
  • in one embodiment, one practitioner edge device (e.g., a practitioner manipulation device having an edge device therein) and one subject edge device (e.g., a subject manipulation device having an edge device therein) are used.
  • in other embodiments, two or more edge devices on the practitioner and/or on the subject can be used to increase efficiency in acquiring data (e.g., force data).
  • for example, an edge device in the form of a hand strap is used to acquire motion data, and a second edge device in the form of a towel, which contains the force sensors, is used to acquire force data.
  • separating the motion sensors from the force sensors provides a greater variety of contacts and, in certain cases, more reliable data.
  • edge device 300 is the interface between the real world and the computer system and can be a single or dual directional device.
  • in Fig. 3, the right side of the vertical line is the digital environment, and the left side of the line is an analog/digital environment, or sensor board.
  • the dual directional edge device 300 of Fig. 3 includes an integrated input/output device (I/O device) 320, also referred to as a “sensor board.”
  • Input/output device or sensor board 320 includes one or more analog sensors 302 in communication with analog to digital converter 304, which translates analog signals to digital data.
  • the sensors include, for example, gyroscope, accelerometer, force sensor, a magnetometer, heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, a microphone, a video camera, blood lactate monitor, blood glucose monitor, vascular flow monitor, brain wave monitor, sweat monitor, saliva monitor, pain sensor, and any combination thereof.
  • Each of these sensors can communicate to analog to digital converter 304, which in turn translates the analog signal to digital data and communicates via bus 318 or wirelessly to hub 316 or directly to the cloud.
  • integrated input/output device 320 also has actuator 308 that provides an output signal to digital to analog converter 306 which provides an analog signal to sensor 302.
  • the integrated edge device includes one or more sensors as described herein; consequently, the sensors provide the types of data described herein. Data obtained pre-contact/manipulation, during contact/manipulation, and post-contact/manipulation is converted from analog data to digital data by converter 304, and the digital data then moves to hub 316 or the cloud through communication modules such as bus 318.
  • One or more buses can be used to transmit data back and forth.
  • Bus 318 communicates with edge computer 312 which transmits the data to the hub 316 or directly to the cloud.
  • the bus can be a serial bus sandwiched between the sensor board and the edge computer.
  • the edge computer, which is generally miniaturized, has a memory device, a processor communicatively coupled to the memory device, wireless connectivity, and a power supply (battery, charging port, and the like).
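  • As a sketch of the Fig. 3 data path (analog sensor 302 to A/D converter 304 to bus 318/edge computer 312 to hub 316 or the cloud), the loop below is hypothetical; read_adc and transmit stand in for hardware and radio drivers the patent does not specify.

```python
import time

def edge_device_loop(read_adc, transmit, sample_rate_hz=100, duration_s=5):
    """Sample the sensor board via the ADC and push digitized readings toward the hub or cloud."""
    interval = 1.0 / sample_rate_hz
    for _ in range(int(sample_rate_hz * duration_s)):
        raw = read_adc()                                           # analog sensor 302 -> converter 304
        transmit({"t_ms": int(time.time() * 1000), "value": raw})  # bus 318 -> edge computer 312 -> hub/cloud
        time.sleep(interval)
```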
  • the edge device can further include an active controller, which is a device that integrates one or more input/output devices with software. A passive controller also integrates input/output device 320 but does so without software, and can optionally be included in the edge device.
  • An active controller has computational functionality, an input/output access device and a connectivity device. The wireless connectivity of the edge computer transmits data to computer systems (e.g., hub or cloud).
  • Edge device 300 can also optionally include one or more passive controllers in communication with one or more sensors.
  • Edge devices can further include a substrate or housing into which the components described herein can be housed, and/or a port for a wired connection.
  • the edge device can be embedded into a manipulation device used by the practitioner or subject or can be in the form of a wearable object or usable object when performing a manipulation.
  • the edge devices can be made in miniature form sufficient for embedding into the practitioner, subject, animal or model.
  • the practitioner could have all the motion sensors and CNS sensors embedded into the backs of their hands and the subject could have all the motion sensors and CNS sensors embedded into their heads.
  • the data could be transmitted from the embedded sensors to the hub or cloud.
  • a manipulation device having an edge device attached thereto or embedded therein can be in the form of, for example, a band, a strap, a ring, a towel, a glove, a headband, a bracelet, a body band, a sleeve, a patch and any combination thereof.
  • the edge device can be in the form of any wearable or usable object that allows one to assess measurements during a manipulation or contact. Exemplary devices are shown in Fig. 4A and include towel variations, bracelets, bands and straps.
  • the manipulation device having an edge device can be of any size and shape to allow one to wear and/or use the device during a manipulation.
  • Fig. 4B shows an example of a hand strap, hand strap 402, having an edge device worn by practitioner 424 with a hook and loop fastener.
  • Fig. 4C shows headband 408 having an edge device for use by the subject.
  • Headband 408 has opening 410 because the central nervous system sensor(s) often require close proximity and direct contact with the subject. Headband 408 additionally has electromagnetic field (EMF) blocking fabric to protect the subject.
  • Fig. 4D shows a detailed view of hand strap 402 shown in Fig. 4B.
  • hand strap 402 has an opening 414 to receive the edge computer under cover 422 and is connected by cords 416 to force sensor 412. This design of the manipulation device allows the force sensor position to be adjustable (e.g., movable along the length of the strap).
  • Force sensor 412 is held down by the strap 418 when in use.
  • the hand strap 402 is held into place on the practitioner’s hand 424 with hook 420 that is received by a pocket on the opposite end of the strap.
  • Fig. 4E shows the design of the edge computer under cover 422.
  • Fig. 4E shows sensor board 300 along with retractable device 428 in communication with force sensor 412 via cords 416.
  • the manipulation devices can include various closures to keep the device on the body part during the manipulation.
  • closures include a fastener, a strap, a snap, a buckle, a button, a hook, an elastic member, a tie, a clip, a zipper, a drawstring & cord lock, a hook-to-hook arrangement, a hook & loop arrangement, hook & pocket arrangement, or a combination thereof.
  • the manipulation device can also include an adhesive for adhering to the subject’s or practitioner’s skin or clothing.
  • the fabric for the manipulation devices can be any fabric, now known or later developed, and is a fabric that is strong enough to withstand the manipulations and preferably easy to clean/hygienic. Such fasteners and fabrics can be purchased commercially.
  • To limit exposure to EMF (extremely low frequencies) and RF (radio frequency radiation), each edge device can be surrounded by EMF blocking materials.
  • the edge device’s purpose impacts the type of sensors being used and the shape/form of the edge device. For example, an adjustment of the cervical spine by manipulation of the head involves movement (e.g., rotation or side flexion of the head).
  • the strap used by the practitioner could include, for example, a gyroscope, accelerometer, force sensor, magnetometer and at least one central nervous system sensor such as a heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, microphone and video camera, blood lactate monitor, blood glucose monitor, vascular flow monitor, brain wave monitor, sweat and saliva monitor, pain sensor, or a combination thereof.
  • the headband used by the subject could include a gyroscope, accelerometer, force sensor, magnetometer and at least one central nervous system sensor such as heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, microphone, video camera, blood lactate monitor, blood glucose monitor, vascular flow monitor, brain wave monitor, sweat and saliva monitor, pain sensor, or a combination thereof.
  • the type of sensors used in an edge device can be the same or could differ and, in some cases, certain sensor data may be more relevant than others, depending on the anatomy, technique, manipulation and the stage of the manipulation. Based on anatomy or technique delivery, as an example, practitioners could choose the desired manipulation device, for example, rings, straps for the cervical spine or towels for the hip or sacroiliac joints.
  • headband 408 is worn by the subject and hand strap 402 is worn by the practitioner.
  • the practitioner can adjust the location of the force sensor 412 to be at the base of the subject’s neck to obtain a better reading of the force used in the manipulation.
  • FIG. 6 shows the present invention being used by a practitioner on a subject.
  • the manipulation is being performed as the subject’s head is being rotated from a first position to a second position.
  • the head is being manipulated in order to adjust the cervical spine.
  • One body part is being moved, in this case, to adjust another body part.
  • the manipulation and adjustment are done on the same body part.
  • the manipulation can be performed on the body part being adjusted.
  • the body parts of the subject that wear the manipulation device having an edge device include a head, hand, arm, foot, knee, leg, hip, torso, back, or combination thereof.
  • the practitioner can wear/use the edge device on any one of the following body parts: hand, foot, knee, leg, hip, torso, back and/or head.
  • Table 1 provides examples. Several combinations exist and depend on the body part being adjusted, the practitioner, the subject and the specific issue with the body part. Table 1: Examples of Body Part Manipulation/Communication
  • Using the sensor in the methodology of the present invention involves the following steps.
  • the practitioner having one or more edge devices and the subject having one or more edge devices provide pre-manipulation data measurements.
  • the practitioner performs the manipulation during which real time data for the practitioner and the subject are obtained and transmitted. Additionally, the post-manipulation data can also be obtained and transmitted.
  • the practitioner can move the body part to assess range of motion, flexibility and strength and data from the edge devices can measure this.
  • Exemplary data includes position data representative of position or movement of the body part, acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain data, and any combination thereof.
  • the indicia of the contact/manipulation can be characterized in various ways.
  • the character of the manipulation can be assessed, and examples of such characterizations are shown in Figs. 7A and 7B.
  • the indicia can include positive/negative results or an otherwise binary output indicating if the manipulation was attempted/successful.
  • Indicia of the manipulation can also include graphical representations of the data as shown in Fig. 7A, e.g., acceleration and angular velocity.
  • the data in Fig. 7A also shows acceleration and gyroscope (position) data.
  • Table 2 shows the type of sensor, data type, units of measurement and provides an example of how the data can be displayed.
  • the indicia can show compiled data, compared data or both.
  • Indicia also include compilations of data or comparisons of the unit of measurement obtained to a standard, described herein, to provide information to the practitioner if a manipulation met certain criteria.
  • a three-dimensional representation showing a replication or simulation of an aspect of the manipulation can be shown. See Fig. 7B.
  • range of motion of the body part can be assessed before and after the manipulation to determine if there is an increase in range of motion.
  • Fig. 7B shows the three-dimensional representation movement of the head for adjustment to the cervical spine. This is accomplished by transmittal of the data from the practitioner and subject at one or more time points and comparing the data to one another or to a standard. The data is transmitted to the processor for compilation and analysis to determine nature of the contact.
  • the indicia of the contact/manipulation are provided to an output device.
  • the edge device is also, in an embodiment, an output device and provides information to the practitioner and the subject.
  • the edge devices have an actuator that converts digital data to analog information.
  • the edge device can communicate a stimulus output signal needed for the sensor to work.
  • the heart rate sensor works by emitting LED light into a blood vessel and measuring differences in the reflected light, which can be used to calculate the heart rate.
  • the edge device has an actuator, in an aspect, that drives the LED at specific parameters to assess the heart rate; the reflected light is then read back as an input.
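  • As an illustrative sketch only (not the disclosed firmware), the code below shows how reflected-light (PPG) samples from such a heart rate sensor could be turned into a beats-per-minute estimate by detecting beat peaks and averaging the inter-peak intervals. The sampling rate, threshold, refractory period and synthetic signal are all assumptions for illustration.

```python
import numpy as np

def estimate_heart_rate(ppg, fs_hz):
    """Estimate heart rate (BPM) from a reflected-light (PPG) signal.

    ppg   : 1-D array of reflected-light intensity samples
    fs_hz : sampling rate in Hz
    """
    # Remove the slowly varying baseline so individual beats stand out.
    baseline = np.convolve(ppg, np.ones(int(fs_hz)) / fs_hz, mode="same")
    detrended = ppg - baseline
    threshold = 0.5 * np.max(detrended)
    min_gap = int(0.3 * fs_hz)            # refractory period between beats

    peaks = []
    for i in range(1, len(detrended) - 1):
        is_local_max = detrended[i] >= detrended[i - 1] and detrended[i] > detrended[i + 1]
        far_enough = not peaks or i - peaks[-1] > min_gap
        if detrended[i] > threshold and is_local_max and far_enough:
            peaks.append(i)
    if len(peaks) < 2:
        return None                       # not enough beats to estimate a rate

    intervals_s = np.diff(peaks) / fs_hz  # seconds between successive beats
    return 60.0 / float(np.mean(intervals_s))

# Example with a synthetic 75 BPM (1.25 Hz) signal sampled at 50 Hz.
fs = 50
t = np.arange(0, 10, 1 / fs)
synthetic_ppg = np.sin(2 * np.pi * 1.25 * t) + 0.05 * np.random.randn(t.size)
print(estimate_heart_rate(synthetic_ppg, fs))   # approximately 75
```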
  • Another output type of the edge device is a system monitoring output. For example, such an output can inform the practitioner or subject of the state of the system.
  • the system can analyze the manipulation and correct behavior. For example, if too much force was used in the manipulation or the improper motion was used for the manipulation, the edge device can alert the user by sending a signal such as a red light or emit a “shake” of the device. If a manipulation worked, as compared to a standard, then the device provides a positive indicator. As the system records data for the practitioner and/or subject, the system learns which manipulations work and how the manipulation should be carried out to achieve a successful manipulation. The system can guide the practitioners with feedback based on comparing current data with data obtained from previous time points.
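  • A minimal sketch of this kind of corrective feedback is shown below, assuming a hypothetical stored standard expressed as acceptable ranges for peak force and peak angular velocity; the metric names and numeric ranges are illustrative placeholders, not clinical values from the disclosure.

```python
# Hypothetical feedback check: compare measured manipulation metrics against a
# stored standard and decide what signal the edge device should emit.
STANDARD = {  # illustrative ranges only, not clinical values
    "peak_force_n": (20.0, 120.0),                 # acceptable peak force, Newtons
    "peak_angular_velocity_dps": (50.0, 400.0),    # degrees per second
}

def manipulation_feedback(metrics, standard=STANDARD):
    """Return ('ok', []) or ('alert', [reasons]) for a set of measured metrics."""
    problems = []
    for name, (low, high) in standard.items():
        value = metrics.get(name)
        if value is None:
            continue                               # metric not measured this time
        if value > high:
            problems.append(f"{name} too high ({value:.1f} > {high:.1f})")
        elif value < low:
            problems.append(f"{name} too low ({value:.1f} < {low:.1f})")
    return ("alert", problems) if problems else ("ok", [])

# Too much force would trigger the alert path (e.g., red light or a "shake").
print(manipulation_feedback({"peak_force_n": 150.0, "peak_angular_velocity_dps": 200.0}))
```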
  • the hub of the present invention is a data processing apparatus or computing device and has several functions.
  • the hub serves as a communication interface between the edge devices and the cloud. Additionally, the hub can be used as a user interface in which the practitioner can turn the system on and be sure that it is ready to be used.
  • the hub acts as a remote controller for the invention and can be a computer, credit card sized computer, smart phone and/or tablet. With a touch screen interface, the hub is able to conduct the operations such as power and communication.
  • the practitioner can also provide information to the system (e.g., that the manipulation is about to begin, the type of manipulation, the phase (pre, during, post) of the manipulation, etc.).
  • the practitioner has the option and ability to provide information to the system in a hands-free environment, through a foot pedal and/or voice commands as an example.
  • the hub can also act as an output device to provide indications of system readiness (e.g., that the system is ready to receive data and/or that the practitioner may perform the manipulation) and to provide the outcome.
  • the hub acts as a data receiver and can receive and store data and upload data to the cloud.
  • the cloud as used in the present invention is generally where the instructions are executed according to software and algorithms described herein. Although the figures show a cloud-based system, the functions of the cloud can be performed on a local computer.
  • the software applications collect, compute and organize all the data metrics, which are uploaded to a hard drive device located in a central hub and a cloud-based system. In turn, this data can be displayed in multiple platforms, including watches, phones, tablets, smart televisions and/or desktop computers.
  • edge devices are used within these three aspects to gather data.
  • the data from the one or more edge devices are processed and displayed for either the practitioner and/or subject to see in a real time intuitive manner.
  • the algorithms for processing and interpreting the sensor data are explained for each phase.
  • the algorithm performs the following with respect to the state of the joint, subject and practitioner (1) a pre-assessment evaluation before the manipulation, (2) evaluation during the manipulation, and (3) a post-assessment evaluation after the manipulation.
  • the state of a joint can be represented using several data points including: the degrees of rotation around its (possibly multiple) axes (active and passive motion); the force required to move the joint through its range of motion without the subject attempting to resist the motion (passive range of motion); the amount of force production by a stationary joint in neutral/resting position while the subject is resisting the motion (resisted isometric contraction); or any combination thereof.
  • the state data points are assessed depending on the objective of the practitioner and the joint being manipulated.
  • the main purpose of assessing the state of the joint is to be able to compare the state before and after a manipulation. After a successful manipulation, the joint may, for instance, have a greater range of motion and require less force to move through that range.
  • the data points assessed can vary depending on the objective of the practitioner, and the type of joint that is being manipulated.
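  • As an illustrative sketch under stated assumptions, the joint-state data points described above could be held in a small record type so that pre- and post-manipulation states can be compared; the field names and units below are assumptions, not the patent’s data model.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class JointState:
    """Illustrative container for the joint-state data points described above."""
    # Degrees of rotation about each assessed axis (active and passive motion).
    range_of_motion_deg: Dict[str, float] = field(default_factory=dict)
    # Force (N) needed to move the joint through its passive range of motion.
    passive_motion_force_n: Optional[float] = None
    # Force (N) produced during resisted isometric contraction in neutral position.
    isometric_force_n: Optional[float] = None

def range_of_motion_change(pre: JointState, post: JointState, axis: str) -> float:
    """Change in range of motion (degrees) about one axis, post minus pre."""
    return post.range_of_motion_deg.get(axis, 0.0) - pre.range_of_motion_deg.get(axis, 0.0)

pre = JointState(range_of_motion_deg={"right_side_flexion": 32.0}, passive_motion_force_n=14.0)
post = JointState(range_of_motion_deg={"right_side_flexion": 41.0}, passive_motion_force_n=11.0)
print(range_of_motion_change(pre, post, "right_side_flexion"))   # +9.0 degrees
```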
  • the state of a subject is determined by the physiological sensors such as heart rate (HR), sweat response, body temperature and others including pain sensors. Again, the main idea is to compare the state of the subject before and after a manipulation. For many sensors, the state of the subject can be continuously monitored during all three phases (the pre-assessment, manipulation and post-assessment). During a “bad” manipulation, the sweat response, heart rate, body temperature and/or pain of a subject can change adversely.
  • the system detects when an assessment or manipulation is about to occur.
  • Several methods can be used to accomplish this task, and the following are two possible methods for detecting the start of a manipulation sequence.
  • One approach uses a foot pedal to signal the start of a pre-assessment.
  • the foot pedal is clicked before the pre-assessment starts. This click is recorded and time stamped. Then the practitioner starts the pre-assessment. Once the pre-assessment is complete, the practitioner clicks on the foot pedal again. Similarly, the practitioner clicks before and after the manipulation and post assessment.
  • the data is cut into small blocks, each containing either a pre assessment, manipulation or post-assessment.
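  • A hedged sketch of this segmentation step: consecutive foot-pedal time stamps are paired off to cut the continuous, time-stamped sensor stream into pre-assessment, manipulation and post-assessment blocks. The sample structure and click pairing below are assumptions for illustration.

```python
def segment_by_pedal_clicks(samples, pedal_times):
    """Split time-stamped sensor samples into labelled blocks.

    samples     : list of (timestamp_s, value) tuples, in time order
    pedal_times : pedal-click timestamps; successive pairs delimit the
                  pre-assessment, manipulation and post-assessment
    """
    labels = ["pre_assessment", "manipulation", "post_assessment"]
    blocks = {}
    for label, start, end in zip(labels, pedal_times[0::2], pedal_times[1::2]):
        blocks[label] = [(t, v) for t, v in samples if start <= t <= end]
    return blocks

# Example: six clicks delimit the three phases of a 30 s recording.
samples = [(t / 10.0, t) for t in range(300)]        # 0.0 s .. 29.9 s
clicks = [1.0, 8.0, 10.0, 12.0, 14.0, 24.0]
blocks = segment_by_pedal_clicks(samples, clicks)
print({label: len(block) for label, block in blocks.items()})
```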
  • Another approach to detect the start of the pre-assessment phase is to use a comparison of the measurements of certain sensors in the edge devices.
  • certain sensors are expected to have particular values during pre-assessment when compared to values of other sensors for the same time point.
  • the pre-assessment phase can be detected by using two Inertial Measurement Unit (IMU) sensors and a force sensor.
  • the IMU consists of tri-axis accelerometers, gyroscopes and magnetometers. One IMU is attached to the practitioner’s hand.
  • the other IMU is attached to the patient’s joint that is being assessed.
  • the IMU can be attached to the head of the patient using a headband.
  • the force sensor is worn/used by the practitioner.
  • during the pre-assessment, both IMUs are moving around the same axis of rotation.
  • during the pre-assessment, the force exerted on the head is non-zero.
  • outside of the pre-assessment, the IMUs will be moving independently.
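  • The comparison just described could be implemented, as a sketch only, by checking that the two IMUs’ mean angular-velocity vectors point along (roughly) the same axis while the force reading is above a small threshold; the window length, thresholds and axis-angle tolerance below are illustrative assumptions.

```python
import numpy as np

def looks_like_assessment(gyro_practitioner, gyro_subject, force_n,
                          min_rate_dps=10.0, min_force_n=1.0, max_axis_angle_deg=20.0):
    """Heuristic: are both IMUs rotating about (roughly) the same axis while a
    non-zero force is being applied?

    gyro_practitioner, gyro_subject : (N, 3) angular-velocity windows, deg/s
    force_n                         : (N,) force-sensor window
    """
    w1 = np.mean(gyro_practitioner, axis=0)
    w2 = np.mean(gyro_subject, axis=0)
    if np.linalg.norm(w1) < min_rate_dps or np.linalg.norm(w2) < min_rate_dps:
        return False                      # at least one IMU is essentially still
    if np.mean(force_n) < min_force_n:
        return False                      # practitioner is not touching the subject
    cos_angle = np.dot(w1, w2) / (np.linalg.norm(w1) * np.linalg.norm(w2))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    # Same axis of rotation, allowing for opposite sign conventions.
    return angle_deg < max_axis_angle_deg or angle_deg > 180.0 - max_axis_angle_deg

# Example window: both IMUs rotate about z while ~5 N is applied.
n = 50
gyro_hand = np.tile([0.0, 0.0, 30.0], (n, 1)) + np.random.randn(n, 3)
gyro_head = np.tile([0.0, 0.0, 28.0], (n, 1)) + np.random.randn(n, 3)
print(looks_like_assessment(gyro_hand, gyro_head, np.full(n, 5.0)))   # True
```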
  • the state of the joint is determined from the data.
  • an IMU having accelerometer, gyroscope, and magnetometer, and if available, force sensor(s) can be used. By using force sensor(s), additional metrics are incorporated to assess the state of the joint.
  • the practitioner moves the joint along its axes of rotation while wearing or using a manipulation device having an edge device with the IMU.
  • the orientation of the sensor is determined. There exist many algorithms to determine the orientation of a sensor using the IMU data such as Kalman filtering (See L.
  • the range of motion of a joint can be determined using IMU data and an orientation algorithm.
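  • The disclosure points to orientation filters such as Kalman filtering; as a lighter-weight, hedged illustration, the sketch below fuses gyroscope and accelerometer data with a complementary filter to track a single rotation angle and reads range of motion off the resulting angle track. The sampling rate, filter gain and simulated motion are assumptions.

```python
import numpy as np

def complementary_angle(gyro_rate_dps, accel_angle_deg, dt_s, alpha=0.98):
    """Fuse gyroscope rate and accelerometer-derived angle into one angle track.

    gyro_rate_dps   : angular velocity about the axis of interest (deg/s)
    accel_angle_deg : angle about the same axis inferred from gravity (deg)
    dt_s            : sample period (s); alpha weights the integrated gyro term
    """
    angle = float(accel_angle_deg[0])
    track = []
    for rate, acc_angle in zip(gyro_rate_dps, accel_angle_deg):
        # Gyro integration is smooth but drifts; the accelerometer anchors it.
        angle = alpha * (angle + rate * dt_s) + (1.0 - alpha) * acc_angle
        track.append(angle)
    return np.array(track)

def range_of_motion(angle_track_deg):
    """Range of motion = spread between the extreme angles reached."""
    return float(np.max(angle_track_deg) - np.min(angle_track_deg))

# Example: a simulated 0 -> 40 -> 0 degree side flexion sampled at 100 Hz.
dt = 0.01
true_angle = np.concatenate([np.linspace(0, 40, 200), np.linspace(40, 0, 200)])
gyro = np.gradient(true_angle, dt) + 0.5 * np.random.randn(true_angle.size)
accel_angle = true_angle + 2.0 * np.random.randn(true_angle.size)
track = complementary_angle(gyro, accel_angle, dt)
print(round(range_of_motion(track), 1))   # close to 40 degrees
```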
  • the algorithm used to determine the state of the joint includes determining the start of the rotation, the end point of the rotation, and the peak and average force applied between those points, as described in the following.
  • the start of the rotation can be determined in any number of ways.
  • One way to determine the start of the rotation is as follows. Before the rotation, the practitioner pauses for 5 seconds. The start of the rotation can then be determined by looking in the IMU sensor data for a 5 second pause, followed by a steady rotation along a constant axis. In addition, the force sensor will read an increased value during the rotation (as opposed to when the practitioner is not touching the subject, the force sensor value will be near zero).
  • the end point of the rotation can be determined in several ways.
  • One way to calculate the end point of the rotation is as follows. During the pre and post assessments the joint is rotated from a starting point, to an end point, and then back to the starting point. Thus, the end point of the rotation is the extreme point. This is the point that has maximal deviation from the initial starting point. The following describes one way to determine the extreme point/end point:
  • the end point of the rotation is the maximal deviation from the start point, and is denoted with the curve in Fig. 9.
  • the peak force and average force between the start and end point are calculated.
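  • A hedged sketch tying the pieces above together: the start of the rotation is found as the end of a roughly 5 second still period in the gyroscope data, the end point is the sample of maximal deviation from the starting angle, and peak and average force are computed between those two points. Sampling rate, stillness threshold and the simulated traces are assumptions for illustration.

```python
import numpy as np

def find_assessment_window(angle_deg, rate_dps, fs_hz, pause_s=5.0, still_dps=2.0):
    """Sketch of the start/end detection described above.

    Start: the first sample after a ~5 s still period (gyro rate near zero).
    End  : the sample of maximal deviation from the starting angle.
    """
    pause_len = int(pause_s * fs_hz)
    still = np.abs(rate_dps) < still_dps
    start = None
    for i in range(pause_len, len(rate_dps)):
        if still[i - pause_len:i].all() and not still[i]:
            start = i                          # rotation begins right after the pause
            break
    if start is None:
        return None
    deviation = np.abs(angle_deg[start:] - angle_deg[start])
    end = start + int(np.argmax(deviation))    # extreme point = maximal deviation
    return start, end

def force_summary(force_n, start, end):
    """Peak and average force between the detected start and end points."""
    window = force_n[start:end + 1]
    return float(np.max(window)), float(np.mean(window))

# Example: a 5 s pause, then rotate out to 35 degrees and back, sampled at 50 Hz.
fs = 50
angle = np.concatenate([np.zeros(5 * fs),
                        np.linspace(0, 35, 3 * fs),
                        np.linspace(35, 0, 3 * fs)])
rate = np.gradient(angle, 1.0 / fs)
force = np.where(angle > 0, 10.0 + angle / 5.0, 0.0)
start_end = find_assessment_window(angle, rate, fs)
print(start_end, force_summary(force, *start_end))
```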
  • the state is then displayed on a screen.
  • the state of the subject during the pre-assessment, manipulation and post-assessment is determined using the CNS sensors that are activated. For example, the data from heart rate and body temperature sensors worn or used by the subject are gathered. The mean and peak beats per minute and body temperature during the pre and post assessment are calculated.
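  • A small sketch of that summary step, assuming the heart-rate and temperature streams have already been cut into pre- and post-assessment blocks; the example values are synthetic.

```python
import numpy as np

def cns_summary(heart_rate_bpm, temperature_c):
    """Mean and peak heart rate and body temperature for one assessment block."""
    return {
        "hr_mean_bpm": float(np.mean(heart_rate_bpm)),
        "hr_peak_bpm": float(np.max(heart_rate_bpm)),
        "temp_mean_c": float(np.mean(temperature_c)),
        "temp_peak_c": float(np.max(temperature_c)),
    }

# Example pre-assessment block (synthetic values).
print(cns_summary([64, 66, 65, 70, 68], [36.5, 36.6, 36.6, 36.7, 36.6]))
```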
  • the manipulation is quantified, in part, by the peak force and radial velocity as follows:
  • a spike detection algorithm is used to look for large spike(s) in force and radial velocity.
  • the peak force and acceleration are displayed on a screen.
  • the average force and average radial velocities can be calculated to give appropriate data sets for an extended treatment modality such as this.
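  • One hedged way to implement this quantification: take the largest force sample in the manipulation block as the moment of the thrust, read the radial velocity at that instant, and also report block averages. The synthetic traces and spike location below are illustrative assumptions.

```python
import numpy as np

def detect_manipulation_spike(force_n, radial_velocity_dps):
    """Return (index, peak force, radial velocity at that instant).

    The thrust shows up as a sharp, simultaneous spike in both the force and
    radial-velocity traces, so the largest force sample is taken as the moment
    of the manipulation.
    """
    i = int(np.argmax(force_n))
    return i, float(force_n[i]), float(radial_velocity_dps[i])

def block_averages(force_n, radial_velocity_dps):
    """Average force and average radial velocity over the same block."""
    return float(np.mean(force_n)), float(np.mean(radial_velocity_dps))

# Example: low-level contact with one sharp thrust at sample 120.
force = 8.0 + np.random.randn(300)
velocity = 15.0 + np.random.randn(300)
force[120], velocity[120] = 95.0, 420.0          # the thrust
print(detect_manipulation_spike(force, velocity))
print(block_averages(force, velocity))
```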
  • Quantifiable data measured by the methods and systems of the present invention during a manipulation include any combination of the following:
  • Force: The amount of force, and rate of force development, applied by a practitioner on a body part of a subject to elicit an action on the body part or a body part in communication therewith.
  • Acceleration: The acceleration of a practitioner’s hand(s) and/or related body part(s)/tool(s) as applied to a body part of a subject to elicit an action on the body part or a body part in communication therewith.
  • the acceleration also refers to that of one or more body parts of the subject as the manipulation/contact is being applied by the practitioner.
  • Velocity: The velocity of a practitioner’s hand(s) and/or related body part(s)/tool(s) as applied to a body part of a subject to elicit an action on the body part or a body part in communication therewith.
  • the velocity also refers to that of one or more body parts of the subject as the manipulation/contact is being applied by the practitioner.
  • Positioning: The position of a practitioner’s hand(s) and/or related body part(s)/tool(s) as applied to a body part of a subject to elicit an action on the body part or a body part in communication therewith.
  • the position also refers to that of one or more body parts of the subject as the manipulation/contact is being applied by the practitioner.
  • Timing: The recorded time (the time point) of measurements before, during and after the manipulation/contact.
  • Video Recordings: Video recordings created before, during and after manipulation/contact on a subject by a practitioner.
  • Central Nervous System: The central nervous system responses before, during, and after manipulation/contact on a subject by a practitioner, which include: heart rate, heart rate variability, vascular flow, temperature, respiration, blood pressure, blood glucose, brainwaves, oxygen saturation, blood lactate, saliva and sweat profiles, and pain data.
  • computer system 800 has processor 802, memory 804, communications interface/hub 806, network/cloud 812, output device 808 and input device 810.
  • the edge device is an input and output device that communicates directly with the cloud or indirectly via a hub.
  • a processor, which can be part of the hub and/or the cloud, receives the subject and practitioner data and compiles and/or analyzes them.
  • the hub or communication module can be configured to implement a communication protocol based on Bluetooth® technology, Wi-Fi, Wi-Max, IEEE 802.11 technology, a radio frequency (RF) communication, an infrared data association (IrDA) compatible protocol, or a shared wireless access protocol (SWAP).
  • FIG. 8 shows the general architecture of an illustrative computer system 800 that can be employed to implement any of the computer systems discussed herein.
  • the computer system 800 of Fig. 8 includes one or more processors 802 communicatively coupled to memory 804, one or more communications interfaces/hub 806, and one or more output devices 808 (e.g., one or more display units) and one or more input devices 810.
  • the memory 804 includes any computer-readable storage media and can store computer instructions such as processor-executable instructions for implementing the various functionalities described herein for respective systems, as well as any data relating thereto, generated thereby, or received via the communications interface(s) or input device(s).
  • the processor(s) 802 shown in Fig. 8 can be used to execute instructions stored in the memory 804 and, in so doing, also may read from or write to the memory various information processed and/or generated pursuant to execution of the instructions.
  • the processor 802 of the computer system 800 shown in Fig. 8 also may be communicatively coupled to, or control, the communications interface/hub 806 to transmit or receive various information pursuant to execution of instructions.
  • the communications interface/hub 806 and/or input device 810 can be coupled to a wired or wireless network, bus, or other communication means and may therefore allow the computer system 800 to transmit information to and/or receive information from other devices (e.g., other computer systems).
  • one or more communications interfaces facilitate information flow between the components of the system 800.
  • the communications interface/hub can be configured (e.g., via various hardware components or software components) to provide a website as an access portal to at least some aspects of the computer system 800.
  • the output devices 808 of the computer system 800 shown in Fig. 8 may be provided, for example, to allow various information to be viewed or otherwise perceived in connection with execution of the instructions.
  • the input device(s) 810 can be provided, for example, to allow a user to make manual adjustments, make selections, enter data or various other information, or interact in any of a variety of manners with the processor during execution of the instructions.
  • Joints of biological beings are the points where two or more bones convene.
  • an adult frame is composed of 206 bones.
  • Bones are classified as hard and/or calcified tissues in the human body.
  • the skeletal system is split into two categories: the axial skeleton and the appendicular skeleton.
  • the axial skeleton consists of the bones from the head and trunk.
  • the appendicular skeleton consists of the bones that support the appendages.
  • Osteokinematics is the manner in which bones move in space without regard for joint surfaces.
  • Arthrokinematics refers to the movement of joint surfaces.
  • There are three types of joints which include Fibrous, Cartilaginous and Synovial joints. In mammals the most common and moveable joint is the synovial joint.
  • Joints are surrounded by various tissue structures, which include tendons, ligaments, fascia, skin, fibrous tissues, adipose tissues, synovial membranes, muscles, nerves and vascular vessels.
  • the bones of the skeleton, muscles, cartilage, tendons, ligaments, joints and other connective tissues make up the musculoskeletal system, also known as the locomotor system.
  • the central nervous system is comprised of the brain and spinal cord.
  • the peripheral nervous system is comprised of all the nerves outside the brain and spinal cord that carry messages to and from the CNS.
  • the entire CNS is enclosed in bone.
  • the brain is protected by the skull and the spinal cord is protected by the vertebra of the spinal column.
  • the CNS is responsible for integrating sensory information from ascending afferent fibers (PNS), and responding appropriately with a command to be sent back by descending efferent fibers (PNS). All afferent input coming from the locomotor system is known as proprioception; i.e., the sense of position.
  • Central nervous system lesions, general pathologies and/or musculoskeletal misalignments can adversely impact nerves, vascular structures, visceral organs, glands, connective tissues, non connective tissues, joint surfaces, tendons, ligaments, fascia and the like.
  • Joint manipulation is characteristically associated with the production of an audible click, popping or cracking sound.
  • the phenomenon of “joint cracking” is the result of cavity inception within synovial fluid, which is consistent with tribonucleation. Tribonucleation is a process where opposing surfaces resist separation until a critical point, when they separate rapidly, resulting in vapor cavities that do not collapse instantaneously. The resulting drop in synovial pressure allows dissolved gas to come out of solution and create a “clear space” (bubble, cavity, void) within the joint. The cavity formed at the time of rapid joint separation persists past the point of sound production.
  • Osteo-articular joint pumping is an example of a precise non-thrust technique. Joint pumping utilizes micro movements to attempt to improve a joint’s range of motion and change the quality of the surrounding connective tissue. - (http://www.evolutionsinstitute.com)
  • Manipulative Physical Therapists defines manipulation and mobilization as follows.
  • Manipulation: a passive, high velocity, low amplitude thrust applied to a joint complex within its anatomical limit with the intent to restore optimal motion, function and/or to reduce pain.
  • Mobilization: a manual therapy technique comprising a continuum of skilled passive movements to the joint complex that are applied at varying speeds and amplitudes, that may include a small-amplitude/high velocity therapeutic movement (manipulation) with the intent to restore optimal motion, function and/or to reduce pain.
  • http://www.physio-pedia.com/Manual_Therapy
  • the method includes: providing a power supply operable to power the edge device; providing at least one sensor device operable to obtain at least one measurement of the user; providing an analog to digital converter in communication with the sensor, providing an actuator in communication with a digital to analog converter, providing a wireless communication component operable to transmit data indicative of the at least one measurement obtained by the at least one sensor.
  • the edge device further includes an edge computer having a processor, wireless connectivity, memory and a power supply.
  • the edge computer and the rest of the edge device are coupled to one another using a serial bus forming a“sandwich.”
  • the method may optionally include, singly, collectively, in any order and/or in any combination: electrically connecting (e.g., via a plurality of flexible interconnects embedded on or within the flexible substrate) one or more or all of the aforementioned components.
  • Examples of the subject matter and the operations described herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Examples of the subject matter described herein can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage medium for execution by, or to control the operation of, data processing apparatus.
  • the program instructions can be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to a suitable receiver apparatus for execution by a data processing apparatus.
  • a computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
  • the operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
  • the term “data processing apparatus,” “computing device,” “server” or “hub” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing.
  • the apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • the apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
  • a computer program (also known as a program, software, software application, script, application or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor receives instructions and data from a read only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data.
  • a computer can include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), for example.
  • Devices suitable for storing computer program instructions and data include all forms of non volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks.
  • the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
  • examples of the subject matter described herein can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), plasma, or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse, touch screen or a trackball, by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user.
  • Figs. 12 and 13 show the menu and the option of choosing the view data from the Practitioner Hand strap or the Patient’s (e.g., subject’s) Headband.
  • Fig. 13 shows an example of data shown when using the system of the present invention. In this case, Fig. 13 shows streaming motion data from the selected device (e.g., the practitioner’s hand strap).
  • Data from any of the sensors can be displayed in this fashion (e.g., Figs. 7A, 7B, 9 and 11) or can be compiled and/or shown on the same screen.
  • the three-dimensional visualization can also be provided in two-dimensional (Fig. 10) or three-dimensional form.
  • Examples of the subject matter described herein can be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components.
  • the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network, whether hardwired, wireless or otherwise. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
  • the computing system can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • a server transmits data to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
  • features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination. Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.
  • Step 2: Subject headband manipulation device having an edge device; construction: easily cleaned, durable headband material.
  • Step 3: Practitioner hand strap manipulation device having an edge device.
  • Force sensor is connected to the analog to digital converter peripheral of the microcontroller.
  • 9D motion sensor is connected to the Serial Peripheral Interface (SPI) or to the inter-integrated circuit (I2C) bus of the microcontroller.
  • SPI is a synchronous serial communication interface specification used for short distance communication, primarily in embedded systems.
  • I2C is a synchronous, multi-master, multi-slave, packet switched, single-ended serial electronic bus.
  • This interface specification is used for short distance communication, primarily in embedded systems, for attaching lower-speed peripherals to processors and microcontrollers in short-distance, intra-board communication.
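  • As a hedged illustration of how a hub or single-board computer could read such a 9D motion sensor over I2C, the sketch below uses the smbus2 Python library; the bus number, device address and register addresses are placeholders and do not describe the actual part or firmware used in the prototype.

```python
import struct
from smbus2 import SMBus

I2C_BUS = 1             # placeholder bus number
IMU_ADDR = 0x68         # placeholder 7-bit device address
ACCEL_START_REG = 0x3B  # placeholder register of the first accelerometer byte

def read_accel_raw(bus):
    """Read six bytes (X, Y, Z as big-endian int16) from the accelerometer."""
    raw = bus.read_i2c_block_data(IMU_ADDR, ACCEL_START_REG, 6)
    return struct.unpack(">hhh", bytes(raw))

if __name__ == "__main__":
    with SMBus(I2C_BUS) as bus:
        ax, ay, az = read_accel_raw(bus)
        print("raw accel counts:", ax, ay, az)
```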
  • 9D motion sensor is connected to the SPI or I2C bus of the microcontroller.
  • Used to initiate and/or time stamp when assessment(s) and/or manipulations are commencing/terminating.
  • WiFi is used for I/O data exchange between the edge device (microcontroller) and the local hub computer in order to maximize transfer bandwidth.
  • the user interface allowed the practitioner to make any necessary changes to the sensors, such as identifying/naming each sensor, adjusting the sensitivity of data recording on each sensor and having the ability to visually look at all the data streaming into the interface.
  • a foot pedal was connected to the hub device and placed appropriately near the treatment table for easy access to the practitioner’s feet.
  • the subject’s headband accumulated central nervous system data points continuously for the whole treatment, from the moment the sensor was turned on.
  • Central nervous system markers included heart rate and temperature data in the current version of the invention.
  • the subject was positioned appropriately in a supine position by the practitioner to do a pre-assessment.
  • the foot pedal was pressed to signal the commencement of the pre assessment.
  • This particular pre-assessment required passive side flexion of the subject’s head, to the right, by the practitioner, until an end feel was reached; at which point the patient’s head was moved back to the original neutral starting position. Only one repetition (one right passive side bend) was performed.
  • the pre-assessment was performed to determine range of motion of the cervical spine utilizing the gyroscope sensor embedded in the subject’s headband.
  • the accelerometer determined how fast the patient’s head and practitioner’s hand moved respectively during this assessment.
  • the metrics recorded on the practitioner’s hand strap included velocity and position (m/s, degrees) of the hand and the amount of force which was applied to the subject’s head.
  • One cervical spine manipulation was delivered to the right side of the patient’s cervical spine, with the practitioner hand strap and subject headband recording data.
  • In addition to the central nervous system data points from the subject’s headband, velocity (m/s) and change in position (degrees) of the head were recorded.
  • the practitioner’s hand strap recorded the velocity (m/s) and change in position (degrees) of the practitioner’s hand and the force (Newtons) applied by the practitioner’s hand to the subject’s cervical spine during the manipulation.
  • the system was used to check and adjust the C4 vertebra in the neck.
  • a right side-bend of the neck joint was performed.
  • the angle of deviation from the start of the pre-assessment is shown in Fig. 9.
  • the assessment start and end points as detected by the algorithm described herein are shown in the thick lines in the figure.
  • a 3D visualization of the motion is shown in Fig. 10.
  • the ranges of interest in the deviation angle data (the dark lines in the previous plot) were plotted in three-dimensions to aid in visualizing the movement of the head during the pre/post assessments.
  • the gray ellipse represents the subject’s head.
  • the range of motion of the joint can be calculated from the detected ranges of interest in the angle deviation data (the thick lines in the figures).
  • the range of motion results for the pre-assessment shown in Fig. 9 are summarized in the Table 3 (below).
  • angular velocity and force data collected during the adjustment of the C4 vertebra are shown in Fig. 11.
  • the moment of the manipulation is characterized by a sharp spike in both radial velocity and exerted pressure.
  • a peak detection algorithm as described herein picks out the largest spike from the data collected while the manipulation was performed. The detected peak is denoted by a large dot in the plots.
  • the biometric indicators measured during this trial consisted of heart rate measurements.
  • the plots of heart rate as measured from before the pre-assessment to after the post assessment are shown in Fig. 7B.
  • the vertical lined areas indicate the time when the pre and post assessments were being performed. As can be seen from the plot, there is no significant change in heart rate before and after the manipulation.

Abstract

The present invention relates to a method and systems for collecting and quantifying data from both the practitioner and the subject during a treatment, contact or manipulation. The present invention acquires data of the applied manipulation and central nervous system data prior to, during and post manual therapy treatments targeting joints. The data is obtained using edge devices having sensors (e.g., accelerometers, magnetometers, gyroscopes, pressure sensors, force sensors, heart rate monitors, temperature sensors, blood oxygenation sensors, respiration sensors and the like). Examples of types of manipulation devices having edge devices include bracelets, headbands, body bands, bands, straps, rings and towels that are used by both the practitioner and the subject. The edge devices communicate wirelessly with a central hub to collect data. The hub is responsible for collecting, processing, storing and uploading data to the cloud. Within the cloud, data is further filtered, analyzed, organized and displayed on an interactive platform.

Description

FILED ELECTRONICALLY
System and Methods for Quantifying Manual Therapy
RELATED APPLICATIONS
100011 This application claims the benefit of U.S. Provisional Application No. 62/690,109, entitled “System and Methods for Quantifying Manual Therapy” by Marcin Goszczynski, et al., filed June 26, 2018.
100021 The entire teachings of the above application(s) are incorporated herein by reference.
BACKGROUND OF THE INVENTION
100031 Manual therapists, such as Chiropractors, Osteopaths and Physical Therapists, utilize various types of treatment documentation including a widely adopted method referred to as the “SOAP” (Subjective, Objective, Assessment and Plan) note method, which is used to document patient visits. However, manual therapists often have no equipment or technology by which to include numerical data points in their notes pertaining to metrics involved with manual therapy techniques applied to their patients. Ultimately, this omission has often categorized related manual therapy professions as pseudo-scientific.
100041 Currently, there is an abundance of technology that quantifies and analyzes a patient’s health objectively; examples include blood chemistry testing, diagnostic imaging, video analyses, measurement apparatuses and balance platforms (e.g., computerized dynamic posturography). In contrast, minimal or no technology presently exists that monitors the application of manual therapy techniques (e.g., manipulations).
100051 Variations among manual therapy practitioners stem from their educational backgrounds, geography, ongoing continuing education, patient population, pathology, experience level, anatomical differences and the like. Without quantifiable data of manipulations, it has been difficult to characterize precise treatments and techniques performed by the practitioner.
100061 In unfortunate circumstances, if apparent damage/injury occurs from administered treatment/adjustment/pumping/manipulation/mobilization techniques, the industry often lacks a system which can determine what occurred during a manipulation or technique.
100071 A need exists for quantification of applied manual therapy techniques and the effectiveness of the manipulation(s) performed by obtaining information from sensors on the practitioner and on the patient. Yet a further need exists to provide a foundation for documenting quantifiable and numerical data points of performed manipulations. Such a foundation will provide for improved practitioner techniques of the manipulation and better outcomes for patients. A further need exists to use quantifiable characterizations of manipulations performed on a patient, particularly in real-world settings, to improve their health, especially when the patient’s health is regressing or stagnating. A need also exists to establish educational and technical standards for performing certain manual techniques using measurable manipulation datasets. Such data collection would provide valid scientific information, assist in the development of safety standards and accountability, and enhance credibility for the manual therapy industry.
100081 SUMMARY OF THE INVENTION
100091 The present invention relates to methods for assessing a manipulation by a practitioner on a first body part of a subject, wherein the practitioner has (e.g., wearing or using) at least one (e.g., one or more) practitioner edge device that includes at least one (e.g., one or more) practitioner sensor and the subject has (e.g., wearing or using) at least one (e.g., one or more) subject edge device that includes at least one (e.g., one or more) subject sensor on a second body part. The manipulation device or tool has an edge device embedded therein can be worn or used by the practitioner and/or subject. For example, the practitioner can wear a strap having an edge device and/or use a towel having an edge device when performing the manipulation/contact. The practitioner performs the manipulation to the first body part of the subject, wherein the first body part and the second body part is the same body part or a different body part, and wherein the practitioner sensor and the subject sensor each comprise a (e.g., one or more) gyroscope, accelerometer, force sensor, magnetometer or combination thereof, and optionally, at least one (e.g., one or more) central nervous system sensor. The steps of the inventive method include transmitting practitioner data of the manipulation from the at least one practitioner sensor to a network and transmitting subject data of the manipulation from the at least one subject sensor to the network.
The next step involves processing the practitioner data and the subject data to assess the
manipulation to thereby obtain a manipulation indicia; and providing an output of the manipulation indicia. The central nervous system sensor includes, for example, a heart rate monitor, a blood pressure monitor, a thermometer, an oxygen saturation monitor, a respiratory monitor, a blood lactate monitor, blood glucose monitor, a vascular flow monitor, a brain wave monitor, a sweat and saliva monitor, pain sensor or combination thereof, and optionally include a microphone or a video camera. As used herein with respect to CNS sensors, a“monitor” is also a sensor. The practitioner data or subject data of the manipulation each include, in an aspect, position data representative of position or movement of the body part (e.g., angular velocity), acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain data, and any combination thereof. The practitioner manipulation device, having an edge device attached thereto or embedded therein, can be in the form of a band or strap (e.g., a body band), a ring, a towel, a glove and the like. The practitioner manipulation device, having an edge device attached thereto or embedded therein, can be placed on the practitioner’s hand, arm, foot, knee, leg, hip, torso, back, head, or combination thereof. The subject manipulation device having an edge device includes a band (e.g., a body band), a strap, a headband, a bracelet, a body band, a ring, and the like. The subject edge device can be placed on the subject’s body part, such as a head, hand, arm, foot, knee, leg, hip, torso, back, or combination thereof. The method further includes, in one embodiment, transmitting the practitioner data and the subject data wirelessly to the network (e.g., having a server that is local, on a cloud, or both). The method can further include the step of storing the practitioner data and the subject data, and processing the practitioner data and the subject data to assess the manipulation. In an embodiment, these data can be compiled to obtain indicia of the manipulation. In another embodiment, processing the practitioner data and the subject data to assess the manipulation involves comparing the practitioner data with the subject data to one another or to a standard. The inventive methodology can further include transmitting practitioner data from more than one time point, transmitting subject data from more than one time point, and compiling or comparing the data from the more than one time points (e.g., prior to the manipulation, during the manipulation, and after the manipulation). The method can further include displaying the data in a representation such as a line graph, a three-dimensional representation of the movement, video (e.g., interactive video) and/or written description(s).
100101 In an embodiment, the inventive method of assessing a manipulation includes the practitioner performing the manipulation to the first body part of the subject; transmitting practitioner data of the manipulation from the practitioner sensor to a network; transmitting subject data of the manipulation from the subject sensor to the network; processing the practitioner data and the subject data to assess the manipulation to thereby obtain a manipulation indicia; and providing an output of the manipulation indicia.
loom The present invention also relates to systems for assessing a manipulation by a practitioner on a first body part of a subject, wherein the practitioner has (e.g., wearing or using) at least one practitioner edge device that includes at least one practitioner sensor and the subject has (e.g., wearing or using) at least one subject edge device that includes at least one subject sensor on a second body part. The practitioner performs the manipulation to the first body part of the subject, wherein the first body part and the second body part is the same body part or a different body part. The system encompasses the practitioner edge device having at least one practitioner sensor that obtains and transmits practitioner data; wherein the at least one practitioner sensor comprises a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; wherein the practitioner sensor further includes a wireless communication component. The system also includes at least one subject edge device having at least one subject sensor that obtains and transmits subject data; wherein the subject sensor comprises a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; wherein the subject sensor further has a wireless communication component; wherein, when in use, the subject wears at least one sensor on the second body part. Included in the system is a network that has at least one memory unit for storing processor executable instructions, the practitioner data and the subject data; and a processing unit for accessing at least one memory and executing the processor executable instructions, wherein the processing unit receives and processes the practitioner data and the subject data to assess the manipulation to thereby obtain a manipulation indicia. Furthermore, the system includes a communication module that receives the practitioner data and subject data; and an output device for providing the manipulation assessment. The system, in an embodiment includes a hub having a second communication module that receives the practitioner data and subject data and transmits said practitioner data and subject data to the network. The practitioner edge device and/or the subject edge device can each include a sensor, an analog to digital converter, an actuator, a digital to analog converter, a communication module (e.g., a bus), a processor, wireless connectivity, memory, a power supply (e.g., a battery), or a combination thereof. In an aspect, the processing unit processes the practitioner data and the subject data to assess the manipulation includes compiling the practitioner data and the subject data. In another aspect, the processing unit processes the practitioner data and the subject data to assess the manipulation includes comparing the practitioner data with the subject data or compare the practitioner data and the subject data to a standard. In an embodiment, the processing unit processes the practitioner data from more than one time point, the subject data from more than one time point, and compiles or compares the data from the more than one time points (e.g., prior to the manipulation, during the manipulation and after the manipulation). The output device, for example, includes a display providing the data in a graphical representation e.g., a line graph, or a three-dimensional representation of the movement.
100121 The present invention has a number of advantages, as follows. The present invention provides for a quantifiable, objective characterization of manipulations performed on a patient at one or more timepoints. The methods and systems of the present invention provide specific and detailed data acquired from treatments, a cumulative time line of treatment data, treatment data for each subject, treatment data for each practitioner, cross analysis of treatment data, collected treatment data utilization in manual therapy research, and collected treatment markers and fundamental physics equations. An additional advantage of this device and system is that it provides instantaneous feedback about a performed manipulation. Thus, a practitioner can improve their quality of treatment by repeating certain manipulations if their original objective(s) aren’t achieved. Such data, obtained from both the practitioner and the subject, improves the health of the patient, and allows for better education, uniformity in the applied techniques, and development of medical treatment standards. Subjects who utilize various practitioners can now acquire data from
administered treatments from multiple health care professionals and determine what is most effective. Development of quantifiable treatment standards can be used for education, health insurance, regulation and public safety. An additional advantage of this invention is that it will have the ability in the future, through acquired data, to create an automated machine to assist a practitioner to perform manual therapy techniques and/or replace the practitioner all together. This would be advantageous to patients, animals, students, educators, insurance companies and regulatory boards when experts/practitioners cannot be present to provide care.
BRIEF DESCRIPTION OF THE DRAWINGS
100131 The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
100141 Fig. 1 is a schematic showing the flow of data from subject edge devices, practitioner edge devices via a network to a hub if present, if not present to the cloud where the data analysis occurs. The Figure also shows output devices from the hub and/or the cloud. Note that in an embodiment, storage can also occur on the sensors themselves using a memory card. Data transfer can occur, e.g., by wireless transfer or hardwire via micro USB cord.
100151 Fig. 2 is a schematic showing the steps of the present invention involving receiving data from the subject edge device and the practitioner edge device, compiling and/or comparing the data to provide an indication of the quality or effectiveness of the manipulation, and then displaying, transmitting and/or storing the indicia, and optionally compiling and/or comparing two or more stored indicators and displaying that compilation/comparison.
[0016] Fig. 3 is a schematic of an edge device used in the present invention, wherein the edge device includes a sensor board (e.g., an analog sensor, analog to digital converter, actuator, and digital to analog converter), a communication module (e.g., a serial bus), and an edge computer (e.g., a processor, wireless connectivity, memory and battery). The edge device, in this embodiment, has a sandwich design between the sensor board and the edge computer. In this figure, the edge device communicates with the hub computer.
[0017] Fig. 4A shows schematics of various manipulation devices having an edge device, including straps and towels, for use by the practitioner and/or patient.
[0018] Fig. 4B is a schematic showing a strap or band manipulation device (referred to as a "hand strap"), as placed on a hand for use.
[0019] Fig. 4C shows a schematic of a perspective view of a headband manipulation device having an edge device.
[0020] Fig. 4D is a schematic showing a detailed perspective view of the hand strap manipulation device shown in Fig. 4B.
[0021] Fig. 4E is a schematic of the hand strap in Figs. 4B and 4D showing the parts under the silicone cover.
[0022] Fig. 5 is a schematic showing a patient wearing a headband of the present invention shown in Fig. 4C with the practitioner wearing the hand strap shown in Figs. 4B, 4D and 4E.
[0023] Figs. 6A-C are a series of schematics showing the steps of using this invention during a cervical spine manipulation performed by movement of the head of the subject. Both the practitioner and the subject are wearing sensors (the practitioner's edge device is shown only in Fig. 6C since the practitioner's edge device on their hand is under the subject's head in Figs. 6A and 6B). The pre-assessment can be performed by checking the range of motion of rotation of the head of the subject. The figures show the manipulation as the head is rotated during the contact/manipulation.
[0024] Fig. 7A is a line graph showing data from an accelerometer during the cervical manipulation of Fig. 6. The units are m/s2 on the y-axis and ms on the x-axis. Also shown in Fig. 7A is the data from the gyroscope, with m/s2 on the y-axis and ms on the x-axis. These two sensors (accelerometer and gyroscope) are acquiring data from the practitioner's edge device.
[0025] Fig. 7B shows a three-dimensional graphical representation showing the side-flexion of the head and degrees of side-flexion in the sagittal axis. These data are shown as two snapshots but can be presented in video format of the movement in three dimensions. The following is shown: the range of motion of the joint for the pre- and post-assessments, the maximum acceleration (m/s2) of the practitioner's hand performing the manipulation, the maximum pressure applied by the practitioner performing the manipulation, and the heart rate of the patient during the time from the pre-assessment to after the post-assessment, or, more simply articulated, the time of the complete treatment. A line graph provides bio-indicators over time, including heart rate (i.e., beats per minute).
[0026] Fig. 8 shows a general architecture for a computer system, according to the principles herein, including the network/cloud 812, communications interface/hub 806, the output device 808, input device 810, processor 802 and memory 804.
[0027] Fig. 9 shows graphs providing pre-assessment orientation data for manipulation of the cervical spine by relative angle of rotation during the pre-assessment over time (seconds) (e.g., right side flexion). Additionally, the lower graph displays the force/pressure in mV applied over time (seconds) during the pre-assessment.
[0028] Fig. 10 shows a three-dimensional graphical representation showing the pre-assessment orientation data and post-assessment data for manipulation of the cervical spine by side flexion of the head in degrees of rotation.
[0029] Fig. 11 shows line graphs providing the manipulation data during the manipulation of the cervical spine by right side flexion of the head. The top line graph of Fig. 11 shows the radial speed (degrees/second) during the manipulation over the pressure (mV) applied during the manipulation. The bottom line graph of Fig. 11 shows pressure/force in mV over time in seconds.
[0030] Fig. 12 is a screenshot of the software used in the present invention that shows and confirms that both sensors (at least one sensor from the practitioner, and at least one sensor from the subject) are connected to the system and are streaming live data.
[0031] Fig. 13 is a screenshot of the software used in an embodiment of the present invention that shows live streaming data from the practitioner hand strap on that screen.
DETAILED DESCRIPTION OF THE INVENTION
[0032] The present invention relates to systems and methods for assessing and evaluating the quality of manipulations or contact(s) performed by a practitioner (e.g., a clinician or a manual therapist such as Chiropractors, Doctors, Performance Therapists, Physical Therapists, Osteopaths, Osteopathic Physicians, Naprapaths, Massage Therapists, Athletic Therapists, Cranial Sacral Therapists, Ki-Hara Practitioners, Reflexologists, and Rolfing Practitioners) on a subject (e.g., patient, human, animal, non-human model) by obtaining data from both the practitioner and the subject, and optionally from supporting therapeutic structures/implements (e.g., treatment tables) and auxiliary evidence recording equipment (e.g., video camera(s)). This novel system and methodology obtains measurements from both the subject and the practitioner to determine the nature, quality, and/or effectiveness of manipulation(s). The present invention serves to assess the quality of the manipulation, the outcome of the manipulation (e.g., if it works and how well), and the technique of the practitioner. The system advantageously allows one to quantify and characterize the manipulation, which allows one to understand which techniques work and which techniques are most effective. Additionally, feedback about the manipulation can be provided during the treatment session in real time. This inventive methodology allows the practitioner to learn effective techniques and allows one to more easily reproduce the manipulation. The system also allows one to assess mistakes made or ineffective techniques so as to avoid them, thereby improving treatments and technique(s) and learning from manipulations that work and those that do not.
[0033] In one aspect, the system of the present invention assesses the manipulation/contact by measuring both the practitioner's and the subject's movements using edge devices. "Manipulation" and "contact" are used interchangeably and refer to the act or process of manually adjusting a body part or manually treating a body part, both of which can be done directly or via structures in relation/communication to that body part. The present invention is used in manual therapy, referring to an application of force to the subject's body, typically administered by the practitioner's body, but which can also be applied through various implements, such as needles and aptly constructed instruments, to realize optimal health.
[0034] Various devices are referred to herein, including an edge device, a hub device, a user console or dashboard device, and the cloud or cloud infrastructure. In an embodiment, the edge device refers to an input/output device (mechanics + electronics) of minimal size or profile (e.g., wearable or usable during a manual adjustment) where interaction between the system and the ambient, real physical world happens via a collection of an appropriate set of sensors and actuators. Sensors input ambient-world data to the system, and actuators execute the implications of output data from the system. This is a device on the "interface" between the ambient physical world and the inner workings of the entire system. A hub device refers to a device that collects and distributes system data to the cluster of edge devices, maintains connectivity and synchronization with the cloud infrastructure and, if necessary, can locally substitute for cloud functionality. The user console/dashboard device, in one aspect, is a device capable of running software that allows monitoring of all aspects of the system's behavior (ultimately sensors) and influencing all aspects of the system's behavior (ultimately actuators). The cloud or cloud infrastructure refers to, for example, an underlying system's computational and storage resources applicable to multiple local systems. The system, in an embodiment, can include a foot pedal or other input device(s) to turn the system on or off, or to indicate that the practitioner is ready to begin or end a manual adjustment.
100351 Referring to Fig. 1, the edge device, in one embodiment, is a device that measures and translates real-world information into digital information. As described further herein, the edge device has an input/output device that includes one or more sensors, one or more analog to digital converters, and optionally one or more actuators, and one or more digital to analog converters. The edge device further includes a communication module (e.g., one or more buses), and optionally one or more controllers (e.g., passive and/or active controllers). In an embodiment, the bus is a serial bus and forms a“sandwich” design between the sensor board and the edge computer. The edge device includes an edge computer having a processor, wireless connectivity, memory and a power supply (e.g., battery). Sensor types of the edge device include any sensor that can measure a condition to which it is subjected, as it relates to the manipulation or health of the subject/or practitioner. In particular, sensor types include one or more of the following: gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor. Central nervous system sensors include one or more of the following: heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, or combination thereof. Additional optional sensors include, microphones, video cameras, blood lactate monitors, blood glucose monitors, vascular flow monitors, brain wave monitors, sweat and saliva monitors, and pain sensors.
100361 The sensors described herein can be made or purchased from commercially available sources. For example, the following sensors can be made or purchased separately or in combination with another sensor(s) for use with the present invention as follows: gyroscope (e.g., Bosch Sensortec BMI 160, Bosh Sensortec BNO055 (Bosh Sensortec GmbH (Reutlingen/Kusterdingen Germany)), STMicroelectronics LSM9DS0, LSM9DS1 (STMicroelectronics (Crolles Cedex, France)), TDK InvenSense MPU-9150 TDK (InvenSense Inc. (San Jose, California));
accelerometer (e.g., Bosch Sensortec BMI 160, Bosh Sensortec BNO055 (Bosh Sensortec GmbH Reutlingen/Kusterdingen Germany), STMicroelectronics LSM9DS0, LSM9DS1
(STMicroelectronics (Crolles Cedex, France)), TDK InvenSense MPU-9150 (TDK InvenSense Inc. (San Jose, California ); force sensor (e.g., FlexiForce A201, FlexiForce A401 and FlexiForce A502 (Tekscan, Inc. Boston, Massachusetts), StretchSense, a force and stretch sensor system
(StretchSense (Penrose, Auckland, New Zealand)); a magnetometer (e.g., Bosh Sensortec BMI 150 , Bosh Sensortec BNO055 (Bosh Sensortec GmbH (Reutlingen/Kusterdingen Germany)), TDK InvenSense MPU-9150 (TDK InvenSense Inc. (San Jose, California )); heart rate monitor (e.g., Maxim Integrated MAX30100, MAX30101 (Maxim Integrated (San Jose, California)); blood pressure monitor (e.g., Nokia BPM + Blood Pressure Monitor and Nokia Technologies LTD.
(Health), Espoo, Finland); thermometer; oxygen saturation monitor (e.g., Maxim Integrated MAX30100, MAX30101 (Maxim Integrated (San Jose, California)); electrocardiogram, respiratory monitor (e.g., B01MOODCL5, Beddit 3 Sleep Tracker (Apple Inc. Cuperino, California); blood lactate monitors (e.g., ALP10101 Lactate plus meter (Nova Biomedical, Waltham, Massachusetts); blood glucose monitors (e.g 71387 Freestyle Precision Neo - Blood Glucose meter (Abbott Diabetes Care Inc. Alameda, California); vascular flow monitors (e.g., Vascular flow monitor using portable touch-less device (video) to monitor entire flow in patients, University of Waterloo, Waterloo, Ontario); brain wave monitors (e.g., ICM-20948 Brain wave headset monitor (EMOTIV, San Francisco, California); sweat and saliva monitors (e.g., Kenzen Patch. Sweat analysis providing real time feedback (Kenzen, San Francisco, California); wearable salivary uric acid mouth guard biosensor with integrated wireless electronics (e.g., University of California, San Diego); microphones (e.g., ICDPX440, Sony Digital Voice Recorder (Sony Corporation, Minato, Tokyo, Japan)); video camera (e.g., CHDHS-501-CA GoPro Hero5 Session (GoPro, San Mateo, California)); ambient environment sensors (e.g., temperature (Bosh Sensortec BME), humidity (e.g., Bosh Sensortec BME); barometric pressure sensors (e.g., Bosch Sensortec BMP280, Bosh
Sensortec BMP180, STMicroelectronics LPS22HB), and pain sensors (e.g., a "chemical-pain sensor"; see, for example, Jin, Hye Jun et al., Biosensors and Bioelectronics, vol. 49: 86-91 (15 November 2013)). Other devices such as power management and battery charging devices can be included (e.g., IDT P9025AC wireless power receiver (IDT (San Jose, California)) and Analog Devices LTC4120 wireless power receiver (Analog Devices (Norwood, Massachusetts)). Examples of edge devices used include MbientLab MetaWear CPRO and MetaMotionR platforms, which are integrated systems containing a connectivity module (e.g., BLUETOOTH connectivity) and a variety of motion and ambient environment sensors; Espressif Systems ESP32-based modules, which are dual-core microcontrollers with communication/connectivity protocols such as BLUETOOTH communication and Wi-Fi connectivity, and which allow for integration of a variety of motion, ambient environment, CNS and force sensors and actuators; and the StretchSense platform (StretchSense (Penrose, Auckland, New Zealand)), which is an integrated system containing a Bluetooth connectivity module and a dedicated microcontroller for processing raw data from custom force and stretch sensors. Additionally, EMF radiation protection materials, to shield subjects from exposure(s), surround edge devices (Defender Shield, Florida, USA). Examples of a hub device used with the present invention include the Raspberry Pi 3 and Raspberry Pi 3+, which are single-board computers with a connectivity module (e.g., BLUETOOTH connectivity), Wi-Fi and wired Ethernet network connectivity; and a two-foot-switch control pedal connected to the hub device, the iKKEGOL USB double switch pedal (FS2016 B2, iKKEGOL, China). Examples of cloud infrastructure used in the present invention are Amazon Web Services and Google Cloud Platform. Further, the use of 3D motion camera capture technology allows for the recordation of sensor data from the subject and/or practitioner. See, for example, WO2019095055A1.
[0037] System 100 shows that information from subject edge device 102 and practitioner edge device 104 may or may not travel through hub 108 but is compiled and analyzed by a processor in cloud 110 to provide output 112 that characterizes the nature or quality of the contact/manipulation. Specifically, the subject and the practitioner information can be compiled to generate a three-dimensional output or graph to characterize or replicate the manipulation (e.g., motion, force, direction, position, time of the manipulation). The information can also be compared to a standard to assess the nature of the manipulation. The standard can be data that represents the desired character of the manipulation, data obtained from a patient population on which the manipulation worked or did not work, or data obtained from the specific patient at previous time points. Additionally, the data for the standard can be from industry accepted standard(s) established by experts in the field.
[0038] In one embodiment, the state of the joint can be compared before, during and after the manipulation. Depending on the joint, the state can be defined, for example, as: the range of motion of the joint along its axis (axes) of rotation, the average force required to move the joint through the full range, the force required to move the joint through the full range along its axis (axes) of rotation, etc. In this example, the practitioner can compare the state before and after a manipulation by evaluating these numbers. The practitioner can decide in real time that a manipulation was successful if, for instance, the range of motion increased and less force was required to move the joint through the full range of motion. [0039] Regardless of the standard, the output can include a compilation of data, or a comparison and assessment of the subject's and practitioner's information as compared to the standard. The types of output are further described herein. The output can be provided immediately to the practitioner and/or patient, and the practitioner can adjust the next manipulation accordingly (e.g., use more or less force, change the angle of the torque, the course of the movement/rotation, and the like).
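By way of a non-limiting illustration, the before/after comparison described in this paragraph can be reduced to a simple rule over summary metrics of the joint state. The following is a minimal sketch, assuming a hypothetical JointState container with range_of_motion_deg and mean_force_n fields; it illustrates the comparison logic only and is not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class JointState:
    """Hypothetical summary of a joint's state from one assessment phase."""
    range_of_motion_deg: float  # degrees of rotation achieved along the assessed axis
    mean_force_n: float         # average force required to move the joint through its range

def assess_manipulation(pre: JointState, post: JointState) -> dict:
    """Compare pre- and post-assessment joint states.

    A manipulation is flagged as 'successful' here if the range of motion
    increased and less force was required to move the joint through its
    range -- the example criteria described in the text.
    """
    rom_gain = post.range_of_motion_deg - pre.range_of_motion_deg
    force_drop = pre.mean_force_n - post.mean_force_n
    return {
        "rom_gain_deg": rom_gain,
        "force_drop_n": force_drop,
        "successful": rom_gain > 0 and force_drop > 0,
    }

# Example: range of motion increased by 8 degrees and 1.5 N less force was needed.
print(assess_manipulation(JointState(62.0, 10.5), JointState(70.0, 9.0)))
```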
[0040] Fig. 1 shows the system 100, in which the flow of data goes from the subject edge device 102 and the practitioner edge device 104 and, if a hub is present 106, flows to the hub 108. Hub 108 receives and stores data from one or more manipulations/contacts and transmits it to cloud 110. If hub 108 is not present, then the data can be transmitted directly to cloud 110 wirelessly or through a wired connection via a computer. The decision 106 as to whether a hub is present depends on the system architecture. For example, a manipulation may be done out of the office where only the edge devices are present, but the communication module in the devices allows for communication of the devices directly to the cloud when not present in the office. The cloud is in communication with or has a server, memory for storing data, a processor to compile and analyze the data, and a communication module to transmit the data to an output device. Cloud 110 analyzes data obtained before, during, and after the contact/manipulation. The data of the present invention that flows from both the subject and the practitioner includes position data representative of position or movement of the body part, acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, ECG data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain indication data, and any combination thereof. One or more of these data types can be used; the more data types used, the more robust the output and characterization of the manipulation/contact. Furthermore, the following externally recorded diagnostic test results can be embedded into the manual therapy recording system: X-rays, ultrasounds, MRIs, CAT scans, VNG-eye recordings, balance recordings (e.g., computerized dynamic posturography), complete blood profiles, urine analysis, stool analysis and genetic profiles. Additionally, subjective questionnaire data and third-party application data could be embedded into the system. As an example, an application which monitors daily sleep phase activity could be either manually inputted into or linked automatically to this system.
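As a purely illustrative sketch of how the practitioner and subject data streams listed above might be represented in software, the example below defines a hypothetical timestamped Sample record tagged with its source device and data type; the actual encoding used by the edge devices is not specified by this description.

```python
from dataclasses import dataclass, field
from typing import Dict
import time

@dataclass
class Sample:
    """One timestamped measurement from a practitioner or subject edge device."""
    source: str               # e.g. "practitioner_hand_strap" or "subject_headband"
    kind: str                 # e.g. "acceleration", "force", "heart_rate"
    values: Dict[str, float]  # sensor channels, e.g. {"x": ..., "y": ..., "z": ...}
    timestamp: float = field(default_factory=time.time)

# Examples of the data types listed above, one from each side of the treatment:
practitioner_force = Sample("practitioner_hand_strap", "force", {"force_n": 38.2})
subject_heart_rate = Sample("subject_headband", "heart_rate", {"bpm": 72.0})
print(practitioner_force, subject_heart_rate)
```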
[0041] Fig. 2, using system 100 shown in Fig. 1, describes the steps of the present invention.
Methodology 200 begins with receiving step 202 in which data (at least one measure of data) from each of the subject edge device and the practitioner edge device are obtained pre-manipulation, during manipulation and/or post-manipulation. The data is communicated to a processor, where step 204 compiles the data measurements and/or compares the data measurements to a threshold value or standard to provide an indication of the quality or effectiveness of the manipulation/contact. The indicia (e.g., data representation, compilation or comparison) can be displayed in step 208 by an output device (e.g., monitors, displays, printers, projectors, speakers, headphones and the like). The display of the indicia can be tailored for the subject, the practitioner or another interested party, as desired. The indicia can be transmitted in step 210 to the practitioner and/or the subject (e.g., via email, text, or by wire or wirelessly). In an embodiment, the indicia are transmitted back to the edge device to alert the subject and/or practitioner that the manipulation was successful or unsuccessful. Various types of output of the edge devices are further described herein. Step 206 allows the indicia to be stored, e.g., in memory or on a storage drive. Two or more stored indicia from 206 can be further compiled or compared in step 212 and displayed in step 214. The advantage of compiling two or more stored indicia is that it allows for comparison of manipulations. For example, data can be compiled and/or compared from one or more time points of the same manipulation (pre-manipulation, during manipulation and post-manipulation).
[0042] The logic for processing data to assess a manipulation is as follows. In an embodiment, pre- or post-assessment data is obtained along with data during the contact/manipulation. If the data is obtained during the pre- or post-assessment, then the software processes the data to estimate/determine the state of the joint and the subject. If the data is from the manipulation, then the software estimates/determines data points to quantify the manipulation. If there is no event, then the software does nothing.
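The event logic described above can be sketched as a simple dispatcher; the phase labels and helper functions in the following example are hypothetical placeholders for the joint-state estimation and manipulation quantification described later in this document.

```python
def process_event(phase, data):
    """Dispatch sensor data according to the logic described above.

    phase: "pre", "post", "manipulation", or None (no event).
    data:  the block of practitioner/subject sensor data for that phase.
    """
    if phase in ("pre", "post"):
        # Pre- or post-assessment: estimate the state of the joint and subject.
        return {"phase": phase, "joint_state": estimate_joint_state(data)}
    if phase == "manipulation":
        # During the manipulation: quantify it (e.g. peak force, radial velocity).
        return {"phase": phase, "manipulation": quantify_manipulation(data)}
    # No event: do nothing.
    return None

# Hypothetical stand-ins so the sketch runs; real versions would implement the
# assessment and quantification algorithms described later in this document.
def estimate_joint_state(data):
    return {"range_of_motion_deg": max(data) - min(data)}

def quantify_manipulation(data):
    return {"peak_value": max(data)}

print(process_event("pre", [0.0, 10.0, 35.0, 60.0]))
print(process_event("manipulation", [0.1, 0.2, 9.5, 0.3]))
```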
100431 The range of motion of a particular joint from before and after the manipulation can be compared. In another example, the post manipulation data of the same manipulation performed on the same subject on more than one occasion can be compared and associated with the outcome. In another example, the data obtained during the actual manipulation can be compared too.
100441 In an embodiment, one practitioner edge device (e.g., a practitioner manipulation device having an edge device therein), one subject edge device (e.g., a subject manipulation device having an edge device therein) can be used to carry out the present invention, along with a foot pedal (to indicate to the system to record data or that a manipulation is beginning or ending). In yet another embodiment, two or more edge devices on the practitioner and/or on the subject can be used to increase efficiency in acquiring data (e.g., force data). In one example, for the practitioner, an edge device in the form of a hand strap is used to acquire motion data, and a second edge device in the form of a towel, which contains the force sensors is used. In this embodiment, separating the motion sensors from the force sensors provides more variety of contacts and in certain cases, more reliable data.
100451 With respect to Fig. 3, edge device 300 is the interface between the real world and the computer system and can be a single or dual directional device. In the figure, the right side of the vertical line is the digital environment, and left side of the line is an analog/digital environment or sensor board. The edge device (e.g. one or more edge devices) translates real world signals into digital data that is compiled and analyzed by the system. The dual directional edge device 300 of Fig. 3 includes integrated input/output device (I/O device) 320, also referred to as a“sensor board.” Input/output device or sensor board 320 includes one or more analog sensors 302 in communication with analog to digital converter 304, which translates analog signals to digital data. The sensors include, for example, gyroscope, accelerometer, force sensor, a magnetometer, heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, a microphone, a video camera, blood lactate monitor, blood glucose monitor, vascular flow monitor, brain wave monitor, sweat monitor, saliva monitor, pain sensor, and any combination thereof. Each of these sensors can communicate to analog to digital converter 304, which in turn translates the analog signal to digital data and communicates via bus 318 or wirelessly to hub 316 or directly to the cloud. In some embodiments, integrated input/output device 320 also has actuator 308 that provides an output signal to digital to analog converter 306 which provides an analog signal to sensor 302. One or more actuators and digital to analog converters can be included in the integrated input/output device of the present invention. The integrated edge device includes one or more sensors as described herein. Consequently, the sensors provide the type of data, described herein. Data obtained pre-contact/manipulation, during contact/manipulation, and post
contact/manipulation is converted from analog data to digital data by converter 304 and then digital data moves to hub 316 or the cloud through communications modules such as bus 318. One or more buses can be used to transmit data back and forth. Bus 318 communicates with edge computer 312 which transmits the data to the hub 316 or directly to the cloud. The bus can be a serial bus that sandwiches the sensor board and the edge computer. The edge computer, generally miniaturized, has a memory device, and a processor communicatively coupled to the memory device, wireless connectivity, and a power supply (battery, charging port, and the like).
100461 Optionally, the edge device can further include an active controller which is a device that integrates one or more input/output devices with software. Passive controller also integrates input/output device 320 but does so without software and can optionally be included in the edge device. An active controller has computational functionality, input output access device and a connectivity device. The wireless connectivity of the edge computer transmits data to computer systems (e.g. hub or cloud). Edge device 300 can also optionally include one or more passive controllers in communication with one or more sensors. Edge devices can further include a substrate or housing into which the components described herein can be housed, and/or a port for a wired connection.
100471 An array or plurality of sensor types can be included and depends on the type and purpose of the edge device. The edge device can be embedded into a manipulation device used by the practitioner or subject or can be in the form of a wearable object or usable object when performing a manipulation. The edge devices can be made in miniature form sufficient for embedding into the practitioner, subject, animal or model. As an example, the practitioner could have all the motion sensors and CNS sensors embedded into the backs of their hands and the subject could have all the motion sensors and CNS sensors embedded into their heads. Thus, the data could be transmitted from the embedded sensors to the hub or cloud.
[0048] A manipulation device having an edge device attached thereto or embedded therein can be in the form of, for example, a band, a strap, a ring, a towel, a glove, a headband, a bracelet, a body band, a sleeve, a patch and any combination thereof. The edge device can be in the form of any wearable or usable object that allows one to assess measurements during a manipulation or contact. Exemplary devices are shown in Fig. 4A and include towel variations, bracelets, bands and straps. The manipulation device having an edge device can be of any size and shape to allow one to wear and/or use the device during a manipulation. Fig. 4B shows an example of a hand strap, hand strap 402, having an edge device, worn on the practitioner's hand 424 with a hook and loop fastener, and Fig. 4C shows headband 408 having an edge device for use by the subject. Headband 408 has opening 410 because the central nervous system sensor(s) often require close proximity and direct contact with the subject. Headband 408 additionally has electromagnetic radiation (EMF) blocking fabric to protect the subject. Fig. 4D shows a detailed view of hand strap 402 shown in Fig. 4B. Specifically, hand strap 402 has an opening 414 to receive the edge computer under cover 422 and is connected by cords 416 to force sensor 412. This design of the manipulation device allows the force sensor position to be adjustable (e.g., movable along the length of the strap). Force sensor 412 is held down by the strap 418 when in use. The hand strap 402 is held in place on the practitioner's hand 424 with hook 420 that is received by a pocket on the opposite end of the strap. Fig. 4E shows the design of the edge computer under cover 422: sensor board 300 along with retractable device 428 in communication with force sensor 412 via cords 416.
100491 Additionally, the manipulation devices can include various closures to keep the device on the body part during the manipulation. Such closures include a fastener, a strap, a snap, a buckle, a button, a hook, an elastic member, a tie, a clip, a zipper, a drawstring & cord lock, a hook-to-hook arrangement, a hook & loop arrangement, hook & pocket arrangement, or a combination thereof.
The manipulation device can also include an adhesive for adhering to the subject’s or practitioner’s skin or clothing. The fabric for the manipulation devices can be any fabric, now known or later developed, and is a fabric that is strong enough to withstand the manipulations and preferably easy to clean/hygienic. Such fasteners and fabrics can be purchased commercially. Additionally, to assist with close contact exposures of EMF in the form of extremely low frequencies (EFF), radio frequency radiation (RF) and heat, each edge device can be surrounded by EMF blocking materials. The edge device’s purpose impacts the type of sensors being used and the shape/form of the edge device. For example, an adjustment of the cervical spine by manipulation of the head involves movement (e.g. position in space, speed, acceleration, force and pressure) as the head is rotated from a first position to a second position (e.g., from right to left). The strap used by the practitioner could include, for example, a gyroscope, accelerometer, force sensor, magnetometer and at least one central nervous system sensor such as a heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, microphone and video camera, blood lactate monitor, blood glucose monitor, vascular flow monitor, brain wave monitor, sweat and saliva monitor, pain sensor, or a combination thereof. The headband used by the subject, for example, could include a gyroscope, accelerometer, force sensor, magnetometer and at least one central nervous system sensor such as heart rate monitor, blood pressure monitor, thermometer, oxygen saturation monitor, respiratory monitor, microphone, video camera, blood lactate monitor, blood glucose monitor, vascular flow monitor, brain wave monitor, sweat and saliva monitor, pain sensor, or a combination thereof. Accordingly, the type of sensors used in an edge device can be the same or could differ and, in some cases, certain sensor data may be more relevant than others, depending on the anatomy, technique, manipulation and the stage of the manipulation. Based on anatomy or technique delivery, as an example, practitioners could choose the desired manipulation device, for example, rings, straps for the cervical spine or towels for the hip or sacroiliac joints.
100501 As shown in Figs. 5 and 6, headband 408 is worn by the subject and hand strap 402 is worn by the practitioner. Using the adjustable feature of hand strap 402, the practitioner can adjust the location of the force sensor 412 to be at the base of the subject’s neck to obtain a better reading of the force used in the manipulation.
100511 Fig. 6 shows the present invention being used by a practitioner on a subject. With the subject wearing the headband and the practitioner wearing the hand strap, the manipulation is being performed as the subject’s head is being rotated from a first position to a second position. The head is being manipulated in order to adjust the cervical spine. One body part is being moved, in this case, to adjust another body part. In other types of contacts, the manipulation and adjustment are done on the same body part. The manipulation can be performed on the body part being
manipulated or a first body part in communication with the second body part being manipulated, an example shown in Figs. 6A-6C. Accordingly, the body parts of the subject that wear the manipulation device having an edge device include a head, hand, arm, foot, knee, leg, hip, torso, back, or combination thereof. The practitioner can wear/use the edge device on any one of the following body parts: hand, foot, knee, leg, hip, torso, back and/or head. Table 1 provides examples. Several combinations exist and depend on the body part being adjusted, the practitioner, the subject and the specific issue with the body part. 100521 Table 1 : Examples of Body Part Manipulation/Communication
[Table 1 is reproduced as an image (imgf000019_0001) in the original publication; its contents are not available as text.]
100531 Using the sensor in the methodology of the present invention involves the following steps. The practitioner having one or more edge devices and the subject having one or more edge devices provide pre-manipulation data measurements. The practitioner performs the manipulation during which real time data for the practitioner and the subject are obtained and transmitted. Additionally, the post-manipulation data can also be obtained and transmitted. During the pre and post manipulation phases, the practitioner can move the body part to assess range of motion, flexibility and strength and data from the edge devices can measure this. Exemplary data includes position data representative of position or movement of the body part, acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain data, and any combination thereof.
100541 The indicia of the contact/manipulation can be characterized in various ways. The character of the manipulation can be assessed, and examples of such characterizations are shown in Fig.7A and 7B. The indicia can include positive/negative results or an otherwise binary output indicating if the manipulation was attempted/successful. Indicia of the manipulation can also include graphical representations of the data as shown in Fig. 7 A, e.g. acceleration and angular velocity. The data in Fig. 7A also shows acceleration and gyroscope (position) data. Table 2 shows the type of sensor, data type, units of measurement and provides an example of how the data can be displayed. The indicia can show compiled data, compared data or both.
100551 Table 2
[Table 2 is reproduced as an image (imgf000020_0001) in the original publication; its contents are not available as text.]
100561 Indicia also include compilations of data or comparisons of the unit of measurement obtained to a standard, described herein, to provide information to the practitioner if a manipulation met certain criteria. A three-dimensional representation showing a replication or simulation of an aspect of the manipulation can be shown. See Fig. 7B. For example, range of motion of the body part can be assessed before and after the manipulation to determine if there is an increase in range of motion. Fig. 7B shows the three-dimensional representation movement of the head for adjustment to the cervical spine. This is accomplished by transmittal of the data from the practitioner and subject at one or more time points and comparing the data to one another or to a standard. The data is transmitted to the processor for compilation and analysis to determine nature of the contact. The indicia of the contact/manipulation are provided to an output device.
[0057] The edge device is also, in an embodiment, an output device and provides information to the practitioner and the subject. In certain embodiments, the edge devices have an actuator that converts digital data to analog information. There are various types of outputs for the edge device. The edge device can communicate a stimulus output signal needed for the sensor to work. For example, the heart rate sensor works by utilizing an LED: it emits light into the vessel and obtains feedback on the differences in the reflected light, which can be used to calculate the heart rate. The edge device has an actuator, in an aspect, to provide a certain amount of LED light at specific parameters to assess the heart rate, which would result in an input. Another output type of the edge device is a system monitoring output. For example, such an output can inform the practitioner/subject if the system is ready. Yet another type of output relates to the operation of the system and/or methodology. The system can analyze the manipulation and correct behavior. For example, if too much force was used in the manipulation or an improper motion was used for the manipulation, the edge device can alert the user by sending a signal such as a red light or by emitting a "shake" of the device. If a manipulation worked, as compared to a standard, then the device provides a positive indicator. As the system records data for the practitioner and/or subject, the system learns which manipulations work and how the manipulation should be carried out to achieve a successful manipulation. The system can guide practitioners with feedback based on comparing current data with data obtained from previous time points.
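One way the corrective feedback described above could be realized is to compare a measured metric against a standard and select the corresponding actuator response. The sketch below is illustrative only; the threshold values and the mapping to a red light or vibration are hypothetical.

```python
def feedback_for_force(peak_force_n: float,
                       standard_min_n: float,
                       standard_max_n: float) -> str:
    """Map a measured peak force onto the kinds of edge-device feedback
    described above (a positive indicator, or an alert such as a red light
    or a vibration).  The standard range is hypothetical.
    """
    if peak_force_n > standard_max_n:
        return "alert: too much force (red light / vibrate)"
    if peak_force_n < standard_min_n:
        return "alert: too little force (red light / vibrate)"
    return "positive indicator (green light)"

# Example: a manipulation whose peak force falls inside the standard range.
print(feedback_for_force(peak_force_n=42.0, standard_min_n=30.0, standard_max_n=60.0))
```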
[0058] The hub of the present invention is a data processing apparatus or computing device and has several functions. The hub serves as a communication interface between the edge devices and the cloud. Additionally, the hub can be used as a user interface with which the practitioner can turn the system on and be sure that it is ready to be used. In an embodiment, the hub acts as a remote controller for the invention and can be a computer, a credit-card-sized computer, a smart phone and/or a tablet. With a touch screen interface, the hub is able to conduct operations such as power and communication. The practitioner can also provide information to the system (e.g., that the manipulation is about to begin, the type of manipulation, the phase (pre, during, post) of the manipulation, etc.). The practitioner has the option and ability to provide information to the system in a hands-free environment, through a foot pedal and/or voice commands, as an example. The hub can also act as an output device to provide indications of system readiness (i.e., that the system is ready to receive data and/or for the practitioner to perform the manipulation) and the outcome, characterization, and/or quantification of the manipulation. The hub acts as a data receiver and can receive and store data and upload data to the cloud.
100591 The cloud as used in the present invention is generally where the instructions are executed according to software and algorithms described herein. Although the figures show a cloud-based system, the functions of the cloud can be performed on a local computer. The software applications collect, compute and organize all the data metrics, which are uploaded to a hard drive device located in a central hub and a cloud-based system. In turn, this data can be displayed in multiple platforms, including watches, phones, tablets, smart televisions and/or desktop computers.
[0060] Data Algorithms
[0061] In an embodiment, there are at least three aspects to this process: the state of the joint, subject and practitioner is quantified during the pre-assessment, the manipulation and the post-assessment.
[0062] Various edge devices, described herein, are used within these three aspects to gather data. The data from the one or more edge devices are processed and displayed for the practitioner and/or subject to see in a real-time, intuitive manner. In an embodiment, the algorithms for processing and interpreting the sensor data are explained for each phase.
[0063] Joint and Subject State Determination Through Pre- and Post-Assessment:
[0064] The algorithm performs the following with respect to the state of the joint, subject and practitioner: (1) a pre-assessment evaluation before the manipulation, (2) an evaluation during the manipulation, and (3) a post-assessment evaluation after the manipulation.
100651 The state of a joint can be represented using several data points including: the degrees of rotation around its (possibly multiple) axes (active and passive motion); the force required to move the joint through its range of motion without the subject attempting to resist the motion (passive range of motion); the amount of force production by a stationary joint in neutral/resting position while the subject is resisting the motion (resisted isometric contraction); or any combination thereof. In some cases, only some of the state data points are assessed depending on the objective of the practitioner and the joint being manipulated. The main purpose of assessing the state of the joint is to be able to compare the state before and after a manipulation. After a successful manipulation, the joint may have: a greater range of motion and require less force to move along its range of motion for instance. The data points assessed can vary depending on the objective of the practitioner, and the type of joint that is being manipulated. 100661 The state of a subject is determined by the physiological sensors such as heart rate (HR), sweat response, body temperature and others including pain sensors. Again, the main idea is to compare the state of the subject before and after a manipulation. For many sensors, the state of the subject can be continuously monitored during all three phases (the pre-assessment, manipulation and post-assessment). During a“bad” manipulation the sweat response, heart rate, body temperature and/or pain of a subject can alter adversely.
[0067] The techniques that the practitioner utilizes during treatments are also quantified in each phase. How fast, where and how hard the practitioner presses are all examples by which the practitioner is evaluated, and these contribute to the data accumulation.
100681 Before determining the state of a joint, it is beneficial to communicate to the system of the present invention that a pre-assessment is occurring and/or have the system detect that a
manipulation is about to occur. Several methods can be used to accomplish this task, and the following are two possible methods for detecting the start of a manipulation sequence.
[0069] One approach uses a foot pedal to signal the start of a pre-assessment. The foot pedal is clicked before the pre-assessment starts. This click is recorded and time stamped. Then the practitioner starts the pre-assessment. Once the pre-assessment is complete, the practitioner clicks on the foot pedal again. Similarly, the practitioner clicks before and after the manipulation and the post-assessment. Using this approach, the data is cut into small blocks, each containing either a pre-assessment, a manipulation or a post-assessment.
[0070] Another approach to detect the start of the pre-assessment phase is to use a comparison of the measurements of certain sensors in the edge devices. In such a case, certain sensors are expected to have particular values during the pre-assessment when compared to values of other sensors for the same time point. For example, the pre-assessment phase can be detected by using two Inertial Measurement Unit (IMU) sensors and a force sensor. Each IMU consists of a tri-axis accelerometer, gyroscope and magnetometer. One IMU is attached to the practitioner's hand.
The other IMU is attached to the patient's joint that is being assessed. For example, if the neck joint is being assessed, then the IMU can be attached to the head of the patient using a headband. The force sensor is worn/used by the practitioner. When the practitioner is moving the head to determine the range of motion, both IMUs are moving around the same axis of rotation. In addition, the force exerted on the head is non-zero. When the practitioner is not moving the head of the patient, the IMUs will be moving independently. Thus, to determine if the pre-assessment is occurring, in this example, one can look for a portion of the data where the force is non-zero and both sensors are moving along the same axis of rotation.
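A minimal sketch of this detection rule is given below, assuming that each IMU reports a tri-axis angular-velocity vector expressed in a common reference frame and that the force sensor reports a single scalar reading; the threshold values are hypothetical and would be tuned for the particular sensors.

```python
import math

def _unit(v):
    n = math.sqrt(sum(c * c for c in v))
    return None if n < 1e-9 else [c / n for c in v]

def is_pre_assessment(gyro_practitioner, gyro_subject, force,
                      force_threshold=1.0, alignment_threshold=0.9):
    """Return True if the pre-assessment condition described above holds:
    the force is non-zero (above a small threshold) and both IMUs are
    rotating about (approximately) the same axis.

    gyro_* : (wx, wy, wz) angular velocity of each IMU, assumed to be
             expressed in a common reference frame.
    """
    if force < force_threshold:
        return False
    a, b = _unit(gyro_practitioner), _unit(gyro_subject)
    if a is None or b is None:          # one of the sensors is not rotating
        return False
    cos_angle = abs(sum(x * y for x, y in zip(a, b)))
    return cos_angle > alignment_threshold  # axes of rotation nearly parallel

# Practitioner and subject rotating about nearly the same axis, with contact force applied:
print(is_pre_assessment((0.1, 0.9, 0.05), (0.12, 0.95, 0.02), force=12.0))  # True
# No contact force -> not a pre-assessment even if both are moving:
print(is_pre_assessment((0.1, 0.9, 0.05), (0.12, 0.95, 0.02), force=0.0))   # False
```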
[0071] Once it has been determined that the pre-assessment is occurring, the state of the joint is determined from the data. In order to determine the state of a joint, an IMU having an accelerometer, gyroscope, and magnetometer, and, if available, force sensor(s), can be used. By using force sensor(s), additional metrics are incorporated to assess the state of the joint. To determine the degrees of rotation of a joint, the practitioner moves the joint along its axes of rotation while wearing or using a manipulation device having an edge device with the IMU. Using the IMU data, the orientation of the sensor is determined. There exist many algorithms to determine the orientation of a sensor using IMU data, such as Kalman filtering (see L. Marins, et al., In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 4: 2003-2011, Maui, Hawaii, USA (Nov. 2001)), Mahony filtering (see R. Mahony, et al., IEEE Transactions on Automatic Control, 53(5): 1203-1218 (2008)), Madgwick filtering (see S. O. H. Madgwick, et al., In Proceedings of the IEEE International Conference on Rehabilitation Robotics, pp. 1-7, Zurich, Switzerland (Jun. 2011)) and Bayesian modelling (see Manon Kok, et al., Foundations and Trends in Signal Processing, 11(1-2): 1-153 (2017)), for instance. Because the joint is isolated in motion by the practitioner and rotated along a desired axis of rotation, the change in orientation of the rotated end is equal to the degrees of rotation that the joint is moved. Therefore, the range of motion of a joint can be determined using IMU data and an orientation algorithm.
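The cited filters fuse accelerometer, gyroscope and magnetometer data to estimate orientation robustly. As a much simpler illustration of the underlying idea, the sketch below dead-reckons orientation by integrating gyroscope samples as quaternions and reports the net change in orientation as an angle; a practical system would use one of the cited filters to correct the drift inherent in pure gyroscope integration. The helper names (quat_multiply, integrate_gyro, rotation_angle_deg) are illustrative.

```python
import math

def quat_multiply(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(gyro_samples, dt):
    """Integrate tri-axis angular-velocity samples (rad/s) into an orientation
    quaternion, starting from the identity orientation."""
    q = (1.0, 0.0, 0.0, 0.0)
    for wx, wy, wz in gyro_samples:
        rate = math.sqrt(wx*wx + wy*wy + wz*wz)
        angle = rate * dt
        if angle < 1e-12:
            continue
        s = math.sin(angle / 2.0) / rate
        dq = (math.cos(angle / 2.0), wx * s, wy * s, wz * s)
        q = quat_multiply(q, dq)
    return q

def rotation_angle_deg(q):
    """Angle of the net rotation encoded by quaternion q."""
    w = max(-1.0, min(1.0, q[0]))
    return math.degrees(2.0 * math.acos(abs(w)))

# 1 s of rotation at ~45 deg/s about the z-axis, sampled at 100 Hz, yields a
# change in orientation (i.e., degrees of rotation of the joint) of ~45 degrees.
samples = [(0.0, 0.0, math.radians(45.0))] * 100
print(round(rotation_angle_deg(integrate_gyro(samples, dt=0.01)), 1))
```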
[0072] In one embodiment, the algorithm used to determine the state of the joint includes determining:
1. the starting point of the rotation.
2. the end point of the rotation.
3. the angle of rotation between these two points.
[0073] The start of the rotation can be determined in any number of ways. One way to determine the start of the rotation is as follows. Before the rotation, the practitioner pauses for 5 seconds. The start of the rotation can then be determined by looking in the IMU sensor data for a 5-second pause, followed by a steady rotation along a constant axis. In addition, the force sensor will read an increased value during the rotation (when the practitioner is not touching the subject, the force sensor value will be near zero).
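A minimal sketch of this start-detection rule follows, operating on a gyroscope-magnitude series and a force series sampled at a common rate; the 5-second pause comes from the description above, while the remaining thresholds are hypothetical.

```python
def find_rotation_start(gyro_mag, force, fs,
                        pause_s=5.0, still_thresh=0.05,
                        motion_thresh=0.2, force_thresh=1.0,
                        motion_window_s=0.5):
    """Return the sample index at which the rotation starts, or None.

    Implements the rule described above: look for a pause of ``pause_s``
    seconds (gyroscope magnitude below ``still_thresh``) followed by a
    sustained rotation (gyroscope magnitude above ``motion_thresh`` for
    ``motion_window_s`` seconds) while the force reading is elevated.
    The thresholds are hypothetical and would be tuned per sensor.
    """
    pause_n = int(pause_s * fs)
    motion_n = int(motion_window_s * fs)
    for i in range(pause_n, len(gyro_mag) - motion_n):
        paused = all(g < still_thresh for g in gyro_mag[i - pause_n:i])
        moving = all(g > motion_thresh for g in gyro_mag[i:i + motion_n])
        touching = all(f > force_thresh for f in force[i:i + motion_n])
        if paused and moving and touching:
            return i
    return None

# 6 s of stillness, then rotation with contact force, sampled at 10 Hz:
fs = 10
gyro = [0.0] * 60 + [0.8] * 30
frc = [0.0] * 60 + [5.0] * 30
print(find_rotation_start(gyro, frc, fs))  # 60
```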
[0074] Also, the end point of the rotation can be determined in several ways. One way to calculate the end point of the rotation is as follows. During the pre- and post-assessments the joint is rotated from a starting point to an end point, and then back to the starting point. Thus, the end point of the rotation is the extreme point. This is the point that has maximal deviation from the initial starting point. The following steps describe one way to determine the extreme point/end point:
1. Starting at the start point, convert the orientation into points in 3D space. This can be done by choosing the start point as the point [0, 0, 1], for instance, and then rotating along the axis. The result is a trace of a rotation in 3D space as shown in Fig. 10.
2. Determine the angle and axis of rotation between consecutive points along the 3D trace: calculate the angle and axis of rotation between each point in the 3D space. A number of algorithms exist that can determine the angle of rotation and the axis of rotation given a set of points in 3D space. Examples include the TRIAD and QUEST algorithms (see M. D. Shuster et al., J. of Guidance, Control, and Dynamics, 4(1): 70-77 (1981); and B. K. P. Horn, J. of the Optical Society of America A, 4(4): 629-642 (1987)). Alternatively, the dot product can also be used to calculate the angle between two points in 3D space. In this case, the angle between the start point of the assessment and every other point is calculated using the dot product. The result is shown in Fig. 9.
3. The end point of the rotation is the point of maximal deviation from the start point, and is denoted with the curve in Fig. 9.
100751 Once the start and end points of the rotation are known, the angle between the start and end point must be calculated. This can again be done using the TRIAD or QUEST algorithms or using a dot product.
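The dot-product variant of steps 2 and 3, together with the angle calculation of the preceding paragraph, can be sketched as follows; the example assumes the 3D trace has already been generated from the orientation data as in step 1, and the TRIAD or QUEST algorithms could be substituted for the dot product.

```python
import math

def angle_between_deg(p, q):
    """Angle between two unit vectors in 3D, via the dot product."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    return math.degrees(math.acos(dot))

def range_of_motion(trace):
    """Given the 3D trace of an assessment rotation (unit vectors obtained by
    rotating a reference point such as [0, 0, 1] by each measured orientation),
    find the extreme point -- the maximal angular deviation from the start
    point -- and report it as the range of motion in degrees."""
    start = trace[0]
    deviations = [angle_between_deg(start, p) for p in trace]
    end_index = max(range(len(trace)), key=deviations.__getitem__)
    return deviations[end_index], end_index

# A rotation about the x-axis from the start point out to 60 degrees and back:
trace = [(0.0, math.sin(math.radians(a)), math.cos(math.radians(a)))
         for a in list(range(0, 61, 5)) + list(range(60, -1, -5))]
rom_deg, idx = range_of_motion(trace)
print(round(rom_deg, 1), idx)  # 60.0 at the turning point of the trace
```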
100761 In addition, the peak force and average force between the start and end point are calculated. The state is then displayed on a screen.
[0077] The state of the subject during the pre-assessment, manipulation and post-assessment is determined using the CNS sensors that are activated. For example, the data from heart rate and body temperature sensors worn or used by the subject are gathered. The mean and peak beats per minute and body temperature during the pre- and post-assessment are calculated.
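A trivial sketch of these summary statistics is shown below, assuming the heart rate and body temperature samples for one phase are available as simple lists.

```python
def summarize_cns(heart_rate_bpm, body_temp_c):
    """Mean and peak heart rate and body temperature for one assessment phase,
    as described above.  Inputs are lists of samples recorded during that phase."""
    return {
        "hr_mean_bpm": sum(heart_rate_bpm) / len(heart_rate_bpm),
        "hr_peak_bpm": max(heart_rate_bpm),
        "temp_mean_c": sum(body_temp_c) / len(body_temp_c),
        "temp_peak_c": max(body_temp_c),
    }

print(summarize_cns([68, 70, 75, 72], [36.6, 36.7, 36.8, 36.7]))
```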
100781 Manipulation and Subject State Determination During Manipulation: During the manipulation, the state of the subject is determined using CNS sensors such as the heart rate monitor and body temperature sensor.
100791 In one aspect, the manipulation is quantified, in part, by the peak force and radial velocity as follows:
During the data segment denoted as the manipulation segment, a spike detection algorithm is used to look for large spike(s) in force and radial velocity.
If large spikes in force and radial velocity are found at a time point, then this determines the manipulation; the maximum force and radial velocity are recorded at this time.
100801 Once a manipulation has been detected, the peak force and acceleration are displayed on a screen.
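The spike-based detection described in the preceding paragraphs can be sketched as follows; the spike criterion (a multiple of the segment mean) and the coincidence window are hypothetical stand-ins for whatever spike detection algorithm is actually used.

```python
def detect_manipulation(force, radial_velocity, spike_factor=4.0, window=5):
    """Very simple spike detector for the manipulation segment.

    A sample counts as a spike if it exceeds ``spike_factor`` times the
    mean of its signal.  If force and radial velocity spike within
    ``window`` samples of each other, the manipulation is considered
    detected and the peak values are returned.  Thresholds are hypothetical.
    """
    def spike_indices(signal):
        baseline = sum(signal) / len(signal)
        return [i for i, v in enumerate(signal) if v > spike_factor * baseline]

    f_spikes, v_spikes = spike_indices(force), spike_indices(radial_velocity)
    for i in f_spikes:
        if any(abs(i - j) <= window for j in v_spikes):
            return {"detected": True,
                    "peak_force": max(force),
                    "peak_radial_velocity": max(radial_velocity)}
    return {"detected": False}

# A short, sharp thrust: force and radial velocity spike at the same instant.
force = [2, 2, 2, 3, 40, 3, 2, 2]
radial_velocity = [5, 5, 6, 8, 180, 10, 6, 5]
print(detect_manipulation(force, radial_velocity))
```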
100811 In another aspect, if the manipulation is a joint pumping technique which requires numerous and multiple contacts, the average force and average radial velocities can be calculated to give appropriate data sets for an extended treatment modality such as this.
100821 Quantifiable data measured by the methods and systems of the present invention during a manipulation include any combination of the following:
100831 Force: The amount of force and rate of force development, applied by a practitioner on a body part of a subject to elicit an action on the body part or a body part in communication therewith.
[0084] Acceleration: The acceleration of a practitioner's hand(s) and/or related body part(s)/tool(s) as applied to a body part of a subject to elicit an action on the body part or a body part in communication therewith. The acceleration also refers to that of one or more body parts of the subject as the manipulation/contact is being applied by the practitioner.
[0085] Velocity: The velocity of a practitioner's hand(s) and/or related body part(s)/tool(s) as applied to a body part of a subject to elicit an action on the body part or a body part in
communication therewith. The velocity also refers to that of one or more body parts of the subject as the manipulation/contact is being applied by the practitioner. [0086] Positioning: The position of a practitioner's hand(s) and/or related body part(s)/tool(s) as applied to a body part of a subject to elicit an action on the body part or a body part in
communication therewith. The position also refers to that of one or more body parts of the subject as the manipulation/contact is being applied by the practitioner.
100871 Timing: The recorded time (the time point) of measurements before, during and after the manipulation/ contact.
100881 Audio Recordings of Joint(s) Cavitation: Audible sounds created during a joint cavitation during manipulation/contact on a subject by a practitioner.
100891 Video Recordings: Video recordings created before, during and after manipulation/contact on a subject by a practitioner.
100901 Central Nervous System: The central nervous system responses before, during, and after manipulation/contact on a subject by a practitioner which include: heart rate, heart rate variability, vascular flow, temperature, respiration, blood pressure, blood glucose, brainwaves, oxygen saturation, blood lactate, saliva and sweat profiles, and pain data.
100911 Computer system
[0092] Referring to Fig. 8, computer system 800 has processor 802, memory 804, communications interface/hub 806, network/cloud 812, output device 808 and input device 810. The edge device is an input and output device that communicates directly with the cloud or indirectly via a hub. A processor, which can be part of the hub and/or the cloud, receives the subject and practitioner data and compiles and/or analyzes it. In an example, the hub or communication module can be configured to implement a communication protocol based on Bluetooth® technology, Wi-Fi, Wi-Max, IEEE 802.11 technology, a radio frequency (RF) communication, an infrared data association (IrDA) compatible protocol, or a shared wireless access protocol (SWAP).
100931 Fig. 8 shows the general architecture of an illustrative computer system 800 that can be employed to implement any of the computer systems discussed herein. The computer system 800 of Fig. 8 includes one or more processors 802 communicatively coupled to memory 804, one or more communications interfaces/hub 806, and one or more output devices 808 (e.g., one or more display units) and one or more input devices 810.
[0094] In the computer system 800 of Fig. 8, the memory 804 includes any computer-readable storage media and can store computer instructions such as processor-executable instructions for implementing the various functionalities described herein for respective systems, as well as any data relating thereto, generated thereby, or received via the communications interface(s) or input device(s). The processor(s) 802 shown in Fig. 8 can be used to execute instructions stored in the memory 804 and, in so doing, also may read from or write to the memory various information processed and/or generated pursuant to execution of the instructions.
100951 The processor 802 of the computer system 800 shown in Fig. 8 also may be
communicatively coupled to or control the communications interface/hub 806 to transmit or receive various information pursuant to execution of instructions. For example, the communications interface/hub 806 and/or input device 810 can be coupled to a wired or wireless network, bus, or other communication means and may therefore allow the computer system 800 to transmit information to and/or receive information from other devices (e.g., other computer systems). While not shown explicitly in the system of Fig. 8, one or more communications interfaces facilitate information flow between the components of the system 800. In some implementations, the communications interface/hub can be configured (e.g., via various hardware components or software components) to provide a website as an access portal to at least some aspects of the computer system 800.
100961 The output devices 808 of the computer system 800 shown in Fig. 8 may be provided, for example, to allow various information to be viewed or otherwise perceived in connection with execution of the instructions. The input device(s) 810 can be provided, for example, to allow a user to make manual adjustments, make selections, enter data or various other information, or interact in any of a variety of manners with the processor during execution of the instructions.
100971 Joints, Central Nervous System and Manual therapy
[0098] Joints of biological beings are the points where two or more bones meet. In Homo sapiens, the adult frame is composed of 206 bones. Bones are classified as hard and/or calcified tissues in the human body. The skeletal system is divided into two categories: the axial skeleton and the appendicular skeleton. The axial skeleton consists of the bones of the head and trunk. The appendicular skeleton consists of the bones that support the appendages. Osteokinematics is the manner in which bones move in space without regard for joint surfaces. Arthrokinematics refers to the movement of joint surfaces. There are three types of joints: fibrous, cartilaginous and synovial. In mammals the most common and most moveable joint is the synovial joint.
[0099] Joints are surrounded by various tissue structures, which include tendons, ligaments, fascia, skin, fibrous tissues, adipose tissues, synovial membranes, muscles, nerves and vascular vessels. The bones of the skeleton, muscles, cartilage, tendons, ligaments, joints and other connective tissues make up the musculoskeletal system, also known as the locomotor system. The musculoskeletal system supports the body, allows motion to occur and protects vital organs.
[00100] The central nervous system (CNS) comprises the brain and spinal cord. The peripheral nervous system (PNS) comprises all the nerves outside the brain and spinal cord that carry messages to and from the CNS. The entire CNS is enclosed in bone: the brain is protected by the skull and the spinal cord is protected by the vertebrae of the spinal column. The CNS is responsible for integrating sensory information from ascending afferent fibers (PNS) and responding appropriately with a command sent back by descending efferent fibers (PNS). All afferent input coming from the locomotor system is known as proprioception, i.e., the sense of position.
[00101] Central nervous system lesions, general pathologies and/or musculoskeletal misalignments can adversely impact nerves, vascular structures, visceral organs, glands, connective tissues, non-connective tissues, joint surfaces, tendons, ligaments, fascia and the like.
[00102] Manual therapy attempts to influence the CNS, viscera and the musculoskeletal system by aligning select structures to allow for the appropriate flow of fluids and electrical signals throughout the body to assist with healing, maintaining and optimizing health.
[00103] Joint manipulation is characteristically associated with the production of an audible click, popping or cracking sound. The phenomenon of “joint cracking” is the result of cavity inception within synovial fluid, which is consistent with tribonucleation. Tribonucleation is a process in which opposing surfaces resist separation until a critical point, when they separate rapidly, resulting in vapor cavities that do not collapse instantaneously. The resulting drop in synovial pressure allows dissolved gas to come out of solution and create a “clear space” (bubble, cavity, void) within the joint. The cavity formed at the time of rapid joint separation persists past the point of sound production.
[00104] Osteo-articular joint pumping, joint adjustments, manipulations and mobilizations are performed by manual therapy practitioners, who include Chiropractors, Physical Therapists (Physiotherapists), Osteopaths (trained outside the United States) and Osteopathic Physicians (DOs, trained within the United States). Depending on the practitioner and geographical location, the terminology of the technique(s) utilized is intertwined. Chiropractors refer to the manipulation of a spinal joint as an adjustment. Manipulation is synonymous with a Grade V mobilization as developed by Geoffrey Maitland, who classified joint mobilizations into five grades of motion. The term “high velocity, low amplitude” (HVLA) thrust is often interchangeable with a joint manipulation or impulse.
[00105] There are also multiple forms of joint mobilizations, known as non-thrust manipulations, which do not require an HVLA thrust and are additionally performed by manual therapists such as Naprapaths, Massage Therapists, Performance Therapists, Athletic Therapists, Cranial Sacral Therapists, Ki-Hara Practitioners, Reflexologists and Rolfing Practitioners.
[00106] Osteo-articular joint pumping is an example of a precise non-thrust technique. Joint pumping utilizes micro movements to attempt to improve a joint’s range of motion and change the quality of the surrounding connective tissue (http://www.evolutionsinstitute.com).
[00107] The International Federation of Orthopaedic Manipulative Physical Therapists defines manipulation and mobilization as follows. Manipulation: a passive, high velocity, low amplitude thrust applied to a joint complex within its anatomical limit with the intent to restore optimal motion, function and/or to reduce pain. Mobilization: a manual therapy technique comprising a continuum of skilled passive movements to the joint complex that are applied at varying speeds and amplitudes, that may include a small-amplitude/high velocity therapeutic movement (manipulation) with the intent to restore optimal motion, function and/or to reduce pain (http://www.physio-pedia.com/Manual_Therapy).
[00108] Joint pumping, adjustments, manipulations and mobilizations are most frequently administered to patients by the practitioner’s hands, but they can also be applied, individually or in combination, with any part of the practitioner’s body, such as the feet, knees, legs, hips, torso, back and/or head. Various aptly constructed instruments, such as drop tables, straps, harnesses, clamps, impulsers and activators, can also be used by practitioners to perform joint therapies. The terms manipulation and/or contact as used in the present invention include all of the aforementioned, including joint pumping, adjustments, manipulations, mobilizations, and the like.
[00109] Joint manipulations are performed on patients of every age demographic. Every joint in living beings can be influenced via direct or indirect contact.
[00110] Also disclosed herein are methods of making and methods of using any of the edge devices described herein. In at least some embodiments, the method includes: providing a power supply operable to power the edge device; providing at least one sensor device operable to obtain at least one measurement of the user; providing an analog-to-digital converter in communication with the sensor; providing an actuator in communication with a digital-to-analog converter; and providing a wireless communication component operable to transmit data indicative of the at least one measurement obtained by the at least one sensor. These components can be attached to a substrate. The edge device further includes an edge computer having a processor, wireless connectivity, memory and a power supply. In an embodiment the edge computer and the rest of the edge device are coupled to one another using a serial bus, forming a “sandwich.” The method may optionally include, singly, collectively, in any order and/or in any combination: electrically connecting (e.g., via a plurality of flexible interconnects embedded on or within the flexible substrate) one or more or all of the aforementioned components.
[00111] Examples of the subject matter and the operations described herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Examples of the subject matter described herein can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a computer storage medium for execution by, or to control the operation of, data processing apparatus. The program instructions can be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to a suitable receiver apparatus for execution by a data processing apparatus. A computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. Moreover, while a computer storage medium is not a propagated signal, a computer storage medium can be a source or destination of computer program instructions encoded in an artificially generated propagated signal. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
[00112] The operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
[00113] The terms “data processing apparatus,” “computing device,” “server” and “hub” encompass all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
[00114] A computer program (also known as a program, software, software application, script, application or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
[00115] The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform actions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatuses can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
[00116] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor receives instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer can include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), for example. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[00117] To provide for interaction with a user, examples of the subject matter described herein can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), plasma, or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse, touch screen or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
[00118] Examples of user interface screen shots of the software are shown in Figs. 12 and 13. Fig. 12 shows the menu and the option of choosing to view data from the Practitioner’s hand strap or the Patient’s (e.g., subject’s) headband. Fig. 13 shows an example of data displayed when using the system of the present invention. In this case, Fig. 13 shows streaming motion data from the Practitioner’s hand strap and the force data from the Practitioner’s hand strap; the peaks show the use of force during a manipulation. Data from any of the sensors can be displayed in this fashion (e.g., Figs. 7A, 7B, 9 and 11) or can be compiled and/or shown on the same screen. Additionally, as described herein, a three-dimensional output can also be provided in two-dimensional (Fig. 10) or three-dimensional form.
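By way of illustration only, the following short Python sketch plots a recorded stream of hand-strap force samples against time and marks the largest spike, in the spirit of the Fig. 13 display; the file name and column names ("hand_strap_recording.csv", "time_s", "force_raw") are assumptions for the sketch and are not part of the original disclosure.

# Minimal sketch (not the actual product software): plot recorded hand-strap
# force samples and mark the largest spike, similar in spirit to Fig. 13.
import csv

import matplotlib.pyplot as plt


def load_samples(path):
    """Read (time, force) pairs from a simple two-column CSV recording."""
    times, forces = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            times.append(float(row["time_s"]))
            forces.append(float(row["force_raw"]))
    return times, forces


def plot_force(times, forces):
    """Line plot of force vs. time with the largest peak highlighted."""
    peak_index = max(range(len(forces)), key=forces.__getitem__)
    plt.plot(times, forces, label="hand strap force")
    plt.plot(times[peak_index], forces[peak_index], "o", label="largest peak")
    plt.xlabel("time (s)")
    plt.ylabel("force sensor output (raw units)")
    plt.legend()
    plt.show()


if __name__ == "__main__":
    t, f = load_samples("hand_strap_recording.csv")
    plot_force(t, f)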
[00119] Examples of the subject matter described herein can be implemented in a computing system that includes a back end component, e.g., a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network using hardwired, wireless or other means. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
[00120] The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some examples, a server transmits data to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
[00121] While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the systems and methods described herein. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
[00122] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results.
[00123] In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
EXEMPLIFICATION
[00124] Example 1: Procedures For Making The Components Of The Present Invention
[00125] Step 1 - Practitioner hand strap manipulation device having an edge device construction
Easily cleaned, durable strap/webbing material
Constructed to hold various sensors inside
Skin contact surface of strap has EMF protection material
Adjustable for size
Fabrics purchased from Ultrafabrics Inc. (LAVA #558-5600)
EMF material purchased from Defender Shield (Florida, USA)
[00126] Step 2 - Subject headband manipulation device having an edge device construction
Easily cleaned, durable headband material
Constructed to adjust to the size of the head
Skin contact surface of headband has EMF protection material
Standard headband elastic with glued-on sensors
EMF material purchased from Defender Shield (Florida, USA)
[00127] Step 3 - Practitioner hand strap manipulation device having an edge device
[00128] Force sensor is connected to the analog-to-digital converter peripheral of the microcontroller
[00129] 9D motion sensor is connected to the Serial Peripheral Interface (SPI) or to the inter-integrated circuit (I2C) bus of the microcontroller.
[00130] SPI is a synchronous serial communication electronic bus. The interface specification of this bus is used for short distance communication, primarily in embedded systems. I2C is a synchronous, multi-master, multi-slave, packet switched, single-ended serial electronic bus. The interface specification of this bus is used for short distance communication, primarily in embedded systems. It is used for attaching lower-speed peripherals to processors and microcontrollers in short-distance, intra-board communication.
[00131] Heart rate, humidity & temperature sensors are connected to the I2C or SPI bus of the microcontroller. MbientLab MetaWear CPRO modules (MbientLab Inc.); FlexiForce A201, FlexiForce A401 and FlexiForce A502 force sensors (Tekscan, Inc., Boston, Massachusetts); and ESP32-based modules (Espressif Systems) were used as sensors in the edge device.
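As a rough illustration of the Step 3 wiring, the MicroPython-style sketch below reads a force sensor through the microcontroller's ADC peripheral and a 9D motion sensor over the I2C bus on an ESP32-class module; the pin numbers, I2C address and register offset are placeholders, not values taken from the filing.

# Minimal MicroPython-style sketch (ESP32) of the Step 3 wiring: the force
# sensor feeds the ADC peripheral and the 9D motion sensor is read over I2C.
# Pin numbers, the I2C address (0x6B) and the register offset (0x22) are
# illustrative placeholders only.
import time

from machine import ADC, I2C, Pin

adc = ADC(Pin(34))            # force sensor on an ADC-capable pin
adc.atten(ADC.ATTN_11DB)      # full input range on the ESP32 port

i2c = I2C(0, scl=Pin(22), sda=Pin(21), freq=400000)
IMU_ADDR = 0x6B               # hypothetical 9D motion sensor address
IMU_DATA_REG = 0x22           # hypothetical first data register


def read_sample():
    """Return one (force, raw IMU bytes) sample from the two buses."""
    force_raw = adc.read()                                   # 12-bit ADC count
    imu_raw = i2c.readfrom_mem(IMU_ADDR, IMU_DATA_REG, 12)   # e.g. gyro + accel
    return force_raw, imu_raw


while True:
    force, imu = read_sample()
    print(force, imu)
    time.sleep_ms(10)         # ~100 Hz polling loop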
[00132] Step 4 - Subject headband
[00133] 9D motion sensor is connected to the SPI or I2C bus of the microcontroller
[00134] CNS devices are connected to the SPI or I2C bus of the microcontroller
[00135] Heart rate, humidity & temperature sensors are connected to the I2C or SPI bus of the microcontroller. MbientLab MetaWear CPRO modules (MbientLab Inc.); FlexiForce A201, FlexiForce A401 and FlexiForce A502 force sensors (Tekscan, Inc., Boston, Massachusetts); and ESP32-based modules (Espressif Systems) were used as sensors in the edge device.
[00136] Step 5 - Foot pedal control
[00137] To allow for hands-free control
[00138] Used to initiate and/or time stamp when assessment(s) and/or manipulations are commencing/terminating.
[00139] IKKEGOL USB double switch pedal (FS2016_B2, IKKEGOL, China)
[00140] Step 6 - Edge (microcontroller) & hub devices communication
[00141] Bluetooth Low Energy is used for setup and control of the edge (microcontroller) in order to minimize energy consumption
[00142] Wi-Fi is used for I/O data exchange between the edge (microcontroller) and the local hub computer in order to maximize transfer bandwidth.
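A minimal sketch of this edge-to-hub data path, assuming a plain UDP socket over Wi-Fi and a JSON payload (neither of which is specified in the filing), is given below; the Bluetooth Low Energy setup and control traffic is omitted.

# Minimal sketch of the Step 6 data path: the edge streams sensor samples to
# the local hub over Wi-Fi. The UDP port, hub address and JSON payload layout
# are illustrative assumptions only.
import json
import socket
import time

HUB_ADDR = ("192.168.1.50", 9000)   # hypothetical hub IP and port


def edge_send_loop(read_sample):
    """Edge side: push each sample to the hub as a small JSON datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        force, gyro = read_sample()
        packet = {"t": time.time(), "force": force, "gyro": gyro}
        sock.sendto(json.dumps(packet).encode(), HUB_ADDR)
        time.sleep(0.01)            # ~100 Hz stream


def hub_receive_loop(handle_sample):
    """Hub side: collect datagrams from edge devices and hand them off."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", HUB_ADDR[1]))
    while True:
        data, sender = sock.recvfrom(4096)
        handle_sample(sender, json.loads(data.decode()))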
[00143] Step 7 - Hub processing device
[00144] Server for data collection from all sensors
[00145] Server to communicate with the cloud
[00146] Server for the web-based user experience (UX)
[00147] Step 8 - UX dashboard
[00148] Web browser to access the UX server
[00149] Applications for direct UX access to the edge and hub
[00150] Smart phone + tablet, obtained as an iPhone and iPad from Apple Inc.
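One possible shape for the hub's web-based UX server, sketched here with Flask purely as an assumed example (the filing does not name the web stack), is an endpoint that returns the most recent sensor packets so a browser dashboard can poll and chart them.

# Minimal sketch of a hub-side web endpoint a browser dashboard could poll.
# Flask, the route name and the in-memory buffer are illustrative choices.
from collections import deque

from flask import Flask, jsonify

app = Flask(__name__)
latest_samples = deque(maxlen=500)   # rolling buffer of recent sensor packets


def store_sample(sender, packet):
    """Called by the hub receive loop for every incoming sensor packet."""
    latest_samples.append({"sender": str(sender), **packet})


@app.route("/latest")
def latest():
    """Return the rolling buffer so the dashboard can draw live charts."""
    return jsonify(list(latest_samples))


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)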
[00151] Step 9 - Cloud computing & IoT infrastructure
[00152] Data collection, storage, analytics
[00153] Client-based data
[00154] Example 2: Procedure for Cervical Spine Manipulation - utilizing a high velocity low amplitude (HVLA) technique.
[00155] Methods:
[00156] Step 1
[00157] The practitioner flipped a small switch on the side of each sensor board, on the hand strap and the subject headband, to turn on both devices.
[00158] The practitioner then placed the hand strap on their intended treatment hand and placed the headband on the patient’s head.
[00159] The practitioner used a hub device (an Apple laptop computer) to activate the system of the present invention and connected both practitioner and subject sensors wirelessly to the user interface.
[00160] The user interface allowed the practitioner to make any necessary changes to the sensors, such as identifying/naming each sensor, adjusting the sensitivity of data recording on each sensor and visually inspecting all the data streaming into the interface.
[00161] Communication between the sensors and the interface on the hub device was confirmed with a blinking orange light on both edge devices (i.e., hand strap and headband). This connection, in the current version of the invention, was established through a Wi-Fi signal.
[00162] Once the sensors established power and communication with the hub device, data recording began.
[00163] A foot pedal was connected to the hub device and placed appropriately near the treatment table for easy access by the practitioner’s feet.
[00164] When the foot pedal was pressed, it time stamped the incoming data so as to accurately identify each phase (pre-assessment, applied technique and post-assessment). The foot pedal was utilized in this phase of the current invention to assist with teaching the machine learning algorithm and recognition software. In future versions a foot pedal will not be required to time stamp each applicable phase, as the system will automatically recognize when assessments and techniques are being performed.
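A minimal sketch of this pedal-based phase labeling, under the assumption that alternating pedal presses mark the start and end of each phase, follows; the data structures are illustrative only.

# Minimal sketch of foot-pedal time stamping: pedal presses are treated as
# alternating start/end marks, so presses 1-2 bracket the pre-assessment,
# 3-4 the applied technique and 5-6 the post-assessment.
PHASES = ["pre-assessment", "applied technique", "post-assessment"]


def label_samples(samples, pedal_times):
    """Attach a phase label to each (timestamp, value) sample."""
    windows = list(zip(pedal_times[0::2], pedal_times[1::2], PHASES))
    labeled = []
    for t, value in samples:
        phase = "unlabeled"
        for start, end, name in windows:
            if start <= t <= end:
                phase = name
                break
        labeled.append((t, value, phase))
    return labeled


if __name__ == "__main__":
    presses = [10.0, 15.0, 25.0, 26.0, 40.0, 45.0]    # six pedal presses
    stream = [(float(t), 0.0) for t in range(0, 60, 5)]  # dummy samples every 5 s
    for row in label_samples(stream, presses):
        print(row)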
[00165] The subject’s headband accumulated central nervous system data points continuously throughout the whole treatment, from the moment the sensor was turned on. Central nervous system markers included heart rate and temperature data in the current version of the invention.
[00166] Step 2
[00167] The subject was positioned appropriately in a supine position by the practitioner for a pre-assessment. The foot pedal was pressed to signal the commencement of the pre-assessment. This particular pre-assessment required passive side flexion of the subject’s head, to the right, by the practitioner, until an end feel was reached, at which point the patient’s head was moved back to the original neutral starting position. Only one repetition (one right passive side bend) was performed.
[00168] The pre-assessment was performed to determine the range of motion of the cervical spine utilizing the gyroscope sensor embedded in the subject’s headband. The accelerometer determined how fast the patient’s head and the practitioner’s hand moved, respectively, during this assessment. The metrics recorded on the practitioner’s hand strap included velocity and position (m/s, degrees) of the hand and the amount of force applied to the subject’s head.
[00169] When the practitioner completed the pre-assessment, the foot pedal was pressed again to time stamp the data set.
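One simple way such a range-of-motion estimate could be formed from the headband gyroscope output, shown here only as an assumed sketch (the filing does not spell out the estimator), is to integrate the angular velocity about the flexion axis over the assessment window and take the span of the resulting angle trace.

# Minimal sketch: integrate gyroscope angular velocity (deg/s) to a deviation
# angle and report its span as the range of motion. Sampling rate, axis choice
# and units are assumptions.
import numpy as np


def deviation_angle(gyro_dps, dt):
    """Cumulative head deviation angle (degrees) from angular velocity."""
    gyro_dps = np.asarray(gyro_dps, dtype=float)
    return np.cumsum(gyro_dps) * dt


def range_of_motion(angle_deg):
    """Range of motion is the span of the deviation-angle trace."""
    angle_deg = np.asarray(angle_deg, dtype=float)
    return float(angle_deg.max() - angle_deg.min())


if __name__ == "__main__":
    dt = 0.01                                        # 100 Hz samples (assumed)
    t = np.arange(0.0, 4.0, dt)
    simulated_gyro = 30.0 * np.sin(np.pi * t / 4.0)  # one slow side bend
    angle = deviation_angle(simulated_gyro, dt)
    print(f"range of motion ~ {range_of_motion(angle):.2f} degrees")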
[00170] Step 3
[00171] The subject was repositioned by the practitioner into the correct position necessary to receive the appropriate cervical spine manipulation. In this situation, the supine position was maintained and the hand position was adjusted accordingly on the patient’s cervical spine (neck) to deliver the manipulation technique.
[00172] The foot pedal was pressed before delivering the high velocity low amplitude technique.
[00173] One cervical spine manipulation was delivered, to the right side of the patient’s cervical spine, with the practitioner hand strap and subject headband recording data.
[00174] In addition to the central nervous system data points from the subject’s headband, the velocity (m/s) and change in position (degrees) of the head were recorded. The practitioner’s hand strap recorded the velocity (m/s) and change in position (degrees) of the practitioner’s hand and the force (Newtons) applied by the practitioner’s hand to the subject’s cervical spine during the manipulation.
[00175] The foot pedal was pressed once the applied technique was completed.
[00176] Step 4
[00177] A post-assessment analysis was done in exactly the same fashion as was completed in Step 2 of this process. This was done to establish whether any changes in cervical spine mobility had occurred.
[00178] Additionally, the central nervous system markers were analyzed in combination with the cervical spine motion during the entire treatment to monitor any changes in the subject.
[00179] Step 5
[00180] In this situation the desired results were achieved, as range of motion increased and less force was required to passively move the patient’s head; thus the treatment was completed and a manual shutdown of the sensors and hub device was used to terminate the connection.
[00181] In the event that the treatment required more attention, the practitioner could have restarted at Step 2 or Step 3, depending on the circumstances.
[00182] Step 6
[00183] Because data had been transmitted wirelessly from sensors to hub to cloud, review of the data on the cloud through computer interfaces was possible instantaneously.
[00184] Results:
[00185] The system was used to check and adjust the C4 vertebra in the neck. During the pre- and post-assessments a right side-bend of the neck joint was performed. The angle of deviation from the start of the pre-assessment is shown in Fig. 9. The assessment start and end points as detected by the algorithm described herein are shown as the thick lines in the figure. A 3D visualization of the motion is shown in Fig. 10. The ranges of interest in the deviation angle data (the dark lines in the previous plot) were plotted in three dimensions to aid in visualizing the movement of the head during the pre/post assessments. The gray ellipse represents the subject’s head.
[00186] The range of motion of the joint can be calculated from the detected ranges of interest in the angle deviation data (the thick lines in the figures). The range of motion results for the pre-assessment shown in Fig. 9 are summarized in Table 3 (below).
[00187] These plots and state estimates/determinations were generated and presented to the practitioner immediately after the pre-assessment was completed.
[00188] The angular velocity and force data collected during the adjustment of the C4 vertebra are shown in Fig. 11. The moment of the manipulation is characterized by a sharp spike in both radial velocity and exerted pressure. A peak detection algorithm as described herein picks out the largest spike from the data collected while the manipulation was performed. The detected peak is denoted by a large dot in the plots. The maximum radial velocity for this manipulation, from the practitioner’s hand strap, was 3.56 m/s, and the maximum exerted force was 328 mV.
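A minimal sketch of a largest-spike detector of this general kind, not the patented algorithm itself, is shown below; the synthetic data simply reuses the reported 3.56 m/s and 328 mV values for illustration.

# Minimal sketch: within the manipulation window, find the sample where the
# angular-velocity trace peaks and report the co-located force reading.
import numpy as np


def detect_manipulation_peak(angular_velocity, force):
    """Return (index, peak angular velocity, force at that instant)."""
    angular_velocity = np.asarray(angular_velocity, dtype=float)
    force = np.asarray(force, dtype=float)
    idx = int(np.argmax(np.abs(angular_velocity)))   # largest spike by magnitude
    return idx, float(angular_velocity[idx]), float(force[idx])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    velocity = rng.normal(0.0, 0.05, 500)
    force = rng.normal(0.0, 2.0, 500)
    velocity[230] = 3.56        # simulated HVLA thrust spike
    force[230] = 328.0
    i, v, f = detect_manipulation_peak(velocity, force)
    print(f"peak at sample {i}: {v:.2f} m/s, {f:.0f} mV")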
[00189] After the manipulation, the practitioner performed a post-assessment. The algorithms used for the post-assessment are the same as those used for the pre-assessment. The results of the post-assessment are summarized in Table 3.
[00190] Table 3: Range of motion and force results for the pre- and post-assessments (values reproduced as images in the original publication).
[00191] In this case it can be seen that the range of motion of the joint (neck) increased by 16.68 degrees after the adjustment was performed by the practitioner. Additionally, the average force required to passively move the subject’s head was 0.32 mV lower, and the peak force value was 1.3 mV lower, than in the initial movement.
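The pre/post comparison can be illustrated with the short sketch below; the absolute pre- and post-assessment values are hypothetical and are chosen only so that the differences match the changes reported in the text (+16.68 degrees of motion, -0.32 mV average force, -1.3 mV peak force).

# Minimal sketch of the pre/post comparison: subtract the pre-assessment range
# of motion and force statistics from the post-assessment values.
from dataclasses import dataclass


@dataclass
class AssessmentSummary:
    range_of_motion_deg: float
    mean_force: float      # raw force-sensor output (mV)
    peak_force: float      # raw force-sensor output (mV)


def compare(pre: AssessmentSummary, post: AssessmentSummary) -> dict:
    """Positive ROM change and negative force change suggest improved mobility."""
    return {
        "rom_change_deg": round(post.range_of_motion_deg - pre.range_of_motion_deg, 2),
        "mean_force_change": round(post.mean_force - pre.mean_force, 2),
        "peak_force_change": round(post.peak_force - pre.peak_force, 2),
    }


if __name__ == "__main__":
    # Hypothetical absolute values chosen so the differences match the text.
    pre = AssessmentSummary(60.00, 5.00, 13.00)
    post = AssessmentSummary(76.68, 4.68, 11.70)
    print(compare(pre, post))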
[00192] The biometric indicators measured during this trial consisted of heart rate measurements. The plots of heart rate as measured from before the pre-assessment to after the post-assessment are shown in Fig. 7B. The vertical lined areas indicate the times when the pre- and post-assessments were being performed. As can be seen from the plot, there is no significant change in heart rate before and after the manipulation.
[00193] Because the range of motion of the joint increased after the manipulation and no significant increase or decrease in heart rate was detected during or after the manipulation, at this time we conclude that the adjustment was a successful manipulation.
[00194] All the measured data was presented to the practitioner during the treatment as it became available. The data presentation after a pre-assessment, manipulation and post-assessment have been performed is shown in Fig. 7A. Based on the data presented in this visualization, the practitioner can decide on the next step of the treatment.
[00195] The terms about, approximately, substantially, and their equivalents may be understood to include their ordinary or customary meaning. In addition, if not defined throughout the specification for the specific usage, these terms can be generally understood to represent values about, but not equal to, a specified value; for example, within 1%, 0.9%, 0.8%, 0.7%, 0.6%, 0.5%, 0.4%, 0.3%, 0.2%, 0.1%, or 0.09% of a specified value.
[00196] The terms comprise, include, and/or the plural forms of each are open ended and include the listed items and can include additional items that are not listed. The phrase “and/or” is open ended and includes one or more of the listed items and combinations of the listed items.
[00197] The relevant teachings of all the references, patents and/or patent applications cited herein are incorporated herein by reference in their entirety.
[00198] While this invention has been particularly shown and described with references to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.

Claims

What is claimed is:
1) A method for assessing a manipulation by a practitioner on a first body part of a subject,
wherein the practitioner has at least one practitioner edge device that comprises at least one practitioner sensor and the subject has at least one subject edge device that comprises at least one subject sensor on a second body part; wherein the practitioner performs the manipulation to the first body part of the subject, wherein the first body part and the second body part is the same body part or a different body part, and wherein the at least one practitioner sensor and the at least one subject sensor each comprise a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; the method comprises:
a) transmitting practitioner data of the manipulation from the at least one practitioner sensor to a network;
b) transmitting subject data of the manipulation from the at least one subject sensor to the network;
c) processing the practitioner data and the subject data to assess the manipulation to thereby obtain a manipulation indicia; and
d) providing an output of the manipulation indicia.
2) The method of Claim 1, wherein the central nervous system sensor of the at least one
practitioner edge device, the at least one subject edge device, or both comprises a heart rate monitor, a blood pressure monitor, a thermometer, an oxygen saturation monitor, a respiratory monitor, a blood lactate monitor, blood glucose monitor, a vascular flow monitor, a brain wave monitor, a sweat and saliva monitor, pain sensor, or combination thereof, and optionally include a microphone or video camera.
3) The method of Claim 1, wherein the practitioner data or subject data of the manipulation each comprise position data representative of position or movement of the body part, acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain data, and any combination thereof.
4) The method of Claim 1, wherein the at least one practitioner edge device is attached to or
embedded in a band, a strap, a headband, a bracelet, a body band, a ring, a towel, and a glove.
5) The method of Claim 4, wherein the at least one practitioner edge device is placed on the
practitioner’s hand, arm, foot, knee, leg, hip, torso, back, head, or combination thereof.
6) The method of Claim 1, wherein the at least one subject edge device is attached to or embedded in a band, a strap, a headband, a bracelet, a body band, a ring, a towel and a glove.
7) The method of Claim 6, wherein the at least one subject edge device is placed on the subject’s body part, wherein the body part comprises a head, hand, arm, foot, knee, leg, hip, torso, back, or combination thereof.
8) The method of Claim 1, further comprising transmitting the practitioner data of step a) and the subject data of step b) wirelessly to the network.
9) The method of Claim 1, wherein the practitioner data of step a) and the subject data of step b) is transmitted to a network comprising a server that is local, on a cloud, or both.
10) The method of Claim 1, further comprising storing the practitioner data of step a) and the
subject data of step b).
11) The method of Claim 1, wherein processing the practitioner data and the subject data to assess the manipulation comprises compiling the practitioner data from step a) and the subject data from step b).
12) The method of Claim 1, wherein processing the practitioner data and the subject data to assess the manipulation comprises comparing the practitioner data from step a) with the subject data from step b).
13) The method of Claim 1, wherein processing the practitioner data and the subject data to assess the manipulation comprises comparing the practitioner data from step a) and the subject data from step b) to a standard.
14) The method of Claim 1, further comprising transmitting practitioner data of step a) from more than one time point, transmitting subject data of step b) from more than one time point, and compiling or comparing the data from the more than one time points.
15) The method of Claim 14, wherein the one or more time points comprise prior to the
manipulation, during the manipulation, and after the manipulation.
16) The method of Claim 1, further comprising displaying the data in a graphical representation.
17) The method of Claim 16, wherein the graphical representation comprises a line graph or a three-dimensional representation of the movement.
18) A method for assessing a manipulation by a practitioner on a first body part of a subject,
wherein the practitioner has at least one practitioner edge device that comprises at least one practitioner sensor and the subject has at least one subject edge device that comprises at least one subject sensor on a second body part; wherein the first body part and the second body part is the same body part or a different body part, and wherein the at least one practitioner sensor and the at least one subject sensor each comprise a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; the method comprises:
a) the practitioner performing the manipulation to the first body part of the subject;
b) transmitting practitioner data of the manipulation from the at least one practitioner sensor to a network;
c) transmitting subject data of the manipulation from the at least one subject sensor to the
network;
d) processing the practitioner data and the subject data to assess the manipulation to thereby obtain a manipulation indicia; and
e) providing an output of the manipulation indicia.
19) A system for assessing a manipulation by a practitioner on a first body part of a subject,
wherein the practitioner has at least one practitioner edge device that comprises at least one practitioner sensor and the subject has at least one subject edge device that comprises at least one subject sensor on a second body part; wherein the practitioner performs the manipulation to the first body part of the subject, wherein the first body part and the second body part is the same body part or a different body part; the system comprises:
a) the practitioner edge device having at least one practitioner sensor that obtains and transmits practitioner data; wherein the at least one practitioner sensor comprises a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; wherein the practitioner sensor further comprises a wireless communication component; b) the subject edge device having at least one subject sensor that obtains and transmits subject data; wherein the subject sensor comprises a gyroscope, accelerometer, force sensor, a magnetometer, or combination thereof, and optionally, at least one central nervous system sensor; wherein the subject sensor further comprises a wireless communication component; wherein, when in use, the subject wears at least one sensor on the second body part;
c) a network comprising:
i) at least one memory unit for storing processor executable instructions, the practitioner data and the subject data;
ii) a processing unit for accessing at least one memory and executing the processor
executable instructions, wherein the processing unit receives and processes the practitioner data and the subject data to assess the manipulation to thereby obtain a manipulation indicia;
iii) a communication module that receives the practitioner data and subject data; and iv) an output device for providing the manipulation assessment.
20) The system of Claim 19, further comprising a hub having a second communication module that receives the practitioner data and subject data and transmits said practitioner data and subject data to the network.
21) The system of Claim 19, wherein the at least one practitioner edge device and the at least one subject edge device each further comprise an analog to digital converter, an actuator, a digital to analog converter, a communication module, a processor, memory, a power supply, or a combination thereof.
22) The system of Claim 19, wherein the central nervous system sensor of the at least one practitioner edge device, the at least one subject edge device, or both comprises a heart rate monitor, a blood pressure monitor, a thermometer, an oxygen saturation monitor, a respiratory monitor, a blood lactate monitor, blood glucose monitor, a vascular flow monitor, a brain wave monitor, a sweat and saliva monitor, pain sensor, or combination thereof, and optionally include a microphone or video camera.
23) The system of Claim 19, wherein the practitioner data or subject data of the manipulation each comprise position data representative of position or movement of the body part, acceleration data representative of an acceleration of the body part, and force data representative of force applied to the body part, and optionally, heart rate data, heart rate variability data, blood pressure data, vascular flow data, temperature data, oxygen saturation data, respiratory rate data, blood lactate data, blood glucose data, brain wave data, sweat and saliva data, pain data, and any combination thereof.
24) The system of Claim 19, wherein the at least one practitioner edge device is attached to or
embedded in a band, a strap, a headband, a bracelet, a body band, a ring, a towel, and a glove.
25) The system of Claim 24, wherein the at least one practitioner edge device is placed on the
practitioner’s hand, arm, foot, knee, leg, hip, torso, back, head, or combination thereof.
26) The system of Claim 19, wherein the subject sensor is attached to or embedded in a band, a strap, a headband, a bracelet, a body band, a ring, a towel and a glove.
27) The system of Claim 26, wherein at least one subject edge device is placed on the subject’s body part, wherein the body part comprises a head, hand, arm, foot, knee, leg, hip, torso, back, or combination thereof.
28) The system of Claim 19, wherein the processing unit processes the practitioner data and the subject data to assess the manipulation by compiling the practitioner data from a) and the subject data from b).
29) The system of Claim 19, wherein the processing unit processes the practitioner data and the subject data to assess the manipulation by comparing the practitioner data from a) with the subject data from b).
30) The system of Claim 19, wherein the processing unit processes the practitioner data and the subject data to assess the manipulation by comparing the practitioner data from a) and the subject data from b) to a standard.
31) The system of Claim 19, wherein the processing unit processes the practitioner data of a) from more than one time point and the subject data of b) from more than one time point, and compiles or compares the data from the more than one time points.
32) The system of Claim 31, wherein the one or more time points comprise prior to the
manipulation, during the manipulation and after the manipulation.
33) The system of Claim 19, wherein the output device comprises a display providing the data in a graphical representation.
34) The system of Claim 33, wherein the graphical representation comprises a line graph or a three-dimensional representation of the movement.
PCT/IB2019/055351 2018-06-26 2019-06-25 System and methods for quantifying manual therapy WO2020003130A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862690109P 2018-06-26 2018-06-26
US62/690,109 2018-06-26

Publications (1)

Publication Number Publication Date
WO2020003130A1 true WO2020003130A1 (en) 2020-01-02

Family

ID=68986287

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/055351 WO2020003130A1 (en) 2018-06-26 2019-06-25 System and methods for quantifying manual therapy

Country Status (1)

Country Link
WO (1) WO2020003130A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11651705B2 (en) * 2008-02-15 2023-05-16 Carla Marie Pugh Tracking and digital documentation of haptic manipulation data using wearable sensors

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8684922B2 (en) * 2006-05-12 2014-04-01 Bao Tran Health monitoring system
US20140303670A1 (en) * 2011-11-16 2014-10-09 Neuromechanical Innovations, Llc Method and Device for Spinal Analysis
US20150145682A1 (en) * 2013-11-25 2015-05-28 Mark Matthew Harris System and methods for nonverbally communicating patient comfort data
CA2968645A1 (en) * 2015-01-06 2016-07-14 David Burton Mobile wearable monitoring systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19827062

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28/04/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19827062

Country of ref document: EP

Kind code of ref document: A1