EP4139928A1 - Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine - Google Patents
Info
- Publication number
- EP4139928A1 (application EP21791805.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- treatment
- patient
- treatment plan
- user
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6895—Sport equipment
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/06—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement
- A63B22/0605—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement performing a circular movement, e.g. ergometers
- A63B2022/0611—Particular details or arrangement of cranks
- A63B2022/0623—Cranks of adjustable length
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B2071/0675—Input for modifying training controls during workout
- A63B2071/0683—Input by handheld remote control
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/10—Positions
- A63B2220/16—Angular positions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/50—Force related parameters
- A63B2220/51—Force
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2225/00—Miscellaneous features of sport apparatus, devices or equipment
- A63B2225/50—Wireless data transmission, e.g. by radio transmitters or telemetry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
Definitions
- Remote medical assistance may aid a patient in performing various aspects of a rehabilitation regimen for a body part.
- the patient may use a patient interface in communication with an assistant interface for receiving the remote medical assistance via audio and/or audiovisual communications.
- An aspect of the disclosed embodiments includes a method that includes receiving treatment data pertaining to a user who uses a treatment device to perform a treatment plan.
- the treatment data includes at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and the treatment plan.
- the method also includes generating treatment information using the treatment data and writing to an associated memory, for access at a computing device of a healthcare provider, the treatment information.
- the method also includes communicating with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input, and modifying the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan.
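- As a non-limiting illustration only, the following Python sketch shows one way the receiving, summarizing, and modifying steps described above might be organized; the class, function names, and example fields are assumptions and not part of the claimed method.

```python
from dataclasses import dataclass

@dataclass
class TreatmentData:
    """Treatment data captured while a user performs a treatment plan on a treatment device."""
    user_characteristics: dict    # e.g., age, medical conditions
    measurement_info: dict        # e.g., heart rate, blood pressure
    device_characteristics: dict  # e.g., resistance setting, pedal speed
    treatment_plan: dict

def generate_treatment_information(data: TreatmentData) -> dict:
    """Summarize raw treatment data for access at a healthcare provider's computing device."""
    return {
        "measurements": data.measurement_info,
        "device": data.device_characteristics,
        "plan": data.treatment_plan,
    }

def modify_treatment_plan(plan: dict, treatment_plan_input: dict) -> dict:
    """Apply at least one modification received via the provider's interface."""
    modified = dict(plan)
    modified.update(treatment_plan_input)  # e.g., {"session_minutes": 20}
    return modified
```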
- An aspect of the disclosed embodiments includes a computer-implemented system that includes a treatment apparatus configured to be manipulated by a patient while performing an exercise session, a patient interface configured to receive a virtual avatar.
- the patient interface comprises an output device configured to present the virtual avatar.
- the virtual avatar uses a virtual representation of the treatment apparatus to guide the patient through an exercise session.
- the virtual avatar is associated with a medical professional.
- the computer-implemented system includes a server computing device configured to provide the virtual avatar of the patient to the patient interface, receive, from the patient interface, a message pertaining to a trigger event, wherein the message comprises a severity level of the trigger event, determine whether the severity level of the trigger event exceeds a threshold severity level, and, responsive to determining that the severity level of the trigger event exceeds the threshold severity level, replace on the patient interface the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device of the medical professional.
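- A minimal, hypothetical sketch of the severity-threshold logic described above is shown below; the threshold value and the patient-interface methods are assumptions rather than the disclosed system.

```python
THRESHOLD_SEVERITY_LEVEL = 3  # hypothetical scale, e.g., 1 (minor) to 5 (critical)

def handle_trigger_event(message: dict, patient_interface) -> None:
    """Swap the virtual avatar for the medical professional's feed if the trigger event is severe enough."""
    severity_level = message.get("severity_level", 0)
    if severity_level > THRESHOLD_SEVERITY_LEVEL:
        # Replace the avatar presentation with the multimedia feed from the professional's device
        # (stop/present are assumed methods of a patient-interface object).
        patient_interface.stop("virtual_avatar")
        patient_interface.present("multimedia_feed")
    # Otherwise the virtual avatar continues to guide the exercise session.
```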
- An aspect of the disclosed embodiments includes a method for providing, by an artificial intelligence engine, an optimal treatment plan to use with a treatment apparatus.
- the method includes receiving, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format, translating a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine, determining, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow using the treatment apparatus to achieve a desired result; and providing the optimal treatment plan to be presented on a computing device of a medical professional.
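- The following is a toy, non-limiting sketch of the translation and plan-selection steps described above; the field mapping, similarity heuristic, and record layout are assumptions and do not represent the disclosed artificial intelligence engine.

```python
def translate_to_medical_description_language(record: dict) -> dict:
    """Map a clinical record from its source data format onto an assumed canonical vocabulary."""
    field_map = {"bp_sys": "systolic_blood_pressure", "hr": "heart_rate", "rom": "range_of_motion"}
    return {field_map.get(key, key): value for key, value in record.items()}

def determine_optimal_treatment_plan(translated_records: list,
                                     patient_characteristics: dict,
                                     candidate_plans: list) -> dict:
    """Pick the candidate plan linked to the historical record most similar to the patient (toy heuristic)."""
    def similarity(record: dict) -> int:
        # Count how many patient characteristics the historical record matches exactly.
        return sum(record.get(key) == value for key, value in patient_characteristics.items())

    best_record = max(translated_records, key=similarity)
    # Fall back to the first candidate plan if no plan id matches the best record.
    return next((plan for plan in candidate_plans if plan["id"] == best_record.get("plan_id")),
                candidate_plans[0])
```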
- Another aspect of the disclosed embodiments includes a system that includes a processing device and a memory communicatively coupled to the processing device and capable of storing instructions.
- the processing device executes the instructions to perform any of the methods, operations, or steps described herein.
- Another aspect of the disclosed embodiments includes a tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to perform any of the methods, operations, or steps described herein.
- FIG. 1 generally illustrates a block diagram of an embodiment of a computer-implemented system for managing a treatment plan according to the principles of the present disclosure.
- FIG. 2 generally illustrates a perspective view of an embodiment of a treatment device according to the principles of the present disclosure.
- FIG. 3 generally illustrates a perspective view of a pedal of the treatment device of FIG. 2 according to the principles of the present disclosure.
- FIG. 4 generally illustrates a perspective view of a person using the treatment device of FIG. 2 according to the principles of the present disclosure.
- FIG. 5 generally illustrates an example embodiment of an overview display of an assistant interface according to the principles of the present disclosure.
- FIG. 6 generally illustrates an example block diagram of training a machine learning model to output, based on data pertaining to the patient, a treatment plan for the patient according to the principles of the present disclosure.
- FIG. 7 generally illustrates an embodiment of an overview display of the assistant interface presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the principles of the present disclosure.
- FIG. 8 generally illustrates an embodiment of the overview display of the assistant interface presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the principles of the present disclosure.
- FIG. 9 is a flow diagram generally illustrating a method for modifying, based on treatment data received while a user uses the treatment device of FIG. 2, a treatment plan for the patient and controlling, based on the modification, at least one treatment device according to the principles of the present disclosure.
- FIG. 10 is a flow diagram generally illustrating an alternative method for modifying, based on treatment data received while a user uses the treatment device of FIG. 2, a treatment plan for the patient and controlling, based on the modification, at least one treatment device according to the principles of the present disclosure.
- FIG. 11 is a flow diagram generally illustrating an alternative method for modifying, based on treatment data received while a user uses the treatment device of FIG. 2, a treatment plan for the patient and controlling, based on the modification, at least one treatment device according to the principles of the present disclosure.
- FIG. 12 generally illustrates a computer system according to the principles of the present disclosure.
- FIG. 13 shows a block diagram of an embodiment of a computer-implemented system for managing a treatment plan according to the present disclosure.
- FIG. 14 shows a perspective view of an embodiment of a treatment apparatus according to the present disclosure.
- FIG. 15 shows a perspective view of a pedal of the treatment apparatus of FIG. 14 according to the present disclosure.
- FIG. 16 shows a perspective view of a person using the treatment apparatus of FIG. 14 according to the present disclosure.
- FIG. 17 shows an example embodiment of an overview display of an assistant interface according to the present disclosure.
- FIG. 18 shows an example embodiment of an overview display of the assistant interface presenting recommended optimal treatment plans and excluded treatment plans in real-time during a telemedicine session according to the present disclosure.
- FIG. 19 shows an example embodiment of a server translating clinical information into a medical description language for processing by an artificial intelligence engine according to the present disclosure.
- FIG. 20 shows an example embodiment of a method for recommending an optimal treatment plan according to the present disclosure.
- FIG. 21 shows an example embodiment of a method for translating clinical information into the medical description language according to the present disclosure.
- FIG. 22 shows an example computer system according to the present disclosure.
- FIG. 23 generally illustrates a block diagram of an embodiment of a computer-implemented system for managing a treatment plan according to the principles of the present disclosure.
- FIG. 24 generally illustrates a perspective view of an embodiment of a treatment device according to the principles of the present disclosure.
- FIG. 25 generally illustrates a perspective view of a pedal of the treatment device of FIG. 24 according to the principles of the present disclosure.
- FIG. 26 generally illustrates a perspective view of a person using the treatment device of FIG. 24 according to the principles of the present disclosure.
- FIG. 27 generally illustrates an example embodiment of an overview display of an assistant interface according to the principles of the present disclosure.
- FIG. 28 generally illustrates an example block diagram of training a machine learning model to output, based on data pertaining to the patient, a treatment plan for the patient according to the principles of the present disclosure.
- FIG. 29 generally illustrates an embodiment of an overview display of the assistant interface presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the principles of the present disclosure.
- FIG. 30 generally illustrates an embodiment of the overview display of the assistant interface presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the principles of the present disclosure.
- FIG. 31 is a flow diagram generally illustrating a method for monitoring, based on treatment data received while a user uses the treatment device of FIG. 24, characteristics of the user while the user uses the treatment device according to the principles of the present disclosure.
- FIG. 32 is a flow diagram generally illustrating an alternative method for monitoring, based on treatment data received while a user uses the treatment device of FIG. 24, characteristics of the user while the user uses the treatment device according to the principles of the present disclosure.
- FIG. 33 is a flow diagram generally illustrating an alternative method for monitoring, based on treatment data received while a user uses the treatment device of FIG. 24, characteristics of the user while the user uses the treatment device according to the principles of the present disclosure.
- FIG. 34 is a flow diagram generally illustrating a method for receiving a selection of an optimal treatment plan and controlling, based on the optimal treatment plan, a treatment device while the patient uses the treatment device according to the present disclosure.
- FIG. 35 generally illustrates a computer system according to the principles of the present disclosure.
- FIG. 36 shows a block diagram of an embodiment of a computer-implemented system for managing a treatment plan according to the present disclosure.
- FIG. 37 shows a perspective view of an embodiment of a treatment apparatus according to the present disclosure.
- FIG. 38 shows a perspective view of a pedal of the treatment apparatus of FIG. 37 according to the present disclosure.
- FIG. 39 shows a perspective view of a person using the treatment apparatus of FIG. 37 according to the present disclosure.
- FIG. 40 shows an example embodiment of an overview display of an assistant interface according to the present disclosure.
- FIG. 41 shows an example block diagram of training a machine learning model to output, based on data pertaining to the patient, a treatment plan for the patient according to the present disclosure.
- FIG. 42 shows an embodiment of an overview display of the patient interface presenting a virtual avatar guiding the patient through an exercise session according to the present disclosure.
- FIG. 43 shows an embodiment of the overview display of the assistant interface receiving a notification pertaining to the patient and enabling the assistant to initiate a telemedicine session in real-time according to the present disclosure.
- FIG. 44 shows an embodiment of the overview display of the patient interface presenting, in real time during a telemedicine session, a feed of the medical professional that replaced the virtual avatar according to the present disclosure.
- FIG. 45 shows an example embodiment of a method for replacing, based on a trigger event occurring, a virtual avatar with a feed of a medical professional according to the present disclosure.
- FIG. 46 shows an example embodiment of a method for providing a virtual avatar according to the present disclosure.
- FIG. 47 shows an example computer system according to the present disclosure.
- first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections; however, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer, or section from another region, layer, or section. Terms such as “first,” “second,” and other numerical terms, when used herein, do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the teachings of the example embodiments.
- phrases “at least one of,” when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed.
- “at least one of: A, B, and C” includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C.
- the phrase “one or more” when used with a list of items means there may be one item or any suitable number of items exceeding one.
- spatially relative terms such as “inner,” “outer,” “beneath,” “below,” “lower,” “above,” “upper,” “top,” “bottom,” and the like, may be used herein. These spatially relative terms can be used for ease of description to describe one element’s or feature’s relationship to another element(s) or feature(s) as illustrated in the figures.
- the spatially relative terms may also be intended to encompass different orientations of the device in use, or operation, in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below.
- the device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptions used herein interpreted accordingly.
- a “treatment plan” may include one or more treatment protocols, and each treatment protocol includes one or more treatment sessions. Each treatment session comprises several session periods, with each session period including a particular exercise for treating the body part of the patient.
- a treatment plan for post-operative rehabilitation after a knee surgery may include an initial treatment protocol with twice daily stretching sessions for the first 3 days after surgery and a more intensive treatment protocol with active exercise sessions performed 4 times per day starting 4 days after surgery.
- a treatment plan may also include information pertaining to a medical procedure to perform on the patient, a treatment protocol for the patient using a treatment device, a diet regimen for the patient, a medication regimen for the patient, a sleep regimen for the patient, additional regimens, or some combination thereof.
- the treatment plan may also include one or more training protocols, such as strength training protocols, range of motion training protocols, cardiovascular training protocols, endurance training protocols, and the like.
- Each training protocol may include one or more training sessions comprising several training session periods, with each session period comprising a particular exercise directed to one or more of strength training, range of motion training, cardiovascular training, endurance training, and the like.
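- Purely as a hypothetical illustration, a treatment plan with the nested structure described above (protocols containing sessions, sessions containing session periods, plus optional regimens and training protocols) might be represented as follows; all names and values are examples.

```python
treatment_plan = {
    "treatment_protocols": [
        {
            "name": "initial post-operative stretching",
            "sessions": [
                {"session_periods": [
                    {"exercise": "passive knee flexion", "minutes": 10},
                    {"exercise": "heel slides", "minutes": 10},
                ]},
            ],
        },
    ],
    "training_protocols": [
        {"type": "range_of_motion", "sessions_per_day": 2},
    ],
    "diet_regimen": {"notes": "high-protein"},
    "medication_regimen": {"drug": "example analgesic", "schedule": "as prescribed"},
    "sleep_regimen": {"target_hours": 8},
}
```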
- the terms telemedicine, telehealth, telemed, teletherapeutic, remote medicine, etc. may be used interchangeably herein.
- enhanced reality may include a user experience comprising one or more of augmented reality, virtual reality, mixed reality, immersive reality, or a combination of the foregoing (e.g., immersive augmented reality, mixed augmented reality, virtual and augmented immersive reality, and the like).
- augmented reality may refer, without limitation, to an interactive user experience that provides an enhanced environment that combines elements of a real-world environment with computer generated components perceivable by the user.
- virtual reality may refer, without limitation, to a simulated interactive user experience that provides an enhanced environment perceivable by the user and wherein such enhanced environment may be similar to or different from a real-world environment.
- the term “mixed reality” may refer to an interactive user experience that combines aspects of augmented reality with aspects of virtual reality to provide a mixed reality environment perceivable by the user.
- the term “immersive reality” may refer to a simulated interactive user experience using virtual and/or augmented reality images, sounds, and other stimuli to immerse the user, to a specific extent possible (e.g., partial immersion or total immersion), in the simulated interactive experience.
- an immersive reality experience may include actors, a narrative component, a theme (e.g., an entertainment theme or other suitable theme), and/or other suitable features or components.
- body halo may refer to a hardware component or components, wherein such component or components may include one or more platforms, one or more body supports or cages, one or more chairs or seats, one or more back supports or back engaging mechanisms, one or more leg or foot engaging mechanisms, one or more arm or hand engaging mechanisms, one or more head engaging mechanisms, other suitable hardware components, or a combination thereof.
- enhanced environment may refer to an enhanced environment in its entirety, at least one aspect of the enhanced environment, more than one aspect of the enhanced environment, or any suitable number of aspects of the enhanced environment.
- the term “threshold” and/or the term “range” may include one or more values expressed as a percentage, an absolute value, a unit of measurement, a difference value, a numerical quantity, or other suitable expression of the one or more values.
- the term “optimal treatment plan” may refer to optimizing a treatment plan based on a certain parameter or combinations of more than one parameter, such as, but not limited to, a monetary value amount generated by a treatment plan and/or billing sequence, wherein the monetary value amount is measured by an absolute amount in dollars or another currency, a Net Present Value (NPV) or any other measure, a patient outcome that results from the treatment plan and/or billing sequence, a fee paid to a medical professional, a payment plan for the patient to pay off an amount of money owed or a portion thereof, a plan of reimbursement, an amount of revenue, profit or other monetary value amount to be paid to an insurance or third-party provider, or some combination thereof.
- Real-time may refer to less than or equal to 2 seconds. Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface, and will generally be less than 10 seconds but greater than 2 seconds.
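- A minimal sketch of this latency classification, using only the thresholds stated above, is shown below.

```python
def classify_interaction_latency(seconds: float) -> str:
    """Classify latency per the definitions above: real-time <= 2 s, near real-time between 2 s and 10 s."""
    if seconds <= 2:
        return "real-time"
    if seconds < 10:
        return "near real-time"
    return "neither"
```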
- rehabilitation may be directed at cardiac rehabilitation, rehabilitation from stroke, multiple sclerosis, Parkinson’s disease, myasthenia gravis, Alzheimer’s disease, any other neurodegenerative or neuromuscular disease, a brain injury, a spinal cord injury, a spinal cord disease, a joint injury, a joint disease, or the like.
- Rehabilitation can further involve muscular contraction in order to improve blood flow and lymphatic flow, engage the brain and nervous system to control and affect a traumatized area to increase the speed of healing, reverse or reduce pain (including arthralgias and myalgias), reverse or reduce stiffness, recover range of motion, encourage cardiovascular engagement to stimulate the release of pain-blocking hormones or to encourage highly oxygenated blood flow to aid in an overall feeling of well-being.
- Rehabilitation may be provided for individuals of average height in reasonably good physical condition having no substantial deformities, as well as for individuals more typically in need of rehabilitation, such as those who are elderly, obese, subject to disease processes, injured and/or who have a severely limited range of motion.
- rehabilitation includes prehabilitation (also referred to as “pre-habilitation” or “prehab”).
- Prehabilitation may be used as a preventative procedure or as a pre-surgical or pre-treatment procedure.
- Prehabilitation may include any action performed by or on a patient (or directed to be performed by or on a patient, including, without limitation, remotely or distally through telemedicine) to, without limitation, prevent or reduce a likelihood of injury (e.g., prior to the occurrence of the injury); improve recovery time subsequent to surgery; improve strength subsequent to surgery; or any of the foregoing with respect to any non-surgical clinical treatment plan to be undertaken for the purpose of ameliorating or mitigating injury, dysfunction, or other negative consequence of surgical or non-surgical treatment on any external or internal part of a patient's body.
- a mastectomy may require prehabilitation to strengthen muscles or muscle groups affected directly or indirectly by the mastectomy.
- the removal of an intestinal tumor, the repair of a hernia, open-heart surgery or other procedures performed on internal organs or structures, whether to repair those organs or structures, to excise them or parts of them, to treat them, etc. can require cutting through, dissecting and/or harming numerous muscles and muscle groups in or about, without limitation, the skull or face, the abdomen, the ribs and/or the thoracic cavity, as well as in or about all joints and appendages.
- Prehabilitation can improve a patient's speed of recovery, measure of quality of life, level of pain, etc. in all the foregoing procedures.
- a pre-surgical or pre-non-surgical-treatment procedure may include one or more sets of exercises for a patient to perform prior to such procedure or treatment. Performance of the one or more sets of exercises may be required in order to qualify for an elective surgery, such as a knee replacement.
- the patient may prepare an area of his or her body for the surgical procedure by performing the one or more sets of exercises, thereby strengthening muscle groups, improving existing muscle memory, reducing pain, reducing stiffness, establishing new muscle memory, enhancing mobility (i.e., improve range of motion), improving blood flow, and/or the like.
- Determining a treatment plan for a patient having certain characteristics may be a technically challenging problem.
- a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process.
- some of the multitude of information considered may include characteristics of the patient such as personal information, performance information, and measurement information.
- the personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medication history, an injury, a medical procedure, a medication prescribed, or some combination thereof.
- the performance information may include, e.g., an elapsed time of using a treatment device, an amount of force exerted on a portion of the treatment device, a range of motion achieved on the treatment device, a movement speed of a portion of the treatment device, an indication of a plurality of pain levels using the treatment device, or some combination thereof.
- the measurement information may include, e.g., a vital sign, a respiration rate, a heartrate, a temperature, a blood pressure, or some combination thereof. It may be desirable to process the characteristics of a multitude of patients, the treatment plans performed for those patients, and the results of the treatment plans for those patients.
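- As a non-limiting example, the personal, performance, and measurement information described above might be organized as follows; every field name and value shown is hypothetical.

```python
patient_characteristics = {
    "personal": {
        "age": 67, "weight_kg": 82, "gender": "F", "height_cm": 165,
        "body_mass_index": 30.1, "medical_conditions": ["type 2 diabetes"],
        "medications": ["metformin"], "procedures": ["knee replacement"],
    },
    "performance": {
        "elapsed_minutes_on_device": 18, "force_newtons": 110,
        "range_of_motion_degrees": 95, "speed_rpm": 45, "pain_levels": [4, 3, 3],
    },
    "measurement": {
        "respiration_rate": 16, "heart_rate": 92,
        "temperature_c": 36.9, "blood_pressure": "128/82",
    },
}
```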
- Another technical problem may involve distally treating, via a computing device during a telemedicine or telehealth session, a patient from a location different than a location at which the patient is located.
- An additional technical problem is controlling or enabling the control of, from the different location, a treatment device used by the patient at the location at which the patient is located.
- a healthcare provider may prescribe a treatment device to the patient to use to perform a treatment protocol at their residence or any mobile location or temporary domicile.
- a healthcare provider may refer to a doctor, physician assistant, nurse, chiropractor, dentist, physical therapist, acupuncturist, physical trainer, coach, personal trainer, or the like.
- a healthcare provider may refer to any person with a credential, license, degree, or the like in the field of medicine, physical therapy, rehabilitation, or the like.
- When the healthcare provider is located in a different location from the patient and the treatment device, it may be technically challenging for the healthcare provider to monitor the patient’s actual progress (as opposed to relying on the patient’s word about their progress) using the treatment device, modify the treatment plan according to the patient’s progress, adapt the treatment device to the personal characteristics of the patient as the patient performs the treatment plan, and the like.
- the systems and methods described herein may be configured to receive treatment data pertaining to a user while the user is using the treatment device to perform the treatment plan.
- the user may include a patient user or person using the treatment device to perform various exercises.
- the treatment plan may correspond to a rehabilitation treatment plan, a prehabilitation treatment plan, an exercise treatment plan, or other suitable treatment plan.
- the treatment data may include various characteristics of the user, various measurement information pertaining to the user while the user uses the treatment device, various characteristics of the treatment device, the treatment plan, other suitable data, or a combination thereof.
- At least some of the treatment data may correspond to sensor data of a sensor configured to sense various characteristics of the treatment device and/or the measurement information of the user. Additionally, or alternatively, while the user uses the treatment device to perform the treatment plan, at least some of the treatment data may correspond to sensor data from a sensor associated with a wearable device configured to sense the measurement information of the user.
- the various characteristics of the treatment device may include one or more settings of the treatment device, a current revolutions per time period (e.g., such as one minute) of a rotating member (e.g., such as a wheel) of the treatment device, a resistance setting of the treatment device, other suitable characteristics of the treatment device, or a combination thereof.
- the measurement information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof.
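- A hypothetical sample of such treatment data, combining device sensor readings and wearable sensor readings, might look like the following; the field names and values are assumptions.

```python
treatment_data_sample = {
    "device_sensor": {
        "revolutions_per_minute": 42,   # rotating member, e.g., a wheel or pedal assembly
        "resistance_setting": 5,
        "other_settings": {"pedal_radius_cm": 17},
    },
    "wearable_sensor": {
        "heart_rate": 101, "respiration_rate": 18,
        "temperature_c": 37.0, "blood_pressure_mmhg": (130, 84),
    },
    "timestamp": "2021-04-20T10:15:00Z",
}
```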
- the systems and methods described herein may be configured to generate treatment information using the treatment data.
- the treatment information may include a summary of the user’s performance of the treatment plan while using the treatment device, formatted such that the treatment data is presentable at a computing device of a healthcare provider or healthcare professional responsible for the performance of the treatment plan by the user.
- the terms “healthcare provider” and “healthcare professional” may be used interchangeably herein.
- the healthcare provider or healthcare professional may include a medical professional (e.g., such as a doctor, a nurse, a therapist, and the like), an exercise professional (e.g., such as a coach, a trainer, a nutritionist, and the like), or another professional sharing at least one of medical and exercise attributes (e.g., such as an exercise physiologist, a physical therapist, an occupational therapist, and the like).
- a healthcare provider or healthcare professional may be a human being, a robot, a virtual assistant, a virtual assistant in a virtual and/or augmented reality, or an artificially intelligent entity, including a software program, integrated software and hardware, or hardware alone.
- the systems and methods described herein may be configured to write to an associated memory, for access at the computing device of the healthcare provider, and/or provide, at the computing device of the healthcare provider, the treatment information.
- the systems and methods described herein may be configured to provide the treatment information to an interface configured to present the treatment information to the healthcare provider.
- the interface may include a graphical user interface configured to provide the treatment information and receive input from the healthcare provider.
- the interface may include one or more input fields, such as text input fields, dropdown selection input fields, radio button input fields, virtual switch input fields, virtual lever input fields, audio, haptic, tactile, biometric or otherwise activated and/or driven input fields, other suitable input fields, or a combination thereof.
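- The sketch below illustrates, under assumed field names, how treatment information might be summarized and paired with the kinds of input fields listed above; it assumes at least one treatment data sample shaped like the earlier hypothetical example.

```python
def build_provider_view(treatment_data_samples: list) -> dict:
    """Aggregate raw samples into a summary plus the input fields shown to the healthcare provider."""
    heart_rates = [sample["wearable_sensor"]["heart_rate"] for sample in treatment_data_samples]
    summary = {
        "mean_heart_rate": sum(heart_rates) / len(heart_rates),  # assumes at least one sample
        "samples": len(treatment_data_samples),
    }
    input_fields = [
        {"name": "resistance_setting", "type": "dropdown", "options": list(range(1, 11))},
        {"name": "session_minutes", "type": "text"},
        {"name": "approve_plan", "type": "radio", "options": ["keep", "modify"]},
    ]
    return {"summary": summary, "input_fields": input_fields}
```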
- the healthcare provider may review the treatment information and determine whether to modify the treatment plan and/or one or more characteristics of the treatment device. For example, the healthcare provider may review the treatment information and compare the treatment information to the treatment plan being performed by the user.
- the healthcare provider may compare the following: (i) expected information, which pertains to the user while the user uses the treatment device to perform the treatment plan, to (ii) the measurement information (e.g., indicated by the treatment information), which pertains to the user while the user uses the treatment device to perform the treatment plan.
- the expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof.
- the healthcare provider may determine that the treatment plan is having the desired effect if one or more parts or portions of the measurement information are within an acceptable range associated with one or more corresponding parts or portions of the expected information. Conversely, the healthcare provider may determine that the treatment plan is not having the desired effect if one or more parts or portions of the measurement information are outside of the range associated with one or more corresponding parts or portions of the expected information.
- the healthcare provider may determine whether a blood pressure value (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) corresponding to the user while the user uses the treatment device (e.g., indicated by the measurement information) is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, or any suitable range) of an expected blood pressure value indicated by the expected information.
- the healthcare provider may determine that the treatment plan is having the desired effect if the blood pressure value corresponding to the user while the user uses the treatment device is within the range of the expected blood pressure value.
- the healthcare provider may determine that the treatment plan is not having the desired effect if the blood pressure value corresponding to the user while the user uses the treatment device is outside of the range of the expected blood pressure value.
- the healthcare provider may compare the expected characteristics of the treatment device while the user uses the treatment device to perform the treatment plan with characteristics of the treatment device indicated by the treatment information. For example, the healthcare provider may compare an expected resistance setting of the treatment device with an actual resistance setting of the treatment device indicated by the treatment information. The healthcare provider may determine that the user is performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are within a range of corresponding ones of the expected characteristics of the treatment device. Conversely, the healthcare provider may determine that the user is not performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are outside the range of corresponding ones of the expected characteristics of the treatment device.
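- A minimal sketch of the acceptable-range comparison described above, here using a hypothetical plus or minus 5% tolerance, is shown below.

```python
def within_acceptable_range(measured: float, expected: float, tolerance: float = 0.05) -> bool:
    """Return True if the measured value is within +/- tolerance (e.g., 5%) of the expected value."""
    return abs(measured - expected) <= tolerance * expected

# Example: a measured systolic blood pressure of 135 versus an expected value of 125.
print(within_acceptable_range(measured=135, expected=125))  # False: more than 5% above expected
```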
- the healthcare provider may determine not to modify the treatment plan or the one or more characteristics of the treatment device. Conversely, while the user uses the treatment device to perform the treatment plan, if the healthcare provider determines that the treatment information indicates that the user is not or has not been performing the treatment plan properly and/or that the treatment plan is not or has not been having the desired effect, the healthcare provider may determine to modify the treatment plan and/or the one or more characteristics of the treatment device.
- the healthcare provider may interact with the interface to provide treatment plan input indicating one or more modifications to the treatment plan and/or to one or more characteristics of the treatment device if the healthcare provider determines to modify the treatment plan and/or the one or more characteristics of the treatment device.
- the healthcare provider may use the interface to provide input indicating an increase or decrease in the resistance setting of the treatment device, or other suitable modification to the one or more characteristics of the treatment device.
- the healthcare provider may use the interface to provide input indicating a modification to the treatment plan.
- the healthcare provider may use the interface to provide input indicating an increase or decrease in an amount of time the user is required to use the treatment device according to the treatment plan, or other suitable modifications to the treatment plan.
- the systems and methods described herein may be configured to modify the treatment plan based on one or more modifications indicated by the treatment plan input. Additionally, or alternatively, the systems and methods described herein may be configured to modify the one or more characteristics of the treatment device based on the modified at least one aspect of the treatment plan and/or the treatment plan input.
- the treatment plan input may indicate to modify the one or more characteristics of the treatment device and/or the treatment plan may require or indicate adjustments to the treatment device in order for the user to achieve the desired results of the modified treatment plan.
- the systems and methods described herein may be configured to receive subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan. For example, after the healthcare provider provides input modifying the treatment plan and/or controlling the one or more characteristics of the treatment device, the user may continue to use the treatment device to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the user uses the treatment device to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the user continues to use the treatment device to perform the treatment plan, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or control the one or more characteristics of the treatment device.
- the systems and methods described herein may be configured to further modify the treatment plan and/or control the one or more characteristics of the treatment device.
- the subsequent treatment plan input may correspond to input provided by the healthcare provider, at the interface, in response to receiving and/or reviewing subsequent treatment information corresponding to the subsequent treatment data.
- the systems and methods described herein may be configured to continuously and/or periodically provide treatment information to the computing device of the healthcare provider based on treatment data continuously and/or periodically received from the sensors or other suitable sources described herein.
- the healthcare provider may receive and/or review treatment information continuously or periodically while the user uses the treatment device to perform the treatment plan. Based on one or more trends indicated by the continuously and/or periodically received treatment information, the healthcare provider may determine whether to modify the treatment plan and/or control the one or more characteristics of the treatment device. For example, the one or more trends may indicate an increase in heart rate or other suitable trends indicating that the user is not performing the treatment plan properly and/or performance of the treatment plan by the user is not having the desired effect.
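- A minimal sketch of detecting such a trend from periodically received treatment data is shown below; the window size, threshold, and heart-rate values are illustrative assumptions.

```python
# Sketch of detecting an adverse trend (e.g., rising heart rate) in periodically
# received treatment data. The window size and threshold are illustrative assumptions.

from statistics import mean

def rising_trend(samples: list[float], window: int = 5, threshold: float = 10.0) -> bool:
    """Flag a trend when the mean of the latest window exceeds the mean of the
    previous window by more than the threshold."""
    if len(samples) < 2 * window:
        return False
    recent = mean(samples[-window:])
    prior = mean(samples[-2 * window:-window])
    return recent - prior > threshold

heart_rates = [88, 90, 89, 91, 90, 101, 104, 107, 110, 113]
if rising_trend(heart_rates):
    print("Notify healthcare provider: heart rate trending upward")
```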
- the systems and methods described herein may be configured to use artificial intelligence and/or machine learning to assign patients to cohorts and to dynamically control a treatment device based on the assignment during an adaptive telemedicine session.
- numerous treatment devices may be provided to patients.
- the treatment devices may be used by the patients to perform treatment plans in their residences, at a gym, at a rehabilitative center, at a hospital, or any suitable location, including permanent or temporary domiciles.
- the treatment devices may be communicatively coupled to a server. Characteristics of the patients, including the treatment data, may be collected before, during, and/or after the patients perform the treatment plans. For example, the personal information, the performance information, and the measurement information may be collected before, during, and/or after the patients perform the treatment plans. The results (e.g., improved performance or decreased performance) of performing each exercise may be collected from the treatment device throughout the treatment plan and after the treatment plan is performed. The parameters, settings, configurations, etc. (e.g., position of pedal, amount of resistance, etc.) of the treatment device may be collected before, during, and/or after the treatment plan is performed.
- Each characteristic of the patient, each result, and each parameter, setting, configuration, etc. may be timestamped and may be correlated with a particular step in the treatment plan. Such a technique may enable determining which steps in the treatment plan lead to desired results (e.g., improved muscle strength, range of motion, etc.) and which steps lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or harms recovery).
- Data may be collected from the treatment devices and/or any suitable computing device (e.g., computing devices where personal information is entered, such as the interface of the computing device described herein, a clinician interface, patient interface, and the like) over time as the patients use the treatment devices to perform the various treatment plans.
- the data that may be collected may include the characteristics of the patients, the treatment plans performed by the patients, the results of the treatment plans, any of the data described herein, any other suitable data, or a combination thereof.
- the data may be processed to group certain people into cohorts.
- the people may be grouped by people having certain or selected similar characteristics, treatment plans, and results of performing the treatment plans. For example, athletic people having no medical conditions who perform a treatment plan (e.g., use the treatment device for 30 minutes a day 5 times a week for 3 weeks) and who fully recover may be grouped into a first cohort. Older people who are classified as obese and who perform a treatment plan (e.g., use the treatment device for 10 minutes a day 3 times a week for 4 weeks) and who improve their range of motion by 75 percent may be grouped into a second cohort.
- an artificial intelligence engine may include one or more machine learning models that are trained using the cohorts.
- the one or more machine learning models may be trained to receive an input of characteristics of a new patient and to output a treatment plan for the patient that results in a desired result.
- the machine learning models may match a pattern between the characteristics of the new patient and at least one patient of the patients included in a particular cohort. When a pattern is matched, the machine learning models may assign the new patient to the particular cohort and select the treatment plan associated with the at least one patient.
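- One possible, simplified way to perform such pattern matching is a nearest-centroid comparison over a few numeric characteristics, sketched below; the cohort definitions, feature names, and values are illustrative assumptions rather than the claimed implementation.

```python
# Sketch of matching a new patient's characteristics to a cohort by nearest
# centroid over a few numeric characteristics. Cohort definitions, feature
# names, and values are illustrative assumptions.

import math

cohorts = {
    "cohort_A": {"age": 28.0, "bmi": 23.0, "pain_level": 2.0},  # e.g., athletic, fast recovery
    "cohort_B": {"age": 67.0, "bmi": 33.0, "pain_level": 6.0},  # e.g., older, limited mobility
}

def assign_cohort(patient: dict[str, float]) -> str:
    """Assign the patient to the cohort whose centroid is closest (Euclidean distance)."""
    def distance(centroid: dict[str, float]) -> float:
        return math.sqrt(sum((patient[k] - centroid[k]) ** 2 for k in centroid))
    return min(cohorts, key=lambda name: distance(cohorts[name]))

new_patient = {"age": 31.0, "bmi": 24.5, "pain_level": 3.0}
print(assign_cohort(new_patient))  # -> cohort_A; a treatment plan from that cohort is then selected
```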
- the artificial intelligence engine may be configured to control, distally and based on the treatment plan, the treatment device while the new patient uses the treatment device to perform the treatment plan.
- the characteristics of the new patient may change as the new patient uses the treatment device to perform the treatment plan.
- the performance of the patient may improve quicker than expected for people in the cohort to which the new patient is currently assigned.
- the machine learning models may be trained to dynamically reassign, based on the changed characteristics, the new patient to a different cohort that includes people having characteristics similar to the now-changed characteristics of the new patient. For example, a clinically obese patient may lose weight and no longer meet the weight criterion for the initial cohort, resulting in the patient’s being reassigned to a different cohort with a different weight criterion.
- a different treatment plan may be selected for the new patient, and the treatment device may be controlled distally (e.g., which may be referred to as remotely), based on the different treatment plan, while the new patient uses the treatment device to perform the treatment plan.
- Such techniques may provide the technical solution of distally controlling a treatment device.
- the systems and methods described herein may lead to faster recovery times and/or better results for the patients because the treatment plan that most accurately fits their characteristics is selected and implemented, in real-time, at any given moment. “Real-time” may also refer to near real-time, which may be less than 10 seconds.
- the term “results” may refer to medical results or medical outcomes. Results and outcomes may refer to responses to medical actions.
- the artificial intelligence engine may be trained to output several treatment plans. For example, one result may include recovering to a threshold level (e.g., 75% range of motion) in a fastest amount of time, while another result may include fully recovering (e.g., 100% range of motion) regardless of the amount of time.
- the data obtained from the patients and sorted into cohorts may indicate that a first treatment plan provides the first result for people with characteristics similar to the patient’s, and that a second treatment plan provides the second result for people with characteristics similar to the patient.
- the artificial intelligence engine may be trained to output treatment plans that are not optimal, i.e., sub-optimal, nonstandard, or otherwise excluded (all referred to, without limitation, as “excluded treatment plans”), for the patient. For example, if a patient has high blood pressure, a particular exercise may not be approved or suitable for the patient as it may put the patient at unnecessary risk or even induce a hypertensive crisis and, accordingly, that exercise may be flagged in the excluded treatment plan for the patient.
- the artificial intelligence engine may monitor the treatment data received while the patient (e.g., the user) with, for example, high blood pressure uses the treatment device to perform an appropriate treatment plan. If the treatment data indicates that the patient is handling the appropriate treatment plan without aggravating, for example, the patient’s high blood pressure condition, the artificial intelligence engine may modify the appropriate treatment plan to include features of an excluded treatment plan that may provide beneficial results for the patient.
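- The following sketch illustrates, under assumed thresholds and a simplified plan structure, how a feature of an excluded treatment plan might be added only while monitored vitals remain in a safe range.

```python
# Sketch of conditionally adding a feature from an excluded treatment plan once
# monitored vitals stay within a safe range. Thresholds and the plan structure
# are illustrative assumptions.

def maybe_add_excluded_feature(plan: dict, excluded_feature: dict,
                               systolic_readings: list[float],
                               safe_systolic: float = 135.0) -> dict:
    """Add the excluded feature only if every recent systolic reading is below the
    safe threshold, i.e., the patient is tolerating the current plan."""
    if systolic_readings and max(systolic_readings) < safe_systolic:
        plan = {**plan, "exercises": plan["exercises"] + [excluded_feature]}
    return plan

current_plan = {"exercises": [{"name": "low-resistance cycling", "minutes": 20}]}
excluded = {"name": "interval cycling", "minutes": 10}
updated = maybe_add_excluded_feature(current_plan, excluded, [128.0, 131.0, 127.0])
print(updated)
```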
- the treatment plans and/or excluded treatment plans may be presented, during a telemedicine or telehealth session, to a healthcare provider.
- the healthcare provider may select a particular treatment plan for the patient to cause that treatment plan to be transmitted to the patient and/or to control, based on the treatment plan, the treatment device.
- the artificial intelligence engine may receive and/or operate distally from the patient and the treatment device.
- the recommended treatment plans and/or excluded treatment plans may be presented simultaneously with a video of the patient in real-time or near real-time during a telemedicine or telehealth session on a user interface of a computing device of a healthcare provider.
- the video may also be accompanied by audio, text and other multimedia information.
- Real-time may refer to less than or equal to 2 seconds.
- Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface, and will generally be less than 10 seconds but greater than 2 seconds.
- Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the healthcare provider may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface.
- the enhanced user interface may improve the healthcare provider’s experience using the computing device and may encourage the healthcare provider to reuse the user interface.
- Such a technique may also reduce computing resources (e.g., processing, memory, network) because the healthcare provider does not have to switch to another user interface screen to enter a query for a treatment plan to recommend based on the characteristics of the patient.
- the artificial intelligence engine may be configured to provide, dynamically on the fly, the treatment plans and excluded treatment plans.
- the treatment device may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient.
- the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user.
- a healthcare provider may adapt, remotely during a telemedicine session, the treatment device to the needs of the patient by causing a control instruction to be transmitted from a server to the treatment device.
- Such adaptive nature may improve the results of recovery for a patient, furthering the goals of personalized medicine, and enabling personalization of the treatment plan on a per-individual basis.
- FIG. 1 generally illustrates a block diagram of a computer-implemented system 10, hereinafter called “the system,” for managing a treatment plan.
- Managing the treatment plan may include using an artificial intelligence engine to recommend treatment plans and/or provide excluded treatment plans that should not be recommended to a patient.
- the system 10 also includes a server 30 configured to store (e.g., write to an associated memory) and to provide data related to managing the treatment plan.
- the server 30 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers.
- the server 30 also includes a first communication interface 32 configured to communicate with the clinician interface 20 via a first network 34.
- the first network 34 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the server 30 includes a first processor 36 and a first machine-readable storage memory 38, which may be called a “memory” for short, holding first instructions 40 for performing the various actions of the server 30 for execution by the first processor 36.
- the server 30 is configured to store data regarding the treatment plan.
- the memory 38 includes a system data store 42 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients.
- the server 30 is also configured to store data regarding performance by a patient in following a treatment plan.
- the memory 38 includes a patient data store 44 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient’s performance within the treatment plan.
- the characteristics (e.g., personal, performance, measurement, etc.) of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may use correlations and other statistical or probabilistic measures to enable the partitioning of or to partition the treatment plans into different patient cohort-equivalent databases in the patient data store 44.
- the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patient, and a first result of the treatment plan may be stored in a first patient database.
- the data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patient, and a second result of the treatment plan may be stored in a second patient database. Any single characteristic or any combination of characteristics may be used to separate the cohorts of patients.
- the different cohorts of patients may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different cohorts of patients allowed, other than as limited by mathematical combinatoric and/or partition theory.
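- As an illustration of storing different cohorts in different partitions of the same database, the sketch below uses separate tables in a single database; the table and column names are assumptions for illustration only.

```python
# Sketch of storing different patient cohorts in different tables ("partitions")
# of the same database, as suggested above. Table and column names are
# illustrative assumptions.

import sqlite3

conn = sqlite3.connect(":memory:")
for cohort in ("cohort_a", "cohort_b"):
    conn.execute(
        f"CREATE TABLE {cohort} ("
        "patient_id TEXT, injury TEXT, procedure TEXT, "
        "treatment_plan TEXT, result TEXT)"
    )

# Characteristic data, treatment plan data, and results data collected over time
# are written into the cohort-equivalent table to which the patient belongs.
conn.execute(
    "INSERT INTO cohort_a VALUES (?, ?, ?, ?, ?)",
    ("p-001", "ACL tear", "ACL reconstruction", "30 min/day, 5x/week, 3 weeks", "full recovery"),
)
print(conn.execute("SELECT COUNT(*) FROM cohort_a").fetchone()[0])  # -> 1
```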
- This characteristic data, treatment plan data, and results data may be obtained from numerous treatment devices and/or computing devices over time and stored in the database 44.
- the characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 44.
- the characteristics of the people may include personal information, performance information, and/or measurement information.
- characteristics about a current patient being treated may be stored in an appropriate patient cohort-equivalent database.
- the characteristics of the patient may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A) and the patient may be assigned to that cohort.
- the server 30 may execute an artificial intelligence (AI) engine 11 that uses one or more machine learning models 13 to perform at least one of the embodiments disclosed herein.
- the server 30 may include a training engine 9 capable of generating the one or more machine learning models 13.
- the machine learning models 13 may be trained to assign people to certain cohorts based on their characteristics, select treatment plans using real-time and historical data correlations involving patient cohort-equivalents, and control a treatment device 70, among other things.
- the one or more machine learning models 13 may be generated by the training engine 9 and may be implemented in computer instructions executable by one or more processing devices of the training engine 9 and/or the servers 30. To generate the one or more machine learning models 13, the training engine 9 may train the one or more machine learning models 13. The one or more machine learning models 13 may be used by the artificial intelligence engine 11.
- the training engine 9 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other suitable computing device, or a combination thereof.
- the training engine 9 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
- the training engine 9 may use a training data set of a corpus of the characteristics of the people that used the treatment device 70 to perform treatment plans, the details (e.g., treatment protocol including exercises, amount of time to perform the exercises, how often to perform the exercises, a schedule of exercises, parameters/configurations/settings of the treatment device 70 throughout each step of the treatment plan, etc.) of the treatment plans performed by the people using the treatment device 70, and the results of the treatment plans performed by the people.
- the one or more machine learning models 13 may be trained to match patterns of characteristics of a patient with characteristics of other people assigned to a particular cohort.
- the term “match” may refer to an exact match, a correlative match, a substantial match, etc.
- the one or more machine learning models 13 may be trained to receive the characteristics of a patient as input, map the characteristics to characteristics of people assigned to a cohort, and select a treatment plan from that cohort.
- the one or more machine learning models 13 may also be trained to control, based on the treatment plan, the treatment device 70.
- Different machine learning models 13 may be trained to recommend different treatment plans for different desired results. For example, one machine learning model may be trained to recommend treatment plans for most effective recovery, while another machine learning model may be trained to recommend treatment plans based on speed of recovery.
- the one or more machine learning models 13 may refer to model artifacts created by the training engine 9.
- the training engine 9 may find patterns in the training data wherein such patterns map the training input to the target output, and generate the machine learning models 13 that capture these patterns.
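- A minimal sketch of generating such a model artifact is shown below, using a nearest-neighbor classifier from scikit-learn as an assumed, illustrative library choice; the features, labels, and values are synthetic.

```python
# Sketch of generating a model artifact that maps patient characteristics to a
# cohort label, in the spirit of the training engine described above. The use of
# scikit-learn and the feature set are illustrative assumptions.

from sklearn.neighbors import KNeighborsClassifier

# Each training row: [age, bmi, pain_level]; label: cohort associated with the best result.
X = [[28, 23, 2], [31, 24, 3], [67, 33, 6], [72, 31, 7]]
y = ["cohort_A", "cohort_A", "cohort_B", "cohort_B"]

model = KNeighborsClassifier(n_neighbors=1).fit(X, y)

# The trained artifact maps a new patient's characteristics to a cohort, from which
# a treatment plan may then be selected.
print(model.predict([[30, 25, 2]])[0])  # -> cohort_A
```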
- the artificial intelligence engine 11, the database 33, and/or the training engine 9 may reside on another component (e.g., assistant interface 94, clinician interface 20, etc.) depicted in FIG. 1.
- the one or more machine learning models 13 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 13 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations.
- deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself).
- the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
- the system 10 also includes a patient interface 50 configured to communicate information to a patient and to receive feedback from the patient.
- the patient interface includes an input device 52 and an output device 54, which may be collectively called a patient user interface 52, 54.
- the input device 52 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition.
- the output device 54 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch.
- the output device 54 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc.
- the output device 54 may incorporate various different visual, audio, or other presentation technologies.
- the output device 54 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions.
- the output device 54 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient.
- the output device 54 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the patient interface 50 includes a second communication interface 56, which may also be called a remote communication interface configured to communicate with the server 30 and/or the clinician interface 20 via a second network 58.
- the second network 58 may include a local area network (LAN), such as an Ethernet network.
- the second network 58 may include the Internet, and communications between the patient interface 50 and the server 30 and/or the clinician interface 20 may be secured via encryption, such as, for example, by using a virtual private network (VPN).
- the second network 58 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the second network 58 may be the same as and/or operationally coupled to the first network 34.
- the patient interface 50 includes a second processor 60 and a second machine-readable storage memory 62 holding second instructions 64 for execution by the second processor 60 for performing various actions of patient interface 50.
- the second machine-readable storage memory 62 also includes a local data store 66 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient’s performance within a treatment plan.
- the patient interface 50 also includes a local communication interface 68 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 50.
- the local communication interface 68 may include wired and/or wireless communications.
- the local communication interface 68 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the system 10 also includes a treatment device 70 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan.
- the treatment device 70 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group.
- the treatment device 70 may be any suitable medical, rehabilitative, therapeutic, etc. device.
- the treatment device 70 may be an electromechanical machine including one or more weights, an electromechanical bicycle, an electromechanical spin-wheel, a smart-mirror, a treadmill, or the like.
- the body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder.
- the body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament.
- the treatment device 70 includes a controller 72, which may include one or more processors, computer memory, and/or other components.
- the treatment device 70 also includes a fourth communication interface 74 configured to communicate with the patient interface 50 via the local communication interface 68.
- the treatment device 70 also includes one or more internal sensors 76 and an actuator 78, such as a motor.
- the actuator 78 may be used, for example, for moving the patient’s body part and/or for resisting forces by the patient.
- the internal sensors 76 may measure one or more operating characteristics of the treatment device 70 such as, for example, a force, a position, a speed, and/or a velocity.
- the internal sensors 76 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient.
- an internal sensor 76 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment device 70, where such distance may correspond to a range of motion that the patient’s body part is able to achieve.
- the internal sensors 76 may include a force sensor configured to measure a force applied by the patient.
- an internal sensor 76 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment device 70.
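- The sketch below illustrates, with assumed units and sample values, how raw position-sensor and force-sensor samples might be reduced to the range-of-motion and force measurements described above.

```python
# Sketch of turning raw internal-sensor samples into the measurements described
# above: range of motion from a position sensor and peak applied force from a
# force sensor. Units and sample values are illustrative assumptions.

knee_angles_deg = [31.0, 44.5, 58.0, 69.5, 72.0, 70.5, 55.0, 40.0]  # position-sensor samples
pedal_forces_lbs = [5.0, 9.5, 12.5, 11.0, 8.0]                      # force-sensor samples

range_of_motion = max(knee_angles_deg) - min(knee_angles_deg)   # degrees achieved this session
peak_force = max(pedal_forces_lbs)                              # heaviest force applied

print(f"Range of motion: {range_of_motion:.1f} deg, peak force: {peak_force:.1f} lbs")
```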
- the system 10 generally illustrated in FIG. 1 also includes an ambulation sensor 82, which communicates with the server 30 via the local communication interface 68 of the patient interface 50.
- the ambulation sensor 82 may track and store a number of steps taken by the patient.
- the ambulation sensor 82 may take the form of a wristband, wristwatch, or smart watch.
- the ambulation sensor 82 may be integrated within a phone, such as a smartphone.
- the system 10 generally illustrated in FIG. 1 also includes a goniometer 84, which communicates with the server 30 via the local communication interface 68 of the patient interface 50.
- the goniometer 84 measures an angle of the patient’s body part.
- the goniometer 84 may measure the angle of flex of a patient’s knee or elbow or shoulder.
- the system 10 generally illustrated in FIG. 1 also includes a pressure sensor 86, which communicates with the server 30 via the local communication interface 68 of the patient interface 50.
- the pressure sensor 86 measures an amount of pressure or weight applied by a body part of the patient.
- pressure sensor 86 may measure an amount of force applied by a patient’s foot when pedaling a stationary bike.
- the system 10 generally illustrated in FIG. 1 also includes a supervisory interface 90 which may be similar or identical to the clinician interface 20. In some embodiments, the supervisory interface 90 may have enhanced functionality beyond what is provided on the clinician interface 20. The supervisory interface 90 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
- the system 10 generally illustrated in FIG. 1 also includes a reporting interface 92 which may be similar or identical to the clinician interface 20. In some embodiments, the reporting interface 92 may have less functionality than what is provided on the clinician interface 20. For example, the reporting interface 92 may not have the ability to modify a treatment plan. Such a reporting interface 92 may be used, for example, by a biller to determine the use of the system 10 for billing purposes.
- the reporting interface 92 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject. Such a reporting interface 92 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
- the system 10 includes an assistant interface 94 for a healthcare provider, such as those described herein, to remotely communicate with the patient interface 50 and/or the treatment device 70. Such remote communications may enable the healthcare provider to provide assistance or guidance to a patient using the system 10. More specifically, the assistant interface 94 is configured to communicate a telemedicine signal 96, 97, 98a, 98b, 99a, 99b with the patient interface 50 via a network connection such as, for example, via the first network 34 and/or the second network 58.
- the telemedicine signal 96, 97, 98a, 98b, 99a, 99b comprises one of an audio signal 96, an audiovisual signal 97, an interface control signal 98a for controlling a function of the patient interface 50, an interface monitor signal 98b for monitoring a status of the patient interface 50, an apparatus control signal 99a for changing an operating parameter of the treatment device 70, and/or an apparatus monitor signal 99b for monitoring a status of the treatment device 70.
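- One possible way to represent these signal variants in software is a simple tagged message type, sketched below; the enum names and payload shape are illustrative assumptions.

```python
# Sketch of representing the telemedicine signal variants listed above as a simple
# tagged message type. The enum names and payload shape are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum, auto

class SignalKind(Enum):
    AUDIO = auto()               # audio signal 96
    AUDIOVISUAL = auto()         # audiovisual signal 97
    INTERFACE_CONTROL = auto()   # interface control signal 98a
    INTERFACE_MONITOR = auto()   # interface monitor signal 98b
    APPARATUS_CONTROL = auto()   # apparatus control signal 99a
    APPARATUS_MONITOR = auto()   # apparatus monitor signal 99b

@dataclass
class TelemedicineSignal:
    kind: SignalKind
    payload: dict

# Example: an apparatus control signal changing an operating parameter of the treatment device.
signal = TelemedicineSignal(SignalKind.APPARATUS_CONTROL, {"resistance": 7})
print(signal)
```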
- each of the control signals 98a, 99a may be unidirectional, conveying commands from the assistant interface 94 to the patient interface 50.
- an acknowledgement message may be sent from the patient interface 50 to the assistant interface 94.
- each of the monitor signals 98b, 99b may be unidirectional, status-information commands from the patient interface 50 to the assistant interface 94.
- an acknowledgement message may be sent from the assistant interface 94 to the patient interface 50 in response to successfully receiving one of the monitor signals 98b, 99b.
- the patient interface 50 may be configured as a pass-through for the apparatus control signals 99a and the apparatus monitor signals 99b between the treatment device 70 and one or more other devices, such as the assistant interface 94 and/or the server 30.
- the patient interface 50 may be configured to transmit an apparatus control signal 99a in response to an apparatus control signal 99a within the telemedicine signal 96, 97, 98a, 98b, 99a, 99b from the assistant interface 94.
- the assistant interface 94 may be presented on a shared physical device as the clinician interface 20.
- the clinician interface 20 may include one or more screens that implement the assistant interface 94.
- the clinician interface 20 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 94.
- one or more portions of the telemedicine signal 96, 97, 98a, 98b, 99a, 99b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 54 of the patient interface 50.
- a tutorial video may be streamed from the server 30 and presented upon the patient interface 50.
- Content from the prerecorded source may be requested by the patient via the patient interface 50.
- the healthcare provider may cause content from the prerecorded source to be played on the patient interface 50.
- the assistant interface 94 includes an assistant input device 22 and an assistant display 24, which may be collectively called an assistant user interface 22, 24.
- the assistant input device 22 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example.
- the assistant input device 22 may include one or more microphones.
- the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the healthcare provider to speak to a patient via the patient interface 50.
- the assistant input device 22 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the healthcare provider by using the one or more microphones.
- the assistant input device 22 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung.
- the assistant input device 22 may include other hardware and/or software components.
- the assistant input device 22 may include one or more general purpose devices and/or special-purpose devices.
- the assistant display 24 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch.
- the assistant display 24 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc.
- the assistant display 24 may incorporate various different visual, audio, or other presentation technologies.
- the assistant display 24 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions.
- the assistant display 24 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the healthcare provider.
- the assistant display 24 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the system 10 may provide computer translation of language from the assistant interface 94 to the patient interface 50 and/or vice-versa.
- the computer translation of language may include computer translation of spoken language and/or computer translation of text.
- the system 10 may provide voice recognition and/or spoken pronunciation of text.
- the system 10 may convert spoken words to printed text and/or the system 10 may audibly speak language from printed text.
- the system 10 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the healthcare provider.
- the system 10 may be configured to recognize and react to spoken requests or commands by the patient.
- the system 10 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
- the server 30 may generate aspects of the assistant display 24 for presentation by the assistant interface 94.
- the server 30 may include a web server configured to generate the display screens for presentation upon the assistant display 24.
- the artificial intelligence engine 11 may generate recommended treatment plans and/or excluded treatment plans for patients and generate the display screens including those recommended treatment plans and/or excluded treatment plans for presentation on the assistant display 24 of the assistant interface 94.
- the assistant display 24 may be configured to present a virtualized desktop hosted by the server 30.
- the server 30 may be configured to communicate with the assistant interface 94 via the first network 34.
- the first network 34 may include a local area network (LAN), such as an Ethernet network.
- the first network 34 may include the Internet, and communications between the server 30 and the assistant interface 94 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN).
- the server 30 may be configured to communicate with the assistant interface 94 via one or more networks independent of the first network 34 and/or other communication means, such as a direct wired or wireless communication channel.
- the patient interface 50 and the treatment device 70 may each operate from a patient location geographically separate from a location of the assistant interface 94.
- the patient interface 50 and the treatment device 70 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 94 at a centralized location, such as a clinic or a call center.
- the assistant interface 94 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians’ offices. In some embodiments, a plurality of assistant interfaces 94 may be distributed geographically. In some embodiments, a person may work as a healthcare provider remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 94 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part time and/or flexible work hours for a healthcare provider.
- FIGS. 2-3 show an embodiment of a treatment device 70. More specifically, FIG. 2 generally illustrates the treatment device 70 in the form of a stationary cycling machine 100.
- the stationary cycling machine 100 includes a set of pedals 102 each attached to a pedal arm 104 for rotation about an axle 106.
- the pedals 102 are movable on the pedal arms 104 in order to adjust a range of motion used by the patient in pedaling.
- the pedals being located inwardly toward the axle 106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 106.
- a pressure sensor 86 is attached to or embedded within one of the pedals 102 for measuring an amount of force applied by the patient on the pedal 102.
- FIG. 4 generally illustrates a person (a patient) using the treatment device of FIG. 2, and showing sensors and various data parameters connected to a patient interface 50.
- the example patient interface 50 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient.
- the patient interface 50 may be embedded within or attached to the treatment device 70.
- FIG. 4 generally illustrates the patient wearing the ambulation sensor 82 on his wrist, with a note showing “STEPS TODAY 1355”, indicating that the ambulation sensor 82 has recorded and transmitted that step count to the patient interface 50.
- FIG. 4 also generally illustrates the patient wearing the goniometer 84 on his right knee, with a note showing “KNEE ANGLE 72°”, indicating that the goniometer 84 is measuring and transmitting that knee angle to the patient interface 50.
- FIG. 4 also generally illustrates a right side of one of the pedals 102 with a pressure sensor 86 showing “FORCE 12.5 lbs.,” indicating that the right pedal pressure sensor 86 is measuring and transmitting that force measurement to the patient interface 50.
- FIG. 4 also generally illustrates a left side of one of the pedals 102 with a pressure sensor 86 showing “FORCE 27 lbs.”, indicating that the left pedal pressure sensor 86 is measuring and transmitting that force measurement to the patient interface 50.
- FIG. 4 also generally illustrates other patient data, such as an indicator of “SESSION TIME 0:04:13”, indicating that the patient has been using the treatment device 70 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 50 based on information received from the treatment device 70.
- FIG. 4 also generally illustrates an indicator showing “PAIN LEVEL 3”. Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 50.
- FIG. 5 is an example embodiment of an overview display 120 of the assistant interface 94.
- the overview display 120 presents several different controls and interfaces for the healthcare provider to remotely assist a patient with using the patient interface 50 and/or the treatment device 70.
- This remote assistance functionality may also be called telemedicine or telehealth.
- the overview display 120 includes a patient profile display 130 presenting biographical information regarding a patient using the treatment device 70.
- the patient profile display 130 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5, although the patient profile display 130 may take other forms, such as a separate screen or a popup window.
- the patient profile display 130 may include a limited subset of the patient’s biographical information. More specifically, the data presented upon the patient profile display 130 may depend upon the healthcare provider’s need for that information. For example, a healthcare provider that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment device 70 may be provided with a much more limited set of information regarding the patient. The technician, for example, may be given only the patient’s name.
- the patient profile display 130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements.
- privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a “data subject”.
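- As a simplified illustration of such privacy-enhancing processing, the sketch below pseudonymizes identifying fields before they reach a reporting or profile display; the salt handling and field list are assumptions for illustration, not a compliance implementation.

```python
# Sketch of pseudonymizing identifying fields in a patient record, in the spirit
# of the privacy-enhancing techniques mentioned above. The salt handling and the
# list of identifying fields are illustrative assumptions.

import hashlib

IDENTIFYING_FIELDS = {"name", "date_of_birth", "address"}

def pseudonymize(record: dict, salt: str = "example-salt") -> dict:
    """Replace identifying fields with salted, truncated hashes; pass other fields through."""
    out = {}
    for key, value in record.items():
        if key in IDENTIFYING_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]
            out[key] = f"pseud-{digest}"
        else:
            out[key] = value
    return out

print(pseudonymize({"name": "Jane Doe", "date_of_birth": "1980-01-01", "knee_angle": 72}))
```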
- the patient profile display 130 may present information regarding the treatment plan for the patient to follow in using the treatment device 70.
- Such treatment plan information may be limited to a healthcare provider.
- a healthcare provider assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment device 70 may not be provided with any information regarding the patient’s treatment plan.
- one or more recommended treatment plans and/or excluded treatment plans may be presented in the patient profile display 130 to the healthcare provider.
- the one or more recommended treatment plans and/or excluded treatment plans may be generated by the artificial intelligence engine 11 of the server 30 and received from the server 30 in real-time during, inter alia, a telemedicine or telehealth session.
- An example of presenting the one or more recommended treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 7.
- the example overview display 120 generally illustrated in FIG. 5 also includes a patient status display 134 presenting status information regarding a patient using the treatment device.
- the patient status display 134 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5, although the patient status display 134 may take other forms, such as a separate screen or a popup window.
- the patient status display 134 includes sensor data 136 from one or more of the external sensors 82, 84, 86, and/or from one or more internal sensors 76 of the treatment device 70.
- the patient status display 134 may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 70.
- the one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like.
- the one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the patient while the patient is using the treatment device 70.
- the patient status display 134 may present other data 138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
- User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 20, 50, 90, 92, 94 of the system 10.
- user access controls may be employed to control what information is available to any given person using the system 10.
- data presented on the assistant interface 94 may be controlled by user access controls, with permissions set depending on the healthcare provider/user’s need for and/or qualifications to view that information.
- the example overview display 120 generally illustrated in FIG. 5 also includes a help data display 140 presenting information for the healthcare provider to use in assisting the patient.
- the help data display 140 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
- the help data display 140 may take other forms, such as a separate screen or a popup window.
- the help data display 140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 50 and/or the treatment device 70.
- the help data display 140 may also include research data or best practices. In some embodiments, the help data display 140 may present scripts for answers or explanations in response to patient questions. In some embodiments, the help data display 140 may present flow charts or walk-throughs for the healthcare provider to use in determining a root cause and/or solution to a patient’s problem.
- the assistant interface 94 may present two or more help data displays 140, which may be the same or different, for simultaneous presentation of help data for use by the healthcare provider. For example, a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient’s problem, and a second help data display may present script information for the healthcare provider to read to the patient, such information to preferably include directions for the patient to perform some action, which may help to narrow down or solve the problem.
- the second help data display may automatically populate with script information.
- the example overview display 120 generally illustrated in FIG. 5 also includes a patient interface control 150 presenting information regarding the patient interface 50, and/or to modify one or more settings of the patient interface 50.
- the patient interface control 150 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
- the patient interface control 150 may take other forms, such as a separate screen or a popup window.
- the patient interface control 150 may present information communicated to the assistant interface 94 via one or more of the interface monitor signals 98b.
- the patient interface control 150 includes a display feed 152 of the display presented by the patient interface 50.
- the display feed 152 may include a live copy of the display screen currently being presented to the patient by the patient interface 50. In other words, the display feed 152 may present an image of what is presented on a display screen of the patient interface 50.
- the display feed 152 may include abbreviated information regarding the display screen currently being presented by the patient interface 50, such as a screen name or a screen number.
- the patient interface control 150 may include a patient interface setting control 154 for the healthcare provider to adjust or to control one or more settings or aspects of the patient interface 50.
- the patient interface setting control 154 may cause the assistant interface 94 to generate and/or to transmit an interface control signal 98 for controlling a function or a setting of the patient interface 50.
- the patient interface setting control 154 may include collaborative browsing or co-browsing capability for the healthcare provider to remotely view and/or control the patient interface 50.
- the patient interface setting control 154 may enable the healthcare provider to remotely enter text to one or more text entry fields on the patient interface 50 and/or to remotely control a cursor on the patient interface 50 using a mouse or touchscreen of the assistant interface 94.
- the patient interface setting control 154 may allow the healthcare provider to change a setting that cannot be changed by the patient.
- the patient interface 50 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 50, the language used for the displays, whereas the patient interface setting control 154 may enable the healthcare provider to change the language setting of the patient interface 50.
- the patient interface 50 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 50 such that the display would become illegible to the patient, whereas the patient interface setting control 154 may provide for the healthcare provider to change the font size setting of the patient interface 50.
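- The sketch below illustrates, with assumed role names and an assumed list of protected settings, how a setting change might be permitted from the assistant interface but refused from the patient interface.

```python
# Sketch of a permission gate in which some patient-interface settings (e.g.,
# language, font size) can only be changed by the healthcare provider. Role names
# and the protected-setting list are illustrative assumptions.

PROVIDER_ONLY_SETTINGS = {"language", "font_size"}

def update_setting(settings: dict, key: str, value, role: str) -> dict:
    """Apply a setting change only if the requesting role is allowed to change it."""
    if key in PROVIDER_ONLY_SETTINGS and role != "healthcare_provider":
        raise PermissionError(f"{key} can only be changed by the healthcare provider")
    return {**settings, key: value}

settings = {"language": "en", "font_size": 14}
settings = update_setting(settings, "language", "es", role="healthcare_provider")  # allowed
# update_setting(settings, "font_size", 8, role="patient")  # would raise PermissionError
print(settings)
```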
- the example overview display 120 generally illustrated in FIG. 5 also includes an interface communications display 156 showing the status of communications between the patient interface 50 and one or more other devices 70, 82, 84, such as the treatment device 70, the ambulation sensor 82, and/or the goniometer 84.
- the interface communications display 156 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
- the interface communications display 156 may take other forms, such as a separate screen or a popup window.
- the interface communications display 156 may include controls for the healthcare provider to remotely modify communications with one or more of the other devices 70, 82, 84.
- the healthcare provider may remotely command the patient interface 50 to reset communications with one of the other devices 70, 82, 84, or to establish communications with a new one of the other devices 70, 82, 84.
- This functionality may be used, for example, where the patient has a problem with one of the other devices 70, 82, 84, or where the patient receives a new or a replacement one of the other devices 70, 82, 84.
- the example overview display 120 generally illustrated in FIG. 5 also includes an apparatus control 160 for the healthcare provider to view and/or to control information regarding the treatment device 70.
- the apparatus control 160 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
- the apparatus control 160 may take other forms, such as a separate screen or a popup window.
- the apparatus control 160 may include an apparatus status display 162 with information regarding the current status of the apparatus.
- the apparatus status display 162 may present information communicated to the assistant interface 94 via one or more of the apparatus monitor signals 99b.
- the apparatus status display 162 may indicate whether the treatment device 70 is currently communicating with the patient interface 50.
- the apparatus status display 162 may present other current and/or historical information regarding the status of the treatment device 70.
- the apparatus control 160 may include an apparatus setting control 164 for the healthcare provider to adjust or control one or more aspects of the treatment device 70.
- the apparatus setting control 164 may cause the assistant interface 94 to generate and/or to transmit an apparatus control signal 99 (e.g., which may be referred to as treatment plan input, as described) for changing an operating parameter and/or one or more characteristics of the treatment device 70 (e.g., a pedal radius setting, a resistance setting, a target RPM, other suitable characteristics of the treatment device 70, or a combination thereof).
- the apparatus setting control 164 may include a mode button 166 and a position control 168, which may be used in conjunction for the healthcare provider to place an actuator 78 of the treatment device 70 in a manual mode, after which a setting, such as a position or a speed of the actuator 78, can be changed using the position control 168.
- the mode button 166 may provide for a setting, such as a position, to be toggled between automatic and manual modes.
- one or more settings may be adjustable at any time, and without having an associated auto/manual mode.
- the healthcare provider may change an operating parameter of the treatment device 70, such as a pedal radius setting, while the patient is actively using the treatment device 70. Such “on the fly” adjustment may or may not be available to the patient using the patient interface 50.
- the apparatus setting control 164 may allow the healthcare provider to change a setting that cannot be changed by the patient using the patient interface 50.
- the patient interface 50 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment device 70, whereas the apparatus setting control 164 may provide for the healthcare provider to change the height or tilt setting of the treatment device 70.
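- A minimal sketch of composing such an apparatus control signal, including an assumed automatic/manual mode flag and illustrative parameter names, is shown below.

```python
# Sketch of the apparatus setting control building an apparatus control signal
# (treatment plan input) that changes operating parameters of the treatment
# device, including toggling the actuator between automatic and manual modes.
# Parameter names and values are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class ApparatusControlSignal:
    mode: str = "automatic"                 # "automatic" or "manual"
    parameters: dict = field(default_factory=dict)

def build_control_signal(mode: str, **parameters) -> ApparatusControlSignal:
    """Compose the signal sent from the assistant interface toward the treatment device."""
    return ApparatusControlSignal(mode=mode, parameters=parameters)

# Healthcare provider switches the actuator to manual mode and adjusts settings on the fly.
signal = build_control_signal("manual", pedal_radius_cm=15.0, resistance=5, target_rpm=60)
print(signal)
```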
- the example overview display 120 generally illustrated in FIG. 5 also includes a patient communications control 170 for controlling an audio or an audiovisual communications session with the patient interface 50.
- the communications session with the patient interface 50 may comprise a live feed from the assistant interface 94 for presentation by the output device of the patient interface 50.
- the live feed may take the form of an audio feed and/or a video feed.
- the patient interface 50 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 94.
- the communications session with the patient interface 50 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 50 and the assistant interface 94 presenting video of the other one.
- the patient interface 50 may present video from the assistant interface 94, while the assistant interface 94 presents only audio or the assistant interface 94 presents no live audio or visual signal from the patient interface 50.
- the assistant interface 94 may present video from the patient interface 50, while the patient interface 50 presents only audio or the patient interface 50 presents no live audio or visual signal from the assistant interface 94.
- the audio or an audiovisual communications session with the patient interface 50 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part.
- the patient communications control 170 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
- the patient communications control 170 may take other forms, such as a separate screen or a popup window.
- the audio and/or audiovisual communications may be processed and/or directed by the assistant interface 94 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the healthcare provider while the healthcare provider uses the assistant interface 94.
- the audio and/or audiovisual communications may include communications with a third party.
- the system 10 may enable the healthcare provider to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a healthcare provider or a specialist.
- the example patient communications control 170 generally illustrated in FIG. 5 includes call controls 172 for the healthcare provider to use in managing various aspects of the audio or audiovisual communications with the patient.
- the call controls 172 include a disconnect button 174 for the healthcare provider to end the audio or audiovisual communications session.
- the call controls 172 also include a mute button 176 to temporarily silence an audio or audiovisual signal from the assistant interface 94.
- the call controls 172 may include other features, such as a hold button (not shown).
- the call controls 172 also include one or more record/playback controls 178, such as record, play, and pause buttons to control, with the patient interface 50, recording and/or playback of audio and/or video from the teleconference session.
- the call controls 172 also include a video feed display 180 for presenting still and/or video images from the patient interface 50, and a self-video display 182 showing the current image of the healthcare provider using the assistant interface 94.
- the self-video display 182 may be presented as a picture-in-picture format, within a section of the video feed display 180, as is generally illustrated in FIG. 5. Alternatively or additionally, the self-video display 182 may be presented separately and/or independently from the video feed display 180.
- the example overview display 120 generally illustrated in FIG. 5 also includes a third party communications control 190 for use in conducting audio and/or audiovisual communications with a third party.
- the third party communications control 190 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
- the third party communications control 190 may take other forms, such as a display on a separate screen or a popup window.
- the third party communications control 190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a healthcare provider or a specialist.
- the third party communications control 190 may include conference calling capability for the third party to simultaneously communicate with both the healthcare provider via the assistant interface 94, and with the patient via the patient interface 50.
- the system 10 may provide for the healthcare provider to initiate a 3-way conversation with the patient and the third party.
- FIG. 6 generally illustrates an example block diagram of training a machine learning model 13 to output, based on data 600 pertaining to the patient, a treatment plan 602 for the patient according to the present disclosure.
- Data pertaining to other patients may be received by the server 30.
- the other patients may have used various treatment devices to perform treatment plans.
- the data may include characteristics of the other patients, the details of the treatment plans performed by the other patients, and/or the results of performing the treatment plans (e.g., a percent of recovery of a portion of the patients’ bodies, an amount of recovery of a portion of the patients’ bodies, an amount of increase or decrease in muscle strength of a portion of patients’ bodies, an amount of increase or decrease in range of motion of a portion of patients’ bodies, etc.).
- Cohort A includes data for patients having similar first characteristics, first treatment plans, and first results.
- Cohort B includes data for patients having similar second characteristics, second treatment plans, and second results.
- cohort A may include first characteristics of patients in their twenties without any medical conditions who underwent surgery for a broken limb; their treatment plans may include a certain treatment protocol (e.g., use the treatment device 70 for 30 minutes 5 times a week for 3 weeks, wherein values for the properties, configurations, and/or settings of the treatment device 70 are set to X (where X is a numerical value) for the first two weeks and to Y (where Y is a numerical value) for the last week).
- Cohort A and cohort B may be included in a training dataset used to train the machine learning model 13.
- the machine learning model 13 may be trained to match a pattern between characteristics for each cohort and output the treatment plan that provides the result. Accordingly, when the data 600 for a new patient is input into the trained machine learning model 13, the trained machine learning model 13 may match the characteristics included in the data 600 with characteristics in either cohort A or cohort B and output the appropriate treatment plan 602. In some embodiments, the machine learning model 13 may be trained to output one or more excluded treatment plans that should not be performed by the new patient.
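- A minimal sketch of the cohort-matching idea described above is shown below: a new patient is assigned to the nearest cohort using a couple of numeric characteristics, and that cohort's treatment plan is returned. The feature choices, the cohort B centroid, and the nearest-centroid rule are illustrative assumptions rather than the trained machine learning model 13.

```python
# Hypothetical sketch of matching a new patient's characteristics to a cohort and
# returning the cohort's treatment plan. Centroids and features are illustrative.
from math import dist

COHORTS = {
    "A": {"centroid": [25.0, 0.0],   # [age, number of medical conditions]
          "plan": "Use treatment device 30 minutes, 5 times a week, for 3 weeks"},
    "B": {"centroid": [65.0, 2.0],
          "plan": "Use treatment device 10 minutes a day for 3 days"},
}


def recommend_plan(age, num_conditions):
    """Return the nearest cohort and its associated treatment plan."""
    features = [float(age), float(num_conditions)]
    best = min(COHORTS, key=lambda name: dist(features, COHORTS[name]["centroid"]))
    return best, COHORTS[best]["plan"]


if __name__ == "__main__":
    cohort, plan = recommend_plan(age=28, num_conditions=0)
    print(f"Matched cohort {cohort}: {plan}")
```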
- FIG. 7 generally illustrates an embodiment of an overview display 120 of the assistant interface 94 presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the present disclosure.
- the overview display 120 includes only sections for the patient profile 130 and the video feed display 180, including the self-video display 182. Any suitable configuration of controls and interfaces of the overview display 120 described with reference to FIG. 5 may be presented in addition to or instead of the patient profile 130, the video feed display 180, and the self-video display 182.
- the healthcare provider using the assistant interface 94 (e.g., computing device) during the telemedicine session may be presented in the self-video 182 in a portion of the overview display 120 (e.g., user interface presented on a display screen 24 of the assistant interface 94) that also presents a video from the patient in the video feed display 180.
- the video feed display 180 may also include a graphical user interface (GUI) object 700 (e.g., a button) that enables the healthcare provider to share, in real-time or near real-time during the telemedicine session, the recommended treatment plans and/or the excluded treatment plans with the patient on the patient interface 50.
- the healthcare provider may select the GUI object 700 to share the recommended treatment plans and/or the excluded treatment plans.
- another portion of the overview display 120 includes the patient profile display 130.
- the patient profile display 130 is presenting two example recommended treatment plans 600 and one example excluded treatment plan 602.
- the treatment plans may be recommended in view of characteristics of the patient being treated.
- To generate the treatment plans that the patient should follow to achieve a desired result, a pattern between the characteristics of the patient being treated and a cohort of other people who have used the treatment device 70 to perform a treatment plan may be matched by one or more machine learning models 13 of the artificial intelligence engine 11.
- Each of the recommended treatment plans may be generated based on different desired results.
- the patient profile display 130 presents “The characteristics of the patient match characteristics of users in Cohort A. The following treatment plans are recommended for the patient based on his characteristics and desired results.” Then, the patient profile display 130 presents recommended treatment plans from cohort A, and each treatment plan provides different results.
- treatment plan “A” indicates “Patient X should use treatment device for 30 minutes a day for 4 days to achieve an increased range of motion of Y%; Patient X has Type 2 Diabetes; and Patient X should be prescribed medication Z for pain management during the treatment plan (medication Z is approved for people having Type 2 Diabetes).” Accordingly, the treatment plan generated achieves increasing the range of motion of Y%.
- the treatment plan also includes a recommended medication (e.g., medication Z) to prescribe to the patient to manage pain in view of a known medical disease (e.g., Type 2 Diabetes) of the patient. That is, the recommended patient medication not only does not conflict with the medical condition of the patient but also improves the probability of a superior patient outcome.
- Recommended treatment plan “B” may specify, based on a different desired result of the treatment plan, a different treatment plan including a different treatment protocol for a treatment device, a different medication regimen, etc.
- the patient profile display 130 may also present the excluded treatment plans 602. These types of treatment plans are shown to the healthcare provider using the assistant interface 94 to alert the healthcare provider not to recommend certain portions of a treatment plan to the patient.
- the excluded treatment plan could specify the following: “Patient X should not use treatment device for longer than 30 minutes a day due to a heart condition; Patient X has Type 2 Diabetes; and Patient X should not be prescribed medication M for pain management during the treatment plan (in this scenario, medication M can cause complications for people having Type 2 Diabetes).”
- the excluded treatment plan points out a limitation of a treatment protocol where, due to a heart condition, Patient X should not exercise for more than 30 minutes a day.
- the ruled-out treatment plan also points out that Patient X should not be prescribed medication M because it conflicts with the medical condition Type 2 Diabetes.
- the healthcare provider may select the treatment plan for the patient on the overview display 120.
- the healthcare provider may use an input peripheral (e.g., mouse, touchscreen, microphone, keyboard, etc.) to select from the treatment plans 600 for the patient.
- the healthcare provider may discuss the pros and cons of the recommended treatment plans 600 with the patient.
- the healthcare provider may select the treatment plan for the patient to follow to achieve the desired result.
- the selected treatment plan may be transmitted to the patient interface 50 for presentation.
- the patient may view the selected treatment plan on the patient interface 50.
- the healthcare provider and the patient may discuss during the telemedicine session the details (e.g., treatment protocol using treatment device 70, diet regimen, medication regimen, etc.) in real-time or in near real-time.
- the server 30 may control, based on the selected treatment plan and during the telemedicine session, the treatment device 70 as the user uses the treatment device 70.
- FIG. 8 generally illustrates an embodiment of the overview display 120 of the assistant interface 94 presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the present disclosure.
- the treatment device 70 and/or any computing device may transmit data while the patient uses the treatment device 70 to perform a treatment plan.
- the data may include updated characteristics of the patient and/or other treatment data.
- the updated characteristics may include new performance information and/or measurement information.
- the performance information may include a speed of a portion of the treatment device 70, a range of motion achieved by the patient, a force exerted on a portion of the treatment device 70, a heartrate of the patient, a blood pressure of the patient, a respiratory rate of the patient, and so forth.
- the data received at the server 30 may be input into the trained machine learning model 13, which may determine that the characteristics indicate the patient is on track for the current treatment plan. Determining the patient is on track for the current treatment plan may cause the trained machine learning model 13 to adjust a parameter of the treatment device 70. The adjustment may be based on a next step of the treatment plan to further improve the performance of the patient.
- the data received at the server 30 may be input into the trained machine learning model 13, which may determine that the characteristics indicate the patient is not on track (e.g., behind schedule, not able to maintain a speed, not able to achieve a certain range of motion, is in too much pain, etc.) for the current treatment plan or is ahead of schedule (e.g., exceeding a certain speed, exercising longer than specified with no pain, exerting more than a specified force, etc.) for the current treatment plan.
- the trained machine learning model 13 may determine that the characteristics of the patient no longer match the characteristics of the patients in the cohort to which the patient is assigned. Accordingly, the trained machine learning model 13 may reassign the patient to another cohort that includes characteristics matching the patient’s characteristics. As such, the trained machine learning model 13 may select a new treatment plan from the new cohort and control, based on the new treatment plan, the treatment device 70.
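- The reassignment step can be sketched as follows, assuming (purely for illustration) two numeric characteristics, hand-picked cohort centroids, and a distance threshold; the disclosed system would instead rely on the trained machine learning model 13. The cohort plans mirror the examples given elsewhere in this description (30 minutes a day for 4 days; 10 minutes a day for 3 days).

```python
# Hypothetical sketch of reassigning a patient whose updated characteristics no
# longer match the assigned cohort. Features, centroids, and the threshold are
# illustrative assumptions rather than the disclosed model.
from math import dist

COHORT_CENTROIDS = {"A": [40.0, 90.0], "B": [25.0, 60.0]}  # [minutes tolerated, ROM degrees]
COHORT_PLANS = {"A": {"minutes_per_day": 30, "days": 4},
                "B": {"minutes_per_day": 10, "days": 3}}
MATCH_THRESHOLD = 20.0  # maximum distance still considered a match


def reassess(current_cohort, updated_features):
    """Keep the current cohort if it still matches; otherwise pick the nearest cohort."""
    if dist(updated_features, COHORT_CENTROIDS[current_cohort]) <= MATCH_THRESHOLD:
        return current_cohort, COHORT_PLANS[current_cohort]
    new_cohort = min(COHORT_CENTROIDS,
                     key=lambda name: dist(updated_features, COHORT_CENTROIDS[name]))
    return new_cohort, COHORT_PLANS[new_cohort]


if __name__ == "__main__":
    cohort, plan = reassess("A", updated_features=[12.0, 55.0])
    print(f"Patient now assigned to cohort {cohort}; plan: {plan}")
```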
- the server 30 may provide the new treatment plan 800 to the assistant interface 94 for presentation in the patient profile 130.
- the patient profile 130 indicates “The characteristics of the patient have changed and now match characteristics of users in Cohort B. The following treatment plan is recommended for the patient based on his characteristics and desired results.”
- the patient profile 130 presents the new treatment plan 800 (“Patient X should use the treatment device for 10 minutes a day for 3 days to achieve an increased range of motion of L%.”).
- the healthcare provider may select the new treatment plan 800, and the server 30 may receive the selection.
- the server 30 may control the treatment device 70 based on the new treatment plan 800.
- the new treatment plan 800 may be transmitted to the patient interface 50 such that the patient may view the details of the new treatment plan 800.
- the server 30 may receive treatment data pertaining to the patient.
- the treatment plan may correspond to a rehabilitation treatment plan, a prehabilitation treatment plan, an exercise treatment plan, or other suitable treatment plan.
- the treatment data may include various characteristics of the patient (e.g., such as those described herein), various measurement information pertaining to the patient while the patient uses the treatment device 70 (e.g., such as those described herein), various characteristics of the treatment device 70 (e.g., such as those described herein), the treatment plan, other suitable data, or a combination thereof.
- At least some of the treatment data may include the sensor data 136 from one or more of the external sensors 82, 84, 86, and/or from one or more internal sensors 76 of the treatment device 70. In some embodiments, at least some of the treatment data may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 70.
- the one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, a head sweatband, a wrist sweatband, any other suitable sweatband, any other suitable wearable, or a combination thereof. While the patient is using the treatment device 70, the one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the patient.
- the server 30 may generate treatment information using the treatment data.
- the treatment information may include a formatted summary of the performance of the treatment plan by the user while using the treatment device, such that the treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
- the patient profile display 130 may include and/or display the treatment information.
- the server 30 may be configured to provide, at the overview display 120, the treatment information.
- the server 30 may store the treatment information for access by the overview display 120 and/or communicate the treatment information to the overview display 120.
- the server 30 may provide the treatment information to the patient profile display 130 or other suitable section, portion, or component of the overview display 120 or to any other suitable display or interface.
- the healthcare provider assisting the patient while using the treatment device 70 may review the treatment information and determine whether to modify the treatment plan and/or one or more characteristics of the treatment device 70. For example, the healthcare provider may review the treatment information and compare the treatment information to the treatment plan being performed by the patient.
- the healthcare provider may compare one or more parts or portions of expected information pertaining to the patient’s ability to perform the treatment plan with one or more corresponding parts or portions of the measurement information (e.g., indicated by the treatment information) pertaining to the patient while the patient uses the treatment device 70 to perform the treatment plan.
- the expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof.
- the healthcare provider may determine that the treatment plan is having the desired effect if one or more parts or portions of the measurement information are within an acceptable range of one or more corresponding parts or portions of the expected information. Conversely, the healthcare provider may determine that the treatment plan is not having the desired effect if one or more parts or portions of the measurement information are outside of the acceptable range of one or more corresponding parts or portions of the expected information.
- the healthcare provider may compare the expected respective characteristics of the treatment device 70 with corresponding characteristics of the treatment device 70 indicated by the treatment information. For example, the healthcare provider may compare an expected resistance setting of the treatment device 70 with an actual resistance setting of the treatment device 70 indicated by the treatment information.
- the healthcare provider may determine that the patient is performing the treatment plan properly if the actual characteristics of the treatment device 70 indicated by the treatment information are within a range of the expected characteristics of the treatment device 70. Conversely, the healthcare provider may determine that the patient is not performing the treatment plan properly if the actual characteristics of the treatment device 70 indicated by the treatment information are outside the range of the expected characteristics of the treatment device 70.
- the healthcare provider may determine not to modify the treatment plan or the one or more characteristics of the treatment device 70. Conversely, if the healthcare provider determines that the treatment information indicates that the patient is not performing the treatment plan properly and/or that the treatment plan is not having the desired effect, the healthcare provider may determine to modify the treatment plan and/or the one or more characteristics of the treatment device 70 while the user uses the treatment device 70 to perform the treatment plan.
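- A minimal sketch of the comparison described above is shown below: each measured value is checked against an acceptable range around its expected counterpart, and any out-of-range metric signals that the plan may need modification. The expected values and tolerances are illustrative assumptions; in practice they would come from the expected information associated with the treatment plan.

```python
# Hypothetical sketch of comparing measurement information with expected information.
# Expected values and tolerances below are illustrative assumptions.
EXPECTED = {"heart_rate_bpm": 110, "blood_pressure_sys": 130, "respiration_rate": 18}
TOLERANCE = {"heart_rate_bpm": 15, "blood_pressure_sys": 20, "respiration_rate": 4}


def plan_needs_modification(measured):
    """Return the metrics outside their acceptable range; an empty dict means on track."""
    out_of_range = {}
    for metric, expected_value in EXPECTED.items():
        value = measured.get(metric)
        if value is not None and abs(value - expected_value) > TOLERANCE[metric]:
            out_of_range[metric] = value
    return out_of_range


if __name__ == "__main__":
    deviations = plan_needs_modification(
        {"heart_rate_bpm": 132, "blood_pressure_sys": 135, "respiration_rate": 19})
    print("Modify plan:" if deviations else "No change needed", deviations)
```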
- the server 30 may receive subsequent treatment data pertaining to the patient. For example, after the healthcare provider provides input modifying the treatment plan and/or controlling the one or more characteristics of the treatment device 70, the patient may continue to perform the modified treatment plan using the treatment device 70.
- the subsequent treatment data may correspond to treatment data generated while the patient uses the treatment device 70 to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the patient continues to perform the treatment plan using the treatment device 70, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or control the one or more characteristics of the treatment device 70.
- the server 30 may further modify the treatment plan and/or control the one or more characteristics of the treatment device 70 based on subsequent treatment plan input received from the overview display 120.
- the subsequent treatment plan input may correspond to input provided by the healthcare provider, at the overview display 120, in response to receiving and/or reviewing subsequent treatment information corresponding to the subsequent treatment data. It should be understood that the server 30 may continuously and/or periodically provide treatment information to the patient profile display 130 and/or other sections, portions, or components of the overview display 120 based on continuously and/or periodically received treatment data.
- the healthcare provider may receive and/or review treatment information continuously or periodically while the user uses the treatment device to perform the treatment plan.
- the healthcare provider may determine whether to modify the treatment plan and/or control the one or more characteristics of the treatment device based on one or more trends indicated by the continuously and/or periodically received treatment information.
- the one or more trends may indicate an increase in heart rate or changes in other applicable trends indicating that the user is not performing the treatment plan properly and/or performance of the treatment plan by the user is not having the desired effect.
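- One simple way to detect such a trend is sketched below, comparing the average of the most recent readings against the average of the readings immediately before them; the window size and threshold are illustrative assumptions.

```python
# Hypothetical sketch of trend detection over periodically received treatment information.
from statistics import mean


def rising_trend(readings, window=5, threshold=10.0):
    """True if the latest window averages `threshold` above the preceding window."""
    if len(readings) < 2 * window:
        return False  # not enough history yet
    recent = readings[-window:]
    previous = readings[-2 * window:-window]
    return mean(recent) - mean(previous) > threshold


if __name__ == "__main__":
    heart_rates = [98, 100, 101, 99, 102, 115, 118, 121, 119, 123]
    print("Heart-rate trend rising:", rising_trend(heart_rates))
```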
- FIG. 9 is a flow diagram generally illustrating a method 900 for monitoring performance of a treatment plan by a user using a treatment device and for selectively modifying the treatment plan and one or more characteristics of the treatment device according to the present disclosure.
- the method 900 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both.
- the method 900 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIG. 1, such as server 30 executing the artificial intelligence engine 11).
- the method 900 may be performed by a single processing thread.
- the method 900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the methods.
- the method 900 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 900 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 900 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 900 could alternatively be represented as a series of interrelated states via a state diagram or events.
- the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 70, to perform a treatment plan.
- the treatment data may include characteristics of the user, measurement information pertaining to the user while the user uses the treatment device 70, characteristics of the treatment device 70, the treatment plan, other suitable data, or a combination thereof.
- the processing device may generate treatment information using the treatment data.
- the treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device 70.
- the treatment information may be formatted, such that the treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
- the processing device may be configured to provide (e.g., store for access, make available, make accessible, transmit, and the like), at the computing device of the healthcare provider, the treatment information.
- the processing device may be configured to provide the treatment information at an interface of the computing device of the healthcare provider.
- the processing device may store the treatment information for access by the computing device of the healthcare provider and/or communicate (e.g., or transmit) the treatment information to the computing device of the healthcare provider for display at the patient profile display 130 of the overview display 120.
- the overview display 120 may be configured to receive input, such as treatment plan input, indicating one or more modifications to the treatment plan and/or one or more characteristics of the treatment device 70.
- the healthcare provider may interact with the various controls, input fields, and other aspects of the overview display 120 to provide the treatment plan input.
- the processing device may modify the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan. For example, the processing device may modify various features and characteristics of the treatment plan based on the at least one modification indicated by the treatment plan input.
- the processing device may selectively control the treatment device 70 using the modified treatment plan. For example, the processing device may modify one or more characteristics of the treatment device 70 based on modifications to the treatment plan. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the treatment plan input.
- the treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. The processing device may modify the one or more characteristics of the treatment device 70 based on the at least one modification indicated by the treatment plan input.
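- The method-900 flow can be summarized in a short sketch: receive treatment data, generate the formatted treatment information, surface it to the healthcare provider, and apply whatever modifications the treatment plan input indicates to the plan and to the device characteristics. All function and field names below are illustrative assumptions.

```python
# Hypothetical end-to-end sketch of the method-900 flow. Names are illustrative only.
def generate_treatment_information(treatment_data):
    """Format the raw treatment data into a summary for the healthcare provider."""
    return {"summary": (f"ROM {treatment_data['range_of_motion_deg']} deg, "
                        f"HR {treatment_data['heart_rate_bpm']} bpm")}


def apply_treatment_plan_input(treatment_plan, device_settings, plan_input):
    """Apply any modifications indicated by the treatment plan input."""
    treatment_plan.update(plan_input.get("plan_changes", {}))
    device_settings.update(plan_input.get("device_changes", {}))
    return treatment_plan, device_settings


if __name__ == "__main__":
    plan = {"minutes_per_day": 30}
    device = {"resistance_level": 4}
    data = {"range_of_motion_deg": 72, "heart_rate_bpm": 128}
    info = generate_treatment_information(data)       # generate treatment information
    print("Provider sees:", info["summary"])          # provide at the provider's interface
    provider_input = {"plan_changes": {"minutes_per_day": 20},
                      "device_changes": {"resistance_level": 2}}
    plan, device = apply_treatment_plan_input(plan, device, provider_input)
    print("Modified plan:", plan, "device settings:", device)
```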
- FIG. 10 is a flow diagram generally illustrating an alternative method 1000 for monitoring performance of a treatment plan by a user using a treatment device and for selectively modifying the treatment plan and one or more characteristics of the treatment device according to the present disclosure.
- Method 1000 includes operations performed by processors of a computing device (e.g., any component of FIG. 1, such as server 30 executing the artificial intelligence engine 11).
- one or more operations of the method 1000 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 1000 may be performed in the same or a similar manner as described above in regard to method 900.
- the operations of the method 1000 may be performed in some combination with any of the operations of any of the methods described herein.
- the processing device may receive, during a telemedicine session, first treatment data pertaining to a user that uses a treatment device, such as the treatment device 70, to perform the treatment plan.
- the first treatment data includes, at least, measurement information pertaining to the user while the user uses the treatment device 70 to perform the treatment plan.
- the first treatment data may correspond to sensor data, such as sensor data 136, from one or more of the external sensors, such as external sensors 82, 84, 86, and/or from one or more internal sensors, such as internal sensors 76, of the treatment device 70.
- the first treatment data may include sensor data from one or more sensors associated with one or more corresponding wearable devices worn by the user while using the treatment device 70.
- the one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, a head sweatband, a wrist sweatband, any other suitable sweatband, and other suitable wearable device, or a combination thereof.
- the one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the user while the user is using the treatment device 70.
- the processing device may generate first treatment information using the first treatment data.
- the first treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device 70.
- the first treatment information may be formatted, such that the first treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
- the processing device may be configured to write to an associated memory, for access at the computing device of the healthcare provider, and/or provide, at the computing device of the healthcare provider, the first treatment information.
- the processing device may be configured to provide the first treatment information at an interface of the computing device of the healthcare provider.
- the processing device may be configured to provide the first treatment information at the patient profile display 130 of the overview display 120.
- the overview display 120 may be configured to receive input, such as treatment plan input, indicating one or more modifications to the treatment plan and/or one or more characteristics of the treatment device 70.
- the healthcare provider may interact with the various controls, input fields, and other aspects of the overview display 120 to provide the treatment plan input.
- the processing device may receive first treatment plan input responsive to the first treatment information.
- the first treatment plan input may indicate at least one modification to the treatment plan.
- the first treatment plan input may be provided by the healthcare provider, as described.
- the artificial intelligence engine 11 may generate the first treatment plan input.
- the processing device may modify the treatment plan in response to receiving the first treatment plan input including at least one modification to the treatment plan. For example, the processing device may modify various features and characteristics of the treatment plan based on the at least one modification indicated by the first treatment plan input.
- the processing device may selectively control the treatment device 70 using the modified treatment plan. For example, the processing device may modify one or more characteristics of the treatment device 70 based on modifications to the treatment plan. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the first treatment plan input.
- the first treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. The processing device may modify the one or more characteristics of the treatment device 70 based on the at least one modification indicated by the first treatment plan input.
- the processing device may receive second treatment plan input responsive to second treatment information generated using second treatment data.
- the processing device may receive second treatment data pertaining to the user while the user uses the treatment device 70.
- the second treatment data may include treatment data received by the processing device after the first treatment data.
- the second treatment data may pertain to the user while the user uses the treatment device 70 to perform the modified treatment plan.
- the second treatment data may pertain to the user while the user uses the treatment device 70 to perform the treatment plan (e.g., in cases where the healthcare provider does not modify the treatment plan, as described).
- the processing device may generate the second treatment information based on the second treatment data.
- the processing device may receive the second treatment plan input indicating at least one modification to the treatment plan.
- the processing device may be configured to provide the second treatment information to the patient profile display 130 and/or any other suitable section, portion, or component of the overview display 120 or to any other suitable display or interface.
- the healthcare provider and/or the artificial intelligence engine 11 may provide the second treatment plan input responsive to the second treatment information.
- the processing device may modify the treatment plan. For example, the processing device may further modify (e.g., in cases where the processing device has already modified the treatment plan) and/or modify (e.g., in cases where the processing device has not previously modified the treatment plan) various features and characteristics of the treatment plan based on the at least one modification indicated by the second treatment plan input.
- the processing device may selectively control the treatment device 70. For example, based on modifications to the treatment plan, the processing device may modify one or more characteristics of the treatment device 70. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the second treatment plan input.
- the second treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70.
- the processing device may modify the one or more characteristics of the treatment device 70 based on the at least one modification indicated by the second treatment plan input.
- FIG. 11 is a flow diagram generally illustrating an alternative method 1100 for monitoring performance of a treatment plan by a user using a treatment device and for selectively modifying the treatment plan and one or more characteristics of the treatment device according to the present disclosure.
- Method 1100 includes operations performed by processors of a computing device (e.g., any component of FIG. 1, such as server 30 executing the artificial intelligence engine 11).
- one or more operations of the method 1100 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 1100 may be performed in the same or a similar manner as described above in regard to method 900 and/or method 1000.
- the operations of the method 1100 may be performed in some combination with any of the operations of any of the methods described herein.
- the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 70, to perform the treatment plan.
- the treatment data may include any of the data described herein.
- the treatment data may correspond to sensor data, such as sensor data 136, from one or more of the external sensors, such as external sensors 82, 84, 86, and/or from one or more internal sensors, such as internal sensors 76, of the treatment device 70.
- at least some of the treatment data may include sensor data from one or more sensors associated with one or more corresponding wearable devices worn by the user while using the treatment device 70.
- the one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, a head sweatband, a wrist sweatband, any other suitable sweatband, any other suitable wearable device, or a combination thereof.
- the one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the user while the user is using the treatment device 70.
- the processing device may generate treatment information using the treatment data.
- the treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device 70.
- the treatment information may be formatted, such that the treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
- the processing device may be configured to provide, to at least one of the computing device of the healthcare provider and a machine learning model executed by the artificial intelligence engine 11, the treatment information.
- the processing device may receive treatment plan input responsive to the treatment information.
- the treatment plan input may indicate at least one modification to the treatment plan.
- the treatment plan input may be provided by the healthcare provider, as described.
- the artificial intelligence engine 11 executing the machine learning model may generate the treatment plan input.
- the processing device determines whether the treatment plan input indicates at least one modification to the treatment plan. If the processing device determines that the treatment plan input does not indicate at least one modification to the treatment plan, the processing device returns to 1102 and continues receiving treatment data pertaining to the user while the user uses the treatment device 70 to perform the treatment plan. If the processing device determines that the treatment plan input indicates at least one modification to the treatment plan, the processing device continues at 1112.
- the processing device may modify the treatment plan. For example, using the at least one modification to the treatment plan indicated by the treatment plan input, the processing device may modify the treatment plan. Based on the at least one modification indicated by the treatment plan input, the processing device may modify various features and characteristics of the treatment plan.
- the processing device may selectively control the treatment device 70. For example, based on the at least one modification to the treatment plan, the processing device may modify one or more characteristics of the treatment device 70. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the treatment plan input.
- the treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. Based on the at least one modification indicated by the treatment plan input, the processing device may modify the one or more characteristics of the treatment device 70. The processing device may return to 1102 and continue receiving treatment data pertaining to the user while the user uses the treatment device 70 to perform the treatment plan.
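- The loop back to receiving treatment data when no modification is indicated, and the modify-and-control branch when one is, can be sketched as follows; the simulated data stream and the stand-in rule generating treatment plan input are illustrative assumptions.

```python
# Hypothetical sketch of the method-1100 loop: keep receiving treatment data and only
# modify the plan and device characteristics when treatment plan input indicates a change.
def monitoring_loop(data_stream, plan, device_settings, get_plan_input):
    for treatment_data in data_stream:
        plan_input = get_plan_input(treatment_data)
        if not plan_input:            # no modification indicated: continue monitoring
            continue
        plan.update(plan_input.get("plan_changes", {}))
        device_settings.update(plan_input.get("device_changes", {}))
    return plan, device_settings


def example_rule(treatment_data):
    """Illustrative stand-in for provider or AI-engine treatment plan input."""
    if treatment_data["heart_rate_bpm"] > 130:
        return {"device_changes": {"resistance_level": 1}}
    return None


if __name__ == "__main__":
    stream = [{"heart_rate_bpm": 105}, {"heart_rate_bpm": 140}, {"heart_rate_bpm": 112}]
    result = monitoring_loop(stream, {"minutes_per_day": 30}, {"resistance_level": 4}, example_rule)
    print(result)
```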
- FIG. 12 generally illustrates an example computer system 1200 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure.
- computer system 1200 may include a computing device and correspond to the assistant interface 94, reporting interface 92, supervisory interface 90, clinician interface 20, server 30 (including the AI engine 11), patient interface 50, ambulatory sensor 82, goniometer 84, treatment device 70, pressure sensor 86, or any suitable component of FIG. 1.
- the computer system 1200 may be capable of executing instructions implementing the one or more machine learning models 13 of the artificial intelligence engine 11 of FIG. 1.
- the computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network.
- the computer system may operate in the capacity of a server in a client-server network environment.
- the computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set top box (STB), a personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device.
- the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
- the computer system 1200 includes a processing device 1202, a main memory 1204 (e.g., read-only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 1206 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 1208, which communicate with each other via a bus 1110.
- Processing device 1202 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets.
- the processing device 1202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
- the processing device 1202 is configured to execute instructions for performing any of the operations and steps discussed herein.
- the computer system 1200 may further include a network interface device 1212.
- the computer system 1200 also may include a video display 1214 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 1216 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 1218 (e.g., a speaker).
- the video display 1214 and the input device(s) 1216 may be combined into a single component or device (e.g., an LCD touch screen).
- the data storage device 1208 may include a computer-readable medium 1220 on which the instructions 1222 embodying any one or more of the methods, operations, or functions described herein are stored.
- the instructions 1222 may also reside, completely or at least partially, within the main memory 1204 and/or within the processing device 1202 during execution thereof by the computer system 1200. As such, the main memory 1204 and the processing device 1202 also constitute computer-readable media.
- the instructions 1222 may further be transmitted or received over a network via the network interface device 1212.
- While the computer-readable storage medium 1220 is generally illustrated in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
- the term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
- Determining an optimal treatment plan for a patient having certain characteristics may be a technically challenging problem.
- a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process.
- some of the multitude of information considered may include a type of injury of the patient, types of available medical procedures to perform, treatment regimens, medication regimens, and the characteristics of the patient.
- the characteristics of the patient may be vast, and may include medications of the patient, previous injuries of the patient, previous medical procedures performed on the patient, measurements (e.g., body fat, weight, etc.) of the patient, allergies of the patient, medical conditions of the patient, historical information of the patient, vital signs (e.g., temperature, blood pressure, heart rate) of the patient, symptoms of the patient, familial medical information of the patient, and the like.
- clinical information pertaining to results of treatment plans performed using a treatment apparatus on other people.
- the clinical information may include clinical studies, clinical trials, evidence-based guidelines, journal articles, meta-analyses, and the like.
- the clinical information may be written by people having certain professional degrees (e.g., medical doctor, osteopathic doctor, physical therapist, etc.), certifications, etc.
- the clinical information may be retrieved from any suitable data source.
- the clinical information may describe people seeking treatment for a particular ailment (e.g., injury, disease, any applicable medical condition, etc.).
- the clinical information may describe that certain results are obtained when the people perform or have performed on them particular treatment plans (e.g., medical procedures, treatment protocols using treatment apparatuses, medication regimens, diet regimens, etc.).
- the clinical information may also include the particular characteristics of the people described. Direct or indirect reference may be made to values of the characteristics therein. It may be desirable to compare the characteristics of the patient with the characteristics of the people in the clinical information to determine what an optimal treatment plan for the patient may be such that the patient can obtain a desired result. Processing this historical information may be computationally taxing, inefficient, and/or infeasible using conventional techniques.
- an artificial intelligence engine may be trained to recommend the optimal treatment plan based on characteristics of the patient and the clinical information.
- the artificial intelligence engine may be trained to match a pattern between the characteristics of the patient and the people in various clinical information. Based on the pattern, the artificial intelligence engine may generate a treatment plan for the patient, where such treatment plan produced a desired result in the clinical information for a similarly matched person or similarly matched people.
- the treatment plan generated may be “optimal” based on the desired result (e.g., speed, efficacy, both speed and efficacy, life expectancy, etc.).
- the artificial intelligence engine may be trained to output several optimal or optimized treatment plans. For example, one result may include recovering to a threshold level (e.g., 75% range of motion) in a fastest amount of time, while another result may include fully recovering (e.g., 100% range of motion) regardless of the amount of time.
- the clinical information may indicate a first treatment plan provides the first result for people with characteristics similar to the patient’s, and a second treatment plan provides the second result for people with characteristics similar to the patient’s.
- the artificial intelligence engine may also be trained to output treatment plans that are not optimal (referred to as “ruled-out treatment plans”) for the patient. For example, if a patient has diabetes, a particular medication may not be approved or suitable for the patient and that medication may be flagged in the ruled-out treatment plan for the patient.
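- A minimal sketch of the ruled-out-plan idea is shown below: any candidate plan whose medication is contraindicated for one of the patient's conditions is separated from the recommended plans. The contraindication table and plan names are illustrative assumptions, not the trained artificial intelligence engine.

```python
# Hypothetical sketch of flagging ruled-out treatment plans based on contraindications.
CONTRAINDICATIONS = {"medication M": {"Type 2 Diabetes"}}  # illustrative table


def split_plans(candidate_plans, patient_conditions):
    """Separate recommended plans from plans ruled out by a medication conflict."""
    recommended, ruled_out = [], []
    for plan in candidate_plans:
        conflicts = CONTRAINDICATIONS.get(plan.get("medication"), set()) & set(patient_conditions)
        (ruled_out if conflicts else recommended).append(plan)
    return recommended, ruled_out


if __name__ == "__main__":
    plans = [{"name": "Plan A", "medication": "medication Z"},
             {"name": "Plan C", "medication": "medication M"}]
    ok, excluded = split_plans(plans, ["Type 2 Diabetes"])
    print("Recommended:", [p["name"] for p in ok], "| Ruled out:", [p["name"] for p in excluded])
```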
- the received clinical information and/or patient information may be translated into a medical description language.
- the medical description language may refer to an encoding configured to be efficiently processed by the artificial intelligence engine. For example, a clinical trial may be received and parsed, optionally with the addition of an attribute grammar; and then keywords pertaining to target information may be searched for. The values of the target information may be identified.
- a canonical format defined by the medical description language may be defined and/or generated, where the canonical format includes tags identifying the values of the target information and, optionally, tags implementing an attribute grammar for the medical description language.
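- As a rough illustration of translating free-text clinical information into a tagged canonical form, the sketch below searches for a couple of target keywords and emits tag/value pairs; the keyword list, tag names, and regular-expression approach are illustrative assumptions, not the medical description language itself.

```python
# Hypothetical sketch: extract target values from clinical text into a tagged record.
import re

TARGET_KEYWORDS = {
    "range of motion": "rom_percent",   # tag names are illustrative assumptions
    "recovery time": "recovery_weeks",
}


def to_canonical(clinical_text):
    """Return a dictionary of tag/value pairs found in the text."""
    record = {}
    for phrase, tag in TARGET_KEYWORDS.items():
        match = re.search(rf"{phrase}\D*(\d+)", clinical_text, flags=re.IGNORECASE)
        if match:
            record[tag] = int(match.group(1))
    return record


if __name__ == "__main__":
    text = "Patients regained range of motion of 85% with a recovery time of 6 weeks."
    print(to_canonical(text))  # {'rom_percent': 85, 'recovery_weeks': 6}
```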
- the medical description language may be extensible and include any property of an object-oriented or artificial intelligence programming language.
- the medical description language may define other methods or procedures.
- the medical description language may implement the concept of "objects", which can contain data, in the form of fields (often known as attributes or properties), and code, in the form of procedures (often known as methods).
- the medical description language may encapsulate data and functions that manipulate the data to protect them from interference and misuse.
- the medical description language may also implement data hiding or obscuring, which prevents certain aspects of the data or functions from being accessible to another component.
- the medical description language may implement inheritance, which arranges components as “is a type of” relationships, where a first component may be a type of a second component and the first component inherits the functions and data of the second component.
- the medical description language may also implement polymorphism, which is the provision of a single interface to components of different types.
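- If the object-oriented properties listed above were modeled with ordinary classes, they might look roughly like the sketch below; the class and field names are illustrative assumptions rather than the medical description language itself.

```python
# Hypothetical sketch of objects, encapsulation, inheritance, and polymorphism
# as they might appear if the medical description language were modeled in classes.
class ClinicalRecord:
    def __init__(self, source):
        self._source = source          # encapsulated: accessed only through methods

    def summary(self):                 # single interface shared by all record types
        return f"record from {self._source}"


class ClinicalTrial(ClinicalRecord):   # a ClinicalTrial "is a type of" ClinicalRecord
    def __init__(self, source, participants):
        super().__init__(source)
        self.participants = participants

    def summary(self):                 # polymorphism: same interface, specialized behavior
        return f"trial from {self._source} with {self.participants} participants"


if __name__ == "__main__":
    for record in (ClinicalRecord("journal article"), ClinicalTrial("registry", 120)):
        print(record.summary())
```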
- the clinical information may be translated to the medical description language prior to the artificial intelligence engine determining the optimal treatment plans and/or ruled-out treatment plans.
- the artificial intelligence engine may be trained by using the medical description language representing the clinical information, such that the artificial intelligence engine is able to more efficiently determine the optimal treatment plans instead of using initial data formats in which the clinical information is received. Further, the artificial intelligence engine may continuously or continually receive the clinical information and include the clinical information in training data to update the artificial intelligence engine.
- the optimal treatment plans and/or ruled-out treatment plans may be presented to a medical professional.
- the medical professional may select a particular optimal treatment plan for the patient to cause that treatment plan to be transmitted to the patient.
- the artificial intelligence engine may receive and/or operate distally from the source of the clinical information and/or distally from the patient.
- the recommended treatment plans and/or ruled-out treatment plans may be presented during a telemedicine or telehealth session on a user interface of a computing device of a medical professional simultaneously with a video of the patient in real time.
- the video may also be accompanied by audio, text and other multimedia information. Real-time may refer to less than 2 seconds.
- Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the medical professional may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface.
- the enhanced user interface may improve the medical professional’s experience using the computing device and may encourage the medical professional to reuse the user interface.
- Such a technique may also reduce computing resources (e.g., processing, memory, network) because the medical professional does not have to switch to another user interface screen and enter a query for a treatment plan to recommend based on the characteristics of the patient.
- the artificial intelligence engine provides, dynamically on the fly, the optimal treatment plans and ruled-out treatment plans.
- the treatment apparatus may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient.
- the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user.
- Such adaptive nature may improve the results of recovery for a patient.
- a method comprising: receiving treatment data pertaining to a user that uses a treatment device to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and at least one aspect of the treatment plan; generating treatment information using the treatment data; writing to an associated memory, for access by a computing device of a healthcare provider, the treatment information; communicating with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input; and modifying the at least one aspect of the treatment plan in response to receiving treatment plan input including at least one modification to the at least one aspect of the treatment plan.
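As an informal illustration of the receive/generate/write/communicate/modify sequence recited in this clause, the sketch below uses assumed data structures and function names; it is not the claimed implementation.

```python
# Illustrative-only sketch of the receive/generate/write/communicate/modify sequence;
# the data structures and function names are assumptions, not the claimed implementation.
from dataclasses import dataclass, field


@dataclass
class TreatmentPlan:
    aspects: dict = field(default_factory=dict)  # e.g., {"session_minutes": 30, "resistance": 4}


@dataclass
class TreatmentData:
    user_characteristics: dict
    measurements: dict            # e.g., heartrate, blood pressure, respiration rate
    device_characteristics: dict
    plan: TreatmentPlan


def generate_treatment_information(data: TreatmentData) -> dict:
    # Summarize the raw treatment data into information a healthcare provider can review.
    return {"summary": dict(data.measurements), "current_plan": dict(data.plan.aspects)}


def handle_session(data: TreatmentData, shared_memory: dict, provider_input: dict) -> TreatmentPlan:
    shared_memory["treatment_information"] = generate_treatment_information(data)  # written for provider access
    # provider_input arrives via the provider-facing interface and carries modifications.
    for aspect, new_value in provider_input.get("modifications", {}).items():
        data.plan.aspects[aspect] = new_value
    return data.plan


plan = TreatmentPlan({"session_minutes": 30})
data = TreatmentData({"age": 60}, {"heartrate": 92}, {"model": "stationary bike"}, plan)
print(handle_session(data, {}, {"modifications": {"session_minutes": 20}}).aspects)
```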
- Clause 3 The method of any clause herein, further comprising controlling, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device during a telemedicine session.
- the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, and a blood pressure of the user.
- Clause 7 The method of any clause herein, further comprising receiving subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan.
- Clause 8 The method of any clause herein, further comprising modifying the modified treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the modified at least one aspect of the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
- a tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to: receive treatment data pertaining to a user that uses a treatment device to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and at least one aspect of the treatment plan; generate treatment information using the treatment data; write to an associated memory, for access at a computing device of a healthcare provider, the treatment information; communicate with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input; and modify the at least one aspect of the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan.
- the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, and a blood pressure of the user.
- Clause 13 The computer-readable medium of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with the treatment device.
- Clause 14 The computer-readable medium of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with a wearable device worn by the user while using the treatment device.
- Clause 15 The computer-readable medium of any clause herein, wherein the processing device is further configured to receive subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan.
- Clause 16 The computer-readable medium of any clause herein, wherein the processing device is further configured to modify the modified at least one aspect of the treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
- a system comprising: a memory device storing instructions; a processing device communicatively coupled to the memory device, the processing device executes the instructions to: receive treatment data pertaining to a user that uses a treatment device to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and at least one aspect of the treatment plan; generate treatment information using the treatment data; write to an associated memory, for access at a computing device of a healthcare provider, the treatment information; communicate with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input; and modify the at least one aspect of the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan.
- Clause 18 The system of any clause herein, wherein the processing device is further configured to control, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device.
- the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, and a blood pressure of the user.
- Clause 21 The system of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with the treatment device.
- Clause 23 The system of any clause herein, wherein the processing device is further configured to receive subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan.
- Clause 24 The system of any clause herein, wherein the processing device is further configured to modify at least one of the modified at least one aspect and any other aspect of the treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
- FIG. 13 shows a block diagram of a computer-implemented system 2010, hereinafter called “the system,” for managing a treatment plan.
- Managing the treatment plan may include using an artificial intelligence engine to recommend optimal treatment plans and/or provide ruled-out treatment plans that should not be recommended to a patient.
- a treatment plan may include one or more treatment protocols, and each treatment protocol includes one or more treatment sessions. Each treatment session comprises several session periods, with each session period including a particular activity for treating the body part of the patient.
- a treatment plan for post-operative rehabilitation after a knee surgery may include an initial treatment protocol with twice daily stretching sessions for the first 3 days after surgery and a more intensive treatment protocol with active exercise sessions performed 4 times per day starting 4 days after surgery.
- a treatment plan may also include information pertaining to a medical procedure to perform on the patient, a treatment protocol for the patient using a treatment apparatus, a diet regimen for the patient, a medication regimen for the patient, a sleep regimen for the patient, additional regimens, or some combination thereof.
- the system 2010 also includes a server 2030 configured to store and to provide data related to managing the treatment plan.
- the server 2030 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers.
- the server 2030 also includes a first communication interface 2032 configured to communicate with the clinician interface 2020 via a first network 2034. In some embodiments, the first network 2034 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the server 2030 includes a first processor 2036 and a first machine-readable storage memory 2038, which may be called a “memory” for short, holding first instructions 2040 for performing the various actions of the server 2030 for execution by the first processor 2036.
- the server 2030 is configured to store data regarding the treatment plan.
- the memory 2038 includes a system data store 2042 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients.
- the server 2030 is also configured to store data regarding performance by a patient in following a treatment plan.
- the memory 2038 includes a patient data store 2044 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient’s performance within the treatment plan.
- the characteristics of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may be used, together with correlations and other statistical or probabilistic measures, to partition the treatment plans into different patient cohort-equivalent databases in the patient data store 2044.
- the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patients, and a first result of the treatment plan may be stored in a first patient database.
- the data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patients, and a second result of the treatment plan may be stored in a second patient database. Any combination of characteristics may be used to separate the cohorts of patients. In some embodiments, the different cohorts of patients may be stored in different partitions or volumes of the same database.
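A minimal sketch of the cohort partitioning idea follows; the record fields and the in-memory dictionary standing in for the patient data store 2044 are illustrative assumptions only.

```python
# Illustrative sketch: group records into cohort-equivalent partitions by shared characteristics.
# The record fields and in-memory dictionary are assumptions standing in for the patient data store 2044.
from collections import defaultdict

records = [
    {"injury": "ACL tear", "procedure": "knee surgery", "condition": "type 2 diabetes", "result": "full recovery"},
    {"injury": "ACL tear", "procedure": "knee surgery", "condition": "none", "result": "partial recovery"},
    {"injury": "rotator cuff tear", "procedure": "shoulder surgery", "condition": "none", "result": "full recovery"},
]


def cohort_key(record, keys=("injury", "procedure", "condition")):
    # Any combination of characteristics may be used to separate the cohorts.
    return tuple(record[k] for k in keys)


cohort_databases = defaultdict(list)  # stands in for separate partitions or volumes
for record in records:
    cohort_databases[cohort_key(record)].append(record)

for key, cohort in cohort_databases.items():
    print(key, "->", len(cohort), "record(s)")
```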
- This characteristic data, treatment plan data, and results data may be obtained from clinical information that describes the characteristics of people who performed certain treatment plans and the results of those treatment plans.
- the characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 2044.
- the characteristics of the people may include medications prescribed to the people, injuries of the people, medical procedures performed on the people, measurements of the people, allergies of the people, medical conditions of the people, historical information of the people, vital signs of the people, symptoms of the people, familial medical information of the people, other information of the people, or some combination thereof.
- characteristics about a current patient being treated may be stored in an appropriate patient cohort-equivalent database.
- the characteristics of the patient may include medications of the patient, injuries of the patient, medical procedures performed on the patient, measurements of the patient, allergies of the patient, medical conditions of the patient, historical information of the patient, vital signs of the patient, symptoms of the patient, familial medical information of the patient, other information of the patient, or some combination thereof.
- the server 2030 may execute an artificial intelligence (AI) engine 2011 that uses one or more machine learning models 2013 to perform at least one of the embodiments disclosed herein.
- the server 2030 may include a training engine 209 capable of generating the one or more machine learning models 2013.
- the machine learning models 2013 may be trained to generate and recommend optimal treatment plans using real-time and historical data correlations involving patient cohort-equivalents, among other things.
- the one or more machine learning models 2013 may be generated by the training engine 209 and may be implemented in computer instructions executable by one or more processing devices of the training engine 209 and/or the server 2030.
- the training engine 209 may train the one or more machine learning models 2013.
- the one or more machine learning models 2013 may be used by the artificial intelligence engine 2011.
- the training engine 209 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other desired computing device, or any combination of the above.
- the training engine 209 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
- the training engine 209 may use a training data set of a corpus of keywords representing target information to identify clinical information.
- the training data set may also include a corpus of clinical information (e.g., clinical trials, meta-analyses, evidence-based guidelines, journal articles, etc.) having a first data format.
- the clinical information may include characteristics of people, treatment plans followed by the people, and results of the treatment plans, among other things.
- the training data set may also include medical description language examples that include tags for target information, telemedical information and values embedded with the tags.
- the one or more machine learning models may be trained to translate the clinical information from the first data format to the medical description language having a canonical (e.g., tag-value pair and/or attribute grammar) format.
- the training may be performed by identifying the keywords of the target information, identifying values for the keywords, and generating the canonical value including tags for the target information and the values for the target information.
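The trained translation step might behave, at a very rough level, like the rule-based stand-in below; the keyword patterns and tag syntax are assumptions chosen only to show the keyword-to-value-to-tag flow, not the actual trained models.

```python
# Rule-based stand-in for the trained translation step: free-text clinical information ->
# canonical tag-value pairs. Keyword patterns and tag syntax are illustrative assumptions.
import re

KEYWORD_PATTERNS = {
    "blood_pressure": r"blood pressure of (\d+)/(\d+)",
    "heartrate": r"heart ?rate of (\d+)",
    "range_of_motion": r"range of motion (?:of )?(\d+) degrees",
}


def to_canonical(text: str) -> list:
    tags = []
    for name, pattern in KEYWORD_PATTERNS.items():
        match = re.search(pattern, text, flags=re.IGNORECASE)
        if match:
            values = ",".join(match.groups())
            tags.append(f"<{name} value='{values}'/>")
    return tags


note = ("Patient presented with a blood pressure of 128/84, a heart rate of 88, "
        "and a range of motion of 95 degrees after knee surgery.")
print(to_canonical(note))
```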
- the one or more machine learning models 2013 may also be trained to translate characteristics of patients received in real-time (e.g., from an electronic medical records (EMR) system) to the medical description language to store in appropriate patient cohort-equivalent databases.
- the one or more machine learning models 2013 may be trained to match patterns of characteristics of a patient described by the medical description language with characteristics of other people described by the medical description language that represents the clinical information.
- the medical description language representing the clinical information may be stored in the various patient cohort-equivalent databases of the patient data store 2044. Accordingly, in some embodiments, the one or more machine learning models 2013 may access the patient cohort-equivalent databases when being trained or when recommending optimal treatment plans for a patient.
- Computing resources, efficiency of processing, accuracy and error minimization may be enhanced using the medical description language in the canonical format, as opposed to full bodies of text and/or EMR records. In particular, accuracy may be improved and errors may be minimized through the use of a formal medical description language that may be parsed to have one meaning, whereas informal descriptions may yield multiple meanings that are potentially semantically overloaded and unresolvable.
- Different machine learning models 2013 may be trained to recommend different optimal treatment plans for different desired results. For example, one machine learning model may be trained to recommend optimal treatment plans for most effective recovery, while another machine learning model may be trained to recommend optimal treatment plans based on speed of recovery.
- the one or more machine learning models 2013 may refer to model artifacts created by the training engine 209.
- the training engine 209 may find patterns in the training data wherein such patterns map the training input to the target output, and generate the machine learning models 2013 that capture these patterns.
- the artificial intelligence engine 2011, the database 2033, and/or the training engine 209 may reside on another component (e.g., assistant interface 2094, clinician interface 2020, etc.) depicted in FIG. 13.
- the one or more machine learning models 2013 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 2013 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations.
- deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself).
- the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
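For concreteness, and assuming a scikit-learn-style workflow that the disclosure does not require, a single-level model such as an SVM could be trained on feature vectors derived from the canonical representation as sketched below; the feature encoding and plan labels are hypothetical.

```python
# Illustrative sketch assuming a scikit-learn-style workflow (not required by the disclosure):
# a single-level model (SVM) trained on hypothetical feature vectors derived from the
# canonical medical description language.
from sklearn.svm import SVC

# Each row: [age, days_since_surgery, knee_range_of_motion_deg, pain_level]
X = [
    [62, 10, 70, 4],
    [55, 21, 95, 2],
    [70, 7, 60, 6],
    [48, 30, 110, 1],
]
y = ["plan_A", "plan_B", "plan_A", "plan_B"]  # plans that produced the desired result for similar people

model = SVC(kernel="linear")
model.fit(X, y)
print(model.predict([[60, 14, 80, 3]]))  # suggested plan for a new patient-like feature vector
```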
- the system 2010 also includes a patient interface 2050 configured to communicate information to a patient and to receive feedback from the patient.
- the patient interface includes an input device 2052 and an output device 2054, which may be collectively called a patient user interface 2052, 2054.
- the input device 2052 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition.
- the output device 2054 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch.
- the output device 2054 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc.
- the output device 2054 may incorporate various different visual, audio, or other presentation technologies.
- the output device 2054 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions.
- the output device 2054 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient.
- the output device 2054 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the patient interface 2050 includes a second communication interface 2056, which may also be called a remote communication interface configured to communicate with the server 2030 and/or the clinician interface 2020 via a second network 2058.
- the second network 2058 may include a local area network (LAN), such as an Ethernet network.
- the second network 2058 may include the Internet, and communications between the patient interface 2050 and the server 2030 and/or the clinician interface 2020 may be secured via encryption, such as, for example, by using a virtual private network (VPN).
- the second network 2058 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the second network 2058 may be the same as and/or operationally coupled to the first network 2034.
- the patient interface 2050 includes a second processor 2060 and a second machine-readable storage memory 2062 holding second instructions 2064 for execution by the second processor 2060 for performing various actions of patient interface 2050.
- the second machine-readable storage memory 2062 also includes a local data store 2066 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient’s performance within a treatment plan.
- the patient interface 2050 also includes a local communication interface 2068 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 2050.
- the local communication interface 2068 may include wired and/or wireless communications.
- the local communication interface 2068 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the system 2010 also includes a treatment apparatus 2070 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan.
- the treatment apparatus 2070 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group.
- the body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder.
- the body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament.
- the treatment apparatus 2070 includes a controller 2072, which may include one or more processors, computer memory, and/or other components.
- the treatment apparatus 2070 also includes a fourth communication interface 2074 configured to communicate with the patient interface 2050 via the local communication interface 2068.
- the treatment apparatus 2070 also includes one or more internal sensors 2076 and an actuator 2078, such as a motor.
- the actuator 2078 may be used, for example, for moving the patient’s body part and/or for resisting forces by the patient.
- the internal sensors 2076 may measure one or more operating characteristics of the treatment apparatus 2070 such as, for example, a force, a position, a speed, and/or a velocity.
- the internal sensors 2076 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient.
- an internal sensor 2076 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment apparatus 2070, where such distance may correspond to a range of motion that the patient’s body part is able to achieve.
- the internal sensors 2076 may include a force sensor configured to measure a force applied by the patient.
- an internal sensor 2076 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment apparatus 2070.
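A small sketch of how raw internal-sensor samples might be reduced to the range-of-motion and force quantities described above; the sample values, units, and function names are illustrative assumptions.

```python
# Illustrative sketch: reduce raw internal-sensor samples to range-of-motion and peak-force values.
def range_of_motion_degrees(angle_samples_deg):
    """Range of motion achieved in a session: maximum minus minimum measured joint angle."""
    return max(angle_samples_deg) - min(angle_samples_deg)


def peak_force_newtons(force_samples_n):
    """Largest force the patient applied to the treatment apparatus during the session."""
    return max(force_samples_n)


angles = [12.0, 25.5, 48.0, 71.5, 70.0, 40.0]  # position-sensor readings (degrees)
forces = [40.2, 55.8, 61.0, 58.3]              # force-sensor readings (newtons)

print(f"Range of motion: {range_of_motion_degrees(angles):.1f} deg")
print(f"Peak force: {peak_force_newtons(forces):.1f} N")
```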
- the system 2010 shown in FIG. 13 also includes an ambulation sensor 2082, which communicates with the server 2030 via the local communication interface 2068 of the patient interface 2050.
- the ambulation sensor 2082 may track and store a number of steps taken by the patient.
- the ambulation sensor 2082 may take the form of a wristband, wristwatch, or smart watch.
- the ambulation sensor 2082 may be integrated within a phone, such as a smartphone.
- the system 2010 shown in FIG. 13 also includes a goniometer 2084, which communicates with the server 2030 via the local communication interface 2068 of the patient interface 2050.
- the goniometer 2084 measures an angle of the patient’s body part.
- the goniometer 2084 may measure the angle of flex of a patient’s knee or elbow or shoulder.
- the system 2010 shown in FIG. 13 also includes a pressure sensor 2086, which communicates with the server 2030 via the local communication interface 2068 of the patient interface 2050.
- the pressure sensor 2086 measures an amount of pressure or weight applied by a body part of the patient.
- pressure sensor 2086 may measure an amount of force applied by a patient’s foot when pedaling a stationary bike.
- the system 2010 shown in FIG. 13 also includes a supervisory interface 2090 which may be similar or identical to the clinician interface 2020.
- the supervisory interface 2090 may have enhanced functionality beyond what is provided on the clinician interface 2020.
- the supervisory interface 2090 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
- the system 2010 shown in FIG. 13 also includes a reporting interface 2092 which may be similar or identical to the clinician interface 2020.
- the reporting interface 2092 may have less functionality than what is provided on the clinician interface 2020.
- the reporting interface 2092 may not have the ability to modify a treatment plan.
- Such a reporting interface 2092 may be used, for example, by a biller to determine the use of the system 2010 for billing purposes.
- the reporting interface 2092 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject.
- Such a reporting interface 2092 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
- the system 2010 includes an assistant interface 2094 for an assistant, such as a doctor, a nurse, a physical therapist, or a technician, to remotely communicate with the patient interface 2050 and/or the treatment apparatus 2070.
- Such remote communications may enable the assistant to provide assistance or guidance to a patient using the system 2010.
- the assistant interface 2094 is configured to communicate a telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b with the patient interface 2050 via a network connection such as, for example, via the first network 2034 and/or the second network 2058.
- the telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b comprises one of an audio signal 2096, an audiovisual signal 2097, an interface control signal 2098a for controlling a function of the patient interface 2050, an interface monitor signal 2098b for monitoring a status of the patient interface 2050, an apparatus control signal 2099a for changing an operating parameter of the treatment apparatus 2070, and/or an apparatus monitor signal 2099b for monitoring a status of the treatment apparatus 2070.
- each of the control signals 2098a, 2099a may be unidirectional, conveying commands from the assistant interface 2094 to the patient interface 2050.
- an acknowledgement message may be sent from the patient interface 2050 to the assistant interface 2094.
- each of the monitor signals 2098b, 2099b may be unidirectional, conveying status information from the patient interface 2050 to the assistant interface 2094.
- an acknowledgement message may be sent from the assistant interface 2094 to the patient interface 2050 in response to successfully receiving one of the monitor signals 2098b, 2099b.
- the patient interface 2050 may be configured as a pass-through for the apparatus control signals 2099a and the apparatus monitor signals 2099b between the treatment apparatus 2070 and one or more other devices, such as the assistant interface 2094 and/or the server 2030.
- the patient interface 2050 may be configured to transmit an apparatus control signal 2099a in response to an apparatus control signal 2099a within the telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b from the assistant interface 2094.
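To make the signal taxonomy and the pass-through role of the patient interface 2050 concrete, here is an illustrative sketch; the message classes, enum members, and routing function are assumptions made for explanation only.

```python
# Illustrative sketch: telemedicine signal types and the patient interface acting as a pass-through.
# Class names, enum members, and the routing function are assumptions for explanation only.
from dataclasses import dataclass
from enum import Enum, auto


class SignalType(Enum):
    AUDIO = auto()               # 2096
    AUDIOVISUAL = auto()         # 2097
    INTERFACE_CONTROL = auto()   # 2098a
    INTERFACE_MONITOR = auto()   # 2098b
    APPARATUS_CONTROL = auto()   # 2099a
    APPARATUS_MONITOR = auto()   # 2099b


@dataclass
class TelemedicineSignal:
    kind: SignalType
    payload: dict


def patient_interface_route(signal, treatment_apparatus, display):
    """Pass apparatus control signals through to the apparatus; handle interface signals locally."""
    if signal.kind is SignalType.APPARATUS_CONTROL:
        treatment_apparatus.apply(signal.payload)   # forwarded essentially unchanged
        return {"ack": True}                        # acknowledgement back toward the assistant interface
    if signal.kind is SignalType.INTERFACE_CONTROL:
        display.update_settings(signal.payload)
        return {"ack": True}
    return {"ack": False, "reason": "not handled in this sketch"}


class _StubApparatus:
    def apply(self, settings):
        print("apparatus settings:", settings)


class _StubDisplay:
    def update_settings(self, settings):
        print("display settings:", settings)


print(patient_interface_route(
    TelemedicineSignal(SignalType.APPARATUS_CONTROL, {"resistance": 5}),
    _StubApparatus(), _StubDisplay()))
```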
- the assistant interface 2094 may be presented on a shared physical device as the clinician interface 2020.
- the clinician interface 2020 may include one or more screens that implement the assistant interface 2094.
- the clinician interface 2020 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 2094.
- one or more portions of the telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 2054 of the patient interface 2050.
- a tutorial video may be streamed from the server 2030 and presented upon the patient interface 2050.
- Content from the prerecorded source may be requested by the patient via the patient interface 2050.
- via a control on the assistant interface 2094, the assistant may cause content from the prerecorded source to be played on the patient interface 2050.
- the assistant interface 2094 includes an assistant input device 2022 and an assistant display 2024, which may be collectively called an assistant user interface 2022, 2024.
- the assistant input device 2022 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example.
- the assistant input device 2022 may include one or more microphones.
- the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the assistant to speak to a patient via the patient interface 2050.
- assistant input device 2022 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the assistant by using the one or more microphones.
- the assistant input device 2022 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung.
- the assistant input device 2022 may include other hardware and/or software components.
- the assistant input device 2022 may include one or more general purpose devices and/or special-purpose devices.
- the assistant display 2024 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch.
- the assistant display 2024 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc.
- the assistant display 2024 may incorporate various different visual, audio, or other presentation technologies.
- the assistant display 2024 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions.
- the assistant display 2024 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the assistant.
- the assistant display 2024 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the system 2010 may provide computer translation of language from the assistant interface 2094 to the patient interface 2050 and/or vice-versa.
- the computer translation of language may include computer translation of spoken language and/or computer translation of text.
- the system 2010 may provide voice recognition and/or spoken pronunciation of text.
- the system 2010 may convert spoken words to printed text and/or the system 2010 may audibly speak language from printed text.
- the system 2010 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the assistant.
- the system 2010 may be configured to recognize and react to spoken requests or commands by the patient.
- the system 2010 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
- the server 2030 may generate aspects of the assistant display 2024 for presentation by the assistant interface 2094.
- the server 2030 may include a web server configured to generate the display screens for presentation upon the assistant display 2024.
- the artificial intelligence engine 2011 may generate recommended optimal treatment plans and/or excluded treatment plans for patients and generate the display screens including those recommended optimal treatment plans and/or ruled-out treatment plans for presentation on the assistant display 2024 of the assistant interface 2094.
- the assistant display 2024 may be configured to present a virtualized desktop hosted by the server 2030.
- the server 2030 may be configured to communicate with the assistant interface 2094 via the first network 2034.
- the first network 2034 may include a local area network (LAN), such as an Ethernet network.
- the first network 2034 may include the Internet, and communications between the server 2030 and the assistant interface 2094 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN).
- the server 2030 may be configured to communicate with the assistant interface 2094 via one or more networks independent of the first network 2034 and/or other communication means, such as a direct wired or wireless communication channel.
- the patient interface 2050 and the treatment apparatus 2070 may each operate from a patient location geographically separate from a location of the assistant interface 2094.
- the patient interface 2050 and the treatment apparatus 2070 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 2094 at a centralized location, such as a clinic or a call center.
- the assistant interface 2094 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians’ offices. In some embodiments, a plurality of assistant interfaces 2094 may be distributed geographically. In some embodiments, a person may work as an assistant remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 2094 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part-time and/or flexible work hours for an assistant.
- FIGS. 14-15 show an embodiment of a treatment apparatus 2070. More specifically, FIG. 14 shows a treatment apparatus 2070 in the form of a stationary cycling machine 2100, which may be called a stationary bike, for short.
- the stationary cycling machine 2100 includes a set of pedals 2102 each attached to a pedal arm 2104 for rotation about an axle 2106.
- the pedals 2102 are movable on the pedal arms 2104 in order to adjust a range of motion used by the patient in pedaling. For example, the pedals being located inwardly toward the axle 2106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 2106.
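The geometric intuition behind adjusting the pedals along the pedal arms can be sketched as follows; this is a deliberate simplification, and the actual range of motion depends on the machine geometry and the patient.

```python
# Simplified, illustrative geometry: moving a pedal outward along the pedal arm lengthens the
# circular path the foot must travel each revolution, demanding a larger range of motion.
import math


def pedal_path_circumference_cm(pedal_radius_cm):
    return 2 * math.pi * pedal_radius_cm


for radius_cm in (6.0, 9.0, 12.0):  # pedal positioned progressively farther from the axle 2106
    print(f"radius {radius_cm:4.1f} cm -> foot path {pedal_path_circumference_cm(radius_cm):6.1f} cm per revolution")
```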
- a pressure sensor 2086 is attached to or embedded within one of the pedals 2102 for measuring an amount of force applied by the patient on the pedal 2102.
- the pressure sensor 2086 may communicate wirelessly to the treatment apparatus 2070 and/or to the patient interface 2050.
- FIG. 16 shows a person (a patient) using the treatment apparatus of FIG. 14, and shows sensors and various data parameters connected to a patient interface 2050.
- the example patient interface 2050 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient.
- the patient interface 2050 may be embedded within or attached to the treatment apparatus 2070.
- FIG. 16 shows the patient wearing the ambulation sensor 2082 on his wrist, with a note showing “STEPS TODAY 21355”, indicating that the ambulation sensor 2082 has recorded and transmitted that step count to the patient interface 2050.
- FIG. 16 also shows the patient wearing the goniometer 2084 on his right knee, with a note showing “KNEE ANGLE 72°”, indicating that the goniometer 2084 is measuring and transmitting that knee angle to the patient interface 2050.
- FIG. 16 also shows a right side of one of the pedals 2102 with a pressure sensor 2086 showing “FORCE 12.5 lbs.,” indicating that the right pedal pressure sensor 2086 is measuring and transmitting that force measurement to the patient interface 2050.
- FIG. 16 also shows a left side of one of the pedals 2102 with a pressure sensor 2086 showing “FORCE 27 lbs.”, indicating that the left pedal pressure sensor 2086 is measuring and transmitting that force measurement to the patient interface 2050.
- FIG. 16 also shows other patient data, such as an indicator of “SESSION TIME 0:04:13”, indicating that the patient has been using the treatment apparatus 2070 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 2050 based on information received from the treatment apparatus 2070.
- FIG. 16 also shows an indicator showing “PAIN LEVEL 3”. Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 2050.
- FIG. 17 is an example embodiment of an overview display 2120 of the assistant interface 2094.
- the overview display 2120 presents several different controls and interfaces for the assistant to remotely assist a patient with using the patient interface 2050 and/or the treatment apparatus 2070.
- This remote assistance functionality may also be called telemedicine or telehealth.
- the overview display 2120 includes a patient profile display 2130 presenting biographical information regarding a patient using the treatment apparatus 2070.
- the patient profile display 2130 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17, although the patient profile display 2130 may take other forms, such as a separate screen or a popup window.
- the patient profile display 2130 may include a limited subset of the patient’s biographical information. More specifically, the data presented upon the patient profile display 2130 may depend upon the assistant’s need for that information.
- a medical professional that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment apparatus 2070 may be provided with a much more limited set of information regarding the patient.
- the technician for example, may be given only the patient’s name.
- the patient profile display 2130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements.
- privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a “data subject”.
- the patient profile display 2130 may present information regarding the treatment plan for the patient to follow in using the treatment apparatus 2070.
- Such treatment plan information may be limited to an assistant who is a medical professional, such as a doctor or physical therapist.
- a medical professional assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment apparatus 2070 may not be provided with any information regarding the patient’s treatment plan.
- one or more recommended optimal treatment plans and/or ruled-out treatment plans may be presented in the patient profile display 2130 to the assistant.
- the one or more recommended optimal treatment plans and/or ruled-out treatment plans may be generated by the artificial intelligence engine 2011 of the server 2030 and received from the server 2030 in real-time during, inter alia, a telemedicine or telehealth session.
- An example of presenting the one or more recommended optimal treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 18.
- the example overview display 2120 shown in FIG. 17 also includes a patient status display 2134 presenting status information regarding a patient using the treatment apparatus.
- the patient status display 2134 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17, although the patient status display 2134 may take other forms, such as a separate screen or a popup window.
- the patient status display 2134 includes sensor data 2136 from one or more of the external sensors 2082, 2084, 2086, and/or from one or more internal sensors 2076 of the treatment apparatus 2070. In some embodiments, the patient status display 2134 may present other data 2138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
- User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 2020, 2050, 2090, 2092, 2094 of the system 2010.
- user access controls may be employed to control what information is available to any given person using the system 2010.
- data presented on the assistant interface 2094 may be controlled by user access controls, with permissions set depending on the assistant/user’s need for and/or qualifications to view that information.
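A minimal sketch of role-based filtering consistent with such user access controls follows; the roles, field names, and permission table are illustrative assumptions, not a prescribed configuration.

```python
# Illustrative sketch of role-based access controls; roles, field names, and the permission
# table are assumptions, not a prescribed configuration.
ROLE_VISIBLE_FIELDS = {
    "medical_professional": {"name", "medical_history", "treatment_plan", "vital_signs"},
    "technician": {"name"},                                           # troubleshooting needs little patient data
    "researcher": {"pseudonym", "treatment_plan", "vital_signs"},     # no directly identifiable fields
}


def filter_for_role(patient_record, role):
    allowed = ROLE_VISIBLE_FIELDS.get(role, set())
    return {k: v for k, v in patient_record.items() if k in allowed}


record = {
    "name": "Patient X",
    "pseudonym": "P-1042",
    "medical_history": "knee surgery",
    "treatment_plan": "Plan A",
    "vital_signs": {"heartrate": 88},
}
print(filter_for_role(record, "technician"))
print(filter_for_role(record, "researcher"))
```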
- the example overview display 2120 shown in FIG. 17 also includes a help data display 2140 presenting information for the assistant to use in assisting the patient.
- the help data display 2140 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17.
- the help data display 2140 may take other forms, such as a separate screen or a popup window.
- the help data display 2140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 2050 and/or the treatment apparatus 2070.
- the help data display 2140 may also include research data or best practices. In some embodiments, the help data display 2140 may present scripts for answers or explanations in response to patient questions.
- the help data display 2140 may present flow charts or walk-throughs for the assistant to use in determining a root cause and/or solution to a patient’s problem.
- the assistant interface 2094 may present two or more help data displays 2140, which may be the same or different, for simultaneous presentation of help data for use by the assistant. For example, a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient’s problem, and a second help data display may present script information for the assistant to read to the patient, such information preferably including directions for the patient to perform some action, which may help to narrow down or solve the problem.
- the second help data display may automatically populate with script information.
- the example overview display 2120 shown in FIG. 17 also includes a patient interface control 2150 presenting information regarding the patient interface 2050, and/or to modify one or more settings of the patient interface 2050.
- the patient interface control 2150 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17.
- the patient interface control 2150 may take other forms, such as a separate screen or a popup window.
- the patient interface control 2150 may present information communicated to the assistant interface 2094 via one or more of the interface monitor signals 2098b.
- the patient interface control 2150 includes a display feed 2152 of the display presented by the patient interface 2050.
- the display feed 2152 may include a live copy of the display screen currently being presented to the patient by the patient interface 2050. In other words, the display feed 2152 may present an image of what is presented on a display screen of the patient interface 2050. In some embodiments, the display feed 2152 may include abbreviated information regarding the display screen currently being presented by the patient interface 2050, such as a screen name or a screen number.
- the patient interface control 2150 may include a patient interface setting control 2154 for the assistant to adjust or to control one or more settings or aspects of the patient interface 2050. In some embodiments, the patient interface setting control 2154 may cause the assistant interface 2094 to generate and/or to transmit an interface control signal 2098 for controlling a function or a setting of the patient interface 2050.
- the patient interface setting control 2154 may include collaborative browsing or co-browsing capability for the assistant to remotely view and/or control the patient interface 2050.
- the patient interface setting control 2154 may enable the assistant to remotely enter text to one or more text entry fields on the patient interface 2050 and/or to remotely control a cursor on the patient interface 2050 using a mouse or touchscreen of the assistant interface 2094.
- the patient interface setting control 2154 may allow the assistant to change a setting that cannot be changed by the patient.
- the patient interface 2050 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 2050, the language used for the displays, whereas the patient interface setting control 2154 may enable the assistant to change the language setting of the patient interface 2050.
- the patient interface 2050 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 2050 such that the display would become illegible to the patient, whereas the patient interface setting control 2154 may provide for the assistant to change the font size setting of the patient interface 2050.
- the example overview display 2120 shown in FIG. 17 also includes an interface communications display 2156 showing the status of communications between the patient interface 2050 and one or more other devices 2070, 2082, 2084, such as the treatment apparatus 2070, the ambulation sensor 2082, and/or the goniometer 2084.
- the interface communications display 2156 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17.
- the interface communications display 2156 may take other forms, such as a separate screen or a popup window.
- the interface communications display 2156 may include controls for the assistant to remotely modify communications with one or more of the other devices 2070, 2082, 2084.
- the assistant may remotely command the patient interface 2050 to reset communications with one of the other devices 2070, 2082, 2084, or to establish communications with a new one of the other devices 2070, 2082, 2084.
- This functionality may be used, for example, where the patient has a problem with one of the other devices 2070, 2082, 2084, or where the patient receives a new or a replacement one of the other devices 2070, 2082, 2084.
- the example overview display 2120 shown in FIG. 17 also includes an apparatus control 2160 for the assistant to view and/or to control information regarding the treatment apparatus 2070.
- the apparatus control 2160 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17.
- the apparatus control 2160 may take other forms, such as a separate screen or a popup window.
- the apparatus control 2160 may include an apparatus status display 2162 with information regarding the current status of the apparatus.
- the apparatus status display 2162 may present information communicated to the assistant interface 2094 via one or more of the apparatus monitor signals 2099b.
- the apparatus status display 2162 may indicate whether the treatment apparatus 2070 is currently communicating with the patient interface 2050.
- the apparatus status display 2162 may present other current and/or historical information regarding the status of the treatment apparatus 2070.
- the apparatus control 2160 may include an apparatus setting control 2164 for the assistant to adjust or control one or more aspects of the treatment apparatus 2070.
- the apparatus setting control 2164 may cause the assistant interface 2094 to generate and/or to transmit an apparatus control signal 2099 for changing an operating parameter of the treatment apparatus 2070 (e.g., a pedal radius setting, a resistance setting, a target RPM, etc.).
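Under assumed message fields and parameter names, an apparatus control signal for such an operating-parameter change might be packaged as in the sketch below.

```python
# Illustrative sketch: package an operating-parameter change as an apparatus control message.
# The message fields and allowed parameter names are assumptions for explanation only.
import json
import time


def build_apparatus_control_signal(parameter, value, session_id):
    allowed = {"pedal_radius_cm", "resistance_level", "target_rpm"}
    if parameter not in allowed:
        raise ValueError(f"unsupported operating parameter: {parameter}")
    return json.dumps({
        "type": "apparatus_control",   # corresponds to the 2099a-style control signal
        "session": session_id,
        "parameter": parameter,
        "value": value,
        "timestamp": time.time(),
    })


# The assistant adjusts resistance "on the fly" while the patient is pedaling.
print(build_apparatus_control_signal("resistance_level", 5, session_id="tm-session-001"))
```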
- the apparatus setting control 2164 may include a mode button 2166 and a position control 2168, which may be used in conjunction for the assistant to place an actuator 2078 of the treatment apparatus 2070 in a manual mode, after which a setting, such as a position or a speed of the actuator 2078, can be changed using the position control 2168.
- the mode button 2166 may provide for a setting, such as a position, to be toggled between automatic and manual modes.
- one or more settings may be adjustable at any time, and without having an associated auto/manual mode.
- the assistant may change an operating parameter of the treatment apparatus 2070, such as a pedal radius setting, while the patient is actively using the treatment apparatus 2070. Such “on the fly” adjustment may or may not be available to the patient using the patient interface 2050.
- the apparatus setting control 2164 may allow the assistant to change a setting that cannot be changed by the patient using the patient interface 2050.
- the patient interface 2050 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment apparatus 2070, whereas the apparatus setting control 2164 may provide for the assistant to change the height or tilt setting of the treatment apparatus 2070.
- the example overview display 2120 shown in FIG. 17 also includes a patient communications control 2170 for controlling an audio or an audiovisual communications session with the patient interface 2050.
- the communications session with the patient interface 2050 may comprise a live feed from the assistant interface 2094 for presentation by the output device of the patient interface 2050.
- the live feed may take the form of an audio feed and/or a video feed.
- the patient interface 2050 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 2094.
- the communications session with the patient interface 2050 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 2050 and the assistant interface 2094 presenting video of the other one.
- the patient interface 2050 may present video from the assistant interface 2094, while the assistant interface 2094 presents only audio or the assistant interface 2094 presents no live audio or visual signal from the patient interface 2050.
- the assistant interface 2094 may present video from the patient interface 2050, while the patient interface 2050 presents only audio or the patient interface 2050 presents no live audio or visual signal from the assistant interface 2094.
- the audio or an audiovisual communications session with the patient interface 2050 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part.
- the patient communications control 2170 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17.
- the patient communications control 2170 may take other forms, such as a separate screen or a popup window.
- the audio and/or audiovisual communications may be processed and/or directed by the assistant interface 2094 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the assistant while the assistant uses the assistant interface 2094.
- the audio and/or audiovisual communications may include communications with a third party.
- the system 2010 may enable the assistant to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a medical professional or a specialist.
- the example patient communications control 2170 shown in FIG. 17 includes call controls 2172 for the assistant to use in managing various aspects of the audio or audiovisual communications with the patient.
- the call controls 2172 include a disconnect button 2174 for the assistant to end the audio or audiovisual communications session.
- the call controls 2172 also include a mute button 2176 to temporarily silence an audio or audiovisual signal from the assistant interface 2094.
- the call controls 2172 may include other features, such as a hold button (not shown).
- the call controls 2172 also include one or more record/playback controls 2178, such as record, play, and pause buttons to control, with the patient interface 2050, recording and/or playback of audio and/or video from the teleconference session.
- the call controls 2172 also include a video feed display 2180 for presenting still and/or video images from the patient interface 2050, and a self-video display 2182 showing the current image of the assistant using the assistant interface.
- the self-video display 2182 may be presented as a picture-in-picture format, within a section of the video feed display 2180, as shown in FIG. 17. Alternatively or additionally, the self-video display 2182 may be presented separately and/or independently from the video feed display 2180.
- the example overview display 2120 shown in FIG. 17 also includes a third party communications control 2190 for use in conducting audio and/or audiovisual communications with a third party.
- the third party communications control 2190 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17.
- the third party communications control 2190 may take other forms, such as a display on a separate screen or a popup window.
- the third party communications control 2190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a medical professional or a specialist.
- the third party communications control 2190 may include conference calling capability for the third party to simultaneously communicate with both the assistant via the assistant interface 2094, and with the patient via the patient interface 2050.
- the system 2010 may provide for the assistant to initiate a 3-way conversation with the patient and the third party.
- FIG. 18 shows an example embodiment of the overview display 2120 of the assistant interface 2094 presenting recommended optimal treatment plans and ruled-out treatment plans in real-time during a telemedicine session according to the present disclosure.
- the overview display 2120 just includes sections for the patient profile 2130 and the video feed display 2180, including the self-video display 2182. Any suitable configuration of controls and interfaces of the overview display 2120 described with reference to FIG. 17 may be presented in addition to or instead of the patient profile 2130, the video feed display 2180, and the self-video display 2182.
- the assistant (e.g., a medical professional) who is using the assistant interface 2094 (e.g., a computing device) during the telemedicine session may be presented in the self-video 2182 in a portion of the overview display 2120 (e.g., the user interface presented on a display screen 2024 of the assistant interface 2094) that also presents a video from the patient in the video feed display 2180.
- a portion of the overview display 2120 includes the patient profile display 2130.
- the patient profile display 2130 is presenting two example optimal treatment plans 2600 and one example excluded treatment plan 2602.
- the optimal treatment plans may be recommended in view of various clinical information and characteristics of the patient being treated.
- the clinical information may include information pertaining to characteristics of other people, treatment plans followed by the other people, and results of the treatment plans.
- To generate the recommended optimal treatment plans 2600 that the patient should follow to achieve a desired result, one or more machine learning models 2013 of the artificial intelligence engine 2011 may match a pattern between the characteristics of the patient being treated and the characteristics of the other people.
- Each of the recommended optimal treatment plans may be generated based on different desired results.
- treatment plan “A” indicates “Patient X should use treatment apparatus for 30 minutes a day for 4 days to achieve an increased range of motion of Y%; Patient X has Type 2 Diabetes; and Patient X should be prescribed medication Z for pain management during the treatment plan (medication Z is approved for people having Type 2 Diabetes).”
- the generated optimal treatment plan achieves an increase in range of motion of Y%.
- the optimal treatment plan also includes a recommended medication (e.g., medication Z) to prescribe to the patient to manage pain in view of a known medical disease (e.g., Type 2 Diabetes) of the patient. That is, the recommended medication not only does not conflict with the medical condition of the patient but also improves the probability of a superior patient outcome.
- Recommended optimal treatment plan “B” may specify, based on a different desired result of the treatment plan, a different treatment plan including a different treatment protocol for a treatment apparatus, a different medication regimen, etc.
- the patient profile display 2130 may also present the ruled-out treatment plans 2602. These types of treatment plans are shown to the assistant using the assistant interface 2094 to alert the assistant not to recommend certain portions of a treatment plan to the patient.
- the ruled-out treatment plan could specify the following: “Patient X should not use treatment apparatus for longer than 30 minutes a day due to a heart condition; Patient X has Type 2 Diabetes; and Patient X should not be prescribed medication M for pain management during the treatment plan (in this scenario, medication M can cause complications for people having Type 2 Diabetes).”
- the ruled-out treatment plan points out a limitation of a treatment protocol where, due to a heart condition, Patient X should not exercise for more than 30 minutes a day.
- the ruled-out treatment plan also points out that Patient X should not be prescribed medication M because it conflicts with the medical condition Type 2 Diabetes.
- the assistant may select the optimal treatment plan for the patient on the overview display 2120.
- the assistant may use an input peripheral (e.g., mouse, touchscreen, microphone, keyboard, etc.) to select from the optimal treatment plans 2600 for the patient.
- the assistant may discuss the pros and cons of the recommended optimal treatment plans 2600 with the patient.
- the assistant may select the optimal treatment plan for the patient to follow to achieve the desired result.
- the selected optimal treatment plan may be transmitted to the patient interface 2050 for presentation.
- the patient may view the selected optimal treatment plan on the patient interface 2050.
- the assistant and the patient may discuss during the telemedicine session the details (e.g., treatment protocol using treatment apparatus 2070, diet regimen, medication regimen, etc.) in real-time.
- FIG. 19 shows an example embodiment of a server 2030 translating clinical information 2700 into a medical description language 2702 for processing by an artificial intelligence engine 2011 according to the present disclosure.
- the clinical information 2700 may be written by a person having a certain professional credential, license, or degree.
- the clinical information 2700 includes a portion of meta analyses for a clinical trial titled “EFFECT OF USING TREATMENT PLAN FOR HIP OSTEOARTHRITIS PAIN”.
- the portion includes a section for “Results” and a section for “Conclusion”.
- One or more machine learning models 2013 may be trained to parse a body of structured or unstructured text (e.g., clinical information 2700) in search of a corpus of keywords that represent target information.
- the target information may be included in one or more portions of the clinical information 2700.
- Target information may refer to any suitable information of interest, such as characteristics of people (e.g., vital signs, medical conditions, medical procedures, allergies, familial medical information, measurements, etc.), treatment plans followed by the people, results of the treatment plans, clinical trial information, treatment apparatuses used for the treatment plan, and the like.
- the one or more machine learning models 2013 may generate a canonical format defined by the medical descriptive language.
- the values may be numbers, characters, alphanumeric characters, strings, arrays, and the like, wherein they are obtained from the portions of the clinical information 2700 (including the target information).
- the target information may be organized in parent-child relationships based on the structure, organization, and/or relationships of the information.
- the keyword “Results” may be identified and determined to be a parent level tag due to its encompassing children target information, such as trials, subjects, treatment plan, treatment apparatus, subject characteristics, and conclusions.
- a parent-level tag for “<results>” may include child-level tags for “<trials>”, “<subjects>”, “<treatment plan>”, “<treatment apparatus>”, “<subject characteristics>”, and “<conclusions>”.
- Each tag may have a corresponding ending tag (e.g., “<results> ... </results>”).
- the trained machine learning model 2013 identified keywords “treatment plan” and “treatment apparatus” in the portion of the clinical information 2700. Once identified, the trained machine learning model 2013 may analyze words in the vicinity (e.g., to the left and right) of the keywords to determine, based on training data, whether the words match a recognized context. The trained machine learning model 2013 may also determine, based on training data and based on attributes of the data, whether the words are recognized as being associated with the keywords.
- In FIG. 19, the trained machine learning model may determine that the words “range of motion (ROM)” fit the context of the keyword “treatment apparatus” and are likely recognizable as being associated with that keyword. Accordingly, the value “ROM” is placed between the tags “<treatment apparatus>” and “</treatment apparatus>” representing the target information.
- the other tags representing target information in the canonical format of the medical description language 2702 may be populated in a similar manner.
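- By way of a purely illustrative sketch (not part of the disclosed embodiments), the keyword-driven translation described above could be approximated as follows; the keyword list, the tag names, and the three-word value window are assumptions chosen for the example, and the crude "next few words" rule stands in for the trained model's context and association checks.
```python
# Illustrative sketch only: translate a snippet of unstructured clinical text into a
# tag-value canonical format resembling the medical description language 2702.
# KEYWORDS, the tag names, and the 3-word value window are hypothetical choices.
KEYWORDS = {
    "treatment plan": "treatment_plan",
    "treatment apparatus": "treatment_apparatus",
}

def translate_to_canonical(clinical_text: str, window: int = 3) -> str:
    """Find each keyword and wrap a few following words in child tags under <results>."""
    lowered = clinical_text.lower()
    children = []
    for keyword, tag in KEYWORDS.items():
        idx = lowered.find(keyword)
        if idx == -1:
            continue
        # Crude stand-in for the trained model's context/association checks:
        # take the next few words to the right of the keyword as the value.
        following = clinical_text[idx + len(keyword):].split()[:window]
        value = " ".join(following).strip(" .,:;")
        children.append(f"  <{tag}>{value}</{tag}>")
    return "<results>\n" + "\n".join(children) + "\n</results>"

if __name__ == "__main__":
    sample = ("Subjects used the treatment apparatus ROM pedal device while "
              "following the treatment plan 30 minutes daily for 4 weeks.")
    print(translate_to_canonical(sample))
```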
- the medical description language 2702 representing the portion of the clinical information 2700 may be saved in the patient data store 2044 in an appropriate patient cohort-equivalent database.
- FIG. 20 shows an example embodiment of a method 2800 for recommending an optimal treatment plan according to the present disclosure.
- the method 2800 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both.
- the method 2800 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIGURE 13, such as server 2030 executing the artificial intelligence engine 2011).
- the method 2800 may be performed by a single processing thread.
- the method 2800 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the methods.
- the method 2800 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 2800 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 2800 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 2800 could alternatively be represented as a series of interrelated states via a state diagram or events.
- the processing device may receive, from a data source 2015, clinical information 2700 pertaining to results of performing particular treatment plans using the treatment apparatus 2070 for people having certain characteristics.
- the clinical information has a first data format, which may include natural language text in the form of words arranged in sentences that are further arranged in paragraphs.
- the first data format may be a report or description, wherein the report or description may include information pertaining to clinical trials, medical research, meta-analyses, evidence-based guidelines, journals, and the like.
- the first data format may include information arranged in an unstructured manner and may have a first data size (e.g., bytes, kilobytes, etc.).
- the certain characteristics of the people may include medications prescribed to the people, injuries of the people, medical procedures performed on the people, measurements of the people, allergies of the people, medical conditions of the people, first historical information of the people, vital signs of the people, symptoms of the people, familial medical information of the people, or some combination thereof.
- the characteristics may also include the following information pertaining to the people: demographic, geographic, diagnostic, measurement- or test-based, medically historic, etiologic, cohort-associative, differentially diagnostic, surgical, physically therapeutic, pharmacologic and other treatment(s) recommended.
- the processing device may translate a portion of the clinical information from the first data format to a medical description language 2702 used by the artificial intelligence engine 2011.
- the medical description language 2702 may include a second data format that structures the unstructured data of the clinical information 2700.
- the medical description language 2702 may include using tag-value pairs, where the tags identify the type of value stored between the tags.
- the medical description language 2702 may have a second data size (e.g., bits) that is smaller than the first data size of the clinical information 2700.
- the medical description language may include telemedical data.
- the processing device may determine, based on the portion of the clinical information 2700 described by the medical description language 2702 and a set of characteristics pertaining to a patient, the optimal treatment plan 2600 for the patient to follow when using the treatment apparatus 2070 to achieve a desired result.
- One or more machine learning models 2013 of the artificial intelligence engine 2011 may be trained to output the optimal treatment plan 2600.
- one machine learning model 2013 may be trained to match a pattern between the portion of the clinical information described by the medical description language 2702 with the set of characteristics of the patient.
- the set of characteristics of the patient is also represented in the medical description language. The pattern is associated with the optimal treatment plan that may produce the desired result.
- the optimal treatment plan may include information pertaining to a medical procedure to perform on the patient, a treatment protocol for the patient using the treatment apparatus 2070, a diet regimen for the patient, a medication regimen for the patient, a sleep regimen for the patient, or some combination thereof.
- the desired result may include obtaining a certain result within a certain time period.
- the certain result may include a range of motion the patient achieves using the treatment apparatus 2070, an amount of force exerted by the patient on a portion of the treatment apparatus 2070, an amount of time the patient exercises using the treatment apparatus 2070, a distance the patient travels using the treatment apparatus 2070, a level of pain experienced by the patient when using the treatment apparatus 2070, or some combination thereof.
- the processing device may determine, based on the portion of the clinical information described by the medical description language and the set of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow using the treatment apparatus 2070 to achieve a second desired result.
- the desired result may pertain to a recovery outcome and the second desired result may pertain to a recovery time.
- the recovery outcome may include achieving a certain threshold of functionality, mobility, movement, range of motion, etc. of a particular body part.
- the recovery time may include achieving a certain threshold of functionality, mobility, movement, range of motion, etc. of a particular body part within a certain threshold period of time. For example, some people may prefer to recover to a certain level of mobility as fast as possible without full recovery.
- different machine learning models 2013 may be trained, using different clinical information, to provide different recommended treatment plans that may produce different desired results.
- the processing device may determine, based on the portion of the clinical information described by the medical description language and the set of characteristics pertaining to the patient, an excluded treatment plan 2602 that should not be recommended for the patient to follow when using the treatment apparatus 2070 to achieve the desired result.
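- As a minimal, non-authoritative sketch of the recommendation and exclusion logic described above (it is not the trained machine learning models 2013), the following example scores characteristic overlap between a patient and canonical clinical records and separates contraindicated plans; the record fields, the overlap-scoring rule, and the contraindication check are illustrative assumptions.
```python
# Minimal sketch: score how well each clinical-information record matches a patient's
# characteristics for a given desired result, and split recommended vs. excluded plans.
from dataclasses import dataclass, field

@dataclass
class ClinicalRecord:
    characteristics: dict            # e.g., {"condition": "type 2 diabetes"}
    treatment_plan: str
    desired_result: str              # e.g., "range_of_motion" or "recovery_time"
    contraindicated_for: set = field(default_factory=set)

def recommend_plans(patient: dict, records: list[ClinicalRecord], desired_result: str):
    recommended, excluded = [], []
    for rec in records:
        if rec.desired_result != desired_result:
            continue
        overlap = sum(1 for k, v in rec.characteristics.items() if patient.get(k) == v)
        if rec.contraindicated_for & set(patient.values()):
            excluded.append((rec.treatment_plan, overlap))
        elif overlap:
            recommended.append((rec.treatment_plan, overlap))
    # Highest characteristic overlap first, standing in for pattern matching.
    recommended.sort(key=lambda item: item[1], reverse=True)
    return recommended, excluded

if __name__ == "__main__":
    patient = {"condition": "type 2 diabetes", "age_band": "60-70"}
    records = [
        ClinicalRecord({"condition": "type 2 diabetes"},
                       "Plan A: 30 min/day for 4 days", "range_of_motion"),
        ClinicalRecord({"condition": "type 2 diabetes"},
                       "Plan M: includes medication M", "range_of_motion",
                       contraindicated_for={"type 2 diabetes"}),
    ]
    print(recommend_plans(patient, records, "range_of_motion"))
```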
- the optimal treatment plan(s) 2600 and the excluded treatment plan(s) 2602 may be concurrently presented in a first portion (e.g., patient profile display 2130) of the user interface while at least the video or other multimedia data from the patient engaged in the telemedicine session may be presented in another portion (e.g., video feed display 2180).
- the optimal treatment plan(s) 2600 and the excluded treatment plan(s) 2602 may be concurrently presented while the medical professional is not engaged in a telemedicine session.
- the optimal treatment plan(s) 2600 and the excluded treatment plan(s) 2602 may be presented in the user interface before a telemedicine session begins or after a telemedicine session ends.
- the processing device may provide the optimal treatment plan to be presented in a user interface (e.g., overview display 2120) on a computing device (e.g., assistant interface 2094) of a medical professional.
- any other generated optimal treatment plans 2600 may be provided to the computing device of the medical professional.
- different optimal treatment plans that result in different outcomes may be presented to the medical professional.
- the processing device may receive a selection of a treatment plan from among the treatment plans presented.
- the medical professional may select the optimal treatment plan based on an outcome preference of the patient. For example, an athlete might wish to optimize for performance, while a retiree might wish to optimize for a pain-free quality of life.
- the selected treatment plan may be transmitted to the computing device of the patient for presentation on a user interface.
- the optimal treatment plan(s) may be provided to the computing device of the medical professional during a telemedicine session to cause the optimal treatment plan to be presented in real-time in a first portion of the user interface while video and, optionally, other multimedia of the patient is concurrently presented in a second portion of the user interface.
- the selected treatment plan may be presented on the computing device of the patient during the telemedicine session such that the medical professional can explain the selected treatment plan to the patient.
- FIG. 21 shows an example embodiment of a method 2900 for translating clinical information into the medical description language according to the present disclosure.
- Method 2900 includes operations performed by processors of a computing device (e.g., any component of FIG. 13, such as server 2030 executing the artificial intelligence engine 2011).
- one or more operations of the method 2900 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 2900 may be performed in the same or a similar manner as described above in regard to method 2800.
- the operations of the method 2900 may be performed in some combination with any of the operations of any of the methods described herein.
- the method 2900 may include operation 2804 from the previously described method 2800 depicted in FIG. 20.
- the processing device may translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine.
- the method 2900 in FIG. 21 includes operations 2902, 2904, and 2906.
- the operations 2902, 2904, and 2906 may be performed by one or more trained machine learning models 2013 of the artificial intelligence engine 2011.
- the processing device may parse the clinical information.
- the processing device may identify, based on keywords representing target information in the clinical information, the portion of the clinical information having values related to the target information.
- the processing device may generate a canonical format defined by the medical description language.
- the canonical format may include tags identifying values of the target information.
- the tags may be attributes describing specific characteristics of the target information.
- the specific characteristics may include which cohort class a person is placed in, age of the person, semantic information, being related to a certain cohort, familial history, and the like. In some embodiments, the specific characteristics may include any information or indication that a person is at risk.
- the canonical format may enable more efficient processing of the portion of the clinical information represented by the medical description language when training a machine learning model to generate the optimal treatment plans for patients who are using the trained machine learning model. Further, the canonical format may enable more efficient processing by the trained machine learning model when matching patterns between the characteristics of patients and the portion of the clinical information represented by the medical description language.
- FIG. 22 shows an example computer system 21000 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure.
- computer system 21000 may include a computing device and correspond to the assistant interface 2094, reporting interface 2092, supervisory interface 2090, clinician interface 2020, server 2030 (including the AI engine 2011), patient interface 2050, ambulatory sensor 2082, goniometer 2084, treatment apparatus 2070, pressure sensor 2086, or any suitable component of FIG. 13.
- the computer system 21000 may be capable of executing instructions implementing the one or more machine learning models 2013 of the artificial intelligence engine 2011 of FIG. 13.
- the computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network.
- the computer system may operate in the capacity of a server in a client-server network environment.
- the computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device.
- the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
- the computer system 21000 includes a processing device 21002, a main memory 21004 (e.g., read only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 21006 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 21008, which communicate with each other via a bus 1010.
- Processing device 21002 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 21002 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 21002 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 21002 is configured to execute instructions for performing any of the operations and steps discussed herein.
- the computer system 21000 may further include a network interface device 21012.
- the computer system 21000 also may include a video display 21014 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 21016 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 21018 (e.g., a speaker).
- the video display 21014 and the input device(s) 21016 may be combined into a single component or device (e.g., an LCD touch screen).
- the data storage device 21008 may include a computer-readable medium 21020 on which the instructions 21022 embodying any one or more of the methods, operations, or functions described herein are stored.
- the instructions 21022 may also reside, completely or at least partially, within the main memory 21004 and/or within the processing device 21002 during execution thereof by the computer system 21000. As such, the main memory 21004 and the processing device 21002 also constitute computer-readable media.
- the instructions 21022 may further be transmitted or received over a network via the network interface device 21012.
- While the computer-readable storage medium 21020 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
- the term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
- a method for providing, by an artificial intelligence engine, an optimal treatment plan to use with a treatment apparatus comprising:
- Clause 31 The method of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
- the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test- based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
- the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort-associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
- Clause 33 The method of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, meta-analysis, or some combination thereof.
- a tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to:
- [0405] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
- [0411] identify, based on keywords representing target information in the clinical information, the portion of the clinical information having values of the target information;
- [0412] generate a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
- [0416] determines, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a second desired result, wherein the desired result pertains to a recovery outcome and the second desired result pertains to a recovery time; and [0417] provides the second optimal treatment plan to be presented on the computing device of the medical professional;
- [0419] transmits the selected treatment plan to a computing device of the patient.
- Clause 40 The computer-readable medium of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
- the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and [0428] the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort-associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
- Clause 42 The computer-readable medium of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, or some combination thereof.
- a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to:
- [0433] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
- Determining a treatment plan for a patient having certain characteristics may be a technically challenging problem. For example, a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process.
- some of the multitude of information considered may include characteristics of the patient such as personal information, performance information, and measurement information.
- the personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medication history, an injury, a medical procedure, a medication prescribed, behavioral or psychological conditions, or some combination thereof.
- the performance information may include, e.g., an elapsed time of using a treatment device, an amount of force exerted on a portion of the treatment device, a range of motion achieved on the treatment device, a movement speed of a portion of the treatment device, an indication of a plurality of pain levels using the treatment device, or some combination thereof.
- the measurement information may include, e.g., a vital sign, a respiration rate, a heartrate, a temperature, a blood pressure, a glucose level or other biomarker, or some combination thereof. It may be desirable to process the characteristics of a multitude of patients, the treatment plans performed for those patients, and the results of the treatment plans for those patients.
- Another technical problem may involve distally treating, via a computing device during a telemedicine or telehealth session, a patient from a location different than a location at which the patient is located.
- An additional technical problem is controlling or enabling the control of, from the different location, a treatment device used by the patient at the location at which the patient is located.
- a healthcare provider may prescribe a treatment device to the patient to use to perform a treatment protocol at their residence or any mobile location or temporary domicile.
- a healthcare provider may refer to a doctor, physician assistant, nurse, chiropractor, dentist, physical therapist, acupuncturist, physical trainer, coach, personal trainer, or the like.
- a healthcare provider may refer to any person with a credential, license, degree, or the like in the field of medicine, physical therapy, rehabilitation, or the like.
- When the healthcare provider is located in a different location from the patient and the treatment device, it may be technically challenging for the healthcare provider to monitor the patient’s actual progress (as opposed to relying on the patient’s word about their progress) using the treatment device, modify the treatment plan according to the patient’s progress, adapt the treatment device to the personal characteristics of the patient as the patient performs the treatment plan, and the like.
- systems and methods configured to monitor the patient’s actual progress, while the patient performs the treatment plan using the treatment device, may be desirable.
- the systems and methods described herein may be configured to receive treatment data pertaining to a user who uses a treatment device to perform a treatment plan.
- the user may include a patient, user, or person using the treatment device to perform various exercises.
- the treatment data may include various characteristics of the user, various baseline measurement information pertaining to the user, various measurement information pertaining to the user while the user uses the treatment device, various characteristics of the treatment device, the treatment plan, other suitable data, or a combination thereof.
- the systems and methods described herein may be configured to receive the treatment data during a telemedicine session.
- at least some of the treatment data may correspond to sensor data of a sensor configured to sense various characteristics of the treatment device, and/or the measurement information of the user.
- at least some of the treatment data may correspond to sensor data from a sensor associated with a wearable device configured to sense the measurement information of the user.
- the various characteristics of the treatment device may include one or more settings of the treatment device, a current revolutions per time period (e.g., such as one minute) of a rotating member (e.g., such as a wheel) of the treatment device, a resistance setting of the treatment device, other suitable characteristics of the treatment device, or a combination thereof.
- the baseline measurement information may include, while the user is at rest, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, a glucose level or other biomarker, other suitable measurement information of the user, or a combination thereof.
- the measurement information may include, while the user uses the treatment device to perform the treatment plan, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, a glucose level of the user, other suitable measurement information of the user, or a combination thereof.
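- Purely for illustration, the treatment data described above might be organized in a structure such as the following sketch; the class and field names, and the sample values, are assumptions rather than a disclosed schema.
```python
# Illustrative container for the treatment data: user characteristics, baseline
# measurements, in-session measurements, and treatment device characteristics.
from dataclasses import dataclass, field

@dataclass
class TreatmentData:
    # Characteristics of the user (personal information).
    user_characteristics: dict = field(default_factory=dict)
    # Baseline measurement information captured while the user is at rest.
    baseline: dict = field(default_factory=dict)
    # Measurement information captured while the user performs the treatment plan.
    session_measurements: dict = field(default_factory=dict)
    # Characteristics of the treatment device (settings, RPM, resistance, ...).
    device_state: dict = field(default_factory=dict)
    treatment_plan_id: str = ""

sample = TreatmentData(
    user_characteristics={"age": 67, "procedure": "knee replacement"},
    baseline={"heartrate": 62, "systolic": 118, "diastolic": 76},
    session_measurements={"heartrate": 104, "systolic": 131, "diastolic": 84},
    device_state={"resistance": 4, "rpm": 55},
    treatment_plan_id="plan-A",
)
```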
- the systems and methods described herein may be configured to write to an associated memory, for access by an artificial intelligence engine, the treatment data.
- the artificial intelligence engine may be configured to use one or more machine learning models configured to use at least some of the treatment data to generate one or more predictions.
- the artificial intelligence engine may use a machine learning model trained using various treatment data corresponding to various users.
- the machine learning model may be configured to receive the treatment data corresponding to the user.
- the machine learning model may analyze the at least one aspect of the treatment data and may generate at least one prediction corresponding to the at least one aspect of the treatment data.
- the at least one prediction may indicate one or more predicted characteristics of the user.
- the one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted performance parameter of the user performing the treatment plan, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristics of the user.
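- The following is an illustrative stand-in for a trained machine learning model 2013 generating a prediction from treatment data: it simply fits a line to recent in-session heartrate samples and projects the next reading, whereas the disclosed engine would be trained on treatment data from many users. The sample values and the linear-fit rule are assumptions.
```python
# Illustrative stand-in for a trained model: predict the next heartrate reading by
# extrapolating a least-squares line over recent in-session samples.
def predict_next_heartrate(heartrate_samples: list[float]) -> float:
    """Least-squares slope over the sample index, projected one step ahead."""
    n = len(heartrate_samples)
    if n < 2:
        return heartrate_samples[-1] if heartrate_samples else 0.0
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(heartrate_samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, heartrate_samples)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return mean_y + slope * (n - mean_x)

if __name__ == "__main__":
    samples = [88.0, 92.0, 95.0, 99.0, 104.0]          # heartrate while pedaling
    print(round(predict_next_heartrate(samples), 1))   # predicted next reading (107.3)
```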
- the systems and methods described herein may be configured to receive, from the artificial intelligence engine, one or more predictions.
- the systems and methods described herein may be configured to identify a threshold corresponding to respective predictions received from the artificial intelligence engine. For example, the systems and methods described herein may identify one or more characteristics of the user indicated by a respective prediction.
- the systems and methods described herein may be configured to access a database configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user.
- the database may include information that associates a first threshold with a blood pressure of the user.
- the database may include information that associates a threshold with a blood pressure of the user and a heartrate of the user.
- the database may include any number of thresholds associated with any of the various characteristics of the user and/or any combination of user characteristics.
- a threshold corresponding to a respective prediction may include a value or a range of values, including an upper limit and a lower limit.
- the systems and methods described herein may be configured to determine whether a prediction received from the artificial intelligence engine is within a range of a corresponding threshold. For example, the systems and methods described herein may be configured to compare the prediction to the corresponding threshold. The systems and methods described herein may be configured to determine whether the prediction is within a predefined range of the threshold.
- the predefined range may include an upper limit (e.g., 0.5% or 1% percentagewise, or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable upper limit) above the value and a lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable lower limit) below the value.
- the predefined range may include a second upper limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value) or other suitable upper limit) above the first upper limit and a second lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value) or other suitable lower limit) below the first lower limit.
- the threshold may include any suitable predefined range and may include any suitable format in addition to or other than those described herein.
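- A minimal sketch of the threshold lookup and predefined-range check described above follows; the in-memory mapping, the threshold values, and the one-percent margins are assumptions chosen only to make the comparison concrete.
```python
# Illustrative threshold "database" keyed by user characteristic(s), plus a range check.
THRESHOLDS = {
    ("blood_pressure_systolic",): 140.0,
    ("blood_pressure_systolic", "heartrate"): 130.0,   # combined-characteristic threshold
}

def within_range(prediction: float, threshold: float, margin_pct: float = 1.0) -> bool:
    """True if the prediction falls between the lower and upper limits around the threshold."""
    lower = threshold * (1 - margin_pct / 100)
    upper = threshold * (1 + margin_pct / 100)
    return lower <= prediction <= upper

def check_prediction(characteristics: tuple, prediction: float) -> bool:
    threshold = THRESHOLDS.get(characteristics)
    if threshold is None:
        return False
    return within_range(prediction, threshold)

if __name__ == "__main__":
    print(check_prediction(("blood_pressure_systolic",), 139.2))   # True: inside +/- 1%
    print(check_prediction(("blood_pressure_systolic",), 128.0))   # False: outside the range
```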
- the systems and methods described herein may be configured to communicate with (e.g., or over or across) an interface, at a computing device of a healthcare provider, to provide the prediction and the treatment data.
- the systems and methods described herein may be configured to generate treatment information using the treatment data.
- the treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device.
- the summary may be formatted, such that the treatment data is presentable at a computing device of the healthcare provider.
- the systems and methods described herein may be configured to communicate the treatment information with the prediction and/or the treatment data, to the computing device of the healthcare provider.
- the systems and methods described herein may be configured to update the treatment data pertaining to the user to indicate the prediction.
- the systems and methods described herein may, in response to determining that the prediction is within the range of the threshold, modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device based on the prediction.
- the systems and methods described herein may be configured to control, while the user uses the treatment device during a telemedicine session and based on a generated prediction, the treatment device.
- the systems and methods described herein may control one or more characteristics of the treatment device based on the prediction and/or the treatment plan.
- the healthcare provider may include a medical professional (e.g., such as a doctor, a nurse, a therapist, and the like), an exercise professional (e.g., such as a coach, a trainer, a nutritionist, and the like), or another professional sharing at least one of medical and exercise attributes (e.g., such as an exercise physiologist, a physical therapist, an occupational therapist, and the like).
- a “healthcare provider” may be a human being, a robot, a virtual assistant, a virtual assistant in a virtual and/or augmented reality, or an artificially intelligent entity, including a software program, integrated software and hardware, or hardware alone.
- the interface may include a graphical user interface configured to provide the treatment information and receive input from the healthcare provider.
- the interface may include one or more input fields, such as text input fields, dropdown selection input fields, radio button input fields, virtual switch input fields, virtual lever input fields, audio, haptic, tactile, biometric or otherwise activated and/or driven input fields, other suitable input fields, or a combination thereof.
- the healthcare provider may review the treatment information and/or the prediction.
- the healthcare provider may determine, based on the review of the treatment information and/or prediction, whether to modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device. For example, the healthcare provider may review the treatment information.
- the healthcare provider may, based on the review of the treatment information, compare the treatment information to the treatment plan being performed by the user.
- the healthcare provider may compare the following: (i) expected information, which pertains to the user while the user uses the treatment device to perform the treatment plan, to (ii) the prediction, which pertains to the user while the user uses the treatment device to perform the treatment plan.
- the expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof.
- the healthcare provider may determine that the treatment plan is having the desired effect if the prediction is within an acceptable range associated with one or more corresponding parts or portions of the expected information. Alternatively, the healthcare provider may determine that the treatment plan is not having the desired effect if the prediction is outside of the range associated with one or more corresponding parts or portions of the expected information.
- the healthcare provider may determine whether a blood pressure value indicated by the prediction (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, percentagewise, plus or minus 1 unit of measurement (or other suitable numerical value), or any suitable percentage-based or numerical range) of an expected blood pressure value indicated by the expected information.
- the healthcare provider may determine that the treatment plan is having the desired effect if the blood pressure value is within the range of the expected blood pressure value.
- the healthcare provider may determine that the treatment plan is not having the desired effect if the blood pressure value is outside of the range of the expected blood pressure value.
- the healthcare provider may compare the expected characteristics of the treatment device with characteristics of the treatment device indicated by the treatment information. For example, the healthcare provider may compare an expected resistance setting of the treatment device with an actual resistance setting of the treatment device indicated by the treatment information. The healthcare provider may determine that the user is performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are within a range of corresponding ones of the expected characteristics of the treatment device. Alternatively, the healthcare provider may determine that the user is not performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are outside the range of corresponding ones of the expected characteristics of the treatment device.
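- The comparison of predicted values against expected information could be sketched as follows; the field names and the five-percent acceptable range are illustrative assumptions, not disclosed parameters.
```python
# Illustrative check: is each predicted measurement within an acceptable percentage
# range of the corresponding expected value from the treatment plan?
def treatment_having_desired_effect(expected: dict, predicted: dict,
                                    margin_pct: float = 5.0) -> bool:
    for name, expected_value in expected.items():
        predicted_value = predicted.get(name)
        if predicted_value is None:
            continue
        lower = expected_value * (1 - margin_pct / 100)
        upper = expected_value * (1 + margin_pct / 100)
        if not (lower <= predicted_value <= upper):
            return False   # e.g., predicted blood pressure outside the acceptable range
    return True

if __name__ == "__main__":
    expected = {"systolic": 128, "diastolic": 82, "heartrate": 105}
    predicted = {"systolic": 131, "diastolic": 84, "heartrate": 108}
    print(treatment_having_desired_effect(expected, predicted))    # True: all within 5%
```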
- the healthcare provider may determine not to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device.
- the healthcare provider may determine to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device.
- the healthcare provider may interact with the interface to provide treatment plan input indicating one or more modifications to the treatment plan and/or to modify one or more characteristics of the treatment device, if the healthcare provider determines to modify the at least one aspect of the treatment plan and/or to modify one or more characteristics of the treatment device.
- the healthcare provider may use the interface to provide input indicating an increase or decrease in the resistance setting of the treatment device, or other suitable modification to the one or more characteristics of the treatment device.
- the healthcare provider may use the interface to provide input indicating a modification to the treatment plan.
- the healthcare provider may use the interface to provide input indicating an increase or decrease in an amount of time the user is required to use the treatment device according to the treatment plan, or other suitable modifications to the treatment plan.
- the systems and methods described herein may be configured to modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device.
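- As an illustrative sketch of applying such a modification, the following example updates a treatment plan and issues a device command; the message fields and the send_to_device stub are hypothetical placeholders for whatever transport the system actually uses.
```python
# Illustrative application of a provider's treatment plan input: update the plan
# and/or the controllable treatment device characteristics.
def send_to_device(setting: str, value) -> None:
    # Stand-in for the command actually transmitted to the treatment device.
    print(f"device command: set {setting} -> {value}")

def apply_modification(treatment_plan: dict, device_state: dict, modification: dict) -> None:
    # Modifications to the plan itself, e.g., session length in minutes.
    for key, value in modification.get("plan", {}).items():
        treatment_plan[key] = value
    # Modifications to device characteristics, e.g., the resistance setting.
    for key, value in modification.get("device", {}).items():
        device_state[key] = value
        send_to_device(key, value)

if __name__ == "__main__":
    plan = {"minutes_per_session": 30, "sessions_per_week": 4}
    device = {"resistance": 4}
    apply_modification(plan, device, {"plan": {"minutes_per_session": 25},
                                      "device": {"resistance": 3}})
    print(plan, device)
```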
- the systems and methods described herein may be configured to receive the subsequent treatment data pertaining to the user while the user uses the treatment device to perform the modified treatment plan. For example, after the healthcare provider provides input modifying the treatment plan and/or the one or more characteristics of the treatment device, and/or after the artificial intelligence engine modifies the treatment plan and/or one or more characteristics of the treatment device, the user may continue to use the treatment device to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the user uses the treatment device to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the user continues to use the treatment device to perform the treatment plan, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or the one or more characteristics of the treatment device, and/or the artificial intelligence engine has determined not to modify the treatment plan and/or the one or more characteristics of the treatment device.
- the artificial intelligence engine may use the one or more machine learning models to generate one or more subsequent predictions based on the subsequent treatment data. The systems and methods described herein may determine whether a respective subsequent prediction is within a range of a corresponding threshold.
- the systems and methods described herein may, in response to a determination that the respective subsequent prediction is within the range of the threshold, communicate the subsequent treatment data, subsequent treatment information, and/or the prediction to the computing device of the healthcare provider. In some embodiments, based on the subsequent prediction, the systems and methods described herein may modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device.
- In some embodiments, the systems and methods described herein may be configured to receive subsequent treatment plan input from the computing device of the healthcare provider. Based on the subsequent treatment plan input received from the computing device of the healthcare provider, the systems and methods described herein may be configured to further modify the treatment plan and/or to control the one or more characteristics of the treatment device.
- the subsequent treatment plan input may correspond to input provided by the healthcare provider, at the interface, in response to receiving and/or reviewing subsequent treatment information and/or the subsequent prediction corresponding to the subsequent treatment data.
- the systems and methods described herein may be configured to continuously and/or periodically generate predictions based on treatment data.
- the systems and methods described herein may be configured to provide treatment information to the computing device of the healthcare provider based on treatment data continuously and/or periodically received from the sensors or other suitable sources described herein. Additionally, or alternatively, the systems and methods described herein may be configured to continuously and/or periodically monitor, while the user uses the treatment device to perform the treatment plan, the characteristics of the user.
- the healthcare provider and/or the systems and methods described herein may receive and/or review, continuously or periodically, while the user uses the treatment device to perform the treatment plan, treatment information, treatment data, and/or predictions. Based on one or more trends indicated by the treatment information, treatment data, and/or predictions, the healthcare provider and/or the systems and methods described herein may determine whether to modify the treatment plan and/or to modify and/or control the one or more characteristics of the treatment device. For example, the one or more trends may indicate an increase in heartrate or other suitable trends indicating that the user is not performing the treatment plan properly and/or that performance of the treatment plan by the user is not having the desired effect.
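- One simple, assumed way to flag such a trend is sketched below; the heartrate limit and window length are illustrative values rather than disclosed parameters.
```python
# Illustrative trend check: flag the session for review/modification if recent
# heartrate readings rise monotonically past an assumed limit.
def heartrate_trend_flags_review(readings: list[float],
                                 limit: float = 120.0, window: int = 4) -> bool:
    recent = readings[-window:]
    rising = all(later > earlier for earlier, later in zip(recent, recent[1:]))
    return rising and recent[-1] > limit

if __name__ == "__main__":
    print(heartrate_trend_flags_review([101, 104, 109, 116, 123]))  # True: rising past the limit
    print(heartrate_trend_flags_review([101, 104, 103, 105, 104]))  # False: no sustained rise
```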
- the systems and methods described herein may be configured to use artificial intelligence and/or machine learning to assign patients to cohorts and to dynamically control a treatment device based on the assignment during an adaptive telemedicine session.
- one or more treatment devices may be provided to patients.
- the one or more treatment devices may be used by the patients to perform treatment plans in their residences, at a gym, at a rehabilitative center, at a hospital, at their work place, at a hotel, at a conference center, or in or at any suitable location, including permanent or temporary domiciles.
- the treatment devices may be communicatively coupled to a server.
- Characteristics of the patients, including the treatment data, may be collected before, during, and/or after the patients perform the treatment plans.
- the personal information, the performance information, and the measurement information may be collected before, during, and/or after the person performs the treatment plans.
- the results (e.g., improved performance or decreased performance) of performing each exercise may be collected from the treatment device throughout the treatment plan and after the treatment plan is performed.
- the parameters, settings, configurations, etc. (e.g., position of the pedal, amount of resistance, etc.) of the treatment device may also be collected before, during, and/or after the treatment plan is performed.
- Each characteristic of the patient, each result, and each parameter, setting, configuration, etc. may be timestamped and may be correlated with a particular step in the treatment plan. Such a technique may enable determining which steps in the treatment plan are more likely to lead to desired results (e.g., improved muscle strength, range of motion, etc.) and which steps are more likely to lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or harms recovery).
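- For illustration only, a timestamped record correlating collected values with a particular treatment plan step might look like the following sketch; the record layout and the sample values are assumptions.
```python
# Illustrative timestamped observation tied to one step of a treatment plan, so later
# analysis can ask which steps precede desired results or diminishing returns.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StepObservation:
    timestamp: datetime
    plan_step: str              # e.g., "step 2: pedal at resistance 4 for 10 minutes"
    device_settings: dict       # parameters/settings/configurations of the device
    measurements: dict          # measurement information captured at that time
    result: dict                # e.g., {"range_of_motion_deg": 78}

observation = StepObservation(
    timestamp=datetime.now(timezone.utc),
    plan_step="step 2: pedal at resistance 4 for 10 minutes",
    device_settings={"pedal_position": "standard", "resistance": 4},
    measurements={"heartrate": 102},
    result={"range_of_motion_deg": 78},
)
```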
- Data may be collected from the treatment devices and/or any suitable computing device (e.g., computing devices where personal information is entered, such as the interface of the computing device described herein, a clinician interface, patient interface, and the like) over time as the patients use the treatment devices to perform the various treatment plans.
- the data that may be collected may include the characteristics of the patients, the treatment plans performed by the patients, the results of the treatment plans, any of the data described herein, any other suitable data, or a combination thereof.
- the data may be processed to group certain people into cohorts.
- the people may be grouped by people having certain or selected similar characteristics, treatment plans, and results of performing the treatment plans. For example, athletic people having no medical conditions who perform a treatment plan (e.g., use the treatment device for 30 minutes a day 5 times a week for 3 weeks) and who fully recover may be grouped into a first cohort. Older people who are classified as obese and who perform a treatment plan (e.g., use the treatment device for 10 minutes a day 3 times a week for 4 weeks) and who improve their range of motion by 75 percent may be grouped into a second cohort.
- an artificial intelligence engine may include one or more machine learning models that are trained using the cohorts.
- the one or more machine learning models may be trained to receive an input of characteristics of a new patient and to output a treatment plan for the patient that results in a desired result.
- the machine learning models may match a pattern between the characteristics of the new patient and at least one patient of the patients included in a particular cohort. When a pattern is matched, the machine learning models may assign the new patient to the particular cohort and select the treatment plan associated with the at least one patient.
- the artificial intelligence engine may be configured to control, distally and based on the treatment plan, the treatment device while the new patient uses the treatment device to perform the treatment plan.
- the characteristics of the new patient may change as the new patient uses the treatment device to perform the treatment plan.
- the performance of the patient may improve more quickly than expected for people in the cohort to which the new patient is currently assigned.
- the machine learning models may be trained to dynamically reassign, based on the changed characteristics, the new patient to a different cohort that includes people having characteristics similar to the now-changed characteristics of the new patient. For example, a clinically obese patient may lose weight and no longer meet the weight criterion for the initial cohort, resulting in the patient being reassigned to a different cohort with a different weight criterion.
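- A minimal sketch of such dynamic reassignment is shown below, assuming, purely for illustration, that cohorts are keyed by a simple weight range; the cohort names and ranges are hypothetical.

```python
# Hypothetical cohort definitions keyed by inclusive weight ranges (kg).
COHORT_WEIGHT_RANGES = {"cohort_B": (95, 200), "cohort_C": (60, 94)}

def reassign_on_change(current_cohort, updated_characteristics):
    """Re-evaluate cohort membership whenever the patient's characteristics change."""
    weight = updated_characteristics["weight_kg"]
    lo, hi = COHORT_WEIGHT_RANGES[current_cohort]
    if lo <= weight <= hi:
        return current_cohort
    for cohort, (lo, hi) in COHORT_WEIGHT_RANGES.items():
        if lo <= weight <= hi:
            return cohort            # the patient no longer meets the old criterion
    return current_cohort

new_cohort = reassign_on_change("cohort_B", {"weight_kg": 88})  # -> "cohort_C"
```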
- a different treatment plan may be selected for the new patient, and the treatment device may be controlled, distally (e.g., which may be referred to as remotely) and based on the different treatment plan, while the new patient uses the treatment device to perform the treatment plan.
- Such techniques may provide the technical solution of distally controlling a treatment device.
- the systems and methods described herein may lead to faster recovery times and/or better results for the patients because the treatment plan that most accurately fits their characteristics is selected and implemented, in real-time, at any given moment. “Real-time” may also refer to near real-time, which may be less than 10 seconds. As described herein, the term “results” may refer to medical results or medical outcomes. Results and outcomes may refer to responses to medical actions.
- the artificial intelligence engine may be trained to output several treatment plans. For example, one result may include recovering to a threshold level (e.g., 75% range of motion) in a fastest amount of time, while another result may include fully recovering (e.g., 100% range of motion) regardless of the amount of time.
- the data obtained from the patients and sorted into cohorts may indicate that a first treatment plan provides the first result for people with characteristics similar to the patient's, and that a second treatment plan provides the second result for people with characteristics similar to the patient's.
- the artificial intelligence engine may be trained to output treatment plans that are not optimal, i.e., sub-optimal, nonstandard, or otherwise excluded (all referred to, without limitation, as "excluded treatment plans") for the patient. For example, if a patient has high blood pressure, a particular exercise may not be approved or suitable for the patient as it may put the patient at unnecessary risk or even induce a hypertensive crisis and, accordingly, that exercise may be flagged in the excluded treatment plan for the patient.
- the artificial intelligence engine may monitor the treatment data received while the patient (e.g., the user) with, for example, high blood pressure, uses the treatment device to perform an appropriate treatment plan and may modify the appropriate treatment plan to include features of an excluded treatment plan that may provide beneficial results for the patient if the treatment data indicates the patient is handling the appropriate treatment plan without aggravating, for example, the high blood pressure condition of the patient.
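- One way such monitoring-gated modification could be sketched is shown below; the blood pressure thresholds, field names, and `maybe_add_excluded_feature` helper are illustrative assumptions and not clinical guidance.

```python
def maybe_add_excluded_feature(treatment_data, plan, excluded_feature,
                               systolic_limit=140, diastolic_limit=90):
    """Add a feature of an excluded plan only if monitored blood pressure stays within limits.

    The limits here are illustrative placeholders, not clinical guidance.
    """
    recent = treatment_data[-10:]            # the last 10 monitored readings
    stable = all(r["systolic"] < systolic_limit and r["diastolic"] < diastolic_limit
                 for r in recent)
    if stable:
        plan = dict(plan, exercises=plan["exercises"] + [excluded_feature])
    return plan

readings = [{"systolic": 126, "diastolic": 82}] * 10
updated_plan = maybe_add_excluded_feature(readings,
                                          {"exercises": ["low-resistance pedaling"]},
                                          "interval resistance pedaling")
```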
- the treatment plans and/or excluded treatment plans may be presented, during a telemedicine or telehealth session, to a healthcare provider.
- the healthcare provider may select a particular treatment plan for the patient to cause that treatment plan to be transmitted to the patient and/or to control, based on the treatment plan, the treatment device.
- the artificial intelligence engine may receive and/or operate distally from the patient and the treatment device.
- the recommended treatment plans and/or excluded treatment plans may be presented simultaneously with a video of the patient in real-time or near real-time during a telemedicine or telehealth session on a user interface of a computing device of a healthcare provider.
- the video may also be accompanied by audio, text and other multimedia information.
- Real-time may refer to less than or equal to 2 seconds.
- Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface and will generally be less than 10 seconds but greater than 2 seconds.
- Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the healthcare provider may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface.
- the enhanced user interface may improve the healthcare provider’s experience using the computing device and may encourage the healthcare provider to reuse the user interface.
- Such a technique may also reduce computing resources (e.g., processing, memory, network) because the healthcare provider does not have to switch to another user interface screen to enter a query for a treatment plan to recommend based on the characteristics of the patient.
- the artificial intelligence engine may be configured to provide, dynamically on the fly, the treatment plans and excluded treatment plans.
- the treatment device may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient.
- the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user.
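- A minimal sketch of a control instruction that a server could transmit to adjust the pedals is shown below; the message fields and the `build_pedal_adjustment` helper are hypothetical and shown only to illustrate the idea of a programmed, on-the-fly adjustment.

```python
import json

def build_pedal_adjustment(session_id, pedal_radius_cm, resistance_level):
    """Compose a (hypothetical) control instruction the server could send to the device."""
    return json.dumps({
        "session_id": session_id,
        "command": "adjust_pedal",
        "pedal_radius_cm": pedal_radius_cm,   # smaller radius -> smaller range of motion
        "resistance_level": resistance_level,
    })

# Example: reduce the range of motion early in recovery, per the treatment plan.
instruction = build_pedal_adjustment("sess-001", pedal_radius_cm=12, resistance_level=3)
```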
- a healthcare provider may adapt, remotely during a telemedicine session, the treatment device to the needs of the patient by causing a control instruction to be transmitted from a server to the treatment device.
- Such an adaptive nature may improve the results of recovery for a patient, further the goals of personalized medicine, and enable personalization of the treatment plan on a per-individual basis.
- a technical problem may occur which relates to the information pertaining to the patient’s medical condition being received in disparate formats.
- a server may receive the information pertaining to a medical condition of the patient from one or more sources (e.g., from an electronic medical record (EMR) system, application programming interface (API), or any suitable system that has information pertaining to the medical condition of the patient).
- some embodiments of the present disclosure may use an API to obtain, via interfaces exposed by APIs used by the sources, the formats used by the sources.
- when information is received from the sources, the API may map, translate and/or convert the format used by the sources to a standardized format used by the artificial intelligence engine. Further, the information mapped, translated and/or converted to the standardized format used by the artificial intelligence engine may be stored in a database accessed by the artificial intelligence engine when performing any of the techniques disclosed herein. Using the information mapped, translated and/or converted to a standardized format may enable a more accurate determination of the procedures to perform for the patient and/or a billing sequence.
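- The snippet below sketches one way a source-specific record could be mapped onto a standardized format; the field mapping and payload are illustrative assumptions rather than an actual EMR schema.

```python
# Hypothetical mapping from one source's field names to the standardized format.
FIELD_MAP_SOURCE_A = {"pt_name": "patient_name", "dx": "diagnosis", "bp_sys": "systolic"}

def standardize(payload, field_map):
    """Translate a source-specific record into the engine's standardized format."""
    standardized = {}
    for source_field, value in payload.items():
        target = field_map.get(source_field)
        if target is not None:
            standardized[target] = value
    return standardized

record = standardize({"pt_name": "J. Doe", "dx": "knee arthroplasty", "bp_sys": 128},
                     FIELD_MAP_SOURCE_A)
```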
- the standardized information may enable generating treatment plans and/or billing sequences having a particular format that can be processed by various applications (e.g., telehealth).
- the applications may be provided by a server and may be configured to process data according to a format in which the treatment plans and the billing sequences are implemented.
- the disclosed embodiments may provide a technical solution by (i) receiving, from various sources (e.g., EMR systems), information in non-standardized and/or different formats; (ii) standardizing the information; and (iii) generating, based on the standardized information, treatment plans and billing sequences having standardized formats capable of being processed by applications (e.g., telehealth applications) executing on computing devices of medical professionals and/or patients.
- FIG. 23 generally illustrates a block diagram of a computer-implemented system 3010, hereinafter called "the system," for managing a treatment plan.
- Managing the treatment plan may include using an artificial intelligence engine to recommend treatment plans and/or provide excluded treatment plans that should not be recommended to a patient.
- the system 3010 also includes a server 3030 configured to store (e.g., write to an associated memory) and to provide data related to managing the treatment plan.
- the server 3030 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers.
- the server 3030 also includes a first communication interface 3032 configured to communicate with (e.g., or over) the clinician interface 3020 via a first network 3034.
- the first network 3034 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the server 3030 includes a first processor 3036 and a first machine-readable storage memory 3038, which may be called a “memory” for short, holding first instructions 3040 for performing the various actions of the server 3030 for execution by the first processor 3036.
- the server 3030 is configured to store data regarding the treatment plan.
- the memory 3038 includes a system data store 3042 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients.
- the server 3030 is also configured to store data regarding performance by a patient in following a treatment plan.
- the memory 3038 includes a patient data store 3044 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient’s performance within the treatment plan.
- correlations and other statistical or probabilistic measures involving the characteristics (e.g., personal, performance, measurement, etc.) of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may be used to enable the partitioning of, or to partition, the treatment plans into different patient cohort-equivalent databases in the patient data store 3044.
- the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patient, and a first result of the treatment plan may be stored in a first patient database.
- the data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patient, and a second result of the treatment plan may be stored in a second patient database. Any single characteristic or any combination of characteristics may be used to separate the cohorts of patients.
- the different cohorts of patients may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different cohorts of patients allowed, other than as limited by mathematical combinatoric and/or partition theory.
- This characteristic data, treatment plan data, and results data may be obtained from numerous treatment devices and/or computing devices over time and stored in the database 3044.
- the characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 3044.
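- A minimal sketch of storing cohort records in a queryable store, keyed by a cohort identifier, is shown below; the table layout and sample rows are assumptions used only to illustrate cohort-equivalent partitioning, and a production system might instead use separate databases, partitions, or volumes as described above.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patient_data (
    cohort TEXT, injury TEXT, plan TEXT, result_pct REAL)""")

# Each record carries its cohort key so cohort-equivalent data can be queried separately.
conn.execute("INSERT INTO patient_data VALUES (?, ?, ?, ?)",
             ("cohort_A", "ACL tear", "30 min x 5/wk x 3 wk", 100.0))
conn.execute("INSERT INTO patient_data VALUES (?, ?, ?, ?)",
             ("cohort_B", "knee replacement", "10 min x 3/wk x 4 wk", 75.0))

cohort_a_rows = conn.execute("SELECT * FROM patient_data WHERE cohort = ?",
                             ("cohort_A",)).fetchall()
```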
- the characteristics of the people may include personal information, performance information, and/or measurement information.
- characteristics about a current patient being treated may be stored in an appropriate patient cohort-equivalent database.
- the characteristics of the patient may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A) and the patient may be assigned to that cohort.
- the server 3030 may execute an artificial intelligence (AI) engine 3011 that uses one or more machine learning models 3013 to perform at least one of the embodiments disclosed herein.
- the server 3030 may include a training engine 309 capable of generating the one or more machine learning models 3013.
- the machine learning models 3013 may be trained to assign people to certain cohorts based on their characteristics, select treatment plans using real-time and historical data correlations involving patient cohort-equivalents, and control a treatment device 3070, among other things.
- the one or more machine learning models 3013 may be generated by the training engine 309 and may be implemented in computer instructions executable by one or more processing devices of the training engine 309 and/or the servers 3030. To generate the one or more machine learning models 3013, the training engine 309 may train the one or more machine learning models 3013. The one or more machine learning models 3013 may be used by the artificial intelligence engine 3011.
- the training engine 309 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other suitable computing device, or a combination thereof.
- the training engine 309 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
- the training engine 309 may use a training data set of a corpus of the characteristics of the people that used the treatment device 3070 to perform treatment plans, the details (e.g., treatment protocol including exercises, amount of time to perform the exercises, how often to perform the exercises, a schedule of exercises, parameters/configurations/settings of the treatment device 3070 throughout each step of the treatment plan, etc.) of the treatment plans performed by the people using the treatment device 3070, and the results of the treatment plans performed by the people.
- the one or more machine learning models 3013 may be trained to match patterns of characteristics of a patient with characteristics of other people assigned to a particular cohort.
- the term “match” may refer to an exact match, a correlative match, a substantial match, etc.
- the one or more machine learning models 3013 may be trained to receive the characteristics of a patient as input, map the characteristics to characteristics of people assigned to a cohort, and select a treatment plan from that cohort.
- the one or more machine learning models 3013 may also be trained to control, based on the treatment plan, the treatment device 3070.
- Different machine learning models 3013 may be trained to recommend different treatment plans for different desired results. For example, one machine learning model may be trained to recommend treatment plans for most effective recovery, while another machine learning model may be trained to recommend treatment plans based on speed of recovery.
- the one or more machine learning models 3013 may refer to model artifacts created by the training engine 309.
- the training engine 309 may find patterns in the training data wherein such patterns map the training input to the target output and generate the machine learning models 3013 that capture these patterns.
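- As an illustrative sketch only, the snippet below trains a simple nearest-neighbor classifier to map patient characteristics to a cohort label; the features, labels, and choice of model are assumptions and do not represent the actual training engine 309.

```python
from sklearn.neighbors import KNeighborsClassifier

# Toy training corpus: [age, BMI, baseline range of motion in degrees] -> cohort label.
X_train = [[27, 22, 110], [31, 24, 105], [71, 33, 60], [68, 31, 65]]
y_train = ["cohort_A", "cohort_A", "cohort_B", "cohort_B"]

model = KNeighborsClassifier(n_neighbors=1)
model.fit(X_train, y_train)

# A new patient's characteristics are matched to the nearest cohort.
predicted_cohort = model.predict([[65, 32, 62]])[0]   # -> "cohort_B"
```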
- the artificial intelligence engine 3011 and/or the training engine 309 may reside on another component (e.g., assistant interface 3094, clinician interface 3020, etc.) depicted in FIG. 23.
- the one or more machine learning models 3013 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 3013 may be a deep network, i.e., a machine learning model comprising more than one level (e.g., multiple levels) of non-linear operations.
- deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself).
- the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
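- Purely to illustrate the kind of calculation such layers perform, the snippet below computes a single fully connected layer as a dot product followed by a non-linearity; the input values and weights are arbitrary.

```python
import numpy as np

def dense_layer(inputs, weights, bias):
    """One fully connected layer: a dot product followed by a ReLU non-linearity."""
    return np.maximum(0.0, inputs @ weights + bias)

x = np.array([0.5, 1.2, -0.3])                         # e.g., normalized patient characteristics
w = np.random.default_rng(0).normal(size=(3, 4))       # weights of one hidden layer
b = np.zeros(4)
hidden = dense_layer(x, w, b)                          # activations passed to the next layer
```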
- the system 3010 also includes a patient interface 3050 configured to communicate information to a patient and to receive feedback from the patient.
- the patient interface includes an input device 3052 and an output device 3054, which may be collectively called a patient user interface 3052, 3054.
- the input device 3052 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition.
- the output device 3054 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch.
- the output device 3054 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc.
- the output device 3054 may incorporate various different visual, audio, or other presentation technologies.
- the output device 3054 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions.
- the output device 3054 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient.
- the output device 3054 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the patient interface 3050 includes a second communication interface 3056, which may also be called a remote communication interface configured to communicate with the server 3030 and/or the clinician interface 3020 via a second network 3058.
- the second network 3058 may include a local area network (LAN), such as an Ethernet network.
- the second network 3058 may include the Internet, and communications between the patient interface 3050 and the server 3030 and/or the clinician interface 3020 may be secured via encryption, such as, for example, by using a virtual private network (VPN).
- the second network 3058 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. In some embodiments, the second network 3058 may be the same as and/or operationally coupled to the first network 3034.
- the patient interface 3050 includes a second processor 3060 and a second machine-readable storage memory 3062 holding second instructions 3064 for execution by the second processor 3060 for performing various actions of patient interface 3050.
- the second machine-readable storage memory 3062 also includes a local data store 3066 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient’s performance within a treatment plan.
- the patient interface 3050 also includes a local communication interface 3068 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 3050.
- the local communication interface 3068 may include wired and/or wireless communications.
- the local communication interface 3068 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the system 3010 also includes a treatment device 3070 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan.
- the treatment device 3070 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group.
- the treatment device 3070 may be any suitable medical, rehabilitative, therapeutic, etc. apparatus configured to be controlled distally via another computing device to treat a patient and/or exercise the patient.
- the treatment device 3070 may be an electromechanical machine including one or more weights, an electromechanical bicycle, an electromechanical spin-wheel, a smart-mirror, a treadmill, or the like.
- the body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder.
- the body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament.
- the treatment device 3070 includes a controller 3072, which may include one or more processors, computer memory, and/or other components.
- the treatment device 3070 also includes a fourth communication interface 3074 configured to communicate with (e.g., or over) the patient interface 3050 via the local communication interface 3068.
- the treatment device 3070 also includes one or more internal sensors 3076 and an actuator 3078, such as a motor.
- the actuator 3078 may be used, for example, for moving the patient’s body part and/or for resisting forces by the patient.
- the internal sensors 3076 may measure one or more operating characteristics of the treatment device 3070 such as, for example, a force, a position, a speed, a velocity, and/or an acceleration.
- the internal sensors 3076 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient.
- an internal sensor 3076 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment device 3070, where such distance may correspond to a range of motion that the patient’s body part is able to achieve.
- the internal sensors 3076 may include a force sensor configured to measure a force applied by the patient.
- an internal sensor 3076 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment device 3070.
- the system 3010 generally illustrated in FIG. 23 also includes an ambulation sensor 3082, which communicates with the server 3030 via the local communication interface 3068 of the patient interface 3050.
- the ambulation sensor 3082 may track and store a number of steps taken by the patient.
- the ambulation sensor 3082 may take the form of a wristband, wristwatch, or smart watch.
- the ambulation sensor 3082 may be integrated within a phone, such as a smartphone.
- the system 3010 generally illustrated in FIG. 23 also includes a goniometer 3084, which communicates with the server 3030 via the local communication interface 3068 of the patient interface 3050.
- the goniometer 3084 measures an angle of the patient’s body part.
- the goniometer 3084 may measure the angle of flex of a patient’s knee or elbow or shoulder.
- the system 3010 may also include one or more additional sensors (not shown) which communicate with the server 3030 via the local communication interface 3068 of the patient interface 3050.
- the one or more additional sensors can measure other patient parameters such as a heartrate, a temperature, a blood pressure, a glucose level, the level of another biomarker, one or more vital signs, and the like.
- the one or more additional sensors may be optical sensors that detect the reflection of near-infrared light from circulating blood below the level of the skin.
- Optical sensors may take the form of a wristband, wristwatch, or smartwatch and measure a glucose level, a heartrate, a blood oxygen saturation level, one or more vital signs, and the like.
- the one or more additional sensors may be located in a room or physical space in which the treatment device 3070 is being used, inside the patient's body, disposed on the person's body (e.g., skin patch), or included in the treatment device 3070, and the one or more additional sensors may measure various vital signs or other diagnostically-relevant attributes (e.g., heartrate, perspiration rate, temperature, blood pressure, oxygen levels, any suitable vital sign, glucose level, a level of another biomarker, etc.).
- the one or more additional sensors may transmit the measurements of the patient to the server 3030 for analysis and processing (e.g., to be used to modify, based on the measurements, at least the treatment plan for the patient).
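- A minimal sketch of transmitting a sensor measurement to the server is shown below; the endpoint URL, payload fields, and `send_measurement` helper are hypothetical.

```python
import json
from urllib import request

def send_measurement(server_url, patient_id, measurements):
    """POST a sensor reading to the server for analysis (hypothetical endpoint)."""
    body = json.dumps({"patient_id": patient_id, "measurements": measurements}).encode()
    req = request.Request(server_url, data=body,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)   # the server may modify the treatment plan in response

# Example payload: heart rate and blood pressure captured during a session.
# send_measurement("https://example.invalid/api/measurements", "patient-42",
#                  {"heart_rate": 96, "systolic": 124, "diastolic": 81})
```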
- the system 3010 generally illustrated in FIG. 23 also includes a pressure sensor 3086, which communicates with the server 3030 via the local communication interface 3068 of the patient interface 3050.
- the pressure sensor 3086 measures an amount of pressure or weight applied by a body part of the patient.
- the pressure sensor 3086 may measure an amount of force applied by a patient's foot when pedaling a stationary bike.
- the system 3010 generally illustrated in FIG. 23 also includes a supervisory interface 3090 which may be similar or identical to the clinician interface 3020.
- the supervisory interface 3090 may have enhanced functionality beyond what is provided on the clinician interface 3020.
- the supervisory interface 3090 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
- the system 3010 generally illustrated in FIG. 23 also includes a reporting interface 3092 which may be similar or identical to the clinician interface 3020.
- the reporting interface 3092 may have less functionality than what is provided on the clinician interface 3020.
- the reporting interface 3092 may not have the ability to modify a treatment plan.
- Such a reporting interface 3092 may be used, for example, by a biller to determine the use of the system 3010 for billing purposes.
- the reporting interface 3092 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject.
- Such a reporting interface 3092 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
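- The snippet below sketches one way identifying fields could be pseudonymized before presentation on such a reporting interface; the field names and hashing scheme are illustrative assumptions, not a complete privacy-enhancing technology.

```python
import hashlib

def pseudonymize(record, identifying_fields=("name", "email")):
    """Replace identifying fields with stable pseudonyms (a salted hash would be used in practice)."""
    out = dict(record)
    for field_name in identifying_fields:
        if field_name in out:
            digest = hashlib.sha256(str(out[field_name]).encode()).hexdigest()[:12]
            out[field_name] = f"subject-{digest}"
    return out

# Only non-identifying analytics fields remain readable for the researcher or biller.
safe_row = pseudonymize({"name": "Jane Doe", "age_band": "60-69", "rom_gain_pct": 75})
```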
- the system 3010 includes an assistant interface 3094 for a healthcare provider, such as those described herein, to remotely communicate with (e.g., or over or across) the patient interface 3050 and/or the treatment device 3070. Such remote communications may enable the healthcare provider to provide assistance or guidance to a patient using the system 3010. More specifically, the assistant interface 3094 is configured to communicate a telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b with the patient interface 3050 via a network connection such as, for example, via the first network 3034 and/or the second network 3058.
- the telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b comprises one of an audio signal 3096, an audiovisual signal 3097, an interface control signal 3098a for controlling a function of the patient interface 3050, an interface monitor signal 3098b for monitoring a status of the patient interface 3050, an apparatus control signal 3099a for changing an operating parameter of the treatment device 3070, and/or an apparatus monitor signal 3099b for monitoring a status of the treatment device 3070.
- each of the control signals 3098a, 3099a may be unidirectional, conveying commands from the assistant interface 3094 to the patient interface 3050.
- an acknowledgement message may be sent from the patient interface 3050 to the assistant interface 3094.
- each of the monitor signals 3098b, 3099b may be unidirectional, status-information commands from the patient interface 3050 to the assistant interface 3094.
- an acknowledgement message may be sent from the assistant interface 3094 to the patient interface 3050 in response to successfully receiving one of the monitor signals 3098b, 3099b.
- the patient interface 3050 may be configured as a pass-through for the apparatus control signals 3099a and the apparatus monitor signals 3099b between the treatment device 3070 and one or more other devices, such as the assistant interface 3094 and/or the server 3030.
- the patient interface 3050 may be configured to transmit an apparatus control signal 3099a to the treatment device 3070 in response to receiving an apparatus control signal 3099a within the telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b from the assistant interface 3094.
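- A minimal sketch of this pass-through behavior is shown below, assuming the telemedicine signal is represented as a simple mapping; the `relay_apparatus_control` helper and field names are hypothetical.

```python
def relay_apparatus_control(telemedicine_signal, send_to_device):
    """Pass an apparatus control signal through the patient interface unchanged."""
    apparatus_signal = telemedicine_signal.get("apparatus_control")
    if apparatus_signal is not None:
        send_to_device(apparatus_signal)    # forwarded on to the treatment device
        return {"ack": True}                # optional acknowledgement back to the assistant
    return {"ack": False}

# Example: the assistant's signal carries a resistance change for the treatment device.
relay_apparatus_control({"apparatus_control": {"resistance_level": 5}},
                        send_to_device=lambda signal: None)
```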
- the assistant interface 3094 may be presented on the same physical device as the clinician interface 3020.
- the clinician interface 3020 may include one or more screens that implement the assistant interface 3094.
- the clinician interface 3020 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 3094.
- one or more portions of the telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 3054 of the patient interface 3050.
- a tutorial video may be streamed from the server 3030 and presented upon the patient interface 3050.
- Content from the prerecorded source may be requested by the patient via the patient interface 3050.
- the healthcare provider may cause content from the prerecorded source to be played on the patient interface 3050.
- the assistant interface 3094 includes an assistant input device 3022 and an assistant display 3024, which may be collectively called an assistant user interface 3022, 3024.
- the assistant input device 3022 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example.
- the assistant input device 3022 may include one or more microphones.
- the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the healthcare provider to speak to a patient via the patient interface 3050.
- assistant input device 3022 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the healthcare provider by using the one or more microphones.
- the assistant input device 3022 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung.
- the assistant input device 3022 may include other hardware and/or software components.
- the assistant input device 3022 may include one or more general purpose devices and/or special- purpose devices.
- the assistant display 3024 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch.
- the assistant display 3024 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc.
- the assistant display 3024 may incorporate various different visual, audio, or other presentation technologies.
- the assistant display 3024 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions.
- the assistant display 3024 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the healthcare provider.
- the assistant display 3024 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the system 3010 may provide computer translation of language from the assistant interface 3094 to the patient interface 3050 and/or vice-versa.
- the computer translation of language may include computer translation of spoken language and/or computer translation of text.
- the system 3010 may provide voice recognition and/or spoken pronunciation of text.
- the system 3010 may convert spoken words to printed text and/or the system 3010 may audibly speak language from printed text.
- the system 3010 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the healthcare provider.
- the system 3010 may be configured to recognize and react to spoken requests or commands by the patient.
- the system 3010 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
- the server 3030 may generate aspects of the assistant display 3024 for presentation by the assistant interface 3094.
- the server 3030 may include a web server configured to generate the display screens for presentation upon the assistant display 3024.
- the artificial intelligence engine 3011 may generate recommended treatment plans and/or excluded treatment plans for patients and generate the display screens including those recommended treatment plans and/or excluded treatment plans for presentation on the assistant display 3024 of the assistant interface 3094.
- the assistant display 3024 may be configured to present a virtualized desktop hosted by the server 3030.
- the server 3030 may be configured to communicate with (e.g., or over) the assistant interface 3094 via the first network 3034.
- the first network 3034 may include a local area network (LAN), such as an Ethernet network.
- the first network 3034 may include the Internet, and communications between the server 3030 and the assistant interface 3094 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN).
- the server 3030 may be configured to communicate with (e.g., or over or across) the assistant interface 3094 via one or more networks independent of the first network 3034 and/or other communication means, such as a direct wired or wireless communication channel.
- the patient interface 3050 and the treatment device 3070 may each operate from a patient location geographically separate from a location of the assistant interface 3094.
- the patient interface 3050 and the treatment device 3070 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 3094 at a centralized location, such as a clinic or a call center.
- the assistant interface 3094 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians' offices. In some embodiments, a plurality of assistant interfaces 3094 may be distributed geographically. In some embodiments, a person may work as a healthcare provider remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 3094 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part time and/or flexible work hours for a healthcare provider. [0521] FIGS. 24-25 show an embodiment of a treatment device 3070. More specifically, FIG. 24 generally illustrates the treatment device 3070 in the form of a stationary cycling machine 3100.
- the stationary cycling machine 3100 includes a set of pedals 3102 each attached to a pedal arm 3104 for rotation about an axle 3106.
- the pedals 3102 are movable on the pedal arms 3104 in order to adjust a range of motion used by the patient in pedaling.
- the pedals being located inwardly toward the axle 3106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 3106.
- the pedals may be adjustable inward and outward from the plane of rotation.
- a pressure sensor 3086 is attached to or embedded within one of the pedals 3102 for measuring an amount of force applied by the patient on the pedal 3102.
- the pressure sensor 3086 may communicate wirelessly to the treatment device 3070 and/or to the patient interface 3050.
- FIG. 26 generally illustrates a person (a patient) using the treatment device of FIG. 24, and showing sensors and various data parameters connected to a patient interface 3050.
- the example patient interface 3050 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient.
- the patient interface 3050 may be embedded within or attached to the treatment device 3070.
- FIG. 26 generally illustrates the patient wearing the ambulation sensor 3082 on his wrist, with a note showing “STEPS TODAY 31355”, indicating that the ambulation sensor 3082 has recorded and transmitted that step count to the patient interface 3050.
- FIG. 26 also generally illustrates the patient wearing the goniometer 3084 on his right knee, with a note showing “KNEE ANGLE 72°”, indicating that the goniometer 3084 is measuring and transmitting that knee angle to the patient interface 3050.
- FIG. 26 also generally illustrates a right side of one of the pedals 3102 with a pressure sensor 3086 showing "FORCE 12.5 lbs.," indicating that the right pedal pressure sensor 3086 is measuring and transmitting that force measurement to the patient interface 3050.
- FIG. 26 also generally illustrates a left side of one of the pedals 3102 with a pressure sensor 3086 showing "FORCE 27 lbs.", indicating that the left pedal pressure sensor 3086 is measuring and transmitting that force measurement to the patient interface 3050.
- FIG. 26 also generally illustrates other patient data, such as an indicator of "SESSION TIME 0:04:13", indicating that the patient has been using the treatment device 3070 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 3050 based on information received from the treatment device 3070.
- FIG. 26 also generally illustrates an indicator showing "PAIN LEVEL 3". Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 3050.
- FIG. 27 is an example embodiment of an overview display 3120 of the assistant interface 3094.
- the overview display 3120 presents several different controls and interfaces for the healthcare provider to remotely assist a patient with using the patient interface 3050 and/or the treatment device 3070.
- This remote assistance functionality may also be called telemedicine or telehealth.
- the overview display 3120 includes a patient profile display 3130 presenting biographical information regarding a patient using the treatment device 3070.
- the patient profile display 3130 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27, although the patient profile display 3130 may take other forms, such as a separate screen or a popup window.
- the patient profile display 3130 may include a limited subset of the patient’s biographical information. More specifically, the data presented upon the patient profile display 3130 may depend upon the healthcare provider’s need for that information.
- a healthcare provider that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment device 3070 may be provided with a much more limited set of information regarding the patient.
- the technician, for example, may be given only the patient's name.
- the patient profile display 3130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements.
- privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a “data subject”.
- the patient profile display 3130 may present information regarding the treatment plan for the patient to follow in using the treatment device 3070.
- Such treatment plan information may be limited to a healthcare provider.
- a healthcare provider assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment device 3070 may not be provided with any information regarding the patient’s treatment plan.
- one or more recommended treatment plans and/or excluded treatment plans may be presented in the patient profile display 3130 to the healthcare provider.
- the one or more recommended treatment plans and/or excluded treatment plans may be generated by the artificial intelligence engine 3011 of the server 3030 and received from the server 3030 in real-time during, inter alia, a telemedicine or telehealth session.
- An example of presenting the one or more recommended treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 29.
- the example overview display 3120 generally illustrated in FIG. 27 also includes a patient status display 3134 presenting status information regarding a patient using the treatment device.
- the patient status display 3134 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27, although the patient status display 3134 may take other forms, such as a separate screen or a popup window.
- the patient status display 3134 includes sensor data 3136 from one or more of the external sensors 3082, 3084, 3086, and/or from one or more internal sensors 3076 of the treatment device 3070 and/or one or more additional sensors (not shown) as has been previously described herein.
- the patient status display 3134 may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 3070.
- the one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like.
- the one or more wearable devices may be configured to monitor a heartrate, a temperature, a blood pressure, a glucose level, a blood oxygen saturation level, one or more vital signs, and the like of the patient while the patient is using the treatment device 3070.
- the patient status display 3134 may present other data 3138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
- User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 3020, 3050, 3090, 3092, 3094 of the system 3010.
- user access controls may be employed to control what information is available to any given person using the system 3010.
- data presented on the assistant interface 3094 may be controlled by user access controls, with permissions set depending on the healthcare provider/user’s need for and/or qualifications to view that information.
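- The snippet below sketches one way such role-based access control could limit the fields shown on a given interface; the roles and permitted fields are illustrative assumptions.

```python
# Hypothetical role-to-field permissions for data shown on an interface.
VIEWABLE_FIELDS = {
    "healthcare_provider": {"name", "medical_history", "treatment_plan", "sensor_data"},
    "technician": {"name", "device_status"},
    "biller": {"session_count", "session_minutes"},
}

def filter_for_role(record, role):
    """Return only the fields the user's role is permitted to view."""
    allowed = VIEWABLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

# A technician troubleshooting the device sees only the name and device status.
technician_view = filter_for_role({"name": "Jane Doe", "medical_history": "...",
                                   "device_status": "online"}, "technician")
```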
- the example overview display 3120 generally illustrated in FIG. 27 also includes a help data display 3140 presenting information for the healthcare provider to use in assisting the patient.
- the help data display 3140 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
- the help data display 3140 may take other forms, such as a separate screen or a popup window.
- the help data display 3140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 3050 and/or the treatment device 3070.
- the help data display 3140 may also include research data or best practices. In some embodiments, the help data display 3140 may present scripts for answers or explanations in response to patient questions. In some embodiments, the help data display 3140 may present flow charts or walk-throughs for the healthcare provider to use in determining a root cause and/or solution to a patient’s problem.
- the assistant interface 3094 may present two or more help data displays 3140, which may be the same or different, for simultaneous presentation of help data for use by the healthcare provider.
- a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient's problem.
- a second help data display may present script information for the healthcare provider to read to the patient, such information preferably including directions for the patient to perform some action, which may help to narrow down or solve the problem.
- the second help data display may automatically populate with script information.
- the example overview display 3120 generally illustrated in FIG. 27 also includes a patient interface control 3150 presenting information regarding the patient interface 3050, and/or to modify one or more settings of the patient interface 3050.
- the patient interface control 3150 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
- the patient interface control 3150 may take other forms, such as a separate screen or a popup window.
- the patient interface control 3150 may present information communicated to the assistant interface 3094 via one or more of the interface monitor signals 3098b.
- the patient interface control 3150 includes a display feed 3152 of the display presented by the patient interface 3050.
- the display feed 3152 may include a live copy of the display screen currently being presented to the patient by the patient interface 3050.
- the display feed 3152 may present an image of what is presented on a display screen of the patient interface 3050.
- the display feed 3152 may include abbreviated information regarding the display screen currently being presented by the patient interface 3050, such as a screen name or a screen number.
- the patient interface control 3150 may include a patient interface setting control 3154 for the healthcare provider to adjust or to control one or more settings or aspects of the patient interface 3050.
- the patient interface setting control 3154 may cause the assistant interface 3094 to generate and/or to transmit an interface control signal 3098 for controlling a function or a setting of the patient interface 3050.
- the patient interface setting control 3154 may include collaborative browsing or co-browsing capability for the healthcare provider to remotely view and/or to control the patient interface 3050.
- the patient interface setting control 3154 may enable the healthcare provider to remotely enter text to one or more text entry fields on the patient interface 3050 and/or to remotely control a cursor on the patient interface 3050 using a mouse or touchscreen of the assistant interface 3094.
- the patient interface setting control 3154 may allow the healthcare provider to change a setting that cannot be changed by the patient.
- the patient interface 3050 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 3050, the language used for the displays, whereas the patient interface setting control 3154 may enable the healthcare provider to change the language setting of the patient interface 3050.
- the patient interface 3050 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 3050 such that the display would become illegible to the patient, whereas the patient interface setting control 3154 may provide for the healthcare provider to change the font size setting of the patient interface 3050.
- the example overview display 3120 generally illustrated in FIG. 27 also includes an interface communications display 3156 showing the status of communications between the patient interface 3050 and one or more other devices 3070, 3082, 3084, such as the treatment device 3070, the ambulation sensor 3082, and/or the goniometer 3084.
- the interface communications display 3156 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
- the interface communications display 3156 may take other forms, such as a separate screen or a popup window.
- the interface communications display 3156 may include controls for the healthcare provider to remotely modify communications with one or more of the other devices 3070, 3082, 3084.
- the healthcare provider may remotely command the patient interface 3050 to reset communications with one of the other devices 3070, 3082, 3084, or to establish communications with a new one of the other devices 3070, 3082, 3084.
- This functionality may be used, for example, where the patient has a problem with one of the other devices 3070, 3082, 3084, or where the patient receives a new or a replacement one of the other devices 3070, 3082, 3084.
- the example overview display 3120 generally illustrated in FIG. 27 also includes an apparatus control 3160 for the healthcare provider to view and/or to control information regarding the treatment device 3070.
- the apparatus control 3160 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
- the apparatus control 3160 may take other forms, such as a separate screen or a popup window.
- the apparatus control 3160 may include an apparatus status display 3162 with information regarding the current status of the apparatus.
- the apparatus status display 3162 may present information communicated to the assistant interface 3094 via one or more of the apparatus monitor signals 3099b.
- the apparatus status display 3162 may indicate whether the treatment device 3070 is currently communicating with the patient interface 3050.
- the apparatus status display 3162 may present other current and/or historical information regarding the status of the treatment device 3070.
- the apparatus control 3160 may include an apparatus setting control 3164 for the healthcare provider to adjust or control one or more aspects of the treatment device 3070.
- the apparatus setting control 3164 may cause the assistant interface 3094 to generate and/or to transmit an apparatus control signal 3099 (e.g., which may be referred to as treatment plan input, as described) for changing an operating parameter and/or one or more characteristics of the treatment device 3070 (e.g., a pedal radius setting, a resistance setting, a target RPM, other suitable characteristics of the treatment device 3070, or a combination thereof).
- the apparatus setting control 3164 may include a mode button 3166 and a position control 3168, which may be used in conjunction for the healthcare provider to place an actuator 3078 of the treatment device 3070 in a manual mode, after which a setting, such as a position or a speed of the actuator 3078, can be changed using the position control 3168.
- the mode button 3166 may provide for a setting, such as a position, to be toggled between automatic and manual modes.
- one or more settings may be adjustable at any time, and without having an associated auto/manual mode.
- the healthcare provider may change an operating parameter of the treatment device 3070, such as a pedal radius setting, while the patient is actively using the treatment device 3070. Such “on the fly” adjustment may or may not be available to the patient using the patient interface 3050.
- the apparatus setting control 3164 may allow the healthcare provider to change a setting that cannot be changed by the patient using the patient interface 3050.
- the patient interface 3050 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment device 3070, whereas the apparatus setting control 3164 may provide for the healthcare provider to change the height or tilt setting of the treatment device 3070.
- the example overview display 3120 generally illustrated in FIG. 27 also includes a patient communications control 3170 for controlling an audio or an audiovisual communications session with the patient interface 3050.
- the communications session with the patient interface 3050 may comprise a live feed from the assistant interface 3094 for presentation by the output device of the patient interface 3050.
- the live feed may take the form of an audio feed and/or a video feed.
- the patient interface 3050 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 3094.
- the communications session with the patient interface 3050 may include bidirectional (two- way) video or audiovisual feeds, with each of the patient interface 3050 and the assistant interface 3094 presenting video of the other one.
- the patient interface 3050 may present video from the assistant interface 3094, while the assistant interface 3094 presents only audio or the assistant interface 3094 presents no live audio or visual signal from the patient interface 3050.
- the assistant interface 3094 may present video from the patient interface 3050, while the patient interface 3050 presents only audio or the patient interface 3050 presents no live audio or visual signal from the assistant interface 3094.
- the audio or an audiovisual communications session with the patient interface 3050 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part.
- the patient communications control 3170 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
- the patient communications control 3170 may take other forms, such as a separate screen or a popup window.
- the audio and/or audiovisual communications may be processed and/or directed by the assistant interface 3094 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the healthcare provider while the healthcare provider uses the assistant interface 3094.
- the audio and/or audiovisual communications may include communications with a third party.
- the system 3010 may enable the healthcare provider to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a healthcare provider or a specialist.
- the example patient communications control 3170 generally illustrated in FIG. 27 includes call controls 3172 for the healthcare provider to use in managing various aspects of the audio or audiovisual communications with the patient.
- the call controls 3172 include a disconnect button 3174 for the healthcare provider to end the audio or audiovisual communications session.
- the call controls 3172 also include a mute button 3176 to temporarily silence an audio or audiovisual signal from the assistant interface 3094.
- the call controls 3172 may include other features, such as a hold button (not shown).
- the call controls 3172 also include one or more record/playback controls 3178, such as record, play, and pause buttons to control, with the patient interface 3050, recording and/or playback of audio and/or video from the teleconference session.
- the call controls 3172 also include a video feed display 3180 for presenting still and/or video images from the patient interface 3050, and a self-video display 3182 showing the current image of the healthcare provider using the assistant interface 3094.
- the self-video display 3182 may be presented as a picture-in-picture format, within a section of the video feed display 3180, as is generally illustrated in FIG. 27. Alternatively or additionally, the self-video display 3182 may be presented separately and/or independently from the video feed display 3180.
- the example overview display 3120 generally illustrated in FIG. 27 also includes a third party communications control 3190 for use in conducting audio and/or audiovisual communications with a third party.
- the third party communications control 3190 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
- the third party communications control 3190 may take other forms, such as a display on a separate screen or a popup window.
- the third party communications control 3190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a healthcare provider or a specialist.
- the third party communications control 3190 may include conference calling capability for the third party to simultaneously communicate with both the healthcare provider via the assistant interface 3094, and with the patient via the patient interface 3050.
- the system 3010 may provide for the healthcare provider to initiate a 3-way conversation with the patient and the third party.
- FIG. 28 generally illustrates an example block diagram of training a machine learning model 3013 to output, based on data 3600 pertaining to the patient, a treatment plan 3602 for the patient according to the present disclosure.
- Data pertaining to other patients may be received by the server 3030.
- the other patients may have used various treatment devices to perform treatment plans.
- the data may include characteristics of the other patients, the details of the treatment plans performed by the other patients, and/or the results of performing the treatment plans (e.g., a percent of recovery of a portion of the patients’ bodies, an amount of recovery of a portion of the patients’ bodies, an amount of increase or decrease in muscle strength of a portion of patients’ bodies, an amount of increase or decrease in range of motion of a portion of patients’ bodies, etc.).
- Cohort A includes data for patients having similar first characteristics, first treatment plans, and first results.
- Cohort B includes data for patients having similar second characteristics, second treatment plans, and second results.
- cohort A may include first characteristics of patients in their twenties without any medical conditions who underwent surgery for a broken limb; their treatment plans may include a certain treatment protocol (e.g., use the treatment device 3070 for 30 minutes 5 times a week for 3 weeks, wherein values for the properties, configurations, and/or settings of the treatment device 3070 are set to X (where X is a numerical value) for the first two weeks and to Y (where Y is a numerical value) for the last week).
- Cohort A and cohort B may be included in a training dataset used to train the machine learning model 3013.
- the machine learning model 3013 may be trained to match a pattern between characteristics for each cohort and output the treatment plan that provides the result. Accordingly, when the data 3600 for a new patient is input into the trained machine learning model 3013, the trained machine learning model 3013 may match the characteristics included in the data 3600 with characteristics in either cohort A or cohort B and output the appropriate treatment plan 3602. In some embodiments, the machine learning model 3013 may be trained to output one or more excluded treatment plans that should not be performed by the new patient.
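By way of illustration only, and not as the claimed implementation, the following Python sketch shows one way a cohort-matching step like the one described above could be realized: a new patient's characteristics are compared to per-cohort centroids and the closest cohort's treatment plan is returned. The cohort names, feature values, and plans are hypothetical placeholders.

```python
# Minimal sketch (not the patented implementation): matching a new patient's
# characteristics to the closest cohort and returning that cohort's plan.
# Cohort data, feature names, and plans below are hypothetical placeholders.
from math import dist

COHORTS = {
    # cohort name: (centroid of characteristics [age, BMI, baseline ROM], plan)
    "A": ((25.0, 23.0, 40.0), "30 min/day, 5 days/week, setting X then Y"),
    "B": ((67.0, 31.0, 20.0), "10 min/day, 3 days/week, low resistance"),
}

def recommend_plan(patient_features):
    """Assign the patient to the nearest cohort centroid and return its plan."""
    cohort = min(COHORTS, key=lambda c: dist(patient_features, COHORTS[c][0]))
    return cohort, COHORTS[cohort][1]

cohort, plan = recommend_plan((29.0, 24.5, 38.0))
print(f"Matched cohort {cohort}: {plan}")
```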
- FIG. 29 generally illustrates an embodiment of an overview display 3120 of the assistant interface 3094 presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the present disclosure.
- the overview display 3120 just includes sections for the patient profile 3130 and the video feed display 3180, including the self-video display 3182. Any suitable configuration of controls and interfaces of the overview display 3120 described with reference to FIG. 27 may be presented in addition to or instead of the patient profile 3130, the video feed display 3180, and the self-video display 3182.
- the healthcare provider using the assistant interface 3094 (e.g., computing device) during the telemedicine session may be presented in the self-video 3182 in a portion of the overview display 3120 (e.g., user interface presented on a display screen 3024 of the assistant interface 3094) that also presents a video from the patient in the video feed display 3180.
- the video feed display 3180 may also include a graphical user interface (GUI) object 3700 (e.g., a button) that enables the healthcare provider to share, in real-time or near real-time during the telemedicine session, the recommended treatment plans and/or the excluded treatment plans with the patient on the patient interface 3050.
- the healthcare provider may select the GUI object 3700 to share the recommended treatment plans and/or the excluded treatment plans.
- another portion of the overview display 3120 includes the patient profile display 3130.
- the patient profile display 3130 is presenting two example recommended treatment plans 3600 and one example excluded treatment plan 3602.
- the treatment plans may be recommended in view of characteristics of the patient being treated.
- to generate a treatment plan that the patient should follow to achieve a desired result, a pattern between the characteristics of the patient being treated and a cohort of other people who have used the treatment device 3070 to perform a treatment plan may be matched by one or more machine learning models 3013 of the artificial intelligence engine 3011.
- Each of the recommended treatment plans may be generated based on different desired results.
- the patient profile display 3130 presents “The characteristics of the patient match characteristics of users in Cohort A. The following treatment plans are recommended for the patient based on his characteristics and desired results.” Then, the patient profile display 3130 presents recommended treatment plans from cohort A, and each treatment plan provides different results.
- treatment plan “A” indicates “Patient X should use the treatment device for 30 minutes a day for 4 days to achieve an increased range of motion of Y%; Patient X has Type 2 Diabetes; and Patient X should be prescribed medication Z for pain management during the treatment plan (medication Z is approved for people having Type 2 Diabetes).” Accordingly, the treatment plan generated achieves an increase in the range of motion of Y%.
- the treatment plan also includes a recommended medication (e.g., medication Z) to prescribe to the patient to manage pain in view of a known medical disease (e.g., Type 2 Diabetes) of the patient. That is, the recommended medication not only avoids conflicting with the medical condition of the patient but also improves the probability of a superior patient outcome.
- Recommended treatment plan “B” may specify, based on a different desired result of the treatment plan, a different treatment plan including a different treatment protocol for a treatment device, a different medication regimen, etc.
- the patient profile display 3130 may also present the excluded treatment plans 3602. These types of treatment plans are shown to the healthcare provider using the assistant interface 3094 to alert the healthcare provider not to recommend certain portions of a treatment plan to the patient.
- the excluded treatment plan could specify the following: “Patient X should not use treatment device for longer than 30 minutes a day due to a heart condition; Patient X has Type 2 Diabetes; and Patient X should not be prescribed medication M for pain management during the treatment plan (in this scenario, medication M can cause complications for people having Type 2 Diabetes).”
- the excluded treatment plan points out a limitation of a treatment protocol where, due to a heart condition, Patient X should not exercise for more than 30 minutes a day.
- the ruled-out treatment plan also points out that Patient X should not be prescribed medication M because it conflicts with the medical condition Type 2 Diabetes.
- the healthcare provider may select the treatment plan for the patient on the overview display 3120.
- the healthcare provider may use an input peripheral (e.g., mouse, touchscreen, microphone, keyboard, etc.) to select from the treatment plans 3600 for the patient.
- the healthcare provider may discuss the pros and cons of the recommended treatment plans 3600 with the patient.
- the healthcare provider may select the treatment plan for the patient to follow to achieve the desired result.
- the selected treatment plan may be transmitted to the patient interface 3050 for presentation.
- the patient may view the selected treatment plan on the patient interface 3050.
- the healthcare provider and the patient may discuss during the telemedicine session the details (e.g., treatment protocol using treatment device 3070, diet regimen, medication regimen, etc.) in real-time or in near real-time.
- the server 3030 may control, based on the selected treatment plan and during the telemedicine session, the treatment device 3070 as the user uses the treatment device 3070.
- FIG. 30 generally illustrates an embodiment of the overview display 3120 of the assistant interface 3094 presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the present disclosure.
- the treatment device 3070 and/or any computing device may transmit data while the patient uses the treatment device 3070 to perform a treatment plan.
- the data may include updated characteristics of the patient and/or other treatment data.
- the updated characteristics may include new performance information and/or measurement information.
- the performance information may include a speed of a portion of the treatment device 3070, a range of motion achieved by the patient, a force exerted on a portion of the treatment device 3070, a heartrate of the patient, a blood pressure of the patient, a respiratory rate of the patient, and so forth.
- the data received at the server 3030 may be input into the trained machine learning model 3013, which may determine that the characteristics indicate the patient is on track for the current treatment plan. Determining the patient is on track for the current treatment plan may cause the trained machine learning model 3013 to adjust a parameter of the treatment device 3070. The adjustment may be based on a next step of the treatment plan to further improve the performance of the patient.
- the data received at the server 3030 may be input into the trained machine learning model 3013, which may determine that the characteristics indicate the patient is not on track (e.g., behind schedule, not able to maintain a speed, not able to achieve a certain range of motion, is in too much pain, etc.) for the current treatment plan or is ahead of schedule (e.g., exceeding a certain speed, exercising longer than specified with no pain, exerting more than a specified force, etc.) for the current treatment plan.
- the trained machine learning model 3013 may determine that the characteristics of the patient no longer match the characteristics of the patients in the cohort to which the patient is assigned. Accordingly, the trained machine learning model 3013 may reassign the patient to another cohort that includes qualifying characteristics matching the patient’s characteristics. As such, the trained machine learning model 3013 may select a new treatment plan from the new cohort and control, based on the new treatment plan, the treatment device 3070. In some embodiments, prior to controlling the treatment device 3070, the server 3030 may provide the new treatment plan 3800 to the assistant interface 3094 for presentation in the patient profile 3130. As depicted, the patient profile 3130 indicates “The characteristics of the patient have changed and now match characteristics of users in Cohort B.
- the following treatment plan is recommended for the patient based on his characteristics and desired results.”
- the patient profile 3130 presents the new treatment plan 3800 (“Patient X should use the treatment device for 10 minutes a day for 3 days to achieve an increased range of motion of L%.”).
- the healthcare provider may select the new treatment plan 3800, and the server 3030 may receive the selection.
- the server 3030 may control the treatment device 3070 based on the new treatment plan 3800.
- the new treatment plan 3800 may be transmitted to the patient interface 3050 such that the patient may view the details of the new treatment plan 3800.
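Building on the hypothetical recommend_plan() helper in the earlier sketch, the following non-limiting example illustrates how updated characteristics could trigger reassignment to a different cohort and a proposed new treatment plan for provider review; all names and values are assumptions, not the disclosed implementation.

```python
# Minimal sketch, reusing the hypothetical recommend_plan() defined above:
# when updated measurements shift the patient's characteristics, re-run the
# cohort match and, if the cohort changes, surface a new plan for review.
def reassess(assigned_cohort, updated_features):
    new_cohort, new_plan = recommend_plan(updated_features)
    if new_cohort != assigned_cohort:
        # In the described system this would be presented on the assistant
        # interface for selection before the treatment device is reconfigured.
        return {"reassigned_to": new_cohort, "proposed_plan": new_plan}
    return {"reassigned_to": assigned_cohort, "proposed_plan": None}

print(reassess("A", (29.0, 33.0, 18.0)))
```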
- the server 3030 may be configured to receive treatment data pertaining to a user who uses a treatment device 3070 to perform a treatment plan.
- the user may include a patient, user, or person using the treatment device 3070 to perform various exercises.
- the treatment data may include various characteristics of the user, various baseline measurement information pertaining to the user, various measurement information pertaining to the user while the user uses the treatment device 3070, various characteristics of the treatment device 3070, the treatment plan, other suitable data, or a combination thereof.
- the server 3030 may receive the treatment data during a telemedicine session.
- At least some of the treatment data may include the sensor data 3136 from one or more of the external sensors 3082, 3084, 3086, and/or from one or more internal sensors 3076 of the treatment device 3070.
- at least some of the treatment data may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 3070.
- the one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like.
- the one or more wearable devices may be configured to monitor a heartrate, a temperature, a blood pressure, one or more vital signs, and the like of the patient while the patient is using the treatment device 3070.
- the various characteristics of the treatment device 3070 may include one or more settings of the treatment device 3070, a current revolutions per time period (e.g., such as one minute) of a rotating member (e.g., such as a wheel) of the treatment device 3070, a resistance setting of the treatment device 3070, other suitable characteristics of the treatment device 3070, or a combination thereof.
- the baseline measurement information may include, while the user is at rest, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof.
- the measurement information may include, while the user uses the treatment device 3070 to perform the treatment plan, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof.
- the server 3030 may write to an associated memory, for access by the artificial intelligence engine 3011, the treatment data.
- the artificial intelligence engine 3011 may use the one or more machine learning models 3013, which may be configured to use at least some of the treatment data to generate one or more predictions.
- the artificial intelligence engine 3011 may use a machine learning model 3013 configured to receive the treatment data corresponding to the user.
- the machine learning model 3013 may analyze the at least one aspect of the treatment data and may generate at least one prediction corresponding to the at least one aspect of the treatment data.
- the at least one prediction may indicate one or more predicted characteristics of the user.
- the one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
- the server 3030 may receive, from the artificial intelligence engine 3011, the one or more predictions.
- the server 3030 may identify a threshold corresponding to respective predictions received from the artificial intelligence engine 3011. For example, the server 3030 may identify one or more characteristics of the user indicated by a respective prediction.
- the server 3030 may access a database, such as the database 3044 or other suitable database, configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user.
- the database 3044 may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database 3044 may include information that associates a threshold with a blood pressure of the user and a heartrate of the user. It should be understood that the database 3044 may include any number of thresholds associated with any of the various characteristics of the user and/or any combination of user characteristics.
- a threshold corresponding to a respective prediction may include a value or a range of values including an upper limit and a lower limit.
- the server 3030 may determine whether a prediction received from the artificial intelligence engine 3011 is within a range of a corresponding threshold. For example, the server 3030 may compare the prediction to the corresponding threshold. The server 3030 may determine whether the prediction is within a predefined range of the threshold.
- the predefined range may include an upper limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable upper limit) above the value and a lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable lower limit) below the value.
- the predefined range may include a second upper limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable upper limit) above the first upper limit and a second lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable lower limit) below the first lower limit.
- the threshold may include any suitable predefined range and may include any suitable format in addition to or other than those described herein.
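As a non-limiting sketch of the threshold handling described above, the following assumes a hypothetical in-memory table in place of the database 3044 and a percentage-based predefined range around each threshold value; the characteristics, values, and margins are illustrative only.

```python
# Minimal sketch (hypothetical table and units): associating thresholds with
# user characteristics and testing whether a prediction falls within a
# predefined range around the threshold value.
THRESHOLDS = {
    # characteristic: (threshold value, allowed fraction above/below, e.g. 1%)
    "systolic_bp": (140.0, 0.01),
    "heartrate":   (150.0, 0.01),
}

def within_range(characteristic, predicted_value):
    value, margin = THRESHOLDS[characteristic]
    lower, upper = value * (1 - margin), value * (1 + margin)
    return lower <= predicted_value <= upper

print(within_range("systolic_bp", 140.5))  # True: inside the +/-1% band
print(within_range("systolic_bp", 155.0))  # False: outside the band
```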
- the server 3030 may communicate with (e.g., or over) an interface, such as the overview display 3120 at the computing device of the healthcare provider assisting the user, to provide the prediction and the treatment data.
- the server 3030 may generate treatment information using the treatment data and/or the prediction.
- the treatment information may include a formatted summary of the performance of the treatment plan by the user while using the treatment device 3070, such that the treatment data and/or the prediction is presentable at the computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
- the patient profile display 3130 may include and/or display the treatment information.
- the server 3030 may be configured to provide, at the overview display 3120, the treatment information.
- the server 3030 may store the treatment information for access by the overview display 3120 and/or communicate the treatment information to the overview display 3120.
- the server 3030 may provide the treatment information to the patient profile display 3130 or other suitable section, portion, or component of the overview display 3120 or to any other suitable display or interface.
- the server 3030 may, in response to determining that the prediction is within the range of the threshold, modify at least one aspect of the treatment plan and/or, based on the prediction, one or more characteristics of the treatment device 3070.
- the server 3030 may control, while the user uses the treatment device 3070 during a telemedicine session and based on a generated prediction, the treatment device 3070. For example, the server 3030 may, based on the prediction and/or the treatment plan, control one or more characteristics of the treatment device 3070.
- the healthcare provider may include a medical professional (e.g., such as a doctor, a nurse, a therapist, and the like), an exercise professional (e.g., such as a coach, a trainer, a nutritionist, and the like), or another professional sharing at least one of medical and exercise attributes (e.g., such as an exercise physiologist, a physical therapist, an occupational therapist, and the like).
- a “healthcare provider” may be a human being, a robot, a virtual assistant, a virtual assistant in a virtual and/or augmented reality, or an artificially intelligent entity, including a software program, integrated software and hardware, or hardware alone.
- the interface may include a graphical user interface configured to provide the treatment information and receive input from the healthcare provider.
- the interface may include one or more input fields, such as text input fields, dropdown selection input fields, radio button input fields, virtual switch input fields, virtual lever input fields, audio, haptic, tactile, biometric or otherwise activated and/or driven input fields, other suitable input fields, or a combination thereof.
- the healthcare provider may review the treatment information and/or the prediction.
- the healthcare provider may, based on the review of the treatment information and/or the prediction, determine whether to modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device 3070.
- the healthcare provider may review the treatment information.
- the healthcare provider may, based on the review of the treatment information, compare the treatment information to the treatment plan being performed by the user.
- the healthcare provider may compare the following: (i) expected information, which pertains to the user while the user uses the treatment device to perform the treatment plan, to (ii) the prediction, which pertains to the user while the user uses the treatment device to perform the treatment plan.
- the expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof.
- the healthcare provider may determine that the treatment plan is having the desired effect if the prediction is within an acceptable range associated with one or more corresponding parts or portions of the expected information. Alternatively, the healthcare provider may determine that the treatment plan is not having the desired effect if the prediction is outside of the range associated with one or more corresponding parts or portions of the expected information.
- the healthcare provider may determine whether a blood pressure value indicated by the prediction (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, plus or minus 1 unit of measurement (or other suitable numerical value), or any suitable range) of an expected blood pressure value indicated by the expected information.
- the healthcare provider may determine that the treatment plan is having the desired effect if the blood pressure value is within the range of the expected blood pressure value.
- the healthcare provider may determine that the treatment plan is not having the desired effect if the blood pressure value is outside of the range of the expected blood pressure value.
- the healthcare provider may compare the expected characteristics of the treatment device 3070 with characteristics of the treatment device 3070 indicated by the treatment information and/or the prediction. For example, the healthcare provider may compare an expected resistance setting of the treatment device 3070 with an actual resistance setting of the treatment device 3070 indicated by the treatment information and/or the prediction. The healthcare provider may determine that the user is performing the treatment plan properly if the actual characteristics of the treatment device 3070 indicated by the treatment information and/or the prediction are within a range of corresponding characteristics of the expected characteristics of the treatment device 3070. Alternatively, the healthcare provider may determine that the user is not performing the treatment plan properly if the actual characteristics of the treatment device 3070 indicated by the treatment information and/or the prediction are outside the range of corresponding characteristics of the expected characteristics of the treatment device 3070.
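The provider-side comparison described above could, purely as an illustrative assumption, be reduced to a simple range test of predicted or measured values against the expected values from the treatment plan; the expected figures and the 5% tolerance below are hypothetical placeholders.

```python
# Minimal sketch with hypothetical expected values: comparing predicted vitals
# and actual device settings against what the treatment plan expects.
def is_close(actual, expected, pct=0.05):
    """True when `actual` is within +/- pct of `expected` (5% by default)."""
    return abs(actual - expected) <= abs(expected) * pct

expected = {"systolic_bp": 130.0, "resistance_level": 4.0}
observed = {"systolic_bp": 133.0, "resistance_level": 6.0}

for name, want in expected.items():
    ok = is_close(observed[name], want)
    print(f"{name}: {'as expected' if ok else 'review / consider modifying plan'}")
```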
- the healthcare provider may determine not to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device 3070.
- the healthcare provider may determine to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device 3070.
- the healthcare provider may interact with the interface to provide treatment plan input indicating one or more modifications to the treatment plan and/or to modify one or more characteristics of the treatment device 3070.
- the healthcare provider may use the interface to provide input indicating an increase or decrease in the resistance setting of the treatment device 3070, or other suitable modification to the one or more characteristics of the treatment device 3070.
- the healthcare provider may use the interface to provide input indicating a modification to the treatment plan.
- the healthcare provider may use the interface to provide input indicating an increase or decrease in an amount of time the user is required to use the treatment device 3070 according to the treatment plan, or other suitable modifications to the treatment plan.
- the server 3030 may modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device 3070.
- the server 3030 may receive the subsequent treatment data pertaining to the user. For example, after the healthcare provider provides input modifying the treatment plan and/or the one or more characteristics of the treatment device 3070 and/or after the server 3030, using the artificial intelligence engine 3011, modifies the treatment plan and/or one or more characteristics of the treatment device 3070, the user may continue to use the treatment device 3070 to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the user uses the treatment device 3070 to perform the modified treatment plan.
- the subsequent treatment data may correspond to treatment data generated while the user continues to use the treatment device 3070 to perform the treatment plan.
- the subsequent treatment data may include the updated treatment data (e.g., the treatment data updated to include the at least one prediction).
- the server 3030 may use the artificial intelligence engine 3011 using the machine learning model 3013 to generate one or more subsequent predictions based on the subsequent treatment data.
- the server 3030 may determine whether a respective subsequent prediction is within a range of a corresponding threshold.
- the server 3030 may, in response to a determination that the respective subsequent prediction is within the range of the threshold, communicate the subsequent treatment data, subsequent treatment information, and/or the prediction to the computing device of the healthcare provider.
- the server 3030 may modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device 3070 based on the subsequent prediction.
- the server 3030 may receive subsequent treatment plan input from the computing device of the healthcare provider.
- the server 3030 may further modify the treatment plan and/or control the one or more characteristics of the treatment device 3070.
- the subsequent treatment plan input may correspond to input provided by the healthcare provider, at the interface, in response to receiving and/or reviewing subsequent treatment information and/or the subsequent prediction corresponding to the subsequent treatment data.
- the server 3030, using the artificial intelligence engine 3011, may continuously and/or periodically generate predictions based on treatment data.
- the server 3030 may provide treatment information and/or predictions to the computing device of the healthcare provider. Additionally, or alternatively, the server 3030 may continuously and/or periodically monitor, while the user uses the treatment device 3070 to perform the treatment plan, the characteristics of the user.
- the healthcare provider and/or the server 3030 may receive and/or review, continuously or periodically, while the user uses the treatment device 3070 to perform the treatment plan, treatment information, treatment data, and/or predictions. Based on one or more trends indicated by the treatment information, treatment data, and/or predictions, the healthcare provider and/or the server 3030 may determine whether to modify the treatment plan and/or to modify and/or to control the one or more characteristics of the treatment device 3070. For example, the one or more trends may indicate an increase in heartrate or other suitable trends indicating that the user is not performing the treatment plan properly and/or that performance of the treatment plan by the user is not having the desired effect.
- the server 3030 may control, while the user uses the treatment device 3070 to perform the treatment plan, one or more characteristics of the treatment device 3070 based on the prediction. For example, the server 3030 may determine that the prediction is outside of the range of the corresponding threshold. Based on the prediction, the server 3030 may identify one or more characteristics of the treatment device 3070. The server 3030 may communicate a signal to the controller 3072 of the treatment device 3070 indicating the modifications to the one or more characteristics of the treatment device 3070. Based on the signal, the controller 3072 may modify the one or more characteristics of the treatment device 3070.
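As a minimal, assumption-based sketch (the message format and field names are hypothetical and are not the protocol actually used by the controller 3072), a settings modification might be serialized and sent to the treatment device as follows.

```python
# Minimal sketch with hypothetical message names: when a prediction falls
# outside its threshold range, build a modification message for the controller.
import json

def build_control_message(device_id, modifications):
    """Serialize a settings change (e.g. resistance, pedal range of motion)."""
    return json.dumps({"device_id": device_id, "modify": modifications})

# Example: reduce resistance and narrow pedal range of motion.
msg = build_control_message("treatment-device-3070",
                            {"resistance": 3, "pedal_rom_deg": 45})
print(msg)  # in the described system this would be sent to the controller 3072
```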
- the treatment plan, including the configurations, settings, range of motion settings, pain level, force settings, and speed settings, etc. of the treatment device 3070 for various exercises, may be transmitted to the controller of the treatment device 3070.
- the controller may receive the indication. Based on the indication, the controller may electronically adjust the range of motion of the pedal 3102 by adjusting the pedal inwardly, outwardly, or along or about any suitable axis, via one or more actuators, hydraulics, springs, electric motors, or the like.
- the treatment plan may define alternative range of motion settings for the pedal 3102. Accordingly, once the treatment plan is uploaded to the controller of the treatment device 3070, the treatment device 3070 may continue to operate without further instruction, further external input, and the like. It should be noted that the patient (via the patient interface 3050) and/or the assistant (via the assistant interface 3094) may override any of the configurations or settings of the treatment device 3070 at any time. For example, the patient may use the patient interface 3050 to cause the treatment device 3070 to immediately stop, if so desired.
- FIG. 31 is a flow diagram generally illustrating a method 3900 for monitoring, based on treatment data received while a user uses the treatment device 3070, characteristics of the user while the user uses the treatment device 3070 according to the principles of the present disclosure.
- the method 3900 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both.
- the method 3900 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011).
- the method 3900 may be performed by a single processing thread.
- the method 3900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the methods.
- the method 3900 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 3900 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 3900 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 3900 could alternatively be represented as a series of interrelated states via a state diagram or events.
- the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 3070, to perform a treatment plan.
- the treatment data may include characteristics of the user, baseline measurement information pertaining to the user, measurement information pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan, characteristics of the treatment device 3070, at least one aspect of the treatment plan, other suitable data, or a combination thereof.
- the processing device may write to an associated memory, for access by an artificial intelligence engine, such as the artificial intelligence engine 3011, the treatment data.
- the artificial intelligence engine 3011 may be configured to use at least one machine learning model, such as the machine learning model 3013.
- the machine learning model 3013 may be configured to use at least one aspect of the treatment data to generate at least one prediction.
- the at least one prediction may indicate one or more predicted characteristics of the user.
- the one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
- the processing device may receive, from the artificial intelligence engine 3011, the at least one prediction.
- the processing device may identify a threshold corresponding to the at least one prediction. For example, the processing device may identify one or more characteristics of the user indicated by a respective prediction.
- the processing device may access a database, such as the database 3044 or other suitable database, configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user.
- the database 3044 may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database 3044 may include information that associates a threshold with a blood pressure of the user and a heartrate of the user.
- a threshold corresponding to a respective prediction may include a value or a range of values including an upper limit and a lower limit.
- the processing device may, in response to a determination that the at least one prediction is within a range of a corresponding threshold, communicate with an interface at a computing device of a healthcare provider, to provide the at least one prediction and the treatment data. For example, the processing device may compare the at least one prediction and/or one or more characteristic of the user indicated by the prediction with the corresponding threshold identified by the processing device. If the processing device determines that the prediction is within the range of the threshold, the processing device may communicate the at least one prediction and/or the treatment data to the computing device of the healthcare provider.
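The flow of method 3900 can be summarized, as a non-limiting sketch, in a single monitoring step; the helper names below are hypothetical stand-ins for the artificial intelligence engine, the threshold database, and the provider interface, not the disclosed components.

```python
# Minimal sketch of the monitoring loop described for method 3900, with
# hypothetical helpers standing in for the AI engine and threshold lookup.
def monitor_step(treatment_data, predict, lookup_threshold, notify_provider):
    """One pass: predict from treatment data, notify if within threshold range."""
    prediction = predict(treatment_data)                  # AI engine / ML model
    low, high = lookup_threshold(prediction["characteristic"])
    if low <= prediction["value"] <= high:
        notify_provider(prediction, treatment_data)       # provider interface
    return prediction

# Hypothetical stand-ins for the engine, database, and interface:
pred = monitor_step(
    {"heartrate": 148},
    predict=lambda d: {"characteristic": "heartrate", "value": d["heartrate"] + 4},
    lookup_threshold=lambda c: (145, 155),
    notify_provider=lambda p, d: print("provider notified:", p, d),
)
```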
- FIG. 32 is a flow diagram generally illustrating an alternative method 31000 for monitoring, based on treatment data received while a user uses the treatment device 3070, characteristics of the user while the user uses the treatment device 3070 according to the principles of the present disclosure.
- Method 31000 includes operations performed by processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011).
- one or more operations of the method 31000 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 31000 may be performed in the same or a similar manner as described above in regard to method 3900.
- the operations of the method 31000 may be performed in some combination with any of the operations of any of the methods described herein.
- the processing device may receive treatment data, during a telemedicine session, pertaining to a user who uses a treatment device or treatment apparatus, such as the treatment device 3070, to perform a treatment plan.
- the treatment data may include characteristics of the user, baseline measurement information pertaining to the user, measurement information pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan, characteristics of the treatment device 3070, at least one aspect of the treatment plan, other suitable data, or a combination thereof.
- the processing device may write to an associated memory, for access by an artificial intelligence engine, such as the artificial intelligence engine 3011, the treatment data.
- the artificial intelligence engine 3011 may be configured to use at least one machine learning model, such as the machine learning model 3013.
- the machine learning model 3013 may be configured to use at least one aspect of the treatment data to generate at least one prediction.
- the at least one prediction may indicate one or more predicted characteristics of the user.
- the one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
- the processing device may receive, from the artificial intelligence engine 3011, the at least one prediction.
- the processing device may identify a threshold corresponding to the at least one prediction. For example, the processing device may identify one or more characteristics of the user indicated by a respective prediction.
- the processing device may access a database, such as the database 3044 or other suitable database, configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user.
- the database 3044 may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database 3044 may include information that associates a threshold with a blood pressure of the user and a heartrate of the user.
- a threshold corresponding to a respective prediction may include a value or a range of values including an upper limit and a lower limit.
- the processing device may, in response to a determination that the at least one prediction is within a range of a corresponding threshold, communicate with an interface at a computing device of a healthcare provider, to provide the at least one prediction and the treatment data. For example, the processing device may compare the at least one prediction and/or one or more characteristic of the user indicated by the prediction with the corresponding threshold identified by the processing device. If the processing device determines that the prediction is within the range of the threshold, the processing device may communicate the at least one prediction and/or the treatment data to the computing device of the healthcare provider.
- the processing device may receive, from the computing device of the healthcare provider, treatment plan input indicating at least one modification to the at least one of the at least one aspect of the treatment plan and any other aspect of the treatment plan.
- the processing device may modify, using the treatment plan input, the at least one of the at least one aspect of the treatment plan and any other aspect of the treatment plan.
- the processing device may control, during a telemedicine session while the user uses the treatment device 3070 and based on the modified at least one of the at least one aspect of the treatment plan or any other aspect of the treatment plan, the treatment device 3070.
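As a minimal illustration of the additional steps of method 31000 (the data structures are hypothetical placeholders), provider-supplied treatment plan input could be merged into the current plan before the treatment device is controlled according to the modified plan.

```python
# Minimal sketch of the additional steps in method 31000 (hypothetical data):
# apply the provider's treatment plan input and derive the modified plan.
def apply_plan_input(treatment_plan, plan_input):
    """Merge provider-supplied modifications into the current plan."""
    modified = dict(treatment_plan)
    modified.update(plan_input)
    return modified

plan = {"session_minutes": 30, "resistance": 5}
modified = apply_plan_input(plan, {"session_minutes": 20})
print(modified)  # the device would then be controlled per the modified plan
```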
- FIG. 33 is a flow diagram generally illustrating an alternative method 31100 for monitoring, based on treatment data received while a user uses the treatment device 3070, characteristics of the user while the user uses the treatment device 3070 according to the principles of the present disclosure.
- Method 31100 includes operations performed by processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011).
- one or more operations of the method 31100 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 31100 may be performed in the same or a similar manner as described above in regard to method 3900 and/or method 31000.
- the operations of the method 31100 may be performed in some combination with any of the operations of any of the methods described herein.
- the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 3070, to perform a treatment plan.
- the treatment data may include characteristics of the user, baseline measurement information pertaining to the user, measurement information pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan, characteristics of the treatment device 3070, at least one aspect of the treatment plan, other suitable data, or a combination thereof.
- the processing device may write to an associated memory, for access by an artificial intelligence engine, such as the artificial intelligence engine 3011, the treatment data.
- the artificial intelligence engine 3011 may be configured to use at least one machine learning model, such as the machine learning model 3013.
- the machine learning model 3013 may be configured to use at least one aspect of the treatment data to generate at least one prediction.
- the at least one prediction may indicate one or more predicted characteristics of the user.
- the one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
- the processing device may receive, from the artificial intelligence engine 3011, the at least one prediction.
- the processing device may generate treatment information using the at least one prediction.
- the treatment information may include a summary of the performance, while the user uses the treatment device 3070 to perform the treatment plan, of the treatment plan by the user and the at least one prediction.
- the treatment information may be formatted, such that the treatment data and/or the at least one prediction is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
- the processing device may write, to an associated memory for access by at least one of the computing device of the healthcare provider and a machine learning model executed by the artificial intelligence engine 3011, the treatment information and/or the at least one prediction.
- the processing device may receive treatment plan input responsive to the treatment information.
- the treatment plan input may indicate at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan.
- the treatment plan input may be provided by the healthcare provider, as described.
- the artificial intelligence engine 3011 executing the machine learning model 3013 may generate the treatment plan input.
- the processing device may determine whether the treatment plan input indicates at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan.
- if the processing device determines that the treatment plan input does not indicate at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan, the processing device returns to 31102 and continues receiving treatment data pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan. If the processing device determines that the treatment plan input indicates at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan, the processing device continues at 31116.
- the processing device may selectively modify the at least one aspect of the treatment plan and/or any other aspect of the treatment plan. For example, the processing device may determine whether the treatment data indicates that the treatment plan is having a desired effect. The processing device may modify, in response to determining that the treatment plan is not having the desired effect, at least one aspect of the treatment plan in order to attempt to achieve the desired effect, and if not possible, at least a portion of the desired effect.
- the processing device may control, while the user uses the treatment device 3070, based on the treatment plan and/or the modified treatment plan, the treatment device 3070.
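The selective-modification step of method 31100 could, under the hypothetical assumption that the desired effect is expressed as a target range of motion, look like the following sketch; the specific settings eased here are illustrative, not prescribed by the disclosure.

```python
# Minimal sketch of the selective-modification step in method 31100, with a
# hypothetical "desired effect" expressed as a target range of motion.
def selectively_modify(plan, measured_rom_deg, target_rom_deg):
    """If the plan is not producing the desired range of motion, ease settings."""
    if measured_rom_deg >= target_rom_deg:
        return plan                      # desired effect: leave plan unchanged
    eased = dict(plan)
    eased["resistance"] = max(1, plan["resistance"] - 1)
    eased["session_minutes"] = plan["session_minutes"] + 5
    return eased

print(selectively_modify({"resistance": 5, "session_minutes": 30}, 35, 45))
```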
- FIG. 34 generally illustrates an example embodiment of a method 31200 for receiving a selection of an optimal treatment plan and controlling a treatment device while the patient uses the treatment device according to the present disclosure, based on the optimal treatment plan.
- Method 31200 includes operations performed by processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011).
- one or more operations of the method 31200 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 31200 may be performed in the same or a similar manner as described above in regard to method 3900.
- the operations of the method 31200 may be performed in some combination with any of the operations of any of the methods described herein.
- various optimal treatment plans may be generated by one or more trained machine learning models 3013 of the artificial intelligence engine 3011. For example, based on a set of treatment plans pertaining to a medical condition of a patient, the one or more trained machine learning models 3013 may generate the optimal treatment plans.
- the various treatment plans may be transmitted to one or more computing devices of a patient and/or medical professional.
- the processing device may receive an optimal treatment plan selected from some or all of the optimal treatment plans.
- the selection may have been entered on a user interface presenting the optimal treatment plans on the patient interface 3050 and/or the assistant interface 3094.
- the processing device may control, while the patient uses the treatment device 3070, based on the selected optimal treatment plan, the treatment device 3070.
- the controlling may be performed distally by the server 3030. For example, if the selection is made using the patient interface 3050, one or more control signals may be transmitted from the patient interface 3050 to the treatment device 3070 to configure, according to the selected treatment plan, a setting of the treatment device 3070 to control operation of the treatment device 3070.
- one or more control signals may be transmitted from the assistant interface 3094 to the treatment device 3070 to configure, according to the selected treatment plan, a setting of the treatment device 3070 to control operation of the treatment device 3070.
- the sensors 3076 may transmit measurement data to a processing device.
- the processing device may dynamically control, according to the treatment plan, the treatment device 3070 by modifying, based on the sensor measurements, a setting of the treatment device 3070. For example, if the force measured by the sensor 3076 indicates the user is not applying enough force to a pedal 3102, the treatment plan may indicate to reduce the required amount of force for an exercise.
- the user may use the patient interface 3050 to enter input pertaining to a pain level experienced by the patient as the patient performs the treatment plan.
- the user may enter a high degree of pain while pedaling with the pedals 3102 set to a certain range of motion on the treatment device 3070.
- the pain level entered by the user may be within a range or at a level which may cause the range of motion to be dynamically adjusted based on the treatment plan.
- the treatment plan may specify alternative range of motion settings if a certain pain level is indicated when the user is performing an exercise subject to a certain range of motion.
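As a non-limiting sketch of the pain-driven adjustment described above (the pain scale, the 7-out-of-10 limit, and the alternative angles are hypothetical assumptions), the controller-side logic might step down to a smaller configured range of motion as follows.

```python
# Minimal sketch (hypothetical pain scale and angles): switching to an
# alternative range-of-motion setting when reported pain crosses a limit.
def adjust_range_of_motion(current_rom_deg, pain_level, alternatives=(60, 45, 30)):
    """Step down to the next smaller configured range of motion at pain >= 7/10."""
    if pain_level < 7:
        return current_rom_deg
    smaller = [a for a in alternatives if a < current_rom_deg]
    return max(smaller) if smaller else current_rom_deg

print(adjust_range_of_motion(60, 8))  # -> 45; the controller repositions the pedals
```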
- FIG. 35 generally illustrates an example computer system 31300 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure.
- computer system 31300 may include a computing device and correspond to the assistant interface 3094, reporting interface 3092, supervisory interface 3090, clinician interface 3020, server 3030 (including the AI engine 3011), patient interface 3050, ambulatory sensor 3082, goniometer 3084, treatment device 3070, pressure sensor 3086, or any suitable component of FIG. 23.
- the computer system 31300 may be capable of executing instructions implementing the one or more machine learning models 3013 of the artificial intelligence engine 3011 of FIG. 23.
- the computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network.
- the computer system may operate in the capacity of a server in a client-server network environment.
- the computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set top box (STB), a personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device.
- the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
- the computer system 31300 includes a processing device 31302, a main memory 31304 (e.g., read only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 31306 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 31308, which communicate with each other via a bus 31310.
- Processing device 31302 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 31302 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 31302 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 31302 is configured to execute instructions for performing any of the operations and steps discussed herein.
- the computer system 31300 may further include a network interface device 31312.
- the computer system 31300 also may include a video display 31314 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 31316 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 31318 (e.g., a speaker).
- the video display 31314 and the input device(s) 31316 may be combined into a single component or device (e.g., an LCD touch screen).
- the data storage device 31308 may include a computer-readable medium 31320 on which the instructions 31322 embodying any one or more of the methods, operations, or functions described herein are stored.
- the instructions 31322 may also reside, completely or at least partially, within the main memory 31304 and/or within the processing device 31302 during execution thereof by the computer system 31300. As such, the main memory 31304 and the processing device 31302 also constitute computer-readable media.
- the instructions 31322 may further be transmitted or received over a network via the network interface device 31312. While the computer-readable storage medium 31320 is generally illustrated in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
- the term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
- a method for providing, by an artificial intelligence engine, an optimal treatment plan to use with a treatment apparatus comprising:
- Clause 51 The method of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
- the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
- the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort-associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
- Clause 53 The method of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, meta-analysis, or some combination thereof.
- a tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to:
- [0686] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
- [0687] translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine; [0688] determine, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow using the treatment apparatus to achieve a desired result; and
- [0697] determines, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a second desired result, wherein the desired result pertains to a recovery outcome and the second desired result pertains to a recovery time; and [0698] provides the second optimal treatment plan to be presented on the computing device of the medical professional;
- [0700] transmits the selected treatment plan to a computing device of the patient.
- Clause 60 The computer-readable medium of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
- the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
- the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort-associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
- Clause 62 The computer-readable medium of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, or some combination thereof.
- a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to:
- [0714] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
- [0715] translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine
- [0720] identify, based on keywords representing target information described by the clinical information, the portion of the clinical information having values of the target information; [0721] generate a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
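- As an illustrative, non-limiting sketch of the keyword-driven translation described in the clause above, the following Python fragment identifies target values in free-text clinical information and emits a canonical, tag-based representation. The keyword patterns, tag names, and example sentence are hypothetical assumptions, not the disclosed medical description language.

```python
# Minimal sketch (not the patented implementation): identify portions of
# clinical text by keyword and emit a canonical, tag-based format. All
# keyword patterns and tag names here are hypothetical illustrations.
import re

TARGET_KEYWORDS = {
    "range_of_motion_deg": r"range of motion of (\d+) degrees",
    "session_minutes": r"(\d+) minutes per session",
}

def to_canonical(clinical_text: str) -> dict:
    """Return a dict of tag: value pairs for any target information found."""
    canonical = {}
    for tag, pattern in TARGET_KEYWORDS.items():
        match = re.search(pattern, clinical_text, flags=re.IGNORECASE)
        if match:
            canonical[tag] = int(match.group(1))
    return canonical

print(to_canonical("Patients achieved a range of motion of 95 degrees "
                   "after 30 minutes per session."))
# {'range_of_motion_deg': 95, 'session_minutes': 30}
```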
- Determining a treatment plan for a patient having certain characteristics (e.g., vital-sign or other measurements; performance; demographic; geographic; diagnostic; measurement- or test-based; medically historic; etiologic; cohort-associative; differentially diagnostic; surgical, physically therapeutic, pharmacologic and other treatment(s) recommended; arterial blood gas and/or oxygenation levels or percentages; psychographics; etc.) may be a technically challenging problem.
- a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process.
- some of the multitude of information considered may include characteristics of the patient such as personal information, performance information, and measurement information.
- the personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medication history, an injury, a medical procedure, a medication prescribed, or some combination thereof.
- the performance information may include, e.g., an elapsed time of using a treatment apparatus, an amount of force exerted on a portion of the treatment apparatus, a range of motion achieved on the treatment apparatus, a movement speed of a portion of the treatment apparatus, an indication of a plurality of pain levels using the treatment apparatus, or some combination thereof.
- the measurement information may include, e.g., a vital sign, a respiration rate, a heartrate, a temperature, a blood pressure, arterial blood gas and/or oxygenation levels or percentages, glucose levels or levels of other biomarkers, or some combination thereof. It may be desirable to process the characteristics of a multitude of patients, the treatment plans performed for those patients, and the results of the treatment plans for those patients.
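- The grouping of characteristics described above (personal, performance, and measurement information) could be represented, purely as an illustrative assumption, by simple data structures such as the following Python sketch; the field names and units are hypothetical and not part of the disclosure.

```python
# Hypothetical sketch of grouping patient characteristics (Python 3.9+).
from dataclasses import dataclass, field

@dataclass
class PersonalInfo:
    age: int
    weight_kg: float
    medical_conditions: list[str] = field(default_factory=list)

@dataclass
class PerformanceInfo:
    elapsed_minutes: float
    force_newtons: float
    range_of_motion_deg: float
    pain_levels: list[int] = field(default_factory=list)

@dataclass
class MeasurementInfo:
    heart_rate_bpm: int
    blood_pressure: tuple[int, int]
    spo2_percent: float

@dataclass
class PatientCharacteristics:
    personal: PersonalInfo
    performance: PerformanceInfo
    measurement: MeasurementInfo

vitals = MeasurementInfo(heart_rate_bpm=78, blood_pressure=(120, 80), spo2_percent=97.5)
print(vitals)
```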
- Another technical problem may involve distally treating, via a computing device during a telemedicine or telehealth session, a patient from a location different than a location at which the patient is located.
- An additional technical problem is controlling or enabling the control of, from the different location, a treatment apparatus used by the patient at the location at which the patient is located.
- a physical therapist or other medical professional may prescribe a treatment apparatus to the patient to use to perform a treatment protocol at their residence or any mobile location or temporary domicile.
- a medical professional may refer to a doctor, physician assistant, nurse, chiropractor, dentist, physical therapist, acupuncturist, physical trainer, or the like.
- a medical professional may refer to any person with a credential, license, degree, or the like in the field of medicine, physical therapy, rehabilitation, or the like.
- some embodiments of the present disclosure pertain to using artificial intelligence and/or machine learning to dynamically control a treatment apparatus based on the assignment during an adaptive telemedical session.
- numerous treatment apparatuses may be provided to patients.
- the treatment apparatuses may be used by the patients to perform treatment plans in their residences, at a gym, at a rehabilitative center, at a hospital, at a work site, or any suitable location, including permanent or temporary domiciles.
- the treatment apparatuses may be communicatively coupled to a server.
- Characteristics of the patients may be collected before, during, and/or after the patients perform the treatment plans.
- the personal information, the performance information, and the measurement information may be collected before, during, and/or after the person performs the treatment plans.
- the results (e.g., improved performance or decreased performance) of performing each exercise may be collected from the treatment apparatus throughout the treatment plan and after the treatment plan is performed.
- the parameters, settings, configurations, etc. (e.g., position of pedal, amount of resistance, etc.) of the treatment apparatus may be collected before, during, and/or after the treatment plan is performed.
- Each characteristic of the patient, each result, and each parameter, setting, configuration, etc. may be timestamped and may be correlated with a particular step in the treatment plan. Such a technique may enable determining which steps in the treatment plan lead to desired results (e.g., improved muscle strength, range of motion, etc.) and which steps lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or harms recovery).
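- A minimal, hypothetical sketch of the timestamping and per-step correlation described above follows; the record layout and field names are assumptions introduced only for illustration.

```python
# Hypothetical sketch: timestamp each collected value and correlate it with a
# treatment-plan step so per-step results can later be compared.
import time
from collections import defaultdict

records = []  # each entry: (timestamp, step_id, kind, value)

def record(step_id: int, kind: str, value: float) -> None:
    records.append((time.time(), step_id, kind, value))

# Example: data captured while the patient performs step 2 of a plan.
record(step_id=2, kind="pedal_resistance", value=35.0)
record(step_id=2, kind="heart_rate_bpm", value=102)
record(step_id=2, kind="range_of_motion_deg", value=88.5)

by_step = defaultdict(list)
for timestamp, step_id, kind, value in records:
    by_step[step_id].append((kind, value))
print(dict(by_step))
```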
- Data may be collected from the treatment apparatuses and/or any suitable computing device (e.g., computing devices where personal information is entered, such as a clinician interface or patient interface) over time as the patients use the treatment apparatuses to perform the various treatment plans.
- the data that may be collected may include the characteristics of the patients, the treatment plans performed by the patients, and the results of the treatment plans.
- the data may be processed to group certain people into cohorts.
- the people may be grouped by people having certain or selected similar characteristics, treatment plans, and results of performing the treatment plans. For example, athletic people having no medical conditions who perform a treatment plan (e.g., use the treatment apparatus for 30 minutes a day 5 times a week for 3 weeks) and who fully recover may be grouped into a first cohort. Older people who are classified obese and who perform a treatment plan (e.g., use the treatment plan for 10 minutes a day 3 times a week for 4 weeks) and who improve their range of motion by 75 percent may be grouped into a second cohort.
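- The two example cohorts above could be expressed, as a non-limiting illustration, by simple rule-based grouping such as the following sketch; the thresholds and dictionary keys are hypothetical assumptions.

```python
# Hypothetical sketch of grouping people into cohorts by selected similar
# characteristics and results, mirroring the two example cohorts above.
def assign_cohort(person: dict) -> str:
    if person["athletic"] and not person["medical_conditions"] and person["fully_recovered"]:
        return "cohort_A"  # e.g., 30 min/day, 5x/week for 3 weeks, full recovery
    if person["age"] >= 65 and person["bmi"] >= 30 and person["rom_improvement_pct"] >= 75:
        return "cohort_B"  # e.g., 10 min/day, 3x/week for 4 weeks, 75% ROM gain
    return "cohort_other"

print(assign_cohort({"athletic": True, "medical_conditions": [], "fully_recovered": True,
                     "age": 28, "bmi": 23, "rom_improvement_pct": 100}))
```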
- an artificial intelligence engine may include one or more machine learning models that are trained using the cohorts.
- the one or more machine learning models may be trained to receive an input of characteristics of a new patient and to output a treatment plan for the patient that results in a desired result.
- the machine learning models may match a pattern between the characteristics of the new patient and at least one patient of the patients included in a particular cohort. When a pattern is matched, the machine learning models may assign the new patient to the particular cohort and select the treatment plan associated with the at least one patient.
- the artificial intelligence engine may be configured to control, distally and based on the treatment plan, the treatment apparatus while the new patient uses the treatment apparatus to perform the treatment plan.
- the characteristics of the new patient may change as the new patient uses the treatment apparatus to perform the treatment plan. For example, the performance of the patient may improve quicker than expected for people in the cohort to which the new patient is currently assigned. Accordingly, the machine learning models may be trained to dynamically reassign, based on the changed characteristics, the new patient to a different cohort that includes people having characteristics similar to the now-changed characteristics of the new patient. For example, a clinically obese patient may lose weight and no longer meet the weight criterion for the initial cohort, resulting in the patient being reassigned to a different cohort with a different weight criterion.
- a different treatment plan may be selected for the new patient, and the treatment apparatus may be controlled, distally and based on the different treatment plan, while the new patient uses the treatment apparatus to perform the treatment plan.
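- A minimal sketch of the dynamic reassignment and re-derivation of apparatus settings described above is shown below; the cohort criteria, weight threshold, and resistance values are hypothetical assumptions.

```python
# Hypothetical sketch: reassign a patient whose characteristics change
# mid-plan and re-derive apparatus settings from the new cohort's plan.
COHORT_PLANS = {
    "cohort_obese":   {"resistance": 20},
    "cohort_general": {"resistance": 35},
}

def reassign(weight_kg: float) -> str:
    return "cohort_obese" if weight_kg >= 100 else "cohort_general"

def apparatus_settings(cohort: str) -> dict:
    return {"pedal_resistance": COHORT_PLANS[cohort]["resistance"]}

current = reassign(weight_kg=104)   # initial assignment
updated = reassign(weight_kg=97)    # patient loses weight mid-plan
print(current, apparatus_settings(current))
print(updated, apparatus_settings(updated))
```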
- Such techniques may provide the technical solution of distally controlling a treatment apparatus. Further, the techniques may lead to faster recovery times and/or better results for the patients because the treatment plan that most accurately fits their characteristics is selected and implemented, in real-time, at any given moment.
- Real-time may refer to less than or equal to 2 seconds. Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface, and will generally be less than 10 seconds but greater than 2 seconds.
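- The latency thresholds stated above can be illustrated with a trivial helper; the function name is a hypothetical assumption.

```python
# Hypothetical sketch of the stated thresholds: real-time is <= 2 seconds,
# near real-time is greater than 2 but generally less than 10 seconds.
def classify_latency(seconds: float) -> str:
    if seconds <= 2:
        return "real-time"
    if seconds < 10:
        return "near real-time"
    return "not real-time"

print(classify_latency(1.2), classify_latency(5.0), classify_latency(12.0))
```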
- the term “results” may refer to medical results or medical outcomes. Results and outcomes may refer to responses to medical actions.
- the treatment plans may be presented, during a telemedicine or telehealth session, to a medical professional.
- the medical professional may select a particular treatment plan for the patient to cause that treatment plan to be transmitted to the patient and/or to control, based on the treatment plan, the treatment apparatus.
- the artificial intelligence engine may receive and/or operate distally from the patient and the treatment apparatus.
- the recommended treatment plans and/or excluded treatment plans may be presented simultaneously with a video of the patient in real-time or near real-time during a telemedicine or telehealth session on a user interface of a computing device of a medical professional.
- the video may also be accompanied by audio, text and other multimedia information.
- Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the medical professional may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface.
- the enhanced user interface may improve the medical professional’s experience using the computing device and may encourage the medical professional to reuse the user interface.
- Such a technique may also reduce computing resources (e.g., processing, memory, network) because the medical professional does not have to switch to another user interface screen to enter a query for a treatment plan to recommend based on the characteristics of the patient.
- the artificial intelligence engine provides, dynamically on the fly, the treatment plans and excluded treatment plans.
- the treatment plan may be modified by a medical professional. For example, certain procedures may be added, modified or removed. In the telehealth scenario, there are certain procedures that may not be performed due to the distal nature of a medical professional using a computing device in a different physical location than a patient.
- a potential technical problem may relate to the information pertaining to the patient’s medical condition being received in disparate formats.
- a server may receive the information pertaining to a medical condition of the patient from one or more sources (e.g., from an electronic medical record (EMR) system, application programming interface (API), or any suitable system that has information pertaining to the medical condition of the patient).
- some embodiments of the present disclosure may use an API to obtain, via interfaces exposed by APIs used by the sources, the formats used by the sources.
- the API may map and convert the format used by the sources to a standardized (i.e., canonical) format, language and/or encoding (“format” as used herein will be inclusive of all of these terms) used by the artificial intelligence engine.
- the information converted to the standardized format used by the artificial intelligence engine may be stored in a database accessed by the artificial intelligence engine when the artificial intelligence engine is performing any of the techniques disclosed herein. Using the information converted to a standardized format may enable a more accurate determination of the procedures to perform for the patient and/or a billing sequence to use for the patient.
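- As a non-limiting illustration of mapping source-specific formats to the standardized format used by the artificial intelligence engine, the following sketch converts two hypothetical EMR field layouts to one canonical set of keys; the vendor names and field mappings are assumptions, not real EMR or API schemas.

```python
# Hypothetical sketch: map source-specific EMR field names to a single
# standardized (canonical) set of keys consumed by the engine.
SOURCE_FIELD_MAP = {
    "emr_vendor_x": {"pt_dob": "date_of_birth", "dx_code": "diagnosis_code"},
    "emr_vendor_y": {"birthDate": "date_of_birth", "icd10": "diagnosis_code"},
}

def standardize(source: str, record: dict) -> dict:
    mapping = SOURCE_FIELD_MAP[source]
    return {mapping[key]: value for key, value in record.items() if key in mapping}

print(standardize("emr_vendor_x", {"pt_dob": "1980-04-02", "dx_code": "M17.11"}))
print(standardize("emr_vendor_y", {"birthDate": "1980-04-02", "icd10": "M17.11"}))
# Both sources yield the same canonical keys for the engine to consume.
```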
- a server may receive the information pertaining to a medical condition of the patient from one or more sources (e.g., from an electronic medical record (EMR) system, application programming interface (API), or any suitable system that has information pertaining to the medical condition of the patient).
- the information may be converted from the format used by the sources to the standardized format used by the artificial intelligence engine.
- the information converted to the standardized format used by the artificial intelligence engine may be stored in a database accessed by the artificial intelligence engine when performing any of the techniques disclosed herein.
- the standardized information may enable generating optimal treatment plans, where the generating is based on treatment plans associated with the standardized information, monetary value amounts, and the set of constraints.
- the optimal treatment plans may be provided in a standardized format that can be processed by various applications (e.g., telehealth) executing on various computing devices of medical professionals and/or patients.
- a technical problem may include the challenge of enabling one medical professional to treat numerous patients at the same time.
- a technical solution to the technical problem may include the enablement of at least one medical professional or a group of medical professionals, wherein one medical professional may participate at one time and a different medical professional may participate at another time, to treat numerous patients at the same time.
- a single medical professional (or “one medical professional” or equivalent) shall be deemed inclusive of all of the scenarios just recited. For example, in group therapy or recovery sessions, it may be desirable for a single medical professional to view, monitor, treat, manage, diagnose, etc. more than one patient at the same time from a distal location.
- a virtual avatar is used to guide each patient through an exercise session of a treatment plan.
- the medical professional may use a computing device to view, monitor, treat, manage, diagnose, etc. the patients at once or in temporally close ranges.
- in response to a trigger event (such as a user indicating they are in a substantial amount of pain), a telemedicine session is initiated either by selection or electronically.
- the telemedicine session causes the virtual avatar to be replaced on the computing device of the patient with a multimedia feed from the computing device of the medical professional.
- the medical professional may select to intervene and/or interrupt any patient’s treatment plan (including, for example and without limitation, an exercise, rehabilitation, prehabilitation, or other session) as desired (e.g., when the medical professional determines a sensor measurement is undesired, the patient is not performing as desired, etc.), while the other patients continue to follow the virtual avatar to perform the exercise session.
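- A minimal sketch of the trigger-event flow described above follows: when one patient in a group session reports substantial pain, only that patient's presentation switches from the virtual avatar to the medical professional's feed. The event name, pain threshold, and session states are hypothetical assumptions.

```python
# Hypothetical sketch: a pain report above a threshold initiates a
# telemedicine session for that patient only; the others keep the avatar.
sessions = {"patient_1": "avatar", "patient_2": "avatar", "patient_3": "avatar"}

def on_pain_report(patient_id: str, pain_level: int, threshold: int = 7) -> None:
    if pain_level >= threshold:
        sessions[patient_id] = "live_clinician_feed"  # telemedicine session initiated

on_pain_report("patient_2", pain_level=8)
print(sessions)
# {'patient_1': 'avatar', 'patient_2': 'live_clinician_feed', 'patient_3': 'avatar'}
```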
- the treatment apparatus may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient.
- the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user.
- a medical professional may adapt, remotely during a telemedicine session, the treatment apparatus to the needs of the patient by causing a control instruction to be transmitted from a server to the treatment apparatus.
- Such adaptive nature may improve the results of recovery for a patient, furthering the goals of personalized medicine, and enabling personalization of the treatment plan on a per-individual basis.
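- As an illustrative assumption of what a server-issued control instruction for adapting the treatment apparatus might look like, the following sketch builds a message that adjusts pedal position (and therefore range of motion) and resistance; the message shape and field names are hypothetical.

```python
# Hypothetical sketch of a control instruction sent from a server to a
# treatment apparatus during a telemedicine session.
import json

def build_control_instruction(apparatus_id: str, pedal_radius_cm: float,
                              resistance_level: int) -> str:
    instruction = {
        "apparatus_id": apparatus_id,
        "pedal_radius_cm": pedal_radius_cm,   # smaller radius -> smaller range of motion
        "resistance_level": resistance_level,
    }
    return json.dumps(instruction)

# e.g., a clinician reduces range of motion remotely during a session
print(build_control_instruction("bike-4070", pedal_radius_cm=6.5, resistance_level=3))
```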
- FIG. 36 shows a block diagram of a computer-implemented system 4010, hereinafter called “the system,” for managing a treatment plan.
- Managing the treatment plan may include using an artificial intelligence engine to recommend treatment plans and/or provide excluded treatment plans that should not be recommended to a patient.
- the system 4010 also includes a server 4030 configured to store and to provide data related to managing the treatment plan.
- the server 4030 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers.
- the server 4030 also includes a first communication interface 4032 configured to communicate with the clinician interface 4020 via a first network 4034. In some embodiments, the first network 4034 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the server 4030 includes a first processor 4036 and a first machine-readable storage memory 4038, which may be called a “memory” for short, holding first instructions 4040 for performing the various actions of the server 4030 for execution by the first processor 4036.
- the server 4030 is configured to store data regarding the treatment plan.
- the memory 4038 includes a system data store 4042 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients.
- the system data store 4042 may be configured to hold data relating to billing procedures, including rules and constraints pertaining to billing codes, order, timing, insurance regimes, laws, regulations, or some combination thereof.
- the system data store 4042 may be configured to store various billing sequences generated based on billing procedures and various parameters (e.g., monetary value amount generated, patient outcome, plan of reimbursement, fees, a payment plan for patients to pay off an amount of money owed, an amount of revenue to be paid to an insurance provider, etc.).
- the system data store 4042 may be configured to store optimal treatment plans generated based on various treatment plans for users having similar medical conditions, monetary value amounts generated by the treatment plans, and the constraints. Any of the data stored in the system data store 4042 may be accessed by an artificial intelligence engine 4011 when performing any of the techniques described herein.
- the server 4030 is also configured to store data regarding performance by a patient in following a treatment plan.
- the memory 4038 includes a patient data store 4044 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient’s performance within the treatment plan.
- correlations and other statistical or probabilistic measures applied to the characteristics (e.g., personal, performance, measurement, etc.) of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may be used to partition the treatment plans into different patient cohort-equivalent databases in the patient data store 4044.
- the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patient, and a first result of the treatment plan may be stored in a first patient database.
- the data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patient, and a second result of the treatment plan may be stored in a second patient database. Any single characteristic or any combination of characteristics may be used to separate the cohorts of patients.
- the different cohorts of patients may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different cohorts of patients allowed, other than as limited by mathematical combinatoric and/or partition theory.
- This characteristic data, treatment plan data, and results data may be obtained from numerous treatment apparatuses and/or computing devices over time and stored in the patient data store.
- the characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 4044.
- the characteristics of the people may include personal information, performance information, and/or measurement information.
- characteristics about a current patient being treated may be stored in an appropriate patient cohort-equivalent database.
- the characteristics of the patient may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A) and the patient may be assigned to that cohort.
- the server 4030 may execute the artificial intelligence (AI) engine 4011 that uses one or more machine learning models 4013 to perform at least one of the embodiments disclosed herein.
- the server 4030 may include a training engine 409 capable of generating the one or more machine learning models 4013.
- the machine learning models 4013 may be trained to assign people to certain cohorts based on their characteristics, select treatment plans using real-time and historical data correlations involving patient cohort-equivalents, and control a treatment apparatus 4070, among other things.
- the machine learning models 4013 may be trained to generate, based on billing procedures, billing sequences and/or treatment plans tailored for various parameters (e.g., a fee to be paid to a medical professional, a payment plan for the patient to pay off an amount of money owed, a plan of reimbursement, an amount of revenue to be paid to an insurance provider, or some combination thereof).
- the machine learning models 4013 may be trained to generate, based on constraints, optimal treatment plans tailored for various parameters (e.g., monetary value amount generated, patient outcome, risk, etc.).
- the one or more machine learning models 4013 may be generated by the training engine 409 and may be implemented in computer instructions executable by one or more processing devices of the training engine 409 and/or the server 4030. To generate the one or more machine learning models 4013, the training engine 409 may train the one or more machine learning models 4013.
- the one or more machine learning models 4013 may be used by the artificial intelligence engine 4011.
- the training engine 409 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other desired computing device, or any combination of the above.
- the training engine 409 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
- the training engine 409 may use a training data set of a corpus of the information (e.g., characteristics, medical diagnosis codes, etc.) pertaining to medical conditions of the people who used the treatment apparatus 4070 to perform treatment plans, the details (e.g., treatment protocol including exercises, amount of time to perform the exercises, instructions for the patient to follow, how often to perform the exercises, a schedule of exercises, parameters/configurations/settings of the treatment apparatus 4070 throughout each step of the treatment plan, etc.) of the treatment plans performed by the people using the treatment apparatus 4070, the results of the treatment plans performed by the people, a set of monetary value amounts associated with the treatment plans, a set of constraints (e.g., rates pertaining to billing codes associated with the set of treatment plans, laws, regulations, etc.), a set of billing procedures (e.g., rules pertaining to billing codes, order, timing and constraints) associated with treatment plan instructions, a set of parameters (e.g., a
- the one or more machine learning models 4013 may be trained to match patterns of characteristics of a patient with characteristics of other people assigned to a particular cohort.
- the term “match” may refer to an exact match, a correlative match, a substantial match, etc.
- the one or more machine learning models 4013 may be trained to receive the characteristics of a patient as input, map the characteristics to characteristics of people assigned to a cohort, and select a treatment plan from that cohort.
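- A minimal, non-limiting sketch of the pattern-matching behavior described above follows: a new patient's characteristic vector is mapped to the nearest cohort and that cohort's treatment plan is selected. The features, centroids, and plans are hypothetical assumptions; the disclosed machine learning models 4013 may be arbitrarily more complex.

```python
# Hypothetical sketch: map a patient's characteristics to the nearest cohort
# centroid and return that cohort's treatment plan.
import math

COHORT_CENTROIDS = {   # (age, body mass index, baseline range of motion in degrees)
    "cohort_A": (30, 23.0, 110.0),
    "cohort_B": (68, 32.0, 70.0),
}
COHORT_PLANS = {"cohort_A": "30 min/day, 5x/week", "cohort_B": "10 min/day, 3x/week"}

def select_treatment_plan(features: tuple) -> str:
    nearest = min(COHORT_CENTROIDS,
                  key=lambda cohort: math.dist(features, COHORT_CENTROIDS[cohort]))
    return COHORT_PLANS[nearest]

print(select_treatment_plan((66, 31.0, 72.0)))  # -> "10 min/day, 3x/week"
```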
- the one or more machine learning models 4013 may also be trained to control, based on the treatment plan, the treatment apparatus 4070.
- the one or more machine learning models 4013 may be trained to match patterns of a first set of parameters (e.g., treatment plans for patients having a medical condition, a set of monetary value amounts associated with the treatment plans, patient outcome, and/or a set of constraints) with a second set of parameters associated with an optimal treatment plan.
- the one or more machine learning models 4013 may be trained to receive the first set of parameters as input, map the characteristics to the second set of parameters associated with the optimal treatment plan, and select the optimal treatment plan as the treatment plan for the patient.
- the one or more machine learning models 4013 may also be trained to control, based on the treatment plan, the treatment apparatus 4070.
- the one or more machine learning models 4013 may be trained to match patterns of a first set of parameters (e.g., information pertaining to a medical condition, treatment plans for patients having a medical condition, a set of monetary value amounts associated with the treatment plans, patient outcomes, instructions for the patient to follow in a treatment plan, a set of billing procedures associated with the instructions, and/or a set of constraints) with a second set of parameters associated with a billing sequence and/or optimal treatment plan.
- the one or more machine learning models 4013 may be trained to receive the first set of parameters as input, map or otherwise associate or algorithmically associate the first set of parameters to the second set of parameters associated with the billing sequence and/or optimal treatment plan, and select the billing sequence and/or optimal treatment plan for the patient.
- one or more optimal treatment plans may be selected to be provided to a computing device of the medical professional and/or the patient.
- the one or more machine learning models 4013 may also be trained to control, based on the treatment plan, the treatment apparatus 4070.
- Different machine learning models 4013 may be trained to recommend different treatment plans tailored for different parameters. For example, one machine learning model may be trained to recommend treatment plans for a maximum monetary value amount generated, while another machine learning model may be trained to recommend treatment plans based on patient outcome, or based on any combination of monetary value amount and patient outcome, or based on those and/or additional goals. Also, different machine learning models 4013 may be trained to recommend different billing sequences tailored for different parameters. For example, one machine learning model may be trained to recommend billing sequences for a maximum fee to be paid to a medical professional, while another machine learning model may be trained to recommend billing sequences based on a plan of reimbursement.
- the one or more machine learning models 4013 may refer to model artifacts created by the training engine 409.
- the training engine 409 may find patterns in the training data wherein such patterns map the training input to the target output and generate the machine learning models 4013 that capture these patterns.
- the artificial intelligence engine 4011, the database, and/or the training engine 409 may reside on another component (e.g., assistant interface 4094, clinician interface 4020, etc.) depicted in FIG. 36.
- the one or more machine learning models 4013 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 4013 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations.
- deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself).
- the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
- the system 4010 also includes a patient interface 4050 configured to communicate information to a patient and to receive feedback from the patient.
- the patient interface includes an input device 4052 and an output device 4054, which may be collectively called a patient user interface 4052, 4054.
- the input device 4052 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition.
- the output device 4054 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch.
- the output device 4054 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc.
- the output device 4054 may incorporate various different visual, audio, or other presentation technologies.
- the output device 4054 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions.
- the output device 4054 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient.
- the output device 4054 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the output device 4054 may present a user interface that may present a recommended treatment plan, billing sequence, or the like to the patient.
- the user interface may include one or more graphical elements that enable the user to select which treatment plan to perform. Responsive to receiving a selection of a graphical element (e.g., “Start” button) associated with a treatment plan via the input device 4052, the patient interface 4050 may communicate a control signal to the controller 4072 of the treatment apparatus 4070, wherein the control signal causes the treatment apparatus 4070 to begin execution of the selected treatment plan.
- the control signal may control, based on the selected treatment plan, the treatment apparatus 4070 by causing actuation of the actuator 4078 (e.g., cause a motor to drive rotation of pedals of the treatment apparatus at a certain speed), causing measurements to be obtained via the sensor 4076, or the like.
- the patient interface 4050 may communicate, via a local communication interface 4068, the control signal to the treatment apparatus 4070.
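- The “Start” selection flow described above might be sketched, under hypothetical assumptions about the message fields and controller behavior, as follows.

```python
# Hypothetical sketch: a "Start" selection on the patient interface produces
# a control signal that sets motor speed and enables sensor sampling.
class TreatmentApparatusController:
    def __init__(self) -> None:
        self.motor_rpm = 0
        self.sampling = False

    def apply(self, signal: dict) -> None:
        self.motor_rpm = signal.get("motor_rpm", self.motor_rpm)
        self.sampling = signal.get("enable_sensors", self.sampling)

def on_start_pressed(plan: dict, controller: TreatmentApparatusController) -> None:
    controller.apply({"motor_rpm": plan["target_rpm"], "enable_sensors": True})

controller = TreatmentApparatusController()
on_start_pressed({"target_rpm": 45}, controller)
print(controller.motor_rpm, controller.sampling)  # 45 True
```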
- the patient interface 4050 includes a second communication interface 4056, which may also be called a remote communication interface configured to communicate with the server 4030 and/or the clinician interface 4020 via a second network 4058.
- the second network 4058 may include a local area network (LAN), such as an Ethernet network.
- the second network 4058 may include the Internet, and communications between the patient interface 4050 and the server 4030 and/or the clinician interface 4020 may be secured via encryption, such as, for example, by using a virtual private network (VPN).
- the second network 4058 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. In some embodiments, the second network 4058 may be the same as and/or operationally coupled to the first network 4034.
- the patient interface 4050 includes a second processor 4060 and a second machine-readable storage memory 4062 holding second instructions 4064 for execution by the second processor 4060 for performing various actions of the patient interface 4050.
- the second machine-readable storage memory 4062 also includes a local data store 4066 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient’s performance within a treatment plan.
- the patient interface 4050 also includes a local communication interface 4068 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 4050.
- the local communication interface 4068 may include wired and/or wireless communications.
- the local communication interface 4068 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
- the system 4010 also includes a treatment apparatus 4070 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan.
- the treatment apparatus 4070 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group.
- the treatment apparatus 4070 may be any suitable medical, rehabilitative, therapeutic, etc. apparatus.
- the treatment apparatus 4070 may be an electromechanical machine including one or more weights, an electromechanical bicycle, an electromechanical spin-wheel, a smart mirror, a treadmill, a vibratory apparatus, or the like.
- the body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder.
- the body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament.
- the treatment apparatus 4070 includes a controller 4072, which may include one or more processors, computer memory, and/or other components.
- the treatment apparatus 4070 also includes a fourth communication interface 4074 configured to communicate with the patient interface 4050 via the local communication interface 4068.
- the treatment apparatus 4070 also includes one or more internal sensors 4076 and an actuator 4078, such as a motor.
- the actuator 4078 may be used, for example, for moving the patient’s body part and/or for resisting forces by the patient.
- the internal sensors 4076 may measure one or more operating characteristics of the treatment apparatus 4070 such as, for example, a force, a position, a speed, a velocity, and/or an acceleration.
- the internal sensors 4076 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient.
- an internal sensor 4076 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment apparatus 4070, where such distance may correspond to a range of motion that the patient’s body part is able to achieve.
- the internal sensors 4076 may include a force sensor configured to measure a force applied by the patient.
- an internal sensor 4076 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment apparatus 4070.
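- As an illustrative assumption, the following sketch derives a range-of-motion estimate from position-sensor samples and summarizes force-sensor readings; the sample values and units are hypothetical.

```python
# Hypothetical sketch: summarize position-sensor and force-sensor samples.
knee_angle_samples_deg = [12.0, 35.5, 61.0, 84.5, 96.0, 90.5, 40.0]
force_samples_n = [110.0, 145.0, 160.0, 152.0]

range_of_motion_deg = max(knee_angle_samples_deg) - min(knee_angle_samples_deg)
peak_force_n = max(force_samples_n)
mean_force_n = sum(force_samples_n) / len(force_samples_n)

print(f"ROM: {range_of_motion_deg:.1f} deg, peak force: {peak_force_n:.0f} N, "
      f"mean force: {mean_force_n:.1f} N")
```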
- the system 4010 shown in FIG. 36 also includes an ambulation sensor 4082, which communicates with the server 4030 via the local communication interface 4068 of the patient interface 4050.
- the ambulation sensor 4082 may track and store a number of steps taken by the patient.
- the ambulation sensor 4082 may take the form of a wristband, wristwatch, or smart watch.
- the ambulation sensor 4082 may be integrated within a phone, such as a smartphone.
- the ambulation sensor 4082 may be integrated within an article of clothing, such as a shoe, a belt, and/or pants.
- the system 4010 shown in FIG. 36 also includes a goniometer 4084, which communicates with the server 4030 via the local communication interface 4068 of the patient interface 4050.
- the goniometer 4084 measures an angle of the patient’s body part.
- the goniometer 4084 may measure the angle of flex of a patient’s knee or elbow or shoulder.
- the system 4010 shown in FIG. 36 also includes a pressure sensor 4086, which communicates with the server 4030 via the local communication interface 4068 of the patient interface 4050.
- the pressure sensor 4086 measures an amount of pressure or weight applied by a body part of the patient.
- the pressure sensor 4086 may measure an amount of force applied by a patient’s foot when pedaling a stationary bike.
- the system 4010 shown in FIG. 36 also includes a supervisory interface 4090 which may be similar or identical to the clinician interface 4020. In some embodiments, the supervisory interface 4090 may have enhanced functionality beyond what is provided on the clinician interface 4020.
- the supervisory interface 4090 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
- the system 4010 shown in FIG. 36 also includes a reporting interface 4092 which may be similar or identical to the clinician interface 4020.
- the reporting interface 4092 may have less functionality than what is provided on the clinician interface 4020.
- the reporting interface 4092 may not have the ability to modify a treatment plan.
- Such a reporting interface 4092 may be used, for example, by a biller to determine the use of the system 4010 for billing purposes.
- the reporting interface 4092 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject.
- Such a reporting interface 4092 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
- the system 4010 includes an assistant interface 4094 for an assistant, such as a doctor, a nurse, a physical therapist, or a technician, to remotely communicate with the patient interface 4050 and/or the treatment apparatus 4070.
- Such remote communications may enable the assistant to provide assistance or guidance to a patient using the system 4010.
- the assistant interface 4094 is configured to communicate a telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b with the patient interface 4050 via a network connection such as, for example, via the first network 4034 and/or the second network 4058.
- the telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b comprises one of an audio signal 4096, an audiovisual signal 4097, an interface control signal 4098a for controlling a function of the patient interface 4050, an interface monitor signal 4098b for monitoring a status of the patient interface 4050, an apparatus control signal 4099a for changing an operating parameter of the treatment apparatus 4070, and/or an apparatus monitor signal 4099b for monitoring a status of the treatment apparatus 4070.
- each of the control signals 4098a, 4099a may be unidirectional, conveying commands from the assistant interface 4094 to the patient interface 4050.
- an acknowledgement message may be sent from the patient interface 4050 to the assistant interface 4094.
- each of the monitor signals 4098b, 4099b may be unidirectional, conveying status information from the patient interface 4050 to the assistant interface 4094.
- an acknowledgement message may be sent from the assistant interface 4094 to the patient interface 4050 in response to successfully receiving one of the monitor signals 4098b, 4099b.
- the patient interface 4050 may be configured as a pass-through for the apparatus control signals 4099a and the apparatus monitor signals 4099b between the treatment apparatus 4070 and one or more other devices, such as the assistant interface 4094 and/or the server 4030.
- the patient interface 4050 may be configured to transmit an apparatus control signal 4099a in response to an apparatus control signal 4099a within the telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b from the assistant interface 4094.
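- A minimal sketch of the telemedicine signal types listed above, and of the patient interface acting as a pass-through for apparatus control and monitor signals, follows; the enumeration names and routing logic are hypothetical assumptions.

```python
# Hypothetical sketch of the telemedicine signal types and of the patient
# interface passing apparatus signals through rather than handling them.
from enum import Enum, auto

class SignalType(Enum):
    AUDIO = auto()              # 4096
    AUDIOVISUAL = auto()        # 4097
    INTERFACE_CONTROL = auto()  # 4098a
    INTERFACE_MONITOR = auto()  # 4098b
    APPARATUS_CONTROL = auto()  # 4099a
    APPARATUS_MONITOR = auto()  # 4099b

def route(signal_type: SignalType) -> str:
    """Return how the patient interface treats a received signal."""
    if signal_type in (SignalType.APPARATUS_CONTROL, SignalType.APPARATUS_MONITOR):
        return "pass-through between treatment apparatus and assistant interface/server"
    return "handled by patient interface"

print(route(SignalType.APPARATUS_CONTROL))
print(route(SignalType.INTERFACE_MONITOR))
```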
- the assistant interface 4094 may be presented on the same physical device as the clinician interface 4020.
- the clinician interface 4020 may include one or more screens that implement the assistant interface 4094.
- the clinician interface 4020 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 4094.
- one or more portions of the telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 4054 of the patient interface 4050.
- a tutorial video may be streamed from the server 4030 and presented upon the patient interface 4050.
- Content from the prerecorded source may be requested by the patient via the patient interface 4050.
- via a control on the assistant interface 4094, the assistant may cause content from the prerecorded source to be played on the patient interface 4050.
- the assistant interface 4094 includes an assistant input device 4022 and an assistant display 4024, which may be collectively called an assistant user interface 4022, 4024.
- the assistant input device 4022 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example.
- the assistant input device 4022 may include one or more microphones.
- the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the assistant to speak to a patient via the patient interface 4050.
- assistant input device 4022 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the assistant by using the one or more microphones.
- the assistant input device 4022 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung.
- the assistant input device 4022 may include other hardware and/or software components.
- the assistant input device 4022 may include one or more general purpose devices and/or special-purpose devices.
- the assistant display 4024 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch.
- the assistant display 4024 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc.
- the assistant display 4024 may incorporate various different visual, audio, or other presentation technologies.
- the assistant display 4024 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions.
- the assistant display 4024 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the assistant.
- the assistant display 4024 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
- the system 4010 may provide computer translation of language from the assistant interface 4094 to the patient interface 4050 and/or vice-versa.
- the computer translation of language may include computer translation of spoken language and/or computer translation of text.
- the system 4010 may provide voice recognition and/or spoken pronunciation of text.
- the system 4010 may convert spoken words to printed text and/or the system 4010 may audibly speak language from printed text.
- the system 4010 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the assistant.
- the system 4010 may be configured to recognize and react to spoken requests or commands by the patient.
- the system 4010 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
- the server 4030 may generate aspects of the assistant display 4024 for presentation by the assistant interface 4094.
- the server 4030 may include a web server configured to generate the display screens for presentation upon the assistant display 4024.
- the artificial intelligence engine 4011 may generate treatment plans, billing sequences, and/or excluded treatment plans for patients and generate the display screens including those treatment plans, billing sequences, and/or excluded treatment plans for presentation on the assistant display 4024 of the assistant interface 4094.
- the assistant display 4024 may be configured to present a virtualized desktop hosted by the server 4030.
- the server 4030 may be configured to communicate with the assistant interface 4094 via the first network 4034.
- the first network 4034 may include a local area network (LAN), such as an Ethernet network.
- the first network 4034 may include the Internet, and communications between the server 4030 and the assistant interface 4094 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN).
- the server 4030 may be configured to communicate with the assistant interface 4094 via one or more networks independent of the first network 4034 and/or other communication means, such as a direct wired or wireless communication channel.
- the patient interface 4050 and the treatment apparatus 4070 may each operate from a patient location geographically separate from a location of the assistant interface 4094.
- the patient interface 4050 and the treatment apparatus 4070 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 4094 at a centralized location, such as a clinic or a call center.
- the assistant interface 4094 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians’ offices. In some embodiments, a plurality of assistant interfaces 4094 may be distributed geographically. In some embodiments, a person may work as an assistant remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 4094 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part-time and/or flexible work hours for an assistant.
- FIGS. 37-38 show an embodiment of a treatment apparatus 4070. More specifically, FIG. 37 shows a treatment apparatus 4070 in the form of a stationary cycling machine 4100, which may be called a stationary bike, for short.
- the stationary cycling machine 4100 includes a set of pedals 4102 each attached to a pedal arm 4104 for rotation about an axle 4106.
- the pedals 4102 are movable on the pedal arms 4104 in order to adjust a range of motion used by the patient in pedaling.
- the pedals being located inwardly toward the axle 4106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 4106.
- a pressure sensor 4086 is attached to or embedded within one or more of the pedals 4102 for measuring an amount of force applied by the patient on the pedal 4102.
- the pressure sensor 4086 may communicate wirelessly to the treatment apparatus 4070 and/or to the patient interface 4050.
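- By way of illustration only, the following sketch shows the kind of measurement payload the pedal pressure sensor 4086 might transmit wirelessly to the patient interface 4050; the field names, units, and JSON encoding are assumptions made for this example and are not specified by the disclosure.

```python
import json
import time

def pressure_reading_payload(sensor_id: str, force_lbs: float) -> bytes:
    """Encode one pedal-force measurement for wireless transmission (format assumed)."""
    reading = {"sensor_id": sensor_id, "force_lbs": force_lbs, "timestamp": time.time()}
    return json.dumps(reading).encode("utf-8")

# Example: the right pedal sensor reports 12.5 lbs. of force.
print(pressure_reading_payload("pedal-right-4086", 12.5))
```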
- FIG. 39 shows a person (a patient) using the treatment apparatus of FIG. 37, and showing sensors and various data parameters connected to a patient interface 4050.
- the example patient interface 4050 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient.
- the patient interface 4050 may be embedded within or attached to the treatment apparatus 4070.
- FIG. 39 shows the patient wearing the ambulation sensor 4082 on his wrist, with a note showing “STEPS TODAY 41355”, indicating that the ambulation sensor 4082 has recorded and transmitted that step count to the patient interface 4050.
- FIG. 39 also shows the patient wearing the goniometer 4084 on his right knee, with a note showing “KNEE ANGLE 72°”, indicating that the goniometer 4084 is measuring and transmitting that knee angle to the patient interface 4050.
- FIG. 39 also shows a right side of one of the pedals 4102 with a pressure sensor 4086 showing “FORCE 12.5 lbs.,” indicating that the right pedal pressure sensor 4086 is measuring and transmitting that force measurement to the patient interface 4050.
- FIG. 39 also shows a left side of one of the pedals 4102 with a pressure sensor 4086 showing “FORCE 27 lbs.”, indicating that the left pedal pressure sensor 4086 is measuring and transmitting that force measurement to the patient interface 4050.
- FIG. 39 also shows other patient data, such as an indicator of “SESSION TIME 0:04:13”, indicating that the patient has been using the treatment apparatus 4070 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 4050 based on information received from the treatment apparatus 4070.
- FIG. 39 also shows an indicator showing “PAIN LEVEL 3”. Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 4050.
- FIG. 40 is an example embodiment of an overview display 4120 of the assistant interface 4094. Specifically, the overview display 4120 presents several different controls and interfaces for the assistant to remotely assist a patient with using the patient interface 4050 and/or the treatment apparatus 4070. This remote assistance functionality may also be called telemedicine or telehealth.
- the overview display 4120 includes a patient profile display 4130 presenting biographical information regarding a patient using the treatment apparatus 4070.
- the patient profile display 4130 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40, although the patient profile display 4130 may take other forms, such as a separate screen or a popup window.
- the patient profile display 4130 may include a limited subset of the patient’s biographical information. More specifically, the data presented upon the patient profile display 4130 may depend upon the assistant’s need for that information.
- the patient profile display 4130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements.
- privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a “data subject”.
- the patient profile display 4130 may present information regarding the treatment plan for the patient to follow in using the treatment apparatus 4070.
- Such treatment plan information may be limited to an assistant who is a medical professional, such as a doctor or physical therapist.
- a medical professional assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment apparatus 4070 may not be provided with any information regarding the patient’s treatment plan.
- one or more recommended treatment plans and/or excluded treatment plans may be presented in the patient profile display 4130 to the assistant.
- the one or more recommended treatment plans and/or excluded treatment plans may be generated by the artificial intelligence engine 4011 of the server 4030 and received from the server 4030 in real-time during, inter alia, a telemedicine or telehealth session.
- An example of presenting the one or more recommended treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 42.
- one or more treatment plans and/or billing sequences associated with the treatment plans may be presented in the patient profile display 4130 to the assistant.
- the one or more treatment plans and/or billing sequences associated with the treatment plans may be generated by the artificial intelligence engine 4011 of the server 4030 and received from the server 4030 in real-time during, inter alia, a telehealth session.
- An example of presenting the one or more treatment plans and/or billing sequences associated with the treatment plans is described below with reference to FIG. 44.
- one or more treatment plans and associated monetary value amounts generated, patient outcomes, and risks associated with the treatment plans may be presented in the patient profile display 4130 to the assistant.
- the one or more treatment plans and associated monetary value amounts generated, patient outcomes, and risks associated with the treatment plans may be generated by the artificial intelligence engine 4011 of the server 4030 and received from the server 4030 in real-time during, inter alia, a telehealth session.
- An example of presenting the one or more treatment plans and associated monetary value amounts generated, patient outcomes, and risks associated with the treatment plans is described below with reference to FIG. 47.
- the example overview display 4120 shown in FIG. 40 also includes a patient status display 4134 presenting status information regarding a patient using the treatment apparatus.
- the patient status display 4134 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40, although the patient status display 4134 may take other forms, such as a separate screen or a popup window.
- the patient status display 4134 includes sensor data 4136 from one or more of the external sensors 4082, 4084, 4086, and/or from one or more internal sensors 4076 of the treatment apparatus 4070. In some embodiments, the patient status display 4134 may present other data 4138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
- User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 4020, 4050, 4090, 4092, 4094 of the system 4010.
- user access controls may be employed to control what information is available to any given person using the system 4010.
- data presented on the assistant interface 4094 may be controlled by user access controls, with permissions set depending on the assistant/user’s need for and/or qualifications to view that information.
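- As a rough sketch of how such user access controls might be applied, the example below filters a patient record down to the fields a given assistant role is permitted to view; the role names, field names, and filtering helper are illustrative assumptions rather than part of the disclosed system.

```python
# Illustrative role-based filtering of patient data for an assistant interface.
# Role names and visible-field lists are assumptions made for this sketch only.
ROLE_VISIBLE_FIELDS = {
    "medical_professional": {"name", "age", "treatment_plan", "pain_level", "sensor_data"},
    "technician": {"name", "device_status", "sensor_data"},  # no treatment plan data
}

def filter_patient_record(record: dict, role: str) -> dict:
    """Return only the fields the given assistant role is permitted to view."""
    allowed = ROLE_VISIBLE_FIELDS.get(role, set())
    return {key: value for key, value in record.items() if key in allowed}

# Example: a technician troubleshooting the apparatus never receives the treatment plan.
record = {"name": "John Doe", "treatment_plan": "cycling, 5 miles", "device_status": "online"}
print(filter_patient_record(record, "technician"))  # {'name': 'John Doe', 'device_status': 'online'}
```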
- the example overview display 4120 shown in FIG. 40 also includes a help data display 4140 presenting information for the assistant to use in assisting the patient.
- the help data display 4140 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40.
- the help data display 4140 may take other forms, such as a separate screen or a popup window.
- the help data display 4140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 4050 and/or the treatment apparatus 4070.
- the help data display 4140 may also include research data or best practices. In some embodiments, the help data display 4140 may present scripts for answers or explanations in response to patient questions.
- the help data display 4140 may present flow charts or walk-throughs for the assistant to use in determining a root cause and/or solution to a patient’s problem.
- the assistant interface 4094 may present two or more help data displays 4140, which may be the same or different, for simultaneous presentation of help data for use by the assistant. For example, a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient’s problem, and a second help data display may present script information for the assistant to read to the patient, such information to preferably include directions for the patient to perform some action, which may help to narrow down or solve the problem.
- the second help data display may automatically populate with script information.
- the example overview display 4120 shown in FIG. 40 also includes a patient interface control 4150 presenting information regarding the patient interface 4050 and/or for modifying one or more settings of the patient interface 4050.
- the patient interface control 4150 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40.
- the patient interface control 4150 may take other forms, such as a separate screen or a popup window.
- the patient interface control 4150 may present information communicated to the assistant interface 4094 via one or more of the interface monitor signals 4098b.
- the patient interface control 4150 includes a display feed 4152 of the display presented by the patient interface 4050.
- the display feed 4152 may include a live copy of the display screen currently being presented to the patient by the patient interface 4050. In other words, the display feed 4152 may present an image of what is presented on a display screen of the patient interface 4050. In some embodiments, the display feed 4152 may include abbreviated information regarding the display screen currently being presented by the patient interface 4050, such as a screen name or a screen number.
- the patient interface control 4150 may include a patient interface setting control 4154 for the assistant to adjust or to control one or more settings or aspects of the patient interface 4050. In some embodiments, the patient interface setting control 4154 may cause the assistant interface 4094 to generate and/or to transmit an interface control signal 4098 for controlling a function or a setting of the patient interface 4050.
- the patient interface setting control 4154 may include collaborative browsing or co-browsing capability for the assistant to remotely view and/or control the patient interface 4050.
- the patient interface setting control 4154 may enable the assistant to remotely enter text to one or more text entry fields on the patient interface 4050 and/or to remotely control a cursor on the patient interface 4050 using a mouse or touchscreen of the assistant interface 4094.
- the patient interface setting control 4154 may allow the assistant to change a setting that cannot be changed by the patient.
- the patient interface 4050 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 4050, the language used for the displays, whereas the patient interface setting control 4154 may enable the assistant to change the language setting of the patient interface 4050.
- the patient interface 4050 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 4050 such that the display would become illegible to the patient, whereas the patient interface setting control 4154 may provide for the assistant to change the font size setting of the patient interface 4050.
- the example overview display 4120 shown in FIG. 40 also includes an interface communications display 4156 showing the status of communications between the patient interface 4050 and one or more other devices 4070, 4082, 4084, such as the treatment apparatus 4070, the ambulation sensor 4082, and/or the goniometer 4084.
- the interface communications display 4156 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40.
- the interface communications display 4156 may take other forms, such as a separate screen or a popup window.
- the interface communications display 4156 may include controls for the assistant to remotely modify communications with one or more of the other devices 4070, 4082, 4084.
- the assistant may remotely command the patient interface 4050 to reset communications with one of the other devices 4070, 4082, 4084, or to establish communications with a new one of the other devices 4070, 4082, 4084.
- This functionality may be used, for example, where the patient has a problem with one of the other devices 4070, 4082, 4084, or where the patient receives a new or a replacement one of the other devices 4070, 4082, 4084.
- the example overview display 4120 shown in FIG. 40 also includes an apparatus control 4160 for the assistant to view and/or to control information regarding the treatment apparatus 4070.
- the apparatus control 4160 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40.
- the apparatus control 4160 may take other forms, such as a separate screen or a popup window.
- the apparatus control 4160 may include an apparatus status display 4162 with information regarding the current status of the apparatus.
- the apparatus status display 4162 may present information communicated to the assistant interface 4094 via one or more of the apparatus monitor signals 4099b.
- the apparatus status display 4162 may indicate whether the treatment apparatus 4070 is currently communicating with the patient interface 4050.
- the apparatus status display 4162 may present other current and/or historical information regarding the status of the treatment apparatus 4070.
- the apparatus control 4160 may include an apparatus setting control 4164 for the assistant to adjust or control one or more aspects of the treatment apparatus 4070.
- the apparatus setting control 4164 may cause the assistant interface 4094 to generate and/or to transmit an apparatus control signal 4099 for changing an operating parameter of the treatment apparatus 4070, (e.g., a pedal radius setting, a resistance setting, a target RPM, etc.).
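- A minimal sketch of how such an apparatus control signal 4099 might be structured and serialized is shown below; the field names, JSON encoding, and helper function are assumptions for illustration, not a definition of the actual signal format.

```python
import json
from dataclasses import dataclass, asdict

# Illustrative structure for an apparatus control signal (element 4099).
@dataclass
class ApparatusControlSignal:
    apparatus_id: str
    parameter: str   # e.g., "pedal_radius", "resistance", "target_rpm"
    value: float

def encode_control_signal(signal: ApparatusControlSignal) -> bytes:
    """Serialize the control signal for transmission to the treatment apparatus."""
    return json.dumps(asdict(signal)).encode("utf-8")

# Example: the assistant lowers the resistance setting while the patient pedals.
payload = encode_control_signal(ApparatusControlSignal("apparatus-4070", "resistance", 3.5))
print(payload)
```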
- the apparatus setting control 4164 may include a mode button 4166 and a position control 4168, which may be used in conjunction for the assistant to place an actuator 4078 of the treatment apparatus 4070 in a manual mode, after which a setting, such as a position or a speed of the actuator 4078, can be changed using the position control 4168.
- the mode button 4166 may provide for a setting, such as a position, to be toggled between automatic and manual modes.
- one or more settings may be adjustable at any time, and without having an associated auto/manual mode.
- the assistant may change an operating parameter of the treatment apparatus 4070, such as a pedal radius setting, while the patient is actively using the treatment apparatus 4070. Such “on the fly” adjustment may or may not be available to the patient using the patient interface 4050.
- the apparatus setting control 4164 may allow the assistant to change a setting that cannot be changed by the patient using the patient interface 4050.
- the patient interface 4050 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment apparatus 4070, whereas the apparatus setting control 4164 may provide for the assistant to change the height or tilt setting of the treatment apparatus 4070.
- the example overview display 4120 shown in FIG. 40 also includes a patient communications control 4170 for controlling an audio or an audiovisual communications session with the patient interface 4050.
- the communications session with the patient interface 4050 may comprise a live feed from the assistant interface 4094 for presentation by the output device of the patient interface 4050.
- the live feed may take the form of an audio feed and/or a video feed.
- the patient interface 4050 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 4094.
- the communications session with the patient interface 4050 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 4050 and the assistant interface 4094 presenting video of the other one.
- the patient interface 4050 may present video from the assistant interface 4094, while the assistant interface 4094 presents only audio or the assistant interface 4094 presents no live audio or visual signal from the patient interface 4050.
- the assistant interface 4094 may present video from the patient interface 4050, while the patient interface 4050 presents only audio or the patient interface 4050 presents no live audio or visual signal from the assistant interface 4094.
- the audio or an audiovisual communications session with the patient interface 4050 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part.
- the patient communications control 4170 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40.
- the patient communications control 4170 may take other forms, such as a separate screen or a popup window.
- the audio and/or audiovisual communications may be processed and/or directed by the assistant interface 4094 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the assistant while the assistant uses the assistant interface 4094.
- the audio and/or audiovisual communications may include communications with a third party.
- the system 4010 may enable the assistant to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a medical professional or a specialist.
- the example patient communications control 4170 shown in FIG. 40 includes call controls 4172 for the assistant to use in managing various aspects of the audio or audiovisual communications with the patient.
- the call controls 4172 include a disconnect button 4174 for the assistant to end the audio or audiovisual communications session.
- the call controls 4172 also include a mute button 4176 to temporarily silence an audio or audiovisual signal from the assistant interface 4094.
- the call controls 4172 may include other features, such as a hold button (not shown).
- the call controls 4172 also include one or more record/playback controls 4178, such as record, play, and pause buttons to control, with the patient interface 4050, recording and/or playback of audio and/or video from the teleconference session.
- the call controls 4172 also include a video feed display 4180 for presenting still and/or video images from the patient interface 4050, and a self-video display 4182 showing the current image of the assistant using the assistant interface.
- the self-video display 4182 may be presented as a picture-in-picture format, within a section of the video feed display 4180, as shown in FIG. 40. Alternatively or additionally, the self-video display 4182 may be presented separately and/or independently from the video feed display 4180.
- the example overview display 4120 shown in FIG. 40 also includes a third party communications control 4190 for use in conducting audio and/or audiovisual communications with a third party.
- the third party communications control 4190 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40.
- the third party communications control 4190 may take other forms, such as a display on a separate screen or a popup window.
- the third party communications control 4190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a medical professional or a specialist.
- the third party communications control 4190 may include conference calling capability for the third party to simultaneously communicate with both the assistant via the assistant interface 4094, and with the patient via the patient interface 4050.
- the system 4010 may provide for the assistant to initiate a 3-way conversation with the patient and the third party.
- FIG. 41 shows an example block diagram of training a machine learning model 4013 to output, based on data 4600 pertaining to the patient, a treatment plan 4602 for the patient according to the present disclosure.
- Data pertaining to other patients may be received by the server 4030.
- the other patients may have used various treatment apparatuses to perform treatment plans.
- the data may include characteristics of the other patients, the details of the treatment plans performed by the other patients, and/or the results of performing the treatment plans (e.g., a percent of recovery of a portion of the patients’ bodies, an amount of recovery of a portion of the patients’ bodies, an amount of increase or decrease in muscle strength of a portion of patients’ bodies, an amount of increase or decrease in range of motion of a portion of patients’ bodies, etc.).
- Cohort A includes data for patients having similar first characteristics, first treatment plans, and first results.
- Cohort B includes data for patients having similar second characteristics, second treatment plans, and second results.
- cohort A may include first characteristics of patients in their twenties without any medical conditions who underwent surgery for a broken limb; their treatment plans may include a certain treatment protocol (e.g., use the treatment apparatus 4070 for 30 minutes 5 times a week for 3 weeks, wherein values for the properties, configurations, and/or settings of the treatment apparatus 4070 are set to X (where X is a numerical value) for the first two weeks and to Y (where Y is a numerical value) for the last week).
- Cohort A and cohort B may be included in a training dataset used to train the machine learning model 4013.
- the machine learning model 4013 may be trained to match a pattern between characteristics for each cohort and output the treatment plan that provides the result. Accordingly, when the data 4600 for a new patient is input into the trained machine learning model 4013, the trained machine learning model 4013 may match the characteristics included in the data 4600 with characteristics in either cohort A or cohort B and output the appropriate treatment plan 4602. In some embodiments, the machine learning model 4013 may be trained to output one or more excluded treatment plans that should not be performed by the new patient.
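- The following is a minimal sketch of such cohort-based pattern matching, assuming a simple nearest-neighbor classifier and a hand-made feature encoding; the specific model type, features, and plan descriptions are illustrative assumptions and do not represent the particular machine learning model 4013 used.

```python
# Sketch: match a new patient's characteristics to a cohort and output that
# cohort's treatment plan. Requires scikit-learn; all values are illustrative.
from sklearn.neighbors import KNeighborsClassifier

# Each row: [age, has_chronic_condition (0/1), had_limb_surgery (0/1)]
training_features = [
    [25, 0, 1],   # cohort A: twenties, no medical conditions, broken-limb surgery
    [28, 0, 1],
    [67, 1, 0],   # cohort B: older patients with a chronic condition
    [72, 1, 0],
]
training_labels = ["plan_A", "plan_A", "plan_B", "plan_B"]

treatment_plans = {
    "plan_A": "30 min, 5x/week for 3 weeks; settings X for weeks 1-2, Y for week 3",
    "plan_B": "15 min, 3x/week for 6 weeks; low resistance throughout",
}

model = KNeighborsClassifier(n_neighbors=1)
model.fit(training_features, training_labels)

new_patient = [[26, 0, 1]]
predicted = model.predict(new_patient)[0]
print(treatment_plans[predicted])  # matches cohort A's protocol
```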
- FIG. 42 shows an embodiment of an overview display of the patient interface 4050 presenting a virtual avatar 4700 guiding the patient through an exercise session according to the present disclosure.
- the virtual avatar 4700 may be presented on the output device 4054 (e.g., display screen) of the patient interface 4050.
- the virtual avatar 4700 may represent a person.
- the person may be an actual individual, e.g., the medical professional, a professional athlete, the patient, a relative, a friend, a sibling, a celebrity, etc.; in other embodiments, the person may be fictional or constructed, e.g., a superhero, or the like.
- the virtual avatar may be any person, object, building, animal, being, alien, robot, or the like. For example, children may connect more strongly with animal animations to guide them through their exercise sessions.
- the virtual avatar 4700 may be selected by the patient and/or the medical professional from a library of virtual avatars stored on a database at the server 4030. In some embodiments, the virtual avatar 4700 may be uploaded into the database, including for private use by the patient and/or the medical professional.
- the library of virtual avatars may be stored at the system data store 4042 and/or the patient data store 4044. Once the virtual avatar 4700 has been selected for the patient, an identifier of the virtual avatar 4700 may be associated with an identifier of the patient in the system data store 4042 and/or the patient data store 4044.
- the virtual avatar 4700 may perform one or more exercises specified in an exercise session of a treatment plan for the patient.
- “exercises” may include, e.g., rehabilitation movements, high intensity interval training, strength training, range of motion training, or any body or physical movement capable of being performed on a treatment device specified in the treatment plan (including modifications or amendments or emendations thereto) or reasonably substituted for with a different treatment device.
- one exercise may involve pedaling a stationary bicycle, and the virtual avatar 4700 may be animated as pedaling the bicycle in a desired manner for the patient.
- the virtual avatar 4700 may include an actual video of a person performing the exercise session.
- the virtual avatar 4700 may be generated by the server 4030 and/or include audio, video, audiovisual and/or multimedia data of a real person performing the exercise session.
- the virtual avatar 4700 may represent a medical professional, such as a physical therapist, a coach, a trainer, etc.
- the virtual avatar 4700 may be controlled by one or more machine learning models 4013 of the artificial intelligence engine 4011.
- the one or more machine learning models 4013 may be trained based on historical data and real-time or near-time data.
- the data used to train the machine learning models 4013 may include previous feedback received from users (e.g., pain levels), characteristics of the patients at various points in their treatment plans (e.g., heartrate, blood pressure, temperature, perspiration rate, etc.), sensor measurements (e.g., pressure on pedals, range of motion, speed of the motor of the treatment apparatus 4070, etc.) received as the patients performed their treatment plans, and/or the results achieved by the patients after certain operations are performed (e.g., initiating a telemedicine session with a multimedia feed of the medical professional, replacing the virtual avatar 4700 with the multimedia feed of the medical professional, emoting certain auditory statements, presenting certain visuals on the output device 4054, changing a parameter of the exercise session (e.g., reducing an amount of resistance provided by the treatment apparatus 4070), etc.).
- the output device 4054 also presents a self-video section 4702 that presents video of the patient obtained from a camera of the patient interface 4050.
- the self-video may be used by the patient to verify whether they are using proper form, cadence, consistency or any other observable or measurable quality or quantity while performing an exercise session. While the patient performs the exercise session, the video obtained from the camera of the patient interface 4050 may be transmitted to the assistant interface 4094 for presentation.
- a medical professional may view the assistant interface 4094 presenting the video of the patient and determine whether to intervene by speaking to the patient via their patient interface 4050 and/or to replace the virtual avatar 4700 with a multimedia feed from the assistant interface 4094.
- the output device 4054 also presents a graphical user interface (GUI) object 4704.
- GUI object 4704 may be an element that enables the user to provide feedback to the server 4030.
- the GUI object 4704 may present a scale of values representing a level of pain the patient is currently experiencing, and the GUI object 4704 may enable a patient to select a value representing their level of pain.
- the selection may cause a message to be transmitted to the server 4030.
- the message, including the level of pain, may pertain to a trigger event.
- the server 4030 may determine whether the level of pain experienced by the patient exceeds a certain threshold severity level. If the level of pain exceeds the certain threshold severity level, then the server 4030 may pause the virtual avatar 4700 and/or replace the virtual avatar 4700 with an audio, visual, audio-visual or multimedia feed from the computing device of a medical professional.
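- A small sketch of that server-side check follows; the threshold value, message fields, and session bookkeeping are assumptions made only to illustrate the pause-and-replace behavior described above.

```python
PAIN_THRESHOLD = 5  # assumed threshold severity level

def handle_pain_message(message: dict, session: dict) -> str:
    """Pause the avatar and hand off to the medical professional's feed if pain is too high."""
    if message.get("pain_level", 0) > PAIN_THRESHOLD:
        session["avatar_state"] = "paused"
        session["display"] = "medical_professional_feed"
        return "telemedicine_session_initiated"
    return "avatar_continues"

session = {"avatar_state": "playing", "display": "virtual_avatar"}
print(handle_pain_message({"patient_id": "john-doe", "pain_level": 8}, session))  # telemedicine_session_initiated
```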
- FIG. 43 shows an embodiment of the overview display 4120 of the assistant interface 4094 receiving a notification pertaining to the patient and enabling the assistant (e.g., medical professional) to initiate a telemedicine session in real-time according to the present disclosure.
- the overview display 4120 includes a section for the patient profile 4130.
- the patient profile 4130 presents information pertaining to the treatment plan being performed by the patient “John Doe.”
- the treatment plan 4800 indicates that “John Doe is cycling on the treatment apparatus for 5 miles. The pedals of the treatment apparatus are configured to provide a range of motion of 45 degrees.”
- the overview display 4120 also includes a notification 4802 that is received due to a trigger event.
- the notification presents “John Doe indicated he is experiencing a high level of pain during the exercise.”
- the overview display 4120 also includes a prompt 4804 for a medical professional using the assistant interface 4094.
- the prompt asks, “Initiate telemedicine session?”
- the overview display 4120 includes a graphical element (e.g., button) 4806 that is configured to enable the medical professional to use a wired or wireless input peripheral (e.g., touchscreen, mouse, keyboard, microphone, etc.) to select to initiate the telemedicine session.
- the assistant interface 4094 may also present, in the same portion of the overview display 4120 as the self-video, a video (e.g., self-video 4182) from the patient in the video feed display 4180.
- the video feed display 4180 may also include a graphical user interface (GUI) object 4808 (e.g., a button) that enables the medical professional to share, in real-time or near real-time during the telemedicine session, a treatment plan with the patient on the patient interface 4050, to control an operational parameter of the treatment apparatus 4070, or the like.
- FIG. 44 shows an embodiment of an overview display presented by the output device 4054 of the patient interface 4050.
- the output device 4054 presents, in real-time during a telemedicine session, a feed 4900 (e.g., multimedia preferably including audio, video, or both) of the medical professional that replaced the virtual avatar according to the present disclosure.
- the virtual avatar may remain presented on the patient interface 4050, but in a paused state, and the feed may preferably be limited to only audio when the medical professional speaks to the patient.
- the feed 4900 may replace the virtual avatar.
- the feed may enable the medical professional and the patient to engage in a telemedicine session where the medical professional talks to the patient and inquires about their pain level, their characteristics (e.g., heartrate, perspiration rate, etc.), and/or one or more sensor measurements (e.g., pressure on pedals, range of motion, etc.).
- each patient may be presented in a respective portion of the user interface of the assistant interface 4094.
- Each respective portion may present a variety of information pertaining to the respective patient.
- each portion may present a feed of the patient performing the exercise session using the treatment apparatus, characteristics of the patient, the treatment plan for the patient, sensor measurements, and the like.
- the user interface of the assistant interface 4094 may be configured to enable the medical professional to select one or more patients to cause the virtual avatar guiding the one or more patients through an exercise to be paused and/or replaced in real-time or near real-time.
- the feed from the computing device of the medical professional may be replaced with the virtual avatar on the patient interface 4050.
- the virtual avatar may continue guiding the patient through the exercise session wherever and/or whenever the exercise session was paused due to the initiation of the trigger event.
- the assistant interface 4094 may resume viewing the feed of the patient performing the exercise session and/or information pertaining to the patient.
- FIG. 45 shows an example embodiment of a method 41000 for replacing, based on a trigger event occurring, a virtual avatar with a feed of a medical professional according to the present disclosure.
- the method 41000 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both.
- the method 41000 and/or each of its individual functions, routines, other methods, scripts, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIGURE 36, such as server 4030 executing the artificial intelligence engine 4011).
- the method 41000 may be performed by a single processing thread.
- the method 41000 may be performed by two or more processing threads, each thread implementing one or more individual functions or routines; or other methods, scripts, subroutines, or operations of the methods.
- the method 41000 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 41000 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 41000 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 41000 could alternatively be represented as a series of interrelated states via a state diagram, a directed graph, a deterministic finite state automaton, a non-deterministic finite state automaton, a Markov diagram, or event diagrams.
- the processing device may provide, to a computing device (e.g., patient interface 4050) of the patient, a virtual avatar to be presented on the computing device of the patient.
- the virtual avatar may be configured to use a virtual representation of the treatment apparatus 4070 to guide the patient through an exercise session.
- the virtual avatar may be configured to use audio, video, haptic feedback, or some combination thereof, to guide the patient through the exercise session.
- the audio, video, haptic feedback, or some combination thereof, may be provided by the computing device of the patient.
- the processing device may determine, based on a treatment plan for a patient, the exercise session to be performed.
- the treatment apparatus 4070 may be configured to be used by the patient performing the exercise session.
- the processing device may transmit, to the computing device of the patient, a notification to initiate the exercise session.
- the notification may include a push notification, a text message, a phone call, an email, or some combination thereof.
- the notification may be transmitted based on a schedule specified in the treatment plan.
- the schedule may include dates and times to perform exercise sessions, durations for performing the exercise sessions, exercises to perform during the exercise sessions, configurations of parts (e.g., pedals, seat, etc.) of the treatment apparatus 4070, and the like.
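- As an illustration of how such a schedule might drive the notification, the sketch below checks which scheduled sessions fall within an upcoming reminder window; the schedule fields and the reminder window are assumptions for this example.

```python
from datetime import datetime, timedelta

# Illustrative schedule entries from a treatment plan (field names assumed).
schedule = [
    {"start": datetime(2021, 4, 20, 9, 0), "duration_min": 30, "exercise": "cycling",
     "notify": "push"},
]

def due_sessions(now: datetime, window: timedelta = timedelta(minutes=15)):
    """Return scheduled sessions whose start time falls within the reminder window."""
    return [s for s in schedule if now <= s["start"] <= now + window]

for session in due_sessions(datetime(2021, 4, 20, 8, 50)):
    print(f"Send {session['notify']} notification: {session['exercise']} at {session['start']:%H:%M}")
```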
- the processing device may receive, from the computing device of the patient, a selection to initiate the exercise session for using the treatment apparatus 4070.
- the processing device may transmit, to the treatment apparatus 4070, a control signal to cause the treatment apparatus 4070 to initiate the exercise session. Responsive to transmitting the control signal, the processing device may provide the virtual avatar to the computing device of the patient.
- the virtual avatar may be associated with a medical professional, such as the medical professional that prescribed or generated the treatment plan for the patient to perform.
- the treatment plan may be wholly or partially designed and/or generated by the medical professional.
- the treatment plan may be wholly or partially generated by the artificial intelligence engine 4011, and the medical professional may review the treatment plan and/or modify the treatment plan before it is transmitted to the patient to be performed.
- the virtual avatar may represent a proxy medical professional and may be a person, being, thing, software or electronic bot, object, etc. that guides one or more patients through a treatment plan.
- the virtual avatar may guide numerous patients through treatment plans at various stages of their rehabilitation, prehabilitation, recovery, etc.
- in some embodiments, based on a trigger event, a feed (e.g., live audio, audiovisual, etc.) from the computing device of the medical professional may replace the virtual avatar presented on the patient interface 4050.
- the feed may be a stream of data packets (e.g., audio, video, or both) obtained via a camera and/or microphone associated with the assistant interface 4094 in real-time or near real-time.
- the feed may be presented on the user interface 4054 of the patient interface 4050.
- the virtual avatar may be initially selected by the medical professional.
- the virtual avatar may be a file stored in a virtual avatar library, and the medical professional may select the virtual avatar from the virtual avatar library.
- the virtual avatar may be a life-like representation of a person (e.g., male, female, non-binary, or any other gender with which the person identifies).
- the virtual avatar may be a life-like or virtual representation of an animal (e.g., tiger, lion, unicorn, rabbit, etc.), which may be more enjoyable and more motivational to younger people (e.g., kids).
- the virtual avatar may be a life-like or virtual representation of a robot, alien, etc.
- the medical professional may design their own virtual avatar.
- the medical professional may be provided with a user interface on their assistant interface 4094, and the user interface may provide user interface elements that enable configuration of a virtual avatar.
- the medical professional may use the user interface to generate a virtual avatar that looks like their own self or any suitable person.
- the virtual avatar may be selected by the patient.
- the selected virtual avatar may be associated with the patient (e.g., via an identifier of the patient and an identifier of the virtual avatar) and stored in a database.
- some patients may have a preference for certain virtual avatars over other virtual avatars.
- different virtual avatars may guide patients through the same or different treatment plan.
- different aspects, portions or configurations of the treatment plan may further be guided by more than one virtual or physical avatar, wherein each such avatar is associated with a particular aspect, portion or configuration of the treatment plan, and every other avatar, to the extent applicable, is associated with a disjoint aspect, portion or configuration of the treatment plan.
- more than one avatar, physical and/or virtual may be present at the same time but performing different functions within the particular aspect, portion or configuration of the treatment plan.
- numerous patients may be performing the same treatment plan, and each patient may have their own patient interface 4050 that concurrently presents the same or a different virtual avatar or avatars guiding the patient through the treatment plan.
- the medical professional may be able to view the numerous patients in different tiles on the user interface of the assistant interface 4094.
- the term “tiles” may refer to squares that each include a feed from the respective patient interfaces 4050 of the patient as the patient performs the treatment plan, a feed of the characteristics of the patient (e.g., heartrate, blood pressure, temperature, etc.), a feed of measurements (e.g., pressure exerted on the pedals, range of motion determined by the goniometer, number of steps, speed of the motor of the treatment apparatus 4070, etc.), or some combination thereof.
- the medical professional may be enabled to manage, monitor, and/or treat numerous patients at a time. Computing resources may be saved by having one medical professional treat numerous patients at the same time because just the assistant interface 4094 is used to view, treat, manage, monitor, etc. the numerous patients as they perform the treatment plan.
- the processing device may receive, from the computing device of the patient, a message pertaining to a trigger event.
- the message may include data pertaining to a pain level of the patient, a characteristic of the patient, a measurement of a sensor, or some combination thereof.
- the trigger event may refer to any event associated with the data pertaining to the pain level of the patient, the characteristic of the patient (e.g., heartrate, blood pressure, temperature, perspiration rate, etc.), the measurement of the sensor (e.g., pressure, range of motion, speed, etc.), or some combination thereof.
- the virtual avatar may guide the patient through the treatment plan as a prerecorded animation, and the medical professional may not be actively engaged in a telemedicine session with the patient as they perform the treatment plan.
- a notification may be transmitted to the computing device of the medical professional, where such device alerts the medical professional about the notification.
- the medical professional may use a wired or wireless input peripheral (e.g., touchscreen, mouse, keyboard, microphone) to select the notification and to initiate a telemedicine session with the computing device of the patient.
- Such a technique may minimize or otherwise optimize the use and/or cost and/or risk profile of one or more computing resources, such as network resources (e.g., bandwidth), by initiating the telemedicine session only when the notification is selected and not throughout the entirety of the treatment plan.
- the computing device of the patient and the computing device of the medical professional may be continuously or continually engaged in a telemedicine session as the one or more patients perform the treatment plan.
- the trigger event may enable the medical professional to intervene and/or replace the virtual avatar, pause the virtual avatar, or both.
- the medical professional may selectively choose one or more of the patients to cause the virtual avatar on those one or more patients’ computing devices to be replaced, paused, etc.
- Such a technique may enable the medical professional to intervene for some of the patients, but not for all of the patients, as they perform the treatment plan.
- the medical professional may select patients not hitting target thresholds (e.g., pressure, range of motion, speed, etc.) in the treatment plan, select patients indicating they are experiencing a threshold level of pain, or the like.
- the virtual avatar may be controlled, in real-time or near real-time, by one or more machine learning models trained to receive input, including sensor data (e.g., pressure measurements from the pedals, range of motion measurements from a goniometer, speed data, etc.), characteristics of the patient (e.g., perspiration rate, heartrate, blood pressure, temperature, arterial blood gas and/or oxygenation levels or percentages etc.), real-time feedback from the patient or other patients (e.g., indication of pain level), or some combination thereof.
- the one or more machine learning models may produce an output that controls the virtual avatar.
- the output may control the virtual avatar such that the way the virtual avatar performs a particular exercise (e.g., pedaling faster or slower on a treatment apparatus 4070) is modified, such that the virtual avatar says encouraging statements (e.g., “You got this,” “Keep it up,” etc.), and the like.
- the modifications may be based on training data of other patients, where such training data indicates the modifications result in a desired patient performance and/or result for the other patients, or an increase in the probability of achieving the desired patient performance or result.
- the training data may indicate that providing certain audio and/or video when certain sensor data is detected may lead to the patient exerting more force on the pedal, thereby strengthening their leg muscles according to the treatment plan.
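- One way such a trained model could drive the avatar in real-time is sketched below; the feature names, thresholds, and the callable standing in for the machine learning model 4013 are assumptions used only to show the control loop, not the actual model.

```python
def avatar_control_step(model, sensor_data: dict, characteristics: dict) -> dict:
    """Map the latest measurements to an avatar action using a trained model.

    `model` stands in for a trained machine learning model 4013; here it is any
    callable returning an effort score in [0, 1] (an assumption for this sketch).
    """
    features = [
        sensor_data.get("pedal_force_lbs", 0.0),
        sensor_data.get("range_of_motion_deg", 0.0),
        characteristics.get("heart_rate_bpm", 0.0),
    ]
    effort_score = model(features)
    if effort_score < 0.4:
        return {"cadence": "slower", "say": "You got this"}
    if effort_score > 0.8:
        return {"cadence": "faster", "say": "Keep it up"}
    return {"cadence": "hold", "say": None}

# Example with a trivial stand-in "model".
print(avatar_control_step(lambda f: 0.3, {"pedal_force_lbs": 12.5}, {"heart_rate_bpm": 110}))
```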
- the processing device may determine whether a severity level of the trigger event exceeds a threshold severity level.
- the severity threshold level may be any suitable amount, value, indicator, etc.
- the threshold severity level may be a certain level of pain the patient is in.
- the patient may use any input peripheral of the patient interface 4050 to express their level of pain.
- the patient may touch a button on the touchscreen of the patient interface 4050 and the button may indicate the patient is experiencing a pain level of 8 on a scale from 1 to 10, where 1 is the least amount of pain and 10 is the most amount of pain.
- the threshold severity level may be a pain level of 5.
- the threshold severity level may relate to the amount of force the patient is exerting on the pedals, a range of motion the patient is able to achieve during pedaling, a speed the patient is able to achieve, a duration of a range of motion and/or speed the patient is able to achieve, or the like.
- the threshold severity level may be configured based on the patient pedaling at a certain range of motion for a certain period of time, and if the patient fails to achieve the certain range of motion in the certain period of time, or variations thereof of such goals, during an exercise session, then the threshold severity level may be exceeded.
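- A small sketch of evaluating such a range-of-motion-based threshold over an exercise window is shown below; the target angle, required duration, and one-sample-per-second format are assumptions for this example.

```python
# The trigger event fires if the patient fails to hold the target range of
# motion long enough during the session. Values below are illustrative only.
TARGET_ROM_DEG = 45
REQUIRED_SECONDS = 60

def severity_exceeded(rom_samples_deg: list[float]) -> bool:
    """True if the patient failed to reach the target range of motion for long enough."""
    seconds_at_target = sum(1 for angle in rom_samples_deg if angle >= TARGET_ROM_DEG)
    return seconds_at_target < REQUIRED_SECONDS

samples = [40.0] * 90 + [46.0] * 30   # only 30 s at or above the target
print(severity_exceeded(samples))      # True -> notify the medical professional
```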
- the processing device may replace, on the computing device of the patient, the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device (e.g., assistant interface 4094) of the medical professional.
- replacing the virtual avatar with the multimedia feed may initiate a telemedicine session between the patient and the medical professional.
- the processing device may receive, from the computing device of the patient or the medical professional, a second message indicating the telemedicine session is complete, and the processing device may replace, on the computing device of the patient, the presentation of the multimedia feed with the presentation of the virtual avatar.
- the virtual avatar may be configured to continue to guide the patient through the exercise session to completion.
- the exercise session and/or the virtual avatar may be paused at a certain timestamp when the multimedia feed of the medical professional replaces the virtual avatar, and the exercise session and/or the virtual avatar may initiate playback at the certain timestamp when the telemedicine session is completed.
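- The pause-and-resume behavior can be pictured with the small bookkeeping sketch below; the class and its fields are assumptions made only to illustrate restarting the avatar at the saved timestamp.

```python
class AvatarPlayback:
    """Minimal pause/resume bookkeeping for the avatar animation (illustrative only)."""

    def __init__(self) -> None:
        self.position_s = 0.0
        self.paused_at_s = None

    def pause_for_telemedicine(self, current_position_s: float) -> None:
        self.paused_at_s = current_position_s  # remember where the session was interrupted

    def resume_after_telemedicine(self) -> float:
        self.position_s = self.paused_at_s     # restart playback at the saved timestamp
        self.paused_at_s = None
        return self.position_s

playback = AvatarPlayback()
playback.pause_for_telemedicine(253.0)        # e.g., 4 minutes 13 seconds into the session
print(playback.resume_after_telemedicine())   # 253.0
```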
- the processing device may provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient. In some embodiments, responsive to determining that the severity level of the trigger event does not exceed the threshold severity level, the processing device may continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
- the processing device may determine, based on a second treatment plan for a second patient, the exercise session to be performed, where the performance by the second patient uses a second treatment apparatus.
- the processing device may present, on a second computing device (e.g., patient interface 4050) of the second patient, the virtual avatar configured to guide the second patient to use the second treatment apparatus through the exercise session.
- the virtual avatar may remain presented on the second computing device of the second patient.
- the virtual avatar may be replaced on the second computing device of the second patient with the multimedia feed from the computing device of the medical professional.
- FIG. 46 shows an example embodiment of a method for providing a virtual avatar according to the present disclosure.
- Method 41100 includes operations performed by processors of a computing device (e.g., any component of FIG. 36, such as server 4030 executing the artificial intelligence engine 4011).
- one or more operations of the method 41100 are implemented in computer instructions stored on a memory device and executed by a processing device.
- the method 41100 may be performed in the same or a similar manner as described above in regard to method 41000.
- the operations of the method 41100 may be performed in some combination with any of the operations of any of the methods described herein.
- the method 41100 may include further operations associated with 41002 in method 41000 related to providing the virtual avatar to the computing device of the patient.
- the processing device may retrieve data associated with the exercise session.
- the data may include instructions implementing a virtual model that animates one or more movements associated with the exercise session.
- the virtual model may be two-dimensional, three-dimensional, or n-dimensional (in terms of animations or projections onto a 3-D virtual environment or a 2-D layout).
- the virtual model may be a mesh model animation, contour animation, virtual human animation, skeletal animation, etc.
- the virtual model may be a surface representation (referred to as the mesh) used to draw a character (e.g., medical professional), and a hierarchical set of interconnected parts.
- the virtual model may use a virtual armature to animate (e.g., pose and key frame) the mesh.
- an armature may refer to a kinematic chain used in computer animation to simulate the motions of virtual human or animal characters (e.g., virtual avatars).
- different types of virtual armatures may be used, such as keyframing (stop-motion) armatures and real-time (puppeteering) armatures.
- the processing device may retrieve data associated with the virtual avatar.
- the data associated with the virtual avatar may include which virtual avatar is selected by the patient and/or the medical professional, wherein such selection is made to guide the patient through the exercise session.
- the data associated with the virtual avatar may include an identifier associated with the virtual avatar. The identifier may be used to retrieve the data associated with the virtual avatar from a database. For example, the patient may have selected a superhero to be their virtual avatar. Accordingly, data pertaining to the particular superhero (e.g., gender, costume, appearance, etc.) may be retrieved from the database.
- the processing device may map the data associated with the virtual avatar onto the virtual model that animates the one or more movements associated with the exercise session.
- the appearance and shape of the virtual avatar may be mapped onto the mesh of the virtual model (e.g., face to a head portion of the mesh) and manipulated and/or animated according to instructions related to the exercise session and/or the virtual avatar.
- the virtual avatar may perform one or more exercises using the treatment apparatus 4070. The performance of the exercises by the virtual avatar may be animated and/or presented on a display screen of the computing device of the patient to guide the patient through the exercise session.
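- A schematic sketch of retrieving avatar data by identifier and attaching it to a virtual model that animates an exercise follows; real skeletal animation would use a graphics engine, so the database, identifiers, and keyframe format here are stand-ins assumed only for illustration.

```python
# Schematic only: a graphics engine would handle the actual mesh and armature.
AVATAR_DB = {
    "superhero-01": {"appearance": "caped figure", "texture": "red_blue.png"},
}

EXERCISE_MODEL = {  # keyframes for one pedal revolution: (time_s, knee_angle_deg)
    "cycling": [(0.0, 45), (0.5, 110), (1.0, 45)],
}

def build_animated_avatar(avatar_id: str, exercise: str) -> dict:
    """Map the selected avatar's appearance onto the exercise's animation keyframes."""
    avatar = AVATAR_DB[avatar_id]          # retrieved via the avatar identifier
    keyframes = EXERCISE_MODEL[exercise]   # virtual model of the movement
    return {"appearance": avatar, "keyframes": keyframes}

print(build_animated_avatar("superhero-01", "cycling"))
```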
- the virtual avatar may be replaced and/or paused to enable presentation of a multimedia feed of a medical professional.
- FIG. 47 shows an example computer system 41200 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure.
- computer system 41200 may include a computing device and correspond to the assistant interface 4094, reporting interface 4092, supervisory interface 4090, clinician interface 4020, server 4030 (including the AI engine 4011), patient interface 4050, ambulation sensor 4082, goniometer 4084, treatment apparatus 4070, pressure sensor 4086, or any suitable component of FIG. 36.
- the computer system 41200 may be capable of executing instructions implementing the one or more machine learning models 4013 of the artificial intelligence engine 4011 of FIG. 36.
- the computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network.
- the computer system may operate in the capacity of a server in a client-server network environment.
- the computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal digital assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device.
- the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
- the computer system 41200 includes a processing device 41202, a main memory 41204 (e.g., read only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 41206 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 41208, which communicate with each other via a bus 41210.
- Processing device 41202 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 41202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets.
- the processing device 41202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like.
- the processing device 41202 is configured to execute instructions for performing any of the operations and steps discussed herein.
- the computer system 41200 may further include a network interface device 41212.
- the computer system 41200 also may include a video display 41214 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 41216 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 41218 (e.g., a speaker).
- the video display 41214 and the input device(s) 41216 may be combined into a single component or device (e.g., an LCD touch screen).
- the data storage device 41208 may include a computer-readable medium 41220 on which the instructions 41222 embodying any one or more of the methods, operations, or functions described herein are stored.
- the instructions 41222 may also reside, completely or at least partially, within the main memory 41204 and/or within the processing device 41202 during execution thereof by the computer system 41200. As such, the main memory 41204 and the processing device 41202 also constitute computer-readable media.
- the instructions 41222 may further be transmitted or received over a network via the network interface device 41212.
- while the computer-readable storage medium 41220 is shown in the illustrative examples to be a single medium, the term “computer-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- the term “computer-readable storage medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
- the term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
- a computer-implemented system comprising:
- a treatment apparatus configured to be manipulated by a patient while performing an exercise session;
- a patient interface configured to receive a virtual avatar, wherein the patient interface comprises an output device configured to present the virtual avatar, wherein the virtual avatar uses a virtual representation of the treatment apparatus to guide the patient through an exercise session, and wherein the virtual avatar is associated with a medical professional; and a server computing device configured to:
- provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
- receive input comprising sensor data, characteristics of the patient, real-time feedback from the patient or other patients, or some combination thereof, and produce an output that controls the virtual avatar.
- transmit, to the patient interface, a notification to initiate the exercise session, wherein the notification is transmitted based on a schedule specified in the treatment plan;
- determine, based on a second treatment plan for a second patient, the exercise session to be performed, wherein the performance by the second patient uses a second treatment apparatus;
- store the virtual avatar associated with the patient in a database.
- Clause 73 The method of any clause herein, wherein the virtual avatar is configured to use audio, video, haptic feedback, or some combination thereof to guide the patient through the exercise session.
- Clause 74 A method comprising:
- receive input comprising sensor data, characteristics of the patient, real-time feedback from the patient or other patients, or some combination thereof, and produce an output that controls the virtual avatar.
- Clause 79 The method of any clause herein, wherein the notification comprises a push notification, a text message, a phone call, an email, or some combination thereof.
- Clause 80 The method of any clause herein, further comprising:
- Clause 83 The method of any clause herein, wherein the message comprises data pertaining to a pain level of the patient, a characteristic of the patient, a measurement of a sensor, or some combination thereof.
- Clause 85 The method of any clause herein, further comprising determining, based on a treatment plan for a patient, the exercise session to be performed, wherein the treatment apparatus is configured to be used by the patient performing the exercise session.
- a non-transitory, tangible computer-readable medium storing instructions that, when executed, cause a processing device to:
- provide, to a computing device of a patient, a virtual avatar to be presented on the computing device of the patient, wherein the virtual avatar is configured to use a virtual representation of a treatment apparatus to guide the patient through an exercise session, and the virtual avatar is associated with a medical professional; receive, from the computing device of the patient, a message pertaining to a trigger event;
- provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
- receive input comprising sensor data, characteristics of the patient, real-time feedback from the patient or other patients, or some combination thereof, and produce an output that controls the virtual avatar.
- a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to:
- provide, to a computing device of a patient, a virtual avatar to be presented on the computing device of the patient, wherein the virtual avatar is configured to use a virtual representation of a treatment apparatus to guide the patient through an exercise session, and the virtual avatar is associated with a medical professional; receive, from the computing device of the patient, a message pertaining to a trigger event;
- provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
- retrieve data associated with the exercise session, wherein the data comprises instructions implementing a virtual model that animates one or more movements associated with the exercise session; retrieve data associated with the virtual avatar; and
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Physiology (AREA)
- Cardiology (AREA)
- Physical Education & Sports Medicine (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Pulmonology (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Evolutionary Computation (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Urology & Nephrology (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Medical Treatment And Welfare Office Work (AREA)
- Arrangements For Transmission Of Measured Signals (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/856,985 US11107591B1 (en) | 2020-04-23 | 2020-04-23 | Method and system for describing and recommending optimal treatment plans in adaptive telemedical or other contexts |
US202063048456P | 2020-07-06 | 2020-07-06 | |
US17/021,895 US11071597B2 (en) | 2019-10-03 | 2020-09-15 | Telemedicine for orthopedic treatment |
US202063088657P | 2020-10-07 | 2020-10-07 | |
US202063104716P | 2020-10-23 | 2020-10-23 | |
US17/147,211 US11075000B2 (en) | 2019-10-03 | 2021-01-12 | Method and system for using virtual avatars associated with medical professionals during exercise sessions |
US17/147,428 US11317975B2 (en) | 2019-10-03 | 2021-01-12 | Method and system for treating patients via telemedicine using sensor data from rehabilitation or exercise equipment |
US17/147,439 US11101028B2 (en) | 2019-10-03 | 2021-01-12 | Method and system using artificial intelligence to monitor user characteristics during a telemedicine session |
PCT/US2021/028655 WO2021216881A1 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4139928A1 true EP4139928A1 (en) | 2023-03-01 |
Family
ID=83887200
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21791805.1A Pending EP4139928A1 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP4139928A1 (en) |
JP (1) | JP7298053B2 (en) |
KR (1) | KR20230006641A (en) |
AU (2) | AU2021260953B2 (en) |
BR (1) | BR112022021443A2 (en) |
CA (1) | CA3176236C (en) |
MX (1) | MX2022013358A (en) |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7416537B1 (en) * | 1999-06-23 | 2008-08-26 | Izex Technologies, Inc. | Rehabilitative orthoses |
US20030036683A1 (en) * | 2000-05-01 | 2003-02-20 | Kehr Bruce A. | Method, system and computer program product for internet-enabled, patient monitoring system |
JP4076386B2 (en) * | 2002-07-18 | 2008-04-16 | 帝人株式会社 | Medical remote information system, information processing method, computer program, recording medium for computer program, telemedicine system |
US20080214971A1 (en) * | 2002-10-07 | 2008-09-04 | Talish Roger J | Excercise device utilizing loading apparatus |
US8025607B2 (en) * | 2009-09-16 | 2011-09-27 | Northeastern University | Instrumented handle and pedal systems for use in rehabilitation, exercise and training equipment |
US10810283B2 (en) * | 2013-10-31 | 2020-10-20 | Knox Medical Diagnostics Inc. | Systems and methods for monitoring respiratory function |
- 2021
- 2021-04-22 EP EP21791805.1A patent/EP4139928A1/en active Pending
- 2021-04-22 AU AU2021260953A patent/AU2021260953B2/en active Active
- 2021-04-22 CA CA3176236A patent/CA3176236C/en active Active
- 2021-04-22 JP JP2022564566A patent/JP7298053B2/en active Active
- 2021-04-22 BR BR112022021443A patent/BR112022021443A2/en not_active Application Discontinuation
- 2021-04-22 KR KR1020227040948A patent/KR20230006641A/en not_active IP Right Cessation
- 2021-04-22 MX MX2022013358A patent/MX2022013358A/en unknown
- 2023
- 2023-07-13 AU AU2023204667A patent/AU2023204667B2/en active Active
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116153531A (en) * | 2023-04-17 | 2023-05-23 | 北京康爱医疗科技股份有限公司 | Rehabilitation monitoring method and system for tumor patient |
CN116153531B (en) * | 2023-04-17 | 2023-07-18 | 北京康爱医疗科技股份有限公司 | Rehabilitation monitoring method and system for tumor patient |
CN118675764A (en) * | 2024-08-21 | 2024-09-20 | 中国人民解放军海军青岛特勤疗养中心 | Thoracic surgery postoperative rehabilitation effect prediction system based on artificial intelligence |
Also Published As
Publication number | Publication date |
---|---|
JP2023519759A (en) | 2023-05-12 |
AU2023204667B2 (en) | 2024-04-18 |
KR20230006641A (en) | 2023-01-10 |
AU2021260953B2 (en) | 2023-04-13 |
CA3176236A1 (en) | 2021-10-28 |
CA3176236C (en) | 2024-02-20 |
AU2021260953A1 (en) | 2022-11-17 |
BR112022021443A2 (en) | 2022-12-27 |
MX2022013358A (en) | 2023-01-04 |
AU2023204667A1 (en) | 2023-08-03 |
JP7298053B2 (en) | 2023-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11923057B2 (en) | Method and system using artificial intelligence to monitor user characteristics during a telemedicine session | |
US11942205B2 (en) | Method and system for using virtual avatars associated with medical professionals during exercise sessions | |
US11282604B2 (en) | Method and system for use of telemedicine-enabled rehabilitative equipment for prediction of secondary disease | |
US11282608B2 (en) | Method and system for using artificial intelligence and machine learning to provide recommendations to a healthcare provider in or near real-time during a telemedicine session | |
US11139060B2 (en) | Method and system for creating an immersive enhanced reality-driven exercise experience for a user | |
US11328807B2 (en) | System and method for using artificial intelligence in telemedicine-enabled hardware to optimize rehabilitative routines capable of enabling remote rehabilitative compliance | |
WO2021216881A1 (en) | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine | |
US11445985B2 (en) | Augmented reality placement of goniometer or other sensors | |
US20230060039A1 (en) | Method and system for using sensors to optimize a user treatment plan in a telemedicine environment | |
US20220415471A1 (en) | Method and system for using sensor data to identify secondary conditions of a user based on a detected joint misalignment of the user who is using a treatment device to perform a treatment plan | |
US20230058605A1 (en) | Method and system for using sensor data to detect joint misalignment of a user using a treatment device to perform a treatment plan | |
US20220314075A1 (en) | Method and system for monitoring actual patient treatment progress using sensor data | |
US20220415469A1 (en) | System and method for using an artificial intelligence engine to optimize patient compliance | |
US20220328181A1 (en) | Method and system for monitoring actual patient treatment progress using sensor data | |
AU2023204667B2 (en) | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine | |
CN113223690B (en) | Method and system for rehabilitating a subject via telemedicine | |
US20240203580A1 (en) | Method and system for using artificial intelligence to triage treatment plans for patients and electronically initiate the treament plans based on the triaging | |
WO2022155251A9 (en) | Method and system for using artificial intelligence and machine learning to provide recommendations to a healthcare provider in real-time during a telemedicine session | |
WO2023049508A1 (en) | Method and system for using sensors to optimize a user treatment plan in a telemedicine environment | |
WO2024137305A1 (en) | Method and system for using artificial intelligence to triage treatment plans for patients and electronically initiate the treament plans based on the triaging | |
WO2024049746A1 (en) | System and method for using an artificial intelligence engine to optimize patient compliance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
 | 17P | Request for examination filed | Effective date: 20221104 |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | DAV | Request for validation of the european patent (deleted) | |
 | DAX | Request for extension of the european patent (deleted) | |
 | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R079; Free format text: PREVIOUS MAIN CLASS: G16H0020000000 Ipc: A63B0022000000 |