AU2023204667B2 - Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine - Google Patents
- Publication number
- AU2023204667B2
- Authority
- AU
- Australia
- Prior art keywords
- treatment
- patient
- treatment plan
- user
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6895—Sport equipment
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
- G16H10/60—ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/06—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement
- A63B22/0605—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement performing a circular movement, e.g. ergometers
- A63B2022/0611—Particular details or arrangement of cranks
- A63B2022/0623—Cranks of adjustable length
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B2071/0675—Input for modifying training controls during workout
- A63B2071/0683—Input by handheld remote control
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/10—Positions
- A63B2220/16—Angular positions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/50—Force related parameters
- A63B2220/51—Force
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2225/00—Miscellaneous features of sport apparatus, devices or equipment
- A63B2225/50—Wireless data transmission, e.g. by radio transmitters or telemetry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
Abstract
A method includes receiving treatment data pertaining to a user who uses a treatment
device to perform a treatment plan. The treatment data includes at least one of characteristics
of the user, measurement information pertaining to the user while the user uses the treatment
device, characteristics of the treatment device, and the treatment plan. The method also
includes generating treatment information using the treatment data and storing, for access at
a computing device of a healthcare provider, the treatment information. The method also
includes communicating with an interface, at the computing device of the healthcare provider,
wherein the interface is configured to receive treatment plan input and modifying the
treatment plan in response to receiving treatment plan input including at least one
modification to the treatment plan.
Description
[0001] This application is a divisional application of Australian patent application 2021260953 and claims the same priorities as Australian patent application 2021260953. The full content of Australian patent application 2021260953 is incorporated herein by reference.
[0002] Remote medical assistance, or telemedicine, may aid a patient in performing various aspects of a rehabilitation regimen for a body part. The patient may use a patient interface in communication with an assistant interface for receiving the remote medical assistance via audio and/or audiovisual communications.
[0003] Any reference herein to a patent document or other matter which is described herein as prior art is not to be taken as an admission that document or matter was known or that the information it contains was part of the common general knowledge as at the priority date of any of the claims.
[0004] The present invention provides a computer-implemented system configured to control operation of an electromechanical machine, the computer-implemented system comprising: the electromechanical machine, the electromechanical machine being configured to be manipulated by a user while the user performs a treatment plan, wherein the electromechanical machine includes at least one pedal; and a computing device configured to: receive treatment data pertaining to the user who uses the electromechanical machine to perform the treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the electromechanical machine, at least one characteristic of the electromechanical machine, and at least one aspect of the treatment plan; generate treatment information using the treatment data; transmit the treatment information to a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input based on the treatment information; generate a modified treatment plan by modifying the at least one aspect of the treatment plan in response to receiving treatment plan input including a modification to the at least one aspect of the treatment plan; and while the user uses the electromechanical machine, control the electromechanical machine based on the modified treatment plan.
[0005] The present invention also provides a method of operating an electromechanical machine, the method comprising: receiving treatment data pertaining to a user who uses the electromechanical machine to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the electromechanical machine, at least one characteristic of the electromechanical machine, and at least one aspect of the treatment plan; generating treatment information using the treatment data; transmitting the treatment information to a computing device of a healthcare provider; communicating with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input based on the treatment information; generating a modified treatment plan by modifying the at least one aspect of the treatment plan in response to receiving treatment plan input including a modification to the at least one aspect of the treatment plan; and while the user uses the electromechanical machine, controlling the electromechanical machine based on the modified treatment plan.
[0006] The present invention further provides a tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to: receive treatment data pertaining to a user who uses an electromechanical machine to perform a treatment plan, wherein the treatment data comprises at least one characteristic of the user, measurement information pertaining to the user while the user uses the electromechanical machine, at least one characteristic of the electromechanical machine, and at least one aspect of the treatment plan; generate treatment information using the treatment data; transmit the treatment information to a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input based on the treatment information; generate a modified treatment plan by modifying the at least one aspect of the treatment plan in response to receiving the treatment plan input including a modification to the at least one aspect of the treatment plan; and while the user uses the electromechanical machine, control the electromechanical machine based on the modified treatment plan.
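The claimed flow of receiving treatment data, generating treatment information, applying a provider's modification, and controlling the machine can be sketched in simplified form. This is an illustrative outline only, not the patented implementation; all names, data fields, and the choice of pedal resistance as the controlled aspect are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TreatmentPlan:
    # Hypothetical plan aspects: pedal resistance level and session length.
    resistance: int = 5
    session_minutes: int = 20

@dataclass
class TreatmentData:
    user_characteristics: dict
    measurements: dict
    machine_characteristics: dict
    plan: TreatmentPlan

def generate_treatment_information(data: TreatmentData) -> dict:
    # Summarize raw treatment data for display at the provider's interface.
    return {
        "heart_rate": data.measurements.get("heart_rate"),
        "current_resistance": data.plan.resistance,
    }

def modify_plan(plan: TreatmentPlan, provider_input: dict) -> TreatmentPlan:
    # Apply the provider's modification to at least one aspect of the plan.
    return TreatmentPlan(
        resistance=provider_input.get("resistance", plan.resistance),
        session_minutes=provider_input.get("session_minutes", plan.session_minutes),
    )

def control_machine(plan: TreatmentPlan) -> dict:
    # Stand-in for commanding the electromechanical machine while in use.
    return {"set_resistance": plan.resistance}

# Round trip: provider reviews the treatment information and lowers resistance.
data = TreatmentData({"age": 60}, {"heart_rate": 110}, {"pedals": 2}, TreatmentPlan())
info = generate_treatment_information(data)
new_plan = modify_plan(data.plan, {"resistance": 3})
command = control_machine(new_plan)
```

In a real system the provider input would arrive over a telemedicine session and the control step would drive the machine's actuators; here each stage is reduced to a pure function to show the data flow.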
[0007] The disclosure is best understood from the following detailed description when read in conjunction with the accompanying drawings. It is emphasized that, according to common practice, the various features of the drawings are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity.
[0008] FIG. 1 generally illustrates a block diagram of an embodiment of a computer implemented system for managing a treatment plan according to the principles of the present disclosure.
[0009] FIG. 2 generally illustrates a perspective view of an embodiment of a treatment device according to the principles of the present disclosure.
[0010] FIG. 3 generally illustrates a perspective view of a pedal of the treatment device of FIG. 2 according to the principles of the present disclosure.
[0011] FIG. 4 generally illustrates a perspective view of a person using the treatment device of FIG. 2 according to the principles of the present disclosure.
[0012] FIG. 5 generally illustrates an example embodiment of an overview display of an assistant interface according to the principles of the present disclosure.
[0013] FIG. 6 generally illustrates an example block diagram of training a machine learning model to output, based on data pertaining to the patient, a treatment plan for the patient according to the principles of the present disclosure.
[0014] FIG. 7 generally illustrates an embodiment of an overview display of the assistant interface presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the principles of the present disclosure.
[0015] FIG. 8 generally illustrates an embodiment of the overview display of the assistant interface presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the principles of the present disclosure.
[0016] FIG. 9 is a flow diagram generally illustrating a method for modifying, based on treatment data received while a user uses the treatment device of FIG. 2, a treatment plan for the patient and controlling, based on the modification, at least one treatment device according to the principles of the present disclosure.
[0017] FIG. 10 is a flow diagram generally illustrating an alternative method for modifying, based on treatment data received while a user uses the treatment device of FIG. 2, a treatment plan for the patient and controlling, based on the modification, at least one treatment device according to the principles of the present disclosure.
[0018] FIG. 11 is a flow diagram generally illustrating an alternative method for modifying, based on treatment data received while a user uses the treatment device of FIG. 2, a treatment plan for the patient and controlling, based on the modification, at least one treatment device according to the principles of the present disclosure.
[0019] FIG. 12 generally illustrates a computer system according to the principles of the present disclosure.
[0020] FIG. 13 shows a block diagram of an embodiment of a computer implemented system for managing a treatment plan according to the present disclosure.
[0021] FIG. 14 shows a perspective view of an embodiment of a treatment apparatus according to the present disclosure.
[0022] FIG. 15 shows a perspective view of a pedal of the treatment apparatus of FIG. 14 according to the present disclosure.
[0023] FIG. 16 shows a perspective view of a person using the treatment apparatus of FIG. 14 according to the present disclosure.
[0024] FIG. 17 shows an example embodiment of an overview display of an assistant interface according to the present disclosure.
[0025] FIG. 18 shows an example embodiment of an overview display of the assistant interface presenting recommended optimal treatment plans and excluded treatment plans in real-time during a telemedicine session according to the present disclosure.
[0026] FIG. 19 shows an example embodiment of a server translating clinical information into a medical description language for processing by an artificial intelligence engine according to the present disclosure.
[0027] FIG. 20 shows an example embodiment of a method for recommending an optimal treatment plan according to the present disclosure.
[0028] FIG. 21 shows an example embodiment of a method for translating clinical information into the medical description language according to the present disclosure.
[0029] FIG. 22 shows an example computer system according to the present disclosure.
[0030] FIG. 23 generally illustrates a block diagram of an embodiment of a computer implemented system for managing a treatment plan according to the principles of the present disclosure.
[0031] FIG. 24 generally illustrates a perspective view of an embodiment of a treatment device according to the principles of the present disclosure.
[0032] FIG. 25 generally illustrates a perspective view of a pedal of the treatment device of FIG. 24 according to the principles of the present disclosure.
[0033] FIG. 26 generally illustrates a perspective view of a person using the treatment device of FIG. 24 according to the principles of the present disclosure.
[0034] FIG. 27 generally illustrates an example embodiment of an overview display of an assistant interface according to the principles of the present disclosure.
[0035] FIG. 28 generally illustrates an example block diagram of training a machine learning model to output, based on data pertaining to the patient, a treatment plan for the patient according to the principles of the present disclosure.
[0036] FIG. 29 generally illustrates an embodiment of an overview display of the assistant interface presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the principles of the present disclosure.
[0037] FIG. 30 generally illustrates an embodiment of the overview display of the assistant interface presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the principles of the present disclosure.
[0038] FIG. 31 is a flow diagram generally illustrating a method for monitoring, based on treatment data received while a user uses the treatment device of FIG. 24, characteristics of the user while the user uses the treatment device according to the principles of the present disclosure.
[0039] FIG. 32 is a flow diagram generally illustrating an alternative method for monitoring, based on treatment data received while a user uses the treatment device of FIG. 24, characteristics of the user while the user uses the treatment device according to the principles of the present disclosure.
[0040] FIG. 33 is a flow diagram generally illustrating an alternative method for monitoring, based on treatment data received while a user uses the treatment device of FIG. 24, characteristics of the user while the user uses the treatment device according to the principles of the present disclosure.
[0041] FIG. 34 is a flow diagram generally illustrating a method for receiving a selection of an optimal treatment plan and controlling, based on the optimal treatment plan, a treatment device while the patient uses the treatment device according to the present disclosure.
[0042] FIG. 35 generally illustrates a computer system according to the principles of the present disclosure.
[0043] FIG. 36 shows a block diagram of an embodiment of a computer implemented system for managing a treatment plan according to the present disclosure.
[0044] FIG. 37 shows a perspective view of an embodiment of a treatment apparatus according to the present disclosure.
[0045] FIG. 38 shows a perspective view of a pedal of the treatment apparatus of FIG. 37 according to the present disclosure.
[0046] FIG. 39 shows a perspective view of a person using the treatment apparatus of FIG. 37 according to the present disclosure.
[0047] FIG. 40 shows an example embodiment of an overview display of an assistant interface according to the present disclosure.
[0048] FIG. 41 shows an example block diagram of training a machine learning model to output, based on data pertaining to the patient, a treatment plan for the patient according to the present disclosure.
[0049] FIG. 42 shows an embodiment of an overview display of the patient interface presenting a virtual avatar guiding the patient through an exercise session according to the present disclosure.
[0050] FIG. 43 shows an embodiment of the overview display of the assistant interface receiving a notification pertaining to the patient and enabling the assistant to initiate a telemedicine session in real-time according to the present disclosure.
[0051] FIG. 44 shows an embodiment of the overview display of the patient interface presenting, in real-time during a telemedicine session, a feed of the medical professional that replaced the virtual avatar according to the present disclosure.
[0052] FIG. 45 shows an example embodiment of a method for replacing, based on a trigger event occurring, a virtual avatar with a feed of a medical professional according to the present disclosure.
[0053] FIG. 46 shows an example embodiment of a method for providing a virtual avatar according to the present disclosure.
[0054] FIG. 47 shows an example computer system according to the present disclosure.
[0055] Various terms are used to refer to particular system components. Different companies may refer to a component by different names - this document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms "including" and "comprising" are used in an open-ended fashion, and thus should be interpreted to mean "including, but not limited to ...". Also, the term "couple" or "couples" is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct connection or through an indirect connection via other devices and connections.
[0056] The terminology used herein is for the purpose of describing particular example embodiments only, and is not intended to be limiting. As used herein, the singular forms "a," "an," and "the" may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
[0057] The terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections; however, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer, or section from another region, layer, or section. Terms such as "first," "second," and other numerical terms, when used herein, do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer, or section discussed below could be termed a second element, component, region, layer, or section without departing from the teachings of the example embodiments. The phrase "at least one of," when used with a list of items, means that different combinations of one or more of the listed items may be used, and only one item in the list may be needed. For example, "at least one of: A, B, and C" includes any of the following combinations: A, B, C, A and B, A and C, B and C, and A and B and C. In another example, the phrase "one or more" when used with a list of items means there may be one item or any suitable number of items exceeding one.
[0058] Spatially relative terms, such as "inner," "outer," "beneath," "below," "lower," "above," "upper," "top," "bottom," and the like, may be used herein. These spatially relative terms can be used for ease of description to describe one element's or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms may also be intended to encompass different orientations of the device in use, or operation, in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the example term "below" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptions used herein interpreted accordingly.
[0059] A "treatment plan" may include one or more treatment protocols, and each treatment protocol includes one or more treatment sessions. Each treatment session comprises several session periods, with each session period including a particular exercise for treating the body part of the patient. For example, a treatment plan for post-operative rehabilitation after a knee surgery may include an initial treatment protocol with twice daily stretching sessions for the first 3 days after surgery and a more intensive treatment protocol with active exercise sessions performed 4 times per day starting 4 days after surgery. A treatment plan may also include information pertaining to a medical procedure to perform on the patient, a treatment protocol for the patient using a treatment device, a diet regimen for the patient, a medication regimen for the patient, a sleep regimen for the patient, additional regimens, or some combination thereof. The treatment plan may also include one or more training protocols, such as strength training protocols, range of motion training protocols, cardiovascular training protocols, endurance training protocols, and the like. Each training protocol may include one or more training sessions comprising several training session periods, with each session period comprising a particular exercise directed to one or more of strength training, range of motion training, cardiovascular training, endurance training, and the like.
[0060] The terms telemedicine, telehealth, telemed, teletherapeutic, remote medicine, etc. may be used interchangeably herein.
[0061] The term "enhanced reality" may include a user experience comprising one or more of augmented reality, virtual reality, mixed reality, immersive reality, or a combination of the foregoing (e.g., immersive augmented reality, mixed augmented reality, virtual and augmented immersive reality, and the like).
[0062] The term "augmented reality" may refer, without limitation, to an interactive user experience that provides an enhanced environment that combines elements of a real-world environment with computer-generated components perceivable by the user.
[0063] The term "virtual reality" may refer, without limitation, to a simulated interactive user experience that provides an enhanced environment perceivable by the user and wherein such enhanced environment may be similar to or different from a real-world environment.
[0064] The term "mixed reality" may refer to an interactive user experience that combines aspects of augmented reality with aspects of virtual reality to provide a mixed reality environment perceivable by the user.
[0065] The term "immersive reality" may refer to a simulated interactive user experience using virtual and/or augmented reality images, sounds, and other stimuli to immerse the user, to a specific extent possible (e.g., partial immersion or total immersion), in the simulated interactive experience. For example, in some embodiments, to the specific extent possible, the user experiences one or more aspects of the immersive reality as naturally as the user typically experiences corresponding aspects of the real world. Additionally, or alternatively, an immersive reality experience may include actors, a narrative component, a theme (e.g., an entertainment theme or other suitable theme), and/or other suitable features or components.
[0066] The term "body halo" may refer to a hardware component or components, wherein such component or components may include one or more platforms, one or more body supports or cages, one or more chairs or seats, one or more back supports or back engaging mechanisms, one or more leg or foot engaging mechanisms, one or more arm or hand engaging mechanisms, one or more head engaging mechanisms, other suitable hardware components, or a combination thereof.
[0067] As used herein, the term "enhanced environment" may refer to an enhanced environment in its entirety, at least one aspect of the enhanced environment, more than one aspect of the enhanced environment, or any suitable number of aspects of the enhanced environment.
[0068] As used herein, the term "threshold" and/or the term "range" may include one or more values expressed as a percentage, an absolute value, a unit of measurement, a difference value, a numerical quantity, or other suitable expression of the one or more values.
[0069] The term "optimal treatment plan" may refer to optimizing a treatment plan based on a certain parameter or combinations of more than one parameter, such as, but not limited to, a monetary value amount generated by a treatment plan and/or billing sequence, wherein the monetary value amount is measured by an absolute amount in dollars or another currency, a Net Present Value (NPV) or any other measure, a patient outcome that results from the treatment plan and/or billing sequence, a fee paid to a medical professional, a payment plan for the patient to pay off an amount of money owed or a portion thereof, a plan of reimbursement, an amount of revenue, profit or other monetary value amount to be paid to an insurance or third-party provider, or some combination thereof.
[0070] Real-time may refer to less than or equal to 2 seconds. Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such a user interface, and will generally be less than 10 seconds but greater than 2 seconds.
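The latency thresholds defined above (2 seconds and 10 seconds) can be sketched as a simple classifier. This is illustrative only; the function name and return labels are not part of the disclosure.

```python
def classify_latency(seconds: float) -> str:
    """Classify an interaction latency using the thresholds above.

    Real-time: <= 2 seconds; near real-time: > 2 and < 10 seconds.
    """
    if seconds <= 2:
        return "real-time"
    if seconds < 10:
        return "near real-time"
    return "not real-time"
```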
[0071] Any of the systems and methods described in this disclosure may be used in connection with rehabilitation. Rehabilitation may be directed at cardiac rehabilitation, rehabilitation from stroke, multiple sclerosis, Parkinson's disease, myasthenia gravis, Alzheimer's disease, any other neurodegenerative or neuromuscular disease, a brain injury, a spinal cord injury, a spinal cord disease, a joint injury, a joint disease, or the like.
Rehabilitation can further involve muscular contraction in order to improve blood flow and lymphatic flow, engage the brain and nervous system to control and affect a traumatized area to increase the speed of healing, reverse or reduce pain (including arthralgias and myalgias), reverse or reduce stiffness, recover range of motion, and encourage cardiovascular engagement to stimulate the release of pain-blocking hormones or to encourage highly oxygenated blood flow to aid in an overall feeling of well-being. Rehabilitation may be provided for individuals of average height in reasonably good physical condition having no substantial deformities, as well as for individuals more typically in need of rehabilitation, such as those who are elderly, obese, subject to disease processes, injured and/or who have a severely limited range of motion. Unless expressly stated otherwise, it is to be understood that rehabilitation includes prehabilitation (also referred to as "pre-habilitation" or "prehab"). Prehabilitation may be used as a preventative procedure or as a pre-surgical or pre-treatment procedure. Prehabilitation may include any action performed by or on a patient (or directed to be performed by or on a patient, including, without limitation, remotely or distally through telemedicine) to, without limitation, prevent or reduce a likelihood of injury (e.g., prior to the occurrence of the injury); improve recovery time subsequent to surgery; improve strength subsequent to surgery; or any of the foregoing with respect to any non-surgical clinical treatment plan to be undertaken for the purpose of ameliorating or mitigating injury, dysfunction, or other negative consequence of surgical or non-surgical treatment on any external or internal part of a patient's body. For example, a mastectomy may require prehabilitation to strengthen muscles or muscle groups affected directly or indirectly by the mastectomy.
As a further non-limiting example, the removal of an intestinal tumor, the repair of a hernia, open-heart surgery or other procedures performed on internal organs or structures, whether to repair those organs or structures, to excise them or parts of them, to treat them, etc., can require cutting through, dissecting and/or harming numerous muscles and muscle groups in or about, without limitation, the skull or face, the abdomen, the ribs and/or the thoracic cavity, as well as in or about all joints and appendages. Prehabilitation can improve a patient's speed of recovery, measure of quality of life, level of pain, etc. in all the foregoing procedures. In one embodiment of prehabilitation, a pre-surgical procedure or a pre-non-surgical-treatment may include one or more sets of exercises for a patient to perform prior to such procedure or treatment. Performance of the one or more sets of exercises may be required in order to qualify for an elective surgery, such as a knee replacement. The patient may prepare an area of his or her body for the surgical
procedure by performing the one or more sets of exercises, thereby strengthening muscle groups, improving existing muscle memory, reducing pain, reducing stiffness, establishing new muscle memory, enhancing mobility (i.e., improving range of motion), improving blood flow, and/or the like.
[0072] The following discussion is directed to various embodiments of the present disclosure. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
[0073] Determining a treatment plan for a patient having certain characteristics (e.g., vital sign or other measurements; performance; demographic; geographic; diagnostic; measurement- or test-based; medically historic; etiologic; cohort-associative; differentially diagnostic; surgical, physically therapeutic, pharmacologic and other treatment(s) recommended; etc.) may be a technically challenging problem. For example, a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process. In a rehabilitative setting, some of the multitude of information considered may include characteristics of the patient such as personal information, performance information, and measurement information. The personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medication history, an injury, a medical procedure, a medication prescribed, or some combination thereof. The performance information may include, e.g., an elapsed time of using a treatment device, an amount of force exerted on a portion of the treatment device, a range of motion achieved on the treatment device, a movement speed of a portion of the treatment device, an indication of a plurality of pain levels using the treatment device, or some combination thereof. The measurement information may include, e.g., a vital sign, a respiration rate, a heartrate, a temperature, a blood pressure, or some combination thereof. It may be desirable to process the characteristics of a multitude of patients, the treatment plans performed for those patients, and the results of the treatment plans for those patients.
[0074] Further, another technical problem may involve distally treating, via a computing device during a telemedicine or telehealth session, a patient from a location different than a location at which the patient is located. An additional technical problem is controlling or enabling the control of, from the different location, a treatment device used by the patient at the location at which the patient is located. Oftentimes, when a patient undergoes rehabilitative surgery (e.g., knee surgery), a healthcare provider may prescribe a treatment device to the patient to use to perform a treatment protocol at their residence or any mobile location or temporary domicile. A healthcare provider may refer to a doctor, physician assistant, nurse, chiropractor, dentist, physical therapist, acupuncturist, physical trainer, coach, personal trainer, or the like. A healthcare provider may refer to any person with a credential, license, degree, or the like in the field of medicine, physical therapy, rehabilitation, or the like.
[0075] When the healthcare provider is located in a different location from the patient and the treatment device, it may be technically challenging for the healthcare provider to monitor the patient's actual progress (as opposed to relying on the patient's word about their progress) using the treatment device, modify the treatment plan according to the patient's progress, adapt the treatment device to the personal characteristics of the patient as the patient performs the treatment plan, and the like.
[0076] Accordingly, systems and methods, such as those described herein, that use sensor data to modify a treatment plan and/or to adapt the treatment device while a patient performs the treatment plan using the treatment device, may be desirable.
[0077] In some embodiments, the systems and methods described herein may be configured to receive treatment data pertaining to a user while the user is using the treatment device to perform the treatment plan. The user may include a patient user or person using the treatment device to perform various exercises. The treatment plan may correspond to a rehabilitation treatment plan, a prehabilitation treatment plan, an exercise treatment plan, or other suitable treatment plan. The treatment data may include various characteristics of the user, various measurement information pertaining to the user while the user uses the treatment device, various characteristics of the treatment device, the treatment plan, other suitable data, or a combination thereof.
[0078] In some embodiments, while the user uses the treatment device to perform the treatment plan, at least some of the treatment data may correspond to sensor data of a sensor configured to sense various characteristics of the treatment device and/or the measurement information of the user. Additionally, or alternatively, while the user uses the treatment device to perform the treatment plan, at least some of the treatment data may correspond to sensor data from a sensor associated with a wearable device configured to sense the measurement information of the user.
[0079] The various characteristics of the treatment device may include one or more settings of the treatment device, a current revolutions per time period (e.g., such as one minute) of a rotating member (e.g., such as a wheel) of the treatment device, a resistance setting of the treatment device, other suitable characteristics of the treatment device, or a combination thereof. The measurement information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof.
[0080] In some embodiments, the systems and methods described herein may be configured to generate treatment information using the treatment data. The treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device, formatted such that the treatment data is presentable at a computing device of a healthcare provider or healthcare professional responsible for the performance of the treatment plan by the user. The terms "healthcare provider" and "healthcare professional" may be used interchangeably herein. The healthcare provider or healthcare professional may include a medical professional (e.g., a doctor, a nurse, a therapist, and the like), an exercise professional (e.g., a coach, a trainer, a nutritionist, and the like), or another professional sharing at least one of medical and exercise attributes (e.g., an exercise physiologist, a physical therapist, an occupational therapist, and the like). As used herein, and without limiting the foregoing, a healthcare provider or healthcare professional may be a human being, a robot, a virtual assistant, a virtual assistant in a virtual and/or augmented reality, or an artificially intelligent entity, including a software program, integrated software and hardware, or hardware alone.
[0081] The systems and methods described herein may be configured to write to an associated memory, for access at the computing device of the healthcare provider, and/or provide, at the computing device of the healthcare provider, the treatment information. For example, the systems and methods described herein may be configured to provide the treatment information to an interface configured to present the treatment information to the healthcare provider. The interface may include a graphical user interface configured to provide the treatment information and receive input from the healthcare provider. The interface may include one or more input fields, such as text input fields, dropdown selection input fields, radio button input fields, virtual switch input fields, virtual lever input fields, audio, haptic, tactile, biometric or otherwise activated and/or driven input fields, other suitable input fields, or a combination thereof.
[0082] In some embodiments, the healthcare provider may review the treatment information and determine whether to modify the treatment plan and/or one or more characteristics of the treatment device. For example, the healthcare provider may review the treatment information and compare the treatment information to the treatment plan being performed by the user.
[0083] The healthcare provider may compare the following: (i) the expected information, which pertains to the user while the user uses the treatment device to perform the treatment plan, to (ii) the measurement information (e.g., indicated by the treatment information), which pertains to the user while the user uses the treatment device to perform the treatment plan. The expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof. The healthcare provider may determine that the treatment plan is having the desired effect if one or more parts or portions of the measurement information are within an acceptable range associated with one or more corresponding parts or portions of the expected information. Conversely, the healthcare provider may determine that the treatment plan is not having the desired effect if one or more parts or portions of the measurement information are outside of the acceptable range associated with one or more corresponding parts or portions of the expected information.
[0084] For example, the healthcare provider may determine whether a blood pressure value (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) corresponding to the user while the user uses the treatment device (e.g., indicated by the measurement information) is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, or any suitable range) of an expected blood pressure value indicated by the expected information. The healthcare provider may determine that the treatment plan is having the desired effect if the blood pressure value corresponding to the user while the user uses the treatment device is within the range of the expected blood pressure value. Conversely, the healthcare provider may determine that the treatment plan is not having the desired effect if the blood pressure value corresponding to the user while the user uses the treatment device is outside of the range of the expected blood pressure value.
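The acceptable-range check described above can be sketched as follows. The function name and the default 5% tolerance are illustrative assumptions, since the disclosure permits any suitable range.

```python
def within_expected_range(measured: float, expected: float,
                          tolerance_pct: float = 5.0) -> bool:
    """Return True if the measured value lies within plus or minus
    tolerance_pct percent of the expected value."""
    low = expected * (1 - tolerance_pct / 100.0)
    high = expected * (1 + tolerance_pct / 100.0)
    return low <= measured <= high
```

For example, a measured systolic pressure of 118 mmHg against an expected 120 mmHg falls inside a plus-or-minus 5% band (114 to 126 mmHg), whereas 130 mmHg does not.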
[0085] In some embodiments, the healthcare provider may compare the expected characteristics of the treatment device while the user uses the treatment device to perform the treatment plan with the characteristics of the treatment device indicated by the treatment information. For example, the healthcare provider may compare an expected resistance setting of the treatment device with an actual resistance setting of the treatment device indicated by the treatment information. The healthcare provider may determine that the user is performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are within a range of corresponding ones of the expected characteristics of the treatment device. Conversely, the healthcare provider may determine that the user is not performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are outside the range of corresponding ones of the expected characteristics of the treatment device.
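A sketch of this characteristic-by-characteristic comparison appears below. The dictionary keys (e.g., "resistance", "rpm") and tolerance values are hypothetical, as the disclosure does not fix particular settings or ranges.

```python
def plan_performed_properly(actual: dict, expected: dict,
                            tolerances: dict) -> bool:
    """Compare each expected device characteristic against the value
    reported in the treatment information, allowing an optional
    per-characteristic tolerance (default: exact match)."""
    for key, expected_value in expected.items():
        if key not in actual:
            return False  # characteristic missing from treatment data
        if abs(actual[key] - expected_value) > tolerances.get(key, 0):
            return False
    return True
```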
[0086] If the healthcare provider determines that the treatment information indicates that the user is performing the treatment plan properly and/or that the treatment plan is having the desired effect, the healthcare provider may determine not to modify the treatment plan or the one or more characteristics of the treatment device. Conversely, while the user uses the treatment device to perform the treatment plan, if the healthcare provider determines that the treatment information indicates that the user is not or has not been performing the treatment plan properly and/or that the treatment plan is not or has not been having the desired effect, the healthcare provider may determine to modify the treatment plan and/or the one or more characteristics of the treatment device.
[0087] In some embodiments, the healthcare provider may interact with the interface to provide treatment plan input indicating one or more modifications to the treatment plan and/or to one or more characteristics of the treatment device if the healthcare provider determines to modify the treatment plan and/or the one or more characteristics of the treatment device. For example, the healthcare provider may use the interface to provide input indicating an increase or decrease in the resistance setting of the treatment device, or other suitable modification to the one or more characteristics of the treatment device. Additionally, or alternatively, the healthcare provider may use the interface to provide input indicating a modification to the treatment plan. For example, the healthcare provider may use the interface to provide input indicating an increase or decrease in an amount of time the user is required to use the treatment device according to the treatment plan, or other suitable modifications to the treatment plan.
[0088] In some embodiments, the systems and methods described herein may be configured to modify the treatment plan based on one or more modifications indicated by the treatment plan input. Additionally, or alternatively, the systems and methods described herein may be configured to modify the one or more characteristics of the treatment device based on the modification to at least one aspect of the treatment plan and/or the treatment plan input. For example, the treatment plan input may indicate to modify the one or more characteristics of the treatment device and/or the treatment plan may require or indicate adjustments to the treatment device in order for the user to achieve the desired results of the modified treatment plan.
[0089] In some embodiments, the systems and methods described herein may be configured to receive subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan. For example, after the healthcare provider provides input modifying the treatment plan and/or controlling the one or more characteristics of the treatment device, the user may continue to use the treatment device to perform the modified treatment plan. The subsequent treatment data may correspond to treatment data generated while the user uses the treatment device to perform the modified treatment plan. In some embodiments, the subsequent treatment data may correspond to treatment data generated while the user continues to use the treatment device to perform the treatment plan, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or control the one or more characteristics of the treatment device.
[0090] Based on subsequent treatment plan input received from the computing device of the healthcare provider, the systems and methods described herein may be configured to further modify the treatment plan and/or control the one or more characteristics of the treatment device. The subsequent treatment plan input may correspond to input provided by the healthcare provider, at the interface, in response to receiving and/or reviewing subsequent treatment information corresponding to the subsequent treatment data. It should be understood that the systems and methods described herein may be configured to continuously and/or periodically provide treatment information to the computing device of the healthcare provider based on treatment data continuously and/or periodically received from the sensors or other suitable sources described herein.
[0091] The healthcare provider may receive and/or review treatment information continuously or periodically while the user uses the treatment device to perform the treatment plan. Based on one or more trends indicated by the continuously and/or periodically received treatment information, the healthcare provider may determine whether to modify the treatment plan and/or control the one or more characteristics of the treatment device. For example, the one or more trends may indicate an increase in heart rate or other suitable trends indicating that the user is not performing the treatment plan properly and/or performance of the treatment plan by the user is not having the desired effect.
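The trend analysis described above (e.g., detecting a rising heart rate across periodically received treatment information) can be sketched as an average of successive differences between readings. The function names and the review threshold below are illustrative assumptions:

```python
def average_trend(readings):
    """Average change between successive readings; positive means rising."""
    if len(readings) < 2:
        return 0.0
    deltas = [b - a for a, b in zip(readings, readings[1:])]
    return sum(deltas) / len(deltas)


def flag_for_review(heart_rates, rise_per_reading=2.0):
    """Flag a session for provider review if heart rate is rising faster
    than an assumed per-reading threshold."""
    return average_trend(heart_rates) > rise_per_reading
```

A production system would likely use a more robust estimator (e.g., a regression slope over a sliding window), but the shape of the decision is the same: a trend statistic compared against a clinically chosen threshold.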
[0092] In some embodiments, the systems and methods described herein may be configured to use artificial intelligence and/or machine learning to assign patients to cohorts and to dynamically control a treatment device based on the assignment during an adaptive telemedicine session. In some embodiments, numerous treatment devices may be provided to patients. The treatment devices may be used by the patients to perform treatment plans in their residences, at a gym, at a rehabilitative center, at a hospital, or any suitable location, including permanent or temporary domiciles.
[0093] In some embodiments, the treatment devices may be communicatively coupled to a server. Characteristics of the patients, including the treatment data, may be collected before, during, and/or after the patients perform the treatment plans. For example, the personal information, the performance information, and the measurement information may be collected before, during, and/or after the person performs the treatment plans. The results (e.g., improved performance or decreased performance) of performing each exercise may be collected from the treatment device throughout the treatment plan and after the treatment plan is performed. The parameters, settings, configurations, etc. (e.g., position of pedal, amount of
resistance, etc.) of the treatment device may be collected before, during, and/or after the treatment plan is performed.
[0094] Each characteristic of the patient, each result, and each parameter, setting, configuration, etc. may be timestamped and may be correlated with a particular step in the treatment plan. Such a technique may enable determining which steps in the treatment plan lead to desired results (e.g., improved muscle strength, range of motion, etc.) and which steps lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or harms recovery).
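The timestamp correlation described above can be sketched as bucketing each timestamped measurement into the treatment-plan step whose time window contains it. Numeric timestamps (e.g., seconds from session start) and the field names are illustrative assumptions:

```python
def correlate_with_steps(measurements, steps):
    """measurements: (timestamp, value) pairs; steps: dicts with a "name"
    and a half-open [start, end) time window. Returns values per step."""
    by_step = {step["name"]: [] for step in steps}
    for ts, value in measurements:
        for step in steps:
            if step["start"] <= ts < step["end"]:
                by_step[step["name"]].append(value)
                break  # each measurement belongs to at most one step
    return by_step
```

Per-step buckets make it straightforward to ask, for example, whether measured results degrade after minute 3 of a given step, i.e., the diminishing-returns question raised in the text.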
[0095] Data may be collected from the treatment devices and/or any suitable computing device (e.g., computing devices where personal information is entered, such as the interface of the computing device described herein, a clinician interface, patient interface, and the like) over time as the patients use the treatment devices to perform the various treatment plans. The data that may be collected may include the characteristics of the patients, the treatment plans performed by the patients, the results of the treatment plans, any of the data described herein, any other suitable data, or a combination thereof.
[0096] In some embodiments, the data may be processed to group certain people into cohorts. The people may be grouped by people having certain or selected similar characteristics, treatment plans, and results of performing the treatment plans. For example, athletic people having no medical conditions who perform a treatment plan (e.g., use the treatment device for 30 minutes a day 5 times a week for 3 weeks) and who fully recover may be grouped into a first cohort. Older people who are classified as obese and who perform a treatment plan (e.g., use the treatment device for 10 minutes a day 3 times a week for 4 weeks) and who improve their range of motion by 75 percent may be grouped into a second cohort.
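The two example cohorts above can be expressed as simple predicate rules. The thresholds and field names here are illustrative assumptions; the described system may instead derive groupings statistically from the collected data:

```python
def assign_cohort(person):
    """Toy rules mirroring the two example cohorts in the text."""
    if person["athletic"] and not person["medical_conditions"]:
        return "cohort_1"  # e.g., 30 min/day, 5x/week, 3 weeks
    if person["age"] >= 65 and person["bmi"] >= 30:
        return "cohort_2"  # e.g., 10 min/day, 3x/week, 4 weeks
    return "unassigned"
```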
[0097] In some embodiments, an artificial intelligence engine may include one or more machine learning models that are trained using the cohorts. For example, the one or more machine learning models may be trained to receive an input of characteristics of a new patient and to output a treatment plan for the patient that results in a desired result. The machine learning models may match a pattern between the characteristics of the new patient and at least one patient of the patients included in a particular cohort. When a pattern is matched, the machine learning models may assign the new patient to the particular cohort and select the treatment plan associated with the at least one patient. The artificial intelligence engine may be configured to control, distally and based on the treatment plan, the treatment device while the new patient uses the treatment device to perform the treatment plan.
[0098] As may be appreciated, the characteristics of the new patient (e.g., a new user) may change as the new patient uses the treatment device to perform the treatment plan. For example, the performance of the patient may improve more quickly than expected for people in the cohort to which the new patient is currently assigned. Accordingly, the machine learning models may be trained to dynamically reassign, based on the changed characteristics, the new patient to a different cohort that includes people having characteristics similar to the now-changed characteristics of the new patient. For example, a clinically obese patient may lose weight and no longer meet the weight criterion for the initial cohort, resulting in the patient being reassigned to a different cohort with a different weight criterion.
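The dynamic reassignment described above amounts to re-evaluating cohort criteria against the patient's current characteristics whenever those characteristics change. The weight criterion below follows the example in the text; the function and field names are illustrative assumptions:

```python
def reassign(patient, cohort_criteria):
    """Return the first cohort whose criterion the patient's current
    characteristics satisfy; call again whenever characteristics change."""
    for name, criterion in cohort_criteria.items():
        if criterion(patient):
            return name
    return None


# Hypothetical weight-based cohorts, per the example in the text.
cohorts = {
    "obese": lambda p: p["weight_kg"] >= 100,
    "standard": lambda p: p["weight_kg"] < 100,
}
```

Running `reassign` on updated treatment data moves a patient who has lost weight out of the first cohort and into the second, at which point a different treatment plan may be selected.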
[0099] A different treatment plan may be selected for the new patient, and the treatment device may be controlled, distally (e.g., which may be referred to as remotely) and based on the different treatment plan, while the new patient uses the treatment device to perform the treatment plan. Such techniques may provide the technical solution of distally controlling a treatment device.
[0100] Further, the systems and methods described herein may lead to faster recovery times and/or better results for the patients because the treatment plan that most accurately fits their characteristics is selected and implemented, in real-time, at any given moment. "Real-time" may also refer to near real-time, which may be less than 10 seconds. As described herein, the term "results" may refer to medical results or medical outcomes. Results and outcomes may refer to responses to medical actions.
[0101] Depending on what result is desired, the artificial intelligence engine may be trained to output several treatment plans. For example, one result may include recovering to a threshold level (e.g., 75% range of motion) in a fastest amount of time, while another result may include fully recovering (e.g., 100% range of motion) regardless of the amount of time. The data obtained from the patients and sorted into cohorts may indicate that a first treatment plan provides the first result for people with characteristics similar to the patient's, and that a second treatment plan provides the second result for people with characteristics similar to the patient.
[0102] Further, the artificial intelligence engine may be trained to output treatment plans that are not optimal, i.e., sub-optimal, nonstandard, or otherwise excluded (all referred to, without limitation, as "excluded treatment plans") for the patient. For example, if a patient has high blood pressure, a particular exercise may not be approved or suitable for the patient, as it may put the patient at unnecessary risk or even induce a hypertensive crisis; accordingly, that exercise may be flagged in the excluded treatment plan for the patient. In some embodiments, the artificial intelligence engine may monitor the treatment data received while the patient (e.g., the user) with, for example, high blood pressure, uses the treatment device to perform an appropriate treatment plan, and may modify the appropriate treatment plan to include features of an excluded treatment plan that may provide beneficial results for the patient if the treatment data indicates the patient is handling the appropriate treatment plan without aggravating, for example, the high blood pressure condition of the patient.
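The exclusion logic above can be sketched as intersecting each candidate plan's contraindications with the patient's known conditions. The field names ("contraindications") and condition labels are illustrative assumptions:

```python
def split_plans(candidate_plans, patient_conditions):
    """Partition candidate plans into appropriate and excluded treatment
    plans, based on whether any contraindication matches the patient."""
    appropriate, excluded = [], []
    for plan in candidate_plans:
        if set(plan["contraindications"]) & set(patient_conditions):
            excluded.append(plan)
        else:
            appropriate.append(plan)
    return appropriate, excluded
```

Keeping the excluded plans, rather than discarding them, matches the text: an excluded plan's features may later be folded back into an appropriate plan if the monitored treatment data shows the patient tolerating the current plan.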
[0103] In some embodiments, the treatment plans and/or excluded treatment plans may be presented, during a telemedicine or telehealth session, to a healthcare provider. The healthcare provider may select a particular treatment plan for the patient to cause that treatment plan to be transmitted to the patient and/or to control, based on the treatment plan, the treatment device. In some embodiments, to facilitate telehealth or telemedicine applications, including remote diagnoses, determination of treatment plans and rehabilitative and/or pharmacologic prescriptions, the artificial intelligence engine may receive and/or operate distally from the patient and the treatment device.
[0104] In such cases, the recommended treatment plans and/or excluded treatment plans may be presented simultaneously with a video of the patient in real-time or near real-time during a telemedicine or telehealth session on a user interface of a computing device of a healthcare provider. The video may also be accompanied by audio, text and other multimedia information. Real-time may refer to less than or equal to 2 seconds. Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface, and will generally be less than 10 seconds but greater than 2 seconds.
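The latency definitions in this paragraph map directly onto a small classification helper. The "delayed" label for anything at or above 10 seconds is an assumption, since the text only bounds near real-time from above:

```python
def classify_latency(seconds):
    """Per the text: real-time <= 2 s; near real-time > 2 s and < 10 s."""
    if seconds <= 2:
        return "real-time"
    if seconds < 10:
        return "near real-time"
    return "delayed"
```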
[0105] Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the healthcare provider may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface. The enhanced user interface may improve the healthcare provider's experience using the computing device and may encourage the healthcare provider to reuse the user interface. Such a technique may also reduce computing resources (e.g., processing, memory, network) because the healthcare provider does not have to switch to another user interface screen to enter a query for a treatment plan to recommend based on the characteristics of the patient. The artificial intelligence engine may be configured to provide, dynamically on the fly, the treatment plans and excluded treatment plans.
[0106] In some embodiments, the treatment device may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient. For example, the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user. In some embodiments, a healthcare provider may adapt, remotely during a telemedicine session, the treatment device to the needs of the patient by causing a control instruction to be transmitted from a server to the treatment device. Such an adaptive nature may improve the results of recovery for a patient, furthering the goals of personalized medicine, and enabling personalization of the treatment plan on a per-individual basis.
[0107] FIG. 1 generally illustrates a block diagram of a computer-implemented system 10, hereinafter called "the system," for managing a treatment plan. Managing the treatment plan may include using an artificial intelligence engine to recommend treatment plans and/or provide excluded treatment plans that should not be recommended to a patient.
[0108] The system 10 also includes a server 30 configured to store (e.g., write to an associated memory) and to provide data related to managing the treatment plan. The server may include one or more computers and may take the form of a distributed and/or virtualized computer or computers. The server 30 also includes a first communication interface 32 configured to communicate with the clinician interface 20 via a first network 34. In some embodiments, the first network 34 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. The server 30 includes a first processor 36 and a first machine-readable storage memory 38, which may be called a "memory" for short, holding first instructions 40 for performing the various actions of the server 30 for execution by the first processor 36.
[0109] The server 30 is configured to store data regarding the treatment plan. For example, the memory 38 includes a system data store 42 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients. The server 30 is also configured to store data regarding performance by a patient in following a treatment plan. For example, the memory 38 includes a patient data store 44 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient's performance within the treatment plan.
[0110] Additionally, or alternatively, correlations and other statistical or probabilistic measures applied to the characteristics (e.g., personal, performance, measurement, etc.) of the people, the treatment plans followed by the people, the levels of compliance with the treatment plans, and the results of the treatment plans may be used to partition the treatment plans into different patient cohort-equivalent databases in the patient data store 44. For example, the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patient, and a first result of the treatment plan may be stored in a first patient database. The data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patient, and a second result of the treatment plan may be stored in a second patient database. Any single characteristic or any combination of characteristics may be used to separate the cohorts of patients. In some embodiments, the different cohorts of patients may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different cohorts of patients allowed, other than as limited by mathematical combinatoric and/or partition theory.
[0111] This characteristic data, treatment plan data, and results data may be obtained from numerous treatment devices and/or computing devices over time and stored in the database 44. The characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 44. The characteristics of the people may include personal information, performance information, and/or measurement information.
[0112] In addition to the historical information about other people stored in the patient cohort-equivalent databases, real-time or near-real-time information about a current patient being treated, based on the current patient's characteristics, may be stored in an appropriate patient cohort-equivalent database. The characteristics of the patient may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A) and the patient may be assigned to that cohort.
[0113] In some embodiments, the server 30 may execute an artificial intelligence (AI) engine 11 that uses one or more machine learning models 13 to perform at least one of the embodiments disclosed herein. The server 30 may include a training engine 9 capable of generating the one or more machine learning models 13. The machine learning models 13 may be trained to assign people to certain cohorts based on their characteristics, select treatment plans using real-time and historical data correlations involving patient cohort equivalents, and control a treatment device 70, among other things.
[0114] The one or more machine learning models 13 may be generated by the training engine 9 and may be implemented in computer instructions executable by one or more processing devices of the training engine 9 and/or the servers 30. To generate the one or more machine learning models 13, the training engine 9 may train the one or more machine learning models 13. The one or more machine learning models 13 may be used by the artificial intelligence engine 11.
[0115] The training engine 9 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other suitable computing device, or a combination thereof. The training engine 9 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
[0116] To train the one or more machine learning models 13, the training engine 9 may use a training data set of a corpus of the characteristics of the people who used the treatment device to perform treatment plans, the details (e.g., treatment protocol including exercises, amount of time to perform the exercises, how often to perform the exercises, a schedule of exercises, parameters/configurations/settings of the treatment device 70 throughout each step of the treatment plan, etc.) of the treatment plans performed by the people using the treatment device 70, and the results of the treatment plans performed by the people. The one or more machine learning models 13 may be trained to match patterns of characteristics of a patient with characteristics of other people assigned to a particular cohort. The term "match" may refer to an exact match, a correlative match, a substantial match, etc. The one or more machine learning models 13 may be trained to receive the characteristics of a patient as input, map the characteristics to characteristics of people assigned to a cohort, and select a treatment plan from that cohort. The one or more machine learning models 13 may also be trained to control, based on the treatment plan, the treatment device 70.
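One simple form the pattern matching described here could take is a nearest-centroid match over numeric characteristics. This sketch is a stand-in for the trained machine learning models 13, not the disclosed implementation; the characteristic names are illustrative assumptions:

```python
def match_cohort(patient, cohort_centroids):
    """Assign the patient to the cohort whose characteristic centroid is
    nearest, using Euclidean distance over the patient's numeric fields."""
    def distance(a, b):
        return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5
    return min(cohort_centroids,
               key=lambda name: distance(patient, cohort_centroids[name]))
```

In practice the characteristics would be normalized to comparable scales before computing distances; a trained model can also weight characteristics by their relevance to outcomes.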
[0117] Different machine learning models 13 may be trained to recommend different treatment plans for different desired results. For example, one machine learning model may be trained to recommend treatment plans for most effective recovery, while another machine learning model may be trained to recommend treatment plans based on speed of recovery.
[0118] The one or more machine learning models 13 may refer to model artifacts created by the training engine 9 using training data that includes training inputs and corresponding target outputs. The training engine 9 may find patterns in the training data wherein such patterns map the training input to the target output, and generate the machine learning models 13 that capture these patterns. In some embodiments, the artificial intelligence engine 11, the database 33, and/or the training engine 9 may reside on another component (e.g., assistant interface 94, clinician interface 20, etc.) depicted in FIG. 1.
[0119] The one or more machine learning models 13 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 13 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations. Examples of deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself). For example, the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
[0120] The system 10 also includes a patient interface 50 configured to communicate information to a patient and to receive feedback from the patient. Specifically, the patient interface includes an input device 52 and an output device 54, which may be collectively called a patient user interface 52, 54. The input device 52 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The output device 54 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch. The output device 54 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc. The output device 54 may incorporate various different visual, audio, or other presentation technologies. For example, the output device 54 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions. The output device 54 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient. The output device 54 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0121] As is generally illustrated in FIG. 1, the patient interface 50 includes a second communication interface 56, which may also be called a remote communication interface configured to communicate with the server 30 and/or the clinician interface 20 via a second network 58. In some embodiments, the second network 58 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the second network 58 may include the Internet, and communications between the patient interface 50 and the server 30 and/or the clinician interface 20 may be secured via encryption, such as, for example, by using a virtual private network (VPN). In some embodiments, the second network 58 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. In some embodiments, the second network 58 may be the same as and/or operationally coupled to the first network 34.
[0122] The patient interface 50 includes a second processor 60 and a second machine readable storage memory 62 holding second instructions 64 for execution by the second processor 60 for performing various actions of patient interface 50. The second machine readable storage memory 62 also includes a local data store 66 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient's performance within a treatment plan. The patient interface 50 also includes a local communication interface 68 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 50. The local communication interface 68 may include wired and/or wireless communications. In some embodiments, the local communication interface 68 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
[0123] The system 10 also includes a treatment device 70 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan. In some embodiments, the treatment device 70 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group. The treatment device 70 may be any suitable medical, rehabilitative, therapeutic, etc. apparatus configured to be controlled distally via another computing device to treat a patient and/or exercise the patient. The treatment device 70 may be an electromechanical machine including one or more weights, an electromechanical bicycle, an electromechanical spin-wheel, a smart-mirror, a treadmill, or the like. The body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder. The body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament. As is generally illustrated in FIG. 1, the treatment device 70 includes a controller 72, which may include one or more processors, computer memory, and/or other components. The treatment device 70 also includes a fourth communication interface 74 configured to communicate with the patient interface 50 via the local communication interface 68. The treatment device 70 also includes one or more internal sensors 76 and an actuator 78, such as a motor. The actuator 78 may be used, for example, for moving the patient's body part and/or for resisting forces by the patient.
[0124] The internal sensors 76 may measure one or more operating characteristics of the treatment device 70 such as, for example, a force, a position, a speed, and/or a velocity. In some embodiments, the internal sensors 76 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient. For example, an internal sensor 76 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment device 70, where such distance may correspond to a range of motion that the patient's body part is able to achieve. In some embodiments, the internal sensors 76 may include a force sensor configured to measure a force applied by the patient. For example, an internal sensor 76 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment device 70.
[0125] The system 10 generally illustrated in FIG. 1 also includes an ambulation sensor 82, which communicates with the server 30 via the local communication interface 68 of the patient interface 50. The ambulation sensor 82 may track and store a number of steps taken by the patient. In some embodiments, the ambulation sensor 82 may take the form of a wristband, wristwatch, or smart watch. In some embodiments, the ambulation sensor 82 may be integrated within a phone, such as a smartphone.
[0126] The system 10 generally illustrated in FIG. 1 also includes a goniometer 84, which communicates with the server 30 via the local communication interface 68 of the patient interface 50. The goniometer 84 measures an angle of the patient's body part. For example, the goniometer 84 may measure the angle of flex of a patient's knee or elbow or shoulder.
[0127] The system 10 generally illustrated in FIG. 1 also includes a pressure sensor 86, which communicates with the server 30 via the local communication interface 68 of the patient interface 50. The pressure sensor 86 measures an amount of pressure or weight applied by a body part of the patient. For example, pressure sensor 86 may measure an amount of force applied by a patient's foot when pedaling a stationary bike.
[0128] The system 10 generally illustrated in FIG. 1 also includes a supervisory interface 90, which may be similar or identical to the clinician interface 20. In some embodiments, the supervisory interface 90 may have enhanced functionality beyond what is provided on the clinician interface 20. The supervisory interface 90 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
[0129] The system 10 generally illustrated in FIG. 1 also includes a reporting interface 92 which may be similar or identical to the clinician interface 20. In some embodiments, the reporting interface 92 may have less functionality than is provided on the clinician interface 20. For example, the reporting interface 92 may not have the ability to modify a treatment plan. Such a reporting interface 92 may be used, for example, by a biller to determine the use of the system 10 for billing purposes. In another example, the reporting interface 92 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject. Such a reporting interface 92 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
[0130] The system 10 includes an assistant interface 94 for a healthcare provider, such as those described herein, to remotely communicate with the patient interface 50 and/or the treatment device 70. Such remote communications may enable the healthcare provider to provide assistance or guidance to a patient using the system 10. More specifically, the assistant interface 94 is configured to communicate a telemedicine signal 96, 97, 98a, 98b, 99a, 99b with the patient interface 50 via a network connection such as, for example, via the first network 34 and/or the second network 58. The telemedicine signal 96, 97, 98a, 98b, 99a, 99b comprises one of an audio signal 96, an audiovisual signal 97, an interface control signal 98a for controlling a function of the patient interface 50, an interface monitor signal 98b for monitoring a status of the patient interface 50, an apparatus control signal 99a for changing an operating parameter of the treatment device 70, and/or an apparatus monitor signal 99b for monitoring a status of the treatment device 70. In some embodiments, each of the control signals 98a, 99a may be unidirectional, conveying commands from the assistant interface 94 to the patient interface 50. In some embodiments, in response to successfully receiving a control signal 98a, 99a and/or to communicate successful and/or unsuccessful implementation of the requested control action, an acknowledgement message may be sent from the patient interface 50 to the assistant interface 94. In some embodiments, each of the monitor signals 98b, 99b may be unidirectional, status-information commands from the patient interface 50 to the assistant interface 94. In some embodiments, an acknowledgement message may be sent from the assistant interface 94 to the patient interface 50 in response to successfully receiving one of the monitor signals 98b, 99b.
[0131] In some embodiments, the patient interface 50 may be configured as a pass-through for the apparatus control signals 99a and the apparatus monitor signals 99b between the treatment device 70 and one or more other devices, such as the assistant interface 94 and/or the server 30. For example, the patient interface 50 may be configured to transmit an apparatus control signal 99a in response to an apparatus control signal 99a within the telemedicine signal 96, 97, 98a, 98b, 99a, 99b from the assistant interface 94.
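The signal routing described in paragraphs [0130]–[0131] can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the class and function names are assumptions, and the pass-through is reduced to a routing decision (apparatus signals are forwarded to the treatment device 70, interface signals are handled by the patient interface 50 itself, and audio/audiovisual signals are presented to the patient).

```python
from dataclasses import dataclass
from enum import Enum, auto

class SignalKind(Enum):
    """Illustrative names for the telemedicine signal kinds 96-99b."""
    AUDIO = auto()              # audio signal 96
    AUDIOVISUAL = auto()        # audiovisual signal 97
    INTERFACE_CONTROL = auto()  # 98a: assistant interface -> patient interface
    INTERFACE_MONITOR = auto()  # 98b: patient interface -> assistant interface
    APPARATUS_CONTROL = auto()  # 99a: assistant interface -> treatment device
    APPARATUS_MONITOR = auto()  # 99b: treatment device -> assistant interface

@dataclass
class TelemedicineSignal:
    kind: SignalKind
    payload: dict

def route_at_patient_interface(signal: TelemedicineSignal) -> str:
    """Pass-through logic at the patient interface: apparatus control signals
    are forwarded onward to the treatment device; interface control signals
    are handled locally; media signals are presented to the patient."""
    if signal.kind is SignalKind.APPARATUS_CONTROL:
        return "forward-to-treatment-device"
    if signal.kind is SignalKind.INTERFACE_CONTROL:
        return "handle-locally"
    return "present-to-patient"
```

A real system would also emit the acknowledgement messages described in paragraph [0130] after each control signal is received and applied.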
[0132] In some embodiments, the assistant interface 94 may be presented on the same physical device as the clinician interface 20. For example, the clinician interface 20 may include one or more screens that implement the assistant interface 94. Alternatively or additionally, the clinician interface 20 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 94.
[0133] In some embodiments, one or more portions of the telemedicine signal 96, 97, 98a, 98b, 99a, 99b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 54 of the patient interface 50. For example, a tutorial video may be streamed from the server 30 and presented upon the patient interface 50. Content from the prerecorded source may be requested by the patient via the patient interface 50. Alternatively, via a control on the assistant interface 94, the healthcare provider may cause content from the prerecorded source to be played on the patient interface 50.
[0134] The assistant interface 94 includes an assistant input device 22 and an assistant display 24, which may be collectively called an assistant user interface 22, 24. The assistant input device 22 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example. Alternatively or additionally, the assistant input device 22 may include one or more microphones. In some embodiments, the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the healthcare provider to speak to a patient via the patient interface 50. In some embodiments, assistant input device 22 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the healthcare provider by using the one or more microphones. The assistant input device 22 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung. The assistant input device 22 may include other hardware and/or software components. The assistant input device 22 may include one or more general purpose devices and/or special-purpose devices.
[0135] The assistant display 24 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch. The assistant display 24 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc. The assistant display 24 may incorporate various different visual, audio, or other presentation technologies. For example, the assistant display 24 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions. The
assistant display 24 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the healthcare provider. The assistant display 24 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0136] In some embodiments, the system 10 may provide computer translation of language from the assistant interface 94 to the patient interface 50 and/or vice-versa. The computer translation of language may include computer translation of spoken language and/or computer translation of text. Additionally or alternatively, the system 10 may provide voice recognition and/or spoken pronunciation of text. For example, the system 10 may convert spoken words to printed text and/or the system 10 may audibly speak language from printed text. The system may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the healthcare provider. In some embodiments, the system 10 may be configured to recognize and react to spoken requests or commands by the patient. For example, the system may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
[0137] In some embodiments, the server 30 may generate aspects of the assistant display 24 for presentation by the assistant interface 94. For example, the server 30 may include a web server configured to generate the display screens for presentation upon the assistant display 24. For example, the artificial intelligence engine 11 may generate recommended treatment plans and/or excluded treatment plans for patients and generate the display screens including those recommended treatment plans and/or excluded treatment plans for presentation on the assistant display 24 of the assistant interface 94. In some embodiments, the assistant display 24 may be configured to present a virtualized desktop hosted by the server 30. In some embodiments, the server 30 may be configured to communicate with the assistant interface 94 via the first network 34. In some embodiments, the first network 34 may include a local area network (LAN), such as an Ethernet network.
[0138] In some embodiments, the first network 34 may include the Internet, and communications between the server 30 and the assistant interface 94 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN). Alternatively or additionally, the server 30 may be configured to communicate with the assistant interface 94 via one or more networks independent of the first
network 34 and/or other communication means, such as a direct wired or wireless communication channel. In some embodiments, the patient interface 50 and the treatment device 70 may each operate from a patient location geographically separate from a location of the assistant interface 94. For example, the patient interface 50 and the treatment device 70 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 94 at a centralized location, such as a clinic or a call center.
[0139] In some embodiments, the assistant interface 94 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians' offices. In some embodiments, a plurality of assistant interfaces 94 may be distributed geographically. In some embodiments, a person may work as a healthcare provider remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 94 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part time and/or flexible work hours for a healthcare provider.
[0140] FIGS. 2-3 show an embodiment of a treatment device 70. More specifically, FIG. 2 generally illustrates a treatment device 70 in the form of a stationary cycling machine 100, which may be called a stationary bike, for short. The stationary cycling machine 100 includes a set of pedals 102 each attached to a pedal arm 104 for rotation about an axle 106. In some embodiments, and as is generally illustrated in FIG. 2, the pedals 102 are movable on the pedal arms 104 in order to adjust a range of motion used by the patient in pedaling. For example, the pedals being located inwardly toward the axle 106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 106. A pressure sensor 86 is attached to or embedded within one of the pedals 102 for measuring an amount of force applied by the patient on the pedal 102. The pressure sensor 86 may communicate wirelessly to the treatment device 70 and/or to the patient interface 50.
[0141] FIG. 4 generally illustrates a person (a patient) using the treatment device of FIG. 2, and showing sensors and various data parameters connected to a patient interface 50. The example patient interface 50 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient. In some other embodiments, the patient interface 50 may be embedded within or attached to the treatment device 70.
[0142] FIG. 4 generally illustrates the patient wearing the ambulation sensor 82 on his wrist, with a note showing "STEPS TODAY 1355", indicating that the ambulation sensor 82 has recorded and transmitted that step count to the patient interface 50. FIG. 4 also generally illustrates the patient wearing the goniometer 84 on his right knee, with a note showing "KNEE ANGLE 72°", indicating that the goniometer 84 is measuring and transmitting that knee angle to the patient interface 50. FIG. 4 also generally illustrates a right side of one of the pedals 102 with a pressure sensor 86 showing "FORCE 12.5 lbs.," indicating that the right pedal pressure sensor 86 is measuring and transmitting that force measurement to the patient interface 50.
[0143] FIG. 4 also generally illustrates a left side of one of the pedals 102 with a pressure sensor 86 showing "FORCE 27 lbs.", indicating that the left pedal pressure sensor 86 is measuring and transmitting that force measurement to the patient interface 50. FIG. 4 also generally illustrates other patient data, such as an indicator of "SESSION TIME 0:04:13", indicating that the patient has been using the treatment device 70 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 50 based on information received from the treatment device 70. FIG. 4 also generally illustrates an indicator showing "PAIN LEVEL 3". Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 50.
[0144] FIG. 5 is an example embodiment of an overview display 120 of the assistant interface 94. Specifically, the overview display 120 presents several different controls and interfaces for the healthcare provider to remotely assist a patient with using the patient interface 50 and/or the treatment device 70. This remote assistance functionality may also be called telemedicine or telehealth.
[0145] Specifically, the overview display 120 includes a patient profile display 130 presenting biographical information regarding a patient using the treatment device 70. The patient profile display 130 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5, although the patient profile display 130 may take other forms, such as a separate screen or a popup window.
[0146] In some embodiments, the patient profile display 130 may include a limited subset of the patient's biographical information. More specifically, the data presented upon the patient profile display 130 may depend upon the healthcare provider's need for that information. For example, a healthcare provider that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment device 70 may be provided with a much more limited set of information regarding the patient. The technician, for example, may be given only the patient's name.
[0147] The patient profile display 130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements. Such privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a "data subject".
[0148] In some embodiments, the patient profile display 130 may present information regarding the treatment plan for the patient to follow in using the treatment device 70. Such treatment plan information may be limited to a healthcare provider. For example, a healthcare provider assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment device 70 may not be provided with any information regarding the patient's treatment plan.
[0149] In some embodiments, one or more recommended treatment plans and/or excluded treatment plans may be presented in the patient profile display 130 to the healthcare provider. The one or more recommended treatment plans and/or excluded treatment plans may be generated by the artificial intelligence engine 11 of the server 30 and received from the server in real-time during, inter alia, a telemedicine or telehealth session. An example of presenting the one or more recommended treatment plans and/or excluded treatment plans is described below with reference to FIG. 7.
[0150] The example overview display 120 generally illustrated in FIG. 5 also includes a patient status display 134 presenting status information regarding a patient using the treatment device. The patient status display 134 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5, although the patient status display 134 may take other forms, such as a separate screen or a popup window.
[0151] The patient status display 134 includes sensor data 136 from one or more of the external sensors 82, 84, 86, and/or from one or more internal sensors 76 of the treatment device 70. In some embodiments, the patient status display 134 may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 70. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like. The one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the patient while the patient is using the treatment device 70. In some embodiments, the patient status display 134 may present other data 138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
[0152] User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 20, 50, 90, 92, 94 of the system 10. In some embodiments, user access controls may be employed to control what information is available to any given person using the system 10. For example, data presented on the assistant interface 94 may be controlled by user access controls, with permissions set depending on the healthcare provider/user's need for and/or qualifications to view that information.
[0153] The example overview display 120 generally illustrated in FIG. 5 also includes a help data display 140 presenting information for the healthcare provider to use in assisting the patient. The help data display 140 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5. The help data display 140 may take other forms, such as a separate screen or a popup window. The help data display 140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 50 and/or the treatment device 70.
[0154] The help data display 140 may also include research data or best practices. In some embodiments, the help data display 140 may present scripts for answers or explanations in response to patient questions. In some embodiments, the help data display 140 may present flow charts or walk-throughs for the healthcare provider to use in determining a root cause and/or solution to a patient's problem.
[0155] In some embodiments, the assistant interface 94 may present two or more help data displays 140, which may be the same or different, for simultaneous presentation of help data for use by the healthcare provider. For example, a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient's problem, and a second help data display may present script information for the healthcare provider to read to the patient; such information preferably includes directions for the patient to perform some action, which may help to narrow down or solve the problem. In some embodiments, based upon inputs to the troubleshooting flowchart in the first help data display, the second help data display may automatically populate with script information.
[0156] The example overview display 120 generally illustrated in FIG. 5 also includes a patient interface control 150 presenting information regarding the patient interface 50, and/or to modify one or more settings of the patient interface 50. The patient interface control 150 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5. The patient interface control 150 may take other forms, such as a separate screen or a popup window. The patient interface control 150 may present information communicated to the assistant interface 94 via one or more of the interface monitor signals 98b.
[0157] As is generally illustrated in FIG. 5, the patient interface control 150 includes a display feed 152 of the display presented by the patient interface 50. In some embodiments, the display feed 152 may include a live copy of the display screen currently being presented to the patient by the patient interface 50. In other words, the display feed 152 may present an image of what is presented on a display screen of the patient interface 50.
[0158] In some embodiments, the display feed 152 may include abbreviated information regarding the display screen currently being presented by the patient interface 50, such as a screen name or a screen number. The patient interface control 150 may include a patient interface setting control 154 for the healthcare provider to adjust or to control one or more settings or aspects of the patient interface 50. In some embodiments, the patient interface setting control 154 may cause the assistant interface 94 to generate and/or to transmit an interface control signal 98 for controlling a function or a setting of the patient interface 50.
[0159] In some embodiments, the patient interface setting control 154 may include collaborative browsing or co-browsing capability for the healthcare provider to remotely view and/or control the patient interface 50. For example, the patient interface setting control 154 may enable the healthcare provider to remotely enter text to one or more text entry fields on the patient interface 50 and/or to remotely control a cursor on the patient interface 50 using a mouse or touchscreen of the assistant interface 94.
[0160] In some embodiments, the patient interface setting control 154 may allow the healthcare provider to change a setting that the patient cannot change using the patient interface 50. For example, the patient interface 50 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 50, the language used for the displays, whereas the patient interface setting control 154 may enable the healthcare provider to change the language setting of the patient interface 50. In another example, the patient interface 50 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 50 such that the display would become illegible to the patient, whereas the patient interface setting control 154 may provide for the healthcare provider to change the font size setting of the patient interface 50.
[0161] The example overview display 120 generally illustrated in FIG. 5 also includes an interface communications display 156 showing the status of communications between the patient interface 50 and one or more other devices 70, 82, 84, such as the treatment device 70, the ambulation sensor 82, and/or the goniometer 84. The interface communications display 156 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5.
[0162] The interface communications display 156 may take other forms, such as a separate screen or a popup window. The interface communications display 156 may include controls for the healthcare provider to remotely modify communications with one or more of the other devices 70, 82, 84. For example, the healthcare provider may remotely command the patient interface 50 to reset communications with one of the other devices 70, 82, 84, or to establish communications with a new one of the other devices 70, 82, 84. This functionality may be used, for example, where the patient has a problem with one of the other devices 70, 82, 84, or where the patient receives a new or a replacement one of the other devices 70, 82, 84.
[0163] The example overview display 120 generally illustrated in FIG. 5 also includes an apparatus control 160 for the healthcare provider to view and/or to control information regarding the treatment device 70. The apparatus control 160 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5. The apparatus control 160 may take other forms, such as a separate screen or a popup window. The apparatus control 160 may include an apparatus status display 162 with information regarding the current status of the apparatus. The apparatus status display 162 may present information communicated to the assistant interface 94 via one or more of the apparatus monitor signals 99b. The apparatus status display 162 may indicate whether the treatment device 70 is currently communicating with the patient interface 50. The apparatus status display 162 may present other current and/or historical information regarding the status of the treatment device 70.
[0164] The apparatus control 160 may include an apparatus setting control 164 for the healthcare provider to adjust or control one or more aspects of the treatment device 70. The apparatus setting control 164 may cause the assistant interface 94 to generate and/or to transmit an apparatus control signal 99 (e.g., which may be referred to as treatment plan input, as described) for changing an operating parameter and/or one or more characteristics of the treatment device 70, (e.g., a pedal radius setting, a resistance setting, a target RPM, other suitable characteristics of the treatment device 70, or a combination thereof).
[0165] The apparatus setting control 164 may include a mode button 166 and a position control 168, which may be used in conjunction for the healthcare provider to place an actuator 78 of the treatment device 70 in a manual mode, after which a setting, such as a position or a speed of the actuator 78, can be changed using the position control 168. The mode button 166 may provide for a setting, such as a position, to be toggled between automatic and manual modes.
[0166] In some embodiments, one or more settings may be adjustable at any time, and without having an associated auto/manual mode. In some embodiments, the healthcare provider may change an operating parameter of the treatment device 70, such as a pedal radius setting, while the patient is actively using the treatment device 70. Such "on the fly" adjustment may or may not be available to the patient using the patient interface 50.
[0167] In some embodiments, the apparatus setting control 164 may allow the healthcare provider to change a setting that cannot be changed by the patient using the patient interface 50. For example, the patient interface 50 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment device 70, whereas the apparatus setting control 164 may provide for the healthcare provider to change the height or tilt setting of the treatment device 70.
[0168] The example overview display 120 generally illustrated in FIG. 5 also includes a patient communications control 170 for controlling an audio or an audiovisual communications session with the patient interface 50. The communications session with the patient interface 50 may comprise a live feed from the assistant interface 94 for presentation by the output device of the patient interface 50. The live feed may take the form of an audio feed and/or a video feed. In some embodiments, the patient interface 50 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 94. Specifically, the communications session with the patient interface 50 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 50 and the assistant interface 94 presenting video of the other one.
[0169] In some embodiments, the patient interface 50 may present video from the assistant interface 94, while the assistant interface 94 presents only audio or the assistant interface 94 presents no live audio or visual signal from the patient interface 50. In some embodiments, the assistant interface 94 may present video from the patient interface 50, while the patient interface 50 presents only audio or the patient interface 50 presents no live audio or visual signal from the assistant interface 94.
[0170] In some embodiments, the audio or audiovisual communications session with the patient interface 50 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part. The patient communications control 170 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5. The patient communications control 170 may take other forms, such as a separate screen or a popup window.
[0171] The audio and/or audiovisual communications may be processed and/or directed by the assistant interface 94 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the healthcare provider while the healthcare provider uses the assistant interface 94. Alternatively or additionally, the audio and/or audiovisual communications may include communications with a third party. For example, the system 10 may enable the healthcare provider to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a healthcare provider or a specialist. The example patient communications control 170 generally illustrated in FIG. 5 includes call controls 172 for the healthcare provider to use in
managing various aspects of the audio or audiovisual communications with the patient. The call controls 172 include a disconnect button 174 for the healthcare provider to end the audio or audiovisual communications session. The call controls 172 also include a mute button 176 to temporarily silence an audio or audiovisual signal from the assistant interface 94. In some embodiments, the call controls 172 may include other features, such as a hold button (not shown).
[0172] The call controls 172 also include one or more record/playback controls 178, such as record, play, and pause buttons to control, with the patient interface 50, recording and/or playback of audio and/or video from the teleconference session. The call controls 172 also include a video feed display 180 for presenting still and/or video images from the patient interface 50, and a self-video display 182 showing the current image of the healthcare provider using the assistant interface 94. The self-video display 182 may be presented as a picture-in-picture format, within a section of the video feed display 180, as is generally illustrated in FIG. 5. Alternatively or additionally, the self-video display 182 may be presented separately and/or independently from the video feed display 180.
[0173] The example overview display 120 generally illustrated in FIG. 5 also includes a third party communications control 190 for use in conducting audio and/or audiovisual communications with a third party. The third party communications control 190 may take the form of a portion or region of the overview display 120, as is generally illustrated in FIG. 5. The third party communications control 190 may take other forms, such as a display on a separate screen or a popup window.
[0174] The third party communications control 190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a healthcare provider or a specialist. The third party communications control 190 may include conference calling capability for the third party to simultaneously communicate with both the healthcare provider via the assistant interface 94, and with the patient via the patient interface 50. For example, the system 10 may provide for the healthcare provider to initiate a 3-way conversation with the patient and the third party.
[0175] FIG. 6 generally illustrates an example block diagram of training a machine learning model 13 to output, based on data 600 pertaining to the patient, a treatment plan 602 for the
patient according to the present disclosure. Data pertaining to other patients may be received by the server 30. The other patients may have used various treatment devices to perform treatment plans.
[0176] The data may include characteristics of the other patients, the details of the treatment plans performed by the other patients, and/or the results of performing the treatment plans (e.g., a percent of recovery of a portion of the patients' bodies, an amount of recovery of a portion of the patients' bodies, an amount of increase or decrease in muscle strength of a portion of patients' bodies, an amount of increase or decrease in range of motion of a portion of patients' bodies, etc.).
[0177] As depicted, the data has been assigned to different cohorts. Cohort A includes data for patients having similar first characteristics, first treatment plans, and first results. Cohort B includes data for patients having similar second characteristics, second treatment plans, and second results. For example, cohort A may include first characteristics of patients in their twenties without any medical conditions who underwent surgery for a broken limb; their treatment plans may include a certain treatment protocol (e.g., use the treatment device 70 for minutes 5 times a week for 3 weeks, wherein values for the properties, configurations, and/or settings of the treatment device 70 are set to X (where X is a numerical value) for the first two weeks and to Y (where Y is a numerical value) for the last week).
[0178] Cohort A and cohort B may be included in a training dataset used to train the machine learning model 13. The machine learning model 13 may be trained to match a pattern between characteristics for each cohort and output the treatment plan that provides the result. Accordingly, when the data 600 for a new patient is input into the trained machine learning model 13, the trained machine learning model 13 may match the characteristics included in the data 600 with characteristics in either cohort A or cohort B and output the appropriate treatment plan 602. In some embodiments, the machine learning model 13 may be trained to output one or more excluded treatment plans that should not be performed by the new patient.
[0179] FIG. 7 generally illustrates an embodiment of an overview display 120 of the assistant interface 94 presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the present disclosure. As depicted, the overview display 120 includes only sections for the patient profile 130 and the video feed display 180, including the self-video display 182. Any suitable configuration of controls and interfaces of the overview display 120 described with reference to FIG. 5 may be presented in addition to or instead of the patient profile 130, the video feed display 180, and the self-video display 182.
[0180] The healthcare provider using the assistant interface 94 (e.g., computing device) during the telemedicine session may be presented in the self-video 182 in a portion of the overview display 120 (e.g., user interface presented on a display screen 24 of the assistant interface 94) that also presents a video from the patient in the video feed display 180. Further, the video feed display 180 may also include a graphical user interface (GUI) object 700 (e.g., a button) that enables the healthcare provider to share, in real-time or near real-time during the telemedicine session, the recommended treatment plans and/or the excluded treatment plans with the patient on the patient interface 50. The healthcare provider may select the GUI object 700 to share the recommended treatment plans and/or the excluded treatment plans. As depicted, another portion of the overview display 120 includes the patient profile display 130.
[0181] The patient profile display 130 is presenting two example recommended treatment plans 600 and one example excluded treatment plan 602. As described herein, the treatment plans may be recommended in view of characteristics of the patient being treated. To generate the recommended treatment plans 600 that the patient should follow to achieve a desired result, a pattern between the characteristics of the patient being treated and a cohort of other people who have used the treatment device 70 to perform a treatment plan may be matched by one or more machine learning models 13 of the artificial intelligence engine 11. Each of the recommended treatment plans may be generated based on different desired results.
[0182] For example, as depicted, the patient profile display 130 presents "The characteristics of the patient match characteristics of users in Cohort A. The following treatment plans are recommended for the patient based on his characteristics and desired results." Then, the patient profile display 130 presents recommended treatment plans from cohort A, and each treatment plan provides different results.
[0183] As depicted, treatment plan "A" indicates "Patient X should use treatment device for 30 minutes a day for 4 days to achieve an increased range of motion of Y%; Patient X has Type 2 Diabetes; and Patient X should be prescribed medication Z for pain management during the treatment plan (medication Z is approved for people having Type 2 Diabetes)."
Accordingly, the treatment plan generated achieves increasing the range of motion of Y%. As may be appreciated, the treatment plan also includes a recommended medication (e.g., medication Z) to prescribe to the patient to manage pain in view of a known medical disease (e.g., Type 2 Diabetes) of the patient. That is, the recommended patient medication not only does not conflict with the medical condition of the patient but also improves the probability of a superior patient outcome. This specific example and all such examples elsewhere herein are not intended to limit in any way the generated treatment plan from recommending multiple medications, or from handling the acknowledgement, view, diagnosis and/or treatment of comorbid conditions or diseases.
[0184] Recommended treatment plan "B" may specify, based on a different desired result of the treatment plan, a different treatment plan including a different treatment protocol for a treatment device, a different medication regimen, etc.
[0185] As depicted, the patient profile display 130 may also present the excluded treatment plans 602. These types of treatment plans are shown to the healthcare provider using the assistant interface 94 to alert the healthcare provider not to recommend certain portions of a treatment plan to the patient. For example, the excluded treatment plan could specify the following: "Patient X should not use treatment device for longer than 30 minutes a day due to a heart condition; Patient X has Type 2 Diabetes; and Patient X should not be prescribed medication M for pain management during the treatment plan (in this scenario, medication M can cause complications for people having Type 2 Diabetes)." Specifically, the excluded treatment plan points out a limitation of a treatment protocol where, due to a heart condition, Patient X should not exercise for more than 30 minutes a day. The ruled-out treatment plan also points out that Patient X should not be prescribed medication M because it conflicts with the medical condition Type 2 Diabetes.
[0186] The healthcare provider may select the treatment plan for the patient on the overview display 120. For example, the healthcare provider may use an input peripheral (e.g., mouse, touchscreen, microphone, keyboard, etc.) to select from the treatment plans 600 for the patient. In some embodiments, during the telemedicine session, the healthcare provider may discuss the pros and cons of the recommended treatment plans 600 with the patient.
[0187] In any event, the healthcare provider may select the treatment plan for the patient to follow to achieve the desired result. The selected treatment plan may be transmitted to the
patient interface 50 for presentation. The patient may view the selected treatment plan on the patient interface 50. In some embodiments, the healthcare provider and the patient may discuss during the telemedicine session the details (e.g., treatment protocol using treatment device 70, diet regimen, medication regimen, etc.) in real-time or in near real-time. In some embodiments, the server 30 may control, based on the selected treatment plan and during the telemedicine session, the treatment device 70 as the user uses the treatment device 70.
[0188] FIG. 8 generally illustrates an embodiment of the overview display 120 of the assistant interface 94 presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the present disclosure. As may be appreciated, the treatment device 70 and/or any computing device (e.g., patient interface 50) may transmit data while the patient uses the treatment device 70 to perform a treatment plan. The data may include updated characteristics of the patient and/or other treatment data. For example, the updated characteristics may include new performance information and/or measurement information. The performance information may include a speed of a portion of the treatment device 70, a range of motion achieved by the patient, a force exerted on a portion of the treatment device 70, a heartrate of the patient, a blood pressure of the patient, a respiratory rate of the patient, and so forth.
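The updated characteristics transmitted while the patient uses the treatment device 70 can be pictured, purely as a non-limiting sketch, as a small structured record. The field names and units below are hypothetical illustrations of the performance information listed above, not a prescribed schema.

```python
from dataclasses import dataclass, asdict

@dataclass
class PerformanceInfo:
    """Hypothetical shape of performance data a treatment device or patient
    interface might transmit during performance of a treatment plan."""
    pedal_speed_rpm: float        # speed of a portion of the treatment device
    range_of_motion_deg: float    # range of motion achieved by the patient
    force_newtons: float          # force exerted on a portion of the device
    heart_rate_bpm: int           # heart rate of the patient
    blood_pressure_mmhg: tuple    # (systolic, diastolic)
    respiratory_rate_bpm: int     # respiratory rate of the patient

sample = PerformanceInfo(55.0, 95.0, 120.0, 88, (122, 78), 16)
print(asdict(sample)["heart_rate_bpm"])  # → 88
```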
[0189] In some embodiments, the data received at the server 30 may be input into the trained machine learning model 13, which may determine that the characteristics indicate the patient is on track for the current treatment plan. Determining the patient is on track for the current treatment plan may cause the trained machine learning model 13 to adjust a parameter of the treatment device 70. The adjustment may be based on a next step of the treatment plan to further improve the performance of the patient.
[0190] In some embodiments, the data received at the server 30 may be input into the trained machine learning model 13, which may determine that the characteristics indicate the patient is not on track (e.g., behind schedule, not able to maintain a speed, not able to achieve a certain range of motion, is in too much pain, etc.) for the current treatment plan or is ahead of schedule (e.g., exceeding a certain speed, exercising longer than specified with no pain, exerting more than a specified force, etc.) for the current treatment plan.
[0191] The trained machine learning model 13 may determine that the characteristics of the patient no longer match the characteristics of the patients in the cohort to which the patient is
assigned. Accordingly, the trained machine learning model 13 may reassign the patient to another cohort that includes characteristics matching the patient's characteristics. As such, the trained machine learning model 13 may select a new treatment plan from the new cohort and control, based on the new treatment plan, the treatment device 70.
[0192] In some embodiments, prior to controlling the treatment device 70, the server 30 may provide the new treatment plan 800 to the assistant interface 94 for presentation in the patient profile 130. As depicted, the patient profile 130 indicates "The characteristics of the patient have changed and now match characteristics of users in Cohort B. The following treatment plan is recommended for the patient based on his characteristics and desired results." Then, the patient profile 130 presents the new treatment plan 800 ("Patient X should use the treatment device for 10 minutes a day for 3 days to achieve an increased range of motion of L%."). The healthcare provider may select the new treatment plan 800, and the server 30 may receive the selection. The server 30 may control the treatment device 70 based on the new treatment plan 800. In some embodiments, the new treatment plan 800 may be transmitted to the patient interface 50 such that the patient may view the details of the new treatment plan 800.
[0193] In some embodiments, while the patient is using the treatment device 70 to perform the treatment plan, the server 30 may receive treatment data pertaining to the patient. As described, the treatment plan may correspond to a rehabilitation treatment plan, a prehabilitation treatment plan, an exercise treatment plan, or other suitable treatment plan. The treatment data may include various characteristics of the patient (e.g., such as those described herein), various measurement information pertaining to the patient while the patient uses the treatment device 70 (e.g., such as those described herein), various characteristics of the treatment device 70 (e.g., such as those described herein), the treatment plan, other suitable data, or a combination thereof.
[0194] In some embodiments, at least some of the treatment data may include the sensor data 136 from one or more of the external sensors 82, 84, 86, and/or from one or more internal sensors 76 of the treatment device 70. In some embodiments, at least some of the treatment data may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 70. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, a head sweatband, a wrist sweatband,
any other suitable sweatband, any other suitable wearable, or a combination thereof. While the patient is using the treatment device 70, the one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the patient.
[0195] In some embodiments, the server 30 may generate treatment information using the treatment data. The treatment information may include a formatted summary of the performance of the treatment plan by the user while using the treatment device, such that the treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user. In some embodiments, the patient profile display 130 may include and/or display the treatment information.
[0196] The server 30 may be configured to provide, at the overview display 120, the treatment information. For example, the server 30 may store the treatment information for access by the overview display 120 and/or communicate the treatment information to the overview display 120. In some embodiments, the server 30 may provide the treatment information to patient profile display 130 or other suitable section, portion, or component of the overview display 120 or to any other suitable display or interface.
[0197] In some embodiments, the healthcare provider assisting the patient while using the treatment device 70 may review the treatment information and determine whether to modify the treatment plan and/or one or more characteristics of the treatment device 70. For example, the healthcare provider may review the treatment information and compare the treatment information to the treatment plan being performed by the patient.
[0198] While the patient uses the treatment device 70, the healthcare provider may compare one or more parts or portions of expected information pertaining to the patient's ability to perform the treatment plan with one or more corresponding parts or portions of the measurement information (e.g., indicated by the treatment information) pertaining to the patient while the patient uses the treatment device 70 to perform the treatment plan. The expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof. The healthcare provider may determine that the treatment plan is having the desired effect if one or more parts or portions of the measurement information are within an acceptable range of one or more corresponding
parts or portions of the expected information. Conversely, the healthcare provider may determine that the treatment plan is not having the desired effect if one or more parts or portions of the measurement information are outside of the acceptable range of one or more corresponding parts or portions of the expected information.
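The range comparison described above can be sketched as follows. This is a non-limiting illustration only: the function name, keys, and the acceptable ranges shown are hypothetical, and in practice the expected information would derive from the treatment plan and the patient's baseline.

```python
# Hypothetical acceptable ranges for parts or portions of the expected information.
EXPECTED_RANGES = {
    "heart_rate_bpm": (60, 140),
    "respiration_rate_bpm": (12, 25),
    "systolic_bp_mmhg": (90, 150),
}

def desired_effect(measurements, expected=EXPECTED_RANGES):
    """Return (on_track, out_of_range): the treatment plan is considered to be
    having the desired effect only if every part of the measurement information
    falls within the acceptable range of its corresponding expected value."""
    out_of_range = [k for k, v in measurements.items()
                    if k in expected and not (expected[k][0] <= v <= expected[k][1])]
    return (not out_of_range, out_of_range)

print(desired_effect({"heart_rate_bpm": 110, "respiration_rate_bpm": 18}))
# → (True, [])
print(desired_effect({"heart_rate_bpm": 168, "respiration_rate_bpm": 18}))
# → (False, ['heart_rate_bpm'])
```

The same comparison applies to expected versus actual characteristics of the treatment device 70, such as a resistance setting.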
[0199] In some embodiments, while the patient uses the treatment device 70 to perform the treatment plan, the healthcare provider may compare the expected respective characteristics of the treatment device 70 with corresponding characteristics of the treatment device 70 indicated by the treatment information. For example, the healthcare provider may compare an expected resistance setting of the treatment device 70 with an actual resistance setting of the treatment device 70 indicated by the treatment information.
[0200] The healthcare provider may determine that the patient is performing the treatment plan properly if the actual characteristics of the treatment device 70 indicated by the treatment information are within a range of the expected characteristics of the treatment device 70. Conversely, the healthcare provider may determine that the patient is not performing the treatment plan properly if the actual characteristics of the treatment device 70 indicated by the treatment information are outside the range of the expected characteristics of the treatment device 70.
[0201] If the healthcare provider determines that the treatment information indicates that the patient is performing the treatment plan properly and/or that the treatment plan is having the desired effect, the healthcare provider may determine not to modify the treatment plan or the one or more characteristics of the treatment device 70. Conversely, if the healthcare provider determines that the treatment information indicates that the patient is not performing the treatment plan properly and/or that the treatment plan is not having the desired effect, the healthcare provider may determine to modify the treatment plan and/or the one or more characteristics of the treatment device 70 while the user uses the treatment device 70 to perform the treatment plan.
[0202] In some embodiments, while the patient uses the treatment device 70 to perform the modified treatment plan, the server 30 may receive subsequent treatment data pertaining to the patient. For example, after the healthcare provider provides input modifying the treatment plan and/or controlling the one or more characteristics of the treatment device 70, the patient may continue to perform the modified treatment plan using the treatment device 70. The
subsequent treatment data may correspond to treatment data generated while the patient uses the treatment device 70 to perform the modified treatment plan. In some embodiments, the subsequent treatment data may correspond to treatment data generated while the patient continues to perform the treatment plan using the treatment device 70, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or control the one or more characteristics of the treatment device 70.
[0203] The server 30 may further modify the treatment plan and/or control the one or more characteristics of the treatment device 70 based on subsequent treatment plan input received from overview display 120. The subsequent treatment plan input may correspond to input provided by the healthcare provider, at the overview display 120, in response to receiving and/or reviewing subsequent treatment information corresponding to the subsequent treatment data. It should be understood that the server 30 may continuously and/or periodically provide treatment information to the patient profile display 130 and/or other sections, portions, or components of the overview display 120 based on continuously and/or periodically received treatment data.
[0204] The healthcare provider may receive and/or review treatment information continuously or periodically while the user uses the treatment device to perform the treatment plan. The healthcare provider may determine whether to modify the treatment plan and/or control the one or more characteristics of the treatment device based on one or more trends indicated by the continuously and/or periodically received treatment information. For example, the one or more trends may indicate an increase in heart rate or changes in other applicable trends indicating that the user is not performing the treatment plan properly and/or performance of the treatment plan by the user is not having the desired effect.
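A trend such as an increase in heart rate over continuously received treatment information could, as a non-limiting sketch, be flagged as follows. The function name, window size, and threshold are hypothetical; any suitable trend analysis may be used.

```python
def rising_trend(samples, window=5, threshold=1.0):
    """Flag a sustained upward trend by comparing the mean of the newest
    window of samples against the mean of the window immediately before it."""
    if len(samples) < 2 * window:
        return False  # not enough periodically received data yet
    older = samples[-2 * window:-window]
    newer = samples[-window:]
    return (sum(newer) / window) - (sum(older) / window) > threshold

# Heart rate samples (bpm) received while the user performs the treatment plan.
heart_rate = [92, 93, 91, 92, 94, 101, 104, 108, 112, 118]
print(rising_trend(heart_rate))  # → True
```

A flagged trend would then inform the healthcare provider's decision whether to modify the treatment plan and/or control one or more characteristics of the treatment device.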
[0205] FIG. 9 is a flow diagram generally illustrating a method 900 for monitoring performance of a treatment plan by a user using a treatment device and for selectively modifying the treatment plan and one or more characteristics of the treatment device according to the present disclosure. The method 900 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both. The method 900 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIG. 1, such as
server 30 executing the artificial intelligence engine 11). In some embodiments, the method 900 may be performed by a single processing thread. Alternatively, the method 900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the methods.
[0206] For simplicity of explanation, the method 900 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 900 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 900 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 900 could alternatively be represented as a series of interrelated states via a state diagram or events.
[0207] At 902, the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 70, to perform a treatment plan. The treatment data may include characteristics of the user, measurement information pertaining to the user while the user uses the treatment device 70, characteristics of the treatment device 70, the treatment plan, other suitable data, or a combination thereof.
[0208] At 904, the processing device may generate treatment information using the treatment data. The treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device 70. The treatment information may be formatted, such that the treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
[0209] At 906, the processing device may be configured to provide (e.g., store for access, make available, make accessible, transmit, and the like), at the computing device of the healthcare provider, the treatment information. At 908, the processing device may be configured to provide the treatment information at an interface of the computing device of the healthcare provider. For example, the processing device may store the treatment information for access by the computing device of the healthcare provider and/or communicate (e.g., or transmit) the treatment information to the computing device of the healthcare provider for display at the patient profile display 130 of the overview display 120. As described, the overview display 120 may be configured to receive input, such as treatment plan input, indicating one or more modifications to the treatment plan and/or one or more characteristics of the treatment device 70. The healthcare provider may interact with the various controls, input fields, and other aspects of the overview display 120 to provide the treatment plan input.
[0210] At 910, the processing device may modify the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan. For example, the processing device may modify various features and characteristics of the treatment plan based on the at least one modification indicated by the treatment plan input.
[0211] At 912, the processing device may selectively control the treatment device 70 using the modified treatment plan. For example, the processing device may modify one or more characteristics of the treatment device 70 based on modifications to the treatment plan. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the treatment plan input. For example, the treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. The processing device may modify the one or more characteristics of the treatment device 70 based on the at least one modification indicated by the treatment plan input.
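One pass through operations 902-912 of the method 900 can be sketched, in a non-limiting way, as follows. Every name and data structure below (`monitor_once`, the plan and settings dictionaries) is a hypothetical stand-in; the disclosure does not prescribe these structures.

```python
def monitor_once(plan, treatment_data, provider_input=None):
    """A single pass through the operations of method 900."""
    info = {"summary": treatment_data}      # 904: generate treatment information
    # 906/908: provide `info` at the healthcare provider's interface (omitted here)
    if provider_input:                      # 910: apply at least one modification
        plan = {**plan, **provider_input}
    # 912: selectively control the treatment device using the (modified) plan
    device_settings = {"resistance": plan["resistance"]}
    return plan, device_settings

plan = {"minutes_per_day": 30, "resistance": 5}
# 902: treatment data is received; the provider responds with treatment plan input.
plan, settings = monitor_once(plan, {"heart_rate": 95}, {"resistance": 3})
print(settings)  # → {'resistance': 3}
```

In practice this pass would repeat continuously and/or periodically as new treatment data arrives.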
[0212] FIG. 10 is a flow diagram generally illustrating an alternative method 1000 for monitoring performance of a treatment plan by a user using a treatment device and for selectively modifying the treatment plan and one or more characteristics of the treatment device according to the present disclosure. Method 1000 includes operations performed by processors of a computing device (e.g., any component of FIG. 1, such as server 30 executing the artificial intelligence engine 11). In some embodiments, one or more operations of the method 1000 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 1000 may be performed in the same or a similar manner as described above in regard to method 900. The operations of the method 1000 may be performed in some combination with any of the operations of any of the methods described herein.
[0213] At 1002, the processing device may receive, during a telemedicine session, first treatment data pertaining to a user that uses a treatment device, such as the treatment device
70, to perform the treatment plan. The first treatment data includes, at least, measurement information pertaining to the user while the user uses the treatment device 70 to perform the treatment plan. The first treatment data may correspond to sensor data, such as sensor data 136, from one or more of the external sensors, such as external sensors 82, 84, 86, and/or from one or more internal sensors, such as internal sensors 76, of the treatment device 70.
[0214] In some embodiments, at least some of the first treatment data may include sensor data from one or more sensors associated with one or more corresponding wearable devices worn by the user while using the treatment device 70. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, a head sweatband, a wrist sweatband, any other suitable sweatband, any other suitable wearable device, or a combination thereof. The one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the user while the user is using the treatment device 70.
[0215] At 1004, the processing device may generate first treatment information using the first treatment data. The first treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device 70. The first treatment information may be formatted, such that the first treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
[0216] At 1006, the processing device may be configured to write to an associated memory, for access at the computing device of the healthcare provider, and/or provide, at the computing device of the healthcare provider, the first treatment information. At 1008, the processing device may be configured to provide the first treatment information at an interface of the computing device of the healthcare provider. For example, the processing device may be configured to provide the first treatment information at the patient profile display 130 of the overview display 120. As described, the overview display 120 may be configured to receive input, such as treatment plan input, indicating one or more modifications to the treatment plan and/or one or more characteristics of the treatment device 70. The healthcare provider may interact with the various controls, input fields, and other aspects of the overview display 120 to provide the treatment plan input.
[0217] At 1010, the processing device may receive first treatment plan input responsive to the first treatment information. The first treatment plan input may indicate at least one modification to the treatment plan. In some embodiments, the first treatment plan input may be provided by the healthcare provider, as described. In some embodiments, based on the first treatment information, the artificial intelligence engine 11 may generate the first treatment plan input.
[0218] At 1012, the processing device may modify the treatment plan in response to receiving the first treatment plan input including at least one modification to the treatment plan. For example, the processing device may modify various features and characteristics of the treatment plan based on the at least one modification indicated by the first treatment plan input.
[0219] At 1014, the processing device may selectively control the treatment device 70 using the modified treatment plan. For example, the processing device may modify one or more characteristics of the treatment device 70 based on modifications to the treatment plan. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the first treatment plan input. For example, the first treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. The processing device may modify the one or more characteristics of the treatment device 70 based on the at least one modification indicated by the first treatment plan input.
[0220] At 1016, the processing device may receive second treatment plan input responsive to second treatment information generated using second treatment data. For example, the processing device may receive second treatment data pertaining to the user while the user uses the treatment device 70. The second treatment data may include treatment data received by the processing device after the first treatment data. In some embodiments, the second treatment data may pertain to the user while the user uses the treatment device 70 to perform the modified treatment plan.
[0221] In some embodiments, the second treatment data may pertain to the user while the user uses the treatment device 70 to perform the treatment plan (e.g., in cases where the healthcare provider does not modify the treatment plan, as described). The processing device may generate the second treatment information based on the second treatment data. The processing device may receive the second treatment plan input indicating at least one modification to the treatment plan.
[0222] As described, the processing device may be configured to provide the second treatment information to the patient profile display 130 and/or any other suitable section, portion, or component of the overview display 120 or to any other suitable display or interface. The healthcare provider (e.g., and/or the artificial intelligence engine 11) may review the second treatment information and determine whether to modify and/or further modify the treatment plan based on the second treatment information.
[0223] At 1018, using the second treatment plan input, the processing device may modify the treatment plan. For example, the processing device may further modify (e.g., in cases where the processing device has already modified the treatment plan) and/or modify (e.g., in cases where the processing device has not previously modified the treatment plan) various features and characteristics of the treatment plan based on the at least one modification indicated by the second treatment plan input.
[0224] At 1020, using the modified treatment plan, the processing device may selectively control the treatment device 70. For example, based on modifications to the treatment plan, the processing device may modify one or more characteristics of the treatment device 70. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the second treatment plan input. For example, the second treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. The processing device may modify the one or more characteristics of the treatment device 70 based on the at least one modification indicated by the second treatment plan input.
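The successive application of first and second treatment plan inputs in operations 1010-1020 can be sketched, again in a non-limiting and purely illustrative way, as folding a sequence of modifications into the treatment plan. The names and dictionary representation are hypothetical.

```python
from functools import reduce

def apply_inputs(plan, inputs):
    """Apply a sequence of treatment plan inputs (first, second, ...) in order;
    each input is a dict containing at least one modification to the plan."""
    return reduce(lambda p, i: {**p, **i}, inputs, plan)

plan = {"minutes_per_day": 30, "resistance": 5}
first_input = {"resistance": 3}         # responsive to first treatment information
second_input = {"minutes_per_day": 20}  # responsive to second treatment information
print(apply_inputs(plan, [first_input, second_input]))
# → {'minutes_per_day': 20, 'resistance': 3}
```

After each applied input, the treatment device would be selectively controlled using the then-current modified plan.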
[0225] FIG. 11 is a flow diagram generally illustrating an alternative method 1100 for monitoring performance of a treatment plan by a user using a treatment device and for selectively modifying the treatment plan and one or more characteristics of the treatment device according to the present disclosure. Method 1100 includes operations performed by processors of a computing device (e.g., any component of FIG. 1, such as server 30 executing the artificial intelligence engine 11). In some embodiments, one or more operations of the method 1100 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 1100 may be performed in the same or a similar manner as described above in regard to method 900 and/or method 1000. The operations of the method 1100 may be performed in some combination with any of the operations of any of the methods described herein.
[0226] At 1102, the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 70, to perform the treatment plan. The treatment data may include any of the data described herein. The treatment data may correspond to sensor data, such as sensor data 136, from one or more of the external sensors, such as external sensors 82, 84, 86, and/or from one or more internal sensors, such as internal sensors 76, of the treatment device 70. In some embodiments, at least some of the treatment data may include sensor data from one or more sensors associated with one or more corresponding wearable devices worn by the user while using the treatment device 70. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, a head sweatband, a wrist sweatband, any other suitable sweatband, any other suitable wearable device, or a combination thereof. The one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the user while the user is using the treatment device 70.
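The treatment data described above — device sensor readings combined with vital signs from one or more wearable devices — can be pictured as a simple record. The following is a minimal sketch; all field names (e.g., `heart_rate_bpm`, `pedal_force_n`) are assumptions for illustration, since the disclosure does not prescribe a concrete schema.

```python
from dataclasses import dataclass, field

@dataclass
class WearableReading:
    """One vital-sign sample from a wearable (names are illustrative)."""
    device_type: str           # e.g., "watch", "chest strap"
    heart_rate_bpm: float
    temperature_c: float
    blood_pressure: tuple      # (systolic, diastolic) in mmHg

@dataclass
class TreatmentData:
    """Treatment data for one user: device sensor data plus wearable readings."""
    user_id: str
    device_sensor_data: dict                     # e.g., internal sensor values
    wearable_readings: list = field(default_factory=list)

    def add_wearable(self, reading: WearableReading) -> None:
        self.wearable_readings.append(reading)

# Toy usage: one internal-sensor value and one wearable sample.
data = TreatmentData(user_id="patient-1",
                     device_sensor_data={"pedal_force_n": 115.0})
data.add_wearable(WearableReading("watch", 92.0, 36.9, (122, 78)))
```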
[0227] At 1104, the processing device may generate treatment information using the treatment data. The treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device 70. The treatment information may be formatted, such that the treatment data is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
[0228] At 1106, the processing device may be configured to provide, to at least one of the computing device of the healthcare provider and a machine learning model executed by the artificial intelligence engine 11, the treatment information.
[0229] At 1108, the processing device may receive treatment plan input responsive to the treatment information. The treatment plan input may indicate at least one modification to the treatment plan. In some embodiments, the treatment plan input may be provided by the healthcare provider, as described. In some embodiments, based on the treatment information, the artificial intelligence engine 11 executing the machine learning model may generate the treatment plan input.
[0230] At 1110, the processing device determines whether the treatment plan input indicates at least one modification to the treatment plan. If the processing device determines that the treatment plan input does not indicate at least one modification to the treatment plan, the processing device returns to 1102 and continues receiving treatment data pertaining to the user while the user uses the treatment device 70 to perform the treatment plan. If the processing device determines that the treatment plan input indicates at least one modification to the treatment plan, the processing device continues at 1112.
[0231] At 1112, using the treatment plan input, the processing device may modify the treatment plan. For example, using the at least one modification to the treatment plan indicated by the treatment plan input, the processing device may modify the treatment plan. Based on the at least one modification indicated by the treatment plan input, the processing device may modify various features and characteristics of the treatment plan.
[0232] At 1114, using the modified treatment plan, the processing device may selectively control the treatment device 70. For example, based on the at least one modification to the treatment plan, the processing device may modify one or more characteristics of the treatment device 70. Additionally, or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more characteristics based on the treatment plan input. For example, the treatment plan input may indicate at least one modification to one or more characteristics of the treatment device 70. Based on the at least one modification indicated by the treatment plan input, the processing device may modify the one or more characteristics of the treatment device 70. The processing device may return to 1102 and continue receiving treatment data pertaining to the user while the user uses the treatment device 70 to perform the treatment plan.
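The loop of steps 1102 through 1114 can be sketched as a small control routine. This is a hedged illustration only: the helper callables (`receive_data`, `summarize`, `get_input`, and so on) are hypothetical stand-ins for the networked components the disclosure describes, and the real system would drive an actual treatment device rather than append to a list.

```python
def run_monitoring_loop(receive_data, summarize, get_input, apply_to_plan,
                        apply_to_device, plan, max_iterations=10):
    """Receive treatment data, report it, and modify the plan/device on input."""
    for _ in range(max_iterations):
        data = receive_data()                   # step 1102: receive treatment data
        info = summarize(data)                  # step 1104 (and 1106: provide info)
        plan_input = get_input(info)            # step 1108: treatment plan input
        if plan_input is None:                  # step 1110: no modification indicated
            continue                            # return to 1102, keep receiving data
        plan = apply_to_plan(plan, plan_input)  # step 1112: modify the plan
        apply_to_device(plan_input)             # step 1114: control the device
    return plan

# Toy usage: the provider asks to lower pedal resistance on iteration 3.
events = iter([None, None, {"resistance": -5}] + [None] * 7)
changes = []
final = run_monitoring_loop(
    receive_data=lambda: {"heart_rate": 95},
    summarize=lambda d: d,
    get_input=lambda info: next(events),
    apply_to_plan=lambda p, i: {**p, "resistance": p["resistance"] + i["resistance"]},
    apply_to_device=changes.append,
    plan={"resistance": 20},
)
```

The `continue` branch mirrors the flow diagram's return to step 1102 when no modification is indicated.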
[0233] FIG. 12 generally illustrates an example computer system 1200 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure. In one example, computer system 1200 may include a computing device and correspond to the assistance interface 94, reporting interface 92, supervisory interface 90, clinician interface 20, server 30 (including the AI engine 11), patient interface, ambulatory sensor 82, goniometer 84, treatment device 70, pressure sensor 86, or any suitable component of FIG. 1. The computer system 1200 may be capable of executing instructions implementing the one or more machine learning models 13 of the artificial intelligence engine 11 of FIG. 1. The computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network.
[0234] The computer system may operate in the capacity of a server in a client-server network environment. The computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a Personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Further, while only a single computer system is illustrated, the term "computer" shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
[0235] The computer system 1200 includes a processing device 1202, a main memory 1204 (e.g., read-only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 1206 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 1208, which communicate with each other via a bus 1210.
[0236] Processing device 1202 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 1202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 1202 is configured to execute instructions for performing any of the operations and steps discussed herein.
[0237] The computer system 1200 may further include a network interface device 1212. The computer system 1200 also may include a video display 1214 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 1216 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 1218 (e.g., a speaker). In one illustrative example, the video display 1214 and the input device(s) 1216 may be combined into a single component or device (e.g., an LCD touch screen).
[0238] The data storage device 1208 may include a computer-readable medium 1220 on which the instructions 1222 embodying any one or more of the methods, operations, or functions described herein are stored. The instructions 1222 may also reside, completely or at least partially, within the main memory 1204 and/or within the processing device 1202 during execution thereof by the computer system 1200. As such, the main memory 1204 and the processing device 1202 also constitute computer-readable media. The instructions 1222 may further be transmitted or received over a network via the network interface device 1212.
[0239] While the computer-readable storage medium 1220 is generally illustrated in the illustrative examples to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0240] Determining an optimal treatment plan for a patient having certain characteristics (e.g., demographic; geographic; diagnostic; measurement- or test-based; medically historic; etiologic; cohort-associative; differentially diagnostic; surgical, physically therapeutic, pharmacologic and other treatment(s) recommended; etc.) may be a technically challenging problem. For example, a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process. In a rehabilitative setting, some of the multitude of information considered may include a type of injury of the patient, types of available medical procedures to perform, treatment regimens, medication regimens, and the characteristics of the patient. The characteristics of the patient may be vast, and may include medications of the patient, previous injuries of the patient, previous medical procedures performed on the patient, measurements (e.g., body fat, weight, etc.) of the patient, allergies of the patient, medical conditions of the patient, historical information of the patient, vital signs (e.g., temperature, blood pressure, heart rate) of the patient, symptoms of the patient, familial medical information of the patient, and the like.
[0241] Further, in addition to the information described above, it may be desirable to consider additional historical information, such as clinical information pertaining to results of treatment plans performed using a treatment apparatus on other people. The clinical information may include clinical studies, clinical trials, evidence-based guidelines, journal articles, meta-analyses, and the like. The clinical information may be written by people having certain professional degrees (e.g., medical doctor, osteopathic doctor, physical therapist, etc.), certifications, etc. The clinical information may be retrieved from any suitable data source.
[0242] In some embodiments, the clinical information may describe people seeking treatment for a particular ailment (e.g., injury, disease, any applicable medical condition, etc.). The clinical information may describe that certain results are obtained when the people perform or have performed on them particular treatment plans (e.g., medical procedures, treatment protocols using treatment apparatuses, medication regimens, diet regimens, etc.). The clinical information may also include the particular characteristics of the people described. Direct or indirect reference may be made to values of the characteristics therein. It may be desirable to compare the characteristics of the patient with the characteristics of the people in the clinical information to determine what an optimal treatment plan for the patient may be such that the patient can obtain a desired result. Processing this historical information may be computationally taxing, inefficient, and/or infeasible using conventional techniques.
[0243] Accordingly, embodiments of the present disclosure pertain to recommending optimal treatment plans using real-time and historical data correlations involving patient cohort-equivalent databases. In some embodiments, an artificial intelligence engine may be trained to recommend the optimal treatment plan based on characteristics of the patient and the clinical information. For example, the artificial intelligence engine may be trained to match a pattern between the characteristics of the patient and the people in various clinical information. Based on the pattern, the artificial intelligence engine may generate a treatment plan for the patient, where such treatment plan produced a desired result in the clinical information for a similarly matched person or similarly matched people. In that sense, the treatment plan generated may be "optimal" based on the desired result (e.g., speed, efficacy, both speed and efficacy, life expectancy, etc.). In other words, based on the characteristics of the patient, in order to obtain the desired result, there may be certain medical procedures, certain medications, certain rehabilitative exercises, and so forth that should be included in an optimal treatment plan to obtain the desired result.
[0244] Depending on what result is desired, the artificial intelligence engine may be trained to output several optimal or optimized treatment plans. For example, one result may include recovering to a threshold level (e.g., 75% range of motion) in a fastest amount of time, while another result may include fully recovering (e.g., 100% range of motion) regardless of the amount of time. The clinical information may indicate a first treatment plan provides the first result for people with characteristics similar to the patient's, and a second treatment plan provides the second result for people with characteristics similar to the patient.
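The pattern-matching idea described above — match the patient's characteristics against people in the clinical information and, for each desired result, surface the plan that produced that result for the best-matched people — can be illustrated with a toy scorer. The similarity metric and record fields below are assumptions for the example, not the trained model of the disclosure.

```python
def similarity(patient, person):
    """Fraction of shared characteristic values (a stand-in for a learned match)."""
    keys = set(patient) & set(person)
    return sum(patient[k] == person[k] for k in keys) / max(len(keys), 1)

def recommend(patient, records, desired_result):
    """Return the plan that achieved desired_result for the closest-matched people."""
    candidates = [r for r in records if r["result"] == desired_result]
    if not candidates:
        return None
    best = max(candidates, key=lambda r: similarity(patient, r["characteristics"]))
    return best["plan"]

# Toy clinical records: different plans achieved different results for
# people with different characteristics (all values illustrative).
records = [
    {"characteristics": {"injury": "knee", "age_band": "60s"},
     "plan": "protocol-A", "result": "75% ROM fastest"},
    {"characteristics": {"injury": "knee", "age_band": "40s"},
     "plan": "protocol-B", "result": "100% ROM"},
]
patient = {"injury": "knee", "age_band": "40s"}
```

Depending on which result is desired (fastest partial recovery versus full recovery), a different plan is returned for the same patient, mirroring the "several optimal or optimized treatment plans" described above.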
[0245] Further, the artificial intelligence engine may also be trained to output treatment plans that are not optimal (referred to as "ruled-out treatment plans") for the patient. For example, if a patient has diabetes, a particular medication may not be approved or suitable for the patient and that medication may be flagged in the ruled-out treatment plan for the patient.
[0246] As discussed above, processing patient and clinical information in real-time may, due to the sheer amount of data to process, be infeasible using conventional techniques. Accordingly, in some embodiments, the received clinical information and/or patient information may be translated into a medical description language. The medical description language may refer to an encoding configured to be efficiently processed by the artificial intelligence engine. For example, a clinical trial may be received and parsed, optionally with the addition of an attribute grammar; and then keywords pertaining to target information may be searched for. The values of the target information may be identified. A canonical format defined by the medical description language may be defined and/or generated, where the canonical format includes tags identifying the values of the target information and, optionally, tags implementing an attribute grammar for the medical description language.
[0247] The medical description language may be extensible and include any property of an object-oriented or artificial intelligence programming language. The medical description language may define other methods or procedures. The medical description language may implement the concept of "objects", which can contain data, in the form of fields (often known as attributes or properties), and code, in the form of procedures (often known as methods).
The medical description language may encapsulate data and functions that manipulate the data to protect them from interference and misuse. The medical description language may also implement data hiding or obscuring, which prevents certain aspects of the data or functions from being accessible to another component. The medical description language may implement inheritance, which arranges components as "is a type of" relationships, where a first component may be a type of a second component and the first component inherits the functions and data of the second component. The medical description language may also implement polymorphism, which is the provision of a single interface to components of different types.
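The parsing step described in paragraph [0246] — search a received clinical document for keywords, identify the values of the target information, and emit a canonical tagged format — can be sketched as follows. The keyword list and tag names are assumptions for the example; the disclosure leaves the concrete encoding of the medical description language open.

```python
import re

# Illustrative keyword-to-tag mapping (an assumption, not the disclosed encoding).
KEYWORDS = {
    "range of motion": "rom",
    "heart rate": "heart_rate",
    "blood pressure": "blood_pressure",
}

def to_canonical(text):
    """Find keyword/value pairs in free text and emit them as canonical tags."""
    tags = []
    for phrase, tag in KEYWORDS.items():
        # Match the keyword phrase, skip intervening non-digits, capture the value.
        match = re.search(rf"{phrase}\D*?(\d+(?:\.\d+)?)", text, re.IGNORECASE)
        if match:
            tags.append(f"<{tag}>{match.group(1)}</{tag}>")
    return "".join(tags)

# Toy usage on a fragment of hypothetical clinical-trial text.
encoded = to_canonical(
    "Patients recovered a range of motion of 95 degrees; "
    "mean heart rate was 88 bpm during sessions."
)
```

The resulting tagged string is a compact, uniformly structured representation that an engine can process without re-parsing the original free text each time.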
[0248] The clinical information may be translated to the medical description language prior to the artificial intelligence engine determining the optimal treatment plans and/or ruled-out treatment plans. The artificial intelligence engine may be trained by using the medical description language representing the clinical information, such that the artificial intelligence engine is able to more efficiently determine the optimal treatment plans instead of using initial data formats in which the clinical information is received. Further, the artificial intelligence engine may continuously or continually receive the clinical information and include the clinical information in training data to update the artificial intelligence engine.
[0249] In some embodiments, the optimal treatment plans and/or ruled-out treatment plans may be presented to a medical professional. The medical professional may select a particular optimal treatment plan for the patient to cause that treatment plan to be transmitted to the patient. In some embodiments, to facilitate telehealth or telemedicine applications, including remote diagnoses, determination of treatment plans and rehabilitative and/or pharmacologic prescriptions, the artificial intelligence engine may receive and/or operate distally from the source of the clinical information and/or distally from the patient. In such cases, the recommended treatment plans and/or ruled-out treatment plans may be presented during a telemedicine or telehealth session on a user interface of a computing device of a medical professional simultaneously with a video of the patient in real-time. The video may also be accompanied by audio, text and other multimedia information. Real-time may refer to less than 2 seconds.
[0250] Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the medical professional may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface. The enhanced user interface may improve the medical professional's experience using the computing device and may encourage the medical professional to reuse the user interface. Such a technique may also reduce computing resources (e.g., processing, memory, network) because the medical professional does not have to switch to another user interface screen and enter a query for a treatment plan to recommend based on the characteristics of the patient. The artificial intelligence engine provides, dynamically on the fly, the optimal treatment plans and ruled-out treatment plans.
[0251] In some embodiments, the treatment apparatus may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient. For example, the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user. Such adaptive nature may improve the results of recovery for a patient.
[0252] Clause 1. A method comprising: receiving treatment data pertaining to a user that uses a treatment device to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and at least one aspect of the treatment plan; generating treatment information using the treatment data; writing to an associated memory, for access by a computing device of a healthcare provider, the treatment information; communicating with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input; and modifying the at least one aspect of the treatment plan in response to receiving treatment plan input including at least one modification to the at least one aspect of the treatment plan.
[0253] Clause 2. The method of any clause herein, further comprising controlling, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device.
[0254] Clause 3. The method of any clause herein, further comprising controlling, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device during a telemedicine session.
[0255] Clause 4. The method of any clause herein, wherein the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heart rate of the user, a temperature of the user, and a blood pressure of the user.
[0256] Clause 5. The method of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with the treatment device.
[0257] Clause 6. The method of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with a wearable device worn by the user while using the treatment device.
[0258] Clause 7. The method of any clause herein, further comprising receiving subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan.
[0259] Clause 8. The method of any clause herein, further comprising modifying the modified treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the modified at least one aspect of the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
[0260] Clause 9. A tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to: receive treatment data pertaining to a user that uses a treatment device to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and at least one aspect of the treatment plan; generate treatment information using the treatment data; write to an associated memory, for access at a computing device of a healthcare provider, the treatment information; communicate with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input; and modify the at least one aspect of the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan.
[0261] Clause 10. The computer-readable medium of any clause herein, wherein the processing device is further configured to control, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device.
[0262] Clause 11. The computer-readable medium of any clause herein, wherein the processing device is further configured to control, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device during a telemedicine session.
[0263] Clause 12. The computer-readable medium of any clause herein, wherein the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heart rate of the user, a temperature of the user, and a blood pressure of the user.
[0264] Clause 13. The computer-readable medium of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with the treatment device.
[0265] Clause 14. The computer-readable medium of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with a wearable device worn by the user while using the treatment device.
[0266] Clause 15. The computer-readable medium of any clause herein, wherein the processing device is further configured to receive subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan.
[0267] Clause 16. The computer-readable medium of any clause herein, wherein the processing device is further configured to modify the modified at least one aspect of the treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
[0268] Clause 17. A system comprising: a memory device storing instructions; a processing device communicatively coupled to the memory device, the processing device executes the instructions to: receive treatment data pertaining to a user that uses a treatment device to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the treatment device, characteristics of the treatment device, and at least one aspect of the treatment plan; generate treatment information using the treatment data; write to an associated memory, for access at a computing device of a healthcare provider, the treatment information; communicate with an interface, at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input; and modify the at least one aspect of the treatment plan in response to receiving treatment plan input including at least one modification to the treatment plan.
[0269] Clause 18. The system of any clause herein, wherein the processing device is further configured to control, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device.
[0270] Clause 19. The system of any clause herein, wherein the processing device is further configured to control, based on the modified at least one aspect of the treatment plan, the treatment device while the user uses the treatment device during a telemedicine session.
[0271] Clause 20. The system of any clause herein, wherein the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heart rate of the user, a temperature of the user, and a blood pressure of the user.
[0272] Clause 21. The system of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with the treatment device.
[0273] Clause 22. The system of any clause herein, wherein at least some of the treatment data corresponds to sensor data from a sensor associated with a wearable device worn by the user while using the treatment device.
[0274] Clause 23. The system of any clause herein, wherein the processing device is further configured to receive subsequent treatment data pertaining to the user while the user uses the treatment device to perform the treatment plan.
[0275] Clause 24. The system of any clause herein, wherein the processing device is further configured to modify at least one of the modified at least one aspect and any other aspect of the treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
[0276] FIG. 13 shows a block diagram of a computer-implemented system 2010, hereinafter called "the system" for managing a treatment plan. Managing the treatment plan may include using an artificial intelligence engine to recommend optimal treatment plans and/or provide ruled-out treatment plans that should not be recommended to a patient. A treatment plan may include one or more treatment protocols, and each treatment protocol includes one or more treatment sessions. Each treatment session comprises several session periods, with each session period including a particular activity for treating the body part of the patient. For example, a treatment plan for post-operative rehabilitation after a knee surgery may include an initial treatment protocol with twice daily stretching sessions for the first 3 days after surgery and a more intensive treatment protocol with active exercise sessions performed 4 times per day starting 4 days after surgery. A treatment plan may also include information pertaining to a medical procedure to perform on the patient, a treatment protocol for the patient using a treatment apparatus, a diet regimen for the patient, a medication regimen for the patient, a sleep regimen for the patient, additional regimens, or some combination thereof.
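The hierarchy just described — a treatment plan holds one or more treatment protocols, each protocol holds one or more treatment sessions, and each session comprises session periods that each carry a particular activity — can be sketched as nested records. The class and field names are illustrative assumptions, not structures named in the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class SessionPeriod:
    activity: str               # e.g., "passive stretch", "active pedaling"
    duration_min: int

@dataclass
class TreatmentSession:
    periods: list = field(default_factory=list)   # SessionPeriod entries

@dataclass
class TreatmentProtocol:
    sessions_per_day: int
    sessions: list = field(default_factory=list)  # TreatmentSession entries

@dataclass
class TreatmentPlan:
    protocols: list = field(default_factory=list)

# Toy usage mirroring the knee-surgery example above: twice-daily stretching
# for the first days, then a more intensive four-times-daily protocol.
stretch = TreatmentSession(periods=[SessionPeriod("passive stretch", 10)])
plan = TreatmentPlan(protocols=[
    TreatmentProtocol(sessions_per_day=2, sessions=[stretch]),  # days 1-3
    TreatmentProtocol(sessions_per_day=4, sessions=[stretch]),  # day 4 onward
])
```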
[0277] The system 2010 also includes a server 2030 configured to store and to provide data related to managing the treatment plan. The server 2030 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers. The server 2030 also includes a first communication interface 2032 configured to communicate with the clinician interface 2020 via a first network 2034. In some embodiments, the first network 2034 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. The server 2030 includes a first processor 2036 and a first machine-readable storage memory 2038, which may be called a "memory" for short, holding first instructions 2040 for performing the various actions of the server 2030 for execution by the first processor 2036. The server 2030 is configured to store data regarding the treatment plan. For example, the memory 2038 includes a system data store 2042 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients. The server 2030 is also configured to store data regarding performance by a patient in following a treatment plan. For example, the memory 2038 includes a patient data store 2044 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient's performance within the treatment plan.
[0278] In addition, the characteristics of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may use correlations and other statistical or probabilistic measures to partition the treatment plans into different patient cohort-equivalent databases in the patient data store 2044. For example, the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patient, and a first result of the treatment plan may be stored in a first patient database. The data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patients, and a second result of the treatment plan may be stored in a second patient database. Any combination of characteristics may be used to separate the cohorts of patients. In some embodiments, the different cohorts of patients may be stored in different partitions or volumes of the same database.
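The partitioning into cohort-equivalent databases described above can be illustrated with a toy grouping routine: records that share the same values for a chosen set of characteristics land in the same cohort. The grouping keys and record fields are assumptions for the example; the disclosure contemplates correlations and other statistical or probabilistic measures rather than simple exact-match keys.

```python
from collections import defaultdict

def partition_into_cohorts(records, keys=("injury", "procedure")):
    """Group patient records by shared characteristic values (exact match)."""
    cohorts = defaultdict(list)
    for record in records:
        cohort_key = tuple(record.get(k) for k in keys)
        cohorts[cohort_key].append(record)
    return dict(cohorts)

# Toy records: two knee/ACL-repair patients form one cohort, the hip
# patient forms another (all values illustrative).
records = [
    {"injury": "knee", "procedure": "ACL repair", "result": "good"},
    {"injury": "knee", "procedure": "ACL repair", "result": "fair"},
    {"injury": "hip", "procedure": "replacement", "result": "good"},
]
cohorts = partition_into_cohorts(records)
```

In practice each group could back a separate database, or a separate partition or volume of one database, as the paragraph notes.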
[0279] This characteristic data, treatment plan data, and results data may be obtained from clinical information that describes the characteristics of people who performed certain treatment plans and the results of those treatment plans. The characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 2044. The characteristics of the people may include medications prescribed to the people, injuries of the people, medical procedures performed on the people, measurements of the people, allergies of the people, medical conditions of the people, historical information of the people, vital signs of the people, symptoms of the people, familial medical information of the people, other information of the people, or some combination thereof.
[0280] In addition to the historical information about other people stored in the patient cohort-equivalent databases, real-time information about a current patient being treated may be stored, based on the current patient's characteristics, in an appropriate patient cohort-equivalent database. The characteristics of the patient may include medications of the patient, injuries of the patient, medical procedures performed on the patient, measurements of the patient, allergies of the patient, medical conditions of the patient, historical information of the patient, vital signs of the patient, symptoms of the patient, familial medical information of the patient, other information of the patient, or some combination thereof.
[0281] In some embodiments, the server 2030 may execute an artificial intelligence (AI) engine 2011 that uses one or more machine learning models 2013 to perform at least one of the embodiments disclosed herein. The server 2030 may include a training engine 209 capable of generating the one or more machine learning models 2013. The machine learning models 2013 may be trained to generate and recommend optimal treatment plans using real-time and historical data correlations involving patient cohort-equivalents, among other things. The one or more machine learning models 2013 may be generated by the training engine 209 and may be implemented in computer instructions executable by one or more processing devices of the training engine 209 and/or the server 2030. To generate the one or more machine learning models 2013, the training engine 209 may train the one or more machine learning models 2013. The one or more machine learning models 2013 may be used by the artificial intelligence engine 2011.
[0282] The training engine 209 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other desired computing device, or any combination of the above. The training engine 209 may be cloud-based or a real-time software platform, and it may include privacy software or protocols and/or security software or protocols.
[0283] To train the one or more machine learning models 2013, the training engine 209 may use a training data set of a corpus of keywords representing target information to identify clinical information. The training data set may also include a corpus of clinical information (e.g., clinical trials, meta-analyses, evidence-based guidelines, journal articles, etc.) having a first data format. The clinical information may include characteristics of people, treatment plans followed by the people, and results of the treatment plans, among other things. The training data set may also include medical description language examples that include tags for target information, telemedical information, and values embedded with the tags. The one or more machine learning models may be trained to translate the clinical information from the first data format to the medical description language having a canonical (e.g., tag-value pair and/or attribute grammar) format. The training may be performed by identifying the keywords of the target information, identifying values for the keywords, and generating the canonical value including tags for the target information and the values for the target information.
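A minimal sketch of the keyword-to-canonical-format translation described above, assuming hypothetical keyword patterns and tag names; the patent's actual medical description language is not specified here, so this is only an illustration of producing tag-value pairs from free-text clinical information:

```python
import re

# Hypothetical keyword patterns representing target information.
# Tag names ("age", "procedure", "pain_level") are assumptions.
KEYWORD_PATTERNS = {
    "age": re.compile(r"(\d+)[- ]year[- ]old"),
    "procedure": re.compile(r"underwent (?:a |an )?([a-z ]+?)(?:\.|,| and)"),
    "pain_level": re.compile(r"pain level (?:of )?(\d+)"),
}

def to_canonical(free_text):
    """Translate free-text clinical information into canonical tag-value
    pairs by identifying keywords and the values associated with them."""
    pairs = {}
    for tag, pattern in KEYWORD_PATTERNS.items():
        match = pattern.search(free_text.lower())
        if match:
            pairs[tag] = match.group(1).strip()
    return pairs

note = ("The 57-year-old patient underwent total knee arthroplasty, "
        "reporting a pain level of 4.")
pairs = to_canonical(note)
# pairs holds tag-value entries for "age", "procedure", and "pain_level".
```

Because every extracted item is a tag paired with a single value, a downstream parser can resolve each entry to exactly one meaning, which is the motivation the text gives for the canonical format.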
[0284] The one or more machine learning models 2013 may also be trained to translate characteristics of patients received in real-time (e.g., from an electronic medical records (EMR) system) to the medical description language to store in appropriate patient cohort-equivalent databases. The one or more machine learning models 2013 may be trained to match patterns of characteristics of a patient described by the medical description language with characteristics of other people described by the medical description language that represents the clinical information. In some embodiments, the medical description language representing the clinical information may be stored in the various patient cohort-equivalent databases of the patient data store 2044. Accordingly, in some embodiments, the one or more machine learning models 2013 may access the patient cohort-equivalent databases when being trained or when recommending optimal treatment plans for a patient. Use of computing resources, efficiency of processing, accuracy, and error minimization may be enhanced by using the medical description language in the canonical format, as opposed to full bodies of text and/or EMR records. In particular, accuracy may be improved and errors may be minimized through the use of a formal medical description language that may be parsed to have exactly one meaning, while informal descriptions may result in more than one meaning, potentially semantically overloaded and unresolvable.
[0285] Different machine learning models 2013 may be trained to recommend different optimal treatment plans for different desired results. For example, one machine learning model may be trained to recommend optimal treatment plans for most effective recovery, while another machine learning model may be trained to recommend optimal treatment plans based on speed of recovery.
[0286] Using training data that includes training inputs and corresponding target outputs, the one or more machine learning models 2013 may refer to model artifacts created by the training engine 209. The training engine 209 may find patterns in the training data wherein such patterns map the training input to the target output, and generate the machine learning models 2013 that capture these patterns. In some embodiments, the artificial intelligence engine 2011, the database 2033, and/or the training engine 209 may reside on another component (e.g., assistant interface 2094, clinician interface 2020, etc.) depicted in FIG. 13.
[0287] As described in more detail below, the one or more machine learning models 2013 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 2013 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations. Examples of deep networks are neural networks, including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the inputs of the remaining neurons, as well as to itself). For example, the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
[0288] The system 2010 also includes a patient interface 2050 configured to communicate information to a patient and to receive feedback from the patient. Specifically, the patient interface includes an input device 2052 and an output device 2054, which may be collectively called a patient user interface 2052, 2054. The input device 2052 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The output device 2054 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch. The output device 2054 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc. The output device 2054 may incorporate various different visual, audio, or other presentation technologies. For example, the output device 2054 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions. The output device 2054 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient. The output device 2054 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0289] As shown in FIG. 13, the patient interface 2050 includes a second communication interface 2056, which may also be called a remote communication interface configured to communicate with the server 2030 and/or the clinician interface 2020 via a second network 2058. In some embodiments, the second network 2058 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the second network 2058 may include the Internet, and communications between the patient interface 2050 and the server 2030 and/or the clinician interface 2020 may be secured via encryption, such as, for example,
by using a virtual private network (VPN). In some embodiments, the second network 2058 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. In some embodiments, the second network 2058 may be the same as and/or operationally coupled to the first network 2034.
[0290] The patient interface 2050 includes a second processor 2060 and a second machine-readable storage memory 2062 holding second instructions 2064 for execution by the second processor 2060 for performing various actions of the patient interface 2050. The second machine-readable storage memory 2062 also includes a local data store 2066 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient's performance within a treatment plan. The patient interface 2050 also includes a local communication interface 2068 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 2050. The local communication interface 2068 may include wired and/or wireless communications. In some embodiments, the local communication interface 2068 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
[0291] The system 2010 also includes a treatment apparatus 2070 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan. In some embodiments, the treatment apparatus 2070 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group. The body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder. The body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament. As shown in FIG. 13, the treatment apparatus 2070 includes a controller 2072, which may include one or more processors, computer memory, and/or other components. The treatment apparatus 2070 also includes a fourth communication interface 2074 configured to communicate with the patient interface 2050 via the local communication interface 2068. The treatment apparatus 2070 also includes one or more internal sensors 2076 and an actuator 2078, such as a motor. The actuator 2078 may be used, for example, for moving the patient's body part and/or for resisting forces by the patient.
[0292] The internal sensors 2076 may measure one or more operating characteristics of the treatment apparatus 2070 such as, for example, a force, a position, a speed, and/or a velocity. In some embodiments, the internal sensors 2076 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient. For example, an internal sensor 2076 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment apparatus 2070, where such distance may correspond to a range of motion that the patient's body part is able to achieve. In some embodiments, the internal sensors 2076 may include a force sensor configured to measure a force applied by the patient. For example, an internal sensor 2076 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment apparatus 2070.
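As an illustrative sketch, the range-of-motion and force measurements described above might be derived from raw sensor samples as follows; the function names, sample representation, and units are assumptions, not the patent's:

```python
def range_of_motion(angle_samples_deg):
    """Estimate the achieved range of motion from a position sensor's
    angular samples (in degrees): the span between the extremes reached."""
    if not angle_samples_deg:
        return 0.0
    return max(angle_samples_deg) - min(angle_samples_deg)

def peak_force(force_samples_lbs):
    """Peak force the patient applied, from a force sensor's samples."""
    return max(force_samples_lbs, default=0.0)

# One simulated pedaling session: a knee angle sweeping between
# 35 and 110 degrees as reported by a goniometer-like position sensor.
angles = [35.0, 52.5, 78.0, 101.2, 110.0, 96.4, 60.1, 35.0]
rom = range_of_motion(angles)   # 110.0 - 35.0 = 75.0 degrees
forces = [10.2, 12.5, 11.8]
peak = peak_force(forces)       # 12.5 lbs
```

The same span calculation applies whether the position sensor reports linear distance or angular motion; only the units change.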
[0293] The system 2010 shown in FIG. 13 also includes an ambulation sensor 2082, which communicates with the server 2030 via the local communication interface 2068 of the patient interface 2050. The ambulation sensor 2082 may track and store a number of steps taken by the patient. In some embodiments, the ambulation sensor 2082 may take the form of a wristband, wristwatch, or smart watch. In some embodiments, the ambulation sensor 2082 may be integrated within a phone, such as a smartphone.
[0294] The system 2010 shown in FIG. 13 also includes a goniometer 2084, which communicates with the server 2030 via the local communication interface 2068 of the patient interface 2050. The goniometer 2084 measures an angle of the patient's body part. For example, the goniometer 2084 may measure the angle of flex of a patient's knee or elbow or shoulder.
[0295] The system 2010 shown in FIG. 13 also includes a pressure sensor 2086, which communicates with the server 2030 via the local communication interface 2068 of the patient interface 2050. The pressure sensor 2086 measures an amount of pressure or weight applied by a body part of the patient. For example, the pressure sensor 2086 may measure an amount of force applied by a patient's foot when pedaling a stationary bike.
[0296] The system 2010 shown in FIG. 13 also includes a supervisory interface 2090 which may be similar or identical to the clinician interface 2020. In some embodiments, the supervisory interface 2090 may have enhanced functionality beyond what is provided on the clinician interface 2020. The supervisory interface 2090 may be configured for use by a
person having responsibility for the treatment plan, such as an orthopedic surgeon.
[0297] The system 2010 shown in FIG. 13 also includes a reporting interface 2092 which may be similar or identical to the clinician interface 2020. In some embodiments, the reporting interface 2092 may have less functionality than what is provided on the clinician interface 2020. For example, the reporting interface 2092 may not have the ability to modify a treatment plan. Such a reporting interface 2092 may be used, for example, by a biller to determine the use of the system 2010 for billing purposes. In another example, the reporting interface 2092 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject. Such a reporting interface 2092 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
[0298] The system 2010 includes an assistant interface 2094 for an assistant, such as a doctor, a nurse, a physical therapist, or a technician, to remotely communicate with the patient interface 2050 and/or the treatment apparatus 2070. Such remote communications may enable the assistant to provide assistance or guidance to a patient using the system 2010. More specifically, the assistant interface 2094 is configured to communicate a telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b with the patient interface 2050 via a network connection such as, for example, via the first network 2034 and/or the second network 2058. The telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b comprises one of an audio signal 2096, an audiovisual signal 2097, an interface control signal 2098a for controlling a function of the patient interface 2050, an interface monitor signal 2098b for monitoring a status of the patient interface 2050, an apparatus control signal 2099a for changing an operating parameter of the treatment apparatus 2070, and/or an apparatus monitor signal 2099b for monitoring a status of the treatment apparatus 2070. In some embodiments, each of the control signals 2098a, 2099a may be unidirectional, conveying commands from the assistant interface 2094 to the patient interface 2050. In some embodiments, in response to successfully receiving a control signal 2098a, 2099a and/or to communicate successful and/or unsuccessful implementation of the requested control action, an acknowledgement message may be sent from the patient interface 2050 to the assistant interface 2094. In some embodiments, each of the monitor signals 2098b, 2099b may be unidirectional, conveying status information from the patient interface 2050 to the assistant interface 2094. In some
embodiments, an acknowledgement message may be sent from the assistant interface 2094 to the patient interface 2050 in response to successfully receiving one of the monitor signals 2098b,2099b.
[0299] In some embodiments, the patient interface 2050 may be configured as a pass-through for the apparatus control signals 2099a and the apparatus monitor signals 2099b between the treatment apparatus 2070 and one or more other devices, such as the assistant interface 2094 and/or the server 2030. For example, the patient interface 2050 may be configured to transmit an apparatus control signal 2099a to the treatment apparatus 2070 in response to an apparatus control signal 2099a within the telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b from the assistant interface 2094.
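The signal taxonomy and pass-through behavior described above can be sketched as follows; the class and enum names are hypothetical illustrations of routing apparatus signals through the patient interface while handling interface signals locally:

```python
from enum import Enum, auto

class SignalType(Enum):
    """Illustrative taxonomy mirroring the telemedicine signal kinds
    described in the text; the member names are assumptions."""
    AUDIO = auto()
    AUDIOVISUAL = auto()
    INTERFACE_CONTROL = auto()
    INTERFACE_MONITOR = auto()
    APPARATUS_CONTROL = auto()
    APPARATUS_MONITOR = auto()

class TreatmentApparatus:
    def __init__(self):
        self.params = {}

    def receive(self, signal_type, payload):
        if signal_type is SignalType.APPARATUS_CONTROL:
            self.params.update(payload)  # change an operating parameter
        return "ack"

class PatientInterface:
    """Pass-through: apparatus signals are forwarded to the treatment
    apparatus; all other telemedicine signals are handled locally."""
    def __init__(self, apparatus):
        self.apparatus = apparatus
        self.handled_locally = []

    def receive(self, signal_type, payload):
        if signal_type in (SignalType.APPARATUS_CONTROL,
                           SignalType.APPARATUS_MONITOR):
            return self.apparatus.receive(signal_type, payload)
        self.handled_locally.append((signal_type, payload))
        return "ack"  # acknowledgement back to the assistant interface

apparatus = TreatmentApparatus()
patient_ui = PatientInterface(apparatus)
patient_ui.receive(SignalType.APPARATUS_CONTROL, {"pedal_resistance": 5})
# The control signal passed through to the apparatus, which updated
# its operating parameters and returned an acknowledgement.
```

The returned `"ack"` string stands in for the acknowledgement message the text says may be sent back upon successful receipt of a control or monitor signal.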
[0300] In some embodiments, the assistant interface 2094 may be presented on the same physical device as the clinician interface 2020. For example, the clinician interface 2020 may include one or more screens that implement the assistant interface 2094. Alternatively or additionally, the clinician interface 2020 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 2094.
[0301] In some embodiments, one or more portions of the telemedicine signal 2096, 2097, 2098a, 2098b, 2099a, 2099b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 2054 of the patient interface 2050. For example, a tutorial video may be streamed from the server 2030 and presented upon the patient interface 2050. Content from the prerecorded source may be requested by the patient via the patient interface 2050. Alternatively, via a control on the assistant interface 2094, the assistant may cause content from the prerecorded source to be played on the patient interface 2050.
[0302] The assistant interface 2094 includes an assistant input device 2022 and an assistant display 2024, which may be collectively called an assistant user interface 2022, 2024. The assistant input device 2022 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example. Alternatively or additionally, the assistant input device 2022 may include one or more microphones. In some embodiments, the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the assistant to speak to a patient via the patient interface 2050.
In some embodiments, the assistant input device 2022 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the assistant by using the one or more microphones. The assistant input device 2022 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung. The assistant input device 2022 may include other hardware and/or software components. The assistant input device 2022 may include one or more general purpose devices and/or special-purpose devices.
[0303] The assistant display 2024 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch. The assistant display 2024 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc. The assistant display 2024 may incorporate various different visual, audio, or other presentation technologies. For example, the assistant display 2024 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions. The assistant display 2024 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the assistant. The assistant display 2024 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0304] In some embodiments, the system 2010 may provide computer translation of language from the assistant interface 2094 to the patient interface 2050 and/or vice-versa. The computer translation of language may include computer translation of spoken language and/or computer translation of text. Additionally or alternatively, the system 2010 may provide voice recognition and/or spoken pronunciation of text. For example, the system 2010 may convert spoken words to printed text and/or the system 2010 may audibly speak language from printed text. The system 2010 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the assistant. In some embodiments, the system 2010 may be configured to recognize and react to spoken requests or commands by the patient. For example, the system 2010 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
[0305] In some embodiments, the server 2030 may generate aspects of the assistant display 2024 for presentation by the assistant interface 2094. For example, the server 2030 may include a web server configured to generate the display screens for presentation upon the assistant display 2024. For example, the artificial intelligence engine 2011 may generate recommended optimal treatment plans and/or excluded treatment plans for patients and generate the display screens including those recommended optimal treatment plans and/or ruled-out treatment plans for presentation on the assistant display 2024 of the assistant interface 2094. In some embodiments, the assistant display 2024 may be configured to present a virtualized desktop hosted by the server 2030. In some embodiments, the server 2030 may be configured to communicate with the assistant interface 2094 via the first network 2034. In some embodiments, the first network 2034 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the first network 2034 may include the Internet, and communications between the server 2030 and the assistant interface 2094 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN). Alternatively or additionally, the server 2030 may be configured to communicate with the assistant interface 2094 via one or more networks independent of the first network 2034 and/or other communication means, such as a direct wired or wireless communication channel. In some embodiments, the patient interface 2050 and the treatment apparatus 2070 may each operate from a patient location geographically separate from a location of the assistant interface 2094. 
For example, the patient interface 2050 and the treatment apparatus 2070 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 2094 at a centralized location, such as a clinic or a call center.
[0306] In some embodiments, the assistant interface 2094 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians' offices. In some embodiments, a plurality of assistant interfaces 2094 may be distributed geographically. In some embodiments, a person may work as an assistant remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 2094 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part time and/or flexible work hours for an assistant.
[0307] FIGS. 14-15 show an embodiment of a treatment apparatus 2070. More specifically,
FIG. 14 shows a treatment apparatus 2070 in the form of a stationary cycling machine 2100, which may be called a stationary bike, for short. The stationary cycling machine 2100 includes a set of pedals 2102 each attached to a pedal arm 2104 for rotation about an axle 2106. In some embodiments, and as shown in FIG. 14, the pedals 2102 are movable on the pedal arms 2104 in order to adjust a range of motion used by the patient in pedaling. For example, the pedals being located inwardly toward the axle 2106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 2106. A pressure sensor 2086 is attached to or embedded within one of the pedals 2102 for measuring an amount of force applied by the patient on the pedal 2102. The pressure sensor 2086 may communicate wirelessly to the treatment apparatus 2070 and/or to the patient interface 2050.
[0308] FIG. 16 shows a person (a patient) using the treatment apparatus of FIG. 14, and showing sensors and various data parameters connected to a patient interface 2050. The example patient interface 2050 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient. In some other embodiments, the patient interface 2050 may be embedded within or attached to the treatment apparatus 2070. FIG. 16 shows the patient wearing the ambulation sensor 2082 on his wrist, with a note showing "STEPS TODAY 21355", indicating that the ambulation sensor 2082 has recorded and transmitted that step count to the patient interface 2050. FIG. 16 also shows the patient wearing the goniometer 2084 on his right knee, with a note showing "KNEE ANGLE 72°", indicating that the goniometer 2084 is measuring and transmitting that knee angle to the patient interface 2050. FIG. 16 also shows a right side of one of the pedals 2102 with a pressure sensor 2086 showing "FORCE 12.5 lbs.", indicating that the right pedal pressure sensor 2086 is measuring and transmitting that force measurement to the patient interface 2050. FIG. 16 also shows a left side of one of the pedals 2102 with a pressure sensor 2086 showing "FORCE 27 lbs.", indicating that the left pedal pressure sensor 2086 is measuring and transmitting that force measurement to the patient interface 2050. FIG. 16 also shows other patient data, such as an indicator of "SESSION TIME 0:04:13", indicating that the patient has been using the treatment apparatus 2070 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 2050 based on information received from the treatment apparatus 2070. FIG. 16 also shows an indicator showing "PAIN LEVEL 3". Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 2050.
[0309] FIG. 17 is an example embodiment of an overview display 2120 of the assistant interface 2094. Specifically, the overview display 2120 presents several different controls and interfaces for the assistant to remotely assist a patient with using the patient interface 2050 and/or the treatment apparatus 2070. This remote assistance functionality may also be called telemedicine or telehealth.
[0310] Specifically, the overview display 2120 includes a patient profile display 2130 presenting biographical information regarding a patient using the treatment apparatus 2070. The patient profile display 2130 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17, although the patient profile display 2130 may take other forms, such as a separate screen or a popup window. In some embodiments, the patient profile display 2130 may include a limited subset of the patient's biographical information. More specifically, the data presented upon the patient profile display 2130 may depend upon the assistant's need for that information. For example, a medical professional that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment apparatus 2070 may be provided with a much more limited set of information regarding the patient. The technician, for example, may be given only the patient's name. The patient profile display 2130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements. Such privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a "data subject".
[0311] In some embodiments, the patient profile display 2130 may present information regarding the treatment plan for the patient to follow in using the treatment apparatus 2070. Such treatment plan information may be limited to an assistant who is a medical professional, such as a doctor or physical therapist. For example, a medical professional assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment apparatus 2070 may not be provided with any information regarding the patient's treatment plan.
[0312] In some embodiments, one or more recommended optimal treatment plans and/or ruled-out treatment plans may be presented in the patient profile display 2130 to the assistant. The one or more recommended optimal treatment plans and/or ruled-out treatment plans may be generated by the artificial intelligence engine 2011 of the server 2030 and received from the server 2030 in real-time during, inter alia, a telemedicine or telehealth session. An example of presenting the one or more recommended optimal treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 18.
[0313] The example overview display 2120 shown in FIG. 17 also includes a patient status display 2134 presenting status information regarding a patient using the treatment apparatus. The patient status display 2134 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17, although the patient status display 2134 may take other forms, such as a separate screen or a popup window. The patient status display 2134 includes sensor data 2136 from one or more of the external sensors 2082, 2084, 2086, and/or from one or more internal sensors 2076 of the treatment apparatus 2070. In some embodiments, the patient status display 2134 may present other data 2138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
[0314] User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 2020, 2050, 2090, 2092, 2094 of the system 2010. In some embodiments, user access controls may be employed to control what information is available to any given person using the system 2010. For example, data presented on the assistant interface 2094 may be controlled by user access controls, with permissions set depending on the assistant/user's need for and/or qualifications to view that information.
[0315] The example overview display 2120 shown in FIG. 17 also includes a help data display 2140 presenting information for the assistant to use in assisting the patient. The help data display 2140 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17. The help data display 2140 may take other forms, such as a separate screen or a popup window. The help data display 2140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 2050 and/or the treatment apparatus 2070. The help data display 2140 may also include research data or best practices. In some embodiments, the help data display 2140 may present scripts for answers or explanations in response to patient questions. In some embodiments, the help data display
2140 may present flow charts or walk-throughs for the assistant to use in determining a root cause and/or solution to a patient's problem. In some embodiments, the assistant interface 2094 may present two or more help data displays 2140, which may be the same or different, for simultaneous presentation of help data for use by the assistant. For example, a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient's problem, and a second help data display may present script information for the assistant to read to the patient, such information preferably including directions for the patient to perform some action, which may help to narrow down or solve the problem. In some embodiments, based upon inputs to the troubleshooting flowchart in the first help data display, the second help data display may automatically populate with script information.
[0316] The example overview display 2120 shown in FIG. 17 also includes a patient interface control 2150 presenting information regarding the patient interface 2050, and/or to modify one or more settings of the patient interface 2050. The patient interface control 2150 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17. The patient interface control 2150 may take other forms, such as a separate screen or a popup window. The patient interface control 2150 may present information communicated to the assistant interface 2094 via one or more of the interface monitor signals 2098b. As shown in FIG. 17, the patient interface control 2150 includes a display feed 2152 of the display presented by the patient interface 2050. In some embodiments, the display feed 2152 may include a live copy of the display screen currently being presented to the patient by the patient interface 2050. In other words, the display feed 2152 may present an image of what is presented on a display screen of the patient interface 2050. In some embodiments, the display feed 2152 may include abbreviated information regarding the display screen currently being presented by the patient interface 2050, such as a screen name or a screen number. The patient interface control 2150 may include a patient interface setting control 2154 for the assistant to adjust or to control one or more settings or aspects of the patient interface 2050. In some embodiments, the patient interface setting control 2154 may cause the assistant interface 2094 to generate and/or to transmit an interface control signal 2098 for controlling a function or a setting of the patient interface 2050.
[0317] In some embodiments, the patient interface setting control 2154 may include collaborative browsing or co-browsing capability for the assistant to remotely view and/or control the patient interface 2050. For example, the patient interface setting control 2154 may
enable the assistant to remotely enter text to one or more text entry fields on the patient interface 2050 and/or to remotely control a cursor on the patient interface 2050 using a mouse or touchscreen of the assistant interface 2094.
[0318] In some embodiments, the patient interface setting control 2154 may allow the assistant to change a setting that cannot be changed by the patient using the patient interface 2050. For example, the patient interface 2050 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 2050, the language used for the displays, whereas the patient interface setting control 2154 may enable the assistant to change the language setting of the patient interface 2050. In another example, the patient interface 2050 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 2050 such that the display would become illegible to the patient, whereas the patient interface setting control 2154 may provide for the assistant to change the font size setting of the patient interface 2050.
[0319] The example overview display 2120 shown in FIG. 17 also includes an interface communications display 2156 showing the status of communications between the patient interface 2050 and one or more other devices 2070, 2082, 2084, such as the treatment apparatus 2070, the ambulation sensor 2082, and/or the goniometer 2084. The interface communications display 2156 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17. The interface communications display 2156 may take other forms, such as a separate screen or a popup window. The interface communications display 2156 may include controls for the assistant to remotely modify communications with one or more of the other devices 2070, 2082, 2084. For example, the assistant may remotely command the patient interface 2050 to reset communications with one of the other devices 2070, 2082, 2084, or to establish communications with a new one of the other devices 2070, 2082, 2084. This functionality may be used, for example, where the patient has a problem with one of the other devices 2070, 2082, 2084, or where the patient receives a new or a replacement one of the other devices 2070, 2082, 2084.
[0320] The example overview display 2120 shown in FIG. 17 also includes an apparatus control 2160 for the assistant to view and/or to control information regarding the treatment apparatus 2070. The apparatus control 2160 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17. The apparatus control 2160 may take other forms, such as a separate screen or a popup window. The apparatus control 2160 may include an apparatus status display 2162 with information regarding the current status of the apparatus. The apparatus status display 2162 may present information communicated to the assistant interface 2094 via one or more of the apparatus monitor signals 2099b. The apparatus status display 2162 may indicate whether the treatment apparatus 2070 is currently communicating with the patient interface 2050. The apparatus status display 2162 may present other current and/or historical information regarding the status of the treatment apparatus 2070.
[0321] The apparatus control 2160 may include an apparatus setting control 2164 for the assistant to adjust or control one or more aspects of the treatment apparatus 2070. The apparatus setting control 2164 may cause the assistant interface 2094 to generate and/or to transmit an apparatus control signal 2099 for changing an operating parameter of the treatment apparatus 2070, (e.g., a pedal radius setting, a resistance setting, a target RPM, etc.). The apparatus setting control 2164 may include a mode button 2166 and a position control 2168, which may be used in conjunction for the assistant to place an actuator 2078 of the treatment apparatus 2070 in a manual mode, after which a setting, such as a position or a speed of the actuator 2078, can be changed using the position control 2168. The mode button 2166 may provide for a setting, such as a position, to be toggled between automatic and manual modes. In some embodiments, one or more settings may be adjustable at any time, and without having an associated auto/manual mode. In some embodiments, the assistant may change an operating parameter of the treatment apparatus 2070, such as a pedal radius setting, while the patient is actively using the treatment apparatus 2070. Such "on the fly" adjustment may or may not be available to the patient using the patient interface 2050. In some embodiments, the apparatus setting control 2164 may allow the assistant to change a setting that cannot be changed by the patient using the patient interface 2050. For example, the patient interface 2050 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment apparatus 2070, whereas the apparatus setting control 2164 may provide for the assistant to change the height or tilt setting of the treatment apparatus 2070.
[0322] The example overview display 2120 shown in FIG. 17 also includes a patient communications control 2170 for controlling an audio or audiovisual communications session with the patient interface 2050. The communications session with the patient interface 2050 may comprise a live feed from the assistant interface 2094 for presentation by the output device of the patient interface 2050. The live feed may take the form of an audio feed and/or a video feed. In some embodiments, the patient interface 2050 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 2094. Specifically, the communications session with the patient interface 2050 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 2050 and the assistant interface 2094 presenting video of the other one. In some embodiments, the patient interface 2050 may present video from the assistant interface 2094, while the assistant interface 2094 presents only audio or the assistant interface 2094 presents no live audio or visual signal from the patient interface 2050. In some embodiments, the assistant interface 2094 may present video from the patient interface 2050, while the patient interface 2050 presents only audio or the patient interface 2050 presents no live audio or visual signal from the assistant interface 2094.
[0323] In some embodiments, the audio or audiovisual communications session with the patient interface 2050 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part. The patient communications control 2170 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17. The patient communications control 2170 may take other forms, such as a separate screen or a popup window. The audio and/or audiovisual communications may be processed and/or directed by the assistant interface 2094 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the assistant while the assistant uses the assistant interface 2094. Alternatively or additionally, the audio and/or audiovisual communications may include communications with a third party. For example, the system 2010 may enable the assistant to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a medical professional or a specialist. The example patient communications control 2170 shown in FIG. 17 includes call controls 2172 for the assistant to use in managing various aspects of the audio or audiovisual communications with the patient. The call controls 2172 include a disconnect button 2174 for the assistant to end the audio or audiovisual communications session. The call controls 2172 also include a mute button 2176 to temporarily silence an audio or audiovisual signal from the assistant interface 2094. In some embodiments, the call controls
2172 may include other features, such as a hold button (not shown). The call controls 2172 also include one or more record/playback controls 2178, such as record, play, and pause buttons to control, with the patient interface 2050, recording and/or playback of audio and/or video from the teleconference session. The call controls 2172 also include a video feed display 2180 for presenting still and/or video images from the patient interface 2050, and a self-video display 2182 showing the current image of the assistant using the assistant interface. The self-video display 2182 may be presented as a picture-in-picture format, within a section of the video feed display 2180, as shown in FIG. 17. Alternatively or additionally, the self-video display 2182 may be presented separately and/or independently from the video feed display 2180.
[0324] The example overview display 2120 shown in FIG. 17 also includes a third party communications control 2190 for use in conducting audio and/or audiovisual communications with a third party. The third party communications control 2190 may take the form of a portion or region of the overview display 2120, as shown in FIG. 17. The third party communications control 2190 may take other forms, such as a display on a separate screen or a popup window. The third party communications control 2190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a medical professional or a specialist. The third party communications control 2190 may include conference calling capability for the third party to simultaneously communicate with both the assistant via the assistant interface 2094, and with the patient via the patient interface 2050. For example, the system 2010 may provide for the assistant to initiate a 3-way conversation with the patient and the third party.
[0325] FIG. 18 shows an example embodiment of the overview display 2120 of the assistant interface 2094 presenting recommended optimal treatment plans and ruled-out treatment plans in real-time during a telemedicine session according to the present disclosure. As depicted, the overview display 2120 includes only sections for the patient profile 2130 and the video feed display 2180, including the self-video display 2182. Any suitable configuration of controls and interfaces of the overview display 2120 described with reference to FIG. 17 may be presented in addition to or instead of the patient profile 2130, the video feed display 2180, and the self-video display 2182.
[0326] The assistant (e.g., medical professional), who is using the assistant interface 2094
(e.g., computing device) during the telemedicine session, may be presented in the self-video 2182 in a portion of the overview display 2120 (e.g., user interface presented on a display screen 2024 of the assistant interface 2094) that also presents a video from the patient in the video feed display 2180. As depicted, another portion of the overview display 2120 includes the patient profile display 2130.
[0327] The patient profile display 2130 is presenting two example optimal treatment plans 2600 and one example excluded treatment plan 2602. As described herein, the optimal treatment plans may be recommended in view of various clinical information and characteristics of the patient being treated. The clinical information may include information pertaining to characteristics of other people, treatment plans followed by the other people, and results of the treatment plans. To generate the recommended optimal treatment plans 2600 the patient should follow to achieve a desired result, a pattern between the characteristics of the patient being treated and the other people may be matched by one or more machine learning models 2013 of the artificial intelligence engine 2011. Each of the recommended optimal treatment plans may be generated based on different desired results.
[0328] For example, suppose the following: treatment plan "A" indicates "Patient X should use treatment apparatus for 30 minutes a day for 4 days to achieve an increased range of motion of Y%; Patient X has Type 2 Diabetes; and Patient X should be prescribed medication Z for pain management during the treatment plan (medication Z is approved for people having Type 2 Diabetes)." Accordingly, the optimal treatment plan generated achieves an increased range of motion of Y%. As may be appreciated, the optimal treatment plan also includes a recommended medication (e.g., medication Z) to prescribe to the patient to manage pain in view of a known medical disease (e.g., Type 2 Diabetes) of the patient. That is, the recommended patient medication not only does not conflict with the medical condition of the patient but also improves the probability of a superior patient outcome.
[0329] Recommended optimal treatment plan "B" may specify, based on a different desired result of the treatment plan, a different treatment plan including a different treatment protocol for a treatment apparatus, a different medication regimen, etc.
[0330] As depicted, the patient profile display 2130 may also present the ruled-out treatment plans 2602. These types of treatment plans are shown to the assistant using the assistant interface 2094 to alert the assistant not to recommend certain portions of a treatment plan to the patient. For example, the ruled-out treatment plan could specify the following:
"Patient X should not use treatment apparatus for longer than 30 minutes a day due to a heart condition; Patient X has Type 2 Diabetes; and Patient X should not be prescribed medication M for pain management during the treatment plan (in this scenario, medication M can cause complications for people having Type 2 Diabetes)." Specifically, the ruled-out treatment plan points out a limitation of a treatment protocol where, due to a heart condition, Patient X should not exercise for more than 30 minutes a day. The ruled-out treatment plan also points out that Patient X should not be prescribed medication M because it conflicts with the medical condition Type 2 Diabetes.
[0331] The assistant may select the optimal treatment plan for the patient on the overview display 2120. For example, the assistant may use an input peripheral (e.g., mouse, touchscreen, microphone, keyboard, etc.) to select from the optimal treatment plans 2600 for the patient. In some embodiments, during the telemedicine session, the assistant may discuss the pros and cons of the recommended optimal treatment plans 2600 with the patient.
[0332] In any event, the assistant may select the optimal treatment plan for the patient to follow to achieve the desired result. The selected optimal treatment plan may be transmitted to the patient interface 2050 for presentation. The patient may view the selected optimal treatment plan on the patient interface 2050. In some embodiments, the assistant and the patient may discuss during the telemedicine session the details (e.g., treatment protocol using treatment apparatus 2070, diet regimen, medication regimen, etc.) in real-time.
[0333] FIG. 19 shows an example embodiment of a server 2030 translating clinical information 2700 into a medical description language 2702 for processing by an artificial intelligence engine 2011 according to the present disclosure. The clinical information 2700 may be written by a person having a certain professional credential, license, or degree. In the depicted example, the clinical information 2700 includes a portion of meta-analyses for a clinical trial titled "EFFECT OF USING TREATMENT PLAN FOR HIP OSTEOARTHRITIS PAIN". The portion includes a section for "Results" and a section for "Conclusion". There may be many other portions (e.g., details of the trial procedure, biographies of subjects, etc.) of the clinical information 2700 that, for clarity of explanation, are not depicted.
[0334] One or more machine learning models 2013 may be trained to parse a body of structured or unstructured text (e.g., clinical information 2700) in search of a corpus of keywords that represent target information. The target information may be included in one or
more portions of the clinical information 2700. Target information may refer to any suitable information of interest, such as characteristics of people (e.g., vital signs, medical conditions, medical procedures, allergies, familial medical information, measurements, etc.), treatment plans followed by the people, results of the treatment plans, clinical trial information, treatment apparatuses used for the treatment plan, and the like.
[0335] Using tags representing the target information and values associated with the tags, the one or more machine learning models 2013 may generate a canonical format defined by the medical description language. The values may be numbers, characters, alphanumeric characters, strings, arrays, and the like, and may be obtained from the portions of the clinical information 2700 (including the target information). The target information may be organized in parent-child relationships based on the structure, organization, and/or relationships of the information. For example, the keyword "Results" may be identified and determined to be a parent-level tag because it encompasses child-level target information, such as trials, subjects, treatment plan, treatment apparatus, subject characteristics, and conclusions. As such, a parent-level tag for "<results>" may include child-level tags for "<trials>", "<subjects>", "<treatment plan>", "<treatment apparatus>", "<subject characteristics>", and "<conclusions>". Each tag may have a corresponding ending tag (e.g., "<results> ... </results>").
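The tag-value canonical format described above can be illustrated with a brief sketch. The following Python snippet is purely illustrative and not part of the disclosure (the function name, tag names, and values are assumptions); it shows how a parent-level tag may wrap child-level tag-value pairs, with each tag paired with a corresponding ending tag:

```python
# Hypothetical sketch: encoding extracted target information as nested
# tag-value pairs in a canonical "medical description language" format.
# All tag names and values here are illustrative assumptions.

def to_mdl(tag, value):
    """Wrap a value (or a nested dict of child tags) in opening/closing tags."""
    if isinstance(value, dict):
        body = "".join(to_mdl(child, v) for child, v in value.items())
    else:
        body = str(value)
    return f"<{tag}>{body}</{tag}>"

# A parent-level "results" tag with child-level tags, mirroring the
# parent-child organization described above.
results = {
    "trials": 1,
    "subjects": 42,
    "treatment_plan": "30 min/day, 4 days",
    "treatment_apparatus": "ROM",
    "conclusions": "increased range of motion",
}

print(to_mdl("results", results))
```

In this toy form, every opening tag is guaranteed a matching ending tag by construction, which is the well-formedness property the canonical format relies on.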
[0336] An embodiment of operations which a trained machine learning model 2013 performs to encode the portion of clinical information 2700 in the medical description language 2702 is now discussed. The trained machine learning model 2013 identified keywords "treatment plan" and "treatment apparatus" in the portion of the clinical information 2700. Once identified, the trained machine learning model 2013 may analyze words in the vicinity (e.g., to the left and right) of the keywords to determine, based on training data, whether the words match a recognized context. The trained machine learning model 2013 may also determine, based on training data and based on attributes of the data, whether the words are recognized as being associated with the keywords. In FIG. 19, the trained machine learning model may determine the words "range of motion (ROM)" fit the context of the keyword "treatment apparatus" and also are likely recognizable as being associated with the keyword "treatment apparatus". Accordingly, the value "ROM" is placed in between tags "<treatment apparatus>" and "</treatment apparatus>" representing target information. The other tags representing target information in the canonical format of the
medical description language 2702 may be populated in a similar manner. The medical description language 2702 representing the portion of the clinical information 2700 may be saved in the patient data store 2044 in an appropriate patient cohort-equivalent database.
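The vicinity analysis described above can be sketched in simplified form. The snippet below is a hypothetical stand-in for the trained machine learning model 2013: it merely collects the tokens within a fixed window to the left and right of a keyword, whereas the model is described as using training data to judge context and association. The keyword, sample text, and window size are illustrative assumptions:

```python
# Hypothetical sketch: locate a keyword in unstructured clinical text and
# gather the words in its vicinity (a small window to the left and right),
# from which an associated value such as "ROM" might be picked out.

def extract_near_keyword(text, keyword, window=4):
    """Return, for each keyword occurrence, the tokens within `window` of it."""
    tokens = text.split()
    hits = []
    for i, tok in enumerate(tokens):
        if keyword.lower() in tok.lower():
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            hits.append(tokens[lo:hi])
    return hits

text = ("Subjects using the range of motion (ROM) treatment apparatus "
        "followed the treatment plan for four weeks.")
vicinity = extract_near_keyword(text, "apparatus")
print(vicinity)
```

A trained model would additionally score whether the nearby words match a recognized context before accepting a value; the window here only shows where such a score would be computed.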
[0337] FIG. 20 shows an example embodiment of a method 2800 for recommending an optimal treatment plan according to the present disclosure. The method 2800 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both. The method 2800 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIG. 13, such as server 2030 executing the artificial intelligence engine 2011). In certain implementations, the method 2800 may be performed by a single processing thread. Alternatively, the method 2800 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the methods.
[0338] For simplicity of explanation, the method 2800 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 2800 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 2800 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 2800 could alternatively be represented as a series of interrelated states via a state diagram or events.
[0339] At 2802, the processing device may receive, from a data source 2015, clinical information 2700 pertaining to results of performing particular treatment plans using the treatment apparatus 2070 for people having certain characteristics. The clinical information has a first data format, which may include natural language text in the form of words arranged in sentences that are further arranged in paragraphs. The first data format may be a report or description, wherein the report or description may include information pertaining to clinical trials, medical research, meta-analyses, evidence-based guidelines, journals, and the like. The first data format may include information arranged in an unstructured manner and may have a first data size (e.g., bytes, kilobytes, etc.).
[0340] The certain characteristics of the people may include medications prescribed to the people, injuries of the people, medical procedures performed on the people, measurements of the people, allergies of the people, medical conditions of the people, first historical information of the people, vital signs of the people, symptoms of the people, familial medical information of the people, or some combination thereof. The characteristics may also include the following information pertaining to the people: demographic, geographic, diagnostic, measurement- or test-based, medically historic, etiologic, cohort-associative, differentially diagnostic, surgical, physically therapeutic, pharmacologic and other treatment(s) recommended.
[0341] At 2804, the processing device may translate a portion of the clinical information from the first data format to a medical description language 2702 used by the artificial intelligence engine 2011. The medical description language 2702 may include a second data format that structures the unstructured data of the clinical information 2700. For example, the medical description language 2702 may include using tag-value pairs, where the tags identify the type of value stored between the tags. The medical description language 2702 may have a second data size (e.g., bits) that is smaller than the first data size of the clinical information 2700. The medical description language may include telemedical data.
[0342] At 2806, the processing device may determine, based on the portion of the clinical information 2700 described by the medical description language 2702 and a set of characteristics pertaining to a patient, the optimal treatment plan 2600 for the patient to follow when using the treatment apparatus 2070 to achieve a desired result. One or more machine learning models 2013 of the artificial intelligence engine 2011 may be trained to output the optimal treatment plan 2600. For example, one machine learning model 2013 may be trained to match a pattern between the portion of the clinical information described by the medical description language 2702 with the set of characteristics of the patient. In some embodiments, the set of characteristics of the patient is also represented in the medical description language. The pattern is associated with the optimal treatment plan that may produce the desired result.
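As a rough illustration of the pattern matching described at 2806, the sketch below compares a patient's characteristics against cohorts derived from prior clinical information using simple set overlap. An actual embodiment would rely on the trained machine learning models 2013 rather than this toy heuristic, and every cohort, trait, and plan shown is invented for the example:

```python
# Hypothetical sketch: pick the treatment plan whose cohort shares the most
# characteristics with the patient. Set intersection stands in for the
# pattern matching performed by trained machine learning models.

def recommend_plan(patient_traits, cohorts):
    """Return the plan of the cohort sharing the most traits with the patient."""
    best = max(cohorts, key=lambda c: len(patient_traits & c["traits"]))
    return best["plan"]

cohorts = [
    {"traits": {"type_2_diabetes", "age_60s", "knee_surgery"},
     "plan": "Plan A: 30 min/day, 4 days, medication Z"},
    {"traits": {"age_20s", "acl_tear"},
     "plan": "Plan B: 45 min/day, 6 days"},
]

patient = {"type_2_diabetes", "age_60s", "hip_replacement"}
print(recommend_plan(patient, cohorts))
```

Because the first cohort shares two traits with the patient and the second shares none, the sketch surfaces Plan A, in the same spirit as matching a patient to clinical results from similar people.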
[0343] In some embodiments, the optimal treatment plan may include information pertaining to a medical procedure to perform on the patient, a treatment protocol for the patient using the treatment apparatus 2070, a diet regimen for the patient, a medication regimen for the patient, a sleep regimen for the patient, or some combination thereof.
[0344] The desired result may include obtaining a certain result within a certain time period.
The certain result may include a range of motion the patient achieves using the treatment apparatus 2070, an amount of force exerted by the patient on a portion of the treatment apparatus 2070, an amount of time the patient exercises using the treatment apparatus 2070, a distance the patient travels using the treatment apparatus 2070, a level of pain experienced by the patient when using the treatment apparatus 2070, or some combination thereof.
[0345] In some embodiments, the processing device may determine, based on the portion of the clinical information described by the medical description language and the set of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow using the treatment apparatus 2070 to achieve a second desired result. The desired result may pertain to a recovery outcome and the second desired result may pertain to a recovery time. The recovery outcome may include achieving a certain threshold of functionality, mobility, movement, range of motion, etc. of a particular body part. The recovery time may include achieving a certain threshold of functionality, mobility, movement, range of motion, etc. of a particular body part within a certain threshold period of time. For example, some people may prefer to recover to a certain level of mobility as fast as possible without full recovery. As discussed above, different machine learning models 2013 may be trained, using different clinical information, to provide different recommended treatment plans that may produce different desired results.
[0346] In some embodiments, the processing device may determine, based on the portion of the clinical information described by the medical description language and the set of characteristics pertaining to the patient, an excluded treatment plan 2602 that should not be recommended for the patient to follow when using the treatment apparatus 2070 to achieve the desired result. In some embodiments, as depicted in FIG. 18, the optimal treatment plan(s) 2600 and the excluded treatment plan(s) 2602 may be concurrently presented in a first portion (e.g., patient profile display 2130) of the user interface while at least the video or other multimedia data from the patient engaged in the telemedicine session may be presented in another portion (e.g., video feed display 2180).
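The determination of an excluded (ruled-out) treatment plan can be sketched as a conflict check between plan components and the patient's known medical conditions. The conflict table, plan fields, and condition names below are hypothetical, chosen only to mirror the medication M / Type 2 Diabetes example discussed earlier:

```python
# Hypothetical sketch: rule out a candidate treatment plan when any of its
# components conflicts with one of the patient's medical conditions.
# The conflict table is an illustrative assumption.

CONFLICTS = {
    "medication_M": {"type_2_diabetes"},      # can cause complications
    "exercise_over_30min": {"heart_condition"},
}

def is_ruled_out(plan, conditions):
    """True if any plan component conflicts with a patient condition."""
    return any(CONFLICTS.get(item, set()) & conditions for item in plan)

patient_conditions = {"type_2_diabetes", "heart_condition"}
candidate = ["medication_M", "exercise_20min"]
print(is_ruled_out(candidate, patient_conditions))
```

Here the candidate plan is ruled out because medication M conflicts with Type 2 Diabetes; a plan containing neither conflicting component would pass the check.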
[0347] In some embodiments, the optimal treatment plan(s) 2600 and the excluded treatment plan(s) 2602 may be concurrently presented while the medical professional is not engaged in a telemedicine session. For example, the optimal treatment plan(s) 2600 and the excluded treatment plan(s) 2602 may be presented in the user interface before a telemedicine session begins or after a telemedicine session ends.
[0348] At 2808, the processing device may provide the optimal treatment plan to be presented in a user interface (e.g., overview display 2120) on a computing device (e.g., assistant interface 2094) of a medical professional. In addition, any other generated optimal treatment plans 2600 may be provided to the computing device of the medical professional. For example, different optimal treatment plans that result in different outcomes may be presented to the medical professional. The processing device may receive a selected treatment plan of any of the treatment plans presented. In some embodiments, the medical professional may select the optimal treatment plan based on an outcome preference of the patient. For example, an athlete might wish to optimize for performance, while a retiree might wish to optimize for a pain-free quality of life. The selected treatment plan may be transmitted to the computing device of the patient for presentation on a user interface. In some embodiments, the optimal treatment plan(s) may be provided to the computing device of the medical professional during a telemedicine session to cause the optimal treatment plan to be presented in real-time in a first portion of the user interface while video and, optionally, other multimedia of the patient is concurrently presented in a second portion of the user interface. The selected treatment plan may be presented on the computing device of the patient during the telemedicine session such that the medical professional can explain the selected treatment plan to the patient.
[0349] FIG. 21 shows an example embodiment of a method 2900 for translating clinical information into the medical description language according to the present disclosure. Method 2900 includes operations performed by processors of a computing device (e.g., any component of FIG. 13, such as server 2030 executing the artificial intelligence engine 2011). In some embodiments, one or more operations of the method 2900 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 2900 may be performed in the same or a similar manner as described above in regard to method 2800. The operations of the method 2900 may be performed in some combination with any of the operations of any of the methods described herein.
[0350] The method 2900 may include operation 2804 from the previously described method 2800 depicted in FIG. 20. For example, at 2804 in the method 2800, the processing device may translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine.
[0351] The method 2900 in FIG. 21 includes operations 2902, 2904, and 2906. The operations 2902, 2904, and 2906 may be performed by one or more trained machine learning models 2013 of the artificial intelligence engine 2011.
[0352] At 2902, the processing device may parse the clinical information. At 2904, the processing device may identify, based on keywords representing target information in the clinical information, the portion of the clinical information having values related to the target information. At 2906, the processing device may generate a canonical format defined by the medical description language. The canonical format may include tags identifying values of the target information. The tags may be attributes describing specific characteristics of the target information. The specific characteristics may include which cohort class a person is placed in, age of the person, semantic information, being related to a certain cohort, familial history, and the like. In some embodiments, the specific characteristics may include any information or indication that a person is at risk.
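Operations 2902, 2904, and 2906 can be illustrated with a minimal sketch: parse free-text clinical information, locate values near target keywords, and emit a tagged record in a canonical format. The keyword patterns and tag names below are assumptions chosen for illustration; the disclosure does not specify them.

```python
import re

# Hypothetical keyword patterns representing target information (assumption).
TARGET_KEYWORDS = {
    "age": r"age[d]?\s*(?:of\s*)?(\d+)",
    "cohort": r"cohort\s*([A-Za-z0-9-]+)",
}

def to_canonical(clinical_text):
    """Parse clinical text and return {tag: value} pairs in a canonical format."""
    record = {}
    for tag, pattern in TARGET_KEYWORDS.items():
        match = re.search(pattern, clinical_text, re.IGNORECASE)
        if match:
            record[tag] = match.group(1)
    return record

doc = "Patients aged 65 in cohort knee-rehab-2 showed improved range of motion."
print(to_canonical(doc))  # {'age': '65', 'cohort': 'knee-rehab-2'}
```

A production medical description language would use far richer parsing and a defined tag vocabulary; the sketch shows only the parse → identify → tag pipeline.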
[0353] The canonical format may enable more efficient processing of the portion of the clinical information represented by the medical description language when training a machine learning model to generate the optimal treatment plans for patients who are using the trained machine learning model. Further, the canonical format may enable more efficient processing by the trained machine learning model when matching patterns between the characteristics of patients and the portion of the clinical information represented by the medical description language.
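The pattern matching that the canonical format is said to make more efficient (see also Clause 34) can be sketched as below: tagged clinical records are matched against a patient's characteristics, and the treatment plan associated with the best-matching record is returned. The records and the simple overlap score are invented for illustration.

```python
# Hypothetical canonical records, each pairing tags with an associated plan.
CANONICAL_RECORDS = [
    {"tags": {"age": "65", "cohort": "knee-rehab-2"}, "plan": "low-impact protocol"},
    {"tags": {"age": "30", "cohort": "acl-repair"}, "plan": "high-intensity protocol"},
]

def match_optimal_plan(patient_characteristics):
    """Return the plan whose record shares the most tag values with the patient."""
    def score(record):
        return sum(patient_characteristics.get(k) == v
                   for k, v in record["tags"].items())
    best = max(CANONICAL_RECORDS, key=score)
    return best["plan"]

plan = match_optimal_plan({"age": "65", "cohort": "knee-rehab-2"})
print(plan)  # low-impact protocol
```

A trained machine learning model would learn these associations rather than count tag overlaps; the sketch only shows why a uniform tagged format makes the comparison cheap.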
[0354] FIG. 22 shows an example computer system 21000 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure. In one example, computer system 21000 may include a computing device and correspond to the assistant interface 2094, reporting interface 2092, supervisory interface 2090, clinician interface 2020, server 2030 (including the AI engine 2011), patient interface 2050, ambulatory sensor 2082, goniometer 2084, treatment apparatus 2070, pressure sensor 2086, or any suitable component of FIG. 13. The computer system 21000 may be capable of executing instructions implementing the one or more machine learning models 2013 of the artificial intelligence engine 2011 of FIG. 13. The computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network. The computer system may operate in the capacity of a server in a client-server network environment. The computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal digital assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Further, while only a single computer system is illustrated, the term "computer" shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
[0355] The computer system 21000 includes a processing device 21002, a main memory 21004 (e.g., read-only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 21006 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 21008, which communicate with each other via a bus 21010.
[0356] Processing device 21002 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 21002 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 21002 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 21002 is configured to execute instructions for performing any of the operations and steps discussed herein.
[0357] The computer system 21000 may further include a network interface device 21012. The computer system 21000 also may include a video display 21014 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 21016 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 21018 (e.g., a speaker). In one illustrative example, the video display 21014 and the input device(s) 21016 may be combined into a single component or device (e.g., an LCD touch screen).
[0358] The data storage device 21008 may include a computer-readable medium 21020 on which the instructions 21022 embodying any one or more of the methods, operations, or functions described herein are stored. The instructions 21022 may also reside, completely or at least partially, within the main memory 21004 and/or within the processing device 21002 during execution thereof by the computer system 21000. As such, the main memory 21004 and the processing device 21002 also constitute computer-readable media. The instructions 21022 may further be transmitted or received over a network via the network interface device 21012.
[0359] While the computer-readable storage medium 21020 is shown in the illustrative examples to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0360] Clause 25. A method for providing, by an artificial intelligence engine, an optimal treatment plan to use with a treatment apparatus, the method comprising:
[0361] receiving, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
[0362] translating a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine;
[0363] determining, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow using the treatment apparatus to achieve a desired result; and
[0364] providing the optimal treatment plan to be presented on a computing device of a medical professional.
[0365] Clause 26. The method of any clause herein, wherein translating the portion of the clinical information from the first data format to the medical description language used by the artificial intelligence engine further comprises:
[0366] parsing the clinical information;
[0367] identifying, based on keywords representing target information in the clinical information, the portion of the clinical information having values related to the target information;
[0368] generating a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
[0369] Clause 27. The method of any clause herein, wherein the tags are attributes describing specific characteristics of the target information.
[0370] Clause 28. The method of any clause herein, wherein providing the optimal treatment plan to be presented on the computing device of the medical professional further comprises:
[0371] causing, during a telemedicine session, the optimal treatment plan to be presented on a user interface of the computing device of the medical professional, wherein the optimal treatment plan is not presented on a display screen of a computing device, such display screen configured to be used by the patient during the telemedicine session.
[0372] Clause 29. The method of any clause herein, further comprising:
[0373] determining, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, an excluded treatment plan that should not be recommended for the patient to follow when using the treatment apparatus to achieve the desired result; and
[0374] providing the excluded treatment plan to be presented on the computing device of the medical professional.
[0375] Clause 30. The method of any clause herein, further comprising:
[0376] determining, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a second desired result, wherein the desired result pertains to a recovery outcome and the second desired result pertains to a recovery time;
[0377] providing the second optimal treatment plan to be presented on the computing device of the medical professional;
[0378] receiving a selected treatment plan of either the optimal treatment plan or the second optimal treatment plan; and
[0379] transmitting the selected treatment plan to a computing device of the patient for presenting on a user interface of the computing device of the patient.
[0380] Clause 31. The method of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
[0381] a range of motion the patient achieves using the treatment apparatus,
[0382] an amount of force exerted by the patient on a portion of the treatment apparatus,
[0383] an amount of time the patient exercises using the treatment apparatus,
[0384] a distance the patient travels using the treatment apparatus, or
[0385] some combination thereof.
[0386] Clause 32. The method of any clause herein, wherein:
[0387] the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
[0388] the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the
patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort-associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
[0389] Clause 33. The method of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, a meta-analysis, or some combination thereof.
[0390] Clause 34. The method of any clause herein, wherein determining, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, the optimal treatment plan for the patient to follow when using the treatment apparatus to achieve the desired result further comprises:
[0391] matching a pattern between the portion of the clinical information described by the medical description language with the plurality of characteristics of the patient, wherein the pattern is associated with the optimal treatment plan that leads to the desired result.
[0392] Clause 35. The method of any clause herein, wherein the optimal treatment plan comprises:
[0393] a medical procedure to perform on the patient,
[0394] a treatment protocol for the patient using the treatment apparatus,
[0395] a diet regimen for the patient,
[0396] a medication regimen for the patient,
[0397] a sleep regimen for the patient, or
[0398] some combination thereof.
[0399] Clause 36. A tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to:
[0400] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
[0401] translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine;
[0402] determine, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow using the treatment apparatus to achieve a desired result; and
[0403] provide the optimal treatment plan to be presented on a computing device of a medical professional.
[0404] Clause 37. The computer-readable medium of any clause herein, wherein translating the portion of the clinical information from the first data format to the medical description language used by the artificial intelligence engine further comprises:
[0405] parse the clinical information;
[0406] identify, based on keywords representing target information in the clinical information, the portion of the clinical information having values of the target information;
[0407] generate a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
[0408] Clause 38. The computer-readable medium of any clause herein, wherein providing the optimal treatment plan to be presented on the computing device of the medical professional further comprises:
[0409] causing, during a telemedicine session, the optimal treatment plan to be presented on a user interface of the computing device of the medical professional, wherein, during the telemedicine session, the optimal treatment plan is not presented on a user interface of a computing device of the patient.
[0410] Clause 39. The computer-readable medium of any clause herein, wherein the processing device further:
[0411] determines, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a second desired result, wherein the desired result pertains to a recovery outcome and the
second desired result pertains to a recovery time;
[0412] provides the second optimal treatment plan to be presented on the computing device of the medical professional;
[0413] receives a selected treatment plan of either the optimal treatment plan or the second optimal treatment plan; and
[0414] transmits the selected treatment plan to a computing device of the patient.
[0415] Clause 40. The computer-readable medium of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
[0416] a range of motion the patient achieves using the treatment apparatus,
[0417] an amount of force exerted by the patient on a portion of the treatment apparatus,
[0418] an amount of time the patient exercises using the treatment apparatus,
[0419] a distance the patient travels using the treatment apparatus, or
[0420] some combination thereof.
[0421] Clause 41. The computer-readable medium of any clause herein, wherein:
[0422] the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
[0423] the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort-associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
[0424] Clause 42. The computer-readable medium of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, or some combination thereof.
[0425] Clause 43. A system comprising:
[0426] a memory device storing instructions; and
[0427] a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to:
[0428] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
[0429] translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine;
[0430] determine, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a desired result; and
[0431] provide the optimal treatment plan to be presented on a computing device of a medical professional.
[0432] Clause 44. The system of any clause herein, wherein translating the portion of the clinical information from the first data format to the medical description language used by the artificial intelligence engine further comprises:
[0433] parse the clinical information;
[0434] identify, based on keywords representing target information described by the clinical information, the portion of the clinical information having values of the target information;
[0435] generate a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
[0436] Determining a treatment plan for a patient having certain characteristics (e.g., vital sign or other measurements; performance; demographic; geographic; diagnostic; measurement- or test-based; medically historic; etiologic; cohort-associative; differentially diagnostic; surgical, physically therapeutic, behavioral, pharmacologic and other treatment(s) recommended; etc.) may be a technically challenging problem. For example, a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process. In a rehabilitative setting, some of the multitude of information considered may include characteristics of the patient such as personal information, performance information, and measurement information. The personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medication history, an injury, a medical procedure, a medication prescribed, behavioral or psychological conditions, or some combination thereof. The performance information may include, e.g., an elapsed time of using a treatment device, an amount of force exerted on a portion of the treatment device, a range of motion achieved on the treatment device, a movement speed of a portion of the treatment device, an indication of a plurality of pain levels using the treatment device, or some combination thereof. The measurement information may include, e.g., a vital sign, a respiration rate, a heartrate, a temperature, a blood pressure, a glucose level or other biomarker, or some combination thereof. It may be desirable to process the characteristics of a multitude of patients, the treatment plans performed for those patients, and the results of the treatment plans for those patients.
[0437] Further, another technical problem may involve distally treating, via a computing device during a telemedicine or telehealth session, a patient from a location different than a location at which the patient is located. An additional technical problem is controlling or enabling the control of, from the different location, a treatment device used by the patient at the location at which the patient is located. Oftentimes, when a patient undergoes rehabilitative surgery (e.g., knee surgery), a healthcare provider may prescribe a treatment device to the patient to use to perform a treatment protocol at their residence or any mobile location or temporary domicile. A healthcare provider may refer to a doctor, physician assistant, nurse, chiropractor, dentist, physical therapist, acupuncturist, physical trainer, coach, personal trainer, or the like. A healthcare provider may refer to any person with a credential, license, degree, or the like in the field of medicine, physical therapy, rehabilitation, or the like.
[0438] When the healthcare provider is located in a different location from the patient and the treatment device, it may be technically challenging for the healthcare provider to monitor the patient's actual progress (as opposed to relying on the patient's word about their progress) using the treatment device, modify the treatment plan according to the patient's progress, adapt the treatment device to the personal characteristics of the patient as the patient performs the treatment plan, and the like.
[0439] Accordingly, systems and methods, such as those described herein, configured to monitor the patient's actual progress, while the patient performs the treatment plan using the treatment device, may be desirable. In some embodiments, the systems and methods described herein may be configured to receive treatment data pertaining to a user who uses a treatment device to perform a treatment plan. The user may include a patient, user, or person using the treatment device to perform various exercises.
[0440] The treatment data may include various characteristics of the user, various baseline measurement information pertaining to the user, various measurement information pertaining to the user while the user uses the treatment device, various characteristics of the treatment device, the treatment plan, other suitable data, or a combination thereof. In some embodiments, the systems and methods described herein may be configured to receive the treatment data during a telemedicine session.
[0441] In some embodiments, while the user uses the treatment device to perform the treatment plan, at least some of the treatment data may correspond to sensor data of a sensor configured to sense various characteristics of the treatment device, and/or the measurement information of the user. Additionally, or alternatively, while the user uses the treatment device to perform the treatment plan, at least some of the treatment data may correspond to sensor data from a sensor associated with a wearable device configured to sense the measurement information of the user.
[0442] The various characteristics of the treatment device may include one or more settings of the treatment device, a current revolutions per time period (e.g., such as one minute) of a rotating member (e.g., such as a wheel) of the treatment device, a resistance setting of the treatment device, other suitable characteristics of the treatment device, or a combination thereof. The baseline measurement information may include, while the user is at rest, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, a glucose level or other biomarker, other suitable measurement information of the user, or a combination thereof. The measurement information may include, while the user uses the treatment device to perform the treatment plan, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, a glucose level of the user, other suitable measurement information of the user, or a combination thereof.
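An illustrative data shape for the treatment data described above — device characteristics, baseline measurements taken while the user is at rest, and measurements taken while the user performs the treatment plan — might look like the following. All class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DeviceCharacteristics:
    resistance_setting: int
    revolutions_per_minute: float  # of a rotating member such as a wheel

@dataclass
class Measurements:
    heartrate_bpm: int
    respiration_rate: int
    blood_pressure: tuple  # (systolic, diastolic)
    temperature_c: float

@dataclass
class TreatmentData:
    device: DeviceCharacteristics
    baseline: Measurements        # sensed while the user is at rest
    during_session: Measurements  # sensed while the user uses the device

data = TreatmentData(
    device=DeviceCharacteristics(resistance_setting=4, revolutions_per_minute=55.0),
    baseline=Measurements(72, 14, (118, 76), 36.8),
    during_session=Measurements(121, 22, (135, 85), 37.2),
)
```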
[0443] In some embodiments, the systems and methods described herein may be configured to write to an associated memory, for access by an artificial intelligence engine, the treatment data. The artificial intelligence engine may be configured to use one or more machine learning models configured to use at least some of the treatment data to generate one or more predictions. For example, the artificial intelligence engine may use a machine learning model trained using various treatment data corresponding to various users. The machine learning model may be configured to receive the treatment data corresponding to the user. The machine learning model may analyze at least one aspect of the treatment data and may generate at least one prediction corresponding to that aspect. The at least one prediction may indicate one or more predicted characteristics of the user. The one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted performance parameter of the user performing the treatment plan, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristics of the user.
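A stand-in for the trained machine learning model can make the prediction step concrete. The function below maps one aspect of the treatment data (resistance setting and current heartrate) to a predicted heartrate; the linear form and coefficients are invented for illustration and merely stand in for a machine learning model 2013 trained on treatment data from many users.

```python
def predict_heartrate(resistance_setting, current_heartrate_bpm):
    """Predict the user's heartrate later in the session (illustrative stand-in
    for a trained machine learning model; coefficient 3.5 is an assumption)."""
    return current_heartrate_bpm + 3.5 * resistance_setting

prediction = predict_heartrate(resistance_setting=4, current_heartrate_bpm=110)
print(prediction)  # 124.0
```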
[0444] In some embodiments, the systems and methods described herein may be configured to receive, from the artificial intelligence engine, one or more predictions. The systems and methods described herein may be configured to identify a threshold corresponding to respective predictions received from the artificial intelligence engine. For example, the systems and methods described herein may identify one or more characteristics of the user indicated by a respective prediction.
[0445] The systems and methods described herein may be configured to access a database configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user. For example, the database may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database may include information that associates a threshold with a blood pressure of the user and a heartrate of the user. It should be understood that the database may include any number of thresholds associated with any of the various characteristics of the user and/or any combination of user characteristics. In some embodiments, a threshold corresponding to a respective prediction may include a value or a range of values, including an upper limit and a lower limit.
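The database association described above — thresholds keyed to a characteristic or to a combination of characteristics — can be sketched as a simple lookup table. The keys, values, and order-insensitive lookup are illustrative assumptions.

```python
# Hypothetical threshold database: a key is a (sorted) tuple of characteristic
# names; a value is either a single threshold or a (lower, upper) range.
THRESHOLDS = {
    ("blood_pressure_systolic",): (90, 140),             # acceptable range
    ("blood_pressure_systolic", "heartrate"): (0, 300),  # combined-metric range
    ("heartrate",): 160,                                 # single threshold value
}

def lookup_threshold(*characteristics):
    """Return the threshold associated with the given characteristic(s)."""
    return THRESHOLDS.get(tuple(sorted(characteristics)))

print(lookup_threshold("heartrate"))                # 160
print(lookup_threshold("blood_pressure_systolic"))  # (90, 140)
```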
[0446] In some embodiments, the systems and methods described herein may be configured to determine whether a prediction received from the artificial intelligence engine is within a range of a corresponding threshold. For example, the systems and methods described herein may be configured to compare the prediction to the corresponding threshold. The systems and methods described herein may be configured to determine whether the prediction is within a predefined range of the threshold. For example, if the threshold includes a value, the predefined range may include an upper limit (e.g., 0.5% or 1% percentagewise, or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable upper limit) above the value and a lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable lower limit) below the value. Similarly, if the threshold includes a range including a first upper limit and a first lower limit (e.g., defining an acceptable range of the user characteristic or characteristics corresponding to the prediction), the predefined range may include a second upper limit (e.g.,
0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value) or other suitable upper limit) above the first upper limit and a second lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value) or other suitable lower limit) below the first lower limit. It should be understood that the threshold may include any suitable predefined range and may include any suitable format in addition to or other than those described herein.
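By way of illustrative example only, the threshold comparison described above may be sketched as follows. The function names, the 1% percentagewise margins, and the sample values are assumptions introduced for illustration and are not drawn from the specification itself.

```python
# Illustrative sketch: determine whether a prediction falls within a
# predefined range of a threshold, where the threshold is either a single
# value or a range with a first upper limit and a first lower limit.

def within_range(prediction: float, threshold: float, margin_pct: float = 1.0) -> bool:
    """True if the prediction lies between an upper limit above and a
    lower limit below a single-valued threshold (here, +/- margin_pct %)."""
    margin = abs(threshold) * margin_pct / 100.0
    return (threshold - margin) <= prediction <= (threshold + margin)

def within_banded_range(prediction: float, lower: float, upper: float,
                        margin_pct: float = 1.0) -> bool:
    """True if the prediction lies between a second lower limit below the
    first lower limit and a second upper limit above the first upper limit."""
    lower_margin = abs(lower) * margin_pct / 100.0
    upper_margin = abs(upper) * margin_pct / 100.0
    return (lower - lower_margin) <= prediction <= (upper + upper_margin)

# A predicted heart rate of 120.5 against a threshold of 120 with a 1% margin:
print(within_range(120.5, 120.0))              # True: inside 118.8-121.2
print(within_banded_range(59.5, 60.0, 100.0))  # True: inside 59.4-101.0
```

Because both limits are derived from the threshold itself, the same helper serves percentage-based margins; an absolute unit-of-measurement margin would substitute a fixed value for `margin`.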
[0447] If the systems and methods described herein determine that the prediction is within the range of the threshold, the systems and methods described herein may be configured to communicate with (e.g., or over or across) an interface, at a computing device of a healthcare provider, to provide the prediction and the treatment data. In some embodiments, the systems and methods described herein may be configured to generate treatment information using the treatment data. The treatment information may include a summary of the performance of the treatment plan by the user while using the treatment device. The summary may be formatted, such that the treatment data is presentable at a computing device of the healthcare provider. The systems and methods described herein may be configured to communicate the treatment information with the prediction and/or the treatment data, to the computing device of the healthcare provider. Alternatively, if the systems and methods described herein determine that the prediction is outside of the range of the threshold, the systems and methods described herein may be configured to update the treatment data pertaining to the user to indicate the prediction.
[0448] In some embodiments, the systems and methods described herein may, in response to determining that the prediction is within the range of the threshold, modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device based on the prediction.
[0449] In some embodiments, the systems and methods described herein may be configured to control, while the user uses the treatment device during a telemedicine session and based on a generated prediction, the treatment device. For example, the systems and methods described herein may control one or more characteristics of the treatment device based on the prediction and/or the treatment plan.
[0450] The healthcare provider may include a medical professional (e.g., such as a doctor, a nurse, a therapist, and the like), an exercise professional (e.g., such as a coach, a trainer, a nutritionist, and the like), or another professional sharing at least one of medical and exercise attributes (e.g., such as an exercise physiologist, a physical therapist, an occupational therapist, and the like). As used herein, and without limiting the foregoing, a "healthcare provider" may be a human being, a robot, a virtual assistant, a virtual assistant in a virtual and/or augmented reality, or an artificially intelligent entity, including a software program, integrated software and hardware, or hardware alone.
[0451] In some embodiments, the interface may include a graphical user interface configured to provide the treatment information and receive input from the healthcare provider. The interface may include one or more input fields, such as text input fields, dropdown selection input fields, radio button input fields, virtual switch input fields, virtual lever input fields, audio, haptic, tactile, biometric or otherwise activated and/or driven input fields, other suitable input fields, or a combination thereof.
[0452] In some embodiments, the healthcare provider may review the treatment information and/or the prediction. The healthcare provider may determine, based on the review of the treatment information and/or prediction, whether to modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device. For example, the healthcare provider may review the treatment information. The healthcare provider may, based on the review of the treatment information, compare the treatment information to the treatment plan being performed by the user.
[0453] The healthcare provider may compare the following: (i) the expected information, which pertains to the user while the user uses the treatment device to perform the treatment plan, to (ii) the prediction, which likewise pertains to the user while the user uses the treatment device to perform the treatment plan.
[0454] The expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof. The healthcare provider may determine that the treatment plan is having the desired effect if the prediction is within an acceptable range associated with one or more corresponding parts or portions of the expected information. Alternatively, the healthcare provider may determine that the treatment plan is not having the desired effect if the prediction is outside of the range associated with one or more corresponding parts or portions of the expected information.
[0455] For example, the healthcare provider may determine whether a blood pressure value indicated by the prediction (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, percentagewise, plus or minus 1 unit of measurement (or other suitable numerical value), or any suitable percentage-based or numerical range) of an expected blood pressure value indicated by the expected information. The healthcare provider may determine that the treatment plan is having the desired effect if the blood pressure value is within the range of the expected blood pressure value. Alternatively, the healthcare provider may determine that the treatment plan is not having the desired effect if the blood pressure value is outside of the range of the expected blood pressure value.
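The blood-pressure comparison described above may be sketched as follows. The expected values, the component names, and the 5% acceptance band are illustrative assumptions only.

```python
# Illustrative sketch: decide whether the treatment plan is having the
# desired effect by comparing predicted blood-pressure components against
# the expected information, each within an acceptable percentage band.

def pressure_in_range(actual: float, expected: float, tolerance_pct: float = 5.0) -> bool:
    """True if the predicted pressure is within plus-or-minus
    tolerance_pct percent of the expected pressure."""
    band = abs(expected) * tolerance_pct / 100.0
    return abs(actual - expected) <= band

def plan_effective(predicted: dict, expected: dict) -> bool:
    """True only if every expected component (e.g., systolic, diastolic)
    is matched by a predicted value within its acceptable range."""
    return all(pressure_in_range(predicted[k], expected[k]) for k in expected)

expected_bp = {"systolic": 120.0, "diastolic": 80.0}
print(plan_effective({"systolic": 124.0, "diastolic": 82.0}, expected_bp))  # True
print(plan_effective({"systolic": 140.0, "diastolic": 82.0}, expected_bp))  # False
```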
[0456] In some embodiments, while the user uses the treatment device to perform the treatment plan, the healthcare provider may compare the expected characteristics of the treatment device with characteristics of the treatment device indicated by the treatment information. For example, the healthcare provider may compare an expected resistance setting of the treatment device with an actual resistance setting of the treatment device indicated by the treatment information. The healthcare provider may determine that the user is performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are within a range of corresponding ones of the expected characteristics of the treatment device. Alternatively, the healthcare provider may determine that the user is not performing the treatment plan properly if the actual characteristics of the treatment device indicated by the treatment information are outside the range of corresponding ones of the expected characteristics of the treatment device.
[0457] If the healthcare provider determines that the prediction and/or the treatment information indicates that the user is performing the treatment plan properly and/or that the treatment plan is having the desired effect, the healthcare provider may determine not to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device. Alternatively, while the user uses the treatment device to perform the treatment plan, if the healthcare provider determines that the prediction and/or the treatment information indicates that the user is not or has not been performing the treatment plan properly and/or that the treatment plan is not or has not been having the desired effect, the healthcare provider may determine to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device.
[0458] In some embodiments, the healthcare provider may interact with the interface to provide treatment plan input indicating one or more modifications to the treatment plan and/or to modify one or more characteristics of the treatment device, if the healthcare provider determines to modify the at least one aspect of the treatment plan and/or to modify one or more characteristics of the treatment device. For example, the healthcare provider may use the interface to provide input indicating an increase or decrease in the resistance setting of the treatment device, or other suitable modification to the one or more characteristics of the treatment device. Additionally, or alternatively, the healthcare provider may use the interface to provide input indicating a modification to the treatment plan. For example, the healthcare provider may use the interface to provide input indicating an increase or decrease in an amount of time the user is required to use the treatment device according to the treatment plan, or other suitable modifications to the treatment plan.
[0459] In some embodiments, based on one or more modifications indicated by the treatment plan input, the systems and methods described herein may be configured to modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device.
[0460] In some embodiments, the systems and methods described herein may be configured to receive the subsequent treatment data pertaining to the user while the user uses the treatment device to perform the modified treatment plan. For example, after the healthcare provider provides input modifying the treatment plan and/or the one or more characteristics of the treatment device, and/or after the artificial intelligence engine modifies the treatment plan and/or one or more characteristics of the treatment device, the user may continue to use the treatment device to perform the modified treatment plan. The subsequent treatment data may correspond to treatment data generated while the user uses the treatment device to perform the modified treatment plan. In some embodiments, the subsequent treatment data may correspond to treatment data generated while the user continues to use the treatment device to perform the treatment plan, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or the one or more characteristics of the treatment device, and/or the artificial intelligence engine has determined not to modify the treatment plan and/or the one or more characteristics of the treatment device.
[0461] In some embodiments, the artificial intelligence engine may use the one or more machine learning models to generate one or more subsequent predictions based on the subsequent treatment data. The systems and methods described herein may determine whether a respective subsequent prediction is within a range of a corresponding threshold. The systems and methods described herein may, in response to a determination that the respective subsequent prediction is within the range of the threshold, communicate the subsequent treatment data, subsequent treatment information, and/or the prediction to the computing device of the healthcare provider. In some embodiments, based on the subsequent prediction, the systems and methods described herein may modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device.
[0462] In some embodiments, the systems and methods described herein may be configured to receive subsequent treatment plan input from the computing device of the healthcare provider. Based on the subsequent treatment plan input received from the computing device of the healthcare provider, the systems and methods described herein may be configured to further modify the treatment plan and/or to control the one or more characteristics of the treatment device. The subsequent treatment plan input may correspond to input provided by the healthcare provider, at the interface, in response to receiving and/or reviewing subsequent treatment information and/or the subsequent prediction corresponding to the subsequent treatment data. It should be understood that the systems and methods described herein may be configured to continuously and/or periodically generate predictions based on treatment data. The systems and methods described herein may be configured to provide treatment information to the computing device of the healthcare provider based on treatment data continuously and/or periodically received from the sensors or other suitable sources described herein. Additionally, or alternatively, the systems and methods described herein may be configured to continuously and/or periodically monitor, while the user uses the treatment device to perform the treatment plan, the characteristics of the user.
[0463] In some embodiments, the healthcare provider and/or the systems and methods described herein may receive and/or review, continuously or periodically, while the user uses the treatment device to perform the treatment plan, treatment information, treatment data, and/or predictions. Based on one or more trends indicated by the treatment information, treatment data, and/or predictions, the healthcare provider and/or the systems and methods described herein may determine whether to modify the treatment plan and/or to modify and/or to control the one or more characteristics of the treatment device. For example, the one or more trends
may indicate an increase in heartrate or other suitable trends indicating that the user is not performing the treatment plan properly and/or that performance of the treatment plan by the user is not having the desired effect.
[0464] In some embodiments, the systems and methods described herein may be configured to use artificial intelligence and/or machine learning to assign patients to cohorts and to dynamically control a treatment device based on the assignment during an adaptive telemedicine session. In some embodiments, one or more treatment devices may be provided to patients. The one or more treatment devices may be used by the patients to perform treatment plans in their residences, at a gym, at a rehabilitative center, at a hospital, at their work place, at a hotel, at a conference center, or in or at any suitable location, including permanent or temporary domiciles.
[0465] In some embodiments, the treatment devices may be communicatively coupled to a server. Characteristics of the patients, including the treatment data, may be collected before, during, and/or after the patients perform the treatment plans. For example, the personal information, the performance information, and the measurement information may be collected before, during, and/or after the person performs the treatment plans. The results (e.g., improved performance or decreased performance) of performing each exercise may be collected from the treatment device throughout the treatment plan and after the treatment plan is performed. The parameters, settings, configurations, etc. (e.g., position of pedal, amount of resistance, etc.) of the treatment device may be collected before, during, and/or after the treatment plan is performed.
[0466] Each characteristic of the patient, each result, and each parameter, setting, configuration, etc. may be timestamped and may be correlated with a particular step in the treatment plan. Such a technique may enable determining which steps in the treatment plan are more likely to lead to desired results (e.g., improved muscle strength, range of motion, etc.) and which steps are more likely to lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or harms recovery).
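The timestamping and correlation technique described above may be sketched as follows; the session schedule, field names, and step labels are invented for illustration.

```python
# Illustrative sketch: correlate a timestamped measurement with the
# particular step of the treatment plan that was active when it was taken.
import bisect
from datetime import datetime, timedelta

start = datetime(2024, 1, 1, 9, 0, 0)
# Treatment-plan steps with their start offsets (minutes from session start).
steps = [("warm-up", 0), ("pedal at resistance 3", 5), ("cool-down", 25)]
step_offsets = [offset for _, offset in steps]

def step_for(timestamp: datetime) -> str:
    """Return the treatment-plan step active at the given timestamp."""
    minutes = (timestamp - start).total_seconds() / 60.0
    idx = bisect.bisect_right(step_offsets, minutes) - 1
    return steps[max(idx, 0)][0]

# A heart-rate reading taken 12 minutes in falls in the second step.
print(step_for(start + timedelta(minutes=12)))  # pedal at resistance 3
```

Correlating each reading with its step in this way is what makes it possible to ask which steps precede improved results and which precede diminishing returns.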
[0467] Data may be collected from the treatment devices and/or any suitable computing device (e.g., computing devices where personal information is entered, such as the interface of the computing device described herein, a clinician interface, patient interface, and the like) over time as the patients use the treatment devices to perform the various treatment plans.
The data that may be collected may include the characteristics of the patients, the treatment plans performed by the patients, the results of the treatment plans, any of the data described herein, any other suitable data, or a combination thereof.
[0468] In some embodiments, the data may be processed to group certain people into cohorts. The people may be grouped by people having certain or selected similar characteristics, treatment plans, and results of performing the treatment plans. For example, athletic people having no medical conditions who perform a treatment plan (e.g., use the treatment device for 30 minutes a day 5 times a week for 3 weeks) and who fully recover may be grouped into a first cohort. Older people who are classified as obese and who perform a treatment plan (e.g., use the treatment device for 10 minutes a day 3 times a week for 4 weeks) and who improve their range of motion by 75 percent may be grouped into a second cohort.
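The two example cohorts above may be sketched as a simple rule-based grouping; the attribute names and cutoffs are assumptions for the sketch, not part of the specification.

```python
# Illustrative sketch: group people into the two example cohorts by
# selected characteristics (athleticism, medical conditions, age, BMI).

def assign_cohort(patient: dict) -> str:
    if patient["athletic"] and not patient["medical_conditions"]:
        return "cohort_1"   # e.g., 30 min/day, 5x/week, 3 weeks
    if patient["age"] >= 65 and patient["bmi"] >= 30:
        return "cohort_2"   # e.g., 10 min/day, 3x/week, 4 weeks
    return "unassigned"

print(assign_cohort({"athletic": True, "medical_conditions": [],
                     "age": 28, "bmi": 23}))   # cohort_1
print(assign_cohort({"athletic": False, "medical_conditions": ["hypertension"],
                     "age": 70, "bmi": 33}))   # cohort_2
```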
[0469] In some embodiments, an artificial intelligence engine may include one or more machine learning models that are trained using the cohorts. For example, the one or more machine learning models may be trained to receive an input of characteristics of a new patient and to output a treatment plan for the patient that results in a desired result. The machine learning models may match a pattern between the characteristics of the new patient and at least one patient of the patients included in a particular cohort. When a pattern is matched, the machine learning models may assign the new patient to the particular cohort and select the treatment plan associated with the at least one patient. The artificial intelligence engine may be configured to control, distally and based on the treatment plan, the treatment device while the new patient uses the treatment device to perform the treatment plan.
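A toy nearest-neighbour sketch of the pattern matching described above follows. A trained machine learning model would learn this matching from the cohort data; the features, cohort members, and plans here are purely illustrative.

```python
# Illustrative sketch: match a new patient's characteristics to the cohort
# of the closest known patient and select that cohort's treatment plan.
import math

# (age, bmi, resting_heart_rate) -> cohort label, with one plan per cohort.
cohort_members = [
    ((28.0, 23.0, 60.0), "cohort_1"),
    ((70.0, 33.0, 78.0), "cohort_2"),
]
plans = {"cohort_1": "30 min/day, 5x/week", "cohort_2": "10 min/day, 3x/week"}

def match_cohort(features):
    """Assign the new patient to the cohort of the nearest known patient
    (Euclidean distance) and return that cohort's treatment plan."""
    _, cohort = min(cohort_members, key=lambda m: math.dist(m[0], features))
    return cohort, plans[cohort]

print(match_cohort((31.0, 24.0, 62.0)))  # ('cohort_1', '30 min/day, 5x/week')
```

Once a plan is selected, the artificial intelligence engine may then control the treatment device distally in accordance with it, as described above.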
[0470] As may be appreciated, the characteristics of the new patient (e.g., a new user) may change as the new patient uses the treatment device to perform the treatment plan. For example, the performance of the patient may improve quicker than expected for people in the cohort to which the new patient is currently assigned. Accordingly, the machine learning models may be trained to dynamically reassign, based on the changed characteristics, the new patient to a different cohort that includes people having characteristics similar to the now-changed characteristics of the new patient. For example, a clinically obese patient may lose weight and no longer meet the weight criterion for the initial cohort, resulting in the patient being reassigned to a different cohort with a different weight criterion.
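The weight-based reassignment example above may be sketched as follows; the cohort names and the 100 kg cutoff are assumptions for illustration.

```python
# Illustrative sketch: when a changed characteristic (here, weight) no
# longer satisfies the current cohort's criterion, reassign the patient to
# a cohort whose criterion the new value does satisfy.

COHORT_CRITERIA = {
    "cohort_obese": lambda w: w >= 100.0,     # weight in kg
    "cohort_standard": lambda w: w < 100.0,
}

def reassign(current_cohort: str, weight: float) -> str:
    """Keep the patient in place while the criterion still holds;
    otherwise move them to the first cohort whose criterion matches."""
    if COHORT_CRITERIA[current_cohort](weight):
        return current_cohort
    for cohort, criterion in COHORT_CRITERIA.items():
        if criterion(weight):
            return cohort
    return current_cohort

print(reassign("cohort_obese", 95.0))   # cohort_standard
print(reassign("cohort_obese", 105.0))  # cohort_obese
```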
[0471] A different treatment plan may be selected for the new patient, and the treatment
device may be controlled, distally (e.g., which may be referred to as remotely) and based on the different treatment plan, while the new patient uses the treatment device to perform the treatment plan. Such techniques may provide the technical solution of distally controlling a treatment device.
[0472] Further, the systems and methods described herein may lead to faster recovery times and/or better results for the patients because the treatment plan that most accurately fits their characteristics is selected and implemented, in real-time, at any given moment. "Real-time" may also refer to near real-time, which may be less than 10 seconds. As described herein, the term "results" may refer to medical results or medical outcomes. Results and outcomes may refer to responses to medical actions.
[0473] Depending on what result is desired, the artificial intelligence engine may be trained to output several treatment plans. For example, one result may include recovering to a threshold level (e.g., 75% range of motion) in a fastest amount of time, while another result may include fully recovering (e.g., 100% range of motion) regardless of the amount of time. The data obtained from the patients and sorted into cohorts may indicate that a first treatment plan provides the first result for people with characteristics similar to the patient's, and that a second treatment plan provides the second result for people with characteristics similar to the patient's.
[0474] Further, the artificial intelligence engine may be trained to output treatment plans that are not optimal, i.e., sub-optimal, nonstandard, or otherwise excluded (all referred to, without limitation, as "excluded treatment plans") for the patient. For example, if a patient has high blood pressure, a particular exercise may not be approved or suitable for the patient as it may put the patient at unnecessary risk or even induce a hypertensive crisis and, accordingly, that exercise may be flagged in the excluded treatment plan for the patient. In some embodiments, the artificial intelligence engine may monitor the treatment data received while the patient (e.g., the user) with, for example, high blood pressure, uses the treatment device to perform an appropriate treatment plan and may modify the appropriate treatment plan to include features of an excluded treatment plan that may provide beneficial results for the patient if the treatment data indicates the patient is handling the appropriate treatment plan without aggravating, for example, the high blood pressure condition of the patient.
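The exclusion flagging described above may be sketched as a simple rule table; the condition names, exercise names, and rules are invented for illustration.

```python
# Illustrative sketch: partition a candidate exercise list into approved
# exercises and exercises flagged in the excluded treatment plan for a
# patient with a given condition (e.g., high blood pressure).

EXCLUSION_RULES = {
    "high_blood_pressure": {"heavy resistance", "isometric hold"},
}

def split_plan(exercises, conditions):
    """Return (approved, excluded) exercise lists for the patient."""
    excluded_set = set()
    for condition in conditions:
        excluded_set |= EXCLUSION_RULES.get(condition, set())
    approved = [e for e in exercises if e not in excluded_set]
    excluded = [e for e in exercises if e in excluded_set]
    return approved, excluded

exercises = ["light pedaling", "heavy resistance", "stretching"]
print(split_plan(exercises, ["high_blood_pressure"]))
# (['light pedaling', 'stretching'], ['heavy resistance'])
```

If later treatment data shows the patient tolerating the plan well, features could be moved back from the excluded list to the approved list, mirroring the monitoring behaviour described above.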
[0475] In some embodiments, the treatment plans and/or excluded treatment plans may be
presented, during a telemedicine or telehealth session, to a healthcare provider. The healthcare provider may select a particular treatment plan for the patient to cause that treatment plan to be transmitted to the patient and/or to control, based on the treatment plan, the treatment device. In some embodiments, to facilitate telehealth or telemedicine applications, including remote diagnoses, determination of treatment plans and rehabilitative and/or pharmacologic prescriptions, the artificial intelligence engine may receive and/or operate distally from the patient and the treatment device.
[0476] In such cases, the recommended treatment plans and/or excluded treatment plans may be presented simultaneously with a video of the patient in real-time or near real-time during a telemedicine or telehealth session on a user interface of a computing device of a healthcare provider. The video may also be accompanied by audio, text and other multimedia information. Real-time may refer to less than or equal to 2 seconds. Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface and will generally be less than 10 seconds but greater than 2 seconds.
[0477] Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the healthcare provider may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface. The enhanced user interface may improve the healthcare provider's experience using the computing device and may encourage the healthcare provider to reuse the user interface. Such a technique may also reduce computing resources (e.g., processing, memory, network) because the healthcare provider does not have to switch to another user interface screen to enter a query for a treatment plan to recommend based on the characteristics of the patient. The artificial intelligence engine may be configured to provide, dynamically on the fly, the treatment plans and excluded treatment plans.
[0478] In some embodiments, the treatment device may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient. For example, the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user. In some embodiments, a healthcare provider may adapt, remotely during a telemedicine session, the treatment device to the needs of the patient by causing a control instruction to be transmitted from a server to the treatment device. Such adaptive nature may improve the results of recovery for a patient, furthering the goals of personalized medicine, and enabling personalization of the treatment plan on a per-individual basis.
[0479] A technical problem may occur which relates to the information pertaining to the patient's medical condition being received in disparate formats. For example, a server may receive the information pertaining to a medical condition of the patient from one or more sources (e.g., from an electronic medical record (EMR) system, application programming interface (API), or any suitable system that has information pertaining to the medical condition of the patient). That is, some sources used by various healthcare providers may be installed on their local computing devices and may use proprietary formats. Accordingly, some embodiments of the present disclosure may use an API to obtain, via interfaces exposed by APIs used by the sources, the formats used by the sources. In some embodiments, when information is received from the sources, the API may map, translate and/or convert the format used by the sources to a standardized format used by the artificial intelligence engine. Further, the information mapped, translated and/or converted to the standardized format used by the artificial intelligence engine may be stored in a database accessed by the artificial intelligence engine when performing any of the techniques disclosed herein. Using the information mapped, translated and/or converted to a standardized format may enable the more accurate determination of the procedures to perform for the patient and/or a billing sequence.
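The mapping of disparate source formats to a standardized format may be sketched as follows. Both source formats and the standardized field names are invented for illustration; a real implementation would obtain the source formats via the interfaces exposed by the sources' APIs.

```python
# Illustrative sketch: translate records from differently formatted EMR
# sources into the single standardized format used by the artificial
# intelligence engine.

# Per-source field mappings: source field name -> standardized field name.
FIELD_MAPS = {
    "emr_a": {"pt_name": "patient_name", "dx": "diagnosis"},
    "emr_b": {"PatientFullName": "patient_name", "DiagnosisCode": "diagnosis"},
}

def standardize(source: str, record: dict) -> dict:
    """Map a source record into the standardized format, dropping any
    fields the mapping does not cover."""
    mapping = FIELD_MAPS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

print(standardize("emr_a", {"pt_name": "J. Doe", "dx": "M17.11"}))
# {'patient_name': 'J. Doe', 'diagnosis': 'M17.11'}
```

Records standardized in this way can then be stored in the database accessed by the artificial intelligence engine and consumed uniformly when generating treatment plans and billing sequences.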
[0480] To that end, the standardized information may enable generating treatment plans and/or billing sequences having a particular format that can be processed by various applications (e.g., telehealth). For example, applications, such as telehealth applications, may be executing on various computing devices of medical professionals and/or patients. The applications (e.g., standalone or web-based) may be provided by a server and may be configured to process data according to a format in which the treatment plans and the billing sequences are implemented. Accordingly, the disclosed embodiments may provide a technical solution by (i) receiving, from various sources (e.g., EMR systems), information in non-standardized and/or different formats; (ii) standardizing the information; and (iii) generating, based on the standardized information, treatment plans and billing sequences
having standardized formats capable of being processed by applications (e.g., telehealth applications) executing on computing devices of medical professionals and/or patients.
[0481] FIG. 23 generally illustrates a block diagram of a computer-implemented system 3010, hereinafter called "the system," for managing a treatment plan. Managing the treatment plan may include using an artificial intelligence engine to recommend treatment plans and/or provide excluded treatment plans that should not be recommended to a patient.
[0482] The system 3010 also includes a server 3030 configured to store (e.g., write to an associated memory) and to provide data related to managing the treatment plan. The server 3030 may include one or more computers and may take the form of a distributed and/or virtualized computer or computers. The server 3030 also includes a first communication interface 3032 configured to communicate with (e.g., or over) the clinician interface 3020 via a first network 3034. In some embodiments, the first network 3034 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. The server 3030 includes a first processor 3036 and a first machine-readable storage memory 3038, which may be called a "memory" for short, holding first instructions 3040 for performing the various actions of the server 3030 for execution by the first processor 3036.
[0483] The server 3030 is configured to store data regarding the treatment plan. For example, the memory 3038 includes a system data store 3042 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients. The server 3030 is also configured to store data regarding performance by a patient in following a treatment plan. For example, the memory 3038 includes a patient data store 3044 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient's performance within the treatment plan.
[0484] Additionally, or alternatively, the characteristics (e.g., personal, performance, measurement, etc.) of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may use correlations and other statistical or probabilistic measures to enable the partitioning of or to partition the treatment plans into different patient cohort-equivalent databases in the patient data store 3044. For example, the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patient, and a first result of the treatment plan may be stored in a first patient database. The data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patient, and a second result of the treatment plan may be stored in a second patient database. Any single characteristic or any combination of characteristics may be used to separate the cohorts of patients. In some embodiments, the different cohorts of patients may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different cohorts of patients allowed, other than as limited by mathematical combinatoric and/or partition theory.
[0485] This characteristic data, treatment plan data, and results data may be obtained from numerous treatment devices and/or computing devices over time and stored in the database 3044. The characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 3044. The characteristics of the people may include personal information, performance information, and/or measurement information.
[0486] In addition to the historical information about other people stored in the patient cohort-equivalent databases, real-time or near-real-time information about a current patient being treated may be stored, based on the current patient's characteristics, in an appropriate patient cohort-equivalent database. The characteristics of the patient may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A) and the patient may be assigned to that cohort.
[0487] In some embodiments, the server 3030 may execute an artificial intelligence (AI) engine 3011 that uses one or more machine learning models 3013 to perform at least one of the embodiments disclosed herein. The server 3030 may include a training engine 309 capable of generating the one or more machine learning models 3013. The machine learning models 3013 may be trained to assign people to certain cohorts based on their characteristics, select treatment plans using real-time and historical data correlations involving patient cohort equivalents, and control a treatment device 3070, among other things.
[0488] The one or more machine learning models 3013 may be generated by the training engine 309 and may be implemented in computer instructions executable by one or more processing devices of the training engine 309 and/or the servers 3030. To generate the one or more machine learning models 3013, the training engine 309 may train the one or more machine learning models 3013. The one or more machine learning models 3013 may be used by the artificial intelligence engine 3011.
[0489] The training engine 309 may be a rackmount server, a router computer, a personal computer, a personal digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other suitable computing device, or a combination thereof. The training engine 309 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
[0490] To train the one or more machine learning models 3013, the training engine 309 may use a training data set of a corpus of the characteristics of the people that used the treatment device 3070 to perform treatment plans, the details (e.g., treatment protocol including exercises, amount of time to perform the exercises, how often to perform the exercises, a schedule of exercises, parameters/configurations/settings of the treatment device 3070 throughout each step of the treatment plan, etc.) of the treatment plans performed by the people using the treatment device 3070, and the results of the treatment plans performed by the people. The one or more machine learning models 3013 may be trained to match patterns of characteristics of a patient with characteristics of other people assigned to a particular cohort. The term "match" may refer to an exact match, a correlative match, a substantial match, etc. The one or more machine learning models 3013 may be trained to receive the characteristics of a patient as input, map the characteristics to characteristics of people assigned to a cohort, and select a treatment plan from that cohort. The one or more machine learning models 3013 may also be trained to control, based on the treatment plan, the treatment device 3070.
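The trained matching behavior described in this paragraph can be sketched as below. This is a hedged, illustrative stand-in: the "match" here is a simple characteristic-overlap count rather than a learned correlative match, and all cohort names, field names, and scores are hypothetical, not taken from the patent.

```python
# Illustrative sketch: map a patient's characteristics to the most similar
# cohort, then select that cohort's best-performing treatment plan. The
# overlap count stands in for a trained machine learning model 3013.

def assign_cohort(patient, cohorts):
    """Return the name of the cohort whose representative characteristics overlap most."""
    def overlap(representative):
        return sum(1 for k, v in patient.items() if representative.get(k) == v)
    return max(cohorts, key=lambda name: overlap(cohorts[name]["representative"]))

def select_plan(cohort):
    """Pick the treatment plan with the best recorded result score within the cohort."""
    return max(cohort["plans"], key=lambda plan: plan["result_score"])

cohorts = {
    "A": {"representative": {"injury": "knee", "age_band": "60s"},
          "plans": [{"name": "plan-1", "result_score": 0.7},
                    {"name": "plan-2", "result_score": 0.9}]},
    "B": {"representative": {"injury": "shoulder", "age_band": "40s"},
          "plans": [{"name": "plan-3", "result_score": 0.8}]},
}
patient = {"injury": "knee", "age_band": "60s"}
cohort_name = assign_cohort(patient, cohorts)      # patient maps to cohort "A"
plan = select_plan(cohorts[cohort_name])           # cohort A's best plan is selected
```

A real model would learn this mapping from the training corpus of characteristics, treatment-plan details, and results described above, rather than using a hand-written overlap score.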
[0491] Different machine learning models 3013 may be trained to recommend different treatment plans for different desired results. For example, one machine learning model may be trained to recommend treatment plans for most effective recovery, while another machine learning model may be trained to recommend treatment plans based on speed of recovery.
[0492] The one or more machine learning models 3013 may refer to model artifacts created by the training engine 309 using training data that includes training inputs and corresponding target outputs. The training engine 309 may find patterns in the training data wherein such patterns map the training input to the target output, and may generate the machine learning models 3013 that capture these patterns. In some embodiments, the artificial intelligence engine 3011 and/or the training engine 309 may reside on another component (e.g., assistant interface 3094, clinician interface 3020, etc.) depicted in FIG. 23.
[0493] The one or more machine learning models 3013 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 3013 may be a deep network, i.e., a machine learning model comprising more than one level (e.g., multiple levels) of non-linear operations. Examples of deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself). For example, the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
[0494] The system 3010 also includes a patient interface 3050 configured to communicate information to a patient and to receive feedback from the patient. Specifically, the patient interface includes an input device 3052 and an output device 3054, which may be collectively called a patient user interface 3052, 3054. The input device 3052 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The output device 3054 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch. The output device 3054 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc. The output device 3054 may incorporate various different visual, audio, or other presentation technologies. For example, the output device 3054 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions. The output device 3054 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient. The output device 3054 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0495] As is generally illustrated in FIG. 23, the patient interface 3050 includes a second communication interface 3056, which may also be called a remote communication interface configured to communicate with the server 3030 and/or the clinician interface 3020 via a second network 3058. In some embodiments, the second network 3058 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the second network 3058 may include the Internet, and communications between the patient interface 3050 and the server 3030 and/or the clinician interface 3020 may be secured via encryption, such as, for example, by using a virtual private network (VPN). In some embodiments, the second network 3058 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. In some embodiments, the second network 3058 may be the same as and/or operationally coupled to the first network 3034.
[0496] The patient interface 3050 includes a second processor 3060 and a second machine readable storage memory 3062 holding second instructions 3064 for execution by the second processor 3060 for performing various actions of patient interface 3050. The second machine readable storage memory 3062 also includes a local data store 3066 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient's performance within a treatment plan. The patient interface 3050 also includes a local communication interface 3068 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 3050. The local communication interface 3068 may include wired and/or wireless communications. In some embodiments, the local communication interface 3068 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
[0497] The system 3010 also includes a treatment device 3070 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan. In some embodiments, the treatment device 3070 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group. The treatment device 3070 may be any suitable medical, rehabilitative, therapeutic, etc. apparatus configured to be controlled distally via another computing device to treat a patient and/or exercise the patient. The treatment device 3070 may be an electromechanical machine including one or more weights, an
electromechanical bicycle, an electromechanical spin-wheel, a smart-mirror, a treadmill, or the like. The body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder. The body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament. As is generally illustrated in FIG. 23, the treatment device 3070 includes a controller 3072, which may include one or more processors, computer memory, and/or other components. The treatment device 3070 also includes a fourth communication interface 3074 configured to communicate with (e.g., or over or across) the patient interface 3050 via the local communication interface 3068. The treatment device 3070 also includes one or more internal sensors 3076 and an actuator 3078, such as a motor. The actuator 3078 may be used, for example, for moving the patient's body part and/or for resisting forces by the patient.
[0498] The internal sensors 3076 may measure one or more operating characteristics of the treatment device 3070 such as, for example, a force, a position, a speed, a velocity, and/or an acceleration. In some embodiments, the internal sensors 3076 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient. For example, an internal sensor 3076 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment device 3070, where such distance may correspond to a range of motion that the patient's body part is able to achieve. In some embodiments, the internal sensors 3076 may include a force sensor configured to measure a force applied by the patient. For example, an internal sensor 3076 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment device 3070.
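The position-sensor example above, distance moved corresponding to achieved range of motion, can be sketched as follows. This is an assumed interpretation for illustration; the patent does not specify units or a computation.

```python
# Illustrative sketch (assumed units and computation): interpret sampled
# position-sensor readings from an internal sensor 3076 as the range of
# motion the patient's body part was able to achieve.

def range_of_motion(positions_cm):
    """Achieved range of motion = maximum travel minus minimum travel."""
    return max(positions_cm) - min(positions_cm)

positions = [2.0, 7.5, 12.0, 6.0]   # linear positions sampled over one session
rom = range_of_motion(positions)    # 10.0 cm of travel achieved
```

A force sensor reading could be handled analogously, e.g., by recording the peak force the patient applied with a particular body part during the session.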
[0499] The system 3010 generally illustrated in FIG. 23 also includes an ambulation sensor 3082, which communicates with the server 3030 via the local communication interface 3068 of the patient interface 3050. The ambulation sensor 3082 may track and store a number of steps taken by the patient. In some embodiments, the ambulation sensor 3082 may take the form of a wristband, wristwatch, or smart watch. In some embodiments, the ambulation sensor 3082 may be integrated within a phone, such as a smartphone.
[0500] The system 3010 generally illustrated in FIG. 23 also includes a goniometer 3084, which communicates with the server 3030 via the local communication interface 3068 of the patient interface 3050. The goniometer 3084 measures an angle of the patient's body part.
For example, the goniometer 3084 may measure the angle of flex of a patient's knee or elbow or shoulder.
[0501] The system 3010 may also include one or more additional sensors (not shown) which communicate with the server 3030 via the local communication interface 3068 of the patient interface 3050. The one or more additional sensors can measure other patient parameters such as a heartrate, a temperature, a blood pressure, a glucose level, the level of another biomarker, one or more vital signs, and the like. For example, the one or more additional sensors may be optical sensors that detect the reflection of near-infrared light from circulating blood below the level of the skin. Optical sensors may take the form of a wristband, wristwatch, or smartwatch and measure a glucose level, a heartrate, a blood oxygen saturation level, one or more vital signs, and the like.
[0502] In some embodiments, the one or more additional sensors may be located in a room or physical space in which the treatment device 3070 is being used, inside the patient's body, disposed on the person's body (e.g., skin patch), or included in the treatment device 3070, and the one or more additional sensors may measure various vital signs or other diagnostically-relevant attributes (e.g., heartrate, perspiration rate, temperature, blood pressure, oxygen levels, any suitable vital sign, glucose level, a level of another biomarker, etc.). The one or more additional sensors may transmit the measurements of the patient to the server 3030 for analysis and processing (e.g., to be used to modify, based on the measurements, at least the treatment plan for the patient).
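The measurement-driven modification described above can be sketched as below. The protocol, field names, and safe ranges are assumptions for illustration; the patent only states that measurements are transmitted to the server 3030 for analysis and possible treatment-plan modification.

```python
# Illustrative sketch (assumed thresholds and field names): flag vital-sign
# measurements that fall outside a clinician-set range, so the server can
# modify the treatment plan for the patient accordingly.

SAFE_RANGES = {"heartrate": (50, 140), "systolic_bp": (90, 160)}  # assumed limits

def flag_out_of_range(measurements, ranges=SAFE_RANGES):
    """Return the names of measurements whose values fall outside their safe range."""
    flagged = []
    for name, value in measurements.items():
        low, high = ranges.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            flagged.append(name)
    return flagged

reading = {"heartrate": 155, "systolic_bp": 128}
flags = flag_out_of_range(reading)   # heartrate exceeds the assumed upper limit
```

In a deployed system, a flagged measurement would be one input among many to the server-side analysis, not an automatic trigger.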
[0503] The system 3010 generally illustrated in FIG. 23 also includes a pressure sensor 3086, which communicates with the server 3030 via the local communication interface 3068 of the patient interface 3050. The pressure sensor 3086 measures an amount of pressure or weight applied by a body part of the patient. For example, the pressure sensor 3086 may measure an amount of force applied by a patient's foot when pedaling a stationary bike.
[0504] The system 3010 generally illustrated in FIG. 23 also includes a supervisory interface 3090 which may be similar or identical to the clinician interface 3020. In some embodiments, the supervisory interface 3090 may have enhanced functionality beyond what is provided on the clinician interface 3020. The supervisory interface 3090 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
[0505] The system 3010 generally illustrated in FIG. 23 also includes a reporting interface 3092 which may be similar or identical to the clinician interface 3020. In some embodiments, the reporting interface 3092 may have less functionality than what is provided on the clinician interface 3020. For example, the reporting interface 3092 may not have the ability to modify a treatment plan. Such a reporting interface 3092 may be used, for example, by a biller to determine the use of the system 3010 for billing purposes. In another example, the reporting interface 3092 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject. Such a reporting interface 3092 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
[0506] The system 3010 includes an assistant interface 3094 for a healthcare provider, such as those described herein, to remotely communicate with (e.g., or over or across) the patient interface 3050 and/or the treatment device 3070. Such remote communications may enable the healthcare provider to provide assistance or guidance to a patient using the system 3010. More specifically, the assistant interface 3094 is configured to communicate a telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b with the patient interface 3050 via a network connection such as, for example, via the first network 3034 and/or the second network 3058. The telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b comprises one of an audio signal 3096, an audiovisual signal 3097, an interface control signal 3098a for controlling a function of the patient interface 3050, an interface monitor signal 3098b for monitoring a status of the patient interface 3050, an apparatus control signal 3099a for changing an operating parameter of the treatment device 3070, and/or an apparatus monitor signal 3099b for monitoring a status of the treatment device 3070. In some embodiments, each of the control signals 3098a, 3099a may be unidirectional, conveying commands from the assistant interface 3094 to the patient interface 3050. In some embodiments, in response to successfully receiving a control signal 3098a, 3099a and/or to communicate successful and/or unsuccessful implementation of the requested control action, an acknowledgement message may be sent from the patient interface 3050 to the assistant interface 3094. In some embodiments, each of the monitor signals 3098b, 3099b may be unidirectional, conveying status information from the patient interface 3050 to the assistant interface 3094. In some embodiments, an acknowledgement message may be sent from the assistant interface 3094 to the patient interface 3050 in response to successfully receiving one of the monitor signals 3098b, 3099b.
[0507] In some embodiments, the patient interface 3050 may be configured as a pass through for the apparatus control signals 3099a and the apparatus monitor signals 3099b between the treatment device 3070 and one or more other devices, such as the assistant interface 3094 and/or the server 3030. For example, the patient interface 3050 may be configured to transmit an apparatus control signal 3099a in response to an apparatus control signal 3099a within the telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b from the assistant interface 3094.
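The pass-through behavior of this paragraph can be sketched as below. The message shapes and acknowledgement format are hypothetical; the patent specifies only that the patient interface 3050 relays apparatus control signals 3099a onward to the treatment device 3070.

```python
# Illustrative sketch (hypothetical message shapes): the patient interface
# relays an apparatus control signal (3099a) embedded in a telemedicine
# signal onward to the treatment device, and returns an acknowledgement.

def relay_control_signal(telemedicine_signal, send_to_device):
    """Forward any embedded apparatus control signal to the device; acknowledge the result."""
    control = telemedicine_signal.get("apparatus_control")
    if control is None:
        return {"ack": False, "reason": "no apparatus control signal present"}
    send_to_device(control)
    return {"ack": True}

forwarded = []   # stands in for the local communication interface 3068
ack = relay_control_signal(
    {"apparatus_control": {"parameter": "resistance", "value": 7}},
    forwarded.append,
)
```

The acknowledgement mirrors the paragraph [0506] behavior, where the patient interface confirms successful or unsuccessful implementation of a requested control action.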
[0508] In some embodiments, the assistant interface 3094 may be presented on the same physical device as the clinician interface 3020. For example, the clinician interface 3020 may include one or more screens that implement the assistant interface 3094. Alternatively or additionally, the clinician interface 3020 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 3094.
[0509] In some embodiments, one or more portions of the telemedicine signal 3096, 3097, 3098a, 3098b, 3099a, 3099b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 3054 of the patient interface 3050. For example, a tutorial video may be streamed from the server 3030 and presented upon the patient interface 3050. Content from the prerecorded source may be requested by the patient via the patient interface 3050. Alternatively, via a control on the assistant interface 3094, the healthcare provider may cause content from the prerecorded source to be played on the patient interface 3050.
[0510] The assistant interface 3094 includes an assistant input device 3022 and an assistant display 3024, which may be collectively called an assistant user interface 3022, 3024. The assistant input device 3022 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example. Alternatively or additionally, the assistant input device 3022 may include one or more microphones. In some embodiments, the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the healthcare provider to speak to a patient via the patient interface 3050. In some embodiments, the assistant input device 3022 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the healthcare provider by using the one or more microphones. The assistant input device 3022 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung. The assistant input device 3022 may include other hardware and/or software components. The assistant input device 3022 may include one or more general-purpose devices and/or special-purpose devices.
[0511] The assistant display 3024 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch. The assistant display 3024 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc. The assistant display 3024 may incorporate various different visual, audio, or other presentation technologies. For example, the assistant display 3024 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions. The assistant display 3024 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the healthcare provider. The assistant display 3024 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0512] In some embodiments, the system 3010 may provide computer translation of language from the assistant interface 3094 to the patient interface 3050 and/or vice-versa. The computer translation of language may include computer translation of spoken language and/or computer translation of text. Additionally or alternatively, the system 3010 may provide voice recognition and/or spoken pronunciation of text. For example, the system 3010 may convert spoken words to printed text and/or the system 3010 may audibly speak language from printed text. The system 3010 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the healthcare provider. In some embodiments, the system 3010 may be configured to recognize and react to spoken requests or commands by the patient. For example, the system 3010 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
[0513] In some embodiments, the server 3030 may generate aspects of the assistant display 3024 for presentation by the assistant interface 3094. For example, the server 3030 may include a web server configured to generate the display screens for presentation upon the assistant display 3024. For example, the artificial intelligence engine 3011 may generate recommended treatment plans and/or excluded treatment plans for patients and generate the display screens including those recommended treatment plans and/or excluded treatment plans for presentation on the assistant display 3024 of the assistant interface 3094. In some embodiments, the assistant display 3024 may be configured to present a virtualized desktop hosted by the server 3030. In some embodiments, the server 3030 may be configured to communicate with (e.g., or over) the assistant interface 3094 via the first network 3034. In some embodiments, the first network 3034 may include a local area network (LAN), such as an Ethernet network.
[0514] In some embodiments, the first network 3034 may include the Internet, and communications between the server 3030 and the assistant interface 3094 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN). Alternatively or additionally, the server 3030 may be configured to communicate with (e.g., or over or across) the assistant interface 3094 via one or more networks independent of the first network 3034 and/or other communication means, such as a direct wired or wireless communication channel. In some embodiments, the patient interface 3050 and the treatment device 3070 may each operate from a patient location geographically separate from a location of the assistant interface 3094. For example, the patient interface 3050 and the treatment device 3070 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 3094 at a centralized location, such as a clinic or a call center.
[0515] In some embodiments, the assistant interface 3094 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians' offices. In some embodiments, a plurality of assistant interfaces 3094 may be distributed geographically. In some embodiments, a person may work as a healthcare provider remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 3094 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part time and/or flexible work hours for a healthcare provider.
[0516] FIGS. 24-25 show an embodiment of a treatment device 3070. More specifically, FIG. 24 generally illustrates a treatment device 3070 in the form of a stationary cycling machine 3100, which may be called a stationary bike, for short. The stationary cycling machine 3100 includes a set of pedals 3102 each attached to a pedal arm 3104 for rotation about an axle 3106. In some embodiments, and as is generally illustrated in FIG. 24, the pedals 3102 are movable on the pedal arms 3104 in order to adjust a range of motion used by the patient in pedaling. For example, the pedals being located inwardly toward the axle 3106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 3106. In some embodiments, the pedals may be adjustable inward and outward from the plane of rotation. Such techniques may enable increasing and decreasing a width of the patient's legs as they pedal. A pressure sensor 3086 is attached to or embedded within one of the pedals 3102 for measuring an amount of force applied by the patient on the pedal 3102. The pressure sensor 3086 may communicate wirelessly to the treatment device 3070 and/or to the patient interface 3050.
[0517] FIG. 26 generally illustrates a person (a patient) using the treatment device of FIG. 24, and showing sensors and various data parameters connected to a patient interface 3050. The example patient interface 3050 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient. In some other embodiments, the patient interface 3050 may be embedded within or attached to the treatment device 3070.
[0518] FIG. 26 generally illustrates the patient wearing the ambulation sensor 3082 on his wrist, with a note showing "STEPS TODAY 31355", indicating that the ambulation sensor 3082 has recorded and transmitted that step count to the patient interface 3050. FIG. 26 also generally illustrates the patient wearing the goniometer 3084 on his right knee, with a note showing "KNEE ANGLE 72°", indicating that the goniometer 3084 is measuring and transmitting that knee angle to the patient interface 3050. FIG. 26 also generally illustrates a right side of one of the pedals 3102 with a pressure sensor 3086 showing "FORCE 12.5 lbs.," indicating that the right pedal pressure sensor 3086 is measuring and transmitting that force measurement to the patient interface 3050.
[0519] FIG. 26 also generally illustrates a left side of one of the pedals 3102 with a pressure sensor 3086 showing "FORCE 27 lbs.", indicating that the left pedal pressure sensor 3086 is measuring and transmitting that force measurement to the patient interface 3050. FIG. 26 also generally illustrates other patient data, such as an indicator of "SESSION TIME 0:04:13", indicating that the patient has been using the treatment device 3070 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 3050 based on information received from the treatment device 3070. FIG. 26 also generally illustrates an indicator showing "PAIN LEVEL 3". Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 3050.
[0520] FIG. 27 is an example embodiment of an overview display 3120 of the assistant interface 3094. Specifically, the overview display 3120 presents several different controls and interfaces for the healthcare provider to remotely assist a patient with using the patient interface 3050 and/or the treatment device 3070. This remote assistance functionality may also be called telemedicine or telehealth.
[0521] Specifically, the overview display 3120 includes a patient profile display 3130 presenting biographical information regarding a patient using the treatment device 3070. The patient profile display 3130 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27, although the patient profile display 3130 may take other forms, such as a separate screen or a popup window.
[0522] In some embodiments, the patient profile display 3130 may include a limited subset of the patient's biographical information. More specifically, the data presented upon the patient profile display 3130 may depend upon the healthcare provider's need for that information. For example, a healthcare provider that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment device 3070 may be provided with a much more limited set of information regarding the patient. The technician, for example, may be given only the patient's name.
[0523] The patient profile display 3130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements. Such privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a "data subject".
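A minimal sketch of the pseudonymization described above follows. The field names, the salt, and the coarsening rule are illustrative assumptions; actual HIPAA or GDPR compliance requires considerably more than this (key management, re-identification risk analysis, and so on).

```python
# Illustrative sketch only: replace a direct identifier with a stable
# pseudonym and coarsen a quasi-identifier, as the patient profile
# display 3130 might for confidentiality. Salt and fields are assumed.
import hashlib

def pseudonymize(record, salt="example-salt"):
    out = dict(record)
    token = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:12]
    out["name"] = f"patient-{token}"               # direct identifier replaced
    out["age"] = f"{(record['age'] // 10) * 10}s"  # quasi-identifier coarsened to a decade
    return out

masked = pseudonymize({"name": "Jane Doe", "age": 67, "knee_angle": 72})
# Clinical fields such as knee_angle pass through unchanged.
```

A salted hash gives a stable pseudonym so that records for the same data subject remain linkable for research use, while coarsening the quasi-identifier reduces re-identification risk.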
[0524] In some embodiments, the patient profile display 3130 may present information regarding the treatment plan for the patient to follow in using the treatment device 3070. Such treatment plan information may be limited to a healthcare provider. For example, a healthcare provider assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment device 3070 may not be provided with any information regarding the patient's treatment plan.
[0525] In some embodiments, one or more recommended treatment plans and/or excluded treatment plans may be presented in the patient profile display 3130 to the healthcare provider. The one or more recommended treatment plans and/or excluded treatment plans may be generated by the artificial intelligence engine 3011 of the server 3030 and received from the server 3030 in real-time during, inter alia, a telemedicine or telehealth session. An example of presenting the one or more recommended treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 29.
[0526] The example overview display 3120 generally illustrated in FIG. 27 also includes a patient status display 3134 presenting status information regarding a patient using the treatment device. The patient status display 3134 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27, although the patient status display 3134 may take other forms, such as a separate screen or a popup window.
[0527] The patient status display 3134 includes sensor data 3136 from one or more of the external sensors 3082, 3084, 3086, and/or from one or more internal sensors 3076 of the treatment device 3070 and/or one or more additional sensors (not shown) as has been previously described herein. In some embodiments, the patient status display 3134 may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 3070. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like. The one or more wearable devices may be configured to monitor a heartrate, a temperature, a blood pressure, a glucose level, a blood oxygen saturation level, one or more vital signs, and the like of the patient while the patient is using the treatment device 3070. In some embodiments, the patient status display 3134 may present other data 3138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
[0528] User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 3020, 3050, 3090, 3092, 3094 of the system 3010. In some embodiments, user access controls may be employed to control what information is available to any given person using the system 3010. For example, data presented on the assistant interface 3094 may be controlled by user access controls, with permissions set depending on the healthcare provider/user's need for and/or qualifications to view that information.
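The role-dependent visibility described above can be sketched as a simple field-level filter; the role names and field sets below are hypothetical examples (e.g., the technician seeing only the patient's name), not the system's actual access-control scheme:

```python
# Illustrative role-based filtering of patient-profile fields; roles and
# field sets are assumptions for the sketch only.
VISIBLE_FIELDS = {
    "healthcare_provider": {"name", "medical_history", "treatment_plan", "pain_level"},
    "technician": {"name"},  # troubleshooting role sees only the patient's name
}

def filter_profile(profile: dict, role: str) -> dict:
    """Return only the fields the given role is permitted to view."""
    allowed = VISIBLE_FIELDS.get(role, set())
    return {k: v for k, v in profile.items() if k in allowed}

profile = {"name": "Patient X", "medical_history": "...", "treatment_plan": "Plan A"}
filter_profile(profile, "technician")  # -> {"name": "Patient X"}
```

An unrecognized role receives no fields at all, a deny-by-default posture consistent with confidentiality requirements.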
[0529] The example overview display 3120 generally illustrated in FIG. 27 also includes a help data display 3140 presenting information for the healthcare provider to use in assisting the patient. The help data display 3140 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27. The help data display 3140 may take other forms, such as a separate screen or a popup window. The help data display 3140 may include, for example, presenting answers to frequently asked questions regarding use of the patient interface 3050 and/or the treatment device 3070.
[0530] The help data display 3140 may also include research data or best practices. In some embodiments, the help data display 3140 may present scripts for answers or explanations in response to patient questions. In some embodiments, the help data display 3140 may present flow charts or walk-throughs for the healthcare provider to use in determining a root cause and/or solution to a patient's problem.
[0531] In some embodiments, the assistant interface 3094 may present two or more help data displays 3140, which may be the same or different, for simultaneous presentation of help data for use by the healthcare provider. For example, a first help data display may be used to present a troubleshooting flowchart to determine the source of a patient's problem, and a second help data display may present script information for the healthcare provider to read to the patient, such information to preferably include directions for the patient to perform some action, which may help to narrow down or solve the problem. In some embodiments, based upon inputs to the troubleshooting flowchart in the first help data display, the second help data display may automatically populate with script information.
[0532] The example overview display 3120 generally illustrated in FIG. 27 also includes a patient interface control 3150 presenting information regarding the patient interface 3050, and/or to modify one or more settings of the patient interface 3050. The patient interface control 3150 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27. The patient interface control 3150 may take other forms, such as a separate screen or a popup window. The patient interface control 3150 may present information communicated to the assistant interface 3094 via one or more of the interface monitor signals 3098b.
[0533] As is generally illustrated in FIG. 27, the patient interface control 3150 includes a display feed 3152 of the display presented by the patient interface 3050. In some embodiments, the display feed 3152 may include a live copy of the display screen currently being presented to the patient by the patient interface 3050. In other words, the display feed 3152 may present an image of what is presented on a display screen of the patient interface 3050.
[0534] In some embodiments, the display feed 3152 may include abbreviated information regarding the display screen currently being presented by the patient interface 3050, such as a screen name or a screen number. The patient interface control 3150 may include a patient interface setting control 3154 for the healthcare provider to adjust or to control one or more settings or aspects of the patient interface 3050. In some embodiments, the patient interface setting control 3154 may cause the assistant interface 3094 to generate and/or to transmit an interface control signal 3098 for controlling a function or a setting of the patient interface 3050.
[0535] In some embodiments, the patient interface setting control 3154 may include collaborative browsing or co-browsing capability for the healthcare provider to remotely view and/or to control the patient interface 3050. For example, the patient interface setting control 3154 may enable the healthcare provider to remotely enter text to one or more text entry fields on the patient interface 3050 and/or to remotely control a cursor on the patient interface 3050 using a mouse or touchscreen of the assistant interface 3094.
[0536] In some embodiments, using the patient interface 3050, the patient interface setting control 3154 may allow the healthcare provider to change a setting that cannot be changed by the patient. For example, the patient interface 3050 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 3050, the language used for the displays, whereas the patient interface setting control 3154
may enable the healthcare provider to change the language setting of the patient interface 3050. In another example, the patient interface 3050 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 3050 such that the display would become illegible to the patient, whereas the patient interface setting control 3154 may provide for the healthcare provider to change the font size setting of the patient interface 3050.
[0537] The example overview display 3120 generally illustrated in FIG. 27 also includes an interface communications display 3156 showing the status of communications between the patient interface 3050 and one or more other devices 3070, 3082, 3084, such as the treatment device 3070, the ambulation sensor 3082, and/or the goniometer 3084. The interface communications display 3156 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27.
[0538] The interface communications display 3156 may take other forms, such as a separate screen or a popup window. The interface communications display 3156 may include controls for the healthcare provider to remotely modify communications with one or more of the other devices 3070, 3082, 3084. For example, the healthcare provider may remotely command the patient interface 3050 to reset communications with one of the other devices 3070, 3082, 3084, or to establish communications with a new one of the other devices 3070, 3082, 3084. This functionality may be used, for example, where the patient has a problem with one of the other devices 3070, 3082, 3084, or where the patient receives a new or a replacement one of the other devices 3070, 3082, 3084.
[0539] The example overview display 3120 generally illustrated in FIG. 27 also includes an apparatus control 3160 for the healthcare provider to view and/or to control information regarding the treatment device 3070. The apparatus control 3160 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27. The apparatus control 3160 may take other forms, such as a separate screen or a popup window. The apparatus control 3160 may include an apparatus status display 3162 with information regarding the current status of the apparatus. The apparatus status display 3162 may present information communicated to the assistant interface 3094 via one or more of the apparatus monitor signals 3099b. The apparatus status display 3162 may indicate whether the treatment device 3070 is currently communicating with the patient interface 3050. The apparatus status display 3162 may present other current and/or historical information regarding the status of the treatment device 3070.
[0540] The apparatus control 3160 may include an apparatus setting control 3164 for the healthcare provider to adjust or control one or more aspects of the treatment device 3070. The apparatus setting control 3164 may cause the assistant interface 3094 to generate and/or to transmit an apparatus control signal 3099 (e.g., which may be referred to as treatment plan input, as described) for changing an operating parameter and/or one or more characteristics of the treatment device 3070, (e.g., a pedal radius setting, a resistance setting, a target RPM, other suitable characteristics of the treatment device 3070, or a combination thereof).
[0541] The apparatus setting control 3164 may include a mode button 3166 and a position control 3168, which may be used in conjunction for the healthcare provider to place an actuator 3078 of the treatment device 3070 in a manual mode, after which a setting, such as a position or a speed of the actuator 3078, can be changed using the position control 3168. The mode button 3166 may provide for a setting, such as a position, to be toggled between automatic and manual modes.
[0542] In some embodiments, one or more settings may be adjustable at any time, and without having an associated auto/manual mode. In some embodiments, the healthcare provider may change an operating parameter of the treatment device 3070, such as a pedal radius setting, while the patient is actively using the treatment device 3070. Such "on the fly" adjustment may or may not be available to the patient using the patient interface 3050.
[0543] In some embodiments, the apparatus setting control 3164 may allow the healthcare provider to change a setting that cannot be changed by the patient using the patient interface 3050. For example, the patient interface 3050 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment device 3070, whereas the apparatus setting control 3164 may provide for the healthcare provider to change the height or tilt setting of the treatment device 3070.
[0544] The example overview display 3120 generally illustrated in FIG. 27 also includes a patient communications control 3170 for controlling an audio or an audiovisual communications session with the patient interface 3050. The communications session with the patient interface 3050 may comprise a live feed from the assistant interface 3094 for presentation by the output device of the patient interface 3050. The live feed may take the form of an audio feed and/or a video feed. In some embodiments, the patient interface 3050 may be configured to provide two-way audio or audiovisual communications with a person using the assistant interface 3094. Specifically, the communications session with the patient interface 3050 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 3050 and the assistant interface 3094 presenting video of the other one.
[0545] In some embodiments, the patient interface 3050 may present video from the assistant interface 3094, while the assistant interface 3094 presents only audio or the assistant interface 3094 presents no live audio or visual signal from the patient interface 3050. In some embodiments, the assistant interface 3094 may present video from the patient interface 3050, while the patient interface 3050 presents only audio or the patient interface 3050 presents no live audio or visual signal from the assistant interface 3094.
[0546] In some embodiments, the audio or an audiovisual communications session with the patient interface 3050 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part. The patient communications control 3170 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27. The patient communications control 3170 may take other forms, such as a separate screen or a popup window.
[0547] The audio and/or audiovisual communications may be processed and/or directed by the assistant interface 3094 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the healthcare provider while the healthcare provider uses the assistant interface 3094. Alternatively or additionally, the audio and/or audiovisual communications may include communications with a third party. For example, the system 3010 may enable the healthcare provider to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a healthcare provider or a specialist. The example patient communications control 3170 generally illustrated in FIG. 27 includes call controls 3172 for the healthcare provider to use in managing various aspects of the audio or audiovisual communications with the patient. The call controls 3172 include a disconnect button 3174 for the healthcare provider to end the audio or audiovisual communications session. The call controls 3172 also include a mute button 3176 to temporarily silence an audio or audiovisual signal from the assistant interface 3094. In some embodiments, the call controls 3172 may include other features, such as a hold button (not shown).
[0548] The call controls 3172 also include one or more record/playback controls 3178, such as record, play, and pause buttons to control, with the patient interface 3050, recording and/or playback of audio and/or video from the teleconference session. The call controls 3172 also include a video feed display 3180 for presenting still and/or video images from the patient interface 3050, and a self-video display 3182 showing the current image of the healthcare provider using the assistant interface 3094. The self-video display 3182 may be presented as a picture-in-picture format, within a section of the video feed display 3180, as is generally illustrated in FIG. 27. Alternatively or additionally, the self-video display 3182 may be presented separately and/or independently from the video feed display 3180.
[0549] The example overview display 3120 generally illustrated in FIG. 27 also includes a third party communications control 3190 for use in conducting audio and/or audiovisual communications with a third party. The third party communications control 3190 may take the form of a portion or region of the overview display 3120, as is generally illustrated in FIG. 27. The third party communications control 3190 may take other forms, such as a display on a separate screen or a popup window.
[0550] The third party communications control 3190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a healthcare provider or a specialist. The third party communications control 3190 may include conference calling capability for the third party to simultaneously communicate with both the healthcare provider via the assistant interface 3094, and with the patient via the patient interface 3050. For example, the system 3010 may provide for the healthcare provider to initiate a 3-way conversation with the patient and the third party.
[0551] FIG. 28 generally illustrates an example block diagram of training a machine learning model 3013 to output, based on data 3600 pertaining to the patient, a treatment plan 3602 for the patient according to the present disclosure. Data pertaining to other patients may be received by the server 3030. The other patients may have used various treatment devices to perform treatment plans.
[0552] The data may include characteristics of the other patients, the details of the treatment plans performed by the other patients, and/or the results of performing the treatment plans
(e.g., a percent of recovery of a portion of the patients' bodies, an amount of recovery of a portion of the patients' bodies, an amount of increase or decrease in muscle strength of a portion of patients' bodies, an amount of increase or decrease in range of motion of a portion of patients' bodies, etc.).
[0553] As depicted, the data has been assigned to different cohorts. Cohort A includes data for patients having similar first characteristics, first treatment plans, and first results. Cohort B includes data for patients having similar second characteristics, second treatment plans, and second results. For example, cohort A may include first characteristics of patients in their twenties without any medical conditions who underwent surgery for a broken limb; their treatment plans may include a certain treatment protocol (e.g., use the treatment device 3070 for minutes 5 times a week for 3 weeks, wherein values for the properties, configurations, and/or settings of the treatment device 3070 are set to X (where X is a numerical value) for the first two weeks and to Y (where Y is a numerical value) for the last week).
[0554] Cohort A and cohort B may be included in a training dataset used to train the machine learning model 3013. The machine learning model 3013 may be trained to match a pattern between characteristics for each cohort and output the treatment plan that provides the result. Accordingly, when the data 3600 for a new patient is input into the trained machine learning model 3013, the trained machine learning model 3013 may match the characteristics included in the data 3600 with characteristics in either cohort A or cohort B and output the appropriate treatment plan 3602. In some embodiments, the machine learning model 3013 may be trained to output one or more excluded treatment plans that should not be performed by the new patient.
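The pattern-matching behavior described above can be approximated by a simple nearest-cohort classifier; this is a minimal sketch only, with hypothetical feature vectors, cohort centroids, and treatment plans standing in for whatever the trained machine learning model 3013 actually learns:

```python
# Sketch: assign a new patient to the cohort whose characteristic profile
# is closest, then recommend that cohort's treatment plan. All values and
# plan strings are hypothetical.
import math

COHORTS = {
    "A": {"centroid": [25.0, 0.0], "plan": "30 min/day, 4 days, protocol X"},
    "B": {"centroid": [65.0, 1.0], "plan": "10 min/day, 3 days, protocol Y"},
}

def recommend(characteristics: list[float]) -> str:
    """Match the patient's characteristics to the nearest cohort centroid."""
    best = min(COHORTS, key=lambda c: math.dist(characteristics, COHORTS[c]["centroid"]))
    return COHORTS[best]["plan"]

recommend([27.0, 0.0])  # characteristics close to cohort A's profile
```

A production model would learn the matching from the training dataset rather than use fixed centroids, and could likewise be trained to output excluded treatment plans for characteristics associated with poor results.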
[0555] FIG. 29 generally illustrates an embodiment of an overview display 3120 of the assistant interface 3094 presenting recommended treatment plans and excluded treatment plans in real-time during a telemedicine session according to the present disclosure. As depicted, the overview display 3120 includes only sections for the patient profile 3130 and the video feed display 3180, including the self-video display 3182. Any suitable configuration of controls and interfaces of the overview display 3120 described with reference to FIG. 27 may be presented in addition to or instead of the patient profile 3130, the video feed display 3180, and the self-video display 3182.
[0556] The healthcare provider using the assistant interface 3094 (e.g., computing device) during the telemedicine session may be presented in the self-video 3182 in a portion of the overview display 3120 (e.g., user interface presented on a display screen 3024 of the assistant interface 3094) that also presents a video from the patient in the video feed display 3180. Further, the video feed display 3180 may also include a graphical user interface (GUI) object 3700 (e.g., a button) that enables the healthcare provider to share, in real-time or near real time during the telemedicine session, the recommended treatment plans and/or the excluded treatment plans with the patient on the patient interface 3050. The healthcare provider may select the GUI object 3700 to share the recommended treatment plans and/or the excluded treatment plans. As depicted, another portion of the overview display 3120 includes the patient profile display 3130.
[0557] The patient profile display 3130 is presenting two example recommended treatment plans 3600 and one example excluded treatment plan 3602. As described herein, the treatment plans may be recommended in view of characteristics of the patient being treated. To generate the recommended treatment plans 3600 the patient should follow to achieve a desired result, a pattern between the characteristics of the patient being treated and a cohort of other people who have used the treatment device 3070 to perform a treatment plan may be matched by one or more machine learning models 3013 of the artificial intelligence engine 3011. Each of the recommended treatment plans may be generated based on different desired results.
[0558] For example, as depicted, the patient profile display 3130 presents "The characteristics of the patient match characteristics of users in Cohort A. The following treatment plans are recommended for the patient based on his characteristics and desired results." Then, the patient profile display 3130 presents recommended treatment plans from cohort A, and each treatment plan provides different results.
[0559] As depicted, treatment plan "A" indicates "Patient X should use treatment device for 30 minutes a day for 4 days to achieve an increased range of motion of Y%; Patient X has Type 2 Diabetes; and Patient X should be prescribed medication Z for pain management during the treatment plan (medication Z is approved for people having Type 2 Diabetes)." Accordingly, the treatment plan generated achieves an increased range of motion of Y%. As may be appreciated, the treatment plan also includes a recommended medication (e.g., medication Z) to prescribe to the patient to manage pain in view of a known medical disease (e.g., Type 2 Diabetes) of the patient. That is, the recommended patient medication not only does not conflict with the medical condition of the patient but also thereby improves the probability of a superior patient outcome. This specific example and all such examples elsewhere herein are not intended to limit in any way the generated treatment plan from recommending multiple medications, or from handling the acknowledgement, view, diagnosis and/or treatment of comorbid conditions or diseases.
[0560] Recommended treatment plan "B" may specify, based on a different desired result of the treatment plan, a different treatment plan including a different treatment protocol for a treatment device, a different medication regimen, etc.
[0561] As depicted, the patient profile display 3130 may also present the excluded treatment plans 3602. These types of treatment plans are shown to the healthcare provider using the assistant interface 3094 to alert the healthcare provider not to recommend certain portions of a treatment plan to the patient. For example, the excluded treatment plan could specify the following: "Patient X should not use treatment device for longer than 30 minutes a day due to a heart condition; Patient X has Type 2 Diabetes; and Patient X should not be prescribed medication M for pain management during the treatment plan (in this scenario, medication M can cause complications for people having Type 2 Diabetes)." Specifically, the excluded treatment plan points out a limitation of a treatment protocol where, due to a heart condition, Patient X should not exercise for more than 30 minutes a day. The ruled-out treatment plan also points out that Patient X should not be prescribed medication M because it conflicts with the medical condition Type 2 Diabetes.
[0562] The healthcare provider may select the treatment plan for the patient on the overview display 3120. For example, the healthcare provider may use an input peripheral (e.g., mouse, touchscreen, microphone, keyboard, etc.) to select from the treatment plans 3600 for the patient. In some embodiments, during the telemedicine session, the healthcare provider may discuss the pros and cons of the recommended treatment plans 3600 with the patient.
[0563] In any event, the healthcare provider may select the treatment plan for the patient to follow to achieve the desired result. The selected treatment plan may be transmitted to the patient interface 3050 for presentation. The patient may view the selected treatment plan on the patient interface 3050. In some embodiments, the healthcare provider and the patient may discuss during the telemedicine session the details (e.g., treatment protocol using treatment device 3070, diet regimen, medication regimen, etc.) in real-time or in near real-time. In some
embodiments, the server 3030 may control, based on the selected treatment plan and during the telemedicine session, the treatment device 3070 as the user uses the treatment device 3070.
[0564] FIG. 30 generally illustrates an embodiment of the overview display 3120 of the assistant interface 3094 presenting, in real-time during a telemedicine session, recommended treatment plans that have changed as a result of patient data changing according to the present disclosure. As may be appreciated, the treatment device 3070 and/or any computing device (e.g., patient interface 3050) may transmit data while the patient uses the treatment device 3070 to perform a treatment plan. The data may include updated characteristics of the patient and/or other treatment data. For example, the updated characteristics may include new performance information and/or measurement information. The performance information may include a speed of a portion of the treatment device 3070, a range of motion achieved by the patient, a force exerted on a portion of the treatment device 3070, a heartrate of the patient, a blood pressure of the patient, a respiratory rate of the patient, and so forth.
[0565] In some embodiments, the data received at the server 3030 may be input into the trained machine learning model 3013, which may determine that the characteristics indicate the patient is on track for the current treatment plan. Determining the patient is on track for the current treatment plan may cause the trained machine learning model 3013 to adjust a parameter of the treatment device 3070. The adjustment may be based on a next step of the treatment plan to further improve the performance of the patient.
[0566] In some embodiments, the data received at the server 3030 may be input into the trained machine learning model 3013, which may determine that the characteristics indicate the patient is not on track (e.g., behind schedule, not able to maintain a speed, not able to achieve a certain range of motion, is in too much pain, etc.) for the current treatment plan or is ahead of schedule (e.g., exceeding a certain speed, exercising longer than specified with no pain, exerting more than a specified force, etc.) for the current treatment plan.
[0567] The trained machine learning model 3013 may determine that the characteristics of the patient no longer match the characteristics of the patients in the cohort to which the patient is assigned. Accordingly, the trained machine learning model 3013 may reassign the patient to another cohort that includes characteristics matching the patient's characteristics. As such, the trained machine learning model 3013 may select a new treatment plan from the new cohort and control, based on the new treatment plan, the treatment device 3070.
[0568] In some embodiments, prior to controlling the treatment device 3070, the server 3030 may provide the new treatment plan 3800 to the assistant interface 3094 for presentation in the patient profile 3130. As depicted, the patient profile 3130 indicates "The characteristics of the patient have changed and now match characteristics of users in Cohort B. The following treatment plan is recommended for the patient based on his characteristics and desired results." Then, the patient profile 3130 presents the new treatment plan 3800 ("Patient X should use the treatment device for 10 minutes a day for 3 days to achieve an increased range of motion of L%."). The healthcare provider may select the new treatment plan 3800, and the server 3030 may receive the selection. The server 3030 may control the treatment device 3070 based on the new treatment plan 3800. In some embodiments, the new treatment plan 3800 may be transmitted to the patient interface 3050 such that the patient may view the details of the new treatment plan 3800.
[0569] In some embodiments, the server 3030 may be configured to receive treatment data pertaining to a user who uses a treatment device 3070 to perform a treatment plan. The user may include a patient, user, or person using the treatment device 3070 to perform various exercises.
[0570] The treatment data may include various characteristics of the user, various baseline measurement information pertaining to the user, various measurement information pertaining to the user while the user uses the treatment device 3070, various characteristics of the treatment device 3070, the treatment plan, other suitable data, or a combination thereof. In some embodiments, the server 3030 may receive the treatment data during a telemedicine session.
[0571] In some embodiments, while the user uses the treatment device 3070 to perform the treatment plan, at least some of the treatment data may include the sensor data 3136 from one or more of the external sensors 3082, 3084, 3086, and/or from one or more internal sensors 3076 of the treatment device 3070. In some embodiments, at least some of the treatment data may include sensor data from one or more sensors of one or more wearable devices worn by the patient while using the treatment device 3070. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like. The one or more wearable devices may be configured to monitor a heartrate, a temperature, a blood pressure, one or more vital signs, and the like of the patient while the patient is using the treatment device
3070.
[0572] The various characteristics of the treatment device 3070 may include one or more settings of the treatment device 3070, a current revolutions per time period (e.g., such as one minute) of a rotating member (e.g., such as a wheel) of the treatment device 3070, a resistance setting of the treatment device 3070, other suitable characteristics of the treatment device 3070, or a combination thereof. The baseline measurement information may include, while the user is at rest, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof. The measurement information may include, while the user uses the treatment device 3070 to perform the treatment plan, one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof.
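The treatment data enumerated above (user characteristics, baseline measurements taken at rest, measurements taken during use, and device characteristics) can be pictured as a simple record structure; the field names below are illustrative and are not taken from the disclosure:

```python
# A sketch of how the treatment data might be structured; field names
# and units are assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class Measurements:
    heartrate: float               # beats per minute
    blood_pressure: tuple          # (systolic, diastolic)
    respiration_rate: float        # breaths per minute
    temperature: float             # degrees Celsius

@dataclass
class TreatmentData:
    user_characteristics: dict
    baseline: Measurements                               # taken while the user is at rest
    during_session: list = field(default_factory=list)   # Measurements sampled during use
    device_settings: dict = field(default_factory=dict)  # e.g., resistance, target RPM

baseline = Measurements(heartrate=62, blood_pressure=(118, 76),
                        respiration_rate=14, temperature=36.6)
data = TreatmentData(user_characteristics={"age": 27}, baseline=baseline,
                     device_settings={"resistance": 3, "target_rpm": 60})
```

Separating the at-rest baseline from the in-session samples mirrors the distinction the paragraph draws between baseline measurement information and measurement information taken while the user performs the treatment plan.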
[0573] In some embodiments, the server 3030 may write to an associated memory, for access by the artificial intelligence engine 3011, the treatment data. The artificial intelligence engine 3011 may use the one or more machine learning models 3013, which may be configured to use at least some of the treatment data to generate one or more predictions. For example, the artificial intelligence engine 3011 may use a machine learning model 3013 configured to receive the treatment data corresponding to the user. The machine learning model 3013 may analyze the at least one aspect of the treatment data and may generate at least one prediction corresponding to the at least one aspect of the treatment data.
[0574] The at least one prediction may indicate one or more predicted characteristics of the user. The one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
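The predicted characteristics enumerated above can be pictured as a simple record. The following Python sketch is purely illustrative — the class name, field names, and units are assumptions for exposition, not data structures defined in this specification:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Prediction:
    """Illustrative container for the predicted user characteristics
    (all field names and units are hypothetical)."""
    respiration_rate: Optional[float] = None      # breaths per minute
    heartrate: Optional[float] = None             # beats per minute
    temperature: Optional[float] = None           # degrees
    blood_pressure: Optional[Tuple[int, int]] = None  # (systolic, diastolic)
    predicted_outcome: Optional[str] = None       # e.g., "on-track"
    predicted_injury: Optional[str] = None        # e.g., "none"

# A prediction need only populate the characteristics the model produced.
p = Prediction(heartrate=88.0, blood_pressure=(122, 79), predicted_outcome="on-track")
```

A downstream consumer can then test individual fields for presence before comparing them against thresholds.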
[0575] In some embodiments, the server 3030 may receive, from the artificial intelligence engine 3011, the one or more predictions. The server 3030 may identify a threshold corresponding to respective predictions received from the artificial intelligence engine 3011. For example, the server 3030 may identify one or more characteristics of the user indicated by a respective prediction.
[0576] The server 3030 may access a database, such as the database 3044 or other suitable database, configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user. For example, the database 3044 may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database 3044 may include information that associates a threshold with a blood pressure of the user and a heartrate of the user. It should be understood that the database 3044 may include any number of thresholds associated with any of the various characteristics of the user and/or any combination of user characteristics. In some embodiments, a threshold corresponding to a respective prediction may include a value or a range of values including an upper limit and a lower limit.
[0577] In some embodiments, the server 3030 may determine whether a prediction received from the artificial intelligence engine 3011 is within a range of a corresponding threshold. For example, the server 3030 may compare the prediction to the corresponding threshold. The server 3030 may determine whether the prediction is within a predefined range of the threshold. For example, if the threshold includes a value, the predefined range may include an upper limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable upper limit) above the value and a lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable lower limit) below the value. Similarly, if the threshold includes a range including a first upper limit and a first lower limit (e.g., defining an acceptable range of the user characteristic or characteristics corresponding to the prediction), the predefined range may include a second upper limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable upper limit) above the first upper limit and a second lower limit (e.g., 0.5% or 1% percentagewise or, e.g., 250 or 750 (a unit of measurement or other suitable numerical value), or other suitable lower limit) below the first lower limit. It should be understood that the threshold may include any suitable predefined range and may include any suitable format in addition to or other than those described herein.
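The predefined-range comparison described above can be sketched in a few lines of Python. The function name, the tuple convention for range-form thresholds, and the choice of a percentage-based margin (1%, one of the examples given) are illustrative assumptions, not part of the specification:

```python
def within_range(prediction: float, threshold, pct_margin: float = 0.01) -> bool:
    """Return True if `prediction` falls within the predefined range around
    `threshold`. `threshold` may be a single value or a (lower, upper) pair;
    the limits are extended by `pct_margin` on each side, mirroring the
    percentagewise example in the text. Illustrative sketch only."""
    if isinstance(threshold, tuple):
        lower, upper = threshold            # threshold given as a range
    else:
        lower = upper = threshold           # threshold given as a single value
    # Extend the limits to form the predefined range described above.
    lo = lower - abs(lower) * pct_margin
    hi = upper + abs(upper) * pct_margin
    return lo <= prediction <= hi

within_range(120.5, 120.0)       # True: within 1% of the single value
within_range(130.0, (110, 125))  # False: above the extended upper limit
```

An absolute margin (e.g., a fixed number of units of measurement, as in the other examples above) could be substituted for the percentage margin without changing the structure.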
[0578] If the server 3030 determines that the prediction is within the range of the threshold, the server 3030 may communicate with (e.g., or over) an interface, such as the overview display 3120 at the computing device of the healthcare provider assisting the user, to provide the prediction and the treatment data. In some embodiments, the server 3030 may generate treatment information using the treatment data and/or the prediction. The treatment information may include a formatted summary of the performance of the treatment plan by the user while using the treatment device 3070, such that the treatment data and/or the prediction is presentable at the computing device of a healthcare provider responsible for the performance of the treatment plan by the user. In some embodiments, the patient profile display 3130 may include and/or display the treatment information.
[0579] The server 3030 may be configured to provide, at the overview display 3120, the treatment information. For example, the server 3030 may store the treatment information for access by the overview display 3120 and/or communicate the treatment information to the overview display 3120. In some embodiments, the server 3030 may provide the treatment information to the patient profile display 3130 or other suitable section, portion, or component of the overview display 3120 or to any other suitable display or interface.
[0580] In some embodiments, the server 3030 may, in response to determining that the prediction is within the range of the threshold, modify at least one aspect of the treatment plan and/or, based on the prediction, one or more characteristics of the treatment device 3070. In some embodiments, the server 3030 may control, while the user uses the treatment device 3070 during a telemedicine session and based on a generated prediction, the treatment device 3070. For example, the server 3030 may, based on the prediction and/or the treatment plan, control one or more characteristics of the treatment device 3070.
[0581] The healthcare provider may include a medical professional (e.g., such as a doctor, a nurse, a therapist, and the like), an exercise professional (e.g., such as a coach, a trainer, a nutritionist, and the like), or another professional sharing at least one of medical and exercise attributes (e.g., such as an exercise physiologist, a physical therapist, an occupational therapist, and the like). As used herein, and without limiting the foregoing, a "healthcare provider" may be a human being, a robot, a virtual assistant, a virtual assistant in a virtual and/or augmented reality, or an artificially intelligent entity, including a software program, integrated software and hardware, or hardware alone.
[0582] In some embodiments, the interface may include a graphical user interface configured to provide the treatment information and receive input from the healthcare provider. The interface may include one or more input fields, such as text input fields, dropdown selection input fields, radio button input fields, virtual switch input fields, virtual lever input fields, audio, haptic, tactile, biometric or otherwise activated and/or driven input fields, other suitable input fields, or a combination thereof.
[0583] In some embodiments, the healthcare provider may review the treatment information and/or the prediction. The healthcare provider may, based on the review of the treatment information and/or the prediction, determine whether to modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device 3070. For example, the healthcare provider may review the treatment information. The healthcare provider may, based on the review of the treatment information, compare the treatment information to the treatment plan being performed by the user.
[0584] The healthcare provider may compare the following: (i) expected information, which pertains to the user while the user uses the treatment device to perform the treatment plan, to (ii) the prediction, which pertains to the user while the user uses the treatment device to perform the treatment plan.
[0585] The expected information may include one or more vital signs of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof. The healthcare provider may determine that the treatment plan is having the desired effect if the prediction is within an acceptable range associated with one or more corresponding parts or portions of the expected information. Alternatively, the healthcare provider may determine that the treatment plan is not having the desired effect if the prediction is outside of the range associated with one or more corresponding parts or portions of the expected information.
[0586] For example, the healthcare provider may determine whether a blood pressure value indicated by the prediction (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, plus or minus 1 unit of measurement (or other suitable numerical value), or any suitable range) of an expected blood pressure value indicated by the expected information. The healthcare provider may determine that the treatment plan is having the desired effect if the blood pressure value is within the range of the expected blood pressure value. Alternatively, the healthcare provider may determine that the treatment plan is not having the desired effect if the blood pressure value is outside of the range of the expected blood pressure value.
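The blood pressure comparison just described reduces to a tolerance check. In this sketch the function name is illustrative and the default tolerance is the plus-or-minus 5% example from the text:

```python
def bp_within_expected(predicted_systolic: float, expected_systolic: float,
                       tolerance_pct: float = 0.05) -> bool:
    """Return True if the predicted systolic pressure is within plus or
    minus `tolerance_pct` (default 5%, one of the example tolerances above)
    of the expected value. Illustrative sketch only."""
    margin = expected_systolic * tolerance_pct
    return abs(predicted_systolic - expected_systolic) <= margin

bp_within_expected(124, 120)  # True: 4 mmHg difference is inside the 6 mmHg band
bp_within_expected(130, 120)  # False: 10 mmHg difference is outside the band
```

The same check applies unchanged to diastolic or pulse pressure values.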
[0587] In some embodiments, while the user uses the treatment device 3070 to perform the treatment plan, the healthcare provider may compare the expected characteristics of the treatment device 3070 with characteristics of the treatment device 3070 indicated by the treatment information and/or the prediction. For example, the healthcare provider may compare an expected resistance setting of the treatment device 3070 with an actual resistance setting of the treatment device 3070 indicated by the treatment information and/or the prediction. The healthcare provider may determine that the user is performing the treatment plan properly if the actual characteristics of the treatment device 3070 indicated by the treatment information and/or the prediction are within a range of corresponding characteristics of the expected characteristics of the treatment device 3070. Alternatively, the healthcare provider may determine that the user is not performing the treatment plan properly if the actual characteristics of the treatment device 3070 indicated by the treatment information and/or the prediction are outside the range of corresponding characteristics of the expected characteristics of the treatment device 3070.
[0588] If the healthcare provider determines that the prediction and/or the treatment information indicates that the user is performing the treatment plan properly and/or that the treatment plan is having the desired effect, the healthcare provider may determine not to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device 3070. Alternatively, while the user uses the treatment device 3070 to perform the treatment plan, if the healthcare provider determines that the prediction and/or the treatment information indicates that the user is not or has not been performing the treatment plan properly and/or that the treatment plan is not or has not been having the desired effect, the healthcare provider may determine to modify the at least one aspect of the treatment plan and/or the one or more characteristics of the treatment device 3070.
[0589] In some embodiments, if the healthcare provider determines to modify the at least one aspect of the treatment plan and/or to modify one or more characteristics of the treatment device, the healthcare provider may interact with the interface to provide treatment plan input indicating one or more modifications to the treatment plan and/or to modify one or more characteristics of the treatment device 3070. For example, the healthcare provider may use the interface to provide input indicating an increase or decrease in the resistance setting of the treatment device 3070, or other suitable modification to the one or more characteristics of the treatment device 3070. Additionally, or alternatively, the healthcare provider may use the interface to provide input indicating a modification to the treatment plan. For example, the healthcare provider may use the interface to provide input indicating an increase or decrease in an amount of time the user is required to use the treatment device 3070 according to the treatment plan, or other suitable modifications to the treatment plan.
[0590] In some embodiments, based on one or more modifications indicated by the treatment plan input, the server 3030 may modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device 3070.
[0591] In some embodiments, while the user uses the treatment device 3070 to perform the modified treatment plan, the server 3030 may receive the subsequent treatment data pertaining to the user. For example, after the healthcare provider provides input modifying the treatment plan and/or the one or more characteristics of the treatment device 3070 and/or after the server 3030, using the artificial intelligence engine 3011, modifies the treatment plan and/or one or more characteristics of the treatment device 3070, the user may continue to use the treatment device 3070 to perform the modified treatment plan. The subsequent treatment data may correspond to treatment data generated while the user uses the treatment device 3070 to perform the modified treatment plan. In some embodiments, after the healthcare provider has received the treatment information and determined not to modify the treatment plan and/or the one or more characteristics of the treatment device 3070 and/or the server 3030, using the artificial intelligence engine 3011, has determined not to modify the treatment plan and/or the one or more characteristics of the treatment device 3070, the subsequent treatment data may correspond to treatment data generated while the user continues to use the treatment device 3070 to perform the treatment plan. In some embodiments, the subsequent treatment data may include the updated treatment data (e.g., the treatment data updated to include the at least one prediction).
[0592] In some embodiments, the server 3030, using the artificial intelligence engine 3011 executing the machine learning model 3013, may generate one or more subsequent predictions based on the subsequent treatment data. The server 3030 may determine whether a respective subsequent prediction is within a range of a corresponding threshold. The server 3030 may, in response to a determination that the respective subsequent prediction is within the range of the threshold, communicate the subsequent treatment data, subsequent treatment information, and/or the prediction to the computing device of the healthcare provider. In some embodiments, the server 3030 may modify at least one aspect of the treatment plan and/or one or more characteristics of the treatment device 3070 based on the subsequent prediction.
[0593] In some embodiments, the server 3030 may receive subsequent treatment plan input from the computing device of the healthcare provider. Based on the subsequent treatment plan input received from the computing device of the healthcare provider, the server 3030 may further modify the treatment plan and/or control the one or more characteristics of the treatment device 3070. The subsequent treatment plan input may correspond to input provided by the healthcare provider, at the interface, in response to receiving and/or reviewing subsequent treatment information and/or the subsequent prediction corresponding to the subsequent treatment data. It should be understood that the server 3030, using the artificial intelligence engine 3011, may continuously and/or periodically generate predictions based on treatment data. Based on treatment data continuously and/or periodically received from the sensors or other suitable sources described herein, the server 3030 may provide treatment information and/or predictions to the computing device of the healthcare provider. Additionally, or alternatively, the server 3030 may continuously and/or periodically monitor, while the user uses the treatment device 3070 to perform the treatment plan, the characteristics of the user.
[0594] In some embodiments, the healthcare provider and/or the server 3030 may receive and/or review, continuously or periodically, while the user uses the treatment device 3070 to perform the treatment plan, treatment information, treatment data, and/or predictions. Based on one or more trends indicated by the treatment information, treatment data, and/or predictions, the healthcare provider and/or the server 3030 may determine whether to modify the treatment plan and/or to modify and/or to control the one or more characteristics of the treatment device 3070. For example, the one or more trends may indicate an increase in heartrate or other suitable trends indicating that the user is not performing the treatment plan properly and/or that performance of the treatment plan by the user is not having the desired effect.
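A minimal way to surface the increasing-heartrate trend mentioned above is to compare recent samples against earlier ones. The function name and windowing choice are illustrative; a production system might fit a regression slope instead:

```python
def heartrate_trend(samples):
    """Crude trend indicator over periodically received heartrate samples:
    the mean of the later half minus the mean of the earlier half. A
    positive result suggests a rising heartrate of the kind described
    above. Illustrative sketch only."""
    mid = len(samples) // 2
    first, second = samples[:mid], samples[mid:]
    return sum(second) / len(second) - sum(first) / len(first)

heartrate_trend([80, 82, 81, 95, 98, 101])  # positive: heartrate rising
```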
[0595] In some embodiments, the server 3030 may control, while the user uses the treatment device 3070 to perform the treatment plan, one or more characteristics of the treatment device 3070 based on the prediction. For example, the server 3030 may determine that the prediction is outside of the range of the corresponding threshold. Based on the prediction, the server 3030 may identify one or more characteristics of the treatment device 3070. The server 3030 may communicate a signal to the controller 3072 of the treatment device 3070 indicating the modifications to the one or more characteristics of the treatment device 3070. Based on the signal, the controller 3072 may modify the one or more characteristics of the treatment device 3070.
[0596] In some embodiments, the treatment plan, including the configurations, settings, range of motion settings, pain level, force settings, and speed settings, etc. of the treatment device 3070 for various exercises, may be transmitted to the controller of the treatment device 3070. In one example, if the user provides an indication, via the patient interface 3050, that he is experiencing a high level of pain at a particular range of motion, the controller may receive the indication. Based on the indication, the controller may electronically adjust the range of motion of the pedal 3102 by adjusting the pedal inwardly, outwardly, or along or about any suitable axis, via one or more actuators, hydraulics, springs, electric motors, or the like. When the user indicates certain pain levels during an exercise, the treatment plan may define alternative range of motion settings for the pedal 3102. Accordingly, once the treatment plan is uploaded to the controller of the treatment device 3070, the treatment device 3070 may continue to operate without further instruction, further external input, and the like. It should be noted that the patient (via the patient interface 3050) and/or the assistant (via the assistant interface 3094) may override any of the configurations or settings of the treatment device 3070 at any time. For example, the patient may use the patient interface 3050 to cause the treatment device 3070 to immediately stop, if so desired.
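The pain-level fallback described above amounts to a lookup against the alternative range-of-motion settings the treatment plan defines. In this sketch, the settings table, degree units, and function name are hypothetical:

```python
def adjust_range_of_motion(pain_level: int, current_rom: float,
                           alternative_rom: dict) -> float:
    """Sketch of the controller-side behavior described above: when the
    patient reports a pain level for which the treatment plan defines an
    alternative range-of-motion setting, fall back to it; otherwise keep
    the current setting. Table and units are hypothetical."""
    return alternative_rom.get(pain_level, current_rom)

plan_rom = {8: 30.0, 9: 20.0, 10: 10.0}    # high pain -> reduced ROM (degrees)
adjust_range_of_motion(9, 45.0, plan_rom)   # 20.0: alternative setting applied
adjust_range_of_motion(3, 45.0, plan_rom)   # 45.0: no change for low pain
```

Because the table is part of the uploaded treatment plan, the controller can apply it without further external input, consistent with the paragraph above.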
[0597] FIG. 31 is a flow diagram generally illustrating a method 3900 for monitoring, based on treatment data received while a user uses the treatment device 3070, characteristics of the user while the user uses the treatment device 3070 according to the principles of the present disclosure. The method 3900 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both. The method 3900 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011). In some embodiments, the method 3900 may be performed by a single processing thread. Alternatively, the method 3900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the methods.
[0598] For simplicity of explanation, the method 3900 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 3900 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 3900 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 3900 could alternatively be represented as a series of interrelated states via a state diagram or events.
[0599] At 3902, the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 3070, to perform a treatment plan. The treatment data may include characteristics of the user, baseline measurement information pertaining to the user, measurement information pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan, characteristics of the treatment device 3070, at least one aspect of the treatment plan, other suitable data, or a combination thereof.
[0600] At 3904, the processing device may write to an associated memory, for access by an artificial intelligence engine, such as the artificial intelligence engine 3011, the treatment data. The artificial intelligence engine 3011 may be configured to use at least one machine learning model, such as the machine learning model 3013. The machine learning model 3013 may be configured to use at least one aspect of the treatment data to generate at least one prediction.
[0601] The at least one prediction may indicate one or more predicted characteristics of the user. The one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
[0602] At 3906, the processing device may receive, from the artificial intelligence engine 3011, the at least one prediction.
[0603] At 3908, the processing device may identify a threshold corresponding to the at least one prediction. For example, the processing device may identify one or more characteristics of the user indicated by a respective prediction. The processing device may access a database, such as the database 3044 or other suitable database, configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user. For example, the database 3044 may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database 3044 may include information that associates a threshold with a blood pressure of the user and a heartrate of the user. A threshold corresponding to a respective prediction may include a value or a range of values including an upper limit and a lower limit.
[0604] At 3910, the processing device may, in response to a determination that the at least one prediction is within a range of a corresponding threshold, communicate with an interface at a computing device of a healthcare provider, to provide the at least one prediction and the treatment data. For example, the processing device may compare the at least one prediction and/or one or more characteristic of the user indicated by the prediction with the corresponding threshold identified by the processing device. If the processing device determines that the prediction is within the range of the threshold, the processing device may communicate the at least one prediction and/or the treatment data to the computing device of the healthcare provider.
[0605] At 3912, the processing device may, in response to a determination that the at least one prediction is outside of the range of the corresponding threshold, update the treatment data pertaining to the user to indicate the at least one prediction. The processing device may store the updated treatment data in an associated memory.
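Steps 3902 through 3912 can be condensed into a single function. In this sketch, `predict` stands in for the artificial intelligence engine 3011, `thresholds` for the database 3044 lookup, `notify` for the healthcare provider's interface, and `memory` for the associated memory; all are injected hypothetical stand-ins, not interfaces defined in the specification:

```python
def method_3900(treatment_data, predict, thresholds, notify, memory):
    """Condensed, illustrative sketch of method 3900 (steps 3902-3912)."""
    memory["treatment_data"] = treatment_data                # 3904: write for the engine
    prediction = predict(treatment_data)                     # 3906: receive prediction
    lower, upper = thresholds[prediction["characteristic"]]  # 3908: identify threshold
    if lower <= prediction["value"] <= upper:                # 3910: within range
        notify(prediction, treatment_data)                   # provide to the provider
        return treatment_data
    # 3912: outside the range -> update the treatment data and store it
    updated = dict(treatment_data, prediction=prediction)
    memory["treatment_data"] = updated
    return updated
```

A caller would invoke this each time new treatment data arrives, passing the same `memory` so the updated treatment data accumulates between calls.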
[0606] FIG. 32 is a flow diagram generally illustrating an alternative method 31000 for monitoring, based on treatment data received while a user uses the treatment device 3070, characteristics of the user while the user uses the treatment device 3070 according to the principles of the present disclosure. Method 31000 includes operations performed by processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011). In some embodiments, one or more operations of the method 31000 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 31000 may be performed in the same or a similar manner as described above in regard to method 3900. The operations of the method 31000 may be performed in some combination with any of the operations of any of the methods described herein.
[0607] At 31002, the processing device may receive treatment data, during a telemedicine session, pertaining to a user who uses a treatment device or treatment apparatus, such as the treatment device 3070, to perform a treatment plan. The treatment data may include characteristics of the user, baseline measurement information pertaining to the user, measurement information pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan, characteristics of the treatment device 3070, at least one aspect of the treatment plan, other suitable data, or a combination thereof.
[0608] At 31004, the processing device may write to an associated memory, for access by an artificial intelligence engine, such as the artificial intelligence engine 3011, the treatment data. The artificial intelligence engine 3011 may be configured to use at least one machine learning model, such as the machine learning model 3013. The machine learning model 3013 may be configured to use at least one aspect of the treatment data to generate at least one prediction.
[0609] The at least one prediction may indicate one or more predicted characteristics of the user. The one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
[0610] At 31006, the processing device may receive, from the artificial intelligence engine 3011, the at least one prediction.
[0611] At 31008, the processing device may identify a threshold corresponding to the at least one prediction. For example, the processing device may identify one or more characteristics of the user indicated by a respective prediction. The processing device may access a database, such as the database 3044 or other suitable database, configured to associate thresholds with characteristics of the user and/or combinations of characteristics of the user. For example, the database 3044 may include information that associates a first threshold with a blood pressure of the user. Additionally, or alternatively, the database 3044 may include information that associates a threshold with a blood pressure of the user and a heartrate of the user. A threshold corresponding to a respective prediction may include a value or a range of values including an upper limit and a lower limit.
[0612] At 31010, the processing device may, in response to a determination that the at least one prediction is within a range of a corresponding threshold, communicate with an interface at a computing device of a healthcare provider, to provide the at least one prediction and the treatment data. For example, the processing device may compare the at least one prediction and/or one or more characteristic of the user indicated by the prediction with the corresponding threshold identified by the processing device. If the processing device determines that the prediction is within the range of the threshold, the processing device may communicate the at least one prediction and/or the treatment data to the computing device of the healthcare provider.
[0613] At 31012, the processing device may receive, from the computing device of the healthcare provider, treatment plan input indicating at least one modification to the at least one of the at least one aspect of the treatment plan and any other aspect of the treatment plan.
[0614] At 31014, the processing device may modify, using the treatment plan input, the at least one of the at least one aspect of the treatment plan and any other aspect of the treatment plan.
[0615] At 31016, the processing device may control, during a telemedicine session while the user uses the treatment device 3070 and based on the modified at least one of the at least one aspect of the treatment plan or any other aspect of the treatment plan, the treatment device 3070.
[0616] FIG. 33 is a flow diagram generally illustrating an alternative method 31100 for monitoring, based on treatment data received while a user uses the treatment device 3070, characteristics of the user while the user uses the treatment device 3070 according to the principles of the present disclosure. Method 31100 includes operations performed by processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011). In some embodiments, one or more operations of the method 31100 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 31100 may be performed in the same or a similar manner as described above in regard to method 3900 and/or method 31000. The operations of the method 31100 may be performed in some combination with any of the operations of any of the methods described herein.
[0617] At 31102, the processing device may receive treatment data pertaining to a user who uses a treatment device, such as the treatment device 3070, to perform a treatment plan. The treatment data may include characteristics of the user, baseline measurement information pertaining to the user, measurement information pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan, characteristics of the treatment device 3070, at least one aspect of the treatment plan, other suitable data, or a combination thereof.
[0618] At 31104, the processing device may write to an associated memory, for access by an artificial intelligence engine, such as the artificial intelligence engine 3011, the treatment data. The artificial intelligence engine 3011 may be configured to use at least one machine learning model, such as the machine learning model 3013. The machine learning model 3013 may be configured to use at least one aspect of the treatment data to generate at least one prediction.
[0619] The at least one prediction may indicate one or more predicted characteristics of the user. The one or more predicted characteristics of the user may include a predicted vital sign of the user, a predicted respiration rate of the user, a predicted heartrate of the user, a predicted temperature of the user, a predicted blood pressure of the user, a predicted outcome of the treatment plan being performed by the user, a predicted injury of the user resulting from the user performing the treatment plan, or other suitable predicted characteristic of the user.
[0620] At 31106, the processing device may receive, from the artificial intelligence engine 3011, the at least one prediction.
[0621] At 31108, the processing device may generate treatment information using the at least one prediction. The treatment information may include a summary of the user's performance of the treatment plan while the user uses the treatment device 3070 to perform the treatment plan, and the at least one prediction. The treatment information may be formatted such that the treatment data and/or the at least one prediction is presentable at a computing device of a healthcare provider responsible for the performance of the treatment plan by the user.
[0622] At 31110, the processing device may write, to an associated memory for access by at least one of the computing device of the healthcare provider and a machine learning model executed by the artificial intelligence engine 3011, the treatment information and/or the at least one prediction.
[0623] At 31112, the processing device may receive treatment plan input responsive to the treatment information. The treatment plan input may indicate at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan. In some embodiments, the treatment plan input may be provided by the healthcare provider, as described. In some embodiments, based on the treatment information, the artificial intelligence engine 3011 executing the machine learning model 3013 may generate the treatment plan input.
[0624] At 31114, the processing device may determine whether the treatment plan input indicates at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan.
[0625] If the processing device determines that the treatment plan input does not indicate at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan, the processing device returns to 31102 and continues receiving treatment data pertaining to the user while the user uses the treatment device 3070 to perform the treatment plan. If the processing device determines that the treatment plan input indicates at least one modification to the at least one aspect of the treatment plan and/or any other aspect of the treatment plan, the processing device continues at 31116.
[0626] At 31116, the processing device may selectively modify the at least one aspect of the treatment plan and/or any other aspect of the treatment plan. For example, the processing device may determine whether the treatment data indicates that the treatment plan is having a desired effect. The processing device may modify, in response to determining that the treatment plan is not having the desired effect, at least one aspect of the treatment plan in order to attempt to achieve the desired effect, and if not possible, at least a portion of the desired effect.
[0627] At 31118, the processing device may control, while the user uses the treatment device 3070 and based on the treatment plan and/or the modified treatment plan, the treatment device 3070.
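By way of a non-limiting illustration, one pass through steps 31102-31118 of the method 31100 may be sketched as follows. All function, class, and field names in this sketch are hypothetical and are provided for illustration only; they do not appear in the disclosure.

```python
def monitoring_step(treatment_data, ai_engine, treatment_plan, control_device):
    """Illustrative single pass through steps 31102-31118 of method 31100."""
    # 31104/31106: the artificial intelligence engine's machine learning
    # model uses the treatment data to generate at least one prediction
    prediction = ai_engine.predict(treatment_data)
    # 31108: treatment information combines a performance summary and the prediction
    treatment_info = {"data": treatment_data, "prediction": prediction}
    # 31112: receive treatment plan input responsive to the treatment information
    plan_input = ai_engine.review(treatment_info)
    # 31114/31116: selectively modify the plan only if a modification is indicated
    if plan_input.get("modification") is not None:
        treatment_plan = {**treatment_plan, **plan_input["modification"]}
    # 31118: control the treatment device based on the (possibly modified) plan
    control_device(treatment_plan)
    return treatment_plan
```

In this sketch, `ai_engine` stands in for the artificial intelligence engine 3011 and `control_device` for the signal path that configures the treatment device 3070; returning to step 31102 would correspond to calling `monitoring_step` again with freshly received treatment data.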
[0628] FIG. 34 generally illustrates an example embodiment of a method 31200 for receiving a selection of an optimal treatment plan and controlling, based on the optimal treatment plan, a treatment device while the patient uses the treatment device, according to the principles of the present disclosure. Method 31200 includes operations performed by processors of a computing device (e.g., any component of FIG. 23, such as server 3030 executing the artificial intelligence engine 3011). In some embodiments, one or more operations of the method 31200 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 31200 may be performed in the same or a similar manner as described above in regard to method 3900. The operations of the method 31200 may be performed in some combination with any of the operations of any of the methods described herein.
[0629] Prior to the method 31200 being executed, various optimal treatment plans may be generated by one or more trained machine learning models 3013 of the artificial intelligence engine 3011. For example, based on a set of treatment plans pertaining to a medical condition of a patient, the one or more trained machine learning models 3013 may generate the optimal treatment plans. The various optimal treatment plans may be transmitted to one or more computing devices of a patient and/or medical professional.
[0630] At 31202 of the method 31200, the processing device may receive an optimal treatment plan selected from some or all of the optimal treatment plans. The selection may have been entered on a user interface presenting the optimal treatment plans on the patient interface 3050 and/or the assistant interface 3094.
[0631] At 31204, the processing device may control, while the patient uses the treatment device 3070, based on the selected optimal treatment plan, the treatment device 3070. In some embodiments, the controlling may be performed distally by the server 3030. For example, if the selection is made using the patient interface 3050, one or more control signals may be transmitted from the patient interface 3050 to the treatment device 3070 to configure, according to the selected treatment plan, a setting of the treatment device 3070 to control operation of the treatment device 3070. Further, if the selection is made using the assistant interface 3094, one or more control signals may be transmitted from the assistant interface 3094 to the treatment device 3070 to configure, according to the selected treatment plan, a setting of the treatment device 3070 to control operation of the treatment device 3070.
[0632] It should be noted that, as the patient uses the treatment device 3070, the sensors 3076 may transmit measurement data to a processing device. The processing device may dynamically control, according to the treatment plan, the treatment device 3070 by modifying, based on the sensor measurements, a setting of the treatment device 3070. For example, if the force measured by the sensor 3076 indicates the user is not applying enough force to a pedal 3102, the treatment plan may indicate to reduce the required amount of force for an exercise.
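The force-based adjustment described above may be illustrated, by way of example only, with the following sketch; the tolerance and the reduction factor are illustrative assumptions, not values taken from the disclosure.

```python
def adjust_required_force(measured_force, required_force, tolerance=0.9):
    """Reduce the required pedal force when the sensed force falls short.

    Illustrative sketch of the dynamic control described above. The 90%
    tolerance and the 20% reduction are hypothetical example values.
    """
    if measured_force < required_force * tolerance:
        # The user is not applying enough force: lower the requirement
        return round(required_force * 0.8, 2)
    # Otherwise, leave the setting of the treatment device unchanged
    return required_force
```

In operation, such a function would be evaluated against each new sensor measurement, and the returned value written back as a setting of the treatment device 3070.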
[0633] It should be noted that, as the patient uses the treatment device 3070, the user may use the patient interface 3050 to enter input pertaining to a pain level experienced by the patient as the patient performs the treatment plan. For example, the user may enter a high degree of pain while pedaling with the pedals 3102 set to a certain range of motion on the treatment device 3070. The pain level entered by the user may be within a range or at a level which may cause the range of motion to be dynamically adjusted based on the treatment plan. For example, the treatment plan may specify alternative range of motion settings if a certain pain level is indicated when the user is performing an exercise subject to a certain range of motion.
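The pain-level-driven adjustment of the range of motion may likewise be sketched as follows. The 0-10 pain scale, the threshold of 7, and the mapping of settings to gentler alternatives are illustrative assumptions.

```python
def select_range_of_motion(pain_level, current_rom, alternatives, threshold=7):
    """Pick an alternative range-of-motion setting when reported pain is high.

    Illustrative sketch: `alternatives` maps a range-of-motion setting to a
    gentler fallback specified by the treatment plan; the threshold of 7 on
    a 0-10 pain scale is a hypothetical example value.
    """
    if pain_level >= threshold:
        # The treatment plan specifies an alternative setting for this exercise
        return alternatives.get(current_rom, current_rom)
    return current_rom
```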
[0634] FIG. 35 generally illustrates an example computer system 31300 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure. In one example, computer system 31300 may include a computing device and correspond to the assistant interface 3094, reporting interface 3092, supervisory interface 3090, clinician interface 3020, server 3030 (including the AI engine 3011), patient interface 3050, ambulatory sensor 3082, goniometer 3084, treatment device 3070, pressure sensor 3086, or any suitable component of FIG. 23. The computer system 31300 may be capable of executing instructions implementing the one or more machine learning models 3013 of the artificial intelligence engine 3011 of FIG. 23. The computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network.
[0635] The computer system may operate in the capacity of a server in a client-server network environment. The computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Further, while only a single computer system is illustrated, the term "computer" shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
[0636] The computer system 31300 includes a processing device 31302, a main memory 31304 (e.g., read-only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 31306 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 31308, which communicate with each other via a bus 31310.
[0637] Processing device 31302 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 31302 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 31302 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 31302 is configured to execute instructions for performing any of the operations and steps discussed herein.
[0638] The computer system 31300 may further include a network interface device 31312. The computer system 31300 also may include a video display 31314 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 31316 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 31318 (e.g., a speaker). In one illustrative example, the video display 31314 and the input device(s) 31316 may be combined into a single component or device (e.g., an LCD touch screen).
[0639] The data storage device 31308 may include a computer-readable medium 31320 on which the instructions 31322 embodying any one or more of the methods, operations, or functions described herein are stored. The instructions 31322 may also reside, completely or at least partially, within the main memory 31304 and/or within the processing device 31302 during execution thereof by the computer system 31300. As such, the main memory 31304 and the processing device 31302 also constitute computer-readable media. The instructions 31322 may further be transmitted or received over a network via the network interface device 31312.
[0640] While the computer-readable storage medium 31320 is generally illustrated in the illustrative examples to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0641] Clause 45. A method for providing, by an artificial intelligence engine, an optimal treatment plan to use with a treatment apparatus, the method comprising:
[0642] receiving, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
[0643] translating a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine;
[0644] determining, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow using the treatment apparatus to achieve a desired result; and
[0645] providing the optimal treatment plan to be presented on a computing device of a medical professional.
[0646] Clause 46. The method of any clause herein, wherein translating the portion of the clinical information from the first data format to the medical description language used by the artificial intelligence engine further comprises:
[0647] parsing the clinical information;
[0648] identifying, based on keywords representing target information in the clinical information, the portion of the clinical information having values related to the target information;
[0649] generating a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
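By way of a non-limiting illustration, the parsing, identification, and generation steps recited above may be sketched as follows. The keyword/regular-expression matching and the tag syntax shown are hypothetical examples of a canonical format; the disclosure does not prescribe this particular implementation.

```python
import re

def to_canonical(clinical_text, keywords):
    """Translate free-text clinical information into a tagged canonical format.

    Illustrative sketch of the steps above: parse the clinical information,
    identify values related to the target information via keywords, and
    generate tags identifying those values. The pattern and tag syntax are
    assumptions, not taken from the disclosure.
    """
    record = {}
    for keyword in keywords:
        # Identify a value adjacent to each keyword (e.g., "heartrate: 72")
        match = re.search(rf"{keyword}\s*[:=]\s*([\w./]+)", clinical_text, re.I)
        if match:
            record[keyword] = match.group(1)
    # Generate the canonical format: tags identifying the values
    return "".join(f"<{k}>{v}</{k}>" for k, v in sorted(record.items()))
```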
[0650] Clause 47. The method of any clause herein, wherein the tags are attributes describing specific characteristics of the target information.
[0651] Clause 48. The method of any clause herein, wherein providing the optimal treatment plan to be presented on the computing device of the medical professional further comprises:
[0652] causing, during a telemedicine session, the optimal treatment plan to be presented on a user interface of the computing device of the medical professional, wherein the optimal treatment plan is not presented on a display screen of a computing device, such display screen configured to be used by the patient during the telemedicine session.
[0653] Clause 49. The method of any clause herein, further comprising:
[0654] determining, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a ruled-out treatment plan that should not be recommended for the patient to follow when using the treatment apparatus to achieve the desired result; and
[0655] providing the ruled-out treatment plan to be presented on the computing device of the medical professional.
[0656] Clause 50. The method of any clause herein, further comprising:
[0657] determining, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a second desired result, wherein the desired result pertains to a recovery outcome and the second desired result pertains to a recovery time; and
[0658] providing the second optimal treatment plan to be presented on the computing device of the medical professional;
[0659] receiving a selected treatment plan of either the optimal treatment plan or the second optimal treatment plan; and
[0660] transmitting the selected treatment plan to a computing device of the patient for presenting on a user interface of the computing device of the patient.
[0661] Clause 51. The method of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
[0662] a range of motion the patient achieves using the treatment apparatus,
[0663] an amount of force exerted by the patient on a portion of the treatment apparatus,
[0664] an amount of time the patient exercises using the treatment apparatus,
[0665] a distance the patient travels using the treatment apparatus, or
[0666] some combination thereof.
[0667] Clause 52. The method of any clause herein, wherein:
[0668] the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
[0669] the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
[0670] Clause 53. The method of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, meta-analysis, or some combination thereof.
[0671] Clause 54. The method of any clause herein, wherein determining, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, the optimal treatment plan for the patient to follow when using the treatment apparatus to achieve the desired result further comprises:
[0672] matching a pattern between the portion of the clinical information described by the medical description language with the plurality of characteristics of the patient, wherein the pattern is associated with the optimal treatment plan that leads to the desired result.
[0673] Clause 55. The method of any clause herein, wherein the optimal treatment plan comprises:
[0674] a medical procedure to perform on the patient,
[0675] a treatment protocol for the patient using the treatment apparatus,
[0676] a diet regimen for the patient,
[0677] a medication regimen for the patient,
[0678] a sleep regimen for the patient, or
[0679] some combination thereof.
[0680] Clause 56. A tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to:
[0681] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
[0682] translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine;
[0683] determine, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow using the treatment apparatus to achieve a desired result; and
[0684] provide the optimal treatment plan to be presented on a computing device of a medical professional.
[0685] Clause 57. The computer-readable medium of any clause herein, wherein translating the portion of the clinical information from the first data format to the medical description language used by the artificial intelligence engine further comprises:
[0686] parse the clinical information;
[0687] identify, based on keywords representing target information in the clinical information, the portion of the clinical information having values of the target information;
[0688] generate a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
[0689] Clause 58. The computer-readable medium of any clause herein, wherein providing the optimal treatment plan to be presented on the computing device of the medical professional further comprises:
[0690] causing, during a telemedicine session, the optimal treatment plan to be presented on a user interface of the computing device of the medical professional, wherein, during the telemedicine session, the optimal treatment plan is not presented on a user interface of a computing device of the patient.
[0691] Clause 59. The computer-readable medium of any clause herein, wherein the processing device further:
[0692] determines, based on the portion of the clinical information described by the medical description language and the plurality of characteristics pertaining to the patient, a second optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a second desired result, wherein the desired result pertains to a recovery outcome and the second desired result pertains to a recovery time; and
[0693] provides the second optimal treatment plan to be presented on the computing device of the medical professional;
[0694] receives a selected treatment plan of either the optimal treatment plan or the second optimal treatment plan; and
[0695] transmits the selected treatment plan to a computing device of the patient.
[0696] Clause 60. The computer-readable medium of any clause herein, wherein the desired result comprises obtaining a certain result within a certain time period, and the certain result comprises:
[0697] a range of motion the patient achieves using the treatment apparatus,
[0698] an amount of force exerted by the patient on a portion of the treatment apparatus,
[0699] an amount of time the patient exercises using the treatment apparatus,
[0700] a distance the patient travels using the treatment apparatus, or
[0701] some combination thereof.
[0702] Clause 61. The computer-readable medium of any clause herein, wherein:
[0703] the certain characteristics of the people comprise first medications prescribed to the people, first injuries of the people, first medical procedures performed on the people, first measurements of the people, first allergies of the people, first medical conditions of the people, first historical information of the people, first vital signs of the people, first symptoms of the people, first familial medical information of the people, first demographic information of the people, first geographic information of the people, first measurement- or test-based information of the people, first medically historic information of the people, first etiologic information of the people, first cohort-associative information of the people, first differentially diagnostic information of the people, first surgical information of the people, first physically therapeutic information of the people, first pharmacologic information of the people, first other treatments recommended to the people, or some combination thereof, and
[0704] the plurality of characteristics of the patient comprise second medications of the patient, second injuries of the patient, second medical procedures performed on the patient, second measurements of the patient, second allergies of the patient, second medical conditions of the patient, second historical information of the patient, second vital signs of the patient, second symptoms of the patient, second familial medical information of the patient, second demographic information of the patient, second geographic information of the patient, second measurement- or test-based information of the patient, second medically historic information of the patient, second etiologic information of the patient, second cohort associative information of the patient, second differentially diagnostic information of the patient, second surgical information of the patient, second physically therapeutic information of the patient, second pharmacologic information of the patient, second other treatments recommended to the patient, or some combination thereof.
[0705] Clause 62. The computer-readable medium of any clause herein, wherein the clinical information is written by a person having a certain professional credential and comprises a journal article, a clinical trial, evidence-based guidelines, or some combination thereof.
[0706] Clause 63. A system comprising:
[0707] a memory device storing instructions; and
[0708] a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to:
[0709] receive, from a data source, clinical information pertaining to results of using the treatment apparatus to perform particular treatment plans for people having certain characteristics, wherein the clinical information has a first data format;
[0710] translate a portion of the clinical information from the first data format to a medical description language used by the artificial intelligence engine;
[0711] determine, based on the portion of the clinical information described by the medical description language and a plurality of characteristics pertaining to a patient, the optimal treatment plan for the patient to follow when using the treatment apparatus to achieve a desired result; and
[0712] provide the optimal treatment plan to be presented on a computing device of a medical professional.
[0713] Clause 64. The system of any clause herein, wherein translating the portion of the clinical information from the first data format to the medical description language used by the artificial intelligence engine further comprises:
[0714] parse the clinical information;
[0715] identify, based on keywords representing target information described by the clinical information, the portion of the clinical information having values of the target information;
[0716] generate a canonical format defined by the medical description language, wherein the canonical format comprises tags identifying the values of the target information.
[0717] Determining a treatment plan for a patient having certain characteristics (e.g., vital sign or other measurements; performance; demographic; geographic; diagnostic; measurement- or test-based; medically historic; etiologic; cohort-associative; differentially diagnostic; surgical, physically therapeutic, pharmacologic and other treatment(s) recommended; arterial blood gas and/or oxygenation levels or percentages; psychographics; etc.) may be a technically challenging problem. For example, a multitude of information may be considered when determining a treatment plan, which may result in inefficiencies and inaccuracies in the treatment plan selection process. In a rehabilitative setting, some of the multitude of information considered may include characteristics of the patient such as personal information, performance information, and measurement information. The personal information may include, e.g., demographic, psychographic or other information, such as an age, a weight, a gender, a height, a body mass index, a medical condition, a familial medication history, an injury, a medical procedure, a medication prescribed, or some combination thereof. The performance information may include, e.g., an elapsed time of using a treatment apparatus, an amount of force exerted on a portion of the treatment apparatus, a range of motion achieved on the treatment apparatus, a movement speed of a portion of the treatment apparatus, an indication of a plurality of pain levels using the treatment apparatus, or some combination thereof. The measurement information may include, e.g., a vital sign, a respiration rate, a heartrate, a temperature, a blood pressure, arterial blood gas and/or oxygenation levels or percentages, glucose levels or levels of other biomarkers, or some combination thereof. It may be desirable to process the characteristics of a multitude of patients, the treatment plans performed for those patients, and the results of the treatment plans for those patients.
[0718] Further, another technical problem may involve distally treating, via a computing device during a telemedicine or telehealth session, a patient from a location different than a location at which the patient is located. An additional technical problem is controlling or enabling the control of, from the different location, a treatment apparatus used by the patient at the location at which the patient is located. Oftentimes, when a patient undergoes rehabilitative surgery (e.g., knee surgery), a physical therapist or other medical professional may prescribe a treatment apparatus to the patient to use to perform a treatment protocol at their residence or any mobile location or temporary domicile. A medical professional may refer to a doctor, physician assistant, nurse, chiropractor, dentist, physical therapist, acupuncturist, physical trainer, or the like. A medical professional may refer to any person with a credential, license, degree, or the like in the field of medicine, physical therapy, rehabilitation, or the like.
[0719] Since the physical therapist or other medical professional is located in a different location from the patient and the treatment apparatus, it may be technically challenging for the physical therapist or other medical professional to monitor the patient's actual progress (as opposed to relying on the patient's word about their progress) using the treatment apparatus, modify the treatment plan according to the patient's progress, adapt the treatment apparatus to the personal characteristics of the patient as the patient performs the treatment plan, and the like.
[0720] Accordingly, some embodiments of the present disclosure pertain to using artificial intelligence and/or machine learning to dynamically control a treatment apparatus based on the assignment during an adaptive telemedical session. In some embodiments, numerous treatment apparatuses may be provided to patients. The treatment apparatuses may be used by the patients to perform treatment plans in their residences, at a gym, at a rehabilitative center, at a hospital, at a work site, or any suitable location, including permanent or temporary domiciles. In some embodiments, the treatment apparatuses may be communicatively coupled to a server. Characteristics of the patients may be collected before, during, and/or after the patients perform the treatment plans. For example, the personal information, the performance information, and the measurement information may be collected before, during, and/or after the person performs the treatment plans. The results (e.g., improved performance or decreased performance) of performing each exercise may be collected from the treatment apparatus throughout the treatment plan and after the treatment plan is performed. The parameters, settings, configurations, etc. (e.g., position of pedal, amount of resistance, etc.) of the treatment apparatus may be collected before, during, and/or after the treatment plan is performed.
[0721] Each characteristic of the patient, each result, and each parameter, setting, configuration, etc. may be timestamped and may be correlated with a particular step in the treatment plan. Such a technique may enable determining which steps in the treatment plan lead to desired results (e.g., improved muscle strength, range of motion, etc.) and which steps lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or harms recovery).
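The timestamp-and-correlate technique described above can be sketched as follows. This is a minimal illustration only: the record fields and the "last minus first result per step" metric are assumptions made for the example, not part of the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StepRecord:
    """One timestamped observation tied to a treatment-plan step.

    Field names are hypothetical; the disclosure does not fix a schema.
    """
    step_id: int          # which step of the treatment plan
    timestamp: datetime   # when the observation was taken
    settings: dict        # apparatus configuration, e.g. {"resistance": 5}
    result: float         # measured outcome, e.g. range of motion in degrees

def per_step_change(records: list[StepRecord]) -> dict[int, float]:
    """Correlate results with plan steps: last minus first result per step.

    A positive value suggests the step leads toward desired results; a
    negative value flags possible diminishing returns.
    """
    by_step: dict[int, list[StepRecord]] = {}
    for rec in sorted(records, key=lambda r: r.timestamp):
        by_step.setdefault(rec.step_id, []).append(rec)
    return {step: recs[-1].result - recs[0].result
            for step, recs in by_step.items()}
```

Because every record is timestamped, observations can arrive out of order and still be attributed to the correct step.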
[0722] Data may be collected from the treatment apparatuses and/or any suitable computing device (e.g., computing devices where personal information is entered, such as a clinician interface or patient interface) over time as the patients use the treatment apparatuses to perform the various treatment plans. The data that may be collected may include the characteristics of the patients, the treatment plans performed by the patients, and the results of the treatment plans.
[0723] In some embodiments, the data may be processed to group certain people into cohorts. The people may be grouped by certain or selected similar characteristics, treatment plans, and results of performing the treatment plans. For example, athletic people having no medical conditions who perform a treatment plan (e.g., use the treatment apparatus for 30 minutes a day 5 times a week for 3 weeks) and who fully recover may be grouped into a first cohort. Older people who are classified as obese and who perform a treatment plan (e.g., use the treatment apparatus for 10 minutes a day 3 times a week for 4 weeks) and who improve their range of motion by 75 percent may be grouped into a second cohort.
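The two example cohorts above can be paraphrased as simple grouping rules. The specific rules and thresholds (age 65, BMI 30) are illustrative guesses; a production system would derive cohort boundaries from the collected data rather than hard-code them.

```python
def assign_cohort(patient: dict) -> str:
    """Group a patient into one of the example cohorts described above.

    Keys ("athletic", "medical_conditions", "age", "bmi") and thresholds
    are hypothetical, chosen only to mirror the two examples in the text.
    """
    if patient.get("athletic") and not patient.get("medical_conditions"):
        return "cohort_1"  # athletic, no medical conditions
    if patient.get("age", 0) >= 65 and patient.get("bmi", 0) >= 30:
        return "cohort_2"  # older, classified as obese
    return "unassigned"
```

Any single characteristic or combination of characteristics could serve as the grouping key, as the later discussion of cohort-equivalent databases notes.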
[0724] In some embodiments, an artificial intelligence engine may include one or more machine learning models that are trained using the cohorts. For example, the one or more machine learning models may be trained to receive an input of characteristics of a new patient and to output a treatment plan for the patient that results in a desired result. The machine learning models may match a pattern between the characteristics of the new patient and at least one patient of the patients included in a particular cohort. When a pattern is matched, the machine learning models may assign the new patient to the particular cohort and select the treatment plan associated with the at least one patient. The artificial intelligence engine may be configured to control, distally and based on the treatment plan, the treatment apparatus while the new patient uses the treatment apparatus to perform the treatment plan.
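The pattern-matching step can be illustrated with a nearest-centroid sketch, where nearest-centroid matching merely stands in for the trained machine learning models described above. The (age, BMI) feature pair, the centroid values, and the plan strings are all assumptions for the example.

```python
import math

# Hypothetical per-cohort feature centroids (age, BMI) and associated plans.
CENTROIDS = {"cohort_1": (28.0, 23.0), "cohort_2": (70.0, 32.0)}
PLANS = {"cohort_1": "30 min/day, 5x/week, 3 weeks",
         "cohort_2": "10 min/day, 3x/week, 4 weeks"}

def select_plan(features: tuple[float, float]) -> tuple[str, str]:
    """Match a new patient's characteristics to the closest cohort and
    return (cohort, treatment plan associated with that cohort)."""
    cohort = min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))
    return cohort, PLANS[cohort]
```

Once a plan is selected this way, the artificial intelligence engine could use it as the basis for distally controlling the treatment apparatus.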
[0725] As may be appreciated, the characteristics of the new patient may change as the new patient uses the treatment apparatus to perform the treatment plan. For example, the performance of the patient may improve quicker than expected for people in the cohort to which the new patient is currently assigned. Accordingly, the machine learning models may be trained to dynamically reassign, based on the changed characteristics, the new patient to a different cohort that includes people having characteristics similar to the now-changed characteristics of the new patient. For example, a clinically obese patient may lose weight and no longer meet the weight criterion for the initial cohort, resulting in the patient's being reassigned to a different cohort with a different weight criterion. A different treatment plan may be selected for the new patient, and the treatment apparatus may be controlled, distally and based on the different treatment plan, while the new patient uses the treatment apparatus to perform the treatment plan. Such techniques may provide the technical solution of distally controlling a treatment apparatus. Further, the techniques may lead to faster recovery times and/or better results for the patients because the treatment plan that most accurately fits their characteristics is selected and implemented, in real-time, at any given moment. Real-time may refer to less than or equal to 2 seconds. Near real-time may refer to any interaction of a sufficiently short time to enable two individuals to engage in a dialogue via such user interface, and will generally be less than 10 seconds but greater than 2 seconds. As described herein, the term "results" may refer to medical results or medical outcomes. Results and outcomes may refer to responses to medical actions.
[0726] In some embodiments, the treatment plans may be presented, during a telemedicine or telehealth session, to a medical professional. The medical professional may select a particular treatment plan for the patient to cause that treatment plan to be transmitted to the patient and/or to control, based on the treatment plan, the treatment apparatus. In some embodiments, to facilitate telehealth or telemedicine applications, including remote diagnoses, determination of treatment plans and rehabilitative and/or pharmacologic
prescriptions, the artificial intelligence engine may reside and/or operate distally from the patient and the treatment apparatus. In such cases, the recommended treatment plans and/or excluded treatment plans may be presented simultaneously with a video of the patient in real time or near real-time during a telemedicine or telehealth session on a user interface of a computing device of a medical professional. The video may also be accompanied by audio, text and other multimedia information.
[0727] Presenting the treatment plans generated by the artificial intelligence engine concurrently with a presentation of the patient video may provide an enhanced user interface because the medical professional may continue to visually and/or otherwise communicate with the patient while also reviewing the treatment plans on the same user interface. The enhanced user interface may improve the medical professional's experience using the computing device and may encourage the medical professional to reuse the user interface. Such a technique may also reduce computing resources (e.g., processing, memory, network) because the medical professional does not have to switch to another user interface screen to enter a query for a treatment plan to recommend based on the characteristics of the patient. The artificial intelligence engine provides, dynamically on the fly, the treatment plans and excluded treatment plans.
[0728] In some embodiments, the treatment plan may be modified by a medical professional. For example, certain procedures may be added, modified or removed. In the telehealth scenario, there are certain procedures that may not be performed due to the distal nature of a medical professional using a computing device in a different physical location than a patient.
[0729] A potential technical problem may relate to the information pertaining to the patient's medical condition being received in disparate formats. For example, a server may receive the information pertaining to a medical condition of the patient from one or more sources (e.g., from an electronic medical record (EMR) system, application programming interface (API), or any suitable system that has information pertaining to the medical condition of the patient). That is, some sources used by various medical professional entities may be installed on their local computing devices and, additionally and/or alternatively, may use proprietary formats. Accordingly, some embodiments of the present disclosure may use an API to obtain, via interfaces exposed by APIs used by the sources, the formats used by the sources. In some embodiments, when information is received from the sources, the API may map and convert the format used by the sources to a standardized (i.e., canonical) format, language and/or encoding ("format" as used herein will be inclusive of all of these terms) used by the artificial intelligence engine. Further, the information converted to the standardized format used by the artificial intelligence engine may be stored in a database accessed by the artificial intelligence engine when the artificial intelligence engine is performing any of the techniques disclosed herein. Using the information converted to a standardized format may enable a more accurate determination of the procedures to perform for the patient and/or a billing sequence to use for the patient.
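The map-and-convert step described above can be sketched with per-source field maps. The source names, field names, and the pound-to-kilogram conversion are all hypothetical; real EMR interfaces (e.g., HL7 FHIR) define their own schemas, and a production system would use those.

```python
# Per-source field maps from proprietary field names to the standardized
# (canonical) format consumed by the artificial intelligence engine.
SOURCE_FIELD_MAPS = {
    "emr_a": {"pt_name": "name", "dx_code": "diagnosis_code", "wt_lb": "weight_kg"},
    "emr_b": {"patient": "name", "icd10": "diagnosis_code", "weight_kg": "weight_kg"},
}

def to_canonical(source: str, record: dict) -> dict:
    """Convert a source-specific record into the standardized format."""
    out = {}
    for src_field, canon_field in SOURCE_FIELD_MAPS[source].items():
        if src_field in record:
            value = record[src_field]
            if src_field == "wt_lb":            # normalize units as well
                value = round(value * 0.45359237, 1)
            out[canon_field] = value
    return out
```

Records converted this way can then be stored in the database accessed by the artificial intelligence engine, regardless of which proprietary format each source used.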
[0730] The various embodiments disclosed herein may provide a technical solution to the technical problem pertaining to the patient's medical condition information being received in disparate formats. For example, a server may receive the information pertaining to a medical condition of the patient from one or more sources (e.g., from an electronic medical record (EMR) system, application programming interface (API), or any suitable system that has information pertaining to the medical condition of the patient). The information may be converted from the format used by the sources to the standardized format used by the artificial intelligence engine. Further, the information converted to the standardized format used by the artificial intelligence engine may be stored in a database accessed by the artificial intelligence engine when performing any of the techniques disclosed herein. The standardized information may enable generating optimal treatment plans, where the generating is based on treatment plans associated with the standardized information, monetary value amounts, and the set of constraints. The optimal treatment plans may be provided in a standardized format that can be processed by various applications (e.g., telehealth) executing on various computing devices of medical professionals and/or patients.
[0731] A technical problem may include the challenge of enabling one medical professional to treat numerous patients at the same time. A technical solution to the technical problem may include the enablement of at least one medical professional or a group of medical professionals, wherein one medical professional may participate at one time and a different medical professional may participate at another time, to treat numerous patients at the same time. As used herein the term "a single medical professional" (or "one medical professional" or equivalent) shall be deemed inclusive of all of the scenarios just recited. For example, in group therapy or recovery sessions, it may be desirable for a single medical professional to view, monitor, treat, manage, diagnose, etc. more than one patient at the same time from a distal location. Accordingly, in some embodiments of the present disclosure, a virtual avatar is used to guide each patient through an exercise session of a treatment plan. The medical professional may use a computing device to view, monitor, treat, manage, diagnose, etc. the patients at once or in temporally close ranges. If a trigger event occurs, such as a user indicating they are in a substantial amount of pain, a telemedicine session is initiated either by selection or electronically. The telemedicine session causes the virtual avatar to be replaced on the computing device of the patient with a multimedia feed from the computing device of the medical professional. In some embodiments, the medical professional may select to intervene and/or interrupt any patient's treatment plan (including, for example and without limitation, an exercise, rehabilitation, prehabilitation, or other session) as desired (e.g., when the medical professional determines a sensor measurement is undesired, the patient is not performing as desired, etc.), while the other patients continue to follow the virtual avatar to perform the exercise session.
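The trigger-event behavior above (virtual avatar replaced by the medical professional's feed for one patient while other patients continue with the avatar) can be sketched as a per-patient display decision. The numeric pain threshold is an assumed trigger rule; the disclosure only requires that some trigger event initiate the telemedicine session.

```python
def feed_for_patient(pain_level: int, trigger_threshold: int = 7) -> str:
    """Decide what a patient's computing device displays.

    The 0-10 pain scale and threshold of 7 are illustrative assumptions.
    """
    if pain_level >= trigger_threshold:
        # Trigger event: replace the virtual avatar with the medical
        # professional's multimedia feed for this patient only.
        return "telemedicine_feed"
    return "virtual_avatar"
```

Applied across a group session, only the patient who reported substantial pain switches feeds; the rest keep following the virtual avatar.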
[0732] In some embodiments, the treatment apparatus may be adaptive and/or personalized because its properties, configurations, and positions may be adapted to the needs of a particular patient. For example, the pedals may be dynamically adjusted on the fly (e.g., via a telemedicine session or based on programmed configurations in response to certain measurements being detected) to increase or decrease a range of motion to comply with a treatment plan designed for the user. In some embodiments, a medical professional may adapt, remotely during a telemedicine session, the treatment apparatus to the needs of the patient by causing a control instruction to be transmitted from a server to the treatment apparatus. Such an adaptive nature may improve the results of recovery for a patient, further the goals of personalized medicine, and enable personalization of the treatment plan on a per individual basis.
[0733] FIG. 36 shows a block diagram of a computer-implemented system 4010, hereinafter called "the system," for managing a treatment plan. Managing the treatment plan may include using an artificial intelligence engine to recommend treatment plans and/or provide excluded treatment plans that should not be recommended to a patient.
[0734] The system 4010 also includes a server 4030 configured to store and to provide data related to managing the treatment plan. The server 4030 may include one or more computers
and may take the form of a distributed and/or virtualized computer or computers. The server 4030 also includes a first communication interface 4032 configured to communicate with the clinician interface 4020 via a first network 4034. In some embodiments, the first network 4034 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc. The server 4030 includes a first processor 4036 and a first machine-readable storage memory 4038, which may be called a "memory" for short, holding first instructions 4040 for performing the various actions of the server 4030 for execution by the first processor 4036. The server 4030 is configured to store data regarding the treatment plan. For example, the memory 4038 includes a system data store 4042 configured to hold system data, such as data pertaining to treatment plans for treating one or more patients.
[0735] The system data store 4042 may be configured to hold data relating to billing procedures, including rules and constraints pertaining to billing codes, order, timing, insurance regimes, laws, regulations, or some combination thereof. The system data store 4042 may be configured to store various billing sequences generated based on billing procedures and various parameters (e.g., monetary value amount generated, patient outcome, plan of reimbursement, fees, a payment plan for patients to pay off an amount of money owed, an amount of revenue to be paid to an insurance provider, etc.). The system data store 4042 may be configured to store optimal treatment plans generated based on various treatment plans for users having similar medical conditions, monetary value amounts generated by the treatment plans, and the constraints. Any of the data stored in the system data store 4042 may be accessed by an artificial intelligence engine 4011 when performing any of the techniques described herein.
[0736] The server 4030 is also configured to store data regarding performance by a patient in following a treatment plan. For example, the memory 4038 includes a patient data store 4044 configured to hold patient data, such as data pertaining to the one or more patients, including data representing each patient's performance within the treatment plan.
[0737] In addition, the characteristics (e.g., personal, performance, measurement, etc.) of the people, the treatment plans followed by the people, the level of compliance with the treatment plans, and the results of the treatment plans may use correlations and other statistical or probabilistic measures to enable the partitioning of or to partition the treatment
plans into different patient cohort-equivalent databases in the patient data store 4044. For example, the data for a first cohort of first patients having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first treatment plan followed by the first patients, and a first result of the treatment plan may be stored in a first patient database. The data for a second cohort of second patients having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second treatment plan followed by the second patients, and a second result of the treatment plan may be stored in a second patient database. Any single characteristic or any combination of characteristics may be used to separate the cohorts of patients. In some embodiments, the different cohorts of patients may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different cohorts of patients allowed, other than as limited by mathematical combinatoric and/or partition theory.
[0738] This characteristic data, treatment plan data, and results data may be obtained from numerous treatment apparatuses and/or computing devices over time and stored in the patient data store. The characteristic data, treatment plan data, and results data may be correlated in the patient-cohort databases in the patient data store 4044. The characteristics of the people may include personal information, performance information, and/or measurement information.
[0739] In addition to the historical information about other people stored in the patient cohort-equivalent databases, real-time or near-real-time information about a current patient being treated, based on the current patient's characteristics, may be stored in an appropriate patient cohort-equivalent database. The characteristics of the patient may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A) and the patient may be assigned to that cohort.
[0740] In some embodiments, the server 4030 may execute the artificial intelligence (AI) engine 4011 that uses one or more machine learning models 4013 to perform at least one of the embodiments disclosed herein. The server 4030 may include a training engine 409 capable of generating the one or more machine learning models 4013. The machine learning models 4013 may be trained to assign people to certain cohorts based on their characteristics, select treatment plans using real-time and historical data correlations involving patient cohort equivalents, and control a treatment apparatus 4070, among other things. The machine
learning models 4013 may be trained to generate, based on billing procedures, billing sequences and/or treatment plans tailored for various parameters (e.g., a fee to be paid to a medical professional, a payment plan for the patient to pay off an amount of money owed, a plan of reimbursement, an amount of revenue to be paid to an insurance provider, or some combination thereof). The machine learning models 4013 may be trained to generate, based on constraints, optimal treatment plans tailored for various parameters (e.g., monetary value amount generated, patient outcome, risk, etc.). The one or more machine learning models 4013 may be generated by the training engine 409 and may be implemented in computer instructions executable by one or more processing devices of the training engine 409 and/or the server 4030. To generate the one or more machine learning models 4013, the training engine 409 may train the one or more machine learning models 4013. The one or more machine learning models 4013 may be used by the artificial intelligence engine 4011.
[0741] The training engine 409 may be a rackmount server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop computer, a tablet computer, a netbook, a desktop computer, an Internet of Things (IoT) device, any other desired computing device, or any combination of the above. The training engine 409 may be cloud-based or a real-time software platform, and it may include privacy software or protocols, and/or security software or protocols.
[0742] To train the one or more machine learning models 4013, the training engine 409 may use a training data set of a corpus of the information (e.g., characteristics, medical diagnosis codes, etc.) pertaining to medical conditions of the people who used the treatment apparatus 4070 to perform treatment plans, the details (e.g., treatment protocol including exercises, amount of time to perform the exercises, instructions for the patient to follow, how often to perform the exercises, a schedule of exercises, parameters/configurations/settings of the treatment apparatus 4070 throughout each step of the treatment plan, etc.) of the treatment plans performed by the people using the treatment apparatus 4070, the results of the treatment plans performed by the people, a set of monetary value amounts associated with the treatment plans, a set of constraints (e.g., rules pertaining to billing codes associated with the set of treatment plans, laws, regulations, etc.), a set of billing procedures (e.g., rules pertaining to billing codes, order, timing and constraints) associated with treatment plan instructions, a set of parameters (e.g., a fee to be paid to a medical professional, a payment plan for the patient to pay off an amount of money owed, a plan of reimbursement, an amount of revenue to be
paid to an insurance provider, or some combination thereof, a treatment plan, a monetary value amount generated, a risk, etc.), insurance regimens, etc.
[0743] The one or more machine learning models 4013 may be trained to match patterns of characteristics of a patient with characteristics of other people assigned to a particular cohort. The term "match" may refer to an exact match, a correlative match, a substantial match, etc. The one or more machine learning models 4013 may be trained to receive the characteristics of a patient as input, map the characteristics to characteristics of people assigned to a cohort, and select a treatment plan from that cohort. The one or more machine learning models 4013 may also be trained to control, based on the treatment plan, the treatment apparatus 4070.
[0744] The one or more machine learning models 4013 may be trained to match patterns of a first set of parameters (e.g., treatment plans for patients having a medical condition, a set of monetary value amounts associated with the treatment plans, patient outcome, and/or a set of constraints) with a second set of parameters associated with an optimal treatment plan. The one or more machine learning models 4013 may be trained to receive the first set of parameters as input, map the characteristics to the second set of parameters associated with the optimal treatment plan, and select the optimal treatment plan as the treatment plan. The one or more machine learning models 4013 may also be trained to control, based on the treatment plan, the treatment apparatus 4070.
[0745] The one or more machine learning models 4013 may be trained to match patterns of a first set of parameters (e.g., information pertaining to a medical condition, treatment plans for patients having a medical condition, a set of monetary value amounts associated with the treatment plans, patient outcomes, instructions for the patient to follow in a treatment plan, a set of billing procedures associated with the instructions, and/or a set of constraints) with a second set of parameters associated with a billing sequence and/or optimal treatment plan. The one or more machine learning models 4013 may be trained to receive the first set of parameters as input, map or otherwise associate or algorithmically associate the first set of parameters to the second set of parameters associated with the billing sequence and/or optimal treatment plan, and select the billing sequence and/or optimal treatment plan for the patient. In some embodiments, one or more optimal treatment plans may be selected to be provided to a computing device of the medical professional and/or the patient. The one or more
machine learning models 4013 may also be trained to control, based on the treatment plan, the treatment apparatus 4070.
[0746] Different machine learning models 4013 may be trained to recommend different treatment plans tailored for different parameters. For example, one machine learning model may be trained to recommend treatment plans for a maximum monetary value amount generated, while another machine learning model may be trained to recommend treatment plans based on patient outcome, or based on any combination of monetary value amount and patient outcome, or based on those and/or additional goals. Also, different machine learning models 4013 may be trained to recommend different billing sequences tailored for different parameters. For example, one machine learning model may be trained to recommend billing sequences for a maximum fee to be paid to a medical professional, while another machine learning model may be trained to recommend billing sequences based on a plan of reimbursement.
[0747] The one or more machine learning models 4013 may refer to model artifacts created by the training engine 409 using training data that includes training inputs and corresponding target outputs. The training engine 409 may find patterns in the training data wherein such patterns map the training input to the target output, and may generate the machine learning models 4013 that capture these patterns. In some embodiments, the artificial intelligence engine 4011, the database, and/or the training engine 409 may reside on another component (e.g., assistant interface 4094, clinician interface 4020, etc.) depicted in FIG. 36.
[0748] The one or more machine learning models 4013 may comprise, e.g., a single level of linear or non-linear operations (e.g., a support vector machine [SVM]) or the machine learning models 4013 may be a deep network, i.e., a machine learning model comprising multiple levels of non-linear operations. Examples of deep networks are neural networks including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the input of the remaining neurons, as well as to itself). For example, the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
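The "multiple levels of non-linear operations" described above can be illustrated with a dependency-free forward pass: each neuron computes a dot product plus a bias and applies a non-linearity (ReLU here, as one of many possible choices). This is a pedagogical sketch, not the architecture of the disclosed models.

```python
def dense_layer(x: list[float], weights: list[list[float]],
                biases: list[float]) -> list[float]:
    """One fully connected layer: a dot product per neuron plus a bias,
    followed by a ReLU non-linearity."""
    out = []
    for neuron_w, b in zip(weights, biases):
        z = sum(wi * xi for wi, xi in zip(neuron_w, x)) + b
        out.append(max(z, 0.0))  # non-linear operation
    return out

def forward(x: list[float], layers) -> list[float]:
    """Stack layers (a list of (weights, biases) pairs) to form a small
    deep network with hidden layers."""
    for weights, biases in layers:
        x = dense_layer(x, weights, biases)
    return x
```

Each additional (weights, biases) pair in `layers` adds one more level of non-linear operations, which is what distinguishes a deep network from a single-level model such as an SVM.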
[0749] The system 4010 also includes a patient interface 4050 configured to communicate information to a patient and to receive feedback from the patient. Specifically, the patient
interface includes an input device 4052 and an output device 4054, which may be collectively called a patient user interface 4052, 4054. The input device 4052 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The output device 4054 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, smartphone, or a smart watch. The output device 4054 may include other hardware and/or software components such as a projector, virtual reality capability, augmented reality capability, etc. The output device 4054 may incorporate various different visual, audio, or other presentation technologies. For example, the output device 4054 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies, which may signal different conditions and/or directions. The output device 4054 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the patient. The output device 4054 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0750] In some embodiments, the output device 4054 may present a user interface that may present a recommended treatment plan, billing sequence, or the like to the patient. The user interface may include one or more graphical elements that enable the user to select which treatment plan to perform. Responsive to receiving a selection of a graphical element (e.g., "Start" button) associated with a treatment plan via the input device 4052, the patient interface 4050 may communicate a control signal to the controller 4072 of the treatment apparatus 4070, wherein the control signal causes the treatment apparatus 4070 to begin execution of the selected treatment plan. As described below, the control signal may control, based on the selected treatment plan, the treatment apparatus 4070 by causing actuation of the actuator 4078 (e.g., cause a motor to drive rotation of pedals of the treatment apparatus at a certain speed), causing measurements to be obtained via the sensor 4076, or the like. The patient interface 4050 may communicate, via a local communication interface 4068, the control signal to the treatment apparatus 4070.
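The control signal sent from the patient interface to the apparatus controller can be sketched as an encoded message. The JSON schema, field names, and defaults below are assumptions; the disclosure does not specify a wire format.

```python
import json

def build_control_signal(plan_step: dict) -> bytes:
    """Encode a control instruction for the treatment apparatus controller.

    The keys of plan_step ("speed_rpm", "resistance") and the message
    fields are hypothetical, chosen to mirror the examples in the text
    (pedal speed via the actuator, measurements via the sensors).
    """
    message = {
        "action": "start_step",
        "pedal_speed_rpm": plan_step.get("speed_rpm", 20),
        "resistance_level": plan_step.get("resistance", 1),
        "collect_sensors": ["position", "force"],
    }
    return json.dumps(message).encode("utf-8")
```

A message like this could travel over the local communication interface to the controller, which would then drive the actuator and start sensor collection accordingly.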
[0751] As shown in FIG. 36, the patient interface 4050 includes a second communication interface 4056, which may also be called a remote communication interface configured to communicate with the server 4030 and/or the clinician interface 4020 via a second network 4058. In some embodiments, the second network 4058 may include a local area network (LAN),
such as an Ethernet network. In some embodiments, the second network 4058 may include the Internet, and communications between the patient interface 4050 and the server 4030 and/or the clinician interface 4020 may be secured via encryption, such as, for example, by using a virtual private network (VPN). In some embodiments, the second network 4058 may include wired and/or wireless network connections such as Wi-Fi, Bluetooth, ZigBee, Near Field Communications (NFC), cellular data network, etc. In some embodiments, the second network 4058 may be the same as and/or operationally coupled to the first network 4034.
[0752] The patient interface 4050 includes a second processor 4060 and a second machine-readable storage memory 4062 holding second instructions 4064 for execution by the second processor 4060 for performing various actions of patient interface 4050. The second machine-readable storage memory 4062 also includes a local data store 4066 configured to hold data, such as data pertaining to a treatment plan and/or patient data, such as data representing a patient's performance within a treatment plan. The patient interface 4050 also includes a local communication interface 4068 configured to communicate with various devices for use by the patient in the vicinity of the patient interface 4050. The local communication interface 4068 may include wired and/or wireless communications. In some embodiments, the local communication interface 4068 may include a local wireless network such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communications (NFC), cellular data network, etc.
[0753] The system 4010 also includes a treatment apparatus 4070 configured to be manipulated by the patient and/or to manipulate a body part of the patient for performing activities according to the treatment plan. In some embodiments, the treatment apparatus 4070 may take the form of an exercise and rehabilitation apparatus configured to perform and/or to aid in the performance of a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, and the treatment includes rehabilitation of a body part of the patient, such as a joint or a bone or a muscle group. The treatment apparatus 4070 may be any suitable medical, rehabilitative, therapeutic, etc. apparatus configured to be controlled distally via another computing device to treat a patient and/or exercise the patient. The treatment apparatus 4070 may be an electromechanical machine including one or more weights, an electromechanical bicycle, an electromechanical spin-wheel, a smart-mirror, a treadmill, a vibratory apparatus, or the like. The body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder. The body part may include a part of a joint, a bone, or a muscle group, such as one or more vertebrae, a tendon, or a ligament. As shown in FIG. 36, the
treatment apparatus 4070 includes a controller 4072, which may include one or more processors, computer memory, and/or other components. The treatment apparatus 4070 also includes a fourth communication interface 4074 configured to communicate with the patient interface 4050 via the local communication interface 4068. The treatment apparatus 4070 also includes one or more internal sensors 4076 and an actuator 4078, such as a motor. The actuator 4078 may be used, for example, for moving the patient's body part and/or for resisting forces by the patient.
[0754] The internal sensors 4076 may measure one or more operating characteristics of the treatment apparatus 4070 such as, for example, a force, a position, a speed, a velocity, and/or an acceleration. In some embodiments, the internal sensors 4076 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the patient. For example, an internal sensor 4076 in the form of a position sensor may measure a distance that the patient is able to move a part of the treatment apparatus 4070, where such distance may correspond to a range of motion that the patient's body part is able to achieve. In some embodiments, the internal sensors 4076 may include a force sensor configured to measure a force applied by the patient. For example, an internal sensor 4076 in the form of a force sensor may measure a force or weight the patient is able to apply, using a particular body part, to the treatment apparatus 4070.
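The two kinds of internal-sensor measurements described above can be modeled as simple records. This is a minimal sketch; the class names, field names, and the range-of-motion helper are illustrative assumptions, not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class PositionReading:
    """Angular travel reported by a position sensor (cf. internal sensor 4076)."""
    travel_degrees: float

@dataclass
class ForceReading:
    """Force applied by the patient, as reported by a force sensor."""
    force_lbs: float

def range_of_motion(readings):
    """Estimate range of motion as the largest angular travel seen in a session."""
    return max(r.travel_degrees for r in readings)

session = [PositionReading(45.0), PositionReading(72.0), PositionReading(68.5)]
print(range_of_motion(session))  # -> 72.0
```

A real apparatus would stream such readings continuously; here the session list stands in for that stream.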
[0755] The system 4010 shown in FIG. 36 also includes an ambulation sensor 4082, which communicates with the server 4030 via the local communication interface 4068 of the patient interface 4050. The ambulation sensor 4082 may track and store a number of steps taken by the patient. In some embodiments, the ambulation sensor 4082 may take the form of a wristband, wristwatch, or smart watch. In some embodiments, the ambulation sensor 4082 may be integrated within a phone, such as a smartphone. In some embodiments, the ambulation sensor 4082 may be integrated within an article of clothing, such as a shoe, a belt, and/or pants.
[0756] The system 4010 shown in FIG. 36 also includes a goniometer 4084, which communicates with the server 4030 via the local communication interface 4068 of the patient interface 4050. The goniometer 4084 measures an angle of the patient's body part. For example, the goniometer 4084 may measure the angle of flex of a patient's knee or elbow or shoulder.
[0757] The system 4010 shown in FIG. 36 also includes a pressure sensor 4086, which communicates with the server 4030 via the local communication interface 4068 of the patient interface 4050. The pressure sensor 4086 measures an amount of pressure or weight applied by a body part of the patient. For example, pressure sensor 4086 may measure an amount of force applied by a patient's foot when pedaling a stationary bike.
[0758] The system 4010 shown in FIG. 36 also includes a supervisory interface 4090 which may be similar or identical to the clinician interface 4020. In some embodiments, the supervisory interface 4090 may have enhanced functionality beyond what is provided on the clinician interface 4020. The supervisory interface 4090 may be configured for use by a person having responsibility for the treatment plan, such as an orthopedic surgeon.
[0759] The system 4010 shown in FIG. 36 also includes a reporting interface 4092 which may be similar or identical to the clinician interface 4020. In some embodiments, the reporting interface 4092 may have less functionality than what is provided on the clinician interface 4020. For example, the reporting interface 4092 may not have the ability to modify a treatment plan. Such a reporting interface 4092 may be used, for example, by a biller to determine the use of the system 4010 for billing purposes. In another example, the reporting interface 4092 may not have the ability to display patient identifiable information, presenting only pseudonymized data and/or anonymized data for certain data fields concerning a data subject and/or for certain data fields concerning a quasi-identifier of the data subject. Such a reporting interface 4092 may be used, for example, by a researcher to determine various effects of a treatment plan on different patients.
[0760] The system 4010 includes an assistant interface 4094 for an assistant, such as a doctor, a nurse, a physical therapist, or a technician, to remotely communicate with the patient interface 4050 and/or the treatment apparatus 4070. Such remote communications may enable the assistant to provide assistance or guidance to a patient using the system 4010. More specifically, the assistant interface 4094 is configured to communicate a telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b with the patient interface 4050 via a network connection such as, for example, via the first network 4034 and/or the second network 4058. The telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b comprises one of an audio signal 4096, an audiovisual signal 4097, an interface control signal 4098a for controlling a function of the patient interface 4050, an interface monitor signal 4098b for monitoring a status of the patient interface 4050, an apparatus control signal 4099a for changing an operating parameter of the treatment apparatus 4070, and/or an apparatus monitor signal 4099b for monitoring a status of the treatment apparatus 4070. In some embodiments, each of the control signals 4098a, 4099a may be unidirectional, conveying commands from the assistant interface 4094 to the patient interface 4050. In some embodiments, in response to successfully receiving a control signal 4098a, 4099a and/or to communicate successful and/or unsuccessful implementation of the requested control action, an acknowledgement message may be sent from the patient interface 4050 to the assistant interface 4094. In some embodiments, each of the monitor signals 4098b, 4099b may be unidirectional, conveying status information from the patient interface 4050 to the assistant interface 4094. In some embodiments, an acknowledgement message may be sent from the assistant interface 4094 to the patient interface 4050 in response to successfully receiving one of the monitor signals 4098b, 4099b.
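The six telemedicine signal types and the acknowledgement pattern described in this paragraph can be sketched as a small message protocol. The enum values, dictionary shape, and handler below are assumptions for illustration, not the claimed implementation.

```python
from enum import Enum, auto

class SignalType(Enum):
    AUDIO = auto()              # audio signal 4096
    AUDIOVISUAL = auto()        # audiovisual signal 4097
    INTERFACE_CONTROL = auto()  # 4098a: assistant -> patient interface
    INTERFACE_MONITOR = auto()  # 4098b: patient interface -> assistant
    APPARATUS_CONTROL = auto()  # 4099a: changes an operating parameter
    APPARATUS_MONITOR = auto()  # 4099b: reports apparatus status

# Control signals flow from the assistant interface; monitor signals flow back.
CONTROL_SIGNALS = {SignalType.INTERFACE_CONTROL, SignalType.APPARATUS_CONTROL}

def handle_signal(signal_type, applied_ok):
    """Return the acknowledgement a patient interface might send for a control
    signal, reporting whether the requested control action was implemented."""
    if signal_type in CONTROL_SIGNALS:
        return {"ack": True, "implemented": applied_ok}
    return None  # audio/audiovisual/monitor signals are not acknowledged here

print(handle_signal(SignalType.APPARATUS_CONTROL, True))
```

The same pattern would apply symmetrically for monitor-signal acknowledgements sent from the assistant interface back to the patient interface.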
[0761] In some embodiments, the patient interface 4050 may be configured as a pass through for the apparatus control signals 4099a and the apparatus monitor signals 4099b between the treatment apparatus 4070 and one or more other devices, such as the assistant interface 4094 and/or the server 4030. For example, the patient interface 4050 may be configured to transmit an apparatus control signal 4099a in response to an apparatus control signal 4099a within the telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b from the assistant interface 4094.
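The pass-through behavior can be sketched as a relay that extracts the embedded apparatus control portion of a telemedicine signal and forwards it unchanged; the function and key names here are hypothetical.

```python
def make_passthrough(forward):
    """Build a handler that relays apparatus control signals unchanged, as the
    patient interface 4050 might do between the assistant and the apparatus."""
    def relay(telemedicine_signal):
        # Extract the embedded apparatus control portion, if any, and forward it.
        apparatus_signal = telemedicine_signal.get("apparatus_control")
        if apparatus_signal is not None:
            return forward(apparatus_signal)
        return None
    return relay

sent_to_apparatus = []
relay = make_passthrough(sent_to_apparatus.append)
relay({"apparatus_control": {"resistance": 5}})
print(sent_to_apparatus)  # -> [{'resistance': 5}]
```

In a deployed system the `forward` callable would write to the fourth communication interface 4074 rather than to a list.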
[0762] In some embodiments, the assistant interface 4094 may be presented on the same physical device as the clinician interface 4020. For example, the clinician interface 4020 may include one or more screens that implement the assistant interface 4094. Alternatively or additionally, the clinician interface 4020 may include additional hardware components, such as a video camera, a speaker, and/or a microphone, to implement aspects of the assistant interface 4094.
[0763] In some embodiments, one or more portions of the telemedicine signal 4096, 4097, 4098a, 4098b, 4099a, 4099b may be generated from a prerecorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 4054 of the patient interface 4050. For example, a tutorial video may be streamed from the server 4030 and presented upon the patient interface 4050. Content from the prerecorded source may
be requested by the patient via the patient interface 4050. Alternatively, via a control on the assistant interface 4094, the assistant may cause content from the prerecorded source to be played on the patient interface 4050.
[0764] The assistant interface 4094 includes an assistant input device 4022 and an assistant display 4024, which may be collectively called an assistant user interface 4022, 4024. The assistant input device 4022 may include one or more of a telephone, a keyboard, a mouse, a trackpad, or a touch screen, for example. Alternatively or additionally, the assistant input device 4022 may include one or more microphones. In some embodiments, the one or more microphones may take the form of a telephone handset, headset, or wide-area microphone or microphones configured for the assistant to speak to a patient via the patient interface 4050. In some embodiments, assistant input device 4022 may be configured to provide voice-based functionalities, with hardware and/or software configured to interpret spoken instructions by the assistant by using the one or more microphones. The assistant input device 4022 may include functionality provided by or similar to existing voice-based assistants such as Siri by Apple, Alexa by Amazon, Google Assistant, or Bixby by Samsung. The assistant input device 4022 may include other hardware and/or software components. The assistant input device 4022 may include one or more general purpose devices and/or special-purpose devices.
[0765] The assistant display 4024 may take one or more different forms including, for example, a computer monitor or display screen on a tablet, a smartphone, or a smart watch. The assistant display 4024 may include other hardware and/or software components such as projectors, virtual reality capabilities, or augmented reality capabilities, etc. The assistant display 4024 may incorporate various different visual, audio, or other presentation technologies. For example, the assistant display 4024 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions, which may signal different conditions and/or directions. The assistant display 4024 may comprise one or more different display screens presenting various data and/or interfaces or controls for use by the assistant. The assistant display 4024 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App.).
[0766] In some embodiments, the system 4010 may provide computer translation of language from the assistant interface 4094 to the patient interface 4050 and/or vice-versa. The computer translation of language may include computer translation of spoken language and/or computer translation of text. Additionally or alternatively, the system 4010 may provide voice recognition and/or spoken pronunciation of text. For example, the system 4010 may convert spoken words to printed text and/or the system 4010 may audibly speak language from printed text. The system 4010 may be configured to recognize spoken words by any or all of the patient, the clinician, and/or the assistant. In some embodiments, the system 4010 may be configured to recognize and react to spoken requests or commands by the patient. For example, the system 4010 may automatically initiate a telemedicine session in response to a verbal command by the patient (which may be given in any one of several different languages).
[0767] In some embodiments, the server 4030 may generate aspects of the assistant display 4024 for presentation by the assistant interface 4094. For example, the server 4030 may include a web server configured to generate the display screens for presentation upon the assistant display 4024. For example, the artificial intelligence engine 4011 may generate treatment plans, billing sequences, and/or excluded treatment plans for patients and generate the display screens including those treatment plans, billing sequences, and/or excluded treatment plans for presentation on the assistant display 4024 of the assistant interface 4094. In some embodiments, the assistant display 4024 may be configured to present a virtualized desktop hosted by the server 4030. In some embodiments, the server 4030 may be configured to communicate with the assistant interface 4094 via the first network 4034. In some embodiments, the first network 4034 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the first network 4034 may include the Internet, and communications between the server 4030 and the assistant interface 4094 may be secured via privacy enhancing technologies, such as, for example, by using encryption over a virtual private network (VPN). Alternatively or additionally, the server 4030 may be configured to communicate with the assistant interface 4094 via one or more networks independent of the first network 4034 and/or other communication means, such as a direct wired or wireless communication channel. In some embodiments, the patient interface 4050 and the treatment apparatus 4070 may each operate from a patient location geographically separate from a location of the assistant interface 4094. 
For example, the patient interface 4050 and the treatment apparatus 4070 may be used as part of an in-home rehabilitation system, which may be aided remotely by using the assistant interface 4094 at a centralized location, such as a clinic or a call center.
[0768] In some embodiments, the assistant interface 4094 may be one of several different terminals (e.g., computing devices) that may be grouped together, for example, in one or more call centers or at one or more clinicians' offices. In some embodiments, a plurality of assistant interfaces 4094 may be distributed geographically. In some embodiments, a person may work as an assistant remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the assistant interface 4094 takes the form of a computer and/or telephone. This remote work functionality may allow for work-from-home arrangements that may include part time and/or flexible work hours for an assistant.
[0769] FIGS. 37-38 show an embodiment of a treatment apparatus 4070. More specifically, FIG. 37 shows a treatment apparatus 4070 in the form of a stationary cycling machine 4100, which may be called a stationary bike, for short. The stationary cycling machine 4100 includes a set of pedals 4102 each attached to a pedal arm 4104 for rotation about an axle 4106. In some embodiments, and as shown in FIG. 37, the pedals 4102 are movable on the pedal arms 4104 in order to adjust a range of motion used by the patient in pedaling. For example, the pedals being located inwardly toward the axle 4106 corresponds to a smaller range of motion than when the pedals are located outwardly away from the axle 4106. A pressure sensor 4086 is attached to or embedded within one or more of the pedals 4102 for measuring an amount of force applied by the patient on the pedal 4102. The pressure sensor 4086 may communicate wirelessly to the treatment apparatus 4070 and/or to the patient interface 4050.
[0770] FIG. 39 shows a person (a patient) using the treatment apparatus of FIG. 37, with sensors and various data parameters connected to a patient interface 4050. The example patient interface 4050 is a tablet computer or smartphone, or a phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, which is held manually by the patient. In some other embodiments, the patient interface 4050 may be embedded within or attached to the treatment apparatus 4070. FIG. 39 shows the patient wearing the ambulation sensor 4082 on his wrist, with a note showing "STEPS TODAY 41355", indicating that the ambulation sensor 4082 has recorded and transmitted that step count to the patient interface 4050. FIG. 39 also shows the patient wearing the goniometer 4084 on his right knee, with a note showing "KNEE ANGLE 72°", indicating that the goniometer 4084 is measuring and transmitting that knee angle to the patient interface 4050. FIG. 39 also shows a right side of one of the pedals 4102 with a pressure sensor 4086 showing "FORCE 12.5 lbs.", indicating that the right pedal pressure sensor 4086 is measuring and transmitting that force measurement to the patient interface 4050. FIG. 39 also shows a left side of one of the pedals 4102 with a pressure sensor 4086 showing "FORCE 27 lbs.", indicating that the left pedal pressure sensor 4086 is measuring and transmitting that force measurement to the patient interface 4050. FIG. 39 also shows other patient data, such as an indicator of "SESSION TIME 0:04:13", indicating that the patient has been using the treatment apparatus 4070 for 4 minutes and 13 seconds. This session time may be determined by the patient interface 4050 based on information received from the treatment apparatus 4070. FIG. 39 also shows an indicator showing "PAIN LEVEL 3". Such a pain level may be obtained from the patient in response to a solicitation, such as a question, presented upon the patient interface 4050.
[0771] FIG. 40 is an example embodiment of an overview display 4120 of the assistant interface 4094. Specifically, the overview display 4120 presents several different controls and interfaces for the assistant to remotely assist a patient with using the patient interface 4050 and/or the treatment apparatus 4070. This remote assistance functionality may also be called telemedicine or telehealth.
[0772] Specifically, the overview display 4120 includes a patient profile display 4130 presenting biographical information regarding a patient using the treatment apparatus 4070. The patient profile display 4130 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40, although the patient profile display 4130 may take other forms, such as a separate screen or a popup window. In some embodiments, the patient profile display 4130 may include a limited subset of the patient's biographical information. More specifically, the data presented upon the patient profile display 4130 may depend upon the assistant's need for that information. For example, a medical professional that is assisting the patient with a medical issue may be provided with medical history information regarding the patient, whereas a technician troubleshooting an issue with the treatment apparatus 4070 may be provided with a much more limited set of information regarding the patient. The technician, for example, may be given only the patient's name. The patient profile display 4130 may include pseudonymized data and/or anonymized data or use any privacy enhancing technology to prevent confidential patient data from being communicated in a way that could violate patient confidentiality requirements. Such privacy enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA), or the General Data Protection Regulation (GDPR), wherein the patient may be deemed a "data subject".
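One privacy-enhancing technique of the kind this paragraph alludes to is keyed pseudonymization of direct identifiers. The sketch below uses HMAC-SHA-256 from the Python standard library; the key, token length, and field names are illustrative assumptions, not the patented mechanism.

```python
import hashlib
import hmac

# Hypothetical secret; a real deployment would use a managed, rotated key.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token so the
    same patient maps to the same token without exposing the identifier."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:12]

record = {"name": "Jane Doe", "knee_angle": 72, "pain_level": 3}
safe = {**record, "name": pseudonymize(record["name"])}
print(safe["knee_angle"], safe["pain_level"])  # clinical fields survive intact
```

Because the token is keyed and deterministic, research views can still link sessions from one patient while the profile display withholds the underlying identity.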
[0773] In some embodiments, the patient profile display 4130 may present information regarding the treatment plan for the patient to follow in using the treatment apparatus 4070. Such treatment plan information may be limited to an assistant who is a medical professional, such as a doctor or physical therapist. For example, a medical professional assisting the patient with an issue regarding the treatment regimen may be provided with treatment plan information, whereas a technician troubleshooting an issue with the treatment apparatus 4070 may not be provided with any information regarding the patient's treatment plan.
[0774] In some embodiments, one or more recommended treatment plans and/or excluded treatment plans may be presented in the patient profile display 4130 to the assistant. The one or more recommended treatment plans and/or excluded treatment plans may be generated by the artificial intelligence engine 4011 of the server 4030 and received from the server 4030 in real-time during, inter alia, a telemedicine or telehealth session. An example of presenting the one or more recommended treatment plans and/or ruled-out treatment plans is described below with reference to FIG. 42.
[0775] In some embodiments, one or more treatment plans and/or billing sequences associated with the treatment plans may be presented in the patient profile display 4130 to the assistant. The one or more treatment plans and/or billing sequences associated with the treatment plans may be generated by the artificial intelligence engine 4011 of the server 4030 and received from the server 4030 in real-time during, inter alia, a telehealth session. An example of presenting the one or more treatment plans and/or billing sequences associated with the treatment plans is described below with reference to FIG. 44.
[0776] In some embodiments, one or more treatment plans and associated monetary value amounts generated, patient outcomes, and risks associated with the treatment plans may be presented in the patient profile display 4130 to the assistant. The one or more treatment plans and associated monetary value amounts generated, patient outcomes, and risks associated with the treatment plans may be generated by the artificial intelligence engine 4011 of the server 4030 and received from the server 4030 in real-time during, inter alia, a telehealth session. An example of presenting the one or more treatment plans and associated monetary value amounts generated, patient outcomes, and risks associated with the treatment plans is described below with reference to FIG. 47.
[0777] The example overview display 4120 shown in FIG. 40 also includes a patient status display 4134 presenting status information regarding a patient using the treatment apparatus. The patient status display 4134 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40, although the patient status display 4134 may take other forms, such as a separate screen or a popup window. The patient status display 4134 includes sensor data 4136 from one or more of the external sensors 4082, 4084, 4086, and/or from one or more internal sensors 4076 of the treatment apparatus 4070. In some embodiments, the patient status display 4134 may present other data 4138 regarding the patient, such as last reported pain level, or progress within a treatment plan.
[0778] User access controls may be used to limit access, including what data is available to be viewed and/or modified, on any or all of the user interfaces 4020, 4050, 4090, 4092, 4094 of the system 4010. In some embodiments, user access controls may be employed to control what information is available to any given person using the system 4010. For example, data presented on the assistant interface 4094 may be controlled by user access controls, with permissions set depending on the assistant/user's need for and/or qualifications to view that information.
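The role-based limiting of viewable data described above can be sketched as a permission table consulted before rendering an interface; the role names and record fields are assumptions for illustration.

```python
# Hypothetical role -> viewable-field mapping for the interfaces of system 4010.
PERMISSIONS = {
    "medical_professional": {"name", "medical_history", "treatment_plan", "sensor_data"},
    "technician": {"name", "sensor_data"},  # only what troubleshooting requires
    "researcher": {"sensor_data"},          # no patient-identifiable fields
}

def visible_fields(role, record):
    """Filter a patient record down to the fields the given role may view."""
    allowed = PERMISSIONS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "Jane Doe", "medical_history": "...", "sensor_data": {"knee_angle": 72}}
print(visible_fields("technician", record))
```

An unknown role receives an empty view, which matches the default-deny posture such access controls usually take.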
[0779] The example overview display 4120 shown in FIG. 40 also includes a help data display 4140 presenting information for the assistant to use in assisting the patient. The help data display 4140 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40. The help data display 4140 may take other forms, such as a separate screen or a popup window. The help data display 4140 may include, for example, answers to frequently asked questions regarding use of the patient interface 4050 and/or the treatment apparatus 4070. The help data display 4140 may also include research data or best practices. In some embodiments, the help data display 4140 may present scripts for answers or explanations in response to patient questions. In some embodiments, the help data display 4140 may present flow charts or walk-throughs for the assistant to use in determining a root cause and/or solution to a patient's problem. In some embodiments, the assistant interface 4094 may present two or more help data displays 4140, which may be the same or different, for simultaneous presentation of help data for use by the assistant. For example, a first help
data display may be used to present a troubleshooting flowchart to determine the source of a patient's problem, and a second help data display may present script information for the assistant to read to the patient, such information to preferably include directions for the patient to perform some action, which may help to narrow down or solve the problem. In some embodiments, based upon inputs to the troubleshooting flowchart in the first help data display, the second help data display may automatically populate with script information.
[0780] The example overview display 4120 shown in FIG. 40 also includes a patient interface control 4150 for presenting information regarding the patient interface 4050 and/or for modifying one or more settings of the patient interface 4050. The patient interface control 4150 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40. The patient interface control 4150 may take other forms, such as a separate screen or a popup window. The patient interface control 4150 may present information communicated to the assistant interface 4094 via one or more of the interface monitor signals 4098b. As shown in FIG. 40, the patient interface control 4150 includes a display feed 4152 of the display presented by the patient interface 4050. In some embodiments, the display feed 4152 may include a live copy of the display screen currently being presented to the patient by the patient interface 4050. In other words, the display feed 4152 may present an image of what is presented on a display screen of the patient interface 4050. In some embodiments, the display feed 4152 may include abbreviated information regarding the display screen currently being presented by the patient interface 4050, such as a screen name or a screen number. The patient interface control 4150 may include a patient interface setting control 4154 for the assistant to adjust or to control one or more settings or aspects of the patient interface 4050. In some embodiments, the patient interface setting control 4154 may cause the assistant interface 4094 to generate and/or to transmit an interface control signal 4098 for controlling a function or a setting of the patient interface 4050.
[0781] In some embodiments, the patient interface setting control 4154 may include collaborative browsing or co-browsing capability for the assistant to remotely view and/or control the patient interface 4050. For example, the patient interface setting control 4154 may enable the assistant to remotely enter text to one or more text entry fields on the patient interface 4050 and/or to remotely control a cursor on the patient interface 4050 using a mouse or touchscreen of the assistant interface 4094.
[0782] In some embodiments, the patient interface setting control 4154 may allow the assistant to change, on the patient interface 4050, a setting that cannot be changed by the patient. For example, the patient interface 4050 may be precluded from accessing a language setting to prevent a patient from inadvertently switching, on the patient interface 4050, the language used for the displays, whereas the patient interface setting control 4154 may enable the assistant to change the language setting of the patient interface 4050. In another example, the patient interface 4050 may not be able to change a font size setting to a smaller size in order to prevent a patient from inadvertently switching the font size used for the displays on the patient interface 4050 such that the display would become illegible to the patient, whereas the patient interface setting control 4154 may provide for the assistant to change the font size setting of the patient interface 4050.
[0783] The example overview display 4120 shown in FIG. 40 also includes an interface communications display 4156 showing the status of communications between the patient interface 4050 and one or more other devices 4070, 4082, 4084, such as the treatment apparatus 4070, the ambulation sensor 4082, and/or the goniometer 4084. The interface communications display 4156 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40. The interface communications display 4156 may take other forms, such as a separate screen or a popup window. The interface communications display 4156 may include controls for the assistant to remotely modify communications with one or more of the other devices 4070, 4082, 4084. For example, the assistant may remotely command the patient interface 4050 to reset communications with one of the other devices 4070, 4082, 4084, or to establish communications with a new one of the other devices 4070, 4082, 4084. This functionality may be used, for example, where the patient has a problem with one of the other devices 4070, 4082, 4084, or where the patient receives a new or a replacement one of the other devices 4070, 4082, 4084.
[0784] The example overview display 4120 shown in FIG. 40 also includes an apparatus control 4160 for the assistant to view and/or to control information regarding the treatment apparatus 4070. The apparatus control 4160 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40. The apparatus control 4160 may take other forms, such as a separate screen or a popup window. The apparatus control 4160 may include an apparatus status display 4162 with information regarding the current status of the apparatus. The apparatus status display 4162 may present information communicated to the assistant interface 4094 via one or more of the apparatus monitor signals 4099b. The apparatus status display 4162 may indicate whether the treatment apparatus 4070 is currently communicating with the patient interface 4050. The apparatus status display 4162 may present other current and/or historical information regarding the status of the treatment apparatus 4070.
[0785] The apparatus control 4160 may include an apparatus setting control 4164 for the assistant to adjust or control one or more aspects of the treatment apparatus 4070. The apparatus setting control 4164 may cause the assistant interface 4094 to generate and/or to transmit an apparatus control signal 4099 for changing an operating parameter of the treatment apparatus 4070 (e.g., a pedal radius setting, a resistance setting, a target RPM, etc.). The apparatus setting control 4164 may include a mode button 4166 and a position control 4168, which may be used in conjunction for the assistant to place an actuator 4078 of the treatment apparatus 4070 in a manual mode, after which a setting, such as a position or a speed of the actuator 4078, can be changed using the position control 4168. The mode button 4166 may provide for a setting, such as a position, to be toggled between automatic and manual modes. In some embodiments, one or more settings may be adjustable at any time, and without having an associated auto/manual mode. In some embodiments, the assistant may change an operating parameter of the treatment apparatus 4070, such as a pedal radius setting, while the patient is actively using the treatment apparatus 4070. Such "on the fly" adjustment may or may not be available to the patient using the patient interface 4050. In some embodiments, the apparatus setting control 4164 may allow the assistant to change a setting that cannot be changed by the patient using the patient interface 4050. For example, the patient interface 4050 may be precluded from changing a preconfigured setting, such as a height or a tilt setting of the treatment apparatus 4070, whereas the apparatus setting control 4164 may provide for the assistant to change the height or tilt setting of the treatment apparatus 4070.
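The interplay of the mode button 4166 and position control 4168 resembles a small guarded setting: adjustment is permitted only while in manual mode. The class below is an illustrative sketch under that assumption, not actual controller firmware.

```python
class ActuatorSetting:
    """A toggleable auto/manual setting that can be adjusted only in manual mode."""

    def __init__(self, value):
        self.value = value
        self.manual = False  # starts in automatic mode

    def toggle_mode(self):
        """Flip between automatic and manual modes (cf. mode button 4166)."""
        self.manual = not self.manual

    def set_value(self, value):
        """Adjust the setting (cf. position control 4168); refused in auto mode."""
        if not self.manual:
            raise PermissionError("switch to manual mode before adjusting")
        self.value = value

position = ActuatorSetting(value=0)
position.toggle_mode()   # assistant presses the mode button
position.set_value(30)   # then adjusts via the position control
print(position.value)    # -> 30
```

Settings without an associated auto/manual mode, as the paragraph also contemplates, would simply omit the guard in `set_value`.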
[0786] The example overview display 4120 shown in FIG. 40 also includes a patient communications control 4170 for controlling an audio or an audiovisual communications session with the patient interface 4050. The communications session with the patient interface 4050 may comprise a live feed from the assistant interface 4094 for presentation by the output device of the patient interface 4050. The live feed may take the form of an audio feed and/or a video feed. In some embodiments, the patient interface 4050 may be configured to provide
two-way audio or audiovisual communications with a person using the assistant interface 4094. Specifically, the communications session with the patient interface 4050 may include bidirectional (two-way) video or audiovisual feeds, with each of the patient interface 4050 and the assistant interface 4094 presenting video of the other one. In some embodiments, the patient interface 4050 may present video from the assistant interface 4094, while the assistant interface 4094 presents only audio or the assistant interface 4094 presents no live audio or visual signal from the patient interface 4050. In some embodiments, the assistant interface 4094 may present video from the patient interface 4050, while the patient interface 4050 presents only audio or the patient interface 4050 presents no live audio or visual signal from the assistant interface 4094.
[0787] In some embodiments, the audio or audiovisual communications session with the patient interface 4050 may take place, at least in part, while the patient is performing the rehabilitation regimen upon the body part. The patient communications control 4170 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40. The patient communications control 4170 may take other forms, such as a separate screen or a popup window. The audio and/or audiovisual communications may be processed and/or directed by the assistant interface 4094 and/or by another device or devices, such as a telephone system, or a videoconferencing system used by the assistant while the assistant uses the assistant interface 4094. Alternatively or additionally, the audio and/or audiovisual communications may include communications with a third party. For example, the system 4010 may enable the assistant to initiate a 3-way conversation regarding use of a particular piece of hardware or software, with the patient and a subject matter expert, such as a medical professional or a specialist. The example patient communications control 4170 shown in FIG. 40 includes call controls 4172 for the assistant to use in managing various aspects of the audio or audiovisual communications with the patient. The call controls 4172 include a disconnect button 4174 for the assistant to end the audio or audiovisual communications session. The call controls 4172 also include a mute button 4176 to temporarily silence an audio or audiovisual signal from the assistant interface 4094. In some embodiments, the call controls 4172 may include other features, such as a hold button (not shown). The call controls 4172 also include one or more record/playback controls 4178, such as record, play, and pause buttons to control, with the patient interface 4050, recording and/or playback of audio and/or video from the teleconference session.
The call controls 4172 also include a video feed display 4180 for presenting still and/or video images from the patient interface 4050, and a self-video display 4182 showing the current image of the assistant using the assistant interface. The self-video display 4182 may be presented as a picture-in-picture format, within a section of the video feed display 4180, as shown in FIG. 40. Alternatively or additionally, the self-video display 4182 may be presented separately and/or independently from the video feed display 4180.
[0788] The example overview display 4120 shown in FIG. 40 also includes a third party communications control 4190 for use in conducting audio and/or audiovisual communications with a third party. The third party communications control 4190 may take the form of a portion or region of the overview display 4120, as shown in FIG. 40. The third party communications control 4190 may take other forms, such as a display on a separate screen or a popup window. The third party communications control 4190 may include one or more controls, such as a contact list and/or buttons or controls to contact a third party regarding use of a particular piece of hardware or software, e.g., a subject matter expert, such as a medical professional or a specialist. The third party communications control 4190 may include conference calling capability for the third party to simultaneously communicate with both the assistant via the assistant interface 4094, and with the patient via the patient interface 4050. For example, the system 4010 may provide for the assistant to initiate a 3-way conversation with the patient and the third party.
[0789] FIG. 41 shows an example block diagram of training a machine learning model 4013 to output, based on data 4600 pertaining to the patient, a treatment plan 4602 for the patient according to the present disclosure. Data pertaining to other patients may be received by the server 4030. The other patients may have used various treatment apparatuses to perform treatment plans. The data may include characteristics of the other patients, the details of the treatment plans performed by the other patients, and/or the results of performing the treatment plans (e.g., a percent of recovery of a portion of the patients' bodies, an amount of recovery of a portion of the patients' bodies, an amount of increase or decrease in muscle strength of a portion of patients' bodies, an amount of increase or decrease in range of motion of a portion of patients' bodies, etc.).
[0790] As depicted, the data has been assigned to different cohorts. Cohort A includes data for patients having similar first characteristics, first treatment plans, and first results. Cohort B includes data for patients having similar second characteristics, second treatment plans, and
second results. For example, cohort A may include first characteristics of patients in their twenties without any medical conditions who underwent surgery for a broken limb; their treatment plans may include a certain treatment protocol (e.g., use the treatment apparatus 4070 for 30 minutes 5 times a week for 3 weeks, wherein values for the properties, configurations, and/or settings of the treatment apparatus 4070 are set to X (where X is a numerical value) for the first two weeks and to Y (where Y is a numerical value) for the last week).
[0791] Cohort A and cohort B may be included in a training dataset used to train the machine learning model 4013. The machine learning model 4013 may be trained to match a pattern between characteristics for each cohort and output the treatment plan that provides the result. Accordingly, when the data 4600 for a new patient is input into the trained machine learning model 4013, the trained machine learning model 4013 may match the characteristics included in the data 4600 with characteristics in either cohort A or cohort B and output the appropriate treatment plan 4602. In some embodiments, the machine learning model 4013 may be trained to output one or more excluded treatment plans that should not be performed by the new patient.
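The cohort-matching behavior described above can be illustrated with a minimal sketch. A real implementation would train the machine learning model 4013 on far richer data; here, as a stand-in assumption, a new patient's characteristics are matched to the nearest cohort centroid and that cohort's treatment plan is output. The cohort values and plan strings are hypothetical.

```python
# Hypothetical nearest-cohort matching: output the treatment plan of the
# cohort whose characteristic centroid is closest to the new patient's
# characteristics (squared Euclidean distance).

COHORTS = {
    "A": {"centroid": {"age": 25, "num_conditions": 0},
          "plan": "30 min, 5x/week, 3 weeks"},
    "B": {"centroid": {"age": 70, "num_conditions": 2},
          "plan": "15 min, 3x/week, 6 weeks"},
}

def match_treatment_plan(patient: dict) -> str:
    """Return the plan of the cohort whose centroid is closest to the patient."""
    def dist(centroid: dict) -> float:
        return sum((patient[k] - centroid[k]) ** 2 for k in centroid)
    best = min(COHORTS.values(), key=lambda c: dist(c["centroid"]))
    return best["plan"]
```

A trained model would also learn the excluded treatment plans mentioned above; this sketch only shows the matching step.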
[0792] FIG. 42 shows an embodiment of an overview display of the patient interface 4050 presenting a virtual avatar 4700 guiding the patient through an exercise session according to the present disclosure. The virtual avatar 4700 may be presented on the output device 4054 (e.g., display screen) of the patient interface 4050. As depicted, the virtual avatar 4700 may represent a person. In some embodiments, the person may be an actual individual, e.g., the medical professional, a professional athlete, the patient, a relative, a friend, a sibling, a celebrity, etc.; in other embodiments, the person may be fictional or constructed, e.g., a superhero, or the like. As discussed further herein, the virtual avatar may be any person, object, building, animal, being, alien, robot, or the like. For example, children may connect more strongly with animal animations to guide them through their exercise sessions. The virtual avatar 4700 may be selected by the patient and/or the medical professional from a library of virtual avatars stored on a database at the server 4030. In some embodiments, the virtual avatar 4700 may be uploaded into the database for shared or private use by the patient and/or the medical professional. For example, the library of virtual avatars may be stored at the system data store 4042 and/or the patient data store 4044. Once the virtual avatar 4700 has been selected for the patient, an identifier of the virtual avatar 4700 may be associated with an identifier of the patient in the system data store 4042 and/or the patient data store 4044.
[0793] The virtual avatar 4700 may perform one or more exercises specified in an exercise session of a treatment plan for the patient. As used throughout this disclosure, and for the avoidance of doubt, "exercises" may include, e.g., rehabilitation movements, high intensity interval training, strength training, range of motion training, or any body or physical movement capable of being performed on a treatment device specified in the treatment plan (including modifications or amendments or emendations thereto) or reasonably substituted for with a different treatment device. For example, one exercise may involve pedaling a stationary bicycle, and the virtual avatar 4700 may be animated as pedaling the bicycle in a desired manner for the patient. In some embodiments, the virtual avatar 4700 may include an actual video of a person performing the exercise session. Thus, the virtual avatar 4700 may be generated by the server 4030 and/or include audio, video, audiovisual and/or multimedia data of a real person performing the exercise session. In some embodiments, the virtual avatar 4700 may represent a medical professional, such as a physical therapist, a coach, a trainer, etc.
[0794] In some embodiments, the virtual avatar 4700 may be controlled by one or more machine learning models 4013 of the artificial intelligence engine 4011. For example, the one or more machine learning models 4013 may be trained based on historical data and real-time or near-time data. The data used to train the machine learning models 4013 may include previous feedback received from users (e.g., pain levels), characteristics of the patients at various points in their treatment plans (e.g., heartrate, blood pressure, temperature, perspiration rate, etc.), sensor measurements (e.g., pressure on pedals, range of motion, speed of the motor of the treatment apparatus 4070, etc.) received as the patients performed their treatment plans, and/or the results achieved by the patients after certain operations are performed (e.g., initiating a telemedicine session with a multimedia feed of the medical professional, replacing the virtual avatar 4700 with the multimedia feed of the medical professional, emoting certain auditory statements, presenting certain visuals on the output device 4054, changing a parameter of the exercise session (e.g., reducing an amount of resistance provided by the treatment apparatus 4070), etc.).
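The avatar-control behavior described above can be approximated by a rule-based stand-in for the trained machine learning models 4013. The thresholds and operation names below are hypothetical assumptions, not values from the disclosure; a trained model would learn these decisions from the historical data described above.

```python
# Rule-based stand-in for the models' avatar-control output: given patient
# feedback and a sensor measurement, choose an operation such as replacing
# the avatar with the medical professional's feed, reducing resistance, or
# emoting an encouraging statement. All thresholds are illustrative.

def avatar_control(pain_level: int, pedal_pressure: float,
                   target_pressure: float) -> str:
    if pain_level >= 8:
        return "replace_avatar_with_professional_feed"
    if pain_level >= 5:
        return "reduce_resistance"
    if pedal_pressure < 0.8 * target_pressure:
        return "emote_encouraging_statement"
    return "continue_exercise"
```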
[0795] The output device 4054 also presents a self-video section 4702 that presents video of the patient obtained from a camera of the patient interface 4050. The self-video may be used by the patient to verify whether they are using proper form, cadence, consistency or any other observable or measurable quality or quantity while performing an exercise session. While the patient performs the exercise session, the video obtained from the camera of the patient interface 4050 may be transmitted to the assistant interface 4094 for presentation. A medical professional may view the assistant interface 4094 presenting the video of the patient and determine whether to intervene by speaking to the patient via their patient interface 4050 and/or to replace the virtual avatar 4700 with a multimedia feed from the assistant interface 4094.
[0796] The output device 4054 also presents a graphical user interface (GUI) object 4704. The GUI object 4704 may be an element that enables the user to provide feedback to the server 4030. For example, the GUI object 4704 may present a scale of values representing a level of pain the patient is currently experiencing, and the GUI object 4704 may enable a patient to select a value representing their level of pain. The selection may cause a message to be transmitted to the server 4030. In some embodiments, as described further herein, the message, including the level of pain, may pertain to a trigger event. The server 4030 may determine whether the level of pain experienced by the patient exceeds a certain threshold severity level. If the level of pain exceeds the certain threshold severity level, then the server 4030 may pause the virtual avatar 4700 and/or replace the virtual avatar 4700 with an audio, visual, audio-visual or multimedia feed from the computing device of a medical professional.
[0797] FIG. 43 shows an embodiment of the overview display 4120 of the assistant interface 4094 receiving a notification pertaining to the patient and enabling the assistant (e.g., medical professional) to initiate a telemedicine session in real-time according to the present disclosure. As depicted, the overview display 4120 includes a section for the patient profile 4130. The patient profile 4130 presents information pertaining to the treatment plan being performed by the patient "John Doe." The treatment plan 4800 indicates that "John Doe is cycling on the treatment apparatus for 5 miles. The pedals of the treatment apparatus are configured to provide a range of motion of 45 degrees." The overview display 4120 also includes a notification 4802 that is received due to a trigger event. The notification presents "John Doe indicated he is experiencing a high level of pain during the exercise." The overview display 4120 also includes a prompt 4804 for a medical professional using the assistant interface 4094. The prompt asks, "Initiate telemedicine session?" The overview display 4120 includes a graphical element (e.g., button) 4806 that is configured to enable the medical professional to use a wired or wireless input peripheral (e.g., touchscreen, mouse, keyboard, microphone, etc.) to select to initiate the telemedicine session. Although the above example details the overview display 4120 of the assistant interface 4094 presenting information in the form of text, an alternate or additional way of presenting that information can be in the form of graphs, charts, or the like.
[0798] The assistant (e.g., medical professional) using the assistant interface 4094 (e.g., computing device) during the telemedicine session may be presented in the self-video 4182 in a portion of the overview display 4120 (e.g., user interface presented on a display screen 4024 of the assistant interface 4094). The assistant interface 4094 may also present, in the same portion of the overview display 4120 as the self-video, a video from the patient in the video feed display 4180. Further, the video feed display 4180 may also include a graphical user interface (GUI) object 4808 (e.g., a button) that enables the medical professional to share, in real-time or near real-time during the telemedicine session, a treatment plan with the patient on the patient interface 4050, to control an operational parameter of the treatment apparatus 4070, or the like.
[0799] FIG. 44 shows an embodiment of an overview display presented by the output device 4054 of the patient interface 4050. The output device 4054 presents, in real-time during a telemedicine session, a feed 4900 (e.g., multimedia preferably including audio, video, or both) of the medical professional that replaced the virtual avatar according to the present disclosure. In some embodiments, the virtual avatar may remain presented on the patient interface 4050, but in a paused state, and the feed may preferably be limited to only audio when the medical professional speaks to the patient. In some embodiments, as depicted, the feed 4900 may replace the virtual avatar. The feed may enable the medical professional and the patient to engage in a telemedicine session where the medical professional talks to the patient and inquires about their pain level, their characteristics (e.g., heartrate, perspiration rate, etc.), and/or one or more sensor measurements (e.g., pressure on pedals, range of motion, etc.).
[0800] It should be understood that the medical professional may be viewing, monitoring, treating, diagnosing, etc. numerous patients on the assistant interface 4094 at the same time. For example, as discussed further below, each patient may be presented in a respective portion of the user interface of the assistant interface 4094. Each respective portion may present a variety of information pertaining to the respective patient. For example, each portion may present a feed of the patient performing the exercise session using the treatment apparatus, characteristics of the patient, the treatment plan for the patient, sensor measurements, and the like. The user interface of the assistant interface 4094 may be configured to enable the medical professional to select one or more patients to cause the virtual avatar guiding the one or more patients through an exercise to be paused and/or replaced in real-time or near real-time.
[0801] Upon completion of the telemedicine session between the patient interface 4050 and the assistant interface 4094, the feed from the computing device of the medical professional may be replaced with the virtual avatar on the patient interface 4050. The virtual avatar may continue guiding the patient through the exercise session wherever and/or whenever the exercise session was paused due to the initiation of the trigger event. The assistant interface 4094 may resume viewing the feed of the patient performing the exercise session and/or information pertaining to the patient.
[0802] FIG. 45 shows an example embodiment of a method 41000 for replacing, based on a trigger event occurring, a virtual avatar with a feed of a medical professional according to the present disclosure. The method 41000 is performed by processing logic that may include hardware (circuitry, dedicated logic, etc.), software (such as is run on a general-purpose computer system or a dedicated machine), or a combination of both. The method 41000 and/or each of its individual functions, routines, other methods, scripts, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any component of FIGURE 36, such as server 4030 executing the artificial intelligence engine 4011). In certain implementations, the method 41000 may be performed by a single processing thread. Alternatively, the method 41000 may be performed by two or more processing threads, each thread implementing one or more individual functions or routines; or other methods, scripts, subroutines, or operations of the methods.
[0803] For simplicity of explanation, the method 41000 is depicted and described as a series of operations. However, operations in accordance with this disclosure can occur in various orders and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 41000 may occur in combination with any other operation of any other method disclosed herein. Furthermore, not all illustrated operations may be required to implement the method 41000 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 41000 could alternatively be represented as a series of interrelated states via a state diagram, a directed graph, a deterministic finite state automaton, a non-deterministic finite state automaton, a Markov diagram, or an event diagram.
[0804] At 41002, the processing device may provide, to a computing device (e.g., patient interface 4050) of the patient, a virtual avatar to be presented on the computing device of the patient. The virtual avatar may be configured to use a virtual representation of the treatment apparatus 4070 to guide the patient through an exercise session. The virtual avatar may be configured to use audio, video, haptic feedback, or some combination thereof, to guide the patient through the exercise session. The audio, video, haptic feedback, or some combination thereof, may be provided by the computing device of the patient. In some embodiments, the processing device may determine, based on a treatment plan for a patient, the exercise session to be performed. The treatment apparatus 4070 may be configured to be used by the patient performing the exercise session.
[0805] In some embodiments, prior to providing the virtual avatar, the processing device may transmit, to the computing device of the patient, a notification to initiate the exercise session. The notification may include a push notification, a text message, a phone call, an email, or some combination thereof. The notification may be transmitted based on a schedule specified in the treatment plan. The schedule may include dates and times to perform exercise sessions, durations for performing the exercise sessions, exercises to perform during the exercise sessions, configurations of parts (e.g., pedals, seat, etc.) of the treatment apparatus 4070, and the like. The processing device may receive, from the computing device of the patient, a selection to initiate the exercise session for using the treatment apparatus 4070. The processing device may transmit, to the treatment apparatus 4070, a control signal to cause the treatment apparatus 4070 to initiate the exercise session. Responsive to transmitting the control signal, the processing device may provide the virtual avatar to the computing device of the patient.
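The sequence described above (notification, patient selection, control signal, then the virtual avatar) can be sketched as follows. The message shapes and the send/receive callables are hypothetical assumptions standing in for the actual transport between the server, the patient interface, and the treatment apparatus.

```python
# Hypothetical sketch of the sequence in paragraph [0805]: transmit a
# scheduled notification, receive the patient's selection to initiate, send
# a control signal to the treatment apparatus, then provide the virtual
# avatar to the patient's computing device.

def start_exercise_session(send, receive) -> bool:
    send("patient", {"type": "notification", "body": "Time for your exercise session"})
    selection = receive("patient")  # patient selects whether to initiate
    if selection.get("initiate"):
        send("treatment_apparatus", {"type": "control_signal", "action": "start"})
        send("patient", {"type": "virtual_avatar"})
        return True
    return False

# Example wiring with in-memory stand-ins for the transport layer:
sent = []
started = start_exercise_session(
    send=lambda dest, msg: sent.append((dest, msg)),
    receive=lambda src: {"initiate": True},
)
```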
[0806] The virtual avatar may be associated with a medical professional, such as the medical professional that prescribed or generated the treatment plan for the patient to perform. In some embodiments, the treatment plan may be wholly or partially designed and/or generated by the medical professional. In some embodiments, the treatment plan may be wholly or partially generated by the artificial intelligence engine 4011, and the medical professional may review the treatment plan and/or modify the treatment plan before it is transmitted to the patient to be performed.
[0807] The virtual avatar may represent a proxy medical professional and may be a person, being, thing, software or electronic bot, object, etc. that guides one or more patients through a treatment plan. The virtual avatar may guide numerous patients through treatment plans at various stages of their rehabilitation, prehabilitation, recovery, etc. It should be noted that at any time the virtual avatar may be replaced by a feed (e.g., live audio, audiovisual, etc.) of the medical professional, wherein the feed is transmitted, either directly to or indirectly through the server 4030, from the assistant interface 4094 to the patient interface 4050. The feed may be a stream of data packets (e.g., audio, video, or both) obtained via a camera and/or microphone associated with the assistant interface 4094 in real-time or near real-time. The feed may be presented on the user interface 4054 of the patient interface 4050.
[0808] In some embodiments, the virtual avatar may be initially selected by the medical professional. The virtual avatar may be a file stored in a virtual avatar library, and the medical professional may select the virtual avatar from the virtual avatar library. For example, the virtual avatar may be a life-like representation of a person (e.g., male, female, non-binary, or any other gender with which the person identifies). In some embodiments, the virtual avatar may be a life-like or virtual representation of an animal (e.g., tiger, lion, unicorn, rabbit, etc.), which may be more enjoyable and more motivational to younger people (e.g., kids). In some embodiments, the virtual avatar may be a life-like or virtual representation of a robot, alien, etc.
[0809] In some embodiments, the medical professional may design their own virtual avatar. For example, the medical professional may be provided with a user interface on their assistant interface 4094, and the user interface may provide user interface elements that enable configuration of a virtual avatar. The medical professional may use the user interface to generate a virtual avatar that looks like their own self or any suitable person.
[0810] In some embodiments, the virtual avatar may be selected by the patient. In some embodiments, the selected virtual avatar may be associated with the patient (e.g., via an identifier of the patient and an identifier of the virtual avatar) and stored in a database. For example, some patients may have a preference for certain virtual avatars over other virtual avatars. In some embodiments, different virtual avatars may guide patients through the same or different treatment plan. With respect to any treatment plan referenced herein, different aspects, portions or configurations of the treatment plan may further be guided by more than one virtual or physical avatar, wherein each such avatar is associated with a particular aspect, portion or configuration of the treatment plan, and every other avatar, to the extent applicable, is associated with a disjoint aspect, portion or configuration of the treatment plan. In other embodiments, more than one avatar, physical and/or virtual, may be present at the same time but performing different functions within the particular aspect, portion or configuration of the treatment plan. In group therapy sessions, numerous patients may be performing the same treatment plan, and each patient may have their own patient interface 4050 that concurrently presents the same or a different virtual avatar or avatars guiding the patient through the treatment plan.
[0811] As the patients perform the treatment plan, the medical professional may be able to view the numerous patients in different tiles on the user interface of the assistant interface 4094. The term "tiles" may refer to squares that each include a feed from the respective patient interfaces 4050 of the patient as the patient performs the treatment plan, a feed of the characteristics of the patient (e.g., heartrate, blood pressure, temperature, etc.), a feed of measurements (e.g., pressure exerted on the pedals, range of motion determined by the goniometer, number of steps, speed of the motor of the treatment apparatus 4070, etc.), or some combination thereof. As such, using the disclosed embodiments, the medical professional may be enabled to manage, monitor, and/or treat numerous patients at a time. Computing resources may be saved by having one medical professional treat numerous patients at the same time because just the assistant interface 4094 is used to view, treat, manage, monitor, etc. the numerous patients as they perform the treatment plan.
[0812] At 41004, the processing device may receive, from the computing device of the patient, a message pertaining to a trigger event. In some embodiments, the message may include data pertaining to a pain level of the patient, a characteristic of the patient, a measurement of a sensor, or some combination thereof. The trigger event may refer to any event associated with the data pertaining to the pain level of the patient, the characteristic of the patient (e.g., heartrate, blood pressure, temperature, perspiration rate, etc.), the measurement of the sensor (e.g., pressure, range of motion, speed, etc.), or some combination thereof.
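One possible shape for the trigger-event message described above is sketched below: a pain level, patient characteristics, and sensor measurements, any of which may be present. The field names and example values are hypothetical assumptions, not a format specified by the disclosure.

```python
# Hypothetical trigger-event message carrying the data named in [0812]:
# pain level, characteristics of the patient, and sensor measurements.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TriggerEventMessage:
    patient_id: str
    pain_level: Optional[int] = None  # e.g., on a 1-10 scale
    characteristics: dict = field(default_factory=dict)  # heartrate, blood pressure, ...
    sensor_measurements: dict = field(default_factory=dict)  # pedal pressure, range of motion, ...

# Example message a patient interface might transmit to the server:
msg = TriggerEventMessage(
    patient_id="john-doe",
    pain_level=8,
    characteristics={"heartrate_bpm": 112},
    sensor_measurements={"range_of_motion_deg": 45},
)
```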
[0813] In some instances, the virtual avatar may guide the patient through the treatment plan as a prerecorded animation, and the medical professional may not be actively engaged in a telemedicine session with the patient as they perform the treatment plan. In some embodiments, when the trigger event occurs, a notification may be transmitted to the computing device of the medical professional, where such device alerts the medical professional about the notification. The medical professional may use a wired or wireless input peripheral (e.g., touchscreen, mouse, keyboard, microphone) to select the notification and to initiate a telemedicine session with the computing device of the patient. Such a technique may minimize or otherwise optimize the use and/or cost and/or risk profile of one or more computing resources, such as network resources (e.g., bandwidth), by initiating the telemedicine session only when the notification is selected and not throughout the entirety of the treatment plan. In other embodiments, the computing device of the medical professional and the computing device of the patient may be continuously or continually engaged in a telemedicine session as the one or more patients perform the treatment plan. The trigger event may enable the medical professional to intervene and/or replace the virtual avatar, pause the virtual avatar, or both. In some embodiments, while the patients are performing the treatment plan, the medical professional may selectively choose one or more of the patients to cause the virtual avatar on those one or more patients' computing devices to be replaced, paused, etc. Such a technique may enable the medical professional to intervene for some of the patients, but not for all of the patients, as they perform the treatment plan. For example, the medical professional may select patients not hitting target thresholds (e.g., pressure, range of motion, speed, etc.)
in the treatment plan, select patients indicating they are experiencing a threshold level of pain, or the like.
[0814] In some embodiments, the virtual avatar may be controlled, in real-time or near real-time, by one or more machine learning models trained to receive input, including sensor data (e.g., pressure measurements from the pedals, range of motion measurements from a goniometer, speed data, etc.), characteristics of the patient (e.g., perspiration rate, heartrate, blood pressure, temperature, arterial blood gas and/or oxygenation levels or percentages, etc.), real-time feedback from the patient or other patients (e.g., indication of pain level), or some combination thereof. The one or more machine learning models may produce an output that controls the virtual avatar. For example, the output may modify the way the virtual avatar performs a particular exercise (e.g., pedals faster or slower on a treatment apparatus 4070), cause the virtual avatar to say encouraging statements (e.g., "You got this," "Keep it up," etc.), or the like. The modifications may be based on training data of other patients, where such training data indicates the modifications result in a desired patient performance and/or result for the other patients, or an increase in the probability of achieving the desired
patient performance or result. For example, the training data may indicate that providing certain audio and/or video when certain sensor data is detected may lead to the patient exerting more force on the pedal, thereby strengthening their leg muscles according to the treatment plan.
[0815] At 41006, the processing device may determine whether a severity level of the trigger event exceeds a threshold severity level. The threshold severity level may be any suitable amount, value, indicator, etc. For example, in one embodiment, the threshold severity level may be a certain level of pain the patient is in. At any time during an exercise session, the patient may use any input peripheral of the patient interface 4050 to express their level of pain. For example, the patient may touch a button on the touchscreen of the patient interface 4050 and the button may indicate the patient is experiencing a pain level of 8 on a scale from 1 to 10, where 1 is the least amount of pain and 10 is the most amount of pain. In this example, the threshold severity level may be a pain level of 5. Accordingly, the level of pain (8) the patient is experiencing exceeds the threshold severity level (5). In some embodiments, the threshold severity level may relate to the amount of force the patient is exerting on the pedals, a range of motion the patient is able to achieve during pedaling, a speed the patient is able to achieve, a duration of a range of motion and/or speed the patient is able to achieve, or the like. For example, the threshold severity level may be configured based on the patient pedaling at a certain range of motion for a certain period of time, and if the patient fails to achieve the certain range of motion in the certain period of time, or variations thereof of such goals, during an exercise session, then the threshold severity level may be exceeded.
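The comparison at 41006 can be sketched using the examples above: a threshold severity level of 5 on a 1-10 pain scale, and a range-of-motion goal the patient may fail to achieve. Both thresholds are illustrative values from the worked example, not prescribed constants.

```python
# Sketch of the threshold comparison at 41006: the threshold severity level
# is exceeded when reported pain exceeds 5 on a 1-10 scale, or when the
# patient fails to reach a configured range-of-motion target.

PAIN_THRESHOLD = 5  # pain scale: 1 (least) to 10 (most)

def pain_trigger_exceeded(pain_level: int) -> bool:
    return pain_level > PAIN_THRESHOLD

def range_of_motion_trigger_exceeded(achieved_deg: float, target_deg: float) -> bool:
    # Exceeded when the patient fails to reach the target range of motion
    # during the exercise session.
    return achieved_deg < target_deg
```

In the paragraph's example, a reported pain level of 8 exceeds the threshold of 5, so the trigger would fire.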
[0816] At 41008, responsive to determining the severity level of the trigger event exceeds the threshold severity level, the processing device may replace, on the computing device of the patient, the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device (e.g., assistant interface 4094) of the medical professional. In some embodiments, replacing the virtual avatar with the multimedia feed may initiate a telemedicine session between the patient and the medical professional. The processing device may receive, from the computing device of the patient or the medical professional, a second message indicating the telemedicine session is complete, and the processing device may replace, on the computing device of the patient, the presentation of the multimedia feed with the presentation of the virtual avatar. The virtual avatar may be configured to continue to guide the patient through the exercise session to completion. That is, the exercise session and/or the virtual avatar may be paused at a certain timestamp when the multimedia feed of the medical professional replaces the virtual avatar, and the exercise session and/or the virtual avatar may initiate playback at the certain timestamp when the telemedicine session is completed.
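The pause-and-resume behavior in paragraph [0816], where the avatar is paused at a certain timestamp when the clinician's multimedia feed takes over and resumes there when the telemedicine session completes, can be sketched as a small state holder. The class and attribute names are illustrative, not taken from the patent:

```python
class SessionPresenter:
    """Tracks what the patient's computing device is presenting and where
    the avatar-led exercise session was paused."""

    def __init__(self):
        self.showing = "avatar"   # current presentation on the patient device
        self.paused_at = None     # playback timestamp (seconds) at pause

    def start_telemedicine(self, timestamp_s: float) -> None:
        # Replace the virtual avatar with the medical professional's
        # multimedia feed and remember the playback position.
        self.paused_at = timestamp_s
        self.showing = "multimedia_feed"

    def end_telemedicine(self) -> float:
        # Second message received: restore the avatar and resume playback
        # at the saved timestamp so the session continues to completion.
        self.showing = "avatar"
        resume_at = self.paused_at
        self.paused_at = None
        return resume_at
```

The same object could drive the exercise session playback itself, but the patent leaves that mechanism open.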
[0817] In some embodiments, at 41010, responsive to determining that the severity level of the trigger event does not exceed the threshold severity level, the processing device may provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient. In some embodiments, responsive to determining that the severity level of the trigger event does not exceed the threshold severity level, the processing device may continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
[0818] In some embodiments, the processing device may determine, based on a second treatment plan for a second patient, the exercise session to be performed, where the performance by the second patient uses a second treatment apparatus. The processing device may present, on a second computing device (e.g., patient interface 4050) of the second patient, the virtual avatar configured to guide the patient to use the treatment apparatus through the exercise session. In some embodiments, while the presentation of the virtual avatar is replaced on the computing device of the patient with the presentation of the multimedia feed from the computing device of the medical professional, the virtual avatar may remain presented on the second computing device of the second patient. In some embodiments, while the presentation of the virtual avatar is replaced on the computing device of the patient with the presentation of the multimedia feed from the computing device of the medical professional, the virtual avatar may be replaced on the second computing device of the second patient with the multimedia feed from the computing device of the medical professional.
[0819] FIG. 46 shows an example embodiment of a method for providing a virtual avatar according to the present disclosure. Method 41100 includes operations performed by processors of a computing device (e.g., any component of FIG. 36, such as server 4030 executing the artificial intelligence engine 4011). In some embodiments, one or more operations of the method 41100 are implemented in computer instructions stored on a memory device and executed by a processing device. The method 41100 may be performed in the same or a similar manner as described above in regard to method 41000. The operations of the method 41100 may be performed in some combination with any of the operations of any of the methods described herein. The method 41100 may include further operations associated with 41002 in method 41000 related to providing the virtual avatar to the computing device of the patient.
[0820] At 41102, the processing device may retrieve data associated with the exercise session. The data may include instructions implementing a virtual model that animates one or more movements associated with the exercise session. The virtual model may be two-dimensional, three-dimensional, or n-dimensional (in terms of animations or projections onto a 3-D virtual environment or a 2-D layout). In some embodiments, the virtual model may be a mesh model animation, contour animation, virtual human animation, skeletal animation, etc. For example, the virtual model may be a surface representation (referred to as the mesh) used to draw a character (e.g., medical professional), and a hierarchical set of interconnected parts. The virtual model may use a virtual armature to animate (e.g., pose and key frame) the mesh. As used herein, an armature may refer to a kinematic chain used in computer animation to simulate the motions of virtual human or animal characters (e.g., virtual avatars). Various types of virtual armatures may be used, such as keyframing (stop-motion) armatures and real-time (puppeteering) armatures.
[0821] At 41104, the processing device may retrieve data associated with the virtual avatar. The data associated with the virtual avatar may include which virtual avatar is selected by the patient and/or the medical professional, wherein such selection is made to guide the patient through the exercise session. In some embodiments, the data associated with the virtual avatar may include an identifier associated with the virtual avatar. The identifier may be used to retrieve the data associated with the virtual avatar from a database. For example, the patient may have selected a superhero to be their virtual avatar. Accordingly, data pertaining to the particular superhero (e.g., gender, costume, appearance, etc.) may be retrieved from the database.
[0822] At 41106, the processing device may map the data associated with the virtual avatar onto the virtual model that animates the one or more movements associated with the exercise session. For example, the appearance and shape of the virtual avatar may be mapped onto the mesh of the virtual model (e.g., face to a head portion of the mesh) and manipulated and/or animated according to instructions related to the exercise session and/or the virtual avatar. In some embodiments, the virtual avatar may perform one or more exercises using the treatment apparatus 4070. The performance of the exercises by the virtual avatar may be animated and/or presented on a display screen of the computing device of the patient to guide the patient through the exercise session. As disclosed herein, at any time, the virtual avatar may be replaced and/or paused to enable presentation of a multimedia feed of a medical professional.
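Steps 41102-41106 amount to looking up avatar data by identifier and binding it to the virtual model's mesh regions. A minimal sketch follows; the database contents, record fields, and function names are hypothetical, and a real implementation would bind textures and geometry to an armature-driven mesh rather than dictionaries:

```python
# Hypothetical avatar records keyed by the identifier stored per patient.
AVATAR_DB = {
    "superhero-01": {"gender": "female", "costume": "red", "appearance": "caped"},
}

def retrieve_avatar(avatar_id: str) -> dict:
    """Step 41104: retrieve the data associated with the selected virtual
    avatar from the database using its identifier."""
    return AVATAR_DB[avatar_id]

def map_avatar_to_model(avatar: dict, virtual_model: dict) -> dict:
    """Step 41106: map the avatar data onto the virtual model so each mesh
    region (e.g., face onto the head portion of the mesh) is drawn with the
    avatar's appearance while the model animates the exercise movements."""
    return {region: {"mesh": mesh, "skin": avatar}
            for region, mesh in virtual_model["mesh_regions"].items()}
```

With this shape, the animation layer iterates the mapped regions each frame while the armature poses the underlying mesh.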
[0823] FIG. 47 shows an example computer system 41200 which can perform any one or more of the methods described herein, in accordance with one or more aspects of the present disclosure. In one example, computer system 41200 may include a computing device and correspond to the assistant interface 4094, reporting interface 4092, supervisory interface 4090, clinician interface 4020, server 4030 (including the AI engine 4011), patient interface 4050, ambulatory sensor 4082, goniometer 4084, treatment apparatus 4070, pressure sensor 4086, or any suitable component of FIG. 36. The computer system 41200 may be capable of executing instructions implementing the one or more machine learning models 4013 of the artificial intelligence engine 4011 of FIG. 36. The computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the Internet, including via the cloud or a peer-to-peer network. The computer system may operate in the capacity of a server in a client-server network environment. The computer system may be a personal computer (PC), a tablet computer, a wearable (e.g., wristband), a set-top box (STB), a personal digital assistant (PDA), a mobile phone, a camera, a video camera, an Internet of Things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that device. Further, while only a single computer system is illustrated, the term "computer" shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
[0824] The computer system 41200 includes a processing device 41202, a main memory 41204 (e.g., read-only memory (ROM), flash memory, solid state drives (SSDs), dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM)), a static memory 41206 (e.g., flash memory, solid state drives (SSDs), static random access memory (SRAM)), and a data storage device 41208, which communicate with each other via a bus 41210.
[0825] Processing device 41202 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 41202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word
(VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processing device 41202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a system on a chip, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 41202 is configured to execute instructions for performing any of the operations and steps discussed herein.
[0826] The computer system 41200 may further include a network interface device 41212. The computer system 41200 also may include a video display 41214 (e.g., a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), a quantum LED, a cathode ray tube (CRT), a shadow mask CRT, an aperture grille CRT, a monochrome CRT), one or more input devices 41216 (e.g., a keyboard and/or a mouse or a gaming-like control), and one or more speakers 41218 (e.g., a speaker). In one illustrative example, the video display 41214 and the input device(s) 41216 may be combined into a single component or device (e.g., an LCD touch screen).
[0827] The data storage device 41208 may include a computer-readable medium 41220 on which the instructions 41222 embodying any one or more of the methods, operations, or functions described herein are stored. The instructions 41222 may also reside, completely or at least partially, within the main memory 41204 and/or within the processing device 41202 during execution thereof by the computer system 41200. As such, the main memory 41204 and the processing device 41202 also constitute computer-readable media. The instructions 41222 may further be transmitted or received over a network via the network interface device 41212.
[0828] While the computer-readable storage medium 41220 is shown in the illustrative examples to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
[0829] Clause 65. A computer-implemented system, comprising:
[0830] a treatment apparatus configured to be manipulated by a patient while performing an exercise session;
[0831] a patient interface configured to receive a virtual avatar, wherein the patient interface comprises an output device configured to present the virtual avatar, wherein the virtual avatar uses a virtual representation of the treatment apparatus to guide the patient through an exercise session, and wherein the virtual avatar is associated with a medical professional; and
[0832] a server computing device configured to:
[0833] provide the virtual avatar of the patient to the patient interface,
[0834] receive, from the patient interface, a message pertaining to a trigger event, and wherein the message comprises a severity level of the trigger event,
[0835] determine whether a severity level of the trigger event exceeds a threshold severity level, and
[0836] responsive to determining that the severity level of the trigger event exceeds the threshold severity level, replace on the patient interface the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device of the medical professional.
[0837] Clause 66. The computer-implemented system of any clause herein, wherein the server computing device is further to:
[0838] responsive to determining that the severity level of the trigger event does not exceed the threshold severity level:
[0839] provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or
[0840] continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
[0841] Clause 67. The computer-implemented system of any clause herein, wherein the virtual avatar is controlled, in real-time or near real-time, by one or more machine learning models trained to:
[0842] receive input comprising sensor data, characteristics of the patient, real-time feedback from the patient or other patients, or some combination thereof, and
[0843] produce an output that controls the virtual avatar.
[0844] Clause 68. The computer-implemented system of any clause herein, wherein providing the virtual avatar further comprises:
[0845] retrieving data associated with the exercise session, wherein the data comprises instructions implementing a virtual model that animates one or more movements associated with the exercise session;
[0846] retrieving data associated with the virtual avatar; and
[0847] mapping the data associated with the virtual avatar onto the virtual model that animates the one or more movements associated with the exercise session.
[0848] Clause 69. The computer-implemented system of any clause herein, wherein prior to providing the virtual avatar, the server computing device is further to:
[0849] transmit, to the patient interface, a notification to initiate the exercise session, wherein the notification is transmitted based on a schedule specified in the treatment plan;
[0850] receive, from the patient interface, a selection to initiate the exercise session for using the treatment apparatus;
[0851] transmit, to the treatment apparatus, a control signal to cause the treatment apparatus to initiate the exercise session; and
[0852] responsive to transmitting the control signal, provide the virtual avatar to the patient interface.
[0853] Clause 70. The computer-implemented system of any clause herein, wherein the notification comprises a push notification, a text message, a phone call, an email, or some combination thereof.
[0854] Clause 71. The computer-implemented system of any clause herein, wherein the server computing device is further to:
[0855] determine, based on a second treatment plan for a second patient, the exercise session to be performed, wherein the performance by the second patient uses a second treatment apparatus;
[0856] present, on a second patient interface of the second patient, the virtual avatar configured to guide the patient to use the treatment apparatus through the exercise session, wherein:
[0857] while the presentation of the virtual avatar is replaced on the patient interface with the presentation of the multimedia feed from the computing device of the medical professional, the virtual avatar remains presented on the second patient interface, or
[0858] while the presentation of the virtual avatar is replaced on the patient interface with the presentation of the multimedia feed from the computing device of the medical professional, the virtual avatar is replaced on the second patient interface with the multimedia feed from the computing device of the medical professional.
[0859] Clause 72. The computer-implemented system of any clause herein, wherein the server computing device is further to:
[0860] receive, from the patient interface, a selection of the virtual avatar from a library of virtual avatars; and
[0861] store the virtual avatar associated with the patient in a database.
[0862] Clause 73. The computer-implemented system of any clause herein, wherein the virtual avatar is configured to use audio, video, haptic feedback, or some combination thereof to guide the patient through the exercise session.
[0863] Clause 74. A method comprising:
[0864] providing, to a computing device of a patient, a virtual avatar to be presented on the computing device of the patient, wherein the virtual avatar is configured to use a virtual representation of a treatment apparatus to guide the patient through an exercise session, and the virtual avatar is associated with a medical professional;
[0865] receiving, from the computing device of the patient, a message pertaining to a trigger event;
[0866] determining whether a severity level of the trigger event exceeds a threshold severity level; and
[0867] responsive to determining that the severity level of the trigger event exceeds the threshold severity level, replacing, on the computing device of the patient, the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device of the medical professional.
[0868] Clause 75. The method of any clause herein, further comprising:
[0869] responsive to determining that the severity level of the trigger event does not exceed the threshold severity level:
[0870] providing control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or
[0871] continuing to provide control of the virtual avatar to the computing device of the
medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
[0872] Clause 76. The method of any clause herein, wherein the virtual avatar is controlled, in real-time or near real-time, by one or more machine learning models trained to:
[0873] receive input comprising sensor data, characteristics of the patient, real-time feedback from the patient or other patients, or some combination thereof, and
[0874] produce an output that controls the virtual avatar.
[0875] Clause 77. The method of any clause herein, wherein providing the virtual avatar further comprises:
[0876] retrieving data associated with the exercise session, wherein the data comprises instructions implementing a virtual model that animates one or more movements associated with the exercise session;
[0877] retrieving data associated with the virtual avatar; and
[0878] mapping the data associated with the virtual avatar onto the virtual model that animates the one or more movements associated with the exercise session.
[0879] Clause 78. The method of any clause herein, wherein prior to providing the virtual avatar, the method further comprises:
[0880] transmitting, to the computing device of the patient, a notification to initiate the exercise session, wherein the notification is transmitted based on a schedule specified in the treatment plan;
[0881] receiving, from the computing device of the patient, a selection to initiate the exercise session for using the treatment apparatus;
[0882] transmitting, to the treatment apparatus, a control signal to cause the treatment apparatus to initiate the exercise session; and
[0883] responsive to transmitting the control signal, providing the virtual avatar to the computing device of the patient.
[0884] Clause 79. The method of any clause herein, wherein the notification comprises a push notification, a text message, a phone call, an email, or some combination thereof.
[0885] Clause 80. The method of any clause herein, further comprising:
[0886] determining, based on a second treatment plan for a second patient, the exercise session to be performed, wherein the performance by the second patient uses a second treatment apparatus;
[0887] presenting, on a second computing device of the second patient, the virtual avatar configured to guide the patient to use the treatment apparatus through the exercise session, wherein:
[0888] while the presentation of the virtual avatar is replaced on the computing device of the patient with the presentation of the multimedia feed from the computing device of the medical professional, the virtual avatar remains presented on the second computing device, or
[0889] while the presentation of the virtual avatar is replaced on the computing device of the patient with the presentation of the multimedia feed from the computing device of the medical professional, the virtual avatar is replaced on the second computing device with the multimedia feed from the computing device of the medical professional.
[0890] Clause 81. The method of any clause herein, further comprising:
[0891] receiving, from the computing device of the patient, a selection of the virtual avatar from a library of virtual avatars; and
[0892] storing the virtual avatar associated with the patient in a database.
[0893] Clause 82. The method of any clause herein, wherein the virtual avatar is configured to use audio, video, haptic feedback, or some combination thereof to guide the patient through the exercise session.
[0894] Clause 83. The method of any clause herein, wherein the message comprises data pertaining to a pain level of the patient, a characteristic of the patient, a measurement of a sensor, or some combination thereof.
[0895] Clause 84. The method of any clause herein, wherein replacing the virtual avatar with the multimedia feed initiates a telemedicine session between the patient and the medical professional, and the method further comprises:
[0896] receiving, from the computing device of the patient or the medical professional, a second message indicating the telemedicine session is complete; and
[0897] replacing, on the computing device of the patient, presentation of the multimedia feed with the presentation of the virtual avatar, wherein the virtual avatar is configured to continue to guide the patient through the exercise session to completion.
[0898] Clause 85. The method of any clause herein, further comprising determining, based on a treatment plan for a patient, the exercise session to be performed, wherein the treatment apparatus is configured to be used by the patient performing the exercise session.
[0899] Clause 86. A non-transitory, tangible computer-readable medium storing instructions that, when executed, cause a processing device to:
[0900] provide, to a computing device of a patient, a virtual avatar to be presented on the computing device of the patient, wherein the virtual avatar is configured to use a virtual representation of a treatment apparatus to guide the patient through an exercise session, and the virtual avatar is associated with a medical professional;
[0901] receive, from the computing device of the patient, a message pertaining to a trigger event;
[0902] determine whether a severity level of the trigger event exceeds a threshold severity level; and
[0903] responsive to determining that the severity level of the trigger event exceeds the threshold severity level, replace, on the computing device of the patient, the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device of the medical professional.
[0904] Clause 87. The computer-readable medium of any clause herein, wherein, responsive to determining that the severity level of the trigger event does not exceed the threshold severity level, the processing device is further to:
[0905] provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or
[0906] continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
[0907] Clause 88. The computer-readable medium of any clause herein, wherein the virtual avatar is controlled, in real-time or near real-time, by one or more machine learning models trained to:
[0908] receive input comprising sensor data, characteristics of the patient, real-time feedback from the patient or other patients, or some combination thereof, and
[0909] produce an output that controls the virtual avatar.
[0910] Clause 89. The computer-readable medium of any clause herein, wherein providing the virtual avatar further comprises:
[0911] retrieving data associated with the exercise session, wherein the data comprises
instructions implementing a virtual model that animates one or more movements associated with the exercise session;
[0912] retrieving data associated with the virtual avatar; and
[0913] mapping the data associated with the virtual avatar onto the virtual model that animates the one or more movements associated with the exercise session.
[0914] Clause 90. The computer-readable medium of any clause herein, wherein prior to providing the virtual avatar, the processing device is further to:
[0915] transmit, to the computing device of the patient, a notification to initiate the exercise session, wherein the notification is transmitted based on a schedule specified in the treatment plan;
[0916] receive, from the computing device of the patient, a selection to initiate the exercise session for using the treatment apparatus;
[0917] transmit, to the treatment apparatus, a control signal to cause the treatment apparatus to initiate the exercise session; and
[0918] responsive to transmitting the control signal, provide the virtual avatar to the computing device of the patient.
[0919] Clause 91. A system comprising:
[0920] a memory device storing instructions;
[0921] a processing device communicatively coupled to the memory device, wherein the processing device executes the instructions to:
[0922] provide, to a computing device of a patient, a virtual avatar to be presented on the computing device of the patient, wherein the virtual avatar is configured to use a virtual representation of a treatment apparatus to guide the patient through an exercise session, and the virtual avatar is associated with a medical professional;
[0923] receive, from the computing device of the patient, a message pertaining to a trigger event;
[0924] determine whether a severity level of the trigger event exceeds a threshold severity level; and
[0925] responsive to determining that the severity level of the trigger event exceeds the threshold severity level, replace, on the computing device of the patient, the presentation of the virtual avatar with a presentation of a multimedia feed from a computing device of the medical professional.
[0926] Clause 92. The system of any clause herein, wherein, responsive to determining that the severity level of the trigger event does not exceed the threshold severity level, the processing device is further to:
[0927] provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient, or
[0928] continue to provide control of the virtual avatar to the computing device of the medical professional, such that the medical professional distally controls the virtual avatar to interact with the patient.
[0929] Clause 93. The system of any clause herein, wherein the processing device is further to:
[0930] retrieve data associated with the exercise session, wherein the data comprises instructions implementing a virtual model that animates one or more movements associated with the exercise session;
[0931] retrieve data associated with the virtual avatar; and
[0932] map the data associated with the virtual avatar onto the virtual model that animates the one or more movements associated with the exercise session.
[0933] Clause 94. The system of any clause herein, wherein prior to providing the virtual avatar, the processing device is further to:
[0934] transmit, to the computing device of the patient, a notification to initiate the exercise session, wherein the notification is transmitted based on a schedule specified in the treatment plan;
[0935] receive, from the computing device of the patient, a selection to initiate the exercise session for using the treatment apparatus;
[0936] transmit, to the treatment apparatus, a control signal to cause the treatment apparatus to initiate the exercise session; and
[0937] responsive to transmitting the control signal, provide the virtual avatar to the computing device of the patient.
[0938] The various aspects, embodiments, implementations, or features of the described embodiments can be used separately or in any combination. The embodiments disclosed herein are modular in nature and can be used in conjunction with or coupled to other embodiments.
[0939] Consistent with the above disclosure, the examples of assemblies enumerated in the following clauses are specifically contemplated and are intended as a non-limiting set of examples.
Claims (20)
1. A computer-implemented system configured to control operation of an electromechanical machine, the computer-implemented system comprising: the electromechanical machine, the electromechanical machine being configured to be manipulated by a user while the user performs a treatment plan, wherein the electromechanical machine includes at least one pedal; and a computing device configured to: receive treatment data pertaining to the user who uses the electromechanical machine to perform the treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the electromechanical machine, at least one characteristic of the electromechanical machine, and at least one aspect of the treatment plan; generate treatment information using the treatment data; transmit the treatment information to a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input based on the treatment information; generate a modified treatment plan by modifying the at least one aspect of the treatment plan in response to receiving treatment plan input including a modification to the at least one aspect of the treatment plan; and while the user uses the electromechanical machine, control the electromechanical machine based on the modified treatment plan.
2. The computer-implemented system of claim 1, wherein the computing device is further configured to control, while the user uses the electromechanical machine during a telemedicine session, and based on the modified treatment plan, the electromechanical machine.
3. The computer-implemented system of claim 1, wherein the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, and a blood pressure of the user.
4. The computer-implemented system of claim 1, wherein at least some of the treatment data corresponds to at least some sensor data from a sensor associated with the electromechanical machine.
5. The computer-implemented system of claim 1, wherein at least some of the treatment data corresponds to at least some sensor data from a sensor associated with a wearable device worn by the user while the user uses the electromechanical machine.
6. The computer-implemented system of claim 1, wherein the computing device is further configured to, while the user uses the electromechanical machine to perform the treatment plan, receive subsequent treatment data pertaining to the user.
7. A method of operating an electromechanical machine, the method comprising: receiving treatment data pertaining to a user who uses the electromechanical machine to perform a treatment plan, wherein the treatment data comprises at least one of characteristics of the user, measurement information pertaining to the user while the user uses the electromechanical machine, at least one characteristic of the electromechanical machine, and at least one aspect of the treatment plan; generating treatment information using the treatment data; transmitting the treatment information to a computing device of a healthcare provider; communicating with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input based on the treatment information; generating a modified treatment plan by modifying the at least one aspect of the treatment plan in response to receiving treatment plan input including a modification to the at least one aspect of the treatment plan; and while the user uses the electromechanical machine, controlling the electromechanical machine based on the modified treatment plan.
8. The method of claim 7, further comprising controlling, while the user uses the electromechanical machine during a telemedicine session, and based on the modified treatment plan, the electromechanical machine.
9. The method of claim 7, wherein the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, and a blood pressure of the user.
10. The method of claim 7, wherein at least some of the treatment data corresponds to at least some sensor data from a sensor associated with the electromechanical machine.
11. The method of claim 7, wherein at least some of the treatment data corresponds to at least some sensor data from a sensor associated with a wearable device worn by the user while the user uses the electromechanical machine.
12. The method of claim 7, further comprising, while the user uses the electromechanical machine to perform the treatment plan, receiving subsequent treatment data pertaining to the user.
13. The method of claim 12, further comprising modifying the modified at least one aspect of the treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the modified at least one aspect of the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
14. A tangible, non-transitory computer-readable medium storing instructions that, when executed, cause a processing device to: receive treatment data pertaining to a user who uses an electromechanical machine to perform a treatment plan, wherein the treatment data comprises at least one characteristic of the user, measurement information pertaining to the user while the user uses the electromechanical machine, at least one characteristic of the electromechanical machine, and at least one aspect of the treatment plan; generate treatment information using the treatment data; transmit the treatment information to a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive treatment plan input based on the treatment information; generate a modified treatment plan by modifying the at least one aspect of the treatment plan in response to receiving the treatment plan input including a modification to the at least one aspect of the treatment plan; and while the user uses the electromechanical machine, control the electromechanical machine based on the modified treatment plan.
15. The computer-readable medium of claim 14, wherein the processing device is further configured to control, while the user uses the electromechanical machine during a telemedicine session, and based on the modified treatment plan, the electromechanical machine.
16. The computer-readable medium of claim 14, wherein the measurement information includes at least one of a vital sign of the user, a respiration rate of the user, a heartrate of the user, a temperature of the user, and a blood pressure of the user.
17. The computer-readable medium of claim 14, wherein at least some of the treatment data corresponds to at least some sensor data from a sensor associated with the electromechanical machine.
18. The computer-readable medium of claim 14, wherein at least some of the treatment data corresponds to at least some sensor data from a sensor associated with a wearable device worn by the user while the user uses the electromechanical machine.
19. The computer-readable medium of claim 14, wherein the processing device is further configured to, while the user uses the electromechanical machine to perform the treatment plan, receive subsequent treatment data pertaining to the user.
20. The computer-readable medium of claim 19, wherein the processing device is further configured to modify the modified at least one aspect of the treatment plan in response to receiving subsequent treatment plan input including at least one further modification to the modified at least one aspect of the treatment plan, wherein the subsequent treatment plan input is based on at least one of the treatment data and the subsequent treatment data.
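The control flow recited in independent claims 1, 7, and 14 — receive treatment data, generate treatment information, present it to a healthcare provider's interface, accept a modification, and control the machine under the modified plan — can be sketched in Python. This is a minimal illustrative sketch only; every name, data structure, and value below is an assumption for exposition and is not part of the claimed system.

```python
# Hypothetical sketch of the claimed loop (claims 1, 7, 14). All
# identifiers and values are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class TreatmentPlan:
    # Aspects of the plan, e.g. pedal resistance and session length.
    aspects: dict


@dataclass
class TreatmentData:
    user_characteristics: dict     # characteristics of the user
    measurements: dict             # e.g. heart rate, blood pressure
    machine_characteristics: dict  # e.g. pedal configuration
    plan: TreatmentPlan            # the treatment plan being performed


def generate_treatment_information(data: TreatmentData) -> dict:
    """Summarize raw treatment data for the provider's interface."""
    return {
        "user": data.user_characteristics,
        "vitals": data.measurements,
        "machine": data.machine_characteristics,
        "plan": dict(data.plan.aspects),
    }


def apply_modification(plan: TreatmentPlan, modification: dict) -> TreatmentPlan:
    """Generate a modified plan by overwriting only the targeted aspects."""
    updated = dict(plan.aspects)
    updated.update(modification)
    return TreatmentPlan(aspects=updated)


# Example session: treatment information is generated and sent to the
# provider, whose treatment plan input modifies one aspect of the plan.
data = TreatmentData(
    user_characteristics={"age": 63},
    measurements={"heart_rate_bpm": 92, "blood_pressure": "128/84"},
    machine_characteristics={"pedal_count": 2},
    plan=TreatmentPlan(aspects={"pedal_resistance": 4, "session_minutes": 20}),
)
info = generate_treatment_information(data)
modified = apply_modification(data.plan, {"pedal_resistance": 3})
```

In this sketch the modified plan would then be passed to whatever controller drives the electromechanical machine while the user continues the session; that control step is hardware-specific and is omitted here.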
FIG. 1 (sheet 1/47): Block diagram of the overall system, including first and second networks, a data source, a patient interface, a clinician interface, a remote user interface, and a server comprising an artificial-intelligence engine, machine-learning models, a training engine, processors, memory, and instructions; sensors (ambulation sensor, goniometer, pressure sensor); and a treatment apparatus with a controller, system/patient/user data, communication interface, sensor, and actuator.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2023204667A AU2023204667B2 (en) | 2020-04-23 | 2023-07-13 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Applications Claiming Priority (19)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/856,985 | 2020-04-23 | ||
US16/856,985 US11107591B1 (en) | 2020-04-23 | 2020-04-23 | Method and system for describing and recommending optimal treatment plans in adaptive telemedical or other contexts |
US202063048456P | 2020-07-06 | 2020-07-06 | |
US63/048,456 | 2020-07-06 | ||
US17/021,895 US11071597B2 (en) | 2019-10-03 | 2020-09-15 | Telemedicine for orthopedic treatment |
US17/021,895 | 2020-09-15 | ||
US202063088657P | 2020-10-07 | 2020-10-07 | |
US63/088,657 | 2020-10-07 | ||
US202063104716P | 2020-10-23 | 2020-10-23 | |
US63/104,716 | 2020-10-23 | ||
US17/147,428 US11317975B2 (en) | 2019-10-03 | 2021-01-12 | Method and system for treating patients via telemedicine using sensor data from rehabilitation or exercise equipment |
US17/147,439 | 2021-01-12 | ||
US17/147,439 US11101028B2 (en) | 2019-10-03 | 2021-01-12 | Method and system using artificial intelligence to monitor user characteristics during a telemedicine session |
US17/147,428 | 2021-01-12 | ||
US17/147,211 | 2021-01-12 | ||
US17/147,211 US11075000B2 (en) | 2019-10-03 | 2021-01-12 | Method and system for using virtual avatars associated with medical professionals during exercise sessions |
PCT/US2021/028655 WO2021216881A1 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
AU2021260953A AU2021260953B2 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
AU2023204667A AU2023204667B2 (en) | 2020-04-23 | 2023-07-13 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2021260953A Division AU2021260953B2 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2023204667A1 AU2023204667A1 (en) | 2023-08-03 |
AU2023204667B2 true AU2023204667B2 (en) | 2024-04-18 |
Family
ID=83887200
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2021260953A Active AU2021260953B2 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
AU2023204667A Active AU2023204667B2 (en) | 2020-04-23 | 2023-07-13 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2021260953A Active AU2021260953B2 (en) | 2020-04-23 | 2021-04-22 | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP4139928A1 (en) |
JP (1) | JP7298053B2 (en) |
KR (1) | KR20230006641A (en) |
AU (2) | AU2021260953B2 (en) |
BR (1) | BR112022021443A2 (en) |
CA (1) | CA3176236C (en) |
MX (1) | MX2022013358A (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116153531B (en) * | 2023-04-17 | 2023-07-18 | 北京康爱医疗科技股份有限公司 | Rehabilitation monitoring method and system for tumor patient |
CN118675764A (en) * | 2024-08-21 | 2024-09-20 | 中国人民解放军海军青岛特勤疗养中心 | Thoracic surgery postoperative rehabilitation effect prediction system based on artificial intelligence |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080214971A1 (en) * | 2002-10-07 | 2008-09-04 | Talish Roger J | Excercise device utilizing loading apparatus |
US20110082009A1 (en) * | 2009-09-16 | 2011-04-07 | Richard Ranky | Instrumented handle and pedal systems for use in rehabilitation, exercise and training equipment |
US20170270260A1 (en) * | 2013-10-31 | 2017-09-21 | Knox Medical Diagnostics | Systems and methods for monitoring respiratory function |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7416537B1 (en) * | 1999-06-23 | 2008-08-26 | Izex Technologies, Inc. | Rehabilitative orthoses |
US20030036683A1 (en) * | 2000-05-01 | 2003-02-20 | Kehr Bruce A. | Method, system and computer program product for internet-enabled, patient monitoring system |
JP4076386B2 (en) * | 2002-07-18 | 2008-04-16 | 帝人株式会社 | Medical remote information system, information processing method, computer program, recording medium for computer program, telemedicine system |
2021
- 2021-04-22 EP EP21791805.1A patent/EP4139928A1/en active Pending
- 2021-04-22 AU AU2021260953A patent/AU2021260953B2/en active Active
- 2021-04-22 CA CA3176236A patent/CA3176236C/en active Active
- 2021-04-22 JP JP2022564566A patent/JP7298053B2/en active Active
- 2021-04-22 BR BR112022021443A patent/BR112022021443A2/en not_active Application Discontinuation
- 2021-04-22 KR KR1020227040948A patent/KR20230006641A/en not_active IP Right Cessation
- 2021-04-22 MX MX2022013358A patent/MX2022013358A/en unknown
2023
- 2023-07-13 AU AU2023204667A patent/AU2023204667B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP2023519759A (en) | 2023-05-12 |
KR20230006641A (en) | 2023-01-10 |
AU2021260953B2 (en) | 2023-04-13 |
CA3176236A1 (en) | 2021-10-28 |
CA3176236C (en) | 2024-02-20 |
AU2021260953A1 (en) | 2022-11-17 |
BR112022021443A2 (en) | 2022-12-27 |
MX2022013358A (en) | 2023-01-04 |
AU2023204667A1 (en) | 2023-08-03 |
EP4139928A1 (en) | 2023-03-01 |
JP7298053B2 (en) | 2023-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11923057B2 (en) | Method and system using artificial intelligence to monitor user characteristics during a telemedicine session | |
US11075000B2 (en) | Method and system for using virtual avatars associated with medical professionals during exercise sessions | |
US11282604B2 (en) | Method and system for use of telemedicine-enabled rehabilitative equipment for prediction of secondary disease | |
US11282608B2 (en) | Method and system for using artificial intelligence and machine learning to provide recommendations to a healthcare provider in or near real-time during a telemedicine session | |
US11328807B2 (en) | System and method for using artificial intelligence in telemedicine-enabled hardware to optimize rehabilitative routines capable of enabling remote rehabilitative compliance | |
US11139060B2 (en) | Method and system for creating an immersive enhanced reality-driven exercise experience for a user | |
US12096997B2 (en) | Method and system for treating patients via telemedicine using sensor data from rehabilitation or exercise equipment | |
US20220314075A1 (en) | Method and system for monitoring actual patient treatment progress using sensor data | |
US20220328181A1 (en) | Method and system for monitoring actual patient treatment progress using sensor data | |
US20220415471A1 (en) | Method and system for using sensor data to identify secondary conditions of a user based on a detected joint misalignment of the user who is using a treatment device to perform a treatment plan | |
US20220230729A1 (en) | Method and system for telemedicine resource deployment to optimize cohort-based patient health outcomes in resource-constrained environments | |
WO2021216881A1 (en) | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine | |
AU2023204667B2 (en) | Method and system for using sensor data from rehabilitation or exercise equipment to treat patients via telemedicine | |
CN113223690B (en) | Method and system for rehabilitating a subject via telemedicine | |
US20240203580A1 (en) | Method and system for using artificial intelligence to triage treatment plans for patients and electronically initiate the treatment plans based on the triaging
WO2022155251A9 (en) | Method and system for using artificial intelligence and machine learning to provide recommendations to a healthcare provider in real-time during a telemedicine session | |
WO2024137305A1 (en) | Method and system for using artificial intelligence to triage treatment plans for patients and electronically initiate the treatment plans based on the triaging
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) |