US20210186312A1 - Systems and methods for semi-automated medical processes - Google Patents

Systems and methods for semi-automated medical processes

Info

Publication number
US20210186312A1
US20210186312A1 (application US17/195,631)
Authority
US
United States
Prior art keywords
patient
treatment
medical
diagnostic
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/195,631
Inventor
James Stewart Bates
Kristopher Perry
Chaitanya Prakash Potaraju
Pranav Bhatkal
Aditya Shirvalkar
Tristan Royal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advinow Inc
Original Assignee
Advinow Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Advinow Inc filed Critical Advinow Inc
Priority to US17/195,631
Publication of US20210186312A1
Legal status: Abandoned

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147Holding or positioning arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000096Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00011Operational features of endoscopes characterised by signal transmission
    • A61B1/00016Operational features of endoscopes characterised by signal transmission using wireless means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00059Operational features of endoscopes provided with identification means for the endoscope
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00064Constructional details of the endoscope body
    • A61B1/00108Constructional details of the endoscope body characterised by self-sufficient functionality for stand-alone use
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147Holding or positioning arrangements
    • A61B1/00148Holding or positioning arrangements using anchoring means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/227Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for ears, i.e. otoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/24Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0046Arrangements of imaging apparatus in a room, e.g. room provided with shielding or for improved access to apparatus
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6888Cabins
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6889Rooms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00203Electrical control of surgical instruments with speech control or speech recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00221Electrical control of surgical instruments with wireless transmission of data, e.g. by infrared radiation or radiowaves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2048Tracking techniques using an accelerometer or inertia sensor
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/372Details of monitor hardware
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937Visible markers
    • A61B2090/3945Active visible markers, e.g. light emitting diodes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7405Details of notification to user or communication with user or patient ; user input means using sound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/7455Details of notification to user or communication with user or patient ; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Definitions

  • the present disclosure relates to computer-assisted health care, and more particularly, to systems and methods for providing a computer-assisted medical diagnostic architecture in which a patient is able to perform certain medical measurements with the assistance of a medical kiosk or medical camera system.
  • FIG. 1 illustrates an exemplary medical diagnostic system according to embodiments of the present disclosure.
  • FIG. 2 illustrates an exemplary medical instrument equipment system according to embodiments of the present disclosure.
  • FIG. 3A illustrates an exemplary system for measuring a patient's body parts utilizing optical medical instruments and deep learning systems according to embodiments of the present disclosure.
  • FIG. 3B illustrates another exemplary system for measuring a patient's body parts utilizing cameras and sensors according to embodiments of the present disclosure.
  • FIG. 4A illustrates an exemplary method of measuring a patient's body part utilizing an optical medical instrument according to embodiments of the present disclosure.
  • FIG. 4B illustrates an exemplary optical medical instrument according to embodiments of the present disclosure.
  • FIG. 4C illustrates an exemplary optical medical instrument being positioned at an angle, position, and rotation according to embodiments of the present disclosure.
  • FIG. 5A is a flowchart of an illustrative method for making accurate medical instrument patient measurements, according to embodiments of the present disclosure.
  • FIG. 5B is another flowchart of an illustrative method for making accurate medical instrument patient measurements, according to embodiments of the present disclosure.
  • FIG. 5C is yet another flowchart of an illustrative method for making accurate medical instrument patient measurements, according to embodiments of the present disclosure.
  • FIG. 6 depicts a simplified block diagram of a computing device/information handling system according to embodiments of the present disclosure.
  • connections between components or systems within the figures are not intended to be limited to direct connections. Rather, data between these components may be modified, re-formatted, or otherwise changed by intermediary components, and additional or fewer connections may be used. It shall also be noted that the terms “coupled,” “connected,” or “communicatively coupled” shall be understood to include direct connections, indirect connections through one or more intermediary devices, and wireless connections.
  • a service, function, or resource is not limited to a single service, function, or resource; usage of these terms may refer to a grouping of related services, functions, or resources, which may be distributed or aggregated.
  • the term “sensor” refers to a device capable of acquiring information related to any type of physiological condition or activity (e.g., a biometric diagnostic sensor); physical data (e.g., a weight); and environmental information (e.g., ambient temperature sensor), including hardware-specific information.
  • the term “position” refers to spatial and temporal data (e.g., orientation and motion information). The position data may be, for example, but without limitation, a 2-dimensional (2D), 2.5-dimensional (2.5D), or true 3-dimensional (3D) data representation.
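The notion of position as combined spatial and temporal data might be modeled as in the following sketch; the class and field names are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class InstrumentPosition:
    """Illustrative container for position data: spatial coordinates,
    orientation, and a timestamp (the temporal component)."""
    x: float          # spatial coordinates; z may be 0.0 for 2D data
    y: float
    z: float
    yaw: float        # orientation angles, in degrees
    pitch: float
    roll: float
    timestamp: float  # seconds; makes the sample spatio-temporal

    def is_2d(self) -> bool:
        # A 2D representation simply carries no depth component.
        return self.z == 0.0

sample = InstrumentPosition(0.1, 0.2, 0.0, 90.0, 0.0, 45.0, 1.5)
```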
  • Doctor refers to any health care professional, health care provider, physician, or person directed by a physician.
  • “Patient” is any user who uses the systems and methods of the present invention, e.g., a person being examined or anyone assisting such person.
  • the term illness may be used interchangeably with the term diagnosis.
  • “answer” or “question” refers to one or more of 1) an answer to a question, 2) a measurement or measurement request (e.g., a measurement performed by a “patient”), and 3) a symptom (e.g., a symptom selected by a “patient”).
  • FIG. 1 illustrates an exemplary medical diagnostic system according to embodiments of the present disclosure.
  • Diagnostic system 100 comprises automated diagnostic and treatment system 102 , patient interface station 106 , doctor interface station 104 , deep learning system 105 , sensors/cameras 109 and medical instrument equipment 108 .
  • Deep learning system 105 may include, for example, but without limitation, deep convolutional neural networks (DCNNs), multi-layer perceptrons (MLPs), fully convolutional networks (FCNs), capsule networks, and artificial neural networks (A-NNs).
  • Deep learning system 105 utilizes artificial reality feedback to assist the automated diagnostic and treatment system 102 in obtaining improved medical measurements from the medical instrument equipment 108 .
  • Deep learning refers to the use of “stacked neural networks,” that is, networks composed of several layers. The layers are made of nodes; a node is a place where computation occurs, loosely patterned on a neuron in the human brain, which fires when it encounters sufficient stimuli.
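The “stacked” layers-of-nodes structure described above can be sketched as a minimal multi-layer perceptron; the layer sizes, weights, and activation choice below are arbitrary illustrations, not part of the disclosure.

```python
import math
import random

def layer(inputs, weights, biases):
    """One layer of nodes: each node computes a weighted sum of its
    inputs (its 'stimuli') and fires through a sigmoid activation."""
    outputs = []
    for node_weights, bias in zip(weights, biases):
        stimulus = sum(w * x for w, x in zip(node_weights, inputs)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-stimulus)))  # sigmoid
    return outputs

def stacked_network(inputs, layers):
    """A 'stacked' network is just layers applied in sequence."""
    activations = inputs
    for weights, biases in layers:
        activations = layer(activations, weights, biases)
    return activations

random.seed(0)
# Tiny 2-layer network: 3 inputs -> 4 hidden nodes -> 1 output node.
hidden = ([[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)],
          [0.0] * 4)
output = ([[random.uniform(-1, 1) for _ in range(4)]], [0.0])
result = stacked_network([0.5, -0.2, 0.8], [hidden, output])
```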
  • Deep learning systems such as deep learning system 105 may be used in computer vision and have also been applied to acoustic modeling for automatic speech recognition (ASR).
  • Patient interface station 106 comprises patient interface application module 107 and is coupled to sensors/cameras 109 .
  • sensors/cameras 109 may monitor and may capture images of the patient/assistant.
  • Patient interface station 106 receives from the automated diagnostic and treatment system 102 system instructions and/or relayed physician instructions, diagnostic feedback, and results 126 .
  • Patient interface station 106 sends to the automated diagnostic and treatment system 102 secure raw or processed patient-related data 128 .
  • Patient interface station 106 also provides interfaces between sensors/cameras 109 , located in the patient interface station 106 , and the patient/assistant.
  • Patient interface station 106 also provides interfaces to medical instrument equipment 108 .
  • Medical instrument equipment 108 may be, for example, but without limitations, an ophthalmoscope, intraoral camera, auriscope, or otoscope.
  • Medical instrument equipment 108 is designed mainly to collect diagnostic patient data, and may comprise one or more diagnostic devices, for example, in a home diagnostic medical kit that generates diagnostic data based on physical and non-physical characteristics of a patient. It is noted that diagnostic system 100 may comprise additional sensors and devices that, in operation, collect, process, or transmit characteristic information about the patient, medical instrument usage, orientation, environmental parameters such as ambient temperature, humidity, location, and other useful information that may be used to accomplish the objectives of the present invention.
  • a patient may enter patient-related data, such as health history, patient characteristics, symptoms, health concerns, medical instrument measured diagnostic data, images, and sound patterns, or other relevant information into patient interface station 106 .
  • the patient may use any means of communication, such as voice control, to enter data, e.g., in the form of a questionnaire.
  • Patient interface station 106 may provide the data raw or in processed form to automated diagnostic and treatment system 102 , e.g., via a secure communication.
  • the patient may be prompted, e.g., by a software application, to answer questions intended to aid in the diagnosis of one or more medical conditions.
  • the software application may provide guidance by describing how to use medical instrument equipment 108 to administer a diagnostic test or how to make diagnostic measurements for any particular device that may be part of medical instrument equipment 108 so as to facilitate accurate measurements of patient diagnostic data.
  • the patient may use medical instrument equipment 108 to create a patient health profile that serves as a baseline profile. Gathered patient-related data may be securely stored in database 103 or a secure remote server (not shown) coupled to automated diagnostic and treatment system 102 .
  • automated diagnostic and treatment system 102, supported by deep learning system 105, enables interaction between a patient and a remotely located health care professional, who may provide instructions to the patient, e.g., by communicating via the software application.
  • a doctor may log into a cloud-based system (not shown) to access patient-related data via doctor interface station 104 .
  • the doctor interface station 104 comprises a doctor interface communication module 130 .
  • automated diagnostic and treatment system 102 presents automated diagnostic suggestions to a doctor, who may verify or modify the suggested information.
  • Automated diagnostic and treatment system 102 is coupled to doctor interface station 104 .
  • Automated diagnostic and treatment system 102 may communicate to doctor interface station 104 alerts, if needed, risk profile, and data integrity 120 .
  • the patient may be provided with instructions, feedback, results 122 , and other information pertinent to the patient's health.
  • the doctor may select an illness based on automated diagnostic system suggestions, and the sequence of instructions, feedback, and/or results 122 may be adjusted based on decision vectors associated with a medical database.
  • medical instrument equipment 108 uses the decision vectors to generate a diagnostic result, e.g., in response to patient answers and/or measurements of the patient's vital signs.
  • medical instrument equipment 108 comprises a number of sensors, such as accelerometers, gyroscopes, pressure sensors, cameras, bolometers, altimeters, IR LEDs, and proximity sensors that may be coupled to one or more medical devices, e.g., a thermometer, to assist in performing diagnostic measurements and/or monitor a patient's use of medical instrument equipment 108 for accuracy.
  • a camera, bolometer, or other spectrum imaging device, e.g., radar, may use image or facial recognition software and machine vision to recognize body parts, items, and actions to aid the patient in locating suitable positions for taking a measurement on the patient's body, e.g., by identifying any part of the patient's body.
  • Examples of the types of diagnostic data that medical instrument equipment 108 may generate comprise body temperature, blood pressure, images, sound, heart rate, blood oxygen level, motion, ultrasound, pressure or gas analysis, continuous positive airway pressure, electrocardiogram, electroencephalogram, electrocardiography, BMI, muscle mass, blood, urine, and any other patient-related data 128 .
  • patient-related data 128 may be derived from a non-surgical wearable or implantable monitoring device that gathers sample data.
  • an IR LED, proximity beacon, or other identifiable marker may be attached to medical instrument equipment 108 to track the position and placement of medical instrument equipment 108 .
  • a camera, bolometer, or other spectrum imaging device uses the identifiable marker as a control tool to aid the camera or the patient in determining the position of medical instrument equipment 108 .
  • a deep learning system 105, which may include machine vision software, may be used to track and overlay or superimpose, e.g., on a screen, the position of the identifiable marker (e.g., IR LED, heat source, or reflective material) with a desired target location at which the patient should place medical instrument equipment 108, thereby aiding the patient to properly place or align a sensor and ensure accurate and reliable readings.
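The overlay-and-align step can be illustrated with a minimal Python sketch. The function name `guidance_from_offset`, the screen-pixel coordinate convention, and the 10-pixel tolerance are illustrative assumptions, not details from the disclosure:

```python
# Illustrative sketch only: align a detected IR LED marker with a target
# overlay location on screen.

def guidance_from_offset(marker_xy, target_xy, tolerance_px=10):
    """Return a movement instruction that brings the marker toward the
    target overlay; 'hold' when the marker is within tolerance."""
    dx = target_xy[0] - marker_xy[0]  # +x: target is to the right on screen
    dy = target_xy[1] - marker_xy[1]  # +y: target is lower on screen
    if abs(dx) <= tolerance_px and abs(dy) <= tolerance_px:
        return "hold"
    steps = []
    if abs(dx) > tolerance_px:
        steps.append("move right" if dx > 0 else "move left")
    if abs(dy) > tolerance_px:
        steps.append("move down" if dy > 0 else "move up")
    return ", ".join(steps)
```

In a full system the marker coordinates would come from the machine vision pipeline tracking the IR LED in each camera frame.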
  • once medical instrument equipment 108, e.g., a stethoscope, is placed at the desired target location on a patient's torso, the patient may be prompted by audio or visual cues to breathe according to instructions or perform other actions to facilitate medical measurements and to start a measurement.
  • one or more sensors that may be attached to medical instrument equipment 108 monitor the placement and usage of medical instrument equipment 108 by periodically or continuously recording data and comparing measured data, such as location, movement, rotation and angles, to an expected data model and/or an error threshold to ensure measurement accuracy.
  • a patient may be instructed to adjust an angle, location, rotation or motion of medical instrument equipment 108 , e.g., to adjust its state and, thus, avoid low-accuracy or faulty measurement readings.
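As a hedged sketch of this comparison against an expected data model, the function below flags which axes of the instrument's orientation exceed an error threshold; the axis names and the 5-degree threshold are assumptions for illustration:

```python
# Hedged sketch: compare the instrument's measured orientation to an
# expected data model and report which axes need patient correction.

def orientation_adjustments(measured, expected, threshold_deg=5.0):
    """Return (axis, error) pairs for every axis whose deviation from
    the expected model exceeds the error threshold."""
    corrections = []
    for axis in ("roll", "pitch", "yaw"):
        error = measured[axis] - expected[axis]
        if abs(error) > threshold_deg:
            corrections.append((axis, round(error, 1)))
    return corrections
```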
  • sensors attached to or tracking medical instrument equipment 108 may generate sensor data and patient interaction activity data that may be compared, for example, against idealized patient medical instrument equipment usage sensor model data to create an equipment usage accuracy score.
  • the patient medical instrument equipment measured medical data may also be compared with idealized device measurement data expected from medical instrument equipment 108 to create a device accuracy score.
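One plausible way to reduce the comparison between measured sensor data and an idealized usage model to a single equipment usage accuracy score is a normalized RMS deviation; the scoring formula below is an assumption for illustration, not the patented algorithm:

```python
import math

# Assumption for illustration: score usage as 1 minus the RMS deviation
# between measured sensor samples and the idealized usage model,
# normalized by a device-specific scale and clamped to [0, 1].

def usage_accuracy_score(samples, ideal, scale):
    """1.0 when usage matches the idealized sensor model exactly;
    falls to 0.0 as the RMS deviation reaches `scale`."""
    rms = math.sqrt(
        sum((s - i) ** 2 for s, i in zip(samples, ideal)) / len(samples)
    )
    return max(0.0, 1.0 - rms / scale)
```

The same form could produce the device accuracy score by substituting idealized device measurement data for the usage model.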
  • Recording bioelectric signals may include, for example, but without limitations, electroencephalography, electrocardiography, electrooculography, electromyography and electrogastrography.
  • Feedback from medical instrument equipment 108 may be used to instruct the patient to properly align medical instrument equipment 108 during a measurement.
  • medical instrument equipment type and sensor system monitoring of medical instrument equipment 108 patient interaction may be used to create a device usage accuracy score for use in a medical diagnosis algorithm.
  • patient medical instrument equipment measured medical data may be used to create a measurement accuracy score for use by the medical diagnostic algorithm.
  • medical instrument equipment 108 may also be fitted with haptic/tactile feedback devices that may augment audio or screen-based guidance instructions for taking measurements.
  • haptic feedback devices may include, but are not limited to, vibration motors (similar to those found in smart phones) fitted around the medical instrument equipment to give tactile instructions (e.g. vibration on the left side of the medical device to instruct the patient to move the medical device left), or flywheels to give kinesthetic feedback such as resisting movement in undesirable directions, and aiding movement in the correct direction (similar to certain virtual reality peripherals).
  • machine vision software may be used to show on a display an animation that mimics a patient's movements and provides detailed interactive instructions and real-time feedback to the patient. This aids the patient in correctly positioning and operating medical instrument equipment 108 relative to the patient's body so as to ensure a high level of accuracy when medical instrument equipment 108 is operated.
  • the display may be a traditional computer screen.
  • the display may be augmented/virtual/mixed reality hardware, where patients may interact with the kiosk in a completely virtual environment while wearing AR/VR goggles.
  • a validation process comprising a calculation of a trustworthiness score or reliability factor is initiated in order to gauge the measurement accuracy.
  • the patient may be asked to either repeat a measurement or request assistance by an assistant, who may answer questions, e.g., remotely via an application to help with proper equipment usage, or alert a nearby person to assist with using medical instrument equipment 108 .
  • the validation process may also instruct a patient to answer additional questions, and may comprise calculating the measurement accuracy score based on a measurement or re-measurement.
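A minimal sketch of such a validation gate, assuming (for illustration only) equal weighting of the usage and measurement accuracy scores and arbitrary 0.7/0.4 thresholds:

```python
# Sketch of the validation gate. The equal weighting and the thresholds
# are assumptions, not values from the disclosure.

def validation_action(usage_score, measurement_score, threshold=0.7):
    """Combine the equipment usage and measurement accuracy scores into
    a trustworthiness score and pick the next step."""
    trust = 0.5 * usage_score + 0.5 * measurement_score
    if trust >= threshold:
        return trust, "accept"
    if trust >= 0.4:
        return trust, "repeat_measurement"
    return trust, "request_assistance"
```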
  • automated diagnostic and treatment system 102 may enable a patient-doctor interaction by granting the patient and doctor access to diagnostic system 100 .
  • the patient may enter data, take measurements, and submit images and audio files or any other information to the application or web portal.
  • the doctor may access that information, for example, to review a diagnosis generated by automated diagnostic and treatment system 102 , and generate, confirm, or modify instructions for the patient.
  • Patient-doctor interaction, while not required for diagnosis and treatment, may occur in person, in real time via an audio/video application, or by any other means of communication.
  • automated diagnostic and treatment system 102 may utilize images generated from a diagnostic examination of mouth, throat, eyes, ears, skin, extremities, surface abnormalities, internal imaging sources, and other suitable images and/or audio data generated from diagnostic examination of heart, lungs, abdomen, chest, joint motion, voice, and any other audio data sources. Automated diagnostic and treatment system 102 may further utilize patient lab tests, medical images, or any other medical data. In embodiments, automated diagnostic and treatment system 102 enables medical examination of the patient, for example, using medical devices, e.g., ultrasound, in medical instrument equipment 108 to detect sprains, contusions, or fractures, and automatically provide diagnostic recommendations regarding a medical condition of the patient.
  • diagnosis comprises the use of medical database decision vectors that are at least partially based on the patient's self-measured (or assistant-measured) vitals or other measured medical data, the accuracy score of a measurement dataset, a usage accuracy score of a sensor attached to medical instrument equipment 108 , a regional illness trend, and information used in generally accepted medical knowledge evaluation steps.
  • the decision vectors and associated algorithms, which may be installed in automated diagnostic and treatment system 102 , may utilize one- or multi-dimensional data, patient history, patient questionnaire feedback, and pattern recognition or pattern matching for classification using images and audio data.
  • a medical device usage accuracy score generator may be implemented within automated diagnostic and treatment system 102 and may utilize an error vector of any device in, or attached to, medical instrument equipment 108 to create the device usage accuracy score, and utilize the actual patient-measured device data to create the measurement data accuracy score.
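One way to collapse a per-sensor error vector into the device usage accuracy score is an exponential decay of the vector norm; the decay form and scale below are assumptions for illustration:

```python
import math

# Assumed scoring form: exponential decay of the error-vector norm, so a
# zero error vector yields a perfect score of 1.0.

def score_from_error_vector(errors, scale=1.0):
    """Collapse per-sensor errors into one device usage accuracy score."""
    norm = math.sqrt(sum(e * e for e in errors))
    return math.exp(-norm / scale)
```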
  • automated diagnostic and treatment system 102 outputs diagnosis and/or treatment information that may be communicated to the patient, for example, electronically or in person by a medical professional, e.g., a treatment guideline that may include a prescription for a medication.
  • prescriptions may be communicated directly to a pharmacy for pick-up or automated home delivery.
  • automated diagnostic and treatment system 102 may generate an overall health risk profile of the patient and recommend steps to reduce the risk of overlooking potentially dangerous conditions or guide the patient to a nearby facility that can treat the potentially dangerous condition.
  • the health risk profile may assist a treating doctor in fulfilling duties to the patient, for example, to carefully review and evaluate the patient and, if deemed necessary, refer the patient to a specialist, initiate further testing, etc.
  • the health risk profile advantageously reduces the potential for negligence and, thus, medical malpractice.
  • Automated diagnostic and treatment system 102 comprises a payment feature that uses patient identification information to access a database to, e.g., determine whether a patient has previously arranged a method of payment, and if the database does not indicate a previously arranged method of payment, automated diagnostic and treatment system 102 may prompt the patient to enter payment information, such as insurance, bank, or credit card information. Automated diagnostic and treatment system 102 may determine whether payment information is valid and automatically obtain an authorization from the insurance, EHR system, and/or the card issuer for payment for a certain amount for services rendered by the doctor. An invoice may be electronically presented to the patient, e.g., upon completion of a consultation, such that the patient can authorize payment of the invoice, e.g., via an electronic signature.
  • database 103 of patient information may comprise a security interface (not shown) that allows secure access to a patient database, for example, by using patient identification information to obtain the patient's medical history.
  • the interface may utilize biometric, bar code, or other electronic security methods.
  • medical instrument equipment 108 uses unique identifiers that are used as a control tool for measurement data.
  • Database 103 may be a repository for any type of data created, modified, or received by diagnostic system 100 , such as generated diagnostic information, information received from patient's wearable electronic devices, remote video/audio data and instructions, e.g., instructions received from a remote location or from the application.
  • fields in the patient's electronic health care record are automatically populated based on one or more of questions asked by diagnostic system 100 , measurements taken by the diagnostic system 100 , diagnosis and treatment codes generated by diagnostic system 100 , one or more trust scores, and imported patient health care data from one or more sources, such as an existing health care database. It is understood the format of imported patient health care data may be converted to be compatible with the EHR format of diagnostic system 100 . Conversely, exported patient health care data may be converted, e.g., to be compatible with an external EHR database.
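The import/export conversion might be sketched as a simple field-name mapping; the field names below are hypothetical, and a production system would more likely use a standard interchange format such as HL7 FHIR:

```python
# Hypothetical field mapping for EHR import; all field names are
# invented for illustration.
FIELD_MAP = {
    "pt_name": "patient_name",
    "dob": "date_of_birth",
    "dx": "diagnosis_code",
}

def import_record(external):
    """Rename known external fields to the internal EHR schema; keep
    anything unrecognized under 'unmapped' for manual review."""
    record, unmapped = {}, {}
    for key, value in external.items():
        if key in FIELD_MAP:
            record[FIELD_MAP[key]] = value
        else:
            unmapped[key] = value
    if unmapped:
        record["unmapped"] = unmapped
    return record
```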
  • patient-related data documented by diagnostic system 100 provide support for the code decision for the level of exam a doctor performs.
  • doctors have to choose one of any identified codes (e.g., ICD10 currently holds approximately 97,000 medical codes) to identify an illness and provide an additional code that identifies the level of physical exam/diagnosis performed on the patient (e.g., full body physical exam) based on an illness identified by the doctor.
  • patient answers are used to suggest to the doctor a level of exam that is supported by the identified illness, e.g., to ensure that the doctor does not perform unnecessary in-depth exams for minor illnesses or a treatment that may not be covered by the patient's insurance.
  • upon identifying an illness, diagnostic system 100 generates one or more recommendations/suggestions/options for a particular treatment.
  • one or more treatment plans are generated that the doctor may discuss with the patient and decide on a suitable treatment.
  • one treatment plan may be tailored purely for effectiveness, another one may consider the cost of drugs.
  • diagnostic system 100 may generate a prescription or lab test request and consider factors, such as recent research results, available drugs and possible drug interactions, the patient's medical history, traits of the patient, family history, and any other factors that may affect treatment when providing treatment information.
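Generating alternative plans that trade off effectiveness against drug cost can be sketched as a weighted ranking; the plan attributes and the weighting are illustrative assumptions:

```python
# Illustrative ranking of candidate treatment plans. 'effectiveness' and
# a normalized 'cost', both in [0, 1], are assumed attributes.

def rank_plans(plans, cost_weight=0.3):
    """Order plans by a weighted score that rewards effectiveness and
    penalizes drug cost; cost_weight=0 ranks purely on effectiveness."""
    def score(plan):
        return (1.0 - cost_weight) * plan["effectiveness"] - cost_weight * plan["cost"]
    return sorted(plans, key=score, reverse=True)
```

The doctor would still review the ranked options with the patient before deciding on a treatment.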
  • diagnosis and treatment databases may be continuously updated, e.g., by health care professionals, so that an optimal treatment may be administered to a particular patient, e.g., a patient identified as member of a certain risk group.
  • sensors and measurement techniques may be advantageously combined to perform multiple functions using a reduced number of sensors.
  • an optical sensor may be used as a thermal sensor by utilizing IR technology to measure body temperature.
  • some or all data collected by diagnostic system 100 may be processed and analyzed directly within automated diagnostic and treatment system 102 or transmitted to an external reading device (not shown in FIG. 1 ) for further processing and analysis, e.g., to enable additional diagnostics.
  • FIG. 2 illustrates an exemplary patient diagnostic measurement system according to embodiments of the present disclosure.
  • patient diagnostic measurement system 200 comprises microcontroller 202 , spectrum imaging device, e.g., spectrum imaging device camera 204 , monitor 206 , patient-medical equipment activity tracking sensors, e.g., inertial sensor 208 , communications controller 210 , medical instruments 224 , identifiable marker, e.g., identifiable marker IR LED 226 , power management unit 230 , and battery 232 .
  • Each component may be coupled directly or indirectly by electrical wiring, wirelessly, or optically to any other component in patient diagnostic measurement system 200 .
  • Inertial sensor 208 may also be referred to as patient-equipment activity sensor 208 or simply sensor 208 .
  • Medical instrument 224 comprises one or more devices that are capable of measuring physical and non-physical characteristics of a patient that, in embodiments, may be customized, e.g., according to varying anatomies among patients, irregularities on a patient's skin, and the like.
  • medical instrument 224 is a combination of diagnostic medical devices that generate diagnostic data based on patient characteristics.
  • Exemplary diagnostic medical devices are heart rate sensors, otoscopes, digital stethoscopes, in-ear thermometers, blood oxygen sensors, high-definition cameras, spirometers, blood pressure meters, respiration sensors, skin resistance sensors, glucometers, ultrasound devices, electrocardiographic sensors, body fluid sample collectors, eye slit lamps, weight scales, and any devices known in the art that may aid in performing a medical diagnosis.
  • patient characteristics and vital signs data may be received from and/or compared against wearable or implantable monitoring devices that gather sample data, e.g., a fitness device that monitors physical activity.
  • One or more medical instruments 224 may be removably attachable directly to a patient's body, e.g., torso, via patches or electrodes that may use adhesion to provide good physical or electrical contact.
  • medical instruments 224 , e.g., a contact-less (or non-contact) thermometer, may perform contact-less measurements some distance away from the patient's body.
  • Non-contact sensors may also support measurements for electrocardiograms (EKG) and electroencephalograms (EEG).
  • microcontroller 202 may be a secure microcontroller that securely communicates information in encrypted form to ensure privacy and the authenticity of measured data and activity sensor and patient-equipment proximity information and other information in patient diagnostic measurement system 200 . This may be accomplished by taking advantage of security features embedded in hardware of microcontroller 202 and/or software that enables security features during transit and storage of sensitive data. Each device in patient diagnostic measurement system 200 may have keys that handshake to perform authentication operations on a regular basis.
  • Spectrum imaging device camera 204 is any audio/video device that may capture patient images and sound at any frequency or image type.
  • Monitor 206 is any screen or display device that may be coupled to camera, sensors and/or any part of patient diagnostic measurement system 200 .
  • Patient-equipment activity tracking inertial sensor 208 is any single or multi-dimensional sensor, such as an accelerometer, a multi-axis gyroscope, pressure sensor, and a magnetometer capable of providing position, motion, pressure on medical equipment or orientation data based on patient interaction. Patient-equipment activity tracking inertial sensor 208 may be attached to (removably or permanently) or embedded into medical instrument 224 .
  • Identifiable marker IR LED 226 represents any device, heat source, reflective material, proximity beacon, altimeter, etc., that may be used by microcontroller 202 as an identifiable marker. Like patient-equipment activity tracking inertial sensor 208 , identifiable marker IR LED 226 may be attachable to or embedded into medical instrument 224 .
  • communication controller 210 is a wireless communications controller attached either permanently or temporarily to medical instrument 224 or the patient's body to establish a bi-directional wireless communications link and transmit data, e.g., between sensors and microcontroller 202 using any wireless communication protocol known in the art, such as Bluetooth Low Energy, e.g., via an embedded antenna circuit that wirelessly communicates the data.
  • electromagnetic fields generated by such antenna circuit may be of any suitable type.
  • the operating frequency may be located in the ISM frequency band, e.g., 13.56 MHz.
  • data received by communications controller 210 , which may be a wireless controller, may be forwarded to a host device (not shown) that may run a software application.
  • power management unit 230 is coupled to microcontroller 202 to provide energy to, e.g., microcontroller 202 and communication controller 210 .
  • Battery 232 may be a back-up battery for power management unit 230 or a battery in any one of the devices in patient diagnostic measurement system 200 .
  • One of ordinary skill in the art will appreciate that one or more devices in patient diagnostic measurement system 200 may be operated from the same power source (e.g., battery 232 ) and perform more than one function at the same or different times.
  • additional electronics such as filtering elements, etc., may be implemented to support the functions of medical instrument equipment measurement or usage monitoring and tracking in patient diagnostic measurement system 200 according to the objectives of the invention.
  • a patient may use medical instrument 224 to gather patient data based on physical and non-physical patient characteristics, e.g., vital signs data, images, sounds, and other information useful in the monitoring and diagnosis of a health-related condition.
  • the patient data is processed by microcontroller 202 and may be stored in a database (not shown).
  • the patient data may be used to establish baseline data for a patient health profile against which subsequent patient data may be compared.
  • patient data may be used to create, modify, or update EHR data. Gathered medical instrument equipment data, along with any other patient and sensor data, may be processed directly by patient diagnostic measurement system 200 or communicated to a remote location for analysis, e.g., to diagnose existing and expected health conditions to benefit from early detection and prevention of acute conditions or aid in the development of novel medical diagnostic methods.
  • medical instrument 224 is coupled to a number of sensors, such as patient-equipment tracking inertial sensor 208 and/or identifiable marker IR LED 226 , that may monitor a position/orientation of medical instrument 224 relative to the patient's body when a medical equipment measurement is taken.
  • sensor data generated by sensor 208 , identifiable marker IR LED 226 or other sensors may be used in connection with, e.g., data generated by spectrum imaging device camera 204 , proximity sensors, transmitters, bolometers, or receivers to provide feedback to the patient to aid the patient in properly aligning medical instrument 224 relative to the patient's body part of interest when performing a diagnostic measurement.
  • not all sensors 208 , identifiable marker IR LED 226 , beacons, pressure sensors, altimeters, etc., need to operate at all times. Any number of sensors may be partially or completely disabled, e.g., to conserve energy.
  • the sensor emitter comprises a light signal emitted by IR LED 226 or any other identifiable marker that may be used as a reference signal.
  • the reference signal may be used to identify a location, e.g., within an image and based on a characteristic that distinguishes the reference from other parts of the image.
  • the reference signal is representative of a difference between the position of medical instrument 224 and a preferred location relative to a patient's body.
  • spectrum imaging device camera 204 displays, e.g., via monitor 206 , the position of medical instrument 224 and the reference signal at the preferred location so as to allow the patient to determine the position of medical instrument 224 and adjust the position relative to the preferred location, displayed by spectrum imaging device camera 204 .
  • Spectrum imaging device camera 204 proximity sensor, transmitter, receiver, bolometer, or any other suitable device may be used to locate or track the reference signal, e.g., within the image, relative to a body part of the patient.
  • this augmented reality (AR) method may be accomplished by using an overlay method that overlays an image of a body part of the patient against an ideal model of device usage to enable real-time feedback for the patient.
  • the reference signal along with signals from other sensors, e.g., patient-equipment activity inertial sensor 208 , may be used to identify a position, location, angle, orientation, or usage associated with medical instrument 224 to monitor and guide a patient's placement of medical instrument 224 at a target location and accurately activate a device for measurement.
  • mixed reality methods may be used in which users wearing AR goggles have virtual objects overlaid on real-world objects (e.g., a virtual clock on a real wall).
  • upon receipt of a request signal, microcontroller 202 activates one or more medical instruments 224 to perform measurements, and each activated instrument sends data related to the measurement back to microcontroller 202 .
  • the measured data and other data associated with a physical condition may be automatically recorded and a usage accuracy of medical instrument 224 may be monitored.
  • microcontroller 202 uses an image in any spectrum, a motion signal, and/or an orientation signal from patient-equipment activity inertial sensor 208 to compensate or correct the vital signs data output by medical instrument 224 .
  • Data compensation or correction may comprise filtering out certain data as likely being corrupted by parasitic effects and erroneous readings that result from medical instrument 224 being exposed to unwanted movements caused by perturbations or, e.g., the effect of movements of the patient's target measurement body part.
  • signals from two or more medical instruments 224 are combined, for example, to reduce signal latency and increase correlation between signals to further improve the ability of patient diagnostic measurement system 200 to reject motion artifacts to remove false readings and, therefore, enable a more accurate interpretation of the measured vital signs data.
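A minimal sketch of motion-artifact rejection when combining two instruments' signals, assuming (for illustration) a sample-aligned motion channel and an arbitrary artifact limit:

```python
# Sketch of motion-artifact rejection: average paired samples from two
# instruments, discarding samples captured during excessive motion.

def fuse_readings(signal_a, signal_b, motion, motion_limit=0.5):
    """Return the averaged samples whose co-registered motion reading
    stayed within the artifact limit."""
    return [
        (a + b) / 2.0
        for a, b, m in zip(signal_a, signal_b, motion)
        if abs(m) <= motion_limit
    ]
```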
  • spectrum imaging device camera 204 displays actual or simulated images and videos of the patient and medical instrument 224 to assist the patient in locating a desired position for medical instrument 224 when performing the measurement so as to increase measurement accuracy.
  • Spectrum imaging device camera 204 may use image or facial recognition software to identify and display eyes, mouth, nose, ears, torso, or any other part of the patient's body as reference.
  • patient diagnostic measurement system 200 uses machine vision software that analyzes measured image data and compares image features to features in a database, e.g., to detect an incomplete image for a target body part, to monitor the accuracy of a measurement and determine a corresponding score. In embodiments, if the score falls below a certain threshold, patient diagnostic measurement system 200 may provide detailed guidance for improving measurement accuracy or to receive a more complete image, e.g., by providing instructions on how to change an angle or depth of an otoscope relative to the patient's ear.
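The feature-comparison scoring with a guidance threshold might look like the following sketch; the feature labels and the 0.8 threshold are assumptions:

```python
# Sketch of the completeness check: score a frame by the fraction of
# expected anatomical features the vision pipeline detected.

def image_quality_check(detected, expected, threshold=0.8):
    """Return (score, guidance); below threshold the patient is asked
    to reposition the instrument and retake the image."""
    score = len(set(detected) & set(expected)) / len(expected)
    if score < threshold:
        return score, "adjust angle or depth and retake image"
    return score, "image accepted"
```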
  • the machine vision software may use an overlay method to mimic a patient's posture/movements to provide detailed and interactive instructions, e.g., by displaying a character, image of the patient, graphic, or avatar on monitor 206 to provide feedback to the patient.
  • the instructions, image, or avatar may start or stop and decide what help instruction to display based on the type of medical instrument 224 , the data from spectrum imaging device camera 204 , patient-equipment activity sensors inertial sensors 208 , bolometer, transmitter and receiver, and/or identifiable marker IR LED 226 (an image, a measured position or angle, etc.), and a comparison of the data to idealized data. This further aids the patient in correctly positioning and operating medical instrument 224 relative to the patient's body, ensures a high level of accuracy when operating medical instrument 224 , and solves potential issues that the patient may encounter when using medical instrument 224 .
  • instructions may be provided via monitor 206 and describe in audio/visual format and in any desired level of detail, how to use medical instrument 224 to perform a diagnostic test or measurement, e.g., how to take temperature, so as to enable patients to perform measurements of clinical-grade accuracy.
  • each sensor 208 and identifiable marker IR LED 226 , e.g., proximity sensor, bolometer, or transmitter/receiver, may be associated with a device usage accuracy score.
  • a device usage accuracy score generator (not shown), which may be implemented in microcontroller 202 , may use the sensor data to generate a medical instrument usage accuracy score that is representative of the reliability of medical instrument 224 measurement on the patient.
  • the score may be based on a difference between an actual position of medical instrument 224 and a preferred position. In addition, the score may be based on detecting a motion, e.g., during a measurement.
  • the device usage accuracy score is derived from an error vector generated for one or more sensors 208 , identifiable marker IR LED 226 , and/or deep learning system 105 . The resulting device usage accuracy score may be used when generating or evaluating medical diagnosis data.
  • microcontroller 202 analyzes the patient-measured medical instrument data to generate a trust score indicating whether the data falls within the acceptable range of the medical instrument, for example, by comparing the medical instrument measurement data against reference measurement data that would be expected from medical instrument 224 . As with the device usage accuracy score, the trust score may be used when generating or evaluating medical diagnosis data.
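A simple illustration of deriving a trust score from the instrument's expected measurement range; the linear decay outside the range is an assumption:

```python
# Illustrative trust score: 1.0 inside the instrument's expected range,
# decaying linearly with distance outside it.

def measurement_trust(value, ref_low, ref_high):
    """Score how plausibly `value` lies within the instrument's
    acceptable range [ref_low, ref_high]."""
    if ref_low <= value <= ref_high:
        return 1.0
    span = ref_high - ref_low
    dist = (ref_low - value) if value < ref_low else (value - ref_high)
    return max(0.0, 1.0 - dist / span)
```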
  • FIG. 3A illustrates an exemplary system 300 for measuring a patient's body parts utilizing optical medical instruments and deep learning systems according to embodiments of the present disclosure.
  • System 300 may comprise automated diagnostic and treatment system 102 , deep learning system 105 , doctor interface station 104 and patient kiosk 301 .
  • Patient kiosk 301 may comprise patient interface station 106 , kiosk cameras including sensors/camera- 1 303 , sensors/camera- 2 304 and instrument 312 .
  • patient 310 is present in patient kiosk 301 .
  • Kiosk cameras may also be referred to as camera modules.
  • Instrument 312 may be, but not limited to, an otoscope.
  • Deep learning system 105 may be, but not limited to, a deep convolution neural network (DCNN).
  • An objective of system 300 includes the accurate acquisition of medical measurement data of a target spot of a body part of patient 310 .
  • the automated diagnostic and treatment system 102 provides instructions to patient 310 to allow patient 310 to precisely position instrument 312 in proximity to a target spot of a body part of patient 310 . That is, the movement of instrument 312 may be controlled by the patient 310 . These instructions may be generated based on a series of images acquired from the kiosk cameras and instrument 312 . Subsequent images from instrument 312 are analyzed by automated diagnostic and treatment system 102 utilizing database 103 and deep learning system 105 to obtain measured medical data.
  • the accuracy of the instrument 312 positioning and the medical measured data may be determined by automated diagnostic and treatment system 102 , and deep learning system 105 via an error threshold measurement.
  • quality medical measurements are generated for the target spot of a body part of patient 310 .
  • patient 310 has positioned instrument 312 , which may be an otoscope, in proximity to the ear of patient 310 .
  • instrument 312 may be positioned in proximity to another body part.
  • patient 310 may be fitted with haptic/tactile feedback sensors which may augment the patient guidance instructions for taking measurements.
  • haptic feedback in the form of a buzzer/vibrator could be elicited on the side of the device toward which movement is required (e.g., the left side).
  • haptic feedback devices may include, but are not limited to, vibration motors (similar to those found in smart phones) fitted around the medical instrument equipment to give tactile instructions (e.g. vibration on the left side of the medical device to instruct the patient to move the medical device left), or flywheels to give kinesthetic feedback such as resisting movement in undesirable directions, and aiding movement in the correct direction (similar to certain virtual reality peripherals).
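A minimal sketch of the directional haptic cue described above; the deadband value, the coordinate convention (dx, dy as target minus actual position), and the motor names are illustrative assumptions, not from the disclosure:

```python
def haptic_cue(dx, dy, deadband=0.2):
    # dx, dy: target instrument position minus actual position, in
    # arbitrary units; positive dx means the patient should move right.
    # Returns the vibration motor to pulse on the side toward which
    # movement is required, or None once inside the deadband.
    if abs(dx) <= deadband and abs(dy) <= deadband:
        return None
    if abs(dx) >= abs(dy):
        return "motor_right" if dx > 0 else "motor_left"
    return "motor_top" if dy > 0 else "motor_bottom"
```

A flywheel-based kinesthetic variant would instead modulate resistance as a function of the same error vector.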
  • Instrument 312 may be wirelessly coupled to patient interface station 106 .
  • the automated diagnostic and treatment system 102 , deep learning system 105 , doctor interface station 104 and patient interface station 106 are operable as previously described herein relative to FIG. 1 .
  • the automated diagnostic and treatment system 102 may send a command to the patient interface station 106 requesting sensors/camera- 1 303 and sensors/camera- 2 304 to capture images of patient 310 and instrument 312 .
  • the captured images (sometimes called “posed” images) may be analyzed by the deep learning system 105 to determine the position of instrument 312 relative to the body part, which may include the location of instrument 312 to the body part, the angle of instrument 312 to the body part, and the rotation of instrument 312 to the body part.
  • the automated diagnostic and treatment system 102 may provide an initial set of instructions to patient 310 to assist patient 310 to position instrument 312 in proximity to the target spot of the body part.
  • the first set of instructions may request the patient to position the instrument 312 adjacent to the patient's right ear.
  • the automated diagnostic and treatment system 102 captures a second set of images and determines if instrument 312 is within a pose threshold of the target spot of the body part.
  • the pose threshold may be less than one inch.
  • These instructions may be considered a “coarse” set of instructions, or “coarse tuning”, inasmuch as the estimates are based on images obtained by sensors/camera- 1 303 and sensors/camera- 2 304 that have a distance d 1 between the camera and the instrument 312 .
  • Distance d 1 may be several feet.
  • Instrument 312 may comprise a camera that may be located on one end of instrument 312 .
  • Automated diagnostic and treatment system 102 may command, via patient interface station 106 , instrument 312 to capture images of a target spot of the body part. Inasmuch as instrument 312 is in close proximity to the body part, these captured images provide detailed information on the position of instrument 312 relative to the body part.
  • the images from instrument 312 are analyzed, resulting in another set of instructions for patient 310 for the positioning of instrument 312 relative to the target spot of the body part.
  • patient 310 may position instrument 312 more accurately relative to the target spot of the body part of patient 310 .
  • the another set of instructions may have refined, i.e., “fine tuned”, the initial set of instructions because the another set of instructions is based on images captured where the instrument camera is distance d 2 from the body part. Distance d 2 may be less than 1 inch.
  • This process may be repeated as newly captured images by instrument 312 are analyzed by the automated diagnostic and treatment system 102 to determine the updated positions of instrument 312 relative to the body part.
  • the updated positions may include the location of instrument 312 to the body part, the angle of instrument 312 to the body part, and the rotation of instrument 312 to the body part.
  • the automated diagnostic and treatment system 102 provides yet another set of instructions to patient 310 to assist patient 310 to position instrument 312 in proximity to the target spot of the body part. This yet another set of instructions may further refine the initial sets of instructions.
  • the instructions may be communicated to patient 310 via a visual display or via haptic/tactile feedback sensors.
  • the instructions may be refined by the automated diagnostic and treatment system 102 analyzing the images from the instrument camera and the images from the kiosk cameras by utilizing deep learning system 105 and database 103 to obtain updated estimates of 1) the position of instrument 312 relative to the target spot of the body part and 2) measured body parts.
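The coarse/fine guidance loop described above can be sketched as follows; the callback names, thresholds, and iteration cap are hypothetical, and in the actual system the pose error would come from the deep learning analysis of the kiosk-camera and instrument-camera images:

```python
def guide_instrument(get_pose_error, issue_instruction,
                     pose_threshold=1.0, error_threshold=0.1,
                     max_iterations=20):
    # get_pose_error(): returns the current distance (in inches) between
    #   the instrument and the target spot, estimated from images.
    # issue_instruction(phase, error): presents the next movement
    #   instruction to the patient ("coarse" from kiosk cameras at
    #   distance d1, "fine" from the instrument camera at distance d2).
    # Returns True once the instrument is within the fine error
    # threshold and a measurement can be taken.
    for _ in range(max_iterations):
        error = get_pose_error()
        if error <= error_threshold:
            return True
        phase = "coarse" if error > pose_threshold else "fine"
        issue_instruction(phase, error)
    return False
```

With a simulated error sequence of 5.0, 2.0, 0.5, 0.05 inches, the loop issues two coarse instructions, one fine instruction, and then succeeds.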
  • the medical measurement data is provided to a physician by the automated diagnostic and treatment system 102 via the doctor interface station 104 .
  • FIG. 3B illustrates another exemplary system 320 for measuring a patient's body parts or a medical instrument (not shown) utilizing cameras and sensors according to embodiments of the present disclosure.
  • System 320 is based on the principle of determining the time-of-flight (TOF) for each emitted pulse of light.
  • Light detection and ranging systems such as camera/sensors 322 , may employ pulses of light to measure distance to an object based on the time of flight (TOF) of each pulse of light.
  • a pulse of light emitted from a light source of a light detection and ranging system interacts with a distal object. A portion of the light reflects from the object and returns to a detector of the light detection and ranging system. Based on the time elapsed between emission of the pulse of light and detection of the returned pulse of light, the distance to the object may be estimated.
  • the light pulse may hit multiple objects, each having a different distance from the laser, causing multi-return signals to be received by the light detection and ranging system detector.
  • Multi-return signals may provide more information about the environment to improve mapping or reconstruction.
  • a dedicated detector may be required to precisely identify each return with its associated time delay information.
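The TOF distance estimate, including the multi-return case, reduces to halving the round-trip optical path; the helper names and the 10 ns example value are illustrative:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance(round_trip_seconds):
    # The pulse travels out to the object and back, so the one-way
    # distance is half the round-trip path length.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

def multi_return_distances(round_trip_times):
    # One pulse may hit multiple objects; each detected return, with its
    # own time delay, maps to a distinct distance (sorted near-to-far).
    return sorted(tof_distance(t) for t in round_trip_times)

# A 10 ns round trip corresponds to roughly 1.5 m
print(tof_distance(10e-9))
```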
  • the resulting images may be referred to as “pose images”.
  • one or more camera modules may provide a pose estimate based on time of flight of multiple light signals emitted from each of the one or more camera modules and reflected back to the one or more camera modules.
  • camera/sensors 322 may emit signals S 1 , S 2 and S 3 . These signals may reflect off patient 330 or off a fixed surface, such as a wall or floor. In the kiosk, the walls and floor may have identifiers that can assist in the image identification of the body part of patient 330 .
  • the reflection of signals S 1 , S 2 and S 3 may be detected by camera/sensors 322 and communicated to a diagnostic system, such as automated diagnostic and treatment system 102 .
  • Automated diagnostic and treatment system 102 can determine the distances associated with signals S 1 , S 2 and S 3 based on the TOF of each signal. This process may result in a pose estimate, which may be viewed as a pose image.
  • Identifier 324 may assist in identifying a position of a patient's body parts or a medical instrument.
  • FIG. 4A illustrates an exemplary method 400 of measuring a patient's body part utilizing an optical medical instrument according to embodiments of the present disclosure.
  • instrument 402 may be positioned in an ear of patient 404 .
  • the patient 404 can receive instructions from a diagnostic system to improve the accuracy of the position of instrument 402 relative to a target spot of a body part of patient 404 .
  • FIG. 4B illustrates an exemplary optical medical instrument 420 according to embodiments of the present disclosure.
  • Optical medical instrument 420 may comprise instrument 422 , which may comprise instrument camera 424 , identifiable marker 428 and wireless device 426 .
  • Instrument 422 may also include a serial number on its body. Instrument 422 may be easily held and positioned by a user, such as patient 310 of FIG. 3A .
  • While instrument 422 includes wireless device 426 , in other embodiments, instrument 422 may be coupled to a diagnostic system via a wired or optical technology.
  • While instrument camera 424 is illustrated as located on a tip or one end of instrument 422 , in other embodiments, instrument camera 424 may be located in another location on instrument 422 .
  • identifiable marker 428 or the serial number may assist in determining these measurements.
  • identifiable marker 428 may be an image of a ring of flowers and may comprise reflective material.
  • Instrument 402 and instrument 422 may be otoscopes.
  • FIG. 4C illustrates an exemplary optical medical instrument 440 being positioned at an angle, position and rotation according to embodiments of the present disclosure.
  • FIG. 5A and FIG. 5B are flowcharts 500 and 540 depicting illustrative methods for making accurate medical patient measurements, according to embodiments of the present disclosure.
  • the methods comprise the steps of:
  • This set of instructions may direct the patient to position the instrument at a particular ear.
  • Steps 502 , 504 , 506 , 508 and 510 define a coarse tuning method based on pose images.
  • Step 512 Determining whether the instrument is within a pose threshold of a target spot of the body part.
  • step 513 Then repeat steps 508 , 510 , and 512 .
  • Steps 515 and 516 define a fine tuning method.
  • a deep learning system may be, but without limitation, a deep convolution neural network (DCNN).
  • Step 518 The deep learning system makes this determination.
  • step 522 providing medical measured data for the target spot of the body part to a physician and to a deep learning system database.
  • step 519 providing another set of instructions, based on the third/another set of estimates, to the patient to further adjust the position of the instrument relative to the target spot of the body part.
  • step 516 Repeating the analysis of the another set of images from the instrument camera, i.e., step 516 and subsequent steps.
  • FIG. 5C comprises flowchart 560 of an illustrative method for making accurate medical otoscope patient measurements, according to embodiments of the present disclosure.
  • the method comprises the steps of:
  • a deep learning system may be utilized in this step.
  • the deep learning system may be, but not limited to, a deep convolution neural network (step 564 )
  • step 566 Determining whether an error threshold to ensure measurement accuracy for the otoscope positioning relative to the selected ear of the patient has been achieved.
  • step 570 providing medical measured data for the selected ear to a physician and to a deep learning database.
  • step 568 receiving additional images from the otoscope based on additional positioning instructions and optionally additional images from the kiosk cameras, then repeating steps 564 and 566 .
  • a time may be calculated at which the selected treatment is expected to show a result and patient feedback may be requested, e.g., as part of a feedback process, to improve diagnosis and treatment reliability.
  • the selected treatment and/or the patient feedback may be used to adjust one or more of the metrics. For example, based on one or more of the metrics, a series of treatment plans may be generated by using an algorithm that combines the metrics. In embodiments, calculating the selected treatment comprises using cost as a factor.
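One way the metrics might be combined into ranked treatment plans, with cost as a factor, is a weighted score; the metric names, weights, and sign convention (cost subtracts from the score) are assumptions for illustration only:

```python
def rank_treatments(treatments, weights):
    # treatments: dicts with "name", "efficacy", "reliability", and
    # "cost", each metric normalized to 0..1 (lower cost is better).
    # Patient feedback could adjust the weights over time.
    def score(t):
        return (weights["efficacy"] * t["efficacy"]
                + weights["reliability"] * t["reliability"]
                - weights["cost"] * t["cost"])
    return [t["name"] for t in sorted(treatments, key=score, reverse=True)]
```

For example, a cheap, reliable treatment can outrank a more efficacious but costly one under cost-sensitive weights.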
  • one or more computing systems may be configured to perform one or more of the methods, functions, and/or operations presented herein.
  • Systems that implement at least one or more of the methods, functions, and/or operations described herein may comprise an application or applications operating on at least one computing system.
  • the computing system may comprise one or more computers and one or more databases.
  • the computer system may be a single system, a distributed system, a cloud-based computer system, or a combination thereof.
  • the present disclosure may be implemented in any instruction-execution/computing device or system capable of processing data, including, without limitation, phones, laptop computers, desktop computers, and servers.
  • the present disclosure may also be implemented into other computing devices and systems.
  • aspects of the present disclosure may be implemented in a wide variety of ways including software (including firmware), hardware, or combinations thereof.
  • the functions to practice various aspects of the present disclosure may be performed by components that are implemented in a wide variety of ways including discrete logic components, one or more application specific integrated circuits (ASICs), and/or program-controlled processors. It shall be noted that the manner in which these items are implemented is not critical to the present disclosure.
  • system 600 includes a central processing unit (CPU) 601 that provides computing resources and controls the computer.
  • CPU 601 may be implemented with a microprocessor or the like, and may also include a graphics processor and/or a floating point coprocessor for mathematical computations.
  • System 600 may also include a system memory 602 , which may be in the form of random-access memory (RAM) and read-only memory (ROM).
  • An input controller 603 represents an interface to various input device(s) 604 , such as a keyboard, mouse, or stylus.
  • a scanner controller 605 which communicates with a scanner 606 .
  • System 600 may also include a storage controller 607 for interfacing with one or more storage devices 608 each of which includes a storage medium such as magnetic tape or disk, or an optical medium that might be used to record programs of instructions for operating systems, utilities and applications which may include embodiments of programs that implement various aspects of the present disclosure.
  • Storage device(s) 608 may also be used to store processed data or data to be processed in accordance with the disclosure.
  • System 600 may also include a display controller 609 for providing an interface to a display device 611 , which may be a cathode ray tube (CRT), a thin film transistor (TFT) display, or other type of display.
  • System 600 may also include a printer controller 612 for communicating with a printer 613 .
  • a communications controller 614 may interface with one or more communication devices 615 , which enables system 600 to connect to remote devices through any of a variety of networks including the Internet, an Ethernet cloud, an FCoE/DCB cloud, a local area network (LAN), a wide area network (WAN), a storage area network (SAN) or through any suitable electromagnetic carrier signals including infrared signals.
  • bus 616 which may represent more than one physical bus.
  • various system components may or may not be in physical proximity to one another.
  • input data and/or output data may be remotely transmitted from one physical location to another.
  • programs that implement various aspects of this disclosure may be accessed from a remote location (e.g., a server) over a network.
  • Such data and/or programs may be conveyed through any of a variety of machine-readable media including, but not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices.
  • Embodiments of the present disclosure may be encoded upon one or more non-transitory computer-readable media with instructions for one or more processors or processing units to cause steps to be performed.
  • the one or more non-transitory computer-readable media shall include volatile and non-volatile memory.
  • alternative implementations are possible, including a hardware implementation or a software/hardware implementation.
  • Hardware-implemented functions may be realized using ASIC(s), programmable arrays, digital signal processing circuitry, or the like. Accordingly, the “means” terms in any claims are intended to cover both software and hardware implementations.
  • the term “computer-readable medium or media” as used herein includes software and/or hardware having a program of instructions embodied thereon, or a combination thereof.
  • embodiments of the present disclosure may further relate to computer products with a non-transitory, tangible computer-readable medium that have computer code thereon for performing various computer-implemented operations.
  • the media and computer code may be those specially designed and constructed for the purposes of the present disclosure, or they may be of the kind known or available to those having skill in the relevant arts.
  • Examples of tangible computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices.
  • Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter.
  • Embodiments of the present disclosure may be implemented in whole or in part as machine-executable instructions that may be in program modules that are executed by a processing device.
  • Examples of program modules include libraries, programs, routines, objects, components, and data structures. In distributed computing environments, program modules may be physically located in settings that are local, remote, or both.
  • an information handling system may include any instrumentality or aggregate of instrumentalities operable to compute, calculate, determine, classify, process, transmit, receive, retrieve, originate, switch, store, display, communicate, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes.
  • an information handling system may be a personal computer (e.g., desktop or laptop), tablet computer, mobile device (e.g., personal digital assistant (PDA) or smart phone), server (e.g., blade server or rack server), a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price.
  • the information handling system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory. Additional components of the information handling system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, touchscreen and/or a video display. The information handling system may also include one or more buses operable to transmit communications between the various hardware components.


Abstract

Presented are systems and methods for the accurate acquisition of medical measurement data of a body part of a patient. To assist in acquiring accurate medical measurement data, an automated diagnostic and treatment system provides instructions to the patient to allow the patient to precisely position a medical instrument in proximity to a target spot of a body part of the patient. These instructions are generated based on a series of images acquired from a kiosk camera and an instrument camera. Subsequent images from the instrument camera are analyzed by the automated diagnostic and treatment system utilizing a database and a deep convolution neural network (DCNN) to obtain measured medical data. An error threshold measurement by the automated diagnostic and treatment system may determine the accuracy of the instrument positioning and the medical measured data. The measured medical data may be communicated to a physician.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This patent application is a continuation of and claims priority benefit to co-pending and commonly-owned U.S. patent application Ser. No. 15/913,801, filed on Mar. 6, 2018, entitled “SYSTEMS AND METHODS FOR OPTICAL MEDICAL INSTRUMENT PATIENT MEASUREMENTS”, and listing James Stewart Bates, Kristopher Perry, Chaitanya Prakash Potaraju, Pranav Bhatkal, Aditya Shirvalkar, Tristan Royal as inventors, which patent application is hereby incorporated by reference in its entirety and for all purposes.
  • BACKGROUND Technical Field
  • The present disclosure relates to computer-assisted health care, and more particularly, to systems and methods for providing a computer-assisted medical diagnostic architecture in which a patient is able to perform certain medical measurements with the assistance of a medical kiosk or medical camera system.
  • Background of the Invention
  • One skilled in the art will recognize the importance of addressing the ever-increasing cost of providing consistent, high-quality medical care to patients. Governmental agencies and insurance companies are attempting to find solutions that reduce the cost of medical care without significantly reducing the quality of medical examinations provided by doctors and nurses. Through consistent regulation changes, electronic health record changes, and pressure from payers, both health care facilities and providers are looking for ways to make patient intake, triage, diagnosis, treatment, electronic health record data entry, billing, and patient follow-up activity more efficient, provide a better patient experience, and increase the doctor-to-patient throughput per hour, while simultaneously reducing cost.
  • The desire to increase access to health care providers, a pressing need to reduce health care costs in developed countries and the goal of making health care available to a larger population in less developed countries have fueled the idea of telemedicine. In most cases, however, video or audio conferencing with a doctor does not provide sufficient patient-physician interaction that is necessary to allow for a proper medical diagnosis to efficiently serve patients.
  • What is needed are systems and methods that ensure reliable remote or local medical patient intake, triage, diagnosis, treatment, electronic health record data entry/management, billing, and patient follow-up activity so that physicians can allocate patient time more efficiently and, in some instances, allow individuals to manage their own health, thereby reducing health care costs.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • References will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
  • FIG. 1 illustrates an exemplary medical diagnostic system according to embodiments of the present disclosure.
  • FIG. 2 illustrates an exemplary medical instrument equipment system according to embodiments of the present disclosure.
  • FIG. 3A illustrates an exemplary system for measuring a patient's body parts utilizing optical medical instruments and deep learning systems according to embodiments of the present disclosure.
  • FIG. 3B illustrates another exemplary system for measuring a patient's body parts utilizing cameras and sensors according to embodiments of the present disclosure
  • FIG. 4A illustrates an exemplary method of measuring a patient's body part utilizing an optical medical instrument according to embodiments of the present disclosure.
  • FIG. 4B illustrates an exemplary optical medical instrument according to embodiments of the present disclosure.
  • FIG. 4C illustrates an exemplary optical medical instrument being positioned at an angle, position, and rotation according to embodiments of the present disclosure.
  • FIG. 5A is a flowchart of an illustrative method for making accurate medical instrument patient measurements, according to embodiments of the present disclosure.
  • FIG. 5B is another flowchart of an illustrative method for making accurate medical instrument patient measurements, according to embodiments of the present disclosure.
  • FIG. 5C is yet another flowchart of an illustrative method for making accurate medical instrument patient measurements, according to embodiments of the present disclosure.
  • FIG. 6 depicts a simplified block diagram of a computing device/information handling system according to embodiments of the present disclosure.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In the following description, for purposes of explanation, specific details are set forth in order to provide an understanding of the disclosure. It will be apparent, however, to one skilled in the art that the disclosure can be practiced without these details. Furthermore, one skilled in the art will recognize that embodiments of the present disclosure, described below, may be implemented in a variety of ways, such as a process, an apparatus, a system, a device, or a method on a tangible computer-readable medium.
  • Elements/components shown in diagrams are illustrative of exemplary embodiments of the disclosure and are meant to avoid obscuring the disclosure. It shall also be understood throughout this discussion that components may be described as separate functional units, which may comprise sub-units, but those skilled in the art will recognize that various components, or portions thereof, may be divided into separate components or may be integrated together, including integrated within a single system or component. It should be noted that functions or operations discussed herein may be implemented as components/elements. Components/elements may be implemented in software, hardware, or a combination thereof.
  • Furthermore, connections between components or systems within the figures are not intended to be limited to direct connections. Rather, data between these components may be modified, re-formatted, or otherwise changed by intermediary components. Also, additional or fewer connections may be used. It shall also be noted that the terms “coupled,” “connected,” or “communicatively coupled” shall be understood to include direct connections, indirect connections through one or more intermediary devices, and wireless connections.
  • Reference in the specification to “one embodiment,” “preferred embodiment,” “an embodiment,” or “embodiments” means that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the disclosure and may be in more than one embodiment. The appearances of the phrases “in one embodiment,” “in an embodiment,” or “in embodiments” in various places in the specification are not necessarily all referring to the same embodiment or embodiments. The terms “include,” “including,” “comprise,” and “comprising” shall be understood to be open terms and any lists that follow are examples and not meant to be limited to the listed items. Any headings used herein are for organizational purposes only and shall not be used to limit the scope of the description or the claims.
  • Furthermore, the use of certain terms in various places in the specification is for illustration and should not be construed as limiting. A service, function, or resource is not limited to a single service, function, or resource; usage of these terms may refer to a grouping of related services, functions, or resources, which may be distributed or aggregated.
  • In this document, the term “sensor” refers to a device capable of acquiring information related to any type of physiological condition or activity (e.g., a biometric diagnostic sensor); physical data (e.g., a weight); and environmental information (e.g., an ambient temperature sensor), including hardware-specific information. The term “position” refers to spatial and temporal data (e.g., orientation and motion information). The position data may be, for example and without limitation, a 2-dimensional (2D), 2.5-dimensional (2.5D), or true 3-dimensional (3D) data representation. “Doctor” refers to any health care professional, health care provider, physician, or person directed by a physician. “Patient” is any user who uses the systems and methods of the present invention, e.g., a person being examined or anyone assisting such person. The term “illness” may be used interchangeably with the term “diagnosis.” As used herein, “answer” or “question” refers to one or more of 1) an answer to a question, 2) a measurement or measurement request (e.g., a measurement performed by a “patient”), and 3) a symptom (e.g., a symptom selected by a “patient”).
  • FIG. 1 illustrates an exemplary medical diagnostic system according to embodiments of the present disclosure. Diagnostic system 100 comprises automated diagnostic and treatment system 102, patient interface station 106, doctor interface station 104, deep learning system 105, sensors/cameras 109, and medical instrument equipment 108. Deep learning system 105 may include, for example and without limitation, deep convolutional neural networks (DCNNs), multi-layer perceptrons (MLPs), fully convolutional networks (FCNs), capsule networks, and artificial neural networks (ANNs). Both patient interface station 106 and doctor interface station 104 may be implemented on any tablet, computer, mobile device, or other electronic device. Deep learning system 105 utilizes artificial reality feedback to assist automated diagnostic and treatment system 102 in obtaining improved medical measurements by medical instrument equipment 108. “Deep learning” refers to the use of “stacked neural networks,” that is, networks composed of several layers. The layers are made of nodes, where a node is a place where computation occurs and is loosely patterned on a neuron in the human brain, which fires when it encounters sufficient stimuli. Deep learning systems such as deep learning system 105 may be used in computer vision and have also been applied to acoustic modeling for automatic speech recognition (ASR).
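  • The “stacked layers of nodes” description above can be sketched as a tiny feed-forward network. This is purely an illustrative sketch in Python; the layer sizes, weight values, and sigmoid activation are assumptions for demonstration and are not part of the disclosure:

```python
import math

def layer(inputs, weights, biases):
    """One dense layer: each node sums weighted inputs, adds a bias,
    and 'fires' through a sigmoid activation (loosely neuron-like)."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

def forward(x, network):
    """Pass an input through each stacked layer in turn."""
    for weights, biases in network:
        x = layer(x, weights, biases)
    return x

# Two stacked layers with arbitrary illustrative weights (hypothetical values).
network = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.1, -0.2]),  # hidden layer: 2 nodes
    ([[1.2, -0.7]], [0.05]),                   # output layer: 1 node
]
output = forward([0.9, 0.1], network)  # a single value in (0, 1)
```

In a real deep learning system the weights would be learned from training data rather than fixed by hand; the sketch only shows how stacked layers compose.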
  • Patient interface station 106 comprises patient interface application module 107 and is coupled to sensors/cameras 109. In turn, sensors/cameras 109 may monitor and may capture images of the patient/assistant. Patient interface station 106 receives, from automated diagnostic and treatment system 102, system instructions and/or relayed physician instructions, diagnostic feedback, and results 126. Patient interface station 106 sends to automated diagnostic and treatment system 102 secure raw or processed patient-related data 128. Patient interface station 106 also provides interfaces between sensors/cameras 109, located in patient interface station 106, and the patient/assistant. Patient interface station 106 also provides interfaces to medical instrument equipment 108. Medical instrument equipment 108 may be, for example and without limitation, an ophthalmoscope, intraoral camera, auriscope, or otoscope.
  • Medical instrument equipment 108 is designed mainly to collect diagnostic patient data and may comprise one or more diagnostic devices, for example, in a home diagnostic medical kit, that generate diagnostic data based on physical and non-physical characteristics of a patient. It is noted that diagnostic system 100 may comprise additional sensors and devices that, in operation, collect, process, or transmit characteristic information about the patient, medical instrument usage, orientation, environmental parameters such as ambient temperature, humidity, location, and other useful information that may be used to accomplish the objectives of the present invention.
  • In operation, a patient may enter patient-related data, such as health history, patient characteristics, symptoms, health concerns, medical instrument measured diagnostic data, images, and sound patterns, or other relevant information into patient interface station 106. The patient may use any means of communication, such as voice control, to enter data, e.g., in the form of a questionnaire. Patient interface station 106 may provide the data raw or in processed form to automated diagnostic and treatment system 102, e.g., via a secure communication.
  • In embodiments, the patient may be prompted, e.g., by a software application, to answer questions intended to aid in the diagnosis of one or more medical conditions. The software application may provide guidance by describing how to use medical instrument equipment 108 to administer a diagnostic test or how to make diagnostic measurements for any particular device that may be part of medical instrument equipment 108 so as to facilitate accurate measurements of patient diagnostic data.
  • In embodiments, the patient may use medical instrument equipment 108 to create a patient health profile that serves as a baseline profile. Gathered patient-related data may be securely stored in database 103 or a secure remote server (not shown) coupled to automated diagnostic and treatment system 102. In embodiments, automated diagnostic and treatment system 102, supported by deep learning system 105, enables interaction between a patient and a remotely located health care professional, who may provide instructions to the patient, e.g., by communicating via the software application. A doctor may log into a cloud-based system (not shown) to access patient-related data via doctor interface station 104. Doctor interface station 104 comprises a doctor interface communication module 130. In embodiments, automated diagnostic and treatment system 102 presents automated diagnostic suggestions to a doctor, who may verify or modify the suggested information. Automated diagnostic and treatment system 102 is coupled to doctor interface station 104 and may communicate to doctor interface station 104 alerts, if needed, a risk profile, and data integrity 120.
  • In embodiments, based on one or more patient questionnaires, data gathered by medical instrument equipment 108, patient feedback, and historic diagnostic information, the patient may be provided with instructions, feedback, results 122, and other information pertinent to the patient's health. In embodiments, the doctor may select an illness based on automated diagnostic system suggestions, and/or the sequence of instructions, feedback, and/or results 122 may be adjusted based on decision vectors associated with a medical database. In embodiments, medical instrument equipment 108 uses the decision vectors to generate a diagnostic result, e.g., in response to patient answers and/or measurements of the patient's vital signs.
  • In embodiments, medical instrument equipment 108 comprises a number of sensors, such as accelerometers, gyroscopes, pressure sensors, cameras, bolometers, altimeters, IR LEDs, and proximity sensors, that may be coupled to one or more medical devices, e.g., a thermometer, to assist in performing diagnostic measurements and/or to monitor a patient's use of medical instrument equipment 108 for accuracy. A camera, bolometer, or other spectrum imaging device (e.g., radar), in addition to taking pictures of the patient, may use image or facial recognition software and machine vision to recognize body parts, items, and actions to aid the patient in locating suitable positions for taking a measurement on the patient's body, e.g., by identifying any part of the patient's body as a reference.
  • Examples of the types of diagnostic data that medical instrument equipment 108 may generate comprise body temperature, blood pressure, images, sound, heart rate, blood oxygen level, motion, ultrasound, pressure or gas analysis, continuous positive airway pressure, electrocardiography, electroencephalography, BMI, muscle mass, blood, urine, and any other patient-related data 128. In embodiments, patient-related data 128 may be derived from a non-surgical wearable or implantable monitoring device that gathers sample data.
  • In embodiments, an IR LED, proximity beacon, or other identifiable marker (not shown) may be attached to medical instrument equipment 108 to track the position and placement of medical instrument equipment 108. In embodiments, a camera, bolometer, or other spectrum imaging device uses the identifiable marker as a control tool to aid the camera or the patient in determining the position of medical instrument equipment 108.
  • In embodiments, deep learning system 105, which may include machine vision software, may be used to track and overlay or superimpose, e.g., on a screen, the position of the identifiable marker (e.g., IR LED, heat source, or reflective material) with a desired target location at which the patient should place medical instrument equipment 108, thereby aiding the patient to properly place or align a sensor and ensuring accurate and reliable readings. Once medical instrument equipment 108, e.g., a stethoscope, is placed at the desired target location on a patient's torso, the patient may be prompted by audio or visual cues to breathe according to instructions or perform other actions to facilitate medical measurements and to start a measurement.
  • In embodiments, one or more sensors that may be attached to medical instrument equipment 108 monitor the placement and usage of medical instrument equipment 108 by periodically or continuously recording data and comparing measured data, such as location, movement, rotation and angles, to an expected data model and/or an error threshold to ensure measurement accuracy. A patient may be instructed to adjust an angle, location, rotation or motion of medical instrument equipment 108, e.g., to adjust its state and, thus, avoid low-accuracy or faulty measurement readings. In embodiments, sensors attached or tracking medical instrument equipment 108 may generate sensor data and patient interaction activity data that may be compared, for example, against an idealized patient medical instrument equipment usage sensor model data to create an equipment usage accuracy score. The patient medical instrument equipment measured medical data may also be compared with idealized device measurement data expected from medical instrument equipment 108 to create a device accuracy score. Recording bioelectric signals may include, for example, but without limitations, electroencephalography, electrocardiography, electrooculography, electromyography and electrogastrography.
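  • The comparison of measured placement data against an idealized usage model described above might be sketched as follows. This is an illustrative Python sketch only; the channel names, units, and tolerance values are hypothetical assumptions, not the disclosed scoring method:

```python
def usage_accuracy_score(measured, ideal, tolerances):
    """Compare measured placement data (angle, location, motion) against an
    idealized usage model. Each channel contributes a value in [0, 1] based
    on how far its error is from its tolerance; channels at or beyond
    tolerance contribute 0. The score is the average over all channels."""
    scores = []
    for key, ideal_value in ideal.items():
        error = abs(measured[key] - ideal_value)
        scores.append(max(0.0, 1.0 - error / tolerances[key]))
    return sum(scores) / len(scores)

# Hypothetical channels and tolerances for a stethoscope placement check.
ideal = {"angle_deg": 0.0, "offset_mm": 0.0, "motion_mm_s": 0.0}
tolerances = {"angle_deg": 15.0, "offset_mm": 20.0, "motion_mm_s": 5.0}
measured = {"angle_deg": 3.0, "offset_mm": 4.0, "motion_mm_s": 1.0}
score = usage_accuracy_score(measured, ideal, tolerances)  # close to ideal
```

A device accuracy score over the measured medical data itself could follow the same shape, comparing readings against idealized device measurement data.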
  • Feedback from medical instrument equipment 108 (e.g., sensors, proximity, camera, etc.) and actual device measurement data may be used to instruct the patient to properly align medical instrument equipment 108 during a measurement. In embodiments, the medical instrument equipment type and sensor system monitoring of patient interaction with medical instrument equipment 108 may be used to create a device usage accuracy score for use in a medical diagnosis algorithm. Similarly, patient medical instrument equipment measured medical data may be used to create a measurement accuracy score for use by the medical diagnostic algorithm. In embodiments, medical instrument equipment 108 may also be fitted with haptic/tactile feedback sensors, which may augment audio or screen-based guidance instructions for taking measurements. These haptic feedback devices may include, but are not limited to, vibration motors (similar to those found in smart phones) fitted around the medical instrument equipment to give tactile instructions (e.g., vibration on the left side of the medical device to instruct the patient to move the medical device left), or flywheels to give kinesthetic feedback, such as resisting movement in undesirable directions and aiding movement in the correct direction (similar to certain virtual reality peripherals).
  • In embodiments, machine vision software may be used to show on a display an animation that mimics a patient's movements and provides detailed interactive instructions and real-time feedback to the patient. This aids the patient in correctly positioning and operating medical instrument equipment 108 relative to the patient's body so as to ensure a high level of accuracy when medical instrument equipment 108 is operated. In embodiments, the display may be a traditional computer screen. In other embodiments, the display may be augmented/virtual/mixed reality hardware, where patients may interact with the kiosk in a completely virtual environment while wearing AR/VR goggles.
  • In embodiments, once automated diagnostic and treatment system 102 detects unexpected data, e.g., data representing an unwanted movement, location, measurement data, etc., a validation process comprising a calculation of a trustworthiness score or reliability factor is initiated in order to gauge the measurement accuracy. If the accuracy of the measured data falls below a desired level, the patient may be asked either to repeat a measurement or to request assistance from an assistant, who may answer questions, e.g., remotely via an application, to help with proper equipment usage, or a nearby person may be alerted to assist with using medical instrument equipment 108. The validation process may also instruct a patient to answer additional questions and may comprise calculating the measurement accuracy score based on a measurement or re-measurement.
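  • A minimal sketch of such a validation step follows; the threshold, repeat limit, and outcome labels are illustrative assumptions, since the disclosure does not specify concrete values:

```python
def validate_measurement(accuracy_score, repeat_count,
                         threshold=0.7, max_repeats=2):
    """Gauge the trustworthiness of a reading: accept it above a threshold,
    ask the patient to re-measure a limited number of times, and otherwise
    escalate to an assistant. All numeric values here are hypothetical."""
    if accuracy_score >= threshold:
        return "accept"
    if repeat_count < max_repeats:
        return "repeat_measurement"
    return "request_assistance"
```

In practice the threshold could depend on the instrument type and on how safety-critical the measurement is for the diagnosis at hand.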
  • In embodiments, upon request 124, automated diagnostic and treatment system 102 may enable a patient-doctor interaction by granting the patient and doctor access to diagnostic system 100. The patient may enter data, take measurements, and submit images and audio files or any other information to the application or web portal. The doctor may access that information, for example, to review a diagnosis generated by automated diagnostic and treatment system 102, and generate, confirm, or modify instructions for the patient. Patient-doctor interaction, while not required for diagnosis and treatment, if used, may occur in person, in real-time via an audio/video application, or by any other means of communication.
  • In embodiments, automated diagnostic and treatment system 102 may utilize images generated from a diagnostic examination of mouth, throat, eyes, ears, skin, extremities, surface abnormalities, internal imaging sources, and other suitable images and/or audio data generated from diagnostic examination of heart, lungs, abdomen, chest, joint motion, voice, and any other audio data sources. Automated diagnostic and treatment system 102 may further utilize patient lab tests, medical images, or any other medical data. In embodiments, automated diagnostic and treatment system 102 enables medical examination of the patient, for example, using medical devices, e.g., ultrasound, in medical instrument equipment 108 to detect sprains, contusions, or fractures, and automatically provide diagnostic recommendations regarding a medical condition of the patient.
  • In embodiments, diagnosis comprises the use of medical database decision vectors that are at least partially based on the patient's self-measured (or assistant-measured) vitals or other measured medical data, the accuracy score of a measurement dataset, a usage accuracy score of a sensor attached to medical instrument equipment 108, a regional illness trend, and information used in generally accepted medical knowledge evaluation steps. The decision vectors and associated algorithms, which may be installed in automated diagnostic and treatment system 102, may utilize one- or multi-dimensional data, patient history, patient questionnaire feedback, and pattern recognition or pattern matching for classification using images and audio data. In embodiments, a medical device usage accuracy score generator (not shown) may be implemented within automated diagnostic and treatment system 102 and may utilize an error vector of any device in or attached to medical instrument equipment 108 to create the device usage accuracy score and utilize the actual patient-measured device data to create the measurement data accuracy score.
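  • One way such decision vectors might combine accuracy scores with a regional illness trend is sketched below. The illness names, the multiplicative weighting scheme, and the renormalization step are illustrative assumptions, not the disclosed algorithm:

```python
def weighted_decision_vector(symptom_vector, device_score,
                             measurement_score, regional_trend):
    """Scale each candidate-illness likelihood by how trustworthy the
    measurements were (device and measurement accuracy scores) and by
    regional illness prevalence, then renormalize to sum to 1."""
    trust = device_score * measurement_score
    raw = {
        illness: likelihood * trust * regional_trend.get(illness, 1.0)
        for illness, likelihood in symptom_vector.items()
    }
    total = sum(raw.values()) or 1.0  # guard against an all-zero vector
    return {illness: value / total for illness, value in raw.items()}

# Hypothetical likelihoods from questionnaire answers and measurements.
likelihoods = {"influenza": 0.6, "common_cold": 0.4}
ranked = weighted_decision_vector(likelihoods, 0.9, 0.8,
                                  {"influenza": 1.5})  # flu season locally
```

A regional trend factor above 1.0 boosts an illness that is currently prevalent in the patient's area, which is one plausible reading of the “regional illness trend” input above.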
  • In embodiments, automated diagnostic and treatment system 102 outputs diagnosis and/or treatment information that may be communicated to the patient, for example, electronically or in person by a medical professional, e.g., a treatment guideline that may include a prescription for a medication. In embodiments, prescriptions may be communicated directly to a pharmacy for pick-up or automated home delivery.
  • In embodiments, automated diagnostic and treatment system 102 may generate an overall health risk profile of the patient and recommend steps to reduce the risk of overlooking potentially dangerous conditions or guide the patient to a nearby facility that can treat the potentially dangerous condition. The health risk profile may assist a treating doctor in fulfilling duties to the patient, for example, to carefully review and evaluate the patient and, if deemed necessary, refer the patient to a specialist, initiate further testing, etc. The health risk profile advantageously reduces the potential for negligence and, thus, medical malpractice.
  • Automated diagnostic and treatment system 102, in embodiments, comprises a payment feature that uses patient identification information to access a database to, e.g., determine whether a patient has previously arranged a method of payment, and if the database does not indicate a previously arranged method of payment, automated diagnostic and treatment system 102 may prompt the patient to enter payment information, such as insurance, bank, or credit card information. Automated diagnostic and treatment system 102 may determine whether payment information is valid and automatically obtain an authorization from the insurance, EHR system, and/or the card issuer for payment for a certain amount for services rendered by the doctor. An invoice may be electronically presented to the patient, e.g., upon completion of a consultation, such that the patient can authorize payment of the invoice, e.g., via an electronic signature.
  • In embodiments, database 103 of patient's information (e.g., a secured cloud-based database) may comprise a security interface (not shown) that allows secure access to a patient database, for example, by using patient identification information to obtain the patient's medical history. The interface may utilize biometric, bar code, or other electronic security methods. In embodiments, medical instrument equipment 108 uses unique identifiers that are used as a control tool for measurement data. Database 103 may be a repository for any type of data created, modified, or received by diagnostic system 100, such as generated diagnostic information, information received from patient's wearable electronic devices, and remote video/audio data and instructions, e.g., instructions received from a remote location or from the application.
  • In embodiments, fields in the patient's electronic health care record (EHR) are automatically populated based on one or more of questions asked by diagnostic system 100, measurements taken by diagnostic system 100, diagnosis and treatment codes generated by diagnostic system 100, one or more trust scores, and imported patient health care data from one or more sources, such as an existing health care database. It is understood that the format of imported patient health care data may be converted to be compatible with the EHR format of diagnostic system 100. Conversely, exported patient health care data may be converted, e.g., to be compatible with an external EHR database.
  • In addition, patient-related data documented by diagnostic system 100 provide support for the code decision for the level of exam a doctor performs. Currently, for billing and reimbursement purposes, doctors have to choose one of the identified codes (e.g., ICD-10 currently holds approximately 97,000 medical codes) to identify an illness and provide an additional code that identifies the level of physical exam/diagnosis performed on the patient (e.g., full body physical exam) based on an illness identified by the doctor.
  • In embodiments, patient answers are used to suggest to the doctor a level of exam that is supported by the identified illness, e.g., to ensure that the doctor does not perform unnecessary in-depth exams for minor illnesses or a treatment that may not be covered by the patient's insurance.
  • In embodiments, upon identifying an illness, diagnostic system 100 generates one or more recommendations/suggestions/options for a particular treatment. In embodiments, one or more treatment plans are generated that the doctor may discuss with the patient to decide on a suitable treatment. For example, one treatment plan may be tailored purely for effectiveness, while another may consider the cost of drugs. In embodiments, diagnostic system 100 may generate a prescription or lab test request and consider factors such as recent research results, available drugs and possible drug interactions, the patient's medical history, traits of the patient, family history, and any other factors that may affect treatment when providing treatment information. In embodiments, diagnosis and treatment databases may be continuously updated, e.g., by health care professionals, so that an optimal treatment may be administered to a particular patient, e.g., a patient identified as a member of a certain risk group.
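  • The effectiveness-versus-cost trade-off between treatment plans described above might be sketched as a simple blended ranking. The plan fields, prices, and weighting formula are hypothetical illustrations:

```python
def rank_treatment_plans(plans, cost_weight):
    """Order candidate plans by a blended score: cost_weight=0 optimizes
    purely for effectiveness, while higher values increasingly penalize
    drug cost (scaled per $100 here, an arbitrary choice)."""
    def score(plan):
        return plan["effectiveness"] - cost_weight * plan["cost_usd"] / 100.0
    return sorted(plans, key=score, reverse=True)

# Hypothetical candidate plans for the doctor to discuss with the patient.
plans = [
    {"name": "brand_drug", "effectiveness": 0.95, "cost_usd": 250.0},
    {"name": "generic_drug", "effectiveness": 0.90, "cost_usd": 20.0},
]
by_effectiveness = rank_treatment_plans(plans, cost_weight=0.0)
cost_sensitive = rank_treatment_plans(plans, cost_weight=0.1)
```

With no cost penalty the slightly more effective brand drug ranks first; once cost is weighted in, the far cheaper generic overtakes it, mirroring the two example plans described above.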
  • It is noted that sensors and measurement techniques may be advantageously combined to perform multiple functions using a reduced number of sensors. For example, an optical sensor may be used as a thermal sensor by utilizing IR technology to measure body temperature. It is further noted that some or all data collected by diagnostic system 100 may be processed and analyzed directly within automated diagnostic and treatment system 102 or transmitted to an external reading device (not shown in FIG. 1) for further processing and analysis, e.g., to enable additional diagnostics.
  • FIG. 2 illustrates an exemplary patient diagnostic measurement system according to embodiments of the present disclosure. As depicted, patient diagnostic measurement system 200 comprises microcontroller 202, spectrum imaging device, e.g., spectrum imaging device camera 204, monitor 206, patient-medical equipment activity tracking sensors, e.g., inertial sensor 208, communications controller 210, medical instruments 224, identifiable marker, e.g., identifiable marker IR LED 226, power management unit 230, and battery 232. Each component may be coupled directly or indirectly by electrical wiring, wirelessly, or optically to any other component in patient diagnostic measurement system 200. Inertial sensor 208 may also be referred to as patient-equipment activity sensors inertial sensors 208 or simply sensors 208.
  • Medical instrument 224 comprises one or more devices that are capable of measuring physical and non-physical characteristics of a patient that, in embodiments, may be customized, e.g., according to varying anatomies among patients, irregularities on a patient's skin, and the like. In embodiments, medical instrument 224 is a combination of diagnostic medical devices that generate diagnostic data based on patient characteristics. Exemplary diagnostic medical devices are heart rate sensors, otoscopes, digital stethoscopes, in-ear thermometers, blood oxygen sensors, high-definition cameras, spirometers, blood pressure meters, respiration sensors, skin resistance sensors, glucometers, ultrasound devices, electrocardiographic sensors, body fluid sample collectors, eye slit lamps, weight scales, and any devices known in the art that may aid in performing a medical diagnosis. In embodiments, patient characteristics and vital signs data may be received from and/or compared against wearable or implantable monitoring devices that gather sample data, e.g., a fitness device that monitors physical activity.
  • One or more medical instruments 224 may be removably attachable directly to a patient's body, e.g., torso, via patches or electrodes that may use adhesion to provide good physical or electrical contact. In embodiments, medical instruments 224, e.g., a contact-less (or non-contact) thermometer, may perform contact-less measurements some distance away from the patient's body. Non-contact sensors may also support measurements for electrocardiograms (EKG) and electroencephalograms (EEG).
  • In embodiments, microcontroller 202 may be a secure microcontroller that securely communicates information in encrypted form to ensure privacy and the authenticity of measured data and activity sensor and patient-equipment proximity information and other information in patient diagnostic measurement system 200. This may be accomplished by taking advantage of security features embedded in hardware of microcontroller 202 and/or software that enables security features during transit and storage of sensitive data. Each device in patient diagnostic measurement system 200 may have keys that handshake to perform authentication operations on a regular basis.
  • Spectrum imaging device camera 204 is any audio/video device that may capture patient images and sound at any frequency or image type. Monitor 206 is any screen or display device that may be coupled to camera, sensors and/or any part of patient diagnostic measurement system 200. Patient-equipment activity tracking inertial sensor 208 is any single or multi-dimensional sensor, such as an accelerometer, a multi-axis gyroscope, pressure sensor, and a magnetometer capable of providing position, motion, pressure on medical equipment or orientation data based on patient interaction. Patient-equipment activity tracking inertial sensor 208 may be attached to (removably or permanently) or embedded into medical instrument 224. Identifiable marker IR LED 226 represents any device, heat source, reflective material, proximity beacon, altimeter, etc., that may be used by microcontroller 202 as an identifiable marker. Like patient-equipment activity tracking inertial sensor 208, identifiable marker IR LED 226 may be attachable to or embedded into medical instrument 224.
  • In embodiments, communication controller 210 is a wireless communications controller attached either permanently or temporarily to medical instrument 224 or the patient's body to establish a bi-directional wireless communications link and transmit data, e.g., between sensors and microcontroller 202 using any wireless communication protocol known in the art, such as Bluetooth Low Energy, e.g., via an embedded antenna circuit that wirelessly communicates the data. One of ordinary skill in the art will appreciate that electromagnetic fields generated by such antenna circuit may be of any suitable type. In case of an RF field, the operating frequency may be located in the ISM frequency band, e.g., 13.56 MHz. In embodiments, data received by communications controller 210, which may be a wireless controller, may be forwarded to a host device (not shown) that may run a software application.
  • In embodiments, power management unit 230 is coupled to microcontroller 202 to provide energy to, e.g., microcontroller 202 and communication controller 210. Battery 232 may be a back-up battery for power management unit 230 or a battery in any one of the devices in patient diagnostic measurement system 200. One of ordinary skill in the art will appreciate that one or more devices in patient diagnostic measurement system 200 may be operated from the same power source (e.g., battery 232) and perform more than one function at the same or different times. A person of skill in the art will also appreciate that one or more components, e.g., sensors 208, identifiable marker IR LED 226, may be integrated on a single chip/system, and that additional electronics, such as filtering elements, etc., may be implemented to support the functions of medical instrument equipment measurement or usage monitoring and tracking in patient diagnostic measurement system 200 according to the objectives of the invention.
  • In operation, a patient may use medical instrument 224 to gather patient data based on physical and non-physical patient characteristics, e.g., vital signs data, images, sounds, and other information useful in the monitoring and diagnosis of a health-related condition. The patient data is processed by microcontroller 202 and may be stored in a database (not shown). In embodiments, the patient data may be used to establish baseline data for a patient health profile against which subsequent patient data may be compared.
  • In embodiments, patient data may be used to create, modify, or update EHR data. Gathered medical instrument equipment data, along with any other patient and sensor data, may be processed directly by patient diagnostic measurement system 200 or communicated to a remote location for analysis, e.g., to diagnose existing and expected health conditions to benefit from early detection and prevention of acute conditions or aid in the development of novel medical diagnostic methods.
  • In embodiments, medical instrument 224 is coupled to a number of sensors, such as patient-equipment tracking inertial sensor 208 and/or identifiable marker IR LED 226, that may monitor a position/orientation of medical instrument 224 relative to the patient's body when a medical equipment measurement is taken. In embodiments, sensor data generated by sensor 208, identifiable marker IR LED 226 or other sensors may be used in connection with, e.g., data generated by spectrum imaging device camera 204, proximity sensors, transmitters, bolometers, or receivers to provide feedback to the patient to aid the patient in properly aligning medical instrument 224 relative to the patient's body part of interest when performing a diagnostic measurement. A person skilled in the art will appreciate that not all sensors 208, identifiable marker IR LED 226, beacon, pressure, altimeter, etc., need to operate at all times. Any number of sensors may be partially or completely disabled, e.g., to conserve energy.
  • In embodiments, the sensor emitter comprises a light signal emitted by IR LED 226 or any other identifiable marker that may be used as a reference signal. In embodiments, the reference signal may be used to identify a location, e.g., within an image and based on a characteristic that distinguishes the reference from other parts of the image. In embodiments, the reference signal is representative of a difference between the position of medical instrument 224 and a preferred location relative to a patient's body. In embodiments, spectrum imaging device camera 204 displays, e.g., via monitor 206, the position of medical instrument 224 and the reference signal at the preferred location so as to allow the patient to determine the position of medical instrument 224 and adjust the position relative to the preferred location, displayed by spectrum imaging device camera 204.
  • Spectrum imaging device camera 204, a proximity sensor, transmitter, receiver, bolometer, or any other suitable device may be used to locate or track the reference signal, e.g., within the image, relative to a body part of the patient. In embodiments, this augmented reality (AR) method may be accomplished by using an overlay method that overlays an image of a body part of the patient against an ideal model of device usage to enable real-time feedback for the patient. The reference signal, along with signals from other sensors, e.g., patient-equipment activity inertial sensor 208, may be used to identify a position, location, angle, orientation, or usage associated with medical instrument 224 to monitor and guide a patient's placement of medical instrument 224 at a target location and accurately activate a device for measurement. In other embodiments, methods of mixed reality may be used, in which users wearing AR goggles have virtual objects overlaid on real-world objects (e.g., a virtual clock on a real wall).
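  • The marker-to-target guidance described above can be illustrated with a simple displacement computation. The pixel coordinates (with y increasing downward, as in typical image coordinates), the alignment tolerance, and the cue strings are all illustrative assumptions:

```python
def guidance_cue(marker_px, target_px, tolerance_px=10.0):
    """Compare the tracked marker position (e.g., an IR LED detected in
    the camera image) with the desired target location and emit a simple
    overlay cue for the patient."""
    dx = target_px[0] - marker_px[0]
    dy = target_px[1] - marker_px[1]
    if (dx * dx + dy * dy) ** 0.5 <= tolerance_px:
        return "aligned: start measurement"
    horizontal = "move right" if dx > 0 else "move left"
    vertical = "move down" if dy > 0 else "move up"
    # Guide along the axis with the larger remaining error first.
    return horizontal if abs(dx) >= abs(dy) else vertical
```

A display could render the target location and the live marker position together, updating the cue each frame until the instrument reaches the target.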
  • In embodiments, e.g., upon receipt of a request signal, microcontroller 202 activates one or more medical instruments 224 to perform measurements, and each activated instrument sends data related to the measurement back to microcontroller 202. The measured data and other data associated with a physical condition may be automatically recorded, and a usage accuracy of medical instrument 224 may be monitored.
  • In embodiments, microcontroller 202 uses an image in any spectrum, motion signal and/or an orientation signal by patient-equipment activity inertial sensor 208 to compensate or correct the vital signs data output by medical instrument 224. Data compensation or correction may comprise filtering out certain data as likely being corrupted by parasitic effects and erroneous readings that result from medical instrument 224 being exposed to unwanted movements caused by perturbations or, e.g., the effect of movements of the patient's target measurement body part.
  • In embodiments, signals from two or more medical instruments 224, or from medical instrument 224 and patient-equipment activity inertial sensor 208, are combined, for example, to reduce signal latency and increase correlation between signals to further improve the ability of patient diagnostic measurement system 200 to reject motion artifacts and remove false readings and, therefore, enable a more accurate interpretation of the measured vital signs data.
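One minimal realization of the motion-artifact rejection described above is to discard vital-sign samples captured while the inertial sensor reports motion above a threshold, treating them as likely corrupted. The threshold value and function name below are illustrative assumptions, not from the disclosure.

```python
def reject_motion_artifacts(vitals, accel_magnitudes, motion_threshold=1.5):
    """Keep only vital-sign samples whose concurrent inertial reading
    (acceleration magnitude, in g) is at or below the motion threshold;
    the rest are treated as corrupted by unwanted movement."""
    return [v for v, a in zip(vitals, accel_magnitudes)
            if a <= motion_threshold]
```

For instance, temperature samples [98.6, 101.2, 98.7] paired with acceleration readings [0.1, 2.3, 0.2] would keep only the first and third samples, dropping the one taken during motion.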
  • In embodiments, spectrum imaging device camera 204 displays actual or simulated images and videos of the patient and medical instrument 224 to assist the patient in locating a desired position for medical instrument 224 when performing the measurement so as to increase measurement accuracy. Spectrum imaging device camera 204 may use image or facial recognition software to identify and display eyes, mouth, nose, ears, torso, or any other part of the patient's body as reference.
  • In embodiments, patient diagnostic measurement system 200 uses machine vision software that analyzes measured image data and compares image features to features in a database, e.g., to detect an incomplete image for a target body part, to monitor the accuracy of a measurement and determine a corresponding score. In embodiments, if the score falls below a certain threshold, patient diagnostic measurement system 200 may provide detailed guidance for improving measurement accuracy or to receive a more complete image, e.g., by providing instructions on how to change an angle or depth of an otoscope relative to the patient's ear.
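One simple way to sketch a machine-vision accuracy score of this kind is to compare the anatomical features detected in the captured image against the set of features expected for the target body part, and to issue guidance when the score falls below a threshold. The feature names, the 0.8 threshold, and the guidance wording are illustrative assumptions.

```python
def accuracy_score(detected_features, expected_features):
    """Score = fraction of expected anatomical features found in the
    captured image (e.g., for an otoscope image: ear canal, eardrum).
    Returns a value between 0.0 and 1.0."""
    expected = set(expected_features)
    if not expected:
        return 0.0
    return len(expected & set(detected_features)) / len(expected)


def guidance(score, threshold=0.8):
    """Return None when the measurement is acceptable; otherwise a
    corrective instruction for the patient."""
    if score >= threshold:
        return None
    return "Change the angle or depth of the otoscope and retake the image."
```

A half-complete image (one of two expected features detected) scores 0.5 and triggers the corrective instruction; a complete image scores 1.0 and passes.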
  • In embodiments, the machine vision software may use an overlay method to mimic a patient's posture/movements to provide detailed and interactive instructions, e.g., by displaying a character, image of the patient, graphic, or avatar on monitor 206 to provide feedback to the patient. The instructions, image, or avatar may start or stop and decide what help instruction to display based on the type of medical instrument 224, the data from spectrum imaging device camera 204, patient-equipment activity inertial sensors 208, bolometer, transmitter and receiver, and/or identifiable marker IR LED 226 (an image, a measured position or angle, etc.), and a comparison of the data to idealized data. This further aids the patient in correctly positioning and operating medical instrument 224 relative to the patient's body, ensures a high level of accuracy when operating medical instrument 224, and solves potential issues that the patient may encounter when using medical instrument 224.
  • In embodiments, instructions may be provided via monitor 206 and describe, in audio/visual format and in any desired level of detail, how to use medical instrument 224 to perform a diagnostic test or measurement, e.g., how to take a temperature, so as to enable patients to perform measurements of clinical-grade accuracy. In embodiments, each sensor 208 and identifiable marker IR LED 226, e.g., proximity sensor, bolometer, or transmitter/receiver, may be associated with a device usage accuracy score. A device usage accuracy score generator (not shown), which may be implemented in microcontroller 202, may use the sensor data to generate a medical instrument usage accuracy score that is representative of the reliability of the medical instrument 224 measurement on the patient. In embodiments, the score may be based on a difference between an actual position of medical instrument 224 and a preferred position. In addition, the score may be based on detecting a motion, e.g., during a measurement. In embodiments, the device usage accuracy score is derived from an error vector generated for one or more of sensors 208, identifiable marker IR LED 226, and deep learning system 105. The resulting device usage accuracy score may be used when generating or evaluating medical diagnosis data.
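A device usage accuracy score of the sort described above might, for example, be derived from the magnitude of the positional error vector between the actual and preferred instrument positions, reduced by a penalty when motion is detected during the measurement. The scaling constants and function name below are illustrative, not from the disclosure.

```python
import math


def usage_accuracy_score(actual_pos, preferred_pos, motion=0.0,
                         max_error=50.0, motion_penalty=0.2):
    """Derive a 0-1 usage accuracy score from the positional error
    vector between the actual and preferred instrument positions
    (coordinates in arbitrary units), with an optional penalty for
    motion detected during the measurement."""
    error = math.dist(actual_pos, preferred_pos)
    score = max(0.0, 1.0 - error / max_error)
    return max(0.0, score - motion_penalty * motion)
```

A perfectly placed, motionless instrument scores 1.0; an instrument displaced by the full `max_error` distance scores 0.0, and detected motion lowers the score in between.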
  • In embodiments, microcontroller 202 analyzes the patient-measured medical instrument data to generate a trust score indicative of whether the data falls within the acceptable range of the medical instrument, for example, by comparing the medical instrument measurement data against reference measurement data or against measurement data that would be expected from medical instrument 224. As with the device usage accuracy score, the trust score may be used when generating or evaluating medical diagnosis data.
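A trust score along these lines can be sketched as a comparison of the measured value against the instrument's expected range, with the score decaying as the reading strays outside that range. The linear decay, the range bounds, and the function name are illustrative assumptions.

```python
def trust_score(measured, expected_low, expected_high):
    """Return a 0-1 trust score: 1.0 when the measurement lies inside
    the instrument's expected range, decaying linearly with the
    distance outside it (normalized by the range width)."""
    if expected_low <= measured <= expected_high:
        return 1.0
    span = expected_high - expected_low
    overshoot = (expected_low - measured if measured < expected_low
                 else measured - expected_high)
    return max(0.0, 1.0 - overshoot / span)
```

For an instrument whose expected readings span 95-105, a reading of 98.6 is fully trusted, a reading of 110 scores 0.5, and a reading of 115 scores 0.0.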
  • FIG. 3A illustrates an exemplary system 300 for measuring a patient's body parts utilizing optical medical instruments and deep learning systems according to embodiments of the present disclosure. System 300 may comprise automated diagnostic and treatment system 102, deep learning system 105, doctor interface station 104 and patient kiosk 301. Patient kiosk 301 may comprise patient interface station 106, kiosk cameras including sensors/camera-1 303, sensors/camera-2 304 and instrument 312. As illustrated, patient 310 is present in patient kiosk 301. Kiosk cameras may also be referred to as camera modules. Instrument 312 may be, but is not limited to, an otoscope. Deep learning system 105 may be, but is not limited to, a deep convolutional neural network (DCNN).
  • An objective of system 300 includes the accurate acquisition of medical measurement data of a target spot of a body part of patient 310. To assist in acquiring accurate medical measurement data, the automated diagnostic and treatment system 102 provides instructions to patient 310 to allow patient 310 to precisely position instrument 312 in proximity to a target spot of a body part of patient 310. That is, the movement of instrument 312 may be controlled by patient 310. These instructions may be generated based on a series of images acquired from the kiosk cameras and instrument 312. Subsequent images from instrument 312 are analyzed by automated diagnostic and treatment system 102 utilizing database 103 and deep learning system 105 to obtain measured medical data. The accuracy of the instrument 312 positioning and the measured medical data may be determined by automated diagnostic and treatment system 102 and deep learning system 105 via an error threshold measurement. When accurately positioned, quality medical measurements are generated for the target spot of a body part of patient 310. As illustrated in FIG. 3A, patient 310 has positioned instrument 312, which may be an otoscope, in proximity to the ear of patient 310. In other embodiments, instrument 312 may be positioned in proximity to another body part. In other embodiments, patient 310 may be fitted with haptic/tactile feedback sensors, which may augment the patient guidance instructions for taking measurements. As an example, during an ear exam, should the patient need to point the otoscope left, haptic feedback in the form of a buzzer/vibrator could be elicited on the side of the device in which movement is required (i.e., the left side). These haptic feedback devices may include, but are not limited to, vibration motors (similar to those found in smart phones) fitted around the medical instrument equipment to give tactile instructions (e.g., vibration on the left side of the medical device to instruct the patient to move the medical device left), or flywheels to give kinesthetic feedback, such as resisting movement in undesirable directions and aiding movement in the correct direction (similar to certain virtual reality peripherals). Instrument 312 may be wirelessly coupled to patient interface station 106.
  • The automated diagnostic and treatment system 102, deep learning system 105, doctor interface station 104 and patient interface station 106 are operable as previously described herein relative to FIG. 1. The automated diagnostic and treatment system 102 may send a command to the patient interface station 106 requesting sensors/camera-1 303 and sensors/camera-2 304 to capture images of patient 310 and instrument 312. The captured images (sometimes called "pose" images) may be analyzed by the deep learning system 105 to determine the position of instrument 312 relative to the body part, which may include the location, angle, and rotation of instrument 312 relative to the body part. From this positioning analysis, the automated diagnostic and treatment system 102 may provide an initial set of instructions to patient 310 to assist patient 310 in positioning instrument 312 in proximity to the target spot of the body part. As an example, the first set of instructions may request the patient to position the instrument 312 adjacent to the patient's right ear. After patient 310 acts on the first set of instructions, the automated diagnostic and treatment system 102 captures a second set of images and determines if instrument 312 is within a pose threshold of the target spot of the body part. The pose threshold may be less than one inch. These instructions may be considered a "coarse" set of instructions, or "coarse tuning", inasmuch as the estimates are based on images obtained by sensors/camera-1 303 and sensors/camera-2 304, which have a distance d1 between the camera and the instrument 312. Distance d1 may be several feet.
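The coarse-tuning step can be illustrated with a toy instruction generator that compares the estimated instrument position against the target spot and the pose threshold (coordinates in inches in the camera frame). The function name, coordinate convention, and message wording are illustrative assumptions.

```python
def coarse_instruction(instrument_pos, target_pos, pose_threshold=1.0):
    """Generate a coarse movement instruction from kiosk-camera pose
    estimates. Positions are (x, y) in inches; x grows rightward and
    y grows upward from the patient's perspective."""
    dx = target_pos[0] - instrument_pos[0]
    dy = target_pos[1] - instrument_pos[1]
    if max(abs(dx), abs(dy)) <= pose_threshold:
        return "within pose threshold -- begin fine tuning"
    horiz = "right" if dx > 0 else "left"
    vert = "up" if dy > 0 else "down"
    return (f"move the instrument {abs(dx):.1f} inch {horiz} "
            f"and {abs(dy):.1f} inch {vert}")
```

An instrument already within an inch of the target triggers the hand-off to fine tuning; otherwise the patient receives a directional correction.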
  • As illustrated in patient kiosk 301, there are two camera/sensor modules. In some embodiments, there may be N camera/sensor modules, where N is equal to one or more camera/sensor modules.
  • Instrument 312 may comprise a camera that may be located on one end of instrument 312. Automated diagnostic and treatment system 102 may command, via patient interface station 106, instrument 312 to capture images of a target spot of the body part. Inasmuch as instrument 312 is in close proximity to the body part, these captured images provide detailed information on the position of instrument 312 relative to the body part. The images from instrument 312 are analyzed, resulting in another set of instructions for patient 310 for the positioning of instrument 312 relative to the target spot of the body part. Based on the another set of instructions, patient 310 may position instrument 312 more accurately relative to the target spot of the body part of patient 310. The another set of instructions may have refined, i.e., "fine tuned", the initial set of instructions because the another set of instructions is based on images captured where the instrument camera is distance d2 from the body part. Distance d2 may be less than 1 inch.
  • This process may be repeated as newly captured images by instrument 312 are analyzed by the automated diagnostic and treatment system 102 to determine the updated positions of instrument 312 relative to the body part. The updated positions may include the location of instrument 312 to the body part, the angle of instrument 312 to the body part, and the rotation of instrument 312 to the body part. From this positioning analysis, the automated diagnostic and treatment system 102 provides yet another set of instructions to patient 310 to assist patient 310 to position instrument 312 in proximity to the target spot of the body part. This yet another set of instructions may further refine the initial sets of instructions. The instructions may be communicated to patient 310 via a visual display or via haptic/tactile feedback sensors.
  • The instructions may be refined by the automated diagnostic and treatment system 102 analyzing the images from the instrument camera and the images from the kiosk cameras by utilizing deep learning system 105 and database 103 to obtain updated estimates of 1) the position of instrument 312 relative to the target spot of the body part and 2) measured body parts. When automated diagnostic and treatment system 102 achieves an error threshold to ensure measurement accuracy, the medical measurement data is provided to a physician by the automated diagnostic and treatment system 102 via the doctor interface station 104.
  • FIG. 3B illustrates another exemplary system 320 for measuring a patient's body parts or a medical instrument (not shown) utilizing cameras and sensors according to embodiments of the present disclosure. System 320 is based on the principle of determining the time-of-flight (TOF) for each emitted pulse of light. Light detection and ranging systems, such as camera/sensors 322, may employ pulses of light to measure distance to an object based on the time of flight (TOF) of each pulse of light. A pulse of light emitted from a light source of a light detection and ranging system interacts with a distal object. A portion of the light reflects from the object and returns to a detector of the light detection and ranging system. Based on the time elapsed between emission of the pulse of light and detection of the returned pulse of light, the distance to the object may be estimated.
  • The light pulse may hit multiple objects, each having a different distance from the laser, causing multi-return signals to be received by the light detection and ranging system detector. Multi-return signals may provide more information of the environment to improve mapping or reconstruction. A dedicated detector may be required to precisely identify each return with its associated time delay information. The resulting images may be referred to as “pose images”. In other words, one or more camera modules may provide a pose estimate based on time of flight of multiple light signals emitted from each of the one or more camera modules and reflected back to the one or more camera modules.
  • As illustrated in FIG. 3B, camera/sensors 322 may emit signals S1, S2 and S3. These signals may reflect off patient 330 or off a fixed surface, such as a wall or floor. In the kiosk, the walls and floor may have identifiers that can assist in the image identification of the body part of patient 330. The reflection of signals S1, S2 and S3 may be detected by camera/sensors 322 and communicated to a diagnostic system, such as automated diagnostic and treatment system 102. Automated diagnostic and treatment system 102 can determine the distances associated with signals S1, S2 and S3 based on the TOF of each signal. This process may result in a pose estimate, which may be viewed as a pose image. Identifier 324 may assist in identifying a position of a patient's body parts or a medical instrument.
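The ranging relationship underlying this TOF method is d = c·t/2, since the measured delay covers the round trip to the object and back. A minimal sketch, including the multi-return case described above (function names are illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def tof_distance(elapsed_s):
    """Distance in meters to the reflecting object from a round-trip
    time-of-flight measurement: d = c * t / 2."""
    return C * elapsed_s / 2.0


def multi_return_distances(elapsed_times):
    """A single pulse may hit several objects at different ranges;
    map each return's delay to a distance, sorted nearest-first."""
    return sorted(tof_distance(t) for t in elapsed_times)
```

A 20 ns round-trip delay corresponds to an object roughly 3 m away, on the scale of a kiosk wall behind the patient.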
  • FIG. 4A illustrates an exemplary method 400 of measuring a patient's body part utilizing an optical medical instrument according to embodiments of the present disclosure. As illustrated, instrument 402 may be positioned in an ear of patient 404. As previously discussed relative to FIG. 3A, the patient 404 can receive instructions from a diagnostic system to improve the accuracy of the position of instrument 402 relative to a target spot of a body part of patient 404.
  • FIG. 4B illustrates an exemplary optical medical instrument 420 according to embodiments of the present disclosure. Optical medical instrument 420 may comprise instrument 422, which may comprise instrument camera 424, identifiable marker 428 and wireless device 426. Instrument 422 may also include a serial number on its body. Instrument 422 may be easily held and positioned by a user, such as patient 310 of FIG. 3A. Although instrument 422 includes wireless device 426, in other embodiments, instrument 422 may be coupled to a diagnostic system via a wired or optical technology. Also, although instrument camera 424 is illustrated as located on a tip or one end of instrument 422, in other embodiments, instrument camera 424 may be located in another location on instrument 422. When a kiosk camera/sensor endeavors to determine an angle or rotation of instrument 422 relative to a body part of a patient, identifiable marker 428 or the serial number may assist in determining these measurements. In some embodiments, identifiable marker 428 may be an image of a ring of flowers and may comprise reflective material. Instrument 402 and instrument 422 may be otoscopes.
  • FIG. 4C illustrates an exemplary optical medical instrument 440 being positioned at an angle, position and rotation according to embodiments of the present disclosure.
  • FIG. 5A and FIG. 5B are flowcharts 500 and 540 depicting illustrative methods for making accurate medical patient measurements, according to embodiments of the present disclosure. The methods comprise the steps of:
  • Receiving a first set of pose images, by a diagnostic system, of an instrument in close proximity of a body part of a patient, wherein pose images are generated by one or more kiosk cameras. (step 502)
  • Analyzing the first set of pose images by the diagnostic system utilizing a deep learning system, to obtain (1) a first set of estimates of position of the instrument relative to the body part (initial pose estimates) and (2) first set of body part measurements. (step 504)
  • Providing a first set of instructions, by the diagnostic system, based on the first set of estimates, to the patient to adjust the position of the instrument relative to a target spot of the body part. This set of instructions may direct the patient to position the instrument at a particular ear. (step 506)
  • Receiving a second/another set of pose images, by the diagnostic system, after the patient has adjusted the position of the instrument relative to the target spot of the body part based on the first set of instructions. (step 508)
  • Analyzing the second/another set of pose images, by the diagnostic system utilizing deep learning system, to obtain (1) a second/another set of estimates of the position of the instrument relative to the target spot of the body part and (2) second/another set of body part measurements. (step 510) Steps 502, 504, 506, 508 and 510 define a coarse tuning method based on pose images.
  • Determining whether the instrument is within a pose threshold of a target spot of the body part. (step 512) The diagnostic system makes this determination.
  • If the pose threshold is achieved, proceed with fine tuning at step 515.
  • If the pose threshold is not achieved, providing another set of instructions based on the second/another set of estimates to the patient to further adjust the position of the instrument relative to the target spot of the body part. The another set of instructions may cause the patient to re-adjust the instrument position relative to the target spot of the body. (step 513) Then repeat steps 508, 510, and 512.
  • Receiving a set of instrument images, by the diagnostic system, of the target spot of the body part, wherein the set of instrument images are generated by the instrument camera located on one end of the instrument. (step 515)
  • Analyzing, by the diagnostic system, the set/another set of instrument images from the instrument camera and prior sets of pose images from the kiosk cameras utilizing a deep learning system to obtain (1) a third/another set of estimates of the position of the instrument relative to the target spot of the body part and (2) a third/another set of body part measurements. (step 516) Steps 515 and 516 define a fine tuning method. The deep learning system may be, but is not limited to, a deep convolutional neural network (DCNN).
  • Determining whether an error threshold to ensure measurement accuracy for the instrument positioning relative to the target spot of the body part of the patient has been achieved. (step 518) The deep learning system makes this determination.
  • If the error threshold is achieved, providing medical measured data for the target spot of the body part to a physician and to a deep learning system database. (step 522)
  • If the error threshold is not achieved, providing another set of instructions, based on the third/another set of estimates, to the patient to further adjust the position of the instrument relative to the target spot of the body part. (step 519)
  • Receiving another set of instrument images, by the diagnostic system, of the target spot of the body part, after the patient has adjusted the position of the instrument relative to the body part based on the another set of instructions. (step 520)
  • Repeating the analysis of the another set of images from the instrument camera, i.e., step 516 and subsequent steps.
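The coarse/fine method of steps 502-522 can be sketched as two nested retry loops: coarse tuning iterates on kiosk-camera pose images until the pose threshold is met, then fine tuning iterates on instrument-camera images until the error threshold is met. The callable parameters below are illustrative stand-ins for the diagnostic system, cameras, and patient-facing instruction channel, not names from the disclosure.

```python
def measure_body_part(capture_pose, analyze_pose, within_pose_threshold,
                      capture_fine, analyze_fine, within_error_threshold,
                      instruct):
    """Control-flow sketch of the two-stage method of FIGS. 5A-5B."""
    # Coarse tuning (steps 502-513): iterate on kiosk-camera pose images
    estimate = analyze_pose(capture_pose())
    while not within_pose_threshold(estimate):
        instruct(estimate)                       # step 513: guide patient
        estimate = analyze_pose(capture_pose())  # steps 508-510: re-check

    # Fine tuning (steps 515-520): iterate on instrument-camera images
    estimate, data = analyze_fine(capture_fine())
    while not within_error_threshold(estimate):
        instruct(estimate)                       # step 519: guide patient
        estimate, data = analyze_fine(capture_fine())
    return data  # step 522: forward to physician and deep learning database
```

In a real system the `analyze_*` callables would wrap the deep learning system, and `instruct` would drive the display or haptic feedback; here they are kept abstract so the loop structure is visible.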
  • FIG. 5C comprises flowchart 560 of an illustrative method for making accurate medical otoscope patient measurements, according to embodiments of the present disclosure. The method comprises the steps of:
  • Providing, by a diagnostic system, instructions to the patient to adjust the position of an otoscope relative to a selected ear of the patient based on a first set of images (pose images) from one or more kiosk cameras, and confirming the otoscope is within a pose threshold of the selected ear. A deep learning system may be utilized in this step. (step 562)
  • Receiving a second set of images, by the diagnostic system, of the selected ear, the second set of images generated by an otoscope camera located on one end of the otoscope. (step 563)
  • Analyzing the first (or additional) set of images from one or more kiosk cameras and a second (or additional) set of images from the otoscope, utilizing a deep learning system to obtain estimates of the position of the otoscope relative to the selected ear and body measurements. The deep learning system may be, but is not limited to, a deep convolutional neural network. (step 564)
  • Determining whether an error threshold to ensure measurement accuracy for the otoscope positioning relative to the selected ear of the patient has been achieved. (step 566) The deep learning system makes this determination.
  • If the error threshold is achieved, providing medical measured data for the selected ear to a physician and to a deep learning database. (step 570)
  • If the error threshold is not achieved, receiving additional images from the otoscope based on additional positioning instructions and optionally additional images from the kiosk cameras, then repeating steps 564 and 566. (step 568)
  • In embodiments, a time may be calculated at which the selected treatment is expected to show a result, and patient feedback may be requested, e.g., as part of a feedback process, to improve diagnosis and treatment reliability. In embodiments, the selected treatment and/or the patient feedback may be used to adjust one or more of the metrics. For example, based on one or more of the metrics, a series of treatment plans may be generated by using an algorithm that combines the metrics. In embodiments, calculating the selected treatment comprises using cost as a factor.
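The metric-combining algorithm mentioned above could, for instance, be a weighted sum over per-treatment metrics such as likelihood of success and cost, producing a best-first ordering of candidate treatment plans. The metric names, weights, and function name below are illustrative assumptions, not from the disclosure.

```python
def rank_treatments(treatments, weights):
    """Combine each treatment plan's metrics (e.g., likelihood of
    success, cost, time to expected result) into a single weighted
    score and return the plans sorted best-first. A negative weight
    penalizes a metric such as cost."""
    def score(plan):
        return sum(weights[metric] * plan[metric] for metric in weights)
    return sorted(treatments, key=score, reverse=True)
```

With a success weight of 1.0 and a cost weight of -0.5, a plan with high success but moderate cost can outrank a cheaper plan with a lower success likelihood; the resulting ordering, together with patient feedback, could feed the metric-adjustment loop described above.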
  • One skilled in the art will recognize that: (1) certain steps may optionally be performed; (2) steps may not be limited to the specific order set forth herein; (3) certain steps may be performed in different orders; and (4) certain steps may be done concurrently.
  • In embodiments, one or more computing systems, such as mobile/tablet/computer or the automated diagnostic system, may be configured to perform one or more of the methods, functions, and/or operations presented herein. Systems that implement at least one or more of the methods, functions, and/or operations described herein may comprise an application or applications operating on at least one computing system. The computing system may comprise one or more computers and one or more databases. The computer system may be a single system, a distributed system, a cloud-based computer system, or a combination thereof.
  • It shall be noted that the present disclosure may be implemented in any instruction-execution/computing device or system capable of processing data, including, without limitation, phones, laptop computers, desktop computers, and servers. The present disclosure may also be implemented into other computing devices and systems. Furthermore, aspects of the present disclosure may be implemented in a wide variety of ways including software (including firmware), hardware, or combinations thereof. For example, the functions to practice various aspects of the present disclosure may be performed by components that are implemented in a wide variety of ways including discrete logic components, one or more application specific integrated circuits (ASICs), and/or program-controlled processors. It shall be noted that the manner in which these items are implemented is not critical to the present disclosure.
  • Having described the details of the disclosure, an exemplary system that may be used to implement one or more aspects of the present disclosure is described next with reference to FIG. 6. Each of patient interface station 106 and automated diagnostic and treatment system 102 in FIG. 1 may comprise one or more components in the system 600. As illustrated in FIG. 6, system 600 includes a central processing unit (CPU) 601 that provides computing resources and controls the computer. CPU 601 may be implemented with a microprocessor or the like, and may also include a graphics processor and/or a floating point coprocessor for mathematical computations. System 600 may also include a system memory 602, which may be in the form of random-access memory (RAM) and read-only memory (ROM).
  • A number of controllers and peripheral devices may also be provided, as shown in FIG. 6. An input controller 603 represents an interface to various input device(s) 604, such as a keyboard, mouse, or stylus. There may also be a scanner controller 605, which communicates with a scanner 606. System 600 may also include a storage controller 607 for interfacing with one or more storage devices 608 each of which includes a storage medium such as magnetic tape or disk, or an optical medium that might be used to record programs of instructions for operating systems, utilities and applications which may include embodiments of programs that implement various aspects of the present disclosure. Storage device(s) 608 may also be used to store processed data or data to be processed in accordance with the disclosure. System 600 may also include a display controller 609 for providing an interface to a display device 611, which may be a cathode ray tube (CRT), a thin film transistor (TFT) display, or other type of display. System 600 may also include a printer controller 612 for communicating with a printer 613. A communications controller 614 may interface with one or more communication devices 615, which enables system 600 to connect to remote devices through any of a variety of networks including the Internet, an Ethernet cloud, an FCoE/DCB cloud, a local area network (LAN), a wide area network (WAN), a storage area network (SAN) or through any suitable electromagnetic carrier signals including infrared signals.
  • In the illustrated system, all major system components may connect to a bus 616, which may represent more than one physical bus. However, various system components may or may not be in physical proximity to one another. For example, input data and/or output data may be remotely transmitted from one physical location to another. In addition, programs that implement various aspects of this disclosure may be accessed from a remote location (e.g., a server) over a network. Such data and/or programs may be conveyed through any of a variety of machine-readable media including, but not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices.
  • Embodiments of the present disclosure may be encoded upon one or more non-transitory computer-readable media with instructions for one or more processors or processing units to cause steps to be performed. It shall be noted that the one or more non-transitory computer-readable media shall include volatile and non-volatile memory. It shall be noted that alternative implementations are possible, including a hardware implementation or a software/hardware implementation. Hardware-implemented functions may be realized using ASIC(s), programmable arrays, digital signal processing circuitry, or the like. Accordingly, the “means” terms in any claims are intended to cover both software and hardware implementations. Similarly, the term “computer-readable medium or media” as used herein includes software and/or hardware having a program of instructions embodied thereon, or a combination thereof. With these implementation alternatives in mind, it is to be understood that the figures and accompanying description provide the functional information one skilled in the art would require to write program code (i.e., software) and/or to fabricate circuits (i.e., hardware) to perform the processing required.
  • It shall be noted that embodiments of the present disclosure may further relate to computer products with a non-transitory, tangible computer-readable medium that have computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present disclosure, or they may be of the kind known or available to those having skill in the relevant arts. Examples of tangible computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media; and hardware devices that are specially configured to store or to store and execute program code, such as application specific integrated circuits (ASICs), programmable logic devices (PLDs), flash memory devices, and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter. Embodiments of the present disclosure may be implemented in whole or in part as machine-executable instructions that may be in program modules that are executed by a processing device. Examples of program modules include libraries, programs, routines, objects, components, and data structures. In distributed computing environments, program modules may be physically located in settings that are local, remote, or both.
  • For purposes of this disclosure, an information handling system may include any instrumentality or aggregate of instrumentalities operable to compute, calculate, determine, classify, process, transmit, receive, retrieve, originate, switch, store, display, communicate, manifest, detect, record, reproduce, handle, or utilize any form of information, intelligence, or data for business, scientific, control, or other purposes. For example, an information handling system may be a personal computer (e.g., desktop or laptop), tablet computer, mobile device (e.g., personal digital assistant (PDA) or smart phone), server (e.g., blade server or rack server), a network storage device, or any other suitable device and may vary in size, shape, performance, functionality, and price. The information handling system may include random access memory (RAM), one or more processing resources such as a central processing unit (CPU) or hardware or software control logic, ROM, and/or other types of nonvolatile memory. Additional components of the information handling system may include one or more disk drives, one or more network ports for communicating with external devices as well as various input and output (I/O) devices, such as a keyboard, a mouse, touchscreen and/or a video display. The information handling system may also include one or more buses operable to transmit communications between the various hardware components.
  • One skilled in the art will recognize that no computing system or programming language is critical to the practice of the present disclosure. One skilled in the art will also recognize that a number of the elements described above may be physically and/or functionally separated into sub-modules or combined together.
  • It will be appreciated by those skilled in the art that the preceding examples and embodiments are exemplary and not limiting to the scope of the present disclosure. It is intended that all permutations, enhancements, equivalents, combinations, and improvements thereto that are apparent to those skilled in the art upon a reading of the specification and a study of the drawings are included within the true spirit and scope of the present disclosure.

Claims (20)

What is claimed is:
1. A medical data system for generating patient treatment instructions, the system comprising:
a diagnostic engine utilizing machine learning that processes patient information, calculates probabilities associated with illnesses identified from a set of potential illnesses, and generates diagnostic data, the diagnostic engine having an input that receives patient information from at least one of a patient, medical staff, a doctor, a medical history database, and an automated system;
a treatment engine coupled to the diagnostic engine, the treatment engine receiving diagnostic data from the diagnostic engine and performing the steps of:
determining a treatment option for one or more of the illnesses;
analyzing a likelihood of success for the treatment option;
if the likelihood of success is below a threshold, the treatment engine requests additional diagnostic data from the diagnostic engine;
if the likelihood of success is above the threshold, treatment instructions for one or more treatment options are generated based on the analysis;
receiving feedback regarding the treatment instructions; and
providing the feedback to the diagnostic engine to improve the machine learning processes.
2. The medical data system according to claim 1, wherein the treatment engine assigns criticality scores to one or more of the illnesses.
3. The medical data system according to claim 1, wherein the diagnostic engine populates an electronic health care record (EHR).
4. The medical data system according to claim 3, further comprising a notes generation engine that generates exam notes to populate the EHR.
5. The medical data system according to claim 1, wherein the medical data system comprises a coding engine that, based on at least one of drug information, a patient preference, and a patient medical history, generates an output that comprises a treatment code.
6. The medical data system according to claim 5, wherein the treatment engine uses at least one of the treatment instructions and the treatment code to update an EHR.
7. A method for generating patient treatment instructions, the method comprising:
receiving patient information at a diagnostic engine, the patient information being received from at least one of a patient, medical staff, a doctor, a medical history database, and an automated system;
generating a set of potential illnesses from an analysis of the patient information;
assigning probabilities to the set of potential illnesses using machine learning processes;
producing diagnostic data from the set of potential illnesses and the assigned probabilities;
based on the diagnostic data, determining a treatment option for one or more of the illnesses;
analyzing a likelihood of success for the treatment option;
if the likelihood of success is below a threshold, requesting additional diagnostic data from the diagnostic engine;
if the likelihood of success is above the threshold, generating treatment instructions for one or more treatment options based on the analysis;
receiving feedback regarding the treatment instructions; and
providing the feedback to the diagnostic engine to improve the machine learning processes.
8. The method according to claim 7, wherein the one or more illnesses are assigned criticality scores.
9. The method according to claim 7, further comprising receiving patient-related data from an electronic health care record (EHR).
10. The method according to claim 7, further comprising, based on the diagnostic data, generating an output that comprises a treatment code that is based on at least one of drug information, a patient preference, and a patient medical history.
11. The method according to claim 10, further comprising using at least one of the treatment instructions and the treatment code to update an EHR.
12. The method according to claim 10, further comprising generating exam notes to populate the EHR.
13. The method according to claim 7, further comprising recalculating the likelihood of success to adjust the treatment instructions.
14. The method according to claim 7, wherein the treatment instructions comprise one of patient treatment instructions and requests for additional examination.
15. The method according to claim 14, further comprising calculating a risk of a false negative for the one or more illnesses.
16. A method for partially automated medical treatment, the method comprising:
receiving patient information from a patient or medical staff;
receiving patient medical history from a medical history database;
generating a set of potential illnesses from an analysis of the patient information and the patient medical history;
assigning a first set of probabilities to the set of potential illnesses using machine learning processes, the first set of probabilities relating to estimated illness accuracies associated with the set of potential illnesses based on the machine learning processes;
generating a set of diagnostic data for at least two of the potential illnesses within the set of potential illnesses;
based on at least one potential illness within the set of potential illnesses and at least some of the set of diagnostic data, determining a treatment option for one or more of the illnesses within the set of potential illnesses;
analyzing a likelihood of success for the treatment option;
if the likelihood of success is below a threshold, requesting additional diagnostic data from the diagnostic engine;
if the likelihood of success is above the threshold, generating treatment instructions for one or more treatment options based on the analysis;
receiving feedback regarding the treatment instructions; and
providing the feedback to the diagnostic engine to improve the machine learning processes.
17. The method according to claim 16, further comprising, based on the diagnostic data, generating an output that comprises a treatment code that is based on at least one of drug information, a patient preference, and a patient medical history.
18. The method according to claim 17, further comprising using at least one of the treatment instructions and the treatment code to update an EHR.
19. The method according to claim 17, further comprising generating exam notes to populate the EHR.
20. The method according to claim 16, wherein the treatment instructions comprise one of patient treatment instructions and requests for additional examination.
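The diagnostic/treatment feedback loop recited in the claims above can be sketched in code. This is a minimal illustration only: every function name, illness label, probability value, and the threshold below are hypothetical assumptions for exposition, not the implementation described in the specification (the feedback-to-the-diagnostic-engine step is omitted for brevity).

```python
def diagnostic_engine(patient_info, extra_rounds=0):
    """Assign illustrative probabilities to a set of potential illnesses.

    Each additional diagnostic round (e.g., another measurement or exam
    question) is modeled as sharpening the leading hypothesis.
    """
    illnesses = {"influenza": 0.6, "common_cold": 0.3, "strep_throat": 0.1}
    top = max(illnesses, key=illnesses.get)
    illnesses[top] = min(1.0, illnesses[top] + 0.15 * extra_rounds)
    return illnesses


def treatment_engine(diagnostic_data, threshold=0.7):
    """Pick a treatment option and check its likelihood of success.

    Returns (treatment_instructions, illness); instructions are None when
    the likelihood is below the threshold, signaling a request for
    additional diagnostic data.
    """
    illness = max(diagnostic_data, key=diagnostic_data.get)
    likelihood = diagnostic_data[illness]  # proxy for treatment success
    if likelihood < threshold:
        return None, illness
    return f"treatment plan for {illness}", illness


def run_visit(patient_info, max_rounds=5):
    """Iterate diagnosis and treatment selection until the threshold is met."""
    for round_ in range(max_rounds):
        data = diagnostic_engine(patient_info, extra_rounds=round_)
        instructions, _ = treatment_engine(data)
        if instructions is not None:
            return instructions, round_
    return None, max_rounds


if __name__ == "__main__":
    instructions, rounds = run_visit({"symptoms": ["fever", "cough"]})
    print(instructions, rounds)
```

With the illustrative numbers above, the first round (0.6) falls below the 0.7 threshold, one extra diagnostic round raises the leading probability to 0.75, and treatment instructions are generated on the second pass.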
US17/195,631 2018-03-06 2021-03-08 Systems and methods for semi-automated medical processes Abandoned US20210186312A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/195,631 US20210186312A1 (en) 2018-03-06 2021-03-08 Systems and methods for semi-automated medical processes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/913,801 US10939806B2 (en) 2018-03-06 2018-03-06 Systems and methods for optical medical instrument patient measurements
US17/195,631 US20210186312A1 (en) 2018-03-06 2021-03-08 Systems and methods for semi-automated medical processes

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/913,801 Continuation US10939806B2 (en) 2018-03-06 2018-03-06 Systems and methods for optical medical instrument patient measurements

Publications (1)

Publication Number Publication Date
US20210186312A1 true US20210186312A1 (en) 2021-06-24

Family

ID=67844088

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/913,801 Active 2039-02-28 US10939806B2 (en) 2018-03-06 2018-03-06 Systems and methods for optical medical instrument patient measurements
US17/195,631 Abandoned US20210186312A1 (en) 2018-03-06 2021-03-08 Systems and methods for semi-automated medical processes

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/913,801 Active 2039-02-28 US10939806B2 (en) 2018-03-06 2018-03-06 Systems and methods for optical medical instrument patient measurements

Country Status (2)

Country Link
US (2) US10939806B2 (en)
WO (1) WO2019173409A1 (en)

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11164679B2 (en) 2017-06-20 2021-11-02 Advinow, Inc. Systems and methods for intelligent patient interface exam station
US10939806B2 (en) * 2018-03-06 2021-03-09 Advinow, Inc. Systems and methods for optical medical instrument patient measurements
US11348688B2 (en) 2018-03-06 2022-05-31 Advinow, Inc. Systems and methods for audio medical instrument patient measurements
US11918294B2 (en) * 2019-01-31 2024-03-05 Brainlab Ag Virtual trajectory planning
US11904202B2 (en) 2019-03-11 2024-02-20 Rom Technolgies, Inc. Monitoring joint extension and flexion using a sensor device securable to an upper and lower limb
US11541274B2 (en) 2019-03-11 2023-01-03 Rom Technologies, Inc. System, method and apparatus for electrically actuated pedal for an exercise or rehabilitation machine
US11433276B2 (en) 2019-05-10 2022-09-06 Rehab2Fit Technologies, Inc. Method and system for using artificial intelligence to independently adjust resistance of pedals based on leg strength
US12102878B2 (en) 2019-05-10 2024-10-01 Rehab2Fit Technologies, Inc. Method and system for using artificial intelligence to determine a user's progress during interval training
US11801423B2 (en) 2019-05-10 2023-10-31 Rehab2Fit Technologies, Inc. Method and system for using artificial intelligence to interact with a user of an exercise device during an exercise session
US11904207B2 (en) 2019-05-10 2024-02-20 Rehab2Fit Technologies, Inc. Method and system for using artificial intelligence to present a user interface representing a user's progress in various domains
US11957960B2 (en) 2019-05-10 2024-04-16 Rehab2Fit Technologies Inc. Method and system for using artificial intelligence to adjust pedal resistance
US11071597B2 (en) 2019-10-03 2021-07-27 Rom Technologies, Inc. Telemedicine for orthopedic treatment
US11701548B2 (en) 2019-10-07 2023-07-18 Rom Technologies, Inc. Computer-implemented questionnaire for orthopedic treatment
US11515028B2 (en) 2019-10-03 2022-11-29 Rom Technologies, Inc. Method and system for using artificial intelligence and machine learning to create optimal treatment plans based on monetary value amount generated and/or patient outcome
US20210142893A1 (en) 2019-10-03 2021-05-13 Rom Technologies, Inc. System and method for processing medical claims
US11515021B2 (en) 2019-10-03 2022-11-29 Rom Technologies, Inc. Method and system to analytically optimize telehealth practice-based billing processes and revenue while enabling regulatory compliance
US20210128080A1 (en) * 2019-10-03 2021-05-06 Rom Technologies, Inc. Augmented reality placement of goniometer or other sensors
US11955221B2 (en) 2019-10-03 2024-04-09 Rom Technologies, Inc. System and method for using AI/ML to generate treatment plans to stimulate preferred angiogenesis
US11830601B2 (en) 2019-10-03 2023-11-28 Rom Technologies, Inc. System and method for facilitating cardiac rehabilitation among eligible users
US11101028B2 (en) 2019-10-03 2021-08-24 Rom Technologies, Inc. Method and system using artificial intelligence to monitor user characteristics during a telemedicine session
US11978559B2 (en) 2019-10-03 2024-05-07 Rom Technologies, Inc. Systems and methods for remotely-enabled identification of a user infection
US12062425B2 (en) 2019-10-03 2024-08-13 Rom Technologies, Inc. System and method for implementing a cardiac rehabilitation protocol by using artificial intelligence and standardized measurements
US12020799B2 (en) 2019-10-03 2024-06-25 Rom Technologies, Inc. Rowing machines, systems including rowing machines, and methods for using rowing machines to perform treatment plans for rehabilitation
US11915816B2 (en) 2019-10-03 2024-02-27 Rom Technologies, Inc. Systems and methods of using artificial intelligence and machine learning in a telemedical environment to predict user disease states
US11955223B2 (en) 2019-10-03 2024-04-09 Rom Technologies, Inc. System and method for using artificial intelligence and machine learning to provide an enhanced user interface presenting data pertaining to cardiac health, bariatric health, pulmonary health, and/or cardio-oncologic health for the purpose of performing preventative actions
US11955222B2 (en) 2019-10-03 2024-04-09 Rom Technologies, Inc. System and method for determining, based on advanced metrics of actual performance of an electromechanical machine, medical procedure eligibility in order to ascertain survivability rates and measures of quality-of-life criteria
US11923065B2 (en) 2019-10-03 2024-03-05 Rom Technologies, Inc. Systems and methods for using artificial intelligence and machine learning to detect abnormal heart rhythms of a user performing a treatment plan with an electromechanical machine
US11955220B2 (en) 2019-10-03 2024-04-09 Rom Technologies, Inc. System and method for using AI/ML and telemedicine for invasive surgical treatment to determine a cardiac treatment plan that uses an electromechanical machine
US11887717B2 (en) 2019-10-03 2024-01-30 Rom Technologies, Inc. System and method for using AI, machine learning and telemedicine to perform pulmonary rehabilitation via an electromechanical machine
US11961603B2 (en) 2019-10-03 2024-04-16 Rom Technologies, Inc. System and method for using AI ML and telemedicine to perform bariatric rehabilitation via an electromechanical machine
US11915815B2 (en) 2019-10-03 2024-02-27 Rom Technologies, Inc. System and method for using artificial intelligence and machine learning and generic risk factors to improve cardiovascular health such that the need for additional cardiac interventions is mitigated
US11075000B2 (en) 2019-10-03 2021-07-27 Rom Technologies, Inc. Method and system for using virtual avatars associated with medical professionals during exercise sessions
US12020800B2 (en) 2019-10-03 2024-06-25 Rom Technologies, Inc. System and method for using AI/ML and telemedicine to integrate rehabilitation for a plurality of comorbid conditions
US11756666B2 (en) 2019-10-03 2023-09-12 Rom Technologies, Inc. Systems and methods to enable communication detection between devices and performance of a preventative action
US20210134432A1 (en) 2019-10-03 2021-05-06 Rom Technologies, Inc. Method and system for implementing dynamic treatment environments based on patient information
US20210134412A1 (en) 2019-10-03 2021-05-06 Rom Technologies, Inc. System and method for processing medical claims using biometric signatures
US11317975B2 (en) 2019-10-03 2022-05-03 Rom Technologies, Inc. Method and system for treating patients via telemedicine using sensor data from rehabilitation or exercise equipment
US12087426B2 (en) 2019-10-03 2024-09-10 Rom Technologies, Inc. Systems and methods for using AI ML to predict, based on data analytics or big data, an optimal number or range of rehabilitation sessions for a user
US11069436B2 (en) 2019-10-03 2021-07-20 Rom Technologies, Inc. System and method for use of telemedicine-enabled rehabilitative hardware and for encouraging rehabilitative compliance through patient-based virtual shared sessions with patient-enabled mutual encouragement across simulated social networks
US11826613B2 (en) 2019-10-21 2023-11-28 Rom Technologies, Inc. Persuasive motivation for orthopedic treatment
US11687778B2 (en) 2020-01-06 2023-06-27 The Research Foundation For The State University Of New York Fakecatcher: detection of synthetic portrait videos using biological signals
US11107591B1 (en) 2020-04-23 2021-08-31 Rom Technologies, Inc. Method and system for describing and recommending optimal treatment plans in adaptive telemedical or other contexts
US12100499B2 (en) 2020-08-06 2024-09-24 Rom Technologies, Inc. Method and system for using artificial intelligence and machine learning to create optimal treatment plans based on monetary value amount generated and/or patient outcome
DE102020210995A1 (en) 2020-09-01 2022-03-03 Siemens Healthcare Gmbh Method for providing error information relating to a plurality of individual measurements
US11627243B2 (en) * 2021-07-23 2023-04-11 Phaox LLC Handheld wireless endoscope image streaming apparatus
EP4180989A1 (en) * 2021-11-15 2023-05-17 Koninklijke Philips N.V. Method and apparatus for processing data
GB2616295A (en) * 2022-03-03 2023-09-06 Intelligent Ultrasound Ltd Apparatus and method for monitoring a medical procedure

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030190602A1 (en) * 2001-03-12 2003-10-09 Monogen, Inc. Cell-based detection and differentiation of disease states
US20060189900A1 (en) * 2005-01-18 2006-08-24 Flaherty J C Biological interface system with automated configuration
US20100191100A1 (en) * 2009-01-23 2010-07-29 Warsaw Orthopedic, Inc. Methods and systems for diagnosing, treating, or tracking spinal disorders
US20150193583A1 (en) * 2014-01-06 2015-07-09 Cerner Innovation, Inc. Decision Support From Disparate Clinical Sources

Family Cites Families (123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6206829B1 (en) 1996-07-12 2001-03-27 First Opinion Corporation Computerized medical diagnostic and treatment advice system including network access
US6205716B1 (en) * 1995-12-04 2001-03-27 Diane P. Peltz Modular video conference enclosure
JP2002511965A (en) 1997-07-14 2002-04-16 アボツト・ラボラトリーズ Telemedicine
US6915254B1 (en) 1998-07-30 2005-07-05 A-Life Medical, Inc. Automatically assigning medical codes using natural language processing
US7418399B2 (en) 1999-03-10 2008-08-26 Illinois Institute Of Technology Methods and kits for managing diagnosis and therapeutics of bacterial infections
CA2314517A1 (en) 1999-07-26 2001-01-26 Gust H. Bardy System and method for determining a reference baseline of individual patient status for use in an automated collection and analysis patient care system
US6398728B1 (en) 1999-11-16 2002-06-04 Cardiac Intelligence Corporation Automated collection and analysis patient care system and method for diagnosing and monitoring respiratory insufficiency and outcomes thereof
US6970742B2 (en) 2000-01-11 2005-11-29 Savacor, Inc. Method for detecting, diagnosing, and treating cardiovascular disease
US20020019749A1 (en) 2000-06-27 2002-02-14 Steven Becker Method and apparatus for facilitating delivery of medical services
US8043224B2 (en) 2000-07-12 2011-10-25 Dimicine Research It, Llc Telemedicine system
DE10046110B8 (en) 2000-09-18 2006-07-06 Siemens Ag Medical diagnostic device with patient recognition
CA2325205A1 (en) 2000-11-02 2002-05-02 The Sullivan Group Computerized risk management module for medical diagnosis
US7395214B2 (en) 2001-05-11 2008-07-01 Craig P Shillingburg Apparatus, device and method for prescribing, administering and monitoring a treatment regimen for a patient
US7171311B2 (en) 2001-06-18 2007-01-30 Rosetta Inpharmatics Llc Methods of assigning treatment to breast cancer patients
US20030028406A1 (en) 2001-07-24 2003-02-06 Herz Frederick S. M. Database for pre-screening potentially litigious patients
US20110295621A1 (en) 2001-11-02 2011-12-01 Siemens Medical Solutions Usa, Inc. Healthcare Information Technology System for Predicting and Preventing Adverse Events
US8738396B2 (en) 2002-04-19 2014-05-27 Greenway Medical Technologies, Inc. Integrated medical software system with embedded transcription functionality
US9842188B2 (en) 2002-10-29 2017-12-12 Practice Velocity, LLC Method and system for automated medical records processing with cloud computing
US20050075907A1 (en) 2003-03-14 2005-04-07 John Rao Emergency station kiosk and related methods
US7410138B2 (en) * 2003-03-14 2008-08-12 Tgr Intellectual Properties, Llc Display adjustably positionable about swivel and pivot axes
US20050054938A1 (en) 2003-07-29 2005-03-10 Wehman Thomas C. Method and apparatus including altimeter and accelerometers for determining work performed by an individual
US9681925B2 (en) 2004-04-21 2017-06-20 Siemens Medical Solutions Usa, Inc. Method for augmented reality instrument placement using an image based navigation system
US20060036619A1 (en) 2004-08-09 2006-02-16 Oren Fuerst Method for accessing and analyzing medically related information from multiple sources collected into one or more databases for deriving illness probability and/or for generating alerts for the detection of emergency events relating to disease management including HIV and SARS, and for syndromic surveillance of infectious disease and for predicting risk of adverse events to one or more drugs
US20060173712A1 (en) 2004-11-12 2006-08-03 Dirk Joubert Portable medical information system
US20060111620A1 (en) * 2004-11-23 2006-05-25 Squilla John R Providing medical services at a kiosk
EP1910956A2 (en) 2005-05-04 2008-04-16 Board of Regents, The University of Texas System System, method and program product for delivering medical services from a remote location
US7384146B2 (en) 2005-06-28 2008-06-10 Carestream Health, Inc. Health care kiosk having automated diagnostic eye examination and a fulfillment remedy based thereon
CN101395621A (en) 2005-07-22 2009-03-25 断层放疗公司 System and method of remotely directing radiation therapy treatment
WO2007034494A2 (en) 2005-09-26 2007-03-29 Hadasit Medical Research Services & Development Company Ltd. A system and method for treating chronic pain
US20070106127A1 (en) 2005-10-11 2007-05-10 Alman Brian M Automated patient monitoring and counseling system
US20070168233A1 (en) 2006-01-16 2007-07-19 Chris Hymel Method for actuarial determination of the cost of one-time procedural or professional liability insurance policy
AU2007227678A1 (en) * 2006-03-13 2007-09-27 Mako Surgical Corp. Prosthetic device and system and method for implanting prosthetic device
US7693717B2 (en) 2006-04-12 2010-04-06 Custom Speech Usa, Inc. Session file modification with annotation using speech recognition or text to speech
US9060683B2 (en) 2006-05-12 2015-06-23 Bao Tran Mobile wireless appliance
US7890153B2 (en) 2006-09-28 2011-02-15 Nellcor Puritan Bennett Llc System and method for mitigating interference in pulse oximetry
WO2008089204A1 (en) 2007-01-15 2008-07-24 Allscripts Healthcare Solutions, Inc. Universal application integrator
US8140154B2 (en) 2007-06-13 2012-03-20 Zoll Medical Corporation Wearable medical treatment device
CN101742960B (en) 2007-07-03 2012-06-20 艾高特有限责任公司 Records access and management
US20090062623A1 (en) 2007-08-30 2009-03-05 Kimberly-Clark Worldwide, Inc. Identifying possible medical conditions of a patient
US20090062670A1 (en) 2007-08-30 2009-03-05 Gary James Sterling Heart monitoring body patch and system
US20090187425A1 (en) 2007-09-17 2009-07-23 Arthur Solomon Thompson PDA software robots leveraging past history in seconds with software robots
EP2245568A4 (en) 2008-02-20 2012-12-05 Univ Mcmaster Expert system for determining patient treatment response
US9743844B2 (en) 2008-03-21 2017-08-29 Computerized Screening, Inc. Community based managed health kiosk and prescription dispensement system
US8358822B2 (en) 2008-05-22 2013-01-22 Siemens Aktiengesellschaft Automatic determination of field of view in cardiac MRI
US20090299766A1 (en) 2008-05-30 2009-12-03 International Business Machines Corporation System and method for optimizing medical treatment planning and support in difficult situations subject to multiple constraints and uncertainties
US20100023351A1 (en) 2008-07-28 2010-01-28 Georgiy Lifshits System and method for automated diagnostics and medical treatment development for oriental medicine
US8078554B2 (en) 2008-09-03 2011-12-13 Siemens Medical Solutions Usa, Inc. Knowledge-based interpretable predictive model for survival analysis
WO2010036732A1 (en) 2008-09-25 2010-04-01 Zeltiq Aesthetics, Inc. Treatment planning systems and methods for body contouring applications
US20150120321A1 (en) * 2009-02-26 2015-04-30 I.M.D. Soft Ltd. Wearable Data Reader for Medical Documentation and Clinical Decision Support
WO2010124137A1 (en) 2009-04-22 2010-10-28 Millennium Pharmacy Systems, Inc. Pharmacy management and administration with bedside real-time medical event data collection
US20100280350A1 (en) 2009-05-02 2010-11-04 Xinyu Zhang Chinese medicine tele-diagnostics and triage system
WO2010144662A1 (en) 2009-06-10 2010-12-16 Medtronic, Inc. Absolute calibrated tissue oxygen saturation and total hemoglobin volume fraction
US8934636B2 (en) 2009-10-09 2015-01-13 George S. Ferzli Stethoscope, stethoscope attachment and collected data analysis method and system
US8761860B2 (en) 2009-10-14 2014-06-24 Nocimed, Llc MR spectroscopy system and method for diagnosing painful and non-painful intervertebral discs
US9314189B2 (en) 2009-11-06 2016-04-19 Biotronik Crm Patent Ag Extracorporeal physiological measurement device
WO2011143631A2 (en) 2010-05-14 2011-11-17 Kai Medical, Inc. Systems and methods for non-contact multiparameter vital signs monitoring, apnea therapy, sway cancellation, patient identification, and subject monitoring sensors
US20120065987A1 (en) 2010-09-09 2012-03-15 Siemens Medical Solutions Usa, Inc. Computer-Based Patient Management for Healthcare
US9002773B2 (en) 2010-09-24 2015-04-07 International Business Machines Corporation Decision-support application and system for problem solving using a question-answering system
US20120084092A1 (en) 2010-10-04 2012-04-05 Kozuch Michael J Method and apparatus for a comprehensive dynamic personal health record system
US20120191476A1 (en) 2011-01-20 2012-07-26 Reid C Shane Systems and methods for collection, organization and display of ems information
HUE056373T2 (en) * 2011-02-17 2022-02-28 Tyto Care Ltd System, handheld diagnostics device and methods for performing an automatic and remote trained personnel guided non-invasive medical examination
US9916420B2 (en) 2011-02-18 2018-03-13 Nuance Communications, Inc. Physician and clinical documentation specialist workflow integration
US20120232930A1 (en) 2011-03-12 2012-09-13 Definiens Ag Clinical Decision Support System
US9162032B2 (en) 2011-03-21 2015-10-20 William Ray Lynch, JR. Systems and methods for diagnosing and treating sleep disorders
US9265468B2 (en) 2011-05-11 2016-02-23 Broncus Medical, Inc. Fluoroscopy-based surgical device tracking method
WO2013029026A1 (en) 2011-08-24 2013-02-28 Acupera, Inc. Remote clinical care system
US20140058755A1 (en) 2011-11-23 2014-02-27 Remedev, Inc. Remotely-executed medical diagnosis and therapy including emergency automation
EP2816950A4 (en) 2012-02-22 2015-10-28 Aclaris Medical Llc Physiological signal detecting device and system
EP4140414A1 (en) * 2012-03-07 2023-03-01 Ziteo, Inc. Methods and systems for tracking and guiding sensors and instruments
US20130268203A1 (en) 2012-04-09 2013-10-10 Vincent Thekkethala Pyloth System and method for disease diagnosis through iterative discovery of symptoms using matrix based correlation engine
CN204428014U (en) * 2012-04-19 2015-07-01 尼尔·拉兹 Be convenient to the medical examination apparatus carrying out long-range inspection
WO2013165380A2 (en) 2012-04-30 2013-11-07 Empire Technology Development, Llc Infrared guide stars for endoscopic orienteering
US9460264B2 (en) 2012-05-04 2016-10-04 Elwha Llc Devices, systems, and methods for automated data collection
US8548828B1 (en) 2012-05-09 2013-10-01 DermTap Method, process and system for disease management using machine learning process and electronic media
EP2885759A4 (en) 2012-08-15 2016-02-10 Healthspot Inc Veterinary kiosk with integrated veterinary medical devices
US9536049B2 (en) 2012-09-07 2017-01-03 Next It Corporation Conversational virtual healthcare assistant
US20140095201A1 (en) 2012-09-28 2014-04-03 Siemens Medical Solutions Usa, Inc. Leveraging Public Health Data for Prediction and Prevention of Adverse Events
WO2014063162A1 (en) 2012-10-19 2014-04-24 Tawil Jack Modular telemedicine enabled clinic and medical diagnostic assistance systems
WO2014088933A1 (en) 2012-12-03 2014-06-12 DocView Solutions LLC Interactive medical device monitoring and management system
US10559377B2 (en) 2013-01-09 2020-02-11 Biomed Concepts Inc. Graphical user interface for identifying diagnostic and therapeutic options for medical conditions using electronic health records
US20140236630A1 (en) 2013-02-15 2014-08-21 Stc.Unm System and methods for health analytics using electronic medical records
KR20140107714A (en) 2013-02-25 2014-09-05 한국전자통신연구원 Health management system and method for providing health information thereof
US20140257852A1 (en) 2013-03-05 2014-09-11 Clinton Colin Graham Walker Automated interactive health care application for patient care
US9414776B2 (en) 2013-03-06 2016-08-16 Navigated Technologies, LLC Patient permission-based mobile health-linked information collection and exchange systems and methods
WO2014160172A1 (en) 2013-03-14 2014-10-02 Jintronix, Inc. Method and system for analysing a virtual rehabilitation activity/exercise
EP2967393A4 (en) 2013-03-15 2016-12-07 Peerbridge Health Inc System and method for monitoring and diagnosing patient condition based on wireless sensor monitoring data
US20140275849A1 (en) 2013-03-15 2014-09-18 Peerbridge Health, Inc. System and method for monitoring and diagnosing a patient condition based on wireless sensor monitoring data
CA2892554C (en) 2013-03-15 2017-04-18 Synaptive Medical (Barbados) Inc. System and method for dynamic validation, correction of registration for surgical navigation
US10303851B2 (en) 2013-03-15 2019-05-28 Md24 Patent Technology, Llc Physician-centric health care delivery platform
US20160067100A1 (en) * 2013-05-15 2016-03-10 Shayn Peirce Cottler Aural foreign body removal device and related methods of use and manufacture
US9391986B2 (en) 2013-05-31 2016-07-12 Verizon Patent And Licensing Inc. Method and apparatus for providing multi-sensor multi-factor identity verification
US20170039502A1 (en) 2013-06-28 2017-02-09 Healthtap, Inc. Systems and methods for evaluating and selecting a healthcare professional using a healthcare operating system
US20190139648A1 (en) 2013-06-28 2019-05-09 Healthtap, Inc. Systems and methods for triaging a health-related inquiry on a computer-implemented virtual consultation application
WO2015116344A1 (en) 2014-01-30 2015-08-06 Systemedical Llc Methods, devices, and systems for multi-format data aggregation
JP6044963B2 (en) 2014-02-12 2016-12-14 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Information processing apparatus, method, and program
US9203814B2 (en) 2014-02-24 2015-12-01 HCA Holdings, Inc. Providing notifications to authorized users
US20170105802A1 (en) * 2014-03-27 2017-04-20 Bresmedical Pty Limited Computer aided surgical navigation and planning in implantology
US9519634B2 (en) 2014-05-30 2016-12-13 Educational Testing Service Systems and methods for determining lexical associations among words in a corpus
US20160038092A1 (en) 2014-08-11 2016-02-11 Douglas A. Golay Applying non-real time and non-user attended algorithms to stored non-imaging data and existing imaging data for obtaining a dental diagnosis
WO2016054287A1 (en) 2014-10-01 2016-04-07 Bright.Md Inc. Medical diagnosis and treatment support apparatus, system, and method
US11593842B2 (en) 2014-10-28 2023-02-28 Hygeia Health, Inc. Systems, apparatuses, and methods for physiological data collection and providing targeted content
US20160140318A1 (en) 2014-11-13 2016-05-19 Peter Stangel Medical artificial intelligence system
US20180329713A1 (en) 2014-12-10 2018-11-15 Intel Corporation Fitness sensor with low power attributes in sensor hub
US10248759B2 (en) 2015-03-13 2019-04-02 Konica Minolta Laboratory U.S.A., Inc. Medical imaging reference retrieval and report generation
US10143418B2 (en) 2015-04-10 2018-12-04 Dymedix Corporation Combination airflow, sound and pulse wave sensor pack with smartphone data aquisition and transfer
ES2902794T3 (en) 2015-04-17 2022-03-29 Cleveland Clinic Found Apparatus and related procedure to facilitate testing by means of a computing device
US20160342753A1 (en) 2015-04-24 2016-11-24 Starslide Method and apparatus for healthcare predictive decision technology platform
US20190148012A1 (en) 2015-05-01 2019-05-16 Laboratory Corporation Of America Holdings Enhanced decision support for systems, methods, and media for laboratory benefit services
US10182710B2 (en) * 2015-07-23 2019-01-22 Qualcomm Incorporated Wearable dual-ear mobile otoscope
US9552745B1 (en) 2015-09-03 2017-01-24 Christian Raul Gutierrez Morales Medical attachment device tracking system and method of use thereof
CA2997552C (en) 2015-09-08 2020-03-24 Robert Howard ROSE Integrated medical device and home based system to measure and report vital patient physiological data via telemedicine
US20170084036A1 (en) * 2015-09-21 2017-03-23 Siemens Aktiengesellschaft Registration of video camera with medical imaging
US9953217B2 (en) 2015-11-30 2018-04-24 International Business Machines Corporation System and method for pose-aware feature learning
US10861604B2 (en) 2016-05-05 2020-12-08 Advinow, Inc. Systems and methods for automated medical diagnostics
US20170323071A1 (en) 2016-05-05 2017-11-09 James Stewart Bates Systems and methods for generating medical diagnosis
US20170344711A1 (en) 2016-05-31 2017-11-30 Baidu Usa Llc System and method for processing medical queries using automatic question and answering diagnosis system
US20180025121A1 (en) 2016-07-20 2018-01-25 Baidu Usa Llc Systems and methods for finer-grained medical entity extraction
US10657671B2 (en) * 2016-12-02 2020-05-19 Avent, Inc. System and method for navigation to a target anatomical object in medical imaging-based procedures
US20170173262A1 (en) * 2017-03-01 2017-06-22 François Paul VELTZ Medical systems, devices and methods
US20180330059A1 (en) 2017-05-09 2018-11-15 James Stewart Bates Patient treatment systems and methods
US20180330058A1 (en) 2017-05-09 2018-11-15 James Stewart Bates Systems and methods for generating electronic health care record data
US10971269B2 (en) 2017-12-08 2021-04-06 International Business Machines Corporation Treatment recommendation decision support using commercial transactions
US10939806B2 (en) * 2018-03-06 2021-03-09 Advinow, Inc. Systems and methods for optical medical instrument patient measurements

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030190602A1 (en) * 2001-03-12 2003-10-09 Monogen, Inc. Cell-based detection and differentiation of disease states
US20060189900A1 (en) * 2005-01-18 2006-08-24 Flaherty J C Biological interface system with automated configuration
US20100191100A1 (en) * 2009-01-23 2010-07-29 Warsaw Orthopedic, Inc. Methods and systems for diagnosing, treating, or tracking spinal disorders
US20150193583A1 (en) * 2014-01-06 2015-07-09 Cerner Innovation, Inc. Decision Support From Disparate Clinical Sources

Also Published As

Publication number Publication date
US10939806B2 (en) 2021-03-09
US20190274523A1 (en) 2019-09-12
WO2019173409A1 (en) 2019-09-12

Similar Documents

Publication Publication Date Title
US20210186312A1 (en) Systems and methods for semi-automated medical processes
EP3622407B1 (en) Patient treatment systems and methods
US11935656B2 (en) Systems and methods for audio medical instrument patient measurements
EP3641638B1 (en) Systems and methods for intelligent patient interface exam station
US20180330058A1 (en) Systems and methods for generating electronic health care record data
US20210090738A1 (en) Systems and methods for automated medical diagnostics
US20170323071A1 (en) Systems and methods for generating medical diagnosis
US20190279767A1 (en) Systems and methods for creating an expert-trained data model
US20170323069A1 (en) Systems and methods for medical instrument patient measurements
Pavel et al. The role of technology and engineering models in transforming healthcare
US20190221310A1 (en) System and method for automated diagnosis and treatment
US20190214134A1 (en) System and method for automated healthcare service
US11636777B2 (en) System and method for improving exercise performance using a mobile device
US10825556B2 (en) Clinical grade consumer physical assessment system
US20190066850A1 (en) Systems and methods for medical instrument patient measurements
CN118414672A (en) Direct medical treatment regimen prediction using artificial intelligence

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general. Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general. Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general. Free format text: ADVISORY ACTION MAILED
STCB Information on status: application discontinuation. Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION