WO2019173237A1 - Systems, devices, and methods for tracking and analyzing motion of a subject during a medical imaging scan and/or therapeutic procedure - Google Patents


Info

Publication number
WO2019173237A1
WO2019173237A1, PCT/US2019/020593, US2019020593W
Authority
WO
WIPO (PCT)
Prior art keywords
subject
data
motion
target regions
medical imaging
Prior art date
Application number
PCT/US2019/020593
Other languages
English (en)
Inventor
Jeffrey N. Yu
Michael G. Engelmann
Lalit Keshav Mestha
William C. Melohn
Barry M. Weinman
Ulf Peter Gustafsson
Yen Mei Lisa Chuah
Original Assignee
Kineticor, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/US2019/013147 (WO2019140155A1)
Application filed by Kineticor, Inc.
Publication of WO2019173237A1

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5258 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
    • A61B6/5264 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
    • A61B6/527 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion using data from a motion artifact sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/541 Control of apparatus or devices for radiation diagnosis involving acquisition triggered by a physiological signal
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/545 Control of apparatus or devices for radiation diagnosis involving automatic set-up of acquisition parameters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1048 Monitoring, verifying, controlling systems and methods
    • A61N5/1049 Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00681 Aspects not otherwise provided for
    • A61B2017/00694 Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • A61B2017/00699 Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement caused by respiration, e.g. by triggering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B2017/00681 Aspects not otherwise provided for
    • A61B2017/00694 Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • A61B2017/00703 Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement of heart, e.g. ECG-triggered
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3991 Markers, e.g. radio-opaque or breast lesions markers having specific anchoring means to fixate the marker to the tissue, e.g. hooks
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1048 Monitoring, verifying, controlling systems and methods
    • A61N5/1049 Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N2005/1055 Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam using magnetic resonance imaging [MRI]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/1048 Monitoring, verifying, controlling systems and methods
    • A61N5/1049 Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N2005/1059 Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam using cameras imaging the patient

Definitions

  • Some embodiments described herein relate to the field of medical imaging and/or treatment, and more specifically to systems, devices, and methods for tracking, correcting, and/or accounting for patient movement during medical scans and/or therapeutic procedures.
  • The disclosure relates generally to the field of image and/or video analysis, and more specifically to systems, devices, and methods for tracking and/or analyzing subject images and/or videos, for example during a medical imaging scan and/or therapeutic procedure.
  • MRI: magnetic resonance imaging
  • MR: magnetic resonance
  • An MRI scanner or magnetic resonance (MR) scanner is a device in which the patient or a portion of the patient’s body is positioned within a powerful magnet, where a magnetic field is used to align the magnetization of some atomic nuclei (usually hydrogen nuclei, i.e. protons) and radio frequency magnetic fields are applied to systematically alter the alignment of this magnetization. This causes the nuclei to produce a rotating magnetic field detectable by the scanner, and this information is recorded to construct an image of the scanned region of the body.
  • CT: computed tomography
  • PET: positron emission tomography
  • SPECT: single-photon emission computed tomography
  • The medical imaging scanner can comprise a computed tomography (CT) scanner, a positron emission tomography (PET) scanner, a single-photon emission computed tomography (SPECT) scanner, and/or a digital angiographic scanner.
  • Radiation therapy can be applied to a targeted tissue region.
  • Radiation therapy can be dynamically applied in response to patient movements.
  • In some existing systems, however, the tracking of patient movements does not have a high degree of accuracy. Accordingly, the use of such systems can result in the application of radiation therapy to non-targeted tissue regions, thereby unintentionally harming healthy tissue while intentionally affecting diseased tissue.
  • The same considerations are also true for proton therapies and other therapies.
  • United States Patent No. 9,734,589 issued on August 15, 2017, describes systems, devices, and methods that adaptively compensate for subject motion which may be utilized in combination with and/or in addition to one or more embodiments disclosed herein.
  • United States Patent No. 9,734,589 is incorporated herein by reference in its entirety and forms part of this specification.
  • An accurate and reliable method of determining the dynamic position and orientation of a patient’s head or other body portion can be helpful in compensating for subject motion during such procedures and/or determining one or more biometric data and/or analyses.
  • Certain embodiments disclosed herein are directed to systems, devices, and methods that provide practical optical tracking capability, for example using an optical marker configured to be attached to the subject of interest and/or a landmark on the subject, such as one or more facial features, and/or using optical flow vectors or optical signals containing blood volume information obtained by analyzing videos and/or images.
  • Tracking data collected from the subject can be used to control, modify, and/or improve results of a medical imaging scanner and/or therapeutic device.
  • Tracking data collected by a motion tracking device and/or system can be stored for further analysis regarding the patient or subject, for example for biometric data analysis and/or for improving modification and/or control of the medical imaging scanner and/or therapeutic device with respect to correcting motion artifacts.
  • A standalone camera or detector system, separate from a medical imaging scanner and/or therapeutic device, can be used to collect subject image and/or video data, which can be further analyzed or processed to obtain biometric data.
  • A system for detecting motion of a subject and adjusting one or more scan parameters to adjust image quality during a medical imaging scan comprises: a medical imaging scanner configured to scan one or more target regions of a subject; one or more flexible coils configured to be wrapped around the one or more target regions of the subject during the medical imaging scan; one or more image sensors configured to collect video data of the one or more target regions of the subject based at least in part on the one or more flexible coils; one or more computer readable storage devices configured to store a plurality of computer executable instructions; and one or more hardware computer processors in communication with the one or more computer readable storage devices and configured to execute the plurality of computer executable instructions in order to cause the system to: identify the one or more target regions of the subject during the medical imaging scan; receive video data collected by the one or more image sensors of the one or more target regions of the subject based at least in part on the one or more flexible coils; extract one or more motion attributes from the received video data of the one or more target regions; and dynamically adjust one or more scan parameters of the medical imaging scanner based at least in part on the extracted one or more motion attributes.
  • The one or more motion attributes comprise one or more of 6 Degrees of Freedom motion coordinates, a gating signal for cardiac synchronization, or a respiratory signal for respiratory synchronization.
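As an illustration of the first of these attributes, a 6 Degrees of Freedom pose (three translations plus roll, pitch, and yaw) is often packed into a homogeneous transform so tracked motion can be applied to points in the scanner coordinate frame. The sketch below is not taken from the patent; it is a minimal example assuming NumPy and a Z-Y-X Euler rotation convention.

```python
import numpy as np

def pose_matrix(tx, ty, tz, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from 6-DOF coordinates
    (translations in mm, rotations in radians, Z-Y-X convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    T[:3, 3] = [tx, ty, tz]    # translation column
    return T

def apply_pose(T, point):
    """Map a 3-D point through the pose transform."""
    p = np.append(np.asarray(point, dtype=float), 1.0)
    return (T @ p)[:3]
```

A pure translation of (1, 2, 3) mm, for example, maps the origin to (1, 2, 3); the same machinery handles arbitrary combined rotations and translations reported by a tracker.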
  • The system further comprises one or more markers to be placed on the one or more target regions of the subject.
  • The one or more markers are configured to be placed on the one or more flexible coils.
  • The one or more markers are configured to be placed on an uncovered portion of a body of the subject, wherein the uncovered portion of the body of the subject is adjacent to the one or more flexible coils.
  • The one or more image sensors are configured to collect video data of the one or more target regions of the subject by detecting the one or more markers and using a 6 Degrees of Freedom motion coordinate processing algorithm.
  • The one or more image sensors are configured to collect video data of the one or more target regions of the subject by detecting one or more landmarks on the one or more target regions without one or more markers. In some embodiments, the one or more image sensors are configured to collect video data of the one or more target regions of the subject by utilizing a 6 Degrees of Freedom motion coordinate processing algorithm. In some embodiments, the one or more image sensors are configured to collect video data of the one or more target regions of the subject by utilizing one or more artificial intelligence algorithms.
  • The one or more target regions of the subject comprise one or more of a shoulder, arm, leg, chest, abdomen, thorax, knee, or head of the subject.
  • The one or more flexible coils are configured to be wrapped around one or more of a shoulder, arm, leg, chest, abdomen, thorax, knee, or head of the subject.
  • The system is further caused to process one or more biometric parameters of the subject during the medical imaging scan based at least in part on the video data collected by the one or more image sensors of the one or more target regions of the subject.
  • The one or more biometric parameters comprise one or more of: respiration rate, pulse rate, anxiety levels, blood pressure, heart rate variability, eye blink rate, or eye blink duration. In some embodiments, the one or more biometric parameters comprise one or more of: position of the subject, height of the subject, width of the subject, body volume of the subject, estimated body mass index of the subject, facial recognition of the subject, or a patient barcode of the subject.
  • A system for detecting motion of a subject and adjusting one or more treatment parameters to adjust treatment quality during a therapeutic treatment procedure comprises: a biomedical therapeutic treatment device configured to treat one or more target regions of a subject; one or more flexible coils configured to be wrapped around the one or more target regions of the subject during the therapeutic treatment procedure; one or more image sensors configured to collect video data of the one or more target regions of the subject based at least in part on the one or more flexible coils; one or more computer readable storage devices configured to store a plurality of computer executable instructions; and one or more hardware computer processors in communication with the one or more computer readable storage devices and configured to execute the plurality of computer executable instructions in order to cause the system to: identify the one or more target regions of the subject during the therapeutic treatment procedure; receive video data collected by the one or more image sensors of the one or more target regions of the subject based at least in part on the one or more flexible coils; extract one or more motion attributes from the received video data of the one or more target regions; and dynamically adjust one or more treatment parameters of the biomedical therapeutic treatment device based at least in part on the extracted one or more motion attributes.
  • The one or more motion attributes comprise one or more of 6 Degrees of Freedom motion coordinates, a gating signal for cardiac synchronization, or a respiratory signal for respiratory synchronization. In some embodiments, the one or more flexible coils are configured to be wrapped around one or more of a shoulder, arm, leg, chest, abdomen, thorax, knee, or head of the subject. In some embodiments, the system further comprises one or more markers to be placed on the one or more target regions of the subject.
  • The one or more image sensors are configured to collect video data of the one or more target regions of the subject by detecting one or more landmarks on the one or more target regions without one or more markers.
  • FIG. 1A illustrates an embodiment of a schematic diagram depicting a side view of a system for tracking and/or analyzing subject images and/or videos;
  • FIG. 1B illustrates an embodiment of a schematic diagram depicting a front view of a system for tracking and/or analyzing subject images and/or videos;
  • FIG. 2 is a block diagram illustrating a computer hardware system configured to run software for implementing one or more embodiments of systems, devices, and methods for tracking and analyzing subject motion during a medical imaging scan and/or therapeutic procedure;
  • FIG. 3 illustrates an example(s) of motion attributes such as 6-DOF coordinates shown with respect to time;
  • FIG. 4 illustrates an example(s) of cardiac signal (and power spectral density) band-limited to contain frequency components within 0.75 to 2 Hz after processing the y-direction signal;
  • FIG. 5 illustrates an example(s) of respiratory signal (and power spectral density) band-limited to contain frequency components within 0.05 to 0.3 Hz after processing z-direction signal;
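The band limits cited for FIGS. 4 and 5 (0.75 to 2 Hz for the cardiac signal, 0.05 to 0.3 Hz for the respiratory signal) can be realized with a zero-phase band-pass filter applied to the per-axis motion traces. The following is a hedged sketch, not the patent's implementation: it assumes SciPy, a synthetic motion trace, and an assumed 30 fps camera frame rate.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(signal, fs, lo, hi, order=3):
    """Zero-phase Butterworth band-pass of a 1-D motion signal.
    fs: sampling rate (Hz); lo/hi: passband edges (Hz)."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

# Hypothetical motion trace sampled at an assumed 30 fps camera rate,
# mixing a 1.2 Hz cardiac-like and a 0.2 Hz respiratory-like component:
fs = 30.0
t = np.arange(0, 30, 1 / fs)
y = 0.1 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.2 * t)

cardiac = bandpass(y, fs, 0.75, 2.0)   # cardiac band, as in FIG. 4
resp = bandpass(y, fs, 0.05, 0.3)      # respiratory band, as in FIG. 5
```

Filtering forward and backward (`sosfiltfilt`) keeps the two physiological waveforms phase-aligned with the raw motion trace, which matters when the filtered signal is used for gating.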
  • FIG. 6 illustrates an example(s) of a measured ballistocardiography (BCG) waveform measured using a modified electronic weighing scale for one heartbeat;
  • FIG. 7A illustrates an example(s) of ballistocardiography (BCG) waveform obtained from y-channel motion signal (H, I, J, K, and L waves are marked. M & N waves can also be seen in the same waveform) and detrended but unfiltered y-channel signal;
  • FIG. 7B illustrates an example(s) of BCG waveform from y-channel motion signal marked with H, I, J, K, and L waves;
  • FIG. 8 illustrates a schematic of an overlapping batch sequence diagram with sliding windows;
  • FIG. 9 illustrates an example(s) of a respiration gating signal;
  • FIG. 10 illustrates an example(s) of the absorption coefficient of hemoglobin and water shown with respect to wavelength;
  • FIG. 11 illustrates an example(s) of raw signals with videoplethysmography (VPG) markers from each ROI captured with sequentially switching LEDs in the NIR wavelength band and an example(s) of raw signals with BCG markers;
  • FIG. 12 illustrates an example(s) of raw motion signal and raw signal for ROI of size 112 x 112 pixels with VPG markers;
  • FIG. 13 illustrates an example(s) of VPG signals overlapped with detrended signal but unfiltered at different ROIs extracted using one or more methods described herein;
  • FIG. 14 illustrates an example(s) of a VPG signal (top) and respiration signal (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8;
  • FIG. 15 illustrates an example(s) of pulse rate / heart rate (top) and respiration rate (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8;
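The continuous pulse-rate and respiration-rate tracking of FIG. 15 relies on the overlapping sliding windows of FIG. 8: each window is analyzed in the frequency domain and the spectral peak inside the physiological band is converted to a rate. A minimal sketch is shown below; it is not the patent's algorithm, and the window length, step, and band are assumed values.

```python
import numpy as np

def rates_over_time(signal, fs, win_s=10.0, step_s=1.0, band=(0.75, 2.0)):
    """Estimate a rate (beats or breaths per minute) in overlapping
    sliding windows by locating the spectral peak inside `band` (Hz)."""
    win, step = int(win_s * fs), int(step_s * fs)
    rates = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        seg = seg - seg.mean()                       # remove DC / detrend
        spec = np.abs(np.fft.rfft(seg * np.hanning(win)))
        freqs = np.fft.rfftfreq(win, 1 / fs)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        rates.append(60.0 * freqs[mask][np.argmax(spec[mask])])
    return rates
```

Calling it with `band=(0.05, 0.3)` would produce a respiration-rate track instead of a pulse-rate track, mirroring the top/bottom panels described for FIG. 15.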
  • FIG. 16 is a schematic diagram illustrating intervals for pulsing illumination;
  • FIG. 17 is a flowchart illustrating an example embodiment(s) of pulsing illumination to obtain clearer raw VPG signals;
  • FIG. 18A illustrates example images obtained with pulsing illumination;
  • FIG. 18B illustrates example raw VPG signals obtained with pulsing illumination;
  • FIG. 19A illustrates example images obtained with continuous illumination;
  • FIG. 19B illustrates example raw VPG signals obtained with continuous illumination;
  • FIG. 20A illustrates an example unpolarized image of tissue;
  • FIG. 20B illustrates an example polarized image of the same tissue from FIG. 20A;
  • FIG. 21 is a schematic diagram illustrating an example imaging system comprising a polarizer and an analyzer;
  • FIG. 22A illustrates an example of how the peak power in low frequency (LF) and high frequency (HF) components can be different under sympathetic and/or parasympathetic influence;
  • FIG. 22B illustrates another example of how the peak power in low frequency (LF) and high frequency (HF) components can be different under sympathetic and/or parasympathetic influence;
  • FIG. 23 illustrates example frequency contents and their range and associations with the autonomic nervous system (ANS);
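The LF/HF comparison of FIGS. 22A-23 is conventionally quantified as a spectral power ratio over the standard heart-rate-variability bands (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz; these band edges are the usual HRV convention, not figures from the patent). A hedged sketch using SciPy's Welch estimator:

```python
import numpy as np
from scipy.signal import welch

LF_BAND = (0.04, 0.15)   # conventional HRV low-frequency band (Hz)
HF_BAND = (0.15, 0.40)   # conventional HRV high-frequency band (Hz)

def lf_hf_ratio(signal, fs):
    """LF/HF spectral power ratio, a common index of sympathetic vs.
    parasympathetic balance, from a Welch power spectral density."""
    f, pxx = welch(signal, fs=fs, nperseg=min(len(signal), 1024))
    # Equal bin widths, so summed PSD bins give a valid power ratio.
    lf = pxx[(f >= LF_BAND[0]) & (f < LF_BAND[1])].sum()
    hf = pxx[(f >= HF_BAND[0]) & (f <= HF_BAND[1])].sum()
    return lf / hf
```

A ratio well above 1 indicates LF (sympathetically influenced) dominance; a ratio below 1 indicates HF (parasympathetically influenced) dominance, in line with the contrast the figures illustrate.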
  • FIG. 24 illustrates example systems, devices, and methods for balancing;
  • FIG. 25 illustrates example images captured with a four-camera motion tracking system, in which the illustrated rectangular regions show areas used for measuring eye blink rate;
  • FIG. 26 is a block diagram illustrating example methods for extracting eye blink rate and/or blink duration;
  • FIG. 27 illustrates example data collected for eye blink rate detection;
  • FIG. 28 illustrates example data collected for eye blink rate detection;
  • FIG. 29 illustrates example data collected for eye blink rate detection;
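One simple way to extract a blink rate from the eye-region data of FIGS. 25-29 is to track the mean intensity of the eye ROI per frame: a blink transiently darkens the region, so blink onsets appear as downward crossings of a baseline threshold. The sketch below is illustrative only (the threshold fraction, merge gap, and frame rate are assumptions, not values from the patent).

```python
import numpy as np

def blink_rate(roi_means, fs, drop_frac=0.9, min_gap_s=0.1):
    """Estimate eye-blink rate (blinks/min) from per-frame mean
    intensity of an eye ROI: count downward crossings below a fraction
    of the median baseline, merging crossings closer than min_gap_s."""
    x = np.asarray(roi_means, dtype=float)
    baseline = np.median(x)
    below = x < drop_frac * baseline
    # Rising edges of the 'below' mask mark blink onsets.
    onsets = np.flatnonzero(below[1:] & ~below[:-1]) + 1
    # Merge onsets closer than min_gap_s (treated as the same blink).
    if len(onsets) > 1:
        keep = np.insert(np.diff(onsets) > min_gap_s * fs, 0, True)
        onsets = onsets[keep]
    duration_s = len(x) / fs
    return 60.0 * len(onsets) / duration_s, onsets
```

Blink duration could be estimated from the same mask by measuring how long each below-threshold run lasts, which is the companion quantity named in FIG. 26.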
  • FIG. 30 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket that is configured to wrap around a body part of a patient;
  • FIG. 31 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket with representative markers that is configured to wrap around a body part of a patient;
  • FIG. 32 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket without representative markers that is held by the scanner operator prior to wrapping around a body part of a patient;
  • FIG. 33 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket without representative markers that shows how the coil can be wrapped around the hand/wrist for imaging;
  • FIG. 34 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket with representative markers that is wrapped around the patient’s lower arm/wrist;
  • FIG. 35 illustrates an embodiment(s) of a schematic of a small flexible coil with representative markers used for wrapping around the foot/ankle rest;
  • FIG. 36 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket with representative markers used for wrapping around the chin;
  • FIG. 37 illustrates an embodiment(s) of a wearable, flexible head coil with representative markers that can be used for head, brain and jaw imaging;
  • FIG. 38 illustrates an embodiment(s) of a motion tracking system for a scanner with a flexible coil to detect movement of the head whilst the patient is lying in the scanner;
  • FIG. 39 illustrates an example(s) of oblique axial, oblique coronal, and oblique sagittal imaging planes that can be used in the diagnosis and/or treatment of many musculoskeletal diseases of the ankle and foot;
  • FIG. 40 illustrates an example(s) of a spine coil embedded in a medical scanner or treatment bed that can be used for spine imaging and/or a potential marker location for lumbar spine imaging;
  • FIG. 41 illustrates an example(s) of a neck and head coil that can be used for neck and head imaging and/or potential marker location on the neck of a subject;
  • FIG. 42 illustrates an example(s) of one or more body coils that can be used for body imaging and/or potential marker locations on the abdomen of a subject;
  • FIG. 43 illustrates example(s) of a coil that can be used for hand/wrist imaging and/or potential marker locations on the hand of a subject;
  • FIG. 44 illustrates an example(s) of a foot/ankle coil that can be used for foot/ankle imaging and/or potential marker location on the foot of a subject;
  • FIG. 45 illustrates an example(s) of a knee coil that can be used for knee imaging and/or potential marker location on the thigh region of a subject;
  • FIG. 46 illustrates an example(s) of a shoulder coil that can be used for shoulder imaging and/or potential marker location closer to the coil near the shoulder of a subject;
  • FIG. 47 illustrates an example(s) of a coil that can be used for breast imaging and/or potential marker location on the back of a subject;
  • FIG. 48 illustrates an example(s) of a wearable coil and potential marker locations on such a coil;
  • FIG. 49 illustrates an example(s) control flow diagram of an imaging system that can be used for motion tracking and/or a control system for improving image quality in biomedical systems;
  • FIG. 50 illustrates an example(s) flow diagram of an imaging system that can be used for identifying a region and/or target of interest for tracking motion attributes;
  • FIG. 51 illustrates an example(s) flow diagram of an imaging system that can be used for identifying features of interest from an identified region and/or target of interest;
  • FIG. 52 illustrates an example(s) flow diagram of an imaging system that can be used for extracting motion attributes.
  • In radiotherapy, there are various technologies for therapeutic radiation and other therapeutics.
  • It can be advantageous in radiation therapy, proton therapy, or other therapies to dynamically apply the radiation to a targeted area in order to account for patient movement.
  • Patient movement can include respiration, twitches, or any other voluntary or involuntary movements of the patient.
  • By accounting for such movement, radiation therapy, proton therapy, and any other kind of therapy can be applied in a more targeted way, thereby allowing surrounding healthy tissue to be avoided and/or unharmed.
  • The systems disclosed herein can be adapted and configured to track patient translations with accuracies of about 0.1 mm and angle accuracies of about 0.1 degrees in order to better apply radiation therapy, proton therapy, or any other therapy to the targeted tissue or area of the body.
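A tracker reporting 6-DOF motion at roughly the stated accuracies (about 0.1 mm in translation and 0.1 degree in rotation) can gate therapy delivery with a simple tolerance check on the tracked pose change. The helper below is an illustrative sketch, not a described component of the patent; the tolerance defaults simply reuse the accuracy figures quoted above.

```python
def within_tolerance(pose_delta, trans_tol_mm=0.1, rot_tol_deg=0.1):
    """Gate therapy delivery: True only while every translation
    component (mm) and rotation component (deg) of the tracked
    6-DOF pose change stays within tolerance."""
    tx, ty, tz, rx, ry, rz = pose_delta
    return (all(abs(v) <= trans_tol_mm for v in (tx, ty, tz))
            and all(abs(v) <= rot_tol_deg for v in (rx, ry, rz)))
```

In a gating loop, the beam (or acquisition) would be paused whenever this check returns False and resumed once the subject settles back within tolerance.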
  • Motion tracking data collected of a subject can be used to control, modify, and/or improve results of a medical imaging scanner and/or therapeutic device.
  • Tracking data collected by a motion tracking device and/or system in conjunction with a medical imaging scanner and/or therapeutic device can be stored and/or further analyzed, for example for biometric data analysis and/or for improving modification and/or control of the medical imaging scanner and/or therapeutic device with respect to correcting motion artifacts.
  • A standalone camera or detector system separate from a medical imaging scanner and/or therapeutic device can be used to collect subject image and/or video data, which can be further analyzed or processed to obtain biometric data.
  • FIG. 1A is a schematic diagram illustrating a side view of a medical imaging scanner and/or medical therapeutic device 104 as part of a motion compensation system 100 for the same.
  • FIG. 1B is a schematic diagram illustrating a front view of a medical imaging scanner and/or medical therapeutic device 104 as part of a motion compensation system 100 for the same that is configured to detect and account for false movements for motion correction during a medical imaging scan or therapeutic procedure.
  • The system can comprise a standalone motion tracking system 102 that can be used to track motion data for generating biometric data without being used in conjunction with a medical imaging scanner and/or medical therapeutic device 104.
  • The motion compensation system 100 illustrated in FIGS. 1A and 1B can comprise a motion tracking system 102, a scanner, a scanner controller 106, one or more detectors 108, one or more motion tracking markers or landmarks 110, and/or a biometric data analysis system 116.
  • The biometric data analysis system 116 can be part of or separate from the motion tracking system 102 and/or medical imaging scanner and/or medical therapeutic device 104.
  • One or more markers 110 can be attached and/or otherwise placed on a subject 112.
  • the one or more markers 110 can be placed on the face of a subject 112 for imaging or therapeutic procedures directed to the head or brain of the subject.
  • the one or more markers 110 can be placed on other portions of a body of a subject 112 for imaging or therapeutic procedures directed to other portions of the body of the subject 112.
  • the system 100 can be configured to track the motion of one or more landmark features 110 on the subject 112, without using attachable markers.
  • the subject 112 can be positioned to lie on a table 114 of a medical imaging scanner and/or medical therapeutic device 104.
  • the medical imaging scanner and/or medical therapeutic device 104 can be, for example, a magnetic resonance imaging scanner or MRI scanner.
  • a three-dimensional coordinate system or space can be applied to a subject that is positioned inside a medical imaging scanner and/or medical therapeutic device 104.
  • the center or substantially the center of a particular portion of the subject 112 under observation can be thought of as having coordinates of (0, 0, 0).
  • a z-axis can be imposed along the longitudinal axis of the medical imaging scanner and/or medical therapeutic device 104.
  • the z-axis can be positioned along the length of the medical imaging scanner and/or medical therapeutic device 104 and along the height of a subject or patient that is positioned within the medical imaging scanner and/or medical therapeutic device 104, thereby essentially coming out of the medical scanner and/or medical therapeutic device 104.
  • an x-axis can be thought of as being positioned along the width of the medical imaging scanner and/or medical therapeutic device 104 or along the width of the patient or subject that is positioned within the medical imaging scanner and/or medical therapeutic device 104.
  • a y-axis can be thought of as extending along the height of the medical imaging scanner and/or medical therapeutic device 104.
  • the y-axis can be thought of as extending from the patient or subject located within the medical imaging scanner and/or medical therapeutic device 104 towards the one or more detectors 108.
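The coordinate convention above can be sketched in code. The function name, the single-axis rotation, and the numeric values are illustrative assumptions for this sketch, not details from the application:

```python
import math

def apply_pose(point, tx, ty, tz, rx_deg):
    """Apply a rigid-body pose (translations in mm, rotation about the
    x-axis in degrees) to a point in the scanner coordinate system,
    where (0, 0, 0) is the center of the observed region, z runs along
    the scanner bore, x runs across the subject, and y extends from the
    subject toward the detectors."""
    x, y, z = point
    r = math.radians(rx_deg)
    # Rotate about the x-axis, then translate.
    y_rot = y * math.cos(r) - z * math.sin(r)
    z_rot = y * math.sin(r) + z * math.cos(r)
    return (x + tx, y_rot + ty, z_rot + tz)

# A marker 100 mm above isocenter, moved by 0.1 mm along z and rotated
# 0.1 degrees about x -- on the order of the tracking accuracies cited above.
moved = apply_pose((0.0, 100.0, 0.0), 0.0, 0.0, 0.1, 0.1)
```

Sub-millimeter and sub-degree pose updates of this kind are what the correction pipeline would consume per frame.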
  • a motion tracking device and/or system 102 can be configured to collect motion data and/or subject images and/or videos, for example during a medical imaging scan and/or therapeutic procedure or otherwise.
  • the data collected from the actual scanning and/or therapeutic process can be further analyzed and/or manipulated after that data has been collected.
  • the data can be sent back to a computation repository in some embodiments for further analysis.
  • the system can be configured to conduct further analysis of the motion tracking data, in addition to just getting data as to where one or more cameras or other motion tracking detectors are pointing and how accurate they are at seeing the target.
  • such data can be used to determine respiration rate, heart rate (also called “pulse rate”), position of the subject or a portion thereof as a function of time, blood pressure, heart rate variability, health of the cardiopulmonary system, and/or various other sorts of biometrics simply by using the intensity variation due to jitter/motion or the intensity variation due to light absorption that is observed in the image.
  • such data can be used to automatically, semi-automatically, or otherwise determine the position of the subject or a portion thereof with respect to a medical imaging scanner or therapeutic device, whether the subject is touching a wall of the medical imaging scanner or therapeutic device, length or height of a subject, weight of a subject, subject body volume, estimated body mass index (BMI), facial recognition, and/or subject barcode identification.
  • the system can be configured to obtain one or more of such biometrics data and/or conduct analyses in a non-invasive and/or invasive manner.
  • data and/or tracking data collected by a motion tracking and/or correction device can comprise video and/or image data (for example, high-definition video or the like) of a subject of a medical imaging scan or therapeutic procedure, tracking data of motion of one or more targets or markers on the subject, tracking data of motion of one or more landmarks, such as facial or other body features, of the subject, audio or sound data, temperature or thermal data of the subject, atmospheric data, light data, infrared data, hyperspectral data, multispectral data, infrared data for pulsation and/or blood movement, moisture, sweat, patient overheating, hemoglobin wavelength, follicle motion, water detection, and/or time associated with any of the aforementioned data.
  • one or more biometrics data and/or other data relating to the subject can be observed through one or more correction algorithms and/or derived by using one or more correction algorithms.
  • correction algorithms are described in United States Patent No. 9,734,589, which is hereby incorporated herein in its entirety.
  • if a subject is sitting in a medical imaging scanner and/or therapeutic device and breathing, his or her chest would go up and down, causing an affixed target, an optical landmark, and/or optical flow vectors derived by analyzing video frames to go up and down.
  • This pattern of movement can be streamed as video data generated from the motion tracking system and/or device, and such data can be analyzed and processed in order to determine respiratory pattern, rate, and/or related features of the subject.
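The respiratory-rate idea above can be sketched as a minimal zero-crossing count over a vertical-displacement trace. The function name, the synthetic trace, and the 30 fps frame rate are illustrative assumptions, not specifics from the application:

```python
import math

def respiration_rate_bpm(displacement, fps):
    """Estimate breaths per minute from a vertical-displacement trace
    (e.g. the height of a chest marker per video frame) by counting
    rising zero crossings of the mean-removed signal."""
    mean = sum(displacement) / len(displacement)
    centered = [d - mean for d in displacement]
    # Each rising zero crossing corresponds to one full breath cycle.
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a < 0.0 <= b)
    duration_min = len(displacement) / fps / 60.0
    return crossings / duration_min

fps = 30.0
# Synthetic chest motion: 0.25 Hz breathing (about 15 breaths/min) for 60 s.
trace = [math.sin(2 * math.pi * 0.25 * n / fps) for n in range(int(60 * fps))]
rate = respiration_rate_bpm(trace, fps)  # roughly 15 breaths per minute
```

In practice the marker or optical-flow signal would be noisier, so a real implementation would typically filter the trace before counting cycles.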
  • one or more biometric data and/or data relating to the state of the subject can be collected while under observation in a medical imaging scanner and/or therapeutic device or system.
  • an optical marker can be attached to a particular body indicator of a subject.
  • an optical marker can be attached to the stomach or skin of a subject, or in markerless systems a landmark on the subject can be used in lieu of a marker, or group of optical flow vectors can be obtained by analyzing video frames and such marker or landmark or markerless flow vectors can be observed by one or more detection methods, in order to determine biometric data for cardiac and/or respiratory systems.
  • the system can be configured to determine an anxiety index of the subject, for example based on respiration and/or heart rate variability. Similarly, the amount of motion of the subject may indicate some other aspects from a neurological point of view.
  • in order to process the motion tracking and/or image data for analysis, such data can be transferred from the motion tracking device and/or system to a computer system configured to process the data.
  • the data for further analysis, for one or more purposes and/or features described herein, can be much larger than necessary for simply diagnosing the state of the cameras or detectors for motion tracking.
  • by looking at that larger collection of information, the system can surmise one or more biometric data and/or other data relating to the subject as described herein.
  • the larger collection of data can be used to diagnose a state of the motion tracking system or device and/or the state of the medical imaging scanner and/or therapeutic device used in conjunction with the motion tracking system or device.
  • a motion tracking system or device communicates with a medical imaging scanner’s host and/or therapeutic device or system’s host periodically, for example once per frame, blasting correction information, which can be based on the alignments of the camera and other data.
  • a motion tracking system or device can be configured to tell the scanner where the subject is relative to the target and therefore where the target is relative to what the scanner is looking for. In some embodiments, that information is in process on the scanner host against the image and that is what provides the fundamental correction.
  • the motion tracking system or device could also send additional information, such as metadata and/or additional images, around and/or outside of that frame that gives you more information than just the basic relating to was the camera properly aligned and/or was it within tolerances, etc.
  • additional information can be tagged onto the frame, for example at the host of the actual scanner or therapeutic device.
  • such additional information can be combined at a later point in time on another computer system and/or processor.
  • the system can comprise a processor that sits next to the scanner or therapeutic device on the same LAN segment that would collect this information, process it, and generate useful biometric data.
  • a processor or computer system for analyzing the additional data may not necessarily have to be physically located within the scanner. Rather, in some embodiments, the processor or computer system for analyzing the additional data can be located anywhere in the processing chain because it does not have to operate in real time relative to the image correction; the analysis could be done at any time in some embodiments. As such, in some embodiments, biometric analysis can be conducted in the cloud. In other words, the system can be configured to produce a series of interesting information which can be sent back to an appropriate location and/or system for processing.
  • such additional data and/or metadata can comprise extended data that is not necessarily and/or typically processed at the connected medical imaging scanner and/or therapeutic device having to do with tracking of the camera.
  • a subject or patient may move or breathe in between frames of a medical imaging scanner or therapeutic device. That movement of breathing can be detected in some embodiments.
  • the system can be configured to determine that the cameras are still aligned within tolerance, which can be reported, and to determine that the patient was breathing at that particular time.
  • emotion and/or anxiety can also be reported and/or determined.
  • the system can determine additional aspects of the state of the patient or subject and also the quality of the image from the additional data.
  • the additional information or metadata can comprise image data, video data, and/or information or data from a correction algorithm that is tracking the target via one or more cameras or detectors of the motion tracking system. It can be advantageous to process the data from the correction algorithm rather than video data due to the size of the data and the related computational costs in certain embodiments. If all that is being tracked is metadata, relatively small amounts of data can be transmitted, for example kilobytes per second, which can be reassembled for further analysis without actually having image and/or video data of the subject.
  • the metadata or tracking data that can be processed for further analysis may not comprise any protected health information (PHI).
  • the metadata or tracking data for further analysis does not comprise a photograph of a person, or even of a person’s chest, or even of the target; rather, the metadata or tracking data can be the relative motion of the target as seen from the view of the camera of the motion tracking system. This can be important because it can allow transferring the metadata or tracking data anywhere for processing in any way without necessarily referring back to the actual patient and/or without raising various kinds of concerns under the Health Insurance Portability and Accountability Act (HIPAA).
  • this can be further advantageous because, in order to analyze trending data to determine and/or improve the efficacy of the motion tracking and correction system in terms of its ability to do image corrections, it can be helpful to be able to correlate that data with various aspects about the patient or subject without running into HIPAA concerns. As such, the data for further analysis can be limited to data that allows the system to collect information that would be generally useful for the patient’s health and for the motion correction algorithm and its continuous improvement, without any PHI.
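To make the PHI-free metadata idea concrete, a per-frame tracking record might look like the sketch below. The field names and values are hypothetical, chosen only to show that such a record carries relative motion rather than images or identifying information:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TrackingMetadata:
    """One per-frame tracking record containing only the relative motion
    of the target as seen by a camera -- no image, name, or other PHI."""
    frame: int
    camera_id: int
    dx_mm: float    # translation of the target since the last frame
    dy_mm: float
    dz_mm: float
    rot_deg: float  # rotation of the target since the last frame
    in_tolerance: bool

record = TrackingMetadata(frame=42, camera_id=1,
                          dx_mm=0.03, dy_mm=-0.12, dz_mm=0.0,
                          rot_deg=0.05, in_tolerance=True)
payload = json.dumps(asdict(record)).encode()
# At one record per frame, the stream stays in the kilobytes-per-second range.
```

Because nothing in such a record refers back to the patient, it could be shipped to any processing location without the HIPAA handling that image or video data would require.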
  • a motion tracking and correction system can be configured to obtain raw metadata, which can be tracking data as viewed from the perspective of each of the individual cameras or detectors of the system.
  • the raw metadata does not necessarily need to be processed in real-time or even in the motion tracking and correction system itself in some embodiments. Rather, in certain embodiments, the raw metadata can be transferred from the motion tracking and correction system, for example at kilobytes per second speeds, up to the cloud and/or other processing system.
  • in certain embodiments, the raw metadata can be free of patient privacy data and thereby not raise any HIPAA concerns in processing and/or transmitting the data.
  • that data can be processed in the cloud and/or other computing system, for example by utilizing one or more algorithms in order to identify one or more various biometric parameters of interest.
  • biometric parameters can comprise heart rate, respiration, pulse rate, anxiety levels, and/or the like.
  • the additional data and/or metadata can be processed asynchronously relative to the medical imaging scan, therapeutic procedure, and/or standalone acquisition of the motion data.
  • processing of the additional data and/or metadata may not necessarily be synchronized with the frame of the medical imaging scanner, therapeutic device, and/or motion detection system in some embodiments.
  • inter-frame information can be more helpful for certain analyses in some cases.
  • Being able to process the data asynchronously can be advantageous in certain embodiments, for example because certain computational analyses can be intense to determine one or more features or characteristics. As such, it can be advantageous to be able to transmit the data to another computing system and/or location where computation is not an issue.
  • the system can be configured to analyze the additional data or metadata to compute information or conclusions about the data that may be beneficial to a doctor and/or a patient.
  • the system can be configured to analyze the additional data or metadata to compute information about the scanner or a therapeutic device and its operation, and whether the correction is having positive effects over time.
  • the additional data or metadata can be used to determine whether the motion tracking and correction system is performing consistently, whether the medical imaging scanner or therapeutic device is taking the information from the motion tracking and correction system and processing it correctly, whether certain parameters are changing over time, whether a service call is required, whether some sort of intervention is required in terms of the data path, or the like.
  • the additional data or metadata is transmitted from the motion tracking and correction system through a network connection to a connected medical imaging scanner or therapeutic device.
  • the motion tracking and correction system may not have a network connection to anything else.
  • the scanner or therapeutic device host can be configured to determine and transmit the message or metadata to an appropriate processing unit for processing.
  • the processing unit configured to analyze and process the metadata or additional data to determine biometrics and/or other information about the subject or patient can be a computing unit, standalone computer system, in the cloud, or the like.
  • the scanner or therapeutic device host is configured not to handle or manipulate the metadata or additional data.
  • the scanner or therapeutic device host may, in some embodiments, encrypt the metadata or additional data as an intermediary prior to sending to a processing unit.
  • this can be the case where the motion tracking and correction system only has one interface with the scanner or therapeutic device host, while the scanner or therapeutic device host can have a plurality of interfaces.
  • the scanner or therapeutic device host can have a local interface as well as a broader network interface that can connect to the Internet or other systems, while the motion tracking and correction system may not.
  • the additional data or metadata can be sent from the motion tracking and correction system to the medical imaging scanner and/or therapeutic device host as a messaging package, such as syslog or rsyslog.
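As a rough illustration of the messaging-package idea, the sketch below formats a metadata payload as an RFC 5424-style syslog line of the kind an rsyslog rule on the scanner host could forward to a processing unit. The facility, app name, and payload layout are assumptions for illustration, not the application's actual message format:

```python
import datetime
import socket

def syslog_message(app_name, msg_id, payload):
    """Format a metadata payload as an RFC 5424-style syslog line.
    Uses facility local0 (16) and severity informational (6), so
    PRI = 16 * 8 + 6 = 134."""
    pri = 16 * 8 + 6
    ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
    host = socket.gethostname()
    # VERSION 1, no structured-data element ("-") in this sketch.
    return f"<{pri}>1 {ts} {host} {app_name} - {msg_id} - {payload}"

line = syslog_message("motion-tracker", "FRAME", "frame=42 dx=0.03 dy=-0.12")
```

A forwarding rule on the host would then relay lines from this app name to the processing unit without the host itself parsing the payload.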
  • the motion tracking and correction system and/or the medical imaging scanner and/or therapeutic device can run Linux (for example, Ubuntu).
  • the medical imaging scanner and/or therapeutic device is configured to recognize the additional data or metadata and simply relay and not process the additional data or metadata; rather, the medical imaging scanner and/or therapeutic device can be configured to wait to receive and process only correction data related to adjusting for motion of the subject, for example sequence and/or focus correction of the scanner and/or therapeutic device.
  • the real-time data stream that the motion tracking and correction system and the medical imaging scanner and/or therapeutic device share can comprise all the correction information.
  • the medical imaging scanner and/or therapeutic device host can receive this data and correct the image scan or therapeutic procedure accordingly.
  • the metadata or additional data described herein for determining biometric information for example can simply be stored on the motion tracking and correction system, and can be collected in the file, and it can be turned on and/or off for debugging purposes.
  • both syslog or rsyslog and UDP can be utilized.
  • a real-time data stream assigned for providing the data the medical imaging scanner and/or therapeutic device needs now in order to correct the frame can be blasted via a UDP broadcast.
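A hypothetical per-frame correction packet of the kind that could be blasted via UDP broadcast might look like the following. The packet layout (a frame counter plus a 6-DOF pose update) is an illustrative assumption, not the actual wire format:

```python
import struct

# Network byte order: uint32 frame number, then six float32s
# (three translations in mm, three rotations in degrees).
PACKET_FMT = "!I6f"

def pack_correction(frame, pose):
    """Serialize one per-frame correction update into a fixed-size packet."""
    return struct.pack(PACKET_FMT, frame, *pose)

def unpack_correction(data):
    """Parse a correction packet back into (frame, pose)."""
    frame, *pose = struct.unpack(PACKET_FMT, data)
    return frame, tuple(pose)

pkt = pack_correction(1001, (0.1, -0.2, 0.0, 0.05, 0.0, -0.01))
# To actually broadcast: open a socket.socket(AF_INET, SOCK_DGRAM),
# enable SO_BROADCAST, and sendto(pkt, (broadcast_addr, port)) each frame.
```

A fixed 28-byte packet per frame keeps the real-time channel tiny and trivially parseable on the scanner host.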
  • additional information or metadata that was collected from the motion tracking and correction system during this particular period of operation, for example including inter-frames that can be further processed to determine biometric data among other things, can be forwarded by the medical imaging scanner and/or therapeutic device as a messaging package to a processing unit.
  • such additional information or metadata can be transmitted to another computing unit, other than the medical imaging scanner and/or therapeutic device, that is connected to the motion tracking and correction system, for example over a local network, which can then forward the data, for example as a messaging package, to a processing unit.
  • the additional information or metadata is sent to the medical imaging scanner and/or therapeutic device host or other computing unit using a messaging socket, and then the medical imaging scanner and/or therapeutic device host or other computing unit can be configured to forward and/or redirect the additional information or metadata to a desired processing unit based on a rule that was installed.
  • the data transmitted, such as the syslog messaging packet, in addition to the frame information and/or other collected data from the subject, can also comprise the IP address of where the data came from for validation and/or correlation purposes.
  • such ancillary information can be carried in the payload of this message, the frame information, and/or any other data that the motion tracking and correction system generates and/or the medical imaging scanner or therapeutic device includes.
  • the information of the subject collected by the motion tracking and correction system is continuous and/or periodic and can include motion and/or other information that happens between frames of the medical imaging scanner and/or therapeutic device. Because of this fact, in some embodiments, the information of the subject collected by the motion tracking and correction system can be aligned with a specific frame and/or a specific set of frames. For example, the information collected can be tied to two particular frames, which may allow observation of respiration that can occur between the two frames. In contrast, correction data that can be used to adjust the medical imaging scanning and/or therapeutic procedure parameters can be frame-centric.
  • Alignment of the collected information with a specific frame and/or a specific set of frames can be amorphous and/or may depend on the type of processing of interest and/or what sort of beginning and ending would be appropriate. In some embodiments, it can be advantageous to begin collection of information when the motion tracking and correction system is initially brought into the picture and set up. Further, in certain embodiments, it can be advantageous to end the data collection process when the subject or patient is removed from the medical imaging scanner and/or therapeutic device and there is no longer any information associated with the subject or patient’s quiescent state.
  • the information collected does not need to be packetized in any particular way.
  • the information collected may be tied to time, whereas it may not be tied to time in other embodiments.
  • the information collected may or may not be tied to specific events that are taking place either in the motion tracking and correction system and/or the medical imaging scanner or therapeutic device.
  • the motion tracking and correction system can be in direct communication with the processing unit.
  • the motion tracking and correction system may be configured to directly transmit the additional data or metadata to the processing unit without going through the medical imaging scanner or therapeutic device.
  • the additional data or metadata can be configured to be transmitted directly from the motion tracking and correction system to a computing system outside of the scanner or therapeutic device for further processing.
  • the motion tracking and correction system can comprise multiple network interfaces, such as a local interface and a broader network interface.
  • the motion tracking and correction system can comprise only a local interface, in which case the local segment on which the motion tracking and correction system and the medical imaging scanner and/or therapeutic device sit comprises a processing element.
  • all metadata and/or additional information can be stored in the motion tracking and correction system, which can be downloaded or transferred to a processing unit, for example at a later point in time via some channel.
  • the system can comprise an additional computer connected to the same LAN structure; instead of the metadata or additional information being sent to the medical imaging scanner and/or therapeutic device host to be sent out to the internet to be processed, the additional computer could receive it directly off the LAN and process it directly there.
  • the LAN host can refer to the processor that runs the medical imaging scanner and/or therapeutic device, and the motion tracking and correction system can communicate with the LAN host via a LAN interface.
  • the LAN includes the motion tracking and correction system and the medical imaging scanner and/or therapeutic device, and the host of the medical imaging scanner and/or therapeutic device can comprise another interface to some other network connection which can ultimately allow communication with the Internet, remote service, and/or the like.
  • the system can comprise an additional computer connected to the same LAN structure, and instead of sending the metadata or additional information to the medical imaging scanner and/or therapeutic device host to be sent out to the internet to be processed, the additional computer can receive the metadata or additional information and transmit the same to the Internet to be processed.
  • the system can comprise two or more different data paths.
  • one data path can be for the actual correction data, which would go from the motion tracking and correction system to the medical imaging scanner and/or therapeutic device for adjusting the frame.
  • Another data path can be for the metadata or additional data, which can go from the motion tracking and correction system to any computing unit, either directly or indirectly through the medical imaging scanner and/or therapeutic device as a way to get off of the local area network to somewhere else.
  • the local area network can connect the motion tracking and correction system, the medical imaging scanner and/or therapeutic device, and/or an additional computing unit in some embodiments, for example through a piece of LAN infrastructure, such as a hub or switch.
  • the system can comprise a hub in which the hub can have a direct connection to the motion tracking and correction system, medical imaging scanner or therapeutic device, additional computing unit, and/or the cloud.
  • the logging facility on the motion tracking and correction system can be utilized at either point at the medical imaging scanner or therapeutic device acting as a gateway or the additional computing unit as desired.
  • the motion tracking and correction system can come online and/or begin a session when a subject or patient is placed in a connected medical imaging scanner and/or therapeutic device and one or more detectors of the motion tracking and correction system identify the target.
  • the one or more detectors can be undergoing processing both to self-align, making sure that the detectors are visualizing the target and that the subject is properly aligned, and to observe various kinds of motions and other aspects occurring within the environment that can be collected and output as metadata.
  • the metadata can be collected, and immediately or shortly afterwards, but not tied to the operation of the scanner, can be utilized in some form of a correlation analysis.
  • some or all correlation analyses can occur after a session has ended, for example by processing in bulk.
  • some or all correlation analyses can be processed as the medical imaging scan and/or therapeutic procedure is taking place.
  • the system can be configured to determine and/or process one or more biometrics analysis, such as respiration and/or EKG-type cycle, during the medical imaging scan and/or therapeutic procedure.
  • the system can be further configured to generate an electronic data representation of one or more biometric data.
  • one or more biometric data can be imputed from observing the subject or patient in the medical imaging scanner and/or therapeutic device.
  • the session can continue to operate during the time that the medical imaging scanner and/or therapeutic device is operating and/or when it is not operating so that the system can observe subject movement when the subject is agitated and/or when no scanning or therapeutic procedure is occurring.
  • the session may end and/or continue until the subject or target actually moves out of the field of view or moves enough to indicate that the session is complete.
  • the system can be configured to conduct some batch processing on the record of that session, for example to determine some long-term trend that may require more computation.
  • the system can be configured to conduct real-time processing of the metadata to extract certain elements, such as respiratory or heart rate, which may have some real-time information about the scanning and/or therapeutic treatment, as well as possibly conduct post-scanning or post-treatment processing for less time-sensitive mission protocol data points and/or biometric data points.
  • the system can be configured to produce some feedback data relating to how efficient or effective the corrections were.
  • the system can be configured to produce a validation metric regarding how effective or ineffective the corrections were either as an individual session and/or as part of a larger trend.
  • Such feedback data can be used to improve the correction algorithm of the motion tracking and correction system in some embodiments.
  • one or more cameras or detectors of the motion tracking and correction system are configured to record video streams.
  • Such video streams can be transmitted through a plurality of full channels of HDMI data.
  • the system can comprise a plurality of data channels, for example the HDMI channel data for real-time video streams, UDP channel for real-time tracking and correction data for the medical imaging scanner and/or therapeutic device, and a syslog channel for metadata or additional data.
  • a problem can arise in that the plurality of data channels may need to be synchronized.
  • time and/or network time protocol can be utilized to synchronize the plurality of data channels.
  • the medical imaging scanner and/or therapeutic device can be configured to use an NTP server to provide time synchronization information.
  • the medical imaging scanner and/or therapeutic device can be configured to get real synchronized time via the Internet.
  • the motion tracking and correction system can be synchronized with that time, for example through a local connection to the medical imaging scanner and/or therapeutic device, to obtain a correlated amount of time.
  • the motion tracking and correction system data log can comprise actual time and date stamps. As such, time relevant data of the plurality of data channels can be tied back to an actual time for synchronization.
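With every host disciplined by the same NTP source, each data channel can stamp its records in UTC and the streams can then be merged on time alone. The channel names, timestamps, and record layout below are illustrative assumptions, not actual system details:

```python
from datetime import datetime, timedelta, timezone

t0 = datetime(2019, 3, 4, 12, 0, 0, tzinfo=timezone.utc)

# Three channels stamping records against the same synchronized clock:
# ~30 fps video frames, UDP tracking updates, and syslog metadata.
video = [(t0 + timedelta(milliseconds=33 * i), "video", i) for i in range(3)]
tracking = [(t0 + timedelta(milliseconds=50 * i), "udp", i) for i in range(3)]
metadata = [(t0 + timedelta(milliseconds=80 * i), "syslog", i) for i in range(2)]

# Because the timestamps share one time base, a simple sort yields
# a single synchronized timeline across all channels.
merged = sorted(video + tracking + metadata)
```

The same date-and-time stamps in the tracking data log make it possible to tie any record back to a moment in the scan after the fact.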
  • the system can be configured to conduct some processing of the metadata or additional data live in real-time and/or substantially in real-time and/or within minutes of the medical imaging scanning and/or therapeutic treatment.
  • one of such features that can be processed in real-time, substantially in real-time and/or within minutes of the medical imaging scanning and/or therapeutic treatment can be quantifiable movement of a subject or patient, for example as a measurement or a graph of how much the subject or patient is moving during the scan or treatment.
  • Another example can be whether the marker is actively trackable or not by the system, for example as binary, i.e. does the system see the marker or target or not.
  • Another example can be an indication of whether the marker or target was lost during the scan or treatment or was continuously tracked.
  • Such examples can provide immediate benefits during a scan or treatment, for example to know whether the target or marker is being tracked and/or whether the target or marker was continuously visible throughout the scan or treatment.
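The per-frame movement measure and marker-visibility flag described above can be sketched as follows. The function and the data layout (per-frame 3-D marker positions, with `None` when the marker was not seen) are illustrative assumptions:

```python
import math

def summarize_tracking(positions):
    """Given per-frame marker positions (None when the marker was not
    seen), return per-frame motion magnitudes, per-frame visibility
    flags, and whether the marker was ever lost during the scan."""
    visible = [p is not None for p in positions]
    magnitudes = []
    prev = None
    for p in positions:
        if p is None or prev is None:
            magnitudes.append(0.0)  # no displacement measurable this frame
        else:
            magnitudes.append(math.dist(p, prev))
        if p is not None:
            prev = p
    ever_lost = not all(visible)
    return magnitudes, visible, ever_lost

# Marker seen, seen, lost for one frame, then reacquired:
mags, vis, lost = summarize_tracking(
    [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), None, (0.1, 0.2, 0.0)]
)
```

The magnitude series could drive a live "how much is the patient moving" graph, while the visibility flags give the binary seen/not-seen indication during the scan.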
  • if the subject or patient moved a lot, for example above a predetermined level, that can trigger review of images and/or modification of the scan or treatment.
  • one or more processed outputs of metadata or additional data can be configured to be displayed to a user.
  • the motion tracking and correction system can comprise two optical markers and two cameras per marker, wherein raw output and/or processed output from all or some of such cameras can be reflected in the same display.
  • raw output and/or processed output from each individual camera with individual markers can be displayed.
  • a consolidated view of raw output and/or processed output from all cameras can be displayed.
  • the system can be configured to conduct some processing of the metadata or additional data in real-time, substantially in real-time, within minutes of the medical imaging scanning and/or therapeutic treatment, and/or within hours, days, or years of the medical imaging scanning and/or therapeutic treatment.
  • Some non-limiting examples of such features that can be processed include position over time, heart rate, respiratory rate, blood pressure, non-invasive blood pressure (NIBP), position of the subject, whether the subject is touching the side of the bore, estimated body mass index (BMI) or estimated weight, overall length of the patient, facial recognition (for example, for identification purposes), and identification of a bar code, for example on the patient’s arm, for identification purposes.
  • the system can be configured to process the metadata or additional data to obtain biometric data and/or rich ancillary biometric data.
  • biometric data and/or rich ancillary biometric data can be stored, analyzed, and/or uploaded for current and/or future processing.
  • the system can be configured to process the metadata or additional data to generate a signal representative of the current state of the subject or patient.
  • the system can be configured to generate an EKG-type signal.
  • the generated signal, which can be denoted ballistic data, can be based on rich ancillary biometric data.
  • this ballistic data or generated signal can be a function of the fluid and respiratory dynamics. For example, while the heart might not be beating the same strength from one beat to another, this generally will not be reflected in an EKG; in contrast, the signal generated according to the systems herein can reflect this change, because the signal can be generated based on the fluid pressure impulse, for example by looking at the skull.
  • the system can be configured to observe the neck of a subject to see the pulse in the veins in the neck. As such, the system can be configured to obtain a more beat-by-beat variation on the aggressiveness of cardiac output than what is currently measured by an EKG trace.
  • breathing generally does not remain constant from beat to beat. For example, one can take a deep breath or a shallow breath.
  • the rate and/or depth of breathing can be determined by processing movement of the mouth and the nostrils of a subject, how much the head goes up and down, how much the chest expands or contracts, or the like. Other patient or subject movement may be monitored and/or processed to determine the rate and/or depth of inspiration.
  • certain embodiments herein can provide improved noninvasive cardiology tracking by combining magnetic resonance (MR) scans with rich ancillary biometric data and/or ballistic data, for example by observing and processing data relating to the chest position of the subject during the MR scan.
  • the chest position that is tracked during the MR scan can include absolute position, such as in XYZ coordinates, and also changes in the chest contour and/or how hard the heart is beating.
  • some embodiments are configured to correlate the depth of inspiration specifically to non-invasive cardiology. This can be a substantial improvement over binning the data in both CT and MR, which can be due to the fact that current CT and MR technologies themselves are not gated to synchronize with movement of the heart in real time.
  • certain embodiments herein can actually correlate heart movement to provide more accurate non-invasive cardiology assessment.
  • the motion tracking and correction system can be configured to correct, for example prospectively correct, artifacts in medical images caused by movement of the subject of the medical imaging scan.
  • a movement created partial volume artifact or motion initiated partial volume artifact can be created when a subject of a medical imaging scan, such as CT, MR, PET, SPECT, angiography, or the like, moves during the scan.
  • when a subject of a medical imaging scan moves during the scan, one pixel or voxel that is supposed to be fat and another pixel or voxel that is supposed to be muscle can be moved over each other during image acquisition, thereby ending up with an average of the two.
  • a single count can be distributed across a range of pixels or voxels rather than all being put back into the original pixel or voxel, which leads to a sampling error or reconstruction artifact.
  • movement created partial volume artifacts or motion initiated partial volume artifacts can occur in images obtained through CT, MR, PET, SPECT, angiography, or the like.
  • certain embodiments of the motion tracking and correction system can be configured to move some obtained data back to the correct pixel or voxel.
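A minimal one-dimensional sketch, for illustration only, of moving acquired counts back to their originating voxels; it assumes integer voxel displacements measured by the tracking system, and the function name and data layout are not from the disclosure:

```python
def correct_counts(counts, shifts):
    """Move each acquired count back to the voxel it originated from.

    counts: list of (voxel_index, value) pairs as acquired.
    shifts: per-acquisition integer voxel displacement measured by the
            tracking system (positive = subject moved in +index direction).
    Returns a dict mapping corrected voxel index -> summed counts."""
    corrected = {}
    for (idx, value), shift in zip(counts, shifts):
        orig = idx - shift          # undo the measured displacement
        corrected[orig] = corrected.get(orig, 0) + value
    return corrected
```

Two counts acquired at voxels 5 and 6, the second after a +1 voxel shift, are both restored to voxel 5 rather than being averaged across neighbors.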
  • the motion tracking and correction system can be configured to monitor the velocity of motion of the subject and prevent or cause to prevent the scanner, for example a CT scanner, from scanning when the velocity is over a certain predetermined level because the image obtained would likely be blurry.
  • the motion tracking and correction system can be configured to monitor the velocity of motion of the subject and cause the scanner, for example a CT scanner, to scan the subject if and when the velocity of subject movement is or decreases below a predetermined level.
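The velocity-based gating in the two bullets above could be sketched, purely for illustration, as a simple predicate over a stream of velocity estimates; the 5 mm/s limit and function names are assumptions:

```python
def scan_permitted(velocity_mm_s, max_velocity_mm_s=5.0):
    """Gate the scanner: acquisition is allowed only while the tracked
    subject velocity stays at or below the predetermined limit."""
    return velocity_mm_s <= max_velocity_mm_s

def gate_acquisition(velocities, max_velocity_mm_s=5.0):
    """Return per-sample True/False acquisition decisions for a stream
    of velocity estimates from the motion tracking system."""
    return [scan_permitted(v, max_velocity_mm_s) for v in velocities]
```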
  • the system can be configured to acquire biometric data of a subject undergoing a medical scan and/or therapeutic procedure based at least in part on raw motion tracking data, such as video and/or image data.
  • the system can further be configured to obtain a cardioballistics (also called “ballistocardiography” or BCG) and/or cardiac pulsation signal and/or blood pressure signal from the biometric data.
  • the system can be configured to analyze the veins and/or skin and/or hair and/or clothing and/or heat spots and/or skin marks and/or tattoos and/or other movements, topographical changes, infrared and hyper spectral changes and/or micromovements of the face, head, and/or neck to obtain a cardioballistics and/or cardiac pulsation signal and/or blood pressure signal.
  • cardioballistics can relate to a matter of how much the head moves and/or how much the vessels distend.
  • biometric data can be further analyzed by the system to create a biometric output or new signal, such as ballistic data.
  • Such output or signal can be indicative of the patient’s health while the patient is undergoing a medical scan and/or therapeutic procedure.
  • biometric data and/or data from the output or signal can be fed back into the medical scanner and/or therapeutic device in order to ensure that the detectors, cameras, and/or therapeutic device are not only focused on the right area of the patient’s body but also take a picture at the right time in view of the internal motion of the organs of interest.
  • in addition to looking at outer body motion and using that outer body motion to refocus the medical scanner or therapeutic device, making sure it takes a picture of the body from the right perspective so that the images stay clear and, in the case of an MRI scanner, attributing signal to the appropriate voxels, an improved system can use the biometric data to take an even better image.
  • an EKG signal can be used to track the heart’s own internal motion, which assumes that the heart beats consistently from one beat to the next. However, this may not always be the case, and as such EKG may not be an accurate representation as it relates generally only to an electrical signal.
  • a more accurate system can be provided that can be configured to detect and/or track topological, positioning, and infrared or hyper spectral data to better infer movement of the heart or other organ of interest with respect to the rest of the body, and to use both the biometric data (for example, topological data) and the electrical signal.
  • Such systems can be further configured to identify ballistic data, such as cardio ballistic data, that can provide improved insight compared to EKG signals.
  • ballistic data can be used to enable a medical imaging scanner to take better images and/or focus or refocus an image and/or enable a therapeutic device to perform more effective treatment.
  • the system can be configured to ensure or at least better ensure that an image of the organ of interest is being taken at the right time, for example because at certain times the internal organ can be in the exact same spot or substantially same location relative to the rest of the body compared to when the last snapshot was taken or to better estimate cardiac output which can change from one beat to the next.
  • the heart may be 50 percent expanded when a first snapshot is taken, in which case on the second snapshot one may want the heart to be at the 50 percent expansion position as well in order to obtain a clearer image between slices.
  • the system can, in some embodiments, utilize both the electrical signal of the heart, for example using EKG, and infrared, hyperspectral, pose, as well as topographical data to provide improved estimation of the positioning and status of the heart, for example using raw motion data and/or biometric data derived therefrom.
  • the system can be configured to capture a better image of the brain by accounting for not only head outer body position, but also pulsation of the CSF or blood in or near the brain in between pulses.
  • the system can be configured to detect where the organ is with respect to the rest of the body as well as the stage of compression and/or electrical signal.
  • the system, in certain embodiments, can be configured to refocus a medical imaging scan slice, change imaging parameters such as modifying an MR pulse sequence, or change timing or signal attribution to a given imaging voxel in any modality, including but not limited to PET, CT, MR, MR/PET, MR/CT, and SPECT, and/or therapeutic procedure depending on subject movement, and/or adjust the timing thereof alone or correlated with other previously acquired data or data yet to be acquired.
  • the system can be configured to continuously collect data and throw away some of the data that was collected when the organ of interest is in a different state or location.
  • the system can be configured to utilize post-processing analysis to correct for subject movement.
  • the system can be configured to prospectively correct for subject movement, for example by developing a model that predicts what the heart or other organ of interest will do in terms of movement, compression, or the like, based on data that the system captured previously. Based on such prediction, the system can be configured to determine whether or not to discard and/or capture data.
  • the system can be configured to conduct a sentiment analysis and/or generate an alert to the technician that is conducting the medical imaging scan and/or therapeutic procedure.
  • the system can be configured to utilize the biometric data to acquire sentiment data to generate a medical alert to a technician to interact with the subject, for example if the subject is getting agitated or nervous or unconscious or non-responsive or the like.
  • the system can be configured to give feedback to the subject based on the sentiment analysis.
  • the system can generate a calming mechanism, such as audio and/or video signals to relax the subject, if the system determines that the subject is agitated.
  • the system can use the biometric data to generate a plan or roadmap for the subject to follow in order to obtain clearer images and/or better therapeutic results.
  • the system can be configured to generate a graphical representation, such as a line graph comprising one or more peaks and valleys, which can correspond to when the subject should breathe in or out.
  • Such a roadmap, such as a breathing pattern, can be displayed to the subject.
  • such roadmap can be presented to the subject via audio.
  • one or more cameras or detectors of the system can be configured to detect the subject’s breathing pattern, based on which the system can determine how closely the subject is breathing in real time according to the planned or roadmap breathing cycle.
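One way (illustrative only; not the disclosed implementation) to score how closely a measured breathing trace follows the roadmap trace is a mean-absolute-error score normalized by the roadmap amplitude; the function name and scoring scheme are assumptions:

```python
def adherence_score(target, measured):
    """Compare a measured breathing trace against the roadmap trace
    (both sampled at the same instants); 1.0 = perfect adherence,
    0.0 = error as large as the roadmap's own amplitude."""
    assert len(target) == len(measured)
    amplitude = max(target) - min(target)
    if amplitude == 0:
        return 1.0
    mae = sum(abs(t - m) for t, m in zip(target, measured)) / len(target)
    return max(0.0, 1.0 - mae / amplitude)
```

Such a score could drive the real-time feedback to the subject described above.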
  • the system can be configured to provide a gamification aspect to the subject of a medical imaging scan and/or therapeutic procedure.
  • the system can be configured to display a dot or other graphical representation to the subject on a screen or other display on the motion tracking device, medical imaging scanner, and/or therapeutic device.
  • such dot or other graphical representation can move off center if the subject moves his head off center. If and/or when the subject moves his head back to the center or near or to an ideal location, the dot or other graphical representation can be displayed again, on center for example.
  • the system can be configured to keep the patient still, focused, engaged and/or distracted by providing a game aspect in order to obtain clearer medical images during a scan and/or more effective therapeutic results.
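The dot-on-screen gamification above could be sketched, as an assumption-laden illustration, by mapping the tracked head offset from isocenter to a clamped on-screen dot offset; the gain and screen size are arbitrary:

```python
def dot_position(head_offset_mm, screen_half_width_px=200, gain_px_per_mm=10):
    """Map the tracked head offset from isocenter (mm) to an on-screen
    dot offset (px), clamped to the display edge. A centered head puts
    the dot back at 0 (screen center)."""
    px = head_offset_mm * gain_px_per_mm
    return max(-screen_half_width_px, min(screen_half_width_px, px))
```

When the subject recenters the head, the computed offset returns to zero, redisplaying the dot on center as described.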
  • data and/or tracking data collected by the system can comprise tracking data of motion of a subject or landmark thereof, subject skin motion, subject jaw motion, pupillary dilation, sentiment data, hyperspectral data, respiratory ballistics, video data, image data, audio data, temperature or thermal data of the subject, atmospheric data, light data, infrared data, and/or time associated with any of the aforementioned data.
  • the video data or image data can be synchronized by tying with time data as discussed herein.
  • one or more additional data mentioned herein, such as thermal data can also be tied to video or image data and/or any other data.
  • the system can be configured to link and/or synchronize multiple streams of data, for example using time data.
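Linking multiple data streams via time data, as described above, might be illustrated by nearest-timestamp alignment onto a master clock (a sketch under assumed names; the disclosure does not specify this algorithm):

```python
import bisect

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t (timestamps sorted ascending)."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return values[best]

def synchronize(master_times, streams):
    """Align several (timestamps, values) streams onto the master clock:
    for each master timestamp, pick each stream's nearest sample."""
    return [
        tuple(nearest_sample(ts, vs, t) for ts, vs in streams)
        for t in master_times
    ]
```

For example, thermal samples and video frames time-stamped on separate clocks could each be resampled onto the scanner's timeline this way.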
  • the system can be configured to capture and/or process every possible data point, such as a continuous stream of images for image or video data from every second.
  • the system can be configured to only take and/or process a subset of the possible data points.
  • the system can be configured to only process every other frame or every other second and/or compare one frame to another to determine whether there is a difference between the two frames.
  • the system can be configured to only evaluate certain parameters in real-time such as respiratory and cardiac data to save processing power, and/or perform post or delayed processing of other parameters to provide other biometric data.
  • Such near real-time systems can be advantageous because the lower processing requirements may allow the system to more easily process frames in real time during the medical imaging scan or therapeutic procedure.
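The subsampling and frame-comparison ideas above can be sketched as follows (illustrative only; tolerances and names are assumptions):

```python
def frames_differ(frame_a, frame_b, pixel_tol=0, count_tol=0):
    """Cheap change detection: compare two grayscale frames (flat lists of
    pixel values) and report whether more than count_tol pixels differ
    by more than pixel_tol."""
    changed = sum(1 for a, b in zip(frame_a, frame_b) if abs(a - b) > pixel_tol)
    return changed > count_tol

def select_frames(frames, step=2):
    """Process only every other frame (step=2) to cut processing load."""
    return frames[::step]
```

A pipeline might skip full processing whenever `frames_differ` reports no meaningful change from the last processed frame.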
  • the system can be configured to gamify an aspect of the medical imaging scan and/or therapeutic procedure for a subject, for example to help focus the subject to remain still and/or to maintain a particular portion of the subject’s body within a particular frame.
  • the system can be configured to process the tracking data or raw data obtained by the system and generate and provide feedback to the subject to improve the quality and/or effectiveness of the medical imaging scan and/or therapeutic procedure.
  • the system can comprise a display or audio speakers to provide feedback to the subject.
  • the display can be part of an add-on motion tracking device, medical imaging scanner, and/or therapeutic device.
  • the display can be an LCD, LED, fiber-optically transmitted image, and/or other display.
  • the display can be configured to show a dot or other visualization to the subject that can be indicative of the quality of the subject’s location.
  • the system can be configured to display a little ball that moves one way or another depending on the location of interest of the subject, such as the subject’s head.
  • when the subject’s head is in the center or substantially centered, the ball can be displayed in a particular location or color.
  • if the subject’s head moves off center, the ball can be displayed in another location or color, which can motivate the subject to maintain the subject’s head in the center or substantially center.
  • the system can comprise a gamification feature that can help keep the subject focused and/or stable.
  • the system can be configured to display a video or image and/or play audible sounds that are calming or entertaining to the patient, and/or that are configured to calm and/or entertain the subject such that the subject can maintain a stable orientation and/or location.
  • Such audio may be routed through an already existing headphone or other audio system.
  • a display or graphical representation of the orientation and/or location of the subject can be shown to a technician, who can then prompt the subject to move in a certain direction or orientation.
  • the imaging matrix of the system can be rectangular or a grid. As the location of the subject or area of the subject of interest starts to go off axis, the particular area of interest may not fall into the pixels or voxels as neatly. As such, it can be advantageous to keep the subject or area of the subject of interest, such as the subject’s head, in the center or isocenter for highest resolution. Gamification technologies or processes as described herein can help achieve this goal by encouraging the subject to consciously decrease movement.
  • the system can be configured to detect and/or measure the respiratory rate of the subject, other respiratory feature, and/or other biometric data to determine the calmness of the subject. For example, the rate of change or some aspect of calming breath or consistent respiration can be indicative of the calmness of the subject.
  • Such data points and/or analysis results can also be part of the feedback mechanism.
  • the system can be configured to provide one or more visual and/or audio signals to calm the subject, such as a prerecorded and/or computer simulated calming voice that tells the subject to hold still and/or hold his or her breath.
  • the system can be configured to provide such audio and/or visual signals, such as compliance instructions, in a plurality of languages.
  • the system can be configured to generate a ballistic signal, such as a cardio ballistic signal, as described herein.
  • a cardio ballistic signal can comprise a waveform, which is a measurement of the recoil forces of the body in response to the ejection of blood from the heart and movement of the blood through the vasculature.
  • the system can be capable of actually differentiating strong heartbeats from light heartbeats and have a better inference of cardiac position and/or output.
  • the system can be configured to detect and/or generate a ballistic trace of the heart.
  • cardio imaging using a MR scanner can comprise dividing a cycle into a number of segments or slices, such as binning.
  • binning herein is intended to be construed broadly to refer to the general process of dividing an acquisition cycle into a number of segments or slicing and reconstructing the data.
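A hedged, minimal sketch of binning as defined above — assigning each acquisition event to a cardiac-phase segment between detected R-peaks; the function signature, bin count, and phase convention are illustrative assumptions:

```python
def bin_by_cardiac_phase(event_times, r_peaks, n_bins=8):
    """Assign each acquisition event to a cardiac-phase bin.

    event_times: acquisition timestamps (s).
    r_peaks: sorted R-peak timestamps (s) bracketing the events.
    Returns a list of bin indices (0..n_bins-1), or None for events
    outside any complete R-R interval."""
    bins = []
    for t in event_times:
        assigned = None
        for i in range(len(r_peaks) - 1):
            start, end = r_peaks[i], r_peaks[i + 1]
            if start <= t < end:
                phase = (t - start) / (end - start)   # 0..1 within the cycle
                assigned = min(int(phase * n_bins), n_bins - 1)
                break
        bins.append(assigned)
    return bins
```

Reconstruction would then pool the data sharing a bin index across many heartbeats.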
  • respiration can be different from one heartbeat to another, such as for example due to lack of breath or the like. This can mean that the heart position may vary, for example by one or more centimeters, because of the depth of expiration or the like.
  • certain embodiments described herein are configured to detect and/or determine respiratory data or traits to generate more accurate data relating to the heart.
  • the system can be configured to take into account the fact that there can be at least an indirect relationship between the actual pulse rate and respiration. For example, based on previously detected data of the subject, the system can be configured to determine that a higher pulse rate can be indicative of an anxiety level of the subject, which can mean that the subject is more likely to have an irregular respiratory rate. In some embodiments, in determining one or more cardio characteristics of a subject and/or any other biometric data, the system can be configured to detect, track, determine, combine and/or take into account the oxygenation level, respiratory rate, cardiac rate, and/or the like of the subject.
  • the system can be able to determine the location of a particular organ of a subject, such as the heart.
  • the system can be configured to estimate the location of a particular organ of a subject, such as the heart, via electrical signals.
  • the system can be configured to estimate a location of a subject’s heart based on how hard the subject’s heart is beating.
  • the system can be configured to generate an improved inference of where the heart or other organ of interest (such as but not limited to liver, kidneys, or spleen) is located and/or how to get the heart or other organ of interest in exactly the same spot from one beat to the next compared to using electrical trace alone.
  • the significance in improvement can be more pronounced, for example, in people with irregular rhythms, either regularly irregular or irregularly irregular rhythm. This can be because of binning or the fact that a medical imaging scanner may need to collect data over a plurality of beats, which would require the heart to be in the same precise location and in the same state of expansion or contraction to get statistically meaningful data to generate a clearer image.
  • the system can be capable of decreasing the associated blurring and uncertainty arising from reconstructing data collected over a period of time.
  • the noise can be decreased in some embodiments, and adding the electrical signal can provide the system with a greater signal relative to the noise.
  • the organ of interest can be the heart, liver, kidneys, gallbladder, and/or any other organ that can move within a subject, for example due to respiration and/or due to its own movement such as the heart.
  • the system can be configured to incorporate the three-dimensionality of the location of a particular organ of interest.
  • a system that relies solely on an electrical signal can be two-dimensional, whereas an object of interest, such as the heart, is generally three-dimensional. Movement and/or location of an organ in three dimensions can be difficult to capture with an electrical signal alone and may require ballistic and physiologic modeling.
  • the system can be configured to first determine if there is beat-to-beat variability in the cardiac output. For example, a subject with a regularly irregular rhythm can have a first regular beat, a second quick beat, third regular beat, and a fourth quick beat.
  • Some systems as described herein can be configured to generate a ballistic trace of the heart and determine that the cardiac output is vastly different between the first and second beats and/or between the third and fourth beats. Based on such determination, the system can be configured to skip every quick beat, wait for every regular beat and only take a medical imaging scan slice or apply a therapeutic procedure for every regular beat. This can allow the system to only take slices of when there is a higher likelihood that the heart or organ of interest is in the same position, thereby prospectively adjusting for potential artifacts.
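The skip-the-quick-beat strategy above could be illustrated by triggering only on beats whose R-R interval stays near the running median; the 15% tolerance and function name are assumptions, not part of the disclosure:

```python
import statistics

def regular_beat_trigger(rr_intervals, tolerance=0.15):
    """Per-beat acquisition decision: trigger only on beats whose R-R
    interval is within a fractional tolerance of the running median,
    skipping the 'quick' beats of a regularly irregular rhythm."""
    triggers = []
    for i, rr in enumerate(rr_intervals):
        median = statistics.median(rr_intervals[:i + 1])
        triggers.append(abs(rr - median) <= tolerance * median)
    return triggers
```

For a regularly irregular pattern alternating 1.0 s and 0.6 s intervals, only the regular beats would trigger a slice.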
  • the system can be configured to continuously obtain image scan slices to a subject and retrospectively go back and reconstruct using only those data points in which there is a higher likelihood that the heart or organ of interest is in the same position either on an absolute basis or relative to the known phases of a cardiac cycle in either regular rhythms or regularly irregular or irregularly irregular rhythms.
  • the system can be configured to track and/or determine cardiac output in order to improve medical image scans.
  • the system can be configured to track and/or monitor the veins on the neck and/or head of the subject as a data point for determining cardiac function.
  • the system can be configured to look at the distension/status of the jugular veins and/or carotid arteries.
  • the system can be configured to determine and/or generate respiratory ballistics data.
  • the system can be configured to look at the shape of the thorax of a subject and/or head position. More specifically, the system can be configured to monitor the size and/or 3D volume of the thorax and/or determine whether the same volume point can be obtained within the chest. In addition, the system can also monitor and/or track positioning information, for example by using an optical marker right over the heart to determine a three-dimensional position thereof.
  • the system can be configured to process a three-dimensional volumetric calculation from the image and/or video data, for example by using a dot projector to project a dot pattern on the chest of the subject to determine depth.
  • the system can be configured to determine the depth of a breath with improved accuracy and/or provide feedback to a gamification system for the subject. For example, the system can provide feedback to the subject to take a series of deep breaths until a desired depth of breathing is obtained. In certain embodiments, the system can provide feedback to encourage the subject to bring his or her respiration into a predictable mode that can result in improved motion tracking and thereby clearer medical image scans and/or more effective therapeutic procedures.
  • the system is capable of adjusting for artifacts that can be caused by changes in the position of an organ or area of interest of a subject that can occur between frames or slices of a medical imaging scan and/or a therapeutic procedure.
  • the system can determine the position of a subject area of interest in real-time or near real-time and/or combine it with one or more biometric data to dynamically determine when the subject area of interest is likely to be in the same position as a previous take, in order to dynamically determine when to take an image slice and/or apply a therapeutic procedure.
  • the system can be configured to encourage or increase the likelihood that the subject area is in the same position by providing feedback relating to the respiration and/or position of a subject area to the subject and/or technician, for example by utilizing one or more gamification features as described herein.
  • the system can be configured to retroactively remove or delete certain image slices that were taken when the subject area was not in the same position in order to correct for artifacts.
  • the system can be configured to modify or alter certain image slices that were taken when the subject area was not in the same position in order to correct for artifacts, for example by modifying the location and/or orientation of an organ of interest in the image.
  • the system can be configured to predict the position of a particular organ or subject area of interest and prospectively adjust one or more parameters of a medical image scan to correct for artifacts.
  • the system can be configured to determine, predict, and/or adjust for artifacts that can be caused by changes in the position and/or orientation of an organ or area of interest of a subject based solely on external features, such as external position and/or topography.
  • the system can be configured to determine, predict, and/or adjust for artifacts that can be caused by changes in the position and/or orientation of an organ or area of interest of a subject based on predictive internal modeling of the organ or area of interest or in combination with anatomical or other information acquired by the primary imaging modality including but not limited to MR, CT, PET, PET attenuation correction etc.
  • one or more features as discussed herein can be utilized by a system to adjust for movement of the brain within the head of a subject as well.
  • the brain can move within the head due to, for example, pulsating blood, spinal fluid pulsation, or the like, which can lead to artifacts in medical imaging, such as cerebral spinal fluid (CSF) pulsation artifacts.
  • the system can be configured to detect and/or monitor spinal fluid pulsation rate by monitoring external and/or visible features to detect, predict, and/or adjust for brain movement during a medical imaging scan and/or therapeutic procedure, such as by only keeping images where the brain is in a desired position.
  • the system can comprise one or more accelerometers or other sensors that can be capable of measuring spinal fluid movement to this end. Based on such data, the system can be configured to correct for artifacts caused by spinal fluid movement.
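As an illustrative sketch only, retaining slices acquired during a quiescent window of the CSF pulsation cycle (the phase convention, window, and names are assumptions):

```python
def keep_slices(slice_times, pulsation_phase, keep_range=(0.0, 0.2)):
    """Retain only image slices acquired while the pulsation phase
    (0..1, e.g. from an accelerometer-derived trace) falls in the
    quiescent window, discarding the rest."""
    lo, hi = keep_range
    return [t for t, p in zip(slice_times, pulsation_phase) if lo <= p <= hi]
```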
  • the system can be configured to dynamically modify the focus and/or alignment of one or more motion tracking cameras or detectors pursuant to movement and/or size of the subject and/or area thereof.
  • one or more motion tracking cameras or detectors can be motorized and/or can comprise other non-motorized features for doing so.
  • the system can be configured to conduct sentiment analysis, for example in real-time, substantially real-time, and/or post-processing.
  • the system can be configured to detect, monitor, and/or track facial expression, pupillary dilation, heart rate, thermal data, and/or respiratory rate of a subject. Based on such data, the system can further be configured to generate feedback on whether the subject is claustrophobic, fearful, anxious, relaxed, sleeping, and/or the like.
  • the system is capable of conducting sentiment analysis continuously and as such can dynamically determine a change or slowly evolving trend of the subject’s sentiment, for example to predict that a certain event will or is likely to occur, such as the subject crashing.
  • the system can be configured to dynamically determine whether the subject is exhibiting or is likely to become anxious based on tracking the subject’s facial expression, pupillary dilation, thermal data, heart rate, and/or respiratory rate. Based on such determination, the system can be further configured to generate a warning alert to the technician and/or subject. For example, the system can be configured to determine that a subject is exhibiting or is likely to exhibit signs of anxiety when the subject’s pupils are becoming smaller and/or when the subject’s heart rate and/or respiratory rate is trending upward and/or when the subject is sweating or the thermal data is trending upward. The system can be configured to generate a visual and/or audible warning alert to the subject, for example in a soothing or caring voice to relax the subject and/or display soothing or calming graphics.
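A very simple trend check illustrating the upward-trending heart-rate/respiratory-rate alert above; the least-squares slope test, thresholds, and names are assumptions rather than the disclosed method:

```python
def anxiety_alert(heart_rates, resp_rates, hr_slope_limit=2.0, rr_slope_limit=1.0):
    """Alert if both heart rate and respiratory rate are trending upward
    faster than the given per-sample slope limits (least-squares slope
    over the observation window)."""
    def slope(values):
        n = len(values)
        mean_x = (n - 1) / 2
        mean_y = sum(values) / n
        num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
        den = sum((x - mean_x) ** 2 for x in range(n))
        return num / den
    return slope(heart_rates) > hr_slope_limit and slope(resp_rates) > rr_slope_limit
```

A real system would presumably also fold in pupillary and thermal trends before warning the technician.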
  • the system can be configured to capture and/or determine the anxiety level of a subject over a period of time, which can be used to develop and/or modify treatment of the subject. For example, if a patient is regularly undergoing an MR scan, the system can be configured to track the anxiety level of the patient over the course of treatment, which can be used to modify the treatment itself.
  • anxiety levels of one or more patients observed from particular medical image scanners and/or therapeutic device can be used as a metric for evaluation of the scanner or therapeutic device itself. For example, patients who were scanned using a particular brand or type of MR scanner may have experienced lower anxiety levels than another type of MR scanner.
  • the system can be configured to perform interpatient variability measurements among different medical imaging scanners and/or therapeutic devices, based on sentiment analysis of a plurality of patients. In some embodiments, the system can be configured to conduct sentiment analysis in real-time, near real-time, and/or asynchronously or post-scanning.
  • sentiment data collected around a particular patient, a particular patient over a course of treatment, a series of patients over a particular period of time, a series of patients, one or more medical imaging scans and/or therapeutic devices can be used, for example, to measure an effectiveness of the machine and/or workflow and /or environment.
  • the system can be configured to obtain a cardiac rhythm from the raw motion tracking data and/or biometric data.
	• the system can be configured to analyze the veins and/or movements and/or micro-movements of the face and/or neck to obtain cardiac or other rhythm signals/data and/or a blood pressure signal.
  • a person moves up and down by about 150 microns when the blood hits the base of the skull.
	• the system can be configured to detect and/or track movement or micro movement of a facial marker on the subject, or to use another vision-based approach.
	• the system, in addition to visual data, can be further configured to utilize one or more other modalities, such as infrared, hyperspectral, sound, and/or ultrasound.
  • the system can be configured to utilize and/or incorporate one or more of the following variables for determining cardiac rhythm: marker data for movements or micro-movements, infrared data for pulsation and/or blood movement, hyperspectral data, multispectral data, thermal data, moisture, sweat, patient overheating, hemoglobin wavelength, oxygenation of hemoglobin, follicle motion, water detection, and/or the like.
	• the system can be configured to use infrared or hyperspectral imaging to detect and/or visualize the cardiac rate, pulsation, volume, and/or blood pressure, such as by utilizing infrared transit time resistance. More specifically, if a subject’s arteries are clamped down, the blood can flow more slowly than when the subject is relaxed; this can be detected by using infrared as a change in temperature and a longer transit time, and/or can give an indication of blood pressure.
	• the system can be configured to immediately, in real-time, or in substantially real-time determine blood pressure, for example by exposing a region of interest to infrared and detecting the temperature. More specifically, in some embodiments, the system can comprise one or more infrared emitters producing infrared rays onto the skin of the subject and one or more detectors detecting the bounce-back infrared, which can be further analyzed by the system. From such data, the system can further be configured to calculate resistance, blood pressure, and/or detect a trend in the blood pressure.
  • the system can be configured to characterize jaw movement based at least in part on motion data relating to the jaw which can affect imaging and/or indicate anxiety on the part of the patient.
  • jaw movement detection can be advantageous because the jaw bone is a prominent feature in a patient for tracking purposes.
  • the system can be configured to quantitate skin motion. Quantitative analysis of skin motion can be advantageous to remove and/or adjust for false motion detection.
  • the system can be configured to determine if a patient or subject is left in a medical imaging scanner and/or therapeutic device after a scan or procedure has been completed and/or at the end of the day or at a particular time.
  • the scan completion workflow of a scanner or a therapeutic procedure completion workflow of a therapeutic device can comprise a process in which one or more cameras or motion detection device components are configured to determine whether a patient or subject is located in the scanner or therapeutic device after completion of a scan or therapeutic procedure.
  • one or more cameras or motion detection device components can be configured to determine whether a patient or subject is located in the scanner or therapeutic device at a particular time of day.
	• if the system determines that a patient or subject is still located within the scanner and/or therapeutic device a certain period of time after the scan or therapeutic procedure is complete and/or at or after a particular time of day, the system can be configured to generate an alert, visual and/or audible, to the patient, subject, and/or medical professional.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a biological signal/data from a motion trace, and/or a video and/or images generated for motion detection. In some embodiments, the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a cardiac and/or respiratory signal/data and/or trace out of a motion trace, and/or a video and/or images generated for motion detection.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a pulse rate signal/data and/or a heart rate signal/data from a motion trace, and/or a video and/or images generated for motion detection, patient viewing, and/or biometric detection or surveillance.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect one or more proxies for a biological electrical signal, which in some embodiments can be configured to trigger imaging of a patient by using a different imaging modality, for example, a magnetic resonance (MR) imaging system, computed tomography (CT) scanner, or positron emission tomography (PET) scanner, or others.
	• the imaging systems are triggered to perform image captures based on triggers generated from biological electrical signals, which in some cases are detected from a sensor placed on a finger or other body part, or with optical flow vectors derived from a video stream as in markerless tracking.
  • the detection of biological electrical signals/data through such sensors on the chest or a finger or other body part may not be technologically possible, and/or may add cost to a procedure, and/or may require patient compliance, and/or may increase the time for a procedure, introduce electrocution or burn risk, or encumber workflow, and/or the like.
	• EKG leads can be positioned on a patient, which can be problematic for many reasons. For example, putting such EKG leads on a patient can be time consuming. For example, such EKG leads can be dangerous: if the insulation is broken on the lead, the lead acts as a piece of metal inside the magnet of an MR scanner, wherein the magnet can induce current in the EKG lead, so one can actually electrocute and burn the patient if the insulation on those leads is damaged. In other situations, a sensor is placed on the finger in order to figure out the pulse timing; however, the problem with such sensors is that by the time the blood travels down the hand and gets recognized, there is a latency on the order of around 300 milliseconds.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a proxy signal/data based on a system requiring no physical contact, in other words a touchless system.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a proxy signal in order to mimic the same trigger signal that imaging systems currently use for triggering image capture.
  • the systems, methods, and devices disclosed herein are configured to use motion trace data and/or video data showing motion of body parts, for example, a vein and/or vessel, to generate a proxy signal/data that mimics and/or correlates to biological electrical signals in the patient being observed, such proxy signals/data can be used, in some embodiments, as triggers for performing image capture.
  • detection of the cardiac and/or respiratory cycle can be used to analyze or extract additional information from temporally correlated imaging.
  • the systems, methods, and devices disclosed herein are configured to not only generate motion traces of a patient but also configured to extract from the video or a series of images data not previously available and/or analyzed for vein and/or vessel movements in the body, for example, the face and/or neck.
  • vein and/or vessel (and/or other body part movement) movement data in combination with motion trace data (through a marker or markerless system) to get improved accuracy of detecting patient motion and/or improved accuracy of when to image a patient and/or improved accuracy in generating patient images by better knowing when to trigger an image capture and/or by knowing better how to correct for motion correction.
  • the systems and methods disclosed herein relate to extracting and/or generating ballistocardiography (BCG) and/or respiratory data and signals from data generated by a motion tracking system in connection with and/or coupled to a medical imaging system, for example, a magnetic resonance (MR) imaging system.
  • the motion tracking systems disclosed herein comprise one, two, three, or four or more cameras, which may cover areas being imaged by the coupled imaging system or adjacent or distant areas.
  • the one or more cameras in the motion tracking system can be configured to be unobtrusively coupled to the imaging system, for example, an MR, CT, PET, and/or other system, and/or can be configured to be used for prospectively gating biomedical imaging scanner (for example, MR image capture system) to reduce or eliminate motion artifacts.
  • the system can comprise a standalone vital signs monitoring system that can be separate from any imaging system.
	• the systems disclosed herein are configured for automatic prospective gating that is synchronized to cardiac and/or respiratory motion. In general, cardiac gating is done using EKG leads attached to specific locations on the chest. Other approaches can involve the use of hardware-based photoplethysmography (PPG).
  • Impedance pneumography or special belts with pressure sensing are used for creating gating signals connected with respiratory motions.
	• these systems require patient cooperation during scanning. Further, such systems generally require wired connections that can limit their use. Additionally, probes or belts can in some instances obscure the anatomical regions of interest and add cost. Accordingly, there is an advantage in cost and/or complexity associated with an unobtrusive system, especially in a high magnetic field environment, such as in an MR imaging system, wherein special hardware with non-ferrous metals must be used in order to prevent injury to the patient due to metallic materials being drawn to, moved by, or having current induced by the magnetic forces.
  • the systems disclosed herein leverage motion detection technology combined with one or more specifically designed markers that can comprise dots and/or circles.
  • the one or more specifically designed markers can be positioned on a patient’s face and/or other body area and/or part.
  • the system utilizes processing algorithms to generate six degrees of freedom motion tracking data and/or signals.
  • the system is configured to use processing analytics that utilize overlapping batches to extract continuous BCG and/or respiratory waveforms from up to six motion signals, and in some embodiments, the system can be configured to process up to 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, or more motion signals. In some embodiments, these waveforms can be used for phase and/or amplitude gating.
  • the systems disclosed herein can use these waveforms for tracking patient’s physiological state (for example, heart rate, respiration rate, heart rate variability, change in cardiac output, change in pulse pressure, or the like) during some period of or multiple periods of the motion tracking capture session, or the entire motion tracking capture session.
  • the systems disclosed herein utilize BCG data and/or signals to perform peak to peak pulse point and/or interval detection.
  • the peak pulse point and/or interval detection data can be used by the systems disclosed herein to detect the occurrence of cardiac arrhythmia (for example, asymptomatic or persistent A-fib, or the like) during the scan by quantifying the variability of the peak-to-peak intervals.
  • the systems disclosed herein for tracking motion of a patient and/or object of interest during biomedical imaging applications to compensate for motion artifacts comprises one or more detectors working in conjunction with one or more markers, wherein the one or more detectors are configured to detect images over time of the one or more optical targets or markers (for example, specially designed markers with one or more dots and/or circles) placed on and/or coupled to a patient’s face.
  • the one or more detectors are configured to generate temporal motion data in terms of six degrees of freedom (6-DOF).
  • the systems disclosed herein include four cameras positioned to view at different directions with at least one camera or at least two cameras being adapted to record two dimensional images of the one or more optical targets.
  • the six degrees of freedom are over orthogonal directions x, y, and z and roll, pitch and yaw angles.
  • the six degrees of freedom are extracted at high speed (for example, during each sample period), and in some embodiments, by using a motion tracking algorithm that is configured to process two dimensional images of the target.
	• the direction x is along the coronal plane (in the shoulder-to-shoulder direction)
  • direction y is along the spinal axis located on the coronal plane perpendicular to the direction x
  • direction z is along the sagittal plane in the floor-to-ceiling direction and perpendicular to the x and y directions.
	• the roll angle, Rx, is about the x-axis (the angle made by shaking the head “No”), the pitch angle is about the y-axis (the angle made by shaking the head “Yes”), and the yaw angle is about the z-axis (the angle made by leaning the head forward).
	• in FIG. 3 there is illustrated an example of motion tracking data (for example, 6 DOF signals) for approximately 11 minutes (updated at a 17 ms sampling interval) during an MR imaging session of a test subject.
	• signals for a 60 second duration are also illustrated.
  • minute traces of cardiac and respiratory motion signals can be identified in many of these imaging channels.
	• the y-direction signal can indicate a large motion-induced component from cardiac beats riding over the non-stationary, slowly varying component.
	• the subtle motion has a peak to peak amplitude of approximately 40 microns and can represent movement due to the reactionary forces experienced by the body from cardiac expulsion of blood into the arteries.
  • the systems disclosed herein are configured to utilize mechanisms for identifying BCG waveforms.
	• when the subject and/or patient is lying in the supine position, the z-direction signal can contain the subject’s respiratory signal.
  • the systems disclosed herein are configured to extract the cardiac and respiratory signals from y and z-motion signals and gate (for example, trigger) the image acquisition to match the amplitude or phase.
  • the systems disclosed herein can be configured to limit motion induced image degradation by extracting the cardiac and respiratory signals from y and z-motion signals and gating (for example, trigger) the image acquisition to match the amplitude or phase.
  • the systems disclosed herein are configured to generate BCG and/or respiratory signals and/or data by using signal processing methods.
  • the systems disclosed herein employ a three-step process.
  • the systems disclosed herein can employ a more complex method, for example, an independent component analysis or multi-channel signal processing algorithms can also be applied to simultaneously process x, y and z or x, y, z, Rx, Ry and Rz motion signals/data and extract cardiac and respiratory signals of interest.
	• the systems disclosed herein employ a first step that involves de-trending the signal to remove the slowly varying trend that can lead to a non-stationary signal component.
	• Slow linear or more complex non-linear trends representing the non-stationary component can cause distortions in time and frequency domain analysis.
  • the systems disclosed herein in some embodiments, can systematically test for non-stationarities and retain only stationary segments for further analysis or apply other signal processing methods to try to remove the slow non-stationary trends before analysis.
  • the first method can result in erroneous conclusions because the segments that are removed contain physiological information related to heart rate variability (HRV) data.
  • the systems disclosed herein can be configured to use signal processing methods where de-trending is done via smoothness prior approach. In some embodiments, the systems disclosed herein can be configured to employ a method that uses a single parameter to adjust the frequency response of the signals such that the systems can be adjusted to different situations.
	• the de-trending filter operates like a time-varying high pass Finite Impulse Response (FIR) filter by removing lower regions of the frequency band.
  • the one-dimensional signal can be considered as two components: (1) stationary motion signal of interest and (2) a low frequency slowly varying non-stationary component.
  • the slowly varying non-stationary component is modeled in terms of the regression parameters to estimate the trend.
  • the systems disclosed herein can be configured to use the regularized least squares solution to determine the regression parameters.
	• the de-trended, stationary motion signal, P_stat, is obtained from equation no. 1 shown below:

	P_stat = (I − (I + λ² D₂ᵀ D₂)⁻¹) P_original     (1)

	• P_original is the original signal that needs de-trending (e.g., the y-channel motion signal), and I is the identity matrix of size N×N, where N is the size of P_original.
	• D₂ is a second order difference matrix, an approximation of the derivative operator, which is of the form:

	D₂ = [[1, −2, 1, 0, …, 0],
	      [0, 1, −2, 1, …, 0],
	      …,
	      [0, 0, …, 1, −2, 1]]

	• the inverted term in equation 1 represents a time varying high pass filter.
	• a single regularization parameter, λ, is used to adjust the frequency response of the de-trending algorithm.
	• the regularization parameter can be configured, for example, to be 200 for the y-channel signal sampled at 60 Hz and 1000 for the z-channel signal.
  • an optimal value of the parameter(s) can be set a priori by experimenting on different patients in the environment in which signals are acquired (for example, in their clinical/home/work environment).
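The de-trending step above can be sketched in a few lines of Python. This is a minimal illustration of the smoothness-priors formula in equation 1, not the patented implementation; the function name and the dense-matrix construction (impractical for very long traces) are assumptions for clarity:

```python
import numpy as np

def detrend_smoothness_priors(p_original, lam=200.0):
    """Remove the slowly varying non-stationary trend from a 1-D motion
    signal, returning the stationary component
    P_stat = (I - (I + lam^2 * D2.T @ D2)^-1) @ P_original  (equation 1)."""
    p_original = np.asarray(p_original, dtype=float)
    n = len(p_original)
    # Second-order difference matrix D2 of shape (n-2, n)
    d2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        d2[i, i:i + 3] = (1.0, -2.0, 1.0)
    # The inverted term acts as a low-pass trend estimator
    trend_estimator = np.linalg.inv(np.eye(n) + lam ** 2 * d2.T @ d2)
    return p_original - trend_estimator @ p_original
```

Because D₂ annihilates linear ramps, a purely linear trend is removed exactly; larger λ pushes the effective high-pass cutoff lower, matching the per-channel λ values discussed above.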
  • the systems disclosed herein can employ a second step of applying a band-pass filter to the de-trended signal in order to retain the frequency bands of interest.
  • the band-pass filter can be configured to remove undesirable frequencies below and/or above the expected frequency range.
  • the systems can use a band-pass filter with a frequency range of 0.75 to 2Hz for extracting heart rate signals/data.
  • the upper limit can be as large as 7 Hz or above.
  • the systems can be configured to detect a respiratory motion signal wherein the frequency band can be limited to 0.05 to 0.3Hz.
	• the respiratory motion signal can be configured to have the upper limit as high as 0.5 Hz for adults and 1 Hz for neonatal intensive care babies, and somewhere in between for children.
  • the systems can be configured to remove phase distortions in filtering by using digital filtering methods.
	• the systems can be configured to use the zero-phase digital filtering algorithm, which processes the input signal through the chosen band-pass filter transfer function of interest in the forward and reverse directions.
  • the result is the zero-phase distortion.
  • the foregoing system configuration reduces the noise and/or retains the signal in the band of interest without injecting delays as in normal filters.
  • the zero-phase filtering helps to preserve features in a filtered time waveform exactly or substantially exactly or approximately where the features occur in the unfiltered signal.
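The band-pass and zero-phase filtering steps above can be sketched as follows. This is an illustrative sketch using SciPy's forward-backward filtering, with assumed function and variable names rather than the disclosed implementation:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_zero_phase(signal, fs, low_hz, high_hz, order=4):
    """Band-pass filter a de-trended motion signal with zero-phase
    distortion by running the filter forward and then in reverse."""
    nyq = 0.5 * fs
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

# cardiac band (0.75-2 Hz) from a y-channel trace sampled at 60 Hz:
# cardiac = bandpass_zero_phase(y_channel, fs=60.0, low_hz=0.75, high_hz=2.0)
# respiratory band (0.05-0.3 Hz) from a z-channel trace:
# resp = bandpass_zero_phase(z_channel, fs=60.0, low_hz=0.05, high_hz=0.3)
```

Because the filter runs in both directions, features in the filtered waveform stay aligned with where they occur in the unfiltered signal, as described above.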
  • the systems disclosed herein can employ a third step wherein the system is configured to compute power spectral density and determine fundamental frequency of interest.
	• the systems disclosed herein are configured to up-sample the signal through interpolation to increase the number of data points; further smoothing may also be required.
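The third step, computing the power spectral density and extracting the fundamental frequency, can be sketched as below. The Welch estimator and the band limits are illustrative assumptions:

```python
import numpy as np
from scipy.signal import welch

def fundamental_frequency(signal, fs, fmin, fmax):
    """Return the frequency (Hz) of the power spectral density peak
    within a physiologically plausible band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 1024))
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(psd[band])]

# heart rate in beats per minute from a cardiac-band motion signal:
# bpm = 60.0 * fundamental_frequency(cardiac, fs=60.0, fmin=0.75, fmax=2.0)
```

Up-sampling or zero-padding before the PSD estimate, as mentioned above, would refine the frequency resolution of the returned peak.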
	• FIG. 4 illustrates a cardiac signal (and a heart rate) obtained after applying the foregoing steps to a y-channel trace of 1-minute length.
  • FIG. 5 illustrates a respiratory signal (and a respiratory rate) from z-channel trace for the same duration batch.
	• the system was configured for generating a BCG signal based on a long batch of 1-minute duration.
  • the systems disclosed herein can be configured for generating continuous signals at a shorter batch length anywhere between 5 to 15 seconds, which in some embodiments, is preferred.
	• the system can be configured to comprise a batch length of 10 seconds to 30 seconds, due to lower frequency components, assuming the breathing frequency range is between 6 to 25 cycles per minute.
  • these batch length ranges can vary based on the lower limits on the breathing frequency and/or other frequencies.
	• in FIG. 8 there is illustrated an example of a method for generating BCG and RR signals by computing successive overlapping batches one at a time, each new batch reusing a large part of the previous batch and including at least a few new samples, while eliminating the same number of samples from the immediately previous batch.
  • retaining an overlap of 95% between neighboring batches is preferred to minimize large variations between batches.
  • batch length can be, and in some embodiments, preferably, kept to a minimum with maximum overlap.
  • the overlap in the sliding window can be set to maximum with one sample elimination from previous immediate batch and one sample inclusion from new neighboring batch.
	• the continuous stream of signals derived from processing each batch is then stitched together and filtered again with the zero-phase distortion filtering mentioned above to remove any discontinuities.
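The overlapping-batch scheme of FIG. 8 can be sketched as a sliding window. The generator below is an illustrative assumption (name and interface are not from the disclosure); step=1 corresponds to the maximum-overlap case in which one old sample is dropped and one new sample is added per batch:

```python
import numpy as np

def sliding_batches(signal, batch_len, step):
    """Yield (start index, batch) pairs of overlapping batches of a
    motion trace; each batch would be de-trended, filtered, and
    analyzed, then the per-batch outputs stitched together."""
    for start in range(0, len(signal) - batch_len + 1, step):
        yield start, signal[start:start + batch_len]

# For a 10 s batch at a 60 Hz sampling rate with 95% overlap between
# neighboring batches: batch_len = 600 and step = 30 (5% advance).
```

Keeping the step small relative to the batch length, as the text recommends, minimizes variation between neighboring batches at the cost of more computation.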
	• FIG. 6 illustrates an example of a measured Ballistocardiograph (BCG) waveform for one heartbeat.
  • This signal/data illustrated in FIG. 6 was produced with a modified electronic weighing scale.
	• the example illustrated in FIG. 6 comprises several waves such as the “H”, “I”, “J”, “K”, and “L” waves, which are typical of BCG recordings when standing on the weighing scale.
	• increasing the upper band of the band-pass filter while processing the y-channel signal clearly shows the presence of similar traces of “H”, “I”, “J”, “K”, and “L” waves, examples of which are illustrated in FIG. 7A and FIG. 7B. “M” and “N” waves can also be seen in these filtered waveforms. It is to be noted that these BCG signals are inverted compared to FIG. 6. In some embodiments, the system can be configured to confirm this observation after acquiring EKG signals/data.
  • the I-wave in the weighing scale signal/data corresponds approximately to the trough or foot of the BP waveform at the inlet of the ascending aorta, while the time of the J wave peak corresponds approximately to the foot of the BP waveform at the outlet of the descending aorta.
  • BCG signal/data can also be used for determining cardiovascular health and disease information.
	• the time interval between the beginning of the “I” wave and the “J” wave peak (trough of the inverted BCG signal) can represent the aortic pulse transit time.
	• cardiac gating can be initiated by detecting the beginning of the “I” wave.
	• the interval between the R peak and the I wave (RI-interval) in the foot BCG was then modelled with respect to the Pre-Ejection Period (PEP) for a group of 17 subjects.
	• the RI-interval depends on PEP.
	• the system can be configured to use the I wave, instead of the R wave, as a gating signal for initiating image capture.
  • the systems disclosed herein can be configured to extract BCG waveforms contact free by leveraging the MR imaging machines and motion tracking hardware and for use in reasonably accurate timing for synchronizing with cardiac signals.
  • peak to peak pulse points and/or intervals can be used by the systems disclosed herein to extract time domain Heart Rate Variability (HRV) statistics.
  • HRV refers to the beat-to-beat time variation in heart beat and is modulated primarily by the alterations in the Autonomic Nervous System (ANS) via changes in the balance between parasympathetic and sympathetic influences.
  • HRV statistics can be used as a quantitative marker to measure the state of ANS.
  • heart rate is not fixed even in healthy individuals.
  • heart rate automatically adjusts for stress, respiration, metabolic changes, thermoregulation, physical exertions, endocrine cycles, etc.
  • the ANS is represented by the sympathetic and parasympathetic nervous system (SNS and PSNS).
  • SNS and PSNS function in opposition to each other.
  • HRV monitoring during MRI capture is a quick physiologic indicator and/or a superficial reflection of the state of the ANS.
  • HRV statistics can also be used to determine whether the patient is in A-Fib or sinus rhythm.
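The time-domain HRV statistics mentioned above can be computed from peak-to-peak pulse intervals as sketched below. SDNN and RMSSD are standard HRV measures; the function name and interface are assumptions for illustration:

```python
import numpy as np

def hrv_time_domain(peak_times_s):
    """Time-domain HRV statistics from BCG peak times (in seconds):
    mean heart rate (bpm), SDNN (ms) and RMSSD (ms)."""
    intervals = np.diff(np.asarray(peak_times_s, dtype=float))
    mean_hr = 60.0 / np.mean(intervals)
    sdnn = 1000.0 * np.std(intervals)                           # overall variability
    rmssd = 1000.0 * np.sqrt(np.mean(np.diff(intervals) ** 2))  # beat-to-beat variability
    return mean_hr, sdnn, rmssd
```

Abnormally high interval variability over a scan (e.g. elevated SDNN with highly irregular successive differences) is the kind of quantitative signature the text associates with detecting A-fib versus sinus rhythm.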
  • the systems disclosed herein can be configured to process raw unfiltered BCG waveforms through deep learning algorithms and/or SVM classifiers to extract key features that can be mapped to blood pressure.
	• in order for the systems disclosed herein to exploit clinically significant parameters from the BCG waveform, the accuracy of motion measurements should increase by 10X or more, which can be achieved with a high resolution stereo vision system.
	• the systems disclosed herein use a method to estimate physiologically relevant cardiac and/or respiratory signals/data by leveraging 6-degrees of freedom motion signals/data for use in phase/amplitude gating of MR images.
	• the system can be configured to detect characteristic points of the desired cardiac and/or respiratory signal/data in multiple ways: (1) by detecting peak or valley points using a peak/valley detection algorithm, (2) by zero cross-over, or (3) by a threshold level detector in the inspiration/expiration cycle.
  • end-inspiration or end-expiration are generally used as trigger points for the X-Ray systems.
	• in FIG. 9 there is illustrated an example of the gating trigger signal, which is time synchronized to the respiration cycle based on a threshold level detector.
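A threshold-level gating trigger like the one in FIG. 9 can be sketched as a threshold-crossing detector. This is an illustrative sketch (function name and interface assumed), not the disclosed trigger circuit:

```python
import numpy as np

def gating_triggers(resp_signal, threshold, rising=True):
    """Return sample indices where the respiratory waveform crosses a
    threshold level; each index can serve as a gating trigger that is
    time synchronized to the respiration cycle."""
    above = np.asarray(resp_signal) >= threshold
    if rising:
        return np.where(~above[:-1] & above[1:])[0] + 1
    return np.where(above[:-1] & ~above[1:])[0] + 1
```

Choosing the threshold near the end-expiration plateau (with `rising=False` for the falling edge) would approximate the end-expiration trigger points mentioned above for X-Ray systems.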
  • the systems disclosed herein can employ this method, which in some embodiments, has the potential to not only improve the quality of MR images, but also provide valuable medical diagnosis data unobtrusively and/or with no patient cooperation.
  • the systems and methods disclosed herein are better than other systems because the systems and methods disclosed herein are contact-free, inconspicuous and are very accurate, especially when used with Artificial Intelligence such as feature-based deep learning algorithms.
  • the systems and methods herein can be advantageous in two areas of MR imaging: (1) work flow simplification with high accuracy gating and (2) unobtrusive diagnosis of critical clinical parameters without patient co-operation.
	• the systems, methods, and devices disclosed herein relate to signal acquisition from a video-based apparatus for extracting physiology information from a living body in a non-contact manner, for example in applications such as cell phones and laptops, or in a hospital setting where one or more measuring devices can be mounted inside the device or on the wall.
  • one or more video measurements are taken by sequentially illuminating LEDs on subject’s exposed skin surface.
  • reflected light is then recorded in the photodetector array by integrating the light for a predetermined time.
  • the signals are then transformed to extract one or more various vitals such as heart rate, respiration rate, blood pressure, and/or oxygen saturation etc., for tracking the health of living-beings.
	• Some embodiments described herein are specifically suited for taking measurements under non-cooperative settings. Certain embodiments described herein can be used anywhere living-beings are present, such as for example workplaces, homes, hospitals, home care, minute clinics, sleep labs, intensive care units, doctors’ offices, automobiles, and/or self-driving vehicles, for intermittent and/or continuous monitoring of one or more vitals. Certain embodiments described herein can also be used for measuring cardiovascular health of fish in fish tanks, animals in zoos, and/or the like.
	• a switched narrow-band illumination is utilized, for example near infrared (NIR) centered around 940 nm, and/or wideband between 805-1000 nm, since absorption between oxygenated and de-oxygenated hemoglobin can differ significantly.
  • one or more high intensity Light Emitting Diodes can be used.
  • one or more CCD and/or CMOS detectors can be used in this wavelength band.
	• other wavelengths, for example between 1000-1300 nm, can be suitable with non-silicon detectors.
  • the system can be configured to maintain NIR illumination within eye safety limits.
  • a key metric used for monitoring patient health can be the oxygen saturation in the blood (arterial or venous blood) to know how well the patient is oxygenated. In some cases, this can be done using a device which clamps onto the patient’s finger-tip with a cord transmitting signals back to a display module. However, in some cases, these devices can impede the patient from using both hands.
	• the system can comprise two LEDs having different wavelengths, sequentially illuminated one after the other (one below 805 nm and the other above 805 nm).
	• these two LEDs can distinguish between two types of hemoglobin (oxygenated hemoglobin, HbO2, and de-oxygenated hemoglobin).
  • FIG. 10 illustrates an example(s) of absorption coefficient of hemoglobin and water shown with respect to wavelength.
  • pulse transit time can represent an approach for ubiquitous BP monitoring.
	• PTT can be measured from the EKG R-wave and the finger pulse from an oximeter or the like, and can be mapped to systolic and diastolic blood pressures using biophysics models or a calibration look-up table created by comparing PTT to gold standard instruments.
  • the system can be configured to utilize PTT measurement between an upstream blood pressure pulse within the blood vessel and a distal peripheral pulse taken from the same video frame or captured with a separate video camera whose acquisitions are synchronized.
  • the approach can use extracting two regions, such as a proximal and a distal region of interest, and post processing the video frames to extract videoplethysmographic (VPG) signals.
  • the system can be configured to compute the phase difference between two VPG signals and map the PTT to systolic and diastolic blood pressures, for example via a look up table and/or a mathematical model.
  • the system can be configured to use the phase difference that exists between BCG wave taken with head motion due to the force created by ejection of blood and the VPG wave taken on a small ROI on the forehead.
  • the system can be configured to perform the capture for a few seconds to collect both BCG and VPG waveforms and then map the phase difference to systolic and diastolic blood pressures.
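The PTT/phase-difference estimation between a proximal waveform (e.g. BCG head motion) and a distal waveform (e.g. a forehead VPG signal) can be sketched as a cross-correlation lag search. The function name and interface are assumptions; the subsequent mapping of PTT to systolic/diastolic pressure via a look-up table or model is omitted:

```python
import numpy as np

def pulse_transit_time(proximal, distal, fs):
    """Estimate PTT (seconds) as the lag maximizing the cross-correlation
    between a proximal pulse waveform and a distal pulse waveform
    captured in the same (or synchronized) video frames."""
    proximal = np.asarray(proximal, dtype=float) - np.mean(proximal)
    distal = np.asarray(distal, dtype=float) - np.mean(distal)
    corr = np.correlate(distal, proximal, mode="full")
    lag = int(np.argmax(corr)) - (len(proximal) - 1)
    return lag / fs
```

In practice the capture would run for a few seconds, as described above, and the resulting PTT (or phase difference) would index a calibration look-up table to yield systolic and diastolic blood pressures.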
  • heart rate(s) and/or respiration rate(s) are measured by extracting peak of the power spectral density functions using VPG and Respiration signal at wavelength bands of interest mentioned in earlier text.
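Extracting rates from the peak of the power spectral density, as described above, can be sketched roughly as follows; the synthetic VPG signal, sampling rate, and band edges are assumptions for illustration:

```python
import numpy as np

def rate_from_psd(signal, fs, band):
    """Return the dominant frequency within `band` (Hz), expressed in
    cycles per minute, from the periodogram of `signal`."""
    x = signal - np.mean(signal)
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return 60.0 * freqs[mask][np.argmax(psd[mask])]

# Synthetic VPG: 1.2 Hz cardiac pulse plus 0.25 Hz respiratory component.
fs = 50.0
t = np.arange(0, 60, 1 / fs)
vpg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.25 * t)

hr = rate_from_psd(vpg, fs, (0.7, 4.0))   # heart rate band
rr = rate_from_psd(vpg, fs, (0.1, 0.5))   # respiration band
print(round(hr, 1), round(rr, 1))  # → 72.0 15.0
```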
  • FIG. 11 and FIG. 12 illustrate a small region of interest representing VPG marks from certain systems described herein, such as for example a system comprising four cameras.
• FIG. 13 illustrates VPG signals from certain camera systems, overlapped with detrended but unfiltered signals, at different ROIs.
  • FIG. 14 illustrates an example(s) of a VPG signal (top) and respiration signal (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8.
  • FIG. 15 illustrates an example(s) of pulse rate / heart rate (top) and respiration rate (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8.
  • the systems, methods, and devices disclosed herein are configured to improve the strength of videoplethysmography (VPG) signals.
  • the system is configured to use pulsed illumination, as opposed to continuous illumination, of the skin and/or subject region of interest.
  • the off-time duration of the illumination can be configured to be greater than the thermal relaxation time (TRT) of the tissue of interest.
  • light penetration of skin can be gradually higher with increasing wavelengths.
  • higher depth of penetration can be useful to reach more red blood cells (RBCs) so that the strength of VPG signal is improved.
• around 20% of the light energy can be absorbed in the blood vessels, which can result in more heat being generated inside the tissue; this heating can be problematic in improving the strength of the VPG signal.
  • some embodiments herein comprise one or more pulsing illuminators, such that intervals between illuminations can be adjusted to reduce heating effects in the tissue or region of interest.
  • thermal relaxation time (TRT) of tissue represents the time required to cool down the heated structure to about 50% of the initial temperature.
  • pulsed illumination of the subject region and/or target of interest can be controlled such that the duration between pulses or illumination is greater than at least the TRT of the subject region and/or target of interest. That way, the subject region and/or target of interest can be given sufficient time to cool down before being illuminated again.
  • the illumination on-time can be configured such that it is shorter than the TRT to prevent overheating of the subject region and/or target of interest.
  • the system can comprise one or more polarization optics to reduce specular reflection.
  • the system can be configured to utilize polarization techniques to preferentially remove specular reflection, which can help to improve the performance of VPG signal extraction.
  • the system can comprise an analyzer in the imaging path, which can be a polarizer with the transmission axis perpendicular to that of a polarizer in the illumination path. As such, the reflected light with the same polarization state as the illuminated light can be blocked.
  • specular reflected light can partially retain the source polarization, while light that penetrates the skin can be reflected with randomized polarization.
• the polarizer in the imaging path can preferentially transmit the diffused light, which can improve VPG signal strength.
• chromophores in tissue can include hemoglobin and its derivatives, melanin, water, and foreign pigmented tattoos.
  • the wavelength of light can influence selective light absorption by a certain target structure and can also influence the depth of tissue penetration.
  • skin penetration is gradually higher with increasing wavelengths. For wavelengths varying from 300-1,200 nm, melanin can be the dominant absorbent.
  • the scattering effect can make the light spread out and limit the depth of light penetration. Higher depth of penetration can be useful to reach more Red Blood Cells (RBC).
• when the target structure absorbs light, the light can be converted to heat, which is conducted to the surrounding structures.
  • the system can be configured to pulse or flicker light illuminating the target tissue, thereby allowing cooling of the target tissue between pulsing sequences and retaining higher depth of light penetration. Controlling the pulsing or flickering of light illuminating the target tissue of interest can play a key role in VPG signal strength.
• an epidermal thickness of 0.1 mm can have a TRT of about 1 ms, while a vessel of 0.1 mm diameter can have a TRT of about 4 ms. Further, as another example, a vessel three times larger (0.3 mm) can have a TRT of approximately 10 ms. As such, in some embodiments, controlling the pulsing interval to be greater than 10 ms can provide sufficient time for the target tissue to cool and return to its original state.
• the time of tissue exposure to light can be adjusted by controlling the on-time or pulse duration of the light, such as an LED or other light source, illuminating the target region or tissue of interest.
• FIG. 16 is a schematic diagram illustrating intervals for pulsing illumination. As shown in FIG. 16, in some embodiments, the switching time or switching interval can be defined as the period between two consecutive on-times when the light is turned on to illuminate the region or tissue of interest. Further, in some embodiments, the pulse duration can be defined as the period during which the light is turned on to illuminate the tissue or region of interest.
• by adjusting the switching time or switching interval to be greater than the TRT of the tissue or region of interest and/or by adjusting the pulse duration to be less than the TRT of the tissue or region of interest, sufficient time can be provided for the tissue structure at the region of interest to sufficiently cool down such that clearer images and/or VPG signals can be obtained.
  • the TRT can be proportional to the square of the size of the target tissue structure.
• the TRT or relaxation time can be computed from the following equation: T = d² / (k·α), where T is the relaxation time, d is the size of the heated object, α is the thermal diffusivity, and k is the geometrical factor.
• the thermal diffusivity can be about 2 × 10⁻³ cm²/s for the dermis layer.
• the geometrical factor can be about 16 for a cylindrical object, such as a vessel.
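As an illustrative sketch (not part of the source), the TRT relation and the pulsing constraints above can be applied numerically; the 1.5× safety margin and the example vessel size are assumptions:

```python
# Thermal relaxation time: T = d^2 / (k * alpha), per the relation above.

def thermal_relaxation_time(d_cm, alpha_cm2_s=2e-3, k=16.0):
    """TRT in seconds for a heated structure of size d_cm (cm).
    Defaults: dermis-like diffusivity, cylindrical geometry factor."""
    return d_cm ** 2 / (k * alpha_cm2_s)

def pulse_timing(trt_s, margin=1.5):
    """Choose an illuminator on-time below the TRT and a switching
    interval above it; the 1.5x margin is an assumed safety factor."""
    return {"on_time": trt_s / margin, "switching_interval": trt_s * margin}

trt = thermal_relaxation_time(0.01)   # 0.1 mm vessel, in cm
timing = pulse_timing(trt)
print(round(trt * 1000, 3), "ms")     # → 3.125 ms
```

The ~3 ms result for a 0.1 mm vessel is consistent with the "about 4 ms" figure quoted earlier in the text.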
  • FIG. 17 is a flowchart illustrating an example embodiment(s) of pulsing illumination to obtain clearer raw VPG signals.
  • medical personnel, user, and/or system can identify one or more regions of interest (ROI) and/or targets of interest on a subject and/or tissue sample at block 1702.
  • the system, medical personnel, and/or other user can determine the TRT of one or more tissue structures within the ROI or target of interest at block 1704, for example using in part the equation above.
  • the system, medical personnel, or other user can set the LED or other system illuminator on-time to be less than the TRT, for example to ensure that the tissue sample is not overheated.
  • the system, medical personnel, or other user can set the duration between pulses or switching interval time to be greater than TRT, for example to ensure sufficient cooling time for one or more tissue structures within the ROI or target of interest.
  • the system can be configured to collect raw signal data and/or image data at block 1710, whose strength can be higher than the same collected by using continuous illumination of the ROI without pulsing or flickering.
  • the system can be configured to extract one or more VPG signal data at 1712, which can be more accurate and/or reliable than the same collected by using continuous illumination of the ROI without pulsing or flickering.
  • FIG. 18A illustrates example images obtained with pulsing illumination
  • FIG. 18B illustrates example raw VPG signals obtained with pulsing illumination as disclosed herein for a test subject.
• FIGS. 18A and 18B show raw signals obtained from a tissue sample when the pulse duration, or the period during which the LED illuminator was on (the on-time), was about 2 ms.
  • FIGS. 18A and 18B show raw signals obtained from a tissue sample when the switching interval was about 16.67 ms.
  • the camera integration time was about 8 ms in FIGS. 18A and 18B.
  • FIG. 19A illustrates example images obtained with continuous illumination when the LED illuminator was not pulsing
  • FIG. 19B illustrates example raw VPG signals obtained for the same test subject with continuous illumination when the LED illuminator was not pulsing.
  • raw signals were obtained from the same tissue sample for the same test subject when the LED illuminator was continuously on and without any switching interval.
  • camera integration time was about 8 ms.
  • the specular component of received light has the tendency to block the skin pixels from the image.
• FIG. 20A illustrates an example of an unpolarized image of tissue.
  • direction of light wave polarization can be manipulated with polarizing filters which may be optimized to largely or completely suppress specular light.
  • FIG. 20B illustrates an example polarized image of the same tissue from FIG. 20A.
  • the system comprises one or more polarization optics to reduce specular reflection.
  • FIG. 21 is a schematic diagram illustrating an example imaging system comprising a polarizer and an analyzer in the light path for improving VPG signals.
  • the system can comprise a polarizer in the illumination path.
  • the polarizer can be fixed in some embodiments.
  • the system can comprise an analyzer in the imaging path, which can be a polarizer with the transmission axis perpendicular to that of the polarizer in the illumination path.
  • the analyzer can be rotatable or adjustable in some embodiments.
  • the reflected light with the same polarization state as the illuminated light can be blocked, thereby substantially removing the specular component which can be mainly surface reflection and which can contain substantially no absorption information.
• diffused light coming from deeper structures can be randomly polarized and thus preferentially passed through the analyzer.
• systems, devices, and methods described herein can measure heart rate variability with infrared illumination. In some embodiments, systems, devices, and methods described herein can further provide feedback to an immersive video display to balance the subject’s heart rate variability (i.e., sympathetic and/or parasympathetic tones). In some embodiments, systems, devices, and methods described herein can be configured to measure sympathetic and/or parasympathetic tones with a videoplethysmography (VPG) signal(s). In some embodiments, systems, devices, and methods described herein can be configured to provide a recommendation(s) for balancing sympathetic and/or parasympathetic tones of a subject.
  • the system can provide audio signals to the patient and reduce MRI or other medical imaging scanner or therapeutic device noise to a background hum.
  • the system can also provide an immersive visual experience via video display. In some embodiments, this is performed without any feedback from the patient’s autonomic nervous system (ANS).
• the system can be configured to measure the state of autonomic activity during the scan, which can be triggered or influenced by the scanning operation with or without patient cooperation, and then make recommendations to play appropriate video images for balancing the autonomic activity.
  • this approach can lead to improved scanning experience for patients.
• Heart Rate Variability (HRV) can refer to a quantitative marker used to measure the state of the ANS.
• HRV can be used in some embodiments as a technique to balance the autonomic nervous system.
  • heart rate is not fixed, but rather automatically adjusts for stress, respiration, metabolic changes, thermoregulation, physical exertions, endocrine cycles, or the like.
  • HRV can refer to the beat-to-beat time variation in heartbeat, which can be modulated primarily by alterations in the ANS via changes in the balance between parasympathetic and/or sympathetic influences.
  • the ANS can be represented by the sympathetic and parasympathetic nervous system (SNS and PSNS). In some instances, they can function in opposition to each other.
• the SNS typically functions with actions that require quick responses, such as the “fight or flight” response.
• the parasympathetic division functions with actions that do not require immediate reaction, such as our ability to relax, repair, digest, eliminate, and sleep.
  • HRV monitoring can be a quick physiologic indicator and a superficial reflection of the state of our autonomic activity.
• by monitoring the electrical activity of the heart with procedures such as a contact-based ECG or an invasive catheter, HRV can be estimated.
  • HRV signals can be generated by extracting the intervals between R-waves from the ECG.
• spectral analysis of R-waves, i.e., the RR interval, of 2 to 5 minute short ECG recordings can contain three components: (1) a very low frequency (VLF) component at a frequency range of 0.003 to 0.04 Hz; (2) a low frequency (LF) component within 0.04 to 0.15 Hz; and (3) a high frequency (HF) component at 0.15 to 0.4 Hz.
• typical heart rate can have frequencies anywhere between 0.7 Hz and 4 Hz.
  • a ratio of the powers concentrated in the LF component to the HF component can provide a useful HRV measurement.
  • evolution of this ratio over time also contains useful information and can be used for measuring the state of autonomic activity.
• FIG. 22A and FIG. 22B illustrate examples of how the peak power in low frequency (LF) and high frequency (HF) components can be different under sympathetic and/or parasympathetic influence.
  • FIGS. 22A and 22B illustrate examples of spectral assessment, in which FIG. 22A illustrates an example power spectrum of a pulsating cardiac signal with a dominant LF component or dominant sympathetic influence, and FIG. 22B illustrates an example power spectrum of a pulsating cardiac signal with a dominant HF component or dominant parasympathetic influence.
  • FIG. 23 illustrates example frequency contents and their range and associations with ANS.
  • FIG. 24 illustrates example spectra contained in pulsating cardiac signal(s) and their associations with ANS.
  • a video processing system can capture one or more VPG waveforms at block 2402.
  • the system can be configured to compute power in low frequency and high frequency spectra at block 2404.
• the system can be configured to compute a ratio of the power in the LF component divided by the power in the HF component at block 2406.
  • the system can be configured to determine whether this ratio is above a predetermined threshold at block 2408.
  • the system can be configured to select one or more appropriate videos to suppress sympathetic response at blocks 2410 and/or 2412, which can then be displayed to the subject, for example through an in-bore video display system.
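The LF/HF computation and threshold decision of blocks 2402-2412 could be sketched as follows. The resampling rate, threshold value, video labels, and the synthetic RR series are illustrative assumptions, not values from the source:

```python
import numpy as np

# Assumed band edges matching the spectral components described above.
LF_BAND = (0.04, 0.15)   # Hz, sympathetic-dominated
HF_BAND = (0.15, 0.40)   # Hz, parasympathetic-dominated

def band_power(freqs, psd, band):
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum()

def lf_hf_ratio(rr_intervals_s, fs_interp=4.0):
    """LF/HF power ratio from a series of beat-to-beat (RR) intervals.
    The RR series is resampled to a uniform grid before the FFT."""
    t_beats = np.cumsum(rr_intervals_s)
    t_uniform = np.arange(t_beats[0], t_beats[-1], 1 / fs_interp)
    rr_uniform = np.interp(t_uniform, t_beats, rr_intervals_s)
    rr_uniform -= np.mean(rr_uniform)
    freqs = np.fft.rfftfreq(len(rr_uniform), d=1 / fs_interp)
    psd = np.abs(np.fft.rfft(rr_uniform)) ** 2
    return band_power(freqs, psd, LF_BAND) / band_power(freqs, psd, HF_BAND)

def select_video(ratio, threshold=2.0):
    """Hypothetical decision mirroring blocks 2408-2412: a high LF/HF
    ratio (sympathetic dominance) triggers a calming video."""
    return "calming" if ratio > threshold else "neutral"

# Synthetic RR series: 0.8 s baseline modulated at 0.1 Hz (LF-dominant).
n = 300
t = np.cumsum(np.full(n, 0.8))
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t)
ratio = lf_hf_ratio(rr)
print(select_video(ratio))  # → calming
```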
  • motion tracking and/or correction systems such as those illustrated in FIGS. 1A and 1B and/or those comprising one, two, four, or more cameras or detectors for example, can be used to measure HRV.
  • the system can be further configured to select one or more appropriate videos and/or audio from a database to play them in the patient display for balancing HRV.
  • systems, methods, and devices described herein can comprise the following: (i) video system for capturing videoplethysmographic signals; (ii) processing system to compute power in the spectral contents of low frequency and high frequency components; (iii) HRV computation unit to compute the ratio of the powers concentrated in the LF component to the HF component; and/or (iv) video selection & display system for balancing HRV.
  • FIG. 23 illustrates example systems, devices, and methods for balancing HRV.
  • an imbalanced autonomic nervous system with a reduced parasympathetic and increased sympathetic tone can result in stress.
• breathing meditation videos can reduce the sympathetic tone.
• asking the patient to breathe slowly with meditative videos can be another way to reduce sympathetic tone.
• parasympathetic dominance can lead to a slowed, under-energetic state, leading to a decrease in respiration and heart rate and an increase in digestion. For example, a patient may fall asleep under parasympathetic dominance.
• a meditative video(s) with fast rhythmic breathing can be selected and displayed to swing back towards a balanced ANS.
  • systems, devices, and methods described herein are configured to determine and/or monitor the rate and/or duration of blinks of a subject’s eye, which can enable an accurate assessment of fatigue and/or drowsiness in the subject.
  • a complete non-contact and/or non-obtrusive measurement technique can be provided for extracting blinking rate and/or the duration of blinks of a subject’s eye.
  • blinking rate and/or the duration of blinks of a subject’s eye can be determined by using one or more motion tracking and/or correction systems, such as those illustrated in FIGS. 1A and 1B and/or those comprising one, two, four, or more cameras or detectors for example.
• the determined rate and/or duration of blinks can be used alone and/or in combination with Heart Rate Variability (HRV).
  • the determined rate and/or duration of blinks can be calibrated to fatigue-related and/or sleep-related metrics, for example to indicate to the medical imaging scanner or therapeutic device operator whether the subject is under fatigue or feeling drowsy or sleepy during the process.
• camera-based methods can allow extraction of pixel intensities near the eye region once the subject’s head is within one or more cameras’ field of view (FOV).
  • the average intensities over time are obtained by directing a region of interest (ROI) in one or all four cameras, for example in a four-camera motion tracking system, to cover one or both eye regions.
  • FIG. 25 provides an illustrative example of images captured with a four-camera motion tracking system, in which the illustrated rectangular regions show areas used for measuring eye blink rate.
  • FIG. 26 is a block diagram illustrating example methods for extracting eye blink rate and/or blink duration.
  • FIG. 26 is a functional block diagram illustrating an example signal processing algorithm comprising steps for extracting eye blink rate and/or blink duration.
  • a video processing system for tracking pixels within one or more region of interests (ROI) can be used by the system at block 2602.
  • the system can integrate pixel intensities within the one or more ROIs and obtain an average pixel intensity time series signal at block 2604.
  • the system can accumulate until a certain time period of data, such as for example 10 seconds, is captured to form a batch with average pixel intensity values at block 2606.
  • the system can be configured to discard 1 second trailing edge data and append 1 second new data at block 2606.
  • the system can perform one or more signal processing steps on the signal at block 2608.
  • the system can be configured to detrend the signal, detect peaks and times at which peaks occur, count the number of peaks from the most recent 1 second data, extract duration of blink and time intervals between successive peaks, and/or compute blink rate (number of blinks per minute) at block 2608.
• one or more processes described in connection with blocks 2606 and 2608 can be repeated every second with 1 second of new data, as illustrated in block 2612.
  • the system can be configured to display eye blink rate, duration, and/or fatigue/drowsiness metrics at block 2610.
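The blink extraction of blocks 2602-2610 could be sketched roughly as below (a single-batch version without the sliding window; the peak-height and peak-spacing thresholds and the synthetic clip are assumed values). A blink appears as a transient dip in the ROI's average intensity, so peaks are detected on the inverted, detrended signal:

```python
import numpy as np
from scipy.signal import find_peaks, detrend

def blink_metrics(intensity, fps):
    """Return (blink count, blinks per minute) from an average
    pixel-intensity time series over an eye ROI."""
    x = -detrend(np.asarray(intensity, dtype=float))  # dips become peaks
    # Assumed thresholds: peaks at least 3 sigma high, >= 0.2 s apart.
    peaks, _ = find_peaks(x, height=3 * np.std(x), distance=int(0.2 * fps))
    duration_s = len(intensity) / fps
    return len(peaks), 60.0 * len(peaks) / duration_s

# Synthetic 50 s clip at 30 fps with 6 brief blinks (as in FIG. 27).
fps = 30
t = np.arange(0, 50, 1 / fps)
signal = np.full_like(t, 100.0)
for blink_t in (5, 12, 20, 28, 37, 45):
    signal[(t > blink_t) & (t < blink_t + 0.2)] -= 20.0  # intensity dip

count, rate = blink_metrics(signal, fps)
print(count, round(rate, 1))  # → 6 7.2
```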
• each batch of average pixel intensities comprises a 5 to 10 second window.
  • traces of eye blinks can be more prominent in the pixel intensity signal.
  • the top graph of FIG. 27 illustrates an example time series signal, showing average pixel intensity in the ROI used for blink rate detection shown with respect to the video frame number for controlled blinks. In this illustrated example, a total of 6 blinks occurred within a 50 second duration.
  • the bottom graph of FIG. 27 illustrates eye blink rate in blinks per minute shown with respect to time.
• the system can apply one or more of the following steps for extracting eye blinking data: (1) accumulate 1 second of new data and discard the previous one second of data from a batch of length approximately 5 to 10 seconds duration; (2) detrend the signal to remove non-stationary components and retain traces comprising large peaks as seen in the top graph of FIG. 27.
• template matching and AI-based classification algorithms can also be used to estimate eye blink rate and duration.
  • a time series signal can be captured for over a minute-long duration by the system.
• the corresponding bottom graphs show the extracted eye blink rate per minute.
  • the illustrated examples in FIG. 28 and FIG. 29 represent data collected from two different subjects.
  • duration of blinks is estimated by thresholding the region where peaks occur.
  • the top graph of FIG. 28 illustrates an example time series signal showing average pixel intensity in the ROI used for blink rate detection shown with respect to the video frame number for a first subject under the camera field of view.
  • the data shown in FIG. 28 is for uncontrolled blinking, in which a high blink rate is affected by high light intensity.
  • the bottom graph of FIG. 28 illustrates eye blink rate in blinks per minute shown with respect to time.
  • the top graph of FIG. 29 illustrates an example time series signal showing average pixel intensity in the ROI used for blink rate detection shown with respect to the video frame number for a second subject under the camera field of view.
  • the data shown in FIG. 29 is for uncontrolled blinking, in which a high blink rate is affected by high light intensity.
  • the bottom graph of FIG. 29 illustrates eye blink rate in blinks per minute shown with respect to time.
  • MRI magnetic resonance imaging
• MRI can be an important radiological examination that is crucial for the medical diagnosis, prognosis, and management of multiple diseases. Due to its superior soft-tissue contrast, MRI can be considered the gold standard for the imaging of neurological diseases, diseases of the musculoskeletal system (spine, small and large joints), oncological diseases, trauma, and infections/inflammations in all body regions. As MRI does not involve any ionizing radiation and is minimally invasive, it can be preferred for the imaging of pediatric patients and patients that require long-term follow-up.
  • the MR coil (or antenna) should be placed as close as possible to the body region of interest.
• these ‘local’ coils may be rigid or flexible.
• typical rigid coils can comprise the Head/Neck coil, the Spine coil, the Shoulder coil, the Knee coil, and the Foot/Ankle coil.
  • flexible coils may be used for the imaging of various body regions which would include the abdomen, heart, pelvis, long bone and/or may also be used to wrap around any joint for musculoskeletal imaging.
  • MRI of the knee provides detailed pictures of the soft tissue structures within the knee joint.
  • the structure of interest can include bone, cartilage, meniscus, tendons or ligaments, muscles or blood vessels.
  • MRI images are used to determine integrity of soft-tissue structures, delineate extent of injury to the knee following trauma, infections and inflammation of the joints and muscles, and tumors of the bone and soft tissues.
  • MRI of the shoulder provides detailed pictures of the structures within the shoulder joint. Structures can include bones, tendons, muscles, and/or blood vessels within the shoulder joint. In some embodiments, MRI of the shoulder joint gives clear views of rotator cuff tears, injuries to the biceps tendon and damage to the glenoid labrum, and/or the soft fibrous tissue rim that helps stabilize the joint.
  • MRI of the shoulder is performed to evaluate degenerative joint disorders, such as arthritis and labral tears, fractures, rotator cuff disorders including tears and impingement, joint abnormalities due to trauma, sports and work related injuries, infections, tumors involving bones and joints, pain, and/or swelling or bleeding in the tissues in and around the joint and decreased motion of the shoulder joint.
• MRI of the hand/wrist may be performed either with the hand/wrist at the side of the patient or in a ‘superman’ position (patient lying in a prone position with the wrist raised over the head). The latter position can be especially challenging for older patients and is prone to patient movement.
• MRI of the breast produces cross-sectional images in all three dimensions (side-to-side, top-to-bottom, and front-to-back).
  • MRI of the breast can offer more valuable information such as for detecting breast cancer and breast abnormalities when compared to other imaging modalities, such as mammography and/or ultrasound.
  • patients may move around during the scan, including, at the minimum, breathing and/or cardiac related motions. Other motions can be restricted for optimum image quality if the patient is cooperative.
  • MRI scan of the abdomen allows physicians to examine the abdominal anatomy to rule out any structural abnormalities.
  • scans may take around or up to 60 minutes or more. Patient movement during a scan can result in loss of optimal image quality.
  • MRI scan of the pelvis allows physicians to examine the pelvis anatomy to rule out any structural abnormalities.
  • scans may take around or up to 60 minutes or more. Patient movement during a scan can result in loss of optimal image quality.
  • MR imaging has opened new horizons in the diagnosis and treatment of many musculoskeletal diseases of the ankle and foot.
• ankle and foot MR imaging can assess abnormalities of the ligaments (e.g., sprain), tendons (e.g., tendinosis, peritendinosis, tenosynovitis, entrapment, rupture, dislocation), and other soft-tissue structures (e.g., anterolateral impingement syndrome, sinus tarsi syndrome, compressive neuropathies [e.g., tarsal tunnel syndrome, Morton neuroma], synovial disorders).
  • routine ankle MR imaging is performed in the axial, coronal, and sagittal planes parallel to the table top.
• the foot is imaged in the oblique axial plane (i.e., parallel to the long axis of the metatarsal bones), oblique coronal plane (i.e., perpendicular to the long axis of the metatarsals), and/or oblique sagittal plane.
  • FIG. 39 illustrates an example(s) of oblique axial 3904, oblique coronal 3902, and oblique sagittal 3906 imaging planes that can be used in the diagnosis and/or treatment of many musculoskeletal diseases of the ankle and foot 3900.
  • the patient is supine with the foot in about 20° of plantar flexion.
• MR imaging can be an invaluable tool for early detection and assessment of a variety of osseous abnormalities, but the optimal image quality may be compromised due to physical motion of the ankle and/or foot.
• a Magnetic Resonance Angiogram (MRA) of the carotids can be performed, for example to detect narrowing of the arteries.
• MRA scans can be similar to MRI in that they create images of soft tissues, bones, and internal body structures.
• MRA of the neck can be used to produce two- or three-dimensional images of the blood vessels. Patient motion during a scan can lead to loss of optimal image quality.
• an MRI technician can ask the patient to hold their breath for a few seconds and then reproduce the same breath hold repeatedly for the duration of the scan. However, this requires patient cooperation. Also, the breath hold may not be reproducible and may not be comfortable for the patient. In addition, not all patients can maintain a breath hold for a useful length of time.
• to facilitate optimal image quality and/or therapeutic treatment without patient cooperation or external means adapted to restrict motion, motion tracking with a high degree of accuracy, with or without the need for markers, can be useful. In some embodiments, high accuracy tracking can improve the imaging and treatment quality obtained and produced by diagnostic equipment, such as any of the imaging technologies discussed herein.
  • the use of high accuracy patient movement tracking technology can improve the application of certain patient therapies, such as radiation treatment, proton treatment, and the like.
  • therapeutic technologies can apply therapies only to the targeted tissue and avoid healthy surrounding tissue.
  • prospective motion correction can be used, which involves tracking of the object of interest and adjusting scan planes in real time or near real-time such that they follow the movement, resulting in images without the motion artifacts.
  • a motion tracking system for medical imaging and/or therapeutic procedure can be used in conjunction with an optical marker.
  • a motion tracking system for medical imaging and/or therapeutic procedure can be marker-less, for example by identifying and/or tracking one or more landmarks on the subject and/or by utilizing AI-based technology.
• a magnetic resonance scanner can use a flexible coil apparatus when scanning a patient, wherein the coil acts as an antenna for obtaining better images of the patient.
  • the flexible coil 3000 resembles a blanket that can be folded, wrapped, rolled, or otherwise deformed to wrap around a body part of a patient.
  • the system is configured to track motion of a flexible coil 3000 without any markers.
  • a flexible coil 3100 can comprise one or more markers 3102 for tracking motion of the flexible coil and can resemble a blanket that can be folded, wrapped, rolled, or otherwise deformed to wrap around a body part of a patient.
  • FIG. 32 illustrates an embodiment(s) of a schematic of a flexible coil resembling a blanket without one or more markers that is held by the scanner operator prior to wrapping around a body part of a patient.
  • the flexible coil can be configured to wrap around a shoulder, upper and/or lower arm, upper and/or lower leg, chest, an abdomen, thorax, boney body part, knee, head, and/or any other body part of a subject.
  • a flexible coil 3300 resembling a blanket, with or without one or more markers can be configured to be wrapped around the hand/wrist of a subject for imaging and/or therapeutic treatment.
  • a flexible coil resembling a blanket with or without one or more markers can be configured to be wrapped around the patient’s lower arm/wrist.
• FIG. 33 illustrates a flexible coil 3300 resembling a blanket, with or without one or more markers.
• a small flexible coil with or without one or more markers can be used for wrapping around the foot/ankle of a subject.
  • a flexible coil resembling a blanket with or without one or more markers can be used for wrapping around the chin of a subject.
  • a wearable, flexible head coil with or without one or more markers can be used for head, brain and jaw imaging of a subject.
  • the flexible coil 3700 comprises an opening for the face.
  • the flexible coil comprises a thermoplastic molded portion.
  • the flexible coil does not comprise openings or holes for allowing direct sight of, or visibility of, or visible path to the skin underneath the flexible coil or body part being wrapped by the flexible coil.
  • the flexible coil or other apparatus herein is configured to wrap tightly to the patient body part.
  • the flexible coil is configured to stick to the patient body part.
  • the flexible coil or other apparatus herein is configured to be attached and/or coupled to the body part of the patient such that there is no movement between the flexible coil and the underlying body part of the patient.
  • the flexible coil or other apparatus herein is configured to move in exactly the same way, or substantially the same way, as the underlying body part is moving.
  • a flexible coil(s) is configured to have one or more markers 3102 attached to the flexible coil 3100 in order for a motion tracking system to track the movement of the marker and in turn the flexible coil, and in turn the underlying patient body part that is wrapped in the flexible coil.
  • the one or more markers are positioned on, attached to, adhered to, velcro-ed on, glued on, embedded in, coupled to, or otherwise positioned on the exterior of the flexible coil such that at least one or more of the markers is visible to one or more detectors configured to track the motion of the marker.
  • a flexible coil 3700 can be used in conjunction with a motion compensation system 100 as described herein.
  • the flexible coil comprising the one or more markers is configured to be worn by a patient as the patient moves from a diagnostic machine to a treatment machine.
  • the flexible coil having one or more markers can be utilized for position coordinate registration when the patient is changing modalities.
  • the flexible coil can enable repeatability of the positioning of the patient’s body part from one scanning modality to another scanning modality and/or from a diagnostic machine to a therapeutic machine.
  • the one or more markers on the flexible coil are configured to allow the treatment machine to use the same alignment coordinates used in the scanning machine in order to ensure that the treatment machine is using the same relative position coordinates used in the diagnostic machine.
  • the one or more markers are used to determine and/or calculate the relative position of the patient in the diagnostic machine and in the treatment machine in order to align the diagnostic machine with the treatment machine.
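The bullets above describe using shared marker positions to align the treatment machine with the coordinates used by the diagnostic machine. The specification does not name a particular algorithm; one conventional choice for this kind of rigid point-set registration is the Kabsch method. The sketch below, with synthetic marker coordinates, is illustrative only:

```python
import numpy as np

def rigid_registration(src, dst):
    """Find rotation R and translation t minimizing ||R @ src_i + t - dst_i||
    (Kabsch algorithm) between two sets of corresponding marker positions."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic example: marker coordinates in the diagnostic machine vs. the
# treatment machine (rotated 10 degrees about z and translated)
diag = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
treat = diag @ R_true.T + np.array([5.0, -2.0, 0.5])
R, t = rigid_registration(diag, treat)
aligned = diag @ R.T + t                     # diagnostic coords re-expressed
assert np.allclose(aligned, treat, atol=1e-9)
```

With the transform recovered, any position planned in the diagnostic coordinate frame can be mapped into the treatment machine's frame.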
  • the flexible coil is configured to be removed after a patient has completed a diagnostic scan and then a flexible coil having one or more exterior markers, or other mask or wrap or flexible mat without an antenna coil but having exterior markers, or just one or more markers, is/are positioned directly on the patient, such that the one or more markers can be utilized to align the treatment machine with the position coordinates used by the scanning machine.
  • the flexible coil having one or more markers, a mat or other wrap apparatus having one or more markers, and/or one or more markers coupled to a patient can be used in conjunction with a robotic patient table that is configured to align a patient properly within a scanning machine and/or treatment machine.
  • the robotic patient table is configured to utilize the one or more markers on the flexible coil, or mat or other wrap device, or one or more markers coupled to the patient in order to facilitate alignment of the patient within the scanning machine and/or the treatment machine.
  • one or more markers can be configured to be attached to a fixed coil device, which is attached to the magnetic resonance scanner and/or the table positioned within the bore of the MRI machine.
  • the MRI machine is prone to vibrations or other movement during normal operation of the MRI, and therefore, the fixed coil that is attached to the scanner machine is subject to movement due to the vibrations or other movement. Accordingly, it can be advantageous to account for such motion of the fixed coil during scanning in order to remove blurriness and/or artifacts from images produced from the scanning.
  • the flexible coil can be positioned on a patient such that the flexible coil is proximate to a joint or the rib cage or the abdomen or the head.
  • with a flexible coil without any openings, it would be impossible to attach markers to the skin of the body that would be visible to a detector configured to track the position of the patient. Accordingly, it would be advantageous to attach one or more markers to the exterior of the flexible coil in order to enable motion tracking of the body part wrapped in the flexible coil.
  • the system can be configured to track the relative position and/or the topology of the body part tightly wrapped by the flexible coil or other device.
  • the systems, devices and methods disclosed herein can enable the tracking of a range of motion of the patient as well as the relative position within the scanner.
  • the flexible coil wrapped around the chest or abdomen can enable the tracking and/or determination and/or estimation of respiratory rate, depth of respiration, and/or other biometric data, and correlating such patient biometric data with other data from the biometric or infrared analysis of the face or head.
  • the flexible coil, or mat or wrap or other device without an antenna, having one or more markers can be configured to wrap tightly around one or more bony parts of the body, for example, the head while leaving room for the eyes and mouth, such that the flexible coil or other apparatus can closely approximate the actual position of the body part, for example, the head.
  • the flexible coil, or mat or wrap or other device without an antenna can comprise one or more cameras and/or other sensors inside the flexible coil or other apparatus, wherein such cameras and/or other sensors are configured to perform additional or more refined tracking of the patient.
  • the flexible coil, or mat or wrap or other device without an antenna is configured to tightly fit, and approximate the underlying body part movement.
  • one or more markers are attached to the flexible coil or other apparatus for tracking overall position and state of movement of the joint or body part (for example, thorax, abdomen, or the like).
  • the flexible coil, or mat or wrap or other device without an antenna is configured to comprise sensors and/or cameras inside for additional tracking sensitivity, for example, flexion / extension sensors for arm / shoulder, knee.
  • the tracking can be tiered: first tracking an object within the scanner, and then tracking a body part within the flexible coil or other apparatus.
  • FIG. 40 illustrates an example(s) of a spine coil embedded in a medical scanner or treatment bed that can be used for spine imaging and/or a potential marker location for lumbar spine imaging.
  • the spine coil 4000 is embedded in the MRI bed.
  • methods for the detection of the respiratory cycles can include the use of a respiratory belt, MR navigators and/or small coils/antennas which are located within the spine coil.
  • one or more markers 4002 can be placed on the subject’s body.
  • FIG. 41 illustrates an example(s) of a neck and head coil that can be used for neck and head imaging and/or potential marker location on the neck of a subject.
  • a neck/head coil 1200 can be used in conjunction with one or more markers 4102 placed on the neck of a subject.
  • swallowing or coughing motions can create image artifacts.
  • respiratory motion is not considered a problem in neck imaging.
  • FIG. 42 illustrates an example(s) of one or more body coils that can be used for body imaging and/or potential marker locations on the abdomen of a subject.
  • one or more body coils 4200, 4202 can be used.
  • one or more body coils 4200, 4202 can be used in conjunction with a spine coil 4000 during a scan.
  • one or more markers can be placed on the body coil.
  • one or more markers 4204, 4206 can be placed on the subject’s body to facilitate capturing motion.
  • Body 6/18/30 coils can generally be used for adults.
  • Flex4 Small/Large can be used for pediatric cardiac scans and/or for general muscular skeleton imaging.
  • FIG. 43 illustrates example(s) of a coil that can be used for hand/wrist imaging and/or potential marker locations on the hand of a subject.
  • a hand/wrist coil 4300 can be rigidly fixed to the scanner bed.
  • other types of hand/wrist coils, such as the hand/wrist coil 4302, can be used for hand/wrist imaging.
  • one or more markers 4304, 4306 can be placed on the wrist to capture motion.
  • FIG. 44 illustrates an example(s) of a foot/ankle coil that can be used for foot/ankle imaging and/or potential marker location on the foot of a subject.
  • a foot/ankle coil 4400 can be used for scanning the foot/ankle.
  • a foot/ankle coil 4400 can be rigid and/or immovable.
  • one or more markers 4402 can be placed on the lower leg of a subject above the flexible coil.
  • FIG. 45 illustrates an example(s) of a knee coil that can be used for knee imaging and/or potential marker location on the thigh region of a subject.
  • a knee coil 4500 can be used for scanning the knee.
  • a knee coil 4500 can be fixed to the scanner bed.
  • one or more markers 4502 can be placed directly above the knee.
  • an opening 4504 may be present at the middle of the anterior portion of the knee coil 4500.
  • one or more holes 4504 can be drilled through the knee coil 4500 so that the motion of the knee cap can be measured directly.
  • one or more markers can be placed on the thigh 4506 of a subject, directly above the knee coil. In some embodiments, one or more other markers can also be placed on the lower limb directly below the knee coil.
  • FIG. 46 illustrates an example(s) of a shoulder coil that can be used for shoulder imaging and/or potential marker location closer to the coil near the shoulder of a subject.
  • a shoulder coil 4600 can be used for scanning the shoulder of a subject.
  • the shoulder coil 4600 can be fixed to the scanner bed.
  • one or more markers 4602 can be affixed close to the shoulder coil on the body surface to detect movement due to respiration as well as other body movement and enable the correction of such movement. In some embodiments, it can be preferred not to have a marker on the coil 4600.
  • FIG. 47 illustrates an example(s) of a coil that can be used for breast imaging and/or potential marker location on the back of a subject.
  • a breast coil 4700 can be used for scanning the breast of a subject.
  • while motion detection can be less crucial in breast imaging, respiratory motion can induce artifacts in high quality imaging.
  • one or more markers on the back of the body can be used to facilitate measuring the respiratory motion.
  • FIG. 48 illustrates an example(s) of a wearable coil and potential marker locations on such a coil.
  • a glove coil 4800 can be used for hand/wrist imaging.
  • a glove coil 4800 can be worn by the hand 4804 of a subject.
  • one or more markers 4802 can be placed on movable joints on the hand to capture motion.
  • accurate synchronization between MR imaging data acquisition and cardiac triggering can reduce image artifacts as described herein.
  • the blood flow from the heart to the periphery results in pulsatile vessel dilation that leads to artifacts if unsynchronized imaging is applied.
  • cardiac triggering is used to acquire image data at a distinct phase of the cardiac cycle, comprising systolic and diastolic phases, assuming that at these points in time the physiology of interest is spatially aligned.
  • measurement hardware for cardiac triggering can be contact-based, as in Doppler ultrasound, pulse oximetry, electrocardiography, or phonocardiography, or non-contact, as in image-based approaches.
  • contact-based hardware can require clinical personnel for its proper application and direct interaction with the patient, which is time-consuming and may result in patient discomfort. In some embodiments, this can be avoided by using one or more MRI-compatible cameras, since they are not affected by magnetohydrodynamic interactions, excitation pulses, or gradient noise, bear no risk of heating the patient's skin, and can be used to image all exposed skin regions.
  • the system can comprise one or more in-bore MRI-compatible cameras, which can be very advantageous when increasing scan times are required to determine the presence of diseases by scanning many parts of the body mentioned above.
  • the system can be configured to acquire biometric data of a subject undergoing a medical scan and/or therapeutic procedure based at least in part on raw motion tracking data, such as video and/or image data.
  • the system can further be configured to obtain a cardioballistics (also called "ballistocardiography" or BCG) and/or cardiac pulsation signal containing systolic and diastolic phases and/or blood pressure signal from the biometric data.
  • the system can be configured to analyze the veins and/or skin and/or hair and/or clothing and/or heat spots and/or skin marks and/or tattoos and/or other movements, topographical changes, infrared and hyperspectral changes and/or micromovements of the face, head, and/or neck to obtain a cardioballistics and/or cardiac pulsation signal and/or blood pressure signal.
  • cardioballistics relates to how the body part of interest moves and/or how much the vessels distend.
  • biometric data can be further analyzed by the system to create a biometric output or new signal, such as ballistic data.
  • Such output or signal can be indicative of the patient’s health while the patient is undergoing a medical scan and/or therapeutic procedure.
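As a rough illustration of deriving a cardiac pulsation signal from raw video data, one common simplified approach (a stand-in, not necessarily the method used by the system described above) averages pixel intensities over a skin region of interest per frame and reads the cardiac rate off the spectrum. The frame data and rate below are synthetic:

```python
import numpy as np

def pulsation_signal(frames, roi):
    """Mean intensity of a skin region-of-interest per video frame."""
    r0, r1, c0, c1 = roi
    return np.array([f[r0:r1, c0:c1].mean() for f in frames])

def dominant_frequency(signal, fps):
    """Strongest non-DC spectral component, e.g. the cardiac rate in Hz."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Synthetic 30 fps clip: ROI brightness pulses at 1.2 Hz (72 bpm)
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = [np.full((64, 64), 100.0) + 2.0 * np.sin(2 * np.pi * 1.2 * ti)
          for ti in t]
sig = pulsation_signal(frames, roi=(16, 48, 16, 48))
hr_hz = dominant_frequency(sig, fps)
assert abs(hr_hz - 1.2) < 0.1
```

The peaks of such a signal could then serve as cardiac trigger points, analogous to the respiratory gating described below.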
  • measurement of motion related to the respiratory signal is useful for long scan times, as in scanning the abdomen, breast, shoulder, spine, neck, etc., for automatically and prospectively gating image acquisition synchronized to respiratory motion.
  • impedance pneumography or special belts with pressure sensing are used for creating gating signals connected with respiratory motions.
  • such systems can generally require wired connections that can limit the use.
  • probes or belts can in some instances obscure the anatomical regions of interest.
  • an MR-compatible video camera with respiratory cycle detection algorithms can be used.
  • the system can be configured to detect characteristic points of the desired respiratory signal containing the inspiration and expiration cycle.
  • end-inspiration or end-expiration can be used as trigger points for the scanning system.
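A minimal sketch of detecting end-inspiration trigger points from a respiratory trace, assuming a clean 1-D signal; the peak-separation parameter and the synthetic breathing rate are illustrative assumptions, not details from the specification:

```python
import numpy as np

def trigger_points(resp, min_separation):
    """Indices of local maxima (end-inspiration) usable as scan triggers.
    min_separation suppresses spurious peaks closer than one breath."""
    peaks = []
    for i in range(1, len(resp) - 1):
        if resp[i] > resp[i - 1] and resp[i] >= resp[i + 1]:
            if not peaks or i - peaks[-1] >= min_separation:
                peaks.append(i)
    return peaks

# Synthetic respiratory trace: 0.25 Hz breathing sampled at 20 Hz for 20 s
fs = 20
t = np.arange(0, 20, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)
triggers = trigger_points(resp, min_separation=2 * fs)
assert len(triggers) == 5   # one end-inspiration peak per 4-second breath
```

End-expiration triggers would be obtained the same way from the local minima (or by negating the signal).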
  • affixing a marker on the flexible coil may not be possible, or may be found unnecessary for tracking the motion of the patient's body part, as there may be good optical landmarks or features available to track within the camera field of view.
  • Artificial Intelligence, such as the dynamic feature-based deep learning algorithms described herein, can be used.
  • a multi-modal multi-disciplinary (MMMD) feature representation and a motion and/or biometric signal detection model can be utilized.
  • the feature learning and features to motion and biometric mapping model produces six degrees of freedom (6DOF) motion coordinates or biometric waveforms in real-time of the body part and can be used to reduce or eliminate imaging artifacts.
  • FIG. 49 is a flow diagram illustrating an example method of a motion detection and correction system used for scanning various parts of the body.
  • the system can be configured to identify one or more regions and/or targets of interest on the body and/or on the flexible coil for tracking motion attributes at block 4902.
  • this may be coordinated in whole or in part by the use of the right kind of flexible coil described herein, and the right kind of region on the body near the flexible coil, when the region within the coil cannot be used for placing a marker for extracting features to track motion.
  • an appropriate marker and location can be selected on the flexible coil, or near the coil but on the body part, for marker-based tracking.
  • a target image region can be identified on the body or on the flexible coil where features containing motion information can be present.
  • the system can receive the one or more regions or targets of interest selected and/or captured by the image sensor at block 4904.
  • the system can extract one or more features of interest from the received target of interest data at block 4906.
  • features can comprise movement of coordinates of the centroid of each marker.
  • features can be MMMD (multi-modal multi-disciplinary) features extracted from the image.
  • the system can be configured to compute one or more motion attributes at block 4908.
  • motion attributes can involve, for example, 6DOF (degrees of freedom) motion coordinates, only gating signal for cardiac gating of imaging system, and/or respiratory signal for respiratory synchronization of imaging system.
  • the motion attributes can be communicated to the device controller to adjust the scanner attributes to account for motion of the subject.
  • motion attributes are presented to the treatment device controller.
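The FIG. 49 flow (identify targets, receive sensor data, extract features, compute motion attributes, forward them to the device controller) can be sketched as a pipeline. Every stage body below is an illustrative stand-in, and the class and method names are hypothetical:

```python
class MotionCorrectionPipeline:
    """Sketch of the FIG. 49 flow. Each method corresponds to a block;
    the implementations are placeholders, not the actual algorithms."""

    def identify_targets(self, body_part):            # block 4902
        # e.g. a marker region on or near the flexible coil
        return [{"region": body_part, "marker": True}]

    def sense(self, targets):                         # block 4904
        # stand-in for image-sensor capture of each target region
        return [{"target": t, "pixels": [[0, 1], [1, 0]]} for t in targets]

    def extract_features(self, sensed):               # block 4906
        # e.g. marker-centroid coordinates per sensed region
        return [(0.5, 0.5) for _ in sensed]

    def compute_attributes(self, features):           # block 4908
        # e.g. 6DOF coordinates and/or cardiac/respiratory gating signals
        return {"6dof": features, "gating": "respiratory"}

    def run(self, body_part, controller):
        targets = self.identify_targets(body_part)
        sensed = self.sense(targets)
        features = self.extract_features(sensed)
        attributes = self.compute_attributes(features)
        controller.append(attributes)  # scanner/treatment device controller
        return attributes

controller_log = []
attrs = MotionCorrectionPipeline().run("knee", controller_log)
assert controller_log[0]["gating"] == "respiratory"
```

In a real system the controller would adjust scanner attributes (gating, gradient updates) rather than log dictionaries.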
  • FIG. 50 is a flow diagram for identification of a region and/or target of interest for tracking.
  • the system and/or scanning technician receives location information about the part of the body where imaging is scheduled at block 5002. For example, for imaging the knee with marker-based approach, the system and/or technician can be informed of the scheduled region of the body for scanning.
  • a region on the body or on the flexible coil is identified as target of interest.
  • a region is identified for placing the marker.
  • the sensing system is evaluated to make sure the identified region and/or target of interest is adequate for extracting and tracking motion attributes.
  • FIG. 51 is a flow diagram for identifying features of interest from a sensed image, for example from block 4904 described above.
  • the system can receive an identified region and/or target of interest for tracking.
  • the target of interest comprises the flexible coil region and/or the part of the body region within the field of view of the image sensor.
  • the system can identify a region of interest within the target region identified at block 5102.
  • the region of interest can be the region around one or more markers in the thigh region.
  • the system can identify one or more features of interest for tracking.
  • a feature of interest can be the centroid of the marker.
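A centroid-of-marker feature, as mentioned above, can be computed as an intensity-weighted average of the pixels above a threshold. This sketch assumes a single bright optical marker within the region of interest; the threshold value is arbitrary:

```python
def marker_centroid(image, threshold):
    """Intensity-weighted centroid (row, col) of pixels above threshold --
    a simple feature of interest for a bright optical marker."""
    total = wr = wc = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v > threshold:
                total += v
                wr += r * v
                wc += c * v
    if total == 0:
        return None          # no marker visible in this region
    return (wr / total, wc / total)

# 5x5 frame with a bright 2x2 marker blob at rows 1-2, cols 2-3
frame = [[0] * 5 for _ in range(5)]
for r in (1, 2):
    for c in (2, 3):
        frame[r][c] = 200
assert marker_centroid(frame, threshold=50) == (1.5, 2.5)
```

Tracking the centroid frame over frame yields the movement-of-coordinates features referenced at block 4906.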
  • FIG. 52 is a flow diagram depicting methods used in computing the motion attributes.
  • one or more features are received by the system, for example from block 4906.
  • 6DOF motion coordinates are computed at block 5204.
  • cardiac cycles and a gating signal are computed at block 5206.
  • respiratory cycles and gating signal are computed.
  • one or more motion attributes are saved. In some embodiments, only one or more of the processes described in connection with blocks 5204, 5206, and/or 5208 are computed.
  • dynamic feature detection and/or AI-based algorithms can be used to facilitate motion tracking without the need to apply markers; features for dynamic feature detection are described below.
  • motion tracking can depend on having an accurate set of reference points.
  • one or more markers can be placed on a subject to provide the reference points.
  • artificial intelligence (AI) based systems and methods can be used for tracking motion and/or biometric measurements of a human subject during biomedical imaging with a sequence of video images and without the use of external markers.
  • the embodiments can include a multi-modal multi-disciplinary (MMMD) feature representation and a motion and/or biometric signal detection model.
  • the feature learning and features to motion and biometric mapping model can produce six degrees of freedom (6DOF) motion coordinates or biometric waveforms in real-time and can be used to reduce or eliminate imaging artifacts.
  • the aspects described may include an offline and a real-time process (e.g., processing data proximate to the time the data was collected).
  • a data set can be generated using a marker-based system in a laboratory environment, or using the marker-based tracking system operating in the field.
  • the field unit may be specifically instrumented to collect and organize the data for training the algorithm.
  • the data may include image frames from an array of sensors included in the instrument (e.g., four cameras).
  • the data may include a corresponding set of quaternions, 6DOF values, or biometric waveforms.
  • data collected from various study subjects is used to perform feature engineering and develop motion models for transforming features to 6DOF signals.
  • Feature engineering generally refers to a process of identifying, through machine learning or other artificial intelligence methods, those aspects of collected data that are likely to provide sufficient accuracy to serve as the marker for a given subject for an instrument.
  • capturing an adequate amount of training data may be an offline process. Salient features or signatures derived from the training data using feature engineering may be used to train a feature detection/extraction and motion or biometric signal model. In some embodiments, using such features for training helps to ensure that the derived features are easy to compute in real time and are sensitive to motion components or biometric signals with more discriminant power than other potential feature combinations or derivations.
  • features are used as inputs to a detection model which maps features to trained outputs. The trained outputs may include detected motion or features likely to identify motion for a data set.
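One simple instance of a model mapping engineered features to trained 6DOF outputs is a linear least-squares fit; the specification does not commit to a particular model, and the feature and ground-truth data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline: features from the marker-based system with known 6DOF outputs
n_frames, n_features = 200, 8
X = rng.normal(size=(n_frames, n_features))   # engineered features per frame
W_true = rng.normal(size=(n_features, 6))     # hidden features-to-6DOF map
Y = X @ W_true                                # 6DOF ground truth

# Train the features-to-6DOF detection model (least squares is the
# simplest possible mapping; deep models would replace this step)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Real-time: map a new frame's features to 6DOF motion coordinates
x_new = rng.normal(size=n_features)
pose = x_new @ W                              # 6 values: 3 translations, 3 rotations
assert np.allclose(pose, x_new @ W_true, atol=1e-8)
```

The same training/inference split applies to biometric waveforms: the model output columns would be waveform samples rather than pose coordinates.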
  • efficient learning techniques are used to engineer efficient features and the corresponding detection model.
  • engineered features are extracted from input images using algorithms.
  • a number of features and their parameters are identified and basis functions or feature maps are generated.
  • the number of principal components and basis vectors can be determined during training.
  • weights along with number of principal components and basis vectors can also be used.
  • the number of weights depends on the number of principal components and basis vectors. Actual values of the weights can be calculated during real-time tracking using the basis vectors from offline training.
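The offline/real-time split described above (basis vectors fixed during training, weights computed during tracking) can be illustrated with a principal-component decomposition; the component count and training data below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Offline training: learn basis vectors from flattened training frames
train = rng.normal(size=(100, 64))        # 100 frames, 64 pixels each
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
n_components = 5                          # chosen during training
basis = Vt[:n_components]                 # principal basis vectors

# Real-time tracking: the weights for each incoming frame are simply
# projections onto the offline-learned basis
def realtime_weights(frame):
    return (frame - mean) @ basis.T

frame = train[0]
w = realtime_weights(frame)               # one weight per basis vector
recon = mean + w @ basis                  # low-dimensional reconstruction
assert w.shape == (n_components,)
```

Changes in the weight vector from frame to frame then carry the motion information that downstream models map to 6DOF coordinates or biometric signals.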
  • the system may consider, for each layer: (1) number of layers, (2) number of neurons in each layer, (3) weights and bias values (or feature maps), and (4) activation function (or feature function).
  • the number of layers and/or number of neurons in each layer can define autoencoder network or structure.
  • weights and bias values (or feature maps) and activation function (or feature function) can be grouped under feature maps and feature functions respectively.
  • offline training generates these quantities which can be transferred to a motion tracking device for real-time motion / biometric tracking.
  • the feature maps may be represented as lookup tables. Deep learning features can be cryptic in nature and difficult to interpret or give physical meaning.
  • a Convolutional Neural Network comprising many layers can extract a subject's eyes, eyebrows, or nose as features, which may provide accurate markers.
  • the eyes of the subject may be detected using an image feature detection algorithm which processes the pixel information for an image to identify those pixels likely to represent the eyes of a subject.
  • the motion of these features may then be mapped to trained output such as 6DOF quantities or biometric waveforms on a frame-by-frame basis.
  • the system can be configured to utilize learning from data collected in the field to subsequently improve the detection model so that the model is repurposed/retrained for new circumstances.
  • AI-based techniques such as "transfer learning" may be used for learning from data collected from the field (e.g., actual image data and/or biometric signals for a subject detected by an instrument).
  • the pre-trained feature extraction algorithm and detection model serve as the starting point for new use or further training.
  • the pre-training of the feature extraction and detection model may have been done for motion and/or biometric tracking in an MRI environment for a 60cm wing; re-using the model for a 70cm wing may require retraining the model.
  • a model trained for use in MRI environment can be re-used with transfer learning for CT environment or PET environment.
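A minimal numeric sketch of transfer learning in this spirit: keep the offline-learned feature extractor fixed and refit only the output mapping on a small data set from the new environment. A deep-learning pipeline would freeze network layers instead; this linear stand-in, on synthetic data, only illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(2)

def pretrain(X, Y, n_components):
    """Offline: learn a fixed feature extractor (PCA basis) plus an
    output mapping from features to 6DOF/biometric targets."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:n_components]
    F = (X - mean) @ basis.T
    W, *_ = np.linalg.lstsq(F, Y, rcond=None)
    return mean, basis, W

def transfer(mean, basis, X_new, Y_new):
    """Keep the feature extractor frozen; refit only the output mapping
    on a small data set from the new environment."""
    F = (X_new - mean) @ basis.T
    W_new, *_ = np.linalg.lstsq(F, Y_new, rcond=None)
    return W_new

# Pretrain on a large data set from the original environment ...
X_old = rng.normal(size=(300, 32))
Y_old = rng.normal(size=(300, 6))
mean, basis, W_old = pretrain(X_old, Y_old, n_components=8)

# ... then adapt to the new environment with far fewer samples
X_new = rng.normal(size=(30, 32))
Y_new = rng.normal(size=(30, 6))
W_new = transfer(mean, basis, X_new, Y_new)
assert W_new.shape == W_old.shape == (8, 6)
```

The same pattern would apply when moving a model from an MRI environment to a CT or PET environment, as the bullet above describes.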
  • FIG. 2 is a block diagram illustrating a computer hardware system configured to run software for implementing one or more embodiments of systems, devices, and methods for tracking and analyzing subject motion during a medical imaging scan and/or therapeutic procedure. While FIG. 2 illustrates one embodiment of a computing system 200, it is recognized that the functionality provided for in the components and modules of computing system 200 may be combined into fewer components and modules or further separated into additional components and modules.
  • the computing system 200 comprises a motion tracking and/or biometrics analysis system module 206 that carries out the functions described herein, including any one of techniques described above.
  • the motion tracking and/or biometrics analysis system module 206 and/or other modules may be executed on the computing system 200 by a central processing unit 202 discussed further below.
  • module refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, COBOL, CICS, Java, Lua, C or C++.
  • a software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Software instructions may be embedded in firmware, such as an EPROM.
  • hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors.
  • the modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
  • the computing system 200 also comprises a mainframe computer suitable for controlling and/or communicating with large databases, performing high volume transaction processing, and generating reports from large databases.
  • the computing system 200 also comprises a central processing unit (“CPU”) 202, which may comprise a conventional microprocessor.
  • the computing system 200 further comprises a memory 204, such as random access memory (“RAM”) for temporary storage of information and/or a read only memory (“ROM”) for permanent storage of information, and a mass storage device 208, such as a hard drive, diskette, or optical media storage device.
  • the modules of the computing system 200 are connected to the computer using a standards based bus system.
  • the standards based bus system could be Peripheral Component Interconnect (PCI), MicroChannel, SCSI, Industrial Standard Architecture (ISA) and Extended ISA (EISA) architectures, for example.
  • the computing system 200 comprises one or more commonly available input/output (I/O) devices and interfaces 212, such as a keyboard, mouse, touchpad, and printer.
  • the I/O devices and interfaces 212 comprise one or more display devices, such as a monitor, that allows the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example.
  • the I/O devices and interfaces 212 comprise a microphone and/or motion sensor that allow a user to generate input to the computing system 200 using sounds, voice, motion, gestures, or the like.
  • the I/O devices and interfaces 212 also provide a communications interface to various external devices.
  • the computing system 200 may also comprise one or more multimedia devices 210, such as speakers, video cards, graphics accelerators, and microphones, for example.
  • the computing system 200 may run on a variety of computing devices, such as, for example, a server, a Windows server, a Structured Query Language server, a Unix server, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a cell phone, a smartphone, a personal digital assistant, a kiosk, an audio player, an e-reader device, and so forth.
  • the computing system 200 is generally controlled and coordinated by operating system software, such as z/OS, Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Windows 7, Windows 8, Linux, BSD, SunOS, Solaris, Android, iOS, BlackBerry OS, or other compatible operating systems.
  • the operating system may be any available operating system, such as MAC OS X.
  • the computing system 200 may be controlled by a proprietary operating system.
  • Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.
  • the computing system 200 is coupled to a network 216, such as a LAN, WAN, or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link 214.
  • the network 216 communicates with various computing devices and/or other electronic devices via wired or wireless communication links.
  • the network 216 is communicating with one or more computing systems 217 and/or one or more data sources 219.
  • Access to the motion tracking and/or biometrics analysis system module 206 of the computer system 200 by computing systems 217 and/or by data sources 219 may be through a web-enabled user access point such as the computing systems’ 217 or data source’s 219 personal computer, cellular phone, smartphone, laptop, tablet computer, e-reader device, audio player, or other device capable of connecting to the network 216.
  • Such a device may have a browser module that is implemented as a module that uses text, graphics, audio, video, and other media to present data and to allow interaction with data via the network 216.
  • the browser module may be implemented as a combination of an all points addressable display such as a cathode-ray tube (CRT), a liquid crystal display (LCD), a plasma display, or other types and/or combinations of displays.
  • the browser module may be implemented to communicate with input devices 212 and may also comprise software with the appropriate interfaces which allow a user to access data through the use of stylized screen elements such as, for example, menus, windows, dialog boxes, toolbars, and controls (for example, radio buttons, check boxes, sliding scales, and so forth).
  • the browser module may communicate with a set of input and output devices to receive signals from the user.
  • the input device(s) may comprise a keyboard, roller ball, pen and stylus, mouse, trackball, voice recognition system, or pre-designated switches or buttons.
  • the output device(s) may comprise a speaker, a display screen, a printer, or a voice synthesizer.
  • a touch screen may act as a hybrid input/output device.
  • a user may interact with the system more directly such as through a system terminal connected to the score generator without communications over the Internet, a WAN, or LAN, or similar network.
  • the system 200 may comprise a physical or logical connection established between a remote microprocessor and a mainframe host computer for the express purpose of uploading, downloading, or viewing interactive data and databases online in real time.
  • the remote microprocessor may be operated by an entity operating the computer system 200, including the client server systems or the main server system, and/or may be operated by one or more of the data sources 219 and/or one or more of the computing systems 217.
  • terminal emulation software may be used on the microprocessor for participating in the micro-mainframe link.
  • computing systems 217 that are internal to an entity operating the computer system 200 may access the motion tracking and/or biometrics analysis system module 206 internally as an application or process run by the CPU 202.
  • a Uniform Resource Locator can include a web address and/or a reference to a web resource that is stored on a database and/or a server.
  • the URL can specify the location of the resource on a computer and/or a computer network.
  • the URL can include a mechanism to retrieve the network resource.
  • the source of the network resource can receive a URL, identify the location of the web resource, and transmit the web resource back to the requestor.
  • a URL can be converted to an IP address, and a Domain Name System (DNS) can look up the URL and its corresponding IP address.
  • URLs can be references to web pages, file transfers, emails, database accesses, and other applications.
  • the URLs can include a sequence of characters that identify a path, a domain name, a file extension, a host name, a query, a fragment, a scheme, a protocol identifier, a port number, a username, a password, a flag, an object, a resource name, and/or the like.
  • the systems disclosed herein can generate, receive, transmit, apply, parse, serialize, render, and/or perform an action on a URL.
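As an illustration of the URL handling described above, the following sketch uses Python's standard library to split a URL into the components enumerated here. The `describe_url` function name and the example URL are hypothetical, introduced only for illustration; they are not part of this application.

```python
from urllib.parse import urlparse, parse_qs

def describe_url(url):
    """Split a URL into the components enumerated above: scheme,
    credentials, host name, port number, path, query, and fragment."""
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,
        "username": parts.username,
        "password": parts.password,
        "host": parts.hostname,
        "port": parts.port,
        "path": parts.path,
        # parse_qs maps each query key to a list of its values.
        "query": parse_qs(parts.query),
        "fragment": parts.fragment,
    }

# Hypothetical URL for a networked scanner resource.
info = describe_url("https://user:pw@scanner.example.com:8443/subjects/42/motion?region=head#summary")
print(info["host"], info["port"], info["path"])
```

Resolving such a host name to an IP address, as described for DNS above, would then be a separate lookup step (for example via the operating system's resolver).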
  • a cookie, also referred to as an HTTP cookie, a web cookie, an internet cookie, or a browser cookie, can include data sent from a website and/or stored on a user’s computer. This data can be stored by the user’s web browser while the user is browsing.
  • the cookies can include useful information for websites to remember prior browsing information, such as a shopping cart on an online store, clicking of buttons, login information, and/or records of web pages or network resources visited in the past. Cookies can also include information that the user enters, such as names, addresses, passwords, credit card information, etc. Cookies can also perform computer functions. For example, authentication cookies can be used by applications (for example, a web browser) to identify whether the user is already logged in (for example, to a web site).
  • the cookie data can be encrypted to provide security for the consumer.
  • Tracking cookies can be used to compile historical browsing histories of individuals.
  • Systems disclosed herein can generate and use cookies to access data of an individual.
  • Systems can also generate and use JSON web tokens to store authenticity information, HTTP authentication as an authentication protocol, IP addresses to track session or identity information, URLs, and the like.
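The signed-token mechanism behind JSON web tokens mentioned above can be sketched with Python's standard library alone. The `sign_token` and `verify_token` names and the secret value are illustrative assumptions; a deployed system would use a maintained JWT library rather than this minimal HS256-style sketch.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # Base64url encoding without padding, as used in compact JWTs.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    """Build a compact header.payload.signature token (HS256-style)."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}, separators=(",", ":")).encode())
    body = _b64url(json.dumps(payload, separators=(",", ":")).encode())
    signature = _b64url(hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return f"{header}.{body}.{signature}"

def verify_token(token: str, secret: bytes) -> bool:
    """Recompute the HMAC over header.payload and compare in constant time."""
    header, body, signature = token.split(".")
    expected = _b64url(hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest())
    return hmac.compare_digest(signature, expected)

token = sign_token({"session": "abc123", "user": 7}, b"server-secret")
print(verify_token(token, b"server-secret"))    # True
print(verify_token(token, b"tampered-secret"))  # False
```

A server holding the secret can thus confirm that session or identity information in the token has not been altered by the client.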
  • the network 216 may communicate with other data sources or other computing devices.
  • the computing system 200 may also comprise one or more internal and/or external data sources.
  • one or more of the data repositories and the data sources may be implemented using a relational database, such as DB2, Sybase, Oracle, CodeBase, and Microsoft® SQL Server, as well as other types of databases such as, for example, a flat file database, an entity-relationship database, an object-oriented database, and/or a record-based database.
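A minimal sketch of such a relational data repository can be given with Python's built-in `sqlite3` module standing in for the commercial databases named above. The `motion_events` table, its columns, and its rows are hypothetical, chosen only to illustrate storing and querying per-region motion records.

```python
import sqlite3

# In-memory relational store standing in for a data repository.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE motion_events ("
    "  subject_id INTEGER, target_region TEXT, amplitude_mm REAL)"
)
# Hypothetical per-region motion records for two subjects.
conn.executemany(
    "INSERT INTO motion_events VALUES (?, ?, ?)",
    [(1, "head", 0.4), (1, "chest", 1.2), (2, "head", 0.9)],
)
# Relational query: event count and peak amplitude per target region.
rows = conn.execute(
    "SELECT target_region, COUNT(*), MAX(amplitude_mm) "
    "FROM motion_events GROUP BY target_region ORDER BY target_region"
).fetchall()
print(rows)  # -> [('chest', 1, 1.2), ('head', 2, 0.9)]
conn.close()
```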
  • Conditional language such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.
  • the headings used herein are for the convenience of the reader only and are not meant to limit the scope of the inventions or claims.
  • the methods disclosed herein may include certain actions taken by a practitioner; however, the methods can also include any third-party instruction of those actions, either expressly or by implication.
  • the ranges disclosed herein also encompass any and all overlap, sub-ranges, and combinations thereof.
  • Language such as “up to,” “at least,” “greater than,” “less than,” “between,” and the like includes the number recited. Numbers preceded by a term such as “about” or “approximately” include the recited numbers and should be interpreted based on the circumstances (e.g., as accurate as reasonably possible under the circumstances, for example ±5%, ±10%, ±15%, etc.).
  • a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members.
  • “at least one of: A, B, or C” is intended to cover: A, B, C, A and B, A and C, B and C, and A, B, and C.
  • Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is understood within the context as used in general to convey that an item, term, etc. may be at least one of X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Robotics (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Urology & Nephrology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Disclosed herein are systems, devices, and methods for tracking and/or analyzing images and/or videos of a subject, for example during a medical imaging scan, a therapeutic procedure, or otherwise. In some embodiments, the systems, devices, and methods described herein are configured to track and/or analyze images and/or videos of a subject during a medical imaging scan or therapeutic procedure using one or more flexible coils, with or without one or more markers placed on the subject, and/or to extract one or more motion attributes from video data received from one or more target regions, and/or to dynamically adjust one or more scan parameters of the medical imaging scanner or therapeutic device based on the one or more extracted motion attributes.
PCT/US2019/020593 2018-03-05 2019-03-04 Systems, devices, and methods for tracking and analyzing motion of a subject during a medical imaging scan and/or therapeutic procedure WO2019173237A1 (fr)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201862638303P 2018-03-05 2018-03-05
US62/638,303 2018-03-05
US201862677467P 2018-05-29 2018-05-29
US62/677,467 2018-05-29
US201862721981P 2018-08-23 2018-08-23
US62/721,981 2018-08-23
PCT/US2019/013147 WO2019140155A1 (fr) 2018-01-12 2019-01-11 Systèmes, dispositifs et méthodes de suivi et/ou d'analyse d'images et/ou de vidéos d'un sujet
USPCT/US2019/013147 2019-01-11

Publications (1)

Publication Number Publication Date
WO2019173237A1 true WO2019173237A1 (fr) 2019-09-12

Family

ID=67846358

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/020593 WO2019173237A1 (fr) 2018-03-05 2019-03-04 Systems, devices, and methods for tracking and analyzing motion of a subject during a medical imaging scan and/or therapeutic procedure

Country Status (1)

Country Link
WO (1) WO2019173237A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009129457A1 (fr) * 2008-04-17 2009-10-22 The Government Of The United States Of America, As Represented By The Secretary, Department Of Health And Human Services, National Institutes Of Health Motion correction in MRI using a camera
JP2015526708A (ja) 2012-07-03 2015-09-10 The State Of Queensland Acting Through Its Department Of Health Motion correction for medical imaging
US20150327948A1 (en) * 2014-05-14 2015-11-19 Stryker European Holdings I, Llc Navigation System for and Method of Tracking the Position of a Work Target
US9785247B1 (en) * 2014-05-14 2017-10-10 Leap Motion, Inc. Systems and methods of tracking moving hands and recognizing gestural interactions
US9782141B2 (en) * 2013-02-01 2017-10-10 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021061947A1 (fr) * 2019-09-24 2021-04-01 Carnegie Mellon University System and method for analyzing medical images based on spatio-temporal data
CN114600171A (zh) * 2019-09-24 2022-06-07 Carnegie Mellon University System and method for analyzing medical images based on spatio-temporal data
DE102019214992A1 (de) * 2019-09-30 2021-04-01 Siemens Healthcare Gmbh Method for providing a 3D data set, imaging system configured therefor, and computer program product
US11823401B2 (en) * 2019-11-28 2023-11-21 Siemens Healthcare Gmbh Patient follow-up analysis
US20210166406A1 (en) * 2019-11-28 2021-06-03 Siemens Healthcare Gmbh Patient follow-up analysis
US11928839B2 (en) 2020-01-19 2024-03-12 Udisense Inc. Measurement calibration using patterned sheets
DE102020205091A1 2020-04-22 2021-10-28 Siemens Healthcare Gmbh Method for generating a control signal
EP3923294A1 (fr) * 2020-06-11 2021-12-15 Koninklijke Philips N.V. Patient critical state detection
WO2021249839A1 (fr) * 2020-06-11 2021-12-16 Koninklijke Philips N.V. Patient critical state detection
WO2021262242A1 (fr) * 2020-06-22 2021-12-30 Siemens Medical Solutions Usa, Inc. Digital display for a medical imaging system bore
EP4024404A1 (fr) * 2021-01-04 2022-07-06 Koninklijke Philips N.V. Method and system for controlling medical imaging sessions and optimizing patient discomfort
WO2022144280A1 (fr) * 2021-01-04 2022-07-07 Koninklijke Philips N.V. Method and system for controlling medical imaging sessions and optimizing patient discomfort
EP4014875A1 (fr) * 2021-05-31 2022-06-22 Siemens Healthcare GmbH Method for controlling a medical imaging examination of a subject, medical imaging system, and computer-readable data storage medium
US11730440B2 (en) 2021-05-31 2023-08-22 Siemens Healthcare Gmbh Method for controlling a medical imaging examination of a subject, medical imaging system and computer-readable data storage medium
US20220399124A1 (en) * 2021-06-11 2022-12-15 Siemens Healthcare Gmbh Risk determination for a ct-examination
TWI840710B (zh) 2021-10-25 2024-05-01 吳福興 Method of a system for increasing the strength of measured physiological micro-vibration signals
EP4353152A1 (fr) * 2022-10-10 2024-04-17 Koninklijke Philips N.V. Medical image acquisition unit assistance apparatus
WO2024078956A1 (fr) * 2022-10-10 2024-04-18 Koninklijke Philips N.V. Medical image acquisition unit assistance apparatus

Similar Documents

Publication Publication Date Title
WO2019173237A1 (fr) Systems, devices, and methods for tracking and analyzing motion of a subject during a medical imaging scan and/or therapeutic procedure
WO2019140155A1 (fr) Systems, devices, and methods for tracking and/or analyzing images and/or videos of a subject
Nicolò et al. The importance of respiratory rate monitoring: From healthcare to sport and exercise
Zhang et al. Highly wearable cuff-less blood pressure and heart rate monitoring with single-arm electrocardiogram and photoplethysmogram signals
Jeong et al. Introducing contactless blood pressure assessment using a high speed video camera
Reyes et al. Tidal volume and instantaneous respiration rate estimation using a volumetric surrogate signal acquired via a smartphone camera
Lewis et al. A novel method for extracting respiration rate and relative tidal volume from infrared thermography
RU2656559C2 Method and device for determining vital signs
Jafari Tadi et al. Accelerometer‐Based Method for Extracting Respiratory and Cardiac Gating Information for Dual Gating during Nuclear Medicine Imaging
Leelaarporn et al. Sensor-driven achieving of smart living: A review
JP6054584B2 Treatment system comprising a patient interface for acquiring a patient's life state
Moreno et al. Facial video-based photoplethysmography to detect HRV at rest
CN105491942A (zh) 用于监测对象的血液动力学状态的监测系统和方法
EP3082586A1 (fr) Système et procédés pour mesurer des paramètres physiologiques
JP2016517325A (ja) 多面的生理的刺激を行うシステムおよびシグネチャ、および脳の健康評価
Blanik et al. Remote vital parameter monitoring in neonatology–robust, unobtrusive heart rate detection in a realistic clinical scenario
JP3221096U Smart inspection and measurement equipment
Zhou et al. The noninvasive blood pressure measurement based on facial images processing
Shao et al. Noncontact physiological measurement using a camera: a technical review and future directions
Nam et al. Biological‐Signal‐Based User‐Interface System for Virtual‐Reality Applications for Healthcare
JPH07124126 Medical biological information detection device, diagnostic device, and treatment device
Bosi et al. Real-time monitoring of heart rate by processing of Microsoft Kinect™ 2.0 generated streams
Fernandez Rojas et al. A systematic review of neurophysiological sensing for the assessment of acute pain
Nowara et al. Seeing beneath the skin with computational photography
Bennett Non-Contact bed-based monitoring of vital signs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19764473

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19764473

Country of ref document: EP

Kind code of ref document: A1