WO2019140155A1 - Systems, devices, and methods for tracking and/or analyzing images and/or videos of a subject - Google Patents


Info

Publication number
WO2019140155A1
WO2019140155A1 (PCT/US2019/013147)
Authority
WO
WIPO (PCT)
Prior art keywords
subject
data
medical imaging
motion
computer
Application number
PCT/US2019/013147
Other languages
English (en)
Inventor
Jeffrey N. Yu
Michael G. ENGELMANN
William C. MELOHN
Barry M. WEINMAN
Lalit Keshav MESTHA
Ulf Peter Gustafsson
Original Assignee
Kineticor, Inc.
Application filed by Kineticor, Inc. filed Critical Kineticor, Inc.
Priority to PCT/US2019/020593 (WO2019173237A1)
Publication of WO2019140155A1


Classifications

    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/0064 Body surface scanning
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A61B5/02405 Determining heart rate variability
    • A61B5/02416 Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B5/0255 Recording instruments specially adapted therefor
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A61B5/1072 Measuring distances on the body, e.g. measuring length, height or thickness
    • A61B5/1079 Measuring physical dimensions using optical or photographic means
    • A61B5/1102 Ballistocardiography
    • A61B5/1176 Recognition of faces
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/5205 Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • A61B6/5264 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G06T2207/30076 Plethysmography
    • G06T2207/30196 Human being; Person

Definitions

  • Some embodiments described herein relate to the field of image and/or video analysis, and, in particular, to systems, devices, and methods, for tracking and/or analyzing subject images and/or videos.
  • the disclosure relates generally to the field of image and/or video analysis, and more specifically to systems, devices, and methods for tracking and/or analyzing subject images and/or videos, for example during a medical imaging scan and/or therapeutic procedure.
  • Magnetic resonance imaging (MRI) is a medical imaging technique used in radiology to visualize internal structures of the body in detail.
  • An MRI scanner or magnetic resonance (MR) scanner is a device in which the patient or a portion of the patient’s body is positioned within a powerful magnet where a magnetic field is used to align the magnetization of some atomic nuclei (usually hydrogen nuclei - protons) and radio frequency magnetic fields are applied to systematically alter the alignment of this magnetization. This causes the nuclei to produce a rotating magnetic field detectable by the scanner and this information is recorded to construct an image of the scanned region of the body.
  • Other modalities for performing medical imaging can comprise a computed tomography (CT) scanner, a positron emission tomography (PET) scanner, a single-photon emission computed tomography (SPECT) scanner, and/or a digital angiographic scanner. These scans can take several minutes (for example, 40 minutes, an hour, or more in some scanners), and in some devices any significant movement can ruin the images and require the scan to be repeated.
  • radiation therapy can be applied to a targeted tissue region.
  • radiation therapy can be dynamically applied in response to patient movements.
  • the tracking of patient movements does not have a high degree of accuracy. Accordingly, the use of such systems can result in the application of radiation therapy to non-targeted tissue regions, thereby unintentionally harming healthy tissue while intentionally affecting diseased tissue.
  • the same limitations are also true for proton therapies and other therapies.
  • United States Patent No. 9,734,589 issued on August 15, 2017, describes systems, devices, and methods that adaptively compensate for subject motion which may be utilized in combination with and/or in addition to one or more embodiments disclosed herein.
  • United States Patent No. 9,734,589 is incorporated herein in its entirety and forms part of this specification.
  • an accurate and reliable method of determining the dynamic position and orientation of a patient’s head or other body portion can be helpful in compensating for subject motion during such procedures and/or determining one or more biometric data and/or analyses.
  • certain embodiments disclosed herein are directed to systems, devices, and methods that provide practical optical tracking capability, for example using an optical marker configured to be attached to the subject of interest and/or a landmark on the subject, such as a facial feature(s) and/or optical flow vectors derived by analyzing videos or optical signals containing blood volume information obtained by analyzing videos and/or images.
  • tracking data collected of the subject can be used to control, modify, and/or improve results of a medical imaging scanner and/or therapeutic device.
  • tracking data collected by a motion tracking device and/or system can be stored for further analysis regarding the patient or subject, for example for biometrics data analysis and/or using the data for improving modification and/or control of the medical imaging scanner and/or therapeutic device with respect to correcting motion artifacts.
  • a standalone camera or detector system separate from a medical imaging scanner and/or therapeutic device can be used to collect subject image and/or video data, which can be further analyzed or processed to obtain biometrics data.
  • a computer-implemented method for determining biometric data using one or more motion tracking detectors of a motion correction system for a medical imaging scanner during a medical imaging scan comprises: obtaining, by one or more motion tracking detectors of the motion correction system, one or more images of a subject during the medical imaging scan; identifying, by the motion correction system, one or more landmarks on the subject on the one or more images and tracking a position change of the one or more landmarks; determining, by the motion correction system, motion data of the subject during the medical imaging scan based at least in part on the tracked position change of the one or more landmarks on the subject on the one or more images; dynamically generating, by the motion correction system, one or more updated scan parameters for controlling the medical imaging scanner to prospectively correct one or more motion artifacts in a medical image obtained by the medical imaging scan, wherein the one or more updated scan parameters is generated based at least in part on the determined motion data of the subject; transmitting, by the motion correction system to a biometric data analysis system, the determined motion data of the subject; and processing, by the biometric data analysis system, the determined motion data of the subject to determine one or more biometric parameters of the subject during the medical imaging scan.
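The sequence of steps in this method can be sketched in simplified form. Every function name below is a hypothetical placeholder for illustration, and the landmark detection and scan-parameter model are deliberately toy versions of what the specification describes:

```python
# Hypothetical sketch of the claimed motion-correction pipeline.
# Landmark detection is stubbed out; a real system would use the
# optical detectors and tracking described in the specification.

def identify_landmarks(image):
    """Stub: return (x, y) positions of tracked landmarks in a frame."""
    return image["landmarks"]

def motion_from_landmarks(prev, curr):
    """Per-landmark displacement between consecutive frames."""
    return [(cx - px, cy - py) for (px, py), (cx, cy) in zip(prev, curr)]

def updated_scan_parameters(motion):
    """Prospective correction: shift the scan field of view by the
    mean tracked displacement (a deliberately simplified model)."""
    n = len(motion)
    dx = sum(m[0] for m in motion) / n
    dy = sum(m[1] for m in motion) / n
    return {"fov_shift_x": -dx, "fov_shift_y": -dy}

def run_scan(frames):
    motion_log = []  # later transmitted for biometric analysis
    prev = identify_landmarks(frames[0])
    params = {}
    for frame in frames[1:]:
        curr = identify_landmarks(frame)
        motion = motion_from_landmarks(prev, curr)
        motion_log.append(motion)
        params = updated_scan_parameters(motion)  # sent to scanner controller
        prev = curr
    return motion_log, params
```

The key structural point this sketch shows is that the same motion log serves two consumers: the scanner controller (prospective correction) and the downstream biometric analysis.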
  • the one or more biometric parameters comprises one or more of: respiration rate, pulse rate, anxiety levels, blood pressure, heart rate variability, eye blink rate, or eye blink duration.
  • the one or more biometric parameters comprises one or more of: position of the subject, height of the subject, width of the subject, body volume of the subject, estimated body mass index of the subject, facial recognition of the subject, or a patient barcode of the subject.
  • the biometric data analysis system is separate from the motion correction system.
  • the determined motion data of the subject does not comprise any protected health information (PHI) of the subject.
  • the transmitting, by the motion correction system to a biometric data analysis system, the determined motion data of the subject comprises: encrypting, by the motion correction system, the determined motion data of the subject; transmitting, by the motion correction system to the medical imaging scanner, the encrypted motion data of the subject; and transmitting, by the medical imaging scanner to the biometric data analysis system, the encrypted motion data of the subject.
  • the processing the determined motion data of the subject to determine one or more biometric parameters of the subject during the medical imaging scan is conducted asynchronously from the medical imaging scan. In some embodiments, the processing the determined motion data of the subject to determine one or more biometric parameters of the subject during the medical imaging scan is conducted as a batch process. In some embodiments, the processing the determined motion data of the subject to determine one or more biometric parameters of the subject during the medical imaging scan is conducted in real-time.
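The batch mode of processing can be illustrated with overlapping sliding windows over the recorded motion stream, run asynchronously after (or during) the scan. The window and step sizes and the `estimate` function below are illustrative assumptions, not values from the specification:

```python
def sliding_windows(samples, window, step):
    """Yield overlapping windows over a motion-data stream, as in an
    overlapping-batch scheme (window and step sizes are illustrative)."""
    for start in range(0, len(samples) - window + 1, step):
        yield samples[start:start + window]

def batch_rates(samples, window=8, step=4, estimate=lambda w: sum(w) / len(w)):
    """Batch mode: apply a per-window biometric estimator to every
    window after the data has been collected. In real-time mode the
    same estimator would instead run on each window as it fills."""
    return [estimate(w) for w in sliding_windows(samples, window, step)]
```

Because the windows overlap (step < window), each estimate shares samples with its neighbors, which smooths the resulting rate traces.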
  • the computer-implemented method further comprises utilizing, by the biometric data analysis system, a network time protocol of the medical imaging scanner to synchronize motion data determined from one or more images obtained by one or more motion tracking detectors.
  • the medical imaging scan comprises one or more of MRI, PET, PET-CT, or CT.
  • the computer-implemented method further comprises generating, by the biometric data analysis system, one or more ballistic data representative of a state of one or more organs of the subject during the medical imaging scan.
  • the computer-implemented method further comprises combining, by the biometric data analysis system, the one or more ballistic data with the medical image obtained by the medical imaging scan.
  • the one or more updated scan parameters for controlling the medical imaging scanner is further generated based at least in part on one or more generated ballistic data.
  • the computer-implemented method further comprises generating, by the biometric data analysis system, a cardio-ballistic signal of the subject during the medical imaging scan. In some embodiments, the computer-implemented method further comprises generating, by the biometric data analysis system, respiratory ballistic data of the subject during the medical imaging scan.
  • the computer-implemented method further comprises generating, by the biometric data analysis system, a videoplethysmographic (VPG) signal of the subject during the medical imaging scan.
  • the computer-implemented method further comprises improving a quality of the VPG signal by sequentially illuminating pulsing light on a region of interest on the subject during the medical imaging scan.
  • a duration between pulses of the pulsing light is controlled to be greater than a thermal relaxation time of tissue of the region of interest.
  • an illumination-on time of the pulsing light is controlled to be shorter than a thermal relaxation time of tissue of the region of interest.
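The two timing constraints stated above (inter-pulse gap greater than, and illumination-on time shorter than, the tissue thermal relaxation time) can be expressed as a small validator. The numeric values in the usage below are illustrative assumptions, not values from the specification:

```python
def valid_pulse_timing(on_time_s, gap_s, thermal_relaxation_s):
    """True if a pulsed-illumination schedule respects both constraints:
    the duration between pulses exceeds the tissue thermal relaxation
    time, and each illumination-on interval is shorter than it."""
    return gap_s > thermal_relaxation_s and on_time_s < thermal_relaxation_s
```

For example, with an assumed thermal relaxation time of 10 ms, a 1 ms pulse repeated every 20 ms satisfies both constraints, while a 20 ms pulse does not.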
  • FIG. 1A illustrates an embodiment of a schematic diagram depicting a side view of a system for tracking and/or analyzing subject images and/or videos;
  • FIG. 1B illustrates an embodiment of a schematic diagram depicting a front view of a system for tracking and/or analyzing subject images and/or videos;
  • FIG. 2 is a block diagram illustrating a computer hardware system configured to run software for implementing one or more embodiments of systems, devices, and methods for tracking and analyzing subject images and/or videos;
  • FIG. 3 illustrates an example(s) of motion tracking signals from 6-DOF coordinates
  • FIG. 4 illustrates an example(s) of cardiac signal (and power spectral density) band-limited to contain frequency components within 0.75 to 2 Hz after processing y- direction signal;
  • FIG. 5 illustrates an example(s) of respiratory signal (and power spectral density) band-limited to contain frequency components within 0.05 to 0.3 Hz after processing z-direction signal;
  • FIG. 6 illustrates an example(s) of a measured ballistocardiography (BCG) waveform measured using a modified electronic weighing scale for one heartbeat
  • FIG. 7A illustrates an example(s) of ballistocardiography (BCG) waveform obtained from y-channel motion signal (H, I, J, K, and L waves are marked. M & N waves can also be seen in the same waveform) and detrended but unfiltered y-channel signal;
  • FIG. 7B illustrates an example(s) of BCG waveform from y-channel motion signal marked with H, I, J, K, and L waves;
  • FIG. 8 illustrates a schematic of overlapping batch sequence diagram with sliding windows
  • FIG. 9 illustrates an example(s) of respiration gating signal
  • FIG. 10 illustrates an example(s) of absorption coefficient of hemoglobin and water shown with respect to wavelength
  • FIG. 11 illustrates an example(s) of raw signals with videoplethysmography (VPG) markers from each ROI captured with sequentially switching LEDs in the NIR wavelength band and an example(s) of raw signals with BCG markers;
  • VPG videoplethysmography
  • FIG. 12 illustrates an example(s) of raw motion signal and raw signal for ROI of size 112 x 112 pixels with VPG markers;
  • FIG. 13 illustrates an example(s) of VPG signals overlapped with detrended signal but unfiltered at different ROIs extracted using one or more methods described herein;
  • FIG. 14 illustrates an example(s) of a VPG signal (top) and respiration signal (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8;
  • FIG. 15 illustrates an example(s) of pulse rate / heart rate (top) and respiration rate (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8;
  • FIG. 16 is a schematic diagram illustrating intervals for pulsing illumination
  • FIG. 17 is a flowchart illustrating an example embodiment(s) of pulsing illumination to obtain clearer raw VPG signals
  • FIG. 18A illustrates example images obtained with pulsing illumination
  • FIG. 18B illustrates example raw VPG signals obtained with pulsing illumination
  • FIG. 19A illustrates example images obtained with continuous illumination
  • FIG. 19B illustrates example raw VPG signals obtained with continuous illumination
  • FIG. 20A illustrates an example unpolarized image of tissue
  • FIG. 20B illustrates an example polarized image of the same tissue from FIG. 20A;
  • FIG. 21 is a schematic diagram illustrating an example imaging system comprising a polarizer and an analyzer
  • FIG. 22A illustrates an example of how the peak power in low frequency (LF) and high frequency (HF) components can be different under sympathetic and/or parasympathetic influence;
  • FIG. 22B illustrates an example of how the peak power in low frequency (LF) and high frequency (HF) components can be different under sympathetic and/or parasympathetic influence;
  • FIG. 23 illustrates example frequency contents and their range and associations with the autonomic nervous system (ANS);
  • FIG. 24 illustrates example systems, devices, and methods for balancing
  • FIG. 25 illustrates example images captured with a four-camera motion tracking system, in which the illustrated rectangular regions show areas used for measuring eye blink rate;
  • FIG. 26 is a block diagram illustrating example methods for extracting eye blink rate and/or blink duration
  • FIG. 27 illustrates example data collected for eye blink rate detection
  • FIG. 28 illustrates example data collected for eye blink rate detection
  • FIG. 29 illustrates example data collected for eye blink rate detection.
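One plausible way to extract eye blink rate from the rectangular eye regions shown in FIG. 25 is to count downward crossings of the mean region intensity (a blink darkens the eye region as the eyelid covers the sclera). The threshold-crossing scheme and the numbers in the usage below are an illustrative sketch, not necessarily the method the patent uses:

```python
def blink_count(intensity, threshold):
    """Count blinks as downward crossings of an intensity threshold:
    each dip below the threshold is counted once, however long it lasts."""
    blinks = 0
    below = False
    for v in intensity:
        if v < threshold and not below:
            blinks += 1
            below = True
        elif v >= threshold:
            below = False
    return blinks

def blink_rate_per_minute(intensity, threshold, fps):
    """Blink rate from a per-frame mean-intensity trace sampled at fps."""
    duration_min = len(intensity) / fps / 60.0
    return blink_count(intensity, threshold) / duration_min
```

Blink duration could be obtained from the same trace by measuring how many consecutive samples stay below the threshold during each dip.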
  • the systems disclosed herein can be adapted and configured to track patient translations with accuracies of about 0.1 mm and angle accuracies of about 0.1 degrees in order to better apply radiation therapy, proton therapy, or any other therapy to the targeted tissue or area of the body.
  • motion tracking data collected of a subject can be used to control, modify, and/or improve results of a medical imaging scanner and/or therapeutic device.
  • tracking data collected by a motion tracking device and/or system in conjunction with a medical imaging scanner and/or therapeutic device can be stored and/or further analyzed, for example for biometrics data analysis and/or using the data for improving modification and/or control of the medical imaging scanner and/or therapeutic device with respect to correcting motion artifacts.
  • a standalone camera or detector system separate from a medical imaging scanner and/or therapeutic device can be used to collect subject image and/or video data, which can be further analyzed or processed to obtain biometric data.
  • FIG. 1A is a schematic diagram illustrating a side view of a medical imaging scanner and/or medical therapeutic device 104 as part of a motion compensation system 100 for the same.
  • FIG. 1B is a schematic diagram illustrating a front view of a medical imaging scanner and/or medical therapeutic device 104 as part of a motion compensation system 100 for the same that is configured to detect and account for false movements for motion correction during a medical imaging scan or therapeutic procedure.
  • the system can comprise a standalone motion tracking system 102 that can be used to track motion data for generating biometric data without being used in conjunction with a medical imaging scanner and/or medical therapeutic device 104
  • the motion compensation system 100 illustrated in FIGS. 1A and 1B can comprise a motion tracking system 102, a scanner, a scanner controller 106, one or more detectors 108, one or more motion tracking markers or landmarks 110, and/or a biometric data analysis system 116.
  • the biometric data analysis system 116 can be part of or separate from the motion tracking system 102 and/or medical imaging scanner and/or medical therapeutic device 104.
  • one or more markers 110 can be attached and/or otherwise placed on a subject 112.
  • the one or more markers 110 can be placed on the face of a subject 112 for imaging or therapeutic procedures directed to the head or brain of the subject.
  • the one or more markers 110 can be placed on other portions of a body of a subject 112 for imaging or therapeutic procedures directed to other portions of the body of the subject 112.
  • the system 100 can be configured to track the motion of one or more landmark features 110 on the subject 112, without using attachable markers.
  • the subject 112 can be positioned to lie on a table 114 of a medical imaging scanner and/or medical therapeutic device 104.
  • the medical imaging scanner and/or medical therapeutic device 104 can be, for example, a magnetic resonance imaging scanner or MRI scanner.
  • a three-dimensional coordinate system or space can be applied to a subject that is positioned inside a medical imaging scanner and/or medical therapeutic device 104.
  • the center or substantially center of a particular portion of the subject 110 for observation can be thought of as having coordinates of (0, 0, 0).
  • a z-axis can be imposed along the longitudinal axis of the medical imaging scanner and/or medical therapeutic device 104.
  • the z-axis can be positioned along the length of the medical imaging scanner and/or medical therapeutic device 104 and along the height of a subject or patient that is positioned within the medical imaging scanner and/or medical therapeutic device 104, thereby essentially coming out of the medical scanner and/or medical therapeutic device 104.
  • an x-axis can be thought of as being positioned along the width of the medical imaging scanner and/or medical therapeutic device 104 or along the width of the patient or subject that is positioned within the medical imaging scanner and/or medical therapeutic device 104.
  • a y-axis can be thought of as extending along the height of the medical imaging scanner and/or medical therapeutic device 104.
  • the y-axis can be thought of as extending from the patient or subject located within the medical imaging scanner and/or medical therapeutic device 104 towards the one or more detectors 108.
  • a motion tracking device and/or system 102 can be configured to collect motion data and/or subject images and/or videos, for example during a medical imaging scan and/or therapeutic procedure or otherwise.
  • the data collected from the actual scanning and/or therapeutic process can be further analyzed and/or manipulated after that data has been collected.
  • the data can be sent back to a computation repository in some embodiments for further analysis.
  • the system can be configured to conduct further analysis of the motion tracking data, in addition to simply obtaining data as to where one or more cameras or other motion tracking detectors are pointing and how accurately they see the target.
  • such data can be used to determine respiration rate, heart rate (also called “pulse rate”), position of the subject or a portion thereof as a function of time, blood pressure, heart rate variability, health of the cardiopulmonary system, and/or various other biometrics simply by using the intensity variation due to jitter/motion or intensity variation due to light absorption that is observed in the image.
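The band limits quoted for FIGS. 4 and 5 (cardiac, 0.75 to 2 Hz; respiratory, 0.05 to 0.3 Hz) suggest a simple sketch of how such components could be separated from a motion trace. The ideal FFT-masking filter below is an assumption for illustration; a deployed system would likely use a properly designed band-pass filter:

```python
import numpy as np

def bandpass(signal, fs, f_lo, f_hi):
    """Keep only frequency components in [f_lo, f_hi] Hz by zeroing
    FFT bins outside the band (an idealized band-pass filter)."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Band limits per the ranges quoted for FIGS. 4 and 5.
def cardiac_component(y_signal, fs):
    return bandpass(y_signal, fs, 0.75, 2.0)

def respiratory_component(z_signal, fs):
    return bandpass(z_signal, fs, 0.05, 0.3)
```

Applied to a motion trace containing both a ~1.2 Hz cardiac-band oscillation and a ~0.2 Hz respiratory-band oscillation, the two functions each recover their own component and suppress the other.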
  • such data can be used to automatically, semi-automatically, or otherwise determine position of the subject or a portion thereof with respect to a medical imaging scanner or therapeutic device, whether the subject is touching a wall of the medical imaging scanner or therapeutic device, length or height of a subject, weight of a subject, subject body volume, estimated body mass index (BMI), facial recognition, and/or subject barcode identification.
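The BMI estimate mentioned above follows from the standard formula once height and weight are available; here both inputs would themselves be camera-derived estimates. The body-density value below is an illustrative assumption, not a figure from the specification:

```python
def estimated_bmi(weight_kg, height_m):
    """Standard BMI formula: weight (kg) divided by height (m) squared.
    In this context both inputs are estimates derived from image data."""
    return weight_kg / (height_m ** 2)

def weight_from_volume(body_volume_m3, density_kg_per_m3=985.0):
    """Very rough weight estimate from camera-derived body volume,
    assuming an average human body density (illustrative value)."""
    return body_volume_m3 * density_kg_per_m3
```

For example, an estimated 0.07 m³ body volume gives roughly 69 kg, and at 1.75 m height a BMI near 22.9.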
  • the system can be configured to obtain one or more of such biometrics data and/or conduct analyses in a non-invasive and/or invasive manner.
  • data and/or tracking data collected by a motion tracking and/or correction device can comprise video and/or image data (for example, high-definition video or the like) of a subject of a medical imaging scan or therapeutic procedure, tracking data of motion of one or more targets or markers on the subject, tracking data of motion of one or more landmarks, such as facial or other body features, of the subject, audio or sound data, temperature or thermal data of the subject, atmospheric data, light data, infrared data (including infrared data for pulsation and/or blood movement), hyperspectral data, multispectral data, moisture, sweat, patient overheating, hemoglobin wavelength, follicle motion, water detection, and/or time associated with any of the aforementioned data.
  • one or more biometrics data and/or other data relating to the subject can be observed through one or more correction algorithms and/or derived by using one or more correction algorithms.
  • correction algorithms are described in United States Patent No. 9,734,589, which is hereby incorporated herein in its entirety.
  • for example, if a subject is sitting in a medical imaging scanner and/or therapeutic device and breathing, his or her chest will rise and fall, causing an affixed target and/or optical landmark, or optical flow vectors derived by analyzing video frames, to move up and down.
  • This pattern of movement can be streamed as video data generated from the motion tracking system and/or device, and such data can be analyzed and processed in order to determine respiratory pattern, rate, and/or related features of the subject.
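As an illustrative sketch of how such a streamed trace could be analyzed for respiratory rate (the peak-counting approach, frame rate, and synthetic data below are assumptions, not part of the disclosure):

```python
# Sketch: estimate respiratory rate from the vertical position of a tracked
# marker or landmark over time. A production system would band-pass filter
# the trace and reject motion outliers before counting peaks.
import math

def respiratory_rate_bpm(vertical_positions, fps):
    """Count local maxima (peak inhalations) in a vertical-displacement
    trace and convert to breaths per minute."""
    peaks = 0
    for i in range(1, len(vertical_positions) - 1):
        prev, cur, nxt = vertical_positions[i - 1:i + 2]
        if cur > prev and cur > nxt:  # simple local-maximum test
            peaks += 1
    duration_min = len(vertical_positions) / fps / 60.0
    return peaks / duration_min

# Synthetic chest motion: 0.25 Hz breathing (15 breaths/min) sampled at 30 fps
fps = 30
trace = [math.sin(2 * math.pi * 0.25 * (n / fps)) for n in range(fps * 60)]
print(round(respiratory_rate_bpm(trace, fps)))  # → 15
```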
  • one or more biometric data and/or data relating to the state of the subject can be collected while under observation in a medical imaging scanner and/or therapeutic device or system.
  • an optical marker can be attached to a particular body indicator of a subject.
  • an optical marker can be attached to the stomach or skin of a subject; in markerless systems, a landmark on the subject can be used in lieu of a marker, or a group of optical flow vectors can be obtained by analyzing video frames; and such marker, landmark, or markerless flow vectors can be observed by one or more detection methods in order to determine biometric data for cardiac and/or respiratory systems.
  • the system can be configured to determine an anxiety index of the subject, for example based on respiration and/or heart rate variability. Similarly, the amount of motion of the subject may indicate some other aspects from a neurological point of view.
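One hedged illustration of such an anxiety index, derived from heart-rate variability; the RMSSD measure is standard, but the scoring thresholds here are hypothetical, not values from the disclosure:

```python
# Sketch: map short-term heart-rate variability (RMSSD over R-R intervals)
# to a coarse anxiety index. Thresholds are illustrative assumptions.
import math

def rmssd_ms(rr_intervals_ms):
    """Root mean square of successive differences of R-R intervals (ms),
    a standard short-term HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def anxiety_index(rr_intervals_ms):
    """Coarse 0-2 index: lower variability suggests higher sympathetic
    arousal (hypothetical thresholds)."""
    hrv = rmssd_ms(rr_intervals_ms)
    if hrv < 20:
        return 2  # high
    if hrv < 50:
        return 1  # moderate
    return 0      # low

calm = [800, 850, 790, 860, 810, 845]     # widely varying R-R intervals (ms)
anxious = [700, 702, 699, 701, 700, 703]  # very regular, low HRV
print(anxiety_index(calm), anxiety_index(anxious))  # → 0 2
```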
  • in order to process the motion tracking and/or image data for analysis, such data can be transferred from the motion tracking device and/or system to a computer system configured to process the data.
  • the data for further analysis, for one or more purposes and/or features described herein, can be much larger than is necessary simply to diagnose the state of the cameras or detectors for motion tracking.
  • by looking at that larger collection of information, the system can surmise one or more biometric data and/or other data relating to the subject as described herein.
  • the larger collection of data can be used to diagnose a state of the motion tracking system or device and/or the state of the medical imaging scanner and/or therapeutic device used in conjunction with the motion tracking system or device.
  • a motion tracking system or device communicates with a medical imaging scanner’s host and/or a therapeutic device or system’s host periodically, for example once per frame, blasting correction information, which can be based on the alignments of the cameras and other data.
  • a motion tracking system or device can be configured to tell the scanner where the subject is relative to the target, and therefore where the target is relative to what the scanner is looking for. In some embodiments, that information is processed on the scanner host against the image, and that is what provides the fundamental correction.
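A minimal sketch of what such a per-frame correction message could look like; the field names and JSON encoding are assumptions, since the disclosure does not specify a wire format:

```python
# Hypothetical per-frame correction message: a 6-DOF pose update
# (translation + rotation) plus an alignment flag, packaged for the
# scanner or therapeutic device host.
import json

def correction_message(frame, tx, ty, tz, rx, ry, rz, within_tolerance):
    """Package where the subject is relative to the target as a compact
    per-frame record."""
    return json.dumps({
        "frame": frame,
        "translation_mm": [tx, ty, tz],
        "rotation_deg": [rx, ry, rz],
        "aligned": within_tolerance,
    })

msg = correction_message(42, 0.1, -0.3, 0.0, 0.02, 0.0, 0.01, True)
decoded = json.loads(msg)
print(decoded["frame"], decoded["aligned"])  # → 42 True
```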
  • the motion tracking system or device could also send additional information, such as metadata and/or additional images, around and/or outside of that frame, giving more information than just the basics of whether the camera was properly aligned and/or within tolerances, etc.
  • additional information can be tagged onto the frame, for example at the host of the actual scanner or therapeutic device.
  • such additional information can be combined at a later point in time on another computer system and/or processor.
  • the system can comprise a processor that sits next to the scanner or therapeutic device on the same LAN segment that would collect this information, process it, and generate useful biometric data.
  • a processor or computer system for analyzing the additional data may not necessarily have to be physically located within the scanner. Rather, in some embodiments, the processor or computer system for analyzing the additional data can be located anywhere in the processing chain, because it does not have to operate in real time relative to the image correction; the analysis could be done at any time in some embodiments. As such, in some embodiments, biometric analysis can be conducted in the cloud. In other words, the system can be configured to produce a series of interesting information which can be sent back to an appropriate location and/or system for processing.
  • such additional data and/or metadata can comprise extended data that is not necessarily and/or typically processed at the connected medical imaging scanner and/or therapeutic device having to do with tracking of the camera.
  • a subject or patient may move or breathe in between frames of a medical imaging scanner or therapeutic device. That movement or breathing can be detected in some embodiments.
  • the system can be configured to determine that the cameras are still aligned within tolerance, which can be reported to determine that the patient was breathing at that particular time.
  • emotion and/or anxiety can also be reported and/or determined.
  • the system can determine additional aspects of the state of the patient or subject and also the quality of the image from the additional data.
  • the additional information or metadata can comprise image data, video data, and/or information or data from a correction algorithm that is tracking the target via one or more cameras or detectors of the motion tracking system. It can be advantageous to process the data from the correction algorithm rather than video data, due to the size of the data and related computational costs, in certain embodiments. If all that is being tracked is metadata, relatively small amounts of data can be transmitted, for example kilobytes per second, which can be reassembled for further analysis without actually having image and/or video data of the subject.
  • the metadata or tracking data that can be processed for further analysis may not comprise any protected health information (PHI).
  • the metadata or tracking data for further analysis does not comprise a photograph of a person, or even a person’s chest, or even the target; rather the metadata or tracking data can be the relative motion of the target as seen from the view of the camera of the motion tracking system. This can be important because it can allow transferring the metadata or tracking data anywhere for processing in any way, without necessarily referring back to the actual patient and/or without raising various kinds of concerns under the Health Insurance Portability and Accountability Act (HIPAA).
  • this can be further advantageous because, in order to analyze trending data to determine and/or improve the efficacy of the motion tracking and correction system in terms of its ability to do image corrections, it can be helpful to be able to correlate that with various aspects about the patient or subject without running into HIPAA concerns. As such, the data for further analysis can be limited such that the system collects information that would be generally useful for the patient’s health and for the motion correction algorithm and its continuous improvement, without any PHI.
  • a motion tracking and correction system can be configured to obtain raw metadata, which can be tracking data as viewed from the perspective of each of the individual cameras or detectors of the system.
  • the raw metadata does not necessarily need to be processed in real-time or even in the motion tracking and correction system itself in some embodiments. Rather, in certain embodiments, the raw metadata can be transferred from the motion tracking and correction system, for example at kilobytes per second speeds, up to the cloud and/or other processing system.
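A back-of-envelope calculation illustrating why such metadata stays in the kilobytes-per-second range while raw video does not; the record layout, camera count, frame rate, and video resolution are all illustrative assumptions:

```python
# Sketch: compare the bandwidth of a compact per-camera tracking record
# against uncompressed video. All sizes are assumptions.
record_bytes = 6 * 4 + 8 + 2   # six float32 pose values + timestamp + status flags
cameras, fps = 4, 60

metadata_rate_kib = record_bytes * cameras * fps / 1024.0       # KiB/s
video_rate_mib = 1280 * 720 * 1.5 * fps / (1024.0 * 1024.0)     # MiB/s (YUV420, uncompressed)

print(round(metadata_rate_kib, 1), round(video_rate_mib, 1))  # → 8.0 79.1
```

Even with generous per-record padding, the metadata stream remains three to four orders of magnitude smaller than video, which is what makes cloud transfer and asynchronous processing practical.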
  • the raw metadata in certain embodiments, can be free of patient privacy data and thereby not raise any HIPAA concerns in processing and/or transmitting the data.
  • that data can be processed in the cloud and/or other computing system, for example by utilizing one or more algorithms in order to identify one or more various biometric parameters of interest.
  • biometric parameters can comprise heart rate, respiration, pulse rate, anxiety levels, and/or the like.
  • the additional data and/or metadata can be processed asynchronously relative to the medical imaging scan, therapeutic procedure, and/or standalone acquisition of the motion data.
  • processing of the additional data and/or metadata may not necessarily be synchronized with the frame of the medical imaging scanner, therapeutic device, and/or motion detection system in some embodiments.
  • inter-frame information can be more helpful for certain analyses in some cases.
  • Being able to process the data asynchronously can be advantageous in certain embodiments, for example because certain computational analyses can be intense to determine one or more features or characteristics. As such, it can be advantageous to be able to transmit the data to another computing system and/or location where computation is not an issue.
  • the system can be configured to analyze the additional data or metadata to compute information or conclusions about the data that may be beneficial to a doctor or a patient.
  • the system can be configured to analyze the additional data or metadata to compute information about the scanner or a therapeutic device and its operation, and whether the correction is having positive effects over time.
  • the additional data or metadata can be used to determine whether the motion tracking and correction system is performing consistently, whether the medical imaging scanner or therapeutic device is taking the information from the motion tracking and correction system and processing it correctly, whether certain parameters are changing over time, whether a service call is required, whether some sort of intervention is required in terms of the data path, or the like.
  • the additional data or metadata is transmitted from the motion tracking and correction system through a network connection to a connected medical imaging scanner or therapeutic device.
  • the motion tracking and correction system may not have a network connection to anything else.
  • the scanner or therapeutic device host can be configured to determine and transmit the message or metadata to an appropriate processing unit for processing.
  • the processing unit configured to analyze and process the metadata or additional data to determine biometrics and/or other information about the subject or patient can be a computing unit, standalone computer system, in the cloud, or the like.
  • the scanner or therapeutic device host is configured not to handle or manipulate the metadata or additional data.
  • the scanner or therapeutic device host may, in some embodiments, encrypt the metadata or additional data as an intermediary prior to sending to a processing unit.
  • this can be the case where the motion tracking and correction system only has one interface with the scanner or therapeutic device host, while the scanner or therapeutic device host can have a plurality of interfaces.
  • the scanner or therapeutic device host can have a local interface as well as a broader network interface that can connect to the Internet or other systems, while the motion tracking and correction system may not.
  • the additional data or metadata can be sent from the motion tracking and correction system to the medical imaging scanner and/or therapeutic device host as a messaging package, such as syslog or rsyslog.
  • the motion tracking and correction system and/or the medical imaging scanner and/or therapeutic device operate Linux and/or Ubuntu.
  • the medical imaging scanner and/or therapeutic device is configured to recognize the additional data or metadata and simply relay, rather than process, it; the medical imaging scanner and/or therapeutic device can be configured to wait to receive and process only correction data related to adjusting for motion of the subject, for example sequence and/or focus correction of the scanner and/or therapeutic device.
  • the data stream that the motion tracking and correction system and the medical imaging scanner and/or therapeutic device share can comprise all the correction information.
  • the medical imaging scanner and/or therapeutic device host can receive this data and correct the image scan or therapeutic procedure accordingly.
  • the metadata or additional data described herein for determining biometric information, for example, can simply be stored on the motion tracking and correction system and collected in a file, and this can be turned on and/or off for debugging purposes.
  • both syslog or rsyslog and UDP can be utilized.
  • a real-time data stream assigned for providing the data the medical imaging scanner and/or therapeutic device needs now in order to correct the frame can be blasted via a UDP broadcast.
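A sketch of such a UDP broadcast in Python; the port, payload format, and use of JSON are assumptions, and the demo sends to loopback rather than a true broadcast address so that it runs on any host:

```python
# Sketch: blast a per-frame correction packet onto the local segment via UDP.
import json
import socket

def broadcast_correction(payload, port=5005, addr="255.255.255.255"):
    """Send one correction packet; returns the number of bytes sent."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    data = json.dumps(payload).encode("utf-8")
    try:
        sent = sock.sendto(data, (addr, port))
    finally:
        sock.close()
    return sent

# Demo uses loopback instead of the broadcast address so it runs anywhere.
sent = broadcast_correction({"frame": 7, "translation_mm": [0.0, 0.1, -0.2]},
                            addr="127.0.0.1")
print(sent)
```

UDP fits the real-time channel here because a lost frame's correction is stale by the time it could be retransmitted; the metadata channel, by contrast, can use a reliable messaging package such as syslog.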
  • additional information or metadata that was collected from the motion tracking and correction system during a particular period of operation, for example including inter-frames that can be further processed to determine biometric data among others, can be forwarded by the medical imaging scanner and/or therapeutic device as a messaging package to a processing unit.
  • such additional information or metadata can be transmitted to another computing unit, other than the medical imaging scanner and/or therapeutic device, that is connected to the motion tracking and correction system, for example over a local network, which can then forward the data, for example as a messaging package, to a processing unit.
  • the additional information or metadata is sent to the medical imaging scanner and/or therapeutic device host or other computing unit over a socket, and then the medical imaging scanner and/or therapeutic device host or other computing unit can be configured to forward and/or redirect the additional information or metadata to a desired processing unit based on a rule that was installed.
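As a hypothetical example of such an installed rule, an rsyslog filter on the host could forward messages tagged by the tracking system to a processing unit and suppress local handling; the tag, host name, and file path below are assumptions:

```
# /etc/rsyslog.d/30-tracking.conf (hypothetical)
# Forward metadata tagged "mtcsys" to the processing unit over TCP, then
# stop further local processing of those messages.
:syslogtag, startswith, "mtcsys" @@processing-host.example.local:514
& stop
```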
  • the data transmitted, such as a syslog messaging packet, in addition to the frame information and/or other collected data from the subject, can also comprise the IP address of where the data came from, for validation and/or correlation purposes.
  • such ancillary information can be carried in the payload of this message, the frame information, and/or any other data that the motion tracking and correction system generates and/or the medical imaging scanner or therapeutic device includes.
  • the information of the subject collected by the motion tracking and correction system is continuous and/or periodic and can include motion and/or other information that happens between frames of the medical imaging scanner and/or therapeutic device.
  • the information of the subject collected by the motion tracking and correction system can be aligned with a specific frame and/or a specific set of frames.
  • the information collected can be tied to two particular frames, which may allow observation of respiration that can occur between the two frames.
  • correction data that can be used to adjust the medical imaging scanning and/or therapeutic procedure parameters can be frame-centric.
  • Alignment of the collected information with a specific frame and/or a specific set of frames can be amorphous and/or may depend on the type of processing of interest and/or what sort of beginning and ending would be appropriate. In some embodiments, it can be advantageous to begin collection of information when the motion tracking and correction system is initially brought into the picture and set up. Further, in certain embodiments, it can be advantageous to end the data collection process when the subject or patient is removed from the medical imaging scanner and/or therapeutic device and there is no longer any information associated with the subject or patient’s quiescent state.
  • the information collected does not need to be packetized in any particular way.
  • the information collected may be tied to time, whereas it may not be tied to time in other embodiments.
  • the information collected may or may not be tied to specific events that are taking place either in the motion tracking and correction system and/or the medical imaging scanner or therapeutic device.
  • the motion tracking and correction system can be in direct communication with the processing unit.
  • the motion tracking and correction system may be configured to directly transmit the additional data or metadata to the processing unit without going through the medical imaging scanner or therapeutic device.
  • the additional data or metadata can be configured to be transmitted directly from the motion tracking and correction system to a computing system outside of the scanner or therapeutic device for further processing.
  • the motion tracking and correction system can comprise multiple network interfaces, such as a local interface and a broader network interface.
  • the motion tracking and correction system can comprise only a local interface, in which case the local segment on which the motion tracking and correction system and the medical imaging scanner and/or therapeutic device sit comprises a processing element.
  • all metadata and/or additional information can be stored in the motion tracking and correction system, which can be downloaded or transferred to a processing unit, for example at a later point in time via some channel.
  • the system can comprise an additional computer connected to the same LAN structure, and instead of sending the metadata or additional information to the medical imaging scanner and/or therapeutic device host to be sent out to the internet to be processed, the additional computer could receive it directly off the LAN and process it directly there.
  • the LAN host can refer to the processor that runs the medical imaging scanner and/or therapeutic device, and the motion tracking and correction system can communicate with the LAN host via a LAN interface.
  • the LAN includes the motion tracking and correction system and the medical imaging scanner and/or therapeutic device, and the host of the medical imaging scanner and/or therapeutic device can comprise another interface to some other network connection which can ultimately allow communication with the Internet, remote service, and/or the like.
  • the system can comprise an additional computer connected to the same LAN structure, and instead of sending the metadata or additional information to the medical imaging scanner and/or therapeutic device host to be sent out to the internet to be processed, the additional computer can receive the metadata or additional information and transmit the same to the Internet to be processed.
  • the system can comprise two or more different data paths.
  • one data path can be for the actual correction data, which would go from the motion tracking and correction system to the medical imaging scanner and/or therapeutic device for adjusting the frame.
  • Another data path can be for the metadata or additional data, which can go from the motion tracking and correction system to any computing unit, either directly or indirectly through the medical imaging scanner and/or therapeutic device as a way to get off of the local area network to somewhere else.
  • the local area network can connect the motion tracking and correction system, the medical imaging scanner and/or therapeutic device, and/or an additional computing unit in some embodiments, for example through a piece of LAN infrastructure, such as a hub or switch.
  • the system can comprise a hub in which the hub can have a direct connection to the motion tracking and correction system, medical imaging scanner or therapeutic device, additional computing unit, and/or the cloud.
  • the logging facility on the motion tracking and correction system can be utilized at either point, at the medical imaging scanner or therapeutic device acting as a gateway or at the additional computing unit, as desired.
  • the motion tracking and correction system can come online and/or begin a session when a subject or patient is placed in a connected medical imaging scanner and/or therapeutic device and one or more detectors of the motion tracking and correction system identify the target.
  • the one or more detectors can be undergoing processing both to self-align, to make sure that the detectors are visualizing the target and that the subject is properly aligned, and to observe various kinds of motions and other aspects occurring within the environment that can be collected and output as metadata.
  • the metadata can be collected, and immediately or shortly afterwards, but not tied to the operation of the scanner, can be utilized in some form of a correlation analysis.
  • some or all correlation analyses can occur after a session has ended, for example by processing in bulk.
  • some or all correlation analyses can be processed as the medical imaging scan and/or therapeutic procedure is taking place.
  • the system can be configured to determine and/or process one or more biometrics analysis, such as respiration and/or EKG-type cycle, during the medical imaging scan and/or therapeutic procedure.
  • the system can be further configured to generate an electronic data representation of one or more biometric data.
  • one or more biometric data can be imputed from observing the subject or patient in the medical imaging scanner and/or therapeutic device.
  • the session can continue to operate during the time that the medical imaging scanner and/or therapeutic device is operating and/or when it is not operating so that the system can observe subject movement when the subject is agitated and/or when no scanning or therapeutic procedure is occurring.
  • the session may end and/or continue until the subject or target actually moves out of the field, or moves enough to basically say the session is complete.
  • the system can be configured to conduct some batch processing on the record of that session, for example to determine some long-term trend that may require more computation.
  • the system can be configured to conduct real-time processing of the metadata to extract certain elements, such as respiratory or heart rate, which may have some real-time relevance to the scanning and/or therapeutic treatment, as well as possibly conduct post-scanning or post-treatment processing for less time-sensitive protocol data points and/or biometric data points.
  • the system can be configured to produce some feedback data relating to how efficient or effective the corrections were.
  • the system can be configured to produce a validation metric regarding how effective or ineffective the corrections were either as an individual session and/or as part of a larger trend.
  • Such feedback data can be used to improve the correction algorithm of the motion tracking and correction system in some embodiments.
  • one or more cameras or detectors of the motion tracking and correction system are configured to record video streams.
  • Such video streams can be transmitted through a plurality of full channels of HDMI data.
  • the system can comprise a plurality of data channels, for example the HDMI channel data for real-time video streams, UDP channel for real-time tracking and correction data for the medical imaging scanner and/or therapeutic device, and a syslog channel for metadata or additional data.
  • a problem can arise in that the plurality of data channels may need to be synchronized.
  • time and/or network time protocol can be utilized to synchronize the plurality of data channels.
  • the medical imaging scanner and/or therapeutic device can be configured to use an NTP server to provide time synchronization information.
  • the medical imaging scanner and/or therapeutic device can be configured to get real synchronized time via the Internet.
  • the motion tracking and correction system can be synchronized with that time, for example through a local connection to the medical imaging scanner and/or therapeutic device, to obtain a correlated amount of time.
  • the motion tracking and correction system data log can comprise actual time and date stamps. As such, time relevant data of the plurality of data channels can be tied back to an actual time for synchronization.
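The standard NTP clock-offset computation that underlies such synchronization can be sketched as follows; the timestamps are illustrative:

```python
# Sketch: the four-timestamp NTP offset estimate used to reconcile the
# tracking system's clock with the scanner host's clock.
def ntp_offset(t0, t1, t2, t3):
    """Clock offset of server relative to client, given:
    t0 = client send, t1 = server receive, t2 = server send, t3 = client receive."""
    return ((t1 - t0) + (t2 - t3)) / 2.0

# Client clock 5 s behind the server, with symmetric one-way delay of 0.1 s:
offset = ntp_offset(t0=100.0, t1=105.1, t2=105.2, t3=100.3)
print(round(offset, 6))  # → 5.0
```

With the offset known, time-and-date stamps in the tracking data log can be corrected onto the scanner's timeline, which is what allows the HDMI, UDP, and syslog channels to be tied back to a common time base.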
  • the system can be configured to conduct some processing of the metadata or additional data live in real-time and/or substantially in real-time and/or within minutes of the medical imaging scanning and/or therapeutic treatment.
  • one of such features that can be processed in real-time, substantially in real-time and/or within minutes of the medical imaging scanning and/or therapeutic treatment can be quantifiable movement of a subject or patient, for example as a measurement or a graph of how much the subject or patient is moving during the scan or treatment.
  • Another example can be whether the marker is actively trackable or not by the system, for example as binary, i.e. does the system see the marker or target or not.
  • Another example can be an indication of whether the marker or target was lost during the scan or treatment or was continuously tracked.
  • Such examples can provide immediate benefits during a scan or treatment, for example to know whether the target or marker is being tracked and/or whether the target or marker was continuously visible throughout the scan or treatment.
  • if the subject or patient moved a lot, for example above a predetermined level, that can trigger review of images and/or modification of the scan or treatment.
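A simple sketch of these real-time session metrics, per-frame movement magnitude, a binary marker-visible flag, and a lost-tracking summary, using a hypothetical review threshold:

```python
# Sketch: summarize a tracking session from per-frame marker positions.
import math

def session_metrics(poses, review_threshold_mm=2.0):
    """poses: per-frame (x, y, z) marker positions in mm, or None for frames
    where the marker was not trackable."""
    movement = []
    lost = False
    prev = None
    for p in poses:
        if p is None:          # marker not visible this frame
            lost = True
            prev = None
            continue
        if prev is not None:
            movement.append(math.dist(p, prev))
        prev = p
    peak = max(movement, default=0.0)
    return {"peak_mm": peak,
            "marker_lost": lost,
            "needs_review": peak > review_threshold_mm}

poses = [(0, 0, 0), (0.5, 0, 0), None, (0.5, 0, 0), (3.5, 0, 0)]
metrics = session_metrics(poses)
print(metrics)
```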
  • one or more processed outputs of metadata or additional data can be configured to be displayed to a user.
  • the motion tracking and correction system can comprise two optical markers and two cameras per marker, wherein raw output and/or processed output from all or some of such cameras can be reflected in the same display.
  • raw output and/or processed output from each individual camera with individual markers can be displayed.
  • a consolidated view of raw output and/or processed output from all cameras can be displayed.
  • the system can be configured to conduct some processing of the metadata or additional data in real-time, substantially in real-time, within minutes of the medical imaging scanning and/or therapeutic treatment, and/or within hours, days, or years of the medical imaging scanning and/or therapeutic treatment.
  • Some non-limiting examples of such features that can be processed include position over time, heart rate, respiratory rate, blood pressure, non-invasive blood pressure (NIBP), position of the subject, whether the subject is touching the side of the bore, estimated body mass index (BMI) or estimated weight, length of the overall patient, facial recognition (for example, for identification purposes), and/or identification of a bar code, for example on the patient’s arm, for identification purposes.
  • the system can be configured to process the metadata or additional data to obtain biometric data and/or rich ancillary biometric data.
  • biometric data and/or rich ancillary biometric data can be stored, analyzed, and/or uploaded for current and/or future processing.
  • the system can be configured to process the metadata or additional data to generate a signal representative of the current state of the subject or patient.
  • the system can be configured to generate an EKG-type signal.
  • the generated signal, which can be denoted ballistic data, can be based on rich ancillary biometric data.
  • this ballistic data or generated signal can be a function of the fluid and respiratory dynamics. For example, while the heart might not be beating the same strength from one beat to another, this generally will not be reflected in an EKG; in contrast, the signal generated according to the systems herein can reflect this change, because the signal can be generated based on the fluid pressure impulse, for example by looking at the skull.
  • the system can be configured to observe the neck of a subject to see the pulse in the veins in the neck. As such, the system can be configured to obtain a more beat-by-beat variation on the aggressiveness of cardiac output than what is currently measured by an EKG trace.
  • breathing generally does not remain constant from beat to beat. For example, one can take a deep breath or a shallow breath.
  • the rate and/or depth of breathing can be determined by processing movement of the mouth and the nostrils of a subject, how much the head goes up and down, how much the chest expands or contracts, or the like. Other patient or subject movement may be monitored and/or processed to determine the rate and/or depth of inspiration.
  • certain embodiments herein can provide improved noninvasive cardiology tracking by combining magnetic resonance (MR) scans with rich ancillary biometric data and/or ballistic data, for example by observing and processing data relating to the chest position of the subject during the MR scan.
  • the chest position that is tracked during the MR scan can include absolute position, such as in XYZ coordinates, and also changes in the chest contour and/or how hard the heart is beating.
  • some embodiments are configured to correlate the depth of inspiration specifically to non-invasive cardiology. This can be a substantial improvement over gating the data in both CT and MR, which can be due to the fact that current CT and MR technologies themselves are not gated to synchronize with movement of the heart in real time.
  • certain embodiments herein can actually correlate heart movement to provide more accurate non-invasive cardiology assessment.
  • the motion tracking and correction system can be configured to correct, for example prospectively correct, artifacts in medical images caused by movement of the subject of the medical imaging scan.
  • a movement created partial volume artifact or motion initiated partial volume artifact can be created when a subject of a medical imaging scan, such as CT, MR, PET, SPECT, angiography, or the like, moves during the scan.
  • if a subject of a medical imaging scan moves during the scan, one pixel or voxel that is supposed to be fat and another pixel or voxel that is supposed to be muscle can be moved over each other during image acquisition, thereby ending up with an average of the two.
  • a single count can be distributed across a range of pixels or voxels rather than all being put back into the original pixel or voxel, which leads to a sampling error or reconstruction artifact.
  • movement created partial volume artifacts or motion initiated partial volume artifacts can occur in images obtained through CT, MR, PET, SPECT, angiography, or the like.
  • certain embodiments of the motion tracking and correction system can be configured to move some obtained data back to the correct pixel or voxel.
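A deliberately simplified one-dimensional sketch of moving counts back to the correct voxel, assuming an integer tracked displacement; a real system would handle three-dimensional motion and sub-voxel interpolation:

```python
# Sketch: counts acquired while the subject was displaced are reassigned
# to the positions they actually came from, using the tracked shift.
def reassign_counts(acquired, displacement):
    """acquired: 1-D list of counts recorded at detector positions while the
    subject was shifted by `displacement` voxels; returns counts mapped back
    to subject coordinates. Counts that fall outside the grid are dropped."""
    corrected = [0] * len(acquired)
    for i, c in enumerate(acquired):
        src = i - displacement          # where this count really originated
        if 0 <= src < len(corrected):
            corrected[src] += c
    return corrected

# Subject shifted +2 voxels during part of the acquisition:
print(reassign_counts([0, 0, 5, 9, 0, 0], 2))  # → [5, 9, 0, 0, 0, 0]
```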
  • the motion tracking and correction system can be configured to monitor the velocity of motion of the subject and prevent or cause to prevent the scanner, for example a CT scanner, from scanning when the velocity is over a certain predetermined level because the image obtained would likely be blurry.
  • the motion tracking and correction system can be configured to monitor the velocity of motion of the subject and cause the scanner, for example a CT scanner, to scan the subject if and when the velocity of subject movement is or decreases below a predetermined level.
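Such velocity gating could be sketched as follows; the speed threshold and frame interval are illustrative assumptions:

```python
# Sketch: estimate subject speed from consecutive tracked positions and
# allow acquisition only while the speed is below a threshold.
import math

def acquisition_allowed(pos_prev, pos_cur, dt_s, max_speed_mm_s=5.0):
    """Return True when the tracked subject is moving slowly enough to scan."""
    speed_mm_s = math.dist(pos_prev, pos_cur) / dt_s
    return speed_mm_s <= max_speed_mm_s

# Positions in mm, sampled at 30 fps (dt = 1/30 s):
print(acquisition_allowed((0, 0, 0), (0.1, 0, 0), dt_s=1 / 30))  # ~3 mm/s
print(acquisition_allowed((0, 0, 0), (1.0, 0, 0), dt_s=1 / 30))  # ~30 mm/s
```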
  • the system can be configured to acquire biometric data of a subject undergoing a medical scan and/or therapeutic procedure based at least in part on raw motion tracking data, such as video and/or image data.
  • the system can further be configured to obtain a cardioballistics (also called “ballistocardiography” or BCG) and/or cardiac pulsation signal and/or blood pressure signal from the biometric data.
  • the system can be configured to analyze the veins and/or skin and/or hair and/or clothing and/or heat spots and/or skin marks and/or tattoos and/or other movements, topographical changes, infrared and hyper spectral changes and/or micromovements of the face, head, and/or neck to obtain a cardioballistics and/or cardiac pulsation signal and/or blood pressure signal.
  • cardioballistics can relate to how much the head moves and/or how much the vessels distend.
  • biometric data can be further analyzed by the system to create a biometric output or new signal, such as ballistic data.
  • Such output or signal can be indicative of the patient’s health while the patient is undergoing a medical scan and/or therapeutic procedure.
  • biometric data and/or data from the output or signal can be fed back into the medical scanner and/or therapeutic device in order to ensure that the detectors, cameras, and/or therapeutic device are not only focused on the right area of the patient's body but also take a picture at the right time in view of the internal motion of the organs of interest.
  • in addition to looking at outer body motion and using that outer body motion to refocus the medical scanner or therapeutic device, to make sure it is taking a picture of the body from the right perspective, to ensure that the images stay clear, and, in the case of an MRI scanner, to attribute signal to the appropriate voxels, an improved system can use the biometric data to take an even better image.
  • an EKG signal can be used to track the heart's own internal motion, which assumes that the heart beats consistently from one beat to the next. However, this may not always be the case, and as such EKG may not be an accurate representation, as it relates generally only to an electrical signal.
  • a more accurate system can be provided that can be configured to detect and/or track topological, positioning, and infrared or hyperspectral data to better infer movement of the heart or other organ of interest with respect to the rest of the body, as well as to use both the biometric data (for example, topological data) and the electrical signal.
  • Such systems can be further configured to identify ballistic data, such as cardio ballistic data, that can provide improved insight compared to EKG signals.
  • ballistic data can be used to enable a medical imaging scanner to take better images and/or focus or refocus an image and/or enable a therapeutic device to perform more effective treatment.
  • the system can be configured to ensure or at least better ensure that an image of the organ of interest is being taken at the right time, for example because at certain times the internal organ can be in the exact same spot or substantially same location relative to the rest of the body compared to when the last snapshot was taken or to better estimate cardiac output which can change from one beat to the next.
  • the heart may be 50 percent expanded when a first snapshot is taken, in which case on the second snapshot one may want the heart to be at the 50 percent expansion position as well in order to obtain a clearer image between slices.
  • the system can, in some embodiments, utilize both the electrical signal of the heart, for example using EKG, and infrared, hyperspectral, pose, as well as topographical data to provide improved estimation of the positioning and status of the heart, for example using raw motion data and/or biometric data derived therefrom.
  • the system can be configured to capture a better image of the brain by accounting for not only head outer body position, but also pulsation of the CSF or blood in or near the brain in between pulses.
  • the system can be configured to detect where the organ is with respect to the rest of the body as well as the stage of compression and/or electrical signal.
  • the system, in certain embodiments, can be configured to refocus a medical imaging scan slice, change imaging parameters such as modifying an MR pulse sequence, or change timing or signal attribution to a given imaging voxel in any modality, including but not limited to PET, CT, MR, MR/PET, MR/CT, and SPECT, and/or therapeutic procedure depending on subject movement, and/or adjust the timing thereof alone or correlated with other previously acquired data or data yet to be acquired.
  • the system can be configured to continuously collect data and throw away some of the data that was collected when the organ of interest is in a different state or location.
  • the system can be configured to utilize post-processing analysis to correct for subject movement.
  • the system can be configured to prospectively correct for subject movement, for example by developing a model that predicts what the heart or other organ of interest will do in terms of movement, compression, or the like, based on data that the system captured previously. Based on such prediction, the system can be configured to determine whether or not to discard and/or capture data.
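The prospective keep-or-discard decision described above can be sketched with a naive linear extrapolator; the `MotionPredictor` name, the history window, and the tolerance are hypothetical placeholders, not the disclosed model:

```python
class MotionPredictor:
    """Predict the next organ displacement from a short history of tracked
    positions, then decide whether the upcoming acquisition should be kept."""

    def __init__(self, window=3):
        self.history = []
        self.window = window

    def observe(self, position):
        """Record a tracked position (e.g. displacement in mm)."""
        self.history.append(position)
        self.history = self.history[-self.window:]

    def predict(self):
        """Naive linear extrapolation from the last two samples."""
        if len(self.history) < 2:
            return self.history[-1] if self.history else 0.0
        return 2 * self.history[-1] - self.history[-2]

    def keep_acquisition(self, reference=0.0, tolerance_mm=0.5):
        """Keep data only if the organ is predicted near the reference pose."""
        return abs(self.predict() - reference) <= tolerance_mm
```

A real system would replace the extrapolator with a model fitted to the previously captured cardiac or respiratory data.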
  • the system can be configured to conduct a sentiment analysis and/or generate an alert to the technician that is conducting the medical imaging scan and/or therapeutic procedure.
  • the system can be configured to utilize the biometric data to acquire sentiment data to generate a medical alert to a technician to interact with the subject, for example if the subject is getting agitated or nervous or unconscious or non-responsive or the like.
  • the system can be configured to give feedback to the subject based on the sentiment analysis.
  • the system can generate a calming mechanism, such as audio and/or video signals to relax the subject, if the system determines that the subject is agitated.
  • the system can use the biometric data to generate a plan or roadmap for the subject to follow in order to obtain clearer images and/or better therapeutic results.
  • the system can be configured to generate a graphical representation, such as a line graph comprising one or more peaks and valleys, which can correspond to when the subject should breathe in or out.
  • such roadmap, such as a breathing pattern, can be presented to the subject via audio.
  • one or more cameras or detectors of the system can be configured to detect the subject breathing pattern, based on which the system can determine how closely the subject is breathing in real time according to the plan or roadmap breathing cycle.
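The comparison between the roadmap breathing cycle and the subject's detected breathing could be scored as below; the normalization scheme and the `breathing_compliance` name are assumptions for illustration:

```python
def breathing_compliance(target, measured):
    """Score (0..1) how closely a measured breathing trace follows the
    planned roadmap trace; both are equal-length lists of chest
    displacement samples."""
    assert len(target) == len(measured)
    mean_err = sum(abs(t - m) for t, m in zip(target, measured)) / len(target)
    span = (max(target) - min(target)) or 1.0  # avoid divide-by-zero
    return max(0.0, 1.0 - mean_err / span)
```

The score could drive real-time feedback, e.g. prompting the subject when compliance falls below a chosen level.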
  • the system can be configured to provide a gamification aspect to the subject of a medical imaging scan and/or therapeutic procedure.
  • the system can be configured to display a dot or other graphical representation to the subject on a screen or other display on the motion tracking device, medical imaging scanner, and/or therapeutic device.
  • such dot or other graphical representation can move off center if the subject moves his head off center. If and/or when the subject moves his head back to the center or near or to an ideal location, the dot or other graphical representation can be displayed again, on center for example.
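The on-screen dot behavior above could be sketched as a clamped mapping from tracked head offset to display position; the gain and screen half-width are placeholder values:

```python
def dot_position(head_offset_mm, gain=2.0, screen_half_width_px=160):
    """Map the tracked head offset to a screen x-position for the feedback
    dot, clamped to the display edges; 0 means the head is on center."""
    x = head_offset_mm * gain
    return max(-screen_half_width_px, min(screen_half_width_px, x))
```

When the subject re-centers, the dot returns to 0, giving immediate visual confirmation.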
  • the system can be configured to keep the patient still, focused, engaged and/or distracted by providing a game aspect in order to obtain clearer medical images during a scan and/or more effective therapeutic results.
  • data and/or tracking data collected by the system can comprise tracking data of motion of a subject or landmark thereof, subject skin motion, subject jaw motion, pupillary dilation, sentiment data, hyperspectral data, respiratory ballistics, video data, image data, audio data, temperature or thermal data of the subject, atmospheric data, light data, infrared data, and/or time associated with any of the aforementioned data.
  • the video data or image data can be synchronized by tying it to time data as discussed herein.
  • one or more additional data mentioned herein, such as thermal data, can also be tied to video or image data and/or any other data.
  • the system can be configured to link and/or synchronize multiple streams of data, for example using time data.
  • the system can be configured to capture and/or process every possible data point, such as a continuous stream of images for image or video data from every second.
  • the system can be configured to only take and/or process a subset of the possible data points. For example, for image or video data, the system can be configured to only process every other frame or every other second and/or compare one frame to another to determine whether there is a difference between the two frames.
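The subsampling-plus-frame-differencing idea above might look like the following sketch, where frames are flat lists of pixel intensities; the stride and difference threshold are illustrative assumptions:

```python
def frames_to_process(frames, stride=2, diff_threshold=5.0):
    """Select every `stride`-th frame index, additionally skipping frames
    that barely differ from the previously kept frame (mean absolute pixel
    difference below `diff_threshold`)."""
    kept = []
    prev = None
    for i, frame in enumerate(frames):
        if i % stride:
            continue  # subsample: skip off-stride frames entirely
        if prev is not None:
            diff = sum(abs(a - b) for a, b in zip(frame, prev)) / len(frame)
            if diff < diff_threshold:
                prev = frame
                continue  # no meaningful change: skip processing
        kept.append(i)
        prev = frame
    return kept
```

Lowering the processing load this way is what makes near-real-time operation during the scan easier, as the next bullet notes.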
  • the system can be configured to only evaluate certain parameters in real-time such as respiratory and cardiac data to save processing power, and/or perform post or delayed processing of other parameters to provide other biometric data.
  • Such near real time systems can be advantageous because the lower processing requirements may allow the system to more easily process the frame in real time during the medical imaging scan or therapeutic procedure.
  • the system can be configured to gamify an aspect of the medical imaging scan and/or therapeutic procedure for a subject, for example to help focus the subject to remain still and/or to maintain a particular portion of the subject’s body within a particular frame.
  • the system can be configured to process the tracking data or raw data obtained by the system and generate and provide feedback to the subject to improve the quality and/or effectiveness of the medical imaging scan and/or therapeutic procedure.
  • the system can comprise a display or audio speakers to provide feedback to the subject.
  • the display can be part of an add-on motion tracking device, medical imaging scanner, and/or therapeutic device.
  • the display can be a LCD, LED, fiber optically transmitted image, and/or other display.
  • the display can be configured to show a dot or other visualization to the subject that can be indicative of the quality of the subject’s location.
  • the system can be configured to display a little ball that moves one way or another depending on the location of interest of the subject, such as the subject’s head.
  • for example, when the subject's head is on center, the ball can be displayed in a particular location or color; when the head moves off center, the ball can be displayed in another location or color, which can motivate the subject to maintain the subject's head in the center or substantially center.
  • the system can comprise a gamification feature that can help keep the subject focused and/or stable.
  • the system can be configured to display a video or image and/or play audible sounds that are calming or entertaining to the patient, and/or configured to calm and/or entertain the subject, such that the subject can maintain a stable orientation and/or location.
  • Such audio may be routed through an already existing headphone or other audio system.
  • display or graphical representation of the orientation and/or location of the subject can be displayed to a technician, who can then prompt the subject to move in a certain direction or orientation.
  • the imaging matrix of the system can be rectangular or a grid. As the location of the subject or area of the subject of interest starts to go off axis, the particular area of interest may not fall into the pixels or voxels as neatly. As such, it can be advantageous to keep the subject or area of the subject of interest, such as the subject’s head, in the center or isocenter for highest resolution. Gamification technologies or processes as described herein can help achieve this goal by encouraging the subject to consciously decrease movement.
  • the system can be configured to detect and/or measure the respiratory rate of the subject, other respiratory feature, and/or other biometric data to determine the calmness of the subject. For example, the rate of change or some aspect of calming breath or consistent respiration can be indicative of the calmness of the subject.
  • Such data points and/or analysis results can also be part of the feedback mechanism.
  • the system can be configured to provide one or more visual and/or audio signals to calm the subject, such as a prerecorded and/or computer simulated calming voice that tells the subject to hold still and/or hold his or her breath.
  • the system can be configured to provide such audio and/or visual signals, such as compliance instructions, in a plurality of languages.
  • the system can be configured to generate a ballistic signal, such as a cardio ballistic signal, as described herein.
  • a cardio ballistic signal can comprise a waveform, which is a measurement of the recoil forces of the body in response to the ejection of blood from the heart and movement of the blood through the vasculature.
  • the system can be capable of actually differentiating strong heartbeats from light heartbeats and have a better inference of cardiac position and/or output.
  • the system can be configured to detect and/or generate a ballistic trace of the heart.
  • cardio imaging using a MR scanner can comprise dividing a cycle into a number of segments or slices, such as binning.
  • binning herein is intended to be construed broadly to refer to the general process of dividing an acquisition cycle into a number of segments or slicing and reconstructing the data.
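Binning, in the broad sense defined above, can be sketched as assigning each acquisition to a phase segment of a cycle. Uniform phase bins over a fixed cycle length are an assumption for illustration; real cardiac gating would segment against detected triggers:

```python
def bin_acquisitions(acquisitions, cycle_length, n_bins):
    """Assign each timestamped acquisition to a cardiac-phase bin.

    acquisitions: list of (timestamp, data) pairs
    cycle_length: assumed cycle duration, same time units as timestamps
    Returns a list of n_bins lists, one per phase segment.
    """
    bins = [[] for _ in range(n_bins)]
    for t, data in acquisitions:
        phase = (t % cycle_length) / cycle_length  # position in cycle, 0..1
        bins[min(int(phase * n_bins), n_bins - 1)].append(data)
    return bins
```

Reconstruction would then combine only the data within a single bin, where the organ is assumed to be in the same state.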
  • respiration can be different from one heartbeat to another, such as for example due to lack of breath or the like. This can mean that the heart position may vary, for example by one or more centimeters, because of the depth of expiration or the like.
  • certain embodiments described herein are configured to detect and/or determine respiratory data or traits to generate more accurate data relating to the heart.
  • the system can be configured to take into account the fact that there can be at least an indirect relationship between the actual pulse rate and respiration. For example, based on previously detected data of the subject, the system can be configured to determine that a higher pulse rate can be indicative of an anxiety level of the subject, which can mean that the subject is more likely to have an irregular respiratory rate. In some embodiments, in determining one or more cardio characteristics of a subject and/or any other biometric data, the system can be configured to detect, track, determine, combine and/or take into account the oxygenation level, respiratory rate, cardiac rate, and/or the like of the subject.
  • the system may be able to determine the location of a particular organ of a subject, such as the heart.
  • the system can be configured to estimate the location of a particular organ of a subject, such as the heart, via electrical signals.
  • the system can be configured to estimate a location of a subject’s heart based on how hard the subject’s heart is beating.
  • the system can be configured to generate an improved inference of where the heart or other organ of interest (such as but not limited to liver, kidneys, or spleen) is located and/or how to get the heart or other organ of interest in exactly the same spot from one beat to the next compared to using electrical trace alone.
  • the significance in improvement can be more pronounced, for example, in people with irregular rhythms, either regularly irregular or irregularly irregular rhythm. This can be because of binning or the fact that a medical imaging scanner may need to collect data over a plurality of beats, which would require the heart to be in the same precise location and in the same state of expansion or contraction to get statistically meaningful data to generate a clearer image.
  • the system can be capable of decreasing the associated blurring and uncertainty arising from reconstructing data collected over a period of time.
  • the noise can be decreased in some embodiments, whereas adding the electrical signal can allow the system to achieve a greater signal relative to the noise.
  • the organ of interest can be the heart, liver, kidneys, gallbladder, and/or any other organ that can move within a subject, for example due to respiration and/or due to its own movement such as the heart.
  • the system can be configured to incorporate the three-dimensionality of the location of a particular organ of interest.
  • a system that relies solely on an electrical signal can be two-dimensional, whereas an object of interest, such as the heart, is generally three-dimensional. Movement and/or location of an organ in three dimensions can be difficult to capture with an electrical signal alone and may require ballistic and physiologic modeling.
  • the system can be configured to first determine if there is beat-to-beat variability in the cardiac output.
  • a subject with a regularly irregular rhythm can have a first regular beat, a second quick beat, a third regular beat, and a fourth quick beat.
  • Some systems as described herein can be configured to generate a ballistic trace of the heart and determine that the cardiac output is vastly different between the first and second beats and/or between the third and fourth beats. Based on such determination, the system can be configured to skip every quick beat, wait for every regular beat and only take a medical imaging scan slice or apply a therapeutic procedure for every regular beat.
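The skip-the-quick-beat logic above could be sketched as flagging beats whose R-R interval falls well below a running median; the 0.8 ratio threshold is an illustrative assumption, not a disclosed value:

```python
import statistics

def beats_to_acquire(rr_intervals, ratio_threshold=0.8):
    """Return a per-beat flag: True when the beat is suitable for an
    acquisition (its R-R interval is not much shorter than the running
    median), False for 'quick' beats that should be skipped."""
    keep = []
    for i, rr in enumerate(rr_intervals):
        median = statistics.median(rr_intervals[: i + 1])
        keep.append(rr >= ratio_threshold * median)
    return keep
```

For the regularly irregular example above (regular, quick, regular, quick), this gate would acquire only on the regular beats.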
  • the system can be configured to continuously obtain image scan slices to a subject and retrospectively go back and reconstruct using only those data points in which there is a higher likelihood that the heart or organ of interest is in the same position either on an absolute basis or relative to the known phases of a cardiac cycle in either regular rhythms or regularly irregular or irregularly irregular rhythms.
  • the system can be configured to track and/or determine cardiac output in order to improve medical image scans.
  • the system can be configured to track and/or monitor the veins on the neck and/or head of the subject as a data point for determining cardiac function.
  • the system can be configured to look at the distension/status of the jugular veins and/or carotid arteries.
  • the system can be configured to determine and/or generate respiratory ballistics data.
  • the system can be configured to look at the shape of the thorax of a subject and/or head position. More specifically, the system can be configured to monitor the size and/or 3D volume of the thorax and/or determine whether the same volume point can be obtained within the chest. In addition, the system can also monitor and/or track positioning information, for example by using an optical marker right over the heart to determine a three-dimensional position thereof.
  • the system can be configured to process a three-dimensional volumetric calculation from the image and/or video data, for example by using a dot projector to project a dot pattern on the chest of the subject to determine depth.
  • the system can be configured to determine the depth of a breath with improved accuracy and/or provide feedback to a gamification system for the subject. For example, the system can provide feedback to the subject to take a series of deep breaths until a desired depth of breathing is obtained. In certain embodiments, the system can provide feedback to encourage the subject to bring his or her respiration into a predictable mode that can result in improved motion tracking and thereby clearer medical image scans and/or more effective therapeutic procedures.
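The depth-of-breath feedback loop could be sketched as below; the target tidal volume and tolerance are placeholder values for illustration, not clinical recommendations:

```python
def breath_feedback(tidal_volumes_ml, target_ml=500, tolerance_ml=50):
    """Compare the latest measured breath depth (e.g. from dot-projector
    chest volumetry) to the desired depth and return a coaching message."""
    latest = tidal_volumes_ml[-1]
    if latest < target_ml - tolerance_ml:
        return "breathe deeper"
    if latest > target_ml + tolerance_ml:
        return "breathe shallower"
    return "hold this depth"
```

Messages like these could be delivered through the audio or gamified visual feedback channels described herein.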
  • the system is capable of adjusting for artifacts that can be caused by changes in the position of an organ or area of interest of a subject that can occur between frames or slices of a medical imaging scan and/or a therapeutic procedure.
  • the system can determine the position of a subject area of interest in real-time or near real-time and/or combine it with one or more biometric data to dynamically determine when the subject area of interest is likely to be in the same position as a previous take, in order to dynamically determine when to take an image slice and/or apply a therapeutic procedure.
  • the system can be configured to encourage or increase the likelihood that the subject area is in the same position by providing feedback relating to the respiration and/or position of a subject area to the subject and/or technician, for example by utilizing one or more gamification features as described herein.
  • the system can be configured to retroactively remove or delete certain image slices that were taken when the subject area was not in the same position in order to correct for artifacts.
  • the system can be configured to modify or alter certain image slices that were taken when the subject area was not in the same position in order to correct for artifacts, for example by modifying the location and/or orientation of an organ of interest in the image.
  • the system can be configured to predict the position of a particular organ or subject area of interest and prospectively adjust one or more parameters of a medical image scan to correct for artifacts.
  • the system can be configured to determine, predict, and/or adjust for artifacts that can be caused by changes in the position and/or orientation of an organ or area of interest of a subject based solely on external features, such as external position and/or topography.
  • the system can be configured to determine, predict, and/or adjust for artifacts that can be caused by changes in the position and/or orientation of an organ or area of interest of a subject based on predictive internal modeling of the organ or area of interest or in combination with anatomical or other information acquired by the primary imaging modality including but not limited to MR, CT, PET, PET attenuation correction etc.
  • one or more features as discussed herein can be utilized by a system to adjust for movement of the brain within the head of a subject as well.
  • the brain can move within the head due to, for example, pulsating blood, spinal fluid pulsation, or the like, which can lead to artifacts in medical imaging, such as cerebral spinal fluid (CSF) pulsation artifacts.
  • the system can be configured to detect and/or monitor spinal fluid pulsation rate by monitoring external and/or visible features to detect, predict, and/or adjust for brain movement during a medical imaging scan and/or therapeutic procedure, such as by only keeping images where the brain is in a desired position.
  • the system can comprise one or more accelerometers or other sensors that can be capable of measuring spinal fluid movement to this end. Based on such data, the system can be configured to correct for artifacts caused by spinal fluid movement.
  • the system can be configured to dynamically modify the focus and/or alignment of one or more motion tracking cameras or detectors pursuant to movement and/or size of the subject and/or area thereof.
  • one or more motion tracking cameras or detectors can be motorized and/or comprise other non-motorized features for doing so.
  • the system can be configured to conduct sentiment analysis, for example in real-time, substantially real-time, and/or post-processing.
  • the system can be configured to detect, monitor, and/or track facial expression, pupillary dilation, heart rate, thermal data, and/or respiratory rate of a subject. Based on such data, the system can further be configured to generate feedback on whether the subject is claustrophobic, fearful, anxious, relaxed, sleeping, and/or the like.
  • the system is capable of conducting sentiment analysis continuously and as such can dynamically determine a change or slowly evolving trend of the subject’s sentiment, for example to predict that a certain event will or is likely to occur, such as the subject crashing.
  • the system can be configured to dynamically determine whether the subject is exhibiting or is likely to become anxious based on tracking the subject’s facial expression, pupillary dilation, thermal data, heart rate, and/or respiratory rate. Based on such determination, the system can be further configured to generate a warning alert to the technician and/or subject. For example, the system can be configured to determine that a subject is exhibiting or is likely to exhibit signs of anxiety when the subject’s pupils are becoming smaller and/or when the subject’s heart rate and/or respiratory rate is trending upward and/or when the subject is sweating or the thermal data is trending upward. The system can be configured to generate a visual and/or audible warning alert to the subject, for example in a soothing or caring voice to relax the subject and/or display soothing or calming graphics.
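The rule-of-thumb anxiety check described above (pupils becoming smaller while heart rate, respiratory rate, and thermal data trend upward) might be sketched as follows; the thresholds and the three-of-four voting rule are illustrative assumptions only:

```python
def anxiety_alert(pupil_mm, heart_rate_trend, resp_rate_trend, temp_trend):
    """Naive rule-based sentiment check: flag likely anxiety when at least
    three of four signs are present.  Trend arguments are positive when the
    quantity is rising.  Thresholds are placeholders, not clinical values."""
    signs = 0
    signs += pupil_mm < 3.0        # constricted pupils (per the text above)
    signs += heart_rate_trend > 0  # heart rate trending upward
    signs += resp_rate_trend > 0   # respiratory rate trending upward
    signs += temp_trend > 0        # sweating / thermal data trending upward
    return signs >= 3
```

A True result would trigger the soothing audio/visual alert described above rather than a clinical diagnosis.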
  • the system can be configured to capture and/or determine the anxiety level of a subject over a period of time, which can be used to develop and/or modify treatment of the subject. For example, if a patient is regularly undergoing an MR scan, the system can be configured to track the anxiety level of the patient over the course of treatment, which can be used to modify the treatment itself.
  • anxiety levels of one or more patients observed from particular medical image scanners and/or therapeutic device can be used as a metric for evaluation of the scanner or therapeutic device itself. For example, patients who were scanned using a particular brand or type of MR scanner may have experienced lower anxiety levels than another type of MR scanner.
  • the system can be configured to perform interpatient variability measurements among different medical imaging scanners and/or therapeutic devices, based on sentiment analysis of a plurality of patients. In some embodiments, the system can be configured to conduct sentiment analysis in real-time, near real-time, and/or asynchronously or post-scanning.
  • sentiment data collected around a particular patient, a particular patient over a course of treatment, a series of patients over a particular period of time, a series of patients, one or more medical imaging scanners and/or therapeutic devices can be used, for example, to measure an effectiveness of the machine and/or workflow and/or environment.
  • the system can be configured to obtain a cardiac rhythm from the raw motion tracking data and/or biometric data.
  • the system can be configured to analyze the veins and/or movements and/or micro-movements of the face and/or neck to obtain cardiac or other rhythm signals and/or a blood pressure signal.
  • a person’s head moves up and down by about 150 microns when the blood hits the base of the skull.
  • the system can be configured to detect and/or track movement or micro movement of a facial marker on the subject or other visualized based approach.
  • the system in addition to visual data, can be further configured to utilize one or more other modalities, such as infrared, hyperspectral, sound, and/or ultrasound.
  • the system can be configured to utilize and/or incorporate one or more of the following variables for determining cardiac rhythm: marker data for movements or micro-movements, infrared data for pulsation and/or blood movement, hyperspectral data, multispectral data, thermal data, moisture, sweat, patient overheating, hemoglobin wavelength, oxygenation of hemoglobin, follicle motion, water detection, and/or the like.
  • the system can be configured to use infrared or hyperspectral data to detect and/or visualize the cardiac rate, pulsation, volume, and/or blood pressure, such as by utilizing infrared transit time resistance. More specifically, if a subject's arteries are clamped down, the blood can flow slower compared to when the subject is relaxed; this can be detected using infrared, exhibit a longer transit time, and/or give an indication of blood pressure.
  • the system can be configured to immediately, in real-time, or in substantially real-time determine blood pressure, for example by exposing a region of interest to infrared and detecting the temperature. More specifically, in some embodiments, the system can comprise one or more infrared emitters producing infrared rays onto the skin of the subject and one or more detectors detecting the bounce-back infrared, which can be further analyzed by the system. From such data, the system can further be configured to calculate resistance, blood pressure, and/or detect a trend in the blood pressure. As such, some embodiments can allow monitoring of trending change of blood pressure, such as whether it is rapidly rising and/or fluctuating dramatically, that would normally not be determinable by doing a series of discrete measurements.
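The transit-time-based blood-pressure trending described above could be sketched as comparing successive infrared transit times, where a longer transit suggests constricted vessels and higher resistance per the text; the 1 ms dead-band is an arbitrary placeholder:

```python
def bp_trend(transit_times_ms):
    """Infer a blood-pressure trend proxy from successive infrared transit
    times.  Returns 'rising', 'falling', or 'stable' based on the latest
    pair of readings."""
    if len(transit_times_ms) < 2:
        return "stable"
    delta = transit_times_ms[-1] - transit_times_ms[-2]
    if delta > 1.0:
        return "rising"
    if delta < -1.0:
        return "falling"
    return "stable"
```

Run continuously, this kind of trend output is what enables the fluctuation monitoring that discrete cuff measurements would miss.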
  • the system can be configured to characterize jaw movement based at least in part on motion data relating to the jaw which can affect imaging and/or indicate anxiety on the part of the patient.
  • jaw movement detection can be advantageous because the jaw bone is a prominent feature in a patient for tracking purposes.
  • the system can be configured to quantitate skin motion. Quantitative analysis of skin motion can be advantageous to remove and/or adjust for false motion detection.
  • the system can be configured to determine if a patient or subject is left in a medical imaging scanner and/or therapeutic device after a scan or procedure has been completed and/or at the end of the day or at a particular time.
  • the scan completion workflow of a scanner or a therapeutic procedure completion workflow of a therapeutic device can comprise a process in which one or more cameras or motion detection device components are configured to determine whether a patient or subject is located in the scanner or therapeutic device after completion of a scan or therapeutic procedure.
  • one or more cameras or motion detection device components can be configured to determine whether a patient or subject is located in the scanner or therapeutic device at a particular time of day.
  • if the system determines that a patient or subject is still located within the scanner and/or therapeutic device after a certain period of time after the scan or therapeutic procedure is complete and/or at or after a particular time of day, the system can be configured to generate an alert, visual and/or audible, to the patient, subject, and/or medical professional.
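The end-of-scan occupancy check above could be sketched as a simple predicate over the camera's detection output and elapsed time; the 5-minute grace period is an assumed default, not a disclosed value:

```python
def occupancy_alert(patient_detected, scan_complete, minutes_since_complete,
                    grace_minutes=5):
    """Return True when an alert should fire: the cameras still detect a
    subject in the bore, the scan has finished, and the grace period has
    elapsed."""
    return (patient_detected and scan_complete
            and minutes_since_complete >= grace_minutes)
```

The same predicate could be evaluated against a fixed time of day (e.g. end of shift) instead of scan completion.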
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a biological signal/data from a motion trace, and/or a video and/or images generated for motion detection. In some embodiments, the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a cardiac and/or respiratory signal/data and/or trace out of a motion trace, and/or a video and/or images generated for motion detection.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a pulse rate signal/data and/or a heart rate signal/data from a motion trace, and/or a video and/or images generated for motion detection, patient viewing, and/or biometric detection or surveillance.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect one or more proxies for a biological electrical signal, which in some embodiments can be configured to trigger imaging of a patient by using a different imaging modality, for example, a magnetic resonance (MR) imaging system, computed tomography (CT) scanner, or positron emission tomography (PET) scanner, or others.
• the imaging systems are triggered to perform image captures based on triggers generated from biological electrical signals, which, in some cases, are detected from a sensor placed on a finger or other body part, or with optical flow vectors derived from a video stream as in markerless tracking.
  • the detection of biological electrical signals/data through such sensors on the chest or a finger or other body part may not be technologically possible, and/or may add cost to a procedure, and/or may require patient compliance, and/or may increase the time for a procedure, introduce electrocution or burn risk, or encumber workflow, and/or the like.
• EKG leads can be positioned on a patient, which can be problematic for many reasons. For example, putting such EKG leads on a patient can be time consuming. Such EKG leads can also be dangerous: if the insulation on a lead is broken, the lead acts as a piece of metal inside the magnet of an MR scanner, and the magnet can induce current in the EKG lead, which can electrocute and burn the patient. In other situations, a sensor is placed on the finger in order to determine the pulse timing; however, the problem with such sensors is that by the time the blood travels down the hand and gets recognized, there is a latency on the order of around 300 milliseconds.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a proxy signal/data based on a system requiring no physical contact, in other words a touchless system.
  • the systems, methods, and devices disclosed herein are configured to extract and/or generate and/or detect a proxy signal in order to mimic the same trigger signal that imaging systems currently use for triggering image capture.
  • the systems, methods, and devices disclosed herein are configured to use motion trace data and/or video data showing motion of body parts, for example, a vein and/or vessel, to generate a proxy signal/data that mimics and/or correlates to biological electrical signals in the patient being observed, such proxy signals/data can be used, in some embodiments, as triggers for performing image capture.
  • detection of the cardiac and/or respiratory cycle can be used to analyze or extract additional information from temporally correlated imaging.
  • the systems, methods, and devices disclosed herein are configured to not only generate motion traces of a patient but also configured to extract from the video or a series of images data not previously available and/or analyzed for vein and/or vessel movements in the body, for example, the face and/or neck.
• vein and/or vessel (and/or other body part) movement data can be used in combination with motion trace data (through a marker or markerless system) to obtain improved accuracy in detecting patient motion, improved accuracy in when to image a patient, and/or improved accuracy in generating patient images, by better knowing when to trigger an image capture and/or how to correct for motion.
  • the systems and methods disclosed herein relate to extracting and/or generating ballistocardiography (BCG) and/or respiratory data and signals from data generated by a motion tracking system in connection with and/or coupled to a medical imaging system, for example, a magnetic resonance (MR) imaging system.
  • the motion tracking systems disclosed herein comprise one, two, three, or four or more cameras, which may cover areas being imaged by the coupled imaging system or adjacent or distant areas.
  • the one or more cameras in the motion tracking system can be configured to be unobtrusively coupled to the imaging system, for example, an MR, CT, PET, and/or other system, and/or can be configured to be used for prospectively gating biomedical imaging scanner (for example, MR image capture system) to reduce or eliminate motion artifacts.
  • the system can comprise a standalone vital signs monitoring system that can be separate from any imaging system.
• the systems disclosed herein are configured for automatic prospective gating that is synchronized to cardiac and/or respiratory motion.
  • cardiac gating is done using EKG leads attached to specific locations on the chest.
• Other approaches can involve the use of hardware-based photoplethysmography (PPG).
  • Impedance pneumography or special belts with pressure sensing are used for creating gating signals connected with respiratory motions.
  • these systems require patient cooperation during scanning. Further, such systems generally require wired connections that can limit the use. Additionally, probes or belts can in some instances obscure the anatomical regions of interest and add cost.
  • the systems disclosed herein leverage motion detection technology combined with one or more specifically designed markers that can comprise dots and/or circles.
  • the one or more specifically designed markers can be positioned on a patient’s face and/or other body area and/or part.
  • the system utilizes processing algorithms to generate six degrees of freedom motion tracking data and/or signals.
  • the system is configured to use processing analytics that utilize overlapping batches to extract continuous BCG and/or respiratory waveforms from up to six motion signals, and in some embodiments, the system can be configured to process up to 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, or more motion signals. In some embodiments, these waveforms can be used for phase and/or amplitude gating.
  • the systems disclosed herein can use these waveforms for tracking patient’s physiological state (for example, heart rate, respiration rate, heart rate variability, change in cardiac output, change in pulse pressure, or the like) during some period of or multiple periods of the motion tracking capture session, or the entire motion tracking capture session.
  • the systems disclosed herein utilize BCG data and/or signals to perform peak to peak pulse point and/or interval detection.
  • the peak pulse point and/or interval detection data can be used by the systems disclosed herein to detect the occurrence of cardiac arrhythmia (for example, asymptomatic or persistent A-fib, or the like) during the scan by quantifying the variability of the peak-to-peak intervals.
• the systems disclosed herein for tracking motion of a patient and/or object of interest during biomedical imaging applications to compensate for motion artifacts comprise one or more detectors working in conjunction with one or more markers, wherein the one or more detectors are configured to detect images over time of the one or more optical targets or markers (for example, specially designed markers with one or more dots and/or circles) placed on and/or coupled to a patient’s face.
  • the one or more detectors are configured to generate temporal motion data in terms of six degrees of freedom (6-DOF).
  • the systems disclosed herein include four cameras positioned to view at different directions with at least one camera or at least two cameras being adapted to record two dimensional images of the one or more optical targets.
  • the six degrees of freedom are over orthogonal directions x, y, and z and roll, pitch and yaw angles.
  • the six degrees of freedom are extracted at high speed (for example, during each sample period), and in some embodiments, by using a motion tracking algorithm that is configured to process two dimensional images of the target.
• the direction x is along the coronal plane (in the shoulder-to-shoulder direction)
  • direction y is along the spinal axis located on the coronal plane perpendicular to the direction x
  • direction z is along the sagittal plane in the floor-to-ceiling direction and perpendicular to the x and y directions.
• the roll angle, Rx, is about the x-axis (the angle made by shaking the head “No”), the pitch angle is about the y-axis (the angle made by shaking the head “Yes”), and the yaw angle is about the z-axis (the angle made by leaning the head forward).
• FIG. 3 illustrates an example of motion tracking data (for example, 6 DOF signals) for approximately 11 minutes (updated at a 17 ms sampling interval) during an MR imaging session of a test subject.
• signals for a 60 second duration are also illustrated.
  • minute traces of cardiac and respiratory motion signals can be identified in many of these imaging channels.
• the y-direction signal can contain a large motion-induced component from cardiac beats riding over the non-stationary, slowly varying component.
• the subtle motion has a peak to peak amplitude of approximately 40 microns and can represent the movement due to reactionary forces experienced by the body upon cardiac expulsion of blood into the arteries.
  • the systems disclosed herein are configured to utilize mechanisms for identifying BCG waveforms.
  • the subject and/or patient is lying in the supine position, the z-direction signal can contain the subject’s respiratory signal.
  • the systems disclosed herein are configured to extract the cardiac and respiratory signals from y and z-motion signals and gate (for example, trigger) the image acquisition to match the amplitude or phase.
  • the systems disclosed herein can be configured to limit motion induced image degradation by extracting the cardiac and respiratory signals from y and z-motion signals and gating (for example, trigger) the image acquisition to match the amplitude or phase.
  • the systems disclosed herein are configured to generate BCG and/or respiratory signals and/or data by using signal processing methods.
  • the systems disclosed herein employ a three-step process.
  • the systems disclosed herein can employ a more complex method, for example, an independent component analysis or multi-channel signal processing algorithms can also be applied to simultaneously process x, y and z or x, y, z, Rx, Ry and Rz motion signals/data and extract cardiac and respiratory signals of interest.
• the systems disclosed herein employ a first step that involves de-trending the signal to remove the slowly varying trend that can lead to a non-stationary signal component.
• Slow linear or more complex non-linear trends representing the non-stationary component can cause distortions in time- and frequency-domain analysis.
  • the systems disclosed herein in some embodiments, can systematically test for non-stationarities and retain only stationary segments for further analysis or apply other signal processing methods to try to remove the slow non-stationary trends before analysis.
  • the first method can result in erroneous conclusions because the segments that are removed contain physiological information related to heart rate variability (HRV) data.
  • the systems disclosed herein can be configured to use signal processing methods where de-trending is done via smoothness prior approach. In some embodiments, the systems disclosed herein can be configured to employ a method that uses a single parameter to adjust the frequency response of the signals such that the systems can be adjusted to different situations.
• the de-trending filter operates like a time-varying high pass Finite Impulse Response (FIR) filter by removing lower regions of the frequency band.
  • the one-dimensional signal can be considered as two components: (1) stationary motion signal of interest and (2) a low frequency slowly varying non-stationary component.
  • the slowly varying non-stationary component is modeled in terms of the regression parameters to estimate the trend.
  • the systems disclosed herein can be configured to use the regularized least squares solution to determine the regression parameters.
• the de-trended, stationary motion signal, P_stat, is obtained from equation no. 1 shown below:

P_stat = (I − (I + λ² D₂ᵀ D₂)⁻¹) P_original   (1)
• P_original is the original signal that needs de-trending (for example, the y-channel motion signal)
• D₂ is a second order difference matrix, an approximation of the second-derivative operator, which is of the form of an (N−2)×N banded matrix whose rows are (1, −2, 1), shifted one column at a time along the diagonal.
• the invertible function in equation 1 represents a time-varying high pass filter. If P_original is of size N, then I is the identity matrix of size N×N.
• a single regularization parameter, λ, is used to adjust the frequency response of the de-trending algorithm.
• the regularization parameter can be configured, for example, to be 200 for the y-channel signal sampled at 60 Hz and 1000 for the z-channel signal.
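As an illustrative sketch (not the patented implementation), the smoothness-priors de-trending of equation 1 can be written directly with NumPy. The λ value, sampling rate, and the synthetic drift-plus-ripple signal below are assumptions chosen to mirror the y-channel example:

```python
import numpy as np

def detrend_smoothness_priors(p_original, lam):
    """Smoothness-priors de-trending (equation 1):
    P_stat = (I - (I + lam^2 * D2^T D2)^-1) P_original."""
    p = np.asarray(p_original, dtype=float)
    n = p.size
    # Second-order difference matrix D2: rows (1, -2, 1) along the diagonal.
    d2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        d2[i, i:i + 3] = (1.0, -2.0, 1.0)
    # (I + lam^2 D2^T D2)^-1 acts as a slowly varying trend estimator;
    # subtracting the estimated trend leaves the stationary component.
    trend = np.linalg.inv(np.eye(n) + (lam ** 2) * (d2.T @ d2)) @ p
    return p - trend

# Assumed example: slow drift plus a ~40-micron, 1 Hz cardiac-band ripple at 60 Hz.
fs = 60.0
t = np.arange(0, 10, 1 / fs)
signal = 5.0 * t + 0.04 * np.sin(2 * np.pi * 1.0 * t)
stationary = detrend_smoothness_priors(signal, lam=200)  # lam=200 per the y-channel example
```

Because a linear ramp has zero second difference, the estimated trend absorbs it almost exactly, leaving only the small cardiac-band ripple as the stationary component.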
  • an optimal value of the parameter(s) can be set a priori by experimenting on different patients in the environment in which signals are acquired (for example, in their clinical/home/work environment).
  • the systems disclosed herein can employ a second step of applying a band-pass filter to the de-trended signal in order to retain the frequency bands of interest.
  • the band-pass filter can be configured to remove undesirable frequencies below and/or above the expected frequency range.
• the systems can use a band-pass filter with a frequency range of 0.75 to 2 Hz for extracting heart rate signals/data.
  • the upper limit can be as large as 7 Hz or above.
• the systems can be configured to detect a respiratory motion signal wherein the frequency band can be limited to 0.05 to 0.3 Hz.
• the respiratory motion signal can be configured to have an upper limit as high as 0.5 Hz for adults and 1 Hz for neonatal intensive care babies, and somewhere in between for children.
  • the systems can be configured to remove phase distortions in filtering by using digital filtering methods.
• the systems can be configured to use the zero-phase digital filtering algorithm, which processes the input signal through the chosen band-pass filter transfer function of interest in the forward and reverse directions.
• the result is zero phase distortion.
  • the foregoing system configuration reduces the noise and/or retains the signal in the band of interest without injecting delays as in normal filters.
  • the zero-phase filtering helps to preserve features in a filtered time waveform exactly or substantially exactly or approximately where the features occur in the unfiltered signal.
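One hedged way to realize the band-pass and zero-phase filtering steps described above is SciPy's `butter`/`filtfilt` (forward-backward filtering). The filter order and the synthetic two-tone test signal are illustrative assumptions, not values from the text:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def zero_phase_bandpass(x, fs, low_hz, high_hz, order=2):
    """Band-pass filter applied forward and backward (zero-phase),
    so features stay where they occur in the unfiltered signal."""
    nyq = 0.5 * fs
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="bandpass")
    return filtfilt(b, a, x)

# Assumed synthetic mixture: a 1.2 Hz "cardiac" tone plus a 0.2 Hz "respiratory" tone.
fs = 60.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 1.2 * t) + 2.0 * np.sin(2 * np.pi * 0.2 * t)
cardiac = zero_phase_bandpass(x, fs, 0.75, 2.0)   # cardiac band from the text
resp = zero_phase_bandpass(x, fs, 0.05, 0.3)      # respiratory band from the text
```

Because the filtering runs in both directions, the extracted cardiac and respiratory components line up in time with the corresponding components of the unfiltered input.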
  • the systems disclosed herein can employ a third step wherein the system is configured to compute power spectral density and determine fundamental frequency of interest.
• the systems disclosed herein are configured to employ up-sampling the signal through interpolation to increase the number of data points; further smoothing may also be required.
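The third step, extracting the fundamental frequency from the power spectral density, can be sketched as follows; the 75 beats-per-minute synthetic signal is an assumption for illustration:

```python
import numpy as np
from scipy.signal import periodogram

def fundamental_frequency_hz(x, fs):
    """Return the peak of the power spectral density as the fundamental frequency."""
    freqs, psd = periodogram(x, fs=fs)
    return freqs[np.argmax(psd)]

# Assumed 75 beats-per-minute synthetic cardiac signal sampled at 60 Hz.
fs = 60.0
t = np.arange(0, 60, 1 / fs)
cardiac = np.sin(2 * np.pi * 1.25 * t)
hr_hz = fundamental_frequency_hz(cardiac, fs)
hr_bpm = 60.0 * hr_hz
```

For finer frequency resolution, the up-sampling by interpolation mentioned in the text could be applied (for example with `np.interp` onto a denser time grid) before the PSD step.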
• FIG. 4 illustrates a cardiac signal (and a heart rate) obtained after applying the foregoing steps to a y-channel trace of 1-minute length.
  • FIG. 5 illustrates a respiratory signal (and a respiratory rate) from z-channel trace for the same duration batch.
• the system was configured for generating a BCG signal based on a long batch of 1-minute length
  • the systems disclosed herein can be configured for generating continuous signals at a shorter batch length anywhere between 5 to 15 seconds, which in some embodiments, is preferred.
• the system can be configured to comprise a batch length of 10 seconds to 30 seconds, due to lower frequency components, assuming the breathing frequency range is between 6 and 25 cycles per minute.
  • these batch length ranges can vary based on the lower limits on the breathing frequency and/or other frequencies.
• FIG. 8 illustrates an example of a method for generating BCG and RR signals by computing successively with overlapping batches one at a time, each new batch using a large part of the previous batch and at least a few new samples, while eliminating the same number of samples from the immediately preceding batch.
  • retaining an overlap of 95% between neighboring batches is preferred to minimize large variations between batches.
  • batch length can be, and in some embodiments, preferably, kept to a minimum with maximum overlap.
  • the overlap in the sliding window can be set to maximum with one sample elimination from previous immediate batch and one sample inclusion from new neighboring batch.
• the continuous stream of signals derived from processing each batch is then stitched together and filtered again with the zero-phase distortion filtering mentioned above to remove any discontinuities.
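The overlapping-batch scheme can be sketched as a sliding window generator; the 10 s batch length and 95% overlap below follow the ranges given in the text, while the integer stand-in trace is an assumption:

```python
import numpy as np

def sliding_batches(signal, batch_len, step):
    """Yield successive overlapping batches; with step=1 the overlap is
    maximal (drop one old sample, add one new sample per batch)."""
    for start in range(0, len(signal) - batch_len + 1, step):
        yield signal[start:start + batch_len]

fs = 60
batch_len = 10 * fs          # assumed 10 s batches (within the 5-15 s range)
step = batch_len // 20       # 95% overlap between neighboring batches
trace = np.arange(60 * fs)   # stand-in for one minute of a motion trace
batches = list(sliding_batches(trace, batch_len, step))
```

In a full pipeline each batch would be de-trended, band-pass filtered, and reduced to a BCG/RR estimate, and the per-batch outputs would then be stitched and re-filtered with the zero-phase filter to remove discontinuities.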
• FIG. 6 illustrates an example of a measured Ballistocardiograph (BCG) waveform for one heartbeat.
  • This signal/data illustrated in FIG. 6 was produced with a modified electronic weighing scale.
• the example illustrated in FIG. 6 comprises several waves such as the “H”, “I”, “J”, “K”, and “L” waves, which are typical of BCG recordings when standing on the weighing scale.
• increasing the upper band of the band-pass filter while processing the y-channel signal clearly shows the presence of similar traces of “H”, “I”, “J”, “K”, and “L” waves, examples of which are illustrated in FIG. 7A and FIG. 7B. “M” and “N” waves can also be seen in these filtered waveforms. It is to be noted that these BCG signals are inverted compared to FIG. 6. In some embodiments, the system can be configured to confirm this observation after acquiring EKG signals/data.
  • the I-wave in the weighing scale signal/data corresponds approximately to the trough or foot of the BP waveform at the inlet of the ascending aorta, while the time of the J wave peak corresponds approximately to the foot of the BP waveform at the outlet of the descending aorta.
  • BCG signal/data can also be used for determining cardiovascular health and disease information.
• the time interval between the beginning of the “I” wave and the “J” wave peak (trough of the inverted BCG signal) can represent the aortic pulse transit time.
• cardiac gating can be initiated by detecting the beginning of the “I” wave.
• the interval between the R peak and the I wave (RI interval) in the foot BCG was then modelled with respect to the pre-ejection period (PEP) for a group of 17 subjects.
• the RI interval depends on PEP.
• the system can be configured to use the I wave as a gating signal instead of the R wave for initiating image capture.
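A hedged sketch of I-wave detection for gating: candidate I waves are taken as prominent negative-going peaks of the BCG trace (assumed inverted, as the text notes for FIG. 7), using SciPy's `find_peaks`. The prominence factor and the toy beat train are assumptions:

```python
import numpy as np
from scipy.signal import find_peaks

def gating_triggers_from_bcg(bcg, fs, max_hr_hz=2.0):
    """Detect candidate I-wave points in an (assumed inverted) BCG trace
    as prominent negative-going peaks; return trigger sample indices."""
    bcg = np.asarray(bcg, dtype=float)
    min_distance = int(fs / max_hr_hz)  # beats cannot be closer than the max heart rate allows
    peaks, _ = find_peaks(-bcg, distance=min_distance,
                          prominence=0.5 * np.std(bcg))
    return peaks

# Assumed toy BCG: a 1 Hz beat train whose troughs stand in for I waves.
fs = 60.0
t = np.arange(0, 20, 1 / fs)
bcg = -np.abs(np.sin(np.pi * t))
triggers = gating_triggers_from_bcg(bcg, fs)
```

Each returned index would be emitted as a gating trigger to the scanner in place of an EKG R-wave trigger.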
• the systems disclosed herein can be configured to extract BCG waveforms contact-free by leveraging the MR imaging machines and motion tracking hardware, for use in reasonably accurate timing for synchronizing with cardiac signals.
  • peak to peak pulse points and/or intervals can be used by the systems disclosed herein to extract time domain Heart Rate Variability (HRV) statistics.
  • HRV refers to the beat-to-beat time variation in heart beat and is modulated primarily by the alterations in the Autonomic Nervous System (ANS) via changes in the balance between parasympathetic and sympathetic influences.
  • HRV statistics can be used as a quantitative marker to measure the state of ANS.
  • heart rate is not fixed even in healthy individuals.
  • heart rate automatically adjusts for stress, respiration, metabolic changes, thermoregulation, physical exertions, endocrine cycles, etc.
  • the ANS is represented by the sympathetic and parasympathetic nervous system (SNS and PSNS).
  • SNS and PSNS function in opposition to each other.
  • HRV monitoring during MRI capture is a quick physiologic indicator and/or a superficial reflection of the state of the ANS.
  • HRV statistics can also be used to determine whether the patient is in A-Fib or sinus rhythm.
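The peak-to-peak interval statistics described above can be sketched as simple time-domain HRV measures (SDNN, RMSSD). The variability threshold in the screening flag is a made-up illustrative value, not a clinical criterion:

```python
import numpy as np

def hrv_time_domain(peak_indices, fs):
    """Time-domain HRV statistics from beat peak indices: mean interval,
    SDNN (interval standard deviation), and RMSSD (RMS of successive
    interval differences), quantifying peak-to-peak variability."""
    rr = np.diff(np.asarray(peak_indices)) / fs  # intervals in seconds
    return {
        "mean_rr_s": float(np.mean(rr)),
        "sdnn_s": float(np.std(rr)),
        "rmssd_s": float(np.sqrt(np.mean(np.diff(rr) ** 2))),
    }

def irregular_rhythm_flag(stats, sdnn_threshold_s=0.15):
    """Made-up screening rule (NOT a clinical criterion): flag highly
    variable intervals as a possible arrhythmia for clinician review."""
    return stats["sdnn_s"] > sdnn_threshold_s

fs = 60.0
regular = np.arange(0, 600, 60)   # steady 1 s beats detected in a motion trace
stats = hrv_time_domain(regular, fs)
```

A perfectly regular beat train yields zero SDNN and is not flagged, while markedly uneven intervals trip the flag.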
  • the systems disclosed herein can be configured to process raw unfiltered BCG waveforms through deep learning algorithms and/or SVM classifiers to extract key features that can be mapped to blood pressure.
• for the systems disclosed herein to exploit clinically significant parameters from the BCG waveform, the accuracy of motion measurements should increase by 10X or more, and/or this can be done with a high resolution stereo vision system.
• the systems disclosed herein use a method to estimate physiologically relevant cardiac and/or respiratory signals/data by leveraging 6-degrees-of-freedom motion signals/data for use in phase/amplitude gating of MR images.
• the system can be configured to detect characteristic points of the desired cardiac and/or respiratory signal/data in multiple ways: (1) by detecting peak or valley points using a peak/valley detection algorithm, (2) zero cross over, or (3) a threshold level detector in the inspiration/expiration cycle.
  • end-inspiration or end-expiration are generally used as trigger points for the X-Ray systems.
• FIG. 9 illustrates an example of the gating trigger signal, which is time synchronized to the respiration cycle based on a threshold level detector.
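A minimal sketch of the threshold level detector: triggers fire where the respiratory waveform crosses a level on its rising (or falling) edge. The 0.9 threshold and the 12 breaths/min toy waveform are assumptions:

```python
import numpy as np

def threshold_triggers(resp, threshold, rising=True):
    """Return gating trigger sample indices where the respiratory waveform
    crosses a threshold level (a simple level detector)."""
    resp = np.asarray(resp)
    above = resp > threshold
    if rising:
        return np.flatnonzero(~above[:-1] & above[1:]) + 1
    return np.flatnonzero(above[:-1] & ~above[1:]) + 1

# Assumed toy respiration: 0.2 Hz (12 breaths/min) sampled at 60 Hz.
fs = 60.0
t = np.arange(0, 30, 1 / fs)
resp = np.sin(2 * np.pi * 0.2 * t)
triggers = threshold_triggers(resp, threshold=0.9)  # level near end-inspiration
```

Choosing a high threshold on the rising edge approximates end-inspiration gating, while a falling-edge detector near the waveform minimum would approximate end-expiration.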
  • the systems disclosed herein can employ this method, which in some embodiments, has the potential to not only improve the quality of MR images, but also provide valuable medical diagnosis data unobtrusively and/or with no patient cooperation.
  • the systems and methods disclosed herein are better than other systems because the systems and methods disclosed herein are contact-free, inconspicuous and are very accurate, especially when used with Artificial Intelligence such as feature-based deep learning algorithms.
• the systems and methods herein can be advantageous in two areas of MR imaging: (1) workflow simplification with high accuracy gating and (2) unobtrusive diagnosis of critical clinical parameters without patient co-operation.
• the systems, methods, and devices disclosed herein relate to signal acquisition from a video-based apparatus for extracting physiology information from a living body in a non-contact manner, for example in many applications such as in cell phones, laptops, or in a hospital setting where one or more measuring devices can be mounted inside the device or on the wall.
• one or more video measurements are taken by sequentially illuminating LEDs on the subject’s exposed skin surface.
  • reflected light is then recorded in the photodetector array by integrating the light for a predetermined time.
• the signals are then transformed to extract one or more vitals, such as heart rate, respiration rate, blood pressure, and/or oxygen saturation, for tracking the health of living beings.
• Some embodiments described herein are specifically suited for taking measurements under non-cooperative settings. Certain embodiments described herein can be used anywhere living beings are present, such as, for example, workplaces, homes, hospitals, home care, minute clinics, sleep labs, intensive care units, doctors’ offices, automobiles, and/or self-driving vehicles, for intermittent and/or continuous monitoring of one or more vitals. Certain embodiments described herein can also be used for measuring cardiovascular health of fish in fish tanks, animals in zoos, and/or the like.
• a switched narrow-band illumination is utilized, for example near infrared (NIR) centered around 940 nm, and/or wideband between 805-1000 nm, since the absorption of oxygenated and de-oxygenated hemoglobin can differ significantly.
  • one or more high intensity Light Emitting Diodes can be used.
  • one or more CCD and/or CMOS detectors can be used in this wavelength band.
• other wavelengths, for example between 1000-1300 nm, can be suitable with non-silicon detectors.
  • the system can be configured to maintain NIR illumination within eye safety limits.
  • a key metric used for monitoring patient health can be the oxygen saturation in the blood (arterial or venous blood) to know how well the patient is oxygenated. In some cases, this can be done using a device which clamps onto the patient’s finger-tip with a cord transmitting signals back to a display module. However, in some cases, these devices can impede the patient from using both hands.
• the system can comprise two LEDs having different wavelengths, sequentially illuminated one after the other (one below 805 nm and the other above 805 nm).
• these two LEDs can distinguish between two types of hemoglobin (oxygenated hemoglobin, HbO2, and de-oxygenated hemoglobin, Hb).
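The two-wavelength scheme can be sketched with the standard pulse-oximetry "ratio of ratios". The linear calibration constants and the crude AC/DC split below are assumptions for illustration; real devices use empirically derived calibration curves or look-up tables:

```python
import numpy as np

def ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """Pulse-oximetry ratio-of-ratios R, comparing the pulsatile (AC) to
    non-pulsatile (DC) components at the two wavelengths."""
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def spo2_from_r(r, a=110.0, b=25.0):
    """Hypothetical linear calibration SpO2 = a - b*R; the constants a and b
    are illustrative assumptions, not values from the text."""
    return a - b * r

def ac_dc(x):
    """Crude AC/DC split: half the peak-to-peak amplitude as AC, mean as DC."""
    return (np.max(x) - np.min(x)) / 2.0, float(np.mean(x))

# Assumed toy VPG signals at the two wavelengths (below and above 805 nm).
fs = 30.0
t = np.arange(0, 10, 1 / fs)
red = 1.0 + 0.01 * np.sin(2 * np.pi * 1.2 * t)
ir = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)

ac_r, dc_r = ac_dc(red)
ac_i, dc_i = ac_dc(ir)
r = ratio_of_ratios(ac_r, dc_r, ac_i, dc_i)
spo2 = spo2_from_r(r)
```

The different pulsatile amplitudes at the two wavelengths encode the balance between oxygenated and de-oxygenated hemoglobin, which the calibration maps to a saturation estimate.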
  • FIG. 10 illustrates an example(s) of absorption coefficient of hemoglobin and water shown with respect to wavelength.
  • pulse transit time can represent an approach for ubiquitous BP monitoring.
  • PTT can be measured from the EKG R- wave and the finger pulse from oximeter or the like, and can be mapped to systolic and diastolic blood pressures using biophysics models or calibration look up table created by comparing PTT to gold standard instruments.
  • the system can be configured to utilize PTT measurement between an upstream blood pressure pulse within the blood vessel and a distal peripheral pulse taken from the same video frame or captured with a separate video camera whose acquisitions are synchronized.
• the approach can involve extracting two regions, such as a proximal and a distal region of interest, and post-processing the video frames to extract videoplethysmographic (VPG) signals.
  • the system can be configured to compute the phase difference between two VPG signals and map the PTT to systolic and diastolic blood pressures, for example via a look up table and/or a mathematical model.
  • the system can be configured to use the phase difference that exists between BCG wave taken with head motion due to the force created by ejection of blood and the VPG wave taken on a small ROI on the forehead.
  • the system can be configured to perform the capture for a few seconds to collect both BCG and VPG waveforms and then map the phase difference to systolic and diastolic blood pressures.
• heart rate(s) and/or respiration rate(s) are measured by extracting the peak of the power spectral density functions using the VPG and respiration signals at the wavelength bands of interest mentioned earlier.
  • FIG. 11 and FIG. 12 illustrate a small region of interest representing VPG marks from certain systems described herein, such as for example a system comprising four cameras.
• FIG. 13 illustrates VPG signals from certain camera systems, overlapped with the de-trended but unfiltered signal, at different ROIs.
  • FIG. 14 illustrates an example(s) of a VPG signal (top) and respiration signal (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8.
  • FIG. 15 illustrates an example(s) of pulse rate / heart rate (top) and respiration rate (bottom) obtained from one of the ROIs illustrated in FIG. 11 with continuous tracking with sliding windows as illustrated in FIG. 8.
  • the systems, methods, and devices disclosed herein are configured to improve the strength of videoplethysmography (VPG) signals.
  • the system is configured to use pulsed illumination, as opposed to continuous illumination, of the skin and/or subject region of interest.
  • the off-time duration of the illumination can be configured to be greater than the thermal relaxation time (TRT) of the tissue of interest.
  • light penetration of skin can be gradually higher with increasing wavelengths.
  • higher depth of penetration can be useful to reach more red blood cells (RBCs) so that the strength of VPG signal is improved.
• around 20% of the light energy can be absorbed in the blood vessels, which can result in more heat being generated inside the tissue, which can be problematic for improving the strength of the VPG signal.
  • some embodiments herein comprise one or more pulsing illuminators, such that intervals between illuminations can be adjusted to reduce heating effects in the tissue or region of interest.
  • thermal relaxation time (TRT) of tissue represents the time required to cool down the heated structure to about 50% of the initial temperature. Further, if the illumination on-time is higher than the TRT or skin cooling time, this can alter light-tissue interactions, resulting in lower average depth of light penetration.
  • pulsed illumination of the subject region and/or target of interest can be controlled such that the duration between pulses or illumination is greater than at least the TRT of the subject region and/or target of interest.
  • the subject region and/or target of interest can be given sufficient time to cool down before being illuminated again.
  • the illumination on-time can be configured such that it is shorter than the TRT to prevent overheating of the subject region and/or target of interest.
  • the system can comprise one or more polarization optics to reduce specular reflection.
  • the system can be configured to utilize polarization techniques to preferentially remove specular reflection, which can help to improve the performance of VPG signal extraction.
  • the system can comprise an analyzer in the imaging path, which can be a polarizer with the transmission axis perpendicular to that of a polarizer in the illumination path. As such, the reflected light with the same polarization state as the illuminated light can be blocked.
  • specular reflected light can partially retain the source polarization, while light that penetrates the skin can be reflected with randomized polarization.
• the polarizer in the imaging path can preferentially transmit the diffused light, which can improve VPG signal strength.
• chromophores in skin tissue include hemoglobin and its derivatives, melanin, water, and foreign pigmented tattoos.
  • the wavelength of light can influence selective light absorption by a certain target structure and can also influence the depth of tissue penetration.
  • skin penetration is gradually higher with increasing wavelengths. For wavelengths varying from 300-1,200 nm, melanin can be the dominant absorbent.
  • the scattering effect can make the light spread out and limit the depth of light penetration. Higher depth of penetration can be useful to reach more Red Blood Cells (RBC).
• when the target structure absorbs light, the light can be converted to heat, which is conducted to the surrounding structures.
  • the system can be configured to pulse or flicker light illuminating the target tissue, thereby allowing cooling of the target tissue between pulsing sequences and retaining higher depth of light penetration. Controlling the pulsing or flickering of light illuminating the target tissue of interest can play a key role in VPG signal strength.
• an epidermal thickness of 0.1 mm can have a TRT of about 1 ms, while a vessel of 0.1 mm diameter can have a TRT of about 4 ms. Further, as another example, a vessel three times larger (0.3 mm) can have a TRT of approximately 10 ms. As such, in some embodiments, controlling the pulsing interval to be greater than 10 ms can provide sufficient time for the target tissue to cool and return to its original state.
• the time of tissue exposure to light can be adjusted by controlling the on-time or pulse duration of the light source, such as an LED or other light source, illuminating the target region or tissue of interest.
  • FIG. 16 is a schematic diagram illustrating intervals for pulsing illumination. As shown in FIG. 16, in some embodiments, the switching time or switching interval can be defined as the period between two consecutive on- times when the light is turned on to illuminate the region or tissue of interest. Further, in some embodiments, the pulse duration can be defined as the period during which the light is turned on to illuminate the tissue or region of interest.
  • the switching time or switching interval to be greater than the TRT of the tissue or region of interest and/or by adjusting the pulse duration to be less than the TRT of the tissue or region of interest, sufficient time can be provided for the tissue structure at the region of interest to sufficiently cool down such that clearer images and/or VPG signals can be obtained.
  • the TRT can be proportional to the square of the size of the target tissue structure.
  • the TRT or relaxation time can be computed from the following equation: T = d² / (k · α), where:
  • T relaxation time
  • d size of the heated object
  • α thermal diffusivity
  • k geometrical factor.
  • the thermal diffusivity can be about 2 × 10⁻³ cm²/s for the dermis layer.
  • the geometrical factor can be about 16 for a cylindrical object, such as a vessel.
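As an illustrative sketch of this calculation (the function name, units, and default values below are our choices, taken from the example figures above rather than from any particular embodiment):

```python
# Sketch of the TRT equation above, T = d^2 / (k * alpha), using the example
# values given in the text (function name and unit choices are ours).

def thermal_relaxation_time(d_cm, k=16.0, alpha=2e-3):
    """TRT in seconds for a structure of size d_cm (cm); defaults are the
    cylindrical-vessel geometrical factor and dermis diffusivity cited above."""
    return d_cm ** 2 / (k * alpha)

trt_small = thermal_relaxation_time(0.01)   # 0.1 mm vessel: ~3 ms
trt_large = thermal_relaxation_time(0.03)   # 0.3 mm vessel: 9x larger (TRT ∝ d²)
```

Because TRT scales with the square of the structure size, tripling the diameter multiplies the relaxation time by nine, consistent with the proportionality noted above.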
  • FIG. 17 is a flowchart illustrating an example embodiment(s) of pulsing illumination to obtain clearer raw VPG signals.
  • medical personnel, user, and/or system can identify one or more regions of interest (ROI) and/or targets of interest on a subject and/or tissue sample at block 1702.
  • the system, medical personnel, and/or other user can determine the TRT of one or more tissue structures within the ROI or target of interest at block 1704, for example using in part the equation above.
  • the system, medical personnel, or other user can set the LED or other system illuminator on-time to be less than the TRT, for example to ensure that the tissue sample is not overheated.
  • the system, medical personnel, or other user can set the duration between pulses or switching interval time to be greater than TRT, for example to ensure sufficient cooling time for one or more tissue structures within the ROI or target of interest.
  • the system can be configured to collect raw signal data and/or image data at block 1710; the signal strength can be higher than that obtained using continuous illumination of the ROI without pulsing or flickering.
  • the system can be configured to extract one or more VPG signals at block 1712, which can be more accurate and/or reliable than the same extracted using continuous illumination of the ROI without pulsing or flickering.
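The timing constraints of blocks 1706 and 1708 can be sketched as follows (the helper function and its margin factors are illustrative assumptions, not part of the disclosed system):

```python
# Sketch of blocks 1706 and 1708 (helper name and margin factors are
# illustrative assumptions): choose an LED on-time below the TRT and a
# switching interval above the TRT so the tissue can cool between pulses.

def choose_pulse_timing(trt_s, on_fraction=0.5, interval_factor=1.5):
    """Return (on_time, switching_interval) in seconds for a given TRT."""
    on_time = on_fraction * trt_s                 # block 1706: on-time < TRT
    switching_interval = interval_factor * trt_s  # block 1708: interval > TRT
    return on_time, switching_interval

# For a 0.1 mm vessel with TRT ~ 4 ms:
on_time, interval = choose_pulse_timing(4e-3)     # 2 ms on, 6 ms between pulses
```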
  • FIG. 18A illustrates example images obtained with pulsing illumination
  • FIG. 18B illustrates example raw VPG signals obtained with pulsing illumination as disclosed herein for a test subject.
  • FIGS. 18A and 18B show raw signals obtained from a tissue sample when the pulse duration, or the period during which the LED illuminator was on (the on-time), was about 2 ms.
  • FIGS. 18A and 18B show raw signals obtained from a tissue sample when the switching interval was about 16.67 ms.
  • the camera integration time was about 8 ms in FIGS. 18A and 18B.
  • FIG. 19A illustrates example images obtained with continuous illumination when the LED illuminator was not pulsing
  • FIG. 19B illustrates example raw VPG signals obtained for the same test subject with continuous illumination when the LED illuminator was not pulsing.
  • raw signals were obtained from the same tissue sample for the same test subject when the LED illuminator was continuously on and without any switching interval.
  • camera integration time was about 8 ms.
  • the specular component of received light tends to obscure the skin pixels in the image.
  • FIG. 20A illustrates an example unpolarized image of tissue.
  • direction of light wave polarization can be manipulated with polarizing filters which may be optimized to largely or completely suppress specular light.
  • FIG. 20B illustrates an example polarized image of the same tissue from FIG. 20A.
  • the system comprises one or more polarization optics to reduce specular reflection.
  • FIG. 21 is a schematic diagram illustrating an example imaging system comprising a polarizer and an analyzer in the light path for improving VPG signals.
  • the system can comprise a polarizer in the illumination path.
  • the polarizer can be fixed in some embodiments.
  • the system can comprise an analyzer in the imaging path, which can be a polarizer with the transmission axis perpendicular to that of the polarizer in the illumination path.
  • the analyzer can be rotatable or adjustable in some embodiments.
  • the reflected light with the same polarization state as the illuminated light can be blocked, thereby substantially removing the specular component which can be mainly surface reflection and which can contain substantially no absorption information.
  • diffused light coming from deeper structures can be randomly polarized and thus preferentially passed through the analyzer.
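A minimal numeric sketch of this cross-polarization argument (ours, not from the patent): specular light retains the illumination polarization, so with the analyzer at 90° Malus's law passes essentially none of it, while randomly polarized diffuse light passes at roughly half intensity on average:

```python
import math

# Numeric sketch (not from the patent) of why crossed polarizers suppress
# the specular component: specular reflection keeps the illumination
# polarization, so Malus's law I = I0 * cos^2(theta) blocks it at theta = 90
# degrees, while randomly polarized diffuse light averages cos^2 to one half.

def transmitted_specular(i0, theta_deg):
    """Intensity of linearly polarized light after an analyzer at theta_deg."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

def transmitted_diffuse(i0):
    """Average transmission of unpolarized light through an ideal analyzer."""
    return 0.5 * i0

specular_out = transmitted_specular(1.0, 90.0)  # effectively zero: blocked
diffuse_out = transmitted_diffuse(1.0)          # 0.5: preferentially passed
```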
  • systems, devices, and methods described herein can measure heart rate variability with infrared illumination. In some embodiments, systems, devices, and methods described herein can further provide feedback to an immersive video display to balance the subject's heart rate variability (i.e., sympathetic and/or parasympathetic tones). In some embodiments, systems, devices, and methods described herein can be configured to measure sympathetic and/or parasympathetic tones with a videoplethysmography (VPG) signal(s). In some embodiments, systems, devices, and methods described herein can be configured to provide a recommendation(s) for balancing sympathetic and/or parasympathetic tones of a subject.
  • the system can provide audio signals to the patient and reduce MRI or other medical imaging scanner or therapeutic device noise to a background hum.
  • the system can also provide an immersive visual experience via video display. In some embodiments, this is performed without any feedback from the patient’s autonomic nervous system (ANS).
  • the system can be configured to measure the state of autonomic activity during the scan, which is triggered or influenced by the scanning operation with or without patient cooperation, and then make recommendations to play appropriate video images for balancing the autonomic activity.
  • this approach can lead to improved scanning experience for patients.
  • HRV Heart Rate Variability
  • HRV can refer to a quantitative marker used to measure the state of ANS.
  • HRV can be used in some embodiments as a technique to balance the autonomic nervous system.
  • heart rate is not fixed, but rather automatically adjusts for stress, respiration, metabolic changes, thermoregulation, physical exertions, endocrine cycles, or the like.
  • HRV can refer to the beat-to-beat time variation in heartbeat, which can be modulated primarily by alterations in the ANS via changes in the balance between parasympathetic and/or sympathetic influences.
  • the ANS can be represented by the sympathetic and parasympathetic nervous system (SNS and PSNS). In some instances, they can function in opposition to each other.
  • the SNS typically functions with actions that require quick responses, such as the "fight or flight" response
  • the parasympathetic division functions with actions that do not require immediate reaction, such as our ability to relax, repair, digest, eliminate, and sleep.
  • HRV monitoring can be a quick physiologic indicator and a superficial reflection of the state of our autonomic activity.
  • by monitoring the electrical activity of the heart with procedures such as a contact-based ECG or an invasive catheter, HRV can be estimated.
  • HRV signals can be generated by extracting the intervals between R-waves from the ECG.
  • spectral analysis of R waves, i.e., RR intervals, of a 2 to 5 minute short ECG recording can contain three components: (1) a very low frequency (VLF) component within 0.003 to 0.04 Hz; (2) a low frequency (LF) component within 0.04 to 0.15 Hz; and (3) a high frequency (HF) component within 0.15 to 0.4 Hz.
  • typical heart rate can have frequencies anywhere between 0.7 Hz to 4 Hz.
  • a ratio of the powers concentrated in the LF component to the HF component can provide a useful HRV measurement.
  • evolution of this ratio over time also contains useful information and can be used for measuring the state of autonomic activity.
  • FIG. 22A and FIG. 22B illustrate examples of how the peak power in low frequency (LF) and high frequency (HF) components can be different under sympathetic and/or parasympathetic influence.
  • FIGS. 22A and 22B illustrate examples of spectral assessment, in which FIG. 22A illustrates an example power spectrum of a pulsating cardiac signal with a dominant LF component or dominant sympathetic influence, and FIG. 22B illustrates an example power spectrum of a pulsating cardiac signal with a dominant HF component or dominant parasympathetic influence.
  • FIG. 23 illustrates example frequency contents and their range and associations with ANS.
  • FIG. 24 illustrates example spectra contained in pulsating cardiac signal(s) and their associations with ANS.
  • a video processing system can capture one or more VPG waveforms at block 2402.
  • the system can be configured to compute power in low frequency and high frequency spectra at block 2404.
  • the system can be configured to compute a ratio of the power in the LF component divided by the power in the HF component at block 2406.
  • the system can be configured to determine whether this ratio is above a predetermined threshold at block 2408.
  • the system can be configured to select one or more appropriate videos to suppress sympathetic response at blocks 2410 and/or 2412, which can then be displayed to the subject, for example through an in-bore video display system.
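The spectral computation of blocks 2404 through 2408 might be sketched as follows, using a crude FFT periodogram on a synthetic signal (the function, sampling rate, amplitudes, and threshold value are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

# Sketch of blocks 2404-2408 (names and parameter values are illustrative
# assumptions): estimate the LF/HF power ratio of a pulsatile VPG-like
# signal and compare it against a threshold to pick a video.

def lf_hf_ratio(signal, fs):
    """Power in the 0.04-0.15 Hz (LF) band over power in 0.15-0.4 Hz (HF)."""
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                        # remove DC offset
    psd = np.abs(np.fft.rfft(sig)) ** 2           # crude periodogram
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs <= 0.40)].sum()
    return lf / hf

fs = 10.0                                         # Hz (assumed sampling rate)
t = np.arange(0, 300, 1.0 / fs)                   # 5-minute synthetic recording
vpg_like = (np.sin(2 * np.pi * 0.10 * t)          # strong LF content
            + 0.2 * np.sin(2 * np.pi * 0.25 * t)) # weak HF content
ratio = lf_hf_ratio(vpg_like, fs)
threshold = 1.0                                   # block 2408 (example value)
sympathetic_dominant = ratio > threshold          # True -> select calming video
```

With the LF tone five times the HF amplitude, the power ratio comes out near 25, well above the example threshold, so the sketch would branch to blocks 2410/2412 and select a video to suppress the sympathetic response.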
  • motion tracking and/or correction systems such as those illustrated in FIGS. 1A and 1B and/or those comprising one, two, four, or more cameras or detectors for example, can be used to measure HRV.
  • the system can be further configured to select one or more appropriate videos and/or audio from a database to play them in the patient display for balancing HRV.
  • systems, methods, and devices described herein can comprise the following: (i) video system for capturing videoplethysmographic signals; (ii) processing system to compute power in the spectral contents of low frequency and high frequency components; (iii) HRV computation unit to compute the ratio of the powers concentrated in the LF component to the HF component; and/or (iv) video selection & display system for balancing HRV.
  • FIG. 23 illustrates example systems, devices, and methods for balancing HRV.
  • an imbalanced autonomic nervous system with a reduced parasympathetic and increased sympathetic tone can result in stress.
  • breathing meditation videos can reduce the sympathetic tone.
  • asking the patient to breathe slowly with meditative videos can be another way to reduce sympathetic tone.
  • parasympathetic dominance can lead to a slowed, under-energetic state, with a decrease in respiration and heart rate and an increase in digestion. For example, a patient may fall asleep under parasympathetic dominance.
  • a meditative video(s) with fast rhythmic breathing can be selected and displayed to swing back towards balanced ANS.
  • systems, devices, and methods described herein are configured to determine and/or monitor the rate and/or duration of blinks of a subject’s eye, which can enable an accurate assessment of fatigue and/or drowsiness in the subject.
  • a complete non-contact and/or non-obtrusive measurement technique can be provided for extracting blinking rate and/or the duration of blinks of a subject’s eye.
  • blinking rate and/or the duration of blinks of a subject’s eye can be determined by using one or more motion tracking and/or correction systems, such as those illustrated in FIGS. 1A and 1B and/or those comprising one, two, four, or more cameras or detectors for example.
  • the determined rate and/or duration of blinks can be used alone and/or in combination with Heart Rate Variability (HRV).
  • the determined rate and/or duration of blinks can be calibrated to fatigue-related and/or sleep-related metrics, for example to indicate to the medical imaging scanner or therapeutic device operator whether the subject is under fatigue or feeling drowsy or sleepy during the process.
  • camera-based methods can allow extraction of pixel intensities near the eye region once the subject's head is within one or more cameras' FOV (Field of View).
  • the average intensities over time are obtained by directing a region of interest (ROI) in one or all four cameras, for example in a four-camera motion tracking system, to cover one or both eye regions.
  • FIG. 25 provides an illustrative example of images captured with a four-camera motion tracking system, in which the illustrated rectangular regions show areas used for measuring eye blink rate.
  • FIG. 26 is a block diagram illustrating example methods for extracting eye blink rate and/or blink duration.
  • FIG. 26 is a functional block diagram illustrating an example signal processing algorithm comprising steps for extracting eye blink rate and/or blink duration.
  • a video processing system for tracking pixels within one or more region of interests (ROI) can be used by the system at block 2602.
  • the system can integrate pixel intensities within the one or more ROIs and obtain an average pixel intensity time series signal at block 2604.
  • the system can accumulate until a certain time period of data, such as for example 10 seconds, is captured to form a batch with average pixel intensity values at block 2606.
  • the system can be configured to discard 1 second trailing edge data and append 1 second new data at block 2606.
  • the system can perform one or more signal processing steps on the signal at block 2608.
  • the system can be configured to detrend the signal, detect peaks and times at which peaks occur, count the number of peaks from the most recent 1 second data, extract duration of blink and time intervals between successive peaks, and/or compute blink rate (number of blinks per minute) at block 2608.
  • one or more processes described in connection with blocks 2606 and 2608 can be repeated every second with 1 second of new data, as illustrated in block 2612.
  • the system can be configured to display eye blink rate, duration, and/or fatigue/drowsiness metrics at block 2610.
  • each batch of average pixel intensities comprises a 5 to 10 second window.
  • traces of eye blinks can be more prominent in the pixel intensity signal.
  • the top graph of FIG. 27 illustrates an example time series signal, showing average pixel intensity in the ROI used for blink rate detection shown with respect to the video frame number for controlled blinks. In this illustrated example, a total of 6 blinks occurred within a 50 second duration.
  • the bottom graph of FIG. 27 illustrates eye blink rate in blinks per minute shown with respect to time.
  • the system can apply one or more of the following steps for extracting eye blinking data: (1) accumulate 1 second of new data and discard the previous 1 second of data from a batch of approximately 5 to 10 seconds duration; (2) detrend the signal to remove non-stationary components and retain traces comprising large peaks, as seen in the top graph of FIG. 27.
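The windowed peak-counting described above can be sketched as follows (the detrending and thresholding choices here are simplifying assumptions of ours; the disclosed pipeline may differ):

```python
import numpy as np

# Sketch of the blink-extraction steps above (detrending and thresholding
# choices are simplifying assumptions, not the patent's algorithm): detrend
# a 10 s window of mean ROI intensity, mark samples above a threshold as
# blink events, and convert the event count to blinks per minute.

def blink_rate_bpm(intensity, fps, window_s=10.0, z_thresh=2.0):
    """Blinks per minute from the most recent window_s seconds of signal."""
    x = np.asarray(intensity[-int(window_s * fps):], dtype=float)
    x = x - x.mean()                       # crude detrend (mean removal)
    above = x > z_thresh * x.std()         # samples inside a blink peak
    # each blink is one contiguous run of above-threshold samples;
    # count its rising edge (False -> True transition)
    rising = np.count_nonzero(above[1:] & ~above[:-1])
    return rising * 60.0 / window_s

fps = 30                                   # assumed camera frame rate
t = np.arange(0, 10, 1.0 / fps)
signal = np.zeros(t.size)                  # noise-free synthetic intensity
for blink_time in (2.0, 5.0, 8.0):         # three synthetic blinks in 10 s
    signal += np.exp(-((t - blink_time) ** 2) / (2 * 0.05 ** 2))
rate = blink_rate_bpm(signal, fps)         # 3 blinks / 10 s -> 18 blinks/min
```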
  • template matching and AI-based classification algorithms can also be used to estimate eye blink rate and duration.
  • a time series signal can be captured for over a minute-long duration by the system.
  • the corresponding bottom graphs of FIGS. 28 and 29 show the extracted eye blink rate per minute.
  • the illustrated examples in FIG. 28 and FIG. 29 represent data collected from two different subjects.
  • duration of blinks is estimated by thresholding the region where peaks occur.
  • the top graph of FIG. 28 illustrates an example time series signal showing average pixel intensity in the ROI used for blink rate detection shown with respect to the video frame number for a first subject under the camera field of view.
  • the data shown in FIG. 28 is for uncontrolled blinking, in which a high blink rate is affected by high light intensity.
  • the bottom graph of FIG. 28 illustrates eye blink rate in blinks per minute shown with respect to time.
  • the top graph of FIG. 29 illustrates an example time series signal showing average pixel intensity in the ROI used for blink rate detection shown with respect to the video frame number for a second subject under the camera field of view.
  • the data shown in FIG. 29 is for uncontrolled blinking, in which a high blink rate is affected by high light intensity.
  • the bottom graph of FIG. 29 illustrates eye blink rate in blinks per minute shown with respect to time.
  • FIG. 2 is a block diagram illustrating a computer hardware system configured to run software for implementing one or more embodiments of systems, devices, and methods for tracking and analyzing subject motion during a medical imaging scan and/or therapeutic procedure. While FIG. 2 illustrates one embodiment of a computing system 200, it is recognized that the functionality provided for in the components and modules of computing system 200 may be combined into fewer components and modules or further separated into additional components and modules.
  • the computing system 200 comprises a motion tracking and/or biometrics analysis system module 206 that carries out the functions described herein, including any one of techniques described above.
  • the motion tracking and/or biometrics analysis system module 206 and/or other modules may be executed on the computing system 200 by a central processing unit 202 discussed further below.
  • module refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, COBOL, CICS, Java, Lua, C or C++.
  • a software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Software instructions may be embedded in firmware, such as an EPROM.
  • hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors.
  • the modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
  • the computing system 200 also comprises a mainframe computer suitable for controlling and/or communicating with large databases, performing high volume transaction processing, and generating reports from large databases.
  • the computing system 200 also comprises a central processing unit (“CPU”) 202, which may comprise a conventional microprocessor.
  • the computing system 200 further comprises a memory 204, such as random access memory (“RAM”) for temporary storage of information and/or a read only memory (“ROM”) for permanent storage of information, and a mass storage device 208, such as a hard drive, diskette, or optical media storage device.
  • the modules of the computing system 200 are connected to the computer using a standards based bus system.
  • the standards based bus system could be Peripheral Component Interconnect (PCI), MicroChannel, SCSI, Industrial Standard Architecture (ISA) and Extended ISA (EISA) architectures, for example.
  • the computing system 200 comprises one or more commonly available input/output (I/O) devices and interfaces 212, such as a keyboard, mouse, touchpad, and printer.
  • the I/O devices and interfaces 212 comprise one or more display devices, such as a monitor, that allows the visual presentation of data to a user. More particularly, a display device provides for the presentation of GUIs, application software data, and multimedia presentations, for example.
  • the I/O devices and interfaces 212 comprise a microphone and/or motion sensor that allow a user to generate input to the computing system 200 using sounds, voice, motion, gestures, or the like.
  • the I/O devices and interfaces 212 also provide a communications interface to various external devices.
  • the computing system 200 may also comprise one or more multimedia devices 210, such as speakers, video cards, graphics accelerators, and microphones, for example.
  • the computing system 200 may run on a variety of computing devices, such as, for example, a server, a Windows server, a Structured Query Language (SQL) server, a Unix server, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a cell phone, a smartphone, a personal digital assistant, a kiosk, an audio player, an e-reader device, and so forth.
  • the computing system 200 is generally controlled and coordinated by operating system software, such as z/OS, Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Windows 7, Windows 8, Linux, BSD, SunOS, Solaris, Android, iOS, BlackBerry OS, or other compatible operating systems.
  • the operating system may be any available operating system, such as MAC OS X.
  • the computing system 200 may be controlled by a proprietary operating system.
  • Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.
  • the computing system 200 is coupled to a network 216, such as a LAN, WAN, or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link 214.
  • the network 216 communicates with various computing devices and/or other electronic devices via wired or wireless communication links.
  • the network 216 is communicating with one or more computing systems 217 and/or one or more data sources 219.
  • Access to the motion tracking and/or biometrics analysis system module 206 of the computer system 200 by computing systems 217 and/or by data sources 219 may be through a web-enabled user access point such as the computing systems’ 217 or data source’s 219 personal computer, cellular phone, smartphone, laptop, tablet computer, e-reader device, audio player, or other device capable of connecting to the network 216.
  • Such a device may have a browser module that is implemented as a module that uses text, graphics, audio, video, and other media to present data and to allow interaction with data via the network 216.
  • the browser module may be implemented as a combination of an all points addressable display such as a cathode-ray tube (CRT), a liquid crystal display (LCD), a plasma display, or other types and/or combinations of displays.
  • the browser module may be implemented to communicate with input devices 212 and may also comprise software with the appropriate interfaces which allow a user to access data through the use of stylized screen elements such as, for example, menus, windows, dialog boxes, toolbars, and controls (for example, radio buttons, check boxes, sliding scales, and so forth).
  • the browser module may communicate with a set of input and output devices to receive signals from the user.
  • the input device(s) may comprise a keyboard, roller ball, pen and stylus, mouse, trackball, voice recognition system, or pre-designated switches or buttons.
  • the output device(s) may comprise a speaker, a display screen, a printer, or a voice synthesizer.
  • a touch screen may act as a hybrid input/output device.
  • a user may interact with the system more directly such as through a system terminal connected to the score generator without communications over the Internet, a WAN, or LAN, or similar network.
  • the system 200 may comprise a physical or logical connection established between a remote microprocessor and a mainframe host computer for the express purpose of uploading, downloading, or viewing interactive data and databases on line in real time.
  • the remote microprocessor may be operated by an entity operating the computer system 200, including the client server systems or the main server system, and/or may be operated by one or more of the data sources 219 and/or one or more of the computing systems 217.
  • terminal emulation software may be used on the microprocessor for participating in the micro-mainframe link.
  • computing systems 217 that are internal to an entity operating the computer system 200 may access the motion tracking and/or biometrics analysis system module 206 internally as an application or process run by the CPU 202.
  • a Uniform Resource Locator can include a web address and/or a reference to a web resource that is stored on a database and/or a server.
  • the URL can specify the location of the resource on a computer and/or a computer network.
  • the URL can include a mechanism to retrieve the network resource.
  • the source of the network resource can receive a URL, identify the location of the web resource, and transmit the web resource back to the requestor.
  • a URL can be converted to an IP address, and a Domain Name System (DNS) can look up the URL and its corresponding IP address.
  • URLs can be references to web pages, file transfers, emails, database accesses, and other applications.
  • the URLs can include a sequence of characters that identify a path, domain name, a file extension, a host name, a query, a fragment, scheme, a protocol identifier, a port number, a username, a password, a flag, an object, a resource name and/or the like.
  • the systems disclosed herein can generate, receive, transmit, apply, parse, serialize, render, and/or perform an action on a URL.
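As a brief illustration with Python's standard library (the example URL below is hypothetical), such a character sequence can be parsed into the components listed above:

```python
from urllib.parse import urlparse, parse_qs

# Illustrative sketch (the URL is hypothetical): splitting a URL into the
# scheme, username, host, port, path, query, and fragment components.

url = "https://user:pass@example.com:8080/path/resource.html?id=42#section"
parts = urlparse(url)

scheme = parts.scheme          # protocol identifier: 'https'
username = parts.username      # 'user'
host = parts.hostname          # domain name: 'example.com'
port = parts.port              # port number: 8080
path = parts.path              # '/path/resource.html'
query = parse_qs(parts.query)  # query parameters: {'id': ['42']}
fragment = parts.fragment      # fragment: 'section'
```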
  • a cookie also referred to as an HTTP cookie, a web cookie, an internet cookie, and a browser cookie, can include data sent from a website and/or stored on a user’s computer.
  • the cookies can include useful information for websites to remember prior browsing information, such as a shopping cart on an online store, clicking of buttons, login information, and/or records of web pages or network resources visited in the past. Cookies can also include information that the user enters, such as names, addresses, passwords, credit card information, etc. Cookies can also perform computer functions. For example, authentication cookies can be used by applications (for example, a web browser) to identify whether the user is already logged in (for example, to a web site). The cookie data can be encrypted to provide security for the consumer. Tracking cookies can be used to compile historical browsing histories of individuals. Systems disclosed herein can generate and use cookies to access data of an individual. Systems can also generate and use JSON web tokens to store authenticity information, HTTP authentication as authentication protocols, IP addresses to track session or identity information, URLs, and the like.
  • the network 216 may communicate with other data sources or other computing devices.
  • the computing system 200 may also comprise one or more internal and/or external data sources.
  • one or more of the data repositories and the data sources may be implemented using a relational database, such as DB2, Sybase, Oracle, CodeBase, and Microsoft® SQL Server, as well as other types of databases such as, for example, a flat file database, an entity-relationship database, an object-oriented database, and/or a record-based database.
  • Conditional language such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
  • the headings used herein are for the convenience of the reader only and are not meant to limit the scope of the inventions or claims.
  • the methods disclosed herein may include certain actions taken by a practitioner; however, the methods can also include any third-party instruction of those actions, either expressly or by implication.
  • the ranges disclosed herein also encompass any and all overlap, sub-ranges, and combinations thereof.
  • Language such as "up to," "at least," "greater than," "less than," "between," and the like includes the number recited. Numbers preceded by a term such as "about" or "approximately" include the recited numbers and should be interpreted based on the circumstances (e.g., as accurate as reasonably possible under the circumstances, for example ±5%, ±10%, ±15%, etc.).
  • a phrase referring to "at least one of" a list of items refers to any combination of those items, including single members.
  • "at least one of: A, B, or C" is intended to cover: A, B, C, A and B, A and C, B and C, and A, B, and C.
  • Conjunctive language such as the phrase "at least one of X, Y and Z," unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be at least one of X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.

Abstract

The invention relates to systems, devices, and methods for tracking and/or analyzing images and/or videos of a subject, for example during a medical imaging scan, a therapeutic procedure, or otherwise. According to some embodiments, systems, devices, and methods of the present invention are configured to process images and/or videos of a subject to determine biometric data, such as heart rate, respiratory rate, pulse, anxiety levels, blood pressure, heart rate variability, eye blink rate, eye blink duration, subject position, subject height, subject width, subject body volume, subject estimated body mass index, subject facial recognition, a subject patient barcode, and/or the like.
PCT/US2019/013147 2018-01-12 2019-01-11 Systèmes, dispositifs et méthodes de suivi et/ou d'analyse d'images et/ou de vidéos d'un sujet WO2019140155A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2019/020593 WO2019173237A1 (fr) 2018-03-05 2019-03-04 Systèmes, dispositifs et procédés de suivi et d'analyse de mouvement d'un sujet pendant un balayage d'imagerie médicale et/ou une intervention thérapeutique

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201862616911P 2018-01-12 2018-01-12
US62/616,911 2018-01-12
US201862633255P 2018-02-21 2018-02-21
US62/633,255 2018-02-21
US201862677467P 2018-05-29 2018-05-29
US62/677,467 2018-05-29
US201862721981P 2018-08-23 2018-08-23
US62/721,981 2018-08-23

Publications (1)

Publication Number Publication Date
WO2019140155A1 true WO2019140155A1 (fr) 2019-07-18

Family

ID=67219868

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/013147 WO2019140155A1 (fr) 2018-01-12 2019-01-11 Systems, devices, and methods for tracking and/or analyzing images and/or videos of a subject

Country Status (1)

Country Link
WO (1) WO2019140155A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070280508A1 (en) * 2006-05-19 2007-12-06 Ernst Thomas M Motion tracking system for real time adaptive imaging and spectroscopy
JP2015526708A (ja) * 2012-07-03 2015-09-10 The State Of Queensland Acting Through Its Department Of Health Motion correction for medical imaging
US20150265220A1 (en) * 2014-03-24 2015-09-24 Thomas Michael Ernst Systems, methods, and devices for removing prospective motion correction from medical imaging scans
US20160035108A1 (en) * 2014-07-23 2016-02-04 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9785247B1 (en) * 2014-05-14 2017-10-10 Leap Motion, Inc. Systems and methods of tracking moving hands and recognizing gestural interactions


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021077003A1 (fr) * 2019-10-18 2021-04-22 Viavi Solutions Inc. Sensor device
CN112863646B (zh) * 2019-11-27 2024-03-22 西门子医疗有限公司 System for medical data acquisition
CN112863646A (zh) * 2019-11-27 2021-05-28 西门子医疗有限公司 System for medical data acquisition
US11918331B2 (en) 2019-12-10 2024-03-05 Hill-Rom Services, Inc. Micro-movement and gesture detection using radar
WO2021130709A1 (fr) * 2019-12-23 2021-07-01 Analytics For Life Inc. Method and system for signal quality assessment and rejection using heart cycle variability
US20210212582A1 (en) * 2019-12-23 2021-07-15 Analytics For Life Inc. Method and system for signal quality assessment and rejection using heart cycle variability
CN111345803B (zh) * 2020-03-20 2022-04-12 浙江大学城市学院 Heart rate variability measurement method based on a mobile device camera
CN111345803A (zh) * 2020-03-20 2020-06-30 浙江大学城市学院 Heart rate variability measurement method based on a mobile device camera
US11475596B2 (en) * 2020-07-23 2022-10-18 Motorola Solutions, Inc. Device, method and system for adjusting a configuration of a camera device
CN112017188A (zh) * 2020-09-09 2020-12-01 上海航天控制技术研究所 Semantic recognition and reconstruction method for non-cooperative space targets
CN112017188B (zh) * 2020-09-09 2024-04-09 上海航天控制技术研究所 Semantic recognition and reconstruction method for non-cooperative space targets
CN112200099A (zh) * 2020-10-14 2021-01-08 浙江大学山东工业技术研究院 Video-based dynamic heart rate detection method
CN112205996A (zh) * 2020-11-01 2021-01-12 南昌华亮光电有限责任公司 Image encryption system and method based on random photon offsets
CN112205996B (zh) * 2020-11-01 2023-05-26 南昌华亮光电有限责任公司 Image encryption system and method based on random photon offsets
GB2607994A (en) * 2021-06-02 2022-12-21 Lenovo Beijing Ltd Fatigue measurement method, apparatus, and computer-readable medium
GB2607994B (en) * 2021-06-02 2023-09-20 Lenovo Beijing Ltd Fatigue measurement method, apparatus, and computer-readable medium
CN117596366A (zh) * 2024-01-18 2024-02-23 北京睿企信息科技有限公司 Personnel status monitoring system
CN117596366B (zh) * 2024-01-18 2024-03-29 北京睿企信息科技有限公司 Personnel status monitoring system

Similar Documents

Publication Publication Date Title
WO2019140155A1 (fr) Systems, devices, and methods for tracking and/or analyzing images and/or videos of a subject
WO2019173237A1 (fr) Systems, devices, and methods for tracking and analyzing motion of a subject during a medical imaging scan and/or therapeutic procedure
Zhang et al. Highly wearable cuff-less blood pressure and heart rate monitoring with single-arm electrocardiogram and photoplethysmogram signals
Jeong et al. Introducing contactless blood pressure assessment using a high speed video camera
Umair et al. HRV and stress: A mixed-methods approach for comparison of wearable heart rate sensors for biofeedback
Sun et al. Photoplethysmography revisited: from contact to noncontact, from point to imaging
Lewis et al. A novel method for extracting respiration rate and relative tidal volume from infrared thermography
CN105792734B (zh) Improved signal selection for obtaining a remote photoplethysmographic waveform
RU2656559C2 (ru) Method and device for determining vital signs
Moreno et al. Facial video-based photoplethysmography to detect HRV at rest
CA2934659A1 (fr) System and methods for measuring physiological parameters
Blanik et al. Remote vital parameter monitoring in neonatology–robust, unobtrusive heart rate detection in a realistic clinical scenario
JP3221096U (ja) Smart inspection and measurement equipment
CN111386068A (zh) Camera-based pressure measurement system and method
Zhou et al. The noninvasive blood pressure measurement based on facial images processing
Shao et al. Noncontact physiological measurement using a camera: a technical review and future directions
Chen et al. Non-contact heart rate monitoring in neonatal intensive care unit using RGB camera
Park et al. Non-contact measurement of heart response reflected in human eye
WO2021084488A1 (fr) Smart glasses for detecting physiological parameters
Bosi et al. Real-time monitoring of heart rate by processing of Microsoft Kinect™ 2.0 generated streams
Fernandez Rojas et al. A systematic review of neurophysiological sensing for the assessment of acute pain
Qiao et al. Revise: Remote vital signs measurement using smartphone camera
Vatanparvar et al. Respiration rate estimation from remote ppg via camera in presence of non-voluntary artifacts
EP4033495A1 (fr) Activity task evaluation system and activity task evaluation method
Chon et al. Wearable Wireless Sensor for Multi-Scale Physiological Monitoring

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 19739171

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 19739171

Country of ref document: EP

Kind code of ref document: A1