GB2528044A - Non-touch optical detection of vital signs - Google Patents

Non-touch optical detection of vital signs

Info

Publication number
GB2528044A
GB2528044A (application GB1411983.8A / GB201411983A)
Authority
GB
United Kingdom
Prior art keywords
implementations
temporal
filter
spatial
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1411983.8A
Other versions
GB201411983D0 (en)
GB2528044B (en)
Inventor
Mark Khachaturian
Michael Smith
Martin Crawley
Irwin Gross
Steven Gest
John Barrett
Michael Cronin
Derek Turnbull
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ARC Devices NI Ltd
Original Assignee
ARC Devices NI Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by ARC Devices NI Ltd filed Critical ARC Devices NI Ltd
Priority to GB1811453.8A (GB2561771B)
Priority to GB1411983.8A (GB2528044B)
Priority to US14/324,235 (US9508141B2)
Priority to US14/448,223 (US8950935B1)
Priority to US14/457,111 (US9330459B2)
Priority to US14/457,105 (US20160000337A1)
Priority to US14/457,041 (US9406125B2)
Priority to US14/457,029 (US9478025B2)
Priority to US14/457,074 (US9501824B2)
Priority to US14/457,001 (US9721339B2)
Priority to US14/457,098 (US9324144B2)
Priority to US14/457,090 (US20160000331A1)
Priority to US14/457,061 (US9495744B2)
Priority to US14/457,053 (US9691146B2)
Priority to US14/457,018 (US9262826B2)
Publication of GB201411983D0
Priority to US14/617,926 (US9282896B2)
Priority to US14/694,610 (US20160003679A1)
Priority to EP15175534.5A (EP2963617A1)
Priority to EP15175517.0A (EP2977732A3)
Priority to US14/794,669 (US9305350B2)
Priority to US14/876,784 (US9881369B2)
Publication of GB2528044A
Priority to US15/224,644 (US10074175B2)
Application granted
Publication of GB2528044B
Priority to US16/127,182 (US10453194B2)
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B5/00 Measuring for diagnostic purposes; Identification of persons
                    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
                        • A61B5/0004 characterised by the type of physiological signal transmitted
                            • A61B5/0008 Temperature signals
                            • A61B5/0013 Medical image data
                    • A61B5/0059 using light, e.g. diagnosis by transillumination, diascopy, fluorescence
                        • A61B5/0062 Arrangements for scanning
                            • A61B5/0064 Body surface scanning
                        • A61B5/0075 by spectroscopy, i.e. measuring spectra, e.g. Raman spectroscopy, infrared absorption spectroscopy
                        • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
                    • A61B5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
                        • A61B5/015 By temperature mapping of body part
                    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
                        • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
                            • A61B5/02055 Simultaneously evaluating both cardiovascular condition and temperature
                        • A61B5/021 Measuring pressure in heart or blood vessels
                        • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
                            • A61B5/02416 using photoplethysmograph signals, e.g. generated by infrared radiation
                                • A61B5/02427 Details of sensor
                                    • A61B5/02433 Details of sensor for infrared radiation
                        • A61B5/026 Measuring blood flow
                            • A61B5/0261 using optical means, e.g. infrared light
                    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
                        • A61B5/0816 Measuring devices for examining respiratory frequency
                    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
                        • A61B5/1455 using optical sensors, e.g. spectral photometrical oximeters
                            • A61B5/14551 for measuring blood gases
                                • A61B5/14552 Details of sensors specially adapted therefor
                    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
                        • A61B5/316 Modalities, i.e. specific diagnostic methods
                            • A61B5/318 Heart-related electrical modalities, e.g. electrocardiography [ECG]
                                • A61B5/33 specially adapted for cooperation with other devices
                    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
                        • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
                    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
                        • A61B5/6887 mounted on external non-worn devices, e.g. non-medical devices
                            • A61B5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
                    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
                        • A61B5/7225 Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
                        • A61B5/7235 Details of waveform analysis
                            • A61B5/725 using specific filters therefor, e.g. Kalman or adaptive filters
                            • A61B5/7253 characterised by using transforms
                                • A61B5/7257 using Fourier transforms
                            • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
                                • A61B5/7267 involving training the classification device
                        • A61B5/7271 Specific aspects of physiological measurement analysis
                            • A61B5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
                            • A61B5/7278 Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
                    • A61B5/74 Details of notification to user or communication with user or patient; user input means
                        • A61B5/742 using visual displays
                            • A61B5/7425 Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
                            • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
                • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
                    • A61B2560/02 Operational features
                        • A61B2560/0204 Operational features of power management
                            • A61B2560/0214 of power generation or supply
                    • A61B2560/04 Constructional details of apparatus
                        • A61B2560/0475 Special features of memory means, e.g. removable memory cards
                • A61B2576/00 Medical imaging apparatus involving image processing or analysis
    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
                • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
                    • G01J5/0022 for sensing the radiation of moving bodies
                        • G01J5/0025 Living bodies
                    • G01J5/02 Constructional details
                        • G01J5/025 Interfacing a pyrometer to an external device or network; User interface
                        • G01J5/08 Optical arrangements
                            • G01J5/0893 Arrangements to attach devices to a pyrometer, i.e. attaching an optical interface; Spatial relative arrangement of optical elements, e.g. folded beam path
                    • G01J5/10 using electric radiation detectors
                        • G01J5/28 using photoemissive or photovoltaic cells
                            • G01J5/30 Electrical features thereof
                    • G01J2005/0077 Imaging
            • G01K MEASURING TEMPERATURE; MEASURING QUANTITY OF HEAT; THERMALLY-SENSITIVE ELEMENTS NOT OTHERWISE PROVIDED FOR
                • G01K1/00 Details of thermometers not specially adapted for particular types of thermometer
                    • G01K1/02 Means for indicating or recording specially adapted for thermometers
                • G01K13/00 Thermometers specially adapted for specific purposes
                    • G01K13/20 Clinical contact thermometers for use with humans or animals
                        • G01K13/223 Infrared clinical thermometers, e.g. tympanic
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
                            • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                                • G06F3/042 by opto-electronic means
                • G06F18/00 Pattern recognition
                    • G06F18/20 Analysing
                        • G06F18/23 Clustering techniques
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T3/00 Geometric image transformation in the plane of the image
                    • G06T3/40 Scaling the whole image or part thereof
                • G06T5/00 Image enhancement or restoration
                    • G06T5/10 by non-spatial domain filtering
                    • G06T5/20 by the use of local operators
                • G06T7/00 Image analysis
                    • G06T7/0002 Inspection of images, e.g. flaw detection
                        • G06T7/0012 Biomedical image inspection
                            • G06T7/0014 using an image reference approach
                                • G06T7/0016 involving temporal comparison
                    • G06T7/10 Segmentation; Edge detection
                        • G06T7/13 Edge detection
                    • G06T7/20 Analysis of motion
                    • G06T7/90 Determination of colour characteristics
                • G06T2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T2207/10 Image acquisition modality
                        • G06T2207/10004 Still image; Photographic image
                        • G06T2207/10016 Video; Image sequence
                        • G06T2207/10024 Color image
                        • G06T2207/10048 Infrared image
                    • G06T2207/20 Special algorithmic details
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V20/00 Scenes; Scene-specific elements
                    • G06V20/40 in video content
                • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
                    • G06V30/40 Document-oriented image-based pattern recognition
                        • G06V30/41 Analysis of document content
                            • G06V30/413 Classification of content, e.g. text, photographs or tables
                • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
                    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
                        • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
                            • G06V40/161 Detection; Localisation; Normalisation
                                • G06V40/164 using holistic features
                            • G06V40/168 Feature extraction; Face representation
                            • G06V40/172 Classification, e.g. identification
        • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
            • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
                • G16H30/00 ICT specially adapted for the handling or processing of medical images
                    • G16H30/20 for handling medical images, e.g. DICOM, HL7 or PACS
                • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
                    • G16H40/60 for the operation of medical equipment or devices
                        • G16H40/63 for local operation
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N5/00 Details of television systems
                    • H04N5/30 Transforming light or analogous information into electric information
                        • H04N5/33 Transforming infrared radiation
                • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
                    • H04N25/70 SSIS architectures; Circuits associated therewith
                        • H04N25/71 Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
                            • H04N25/75 Circuitry for providing, modifying or processing image signals from the pixel array
    • G06T2207/20024Filtering details
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20132Image cropping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30101Blood vessel; Artery; Vein; Vascular
    • G06T2207/30104Vascular flow; Blood flow; Perfusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/22Cropping

Abstract

A means for motion amplification to communicate biological vital signs comprises: a cropper for receiving at least two images and cropping them, a skin pixel identifier coupled to the cropper for identifying pixel values representative of skin, a spatial bandpass filter coupled to the skin pixel identifier, a regional facial clustering module coupled to the bandpass filter to apply spatial clustering to the output of the filter, a temporal bandpass filter coupled to the output of the clustering module, a temporal variation identifier to identify temporal variations in the output of the filter, and a vital sign generator which generates at least one vital sign from the temporal variation. Also disclosed is a non-touch thermometer that senses temperature from a digital infrared sensor. A microprocessor is operably coupled to a camera from which patient vital signs are determined. A digital signal representing a temperature, without conversion from analog, is transmitted from the digital infrared sensor. A temporal variation of images is generated from which a heart rate and a respiratory rate can be determined and displayed or stored.

Description

NON-TOUCH OPTICAL DETECTION OF VITAL SIGNS
FIELD
[0001] This disclosure relates generally to motion amplification in images.
BACKGROUND
[0002] Conventional personal computers implement motion amplification.
BRIEF DESCRIPTION
[0003] In one aspect, a non-touch thermometer to measure temperature includes: a microprocessor, a battery operably coupled to the microprocessor, a single button operably coupled to the microprocessor, a camera operably coupled to the microprocessor and providing two or more images to the microprocessor, a digital infrared sensor operably coupled to the microprocessor with no analog-to-digital converter operably coupled between the digital infrared sensor and the microprocessor, the digital infrared sensor having only digital readout ports, the digital infrared sensor having no analog sensor readout ports, and a display device operably coupled to the microprocessor, where the microprocessor is operable to receive from the digital readout ports a digital signal that is representative of an infrared signal detected by the digital infrared sensor and the microprocessor is operable to determine the temperature from the digital signal that is representative of the infrared signal, the microprocessor including a temporal-variation module to determine temporal variation of the pixel values between the two or more images being below a particular threshold, a signal processing module configured to amplify the temporal variation resulting in amplified temporal variation data, and a visualizer to visualize a pattern of flow of blood in the amplified temporal variation data of the two or more images.
[0004] In another aspect, a non-touch thermometer includes a microprocessor, a battery operably coupled to the microprocessor, a single button operably coupled to the microprocessor, a camera operably coupled to the microprocessor and providing two or more images to the microprocessor, a digital infrared sensor operably coupled to the microprocessor, the digital infrared sensor having ports that provide only digital readout, and a display device operably coupled to the microprocessor, where the microprocessor is operable to receive from the ports that provide only digital readout a digital signal that is representative of an infrared signal detected by the digital infrared sensor and the microprocessor is operable to determine the temperature from the digital signal that is representative of the infrared signal, the microprocessor including a temporal-variation module to determine temporal variation of the pixel values between the two or more images being below a particular threshold, a signal processing module configured to amplify the temporal variation resulting in amplified temporal variation data, and a visualizer to visualize a pattern of flow of blood in the amplified temporal variation data of the two or more images.
[0005] In yet another aspect, a non-touch thermometer includes a microprocessor, a battery operably coupled to the microprocessor, a single button operably coupled to the microprocessor, a camera operably coupled to the microprocessor and providing two or more images to the microprocessor, a digital infrared sensor operably coupled to the microprocessor, the digital infrared sensor having only digital readout ports, the digital infrared sensor having no analog sensor readout ports, and a display device operably coupled to the microprocessor, where the microprocessor is operable to receive from the digital readout ports a digital signal that is representative of an infrared signal detected by the digital infrared sensor and the microprocessor is operable to determine the temperature from the digital signal that is representative of the infrared signal, the microprocessor including a temporal-variation module to determine temporal variation of the pixel values between the two or more images being below a particular threshold, a signal processing module configured to amplify the temporal variation resulting in amplified temporal variation data, and a visualizer to visualize a pattern of flow of blood in the amplified temporal variation data of the two or more images.
[0006] Apparatus, systems, and methods of varying scope are described herein.
In addition to the aspects and advantages described in this summary, further aspects and advantages will become apparent by reference to the drawings and by
reading the detailed description that follows.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] FIG. 1 is a block diagram of a non-touch thermometer that does not include a digital infrared sensor, according to an implementation; [0008] FIG. 2 is a block diagram of a non-touch thermometer that does not include an analog-to-digital converter, according to an implementation; [0009] FIG. 3 is a block diagram of a non-touch thermometer having a color display device, according to an implementation; [0010] FIG. 4 is a flowchart of a method to determine a temperature from a digital infrared sensor, according to an implementation; [0011] FIG. 5 is a flowchart of a method to display temperature color indicators, according to an implementation of three colors; [0012] FIG. 6 is a flowchart of a method to manage power in a non-touch thermometer having a digital infrared sensor, according to an implementation; [0013] FIG. 7 is a block diagram of an apparatus of motion amplification, according to an implementation.
[0014] FIG. 8 is a block diagram of an apparatus of motion amplification, according to an implementation.
[0015] FIG. 9 is a block diagram of an apparatus of motion amplification, according to an implementation.
[0016] FIG. 10 is a block diagram of an apparatus of motion amplification, according to an implementation.
[0017] FIG. 11 is a block diagram of an apparatus of motion amplification, according to an implementation; [0018] FIG. 12 is a block diagram of an apparatus to generate and present any one of a number of biological vital signs from amplified motion, according to an implementation; [0019] FIG. 13 is a block diagram of an apparatus of motion amplification, according to an implementation; [0020] FIG. 14 is a block diagram of an apparatus of motion amplification, according to an implementation; [0021] FIG. 15 is an apparatus that performs motion amplification to generate biological vital signs, according to an implementation; [0022] FIG. 16 is a flowchart of a method of motion amplification, according to an implementation; [0023] FIG. 17 is a flowchart of a method of motion amplification, according to an implementation that does not include a separate action of determining a temporal variation; [0024] FIG. 18 is a flowchart of a method of motion amplification, according to an implementation; [0025] FIG. 19 is a flowchart of a method of motion amplification, according to an implementation; [0026] FIG. 21 is a schematic of a circuit board of a non-touch thermometer, according to an implementation; [0027] FIG. 22 is a block diagram of a mobile device, according to an implementation; [0028] FIG. 23 illustrates an example of a computer environment, according to an implementation; and [0029] FIG. 24 is a representation of a display that is presented on the display device of the apparatus in FIG. 1-3, according to an implementation.
DETAILED DESCRIPTION
[0030] In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific implementations which may be practiced. These implementations are described in sufficient detail to enable those skilled in the art to practice the implementations, and it is to be understood that other implementations may be utilized and that logical, mechanical, electrical and other changes may be made without departing from the scope of the implementations.
The following detailed description is, therefore, not to be taken in a limiting sense.
[0031] The detailed description is divided into six sections. In the first section, apparatus of digital infrared sensor implementations are described. In the second section, implementations of methods of digital infrared sensors are described. In the third section, implementations of apparatus of vital sign amplification are described. In the fourth section, implementations of methods of vital sign amplification are described. In the fifth section, hardware and operating environments in conjunction with which implementations may be practiced are described. Finally, in the sixth section, a conclusion of the detailed description is provided. The apparatus and methods disclosed in the third and fourth sections are notably beneficial in generating a temporal variation from which a heart rate and the respiratory rate can be generated.
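The vital-sign path summarized above, deriving a temporal variation from a sequence of images and extracting a periodic vital sign from it, can be sketched as follows. The function name, the 30 fps frame rate, and the 0.7-4.0 Hz cardiac band are illustrative assumptions, not details from this specification:

```python
import numpy as np

def heart_rate_bpm(mean_pixel_series, fps, band=(0.7, 4.0)):
    """Estimate heart rate from the mean skin-pixel value of each frame.

    The per-frame means form a temporal-variation signal; the dominant
    spectral peak inside the cardiac band is taken as the pulse frequency.
    (Illustrative sketch -- the specification does not prescribe this
    particular algorithm.)
    """
    x = np.asarray(mean_pixel_series, dtype=float)
    x = x - x.mean()                      # remove the DC (baseline) level
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    peak = freqs[in_band][np.argmax(spectrum[in_band])]
    return 60.0 * peak                    # convert Hz to beats per minute

# Synthetic check: a 1.2 Hz pulse sampled at 30 fps should read about 72 bpm.
t = np.arange(0, 10, 1.0 / 30)
series = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 128.0
```

The same spectral-peak idea applies to the respiratory rate with a lower frequency band.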
Digital Infrared Sensor Apparatus Implementations [0032] In this section, particular apparatus of implementations are described by reference to a series of diagrams.
[0033] FIG. 1 is a block diagram of a non-touch thermometer 100 that does not include a digital infrared sensor, according to an implementation. Non-touch thermometer 100 is an apparatus to measure temperature. The non-touch thermometer 100 includes a microprocessor 102. The non-touch thermometer 100 includes a battery 104 that is operably coupled to the microprocessor 102. The non-touch thermometer 100 includes a single button 106 that is operably coupled to the microprocessor 102. The non-touch thermometer 100 includes a digital infrared sensor 108 that is operably coupled to the microprocessor 102. The digital infrared sensor 108 includes digital ports 110 that provide only a digital readout signal 112. The non-touch thermometer 100 includes a display device 114 that is operably coupled to the microprocessor 102. The microprocessor 102 is operable to receive, from the digital ports 110, the digital readout signal 112, which is representative of an infrared signal 116 detected by the digital infrared sensor 108. The microprocessor 102 is operable to determine the temperature 120 from the digital readout signal 112 that is representative of the infrared signal 116. The non-touch thermometer 100 includes a camera 122 that is operably coupled to the microprocessor 102 and is operable to provide two or more images 124 to the microprocessor 102.
[0034] FIG. 2 is a block diagram of a non-touch thermometer 200 that does not include an analog-to-digital converter, according to an implementation. The non-touch thermometer 200 does not include an analog-to-digital (A/D) converter 202 operably coupled between the digital infrared sensor 108 and the microprocessor 102. The digital infrared sensor 108 also does not include analog readout ports 204. The dashed lines of the analog-to-digital converter 202 and the analog readout ports 204 indicate the absence of the A/D converter 202 and the analog readout ports 204 in the non-touch thermometer 200. The non-touch thermometer includes a microprocessor 102. The non-touch thermometer 200 includes a battery 104 that is operably coupled to the microprocessor 102. The non-touch thermometer 200 includes a single button 106 that is operably coupled to the microprocessor 102. The non-touch thermometer 200 includes a digital infrared sensor 108 that is operably coupled to the microprocessor 102 with no analog-to-digital converter operably coupled between the digital infrared sensor 108 and the microprocessor 102, the digital infrared sensor 108 having only digital ports 110, the digital infrared sensor 108 having no analog sensor readout ports.
The non-touch thermometer 200 includes a display device 114 that is operably coupled to the microprocessor 102, where the microprocessor 102 is operable to receive from the digital ports 110 a digital readout signal 112 that is representative of an infrared signal 116 detected by the digital infrared sensor 108, and the microprocessor 102 is operable to determine the temperature 120 from the digital readout signal 112 that is representative of the infrared signal 116. The non-touch thermometer 200 also includes a camera 122 that is operably coupled to the microprocessor 102 and is operable to provide two or more images 124 to the microprocessor 102. [0035] In some implementations, the digital IR sensor 108 comprises a low noise amplifier, a 17-bit ADC and a powerful DSP unit, through which high accuracy and resolution of the thermometer are achieved.
[0036] In some implementations of the digital IR sensor 108, a 10-bit pulse width modulation (PWM) output is configured to continuously transmit the measured temperature in the range of -20 ... 120°C, with an output resolution of 0.14°C. The factory default power-on reset (POR) setting is SMBus.
[0037] In some implementations, the digital IR sensor 108 is packaged in an industry-standard TO-39 package.
[0038] In some implementations, the generated object and ambient temperatures are available in the RAM of the digital IR sensor 108 with a resolution of 0.01°C. The temperatures are accessible by a 2-wire serial SMBus compatible protocol (0.02°C resolution) or via the 10-bit PWM (Pulse Width Modulated) output of the digital IR sensor 108.
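As a concrete illustration of reading a temperature over the 2-wire interface, digital IR sensors of this kind commonly encode the RAM temperature word in units of 0.02 K (matching the 0.02°C SMBus resolution above), so conversion to Celsius is a single scale-and-offset. The 0.02 K per LSB scaling is an assumption typical of such parts, not a value stated in this document:

```python
def raw_to_celsius(raw: int) -> float:
    """Convert a 16-bit RAM temperature word to degrees Celsius.

    Assumes the common encoding for digital IR sensors: the raw value
    counts in units of 0.02 K, so Kelvin = raw * 0.02 and Celsius is
    that value minus 273.15.
    """
    kelvin = raw * 0.02
    return kelvin - 273.15

# Example: a raw word of 0x3AF7 (15095 decimal) decodes to 28.75 degC.
```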
[0039] In some implementations, the digital IR sensor 108 is factory calibrated in wide temperature ranges: -40 ... 85°C for the ambient temperature and -70 ... 380°C for the object temperature.
[0040] In some implementations of the digital IR sensor 108, the measured value is the average temperature of all objects in the Field Of View (FOV) of the sensor. In some implementations, the digital IR sensor 108 has a standard accuracy of ±0.5°C around room temperatures, and in some implementations, the digital IR sensor 108 has an accuracy of ±0.2°C in a limited temperature range around the human body temperature.
[0041] These accuracies are only guaranteed and achievable when the sensor is in thermal equilibrium and under isothermal conditions (there are no temperature differences across the sensor package). The accuracy of the thermometer can be influenced by temperature differences in the package induced by causes such as (among others): hot electronics behind the sensor; heaters or coolers behind or beside the sensor; or a hot or cold object very close to the sensor that not only heats the sensing element in the thermometer but also the thermometer package.
In some implementations of the digital IR sensor 108, the thermal gradients are measured internally and the measured temperature is compensated in consideration of the thermal gradients, but the effect is not totally eliminated. It is therefore important to avoid the causes of thermal gradients as much as possible or to shield the sensor from them.
[0042] In some implementations, the digital IR sensor 108 is calibrated for an object emissivity of 1, but in some implementations, the digital IR sensor 108 is calibrated for any emissivity in the range 0.1 ... 1.0 without the need for recalibration with a black body.
[0043] In some implementations of the digital IR sensor 108, the PWM can be easily customized for virtually any range desired by the customer by changing the content of 2 EEPROM cells. This has no effect on the factory calibration of the device. The PWM pin can also be configured to act as a thermal relay (input is To), thus allowing for an easy and cost-effective implementation in thermostats or temperature (freezing/boiling) alert applications. The temperature threshold is programmable by the microprocessor 102 of the non-touch thermometer. In a non-touch thermometer having an SMBus system, the programming can act as a processor interrupt that can trigger reading all slaves on the bus to determine the precise condition.
[0044] In some implementations, the digital IR sensor 108 has an optical filter (long-wave pass) that cuts off the visible and near-infrared radiant flux, integrated in the package to provide ambient and sunlight immunity. The wavelength pass band of this optical filter is from 5.5 to 14 μm.
[0045] In some implementations, the digital IR sensor 108 is controlled by an internal state machine, which controls the measurements and generation of the object and ambient temperatures and does the post-processing of the temperatures to output them through the PWM output or the SMBus compatible interface. [0046] Some implementations of the non-touch thermometer include 2 IR sensors, the output of the IR sensors being amplified by a low noise, low offset chopper amplifier with programmable gain, converted by a Sigma-Delta modulator to a single bit stream and fed to a DSP for further processing. The signal is treated by programmable (by means of EEPROM content) FIR and IIR low pass filters for further reduction of the bandwidth of the input signal to achieve the desired noise performance and refresh rate. The output of the IIR filter is the measurement result and is available in the internal RAM. 3 different cells are available: one for the on-board temperature sensor and 2 for the IR sensors. Based on the results of the above measurements, the corresponding ambient temperature Ta and object temperatures To are generated. Both generated temperatures have a resolution of 0.01°C. The data for Ta and To is read in two ways: reading RAM cells dedicated for this purpose via the 2-wire interface (0.02°C resolution, fixed ranges), or through the PWM digital output (10-bit resolution, configurable range). In the last step of the measurement cycle, the measured Ta and To are rescaled to the desired output resolution of the PWM, and the rescaled data is loaded into the registers of the PWM state machine, which creates a constant frequency with a duty cycle representing the measured data.
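The bandwidth reduction performed by the programmable FIR and IIR low pass filters can be illustrated with a minimal single-pole IIR stage. The coefficient value and function shape are illustrative only; the actual filters are fixed-point hardware stages configured through EEPROM:

```python
def iir_lowpass(samples, alpha=0.1):
    """Single-pole IIR low-pass: y[n] = alpha*x[n] + (1 - alpha)*y[n-1].

    A minimal stand-in for the sensor's programmable IIR stage, which
    narrows the input bandwidth to trade refresh rate for lower noise.
    Smaller alpha means a narrower bandwidth and slower settling.
    """
    y = samples[0]          # seed the filter state with the first sample
    out = []
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

# A step input settles toward its final value as the filter averages it in.
```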
[0047] In some implementations, the digital IR sensor 108 includes an SCL pin for serial clock input for the 2-wire communications protocol, which supports digital input only, used as the clock for SMBus compatible communication. The SCL pin has the auxiliary function of building an external voltage regulator. When the external voltage regulator is used, the 2-wire protocol for a power supply regulator is overdriven.
[0048] In some implementations, the digital IR sensor 108 includes an SDA/PWM pin for digital input/output. In normal mode the measured object temperature is accessed at this pin Pulse Width Modulated. In SMBus compatible mode the pin is automatically configured as open drain NMOS: digital input/output, used for both the PWM output of the measured object temperature(s) and the digital input/output for the SMBus. In PWM mode the pin can be programmed in EEPROM to operate as push/pull or open drain NMOS (open drain NMOS is the factory default). In SMBus mode SDA is forced to open drain NMOS I/O, and the push-pull selection bit defines PWM/thermal relay operation. The PWM/SDA pin of the digital IR sensor 108 operates as PWM output, depending on the EEPROM settings. When PWM is enabled, after POR the PWM/SDA pin is directly configured as PWM output. When the digital IR sensor 108 is in PWM mode, SMBus communication is restored by a special command. In some implementations, the digital IR sensor 108 is read via the PWM or SMBus compatible interface. Selection of PWM output is done in the EEPROM configuration (factory default is SMBus). The PWM output has two programmable formats, single and dual data transmission, providing single-wire reading of two temperatures (dual zone object, or object and ambient). The PWM period is derived from the on-chip oscillator and is programmable.
[0049] In some implementations, the digital IR sensor 108 includes a VDD pin for the external supply voltage and a VSS pin for ground.
[0050] The microprocessor 102 has read access to the RAM and EEPROM and write access to 9 EEPROM cells (at addresses 0x00, 0x01, 0x02, 0x03, 0x04, 0x05*, 0x0E, 0x0F, 0x09). When the access to the digital IR sensor 108 is a read operation, the digital IR sensor 108 responds with 16 data bits and an 8-bit PEC only if its own slave address, programmed in internal EEPROM, is equal to the SA sent by the master. A slave feature allows connecting up to 127 devices (SA = 0x00 ... 0x7F) with only 2 wires. In order to provide access to any device, or to assign an address to a slave device before the slave device is connected to the bus system, the communication starts with a zero slave address followed by a low R/W bit. When this command is sent from the microprocessor 102, the digital IR sensor 108 responds and ignores the internal chip code information.
[0051] In some implementations, two digital IR sensors 108 are not configured with the same slave address on the same bus.
[0052] In regards to the bus protocol, after every received 8 bits the slave device should issue an ACK or NACK. When the microprocessor 102 initiates communication, it first sends the address of the slave, and only the slave device which recognizes the address will ACK; the rest will remain silent. In case the slave device NACKs one of the bytes, the microprocessor 102 stops the communication and repeats the message. A NACK could be received after the packet error code (PEC). This means that there is an error in the received message and the microprocessor 102 will try resending the message. PEC generation includes all bits except the START, REPEATED START, STOP, ACK, and NACK bits. The PEC is a CRC-8 with polynomial x^8 + x^2 + x^1 + 1. The Most Significant Bit of every byte is transferred first.
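The PEC computation described above (CRC-8 with polynomial x^8 + x^2 + x^1 + 1, processed MSB first) is the standard SMBus PEC, and can be sketched in software as follows; the function name is illustrative:

```python
def smbus_pec(data: bytes) -> int:
    """CRC-8 over the message bytes, polynomial x^8 + x^2 + x^1 + 1 (0x07).

    Bits are processed most-significant first, matching the SMBus PEC
    used to guard reads from the digital IR sensor.
    """
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ 0x07) & 0xFF  # shift out MSB, fold in poly
            else:
                crc = (crc << 1) & 0xFF
    return crc
```

The receiver recomputes this CRC over the received bytes and NACKs the message when it does not match the transmitted PEC.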
[0053] In single PWM output mode the settings for PWM1 data only are used.
The temperature reading can be generated from the signal timing as:

Tout = (2 × t2 / T) × (Tmax − Tmin) + Tmin

[0054] where Tmin and Tmax are the corresponding rescale coefficients in EEPROM for the selected temperature [0055] output (Ta; the object temperature range is valid for both Tobj1 and Tobj2 as specified in the previous table) and T is the PWM period. Tout is To1, To2 or Ta according to the Config Register [5:4] settings.
[0056] The different time intervals t1 ... t4 have the following meaning: [0057] t1: Start buffer. During this time the signal is always high. t1 = 0.125 × T (where T is the PWM period). [0058] t2: Valid Data Output Band, 0 ... 1/2T. PWM output data resolution is 10 bit.
[0059] t3: Error band: information for a fatal error in EEPROM (double error detected, not correctable).
[0060] t3 = 0.25 × T. Therefore a PWM pulse train with a duty cycle of 0.875 will indicate a fatal error in EEPROM (for the single PWM format). FE means Fatal Error.
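Putting the single-PWM timing relationship into code, the valid-data high time t2 maps linearly onto [Tmin, Tmax]. The function below is an illustrative decode of the equation above, with the fatal-error duty cycle (0.875) flagged separately; the function name and None convention are assumptions:

```python
def decode_single_pwm(t2: float, period: float, t_min: float, t_max: float):
    """Decode a temperature from single PWM output timing.

    Implements Tout = (2 * t2 / T) * (Tmax - Tmin) + Tmin, where t2 is
    the valid-data high time (0 .. T/2) and T is the PWM period.
    Returns None for the fatal-error signature: with t1 = 0.125*T and
    t3 = 0.25*T always high, a duty cycle of 0.875 signals an EEPROM
    double error rather than a full-scale reading.
    """
    duty = (0.125 * period + t2 + 0.25 * period) / period  # t1 + t2 + t3
    if abs(duty - 0.875) < 1e-9:
        return None  # fatal EEPROM error indication
    return (2.0 * t2 / period) * (t_max - t_min) + t_min
```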
[0061] In regards to the format for extended PWM, the temperature transmitted in the Data 1 field can be generated using the following equation:

Tout = (4 × t2 / T) × (Tmax − Tmin) + Tmin

with a corresponding equation, using its own timing interval, applying to the Data 2 field.

[0062] FIG. 3 is a block diagram of a non-touch thermometer 300 having a color display device, according to an implementation. In FIG. 3, the display device 114 of FIG. 1 is an LED color display device.
[0063] In regards to the structural relationship of the digital infrared sensor 108 and the microprocessor 102 in FIG. 1-3, heat radiation on the digital infrared sensor 108 from any source, such as the microprocessor 102 or a heat sink, will distort detection of infrared energy by the digital infrared sensor 108. In order to prevent or at least reduce heat transfer between the digital infrared sensor 108 and the microprocessor 102, the non-touch thermometers 100, 200 and 300 are low-powered devices, and thus low heat-generating devices, that are also powered by a battery 104 and that are only used for approximately a 5 second period of time for each measurement (1 second to acquire the temperature samples and generate the body core temperature result, and 4 seconds to display that result to the operator), so there is little heat generated by the non-touch thermometers 100, 200 and 300 in active use.
[0064] The internal layout of the non-touch thermometers 100, 200 and 300 places the digital infrared sensor as far away in distance as practically possible from all other components, such as the microprocessor 102, within the practical limitations of the industrial design of the non-touch thermometers 100, 200 and 300.
[0065] More specifically, to prevent or at least reduce heat transfer between the digital infrared sensor 108 and the microprocessor 102, the digital infrared sensor 108 is isolated on a separate PCB from the PCB that has the microprocessor 102, and the two PCBs are connected by only a connector that has 4 pins. The minimal connection of the single connector having 4 pins reduces heat transfer from the microprocessor 102 to the digital infrared sensor 108 through the electrical connector, and through the transfer that would occur through the PCB material if the digital infrared sensor 108 and the microprocessor 102 were mounted on the same PCB.
Digital Infrared Sensor Method Implementations [0066] In the previous section, apparatus of the operation of an implementation were described. In this section, the particular methods performed by the non-touch thermometers 100, 200 and 300 of such an implementation are described by reference to a series of flowcharts. [0067] FIG. 4 is a flowchart of a method 400 to determine a temperature from a digital infrared sensor, according to an implementation. Method 400 includes receiving from the digital readout ports of a digital infrared sensor a digital signal that is representative of an infrared signal detected by the digital infrared sensor, at block 402.
[0068] Method 400 also includes determining a temperature from the digital signal that is representative of the infrared signal, at block 404.
[0069] FIG. 5 is a flowchart of a method 500 to display temperature color indicators, according to an implementation of three colors. Method 500 provides color rendering in the color LED 2412 to indicate a general range of a temperature.
[0070] Method 500 includes receiving a temperature (such as temperature 120 in FIG. 1), at block 501.
[0071] Method 500 also includes determining whether or not the temperature is in the range of 32.0°C and 37.3°C, at block 502. If the temperature is in the range of 32.0°C and 37.3°C, then the color is set to amber to indicate a temperature that is low, at block 504, and the background of the color LED 2412 is activated in accordance with the color, at block 506.
[0072] If the temperature is not in the range of 32.0°C and 37.3°C, then method 500 also includes determining whether or not the temperature is in the range of 37.4°C and 38.0°C, at block 508. If the sensed temperature is in the range of 37.4°C and 38.0°C, then the color is set to green to indicate no medical concern, at block 510, and the background of the color LED 2412 is activated in accordance with the color, at block 506.
[0073] If the temperature is not in the range of 37.4°C to 38.0°C, then method 500 also includes determining whether or not the temperature is over 38.0°C, at block 512. If the temperature is over 38.0°C, then the color is set to red to indicate alert, at block 512 and the background of the color LED 2412 is activated in accordance with the color, at block 506.
[0074] Method 500 assumes that temperature is in gradients of tenths of a degree. Other temperature range boundaries are used in accordance with other gradients of temperature sensing.
[0075] In some implementations, some pixels in the color LED 2412 are activated as an amber color when the temperature is between 36.3°C and 37.3°C (97.3°F to 99.1°F); some pixels in the color LED 2412 are activated as a green color when the temperature is between 37.4°C and 37.9°C (99.3°F to 100.2°F); and some pixels in the color LED 2412 are activated as a red color when the temperature is greater than 38°C (100.4°F). In some implementations, the color LED 2412 is a backlit LCD screen 302 in FIG. 3 (which is easy to read in a dark room) and some pixels in the color LED 2412 are activated (remain lit) for about 5 seconds after the single button 106 is released. After the color LED 2412 has shut off, another temperature reading can be taken by the apparatus. The color change of the color LED 2412 is to alert the operator of the apparatus of a potential change of body temperature of the human or animal subject. The temperature reported on the display can be used for treatment decisions.
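The three-range threshold logic of blocks 502 through 512 can be sketched in a few lines. This is an illustrative sketch only; the helper name led_color and its string return values are assumptions, not part of the described apparatus:

```python
def led_color(temp_c):
    """Map a temperature reading in degrees C to an indicator color.

    Ranges follow the flowchart of FIG. 5: amber = low temperature,
    green = no medical concern, red = alert.
    """
    if 32.0 <= temp_c <= 37.3:
        return "amber"   # low temperature, block 504
    if 37.4 <= temp_c <= 38.0:
        return "green"   # no medical concern, block 510
    if temp_c > 38.0:
        return "red"     # alert, block 512
    return None          # below the measurable range

print(led_color(36.8))   # amber
```

Because readings are in tenths of a degree, the gap between 37.3°C and 38.0°C boundaries never falls between representable readings.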
[0076] FIG. 6 is a flowchart of a method 600 to manage power in a non-touch device having a digital infrared sensor, according to an implementation. The method 600 manages power in the device, such as the non-touch thermometers in FIG. 1-3, the non-touch thermometer 2100 in FIG. 21, the hand-held device 2200 in FIG. 22 and/or the computer 2300 in FIG. 23, in order to reduce heat pollution in the digital infrared sensor.
[0077] To prevent or at least reduce heat transfer between the digital infrared sensor 108 and the microprocessor 102, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, the components of the non-touch thermometers 100, 200 and 300 in FIG. 1-3, the non-touch thermometer 2100 in FIG. 21, the hand-held device 2200 in FIG. 22 and/or the computer 2300 in FIG. 23 are power controlled, i.e. the devices turn sub-systems on and off, and the components are only activated when needed in the measurement and display process, which reduces power consumption and thus heat generation by the microprocessor 102, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, respectively. When not in use, at block 602, the non-touch thermometers 100, 200 and 300 in FIG. 1-3, the non-touch thermometer 2100 in FIG. 21, the hand-held device 2200 in FIG. 22 and/or the computer 2300 in FIG. 23 are completely powered-off at block 604 (including the main PCB having the microprocessor 102, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, and the sensor PCB having the digital infrared sensor 108) and not drawing any power, other than a power supply, i.e. a boost regulator. This has the effect that the devices draw only micro-amps from the battery 104 while in the off state, which is required for the lifetime requirement of 3 years of operation, but which also means that in the non-use state there is very little powered circuitry in the non-touch thermometers 100, 200 and 300 in FIG. 1-3, the non-touch thermometer 2100 in FIG. 21, the hand-held device 2200 in FIG. 22 and/or the computer 2300 in FIG. 23 and therefore very little heat generated in the devices.
[0078] When the non-touch thermometers 100, 200 and 300 in FIG. 1-3, the non-touch thermometer 2100 in FIG. 21, the hand-held device 2200 in FIG. 22 and/or the computer 2300 in FIG. 23 are started by the operator, at block 606, only the microprocessor 102, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, digital infrared sensor 108, and low power LCD (e.g. display device 114) are turned on for the first 1 second, at block 608, to take the temperature measurement via the digital infrared sensor 108 and generate the body core temperature result via the microprocessor 102, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, at block 610. In this way, the main heat generating components (the LCD 114, the main PCB having the microprocessor 102 and the sensor PCB having the digital infrared sensor 108), the display back-light and the temperature range indicator (i.e. the traffic light indicator 2112) are not on and therefore not generating heat during the critical start-up and measurement process, which lasts no more than 1 second. After the measurement process of block 610 has been completed, the digital infrared sensor 108 is turned off, at block 612, to reduce current usage from the batteries and heat generation, and also the display back-light and temperature range indicators are turned on, at block 614.
[0079] The measurement result is displayed for 4 seconds, at block 616, and then the non-touch thermometers 100, 200 and 300 in FIG. 1-3, the non-touch thermometer 2100 in FIG. 21, the hand-held device 2200 in FIG. 22 and/or the computer 2300 in FIG. 23 are put in a low power-off state, at block 618. [0080] In some implementations of the methods and apparatus of FIG. 1-6, an operator can take the temperature of a subject at multiple locations on a patient and, from the temperatures at the multiple locations, determine the temperature at a number of other locations of the subject. The multiple source points at which the electromagnetic energy is sensed are mutually exclusive to the location of the correlated temperature. In one example, the carotid artery source point on the subject and a forehead source point are mutually exclusive to the core temperature of the subject, an axillary temperature of the subject, a rectal temperature of the subject and an oral temperature of the subject.
[0081] The correlation action can include a calculation based on Formula 1: [0082] Tbody = fstb(Tsurface_temp + fntc(Tntc)) + F4body [0083] Formula 1 [0084] where Tbody is the temperature of a body or subject [0085] where fstb is a mathematical formula of a surface of a body [0086] where fntc is a mathematical formula for ambient temperature reading [0087] where Tsurface_temp is a surface temperature determined from the sensing.
[0088] where Tntc is an ambient air temperature reading [0089] where F4body is a calibration difference in axillary mode, which is stored or set in a memory of the apparatus either during manufacturing or in the field. The apparatus also sets, stores and retrieves the F4 calibration values for the other sensing modes in the memory.
[0090] fntc(Tntc) is a bias in consideration of the temperature sensing mode. For example, faxilla(Taxilla) = 0.2 °C and foral(Toral) = 0.4 °C, with frectal(Trectal) and fcore(Tcore) taking similar mode-specific values. [0091] In some implementations of determining a correlated body temperature of a carotid artery by biasing a sensed temperature of the carotid artery, the sensed temperature is biased by +0.3 °C to yield the correlated body temperature. In another example, the sensed temperature is biased by -0.5 °C to yield the correlated body temperature. An example of correlating body temperature of a carotid artery follows: [0092] fntc(Tntc) = 0.2 °C when Tntc = 26.2 °C as retrieved from a data table for body sensing mode.
[0093] assumption: Tsurface_temp = 37.8 °C [0094] Tsurface_temp + fntc(Tntc) = 37.8 °C + 0.2 °C = 38.0 °C [0095] fstb(Tsurface_temp + fntc(Tntc)) = 38.0 °C + 1.4 °C = 39.4 °C [0096] assumption: F4body = 0.5 °C [0097] Tbody = fstb(Tsurface_temp + fntc(Tntc)) + F4body = 39.4 °C + 0.5 °C = 39.9 °C [0098] The correlated temperature for the carotid artery is 40.0 °C.
[0099] In an example of correlating temperature of a plurality of external locations, such as a forehead and a carotid artery, to an axillary temperature, first a forehead temperature is calculated using Formula 1 as follows: [00100] fntc(Tntc) = 0.2 °C when Tntc = 26.2 °C as retrieved from a data table for axillary sensing mode. [00101] assumption: Tsurface_temp = 37.8 °C [00102] Tsurface_temp + fntc(Tntc) = 37.8 °C + 0.2 °C = 38.0 °C [00103] fstb(Tsurface_temp + fntc(Tntc)) = 38.0 °C + 1.4 °C = 39.4 °C [00104] assumption: F4body = 0 °C [00105] Tbody = fstb(Tsurface_temp + fntc(Tntc)) + F4body = 39.4 °C + 0 °C = 39.4 °C [00106] And second, a carotid temperature is calculated using Formula 1 as follows: [00107] fntc(Tntc) = 0.6 °C when Tntc = 26.4 °C as retrieved from a data table. [00108] assumption: Tsurface_temp = 38.0 °C [00109] Tsurface_temp + fntc(Tntc) = 38.0 °C + 0.6 °C = 38.6 °C [00110] fstb(Tsurface_temp + fntc(Tntc)) = 38.6 °C + 1.4 °C = 40.0 °C [00111] assumption: F4body = 0 °C [00112] Tbody = fstb(Tsurface_temp + fntc(Tntc)) + F4body = 40.0 °C + 0 °C = 40.0 °C [00113] Thereafter the correlated temperature for the forehead (39.4 °C) and the correlated temperature for the carotid artery (40.0 °C) are averaged, yielding the final result of the scan of the forehead and the carotid artery as 39.7 °C.
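The worked examples above can be reproduced with a short sketch of Formula 1. The ambient-bias table values (0.2 °C at 26.2 °C ambient, 0.6 °C at 26.4 °C ambient) are taken from the examples; treating fstb as a fixed additive 1.4 °C bias is an assumption made only to match the arithmetic of the examples, not a statement of the actual calibration formulas:

```python
# Example fntc data table, using only the two ambient readings quoted above.
NTC_TABLE = {26.2: 0.2, 26.4: 0.6}

def f_ntc(t_ntc):
    # Ambient-temperature bias retrieved from the data table.
    return NTC_TABLE[t_ntc]

def f_stb(t):
    # Surface-to-body formula; the worked examples add a fixed 1.4 deg C.
    return t + 1.4

def t_body(t_surface, t_ntc, f4_body):
    # Formula 1: Tbody = fstb(Tsurface_temp + fntc(Tntc)) + F4body
    return f_stb(t_surface + f_ntc(t_ntc)) + f4_body

forehead = t_body(37.8, 26.2, 0.0)      # 39.4 deg C
carotid = t_body(38.0, 26.4, 0.0)       # 40.0 deg C
average = (forehead + carotid) / 2.0    # 39.7 deg C, the final scan result
```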
Vital Sign Motion Amplification Apparatus Implementations [00114] Apparatus in FIG. 7-15 use spatial and temporal signal processing to generate vital signs from a series of digital images.
[00115] FIG. 7 is a block diagram of an apparatus 700 of motion amplification, according to an implementation. Apparatus 700 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00116] In some implementations, apparatus 700 includes a skin-pixel-identifier 702 that identifies pixel values that are representative of the skin in two or more images 704. In some implementations the images 704 are frames of a video. The skin-pixel-identifier 702 performs block 1602 in FIG. 16. Some implementations of the skin-pixel-identifier 702 perform an automatic seed point based clustering process on the two or more images 704. In some implementations, apparatus 700 includes a frequency filter 706 that receives the output of the skin-pixel-identifier 702 and applies a frequency filter to the output of the skin-pixel-identifier 702. The frequency filter 706 performs block 1604 in FIG. 16 to process the images 704 in the frequency domain. In implementations where the apparatus in FIG. 7-15 or the methods in FIG. 16-20 are implemented on non-touch thermometers 100, 200 or 300 in FIG. 1-3, the images 704 in FIG. 7-15 are the images 124 in FIG. 1-3. In some implementations the apparatus in FIG. 7-15 or the methods in FIG. 16-20 are implemented on the smartphone 2200 in FIG. 22.
[00117] In some implementations, apparatus 700 includes a regional facial clusterial module 708 that applies spatial clustering to the output of the frequency filter 706. The regional facial clusterial module 708 performs block 1606 in FIG. 16. In some implementations the regional facial clusterial module 708 includes fuzzy clustering, k-means clustering, an expectation-maximization process, Ward's apparatus or seed point based clustering.
[00118] In some implementations, apparatus 700 includes a frequency-filter 710 that applies a frequency filter to the output of the regional facial clusterial module 708. The frequency-filter 710 performs block 1608 in FIG. 16. In some implementations, the frequency-filter 710 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. Some implementations of frequency-filter 710 include de-noising (e.g. smoothing of the data with a Gaussian filter). The skin-pixel-identifier 702, the frequency filter 706, the regional facial clusterial module 708 and the frequency-filter 710 amplify temporal variations (as a temporal-variation-amplifier) in the two or more images 704.
[00119] In some implementations, apparatus 700 includes a temporal-variation identifier 712 that identifies temporal variation of the output of the frequency filter 710. Thus, the temporal variation represents temporal variation of the images 704. The temporal-variation identifier 712 performs block 1610 in FIG. 16.
[00120] In some implementations, apparatus 700 includes a vital-sign generator 714 that generates one or more vital sign(s) 716 from the temporal variation. The vital sign(s) 716 are displayed for review by a healthcare worker or stored in a volatile or nonvolatile memory for later analysis, or transmitted to other devices for analysis. [00121] Fuzzy clustering is a class of processes for cluster analysis in which the allocation of data points to clusters is not hard (all-or-nothing) but fuzzy in the same sense as fuzzy logic. Fuzzy logic is a form of many-valued logic with reasoning that is approximate rather than fixed and exact. In fuzzy clustering, every point has a degree of belonging to clusters, as in fuzzy logic, rather than belonging completely to just one cluster. Thus, points on the edge of a cluster may be in the cluster to a lesser degree than points in the center of the cluster. An overview and comparison of different fuzzy clustering processes is available.
Any point x has a set of coefficients giving the degree of being in the kth cluster, wk(x). With fuzzy c-means, the centroid of a cluster is the mean of all points, weighted by their degree of belonging to the cluster: ck = Σx wk(x)^m x / Σx wk(x)^m [00122] The degree of belonging, wk(x), is related inversely to the distance from x to the cluster center as calculated on the previous pass. It also depends on a parameter m that controls how much weight is given to the closest center.
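One update pass of fuzzy c-means, following the weighted-centroid formula quoted above, can be sketched with NumPy. This is a generic textbook illustration under the standard membership formula, not the patent's spatial-cluster module:

```python
import numpy as np

def fcm_step(X, centers, m=2.0):
    """One fuzzy c-means pass: memberships from distances, then
    centroids as the w^m-weighted mean of all points."""
    # Distances from every point to every cluster center, shape (n, k).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    d = np.fmax(d, 1e-12)                    # guard against zero distance
    # Membership w_k(x) is inversely related to distance (exponent 2/(m-1)).
    inv = d ** (-2.0 / (m - 1.0))
    w = inv / inv.sum(axis=1, keepdims=True)
    # Centroid c_k = sum_x w_k(x)^m * x / sum_x w_k(x)^m.
    wm = w ** m
    new_centers = (wm.T @ X) / wm.sum(axis=0)[:, None]
    return w, new_centers

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
w, c = fcm_step(X, np.array([[0.0, 0.0], [5.0, 5.0]]))
```

Each row of w sums to 1, so a point on the edge of a cluster belongs partially to both clusters, as described above.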
[00123] k-means clustering is a process of vector quantization, originally from signal processing, that is popular for cluster analysis in data mining. k-means clustering aims to partition n observations into k clusters in which each observation belongs to the cluster with the nearest mean, serving as a prototype of the cluster. This results in a partitioning of the data space into Voronoi cells, a Voronoi cell being a region within a Voronoi diagram, which is a way of dividing space into a number of regions around a set of points specified beforehand. k-means clustering uses cluster centers to model the data and tends to find clusters of comparable spatial extent; fuzzy c-means clustering is similar, but each data point has a fuzzy degree of belonging to each separate cluster.
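The partitioning just described can be sketched as plain Lloyd-style k-means: alternate nearest-mean assignment (the Voronoi partition) with mean recomputation. An illustrative sketch, not the patent's clustering module:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means over an (n, d) array of observations."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each observation to the cluster with the nearest mean.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each mean as the prototype of its cluster.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

pts = np.array([[0.0, 0.0], [0.2, 0.0], [5.0, 5.0], [5.2, 5.0]])
labels, centers = kmeans(pts, 2)
```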
[00124] An expectation-maximization process is an iterative process for finding maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. The expectation-maximization iteration alternates between performing an expectation step, which creates a function for the expectation of the log-likelihood evaluated using the current estimate for the parameters, and a maximization step, which computes parameters maximizing the expected log-likelihood found on the expectation step. These parameter-estimates are then used to determine the distribution of the latent variables in the next expectation step.
[00125] The expectation maximization process seeks to find the maximum likelihood estimate of the marginal likelihood by iteratively applying the following two steps: [00126] 1. Expectation step (E step): Calculate the expected value of the log likelihood function, with respect to the conditional distribution of Z given X under the current estimate of the parameters θ(t): Q(θ|θ(t)) = E_{Z|X,θ(t)}[log L(θ; X, Z)] [00127] 2. Maximization step (M step): Find the parameter that maximizes this quantity: θ(t+1) = argmax_θ Q(θ|θ(t)) [00128] Note that in typical models to which expectation maximization is applied: [00129] 1. The observed data points X may be discrete (taking values in a finite or countably infinite set) or continuous (taking values in an uncountably infinite set). There may in fact be a vector of observations associated with each data point.
[00130] 2. The missing values (aka latent variables) Z are discrete, drawn from a fixed number of values, and there is one latent variable per observed data point.
[00131] 3. The parameters are continuous, and are of two kinds: Parameters that are associated with all data points, and parameters associated with a particular value of a latent variable (i.e. associated with all data points whose corresponding latent variable has a particular value).
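The E/M alternation can be illustrated for the simplest interesting case: a two-component one-dimensional Gaussian mixture, where the latent variable Z is the (discrete) component label and the parameters (means, variances, mixing weights) are continuous, matching points 1-3 above. A generic sketch, not the patent's process:

```python
import numpy as np

def em_gmm_1d(x, iters=50):
    """EM for a 1-D two-component Gaussian mixture."""
    mu = np.array([x.min(), x.max()])          # crude initial means
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])                  # mixing weights
    for _ in range(iters):
        # E step: responsibilities r = P(Z = k | x) under current parameters.
        p = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M step: parameters that maximize the expected log-likelihood.
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
        pi = n / len(x)
    return mu, var, pi

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 0.3, 200), rng.normal(5.0, 0.3, 200)])
mu, var, pi = em_gmm_1d(x)
```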
[00132] The Fourier Transform is an important image processing tool which is used to decompose an image into its sine and cosine components. The output of the transformation represents the image in the Fourier or frequency domain, while the input image is the spatial domain equivalent. In the Fourier domain image, each point represents a particular frequency contained in the spatial domain image.
[00133] The Discrete Fourier Transform is the sampled Fourier Transform and therefore does not contain all frequencies forming an image, but only a set of samples which is large enough to fully describe the spatial domain image. The number of frequencies corresponds to the number of pixels in the spatial domain image, i.e. the images in the spatial and Fourier domains are of the same size.
[00134] For a square image of size N×N, the two-dimensional DFT is given by: F(k,l) = (1/N^2) Σ(a=0..N-1) Σ(b=0..N-1) f(a,b) e^(-i2π(ka/N + lb/N)) [00135] where f(a,b) is the image in the spatial domain and the exponential term is the basis function corresponding to each point F(k,l) in the Fourier space.
The equation can be interpreted as: the value of each point F(k,l) is obtained by multiplying the spatial image with the corresponding base function and summing the result.
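The double sum can be evaluated directly and compared against a library FFT; for a square image the two nested sums factor into a matrix product with the complex exponential basis matrix. A sketch for checking the equation (note numpy's fft2 omits the 1/N^2 scale used above):

```python
import numpy as np

def dft2(f):
    """Direct two-dimensional DFT of a square N x N image:
    F(k,l) = (1/N^2) * sum_a sum_b f(a,b) * exp(-i*2*pi*(k*a + l*b)/N)."""
    N = f.shape[0]
    idx = np.arange(N)
    # e[k, a] = exp(-i*2*pi*k*a/N): the sine/cosine basis in complex form.
    e = np.exp(-2j * np.pi * np.outer(idx, idx) / N)
    return (e @ f @ e) / N**2

img = np.arange(16.0).reshape(4, 4)
F = dft2(img)
# F[0, 0] is the DC component, i.e. the average brightness of the image.
```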
[00136] The basis functions are sine and cosine waves with increasing frequencies, i.e. F(0,0) represents the DC-component of the image which corresponds to the average brightness, and F(N-1,N-1) represents the highest frequency. [00137] A high-pass filter (HPF) is an electronic filter that passes high-frequency signals but attenuates (reduces the amplitude of) signals with frequencies lower than the cutoff frequency. The actual amount of attenuation for each frequency varies from filter to filter. A high-pass filter is usually modeled as a linear time-invariant system. High-pass filters can also be used in conjunction with a low-pass filter to make a bandpass filter. A simple first-order electronic high-pass filter is implemented by placing an input voltage across the series combination of a capacitor and a resistor and using the voltage across the resistor as an output. The product of the resistance and capacitance (RC) is the time constant (τ); it is inversely proportional to the cutoff frequency fc, that is, fc = 1/(2πτ) = 1/(2πRC) [00138] where fc is in hertz, τ is in seconds, R is in ohms, and C is in farads.
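The cutoff relation can be checked numerically; the component values below are illustrative choices, not values from the described apparatus:

```python
import math

def cutoff_hz(r_ohm, c_farad):
    # fc = 1 / (2 * pi * R * C) for the first-order RC high-pass above.
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

fc = cutoff_hz(10_000, 1e-6)   # 10 kOhm and 1 uF give roughly 15.9 Hz
```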
[00139] A low-pass filter is a filter that passes low-frequency signals and attenuates (reduces the amplitude of) signals with frequencies higher than the cutoff frequency. The actual amount of attenuation for each frequency varies depending on the specific filter design. It is sometimes called a high-cut filter, or treble cut filter in audio applications. A low-pass filter is the opposite of a high-pass filter. Low-pass filters provide a smoother form of a signal, removing the short-term fluctuations and leaving the longer-term trend. One simple low-pass filter circuit consists of a resistor in series with a load, and a capacitor in parallel with the load. The capacitor exhibits reactance, and blocks low-frequency signals, forcing them through the load instead. At higher frequencies the reactance drops, and the capacitor effectively functions as a short circuit. The combination of resistance and capacitance gives the time constant of the filter. The break frequency, also called the turnover frequency or cutoff frequency (in hertz), is determined by the time constant.
[00140] A band-pass filter is a device that passes frequencies within a certain range and attenuates frequencies outside that range. These filters can also be created by combining a low-pass filter with a high-pass filter. Bandpass is an adjective that describes a type of filter or filtering process; it is to be distinguished from passband, which refers to the actual portion of affected spectrum. Hence, one might say "A dual bandpass filter has two passbands." A bandpass signal is a signal containing a band of frequencies not adjacent to zero frequency, such as a signal that comes out of a bandpass filter.
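A band-pass filter can be sketched in the frequency domain by zeroing spectral bins outside the passband, i.e. a low-pass combined with a high-pass as described above. This is an idealized illustration, not a practical analog filter design:

```python
import numpy as np

def fft_bandpass(signal, fs, f_lo, f_hi):
    """Keep only frequency content inside [f_lo, f_hi] hertz."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0   # attenuate the stopbands
    return np.fft.irfft(spec, n=len(signal))

fs = 100.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 1.0 * t) + np.sin(2 * np.pi * 20.0 * t)
y = fft_bandpass(x, fs, 15.0, 25.0)   # the 20 Hz component survives
```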
[00141] FIG. 8 is a block diagram of an apparatus 800 of motion amplification, according to an implementation. Apparatus 800 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00142] In some implementations, apparatus 800 includes a skin-pixel-identifier 702 that identifies pixel values that are representative of the skin in two or more images 704. The skin-pixel-identifier 702 performs block 1602 in FIG. 16. Some implementations of the skin-pixel-identifier 702 perform an automatic seed point based clustering process on the at least two images 704.
[00143] In some implementations, apparatus 800 includes a frequency filter 706 that receives the output of the skin-pixel-identifier 702 and applies a frequency filter to the output of the skin-pixel-identifier 702. The frequency filter 706 performs block 1604 in FIG. 16 to process the images 704 in the frequency domain.
[00144] In some implementations, apparatus 800 includes a regional facial clusterial module 708 that applies spatial clustering to the output of the frequency filter 706. The regional facial clusterial module 708 performs block 1606 in FIG. 16. In some implementations the regional facial clusterial module 708 includes fuzzy clustering, k-means clustering, an expectation-maximization process, Ward's apparatus or seed point based clustering.
[00145] In some implementations, apparatus 800 includes a frequency-filter 710 that applies a frequency filter to the output of the regional facial clusterial module 708, to generate a temporal variation. The frequency-filter 710 performs block 1608 in FIG. 16. In some implementations, the frequency-filter 710 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. Some implementations of frequency-filter 710 include de-noising (e.g. smoothing of the data with a Gaussian filter). The skin-pixel-identifier 702, the frequency filter 706, the regional facial clusterial module 708 and the frequency-filter 710 amplify temporal variations in the two or more images 704.
[00146] In some implementations, apparatus 800 includes a vital-sign generator 714 that generates one or more vital sign(s) 716 from the temporal variation. The vital sign(s) 716 are displayed for review by a healthcare worker or stored in a volatile or nonvolatile memory for later analysis, or transmitted to other devices for analysis.
[00147] FIG. 9 is a block diagram of an apparatus 900 of motion amplification, according to an implementation. Apparatus 900 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00148] In some implementations, apparatus 900 includes a skin-pixel-identifier 702 that identifies pixel values that are representative of the skin in two or more images 704. The skin-pixel-identifier 702 performs block 1602 in FIG. 16. Some implementations of the skin-pixel-identifier 702 perform an automatic seed point based clustering process on the at least two images 704.
[00149] In some implementations, apparatus 900 includes a spatial bandpass filter 902 that receives the output of the skin-pixel-identifier 702 and applies a spatial bandpass filter to the output of the skin-pixel-identifier 702. The spatial bandpass filter 902 performs block 1802 in FIG. 18 to process the images 704 in the spatial domain.
[00150] In some implementations, apparatus 900 includes a regional facial clusterial module 708 that applies spatial clustering to the output of the frequency filter 706. The regional facial clusterial module 708 performs block 1804 in FIG. 18. In some implementations the regional facial clusterial module 708 includes fuzzy clustering, k-means clustering, an expectation-maximization process, Ward's apparatus or seed point based clustering.
[00151] In some implementations, apparatus 900 includes a temporal bandpass filter 904 that applies a frequency filter to the output of the regional facial clusterial module 708. The temporal bandpass filter 904 performs block 1806 in FIG. 18. In some implementations, the temporal bandpass filter 904 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. Some implementations of temporal bandpass filter 904 include de-noising (e.g. smoothing of the data with a Gaussian filter).
[00152] The skin-pixel-identifier 702, the spatial bandpass filter 902, the regional facial clusterial module 708 and the temporal bandpass filter 904 amplify temporal variations in the two or more images 704.
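The amplification chain can be sketched end-to-end on a stack of frames by band-pass filtering each pixel's intensity over time and adding the amplified variation back. The clustering stage is omitted and the gain value is an arbitrary assumption, so this illustrates the principle rather than the described apparatus:

```python
import numpy as np

def amplify_temporal(frames, fs, f_lo, f_hi, gain=10.0):
    """frames: array of shape (num_frames, height, width)."""
    spec = np.fft.rfft(frames, axis=0)             # per-pixel temporal FFT
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0    # temporal band-pass
    variation = np.fft.irfft(spec, n=frames.shape[0], axis=0)
    return frames + gain * variation               # amplify small variations

fs = 32.0
t = np.arange(64) / fs
# A 1.5 Hz intensity flicker of amplitude 0.01 on a constant background.
frames = 0.5 + 0.01 * np.sin(2 * np.pi * 1.5 * t)[:, None, None] * np.ones((1, 4, 4))
out = amplify_temporal(frames, fs, 1.0, 2.0, gain=10.0)
```

With gain 10 the 0.01-amplitude flicker comes out 11 times larger, while the constant background is untouched.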
[00153] In some implementations, apparatus 900 includes a temporal-variation identifier 712 that identifies temporal variation of the output of the frequency filter 710. Thus, the temporal variation represents temporal variation of the images 704. The temporal-variation identifier 712 performs block 1808 in FIG. 18.
[00154] In some implementations, apparatus 900 includes a vital-sign generator 714 that generates one or more vital sign(s) 716 from the temporal variation. The vital sign(s) 716 are displayed for review by a healthcare worker or stored in a volatile or nonvolatile memory for later analysis, or transmitted to other devices for analysis.
[00155] FIG. 10 is a block diagram of an apparatus 1000 of motion amplification, according to an implementation.
[00156] In some implementations, apparatus 1000 includes a pixel-examiner 1002 that examines pixel values of two or more images 704. The pixel-examiner 1002 performs block 1902 in FIG. 19.
[00157] In some implementations, apparatus 1000 includes a temporal variation determiner 1006 that determines a temporal variation of examined pixel values. The temporal variation determiner 1006 performs block 1904 in FIG. 19.
[00158] In some implementations, apparatus 1000 includes a signal-processor 1008 that applies signal processing to the pixel value temporal variation, generating an amplified temporal variation. The signal-processor 1008 performs block 1906 in FIG. 19. The signal processing amplifies the temporal variation, even when the temporal variation is small. In some implementations, the signal processing performed by signal-processor 1008 is temporal bandpass filtering that analyzes frequencies over time. In some implementations, the signal processing performed by signal-processor 1008 is spatial processing that removes noise.
Apparatus 1000 amplifies only small temporal variations in the signal-processing module.
[00159] In some implementations, apparatus 1000 includes a vital-sign generator 714 that generates one or more vital sign(s) 716 from the temporal variation. The vital sign(s) 716 are displayed for review by a healthcare worker or stored in a volatile or nonvolatile memory for later analysis, or transmitted to other devices for analysis.
[00160] While apparatus 1000 can process large temporal variations, an advantage in apparatus 1000 is provided for small temporal variations. Therefore apparatus 1000 is most effective when the two or more images 704 have small temporal variations between the two or more images 704. In some implementations, a vital sign is generated from the amplified temporal variations of the two or more images 704 from the signal-processor 1008.
[00161] FIG. 11 is a block diagram of an apparatus 1100 of motion amplification, according to an implementation. Apparatus 1100 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00162] In some implementations, apparatus 1100 includes a skin-pixel-identification module 1102 that identifies pixel values 1106 that are representative of the skin in two or more images 1104. The skin-pixel-identification module 1102 performs block 1602 in FIG. 16. Some implementations of the skin-pixel-identification module 1102 perform an automatic seed point based clustering process on the at least two images 1104.
[00163] In some implementations, apparatus 1100 includes a frequency-filter module 1108 that receives the identified pixel values 1106 that are representative of the skin and applies a frequency filter to the identified pixel values 1106. The frequency-filter module 1108 performs block 1604 in FIG. 16 to process the images 704 in the frequency domain. Each of the images 704 is Fourier transformed, multiplied with a filter function and then re-transformed into the spatial domain. Frequency filtering is based on the Fourier Transform. The operator takes an image 704 and a filter function in the Fourier domain. The image 704 is then multiplied with the filter function in a pixel-by-pixel fashion using the formula: [00164] G(k,l) = F(k,l) H(k,l) [00165] where F(k,l) is the input image 704 of identified pixel values 1106 in the Fourier domain, H(k,l) is the filter function and G(k,l) is the filtered image 1110.
To obtain the resulting image in the spatial domain, G(k,l) is re-transformed using the inverse Fourier Transform. In some implementations, the frequency-filter module 1108 is a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. [00166] In some implementations, apparatus 1100 includes a spatial-cluster module 1112 that applies spatial clustering to the frequency filtered identified pixel values of skin 1110, generating spatial clustered frequency filtered identified pixel values of skin 1114. The spatial-cluster module 1112 performs block 1606 in FIG. 16. In some implementations the spatial-cluster module 1112 includes fuzzy clustering, k-means clustering, an expectation-maximization process, Ward's apparatus or seed point based clustering.
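The G(k,l) = F(k,l)H(k,l) filtering described above can be sketched with numpy's two-dimensional FFT; the ideal low-pass H below is an arbitrary example of a filter function, not one used by the described apparatus:

```python
import numpy as np

def filter_in_fourier(image, H):
    """Multiply the image's Fourier transform pixel-by-pixel with the
    filter function H, then inverse-transform back to the spatial domain."""
    F = np.fft.fft2(image)
    G = F * H                         # G(k,l) = F(k,l) * H(k,l)
    return np.real(np.fft.ifft2(G))   # resulting image in the spatial domain

img = np.random.default_rng(0).random((32, 32))
fy = np.fft.fftfreq(32)[:, None]
fx = np.fft.fftfreq(32)[None, :]
H = (np.sqrt(fx**2 + fy**2) < 0.15).astype(float)   # ideal low-pass mask
smooth = filter_in_fourier(img, H)
```

Because H(0,0) = 1 the average brightness (DC component) is preserved while high spatial frequencies are removed.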
[00167] In some implementations, apparatus 1100 includes a frequency-filter module 1116 that applies a frequency filter to the spatial clustered frequency filtered identified pixel values of skin 1114, which generates frequency filtered spatial clustered frequency filtered identified pixel values of skin 1118. The frequency-filter module 1116 performs block 1608 in FIG. 16. In some implementations, the frequency-filter module 1116 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. Some implementations of frequency-filter module 1116 include de-noising (e.g. smoothing of the data with a Gaussian filter).
[00168] The skin-pixel-identification module 1102, the frequency-filter module 1108, the spatial-cluster module 1112 and the frequency-filter module 1116 amplify temporal variations in the two or more images 704.
[00169] In some implementations, apparatus 1100 includes a temporal-variation module 1120 that determines temporal variation 1122 of the frequency filtered spatial clustered frequency filtered identified pixel values of skin 1118.
Thus, temporal variation 1122 represents temporal variation of the images 704.
The temporal-variation module 1120 performs block 1610 in FIG. 16.
[00170] FIG. 12 is a block diagram of an apparatus 1200 to generate and present any one of a number of biological vital signs from amplified motion, according to an implementation.
[00171] In some implementations, apparatus 1200 includes a blood-flow-analyzer module 1202 that analyzes a temporal variation to generate a pattern of flow of blood 1204. One example of the temporal variation is temporal variation 1122 in FIG. 11. In some implementations, the pattern of flow of blood 1204 is generated from motion changes in the pixels and the temporal variation of color changes in the skin of the images 704. In some implementations, apparatus 1200 includes a blood-flow display module 1206 that displays the pattern of flow of blood 1204 for review by a healthcare worker.
[00172] In some implementations, apparatus 1200 includes a heartrate-analyzer module 1208 that analyzes the temporal variation to generate a heartrate 1210. In some implementations, the heartrate 1210 is generated from the frequency spectrum of the temporal signal in a frequency range for heart beats, such as 0-10 Hertz. In some implementations, apparatus 1200 includes a heartrate display module 1212 that displays the heartrate 1210 for review by a healthcare worker.
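Extracting a heartrate from the frequency spectrum of a temporal signal, as described above, can be sketched as follows. The frame rate, band limits and synthetic pulse signal are assumptions for illustration; the synthetic signal merely stands in for the temporal variation 1122.

```python
import numpy as np

def estimate_heartrate(signal, fs, f_lo=0.5, f_hi=10.0):
    """Estimate a heart rate (beats/min) as the dominant spectral peak of a
    temporal signal sampled at fs frames per second. The band limits are
    assumptions bracketing plausible heart-beat frequencies."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))   # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)         # frequency axis in Hz
    band = (freqs >= f_lo) & (freqs <= f_hi)                 # restrict to heart-beat range
    peak_freq = freqs[band][np.argmax(spectrum[band])]       # dominant frequency
    return peak_freq * 60.0                                  # convert Hz to beats/min

# Synthetic 1.2 Hz (72 beats/min) pulse sampled at 30 frames/s for 10 s
fs = 30.0
t = np.arange(0, 10, 1.0 / fs)
pulse = 0.05 * np.sin(2 * np.pi * 1.2 * t)
rate = estimate_heartrate(pulse, fs)  # ≈ 72 beats/min
```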
[00173] In some implementations, apparatus 1200 includes a respiratory rate-analyzer module 1214 that analyzes the temporal variation to determine a respiratory rate 1216. In some implementations, the respiratory rate 1216 is generated from the motion of the pixels in a frequency range for respiration (0-5 Hertz). In some implementations, apparatus 1200 includes a respiratory rate display module 1218 that displays the respiratory rate 1216 for review by a healthcare worker. [00174] In some implementations, apparatus 1200 includes a blood-pressure analyzer module 1220 that analyzes the temporal variation to generate a blood pressure 1222. In some implementations, the blood-pressure analyzer module 1220 generates the blood pressure 1222 by analyzing the motion of the pixels and the color changes based on a clustering process and potentially temporal data. In some implementations, apparatus 1200 includes a blood pressure display module 1224 that displays the blood pressure 1222 for review by a healthcare worker.
[00175] In some implementations, apparatus 1200 includes an EKG analyzer module 1226 that analyzes the temporal variation to generate an EKG 1228. In some implementations, apparatus 1200 includes an EKG display module 1230 that displays the EKG 1228 for review by a healthcare worker.
[00176] In some implementations, apparatus 1200 includes a pulse oximetry analyzer module 1232 that analyzes the temporal variation to generate pulse oximetry 1234. In some implementations, the pulse oximetry analyzer module 1232 generates the pulse oximetry 1234 by analyzing the temporal color changes in conjunction with the k-means clustering process and potentially temporal data. In some implementations, apparatus 1200 includes a pulse oximetry display module 1236 that displays the pulse oximetry 1234 for review by a healthcare worker. [00177] FIG. 13 is a block diagram of an apparatus 1300 of motion amplification, according to an implementation. Apparatus 1300 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00178] In some implementations, apparatus 1300 includes a skin-pixel-identification module 1102 that identifies pixel values 1106 that are representative of the skin in two or more images 704. The skin-pixel-identification module 1102 performs block 1602 in FIG. 16. Some implementations of the skin-pixel-identification module 1102 perform an automatic seed point based clustering process on the at least two images 704.
[00179] In some implementations, apparatus 1300 includes a frequency-filter module 1108 that receives the identified pixel values 1106 that are representative of the skin and applies a frequency filter to the identified pixel values 1106. The frequency-filter module 1108 performs block 1604 in FIG. 16 to process the images 704 in the frequency domain. Each of the images 704 is Fourier transformed, multiplied with a filter function and then re-transformed into the spatial domain. Frequency filtering is based on the Fourier Transform. The operator takes an image 704 and a filter function in the Fourier domain. The image 704 is then multiplied with the filter function in a pixel-by-pixel fashion using the [00180] formula: G(k,l) = F(k,l) H(k,l) [00181] where F(k,l) is the input image 704 of identified pixel values 1106 in the Fourier domain, H(k,l) is the filter function and G(k,l) is the filtered image 1110. To obtain the resulting image in the spatial domain, G(k,l) is re-transformed using the inverse Fourier Transform. In some implementations, the frequency-filter module 1108 is a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. [00182] In some implementations, apparatus 1300 includes a spatial-cluster module 1112 that applies spatial clustering to the frequency filtered identified pixel values of skin 1110, generating spatial clustered frequency filtered identified pixel values of skin 1114. The spatial-cluster module 1112 performs block 1606 in FIG. 16. In some implementations the spatial clustering includes fuzzy clustering, k-means clustering, expectation-maximization process, Ward's method or seed point based clustering.
[00183] In some implementations, apparatus 1300 includes a frequency-filter module 1116 that applies a frequency filter to the spatial clustered frequency filtered identified pixel values of skin 1114, which generates frequency filtered spatial clustered frequency filtered identified pixel values of skin 1118. The frequency-filter module 1116 performs block 1608 in FIG. 16 to generate a temporal variation 1122. In some implementations, the frequency-filter module 1116 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. Some implementations of the frequency-filter module 1116 include de-noising (e.g. smoothing of the data with a Gaussian filter). The skin-pixel-identification module 1102, the frequency-filter module 1108, the spatial-cluster module 1112 and the frequency-filter module 1116 amplify temporal variations in the two or more images 704.
[00184] The frequency-filter module 1116 is operably coupled to one or more modules in FIG. 12 to generate and present any one of a number of biological vital signs from amplified motion in the temporal variation 1122.
[00185] FIG. 14 is a block diagram of an apparatus 1400 of motion amplification, according to an implementation. Apparatus 1400 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00186] In some implementations, apparatus 1400 includes a skin-pixel-identification module 1102 that identifies pixel values 1106 that are representative of the skin in two or more images 704. The skin-pixel-identification module 1102 performs block 1602 in FIG. 18. Some implementations of the skin-pixel-identification module 1102 perform an automatic seed point based clustering process on the at least two images 704. In some implementations, apparatus 1400 includes a spatial bandpass filter module 1402 that applies a spatial bandpass filter to the identified pixel values 1106, generating spatial bandpass filtered identified pixel values of skin 1404. In some implementations, the spatial bandpass filter module 1402 includes a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. The spatial bandpass filter module 1402 performs block 1802 in FIG. 18.
[00187] In some implementations, apparatus 1400 includes a spatial-cluster module 1112 that applies spatial clustering to the spatial bandpass filtered identified pixel values of skin 1404, generating spatial clustered spatial bandpass filtered identified pixel values of skin 1406. In some implementations the spatial clustering includes fuzzy clustering, k-means clustering, expectation-maximization process, Ward's method or seed point based clustering. The spatial-cluster module 1112 performs block 1804 in FIG. 18.
[00188] In some implementations, apparatus 1400 includes a temporal bandpass filter module 1408 that applies a temporal bandpass filter to the spatial clustered spatial bandpass filtered identified pixel values of skin 1406, generating temporal bandpass filtered spatial clustered spatial bandpass filtered identified pixel values of skin 1410. In some implementations, the temporal bandpass filter is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. The temporal bandpass filter module 1408 performs block 1806 in FIG. 18.
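Temporal bandpass filtering of the kind performed by module 1408 can be sketched per pixel with a one-dimensional FFT along the time axis of the frame stack. This is an illustrative sketch only; the passband (1-3 Hz) and the synthetic frames are assumptions, not values from the specification.

```python
import numpy as np

def temporal_bandpass(frames, fs, f_lo, f_hi):
    """Bandpass a stack of frames along the time axis via the 1-D FFT.

    `frames` has shape (T, H, W); each pixel's time series is filtered to
    keep only frequencies in [f_lo, f_hi] Hz. `fs` is the frame rate."""
    spectrum = np.fft.fft(frames, axis=0)                     # per-pixel temporal spectrum
    freqs = np.fft.fftfreq(frames.shape[0], d=1.0 / fs)       # frequency of each bin in Hz
    mask = (np.abs(freqs) >= f_lo) & (np.abs(freqs) <= f_hi)  # bins inside the passband
    spectrum[~mask] = 0.0                                     # zero everything else
    return np.real(np.fft.ifft(spectrum, axis=0))             # back to the time domain

# 90 frames at 30 frames/s: a 2 Hz pulsation plus a slow 1/3 Hz drift
fs = 30.0
t = np.arange(90) / fs
frames = (np.sin(2 * np.pi * 2.0 * t)[:, None, None]
          + 5.0 * np.sin(2 * np.pi * t / 3.0)[:, None, None]) * np.ones((1, 4, 4))
passed = temporal_bandpass(frames, fs, f_lo=1.0, f_hi=3.0)
```

After filtering, only the 2 Hz pulsation survives; the slow drift outside the passband is removed from every pixel's time series.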
[00189] In some implementations, apparatus 1400 includes a temporal-variation module 1120 that determines temporal variation 1522 of the temporal bandpass filtered spatial clustered spatial bandpass filtered identified pixel values of skin 1410. Thus, temporal variation 1522 represents temporal variation of the images 704. The temporal-variation module 1120 performs block 1808 of FIG. 18. The temporal-variation module 1120 is operably coupled to one or more modules in FIG. 12 to generate and present any one of a number of biological vital signs from amplified motion in the temporal variation 1522.
[00190] FIG. 15 is a block diagram of an apparatus 1500 of motion amplification, according to an implementation.
[00191] In some implementations, apparatus 1500 includes a pixel-examination module 1502 that examines pixel values of two or more images 704, generating examined pixel values 1504. The pixel-examination module 1502 performs block 1902 in FIG. 19.
[00192] In some implementations, apparatus 1500 includes a temporal variation determiner module 1506 that determines a temporal variation 1508 of the examined pixel values 1504. The temporal variation determiner module 1506 performs block 1904 in FIG. 19.
[00193] In some implementations, apparatus 1500 includes a signal-processing module 1510 that applies signal processing to the pixel value temporal variations 1508, generating an amplified temporal variation 1522. The signal-processing module 1510 performs block 1906 in FIG. 19. The signal processing amplifies the temporal variation 1508, even when the temporal variation 1508 is small. In some implementations, the signal processing performed by signal-processing module 1510 is temporal bandpass filtering that analyzes frequencies over time. In some implementations, the signal processing performed by signal-processing module 1510 is spatial processing that removes noise. Apparatus 1500 amplifies only small temporal variations in the signal-processing module.
[00194] While apparatus 1500 can process large temporal variations, an advantage in apparatus 1500 is provided for small temporal variations. Therefore apparatus 1500 is most effective when the two or more images 704 have small temporal variations between the two or more images 704. In some implementations, a vital sign is generated from the amplified temporal variations of the two or more images 704 from the signal-processing module 1510.
Vital Sign Motion Amplification Method Implementations [00195] FIGS. 16-20 each use spatial and temporal signal processing to generate vital signs from a series of digital images.
[00196] FIG. 16 is a flowchart of a method 1600 of motion amplification, according to an implementation. Method 1600 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00197] In some implementations, method 1600 includes identifying pixel values of two or more images that are representative of the skin, at block 1602.
Some implementations of identifying pixel values that are representative of the skin include performing an automatic seed point based clustering process on the at least two images.
[00198] In some implementations, method 1600 includes applying a frequency filter to the identified pixel values that are representative of the skin, at block 1604. In some implementations, the frequency filter in block 1604 is a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
[00199] In some implementations, method 1600 includes applying spatial clustering to the frequency filtered identified pixel values of skin, at block 1606.
In some implementations the spatial clustering includes fuzzy clustering, k-means clustering, expectation-maximization process, Ward's method or seed point based clustering.
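One of the clustering options listed above, seed point based k-means clustering, can be sketched as follows for separating skin-tone pixels from background pixels. The seed choice, cluster means and noise level are assumptions for illustration; the patented procedure itself may differ.

```python
import numpy as np

def seeded_kmeans(points, seeds, iters=10):
    """Minimal seed point based k-means sketch: `seeds` are the initial
    centroids; returns (centroids, labels)."""
    centroids = seeds.astype(float).copy()
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

# Two synthetic colour clusters standing in for "skin" and "background" pixels
rng = np.random.default_rng(1)
skin = rng.normal([200.0, 150.0, 130.0], 5.0, (100, 3))   # assumed skin-tone RGB values
background = rng.normal([40.0, 40.0, 40.0], 5.0, (100, 3))  # dark background
pixels = np.vstack([skin, background])
# seed with one point from each region, per the "seed point based" variant
centroids, labels = seeded_kmeans(pixels, seeds=pixels[[0, -1]])
```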
[00200] In some implementations, method 1600 includes applying a frequency filter to the spatial clustered frequency filtered identified pixel values of skin, at block 1608. In some implementations, the frequency filter in block 1608 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter. Some implementations of applying a frequency filter at block 1608 include de-noising (e.g. smoothing of the data with a Gaussian filter).
[00201] Actions 1602, 1604, 1606 and 1608 amplify temporal variations in the two or more images.
[00202] In some implementations, method 1600 includes determining temporal variation of the frequency filtered spatial clustered frequency filtered identified pixel values of skin, at block 1610.
[00203] In some implementations, method 1600 includes analyzing the temporal variation to generate a pattern of flow of blood, at block 1612. In some implementations, the pattern of flow of blood is generated from motion changes in the pixels and the temporal variation of color changes in the skin. In some implementations, method 1600 includes displaying the pattern of flow of blood for review by a healthcare worker, at block 1613.
[00204] In some implementations, method 1600 includes analyzing the temporal variation to generate heartrate, at block 1614. In some implementations, the heartrate is generated from the frequency spectrum of the temporal variation in a frequency range for heart beats, such as 0-10 Hertz. In some implementations, method 1600 includes displaying the heartrate for review by a healthcare worker, at block 1615.
[00205] In some implementations, method 1600 includes analyzing the temporal variation to determine respiratory rate, at block 1616. In some implementations, the respiratory rate is generated from the motion of the pixels in a frequency range for respiration (0-5 Hertz). In some implementations, method 1600 includes displaying the respiratory rate for review by a healthcare worker, at block 1617.
[00206] In some implementations, method 1600 includes analyzing the temporal variation to generate blood pressure, at block 1618. In some implementations, the blood pressure is generated by analyzing the motion of the pixels and the color changes based on the clustering process and potentially temporal data from the infrared sensor. In some implementations, method 1600 includes displaying the blood pressure for review by a healthcare worker, at block 1619.
[00207] In some implementations, method 1600 includes analyzing the temporal variation to generate EKG, at block 1620. In some implementations, method 1600 includes displaying the EKG for review by a healthcare worker, at block 1621.
[00208] In some implementations, method 1600 includes analyzing the temporal variation to generate pulse oximetry, at block 1622. In some implementations, the pulse oximetry is generated by analyzing the temporal color changes in conjunction with the k-means clustering process and potentially temporal data from the infrared sensor. In some implementations, method 1600 includes displaying the pulse oximetry for review by a healthcare worker, at block 1623. [00209] FIG. 17 is a flowchart of a method 1700 of motion amplification, according to an implementation that does not include a separate action of determining a temporal variation. Method 1700 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate biological vital signs.
[00210] In some implementations, method 1700 includes identifying pixel values of two or more images that are representative of the skin, at block 1602.
Some implementations of identifying pixel values that are representative of the skin include performing an automatic seed point based clustering process on the at least two images.
[00211] In some implementations, method 1700 includes applying a frequency filter to the identified pixel values that are representative of the skin, at block 1604. In some implementations, the frequency filter in block 1604 is a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
[00212] In some implementations, method 1700 includes applying spatial clustering to the frequency filtered identified pixel values of skin, at block 1606.
In some implementations the spatial clustering includes fuzzy clustering, k-means clustering, expectation-maximization process, Ward's method or seed point based clustering.
[00213] In some implementations, method 1700 includes applying a frequency filter to the spatial clustered frequency filtered identified pixel values of skin, at block 1608, yielding a temporal variation. In some implementations, the frequency filter in block 1608 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
[00214] In some implementations, method 1700 includes analyzing the temporal variation to generate a pattern of flow of blood, at block 1612. In some implementations, the pattern of flow of blood is generated from motion changes in the pixels and the temporal variation of color changes in the skin. In some implementations, method 1700 includes displaying the pattern of flow of blood for review by a healthcare worker, at block 1613.
[00215] In some implementations, method 1700 includes analyzing the temporal variation to generate heartrate, at block 1614. In some implementations, the heartrate is generated from the frequency spectrum of the temporal variation in a frequency range for heart beats, such as (0-10 Hertz). In some implementations, method 1700 includes displaying the heartrate for review by a healthcare worker, at block 1615.
[00216] In some implementations, method 1700 includes analyzing the temporal variation to determine respiratory rate, at block 1616. In some implementations, the respiratory rate is generated from the motion of the pixels in a frequency range for respiration (0-5 Hertz). In some implementations, method 1700 includes displaying the respiratory rate for review by a healthcare worker, at block 1617.
[00217] In some implementations, method 1700 includes analyzing the temporal variation to generate blood pressure, at block 1618. In some implementations, the blood pressure is generated by analyzing the motion of the pixels and the color changes based on the clustering process and potentially temporal data from the infrared sensor. In some implementations, method 1700 includes displaying the blood pressure for review by a healthcare worker, at block 1619.
[00218] In some implementations, method 1700 includes analyzing the temporal variation to generate EKG, at block 1620. In some implementations, method 1700 includes displaying the EKG for review by a healthcare worker, at block 1621.
[00219] In some implementations, method 1700 includes analyzing the temporal variation to generate pulse oximetry, at block 1622. In some implementations, the pulse oximetry is generated by analyzing the temporal color changes in conjunction with the k-means clustering process and potentially temporal data from the infrared sensor. In some implementations, method 1700 includes displaying the pulse oximetry for review by a healthcare worker, at block 1623.
[00220] FIG. 18 is a flowchart of a method 1800 of motion amplification from which to generate and communicate biological vital signs, according to an implementation. Method 1800 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate the biological vital signs.
[00221] In some implementations, method 1800 includes identifying pixel values of two or more images that are representative of the skin, at block 1602.
Some implementations of identifying pixel values that are representative of the skin include performing an automatic seed point based clustering process on the at least two images.
[00222] In some implementations, method 1800 includes applying a spatial bandpass filter to the identified pixel values, at block 1802. In some implementations, the spatial filter in block 1802 is a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
[00223] In some implementations, method 1800 includes applying spatial clustering to the spatial bandpass filtered identified pixel values of skin, at block 1804. In some implementations the spatial clustering includes fuzzy clustering, k-means clustering, expectation-maximization process, Ward's method or seed point based clustering.
[00224] In some implementations, method 1800 includes applying a temporal bandpass filter to the spatial clustered spatial bandpass filtered identified pixel values of skin, at block 1806. In some implementations, the temporal bandpass filter in block 1806 is a one-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
[00225] In some implementations, method 1800 includes determining temporal variation of the temporal bandpass filtered spatial clustered spatial bandpass filtered identified pixel values of skin, at block 1808.
[00226] In some implementations, method 1800 includes analyzing the temporal variation to generate and visually display a pattern of flow of blood, at block 1612. In some implementations, the pattern of flow of blood is generated from motion changes in the pixels and the temporal variation of color changes in the skin. In some implementations, method 1800 includes displaying the pattern of flow of blood for review by a healthcare worker, at block 1613.
[00227] In some implementations, method 1800 includes analyzing the temporal variation to generate heartrate, at block 1614. In some implementations, the heartrate is generated from the frequency spectrum of the temporal variation in a frequency range for heart beats, such as 0-10 Hertz. In some implementations, method 1800 includes displaying the heartrate for review by a healthcare worker, at block 1615.
[00228] In some implementations, method 1800 includes analyzing the temporal variation to determine respiratory rate, at block 1616. In some implementations, the respiratory rate is generated from the motion of the pixels in a frequency range for respiration (0-5 Hertz). In some implementations, method 1800 includes displaying the respiratory rate for review by a healthcare worker, at block 1617.
[00229] In some implementations, method 1800 includes analyzing the temporal variation to generate blood pressure, at block 1618. In some implementations, the blood pressure is generated by analyzing the motion of the pixels and the color changes based on the clustering process and potentially temporal data from the infrared sensor. In some implementations, method 1800 includes displaying the blood pressure for review by a healthcare worker, at block 1619.
[00230] In some implementations, method 1800 includes analyzing the temporal variation to generate EKG, at block 1620. In some implementations, method 1800 includes displaying the EKG for review by a healthcare worker, at block 1621.
[00231] In some implementations, method 1800 includes analyzing the temporal variation to generate pulse oximetry, at block 1622. In some implementations, the pulse oximetry is generated by analyzing the temporal color changes based in conjunction with the k-means clustering process and potentially temporal data from the infrared sensor. In some implementations, method 1800 includes displaying the pulse oximetry for review by a healthcare worker, at block 1623.
[00232] FIG. 19 is a flowchart of a method 1900 of motion amplification, according to an implementation. Method 1900 displays temporal variations in videos that are difficult or impossible to see with the naked eye. Method 1900 applies spatial decomposition to a video, and applies temporal filtering to the frames. The resulting signal is then amplified to reveal hidden information. Method 1900 can visualize flow of blood filling a face in the video and also amplify and reveal small motions, and other vital signs such as blood pressure, respiration, EKG and pulse. Method 1900 can execute in real time to show phenomena occurring at temporal frequencies selected by the operator. A combination of spatial and temporal processing of videos can amplify subtle variations that reveal important aspects of the world. Method 1900 considers a time series of color values at any spatial location (e.g., a pixel) and amplifies variation in a given temporal frequency band of interest. For example, method 1900 selects and then amplifies a band of temporal frequencies including plausible human heart rates. The amplification reveals the variation of redness as blood flows through the face. For this application, lower spatial frequencies are temporally filtered (spatial pooling) to allow a subtle input signal to rise above the camera sensor and quantization noise. The temporal filtering approach not only amplifies color variation, but can also reveal low-amplitude motion.
[00233] Method 1900 can enhance the subtle motions around the chest of a breathing baby. The mathematical analysis of method 1900 employs a linear approximation related to the brightness constancy assumption used in optical flow formulations. Method 1900 also derives the conditions under which this approximation holds. This leads to a multiscale approach to magnify motion without feature tracking or motion estimation. Properties of a voxel of fluid are observed, such as pressure and velocity, which evolve over time. Method 1900 studies and amplifies the variation of pixel values over time, in a spatially-multiscale manner. This approach to motion magnification does not explicitly estimate motion, but rather exaggerates motion by amplifying temporal color changes at fixed positions. Method 1900 employs differential approximations that form the basis of optical flow processes. Method 1900 described herein employs localized spatial pooling and bandpass filtering to extract and reveal visually the signal corresponding to the pulse. This primal domain analysis allows amplification and visualization of the pulse signal at each location on the face.
Asymmetry in facial blood flow can be a symptom of arterial problems.
[00234] Method 1900 described herein makes imperceptible motions visible using a multiscale approach. Method 1900 amplifies small motions, in one embodiment. Nearly invisible changes in a dynamic environment can be revealed through spatio-temporal processing of standard monocular video sequences.
Moreover, for a range of amplification values that is suitable for various applications, explicit motion estimation is not required to amplify motion in natural videos. Method 1900 is well suited to small displacements and lower spatial frequencies. A single framework can amplify both spatial motion and purely temporal changes (e.g., a heart pulse) and can be adjusted to amplify particular temporal frequencies. A spatial decomposition module decomposes the input video into different spatial frequency bands, then applies the same temporal filter to the spatial frequency bands. The outputted filtered spatial bands are then amplified by an amplification factor, added back to the original signal by adders, and collapsed by a reconstruction module to generate the output video. The temporal filter and amplification factors can be tuned to support different applications. For example, the system can reveal unseen motions of a camera, caused by the flipping mirror during a photo burst. [00235] Method 1900 combines spatial and temporal processing to emphasize subtle temporal changes in a video. Method 1900 decomposes the video sequence into different spatial frequency bands. These bands might be magnified differently because (a) the bands might exhibit different signal-to-noise ratios or (b) the bands might contain spatial frequencies for which the linear approximation used in motion magnification does not hold. In the latter case, method 1900 reduces the amplification for these bands to suppress artifacts.
When the goal of spatial processing is to increase temporal signal-to-noise ratio by pooling multiple pixels, the method spatially low-pass filters the frames of the video and downsamples them for computational efficiency. In the general case, however, method 1900 computes a full Laplacian pyramid.
[00236] Method 1900 then performs temporal processing on each spatial band. Method 1900 considers the time series corresponding to the value of a pixel in a frequency band and applies a bandpass filter to extract the frequency bands of interest. As one example, method 1900 may select frequencies within the range of 0.4-4 Hz, corresponding to 24-240 beats per minute, if the operator wants to magnify a pulse. If method 1900 extracts the pulse rate, then method 1900 can employ a narrow frequency band around that value. The temporal processing is uniform for all spatial levels and for all pixels within each level. Method 1900 then multiplies the extracted bandpassed signal by a magnification factor alpha.
This factor can be specified by the operator, and may be attenuated automatically. Method 1900 adds the magnified signal to the original signal and collapses the spatial pyramid to obtain the final output. Since natural videos are spatially and temporally smooth, and since the filtering is performed uniformly over the pixels, the method implicitly maintains spatiotemporal coherency of the results. The motion magnification amplifies small motion without tracking motion. Temporal processing produces motion magnification, shown using an analysis that relies on the first-order Taylor series expansions common in optical flow analyses.
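The bandpass-amplify-add-back step described above can be sketched as follows. For brevity the sketch operates on a single spatial band at full resolution rather than a Laplacian pyramid, and the flicker amplitude, passband and magnification factor alpha are illustrative assumptions.

```python
import numpy as np

def magnify(frames, fs, f_lo, f_hi, alpha):
    """Eulerian-style magnification sketch: temporally bandpass each pixel's
    time series, scale it by alpha, and add it back to the original video.

    `frames` has shape (T, H, W); `fs` is the frame rate in Hz."""
    spectrum = np.fft.fft(frames, axis=0)                     # per-pixel temporal spectrum
    freqs = np.fft.fftfreq(frames.shape[0], d=1.0 / fs)
    mask = (np.abs(freqs) >= f_lo) & (np.abs(freqs) <= f_hi)  # passband of interest
    spectrum[~mask] = 0.0
    bandpassed = np.real(np.fft.ifft(spectrum, axis=0))       # extracted subtle signal
    return frames + alpha * bandpassed                        # add magnified signal back

# A barely visible 1 Hz flicker (amplitude 0.002) magnified 50x
fs = 30.0
t = np.arange(60) / fs                                        # 2 seconds of video
frames = 0.5 + 0.002 * np.sin(2 * np.pi * 1.0 * t)[:, None, None] * np.ones((1, 8, 8))
out = magnify(frames, fs, f_lo=0.5, f_hi=2.0, alpha=50.0)
```

Because the DC component lies outside the passband, the mean brightness is preserved while the in-band variation is scaled by a factor of 1 + alpha.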
[00237] Method 1900 begins with a pixel-examination module in the microprocessor 102 of the non-touch thermometer 100, 200 or 300 examining pixel values of two or more images 704 from the camera 122, at block 1902.
[00238] Method 1900 thereafter determines the temporal variation of the examined pixel values, at block 1904 by a temporal-variation module in the microprocessor 102.
[00239] A signal-processing module in the microprocessor 102 applies signal processing to the pixel value temporal variations, at block 1906. Signal processing amplifies the determined temporal variations, even when the temporal variations are small. Method 1900 amplifies only small temporal variations in the signal-processing module. While method 1900 can be applied to large temporal variations, an advantage in method 1900 is provided for small temporal variations. Therefore method 1900 is most effective when the input images 704 have small temporal variations between the images 704. In some implementations, the signal processing at block 1906 is temporal bandpass filtering that analyzes frequencies over time. In some implementations, the signal processing at block 1906 is spatial processing that removes noise.
[0240] In some implementations, a vital sign is generated from the amplified temporal variations of the input images 704 from the signal processor at block 1908. Examples of generating a vital sign from a temporal variation include actions 1612, 1614, 1616, 1618, 1620 and 1622 in FIG. 16, 17 and 18.
[0241] FIG. 20 is a flowchart of a method 2000 of motion amplification from which to generate and communicate biological vital signs, according to an implementation. Method 2000 analyzes the temporal and spatial variations in digital images of an animal subject in order to generate and communicate the biological vital signs.
[0242] In some implementations, method 2000 includes cropping at least two images to exclude areas that do not include a skin region, at block 2002. For example, the excluded area can be a perimeter area around the center of each image, so that an outside border area of the image is excluded. In some implementations of cropping out the border, about 72% of the width and about 72% of the height of each image is cropped out, leaving only 7.8% of the original uncropped image, which eliminates about 11/12 of each image and reduces the amount of processing time for the remainder of the actions in this process by about 12-fold. This one action alone at block 2002 in method 2000 can reduce the processing time of the plurality of images 124 in comparison to method 1800 from 4 minutes to 30 seconds, which is a significant difference for the health workers who use devices that implement method 2000. In some implementations, the remaining area of the image after cropping is a square area, and in other implementations the remaining area after cropping is a circular area. Depending upon the topography and shape of the area in the images that has the most pertinent portion of the imaged subject, different geometries and sizes are most beneficial. The action of cropping the images at block 2002 can be applied at the beginning of methods 1600, 1700, 1800 and 1900 in FIG. 16, 17, 18 and 19, respectively. In other implementations of apparatus 700, 800, 900, 1000, 1100, 1200, 1300, 1400 and 1500, a cropper module that performs action 2002 is placed at the beginning of the modules to greatly decrease processing time of the apparatus.
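The central-crop action at block 2002 can be sketched as follows: keeping 28% of each dimension leaves 0.28² ≈ 7.8% of the pixels, matching the figures above. The function name and default fraction are illustrative assumptions.

```python
import numpy as np

def crop_center(image, keep_frac=0.28):
    """Keep the central keep_frac of the width and height, discarding
    the perimeter border (block 2002). Cropping out 72% of each
    dimension leaves about 7.8% of the pixels to process."""
    h, w = image.shape[:2]
    kh, kw = int(round(h * keep_frac)), int(round(w * keep_frac))
    top, left = (h - kh) // 2, (w - kw) // 2
    return image[top:top + kh, left:left + kw]

img = np.zeros((480, 640, 3), dtype=np.uint8)   # a VGA frame
small = crop_center(img)                         # central 134 x 179 region
```

Because downstream cost scales with pixel count, processing the cropped frame is roughly 12 times cheaper than processing the full frame.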
[0243] In some implementations, method 2000 includes identifying pixel values of the at least two or more cropped images that are representative of the skin, at block 2004. Some implementations of identifying pixel values that are representative of the skin include performing an automatic seed point based clustering process on the at least two images.
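The specification does not spell out the seed point based clustering process, so the sketch below substitutes a common fixed-threshold Cb/Cr chrominance heuristic to show the shape of a skin-pixel identification step; the conversion coefficients and threshold ranges are assumptions, not the patent's method.

```python
import numpy as np

def skin_mask(rgb):
    """Rough per-pixel skin mask using fixed Cb/Cr chrominance ranges,
    a common heuristic standing in for the seed point based clustering
    of block 2004. rgb: uint8 array of shape (H, W, 3)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    # RGB -> YCbCr chrominance (ITU-R BT.601 coefficients)
    cb = 128 - 0.1687 * r - 0.3313 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.4187 * g - 0.0813 * b
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)

# A skin-tone pixel versus a pure-green pixel.
mask = skin_mask(np.array([[[200, 140, 110], [0, 255, 0]]], dtype=np.uint8))
```

The resulting boolean mask selects which pixels feed the later spatial and temporal filtering stages.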
[0244] In some implementations, method 2000 includes applying a spatial bandpass filter to the identified pixel values, at block 1802. In some implementations, the spatial filter in block 1802 is a two-dimensional spatial Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
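One simple realization of a spatial bandpass filter is a difference of Gaussians, which keeps structure between two smoothing scales. This sketch is an illustrative stand-in for the options listed above; the sigma values are assumptions.

```python
import numpy as np

def spatial_bandpass(image, sigma_lo=1.0, sigma_hi=4.0):
    """Difference-of-Gaussians spatial bandpass (block 1802): subtract a
    coarse blur from a fine blur, keeping detail between the two scales.
    image: 2-D float array."""
    def blur(img, sigma):
        # Separable Gaussian blur via 1-D convolutions.
        radius = int(3 * sigma)
        x = np.arange(-radius, radius + 1)
        k = np.exp(-x**2 / (2 * sigma**2))
        k /= k.sum()
        img = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
        return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, img)
    return blur(image, sigma_lo) - blur(image, sigma_hi)

# A perfectly flat image has no mid-frequency structure to pass.
band = spatial_bandpass(np.full((64, 64), 5.0))
```

The DC component cancels in the subtraction, so uniform regions map to zero away from the image borders.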
[0245] In some implementations, method 2000 includes applying spatial clustering to the spatial bandpass filtered identified pixel values of skin, at block 1804. In some implementations the spatial clustering includes fuzzy clustering, k-means clustering, an expectation-maximization process, Ward's method or seed point based clustering.
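Of the clustering options listed, k-means is the simplest to sketch. The deterministic spread initialization below is an illustrative choice for reproducibility, not the patent's process.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Plain k-means, one of the spatial clustering options of block 1804.
    points: (N, D) array; returns (centroids, labels)."""
    # Deterministic spread initialisation (illustrative choice).
    idx = np.linspace(0, len(points) - 1, k).astype(int)
    centroids = points[idx].astype(float)
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated pixel-coordinate blobs resolve into two clusters.
pts = np.vstack([np.zeros((10, 2)), np.full((10, 2), 10.0)])
centroids, labels = kmeans(pts, k=2)
```

In this context the points would be the coordinates (or coordinate-plus-value vectors) of the bandpass-filtered skin pixels, grouped into spatial regions.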
[0246] In some implementations, method 2000 includes applying a temporal bandpass filter to the spatial clustered spatial bandpass filtered identified pixel values of skin, at block 1806. In some implementations, the temporal bandpass filter in block 1806 is a one-dimensional temporal Fourier Transform, a high pass filter, a low pass filter, a bandpass filter or a weighted bandpass filter.
[0247] In some implementations, method 2000 includes determining temporal variation of the temporal bandpass filtered spatial clustered spatial bandpass filtered identified pixel values of skin, at block 1808.
[0248] In some implementations, method 2000 includes analyzing the temporal variation to generate and visually display a pattern of flow of blood, at block 1612. In some implementations, the pattern of flow of blood is generated from motion changes in the pixels and the temporal variation of color changes in the skin. In some implementations, method 2000 includes displaying the pattern of flow of blood for review by a healthcare worker, at block 1613.
[0249] In some implementations, method 2000 includes analyzing the temporal variation to generate heartrate, at block 1614. In some implementations, the heartrate is generated from the frequency spectrum of the temporal variation in a frequency range for heart beats, such as (0-10 Hertz). In some implementations, method 2000 includes displaying the heartrate for review by a healthcare worker, at block 1615.
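Generating heartrate from the frequency spectrum of the temporal variation can be sketched as picking the dominant spectral peak in a plausible cardiac band; the 0.75-4 Hz window (45-240 beats per minute) is an illustrative narrowing of the 0-10 Hz range above, and the function name is an assumption.

```python
import numpy as np

def heart_rate_bpm(signal, fps, f_lo=0.75, f_hi=4.0):
    """Estimate heart rate (beats per minute) as the dominant frequency
    of a skin-region temporal variation signal (block 1614)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return 60.0 * freqs[band][spectrum[band].argmax()]

# A 1.2 Hz photoplethysmographic oscillation corresponds to 72 beats/min.
fps = 30.0
t = np.arange(300) / fps
bpm = heart_rate_bpm(np.sin(2 * np.pi * 1.2 * t), fps)
```

The same peak-picking idea, with a lower frequency band, applies to the respiratory rate determination at block 1616.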
[0250] In some implementations, method 2000 includes analyzing the temporal variation to determine respiratory rate, at block 1616. In some implementations, the respiratory rate is generated from the motion of the pixels in a frequency range for respiration (0-5 Hertz). In some implementations, method 2000 includes displaying the respiratory rate for review by a healthcare worker, at block 1617.
[0251] In some implementations, method 2000 includes analyzing the temporal variation to generate blood pressure, at block 1618. In some implementations, the blood pressure is generated by analyzing the motion of the pixels and the color changes based on the clustering process and potentially temporal data from the infrared sensor. In some implementations, method 2000 includes displaying the blood pressure for review by a healthcare worker, at block 1619.
[0252] In some implementations, method 2000 includes analyzing the temporal variation to generate EKG, at block 1620. In some implementations, method 2000 includes displaying the EKG for review by a healthcare worker, at block 1621.
[0253] In some implementations, method 2000 includes analyzing the temporal variation to generate pulse oximetry, at block 1622. In some implementations, the pulse oximetry is generated by analyzing the temporal color changes in conjunction with the k-means clustering process and potentially temporal data from the infrared sensor. In some implementations, method 2000 includes displaying the pulse oximetry for review by a healthcare worker, at block 1623.
[0254] In some implementations, methods 1600-2000 are implemented as a sequence of instructions which, when executed by a microprocessor 102 in FIG. 1-3, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, cause the processor to perform the respective method. In other implementations, methods 1600-2000 are implemented as a computer-accessible medium having computer executable instructions capable of directing a microprocessor, such as microprocessor 102 in FIG. 1-3, microprocessor 2104 in FIG. 21, main processor 2202 in FIG. 22 or processing unit 2304 in FIG. 23, to perform the respective method. In different implementations, the medium is a magnetic medium, an electronic medium, or an optical medium.
Hardware and Operating Environment
[0255] FIG. 21 is a schematic of a non-touch thermometer 2100 having a digital IR sensor, according to an implementation. As discussed above in regards to FIG. 2, thermal isolation of the digital IR sensor is an important feature. In second circuit board 2101, a digital IR sensor 2103 is thermally isolated from the heat of the main processor 2104 through a first digital interface 2102. The digital IR sensor 2103 is not mounted on the same circuit board 2105 as the main processor 2104, which reduces heat transfer from the first circuit board 2105 to the digital IR sensor 2103. A device 2100 includes a first circuit board 2105, the first circuit board 2105 including the microprocessor 2104, a battery 2106 that is operably coupled to the microprocessor 2104, a display device that is operably coupled to the microprocessor 2104 through a display interface 2108, a single button 2110 that is operably coupled to the microprocessor 2104, and a first digital interface 2102 that is operably coupled to the microprocessor 2104. The device 2100 also includes a second circuit board 2101, the second circuit board 2101 including a second digital interface 2112, the second interface 2112 being operably coupled to the first digital interface 2102 and a digital infrared sensor 2103 being operably coupled to the second interface 2112, the digital infrared sensor 2103 having ports that provide only digital readout. The microprocessor 2104 is operable to receive, from the ports that provide only digital readout, a digital signal that is representative of an infrared signal generated by the digital infrared sensor 2103, and the microprocessor 2104 is operable to determine a temperature from the digital signal that is representative of the infrared signal.
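The final step, determining a temperature from the digital readout, depends on the sensor's raw-to-temperature scaling, which the specification does not give. The sketch below assumes the 0.02 K-per-count convention used by several commercial digital IR sensors; the function name and scale factor are assumptions.

```python
def ir_raw_to_celsius(raw, scale=0.02):
    """Convert a digital IR sensor's raw reading to degrees Celsius.
    Assumes a 0.02 K-per-count scaling (an assumption; the actual
    scaling of digital infrared sensor 2103 is not specified)."""
    return raw * scale - 273.15

reading = ir_raw_to_celsius(15508)   # a raw word near body temperature
```

Under this convention a raw word of 15508 corresponds to approximately 37.0 °C, so the microprocessor needs only a multiply and a subtract, with no analog-to-digital conversion of its own.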
[0256] FIG. 22 is a block diagram of a mobile device 2200, according to an implementation. The mobile device 2200 may also have the capability to allow voice communication. Depending on the functionality provided by the mobile device, it may be referred to as a data messaging device, a two-way pager, a cellular telephone with data messaging capabilities, a wireless Internet appliance, or a data communication device (with or without telephony capabilities).
[00257] The mobile device 2200 includes a number of modules such as a main processor 2202 that controls the overall operation of the mobile device 2200.
Communication functions, including data and voice communications, are performed through a communication subsystem 2204. The communication subsystem 2204 receives messages from and sends messages to wireless networks 2205. In other implementations of the mobile device 2200, the communication subsystem 2204 can be configured in accordance with the Global System for Mobile Communication (GSM), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), Universal Mobile Telecommunications Service (UMTS), data-centric wireless networks, voice-centric wireless networks, and dual-mode networks that can support both voice and data communications over the same physical base stations. Combined dual-mode networks include, but are not limited to, Code Division Multiple Access (CDMA) or CDMA2000 networks, GSM/GPRS networks (as mentioned above), and future third-generation (3G) networks like EDGE and UMTS. Some other examples of data-centric networks include Mobitex™ and DataTAC™ network communication systems. Examples of other voice-centric data networks include Personal Communication Systems (PCS) networks like GSM and Time Division Multiple Access (TDMA) systems.
[0258] The wireless link connecting the communication subsystem 2204 with the wireless network 2205 represents one or more different Radio Frequency (RF) channels. With newer network protocols, these channels are capable of supporting both circuit switched voice communications and packet switched data communications.
[0259] The main processor 2202 also interacts with additional subsystems such as a Random Access Memory (RAM) 2206, a flash memory 2208, a display 2210, an auxiliary input/output (I/O) subsystem 2212, a data port 2214, a keyboard 2216, a speaker 2218, a microphone 2220, short-range communications 2222 and other device subsystems 2224. In some implementations, the flash memory 2208 includes a hybrid femtocell/Wi-Fi protocol stack 2209. The stack 2209 supports authentication and authorization between the mobile device 2200 and a shared Wi-Fi network and both 3G and 4G mobile networks.
[0260] Some of the subsystems of the mobile device 2200 perform communication-related functions, whereas other subsystems may provide "resident" or on-device functions. By way of example, the display 2210 and the keyboard 2216 may be used for both communication-related functions, such as entering a text message for transmission over the wireless network 2205, and device-resident functions such as a calculator or task list.
[0261] The mobile device 2200 can transmit and receive communication signals over the wireless network 2205 after required network registration or activation procedures have been completed. Network access is associated with a subscriber or user of the mobile device 2200. To identify a subscriber, the mobile device 2200 requires a SIM/RUIM card 2226 (i.e. Subscriber Identity Module or a Removable User Identity Module) to be inserted into a SIM/RUIM interface 2228 in order to communicate with a network. The SIM card or RUIM 2226 is one type of a conventional "smart card" that can be used to identify a subscriber of the mobile device 2200 and to personalize the mobile device 2200, among other things. Without the SIM card 2226, the mobile device 2200 is not fully operational for communication with the wireless network 2205. By inserting the SIM card/RUIM 2226 into the SIM/RUIM interface 2228, a subscriber can access all subscribed services. Services may include: web browsing and messaging such as e-mail, voice mail, Short Message Service (SMS), and Multimedia Messaging Services (MMS). More advanced services may include: point of sale, field service and sales force automation. The SIM card/RUIM 2226 includes a processor and memory for storing information. Once the SIM card/RUIM 2226 is inserted into the SIM/RUIM interface 2228, it is coupled to the main processor 2202. In order to identify the subscriber, the SIM card/RUIM 2226 can include some user parameters such as an International Mobile Subscriber Identity (IMSI). An advantage of using the SIM card/RUIM 2226 is that a subscriber is not necessarily bound by any single physical mobile device. The SIM card/RUIM 2226 may store additional subscriber information for a mobile device as well, including datebook (or calendar) information and recent call information. Alternatively, user identification information can also be programmed into the flash memory 2208.
[00262] The mobile device 2200 is a battery-powered device and includes a battery interface 2232 for receiving one or more rechargeable batteries 2230. In one or more implementations, the battery 2230 can be a smart battery with an embedded microprocessor. The battery interface 2232 is coupled to a regulator 2233, which assists the battery 2230 in providing power V+ to the mobile device 2200. Although current technology makes use of a battery, future technologies such as micro fuel cells may provide the power to the mobile device 2200.
[00263] The mobile device 2200 also includes an operating system 2234 and modules 2236 to 2249 which are described in more detail below. The operating system 2234 and the modules 2236 to 2249 that are executed by the main processor 2202 are typically stored in a persistent nonvolatile medium such as the flash memory 2208, which may alternatively be a read-only memory (ROM) or similar storage element (not shown). Those skilled in the art will appreciate that portions of the operating system 2234 and the modules 2236 to 2249, such as specific device applications, or parts thereof, may be temporarily loaded into a volatile store such as the RAM 2206. Other modules can also be included.
[0264] The subset of modules 2236 that control basic device operations, including data and voice communication applications, will normally be installed on the mobile device 2200 during its manufacture. Other modules include a message application 2238 that can be any suitable module that allows a user of the mobile device 2200 to transmit and receive electronic messages. Various alternatives exist for the message application 2238 as is well known to those skilled in the art. Messages that have been sent or received by the user are typically stored in the flash memory 2208 of the mobile device 2200 or some other suitable storage element in the mobile device 2200. In one or more implementations, some of the sent and received messages may be stored remotely from the device 2200 such as in a data store of an associated host system with which the mobile device 2200 communicates.
[0265] The modules can further include a device state module 2240, a Personal Information Manager (PIM) 2242, and other suitable modules (not shown). The device state module 2240 provides persistence, i.e. the device state module 2240 ensures that important device data is stored in persistent memory, such as the flash memory 2208, so that the data is not lost when the mobile device 2200 is turned off or loses power.
[0266] The PIM 2242 includes functionality for organizing and managing data items of interest to the user, such as, but not limited to, e-mail, contacts, calendar events, voice mails, appointments, and task items. A PIM application has the ability to transmit and receive data items via the wireless network 2205. PIM data items may be seamlessly integrated, synchronized, and updated via the wireless network 2205 with the mobile device subscriber's corresponding data items stored and/or associated with a host computer system. This functionality creates a mirrored host computer on the mobile device 2200 with respect to such items. This can be particularly advantageous when the host computer system is the mobile device subscriber's office computer system.
[0267] The mobile device 2200 also includes a connect module 2244, and an IT policy module 2246. The connect module 2244 implements the communication protocols that are required for the mobile device 2200 to communicate with the wireless infrastructure and any host system, such as an enterprise system, with which the mobile device 2200 is authorized to interface. Examples of a wireless infrastructure and an enterprise system are given in FIGS. 22 and 23, which are described in more detail below.
[0268] The connect module 2244 includes a set of APIs that can be integrated with the mobile device 2200 to allow the mobile device 2200 to use any number of services associated with the enterprise system. The connect module 2244 allows the mobile device 2200 to establish an end-to-end secure, authenticated communication pipe with the host system. A subset of applications for which access is provided by the connect module 2244 can be used to pass IT policy commands from the host system to the mobile device 2200. This can be done in a wireless or wired manner. These instructions can then be passed to the IT policy module 2246 to modify the configuration of the device 2200. Alternatively, in some cases, the IT policy update can also be done over a wired connection.
[0269] The IT policy module 2246 receives IT policy data that encodes the IT policy. The IT policy module 2246 then ensures that the IT policy data is authenticated by the mobile device 2200. The IT policy data can then be stored in the flash memory 2208 in its native form. After the IT policy data is stored, a global notification can be sent by the IT policy module 2246 to all of the applications residing on the mobile device 2200. Applications for which the IT policy may be applicable then respond by reading the IT policy data to look for IT policy rules that are applicable.
[0270] The IT policy module 2246 can include a parser 2247, which can be used by the applications to read the IT policy rules. In some cases, another module or application can provide the parser. Grouped IT policy rules, described in more detail below, are retrieved as byte streams, which are then sent (recursively) into the parser to determine the values of each IT policy rule defined within the grouped IT policy rule. In one or more implementations, the IT policy module 2246 can determine which applications are affected by the IT policy data and transmit a notification to only those applications. In either of these cases, for applications that are not being executed by the main processor 2202 at the time of the notification, the applications can call the parser or the IT policy module 2246 when the applications are executed to determine if there are any relevant IT policy rules in the newly received IT policy data.
[0271] All applications that support rules in the IT Policy are coded to know the type of data to expect. For example, the value that is set for the "WEP User Name" IT policy rule is known to be a string; therefore the value in the IT policy data that corresponds to this rule is interpreted as a string. As another example, the setting for the "Set Maximum Password Attempts" IT policy rule is known to be an integer, and therefore the value in the IT policy data that corresponds to this rule is interpreted as such.
[0272] After the IT policy rules have been applied to the applicable applications or configuration files, the IT policy module 2246 sends an acknowledgement back to the host system to indicate that the IT policy data was received and successfully applied.
[0273] The programs 2237 can also include a temporal-variation-amplifier 2248 and a vital sign generator 2249. In some implementations, the temporal-variation-amplifier 2248 includes a skin-pixel-identifier 702, a frequency-filter 706, a regional facial clusterial module 708 and a frequency filter 710 as in FIG. 7 and 8. In some implementations, the temporal-variation-amplifier 2248 includes a skin-pixel-identifier 702, a spatial bandpass-filter 902, regional facial clusterial module 708 and a temporal bandpass filter 904 as in FIG. 9. In some implementations, the temporal-variation-amplifier 2248 includes a pixel-examiner 1002, a temporal variation determiner 1006 and signal processor 1008 as in FIG. 10. In some implementations, the temporal-variation-amplifier 2248 includes a skin-pixel-identification module 1102, a frequency-filter module 1108, spatial-cluster module 1112 and a frequency filter module 1116 as in FIG. 11 and 12. In some implementations, the temporal-variation-amplifier 2248 includes a skin-pixel-identification module 1102, a spatial bandpass filter module 1402, a spatial-cluster module 1112 and a temporal bandpass filter module 1406 as in FIG. 14. In some implementations, the temporal-variation-amplifier 2248 includes a pixel examination-module 1502, a temporal variation determiner module 1506 and a signal processing module 1510 as in FIG. 15. The camera 122 captures images that are processed by the temporal-variation-amplifier 2248 and the vital sign generator 2249 to generate the vital sign(s) 716 that is displayed by display 2210 or transmitted by communication subsystem 2204 or short-range communications 2222, enunciated by speaker 2218 or stored by flash memory 2208.
[0274] Other types of modules can also be installed on the mobile device 2200.
These modules can be third party modules, which are added after the manufacture of the mobile device 2200. Examples of third party applications include games, calculators, utilities, etc.
[0275] The additional applications can be loaded onto the mobile device 2200 through at least one of the wireless network 2205, the auxiliary I/O subsystem 2212, the data port 2214, the short-range communications subsystem 2222, or any other suitable device subsystem 2224. This flexibility in application installation increases the functionality of the mobile device 2200 and may provide enhanced on-device functions, communication-related functions, or both. For example, secure communication applications may enable electronic commerce functions and other such financial transactions to be performed using the mobile device 2200.
[0276] The data port 2214 enables a subscriber to set preferences through an external device or module and extends the capabilities of the mobile device 2200 by providing for information or module downloads to the mobile device 2200 other than through a wireless communication network. The alternate download path may, for example, be used to load an encryption key onto the mobile device 2200 through a direct and thus reliable and trusted connection to provide secure device communication.
[0277] The data port 2214 can be any suitable port that enables data communication between the mobile device 2200 and another computing device.
The data port 2214 can be a serial or a parallel port. In some instances, the data port 2214 can be a USB port that includes data lines for data transfer and a supply line that can provide a charging current to charge the battery 2230 of the mobile device 2200.
[0278] The short-range communications subsystem 2222 provides for communication between the mobile device 2200 and different systems or devices, without the use of the wireless network 2205. For example, the subsystem 2222 may include an infrared device and associated circuits and modules for short-range communication. Examples of short-range communication standards include standards developed by the Infrared Data Association (IrDA), Bluetooth, and the 802.11 family of standards developed by IEEE.
[0279] Bluetooth is a wireless technology standard for exchanging data over short distances (using short-wavelength radio transmissions in the ISM band from 2400-2480 MHz) from fixed and mobile devices, creating personal area networks (PANs) with high levels of security. Created by telecom vendor Ericsson in 1994, Bluetooth was originally conceived as a wireless alternative to RS-232 data cables.
It can connect several devices, overcoming problems of synchronization.
Bluetooth operates in the range of 2400-2483.5 MHz (including guard bands), which is in the globally unlicensed Industrial, Scientific and Medical (ISM) 2.4 GHz short-range radio frequency band. Bluetooth uses a radio technology called frequency-hopping spread spectrum. The transmitted data is divided into packets and each packet is transmitted on one of the 79 designated Bluetooth channels.
Each channel has a bandwidth of 1 MHz. The first channel starts at 2402 MHz and continues up to 2480 MHz in 1 MHz steps. It usually performs 1600 hops per second, with Adaptive Frequency-Hopping (AFH) enabled. Originally Gaussian frequency-shift keying (GFSK) modulation was the only modulation scheme available; subsequently, since the introduction of Bluetooth 2.0+EDR, π/4-DQPSK and 8DPSK modulation may also be used between compatible devices.
Devices functioning with GFSK are said to be operating in basic rate (BR) mode, where an instantaneous data rate of 1 Mbit/s is possible. The term Enhanced Data Rate (EDR) is used to describe the π/4-DQPSK and 8DPSK schemes, giving 2 and 3 Mbit/s respectively. The combination of these (BR and EDR) modes in Bluetooth radio technology is classified as a "BR/EDR radio". Bluetooth is a packet-based protocol with a master-slave structure. One master may communicate with up to 7 slaves in a piconet; all devices share the master's clock.
Packet exchange is based on the basic clock, defined by the master, which ticks at 312.5 µs intervals. Two clock ticks make up a slot of 625 µs; two slots make up a slot pair of 1250 µs. In the simple case of single-slot packets the master transmits in even slots and receives in odd slots; the slave, conversely, receives in even slots and transmits in odd slots. Packets may be 1, 3 or 5 slots long but in all cases the master transmit will begin in even slots and the slave transmit in odd slots. A master Bluetooth device can communicate with a maximum of seven devices in a piconet (an ad-hoc computer network using Bluetooth technology), though not all devices reach this maximum. The devices can switch roles, by agreement, and the slave can become the master (for example, a headset initiating a connection to a phone will necessarily begin as master, as initiator of the connection; but may subsequently prefer to be slave). The Bluetooth Core Specification provides for the connection of two or more piconets to form a scatternet, in which certain devices simultaneously play the master role in one piconet and the slave role in another. At any given time, data can be transferred between the master and one other device (except for the little-used broadcast mode). The master chooses which slave device to address; typically, it switches rapidly from one device to another in a round-robin fashion. Since it is the master that chooses which slave to address, whereas a slave is (in theory) supposed to listen in each receive slot, being a master is a lighter burden than being a slave. Being a master of seven slaves is possible; being a slave of more than one master is difficult. Many of the services offered over Bluetooth can expose private data or allow the connecting party to control the Bluetooth device.
For security reasons it is necessary to be able to recognize specific devices and thus enable control over which devices are allowed to connect to a given Bluetooth device. At the same time, it is useful for Bluetooth devices to be able to establish a connection without user intervention (for example, as soon as the Bluetooth devices are in range of each other). To resolve this conflict, Bluetooth uses a process called bonding, and a bond is created through a process called pairing. The pairing process is triggered either by a specific request from a user to create a bond (for example, the user explicitly requests to "Add a Bluetooth device"), or it is triggered automatically when connecting to a service where (for the first time) the identity of a device is required for security purposes.
These two cases are referred to as dedicated bonding and general bonding respectively. Pairing often involves some level of user interaction; this user interaction is the basis for confirming the identity of the devices. Once pairing successfully completes, a bond will have been formed between the two devices, enabling those two devices to connect to each other in the future without requiring the pairing process in order to confirm the identity of the devices. When desired, the bonding relationship can later be removed by the user. Secure Simple Pairing (SSP): This is required by Bluetooth v2.1, although a Bluetooth v2.1 device may only use legacy pairing to interoperate with a v2.0 or earlier device. Secure Simple Pairing uses a form of public key cryptography, and some types can help protect against man-in-the-middle, or MITM, attacks. SSP has the following characteristics: Just works: As implied by the name, this method just works. No user interaction is required; however, a device may prompt the user to confirm the pairing process. This method is typically used by headsets with very limited IO capabilities, and is more secure than the fixed PIN mechanism which is typically used for legacy pairing by this set of limited devices. This method provides no man-in-the-middle (MITM) protection. Numeric comparison: If both devices have a display and at least one can accept a binary Yes/No user input, both devices may use Numeric Comparison. This method displays a 6-digit numeric code on each device. The user should compare the numbers to ensure that the numbers are identical. If the comparison succeeds, the user(s) should confirm pairing on the device(s) that can accept an input. This method provides MITM protection, assuming the user confirms on both devices and actually performs the comparison properly.
Passkey Entry: This method may be used between a device with a display and a device with numeric keypad entry (such as a keyboard), or two devices with numeric keypad entry. In the first case, the display is used to show a 6-digit numeric code to the user, who then enters the code on the keypad. In the second case, the user of each device enters the same 6-digit number. Both of these cases provide MITM protection. Out of band (OOB): This method uses an external means of communication, such as Near Field Communication (NFC), to exchange some information used in the pairing process. Pairing is completed using the Bluetooth radio, but requires information from the OOB mechanism. This provides only the level of MITM protection that is present in the OOB mechanism.
SSP is considered simple for the following reasons: In most cases, it does not require a user to generate a passkey. For use-cases not requiring MITM protection, user interaction can be eliminated. For numeric comparison, MITM protection can be achieved with a simple equality comparison by the user. Using OOB with NFC enables pairing when devices simply get close, rather than requiring a lengthy discovery process.
[00280] In use, a received signal such as a text message, an e-mail message, or web page download will be processed by the communication subsystem 2204 and input to the main processor 2202. The main processor 2202 will then process the received signal for output to the display 2210 or alternatively to the auxiliary I/O subsystem 2212. A subscriber may also compose data items, such as e-mail messages, for example, using the keyboard 2216 in conjunction with the display 2210 and possibly the auxiliary I/O subsystem 2212. The auxiliary subsystem 2212 may include devices such as: a touch screen, mouse, track ball, infrared fingerprint detector, or a roller wheel with dynamic button pressing capability. The keyboard 2216 is preferably an alphanumeric keyboard and/or telephone-type keypad. However, other types of keyboards may also be used. A composed item may be transmitted over the wireless network 2205 through the communication subsystem 2204.
[00281] For voice communications, the overall operation of the mobile device 2200 is substantially similar, except that the received signals are output to the speaker 2218, and signals for transmission are generated by the microphone 2220.
Alternative voice or audio I/O subsystems, such as a voice message recording subsystem, can also be implemented on the mobile device 2200. Although voice or audio signal output is accomplished primarily through the speaker 2218, the display 2210 can also be used to provide additional information such as the identity of a calling party, duration of a voice call, or other voice call related information.

[00282] FIG. 23 is a block diagram of a hardware and operating environment 2300 in which different implementations can be practiced. The description of FIG. 23 provides an overview of computer hardware and a suitable computing environment in conjunction with which some implementations can be implemented. Implementations are described in terms of a computer executing computer-executable instructions. However, some implementations can be implemented entirely in computer hardware in which the computer-executable instructions are implemented in read-only memory. Some implementations can also be implemented in client/server computing environments where remote devices that perform tasks are linked through a communications network. Program modules can be located in both local and remote memory storage devices in a distributed computing environment.

[00283] FIG. 23 illustrates an example of a computer environment 2300 useful in the context of the environment of FIG. 1-9, in accordance with an implementation. The computer environment 2300 includes a computation resource 2302 capable of implementing the processes described herein. It will be appreciated that other devices can alternatively be used that include more modules, or fewer modules, than those illustrated in FIG. 23.

[00284] The illustrated operating environment 2300 is only one example of a suitable operating environment, and the example described with reference to FIG. 23 is not intended to suggest any limitation as to the scope of use or functionality of the implementations of this disclosure.
Other well-known computing systems, environments, and/or configurations can be suitable for implementation and/or application of the subject matter disclosed herein.
[00285] The computation resource 2302 includes one or more processors or processing units 2304, a system memory 2306, and a bus 2308 that couples various system modules including the system memory 2306 to processor(s) 2304 and other elements in the environment 2300. The bus 2308 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port and a processor or local bus using any of a variety of bus architectures, and can be compatible with SCSI (small computer system interconnect), or other conventional bus architectures and protocols.
[00286] The system memory 2306 includes nonvolatile read-only memory (ROM) 2310 and random access memory (RAM) 2312, which can or can not include volatile memory elements. A basic input/output system (BIOS) 2314, containing the elementary routines that help to transfer information between elements within computation resource 2302 and with external items, typically invoked into operating memory during start-up, is stored in ROM 2310.
[00287] The computation resource 2302 further can include a non-volatile read/write memory 2316, represented in FIG. 23 as a hard disk drive, coupled to bus 2308 via a data media interface 2317 (e.g., a SCSI, ATA, or other type of interface); a magnetic disk drive (not shown) for reading from, and/or writing to, a removable magnetic disk 2320; and an optical disk drive (not shown) for reading from, and/or writing to, a removable optical disk 2326 such as a CD, DVD, or other optical media.
[00288] The non-volatile read/write memory 2316 and associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computation resource 2302.
Although the exemplary environment 2300 is described herein as employing a non-volatile read/write memory 2316, a removable magnetic disk 2320 and a removable optical disk 2326, it will be appreciated by those skilled in the art that other types of computer-readable media which can store data that is accessible by a computer, such as magnetic cassettes, FLASH memory cards, random access memories (RAMs), read only memories (ROM), and the like, can also be used in the exemplary operating environment.
[00289] A number of program modules can be stored via the non-volatile read/write memory 2316, magnetic disk 2320, optical disk 2326, ROM 2310, or RAM 2312, including an operating system 2330, one or more application programs 2332, program modules 2334 and program data 2336. Examples of computer operating systems conventionally employed include the NUCLEUS® operating system, the LINUX® operating system, and others, for example, providing capability for supporting application programs 2332 using, for example, code modules written in the C++® computer programming language. The application programs 2332 and/or the program modules 2334 can also include a temporal-variation-amplifier (as shown in 2248 in FIG. 22) and a vital sign generator (as shown in 2249 in FIG. 22). In some implementations, the temporal-variation-amplifier 2248 in the application programs 2332 and/or the program modules 2334 includes a skin-pixel-identifier 702, a frequency-filter 706, a regional facial clusterial module 708 and a frequency filter 710 as in FIG. 7 and 8. In some implementations, the temporal-variation-amplifier 2248 in the application programs 2332 and/or the program modules 2334 includes a skin-pixel-identifier 702, a spatial bandpass-filter 902, a regional facial clusterial module 708 and a temporal bandpass filter 904 as in FIG. 9. In some implementations, the temporal-variation-amplifier 2248 in the application programs 2332 and/or the program modules 2334 includes a pixel-examiner 1002, a temporal variation determiner 1006 and a signal processor 1008 as in FIG. 10. In some implementations, the temporal-variation-amplifier 2248 in the application programs 2332 and/or the program modules 2334 includes a skin-pixel-identification module 1102, a frequency-filter module 1108, a spatial-cluster module 1112 and a frequency filter module 1116 as in FIG. 11 and 12. In some implementations, the temporal-variation-amplifier 2248 in the application programs 2332 and/or the program modules 2334 includes a skin-pixel-identification module 1102, a spatial bandpass filter module 1402, a spatial-cluster module 1112 and a temporal bandpass filter module 1406 as in FIG. 14. In some implementations, the temporal-variation-amplifier 2248 in the application programs 2332 and/or the program modules 2334 includes a pixel examination module 1502, a temporal variation determiner module 1506 and a signal processing module 1510 as in FIG. 15. The camera 122 captures images 124 that are processed by the temporal-variation-amplifier 2248 and the vital sign generator 2249 to generate the vital sign(s) 716 that is displayed by display 2350 or transmitted by computation resource 2302, enunciated by a speaker, or stored in program data 2336.
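The spatial-bandpass-then-temporal-bandpass arrangement described above can be sketched as an Eulerian-style amplifier. This is a minimal illustration under assumed parameters (the function name, the 2x2 block-average spatial filter, the 0.7-3.0 Hz pulse band, and the gain alpha are all choices made here, not taken from the patent):

```python
import numpy as np

def amplify_temporal_variation(frames, fps, f_lo=0.7, f_hi=3.0, alpha=20.0):
    """Sketch of a temporal-variation-amplifier: spatially smooth each
    frame, temporally bandpass each pixel's time series around the pulse
    band, then add the amplified variation back to the input.
    frames: array of shape (T, H, W), grayscale, float."""
    frames = np.asarray(frames, dtype=np.float64)
    t, h, w = frames.shape
    # Spatial lowpass: 2x2 block averaging as a stand-in for one pyramid level.
    spatial = frames[:, :h // 2 * 2, :w // 2 * 2]
    spatial = spatial.reshape(t, h // 2, 2, w // 2, 2).mean(axis=(2, 4))
    # Temporal bandpass via FFT: keep only frequencies in [f_lo, f_hi] Hz.
    freqs = np.fft.rfftfreq(t, d=1.0 / fps)
    spectrum = np.fft.rfft(spatial, axis=0)
    spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    variation = np.fft.irfft(spectrum, n=t, axis=0)
    # Amplify the filtered variation and add it back at full resolution.
    variation = np.repeat(np.repeat(variation, 2, axis=1), 2, axis=2)
    out = frames.copy()
    out[:, :variation.shape[1], :variation.shape[2]] += alpha * variation
    return out
```

Skin-pixel identification and regional clustering, which the implementations above place before the filters, would simply restrict which pixels feed this pipeline.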
[00290] A user can enter commands and information into computation resource 2302 through input devices such as input media 2338 (e.g., keyboard/keypad, tactile input or pointing device, mouse, foot-operated switching apparatus, joystick, touchscreen or touchpad, microphone, antenna, etc.). Such input devices 2338 are coupled to the processing unit 2304 through a conventional input/output interface 2342 that is, in turn, coupled to the system bus. A monitor 2350 or other type of display device is also coupled to the system bus 2308 via an interface, such as a video adapter 2352.
[00291] The computation resource 2302 can include capability for operating in a networked environment using logical connections to one or more remote computers, such as a remote computer 2360. The remote computer 2360 can be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computation resource 2302. In a networked environment, program modules depicted relative to the computation resource 2302, or portions thereof, can be stored in a remote memory storage device such as can be associated with the remote computer 2360. By way of example, remote application programs 2362 reside on a memory device of the remote computer 2360. The logical connections represented in FIG. 23 can include interface capabilities, e.g., such as interface capabilities in FIG. 5, a storage area network (SAN, not illustrated in FIG. 23), local area network (LAN) 2372 and/or a wide area network (WAN) 2374, but can also include other networks.
[00292] Such networking environments are commonplace in modern computer systems, in association with intranets and the Internet. In certain implementations, the computation resource 2302 executes an Internet Web browser program (which can optionally be integrated into the operating system 2330), such as the "Internet Explorer®" Web browser manufactured and distributed by the Microsoft Corporation of Redmond, Washington.
[0100] When used in a LAN-coupled environment, the computation resource 2302 communicates with or through the local area network 2372 via a network interface or adapter 2376 and typically includes interfaces, such as a modem 2378, or other apparatus, for establishing communications with or through the WAN 2374, such as the Internet. The modem 2378, which can be internal or external, is coupled to the system bus 2308 via a serial port interface.
[0101] In a networked environment, program modules depicted relative to the computation resource 2302, or portions thereof, can be stored in remote memory apparatus. It will be appreciated that the network connections shown are exemplary, and other means of establishing a communications link between various computer systems and elements can be used.
[0102] A user of a computer can operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 2360, which can be a personal computer, a server, a router, a network PC, a peer device or other common network node. Typically, a remote computer 2360 includes many or all of the elements described above relative to the computer 2300 of FIG. 23.
[0103] The computation resource 2302 typically includes at least some form of computer-readable media. Computer-readable media can be any available media that can be accessed by the computation resource 2302. By way of example, and not limitation, computer-readable media can comprise computer storage media and communication media.
[0104] Computer storage media include volatile and nonvolatile, removable and non-removable media, implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules or other data. The term "computer storage media" includes, but is not limited to, RAM, ROM, EEPROM, FLASH memory or other memory technology, CD, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other media which can be used to store computer-intelligible information and which can be accessed by the computation resource 2302.
[0105] Communication media typically embodies computer-readable instructions, data structures, program modules or other data, represented via, and determinable from, a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal in a fashion amenable to computer interpretation.
[0106] By way of example, and not limitation, communication media include wired media, such as wired network or direct-wired connections, and wireless media, such as acoustic, RF, infrared and other wireless media. The scope of the term computer-readable media includes combinations of any of the above.
[00293] FIG. 24 is a representation of display 2400 that is presented on the display device of apparatus in FIG. 1-3, according to an implementation.
[00294] Some implementations of display 2400 include a representation of three detection modes 2402, a first detection mode being detection and display of surface temperature, a second detection mode being detection and display of body temperature and a third detection mode being detection and display of room temperature.
[00295] Some implementations of display 2400 include a representation of Celsius 2404 that is activated when the apparatus is in Celsius mode.

[00296] Some implementations of display 2400 include a representation of a sensed temperature 2406.
[00297] Some implementations of display 2400 include a representation of Fahrenheit 2408 that is activated when the apparatus is in Fahrenheit mode.
[00298] Some implementations of display 2400 include a representation of a mode 2410 of site-temperature sensing, a first site mode being detection of an axillary surface temperature, a second site mode being detection of an oral temperature, a third site mode being detection of a rectal temperature and a fourth site mode being detection of a core temperature.
[00299] Some implementations of display 2400 include a representation of a temperature traffic light 2412, in which a green traffic light indicates that the temperature 120 is good; an amber traffic light indicates that the temperature 120 is low; and a red traffic light indicates that the temperature 120 is high.
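The traffic-light mapping above reduces to a simple threshold function. The threshold values below are illustrative assumptions; the patent does not specify the boundaries between low, good, and high:

```python
def temperature_traffic_light(temp_c: float,
                              low: float = 36.0,
                              high: float = 37.5) -> str:
    """Map a sensed body temperature (Celsius) to the display's
    traffic-light colour. Thresholds are assumed, not from the patent."""
    if temp_c < low:
        return "amber"   # low temperature
    if temp_c > high:
        return "red"     # high temperature
    return "green"       # good temperature
```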
[00300] Some implementations of display 2400 include a representation of a probe mode 2414 that is activated when the sensed temperature 2406 is from a contact sensor.
[00301] Some implementations of display 2400 include a representation of the current time/date 2416 of the apparatus.
[00302] The non-touch thermometer further including: a housing, and where the battery 104 is fixedly attached to the housing. The non-touch thermometer where an exterior portion of the housing further includes: a magnet.
Conclusion
[00303] A non-touch thermometer that senses temperature from a digital infrared sensor is described. A technical effect of the apparatus is transmitting from the digital infrared sensor a digital signal representing a temperature without conversion from analog. Another technical effect of the apparatus and methods disclosed herein is generating a temporal variation of images from which a heart rate and the respiratory rate can be determined and displayed or stored.
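Determining a heart rate from the temporal variation can be sketched as locating the dominant frequency of a skin-region brightness time series within the plausible pulse band. This is a sketch of the general idea under assumed names and band limits, not the patented method's exact steps:

```python
import numpy as np

def heart_rate_bpm(brightness: np.ndarray, fps: float,
                   f_lo: float = 0.7, f_hi: float = 4.0) -> float:
    """Estimate heart rate from a mean-brightness time series of a skin
    region by finding the strongest frequency in the pulse band."""
    x = np.asarray(brightness, dtype=np.float64)
    x = x - x.mean()                      # remove the DC component
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0                    # Hz -> beats per minute
```

Respiratory rate can be estimated the same way with a lower band (roughly 0.1-0.5 Hz).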
Although specific implementations are illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific implementations shown. This application is intended to cover any adaptations or variations.
[00304] In particular, one of skill in the art will readily appreciate that the names of the methods and apparatus are not intended to limit implementations.
Furthermore, additional methods and apparatus can be added to the modules, functions can be rearranged among the modules, and new modules to correspond to future enhancements and physical devices used in implementations can be introduced without departing from the scope of implementations. One of skill in the art will readily recognize that implementations are applicable to future non-touch temperature sensing devices, different temperature measuring sites on humans or animals and new display devices.
[00305] The terminology used in this application is meant to include all temperature sensors, processors and operator environments and alternate technologies which provide the same functionality as described herein.
GB1411983.8A 2014-07-04 2014-07-04 Non-touch optical detection of vital signs Active GB2528044B (en)

Priority Applications (23)

Application Number Priority Date Filing Date Title
GB1811453.8A GB2561771B (en) 2014-07-04 2014-07-04 Non-touch detection of body core temperature from a digital infrared sensor
GB1411983.8A GB2528044B (en) 2014-07-04 2014-07-04 Non-touch optical detection of vital signs
US14/324,235 US9508141B2 (en) 2014-07-04 2014-07-06 Non-touch optical detection of vital signs
US14/448,223 US8950935B1 (en) 2014-07-04 2014-07-31 Thermometer having a digital infrared sensor
US14/457,105 US20160000337A1 (en) 2014-07-04 2014-08-11 Thermometer having a digital infrared sensor
US14/457,041 US9406125B2 (en) 2014-07-04 2014-08-11 Apparatus of non-touch optical detection of vital signs on skin from multiple filters
US14/457,029 US9478025B2 (en) 2014-07-04 2014-08-11 Device having a digital infrared sensor and non-touch optical detection of vital signs from a temporal variation amplifier
US14/457,074 US9501824B2 (en) 2014-07-04 2014-08-11 Non-touch optical detection of vital signs from amplified visual variations of reduced images of skin
US14/457,001 US9721339B2 (en) 2014-07-04 2014-08-11 Device having digital infrared sensor and non-touch optical detection of amplified temporal variation of vital signs
US14/457,098 US9324144B2 (en) 2014-07-04 2014-08-11 Device having a digital infrared sensor and non-touch optical detection of vital signs from a temporal variation amplifier
US14/457,090 US20160000331A1 (en) 2014-07-04 2014-08-11 Apparatus of non-touch optical detection of vital signs of reduced images from spatial and temporal filters
US14/457,061 US9495744B2 (en) 2014-07-04 2014-08-11 Non-touch optical detection of vital signs from amplified visual variations of reduced images
US14/457,111 US9330459B2 (en) 2014-07-04 2014-08-11 Thermometer having a digital infrared sensor on a circuit board that is separate from a microprocessor
US14/457,053 US9691146B2 (en) 2014-07-04 2014-08-11 Non-touch optical detection of vital sign from amplified visual variations
US14/457,018 US9262826B2 (en) 2014-07-04 2014-08-11 Methods of non-touch optical detection of vital signs from multiple filters
US14/617,926 US9282896B2 (en) 2014-07-04 2015-02-09 Thermometer having a digital infrared sensor
US14/694,610 US20160003679A1 (en) 2014-07-04 2015-04-23 Thermometer having a digital infrared sensor
EP15175534.5A EP2963617A1 (en) 2014-07-04 2015-07-06 Non-touch optical detection of vital signs
EP15175517.0A EP2977732A3 (en) 2014-07-04 2015-07-06 Thermometer having a digital infrared sensor
US14/794,669 US9305350B2 (en) 2014-07-04 2015-07-08 Non-touch optical detection of biological vital signs
US14/876,784 US9881369B2 (en) 2014-07-04 2015-10-06 Smartphone having a communication subsystem that is operable in CDMA, a digital infrared sensor with ports that provide a digital signal representing a surface temperature, a microprocessor that receives from the ports the digital signal that is representative of the temperature and that generates a body core temperature from the digital signal that is representative of the temperature and a display device that displays the body core temperature
US15/224,644 US10074175B2 (en) 2014-07-04 2016-07-31 Non-touch optical detection of vital signs from variation amplification subsequent to multiple frequency filters
US16/127,182 US10453194B2 (en) 2014-07-04 2018-09-10 Apparatus having a digital infrared sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1411983.8A GB2528044B (en) 2014-07-04 2014-07-04 Non-touch optical detection of vital signs

Publications (3)

Publication Number Publication Date
GB201411983D0 GB201411983D0 (en) 2014-08-20
GB2528044A true GB2528044A (en) 2016-01-13
GB2528044B GB2528044B (en) 2018-08-22

Family

ID=51410672

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1411983.8A Active GB2528044B (en) 2014-07-04 2014-07-04 Non-touch optical detection of vital signs

Country Status (2)

Country Link
US (19) US9508141B2 (en)
GB (1) GB2528044B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109409367A (en) * 2018-11-02 2019-03-01 四川大学 A kind of infrared image gradation recognition methods based on rock temperature-raising characteristic
CN110380430A (en) * 2019-07-01 2019-10-25 浙江大学 A kind of people having the same aspiration and interest generator recognition methods based on fuzzy clustering
US10492684B2 (en) 2017-02-21 2019-12-03 Arc Devices Limited Multi-vital-sign smartphone system in an electronic medical records system
US10602987B2 (en) 2017-08-10 2020-03-31 Arc Devices Limited Multi-vital-sign smartphone system in an electronic medical records system

Families Citing this family (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150100113A (en) * 2014-02-24 2015-09-02 삼성전자주식회사 Apparatus and Method for processing image thereof
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
GB2528044B (en) * 2014-07-04 2018-08-22 Arc Devices Ni Ltd Non-touch optical detection of vital signs
EP3179948B1 (en) 2014-07-25 2022-04-27 Covidien LP Augmented surgical reality environment
US10758133B2 (en) * 2014-08-07 2020-09-01 Apple Inc. Motion artifact removal by time domain projection
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
WO2016040540A1 (en) * 2014-09-13 2016-03-17 Arc Devices Inc. Usa Non-touch detection of body core temperature
US9600080B2 (en) 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
US9854973B2 (en) 2014-10-25 2018-01-02 ARC Devices, Ltd Hand-held medical-data capture-device interoperation with electronic medical record systems
US10064582B2 (en) 2015-01-19 2018-09-04 Google Llc Noninvasive determination of cardiac health and other functional states and trends for human physiological systems
US10168220B2 (en) * 2015-03-20 2019-01-01 Pixart Imaging Inc. Wearable infrared temperature sensing device
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9848780B1 (en) 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
CN107466389B (en) 2015-04-30 2021-02-12 谷歌有限责任公司 Method and apparatus for determining type-agnostic RF signal representation
US10139916B2 (en) 2015-04-30 2018-11-27 Google Llc Wide-field radar-based gesture recognition
KR102328589B1 (en) 2015-04-30 2021-11-17 구글 엘엘씨 Rf-based micro-motion tracking for gesture tracking and recognition
US10080528B2 (en) * 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US20160345832A1 (en) * 2015-05-25 2016-12-01 Wearless Tech Inc System and method for monitoring biological status through contactless sensing
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US10542961B2 (en) 2015-06-15 2020-01-28 The Research Foundation For The State University Of New York System and method for infrasonic cardiac monitoring
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
EP3209198A4 (en) * 2015-12-31 2017-12-27 Cnoga Medical Ltd. Method and device for computing optical hemodynamic blood pressure
US10970661B2 (en) * 2016-01-11 2021-04-06 RaceFit International Company Limited System and method for monitoring motion and orientation patterns associated to physical activities of users
CA2958003C (en) * 2016-02-19 2022-04-05 Paul Stanley Addison System and methods for video-based monitoring of vital signs
CN205585991U (en) * 2016-02-26 2016-09-21 严定远 Measure device of human heartbeat breathing and body temperature
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
US20190133516A1 (en) * 2016-06-06 2019-05-09 Tosense, Inc. Physiological monitor for monitoring patients undergoing hemodialysis
US10165612B2 (en) * 2016-06-16 2018-12-25 I/O Interconnected, Ltd. Wireless connecting method, computer, and non-transitory computer-readable storage medium
US10925496B2 (en) 2016-07-16 2021-02-23 Alexander Misharin Methods and systems for obtaining physiologic information
US11412943B2 (en) 2016-07-16 2022-08-16 Olesya Chornoguz Methods and systems for obtaining physiologic information
KR102586792B1 (en) * 2016-08-23 2023-10-12 삼성디스플레이 주식회사 Display device and driving method thereof
GB2569936B (en) * 2016-10-14 2021-12-01 Facense Ltd Calculating respiratory parameters from thermal measurements
CN106600556A (en) * 2016-12-16 2017-04-26 合网络技术(北京)有限公司 Image processing method and apparatus
CN108204860A (en) * 2016-12-20 2018-06-26 天津市军联科技有限公司 Heat supply network infrared temperature measurement system based on the GPRS communication technologys
US20180235478A1 (en) * 2017-02-18 2018-08-23 VVV IP Holdings Limited Multi-Vital Sign Detector in an Electronic Medical Records System
US10506926B2 (en) 2017-02-18 2019-12-17 Arc Devices Limited Multi-vital sign detector in an electronic medical records system
CN107300428A (en) * 2017-06-28 2017-10-27 武汉万千无限科技有限公司 A kind of automatic continuous measuring system of rotary spherical digester temperature based on internet-based control
US20190046056A1 (en) * 2017-08-10 2019-02-14 VVVital Patent Holdings Limited Multi-Vital Sign Detector in an Electronic Medical Records System
CN109548008B (en) * 2017-08-15 2021-09-14 华为技术有限公司 Method and equipment for identifying and controlling remote user equipment by network side
CN109671241A (en) * 2017-10-16 2019-04-23 中国电信股份有限公司 Alarm method and system
EP3681394A1 (en) 2017-11-13 2020-07-22 Covidien LP Systems and methods for video-based monitoring of a patient
WO2019135877A1 (en) 2018-01-08 2019-07-11 Covidien Lp Systems and methods for video-based non-contact tidal volume monitoring
US10522006B2 (en) * 2018-01-09 2019-12-31 Erik Alexander Methods and systems for interactive gaming
EP3572730B1 (en) * 2018-05-02 2023-01-04 Elatronic Ag Remote temperature measurement of cookware through a ceramic glass plate using an infrared sensor
CN108647658A (en) * 2018-05-16 2018-10-12 电子科技大学 A kind of infrared imaging detection method of high-altitude cirrus
US10485431B1 (en) * 2018-05-21 2019-11-26 ARC Devices Ltd. Glucose multi-vital-sign system in an electronic medical records system
US11547313B2 (en) 2018-06-15 2023-01-10 Covidien Lp Systems and methods for video-based patient monitoring during surgery
EP3833241A1 (en) 2018-08-09 2021-06-16 Covidien LP Video-based patient monitoring systems and associated methods for detecting and monitoring breathing
CN109409290B (en) * 2018-10-26 2022-02-11 中国人民解放军火箭军工程大学 Thermometer verification reading automatic identification system and method
US11617520B2 (en) 2018-12-14 2023-04-04 Covidien Lp Depth sensing visualization modes for non-contact monitoring
CN109657720B (en) * 2018-12-20 2021-05-11 浙江大学 On-line diagnosis method for turn-to-turn short circuit fault of power transformer
US11315275B2 (en) 2019-01-28 2022-04-26 Covidien Lp Edge handling methods for associated depth sensing camera devices, systems, and methods
WO2020171554A1 (en) * 2019-02-19 2020-08-27 Samsung Electronics Co., Ltd. Method and apparatus for measuring body temperature using a camera
CN109846463A (en) * 2019-03-04 2019-06-07 武汉迅检科技有限公司 Infrared face temp measuring method, system, equipment and storage medium
US11484208B2 (en) 2020-01-31 2022-11-01 Covidien Lp Attached sensor activation of additionally-streamed physiological parameters from non-contact monitoring systems and associated devices, systems, and methods
US11170894B1 (en) * 2020-04-02 2021-11-09 Robert William Kocher Access and temperature monitoring system (ATMs)
CN111839519B (en) * 2020-05-26 2021-05-18 合肥工业大学 Non-contact respiratory frequency monitoring method and system
US11504014B2 (en) 2020-06-01 2022-11-22 Arc Devices Limited Apparatus and methods for measuring blood pressure and other vital signs via a finger
US20210381902A1 (en) * 2020-06-09 2021-12-09 Dynabrade, Inc. Holder for a temporal thermometer
CN111800312B (en) * 2020-06-23 2021-08-24 中国核动力研究设计院 Message content analysis-based industrial control system anomaly detection method and system
CN111707380A (en) 2020-07-10 2020-09-25 浙江荣胜工具有限公司 Forehead temperature instrument capable of displaying different colors according to detected temperature and control circuit thereof
CN112418251B (en) * 2020-12-10 2024-02-13 研祥智慧物联科技有限公司 Infrared body temperature detection method and system
CN112866030B (en) * 2021-02-03 2022-08-12 挂号网(杭州)科技有限公司 Flow switching method, device, equipment and storage medium
US20230036636A1 (en) * 2021-07-21 2023-02-02 Holographic Humanity, Llc Thermal Imaging Device Performing Image Analysis To Facilitate Early Detection Of Distal Extremity Altered Perfusion States
US11914800B1 (en) 2022-10-28 2024-02-27 Dell Products L.P. Information handling system stylus with expansion bay and replaceable module

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6095682A (en) * 1997-11-21 2000-08-01 Omega Engineering, Inc. Pyrometer multimeter
US6215893B1 (en) * 1998-05-24 2001-04-10 Romedix Ltd. Apparatus and method for measurement and temporal comparison of skin surface images
US20060152737A1 (en) * 2004-02-10 2006-07-13 Fluke Corporation Method and apparatus for electronically generating an outline indicating the size of an energy zone imaged onto the IR detector of a radiometer

Family Cites Families (136)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1322906A (en) 1970-03-05 1973-07-11 Cannon Electric Great Britain Contacts
SE338844B (en) 1970-12-14 1971-09-20 Sandvikens Jernverks Ab
FR2132945A5 (en) 1971-04-02 1972-11-24 Asco Sa
JPS56501138A (en) * 1979-09-12 1981-08-13
US4322012A (en) 1980-05-09 1982-03-30 Dairy Cap Corporation Threaded plastic bottle cap
US4315150A (en) 1980-07-24 1982-02-09 Telatemp Corporation Targeted infrared thermometer
US4494881A (en) * 1982-03-10 1985-01-22 Everest Charles E Intra-optical light beam sighting system for an infrared thermometer
US4602642A (en) 1984-10-23 1986-07-29 Intelligent Medical Systems, Inc. Method and apparatus for measuring internal body temperature utilizing infrared emissions
DE3650770T2 (en) 1985-04-17 2003-02-27 Thermoscan Inc Electronic infrared thermometer and method for temperature measurement
US5017018A (en) * 1987-12-25 1991-05-21 Nippon Steel Corporation Clinical thermometer
US5012813A (en) 1988-12-06 1991-05-07 Exergen Corporation Radiation detector having improved accuracy
JPH03182185A (en) 1989-12-11 1991-08-08 Fujitsu Ltd Infrared monitoring system
US5150969A (en) 1990-03-12 1992-09-29 Ivac Corporation System and method for temperature determination and calibration in a biomedical probe
FR2665533B1 (en) 1990-08-06 1994-03-25 Ortomedic DEVICE FOR REMOTE MEASUREMENT OF TEMPERATURE AND / OR TEMPERATURE DIFFERENCES.
US5272340A (en) * 1992-09-29 1993-12-21 Amara, Inc. Infrared imaging system for simultaneous generation of temperature, emissivity and fluorescence images
US5368038A (en) 1993-03-08 1994-11-29 Thermoscan Inc. Optical system for an infrared thermometer
US5626424A (en) 1994-07-21 1997-05-06 Raytek Subsidiary, Inc. Dual light source aiming mechanism and improved actuation system for hand-held temperature measuring unit
JP3333353B2 (en) 1995-05-31 2002-10-15 安立計器株式会社 Temperature measuring device
DE19604201A1 (en) 1996-02-06 1997-08-07 Braun Ag protective cap
IT1284119B1 (en) 1996-07-05 1998-05-08 Tecnica S R L INFRARED THERMOMETER INCLUDING AN OPTICAL AIMING SYSTEM
US6343141B1 (en) * 1996-10-08 2002-01-29 Lucent Technologies Inc. Skin area detection for video image systems
WO1998044839A1 (en) 1997-04-03 1998-10-15 National Research Council Of Canada Method of assessing tissue viability using near-infrared spectroscopy
DE19827343A1 (en) 1998-06-19 1999-12-23 Braun Gmbh Device for carrying out measurements in ear, e.g. for measuring temperature
IT1298515B1 (en) 1998-01-30 2000-01-12 Tecnica S R L INFRARED THERMOMETER
DE69817622T2 (en) 1998-01-30 2004-06-17 Tecnimed S.R.L., Vedano Olona INFRARED THERMOMETER
US6286994B1 (en) * 1998-04-29 2001-09-11 Qualcomm Incorporated System, method and computer program product for controlling a transmit signal using an expected power level
DE19830830C2 (en) * 1998-07-09 2000-11-23 Siemens Ag Process for the live detection of human skin
US6292685B1 (en) 1998-09-11 2001-09-18 Exergen Corporation Temporal artery temperature detector
US6757412B1 (en) * 1998-10-21 2004-06-29 Computerzied Thermal Imaging, Inc. System and method for helping to determine the condition of tissue
US6751342B2 (en) * 1999-12-02 2004-06-15 Thermal Wave Imaging, Inc. System for generating thermographic images using thermographic signal reconstruction
US6882982B2 (en) 2000-02-04 2005-04-19 Medtronic, Inc. Responsive manufacturing and inventory control
US6904408B1 (en) 2000-10-19 2005-06-07 Mccarthy John Bionet method, system and personalized web content manager responsive to browser viewers' psychological preferences, behavioral responses and physiological stress indicators
US6832000B2 (en) 2001-03-28 2004-12-14 Koninklijke Philips Electronics N.V. Automatic segmentation-based grass detection for real-time video
JP2005509312A (en) 2001-03-30 2005-04-07 ヒル−ロム サービシーズ,インコーポレイティド Hospital bed and network system
IL158826A0 (en) * 2001-06-13 2004-05-12 Compumedics Ltd Methods and apparatus for monitoring consciousness
KR101124852B1 (en) * 2002-04-22 2012-03-28 마시오 마크 아우렐리오 마틴스 애브리우 Apparatus and method for measuring biological parameters
US8849379B2 (en) 2002-04-22 2014-09-30 Geelux Holdings, Ltd. Apparatus and method for measuring biologic parameters
US7140768B2 (en) * 2002-07-15 2006-11-28 Cold Chain Technologies, Inc. System and method of monitoring temperature
US20040186357A1 (en) 2002-08-20 2004-09-23 Welch Allyn, Inc. Diagnostic instrument workstation
US20070100666A1 (en) * 2002-08-22 2007-05-03 Stivoric John M Devices and systems for contextual and physiological-based detection, monitoring, reporting, entertainment, and control of other devices
US20040057494A1 (en) * 2002-09-19 2004-03-25 Simon Tsao Ear thermometer with improved temperature coefficient and method of calibration thereof
US20040153341A1 (en) 2002-12-09 2004-08-05 Brandt Samuel I. System for analyzing and processing orders related to healthcare treatment or services
US7004910B2 (en) * 2002-12-12 2006-02-28 Alert Care, Inc System and method for monitoring body temperature
US20040120383A1 (en) 2002-12-19 2004-06-24 The Boeing Company Non-destructive testing system and method using current flow thermography
US7477571B2 (en) * 2003-04-03 2009-01-13 Sri International Method for detecting vibrations in a biological organism using real-time vibration imaging
US7501984B2 (en) 2003-11-04 2009-03-10 Avery Dennison Corporation RFID tag using a surface insensitive antenna structure
WO2004110248A2 (en) * 2003-05-27 2004-12-23 Cardiowave, Inc. Remote technique to detect core body temperature in a subject using thermal imaging
US7792970B2 (en) 2005-06-17 2010-09-07 Fotonation Vision Limited Method for establishing a paired connection between media devices
US6956337B2 (en) 2003-08-01 2005-10-18 Directed Electronics, Inc. Temperature-to-color converter and conversion method
AU2004311841B2 (en) 2003-12-24 2008-10-09 Walker Digital, Llc Method and apparatus for automatically capturing and managing images
US7572056B2 (en) 2004-11-16 2009-08-11 Welch Allyn, Inc. Probe cover for thermometry apparatus
AU2006204886B2 (en) 2005-01-13 2011-08-04 Welch Allyn, Inc. Vital signs monitor
US20060225737A1 (en) * 2005-04-12 2006-10-12 Mr. Mario Iobbi Device and method for automatically regulating supplemental oxygen flow-rate
US7686232B2 (en) * 2005-09-20 2010-03-30 Novarus Corporation System and method for food safety inspection
US20070080223A1 (en) * 2005-10-07 2007-04-12 Sherwood Services Ag Remote monitoring of medical device
US8768429B2 (en) * 2005-12-23 2014-07-01 E.I.T. Pty Ltd. Internal bleeding detection apparatus
US7407323B2 (en) 2006-02-03 2008-08-05 Ge Infrastructure Sensing Inc. Methods and systems for determining temperature of an object
US20070268954A1 (en) * 2006-05-19 2007-11-22 Sherwood Services Ag Portable test apparatus for radiation-sensing thermometer
US20080018480A1 (en) * 2006-07-20 2008-01-24 Sham John C K Remote body temperature monitoring device
US7520668B2 (en) 2007-01-24 2009-04-21 Innova Electronics Corporation Multi function thermometer
TW200840541A (en) 2007-04-09 2008-10-16 Avita Corp Non-contact temperature-measuring device and the method thereof
FI119531B (en) 2007-06-29 2008-12-15 Optomed Oy Creating an image
US20090049732A1 (en) * 2007-08-24 2009-02-26 Russell Dean Kissinger System and method for cooling the barrel of a firearm
US20090100333A1 (en) * 2007-10-16 2009-04-16 Jun Xiao Visualizing circular graphic objects
US8149273B2 (en) 2007-11-30 2012-04-03 Fuji Xerox Co., Ltd. System and methods for vital sign estimation from passive thermal video
US8549428B2 (en) 2007-12-28 2013-10-01 Fluke Corporation Portable IR thermometer having graphical user display and interface
US7854550B2 (en) 2008-01-04 2010-12-21 Aviton Care Limited Intelligent illumination thermometer
US8218862B2 (en) 2008-02-01 2012-07-10 Canfield Scientific, Incorporated Automatic mask design and registration and feature detection for computer-aided skin analysis
KR100871916B1 (en) * 2008-05-13 2008-12-05 아람휴비스(주) A portable clinical thermometer capable of providing visual images
US8213689B2 (en) * 2008-07-14 2012-07-03 Google Inc. Method and system for automated annotation of persons in video content
WO2010019515A2 (en) 2008-08-10 2010-02-18 Board Of Regents, The University Of Texas System Digital light processing hyperspectral imaging apparatus
EP2347233A4 (en) 2008-10-23 2017-12-20 KAZ Europe SA Non-contact medical thermometer with stray radiation shielding
US9843743B2 (en) * 2009-06-03 2017-12-12 Flir Systems, Inc. Infant monitoring systems and methods using thermal imaging
US8527038B2 (en) * 2009-09-15 2013-09-03 Sotera Wireless, Inc. Body-worn vital sign monitor
TWI495859B (en) 2009-10-05 2015-08-11 Kaz Europe Sa Multi-site attachments for ear thermometers
US20130016185A1 (en) 2009-11-19 2013-01-17 The Johns Hopkins University Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
TWM383758U (en) 2010-02-12 2010-07-01 Jia-He Xu Temperature-sensing illumination device
US9075446B2 (en) 2010-03-15 2015-07-07 Qualcomm Incorporated Method and apparatus for processing and reconstructing data
US20110251493A1 (en) 2010-03-22 2011-10-13 Massachusetts Institute Of Technology Method and system for measurement of physiological parameters
EP2380493A1 (en) 2010-04-21 2011-10-26 Koninklijke Philips Electronics N.V. Respiratory motion detection apparatus
WO2011143631A2 (en) 2010-05-14 2011-11-17 Kai Medical, Inc. Systems and methods for non-contact multiparameter vital signs monitoring, apnea therapy, sway cancellation, patient identification, and subject monitoring sensors
US9135693B2 (en) 2010-05-18 2015-09-15 Skin Of Mine Dot Com, Llc Image calibration and analysis
WO2011151806A1 (en) 2010-06-04 2011-12-08 Tecnimed S.R.L. Method and device for measuring the internal body temperature of a patient
KR20110011477U (en) * 2010-06-07 2011-12-14 로얄앤컴퍼니 주식회사 Digital infrared sensor device with an auto sensitivity setting for auto faucet
US8493482B2 (en) 2010-08-18 2013-07-23 Apple Inc. Dual image sensor image processing system and method
WO2012038877A1 (en) 2010-09-22 2012-03-29 Koninklijke Philips Electronics N.V. Method and apparatus for monitoring the respiration activity of a subject
BR112013008702A2 (en) 2010-10-18 2016-06-21 3M Innovative Properties Co "Multifunctional medical device for telemedicine applications"
US8682417B2 (en) 2010-10-26 2014-03-25 Intermountain Invention Management, Llc Heat-related symptom detection systems and methods
US20120136559A1 (en) * 2010-11-29 2012-05-31 Reagan Inventions, Llc Device and system for identifying emergency vehicles and broadcasting the information
US9459158B2 (en) 2010-12-13 2016-10-04 Helen Of Troy Limited Thermometer with age specific feature selection
RU2589389C2 (en) 2011-01-05 2016-07-10 Конинклейке Филипс Электроникс Н.В. Device and method of extracting information from characteristic signals
WO2012093311A1 (en) 2011-01-06 2012-07-12 Koninklijke Philips Electronics N.V. Barcode scanning device for determining a physiological quantity of a patient
US8948842B2 (en) 2011-01-21 2015-02-03 Headwater Partners Ii Llc Radiation treatment with multiple imaging elements
EP2498583B1 (en) * 2011-03-07 2017-05-03 Zedel LED lamp provided with a safety device
US8832080B2 (en) * 2011-05-25 2014-09-09 Hewlett-Packard Development Company, L.P. System and method for determining dynamic relations from images
US8249547B1 (en) * 2011-06-16 2012-08-21 Albert Fellner Emergency alert device with mobile phone
US8693739B2 (en) * 2011-08-24 2014-04-08 Cyberlink Corp. Systems and methods for performing facial detection
JP5898432B2 (en) * 2011-08-25 2016-04-06 京セラ株式会社 Human body detection system
US20130245462A1 (en) 2011-09-06 2013-09-19 Lluis Capdevila Apparatus, methods, and articles of manufacture for determining and using heart rate variability
US8401285B1 (en) * 2011-09-15 2013-03-19 Mckesson Financial Holdings Methods, apparatuses, and computer program products for controlling luminance of non-tissue objects within an image
US8617081B2 (en) 2011-09-28 2013-12-31 Xerox Corporation Estimating cardiac pulse recovery from multi-channel source data via constrained source separation
JP5818091B2 (en) * 2011-12-27 2015-11-18 ソニー株式会社 Image processing apparatus, image processing system, image processing method, and program
US20130204570A1 (en) * 2012-02-06 2013-08-08 Tzila Mendelson Cellular telephone and camera thermometers
US9910118B2 (en) 2012-04-20 2018-03-06 University Of Virginia Patent Foundation Systems and methods for cartesian dynamic imaging
US8897522B2 (en) 2012-05-30 2014-11-25 Xerox Corporation Processing a video for vascular pattern detection and cardiac function analysis
DE202012102739U1 (en) 2012-06-15 2013-09-17 Steinel Gmbh Measuring device, its use and hot air blower with measuring device
US20140003461A1 (en) 2012-06-28 2014-01-02 Brooklands, Inc. Contact and non-contact thermometer
US20140003462A1 (en) 2012-06-28 2014-01-02 Brooklands, Inc. Thermometer display
CN202619644U (en) * 2012-07-12 2012-12-26 上海颂联国际服饰有限公司 Clothes with function of detecting vital sign of human body
US20140064327A1 (en) 2012-08-28 2014-03-06 Brooklands, Inc. Thermometer electromagnetic sensor waveguide
US9811901B2 (en) * 2012-09-07 2017-11-07 Massachusetts Institute Of Technology Linear-based Eulerian motion modulation
US9324005B2 (en) 2012-09-07 2016-04-26 Massachusetts Institute of Technology Quanta Computer Inc. Complex-valued phase-based eulerian motion modulation
US9805475B2 (en) 2012-09-07 2017-10-31 Massachusetts Institute Of Technology Eulerian motion modulation
US20140189576A1 (en) * 2012-09-10 2014-07-03 Applitools Ltd. System and method for visual matching of application screenshots
CN202859096U (en) * 2012-09-13 2013-04-10 深圳市立德通讯器材有限公司 Smartphone with infrared thermometer
US8452382B1 (en) 2012-09-21 2013-05-28 Brooklands Inc. Non-contact thermometer sensing a carotid artery
US20140112367A1 (en) 2012-10-19 2014-04-24 Brooklands, Inc. Calibration of a hand-held medical device by a mobile device
US20140114600A1 (en) 2012-10-19 2014-04-24 Brooklands, Inc. Calibration of a hand-held medical device by a mobile device
US20150297082A1 (en) * 2012-11-26 2015-10-22 John M. HOGGLE Medical monitoring system
MX360210B (en) * 2012-12-04 2018-10-24 Koninklijke Philips Nv Device and method for obtaining vital sign information of a living being.
JP5843751B2 (en) 2012-12-27 2016-01-13 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus, information processing system, and information processing method
US20150110153A1 (en) * 2013-10-22 2015-04-23 Tagnetics, Inc. Temperature sensor for retail environments
US9008458B2 (en) * 2013-02-07 2015-04-14 Raytheon Company Local area processing using packed distribution functions
US11272142B2 (en) 2013-03-06 2022-03-08 Koninklijke Philips N.V. System and method for determining vital sign information
WO2014145361A1 (en) * 2013-03-15 2014-09-18 Innovative Timing Systems Llc System and method of integrating participant biometrics within an event timing system
US9633548B2 (en) 2013-04-23 2017-04-25 Canary Connect, Inc. Leveraging a user's geo-location to arm and disarm a network enabled device
US9811636B2 (en) 2013-09-20 2017-11-07 Beam Ip Lab Llc Connected health care system
US9262689B1 (en) * 2013-12-18 2016-02-16 Amazon Technologies, Inc. Optimizing pre-processing times for faster response
GB2521620A (en) 2013-12-23 2015-07-01 Brooklands Inc Thermometer display
US20150257653A1 (en) 2014-03-14 2015-09-17 Elwha Llc Device, system, and method for determining blood pressure in a mammalian subject
GB2528044B (en) * 2014-07-04 2018-08-22 Arc Devices Ni Ltd Non-touch optical detection of vital signs
US8965090B1 (en) * 2014-07-06 2015-02-24 ARC Devices, Ltd Non-touch optical detection of vital signs
US20160073897A1 (en) * 2014-09-13 2016-03-17 ARC Devices, Ltd Non-touch detection of body core temperature
US9854973B2 (en) * 2014-10-25 2018-01-02 ARC Devices, Ltd Hand-held medical-data capture-device interoperation with electronic medical record systems
US9610893B2 (en) * 2015-03-18 2017-04-04 Car1St Technologies, Llc Methods and systems for providing alerts to a driver of a vehicle via condition detection and wireless communications
US11766182B2 (en) * 2015-06-05 2023-09-26 The Arizona Board Of Regents On Behalf Of The University Of Arizona Systems and methods for real-time signal processing and fitting

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10492684B2 (en) 2017-02-21 2019-12-03 Arc Devices Limited Multi-vital-sign smartphone system in an electronic medical records system
US10667688B2 (en) 2017-02-21 2020-06-02 ARC Devices Ltd. Multi-vital sign detector of SpO2 blood oxygenation and heart rate from a photoplethysmogram sensor and respiration rate, heart rate variability and blood pressure from a micro dynamic light scattering sensor in an electronic medical records system
US10602987B2 (en) 2017-08-10 2020-03-31 Arc Devices Limited Multi-vital-sign smartphone system in an electronic medical records system
CN109409367A (en) * 2018-11-02 2019-03-01 四川大学 Infrared image grading identification method based on rock temperature rise characteristics
CN109409367B (en) * 2018-11-02 2021-09-21 四川大学 Infrared image grading identification method based on rock temperature rise characteristics
CN110380430A (en) * 2019-07-01 2019-10-25 浙江大学 Coherent generator identification method based on fuzzy clustering
CN110380430B (en) * 2019-07-01 2021-04-09 浙江大学 Coherent generator identification method based on fuzzy clustering

Also Published As

Publication number Publication date
US9691146B2 (en) 2017-06-27
US20160000328A1 (en) 2016-01-07
US20190005642A1 (en) 2019-01-03
US20160003690A1 (en) 2016-01-07
US9262826B2 (en) 2016-02-16
US9495744B2 (en) 2016-11-15
US9282896B2 (en) 2016-03-15
US9721339B2 (en) 2017-08-01
US10074175B2 (en) 2018-09-11
US20160000335A1 (en) 2016-01-07
GB201411983D0 (en) 2014-08-20
US9501824B2 (en) 2016-11-22
US9478025B2 (en) 2016-10-25
US9881369B2 (en) 2018-01-30
US20160000331A1 (en) 2016-01-07
US20160000334A1 (en) 2016-01-07
US20160004341A1 (en) 2016-01-07
US20160000337A1 (en) 2016-01-07
US20160003679A1 (en) 2016-01-07
US20160035084A1 (en) 2016-02-04
US20160343133A1 (en) 2016-11-24
US20160000381A1 (en) 2016-01-07
US9508141B2 (en) 2016-11-29
US20160022219A1 (en) 2016-01-28
US20160004910A1 (en) 2016-01-07
US9406125B2 (en) 2016-08-02
US9305350B2 (en) 2016-04-05
US9330459B2 (en) 2016-05-03
US20160000327A1 (en) 2016-01-07
US10453194B2 (en) 2019-10-22
US8950935B1 (en) 2015-02-10
US20160000377A1 (en) 2016-01-07
US20160005165A1 (en) 2016-01-07
US9324144B2 (en) 2016-04-26
GB2528044B (en) 2018-08-22
US20160003681A1 (en) 2016-01-07

Similar Documents

Publication Publication Date Title
US10453194B2 (en) Apparatus having a digital infrared sensor
US8965090B1 (en) Non-touch optical detection of vital signs
US20160073903A1 (en) Apparatus for detection of body core temperature based on a cubic relationship
WO2016040540A1 (en) Non-touch detection of body core temperature
EP2997886B1 (en) Non-touch detection of body core temperature
EP2963617A1 (en) Non-touch optical detection of vital signs

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20190516 AND 20190522