WO2012115881A1 - Infrared sensor systems and methods - Google Patents


Info

Publication number
WO2012115881A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
structural objects
thermal
wireless
imaging system
Application number
PCT/US2012/025697
Other languages
French (fr)
Inventor
Thomas W. Rochenski
Tom Scanlon
Nicholas Högasten
Mary L. Deal
Arthur J. McGowan, Jr.
Jeffrey D. Frank
Andrew C. Teich
Original Assignee
Flir Systems, Inc.
Application filed by Flir Systems, Inc.
Priority to EP12710001.4A (published as EP2678842A1)
Publication of WO2012115881A1
Priority to US13/973,968 (published as US20130335550A1)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M3/00: Investigating fluid-tightness of structures
    • G01M3/002: Investigating fluid-tightness of structures by using thermal means
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/30: Transforming light or analogous information into electric information
    • H04N5/33: Transforming infrared radiation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00: Fire alarms; Alarms responsive to explosion
    • G08B17/12: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B21/04: Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407: Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons, based on behaviour analysis
    • G08B21/043: Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons, based on behaviour analysis, detecting an emergency event, e.g. a fall
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B21/04: Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438: Sensor means for detecting
    • G08B21/0476: Cameras to detect unsafe condition, e.g. video cameras
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B21/10: Alarms for ensuring the safety of persons responsive to calamitous events, e.g. tornados or earthquakes

Definitions

  • the present disclosure relates to infrared imaging systems and, in particular, to infrared sensor systems and methods.
  • systems and methods disclosed herein provide for thermal imaging systems and methods, in accordance with one or more embodiments.
  • systems and methods may provide for wireless thermal imaging, which may include a communication component adapted to remotely communicate with a user over a network, one or more wireless thermal image sensors adapted to capture and provide thermal images of structural objects of a structure for monitoring moisture and/or temperature of the structural objects, and a processing component adapted to receive the thermal images of the structural objects from the one or more wireless thermal image sensors, and process the thermal images of the structural objects to generate moisture and/or temperature content information for remote analysis (e.g., of restoration conditions or fire hazard conditions) of the structural objects.
  • the one or more wireless thermal image sensors may include one or more infrared cameras adapted to continuously monitor environmental parameters including one or more of humidity, temperature, and/or moisture associated with the structural objects.
  • the wireless thermal imaging system may include a ruggedized thermal camera system adapted for use as a disaster monitoring camera system to detect and monitor damage from disastrous events including at least one of flooding, fire, explosion, and/or earthquake, and wherein the ruggedized thermal camera system comprises an enclosure that is capable of withstanding disastrous events.
  • the wireless thermal imaging system may include a thermal camera system adapted for use as a safety monitoring system to detect one or more persons in the structure including, for example, one or more fallen persons in the structure.
  • the one or more wireless thermal image sensors may be adapted to monitor one or more conditions of the structure including measuring one or more of moisture, humidity, temperature, and/or ambient conditions of its structural envelope.
  • the wireless thermal imaging system may include wireless sensors including a moisture meter and/or a hygrometer to monitor moisture conditions and provide ambient and/or various types of moisture information related to the structure to the processing component.
  • the infrared imaging system may be adapted to simultaneously monitor multiple structures.
  • the one or more wireless thermal image sensors may be affixed to at least one structural object of the structure to provide a view of one or more other structural objects of the structure.
  • the processing component may be adapted to provide an alarm to remotely notify the user of an emergency (e.g., a disastrous event) related to the structure by setting (or based on) a threshold condition for certain parameters (e.g., with specific moisture or temperature ranges).
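  • As an illustration of the threshold-based remote alarm described above, the following Python sketch (hypothetical names and example threshold values, not taken from the patent) checks moisture and temperature readings derived from the thermal images against user-set ranges and issues a notification when a reading falls outside its range:

        # Hypothetical sketch of the threshold-based remote alarm; names and limits are illustrative.
        from dataclasses import dataclass

        @dataclass
        class SensorReading:
            sensor_id: str
            moisture_pct: float      # moisture content derived from the thermal images
            temperature_c: float     # surface temperature of the structural object

        # Example user-set threshold conditions (not values from the patent).
        MOISTURE_MAX_PCT = 18.0      # above this, restoration is considered incomplete
        TEMPERATURE_MAX_C = 60.0     # above this, a potential fire hazard is flagged

        def check_alarm(reading: SensorReading) -> list:
            """Return alarm messages for readings outside the configured ranges."""
            alarms = []
            if reading.moisture_pct > MOISTURE_MAX_PCT:
                alarms.append(f"{reading.sensor_id}: moisture {reading.moisture_pct:.1f}% exceeds limit")
            if reading.temperature_c > TEMPERATURE_MAX_C:
                alarms.append(f"{reading.sensor_id}: temperature {reading.temperature_c:.1f} C exceeds limit")
            return alarms

        def notify_user(alarms: list) -> None:
            # Stand-in for the remote notification sent to the user over the network.
            for message in alarms:
                print("ALARM:", message)

        if __name__ == "__main__":
            notify_user(check_alarm(SensorReading("wall-sensor-3", moisture_pct=22.4, temperature_c=21.0)))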
  • An infrared camera system may be installed within a public or private facility or area to detect and monitor any persons present.
  • the infrared camera system may be installed within an elder care facility (e.g., senior living facility) or within a daycare facility to monitor persons and detect when assistance may be needed and provide an alert (e.g., a local alarm and/or provide a notification to a designated authority).
  • the infrared camera system may detect when assistance is needed based upon a person's body position (e.g., fallen person), a body temperature (e.g., above or below normal range), and/or total time (e.g., in a stationary position).
  • the infrared camera system may be designed to provide lower resolution images to maintain the personal privacy of the person.
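  • The low-resolution, privacy-preserving imaging described above can be pictured as simple block averaging of a thermal frame, so individuals appear only as coarse warm blobs; the sketch below assumes the frame is available as a NumPy array, and the function name and block size are illustrative only:

        # Illustrative block-averaging downsample for privacy-preserving monitoring.
        import numpy as np

        def downsample_for_privacy(frame: np.ndarray, block: int = 16) -> np.ndarray:
            """Average block x block pixel regions so only coarse warm blobs remain."""
            h, w = frame.shape
            h_trim, w_trim = h - h % block, w - w % block        # crop to a multiple of the block size
            trimmed = frame[:h_trim, :w_trim]
            return trimmed.reshape(h_trim // block, block, w_trim // block, block).mean(axis=(1, 3))

        if __name__ == "__main__":
            full_res = 20.0 + 17.0 * np.random.rand(240, 320)    # synthetic thermal frame (deg C)
            low_res = downsample_for_privacy(full_res, block=16)
            print(full_res.shape, "->", low_res.shape)           # (240, 320) -> (15, 20)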
  • FIG. 1 shows a block diagram illustrating an infrared imaging system for capturing and processing infrared images, in accordance with an embodiment.
  • FIG. 2 shows a method for capturing and processing infrared images, in accordance with an embodiment.
  • FIG. 3 shows a block diagram illustrating an infrared imaging system for monitoring an area, in accordance with an embodiment.
  • FIG. 4 shows a block diagram illustrating a processing flow of an infrared imaging system, in accordance with one or more embodiments.
  • FIGS. 5A-5B show diagrams illustrating various profiles of a person, in accordance with one or more embodiments.
  • FIG. 6 shows a block diagram illustrating a method for capturing and processing infrared images, in accordance with one or more embodiments.
  • FIGS. 7A-7C show block diagrams illustrating methods for operating an infrared imaging system in an emergency mode, in accordance with one or more embodiments.
  • FIG. 8 shows an infrared imaging system adapted for monitoring a structure, in accordance with an embodiment.
  • Infrared imaging systems and methods disclosed herein relate to search, rescue, evacuation, remediation, and/or detection of persons that may be injured (e.g., from a fall) and/or structures that may be damaged due to a disastrous event (emergency), such as an earthquake, explosion, flood, fire, tornado, terrorist attack, etc.
  • Infrared imaging systems and methods disclosed herein may also be used to monitor remediation efforts (e.g., due to water or fire damage), to verify the status or completion of a remediation effort (e.g., that dampness has been remedied), and to determine whether further attention is needed (e.g., a fire has restarted or a potential fire hazard is increasing due to increased temperature readings).
  • Infrared imaging systems and methods disclosed herein may operate autonomously in total or near total darkness, such as at night or during a power outage.
  • a ruggedized infrared imaging system may be adapted to withstand impact of a structural collapse and provide a homing signal to identify locations for retrieval of infrared data and information.
  • a low resolution infrared imaging system may be utilized in places where personal privacy is a concern, such as bedrooms, restrooms, and showers. In some instances, these areas are places where persons often slip and fall and may need assistance.
  • the infrared imaging systems and methods disclosed herein provide an infrared camera capable of imaging in darkness, operating autonomously, retaining video information from an emergency or other disastrous event (e.g., a ruggedized infrared camera), providing an easily identifiable location, and/or protecting personal privacy.
  • the infrared imaging systems and methods disclosed herein may be utilized in senior citizen care facilities, within a person's home, and/or within other public or private facilities to monitor and provide thermal images that may be analyzed to determine if a person needs assistance (e.g., has fallen or is in distress, has an abnormal body temperature, and/or remains in a fixed position for an extended period of time) and/or provide location information for emergency personnel to locate the individual to provide assistance (e.g., during a medical emergency or during a disaster event).
  • the infrared imaging systems and methods disclosed herein may be implemented to monitor remediation efforts, such as directed to water and/or fire damage.
  • the infrared imaging system may provide thermal images for analysis within the infrared imager (e.g., infrared camera) or by a remote processor (e.g., computer) to provide information as to the remediation status.
  • the thermal images may provide information as to the moisture, humidity, and/or temperature status of a structure and whether the structure has sufficiently dried after water damage, such that appropriate remediation personnel may readily determine the remediation status.
  • an infrared imaging system in a ruggedized enclosure, with the capability of operating autonomously, aids first responders (including search and rescue personnel) by identifying images of persons present at the imaged location.
  • the infrared imaging system is adapted to provide a thermal signature of objects in complete darkness and detect objects that are close to skin temperature.
  • first responders, upon locating the infrared imaging system, may extract infrared data and information about persons present at a specific location.
  • infrared imaging system 100 may comprise a rugged thermal imaging camera system to aid first responders and detect fallen persons or persons requiring medical assistance.
  • infrared imaging system 100 may comprise a wireless thermal image monitoring system for disaster restoration monitoring.
  • Infrared imaging system 100 may include a processing component 110, a memory component 120, an image capture component 130, a display component 140, a control component 150, a communication component 152, a power component 154, a mode sensing component 160, a motion sensing component 162, and/or a location component 170.
  • infrared imaging system 100 may include one or more other sensing components 164 including one or more of a seismic activity sensor, a smoke detection sensor, a heat sensor, a water level sensor, a gaseous fume sensor, a radioactivity sensor, etc.
  • infrared imaging system 100 may represent an infrared imaging device, such as an infrared camera, to capture images, such as image 180.
  • Infrared imaging system 100 may represent any type of infrared camera system, which for example may be adapted to detect infrared radiation and provide representative infrared image data (e.g., one or more snapshot images and/or video images).
  • infrared imaging system 100 may represent an infrared camera and/or video camera that is directed to the near, middle, and/or far infrared spectrums to provide thermal infrared image data.
  • Infrared imaging system 100 may include a permanently mounted infrared imaging device and may be implemented, for example, as a security camera and/or coupled, in other examples, to various types of structures (e.g., buildings, bridges, tunnels, etc.). Infrared imaging system 100 may alternatively include a portable infrared imaging device.
  • infrared imaging system 100 may be integrated as part of a non-mobile installation requiring infrared images to be stored and/or displayed.
  • Processing component 110 comprises, in various embodiments, an infrared image processing component and/or an infrared video image processing component.
  • Processing component 110 includes, in one embodiment, a microprocessor, a single-core processor, a multi-core processor, a microcontroller, a logic device (e.g., programmable logic device configured to perform processing functions), a digital signal processing (DSP) device, or some other type of generally known processor, including image processors and/or video processors.
  • Processing component 110 is adapted to interface and communicate with components 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170 to perform method and processing steps as described herein.
  • Processing component 110 may include one or more modules 112A-112N for operating in one or more modes of operation, wherein modules 112A-112N may be adapted to define preset processing and/or display functions that may be embedded in processing component 110 or stored on memory component 120 for access and execution by processing component 110.
  • processing component 110 may be adapted to operate and/or function as a video recorder controller adapted to store recorded video images in memory component 120.
  • processing component 110 may be adapted to perform various types of image processing algorithms and/or various modes of operation, as described herein.
  • each module 112A-112N may be integrated in software and/or hardware as part of processing component 110, or the code (e.g., software or configuration data) for each mode of operation associated with each module 112A-112N may be stored in memory component 120.
  • Code defining modules 112A-112N (i.e., modes of operation) may also be stored on a separate computer-readable medium (e.g., a memory, such as a hard drive, a compact disk, a digital video disk, or a flash memory) for execution by a computer (e.g., a logic or processor-based system).
  • the computer-readable medium may be portable and/or located separate from infrared imaging system 100, with stored modules 112A-112N provided to infrared imaging system 100 by coupling the computer-readable medium to infrared imaging system 100 and/or by infrared imaging system 100 downloading (e.g., via a wired or wireless link) the modules 112A-112N from the computer-readable medium (e.g., containing the non-transitory information).
  • modules 112A-112N provide for improved infrared camera processing techniques for real time applications, wherein a user or operator may change a mode of operation depending on a particular application, such as monitoring seismic activity, monitoring workplace safety, monitoring disaster restoration, etc.
  • the other sensing components 164 may include one or more of a seismic activity sensor, a smoke detection sensor, a heat sensor, a water level sensor, a humidity sensor, a gaseous fume sensor, a radioactivity sensor, etc. for sensing disastrous events, such as earthquakes, explosions, fires, gas fumes, gas leaks, nuclear meltdowns, etc.
  • modules 112A-112N may be utilized by infrared imaging system 100 to perform one or more different modes of operation including a standard mode of operation, a person detection mode of operation, a fallen person mode of operation, an emergency mode of operation, and a black box mode of operation.
  • One or more of these modes of operation may be utilized for work and safety monitoring, disaster monitoring, restoration monitoring, and/or remediation progress monitoring. The modes of operation are described in greater detail herein.
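  • One way to picture the per-mode modules 112A-112N is a dispatch table that maps a selected mode of operation to a processing routine; this is only a sketch of the idea, and the mode names and stub functions are assumptions rather than the patent's implementation:

        # Hypothetical sketch of the mode-of-operation modules (112A-112N) as a dispatch table.
        from typing import Callable, Dict

        def standard_mode(frame) -> str:
            return "standard: frame processed for display"

        def person_detection_mode(frame) -> str:
            return "person detection: scene searched for persons"

        def fallen_person_mode(frame) -> str:
            return "fallen person: profiles compared against a fallen-person profile"

        def emergency_mode(frame) -> str:
            return "emergency: alert generated and homing beacon enabled"

        def black_box_mode(frame) -> str:
            return "black box: frame recorded to non-volatile storage"

        MODE_MODULES: Dict[str, Callable] = {
            "standard": standard_mode,
            "person_detection": person_detection_mode,
            "fallen_person": fallen_person_mode,
            "emergency": emergency_mode,
            "black_box": black_box_mode,
        }

        def process(frame, mode: str) -> str:
            """Run the module registered for the selected mode of operation."""
            return MODE_MODULES[mode](frame)

        if __name__ == "__main__":
            print(process(frame=None, mode="fallen_person"))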
  • Memory component 120 includes, in one embodiment, one or more memory devices to store data and information, including infrared image data and information and infrared video image data and information.
  • the one or more memory devices may include various types of memory for infrared image and video image storage including volatile and non-volatile memory devices, such as RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically-Erasable Read-Only Memory), flash memory, etc.
  • processing component 110 is adapted to execute software stored on memory component 120 to perform various methods, processes, and modes of operation in a manner as described herein.
  • Image capture component 130 includes, in one embodiment, one or more infrared sensors (e.g., any type of infrared detector, such as a focal plane array) for capturing infrared image signals representative of an image, such as image 180.
  • the infrared sensors may be adapted to capture infrared video image signals representative of an image, such as image 180.
  • the infrared sensors of image capture component 130 provide for representing (e.g., converting) a captured image signal of image 180 as digital data (e.g., via an analog-to-digital converter included as part of the infrared sensor or separate from the infrared sensor as part of infrared imaging system 100).
  • Processing component 110 may be adapted to receive infrared image signals from image capture component 130, process infrared image signals (e.g., to provide processed image data), store infrared image signals or image data in memory component 120, and/or retrieve stored infrared image signals from memory component 120. Processing component 110 may be adapted to process infrared image signals stored in memory component 120 to provide image data (e.g., captured and/or processed infrared image data) to display component 140 for viewing by a user.
  • Display component 140 includes, in one embodiment, an image display device (e.g., a liquid crystal display (LCD)) or various other types of generally known video displays or monitors.
  • Processing component 110 may be adapted to display image data and information on display component 140.
  • Processing component 110 may be adapted to retrieve image data and information from memory component 120 and display any retrieved image data and information on display component 140.
  • Display component 140 may include display electronics, which may be utilized by processing component 110 to display image data and information (e.g., infrared images).
  • Display component 140 may receive image data and information directly from image capture component 130 via processing component 110, or the image data and information may be transferred from memory component 120 via processing component 110.
  • processing component 110 may initially process a captured image and present a processed image in one mode, corresponding to modules 112A-112N, and then upon user input to control component 150, processing component 110 may switch the current mode to a different mode for viewing the processed image on display component 140 in the different mode. This switching may be referred to as applying the infrared camera processing techniques of modules 112A-112N for real time applications, wherein a user or operator may change the mode while viewing an image on display component 140 based on user input to control component 150.
  • display component 140 may be remotely positioned, and processing component 110 may be adapted to remotely display image data and information on display component 140 via wired or wireless communication with display component 140.
  • Control component 150 includes, in one embodiment, a user input and/or interface device having one or more user actuated components.
  • The user-actuated components may include one or more push buttons, slide bars, rotatable knobs, and/or a keyboard adapted to generate one or more user-actuated input control signals.
  • Control component 150 may be adapted to be integrated as part of display component 140 to function as both a user input device and a display device, such as, for example, a touch screen device adapted to receive input signals from a user touching different parts of the display screen.
  • Processing component 110 may be adapted to sense control input signals from control component 150 and respond to any sensed control input signals received therefrom.
  • Control component 150 may include, in one embodiment, a control panel unit (e.g., a wired or wireless handheld control unit) having one or more user-activated mechanisms (e.g., buttons, knobs, sliders, etc.) adapted to interface with a user and receive user input control signals.
  • the one or more user-activated mechanisms of the control panel unit may be utilized to select between the various modes of operation, as described herein in reference to modules 112A-112N.
  • the control panel unit may be adapted to include one or more other user-activated mechanisms to provide various other control functions of infrared imaging system 100, such as auto-focus, menu enable and selection, field of view (FoV), brightness, contrast, gain, offset, spatial, temporal, and/or various other features and/or parameters.
  • a variable gain signal may be adjusted by the user or operator based on a selected mode of operation.
  • control component 150 may include a graphical user interface (GUI), which may be integrated as part of display component 140 (e.g., a user actuated touch screen), having one or more images of the user-activated mechanisms (e.g., buttons, knobs, sliders, etc.), which are adapted to interface with a user and receive user input control signals via the display component 140.
  • Communication component 152 may include, in one embodiment, a network interface component (NIC) adapted for wired and/or wireless communication with a network including other devices in the network.
  • communication component 152 may include a wireless communication component, such as a wireless local area network (WLAN) component based on the IEEE 802.11 standards, a wireless broadband component, mobile cellular component, a wireless satellite component, or various other types of wireless communication components including radio frequency (RF), microwave frequency (MWF), and/or infrared frequency (IRF) components, such as wireless transceivers, adapted for communication with a wired and/or wireless network.
  • communication component 152 may include an antenna coupled thereto for wireless communication purposes.
  • the communication component 152 may be adapted to interface with a wired network via a wired communication component, such as a DSL (e.g., Digital Subscriber Line) modem, a PSTN (Public Switched Telephone Network) modem, an Ethernet device, and/or various other types of wired and/or wireless network communication devices adapted for communication with a wired and/or wireless network.
  • a wired communication component such as a DSL (e.g., Digital Subscriber Line) modem, a PSTN (Public Switched Telephone Network) modem, an Ethernet device, and/or various other types of wired and/or wireless network communication devices adapted for communication with a wired and/or wireless network.
  • Communication component 152 may be adapted to transmit and/or receive one or more wired and/or wireless video feeds.
  • the network may be implemented as a single network or a combination of multiple networks.
  • the network may include the Internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks.
  • the network may include a wireless telecommunications network (e.g., cellular phone network) adapted to communicate with other communication networks, such as the Internet.
  • the infrared imaging system 100 may be associated with a particular network link such as for example a URL (Uniform Resource Locator), an IP (Internet Protocol) address, and/or a mobile phone number.
  • Power component 154 comprises a power supply or power source adapted to provide power to infrared imaging system 100 including each of the components 110, 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170.
  • Power component 154 may comprise various types of power storage devices, such as a battery, or a power interface component that is adapted to receive external power and convert the received external power to a useable power for infrared imaging system 100 including each of the components 110, 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170.
  • Mode sensing component 160 may be optional. Mode sensing component 160 may include, in one embodiment, an application sensor adapted to automatically sense a mode of operation, depending on the sensed application (e.g., the intended use of the infrared imaging system 100).
  • the application sensor may include a mechanical triggering mechanism (e.g., a clamp, clip, hook, switch, push-button, etc.), an electronic triggering mechanism (e.g., an electronic switch, push-button, electrical signal, electrical connection, etc.), an electromechanical triggering mechanism, an electro-magnetic triggering mechanism, or some combination thereof.
  • mode sensing component 160 senses a mode of operation corresponding to the intended application of the infrared imaging system 100 based on the type of mount (e.g., accessory or fixture) to which a user has coupled the infrared imaging system 100 (e.g., image capture component 130).
  • the mode of operation may be provided via control component 150 by a user of infrared imaging system 100.
  • Mode sensing component 160 may include a mechanical locking mechanism adapted to secure the infrared imaging system 100 to a structure or part thereof and may include a sensor adapted to provide a sensing signal to processing component 110 when the infrared imaging system 100 is mounted and/or secured to the structure.
  • Mode sensing component 160 in one embodiment, may be adapted to receive an electrical signal and/or sense an electrical connection type and/or mount type and provide a sensing signal to processing component 110.
  • Processing component 110 may be adapted to communicate with mode sensing component 160 (e.g., by receiving sensor information from mode sensing component 160) and image capture component 130 (e.g., by receiving data and information from image capture component 130 and providing and/or receiving command, control, and/or other information to and/or from other components of infrared imaging system 100).
  • mode sensing component 160 may be adapted to provide data and information relating to various system applications including various coupling implementations associated with various types of structures (e.g., buildings, bridges, tunnels, vehicles, etc.).
  • mode sensing component 160 may include communication devices that relay data and information to processing component 110 via wired and/or wireless communication.
  • mode sensing component 160 may be adapted to receive and/or provide information through a satellite, through a local broadcast transmission (e.g., radio frequency), through a mobile or cellular network, and/or through information beacons in an infrastructure (e.g., a transportation or highway information beacon infrastructure) or various other wired and/or wireless techniques.
  • Motion sensing component 162 includes, in one embodiment, a motion detection sensor adapted to automatically sense motion or movement and provide related information to processing component 110.
  • motion sensing component 162 may include an accelerometer, a gyroscope, an inertial measurement unit (IMU), etc., to detect motion of infrared imaging system 100 (e.g., to detect an earthquake).
  • the motion detection sensor may be adapted to detect motion or movement by measuring change in speed or vector of an object or objects in a field of view, which may be achieved by mechanical techniques physically interacting within the field of view or by electronic techniques adapted to quantify and measure changes in the environment.
  • Some methods by which motion or movement may be electronically identified include optical detection and acoustical detection.
  • image capturing system 100 may include one or more other sensing components 164, including environmental and/or operational sensors, depending on application or implementation, which provide information to processing component 110 by receiving sensor information from each sensing component 164.
  • other sensing components 164 may be adapted to provide data and information related to environmental conditions, such as internal and/or external temperature conditions, lighting conditions (e.g., day, night, dusk, and/or dawn), humidity levels, specific weather conditions (e.g., sun, rain, and/or snow), distance (e.g., laser rangefinder), and/or whether a tunnel, a covered parking garage, or some type of structure or enclosure is detected.
  • other sensing components 164 may include one or more conventional sensors as known by those skilled in the art for monitoring various conditions (e.g., environmental conditions) that may have an effect (e.g., on the image appearance) on the data and information provided by image capture component 130.
  • each sensing component 164 may include devices that relay information to processing component 110 via wireless communication.
  • each sensing component 164 may be adapted to receive information from a satellite, through a local broadcast (e.g., radio frequency) transmission, through a mobile or cellular network and/or through information beacons in an infrastructure (e.g., a transportation or highway information beacon infrastructure), and/or various other wired and/or wireless techniques in accordance with one or more embodiments.
  • Location component 170 includes, in one embodiment, a beacon signaling device adapted to provide a homing beacon signal for location discovery of the infrared imaging system 100.
  • the homing beacon signal may utilize a radio frequency (RF) signal, microwave frequency (MWF) signal, and/or various other wireless frequency signals in accordance with embodiments.
  • location component 170 may utilize an antenna coupled thereto for wireless communication purposes.
  • processing component 110 may be adapted to interface with location component 170 to transmit the homing beacon signal in the event of an emergency or disastrous event.
  • one or more components 110, 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170 of image capturing system 100 may be combined and/or implemented or not, as desired or depending on application requirements, with image capturing system 100 representing various functional blocks of a system.
  • processing component 110 may be combined with memory component 120, image capture component 130, display component 140, and/or mode sensing component 160.
  • processing component 110 may be combined with image capture component 130 with only certain functions of processing component 110 performed by circuitry (e.g., processor, logic device, microprocessor, microcontroller, etc.) within image capture component 130.
  • control component 150 may be combined with one or more other components or be remotely connected to at least one other component, such as processing component 110, via a wired or wireless control device so as to provide control signals thereto.
  • FIG. 2 shows a method 200 illustrating a process flow for capturing and processing infrared images, in accordance with an embodiment.
  • image capturing system 100 of FIG. 1 is an example of a system, device, or apparatus that may perform method 200.
  • in block 210, processing component 110 controls (e.g., causes) image capture component 130 to capture one or more images (e.g., infrared image signals comprising infrared image data, including video data), such as, for example, image 180 and/or a video image of image 180.
  • processing component 110 may be adapted to optionally store captured images (block 214) in memory component 120 for processing.
  • the one or more captured images may be pre-processed (block 218).
  • pre-processing may include obtaining infrared sensor data related to the captured images, applying correction terms, and applying noise reduction techniques to improve image quality prior to further processing as would be understood by one skilled in the art.
  • processing component 110 may directly pre-process the captured images or optionally retrieve captured images stored in memory component 120 and then pre-process the images.
  • pre-processed images may be optionally stored in memory component 120 for further processing.
  • a mode of operation may be determined (block 222), and one or more captured and/or preprocessed images may be processed according to the determined mode of operation (block 226).
  • the mode of operation may be determined before or after the images are captured and/or preprocessed (blocks 210 and 218), depending upon the types of infrared detector settings (e.g., biasing, frame rate, signal levels, etc.), processing algorithms and techniques, and related configurations.
  • a mode of operation may be defined by mode sensing component 160, wherein an application sensing portion of mode sensing component 160 may be adapted to automatically sense the mode of operation, and depending on the sensed application, mode sensing component 160 may be adapted to provide related data and/or information to processing component 110.
  • the mode of operation may be manually set by a user via display component 140 and/or control component 150 without departing from the scope of the present disclosure.
  • processing component 110 may communicate with display component 140 and/or control component 150 to obtain the mode of operation as provided (e.g., input) by a user.
  • the modes of operation may include the use of one or more infrared image processing algorithms and/or image processing techniques.
  • the modes of operation refer to processing and/or display functions of infrared images, wherein for example an infrared imaging system is adapted to process infrared sensor data prior to displaying the data to a user.
  • infrared image processing algorithms are utilized to present an image under a variety of conditions, and the infrared image processing algorithms provide the user with one or more options to tune parameters and operate the infrared imaging system in an automatic mode or a manual mode.
  • the modes of operation are provided by infrared imaging system 100, and the concept of image processing for different use conditions may be implemented in various types of structure applications and resulting use conditions.
  • the modes of operation may include a standard mode of operation, a person detection mode of operation, a fallen or distressed person mode of operation, an emergency mode of operation, and/or a black box mode of operation.
  • One or more of these modes of operation may be utilized for work and safety monitoring, disaster monitoring, restoration monitoring, and/or remediation progress monitoring.
  • one or more of sensing components 160, 162, 164 may be utilized to determine a mode of operation.
  • mode sensing component 160 may be adapted to interface with motion sensing component 162 and one or more other sensing components 164 to assist with a determination of a mode of operation.
  • the other sensing components 164 may include one or more of a seismic activity sensor, a smoke detection sensor, a heat sensor, a water level sensor, a moisture sensor, a temperature sensor, a humidity sensor, a gaseous fume sensor, a radioactivity sensor, etc. for sensing disastrous events, such as earthquakes, explosions, fires, gas fumes, gas leaks, nuclear events, etc.
  • the one or more images may be stored (block 230, i.e., after processing or prior to processing) and optionally displayed (block 234). Additionally, further processing may be optionally performed depending on application or implementation.
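  • The capture, pre-process, mode-dependent process, store, and display steps of method 200 (blocks 210-234) can be chained as a small pipeline; the sketch below is a simplified assumption of how those blocks fit together, with the correction and noise-reduction steps left as stubs:

        # Simplified sketch of method 200 (blocks 210-234); all functions are stand-ins.
        import numpy as np

        def capture_frame() -> np.ndarray:
            # Block 210: stand-in for image capture component 130 (synthetic thermal frame, deg C).
            return 22.0 + 15.0 * np.random.rand(120, 160)

        def preprocess(frame: np.ndarray) -> np.ndarray:
            # Block 218: correction terms and noise reduction would be applied here (left as a stub).
            return frame

        def process_by_mode(frame: np.ndarray, mode: str) -> np.ndarray:
            # Block 226: mode-dependent processing (a crude warm-region map for person detection).
            if mode == "person_detection":
                return (frame > 30.0).astype(float)
            return frame

        storage = []  # stand-in for memory component 120

        def run_once(mode: str, display: bool = False) -> None:
            frame = preprocess(capture_frame())      # blocks 210 and 218
            result = process_by_mode(frame, mode)    # blocks 222 and 226
            storage.append(result)                   # block 230: store
            if display:                              # block 234: optional display
                print("mode:", mode, "- warm fraction:", float(result.mean()))

        if __name__ == "__main__":
            run_once("person_detection", display=True)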
  • images may be displayed in a night mode, wherein the processing component 110 may be adapted to configure display component 140 to apply a night color palette to the images for display in night mode.
  • In night mode, an image may be displayed in a red palette or a green palette to improve night vision capacity (e.g., to minimize night vision degradation) for a user.
  • processing component 110 may be adapted to configure display component 140 to apply a non-night mode palette (e.g., black hot or white hot palette) to the images for display via display component 140.
  • a non-night mode palette e.g., black hot or white hot palette
  • processing component 110 may store any of the images, processed or otherwise, in memory component 120. Accordingly, processing component 110 may, at any time, retrieve stored images from memory component 120 and display retrieved images on display component 140 for viewing by a user.
  • the night mode of displaying images refers to using a red color palette or green color palette to assist the user or operator in the dark when adjusting to low light conditions.
  • human visual capacity to see in the dark may be impaired by the blinding effect of a bright image on a display monitor.
  • the night mode changes the color palette from a standard black hot or white hot palette to a red or green color palette display.
  • the red or green color palette is known to interfere less with human night vision capability.
  • the green and blue pixels may be disabled to boost red color for a red color palette.
  • the night mode display may be combined with any other mode of operation of infrared imaging system 100, and a default display mode of infrared imaging system 100 at night may be the night mode display.
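  • As a rough illustration of the night-mode display described above, the sketch below maps a normalized thermal image into a red-only palette by leaving the green and blue channels at zero; the rendering choices are assumptions rather than the patent's palette definition:

        # Illustrative night-mode display: map a normalized thermal image to a red-only palette.
        import numpy as np

        def to_red_palette(thermal: np.ndarray) -> np.ndarray:
            """Return an HxWx3 uint8 RGB image that uses only the red channel."""
            t_min, t_max = float(thermal.min()), float(thermal.max())
            normalized = (thermal - t_min) / (t_max - t_min + 1e-6)    # scale to 0..1
            rgb = np.zeros(thermal.shape + (3,), dtype=np.uint8)
            rgb[..., 0] = (normalized * 255).astype(np.uint8)          # red channel carries the image
            return rgb                                                 # green and blue stay 0 (disabled)

        if __name__ == "__main__":
            frame = 20.0 + 15.0 * np.random.rand(120, 160)
            image = to_red_palette(frame)
            print(image.shape, int(image[..., 1].max()), int(image[..., 2].max()))  # (120, 160, 3) 0 0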
  • processing component 110 may switch the processing mode of a captured image in real time and change the displayed processed image from one mode, corresponding to modules 112A-112N, to a different mode upon receiving input from mode sensing component 160 and/or user input from control component 150. As such, processing component 110 may switch a current mode of display to another different mode of display for viewing the processed image by the user or operator on display component 140 depending on the input received from mode sensing component 160 and/or user input from control component 150.
  • This switching may be referred to as applying the infrared camera processing techniques of modules 112A-112N for real time applications, wherein the displayed mode may be switched while viewing an image on display component 140 based on the input received from mode sensing component 160 and/or user input received from control component 150.
  • FIG. 3 shows a block diagram illustrating an infrared imaging system 300 for monitoring an area, in accordance with an embodiment.
  • infrared imaging system 300 may comprise a rugged thermal imaging camera system for utilization as a disaster camera and/or workplace safety monitoring to aid first responders and/or detect fallen persons.
  • infrared imaging system 300 may comprise a wireless thermal imaging system and/or a wireless thermal image monitoring system for disaster and/or restoration monitoring.
  • infrared imaging system 300 may comprise an enclosure 302 (e.g., a highly ruggedized protective housing), a processing component 310 (e.g., a video processing device having a module for detecting a fallen person, emergency, disastrous event, etc.), a memory component 320 (e.g., video storage, recording unit, flash drive, etc.), an image capture component 330 (e.g., a radiometrically calibrated thermal camera), a communication component 352 (e.g., a transceiver having wired and/or wireless communication capability), a first power component 354A (e.g., a battery), a second power component 354B (e.g., a power interface receiving external power via a power cable 356), a motion sensing component 362 (e.g., a sensor sensitive to motion or movement, such as an accelerometer), and a location component 370 (e.g., a homing beacon signal generator).
  • Infrared imaging system 300 may further include other types of sensors, as discussed herein, such as a temperature sensor, a humidity sensor, and/or a moisture sensor.
  • the system 300 may be adapted to provide a live video feed of thermal video captured with image capture component 330 through a wired cable link 358 or wireless communication link 352. Captured video images may be utilized for surveillance operations.
  • the system 300 may be adapted to automatically detect a fallen person or a person in need of assistance (e.g., based on body temperature, location, body position, and/or motionless for a period of time).
  • the fallen person detection system utilizes the image capture component 330 as a radiometrically calibrated thermal imager.
  • the system 300 may be securely mounted to a structure 190 via an adjustable mounting component 192 (e.g., fixed or moveable, such as a pan/tilt or other motion control device) so that the imaging component 330 may be tilted to peer down on persons 304a, 304b within a field of view (FOV) 332.
  • radiometric calibration allows the system 300 to detect objects (e.g., persons 304a, 304b) at or close to skin temperature, such as between 80°F and 110°F.
  • the processing component 310 utilizes a person detection module 312B (i.e., module 112B) to determine or provide awareness of whether one or more persons are present in the scene, such as persons 304a, 304b. If at least one person is present, then the system 300 may be adapted to operate in emergency mode 312A (e.g., module 112A), which may be triggered by motion sensor 362.
  • the processing component 310 may encode person detection information into a homing beacon signal, which may be generated from location device 370. In one aspect, the person detection information may aid search and rescue personnel in their efforts to prioritize search and rescue operations.
  • the system 300 may be enclosed in a ruggedized protective housing 302 built such that even after severe impact from a disastrous event, the non-volatile memory 320, which stores recorded video images, may be extracted in an intact state.
  • An internal battery 354 allows the system 300 to operate after loss of external power via cable 356 for some period of time. Even if the system optics and video processing electronics are rendered useless as a result of a catastrophic event, power from internal battery 354 may be provided to location device 370 so that a homing beacon signal may be generated and transmitted to assist search and rescue personnel with locating the system 300.
  • FIG. 4 shows a block diagram illustrating a process flow 400 of an infrared imaging system, in accordance with one or more embodiments.
  • system 100 of FIG. 1 and/or system 300 in FIG. 3 may be utilized to perform method 400.
  • a data capture component 412 (e.g., processing component 310 of system 300) is adapted to extract frames of thermal imagery from a thermal infrared sensor 410 (e.g., image capture component 330 of system 300).
  • the captured image including data and information thereof, may be normalized, for example, to an absolute temperature scale by a radiometric normalization module 414 (e.g., a module utilized by the processing component 310 of system 300).
  • a person detection module 416 (e.g., a module utilized by the processing component 310 of system 300) is adapted to operate on the radiometric image to localize persons present in the scene (e.g., FOV 332).
  • a fallen person detection module 418 may be adapted to discriminate between upright persons (e.g., standing or walking persons) and fallen persons.
  • the module may be adapted to discriminate based on other parameters, such as time, location, and/or temperature differential.
  • process flow 400 may be used to monitor persons and detect when assistance may be needed and provide an alert (e.g., a local alarm and/or provide a notification to a designated authority).
  • process flow 400 (e.g., person detection module 416) may detect when assistance is needed based upon a person's body position (e.g., fallen person), body temperature (e.g., above or below normal range), and/or total time (e.g., in a stationary position).
  • data and information about coordinates of persons (e.g., fallen and not fallen) and the radiometrically normalized or non-normalized image may be passed to a conversion module 420 (e.g., a module utilized by the processing component 310 of system 300).
  • the conversion module 420 may be adapted to scale the image such that the image fits the dynamic range of a display and may encode the positions of persons and fallen persons in the image, for example, by color coding the locations.
  • the converted and potentially color coded image may be compressed 422 by some standard video compression algorithm or technique so as to reduce memory storage capacity of the extractable video storage component 424 (e.g., the memory component 320 of system 300).
  • a command may be given to the system 300 by a user or the processing component 310 to transmit stored video data and information of the extractable video storage component 424 over a wired video link 426 and/or wireless video link 428 via an antenna 430.
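  • To make the conversion step concrete, the following sketch scales a radiometric frame to an 8-bit display range and color codes detected person locations, roughly in the spirit of conversion module 420; the scaling approach and color values are assumptions for illustration:

        # Sketch of the conversion step (module 420): scale a radiometric frame to the display's
        # 8-bit dynamic range and color code person locations. Colors and boxes are assumed.
        import numpy as np

        def scale_to_display(radiometric: np.ndarray) -> np.ndarray:
            """Linearly map temperatures to 0..255 for display."""
            t_min, t_max = float(radiometric.min()), float(radiometric.max())
            return ((radiometric - t_min) / (t_max - t_min + 1e-6) * 255).astype(np.uint8)

        def encode_positions(gray: np.ndarray, person_boxes, fallen_boxes) -> np.ndarray:
            """Color code upright persons green and fallen persons red on an RGB copy."""
            rgb = np.stack([gray, gray, gray], axis=-1)
            for (r0, c0, r1, c1) in person_boxes:
                rgb[r0:r1, c0:c1, 1] = 255               # green marks detected persons
            for (r0, c0, r1, c1) in fallen_boxes:
                rgb[r0:r1, c0:c1, 0] = 255               # red marks fallen persons
                rgb[r0:r1, c0:c1, 1] = 0
            return rgb

        if __name__ == "__main__":
            frame = 20.0 + 17.0 * np.random.rand(120, 160)
            display = encode_positions(scale_to_display(frame), [(10, 10, 40, 25)], [(80, 60, 95, 110)])
            print(display.shape)                         # (120, 160, 3)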
  • in standard operation, the system operates as a thermal imaging device producing a video stream representing the thermal signature of a scene (e.g., FOV 332).
  • the video images produced may be stored in a circular frame buffer in non-volatile memory (e.g., memory component 320 of system 300) in a compressed format so as to store a significant amount of video.
  • any length of video may be stored without departing from the scope of the present embodiments.
  • the type of extractable memory module used and the compression ratio may affect the amount of available memory storage as understood by someone skilled in the art.
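  • The circular frame buffer described above can be sketched with a fixed-capacity deque that retains only the most recent compressed frames; the capacity and compression method below are placeholders, since the patent leaves the amount of storage and the compression ratio open:

        # Illustrative circular frame buffer: only the most recent N compressed frames are retained.
        import zlib
        from collections import deque

        import numpy as np

        class CircularVideoBuffer:
            def __init__(self, capacity_frames: int = 1000):
                self.frames = deque(maxlen=capacity_frames)   # oldest frames drop off automatically

            def record(self, frame: np.ndarray) -> None:
                # Compress the frame (placeholder codec) and append it to the buffer.
                self.frames.append(zlib.compress(frame.astype(np.float32).tobytes()))

            def __len__(self) -> int:
                return len(self.frames)

        if __name__ == "__main__":
            buffer = CircularVideoBuffer(capacity_frames=5)
            for _ in range(8):                                # record more frames than the capacity
                buffer.record(20.0 + np.random.rand(120, 160))
            print(len(buffer))                                # 5: only the most recent frames remain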
  • a processing unit (e.g., processing component 310 of system 300) processing the thermal video stream may be adapted to detect the presence of persons and/or animals.
  • the system (e.g., system 300 of FIG. 3) may be set to a PERSON_PRESENT mode, wherein person detection information may be utilized during normal operation as is achieved, for example, in standard video analytics software to generate an alert of potential intrusion.
  • the camera may retain the PERSON_PRESENT mode even when disconnected from main power and video network.
  • a background model of the scene (e.g., FOV 332) may be constructed. This may be considered standard procedure in video analytics applications.
  • the exemplary background model may utilize an average of a time series of values for a given pixel. Because of the lack of shadows and general insensitivity to changing lighting conditions, background modeling may be more effective and less prone to false alarms with thermal imaging sensors.
  • regions of the image that differ from the background model may be identified. In the instance of a time series average as a background model, the background may be subtracted from the current captured video frame, and the difference may be thresholded to find one or more ROI (Region Of Interest) corresponding to areas of greatest change.
  • a detected ROI may indicate the presence of a person.
  • a radiometrically calibrated thermal camera (e.g., system 300 of FIG. 3) may be utilized, which may allow the fallen person detection module 418 to access absolute temperature values for the ROI.
  • the ROI includes at least some areas with temperatures close to body temperature, and if the ROI is of size that may match the profile of a person imaged from the specific camera location, a person may be determined to be present in the captured image.
  • the system 300 may be set to PERSON_PRESENT mode.
  • a user set time constant may determine the length of time that the system 300 may stay in the PERSON_PRESENT mode after the last detection of a person. For instance, the system 300 may stay in the PERSON_PRESENT mode for 10 seconds after the last detection of a person.
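  • A condensed sketch of the person-detection steps above, under stated assumptions (a running-average background model, a simple difference threshold, an absolute temperature check near body temperature, and a PERSON_PRESENT timeout); the thresholds and timeout value are illustrative, not values from the patent:

        # Sketch of the person-detection steps above: running-average background model, difference
        # thresholding to find an ROI, a body-temperature check, and a PERSON_PRESENT timeout.
        import numpy as np

        class PersonDetector:
            def __init__(self, diff_threshold_c=4.0, body_temp_range_c=(27.0, 40.0),
                         min_roi_pixels=50, present_timeout_s=10.0, alpha=0.02):
                self.background = None                   # per-pixel running-average background model
                self.alpha = alpha                       # background update rate
                self.diff_threshold_c = diff_threshold_c
                self.body_temp_range_c = body_temp_range_c
                self.min_roi_pixels = min_roi_pixels
                self.present_timeout_s = present_timeout_s
                self.last_detection_time = None

            def update(self, frame: np.ndarray, now: float) -> bool:
                """Return True while the system should remain in PERSON_PRESENT mode."""
                if self.background is None:
                    self.background = frame.astype(float).copy()
                diff = frame - self.background                            # subtract the background
                roi = diff > self.diff_threshold_c                        # threshold the difference
                low, high = self.body_temp_range_c
                warm_roi = roi & (frame >= low) & (frame <= high)         # areas near body temperature
                if warm_roi.sum() >= self.min_roi_pixels:                 # plausible person-sized ROI
                    self.last_detection_time = now
                self.background += self.alpha * (frame - self.background) # update the running average
                return (self.last_detection_time is not None and
                        now - self.last_detection_time <= self.present_timeout_s)

        if __name__ == "__main__":
            detector = PersonDetector()
            empty_scene = 21.0 * np.ones((120, 160))
            detector.update(empty_scene, now=0.0)                          # empty scene builds the model
            scene = empty_scene.copy(); scene[50:80, 70:90] = 33.0         # warm blob appears
            print(detector.update(scene, now=1.0))                         # True: PERSON_PRESENT
            print(detector.update(empty_scene, now=20.0))                  # False: timeout elapsed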
  • a processing unit (e.g., processing component 310 of system 300) processing the thermal video stream may be adapted to discriminate between an upright person (e.g., standing or walking person) and a fallen person.
  • upon detecting a fallen person, the system (e.g., system 300 of FIG. 3) may generate an alarm, and the alarm may be encoded into the video or transmitted via a wired and/or wireless communication link.
  • a thermal imaging system (e.g., system 300 of FIG. 3) may be mounted at an elevated location, such as the ceiling, and may be pointed or tilted in such a manner that the system observes the scene (e.g., FOV 332) from an angle close to 180° (e.g., as shown in FIG. 3, with the depicted angle being close to 180°).
  • accordingly, the profile of a standing person (e.g., person 304b) and the profile of a fallen person (e.g., person 304a) in the scene (e.g., FOV 332) appear different to the infrared imaging system 300.
  • the standing person 304b has, in relative terms, a smaller profile than the fallen person 304a having a larger profile.
  • the approximate size (e.g., profile size based on the number of measured pixels) of a standing or fallen person, relative to the total size of the image, may be determined based on an approximate distance to the ground (or floor) relative to the thermal imaging system.
  • This approximate distance may be provided to the system by an operator (e.g., via a wired or wireless communication link), may be determined based on the focus position, may be measured using a distance measuring sensor (e.g., a laser range finder), or may be determined by analyzing statistical properties of objects moving relative to the background (e.g., analysis performed by the thermal image camera or by a remote processor coupled to or formed as part of the thermal imaging system).
  • FIG. 5A shows a first profile 500 of an upright person (e.g., standing or walking person, such as person 304b).
  • FIG. 5B shows a second profile 502 of a fallen person (e.g., such as person 304a).
  • the first profile of the upright person is smaller than the second profile of the fallen person.
  • the difference between the upright person and the fallen person represents a change in aspect of a person, such as the vertical and/or horizontal aspect of the person.
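  • The profile comparison of FIGS. 5A-5B can be sketched as follows: seen from overhead at a known approximate mounting height, a fallen person covers a noticeably larger image area than an upright person, so the detected ROI pixel count can be compared against an expected upright-profile size. The pinhole-camera size model and the ratio threshold below are assumptions for illustration:

        # Illustrative upright-vs-fallen discrimination from overhead thermal imagery: compare the
        # detected ROI area against the expected footprint of an upright person at a known distance.
        def expected_upright_pixels(distance_m: float, focal_px: float = 160.0,
                                    shoulder_width_m: float = 0.45, depth_m: float = 0.30) -> float:
            """Rough pinhole-camera estimate of the overhead footprint of an upright person."""
            width_px = focal_px * shoulder_width_m / distance_m
            depth_px = focal_px * depth_m / distance_m
            return width_px * depth_px

        def classify_profile(roi_pixels: int, distance_m: float, fallen_ratio: float = 3.0) -> str:
            """Label the ROI 'fallen' when it is much larger than an upright footprint."""
            if roi_pixels >= fallen_ratio * expected_upright_pixels(distance_m):
                return "fallen"
            return "upright"

        if __name__ == "__main__":
            ceiling_height_m = 2.6                       # provided by an operator or a rangefinder
            print(classify_profile(roi_pixels=350, distance_m=ceiling_height_m))    # small blob -> upright
            print(classify_profile(roi_pixels=2500, distance_m=ceiling_height_m))   # large blob -> fallen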
  • detection of a fallen person may utilize low resolution radiometry and/or thermal imagery, wherein persons may be imaged as warm blobs that are monitored for their presence, movement, and safety. For example, if someone is detected as fallen, a caregiver may be notified to provide assistance to the fallen person.
  • the infrared imaging system 300 may be equipped with autonomous two-way audio so that a caregiver may remotely, bi-directionally communicate with a fallen person, if deemed necessary.
  • the person detection mode 416 and/or the fallen person mode 418 provide awareness to the infrared imaging system 300 as to whether one or more persons are present in the scene (e.g., FOV 332).
  • the system 300 may be adapted to operate in emergency mode 440, which may be triggered by a motion or movement sensor 442 (e.g., motion sensing component 362).
  • the processing component 310 may be adapted to encode person detection information into a communication signal and transmit the communication signal over a network via, for example, a radio frequency (RF) transceiver 444 (e.g., wireless communication component 352) having an antenna 446 (or via antenna 430).
  • FIG. 6 shows a block diagram illustrating a method 600 for detecting a person in a scene or field of view, in accordance with one or more embodiments.
  • system 100 of FIG. 1 and/or system 300 of FIG. 3 may be utilized to perform method 600.
  • a fallen person may be discriminated from a standing or walking person by calculating the size of the ROI (i.e., the size of the area that differs from the background model) and by radiometric properties.
  • for example, the size of the ROI and its radiometric properties may help discriminate between a group of persons walking together (i.e., two or more persons meeting) and a person that suddenly changes position from standing or walking to lying on the ground (i.e., a fallen person).
  • the speed of which a specific ROI moves across the scene may be used as a discriminating parameter since a fallen person may not move or move slowly.
  • a background model 610 of the scene may be constructed.
  • the background model 610 may utilize an average of a time series of values for a given pixel, and regions of the image that differ from the background model 610 may be identified.
  • the background may be subtracted from the current captured video frame, and the difference may be thresholded to find one or more ROI (Region Of Interest) corresponding to areas of greatest change, wherein a detected ROI may indicate the presence of a person.
  • Detection of a fallen person may utilize low resolution radiometric information 612 and thermal imagery, wherein persons may be imaged as warm blobs that are monitored for their presence and movement. Detection of a fallen person may involve user control 614 of parameters, such as setting radiometry resolution, identifying ROI, time period for monitoring the scene, etc.
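The background model and thresholded-difference ROI described above might be sketched as follows, assuming NumPy and frames delivered as 2-D arrays of radiometric counts; the update rate and threshold values are illustrative assumptions.

```python
import numpy as np

class BackgroundModel:
    """Exponential running average of past frames; pixels that differ from the
    model by more than a threshold form the region(s) of interest (ROI)."""

    def __init__(self, first_frame, alpha=0.02, threshold=25.0):
        self.model = first_frame.astype(np.float32)
        self.alpha = alpha          # update rate (illustrative assumption)
        self.threshold = threshold  # change considered significant (illustrative assumption)

    def update(self, frame):
        frame = frame.astype(np.float32)
        diff = np.abs(frame - self.model)
        roi_mask = diff > self.threshold          # True where the scene changed
        # Blend the new frame into the background model.
        self.model = (1.0 - self.alpha) * self.model + self.alpha * frame
        return roi_mask

# Example with synthetic 60x80 frames: a warm blob appears in the second frame.
bg = BackgroundModel(np.full((60, 80), 100.0))
frame = np.full((60, 80), 100.0)
frame[20:40, 30:45] += 80.0                      # simulated person-sized warm region
mask = bg.update(frame)
print("ROI pixels:", int(mask.sum()))
```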
  • the method 600 is adapted to search for a person in the scene 620, in a manner as described herein. If a person is not present or not detected, then a person present state is set to false 632, and the method 600 is adapted to continue to search for a person in the scene 620. If a person is present or detected in the scene 630, then the person present state is set to true 634, and the method 600 is adapted to analyze the profile of the detected person in the scene 640, in a manner as described herein.
  • the analysis of the scene 640 may monitor persons and detect when assistance may be needed and provide an alert 660 (e.g., a local alarm and/or provide a notification to a designated authority).
  • One or more operations of method 600 (e.g., person present 630 and/or analysis 640) may determine whether assistance is needed based on body position (e.g., fallen person), body temperature (e.g., above or below normal range), and/or total time (e.g., total time in a stationary, motionless position).
  • the method 600 is adapted to determine if the analyzed profile matches the profile of a fallen person 650. If the profile is not determined to match the profile of a fallen person, then a fallen person state is set to false, and the method 600 is adapted to continue to search for a person in the scene 620. Otherwise, if the profile is determined to match the profile of a fallen person, then the fallen person state is set to true 654, and the method 600 is adapted to generate an alert 660 to notify a user or operator that a fallen person has been detected in the scene. Once the alert is generated 660, the method 600 is adapted to continue to search for a person in the scene 620.
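A schematic rendering of the control flow through blocks 620-660 could look like the following sketch; this is an assumption-laden illustration rather than the patented implementation, and detect_person, matches_fallen_profile, and alert are assumed caller-supplied callbacks.

```python
def monitor_scene(frames, detect_person, matches_fallen_profile, alert):
    """Schematic control flow: search for a person (620), track a person-present
    state (632/634), analyze the detected profile (640), and raise an alert (660)
    when the profile matches that of a fallen person (650/654).
    """
    person_present = False
    fallen = False
    for frame in frames:
        region = detect_person(frame)            # blocks 620/630
        person_present = region is not None      # blocks 632/634
        if not person_present:
            fallen = False
            continue
        if matches_fallen_profile(region):       # blocks 640/650
            if not fallen:                       # alert once per fall event
                alert("fallen person detected")  # block 660
            fallen = True
        else:
            fallen = False
    return person_present, fallen

# Toy usage: a "frame" is just a labeled string here.
frames = ["empty", "standing", "fallen", "fallen"]
monitor_scene(frames,
              detect_person=lambda f: None if f == "empty" else f,
              matches_fallen_profile=lambda r: r == "fallen",
              alert=print)
```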
  • FIGS. 7A-7C show block diagrams illustrating methods 700, 720, and 750, respectively, for operating an infrared imaging system in an emergency mode, in accordance with one or more embodiments.
  • infrared imaging system 100 of FIG. 1 and/or infrared imaging system 300 of FIG. 3 may be utilized as an example of a system, device, or apparatus that may perform methods 700, 720, and/or 750.
  • the location component 170, 370 is adapted to transmit a homing beacon signal to facilitate locating the system 100, 300, respectively, in a disastrous event, such as in the event of sensed smoke or fire and/or partial or complete collapse of a building.
  • If the system 100, 300 was in a PERSON_PRESENT mode at the time when the system 100, 300 entered emergency mode, then a person present notification is encoded into the transmitted homing beacon signal. If more than one person was present, then the approximate number of persons present may be encoded into the transmitted homing beacon signal.
  • processing component 110, 310 may be adapted to operate and/or function as a video recorder controller 710 adapted to store recorded video images in memory component 120. If the infrared imaging system 100, 300 is determined to be operating in an emergency mode (block 712), then stored video data and information is not erased or overwritten (block 714).
  • a user defined setting may be adapted to set a threshold for an amount of stored video data and information prior to the system 100, 300 operating in emergency mode.
  • a maximum time may be defined by an amount of nonvolatile memory storage capacity and/or a video data compression ratio.
  • For example, the system 100, 300 may be configured to store the last ten minutes of video and to not overwrite that video history in the event of an emergency. That way, first responders who are able to extract the video from the system (e.g., by extracting the video memory) may be able to determine what happened at a specific location ten minutes prior to the event that caused the system 100, 300 to enter emergency mode.
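One way to sketch the black-box behavior described above is a rolling buffer that freezes on entry into emergency mode; the buffer length (ten minutes at one frame per second) and the class name are illustrative assumptions, not details from the disclosure.

```python
from collections import deque

class BlackBoxRecorder:
    """Keep a rolling window of recent frames and freeze it when the system
    enters emergency mode, so the window preceding the event is preserved
    for first responders."""

    def __init__(self, max_frames=600):
        self.buffer = deque(maxlen=max_frames)   # rolling video history
        self.emergency = False

    def record(self, frame):
        if self.emergency:
            return                     # block 714: do not erase or overwrite
        self.buffer.append(frame)      # block 710: normal rolling recording

    def enter_emergency_mode(self):
        self.emergency = True          # block 712: emergency detected

recorder = BlackBoxRecorder(max_frames=5)
for i in range(8):
    recorder.record(f"frame-{i}")
recorder.enter_emergency_mode()
recorder.record("frame-after-event")   # ignored; history stays intact
print(list(recorder.buffer))            # frames 3..7 preserved
```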
  • different events may cause the system 100, 300 to enter into emergency mode of operation.
  • the system 100, 300 may be adapted to monitor power 722, and if external power is terminated, the system 100, 300 may use battery power for operation and automatically enter emergency mode.
  • the system 100, 300 may be adapted to monitor seismic activity 724, and if integrated motion sensors 162, 362 measure significant motion (e.g., in the event of an explosion or earthquake), the system 100, 300 may enter emergency mode.
  • the system 100, 300 may be adapted to monitor user input 726, and if the system 100, 300 has a wired or wireless external communication channel (e.g., Ethernet connection, wireless network connection, etc.), the system 100, 300 may be set into emergency mode by user command.
  • the system 100, 300 may be adapted to monitor a wired or wireless network for emergency activity. For instance, at a location with multiple systems, one system entering emergency mode may trigger other systems in proximity to enter emergency mode so as to preserve video at the location from that time.
  • Processing component 110, 310 may be adapted to operate and/or function as an emergency mode controller 730 adapted to detect an event (e.g., power failure event, seismic event, etc.) and set the system 100, 300 to operate in emergency mode (block 736). If the infrared imaging system 100, 300 detects such an event (block 736), then an emergency mode state is set to true (block 732); otherwise, the emergency mode state is set to false (block 734).
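The emergency-mode triggers listed above (power loss, seismic activity, user command, peer systems) might be combined as in the following sketch; the boolean inputs and their names are assumptions made for illustration.

```python
def emergency_mode_state(external_power_ok, seismic_motion_detected,
                         user_emergency_command, peer_in_emergency):
    """Combine the monitored conditions (blocks 722-726 plus peer-network
    monitoring) into the emergency-mode state of blocks 732/734/736."""
    triggers = {
        "power loss": not external_power_ok,
        "seismic event": seismic_motion_detected,
        "user command": user_emergency_command,
        "peer system in emergency": peer_in_emergency,
    }
    active = [name for name, fired in triggers.items() if fired]
    return (len(active) > 0), active   # True -> block 732, False -> block 734

state, reasons = emergency_mode_state(external_power_ok=False,
                                      seismic_motion_detected=False,
                                      user_emergency_command=False,
                                      peer_in_emergency=False)
print(state, reasons)   # True ['power loss']
```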
  • processing component 110, 310 may be adapted to operate and/or function as a locator signal controller 760 adapted to transmit a homing beacon signal to facilitate locating the system 100, 300, respectively, in a disastrous event (e.g., earthquake, fire, flood, explosion, building collapse, nuclear event, etc.).
  • In one embodiment, if the system is in emergency mode (block 762) and a person is detected to be present (block 764), then a person present 766 is encoded as part of locator signal data 770 in a transmitted locator signal 772 (i.e., homing beacon signal).
  • the approximate number of persons present may be encoded as part of locator signal data 770 in the transmitted locator signal 772. Otherwise, in another embodiment, if the system is in emergency mode (block 762) and/or a person is not detected to be present (block 764), then a person not present 768 is encoded as part of locator signal data 770 in the transmitted locator signal 772.
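A hypothetical encoding of the locator signal data could resemble the following; the patent does not define a wire format, so the JSON layout, field names, and camera identifier here are assumptions for illustration only.

```python
import json

def build_locator_payload(emergency_mode, persons_detected, camera_id="cam-01"):
    """Assemble the data carried by the homing beacon (locator signal 772):
    whether a person was present when emergency mode was entered and, if so,
    the approximate head count."""
    payload = {"id": camera_id, "emergency": bool(emergency_mode)}
    if emergency_mode and persons_detected > 0:
        payload["person_present"] = True          # block 766
        payload["approx_person_count"] = persons_detected
    else:
        payload["person_present"] = False         # block 768
    return json.dumps(payload).encode("utf-8")    # bytes handed to the RF transmitter

print(build_locator_payload(emergency_mode=True, persons_detected=3))
```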
  • Infrared imaging systems 100, 300 are adapted to operate as a disaster camera having a ruggedized enclosure for protecting the camera and non-volatile storage for infrared image data and information.
  • The disaster camera, in accordance with embodiments, is adapted to sense various types of emergencies such as a flood, an earthquake, and/or an explosion (e.g., based on analysis of the thermal image data, via a built-in shock sensor, and/or seismic sensor), sense heat and smoke (e.g., from a fire based on the thermal image data or other sensors), and/or provide an ability to locate and count persons in a collapsed structure more easily.
  • the disaster camera may be adapted to operate in a black box mode utilizing a homing beacon signal (e.g., radio frequency (RF) signal) to find and locate after a disastrous event (e.g., building collapse, earthquake, explosion, etc.).
  • the disaster camera may be adapted to operate as a human presence enunciator for search and rescue events via the homing beacon signal.
  • the disaster camera includes a thermal camera, a seismic sensor, and an audible enunciator or RF transmitter that signals the presence of any detected persons in the event of seismic activity.
  • Thermal camera imaging may detect the presence or absence of persons in a 360 degree field of view (FOV) by using multiple thermal image cameras or by scanning the FOV using one or more thermal image cameras.
  • A seismic sensor constantly monitors for abrupt and abnormal motion. When such motion is sensed, an audible alarm may be sounded.
  • the alarm is ruggedized and able to operate separately from the
  • FIG. 8 shows an infrared imaging system 800 adapted for monitoring a structure, in accordance with one or more embodiments.
  • infrared imaging system 800 may comprise a wireless thermal imaging system and/or a wireless thermal image monitoring system for disaster detection and/or disaster restoration monitoring of structure 802.
  • infrared imaging system 800 may comprise (or further comprise) a thermal imaging camera system for utilization as a disaster camera and/or workplace safety monitoring to aid first responders and/or detect fallen persons in structure 802.
  • Infrared imaging system 800 of FIG. 8 may have scope and function similar to that of system 100 of FIG. 1 and/or infrared imaging system 300 of FIG. 3 and may operate as set forth herein (e.g., selectively in reference to FIGS. 1-7C).
  • infrared imaging system 800 utilizes wireless multipoint monitoring devices 830 (e.g., thermal imaging devices, environmental sensor devices, etc.) to monitor the condition of structure 802 including measuring moisture, humidity, temperature, and/or ambient conditions and obtaining thermal images of its structural envelope and/or of its occupants.
  • Condition data (e.g., information) may be collected locally via a processing component 810 and then sent to a hosted website 870 over a network 860 (e.g., Internet) via a network communication device 852 (e.g., a wired or wireless router and/or modem) for remote viewing, control, and/or analysis of restoration conditions and remediation progress.
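The collect-locally-then-upload flow described above might be sketched as follows; the hosted-website URL, JSON report format, and sensor interface are illustrative assumptions rather than interfaces defined by the disclosure.

```python
import json
import time
import urllib.request

def collect_readings(sensors):
    """Poll each wireless monitoring device (assumed to expose a read() method
    returning a dict of measurements) and timestamp the combined report."""
    return {
        "timestamp": time.time(),
        "readings": [{"sensor_id": sid, **sensor.read()} for sid, sensor in sensors.items()],
    }

def upload_report(report, url="https://example.invalid/api/jobsite/123/reports"):
    """Send the report to the hosted website over the network. The URL and the
    JSON body are placeholder assumptions."""
    body = json.dumps(report).encode("utf-8")
    request = urllib.request.Request(url, data=body,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status

class FakeAmbientSensor:
    """Stand-in for a wireless probe, used only to exercise collect_readings()."""
    def read(self):
        return {"relative_humidity_pct": 41.5, "air_temp_f": 72.3, "moisture_pct": 9.8}

report = collect_readings({"probe-1": FakeAmbientSensor()})
print(json.dumps(report, indent=2))
# upload_report(report) would then push the data on a configured interval.
```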
  • infrared imaging system 800 may utilize network-enabled, multi-monitoring technology to collect a breadth of quality data and provide this data to a user in an easily accessible manner.
  • infrared imaging system 800 may improve the efficiency of capturing important moisture, humidity, temperature, and/or ambient readings within the structural envelope.
  • Infrared imaging system 800 may be adapted to provide daily progress reports on restoration conditions and remediation progress at a jobsite for use by industry professionals, such as restoration contractors and insurance companies.
  • Infrared imaging system 800 may be adapted to use moisture meters, thermometers, thermal imaging cameras, and/or hygrometers to monitor conditions and collect data associated with structure 802.
  • Infrared imaging system 800 may be adapted to simultaneously monitor multiple locations at any distance.
  • infrared imaging system 800 effectively allows a user (e.g., operator or administrator) to continuously monitor structural conditions of multiple jobsites from one network-enabled computing device from anywhere in the world.
  • Infrared imaging system 800 may provide real-time restoration monitoring that combines wireless sensing device networks and continuous visual monitoring of multiple environmental parameters including humidity, temperature, and/or moisture, along with thermal images and any other related parameters that influence the integrity of structures.
  • infrared imaging system 800 may be versatile and valuable for structural monitoring, remediation, disaster detection, etc. Infrared imaging system 800 may significantly improve monitoring and documentation capabilities while providing time, travel, and cost savings over conventional approaches.
  • infrared imaging system 800 with thermal imaging capabilities may be utilized for moisture monitoring, removal, and/or remediation in structure 802.
  • Infrared imaging system 800 may be utilized for monitoring structures (e.g., residences, vacation homes, timeshares, hotels, condominiums, etc.) and aspects thereof including ruptured plumbing, dishwashers, washing machine hoses, overflowing toilets, sewage backup, open doors and/or windows, and anything else that may create the potential for moisture damage and/or energy loss.
  • Commercial buildings may benefit from permanent installations of infrared imaging system 800 to provide continuous protection versus temporary ad-hoc installations.
  • infrared imaging system 800 may be utilized to expand structural diagnostic capabilities, provide real-time continuous monitoring, provide remote ability to set alarms and remote alerts for issues occurring on a jobsite, and improve documentation and archiving of stored reports, which for example may be useful for managing legal claims of mold damage.
  • infrared imaging system 800 may be used for restoration monitoring to provide initial measurements (e.g., of temperature, humidity, moisture, and thermal images) to determine initial conditions (e.g., how wet is the structure due to water damage) and may provide these measurements (e.g., periodically or continuously) to a remote location (e.g., hosted website or server) such that restoration progress may be monitored.
  • the information may be used to view a time lapse sequence of the restoration to clearly show the progress of the remediation (e.g., how wet was the structure initially and how dry is it now or at completion of the remediation effort).
  • the information may also be monitored to determine when the remediation is complete based on certain measurement thresholds (e.g., the structure is sufficiently dry and a completion alert provided) and to determine if an alert (e.g., alarm) should be provided if sufficient remediation progress is not being made (e.g., based on certain temperature, humidity, or moisture value thresholds).
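Completion and stall detection against measurement thresholds, as described above, might be sketched as follows; the dryness threshold and minimum daily improvement are illustrative assumptions.

```python
def remediation_status(moisture_history_pct, dry_threshold=12.0,
                       min_daily_improvement=0.5):
    """Evaluate a time series of daily moisture readings for one structural
    object: report completion once readings fall below a dryness threshold,
    or flag a stall when day-over-day improvement is too small."""
    latest = moisture_history_pct[-1]
    if latest <= dry_threshold:
        return "complete"                        # e.g., trigger a completion alert
    if len(moisture_history_pct) >= 2:
        improvement = moisture_history_pct[-2] - latest
        if improvement < min_daily_improvement:
            return "stalled"                     # e.g., alert that drying is not progressing
    return "in progress"

print(remediation_status([28.0, 22.5, 17.0]))    # in progress
print(remediation_status([17.0, 16.9]))          # stalled
print(remediation_status([13.0, 11.5]))          # complete
```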
  • Infrared imaging system 800 may be utilized to reduce site visit travel and expense by providing cost-effective remote monitoring of structures and buildings. Infrared imaging system 800 may be utilized to provide the contractor with quick and accurate validations that a jobsite is dry prior to removing drying equipment. Infrared imaging system 800 may be utilized to provide insurance companies and adjusters with access to current or past claims to monitor progress of a contractor, which may allow insurance companies to make sure the contractor is not charging for more work than is actually being performed, and allow insurance companies access to stored data for any legal issues that may arise.
  • Infrared system 800 may be utilized to provide remote monitoring of structure 802 to detect a fire, flood, earthquake or other disaster and provide an alarm (e.g., an audible alarm, an email alert, a text message, and/or any other desired form of communication for a desired warning) to notify appropriate personnel and/or systems.
  • infrared system 800 may be distributed through a portion of or throughout a building to detect a fire or, for a recently extinguished fire, to detect if structural temperatures are beginning to increase or the potential risk for the fire to restart (e.g., to rekindle) is increasing and reaches a certain threshold (e.g., a predetermined temperature threshold).
  • infrared system 800 may provide an alarm to notify the fire department, occupants within structure 802, or other desired personnel.
  • Infrared system 800 may comprise one or more thermal infrared cameras (e.g., infrared imaging system 100, 300, or some portion of this system) within and/or around structure 802 to monitor for fire or the rekindle potential of an extinguished fire.
  • the thermal infrared cameras may provide thermal image data, which could be provided (e.g., sent via a wired or wireless communication link) to a fire station for personnel to monitor to detect a fire or potential of a fire (e.g., based on images and temperature readings of surfaces of structure 802).
  • Infrared system 800 may also provide an alarm if certain thermal conditions based on the temperature measurements are determined to be present for structure 802.
  • infrared imaging system 800 may include a base unit (e.g., processing component 810 and network communication device 852) that functions as a receiver for all wireless remote probes.
  • the base unit may include a color display and be adapted to record data, process data, and transmit data (e.g., in real time) to a hosted website for remote viewing and retrieval by a user, such as a contractor, emergency personnel, and/or insurance appraiser.
  • the base unit may include a touch screen display for improved usability and a USB and/or SD card slot for transferring data onsite without the use of a laptop or PC.
  • infrared imaging system 800 may include various monitoring devices 830 (e.g., various types of sensors), which may include for example a first type of sensor and/or a second type of sensor.
  • the first type of sensor may include a pin-type moisture and ambient probe adapted to collect moisture levels and RH, air temperature, dew point, and/or grains per pound levels.
  • Each first type of sensor may be uniquely identified based on a particular layout and/or configuration of a jobsite.
  • the second type of sensor may represent a standalone thermal imaging sensor to capture infrared image data.
  • the second type of sensor may include a display and may further include an integrated ambient sensor to monitor humidity and/or moisture levels.
  • The first and second type of sensors may be combined to form one modular sensor that may be compact, portable, self-contained, and/or wireless and which may be installed (e.g., attached to a wall, floor, and/or ceiling) within a structure as desired by a user.
  • Infrared imaging system 800 may include an Internet connection adapted to transmit data from the base unit (e.g., network communication device 852) located at a jobsite in real-time via the Internet to a website for monitoring, analysis, and downloading. This may be achieved by a LAN/WAN at the site if one is available, or may require an internal wireless telecommunication system, such as a cellular-based (e.g., 3G or 4G) wireless connection for continuous data transmission.
  • infrared imaging system 800 may include various monitoring devices 830, which may include for example moisture sensors and thermal imaging sensors fixed to a wall, baseboard, cabinet, etc. where damage may not occur and/or where a wide field of view of a given wall or surface may be achieved.
  • Each monitoring device 830 (e.g., each sensor) may include a battery (e.g., a lithium battery).
  • fixed, rotating sensors mounted on a ceiling may be employed to provide a 360 degree view of a given room.
  • Any related software may be loaded onto a laptop, or a full-featured website may allow the user to configure reporting intervals, determine thresholds, and/or set readings desired for remote viewing. Configuration may be done onsite or remotely, and settings may be changed at any time from the website interface, as would be understood by one skilled in the art.
  • Alarms may be configured to remotely notify the user of any problems that arise on a jobsite or other area being monitored by infrared imaging system 800. This may be achieved on the website by setting threshold alarms with specific moisture, humidity, or temperature ranges. For example, in some restoration cases, homeowners may unplug drying equipment at night because of excessive noise levels or, as another example, a contractor may load a single circuit with several drying devices that results in a fuse blowing when the homeowner switches additional electrical appliances on. With the alarm notification feature, the sensor automatically responds to a preset threshold and sends an email or text message to the user. For example, a user may set up the system to be notified if the relative humidity rises or air temperature falls (e.g., for water damage restoration applications), indicating a problem and meriting a visit by the contractor.
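The threshold-alarm rule described above (relative humidity rising or air temperature falling) might be sketched as follows; the limit values are illustrative assumptions, and a deployed system would pass the returned messages to an email or text-message gateway.

```python
def check_thresholds(reading, max_relative_humidity=60.0, min_air_temp_f=65.0):
    """Compare one ambient reading against user-configured limits for a water
    damage restoration job: rising relative humidity or falling air temperature
    suggests drying equipment has stopped working.

    Returns a list of human-readable alarm strings (empty if all is well).
    """
    alarms = []
    if reading["relative_humidity_pct"] > max_relative_humidity:
        alarms.append(f"RH {reading['relative_humidity_pct']:.1f}% exceeds "
                      f"{max_relative_humidity:.0f}% limit")
    if reading["air_temp_f"] < min_air_temp_f:
        alarms.append(f"air temperature {reading['air_temp_f']:.1f}F below "
                      f"{min_air_temp_f:.0f}F limit")
    return alarms

print(check_thresholds({"relative_humidity_pct": 71.2, "air_temp_f": 58.0}))
```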
  • Infrared imaging system 800 may be secured with login credentials, such as a user identification and password permitting access to only certain persons. A user may choose to grant access to an insurance adjuster by providing a unique user name and password. Real time data may be automatically downloaded and stored to a server for future viewing. Even if there is a power failure at the jobsite, infrared imaging system 800 and/or the website may be adapted to store the captured data.
  • A user may determine which areas need additional monitoring (e.g., further drying) or show proof that a building is completely dry before leaving a jobsite. Data and records from the infrared imaging system 800 may be useful for mitigating legal exposure.
  • the monitoring devices 830 may include one or more ambient sensors with accuracy of at least +/- 2% for relative humidity, with a full range of 0-100%, and a high temperature range up to at least 175°F, as specific examples.
  • the monitoring devices 830 may include one or more moisture sensors with a measuring depth, for example, up to at least 0.75" into building material.
  • the monitoring devices 830 may include one or more thermal cameras providing one or more thermal views, such as wall shots or 360-degree rotational views.
  • the monitoring devices 830 may include a long range wireless transmission capability up to, for example, 500 feet between each monitoring device 830 and the base unit (e.g., processing component 810 and network communication device 852, which may be combined and/or implemented as one or more devices).
  • the base unit may be accessible via a wired and/or wireless network and may provide 24/7 data availability via dynamic online reporting tools adapted to view, print, and email charts and graphs of the monitoring conditions, as would be understood by one skilled in the art.
  • Infrared imaging system 800 may provide for full access to system configuration settings, customizable thresholds and alarms, user access management (e.g., add, remove, and/or modify personnel access), and alerts to the user or operator via cell phone, text message, email, etc., as would be understood by one skilled in the art.
  • Infrared imaging system 800 may include a display to view real time readings on site and provide the ability to toggle between room sensors.
  • In contrast to conventional visible light cameras (e.g., visible spectrum imagers), an infrared imager (e.g., a low resolution thermal imager) may be selected or designed to provide low resolution thermal images that define a person as a non-descript blob to protect the identity of the person.
  • infrared imagers are less intrusive than visible light imagers.
  • objects at human temperature ranges may be discriminated from other objects, which may allow infrared imaging systems and methods in accordance with present embodiments to operate at a low spatial resolution to detect persons, without producing images that may allow for observers to determine the identity of the persons.
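A minimal sketch of low-resolution, temperature-band person detection follows; it is illustrative only, and the temperature band and frame size are assumptions rather than values from the disclosure.

```python
import numpy as np

def human_temperature_mask(frame_c, low_c=30.0, high_c=38.0):
    """Flag pixels whose apparent temperature falls in a rough human range.
    On a deliberately low-resolution frame (e.g., 60x80), the result is a
    non-descript blob that indicates presence without revealing identity.
    """
    return (frame_c >= low_c) & (frame_c <= high_c)

frame = np.full((60, 80), 21.0)        # room-temperature background
frame[25:40, 35:45] = 33.0             # person-sized warm region
mask = human_temperature_mask(frame)
print("person-like pixels:", int(mask.sum()))
```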
  • various embodiments of the invention may be implemented using hardware, software, or various combinations of hardware and software.
  • various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the scope and functionality of the present disclosure.
  • various hardware components and/or software components set forth herein may be separated into subcomponents having software, hardware, and/or both without departing from the scope and functionality of the present disclosure.
  • software components may be implemented as hardware components and vice versa.
  • Software in accordance with the present disclosure, such as program code and/or data, may be stored on one or more computer readable mediums. It is also contemplated that software identified herein may be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
  • modules 112A-112N may be embedded (i.e., hard-coded) in processing component 110 or stored on memory component 120 for access and execution by processing component 110.
  • Code (e.g., software and/or embedded hardware) for modules 112A-112N may be adapted to define preset display functions that allow processing component 110 to automatically switch between various processing techniques for sensed modes of operation, as described herein.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Emergency Management (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Alarm Systems (AREA)

Abstract

Infrared imaging systems and methods disclosed herein, in accordance with one or more embodiments, provide for a wireless thermal imaging system comprising one or more wireless thermal image sensors adapted to capture and provide thermal images of structural objects of a structure for monitoring moisture of the structural objects and a processing component adapted to receive the thermal images of the structural objects from the one or more wireless thermal image sensors, and process the thermal images of the structural objects to generate moisture content information for remote analysis of restoration conditions of the structural objects.

Description

INFRARED SENSOR SYSTEMS AND METHODS
CROSS REFERENCE TO RELATED APPLICATIONS
[00001] This application claims priority to U.S. Provisional Patent Application Serial No. 61/445,280, incorporated by reference in its entirety.
TECHNICAL FIELD
[00001] The present disclosure relates to infrared imaging systems and, in particular, to infrared sensor systems and methods.
BACKGROUND
[00002] When a building is compromised, such as in the event of an emergency (e.g., an earthquake, explosion, terrorist attack, flood, fire, other type of disaster, etc.), government agencies typically seek to gain knowledge as to the status of the damage and to the number of persons present in the building (e.g., any type of structure or defined perimeter). Surveillance cameras may be utilized to discover this knowledge. Surveillance cameras typically utilize color and monochrome imagers that are sensitive to ambient light in the visible spectrum. Unfortunately, visible light cameras are not ideally suited for detecting persons, including persons in need of assistance. For example, visible light cameras typically produce inferior quality images in low light conditions, such as when interior lighting is not operating in the event of power outage or failure. Generally, loss of power may be expected in disastrous situations that may require emergency aid for persons inside the building.
[00003] As such, in the event of an emergency with potential loss of power, it may be critical for search and rescue personnel to quickly and easily locate persons in the building. Conventional visible light cameras generally do not operate in total or near total darkness, such as night time or during a power outage, or provide information as to moisture levels for a structural member of a building. Conventional security cameras may not operate autonomously. In the event of total or partial collapse of a building, a conventional visible light camera may not withstand a high impact, and locating the camera in a collapsed building for retrieval may be difficult.
[00004] Even in non-emergency conditions, it may be important to quickly and easily identify and alert personnel if, for example, a person has fallen or is in a location they should not be or needs some kind of assistance.
[00005] Accordingly, there is a need for an improved imaging device that may be used for a variety of camera applications.
SUMMARY
[00006] Systems and methods disclosed herein provide for thermal image systems and methods, in accordance with one or more embodiments. For example, for one or more embodiments, systems and methods are disclosed that may provide for wireless thermal imaging, which may include a communication component adapted to remotely communicate with a user over a network, one or more wireless thermal image sensors adapted to capture and provide thermal images of structural objects of a structure for monitoring moisture and/or temperature of the structural objects, and a processing component adapted to receive the thermal images of the structural objects from the one or more wireless thermal image sensors, and process the thermal images of the structural objects to generate moisture and/or temperature content information for remote analysis (e.g., of restoration conditions or fire hazard conditions) of the structural objects.
[00007] In various embodiments, the one or more wireless thermal image sensors may include one or more infrared cameras adapted to continuously monitor environmental parameters including one or more of humidity, temperature, and/or moisture associated with the structural objects. The wireless thermal imaging system may include a ruggedized thermal camera system adapted for use as a disaster monitoring camera system to detect and monitor damage from disastrous events including at least one of flooding, fire, explosion, and/or earthquake, and wherein the ruggedized thermal camera system comprises an enclosure that is capable of withstanding disastrous events. The wireless thermal imaging system may include a thermal camera system adapted for use as a safety monitoring system to detect one or more persons in the structure including, for example, one or more fallen persons in the structure.
[00008] In various embodiments, the one or more wireless thermal image sensors may be adapted to monitor one or more conditions of the structure including measuring one or more of moisture, humidity, temperature, and/or ambient conditions of its structural envelope.
Condition information of the structural objects may be collected locally via the processing component and provided to a hosted website over the network via the communication component for remote viewing and analysis (e.g., of restoration conditions) by the user. The wireless thermal imaging system may include wireless sensors including a moisture meter and/or a hygrometer to monitor moisture conditions and provide ambient and/or various types of moisture information related to the structure to the processing component. The infrared imaging system may be adapted to simultaneously monitor multiple structures. The one or more wireless thermal image sensors may be affixed to at least one structural object of the structure to provide a view of one or more other structural objects of the structure. The processing component may be adapted to provide an alarm to remotely notify the user of an emergency (e.g., a disastrous event) related to the structure by setting (or based on) a threshold condition for certain parameters (e.g., with specific moisture or temperature ranges).
[00009] An infrared camera system, in accordance with one or more embodiments, may be installed within a public or private facility or area to detect and monitor any persons present. For example, the infrared camera system may be installed within an elder care facility (e.g., senior living facility) or within a daycare facility to monitor persons and detect when assistance may be needed and provide an alert (e.g., a local alarm and/or provide a notification to a designated authority). The infrared camera system may detect when assistance is needed based upon a person's body position (e.g., fallen person), a body temperature (e.g., above or below normal range), and/or total time (e.g., in a stationary position). Additionally, the infrared camera system may be designed to provide lower resolution images to maintain the personal privacy of the person.
[00010] The scope of the disclosure is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the present disclosure will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. Reference will be made to the appended sheets of drawings that will first be described briefly.
BRIEF DESCRIPTION OF THE DRAWINGS
[00011] FIG. 1 shows a block diagram illustrating an infrared imaging system for capturing and processing infrared images, in accordance with an embodiment.
[00012] FIG. 2 shows a method for capturing and processing infrared images, in accordance with an embodiment.
[00013] FIG. 3 shows a block diagram illustrating an infrared imaging system for monitoring an area, in accordance with an embodiment.
[00014] FIG. 4 shows a block diagram illustrating a processing flow of an infrared imaging system, in accordance with one or more embodiments.
[00015] FIGS. 5A-5B show diagrams illustrating various profiles of a person, in accordance with one or more embodiments.
[00016] FIG. 6 shows a block diagram illustrating a method for capturing and processing infrared images, in accordance with one or more embodiments.
[00017] FIGS. 7A-7C show block diagrams illustrating methods for operating an infrared imaging system in an emergency mode, in accordance with one or more embodiments.
[00018] FIG. 8 shows an infrared imaging system adapted for monitoring a structure, in accordance with an embodiment.
[00019] Embodiments of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
DETAILED DESCRIPTION
[00020] Infrared imaging systems and methods disclosed herein, in accordance with one or more embodiments, relate to search, rescue, evacuation, remediation, and/or detection of persons that may be injured (e.g., from a fall) and/or structures that may be damaged due to a disastrous event (emergency), such as an earthquake, explosion, flood, fire, tornado, terrorist attack, etc. For example, in the event of an emergency or disaster with potential loss of power, it may be critical for search and rescue personnel to quickly and easily locate persons in a structure, building, or other defined perimeter. Even under non-emergency conditions, it may be important to quickly and easily assist a person that has fallen. As an example for a structure, it may be necessary to monitor remediation efforts (e.g., due to water or fire damage), such as to verify status or completion of the remediation effort (e.g., the dampness has been remedied) and if further attention is needed (e.g., fire has restarted or potential fire hazard increasing due to increased temperature readings).
[00021] Infrared imaging systems and methods disclosed herein, in accordance with one or more embodiments, autonomously operate in total or near total darkness, such as night time or during a power outage. In the event of a total or partial collapse of a structure or building, a ruggedized infrared imaging system may be adapted to withstand impact of a structural collapse and provide a homing signal to identify locations for retrieval of infrared data and information. A low resolution infrared imaging system may be utilized in places where personal privacy is a concern, such as bedrooms, restrooms, and showers. In some instances, these areas are places where persons often slip and fall and may need assistance. As such, the infrared imaging systems and methods disclosed herein provide an infrared camera capable of imaging in darkness, operating autonomously, retaining video information from emergency or other disastrous event (e.g. ruggedized infrared camera), providing an easily identifiable location, and/or protecting personal privacy.
[00022] As a specific example, the infrared imaging systems and methods disclosed herein, in accordance with an embodiment, may be utilized in senior citizen care facilities, within a person's home, and/or within other public or private facilities to monitor and provide thermal images that may be analyzed to determine if a person needs assistance (e.g., has fallen or is in distress, has an abnormal body temperature, and/or remains in a fixed position for an extended period of time) and/or provide location information for emergency personnel to locate the individual to provide assistance (e.g., during a medical emergency or during a disaster event).
[00023] As another specific example, the infrared imaging systems and methods disclosed herein, in accordance with an embodiment, may be implemented to monitor remediation efforts, such as directed to water and/or fire damage. The infrared imaging system may provide thermal images for analysis within the infrared imager (e.g., infrared camera) or by a remote processor (e.g., computer) to provide information as to the remediation status. As a specific example, the thermal images may provide information as to the moisture, humidity, and/or temperature status of a structure and whether the structure has sufficiently dried after water damage, such that appropriate remediation personnel may readily determine the remediation status. As another specific example, the thermal images may provide
information as to the temperature status of a structure, which may have suffered recently from fire damage, and whether the structure and temperatures associated with the structure have stabilized or are increasing, such that appropriate fire personnel may readily determine the fire hazard status and whether the danger of the fire restarting (e.g., rekindle) is increasing so that appropriate actions may be taken.
[00024] Accordingly for an embodiment, an infrared imaging system in a ruggedized enclosure with capability of operating autonomously aids first responders including search and rescue personnel by identifying images of persons present at the imaged location. The infrared imaging system is adapted to provide a thermal signature of objects in complete darkness and detect objects that are close to skin temperature. By enclosing the infrared imaging system in such a way that it may withstand severe impact and by equipping the infrared imaging system with non-volatile memory for storing images, first responders upon locating the infrared imaging system may extract infrared data and information about persons present in a specific location.
[00025] FIG. 1 shows a block diagram illustrating an infrared imaging system 100 for capturing and processing infrared images, in accordance with an embodiment. For example, in one embodiment, infrared imaging system 100 may comprise a rugged thermal imaging camera system to aid first responders and detect fallen persons or persons requiring medical assistance. In another embodiment, infrared imaging system 100 may comprise a wireless thermal image monitoring system for disaster restoration monitoring.
[00026] Infrared imaging system 100, in one embodiment, may include a processing component 110, a memory component 120, an image capture component 130, a display component 140, a control component 150, a communication component 152, a power component 154, a mode sensing component 160, a motion sensing component 162, and/or a location component 170. In various embodiments, infrared imaging system 100 may include one or more other sensing components 164 including one or more of a seismic activity sensor, a smoke detection sensor, a heat sensor, a water level sensor, a gaseous fume sensor, a radioactivity sensor, etc.
[00027] In various embodiments, infrared imaging system 100 may represent an infrared imaging device, such as an infrared camera, to capture images, such as image 180. Infrared imaging system 100 may represent any type of infrared camera system, which for example may be adapted to detect infrared radiation and provide representative infrared image data (e.g., one or more snapshot images and/or video images). In one embodiment, infrared imaging system 100 may represent an infrared camera and/or video camera that is directed to the near, middle, and/or far infrared spectrums to provide thermal infrared image data.
Infrared imaging system 100 may include a permanently mounted infrared imaging device and may be implemented, for example, as a security camera and/or coupled, in other examples, to various types of structures (e.g., buildings, bridges, tunnels, etc.). Infrared imaging system 100 may include a portable infrared imaging device and may be
implemented, for example, as a handheld device and/or coupled, in other examples, to various types of vehicles (e.g., land-based vehicles, watercraft, aircraft, spacecraft, etc.) or structures via one or more types of mounts. In still another example, infrared imaging system 100 may be integrated as part of a non-mobile installation requiring infrared images to be stored and/or displayed.
[00028] Processing component 110 comprises, in various embodiments, an infrared image processing component and/or an infrared video image processing component. Processing component 110 includes, in one embodiment, a microprocessor, a single-core processor, a multi-core processor, a microcontroller, a logic device (e.g., programmable logic device configured to perform processing functions), a digital signal processing (DSP) device, or some other type of generally known processor, including image processors and/or video processors. Processing component 110 is adapted to interface and communicate with components 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170 to perform method and processing steps as described herein. Processing component 110 may include one or more modules 112A-112N for operating in one or more modes of operation, wherein modules 112A-112N may be adapted to define preset processing and/or display functions that may be embedded in processing component 110 or stored on memory component 120 for access and execution by processing component 110. For example, processing component 110 may be adapted to operate and/or function as a video recorder controller adapted to store recorded video images in memory component 120. In other various embodiments, processing component 110 may be adapted to perform various types of image processing algorithms and/or various modes of operation, as described herein.
[00029] In various embodiments, it should be appreciated that each module 112A-112N may be integrated in software and/or hardware as part of processing component 110, or code (e.g., software or configuration data) for each mode of operation associated with each module 112A-112N may be stored in memory component 120. Embodiments of modules 112A-112N (i.e., modes of operation) disclosed herein may be stored by a separate computer-readable medium (e.g., a memory, such as a hard drive, a compact disk, a digital video disk, or a flash memory) to be executed by a computer (e.g., logic or processor-based system) to perform various methods disclosed herein.
[00030] In one example, the computer-readable medium may be portable and/or located separate from infrared imaging system 100, with stored modules 112A-112N provided to infrared imaging system 100 by coupling the computer-readable medium to infrared imaging system 100 and/or by infrared imaging system 100 downloading (e.g., via a wired or wireless link) the modules 112A-112N from the computer-readable medium (e.g., containing the non-transitory information). In various embodiments, as described herein, modules 112A-112N provide for improved infrared camera processing techniques for real time applications, wherein a user or operator may change a mode of operation depending on a particular application, such as monitoring seismic activity, monitoring workplace safety, monitoring disaster restoration, etc. Accordingly, in various embodiments, the other sensing components 164 may include one or more of a seismic activity sensor, a smoke detection sensor, a heat sensor, a water level sensor, a humidity sensor, a gaseous fume sensor, a radioactivity sensor, etc. for sensing disastrous events, such as earthquakes, explosions, fires, gas fumes, gas leaks, nuclear meltdowns, etc.
[00031] In various embodiments, modules 112A-112N may be utilized by infrared imaging system 100 to perform one or more different modes of operation including a standard mode of operation, a person detection mode of operation, a fallen person mode of operation, an emergency mode of operation, and a black box mode of operation. One or more of these modes of operation may be utilized for work and safety monitoring, disaster monitoring, restoration monitoring, and/or remediation progress monitoring. The modes of operation are described in greater detail herein.
[00032] Memory component 120 includes, in one embodiment, one or more memory devices to store data and information, including infrared image data and information and infrared video image data and information. The one or more memory devices may include various types of memory for infrared image and video image storage including volatile and non-volatile memory devices, such as RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically-Erasable Read-Only Memory), flash memory, etc. In one embodiment, processing component 110 is adapted to execute software stored on memory component 120 to perform various methods, processes, and modes of operations in manner as described herein.
[00033] Image capture component 130 includes, in one embodiment, one or more infrared sensors (e.g., any type of infrared detector, such as a focal plane array) for capturing infrared image signals representative of an image, such as image 180. The infrared sensors may be adapted to capture infrared video image signals representative of an image, such as image 180. In one embodiment, the infrared sensors of image capture component 130 provide for representing (e.g., converting) a captured image signal of image 180 as digital data (e.g., via an analog-to-digital converter included as part of the infrared sensor or separate from the infrared sensor as part of infrared imaging system 100). Processing component 110 may be adapted to receive infrared image signals from image capture component 130, process infrared image signals (e.g., to provide processed image data), store infrared image signals or image data in memory component 120, and/or retrieve stored infrared image signals from memory component 120. Processing component 110 may be adapted to process infrared image signals stored in memory component 120 to provide image data (e.g., captured and/or processed infrared image data) to display component 140 for viewing by a user.
[00034] Display component 140 includes, in one embodiment, an image display device (e.g., a liquid crystal display (LCD)) or various other types of generally known video displays or monitors. Processing component 110 may be adapted to display image data and information on display component 140. Processing component 110 may be adapted to retrieve image data and information from memory component 120 and display any retrieved image data and information on display component 140. Display component 140 may include display electronics, which may be utilized by processing component 110 to display image data and information (e.g., infrared images). Display component 140 may receive image data and information directly from image capture component 130 via processing component 110, or the image data and information may be transferred from memory component 120 via processing component 110.
[00035] In one embodiment, processing component 110 may initially process a captured image and present a processed image in one mode, corresponding to modules 112A-112N, and then upon user input to control component 150, processing component 110 may switch the current mode to a different mode for viewing the processed image on display component 140 in the different mode. This switching may be referred to as applying the infrared camera processing techniques of modules 112A-112N for real time applications, wherein a user or operator may change the mode while viewing an image on display component 140 based on user input to control component 150. In various aspects, display component 140 may be remotely positioned, and processing component 110 may be adapted to remotely display image data and information on display component 140 via wired or wireless communication with display component 140.
[00036] Control component 150 includes, in one embodiment, a user input and/or interface device having one or more user actuated components. For example, actuated components may include one or more push buttons, slide bars, rotatable knobs, and/or a keyboard, that are adapted to generate one or more user actuated input control signals. Control component 150 may be adapted to be integrated as part of display component 140 to function as both a user input device and a display device, such as, for example, a touch screen device adapted to receive input signals from a user touching different parts of the display screen. Processing component 110 may be adapted to sense control input signals from control component 150 and respond to any sensed control input signals received therefrom.
[00037] Control component 150 may include, in one embodiment, a control panel unit (e.g., a wired or wireless handheld control unit) having one or more user-activated mechanisms (e.g., buttons, knobs, sliders, etc.) adapted to interface with a user and receive user input control signals. In various embodiments, the one or more user-activated mechanisms of the control panel unit may be utilized to select between the various modes of operation, as described herein in reference to modules 112A-112N. In other embodiments, it should be appreciated that the control panel unit may be adapted to include one or more other user- activated mechanisms to provide various other control functions of infrared imaging system 100, such as auto-focus, menu enable and selection, field of view (FoV), brightness, contrast, gain, offset, spatial, temporal, and/or various other features and/or parameters. In still other embodiments, a variable gain signal may be adjusted by the user or operator based on a selected mode of operation.
[00038] In another embodiment, control component 150 may include a graphical user interface (GUI), which may be integrated as part of display component 140 (e.g., a user actuated touch screen), having one or more images of the user-activated mechanisms (e.g., buttons, knobs, sliders, etc.), which are adapted to interface with a user and receive user input control signals via the display component 140.
[00039] Communication component 152 may include, in one embodiment, a network interface component (NIC) adapted for wired and/or wireless communication with a network including other devices in the network. In various embodiments, communication component 152 may include a wireless communication component, such as a wireless local area network (WLAN) component based on the IEEE 802.11 standards, a wireless broadband component, mobile cellular component, a wireless satellite component, or various other types of wireless communication components including radio frequency (RF), microwave frequency (MWF), and/or infrared frequency (IRF) components, such as wireless transceivers, adapted for communication with a wired and/or wireless network. As such, communication component 152 may include an antenna coupled thereto for wireless communication purposes. In other embodiments, the communication component 152 may be adapted to interface with a wired network via a wired communication component, such as a DSL (e.g., Digital Subscriber Line) modem, a PSTN (Public Switched Telephone Network) modem, an Ethernet device, and/or various other types of wired and/or wireless network communication devices adapted for communication with a wired and/or wireless network. Communication component 152 may be adapted to transmit and/or receive one or more wired and/or wireless video feeds.
[00040] In various embodiments, the network may be implemented as a single network or a combination of multiple networks. For example, in various embodiments, the network may include the Internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks. In another example, the network may include a wireless telecommunications network (e.g., cellular phone network) adapted to communicate with other communication networks, such as the Internet. As such, in various embodiments, the infrared imaging system 100 may be associated with a particular network link such as for example a URL (Uniform Resource Locator), an IP (Internet Protocol) address, and/or a mobile phone number.
[00041] Power component 154 comprises a power supply or power source adapted to provide power to infrared imaging system 100 including each of the components 110, 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170. Power component 154 may comprise various types of power storage devices, such as battery, or a power interface component that is adapted to receive external power and convert the received external power to a useable power for infrared imaging system 100 including each of the components 110, 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170.
[00042] Mode sensing component 160 may be optional. Mode sensing component 160 may include, in one embodiment, an application sensor adapted to automatically sense a mode of operation, depending on the sensed application (e.g., intended use for an
embodiment), and provide related information to processing component 110. In various embodiments, the application sensor may include a mechanical triggering mechanism (e.g., a clamp, clip, hook, switch, push-button, etc.), an electronic triggering mechanism (e.g., an electronic switch, push-button, electrical signal, electrical connection, etc.), an electromechanical triggering mechanism, an electro-magnetic triggering mechanism, or some combination thereof. For example, for one or more embodiments, mode sensing component 160 senses a mode of operation corresponding to the intended application of the infrared imaging system 100 based on the type of mount (e.g., accessory or fixture) to which a user has coupled the infrared imaging system 100 (e.g., image capture component 130).
Alternately, for one or more embodiments, the mode of operation may be provided via control component 150 by a user of infrared imaging system 100.
[00043] Mode sensing component 160, in one embodiment, may include a mechanical locking mechanism adapted to secure the infrared imaging system 100 to a structure or part thereof and may include a sensor adapted to provide a sensing signal to processing component 110 when the infrared imaging system 100 is mounted and/or secured to the structure. Mode sensing component 160, in one embodiment, may be adapted to receive an electrical signal and/or sense an electrical connection type and/or mount type and provide a sensing signal to processing component 110.
[00044] Processing component 110 may be adapted to communicate with mode sensing component 160 (e.g., by receiving sensor information from mode sensing component 160) and image capture component 130 (e.g., by receiving data and information from image capture component 130 and providing and/or receiving command, control, and/or other information to and/or from other components of infrared imaging system 100).
[00045] In various embodiments, mode sensing component 160 may be adapted to provide data and information relating to various system applications including various coupling implementations associated with various types of structures (e.g., buildings, bridges, tunnels, vehicles, etc.). In various embodiments, mode sensing component 160 may include communication devices that relay data and information to processing component 110 via wired and/or wireless communication. For example, mode sensing component 160 may be adapted to receive and/or provide information through a satellite, through a local broadcast transmission (e.g., radio frequency), through a mobile or cellular network, and/or through information beacons in an infrastructure (e.g., a transportation or highway information beacon infrastructure) or various other wired and/or wireless techniques.
[00046] Motion sensing component 162 includes, in one embodiment, a motion detection sensor adapted to automatically sense motion or movement and provide related information to processing component 110. For example, motion sensing component 162 may include an accelerometer, a gyroscope, an inertial measurement unit (IMU), etc., to detect motion of infrared imaging system 100 (e.g., to detect an earthquake). In various embodiments, the motion detection sensor may be adapted to detect motion or movement by measuring change in speed or vector of an object or objects in a field of view, which may be achieved by mechanical techniques physically interacting within the field of view or by electronic techniques adapted to quantify and measure changes in the environment. Some methods by which motion or movement may be electronically identified include optical detection and acoustical detection.
[00047] In various embodiments, image capturing system 100 may include one or more other sensing components 164, including environmental and/or operational sensors, depending on application or implementation, which provide information to processing component 110 by receiving sensor information from each sensing component 164. In various embodiments, other sensing components 164 may be adapted to provide data and information related to environmental conditions, such as internal and/or external temperature conditions, lighting conditions (e.g., day, night, dusk, and/or dawn), humidity levels, specific weather conditions (e.g., sun, rain, and/or snow), distance (e.g., laser rangefinder), and/or whether a tunnel, a covered parking garage, or some type of structure or enclosure is detected. As such, other sensing components 164 may include one or more conventional sensors as known by those skilled in the art for monitoring various conditions (e.g., environmental conditions) that may have an effect (e.g., on the image appearance) on the data and information provided by image capture component 130.
[00048] In some embodiments, other sensing components 164 may include devices that relay information to processing component 110 via wireless communication. For example, each sensing component 164 may be adapted to receive information from a satellite, through a local broadcast (e.g., radio frequency) transmission, through a mobile or cellular network and/or through information beacons in an infrastructure (e.g., a transportation or highway information beacon infrastructure), and/or various other wired and/or wireless techniques in accordance with one or more embodiments.
[00049] Location component 170 includes, in one embodiment, a beacon signaling device adapted to provide a homing beacon signal for location discovery of the infrared imaging system 100. In various embodiments, the homing beacon signal may utilize a radio frequency (RF) signal, microwave frequency (MWF) signal, and/or various other wireless frequency signals in accordance with embodiments. As such, location component 170 may utilize an antenna coupled thereto for wireless communication purposes. In one aspect, processing component 110 may be adapted to interface with location component 170 to transmit the homing beacon signal in the event of an emergency or disastrous event.
[00050] In various embodiments, one or more components 110, 120, 130, 140, 150, 152, 154, 160, 162, 164, and/or 170 of image capturing system 100 may be combined and/or implemented or not, as desired or depending on application requirements, with image capturing system 100 representing various functional blocks of a system. For example, processing component 110 may be combined with memory component 120, image capture component 130, display component 140, and/or mode sensing component 160. In another example, processing component 110 may be combined with image capture component 130 with only certain functions of processing component 110 performed by circuitry (e.g., processor, logic device, microprocessor, microcontroller, etc.) within image capture component 130. In still another example, control component 150 may be combined with one or more other components or be remotely connected to at least one other component, such as processing component 110, via a wired or wireless control device so as to provide control signals thereto.
[00051] FIG. 2 shows a method 200 illustrating a process flow for capturing and processing infrared images, in accordance with an embodiment. For purposes of simplifying discussion of FIG. 2, reference may be made to image capturing system 100 of FIG. 1 as an example of a system, device, or apparatus that may perform method 200. [00052] Referring to FIG. 2, one or more images (e.g., infrared image signals comprising infrared image data including video data) may be captured (block 210) with infrared imaging system 100. In one embodiment, processing component 110 controls (e.g., causes) image capture component 130 to capture one or more images, such as, for example, image 180 and/or a video image of image 180. In one aspect, after receiving one or more captured images from image capture component 130, processing component 110 may be adapted to optionally store captured images (block 214) in memory component 120 for processing.
[00053] The one or more captured images may be pre-processed (block 218). In one embodiment, pre-processing may include obtaining infrared sensor data related to the captured images, applying correction terms, and applying noise reduction techniques to improve image quality prior to further processing as would be understood by one skilled in the art. In another embodiment, processing component 110 may directly pre-process the captured images or optionally retrieve captured images stored in memory component 120 and then pre-process the images. In one aspect, pre-processed images may be optionally stored in memory component 120 for further processing.
[00054] For one or more embodiments, a mode of operation may be determined (block 222), and one or more captured and/or preprocessed images may be processed according to the determined mode of operation (block 226). In one embodiment, the mode of operation may be determined before or after the images are captured and/or preprocessed (blocks 210 and 218), depending upon the types of infrared detector settings (e.g., biasing, frame rate, signal levels, etc.), processing algorithms and techniques, and related configurations.
[00055] In one embodiment, a mode of operation may be defined by mode sensing component 160, wherein an application sensing portion of mode sensing component 160 may be adapted to automatically sense the mode of operation, and depending on the sensed application, mode sensing component 160 may be adapted to provide related data and/or information to processing component 110.
[00056] In another embodiment, it should be appreciated that the mode of operation may be manually set by a user via display component 140 and/or control component 150 without departing from the scope of the present disclosure. As such, in one aspect, processing component 110 may communicate with display component 140 and/or control component 150 to obtain the mode of operation as provided (e.g., input) by a user. The modes of operation may include the use of one or more infrared image processing algorithms and/or image processing techniques. [00057] In various embodiments, the modes of operation refer to processing and/or display functions of infrared images, wherein for example an infrared imaging system is adapted to process infrared sensor data prior to displaying the data to a user. In some embodiments, infrared image processing algorithms are utilized to present an image under a variety of conditions, and the infrared image processing algorithms provide the user with one or more options to tune parameters and operate the infrared imaging system in an automatic mode or a manual mode. In various embodiments, the modes of operation are provided by infrared imaging system 100, and the concept of image processing for different use conditions may be implemented in various types of structure applications and resulting use conditions.
[00058] In various embodiments, the modes of operation may include a standard mode of operation, a person detection mode of operation, a fallen or distressed person mode of operation, an emergency mode of operation, and/or a black box mode of operation. One or more of these modes of operation may be utilized for work and safety monitoring, disaster monitoring, restoration monitoring, and/or remediation progress monitoring. In various embodiments, one or more of sensing components 160, 162, 164 may be utilized to determine a mode of operation. For example, mode sensing component 160 may be adapted to interface with motion sensing component 162 and one or more other sensing components 164 to assist with a determination of a mode of operation. The other sensing components 164 may include one or more of a seismic activity sensor, a smoke detection sensor, a heat sensor, a water level sensor, a moisture sensor, a temperature sensor, a humidity sensor, a gaseous fume sensor, a radioactivity sensor, etc. for sensing disastrous events, such as earthquakes, explosions, fires, gas fumes, gas leaks, nuclear events, etc. The modes of operation are described in further detail herein.
[00059] After processing the one or more images according to a determined mode of operation (block 226), the one or more images may be stored (block 230, i.e., after processing or prior to processing) and optionally displayed (block 234). Additionally, further processing may be optionally performed depending on application or implementation.
[00060] For example, for an embodiment, images may be displayed in a night mode, wherein the processing component 110 may be adapted to configure display component 140 to apply a night color palette to the images for display in night mode. In night mode, an image may be displayed in a red palette or a green palette to improve night vision capacity (e.g., to minimize night vision degradation) for a user. Otherwise, if night mode is not considered necessary, then processing component 110 may be adapted to configure display component 140 to apply a non-night mode palette (e.g., black hot or white hot palette) to the images for display via display component 140.
[00061] In various embodiments, processing component 110 may store any of the images, processed or otherwise, in memory component 120. Accordingly, processing component 110 may, at any time, retrieve stored images from memory component 120 and display retrieved images on display component 140 for viewing by a user.
[00062] In various embodiments, the night mode of displaying images refers to using a red color palette or green color palette to assist the user or operator in the dark when adjusting to low light conditions. During night operation of image capturing system 100, human visual capacity to see in the dark may be impaired by the blinding effect of a bright image on a display monitor. Hence, the night mode changes the color palette from a standard black hot or white hot palette to a red or green color palette display. Generally, the red or green color palette is known to interfere less with human night vision capability. In one example, for a red-green-blue (RGB) type of display, the green and blue pixels may be disabled to boost red color for a red color palette. In one aspect, the night mode display may be combined with any other mode of operation of infrared imaging system 100, and a default display mode of infrared imaging system 100 at night may be the night mode display.
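As an illustration of the night mode palette handling described above, the following is a minimal Python sketch (not part of the specification); it assumes a normalized 8-bit single-channel thermal frame and shows one way the green and blue channels of an RGB display could be suppressed to produce a red palette, with a green palette and a grayscale (white hot) fallback for comparison. The function name and parameters are illustrative only.

import numpy as np

def apply_night_palette(gray_frame: np.ndarray, palette: str = "red") -> np.ndarray:
    """Map a single-channel 8-bit thermal frame to an RGB night-mode image."""
    rgb = np.zeros(gray_frame.shape + (3,), dtype=np.uint8)
    if palette == "red":
        rgb[..., 0] = gray_frame              # keep intensity in the red channel only
    elif palette == "green":
        rgb[..., 1] = gray_frame              # alternative green palette
    else:
        rgb[...] = gray_frame[..., None]      # non-night (grayscale/white hot) fallback
    return rgb

# Example: a synthetic 4x4 frame rendered with the red palette.
frame = np.linspace(0, 255, 16, dtype=np.uint8).reshape(4, 4)
night_image = apply_night_palette(frame, "red")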
[00063] In various embodiments, processing component 110 may switch the processing mode of a captured image in real time and change the displayed processed image from one mode, corresponding to modules 112A-112N, to a different mode upon receiving input from mode sensing component 160 and/or user input from control component 150. As such, processing component 110 may switch a current mode of display to another different mode of display for viewing the processed image by the user or operator on display component 140 depending on the input received from mode sensing component 160 and/or user input from control component 150. This switching may be referred to as applying the infrared camera processing techniques of modules 112A-112N for real time applications, wherein the displayed mode may be switched while viewing an image on display component 140 based on the input received from mode sensing component 160 and/or user input received from control component 150.
[00064] FIG. 3 shows a block diagram illustrating an infrared imaging system 300 for monitoring an area, in accordance with an embodiment. For example, in one embodiment, infrared imaging system 300 may comprise a rugged thermal imaging camera system for utilization as a disaster camera and/or workplace safety monitoring to aid first responders and/or detect fallen persons. In another embodiment, infrared imaging system 300 may comprise a wireless thermal imaging system and/or a wireless thermal image monitoring system for disaster and/or restoration monitoring. For purposes of simplifying discussion of FIG. 3, reference may be made to image capturing system 100 of FIG. 1, wherein similar system components have similar scope and function.
[00065] In one embodiment, infrared imaging system 300 may comprise an enclosure 302 (e.g., a highly ruggedized protective housing), a processing component 310 (e.g., a video processing device having a module for detecting a fallen person, emergency, disastrous event, etc.), a memory component 320 (e.g., video storage, recording unit, flash drive, etc.), an image capture component 330 (e.g., a radiometrically calibrated thermal camera), a communication component 352 (e.g., a transceiver having wired and/or wireless
communication capability), a first power component 354A (e.g., a battery), a second power component 354B (e.g., a power interface receiving external power via a power cable 356), a motion sensing component 362 (e.g., a sensor sensitive to motion or movement, such as an accelerometer), and a location component 370 (e.g., a homing beacon signal generator).
Infrared imaging system 300 may further include other types of sensors, as discussed herein, such as a temperature sensor, a humidity sensor, and/or a moisture sensor.
[00066] During normal operation, the system 300 may be adapted to provide a live video feed of thermal video captured with image capture component 330 through a wired cable link 358 or wireless communication link 352. Captured video images may be utilized for surveillance operations. The system 300 may be adapted to automatically detect a fallen person or a person in need of assistance (e.g., based on body temperature, location, body position, and/or remaining motionless for a period of time). The fallen person detection system utilizes the image capture component 330 as a radiometrically calibrated thermal imager. The system 300 may be securely mounted to a structure 190 via an adjustable mounting component 192 (e.g., fixed or moveable, such as a pan/tilt or other motion control device) so that the imaging component 330 may be tilted to peer down on persons 304a, 304b within a field of view (FOV) 332. In one embodiment, radiometric calibration allows the system 300 to detect objects (e.g., persons 304a, 304b) at or close to skin temperature, such as between 80°F and 110°F.
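The radiometric detection of objects at or near skin temperature mentioned above can be illustrated with a short Python sketch; this is an assumption-laden example rather than the patented implementation, and it presumes the calibrated frame is already expressed in degrees Fahrenheit.

import numpy as np

def skin_temperature_mask(radiometric_f: np.ndarray,
                          low_f: float = 80.0,
                          high_f: float = 110.0) -> np.ndarray:
    """Boolean mask of pixels whose calibrated temperature falls in the skin band."""
    return (radiometric_f >= low_f) & (radiometric_f <= high_f)

# Example: an ambient background with one warm, person-like region.
frame = np.full((6, 6), 68.0)   # degrees Fahrenheit
frame[2:4, 2:4] = 95.0
mask = skin_temperature_mask(frame)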
[00067] In one embodiment, the processing component 310 utilizes a person detection module 312B (i.e., module 112B) to determine or provide awareness of whether one or more persons are present in the scene, such as persons 304a, 304b. If at least one person is present, then the system 300 may be adapted to operate in emergency mode 312A (e.g., module 112A), which may be triggered by motion sensor 362. The processing component 310 may encode person detection information into a homing beacon signal, which may be generated from location device 370. In one aspect, the person detection information may aid search and rescue personnel in their efforts to prioritize search and rescue operations.
[00068] In one embodiment, the system 300 may be enclosed in a ruggedized protective housing 302 built such that even after severe impact from a disastrous event, the non-volatile memory 320, which stores recorded video images, may be extracted in an intact state. An internal battery 354A allows the system 300 to operate after loss of external power via cable 356 for some period of time. Even if the system optics and video processing electronics are rendered useless as a result of a catastrophic event, power from internal battery 354A may be provided to location device 370 so that a homing beacon signal may be generated and transmitted to assist search and rescue personnel with locating the system 300.
[00069] FIG. 4 shows a block diagram illustrating a process flow 400 of an infrared imaging system, in accordance with one or more embodiments. For example, system 100 of FIG. 1 and/or system 300 in FIG. 3 may be utilized to perform method 400.
[00070] In one embodiment, a data capture component 412 (e.g., processing component 310 of system 300) is adapted to extract frames of thermal imagery from a thermal infrared sensor 410 (e.g., image capture component 330 of system 300). The captured image, including data and information thereof, may be normalized, for example, to an absolute temperature scale by a radiometric normalization module 414 (e.g., a module utilized by the processing component 310 of system 300). A person detection module 416 (e.g., a module utilized by the processing component 310 of system 300) is adapted to operate on the radiometric image to localize persons present in the scene (e.g., FOV 332).
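A minimal Python sketch of the radiometric normalization step is shown below, assuming a simple linear gain/offset calibration from raw sensor counts to an absolute temperature scale; the calibration constants are placeholders for illustration and are not taken from the specification.

import numpy as np

def normalize_to_temperature(raw_counts: np.ndarray,
                             gain: float = 0.04,
                             offset: float = -273.15) -> np.ndarray:
    """Convert raw sensor counts to an approximate absolute scale (degrees Celsius)."""
    return raw_counts.astype(np.float64) * gain + offset

# Example: a small patch of raw 16-bit counts mapped to temperatures.
raw = np.array([[7300, 7450], [7800, 8100]], dtype=np.uint16)
temperature_c = normalize_to_temperature(raw)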
[00071] A fallen person detection module 418 (e.g., a module utilized by the processing component 310 of system 300) may be adapted to discriminate between upright persons (e.g., standing or walking persons) and fallen persons. In various embodiments, the module may be adapted to discriminate based on other parameters, such as time, location, and/or temperature differential.
[00072] For example, process flow 400 may be used to monitor persons and detect when assistance may be needed and provide an alert (e.g., a local alarm and/or provide a notification to a designated authority). As a specific example, process flow 400 (e.g., person detection module 416) may detect when assistance is needed based upon a person's body position (e.g., fallen person), body temperature (e.g., above or below normal range), and/or total time (e.g., in a stationary position). [00073] In one aspect, data and information about coordinates of persons (e.g., fallen and not fallen) and the radiometrically normalized or non-normalized image may be passed to a conversion module 420 (e.g., a module utilized by the processing component 310 of system 300). The conversion module 420 may be adapted to scale the image such that the image fits the dynamic range of a display and may encode the positions of persons and fallen persons in the image, for example, by color coding the locations. The converted and potentially color coded image may be compressed 422 by some standard video compression algorithm or technique so as to reduce memory storage capacity of the extractable video storage component 424 (e.g., the memory component 320 of system 300). In various aspects, a command may be given to the system 300 by a user or the processing component 310 to transmit stored video data and information of the extractable video storage component 424 over a wired video link 426 and/or wireless video link 428 via an antenna 430.
[00074] In one embodiment, in standard operation, the system (e.g., system 300 of FIG. 3) operates as a thermal imaging device producing a video stream representing the thermal signature of a scene (e.g., FOV 332). The video images produced may be stored in a circular frame buffer in non-volatile memory (e.g., memory component 320 of system 300) in a compressed format so as to store a significant amount of video. It should be appreciated that, depending on the memory storage capacity, any length of video may be stored without departing from the scope of the present embodiments. It should also be appreciated that the type of extractable memory module used and the compression ratio may affect the amount of available memory storage as understood by someone skilled in the art.
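The circular frame buffer behavior described above can be sketched in Python as follows; this sketch assumes already-compressed frames and a fixed frame capacity, and uses a deque so that the oldest frame is overwritten once capacity is reached. It is illustrative only, not the recorder defined by the specification.

from collections import deque

class CircularVideoBuffer:
    def __init__(self, max_frames: int):
        self._frames = deque(maxlen=max_frames)  # oldest frames discarded first

    def append(self, compressed_frame: bytes) -> None:
        self._frames.append(compressed_frame)

    def history(self) -> list:
        """Return the retained frames, oldest first."""
        return list(self._frames)

# Example: with capacity 3, only the three most recent frames are retained.
buffer = CircularVideoBuffer(max_frames=3)
for i in range(5):
    buffer.append(f"frame-{i}".encode())
retained = buffer.history()  # frame-2, frame-3, frame-4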
[00075] In one embodiment, in a person detection mode, a processing unit (e.g., processing component 310 of system 300) processing the thermal video stream may be adapted to detect the presence of persons and/or animals. In one embodiment, if a person is detected, the system (e.g., system 300 of FIG. 3) may be set to a PERSON_PRESENT mode, wherein person detection information may be utilized during normal operation as is achieved, for example, in standard video analytics software to generate an alert of potential intrusion. In the event of an emergency, the camera may retain the PERSON_PRESENT mode even when disconnected from main power and video network.
[00076] In one aspect, by collecting scene statistics for each pixel location, a background model of the scene (e.g., FOV 332) may be constructed. This may be considered standard procedure in video analytics applications. The exemplary background model may utilize an average of a time series of values for a given pixel. Because of the lack of shadows and general insensitivity to changing lighting conditions, background modeling may be more effective and less prone to false alarms with thermal imaging sensors. Once a background model has been constructed, regions of the image that differ from the background model may be identified. In the instance of a time series average as a background model, the background may be subtracted from the current captured video frame, and the difference may be thresholded to find one or more ROI (Region Of Interest) corresponding to areas of greatest change. In one example, a detected ROI may indicate the presence of a person.
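As a rough illustration of the background modeling and thresholding described above, the following Python sketch maintains a running per-pixel average and flags the regions of greatest change; the learning rate and threshold are illustrative assumptions.

import numpy as np

class BackgroundModel:
    def __init__(self, first_frame: np.ndarray, learning_rate: float = 0.05):
        self.mean = first_frame.astype(np.float64)
        self.learning_rate = learning_rate

    def update(self, frame: np.ndarray) -> None:
        # Exponential moving average approximates a time-series average per pixel.
        self.mean += self.learning_rate * (frame - self.mean)

    def foreground_mask(self, frame: np.ndarray, threshold: float = 4.0) -> np.ndarray:
        """Pixels differing from the background model by more than the threshold."""
        return np.abs(frame - self.mean) > threshold

# Example: a static scene with a new warm blob in the latest frame.
model = BackgroundModel(np.full((8, 8), 20.0))
current = np.full((8, 8), 20.0)
current[3:5, 3:5] = 34.0
roi_mask = model.foreground_mask(current)
model.update(current)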
[00077] In one embodiment, a radiometrically calibrated thermal camera (e.g., system 300 of FIG. 3) may be utilized, which may allow the fallen person detection module 418 to access absolute temperature values for the ROI. In one example, if the ROI includes at least some areas with temperatures close to body temperature, and if the ROI is of size that may match the profile of a person imaged from the specific camera location, a person may be determined to be present in the captured image. As such, in this instance, the system 300 may be set to PERSON_PRESENT mode. In another example, a user set time constant may determine the length of time that the system 300 may stay in the PERSON_PRESENT mode after the last detection of a person. For instance, the system 300 may stay in the PERSON_PRESENT mode for 10 seconds after the last detection of a person.
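A compact Python sketch of the ROI-level decision and the user-set hold time described above is given below; the size bounds, temperature band, and 10-second hold are illustrative assumptions rather than values from the specification.

import time

class PersonPresenceTracker:
    def __init__(self, hold_seconds: float = 10.0,
                 min_pixels: int = 20, max_pixels: int = 2000,
                 body_low_f: float = 80.0, body_high_f: float = 110.0):
        self.hold_seconds = hold_seconds
        self.min_pixels, self.max_pixels = min_pixels, max_pixels
        self.body_low_f, self.body_high_f = body_low_f, body_high_f
        self._last_detection = None

    def _roi_is_person(self, roi_pixel_count: int, roi_max_temp_f: float) -> bool:
        size_ok = self.min_pixels <= roi_pixel_count <= self.max_pixels
        temp_ok = self.body_low_f <= roi_max_temp_f <= self.body_high_f
        return size_ok and temp_ok

    def update(self, roi_pixel_count: int, roi_max_temp_f: float) -> bool:
        """Return True while the system should remain in PERSON_PRESENT mode."""
        now = time.monotonic()
        if self._roi_is_person(roi_pixel_count, roi_max_temp_f):
            self._last_detection = now
        return (self._last_detection is not None
                and now - self._last_detection <= self.hold_seconds)

# Example: a warm ROI of plausible size keeps PERSON_PRESENT set for 10 seconds.
tracker = PersonPresenceTracker()
person_present = tracker.update(roi_pixel_count=150, roi_max_temp_f=96.5)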
[00078] In one embodiment, in a fallen person mode for example, a processing unit (e.g., processing component 310 of system 300) processing the thermal video stream may be adapted to discriminate between an upright person (e.g., standing or walking person) and a fallen person. In one embodiment, if a fallen person is detected, the system (e.g., system 300 of FIG. 3) may be adapted to generate an alarm. The alarm may be encoded into the video or transmitted via a wired and/or wireless communication link. It should be appreciated that the process of determining if a person has fallen is described for a fixed mount camera, but the approach may be adapted for moving cameras using image registration methods as known by someone skilled in the art.
[00079] For example, a thermal imaging system (e.g., system 300 of FIG. 3) may be mounted at an elevated location, such as the ceiling, and may be pointed or tilted in such a manner that the system observes the scene (e.g., FOV 332) from a close to 180° angle (e.g., as shown in FIG. 3, β being close to 180°). When mounted in this manner, the profile of a standing person (e.g., person 304b) in the scene (e.g., FOV 332) and the profile of a fallen person (e.g., person 304a) in the scene (e.g., FOV 332) appear different to the infrared imaging system 300. For instance, the standing person 304b, as imaged from above, has, in relative terms, a smaller profile than the fallen person 304a having a larger profile. The approximate size (e.g., profile size based on the number of measured pixels) of a standing or fallen person, relative to the total size of the image (e.g., also determined based on the number of measured pixels), may be determined based on an approximate distance to the ground (or floor) relative to the thermal imaging system. This approximate distance may be provided to the system by an operator (e.g., via a wired or wireless communication link), may be determined based on the focus position, may be measured using a distance measuring sensor (e.g., a laser range finder), or may be determined by analyzing statistical properties of objects moving relative to the background (e.g., analysis performed by the thermal image camera or by a remote processor coupled to or formed as part of the thermal imaging system).
[00080] For example, FIG. 5A shows a first profile 500 of an upright person (e.g., standing or walking person, such as person 304b). In another example, FIG. 5B shows a second profile 502 of a fallen person (e.g., such as person 304a). In one aspect, as shown in FIGS. 5A and 5B, the first profile of the upright person is at least smaller than the second profile of the fallen person, which is at least larger than the first profile. In various aspects, the difference between the upright person and the fallen person represents a change in aspect of a person, such as the vertical and/or horizontal aspect of the person. In one embodiment, detection of a fallen person may utilize low resolution radiometry and/or thermal imagery, wherein persons may be imaged as warm blobs that are monitored for their presence, movement, and safety. For example, if someone is detected as fallen, a caregiver may be notified to provide assistance to the fallen person. In another example, the infrared imaging system 300 may be equipped with autonomous two-way audio so that a caregiver may remotely, bi-directionally communicate with a fallen person, if deemed necessary.
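The profile-size discrimination illustrated by FIGS. 5A and 5B can be approximated with the short Python sketch below; the expected standing-person fraction and the fallen-to-standing ratio threshold are assumptions for illustration, since in practice they would depend on the mounting height and distance to the floor discussed above.

def classify_posture(roi_pixel_count: int,
                     image_pixel_count: int,
                     expected_standing_fraction: float = 0.01,
                     fallen_ratio_threshold: float = 2.5) -> str:
    """Classify an ROI as 'standing' or 'fallen' from its size relative to the image."""
    observed_fraction = roi_pixel_count / float(image_pixel_count)
    if observed_fraction >= fallen_ratio_threshold * expected_standing_fraction:
        return "fallen"
    return "standing"

# Example for a 320x240 thermal frame (76,800 pixels).
posture = classify_posture(roi_pixel_count=2400, image_pixel_count=320 * 240)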
[00081] In one embodiment, referring to FIG. 4, the person detection mode 416 and/or the fallen person mode 418 provide awareness to the infrared imaging system 300 as to whether one or more persons are present in the scene (e.g., FOV 332). For example, if at least one person is present in the scene, then the system 300 may be adapted to operate in emergency mode 440, which may be triggered by a motion or movement sensor 442 (e.g., motion sensing component 362). The processing component 310 may be adapted to encode person detection information into a communication signal and transmit the communication signal over a network via, for example, a radio frequency (RF) transceiver 444 (e.g., wireless communication component 352) having an antenna 446 (or via antenna 430). In one embodiment, the person detection information may aid search and rescue personnel in their efforts to prioritize search and rescue operations. [00082] FIG. 6 shows a block diagram illustrating a method 600 for detecting a person in a scene or field of view, in accordance with one or more embodiments. For example, system 100 of FIG. 1 and/or system 300 of FIG. 3 may be utilized to perform method 600.
[00083] In one embodiment, using the method described in FIG. 4 for detecting a person in a scene (e.g., FOV 332) in the person detection mode, a fallen person may be discriminated from a standing or walking person by calculating the size of the ROI (i.e., the size of the area that differs from the background model) and by radiometric properties. By analyzing the change in the scene (e.g., FOV 332) over time, a group of persons walking together (i.e., two or more persons meeting) may be distinguished from a person that suddenly changes position from standing or walking to lying on the ground (i.e., a fallen person). For instance, the speed at which a specific ROI moves across the scene (e.g., FOV 332) may be used as a discriminating parameter since a fallen person may not move or may move slowly.
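The ROI-speed cue mentioned above can be expressed as a small Python sketch: the centroid displacement between frames divided by the frame interval gives an apparent speed, and slow or stationary ROIs become fallen-person candidates. The 5 pixel-per-second threshold is an illustrative assumption.

import math

def roi_speed(prev_centroid: tuple, curr_centroid: tuple, dt_seconds: float) -> float:
    """Apparent ROI speed in pixels per second between two frames."""
    dx = curr_centroid[0] - prev_centroid[0]
    dy = curr_centroid[1] - prev_centroid[1]
    return math.hypot(dx, dy) / dt_seconds

def likely_fallen(speed_px_per_s: float, threshold_px_per_s: float = 5.0) -> bool:
    return speed_px_per_s < threshold_px_per_s

# Example: an ROI that barely moves over half a second.
speed = roi_speed((100.0, 80.0), (101.0, 80.5), dt_seconds=0.5)
candidate = likely_fallen(speed)  # True: the ROI is nearly stationary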
[00084] In one aspect, by collecting scene statistics for each pixel location, a background model 610 of the scene (e.g., FOV 332) may be constructed. The background model 610 may utilize an average of a time series of values for a given pixel, and regions of the image that differ from the background model 610 may be identified. In the instance of a time series average as the background model 610, the background may be subtracted from the current captured video frame, and the difference may be thresholded to find one or more ROI (Region Of Interest) corresponding to areas of greatest change, wherein a detected ROI may indicate the presence of a person. Detection of a fallen person may utilize low resolution radiometric information 612 and thermal imagery, wherein persons may be imaged as warm blobs that are monitored for their presence and movement. Detection of a fallen person may involve user control 614 of parameters, such as setting radiometry resolution, identifying ROI, time period for monitoring the scene, etc.
[00085] Once the background model 610, radiometric information 612, and user control 614 of parameters are obtained, the method 600 is adapted to search for a person in the scene 620, in a manner as described herein. If a person is not present or not detected, then a person present state is set to false 632, and the method 600 is adapted to continue to search for a person in the scene 620. If a person is present or detected in the scene 630, then the person present state is set to true 634, and the method 600 is adapted to analyze the profile of the detected person in the scene 640, in a manner as described herein. The analysis of the scene 640 may monitor persons and detect when assistance may be needed and provide an alert 660 (e.g., a local alarm and/or provide a notification to a designated authority). As a specific example, method 600 (e.g., person present 630 and/or analysis 640) may detect when assistance is needed based upon a person's body position (e.g., fallen person), body temperature (e.g., above or below normal range), and/or total time (e.g., total time in a stationary, motionless position).
[00086] Once the person profile is analyzed 640, the method 600 is adapted to determine if the analyzed profile matches the profile of a fallen person 650. If the profile is not determined to match the profile of a fallen person, then a fallen person state is set to false, and the method 600 is adapted to continue to search for a person in the scene 620. Otherwise, if the profile is determined to match the profile of a fallen person, then the fallen person state is set to true 654, and the method 600 is adapted to generate an alert 660 to notify a user or operator that a fallen person has been detected in the scene. Once the alert is generated 660, the method 600 is adapted to continue to search for a person in the scene 620.
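Tying the above steps together, the following Python sketch mirrors the overall control flow of method 600 (search, person present, profile analysis, fallen person, alert); detect_person and analyze_profile are hypothetical placeholders standing in for the detection and analysis steps, not functions defined by the specification.

def run_method_600(frames, detect_person, analyze_profile, alert):
    """Loop over frames, tracking person-present and fallen-person states."""
    person_present = False
    fallen_person = False
    for frame in frames:
        person_present = detect_person(frame)                # person present? (630)
        if not person_present:
            fallen_person = False                            # keep searching (620)
            continue
        fallen_person = analyze_profile(frame) == "fallen"   # analyze profile (640/650)
        if fallen_person:
            alert(frame)                                     # generate alert (660)
    return person_present, fallen_person

# Example with stand-in callables.
states = run_method_600(
    frames=[{"id": 1}, {"id": 2}],
    detect_person=lambda f: True,
    analyze_profile=lambda f: "fallen" if f["id"] == 2 else "standing",
    alert=lambda f: print("alert for frame", f["id"]),
)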
[00087] FIGS. 7A-7C show block diagrams illustrating methods 700, 720, and 750, respectively, for operating an infrared imaging system in an emergency mode, in accordance with one or more embodiments. In some embodiments, infrared imaging system 100 of FIG. 1 and/or infrared imaging system 300 of FIG. 3 may be utilized as an example of a system, device, or apparatus that may perform methods 700, 720, and/or 750.
[00088] In the emergency mode of operation, the location component 170, 370 is adapted to transmit a homing beacon signal to facilitate locating the system 100, 300, respectively, in a disastrous event, such as in the event of sensed smoke or fire and/or partial or complete collapse of a building. In one embodiment, if the system 100, 300 was operating in
PERSON_PRESENT mode at the time when the system 100, 300 entered emergency mode, then a person present notification is encoded into the transmitted homing beacon signal. If more than one person was present, then the approximate number of persons present may be encoded into the transmitted homing beacon signal.
[00089] Referring to Fig. 7A, if the infrared imaging system 100, 300 is operational during an emergency, then the system 100, 300 may continue to monitor the scene (e.g., FOV 332) and may change its status to PERSON_PRESENT mode after the system 100, 300 went into emergency mode. In one embodiment, processing component 110, 310 may be adapted to operate and/or function as a video recorder controller 710 adapted to store recorded video images in memory component 120. If the infrared imaging system 100, 300 is determined to be operating in an emergency mode (block 712), then stored video data and information is not erased or overwritten (block 714). Otherwise, if the infrared imaging system 100, 300 is determined to not be operating in an emergency mode (block 712), then stored video data and information is continuously overwritten with new video data and information (block 716). [00090] In one aspect, a user defined setting may be adapted to set a threshold for an amount of stored video data and information prior to the system 100, 300 operating in emergency mode. In another aspect, a maximum time may be defined by an amount of nonvolatile memory storage capacity and/or a video data compression ratio. In one example, the system 100, 300 may be configured to have the last ten minutes of video stored and to not overwrite that video history in the event of an emergency. That way, first responders that are able to extract the video from the system (e.g., by extracting the video memory) may be able to determine what happened at a specific location 10 minutes prior to the event that caused the system 100, 300 to enter emergency mode.
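One plausible reading of the video recorder controller 710 behavior is sketched in Python below: outside emergency mode the recorder overwrites the oldest frames circularly, and once emergency mode is entered the retained history (for example, the last ten minutes) is frozen and new frames are no longer allowed to overwrite it. The freezing strategy and capacity are illustrative assumptions.

from collections import deque

class VideoRecorderController:
    def __init__(self, max_frames: int):
        self._frames = deque(maxlen=max_frames)  # circular overwrite by default
        self._frozen = None                      # protected history in emergency mode

    def record(self, compressed_frame: bytes, emergency_mode: bool) -> None:
        if emergency_mode:
            if self._frozen is None:
                self._frozen = list(self._frames)  # stop overwriting stored video
            return                                 # drop new frames rather than overwrite
        self._frames.append(compressed_frame)      # normal circular recording

    def stored_video(self) -> list:
        """Return the protected history if frozen, else the current buffer."""
        return list(self._frozen) if self._frozen is not None else list(self._frames)

# Example: frames recorded normally, then protected when emergency mode begins.
recorder = VideoRecorderController(max_frames=600)
recorder.record(b"frame-0", emergency_mode=False)
recorder.record(b"frame-1", emergency_mode=True)   # history frozen here
protected = recorder.stored_video()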
[00091] In various embodiments, referring to Fig. 7B, different events may cause the system 100, 300 to enter into emergency mode of operation. For example, the system 100, 300 may be adapted to monitor power 722, and if external power is terminated, the system 100, 300 may use battery power for operation and automatically enter emergency mode. In another example, the system 100, 300 may be adapted to monitor seismic activity 724, and if integrated motion sensors 162, 362 measure significant motion (e.g., in the event of an explosion or earthquake), the system 100, 300 may enter emergency mode. In another example, the system 100, 300 may be adapted to monitor user input 726, and if the system 100, 300 has a wired or wireless external communication channel (e.g., Ethernet connection, wireless network connection, etc.), the system 100, 300 may be set into emergency mode by user command. For instance, the system 100, 300 may be adapted to monitor a wired or wireless network for emergency activity. For instance, at a location with multiple systems, one system entering emergency mode may trigger other systems in proximity to enter emergency mode so as to preserve video at the location from that time.
[00092] In one embodiment, referring to Fig. 7B, processing component 110, 310 may be adapted to operate and/or function as an emergency mode controller 730 adapted to detect an event (e.g., power failure event, seismic event, etc.) and set the system 100, 300 to operate in emergency mode (block 736). If the infrared imaging system 100, 300 detects an event and sets the system 100, 300 to operate in emergency mode (block 736), then an emergency mode state is set to true (block 732). Otherwise, if the infrared imaging system 100, 300 does not detect an event and does not set the system 100, 300 to operate in emergency mode (block 736), then an emergency mode state is set to false (block 734).
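The triggers described above for entering emergency mode can be collected into a single hedged Python sketch; the 0.5 g seismic threshold is an arbitrary illustrative value, and the function simply combines the power, seismic, and user-command conditions.

def emergency_mode_state(external_power_ok: bool,
                         peak_acceleration_g: float,
                         user_command: bool,
                         seismic_threshold_g: float = 0.5) -> bool:
    """True when any monitored trigger places the system in emergency mode."""
    power_event = not external_power_ok                          # external power terminated
    seismic_event = peak_acceleration_g >= seismic_threshold_g   # significant motion measured
    return power_event or seismic_event or user_command

# Example: significant motion alone is enough to enter emergency mode.
state = emergency_mode_state(external_power_ok=True,
                             peak_acceleration_g=0.8,
                             user_command=False)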
[00093] In one embodiment, referring to Fig. 7C, processing component 110, 310 may be adapted to operate and/or function as a locator signal controller 760 adapted to transmit a homing beacon signal to facilitate locating the system 100, 300, respectively, in a disastrous event (e.g., earthquake, fire, flood, explosion, building collapse, nuclear event, etc.). In one embodiment, if the system is in emergency mode (block 762) and/or a person is detected to be present (block 764), then a person present 766 is encoded as part of locator signal data 770 in a transmitted locator signal 772 (i.e., homing beacon signal). In one aspect, if more than one person was present, then the approximate number of persons present may be encoded as part of locator signal data 770 in the transmitted locator signal 772. Otherwise, in another embodiment, if the system is in emergency mode (block 762) and/or a person is not detected to be present (block 764), then a person not present 768 is encoded as part of locator signal data 770 in the transmitted locator signal 772.
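A minimal Python sketch of the locator signal data is shown below; the JSON encoding, field names, and system identifier are assumptions made for illustration, as the specification does not prescribe a payload format for the homing beacon.

import json

def build_locator_payload(emergency_mode: bool,
                          persons_detected: int,
                          system_id: str = "IR-CAM-01") -> bytes:
    """Encode person-present information into a homing beacon payload."""
    payload = {
        "system_id": system_id,                 # hypothetical identifier
        "emergency_mode": emergency_mode,
        "person_present": persons_detected > 0,
        "approx_person_count": persons_detected,
    }
    return json.dumps(payload).encode("utf-8")

# Example: emergency mode with two persons detected in the scene.
beacon_data = build_locator_payload(emergency_mode=True, persons_detected=2)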
[00094] In various embodiments, infrared imaging systems 100, 300 are adapted to operate as a disaster camera having a ruggedized enclosure for protecting the camera and non-volatile storage for infrared image data and information. The disaster camera, in accordance with embodiments, is adapted to sense various types of emergencies such as a flood, an earthquake, and/or an explosion (e.g., based on analysis of the thermal image data, via a built-in shock sensor, and/or a seismic sensor), sense heat and smoke (e.g., from a fire based on the thermal image data or other sensors), and/or provide an ability to locate and count persons in a collapsed structure more easily. In one embodiment, the disaster camera may be adapted to operate in a black box mode utilizing a homing beacon signal (e.g., a radio frequency (RF) signal) so that it may be found and located after a disastrous event (e.g., building collapse, earthquake, explosion, etc.). For example, the disaster camera may be adapted to operate as a human presence enunciator for search and rescue events via the homing beacon signal. In one embodiment, the disaster camera includes a thermal camera, a seismic sensor, and an audible enunciator or RF transmitter that signals the presence of any detected persons in the event of seismic activity. Thermal camera imaging may detect the presence or absence of persons in a 360 degree field of view (FOV) by using multiple thermal image cameras or by scanning the FOV using one or more thermal image cameras. The seismic sensor constantly monitors for abrupt and abnormal motion. When such motion is sensed, an audible alarm may be sounded. The alarm is ruggedized and able to operate separately from the system, for example, as a warning beacon.
[00095] FIG. 8 shows an infrared imaging system 800 adapted for monitoring a structure, in accordance with one or more embodiments. For example, in one embodiment, infrared imaging system 800 may comprise a wireless thermal imaging system and/or a wireless thermal image monitoring system for disaster detection and/or disaster restoration monitoring of structure 802. In another embodiment, infrared imaging system 800 may comprise (or further comprise) a thermal imaging camera system for utilization as a disaster camera and/or workplace safety monitoring to aid first responders and/or detect fallen persons in structure 802. In one or more embodiments, infrared imaging system 800 of FIG. 8 may have similar scope and function of system 100 of FIG. 1 and/or infrared imaging system 300 of FIG. 3 and may operate as set forth herein (e.g., selectively in reference to FIGS. 1-7C).
[00096] In one or more embodiments, infrared imaging system 800 utilizes wireless multipoint monitoring devices 830 (e.g., thermal imaging devices, environmental sensor devices, etc.) to monitor the condition of structure 802 including measuring moisture, humidity, temperature, and/or ambient conditions and obtaining thermal images of its structural envelope and/or of its occupants. In one embodiment, condition data (e.g., information) may be collected locally via a processing component 810 and then sent to a hosted website 870 over a network 860 (e.g., Internet) via a network communication device 852 (e.g., a wired or wireless router and/or modem) for remote viewing, control, and/or analysis of restoration conditions and remediation progress. As such, infrared imaging system 800 may utilize network-enabled, multi-monitoring technology to collect a breadth of quality data and provide this data to a user in an easily accessible manner.
[00097] With respect to job monitoring and documentation perspectives, infrared imaging system 800 may improve the efficiency of capturing important moisture, humidity, temperature, and/or ambient readings within the structural envelope. Infrared imaging system 800 may be adapted to provide daily progress reports on restoration conditions and remediation progress at a jobsite for use by industry professionals, such as restoration contractors and insurance companies. Infrared imaging system 800 may be adapted to use moisture meters, thermometers, thermal imaging cameras, and/or hygrometers to monitor conditions and collect data associated with structure 802. Infrared imaging system 800 may be adapted to simultaneously monitor multiple locations at any distance. As such, remote monitoring of each location is useful, and infrared imaging system 800 effectively allows a user (e.g., operator or administrator) to continuously monitor structural conditions of multiple jobsites from one network-enabled computing device from anywhere in the world. Infrared imaging system 800 may provide real-time restoration monitoring that combines wireless sensing device networks and continuous visual monitoring of multiple environmental parameters including humidity, temperature, and/or moisture, along with thermal images and any other related parameters that influence the integrity of structures.
[00098] By coupling ambient sensor data with rich visual detail and thousands of thermal data points found in infrared images, infrared imaging system 800 may be versatile and valuable for structural monitoring, remediation, disaster detection, etc. Infrared imaging system 800 may significantly improve monitoring and documentation capabilities while providing time, travel, and cost savings over conventional approaches.
[00099] In one embodiment, infrared imaging system 800 with thermal imaging capabilities may be utilized for moisture monitoring, removal, and/or remediation in structure 802.
Infrared imaging system 800 may be utilized for monitoring structures (e.g., residences, vacation homes, timeshares, hotels, condominiums, etc.) and aspects thereof including ruptured plumbing, dishwashers, washing machine hoses, overflowing toilets, sewage backup, open doors and/or windows, and anything else that may create the potential for moisture damage and/or energy loss. Commercial buildings may benefit from permanent installations of infrared imaging system 800 to provide continuous protection versus temporary ad-hoc installations.
[000100] In various aspects, infrared imaging system 800 may be utilized to expand structural diagnostic capabilities, provide real-time continuous monitoring, provide remote ability to set alarms and remote alerts for issues occurring on a jobsite, and improve documentation and archiving of stored reports, which for example may be useful for managing legal claims of mold damage. For example, infrared imaging system 800 may be used for restoration monitoring to provide initial measurements (e.g., of temperature, humidity, moisture, and thermal images) to determine initial conditions (e.g., how wet is the structure due to water damage) and may provide these measurements (e.g., periodically or continuously) to a remote location (e.g., hosted website or server) such that restoration progress may be monitored. The information (e.g., measurement data) provided may be used to view a time lapse sequence of the restoration to clearly show the progress of the remediation (e.g., how wet was the structure initially and how dry is it now or at completion of the remediation effort). The information may also be monitored to determine when the remediation is complete based on certain measurement thresholds (e.g., the structure is sufficiently dry and a completion alert provided) and to determine if an alert (e.g., alarm) should be provided if sufficient remediation progress is not being made (e.g., based on certain temperature, humidity, or moisture value thresholds).
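The threshold-based completion and progress checks described above can be sketched in Python as follows; all numeric thresholds are illustrative assumptions, not values from the specification.

def remediation_status(moisture_pct: float,
                       humidity_pct: float,
                       moisture_drop_per_day: float,
                       dry_moisture_pct: float = 12.0,
                       dry_humidity_pct: float = 50.0,
                       min_drop_per_day: float = 1.0) -> str:
    """Return 'complete', 'on_track', or 'alert' for a monitored structure."""
    if moisture_pct <= dry_moisture_pct and humidity_pct <= dry_humidity_pct:
        return "complete"   # structure sufficiently dry; completion alert may be sent
    if moisture_drop_per_day < min_drop_per_day:
        return "alert"      # insufficient remediation progress
    return "on_track"

# Example: still wet and drying too slowly, so an alert is warranted.
status = remediation_status(moisture_pct=22.0, humidity_pct=63.0,
                            moisture_drop_per_day=0.4)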
[000101] Infrared imaging system 800 may be utilized to reduce site visit travel and expense by providing cost-effective remote monitoring of structures and buildings. Infrared imaging system 800 may be utilized to provide the contractor with quick and accurate validations that a jobsite is dry prior to removing drying equipment. Infrared imaging system 800 may be utilized to provide insurance companies and adjusters with access to current or past claims to monitor progress of a contractor, which may allow insurance companies to make sure the contractor is not charging for more work than is actually being performed, and allow insurance companies access to stored data for any legal issues that may arise.
[000102] Infrared system 800, for an embodiment, may be utilized to provide remote monitoring of structure 802 to detect a fire, flood, earthquake, or other disaster and provide an alarm (e.g., an audible alarm, an email alert, a text message, and/or any other desired form of communication for a desired warning) to notify appropriate personnel and/or systems. For example, for an embodiment, infrared system 800 may be distributed through a portion of or throughout a building to detect a fire or, for a recently extinguished fire, to detect if structural temperatures are beginning to increase or the potential risk for the fire to restart (e.g., to rekindle) is increasing and reaches a certain threshold (e.g., a predetermined temperature threshold). In such an application, infrared system 800 may provide an alarm to notify the fire department, occupants within structure 802, or other desired personnel. As a specific example for an embodiment, infrared system 800 may comprise one or more thermal infrared cameras (e.g., infrared imaging system 100, 300, or some portion of this system) within and/or around structure 802 to monitor for fire or the rekindle potential of an extinguished fire. The thermal infrared cameras may provide thermal image data, which could be provided (e.g., sent via a wired or wireless communication link) to a fire station for personnel to monitor to detect a fire or the potential of a fire (e.g., based on images and temperature readings of surfaces of structure 802). Infrared system 800 may also provide an alarm if certain thermal conditions based on the temperature measurements are determined to be present for structure 802.
[000103] In an embodiment, infrared imaging system 800 may include a base unit (e.g., processing component 810 and network communication device 852) that functions as a receiver for all wireless remote probes. The base unit may include a color display and be adapted to record data, process data, and transmit data (e.g., in real time) to a hosted website for remote viewing and retrieval by a user, such as a contractor, emergency personnel, and/or insurance appraiser. The base unit may include a touch screen display for improved usability and a USB and/or SD card slot for transferring data onsite without the use of a laptop or PC.
[000104] In one embodiment, infrared imaging system 800 may include various monitoring devices 830 (e.g., various types of sensors), which may include for example a first type of sensor and/or a second type of sensor. For example, the first type of sensor may include a pin-type moisture and ambient probe adapted to collect moisture levels and RH, air temperature, dew point, and/or grains per pound levels. Each first type of sensor may be uniquely identified based on a particular layout and/or configuration of a jobsite. As another example, the second type of sensor may represent a standalone thermal imaging sensor to capture infrared image data. As a specific example, the second type of sensor may include a display and may further include an integrated ambient sensor to monitor humidity and/or moisture levels. In one or more embodiments, the first and second type of sensors may be combined to form one modular sensor that may be compact, portable, self contained, and/or wireless and which may be installed (e.g., attached to a wall, floor, and/or ceiling) within a structure as desired by a user.
[000105] Infrared imaging system 800 may include an Internet connection adapted to transmit data from the base unit (e.g., network communication device 852) located at a jobsite in real-time via the Internet to a website for monitoring, analysis, and downloading. This may be achieved by a LAN/WAN at the site if one is available, or may require an internal wireless telecommunication system, such as a cellular-based (e.g., 3G or 4G) wireless connection for continuous data transmission.
[000106] In various embodiments, infrared imaging system 800 may include various monitoring devices 830, which may include for example moisture sensors and thermal imaging sensors fixed to a wall, baseboard, cabinet, etc. where damage may not occur and/or where a wide field of view of a given wall or surface may be achieved. Each monitoring device 830 (e.g., each sensor) may use a battery (e.g., a lithium battery) and, therefore, not require an external power source. Alternately, fixed, rotating sensors mounted on a ceiling may be employed to provide a 360 degree view of a given room. After installation of the base unit and sensors, any related software may be loaded onto a laptop, or use of a full-featured website may allow the user to configure reporting intervals and determine thresholds, and/or set readings desired for remote viewing. Configuration may be done onsite or remotely and settings may be changed at any time from the website interface, as would be understood by one skilled in the art.
[000107] Alarms may be configured to remotely notify the user of any problems that arise on a jobsite or other area being monitored by infrared imaging system 800. This may be achieved on the website by setting threshold alarms with specific moisture, humidity, or temperature ranges. For example, in some restoration cases, homeowners may unplug drying equipment at night because of excessive noise levels or, as another example, a contractor may load a single circuit with several drying devices that results in a fuse blowing when the homeowner switches additional electrical appliances on. With the alarm notification feature, the sensor automatically responds to a preset threshold and sends an email or text message to the user. For example, a user may set up the system to be notified if the relative humidity rises or air temperature falls (e.g., for water damage restoration applications), indicating a problem and meriting a visit by the contractor.
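The alarm notification feature described above can be illustrated with a short Python sketch; the notify callable stands in for the email or text-message gateway, and the threshold values are illustrative only.

def check_alarms(readings: dict, thresholds: dict, notify) -> list:
    """readings maps sensor name to value; thresholds maps sensor name to (low, high)."""
    triggered = []
    for name, value in readings.items():
        low, high = thresholds.get(name, (float("-inf"), float("inf")))
        if value < low or value > high:
            notify(f"{name} out of range: {value} (allowed {low}-{high})")
            triggered.append(name)
    return triggered

# Example: rising humidity and falling air temperature trigger notifications.
alarms = check_alarms(
    readings={"relative_humidity_pct": 78.0, "air_temperature_f": 61.0},
    thresholds={"relative_humidity_pct": (30.0, 60.0),
                "air_temperature_f": (65.0, 95.0)},
    notify=print,   # placeholder for an email or text-message gateway
)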
[000108] Infrared imaging system 800 may be secured with login credentials, such as a user identification and password permitting access to only certain persons. A user may choose to grant access to an insurance adjuster by providing a unique user name and password. Real time data may be automatically downloaded and stored to a server for future viewing. Even if there is a power failure at the jobsite, infrared imaging system 800 and/or the website may be adapted to store the captured data.
[000109] In one embodiment, with the data readings compiled and thermal images captured by infrared imaging system 800, a user may determine which areas need additional monitoring (e.g., additional drying) or show proof that a building is completely dry before leaving a jobsite. Data and records from the infrared imaging system 800 may be useful for mitigating legal exposure.
[000110] The monitoring devices 830 may include one or more ambient sensors with accuracy of at least +/- 2% for relative humidity, with a full range of 0-100%, and a high temperature range up to at least 175°F, as specific examples. The monitoring devices 830 may include one or more moisture sensors with a measuring depth, for example, up to at least 0.75" into building material. The monitoring devices 830 may include one or more thermal views from one or more thermal cameras providing one or more wall shots or 360-degree rotational views. The monitoring devices 830 may include a long range wireless transmission capability up to, for example, 500 feet between each monitoring device 830 and the base unit (e.g., processing component 810 and network communication device 852, which may be combined and/or implemented as one or more devices). The base unit may be accessible via a wired and/or wireless network and may provide 24/7 data availability via dynamic online reporting tools adapted to view, print, and email charts and graphs of the monitoring conditions, as would be understood by one skilled in the art. Infrared imaging system 800 may provide for full access to system configuration settings, customizable thresholds and alarms, user access management (e.g., add, remove, and/or modify personnel access), and alerts to the user or operator via cell phone, text message, email, etc., as would be understood by one skilled in the art. Infrared imaging system 800 may include a display to view real time readings on site and provide the ability to toggle between room sensors. [000111] In one embodiment, conventional visible light cameras (e.g., visible spectrum imagers) are typically not accepted in areas where privacy is protected, such as bathrooms, showers, etc. In contrast, an infrared imager (e.g., a low resolution thermal imager) provides a thermal image where the identity of a person may be protected because the person appears as a warm blob that does not represent detailed features, such as facial features, of a person. As such, an infrared imager may be selected or designed to provide low resolution thermal images that define a person as a non-descript blob to protect the identity of the person. Thus, infrared imagers are less intrusive than visible light imagers. Furthermore, due to the radiometric capabilities of thermal imagers, objects at human temperature ranges may be discriminated from other objects, which may allow infrared imaging systems and methods in accordance with present embodiments to operate at a low spatial resolution to detect persons, without producing images that may allow for observers to determine the identity of the persons.
[000112] Where applicable, various embodiments of the invention may be implemented using hardware, software, or various combinations of hardware and software. Where applicable, various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the scope and functionality of the present disclosure. Where applicable, various hardware components and/or software components set forth herein may be separated into subcomponents having software, hardware, and/or both without departing from the scope and functionality of the present disclosure. Where applicable, it is contemplated that software components may be implemented as hardware components and vice versa.
[000113] Software, in accordance with the present disclosure, such as program code and/or data, may be stored on one or more computer readable mediums. It is also contemplated that software identified herein may be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
[000114] In various embodiments, software for modules 112A-112N may be embedded (i.e., hard-coded) in processing component 110 or stored on memory component 120 for access and execution by processing component 110. In one aspect, code (e.g., software and/or embedded hardware) for modules 112A-112N may be adapted to define preset display functions that allow processing component 110 to automatically switch between various processing techniques for sensed modes of operation, as described herein.

[000115] Embodiments described above illustrate but do not limit the disclosure. It should also be understood that numerous modifications and variations are possible in accordance with the principles of the present disclosure. Accordingly, the scope of the disclosure is defined only by the following claims.
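A minimal sketch of the preset-function switching mentioned in paragraph [000114] follows; the mode names and processing functions are hypothetical stand-ins for whatever processing techniques modules 112A-112N actually define.

    import numpy as np

    def stretch_contrast(frame):
        # Simple min/max normalization, used here as a stand-in for a night-mode technique
        lo, hi = float(frame.min()), float(frame.max())
        return (frame - lo) / (hi - lo + 1e-6)

    def pass_through(frame):
        return frame

    # Preset table mapping a sensed mode of operation to a processing function
    PRESET_FUNCTIONS = {"night": stretch_contrast, "day": pass_through}

    def process_frame(frame, sensed_mode):
        # Automatically switch processing based on the sensed mode, defaulting to pass-through
        return PRESET_FUNCTIONS.get(sensed_mode, pass_through)(frame)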

Claims

CLAIMS
What is Claimed is:
1. A wireless thermal imaging system, comprising:
a communication component adapted to remotely communicate with a user over a network;
one or more wireless thermal image sensors adapted to capture and provide thermal images of structural objects of a structure for monitoring moisture and/or temperature of the structural objects; and
a processing component adapted to receive the thermal images of the structural objects from the one or more wireless thermal image sensors, and process the thermal images of the structural objects to generate at least one of moisture content information for remote analysis of restoration conditions of the structural objects and/or fire hazard information for remote analysis of fire hazard conditions for the structural objects.
2. The system of claim 1, wherein the one or more wireless thermal image sensors comprise one or more infrared cameras adapted to continuously monitor environmental parameters including one or more of humidity, temperature, and moisture associated with the structural objects.
3. The system of claim 1, wherein the wireless thermal imaging system comprises a ruggedized thermal camera system adapted for use as a disaster monitoring camera system to detect and monitor damage from disastrous events including at least one of flooding, fire, explosion, and earthquake, and wherein the ruggedized thermal camera system comprises an enclosure to enclose each of the corresponding thermal image sensors and is configured to withstand disastrous events.
4. The system of claim 1, wherein the wireless thermal imaging system comprises a thermal camera system adapted for use as a safety monitoring system to detect one or more persons in the structure including one or more fallen persons in the structure, and wherein the processing component is adapted to generate an alert to emergency personnel in the event of a fire or a person in need of assistance.
5. The system of claim 1, wherein the one or more wireless thermal image sensors are adapted to monitor one or more conditions of the structure including measuring one or more of moisture, humidity, and ambient conditions of its structural envelope.
6. The system of claim 1, wherein condition information of the structural objects of the structure is collected locally via the processing component and provided to a hosted website over the network via the communication component for remote viewing and analysis of restoration conditions by the user.
7. The system of claim 1, further comprising wireless sensors including a moisture meter and/or a hygrometer to monitor moisture conditions and provide information on the moisture conditions related to the structure to the processing component.
8. The system of claim 1, wherein the wireless thermal imaging system is adapted to simultaneously monitor multiple structures.
9. The system of claim 1, wherein the one or more wireless thermal image sensors are affixed to at least one structural object of the structure to provide a view of one or more other structural objects of the structure to monitor the temperature of the structural objects to provide the fire hazard information for the remote analysis of fire hazard conditions for the structural objects, and wherein the processing component is adapted to provide an alarm to notify authorities if a fire hazard reaches a predetermined threshold based on the temperature.
10. The system of claim 1, wherein the processing component is adapted to provide an alarm to remotely notify the user of a disastrous event related to the structure based on a threshold condition with specific moisture or temperature ranges.
11. A method, comprising:
remotely communicating with a user over a network;
capturing and providing thermal images of structural objects of a structure for monitoring moisture and/or temperature levels of the structural objects;
receiving the thermal images of the structural objects from one or more wireless thermal image sensors; and processing the thermal images of the structural objects to generate at least one of moisture content information for remote analysis of restoration conditions of the structural objects and/or fire hazard information for remote analysis of fire hazard conditions for the structural objects.
12. The method of claim 11, further comprising monitoring environmental parameters of the structural objects including one or more of humidity, temperature, and moisture associated with the structural objects.
13. The method of claim 11, further comprising:
detecting and monitoring damage from disastrous events including at least one of flooding, fire, explosion, and earthquake; and
generating an alert to provide information based on the detecting and monitoring to authorities.
14. The method of claim 11, further comprising:
detecting one or more persons in the structure including one or more fallen persons in the structure; and
communicating with authorities to provide information based on the processing and/or the detecting.
15. The method of claim 11, further comprising monitoring one or more conditions of the structure including measuring one or more of moisture, humidity, temperature, and ambient conditions of its structural envelope.
16. The method of claim 11, further comprising:
gathering condition information of the structural objects of the structure; and sending the condition information to a hosted website over the network for remote viewing and analysis of restoration conditions by the user.
17. The method of claim 11, further comprising storing the thermal images and/or moisture content information in a memory component.
18. The method of claim 11, wherein the method is adapted to simultaneously monitor multiple structures.
19. The method of claim 11, further comprising providing an alarm to remotely notify the user of a disastrous event related to the structure based on a threshold condition with specific moisture or temperature ranges.
20. A computer-readable medium on which is stored non-transitory information for performing a method by a computer, the method comprising:
remotely communicating with a user over a network;
receiving thermal images of structural objects of a structure from one or more wireless thermal image sensors; and
processing the thermal images of the structural objects to generate moisture content information for remote analysis of restoration conditions of the structural objects and/or fire hazard information for remote analysis of fire hazard conditions for the structural objects.
PCT/US2012/025697 2011-02-22 2012-02-17 Infrared sensor systems and methods WO2012115881A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP12710001.4A EP2678842A1 (en) 2011-02-22 2012-02-17 Infrared sensor systems and methods
US13/973,968 US20130335550A1 (en) 2011-02-22 2013-08-22 Infrared sensor systems and methods

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161445280P 2011-02-22 2011-02-22
US61/445,280 2011-02-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/973,968 Continuation US20130335550A1 (en) 2011-02-22 2013-08-22 Infrared sensor systems and methods

Publications (1)

Publication Number Publication Date
WO2012115881A1 true WO2012115881A1 (en) 2012-08-30

Family

ID=45873225

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/025697 WO2012115881A1 (en) 2011-02-22 2012-02-17 Infrared sensor systems and methods

Country Status (3)

Country Link
US (1) US20130335550A1 (en)
EP (1) EP2678842A1 (en)
WO (1) WO2012115881A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014180335A1 (en) * 2013-05-09 2014-11-13 Wang Hao Thermal image analysis monitoring device and monitoring system, and thermal image analysis monitoring method
WO2014180337A1 (en) * 2013-05-09 2014-11-13 王浩 Thermal image monitoring control device, monitoring system and thermal image monitoring control method
WO2015000492A1 (en) * 2013-07-02 2015-01-08 Rebhi Taha Hocine Electronic apparatus for protecting against danger for public and industrial use
WO2016064946A1 (en) * 2014-10-21 2016-04-28 Osram Sylvania Inc. Multi-condition sensing device including an ir sensor
CN105934781A (en) * 2014-01-03 2016-09-07 玛丽卡尔公司 Method and system for monitoring
EP2963628A4 (en) * 2013-02-26 2016-10-05 Hitachi Ltd Monitoring system
WO2018002406A1 (en) * 2016-06-29 2018-01-04 Ontech Security, Sl Device, system and method for detecting emergencies in public facilities, building, vehicles and transport networks
US10539502B2 (en) 2015-04-27 2020-01-21 Flir Systems, Inc. Moisture measurement device with thermal imaging capabilities and related methods
CN108663407B (en) * 2018-04-18 2020-07-03 国网上海市电力公司 Intelligent alarm method for underground space humidity
CN112185057A (en) * 2020-09-29 2021-01-05 国网四川省电力公司眉山供电公司 Cable tunnel fire early warning system
US11346938B2 (en) 2019-03-15 2022-05-31 Msa Technology, Llc Safety device for providing output to an individual associated with a hazardous environment
US11927488B2 (en) * 2019-01-03 2024-03-12 Chia-Ling Chen Thermal detection system capable of providing early warning and related products

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8907282B2 (en) * 2012-08-10 2014-12-09 Fluke Corporation Thermal imaging camera with intermittent image capture
EP2974277A1 (en) * 2012-11-30 2016-01-20 Robert Bosch GmbH Methods to combine radiation-based temperature sensor and inertial sensor and/or camera output in a handheld/mobile device
US9251687B2 (en) * 2013-04-19 2016-02-02 Jonathan Thompson Global positioning system equipped hazard detector and a system for providing hazard alerts thereby
US10728468B2 (en) * 2013-07-17 2020-07-28 Fluke Corporation Activity and/or environment driven annotation prompts for thermal imager
US 11300855B2 2015-02-27 2022-04-12 I&Eye Enterprises, LLC Wastewater monitoring system and method
US10602040B2 (en) * 2015-02-27 2020-03-24 I&Eye Enterprises, LLC Wastewater monitoring system and method
US11238984B2 (en) * 2015-04-24 2022-02-01 Honor Technology, Inc. Systems and methods for ensuring quality of care services
US9852645B2 (en) * 2015-08-17 2017-12-26 The Boeing Company Global positioning system (“GPS”) independent navigation system for a self-guided aerial vehicle utilizing multiple optical sensors
CN105263006A (en) * 2015-11-19 2016-01-20 贵州大学 Remote video monitoring system for tobacco baking rooms
US10140832B2 (en) 2016-01-26 2018-11-27 Flir Systems, Inc. Systems and methods for behavioral based alarms
US20170339343A1 (en) * 2016-05-17 2017-11-23 Tijee Corporation Multi-functional camera
US10796425B1 (en) * 2016-09-06 2020-10-06 Amazon Technologies, Inc. Imagery-based member deformation gauge
US10722148B2 (en) * 2017-10-24 2020-07-28 Florida State University Research Foundation, Inc. Fall detection devices, systems, and methods
US10834336B2 (en) * 2018-01-29 2020-11-10 Ge Aviation Systems Llc Thermal imaging of aircraft
US11176799B2 (en) 2019-09-10 2021-11-16 Jonathan Thompson Global positioning system equipped with hazard detector and a system for providing hazard alerts thereby
JP7489248B2 (en) * 2020-02-06 2024-05-23 新東ホールディングス株式会社 Information processing system, information processing method, and information processing program
WO2021232095A1 (en) * 2020-05-20 2021-11-25 Erichsen Asset Pty Ltd A thermography inspection system and method of use thereof
US11350262B1 (en) 2021-05-11 2022-05-31 Daniel Kenney Self-contained disaster condition monitoring system
GB202111624D0 (en) * 2021-08-12 2021-09-29 Thermomedia Europe Ltd Computer-implemented system and method for monitoring a predetermined condition, characteristic or feature
CN113671589A (en) * 2021-09-14 2021-11-19 清华大学 Safety detection match physical system
TWM632669U (en) * 2022-05-13 2022-10-01 劉勉志 Smoking alarm device in non-smoking space
US11780405B1 (en) * 2022-05-19 2023-10-10 Vincent BELL Vehicle alarm assembly
US11895387B2 (en) 2022-07-08 2024-02-06 I & EyeEnterprises, LLC Modular camera that uses artificial intelligence to categorize photos

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004047039A1 (en) * 2002-11-21 2004-06-03 Wespot Ab Method and device for fall prevention and detection
WO2005040779A1 (en) * 2003-10-28 2005-05-06 Flir Systems Ab Method, use and system of an ir-camera for determining the risk of condensation
US20050146429A1 (en) * 2003-12-31 2005-07-07 Spoltore Michael T. Building occupant location and fire detection system
FR2870378A1 (en) * 2004-05-17 2005-11-18 Electricite De France Person e.g. aged person, fall detecting method for use in e.g. home, involves providing exposing unit for successive exposures of same plane, and detecting reduction in height of outline form beyond chosen threshold interval
EP1732314A2 (en) * 2005-06-06 2006-12-13 Flir Systems AB Infrared camera with humidity sensor
US20070229663A1 (en) * 2006-03-31 2007-10-04 Yokogawa Electric Corporation Image processing apparatus, monitoring camera, and image monitoring system
WO2007139658A2 (en) * 2006-05-24 2007-12-06 Objectvideo, Inc. Intelligent imagery-based sensor
US20080186189A1 (en) * 2007-02-06 2008-08-07 General Electric Company System and method for predicting fall risk for a resident
US7567200B1 (en) * 2006-04-27 2009-07-28 Josef Osterweil Method and apparatus for body position monitor and fall detection using radar
US20090309723A1 (en) * 2008-06-13 2009-12-17 Freebody Allan P Public distress beacon and method of use thereof
WO2010055205A1 (en) * 2008-11-11 2010-05-20 Reijo Kortesalmi Method, system and computer program for monitoring a person

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8159338B2 (en) * 2002-06-11 2012-04-17 Automotive Technologies International, Inc. Asset monitoring arrangement and method
IL174523A0 (en) * 2006-03-23 2006-12-31 Opgal Optronic Ind Ltd System for detecting and locating a thermal event and for reactive measures
US8054177B2 (en) * 2007-12-04 2011-11-08 Avaya Inc. Systems and methods for facilitating a first response mission at an incident scene using patient monitoring
EP2281255A4 (en) * 2008-04-17 2013-02-20 Travelers Indemnity Co A method of and system for determining and processing object structure condition information
US20100081411A1 (en) * 2008-09-29 2010-04-01 John Mathew Montenero, III Multifunctional telemetry alert safety system (MTASS)


Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2963628A4 (en) * 2013-02-26 2016-10-05 Hitachi Ltd Monitoring system
US9728060B2 (en) 2013-02-26 2017-08-08 Hitachi, Ltd. Monitoring system
WO2014180337A1 (en) * 2013-05-09 2014-11-13 王浩 Thermal image monitoring control device, monitoring system and thermal image monitoring control method
WO2014180335A1 (en) * 2013-05-09 2014-11-13 Wang Hao Thermal image analysis monitoring device and monitoring system, and thermal image analysis monitoring method
WO2015000492A1 (en) * 2013-07-02 2015-01-08 Rebhi Taha Hocine Electronic apparatus for protecting against danger for public and industrial use
US10311695B2 (en) 2014-01-03 2019-06-04 Maricare Oy Method and system for monitoring
EP3090416A4 (en) * 2014-01-03 2017-09-13 MariCare Oy Method and system for monitoring
EP3090416B1 (en) 2014-01-03 2020-05-27 MariCare Oy Method and system for monitoring
CN105934781A (en) * 2014-01-03 2016-09-07 玛丽卡尔公司 Method and system for monitoring
AU2014375197B2 (en) * 2014-01-03 2019-06-27 Maricare Oy Method and system for monitoring
WO2016064946A1 (en) * 2014-10-21 2016-04-28 Osram Sylvania Inc. Multi-condition sensing device including an ir sensor
US10539502B2 (en) 2015-04-27 2020-01-21 Flir Systems, Inc. Moisture measurement device with thermal imaging capabilities and related methods
CN109564716A (en) * 2016-06-29 2019-04-02 安泰克安全公司 For detecting the devices, systems, and methods of emergency
WO2018002406A1 (en) * 2016-06-29 2018-01-04 Ontech Security, Sl Device, system and method for detecting emergencies in public facilities, building, vehicles and transport networks
US10672246B2 (en) 2016-06-29 2020-06-02 Ontech Security, Sl Device, system and method for detecting emergencies in public facilities building, vehicles and transport networks
CN108663407B (en) * 2018-04-18 2020-07-03 国网上海市电力公司 Intelligent alarm method for underground space humidity
US11927488B2 (en) * 2019-01-03 2024-03-12 Chia-Ling Chen Thermal detection system capable of providing early warning and related products
US11346938B2 (en) 2019-03-15 2022-05-31 Msa Technology, Llc Safety device for providing output to an individual associated with a hazardous environment
CN112185057A (en) * 2020-09-29 2021-01-05 国网四川省电力公司眉山供电公司 Cable tunnel fire early warning system
CN112185057B (en) * 2020-09-29 2021-12-07 国网四川省电力公司眉山供电公司 Cable tunnel fire early warning system

Also Published As

Publication number Publication date
EP2678842A1 (en) 2014-01-01
US20130335550A1 (en) 2013-12-19

Similar Documents

Publication Publication Date Title
US20160203694A1 (en) Infrared sensor systems and methods
US20130335550A1 (en) Infrared sensor systems and methods
KR101544019B1 (en) Fire detection system using composited video and method thereof
US9311794B2 (en) System and method for infrared intruder detection
KR102028147B1 (en) Integrated control system, video analysis device and local control server for controlling event
US20090121861A1 (en) Detecting, deterring security system
KR101745887B1 (en) Apparatus for alerting fire alarm
KR102034559B1 (en) Appartus and method for monitoring security using variation of correlation coefficient pattern in sound field spectra
KR101863530B1 (en) System for fire predict and maintenance using visible light and infrared ray thermal image
CN106898110B (en) Method and device is monitored using the fire of CCTV
US20040216165A1 (en) Surveillance system and surveillance method with cooperative surveillance terminals
KR101377184B1 (en) Safety and disaster prevention system by using mobile communication network
KR20160053695A (en) Automatic window control system and method based on smartphone
KR101975021B1 (en) Fire-fighting safety management system using communication facility of apartment house
US9594290B2 (en) Monitoring apparatus for controlling operation of shutter
JP2018191030A (en) Surveillance camera system
US20200175843A1 (en) Methods and systems for first responder access to localized presence and identification information
CN114333008A (en) Intelligent building management system and method with heating screening and face recognition access control functions
KR101644032B1 (en) Home security system and method thereof
KR101822489B1 (en) Intelligent-control-unit for crime-disaster prevention and crime-disaster prevention system utilizing intelligent-control-unit
WO2023119436A1 (en) Crime prevention system and crime prevention method
KR101741312B1 (en) Real-time monitoring system for home
KR20160102914A (en) Wireless security system using smart phone
CN115836516A (en) Monitoring system
KR20170099263A (en) emergency sensing apparatus and method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12710001

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2012710001

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2012710001

Country of ref document: EP