SE2250815A1 - A system and method for fire detection - Google Patents
A system and method for fire detection
- Publication number
- SE2250815A1
- Authority
- SE
- Sweden
- Prior art keywords
- image
- fire
- image capturing
- sensor
- capturing device
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/12—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
- G08B17/125—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/12—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B29/00—Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
- G08B29/18—Prevention or correction of operating errors
- G08B29/183—Single detectors using dual technologies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/10—Actuation by presence of smoke or gases, e.g. automatic alarm devices for analysing flowing fluid materials by the use of optical means
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B19/00—Alarms responsive to two or more different undesired or abnormal conditions, e.g. burglary and fire, abnormal temperature and abnormal rate of flow
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/14—Central alarm receiver or annunciator arrangements
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B29/00—Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
- G08B29/18—Prevention or correction of operating errors
- G08B29/185—Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
- G08B29/188—Data fusion; cooperative systems, e.g. voting among different detectors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/13—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Signal Processing (AREA)
- Computer Security & Cryptography (AREA)
- Fire-Detection Mechanisms (AREA)
Abstract
The present disclosure relates to a system (100) and method for fire detection. The system comprises a first image capturing device (101) comprising a sensor arranged to detect infrared radiation and a second image capturing device (102) comprising a sensor arranged to detect visible radiation. The first and second image capturing devices are arranged to cover an overlapping area. The relation between the first and second image capturing devices is known. Thereby, a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The system further comprises a controller (103) comprising a processor (104), said controller (103) being operatively connected to said first and second sensors and being arranged to continuously obtain first images captured by the sensor (101) of the first image capturing device and to analyse each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criteria in a part of said first image, said occurrence indicating a potential fire in said part of the first image, and, when a potential fire has been identified, to obtain a second image from the sensor of the second image capturing device and analyse the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.
Description
TECHNICAL FIELD
The present disclosure relates to a camera-based system and method for fire detection.
BACKGROUND
When storing for example waste and biofuels outdoors, fires in such piles are a recurring problem.
Fires can occur in several ways. For example, fires can occur from self-ignition in piles of wood chips, batteries in piles of waste, or by sabotage. Regardless of the cause, such a fire can cause major problems, with regard to safety, health, economy and the environment.
When a fire does occur, it is important that the fire is detected as early as possible, and different types of fire detectors have therefore been developed. Radiation-based fire detectors are common, such as video cameras with flame or smoke detection software or thermal cameras that sound alarms at high temperatures. A major problem, however, is various sources of false alarms, which reduce confidence in the systems and thus their usability. To get around this, surveillance cameras are often used to verify alarms, i.e., an operator receives an alarm, looks at the location of the alarm (using a surveillance camera) and determines if the alarm is a fire or a false alarm. This does not really solve the problem, as it requires that there is someone at hand who can manually check each alarm, i.e., exactly what you want to avoid.
SUMMARY
An object of the invention is to improve the fire detection systems available today. The aim is to decrease the number of false alarms in systems for fire detection.
False alarms can be caused by many different things, but the most common causes are vehicles, such as wheel loaders, and sun reflections. For example, the exhaust pipe of a wheel loader is characteristically around 200 °C.
The object has been achieved by means of a system for fire detection. The system comprises a first image capturing device comprising a sensor arranged to detect infrared radiation and a second image capturing device comprising a sensor arranged to detect visible radiation. The first and second image capturing devices are arranged to cover an overlapping area. The relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The system further comprises a controller comprising a processor. The controller is operatively connected to said first and second sensors. The controller is arranged to continuously obtain first images captured by the sensor of the first image capturing device and to analyse each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criteria in a part of said first image, said occurrence indicating a potential fire in said part of the first image, and, when a potential fire has been identified, to obtain a second image from the sensor of the second image capturing device and analyse the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.
Accordingly, a system is provided wherein an alarming radiation-based fire detector (thermal image sensor) is combined with a visual sensor that detects sources of false alarms. Such a system can then automatically give an alarm and/or start automatic extinguishing, for example using water cannons, without an operator having to verify.

In an option, the controller is arranged to determine whether the potential fire corresponds to a sun reflection based on the analysis of the second image.
Sun reflections, e.g., in puddles of water, give strong measured intensities in both thermal and visual sensors. However, normal fires do not reach high enough temperatures to emit a significant amount of visual light, and analysing the second image is therefore an effective way to distinguish sun reflections from fires.
Further preferred embodiments are defined in the dependent claims.
The present disclosure further relates to a method for fire detection performed by a controller comprising a processor, said controller being operatively connected to a first image capturing device comprising a sensor arranged to detect infrared radiation and to a second image capturing device comprising a sensor arranged to detect visible radiation, wherein the first and second image capturing devices are arranged to cover an overlapping area and the relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The method comprises continuously obtaining first images captured by the sensor of the first image capturing device and analysing each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criteria in a part of said first image, said occurrence indicating a potential fire in said part of the first image. When a potential fire has been identified, steps of obtaining a second image from the sensor of the second image capturing device and analysing the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire are carried out.
DESCRIPTION OF DRAWINGS
Figure 1 is a block diagram showing an example system for fire detection.
Figure 2 illustrates an example implementation of a system for fire detection, seen from above.
Figure 3 illustrates another example implementation of a system for fire detection, seen from above.
Figure 4 is a flow chart illustrating an example method for fire detection.
DETAILED DESCRIPTION
Figure 1 discloses a system 100 for fire detection. The system aims at generating a decreased number of false alarms while not decreasing the number of true alarms.
The system comprises a first, thermal image capturing device 101. The sensor of the first image capturing device is arranged to detect electromagnetic radiation with a wavelength longer than 3 μm.
The system further comprises a second, visual image capturing device 102. The sensor of the second image capturing device is arranged to detect visual wavelengths.
The second, visual image capturing device here refers to an image capturing device having a sensor that picks up light in the visible part of the electromagnetic spectrum. Such a sensor can capture light in a plurality of wavelength bands or be monochrome, such as black and white or grayscale. In some cases, the second sensor is not limited to visible light, but also captures near infrared (NIR) or ultraviolet (UV) light.
In an example, the second image capturing device is arranged to capture electromagnetic radiation within a range of 0.4–1 μm.
The second, visual image capturing device 102 may be a camera, for example a colour (RGB) camera or a black-and-white or grayscale camera.
Characteristically, the wavelengths captured by the first and second image capturing devices do not overlap.
The first and second image capturing devices 101, 102 are arranged to cover an overlapping area. A region of interest to be monitored is located in the overlapping area. The region of interest to be monitored is for example located outdoors. The region of interest to be monitored comprises for example a pile of waste or a pile of biofuels or a pile of other material(s). When storing for example waste and biofuels outdoors, fires in such piles are a recurring problem.
The relation between the first and second image capturing devices 101, 102 is known, whereby a spatial point in a first image captured by the first image capturing device is associated to a corresponding spatial point or line in a second image captured by the second image capturing device and vice versa.
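As a non-limiting illustration of how such a known relation can be used, the sketch below maps a pixel in the thermal image to the corresponding pixel in the visual image via a planar homography. The homography assumption, the matrix values and the function name are illustrative only; the disclosure merely requires that the relation is known, and with a general calibration the correspondence may instead be an epipolar line in the second image.

```python
import numpy as np

def map_thermal_to_visual(point_xy, H):
    """Map a pixel (x, y) in the thermal image to the visual image.

    H is a 3x3 homography relating the two image planes. This assumes the
    monitored scene is approximately planar; the matrix would come from a
    one-time calibration of the two mounted devices.
    """
    x, y = point_xy
    p = H @ np.array([x, y, 1.0])       # homogeneous coordinates
    return p[0] / p[2], p[1] / p[2]     # back to pixel coordinates

# Example: identity-like mapping with a small fixed offset between the cameras
H = np.array([[1.0, 0.0, 12.0],
              [0.0, 1.0, -5.0],
              [0.0, 0.0, 1.0]])
print(map_thermal_to_visual((240, 180), H))   # -> (252.0, 175.0)
```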
The system 100 further comprises a controller 103. The controller comprises a processor 104 and one or more memories 105. The controller 103 is operatively connected to said first image capturing device 101 and said second image capturing device 102. The controller may be connected to the first and second image capturing devices wirelessly or via a wired connection. Interfaces for communication between the controller and the first and second image capturing devices are not illustrated in the figure.
The controller 103 is arranged to continuously receive first images captured by the first image capturing device 101. The controller 103 is further arranged to analyse each first image to detect occurrence of electromagnetic radiation exceeding a pre-set criteria in an identified part of the image. Thus, the controller is arranged to identify any part of the first image where the electromagnetic radiation exceeds the pre-set criteria. Potentially, a fire has occurred in any such identified part of the first image.
The controller is, when a potential fire has been detected in the identified part of the first image, arranged to obtain a second image from the second image capturing device and determine whether there is a fire in a corresponding part of the second image based on analysis of the second image.
Thus, the determination of whether there is a fire is focused on the part or parts of the monitored region of interest that have been identified from the first image as containing a potential fire.
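Purely as an illustration (the disclosure does not prescribe a specific algorithm), the two-stage logic could be sketched as follows, where the temperature threshold and the helper callables `grab_visual_frame`, `map_region` and `is_false_alarm` are hypothetical placeholders:

```python
import numpy as np

ALARM_THRESHOLD = 150.0   # example pre-set criteria (assumed radiometric values in degrees Celsius)

def analyse_thermal_frame(thermal_img):
    """Return a boolean mask of pixels exceeding the pre-set criteria."""
    return thermal_img > ALARM_THRESHOLD

def detect_fire(thermal_img, grab_visual_frame, map_region, is_false_alarm):
    """Two-stage check: thermal trigger first, visual verification second.

    grab_visual_frame: callable returning the current visual (second) image.
    map_region:        maps a thermal-image mask to the corresponding region
                       of the visual image, using the known relation.
    is_false_alarm:    analyses the visual region (sun reflection, vehicle, ...).
    """
    hot = analyse_thermal_frame(thermal_img)
    if not hot.any():
        return False                      # no potential fire in this frame
    visual_img = grab_visual_frame()      # only fetched once triggered
    region = map_region(hot, visual_img)
    return not is_false_alarm(region)     # fire confirmed if no false-alarm cause found
```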
A false alarm can be caused by many things. An important cause is sun reflections. The controller 103 may be arranged to determine whether the potential fire corresponds to a sun reflection based on the analysis of the second image. Sun reflections, e.g., in puddles of water, give strong measured intensities in both thermal and visual sensors. However, normal fires do not reach high enough temperatures to emit a significant amount of visual light, and analysing the second image is therefore an effective way to distinguish sun reflections from fires.

In an example, the second image capturing device is arranged to automatically adjust the exposure time.
The automatic adjustment may be controlled via a processor of the second image capturing device or via the controller 103. Alternatively or in addition thereto, the exposure time may be manually adjustable, for example via a user interface. The user interface may be arranged at the second image capturing device. Instead or in addition thereto, the exposure time may be remotely user controlled, for example via an app of a user electronic device, a computer or the like. The second image capturing device may have a receiver for reception of such remote control signals.

The controller may then be arranged to form an intensity value based on the exposure time used at image capture and pixel value(s) of the second image, and to determine whether the potential fire corresponds to a sun reflection based on the formed intensity value.

In this context, it should be mentioned that many visual cameras automatically adjust the exposure time according to the brightness of the scene (the area being imaged). This means that a strong light source will give high pixel values if it is strong relative to other parts of the same scene, even if it is weak by absolute standards; thus it is not possible to say that a light source is strong just because it has a high pixel value. Including the exposure time in the determination resolves this problem.
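A minimal sketch of such an exposure-compensated intensity value is given below; the division by exposure time and the numeric threshold are assumptions for illustration, not values taken from the disclosure.

```python
def absolute_intensity(pixel_value, exposure_time_s):
    """Exposure-compensated brightness estimate.

    With automatic exposure, raw pixel values are only relative to the rest of
    the scene; dividing by the exposure time gives a value that can be compared
    against an absolute threshold.
    """
    return pixel_value / exposure_time_s

SUN_REFLECTION_THRESHOLD = 1.0e5   # assumed units: pixel counts per second

def is_sun_reflection(region_pixels, exposure_time_s):
    """Classify the region of the second image flagged by the thermal trigger."""
    peak = max(region_pixels)
    return absolute_intensity(peak, exposure_time_s) > SUN_REFLECTION_THRESHOLD
```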
Accordingly, the second image capturing device may be arranged to automatically adjust the exposure time such that light sources such as sun reflections give a response in the second image capturing device, while parts without light sources, or where the light source is a fire, get pixel values close to zero. The automatic adjustment may be a way of handling varying daylight levels and thereby a varying strength of sun reflections.
Further, the second, visual image capturing device may comprise a polarizing filter, thereby allowing for improved detection of sun reflections. The sun reflections are usually polarized and using a polarizing filter could help in detecting sun reflections.
Further, the second, visual image capturing device may comprise an optical band pass filter, wherein the band pass filter optionally is arranged to let through blue and/or ultraviolet wavelengths while stopping red and/or infrared wavelengths. Thereby, improved detection of sun reflections can be achieved. A normal fire will emit some radiation in red and infrared wavelengths but only a very small amount of radiation in blue and ultraviolet wavelengths. Thus, blocking red and/or infrared radiation will reduce the probability of sensing radiation from a fire.
The second, visual image capturing device may have a plurality of wavelength bands. The controller may then be arranged to determine whether the potential fire corresponds to a sun reflection based on a relation between pixel values for the different wavelength bands.
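One conceivable way to use such a relation between wavelength bands is a blue-to-red ratio test, since sun reflections are roughly white while a normal fire emits mostly red and infrared light. The sketch below, including the channel ordering and the ratio threshold, is an illustrative assumption rather than a method defined in the disclosure:

```python
import numpy as np

def looks_like_sun_reflection(region_rgb, ratio_threshold=0.7):
    """Classify a bright region by the relation between its colour bands.

    region_rgb: H x W x 3 array (assumed RGB order) for the part of the second
    image that corresponds to the potential fire in the first image.
    """
    red = region_rgb[..., 0].astype(float).mean() + 1e-6   # avoid divide-by-zero
    blue = region_rgb[..., 2].astype(float).mean()
    return (blue / red) > ratio_threshold   # white-ish light -> likely sun reflection
```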
Another important cause of false alarms is vehicles, such as wheel loaders. The controller may be arranged to analyse the first image and/or second image to detect presence of a vehicle and to determine whether the potential fire corresponds to the detected vehicle. Thus, some algorithm arranged to recognize vehicles may be used. For example, the controller may be arranged to identify vehicles from the second image based on visual characteristics of vehicles. In addition or instead, the controller may be arranged to identify vehicles based on heat characteristics of different types of vehicles from the first images. Further or instead, the detection of a vehicle may be made based on a plurality of subsequent images captured by the first and/or second image capturing device, to determine a vehicle based on whether the object is moving or not and to determine whether the potential fire corresponds to the detected moving vehicle.
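As an illustrative sketch of the movement-based test, simple frame differencing over a few consecutive images can indicate whether the hot object is moving; the thresholds below are assumed values, not taken from the disclosure:

```python
import numpy as np

def region_is_moving(frames, region_mask, diff_threshold=15, moving_fraction=0.2):
    """Decide whether the object behind a potential fire is moving.

    frames: list of consecutive grayscale images from the first and/or second device.
    region_mask: boolean mask of the part flagged as a potential fire.
    """
    moving_votes = 0
    for prev, curr in zip(frames, frames[1:]):
        # absolute per-pixel difference, restricted to the flagged region
        diff = np.abs(curr.astype(int) - prev.astype(int))[region_mask]
        if (diff > diff_threshold).mean() > moving_fraction:
            moving_votes += 1
    # moving in a majority of the frame pairs -> likely a vehicle, not a fire
    return moving_votes > (len(frames) - 1) / 2
```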
The system may further comprise at least one alarm unit 106 operatively connected to the controller 103 and arranged to generate an alarm upon fire detection. The at least one alarm unit may comprise one or more of the following:
- a sound alarm installed at the monitored site and/or an operator's site
- a visual alarm installed at the monitored site and/or an operator's site
- a mobile unit such as a smartphone or the like presenting alarm notifications, wherein the alarm unit may be implemented in an app.
The controller 103 may be arranged to initiate automatic extinguishing of the fire upon fire detection. Thus, the controller 103 may be operatively connected, for example by wire or wirelessly, to a fire extinguishing system 107. The fire extinguishing system may form part of the fire detection system 100.
The fire extinguishing system may for example comprise water cannons. A problem which may arise is that water cannons or other fire extinguishing systems may be inappropriate to use when persons are within the area covered by the extinguishing system. Normally, automatic extinguishing is used only when the monitored area is unmanned, but there can of course be situations where people are still in the area, for example unauthorized persons who have intruded, an unforeseen schedule change, etc. Therefore, the controller may further be arranged to evaluate the first and/or second images to detect persons in the image. Other person detectors may be used instead of or in addition thereto. When one or more persons have been detected, the automatic extinguishing might be limited. For example, alarm and warning signals can be given so that people in the area can move away, or the extinguishing can be stopped.

In figures 2 and 3, different implementations of a system for fire detection, seen from above, are illustrated. The system for fire detection may have some of the features discussed in relation to figure 1.
First and second image capturing devices 101, 102 are arranged to cover an overlapping area 201. A region of interest 200 to be monitored is at least partly located in the overlapping area 201.
The region of interest 200 to be monitored is for example located outdoors. The region of interest to be monitored comprises for example a pile of waste or a pile of biofuels or a pile of other material(s). When storing for example waste and biofuels outdoors, fires in such piles are a recurring problem.
The relation between the first and second image capturing devices 101, 102 is known, whereby a spatial point in a first image captured by the first image capturing device is associated to a corresponding spatial point or line in a second image captured by the second image capturing device, and vice versa.

In the illustrated example of figure 2, the overlapping area 201 substantially covers the entire region of interest 200 to be monitored. In the illustrated example of figure 3, the overlapping area 201 covers only a part of the region of interest 200 to be monitored. Several first image capturing device/second image capturing device pairs may then be used to cover the entire region of interest 200.

In figure 4, a method 200 for fire detection is illustrated. The method for fire detection is performed by a controller comprising a processor, said controller being operatively connected to a first image capturing device 101 comprising a sensor arranged to detect infrared radiation and to a second image capturing device 102 comprising a sensor arranged to detect visible radiation. The first and second image capturing devices are arranged to cover an overlapping area. The relation between the first and second image capturing devices is known. Thereby, a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa. The method comprises continuously obtaining S1 first images captured by the sensor 101 of the first image capturing device.
The method further comprises analysing S2 each first image to identify a potential fire. The potential fire is identified by detecting occurrence of electromagnetic radiation exceeding a pre-set criteria in a part of said first image, said occurrence indicating potential fire in said part of the first image.
When a potential fire has been identified, the method comprises obtaining S3 a second image from the sensor of the second image capturing device.
In an example, the obtained second sensor image is filtered through a polarizing filter arranged in the beam path before the sensor.
In an example, the obtained second sensor image is filtered through an optical band pass filter, optionally arranged to transmit blue and/or ultraviolet wavelengths while blocking red and/or infrared wavelengths.
The method further comprises a step of analysing S4 the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.

In an implementation example, the determination of whether the potential fire is a fire may comprise determining whether the potential fire corresponds to a sun reflection based on the analysis of the second image. In detail, this may involve adjusting, either manually or automatically, the exposure time of the second imaging device.
The analysis may comprise forming an intensity value based on both the exposure time used at image capture and pixel value(s) of the second image, and determining whether the potential fire corresponds to a sun reflection based on the formed intensity value. As stated earlier, a strong light source will give high pixel values if it is strong relative to other parts of the same scene, even if it is weak by absolute standards; thus it is not possible to say that a light source is strong just because it has a high pixel value. Including the exposure time in the determination resolves this problem and therefore enables reliable identification of sun reflections.
The exposure time of the second sensor 102 may be manually or automatically adjustable so that light sources such as sun reflections result in saturated pixels of the second imaging device, or pixel values of the second imaging device within the dynamic range, while parts without light sources, such as fire, get pixel values close to zero.

In an option, wherein the second imaging device 102 is colour sensitive, the determination of whether the potential fire corresponds to a sun reflection is based on a relation between pixel values for the different wavelength bands.
The analysing S4 of the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire may comprise determining whether the potential fire corresponds to a vehicle, based on analysis of the identified first image and/or the second image to detect an object in the form of a vehicle. The detection of a vehicle may be made based on analysis of a plurality of subsequent images captured by the first and/or second imaging device, wherein a vehicle is detected when an object in the first and/or second images has been determined to be moving.
The method may further comprise a step of generating S5 an alarm upon fire detection. The alarm may be generated through audio or light. For example, the audio alarm may be generated by means of loudspeakers. The audio alarm may be generated at the region of interest to be monitored and/or at a control room facility. Instead or in addition thereto, the alarm may be generated on a display in a control room facility. Instead or in addition thereto, the alarm may be a message transmitted to a user electronic device, such as a smartphone.
The method may instead or alternatively comprise a step of transmitting S7 an initiation signal for initiation of automatic extinguishing of the fire upon fire detection.
The method may further comprise evaluating the identified first image and/or the second image to detect S6 persons in the identified first image and/or the second image. When no person has been detected, an initiation signal for initiation of automatic extinguishing of the fire upon fire detection may then be transmitted S7. When a person has been detected, an initiation signal for initiation of an alternative action upon fire detection may be transmitted. The alternative action may be generating an alarm S5.
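The resulting decision logic for steps S5-S7 could, purely as an illustrative sketch, look as follows, where `generate_alarm` and `start_extinguishing` are hypothetical stand-ins for the alarm unit 106 and the fire extinguishing system 107:

```python
def on_fire_detected(person_detected, generate_alarm, start_extinguishing):
    """Person-gated response upon fire detection (illustrative only).

    person_detected: result of evaluating the first and/or second images (S6).
    """
    if person_detected:
        generate_alarm()          # alternative action when people are in the area (S5)
    else:
        start_extinguishing()     # transmit initiation signal for automatic extinguishing (S7)
```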
Claims (14)
1. A system (100) for fire detection, comprising: a first image capturing device (101) comprising a sensor arranged to detect infrared radiation, a second image capturing device (102) comprising a sensor arranged to detect visible radiation, wherein the first and second image capturing devices are arranged to cover an overlapping area and the relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa, the system further comprising a controller (103) comprising a processor (104), said controller (103) being operatively connected to said first and second sensors and being arranged to continuously obtain first images captured by the sensor (101) of the first image capturing device and analyse each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criteria in a part of said first image, said occurrence indicating potential fire in said part of the first image, and, when a potential fire has been identified, obtain a second image from the sensor of the second image capturing device and analyse the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.
2. The system according to any of the preceding claims, wherein the controller (103) is arranged to determine whether the potential fire corresponds to a sun reflection based on the analysis of the second image.
3. The system according to claim 2, wherein the controller is arranged to form an intensity value based on the exposure time used at image capture and pixel value(s) of the second image and to determine whether the potential fire corresponds to a sun reflection based on the formed intensity value.
4. The system according to claim 3, wherein the second imaging device (102) is arranged to automatically adjust exposure time, or the exposure time is manually adjustable.
5. The system according to claim 4, wherein the exposure time of the second sensor (102) is manually or automatically adjustable such that light sources such as sun reflections result in saturated pixels of the second imaging device or pixel values of the second imaging device within the dynamic range, while parts without light sources such as fire get pixel values close to zero.
6. The system according to any of the preceding claims, wherein the second image capturing device (102) comprises a polarizing filter in the beam path before the sensor.
7. The system according to any of the preceding claims, wherein the second imaging device (102) comprises an optical band pass filter, wherein the band pass filter optionally is arranged to transmit blue and/or ultraviolet wavelengths while blocking red and/or infrared wavelengths.
8. The system according to any of the preceding claims, wherein the second imaging device (102) is colour sensitive and wherein the controller is arranged to determine whether the potential fire corresponds to a sun reflection based on a relation between pixel values for the different wavelength bands.
9. The system according to any of the preceding claims, wherein the controller (103) is arranged to determine whether the potential fire corresponds to a vehicle based on analysis of the identified first image and/or the second image to detect an object in the form of a vehicle.
10. The system according to claim 9, wherein the detection of a vehicle is made based on analysis of a plurality of subsequent images captured by the first and/or second imaging device, wherein a vehicle is detected when an object in the first and/or second images has been determined to be moving.
11. The system according to any of the preceding claims, further comprising an alarm unit (106) arranged to generate an alarm upon fire detection.
12. The system according to any of the preceding claims, wherein the controller (103) is arranged to transmit an initiation signal for initiation of automatic extinguishing of the fire upon fire detection.
13. The system according to any of the claims 1-11, wherein the controller (103) is arranged to evaluate the identified first image and/or the second image to detect persons in the identified first image and/or the second image and, when no person has been detected, to transmit an initiation signal for initiation of automatic extinguishing of the fire upon fire detection, and, when a person has been detected, to transmit an initiation signal for initiation of an alternative action upon fire detection.
14. A method (200) for fire detection performed by a controller comprising a processor, said controller being operatively connected to a first image capturing device (101) comprising a sensor arranged to detect infrared radiation and to a second image capturing device (102) comprising a sensor arranged to detect visible radiation, wherein the first and second image capturing devices are arranged to cover an overlapping area and the relation between the first and second image capturing devices is known, whereby a digital point in a first image captured by the sensor of the first image capturing device is associated with a corresponding digital point or line in a second image captured by the sensor of the second image capturing device, and vice versa, said method comprising: continuously obtaining (S1) first images captured by the sensor (101) of the first image capturing device, analysing (S2) each first image to identify a potential fire, said identification comprising detecting occurrence of electromagnetic radiation exceeding a pre-set criteria in a part of said first image, said occurrence indicating potential fire in said part of the first image, and, when a potential fire has been identified, obtaining (S3) a second image from the sensor of the second image capturing device and analysing (S4) the part of the second image, corresponding to the potential fire in the first image, to determine whether the potential fire is a fire.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE2250815A SE2250815A1 (en) | 2022-06-30 | 2022-06-30 | A system and method for fire detection |
PCT/SE2023/050671 WO2024005701A1 (en) | 2022-06-30 | 2023-06-28 | A system and method for fire detection |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE2250815A SE2250815A1 (en) | 2022-06-30 | 2022-06-30 | A system and method for fire detection |
Publications (1)
Publication Number | Publication Date |
---|---|
SE2250815A1 true SE2250815A1 (en) | 2023-12-31 |
Family
ID=87137059
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE2250815A SE2250815A1 (en) | 2022-06-30 | 2022-06-30 | A system and method for fire detection |
Country Status (2)
Country | Link |
---|---|
SE (1) | SE2250815A1 (en) |
WO (1) | WO2024005701A1 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080065833A (en) * | 2007-01-10 | 2008-07-15 | 한국서부발전 주식회사 | Fire alarm method and system |
US20140028803A1 (en) * | 2012-07-26 | 2014-01-30 | Robert Bosch Gmbh | Fire monitoring system |
KR101462247B1 (en) * | 2014-03-10 | 2014-11-21 | 삼성영상보안주식회사 | Smart fire detection system based on infrared thermal-image and interface platform for auto-fire extinguishing apparatus |
KR20150012028A (en) * | 2013-07-24 | 2015-02-03 | 김일환 | Total alarm system |
KR20160061614A (en) * | 2014-11-24 | 2016-06-01 | 멀티펠스 주식회사 | Fire detection system |
KR20200139987A (en) * | 2019-06-05 | 2020-12-15 | 세방전지(주) | Apparatus and method for detecting invader and fire for energy storage system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2372317B (en) * | 2001-02-14 | 2003-04-16 | Infrared Integrated Syst Ltd | Improvements to fire detection sensors |
DE102017009680A1 (en) * | 2017-10-18 | 2019-04-18 | Dräger Safety AG & Co. KGaA | Method and detector system for detecting a flame event |
-
2022
- 2022-06-30 SE SE2250815A patent/SE2250815A1/en unknown
-
2023
- 2023-06-28 WO PCT/SE2023/050671 patent/WO2024005701A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
WO2024005701A1 (en) | 2024-01-04 |