EP1256105B1 - Smoke and flame detection - Google Patents
- Publication number
- EP1256105B1 (application EP01904091A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- flame
- smoke
- decision
- image
- pixels
- Prior art date
- Legal status
- Expired - Lifetime
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/12—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
- G08B17/125—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
Definitions
- the invention relates to the detection of smoke and flame using image processing technology.
- CCTV type cameras can be used to detect fires and generate fire alarm information.
- cameras can be integrated into an automatic fire detection system, which can operate entirely without human intervention thereby reducing the potential for missed alarms.
- JP10269471 discloses a flame detection system with at least one photographing means or a CCD camera; a process arranged to analyse successive frames captured by the camera; comparing individual pixels; analysing successive frames; assessing brightness to detect a bright field; and, after a bright field has been isolated, assessing brightness again in the boundary region of the bright field to detect smoke by determining whether the boundary likeness is below a pre-determined value.
- the objective problem solved by the present invention is how to accelerate the triggering of a "smoke present" and "flame present" signal whilst at least maintaining or improving the accuracy with which flames which generate smoke are detected.
- the invention provides a combined smoke and flame detection system comprising at least one video camera, a video frame comparator and a processor, wherein said processor is arranged to analyse successive frames captured by the or each said camera by comparing individual pixels thereof according to at least two pre-defined relationships so as to be capable of detecting smoke and flame and generating an output signal indicating the presence of smoke, flame or smoke and flame; characterised in that the processor includes means for filtering the digitised images produced by the system using a first pre-defined algorithm such that only changes in pixel characteristics occurring within a pre-determined frequency band are used to produce a "flame present" decision; and simultaneously using a second pre-defined algorithm such that pixel characteristics are assessed to produce a "smoke present" decision independently of said "flame present" decision, and means for analysing the decisions produced by the first and second pre-defined algorithms to assess whether a fire exists.
- the invention provides the method of detecting smoke and flame, the method comprising the steps of receiving digitised images of a region to be monitored; comparing pixels of one of said images with pixels of another said image according to pre-determined procedures to produce "flame present" and "smoke present" decisions; and providing a "fire detected" signal according to said "smoke present" and "flame present" decisions; characterised by the step of filtering the digitised images produced by the system using one pre-defined relationship such that only changes in pixel characteristics occurring within a pre-determined frequency band are used to produce a "flame present" decision; and simultaneously using a second pre-defined relationship such that pixel characteristics are assessed to produce a "smoke present" decision in addition to or instead of said "flame present" decision.
- the system and method achieve the following unique advantages:
- the smoke and flame detection system 10 comprises one or more video cameras 12 which are variously directed at one or more locations which are to be monitored.
- the cameras also serve a second function as a part of a security or other form of surveillance system, although, it will be understood that one or more cameras could be dedicated solely to fire detection. In the description which follows, purely for the sake of convenience, the system will be described as having one camera 12.
- the camera is directed at a region to be monitored or view area 14 and outputs a standard 625 line analogue video signal at 25Hz frame rate.
- a standard video camera from the Hitachi company has proved suitable.
- Images of the view area 14 captured by the camera are fed to a frame grabber card 16 at a minimum frame rate of 5Hz and preferably approximately 10Hz.
- the frame grabber card digitises the images to a resolution of 640 pixels per line with 480 lines and feeds the digitised images to a processor 18 at the frame rate.
- the frame grabber card is a standard piece of hardware and in practice, a National Instruments PCI 1411 device plugged into the PCI bus of a standard PC has proved suitable.
- the grabber card may utilise Scion Image software.
- the camera may be a digital camera, in which case the grabber card would not be required to digitise the image. In this case, the grabber card would merely be required to grab digitised images from the camera at the required rate and feed the images to the processor 18.
- the processor 18 may comprise a standard IBM TM PC using a 750 MHz Intel Pentium 3 TM processor with 128 Mb of RAM, although, it will readily be appreciated that this is just one example of many processors, which would be equally suitable.
- the processor 18 processes the digitised images received from the frame grabber card using separate algorithms 20, 22 for smoke and flame detection. Details of the algorithms are provided below.
- the processor uses a multi-threaded processing environment, such as Windows, to simultaneously run the two algorithms to detect smoke or flame areas within the digitised image.
- the processor analyses the data produced by the algorithms to assess whether a fire has been detected.
- the processor may use a vote based analysis to assess whether a fire has been detected. For example, the processor may produce a fire detected signal if there is a yes "flame present" decision and a yes "smoke present" decision. This would provide a high level fire present indication. Alternatively, if there is only a yes decision from one algorithm, the processor may produce a lower ranked fire present indication. Yet another alternative would be to produce a higher ranked fire present indication where both algorithms produce a yes decision; an intermediate ranked indication where one of the two, for example the flame detection algorithm, produces a yes decision while the other produces a no decision; and a lower ranked fire present indication where only the other algorithm produces a yes decision.
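- the vote based assessment described above can be sketched as follows. The rank labels and the choice of which algorithm is prioritised are illustrative assumptions, not values taken from the patent:

```python
def assess_fire(flame_present: bool, smoke_present: bool,
                priority_flame: bool = True) -> str:
    """Combine the two algorithm decisions into a ranked indication.

    Rank labels are illustrative; the patent describes only the ordering
    (both-yes > prioritised-yes > other-yes > none)."""
    if flame_present and smoke_present:
        return "fire: high"        # both algorithms agree
    if flame_present:
        # the prioritised algorithm (here, flame) alone gives the middle rank
        return "fire: medium" if priority_flame else "fire: low"
    if smoke_present:
        return "fire: low" if priority_flame else "fire: medium"
    return "no fire"
```
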
- the processor may take the data produced using the algorithms and carry out a statistical analysis to assess whether a fire is detected and that such an analysis may produce an unranked fire detected indication or a ranked fire present indication.
- a statistical analysis could be used, including analyses referring to earlier decisions in a predetermined period, and since these will be readily apparent to those skilled in the art, no detailed description of such analyses will be provided herein.
- a suitable signal is output by the processor using a standard serial RS232 link.
- the signal can be used to provide an on-screen warning for the operator on a conventional PC screen 28. The operator is then able to make a decision as to whether or not to trigger an alarm or investigate in more detail.
- the signal may be fed via known interface equipment, such as for example digital to analogue interfaces, to produce an alarm signal using any form of conventional alarm equipment 30.
- a digital alarm signal could additionally, or alternatively be directed to digital receiving means of the local fire service. It will be appreciated that the processor may select the destination of the output signal according to the rank (if any) assigned to the fire detected signal.
- the processor may cause the output signal to be directed to a display and/or low-level warning device to alert an operator to the possibility of a fire which the operator should then investigate manually.
- the processor may cause the output signal to be directed to a main alarm device if a high ranked fire detected signal is produced. Whilst it is preferred that at some level the processor would cause the output signal to be directed to an alarm device without operator intervention, it will be appreciated that the system could be configured to act simply as an operator warning system, if this is what the user requires.
- the system includes an operator interface, for example a keyboard and/or mouse 34, to permit an operator to interact with the processing algorithms to customise the system by adjusting parameters employed by the algorithms in order to improve detection performance and/or reduce the incidence of false alarms.
- the operator interface may also permit the operator to provide commands to the system, for example to terminate an alarm which is deemed a false alarm.
- the operator may for example adjust the system so that it ignores certain zones in a particular view area or assign differing detection parameters to various portions of the view area.
- Alternative forms of display and input device for the system would include a touch screen.
- the system may be provided without an operator input device where it is considered that operator access to the algorithms is unnecessary or undesirable.
- the system could be configured to receive data and commands from a suitable portable computing device to enable set up and diagnostic work to be carried out by authorised personnel.
- the system may include an event memory 40, which may be an integral part of the processor or a standalone memory. This event memory could be used to hold images showing the view area 14 at the time a fire detection signal is produced. Such images may be used to assist in fine tuning the system, where for example a significant number of false alarms or operator warnings have been produced, or to provide evidence as to the time, location and/or cause of a particular fire. It will be appreciated that the system may be configured such that an image is recorded in the event memory 40 in response to commands from the processor and/or instructions from the operator.
- multi-threaded processing allows one software program and one processor to simultaneously process the smoke and fire detection algorithms and also multiple video channels. It will be appreciated that more than one processor may be used to improve processing speed.
- simultaneous smoke and flame detection improves the ability of the system to provide adequate responses to a detected event whether the event is an instantaneous ignition type fire where there may be little or no smoke or a slow progressing fire such as the type that emits smoke. For example, if the system detects smoke without flame in a zone where there is the possibility of a steam leak triggering an initial smoke detection signal, an alarm event can be prevented and/or delayed pending detection of flame. Alternatively, in environments where a flame detection may be triggered without the presence of fire, for example, where conventional sodium lighting is present, an alarm signal may be delayed pending the detection of smoke. Thus the system provides greater flexibility and sensitivity when compared with systems capable of detecting smoke alone or flame alone. It will be appreciated that since the system can simultaneously monitor for the presence of smoke and flame, it can detect all types of fire, whether they be instantaneous ignition type fires or slow burning fires that emit smoke.
- the flame detection algorithm 20 used by the processor to detect the presence of flame will now be described with reference to Figure 2.
- the algorithm is coded in a mixture of LabView TM and Microsoft® Visual C++.
- the fire detection algorithm comprises a series of steps labelled S1 to S7.
- in step S1, the video image is entered into the algorithm in the form of a monochrome 640 x 480 image where each image pixel has an intensity value of 8 bits resolution.
- the algorithm processes each pixel individually, using linear mathematical operations.
- in step S2, the monochrome 640 x 480 8 bit image is used to generate two separate averaged 640 x 480 8 bit resolution images which filter out rapidly occurring events, one with the filter set at 1.25 Hz and the other with the filter set at 4.0 Hz.
- the absolute difference between pixel values of these two images is then taken to obtain a movement band 640 x 480 8 bit image, which displays entities that are moving in the image within the frequency band between 1.25 and 4.0Hz.
- This frequency band corresponds with the range of movement frequencies exhibited by petrol flames observed empirically.
- a dimensionless time constant k1 is used to generate a 640 x 480 8 bit image that filters out events that occur more rapidly than 4Hz.
- a dimensionless time constant k2 is used to generate a 640 x 480 resolution 8 bit image that filters out events that occur more rapidly than 1.25Hz.
- Each pixel in the 640 x 480 image has a corresponding value of pM2 which can be used to make up the averaged image.
- a so-called movement band 640 x 480 resolution image is generated by taking each of the pixels of these averaged images and calculating, for each pixel, the absolute difference between pM1 and pM2, i.e. the magnitude of the value obtained by subtracting pM1 from pM2.
- a 640 x 480 image is obtained which only displays events that occur in the frequency band between 1.25 Hz and 4.0 Hz.
- Each pixel of the movement band image has an 8 bit resolution.
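- the step S2 filtering can be sketched as a pair of first-order recursive averaging filters whose outputs are subtracted. The constants k1 and k2 below are illustrative stand-ins; the patent derives them from the 4.0 Hz and 1.25 Hz cut-offs and the frame rate:

```python
import numpy as np

def update_average(avg, frame, k):
    """First-order recursive low-pass filter. k is a dimensionless
    smoothing constant in (0, 1]; smaller k gives a lower cut-off."""
    return avg + k * (frame - avg)

def movement_band(frames, k1=0.8, k2=0.25):
    """Return the final movement-band image for a sequence of 8-bit frames.

    avg1 (faster filter) keeps events below ~4.0 Hz, avg2 (slower filter)
    keeps events below ~1.25 Hz; their absolute difference retains only
    entities moving between the two cut-offs."""
    frames = [f.astype(np.float32) for f in frames]
    avg1 = frames[0].copy()
    avg2 = frames[0].copy()
    for f in frames[1:]:
        avg1 = update_average(avg1, f, k1)
        avg2 = update_average(avg2, f, k2)
    return np.abs(avg1 - avg2).astype(np.uint8)
```

A static scene yields a zero movement-band image, while flicker in the pass band survives the subtraction.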
- in step S3, once an image has been filtered using the movement band, the filtered image has a threshold applied to create a map of significant movement in the characteristic frequency band defined by k1 and k2.
- the study of the temporal dynamics of these highlighted pixels is used to decide whether or not flames are present in the video image.
- the user of the system can set this value to an arbitrary value between 0 and 255 using the graphical user interface provided by LabView TM .
- the threshold map is a Boolean image of 640 x 480 pixels where non-thresholded pixels have a value of zero, and threshold pixels have a value of one.
- the 'awareness map' is a subset of the 'threshold map'.
- each pixel in the threshold map defined in step S3 has an awareness level, which is an indication of the likelihood of flame existing within that particular pixel. If the awareness level increases above a user-defined threshold defined as the integer t2 (nominal value of 40), then the threshold pixel is registered, with binary value 1, in the awareness map.
- the awareness map is a 640 x 480 Boolean image.
- An integer defined as the awareness level is generated for each of the pixels in the awareness map.
- the value of the awareness level is calculated by comparing successive frames of the threshold map for each of the pixels; initially the awareness level for each pixel is equal to zero.
- if a pixel in the threshold map changes from 1 to 0 or changes from 0 to 1 between successive video frames, then 2 is added to the value of the awareness level for that pixel. If a pixel in the threshold map does not change (i.e. stays at 0 or stays at 1) between successive frames, then 1 is subtracted from the awareness level. The minimum value of the awareness level is zero, i.e. if the awareness level becomes negative it is immediately set to zero.
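- the awareness-level update rule follows directly from the description above (+2 on a change, -1 otherwise, clamped at zero); the function names here are illustrative:

```python
def update_awareness(prev_bits, curr_bits, levels):
    """One update of the per-pixel awareness levels from two successive
    threshold-map frames (sequences of 0/1 values)."""
    out = []
    for prev, curr, level in zip(prev_bits, curr_bits, levels):
        # +2 when the thresholded pixel flickers, -1 when it is steady
        level = level + 2 if prev != curr else level - 1
        out.append(max(level, 0))  # clamp at zero
    return out

def awareness_map(levels, t2=40):
    """Awareness map: 1 wherever the awareness level exceeds t2."""
    return [1 if lv > t2 else 0 for lv in levels]
```
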
- in step S5, a number of parameters are calculated so that the algorithm can decide whether a flame is present in the video images that are being processed. These parameters may be plotted in a moving graph or used to determine a confidence of a flame detection event.
- the Plot0 parameter is a constant equal to an integer called the Alarm Level, user defined with a default (nominal) value of 20. A flame is registered in the system when the Plot2 parameter described below exceeds the Alarm Level.
- Low values of Alarm Level mean that the algorithm is fast to react to possible flames in the digitised image, but is susceptible to false detected decisions.
- High values of Alarm Level mean that the algorithm is insensitive to false flame detected decisions, but is slow to react to possible flames in the digitised image.
- ROIarea = (x2 - x1)(y2 - y1), where (x1, y1) and (x2, y2) are opposite corners of the region of interest.
- in step S6, prior to performing the final flame decision, the 'plot' parameters described above are smoothed using a user-defined dimensionless time constant k3 corresponding to a time constant of 8.0 seconds.
- k3 is applied between successive values of Plot1 and Plot2 obtained from successive video images using the same filtering techniques as used by k1 and k2 described above. This reduces the noise level in the plotted parameters and reduces the false alarm rate.
- the decision whether a flame is occurring within the video image has two operator selectable modes: normal mode and tree filter mode.
- Normal flame decision mode is employed when no treelike objects are in the picture.
- Plot1 is ignored.
- an alarm is triggered when the Plot2 parameter is greater than the user-defined Plot0 parameter.
- a positive value of Plot1 indicates a densely packed arrangement of flickering pixels (ie a flame) and a negative value of Plot1 indicates a sparsely packed arrangement of flickering pixels (ie leaves on a tree moving in the wind).
- the alarm for a flame with the tree filter enabled only occurs when Plot2 is greater than Plot0 and Plot1 is greater than zero.
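- the two decision modes reduce to a simple comparison, sketched here with illustrative parameter names:

```python
def flame_alarm(plot0, plot1, plot2, tree_filter=False):
    """Flame decision per the two modes described above.

    Normal mode ignores Plot1; with the tree filter, densely packed
    flicker (Plot1 > 0) is also required before an alarm is raised."""
    if tree_filter:
        return plot2 > plot0 and plot1 > 0
    return plot2 > plot0
```
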
- the inventors have found that inclusion of the tree filter increases the selectivity of the system, but also increases the amount of time required to reach a decision on whether a flame is present in the picture.
- the algorithm described above has been optimised by empirical methods and the constants determining the function of the algorithm may be chosen to achieve optimum results within the scene environment.
- the processor includes a comparator, which analyses the differences between different images and the pixels which make up the images. For this purpose, the comparator first compares the image with previous images and by subtraction obtains a signal representative of the difference between successive images.
- the system also includes an adjustable threshold control level for sensitivity setting and a means by which changes which are representative of signal noise can be eliminated.
- the output of the comparator is then subjected to the main processing of the signal in accordance with the smoke detection algorithm.
- the processor is looking to see whether there are changes in the individual pixels of a frame, and in the differences between adjacent pixels, which would have been caused by smoke particles.
- the processing involves a number of separate analyses, which involve mathematical analysis by appropriate computer software as part of the signal-processing equipment.
- the signal processing means has to include hardware and/or software to recognise the selected conditions of change so that the presence of a smoke condition can be identified.
- the analysis can be based on the following:
Notation and concepts
- the system has two images to work with, where an image is defined as an ordered set of pixel intensities.
- the system provides two images in order to evaluate the various changes. These images are R, the reference image, and C, the current image.
- the consistency of the changing area is evaluated over time in order to assess if that area is dynamic in terms of its overall appearance or static. Lighting changes alter the image but the overall appearance does not change.
- the correlation function is used to evaluate this similarity over time since it is invariant to both scale and gain changes. If an object obscures the background by moving into the area of interest then the appearance within the area of interest will change. If the correlation fluctuates sufficiently over time then the area is deemed to be dynamic. This measure of consistency is forwarded to the decision system.
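- a minimal sketch of the consistency measure, assuming the correlation function is the standard Pearson correlation (which is invariant to gain and offset changes, as the text requires); the fluctuation threshold is an illustrative assumption:

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length pixel lists.
    Invariant to scale (gain) and offset changes of either image."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if va == 0 or vb == 0:
        return 0.0  # a flat image carries no structure to correlate
    return cov / (va * vb)

def is_dynamic(correlations, fluctuation_threshold=0.2):
    """Deem an area dynamic if its correlation with the reference
    fluctuates over time by more than the threshold."""
    return max(correlations) - min(correlations) > fluctuation_threshold
```
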
- a change in edge information is defined as a change in the value of the edge measure.
- the edge measure is defined as the sum of the responses of a standard derivative filter kernel where changes have been detected by the previous stage.
- a standard filter which is employed is the Sobel edge filter. This measure of edge content is forwarded to the decision system.
- the smoke detection software is written in C++, compiled using the WATCOM C++ compiler.
- the features of the software described below are encapsulated in around 50 source code files and a further 50 header files, comprising an estimated 40,000 lines of code in all.
- the smoke detection algorithm examines, in general terms, the following features of a digitised video stream to determine whether smoke has been detected:
- Edge information: edge definition may increase or decrease as smoke emerges (depending on what it was like before); whether the image overall is static or dynamic; emerging new shapes in the image, i.e. comparison of characteristic shape with indicative smoke shapes.
- Zones are rectangular regions selected from the entire image by the user when the system is installed. These would typically be arranged to cover likely areas where smoke might be produced, and (more importantly) not cover problem areas of the scene. Each zone is processed entirely separately, and the outputs from each zone may be combined to generate alarms as required. Pixels in the zone may additionally be eliminated so that they are not included in the calculations - for example, the filament of a light bulb, or a shiny metal object that glints in the sunlight. Again, these are selected by the user when the system is commissioned. At any one time there are two primary sets of image data for the zone - the current image and the reference image. The pixels in these images are denoted by x and x r respectively, in the discussions below.
- n parameters are calculated. These parameters are formed into an n-dimensional "vector", defining a "feature" space.
- Images are acquired from the grabber card on a regular basis. After any adjustments to normalise the brightness and contrast, the system compares the most recently acquired image (current) with the reference image. If pixels differ by more than an adjustable threshold (camera noise may be taken into account too), then the pixel is deemed to have changed.
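- the change test can be sketched per pixel as follows; treating camera noise as an additive allowance on the adjustable threshold is an assumption, since the text says only that noise "may be taken into account":

```python
def changed_pixels(current, reference, threshold, noise=0):
    """Boolean change mask: a pixel is deemed changed when it differs
    from the reference by more than the adjustable threshold plus an
    optional camera-noise allowance (illustrative parameters)."""
    return [1 if abs(c - r) > threshold + noise else 0
            for c, r in zip(current, reference)]
```
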
- the reference image is acquired periodically, when the system has detected no changes in the scene, and when the system determines that the current scene is no longer similar enough to the reference image.
- This reference image is analysed to generate an "environment mask", using the EDGE algorithm below. This essentially indicates where there is edge detail in the zone.
- a pixel-by-pixel "filter” mask used in the calculations detailed below, is constructed by combining the changed pixels with the environment mask.
- the changed pixel mask is only copied to the final filter mask at points where the magnitude of the difference between the current and the reference pixel exceeds the edge detail pixel value. Pixels selected manually as being problematic are also eliminated from this mask at this stage.
- This parameter counts the number of unmasked pixels in the image that deviate from the mean with the opposite sign from the way they do in the reference image.
- the total number of unmasked pixels in the zone, i.e. excluding the masked pixels, is denoted N.
- the edge content algorithm looks at, for each unmasked pixel in the current image, the four adjacent pixels (up/down/left/right). It sums the magnitudes of the differences between the left and right pixels and between the up and down pixels, for pixels where this sum exceeds a threshold value set by the user.
- EDGE ⁇ [ ⁇ x up ⁇ x down
- the masked correlation calculates the same function as the correlation function above, considering only those pixels that are not masked.
- the pixel values might have a Gaussian distribution about the mean pixel value, or the distribution might be asymmetric or otherwise non-Gaussian.
- This function looks at the four nearest pixels to each unmasked pixel, and calculates the mean number of these that are unmasked. Opacity is calculated, for the unmasked pixels only, as (1/N) Σ [ (x − x_r) / x_r ].
- the unmasked pixels in the current and reference images are examined using the EDGE algorithm above. The routine then calculates the mean ratio of the pixels in the EDGE'd current image and those in the EDGE'd reference image, within the unmasked region, provided that the reference image contained a non-zero value.
- the filter masks are "eroded" before this calculation, using an algorithm that only allows TRUE pixels to remain if all of its original four neighbours were also TRUE. This is a form of filtering to reduce the noise.
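- the erosion step can be sketched as follows; clearing the border pixels is an implementation choice the text does not specify:

```python
def erode(mask):
    """Erode a Boolean mask: a pixel stays TRUE only if it and all four
    of its original neighbours were TRUE. Border pixels, which lack a
    full set of neighbours, become FALSE here (an assumption)."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (mask[y][x] and mask[y - 1][x] and mask[y + 1][x]
                    and mask[y][x - 1] and mask[y][x + 1]):
                out[y][x] = 1
    return out
```

Isolated TRUE pixels, the typical signature of noise, cannot survive this filter.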
- Rule-based analysis is used initially to determine whether a change in the image has occurred, and whether this change is significant. If it is, then further analysis is carried out to see if the change is considered to be associated with smoke, or whether it is associated with, say, a person walking across the scene.
- the rule-based analysis uses a scoring system, where points are allocated for each rule which is met. If the points total exceeds a (variable) criterion (typically 90% of the maximum score), the analysis moves to the next level.
- the analysis is carried out on a region, which is a subset of the area of the zone, defined by the edges of the unmasked pixels.
- the "edge-ness" of the region is the ratio of the EDGES to the COUNT of pixels in the image. This is calculated both for the current and the reference image. If the current image edge-ness is outside a preset band, three points are scored. An additional three points are scored if the edge-ness deviates from the reference edge-ness by more than a preset percentage - selectably either up or down.
- COMPACTNESS (defined above) must lie within a preset band. If it deviates outside of this, three points are scored.
- the EDGE_EVIDENCE is decreased by the presence of smoke. If it falls below a preset threshold, three points are scored.
- the user may determine, when setting up the system, a subset of the available tests to carry out.
- the maximum score will be less, and this is taken into account when determining whether the score has exceeded 90% of the maximum value. If it has, a Bayesian analysis is then carried out.
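- the scoring scheme above can be sketched as follows, with per-rule points and the 90% criterion as described; the representation of the user-selected subset of rules is an illustrative assumption:

```python
def rule_score(results, points, enabled, pass_fraction=0.9):
    """Rule-based scoring sketch.

    results[i]: whether rule i was met for this region;
    points[i]:  the points that rule is worth (e.g. 3 per rule above);
    enabled[i]: whether the user selected rule i at set-up.
    Escalates to the next analysis level when the total reaches the
    pass fraction (typically 90%) of the maximum achievable score
    for the enabled subset of rules."""
    total = sum(p for ok, p, en in zip(results, points, enabled) if en and ok)
    maximum = sum(p for p, en in zip(points, enabled) if en)
    return maximum > 0 and total >= pass_fraction * maximum
```

Disabling a rule lowers the maximum, so the 90% criterion adapts to the subset the user selected.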
- Bayesian analysis provides a well founded decision criterion which takes into account the co-variance of features and provides the ability to discriminate between different classes of event (nuisance and real alarms).
- An important fact to note when defining features for use with Bayesian analysis is that they should be invariant to external influences such as background and lighting. The algorithm can cope with some variation but in general the effects of external influences should be kept to a minimum.
- Bayesian statistics are a useful tool in making decisions with multivariate systems such as this.
- d is calculated against the two reference classes, nuisance and real, giving d_n and d_r. If d_r is greater than d_n, the Bayesian analysis signals an alarm condition.
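- a sketch of the two-class decision, assuming d is a Gaussian log-likelihood style discriminant with diagonal covariance; the patent specifies only that d_n and d_r are compared, so the model form and parameter names here are assumptions:

```python
import math

def discriminant(features, mean, var):
    """Score a feature vector against one class: log-likelihood under a
    naive (diagonal-covariance) Gaussian model of that class."""
    d = 0.0
    for f, m, v in zip(features, mean, var):
        d += -((f - m) ** 2) / (2 * v) - 0.5 * math.log(2 * math.pi * v)
    return d

def bayes_alarm(features, real_mean, real_var, nuisance_mean, nuisance_var):
    """Alarm when the features score higher against the 'real' class."""
    d_r = discriminant(features, real_mean, real_var)
    d_n = discriminant(features, nuisance_mean, nuisance_var)
    return d_r > d_n
```

The class means and variances would be estimated from recorded real and nuisance events, which is consistent with the system's accuracy building up with experience.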
- an important feature of the algorithm is to combine a rule-based analysis with a statistically based analysis, and particularly with one based on Bayesian analysis.
- the rule based analysis takes place first and if certain criteria are met then the Bayesian analysis is instigated.
- where the Bayesian analysis and the rule-based analysis disagree, the confidence in the Bayesian analysis is used to determine whether the alarm is real or a nuisance.
- the difference between real and nuisance is based on experience and the system builds up in accuracy over time.
- the reference image is updated. This effectively adjusts for changes in, for example, lighting level.
- whilst the Bayesian analysis is important in avoiding false alarms, it is envisaged that in certain circumstances the smoke detection algorithm may omit a second level of analysis and instead rely on the output of the flame detection algorithm as a means for reducing the incidence of false alarms.
- a fire detection algorithm 100 is shown in Figure 3.
- the algorithm is specified in general terms and some steps may utilise one or more steps of either or both of the flame detection algorithm and the smoke detection algorithm 20 described above.
- the individual steps which comprise the algorithm 100 are indicated in Figure 3.
- the algorithm analyses moving components in the images received by the processor by examining the difference between successive images, or between a current image and a reference image which depicts a no-fire condition. The total perimeter, area, position and density of the resulting patterns can then be combined with one another to generate quantitative estimates of certain fire attributes in order to produce a flame-detected decision. To first order, an estimate of the probability of flame occurring can be obtained by adding these estimates, or parameters, together for each difference frame.
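A minimal sketch of this differencing and attribute-estimation step; the change threshold and the crude 4-neighbour perimeter estimate are assumptions, since the patent does not specify either:

```python
import numpy as np

def moving_pattern_attributes(frame, reference, threshold=30):
    """Difference the current frame against a no-fire reference and
    derive simple attribute estimates from the changed-pixel pattern."""
    changed = np.abs(frame.astype(int) - reference.astype(int)) > threshold
    area = int(changed.sum())  # number of changed pixels
    # crude perimeter: changed pixels with at least one unchanged 4-neighbour
    padded = np.pad(changed, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = int((changed & ~interior).sum())
    density = area / changed.size
    return area, perimeter, density
```

Per-frame estimates of this kind are the parameters that, to first order, are added together to estimate the probability of flame.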
- a list of fire attributes which video images of fires possess, and which can be used to determine whether a fire is occurring within an image, comprises:
- the fire detection algorithm may use a colour camera, and the algorithm includes the step of determining whether the image to be processed is from a colour camera. If it is, a colour filter can take information from the red, green and blue channels to see if the image includes the spectral characteristics of a blackbody radiating between 2000 K and 3000 K. Since CCTV cameras are also sensitive to near IR (~900 nm), this information can also be gathered for comparison with a suitable filter.
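As an illustrative sketch only (the patent gives no thresholds or formula), a blackbody radiating at 2000 K to 3000 K appears red/orange, so a per-pixel colour test can require a red-dominant channel ordering; the numeric thresholds below are assumptions:

```python
def looks_like_flame_colour(r, g, b):
    """Crude per-pixel colour test for flame-like light: a 2000-3000 K
    blackbody is red-dominant, with green well above blue. All
    thresholds are illustrative assumptions, not from the patent."""
    return r > 200 and r >= g >= b and g > b + 20
```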
- the rule application function 50 applies a linear combination of statistical parameters. For a first-order determination, a sum of area, perimeter and number of moving particles is used.
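The first-order rule described above could be sketched as a weighted linear combination; unit weights reduce it to the plain sum of the three parameters:

```python
def rule_score(area, perimeter, n_moving, weights=(1.0, 1.0, 1.0)):
    """Linear combination of statistical parameters; with the default
    unit weights this is simply area + perimeter + moving-region count."""
    w_a, w_p, w_n = weights
    return w_a * area + w_p * perimeter + w_n * n_moving
```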
- Various aspects of the invention may include:
Claims (5)
- A combined smoke and flame detection system (10) comprising at least one video camera, a video frame comparator (14, 40) and a processor, the processor (18, 40) being arranged to analyse the successive frames captured by the or each camera by comparing individual pixels of the frames with one another according to at least two predefined relationships (20, 22), so as thereby to be capable of detecting smoke and flame and of generating an output signal indicating the presence of smoke, flame, or smoke and flame; characterised in that the processor includes means for filtering the digitised images produced by the system using a first predefined algorithm (20), such that only changes in pixel characteristics occurring within a predetermined frequency band are used to make a "flame present" decision; and in that a second predefined algorithm (22) is used concurrently, such that pixel characteristics are assessed to make a "smoke present" decision independently of the "flame present" decision; and means for analysing the decisions made by the first and second predefined algorithms in order to assess whether a fire is present.
- A method of detecting smoke and flame, the method comprising the steps of: receiving digitised images of a region to be monitored; comparing pixels of one of the images with pixels of another image according to two predetermined procedures to make "flame present" and "smoke present" decisions; and providing a "fire detected" signal in accordance with the "smoke present" and "flame present" decisions; characterised by the step of filtering the digitised images produced by the system using a predefined relationship (20), such that only changes in pixel characteristics occurring within a predetermined frequency band are used to make a "flame present" decision; and in that a second predefined relationship (22) is used concurrently, such that pixel characteristics are assessed to make a "smoke present" decision in addition to or instead of the "flame present" decision.
- A system according to Claim 1 or a method according to Claim 2, characterised in that the filtering step includes filtering out changes in pixel characteristics occurring in the frequency band 1.25 Hz to 4 Hz.
- The invention according to Claim 1, Claim 2 or Claim 3, characterised in that the one procedure includes the step of determining the density of the change in pixel properties, and determining the presence of flame when a density exceeding a predetermined density value is detected.
- The invention according to any preceding claim, characterised in that the system further comprises weighting means for generating the signal in accordance with a weighted analysis of the "flame present" decision and/or the "smoke present" decision.
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0002695A GB0002695D0 (en) | 2000-02-07 | 2000-02-07 | Video fire detection |
GB0002695 | 2000-02-07 | ||
GB0010857A GB0010857D0 (en) | 2000-05-05 | 2000-05-05 | Smoke & flame video detection system |
GB0010857 | 2000-05-05 | ||
WOPCT/GB00/03717 | 2000-09-27 | ||
PCT/GB2000/003717 WO2001024131A2 (en) | 1999-09-27 | 2000-09-27 | Fire detection algorithm |
PCT/GB2001/000482 WO2001057819A2 (en) | 2000-02-07 | 2001-02-07 | Smoke and flame detection |
Publications (2)
Publication Number | Publication Date |
---|---|
EP1256105A2 EP1256105A2 (de) | 2002-11-13 |
EP1256105B1 true EP1256105B1 (de) | 2006-09-20 |
Family
ID=27255514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP01904091A Expired - Lifetime EP1256105B1 (de) | 2000-02-07 | 2001-02-07 | Rauch- und flammendetektion |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1256105B1 (de) |
DE (1) | DE60123214T2 (de) |
WO (1) | WO2001057819A2 (de) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7245315B2 (en) | 2002-05-20 | 2007-07-17 | Simmonds Precision Products, Inc. | Distinguishing between fire and non-fire conditions using cameras |
US7256818B2 (en) | 2002-05-20 | 2007-08-14 | Simmonds Precision Products, Inc. | Detecting fire using cameras |
US7280696B2 (en) | 2002-05-20 | 2007-10-09 | Simmonds Precision Products, Inc. | Video detection/verification system |
ES2282550T3 (es) * | 2003-07-11 | 2007-10-16 | Siemens Schweiz Ag | Method and device for flame detection |
US8326037B1 (en) | 2005-11-23 | 2012-12-04 | Matrox Electronic Systems, Ltd. | Methods and apparatus for locating an object in an image |
DE102008039132A1 (de) | 2008-08-21 | 2010-02-25 | Billy Hou | Intelligent image-based smoke/flame sensor and detection system |
CN102609727B (zh) * | 2012-03-06 | 2014-02-26 | 中国人民解放军理工大学工程兵工程学院 | Fire flame detection method based on dimensionless feature extraction |
CN102819735B (zh) * | 2012-08-17 | 2015-07-15 | 深圳辉锐天眼科技有限公司 | Flame detection method based on video frame images |
DE102016207705A1 (de) * | 2016-05-04 | 2017-11-09 | Robert Bosch Gmbh | Smoke detection device, method for detecting smoke from a fire, and computer program |
CN110637330B (zh) * | 2017-03-20 | 2021-12-10 | 霍顿集团有限公司 | Fire safety device method and system |
CN110363725B (zh) * | 2019-07-24 | 2022-09-13 | 安徽工业大学 | Method for analysing the soot micro/nano structure of a coaxial hydrocarbon-fuel diffusion flame |
CN111368756A (zh) * | 2020-03-09 | 2020-07-03 | 上海金掌网络技术有限责任公司 | Visible-light-based rapid open-flame and smoke identification method and system |
CN111489342B (zh) * | 2020-04-09 | 2023-09-26 | 西安星舟天启智能装备有限责任公司 | Video-based flame detection method, system and readable storage medium |
CN115439997B (zh) * | 2022-11-07 | 2023-01-31 | 北京中海兴达建设有限公司 | Fire early-warning method, apparatus, device and readable storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4785292A (en) * | 1984-03-23 | 1988-11-15 | Santa Barbara Research Center | Dual spectrum frequency responding fire sensor |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9216811D0 (en) * | 1992-08-07 | 1992-09-23 | Graviner Ltd Kidde | Flame detection methods and apparatus |
JPH10269471A (ja) * | 1997-03-27 | 1998-10-09 | Nohmi Bosai Ltd | Fire detection device |
-
2001
- 2001-02-07 EP EP01904091A patent/EP1256105B1/de not_active Expired - Lifetime
- 2001-02-07 DE DE60123214T patent/DE60123214T2/de not_active Expired - Fee Related
- 2001-02-07 WO PCT/GB2001/000482 patent/WO2001057819A2/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
WO2001057819A2 (en) | 2001-08-09 |
DE60123214D1 (de) | 2006-11-02 |
DE60123214T2 (de) | 2007-09-20 |
WO2001057819A3 (en) | 2002-07-18 |
EP1256105A2 (de) | 2002-11-13 |
WO2001057819A8 (en) | 2002-10-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20020906 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
AX | Request for extension of the european patent |
Free format text: AL;LT;LV;MK;RO;SI |
|
17Q | First examination report despatched |
Effective date: 20040210 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT;WARNING: LAPSES OF ITALIAN PATENTS WITH EFFECTIVE DATE BEFORE 2007 MAY HAVE OCCURRED AT ANY TIME BEFORE 2007. THE CORRECT EFFECTIVE DATE MAY BE DIFFERENT FROM THE ONE RECORDED. Effective date: 20060920 Ref country code: LI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 Ref country code: BE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 Ref country code: CH Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REF | Corresponds to: |
Ref document number: 60123214 Country of ref document: DE Date of ref document: 20061102 Kind code of ref document: P |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20061220 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20061220 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20061231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20070228 |
|
NLV1 | Nl: lapsed or annulled due to failure to fulfill the requirements of art. 29p and 29m of the patents act | ||
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20070312 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
ET | Fr: translation filed | ||
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20070621 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20070207 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20061221 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20070207 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20090430 Year of fee payment: 9 Ref country code: IT Payment date: 20090211 Year of fee payment: 9 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20060920 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20090217 Year of fee payment: 9 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: ST Effective date: 20101029 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20100301 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20100901 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20100207 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20150224 Year of fee payment: 15 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20160207 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160207 |