GB2360355A - Image detection - Google Patents
Image detection
- Publication number
- GB2360355A
- Authority
- GB
- United Kingdom
- Prior art keywords
- radiation
- output
- image
- scene
- wavelength band
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000001514 detection method Methods 0.000 title claims description 17
- 230000005855 radiation Effects 0.000 claims abstract description 107
- 230000001419 dependent effect Effects 0.000 claims abstract description 6
- 238000000034 method Methods 0.000 claims description 27
- 238000002485 combustion reaction Methods 0.000 claims description 7
- 238000012544 monitoring process Methods 0.000 claims description 7
- 238000004458 analytical method Methods 0.000 claims description 5
- 230000000007 visual effect Effects 0.000 claims description 4
- 230000003190 augmentative effect Effects 0.000 claims description 3
- 230000015572 biosynthetic process Effects 0.000 claims description 3
- 238000003786 synthesis reaction Methods 0.000 claims description 3
- 230000002596 correlated effect Effects 0.000 abstract description 2
- 230000000875 corresponding effect Effects 0.000 description 6
- 239000004215 Carbon black (E152) Substances 0.000 description 4
- 229930195733 hydrocarbon Natural products 0.000 description 4
- 150000002430 hydrocarbons Chemical class 0.000 description 4
- XUIMIQQOPSSXEZ-UHFFFAOYSA-N Silicon Chemical compound [Si] XUIMIQQOPSSXEZ-UHFFFAOYSA-N 0.000 description 3
- 239000011159 matrix material Substances 0.000 description 3
- 229910052710 silicon Inorganic materials 0.000 description 3
- 239000010703 silicon Substances 0.000 description 3
- 238000001228 spectrum Methods 0.000 description 3
- 230000002452 interceptive effect Effects 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- 238000010586 diagram Methods 0.000 description 1
- 230000010339 dilation Effects 0.000 description 1
- 230000003628 erosive effect Effects 0.000 description 1
- 238000001914 filtration Methods 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
- 238000003466 welding Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B17/00—Fire alarms; Alarms responsive to explosion
- G08B17/12—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
- G08B17/125—Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Fire-Detection Mechanisms (AREA)
- Photometry And Measurement Of Optical Pulse Characteristics (AREA)
Abstract
Apparatus for detecting images, such as flames, comprises an image sensor (10), such as a video camera, and a radiation sensor (14) which views the same scene as the image sensor. The radiation sensor may be tuned to a specific IR wavelength, such as 4.4 micrometres, by the use of a filter (16), and produces a varying output dependent on the radiation received from the whole scene. Outputs from both sensors are fed to a central processing unit (18) where the video signal is image-processed to identify bright clusters. The varying size of a cluster is then compared or correlated with the varying signal from the radiation sensor to identify whether the cluster is one which represents a source of varying infra-red radiation or flame.
Description
IMAGE DETECTION
The invention relates to image detection arrangements and methods. Apparatus embodying the invention, and to be described in more detail below by way of example only, may be used for detecting particular types of image in a field of view. In a more specific example, such a particular type of image can be the image of a flame. In this way, therefore, the apparatus can be used to detect fires.
According to the invention, there is provided image detection apparatus, comprising image sensing means producing a first output corresponding to an image of a monitored scene and varying in dependence on variations in that scene, radiation sensing means producing at least one second output dependent on radiation received from the scene in at least one predetermined wavelength band and varying in dependence on the integrated value thereof, and comparison means responsive to the first and second outputs for detecting any part of the image of a region of the monitored scene from which radiation has been received by the radiation sensing means and which represents varying radiation in the or at least one said predetermined wavelength band.
According to the invention, there is also provided image detection apparatus for augmenting a video camera visual surveillance system, comprising radiation sensing means operative to sense radiation in at least one predetermined radiation wavelength band and to produce an output varying in dependence on the sensed radiation, and adapted to be mounted adjacent to the camera for viewing substantially the same scene as the camera, and processing means responsive to the output from the radiation sensing means and connectable to receive a video output from the camera in the form of signals representing the intensities of pixels in time-successive matrices of pixels forming respective image frames produced by the camera, the processing means including means for identifying any cluster in each frame of pixels whose intensities lie above a minimum level, and comparison means for comparing the variation in size of any such cluster with the output received from the radiation sensing means, whereby to identify any such cluster representing a source of the radiation in the or at least one said predetermined wavelength band.
According to the invention, there is further provided an image detection method, comprising the steps of producing a first output corresponding to an image of a monitored scene and which varies in dependence on variations in that scene, producing at least one second output dependent on radiation received from the scene in at least one predetermined wavelength band and which varies in dependence thereon and on the integrated value thereof, and comparing the first and second outputs to detect any part of the image of a region of the monitored scene from which radiation has been received by the radiation sensing means and which represents varying radiation in the or at least one said predetermined wavelength band.
Image detection apparatus embodying the invention, and image detection methods according to the invention, will now be described, by way of example only, with reference to the accompanying diagrammatic drawings in which:
Figure 1 is a block diagram of one form of the apparatus;
Figure 2 shows a field of view as perceived by an image sensor or video camera in the apparatus of Figure 1;
Figure 3 is a graph plotting the variation in size of an image perceived in the field of view of Figure 2 by the image sensor in the apparatus of Figure 1;
Figure 4 is a graph showing the variation in intensity of radiation received from the field of view of Figure 2 by a radiation sensor in the apparatus of Figure 1; and
Figure 5 corresponds to Figure 1 and shows a modified form of the apparatus.
As shown in Figure 1, the apparatus to be described is for monitoring a scene and for detecting the presence of particular types of image therein. The scene is viewed by two sensors. One such sensor is an image sensor which monitors and scans the scene and produces time-successive groups of signals, each group comprising signals representing the respective intensities of different parts of the scene. More specifically, the image sensor produces each group of signals as a "frame" made up of a matrix of respective intensity values for different parts or pixels of the scene. The image sensor, shown at 10 in Figure 1, may thus be in the form of a simple video camera or conventional silicon-based focal plane array type of sensor, possibly a CCD or CMOS sensor. The image sensor 10 may view the field of view through a lens 12.
The second sensor is shown at 14 and is a radiation sensor which gathers radiation in a particular wavelength band from the scene, the gathered radiation being integrated over the whole or substantially the whole scene. As shown in Figure 1, the sensor 14 views the scene through a suitable filter 16 for establishing the wavelength band of the radiation.
In a particular application of the apparatus and method being described, the purpose is to detect the presence of a flame in the scene being viewed - so that the apparatus acts as a fire detector. If the flame to be detected is a hydrocarbon flame, the filter 16 could be a filter having a passband centred at 4.4 micrometres to select the infra-red (IR) radiation from the CO2 emission lines. The sensor 14 could be a pyroelectric sensor or similar device. The image sensor 10 may operate in the visible part of the electromagnetic spectrum. Instead, though, it could view the scene through a suitable filter - provided, of course, that the filter does not block the primary radiation emitted by the flame.
In the following description, it will be assumed that the apparatus is to operate as a flame detector and that the passband of filter 16 is centred at 4.4 micrometres.
The output from the image sensor 10 is fed to a central processing unit 18 on a line 20 and thus comprises time-successive frames each made up of a group of pixel signals, each pixel signal representing the intensity of a particular part of the scene.
The output of the radiation sensor 14 is fed to the CPU 18 on a line 22 and comprises a signal representing the intensity of the scene in the chosen wavelength band, integrated over the whole or substantially the whole scene and fluctuating in time in accordance with fluctuations in the integrated radiation.
Figure 2 shows the scene as viewed by the image sensor 10. In this example, it is assumed that the scene includes a hydrocarbon flame shown diagrammatically at 30, another source of intense radiation such as produced by arc welding and shown at 32, and a moving radiation source 34 in the form of a moving light. The scene may also include other radiation sources, but it is assumed that their intensities are substantially less than the sources 30, 32 and 34.
The CPU 18 will therefore respond to the signals received from the image sensor 10 on the line 20 by generating time-successive frames each corresponding to the scene as viewed by the image sensor 10 and each comprising pixel signals from a matrix of pixels dependent on the scene. Using known image processing techniques, and in a manner to be described in more detail below, the CPU 18 identifies the position of each cluster of pixels - that is, each group of pixels whose intensity lies above a predetermined minimum threshold (and perhaps below a predetermined maximum threshold); of course, there may only be one or no such cluster. For each identified cluster, the CPU then measures the size of the cluster during each successive frame over a fixed period of time. The size of each cluster is measured advantageously by counting the number of pixels within it, for each frame. Figure 3 shows, purely by way of example, how the size of a particular cluster representing a flame may vary over this period, the number of pixels in the cluster being shown with reference to the vertical axis and time being plotted with respect to the horizontal axis.
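The patent leaves this cluster-finding step to "known image processing techniques". A minimal sketch of one such implementation is shown below, assuming NumPy and SciPy; the threshold values, the function name and the per-frame dictionary output are illustrative choices, and tracking a given cluster from frame to frame (needed to build its size-versus-time curve) is not shown.

```python
import numpy as np
from scipy import ndimage

def cluster_sizes(frame, low=200, high=None):
    """Label clusters of bright pixels in one video frame and count their pixels.

    frame : 2-D array of pixel intensities
    low   : minimum intensity for a pixel to belong to a cluster (assumed value)
    high  : optional maximum intensity (e.g. to reject saturated sources)
    Returns a dict {cluster_label: pixel_count}.
    """
    mask = frame >= low
    if high is not None:
        mask &= frame <= high
    labels, n = ndimage.label(mask)        # connected-component labelling
    counts = np.bincount(labels.ravel())   # counts[0] is the background
    return {lab: int(counts[lab]) for lab in range(1, n + 1)}

# Applying cluster_sizes() to each successive frame over a fixed period gives,
# for every cluster, the size-versus-time curve illustrated in Figure 3.
```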
In addition, the CPU 18 receives signals on the line 22 from the radiation sensor 14. The CPU 18 responds to these signals by monitoring their variation with time over the same time period as that for which the cluster-size variation is monitored. The radiation sensor 14 of course produces an output representing the radiation integrated over the whole or substantially the whole scene. Figure 4 shows, as an example, how the radiation monitored by the radiation sensor 14 and processed by the CPU varies with time, intensity being plotted with respect to the vertical axis and time with respect to the horizontal axis.
The CPU then carries out a comparison process, by comparing the variation of size of each cluster (Figure 3) with the variation in integrated radiation (Figure 4) to detect correlation between the two variations. In this way, any cluster (there may of course be more than one in the scene being viewed) representing varying radiation in the passband of the filter 16 (Figure 1) can be identified. In the present case, therefore, where the passband of the filter 16 corresponds to 4.4 micrometres, hydrocarbon flames can be identified, and an appropriate warning output produced on a line 24 (Figure 1).
The comparison of the cluster-size information with the integrated radiation information could be carried out by any suitable technique such as cross-correlation or possibly by Fourier analysis/waveform synthesis.
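The text names cross-correlation (or Fourier analysis/waveform synthesis) without giving details. The sketch below shows one possible way, assuming NumPy, to compute a normalised cross-correlation peak between a cluster-size series and the integrated-radiation series over a few frame lags; the lag range and any decision threshold applied to the result are assumptions, not values from the patent.

```python
import numpy as np

def correlation_score(cluster_sizes, radiation, max_lag=5):
    """Peak normalised cross-correlation between a cluster-size series and the
    integrated-radiation series, searched over a small range of frame lags."""
    a = np.asarray(cluster_sizes, dtype=float)
    b = np.asarray(radiation, dtype=float)
    a = (a - a.mean()) / (a.std() + 1e-12)   # zero-mean, unit-variance
    b = (b - b.mean()) / (b.std() + 1e-12)
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        x = a[lag:] if lag >= 0 else a[:lag]
        y = b[:len(b) - lag] if lag >= 0 else b[-lag:]
        n = min(len(x), len(y))
        if n > 1:
            best = max(best, float(np.mean(x[:n] * y[:n])))
    return best
```

A score close to 1.0 over the monitored period would suggest that the cluster flickers in step with the in-band radiation and can be flagged on line 24 as a flame; the cut-off used for that decision would have to be chosen empirically.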
The results of the comparison or correlation process can be used to produce a synthetic display, on any suitable device, of the cluster or clusters which are identified by the comparison process. In addition, this synthetic display can be stored as a synthetic image in memory in the CPU 18, where it could be subjected to further image analysis techniques to provide further analysis of the data.
Steady state clusters will be ignored in the comparison process - either by the use of a radiation sensor 14 which does not respond to steady state signals or by suitable filtering of the signals on line 22 to remove steady state signals. Any regularly chopped sources of infra-red radiation in the scene being viewed will also be rejected by the comparison process because they would not show the usual flicker frequency profile of a flame.
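Where the sensor or its signal chain does pass a steady component, the steady-state rejection mentioned above could, for example, be done digitally before the correlation step. This is a minimal sketch assuming NumPy; the moving-average window length is an arbitrary illustrative value.

```python
import numpy as np

def remove_steady_state(signal, window=25):
    """Suppress the steady (DC) part of the line-22 signal by subtracting a
    moving average, leaving only the flicker-frequency variation."""
    s = np.asarray(signal, dtype=float)
    baseline = np.convolve(s, np.ones(window) / window, mode="same")
    return s - baseline
```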
The arrangement described thus provides very good discrimination against potentially interfering sources of infra-red radiation and provides the ability to acquire video images of flames from the infra-red portion of the electromagnetic spectrum without the use of complex image processing techniques. Thus, the arrangement described is simpler and more effective than a system using an image sensor or camera on its own. In addition, though, the apparatus described provides much better discrimination against interfering sources of varying infra-red radiation than is possible with an infra-red detector used on its own for detecting flame flicker frequencies.
The apparatus described can easily be embodied by modifying a conventional video camera used in a CCTV surveillance system of the type producing a visual display. The camera can be augmented by an add-on unit containing a radiation sensor (such as an infra-red radiation sensor described above) in combination with a suitable filter for producing an integrated output of the infra-red radiation received from the scene being viewed by the camera. The add-on unit would also include simple processing electronics and the resultant signals (corresponding to the signals on line 22 in Figure 1) could then be transmitted back to a central monitoring station where the normal video signal is received. There, using processing of the type described above and carried out by the CPU 18 (Figure 1), the signals from the IR sensor would be compared (correlated) with the video signals to detect flames in the monitored scene, so that the visual surveillance system would also act as a flame or fire detector. The infra-red modulation signals produced by the radiation sensor would be at very low frequency and could be included in the video signal (e.g. in that part of the signal reserved for Teletext information), thus avoiding the need for additional wiring.
In a modification shown in Figure 5, the integrating radiation sensor 14 of Figure 1 is replaced by a low resolution infra-red radiation image sensor 14A - that is, by a low-resolution IR sensor comprising a relatively small number of sensing elements viewing respective parts of the scene and for producing frame outputs made up of a matrix of respective intensity values for different parts or pixels of the scene. The sensor 14A views the scene through a lens 14B. In other respects, the apparatus of Figure 5 is similar to the apparatus of Figure 1 and parts in Figure 5 corresponding to Figure 1 are similarly referenced. In Figure 5, therefore, the integrating radiation sensor 14 is replaced by an image sensor of the same general type as the image sensor 10 except that it is operative in the IR part of the spectrum and has a much lower resolution than the image sensor 10 which, as for the apparatus described with reference to Figure 1, may be in the form of a conventional silicon-based sensor. The signals produced by the low resolution IR sensor 14A are processed by the CPU 18 by monitoring each of its pixel signals separately and determining how the intensity of radiation in that pixel varies in successive frames. Effectively, therefore, each sensitive element of the image sensor 14A is treated as a separate radiation sensor integrating the IR radiation received from a particular part of the scene. The CPU 18 compares or correlates each time-varying output from the image sensor 14A with the signals produced by the CPU from the high resolution image sensor 10 and corresponding to the time-varying size of each detected cluster in the scene. In this way, again, therefore, flames can be detected. The apparatus of Figure 5 effectively converts the low resolution IR image sensor 14A into a high resolution image sensor, by correlating its output with the output of the high resolution silicon image sensor 10. This is very advantageous because high resolution IR image sensors as standalone devices are very expensive. Because the time-varying signals from the image sensor 14A represent the integrated IR radiation from parts, only, of the scene, the sensitivity of the apparatus shown in Figure 5 is greater than the sensitivity of the apparatus of Figure 1, where the radiation sensor 14 integrates the IR radiation over the whole of the scene. It is necessary, of course, that the size of the part of the scene monitored by each sensitive element of the image sensor 14A is large enough to integrate the radiation from a single flame.
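The per-element correlation described for Figure 5 might look like the sketch below, which reuses the correlation_score function from the earlier example; the array shapes, the threshold and the exhaustive element-by-element search are illustrative assumptions rather than the patent's prescribed method.

```python
import numpy as np

def match_clusters_to_ir_elements(ir_frames, cluster_series, threshold=0.6):
    """Correlate each low-resolution IR element (sensor 14A) with the size
    series of each candidate cluster from the high-resolution sensor 10.

    ir_frames      : array of shape (T, H, W), one low-res IR frame per instant
    cluster_series : dict {cluster_id: length-T sequence of pixel counts}
    Returns {cluster_id: (row, col)} for clusters whose size variation
    correlates with some IR element, i.e. clusters identified as flames.
    """
    ir = np.asarray(ir_frames, dtype=float)
    matches = {}
    for cid, sizes in cluster_series.items():
        best, best_rc = 0.0, None
        for r in range(ir.shape[1]):
            for c in range(ir.shape[2]):
                score = correlation_score(sizes, ir[:, r, c])  # defined earlier
                if score > best:
                    best, best_rc = score, (r, c)
        if best >= threshold:   # threshold is an assumed, illustrative value
            matches[cid] = best_rc
    return matches
```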
However, although the apparatus (of both Figure 1 and Figure 5) has been described for the detection of hydrocarbon flames emitting radiation in the 4.4 micrometre band, it may of course be modified to synthesise images at other wavelengths. For example, it could be used for gas detection or boiler flame analysis. More specific examples are:
(a) a radiation sensor 14 and a filter 16 arranged to monitor radiation in a narrow band centred at 310 nm could be used for monitoring the transient OH species;
(b) if the radiation sensor is arranged to be sensitive to radiation in the 927 nm, 1.45 µm or 2.89 µm radiation bands, the H2O product of combustion can be monitored;
(c) if the radiation sensor 14 is arranged to be responsive to radiation in the 3.339 µm radiation band, emissions of radiation due to hot CH4 can be detected.
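For reference, the bands quoted in this paragraph and earlier in the description can be collected into a simple configuration table; only the species/band pairings come from the text, the data structure itself is an illustrative choice.

```python
# Band centres quoted in the description (values in metres); the mapping of a
# species to a filter choice follows the text, the table itself does not.
FILTER_BANDS = {
    "CO2 (hydrocarbon flame)":  4.4e-6,                     # 4.4 micrometres
    "OH (transient species)":   310e-9,                     # 310 nm
    "H2O (combustion product)": (927e-9, 1.45e-6, 2.89e-6),
    "hot CH4":                  3.339e-6,
}
```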
It is also possible for the radiation sensor to be replaced by radiation sensors responsive to radiation at two (or more) predetermined wavelengths (or in two or more predetermined wavelength bands). In this way, the system can synthesize images at two or more wavelengths or in two or more wavelength bands. For example, different parts of a flame may emit radiation at respectively different wavelengths and in such a case two radiation sensors may be used, respectively responsive to radiation at these different wavelengths.
As indicated above, the CPU 18 (in both Figures 1 and 5) uses known image processing techniques to identify the position and centre of each cluster of pixels, for example erosion and dilation and other known image processing techniques.
Such image processing techniques are described in more detail, for example, in European Published Specification No. 0583131 but, instead, other known image processing techniques can be used.
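Purely as an illustration of the erosion and dilation named above (not the specific processing of EP 0583131), a thresholded mask could be cleaned before cluster labelling as follows, assuming SciPy.

```python
from scipy import ndimage

def clean_mask(mask, iterations=1):
    """Morphological opening (erosion then dilation) to remove isolated bright
    pixels from the thresholded mask before cluster labelling."""
    eroded = ndimage.binary_erosion(mask, iterations=iterations)
    return ndimage.binary_dilation(eroded, iterations=iterations)
```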
Claims (34)
1. Image detection apparatus, comprising image sensing means producing a first output corresponding to an image of a monitored scene and varying in dependence on variations in that scene, radiation sensing means producing at least one second output dependent on radiation received from the scene in at least one predetermined wavelength band and varying in dependence on the integrated value thereof, and comparison means responsive to the first and second outputs for detecting any part of the image of a region of the monitored scene from which radiation has been received by the radiation sensing means and which represents varying radiation in the or at least one said predetermined wavelength band.
2. Apparatus according to claim 1, in which the comparison means is operative to detect any said part of the image for which the variation in size correlates with the variation in the integrated value of the radiation received therefrom.
3. Apparatus according to claim 1 or 2, in which the image sensing means produces the first output in the form of signals respectively representing the intensities of pixels in time-successive matrices of pixels corresponding to respective image frames, means for identifying one or more clusters in each frame of pixels whose intensities lie above a minimum level, and means for monitoring the variation in size of the or each cluster, and in which the comparison means comprises means for correlating the variation in size of the or each said cluster with the or each second output.
4. Apparatus according to claim 1 or 2, in which the image sensing means comprises a plurality of sensing elements each responsive to radiation received from a respective part of the monitored scene whereby the elements together produce the first output.
5. Apparatus according to any preceding claim, in which the radiation sensing means comprises a radiation sensor receiving radiation from substantially the whole of the monitored scene.
6. Apparatus according to claim 4, in which the radiation sensing means comprises a plurality of sensing elements each responsive to radiation received from a respective said region of the monitored scene whereby each such element produces a respective signal forming part of the or at least one said second output.
7. Apparatus according to claim 6, in which the plurality of sensing elements comprised in the image sensing means is substantially greater in number than the plurality of sensing elements comprised in the radiation sensing means.
8. Apparatus according to any preceding claim, in which the or each predetermined wavelength band corresponds to a product of combustion whereby the detected part of the image includes the image of a source of the combustion.
9. Apparatus according to claim 8, in which the or at least one said wavelength band comprises infra-red radiation.
10. Apparatus according to claim 9, in which the or at least one said wavelength band is a band centred at 4.4 micrometres.
11. Apparatus according to any preceding claim, including means responsive to the output of the comparison means for producing signals corresponding to the or each detected part of the monitored scene.
12. Apparatus according to any preceding claim, in which the comparison means comprises cross-correlation means.
13. Apparatus according to any one of claims 1 to 11, in which the comparison means uses Fourier analysis.
14. Apparatus according to any one of claims 1 to 11, in which the comparison means uses waveform synthesis.
15. Image detection apparatus for augmenting a video camera visual surveillance system, comprising radiation sensing means operative to sense radiation in at least one predetermined radiation wavelength band and to produce an output varying in dependence on the sensed radiation, and adapted to be mounted adjacent to the camera for viewing substantially the same scene as the camera, and processing means responsive to the output from the radiation sensing means and connectable to receive a video output from the camera in the form of signals representing the intensities of pixels in time-successive matrices of pixels forming respective image frames produced by the camera, the processing means including means for identifying any cluster in each frame of pixels whose intensities lie above a minimum level, and comparison means for comparing the variation in size of any such cluster with the output received from the radiation sensing means, whereby to identify any such cluster representing a source of the radiation in the or at least one said predetermined wavelength band.
16. Apparatus according to claim 15, in which the output from the radiation sensing means is received by the processing means in combination with and as part of the video output and carried on the same cable.
17. Apparatus according to claim 15 or 16, in which at least one said predetermined wavelength band comprises the wavelength of a product of combustion, whereby the comparison means produces a warning output indicating a source of combustion in the field of view.
18. An image detection method, comprising the steps of producing a first output corresponding to an image of a monitored scene and which varies in dependence on variations in that scene, producing at least one second output dependent on radiation received from the scene in at least one predetermined wavelength band and which varies in dependence thereon and on the integrated value thereof, and comparing the first and second outputs to detect any part of the image of a region of the monitored scene from which radiation has been received by the radiation sensing means and which represents varying radiation in the or at least one said predetermined wavelength band.
19. A method according to claim 18, in which the comparison step is operative to detect any said part of the image for which the variation in size correlates with the variation in the integrated value of the radiation received therefrom.
20. A method according to claim 18 or 19, in which the step of producing the first output produces it in the form of signals respectively representing the intensities of pixels in time-successive matrices of pixels corresponding to respective image frames, and including the steps of identifying one or more clusters in each frame of pixels whose intensities lie above a minimum level, and monitoring the variation in size of the or each cluster, and in which the comparison step comprises the step of correlating the variation in size of the or each said cluster with the second output.
21. A method according to claim 18 or 19, in which the step of producing the first output comprises the steps of separately responding to radiation received from respective parts of the monitored scene whereby together to produce the first output.
22. A method according to any one of claims 18 to 21, in which the step of producing the or each second output comprises the step of receiving radiation from substantially the whole of the monitored scene.
23. A method according to claim 21, in which the step of producing the or at least one said second output comprises the steps of separately responding to radiation received from respective said regions of the monitored scene whereby to produce respective signals forming part of the or at least the said one second output.
24. A method according to any one of claims 18 to 23, in which the or at least one said predetermined wavelength band corresponds to a product of combustion whereby the detected part of the image includes the image of a source of the combustion.
25. A method according to claim 24, in which the or the said one wavelength band comprises infra-red radiation.
26. A method according to claim 25, in which the or the said one wavelength band is a band centred at 4.4 micrometres.
27. A method according to any one of claims 18 to 26, including the step of responding to the comparison step by producing signals corresponding to the or each detected part of the monitored scene.
28. A method according to any one of claims 18 to 27, in which the comparison step comprises a cross-correlation step.
29. A method according to any one of claims 18 to 27, in which the comparison step uses Fourier analysis.
30. A method according to any one of claims 18 to 27, in which the comparison step uses waveform synthesis.
31. Image detection apparatus, substantially as described with reference to Figures 1 to 4 of the accompanying drawings.
32. Image detection apparatus, substantially as described with reference to Figures 2 to 5 of the accompanying drawings.
33. An image detection method, substantially as described with reference to Figures 1 to 4 of the accompanying drawings.
34. An image detection method, substantially as described with reference to Figures 2 to 5 of the accompanying drawings.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0006257A GB2360355B (en) | 2000-03-15 | 2000-03-15 | Image detection |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB0006257A GB2360355B (en) | 2000-03-15 | 2000-03-15 | Image detection |
Publications (3)
Publication Number | Publication Date |
---|---|
GB0006257D0 GB0006257D0 (en) | 2000-05-03 |
GB2360355A true GB2360355A (en) | 2001-09-19 |
GB2360355B GB2360355B (en) | 2004-09-22 |
Family
ID=9887684
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB0006257A Expired - Fee Related GB2360355B (en) | 2000-03-15 | 2000-03-15 | Image detection |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2360355B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2372317A (en) * | 2001-02-14 | 2002-08-21 | Infrared Integrated Syst Ltd | Infrared flame detection sensor |
EP1329860A2 (en) | 2002-01-11 | 2003-07-23 | Hochiki Corporation | Flame detection device |
GB2390675A (en) * | 2002-07-10 | 2004-01-14 | Univ Greenwich | Flame characteristic monitor using digitising image camera |
GB2580644A (en) * | 2019-01-18 | 2020-07-29 | Ffe Ltd | Flame detector |
WO2021011300A1 (en) * | 2019-07-18 | 2021-01-21 | Carrier Corporation | Flame detection device and method |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110874907A (en) * | 2018-09-03 | 2020-03-10 | 中国石油化工股份有限公司 | Flame identification method based on spectrum camera |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5153722A (en) * | 1991-01-14 | 1992-10-06 | Donmar Ltd. | Fire detection system |
US5510772A (en) * | 1992-08-07 | 1996-04-23 | Kidde-Graviner Limited | Flame detection method and apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2340222B (en) * | 1998-07-14 | 2000-07-26 | Infrared Integrated Syst Ltd | Multi-array sensor and method of identifying events using same |
JP3827128B2 (en) * | 1998-08-31 | 2006-09-27 | ホーチキ株式会社 | Fire detection equipment |
DE10011411C2 (en) * | 2000-03-09 | 2003-08-14 | Bosch Gmbh Robert | Imaging fire detector |
-
2000
- 2000-03-15 GB GB0006257A patent/GB2360355B/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5153722A (en) * | 1991-01-14 | 1992-10-06 | Donmar Ltd. | Fire detection system |
US5510772A (en) * | 1992-08-07 | 1996-04-23 | Kidde-Graviner Limited | Flame detection method and apparatus |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2372317A (en) * | 2001-02-14 | 2002-08-21 | Infrared Integrated Syst Ltd | Infrared flame detection sensor |
GB2372317B (en) * | 2001-02-14 | 2003-04-16 | Infrared Integrated Syst Ltd | Improvements to fire detection sensors |
US6818893B2 (en) | 2001-02-14 | 2004-11-16 | Infarred Integrated Systems Limited | Fire detection sensors |
EP1329860A2 (en) | 2002-01-11 | 2003-07-23 | Hochiki Corporation | Flame detection device |
EP1329860A3 (en) * | 2002-01-11 | 2003-09-03 | Hochiki Corporation | Flame detection device |
US6806471B2 (en) | 2002-01-11 | 2004-10-19 | Hochiki Corporation | Flame detection device |
AU2002325590B2 (en) * | 2002-01-11 | 2008-01-03 | Hochiki Corporation | Flame Detection Device |
CN100387949C (en) * | 2002-01-11 | 2008-05-14 | 报知机株式会社 | Flame detector |
GB2390675A (en) * | 2002-07-10 | 2004-01-14 | Univ Greenwich | Flame characteristic monitor using digitising image camera |
GB2580644A (en) * | 2019-01-18 | 2020-07-29 | Ffe Ltd | Flame detector |
WO2021011300A1 (en) * | 2019-07-18 | 2021-01-21 | Carrier Corporation | Flame detection device and method |
US11651670B2 (en) | 2019-07-18 | 2023-05-16 | Carrier Corporation | Flame detection device and method |
Also Published As
Publication number | Publication date |
---|---|
GB0006257D0 (en) | 2000-05-03 |
GB2360355B (en) | 2004-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6184792B1 (en) | Early fire detection method and apparatus | |
CN102016944B (en) | Detection device and method for detecting fires and/or signs of fire | |
TWI280519B (en) | Flame detection device | |
CN104036611A (en) | Fire detecting alarm method and detecting alarm apparatus implementing same | |
US7680297B2 (en) | Fire detection method and apparatus | |
CN109155097A (en) | Accelerate to issue the Fike detector of potential fire alarm report based on it with the photodiode for sense ambient light | |
CN102708647A (en) | Image and multi-band infrared-ultraviolet compound fire disaster detection system and method | |
EP1233386B1 (en) | Improvements to fire detection sensors | |
JP5876347B2 (en) | Hydrogen flame visualization apparatus and method | |
CN113160513A (en) | Flame detection device based on multisensor | |
JPH08305980A (en) | Device and method for flame detection | |
US7154400B2 (en) | Fire detection method | |
JP2017207883A (en) | Monitoring system, color camera device and optical component | |
AU2004202851B2 (en) | Method and Device for Detecting Flames | |
CN206628054U (en) | Image acquisition device and fire monitoring system based on microlens array | |
GB2360355A (en) | Image detection | |
JP6837244B2 (en) | Hydrogen flame monitoring device and hydrogen handling facility | |
CN101764999A (en) | Sub-camera video capture device | |
US20140184793A1 (en) | Multispectral flame detector | |
EP1143393B1 (en) | Detection of thermally induced turbulence in fluids | |
US5838242A (en) | Fire detection system using modulation ratiometrics | |
JPH11160158A (en) | Fire monitoring device | |
KR102164802B1 (en) | The surveillance camera with passive detectors and active detectors | |
JPH09293185A (en) | Object detection device/method and object monitoring system | |
JP4690823B2 (en) | Fire detection equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
COOA | Change in applicant's name or ownership of the application | ||
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20110315 |