CA2249216A1 - A fire imaging system and method - Google Patents

A fire imaging system and method

Info

Publication number
CA2249216A1
CA2249216A1
Authority
CA
Canada
Prior art keywords
fire
image
location
scene
temperature sensitive
Prior art date
Legal status
Abandoned
Application number
CA002249216A
Other languages
French (fr)
Inventor
Uzi Zurgil
Current Assignee
MALAT DIVISION ISRAEL AIRCRAFT INDUSTRIES Ltd
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from IL11752196A external-priority patent/IL117521A0/en
Application filed by Individual filed Critical Individual
Publication of CA2249216A1 publication Critical patent/CA2249216A1/en
Abandoned legal-status Critical Current

Links

Landscapes

  • Closed-Circuit Television Systems (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention combines real-time infrared (IR) images (36) of a fire (11) taken at the time of the fire, with digital images of the area where the fire rages, taken at some previous time. The two types of images are registered and the IR images are superimposed upon the digital images, thereby providing the fire-fighting forces with information regarding the types of objects (e.g. houses (20), trees (22), roads (24), etc.) which lie in the path of the fire. A system is provided which includes a viewing unit and a ground station (44). The viewing unit is mounted on an airborne vehicle (42), an unmanned airborne vehicle (40), or a fire watch tower, and includes a temperature sensitive sensor, such as an IR-sensor, for measuring the hot areas of the scene.

Description

A FIRE IMAGING SYSTEM AND METHOD
FIELD OF THE INVENTION
The present invention relates to ground fire mapping generally, and in particular to ground fire mapping utilizing infra-red and color images of the fire combined with prior digital images and to the use of unmanned airborne vehicles (UAVs) or airborne platforms for viewing the fire, as well as to fire behavior assessment systems.
BACKGROUND OF THE INVENTION
When viewing a raging fire, much of the scene is difficult to see, due to the billowing of smoke in the area at and around the fire. It is known to view a fire with an infra-red (IR) detector which, since it detects IR radiation and since little IR radiation is absorbed by smoke or air particles, can "see through" smoke to view the hot fire areas. Hence, the IR image enables the observer to distinguish between objects of different temperatures (that is, objects having different IR radiation intensities).
In the past, IR images of the fire itself have been superimposed upon maps, digital or otherwise, of the area of the fire. The map thus provides information to the fire fighting forces about the area in which the fire rages. One such system is described in US Patent 5,160,842 to Johnson, which discusses an infrared fire-perimeter mapping system.
Other systems enlist remote sensing systems, such as those on satellites, in the fight for timely information about fires. A number of such systems are discussed in the book Remote Sensing and GIS Applications to Forest Fire Management, Proceedings of the Workshop held in the University of Alcala de Henares, Spain, Sept. 1995, edited by Emilio Chuvieco.
Generally, single engine planes or helicopters are used for fire reconnaissance. Unfortunately, flights using single engine planes are restricted to day flights; if night flights are required, the more expensive twin-engine planes need to be used.

SUMMARY OF THE PRESENT INVENTION
The present invention combines real-time infra-red (IR) images of a fire, taken at the time of the fire, with digital images or maps of the area where the fire rages, previously prepared. The two types of images are registered and the IR images are superimposed upon the digital images, thereby providing the fire-fighting forces with information regarding the types of objects (e.g. houses, trees, roads, etc.) which lie in the path of the fire.
Therefore, in accordance with a preferred embodiment of the present invention, the system includes a viewing unit and a ground station. The viewing unit is mounted on an airborne vehicle or a fire watch tower, and includes at least a temperature sensitive sensor, such as an infra-red (IR) sensor, for measuring the hot areas of the scene. The airborne vehicle can be an unmanned airborne vehicle (UAV) or an airplane.
The ground station includes a fire image unit, an image database, a prior scene unit, a fire scene unit and a monitor. Optionally, the ground station can also include a printer or plotter. The fire image unit creates a generally real-time fire image, from the output of the temperature sensitive sensor, wherein the fire image details at least the perimeter of a fire. The image database stores digital images of prior scenes in the general area of the fire, wherein the scenes do not show the fire. Alternatively, the stored digital image includes a three dimensional model of the area prior to the fire. The prior scene unit retrieves, from the image database, a previously created digital image or map corresponding to the scene viewed by the fire image unit. The fire scene unit superimposes the fire image onto the previously created digital image or map. The monitor displays the resultant image.
Additionally, in accordance with a preferred embodiment of the present invention, the viewing unit additionally includes a wind speed determining unit for determining the velocity and direction of the wind, an attitude sensor for measuring the attitude of the vehicle and a location sensor for measuring the location of the vehicle. The position of the sensor's line of sight relative to the vehicle can also be measured.
Moreover, in accordance with a preferred embodiment of the present invention, the prior scene unit determines, from output of at least the camera attitude and location sensors, the line of sight of the temperature sensitive sensor.
The prior scene unit then generates, from the line of sight and data in a topography database, the Earth coordinates of the footprint viewed by the temperature sensitive sensor.
Further, in accordance with a preferred embodiment of the present invention, the system of the present invention includes a location indicating system attached at least to some of the fire-fighting forces. In this embodiment, the fire scene unit additionally overlays indications of the locations of fire-fighting forces on the monitor.
Still further, in accordance with a preferred embodiment of the present invention, the fire scene unit also estimates the speed of the fire front line, based on data from at least the topography database, the image database and the wind speed. The fire scene unit can also determine the actual speed of the fire front line from the location of the fire front line at two different times. The fire scene unit can also display the intensity of the fire in the format of contours showing the intensity of the fire.
Furthermore, the fire scene unit can also validate the predicted progress of the fire and make real time corrections in accordance with the actual progress.
The fire scene unit can also monitor the fire retardant drops and plan the location of future drops of fire retardant. The fire coordinates can be derived from the fire scene digital map.
Additionally, in accordance with a preferred embodiment of the present invention, the fire scene unit also generates a warning if one of the fire-fighting forces might be in danger due to the estimated speed of the fire and the estimated ability of the fire-fighting forces to withdraw from the fire.
Moreover, in accordance with a preferred embodiment of the present invention, the present invention also includes a communications relay mounted on the viewing unit for providing relatively high quality radio communications among the fire-fighting forces.
Further, in accordance with a preferred embodiment of the present invention, the present invention performs calibration operations to calibrate the coordinate output of the fire scene unit with the location output of the location indicating system. The operation involves viewing the heat of at least one known hot or cold object, measuring the location of the known hot or cold object with units of the location indicating unit and comparing the output of the two units, thereby to determine a calibration correction value. The location of the hot object can also be determined by the use of triangulation techniques. Furthermore, the fire images can be overlaid in their correct positions on the fire scene map.
Additionally, in accordance with a preferred embodiment of the present invention, the temperature sensitive sensor is sensitive to a range of temperatures and the fire image unit displays different sections of the range of temperatures in different colors.
Additionally, in accordance with a preferred embodiment of the present invention, the present invention envisions mounting a mobile location indicating unit on a fire retardant bombing tanker. This enables the fire incident manager to view the location of the tanker relative to the hot areas of the fire and thus enables the manager to guide the tanker to the hot spots of interest, such as those close to a house.
Additionally, in accordance with a preferred embodiment of the present invention, the present invention can include a guidance unit for determining the time at which a fire retardant drop should be released from an airplane.
Finally, in accordance with a preferred embodiment of the present invention, the viewing unit and the fire scene unit can be mounted on a fire retardant bomber, or any other vehicle flying over the fire, such as a helicopter, without the need for a specially fitted-out air vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
Fig. 1 is a schematic illustration of a fire imaging system, constructed and operative in accordance with a preferred embodiment of the present invention, at the scene of a fire;
Fig. 2A is a series of images utilized by the fire imaging system to image the scene of the fire;
Fig. 2B is an illustration of a combined image produced by the fire imaging system from the images of Fig. 2A;
Fig. 3 is a block diagram illustration of the elements of the fire imaging system of the present invention;
Fig. 4 is a flow chart illustration of a field image processor forming part of the fire imaging system of Fig. 3; and Fig. 5 is an illustration of an image of the intensity of the fire.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Reference is now made to Figs. 1, 2A, 2B and 3, which illustrate the fire imaging system of the present invention. Fig. 1 illustrates an exemplary scene of a fire and Figs. 2A and 2B illustrate the images created by the system of the present invention. Fig. 3 illustrates the elements of the system.
Fig. 1 illustrates an area 10 where a fire 11 rages and an area 12 where the fire has yet to reach. Area 10 includes trees 14 which have been burnt. Area 12 includes houses 20, unburnt trees 22, a road 24 and a hill 26. Fig. 1 schematically illustrates the fire-fighting forces near the front line of the fire. The forces include firefighters 30, a fire truck 32 and a command center 34, typically where the fire incident manager (or the fire chief) operates.
The fire imaging system of the present invention typically comprises a viewing unit 36 which is in contact with a processing unit 38 within the command center 34. In accordance with the present invention, the viewing unit 36 views the fire from a nearby vantage point, thereby to view the fire in detail. For example, the viewing unit 36 can be on an unmanned airborne vehicle (UAV) 40, on an airplane 42 or on a tower 44 which is close enough to view the area 10 of the fire.
The viewing unit 36 transmits images of the fire to the processing unit 38.
The viewing unit 36 includes temperature sensitive sensors, such as infra-red (IR) sensors, which can "see through" the smoke of the fire, and which transmit data to processing systems which determine the locations of the front line and of the hot spots of the fire.
In an alternative embodiment, the viewing unit 36 includes a color television (TV) sensor in addition to the IR temperature sensitive sensors. Either or both the TV sensor and the IR sensors may be used to view the fire scenario.
The use of two different types of sensor enables the operator to better view the fire. For example, if there is a hot spot on the IR sensor, the operator can verify the situation in real time. At night, a fire will be visible on the TV sensor, while during the day, smoke from the fire will be seen. By recording a pair of images (TV and IR) in the course of flying, hot spots can later be scanned and checked. The resultant temperature image, labeled 50 in Fig. 2A, indicates the front line 52, the hot spots 54 and the already burnt areas 56 (e.g. areas which were part of previous front lines 52 or hot spots 54). Some IR sensors can detect the burnt areas 56 but other sensors cannot. For those that cannot, the burnt areas 56 are determined from the locations of the front line 52 in the past.

SUBSTITUTE SHEET (RULE 26) CA 022492l6 l998-09-l7 In accordance with a preferred embodiment of the present invention, the processing unit 38 combines the temperature image 50 with a previously stored digital image (or map or 3-dimensional model) 60 of the areas 10 and 12. Digitalimage 60 can be a satellite image, such as are received from satellite imaging systems such as the SPOT program satellites, an ortho-photo, a photograph taken from an airplane, or any other digital image or a map with related DEM
(data elevational model) of the area in question, taken at some time before the fire. Thus, as shown in Fig. 2A, the digital image 60 shows all, or most, of theobjects (houses 20, trees 22, road 24) in the line of fire. The objects behind the 10 line of fire (i.e. which have been burnt) are either masked out (as shown) or shown in a different color.
Processing unit 38 can additionally overlay other digital maps, such as map 55 which illustrates the locations of infrastructure elements, such as pipes 67, cables 69, power lines, etc. GIS symbols and gridlines may also be added.
Fire image 50, previous image 60 and digital map 55 have associated therewith coordinate grids 61, 64 and 65, respectively. The processing unit 38 utilizes these grids to overlay fire image 50 (and, optionally, digital map 55) over previous image 60, thereby to produce combined image 62 (Fig. 2B).
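As an illustrative sketch only (not part of the original disclosure), this registration step can be pictured as mapping every pixel of the prior image 60 through a shared Earth-coordinate grid and copying in hot pixels from the georeferenced fire image 50. The north-up raster convention, array names and nearest-neighbour resampling below are assumptions:

```python
import numpy as np

def overlay_fire(base_img, base_origin, base_res,
                 fire_img, fire_origin, fire_res, threshold=0.5):
    """Superimpose a georeferenced fire image onto a prior base image.

    Each image is a 2-D array whose pixel (row, col) maps to Earth
    coordinates relative to a top-left (east, north) origin and a pixel
    size in metres (north-up raster). Nearest-neighbour resampling;
    hot fire pixels overwrite the base image.
    """
    out = base_img.astype(float).copy()
    rows, cols = np.indices(base_img.shape)
    # Earth coordinates of every base-image pixel centre
    east = base_origin[0] + (cols + 0.5) * base_res
    north = base_origin[1] - (rows + 0.5) * base_res
    # Corresponding pixel indices in the fire image
    fc = np.floor((east - fire_origin[0]) / fire_res).astype(int)
    fr = np.floor((fire_origin[1] - north) / fire_res).astype(int)
    inside = (fr >= 0) & (fr < fire_img.shape[0]) & \
             (fc >= 0) & (fc < fire_img.shape[1])
    hot = np.zeros(base_img.shape, dtype=bool)
    hot[inside] = fire_img[fr[inside], fc[inside]] > threshold
    out[hot] = 1.0  # mark fire pixels, e.g. saturated red on a colour display
    return out
```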
It will be appreciated that combined image 62 provides the fire incident manager with a real-time image of the location of the fire as well as the locations of objects, such as houses, trees, cables, etc. that are near or in the path of the fire. Additional information which can be provided includes the locations of the fire-fighting forces and the probable speed of the fire, as described hereinbelow.
Reference is now made to Fig. 5 which illustrates the fire image 50 as fire intensity contours, generally designated 72. The intensity of the fire varies over the area of the fire and fire image 50 can be shown as a series of contours referenced 72a, 72b and 72c. The areas between the contours 72 can be colored differently or shaded in "grays" (if a black & white image is displayed), to differentiate between the various fire intensity levels, from fire hot spot 74 (shaded white) to the non-affected area 76 on the perimeter. The fire intensity contours may be overlaid on the fire image display of Fig. 2B, if desired. The overlay may comprise any desirable transparency level of the intensity colors.
The fire intensity threshold (gray level) may be selectively varied in real time, allowing the fire incident manager to classify the "intensity" levels of the map.
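A minimal sketch of such selectable intensity levels, assuming illustrative band edges in degrees Celsius (the real thresholds would be tuned by the fire incident manager in real time):

```python
import numpy as np

# Hypothetical band edges; everything below the first edge is the
# non-affected perimeter area 76, everything above the last is hot spot 74.
BAND_EDGES = [150.0, 400.0, 800.0]

def intensity_levels(temp_img, edges=None):
    """Quantize a temperature image into fire-intensity levels 0..len(edges)."""
    return np.digitize(temp_img, edges if edges is not None else BAND_EDGES)

def to_grayscale(levels, n_levels=len(BAND_EDGES) + 1):
    """Shade levels in 'grays' for a black & white display, hottest = white."""
    return (levels / (n_levels - 1) * 255).astype(np.uint8)
```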
In order to provide the fire incident manager with information on the fire-fighting forces as well as any other objects of interest, such as a firefighter's car 46, the fire imaging system operates with a location determining system, such as one which operates with a global positioning system (GPS). As is known in the art, the GPS system includes a plurality of GPS satellites 70 which envelope the Earth. Fig. 1 schematically illustrates only one such satellite 70.
The location determining system also includes mobile location units (MLUs) 48, such as those manufactured by Motorola Inc. of the USA, mounted on each of the objects of interest. The MLUs 48 include a GPS receiver which determines its location from signals received from at least four GPS satellites 70 and a transceiver which relays the location information to the command center 34, typically via the viewing unit 36.
For location accuracy, the MLUs 48 can operate with the GPS "P-code" (accuracy to 1 m) approved by the US Government for use by forestry services. Alternatively, the MLUs 48 can operate with the known differential GPS technology.
Image 62 of Fig. 2B shows the fire-fighting forces as dots and rectangles, labeled 30 and 32 as in Fig. 1. Image 62 also shows arrows 68 indicating the movement and speed of the fire. The movement can be illustrated perpendicular to the fire front line, and the speed and direction are illustrated by the number of arrows clustered together, where many arrows indicate high speed. Image 62 also includes a window 25 listing the wind speed, humidity, ambient temperature and any alerts. The wind speed is generated by the viewing unit 36 and/or is measured by the fire-fighting forces.
It will be appreciated that, though not required, the use of unmanned airborne vehicles (UAVs) is advantageous. UAVs can stay in flight for significant lengths of time, much longer than a human pilot can. Furthermore, the risks associated with human flight above a fire scene, such as risks from vertigo, smoke inhalation, burning, etc., and the risks of night flights, are not present with UAVs. Finally, there is no need to carry oxygen when flying at altitudes above the minimum oxygen level of the atmosphere.
Fig. 3 illustrates the elements of the fire imaging system of the present invention. The viewing unit 36 typically, but not necessarily, includes an IR sensor 80, a GPS receiver 82, a wind speed determining unit 84, a spatial attitude sensor unit 86, such as an inertial measurement unit (IMU) or an inertial navigation system, a communications relay 88 and, optionally, other sensors 89, such as a microwave radiometer, a synthetic aperture radar (SAR), a Doppler sensor, etc.

Other sensors 89 can include a TV camera for viewing the scenario whenever smoke conditions allow.
The IR sensor 80, which can be of the staring or scanning type, provides an IR image of the fire. The wind speed determining unit 84 determines the speed and direction of the wind. As is known in the art, the wind speed is determined from the position and airspeed velocity and direction of the UAV 40. For the viewing unit on tower 44, the wind speed is determined by a wind speed sensor, such as are known in the art.
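This is the classic wind-triangle computation: the wind vector is the UAV's ground velocity (from successive GPS positions) minus its air velocity (from airspeed and heading). A minimal sketch, with the usual aviation sign conventions assumed:

```python
import math

def wind_from_uav(ground_speed, ground_track_deg, airspeed, heading_deg):
    """Estimate wind as (ground velocity) - (air velocity).

    Speeds in any one consistent unit (e.g. knots); directions in degrees
    clockwise from north, "towards" convention. Returns (wind speed,
    direction the wind blows FROM in degrees), the meteorological convention.
    """
    gx = ground_speed * math.sin(math.radians(ground_track_deg))
    gy = ground_speed * math.cos(math.radians(ground_track_deg))
    ax = airspeed * math.sin(math.radians(heading_deg))
    ay = airspeed * math.cos(math.radians(heading_deg))
    wx, wy = gx - ax, gy - ay                  # wind vector, "towards"
    speed = math.hypot(wx, wy)
    blowing_to = math.degrees(math.atan2(wx, wy)) % 360
    return speed, (blowing_to + 180) % 360
```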
The spatial attitude sensor unit 86 provides an indication of the current spatial angles of the viewing unit 36, which typically change as the UAV 40 flies over the areas 10 and 12. The other sensors provide further information on the areas 10 and 12 which can be superimposed on the combined image 62 or can be utilized to determine various parameters of the fire 11.
The GPS receiver 82 determines the location of the UAV 40 over the Earth. Communications relay 88 relays location information from the MLUs 48 (mounted on the fire-fighting forces) to the processing unit 38. It also enables the command center 34 to communicate with the fire-fighting forces. Since the communications relay 88 is located above the fire, and above most, if not all, of the local terrain shapes, the communications relay enables improved communication between the command center 34 and the fire-fighting forces.
The processing unit 38 typically comprises a data receiver unit 91, an IR image processor 90, a fire location processor 92, a topography database 94, such as a digital terrain model (DTM), a field image processor 96, an image database 98, an optional map database 97 and a monitor 99 on which the combined image 62 is displayed. Optionally, the processing unit 38 may further comprise a modem 93 for transfer of data, and printers and plotters 95 in order to produce a hard copy of the images.
The data receiver unit 91 receives the various datastreams transmitted by the viewing unit 36 to the processing unit 38. The data receiver unit 91 can comprise a multiplicity of receivers, each dedicated to receiving some of the transmitted data.
The IR image processor 90 processes the IR image received from IR sensor 80 to determine the location of the fire front line 52 and of the hot spots 54. The processing operation is similar to that described in U.S. Patent 5,160,842 to Johnson, the disclosure of which is incorporated herein, and involves determining the outline of areas having the same temperature. U.S. 5,160,842 describes receiving a plurality of different images, each for a different temperature, and combining this information into a single image. The IR sensor 80 and the IR image processor 90 can operate as described in U.S. 5,160,842, or the IR sensor 80 can view all temperatures above a predetermined threshold, such as 1000°C, at once. In the latter case, the IR image processor 90 divides the image into separate images for the temperatures of interest and then processes the resultant images. The IR image processor 90 provides a processed IR image which indicates the locations of hot areas.
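A minimal sketch of the latter mode of operation, dividing one all-temperatures image into separate per-band images; the band limits are illustrative only:

```python
import numpy as np

def split_by_temperature(ir_img, bands):
    """Divide a single IR image into one binary image per temperature band.

    `bands` is a list of (low, high) temperatures of interest, e.g.
    [(300, 600), (600, 1000), (1000, np.inf)] -- illustrative values.
    Each returned mask can then be outlined in the manner of the
    single-temperature images of U.S. 5,160,842.
    """
    return [np.logical_and(ir_img >= lo, ir_img < hi) for lo, hi in bands]
```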
The fire location processor 92 creates the coordinate grid 61 for the processed IR image, in accordance with known operations of analytical geometry.
To do so, it first determines the line of sight of the IR sensor 80, a known function of the location of the UAV 40, as measured by the GPS receiver 82 thereon, the attitude of the UAV 40, as measured by attitude sensor 86, and the spatial attitude of the IR sensor 80 with respect to the UAV 40, measured as is known in the art. Fire location processor 92 also determines the "footprint" of the IR sensor 80, a known function of the features of its optics.
Fire location processor 92 then determines, in accordance with known operations, where the line of sight intersects the ground in areas 10 and 12 near the fire 11. The topography of the ground is provided by the topography database 94. Fire location processor 92 then determines the coordinates of the area viewed by the IR sensor 80.
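One common way to carry out such a line-of-sight/terrain intersection is to march along the sensor ray until it drops below the DTM surface. The sketch below (an assumption, not the patent's stated algorithm) reuses the north-up raster convention from the overlay sketch above; the step size and range limit are illustrative:

```python
import numpy as np

def los_ground_intersection(pos, los_dir, dtm, origin, res,
                            step=5.0, max_range=20000.0):
    """March along the line of sight until it pierces the terrain.

    pos     : (east, north, altitude) of the sensor in metres
    los_dir : unit vector (east, north, up) of the line of sight
    dtm     : 2-D array of terrain elevations, placed on Earth by origin/res
    Returns the (east, north) ground point viewed, or None within max_range.
    """
    p = np.asarray(pos, dtype=float)
    d = np.asarray(los_dir, dtype=float)
    for _ in range(int(max_range / step)):
        p = p + d * step
        col = int((p[0] - origin[0]) / res)
        row = int((origin[1] - p[1]) / res)
        if not (0 <= row < dtm.shape[0] and 0 <= col < dtm.shape[1]):
            return None                    # ray left the database coverage
        if p[2] <= dtm[row, col]:          # ray has pierced the surface
            return p[0], p[1]
    return None
```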
With the coordinate information, the fire location processor 92 determines the coordinates of the hot areas of the IR image and, from that, determines which hot areas belong to the fire front line and which belong to hot spots. The hot areas which belong to the fire front line are those which are physically close together. The remaining hot areas are defined as hot spots.
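A rough stand-in for this proximity rule uses connected-component labelling: large connected hot regions (pixels physically close together) are taken as front line, small isolated ones as hot spots. The size threshold is an assumption:

```python
import numpy as np
from scipy import ndimage

def classify_hot_areas(hot_mask, min_front_pixels=50):
    """Split a binary hot-area mask into front-line and hot-spot masks."""
    labels, n = ndimage.label(hot_mask)                   # connected regions
    sizes = ndimage.sum(hot_mask, labels, index=range(1, n + 1))
    big = [i + 1 for i, s in enumerate(sizes) if s >= min_front_pixels]
    front = np.isin(labels, big)                          # front line 52
    spots = hot_mask & ~front                             # hot spots 54
    return front, spots
```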
It will be appreciated that the fire incident manager can command the UAV 40 to move so as to ensure that IR sensor 80 views a desired area, and the fire incident manager can command the optics of the IR sensor 80 to zoom in and out, as desired. The fire incident manager can also indicate that he wishes the UAV 40 to move to a location where the view of the desired area is clearer (less full of smoke). This form of control is common to UAVs. For example, it can be found on the Hunter System UAV manufactured by Malat - Israel Aircraft Industries of Israel.
The field image processor 96 receives both the coordinate grid 61 and the processed IR image 50. Processor 96 utilizes the coordinate grid 61 to select the corresponding digital image 60 from the image database 98, wherein the corresponding digital image 60 views the same, or close to the same, coordinates as the IR image 50.
It is noted that the digital images or maps in image database 98 can be two- or three-dimensional and they have associated therewith coordinate grid 64. The digital images preferably are of a high enough resolution to be able to differentiate among trees and houses.
Field image processor 96 also combines the other measurements it receives to produce the combined image 62 with all of the indications discussed hereinabove. Field image processor 96 displays the final result on monitor 99. A printout of the fire scenario can readily be issued.
It is noted that the IR sensor 80 stares at or scans the scene continuously, producing an image periodically, such as every 30 or 60 seconds.
The fire incident manager can direct the IR sensor or color video 80 to view whichever portions of the scene he wishes, as described hereinabove.
Reference is now made to Fig. 4 which details the flow of operations of the field image processor 96. At step 100, the field image processor 96 retrieves the digital image 60 (and, if desired, the digital map 55) which has the same coordinates as the IR image 50. At step 102, the field image processor 96 superimposes the IR image 50 (and the digital map 55) on the digital image 60 utilizing the coordinate information to align the two (or three) images.
In step 103, the field image processor 96 compares the location of the fire front line in the present image with that in the image previous to it, taken X seconds previously. For each location along the fire front line, the field image processor 96 determines the current velocity of the fire front line and stores the information.
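A brute-force sketch of this step 103 comparison, pairing each current front-line point with the nearest point of the previous front line; names and units are illustrative:

```python
import numpy as np

def front_line_velocity(front_now, front_prev, dt_seconds):
    """Local advance speed (m/s) at each point of the current front line.

    front_now, front_prev : N x 2 / M x 2 arrays of Earth coordinates (m)
    of the front line at the present time and dt_seconds earlier.
    """
    front_prev = np.asarray(front_prev, dtype=float)
    speeds = []
    for p in np.asarray(front_now, dtype=float):
        d = np.linalg.norm(front_prev - p, axis=1).min()  # nearest predecessor
        speeds.append(d / dt_seconds)
    return np.asarray(speeds)
```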
At step 104, the field image processor 96 draws window 25 showing the wind speed and direction, humidity and ambient temperature. Other steps provide the alert indications.
In step 106, field image processor 96 receives the locations of the fire-fighting forces (and of any other objects of interest) from the relay 88. Processor 96 then converts these locations to locations within the space of the digital image 60, at which locations processor 96 places the appropriate symbol. In step 108, processor 96 reviews the history of the fire front line and marks as burnt those areas where the fire front line previously was. Alternatively, processor 96 reviews the IR image to determine which areas have the expected temperature range of a burnt area. Processor 96 then masks out, or provides some other indication for, the portion of the digital image 60 which falls within the burnt areas. The processor 96 can also provide different color masks for each of predetermined temperature ranges. Thus, all areas of one temperature range will be colored one color and those of a different temperature range will be colored a second color, etc.
In step 110, the combined image produced in the previous steps is displayed on monitor 99.
In steps 112 and 114, the field image processor 96 predicts where the fire front line is likely to move and what, if any, consequences this has for the fire-fighting forces. In step 112, processor 96 reviews the topography of the areas 10 and 12, the wind speed and direction and the location of the fire front line. In addition, it assesses all parameters available in the image database, such as the type and dryness of the objects within the area 12 close to the fire front line, which influence fire behavior. Processor 96 combines these variables to estimate the speed of the fire front line. For example, processor 96 can perform operations similar to those of the programs BEHAVE and/or FAR-SIGHT produced by the US Forest Service. Both programs predict the speed of the fire line.
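The toy model below is neither BEHAVE nor FAR-SIGHT; it only illustrates how the inputs listed above (wind, slope, fuel dryness) might be combined multiplicatively into a spread-speed estimate. The functional form and coefficients are invented for illustration:

```python
import math

def estimate_spread_speed(base_rate, wind_speed, slope_deg, fuel_dryness):
    """Toy rate-of-spread estimate (same units as base_rate).

    base_rate    : no-wind, flat-ground spread rate for the fuel type
    wind_speed   : wind component along the spread direction (m/s)
    slope_deg    : terrain slope along the spread direction (uphill > 0)
    fuel_dryness : 0 (saturated) .. 1 (tinder dry)
    """
    wind_factor = 1.0 + 0.5 * max(wind_speed, 0.0)   # wind fans the front
    slope_factor = math.exp(0.069 * slope_deg)       # fire runs faster uphill
    return base_rate * wind_factor * slope_factor * fuel_dryness
```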
In addition, the processing unit 38 may comprise a database library containing data on other variables, such as fuels, moisture content and air moisture, for example, which can affect the progress of the fire. Processor 96 can utilize the data stored in the database to refine its calculations and predictions.
The predicted fire front line velocity can be compared to or corrected by the actual front line velocity, as determined in step 103. The field image processor 96 then displays the estimated and calibrated fire front line velocity, as described hereinabove, with arrows 68, wherein many arrows indicate a fast moving fire.
At any stage during the above described process, the interpreter can manually interpose instructions to control the process. For example, the interpreter can select areas of interest in the mapped image in order to calculate the fire front line velocity of those areas. Alternatively, the interpreter can select two points on different fire lines (representing different times) in order to obtain the velocity.
As is known in the art, by utilizing known parameters, it is possible to determine the value of another variable parameter. Thus, for fire behavior prediction, by utilizing known parameters such as slope angle, the type of fuels (trees, bushes, etc.), wind speed and air humidity, the fuel moisture content can be determined.
The predicted fire front line velocity can also be utilized to predict the location of the fire front line X minutes into the future or, alternatively, the time it will take for the fire front line to arrive at a given location. It will be appreciated that these calculations can be utilized to determine where to place a fire break.
The predicted fire front line velocity can also be utilized to "dead reckon" the expected location of the fire front line when the viewing unit 36 has not viewed a certain area for a long time. Such a dead reckoned front line can be displayed on the monitor 99 in a manner different than that of the measured fire front line. For example, the dead reckoned front line might be shown with dashed lines.
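A minimal dead-reckoning sketch, advancing each point of the last measured front line along its outward normal at its last measured local speed; the normal vectors are assumed to be supplied by the front-line extraction step:

```python
import numpy as np

def dead_reckon_front(front_points, speeds, normals, elapsed_s):
    """Extrapolate an unobserved front line from its last measurement.

    front_points : N x 2 Earth coordinates (m) of the last measured front
    speeds       : N local speeds (m/s), e.g. from front_line_velocity()
    normals      : N x 2 unit outward normals of the front line
    The result would be drawn dashed to distinguish it from measured data.
    """
    front_points = np.asarray(front_points, dtype=float)
    speeds = np.asarray(speeds, dtype=float)
    normals = np.asarray(normals, dtype=float)
    return front_points + normals * (speeds[:, None] * elapsed_s)
```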
In step 114, the field image processor 96 compares the speed of the fire front line with the expected speed of the fire-fighting forces. The expected speed of the fire-fighting forces is a function of the traversability of the terrain, determined by a terrain traversability unit (step 113) from information in the topography database 94, and the general speed of a human being.
If the speed and route of the fire-fighting forces are such that they can easily back away from the fire front line faster than the front line moves forward, then all is fine. Otherwise, processor 96 issues a visible and/or audible warning, in step 116, and places it into window 25.
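A sketch of this step 114 comparison; the ten-minute safety margin is an assumed parameter, not a value from the text:

```python
def withdrawal_warning(fire_speed, crew_speed, fire_distance,
                       safety_margin_s=600.0):
    """True when a visible/audible warning (step 116) should be issued.

    fire_speed    : local front-line speed towards the crew (m/s)
    crew_speed    : expected withdrawal speed over this terrain (m/s),
                    from the terrain traversability unit of step 113
    fire_distance : current distance between front line and crew (m)
    """
    closing = fire_speed - crew_speed
    if closing <= 0:
        return False       # crew can back away faster than the fire advances
    return fire_distance / closing < safety_margin_s
```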
The present invention can also include a calibration step, performed before or during operation of the system. The calibration operation compensates for any discrepancies in accuracy between the GPS-based location system (for locating the fire-fighting forces) and the line of sight (LOS) location system (for locating the area viewed by the viewing system).
To calibrate, the fire-fighting forces place one or more known objects at some known location away from the fire front line, with an MLU 48 mounted on each. The known objects are to be viewed by the color TV camera or viewing system 36 and therefore, have a known, very high temperature. For example, the known objects might be a fire in a garbage can or an electric heating element.
Since the objects are away from the fire, in a known location, they are easily identifiable and viewable. These objects can also be automatically detected by modulating their illumination. The system of the present invention views the objects with the viewing system 36 and provides the IR image to the IR image processor 90. The processed image, which has hot spots wherever the known objects were viewed, is processed by the fire location processor 92 to identify the locations of the hot spots.
In parallel, the MLUs 48 on the known objects transmit their position data to field image processor 96, via the relay 88. Field image processor 96, when in the calibration mode, compares the "GPS" locations of the objects with the "LOS" locations and determines the appropriate corrections to the LOS locations. This correction information is stored and utilized by the field image processor 96 during its regular operation. The calibration operation can be performed continuously or periodically, as desired.
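Assuming the simplest error model, a constant bias (the text does not specify one), the correction could be computed as the mean offset between the paired fixes and then added to subsequent LOS coordinates:

```python
import numpy as np

def calibration_correction(los_positions, gps_positions):
    """Mean (east, north) offset between LOS-derived and MLU 48 GPS fixes.

    Both arguments are N x 2 arrays for the same N known calibration
    objects. A fuller calibration might also solve for rotation and scale.
    """
    los = np.asarray(los_positions, dtype=float)
    gps = np.asarray(gps_positions, dtype=float)
    return (gps - los).mean(axis=0)   # corrected_los = los_fix + correction
```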
It will be appreciated that the present invention can also assess the efficiency of retarding materials utilized to douse the fire. For example, the fire image produced by the IR sensor 80 contains a range of temperatures and will have some areas with lower temperatures due to the operation of the retarding materials. Furthermore, the interpreter can judge the effectiveness of the fire retardant drop process and can thus determine future optimal drop locations.
It is critical for the safety of the fire-fighting forces that the fire line is blocked by fire retardants. Since the fire retardant drops are visible to the IR sensor, the fire incident manager can select the location of the next drop on the digital map.
The fire incident manager sees these areas since the lower temperature areas are displayed by the system of the present invention in a color different than that of the very hot areas. Furthermore, if MLUs 48 are placed on the vehicles, such as airborne tankers, which spray the retarding materials, their locations will show up on the display. Thus, the fire incident manager can guide the vehicles to the areas which need them the most, by communicating his requests via a standard communications system.
Alternatively or in addition, the field image processor can include a unit for determining the appropriate time to drop a "bomb" of retarding materials on the fire. The unit considers the relevant parameters, such as the velocity of the airplane carrying the bomb, the wind speed and direction, the height above the terrain, etc., in a manner similar to known calculations for dropping military bombs.
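For flavour, a drag-free point-mass release calculation in the spirit of that analogy; real retardant dispersal is far more complex, so this is an illustrative sketch only:

```python
import math

def drop_lead_time(height_agl, ground_speed, wind_along_track):
    """Fall time and release-point lead distance for a simple ballistic drop.

    height_agl       : release height above the terrain (m)
    ground_speed     : aircraft ground speed (m/s)
    wind_along_track : wind component along the track (m/s), drifting the load
    Returns (fall time in s, lead distance in m ahead of the target).
    """
    g = 9.81
    t_fall = math.sqrt(2.0 * height_agl / g)           # no-drag fall time
    lead = (ground_speed + wind_along_track) * t_fall  # distance covered falling
    return t_fall, lead
```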
It will further be appreciated that the system of the present invention provides the fire incident manager with a view of the fire which indicates the objects of interest therein, information regarding the activity of the fire and information regarding the fire-fighting forces.
In an alternative embodiment, the displayed image can display the moving image as viewed by the camera.

SUBSTITUTE SHEET (RULE 26) CA 022492l6 l998-09-l7 WO 97/3S433 PCTnL97/00097 Furthermore, the processing unit 38 can utilize the continuous wind velocity data to compare the latest wind data with the mean wind speed and velocity of the previous tine period (say, 15 minutes) and in the event of a significant wind change, such as a direction change of more than five degrees or velocity change of five knots.
Additionally, the system can continuously calculate and monitor the possibility of the fire-fighters being entrapped. In the case of pre-determined wind changes critical to the well-being of the fire-fighters, whenever such a situation exists, the interpreter will be notified by an alarm indication.
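A sketch of such a monitor, using the five-degree and five-knot thresholds from the text; the sliding-window length and sampling rate are assumptions:

```python
import math
from collections import deque

class WindChangeMonitor:
    """Flag a significant change of the latest wind sample against the
    mean of a sliding window (e.g. the previous 15 minutes)."""

    def __init__(self, window_samples=90):            # e.g. one sample / 10 s
        self.samples = deque(maxlen=window_samples)   # (speed_kn, dir_deg)

    def update(self, speed_kn, dir_deg):
        alert = False
        if self.samples:
            mean_speed = sum(s for s, _ in self.samples) / len(self.samples)
            # circular mean of the stored directions
            x = sum(math.sin(math.radians(d)) for _, d in self.samples)
            y = sum(math.cos(math.radians(d)) for _, d in self.samples)
            mean_dir = math.degrees(math.atan2(x, y)) % 360
            d_dir = abs((dir_deg - mean_dir + 180) % 360 - 180)
            alert = abs(speed_kn - mean_speed) > 5.0 or d_dir > 5.0
        self.samples.append((speed_kn, dir_deg))
        return alert
```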
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined only by the claims below:


Claims (53)

1. A fire scenario imaging system comprising:
a. a viewing unit mounted on an airborne vehicle and comprising at least a temperature sensitive sensor for measuring the hot areas of said scene, typically associated with said fire; and b. a ground station comprising:
i) fire image means for creating a generally real-time fire image, from the output of said temperature sensitive sensor, detailing at least the perimeter of a fire;
ii) an image database of digital images of prior scenes in the general area of said fire, wherein said scenes do not show said fire;
iii) prior scene means for retrieving, from said image database, a previously created digital image corresponding to the scene viewed by said fire image means;
iv) fire scene means for superimposing said fire image onto said previously created digital image; and v) a monitor for displaying the resultant image.
2. A fire scenario imaging system comprising:
a. a viewing unit mounted on a fire watch tower and comprising at least a temperature sensitive sensor for measuring the hot areas of said scene, typically associated with said fire; and b. a ground station comprising:
i) fire image means for creating a generally real-time fire image, from the output of said temperature sensitive sensor, detailing at least the perimeter of a fire;
ii) an image database of digital images of prior scenes in the general area of said fire, wherein said scenes do not show said fire;
iii) prior scene means for retrieving, from said image database, a previously created digital image corresponding to the scene viewed by said fire image means;

iv) fire scene means for superimposing said fire image onto said previously created digital image; and v) a monitor for displaying the resultant image.
3. A system according to any of claims 1 and 2 and wherein said temperature sensitive sensor is an infra-red (IR) sensor.
4. A system according to any of claims 1 and 2 and wherein said viewing unit additionally includes a wind speed determining unit for determining the velocity of the wind, apparatus for determining the wind turbulence, an attitude sensor for measuring the attitude of said vehicle and a location sensor for measuring the location of said vehicle.
5. A system according to claim 4 and wherein said prior scene means includes:
a. means for determining, from output of at least said camera and vehicle attitude and location sensors, the line of sight of said temperature sensitive sensor;
b. a topography database covering the topography of said scene in the general area of said fire; and c. coordinate means for generating, from said line of sight and said topography database, the Earth coordinates of the footprint viewed by said temperature sensitive sensor.
6. A system according to claim 4 and wherein said fire scene means additionally includes means for displaying at least an indication of said wind velocity and direction on said monitor.
7. A system according to any of claims 1 and 2 and additionally comprising a location indicating system attached at least to some of the fire-fighting forces, wherein said fire scene means additionally includes means for displaying indications of the locations of fire-fighting forces on said monitor.
8. A system according to claim 5 and wherein said fire scene means also includes fire front line means for estimating the speed of said fire front line, based on data from at least said topography database, said image database and said wind speed determining unit.
9. A system according to claim 8 and wherein said ground station also includes fire speed means for measuring the actual speed of said fire front line from the location of said fire front line at two different times.
10. A system according to claim 8 and wherein said fire scene means also includes warning means for generating a warning if one of said fire-fighting forces might be in danger due to the estimated speed of said fire and the estimated ability of said one of said fire-fighting forces to withdraw from said fire.
11. A system according to claim 8 and wherein said fire scene means also includes means for validating the predicted progress of the fire and for making real time corrections.
12. A system according to claim 8 and wherein said fire scene means also includes means for monitoring the fire retardant drops and planning therefrom the location of future drops of fire retardant.
13. A system according to any of the previous claims and also comprising a communications relay mounted on said viewing unit for providing communications among said fire-fighting forces.
14. A system according to claims 6 and 8 and wherein said ground station also comprises calibration means for calibrating the coordinate output of said coordinate means with the location output of said location indicating system.
15. A system according to claim 14 and wherein said means for calibrating include means for receiving, from said fire image means, an IR image of at least one known hot object, means for receiving, from said location indicating system, the location of said at least one known hot object and means for comparing the output of said two means for receiving, thereby to determine a calibration correction value.
16. A system according to claim 14 and wherein said means for calibrating further include triangulation means for determining a calibration correction value.
17. A system according to any of claims 1 - 2 and wherein said displayed resultant image includes the display of said fire image as contours of the intensity of the fire, said intensity level being selectable.
18. A system according to any of claims 1 and 2 and wherein said viewing unit comprises an IR sensor and a color television camera, wherein the existence of a fire may be verified by reference to simultaneous views from both said IR sensor and said color television camera.
19. A system according to any of claims 1 and 2 and further comprising a printer for printing the resultant image.
20. A system according to any of claims 1 and 2 and wherein said image database further comprises previously created images comprising a three-dimensional image of said fire area on which the created fire image is superimposed.
21. A system according to any of the previous claims and wherein said airborne vehicle is one of the following group: an unmanned airborne vehicle (UAV), an airplane or a fire-retardant bomber.
22. A system according to any of claims 1 and 2 wherein said temperature sensitive sensor is sensitive to a range of temperatures and wherein said fire image means includes means for displaying different sections of said range of temperatures in different colors and transparency levels.
23. A system according to any of claims 1 and 2 and wherein said ground station is located proximate to the fire-fighters' command post.
24. A system according to any of the previous claims wherein said digital image is a digital map.
25. A method for imaging a fire scenario, the method comprising the steps of:
a. measuring the hot areas of said scene with a temperature sensitive sensor mounted on an airborne vehicle;
b. creating a generally real-time fire image, from the output of said step of measuring, detailing at least the perimeter of a fire;

c. retrieving, from an image database, a previously created digital image corresponding to the scene viewed by said fire image means wherein said previously created digital image does not show said fire;
d. superimposing said fire image onto said previously created digital image; and e. outputting the resultant image.
26. A method for imaging a fire scenario, the method comprising the steps of:
a. measuring the hot areas of said scene with a temperature sensitive sensor mounted on a fire watch tower;
b. creating a generally real-time fire image, from the output of said step of measuring, detailing at least the perimeter of a fire;
c. retrieving, from an image database, a previously created digital image corresponding to the scene viewed by said fire image means wherein said previously created digital image does not show said fire;
d. superimposing said fire image onto said previously created digital image; and e. outputting the resultant image.
27. A method according to any of claims 25 and 26 and wherein said temperature sensitive sensor is an infra-red (IR) sensor.
28. A method according to claim 27 and wherein said step of retrieving includes the steps of:
a. determining, from output of at least attitude and location sensors, the line of sight of said temperature sensitive sensor;
b. generating, from said line of sight and a topography database covering the topography of said scene in the general area of said fire, the Earth coordinates of the footprint viewed by said temperature sensitive sensor.
29. A method according to claim 28 and additionally including the step of displaying an indication of at least said wind velocity and direction.
30. A method according to any of claims 25 and 26 and additionally comprising the steps of displaying indications of the locations of fire-fighting forces received from a location indicating system attached to at least some of the fire-fighting forces.
31. A method according to claim 28 and also including the step of estimating the speed of said fire front line, based on said wind speed and data from at least said topography database and said image database.
32. A method according to claim 31 and also including the steps of measuring and displaying the actual speed of said fire front line from the location of said fire front line at two different times.
33. A method according to claim 31 and also including the step of generating a warning if one of said fire-fighting forces might be in danger due to the estimated speed of said fire and the estimated ability of said one of said fire-fighting forces to back away from said fire.
34. A method according to claims 28 and 30 and also including the step of calibrating the coordinate output of said step of generating Earth coordinates with the location output of said location indicating system.
35. A method according to claim 34 and wherein said step of calibrating includes the steps of:
a. mounting a location indicating system on at least one known hot object and placing said known hot object in a generally known location;
b. generating an IR image of said at least one known hot object;
c. generating, from said location indicating system mounted on said at least one known hot object, the location of said at least one known hot object; and d. comparing the output of said two steps of generating thereby to determine a calibration correction value.
36. A method according to claim 34 and wherein said step of calibrating includes the step of triangulation to ascertain a correct calibration value.
37. A method according to any of claims 25 - 36 and wherein said airborne vehicle is one of the following group: an unmanned airborne vehicle (UAV) and an airplane.
38. A method according to any of claims 25 and 26 wherein said temperature sensitive sensor is sensitive to a range of temperatures and including the step of displaying different sections of said range of temperatures in different colors.
39. A method according to any of claims 25 and 26 and further comprising the step of using a color camera to photograph said fire scenario.
40. A method according to any of claims 25 and 26 and further comprising the step of printing said resultant image.
41. A method according to any of claims 25 and 26 and wherein said step of retrieving includes retrieving previously created images comprising a three-dimensional image of the area prior to the fire, upon which said fire image is superimposed.
42. A method according to claim 35 and wherein said step of calibrating further includes the steps of:
a. mounting a location indicating system on a vehicle carrying fire retardant elements;
b. generating an IR image of said fire scene;
c. determining which portions of said fire scene are the hottest and assigning the portion to which said fire retardant elements will be directed;
d. generating, from said location indicating system mounted on said vehicle, the location of said vehicle, and e. utilizing the location of said vehicle and other parameters affecting the movement of said fire retardant elements to the portion assigned to them to guide the activation time, location, elevation and direction of said fire retardant elements.
43. A method according to claim 35 and wherein said step of calibrating further includes the steps of:
a. validating the predicted progress of the fire; and b. correcting said predictions.
44. A method according to claim 35 and wherein said step of calibrating further includes the steps of:
a. monitoring the fire retardant drops; and b. planning the location of future drops in accordance with said monitored results.
45. A method according to claim 35 and wherein said step of calibrating further includes the step of extrapolating the subsequent fire-line from the current fire-line speed.
46. A method according to claim 35 and wherein said step of calibrating further includes the step of calculating the mean wind speed and direction due to wind changes.
47. A method according to claim 46 and wherein said step of calculating includes the step of issuing an alarm indication.
48. A method according to any of claims 25 - 47 wherein said digital image is a digital map.
49. A method according to any of claims 25 and 26 and wherein said step of outputting the resultant image is one of the following: displaying, printing, plotting or transferring by modem.
50. A system according to any of claims 1 - 24 substantially as shown and described hereinabove.
51. A system according to any of claims 1 - 24 substantially as illustrated in any of the drawings.
52. A method according to any of claims 25 - 49 substantially as shown and described hereinabove.
53. A method according to any of claims 25 - 49 substantially as illustrated in any of the drawings.
CA002249216A 1996-03-17 1997-03-16 A fire imaging system and method Abandoned CA2249216A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
IL117521 1996-03-17
IL11752196A IL117521A0 (en) 1996-03-17 1996-03-17 A fire imaging system and method
PCT/IL1997/000097 WO1997035433A1 (en) 1996-03-17 1997-03-16 A fire imaging system and method

Publications (1)

Publication Number Publication Date
CA2249216A1 true CA2249216A1 (en) 1997-09-25

Family

ID=29422219

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002249216A Abandoned CA2249216A1 (en) 1996-03-17 1997-03-16 A fire imaging system and method

Country Status (1)

Country Link
CA (1) CA2249216A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109918711A (en) * 2019-01-22 2019-06-21 北京中北国泰装饰工程集团有限公司 Building studies for a second time courses one has flunked method after calamity based on block chain


Similar Documents

Publication Publication Date Title
WO1997035433A1 (en) A fire imaging system and method
CN106546984B (en) Weather radar control system and method of using the same
Sherstjuk et al. Forest fire-fighting monitoring system based on UAV team and remote sensing
US7337156B2 (en) Method for detecting and combating forest and surface fires
San-Miguel-Ayanz et al. Active fire detection for fire emergency management: Potential and limitations for the operational use of remote sensing
US7633428B1 (en) Weather data aggregation and display system for airborne network of member aircraft
US6043756A (en) Aircraft weather information system
EP1523738B1 (en) System and method for territory thermal monitoring
EP2782086A1 (en) Methods and systems for colorizing an enhanced image during alert
WO2017123358A2 (en) Unmanned aerial system based thermal imaging and aggregation systems and methods
CN110176156A (en) A kind of airborne ground early warning system
US9221548B1 (en) Engine system and method using a mode for icing conditions
US10046187B2 (en) Wildfire aerial fighting system utilizing lidar
KR20120038990A (en) Automatic video surveillance system and method
IES20110213A2 (en) System and method for detecting adverse atmospheric conditions ahead of an aircraft
CN106197377A (en) A kind of unmanned plane targeted surveillance over the ground and the display system of two dimension three-dimensional linkage
US20220221398A1 (en) System and method for remote analyte sensing using a mobile platform
US20230123483A1 (en) Systems for detecting and monitoring a small area wildfire and methods related thereto
US20050104771A1 (en) Airborne imaging spectrometry system and method
JP3025969B2 (en) Aircraft navigation system and method for supporting aircraft navigation
RU113046U1 (en) COMPREHENSIVE SYSTEM FOR EARLY DETECTION OF FOREST FIRES, BUILT ON THE PRINCIPLE OF A VARIETY SENSOR PANORAMIC SURVEY OF THE AREA WITH THE FUNCTION OF HIGH-PRECISION DETERMINATION OF THE FIRE OF THE FIRE
CA2249216A1 (en) A fire imaging system and method
Perez-Mato et al. Real-time autonomous wildfire monitoring and georeferencing using rapidly deployable mobile units
US20130215268A1 (en) Unknown
JP2000306084A (en) Three-dimensional image display method

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 20010316