WO1997035433A1 - A fire imaging system and method - Google Patents

A fire imaging system and method

Info

Publication number
WO1997035433A1
Authority
WO
WIPO (PCT)
Prior art keywords
fire
image
location
scene
temperature sensitive
Application number
PCT/IL1997/000097
Other languages
French (fr)
Inventor
Uzi Zurgil
Original Assignee
Malat Division, Israel Aircraft Industries Ltd.
Application filed by Malat Division, Israel Aircraft Industries Ltd. filed Critical Malat Division, Israel Aircraft Industries Ltd.
Priority to AU19375/97A priority Critical patent/AU1937597A/en
Priority to CA 2249216 priority patent/CA2249216A1/en
Publication of WO1997035433A1 publication Critical patent/WO1997035433A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/005 Fire alarms; Alarms responsive to explosion for forest fires, e.g. detecting fires spread over a large or outdoors area
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 Fire alarms; Alarms responsive to explosion
    • G08B 17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B 17/125 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources


Abstract

The present invention combines real-time infrared (IR) images (36) of a fire (11), taken at the time of the fire, with digital images of the area where the fire rages, taken at some previous time. The two types of images are registered and the IR images are superimposed upon the digital images, thereby providing the fire-fighting forces with information regarding the types of objects (e.g. houses (20), trees (22), roads (24), etc.) which lie in the path of the fire. A system is provided which includes a viewing unit and a ground station. The viewing unit is mounted on an airborne vehicle (42), an unmanned airborne vehicle (40), or a fire watch tower (44), and includes a temperature sensitive sensor, such as an IR sensor, for measuring the hot areas of the scene.

Description

A FIRE IMAGING SYSTEM AND METHOD
FIELD OF THE INVENTION
The present invention relates to ground fire mapping generally, and in particular to ground fire mapping utilizing infra-red and color images of the fire combined with prior digital images, to the use of unmanned airborne vehicles (UAVs) or other airborne platforms for viewing the fire, and to fire behavior assessment systems.
BACKGROUND OF THE INVENTION
When viewing a raging fire, much of the scene is difficult to see, due to the billowing of smoke in the area at and around the fire. It is known to view a fire with an infra-red (IR) detector which, since it detects IR radiation and since little IR radiation is absorbed by smoke or air particles, can "see through" smoke to view the hot fire areas. Hence, the IR image enables the observer to distinguish between objects of different temperatures (that is, objects having different IR radiation intensities).
In the past, IR images of the fire itself have been superimposed upon maps, digital or otherwise, of the area of the fire. The map thus provides information to the fire fighting forces about the area in which the fire rages. One such system is described in US Patent 5,160,842 to Johnson which discusses an infrared fire-perimeter mapping system.
Other systems enlist remote sensing systems, such as those on satellites, in the fight for timely information about fires. A number of such systems are discussed in the book Remote Sensing and GIS Applications to Forest Fire Management, Proceedings of the Workshop held in the University of Alcala de Henares, Spain, Sept. 1995, edited by Emilio Chuvieco.
Generally, single engine planes or helicopters are used for fire reconnaissance. Unfortunately, flights using single engine planes are restricted to day flights and, if night flights are required, the more expensive twin-engine planes need to be used.
SUMMARY OF THE PRESENT INVENTION
The present invention combines real-time infra-red (IR) images of a fire, taken at the time of the fire, with digital images or maps of the area where the fire rages, previously prepared. The two types of images are registered and the IR images are superimposed upon the digital images, thereby providing the fire-fighting forces with information regarding the types of objects (e.g. houses, trees, roads, etc.) which lie in the path of the fire.
Therefore, in accordance with a preferred embodiment of the present invention, the system includes a viewing unit and a ground station. The viewing unit is mounted on an airborne vehicle or a fire watch tower and includes at least a temperature sensitive sensor, such as an infra-red (IR) sensor, for measuring the hot areas of the scene. The airborne vehicle can be an unmanned airborne vehicle (UAV) or an airplane.
The ground station includes a fire image unit, an image database, a prior scene unit, a fire scene unit and a monitor. Optionally, the ground station can also include a printer or plotter. The fire image unit creates a generally real-time fire image, from the output of the temperature sensitive sensor, wherein the fire image details at least the perimeter of a fire. The image database stores digital images of prior scenes in the general area of the fire, wherein the scenes do not show the fire. Alternatively, the stored digital image includes a three dimensional model of the area prior to the fire. The prior scene unit retrieves, from the image database, a previously created digital image or map corresponding to the scene viewed by the fire image unit. The fire scene unit superimposes the fire image onto the previously created digital image or map. The monitor displays the resultant image.
Additionally, in accordance with a preferred embodiment of the present invention, the viewing unit additionally includes a wind speed determining unit for determining the velocity and direction of the wind, an attitude sensor for measuring the attitude of the vehicle and a location sensor for measuring the location of the vehicle. The position of the sensor's line of sight relative to the vehicle can also be measured.
Moreover, in accordance with a preferred embodiment of the present invention, the prior scene unit determines, from output of at least the camera attitude and location sensors, the line of sight of the temperature sensitive sensor. The prior scene unit then generates, from the line of sight and data in a topography database, the Earth coordinates of the footprint viewed by the temperature sensitive sensor.
Further, in accordance with a preferred embodiment of the present invention, the system of the present invention includes a location indicating system attached at least to some of the fire-fighting forces. In this embodiment, the fire scene unit additionally overlays indications of the locations of fire-fighting forces on the monitor.
Still further, in accordance with a preferred embodiment of the present invention, the fire scene unit also estimates the speed of the fire front line, based on data from at least the topography database, the image database and the wind speed. The fire scene unit can also determine the actual speed of the fire front line from the location of the fire front line at two different times. The fire scene unit can also display the intensity of the fire in the format of contours showing the intensity of the fire. Furthermore, the fire scene unit can also validate the predicted progress of the fire and make real time corrections in accordance with the actual progress.
The fire scene unit can also monitor the fire retardant drops and plan the location of future drops of fire retardant. The fire coordinates can be derived from the fire scene digital map. Additionally, in accordance with a preferred embodiment of the present invention, the fire scene unit also generates a warning if one of the fire-fighting forces might be in danger due to the estimated speed of the fire and the estimated ability of the fire-fighting forces to withdraw from the fire.
Moreover, in accordance with a preferred embodiment of the present invention, the present invention also includes a communications relay mounted on the viewing unit for providing relatively high quality radio communications among the fire-fighting forces.
Further, in accordance with a preferred embodiment of the present invention, the present invention performs calibration operations to calibrate the coordinate output of the fire scene unit with the location output of the location indicating system. The operation involves viewing the heat of at least one known hot or cold object, measuring the location of the known hot or cold object with units of the location indicating system and comparing the output of the two units, thereby to determine a calibration correction value. The location of the hot object can also be determined by the use of triangulation techniques. Furthermore, the fire image and any other overlays can thus be overlaid in their correct positions on the fire scene map.
Additionally, in accordance with a preferred embodiment of the present invention, the temperature sensitive sensor is sensitive to a range of temperatures and the fire image unit displays different sections of the range of temperatures in different colors.
Additionally, in accordance with a preferred embodiment of the present invention, the present invention envisions mounting a mobile location indicating unit on a fire retardant bombing tanker. This enables the fire incident manager to view the location of the tanker relative to the hot areas of the fire and thus enables the manager to guide the tanker to the hot spots of interest, such as those close to a house.
Additionally, in accordance with a preferred embodiment of the present invention, the present invention can include a guidance unit for determining the time at which a fire retardant drop is to be released from an airplane.
Finally, in accordance with a preferred embodiment of the present invention, the present invention can include the viewing unit and the fire scene unit mounted on a fire retardant bomber, or on any other vehicle flying over the fire, such as a helicopter, without the need for a specially fitted-out air vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
Fig. 1 is a schematic illustration of a fire imaging system, constructed and operative in accordance with a preferred embodiment of the present invention, at the scene of a fire;
Fig. 2A is a series of images utilized by the fire imaging system to image the scene of the fire;
Fig. 2B is an illustration of a combined image produced by the fire imaging system from the images of Fig. 2A;
Fig. 3 is a block diagram illustration of the elements of the fire imaging system of the present invention;
Fig. 4 is a flow chart illustration of a field image processor forming part of the fire imaging system of Fig. 3; and Fig. 5 is an illustration of an image of the intensity of the fire.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Reference is now made to Figs. 1, 2A, 2B and 3, which illustrate the fire imaging system of the present invention. Fig. 1 illustrates an exemplary scene of a fire and Figs. 2A and 2B illustrate the images created by the system of the present invention. Fig. 3 illustrates the elements of the system.
Fig. 1 illustrates an area 10 where a fire 11 rages and an area 12 where the fire has yet to reach. Area 10 includes trees 14 which have been burnt. Area 12 includes houses 20, unburnt trees 22, a road 24 and a hill 26. Fig. 1 schematically illustrates the fire-fighting forces near the front line of the fire. The forces include firefighters 30, a fire truck 32 and a command center 34, typically where the fire incident manager (or the fire chief) operates.
The fire imaging system of the present invention typically comprises a viewing unit 36 which is in contact with a processing unit 38 within the command center 34. In accordance with the present invention, the viewing unit 36 views the fire from a nearby vantage point, thereby to view the fire in detail. For example, the viewing unit 36 can be on an unmanned airborne vehicle (UAV) 40, on an airplane 42 or on a tower 44 which is close enough to view the area 10 of the fire. The viewing unit 36 transmits images of the fire to the processing unit 38.
The viewing unit 36 includes temperature sensitive sensors, such as infra-red (IR) sensors, which can "see through" the smoke of the fire, and which transmit data to processing systems which determine the locations of the front line and of the hot spots of the fire.
In an alternative embodiment, the viewing unit 36 includes a color television (TV) sensor in addition to the IR temperature sensitive sensors. Either or both the TV sensor and the IR sensors may be used to view the fire scenario. The use of two different types of sensor enables the operator to better view the fire. For example, if there is a hot spot on the IR sensor, the operator can verify the situation in real time. At night, a fire will be visible on the TV sensor, while during the day, smoke from the fire will be seen. By recording a pair of images (TV and IR) in the course of flying, hot spots can later be scanned and checked.
The resultant temperature image, labeled 50 in Fig. 2A, indicates the front line 52, the hot spots 54 and the already burnt areas 56 (e.g. areas which were part of previous front lines 52 or hot spots 54). Some IR sensors can detect the burnt areas 56 but other sensors cannot. For those that cannot, the burnt areas 56 are determined from the locations of the front line 52 in the past. In accordance with a preferred embodiment of the present invention, the processing unit 38 combines the temperature image 50 with a previously stored digital image (or map or 3-dimensional model) 60 of the areas 10 and 12. Digital image 60 can be a satellite image, such as those received from satellite imaging systems (e.g. the SPOT program satellites), an ortho-photo, a photograph taken from an airplane, or any other digital image or map with a related digital elevation model (DEM) of the area in question, taken at some time before the fire. Thus, as shown in Fig. 2A, the digital image 60 shows all, or most, of the objects (houses 20, trees 22, road 24) in the line of fire. The objects behind the line of fire (i.e. which have been burnt) are either masked out (as shown) or shown in a different color.
Processing unit 38 can additionally overlay other digital maps, such as map 55 which illustrates the locations of infrastructure elements, such as pipes 67, cables 69, power lines, etc. GIS symbols and gridlines may also be added. Fire image 50, previous image 60 and digital map 55 have associated therewith coordinate grids 61, 64 and 65, respectively. The processing unit 38 utilizes these grids to overlay fire image 50 (and, optionally, digital map 55) over previous image 60, thereby to produce combined image 62 (Fig. 2B).
It will be appreciated that combined image 62 provides the fire incident manager with a real-time image of the location of the fire as well as the locations of objects, such as houses, trees, cables, etc. that are near or in the path of the fire. Additional information which can be provided include the locations of the fire-fighting forces and the probable speed of the fire, as described hereinbelow.
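By way of illustration only, the following sketch shows how such a grid-based overlay might be computed, assuming each image is georeferenced by a simple grid (top-left easting/northing plus a ground resolution per pixel). The array layout, grid values and red-marking convention are invented for the example and are not taken from the patent.

```python
# Minimal sketch of the overlay step: map each hot fire-image pixel to ground
# coordinates, then to a pixel of the prior image, and mark it there.
import numpy as np

def to_ground(row, col, origin_e, origin_n, res):
    """Pixel (row, col) -> ground coordinates (easting, northing)."""
    return origin_e + col * res, origin_n - row * res

def to_pixel(e, n, origin_e, origin_n, res):
    """Ground coordinates -> pixel (row, col) in an image with this grid."""
    return int(round((origin_n - n) / res)), int(round((e - origin_e) / res))

def overlay_fire(prior_rgb, fire_mask, fire_grid, prior_grid):
    """Paint pixels of prior_rgb red wherever the registered fire mask is hot."""
    out = prior_rgb.copy()
    for r, c in zip(*np.nonzero(fire_mask)):
        e, n = to_ground(r, c, *fire_grid)       # fire pixel -> ground
        pr, pc = to_pixel(e, n, *prior_grid)     # ground -> prior-image pixel
        if 0 <= pr < out.shape[0] and 0 <= pc < out.shape[1]:
            out[pr, pc] = (255, 0, 0)
    return out

# Toy usage: a 100x100 prior image at 10 m/pixel, a coarser 20 m/pixel IR mask.
prior = np.zeros((100, 100, 3), np.uint8)
fire = np.zeros((50, 50), bool)
fire[20:25, 10:30] = True                        # a synthetic "front line"
combined = overlay_fire(prior, fire, (35000.0, 42000.0, 20.0),
                        (35000.0, 42000.0, 10.0))
print(combined[..., 0].sum() // 255, "prior-image pixels marked as burning")
```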
Reference is now made to Fig. 5 which illustrates the fire image 50 as fire intensity contours, generally designated 72. The intensity of the fire varies over the area of the fire and fire image 50 can be shown as a series of contours referenced 72a, 72b and 72c. The areas between the contours 72 can be colored differently or shaded in "grays" (if a black & white image is displayed), to differentiate between the various fire intensity levels, from fire hot spot 74 (shaded white) to the non-affected area 76 on the perimeter. The fire intensity contours may be overlaid on the fire image display of Fig. 2B, if desired. The overlay may comprise any desirable transparency level of the intensity colors.
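A minimal sketch of how a raw IR temperature frame could be classified into such discrete intensity bands follows; the band thresholds and the synthetic hot spot are illustrative assumptions, and in practice the thresholds would be varied by the operator as described next.

```python
# Classify an IR temperature image into intensity bands for contour display.
import numpy as np

def intensity_bands(temps_c, thresholds=(100.0, 300.0, 600.0)):
    """Integer band per pixel: 0 = unaffected ... len(thresholds) = hottest."""
    bands = np.zeros(temps_c.shape, np.uint8)
    for i, t in enumerate(thresholds, start=1):
        bands[temps_c >= t] = i
    return bands

# Toy IR frame: ambient background with a hot spot in the middle.
temps = np.full((64, 64), 25.0)
yy, xx = np.mgrid[0:64, 0:64]
temps += 900.0 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 120.0)

bands = intensity_bands(temps)
for level in range(4):
    print(f"band {level}: {np.count_nonzero(bands == level)} pixels")
```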
The fire intensity threshold (gray level) may be selectively varied in real time, allowing the fire incident manager to classify the "intensity" levels of the map.

In order to provide the fire incident manager with information on the fire-fighting forces as well as any other objects of interest, such as a firefighter's car 46, the fire imaging system operates with a location determining system, such as one which operates with the global positioning system (GPS). As is known in the art, the GPS system includes a plurality of GPS satellites 70 which envelope the Earth; Fig. 1 schematically illustrates only one such satellite 70. The location determining system also includes mobile location units
(MLUs) 48, such as those manufactured by Motorola Inc. of the USA, mounted on each of the objects of interest. The MLUs 48 include a GPS receiver which determines its location from signals received from at least four GPS satellites 70 and a transceiver which relays the location information to the command center 34 typically via the viewing unit 36.
For location accuracy, the MLUs 48 can operate with the GPS "P-code" (accuracy to 1 m) approved by the US Government for use by forestry services. Alternatively, the MLUs 48 can operate with the known differential GPS technology.

Image 62 of Fig. 2B shows the fire-fighting forces as dots and rectangles, labeled 30 and 32 as in Fig. 1. Image 62 also shows arrows 68 indicating the movement and speed of the fire. The movement can be illustrated perpendicular to the fire front line and the speed is illustrated by the number of arrows clustered together, where many arrows indicate high speed. Image 62 also includes a window 25 listing the wind speed, humidity, ambient temperature and any alerts. The wind speed is generated by the viewing unit 36 and/or is measured by the fire-fighting forces.
It will be appreciated that, though not required, the use of unmanned airborne vehicles (UAVs) is advantageous. UAVs can stay in flight for significant lengths of time, much longer than a human pilot can. Furthermore, the risks associated with human flight above a fire scene, such as risks from vertigo, smoke inhalation, burning, etc. and the risks of night flights, are not present with UAVs. Finally, there is no need to carry oxygen when flying above the minimum oxygen level of the atmosphere. Fig. 3 illustrates the elements of the fire imaging system of the present invention. The viewing unit 36 typically, but not necessarily, includes an IR sensor 80, a GPS receiver 82, a wind speed determining unit 84, a spatial attitude sensor unit 86, such as an inertial measurement unit (IMU) or an inertial navigation system, a communications relay 88 and, optionally, other sensors 89, such as a microwave radiometer, a synthetic aperture radar (SAR), a Doppler sensor, etc. Other sensors 89 can include a TV camera for viewing the scenario whenever smoke conditions allow.
The IR sensor 80, which can be of the staring or scanning type, provides an IR image of the fire. The wind speed determining unit 84 determines the speed and direction of the wind. As is known in the art, the wind velocity can be determined by comparing the ground-referenced position and velocity of the UAV 40 with its airspeed and heading.
For the viewing unit on tower 44, the wind speed is determined by a wind speed sensor, such as are known in the art.
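A minimal sketch of the classical wind estimate alluded to above follows, under the assumption that the UAV's ground-referenced velocity (e.g. from GPS) and its air-referenced velocity (airspeed plus heading) are both available; all values are illustrative.

```python
# Wind vector = ground-referenced velocity minus air-referenced velocity.
import math

def wind_vector(ground_vel_en, airspeed_ms, heading_deg):
    """Return (east, north) wind components in m/s."""
    h = math.radians(heading_deg)
    air_e, air_n = airspeed_ms * math.sin(h), airspeed_ms * math.cos(h)
    return ground_vel_en[0] - air_e, ground_vel_en[1] - air_n

we, wn = wind_vector((22.0, 5.0), airspeed_ms=25.0, heading_deg=20.0)
print(f"wind: {math.hypot(we, wn):.1f} m/s from components ({we:.1f}, {wn:.1f})")
```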
The spatial attitude sensor unit 86 provides an indication of the current spatial angles of the viewing unit 36, which typically change as the UAV 40 flies over the areas 10 and 12. The other sensors provide further information about the areas 10 and 12 which can be superimposed on the combined image 62 or can be utilized to determine various parameters of the fire 11.
The GPS receiver 82 determines the location of the UAV 40 over the Earth. Communications relay 88 relays location information from the MLUs 48
(mounted on the fire-fighting forces) to the processing unit 38. It also enables the command center 34 to communicate with the fire-fighting forces. Since the communications relay 88 is located above the fire, and above most, if not all, of the local terrain shapes, the communications relay enables improved communication between the command center 34 and the fire-fighting forces.
The processing unit 38 typically comprises a data receiver unit 91, an IR image processor 90, a fire location processor 92, a topography database 94, such as a digital terrain model (DTM), a field image processor 96, an image database 98, an optional map database 97 and a monitor 99 on which the combined image 62 is displayed. Optionally, the processing unit 38 may further comprise a modem 93 for transfer of data, and printers and plotters 95 in order to produce a hard copy of the images.
The data receiver unit 91 receives the various datastreams transmitted by the viewing unit 36 to the processing unit 38. The data receiver unit 91 can comprise a multiplicity of receivers, each dedicated to receiving some of the transmitted data.
The IR image processor 90 processes the IR image received from IR sensor 80 to determine the location of the fire front line 52 and of the hot spots 54.
The processing operation is similar to that described in U.S. Patent 5,160,842 to Johnson, the disclosure of which is incorporated herein, and involves determining the outline of areas having the same temperature. U.S. 5,160,842 describes receiving a plurality of different images, each for a different temperature, and combining this information into a single image. The IR sensor 80 and the IR image processor 90 can operate as described in U.S. 5,160,842, or the IR sensor 80 can view all temperatures above a predetermined threshold, such as 1000°C, at once. In the latter case, the IR image processor 90 divides the image into separate images for the temperatures of interest and then processes the resultant images. The IR image processor 90 provides a processed IR image which indicates the locations of hot areas.
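The following sketch illustrates only the latter, simpler approach (dividing one wide-range IR frame into separate per-band images); the band edges and the random test frame are invented for illustration and do not reflect the specific processing of U.S. 5,160,842.

```python
# Split one wide-range IR frame into binary masks, one per temperature band.
import numpy as np

def split_by_temperature(ir_frame, band_edges):
    """Yield (low, high, mask) for each consecutive pair of band edges."""
    for low, high in zip(band_edges[:-1], band_edges[1:]):
        yield low, high, (ir_frame >= low) & (ir_frame < high)

ir = np.random.default_rng(0).uniform(20.0, 1200.0, size=(48, 48))
for low, high, mask in split_by_temperature(ir, [300.0, 600.0, 900.0, 1200.0]):
    print(f"{low:6.0f}-{high:6.0f} C: {mask.sum():4d} hot pixels")
```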
The fire location processor 92 creates the coordinate grid 61 for the processed IR image, in accordance with known operations of analytical geometry. To do so, it first determines the line of sight of the IR sensor 80, a known function of the location of the UAV 40, as measured by the GPS receiver 82 thereon, the attitude of the UAV 40, as measured by attitude sensor 86, and the spatial attitude of the IR sensor 80 with respect to the UAV 40, measured as is known in the art. Fire location processor 92 also determines the "footprint" of the IR sensor 80, a known function of the features of the optics thereof.
Fire location processor 92 then determines, in accordance with known operations, where the line of sight intersects the ground in areas 10 and 12 near the fire 11. The topography of the ground is provided by the topography database 94. Fire location processor 92 then determines the coordinates of the area viewed by the IR sensor 80.
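A simplified sketch of that line-of-sight geolocation step follows, assuming a flat-gridded digital terrain model and a small ray-marching step; the attitude-to-direction conversion is omitted and all values are illustrative.

```python
# March along the sensor's viewing ray until it dips below the DTM surface.
import numpy as np

def ground_intersection(pos, los_dir, dtm, res, step=5.0, max_range=20000.0):
    """pos: (e, n, alt); los_dir: viewing direction;
    dtm[r, c] = terrain height at northing r*res, easting c*res.
    Returns the first (e, n, height) along the ray at or below the terrain."""
    d = np.asarray(los_dir, float)
    d /= np.linalg.norm(d)
    p = np.asarray(pos, float)
    for _ in range(int(max_range / step)):
        p = p + d * step
        r, c = int(p[1] / res), int(p[0] / res)
        if not (0 <= r < dtm.shape[0] and 0 <= c < dtm.shape[1]):
            return None                          # ray left the database
        if p[2] <= dtm[r, c]:                    # ray has hit the ground
            return p[0], p[1], dtm[r, c]
    return None

# Toy DTM: gentle slope rising to the north, 30 m grid spacing.
dtm = np.fromfunction(lambda r, c: 2.0 * r, (200, 200))
hit = ground_intersection(pos=(1500.0, 500.0, 800.0),
                          los_dir=(0.1, 0.6, -0.5), dtm=dtm, res=30.0)
print("footprint centre (E, N, height):", hit)
```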
With the coordinate information, the fire location processor 92 determines the coordinates of the hot areas of the IR image and from that, determines which hot areas belong to the fire front line and which belong to hot spots. The hot areas which belong to the fire front line are those which are physically close together. The remaining hot areas are defined as hot spots.
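One plausible reading of this classification, sketched below, groups hot pixels into connected clusters and treats the large contiguous cluster as the front line and small isolated clusters as hot spots; the connectivity rule and the size ordering are assumptions for illustration.

```python
# Group hot pixels into 4-connected clusters; largest cluster = front line.
import numpy as np
from collections import deque

def clusters(hot):
    """Yield 4-connected components of a boolean mask as lists of (row, col)."""
    seen = np.zeros_like(hot, bool)
    for r0, c0 in zip(*np.nonzero(hot)):
        if seen[r0, c0]:
            continue
        comp, queue = [], deque([(r0, c0)])
        seen[r0, c0] = True
        while queue:
            r, c = queue.popleft()
            comp.append((r, c))
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < hot.shape[0] and 0 <= nc < hot.shape[1]
                        and hot[nr, nc] and not seen[nr, nc]):
                    seen[nr, nc] = True
                    queue.append((nr, nc))
        yield comp

hot = np.zeros((40, 40), bool)
hot[5, 2:35] = True                     # a long, connected line of hot pixels
hot[25, 10] = hot[30, 28] = True        # two isolated hot pixels
groups = sorted(clusters(hot), key=len, reverse=True)
front, spots = groups[0], groups[1:]
print(f"front line: {len(front)} pixels, plus {len(spots)} isolated hot spots")
```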
It will be appreciated that the fire incident manager can command the UAV 40 to move so as to ensure that IR sensor 80 views a desired area, and the fire incident manager can command the optics of the IR sensor 80 to zoom in and out, as desired. The fire incident manager can also indicate that he wishes the UAV 40 to move to a location where the view of the desired area is clearer (less full of smoke). This form of control is common to UAVs; for example, it can be found on the Hunter System UAV manufactured by Malat - Israel Aircraft Industries of Israel.

The field image processor 96 receives both the coordinate grid 61 and the processed IR image 50. Processor 96 utilizes the coordinate grid 61 to select the corresponding digital image 60 from the image database 98, wherein the corresponding digital image 60 views the same, or close to the same, coordinates as the IR image 50.
It is noted that the digital images or maps in image database 98 can be two- or three-dimensional and they have associated therewith coordinate grid 64. The digital images preferably are of a high enough resolution to be able to differentiate among trees and houses.
Field image processor 96 also combines the other measurements it receives to produce the combined image 62 with all of the indications discussed hereinabove. Field image processor 96 displays the final result on monitor 99. A printout of the fire scenario can readily be issued.
It is noted that the IR sensor 80 stares at or scans the scene continuously, producing an image periodically, such as every 30 or 60 seconds. The fire incident manager can direct the IR sensor 80, or the color video sensor, to view whichever portions of the scene he wishes, as described hereinabove.
Reference is now made to Fig. 4 which details the flow of operations of the field image processor 96. At step 100, the field image processor 96 retrieves the digital image 60 (and, if desired, the digital map 55) which has the same coordinates as the IR image 50. At step 102, the field image processor 96 superimposes the IR image 50 (and the digital map 55) on the digital image 60 utilizing the coordinate information to align the two (or three) images.
In step 103, the field image processor 96 compares the location of the fire front line in the present image with that in the image previous to it, taken X seconds previously. For each location along the fire front line, the field image processor 96 determines the current velocity of the fire front line and stores the information.
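A minimal sketch of such a per-point rate-of-advance estimate follows, using the nearest-point displacement between the two front lines divided by the elapsed time; the coordinate lists are illustrative.

```python
# Step-103-style estimate: local advance rate of the fire front line.
import numpy as np

def front_velocity(prev_line, curr_line, dt_seconds):
    """Per-point speed (m/s) of the current front line vs the previous one."""
    prev = np.asarray(prev_line, float)
    curr = np.asarray(curr_line, float)
    # Distance from every current point to its nearest previous point.
    d = np.linalg.norm(curr[:, None, :] - prev[None, :, :], axis=2).min(axis=1)
    return d / dt_seconds

prev_line = [(x, 100.0) for x in range(0, 500, 25)]            # X seconds ago
curr_line = [(x, 100.0 + 0.1 * x) for x in range(0, 500, 25)]  # now
speeds = front_velocity(prev_line, curr_line, dt_seconds=60.0)
print(f"advance rate along the line: {speeds.min():.2f}..{speeds.max():.2f} m/s")
```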
At step 104, the field image processor 96 draws window 25 showing the wind speed and direction, humidity and ambient temperature. Other steps provide the alert indications. In step 106, field image processor 96 receives the locations of the fire-fighting forces (and of any other objects of interest) from the relay 88. Processor 96 then converts these locations to locations within the space of the digital image 60, at which locations processor 96 places the appropriate symbol.
In step 108, processor 96 reviews the history of the fire front line and marks as burnt those areas where the fire front line previously was. Alternatively, processor 96 reviews the IR image to determine which areas have the expected temperature range of a burnt area. Processor 96 then masks out, or provides some other indication for, the portion of the digital image 60 which falls within the burnt areas. The processor 96 can also provide different color masks for each of the predetermined temperature ranges. Thus, all areas of one temperature range will be colored one color and those of a different temperature range will be colored a second color, etc.
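The following sketch illustrates the per-range color masking described above; the temperature ranges and colors are invented for the example.

```python
# Paint each predefined temperature range of the scene in its own colour.
import numpy as np

RANGES = [                            # (low C, high C, RGB colour)
    (60.0, 250.0, (90, 90, 90)),      # cooling, already-burnt ground: grey
    (250.0, 600.0, (255, 160, 0)),    # active but moderate: orange
    (600.0, 2000.0, (255, 0, 0)),     # hottest areas: red
]

def colour_masks(base_rgb, temps_c):
    out = base_rgb.copy()
    for low, high, colour in RANGES:
        out[(temps_c >= low) & (temps_c < high)] = colour
    return out

base = np.zeros((32, 32, 3), np.uint8)      # stands in for digital image 60
temps = np.full((32, 32), 20.0)
temps[10:20, 10:20] = 700.0                 # fire core
temps[8:10, 10:20] = 120.0                  # burnt strip behind it
marked = colour_masks(base, temps)
print("red pixels:", int((marked == (255, 0, 0)).all(axis=2).sum()))
```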
In step 110, the combined image produced in the previous steps is displayed on monitor 99.
In steps 112 and 114, the field image processor 96 predicts where the fire front line is likely to move and what consequences, if any, this has for the fire-fighting forces. In step 112, processor 96 reviews the topography of the areas 10 and 12, the wind speed and direction and the location of the fire front line. In addition, it assesses all parameters available in the image database which influence fire behavior, such as the type and dryness of the objects within the area 12 close to the fire front line. Processor 96 combines these variables to estimate the speed of the fire front line. For example, processor 96 can perform operations similar to those of the programs BEHAVE and/or FARSITE produced by the US Forest Service; both programs predict the speed of the fire line.
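The estimate might combine the named variables multiplicatively, as in the rough sketch below; the coefficients are placeholder assumptions and do not reproduce the BEHAVE or FARSITE models:

```python
import math

def estimate_spread_speed(base_speed_ms: float,
                          wind_speed_ms: float,
                          upslope_deg: float,
                          fuel_dryness: float) -> float:
    """Scale a no-wind, flat-ground spread rate by wind, slope and dryness.

    base_speed_ms -- assumed fuel-dependent base rate, metres per second
    upslope_deg   -- terrain slope in the direction of spread
    fuel_dryness  -- 0 (saturated) .. 1 (fully cured); assumed scale
    """
    wind_factor = 1.0 + 0.5 * wind_speed_ms                    # assumed gain
    slope_factor = 1.0 + 2.0 * math.tan(math.radians(max(upslope_deg, 0.0)))
    return base_speed_ms * wind_factor * slope_factor * (0.5 + fuel_dryness)
```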
In addition, the processing unit 38 may comprise a database library containing data on other variables such as fuels, moisture content and air moisture, for example, which can affect the progress of the fire. Processor 96 can utilize the data stored in the database to refine its calculations and predictions.
The predicted fire front line velocity can be compared to or corrected by the actual front line velocity, as determined in step 103. The field image processor 96 then displays the estimated and calibrated fire front line velocity, as described hereinabove, with arrows 68, wherein many arrows indicate a fast moving fire.
At any stage during the above described process, the interpreter can manually interpose instructions to control the process. For example, the interpreter can select areas of the mapped image of interest in order to calculate the fire front line velocity of those areas. Alternatively, the interpreter can select two points on different fire lines (representing different times) in order to obtain the velocity.
As is known in the art, by utilizing known parameters, it is possible to determine the value of another variable parameter. Thus, for fire behavior prediction, by utilizing known parameters such as slope angle, the type of fuels (trees, bushes, etc.), wind speed and air humidity, the fuel moisture content can be determined.
The predicted fire front line velocity can also be utilized to predict the location of the fire front line X minutes into the future or, alternatively, the time it will take for the fire front line to arrive at a given location. It will be appreciated that these calculations can be utilized to determine where to place a fire break.
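Both predictions reduce to simple arithmetic along the front's local direction of advance, as in this one-dimensional sketch (treating the spread rate as locally constant is an assumption):

```python
def position_ahead_m(speed_ms: float, minutes_ahead: float) -> float:
    """Distance the front will advance in the given number of minutes."""
    return speed_ms * minutes_ahead * 60.0

def minutes_until_arrival(distance_m: float, speed_ms: float) -> float:
    """Time for the front to cover the given distance, e.g. to a candidate
    fire break location; infinite if the front is stalled."""
    return float("inf") if speed_ms <= 0.0 else distance_m / (speed_ms * 60.0)
```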
The predicted fire front line velocity can also be utilized to "dead reckon" the expected location of the fire front line when the viewing unit 36 has not viewed a certain area for a long time. Such a dead reckoned front line can be displayed on the monitor 99 in a manner different than that of the measured fire front line. For example, the dead reckoned front line might be shown with dashed lines.
In step 114, the field image processor 96 compares the speed of the fire front line with the expected speed of the fire-fighting forces. The expected speed of the fire-fighting forces is a function of the traversability of the terrain, determined by a terrain traversability unit (step 113) from information in the topography database 94, and of the general speed of a human being.
If the speed and route of the fire-fighting forces are such that they can easily back away from the fire front line faster than the front line moves forward, then all is well. Otherwise, processor 96 issues a visible and/or audible warning, in step 116, and places it into window 25.
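A minimal sketch of the comparison of steps 113 and 114, derating a nominal walking speed by a 0-to-1 traversability score (the score's scale and the safety margin are assumptions):

```python
def crew_withdrawal_speed(walking_speed_ms: float,
                          traversability: float) -> float:
    """Derate a nominal walking speed by terrain traversability (0..1),
    as it might be derived from the topography database."""
    return walking_speed_ms * max(0.0, min(1.0, traversability))

def entrapment_warning(fire_speed_ms: float,
                       crew_speed_ms: float,
                       margin: float = 2.0) -> bool:
    """Warn unless the crew can back away 'margin' times faster than the
    front advances; the factor of 2 is an assumed safety margin."""
    return crew_speed_ms < margin * fire_speed_ms
```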
The present invention can also include a calibration step, performed before or during operation of the system. The calibration operation compensates for any discrepancies in accuracy between the GPS-based location system (for locating the fire-fighting forces) and the line of sight (LOS) location system (for locating the area viewed by the viewing system).
To calibrate, the fire-fighting forces place one or more known objects at some known location away from the fire front line, with an MLU 48 mounted on each. The known objects are to be viewed by the color TV camera or viewing system 36 and therefore have a known, very high temperature. For example, the known objects might be a fire in a garbage can or an electric heating element. Since the objects are away from the fire, in a known location, they are easily identifiable and viewable. These objects can also be automatically detected by modulating their illumination.
The system of the present invention views the objects with the viewing system 36 and provides the IR image to the IR image processor 90. The processed image, which has hot spots wherever the known objects were viewed, is processed by the fire location processor 92 to identify the locations of the hot spots.
In parallel, the MLUs 48 on the known objects transmit their position data to field image processor 96, via the relay 88. Field image processor 96, when in the calibration mode, compares the "GPS" locations of the objects with the "LOS" locations and determines the appropriate corrections to the LOS locations. This correction information is stored and utilized by the field image processor 96 during its regular operation. The calibration operation can be performed continuously or periodically, as desired.

It will be appreciated that the present invention can also assess the efficiency of retarding materials utilized to douse the fire. For example, the fire image produced by the IR sensor 80 contains a range of temperatures and will have some areas with lower temperatures due to the operation of the retarding materials. Furthermore, the interpreter can judge the effectiveness of the fire retardant drop process and can thus determine future optimal drop locations.
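Returning to the calibration mode, the GPS-versus-LOS comparison can be sketched as estimating a constant planar bias over the known hot objects (the constant-offset model is an assumption; the patent does not fix the form of the correction):

```python
import numpy as np

def los_correction(gps_points: np.ndarray,
                   los_points: np.ndarray) -> np.ndarray:
    """Estimate a constant (east, north) bias of the line-of-sight fixes
    relative to the GPS fixes of the same calibration objects.

    gps_points, los_points -- N x 2 arrays in metres; row i is the same
    object in both systems.
    """
    return (gps_points - los_points).mean(axis=0)

# During regular operation, each LOS-derived location would then be shifted:
#     corrected = los_fix + offset
```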
It is critical for the safety of the fire-fighting forces that the fire line be blocked by fire retardants. Since the retardant drops are visible to the IR sensor, the fire incident manager can select the location of the next drop on the digital map.
The fire incident manager sees these areas since the lower temperature areas are displayed by the system of the present invention in a color different than that of the very hot areas. Furthermore, if MLUs 48 are placed on the vehicles, such as airborne tankers, which spray the retarding materials, their locations will show up on the display. Thus, the fire incident manager can guide the vehicles to the areas which need them the most, by communicating his requests via a standard communications system.
Alternatively or in addition, the field image processor can include a unit for determining the appropriate time to drop a "bomb" of retarding materials on the fire. The unit considers the relevant parameters, such as velocity of the airplane carrying the bomb, the wind speed and direction, the height above the terrain, etc., in a manner similar to known calculations for dropping military bombs.
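In its simplest drag-free form (an assumption; the patent only likens the computation to known military bomb-release calculations), the release point follows from the fall time:

```python
import math

def release_offset_m(ground_speed_ms: float, height_agl_m: float) -> float:
    """Drag-free estimate: the dropped load keeps the aircraft's ground
    speed and falls for t = sqrt(2h/g), so it should be released this far
    (horizontally) before the target."""
    t_fall = math.sqrt(2.0 * height_agl_m / 9.81)
    return ground_speed_ms * t_fall
```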
It will further be appreciated that the system of the present invention provides the fire incident manager with a view of the fire which indicates the objects of interest therein, information regarding the activity of the fire and information regarding the fire-fighting forces. In an alternative embodiment, the display can show the moving image as viewed by the camera. Furthermore, the processing unit 38 can utilize the continuous wind velocity data to compare the latest wind data with the mean wind speed and direction of the previous time period (say, 15 minutes) and, in the event of a significant wind change, such as a direction change of more than five degrees or a velocity change of five knots, to issue an alert indication.
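A sketch of such a wind-change monitor; the 15-minute window and the five-degree and five-knot thresholds come from the text, while the sampling scheme and the circular mean are assumptions:

```python
import math

def circular_mean_deg(angles_deg) -> float:
    """Mean of compass directions, handling the 359-to-0 wraparound."""
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360.0

def significant_wind_change(history,
                            latest_dir_deg: float,
                            latest_speed_kn: float,
                            dir_threshold_deg: float = 5.0,
                            speed_threshold_kn: float = 5.0) -> bool:
    """history -- iterable of (direction_deg, speed_kn) samples covering,
    say, the previous 15 minutes.  True when the latest reading departs
    from the window mean by more than either threshold."""
    dirs, speeds = zip(*history)
    mean_dir = circular_mean_deg(dirs)
    mean_speed = sum(speeds) / len(speeds)
    dir_delta = abs((latest_dir_deg - mean_dir + 180.0) % 360.0 - 180.0)
    return (dir_delta > dir_threshold_deg
            or abs(latest_speed_kn - mean_speed) > speed_threshold_kn)
```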
Additionally, the system can continuously calculate and monitor the possibility of the fire-fighters being entrapped. In the case of pre-determined wind changes critical to the well-being of the fire-fighters, whenever such a situation exists, the interpreter is notified by an alarm indication.

It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined only by the claims below:

Claims

1. A fire scenario imaging system comprising: a. a viewing unit mounted on an airborne vehicle and comprising at least a temperature sensitive sensor for measuring the hot areas of said scene, typically associated with said fire; and b. a ground station comprising: i) fire image means for creating a generally real-time fire image, from the output of said temperature sensitive sensor, detailing at least the perimeter of a fire; ii) an image database of digital images of prior scenes in the general area of said fire, wherein said scenes do not show said fire; iii) prior scene means for retrieving, from said image database, a previously created digital image corresponding to the scene viewed by said fire image means; iv) fire scene means for superimposing said fire image onto said previously created digital image; and v) a monitor for displaying the resultant image.
2. A fire scenario imaging system comprising: a. a viewing unit mounted on a fire watch tower and comprising at least a temperature sensitive sensor for measuring the hot areas of said scene, typically associated with said fire; and b. a ground station comprising: i) fire image means for creating a generally real-time fire image, from the output of said temperature sensitive sensor, detailing at least the perimeter of a fire; ii) an image database of digital images of prior scenes in the general area of said fire, wherein said scenes do not show said fire; iii) prior scene means for retrieving, from said image database, a previously created digital image corresponding to the scene viewed by said fire image means; iv) fire scene means for superimposing said fire image onto said previously created digital image; and v) a monitor for displaying the resultant image.
3. A system according to any of claims 1 and 2 and wherein said temperature sensitive sensor is an infra-red (IR) sensor.
4. A system according to any of claims 1 and 2 and wherein said viewing unit additionally includes a wind speed determining unit for determining the velocity of the wind, apparatus for determining the wind turbulence, an attitude sensor for measuring the attitude of said vehicle and a location sensor for measuring the location of said vehicle.
5. A system according to claim 4 and wherein said prior scene means includes: a. means for determining, from output of at least said camera and vehicle attitude and location sensors, the line of sight of said temperature sensitive sensor; b. a topography database covering the topography of said scene in the general area of said fire; and c. coordinate means for generating, from said line of sight and said topography database, the Earth coordinates of the footprint viewed by said temperature sensitive sensor.
6. A system according to claim 4 and wherein said fire scene means additionally includes means for displaying at least an indication of said wind velocity and direction on said monitor.
7. A system according to any of claims 1 and 2 and additionally comprising a location indicating system attached at least to some of the fire-fighting forces, wherein said fire scene means additionally includes means for displaying indications of the locations of fire-fighting forces on said monitor.
8. A system according to claim 5 and wherein said fire scene means also includes fire front line means for estimating the speed of said fire front line, based on data from at least said topography database, said image database and said wind speed determining unit.
9. A system according to claim 8 and wherein said ground station also includes fire speed means for measuring the actual speed of said fire front line from the location of said fire front line at two different times.
10. A system according to claim 8 and wherein said fire scene means also includes warning means for generating a warning if one of said fire-fighting forces might be in danger due to the estimated speed of said fire and the estimated ability of said one of said fire-fighting forces to withdraw from said fire.
11. A system according to claim 8 and wherein said fire scene means also includes means for validating the predicted progress of the fire and for making real time corrections.
12. A system according to claim 8 and wherein said fire scene means also includes means for monitoring the fire retardant drops and planning therefrom the location of future drops of fire retardant.
13. A system according to any of the previous claims and also comprising a communications relay mounted on said viewing unit for providing communications among said fire-fighting forces.
14. A system according to claims 6 and 8 and wherein said ground station also comprises calibration means for calibrating the coordinate output of said coordinate means with the location output of said location indicating system.
15. A system according to claim 14 and wherein said means for calibrating include means for receiving, from said fire image means, an IR image of at least one known hot object, means for receiving, from said location indicating system, the location of said at least one known hot object and means for comparing the output of said two means for receiving, thereby to determine a calibration correction value.
16. A system according to claim 14 and wherein said means for calibrating further include triangulation means for determining a calibration correction value.
17. A system according to any of claims 1 - 2 and wherein said displayed resultant image includes the display of said fire image as contours of the intensity of the fire, said intensity level being selectable.
18. A system according to any of claims 1 and 2 and wherein said viewing unit comprises an IR sensor and a color television camera, wherein the existence of a fire may be verified by reference to simultaneous views from both said IR sensor and said color television camera.
19. A system according to any of claims 1 and 2 and further comprising a printer for printing the resultant image.
20. A system according to any of claims 1 and 2 and wherein said image database further comprises previously created images comprising a three-dimensional image of said fire area on which the created fire image is superimposed.
21. A system according to any of the previous claims and wherein said airborne vehicle is one of the following group: an unmanned airborne vehicle (UAV), an airplane or a fire-retardant bomber.
22. A system according to any of claims 1 and 2 wherein said temperature sensitive sensor is sensitive to a range of temperatures and wherein said fire image means includes means for displaying different sections of said range of temperatures in different colors and transparency levels.
23. A system according to any of claims 1 and 2 and wherein said ground station is located proximate to the fire-fighters' command post.
24. A system according to any of the previous claims wherein said digital image is a digital map.
25. A method for imaging a fire scenario, the method comprising the steps of: a. measuring the hot areas of said scene with a temperature sensitive sensor mounted on an airborne vehicle; b. creating a generally real-time fire image, from the output of said step of measuring, detailing at least the perimeter of a fire; c. retrieving, from an image database, a previously created digital image corresponding to the scene viewed by said fire image means wherein said previously created digital image does not show said fire; d. superimposing said fire image onto said previously created digital image; and e. outputting the resultant image.
26. A method for imaging a fire scenario, the method comprising the steps of: a. measuring the hot areas of said scene with a temperature sensitive sensor mounted on a fire watch tower; b. creating a generally real-time fire image, from the output of said step of measuring, detailing at least the perimeter of a fire; c. retrieving, from an image database, a previously created digital image corresponding to the scene viewed by said fire image means wherein said previously created digital image does not show said fire; d. superimposing said fire image onto said previously created digital image; and e. outputting the resultant image.
27. A method according to any of claims 25 and 26 and wherein said temperature sensitive sensor is an infra-red (IR) sensor.
28. A method according to claim 27 and wherein said step of retrieving includes the steps of: a. determining, from output of at least attitude and location sensors, the line of sight of said temperature sensitive sensor; b. generating, from said line of sight and a topography database covering the topography of said scene in the general area of said fire, the Earth coordinates of the footprint viewed by said temperature sensitive sensor.
29. A method according to claim 28 and additionally including the step of displaying an indication of at least said wind velocity and direction.
30. A method according to any of claims 25 and 26 and additionally comprising the steps of displaying indications of the locations of fire-fighting forces received from a location indicating system attached to at least some of the fire-fighting forces.
31. A method according to claim 28 and also including the step of estimating the speed of said fire front line, based on said wind speed and data from at least said topography database and said image database.
32. A method according to claim 31 and also including the steps of measuring and displaying the actual speed of said fire front line from the location of said fire front line at two different times.
33. A method according to claim 31 and also including the step of generating a warning if one of said fire-fighting forces might be in danger due to the estimated speed of said fire and the estimated ability of said one of said fire-fighting forces to back away from said fire.
34. A method according to claims 28 and 30 and also including the step of calibrating the coordinate output of said step of generating Earth coordinates with the location output of said location indicating system.
35. A method according to claim 34 and wherein said step of calibrating includes the steps of: a. mounting a location indicating system on at least one known hot object and placing said known hot object in a generally known location; b. generating an IR image of said at least one known hot object; c. generating, from said location indicating system mounted on said at least one known hot object, the location of said at least one known hot object; and d. comparing the output of said two steps of generating thereby to determine a calibration correction value.
36. A method according to claim 34 and wherein said step of calibrating includes the step of triangulation to ascertain a correct calibration value.
37. A method according to any of claims 25 - 36 and wherein said airborne vehicle is one of the following group: an unmanned airborne vehicle (UAV) and an airplane.
38. A method according to any of claims 25 and 26 wherein said temperature sensitive sensor is sensitive to a range of temperatures and including the step of displaying different sections of said range of temperatures in different colors.
39. A method according to any of claims 25 and 26 and further comprising the step of using a color camera to photograph said fire scenario.
40. A method according to any of claims 25 and 26 and further comprising the step of printing said resultant image.
41. A method according to any of claims 25 and 26 and wherein said step of retrieving includes the step of retrieving previously created images comprising a three-dimensional image of the area prior to the fire, on which said fire image is superimposed.
42. A method according to claim 35 and wherein said step of calibrating further includes the steps of: a. mounting a location indicating system on a vehicle carrying fire retardant elements; b. generating an IR image of said fire scene; c. determining which portions of said fire scene are the hottest and assigning the portion to which said fire retardant elements will be directed; d. generating, from said location indicating system mounted on said vehicle, the location of said vehicle; and e. utilizing the location of said vehicle and other parameters affecting the movement of said fire retardant elements to the portion assigned to them to guide the activation time, location, elevation and direction of said fire retardant elements.
43. A method according to claim 35 and wherein said step of calibrating further includes the steps of: a. validating the predicted progress of the fire; and b. correcting said predictions.
44. A method according to claim 35 and wherein said step of calibrating further includes the steps of: a. monitoring the fire retardant drops; and b. planning the location of future drops in accordance with said monitored results.
45. A method according to claim 35 and wherein said step of calibrating further includes the step of extrapolating the subsequent fire-line from the current fire-line speed.
46. A method according to claim 35 and wherein said step of calibrating further includes the step of calculating the mean wind speed and direction due to wind changes.
47. A method according to claim 46 and wherein said step of calculating includes the step of issuing an alarm indication.
48. A method according to any of claims 25 - 47 wherein said digital image is a digital map.
49. A method according to any of claims 25 and 26 and wherein said step of outputting the resultant image is one of the following: displaying, printing, plotting or transferring by modem.
50. A system according to any of claims 1 - 24 substantially as shown and described hereinabove.
51. A system according to any of claims 1 - 24 substantially as illustrated in any of the drawings.
52. A method according to any of claims 25 - 49 substantially as shown and described hereinabove.
53. A method according to any of claims 25 - 49 substantially as illustrated in any of the drawings.
PCT/IL1997/000097 1996-03-17 1997-03-16 A fire imaging system and method WO1997035433A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU19375/97A AU1937597A (en) 1996-03-17 1997-03-16 A fire imaging system and method
CA 2249216 CA2249216A1 (en) 1996-03-17 1997-03-16 A fire imaging system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL11752196A IL117521A0 (en) 1996-03-17 1996-03-17 A fire imaging system and method
IL117521 1996-03-17

Publications (1)

Publication Number Publication Date
WO1997035433A1 true WO1997035433A1 (en) 1997-09-25

Family

ID=11068671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL1997/000097 WO1997035433A1 (en) 1996-03-17 1997-03-16 A fire imaging system and method

Country Status (3)

Country Link
AU (1) AU1937597A (en)
IL (1) IL117521A0 (en)
WO (1) WO1997035433A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5153722A (en) * 1991-01-14 1992-10-06 Donmar Ltd. Fire detection system
US5289275A (en) * 1991-07-12 1994-02-22 Hochiki Kabushiki Kaisha Surveillance monitor system using image processing for monitoring fires and thefts
US5548276A (en) * 1993-11-30 1996-08-20 Alan E. Thomas Localized automatic fire extinguishing apparatus

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19827835B4 (en) * 1998-06-23 2012-01-19 Robert Bosch Gmbh Image transmission method and apparatus
US6476722B1 (en) 1999-05-14 2002-11-05 Sai Servizi Aerei Industriali S.R.L. Thermographic system to check and prevent fires in a vehicle
EP1052606A3 (en) * 1999-05-14 2001-07-18 SAI Servizi Aerei Industriali S.r.l. Thermographic system to check and prevent fires in a vehicle
EP1180324A1 (en) * 2000-08-11 2002-02-20 TD Group S.p.A. Method and apparatus for observing and determining the position of targets in geographical areas
WO2004008407A1 (en) * 2002-07-16 2004-01-22 Gs Gestione Sistemi S.R.L. System and method for territory thermal monitoring
WO2004068433A1 (en) * 2003-01-27 2004-08-12 Energy Laser S.R.L. Modular surveillance system for monitoring critical environments
WO2005005926A1 (en) * 2003-07-09 2005-01-20 Fabrizio Ferrucci Method and apparatus for automatically detecting and mapping, particularly for burnt areas without vegetation
WO2005027069A1 (en) * 2003-08-18 2005-03-24 Idas Informations-, Daten- Und Automationssysteme Gmbh Fire alarm device
EP1689344A2 (en) * 2003-12-05 2006-08-16 Honeywell International, Inc. Fire location detection and estimation of fire spread through image processing based analysis of detector activation
EP1689344A4 (en) * 2003-12-05 2009-09-16 Honeywell Int Inc Fire location detection and estimation of fire spread through image processing based analysis of detector activation
EP1561493A3 (en) * 2004-02-06 2006-11-22 EADS Deutschland GmbH Method for detecting, planing and fighting of forest fires or surface fires
WO2006053514A1 (en) * 2004-11-22 2006-05-26 Iq Wireless Gmbh Process for monitoring territories in order to recognise forest and surface fires
US8368757B2 (en) 2004-11-22 2013-02-05 Iq Wireless Gmbh Process for monitoring territories in order to recognise forest and surface fires
EP2304692A1 (en) * 2008-06-16 2011-04-06 Eyefi R&D Pty Ltd Spatial predictive approximation and radial convolution
EP2304692A4 (en) * 2008-06-16 2014-03-12 Eyefi Pty Ltd Spatial predictive approximation and radial convolution
US9058689B2 (en) 2008-06-16 2015-06-16 Eyefi Pty Ltd Spatial predictive approximation and radial convolution
WO2010015742A1 (en) * 2008-08-04 2010-02-11 Smart Packaging Solutions (Sps) Method and device for preventing and predicting the evolution of fires
FR2934501A1 (en) * 2008-08-04 2010-02-05 Smart Packaging Solutions Sps FIRE RISK PREVENTION SYSTEM
EP2511888A1 (en) * 2011-04-14 2012-10-17 The Boeing Company Fire management system
CN102819926A (en) * 2012-08-24 2012-12-12 华南农业大学 Fire monitoring and warning method on basis of unmanned aerial vehicle
ITTO20130371A1 (en) * 2013-05-09 2014-11-10 A M General Contractor S P A METHOD OF DETECTION OF THERMAL ENERGY DATA RADIATED IN AN ENVIRONMENT BY PROCESSING IMAGES IN INFRARED RADIATION
EP2801960A1 (en) * 2013-05-09 2014-11-12 A.M. General Contractor S.p.A. Method of detecting data relating to thermal energy radiated in a scene using infrared radiation image processing
US9449240B2 (en) 2013-05-09 2016-09-20 A.M. GENERAL CONTRACTOR S.p.A. Method of detecting data relating to thermal energy radiated in a scene using the infrared radiation image processing
CN104043223A (en) * 2014-06-23 2014-09-17 无锡市崇安区科技创业服务中心 Method for automatically judging and monitoring fire disaster
WO2016151250A1 (en) 2015-03-24 2016-09-29 Nimesis Technology Energetically self-contained device for detecting forest fires and method for detecting forest fires implementing such a device
CN104851231A (en) * 2015-06-05 2015-08-19 安徽罗伯特科技股份有限公司 Monitoring apparatus of nearby environment of power transmission line
EP3309762A4 (en) * 2015-09-28 2019-02-06 Dongguan Frontier Technology Institute Fire disaster monitoring method and apparatus
US20180102034A1 (en) * 2015-09-28 2018-04-12 Dongguan Frontier Technology Institute Fire disaster monitoring method and apparatus
CN107886672A (en) * 2016-02-22 2018-04-06 张琴 Fire Long Range Detecting and Ranging, detection method based on unmanned plane
CN107871379A (en) * 2016-02-22 2018-04-03 钱珺佳 Fire Long Range Detecting and Ranging, detection method based on unmanned plane
US11532156B2 (en) 2017-03-28 2022-12-20 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fire detection
US10388049B2 (en) 2017-04-06 2019-08-20 Honeywell International Inc. Avionic display systems and methods for generating avionic displays including aerial firefighting symbology
WO2019048603A1 (en) 2017-09-09 2019-03-14 Fcm Dienstleistungs Ag Automatic early warning of smoke, soot and fire by means of a 3d terrain model
US10569875B2 (en) 2017-09-29 2020-02-25 Deere & Company Using unmanned aerial vehicles (UAVs or drones) in forestry imaging and assessment applications
US10322803B2 (en) 2017-09-29 2019-06-18 Deere & Company Using unmanned aerial vehicles (UAVs or drones) in forestry productivity and control applications
US10814976B2 (en) 2017-09-29 2020-10-27 Deere & Company Using unmanned aerial vehicles (UAVs or drones) in forestry machine-connectivity applications
CN108371767A (en) * 2018-01-30 2018-08-07 陈迈 A kind of police multi-functional flight rescue system of fire-fighting
WO2019158880A1 (en) * 2018-02-15 2019-08-22 Helper-Drone Device for scalable tactical mapping in an exterior environment, associated system and method
FR3077875A1 (en) * 2018-02-15 2019-08-16 Helper-Drone EVOLUTIVE TACTICAL MAPPING DEVICE IN EXTERNAL ENVIRONMENT, SYSTEM AND METHOD THEREOF
CN110825105B (en) * 2019-10-14 2023-03-10 武汉光庭信息技术股份有限公司 Satellite film pattern spot inspection method and device based on unmanned aerial vehicle
CN110825105A (en) * 2019-10-14 2020-02-21 武汉光庭信息技术股份有限公司 Satellite film pattern spot inspection method and device based on unmanned aerial vehicle
CN111243215A (en) * 2020-01-20 2020-06-05 南京森林警察学院 Low-altitude unmanned monitoring and early warning system and method for forest fire scene
CN111667561B (en) * 2020-04-29 2023-02-03 西安科技大学 Visual analysis and processing method for fire of large public building
CN111667561A (en) * 2020-04-29 2020-09-15 西安科技大学 Visual analysis and processing method for fire of large public building
FR3111723A1 (en) * 2020-06-19 2021-12-24 Centre National De La Recherche Scientifique Method and system for geometric characterization of fires
WO2021255214A1 (en) * 2020-06-19 2021-12-23 Centre National De La Recherche Scientifique Method and system for geometric characterisation of fires
US20230108318A1 (en) * 2021-10-05 2023-04-06 International Business Machines Corporation Identifying changes in firebreak lines
US11823449B2 (en) * 2021-10-05 2023-11-21 International Business Machines Corporation Identifying changes in firebreak lines
CN113903136B (en) * 2021-12-13 2022-02-22 环球数科集团有限公司 Forest fire emergency situation modeling analysis system
CN113903136A (en) * 2021-12-13 2022-01-07 环球数科集团有限公司 Forest fire emergency situation modeling analysis system
US20230215182A1 (en) * 2022-01-03 2023-07-06 Motorola Solutions, Inc. Intelligent object selection from drone field of view
WO2023129406A1 (en) * 2022-01-03 2023-07-06 Motorola Solutions, Inc. Intelligent object selection from drone field of view
US11922700B2 (en) 2022-01-03 2024-03-05 Motorola Solutions, Inc. Intelligent object selection from drone field of view
WO2023180338A1 (en) * 2022-03-21 2023-09-28 Dryad Networks GmbH Device and method for detecting a forest fire
CN114821946A (en) * 2022-04-15 2022-07-29 国网河北省电力有限公司电力科学研究院 Fire early warning method, monitoring terminal and system for alternating current power supply of transformer substation
CN114821946B (en) * 2022-04-15 2024-04-19 国网河北省电力有限公司电力科学研究院 Fire disaster early warning method, monitoring terminal and system for transformer substation alternating current power supply

Also Published As

Publication number Publication date
IL117521A0 (en) 1996-10-31
AU1937597A (en) 1997-10-10

Similar Documents

Publication Publication Date Title
WO1997035433A1 (en) A fire imaging system and method
Sherstjuk et al. Forest fire-fighting monitoring system based on UAV team and remote sensing
US10175353B2 (en) Enhancement of airborne weather radar performance using external weather data
US7337156B2 (en) Method for detecting and combating forest and surface fires
KR101747180B1 (en) Auto video surveillance system and method
US9483951B1 (en) Airborne system and method for detecting and avoiding atmospheric particulates
US7633428B1 (en) Weather data aggregation and display system for airborne network of member aircraft
CA2536671C (en) Integrated system for aircraft vortex safety
EP1523738B1 (en) System and method for territory thermal monitoring
CN110176156A (en) A kind of airborne ground early warning system
EP3387399A2 (en) Unmanned aerial system based thermal imaging and aggregation systems and methods
EP2782086A1 (en) Methods and systems for colorizing an enhanced image during alert
IES20110213A2 (en) System and method for detecting adverse atmospheric conditions ahead of an aircraft
CN106197377A (en) A kind of unmanned plane targeted surveillance over the ground and the display system of two dimension three-dimensional linkage
Sherstjuk et al. Forest fire monitoring system based on UAV team, remote sensing, and image processing
JP3025969B2 (en) Aircraft navigation system and method for supporting aircraft navigation
US9823347B1 (en) Weather radar system and method for high altitude crystal warning interface
CA3001891A1 (en) Wildfire aerial fighting system utilizing lidar
US20220221398A1 (en) System and method for remote analyte sensing using a mobile platform
CN111311865A (en) Forest fire prevention unmanned aerial vehicle platform based on carry on thermal imager
RU113046U1 (en) COMPREHENSIVE SYSTEM FOR EARLY DETECTION OF FOREST FIRES, BUILT ON THE PRINCIPLE OF A VARIETY SENSOR PANORAMIC SURVEY OF THE AREA WITH THE FUNCTION OF HIGH-PRECISION DETERMINATION OF THE FIRE OF THE FIRE
US20230123483A1 (en) Systems for detecting and monitoring a small area wildfire and methods related thereto
RU2542873C1 (en) System for technical surveillance of protected area
CA2249216A1 (en) A fire imaging system and method
WO2005031321A1 (en) Apparatus for remote monitoring of a field of view

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG AM AZ BY KG KZ MD RU TJ TM AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 2249216

Country of ref document: CA

Kind code of ref document: A

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 97533315

Format of ref document f/p: F

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase