CN112464819A - Forest fire spreading data assimilation method and device based on unmanned aerial vehicle video


Info

Publication number
CN112464819A
Authority
CN
China
Prior art keywords
fire
moment
position information
wire
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011367733.0A
Other languages
Chinese (zh)
Other versions
CN112464819B (en)
Inventor
陈涛
黄丽达
孙占辉
袁宏永
刘春慧
王晓萌
白硕
张立凡
王镜闲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Beijing Global Safety Technology Co Ltd
Original Assignee
Tsinghua University
Beijing Global Safety Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University, Beijing Global Safety Technology Co Ltd filed Critical Tsinghua University
Priority to CN202011367733.0A priority Critical patent/CN112464819B/en
Publication of CN112464819A publication Critical patent/CN112464819A/en
Priority to PCT/CN2021/112848 priority patent/WO2022110912A1/en
Application granted granted Critical
Publication of CN112464819B publication Critical patent/CN112464819B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00 Fire alarms; Alarms responsive to explosion
    • G08B17/12 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125 Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/28 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a forest fire spreading data assimilation method and device based on unmanned aerial vehicle video, an electronic device and a computer-readable storage medium. The method comprises the following steps: acquiring meteorological data, basic geographic information data and the fire line state analysis value at time K-1 for the fire location; inputting this information into a forest fire spreading model to obtain the predicted fire line position information at time K; acquiring a thermal imaging video of the fire scene area shot by an unmanned aerial vehicle and obtaining the observed fire line position information at time K from it; judging, from the predicted and observed fire line positions at time K, whether the model parameters need adjustment; and if so, adjusting the model parameters according to the predicted and observed positions at time K, recalculating the predicted fire line position at time K, and obtaining the fire line state analysis value at time K. The method and device are low in cost, can dynamically iterate the forest fire spreading model, obtain an accurate predicted fire line position, and win precious time for forest fire rescue.

Description

Forest fire spreading data assimilation method and device based on unmanned aerial vehicle video
Technical Field
The application relates to the field of data processing, and in particular to a forest fire spreading data assimilation method and device based on unmanned aerial vehicle video, an electronic device and a storage medium, belonging to the field of data assimilation applications.
Background
Forest fires are difficult to control: they can destroy forest ecosystems, cause environmental pollution, and threaten human lives and property. When a fire breaks out, emergency managers urgently need front-line fire information and accurate forest fire spread predictions within a short time, winning precious time for emergency rescue work.
However, existing fire line positions are mostly acquired from satellite remote sensing data and are limited by satellite orbits: the temporal and spatial resolutions of satellite-based forest fire monitoring constrain each other, so resolution and timeliness cannot be satisfied at the same time. Remote sensing satellites for forest fire monitoring fall into two main types, geostationary satellites and polar-orbiting satellites. A geostationary satellite has a high orbit, wide coverage and high observation frequency, but relatively low spatial resolution; it can generally only detect fire scenes on the kilometre scale, so small fires are difficult to monitor effectively. Compared with a geostationary satellite, a polar-orbiting satellite acquires remote sensing images with high spatial resolution but low observation frequency; even with multiple series of polar-orbiting satellites observing the earth, only about 10 observations per day can be achieved at any given place, far from full-time, full-area coverage. When the fire area is sheltered by the terrain, the satellite sensors cannot monitor the sheltered area, and satellite remote sensing has poor mobility and insufficient flexibility. In terms of resolution, timeliness and flexibility, remote sensing technology cannot fully meet the requirements of real-time fire scene monitoring. Therefore, how to acquire fire scene information quickly, accurately and in real time has become a problem to be solved urgently.
Disclosure of Invention
The present application aims to solve at least one of the above-mentioned technical problems to some extent.
Therefore, a forest fire spreading data assimilation method and device based on unmanned aerial vehicle video, an electronic device and a computer-readable storage medium are provided. They improve the precision of the forest fire spreading model so that it can predict fire scene information quickly, accurately and in real time, providing objective fire scene information for forest fire fighting work.
According to a first aspect of the application, a forest fire spreading data assimilation method based on unmanned aerial vehicle video is provided, comprising the following steps: acquiring meteorological data and basic geographic information data of the fire location, and acquiring the fire line state analysis value of the fire location at time K-1; inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spreading model to obtain the predicted fire line position information at time K; acquiring a thermal imaging video of the fire scene area shot by an unmanned aerial vehicle, and obtaining the observed fire line position information at time K from the video; judging, according to the predicted and observed fire line position information at time K, whether the parameters of the forest fire spreading model need adjustment; and if so, adjusting the model parameters according to the predicted and observed fire line position information at time K, recalculating the predicted fire line position information at time K with the parameter-adjusted model, and calculating the fire line state analysis value at time K from the recalculated predicted position information and the observed position information.
Optionally, the forest fire spreading model comprises a Rothermel model and a Huygens wave model; inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spreading model to obtain the predicted fire line position information at time K comprises: inputting the meteorological data and basic geographic information data of the fire location into the Rothermel model to obtain the forest fire spread rate at time K-1; and inputting the spread rate at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to predict the fire line position, obtaining the predicted fire line position information at time K.
Optionally, obtaining the observed fire line position information at time K from the thermal imaging video of the fire scene area comprises: acquiring the thermal image of the fire scene area at time K from the video; determining the temperature corresponding to each pixel of the thermal image; extracting the fire scene extent from the thermal image according to the per-pixel temperatures and a temperature threshold; performing edge extraction on the fire scene extent to obtain the pixel positions of the fire line; and converting the pixel positions of the fire line into Global Positioning System (GPS) coordinates of the fire line to obtain the observed fire line position information at time K.
Optionally, there is at least one unmanned aerial vehicle, and multiple pixel positions of the fire line are obtained from thermal imaging videos of the fire scene area shot by the at least one unmanned aerial vehicle at multiple observation points; converting the pixel positions of the fire line into GPS coordinates to obtain the observed fire line position information at time K comprises: performing coordinate conversion on the multiple pixel positions of the fire line to obtain multiple coordinate values of the fire line in the geographic coordinate system of the unmanned aerial vehicle; calculating multiple observation altitude angle matrices and azimuth angle matrices of the fire line from those coordinate values; performing Kalman filtering estimation of the fire line position from the altitude angle and azimuth angle matrices to obtain a coordinate estimate of the fire line; and converting the coordinate estimate through GPS coordinate conversion to obtain the observed fire line position information at time K.
Optionally, converting the pixel positions of the fire line into GPS coordinates to obtain the observed fire line position information at time K comprises: acquiring DEM (Digital Elevation Model) geographic information of the fire location; acquiring GPS information, attitude information and built-in parameters of the unmanned aerial vehicle; generating a virtual viewing angle for the unmanned aerial vehicle point location from the DEM geographic information, the GPS information, the attitude information and the built-in parameters; simulating the actual imaging process of the unmanned aerial vehicle from that virtual viewing angle to obtain a simulated image; determining the pixel coordinates of the fire line in the simulated image from the pixel positions of the fire line; and converting those pixel coordinates through GPS coordinate conversion to obtain the observed fire line position information at time K.
Optionally, judging whether the parameters of the forest fire spreading model need adjustment according to the predicted and observed fire line position information at time K comprises: calculating the deviation between the predicted and observed fire line position information at time K; judging whether the deviation converges within a target range; if not, judging whether the number of iterations of the model is smaller than the maximum number of iterations; if so, judging that the model parameters need adjustment; and if the deviation converges within the target range and/or the number of iterations is greater than or equal to the maximum, stopping parameter adjustment of the model.
Optionally, adjusting the model parameters of the forest fire spreading model according to the predicted and observed fire line position information at time K, and recalculating the predicted fire line position information at time K with the parameter-adjusted model, comprises: calculating the deviation between the predicted and observed fire line position information at time K; adjusting the forest fire spread rate at time K-1 according to a preset spread rate update coefficient matrix and the deviation; and inputting the adjusted spread rate at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to obtain the predicted fire line position information at time K again.
Optionally, adjusting the forest fire spread rate at time K-1 according to the preset spread rate update coefficient matrix and the deviation comprises: multiplying the update coefficient matrix by the deviation and adding the product to the forest fire spread rate at time K-1 to obtain the adjusted spread rate.
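By way of illustration only (this sketch is not part of the patent disclosure, and all names are illustrative), the update rule above reduces to a single line, assuming the coefficient matrix and the deviation are NumPy arrays of compatible shape:

```python
import numpy as np

def update_spread_rate(R_prev, C_update, err):
    """Sketch of the optional update rule above: multiply the preset update
    coefficient matrix C_update by the deviation err and add the product to
    the spread rate at time K-1."""
    return R_prev + C_update @ err
```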
Optionally, calculating the fire line state analysis value at time K from the recalculated predicted fire line position information and the observed fire line position information at time K comprises: performing least squares fitting of the recalculated predicted position information and the observed position information based on an ensemble Kalman filtering algorithm to obtain the fire line state analysis value at time K.
According to a second aspect of the application, a forest fire spreading data assimilation device based on unmanned aerial vehicle video is provided, comprising: a first acquisition module for acquiring meteorological data and basic geographic information data of the fire location; a second acquisition module for acquiring the fire line state analysis value of the fire location at time K-1; a third acquisition module for inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spreading model and obtaining the predicted fire line position information at time K; a fourth acquisition module for acquiring a thermal imaging video of the fire scene area shot by the unmanned aerial vehicle; a fifth acquisition module for obtaining the observed fire line position information at time K from the thermal imaging video; a judging module for judging, according to the predicted and observed fire line position information at time K, whether the parameters of the forest fire spreading model need adjustment; an adjusting module for adjusting the model parameters according to the predicted and observed fire line position information at time K when adjustment is needed, and recalculating the predicted fire line position information at time K with the parameter-adjusted model; and a data assimilation module for calculating the fire line state analysis value at time K from the recalculated predicted position information and the observed position information.
Optionally, the forest fire spreading model comprises a Rothermel model and a Huygens wave model; the third acquisition module is specifically configured to: input the meteorological data and basic geographic information data of the fire location into the Rothermel model to obtain the forest fire spread rate at time K-1; and input the spread rate at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to predict the fire line position, obtaining the predicted fire line position information at time K.
Optionally, the fifth acquisition module is specifically configured to: acquire the thermal image of the fire scene area at time K from the thermal imaging video; determine the temperature corresponding to each pixel of the thermal image; extract the fire scene extent from the thermal image according to the per-pixel temperatures and a temperature threshold; perform edge extraction on the fire scene extent to obtain the pixel positions of the fire line; and convert the pixel positions of the fire line into GPS coordinates to obtain the observed fire line position information at time K.
Optionally, there is at least one unmanned aerial vehicle, and the fifth acquisition module is specifically configured to obtain multiple pixel positions of the fire line from thermal imaging videos of the fire scene area shot by the at least one unmanned aerial vehicle at multiple observation points.
Optionally, the fifth acquisition module is specifically configured to: perform coordinate conversion on the multiple pixel positions of the fire line to obtain multiple coordinate values of the fire line in the geographic coordinate system of the unmanned aerial vehicle; calculate multiple observation altitude angle matrices and azimuth angle matrices of the fire line from those coordinate values; perform Kalman filtering estimation of the fire line position from the altitude angle and azimuth angle matrices to obtain a coordinate estimate of the fire line; and convert the coordinate estimate through GPS coordinate conversion to obtain the observed fire line position information at time K.
Optionally, the fifth acquisition module is specifically configured to: acquire DEM geographic information of the fire location; acquire GPS information, attitude information and built-in parameters of the unmanned aerial vehicle; generate a virtual viewing angle for the unmanned aerial vehicle point location from the DEM geographic information, the GPS information, the attitude information and the built-in parameters; simulate the actual imaging process of the unmanned aerial vehicle from that virtual viewing angle to obtain a simulated image; determine the pixel coordinates of the fire line in the simulated image from the pixel positions of the fire line; and convert those pixel coordinates through GPS coordinate conversion to obtain the observed fire line position information at time K.
Optionally, the judging module is specifically configured to: calculate the deviation between the predicted and observed fire line position information at time K; judge whether the deviation converges within a target range; if not, judge whether the number of iterations of the model is smaller than the maximum number of iterations; if so, judge that the model parameters need adjustment; and if the deviation converges within the target range and/or the number of iterations is greater than or equal to the maximum, stop parameter adjustment of the model.
Optionally, the adjusting module is specifically configured to: calculate the deviation between the predicted and observed fire line position information at time K; adjust the forest fire spread rate at time K-1 according to a preset spread rate update coefficient matrix and the deviation; and input the adjusted spread rate at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to obtain the predicted fire line position information at time K again.
Optionally, the adjusting module is specifically configured to multiply the update coefficient matrix by the deviation and add the product to the forest fire spread rate at time K-1 to obtain the adjusted spread rate.
Optionally, the data assimilation module is specifically configured to perform least squares fitting of the recalculated predicted fire line position information and the observed fire line position information at time K based on an ensemble Kalman filtering algorithm to obtain the fire line state analysis value at time K.
According to a third aspect of the present application, an electronic device is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the forest fire spreading data assimilation method based on unmanned aerial vehicle video according to the embodiments of the first aspect of the present application.
According to a fourth aspect of the present application, a computer-readable storage medium is provided, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the forest fire spreading data assimilation method based on unmanned aerial vehicle video according to the embodiments of the first aspect of the present application.
According to the technical solution of the embodiments of the application, the fire point positioning technique of unmanned aerial vehicle thermal imaging video is combined with spread modelling: the fire line position is identified by intelligent analysis of the thermal imaging video, the parameters of the forest fire spreading model are corrected in real time from the observed fire line position, and the model is iterated dynamically, realizing the data assimilation process of the forest fire spreading model. This effectively solves the problem that the fire line cannot be acquired in real time and the model parameters cannot be corrected in time, so that the precision of the prediction results cannot be guaranteed. In addition, aerial photography of the forest fire scene by unmanned aerial vehicle allows shooting positions to be moved quickly, covers a large area, and returns video rapidly; analysing fire line data from the returned video is low in cost, timely and flexible, effectively avoiding the mutual restriction of temporal and spatial resolution in satellite remote sensing data, and can greatly improve the timeliness and accuracy of forest fire spread prediction. Furthermore, for the unsteady meteorological conditions of a forest fire area, the application provides a multi-convergence ensemble Kalman filter data assimilation method: the model parameters are corrected in real time from the fire line position while the forest fire spread rate is iterated dynamically, effectively improving the accuracy of the forest fire spreading model.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
Fig. 1 is a schematic flowchart of a forest fire spread data assimilation method based on unmanned aerial vehicle video according to an embodiment of the present application;
Fig. 2 is a schematic flowchart of a forest fire spread data assimilation method based on unmanned aerial vehicle video according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of a forest fire spread data assimilation method based on unmanned aerial vehicle video according to another embodiment of the present application;
Fig. 4 is a flowchart of obtaining the predicted fire line position information at time K according to an embodiment of the present application;
Fig. 5 is a schematic flowchart of obtaining the observed fire line position information at time K according to another embodiment of the present application;
Fig. 6 is a schematic diagram of multiple drones acquiring three-dimensional target coordinate information according to an embodiment of the present application;
Fig. 7 is a schematic flowchart of a forest fire spread data assimilation method based on unmanned aerial vehicle video according to an embodiment of the present application;
Fig. 8 is a flowchart of obtaining predicted fire line position information according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a forest fire spread data assimilation device based on unmanned aerial vehicle video according to an embodiment of the present application;
Fig. 10 is a block diagram of an electronic device for implementing the forest fire spread data assimilation method based on unmanned aerial vehicle video according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
The forest fire spread data assimilation method, device, electronic equipment and storage medium based on the unmanned aerial vehicle video are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a forest fire spread data assimilation method based on unmanned aerial vehicle video according to an embodiment of the application. It should be noted that the forest fire spread data assimilation method in the embodiment of the present application can be applied to the forest fire spread data assimilation device based on the unmanned aerial vehicle video implemented in the present application, and the device can be implemented by software and/or hardware, and can be integrated into electronic equipment.
As shown in fig. 1, the forest fire spread data assimilation method comprises the following steps:
step 101, acquiring meteorological data and basic geographic information data of a fire place, and acquiring a fire wire state analysis value at the K-1 moment of the fire place.
In some embodiments of the present application, the meteorological data may include, but is not limited to, any one or more of wind speed, wind direction, air temperature, precipitation probability, precipitation amount, air pressure, air humidity, air oxygen content, and the like. As one example, the meteorological data may include wind speed and wind direction.
In some embodiments of the present application, the basic geographic information data may include, but is not limited to, any one or more of the underlying surface type, forest moisture content, forest slope map, aspect, forest combustibles, physical and chemical properties of the combustibles, and the like. The physical and chemical properties may include, but are not limited to, any one or more of density, ignition point, calorific value, flammability, and the like. As an example, the basic geographic information data may include the underlying surface type, forest moisture content, forest slope map, aspect, and forest combustibles.
In the present embodiment, the fire line state analysis value of the fire location at time K-1 may be obtained as follows: input the meteorological data, the basic geographic information data and the fire line state analysis value at time K-2 into the forest fire spreading model to obtain the predicted fire line position information at time K-1; then acquire the thermal imaging video of the fire scene area shot by the unmanned aerial vehicle and obtain the observed fire line position information at time K-1 from the video; judge, from the predicted and observed position information at time K-1, whether the model parameters need adjustment; if so, adjust the model parameters according to the predicted and observed fire line position information at time K-1, recalculate the predicted fire line position information at time K-1 with the parameter-adjusted model, and calculate the fire line state analysis value at time K-1 from the recalculated predicted position information and the observed position information at time K-1.
In the application, time K denotes a certain point in time during the burning of the forest fire; time K-1 denotes the point one time step earlier, time K-2 the point two time steps earlier, and so on.
That is, when calculating the fire line state analysis value at time K-1, the fire line position at time K-1 can first be predicted with the forest fire spreading model from the state analysis value at the previous time (time K-2) together with the meteorological data and basic geographic information data of the fire location, yielding the predicted fire line position information at time K-1. Then, if comparison of the predicted and observed fire line position information at time K-1 shows that the model parameters need dynamic adjustment, the parameters are adjusted and the predicted fire line position information at time K-1 is recalculated with the adjusted model; the fire line state analysis value at time K-1 is then calculated from the recalculated predicted position information and the observed position information. If no parameter adjustment is needed, there is no need to recalculate, and the fire line state analysis value at time K-1 can be calculated directly from the first-predicted position information and the observed position information at time K-1.
In the embodiment of the present application, when K = 1, the initial fire line state analysis value at the fire location may be obtained as follows: the correspondence between weather, basic geographic information and the fire line state analysis value can be obtained in advance through many simulation tests, so that the initial state analysis value can be derived from this correspondence and the weather and basic geographic information data of the fire location. That is, when a fire breaks out somewhere, the initial fire line state analysis value can be predicted using empirical values obtained from multiple simulation tests.
Step 102, inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spreading model, and obtaining the predicted fire line position information at time K.
In some embodiments of the present application, the forest fire spreading model may be any model that can simulate fire spread from input data, such as the Rothermel model, the Huygens wave model, a combination of the Rothermel and Huygens wave models, the McArthur model, and the like. In the embodiment of the present application, the forest fire spreading model comprises the Rothermel model and the Huygens wave model.
As an example, inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spreading model to obtain the predicted fire line position information at time K may proceed as follows: input the meteorological data and basic geographic information data of the fire location into the Rothermel model to obtain the forest fire spread rate at time K-1, then input the spread rate at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to predict the fire line position, obtaining the predicted fire line position information at time K.
As an example, the specific implementation process of step 102 may be as follows:
Firstly, from data of the fire location such as weather and basic geographic information, including wind speed, wind direction, underlying surface type, forest moisture content, forest slope map, aspect and forest combustibles, the forest fire spread rate \(R_0\) of each fire point is obtained through the Rothermel model. According to the Huygens wave theory, the fire points are regarded as points on a wave front; each fire point can be regarded as the next wave source (i.e. a secondary wave source) from which the wave keeps propagating, which yields the predicted fire line position at the next time step K:

\( X_k^f = \left[ (e_1, n_1), (e_2, n_2), \ldots, (e_m, n_m) \right]_k^f \)

where the subscript \(k\) denotes time, the superscript \(f\) denotes the prediction matrix of the fire line, \((e_j, n_j)\) are the coordinates of point \(j\) on the fire line, and \(m\) is the number of marked points on the fire line perimeter. The forest fire spreading model is expressed as:

\( R_0 = \frac{I_R \, \zeta \, (1 + \varphi_w + \varphi_s)}{\rho_b \, \varepsilon \, Q_{ig}} \quad (1) \)

\( X_k^f = H\left( X_{k-1}^a, R_0 \right) \quad (2) \)

Formula (1) is the Rothermel model, in which \(R_0\) is the spread rate at a given fire point, \(I_R\) is the reaction intensity, \(\zeta\) is the propagation rate, \(\rho_b\) is the combustible density, \(\varepsilon\) is the effective thermal coefficient, \(Q_{ig}\) is the heat required to ignite a unit mass of combustible, and \(\varphi_w\) and \(\varphi_s\) are the wind speed and slope correction coefficients. Formula (2) is the Huygens wave model, in which \(H\) denotes the Huygens operator, the superscript \(a\) denotes the state analysis matrix that combines the predicted fire line and the observed fire line, and \(X_{k-1}^a\) is the state analysis matrix of the model at the previous time step K-1.
As can be seen from formulas (1) and (2), inputting data such as weather, terrain, vegetation and the initial fire source into the Rothermel model yields the forest fire spread rate, and inputting the spread rate and the fire line state analysis matrix of the previous time step into the Huygens wave model yields the predicted fire line position information at the current time.
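For illustration only (this sketch is not part of the patent disclosure), the following Python fragment shows how formulas (1) and (2) might be combined in one prediction step. It assumes NumPy arrays, reduces the Huygens step to expansion of each fire line point along its local outward normal, and omits the elliptical wind-driven wavelet a full implementation would use:

```python
import numpy as np

def rothermel_spread_rate(I_R, zeta, phi_w, phi_s, rho_b, eps, Q_ig):
    # Formula (1): Rothermel spread rate at a fire point.
    return I_R * zeta * (1.0 + phi_w + phi_s) / (rho_b * eps * Q_ig)

def huygens_step(fire_line, R0, dt):
    """Simplified formula (2): move each fire-line point (e_j, n_j) outward
    along the local normal by its spread rate times dt.

    fire_line: (m, 2) array of points ordered clockwise along the perimeter.
    R0: (m,) array of per-point spread rates from the Rothermel model.
    """
    # Tangent by centered differences on the closed perimeter.
    tangent = np.roll(fire_line, -1, axis=0) - np.roll(fire_line, 1, axis=0)
    tangent /= np.linalg.norm(tangent, axis=1, keepdims=True)
    # Outward normal for a clockwise-ordered perimeter.
    normal = np.stack((tangent[:, 1], -tangent[:, 0]), axis=1)
    return fire_line + R0[:, None] * dt * normal
```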
Step 103, acquiring the thermal imaging video of the fire scene area shot by the unmanned aerial vehicle, and obtaining the observed fire line position information at time K from the video.
In some embodiments of the present application, the thermal video is obtained with infrared thermal imaging technology. As a result of the thermal motion of molecules, all objects in nature emit infrared radiation as long as their temperature is above absolute zero (-273 °C), and the radiated wavelength is inversely proportional to temperature. The infrared thermal imaging technique adopted in this embodiment converts the detected radiant energy of a target object, through system processing, into a thermal image (a grey-scale and/or pseudo-colour image). The unmanned aerial vehicle carries such a thermal imager, and the pixel information of the thermal image reflects the temperature of the photographed area. In this step, this property is used to obtain the observed fire line position information at time K.
In some embodiments of the application, the thermal imager on the unmanned aerial vehicle can send its video to the forest fire spread data assimilation device over a communication connection, so that the device obtains the thermal imaging video of the fire scene area shot by the imager. That is, the unmanned aerial vehicle is equipped with a thermal imaging camera for shooting thermal video of the fire scene area, and the vehicle and the assimilation device communicate over this connection so that the device can retrieve the video from the vehicle.
As an example, the communication connection may use the mobile internet, wireless communication, or the like. The mobile internet may be one of a 3G (3rd generation mobile communication) network, a 4G (4th generation mobile communication) network, a 5G (5th generation mobile communication) network, and so on; the wireless communication may be one of WIFI (Wireless Fidelity), digital wireless data transmission radio, UWB (Ultra Wide Band) transmission, Zigbee transmission, and the like.
Step 104, judging whether the parameters of the forest fire spreading model need adjustment according to the predicted and observed fire line position information at time K.
Optionally, forest fire spread data assimilation can be achieved with a multi-convergence ensemble Kalman filtering method. This method first selects the analyzed fire line position at time K-1 and the fire spread rate \(V_{k-1}\) as the state parameters to be corrected. The Rothermel-Huygens model predicts the fire line position \(X_k^f\) at time K from the fire line state analysis value \(X_{k-1}^a\) at time K-1, and the observed fire line position at time K is denoted \(X_k^o\). The deviation between the predicted position information \(X_k^f\) and the observed position information \(X_k^o\) at time K is then calculated, and whether the model parameters need adjustment is determined from the convergence of that deviation.
In some embodiments of the present application, as shown in fig. 2, determining whether the parameters of the forest fire spreading model need adjustment according to the predicted and observed fire line position information at time K may include:
step 201, calculating the deviation between the fire wire predicted position information and the fire wire observed position information at the time K.
In some embodiments of the present application, the deviation is defined as

\( \mathrm{Err} = X_k^o - X_k^f \quad (3) \)

That is, formula (3) gives the deviation between the predicted fire line position information \(X_k^f\) and the observed fire line position \(X_k^o\) at time K.
Step 202, judging whether the deviation converges within the target range.
Optionally, having obtained the deviation between the predicted fire line position information \(X_k^f\) and the observed fire line position \(X_k^o\) at time K, whether the deviation converges within the target range can be determined by

\( \frac{\lVert \mathrm{Err}_h \rVert}{\lVert X_k^o \rVert} \le C_{factor} \quad (4) \)

where \(\lVert \mathrm{Err}_h \rVert\) and \(\lVert X_k^o \rVert\) are the 2-norms of the deviation at the h-th iteration step and of the observation data, and \(C_{factor}\) is the criterion for judging whether the calculation converges. If after h iterations the ratio is less than or equal to \(C_{factor}\), the iteration of the model is stopped and the deviation is considered to have converged within the target range; if the ratio is greater than \(C_{factor}\), the deviation has not converged and parameter adjustment of the model must continue.
Step 203, if the deviation does not converge within the target range, judging whether the number of iterations of the forest fire spreading model is smaller than the maximum number of iterations.
In some embodiments of the present application, a variable may be set to record the number of iterations of the forest fire spreading model. The maximum number of iterations may be a manually set constant recorded in the system in advance, or a recommended value given in advance that can be adjusted dynamically according to experience or field conditions during actual operation. Comparing the recorded number of iterations with the set maximum yields their relative relation. For example, let \(N_{iteration}\) be the maximum number of iteration steps and \(h\) the current number of iterations of the forest fire spreading model. After it is determined that the deviation between the predicted fire line position \(X_k^f\) and the observed fire line position \(X_k^o\) at time K has not converged within the target range, it must be judged whether the current number of iterations is smaller than the maximum. If \(h \ge N_{iteration}\), the current number of iterations has reached or exceeded the maximum, and the iteration of the model is stopped, i.e. no further parameter adjustment is performed. If \(h < N_{iteration}\), the current number of iterations is smaller than the maximum, and step 204 is executed.
Step 204, if the number of iterations of the forest fire spreading model is smaller than the maximum number of iterations, judging that the model parameters need adjustment.
Step 205, if the deviation converges within the target range and/or the number of iterations of the model is greater than or equal to the maximum number of iterations, stopping parameter adjustment of the forest fire spreading model. Thus, through steps 201 to 205, under the unsteady meteorological conditions of a forest fire area, the multi-convergence ensemble Kalman filter data assimilation method can correct forest fire spread parameters such as the fire line position in real time while dynamically iterating the spread rate, effectively improving the accuracy of the forest fire spreading model.
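As a hedged sketch (not part of the patent disclosure; names are illustrative, NumPy assumed), the control logic of steps 201 to 205 combines the deviation of formula (3) with the criterion of formula (4) and the iteration cap:

```python
import numpy as np

def needs_adjustment(X_f, X_o, h, C_factor, N_iteration):
    """Return True if another parameter-adjustment iteration is needed,
    False if iteration should stop (steps 201-205)."""
    err = X_o - X_f                                    # formula (3)
    ratio = np.linalg.norm(err) / np.linalg.norm(X_o)  # formula (4), 2-norms
    if ratio <= C_factor:   # deviation converged within the target range
        return False
    return h < N_iteration  # adjust only while iterations remain
```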
Step 105, if the parameters of the forest fire spreading model need adjustment, adjusting the model parameters according to the predicted and observed fire line position information at time K, recalculating the predicted fire line position information at time K with the parameter-adjusted model, and calculating the fire line state analysis value at time K from the recalculated predicted position information and the observed position information at time K.
In some embodiments of the present application, the fire line state analysis value at time K may be calculated from the recalculated predicted fire line position information and the observed fire line position information at time K as follows: based on an ensemble Kalman filtering algorithm, perform least squares fitting of the recalculated predicted position information and the observed position information to obtain the fire line state analysis value at time K.
As an example, this least squares fitting based on the ensemble Kalman filtering algorithm may proceed as follows:
When the deviation \(\mathrm{Err}_h\) between the observed fire line \(X_k^o\) and the predicted fire line \(X_k^f\) satisfies formula (4), the deviation is considered to have converged within the target range, and a state-analysis fire line position \(X_k^a\), i.e. the state analysis matrix, is obtained from the observed fire line \(X_k^o\) and the predicted fire line \(X_k^f\) by least squares fitting (the state-analysis fire line \(X_k^a\) has minimal error with respect to the true fire line position). The fire line state analysis value \(X_k^a\) is calculated in the following steps:
1) Compute the ensemble prediction error covariance matrix \(P_e\):

\( X'^f = X^f (I - 1_N) \quad (5) \)

\( P_e = \frac{X'^f (X'^f)^T}{N - 1} \quad (6) \)

where \(N\) is the number of members of the state variable ensemble, \(1_N\) is an \(N \times N\) matrix whose elements all equal \(1/N\), and \(\bar{X}^f = X^f 1_N\) holds in each column the mean vector of the columns of the prediction matrix \(X^f\), so that \(X'^f = X^f - \bar{X}^f\).
2) Generate the observation ensemble. At each data assimilation time step, an observation vector \(y^o\) is obtained, and perturbations are added to it to generate an observation matrix containing N perturbed observation vectors:

\( y_j = y^o + \varepsilon_j, \quad j = 1, \ldots, N \quad (7) \)

The perturbed observation vectors form the observation matrix

\( Y^o = (y_1, y_2, \ldots, y_N) \in \mathbb{R}^{m \times N} \quad (8) \)

where \(\mathbb{R}^{m \times N}\) indicates that \(Y^o\) has m rows and N columns. Meanwhile, the added perturbations are stored in the matrix

\( E = (\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_N) \quad (9) \)

and the ensemble observation error covariance matrix can be expressed as

\( R_e = \frac{E E^T}{N - 1} \quad (10) \)
3) Compute the ensemble Kalman gain:

\( K_e = P_e H^T \left( H P_e H^T + R_e \right)^{-1} \quad (11) \)

where \(H\) is the observation operator mapping the state \(X\) from state space to observation space.
4) Update the system state analysis value:

\( X^a = X^f + K_e \left( Y^o - H X^f \right) \quad (12) \)
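Steps 1) to 4) follow the standard stochastic (perturbed-observation) ensemble Kalman filter update. The following NumPy sketch is illustrative only, with n the state dimension, m the observation dimension and N the ensemble size; it mirrors formulas (5) to (12):

```python
import numpy as np

def enkf_analysis(X_f, y_o, H, r_std, rng=None):
    """X_f: (n, N) prediction ensemble; y_o: (m,) observation vector;
    H: (m, n) observation operator; r_std: observation error std. dev."""
    rng = rng or np.random.default_rng(0)
    n, N = X_f.shape
    one_N = np.full((N, N), 1.0 / N)
    A = X_f @ (np.eye(N) - one_N)                   # formula (5): ensemble anomalies
    P_e = A @ A.T / (N - 1)                         # formula (6)
    E = r_std * rng.standard_normal((y_o.size, N))  # perturbations, formula (9)
    Y_o = y_o[:, None] + E                          # formulas (7)-(8)
    R_e = E @ E.T / (N - 1)                         # formula (10)
    K_e = P_e @ H.T @ np.linalg.inv(H @ P_e @ H.T + R_e)  # formula (11)
    return X_f + K_e @ (Y_o - H @ X_f)              # formula (12): analysis ensemble
```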
Therefore, for the unsteady meteorological conditions of a forest fire area, the embodiment of the application provides a multi-convergence ensemble Kalman filter data assimilation method that corrects the parameters of the forest fire spreading model in real time from the fire line position while dynamically iterating the forest fire spread rate, effectively improving the accuracy of the forest fire spreading model.
To sum up, the forest fire spread data assimilation method based on unmanned aerial vehicle video obtains the predicted fire line position information at time K through the forest fire spreading model from the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 of the fire location; compares the predicted position information at time K with the observed fire line position information obtained by the unmanned aerial vehicle; and judges whether the model parameters need adjustment. If so, it adjusts the model parameters according to the predicted and observed position information at time K, recalculates the predicted fire line position information at time K with the adjusted model, and recalculates the fire line state analysis value at time K. The method uses the unmanned aerial vehicle as front-end monitoring equipment to extract the fire line in real time and obtain its position information, giving the forest fire spreading model dynamically adjustable parameters through assimilation. It can effectively solve the problem that the fire line cannot be acquired in real time and the model parameters cannot be corrected in time, which would leave the precision of the prediction results unguaranteed, and it improves the prediction precision of the model. The unmanned aerial vehicle has the advantages of high mobility and low cost and can return live video in real time, so that the update interval of the observed fire line reaches minute-level or even second-level recognition. This effectively avoids the mutual restriction of temporal and spatial resolution in satellite remote sensing data, can greatly improve the timeliness and accuracy of forest fire spread prediction, and can thus improve the prediction precision of the burned area and provide objective fire scene information for forest fire suppression work.
It should be noted that, to obtain more accurate observed fire line position information, the fire scene area can be shot with the thermal imaging camera carried by the unmanned aerial vehicle, and the observed fire line position information at time K then calculated from the thermal video. Specifically, fig. 3 is a flowchart of a forest fire spread data assimilation method based on unmanned aerial vehicle video according to another embodiment of the present application; as shown in fig. 3, the method includes:
Step 301, acquiring meteorological data and basic geographic information data of the fire place, and acquiring the fire line state analysis value at time K-1 of the fire place.
Step 302, inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spreading model to obtain the fire line predicted position information at time K, and acquiring the fire scene area thermal imaging video shot by the unmanned aerial vehicle.
Step 303, acquiring the thermal imaging of the fire scene area at time K from the thermal imaging video, and determining the temperature information corresponding to each pixel in the thermal imaging.
Step 304, extracting the fire field range from the thermal imaging according to the per-pixel temperature information and a temperature threshold, and performing edge extraction on the fire field range to obtain the pixel positions of the fire line (a code sketch of this extraction stage follows step 305).
Step 305, converting the pixel positions of the fire line into GPS coordinates of the fire line to obtain the fire line observation position information at time K.
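As a rough illustration of the image-processing stage of steps 303 and 304 (thresholding followed by edge extraction), a sketch with OpenCV could look as follows; the 300 °C threshold and the assumption that the radiometric video has already been decoded to per-pixel temperatures are illustrative only, since real thermal cameras expose calibrated temperatures through vendor-specific SDKs.

```python
import cv2
import numpy as np

def extract_fire_line_pixels(thermal_frame, temp_threshold=300.0):
    """Extract fire line pixel positions from one thermal frame.

    thermal_frame : 2-D float array of per-pixel temperatures (deg C),
                    assumed already decoded from the radiometric video
    Returns a (k, 2) array of (row, col) positions on the fire line.
    """
    # Fire field range: pixels at or above the temperature threshold
    fire_mask = (thermal_frame >= temp_threshold).astype(np.uint8) * 255
    # Morphological opening suppresses isolated hot pixels
    kernel = np.ones((3, 3), np.uint8)
    fire_mask = cv2.morphologyEx(fire_mask, cv2.MORPH_OPEN, kernel)
    # Edge extraction: the outer contour of the fire field is the fire line
    contours, _ = cv2.findContours(fire_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return np.empty((0, 2), dtype=int)
    boundary = max(contours, key=cv2.contourArea).reshape(-1, 2)
    return boundary[:, ::-1]  # OpenCV gives (x, y); return (row, col)
```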
In some embodiments of the present application, converting fire line pixel information into GPS information is the inverse of camera imaging: as shown in fig. 4, the drone's shot is a projective transformation from the three-dimensional scene to the two-dimensional image plane. The essence of camera imaging is the central perspective projection of photogrammetric geometry. A three-dimensional ground point determines its observed image through the view frustum and viewpoint orientation specified by the projection matrix, so the two-dimensional camera picture and the three-dimensional geographic information are placed in correspondence through the frustum and viewpoint orientation. Converting two-dimensional picture information back into three-dimensional coordinates inverts this process.
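In standard pinhole-camera notation (a textbook formulation, not a formula quoted from this embodiment), the central perspective projection reads

$$ s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \, [\,R \mid t\,] \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} $$

where $(u, v)$ is the fire line pixel, $K$ holds the camera built-in parameters, $[R \mid t]$ is the pose obtained from the drone's GPS and attitude data, and $(X, Y, Z)$ is the ground point. Recovering $(X, Y, Z)$ from $(u, v)$ alone is under-determined from a single image, which is why the two examples below add either multiple observation points or DEM terrain information.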
In the embodiments of the present application, there are various ways to convert the pixel positions of the fire line into GPS coordinates of the fire line, and the way may be selected and set according to the specific application scenario, for example as follows:
The first example is a fire line positioning technique that does not use DEM information. In this example, the same fire scene area can be shot from a plurality of observation points by at least one unmanned aerial vehicle; a plurality of pixel positions of the fire line are then obtained from the thermal imaging videos shot at the plurality of observation points, and the fire line observation position information at time K is calculated from these pixel positions. Specifically, as shown in fig. 5, this example includes the following steps:
Step 501, performing coordinate conversion on the plurality of pixel positions of the fire line to obtain a plurality of coordinate values of the fire line in the geographic coordinate system of the unmanned aerial vehicle.
Step 502, calculating a plurality of observation altitude angle matrices and azimuth angle matrices of the fire line from these coordinate values.
Step 503, performing Kalman filtering estimation of the fire line position from the plurality of observation altitude angle matrices and azimuth angle matrices to obtain a coordinate estimation value of the fire line.
Step 504, converting the coordinate estimation value of the fire line into GPS coordinates to obtain the fire line observation position information at time K.
For example, a specific implementation of steps 501-504, which does not use DEM information, may be as follows:
Positioning a ground target by the unmanned aerial vehicle mainly proceeds as follows: data are collected and processed by airborne sensors to obtain the relative distance and angle between the unmanned aerial vehicle and the target, and the target position coordinates are calculated by combining the position and attitude data of the unmanned aerial vehicle. When the unmanned aerial vehicle observes the same target from multiple points, accurate three-dimensional coordinates of the target can be obtained by the vision-based multi-point angle observation fire line positioning method.
For target positioning by multi-point angle observation, the relative altitude angle and azimuth angle matrices of the fire line with respect to the unmanned aerial vehicle are calculated from the fire line pixel information according to the imaging principle; a system state equation and an observation equation are established; the position of the fire line relative to the unmanned aerial vehicle is estimated with unscented Kalman filtering; and this position is converted into coordinates in the geodetic coordinate system. Observation at time K yields the actually observed fire line position at time K.
The main steps for obtaining the observed fire line are as follows (a simplified positioning sketch follows the list):
1) Convert the fire line pixel information into values in the unmanned aerial vehicle geographic coordinate system through coordinate conversion, and calculate the altitude angle and azimuth angle matrix of each fire line point relative to the unmanned aerial vehicle.
2) Obtain a plurality of observation altitude angle and azimuth angle matrices of the fire line by observing it from multiple points, and obtain an estimated value of the fire line position by Kalman filtering.
3) Convert the estimated value of the fire line position into GPS coordinates of the fire line, namely the actually observed fire line position.
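The embodiment estimates the fire line point with unscented Kalman filtering over the bearing observations; as a simpler stand-in that exposes the same multi-point geometry, the sketch below intersects the observation rays by linear least squares. The ENU frame, the azimuth/elevation convention, and the use of least squares instead of the UKF are assumptions of this sketch.

```python
import numpy as np

def triangulate_from_bearings(uav_positions, azimuths, elevations):
    """Estimate a fire line point from multi-point angle observations.

    uav_positions : (N, 3) UAV positions in a local ENU frame (metres)
    azimuths      : (N,) azimuth of the fire point from each UAV (rad)
    elevations    : (N,) elevation angle of the fire point (rad)

    Each observation constrains the target to a ray p_i + s * d_i;
    the returned point minimizes the summed squared distance to all rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, az, el in zip(uav_positions, azimuths, elevations):
        # Unit line-of-sight direction in ENU from azimuth/elevation
        d = np.array([np.cos(el) * np.sin(az),
                      np.cos(el) * np.cos(az),
                      np.sin(el)])
        # Projector onto the plane orthogonal to the ray direction
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ p
    return np.linalg.solve(A, b)
```

The resulting local coordinate is then converted to GPS coordinates as in step 504.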
A second example is a fire line positioning technique that uses DEM information; as shown in fig. 7, it includes the following steps:
and step 701, acquiring DEM geographic information of a fire place.
And step 702, acquiring GPS information, attitude information and built-in parameters of the unmanned aerial vehicle.
And 703, generating a virtual visual angle of the point location of the unmanned aerial vehicle according to the DEM geographic information, the GPS information, the attitude information and the built-in parameters of the unmanned aerial vehicle.
And 704, simulating an actual unmanned aerial vehicle imaging process according to the virtual visual angle of the unmanned aerial vehicle point location to obtain a simulation image.
Step 705, determining the pixel coordinates of the live wire in the simulation image according to the pixel position of the live wire.
And 706, converting the pixel coordinates of the fire wire in the simulation image through a GPS coordinate to obtain the observation position information of the fire wire at the K moment.
For example, a specific implementation of steps 701-706, which uses DEM information, may be as follows:
Based on the DEM geographic information of the forest, a virtual view of the unmanned aerial vehicle point location is formed by a TS-GIS (TypeScript Geographic Information System) engine, and a projection matrix is generated. The spatial coordinates corresponding to the fire line pixels in the thermal imaging picture are then obtained with the projection matrix. Observation at time K yields the actually observed fire line position at time K.
The fire line positioning process is as follows:
1) The TS-GIS engine displays three-dimensional DEM information by combining the DEM data of the forest region with a remote sensing image data source.
2) The camera GPS information, attitude information and built-in parameters are input and, exploiting the consistency of perspective imaging and photogrammetric imaging, the imaging process of the actual camera is simulated through the TS-GIS virtual camera view to obtain a simulation image.
3) In the three-dimensional scene under the virtual view, the simulation image is generated from the projection matrix and the observation matrix, and the fire line is GPS-positioned from the pixel coordinates of the target in the monitored image, giving the actually observed fire line position; a ray-casting sketch of this pixel-to-ground lookup follows.
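The DEM variant resolves the single-view depth ambiguity by intersecting the pixel's viewing ray with the terrain surface. A simplified ray-marching sketch under assumed inputs (camera centre, unit viewing ray in a local map frame, and a DEM elevation lookup) is given below; the TS-GIS engine performs the equivalent lookup through its projection matrix.

```python
import numpy as np

def intersect_ray_with_dem(cam_pos, ray_dir, dem_height,
                           step=1.0, max_range=5000.0):
    """March a pixel's viewing ray until it crosses the DEM surface.

    cam_pos    : (3,) camera centre (x, y, z) in a local map frame (m)
    ray_dir    : (3,) unit viewing ray of the fire line pixel, assumed
                 precomputed from the camera pose and built-in parameters
    dem_height : callable (x, y) -> terrain elevation z
    Returns the (3,) ground intersection, i.e. the fire line point.
    """
    prev = cam_pos.copy()
    for s in np.arange(step, max_range, step):
        p = cam_pos + s * ray_dir
        if p[2] <= dem_height(p[0], p[1]):
            # Crossed the terrain between prev and p: refine by bisection
            lo, hi = prev, p
            for _ in range(20):
                mid = 0.5 * (lo + hi)
                if mid[2] <= dem_height(mid[0], mid[1]):
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)
        prev = p
    raise ValueError("ray does not hit the DEM within max_range")
```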
Step 306, judging whether parameter adjustment of the forest fire spreading model is needed according to the fire line predicted position information and the fire line observation position information at time K.
Step 307, if parameter adjustment is needed, adjusting the model parameters of the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K, recalculating the fire line predicted position information at time K with the adjusted forest fire spreading model, and calculating the fire line state analysis value at time K from the recalculated fire line predicted position information and the fire line observation position information at time K.
In some embodiments of the application, the forest fire spreading speed parameter in the forest fire spreading model can be adjusted according to the deviation between the fire line predicted position information and the fire line observation position information, and the fire line predicted position information is then recalculated with the adjusted forest fire spreading model.
As an example, as shown in fig. 8, the specific implementation process of adjusting the model parameters of the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at the time K and recalculating the fire line predicted position information at the time K according to the forest fire spreading model adjusted by the model parameters may include:
Step 801, calculating the deviation between the fire line predicted position information and the fire line observation position information at time K.
Step 802, adjusting the forest fire spreading speed at time K-1 according to a preset forest fire spreading speed update coefficient matrix and the deviation.
In the embodiment of the present application, adjusting the forest fire spreading speed at time K-1 according to the preset update coefficient matrix and the deviation may, for example, comprise: multiplying the forest fire spreading speed update coefficient matrix by the deviation, and adding the product to the forest fire spreading speed at time K-1 to obtain the adjusted speed.
Step 803, inputting the adjusted forest fire spreading speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens fluctuation model to obtain the fire line predicted position information at time K again.
For example, a specific implementation of steps 801-803 may be as follows:
When the deviation does not converge within the target range, note that the forest fire spreading speed $R_{0,k-1}$ at time K-1 calculated by the Rothermel model is the spreading speed currently used in the forest fire spreading model. When a forest fire occurs, the wind direction and wind speed at the fire scene are affected by hot air flow, convection and similar conditions; they are not steady, and hence the fire spreading speed is not steady either. The forest fire spreading speed in the model therefore also needs dynamic adjustment: the unsteady-state factors are taken into account and the spreading speed is updated.
1) Corrected forest fire spreading speed:

$R_{h,k-1} = R_{0,k-1} + C \cdot Err_h$ (13)

where $C$ is the forest fire spreading speed update coefficient matrix and $Err_h$ is the deviation between the predicted value and the observed value from equation (3).

2) Updated forest fire spreading speed:

$R_{0,k-1} = R_{h,k-1}$ (14)
The updated forest fire spreading speed is input into the forest fire spreading model again, and the predicted fire line position at time K is recalculated. Steps 306 and 307 are then iterated, as sketched below.
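Schematically, this iteration of steps 306 and 307 together with the speed correction of equations (13) and (14) can be sketched as follows; the spread_model callable standing in for the Huygens propagation step, the convergence tolerance, and the shapes of C and the deviation are assumptions of this sketch.

```python
import numpy as np

def assimilate_cycle(x_a_prev, R0, spread_model, y_obs, C,
                     tol=1.0, max_iter=10):
    """One assimilation cycle at time K (steps 306-307, iterated).

    x_a_prev     : fire line state analysis value at time K-1
    R0           : forest fire spreading speed at time K-1
    spread_model : callable (x_a_prev, R) -> predicted fire line at K
                   (standing in for the Huygens propagation step)
    y_obs        : observed fire line position at time K
    C            : spreading speed update coefficient matrix
    Returns the final prediction and the updated spreading speed.
    """
    x_f = spread_model(x_a_prev, R0)
    for _ in range(max_iter):
        err = y_obs - x_f                  # deviation of the prediction
        if np.linalg.norm(err) < tol:      # converged in target range
            break
        R0 = R0 + C @ err                  # equations (13)-(14)
        x_f = spread_model(x_a_prev, R0)   # recompute the prediction
    return x_f, R0
```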
According to the method for assimilating forest fire spreading data based on the unmanned aerial vehicle video, the acquired meteorological data, basic geographic information data and the fire line state analysis value at time K-1 of the fire place are input into the forest fire spreading model, and the fire line predicted position information at time K is acquired. The thermal imaging video of the fire scene area shot by the unmanned aerial vehicle is acquired; the thermal imaging at time K is taken from the video; the temperature information corresponding to each pixel is determined; the fire field range is extracted according to the temperature information and the temperature threshold; edge extraction on the fire field range yields the pixel positions of the fire line; and the coordinate estimation value of the fire line is converted into GPS coordinates to obtain the fire line observation position information at time K. Whether parameter adjustment of the forest fire spreading model is needed is judged according to the fire line predicted position information and observation position information at time K. If so, the model parameters are adjusted according to the fire line predicted position information and observation position information at time K, the fire line predicted position information at time K is recalculated with the adjusted model, and the fire line state analysis value at time K is calculated from the recalculated predicted position information and the observation position information. The method of this embodiment adopts the unmanned aerial vehicle as front-end monitoring equipment, extracts the fire line in real time, obtains its position information, and provides the forest fire spreading model with dynamically adjustable parameters through assimilation; this effectively solves the problems that a simulation model cannot adjust dynamically to changes in the simulated environment, that the forest fire model is not adapted to unsteady states, and that changes of the environment cannot be transmitted in real time, and it improves model prediction accuracy. The unmanned aerial vehicle has the advantages of high mobility and low cost and can return live video in real time, so that the update interval of the observed fire line reaches the minute or even second level. Continuously assimilating the forest fire spreading model with this data assimilation method improves the prediction accuracy of the burned area and provides objective fire scene information for forest fire fighting. When the deviation does not converge, a solution incorporating the unsteady-state factors of the fire scene into the model is provided, further improving the prediction accuracy of the model.
Meanwhile, this embodiment provides a method for acquiring the fire line observation position from the thermal imaging video of the fire scene area; the method obtains the fire line observation position information and at the same time displays it visually, providing direct guidance and strong support for forest fire extinguishing work.
In order to realize the above embodiments, the application further provides a forest fire spreading data assimilation device based on the unmanned aerial vehicle video. Fig. 9 is a schematic structural diagram of such a device according to an embodiment of the present application; as shown in fig. 9, the device includes:
a first obtaining module 901, configured to obtain meteorological data and basic geographic information data of a fire place;
a second obtaining module 902, configured to obtain a fire line state analysis value at time K-1 of the fire place;
a third obtaining module 903, configured to input meteorological data, basic geographic information data, and the fire line state analysis value at the time K-1 of the fire place to the forest fire spreading model, and obtain fire line predicted position information at the time K;
a fourth obtaining module 904, configured to obtain a fire scene area thermal imaging video shot by the unmanned aerial vehicle;
a fifth obtaining module 905, configured to obtain fire line observation position information at time K according to the thermal imaging video of the fire scene area;
the judging module 906 is configured to judge whether parameter adjustment needs to be performed on the forest fire spreading model according to the fire line predicted position information at the time K and the fire line observation position information;
the adjusting module 907 is configured to adjust model parameters of the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at the time K when the forest fire spreading model needs to be subjected to parameter adjustment, and recalculate the fire line predicted position information at the time K according to the forest fire spreading model adjusted by the model parameters;
and the data assimilation module 908 is configured to calculate a fire wire state analysis value at the K time according to the recalculated fire wire predicted position information and the fire wire observation position information at the K time.
In some embodiments of the present application, the forest fire spreading model includes a Rothermel model and a Huygens fluctuation model. In this embodiment of the application, the third obtaining module 903 is specifically configured to: input the meteorological data and basic geographic information data of the fire place into the Rothermel model to obtain the forest fire spreading speed at time K-1; and input the forest fire spreading speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens fluctuation model to predict the fire line position and obtain the fire line predicted position information at time K.
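As a toy illustration of the second stage of this pipeline, the sketch below advances a discretized fire perimeter by one time step with a simplified Huygens construction: every perimeter point is the centre of a small wavelet whose radius comes from the local spreading speed. The circular (isotropic) wavelets and the centroid-based normal orientation are simplifications; the full model expands elliptical wavelets shaped by wind and slope from the Rothermel speed.

```python
import numpy as np

def huygens_step(perimeter, spread_rate, dt):
    """Advance a closed fire perimeter with circular Huygens wavelets.

    perimeter   : (N, 2) polygon vertices of the fire line at time K-1
    spread_rate : (N,) local spreading speed at each vertex (m/min)
    dt          : time step (min)
    Returns the (N, 2) fire line prediction at time K.
    """
    # Outward normals from central differences along the closed polygon
    tangent = np.roll(perimeter, -1, axis=0) - np.roll(perimeter, 1, axis=0)
    normal = np.stack([tangent[:, 1], -tangent[:, 0]], axis=1)
    normal /= np.linalg.norm(normal, axis=1, keepdims=True)
    # Flip any normal that points towards the polygon centroid
    outward = perimeter - perimeter.mean(axis=0)
    normal *= np.sign(np.sum(normal * outward, axis=1))[:, None]
    # Each vertex moves along its outward normal by R * dt
    return perimeter + (spread_rate * dt)[:, None] * normal
```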
In some embodiments of the present application, the fifth obtaining module 905 is specifically configured to: acquire the thermal imaging of the fire scene area at time K from the thermal imaging video; determine the temperature information corresponding to each pixel in the thermal imaging; extract the fire field range from the thermal imaging according to the per-pixel temperature information and the temperature threshold; perform edge extraction on the fire field range to obtain the pixel positions of the fire line; and convert the pixel positions of the fire line into GPS coordinates of the fire line to acquire the fire line observation position information at time K.
In some embodiments of the present application, the number of drones is at least one, and the fifth obtaining module 905 is specifically configured to obtain a plurality of pixel positions of the fire line from the thermal imaging videos shot by at least one unmanned aerial vehicle at a plurality of observation points. In this case, the module converts the pixel positions of the fire line into GPS coordinates as follows: perform coordinate conversion on the plurality of pixel positions to obtain a plurality of coordinate values of the fire line in the geographic coordinate system of the unmanned aerial vehicle; calculate a plurality of observation altitude angle matrices and azimuth angle matrices of the fire line from these coordinate values; perform Kalman filtering estimation of the fire line position from the matrices to obtain a coordinate estimation value of the fire line; and convert the coordinate estimation value into GPS coordinates to obtain the fire line observation position information at time K.
In some embodiments of the present application, the fifth obtaining module 905 may instead convert the pixel positions of the fire line into GPS coordinates as follows: acquire DEM geographic information of the fire place; acquire GPS information, attitude information and built-in parameters of the unmanned aerial vehicle; generate a virtual view of the unmanned aerial vehicle point location from these; simulate the actual unmanned aerial vehicle imaging process from the virtual view to obtain a simulation image; determine the pixel coordinates of the fire line in the simulation image from the pixel positions of the fire line; and convert the pixel coordinates of the fire line in the simulation image into GPS coordinates to obtain the fire line observation position information at time K.
In some embodiments of the present application, the determining module 906 is specifically configured to: calculate the deviation between the fire line predicted position information and the fire line observation position information at time K; judge whether the deviation converges within the target range; if not, judge whether the number of iterations of the fire spreading model is smaller than the maximum number of iterations; if it is smaller, judge that parameter adjustment of the forest fire spreading model is needed; and if the deviation converges within the target range and/or the number of iterations is greater than or equal to the maximum, stop parameter adjustment of the forest fire spreading model.
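The decision logic of module 906 reduces to a small predicate; the sketch below is a direct transcription of the conditions just described, with the deviation norm, target range and iteration cap as assumed scalar inputs.

```python
def needs_parameter_adjustment(deviation_norm, n_iter,
                               target_range, max_iter):
    """Judging module 906: adjust only while the deviation has not
    converged and the iteration budget is not exhausted."""
    if deviation_norm <= target_range:
        return False   # converged within the target range: stop
    if n_iter >= max_iter:
        return False   # iteration cap reached: stop adjusting
    return True        # keep adjusting the model parameters
```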
In some embodiments of the present application, the adjusting module 907 adjusts the model parameters and recalculates the fire line predicted position information at time K as follows: calculate the deviation between the fire line predicted position information and the fire line observation position information at time K; adjust the forest fire spreading speed at time K-1 according to the preset forest fire spreading speed update coefficient matrix and the deviation; and input the adjusted forest fire spreading speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens fluctuation model to obtain the fire line predicted position information at time K again.
In some embodiments of the present application, the adjusting module 907 adjusts the forest fire spreading speed at time K-1 by multiplying the forest fire spreading speed update coefficient matrix by the deviation and adding the product to the forest fire spreading speed at time K-1.
In some embodiments of the present application, the data assimilation module 908 calculates the fire line state analysis value at time K by performing, based on the ensemble Kalman filter algorithm, a least squares fit of the recalculated fire line predicted position information at time K and the fire line observation position information.
It should be noted that the foregoing explanation of the method for assimilating forest fire spreading data based on the unmanned aerial vehicle video is also applicable to the device of this embodiment; the implementation principle is similar and is not repeated here.
In summary, the forest fire spreading data assimilation device based on the unmanned aerial vehicle video acquires the meteorological data and basic geographic information data of the fire place; acquires the fire line state analysis value at time K-1 of the fire place; inputs them into the forest fire spreading model to obtain the fire line predicted position information at time K; acquires the thermal imaging video of the fire scene area shot by the unmanned aerial vehicle; acquires the fire line observation position information at time K from the video; judges whether parameter adjustment of the forest fire spreading model is needed according to the fire line predicted position information and observation position information at time K; when needed, adjusts the model parameters according to the fire line predicted position information and observation position information at time K and recalculates the fire line predicted position information at time K with the adjusted model; and calculates the fire line state analysis value at time K from the recalculated predicted position information and the observation position information. The device adopts the unmanned aerial vehicle as front-end monitoring equipment, extracts the fire line in real time, obtains its position information, and provides the forest fire spreading model with dynamically adjustable parameters through assimilation, effectively solving the problems that a simulation model cannot adjust dynamically to changes in the simulated environment, that the forest fire model is not adapted to unsteady states, and that changes of the environment cannot be transmitted in real time, and improving model prediction accuracy. The unmanned aerial vehicle has the advantages of high mobility and low cost and can return live video in real time, so that the update interval of the observed fire line reaches the minute or even second level. Continuously assimilating the forest fire spreading model with this data assimilation method improves the prediction accuracy of the burned area and provides objective fire scene information for forest fire fighting.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 10 is a block diagram of an electronic device for the method of forest fire spread data assimilation based on a drone video according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant as examples only and are not meant to limit the implementations of the present application described and/or claimed herein.
As shown in fig. 10, the electronic apparatus includes: one or more processors 1001, a memory 1002, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). Fig. 10 illustrates an example with one processor 1001.
The memory 1002 is a non-transitory computer readable storage medium provided herein, storing instructions executable by at least one processor to cause the at least one processor to perform the method of forest fire spread data assimilation based on the drone video provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the same method.
As a non-transitory computer readable storage medium, the memory 1002 may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as the program instructions/modules corresponding to the method for assimilating forest fire spread data based on a drone video in the embodiments of the present application (for example, the first obtaining module 901, the second obtaining module 902, the third obtaining module 903, the fourth obtaining module 904, the fifth obtaining module 905, the determining module 906, the adjusting module 907, and the data assimilation module 908 shown in fig. 9). The processor 1001 executes the various functional applications and data processing of the server by running the non-transitory software programs, instructions and modules stored in the memory 1002, thereby implementing the method for assimilating forest fire spread data based on the drone video in the above method embodiments.
The memory 1002 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created from use of an electronic device for forest fire spread data assimilation based on the drone video, and the like. Further, the memory 1002 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 1002 may optionally include memory remotely located from the processor 1001, which may be connected over a network to an electronic device for forest fire spread data assimilation based on drone video. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic equipment of the method for assimilating forest fire spread data based on the unmanned aerial vehicle video can further comprise: an input device 1003 and an output device 1004. The processor 1001, the memory 1002, the input device 1003, and the output device 1004 may be connected by a bus or other means, and the bus connection is exemplified in fig. 10.
The input device 1003 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, and the like. The output device 1004 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibrating motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light emitting diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device. These computer programs (also known as programs, software, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The Server may be a cloud Server, which is also called a cloud computing Server or a cloud host, and is a host product in a cloud computing service system, so as to solve the defects of high management difficulty and weak service extensibility in the traditional physical host and VPS (Virtual Private Server) service.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (12)

1. A forest fire spreading data assimilation method based on unmanned aerial vehicle video is characterized by comprising the following steps:
acquiring meteorological data and basic geographic information data of a fire place, and acquiring a fire line state analysis value of the fire place at time K-1;
inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 of the fire place into a forest fire spreading model to obtain fire line predicted position information at time K;
acquiring a fire scene area thermal imaging video shot by an unmanned aerial vehicle, and acquiring fire line observation position information at time K according to the fire scene area thermal imaging video;
judging whether parameter adjustment needs to be carried out on the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K;
if parameter adjustment is needed, adjusting model parameters of the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K, recalculating the fire line predicted position information at time K according to the forest fire spreading model with the adjusted model parameters, and calculating a fire line state analysis value at time K according to the recalculated fire line predicted position information and the fire line observation position information.
2. The method of claim 1, wherein the forest fire spreading model comprises a Rothermel model and a Huygens fluctuation model, and inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 of the fire place into the forest fire spreading model to acquire the fire line predicted position information at time K comprises:
inputting the meteorological data and the basic geographic information data of the fire place into the Rothermel model to obtain the forest fire spreading speed at time K-1;
inputting the forest fire spreading speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens fluctuation model to predict the fire line position and obtain the fire line predicted position information at time K.
3. The method according to claim 1, wherein acquiring the fire line observation position information at time K according to the fire scene area thermal imaging video comprises:
acquiring the thermal imaging of the fire scene area at time K from the fire scene area thermal imaging video;
determining temperature information corresponding to each pixel in the thermal imaging of the fire scene area;
extracting a fire field range from the thermal imaging of the fire scene area according to the temperature information corresponding to each pixel and a temperature threshold;
performing edge extraction on the fire field range in the thermal imaging of the fire scene area to obtain pixel positions of the fire line;
converting the pixel positions of the fire line into GPS coordinates of the fire line, and acquiring the fire line observation position information at time K.
4. The method according to claim 3, wherein the number of the unmanned aerial vehicles is at least one, and a plurality of pixel positions of the fire line are obtained from the fire scene area thermal imaging videos shot by the at least one unmanned aerial vehicle at a plurality of observation points; converting the pixel positions of the fire line into the GPS coordinates of the fire line to obtain the fire line observation position information at time K comprises:
performing coordinate conversion on the plurality of pixel positions of the fire line respectively to obtain a plurality of coordinate values of the fire line in a geographic coordinate system of the unmanned aerial vehicle;
calculating a plurality of observation altitude angle matrices and a plurality of azimuth angle matrices of the fire line according to the plurality of coordinate values of the fire line in the geographic coordinate system of the unmanned aerial vehicle;
performing Kalman filtering estimation of the fire line position according to the plurality of observation altitude angle matrices and the plurality of azimuth angle matrices to obtain a coordinate estimation value of the fire line;
converting the coordinate estimation value of the fire line into GPS coordinates to obtain the fire line observation position information at time K.
5. The method of claim 3, wherein converting the pixel positions of the fire line into the GPS coordinates of the fire line to obtain the fire line observation position information at time K comprises:
acquiring DEM geographic information of the fire place;
acquiring GPS information, attitude information and built-in parameters of the unmanned aerial vehicle;
generating a virtual view of the unmanned aerial vehicle point location according to the DEM geographic information and the GPS information, attitude information and built-in parameters of the unmanned aerial vehicle;
simulating an actual unmanned aerial vehicle imaging process according to the virtual view of the unmanned aerial vehicle point location to obtain a simulation image;
determining pixel coordinates of the fire line in the simulation image according to the pixel positions of the fire line;
converting the pixel coordinates of the fire line in the simulation image into GPS coordinates to obtain the fire line observation position information at time K.
6. The method according to claim 1, wherein judging whether parameter adjustment needs to be carried out on the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K comprises:
calculating the deviation between the fire line predicted position information and the fire line observation position information at time K;
judging whether the deviation converges within a target range;
if the deviation does not converge within the target range, judging whether the number of iterations of the fire spreading model is smaller than a maximum number of iterations;
if the number of iterations of the fire spreading model is smaller than the maximum number of iterations, judging that parameter adjustment needs to be carried out on the forest fire spreading model;
if the deviation converges within the target range and/or the number of iterations of the fire spreading model is greater than or equal to the maximum number of iterations, stopping parameter adjustment of the forest fire spreading model.
7. The method according to claim 2, wherein adjusting the model parameters of the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K, and recalculating the fire line predicted position information at time K according to the forest fire spreading model with the adjusted model parameters, comprises:
calculating the deviation between the fire line predicted position information and the fire line observation position information at time K;
adjusting the forest fire spreading speed at time K-1 according to a preset forest fire spreading speed update coefficient matrix and the deviation;
inputting the adjusted forest fire spreading speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens fluctuation model to obtain the fire line predicted position information at time K again.
8. The method of claim 7, wherein adjusting the forest fire spreading speed at time K-1 according to the preset forest fire spreading speed update coefficient matrix and the deviation comprises:
multiplying the forest fire spreading speed update coefficient matrix by the deviation, and adding the product to the forest fire spreading speed at time K-1 to obtain the adjusted forest fire spreading speed.
9. The method according to any one of claims 1 to 8, wherein calculating the fire line state analysis value at time K according to the recalculated fire line predicted position information and the fire line observation position information comprises:
performing, based on an ensemble Kalman filter algorithm, least squares fitting of the recalculated fire line predicted position information at time K and the fire line observation position information to obtain the fire line state analysis value at time K.
10. A forest fire spreading data assimilation device based on unmanned aerial vehicle video, characterized by comprising:
a first acquisition module, configured to acquire meteorological data and basic geographic information data of a fire place;
a second acquisition module, configured to acquire a fire line state analysis value at time K-1 of the fire place;
a third acquisition module, configured to input the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 of the fire place into a forest fire spreading model and acquire fire line predicted position information at time K;
a fourth acquisition module, configured to acquire a fire scene area thermal imaging video shot by an unmanned aerial vehicle;
a fifth acquisition module, configured to acquire fire line observation position information at time K according to the thermal imaging video of the fire scene area;
a judging module, configured to judge whether parameter adjustment needs to be carried out on the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K;
an adjusting module, configured to, when parameter adjustment of the forest fire spreading model is needed, adjust model parameters of the forest fire spreading model according to the fire line predicted position information and the fire line observation position information at time K, and recalculate the fire line predicted position information at time K according to the forest fire spreading model with the adjusted model parameters;
a data assimilation module, configured to calculate a fire line state analysis value at time K according to the recalculated fire line predicted position information at time K and the fire line observation position information.
11. An electronic device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for assimilating forest fire spreading data based on unmanned aerial vehicle video according to any one of claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the method for assimilating forest fire spread data based on drone video according to any one of claims 1 to 9.
CN202011367733.0A 2020-11-27 2020-11-27 Forest fire spread data assimilation method and device based on unmanned aerial vehicle video Active CN112464819B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011367733.0A CN112464819B (en) 2020-11-27 2020-11-27 Forest fire spread data assimilation method and device based on unmanned aerial vehicle video
PCT/CN2021/112848 WO2022110912A1 (en) 2020-11-27 2021-08-16 Unmanned aerial vehicle video-based forest fire spreading data assimilation method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011367733.0A CN112464819B (en) 2020-11-27 2020-11-27 Forest fire spread data assimilation method and device based on unmanned aerial vehicle video

Publications (2)

Publication Number Publication Date
CN112464819A true CN112464819A (en) 2021-03-09
CN112464819B CN112464819B (en) 2024-01-12

Family

ID=74809410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011367733.0A Active CN112464819B (en) 2020-11-27 2020-11-27 Forest fire spread data assimilation method and device based on unmanned aerial vehicle video

Country Status (2)

Country Link
CN (1) CN112464819B (en)
WO (1) WO2022110912A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112947264A (en) * 2021-04-21 2021-06-11 苏州希盟科技股份有限公司 Control method and device for dispenser, electronic equipment and medium
CN113554845A (en) * 2021-06-25 2021-10-26 东莞市鑫泰仪器仪表有限公司 Be used for forest fire prevention thermal imaging device
CN114495416A (en) * 2021-12-29 2022-05-13 北京辰安科技股份有限公司 Fire monitoring method and device based on unmanned aerial vehicle and terminal equipment
WO2022110912A1 (en) * 2020-11-27 2022-06-02 清华大学 Unmanned aerial vehicle video-based forest fire spreading data assimilation method and apparatus
CN115099493A (en) * 2022-06-27 2022-09-23 东北林业大学 CNN-based forest fire spreading rate prediction method in any direction
CN115518316A (en) * 2022-09-20 2022-12-27 珠海安擎科技有限公司 Wisdom fire extinguishing system based on unmanned aerial vehicle, cloud platform and AR glasses interconnection
CN115661245A (en) * 2022-10-24 2023-01-31 东北林业大学 Large-scale live wire instantaneous positioning method based on unmanned aerial vehicle
CN117745536A (en) * 2023-12-25 2024-03-22 东北林业大学 Forest fire large-scale live wire splicing method and system based on multiple unmanned aerial vehicles

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115671617A (en) * 2022-11-03 2023-02-03 国网冀北电力有限公司超高压分公司 Fire positioning method, device, equipment and storage medium for flexible direct current converter station
CN115660428A (en) * 2022-11-10 2023-01-31 四川省林业和草原调查规划院(四川省林业和草原生态环境监测中心) Forest and grassland fire risk assessment system based on geographic information
CN116415712A (en) * 2023-02-14 2023-07-11 武汉大学 Fire spread prediction method and system based on multiple data sources
CN116952081B (en) * 2023-07-26 2024-04-16 武汉巨合科技有限公司 Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb
CN117152592B (en) * 2023-10-26 2024-01-30 青岛澳西智能科技有限公司 Building information and fire information visualization system and method
CN117163302B (en) * 2023-10-31 2024-01-23 安胜(天津)飞行模拟系统有限公司 Aircraft instrument display method, device, equipment and storage medium
CN117689520B (en) * 2024-02-01 2024-05-10 青岛山科智汇信息科技有限公司 Grassland fire extinguishing bomb coverage capability evaluation method, medium and system
CN118015198B (en) * 2024-04-09 2024-06-18 电子科技大学 Pipe gallery fire risk assessment method based on convolutional neural network image processing

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102819926A (en) * 2012-08-24 2012-12-12 华南农业大学 Fire monitoring and warning method on basis of unmanned aerial vehicle
KR20170101516A (en) * 2016-02-29 2017-09-06 한국전자통신연구원 Apparatus and method for fire monitoring using unmanned aerial vehicle
CN108763811A (en) * 2018-06-08 2018-11-06 中国科学技术大学 Dynamic data drives forest fire appealing prediction technique
CN109472421A (en) * 2018-11-22 2019-03-15 广东电网有限责任公司 A kind of power grid mountain fire sprawling method for early warning and device
CN109871613A (en) * 2019-02-18 2019-06-11 南京林业大学 A kind of forest fire discrimination model acquisition methods and prediction application
CN110390135A (en) * 2019-06-17 2019-10-29 北京中科锐景科技有限公司 A method of improving forest fire appealing precision of prediction

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202976376U (en) * 2012-11-22 2013-06-05 华南农业大学 Forest fire monitoring and emergency command system based unmanned aerial vehicle
CN106021666B (en) * 2016-05-10 2019-03-12 四川大学 A kind of mountain fire disaster alarm method of overhead transmission line
US9977963B1 (en) * 2017-03-03 2018-05-22 Northrop Grumman Systems Corporation UAVs for tracking the growth of large-area wildland fires
CN112307884B (en) * 2020-08-19 2024-06-25 航天图景(北京)科技有限公司 Forest fire spreading prediction method based on continuous time sequence remote sensing situation data and electronic equipment
CN112464819B (en) * 2020-11-27 2024-01-12 清华大学 Forest fire spread data assimilation method and device based on unmanned aerial vehicle video

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102819926A (en) * 2012-08-24 2012-12-12 华南农业大学 Fire monitoring and early warning method based on unmanned aerial vehicle
KR20170101516A (en) * 2016-02-29 2017-09-06 한국전자통신연구원 Apparatus and method for fire monitoring using unmanned aerial vehicle
CN108763811A (en) * 2018-06-08 2018-11-06 中国科学技术大学 Dynamic-data-driven forest fire spreading prediction method
CN109472421A (en) * 2018-11-22 2019-03-15 广东电网有限责任公司 Power grid mountain fire spreading early warning method and device
CN109871613A (en) * 2019-02-18 2019-06-11 南京林业大学 Forest fire discrimination model acquisition method and prediction application
CN110390135A (en) * 2019-06-17 2019-10-29 北京中科锐景科技有限公司 Method for improving forest fire spreading prediction accuracy

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022110912A1 (en) * 2020-11-27 2022-06-02 清华大学 Unmanned aerial vehicle video-based forest fire spreading data assimilation method and apparatus
CN112947264A (en) * 2021-04-21 2021-06-11 苏州希盟科技股份有限公司 Control method and device for dispenser, electronic equipment and medium
CN113554845A (en) * 2021-06-25 2021-10-26 东莞市鑫泰仪器仪表有限公司 Thermal imaging device for forest fire prevention
CN114495416A (en) * 2021-12-29 2022-05-13 北京辰安科技股份有限公司 Fire monitoring method and device based on unmanned aerial vehicle and terminal equipment
CN115099493A (en) * 2022-06-27 2022-09-23 东北林业大学 CNN-based prediction method for forest fire spreading rate in any direction
CN115099493B (en) * 2022-06-27 2023-11-10 东北林业大学 CNN-based prediction method for forest fire spreading rate in any direction
CN115518316A (en) * 2022-09-20 2022-12-27 珠海安擎科技有限公司 Intelligent fire extinguishing system based on interconnection of unmanned aerial vehicle, cloud platform and AR glasses
CN115518316B (en) * 2022-09-20 2024-02-20 珠海安擎科技有限公司 Intelligent fire protection system based on interconnection of unmanned aerial vehicle, cloud platform and AR glasses
CN115661245A (en) * 2022-10-24 2023-01-31 东北林业大学 Instantaneous positioning method for large-scale fire lines based on unmanned aerial vehicle
CN117745536A (en) * 2023-12-25 2024-03-22 东北林业大学 Large-scale forest fire line stitching method and system based on multiple unmanned aerial vehicles
CN117745536B (en) * 2023-12-25 2024-06-11 东北林业大学 Large-scale forest fire line stitching method and system based on multiple unmanned aerial vehicles

Also Published As

Publication number Publication date
WO2022110912A1 (en) 2022-06-02
CN112464819B (en) 2024-01-12

Similar Documents

Publication Publication Date Title
CN112464819B (en) Forest fire spread data assimilation method and device based on unmanned aerial vehicle video
US11029211B2 (en) Unmanned aerial system based thermal imaging systems and methods
CN107504957B (en) Method for rapidly constructing three-dimensional terrain model by using unmanned aerial vehicle multi-view camera shooting
US6674391B2 (en) System and method of simulated image reconstruction
Stipaničev et al. Advanced automatic wildfire surveillance and monitoring network
EP3309513A1 (en) Three-dimensional topographic mapping system and mapping method
CN106683039B (en) System for generating fire situation map
WO2023125587A1 (en) Fire monitoring method and apparatus based on unmanned aerial vehicle
Lauterbach et al. The Eins3D project—Instantaneous UAV-based 3D mapping for Search and Rescue applications
JP2020008802A (en) Three-dimensional map generation device and three-dimensional map generation method
Zhang et al. Forest fire detection solution based on UAV aerial data
Qiao et al. Ground target geolocation based on digital elevation model for airborne wide-area reconnaissance system
US20150009326A1 (en) Photographing plan creation device and program and method for the same
Bradley et al. Georeferenced mosaics for tracking fires using unmanned miniature air vehicles
KR20210104033A (en) Planning method, apparatus, control terminal and storage medium of survey and mapping sample points
Matelenok et al. Influence of the canopy structure of a birch forest on the visibility of the fires below
CN115493598B (en) Target positioning method and device in motion process and storage medium
Li et al. Machine learning based tool chain solution for free space optical communication (FSOC) propagation modeling
Stødle et al. High-performance visualisation of UAV sensor and image data with raster maps and topography in 3D
Li et al. Prediction of wheat gains with imagery from four-rotor UAV
Ma et al. Low‐Altitude Photogrammetry and Remote Sensing in UAV for Improving Mapping Accuracy
Guo et al. A new UAV PTZ Controlling System with Target Localization
CN111798448A (en) Method, apparatus, device and storage medium for processing image
Salamí et al. Real-time data processing for the airborne detection of hot spots
La Salandra et al. Application of UAV system and SfM techniques to develop high-resolution terrain models

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant