WO2022110912A1 - Method and apparatus for forest fire spread data assimilation based on onboard unmanned aerial vehicle video - Google Patents

Method and apparatus for forest fire spread data assimilation based on onboard unmanned aerial vehicle video

Info

Publication number
WO2022110912A1
Authority
WO
WIPO (PCT)
Prior art keywords
fire
time
line
position information
model
Prior art date
Application number
PCT/CN2021/112848
Other languages
English (en)
Chinese (zh)
Inventor
陈涛
黄丽达
孙占辉
袁宏永
刘春慧
王晓萌
白硕
张立凡
王镜闲
Original Assignee
清华大学
北京辰安科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 清华大学, 北京辰安科技股份有限公司
Publication of WO2022110912A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/188 - Vegetation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 - Geographical information databases
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00 - Fire alarms; Alarms responsive to explosion
    • G08B17/12 - Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125 - Actuation by presence of radiation or particles by using a video camera to detect fire or smoke
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 - Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/28 - Adaptation technologies in agriculture specially adapted for farming

Definitions

  • the present application relates to the field of data processing, in particular to a method, device, electronic device and storage medium for forest fire spread data assimilation based on drone video; it belongs to the application field of data assimilation.
  • Forest fires are difficult to control; they destroy forest ecosystems, cause environmental pollution, and threaten the safety of human life and property.
  • emergency management workers urgently need to obtain front-line fire information and accurate forest fire spread predictions in a short time, so as to gain valuable time for emergency rescue work.
  • Geostationary satellites have high orbits, wide coverage, and high observation frequency, but relatively low spatial resolution: the fire fields they can detect are generally on the kilometer scale, so small fire fields are difficult to monitor effectively. Compared with geostationary satellites, polar-orbiting satellites collect remote sensing images with higher spatial resolution but lower observation frequency.
  • the present application aims to solve one of the technical problems in the related art at least to a certain extent.
  • the present application proposes a method, device, electronic device, and computer-readable storage medium for forest fire spread data assimilation based on drone video, which can improve the accuracy of the forest fire spread model and enable high-accuracy prediction of fire information, so as to provide objective fire information for forest fire fighting.
  • a method for assimilating forest fire spread data based on drone video, including: acquiring meteorological data and basic geographic information data of the fire occurrence place, and obtaining the fire line state analysis value at time K-1 for the fire occurrence place; inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model to obtain the fire line predicted position information at time K.
  • the forest fire spread model includes a Rothermel model and a Huygens wave model; inputting the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model to obtain the fire line predicted position information at time K includes: inputting the meteorological data and the basic geographic information data of the fire occurrence place into the Rothermel model to obtain the forest fire spread speed at time K-1; and inputting the forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model for fire line position prediction to obtain the fire line predicted position information at time K.
  • obtaining the fire line observation position information at time K according to the thermal imaging video of the fire field area includes: obtaining the thermal image of the fire field area at time K from the thermal imaging video; determining the temperature information corresponding to each pixel in the thermal image; extracting the fire field range from the thermal image according to the temperature information and a temperature threshold; performing edge extraction on the fire field range to obtain the pixel positions of the fire line; and converting the pixel positions of the fire line into the GPS (Global Positioning System) coordinates of the fire line to obtain the fire line observation position information at time K.
  • the number of unmanned aerial vehicles is at least one, and multiple pixel positions of the fire line are obtained from the thermal imaging videos of the fire field area shot by the at least one unmanned aerial vehicle at multiple observation points; converting the pixel positions into the GPS coordinates of the fire line to obtain the fire line observation position information at time K includes: performing coordinate conversion on the multiple pixel positions of the fire line respectively to obtain multiple coordinate values of the fire line in the UAV geographic coordinate system; calculating multiple observation elevation angle matrices and multiple azimuth angle matrices of the fire line from the multiple coordinate values; performing Kalman filter estimation on the fire line position according to the multiple observation elevation angle matrices and azimuth angle matrices to obtain the estimated coordinate values of the fire line; and converting the estimated coordinate values through GPS coordinates to obtain the fire line observation position information at time K.
  • converting the pixel positions of the fire line into the GPS coordinates of the fire line to obtain the fire line observation position information at time K includes: obtaining the DEM (Digital Elevation Model) geographic information of the fire occurrence place; obtaining the GPS information, attitude information and built-in parameters of the unmanned aerial vehicle; generating a virtual perspective of the drone point according to the DEM geographic information and the GPS information, attitude information and built-in parameters of the drone; simulating the actual UAV imaging process according to the virtual perspective of the drone point to obtain a simulated image; determining the pixel coordinates of the fire line in the simulated image according to the pixel positions of the fire line; and converting the pixel coordinates of the fire line in the simulated image through GPS coordinates to obtain the fire line observation position information at time K.
  • judging whether parameter adjustment of the forest fire spread model needs to be performed according to the fire line predicted position information and the fire line observation position information at time K includes: calculating the deviation between the fire line predicted position information at time K and the fire line observation position information; judging whether the deviation converges within the target range; if the deviation does not converge within the target range, judging whether the number of iterations of the fire spread model is less than the maximum number of iterations; if the number of iterations is less than the maximum number of iterations, determining that parameter adjustment of the forest fire spread model is required; and if the deviation converges within the target range, and/or the number of iterations is greater than or equal to the maximum number of iterations, stopping the parameter adjustment of the forest fire spread model.
  • adjusting the model parameters of the forest fire spread model according to the fire line predicted position information and the fire line observation position information at time K, and recalculating the fire line predicted position information at time K according to the forest fire spread model with the adjusted model parameters, includes: calculating the deviation between the fire line predicted position information at time K and the fire line observation position information; adjusting the forest fire spread speed at time K-1 according to a preset forest fire spread speed update coefficient matrix and the deviation; and inputting the adjusted forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to obtain the fire line predicted position information at time K again.
  • adjusting the forest fire spread rate at time K-1 according to the preset forest fire spread rate update coefficient matrix and the deviation includes: multiplying the forest fire spread rate update coefficient matrix by the deviation, and adding the obtained product to the forest fire spread rate at time K-1 to obtain the adjusted rate.
  • calculating the fire line state analysis value at time K according to the recalculated fire line predicted position information at time K and the fire line observation position information includes: performing least squares fitting between the recalculated fire line predicted position information at time K and the fire line observation position information based on the ensemble Kalman filter algorithm to obtain the fire line state analysis value at time K.
  • a device for assimilating forest fire spread data based on drone video, comprising: a first acquisition module, used to acquire meteorological data and basic geographic information data of the fire occurrence place; a second acquisition module, used to obtain the fire line state analysis value at time K-1 of the fire occurrence place; a third acquisition module, used to input the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model to obtain the fire line predicted position information at time K;
  • a fourth acquisition module, used to obtain the thermal imaging video of the fire field area based on drone shooting;
  • a fifth acquisition module, used to obtain the fire line observation position information at time K according to the thermal imaging video of the fire field area; a judgment module, used to judge whether parameter adjustment of the forest fire spread model is needed according to the fire line predicted position information and the fire line observation position information at time K;
  • an adjustment module, used to adjust the model parameters of the forest fire spread model according to the fire line predicted position information and the fire line observation position information at time K when the parameters need to be adjusted, and to recalculate the fire line predicted position information at time K according to the forest fire spread model with the adjusted parameters;
  • a data assimilation module, used to calculate the fire line state analysis value at time K according to the recalculated fire line predicted position information at time K and the fire line observation position information.
  • the forest fire spread model includes a Rothermel model and a Huygens wave model
  • the third acquisition module is specifically configured to: input the meteorological data and the basic geographic information data of the fire occurrence place into the Rothermel model to obtain the forest fire spread speed at time K-1; and input the forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model for fire line position prediction to obtain the fire line predicted position information at time K.
  • the fifth acquisition module is specifically configured to: acquire the thermal image of the fire field area at time K from the thermal imaging video; determine the temperature information corresponding to each pixel in the thermal image; extract the fire field range from the thermal image according to the temperature information and a temperature threshold; perform edge extraction on the fire field range to obtain the pixel positions of the fire line; and convert the pixel positions of the fire line into the GPS coordinates of the fire line to obtain the fire line observation position information at time K.
  • the number of unmanned aerial vehicles is at least one, and the fifth acquisition module is specifically configured to obtain multiple pixel positions of the fire line from the thermal imaging videos of the fire field area captured by the at least one unmanned aerial vehicle at multiple observation points.
  • the fifth acquisition module is specifically configured to: perform coordinate conversion on the multiple pixel positions of the fire line respectively, to obtain multiple coordinate values of the fire line in the UAV geographic coordinate system; calculate multiple observation elevation angle matrices and multiple azimuth angle matrices of the fire line from the multiple coordinate values; perform Kalman filter estimation on the fire line position according to these matrices to obtain the estimated coordinate values of the fire line; and convert the estimated coordinate values through GPS coordinates to obtain the fire line observation position information at time K.
  • the fifth acquisition module is specifically configured to: obtain the DEM geographic information of the fire occurrence place; obtain the GPS information, attitude information and built-in parameters of the UAV; generate a virtual perspective of the drone point according to the DEM geographic information and the GPS information, attitude information and built-in parameters of the drone; simulate the actual drone imaging process according to the virtual perspective to obtain a simulated image; determine the pixel coordinates of the fire line in the simulated image; and convert the pixel coordinates of the fire line in the simulated image through GPS coordinates to obtain the fire line observation position information at time K.
  • the judgment module is specifically configured to: calculate the deviation between the fire line predicted position information at time K and the fire line observation position information; judge whether the deviation converges within the target range; if the deviation does not converge within the target range, determine whether the number of iterations of the fire spread model is less than the maximum number of iterations; if it is less, determine that parameter adjustment of the model is needed; and if the deviation converges within the target range, and/or the number of iterations is greater than or equal to the maximum number of iterations, stop adjusting the parameters of the forest fire spread model.
  • the adjustment module is specifically configured to: calculate the deviation between the fire line predicted position information at time K and the fire line observation position information; adjust the forest fire spread speed at time K-1 according to the preset forest fire spread speed update coefficient matrix and the deviation; and input the adjusted forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to obtain the fire line predicted position information at time K again.
  • the adjustment module is specifically configured to: multiply the forest fire spread speed update coefficient matrix by the deviation, and add the obtained product to the forest fire spread speed at time K-1 to obtain the adjusted speed.
  • the data assimilation module is specifically configured to: perform least squares fitting on the recalculated fire line predicted position information at time K and the fire line observation position information based on the ensemble Kalman filter algorithm to obtain the fire line state analysis value at time K.
  • an electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the computer program, the method for assimilating forest fire spread data based on drone video described in the embodiment of the first aspect of the present application is implemented.
  • a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method for assimilating forest fire spread data based on drone video according to the embodiment of the first aspect of the present application is implemented.
  • the fire point positioning technology based on the UAV thermal imaging video is combined: the fire line position is identified through intelligent analysis of the thermal imaging video, the parameters of the forest fire spread model are corrected in real time using the observed fire line position, and the model is iterated dynamically to realize the data assimilation process of the forest fire spread model. This can effectively solve the problem that the fire line cannot be obtained in real time and the model parameters cannot be corrected in time, which would otherwise make the accuracy of the prediction results impossible to guarantee.
  • the drone used in this application can move and shoot quickly, cover a large area, and transmit video rapidly.
  • this application proposes a multi-convergent ensemble Kalman filter data assimilation method for the unsteady meteorological conditions in the forest fire area: while the observed fire line position corrects the parameters of the forest fire spread model in real time, the forest fire spread speed is iterated dynamically, effectively improving the accuracy of the forest fire spread model.
  • FIG. 1 is a schematic flowchart of a method for assimilating forest fire spread data based on drone video according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a method for assimilating forest fire spread data based on drone video according to an embodiment of the present application
  • FIG. 3 is a schematic flowchart of a method for assimilating forest fire spread data based on drone video according to another embodiment of the present application;
  • Fig. 4 is a flow chart of obtaining the fire line predicted position information at time K according to an embodiment of the present application
  • FIG. 5 is a schematic flowchart of obtaining the fire line observation position information at time K according to another embodiment of the present application.
  • FIG. 6 is a schematic diagram of obtaining three-dimensional coordinate information of a target by multiple drones according to an embodiment of the present application
  • FIG. 7 is a schematic flowchart of a method for assimilating forest fire spread data based on drone video according to an embodiment of the present application
  • FIG. 8 is a flowchart of obtaining the fire line predicted position information according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a device for assimilating forest fire spread data based on drone video according to an embodiment of the present application.
  • FIG. 1 is a flowchart of a method for assimilating forest fire spread data based on drone video according to an embodiment of the present application.
  • the forest fire spread data assimilation method in the embodiments of the present application can be applied to the forest fire spread data assimilation device based on drone video provided in the present application, and the device can be implemented by software and/or hardware.
  • the device can be integrated into electronic equipment.
  • the forest fire spread data assimilation method includes the following steps:
  • Step 101 Obtain the meteorological data and basic geographic information data of the fire occurrence place, and obtain the fire line state analysis value at time K-1 of the fire occurrence place.
  • the meteorological data may include, but not limited to, any one or more of wind speed, wind direction, air temperature, precipitation probability, precipitation amount, air pressure, air humidity, air oxygen content, and the like.
  • the meteorological data may include wind speed and direction.
  • the basic geographic information data may include, but are not limited to, any one or more of the underlying surface type, forest moisture content, forest slope map, slope aspect, forest combustible materials, the physical and chemical properties of forest combustible materials, and the like.
  • the physical and chemical properties may include, but are not limited to, any one or more of density, ignition point, calorific value, flammability, and the like.
  • the basic geographic information data may include underlying surface type, forest moisture content, forest slope map, slope aspect, and forest combustible matter.
  • the method for obtaining the fire line state analysis value at time K-1 at the fire occurrence place can be as follows: the meteorological data, the basic geographic information data and the fire line state analysis value at time K-2 are input into the forest fire spread model to obtain the fire line predicted position information at time K-1; the thermal imaging video of the fire field area is then obtained based on the drone, and the fire line observation position information at time K-1 is obtained from this video; according to the fire line predicted position information and the observation position information, it is determined whether the parameters of the forest fire spread model need to be adjusted.
  • time K represents a certain time point when the forest fire is burning
  • time K-1 represents a time point corresponding to one time step back from the time point
  • time K-2 represents the time point two time steps back from that time point, and so on.
  • when calculating the fire line state analysis value at time K-1 at the fire occurrence place, the position of the fire line at time K-1 can be predicted with the forest fire spread model based on the fire line state analysis value at the previous time (i.e., time K-2), the meteorological data of the fire place, and the basic geographic information data, obtaining the fire line predicted position at time K-1.
  • if the comparison with the fire line observation position information determines that the parameters of the fire spread model need to be dynamically adjusted, the parameters are adjusted, and the fire line state analysis value at time K-1 is calculated from the recalculated fire line predicted position information at time K-1 and the fire line observation position information at time K-1.
  • otherwise, the fire line state analysis value at time K-1 is calculated directly from the fire line predicted position information at time K-1 obtained in the first prediction and the fire line observation position information.
  • the initial fire line state analysis value at the place where the fire occurred may be obtained as follows: the correspondence between meteorology, basic geographic information and fire line state analysis values can be obtained in advance through multiple simulation tests; in this way, the initial fire line state analysis value at the fire occurrence place can be obtained from this correspondence together with the meteorological data and the basic geographic information data of the fire occurrence place. That is to say, when a fire occurs in a certain place, the initial fire line state analysis value can be predicted using the empirical values obtained from the simulation tests.
  • Step 102 Input the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model, and obtain the fire line predicted position information at time K for the fire occurrence place.
  • the forest fire spread model includes: the Rothermel model, the Huygens wave model, a model combining the Rothermel model and the Huygens wave model, the McArthur model, and other models capable of simulating the spread of a forest fire from input information.
  • the forest fire spread model includes: the Rothermel model and the Huygens wave model.
  • the meteorological data of the fire occurrence place, the basic geographic information data and the fire line state analysis value at time K-1 are input into the forest fire spread model; the specific implementation process of obtaining the fire line predicted position information at time K may be as follows: input the meteorological data and basic geographic information data of the fire location into the Rothermel model to obtain the forest fire spread speed at time K-1, then input the forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model to predict the fire line position and obtain the fire line predicted position at time K.
  • a specific implementation of step 102 may be as follows: the forest fire spread speed $R_0$ of each fire point is obtained through the Rothermel model. Each fire point is regarded as a point on the wave front and serves as the next wave source (i.e., a secondary wave source) from which the wave continues to propagate; this yields the fire line predicted position at the next time step K:

$$X_k^f = \begin{bmatrix} e_1 & e_2 & \cdots & e_m \\ n_1 & n_2 & \cdots & n_m \end{bmatrix}$$

  • here the subscript $k$ represents time, the superscript denotes the matrix state of the fire line with $f$ denoting the prediction matrix, so $X_k^f$ represents the fire line predicted position at time $k$; $(e_j, n_j)$ are the coordinates of point $j$ on the fire line, and $m$ is the number of marked points on the fire line perimeter.
  • the formulas of the forest fire spread model are expressed as follows:

$$R_0 = \frac{I_R\,\xi\,(1 + \phi_{sw})}{\rho_b\,\varepsilon\,Q_{ig}} \qquad (1)$$

$$X_k^f = H\!\left(X_{k-1}^a,\; R_{0,k-1}\right) \qquad (2)$$

  • formula (1) represents the Rothermel model: $R_0$ is the fire spread rate of a certain fire point, $I_R$ is the reaction intensity, $\xi$ is the propagation rate, $\rho_b$ is the density of combustibles, $\varepsilon$ is the effective heat coefficient, $Q_{ig}$ is the heat required to ignite a unit mass of combustibles, and $\phi_{sw}$ is the wind speed and slope correction coefficient;
  • formula (2) represents the Huygens wave model: $H$ denotes the Huygens propagation operator, the superscript $a$ denotes the state analysis matrix of the ensemble of predicted and observed fire lines, and $X_{k-1}^a$ is the state analysis matrix of the model at the previous time step K-1.
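  • as a concrete illustration of formulas (1) and (2), the following Python sketch computes a Rothermel spread rate and advances a set of marked fire line points by one time step. It substitutes a simplified outward-normal advance for the full elliptical Huygens template, and all function names and numeric values are illustrative assumptions rather than values taken from the patent:

```python
import numpy as np

def rothermel_speed(I_R, xi, phi_sw, rho_b, eps, Q_ig):
    """Formula (1): spread rate R0 at one fire point."""
    return (I_R * xi * (1.0 + phi_sw)) / (rho_b * eps * Q_ig)

def huygens_step(fireline, speeds, normals, dt):
    """Formula (2), simplified: advance each marked point (e_j, n_j) on the
    fire line perimeter along its outward normal, treating every point as a
    secondary wave source (Huygens principle)."""
    return fireline + (speeds[:, None] * dt) * normals

# Illustrative use: a circular fire line of m marked points expanding uniformly.
m = 16
theta = np.linspace(0.0, 2.0 * np.pi, m, endpoint=False)
X_a = np.stack([100.0 * np.cos(theta), 100.0 * np.sin(theta)], axis=1)  # X_{k-1}^a
normals = X_a / np.linalg.norm(X_a, axis=1, keepdims=True)              # outward unit normals
R0 = rothermel_speed(I_R=1500.0, xi=0.04, phi_sw=2.0, rho_b=40.0, eps=0.01, Q_ig=2500.0)
X_f = huygens_step(X_a, np.full(m, R0), normals, dt=60.0)               # X_k^f
```

  • in the full model, the advance direction and distance of each secondary wave source would additionally depend on the wind- and slope-elongated ellipse defined by the local spread conditions, rather than on the outward normal alone.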
  • Step 103 Obtain the thermal imaging video of the fire field area based on UAV shooting, and obtain the fire line observation position information at time K according to the thermal imaging video of the fire field area.
  • the method for obtaining the thermal imaging video from the video shot by the drone includes, but is not limited to, infrared thermal imaging technology: all objects in nature emit infrared radiation due to the thermal motion of the molecules inside them as long as their temperature is above absolute zero (-273°C), and the wavelength of this radiation is inversely related to the temperature.
  • in infrared thermal imaging technology, the radiant energy level of the detected object is transformed through systematic processing into a thermal image (grayscale and/or pseudo-color) of the target object.
  • the UAV is equipped with such a thermal imager, and the pixel information of the thermal image reflects the temperature information of the shooting area; this temperature information is used to obtain the fire line observation position information at time K.
  • the thermal imager on the drone can send the thermal imaging video it captures to the forest fire spread data assimilation device through a communication connection, so that the device obtains the thermal imaging video of the fire field area from the thermal imager on the drone. That is to say, the drone carries a thermal imager to shoot the thermal imaging video of the fire field area, and communicates with the forest fire spread data assimilation device over the communication connection, from which the device obtains the video.
  • the manner used for the above-mentioned communication connection may be a mobile Internet manner, a wireless communication manner, or the like.
  • the mobile Internet can be a 3G (3rd generation mobile networks) network, a 4G (4th generation mobile networks) network, a 5G (5th generation mobile networks) network, etc.
  • wireless communication can be one of WIFI (Wireless Fidelity), digital wireless data transmission radio, UWB (Ultra Wide Band) transmission, Zigbee transmission, etc.
  • Step 104 According to the fire line predicted position information and the fire line observation position information at time K, determine whether parameter adjustment of the forest fire spread model needs to be performed.
  • the embodiment of the present application may implement forest fire spread data assimilation based on a multi-convergence ensemble Kalman filtering method.
  • the multi-convergent ensemble Kalman filter method first selects the state analysis fire line position at time K-1 and the fire spread speed $V_{k-1}$ as the state parameters to be corrected.
  • the Rothermel-Huygens model predicts, from the fire line state analysis value $X_{k-1}^a$ at time K-1, the fire line predicted position $X_k^f$ at time K; the fire line observation position at time K is $X_k^o$. Then the deviation between the fire line predicted position information $X_k^f$ and the fire line observation position information $X_k^o$ at time K is calculated, and whether to adjust the parameters of the forest fire spread model is decided according to the convergence of the deviation.
  • the specific implementation process of judging whether to adjust the parameters of the forest fire spread model according to the predicted position information of the fire line and the observed position information of the fire line at time K may include:
  • Step 201 Calculate the deviation between the fire line predicted position information and the fire line observation position information at time K.
  • the deviation is defined as follows:

$$\mathrm{Err}_h = X_k^o - X_k^f \qquad (3)$$

  • formula (3) above is used to calculate the deviation between the fire line predicted position information $X_k^f$ at time K and the fire line observation position $X_k^o$ at time K.
  • Step 202 Judge whether the deviation converges within the target range.
  • Step 203 If the deviation does not converge within the target range, determine whether the number of iterations of the forest fire spread model is less than the maximum number of iterations.
  • a variable may be set to record the number of iterations of the forest fire spread model. The maximum number of iterations may be a manually set constant, recorded in the system in advance or given a recommended value; in actual operation it can be adjusted dynamically according to experience or field conditions. By comparing the recorded number of iterations with the set maximum, the relative relationship between the two is obtained.
  • here $N_{\mathrm{iteration}}$ is the maximum number of iterations and $h$ is the current number of iterations of the forest fire spread model; the condition $h < N_{\mathrm{iteration}}$ is checked after it has been determined that the deviation between the fire line predicted position information $X_k^f$ and the fire line observation position $X_k^o$ at time K does not converge within the target range.
  • Step 204 If the number of iterations of the forest fire spread model is less than the maximum number of iterations, it is determined that the parameters of the forest fire spread model need to be adjusted.
  • Step 205 If the deviation converges within the target range, and/or the number of iterations of the fire spread model is greater than or equal to the maximum number of iterations, stop adjusting the parameters of the forest fire spread model. It can be seen that, through the above steps 201 to 205, under unsteady meteorological conditions in the forest fire area, the multi-convergent ensemble Kalman filter data assimilation method can correct fire spread parameters such as the fire line position in real time; dynamically iterating the forest fire spread speed effectively improves the accuracy of the forest fire spread model.
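  • a minimal sketch of the control flow of steps 201 to 205, assuming the prediction, observation and parameter-adjustment routines are supplied as callables; the tolerance test stands in for the convergence-within-target-range judgment, and all names are illustrative:

```python
import numpy as np

def assimilation_loop(predict, observe, adjust_parameters, tolerance, max_iterations):
    """Steps 201-205: iterate parameter adjustment until the deviation between
    the predicted and observed fire line positions converges, or the iteration
    count reaches N_iteration."""
    X_f = predict()                 # fire line predicted position at time K
    X_o = observe()                 # fire line observed position at time K
    h = 0                           # current iteration count
    while True:
        err = X_o - X_f             # formula (3): deviation Err_h
        if np.linalg.norm(err) <= tolerance:
            break                   # step 205: deviation converged, stop adjusting
        if h >= max_iterations:
            break                   # step 205: h >= N_iteration, stop adjusting
        adjust_parameters(err)      # step 204: model parameters need adjustment
        X_f = predict()             # recompute the prediction with new parameters
        h += 1
    return X_f
```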
  • Step 105 If it is necessary to adjust the parameters of the forest fire spread model, adjust the model parameters according to the fire line predicted position information and the fire line observation position information at time K, recalculate the fire line predicted position information at time K with the adjusted model, and calculate the fire line state analysis value at time K from the recalculated fire line predicted position information and the fire line observation position information.
  • the specific implementation process of calculating the fire line state analysis value at time K from the recalculated fire line predicted position information and the fire line observation position information may be as follows: least squares fitting is performed between the recalculated fire line predicted position information at time K and the fire line observation position information to obtain the fire line state analysis value at time K.
  • the state analysis fire line position $X_k^a$ is obtained by the least squares fitting calculation, i.e., the state analysis matrix whose error with respect to the true fire line position is minimal. The steps for calculating the fire line state analysis value $X_k^a$ are as follows: the ensemble mean of the prediction matrix is

$$\bar{X}_k^f = X_k^f\,\mathbf{1}_N$$

  • where $N$ is the number of elements in the state variable ensemble and $\mathbf{1}_N$ is an $N \times N$ matrix whose entries are all $1/N$; every column of $\bar{X}_k^f$ is the vector of means over the ensemble members (columns) of the prediction matrix $X_k^f$.
  • from the UAV observation, the observation vector $y^o$ can be obtained, and perturbations are added to it to generate an observation matrix containing $N$ observation vectors. The process of adding the perturbations is as follows:

$$Y^o = \left[\,y^o + \epsilon_1,\; y^o + \epsilon_2,\; \dots,\; y^o + \epsilon_N\,\right] \in \mathbb{R}^{m \times N}$$

  • the perturbed observation vectors form the observation matrix, where $\mathbb{R}^{m \times N}$ is the domain of $Y^o$, meaning that $Y^o$ has $m$ rows and $N$ columns. The added perturbations can be stored in a matrix

$$E = \left[\,\epsilon_1,\; \epsilon_2,\; \dots,\; \epsilon_N\,\right]$$

  • and the ensemble observation error covariance matrix can be expressed as

$$R_e = \frac{E\,E^{\mathrm{T}}}{N-1}$$
  • H is the observation operator, which maps X from the state space to the observation space.
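  • the definitions above (the mean matrix $\mathbf{1}_N$, the perturbed observation matrix $Y^o$, the perturbation matrix $E$, the covariance $R_e$ and the observation operator $H$) match the standard stochastic ensemble Kalman filter, so the analysis step can be sketched as below. The Kalman-gain form of the update is the textbook one and is an assumption here; the patent's multi-convergent variant may differ in detail:

```python
import numpy as np

def enkf_analysis(X_f, y_o, H, obs_std, rng):
    """Standard stochastic EnKF analysis step.
    X_f : (n, N) prediction ensemble, one fire line state vector per column
    y_o : (m,) observation vector extracted from the UAV video
    H   : (m, n) observation operator, state space -> observation space"""
    n, N = X_f.shape
    one_N = np.full((N, N), 1.0 / N)                  # 1_N: entries all 1/N
    X_bar = X_f @ one_N                               # every column = ensemble mean
    A = X_f - X_bar                                   # ensemble anomalies
    E = rng.normal(0.0, obs_std, size=(len(y_o), N))  # perturbations eps_1..eps_N
    Y_o = y_o[:, None] + E                            # perturbed observation matrix
    R_e = (E @ E.T) / (N - 1)                         # ensemble obs error covariance
    P_f = (A @ A.T) / (N - 1)                         # ensemble forecast covariance
    K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R_e)   # Kalman gain
    return X_f + K @ (Y_o - H @ X_f)                  # analysis ensemble X_k^a

# Illustrative use: a 4-dimensional state observed directly in 2 components.
rng = np.random.default_rng(0)
X_f = rng.normal(size=(4, 30))
X_a = enkf_analysis(X_f, y_o=np.array([0.5, -0.2]), H=np.eye(2, 4), obs_std=0.1, rng=rng)
```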
  • the embodiment of the present application proposes a multi-convergent ensemble Kalman filter data assimilation method for the unsteady meteorological conditions in the forest fire area: while the fire line position corrects the parameters of the forest fire spread model in real time, the forest fire spread speed is iterated dynamically, effectively improving the accuracy of the forest fire spread model.
  • in the method for assimilating forest fire spread data based on drone video of the embodiment of the present application, the fire line predicted position information at time K is obtained through the forest fire spread model from the meteorological data of the fire occurrence place, the basic geographic information data, the fire line state analysis value at time K-1 and other data; the predicted position information is compared with the fire line observation position information obtained by the UAV, and it is judged whether the parameters of the forest fire spread model need to be adjusted. If so, the model parameters are adjusted according to the fire line predicted position information and the observation position information at time K, the fire line predicted position information at time K is recalculated with the adjusted model, and the fire line state analysis value at time K is calculated.
  • this method of forest fire spread data assimilation based on UAV video uses the UAV as front-end monitoring equipment to extract the fire line in real time and obtain the fire line position information.
  • a forest fire spread assimilation model with dynamically adjustable parameters is proposed, which can effectively solve the problem that the fire line cannot be obtained in real time and the model parameters cannot be corrected in time, which would otherwise make the accuracy of the prediction results impossible to guarantee, and improves the prediction accuracy of the model.
  • the UAV has the advantages of high mobility and low cost.
  • the UAV can send back live video in real time, so that the observed fire line can be updated at intervals of minutes or even seconds, thus effectively avoiding the disadvantage that the temporal resolution and spatial resolution of satellite remote sensing data restrict each other. This can greatly improve the timeliness and accuracy of forest fire spread model prediction, thereby improving the prediction accuracy of the fire area and providing objective fire information for forest fire fighting.
  • FIG. 3 is a flowchart of a method for assimilating forest fire spread data based on drone video according to another embodiment of the present application.
  • the forest fire spread data assimilation method includes:
  • Step 301 Obtain meteorological data and basic geographic information data of the fire occurrence place, and obtain the fire line state analysis value at the time K-1 of the fire occurrence place.
  • Step 302 Input the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model, obtain the fire line predicted position information at time K, and obtain the thermal imaging video of the fire field area based on drone shooting.
  • Step 303 Obtain the thermal image of the fire field area at time K from the thermal imaging video of the fire field area, and determine the temperature information corresponding to each pixel in the thermal image of the fire field area.
  • Step 304 According to the temperature information and temperature threshold corresponding to each pixel in the thermal image of the fire field area, extract the fire field range from the thermal image, and perform edge extraction on the fire field range to obtain the pixel positions of the fire line.
  • Step 305 Convert the pixel positions of the fire line into the GPS coordinates of the fire line to obtain the fire line observation position information at time K.
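  • a sketch of steps 303 to 305 under simplifying assumptions: the thermal frame is taken to be an array of per-pixel temperatures, the fire field range is the set of pixels above the threshold, and the fire line edge is taken as range pixels with at least one cooler 4-neighbour (wrap-around at the image border is ignored for brevity); the threshold value is illustrative:

```python
import numpy as np

def extract_fireline_pixels(temp, threshold):
    """Steps 303-305: threshold the temperature image to get the fire field
    range, then keep only its boundary pixels as the fire line."""
    burning = temp > threshold                     # fire field range (step 304)
    interior = (burning
                & np.roll(burning, 1, axis=0) & np.roll(burning, -1, axis=0)
                & np.roll(burning, 1, axis=1) & np.roll(burning, -1, axis=1))
    edge = burning & ~interior                     # edge extraction of the range
    rows, cols = np.nonzero(edge)
    return np.stack([rows, cols], axis=1)          # pixel positions of the fire line

# Illustrative use: a synthetic 100x100 frame containing one hot disc.
yy, xx = np.mgrid[0:100, 0:100]
frame = 20.0 + 400.0 * (((yy - 50) ** 2 + (xx - 50) ** 2) < 20 ** 2)
fireline_pixels = extract_fireline_pixels(frame, threshold=150.0)
```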
  • the process of converting the fire line pixel information into GPS information adopts the inverse of the camera imaging process; as shown in FIG. 4, camera imaging is the projective transformation from the 3D scene to the 2D image plane captured by the drone.
  • the essence of camera imaging is the process of central perspective projection in photographic geometry.
  • points on the three-dimensional ground determine the observation result through the viewing cone space and viewpoint orientation specified by the projection matrix, so the two-dimensional camera picture and the three-dimensional geographic information form a correspondence through the viewing cone and viewpoint orientation.
  • Converting two-dimensional picture information into three-dimensional coordinate information is the inverse process of the above process.
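  • a sketch of this inverse process for a single pixel, assuming a pinhole camera with known intrinsics and pose and, as in the first example below, flat terrain at a known elevation; intersecting the pixel's viewing ray with that ground plane recovers the three-dimensional point (all matrices and numbers are illustrative):

```python
import numpy as np

def pixel_to_ground(u, v, K, R, cam_pos, ground_z):
    """Invert the central perspective projection for one pixel by casting the
    viewing ray through (u, v) and intersecting it with the plane z = ground_z.
    K: 3x3 intrinsics, R: 3x3 camera-to-world rotation, cam_pos: camera center."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray in the camera frame
    ray_world = R @ ray_cam                              # ray in the world frame
    s = (ground_z - cam_pos[2]) / ray_world[2]           # scale to reach the plane
    return cam_pos + s * ray_world                       # 3D ground point

# Illustrative use: nadir-looking camera 120 m above flat ground.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R = np.array([[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, -1.0]])  # looking down
point = pixel_to_ground(350.0, 260.0, K, R,
                        cam_pos=np.array([0.0, 0.0, 120.0]), ground_z=0.0)
```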
  • the first example is a fire line positioning technology that does not use DEM information.
  • the same fire field area can be photographed by at least one drone at multiple observation points; multiple pixel positions of the fire line are obtained from the thermal imaging videos of the fire field area captured at these observation points, and the multiple pixel positions are then used to calculate the fire line observation position information at time K.
  • this example includes the following steps:
  • Step 501 Perform coordinate transformation on the multiple pixel positions of the fire line, respectively, to obtain multiple coordinate values of the fire line in the UAV geographic coordinate system.
  • Step 502 Calculate multiple observation elevation angle matrices and multiple azimuth angle matrices of the fire line according to the multiple coordinate values of the fire line in the UAV geographic coordinate system.
  • Step 503 Perform Kalman filter estimation on the fire line position according to the multiple observation elevation angle matrices and multiple azimuth angle matrices of the fire line, and obtain the estimated coordinate values of the fire line.
  • Step 504 Convert the estimated coordinate values of the fire line through GPS coordinates to obtain the fire line observation position information at time K.
  • a specific implementation of steps 501-504 without DEM information may be as follows:
  • the UAV positions a ground target mainly as follows: data are collected and processed by the airborne sensors to obtain the relative distance and angle between the UAV and the target, and the target position coordinates are calculated from the UAV's own position and attitude data, as shown in FIG. 6.
  • the UAV detects the same target from multiple positions, and the accurate three-dimensional coordinates of the target can be obtained through the vision-based multi-point angle observation fire line positioning method.
  • target positioning is carried out through multi-point angle observation: the pixel information of the fire line is used, according to the imaging principle, to calculate the relative elevation and azimuth matrices between the fire line and the UAV, and the system state equation and observation equation are established. The estimated position is then converted into the fire line's position coordinates in the geodetic coordinate system. Observing at time K yields the actually observed fire line position at time K.
  • the main steps to obtain the observed fire line are as follows: the pixel information of the fire line is converted through coordinates into values in the UAV geographic coordinate system, and the elevation and azimuth angle matrices of the fire line points relative to the UAV geographic coordinates are calculated.
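  • a sketch of that conversion for one observation point, assuming the fire line points and the UAV position are already expressed in a local east-north-up frame; stacking the results over several observation points gives the multiple elevation and azimuth matrices fed to the Kalman filter estimation:

```python
import numpy as np

def elevation_azimuth(fire_pts, uav_pos):
    """Observation elevation and azimuth angles of each fire line point
    (east, north, up) relative to the UAV position, in radians."""
    d = fire_pts - uav_pos                    # relative vectors in the ENU frame
    horiz = np.hypot(d[:, 0], d[:, 1])        # horizontal range
    elevation = np.arctan2(d[:, 2], horiz)    # negative when looking down at the fire
    azimuth = np.arctan2(d[:, 0], d[:, 1])    # measured clockwise from north
    return elevation, azimuth

# Two fire line points seen from a UAV hovering 150 m above the origin.
pts = np.array([[120.0, 80.0, 5.0], [130.0, 60.0, 5.0]])
el, az = elevation_azimuth(pts, uav_pos=np.array([0.0, 0.0, 150.0]))
```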
  • the second example is a fire line positioning technology combined with DEM information; as shown in FIG. 7, this example includes the following steps:
  • Step 701 Obtain the DEM geographic information of the place where the fire occurred.
  • Step 702 Obtain GPS information, attitude information and built-in parameters of the UAV.
  • Step 703 According to the DEM geographic information and the GPS information, attitude information and built-in parameters of the drone, generate a virtual perspective of the drone point.
  • Step 704 Simulate the actual UAV imaging process according to the virtual perspective of the drone point to obtain a simulated image.
  • Step 705 Determine the pixel coordinates of the fire line in the simulated image according to the pixel positions of the fire line.
  • Step 706 Convert the pixel coordinates of the fire line in the simulated image through GPS coordinates to obtain the fire line observation position information at time K.
  • a specific implementation process of the steps 701-706 in combination with DEM information may be as follows:
  • the forest DEM geographic information is loaded into the TS-GIS (TypeScript Geographic Information System) engine to form a virtual perspective of the drone point and generate a projection matrix. Using the projection matrix, the spatial coordinates corresponding to the fire line pixels in the thermal image can be obtained. Observing at time K yields the actually observed fire line position at time K.
  • the fire line positioning process relies on the TS-GIS engine, which can display 3D DEM information.
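  • the TS-GIS engine itself is not reproduced here; the sketch below shows one common way to realize the same pixel-to-terrain lookup, marching along the pixel's viewing ray until it first drops below the DEM surface. The DEM height lookup and all numbers are placeholders:

```python
import numpy as np

def ray_dem_intersection(cam_pos, ray_dir, dem_height, step=1.0, max_range=5000.0):
    """March along the viewing ray from the camera position until the ray's
    altitude first falls below the terrain height from the DEM, giving the
    ground coordinates of the fire line pixel on real terrain."""
    direction = ray_dir / np.linalg.norm(ray_dir)
    for s in np.arange(step, max_range, step):
        p = cam_pos + s * direction
        if p[2] <= dem_height(p[0], p[1]):    # the ray has reached the terrain
            return p
    return None                                # no intersection within max_range

# Illustrative use: a gently sloping surface stands in for real DEM data.
dem = lambda x, y: 0.02 * x                    # placeholder height lookup
hit = ray_dem_intersection(np.array([0.0, 0.0, 300.0]),
                           np.array([0.5, 0.1, -0.4]), dem)
```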
  • Step 306 According to the fire line predicted position information and the fire line observation position information at time K, determine whether parameter adjustment of the forest fire spread model is required.
  • Step 307 If it is necessary to adjust the parameters of the forest fire spread model, adjust the model parameters according to the fire line predicted position information and the fire line observation position information at time K, recalculate the fire line predicted position information at time K with the adjusted model, and calculate the fire line state analysis value at time K from the recalculated fire line predicted position information and the fire line observation position information.
  • the forest fire spread speed parameter in the forest fire spread model may be adjusted according to the deviation between the fire line predicted position information and the fire line observation position information, after which the fire line predicted position information is recalculated by the adjusted model.
  • the specific implementation process of adjusting the model parameters of the forest fire spread model according to the fire line predicted position information and the fire line observation position information at time K, and recalculating the fire line predicted position information at time K with the adjusted model, may include:
  • Step 801 Calculate the deviation between the fire line predicted position information and the fire line observation position information at time K.
  • Step 802 Adjust the forest fire spread rate at time K-1 according to the preset forest fire spread rate update coefficient matrix and the deviation.
  • adjusting the forest fire spread rate at time K-1 according to the preset forest fire spread rate update coefficient matrix and the deviation can be illustrated as follows: multiply the forest fire spread rate update coefficient matrix by the deviation, and add the obtained product to the forest fire spread rate at time K-1 to obtain the adjusted rate.
  • Step 803 Input the adjusted forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens wave model, and obtain the fire line predicted position information at time K again.
  • the specific implementation process of the steps 801-803 may be as follows:
  • the forest fire spread rate $R_{0,k-1}$ at time K-1 calculated by the Rothermel model is the forest fire spread rate in the forest fire spread model at that time.
  • the thermal airflow and convection of the fire field affect the wind direction and wind speed in the fire field; since the wind speed and wind direction are not steady, the fire spread speed is not steady either. Therefore, the fire spread speed of the forest fire spread model also needs to be adjusted dynamically, with reference to the above non-steady-state factors, to update the forest fire spread speed.
  • the update can be written as $V'_{k-1} = V_{k-1} + C\,\mathrm{Err}_h$, where $C$ is the update coefficient matrix of the forest fire spread speed and $\mathrm{Err}_h$ is the deviation between the predicted value and the observed value obtained from formula (3).
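  • a one-line realization of this update, assuming $C$ and $\mathrm{Err}_h$ are conformable NumPy arrays; the coefficient values and the two-sector state are illustrative:

```python
import numpy as np

def update_spread_speed(V_prev, C, err):
    """Adjusted spread speed at time K-1: previous speed plus the product of
    the update coefficient matrix C and the deviation Err_h from formula (3)."""
    return V_prev + C @ err

V_prev = np.array([0.8, 1.1])               # speeds for two fire line sectors
C = np.array([[0.05, 0.0], [0.0, 0.05]])    # speed update coefficient matrix
err = np.array([3.0, -2.0])                 # deviation Err_h
V_adj = update_spread_speed(V_prev, C, err) # -> [0.95, 1.0]
```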
  • the obtained meteorological data, basic geographic information data and the fire line state analysis value at time K-1 are input into the forest fire spread model to obtain the fire line predicted position information at time K. The thermal imaging video of the fire field area is obtained based on the drone; the thermal image of the fire field area at time K is obtained and the temperature information corresponding to each pixel is determined; according to the temperature information and temperature threshold, the fire field range is extracted, edge extraction is performed on the fire field range to obtain the pixel positions of the fire line, and the estimated coordinates of the fire line are converted through GPS coordinates to obtain the fire line observation position information at time K. According to the fire line predicted position information and observation position information at time K, it is determined whether parameter adjustment of the forest fire spread model is necessary.
  • the method for assimilating forest fire spread data based on drone video of this embodiment uses the drone as front-end monitoring equipment to extract the fire line in real time and obtain the fire line position information, and proposes a forest fire spread assimilation model with dynamically adjustable parameters, which effectively solves the problems that the simulation model cannot be adjusted dynamically to changes in the simulated environment, that the forest fire model is not suitable for non-steady states, and that changes in the environment cannot be transmitted in real time, and improves the prediction accuracy of the model.
  • UAVs have the advantages of high maneuverability and low cost, and can send back live video in real time, so that the observed fire line can be updated at intervals of minutes or even seconds.
  • the data assimilation method adopted by the model continuously assimilates the forest fire spread model, improves the prediction accuracy of the burned area, and provides objective fire information for forest fire fighting work.
  • this embodiment provides a method for obtaining the fire line observation position from the thermal imaging video of the fire field area. The method obtains the fire line observation position information and at the same time displays the observed fire line intuitively, which provides direct guidance and strong support for forest fire extinguishing work.
  • the present application also proposes a data assimilation device for forest fire spread based on drone video.
  • FIG. 9 is a schematic structural diagram of a device for assimilating forest fire spread data based on drone video according to an embodiment of the present application. As shown in FIG. 9, the device includes:
  • the first acquisition module 901 is used to acquire meteorological data and basic geographic information data of the fire occurrence place;
  • the second acquisition module 902, used to obtain the fire line state analysis value at time K-1 of the fire occurrence place;
  • the third acquisition module 903, used to input the meteorological data, the basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model to obtain the fire line predicted position information at time K of the fire occurrence place;
  • the fourth acquisition module 904, configured to acquire the thermal imaging video of the fire field area based on drone shooting;
  • the fifth acquisition module 905, configured to acquire the fire line observation position information at time K according to the thermal imaging video of the fire field area;
  • the judgment module 906, used to judge whether it is necessary to adjust the parameters of the forest fire spread model according to the fire line predicted position information and the fire line observation position information at time K;
  • the adjustment module 907, configured to adjust the model parameters of the forest fire spread model according to the fire line predicted position information and the fire line observation position information at time K when the parameters need to be adjusted, and to recalculate the fire line predicted position information at time K according to the forest fire spread model with the adjusted parameters;
  • the data assimilation module 908 is configured to calculate the live line state analysis value at time K according to the recalculated live line predicted position information and live line observation position information at time K.
  • the forest fire spread model includes the Rothermel model and the Huygens fluctuation model. In the embodiments of the present application, the third acquisition module 903 is specifically used to: input the meteorological data and basic geographic information data of the fire occurrence place into the Rothermel model to obtain the forest fire spread speed at time K-1; and input the forest fire spread speed at time K-1 and the fire line state analysis value at time K-1 into the Huygens fluctuation model to predict the fire line position, obtaining the predicted fire line position information at time K.
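As an illustrative sketch of the Huygens-style prediction step, the snippet below expands each fire-line vertex with an elliptical wavelet whose head points downwind, with the head speed playing the role of the Rothermel spread rate. This is a simplified wavelet template under assumed parameters (eccentricity value, counter-clockwise vertex ordering), not the exact fluctuation model used by the application.

```python
import numpy as np

def huygens_step(vertices, rate, wind_dir, dt, ecc=0.8):
    """Expand each fire-line vertex with an elliptical Huygens wavelet.

    vertices : (N, 2) closed fire-line polygon at time K-1, assumed to be
               ordered counter-clockwise (metres)
    rate     : head-fire spread rate from the Rothermel model (m/s)
    wind_dir : wind direction in radians (direction the fire head moves)
    ecc      : wavelet eccentricity; 0 gives a circle (no wind effect)
    """
    out = np.empty_like(vertices, dtype=float)
    n = len(vertices)
    for i, v in enumerate(vertices):
        # Outward normal estimated from the two neighbouring vertices.
        tangent = vertices[(i + 1) % n] - vertices[i - 1]
        normal = np.array([tangent[1], -tangent[0]])
        normal /= np.linalg.norm(normal) + 1e-12
        # Polar equation of an ellipse about its focus: full head speed
        # downwind (theta = 0), slower flanking and backing spread.
        theta = np.arctan2(normal[1], normal[0]) - wind_dir
        scale = (1.0 - ecc) / (1.0 - ecc * np.cos(theta))
        out[i] = v + rate * scale * dt * normal
    return out
```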
  • the fifth acquisition module 905 is specifically configured to: acquire the thermal image of the fire field area at time K from the thermal imaging video; determine the temperature information corresponding to each pixel in the thermal image; extract the fire field range from the thermal image according to the per-pixel temperature information and a temperature threshold; perform edge extraction on the fire field range in the thermal image to obtain the pixel positions of the fire line; and convert the pixel positions into the GPS coordinates of the fire line, obtaining the fire line observation position information at time K.
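A minimal sketch of this extraction pipeline using OpenCV follows; the temperature threshold and the morphological kernel size are illustrative assumptions, and a radiometric frame (a 2-D array of per-pixel temperatures) is assumed as input rather than the application's actual video format.

```python
import cv2
import numpy as np

def fire_line_pixels(thermal, temp_threshold=300.0):
    """Extract fire-line pixel positions from a radiometric thermal frame.

    thermal : 2-D array of per-pixel temperatures (e.g. kelvin); the
    threshold of 300.0 is an illustrative value, not from the application.
    """
    fire_mask = (thermal > temp_threshold).astype(np.uint8)     # fire field
    fire_mask = cv2.morphologyEx(fire_mask, cv2.MORPH_CLOSE,    # fill gaps
                                 np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(fire_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)       # edges
    if not contours:
        return np.empty((0, 2), dtype=int)
    edge = max(contours, key=cv2.contourArea)                   # main front
    return edge.reshape(-1, 2)                                  # (col, row)
```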
  • in some embodiments, the number of unmanned aerial vehicles is at least one, and the fifth acquisition module 905 is specifically configured to: obtain thermal imaging videos of the fire field area captured by the at least one unmanned aerial vehicle at multiple observation points, and obtain multiple pixel positions of the fire line.
  • the fifth acquisition module 905 converts the pixel positions of the fire line into GPS coordinates, and the specific implementation process of obtaining the fire line observation position information at time K may be as follows: convert the multiple pixel positions of the fire line into multiple observation elevation angle matrices and multiple azimuth angle matrices, respectively; estimate the position of the fire line from these matrices by Kalman filtering to obtain the estimated coordinate values of the fire line; and convert the estimated coordinate values into GPS coordinates to obtain the fire line observation position information at time K.
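The following sketch illustrates the fusion idea with a scalar-covariance Kalman update, assuming each observation point's elevation/azimuth angles have already been converted into a ground-coordinate estimate of the same fire-line point (that conversion is not shown); the measurement and prior variances are illustrative assumptions, and the application's actual filter formulation may differ.

```python
import numpy as np

def fuse_observations(estimates, meas_var=25.0, prior=None, prior_var=1e4):
    """Sequentially fuse per-viewpoint position estimates of one
    fire-line point with a scalar-covariance Kalman update.

    estimates : list of (x, y) position estimates, one per observation point
    meas_var  : assumed measurement variance of each estimate (m^2)
    """
    x = np.asarray(prior if prior is not None else estimates[0], float)
    P = prior_var
    for z in estimates:
        K = P / (P + meas_var)                 # Kalman gain
        x = x + K * (np.asarray(z, float) - x) # state update
        P = (1.0 - K) * P                      # uncertainty shrinks
    return x, P
```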
  • in other embodiments, the fifth acquisition module 905 converts the pixel position of the fire line into the GPS coordinates of the fire line, and the specific implementation process of obtaining the fire line observation position information at time K may be as follows: obtain the DEM geographic information of the fire occurrence place; obtain the GPS information, attitude information and built-in parameters of the UAV; generate a virtual perspective of the UAV viewpoint according to the DEM geographic information and the UAV's GPS information, attitude information and built-in parameters; simulate the actual UAV imaging process from this virtual perspective to obtain a simulated image; determine the pixel coordinates of the fire line in the simulated image according to the pixel positions of the fire line; and convert the pixel coordinates of the fire line in the simulated image into GPS coordinates, obtaining the fire line observation position information at time K.
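One common way to realize this kind of DEM-based georeferencing is to march each pixel's view ray from the camera until it crosses the terrain surface. The sketch below assumes a hypothetical dem(x, y) elevation callable and a precomputed world-frame ray direction (built from attitude and built-in parameters, not shown); it illustrates the idea and is not the application's exact virtual-perspective procedure.

```python
import numpy as np

def ray_to_ground(cam_pos, ray_dir, dem, step=1.0, max_range=3000.0):
    """March a pixel's view ray from the UAV until it crosses the DEM.

    cam_pos : (x, y, z) camera position from the UAV's GPS
    ray_dir : unit view ray in the world frame (from attitude + intrinsics)
    dem     : assumed callable dem(x, y) -> terrain elevation; a real DEM
              would be a raster sampled with interpolation
    """
    cam_pos = np.asarray(cam_pos, float)
    ray_dir = np.asarray(ray_dir, float)
    t = 0.0
    while t < max_range:
        p = cam_pos + t * ray_dir
        if p[2] <= dem(p[0], p[1]):   # ray has dipped below the terrain
            return p[:2]              # ground (x, y) of the fire-line pixel
        t += step
    return None                       # no intersection within range
```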
  • the judgment module 906 is specifically configured to: calculate the deviation between the predicted fire line position information and the observed fire line position information at time K; judge whether the deviation converges within the target range; if the deviation does not converge within the target range, judge whether the number of iterations of the fire spread model is less than the maximum number of iterations; if it is less, determine that the parameters of the fire spread model need to be adjusted; if the deviation converges within the target range, and/or the number of iterations is greater than or equal to the maximum number of iterations, stop adjusting the parameters of the forest fire spread model.
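This decision rule can be summarized in a few lines (the threshold and budget values are illustrative):

```python
import numpy as np

def needs_adjustment(predicted, observed, iteration, max_iters, tol):
    """Decision rule of the judgment module 906: keep adjusting only while
    the deviation has not converged and the iteration budget remains."""
    deviation = float(np.linalg.norm(np.asarray(predicted, float) -
                                     np.asarray(observed, float)))
    if deviation < tol:              # converged within the target range
        return False
    return iteration < max_iters     # otherwise adjust if budget remains
```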
  • the adjustment module 907 adjusts the model parameters of the forest fire spread model according to the predicted and observed fire line position information at time K, and recalculates the predicted fire line position information at time K according to the adjusted model. The specific implementation process may be as follows: calculate the deviation between the predicted fire line position information at time K and the observed position information; adjust the forest fire spread speed at time K-1 according to a preset forest fire spread speed update coefficient matrix and the deviation; and input the adjusted forest fire spread speed at time K-1, together with the fire line state analysis value at time K-1, into the Huygens fluctuation model to obtain new predicted fire line position information at time K.
  • the specific implementation process for the adjustment module 907 to adjust the forest fire spread speed at time K-1 according to the preset update coefficient matrix and the deviation may be as follows: multiply the update coefficient matrix by the deviation, and add the resulting product to the forest fire spread speed at time K-1 to obtain the adjusted speed.
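In code, this update is a single matrix-vector operation (the shapes are assumptions for illustration):

```python
import numpy as np

def adjust_spread_rate(rate_k1, update_matrix, deviation):
    """Spread-rate correction as described above: the preset update
    coefficient matrix times the deviation, added to the rate at time K-1.
    Assumed shapes: rate (m,), matrix (m, n), deviation (n,)."""
    return (np.asarray(rate_k1, float) +
            np.asarray(update_matrix, float) @ np.asarray(deviation, float))
```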
  • the specific implementation process for the data assimilation module 908 to calculate the fire line state analysis value at time K according to the recalculated predicted fire line position information and the observed position information at time K may be as follows: based on an ensemble Kalman filter algorithm, perform a least-squares fit between the recalculated predicted fire line position information and the observed position information at time K to obtain the fire line state analysis value at time K.
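A minimal inverse-variance (least-squares) fusion illustrating this analysis step is sketched below; the error variances are assumed values, and the application's ensemble-Kalman-filter formulation may differ.

```python
import numpy as np

def analysis_value(predicted, observed, var_pred=1.0, var_obs=1.0):
    """Weight the re-forecast fire line and the observed fire line by the
    inverse of their assumed error variances; the minimizer of the
    weighted least-squares cost is their precision-weighted average."""
    w_p, w_o = 1.0 / var_pred, 1.0 / var_obs
    return (w_p * np.asarray(predicted, float) +
            w_o * np.asarray(observed, float)) / (w_p + w_o)
```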
  • the forest fire spread data assimilation device based on drone video of the embodiments of the present application acquires the meteorological data and basic geographic information data of the fire occurrence place; acquires the fire line state analysis value at time K-1; inputs the meteorological data, basic geographic information data and the fire line state analysis value at time K-1 into the forest fire spread model to obtain the predicted fire line position information at time K; acquires the thermal imaging video of the fire field area shot by the drone and, from it, the fire line observation position information at time K; and judges, according to the predicted and observed fire line position information at time K, whether the parameters of the forest fire spread model need to be adjusted.
  • when adjustment is needed, the model parameters of the forest fire spread model are adjusted according to the predicted and observed fire line position information at time K, and the predicted fire line position information at time K is recalculated according to the adjusted model;
  • the fire line state analysis value at time K is then calculated from the recalculated predicted fire line position information and the observed position information at time K.
  • this UAV-video-based forest fire spread data assimilation device uses a UAV as the front-end monitoring equipment, extracts the fire line in real time, and obtains the position information of the fire line.
  • a data assimilation scheme with dynamically adjustable parameters is proposed for the forest fire spread model.
  • the scheme effectively solves the problems that the simulation model cannot be dynamically adjusted to changes in the simulated environment, that the forest fire model is not suitable for non-steady states, and that environmental changes cannot be transmitted in real time, thereby improving the prediction accuracy of the model.
  • UAVs have the advantages of high maneuverability and low cost, and can send back live video in real time, so that the observed fire line can be updated at intervals of minutes or even seconds.
  • the data assimilation method continuously corrects the forest fire spread model, improves the prediction accuracy of the burned area, and provides objective fire field information for forest firefighting work.
  • the present application further provides an electronic device and a readable storage medium.
  • FIG. 10 is a block diagram of an electronic device for the method of assimilating forest fire spread data based on drone video according to an embodiment of the present application.
  • Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframe computers, and other suitable computers.
  • Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices.
  • the components shown herein, their connections and relationships, and their functions are by way of example only, and are not intended to limit implementations of the application described and/or claimed herein.
  • the electronic device includes: one or more processors 1001, a memory 1002, and interfaces for connecting various components, including a high-speed interface and a low-speed interface.
  • the various components are interconnected using different buses and may be mounted on a common motherboard or otherwise as desired.
  • the processor may process instructions executed within the electronic device, including instructions stored in or on memory to display graphical information of the GUI on an external input/output device, such as a display device coupled to the interface.
  • if desired, multiple processors and/or multiple buses may be used, along with multiple memories.
  • multiple electronic devices may be connected, each providing some of the necessary operations (eg, as a server array, a group of blade servers, or a multiprocessor system).
  • a processor 1001 is used as an example.
  • the memory 1002 is the non-transitory computer-readable storage medium provided by the present application.
  • the memory stores instructions executable by at least one processor, so that the at least one processor executes the method for data assimilation of forest fire spread based on drone video provided by the present application.
  • the non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to execute the method for assimilating forest fire spread data based on drone video provided by the present application.
  • the memory 1002 can be used to store non-transitory software programs, non-transitory computer-executable programs and modules, such as the program instructions/modules corresponding to the method for assimilating forest fire spread data based on drone video in the embodiments of the present application (for example, the first acquisition module 901, the second acquisition module 902, the third acquisition module 903, the fourth acquisition module 904, the fifth acquisition module 905, the judgment module 906, the adjustment module 907 and the data assimilation module 908 shown in FIG. 9).
  • the processor 1001 executes various functional applications and data processing of the server by running the non-transitory software programs, instructions and modules stored in the memory 1002, that is, it implements the drone-video-based forest fire spread data assimilation method of the above method embodiments.
  • the memory 1002 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the electronic device, etc. Additionally, the memory 1002 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 1002 may optionally include memory located remotely relative to the processor 1001, which may be connected via a network to the electronic device for assimilating forest fire spread data based on drone video. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the electronic device of the method for assimilation of forest fire spread data based on drone video may further include: an input device 1003 and an output device 1004 .
  • the processor 1001 , the memory 1002 , the input device 1003 and the output device 1004 may be connected by a bus or in other ways, and the connection by a bus is taken as an example in FIG. 10 .
  • the input device 1003 can receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for assimilating forest fire spread data based on drone video; examples include a touch screen, keypad, mouse, trackpad, touchpad, pointing stick, one or more mouse buttons, trackball, joystick, and other input devices.
  • Output devices 1004 may include display devices, auxiliary lighting devices (eg, LEDs), haptic feedback devices (eg, vibration motors), and the like.
  • the display device may include, but is not limited to, a liquid crystal display (LCD), a light emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
  • Various implementations of the systems and techniques described herein can be realized in digital electronic circuitry, integrated circuit systems, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, and which can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (for example, magnetic disks, optical disks, memories, or programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as machine-readable signals.
  • the term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
  • to provide interaction with a user, the systems and techniques described herein may be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user, and a keyboard and pointing device (e.g., a mouse or trackball) through which the user can provide input to the computer.
  • Other kinds of devices can also be used to provide interaction with the user; for example, the feedback provided to the user can be any form of sensory feedback (e.g., visual, auditory, or tactile feedback), and input from the user can be received in any form (including acoustic, voice, or tactile input).
  • the systems and techniques described herein may be implemented in a computing system that includes back-end components (e.g., as a data server), middleware components (e.g., an application server), front-end components (e.g., a user's computer having a graphical user interface or web browser through which the user may interact with implementations of the systems and techniques described herein), or any combination of such back-end, middleware, or front-end components.
  • the components of the system may be interconnected by any form or medium of digital data communication (eg, a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
  • a computer system can include clients and servers.
  • Clients and servers are generally remote from each other and usually interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • the server can be a cloud server, also known as a cloud computing server or cloud host, which is a host product in the cloud computing service system intended to overcome the shortcomings of traditional physical hosts and VPS (Virtual Private Server) services.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present application discloses a method and apparatus for assimilating forest fire spread data based on onboard unmanned aerial vehicle video, an electronic device and a computer-readable storage medium. The method comprises: acquiring meteorological data, basic geographic information data and a fire line state analysis value at time K-1 for a fire site; inputting the described information into a forest fire spread model to obtain predicted fire line position information at time K; acquiring a thermal imaging video of the fire field area shot by an unmanned aerial vehicle, and obtaining fire line observation position information at time K; determining, according to the predicted fire line position and the observed fire line position at time K, whether the model parameters need to be adjusted; and, if the model parameters need to be adjusted, adjusting them according to the predicted and observed fire line positions at time K and recalculating a predicted fire line position at time K to obtain a fire line state analysis value at time K. The present application is low-cost, can dynamically iterate the forest fire spread model, acquires an accurate predicted fire line position, and saves valuable time for forest fire rescue.
PCT/CN2021/112848 2020-11-27 2021-08-16 Method and apparatus for assimilating forest fire spread data based on onboard unmanned aerial vehicle video WO2022110912A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011367733.0A CN112464819B (zh) 2020-11-27 2020-11-27 Method and device for assimilating forest fire spread data based on drone video
CN202011367733.0 2020-11-27

Publications (1)

Publication Number Publication Date
WO2022110912A1 true WO2022110912A1 (fr) 2022-06-02

Family

ID=74809410

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/112848 WO2022110912A1 (fr) Method and apparatus for assimilating forest fire spread data based on onboard unmanned aerial vehicle video

Country Status (2)

Country Link
CN (1) CN112464819B (fr)
WO (1) WO2022110912A1 (fr)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112464819B (zh) * 2020-11-27 2024-01-12 Tsinghua University Method and device for assimilating forest fire spread data based on drone video
CN112947264A (zh) * 2021-04-21 2021-06-11 Suzhou Ximeng Technology Co., Ltd. Dispensing machine control method, apparatus, electronic device and medium
CN113554845B (zh) * 2021-06-25 2022-09-30 Dongguan Xintai Instrument Co., Ltd. Thermal imaging device for forest fire prevention
CN114495416A (zh) * 2021-12-29 2022-05-13 Beijing Chenan Technology Co., Ltd. UAV-based fire monitoring method, apparatus and terminal device
CN115518316B (zh) * 2022-09-20 2024-02-20 Zhuhai Anqing Technology Co., Ltd. Smart firefighting system based on the interconnection of a UAV, a cloud platform and AR glasses

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202976376U (zh) * 2012-11-22 2013-06-05 South China Agricultural University UAV-based forest fire dynamic monitoring and emergency command system
CN106021666A (zh) * 2016-05-10 2016-10-12 Sichuan University Wildfire disaster early-warning method for overhead transmission lines
US9977963B1 (en) * 2017-03-03 2018-05-22 Northrop Grumman Systems Corporation UAVs for tracking the growth of large-area wildland fires
CN108763811A (zh) * 2018-06-08 2018-11-06 University of Science and Technology of China Dynamic-data-driven forest fire spread prediction method
CN112307884A (zh) * 2020-08-19 2021-02-02 Aerospace Tujing (Beijing) Technology Co., Ltd. Forest fire spread prediction method based on continuous time-series remote sensing situation data, and electronic device
CN112464819A (zh) * 2020-11-27 2021-03-09 Tsinghua University Method and device for assimilating forest fire spread data based on drone video

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102819926B (zh) * 2012-08-24 2015-04-29 South China Agricultural University UAV-based fire monitoring and early-warning method
KR20170101516A (ko) * 2016-02-29 2017-09-06 Electronics and Telecommunications Research Institute Apparatus and method for fire monitoring using an unmanned aerial vehicle
CN109472421A (zh) * 2018-11-22 2019-03-15 Guangdong Power Grid Co., Ltd. Method and device for early warning of wildfire spread in a power grid
CN109871613B (zh) * 2019-02-18 2023-05-19 Nanjing Forestry University Method for obtaining a forest fire discrimination model and its predictive application
CN110390135B (zh) * 2019-06-17 2023-04-21 Beijing Zhongke Ruijing Technology Co., Ltd. Method for improving the accuracy of forest fire spread prediction


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115099493B (zh) * 2022-06-27 2023-11-10 Northeast Forestry University CNN-based method for predicting the forest fire spread rate in an arbitrary direction
CN115099493A (zh) * 2022-06-27 2022-09-23 Northeast Forestry University CNN-based method for predicting the forest fire spread rate in an arbitrary direction
CN115661245A (zh) * 2022-10-24 2023-01-31 Northeast Forestry University UAV-based method for instantaneous localization of a large-scale fire line
CN115671617A (zh) * 2022-11-03 2023-02-03 State Grid Jibei Electric Power Co., Ltd. Ultra-High Voltage Branch Fire localization method, apparatus, device and storage medium for a flexible DC converter station
CN115660428A (zh) * 2022-11-10 2023-01-31 Sichuan Forestry and Grassland Survey and Planning Institute (Sichuan Forestry and Grassland Ecological Environment Monitoring Center) Geographic-information-based forest and grassland fire risk assessment system
CN116952081B (zh) * 2023-07-26 2024-04-16 Wuhan Juhe Technology Co., Ltd. Aerial monitoring system and method for fire-extinguishing bomb impact point parameter images
CN116952081A (zh) * 2023-07-26 2023-10-27 Wuhan Juhe Technology Co., Ltd. Aerial monitoring system and method for fire-extinguishing bomb impact point parameter images
CN117152592A (zh) * 2023-10-26 2023-12-01 Qingdao Aoxi Intelligent Technology Co., Ltd. System and method for visualizing building information and fire information
CN117152592B (zh) * 2023-10-26 2024-01-30 Qingdao Aoxi Intelligent Technology Co., Ltd. System and method for visualizing building information and fire information
CN117163302B (zh) * 2023-10-31 2024-01-23 Ansheng (Tianjin) Flight Simulation System Co., Ltd. Aircraft instrument display method, apparatus, device and storage medium
CN117163302A (zh) * 2023-10-31 2023-12-05 Ansheng (Tianjin) Flight Simulation System Co., Ltd. Aircraft instrument display method, apparatus, device and storage medium
CN117689520A (zh) * 2024-02-01 2024-03-12 Qingdao Shanke Zhihui Information Technology Co., Ltd. Method, medium and system for evaluating the coverage capability of grassland fire-extinguishing bombs
CN117689520B (zh) * 2024-02-01 2024-05-10 Qingdao Shanke Zhihui Information Technology Co., Ltd. Method, medium and system for evaluating the coverage capability of grassland fire-extinguishing bombs

Also Published As

Publication number Publication date
CN112464819B (zh) 2024-01-12
CN112464819A (zh) 2021-03-09

Similar Documents

Publication Publication Date Title
WO2022110912A1 (fr) Method and apparatus for assimilating forest fire spread data based on onboard unmanned aerial vehicle video
US10942529B2 (en) Aircraft information acquisition method, apparatus and device
CN107504957B (zh) Method for rapidly constructing a three-dimensional terrain model using multi-view UAV photography
Stipaničev et al. Advanced automatic wildfire surveillance and monitoring network
US20140362107A1 (en) Generating a composite field of view using a plurality of oblique panoramic images of a geographic area
US10740875B1 (en) Displaying oblique imagery
CN111649724A (zh) Visual positioning method and apparatus based on mobile edge computing
US10726614B2 (en) Methods and systems for changing virtual models with elevation information from real world image processing
CN109102566A (zh) Indoor and outdoor real-scene reconstruction method for substations and apparatus therefor
WO2023125587A1 (fr) Fire monitoring method and apparatus based on an unmanned aerial vehicle
Renwick et al. Drone-based reconstruction for 3D geospatial data processing
Qiao et al. Ground target geolocation based on digital elevation model for airborne wide-area reconnaissance system
CN109977609A (zh) Method for simulating infrared images of ground high-temperature heat sources based on real remote sensing data
Li et al. Verification of monocular and binocular pose estimation algorithms in vision-based UAVs autonomous aerial refueling system
JP2020008802A (ja) Three-dimensional map generation apparatus and three-dimensional map generation method
WO2023273415A1 (fr) Positioning method and apparatus based on an unmanned aerial vehicle, storage medium, electronic device and product
Zhang et al. Forest fire detection solution based on UAV aerial data
Bradley et al. Georeferenced mosaics for tracking fires using unmanned miniature air vehicles
Qu et al. Retrieval of 30-m-resolution leaf area index from China HJ-1 CCD data and MODIS products through a dynamic Bayesian network
WO2021051220A1 (fr) Point cloud fusion method, device and system, and storage medium
Hu et al. A spatiotemporal intelligent framework and experimental platform for urban digital twins
Matelenok et al. Influence of the canopy structure of a birch forest on the visibility of the fires below
CN116597155A (zh) Forest fire spread prediction method and system based on a multi-platform collaborative computing model
CN114359425A (zh) Orthoimage generation method and apparatus, and ortho index map generation method and apparatus
Guo et al. A new UAV PTZ Controlling System with Target Localization

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21896417

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21896417

Country of ref document: EP

Kind code of ref document: A1