WO2023125587A1 - Drone-based fire monitoring method and device - Google Patents

Drone-based fire monitoring method and device

Info

Publication number
WO2023125587A1
Authority
WO
WIPO (PCT)
Prior art keywords
fire
area
thermal imaging
thermal
fire scene
Application number
PCT/CN2022/142552
Other languages
English (en)
French (fr)
Inventor
陈涛
黄丽达
刘春慧
王镜闲
王晓萌
刘连顺
Original Assignee
北京辰安科技股份有限公司
Application filed by 北京辰安科技股份有限公司
Publication of WO2023125587A1 publication Critical patent/WO2023125587A1/zh

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00Fire alarms; Alarms responsive to explosion
    • G08B17/12Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions by using a video camera to detect fire or smoke

Definitions

  • the present disclosure relates to the technical field of forest fire analysis, in particular to a fire monitoring method, device and terminal equipment based on a drone.
  • Forest fire is a natural disaster that is sudden, destructive, difficult to deal with, and extremely dangerous.
  • the present disclosure proposes a fire monitoring method and device based on an unmanned aerial vehicle, so as to at least solve the problem of poor timeliness of fire monitoring in the related art.
  • the disclosed technical scheme is as follows:
  • the embodiments of the present disclosure provide a fire monitoring method based on a drone, including:
  • determining, according to the geographic information, the fire line length and fire area corresponding to the fire scene area.
  • after the thermal imaging image collected by the UAV is obtained, the fire scene area can be determined according to the thermal imaging image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and fire area corresponding to the fire scene area can be determined according to that geographic information. Therefore, the fire line length and fire area corresponding to the fire scene area can be determined in a timely manner from the thermal imaging images collected by the drone, improving the timeliness of forest fire information and avoiding missing the best rescue opportunity.
  • the determination of the fire scene area according to the thermal imaging image and the shooting parameters of the thermal imaging image specifically includes:
  • determining that any area whose corrected temperature is greater than the threshold is a fire scene area.
  • the determining the geographic information corresponding to the fire scene area specifically includes:
  • determining, according to the characteristics and shooting parameters of each pixel in each thermal imaging image, the overlapping seams between the fire scene areas corresponding to the thermal imaging images
  • Geographic information corresponding to the fused fire scene area is determined.
  • the determining the geographic information corresponding to the fire scene area specifically includes:
  • fusing the pieces of geographic information, according to the degree of matching between them, to generate fused geographic information.
  • the determining the geographic information corresponding to the fire scene area specifically includes:
  • determining the virtual viewing angle of the thermal imager according to the geographic location information, attitude angle, and field of view angle of the thermal imager in the UAV
  • the geographic coordinate information corresponding to the fire scene area is determined.
  • the method also includes:
  • determining the ground resolution corresponding to the fire point according to the distance between the thermal imager in the drone and the fire point in the fire scene and the focal length information of the thermal imager
  • the length of the fire line and the fire area are corrected.
  • the embodiments of the present disclosure provide a fire monitoring device based on a drone, including:
  • the acquisition module is used to acquire the thermal imaging map collected by the drone
  • the first determination module is used to determine the fire scene area according to the thermal imaging image and the shooting parameters of the thermal imaging image
  • the second determination module is used to determine the geographical information corresponding to the fire scene area
  • the third determination module is configured to determine the fire line length and fire area corresponding to the fire scene area according to the geographical information.
  • the first determination module is specifically configured to:
  • determining that any area whose corrected temperature is greater than the threshold is a fire scene area.
  • the second determining module is specifically configured to:
  • determining, according to the characteristics and shooting parameters of each pixel in each thermal imaging image, the overlapping seams between the fire scene areas corresponding to the thermal imaging images
  • Geographic information corresponding to the fused fire scene area is determined.
  • the second determining module is specifically configured to:
  • each piece of geographic information is fused to generate fused geographic information.
  • the second determining module is specifically configured to:
  • determining the virtual viewing angle of the thermal imager according to the geographic location information, attitude angle, and field of view angle of the thermal imager in the UAV
  • the geographic coordinate information corresponding to the fire scene area is determined.
  • the third determination module is also used for:
  • determining the ground resolution corresponding to the fire point according to the distance between the thermal imager in the drone and the fire point in the fire scene and the focal length information of the thermal imager
  • the length of the fire line and the fire area are corrected.
  • a terminal device including:
  • memory for storing processor-executable instructions
  • the processor is configured to execute the instructions, so as to implement the drone-based fire monitoring method described in the embodiment of the first aspect above.
  • a computer-readable storage medium is provided.
  • the terminal device can execute the UAV-based fire monitoring method.
  • a computer program product including a computer program, and when the computer program is executed by a processor, the fire monitoring method based on the drone described in the above-mentioned one embodiment is implemented.
  • the fire scene area can be determined according to the thermal imaging image and the shooting parameters of the thermal imaging image; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and fire area corresponding to the fire scene area can be determined according to that geographic information. Therefore, the fire line length and fire area corresponding to the fire scene area can be determined in a timely manner from the thermal imaging images collected by the drone, improving the timeliness of forest fire information and avoiding missing the best rescue opportunity.
  • FIG. 1 is a schematic flow diagram of a fire monitoring method based on a drone provided in the first embodiment of the present disclosure
  • FIG. 2 is a schematic flow diagram of another unmanned aerial vehicle-based fire monitoring method provided by the second embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of another UAV-based fire monitoring method provided by the third embodiment of the present disclosure.
  • FIG. 4 is a schematic structural diagram of a UAV-based fire monitoring device provided in a fourth embodiment of the present disclosure.
  • Fig. 5 is a block diagram of a terminal device for fire monitoring based on a drone according to an exemplary embodiment.
  • the UAV can monitor the temperature of the forest fire area through the mounted thermal imager, so that the fire scene area can be determined based on the temperature, and the fire line length and fire area can then be determined. Because the UAV is easy to launch, controllable in direction, and highly maneuverable, it can obtain forest fire information in time, thereby improving the timeliness of obtaining forest fire information.
  • FIG. 1 is a flow chart of a fire monitoring method based on a drone provided by an embodiment of the present disclosure, including the following steps 101 to 104 .
  • Step 101 acquiring a thermal imaging image collected by a drone.
  • the thermal imager can receive the infrared rays radiated by objects through an infrared detector and convert infrared signals of varying strength into electrical signals, thereby generating a thermal imaging image.
  • the unmanned aerial vehicle can collect the thermal imaging map of the forest fire scene in real time through the mounted thermal imager.
  • the number of UAVs flying in parallel can be determined according to the area of the forest on fire; through the parallel flight of multiple UAVs, thermal imaging images of the forest fire scene can be collected and information on the entire fire obtained in time, thereby improving the timeliness of forest fire information.
  • after the UAV obtains a thermal imaging image, it can transmit the image to the server in real time through a wireless transmission module.
  • the server can realize real-time monitoring of forest fires.
  • Step 102 determine the area of the fire scene according to the thermal image and the shooting parameters of the thermal image.
  • the shooting parameters may include the distance between the thermal imager and the ground, etc., which can be selected by those skilled in the art according to actual needs.
  • each pixel in the thermal imaging map corresponds to a temperature value
  • the temperature value corresponding to each pixel can be compared with a threshold; when the temperature value of a pixel is greater than the threshold, the area corresponding to that pixel can be determined to be an ignition point.
  • when the temperature value of a pixel is less than the threshold, the area corresponding to that pixel is not an ignition point. The area formed by the ignition points can then be taken as the fire scene area.
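The per-pixel threshold test above can be sketched as a simple array operation. A minimal sketch: the temperature matrix and the 300 °C threshold below are illustrative assumptions, not values fixed by the disclosure.

```python
import numpy as np

# Hypothetical per-pixel temperatures (degrees Celsius) decoded from one
# thermal imaging image; the 300 C threshold is an illustrative choice.
FIRE_THRESHOLD_C = 300.0

def fire_scene_mask(temps: np.ndarray, threshold: float = FIRE_THRESHOLD_C) -> np.ndarray:
    """Mark every pixel whose temperature exceeds the threshold as an ignition point."""
    return temps > threshold

temps = np.array([
    [ 25.0,  30.0, 320.0],
    [ 28.0, 450.0, 510.0],
    [ 26.0,  27.0,  29.0],
])
mask = fire_scene_mask(temps)   # the fire scene area is the set of True pixels
print(int(mask.sum()))          # -> 3 ignition-point pixels
```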
  • the shooting parameters of the thermal imaging image may affect the temperature values corresponding to the thermal imaging image
  • a temperature correction function corresponding to the shooting parameters can therefore be preset in the system; the temperature value of each pixel in the thermal imaging image is corrected according to this function, and the fire scene area is then determined from the corrected temperature value of each pixel.
  • Step 103 determining geographic information corresponding to the fire scene area.
  • geographic information can be any information such as Global Positioning System (Global Positioning System, GPS) position information that can uniquely determine the location of the real fire scene area, and those skilled in the art can choose according to actual needs.
  • the thermal imaging image itself only contains the camera-coordinate position of the fire scene area, and does not contain the geographic information of the real fire scene area.
  • the camera coordinates therefore need to be converted into the geodetic coordinate system to determine the geographic information corresponding to the fire scene area.
  • the virtual viewing angle of the thermal imager can be determined according to the geographic location information, attitude angle, and field of view angle of the thermal imager in the UAV; the coordinate transformation matrix of the thermal imaging image can then be determined based on the digital elevation information and the virtual viewing angle; finally, the geographic coordinate information corresponding to the fire scene area is determined from the pixel positions of the fire scene area and the coordinate transformation matrix.
  • the geographic location information of the thermal imager can be the longitude and latitude information corresponding to the UAV
  • the attitude angle can include the heading angle, pitch angle, roll angle, etc.
  • the field of view angle can include horizontal field of view angle, vertical field of view angle, etc.
  • the attitude angle and field of view angle can be measured and determined by ranging sensors such as photoelectric pods with laser ranging carried by drones.
  • the latitude and longitude information corresponding to the UAV can be determined through the navigation system carried by the UAV.
  • the geographic location corresponding to the forest fire scene can be input into the TS-GIS engine to determine the digital elevation model corresponding to the forest fire scene.
  • the server can input the geographic location information, attitude angle, field of view angle, and other information of the thermal imager into the TS-GIS engine, and determine the virtual viewing angle of the thermal imager based on the digital elevation information. Based on the virtual viewing angle, the coordinate transformation matrix from camera coordinates to geodetic coordinates for the fire scene area in the thermal imaging image is calculated. According to this matrix, the camera-frame coordinates of each pixel in the thermal imaging image are transformed into the geodetic coordinate system, so as to determine the geographic information corresponding to each pixel.
  • within a single thermal imaging image, the geographic location information, attitude angle, and field of view angle of the thermal imager are the same, so all pixels in that image share the same coordinate transformation matrix. Because the UAV is constantly flying, these parameters may change across the multiple thermal imaging images taken during the flight, so the coordinate transformation matrices corresponding to different thermal imaging images may differ.
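The camera-to-ground projection described above can be sketched roughly as follows, replacing the TS-GIS digital elevation model with a flat ground plane and assuming a Z-Y-X attitude convention (a real photoelectric pod documents its own convention); all values in the usage example are illustrative.

```python
import numpy as np

def rotation_matrix(yaw_deg: float, pitch_deg: float, roll_deg: float) -> np.ndarray:
    """Camera-to-world rotation from attitude angles, applied in Z-Y-X order
    (an assumed convention)."""
    y, p, r = np.radians([yaw_deg, pitch_deg, roll_deg])
    rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    return rz @ ry @ rx

def pixel_ray(u: float, v: float, width: int, height: int,
              hfov_deg: float, vfov_deg: float) -> np.ndarray:
    """Unit viewing ray through pixel (u, v) in the camera frame
    (x right, y down, z along the optical axis)."""
    x = np.tan(np.radians(hfov_deg) / 2) * (2 * u / width - 1)
    y = np.tan(np.radians(vfov_deg) / 2) * (2 * v / height - 1)
    ray = np.array([x, y, 1.0])
    return ray / np.linalg.norm(ray)

def ground_intersection(cam_pos: np.ndarray, rot: np.ndarray,
                        ray_cam: np.ndarray, ground_z: float = 0.0) -> np.ndarray:
    """Intersect the world-frame ray with a flat ground plane, a stand-in for
    the digital elevation model used in the disclosure."""
    d = rot @ ray_cam
    t = (ground_z - cam_pos[2]) / d[2]
    return cam_pos + t * d

cam = np.array([0.0, 0.0, 100.0])                 # drone 100 m above flat ground
rot = rotation_matrix(0.0, 0.0, 180.0)            # camera pointing straight down
ray = pixel_ray(320, 256, 640, 512, 60.0, 50.0)   # centre pixel of a 640x512 image
ground = ground_intersection(cam, rot, ray)       # approximately (0, 0, 0)
```

In practice each pixel's ground point would then be converted from the local plane to latitude/longitude, which is the role the TS-GIS engine plays in the text.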
  • Step 104 according to the geographic information, determine the fire line length and fire area corresponding to the fire scene area.
  • the geographic information corresponding to the fire scene edge and the fire scene area can be input into the TS-GIS engine; the fire line length is then determined from the geographic information of the fire scene edge, and the burned-over area is determined from the GPS information of the fire scene area.
  • the edge of the fire scene may be the junction of the fire scene area and the non-fire scene area.
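Once the fire edge has been mapped to geographic coordinates, the fire line length and fire area reduce to the perimeter and shoelace area of the edge polygon. A minimal sketch, assuming the GPS positions have already been projected onto a local metric plane:

```python
import math

def fire_line_length(edge_points):
    """Perimeter of the fire-edge polygon, for projected (x, y) points in metres."""
    n = len(edge_points)
    return sum(math.dist(edge_points[i], edge_points[(i + 1) % n]) for i in range(n))

def fire_area(edge_points):
    """Shoelace formula for the enclosed (burned-over) area in square metres."""
    n = len(edge_points)
    s = 0.0
    for i in range(n):
        x1, y1 = edge_points[i]
        x2, y2 = edge_points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A hypothetical 100 m x 100 m fire edge
square = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
print(fire_line_length(square))  # -> 400.0 m of fire line
print(fire_area(square))         # -> 10000.0 m^2 burned
```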
  • after the thermal imaging image collected by the UAV is obtained, the fire scene area can be determined according to the thermal imaging image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and fire area corresponding to the fire scene area can be determined according to that geographic information. Therefore, the fire line length and fire area corresponding to the fire scene area can be determined in a timely manner from the thermal imaging images collected by the drone, improving the timeliness of forest fire information and avoiding missing the best rescue opportunity.
  • FIG. 2 is a flow chart of a UAV-based fire monitoring method provided by an embodiment of the present disclosure, including the following steps 201 to 207.
  • Step 201 acquiring a thermal imaging image collected by a drone.
  • for the specific implementation process of step 201, reference may be made to the detailed description in any embodiment of the present disclosure, and details are not repeated here.
  • Step 202 determine the working band of the thermal imager in the UAV, and the distance between the thermal imager and the fire point.
  • the working band of the thermal imager can be the wavelength of infrared rays that the thermal imager can receive, and can be pre-set in the system.
  • the distance between the thermal imager and the fire point can be determined by measuring the ranging sensor carried by the drone.
  • the distance between the thermal imager and the fire point corresponding to each thermal imaging image can be sent to the server together with that image.
  • Step 203 determine temperature correction parameters according to the working band and distance.
  • the temperature correction parameters may include corrected irradiance, response voltage, etc., which are not limited in the present disclosure.
  • the height of the UAV may change during flight; therefore, because of the varying distance between the thermal imager and the fire scene, the temperature information in the thermal imaging image may deviate.
  • the temperature can be corrected based on the working band and distance.
  • the thermal imager first receives infrared rays and determines the irradiance; from the irradiance it determines the response voltage; then, according to Planck's radiation law, the temperature of each area of the forest fire scene can be determined.
  • the irradiance can be corrected according to the distance, so as to achieve the effect of correcting the temperature.
  • the formula of the corrected irradiance is expressed as follows:
  • E_λ = A_0 · d⁻² · [τ_aλ · ε_λ · L_bλ(T_0) + τ_aλ · (1 − α_λ) · L_bλ(T_u) + ε_aλ · L_bλ(T_a)]   (1)
  • ε_λ is the surface emissivity
  • α_λ is the surface absorptivity
  • τ_aλ is the spectral transmittance of the atmosphere
  • ε_aλ is the atmospheric emissivity
  • T_0 is the surface temperature of the measured object
  • T_u is the ambient temperature
  • T_a is the atmospheric temperature
  • d is the distance between the target and the measuring instrument, usually fixed under given measurement conditions
  • A_0 is the visual area of the target corresponding to the minimum spatial opening angle of the thermal imager.
  • L_bλ is the blackbody spectral radiance.
  • A_0 can be preset in the system.
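Equation (1) can be evaluated directly once a blackbody radiance model is chosen; the sketch below uses Planck's law for L_bλ, and every coefficient value in it is an illustrative assumption rather than a value from the disclosure.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance L_b(lambda, T) from Planck's law."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * K * temp_k)
    return a / math.expm1(b)

def corrected_irradiance(d, a0, tau, eps, alpha, eps_atm, t0, tu, ta, wavelength_m):
    """Equation (1): target, reflected-ambient, and atmospheric terms,
    falling off with the square of the target distance d."""
    lb = planck_radiance
    return a0 / d ** 2 * (
        tau * eps * lb(wavelength_m, t0)              # emission from the surface at T_0
        + tau * (1.0 - alpha) * lb(wavelength_m, tu)  # reflected ambient radiation at T_u
        + eps_atm * lb(wavelength_m, ta)              # atmospheric emission at T_a
    )

# Illustrative values: 10 um band, 600 K fire surface, 100 m range
e = corrected_irradiance(100.0, 1e-4, 0.8, 0.9, 0.9, 0.05, 600.0, 300.0, 290.0, 10e-6)
```

Note the d⁻² factor: doubling the range quarters the irradiance, which is what makes the distance correction of the later steps necessary.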
  • the thermal imager usually works within a certain narrow band; therefore, ε_λ, α_λ, and ε_aλ can be treated as fixed values, and the corrected response voltage can be determined from the corrected irradiance. The formula of the corrected response voltage is expressed as follows:
  • A_R is the area of the thermal imager lens
  • R_λ is the spectral responsivity of the detector, which is a constant for a given infrared thermal imager.
  • A_R can be preset in the system.
  • Step 204 correct the temperature of each region in the thermal imaging image according to the temperature correction parameter, so as to determine the corrected temperature of each region.
  • the corrected temperature corresponding to each pixel in the thermal imaging image can be determined according to Planck's radiation law.
  • according to Planck's radiation law, the thermal imager image temperature is expressed by the following formula:
  • T_λ is the temperature value of each pixel in the thermal imaging image
  • n is the working band of the thermal imager.
  • the surface temperature T_0 of the measured object is the corrected temperature.
  • formula (6) can be rearranged to obtain the corrected temperature formula, as follows:
  • the surface emissivity ε_λ of wood can be preset in the system, and the value of n can be preset in the system according to the working band used by the thermal imager.
  • when the working band of the thermal imager is 3-5 microns, the value of n can be 8.68; when the working band of the thermal imager is 8-14 microns, the value of n can be 4.09.
  • the server can use the distance d between the thermal imager and the fire point to correct the temperature value T_λ of each pixel in the thermal imaging image according to formula (8), so as to obtain the corrected object surface temperature T_0.
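The text does not reproduce formula (8), but under the common T^n band approximation, equation (1) inverts to a standard textbook correction of the measured radiation temperature. A hedged sketch of that standard form (the function name and all parameter values are illustrative, and this is not claimed to be the patent's exact formula):

```python
def corrected_surface_temperature(t_r, t_u, t_a, eps, alpha, tau, eps_atm, n):
    """Recover the object surface temperature T_0 from the radiation temperature
    t_r reported by the imager, by inverting equation (1) under the T^n band
    approximation. A standard textbook form, not the patent's exact formula (8);
    n is the band exponent (e.g. 4.09 for the 8-14 micron band, per the text)."""
    t0_n = (t_r ** n - tau * (1.0 - alpha) * t_u ** n - eps_atm * t_a ** n) / (tau * eps)
    return t0_n ** (1.0 / n)

# Round trip under illustrative conditions: simulate the radiation temperature an
# imager would report for a 600 K surface, then recover the surface temperature.
n, tau, eps, alpha, eps_atm = 4.09, 0.9, 0.9, 0.9, 0.05
t0_true, t_u, t_a = 600.0, 300.0, 290.0
t_r = (tau * eps * t0_true ** n + tau * (1 - alpha) * t_u ** n
       + eps_atm * t_a ** n) ** (1 / n)
t0 = corrected_surface_temperature(t_r, t_u, t_a, eps, alpha, tau, eps_atm, n)  # ~600.0
```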
  • Step 205 if the revised temperature of any area is greater than the threshold, determine any area as the fire scene area.
  • Step 206 determining geographic information corresponding to the fire scene area.
  • Step 207 according to the geographical information, determine the fire line length and fire area corresponding to the fire scene area.
  • step 205 to step 207 for the specific implementation process of step 205 to step 207, reference may be made to the detailed description of any embodiment in the present disclosure, which will not be repeated here.
  • after the server obtains the thermal imaging image collected by the drone, it can determine the working band of the thermal imager in the drone and the distance between the thermal imager and the fire point, determine the temperature correction parameters from the working band and distance, and then correct the temperature of each area in the thermal imaging image according to those parameters to obtain the corrected temperature of each area. Any area whose corrected temperature is greater than the threshold is determined to be a fire scene area; the geographic information corresponding to the fire scene area is then determined, and the fire line length and fire area corresponding to the fire scene area are determined according to that geographic information. Correcting the temperature of the thermal imaging image according to the distance between the thermal imager and the fire point improves the accuracy of the temperature, and hence the accuracy of the fire scene information.
  • Fig. 3 is a flow chart of a fire monitoring method based on a drone provided by an embodiment of the present disclosure, including the following steps 301 to 306.
  • Step 301 acquiring a thermal imaging image collected by a drone.
  • Step 302 determine the area of the fire scene according to the thermal image and the shooting parameters of the thermal image.
  • Step 303 determining the geographic information corresponding to each fire scene area in each thermal imaging image.
  • step 301 to step 303 for the specific implementation process of step 301 to step 303, reference may be made to the detailed description in any embodiment of the present disclosure, and details are not repeated here.
  • Step 304 according to the matching degree between each geographic information, fuse each geographic information to generate fused geographic information.
  • since the thermal imaging images taken by each UAV during flight correspond only to local parts of the forest fire scene, the multiple thermal imaging images taken by multiple UAVs can be fused to intuitively obtain the fire situation of the entire forest fire area.
  • the server can match the obtained thermal imaging images in pairs; when the distance between the geographic information of each pair of corresponding pixels in a fire scene area of two thermal imaging images is less than a threshold, it can be determined that the fire scene areas in the two thermal imaging images are the same fire scene area, and the same fire scene area in the two images can then be fused to generate fused geographic information.
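The pairwise matching step can be sketched as a distance test between corresponding geographic points, followed by a simple coordinate average as the fusion rule; the 30 m threshold and the averaging choice are assumptions for illustration only.

```python
import math

def same_fire_area(points_a, points_b, dist_threshold_m=30.0):
    """Decide whether two per-image fire-area point sets describe the same fire
    by comparing corresponding geographic points (in a local metric plane)
    against a distance threshold. The 30 m threshold is an assumption."""
    if len(points_a) != len(points_b):
        return False
    return all(math.dist(a, b) < dist_threshold_m for a, b in zip(points_a, points_b))

def fuse(points_a, points_b):
    """Fuse two matched point sets by averaging corresponding coordinates."""
    return [((ax + bx) / 2, (ay + by) / 2)
            for (ax, ay), (bx, by) in zip(points_a, points_b)]

a = [(0.0, 0.0), (10.0, 0.0)]   # fire area seen by drone A
b = [(2.0, 1.0), (11.0, 1.0)]   # same area seen by drone B, slightly offset
fused = fuse(a, b) if same_fire_area(a, b) else None  # [(1.0, 0.5), (10.5, 0.5)]
```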
  • according to the characteristics and shooting parameters of each pixel in each thermal imaging image, it is also possible to determine the overlapping seams between the fire scene areas corresponding to the thermal imaging images; based on the overlapping seams, the fire scene areas are fused to generate a fused fire scene area, and the geographic information corresponding to the fused fire scene area is then determined.
  • the feature of the pixel point may include pixel value, pixel coordinate and other information, which can be selected by those skilled in the art according to actual needs.
  • the server can fuse the obtained thermal imaging images in pairs to determine the overlapping seams between the fire scene areas corresponding to the thermal imaging images. Based on the overlapping seams, the same areas can be fused to generate a fused fire scene area. Then, based on any method of determining geographic information corresponding to a fire scene area in the present disclosure, the geographic information corresponding to the fused fire scene area is determined.
  • Step 305 according to the distance between the thermal imager in the UAV and the fire point in the fire scene and the focal length information of the thermal imager, determine the ground resolution corresponding to the fire point.
  • each thermal imaging map actually corresponds to a local area of the forest fire scene, and the area of the local area is obviously larger than the area of the thermal imaging map. Therefore, each pixel in the thermal imaging map actually corresponds to a certain area of the real fire scene.
  • the geographic information determined for a pixel is only a single location point, not the area that the pixel covers; therefore, the geographic information determined from pixels, and the fire line length and fire area derived from it, may contain errors.
  • Step 306 Correct the length of the fire line and the fire area according to the ground resolution.
  • the geographic information corresponding to a pixel can be taken as the center of the area corresponding to that pixel. According to the ground resolution, the area corresponding to the pixel is determined; the fire line length can then be calculated from the geographic information of the outer edges of the areas corresponding to the fire-line pixels, thereby correcting the fire line length. At the same time, the fire area can be determined from the geographic information of the outer edges of the areas corresponding to the fire-line pixels.
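The ground resolution in step 305 follows from the pinhole similar-triangles relation (ground patch per pixel = distance × pixel pitch / focal length), after which the fire line length can be restated in ground units. The sensor parameters below are illustrative assumptions, not values from the disclosure.

```python
def ground_resolution(distance_m: float, pixel_pitch_m: float,
                      focal_length_m: float) -> float:
    """Ground sample distance: side length of the ground patch covered by one
    pixel, from the pinhole similar-triangles relation."""
    return distance_m * pixel_pitch_m / focal_length_m

def corrected_fire_line_length(fire_line_pixels: int, gsd_m: float) -> float:
    """Fire line length once each fire-line pixel is treated as a gsd-wide
    ground patch rather than a single point."""
    return fire_line_pixels * gsd_m

# Illustrative sensor: 500 m slant range, 17 um detector pitch, 25 mm lens
gsd = ground_resolution(500.0, 17e-6, 25e-3)     # 0.34 m of ground per pixel
length = corrected_fire_line_length(1000, gsd)   # 340 m of fire line
```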
  • after the server obtains the thermal imaging image collected by the drone, it can determine the fire scene areas according to the thermal imaging image and its shooting parameters, determine the geographic information corresponding to each fire scene area, and fuse the pieces of geographic information according to their degree of matching to generate fused geographic information. Then, according to the distance between the thermal imager in the UAV and the fire point and the focal length information of the thermal imager, the ground resolution corresponding to the fire point is determined, and the fire line length and fire area are corrected according to the ground resolution.
  • correcting the fire line length and fire area in this way improves their accuracy, thereby providing a reliable basis for fire rescue.
  • Fig. 4 is a block diagram of a UAV-based fire monitoring device according to an exemplary embodiment.
  • the device includes an acquisition module 410 , a first determination module 420 , a second determination module 430 , and a third determination module 440 .
  • the acquiring module 410 is configured to acquire thermal imaging images collected by the drone.
  • the first determining module 420 is configured to determine the area of the fire scene according to the thermal imaging image and shooting parameters of the thermal imaging image.
  • the second determination module 430 is configured to determine geographic information corresponding to the fire scene area.
  • the third determination module 440 is configured to determine the fire line length and fire area corresponding to the fire scene area according to the geographical information.
  • the above-mentioned first determination module 420 is specifically configured to: determine the working band of the thermal imager in the drone and the distance between the thermal imager and the fire point; determine the temperature correction parameters according to the working band and distance; correct the temperature of each area in the thermal imaging image according to the temperature correction parameters, so as to determine the corrected temperature of each area; and, when the corrected temperature of any area is greater than the threshold, determine that area to be a fire scene area.
  • the above-mentioned second determination module 430 is specifically configured to: determine, according to the characteristics and shooting parameters of each pixel in each thermal imaging image, the overlapping seams between the fire scene areas corresponding to the thermal imaging images; fuse the fire scene areas based on the overlapping seams to generate a fused fire scene area; and determine the geographic information corresponding to the fused fire scene area.
  • the above-mentioned second determination module 430 is specifically configured to: determine the geographic information corresponding to each fire scene area; and fuse the pieces of geographic information according to their degree of matching to generate fused geographic information.
  • the above-mentioned second determination module 430 is specifically configured to: determine the virtual viewing angle of the thermal imager according to the geographic location information, attitude angle, and field of view angle of the thermal imager in the UAV; determine the coordinate transformation matrix of the thermal imaging image based on the digital elevation information and the virtual viewing angle; and determine the geographic coordinate information corresponding to the fire scene area according to the pixel positions of the fire scene area and the coordinate transformation matrix.
  • the above-mentioned third determination module 440 is also configured to: determine the ground resolution corresponding to the fire point according to the distance between the thermal imager in the drone and the fire point in the fire scene and the focal length information of the thermal imager; and correct the fire line length and fire area according to the ground resolution.
  • after the thermal imaging image collected by the UAV is obtained, the fire scene area can be determined according to the thermal imaging image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and fire area corresponding to the fire scene area can be determined according to that geographic information. Therefore, the fire line length and fire area corresponding to the fire scene area can be determined in a timely manner from the thermal imaging images collected by the drone, improving the timeliness of forest fire information and avoiding missing the best rescue opportunity.
  • Fig. 5 is a block diagram of a terminal device for fire monitoring based on a drone according to an exemplary embodiment.
  • the terminal device 500 includes a memory 510, a processor 520, and a bus 530 connecting different components (including the memory 510 and the processor 520); the memory 510 stores a computer program, and when the processor 520 executes the program, the UAV-based fire monitoring method described in the embodiments of the present disclosure is implemented.
  • Bus 530 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus structures.
  • These architectures include, by way of example, but are not limited to Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MAC) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect ( PCI) bus.
  • the terminal device 500 typically includes various electronic device readable media. These media can be any available media that can be accessed by the terminal device 500, including volatile and non-volatile media, removable and non-removable media.
  • Memory 510 may also include computer system readable media in the form of volatile memory, such as random access memory (RAM) 540 and/or cache memory 550 .
  • the terminal device 500 may further include other removable/non-removable, volatile/nonvolatile computer system storage media.
  • storage system 560 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in FIG. 5, commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (such as a "floppy disk"), and an optical disk drive for reading from and writing to a removable, non-volatile optical disk (such as a CD-ROM, DVD-ROM, or other optical media), may also be provided. In these cases, each drive may be connected to bus 530 through one or more data media interfaces.
  • the memory 510 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of various embodiments of the present disclosure.
  • a program/utility 580 having a set (at least one) of program modules 570 may be stored, for example, in memory 510; such program modules 570 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, and each or some combination of these examples may include an implementation of a network environment.
  • the program modules 570 generally perform the functions and/or methods in the embodiments described in the present disclosure.
  • the terminal device 500 may also communicate with one or more external devices 590 (such as a keyboard, a pointing device, a display 591, etc.), with one or more devices that enable a user to interact with the terminal device 500, and/or with any device (e.g., a network card, a modem, etc.) that enables the terminal device 500 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 592.
  • the terminal device 500 can also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through the network adapter 593 .
  • the network adapter 593 communicates with other modules of the terminal device 500 through the bus 530 .
  • other hardware and/or software modules may be used in conjunction with terminal device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
  • the processor 520 executes various functional applications and data processing by executing programs stored in the memory 510 .
  • By screening out the corresponding second attribute information based on different clustering modes, the amount of data to be calculated is reduced, which helps improve calculation speed; and by determining the category label corresponding to the applicant according to the similarity between the first attribute information and each item of second attribute information, together with the category label corresponding to each reference object, the service request can be verified with only simple attribute information from the user even when the relationship data corresponding to the applicant is unknown. This effectively removes high-risk service requests while reducing the time the applicant spends applying for the service and improving the efficiency of service applications.
  • Where the information submitted by the applicant includes relationship data, the relationship data can also be verified in this way to increase its credibility.
  • the present disclosure also provides a computer-readable storage medium including instructions, such as a memory including instructions, which can be executed by a processor of a terminal device to complete the above method.
  • the computer readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • the present disclosure also provides a computer program product.
  • the terminal device can execute the fire monitoring method based on the drone as described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Fire-Detection Mechanisms (AREA)
  • Alarm Systems (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure proposes a drone-based fire monitoring method, apparatus, and terminal device. The method includes: acquiring a thermal image collected by a drone; determining a fire scene area according to the thermal image and the shooting parameters of the thermal image; determining geographic information corresponding to the fire scene area; and determining, according to the geographic information, the fire line length and burned area corresponding to the fire scene area.

Description

Drone-based fire monitoring method and apparatus
CROSS-REFERENCE TO RELATED APPLICATION
This application is based on and claims priority to Chinese patent application No. 202111642996.2, filed on December 29, 2021, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to the technical field of forest fire analysis, and in particular to a drone-based fire monitoring method, apparatus, and terminal device.
BACKGROUND
A forest fire is a natural disaster that is highly sudden, highly destructive, difficult to handle, and extremely dangerous.
In the related art, when a forest fire occurs, satellite remote sensing is usually used to obtain forest fire information. However, satellite remote sensing cannot obtain forest fire information in a timely manner, which can delay the best rescue opportunity. How to improve the timeliness of obtaining forest fire information is therefore an urgent problem.
SUMMARY
The present disclosure proposes a drone-based fire monitoring method and apparatus, so as to at least solve the problem of low timeliness of fire monitoring in the related art. The technical solution of the present disclosure is as follows:
According to a first aspect of the embodiments of the present disclosure, a drone-based fire monitoring method is provided, including:
acquiring a thermal image collected by a drone;
determining a fire scene area according to the thermal image and the shooting parameters of the thermal image;
determining geographic information corresponding to the fire scene area; and
determining, according to the geographic information, the fire line length and burned area corresponding to the fire scene area.
In the present disclosure, after the thermal image collected by the drone is acquired, the fire scene area can be determined according to the thermal image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and burned area corresponding to the fire scene area determined according to the geographic information. Thus, the fire line length and burned area corresponding to the fire scene area can be determined in a timely manner from the thermal images collected by the drone, which improves the timeliness of forest fire information and helps avoid missing the best rescue opportunity.
In some embodiments, determining the fire scene area according to the thermal image and the shooting parameters of the thermal image specifically includes:
determining the working waveband of the thermal imager on the drone and the distance between the thermal imager and the fire point;
determining a temperature correction parameter according to the working waveband and the distance;
correcting the temperature of each region in the thermal image according to the temperature correction parameter, so as to determine the corrected temperature of each region; and
in a case where the corrected temperature of any region is greater than a threshold, determining that region to be a fire scene area.
In some embodiments, determining the geographic information corresponding to the fire scene area specifically includes:
determining, according to the features of each pixel in each thermal image and the shooting parameters, the overlapping seam lines between the fire scene areas respectively corresponding to the thermal images;
fusing the fire scene areas based on the overlapping seam lines to generate a fused fire scene area; and
determining the geographic information corresponding to the fused fire scene area.
In some embodiments, determining the geographic information corresponding to the fire scene area specifically includes:
determining the geographic information corresponding to each of the fire scene areas; and
fusing the items of geographic information according to the degree of matching between them, to generate fused geographic information.
In some embodiments, determining the geographic information corresponding to the fire scene area specifically includes:
determining a virtual viewing angle of the thermal imager according to the geographic position information, attitude angles, and field-of-view angles of the thermal imager on the drone;
determining a coordinate transformation matrix of the thermal image based on digital elevation information and the virtual viewing angle; and
determining the geographic coordinate information corresponding to the fire scene area according to the pixel positions of the fire scene area and the coordinate transformation matrix.
In some embodiments, the method further includes:
determining the ground resolution corresponding to the fire point according to the distance between the thermal imager on the drone and the fire point in the fire scene, and the focal length information of the thermal imager; and
correcting the fire line length and burned area according to the ground resolution.
According to a second aspect of the embodiments of the present disclosure, a drone-based fire monitoring apparatus is provided, including:
an acquisition module, configured to acquire a thermal image collected by a drone;
a first determination module, configured to determine a fire scene area according to the thermal image and the shooting parameters of the thermal image;
a second determination module, configured to determine geographic information corresponding to the fire scene area; and
a third determination module, configured to determine, according to the geographic information, the fire line length and burned area corresponding to the fire scene area.
In some embodiments, the first determination module is specifically configured to:
determine the working waveband of the thermal imager on the drone and the distance between the thermal imager and the fire point;
determine a temperature correction parameter according to the working waveband and the distance;
correct the temperature of each region in the thermal image according to the temperature correction parameter, so as to determine the corrected temperature of each region; and
in a case where the corrected temperature of any region is greater than a threshold, determine that region to be a fire scene area.
In some embodiments, the second determination module is specifically configured to:
determine, according to the features of each pixel in each thermal image and the shooting parameters, the overlapping seam lines between the fire scene areas respectively corresponding to the thermal images;
fuse the fire scene areas based on the overlapping seam lines to generate a fused fire scene area; and
determine the geographic information corresponding to the fused fire scene area.
In some embodiments, the second determination module is specifically configured to:
determine the geographic information corresponding to each of the fire scene areas; and
fuse the items of geographic information according to the degree of matching between them, to generate fused geographic information.
In some embodiments, the second determination module is specifically configured to:
determine a virtual viewing angle of the thermal imager according to the geographic position information, attitude angles, and field-of-view angles of the thermal imager on the drone;
determine a coordinate transformation matrix of the thermal image based on digital elevation information and the virtual viewing angle; and
determine the geographic coordinate information corresponding to the fire scene area according to the pixel positions of the fire scene area and the coordinate transformation matrix.
In some embodiments, the third determination module is further configured to:
determine the ground resolution corresponding to the fire point according to the distance between the thermal imager on the drone and the fire point in the fire scene, and the focal length information of the thermal imager; and
correct the fire line length and burned area according to the ground resolution.
According to a third aspect of the embodiments of the present disclosure, a terminal device is provided, including:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the drone-based fire monitoring method described in the embodiments of the first aspect above.
According to a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided; when the instructions in the computer-readable storage medium are executed by a processor of a terminal device, the terminal device is enabled to execute the drone-based fire monitoring method described in the embodiments of the first aspect above.
According to a fifth aspect of the embodiments of the present disclosure, a computer program product is provided, including a computer program that, when executed by a processor, implements the drone-based fire monitoring method described in the embodiments of the first aspect above.
The technical solutions provided by the embodiments of the present disclosure bring at least the following beneficial effects. In the present disclosure, after the thermal image collected by the drone is acquired, the fire scene area can be determined according to the thermal image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and burned area corresponding to the fire scene area determined according to the geographic information. Thus, the fire line length and burned area corresponding to the fire scene area can be determined in a timely manner from the thermal images collected by the drone, which improves the timeliness of forest fire information and helps avoid missing the best rescue opportunity.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure; they do not constitute an improper limitation of the present disclosure.
Fig. 1 is a schematic flowchart of a drone-based fire monitoring method provided by a first embodiment of the present disclosure;
Fig. 2 is a schematic flowchart of another drone-based fire monitoring method provided by a second embodiment of the present disclosure;
Fig. 3 is a schematic flowchart of another drone-based fire monitoring method provided by a third embodiment of the present disclosure;
Fig. 4 is a schematic structural diagram of a drone-based fire monitoring apparatus provided by a fourth embodiment of the present disclosure;
Fig. 5 is a block diagram of a terminal device for drone-based fire monitoring according to an exemplary embodiment.
DETAILED DESCRIPTION
In order to enable those of ordinary skill in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings.
It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings of the present disclosure are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of the present disclosure described herein can be implemented in orders other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
In the present disclosure, a drone can monitor the temperature of a forest fire scene area through a mounted thermal imaging camera, so that the fire scene area can be determined based on temperature, and the fire line length and fire scene area can then be determined. Since a drone takes off easily, is directionally controllable, and is flexible and maneuverable, forest fire information can be obtained in a timely manner, improving the timeliness of obtaining forest fire information.
The drone-based fire monitoring method and apparatus of the embodiments of the present disclosure are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a drone-based fire monitoring method provided by an embodiment of the present disclosure, including the following steps 101 to 104.
Step 101: acquire a thermal image collected by a drone.
It should be noted that any object whose temperature is above absolute zero radiates infrared rays (also called thermal radiation), and the radiation intensity differs with temperature. A thermal imager can therefore receive the infrared rays radiated by an object through an infrared detector and convert infrared signals of varying strength into electrical signals; after amplification and video processing, the electrical signals form a thermal image corresponding to the temperature distribution on the surface of the object.
In the present disclosure, the drone can collect thermal images of the forest fire scene in real time through a mounted thermal imager.
In the present disclosure, the number of drones flying in parallel can be determined according to the area of forest on fire. By having multiple drones fly in parallel to collect thermal images of the forest fire scene, fire information for the entire forest can be obtained in a timely manner, improving the timeliness of forest fire information. In addition, after acquiring a thermal image, the drone can transmit it to the server in real time through a wireless transmission module, so that the server can monitor the forest fire in real time.
Step 102: determine a fire scene area according to the thermal image and the shooting parameters of the thermal image.
The shooting parameters may include the distance between the thermal imager and the ground, among others, and can be selected by those skilled in the art according to actual needs.
In the present disclosure, each pixel in the thermal image corresponds to a temperature value. Each pixel's temperature value can therefore be compared with a threshold: when the temperature value corresponding to a pixel is greater than the threshold, the region corresponding to that pixel can be determined to be a fire point; when it is less than the threshold, the region corresponding to that pixel is not a fire point. The region composed of fire points can then be taken as the fire scene area.
In some embodiments, since the shooting parameters of the thermal image may affect the temperature values of the thermal image, the shooting parameters can also be used to correct the temperature value corresponding to each pixel in the thermal image, so as to improve the accuracy of the temperature values.
For example, a temperature correction function corresponding to the shooting parameters can be preset in the system; the temperature value corresponding to each pixel in the thermal image can then be corrected according to the temperature correction function, and the fire scene area determined according to the corrected temperature value of each pixel.
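The per-pixel thresholding described above can be sketched as follows. This is only a minimal illustration, not the disclosed implementation: the 300 °C threshold, the grid values, and the function names are all hypothetical, since the disclosure does not specify a threshold value.

```python
import numpy as np

def fire_mask(temps: np.ndarray, threshold: float = 300.0) -> np.ndarray:
    """Boolean mask marking pixels whose (corrected) temperature, in
    degrees Celsius, exceeds the fire threshold."""
    return temps > threshold

def fire_pixel_count(temps: np.ndarray, threshold: float = 300.0) -> int:
    """Number of pixels classified as burning (fire points)."""
    return int(fire_mask(temps, threshold).sum())

# Hypothetical 3x3 grid of corrected pixel temperatures (deg C)
grid = np.array([
    [25.0,  30.0, 410.0],
    [28.0, 350.0, 500.0],
    [22.0,  27.0,  29.0],
])
mask = fire_mask(grid)  # the connected fire points form the fire scene area
```

In practice the mask would then be grouped into connected regions, each candidate fire scene area being the set of adjacent fire-point pixels.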
Step 103: determine the geographic information corresponding to the fire scene area.
The geographic information may be Global Positioning System (GPS) position information or any other information that can uniquely determine the position of the real fire scene area, and can be selected by those skilled in the art according to actual needs.
It can be understood that the thermal image only contains the camera-coordinate positions corresponding to the fire scene area and does not contain the geographic information of the real fire scene area. A coordinate transformation is therefore needed to convert the fire scene area from the camera coordinate system to the geodetic coordinate system, so as to determine the geographic information corresponding to the fire scene area.
In the present disclosure, the virtual viewing angle of the thermal imager can be determined according to the geographic position information, attitude angles, and field-of-view angles of the thermal imager on the drone; the coordinate transformation matrix of the thermal image can then be determined based on digital elevation information and the virtual viewing angle; and the geographic coordinate information corresponding to the fire scene area can be determined according to the pixel positions of the fire scene area and the coordinate transformation matrix.
The geographic position information of the thermal imager may be the longitude and latitude corresponding to the drone; the attitude angles may include the heading angle, horizontal angle, roll angle, etc.; and the field-of-view angles may include the horizontal and vertical field-of-view angles. The attitude angles and field-of-view angles can be measured by ranging sensors carried by the drone, such as an electro-optical pod with laser ranging. The longitude and latitude corresponding to the drone can be determined by the drone's onboard navigation system. In addition, the geographic position of the forest fire scene can be input into the TS-GIS engine to determine the digital elevation model corresponding to the forest fire scene.
In the present disclosure, the server can input the geographic position information, attitude angles, field-of-view angles, and other information of the thermal imager into the TS-GIS engine, and the virtual viewing angle corresponding to the thermal imager can be determined based on the digital elevation information. Based on the virtual viewing angle, the coordinate transformation matrix from camera coordinates to geodetic coordinates for the fire scene area in the thermal image is then computed. According to the coordinate transformation matrix, the coordinates of each pixel of the thermal image in the camera coordinate system are converted into the geodetic coordinate system, thereby determining the geographic information corresponding to each pixel of the thermal image.
It can be understood that the geographic position information, attitude angles, and field-of-view angles of the thermal imager are the same for every pixel of a given thermal image, so all pixels of the same thermal image share one coordinate transformation matrix. Since the drone is continuously flying, these quantities may change between the thermal images captured during the flight, so the coordinate transformation matrices corresponding to different thermal images may differ.
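The pixel-to-geodetic conversion above can be illustrated with a homogeneous (projective) transformation. This is only a sketch: in the disclosure the actual matrix is produced by the TS-GIS engine from the imager's position, attitude, and field-of-view angles together with the digital elevation model, so the matrix `M` below (a pure scale-and-translate) is purely illustrative.

```python
import numpy as np

def pixel_to_geo(pixels: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Map an N x 2 array of pixel coordinates to ground coordinates
    using a 3 x 3 homogeneous transformation matrix M."""
    ones = np.ones((pixels.shape[0], 1))
    homo = np.hstack([pixels.astype(float), ones])  # N x 3 homogeneous coords
    mapped = homo @ M.T                             # apply the transform
    return mapped[:, :2] / mapped[:, 2:3]           # divide out the w component

# Illustrative matrix: scale by 2 and translate by (100, 200); a real matrix
# would encode the full camera-to-geodetic projection for one thermal image.
M = np.array([
    [2.0, 0.0, 100.0],
    [0.0, 2.0, 200.0],
    [0.0, 0.0,   1.0],
])
geo = pixel_to_geo(np.array([[0, 0], [10, 5]]), M)
```

Consistent with the paragraph above, one matrix is applied to every pixel of a given image, while each image in the flight may carry its own matrix.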
Step 104: determine, according to the geographic information, the fire line length and burned area corresponding to the fire scene area.
In the present disclosure, the geographic information corresponding to the fire scene edge and to the fire scene area can be input into the TS-GIS engine; the fire line length can then be determined according to the geographic information of the fire scene edge, and the burned area determined according to the GPS information of the fire scene area. The fire scene edge may be the boundary between the fire scene area and the non-fire area.
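Once the fire edge carries geographic coordinates, the fire line length and burned area reduce to a polygon perimeter and a shoelace-formula area. The sketch below assumes the GPS coordinates have already been projected into a local metric coordinate system (the disclosure delegates this computation to the TS-GIS engine); the rectangular fire edge is hypothetical.

```python
import math

def fire_line_length(boundary) -> float:
    """Perimeter of the closed fire-edge polygon; coordinates in metres."""
    n = len(boundary)
    return sum(math.dist(boundary[i], boundary[(i + 1) % n]) for i in range(n))

def burned_area(boundary) -> float:
    """Shoelace formula for the area enclosed by the fire edge, in m^2."""
    n = len(boundary)
    s = sum(boundary[i][0] * boundary[(i + 1) % n][1]
            - boundary[(i + 1) % n][0] * boundary[i][1] for i in range(n))
    return abs(s) / 2.0

# Hypothetical fire edge: a 100 m x 50 m rectangle in a local metric projection
edge = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]
```

For this rectangle the perimeter is 300 m and the enclosed area 5000 m²; a real fire edge would be the irregular boundary between fire and non-fire pixels.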
In the present disclosure, after the thermal image collected by the drone is acquired, the fire scene area can be determined according to the thermal image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and burned area corresponding to the fire scene area determined according to the geographic information. Thus, the fire line length and burned area corresponding to the fire scene area can be determined in a timely manner from the thermal images collected by the drone, improving the timeliness of forest fire information and helping avoid missing the best rescue opportunity.
Fig. 2 is a flowchart of a drone-based fire monitoring method provided by an embodiment of the present disclosure, including the following steps 201 to 207.
Step 201: acquire a thermal image collected by a drone.
In the present disclosure, for the specific implementation of step 201, reference may be made to the detailed description in any embodiment of the present disclosure, which will not be repeated here.
Step 202: determine the working waveband of the thermal imager on the drone and the distance between the thermal imager and the fire point.
The working waveband of the thermal imager may be the wavelength range of infrared rays the thermal imager can receive, and can be preset in the system. The distance between the thermal imager and the fire point can be measured by a ranging sensor carried by the drone.
In the present disclosure, when the drone sends the collected thermal images to the server, the distance between the thermal imager and the fire point corresponding to each thermal image can be sent to the server at the same time.
Step 203: determine a temperature correction parameter according to the working waveband and the distance.
The temperature correction parameter may include the corrected irradiance, the response voltage, and the like, which is not limited in the present disclosure.
In the present disclosure, the altitude of the drone may change during flight; affected by the distance between the thermal imager and the fire scene, the temperature information in the thermal image may therefore deviate. To further improve the accuracy of the imaging temperature of the thermal imager, the temperature can be corrected based on the working waveband and the distance.
In the present disclosure, the thermal imager first receives infrared rays and determines the irradiance; the response voltage of the thermal imager is then determined according to the irradiance; and the temperature information of each region of the forest fire scene can then be determined according to Planck's radiation law. The irradiance can therefore be corrected according to the distance, so as to correct the temperature. The corrected irradiance is expressed as:
E_λ = A_0·d⁻²·ε_λ·L_bλ(T_0) + τ_aλ·[(1 − α_λ)·L_bλ(T_u) + ε_aλ·L_bλ(T_a)]  (1)
where ε_λ is the surface emissivity, α_λ the surface absorptance, τ_aλ the atmospheric spectral transmittance, ε_aλ the atmospheric emissivity, T_0 the surface temperature of the measured object, T_u the ambient temperature, T_a the atmospheric temperature, d the distance from the target to the measuring instrument, A_0 (under given conditions) the visible area of the target corresponding to the minimum spatial angle of the thermal imager, and L_bλ the radiance. A_0 can be preset in the system.
In the present disclosure, a thermal imager usually works within a very narrow waveband, so ε_λ, α_λ, and τ_aλ can be taken as fixed values. The corrected response voltage can thus be determined from the corrected irradiance, expressed as:
[Equation (2), rendered as an image in the original: Figure PCTCN2022142552-appb-000001]
where A_R is the area of the thermal imager lens and R_λ is the spectral responsivity of the detector, a constant for a given infrared thermal imager. A_R can be preset in the system.
Letting K = A_R·A_0 and
[expression rendered as an image in the original: Figure PCTCN2022142552-appb-000002]
equation (2) can be written as:
[Equation (3), rendered as an image in the original: Figure PCTCN2022142552-appb-000003]
In addition, since both the atmosphere and wood behave as gray bodies, ε = α, and for the atmosphere ε_a = α_a = 1 − τ_a, where α_a is the atmospheric absorptance. Equation (3) can therefore be written as:
[Equation (4), rendered as an image in the original: Figure PCTCN2022142552-appb-000004]
Step 204: correct the temperature of each region in the thermal image according to the temperature correction parameter, so as to determine the corrected temperature of each region.
In the present disclosure, after the response voltage of the thermal imager is determined, the corrected temperature corresponding to each pixel in the thermal image can be determined according to Planck's radiation law. The image temperature of the thermal imager is expressed as:
[Equation (5), rendered as an image in the original: Figure PCTCN2022142552-appb-000005]
where T_γ is the temperature value of each pixel in the thermal image and n is a constant determined by the working waveband of the thermal imager. Rearranging equation (5) gives the corrected temperature formula of the measured object, as follows:
[Equation (6), rendered as an image in the original: Figure PCTCN2022142552-appb-000006]
where the surface temperature T_0 of the measured object is the corrected temperature.
In the present disclosure, limited by the flight altitude of small drones, the distance from the drone to the fire scene is generally 100 m to 2000 m; the atmospheric transmittance τ_a can therefore be taken as a constant, and the atmospheric temperature can be taken as equal to the ambient temperature, i.e., T_u = T_a. In addition, letting
[expression rendered as an image in the original: Figure PCTCN2022142552-appb-000007]
and σ = K·τ_a, rearranging equation (6) gives the corrected temperature formula, as follows:
[Equation (7), rendered as an image in the original: Figure PCTCN2022142552-appb-000008]
It can be understood that, when wood burns, the surface temperature of the burning object is far greater than the ambient temperature; therefore,
[condition rendered as an image in the original: Figure PCTCN2022142552-appb-000009]
and equation (7) can be rearranged to give the corrected temperature formula:
[Equation (8), rendered as an image in the original: Figure PCTCN2022142552-appb-000010]
The surface emissivity ε of wood can be preset in the system, and the value of n can be preset in the system according to the working waveband of the thermal imager. For example, when the working waveband is 3–5 μm, n can be 8.68; when the working waveband is 8–14 μm, n can be 4.09.
In the present disclosure, the server can use the distance d between the thermal imager and the fire point to correct the temperature value T_γ of each pixel in the thermal image according to equation (8), to obtain the corrected object surface temperature T_0.
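Since equations (2) through (8) are rendered as images in the source text, the sketch below uses the standard grey-body simplification T_0 = T_γ·(ε·τ_a)^(−1/n), which is consistent with the surrounding derivation (burning-object temperature far above the ambient terms) and with the stated values of n. It is an assumption for illustration, not a verbatim transcription of equation (8), and the numeric inputs are hypothetical.

```python
def corrected_temperature(t_measured: float, emissivity: float,
                          tau_a: float, n: float) -> float:
    """Grey-body correction T0 = T_gamma * (eps * tau_a) ** (-1/n),
    the simplified form when the flame temperature is far above the
    ambient/atmospheric terms. Temperatures in kelvin."""
    return t_measured * (emissivity * tau_a) ** (-1.0 / n)

# Hypothetical values: wood emissivity 0.9, atmospheric transmittance 0.8,
# n = 4.09 for an 8-14 micron imager, measured pixel temperature 800 K.
t0 = corrected_temperature(800.0, 0.9, 0.8, 4.09)
```

Note the correction always raises the measured value (ε·τ_a < 1), matching the intuition that emissivity and atmospheric losses make the scene read cooler than it is.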
Step 205: in a case where the corrected temperature of any region is greater than a threshold, determine that region to be a fire scene area.
Step 206: determine the geographic information corresponding to the fire scene area.
Step 207: determine, according to the geographic information, the fire line length and burned area corresponding to the fire scene area.
In the present disclosure, for the specific implementation of steps 205 to 207, reference may be made to the detailed description of any embodiment of the present disclosure, which will not be repeated here.
In the present disclosure, after acquiring the thermal image collected by the drone, the server can determine the working waveband of the thermal imager on the drone and the distance between the thermal imager and the fire point; determine a temperature correction parameter according to the working waveband and the distance; correct the temperature of each region in the thermal image according to the temperature correction parameter so as to determine the corrected temperature of each region; determine any region whose corrected temperature is greater than a threshold to be a fire scene area; and then determine the geographic information corresponding to the fire scene area and, according to the geographic information, the fire line length and burned area corresponding to the fire scene area. Correcting the thermal image temperature according to the distance between the thermal imager and the fire point improves the accuracy of the temperature and, in turn, the accuracy of the fire scene information.
Fig. 3 is a flowchart of a drone-based fire monitoring method provided by an embodiment of the present disclosure, including the following steps 301 to 306.
Step 301: acquire a thermal image collected by a drone.
Step 302: determine a fire scene area according to the thermal image and the shooting parameters of the thermal image.
Step 303: determine the geographic information corresponding to each of the fire scene areas.
In the present disclosure, for the specific implementation of steps 301 to 303, reference may be made to the detailed description in any embodiment of the present disclosure, which will not be repeated here.
Step 304: fuse the items of geographic information according to the degree of matching between them, to generate fused geographic information.
In the present disclosure, since the thermal images captured by each drone during flight correspond only to a local part of the forest fire scene, the multiple thermal images captured by multiple drones can be fused to obtain an intuitive view of the fire situation of the entire forest fire scene area.
In the present disclosure, the server can match the acquired thermal images in pairs. When, for a certain fire scene area in two thermal images, the distance between the geographic information of each pair of corresponding pixels is less than a threshold, the fire scene areas in the two thermal images can be determined to be the same fire scene area; the same fire scene area in the two thermal images can then be fused to generate fused geographic information. Optionally, the overlapping seam lines between the fire scene areas respectively corresponding to the thermal images can also be determined according to the features of each pixel in each thermal image and the shooting parameters; the fire scene areas can then be fused based on the overlapping seam lines to generate a fused fire scene area, and the geographic information corresponding to the fused fire scene area determined.
The features of a pixel may include its pixel value, pixel coordinates, and other information, and can be selected by those skilled in the art according to actual needs.
In the present disclosure, the server can fuse the acquired thermal images in pairs based on image fusion techniques to determine the overlapping seam lines between the fire scene areas of the respective thermal images; the identical areas can then be fused based on the overlapping seam lines to generate a fused fire scene area, and the geographic information corresponding to the fused fire scene area determined based on any of the methods of the present disclosure for determining the geographic information of a fire scene area.
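The pairwise matching of fire scene areas by geographic distance can be sketched as follows. The centroid-distance test and the 30 m threshold are simplifying assumptions standing in for the per-pixel matching and seam-line fusion described above, and the sample coordinates are hypothetical ground positions in metres.

```python
import math

def same_fire_region(region_a, region_b, max_dist: float = 30.0) -> bool:
    """Treat two fire regions (lists of (x, y) ground coordinates) as the
    same physical fire when their centroids lie closer than max_dist."""
    ca = (sum(p[0] for p in region_a) / len(region_a),
          sum(p[1] for p in region_a) / len(region_a))
    cb = (sum(p[0] for p in region_b) / len(region_b),
          sum(p[1] for p in region_b) / len(region_b))
    return math.dist(ca, cb) < max_dist

def fuse_regions(region_a, region_b):
    """Union of the two point sets -- a stand-in for proper seam-line
    blending of the overlapping fire scene areas."""
    return sorted(set(region_a) | set(region_b))

a = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
b = [(5.0, 5.0), (10.0, 10.0), (15.0, 5.0)]  # overlaps a at (10, 10)
```

Matching in ground coordinates rather than pixel coordinates is what lets images from different drones, with different transformation matrices, be fused consistently.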
Step 305: determine the ground resolution corresponding to the fire point according to the distance between the thermal imager on the drone and the fire point in the fire scene, and the focal length information of the thermal imager.
In the present disclosure, each thermal image actually corresponds to a local region of the forest fire scene, and the area of that local region is obviously larger than the area of the thermal image. Each pixel in the thermal image thus actually corresponds to a certain region of the real fire scene. However, each item of geographic information determined from a pixel is only a position point, not the region corresponding to that pixel; the fire line length and burned area determined from pixel-level geographic information may therefore contain errors.
Step 306: correct the fire line length and burned area according to the ground resolution.
In the present disclosure, the geographic information corresponding to a pixel can be taken as the center position of the region corresponding to that pixel. The region corresponding to each pixel can then be determined according to the ground resolution, and the fire line length calculated according to the geographic information of the outer edge of the regions corresponding to the fire line pixels, thereby correcting the fire line length. Likewise, the burned area can be determined based on the geographic position information of the outer edge of the regions corresponding to the fire line pixels.
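The ground-resolution correction can be sketched with the standard pinhole relation GSD = d·p/f (imager-to-fire distance times detector pixel pitch, divided by focal length). The disclosure mentions only the distance and the focal length, so the pixel pitch and all numeric values below are assumptions added for illustration.

```python
def ground_resolution(distance_m: float, focal_length_mm: float,
                      pixel_pitch_um: float) -> float:
    """Ground sampling distance in metres per pixel: GSD = d * p / f."""
    return distance_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

def corrected_metrics(pixel_line_length: int, pixel_area: int, gsd: float):
    """Scale a pixel-count fire line length and pixel-count area into
    metres and square metres using the ground resolution."""
    return pixel_line_length * gsd, pixel_area * gsd * gsd

# Hypothetical: 500 m range, 19 mm lens, 12 micron detector pitch
gsd = ground_resolution(500.0, 19.0, 12.0)          # metres per pixel
length_m, area_m2 = corrected_metrics(1000, 40000, gsd)
```

Because the GSD grows with distance, the same pixel-count fire line corresponds to a longer ground distance when the drone flies farther from the fire, which is exactly why the correction in step 306 is needed.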
In the present disclosure, after acquiring the thermal images collected by the drones, the server can determine the fire scene areas according to the thermal images and their shooting parameters; determine the geographic information corresponding to each fire scene area; fuse the items of geographic information according to the degree of matching between them to generate fused geographic information; then determine the ground resolution corresponding to the fire point according to the distance between the thermal imager on the drone and the fire point in the fire scene and the focal length information of the thermal imager; and correct the fire line length and burned area according to the ground resolution. Correcting the fire line length and burned area using the distance between the thermal imager and the fire point and the focal length of the thermal imager improves their accuracy, thereby providing a reliable basis for fire rescue.
Fig. 4 is a block diagram of a drone-based fire monitoring apparatus according to an exemplary embodiment. Referring to Fig. 4, the apparatus includes an acquisition module 410, a first determination module 420, a second determination module 430, and a third determination module 440.
The acquisition module 410 is configured to acquire a thermal image collected by a drone.
The first determination module 420 is configured to determine a fire scene area according to the thermal image and the shooting parameters of the thermal image.
The second determination module 430 is configured to determine geographic information corresponding to the fire scene area.
The third determination module 440 is configured to determine, according to the geographic information, the fire line length and burned area corresponding to the fire scene area.
In a possible implementation of the embodiments of the present disclosure, the first determination module 420 is specifically configured to: determine the working waveband of the thermal imager on the drone and the distance between the thermal imager and the fire point; determine a temperature correction parameter according to the working waveband and the distance; correct the temperature of each region in the thermal image according to the temperature correction parameter, so as to determine the corrected temperature of each region; and, in a case where the corrected temperature of any region is greater than a threshold, determine that region to be a fire scene area.
In a possible implementation of the embodiments of the present disclosure, the second determination module 430 is specifically configured to: determine, according to the features of each pixel in each thermal image and the shooting parameters, the overlapping seam lines between the fire scene areas respectively corresponding to the thermal images; fuse the fire scene areas based on the overlapping seam lines to generate a fused fire scene area; and determine the geographic information corresponding to the fused fire scene area.
In a possible implementation of the embodiments of the present disclosure, the second determination module 430 is specifically configured to: determine the geographic information corresponding to each of the fire scene areas; and fuse the items of geographic information according to the degree of matching between them, to generate fused geographic information.
In a possible implementation of the embodiments of the present disclosure, the second determination module 430 is specifically configured to: determine a virtual viewing angle of the thermal imager according to the geographic position information, attitude angles, and field-of-view angles of the thermal imager on the drone; determine a coordinate transformation matrix of the thermal image based on digital elevation information and the virtual viewing angle; and determine the geographic coordinate information corresponding to the fire scene area according to the pixel positions of the fire scene area and the coordinate transformation matrix.
In a possible implementation of the embodiments of the present disclosure, the third determination module 440 is further configured to: determine the ground resolution corresponding to the fire point according to the distance between the thermal imager on the drone and the fire point in the fire scene, and the focal length information of the thermal imager; and correct the fire line length and burned area according to the ground resolution.
With regard to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method, and will not be elaborated here.
In the present disclosure, after the thermal image collected by the drone is acquired, the fire scene area can be determined according to the thermal image and its shooting parameters; the geographic information corresponding to the fire scene area can then be determined, and the fire line length and burned area corresponding to the fire scene area determined according to the geographic information. Thus, the fire line length and burned area corresponding to the fire scene area can be determined in a timely manner from the thermal images collected by the drone, which improves the timeliness of forest fire information and helps avoid missing the best rescue opportunity.
Fig. 5 is a block diagram of a terminal device for drone-based fire monitoring according to an exemplary embodiment.
As shown in Fig. 5, the terminal device 500 includes a memory 510, a processor 520, and a bus 530 connecting the different components (including the memory 510 and the processor 520). The memory 510 stores a computer program; when the processor 520 executes the program, the drone-based fire monitoring method described in the embodiments of the present disclosure is implemented.
The bus 530 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The terminal device 500 typically includes a variety of electronic-device-readable media. These media can be any available media that can be accessed by the terminal device 500, including volatile and non-volatile media and removable and non-removable media.
The memory 510 may also include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 540 and/or a cache memory 550. The terminal device 500 may further include other removable/non-removable, volatile/non-volatile computer-system storage media. By way of example only, the storage system 560 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 5, commonly referred to as a "hard drive"). Although not shown in Fig. 5, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from and writing to a removable, non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may also be provided. In these cases, each drive may be connected to the bus 530 through one or more data media interfaces. The memory 510 may include at least one program product having a set (e.g., at least one) of program modules configured to perform the functions of the embodiments of the present disclosure.
A program/utility 580 having a set (at least one) of program modules 570 may be stored, for example, in the memory 510. Such program modules 570 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment. The program modules 570 generally perform the functions and/or methods of the embodiments described in the present disclosure.
The terminal device 500 may also communicate with one or more external devices 590 (e.g., a keyboard, a pointing device, a display 591, etc.), with one or more devices that enable a user to interact with the terminal device 500, and/or with any device (e.g., a network card, a modem, etc.) that enables the terminal device 500 to communicate with one or more other computing devices. Such communication can take place through input/output (I/O) interfaces 592. Furthermore, the terminal device 500 can communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 593. As shown in the figure, the network adapter 593 communicates with the other modules of the terminal device 500 through the bus 530. It should be understood that, although not shown in the figure, other hardware and/or software modules can be used in conjunction with the terminal device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processor 520 executes various functional applications and data processing by running the programs stored in the memory 510.
It should be noted that, for the implementation process and technical principles of the terminal device of this embodiment, reference may be made to the foregoing explanation of the drone-based fire monitoring method of the embodiments of the present disclosure, which will not be repeated here.
In the present disclosure, by screening out the corresponding second attribute information based on different clustering modes, the amount of data to be calculated is reduced, which helps improve calculation speed; and by determining the category label corresponding to the applicant according to the similarity between the first attribute information and each item of second attribute information, together with the category label corresponding to each reference object, the service request can be verified with only simple attribute information from the user even when the relationship data corresponding to the applicant is unknown, which effectively removes high-risk service requests while reducing the time the applicant spends applying for the service and improving the efficiency of service applications. In addition, in a case where the information submitted by the applicant includes relationship data, the relationship data can also be verified in this way to increase the credibility of the relationship data.
In an exemplary embodiment, the present disclosure also provides a computer-readable storage medium including instructions, such as a memory including instructions, which can be executed by a processor of a terminal device to complete the above method. Optionally, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
To implement the above embodiments, the present disclosure also provides a computer program product; when the computer program is executed by a processor of a terminal device, the terminal device is enabled to execute the drone-based fire monitoring method as described above.
Those skilled in the art will readily conceive of other embodiments of the present disclosure after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (15)

  1. A drone-based fire monitoring method, comprising:
    acquiring a thermal image collected by a drone;
    determining a fire scene area according to the thermal image and shooting parameters of the thermal image;
    determining geographic information corresponding to the fire scene area; and
    determining, according to the geographic information, a fire line length and a burned area corresponding to the fire scene area.
  2. The method according to claim 1, wherein determining the fire scene area according to the thermal image and the shooting parameters of the thermal image comprises:
    determining a working waveband of a thermal imager on the drone and a distance between the thermal imager and a fire point;
    determining a temperature correction parameter according to the working waveband and the distance;
    correcting a temperature of each region in the thermal image according to the temperature correction parameter, so as to determine a corrected temperature of each region; and
    in a case where the corrected temperature of any region is greater than a threshold, determining that region to be a fire scene area.
  3. The method according to claim 1, wherein determining the geographic information corresponding to the fire scene area comprises:
    determining, according to features of each pixel in each thermal image and the shooting parameters, overlapping seam lines between fire scene areas respectively corresponding to the thermal images;
    fusing the fire scene areas based on the overlapping seam lines to generate a fused fire scene area; and
    determining geographic information corresponding to the fused fire scene area.
  4. The method according to claim 1, wherein determining the geographic information corresponding to the fire scene area comprises:
    determining geographic information corresponding to each of the fire scene areas; and
    fusing the items of geographic information according to a degree of matching between them, to generate fused geographic information.
  5. The method according to claim 1, wherein determining the geographic information corresponding to the fire scene area comprises:
    determining a virtual viewing angle of a thermal imager according to geographic position information, attitude angles, and field-of-view angles of the thermal imager on the drone;
    determining a coordinate transformation matrix of the thermal image based on digital elevation information and the virtual viewing angle; and
    determining geographic coordinate information corresponding to the fire scene area according to pixel positions of the fire scene area and the coordinate transformation matrix.
  6. The method according to any one of claims 1 to 5, further comprising, after determining the fire line length and the burned area corresponding to the fire scene area according to the geographic information:
    determining a ground resolution corresponding to the fire point according to a distance between the thermal imager on the drone and the fire point in the fire scene, and focal length information of the thermal imager; and
    correcting the fire line length and the burned area according to the ground resolution.
  7. A drone-based fire monitoring apparatus, comprising:
    an acquisition module, configured to acquire a thermal image collected by a drone;
    a first determination module, configured to determine a fire scene area according to the thermal image and shooting parameters of the thermal image;
    a second determination module, configured to determine geographic information corresponding to the fire scene area; and
    a third determination module, configured to determine, according to the geographic information, a fire line length and a burned area corresponding to the fire scene area.
  8. The apparatus according to claim 7, wherein the first determination module is specifically configured to:
    determine a working waveband of a thermal imager on the drone and a distance between the thermal imager and a fire point;
    determine a temperature correction parameter according to the working waveband and the distance;
    correct a temperature of each region in the thermal image according to the temperature correction parameter, so as to determine a corrected temperature of each region; and
    in a case where the corrected temperature of any region is greater than a threshold, determine that region to be a fire scene area.
  9. The apparatus according to claim 7, wherein the second determination module is specifically configured to:
    determine, according to features of each pixel in each thermal image and the shooting parameters, overlapping seam lines between fire scene areas respectively corresponding to the thermal images;
    fuse the fire scene areas based on the overlapping seam lines to generate a fused fire scene area; and
    determine geographic information corresponding to the fused fire scene area.
  10. The apparatus according to claim 7, wherein the second determination module is specifically configured to:
    determine geographic information corresponding to each of the fire scene areas; and
    fuse the items of geographic information according to a degree of matching between them, to generate fused geographic information.
  11. The apparatus according to claim 7, wherein the second determination module is specifically configured to:
    determine a virtual viewing angle of a thermal imager according to geographic position information, attitude angles, and field-of-view angles of the thermal imager on the drone;
    determine a coordinate transformation matrix of the thermal image based on digital elevation information and the virtual viewing angle; and
    determine geographic coordinate information corresponding to the fire scene area according to pixel positions of the fire scene area and the coordinate transformation matrix.
  12. The apparatus according to any one of claims 7 to 11, wherein the third determination module is further configured to:
    determine a ground resolution corresponding to the fire point according to a distance between the thermal imager on the drone and the fire point in the fire scene, and focal length information of the thermal imager; and
    correct the fire line length and the burned area according to the ground resolution.
  13. A terminal device, comprising:
    a processor; and
    a memory for storing instructions executable by the processor;
    wherein the processor is configured to execute the instructions to implement the drone-based fire monitoring method according to any one of claims 1 to 6.
  14. A computer-readable storage medium, wherein, when instructions in the computer-readable storage medium are executed by a processor of a terminal device, the terminal device is enabled to execute the drone-based fire monitoring method according to any one of claims 1 to 6.
  15. A computer program product, comprising a computer program, wherein, when the computer program is executed by a processor, the drone-based fire monitoring method according to any one of claims 1 to 6 is implemented.
PCT/CN2022/142552 2021-12-29 2022-12-27 基于无人机的火情监测方法及装置 WO2023125587A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111642996.2A CN114495416A (zh) 2021-12-29 2021-12-29 基于无人机的火情监测方法、装置及终端设备
CN202111642996.2 2021-12-29

Publications (1)

Publication Number Publication Date
WO2023125587A1 true WO2023125587A1 (zh) 2023-07-06

Family

ID=81507744

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/142552 WO2023125587A1 (zh) 2021-12-29 2022-12-27 基于无人机的火情监测方法及装置

Country Status (2)

Country Link
CN (1) CN114495416A (zh)
WO (1) WO2023125587A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114495416A (zh) * 2021-12-29 2022-05-13 北京辰安科技股份有限公司 基于无人机的火情监测方法、装置及终端设备
CN115212489A (zh) * 2022-07-20 2022-10-21 中国矿业大学 一种面向森林火灾的无人机灭火救援决策辅助系统
CN115880598B (zh) * 2023-02-15 2023-05-02 深圳市蜉飞科技有限公司 一种基于无人机的地面图像检测方法及相关装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200600A (zh) * 2014-09-20 2014-12-10 无锡北斗星通信息科技有限公司 基于火情分析的消防监控系统
CN111445661A (zh) * 2020-04-08 2020-07-24 峰飞国际有限公司 一种火情处理方法、装置、设备及存储介质
CN112735082A (zh) * 2020-12-21 2021-04-30 南京森林警察学院 着火点监测系统及监测方法
US20210283439A1 (en) * 2020-03-12 2021-09-16 RapidDeploy, Inc. Dispatching UAVs for Wildfire Surveillance
CN114495416A (zh) * 2021-12-29 2022-05-13 北京辰安科技股份有限公司 基于无人机的火情监测方法、装置及终端设备

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108600607A (zh) * 2018-03-13 2018-09-28 上海网罗电子科技有限公司 一种基于无人机的消防全景信息展示方法
CN111461013B (zh) * 2020-04-01 2023-11-03 深圳市科卫泰实业发展有限公司 一种基于无人机的实时火场态势感知方法
CN112464819B (zh) * 2020-11-27 2024-01-12 清华大学 基于无人机视频的林火蔓延数据同化方法以及装置
CN112668397A (zh) * 2020-12-04 2021-04-16 普宙飞行器科技(深圳)有限公司 火情实时检测分析方法、系统、存储介质及电子设备
CN112435207B (zh) * 2020-12-07 2024-04-09 深圳航天智慧城市系统技术研究院有限公司 一种基于天空地一体化的森林火灾监测预警方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200600A (zh) * 2014-09-20 2014-12-10 无锡北斗星通信息科技有限公司 基于火情分析的消防监控系统
US20210283439A1 (en) * 2020-03-12 2021-09-16 RapidDeploy, Inc. Dispatching UAVs for Wildfire Surveillance
CN111445661A (zh) * 2020-04-08 2020-07-24 峰飞国际有限公司 一种火情处理方法、装置、设备及存储介质
CN112735082A (zh) * 2020-12-21 2021-04-30 南京森林警察学院 着火点监测系统及监测方法
CN114495416A (zh) * 2021-12-29 2022-05-13 北京辰安科技股份有限公司 基于无人机的火情监测方法、装置及终端设备

Also Published As

Publication number Publication date
CN114495416A (zh) 2022-05-13

Similar Documents

Publication Publication Date Title
WO2023125587A1 (zh) 基于无人机的火情监测方法及装置
AU2012328156B2 (en) Identification and analysis of aircraft landing sites
CN107316012B (zh) 小型无人直升机的火灾检测与跟踪方法
CN111982291B (zh) 一种基于无人机的火点定位方法、装置及系统
US9219858B2 (en) Generating a composite field of view using a plurality of oblique panoramic images of a geographic area
US11887273B2 (en) Post capture imagery processing and deployment systems
CN110930508B (zh) 二维光电视频与三维场景融合方法
Adams et al. Unmanned aerial vehicle data acquisition for damage assessment in hurricane events
CN106683038B (zh) 一种生成火情态势图的方法及装置
CN110675448B (zh) 基于民航客机的地面灯光遥感监测方法、系统及存储介质
CN106683039B (zh) 一种生成火情态势图的系统
CN106537409B (zh) 确定影像的罗盘定位
CN112489032A (zh) 一种复杂背景下无人机载小目标检测定位方法及系统
KR20160082886A (ko) 복수의 센서를 탑재한 무인 비행체를 이용하는 매핑 방법 및 시스템
WO2023150888A1 (en) System and method for firefighting and locating hotspots of a wildfire
CN116385504A (zh) 一种基于无人机采集点云与图像配准的巡检、测距方法
CN116020075A (zh) 针对消防炮的射流落点检测及灭火控制的方法和装置
CN113256493B (zh) 一种热红外遥感图像重建方法和装置
CN109961043A (zh) 一种基于无人机高分辨率影像的单木高度测量方法及系统
JP2020015416A (ja) 画像処理装置
WO2020243256A1 (en) System and method for navigation and geolocation in gps-denied environments
CN108109171A (zh) 无人机航片旋偏角的检测方法、装置、设备和存储介质
KR102392258B1 (ko) 영상 기반 잔불 추적 위치 매핑 장치 및 방법
Guo et al. A new UAV PTZ Controlling System with Target Localization
JP2003139532A (ja) 地図および地図作成方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22914839

Country of ref document: EP

Kind code of ref document: A1