CN111982291A - Fire point positioning method, device and system based on unmanned aerial vehicle - Google Patents

Fire point positioning method, device and system based on unmanned aerial vehicle

Info

Publication number
CN111982291A
Authority
CN
China
Prior art keywords
camera
fire point
fire
coordinates
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910435590.3A
Other languages
Chinese (zh)
Other versions
CN111982291B (en)
Inventor
蔡思杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Hangzhou Hikrobot Co Ltd
Original Assignee
Hangzhou Hikrobot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikrobot Technology Co Ltd
Priority to CN201910435590.3A
Publication of CN111982291A
Application granted
Publication of CN111982291B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0014Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation from gases, flames
    • G01J5/0018Flames, plasma or welding
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/48Thermography; Techniques using wholly visual means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077Imaging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Plasma & Fusion (AREA)
  • Fire-Detection Mechanisms (AREA)
  • Alarm Systems (AREA)
  • Fire Alarms (AREA)

Abstract

The embodiment of the invention provides a fire point positioning method, a fire point positioning device and a fire point positioning system based on an unmanned aerial vehicle, which relate to the technical field of unmanned aerial vehicle monitoring, and the method comprises the following steps: acquiring a thermal image corresponding to a target detection area generated by a camera; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold value exists in the thermal image; and determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle. By adopting the method and the device, the accuracy of the acquired geographic coordinates of the fire point can be improved.

Description

Fire point positioning method, device and system based on unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle monitoring, in particular to a fire point positioning method, a fire point positioning device and a fire point positioning system based on an unmanned aerial vehicle.
Background
Forest fires are natural disasters that break out suddenly, are highly destructive, and are difficult to handle and rescue. When a fire occurs in a forest, if the position of the fire point cannot be found in time, damage and loss can be brought to the forest, the forest ecosystem and human beings. With the development of unmanned aerial vehicle technology, people have gradually tried to use unmanned aerial vehicles to patrol areas prone to catching fire and find the positions of fire points in time.
Disclosure of Invention
The embodiment of the invention aims to provide a fire point positioning method, device and system based on an unmanned aerial vehicle, which can improve the accuracy of the acquired geographic coordinates of the fire point. The specific technical scheme is as follows:
in a first aspect, a method for locating a fire point based on a drone is provided, the method comprising:
acquiring a thermal image corresponding to a target detection area generated by a camera;
determining pixel coordinates of a fire in the thermal image when the fire is present in the thermal image at a temperature above a preset threshold;
and determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
Optionally, the determining the geographic coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, and the attitude angle of the camera, the focal length of the camera, and the position coordinates of the drone when the camera generates the thermal image includes:
determining a first geographical coordinate according to a preset coordinate conversion algorithm, the pixel coordinate of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinate of the unmanned aerial vehicle;
And determining the geographic coordinates of the fire point according to the first geographic coordinates.
Optionally, the determining the geographic coordinate of the fire point according to the first geographic coordinate includes:
determining the first geographic coordinate as a geographic coordinate of the fire;
or
Determining, according to preset digital elevation model (DEM) data of the target detection area, the geographic coordinates of the fire point on the line connecting the position coordinates of the unmanned aerial vehicle and the first geographic coordinates, wherein the elevation in the geographic coordinates of the fire point is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the geographic coordinates of the fire point.
Optionally, the method further includes:
adding a fire marker at the geographic coordinates of the fire in the electronic map of the target detection area.
Optionally, the method further includes:
when the fire point with the temperature exceeding the preset threshold value exists in the thermal image, acquiring a visible light image corresponding to the target detection area generated by a camera;
when it is detected that a user clicks a fire marker in the electronic map, the thermal image and the visible light image are displayed.
Optionally, the method further includes:
And when the fire point with the temperature exceeding the preset threshold value exists in the thermal image, alarm information is output.
In a second aspect, a fire point locating device based on an unmanned aerial vehicle is provided, the device comprising:
The first acquisition module is used for acquiring a thermal image corresponding to a target detection area generated by a camera;
a first determination module for determining pixel coordinates of a fire in the thermal image if the fire is present in the thermal image at a temperature above a preset threshold;
and the second determination module is used for determining the geographic coordinates of the fire according to the pixel coordinates of the fire, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
Optionally, the second determining module includes:
the first determining unit is used for determining a first geographic coordinate according to a preset coordinate conversion algorithm, the pixel coordinate of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinate of the unmanned aerial vehicle;
And the second determining unit is used for determining the geographic coordinate of the fire point according to the first geographic coordinate.
Optionally, the second determining unit is specifically configured to:
determining the first geographic coordinate as a geographic coordinate of the fire;
or
Determining, according to preset digital elevation model (DEM) data of the target detection area, the geographic coordinates of the fire point on the line connecting the position coordinates of the unmanned aerial vehicle and the first geographic coordinates, wherein the elevation in the geographic coordinates of the fire point is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the geographic coordinates of the fire point.
Optionally, the apparatus further comprises a marking module;
the marking module is used for adding fire point marks at the geographic coordinates of the fire points in the electronic map of the target detection area.
Optionally, the apparatus further comprises:
the second acquisition module is used for acquiring a visible light image corresponding to the target detection area generated by the camera when a fire point with the temperature exceeding the preset threshold value exists in the thermal image;
and the display module is used for displaying the thermal image and the visible light image when detecting that the fire mark in the electronic map is clicked by the user.
Optionally, the apparatus further comprises an output module;
and the output module is used for outputting alarm information when the fire point with the temperature exceeding the preset threshold value exists in the thermal image.
In a third aspect, a fire point positioning system based on an unmanned aerial vehicle is provided, the system comprising an unmanned aerial vehicle and a user terminal, wherein the unmanned aerial vehicle is provided with a camera;
the unmanned aerial vehicle is used for acquiring a thermal image corresponding to the target detection area generated by the camera; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold exists in the thermal image; sending the pixel coordinates of the fire point to the user terminal;
the user terminal is used for determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
In a fourth aspect, an unmanned aerial vehicle based fire point locating system is provided, the system comprising an unmanned aerial vehicle and a user terminal, the unmanned aerial vehicle being provided with a camera;
the unmanned aerial vehicle is used for acquiring a thermal image corresponding to the target detection area generated by the camera; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold exists in the thermal image; determining the geographic coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of a thermal imaging sensor of the camera, and the attitude angle of the camera, the focal length of the camera and the position coordinates of the unmanned aerial vehicle when the camera generates the thermal image; sending the geographical coordinates of the fire point to the user terminal;
And the user terminal is used for receiving the geographic coordinates of the fire point sent by the unmanned aerial vehicle.
In a fifth aspect, there is provided an electronic device comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, the processor being caused by the machine-executable instructions to: the method steps of the first aspect are implemented.
In a sixth aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when being executed by a processor, carries out the method steps of the first aspect.
According to the method, the device and the system for locating the fire point based on the unmanned aerial vehicle, provided by the embodiment of the invention, the thermal image corresponding to the target detection area generated by the camera can be obtained; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold value exists in the thermal image; and determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle. Compared with the prior art, the pixel coordinate of the fire point is converted into the geographic coordinate of the fire point, and the accuracy of the acquired geographic coordinate of the fire point can be improved.
Of course, not all of the advantages described above need to be achieved at the same time in the practice of any one product or method of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a fire point locating method based on an unmanned aerial vehicle according to an embodiment of the present application;
FIG. 2 is a graphical representation of a ground location coordinate system provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of a fire point locating device based on an unmanned aerial vehicle according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a fire point locating system based on an unmanned aerial vehicle according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of another fire point locating system based on an unmanned aerial vehicle according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
One possible way of detecting the location of a fire point with an unmanned aerial vehicle is as follows: a camera with a thermal imaging function is fixedly mounted on the unmanned aerial vehicle at a certain angle, so that after the unmanned aerial vehicle takes off the camera shoots vertically downwards. While the unmanned aerial vehicle inspects the detection area, the camera shoots vertically downwards and generates thermal images of the detection area; when a fire point whose temperature exceeds a preset threshold exists in a thermal image, the current longitude and latitude coordinates of the unmanned aerial vehicle are acquired and taken as the geographic coordinates of the fire point. Because the camera shoots vertically downwards, the longitude and latitude coordinates of the unmanned aerial vehicle can be approximately regarded as the geographic coordinates of the fire point.
However, when the installation angle of the camera on the unmanned aerial vehicle is wrong, the camera does not shoot vertically downwards, and the accuracy of the acquired geographic coordinates of the fire point is low.
The fire point positioning method based on the unmanned aerial vehicle can be implemented by an electronic device. The electronic device can communicate with the unmanned aerial vehicle, on which a camera with a thermal imaging function is provided, and the electronic device can acquire a thermal image corresponding to a target detection area generated by the camera; determine the pixel coordinates of a fire point in the thermal image when a fire point whose temperature exceeds a preset threshold exists in the thermal image; and determine the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
Alternatively, the fire point positioning method based on the unmanned aerial vehicle provided by the embodiment of the present application may also be implemented by the unmanned aerial vehicle and the user terminal (or ground station) together. The unmanned aerial vehicle is provided with a camera with a thermal imaging function and can inspect a preset detection area, and the camera can periodically generate thermal images of the detection area. The unmanned aerial vehicle can acquire the thermal image corresponding to the target detection area currently generated by the camera, determine the pixel coordinates of a fire point in the thermal image when a fire point whose temperature exceeds a preset threshold exists in the thermal image, and send the pixel coordinates of the fire point to the user terminal. Then, the user terminal may determine the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset thermal imaging sensor size of the camera, and the attitude angle of the camera, the focal length of the camera and the position coordinates of the drone when the camera generates the thermal image.
In the possible way described above for detecting the location of the fire point with the drone, the camera is mounted on the drone at a specified angle so that it shoots vertically downwards, and the longitude and latitude coordinates of the drone are directly taken as the geographical coordinates of the fire point. However, when the installation angle of the camera on the unmanned aerial vehicle is wrong, the camera does not shoot vertically downwards, and directly using the longitude and latitude coordinates of the unmanned aerial vehicle as the geographic coordinates of the fire point gives low accuracy. In the present application, the pixel coordinates of the fire point are converted into the geographic coordinates of the fire point, so the obtained geographic coordinates of the fire point have high accuracy.
The embodiment of the present application describes an example in which the above-mentioned fire point positioning method based on the unmanned aerial vehicle is implemented by the unmanned aerial vehicle and the user terminal together, and the other situations are similar to the above-mentioned case. As shown in fig. 1, the method may include the steps of:
Step 101, acquiring a thermal image corresponding to a target detection area generated by a camera.
In implementation, the unmanned aerial vehicle may patrol a preset detection area according to a preset flight strategy, where the flight strategy may be manually controlled flight, pointed flight, route flight, regional flight, or the like. During the inspection by the unmanned aerial vehicle, the camera arranged on the unmanned aerial vehicle can periodically generate thermal images corresponding to the target detection area, and the unmanned aerial vehicle can acquire the thermal image corresponding to the target detection area generated by the camera.
With the unmanned aerial vehicle at any position, the camera can generate thermal images corresponding to the target detection area at different shooting angles. Compared with the possible way described above, in which the camera can only shoot vertically downwards at any position of the unmanned aerial vehicle and the monitoring field of view of the camera is narrow, the monitoring field of view of the camera in the present application is larger, so the target detection area can be inspected more quickly, fire points can be found, and the position of a fire point can be determined more efficiently.
Step 102, when a fire point whose temperature exceeds a preset threshold exists in the thermal image, determining the pixel coordinates of the fire point in the thermal image.
In implementation, the drone may determine whether a fire point whose temperature exceeds a preset threshold exists in the thermal image according to the temperature value corresponding to each pixel point in the thermal image. If such a fire point exists, the drone may determine the pixel coordinates of the fire point in the thermal image. In one implementation, the unmanned aerial vehicle can acquire temperature data of the thermal image, where the temperature data comprises the pixel coordinates of each pixel point in the thermal image and the temperature value corresponding to each pixel point. The unmanned aerial vehicle can determine, according to the temperature data, first pixel points whose temperatures exceed the preset threshold, connect first pixel points that are adjacent in position to obtain a first pixel region, and determine the pixel coordinates of a target pixel point in the first pixel region as the pixel coordinates of a fire point. The target pixel point is the central pixel point of the first pixel region, or any pixel point in the first pixel region. There may be zero, one or more first pixel regions in a thermal image.
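As an illustration of this thresholding and region-connection step, the following sketch (not taken from the patent; the threshold value, the use of scipy connected-component labelling, and the choice of the region center as the target pixel point are assumptions) computes candidate fire-point pixel coordinates from a per-pixel temperature array:

```python
import numpy as np
from scipy import ndimage

def find_fire_pixel_coords(temperature, threshold=150.0):
    """temperature: 2-D array of per-pixel temperature values of the thermal image.
    Returns the (column, row) pixel coordinates of the center of each connected
    region whose pixels exceed the threshold (zero, one or more regions)."""
    mask = temperature > threshold                 # first pixel points
    labels, num_regions = ndimage.label(mask)      # connect adjacent first pixel points
    coords = []
    for region_id in range(1, num_regions + 1):
        row, col = ndimage.center_of_mass(mask, labels, region_id)
        coords.append((int(round(col)), int(round(row))))   # target pixel point
    return coords
```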
Steps 101 and 102 may be performed by the camera disposed on the drone, or by another processing module of the drone.
Step 103, determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
The attitude angle of the camera may include a pitch angle of the pan-tilt on which the camera is mounted and a yaw angle of that pan-tilt; the pan-tilt is a bearing mechanism of the camera that can drive the camera to rotate so as to change the shooting angle of the camera.
In implementation, a positioning device is provided on the unmanned aerial vehicle and can detect the position coordinates of the unmanned aerial vehicle, where the position coordinates include longitude and latitude coordinates and an elevation. When the camera generates the thermal image, the unmanned aerial vehicle can acquire the position coordinates of the unmanned aerial vehicle detected by the positioning device, the attitude angle of the pan-tilt on which the camera is mounted (that is, the attitude angle of the camera) and the focal length of the camera, and the unmanned aerial vehicle can then send the acquired position coordinates of the unmanned aerial vehicle, the attitude angle of the pan-tilt, the focal length of the camera and the pixel coordinates of the fire point to the user terminal.
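Purely for illustration, the quantities captured at the moment the thermal image is generated and sent to the user terminal can be grouped as in the sketch below; the field names and units are assumptions of this sketch, not defined in the patent:

```python
from dataclasses import dataclass

@dataclass
class FireReport:
    # Telemetry captured when the thermal image was generated.
    uav_longitude: float   # degrees
    uav_latitude: float    # degrees
    uav_elevation: float   # meters
    gimbal_pitch: float    # degrees, attitude angle of the pan-tilt (camera)
    gimbal_yaw: float      # degrees
    focal_length: float    # millimeters
    fire_pixel_x: int      # pixel coordinates of the fire point in the thermal image
    fire_pixel_y: int
```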
The user terminal determines the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle. The specific processing process is as follows:
the user terminal can convert the pixel coordinates of the fire point into image coordinates under an image coordinate system, and the specific processing process is as follows: the pixel coordinates of the fire are substituted into formula (1) to calculate the image coordinates of the fire. The image coordinate system takes the center of an image plane as a coordinate origin, and the X axis and the Y axis are respectively parallel to two vertical edges of the image plane and the coordinate values are expressed by (X, Y). The image coordinate system is the representation of the location of a pixel in an image in physical units (e.g., millimeters).
(Formula (1), shown as an image in the original publication)
Wherein destX and destY are the pixel abscissa and the pixel ordinate of the fire point respectively, picW and picH are the pixel length and the pixel width of the imaging plane respectively, sensorW and sensorH are the length and the width of the thermal imaging sensor respectively, and x and y are the image coordinates of the fire point.
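Since formula (1) appears only as an image in the original publication, the following is a minimal sketch of the center-origin conversion it describes, under the assumption that sensorW and sensorH are given in the same physical units (e.g. millimeters) as the desired image coordinates:

```python
def pixel_to_image(destX, destY, picW, picH, sensorW, sensorH):
    # Shift the origin from the image corner to the image-plane center and
    # scale from pixels to physical sensor units.
    x = (destX - picW / 2.0) * (sensorW / picW)
    y = (destY - picH / 2.0) * (sensorH / picH)
    return x, y
```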
The user terminal can convert the image coordinates of the fire point into camera coordinates in a camera coordinate system, and the specific processing process is as follows: the image coordinates of the fire point are substituted into formula (2) to calculate the camera coordinates of the fire point. The camera coordinate system takes the optical center of the camera as the coordinate origin, its X axis and Y axis are respectively parallel to the X axis and Y axis of the image coordinate system, and its Z axis is the optical axis of the camera.
(Formula (2), shown as an image in the original publication)
Where x and y are the image coordinates of the fire, Xc and Yc are the camera coordinates of the fire, f is the focal length, and Zc is the depth of field coefficient.
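Formula (2) is also shown only as an image; assuming the usual pinhole-projection relation between image coordinates and camera coordinates at depth Zc, it can be sketched as:

```python
def image_to_camera(x, y, f, Zc):
    # Pinhole model: x = f * Xc / Zc and y = f * Yc / Zc, solved for Xc and Yc.
    Xc = x * Zc / f
    Yc = y * Zc / f
    return Xc, Yc, Zc
```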
The user terminal can convert the camera coordinate of the fire point into a world coordinate under a world coordinate system, and the specific processing process is as follows: the camera coordinates of the fire are substituted into equations (3), (4) and (5) to calculate first world coordinates. The world coordinate system is an absolute coordinate system of an objective three-dimensional world, and is also called an objective coordinate system.
(Formulas (3), (4) and (5), shown as images in the original publication)
Where R is the rotation matrix, T is the translation matrix, Xw, Yw and Zw are the first world coordinates (with Zw equal to 0), Xc and Yc are the camera coordinates of the fire point, Zc is the depth of field coefficient, theta represents the pitch angle of the camera relative to the ground, and H represents the height of the camera relative to the ground (that is, the height of the unmanned aerial vehicle relative to its takeoff point).
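Formulas (3) to (5) are likewise only images in the original. The sketch below assumes a pitch-only attitude (a rotation matrix R about the camera X axis by the pitch angle theta, a translation T equal to the camera height H, and no yaw or roll) and obtains the first world coordinates by intersecting the line of sight with the ground plane Zw = 0, which implicitly fixes the depth coefficient Zc:

```python
import numpy as np

def camera_to_first_world(x, y, f, theta, H):
    """x, y: image coordinates of the fire point; f: focal length (same units);
    theta: pitch of the camera below the horizontal, in radians; H: camera height.
    Returns (Xw, Yw, Zw) on the assumed ground plane Zw = 0."""
    d_cam = np.array([x, y, f], dtype=float)          # line of sight in the camera frame
    # Assumed rotation matrix R (world from camera) for a pitch-only attitude,
    # with the world Z axis pointing up.
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0, -np.sin(theta), np.cos(theta)],
                  [0.0, -np.cos(theta), -np.sin(theta)]])
    d_world = R @ d_cam
    cam_pos = np.array([0.0, 0.0, H])                 # assumed translation T
    s = -cam_pos[2] / d_world[2]                      # intersect with the plane Zw = 0
    Xw, Yw, Zw = cam_pos + s * d_world
    return Xw, Yw, Zw
```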
The user terminal can perform an inverse distortion calculation on the first world coordinates according to preset radial and tangential distortion coefficients of the camera to obtain second world coordinates. Inverse distortion calculation is prior art and its details are not repeated in the embodiments of the present application. The user terminal may use the second world coordinates as the world coordinates of the fire point, or use the first world coordinates as the world coordinates of the fire point.
The user terminal may convert the world coordinate of the fire point into a geographic coordinate in a geographic coordinate system, and the specific processing procedure may be: the world coordinates of the fire are substituted into equation (6) to calculate the first geographical coordinates of the fire. The geographic coordinate system is a coordinate system in which longitude and latitude coordinates represent positions.
(Formula (6), shown as an image in the original publication)
Where lng_uav and lat_uav are the longitude and latitude coordinates of the unmanned aerial vehicle, d is the horizontal distance between the fire point and the unmanned aerial vehicle, which can be calculated from the world coordinates of the fire point, angle is the angle of the fire point relative to the north direction, which can be calculated from the pan-tilt yaw angle, M is the physical length (unit: meter) of 1 degree of longitude, and lng_E and lat_E are the longitude and latitude coordinates in the first geographic coordinates.
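Formula (6) is shown only as an image; a flat-earth sketch of the offset it describes is given below, where the default value of M (meters per degree) and the cos(latitude) correction for longitude are assumptions of this illustration:

```python
import math

def world_to_geographic(lng_uav, lat_uav, d, angle, M=111320.0):
    # d: horizontal distance in meters from the drone to the fire point;
    # angle: bearing of the fire point from true north, in degrees.
    a = math.radians(angle)
    north = d * math.cos(a)
    east = d * math.sin(a)
    lat_E = lat_uav + north / M
    lng_E = lng_uav + east / (M * math.cos(math.radians(lat_uav)))
    return lng_E, lat_E
```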
The user terminal may use the elevation of the takeoff point of the drone as the elevation in the first geographic coordinate.
The user terminal may determine the geographical coordinates of the fire from the first geographical coordinates.
Referring to the process of calculating the first world coordinates of the fire point using formula (3), H represents the height of the camera relative to the ground (i.e. the height of the unmanned aerial vehicle relative to its takeoff point). The user terminal calculates the horizontal coordinates Xw and Yw in the first world coordinates of the fire point under the default assumption that the height Zw in the first world coordinates is 0 (i.e. that the elevation of the fire point equals the elevation of the takeoff point of the unmanned aerial vehicle), and then calculates the first geographic coordinates from the first world coordinates. Therefore, when the actual elevation of the fire point is equal to the elevation of the takeoff point of the unmanned aerial vehicle, the user terminal can take the first geographic coordinates as the geographic coordinates of the fire point. For example, assuming the first geographic coordinates are A, the user terminal may take A as the geographic coordinates of the fire point.
When the actual elevation of the fire point is not equal to the elevation of the takeoff point of the unmanned aerial vehicle (for example, the fire point is located on a mountain), the user terminal needs to correct the first geographic coordinates. The specific process is as follows: according to preset DEM (Digital Elevation Model) data of the target detection area, determining second geographic coordinates on the line connecting the position coordinates of the unmanned aerial vehicle and the first geographic coordinates, wherein the elevation in the second geographic coordinates is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the second geographic coordinates, and taking the second geographic coordinates as the geographic coordinates of the fire point.
The DEM data of the target detection area comprises longitude and latitude coordinates and elevations of each position point on the ground.
In implementation, the user terminal may determine the line connecting the position coordinates (longitude and latitude coordinates and elevation) of the unmanned aerial vehicle and the first geographic coordinates, determine the position coordinates (longitude and latitude coordinates and elevation) of each position point on this line, and then determine the second geographic coordinates according to the position coordinates of these points, wherein the elevation in the second geographic coordinates is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the second geographic coordinates; the user terminal then uses the second geographic coordinates as the geographic coordinates of the fire point. If there are multiple such second geographic coordinates, the one closest to the position coordinates of the drone is taken as the geographic coordinates of the fire point.
For example, as shown in Fig. 2, the horizontal axis represents longitude and latitude coordinates and the vertical axis represents elevation; point E represents the position coordinates of the unmanned aerial vehicle, point D represents the first geographic coordinates, and curve H represents the relationship between the longitude and latitude coordinates and the elevation of each position point on the ground. The user terminal may determine the line connecting point E and point D. When this line passes through the ground (e.g., through a mountain, a building, etc.), it indicates that point D is blocked and cannot be captured by the camera, and the real geographic coordinates of the fire point are at the blocking point corresponding to point D, that is, at point F, the first intersection of the line from point E towards point D with the ground. The user terminal may use the geographic coordinates of point F as the geographic coordinates of the fire point.
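A minimal sketch of this DEM correction, assuming a callable dem_elevation(lng, lat) lookup into the DEM data and simple uniform sampling along the line from the drone (point E) to the first geographic coordinates (point D), is:

```python
import numpy as np

def correct_with_dem(uav_pos, first_geo, dem_elevation, steps=1000):
    """uav_pos, first_geo: (lng, lat, elevation) triples.
    Walks from the drone towards the first geographic coordinates and returns
    the first sample whose elevation has dropped to the terrain elevation
    (point F in Fig. 2); otherwise returns the first geographic coordinates."""
    p0 = np.asarray(uav_pos, dtype=float)
    p1 = np.asarray(first_geo, dtype=float)
    for t in np.linspace(0.0, 1.0, steps):
        lng, lat, elev = p0 + t * (p1 - p0)
        ground = dem_elevation(lng, lat)
        if elev <= ground:
            return lng, lat, ground
    return tuple(p1)
```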
Therefore, the geographic coordinate of the fire point is determined according to the first geographic coordinate and the DEM data, and the accuracy of the obtained geographic coordinate of the fire point is high.
Compared with the prior-art technical scheme in which the longitude and latitude coordinates of the unmanned aerial vehicle are directly used as the geographic coordinates of the fire point, the present application converts the pixel coordinates of the fire point into the geographic coordinates of the fire point, so the obtained geographic coordinates of the fire point have high accuracy. The present application can analyze the thermal image corresponding to the target detection area generated by the current camera in real time, find the fire point in time and determine the position of the fire point. Moreover, compared with the prior art, in which the camera can only shoot vertically downwards at any position of the unmanned aerial vehicle and the monitoring field of view of the camera is narrow, in the present application the camera can generate multiple thermal images corresponding to the target detection area at different shooting angles at any position of the unmanned aerial vehicle, so the monitoring field of view of the camera is larger, the target detection area can be inspected more quickly, and fire points can be found and their positions determined more efficiently.
Optionally, the fire point positioning method based on the unmanned aerial vehicle further includes the following processing procedures: fire markers are added at the geographic coordinates of the fire in the electronic map of the target detection area.
In implementation, after determining the geographic coordinates of the fire, the user terminal may add a fire mark to the geographic coordinates of the fire in the electronic map of the target detection area, so as to facilitate the user to see the location of the fire.
Optionally, the camera also has the function of capturing visible light images (for example, it is a dual-spectrum camera), and the fire point positioning method based on the unmanned aerial vehicle further includes the following processing steps: when a fire point whose temperature exceeds the preset threshold exists in the thermal image, acquiring a visible light image corresponding to the target detection area generated by the camera; and when it is detected that the user clicks the fire point marker in the electronic map, displaying the thermal image and the visible light image.
In implementation, if a fire point whose temperature exceeds the preset threshold exists in the thermal image, the user terminal may acquire the visible light image corresponding to the target detection area captured by the camera, and when it is detected that the user clicks the fire point marker in the electronic map, display the thermal image and the visible light image, so that the user can check the fire point position and eliminate invalid fire points according to the thermal image and the visible light image.
Optionally, the fire point positioning method based on the unmanned aerial vehicle further includes the following processing procedures: and when the fire point with the temperature exceeding the preset threshold exists in the thermal image, outputting alarm information.
In implementation, if a fire point whose temperature exceeds the preset threshold exists in the thermal image, the user terminal can output alarm information to prompt the user that a fire has occurred, so that fire extinguishing can be carried out in time.
Based on the same technical concept, as shown in fig. 3, the embodiment of the present application further provides a fire point positioning device based on an unmanned aerial vehicle, the device includes:
a first obtaining module 301, configured to obtain a thermal image corresponding to a target detection area generated by a camera;
a first determination module 302 for determining pixel coordinates of a fire in the thermal image when the fire is present in the thermal image at a temperature exceeding a preset threshold;
a second determining module 303, configured to determine geographic coordinates of the fire according to the pixel coordinates of the fire, a preset size of a thermal imaging sensor of the camera, and an attitude angle of the camera, a focal length of the camera, and position coordinates of the drone when the camera generates the thermal image.
Optionally, the second determining module 303 includes:
The first determining unit is used for determining a first geographic coordinate according to a preset coordinate conversion algorithm, the pixel coordinate of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinate of the unmanned aerial vehicle;
and the second determining unit is used for determining the geographic coordinate of the fire point according to the first geographic coordinate.
Optionally, the second determining unit is specifically configured to:
determining the first geographic coordinate as a geographic coordinate of the fire;
or
Determining, according to preset digital elevation model (DEM) data of the target detection area, the geographic coordinates of the fire point on the line connecting the position coordinates of the unmanned aerial vehicle and the first geographic coordinates, wherein the elevation in the geographic coordinates of the fire point is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the geographic coordinates of the fire point.
Optionally, the apparatus further comprises a marking module;
the marking module is used for adding fire point marks at the geographic coordinates of the fire points in the electronic map of the target detection area.
Optionally, the apparatus further comprises:
The second acquisition module is used for acquiring a visible light image corresponding to the target detection area generated by the camera when a fire point with the temperature exceeding the preset threshold value exists in the thermal image;
and the display module is used for displaying the thermal image and the visible light image when detecting that the fire mark in the electronic map is clicked by the user.
Optionally, the apparatus further comprises an output module;
and the output module is used for outputting alarm information when the fire point with the temperature exceeding the preset threshold value exists in the thermal image.
Based on the same technical concept, as shown in fig. 4, the embodiment of the present application further provides a fire point positioning system based on an unmanned aerial vehicle, where the system includes an unmanned aerial vehicle 401 and a user terminal 402, and the unmanned aerial vehicle 401 is provided with a camera;
the unmanned aerial vehicle 401 is configured to acquire a thermal image corresponding to the target detection area generated by the camera; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold exists in the thermal image; sending the pixel coordinates of the fire point to the user terminal;
the user terminal 402 is configured to determine the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, and the attitude angle of the camera, the focal length of the camera, and the position coordinates of the unmanned aerial vehicle when the camera generates the thermal image.
Based on the same technical concept, as shown in fig. 5, the embodiment of the present application further provides a fire point positioning system based on an unmanned aerial vehicle, the system includes an unmanned aerial vehicle 501 and a user terminal 502, the unmanned aerial vehicle 501 is provided with a camera;
the unmanned aerial vehicle 501 is configured to acquire a thermal image corresponding to a target detection area generated by the camera; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold exists in the thermal image; determining the geographic coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of a thermal imaging sensor of the camera, and the attitude angle of the camera, the focal length of the camera and the position coordinates of the unmanned aerial vehicle when the camera generates the thermal image; sending the geographical coordinates of the fire point to the user terminal;
the user terminal 502 is configured to receive the geographic coordinates of the fire point sent by the unmanned aerial vehicle 501.
An embodiment of the present invention further provides an electronic device, as shown in fig. 6, including a processor 601, a communication interface 602, a memory 603, and a communication bus 604, where the processor 601, the communication interface 602, and the memory 603 complete mutual communication through the communication bus 604,
A memory 603 for storing a computer program;
the processor 601 is configured to implement the following steps when executing the program stored in the memory 603:
acquiring a thermal image corresponding to a target detection area generated by a camera; determining pixel coordinates of a fire in the thermal image when the fire is present in the thermal image at a temperature above a preset threshold; and determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
Optionally, the program executed by the processor 601 for determining the geographic coordinates of the fire according to the pixel coordinates of the fire, the preset thermal imaging sensor size of the camera, and the attitude angle of the camera, the focal length of the camera, and the position coordinates of the drone when the camera generates the thermal image may include: determining a first geographical coordinate according to a preset coordinate conversion algorithm, the pixel coordinate of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinate of the unmanned aerial vehicle; and determining the geographic coordinates of the fire point according to the first geographic coordinates.
Optionally, the program executed by the processor 601 for determining the geographic coordinates of the fire point according to the first geographic coordinates may include: determining the first geographic coordinates as the geographic coordinates of the fire point; or, determining, according to preset digital elevation model (DEM) data of the target detection area, the geographic coordinates of the fire point on the line connecting the position coordinates of the unmanned aerial vehicle and the first geographic coordinates, wherein the elevation in the geographic coordinates of the fire point is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the geographic coordinates of the fire point.
Optionally, the program executed by the processor 601 may further include: adding a fire marker at the geographic coordinates of the fire in the electronic map of the target detection area.
Optionally, the program executed by the processor 601 may further include: when the fire point with the temperature exceeding the preset threshold value exists in the thermal image, acquiring a visible light image corresponding to the target detection area generated by a camera; when it is detected that a user clicks a fire marker in the electronic map, the thermal image and the visible light image are displayed.
Optionally, the program executed by the processor 601 may further include: and when the fire point with the temperature exceeding the preset threshold value exists in the thermal image, alarm information is output.
The communication bus 604 mentioned in the above electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus 604 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface 602 is used for communication between the above-described electronic apparatus and other apparatuses.
The memory 603 may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory 603 may also be at least one storage device located remotely from the aforementioned processor.
The processor 601 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In yet another embodiment of the present invention, a computer-readable storage medium is also provided, having a computer program stored therein, which when executed by a processor, performs the steps of any of the above-mentioned drone-based fire location methods.
In yet another embodiment, a computer program product containing instructions is also provided, which when run on a computer, causes the computer to perform any of the above-described drone-based fire location methods.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus and system embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (16)

1. A fire point locating method based on an unmanned aerial vehicle, the method comprising:
acquiring a thermal image corresponding to a target detection area generated by a camera;
determining pixel coordinates of a fire in the thermal image when the fire is present in the thermal image at a temperature above a preset threshold;
and determining the geographical coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
2. The method of claim 1, wherein determining the geographic coordinates of the fire from the pixel coordinates of the fire, a preset thermal imaging sensor size of the camera, and attitude angles of the camera, focal lengths of the camera, and position coordinates of the drone when generating the thermal image comprises:
Determining a first geographical coordinate according to a preset coordinate conversion algorithm, the pixel coordinate of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinate of the unmanned aerial vehicle;
and determining the geographic coordinates of the fire point according to the first geographic coordinates.
3. The method of claim 2, wherein determining the geographic coordinates of the fire from the first geographic coordinates comprises:
determining the first geographic coordinate as a geographic coordinate of the fire;
or
Determining, according to preset digital elevation model (DEM) data of the target detection area, the geographic coordinates of the fire point on the line connecting the position coordinates of the unmanned aerial vehicle and the first geographic coordinates, wherein the elevation in the geographic coordinates of the fire point is equal to the elevation corresponding, in the DEM data, to the longitude and latitude coordinates in the geographic coordinates of the fire point.
4. The method of claim 1, further comprising:
adding a fire marker at the geographic coordinates of the fire in the electronic map of the target detection area.
5. The method of claim 4, further comprising:
when the fire point with the temperature exceeding the preset threshold value exists in the thermal image, acquiring a visible light image corresponding to the target detection area generated by a camera;
when it is detected that a user clicks a fire marker in the electronic map, the thermal image and the visible light image are displayed.
6. The method of claim 1, further comprising:
and when the fire point with the temperature exceeding the preset threshold value exists in the thermal image, alarm information is output.
7. A fire point locating device based on an unmanned aerial vehicle, characterized in that the device includes:
the first acquisition module is used for acquiring a thermal image corresponding to a target detection area generated by a camera;
a first determination module for determining pixel coordinates of a fire in the thermal image if the fire is present in the thermal image at a temperature above a preset threshold;
and the second determination module is used for determining the geographic coordinates of the fire according to the pixel coordinates of the fire, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
8. The apparatus of claim 7, wherein the second determining module comprises:
the first determining unit is used for determining a first geographic coordinate according to a preset coordinate conversion algorithm, the pixel coordinate of the fire point, the preset size of a thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinate of the unmanned aerial vehicle;
and the second determining unit is used for determining the geographic coordinate of the fire point according to the first geographic coordinate.
9. The apparatus according to claim 8, wherein the second determining unit is specifically configured to:
determining the first geographic coordinate as the geographic coordinates of the fire point;
or
determining the geographic coordinates of the fire point on the line from the position coordinates of the unmanned aerial vehicle to the first geographic coordinate according to preset digital elevation model (DEM) data of the target detection area, wherein the elevation in the geographic coordinates of the fire point is equal to the elevation that corresponds, in the DEM data, to the longitude and latitude coordinates in the geographic coordinates of the fire point.
10. The apparatus of claim 7, further comprising a marking module;
the marking module is used for adding a fire point marker at the geographic coordinates of the fire point in the electronic map of the target detection area.
11. The apparatus of claim 10, further comprising:
the second acquisition module is used for acquiring a visible light image of the target detection area generated by the camera when a fire point with a temperature exceeding the preset threshold exists in the thermal image;
and the display module is used for displaying the thermal image and the visible light image when it is detected that the user clicks the fire point marker in the electronic map.
12. The apparatus of claim 7, further comprising an output module;
and the output module is used for outputting alarm information when a fire point with a temperature exceeding the preset threshold exists in the thermal image.
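For illustration, a minimal sketch of a coordinate conversion of the kind recited in claims 2 and 8, under stated assumptions: a pinhole camera model; aerospace yaw/pitch/roll attitude angles with the camera's optical axis along the body x axis; a locally flat ground plane at a known elevation; and a local equirectangular latitude/longitude approximation around the unmanned aerial vehicle. The 640×512, 17 µm sensor dimensions and 19 mm focal length in the example are likewise assumptions, not values from the patent.

```python
import math
import numpy as np

EARTH_RADIUS_M = 6_378_137.0  # spherical-earth approximation

def _rot_zyx(yaw, pitch, roll):
    """Body-to-NED rotation for aerospace yaw/pitch/roll angles (radians)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def pixel_to_first_geo(u, v, img_w, img_h,
                       sensor_w_mm, sensor_h_mm, focal_mm,
                       yaw_deg, pitch_deg, roll_deg,
                       uav_lat_deg, uav_lon_deg, uav_alt_m, ground_alt_m):
    """Project a fire-point pixel onto a flat ground plane and return the
    'first geographic coordinate' as (lat, lon, alt), or None."""
    # Pixel -> metric offsets on the sensor; camera frame is x forward
    # (optical axis), y right, z down, so the pixel ray is [f, x, y].
    x_mm = (u - img_w / 2.0) * sensor_w_mm / img_w
    y_mm = (v - img_h / 2.0) * sensor_h_mm / img_h
    ray_cam = np.array([focal_mm, x_mm, y_mm])
    ray_cam /= np.linalg.norm(ray_cam)

    # Camera attitude -> ray direction in the local North-East-Down frame.
    ray_ned = _rot_zyx(math.radians(yaw_deg),
                       math.radians(pitch_deg),
                       math.radians(roll_deg)) @ ray_cam
    if ray_ned[2] <= 1e-9:
        return None  # ray does not descend toward the ground

    # Intersect with a horizontal plane at the assumed ground elevation.
    t = (uav_alt_m - ground_alt_m) / ray_ned[2]
    north_m, east_m = t * ray_ned[0], t * ray_ned[1]

    # Local equirectangular offset -> latitude/longitude.
    lat = uav_lat_deg + math.degrees(north_m / EARTH_RADIUS_M)
    lon = uav_lon_deg + math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(uav_lat_deg))))
    return lat, lon, ground_alt_m

# Example: camera looking straight down (pitch -90 deg) from 500 m altitude.
print(pixel_to_first_geo(420, 300, 640, 512, 10.88, 8.70, 19.0,
                         0.0, -90.0, 0.0, 30.25, 120.15, 500.0, 0.0))
```

Claims 3 and 9 refine the flat-ground assumption with DEM data; a sketch of that refinement follows the claims.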
13. A fire point positioning system based on an unmanned aerial vehicle, characterized by comprising an unmanned aerial vehicle and a user terminal, wherein the unmanned aerial vehicle is provided with a camera;
the unmanned aerial vehicle is used for acquiring a thermal image corresponding to the target detection area generated by the camera; determining pixel coordinates of a fire point in the thermal image when the fire point with the temperature exceeding a preset threshold exists in the thermal image; sending the pixel coordinates of the fire point to the user terminal;
the user terminal is used for determining the geographic coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle.
14. A fire point positioning system based on an unmanned aerial vehicle, characterized by comprising an unmanned aerial vehicle and a user terminal, wherein the unmanned aerial vehicle is provided with a camera;
the unmanned aerial vehicle is used for acquiring a thermal image corresponding to the target detection area generated by the camera; determining pixel coordinates of a fire point in the thermal image when a fire point with a temperature exceeding a preset threshold exists in the thermal image; determining the geographic coordinates of the fire point according to the pixel coordinates of the fire point, the preset size of the thermal imaging sensor of the camera, the attitude angle of the camera when the camera generates the thermal image, the focal length of the camera and the position coordinates of the unmanned aerial vehicle; and sending the geographic coordinates of the fire point to the user terminal;
and the user terminal is used for receiving the geographic coordinates of the fire point sent by the unmanned aerial vehicle.
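For illustration, a sketch of the division of work in claim 13, under assumptions about the transport and field names (none of which are specified by the claim): the unmanned aerial vehicle reports the fire point's pixel coordinates together with the camera and flight metadata that the user terminal needs for the coordinate conversion, serialized as JSON. The terminal side can then reuse, for example, the pixel_to_first_geo sketch above.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class FirePointReport:
    # All field names are illustrative assumptions, not claim language.
    pixel_u: int
    pixel_v: int
    sensor_w_mm: float
    sensor_h_mm: float
    focal_mm: float
    yaw_deg: float
    pitch_deg: float
    roll_deg: float
    uav_lat_deg: float
    uav_lon_deg: float
    uav_alt_m: float

# UAV side: package the report for transmission to the user terminal.
report = FirePointReport(420, 300, 10.88, 8.70, 19.0,
                         0.0, -90.0, 0.0, 30.25, 120.15, 500.0)
payload = json.dumps(asdict(report))

# User terminal side: decode the report and run the coordinate conversion.
received = FirePointReport(**json.loads(payload))
print(received.pixel_u, received.pixel_v, received.uav_lat_deg)
```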
15. An electronic device, comprising a processor and a machine-readable storage medium storing machine-executable instructions executable by the processor, wherein the machine-executable instructions cause the processor to carry out the method steps of any one of claims 1 to 6.
16. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, and the computer program, when executed by a processor, carries out the method steps of any one of claims 1 to 6.
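Finally, for illustration, a sketch of the DEM-based refinement recited in claims 3 and 9: march along the sight line from the unmanned aerial vehicle's position toward the first geographic coordinate and stop where the line's elevation falls to the terrain elevation given by the DEM. The dem_lookup callable, the fixed-step search, and the tuple-based coordinates are illustrative assumptions, not details from the patent.

```python
def refine_with_dem(uav_geo, first_geo, dem_lookup, steps=1000):
    """uav_geo and first_geo are (lat, lon, alt) tuples; dem_lookup(lat, lon)
    returns terrain elevation in metres. Returns the refined fire-point
    coordinates, or first_geo if the sight line never meets the terrain."""
    (lat0, lon0, alt0), (lat1, lon1, alt1) = uav_geo, first_geo
    for i in range(1, steps + 1):
        s = i / steps  # interpolation parameter along the sight line
        lat = lat0 + s * (lat1 - lat0)
        lon = lon0 + s * (lon1 - lon0)
        alt = alt0 + s * (alt1 - alt0)
        terrain = dem_lookup(lat, lon)
        if alt <= terrain:
            # The sight line has met the terrain: take the DEM elevation as
            # the fire point's elevation at this latitude/longitude.
            return lat, lon, terrain
    return first_geo

# Usage with a toy DEM: flat terrain at 120 m.
refined = refine_with_dem((30.25, 120.15, 500.0),
                          (30.252, 120.153, 0.0),
                          dem_lookup=lambda lat, lon: 120.0)
print(refined)
```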
CN201910435590.3A 2019-05-23 2019-05-23 Fire point positioning method, device and system based on unmanned aerial vehicle Active CN111982291B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910435590.3A CN111982291B (en) 2019-05-23 2019-05-23 Fire point positioning method, device and system based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910435590.3A CN111982291B (en) 2019-05-23 2019-05-23 Fire point positioning method, device and system based on unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN111982291A true CN111982291A (en) 2020-11-24
CN111982291B CN111982291B (en) 2022-11-04

Family

ID=73437484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910435590.3A Active CN111982291B (en) 2019-05-23 2019-05-23 Fire point positioning method, device and system based on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN111982291B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103424126A (en) * 2013-08-12 2013-12-04 西安电子科技大学 System and method for verifying visual autonomous landing simulation of unmanned aerial vehicle
CN105865422A (en) * 2016-03-24 2016-08-17 北京林业大学 Method for positioning forest fire at night via unmanned aerial vehicle
CN107544481A (en) * 2016-06-27 2018-01-05 杭州海康机器人技术有限公司 A kind of unmanned plane makes an inspection tour control method, apparatus and system
CN107783555A (en) * 2016-08-29 2018-03-09 杭州海康机器人技术有限公司 A kind of object localization method based on unmanned plane, apparatus and system
CN106373159A (en) * 2016-08-30 2017-02-01 中国科学院长春光学精密机械与物理研究所 Simplified unmanned aerial vehicle multi-target location method
CN106683038A (en) * 2016-11-17 2017-05-17 云南电网有限责任公司电力科学研究院 Method and device for generating fire situation map
CN107247458A (en) * 2017-05-24 2017-10-13 中国电子科技集团公司第二十八研究所 UAV Video image object alignment system, localization method and cloud platform control method
CN107316012A (en) * 2017-06-14 2017-11-03 华南理工大学 The fire detection and tracking of small-sized depopulated helicopter
WO2019033777A1 (en) * 2017-08-18 2019-02-21 深圳市道通智能航空技术有限公司 Method and device for improving depth information of 3d image, and unmanned aerial vehicle
CN107728633A (en) * 2017-10-23 2018-02-23 广州极飞科技有限公司 Obtain object positional information method and device, mobile device and its control method
WO2019084919A1 (en) * 2017-11-03 2019-05-09 SZ DJI Technology Co., Ltd. Methods and system for infrared tracking
CN107727079A (en) * 2017-11-30 2018-02-23 湖北航天飞行器研究所 The object localization method of camera is regarded under a kind of full strapdown of Small and micro-satellite
CN108305264A (en) * 2018-06-14 2018-07-20 江苏中科院智能科学技术应用研究院 A kind of unmanned plane precision landing method based on image procossing
CN109448055A (en) * 2018-09-20 2019-03-08 中国科学院光电研究院 Monocular vision attitude determination method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王士迪: "无人机平台火灾探测研究", 《中国优秀硕士学位论文全文数据库》 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112487117A (en) * 2020-11-27 2021-03-12 新奥数能科技有限公司 Method and device for determining intersection point of earth surface, readable medium and electronic equipment
CN112668397A (en) * 2020-12-04 2021-04-16 普宙飞行器科技(深圳)有限公司 Fire real-time detection and analysis method and system, storage medium and electronic equipment
CN114618107A (en) * 2020-12-08 2022-06-14 江苏数字鹰科技股份有限公司 Method for determining ground fire point coordinates by using aircraft camera
CN112669368A (en) * 2020-12-30 2021-04-16 四川弘和通讯有限公司 Fire spot area detection method, system and equipment based on computer vision
CN112700498A (en) * 2021-01-14 2021-04-23 中广核风电有限公司 Wind driven generator blade tip positioning method and system based on deep learning
CN113329207A (en) * 2021-05-26 2021-08-31 北京远度互联科技有限公司 Auxiliary tracking method, system and computer storage medium based on aircraft shooting
CN113538830A (en) * 2021-05-31 2021-10-22 浙江大华技术股份有限公司 Fire inspection method, device, equipment and computer storage medium
CN113538830B (en) * 2021-05-31 2023-05-19 浙江大华技术股份有限公司 Fire inspection method, device, equipment and computer storage medium
CN114383735A (en) * 2021-12-17 2022-04-22 暨南大学 Thermal power generating unit air cooling array temperature field monitoring method and device based on machine vision
CN114383735B (en) * 2021-12-17 2024-03-26 暨南大学 Thermal power generating unit air cooling array temperature field monitoring method and device based on machine vision
CN114534134A (en) * 2022-03-18 2022-05-27 安徽阿拉丁航空航天有限公司 Online unmanned full-automatic fire prevention rescue unmanned aerial vehicle device and system that puts out a fire

Also Published As

Publication number Publication date
CN111982291B (en) 2022-11-04

Similar Documents

Publication Publication Date Title
CN111982291B (en) Fire point positioning method, device and system based on unmanned aerial vehicle
US11842516B2 (en) Homography through satellite image matching
EP3309762A1 (en) Fire disaster monitoring method and apparatus
US8587664B2 (en) Target identification and location system and a method thereof
US20240046640A1 (en) Data processing method, apparatus, and system for fire scene, and unmanned aerial vehicle
AU2013398544B2 (en) A method of determining the location of a point of interest and the system thereof
CN109979468B (en) Lightning stroke optical path monitoring system and method
JP6877293B2 (en) Location information recording method and equipment
CN106683039B (en) System for generating fire situation map
WO2023125587A1 (en) Fire monitoring method and apparatus based on unmanned aerial vehicle
JP6802599B1 (en) Inspection system
AU2019353165B2 (en) Optics based multi-dimensional target and multiple object detection and tracking method
CN111046121A (en) Environment monitoring method, device and system
CN106683038A (en) Method and device for generating fire situation map
CN111582296B (en) Remote sensing image comprehensive matching method and device, electronic equipment and storage medium
KR20160082886A (en) Method and system for mapping using UAV and multi-sensor
US20170208355A1 (en) Method and apparatus for notifying a user whether or not they are within a camera's field of view
WO2023150888A1 (en) System and method for firefighting and locating hotspots of a wildfire
JP2019007938A5 (en)
CN112668397A (en) Fire real-time detection and analysis method and system, storage medium and electronic equipment
JP7206647B2 (en) Fire dispatch aid, method and program
KR101579970B1 (en) Method and apparatus for calculating location of points captured in image
US20240005770A1 (en) Disaster information processing apparatus, disaster information processing system, disaster information processing method, and program
KR102289550B1 (en) Augmented reality information providing method and system using image deep learning for matching aerial image data and Geographic Information System data
CN118670526A (en) Method, device, equipment and medium for detecting power equipment based on infrared temperature measurement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Hikvision Robot Co.,Ltd.

Address before: 310052 5 / F, building 1, building 2, no.700 Dongliu Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee before: HANGZHOU HIKROBOT TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20230711

Address after: No.555, Qianmo Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Hikvision Digital Technology Co.,Ltd.

Address before: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Patentee before: Hangzhou Hikvision Robot Co.,Ltd.