CN116665398A - Ground accurate positioning method based on fireproof monitoring video - Google Patents


Info

Publication number
CN116665398A
Authority
CN
China
Prior art keywords
cradle head
fire
aerial vehicle
unmanned aerial
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310731932.2A
Other languages
Chinese (zh)
Inventor
刘波
周磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dilin Weiye Technology Co ltd
Original Assignee
Beijing Dilin Weiye Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dilin Weiye Technology Co ltd filed Critical Beijing Dilin Weiye Technology Co ltd
Priority to CN202310731932.2A priority Critical patent/CN116665398A/en
Publication of CN116665398A publication Critical patent/CN116665398A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00: Fire alarms; Alarms responsive to explosion
    • G08B 17/12: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B 17/125: Actuation by using a video camera to detect fire or smoke
    • G08B 17/005: Fire alarms for forest fires, e.g. detecting fires spread over a large or outdoors area

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Fire-Detection Mechanisms (AREA)

Abstract

The invention discloses a ground accurate positioning method based on fireproof monitoring video, which comprises the following steps. S1: deploy a pan-tilt head to observe the monitoring area. S2: obtain DEM data of the monitoring area. S3: determine calibration values for the pan-tilt head attitude parameters from the attitude data generated when the head observes an RTK unmanned aerial vehicle positioned at designated points. S4: when a fire is detected, aim the pan-tilt head at the fire point and determine the fire position from the DEM data, the attitude parameter calibration values and the head's real-time PT value. By positioning fire points from fireproof monitoring video, the method is unaffected by weather, illumination and similar factors and works around the clock. Through real-time camera observation and real-time computation by the fire-point accurate positioning service, the fire position is obtained within seconds, so positioning is fast.

Description

Ground accurate positioning method based on fireproof monitoring video
Technical Field
The invention relates to the field of fire positioning, and in particular to a ground accurate positioning method based on fireproof monitoring video.
Background
Fire is among the most common, most prominent and most harmful disasters in daily life, and bears directly on the safety of people's lives and property. Forests are areas where fires are especially likely to occur, so monitoring and controlling forest fires is particularly important. Global warming has intensified in recent years, extreme weather has become more frequent, and the frequency of fires has risen rapidly.
Because the time at which a fire breaks out is uncertain, fire monitoring and positioning must operate around the clock. Fire prevention and control follow the principle of fighting fires while they are still early and small: the earlier the spread of a fire is contained, the smaller the loss, so fire-prevention work demands timeliness. Accurately locating the fire point helps fire departments and emergency rescue teams determine its position precisely and take countermeasures in time to bring the fire under control, so fire-prevention work also demands accuracy. Fire monitoring covers a wide area over long periods, so how to carry out this work at low cost is worth considering. Addressing these characteristic requirements of fire monitoring and positioning, the invention provides an all-weather, low-cost, easy-to-operate, fast and high-precision ground accurate positioning method based on fireproof monitoring video.
Conventional fire localization techniques are typically based on ground observation and data processing, and include the following methods:
1. Manual observation: professional staff patrol their jurisdiction periodically; after discovering a fire they observe along the edge of the fire area or from surrounding high points and determine the fire position with telescopes, optical instruments and similar means. This method consumes considerable manpower and material resources, relies on human observation and judgment, is subjective, and is prone to positioning deviations.
2. Aerial observation: the fire position is determined by visual or photographic observation from aircraft such as helicopters and airplanes. This method is affected by weather, terrain, illumination and other factors, and blind spots or dead zones easily appear.
3. Satellite remote sensing: the infrared spectral signature of the fire area is obtained by satellite remote sensing and analyzed together with geographic information system technology to determine the fire position. The acquisition interval of satellite remote sensing data is long; for example, imagery from the Himawari-8 satellite currently has a sampling interval of at least 10 minutes, so data cannot be acquired in real time, positioning efficiency is low, and rapid response is impossible. Positioning accuracy is also too low to meet high-precision requirements.
4. Traditional ground positioning from fireproof monitoring video: the position of an object in three-dimensional space is determined by three-dimensional reconstruction from its image coordinates and the camera's intrinsic and extrinsic parameters; the object is then accurately positioned on the ground by combining map information, GPS positioning and similar technologies. This approach involves many technical components, a complex technical system and high cost.
None of these techniques simultaneously meets the all-weather, low-cost, easy-to-operate, fast and high-precision requirements of fire-point positioning.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an all-weather, low-cost, easy-to-operate, fast and high-precision ground accurate positioning method based on fireproof monitoring video.
The ground accurate positioning method based on fireproof monitoring video disclosed by the invention comprises the following steps:
S1: deploying a pan-tilt head to observe the monitoring area;
S2: obtaining DEM data of the monitoring area;
S3: determining calibration values for the pan-tilt head attitude parameters from the attitude data generated when the head observes an RTK unmanned aerial vehicle positioned at designated points;
S4: when a fire is detected, aiming the pan-tilt head at the fire point and determining the fire position from the DEM data, the attitude parameter calibration values and the head's real-time PT value.
Further, the step S1 includes:
installing the pan-tilt head on a fire lookout tower and connecting it to the fire-prevention monitoring command center through a private network or a wide area network.
Further, the step S2 includes:
the DEM data describe the elevation of the terrain surface, with a horizontal resolution within 12.5 meters and a vertical accuracy within 5 meters of error.
Further, the step S3 includes:
S301: flying the RTK unmanned aerial vehicle to the pan-tilt head and determining the head's three-dimensional geodetic coordinates;
S302: establishing a three-dimensional topocentric (station-center) coordinate system centered on the head, and the conversion between this system and the three-dimensional geodetic coordinate system;
S303: laying out spatial observation points such that the head can observe every one of them;
S304: directing the unmanned aerial vehicle to each observation point in turn and recording the PT value when the head observes the vehicle at each position;
S305: establishing the pan-tilt attitude parameter correction model and solving for the attitude parameter calibration values.
Further, the step S301 specifically includes:
flying the RTK unmanned aerial vehicle to the same height as the pan-tilt head and reading its altitude, which gives the head altitude H; then flying the vehicle to the position vertically above the head and reading its longitude and latitude, which give the head latitude B and longitude L.
Further, the step S303 specifically includes:
in the three-dimensional topocentric coordinate system, laying out one observation point every 30 degrees on the horizontal plane at the same height as the head, and one point every 60 degrees at pitch angles of -3, -6 and 3 degrees, with the initial horizontal angle defaulting to 0 degrees; the observation distance between each point and the head lies in the range of 100 to 300 meters;
the specific observation distance and initial horizontal angle are chosen using the DEM data so that the head can observe every point, with the distance made as large as possible within the range to reduce the effect of unmanned aerial vehicle position jitter on the computed pitch and azimuth angles.
Further, the step S304 specifically includes:
converting the coordinates of each spatial observation point from the topocentric coordinate system to the geodetic coordinate system supported by the RTK unmanned aerial vehicle, the vehicle then flying to each point using the converted coordinates;
controlling the pan-tilt head to observe the vehicle so that the center of the head's picture is aimed at it, and recording the head's PT value at that moment.
Further, the step S305 specifically includes:
establishing the pan-tilt attitude parameter correction model and solving for the attitude calibration parameters (θ, β, ω_c, σ_c), wherein θ is the elevation angle of the head's rotation plane in its direction of maximum tilt, β is the azimuth angle of that direction of maximum tilt, ω_c is the constant correction to the azimuth observed by the head, and σ_c is the constant correction to the pitch angle observed by the head;
the correction model is:
ω_corr = ω_true - ω_meas = -arctan[tan θ · tan σ · sin(β - ω)] + ω_c   (5-1)
σ_corr = σ_true - σ_meas = -σ_ax + σ_ay + σ_c   (5-2)
α_x = arctan[tan θ · sin(β - ω)]   (5-3)
α_y = arctan[tan θ · cos(β - ω)]   (5-4)
tan(σ - σ_ax) = tan σ · cos α_x   (5-5)
σ_ay = α_y   (5-6)
wherein ω_true and σ_true are the true PT values for a spatial observation point, computed from the point's coordinates and the head's coordinates, and ω_meas and σ_meas are the PT values read from the head when it observes the RTK unmanned aerial vehicle positioned at that point;
combining equation 5-2 with the observed values, the (θ, β, σ_c) values that minimize the root-mean-square model error are taken as the solution; the root-mean-square error is computed as RMS = sqrt((δ_1^2 + δ_2^2 + ... + δ_n^2) / n), wherein δ is the difference between the true pitch angle observation and the pitch angle computed by the model;
in the same way, combining equation 5-1 with the observed values, the ω_c value that minimizes the root-mean-square model error is taken as the solution, δ then being the difference between the true azimuth observation and the azimuth computed by the model.
Further, the step S4 includes:
when a fire occurs, the fire is identified automatically by the video monitoring terminal working with the pan-tilt equipment of the fire-prevention command center, or identified manually;
once the fire point is identified, the center of the video picture is aimed at it automatically or manually, and the fire-point accurate positioning service is notified that this head is now observing a fire and that the fire position must be computed;
the service reads the head's PT value for the fire in real time and computes the true PT value from formulas 5-1 and 5-2;
the service then computes the fire position from the true PT value and the DEM data using line-of-sight and trigonometric relations.
Further, the method further comprises:
S5: after obtaining the fire position, the fire-prevention command center notifies the responsible fire-control departments to carry out the subsequent firefighting work.
The invention has at least the following beneficial effects:
(1) The invention accurately positions fire points from fireproof monitoring video, is unaffected by weather, illumination and similar factors, and works around the clock.
(2) In practical verification, under good observation conditions the final positioning error for a fire point 3 km from the pan-tilt head is within 100 m, so the method has high accuracy.
(3) Through real-time camera observation and real-time computation by the fire-point accurate positioning service, the fire position is obtained within seconds, so positioning is fast.
(4) After a fire breaks out, the user only needs to aim the center of the head's picture at the fire point to obtain its position from the positioning service; operation is simple and convenient.
(5) The construction cost is low and the service life is long.
Other advantageous effects of the present invention will be described in detail in the detailed description section.
Drawings
In order to illustrate the embodiments of the invention or the technical solutions of the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the invention; a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flow chart of the ground accurate positioning method based on fireproof monitoring video.
Fig. 2 is a flow chart of the technical implementation of the ground accurate positioning method based on fireproof monitoring video.
Fig. 3 is a schematic diagram of the P-value observation convention.
Fig. 4 is a schematic diagram of the T-value observation convention.
Fig. 5 is a schematic diagram of the pan-tilt head installation.
Fig. 6 is a schematic diagram of the pan-tilt device network access.
Fig. 7 is a schematic diagram of the topocentric coordinate system.
Fig. 8 is a schematic diagram of the layout of the spatial observation points.
Fig. 9 is a schematic diagram of the pan-tilt picture center aimed at the unmanned aerial vehicle.
Fig. 10 is a schematic diagram of the pan-tilt picture center aimed at the fire point.
Fig. 11 is a schematic diagram of the fire-point accurate positioning service.
Fig. 12 is a logic diagram of the fire-point accurate positioning service computation.
Detailed Description
In order to make the objects, technical solutions and advantages of the invention clearer, the technical solutions of the invention are described in detail below. It is apparent that the described embodiments are only some, not all, embodiments of the invention. All other embodiments obtained by a person skilled in the art on the basis of the embodiments herein without inventive effort fall within the protection scope of the invention.
The ground accurate positioning method based on fireproof monitoring video shown in fig. 1 comprises the following steps:
S1: deploying a pan-tilt head to observe the monitoring area;
S2: obtaining DEM data of the monitoring area;
S3: determining calibration values for the pan-tilt head attitude parameters from the attitude data generated when the head observes an RTK unmanned aerial vehicle positioned at designated points;
S4: when a fire is detected, aiming the pan-tilt head at the fire point and determining the fire position from the DEM data, the attitude parameter calibration values and the head's real-time PT value.
Examples
As shown in fig. 2, the implementation flow of the invention mainly includes: foundation construction of pan-tilt (video monitoring) equipment connected to a private network or wide area network; aiming the center of the head's monitoring picture at the fire point (target point) and reading the head's PT value at that moment; calibrating the attitude parameters of each head with an RTK-equipped unmanned aerial vehicle, which includes measuring the head's geodetic three-dimensional coordinates and solving the calibration model parameters for the head's PT readings (azimuth and pitch); acquiring high-precision DEM data; sending the head coordinates and real-time PT value to the fire-point accurate positioning service (server); and computing the three-dimensional coordinates (B, L, H) of the fire point.
The steps of the invention are described in detail below.
step one
Foundation construction of the pan-tilt equipment connected to a private network or wide area network includes selecting the pan-tilt equipment, choosing the number and placement of devices, network cabling, selecting network equipment, and installing and commissioning the devices.
As shown in fig. 5, the pan-tilt device is generally installed at the top of a fire lookout tower; its lens can rotate 360 degrees in the horizontal plane and 180 degrees in the vertical plane, ensuring that ground targets can be observed. The device has built-in attitude sensing and can report these two angle values, commonly called the PT value in the field. The P value starts from due north on the horizontal plane and increases clockwise, with range [0, 360]. The T value is 0 degrees at the horizontal on the vertical plane, increases upward and decreases downward, with range [-90, 90]. See the P-value diagram in fig. 3 and the T-value diagram in fig. 4.
The pan-tilt equipment is generally installed at the top of the fire tower, 20 to 35 m above the ground; devices from Hikvision and Dahua are mainly used. Thermal-infrared devices are recommended: infrared thermal radiation penetrates fog, haze, rain and snow better than visible light, so the observation effect is hardly affected in severe weather. During installation, the head's horizontal rotation plane should be kept as level as possible.
The fire towers are generally distributed evenly over the monitoring area so that the whole area to be monitored is covered. High points should be chosen as installation positions wherever possible; this enlarges the effective field of view of a single pan-tilt device and improves the accuracy of the computed fire position.
The pan-tilt equipment is connected to the fire-prevention monitoring command center through a private network or a wide area network, and the center can obtain device parameters such as the monitoring picture in real time over the network. See the network access diagram in fig. 6.
A head newly connected to the network must be commissioned so that terminal users on the network can obtain its monitoring picture and PT value in real time, and its operational stability must be confirmed.
Step two
High-precision DEM data are obtained. DEM stands for digital elevation model (Digital Elevation Model), digital terrain model data describing the elevation of the earth's surface. The DEM consists of a grid of cells, each holding an elevation value, from which the elevation H of any geographic point (B, L) within the data extent can be obtained. DEM data are typically produced from radar, laser scanning, satellite mapping and similar sources. The high precision required in this embodiment means not only a horizontal resolution within 12.5 m but also high precision in the vertical direction, with error within 5 m. High-precision DEM data can be obtained under license from the relevant surveying and mapping departments. As for the coordinate system, the geographic coordinate system of the DEM data must be one supported by the RTK unmanned aerial vehicle; if it is not, the DEM data must be converted into a supported system.
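Reading an elevation H for an arbitrary point (B, L) from such a grid typically involves bilinear interpolation between the four surrounding cells. A sketch under assumed conventions (row-major grid, origin at the lower-left cell center, uniform spacing; the function name and layout are illustrative):

```python
def dem_elevation(grid, x0, y0, cell, x, y):
    """Bilinearly interpolate the elevation at (x, y) from a regular grid.
    grid[row][col] holds elevations; (x0, y0) is the lower-left cell center
    and `cell` the grid spacing. Assumes (x, y) lies inside the grid."""
    fx = (x - x0) / cell
    fy = (y - y0) / cell
    i, j = int(fy), int(fx)        # indices of the lower-left cell
    ty, tx = fy - i, fx - j        # fractional offsets in [0, 1)
    h00 = grid[i][j]               # four surrounding elevations
    h01 = grid[i][j + 1]
    h10 = grid[i + 1][j]
    h11 = grid[i + 1][j + 1]
    return (h00 * (1 - tx) * (1 - ty) + h01 * tx * (1 - ty)
            + h10 * (1 - tx) * ty + h11 * tx * ty)
```

On locally planar terrain this lookup is exact; elsewhere its error is bounded by the terrain curvature within one 12.5 m cell.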
Step three
Pan-tilt attitude parameter calibration is carried out with an RTK-equipped unmanned aerial vehicle. An RTK unmanned aerial vehicle carries real-time kinematic (Real Time Kinematic, RTK) positioning, a high-precision satellite positioning technique that achieves centimeter-level accuracy. When the vehicle flies to a point in space, its three-dimensional geodetic coordinates (B, L, H) can be read in real time from the companion app or handheld terminal; the coordinates can also be preset so that the vehicle flies to that position.
The calibration of the pan-tilt attitude parameters comprises the following steps.
a. Measure the head's three-dimensional coordinates (B, L, H). Fly the RTK unmanned aerial vehicle to the same height as the head and read its altitude H. Then fly the vehicle to the position vertically above the head and read its B and L. To guarantee high-precision fire positioning, B and L must be measured to decimeter-level accuracy and H to centimeter-level accuracy.
b. Establish a three-dimensional topocentric coordinate system (ENU) centered on the head, and the conversion between the topocentric system (ENU) and the geodetic system (BLH), for use in later computations. The conversion formula is standard and is not repeated here. The topocentric coordinate system is shown in fig. 7.
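The standard conversion referred to in step b can be sketched as follows, going through Earth-Centered Earth-Fixed coordinates with WGS-84 constants (function names are illustrative; a production system would use a surveyed datum and a geodesy library):

```python
import math

A = 6378137.0             # WGS-84 semi-major axis (m)
E2 = 6.69437999014e-3     # WGS-84 first eccentricity squared

def blh_to_ecef(b_deg, l_deg, h):
    """Geodetic latitude B, longitude L (degrees) and ellipsoidal height
    H (m) to Earth-Centered Earth-Fixed coordinates."""
    b, l = math.radians(b_deg), math.radians(l_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(b) ** 2)  # prime-vertical radius
    return ((n + h) * math.cos(b) * math.cos(l),
            (n + h) * math.cos(b) * math.sin(l),
            (n * (1.0 - E2) + h) * math.sin(b))

def blh_to_enu(b_deg, l_deg, h, b0_deg, l0_deg, h0):
    """East-North-Up coordinates of point (B, L, H) relative to the
    pan-tilt head at (B0, L0, H0)."""
    x, y, z = blh_to_ecef(b_deg, l_deg, h)
    x0, y0, z0 = blh_to_ecef(b0_deg, l0_deg, h0)
    dx, dy, dz = x - x0, y - y0, z - z0
    b0, l0 = math.radians(b0_deg), math.radians(l0_deg)
    east = -math.sin(l0) * dx + math.cos(l0) * dy
    north = (-math.sin(b0) * math.cos(l0) * dx
             - math.sin(b0) * math.sin(l0) * dy + math.cos(b0) * dz)
    up = (math.cos(b0) * math.cos(l0) * dx
          + math.cos(b0) * math.sin(l0) * dy + math.sin(b0) * dz)
    return east, north, up
```

The inverse direction (ENU back to BLH), needed when sending planned observation points to the vehicle, follows by inverting the same rotation and an iterative ECEF-to-geodetic step.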
c. Lay out the spatial observation points used to build the PT calibration model; the unmanned aerial vehicle then visits these points in turn while the head observes it. In the three-dimensional topocentric coordinate system (ENU), one observation point is placed every 30 degrees on the horizontal plane at the head's height, and one every 60 degrees at pitch angles of -3, -6 and 3 degrees. The initial horizontal angle defaults to 0 degrees and may be adjusted flexibly so that every point is above the ground with an unobstructed line of sight. The layout angles are the true PT values: the true P value is ω_true and the true T value is σ_true. The distance between an observation point and the head is called the observation distance and is planned within 100 to 300 m, keeping each point above the ground and unobstructed. The distance should be as close to 300 m as possible: this makes the vehicle easier to find when adjusting the head's picture, and reduces the effect on the computed pitch and azimuth of position jitter caused by wind and other external disturbances. The observation distance and initial horizontal angle are chosen with the help of the acquired DEM data to guarantee that every planned point can be observed without obstruction. The layout is shown in fig. 8; for clarity, only the points at pitch angles of 0 and 3 degrees are drawn.
After the layout is planned, the points are converted into the geodetic coordinate system supported by the RTK unmanned aerial vehicle so that the vehicle can fly to them.
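The layout in step c can be generated programmatically in the head's ENU frame; a sketch (the function name and record layout are illustrative):

```python
import math

def layout_observation_points(distance=300.0, start_p=0.0):
    """Observation points for PT calibration: one every 30 deg at 0 deg
    pitch, one every 60 deg at pitch -3, -6 and +3 deg, all at the given
    observation distance (planned within 100-300 m, preferring 300 m)."""
    layers = [(0.0, 30.0), (-3.0, 60.0), (-6.0, 60.0), (3.0, 60.0)]
    points = []
    for pitch, step in layers:
        for k in range(int(360.0 / step)):
            p = (start_p + k * step) % 360.0          # true P value (deg)
            t, az = math.radians(pitch), math.radians(p)
            points.append({
                "P": p, "T": pitch,                   # true PT values
                "enu": (distance * math.cos(t) * math.sin(az),
                        distance * math.cos(t) * math.cos(az),
                        distance * math.sin(t)),
            })
    return points
```

This yields 12 + 6 + 6 + 6 = 30 points; each point's ENU coordinates are then converted to geodetic coordinates for the vehicle, as described above.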
d. Record the head's PT reading for each spatial observation point. The unmanned aerial vehicle visits each planned point in turn; the center of the head's picture is aimed at the vehicle (see fig. 9) while the PT value is read from the device in real time. The head can be controlled from the manufacturer's companion terminal, or from a custom terminal program developed with the device manufacturer's SDK. The recorded P value is ω_meas and the T value is σ_meas.
e. Establish the pan-tilt attitude parameter correction model and solve for the attitude calibration parameters (θ, β, ω_c, σ_c). Here θ is the elevation angle of the head's rotation plane in its direction of maximum tilt, β is the azimuth angle of that direction of maximum tilt, ω_c is the constant correction to the azimuth observed by the head, and σ_c is the constant correction to the observed pitch angle.
The correction model is:
ω_corr = ω_true - ω_meas = -arctan[tan θ · tan σ · sin(β - ω)] + ω_c   (5-1)
σ_corr = σ_true - σ_meas = -σ_ax + σ_ay + σ_c   (5-2)
where:
α_x = arctan[tan θ · sin(β - ω)]   (5-3)
α_y = arctan[tan θ · cos(β - ω)]   (5-4)
tan(σ - σ_ax) = tan σ · cos α_x   (5-5)
σ_ay = α_y   (5-6)
Combining equation 5-2 with the observed values, the (θ, β, σ_c) values that minimize the root-mean-square model error are taken as the solution. The root-mean-square error is computed as RMS = sqrt((δ_1^2 + δ_2^2 + ... + δ_n^2) / n), where δ is the difference between the true pitch angle observation and the pitch angle computed by the model.
In the same way, combining equation 5-1 with the observed values, the ω_c value that minimizes the root-mean-square model error is taken as the solution; δ then refers to the difference between the true azimuth observation and the azimuth computed by the model.
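The minimization for the pitch branch can be sketched as a coarse grid search over (θ, β, σ_c). Sign conventions in the correction, and evaluating the model at the measured azimuth ω, are interpretive assumptions where the source is ambiguous; a real implementation would refine the grid or use a nonlinear least-squares solver.

```python
import math

def pitch_correction(sigma_meas, omega, theta, beta, sigma_c):
    """Pitch correction per (5-2)-(5-6); all angles in degrees.
    omega is taken as the measured azimuth (an assumption)."""
    s, w = math.radians(sigma_meas), math.radians(omega)
    th, b = math.radians(theta), math.radians(beta)
    a_x = math.atan(math.tan(th) * math.sin(b - w))     # (5-3)
    a_y = math.atan(math.tan(th) * math.cos(b - w))     # (5-4)
    s_ax = s - math.atan(math.tan(s) * math.cos(a_x))   # from (5-5)
    s_ay = a_y                                          # (5-6)
    return math.degrees(-s_ax + s_ay) + sigma_c         # (5-2)

def solve_pitch_params(obs):
    """Grid search for (theta, beta, sigma_c) minimizing the RMS of
    delta = sigma_true - sigma_meas - correction over observations
    given as [(sigma_true, sigma_meas, omega), ...]."""
    best = None
    for th in (0.0, 0.5, 1.0, 1.5, 2.0):                # deg
        for b in range(0, 360, 20):                     # deg
            for sc in (-0.2, -0.1, 0.0, 0.1, 0.2, 0.3): # deg
                rms = math.sqrt(sum(
                    (st - sm - pitch_correction(sm, w, th, b, sc)) ** 2
                    for st, sm, w in obs) / len(obs))
                if best is None or rms < best[0]:
                    best = (rms, th, float(b), sc)
    return best[1:]
```

With observations simulated from known parameters, the search recovers them exactly at the grid points; ω_c then follows the same pattern using equation 5-1 and azimuth residuals.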
Step four
The fire position (B, L, H) is computed in real time by the fire-point accurate positioning service.
The service is deployed on a server. It combines the DEM data, the head attitude calibration parameter values and the real-time PT value read from the head to compute the fire position, and offers users a real-time online interface for computing the fire position observed by each pan-tilt device.
When a fire occurs, it is identified automatically by the video monitoring terminal working with the pan-tilt equipment of the fire-prevention command center, or identified manually. Once the fire point is identified, the center of the video picture is aimed at it automatically or manually (see fig. 10), and the positioning service is notified that this head is now observing a fire and that its position must be computed. The service reads the head's PT value in real time and computes the true PT value using formulas 5-1 and 5-2. It then computes the fire position from the true PT value and the DEM data using line-of-sight and trigonometric relations.
Referring to fig. 11 and 12, the fire point accurate positioning service calculation logic includes:
(1) The step length by which the distance along the observation direction increases is set to 1/2 of the DEM data resolution, denoted dS.
(2) The elevation error threshold dH is set, with an initial value of 1 m.
(3) An observation ray is cast from the cradle head S, and the observation distance is increased step by step by dS.
(4) Judge: if the observation distance exceeds the observation radius of the cradle head, go to (5); if not, go to (6).
(5) The dH value is doubled, and the process returns to (2).
(6) The target point P(B, L, H) is calculated from the true PT value using trigonometric functions.
(7) The elevation h corresponding to (B, L) is obtained from the DEM data of the monitoring area.
(8) Judge: if the absolute value of the difference between H and h is smaller than dH, go to (9); if not, return to (3).
(9) (B, L, h) is the desired fire point position.
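The steps above can be sketched as a ray-marching loop. The sketch below is a simplified local planar version: it uses metric (x, y, z) coordinates instead of geodetic (B, L, H), and the `dem` elevation sampler, function name and parameters are assumptions for illustration; azimuth and pitch are the true PT values in radians.

```python
import math

def locate_fire(station, azimuth, pitch, dem, radius, dem_res):
    """March a ray from the cradle head along the true PT direction until its
    height matches the DEM surface within a growing tolerance dH."""
    x0, y0, z0 = station
    ds = dem_res / 2.0                      # step (1): half the DEM resolution
    dh = 1.0                                # step (2): elevation error threshold
    while dh <= 1024.0:                     # give up once the tolerance explodes
        d = ds                              # step (3): cast the observation ray
        while d <= radius:                  # step (4): stay inside the radius
            # step (6): target point from the true PT value, trigonometrically
            x = x0 + d * math.cos(pitch) * math.sin(azimuth)
            y = y0 + d * math.cos(pitch) * math.cos(azimuth)
            z = z0 + d * math.sin(pitch)
            h = dem(x, y)                   # step (7): DEM elevation at (x, y)
            if abs(z - h) < dh:             # step (8): ray meets the surface
                return x, y, h              # step (9): the desired position
            d += ds
        dh *= 2.0                           # step (5): relax the tolerance
    return None                             # no intersection found
```

Doubling dH whenever the ray exits the observation radius without a match guarantees termination on steep terrain that the fixed step dS would otherwise jump across; the geodetic variant would instead offset (B, L, H) along the azimuth and pitch on the ellipsoid.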
The fire prevention command center receives the fire point position calculated by the fire point accurate positioning service and notifies the relevant responsible departments, such as the fire department, to carry out subsequent fire control work.
With the method disclosed in this embodiment, the posture parameters of the cradle head equipment are calibrated based on an unmanned aerial vehicle equipped with RTK. Actual verification shows that after calibration the three-dimensional coordinate accuracy of the cradle head equipment reaches the centimeter level, the pitch angle error is within 0.2 degrees and the azimuth angle error is within 0.5 degrees, which ensures the accuracy of the final fire point positioning. Performing fire point positioning on real-time video also guarantees the timeliness of the positioning.
The foregoing is merely illustrative of the present invention and is not intended to limit it; any variations or substitutions readily conceivable by a person skilled in the art fall within the scope of the present invention.

Claims (10)

1. A ground accurate positioning method based on a fireproof monitoring video is characterized by comprising the following steps:
s1: arranging a cradle head for observing a monitoring area;
s2: obtaining DEM data of a monitoring area;
s3: determining a tripod head posture parameter calibration value through posture data generated when the tripod head is observed on the RTK unmanned aerial vehicle falling into the designated position;
s4: when a fire disaster is detected, the cradle head is used for calibrating the fire point, and the position of the fire point is determined based on DEM data, cradle head posture parameter calibration values and cradle head real-time PT values.
2. The ground accurate positioning method based on fireproof monitoring video according to claim 1, wherein the step S1 comprises:
the cradle head is arranged on the fireproof tower and is connected to the fireproof monitoring command center through the access private network or the wide area network.
3. The ground accurate positioning method based on fireproof monitoring video according to claim 1, wherein the step S2 comprises:
The DEM data describe the elevation information of the terrain surface, with a horizontal resolution within 12.5 meters and a vertical accuracy with an error within 5 meters.
4. The ground accurate positioning method based on fireproof monitoring video according to claim 1, wherein the step S3 comprises:
s301: moving the unmanned aerial vehicle with the RTK to a holder, and determining the coordinates of the holder in a three-dimensional geodetic coordinate system;
s302: determining a three-dimensional station center coordinate system taking a cradle head as a center and a coordinate system conversion relation between the cradle head three-dimensional coordinate system and a three-dimensional geodetic coordinate system;
s303: arranging space observation points so that the cradle head can observe all the space observation points;
s304: controlling the unmanned aerial vehicle to fall into a space observation point, and acquiring PT values when the unmanned aerial vehicle at each position is observed by the cradle head;
s305: and establishing a tripod head posture parameter correction model, and resolving a tripod head posture parameter calibration value.
5. The method for accurately positioning the ground based on the fireproof monitoring video according to claim 4, wherein the step S301 specifically comprises:
controlling the RTK unmanned aerial vehicle to fly to the position with the same height as the cradle head, and acquiring the altitude of the RTK unmanned aerial vehicle, namely the cradle head altitude H; and controlling the RTK unmanned aerial vehicle to fly to the vertical top position of the cradle head, and obtaining the longitude and latitude of the RTK unmanned aerial vehicle, namely the cradle head latitude B and the cradle head longitude L.
6. The method for accurately positioning the ground based on the fireproof monitoring video according to claim 5, wherein the step S303 specifically comprises:
in a three-dimensional station center coordinate system, arranging a space observation point at intervals of 30 degrees on a horizontal plane with the same height as a cradle head, arranging a space observation point at intervals of 60 degrees on horizontal planes with pitch angles of-3 degrees, -6 degrees and 3 degrees respectively, and defaulting to 0 degree at an initial horizontal angle; the range of the observation distance between the space observation point and the cradle head is 100-300 meters;
the specific observation distance and the initial horizontal angle are determined based on DEM data so as to ensure that the cradle head can observe all the space observation points, and the observation distance is selected to be as large as possible in the range so as to reduce the influence of the position shake of the unmanned aerial vehicle on the calculation of the pitch angle or the azimuth angle.
7. The method for accurately positioning the ground based on the fireproof monitoring video according to claim 6, wherein the step S304 specifically comprises:
converting coordinate information of a space observation point into a ground coordinate system supported by the RTK unmanned aerial vehicle from a three-dimensional station core coordinate system based on a coordinate system conversion relation, wherein the RTK unmanned aerial vehicle falls into the space observation point based on the converted coordinates;
and controlling the cradle head to observe the RTK unmanned aerial vehicle, so that the center of the cradle head picture is aimed at the RTK unmanned aerial vehicle, and the PT value of the cradle head during observation is obtained.
8. The method for accurately positioning the ground based on the fireproof monitoring video according to claim 7, wherein the step S305 specifically comprises:
establishing a cradle head posture parameter correction model, and resolving cradle head posture calibration parameters (theta, beta, omega) cc ) The method comprises the steps of carrying out a first treatment on the surface of the Wherein θ is the height angle of the rotation plane of the cradle head in the maximum inclination direction, β is the azimuth angle of the rotation plane of the cradle head in the maximum inclination direction, ω c For constant correction value, sigma of azimuth angle observed by cradle head c Observing a pitch angle constant correction value for the cradle head;
the correction model formula is:
ω_corrected = ω_true − ω_measured = −arctan[tanθ · tanσ · sin(β − ω)] + ω_c (5-1)
σ_corrected = σ_true − σ_measured = −σ_ax − σ_ay + σ_c (5-2)
a_x = arctan[tanθ · sin(β − ω)] (5-3)
a_y = arctan[tanθ · cos(β − ω)] (5-4)
tan(σ − σ_ax) = tanσ · cos(a_x) (5-5)
σ_ay = a_y (5-6)
Wherein ω_true and σ_true are respectively the true PT values of the cradle head observing a space observation point, calculated from the coordinates of the observation point and the coordinates of the cradle head, and ω_measured and σ_measured are the PT values read directly from the cradle head when it observes the RTK unmanned aerial vehicle located at the space observation point;
Combining formula 5-2 and substituting the observed values, the (σ_c, β, θ) values that minimize the root mean square of the model error are taken as the solution result; the error root mean square is calculated as RMS = sqrt((δ1² + δ2² + … + δn²)/n), where n is the number of observations;
wherein δ is the difference between the true pitch angle observation and the pitch angle value calculated by the model;
in the same way, combining formula 5-1 and substituting the observed values, the ω_c value that minimizes the root mean square of the model error is taken as the solution result, wherein δ is the difference between the true azimuth angle observation and the azimuth angle value calculated by the model.
9. The ground accurate positioning method based on fireproof monitoring video according to claim 8, wherein the step S4 comprises:
when a fire occurs, the cradle head equipment of the fireproof command center is matched with the video monitoring terminal to automatically identify the fire, or the fire is identified manually;
after the fire point is identified, the center of the video picture can be automatically or manually aimed at the fire point, and the fire point accurate positioning service is informed that the cradle head equipment observes the fire condition at the moment, and the position of the fire point needs to be calculated;
the fire point accurate positioning service acquires the PT value of the cradle head for observing the fire in real time, and calculates the real PT value based on formulas 5-1 and 5-2;
the fire point accurate positioning service calculates the fire point position by utilizing the real PT value and DEM data and combining the viewing principle and trigonometric function.
10. The fire protection surveillance video based ground precision positioning method of claim 1, further comprising:
s5: after the fire control command center obtains the fire position, the fire control related responsibility department is informed to carry out subsequent fire control work.
CN202310731932.2A 2023-06-20 2023-06-20 Ground accurate positioning method based on fireproof monitoring video Pending CN116665398A (en)


Publications (1)

Publication Number Publication Date
CN116665398A true CN116665398A (en) 2023-08-29

Family

ID=87720611



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination