CN116952081A - Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb


Info

Publication number
CN116952081A
Authority
CN (China)
Prior art keywords
image
fire extinguishing
module
extinguishing bomb
frame
Legal status
Granted
Application number
CN202310927653.3A
Other languages
Chinese (zh)
Other versions
CN116952081B (en)
Inventor
陈海峰 (Chen Haifeng)
朱学伟 (Zhu Xuewei)
李稀稀 (Li Xixi)
Current Assignee
Wuhan Joho Technology Co., Ltd.
Original Assignee
Wuhan Joho Technology Co., Ltd.
Application filed by Wuhan Joho Technology Co., Ltd.
Priority to CN202310927653.3A
Publication of CN116952081A
Application granted
Publication of CN116952081B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01D: MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00: Measuring or testing not otherwise provided for
    • G01D21/02: Measuring two or more variables by means not covered by a single other subclass
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F42: AMMUNITION; BLASTING
    • F42B: EXPLOSIVE CHARGES, e.g. FOR BLASTING, FIREWORKS, AMMUNITION
    • F42B35/00: Testing or checking of ammunition
    • F42B35/02: Gauging, sorting, trimming or shortening cartridges or missiles
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments

Abstract

The invention discloses an aerial monitoring system for parameter images of the drop points of fire extinguishing bombs. The system comprises a plurality of groups of aerial flight observation platforms; sensors mounted on rotary-wing unmanned aerial vehicles detect the coordinates of the explosion points of the fire extinguishing bombs, imaging the scene from above. Under the vertical-projection condition of the rotary-wing unmanned aerial vehicle, this improves observation visibility and field of view and effectively reduces the occlusion by explosion smoke and dust when the explosion points are densely distributed in space and time. Detecting the explosion-point coordinates requires conversion among the pixel coordinates, camera coordinates and world coordinates of the explosion points. By combining a machine vision algorithm with explosion-point coordinate detection, an ammunition drop point parameter analysis system based on an unmanned aerial vehicle monitoring platform is developed and built, which is of great significance for raising the intelligence level of shell explosion effect evaluation.

Description

Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb
Technical Field
The invention relates to the technical field of surveying and mapping, in particular to an aerial monitoring system and an aerial monitoring method for parameter images of falling points of fire extinguishing bombs.
Background
The fire extinguishing bomb relies on the explosion of its fire extinguishing medium to generate shock waves: high-pressure gas impacts the combustible material and destroys the combustion conditions, while the explosion scatters the fire extinguishing medium onto the combustible material, improving fire extinguishing efficiency. Fire extinguishing bombs are widely used in fire extinguishing operations for plants, buildings and forests. Evaluating the effect of a fire extinguishing bomb strike has long been a key difficulty in fire protection research at home and abroad. Detection of the drop point coordinates of the fire extinguishing bomb, as the core link in evaluating the extinguishing effect, is essential for correcting firing deviation and accurately evaluating firing precision.
Conventional optical measurement devices such as electro-optical theodolites, lidars and ballistic cameras are expensive to manufacture, so deployment sites are chosen in safe areas far from the measured target. For small targets such as ammunition (shell fragments) of 82 mm caliber and above, the long standoff distance means the target cannot even be identified for positioning, let alone measured. Videometrics is an emerging discipline formed by the cross-fusion of computer vision, photogrammetry, optical measurement and related fields. In particular, the computer vision topics of image processing, object recognition, feature extraction, description and matching constitute the basic means of image processing in videometrics, and the multi-view geometry theory of computer vision constitutes the basic theory of its three-dimensional reconstruction; many scholars therefore regard videometrics as vision measurement based on computer vision, that is, giving the computer the ability to understand three-dimensional scenes.
Most existing drop point coordinate testing equipment is based on ground observation platforms; it suffers from insufficient visibility, a small observation field of view and difficult observation-station siting, cannot acquire accurate drop point coordinates in time, and cannot distinguish the firing sequence in continuous intensive firing tests. Mounting the sensors on a rotary-wing unmanned aerial vehicle to detect the explosion-point coordinates of the fire extinguishing bomb, with imaging observed from above, improves observation visibility and field of view under the vertical-projection condition and effectively reduces the occlusion by explosion smoke and dust when the explosion points of massed fire extinguishing bombs are densely distributed in space and time. In explosion-point coordinate detection, computing the explosion-point coordinates requires conversion among the pixel coordinates, camera coordinates and world coordinates of the explosion point; applications combining machine vision algorithms with fire extinguishing bomb explosion-point detection have already achieved some results.
Therefore, developing and building an ammunition drop point parameter analysis system based on the monitoring platform of the aerial monitoring system for fire extinguishing bomb drop point parameter images is of great significance for raising the intelligence level of fire extinguishing bomb extinguishing effect evaluation.
Disclosure of Invention
The invention aims to provide an aerial monitoring system for fire extinguishing bomb drop point parameter images, to solve the problems of insufficient visibility, small observation field of view and difficult observation-station siting in the existing fire extinguishing bomb drop point monitoring technology described in the background. The system is mainly used for aerially measuring the position parameters of the drop points of ammunition fired by suppressive tube artillery; combining a machine vision algorithm with fire extinguishing bomb explosion-point coordinate detection raises the intelligence level of fire extinguishing bomb extinguishing effect evaluation.
In order to achieve the above purpose, the present invention provides the following technical solution: an aerial monitoring system for fire extinguishing bomb drop point parameter images, comprising:
an aerial flight observation platform, comprising an unmanned aerial vehicle body, an onboard processing module, an onboard B code timing module, a wireless image transmission module, a flight control module, an inertial navigation module, the sky end of a differential positioning device (RTK), and a power supply; the onboard processing module is communicatively connected to the onboard B code timing module, the wireless image transmission module, the flight control module, the inertial navigation module and the differential positioning device (RTK) sky end, and exchanges data with the comprehensive display control system through an onboard data link; the flight control module is preset with a flight control algorithm for acquiring the flight state information of the aerial flight observation platform and providing flight control and wireless communication; at least three groups of aerial flight observation platforms are provided;
a photoelectric pod, comprising at least one image acquisition device, at least one height acquisition device, and a storage device for storing the data they acquire; it provides manual background correction, manual shutter correction, black-hot/white-hot polarity setting, 2× electronic zoom, crosshair display, image gain adjustment, image brightness adjustment, crosshair position adjustment, image enhancement, motorized focus control, motorized zoom control, system parameter reset and other extended functions;
a comprehensive display control system, comprising a control terminal, a workstation, image resolving software, a calibration system, the ground end of the differential positioning device (RTK), and a power supply, used for data processing and for man-machine interaction between the fire extinguishing bomb drop point parameter image aerial monitoring system and the user.
As a preferred technical scheme, the photoelectric pod further comprises a self-stabilizing device with a stabilization accuracy of not more than 50 µrad, and carries three optical payloads: visible light, infrared thermal imaging and laser ranging; the visible-light camera resolution is not less than 1920 × 1080 at a frame rate of not less than 1000 fps, the thermal imager resolution is not less than 640 × 480 at a frame rate of not less than 100 fps, and the laser rangefinder has a ranging accuracy of 1 m or better.
As a preferred technical scheme, the inertial navigation module supports a plurality of navigation modes and can form an integrated navigation system with GPS/DVL.
As a preferred technical scheme, the RS232 serial communication interface of the onboard processing module is communicatively connected, through an RS422 interface, with the onboard B code timing module, the wireless image transmission module, the flight control module, the inertial navigation module and the differential positioning device (RTK) sky end.
As a preferred technical scheme, the flight control navigation system comprises an autopilot, a redundant power management module, a voltage monitoring module and an engine speed monitoring module, wherein the autopilot consists of a GPS navigation module, an inertial navigation module, an air data module, a magnetic compass and a flight control module.
As a preferred technical scheme, the photoelectric pod adopts a two-axis multi-frame architecture comprising, from top to bottom and from outside to inside, a pod base, an outer azimuth frame, an outer pitch frame, an inner azimuth frame and an inner pitch frame; the pod is connected to the carrier through the base; the outer frame is electrically connected to the base through a slip ring; the outer pitch frame, inner azimuth frame and inner pitch frame adopt compound limit protection combining mechanical limits, electrical limits and software protection, and an optical bench is mounted on the inner pitch frame for installing the photoelectric sensors.
As a preferred technical scheme, the ground comprehensive display control system receives the image transmission signals of the fire extinguishing bomb drop point parameter image aerial monitoring system; the onboard data link transmits data to the ground data link by wireless transmission, with onboard and ground data links connected one to one.
An aerial monitoring method for fire extinguishing bomb drop point parameter images, using the aerial monitoring system for fire extinguishing bomb drop point parameter images described above, comprises the following steps:
step one: initial calibration; the method specifically comprises the following steps:
1) First, randomly arrange three targets, each 2 m in diameter, within the range area;
2) The unmanned aerial vehicles take off to a flying height of about 200 m; the photoelectric payloads of the unmanned aerial vehicles simultaneously localize the same target, the target positions are solved separately, and the coordinate relationships are computed;
3) The unmanned aerial vehicles then fly at a height of 500 m, spaced 120° from one another, with a field-of-view-center slant range of 707 m and a 45° angle between the optical axis and the horizontal; calibration is completed according to this deployment requirement (a geometry check follows these steps);
Step two: fire fire extinguishing bombs toward the target area;
Step three: the unmanned aerial vehicle photoelectric payloads collect images of the fired targets;
Step four: after the unmanned aerial vehicles return, the data are transmitted to the comprehensive control platform; the longitude and latitude of the actual drop point of the fire extinguishing bomb are solved from the unmanned aerial vehicle coordinates and the photoelectric payload data and compared with the longitude and latitude of the target point.
As a preferred technical scheme, the target position solution uses Kalman-filter background modeling with a Gaussian mixture model for accurate detection of the explosion-point target, specifically comprising the following steps:

1) In the prediction stage, the filter predicts the pixel-level background estimate of the current frame from the background estimate of the previous frame of the transmitted image.

The predicted background estimate for frame k is:

\hat{f}_{k/k-1} = U_k \hat{f}_{k-1/k-1} + V_k P_k

The prediction-estimate covariance matrix is:

C_{k/k-1} = U_k C_{k-1/k-1} U_k^T + R_{1k}

In the above, \hat{f}_{k/k-1} is the prediction of the current frame k from the background value of the previous frame image, \hat{f}_{k-1/k-1} is the optimal result for the previous frame's background, P_k is the constraint parameter of frame k and is assigned 0 if absent, C_{k/k-1} is the covariance matrix corresponding to \hat{f}_{k/k-1}, and R_{1k} is the noise covariance matrix of the image transmission;

2) In the update stage, starting from the predicted value obtained in the prediction stage, the filter optimizes it with the observed value of the current pixel frame to obtain the next, more accurate estimate. Three quantities must be computed:

Residual: \tilde{d}_k = d_k - H_k \hat{f}_{k/k-1};  residual covariance: C_{1k} = H_k C_{k/k-1} H_k^T + R_{2k};  optimal Kalman gain: K_{gk} = C_{k/k-1} H_k^T C_{1k}^{-1}

The updated background pixel estimate is expressed as:

\hat{f}_{k/k} = \hat{f}_{k/k-1} + K_{gk} (d_k - H_k \hat{f}_{k/k-1})

The updated covariance estimate is then:

C_{k/k} = (I - K_{gk} H_k) C_{k/k-1}

where I is the identity matrix (I = 1 for the single-model, single-input case); when the algorithm advances to frame k+1, C_{k/k} serves as the previous frame's optimized covariance;

3) The above steps are cycled in turn to perform the update operation.
As a preferred technical scheme, when adhering (overlapping) explosion points occur, a viewing-angle judgment algorithm is used in step four to distinguish the firing order, specifically as follows: from the image data acquired at the three cameras' different angles, the data least affected by adhesion are selected for the solution, yielding relatively accurate explosion-point positioning coordinates. Meanwhile, during explosion-point positioning, on the one hand, the accuracy weight of the binocular positioning information of each two-camera combination for a given explosion point is computed from the three cameras' shooting angles and recognition confidences, enabling weight-based cross-checking and optimization of the explosion-point position information; on the other hand, the explosion-point count captured by the infrared camera is used to cross-check the positioning accuracy of the different binocular combinations.
Compared with the prior art, the invention has the following beneficial effects:
The system comprises a plurality of groups of aerial flight observation platforms; the fire extinguishing bomb explosion-point coordinates are detected by sensors mounted on rotary-wing unmanned aerial vehicles, with imaging observed from above, so that under the vertical-projection condition of the rotary-wing unmanned aerial vehicle the observation visibility and field of view are improved, and the occlusion by explosion smoke and dust when the explosion points of massed fire extinguishing bombs are densely distributed in space and time is effectively reduced. In explosion-point coordinate detection, computing the coordinates requires conversion among the explosion point's pixel coordinates, camera coordinates and world coordinates. By combining a machine vision algorithm with explosion-point coordinate detection, an ammunition drop point parameter analysis system based on the unmanned aerial vehicle monitoring platform is developed and built, which is of great significance for raising the intelligence level of fire extinguishing bomb extinguishing effect evaluation.
Drawings
FIG. 1 is an overall architecture diagram of the aerial monitoring system for fire extinguishing bomb drop point parameter images of the present invention;
fig. 2 is a schematic diagram of a measurement scenario of the unmanned aerial vehicle of the present invention;
FIG. 3 is a schematic diagram of a three-dimensional visual fixed-point model;
FIG. 4 shows the segmentation and tracking detection effect of the image-processing tracking model;
FIG. 5 is a diagram of the composition of the aerial flight observation platform of the present invention;
FIG. 6 is a general block diagram of a flight control system design;
FIG. 7 is a schematic diagram of a two-axis four-frame image stabilization platform;
FIG. 8 is a schematic diagram of the pod system electrical interface;
FIG. 9 is a pod power block diagram;
FIG. 10 is a diagram of the overall display control system;
FIG. 11 is a system overall interface schematic;
FIG. 12 is a data monitoring main interface diagram of the unmanned control software;
FIG. 13 is a task planning main interface diagram of unmanned control software;
FIG. 14 is a drawing task master interface diagram of unmanned control software;
FIG. 15 is the main interface of the photoelectric pod control software;
FIG. 16 is the main interface of the image resolving software;
fig. 17 is a schematic diagram of the coordinate conversion relationship.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, the present invention provides a technical solution: an aerial monitoring system for fire extinguishing bomb drop point parameter images, comprising: an aerial flight observation platform, which comprises an unmanned aerial vehicle body, an onboard processing module, an onboard B code timing module, a wireless image transmission module, a flight control module, an inertial navigation module, the sky end of a differential positioning device (RTK), and a power supply; the onboard processing module is communicatively connected to the onboard B code timing module, the wireless image transmission module, the flight control module, the inertial navigation module and the differential positioning device (RTK) sky end, and exchanges data with the comprehensive display control system through an onboard data link; the flight control module is preset with a flight control algorithm for acquiring the flight state information of the aerial flight observation platform and providing flight control and wireless communication; at least three groups of aerial flight observation platforms are provided.
The photoelectric pod comprises at least one image acquisition device, at least one height acquisition device, and a storage device for storing the data they acquire; it provides manual background correction, manual shutter correction, black-hot/white-hot polarity setting, 2× electronic zoom, crosshair display, image gain adjustment, image brightness adjustment, crosshair position adjustment, image enhancement, motorized focus control, motorized zoom control, system parameter reset and other extended functions.
The pod's core self-developed part is its structural design: a two-axis four-frame stabilized platform. The platform consists of four frames, with inner and outer shaft systems in both pitch and azimuth. The gyroscope and detection equipment are mounted on the inner frame platform; when the gyroscope measures rotation of the inner ring frame, the controller drives the motors to counteract the motion. An angular deviation arises between the inner and outer frames, and this deviation is used to make the outer frame follow the inner frame, keeping the inner pitch frame and azimuth frame perpendicular at all times. Compared with a two-axis two-frame structure, the two-axis four-frame stabilized tracking platform keeps the two inner frames always perpendicular, which reduces geometric coupling and overcomes the influence of wind-resistance torque on the line of sight of the detection equipment during high-speed carrier motion; that is, the disturbance torque actually applied to the inner frames is reduced, giving better isolation and higher stabilization accuracy. Second, because the two inner frames remain mutually perpendicular, the influence on stability of the line of sight becoming parallel to the azimuth axis during large-angle platform motion, and the frame self-locking phenomenon during vertical ground tracking, are eliminated.
The comprehensive display control system comprises a control terminal, a workstation, image resolving software, a calibration system, a differential positioning device (RTK) ground end and a power supply, and is used for carrying out data processing and man-machine interaction between the fire extinguishing bomb landing point parameter image aerial monitoring system and a user.
The whole system architecture is deployed as follows: the aerial monitoring system for fire extinguishing bomb drop point parameter images consists of an aerial flight observation platform equipped with a differential positioning device (RTK) (hereinafter, the aerial flight observation platform), an onboard lightweight gyro-stabilized photoelectric pod (hereinafter, the photoelectric pod), ground comprehensive display and control equipment, and so on. The aerial flight observation platform consists of a multi-rotor unmanned aerial vehicle, an onboard processing module, an onboard B code timing module, a wireless image transmission module, a battery, ground flight control, and so on. The photoelectric pod consists of a visible-light high-speed camera, an infrared thermal imaging camera, a high-precision laser rangefinder, a self-stabilizing gyroscope, a storage device, and so on. The comprehensive display control system mainly comprises a ground mobile workstation, a fuel-generator power supply, an unmanned aerial vehicle control system, a photoelectric pod control system, image resolving software, a calibration system, and so on.
A typical measurement scenario is shown in fig. 2: the system is deployed over the ammunition impact area of suppressive tube artillery and can monitor and measure ammunition drop point positions under multi-gun firing conditions.
Before firing, the whole system enters a ready state, with all unmanned aerial vehicles awaiting the take-off instruction. On receiving it, the three unmanned aerial vehicles take off simultaneously and proceed directly to the designated positions and altitude to stand by. The ground comprehensive display control system receives, in real time, the state parameters of every unmanned aerial vehicle system, the state parameters of the photoelectric pods, the positioning parameters of the GNSS/IMU inertial navigation systems, and part of the returned image data, with the real-time preview video coming mainly from the visible-light high-speed camera. These data are displayed on the ground comprehensive display control system in real time, so the user can monitor the operating states of the unmanned aerial vehicles and photoelectric pods.
During firing, the photoelectric pod performs tri-mode measurement of the high-speed target drop point position with visible light, infrared and laser according to a preset triggering mode, storing the results locally; attitude self-stabilization is achieved through the gyroscope.
After firing, the downloaded visible-light video data are read and a computer-vision image change detection technique automatically detects the ammunition drop point target, obtaining its positions on the three visible-light videos and the infrared image, which facilitates the subsequent triangulation of the explosion-point target. The drop point positions detected on the images are then read and, combined with the position and attitude data provided by the high-precision differential GNSS inertial navigation system carried on the unmanned aerial vehicle, the three-dimensional coordinates of the drop point target are precisely solved from the three different-direction observations using a spatial triangulation algorithm.
The specific solution algorithm is as follows. First, an outdoor calibration field is established and targets for the visible-light high-speed camera are arranged; the accurate coordinates of the targets are surveyed with RTK, the photoelectric pod is controlled to image the calibration field, and calibration-field visible-light video is obtained. The targets are automatically identified with a target recognition technique and the exterior parameters of the camera system are calibrated, providing high-precision camera parameters for explosion-point position solution by the principle of resection. Then, from the received video data, the video sequences, mainly from the visible-light high-speed camera and the infrared camera, are comprehensively processed; ammunition explosion points are captured by an automatic explosion-point target detection and recognition algorithm, and the three-dimensional coordinates of the ammunition explosion point in the world coordinate system are solved from the explosion-point positions captured by the three unmanned aerial vehicles in different directions, realizing precise solution of the spatial three-dimensional coordinates of the explosion-point target. The use of three groups of airborne observation platforms is tied to detection coverage: the three unmanned aerial vehicles are evenly distributed along a circle, the system altitude is 500 m, and the angle between the optical axis and the ground is 45°. The method can be extended to 4 … devices or 5 … devices; the solution algorithm is unchanged, only the aerial deployment differs, and with it the detection coverage area.
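The calibration-field procedure above amounts to a space resection (exterior orientation) problem. The sketch below shows one conventional way to implement it with OpenCV's solvePnP; the surveyed target coordinates, detected pixel positions and intrinsic parameters are all hypothetical stand-ins, not values from the patent:

```python
import numpy as np
import cv2

# Hypothetical inputs: RTK-surveyed 3-D target coordinates (world frame, metres)
# and the corresponding pixel positions detected in the calibration-field image.
object_points = np.array([[0.0, 0.0, 0.0],
                          [25.0, 0.0, 0.0],
                          [0.0, 30.0, 0.0],
                          [25.0, 30.0, 0.0]], dtype=np.float64)
image_points = np.array([[612.0, 388.0],
                         [1410.0, 402.0],
                         [598.0, 840.0],
                         [1395.0, 851.0]], dtype=np.float64)

# Assumed intrinsics from a prior laboratory calibration (fx, fy, u0, v0).
K = np.array([[2400.0, 0.0, 960.0],
              [0.0, 2400.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for this sketch

# Space resection: solve for the camera pose (exterior parameters R, T).
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
print("R =\n", R, "\nT =", tvec.ravel())
```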
The accompanying aerial monitoring method for fire extinguishing bomb drop point parameter images is as follows:
1. First, randomly arrange three targets, each 2 m in diameter, within the range area;
2. The three aircraft take off to a flying height of about 200 m; the three aircraft photoelectric payloads localize the same target simultaneously, the target positions are solved separately, and the coordinate relationships are computed;
3. The three aircraft fly at a height of 500 m, at mutual angles of 120°, with a field-of-view-center slant range of 707 m and a 45° angle between the optical axis and the horizontal; calibration is completed according to the three-aircraft deployment requirement;
4. Fire at the target area;
5. The three aircraft photoelectric payloads collect images of the fired targets;
6. After the aircraft return, the data are transmitted to the comprehensive control platform, and the required data are solved from the coordinates of the three aircraft and the photoelectric payload data.
The solving process must first address coordinate conversion. Camera imaging follows the lens imaging principle; to obtain the three-dimensional position of the measured target point, the mathematical and geometric transformations among the pixel coordinate system, image (physical) coordinate system, camera coordinate system and world coordinate system must be determined. The pixel and image coordinate systems lie in the same plane and differ only in the position of the origin, while the transformation from the camera coordinate system to the image coordinate system is a perspective projection.
Referring to fig. 17, the relationship between the pixel coordinates and the world coordinates of the measured object can be constructed as:

s [u, v, 1]^T = A [R_{3×3} T_{3×1}] [X_w, Y_w, Z_w, 1]^T,  with  A = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}

where A is the camera intrinsic matrix, a 3 × 3 matrix composed of the five internal parameters f_x, f_y, u_0, v_0 and \gamma; (u_0, v_0) are the coordinates of the principal point, taken as one half of the image pixel resolution; \gamma describes the skew angle between the u and v axes (the non-perpendicularity factor) and is usually zero; f_x = f/d_x and f_y = f/d_y are the scale factors along the u and v axes of the pixel coordinate system, also called the effective focal lengths; [R_{3×3} T_{3×1}] is the 3 × 4 extrinsic matrix composed of the rotation matrix R and the translation vector T. These two parameters relate the camera to the measured object, and the camera's position in the world coordinate system can be obtained from the parameter values in this matrix.
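A minimal numeric sketch of this projection model follows; all parameter values are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Assumed intrinsics: fx, fy (effective focal lengths in pixels),
# principal point (u0, v0) = half the 1920x1080 image resolution, skew = 0.
fx, fy, u0, v0, gamma = 2400.0, 2400.0, 960.0, 540.0, 0.0
A = np.array([[fx, gamma, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])

# Assumed extrinsics: camera looking straight down from 500 m (world Z up).
R = np.array([[1.0, 0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0]])
T = np.array([[0.0], [0.0], [500.0]])
RT = np.hstack([R, T])               # 3x4 extrinsic matrix [R | T]

def world_to_pixel(Xw):
    """Project a world point (Xw, Yw, Zw) to pixel coordinates (u, v)."""
    Xh = np.append(Xw, 1.0)          # homogeneous world coordinates
    uvw = A @ RT @ Xh                # s * [u, v, 1]^T
    return uvw[:2] / uvw[2]          # divide out the scale factor s

print(world_to_pixel(np.array([10.0, -5.0, 0.0])))  # a ground point
```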
In the firing-range environment, aerial staring target detection from the unmanned aerial vehicle suffers complex noise from many factors: vehicle shake, background object motion, sudden lighting changes, airflow disturbance, and so on. If the background is not updated in time, false detections accumulate and recirculate over time, invalidating the monitoring. Therefore, on the basis of Kalman-filter background modeling, a Gaussian mixture model is adopted to revise the continuously changing background information, ensuring accurate capture of the fast-moving, very small explosion-point features.
Accordingly, the target position solution uses Kalman-filter background modeling with a Gaussian mixture model for accurate explosion-point detection. The Kalman filter is a recursively estimating time-domain filter: knowing the background value of the previous frame image and the measured value of the current frame, the background estimate of the current frame can be computed, without needing image information prior to the measurement and prediction.
f_k = U f_{k-1} + V P_k + N_{1k}

d_k = H f_k + N_{2k}

In the above, f_k is the background value of the k-th frame image, P_k is the constraint parameter of frame k, U and V are image-system parameter matrices, d_k is the reference (measured) value of frame k, H is the parameter matrix of the background modeling system, and N_{1k}, N_{2k} are Gaussian noises representing the various interference signals encountered during image transmission, with corresponding covariance matrices R_1 and R_2.
In the prediction stage, the filter predicts the pixel-level background estimate of the current frame from the background estimate of the previous transmitted frame.

The predicted background estimate for frame k is:

\hat{f}_{k/k-1} = U_k \hat{f}_{k-1/k-1} + V_k P_k

The prediction-estimate covariance matrix is:

C_{k/k-1} = U_k C_{k-1/k-1} U_k^T + R_{1k}

In the above, \hat{f}_{k/k-1} is the prediction of the current frame k from the background value of the previous frame image, \hat{f}_{k-1/k-1} is the optimal result for the previous frame's background, and P_k is assigned 0 if absent. C_{k/k-1} is the covariance matrix corresponding to \hat{f}_{k/k-1}, and R_{1k} is the noise covariance matrix of the image transmission.

In the update stage, the filter refines the predicted value obtained in the prediction stage with the observed value of the current pixel frame to obtain the next, more accurate estimate. Three quantities must be computed:

Residual: \tilde{d}_k = d_k - H_k \hat{f}_{k/k-1};  residual covariance: C_{1k} = H_k C_{k/k-1} H_k^T + R_{2k};  optimal Kalman gain: K_{gk} = C_{k/k-1} H_k^T C_{1k}^{-1}

The updated background pixel estimate is expressed as:

\hat{f}_{k/k} = \hat{f}_{k/k-1} + K_{gk} (d_k - H_k \hat{f}_{k/k-1})

The updated covariance estimate is then:

C_{k/k} = (I - K_{gk} H_k) C_{k/k-1}

where I is the identity matrix (I = 1 for the single-model, single-input case). When the algorithm advances to frame k+1, C_{k/k} becomes the previous frame's optimized covariance. The algorithm cycles through these update operations in turn.
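The recursion above can be illustrated per pixel. A minimal scalar sketch, assuming U = H = 1, the constraint term V·P_k dropped (P_k = 0), and illustrative noise covariances; this is an assumption-laden illustration, not the patent's implementation:

```python
import numpy as np

def kalman_background_update(f_prev, C_prev, d_k, R1=4.0, R2=25.0):
    """One predict/update cycle of scalar Kalman background modelling.

    f_prev : previous optimal background estimate (per-pixel array)
    C_prev : previous estimate covariance (per-pixel array)
    d_k    : observed pixel values of the current frame k
    R1, R2 : assumed process / measurement noise covariances
    """
    # Prediction stage (U_k = 1, constraint term V_k * P_k omitted).
    f_pred = f_prev
    C_pred = C_prev + R1                  # C_{k/k-1} = C_{k-1/k-1} + R_{1k}

    # Update stage (H_k = 1).
    residual = d_k - f_pred               # innovation d_k - H f_{k/k-1}
    C1 = C_pred + R2                      # residual covariance
    Kg = C_pred / C1                      # optimal Kalman gain
    f_new = f_pred + Kg * residual        # updated background estimate
    C_new = (1.0 - Kg) * C_pred           # updated covariance
    return f_new, C_new

# Usage: run over a grayscale frame sequence, flag foreground by residual size.
h, w = 480, 640
background = np.zeros((h, w)); cov = np.full((h, w), 100.0)
frame = np.random.randint(0, 256, (h, w)).astype(float)  # stand-in frame
background, cov = kalman_background_update(background, cov, frame)
```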
On the basis of the Kalman-filter background modeling, a Gaussian mixture model is adopted for accurate explosion-point detection. The characteristics of each pixel in the image are represented by a specified number of Gaussian models; after the next frame is input, the Gaussian mixture model is recomputed and each current pixel is matched against it: if the match succeeds, the pixel is a background point, otherwise a foreground point. Background modeling is determined by the parameters represented by the variances and means, and different update mechanisms affect the stability, accuracy and convergence of the Gaussian model. Since the explosion point is a moving target, the variance and mean in the model must be updated in real time during background-extraction modeling. Meanwhile, to improve the model's correction capability, the concept of weight is introduced: background images are established and updated in real time, and pixels are classified into foreground and background by combining weights, means and variances.
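For the mixture-of-Gaussians stage, OpenCV's MOG2 background subtractor implements exactly this kind of per-pixel weighted mean/variance update; a sketch with illustrative parameter values and a hypothetical video file (assuming OpenCV 4):

```python
import cv2

# Mixture-of-Gaussians background subtraction: each pixel is modelled by
# several weighted Gaussians whose mean, variance and weight are updated
# per frame; unmatched pixels are reported as foreground.
mog = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                         detectShadows=False)

cap = cv2.VideoCapture("uav_visible.mp4")  # hypothetical downloaded video
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Pixels matching a background Gaussian -> 0; unmatched -> 255.
    fg_mask = mog.apply(frame, learningRate=0.01)
    # Candidate explosion points: small fast-moving foreground blobs.
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
cap.release()
```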
If explosion points from a densely fired test overlap, they are treated as adhering explosion points. When adhering explosion points occur, a viewing-angle judgment algorithm is used: among the image data acquired at the three cameras' different angles, the data least affected by adhesion are selected for the solution, yielding relatively accurate explosion-point positioning coordinates. Meanwhile, during explosion-point positioning, on the one hand, the accuracy weight of the binocular positioning information of each two-camera combination for a given explosion point is computed from the three cameras' shooting angles and recognition confidences, enabling weight-based cross-checking and optimization of the explosion-point position information (a sketch follows below); on the other hand, the explosion-point count captured by the infrared camera is used to cross-check the positioning accuracy of the different binocular combinations.
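One plausible reading of the weight-based cross-check, as a sketch only; the weighting formula, names and all numbers below are assumptions, since the patent does not spell them out:

```python
import numpy as np

def fuse_pairwise_estimates(points, weights):
    """Fuse the three pairwise binocular estimates Q1, Q2, Q3 of one
    explosion point into a single position, weighting each camera pair
    by an accuracy weight derived from shooting angle and confidence.

    points  : (3, 3) array, one XYZ estimate per camera pair (ab, bc, ac)
    weights : (3,) array, e.g. confidence_i * confidence_j * angle_factor
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize accuracy weights
    fused = (w[:, None] * points).sum(axis=0)
    # Cross-check: flag pairs whose estimate strays far from the fused point.
    spread = np.linalg.norm(points - fused, axis=1)
    return fused, spread

Q = np.array([[101.2, 55.0, 0.3],         # camera a + b estimate
              [100.8, 54.6, 0.1],         # camera b + c estimate
              [103.9, 57.2, 0.9]])        # camera a + c (most adhesion-affected)
fused, spread = fuse_pairwise_estimates(Q, weights=[0.45, 0.45, 0.10])
print(fused, spread)
```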
As shown in fig. 3, the system proposes a trinocular combined ranging algorithm to optimize the measured values of the three points Q_1, Q_2 and Q_3. The model is built on the binocular convergent ranging model by suitably placing three cameras and adjusting the inter-camera relationships so that the optical axes form set angles with one another; that is, the model consists of three binocular ranging models. Suppose the projections of an arbitrary spatial point Q on the imaging planes of the three cameras a, b and c are q_1, q_2 and q_3; the trinocular ranging model is sketched in fig. 3. Ideally, the measurement Q_1 of the binocular model formed by cameras a and b, the measurement Q_2 of the binocular model formed by cameras b and c, and the measurement Q_3 of the binocular model formed by cameras a and c should all coincide with the actual measured point Q, i.e. the projection rays O_{ca}q_1, O_{cb}q_2 and O_{cc}q_3 (from each camera's optical center through its image point) intersect at the same point Q. In a real ranging environment, however, the errors of the binocular systems mean that Q_1, Q_2 and Q_3 do not coincide with Q: the three rays intersect pairwise at three distinct spatial points Q_1, Q_2 and Q_3, whose three-dimensional coordinates can be obtained by the binocular ranging algorithm.
The spatial coordinates (X_m, Y_m, Z_m) of the measured point Q are then obtained as the optimal estimate of the actual coordinates of Q through the objective function, derived as follows:

F = \min ( \|Q - Q_1\|^2 + \|Q - Q_2\|^2 + \|Q - Q_3\|^2 )

Suppose the point Q projects to q_1, q_2 and q_3 on the imaging planes of cameras a, b and c, and the pixel coordinates of the three points have been detected as (u_a, v_a), (u_b, v_b), (u_c, v_c); suppose further that all three cameras in the model have been calibrated, with projection matrices M_a, M_b, M_c:

s_i [u_i, v_i, 1]^T = M_i [X_m, Y_m, Z_m, 1]^T,  i = a, b, c

where M_i = A_i [R_i T_i] is the projection matrix of each camera, (u_i, v_i, 1) are the homogeneous pixel coordinates of the three projections on their respective image planes, and (X_m, Y_m, Z_m, 1) are the homogeneous coordinates of the measured point Q in the world coordinate system. Using the least squares method on these equations, the three-dimensional coordinates Q_i(X_{mi}, Y_{mi}, Z_{mi}) (i = 1, 2, 3) of the three points Q_1, Q_2, Q_3 are obtained; moreover, compared with conventional binocular ranging, projection-line equations numbering twice the camera count become available, further enhancing noise immunity, as shown by the expansion below.

The objective function can be expanded as:

F = \min ( \|Q - Q_1\|^2 + \|Q - Q_2\|^2 + \|Q - Q_3\|^2 )
  = (X_m - X_{m1})^2 + (Y_m - Y_{m1})^2 + (Z_m - Z_{m1})^2
  + (X_m - X_{m2})^2 + (Y_m - Y_{m2})^2 + (Z_m - Z_{m2})^2
  + (X_m - X_{m3})^2 + (Y_m - Y_{m3})^2 + (Z_m - Z_{m3})^2

The sum of squared differences between the variables and their arithmetic mean is minimized, so the optimal estimate of the measured point Q is obtained (the minimizer of F is the arithmetic mean of Q_1, Q_2 and Q_3).
After analyzing the target and explosion-point detection parts, the algorithm must simultaneously recognize more than two classes of targets and must support tracking of the targets' motion trajectories. A deep-learning tracking algorithm is therefore chosen for explosion-point and target-point detection, to recognize explosion-point and target-point targets and to track and identify the explosion-point trajectory, i.e. from fireball generation when the fuze functions to the final smoke diffusion. The tracking detection effect is shown in fig. 4. The algorithm is expected to track and recognize the target object's behavior trajectory and to recover, as far as possible, the pixel position of the target in the frame; the pixel coordinate points can then be mapped to real-world coordinate positions through the conversion matrix.
As shown in fig. 5, the aerial flight observation platform consists of a multi-rotor unmanned aerial vehicle, an onboard processing module, an onboard B code timing module, a wireless image transmission module, a battery, ground flight control, and so on. The multi-rotor airframe is made of a lightweight-process composite material, giving high strength and low weight; its detachable quick-assembly design reaches ready-to-fly state within ten minutes, with a payload of not less than 10 kg.
The inertial navigation module should support multiple navigation modes and can form an integrated navigation system with GPS/DVL; the onboard B code timing module can automatically select one of the GPS signal, IRIG-B (DC) code and IRIG-B (AC) code inputs, receiving GPS preferentially.
The flight control navigation system comprises an autopilot, a redundant power management module, a voltage monitoring module and an engine speed monitoring module, the autopilot consisting of a GPS navigation module, an inertial navigation module, an air data module, a magnetic compass and a flight control module. The software design of the whole flight control system is shown in fig. 6: the main program initializes the whole system and coordinates calls among the subtasks; flight information acquisition mainly completes the collection and processing of sensor information and provides the system with navigation information such as attitude and position;
The flight control module computes the output adjustment from the target point's position information combined with the current flight attitude, and drives the electronic speed controller to adjust the brushless motor speed, changing the unmanned aerial vehicle's flight state (a control-loop sketch follows below). The remote controller module collects and decodes remote control signals, providing flight-mode switching and manual control; the ground communication module receives ground instructions and returns flight data.
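As a purely illustrative sketch of such a control step (the patent does not specify the control law; the cascaded-PID structure, gains and rates below are assumptions):

```python
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Outer loop: position error -> desired pitch angle (illustrative gains).
pos_pid = PID(kp=0.8, ki=0.02, kd=0.3)
# Inner loop: attitude error -> motor speed adjustment via the ESC.
att_pid = PID(kp=4.0, ki=0.1, kd=0.6)

dt = 0.005                                # assumed 200 Hz control cycle
pos_err = 12.0                            # metres to the target waypoint
pitch_cmd = pos_pid.step(pos_err, dt)     # desired pitch, clamped below
pitch_cmd = max(min(pitch_cmd, 20.0), -20.0)
pitch_err = pitch_cmd - 1.5               # current pitch from the IMU
motor_adjust = att_pid.step(pitch_err, dt)  # sent to the electronic speed controller
```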
As shown in fig. 7, the stabilization accuracy of the photoelectric pod must be no more than 50 µrad, which a conventional two-axis two-frame structure cannot achieve; the structural design therefore uses two axes and four frames. Compared with the two-axis two-frame structure, the two-axis four-frame design eliminates geometric constraint coupling of the frames and keeps the inner and outer frames aligned, avoiding the sharp loss of pitch-stabilization-loop control accuracy and the loss of control over the pitch degree of freedom during large-angle motion; the inner and outer frame motor speeds can counteract the carrier's disturbance angular velocity, giving better control performance and isolation.
The two-axis four-frame structure has better carrier-disturbance isolation, avoids the frame self-locking problem in the vertical tracking state, controls the inner and outer frames independently of each other, reduces control complexity, and also reduces motor size and motor power consumption requirements. The structure is shown in fig. 7: from top to bottom and from outside to inside are the pod base, outer azimuth frame, outer pitch frame, inner azimuth frame and inner pitch frame. The base connects the pod to the carrier. To allow continuous rotation over the azimuth tracking range, the outer frame is electrically connected to the base through a slip ring. Additionally, for reliable and stable system operation, compound limit protection combining mechanical limits, electrical limits and software protection is applied to the outer pitch frame, inner azimuth frame and inner pitch frame. An optical bench is mounted on the inner pitch frame for installing the photoelectric sensors (thermal infrared imager, visible-light high-speed camera and laser rangefinder).
The electrical design of the system is shown in fig. 8. The product circuit hardware mainly comprises the inertial measurement unit (IMU), encoders, slip ring, motors and control circuits. The pod system mainly comprises the azimuth assembly, pitch assemblies and optical cabin: the azimuth assembly includes the external interface and the outer azimuth encoder; the outer pitch assembly includes the outer pitch motor, outer pitch encoder and tracking board; the inner pitch assembly includes the inner azimuth motor, inner pitch motor, inner azimuth encoder and inner pitch encoder; the optical cabin includes the thermal infrared imager, the visible-light and visible-light high-speed cameras, the laser rangefinder, the inner azimuth gyro, the inner pitch gyro, the main control board and the secondary power board. All component wiring is secured close to its connectors, all power interfaces reserve sufficient pins for input and output, and each power branch is wired separately so the supplies do not affect one another. Power is isolated from the video signals in the overall cable layout to avoid degrading image quality.
As shown in fig. 9, the power supply unit of the system uses a regulated power supply module to ensure the stability of the whole circuit when the system works. The direct current power supply inputs +24V voltage, outputs +24V and +12V voltage through the voltage stabilizing module, and supplies the voltage to the optical assembly, the driving motor and all sensors.
The comprehensive display control system architecture is shown in fig. 10; it comprises a control terminal, a workstation, image resolving software, a calibration system, the ground end of the differential positioning device (RTK) and a power supply, and performs data processing and man-machine interaction between the fire extinguishing bomb drop point parameter image aerial monitoring system and the user.
The connections among the unmanned aerial vehicle, the photoelectric pod and the comprehensive display control system are shown in fig. 11: the photoelectric pod and unmanned aerial vehicle data are transmitted to the ground workstation through the data/image wireless transmission link equipment. The image resolving software is the core function of the comprehensive display control system: after an ammunition drop point target is detected, the true three-dimensional coordinates of the drop point are measured and displayed on a three-dimensional visual interface.
Fig. 12, 13, 14 are data monitoring, mission planning, and mapping mission master interface diagrams, respectively, of the drone control software.
As shown in fig. 15, the main interface of the photoelectric pod control software provides remote control of pod attitude and payload mode switching, and supports monitoring each module's state parameters, controlling the photoelectric pod state, and monitoring the collected data in real time.
The image resolving software interface is shown in fig. 16. It comprehensively processes the video data acquired by the visible-light high-speed camera and the thermal infrared imager, together with the position and attitude data from the GNSS/IMU inertial navigation system; it captures ammunition drop points with the automatic drop-point target detection and recognition algorithm and solves the three-dimensional object-space coordinates of the drop points from the captured positions, realizing three-dimensional measurement of the drop point targets. The image resolving software has two processing modes, real-time and post-processing: in real-time solution, the visible-light high-speed camera's low-rate preview data (about 25 fps) are processed to roughly estimate the ammunition drop point's spatial position; in post-processing, the high-speed video data (1000 fps) are processed to obtain more accurate drop point coordinates.
It should be noted that the overall system also involves complex environmental adaptability, reliability, maintainability, testability, security, safety and electromagnetic compatibility designs, all of which must be combined with development risk analysis. This is a complex systems-engineering project, but realizing the ammunition drop point parameter analysis system based on the monitoring platform of the fire extinguishing bomb drop point parameter image aerial monitoring system is of great significance for raising the intelligence level of fire extinguishing bomb extinguishing effect evaluation.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (10)

1. An aerial monitoring system for fire extinguishing bomb drop point parameter images, characterized by comprising:
an aerial flight observation platform, comprising an unmanned aerial vehicle body, an onboard processing module, an onboard B code timing module, a wireless image transmission module, a flight control module, an inertial navigation module, the sky end of a differential positioning device (RTK), and a power supply; the onboard processing module is communicatively connected to the onboard B code timing module, the wireless image transmission module, the flight control module, the inertial navigation module and the differential positioning device (RTK) sky end, and exchanges data with the comprehensive display control system through an onboard data link; the flight control module is preset with a flight control algorithm for acquiring the flight state information of the aerial flight observation platform and providing flight control and wireless communication; at least three groups of aerial flight observation platforms are provided;
a photoelectric pod, comprising at least one image acquisition device, at least one height acquisition device, and a storage device for storing the data they acquire; it provides manual background correction, manual shutter correction, black-hot/white-hot polarity setting, 2× electronic zoom, crosshair display, image gain adjustment, image brightness adjustment, crosshair position adjustment, image enhancement, motorized focus control, motorized zoom control, system parameter reset and other extended functions;
a comprehensive display control system, comprising a control terminal, a workstation, image resolving software, a calibration system, the ground end of the differential positioning device (RTK), and a power supply, used for data processing and for man-machine interaction between the fire extinguishing bomb drop point parameter image aerial monitoring system and the user.
2. The fire extinguishing bomb drop point parameter image aerial monitoring system according to claim 1, wherein the photoelectric pod further comprises a self-stabilizing device of specified stabilization accuracy, and carries three optical loads (visible light, infrared thermal imaging and laser ranging), wherein the resolution of the visible light camera is not less than 1920 × 1080 with a frame rate of not less than 1000 fps, the resolution of the thermal infrared imager is not less than 640 × 480 with a frame rate of not less than 100 fps, and the ranging accuracy of the laser rangefinder is not worse than 1 m.
3. The fire extinguishing bomb drop point parameter image aerial monitoring system according to claim 1, wherein the photoelectric pod adopts a two-axis multi-frame architecture, comprising, from outside to inside, a pod base, an outer azimuth frame, an outer pitch frame, an inner azimuth frame and an inner pitch frame; the pod is connected to the carrier through the base; the outer azimuth frame is electrically connected to the base through a slip ring; the outer pitch frame, inner azimuth frame and inner pitch frame adopt composite limit protection combining mechanical limits, electrical limits and software protection; and an optical bench is mounted on the pitch frame for installing the photoelectric sensors.
4. The fire extinguishing bomb drop point parameter image aerial monitoring system according to claim 1, wherein the inertial guidance module supports a plurality of navigation modes and can form an integrated navigation system with GPS/DVL.
5. The fire extinguishing bomb drop point parameter image aerial monitoring system according to claim 1, wherein the RS232 serial communication interface of the onboard processing module is communicatively connected with the onboard B-code timing module, the wireless image transmission module, the flight control module, the inertial guidance module and the differential positioning device (RTK) sky end through an RS422 interface.
6. The fire extinguishing bomb drop point parameter image aerial monitoring system according to claim 1, wherein the flight control navigation system comprises an autopilot, a redundant power management module, a voltage monitoring module and an engine speed monitoring module, and the autopilot comprises a GPS navigation module, an inertial navigation module, an air data module, a magnetic compass and a flight control module.
7. The fire extinguishing bomb drop point parameter image aerial monitoring system according to claim 1, further comprising a ground data link, wherein the ground comprehensive display control system receives the image transmission signals of the fire extinguishing bomb drop point parameter image aerial monitoring system, the airborne data link transmits data to the ground data link wirelessly, and the airborne and ground data links are connected one to one.
8. A fire extinguishing bomb drop point parameter image aerial monitoring method, characterized in that a fire extinguishing bomb drop point parameter image aerial monitoring system according to any one of claims 1 to 7 is used, the method comprising the following steps:
Step one: initial calibration, specifically comprising the following steps:
1) First, three targets with a diameter of 2 m are arranged at random positions in the target range area;
2) The unmanned aerial vehicles take off to a flight height of about 200 m; their photoelectric loads simultaneously position the same target, and the target position is solved by resolving the respective coordinate relationships;
3) The unmanned aerial vehicles then fly at a height of 500 m, spaced 120° apart from one another in azimuth, with a slant range of 707 m to the field-of-view center and an angle of 45° between the optical axis and the horizontal (707 m ≈ 500 m / sin 45°); calibration is completed according to this deployment requirement;
Step two: fire extinguishing bombs are fired toward the target area;
Step three: the photoelectric loads of the unmanned aerial vehicles collect images of the fired target;
Step four: after the unmanned aerial vehicles return, the data are transmitted to the comprehensive control platform; the longitude and latitude of the actual drop point of the fire extinguishing bomb are calculated from the unmanned aerial vehicle coordinates and the photoelectric load data and compared with the longitude and latitude of the target point (a sketch of such a projection is given below).
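Purely as an illustration of step four, the following Python sketch projects a drop-point latitude/longitude from a UAV position, gimbal angles and a laser slant range; the flat-terrain and small-offset approximations and all names are assumptions, not the patented solving algorithm. The final assertion only checks the calibration geometry of step one (707 m ≈ 500 m / sin 45°).

```python
import math

EARTH_R = 6378137.0  # WGS-84 equatorial radius, metres

def drop_point_latlon(lat_deg: float, lon_deg: float,
                      azimuth_deg: float, depression_deg: float,
                      slant_m: float) -> tuple:
    """Project a laser-ranged line of sight from the UAV to the ground.

    Azimuth is measured from true north, depression downward from the
    horizontal; flat terrain and small offsets are assumed.
    """
    az = math.radians(azimuth_deg)
    dep = math.radians(depression_deg)
    ground_range = slant_m * math.cos(dep)        # horizontal distance, m
    d_north = ground_range * math.cos(az)
    d_east = ground_range * math.sin(az)
    d_lat = math.degrees(d_north / EARTH_R)
    d_lon = math.degrees(d_east / (EARTH_R * math.cos(math.radians(lat_deg))))
    return lat_deg + d_lat, lon_deg + d_lon

# Geometry check for step one: at 500 m height and 45 degrees depression,
# the slant range to the field-of-view centre is 500 / sin(45) = 707.1 m.
assert abs(500.0 / math.sin(math.radians(45.0)) - 707.1) < 0.1
```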
9. The fire extinguishing bomb drop point parameter image aerial monitoring method according to claim 8, wherein the target position calculation adopts Kalman-filter background modeling, with a Gaussian mixture model used to accurately detect the drop point target, specifically comprising the following steps:
1) In the prediction stage, the filter predicts an estimate of the current frame's background pixel information from the background estimate of the previous frame of the transmitted image.
The predicted background estimate for frame k is:
B_{k/k-1} = U_k B_{k-1/k-1} + P_k
The prediction covariance matrix is:
C_{k/k-1} = U_k C_{k-1/k-1} U_k^T + R_{1k}
where B_{k/k-1} is the prediction of the current frame k from the background value of the previous frame image, B_{k-1/k-1} is the optimal estimate of the previous frame's background, P_k is the control input (assigned 0 when absent), U_k is the state transition matrix, C_{k/k-1} is the covariance matrix corresponding to B_{k/k-1}, and R_{1k} is the process noise covariance matrix of the transmitted image;
2) In the update stage, the filter corrects the predicted value obtained in the prediction stage with the observation of the current pixel frame to obtain the next, more accurate estimate; the following three quantities are calculated:
reference difference (innovation): Y_k = Z_k - H_k B_{k/k-1};
reference difference covariance: C_{1k} = H_k C_{k/k-1} H_k^T + R_{2k};
optimal Kalman gain: K_{gk} = C_{k/k-1} H_k^T C_{1k}^{-1}.
The updated background pixel estimate is then:
B_{k/k} = B_{k/k-1} + K_{gk} Y_k
and the updated covariance estimate is:
C_{k/k} = (I - K_{gk} H_k) C_{k/k-1}
where Z_k is the current frame observation, H_k is the observation matrix, R_{2k} is the observation noise covariance matrix, and I is the identity matrix (I = 1 in the single-model, single-input case); when the algorithm enters frame k+1, C_{k/k} serves as the optimized covariance of the previous frame;
3) The above prediction and update steps are cycled in sequence, frame by frame, to perform the update operation (a sketch of this recursion is given below).
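As a minimal sketch of the recursion in claim 9, assuming a scalar per-pixel state with U_k = H_k = 1 (single model, single input) and illustrative noise variances that the claim does not specify:

```python
import numpy as np

def kalman_background(frames, r1=4.0, r2=25.0):
    """Per-pixel scalar Kalman background model.

    frames: iterable of float grayscale images of equal shape.
    r1, r2: assumed process and observation noise variances.
    Returns the final background estimate B_{k/k}.
    """
    it = iter(frames)
    b = next(it).astype(float)      # initial background = first frame
    c = np.full(b.shape, r1)        # initial covariance C_{0/0}
    for z in it:
        # Prediction: B_{k/k-1} = B_{k-1/k-1} (P_k = 0), C_{k/k-1} = C + R1
        b_pred, c_pred = b, c + r1
        # Update: innovation, its covariance, and the Kalman gain
        y = z - b_pred
        c1 = c_pred + r2
        k_gain = c_pred / c1
        b = b_pred + k_gain * y          # B_{k/k}
        c = (1.0 - k_gain) * c_pred      # C_{k/k}, reused at frame k+1
    return b

# Pixels deviating strongly from the background estimate are then passed
# to the Gaussian mixture model for drop-point target confirmation.
```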
10. The fire extinguishing bomb drop point parameter image aerial monitoring method according to claim 8, wherein, when explosion points adhere to one another, a viewing-angle judgment algorithm is used in step four to distinguish the bomb sequence, specifically comprising: selecting, from the image data acquired at different angles by the three cameras mounted on the three unmanned aerial vehicles, the data least affected by adhesion, and resolving it to obtain relatively accurate explosion point positioning coordinates; meanwhile, during explosion point positioning, on the one hand, the accuracy weight of the binocular positioning information of each two-camera combination for a given explosion point is calculated from the shooting angles and recognition confidences of the three cameras, thereby realizing weight-based cross-verification and optimization of the explosion point position information; on the other hand, the explosion point positioning accuracy of the different binocular combinations is cross-verified using the explosion point quantity information captured by the infrared camera.
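A hedged sketch of the weight-based cross-verification in claim 10; the normalized convex-combination scheme and all names are assumptions, since the claim only states that weights derive from shooting angles and recognition confidence:

```python
import numpy as np

def fuse_burst_point(estimates, weights):
    """Fuse burst-point solutions from the three binocular camera pairs.

    estimates: list of (x, y, z) positions, one per two-camera combination.
    weights: accuracy weights derived from shooting angle and confidence.
    Returns the fused position and per-pair residuals for cross-checking.
    """
    p = np.asarray(estimates, dtype=float)   # shape (3, 3)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                          # normalize to a convex combination
    fused = (w[:, None] * p).sum(axis=0)
    # Large residuals flag pairs whose view was corrupted by adhesion.
    residuals = np.linalg.norm(p - fused, axis=1)
    return fused, residuals

fused, res = fuse_burst_point(
    [(10.0, 5.0, 0.2), (10.4, 5.1, 0.1), (9.9, 4.8, 0.3)],
    [0.9, 0.6, 0.8],
)
```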
CN202310927653.3A 2023-07-26 2023-07-26 Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb Active CN116952081B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310927653.3A CN116952081B (en) 2023-07-26 2023-07-26 Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb

Publications (2)

Publication Number Publication Date
CN116952081A true CN116952081A (en) 2023-10-27
CN116952081B CN116952081B (en) 2024-04-16

Family

ID=88447255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310927653.3A Active CN116952081B (en) 2023-07-26 2023-07-26 Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb

Country Status (1)

Country Link
CN (1) CN116952081B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108388267A (en) * 2018-04-13 2018-08-10 北京天赢测控技术有限公司 Unmanned plane managing and control system
CN210090988U (en) * 2019-04-11 2020-02-18 株洲时代电子技术有限公司 Unmanned aerial vehicle system of patrolling and examining
CN113269098A (en) * 2021-05-27 2021-08-17 中国人民解放军军事科学院国防科技创新研究院 Multi-target tracking positioning and motion state estimation method based on unmanned aerial vehicle
CN113850126A (en) * 2021-08-20 2021-12-28 武汉卓目科技有限公司 Target detection and three-dimensional positioning method and system based on unmanned aerial vehicle
WO2022110912A1 (en) * 2020-11-27 2022-06-02 清华大学 Unmanned aerial vehicle video-based forest fire spreading data assimilation method and apparatus
CN114963898A (en) * 2022-06-15 2022-08-30 西安工业大学 System and method for testing shot blasting point position based on unmanned aerial vehicle
US20230215024A1 (en) * 2020-08-31 2023-07-06 Autel Robotics Co., Ltd. Position estimation method and apparatus for tracking target, and unmanned aerial vehicle


Similar Documents

Publication Publication Date Title
JP7260269B2 (en) Positioning system for aeronautical non-destructive inspection
US10191486B2 (en) Unmanned surveyor
CN104168455B (en) A kind of space base large scene camera system and method
CN104482934B (en) The super close distance autonomous navigation device of a kind of Multi-sensor Fusion and method
Johnson et al. Real-time terrain relative navigation test results from a relevant environment for Mars landing
CN106056075A (en) Important person identification and tracking system in community meshing based on unmanned aerial vehicle
CN106468547A (en) Utilize multiple optical pickocffs is independent of global positioning system for self-conductance aircraft(“GPS”)Navigation system
CN109597432B (en) Unmanned aerial vehicle take-off and landing monitoring method and system based on vehicle-mounted camera unit
Cui et al. Search and rescue using multiple drones in post-disaster situation
CN110887486B (en) Unmanned aerial vehicle visual navigation positioning method based on laser line assistance
CN109573088A (en) A kind of Shipborne UAV photoelectricity guidance carrier landing system and warship method
Lauterbach et al. The Eins3D project—Instantaneous UAV-based 3D mapping for Search and Rescue applications
CN112363176B (en) Elevator hoistway inspection and modeling method and device and inspection and modeling system
CN104154827B (en) A kind of fire accuracy measuring system and method for testing
CN110316376A (en) It is a kind of for detecting the unmanned plane of mine fire
Adami et al. Ultra light UAV systems for the metrical documentation of cultural heritage: Applications for architecture and archaeology
CN115291536B (en) Visual unmanned aerial vehicle tracking ground target semi-physical simulation platform verification method
CN110989670B (en) Unmanned aerial vehicle system for environmental water conservation monitoring of power transmission and transformation project and aerial photography method thereof
CN114371725A (en) System suitable for automatic inspection of wind turbine generator
CN116952081B (en) Aerial monitoring system and monitoring method for parameter images of drop points of fire extinguishing bomb
CN109612456B (en) Low-altitude search positioning system
CN115493598B (en) Target positioning method and device in motion process and storage medium
Menna et al. Towards online UAS‐based photogrammetric measurements for 3D metrology inspection
CN116358349A (en) Multi-mode guidance simulation system and guidance simulation method based on unmanned aerial vehicle
Howard et al. Active sensor system for automatic rendezvous and docking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant