CN114578849A - Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium - Google Patents


Info

Publication number
CN114578849A
Authority
CN
China
Prior art keywords
target
image
unmanned aerial vehicle
water column
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210054509.9A
Other languages
Chinese (zh)
Inventor
黄亮
王奉祥
段立
张显峰
郭云玮
罗兵
周国庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Naval University of Engineering PLA
Original Assignee
Naval University of Engineering PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Naval University of Engineering PLA filed Critical Naval University of Engineering PLA
Priority to CN202210054509.9A
Publication of CN114578849A

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 — Simultaneous control of position or course in three dimensions
    • G05D1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention belongs to the technical field of automatic target detection and discloses an unmanned aerial vehicle target detection system, a target detection method and a computer readable storage medium. The unmanned aerial vehicle target detection system comprises: an unmanned aerial vehicle system, which carries the photoelectric load system to the observation position and hovers at a fixed point; the photoelectric load system, which executes the fixed-point observation task and keeps the target range area under fixed observation; a wireless image transmission system, which transmits the observed video images through an HHLM type transmission module; a wireless remote control system, which controls the unmanned aerial vehicle system and the photoelectric load system; and a target detection comprehensive task platform, which receives, stores, forwards and processes the video images. The invention provides an offshore target detection system that is of great significance for raising the level of test technology and test equipment and for improving the live-fire capability and training level of naval gun crews, and it has strong practical significance for guaranteeing the quality of live naval gun firing and meeting battlefield requirements.

Description

Unmanned aerial vehicle target detection system, unmanned aerial vehicle target detection method and computer readable storage medium
Technical Field
The invention belongs to the technical field of automatic target detection, and particularly relates to an unmanned aerial vehicle target detection system, a target detection method and a computer readable storage medium.
Background
Since the beginning of the 21st century, offshore security has drawn increasing attention worldwide: disputes over maritime territory and ocean rights, rampant pirate activity and threats to free navigation at sea have led countries to attach growing importance to the modernization of their navies. Live-fire shooting is an important item in naval military training, and in competitive live-fire assessment, fast and accurate evaluation of the firing effect is the key to completing the overall assessment of combat performance. Because the daily training level of a gun crew is judged from the hit rate and miss rate after firing, accurate and scientific measurement is essential. An offshore range differs from a land range: on land, impact traces are easy to capture, whereas at sea a shell falling into the water raises a water column, so the impact trace is fleeting and hard to capture.
Through the above analysis, the problems and defects of the prior art are as follows: the prior art has no automatic target detection method applicable to an offshore range, nor any method or system capable of automatic, intelligent evaluation of naval gun firing. The traditional optical measurement method requires manual time alignment of two groups of image data and manual measurement; it suffers from orthogonality errors, a heavy workload and a strong dependence on specialists, and is not suited to modern high-intensity training. Meanwhile, if the technically advanced, structurally complex and high-precision measurement and control equipment of a fixed range is used to acquire the impact-deviation data, the cost cannot be ignored.
The difficulty in solving the above problems and defects is as follows. The traditional procedure for judging the result of main-gun short-range counterattack firing has the following shortcomings:
(1) Many support forces are involved and coordination is difficult. Both a reference naval vessel and a target-laying naval vessel must take part, and several cameras must be set up, so many support personnel are needed and continuous coordination and communication among them is unavoidable.
(2) The filming process is demanding and complex to implement. Because the gun dispersion error can only be calculated from the two videos taken from the reference ship and the target-laying ship, the time bases of the two cameras must be unified to ensure that the measured target is consistent across the video images and that filming and recording are simultaneous; otherwise objective post-hoc analysis based on the videos is difficult.
(3) Dispersion error estimation is slow and of low accuracy. In the post-processing of the water-column video, after the consistency of the measured target has been established, the distance between each shot water column and the floating target is measured on a computer screen with a graduated scale, and the probable dispersion error of the naval gun is finally estimated through conversion and processing; this is time-consuming, labor-intensive and imprecise.
The significance of solving the above problems and defects is as follows. In traditional miss-distance evaluation, two orthogonal cameras film the direction and range dispersion from the firing ship and the support ship respectively; after returning to shore, the two groups of image data are manually aligned in time, distances are measured on the images, the dispersion error is calculated and the assessment result is graded. The whole process is time-consuming and labor-intensive. The unmanned aerial vehicle target detection system has broad application prospects: it is suitable for the offshore target-scoring tasks of naval gun firing as well as for shore-based artillery firing at sea. As the technology matures, unmanned aerial vehicle target detection systems will gradually be popularized and replace the target detection systems currently in use. With the present invention, the miss-distance evaluation is completed from video images: the automatic target detection system of the firing unmanned aerial vehicle acquires, from a fixed point at high altitude, video images of the relative positions of the target and the shot water columns, transmits them to the digital processing module, completes the extraction and positioning of the target and of the water columns by digital image processing, calculates the coordinates of the water columns and obtains the firing dispersion error evaluation score; finally the GUI of the miss-distance estimation system is realized. This shortens the evaluation time, reduces the operators' workload and improves the accuracy and timeliness of the evaluation.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an unmanned aerial vehicle target detection system, a target detection method and a computer readable storage medium.
The invention is realized in this way: an unmanned aerial vehicle target detection system, the system comprising:
the system comprises an unmanned aerial vehicle system, a photoelectric load system, a wireless image transmission system, a wireless remote control system and a target detection comprehensive task platform;
the unmanned aerial vehicle system comprises a four-rotor unmanned aerial vehicle body, a propulsion device, a flight control device and a power supply device; the device is used for carrying the photoelectric load system to an observation position and hovering at a fixed point;
the photoelectric load system is carried on the four-rotor unmanned aerial vehicle body; the device comprises a CCD camera, an infrared detection device and a nacelle; the system is used for executing a fixed-point observation task and carrying out fixed observation on a target range area;
the wireless image transmission system is used for transmitting the observed video images by adopting the HHLM type transmission module;
the wireless remote control system is used for controlling the unmanned aerial vehicle system and the photoelectric load system;
and the target detection comprehensive task platform is used for receiving, storing, forwarding and processing the video image.
Further, the wireless remote control system includes:
the receiving module is used for receiving the control instruction;
the decoding module is used for decoding the control instruction by using a decoder;
and the control module is used for controlling the unmanned aerial vehicle system and the photoelectric load system based on the control instruction obtained by decoding.
Another object of the present invention is to provide an unmanned aerial vehicle target detection method applied to the unmanned aerial vehicle target detection system, the unmanned aerial vehicle target detection method comprising:
the method comprises the steps of obtaining a video image of a water column shot by a ship gun by utilizing a photoelectric load, analyzing and processing the video image of the water column shot by the ship gun by obtaining video information, obtaining an image sequence, playing the video information, selecting images frame by frame, inputting a coordinate system rotation angle, extracting the water column shot by the ship gun and other means to obtain a hit rate and a miss rate of shooting of the ship gun, and obtaining a shooting evaluation result.
Further, the unmanned aerial vehicle target detection method comprises the following steps:
step one, generating a control instruction, and controlling, based on it, the quad-rotor unmanned aerial vehicle carrying the photoelectric load to fly to a safe height above the target and hover; meanwhile, controlling the unmanned aerial vehicle to adjust the shooting attitude of the photoelectric load based on the generated control instruction;
step two, controlling the photoelectric load, based on the control instruction, to record the naval gun firing in the offshore target area, continuously acquiring video images of the water columns raised by the shells entering the water, and obtaining the relative relation between the shot water columns and the target in the target area, i.e. the dispersion of the naval gun firing;
step three, converting the obtained video image into an electric signal, and converting the electric signal into a digital signal by sampling, quantizing and encoding with an analog/digital converter;
step four, preprocessing the converted image to obtain clear and useful image information; calculating the relative positions of the shot water columns and the floating target from the obtained image, calculating the firing dispersion error, and evaluating the miss distance of the naval gun firing.
Further, the safe altitude calculation method includes:
firstly, the hovering height of the unmanned aerial vehicle is analyzed based on the shooting height of the ship cannon:
H≥h;
wherein H represents the hovering height of the drone; h represents the height of the projectile when it reaches the highest point; the height h of the projectile when reaching the highest point is determined by the relationship among the firing angle, the firing distance and the firing height of the naval gun;
secondly, analyzing the hovering height of the unmanned aerial vehicle based on the observation range of the photoelectric load: the range of the judgment interval is determined from the gun type, the firing method, the average firing distance and the corresponding table of score evaluation parameters for firing at floating and simulated targets; the hovering height is then determined from the rectangular boundary of the judgment interval, i.e. the rectangular boundary of the effective hit area into which the shells must fall according to that table;
and finally, determining the safe height of the unmanned aerial vehicle from the naval-gun-firing hovering-height analysis, the rectangular boundary of the judgment interval, the field-of-view of the camera and the flight altitude limit of the unmanned aerial vehicle.
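The combination of constraints described above can be sketched in a minimal way. In the sketch below, the function names, the diagonal-coverage geometry for the camera footprint, and all numeric values are illustrative assumptions, not details taken from the patent:

```python
import math

def min_hover_height(rect_width_m, rect_length_m, fov_deg):
    """Lowest hover height at which a downward-looking camera with the given
    (full) field-of-view angle sees the whole judgment rectangle, assuming the
    footprint must cover the rectangle's half-diagonal."""
    half_diag = 0.5 * math.hypot(rect_width_m, rect_length_m)
    return half_diag / math.tan(math.radians(fov_deg) / 2.0)

def safe_height(shell_apex_m, rect_width_m, rect_length_m, fov_deg, ceiling_m):
    """Safe hover height H: above the shell apex h, high enough to observe the
    whole judgment rectangle, and below the UAV's flight altitude limit."""
    needed = max(shell_apex_m,
                 min_hover_height(rect_width_m, rect_length_m, fov_deg))
    if needed > ceiling_m:
        raise ValueError("judgment area cannot be observed within the altitude limit")
    return needed
```

For example, a 60 m x 80 m judgment rectangle viewed by a 90-degree camera needs only about 50 m of height, so the binding constraint is usually the shell apex h.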
Further, the preprocessing the converted image to obtain clear and useful image information includes:
(1) segmenting the converted image by a color-feature-based segmentation method, and identifying and extracting the target region;
(2) fitting a circle to the segmented target information by the least-squares method to determine the center and radius of the target and thereby position it;
(3) applying graying, median filtering and contrast enhancement to the extracted and positioned target image.
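The least-squares circle fit of step (2) can be sketched with the standard linear (Kasa) formulation; the patent does not give its exact formulation, so the version below is an assumed illustration of how the center and radius fall out of one linear solve:

```python
import numpy as np

def fit_circle_lsq(xs, ys):
    """Least-squares (Kasa) circle fit: recover center (a, b) and radius r
    from boundary points of the segmented floating-target region by solving
    x^2 + y^2 = 2a*x + 2b*y + c linearly, where c = r^2 - a^2 - b^2."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs ** 2 + ys ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a * a + b * b)
```

With exact points on a circle the fit recovers the center and radius to machine precision; with noisy segmentation boundaries it returns the best-fitting circle in the least-squares sense.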
Further, calculating the relative position of the shot water column and the floating target based on the obtained clear and useful image comprises:
calculating a threshold by an adaptive, iterative method to segment the shot water column image and obtain a water-column image with the background removed; binarizing the background-removed water-column image; and positioning the shot water column with an improved centroid method.
Further, segmenting the shot water column image by computing the threshold with the adaptive, iterative method comprises the following steps:
1.1) compute the minimum gray value T_min and the maximum gray value T_max of the shot water column image, and take their average as the initial threshold T:
T = (T_min + T_max)/2;
1.2) segment the image with the threshold T into two pixel sets:
G_1 = {f(x,y) ≥ T}, G_2 = {f(x,y) < T};
1.3) compute the gray-level means μ_1 and μ_2 of the pixel sets G_1 and G_2:
μ_1 = (1/N_1) Σ_{f(x,y)∈G_1} f(x,y), μ_2 = (1/N_2) Σ_{f(x,y)∈G_2} f(x,y),
where N_1 and N_2 are the numbers of pixels in G_1 and G_2;
1.4) compute a new threshold from μ_1 and μ_2:
T = (μ_1 + μ_2)/2;
Step 1.2), step 1.3), step 1.4) are repeated until the threshold T converges to a certain range.
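Steps 1.1)-1.4) can be sketched as follows; this is a minimal illustration, and the convergence tolerance eps is an assumed detail that the text leaves open:

```python
import numpy as np

def iterative_threshold(img, eps=0.5):
    """Adaptive iterative threshold (steps 1.1-1.4): start from the mean of
    the extreme gray levels, then repeatedly average the two class means
    until the threshold change falls below eps."""
    img = np.asarray(img, dtype=float)
    t = 0.5 * (img.min() + img.max())          # step 1.1: initial threshold
    while True:
        g1 = img[img >= t]                     # step 1.2: split into two sets
        g2 = img[img < t]
        mu1 = g1.mean() if g1.size else t      # step 1.3: class means
        mu2 = g2.mean() if g2.size else t
        t_new = 0.5 * (mu1 + mu2)              # step 1.4: new threshold
        if abs(t_new - t) < eps:               # repeat until convergence
            return t_new
        t = t_new

def binarize(img, t):
    """Binary water-column mask: foreground where gray level >= threshold."""
    return (np.asarray(img, dtype=float) >= t).astype(np.uint8)
```

On a bimodal image the threshold settles between the two gray-level clusters, which is what makes the subsequent binarization separate the bright water column from the sea background.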
Further, positioning the shot water column with the improved centroid method comprises:
detecting the edges of the water column, extracting the edge information, and calculating the centroid of the water column from the edge information.
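A minimal sketch of such an edge-based centroid follows; the 4-neighbor interior test is an assumed concretization, since the patent does not spell out its "improved" centroid method precisely:

```python
import numpy as np

def edge_centroid(mask):
    """Edge-based centroid: average only the edge pixels of the binary
    water-column mask (a pixel is an edge pixel if any 4-neighbor is
    background), rather than every foreground pixel."""
    mask = np.asarray(mask, dtype=bool)
    padded = np.pad(mask, 1, constant_values=False)
    # A pixel is interior when all four 4-neighbors are foreground.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    edge = mask & ~interior
    ys, xs = np.nonzero(edge)
    return xs.mean(), ys.mean()
```

For a symmetric blob the edge centroid coincides with the plain centroid; the two differ when the segmented column has an uneven interior, which is presumably what the "improved" variant guards against.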
Further, in step four, calculating the firing dispersion error and evaluating the miss distance of the naval gun firing comprises the following steps:
(1) calculating the dispersion error:
based on the obtained positions of the target and of each shot water column in the image, a dispersion coordinate system with the target as the origin is established, and the dispersion errors are calculated as:
E_X = 0.6745 · sqrt( Σ_{i=1..n} (X_i − X̄)² / (n − 1) );
E_Z = 0.6745 · sqrt( Σ_{i=1..n} (Z_i − Z̄)² / (n − 1) );
wherein (X_i, Z_i) are the actual coordinates of each shot water column; (X̄, Z̄) are the coordinates of the center of the group of shot water columns, i.e. the means of the actual coordinates of all shot water columns; 0.6745 is the probable-error factor; and n is the total number of projectiles.
(2) comparing the calculated values of E_X and E_Z with the standard impact-point dispersion error value K to determine the firing result and the miss-distance evaluation result of the naval gun.
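A hedged sketch of this dispersion-error computation follows. The text names the actual splash coordinates (X_i, Z_i), the group-center mean, the 0.6745 probable-error factor and the shell count n, so the standard probable-error form below is assumed; in particular the n − 1 sample-variance denominator is an assumption:

```python
import math

def dispersion_errors(coords):
    """Probable dispersion errors E_X, E_Z from splash coordinates (X_i, Z_i)
    in the target-centered coordinate system; 0.6745 converts the sample
    standard deviation to the probable error."""
    n = len(coords)
    xs = [x for x, _ in coords]
    zs = [z for _, z in coords]
    x_bar = sum(xs) / n                         # group-center coordinates:
    z_bar = sum(zs) / n                         # mean of all splash points
    ex = 0.6745 * math.sqrt(sum((x - x_bar) ** 2 for x in xs) / (n - 1))
    ez = 0.6745 * math.sqrt(sum((z - z_bar) ** 2 for z in zs) / (n - 1))
    return ex, ez
```

The returned pair would then be compared against the standard impact-point dispersion value K to grade the firing, as in step (2).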
It is a further object of the invention to provide a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the functions of the unmanned aerial vehicle target detection system.
By combining all the above technical schemes, the advantages and positive effects of the invention are as follows: the invention uses the automatic target detection system of the firing unmanned aerial vehicle to obtain, from a fixed point at high altitude, video images of the relative positions of the target and the shot water columns, transmits the images to the digital processing module, completes the extraction and positioning of the target and of the water columns by digital image processing, calculates the coordinates of the water columns and obtains the firing dispersion error evaluation result. This is of great significance for raising the level of test technology and test equipment and for improving the live-fire capability and training level of naval gun crews, and it has strong practical significance for guaranteeing the quality of live naval gun firing and meeting battlefield requirements.
The invention adopts an HHLM type wireless image transmission system with a transmission range of up to 50 km. The HHLM microwave image transmission system is a high-performance, high-quality wireless image transmission system designed for long distances or environments without wired transmission. It is small and light, transmits high-quality video images in real time without distortion, has stable modulation and demodulation performance, delivers bright and clear color images, and is simple to install and debug.
Drawings
Fig. 1 is a schematic diagram of an unmanned aerial vehicle target detection system provided by an embodiment of the invention.
Fig. 2 is a schematic structural diagram of an unmanned aerial vehicle target detection system provided by an embodiment of the invention;
in the figure: 1. an unmanned aerial vehicle system; 2. a photoelectric load system; 3. a wireless image transmission system; 4. a wireless remote control system; 5. a target detection comprehensive task platform.
Fig. 3 is a schematic diagram of a drone target detection method provided by an embodiment of the invention.
Fig. 4 is a flowchart of a drone target detection method provided by an embodiment of the present invention.
Fig. 5 is a schematic view of the position of the unmanned aerial vehicle for target detection provided by the embodiment of the present invention.
Fig. 6 is a graph of firing elevation angle versus firing distance provided by an embodiment of the present invention.
Fig. 7 is a graph of elevation angle and height of shot provided by an embodiment of the present invention.
Fig. 8 is a shot graph of different shot elevations provided by an embodiment of the present invention.
Fig. 9 is a schematic diagram of a determination interval according to an embodiment of the present invention.
Fig. 10 is a schematic view of detection of a target detecting system provided by the embodiment of the invention.
FIG. 11 is a schematic diagram of pinhole imaging provided by embodiments of the present invention.
Fig. 12 is a flow chart of automatic extraction and positioning of the water column.
Fig. 13 is a gray scale histogram showing a single peak provided by an embodiment of the present invention.
Fig. 14 is a schematic diagram of an image coordinate system according to an embodiment of the present invention.
Fig. 15 is a schematic diagram of the principle of calculating the coordinates of the bouncing water column according to the embodiment of the invention.
Fig. 16 is a schematic diagram of a pop-up scattering coordinate system according to an embodiment of the present invention.
Fig. 17 is a plane position relation diagram of the shooting ship and the unmanned aerial vehicle provided by the embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Aiming at the problems in the prior art, the invention provides an unmanned aerial vehicle target detection system, and the invention is described in detail below with reference to the accompanying drawings.
As shown in fig. 1-2, an unmanned aerial vehicle target detection system provided by the embodiment of the invention includes:
the system comprises an unmanned aerial vehicle system 1, a photoelectric load system 2, a wireless image transmission system 3, a wireless remote control system 4 and a target detection comprehensive task platform 5;
the unmanned aerial vehicle system 1 comprises a four-rotor unmanned aerial vehicle body, a propulsion device, a flight control device and a power supply device; the device is used for carrying the photoelectric load system to an observation position and hovering at a fixed point;
the photoelectric load system 2 is carried on the four-rotor unmanned aerial vehicle body; the device comprises a CCD camera, an infrared detection device and a nacelle; the system is used for executing a fixed-point observation task and carrying out fixed observation on a target range area;
the wireless image transmission system 3 is used for transmitting the observed video images by adopting an HHLM type transmission module;
the wireless remote control system 4 is used for controlling the unmanned aerial vehicle system and the photoelectric load system;
and the target detection comprehensive task platform 5 is used for receiving, storing, forwarding and processing the video image.
The wireless remote control system 4 provided by the embodiment of the invention comprises:
the receiving module is used for receiving the control instruction;
the decoding module is used for decoding the control instruction by using a decoder;
and the control module is used for controlling the unmanned aerial vehicle system and the photoelectric load system based on the control instruction obtained by decoding.
As shown in fig. 3, the unmanned aerial vehicle target detection method provided by the embodiment of the present invention includes:
the method comprises the steps of obtaining a video image of a water column shot by a ship gun by utilizing a photoelectric load, analyzing and processing the video image of the water column shot by the ship gun by obtaining video information, obtaining an image sequence, playing the video information, selecting images frame by frame, inputting a coordinate system rotation angle, extracting the water column shot by the ship gun and other means to obtain a hit rate and a miss rate of shooting of the ship gun, and obtaining a shooting evaluation result.
As shown in fig. 4, the unmanned aerial vehicle target detection method provided by the embodiment of the invention includes the following steps:
s101, generating a control command, and controlling the quad-rotor unmanned aerial vehicle carrying the photoelectric load to fly to a safe height above a target and then hover based on the generated control command; meanwhile, controlling the unmanned aerial vehicle to adjust the shooting attitude of the photoelectric load based on the generated control instruction;
s102, controlling the photoelectric load to record the shooting condition of the naval gun in the offshore target area based on a control instruction, continuously acquiring a water column video image stimulated by the shot entering water, and acquiring the relative relation between the shot water column and the target in the target area, namely the scattering condition of the naval gun shooting;
s103, converting the acquired video image into an electric signal, and converting the electric signal into a digital signal after sampling, quantizing and encoding by using an analog/digital converter;
s104, preprocessing the converted image to obtain clear and useful image information; and calculating the relative position of the shot water column and the floating target based on the obtained clear and useful image, calculating to obtain a shot scattering error, and evaluating the shot miss amount of the ship cannon.
The safe height calculating method provided by the embodiment of the invention comprises the following steps:
firstly, analyzing the hovering height of the unmanned aerial vehicle based on the shot-firing height of the ship cannon:
H≥h;
wherein H represents the hovering height of the drone; h represents the height of the projectile when it reaches the highest point; the height h of the projectile when reaching the highest point is determined by the relationship among the firing angle, the firing distance and the firing height of the naval gun;
secondly, analyzing the hovering height of the unmanned aerial vehicle based on the observation range of the photoelectric load: the range of the judgment interval is determined from the gun type, the firing method, the average firing distance and the corresponding table of score evaluation parameters for firing at floating and simulated targets; the hovering height is then determined from the rectangular boundary of the judgment interval, i.e. the rectangular boundary of the effective hit area into which the shells must fall according to that table;
and finally, determining the safe height of the unmanned aerial vehicle from the naval-gun-firing hovering-height analysis, the rectangular boundary of the judgment interval, the field-of-view of the camera and the flight altitude limit of the unmanned aerial vehicle.
The method for preprocessing the converted image to obtain clear and useful image information comprises the following steps:
(1) segmenting the converted image by a color-feature-based segmentation method, and identifying and extracting the target region;
(2) fitting a circle to the segmented target information by the least-squares method to determine the center and radius of the target and thereby position it;
(3) applying graying, median filtering and contrast enhancement to the extracted and positioned target image.
The method for calculating the relative position of the shot water column and the floating target based on the obtained clear and useful image comprises:
calculating a threshold by an adaptive, iterative method to segment the shot water column image and obtain a water-column image with the background removed; binarizing the background-removed water-column image; and positioning the shot water column with an improved centroid method.
The method for segmenting the shot water column image by computing the threshold with the adaptive, iterative method comprises the following steps:
1.1) compute the minimum gray value T_min and the maximum gray value T_max of the shot water column image, and take their average as the initial threshold T:
T = (T_min + T_max)/2;
1.2) segment the image with the threshold T into two pixel sets:
G_1 = {f(x,y) ≥ T}, G_2 = {f(x,y) < T};
1.3) compute the gray-level means μ_1 and μ_2 of the pixel sets G_1 and G_2:
μ_1 = (1/N_1) Σ_{f(x,y)∈G_1} f(x,y), μ_2 = (1/N_2) Σ_{f(x,y)∈G_2} f(x,y),
where N_1 and N_2 are the numbers of pixels in G_1 and G_2;
1.4) compute a new threshold from μ_1 and μ_2:
T = (μ_1 + μ_2)/2;
Step 1.2), step 1.3), step 1.4) are repeated until the threshold T converges to a certain range.
Positioning the shot water column with the improved centroid method, as provided by the embodiment of the invention, comprises:
detecting the edges of the water column, extracting the edge information, and calculating the centroid of the water column from the edge information.
The method for evaluating the shot miss amount of the naval gun shooting by calculating the shot scattering error comprises the following steps:
(1) the calculation of the dispersion error is performed:
based on the obtained position information of the target and the shot water column in the image, a shot scattering coordinate system with the target as an origin is established, and scattering errors are calculated by the following formula:
E_X = 0.6745 × sqrt( Σ_{i=1}^{n} (X_i − X̄)² / (n − 1) )  (formula 4.25)
E_Z = 0.6745 × sqrt( Σ_{i=1}^{n} (Z_i − Z̄)² / (n − 1) )
wherein (X_i, Z_i) denotes the actual coordinate values of each bouncing water column; (X̄, Z̄) denotes the coordinates of the center of the group of bouncing water columns, whose values are the averages of the actual coordinate values of all the bouncing water columns; 0.6745 represents the likelihood factor, and n represents the total number of projectiles.
(2) The calculated values E_X and E_Z are compared with the standard impact-point dispersion error value K to determine the shooting result of the naval gun and the result of the miss-amount evaluation.
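As an illustrative sketch of steps (1) and (2) above (the patent gives no code; the function names are hypothetical, and the probable-error form with the 0.6745 factor and an (n − 1) denominator is assumed from the surrounding description):

```python
import math

def probable_errors(points):
    """Dispersion (probable) errors E_X, E_Z of impact points (X_i, Z_i)
    about the group center, using the 0.6745 likelihood factor.
    Requires at least two impact points."""
    n = len(points)
    xbar = sum(x for x, _ in points) / n          # group-center coordinates
    zbar = sum(z for _, z in points) / n
    ex = 0.6745 * math.sqrt(sum((x - xbar) ** 2 for x, _ in points) / (n - 1))
    ez = 0.6745 * math.sqrt(sum((z - zbar) ** 2 for _, z in points) / (n - 1))
    return ex, ez

def within_standard(ex, ez, k):
    """Compare the computed dispersion errors with the standard value K."""
    return ex <= k and ez <= k
```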
The technical solution of the present invention is further described with reference to the following specific embodiments.
Example 1:
1. unmanned aerial vehicle target detection system composition and working principle
1.1 System composition
The unmanned aerial vehicle target detection system comprises an unmanned aerial vehicle system, a photoelectric load, a wireless image transmission system, a wireless remote control system and a target detection comprehensive task platform.
An Unmanned Aerial Vehicle (UAV) is an unmanned aircraft operated by radio remote control equipment and a self-contained program control device. The unmanned aerial vehicle platform system is the aerial flight part of the unmanned aerial vehicle system and comprises the aircraft body, the propulsion device, the flight control device and the power supply device. The onboard portion of the communication data link and the mission load are also mounted on the aircraft. The invention selects a quad-rotor unmanned aerial vehicle powered by an aviation lithium battery. The quad-rotor is chosen mainly because of its mature application technology, low manufacturing cost, good maneuverability, strong survivability, low environmental requirements, ability to perform observation work that is difficult to carry out manually, and convenience of use. It is therefore well suited to hovering and shooting video of the impact-point water columns, can carry the optical measurement platform to complete aerial fixed-point observation tasks, and realizes fixed observation of the target range area.
The photoelectric load consists of a CCD camera, an infrared detection device and a nacelle. A CCD (Charge Coupled Device) is made of a semiconductor material having high sensitivity, converts light into electric charges, converts the electric charges into digital signals through an analog-to-digital converter chip, and stores the digital signals after compression in a flash memory or a built-in hard disk card inside a camera, thereby easily transmitting data to a computer. The shooting device is carried on a quad-rotor unmanned aerial vehicle and used for recording gun shooting conditions in an offshore target area, continuously shooting a water column excited by a shot entering water, and obtaining the relative relation between the shot water column and a target in the target area, namely the gun shooting dispersion condition.
The wireless image transmission system is used for transmitting images in time, converting video image information shot by the unmanned aerial vehicle optical measurement platform into an electric signal, converting the electric signal into a digital signal after sampling, quantizing and coding by an analog-digital converter, sending the digital signal to the dynamic memory for temporary storage, and then sending effective data to the information receiving and processing system in real time in a wireless mode by the wireless data transmission chip for later image processing and data operation. The invention adopts HHLM type wireless image transmission system, and the transmission distance can reach 50 km. The HHLM type microwave image transmission system is a high-performance and high-quality wireless image transmission system which is specially designed for the long-distance or environment without wire transmission conditions. The invention has small volume and light weight, can transmit high-quality video images in real time without distortion, has stable modulation and demodulation performance, bright and clear transmitted image color and simple installation and debugging.
The function of the wireless remote control system is to realize the control of the unmanned aerial vehicle and the photoelectric load. The ground control center sends out an instruction, the instruction is transmitted to the receiving module of the wireless remote control system, and the instruction is read out through the decoder, so that the control of the unmanned aerial vehicle and the control of the photoelectric load are realized.
The comprehensive target detection task platform is used for receiving, storing and forwarding video images and effectively processing the video images and mainly comprises a PC (personal computer) and some image processing and numerical calculation software. After receiving the information of the image transmission system, the image is effectively preprocessed, interference and noise are removed, clear and useful image information is taken out, the relative positions of the shot water column and the floating target are calculated according to a reasonable mathematical model by utilizing the information, and the shot scattering error is calculated.
1.2 principle of operation
The traditional evaluation mode is complicated, has large accidental errors, and imposes a heavy workload on range-safety personnel. The automatic naval gun shooting target detection system based on the unmanned aerial vehicle can acquire the required data from video image information and process it, overcoming the defects of traditional target detection work, shortening the working time and improving efficiency. The basic flow of system operation is roughly divided into three steps; a schematic diagram is shown in fig. 1.
(1) The unmanned aerial vehicle is used as a platform carrying the photoelectric load, and the unmanned aerial vehicle carries the photoelectric load to fly to a safe height above the target and then hovers.
(2) The photoelectric load of the unmanned aerial vehicle is adjusted to shoot the gesture to carry out work, and the target is imaged on the target surface through the camera.
(3) Video image information of the water columns raised by naval gun fire is transmitted in real time to the comprehensive task platform of the target-laying ship through the wireless transmission module, stored and forwarded in real time, and then passed to the wireless image transmission system, where the video signal is digitized by the image acquisition and display board. The video image is then processed rapidly and accurately by the developed comprehensive task software to obtain the hit rate and miss rate of the naval gun shooting, and the shooting result is evaluated.
2. Unmanned aerial vehicle system target detection task area analysis
In order to guarantee the safety of the unmanned aerial vehicle and meet the requirement of a target detection task, a target detection task area of the unmanned aerial vehicle is determined. When the unmanned aerial vehicle system is used for detecting the target, the unmanned aerial vehicle hovers right above the floating body target, so that the unmanned aerial vehicle target detection task area is an area right above the floating body target, and the main work is to determine the height of the area.
2.1 unmanned aerial vehicle hovering height analysis based on gun shooting height
In the process of shooting a naval gun, the running track of the projectile is similar to a parabola after the projectile leaves a gun bore. When the unmanned aerial vehicle target detection system works, the unmanned aerial vehicle hovers right above the floating body target, and in order to ensure the safety of the unmanned aerial vehicle, the height for ensuring the safety of the unmanned aerial vehicle is determined according to the shooting range of the ship cannon so as to prevent the unmanned aerial vehicle from being knocked down by the projectile. The position design idea of the drone is shown in fig. 5. It can be seen from fig. 5 that the unmanned aerial vehicle is hardly threatened in the horizontal direction, and the safety threat of the unmanned aerial vehicle mainly comes from the shot which reaches the position right above the floating body target, so that the selection of the hovering height of the unmanned aerial vehicle is very important in the shooting process of the ship cannon, and once the position selection is wrong, serious accidents can be caused, and the target detection work falls into a passive situation.
And analyzing the relationship among the firing angle, the firing distance and the firing height of the naval gun to find the optimal hovering height.
The initial velocity of 76 mm main gun projectiles leaving the bore is 980 m/s, the gravitational acceleration g is 9.8 m/s², the shooting elevation angle is α, and the range of the naval gun's shooting angle is −15° to 85°. Matlab software is used for simulation to obtain curves of shooting elevation versus shooting distance, shooting elevation versus shooting height, and shooting distance versus shooting height at different elevations; these curves and the simulation results are analyzed to determine a safe shooting range for the naval gun when firing at the floating body target.
First, Matlab is used for simulation to obtain the graph of shooting elevation versus shooting distance shown in fig. 6. As can be seen from fig. 6, the shooting distance of the naval gun is largest at a shooting elevation of 45°; with the distance between the naval vessel and the floating body target at 3 nautical miles (5.556 km) and a shooting elevation of 0–3°, the shooting distance stays within 10 kilometers, which meets the shooting requirement under this condition.
Then Matlab is used for simulation to obtain the graph of shooting elevation versus maximum shooting height, as shown in fig. 7. As can be seen from fig. 7, the maximum shooting height increases as the shooting elevation increases; between 0° and 10° the maximum shooting height changes little.
According to the analysis results of the graph of the shooting elevation angle and the shooting distance in fig. 6 and the graph of the shooting elevation angle and the maximum shooting height in fig. 7, Matlab is used for simulation, and corresponding curves of the shooting distance and the shooting height of the projectile at different shooting elevation angles are obtained, and the graphs are shown in fig. 8.
According to the shooting curve diagrams of different shooting elevations in fig. 8 and simulation results, parameters of the maximum height which can be reached by the projectile under different shooting elevations and the height right above the floating body target are obtained, and the parameter table is shown in table 1.
TABLE 1 shot distance and height parameter table
From the shot distances at different shooting elevations, the height parameter table and the curve charts, the dangerous area of naval gun shooting can be obtained visually, and thus the shooting range of the gun in height. As required, the shot distance of the naval gun is 3 nautical miles (5.556 km), and the shooting range can be obtained from the shot distance, the height parameter table in table 1 and the shooting curves for different elevations. The range extends, with the launching ship as base point, up to 1.5 times the maximum shooting height of the naval gun. From the above information, the maximum shooting height is 40 meters, so the minimum safe height in altitude is 1.5 times the maximum shooting height, i.e. 1.5 × 40 = 60 meters.
The minimum safe height of the unmanned aerial vehicle is then calculated. When detecting the target, the unmanned aerial vehicle hovers directly above the floating body target. The trajectory of the shot approximates a parabola; let the height of the shot at its highest point be H and the hovering height of the unmanned aerial vehicle be h, so that h ≥ H is required for safety. With the distance between the naval vessel and the floating body target at 3 nautical miles (5.556 km), the minimum safe shot height for the 76 mm naval gun is 60 meters; to ensure that the unmanned aerial vehicle is not threatened by the projectile, its minimum safe hovering height is therefore determined as 60 meters, directly above the floating body target.
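The Matlab simulation described above is not reproduced in the patent; the following is a minimal vacuum-trajectory sketch of the same calculation (drag ignored, as in the parabola approximation; function names are illustrative). With the stated muzzle velocity of 980 m/s and g = 9.8 m/s², the flat-fire elevation that reaches the 3-nautical-mile target is about 1.6°, and the corresponding trajectory apex is close to the 40 m maximum shooting height quoted in the text.

```python
import math

V0, G = 980.0, 9.8           # muzzle velocity (m/s) and gravity (m/s^2) from the text

def elevation_for_range(distance_m):
    """Vacuum-trajectory elevation (degrees) whose range equals distance_m
    (flat-fire solution of R = V0^2 * sin(2*alpha) / g; drag is ignored)."""
    s = distance_m * G / V0 ** 2
    return math.degrees(0.5 * math.asin(s))

def apex_height(alpha_deg):
    """Maximum trajectory height (m) for elevation alpha_deg:
    H = (V0 * sin(alpha))^2 / (2 g)."""
    a = math.radians(alpha_deg)
    return (V0 * math.sin(a)) ** 2 / (2 * G)

alpha = elevation_for_range(5556.0)          # 3 nautical miles
print(round(alpha, 2), round(apex_height(alpha), 1))  # ≈ 1.63 deg, ≈ 39.4 m
```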
2.2 unmanned aerial vehicle hovering height analysis based on photoelectric load observation range
The unmanned aerial vehicle hovering height based on the photoelectric load observation range is the height at which the shooting effect is satisfactory and the shot image contains all elements required for target detection. Therefore, when determining the working height of the unmanned aerial vehicle, it is not enough to consider only the height that ensures the unmanned aerial vehicle's safety; the working height must also meet the requirements of the shooting task.
The concept of the "judgment interval" is introduced here. The rectangular boundary of the "judgment interval" is the boundary of the rectangular region within which a shell is judged to have fallen into the effective hit area, according to the corresponding shooting result evaluation parameter table, when shooting at floating (reflector) targets and simulated targets.
The range of the 'judgment interval' is determined according to the type of the artillery, the shooting method and the average shooting distance and according to a corresponding table of evaluation parameters of the floating (reflector) target and the simulated target shooting score. Fig. 9 is a schematic diagram of the "determination section".
In fig. 9, 2X is a distance side length value of the rectangular determination section, and 2Z is a direction side length value of the rectangular determination section. In the evaluation parameter table of the shooting performance of the sea floating body target, half values X and Z of the distance and the direction side length of a rectangular judgment area are listed. The table of parameters for evaluating the shooting performance of the marine floating body targets is shown in table 2.
Table 2 evaluation parameter table for target shooting performance of sea floating body
According to the shooting task, the boundary values of the "judgment interval" rectangular frame at a shooting distance of 3 nautical miles (5.556 km) can be obtained from the parameter table: the distance half-length X is 114.43 meters and the direction half-length Z is 32.651 meters.
Considering the shooting visual angle and shooting effect of the camera, the hovering height of the unmanned aerial vehicle is determined according to the boundary of the "judgment interval" rectangular frame. The design scheme of the target detection system is shown in fig. 10. The field angle of the camera is known to be 94° (i.e., ∠ACB = 94°), so the half field angle is ∠ACO = ∠BCO = ½∠ACB = 47°. When the naval gun firing practice assessment is carried out, the effective region of the assessment is the "judgment interval" rectangular frame centered on the floating body target O, and this rectangular frame is inscribed in circle O.
According to the known conditions, the height OC can easily be calculated, which determines the hovering height of the unmanned aerial vehicle. Using the Pythagorean theorem:
c² = a² + b²
where a = X = 114.43 m and b = Z = 32.651 m, so that
c = √(114.43² + 32.651²) ≈ 119 m.
That is, the radius OD is 119 m, and OA = OB = OE = OF = OG = 119 m.
The length of OC can be determined from the field angle of the camera, using the tangent relation:
tan θ = a / b, i.e. b = a / tan θ,
where a = OA = 119 m and θ = ∠ACO = 47°, giving
b = 119 / tan 47° ≈ 111 m.
That is, OC = b ≈ 111 m, so the hovering height of the drone is 111 meters.
If a water column excited by a projectile falling outside the "judgment interval" rectangular frame is to be observed, the visible range of the camera must be expanded. Assuming that the length and width of the visible region are 1.5 times those of the "judgment interval" rectangular frame, the boundary values become 1.5 × X = 171.645 m in distance and 1.5 × Z = 48.98 m in direction. Following the same method used to solve the hovering height over the "judgment interval" rectangular frame, the hovering height in this case is obtained as OC = 166.5 meters.
The limit flying height of the unmanned aerial vehicle is 500 meters. Comprehensively considering the minimum safe height of the unmanned aerial vehicle, the rectangular boundary of the "judgment interval", the field angle range of the camera and the flying height limit of the unmanned aerial vehicle, the hovering height of the unmanned aerial vehicle during working is finally set to 111 meters.
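The hover-height geometry above (Pythagorean theorem plus the tangent relation) can be sketched in a few lines; this is an illustrative calculation using the half-lengths and half field angle stated in the text, with a hypothetical function name:

```python
import math

def hover_height(half_x, half_z, half_fov_deg=47.0):
    """Camera height at which the circumscribed circle of the 2X-by-2Z
    'judgment interval' rectangle just fills the field of view:
    OC = sqrt(X^2 + Z^2) / tan(theta), theta being the half field angle."""
    radius = math.hypot(half_x, half_z)                    # OA = sqrt(X^2 + Z^2)
    return radius / math.tan(math.radians(half_fov_deg))   # OC = OA / tan(theta)

print(round(hover_height(114.43, 32.651)))                 # ≈ 111 m
print(hover_height(1.5 * 114.43, 1.5 * 32.651))            # ≈ 166.5 m, expanded case
```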
3. Target extraction positioning and water column measurement
In order to realize the shot miss amount evaluation of the naval cannon based on the video image, target image information and shot water column image information need to be extracted from shot water column images. The target of the ship-based cannon shooting at sea is a red spherical floating target with the diameter of three meters, and the color of the target is obviously contrasted with that of the ocean background in the image, so that the target can be extracted and positioned by directly segmenting an interested color area by using an RGB image of an ejected water column acquired by an unmanned aerial vehicle.
Before the bouncing water column can be extracted and positioned, the input water column image must be preprocessed to obtain clear target information. This section first explains the method of extracting and positioning the target and the proportional relation between image distance and actual distance, and preprocesses the bouncing water column image to obtain clearer water column information. The bouncing water column is then extracted by a threshold segmentation method different from the one used for the target, realizing automatic extraction and positioning of the bouncing water column under both simple and complex backgrounds.
3.1 extraction positioning and data preprocessing of targets
The image segmentation is an important step of image processing, which directly determines the quality of subsequent positioning accuracy, and firstly, the target needs to be extracted, and the target is segmented from the image and the background. Because the naval gun shooting uses the red floating target, the target has obvious color difference with the ocean background, and the target can be extracted by directly segmenting the region in the target color range in the RGB image according to the consistency of the target color and the larger contrast of the target and the background color. Therefore, the invention adopts a segmentation method based on color characteristics to segment the image.
The judgment rule adopted by the invention for extracting the target is to judge that a pixel is red when its R component in R, G, B is obviously not less than the other components, the strictness of this condition being controlled by a threshold value. The most critical point of this region-based segmentation is that the RGB color judgment has no strict boundary: when the threshold is set large, the accepted range of target colors narrows and the tolerance to color distortion caused by shooting decreases; when the threshold is set small, the accepted range widens, so that more of the target may be segmented but regions unrelated to the target may also be included.
Then, the position of the target needs to be further determined, and the position of the target is determined so that the relative position relationship between the impact point and the target can be determined, and the dispersion error is calculated. The target information obtained by image segmentation is an approximate circle, and the positioning of the target is to obtain the circle center and the radius of the target.
Circle detection is an important application in digital image processing, and has many methods for extracting the circle center of a circle or a circle-like object, such as a three-point circle method, a Hough transformation method, a curve fitting method and the like.
(1) The three-point circle method is an outer region boundary representation method, and the algorithm only focuses on the boundary information of the target circular region. In practical application, the algorithm has the disadvantages that any three points on the boundary of the target circular area are selected, and the random selection can cause strong calculation randomness and greatly influence the calculation result. But the algorithm has the advantages that the algorithm can process incomplete and partially defective target circular areas, is relatively simple, is easy to code and has small calculation amount.
(2) The Hough transform method is a kind of cluster classification idea, which carries out clustering operation on the image element information with a certain relation in the image space and finds the parameter space accumulation corresponding points which can link the image elements by using a certain analytic form. The algorithm has the advantages of strong noise resistance and high operation precision. But its disadvantages are: the amount of data that needs to be stored is large, which results in a slow algorithm.
(3) The curve fitting method is also an outer region boundary representation method; it usually finds a best-fit point set based on the least square method and solves iteratively. The algorithm's advantages are high precision, consideration of every boundary pixel of the target circular area, and improvement of the overly flexible three-point selection and poorly controlled randomness of the three-point circle method. Its disadvantage is that when the target circular area is large and has many boundary pixels, the computation becomes heavy, which is unfavorable to quickly locating the circle center and affects the execution efficiency of the algorithm.
Comparing the above methods for extracting the center of circles or circle-like objects, and in order to meet the system's requirements on the speed and accuracy of target positioning, the invention adopts a least-squares curve fitting algorithm for detecting the center of the target.
After finding a section of circular arc, calculating corresponding parameters through the equation of the circle, wherein the equation of the circle is as formula 3.1.
(x − x_c)² + (y − y_c)² = r² (formula 3.1)
where (x_c, y_c) is the circle center. Expansion gives formula 3.2:
x² + y² + ax + by + c = 0 (formula 3.2)
where a = −2x_c, b = −2y_c, c = x_c² + y_c² − r². The parameters a, b and c are calculated by the least square method, and the radius of the circle is then obtained:
r = ½ · √(a² + b² − 4c)
and fitting the target according to the target circle center coordinate and the radius obtained by the least square method, wherein the fitting condition of the obtained radius and the radius of the target is better.
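One way to realize this least-squares circle fit (a sketch, not the patent's own implementation: it solves the normal equations of formula 3.2 directly with a small Gaussian elimination, and the function name is hypothetical):

```python
import math

def fit_circle(points):
    """Least-squares fit of x^2 + y^2 + a*x + b*y + c = 0 (formula 3.2).

    Solves the normal equations A^T A p = A^T d, where each row of A is
    [x, y, 1] and d = -(x^2 + y^2), then recovers center and radius via
    x_c = -a/2, y_c = -b/2, r = 0.5 * sqrt(a^2 + b^2 - 4c).
    """
    m = [[0.0] * 4 for _ in range(3)]              # augmented 3x4 system
    for x, y in points:
        row, d = (x, y, 1.0), -(x * x + y * y)
        for i in range(3):
            for j in range(3):
                m[i][j] += row[i] * row[j]
            m[i][3] += row[i] * d
    # Gauss-Jordan elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r_: abs(m[r_][col]))
        m[col], m[piv] = m[piv], m[col]
        for r_ in range(3):
            if r_ != col:
                f = m[r_][col] / m[col][col]
                m[r_] = [v - f * w for v, w in zip(m[r_], m[col])]
    a, b, c = (m[i][3] / m[i][i] for i in range(3))
    xc, yc = -a / 2.0, -b / 2.0
    radius = 0.5 * math.sqrt(a * a + b * b - 4.0 * c)
    return (xc, yc), radius
```

For points lying exactly on a circle the fit reproduces that circle; with noisy boundary pixels it returns the least-squares compromise, which is why it improves on the three-point method's randomness.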
After the target is extracted and positioned, image preprocessing must be performed on the acquired bouncing water column image. Image preprocessing is a basic stage of image processing: the bouncing water column video image is obtained through the acquisition system, and in order to measure the miss distance of the naval gun the target water column must be extracted. Several difficulties in the image must be overcome, especially the small difference between the bouncing water column and the background, so the bouncing water column image is preprocessed to make the water column target easier to extract.
The image preprocessing process mainly comprises image graying, noise elimination by median filtering, image contrast enhancement and the like, and is prepared for identifying the bouncing water column.
3.2 image distance versus actual distance
The miss distance of the cannonball is evaluated, and the bullet impact deviation amount needs to be calculated according to the actual distance. According to the working requirement of the system, the optical measurement equipment shoots vertically downwards from high altitude to obtain an image which comprises the whole offshore target field area and the water column bounced by the offshore buoy. Based on the principles of perspective geometry and photogrammetry, the relationship between the pixels in the shot image and the actual distance is deduced and calculated.
The calculation follows the imaging principle of the pinhole camera: the camera's image of the target range area is considered proportional to the scene, and the error introduced into the measured distance by image distortion is not considered, because lens distortion is concentrated near the image edges and has comparatively little influence on the bouncing water column image.
As shown in fig. 11, O is the lens center, a is the object distance, which is the distance from the optical center O to the object plane, and f is the lens focal length, which is the distance from the optical center O to the image plane.
The known actual target is a sphere with a diameter of three meters, and can be used as a reference object for non-contact distance measurement, and the actual length represented by each pixel, i.e. the size of the proportional relation e, can be obtained according to the number of pixels occupied by the target in the image.
The distance between the bouncing water column and the target is calculated by solving the Euclidean distance between the two centers in the image and converting it with the proportional relation between pixel size and actual object length. The relationship between image pixels and actual distance is therefore
actual distance = e × pixel distance, where e = actual size of the reference object / its size in pixels.
The center coordinates and radius of the target have been determined above; the target radius is 14.1032 pixels while the actual radius of the target is 1.5 meters, so e = 1.5/14.1032 ≈ 0.1064 meters per pixel.
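The pixel-to-meter conversion above reduces to two lines of code; this sketch uses the values stated in the text (1.5 m actual radius, 14.1032-pixel image radius) and a hypothetical helper name:

```python
import math

E = 1.5 / 14.1032            # meters per pixel, from the 1.5 m target radius

def pixel_distance_to_meters(p1, p2, e=E):
    """Euclidean pixel distance between two image points, scaled to meters
    by the proportional relation e."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1]) * e

print(round(E, 4))           # 0.1064
```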
3.3 automatic extraction and positioning of the water column
A simple background is an environment more favorable to water column identification: the ideal condition is that the ocean background has only one gray level in the bouncing water column image. If a sky background is present, its gray level is close to that of the water column, making target segmentation more difficult. Under good meteorological conditions with low wind speed, the bouncing water column basically keeps the shape of a cylinder, so the impact point position can be represented by the center of the water column; with good sea conditions and low swell, the interference with target extraction is small. The bouncing water column image obtained in this case is called a simple-background image, and its digital image processing is efficient with small error. The design of the automatic extraction and positioning process of the bouncing water column under a simple background is shown in fig. 12.
When the cannon shoots the sea, the cannonball generates water column after entering the water, so the shooting deviation can be determined according to the position of the shot water column. The extraction of the bouncing water column mainly segments the target from the ocean background, and then filters the influence of the sea spray, and due to the contrast difference between the water column and the background, the target and the background are segmented by appointing a gray threshold.
Assume that the image f(x, y) consists of some bright objects on a dark background, so that the gray levels of object and background pixels fall into two main modes. One common method of extracting the objects from the background is to select a threshold T separating the two modes: any point (x, y) satisfying f(x, y) > T is called an object point, and the other points are called background points (conversely, the same holds for a dark object on a light background). The (binary) image g(x, y) after thresholding is defined by equation 4.1.
g(x, y) = 1 if f(x, y) > T, and g(x, y) = 0 if f(x, y) ≤ T (equation 4.1)
Therefore, the most difficult point of thresholding is how to determine the threshold T. Threshold calculation methods are generally classified into two types: a global threshold and a basic adaptive threshold.
The global threshold is the most commonly used threshold calculation method, and particularly when the gray level histogram distribution of the image shows double peaks, the global threshold can obviously divide the target and the background components to obtain a more ideal image segmentation effect. The basic adaptive threshold is a comparative basic image adaptive segmentation method, which generally performs threshold segmentation based on the characteristics of the gray level change of the image pixel and the neighborhood of the image pixel, and further realizes the binaryzation of the gray level image. The method fully considers the characteristics of each pixel neighborhood, so the boundary of the target and the background can be better highlighted generally.
The background against which the water column rises is the sea, and the water column is white. Under ideal conditions the contrast between the bouncing water column and the background is relatively high, and they can be separated by the maximum between-class variance (Otsu) method. However, because of environmental influences during image acquisition, such as reflection from the ocean surface, the shooting angle of the optical measurement platform, and the density of the water spray of the column, the gray level histogram of the water column is unimodal; fig. 13 shows such a single-peak gray level histogram of the bouncing water column.
Given the characteristics of the unimodal gray-level histogram, the adaptive threshold calculation adopted by the invention proceeds as follows:
(1) Initial value. Compute the minimum gray value Tmin and the maximum gray value Tmax of the water-column image, and take their mean as the initial threshold T:
T = (Tmin + Tmax) / 2
(2) Segmentation. Segment the image with the threshold T, obtaining two pixel sets:
G1 = {f(x, y) ≥ T}, G2 = {f(x, y) < T} (equation 4.6)
(3) Averaging. Compute the gray-level means μ1 and μ2 of the pixel sets G1 and G2:
μ1 = (1/N1) Σ(x,y)∈G1 f(x, y)
μ2 = (1/N2) Σ(x,y)∈G2 f(x, y)
where N1 and N2 are the numbers of pixels in G1 and G2.
(4) Iteration. Compute a new threshold from μ1 and μ2:
T = (μ1 + μ2) / 2
Repeat steps (2), (3) and (4) until the threshold T converges, i.e. until the change between successive iterations falls within a set range.
The threshold obtained by this adaptive, iterative method is then used to segment the image of the impinging water column.
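The four steps above can be sketched in a few lines of Python (illustrative only; the patent's own software is MATLAB-based, and the function name and convergence tolerance here are assumptions):

```python
import numpy as np

def adaptive_threshold(img, tol=0.5, max_iter=100):
    """Iterative adaptive threshold following steps (1)-(4) above.

    img: 2-D array of gray levels. Returns the converged threshold T.
    tol and max_iter are illustrative convergence parameters.
    """
    img = np.asarray(img, dtype=np.float64)
    # (1) Initial value: mean of the minimum and maximum gray levels.
    t = (img.min() + img.max()) / 2.0
    for _ in range(max_iter):
        # (2) Segment into the two pixel sets G1 (>= T) and G2 (< T).
        g1 = img[img >= t]
        g2 = img[img < t]
        # (3) Gray-level means of the two sets (guard against empty sets).
        mu1 = g1.mean() if g1.size else t
        mu2 = g2.mean() if g2.size else t
        # (4) Iterate: the new threshold is the mean of mu1 and mu2.
        t_new = (mu1 + mu2) / 2.0
        if abs(t_new - t) < tol:  # converged to within tol
            return t_new
        t = t_new
    return t
```

Binarization then follows equation 4.1 with the returned T; for a strongly bimodal image the loop typically converges within one or two iterations.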
After the image is binarized by this method, the impinging water column is well separated from the background, but some interference noise remains: false small target points formed where background regions have gray levels close to that of the water column, and noise inside the water-column region itself caused by its uneven gray level. This noise affects the extraction of the water column and the location of the impact point in the next step, so the binary image is filtered with mathematical morphological operations.
Mathematical morphology differs from the traditional view of numerical modeling and analysis mainly in that it analyzes and probes an image with structuring elements. It consists of a series of algebraic operations built on four basic ones: dilation, erosion, opening, and closing.
Dilation merges into the object all background points that touch it and expands the object boundary outward; it can fill gaps inside the object. Erosion eliminates boundary points and shrinks the object boundary inward; it can remove small, meaningless objects from the image. The binary image is filtered in two stages. The first stage closes the binary image to eliminate black holes inside the impinging water column, making the water-column region more complete. The second stage opens the image to remove small interference noise in the background, remove burrs on the water-column edge, and smooth that edge.
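A minimal sketch of this two-stage filtering, assuming SciPy's `ndimage` morphology routines and an illustrative 3 × 3 structuring element (the patent does not specify the element's size or shape):

```python
import numpy as np
from scipy import ndimage

def clean_binary(mask, size=3):
    """Two-stage morphological filtering of a binary water-column mask.

    Stage 1: closing fills dark holes inside the water-column region.
    Stage 2: opening removes small background noise and edge burrs.
    The structuring-element size is an illustrative assumption.
    """
    se = np.ones((size, size), dtype=bool)  # square structuring element
    closed = ndimage.binary_closing(mask, structure=se)
    opened = ndimage.binary_opening(closed, structure=se)
    return opened
```

The order matters: closing first makes the water-column region solid before opening strips away the small isolated noise objects.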
After obtaining the image of the impinging water column, the coordinates of its center must be found. The preceding processing has essentially eliminated the interference noise, so the water column can now be located. Because the water column is affected by gravity and wind, the image captured from high altitude is not necessarily a regular circle, so the centroid method is used to find the center of gravity of the irregular shape and take it as the center of the water column. The centroid method represents the position of the water column in the image by the centroid abscissa and ordinate. For an image f(x, y), let xk be the x coordinate of the k-th pixel, yk its y coordinate, and f(xk, yk) its gray value; with n pixels in total, the centroid (x̄, ȳ) is given by
x̄ = Σk xk f(xk, yk) / Σk f(xk, yk)
ȳ = Σk yk f(xk, yk) / Σk f(xk, yk)
Because the gray values of all pixels in the image take part in this calculation, every pixel in regions unrelated to the target must be processed, and for an image with an irregular contour the outermost edge pixels have a large influence on the computed centroid. The centroid algorithm can therefore be improved: first find the row and column positions of the pixels on the target's edge, then take their arithmetic mean as the centroid of the target image.
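The two variants can be compared in a short sketch (illustrative Python; the 4-neighbour edge criterion and the function names are assumptions, not the patent's exact procedure):

```python
import numpy as np

def weighted_centroid(img):
    """Gray-weighted centroid over the whole image (the basic method)."""
    img = np.asarray(img, dtype=np.float64)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

def edge_centroid(mask):
    """Improved centroid: arithmetic mean of the target's edge pixels.

    mask: binary target image. An edge pixel is taken here to be a
    foreground pixel with at least one background 4-neighbour.
    """
    mask = np.asarray(mask, dtype=bool)
    padded = np.pad(mask, 1, constant_values=False)
    # A pixel is interior when all four direct neighbours are foreground.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    edge = mask & ~interior
    ys, xs = np.nonzero(edge)
    return xs.mean(), ys.mean()
```

For a symmetric target both methods agree; the edge-based version simply averages far fewer pixels and ignores gray levels in regions unrelated to the target.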
An edge is a set of pixels at which the gray level changes in a step or roof-like manner relative to the surrounding pixels. Edge detection measures, detects, and locates these gray-level changes; that is, it segments the image using gradients at the discontinuities. The derivative is usually approximated by gray-value differences over a small neighborhood of the image. A 3 × 3 neighborhood is labeled as follows:
Z1 Z2 Z3
Z4 Z5 Z6
Z7 Z8 Z9
Simple edge detection convolves the original image with a first-order differential operator such as the Roberts, Sobel, or Prewitt operator.
(1) Roberts operator. The Roberts operator is the simplest; it finds edges with a local difference operation. Its gradient components are approximated by
gx = Z9 − Z5 (equation 4.11)
gy = Z8 − Z6 (equation 4.12)
with convolution kernels
gx: [−1 0; 0 1]    gy: [0 −1; 1 0]
(2) Sobel operator. The Sobel operator approximates the two gradient components by
gx = (Z7 + 2Z8 + Z9) − (Z1 + 2Z2 + Z3) (equation 4.13)
gy = (Z3 + 2Z6 + Z9) − (Z1 + 2Z4 + Z7) (equation 4.14)
with convolution kernels
gx: [−1 −2 −1; 0 0 0; 1 2 1]    gy: [−1 0 1; −2 0 2; −1 0 1]
(3) Prewitt operator. The Prewitt operator is computationally simpler than the Sobel operator, but is slightly more sensitive to noise. Its gradient components are approximated by
gx = (Z7 + Z8 + Z9) − (Z1 + Z2 + Z3) (equation 4.15)
gy = (Z3 + Z6 + Z9) − (Z1 + Z4 + Z7) (equation 4.16)
with convolution kernels
gx: [−1 −1 −1; 0 0 0; 1 1 1]    gy: [−1 0 1; −1 0 1; −1 0 1]
Edge detection extracts the edge information of the target, from which the centroid position is then computed.
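The operator pairs above can be exercised with a direct sliding-window implementation (an illustrative sketch; a real system would use an optimized convolution routine, and the kernel table mirrors the equations in the text):

```python
import numpy as np

# First-order edge-detection kernel pairs (gx, gy) from the text.
KERNELS = {
    "roberts": (np.array([[-1, 0], [0, 1]]), np.array([[0, -1], [1, 0]])),
    "sobel":   (np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]]),
                np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])),
    "prewitt": (np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]]),
                np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]])),
}

def gradient_magnitude(img, operator="sobel"):
    """|g| = sqrt(gx^2 + gy^2) via direct 'valid' correlation (no padding)."""
    img = np.asarray(img, dtype=np.float64)
    kx, ky = KERNELS[operator]
    k = kx.shape[0]
    h = img.shape[0] - k + 1
    w = img.shape[1] - k + 1
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):           # correlate each window with both kernels
        for j in range(w):
            win = img[i:i + k, j:j + k]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)
```

On a vertical step edge the response is carried entirely by gy, and the magnitude peaks on the columns straddling the step while flat regions stay at zero.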
4. Miss distance evaluation
4.1 Calculation of the dispersion error and error analysis
The miss distance is also called the impact deviation. The firing process of a weapon system is influenced by many incidental factors that continuously produce errors, so the impact points deviate from the target. In evaluating the miss distance, the dispersion error, i.e. the probable deviation or mean square error of the impact points relative to the dispersion center, characterizes the firing density. When firing at a target on a horizontal plane, projectile dispersion is characterized by the range probable deviation Ex and the direction probable deviation Ez, or by the relative probable deviations Ex/x and Ez/x. A smaller deviation of the impact (burst) points from the mean impact point, the dispersion center, indicates a smaller projectile spread and thus a higher firing density.
Before the dispersion error is computed, the possible mismatch between the image coordinate system and the projectile dispersion coordinate system, and the resulting coordinate transformation, must be discussed.
The image information acquired by the CCD camera becomes a digital image after analog-to-digital conversion. Digital image coordinates are written (u, v) and give the image position in pixels. Another coordinate system (x, y) is needed to express the position of a pixel in physical units; its origin is a point O1 in the image, with the x axis parallel to the u axis and the y axis parallel to the v axis.
The origin O1 of the (x, y) system generally lies at the center of the image, at the intersection of the image plane and the camera's optical axis. Suppose O1 lies at (u0, v0) in the (u, v) system, and each pixel has physical size Δx along the x axis and Δy along the y axis. Then, for any pixel in the image, the two coordinate systems are related by equations 4.17 and 4.18.
u = x / Δx + u0 (equation 4.17)
v = y / Δy + v0 (equation 4.18)
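Inverting equations 4.17 and 4.18 gives the physical coordinates of a pixel; a one-line sketch (the function and parameter names are illustrative):

```python
def pixel_to_physical(u, v, u0, v0, dx, dy):
    """Convert pixel coordinates (u, v) to physical image-plane
    coordinates (x, y), inverting equations 4.17 and 4.18:
        u = x/dx + u0,  v = y/dy + v0
    dx, dy: physical size of one pixel along the x and y axes."""
    return (u - u0) * dx, (v - v0) * dy
```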
In the invention, an image coordinate system O1-xy must be established with the target as the origin, the x axis along the u axis, and the y axis along the v axis.
The formula for calculating the coordinates of the water column is derived below.
As shown in fig. 15, in the image coordinate system O1-xy the center of the impinging water column m has coordinates (xm, ym). In the world coordinate system of the water columns, O′-X′Z′ (whose X′ and Z′ axes are not necessarily parallel to the range and direction axes of the projectile dispersion), the center of the water column M has coordinates (X′M, Z′M), and m is the image of M. Let e be the scale factor between actual distance and distance in the image; then
X′M = xm · e (equation 4.19)
Z′M = ym · e (equation 4.20)
Because the target is far from the firing ship and the X axis of the projectile dispersion is hard to determine, the photographed image coordinate system may not match the dispersion coordinate system; that is, the water-column coordinate system O′-X′Z′ is rotated relative to the dispersion coordinate system O-XZ, as shown in fig. 16. (XM, ZM) denotes the coordinates of a water column in the dispersion coordinate system.
The rectangular-coordinate rotation formulas give
XM = X′M · cos Θ + Z′M · sin Θ (equation 4.21)
ZM = −X′M · sin Θ + Z′M · cos Θ (equation 4.22)
Here Θ is the rotation angle of the image coordinate system relative to the world coordinate system it reflects. It can be calculated from the heading of the unmanned aerial vehicle, the heading of the firing ship, and the firing broadside angle, provided that the x axis of the image coordinate system points in the same direction as the UAV heading. Θ is given by equation 4.23:
Θ = TC2 − TC1 − Q (equation 4.23)
where TC1 and TC2 are the headings of the firing ship and the UAV respectively, and Q is the firing broadside angle; these parameters are available from the UAV and the firing ship.
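Equations 4.19 through 4.23 chain together into a single mapping from image coordinates to the dispersion frame; a hedged sketch (the function name and the use of degrees for the headings are assumptions):

```python
import math

def to_dispersion_frame(xm, ym, e, tc_ship, tc_uav, q):
    """Map a water-column image centre (xm, ym) into the projectile
    dispersion coordinate system (equations 4.19-4.23).

    e: scale factor (actual distance per on-image distance).
    tc_ship, tc_uav: headings of the firing ship and the UAV, in degrees.
    q: firing broadside angle, in degrees.
    """
    # Equations 4.19-4.20: scale on-image coordinates to world distances.
    x_w = xm * e
    z_w = ym * e
    # Equation 4.23: rotation of the image frame w.r.t. the dispersion frame.
    theta = math.radians(tc_uav - tc_ship - q)
    # Equations 4.21-4.22: rotate into the dispersion frame.
    x_d = x_w * math.cos(theta) + z_w * math.sin(theta)
    z_d = -x_w * math.sin(theta) + z_w * math.cos(theta)
    return x_d, z_d
```

With Θ = 0 (image frame already aligned with the dispersion frame), the mapping reduces to the pure scaling of equations 4.19 and 4.20.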
Extracting and locating the target and the impinging water columns yields their positions in the image, from which an impact dispersion coordinate system with the target as origin is established. The dispersion error is then calculated from the water-column coordinates derived in the previous section, using equations 4.24 and 4.25.
EX = 0.6745 · √( Σi (Xi − X̄)² / (n − 1) ) (equation 4.24)
EZ = 0.6745 · √( Σi (Zi − Z̄)² / (n − 1) ) (equation 4.25)
where X̄ = (1/n) Σi Xi and Z̄ = (1/n) Σi Zi. (Xi, Zi) are the actual coordinates of each impinging water column; (X̄, Z̄) are the coordinates of the dispersion center, i.e. the mean of the actual coordinates of all the water columns; 0.6745 is the probable-error coefficient; and n is the total number of rounds.
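A sketch of the dispersion-error computation, assuming the probable-error form E = 0.6745·σ with a sample (n − 1) denominator (the original formula images are not legible, so that choice is an assumption; the function name is illustrative):

```python
import math

def dispersion_errors(points):
    """Probable-error dispersion per axis (equations 4.24-4.25).

    points: list of (X, Z) water-column coordinates in the dispersion
    frame, with the target at the origin. Returns (EX, EZ).
    """
    n = len(points)
    xs = [p[0] for p in points]
    zs = [p[1] for p in points]
    x_bar = sum(xs) / n          # dispersion-centre coordinates
    z_bar = sum(zs) / n
    # 0.6745 converts a standard deviation into a probable error.
    ex = 0.6745 * math.sqrt(sum((x - x_bar) ** 2 for x in xs) / (n - 1))
    ez = 0.6745 * math.sqrt(sum((z - z_bar) ** 2 for z in zs) / (n - 1))
    return ex, ez
```

The returned EX and EZ are then compared against the firing-table standard value K to judge the shooting result.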
According to the firing-table evaluation standard for naval gun shooting, the impact-point dispersion error has a standard value K; comparing EX and EZ with K determines the result of this shooting.
This completes the calculation of the dispersion error. Based on error theory and the overall system design process, the main sources of error are analyzed as follows:
(1) Preparation stage: the x axis of the electro-optical payload image is neither parallel to nor fixed with respect to the UAV heading, and an unreasonable flight height leads to a poor shooting angle.
(2) Shooting stage: because the UAV is far from the control platform and the operating electromagnetic environment is complex, control of the electro-optical payload has a certain deviation during operation. The payload itself also introduces errors: limited camera precision, lens distortion from the manufacturing process, and vibration excited by wind and transmitted through the pod mount to the optical system all degrade imaging quality. In addition, the image obtained by the payload is not at a known scale to the actual sea-surface plane, which introduces errors when computing the water-column dispersion distances.
(3) Image transmission stage: the acquired image is affected by the performance of the image transmission system during wireless transmission, and the quality of the transmitted image introduces errors into the subsequent image processing.
(4) Image processing stage: the quality of the preprocessing affects the extraction accuracy of the target and the impinging water columns, and the positioning accuracy of both is directly related to the error in the water-column dispersion distances.
(5) Calculation stage: the video image is not at a known scale to the actual sea area, and the scale conversion from on-image distance to actual distance introduces errors. Furthermore, the choice of significant digits and the rounding of numbers when computing the water-column dispersion distances produce errors in the result.
4.2 Design of shooting miss distance evaluation software based on the MATLAB GUI
The rapid modeling and powerful computation of MATLAB are used for after-action analysis of the target detection video, specifically: reading the naval gun shooting video, acquiring its information, obtaining the image sequence, playing and pausing it, and interactively measuring and computing impact-point distances. The Graphical User Interface (GUI) facility of MATLAB provides a visual, object-oriented operation interface implementing the functions needed for miss distance evaluation: selecting, processing, and displaying video images and frame-by-frame pictures, and computing the dispersion error.
The system stores the acquired video of the naval gun shooting as video frames. To determine the impact points, it selects frames containing water-column images, processes them, and locates the impact points; once all impact points are located, it computes the dispersion error and judges the shooting score.
The specific steps and operation steps are shown in fig. 3.
To remind the user to operate according to the specification and reduce erroneous operations, a system-introduction section was added during software design; the operating procedure can be viewed from the software interface. If the operating steps deviate from the designed flow, the system issues a prompt.
The video-image-based miss distance evaluation system is implemented on the GUI platform provided by MATLAB. The platform offers many controls for interface design, so the required operation interface can be built conveniently and quickly in a friendly, interactive manner. Following the GUI implementation method, the work divides into three steps.
(1) Create a blank GUI.
(2) Lay out the system modules and design their functions.
After the interface design required by the system is finished and saved, the MATLAB GUI generates two files related to it.
The .fig file: it contains the GUI's figure window and a complete description of all child objects, including user controls and axes, together with the property values of all objects.
The .m file: it contains all the code needed to run the GUI; it controls the GUI and determines its responses to user actions.
(3) Write the callback functions.
The user writes the callback functions required by the GUI components in the M-file framework generated by the GUI designer. The M file contains a series of subfunctions: the main function, the Opening function, the Output function, and the callbacks. Note that the main function must not be modified, or GUI initialization will fail.
The system is tested following the operating steps: run the program, enter the miss distance evaluation interface, and input the known rotation angle between the image coordinate system and the projectile dispersion coordinate system (ideally 0). The operating steps can be viewed by clicking "system introduction".
With the GUI of the miss distance evaluation system complete, load the video information; after the video information and image sequence are acquired, the frame-by-frame images of the video can be viewed in the current path.
Select and load a clear image of the impinging water column for processing, and click the "water column extraction" button; the identified target is shown in the image display area. Select the position of the water column with the mouse cursor; a mark appears in the display area, and the direction and range deviation of the impact point relative to the target are computed and displayed. Figs. 5-6 show one such extraction of an impinging water column.
The image data used in the GUI demonstration were taken from video acquired by the target ship and reflect the range dispersion of the rounds. The result, 61.99 m, is close to the data obtained by traditional measurement and meets the requirement for measuring the miss distance of naval gun shooting at sea.
Following the same procedure, for each round in turn select the picture that best shows the impinging water column and extract it, until the information of all water columns has been obtained; finally, click "dispersion error calculation" to obtain the dispersion error of the group of rounds and determine the shooting score from these data.
5. Evaluation of the shooting miss distance plays an indispensable role in naval gun weapon performance qualification and combat training. The invention first introduces the composition and working principle of the UAV target detection system and its task area. Its main feature is completing the miss distance evaluation of naval gun shooting from video images: the automated UAV target detection system acquires, from a fixed point at altitude, video of the relative positions of the impinging water columns and the target; the video is transmitted to the digital processing module; digital image processing extracts and locates the target and the water columns; the water-column coordinates are computed; the dispersion error is used to evaluate the shooting score; and finally the GUI of the miss distance evaluation system is implemented.
In the description of the present invention, "a plurality" means two or more unless otherwise specified; the terms "upper", "lower", "left", "right", "inner", "outer", "front", "rear", "head", "tail", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are only for convenience in describing and simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, should not be construed as limiting the invention. Furthermore, the terms "first," "second," "third," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
It should be noted that the embodiments of the present invention can be realized by hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the apparatus and methods described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided on a carrier medium such as a disk, CD-or DVD-ROM, programmable memory such as read only memory (firmware), or a data carrier such as an optical or electronic signal carrier, for example. The apparatus and its modules of the present invention may be implemented by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., or by software executed by various types of processors, or by a combination of hardware circuits and software, e.g., firmware.
The above description is only for the purpose of illustrating the present invention and the appended claims are not to be construed as limiting the scope of the invention, which is intended to cover all modifications, equivalents and improvements that are within the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An unmanned aerial vehicle target detection system, characterized in that the unmanned aerial vehicle target detection system comprises:
the system comprises an unmanned aerial vehicle system, a photoelectric load system, a wireless image transmission system, a wireless remote control system and a target detection comprehensive task platform;
the unmanned aerial vehicle system comprises a four-rotor unmanned aerial vehicle body, a propulsion device, a flight control device and a power supply device; the device is used for carrying the photoelectric load system to an observation position and hovering at a fixed point;
the photoelectric load system is carried on the four-rotor unmanned aerial vehicle body; the device comprises a CCD camera, an infrared detection device and a nacelle; the system is used for executing a fixed-point observation task and carrying out fixed observation on a target range area;
the wireless image transmission system is used for transmitting the observed video images by adopting the HHLM type transmission module;
the wireless remote control system is used for controlling the unmanned aerial vehicle system and the photoelectric load system;
and the target detection comprehensive task platform is used for receiving, storing, forwarding and processing the video image.
2. The drone borescope system of claim 1, wherein the wireless remote control system comprises:
the receiving module is used for receiving the control instruction;
the decoding module is used for decoding the control instruction by using a decoder;
and the control module is used for controlling the unmanned aerial vehicle system and the photoelectric load system based on the control instruction obtained by decoding.
3. An unmanned aerial vehicle target detection method applied to the unmanned aerial vehicle target detection system according to any one of claims 1-2, wherein the unmanned aerial vehicle target detection method comprises the following steps:
acquiring a video image of the water columns raised by naval gun rounds with the electro-optical payload, and analyzing and processing it by acquiring the video information, obtaining the image sequence, playing the video, selecting images frame by frame, inputting the coordinate-system rotation angle, and extracting the impinging water columns, so as to obtain the hit and miss results of the naval gun shooting and a shooting evaluation result.
4. The drone borescope method of claim 3, wherein the drone borescope method comprises the steps of:
generating a control instruction, and controlling the quad-rotor unmanned aerial vehicle carrying the photoelectric load to fly to a safe height above a target and then hover based on the generated control instruction; meanwhile, controlling the unmanned aerial vehicle to adjust the shooting attitude of the photoelectric load based on the generated control instruction;
controlling the electro-optical payload, based on the control instruction, to record the shooting of the naval gun in the offshore target area, continuously acquiring video images of the water columns raised by the rounds entering the water, and obtaining the relative relation between the water columns and the target in the target area, namely the dispersion of the naval gun shooting;
converting the obtained video image into an electric signal, and converting the electric signal into a digital signal after sampling, quantizing and encoding by using an analog/digital converter;
step four, preprocessing the converted image to obtain clear and useful image information; and calculating the relative position of the shot water column and the floating target based on the obtained clear and useful image, calculating to obtain a shot scattering error, and evaluating the shot miss amount of the naval gun.
5. The drone target detection method of claim 4, wherein the safe height calculation method comprises:
firstly, analyzing the hovering height of the unmanned aerial vehicle based on the shot-firing height of the ship cannon:
H≥h;
wherein H represents the hovering height of the drone; h represents the height of the projectile when it reaches the highest point; the height h of the projectile when reaching the highest point is determined by the relationship among the firing angle, the firing distance and the firing height of the naval gun;
secondly, analyzing the hovering height of the unmanned aerial vehicle based on the observation range of the electro-optical payload: determining the range of the judgment interval according to the gun type, firing method, and average firing range from the score-evaluation parameter table for firing at floating and simulated targets; and determining the hovering height of the unmanned aerial vehicle according to the rectangular boundary of the judgment interval, i.e. the rectangular boundary within which rounds count as effective hits according to that table;
and finally, determining the safety height of the unmanned aerial vehicle based on the hovering height analysis result of the unmanned aerial vehicle launched by the ship cannon, the judgment interval rectangular boundary limit, the field angle range of the camera and the flight height limit of the unmanned aerial vehicle.
6. The unmanned aerial vehicle target detection method of claim 4, wherein the preprocessing of the converted image to obtain clear and useful image information comprises:
(1) segmenting the converted image by adopting a segmentation method based on color characteristics, and identifying and extracting a target region;
(2) performing curve fitting on the basis of the target information obtained after segmentation by adopting a least square method to determine the circle center and the radius of the target for positioning the target;
(3) and carrying out graying, median filtering and contrast enhancement on the extracted and positioned target image.
7. The drone target detection method of claim 4, wherein calculating the relative position of the bouncing water column and the floating target based on the obtained clear and useful image comprises:
calculating a threshold value through a self-adaption and iteration method to segment the image of the impinging water column to obtain the image of the impinging water column with the background removed; carrying out binarization processing on the obtained image of the bouncing water column after the background is removed; meanwhile, the positioning of the bouncing water column is carried out by utilizing an improved centroid method.
8. The drone boresight method of claim 7, wherein the segmenting the ballistic water column image by calculating the threshold values by an adaptive and iterative method comprises:
1.1) counting the minimum gray value Tmin and the maximum gray value Tmax of the water-column image, and calculating their mean as the initial threshold T:
T = (Tmin + Tmax) / 2;
1.2) segmenting the image according to the threshold value T to obtain two pixel sets which are respectively:
G1 = {f(x, y) ≥ T}, G2 = {f(x, y) < T};
1.3) calculating the gray-level means μ1 and μ2 of the pixel sets G1 and G2:
μ1 = (1/N1) Σ(x,y)∈G1 f(x, y), μ2 = (1/N2) Σ(x,y)∈G2 f(x, y),
where N1 and N2 are the numbers of pixels in G1 and G2;
1.4) calculating a new threshold from μ1 and μ2:
T = (μ1 + μ2) / 2;
repeating steps 1.2), 1.3) and 1.4) until the threshold T converges to within a certain range.
9. The drone borescope method of claim 4, wherein the positioning of the bouncing water column using the modified centroid method comprises:
performing edge detection on the bouncing water column, extracting edge information of the bouncing water column, and performing arithmetic mean calculation on the basis of the edge information of the bouncing water column to calculate the mass center of the bouncing water column;
in the fourth step, the step of calculating to obtain the shot scattering error and the step of evaluating the shot miss amount of the gun comprises the following steps:
(1) the calculation of the dispersion error is performed:
based on the obtained position information of the target and the shot water column in the image, a shot scattering coordinate system with the target as an origin is established, and scattering errors are calculated by the following formula:
EX = 0.6745 · √( Σi (Xi − X̄)² / (n − 1) )
EZ = 0.6745 · √( Σi (Zi − Z̄)² / (n − 1) )
wherein X̄ = (1/n) Σi Xi and Z̄ = (1/n) Σi Zi; (Xi, Zi) denotes the actual coordinates of each impinging water column; (X̄, Z̄) are the coordinates of the dispersion center, i.e. the mean of the actual coordinates of all the water columns; 0.6745 is the probable-error coefficient; and n is the total number of rounds;
(2) comparing the calculated values of $E_X$ and $E_Z$ with the standard impact-point dispersion error value K to determine the shooting result of the gun and the evaluation result of the gun shot miss distance.
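The dispersion-error computation of step (1) above can be sketched as a short Python illustration; it assumes the splash coordinates are already expressed in the target-centred coordinate system, and the function name `dispersion_errors` is hypothetical:

```python
import math

def dispersion_errors(points):
    """Probable dispersion errors (E_X, E_Z) of the splash points.

    `points` is a list of (X_i, Z_i) water-column coordinates in the
    target-centred dispersion coordinate system. 0.6745 is the factor
    that converts the sample standard deviation into a probable error
    for a normally distributed variable.
    """
    n = len(points)
    # group center: mean of the actual coordinates of all water columns
    xbar = sum(x for x, _ in points) / n
    zbar = sum(z for _, z in points) / n
    # probable errors along the two axes
    ex = 0.6745 * math.sqrt(sum((x - xbar) ** 2 for x, _ in points) / (n - 1))
    ez = 0.6745 * math.sqrt(sum((z - zbar) ** 2 for _, z in points) / (n - 1))
    return ex, ez
```

The returned pair would then be compared against the standard dispersion error value K, per step (2), to grade the shooting result.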
10. A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the functions of the unmanned aerial vehicle target detection system of any one of claims 1-2.
CN202210054509.9A 2022-01-18 2022-01-18 Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium Pending CN114578849A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210054509.9A CN114578849A (en) 2022-01-18 2022-01-18 Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114578849A true CN114578849A (en) 2022-06-03

Family

ID=81769551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210054509.9A Pending CN114578849A (en) 2022-01-18 2022-01-18 Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114578849A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115421135A (en) * 2022-09-09 2022-12-02 中国人民解放军海军工程大学 Radar/photoelectric composite single-station projectile off-target quantity measuring method, system and terminal


Similar Documents

Publication Publication Date Title
CN111626290B (en) Infrared ship target detection and identification method under complex sea surface environment
Zhang et al. ShipRSImageNet: A large-scale fine-grained dataset for ship detection in high-resolution optical remote sensing images
CN110443201B (en) Target identification method based on multi-source image joint shape analysis and multi-attribute fusion
CN107817679A (en) Based on infrared and naval vessel water cannon control system and method for visible ray fusion tracking
CN106326892A (en) Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
Kim et al. Object detection and tracking for autonomous underwater robots using weighted template matching
CN106485651B (en) The image matching method of fast robust Scale invariant
CN109903305A (en) Line style target impact point positioning method based on aerial three-dimensional localization
CN110766721B (en) Carrier landing cooperative target detection method based on airborne vision
CN109859247A (en) Scene infrared small target detection method near the ground
CN105225251A (en) Over the horizon movement overseas target based on machine vision identifies and locating device and method fast
CN114578849A (en) Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium
Li et al. Vision-based target detection and positioning approach for underwater robots
JP5367244B2 (en) Target detection apparatus and target detection method
CN113822297B (en) Marine ship target recognition device and method
CN116087982A (en) Marine water falling person identification and positioning method integrating vision and radar system
CN114419450A (en) Linear target damage efficiency rapid evaluation method based on image feature analysis
Rao et al. Real time vision-based autonomous precision landing system for UAV airborne processor
CN116929149B (en) Target identification and guidance method based on image guidance
CN112417948B (en) Method for accurately guiding lead-in ring of underwater vehicle based on monocular vision
US9721352B1 (en) Method and apparatus for computer vision analysis of cannon-launched artillery video
CN112113462B (en) Method and system for detecting shooting effect of direct-aiming weapon and virtual target shooting system
CN115909072A (en) Improved YOLOv4 algorithm-based impact point water column detection method
Armbruster et al. Segmentation, classification, and pose estimation of maritime targets in flash-ladar imagery
CN105551013B (en) SAR image sequence method for registering based on motion platform parameter

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination