CN114578849A - Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium
- Publication number: CN114578849A
- Application number: CN202210054509.9A
- Authority: CN (China)
- Prior art keywords: target, image, water column, shooting, unmanned aerial
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention belongs to the technical field of automatic target detection and discloses an unmanned aerial vehicle (UAV) target detection system, a target detection method, and a computer-readable storage medium. The UAV target detection system comprises: a UAV system for carrying an optoelectronic payload system to the observation position and hovering at a fixed point; an optoelectronic payload system for performing fixed-point observation tasks and carrying out fixed observation of the shooting range area; a wireless image transmission system for transmitting the observed video images using an HHLM-type transmission module; a wireless remote control system for controlling the UAV system and the optoelectronic payload system; and an integrated target detection console for receiving, storing, forwarding, and processing the video images. The invention provides a target detection system for use at sea, which is of great significance for raising the level of test technology and test equipment and for improving the live-fire capability and training level of destroyer and frigate naval guns; at the same time, it has strong practical significance for guaranteeing the quality of live naval gun firing and meeting battlefield needs.
Description
Technical Field
The invention belongs to the technical field of automatic target detection, and in particular relates to an unmanned aerial vehicle target detection system, a target detection method, and a computer-readable storage medium.
Background Art
Since the beginning of the 21st century, maritime security issues have drawn increasing attention from many countries. Disputes over maritime territory and maritime rights between neighboring countries, rampant pirate attacks, and the safety of free navigation at sea have led countries to attach growing importance to naval modernization. Live-fire shooting is an important item in naval training, and in competitive live-fire assessments, rapid and accurate judgment of firing results is the key to completing a comprehensive evaluation of combat performance. Daily training level is judged by the hit rate and miss rate of naval gun fire, so accurate and scientific measurement is essential. A sea range differs from a land range: on land, target test traces are easy to capture, whereas at sea the water column raised when a shell falls into the water is a brief event, and the trace is fleeting and difficult to capture.
Therefore, the prior art has no automated target detection method applicable to sea ranges, nor any method or system capable of automated, intelligent evaluation of naval gun firing at sea.
Through the above analysis, the problems and defects of the prior art are as follows: the prior art has no automated target detection method applied to sea ranges, nor any method or system capable of automated, intelligent evaluation of naval gun firing at sea. The traditional optical measurement method requires manual time alignment of two sets of image data and manual measurement, suffers from orthogonality error, involves a heavy workload and strong specialization, and is unsuitable for modern high-intensity training. Moreover, if the technically advanced, structurally complex, high-precision large measurement and control equipment of a range is used to obtain the data (impact deviations), the cost cannot be ignored.
The difficulty of solving the above problems and defects is as follows. Judging from the implementation of existing close-range counterattack firing assessments for the main gun, the following shortcomings remain:
(1) Many support forces are involved and coordination is difficult. A reference ship and a target-laying ship are both required as support forces, and multiple cameras must be deployed, which means multiple support personnel and therefore continuous coordination and communication among the support forces and personnel.
(2) The recording process is demanding and complicated to implement. The dispersion error of the naval gun can only be calculated from the two videos taken from the reference ship and the target-laying ship; to ensure consistency of the measured targets in the video images, the two cameras must share a unified time reference and must start recording simultaneously, otherwise objective post-event analysis based on the video is difficult to achieve.
(3) Dispersion error estimation is slow and of limited accuracy. In post-processing the impact water-column video, after the consistency of the measured targets has been confirmed, the distance between the impact water column and the floating target is measured with a ruler on the computer display and then converted and processed to estimate the probable dispersion error of the naval gun, which is time-consuming, labor-intensive, and of low precision.
The significance of solving the above problems and defects is as follows. In traditional miss-distance evaluation, two orthogonal cameras film the range dispersion and the deflection dispersion from the firing ship and the support ship respectively; after returning to shore, the two sets of image data are first aligned in time manually, distances are then measured on the images to calculate the dispersion error, and the assessment result is graded. The whole process is time-consuming and labor-intensive. The UAV target detection system has broad application prospects: it is suitable not only for scoring naval gun fire at sea targets but also for naval gun fire at shore targets. As UAV target detection technology matures, the UAV target detection system will gradually be popularized and replace the target detection systems currently in use. At the same time, miss-distance evaluation of naval gun fire is completed on the basis of video images: the automatic UAV target detection system for naval gun firing acquires, from a fixed point at altitude, video images of the relative position of the impact water columns and the target, transmits them to a digital processing module, completes the extraction and localization of the target and of the impact water columns by digital image processing, calculates the coordinates of the impact water columns, obtains the firing dispersion error to grade the firing, and finally completes the GUI implementation of the miss-distance evaluation system. This shortens the evaluation time, reduces manual workload, and improves the accuracy and timeliness of the evaluation.
Summary of the Invention
In view of the problems existing in the prior art, the present invention provides an unmanned aerial vehicle target detection system, a target detection method, and a computer-readable storage medium.
The present invention is realized as follows. An unmanned aerial vehicle target detection system comprises:
a UAV system, an optoelectronic payload system, a wireless image transmission system, a wireless remote control system, and an integrated target detection console;
the UAV system, comprising a quadrotor UAV body, a propulsion device, a flight control device, and a power supply device, is used to carry the optoelectronic payload system to the observation position and hover at a fixed point;
the optoelectronic payload system, mounted on the quadrotor UAV body and comprising a CCD camera, an infrared detection device, and a pod, is used to perform fixed-point observation tasks and carry out fixed observation of the range area;
the wireless image transmission system is used to transmit the observed video images using an HHLM-type transmission module;
the wireless remote control system is used to control the UAV system and the optoelectronic payload system;
the integrated target detection console is used to receive, store, forward, and process the video images.
Further, the wireless remote control system comprises:
a receiving module for receiving control instructions;
a decoding module for decoding the control instructions with a decoder;
a control module for controlling the UAV system and the optoelectronic payload system based on the decoded control instructions.
Another object of the present invention is to provide a UAV target detection method applied to the UAV target detection system, the UAV target detection method comprising:
acquiring video images of the water columns raised by naval gun shells with the optoelectronic payload, and analyzing and processing the video images by acquiring video information, acquiring image sequences, playing back video information, selecting images frame by frame, inputting the coordinate-system rotation angle, extracting the impact water columns, and other means, to obtain the hit rate and miss rate of the naval gun fire and the firing evaluation result.
Further, the UAV target detection method comprises the following steps:
Step 1: generating control instructions, and on the basis of the generated control instructions controlling the quadrotor UAV carrying the optoelectronic payload to fly to a safe height above the target and hover; at the same time, on the basis of the generated control instructions, controlling the UAV to adjust the shooting attitude of the optoelectronic payload;
Step 2: on the basis of the control instructions, controlling the optoelectronic payload to record the naval gun firing within the sea range area, continuously acquiring video images of the water columns raised by shells entering the water, and obtaining the relative relationship between the impact water columns and the target in the range, i.e. the dispersion of the naval gun fire;
Step 3: converting the acquired video images into electrical signals, and converting them into digital signals after sampling, quantization, and encoding with an analog-to-digital converter;
Step 4: preprocessing the converted images to obtain clear and useful image information; on the basis of the clear and useful images, calculating the relative positions of the impact water columns and the floating target, calculating the shell dispersion error, and evaluating the miss distance of the naval gun fire.
Further, the safe height calculation method comprises:
first, analyzing the hovering height of the UAV on the basis of the firing height of the naval gun:
H ≥ h;
where H denotes the hovering height of the UAV and h denotes the height of the shell at its highest point; the height h of the shell at its highest point is determined by the relationship between the firing angle, firing range, and firing height of the naval gun;
second, analyzing the hovering height of the UAV on the basis of the observation range of the optoelectronic payload: according to the type of gun, the firing method, and the average firing distance, and by consulting the corresponding table of scoring parameters for firing at floating targets and simulated targets, the range of the judgment interval is determined; the hovering height at which the UAV works is determined from the rectangular boundary of the judgment interval, the rectangular boundary of the judgment interval being the boundary of the rectangular area within which, according to the corresponding firing score parameter table, a shell is judged to have fallen into the effective hit area when the naval gun fires at a floating target or simulated target;
finally, determining the safe height of the UAV on the basis of the result of the hovering-height analysis based on the naval gun firing height, the rectangular boundary limit of the judgment interval, the field-of-view range of the camera, and the flight altitude limit of the UAV itself.
Further, preprocessing the converted images to obtain clear and useful image information comprises:
(1) segmenting the converted images with a segmentation method based on color features, and identifying and extracting the target area;
(2) on the basis of the target information obtained after segmentation, performing curve fitting by the least-squares method to determine the center and radius of the target and thereby locate the target;
(3) performing grayscale conversion, median filtering, and contrast enhancement on the extracted and located target images.
Further, calculating the relative positions of the impact water columns and the floating target on the basis of the clear and useful images comprises:
calculating a threshold by an adaptive, iterative method to segment the impact water-column image and obtain the impact water-column image with the background removed; binarizing the background-removed impact water-column image; and locating the impact water column with an improved centroid method.
Further, calculating the threshold by the adaptive, iterative method to segment the impact water-column image comprises:
1.1) counting the minimum gray value Tmin and the maximum gray value Tmax of the impact water-column image, and taking their average as the initial threshold T: T = (Tmin + Tmax) / 2;
1.2) segmenting the image according to the threshold T to obtain two pixel sets:
G1 = {f(x, y) ≥ T}, G2 = {f(x, y) < T};
1.3) calculating the gray-level means μ1 and μ2 of the pixel sets G1 and G2;
1.4) calculating a new threshold T = (μ1 + μ2) / 2 from μ1 and μ2, and repeating steps 1.2), 1.3), and 1.4) until the threshold T converges to within a given range.
Further, locating the impact water column with the improved centroid method comprises:
performing edge detection on the impact water column, extracting the edge information of the impact water column, and calculating the centroid of the impact water column as the arithmetic mean of the edge information.
Further, in Step 4, calculating the shell dispersion error and evaluating the miss distance of the naval gun fire comprises:
(1) calculating the dispersion error:
on the basis of the obtained positions of the target and the impact water columns in the image, establishing an impact dispersion coordinate system with the target as origin, and calculating the dispersion errors as
E_X = 0.6745 · √( Σ(X_i − X̄)² / (n − 1) ),  E_Z = 0.6745 · √( Σ(Z_i − Z̄)² / (n − 1) )  (formula 4.25)
where (X_i, Z_i) denotes the actual coordinates of each impact water column, (X̄, Z̄) denotes the coordinates of the center of the impact water-column group, equal to the averages of the actual coordinates of all impact water columns, 0.6745 is the probability coefficient, and n is the total number of shells;
(2) comparing the calculated values of E_X and E_Z with the standard impact-point dispersion error value K to determine the naval gun firing score, and obtaining the miss-distance evaluation result of the naval gun fire.
Another object of the present invention is to provide a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the functions of the UAV target detection system.
Combining all the above technical solutions, the advantages and positive effects of the present invention are as follows: the present invention uses an automatic UAV target detection system for naval gun firing to acquire, from a fixed point at altitude, video images of the relative position of the impact water columns and the target, transmits them to a digital processing module, completes the extraction and localization of the target and of the impact water columns by digital image processing, calculates the coordinates of the impact water columns, and obtains the firing dispersion error to grade the firing. This is of great significance for raising the level of test technology and test equipment and for improving the live-fire capability and training level of destroyer and frigate naval guns; at the same time, it has strong practical significance for guaranteeing the quality of live naval gun firing and meeting battlefield needs.
The present invention adopts an HHLM-type wireless image transmission system with a maximum transmission distance of up to 50 km. The HHLM microwave image transmission system is a high-performance, high-quality wireless image transmission system designed specifically for long distances or environments without wired transmission. The system is small and light, can transmit high-quality video images in real time without distortion, has stable modulation and demodulation performance, transmits bright and clear images, and is easy to install and commission.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of the UAV target detection system provided by an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of the UAV target detection system provided by an embodiment of the present invention.
In the figure: 1. UAV system; 2. optoelectronic payload system; 3. wireless image transmission system; 4. wireless remote control system; 5. integrated target detection console.
Fig. 3 is a schematic diagram of the UAV target detection method provided by an embodiment of the present invention.
Fig. 4 is a flow chart of the UAV target detection method provided by an embodiment of the present invention.
Fig. 5 is a schematic diagram of the position of the target detection UAV provided by an embodiment of the present invention.
Fig. 6 is a graph of firing elevation angle versus firing distance provided by an embodiment of the present invention.
Fig. 7 is a graph of firing elevation angle versus firing height provided by an embodiment of the present invention.
Fig. 8 shows firing curves for different firing elevation angles provided by an embodiment of the present invention.
Fig. 9 is a schematic diagram of the judgment interval provided by an embodiment of the present invention.
Fig. 10 is a schematic diagram of detection by the target detection system provided by an embodiment of the present invention.
Fig. 11 is a schematic diagram of pinhole imaging provided by an embodiment of the present invention.
Fig. 12 is a flow chart of the automatic extraction and localization of the impact water column provided by an embodiment of the present invention.
Fig. 13 is a single-peaked gray-level histogram provided by an embodiment of the present invention.
Fig. 14 is a schematic diagram of the image coordinate system provided by an embodiment of the present invention.
Fig. 15 is a schematic diagram of the principle of calculating the impact water-column coordinates provided by an embodiment of the present invention.
Fig. 16 is a schematic diagram of the impact dispersion coordinate system provided by an embodiment of the present invention.
Fig. 17 is a diagram of the planar positional relationship between the firing ship and the UAV provided by an embodiment of the present invention.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
In view of the problems existing in the prior art, the present invention provides an unmanned aerial vehicle target detection system, which is described in detail below with reference to the accompanying drawings.
As shown in Figs. 1 and 2, the UAV target detection system provided by the embodiment of the present invention comprises:
a UAV system 1, an optoelectronic payload system 2, a wireless image transmission system 3, a wireless remote control system 4, and an integrated target detection console 5;
the UAV system 1, comprising a quadrotor UAV body, a propulsion device, a flight control device, and a power supply device, is used to carry the optoelectronic payload system to the observation position and hover at a fixed point;
the optoelectronic payload system 2, mounted on the quadrotor UAV body and comprising a CCD camera, an infrared detection device, and a pod, is used to perform fixed-point observation tasks and carry out fixed observation of the range area;
the wireless image transmission system 3 is used to transmit the observed video images using an HHLM-type transmission module;
the wireless remote control system 4 is used to control the UAV system and the optoelectronic payload system;
the integrated target detection console 5 is used to receive, store, forward, and process the video images.
The wireless remote control system 4 provided by the embodiment of the present invention comprises:
a receiving module for receiving control instructions;
a decoding module for decoding the control instructions with a decoder;
a control module for controlling the UAV system and the optoelectronic payload system based on the decoded control instructions.
As shown in Fig. 3, the UAV target detection method provided by the embodiment of the present invention comprises:
acquiring video images of the water columns raised by naval gun shells with the optoelectronic payload, and analyzing and processing the video images by acquiring video information, acquiring image sequences, playing back video information, selecting images frame by frame, inputting the coordinate-system rotation angle, extracting the impact water columns, and other means, to obtain the hit rate and miss rate of the naval gun fire and the firing evaluation result.
As shown in Fig. 4, the UAV target detection method provided by the embodiment of the present invention comprises the following steps:
S101: generating control instructions, and on the basis of the generated control instructions controlling the quadrotor UAV carrying the optoelectronic payload to fly to a safe height above the target and hover; at the same time, on the basis of the generated control instructions, controlling the UAV to adjust the shooting attitude of the optoelectronic payload;
S102: on the basis of the control instructions, controlling the optoelectronic payload to record the naval gun firing within the sea range area, continuously acquiring video images of the water columns raised by shells entering the water, and obtaining the relative relationship between the impact water columns and the target in the range, i.e. the dispersion of the naval gun fire;
S103: converting the acquired video images into electrical signals, and converting them into digital signals after sampling, quantization, and encoding with an analog-to-digital converter;
S104: preprocessing the converted images to obtain clear and useful image information; on the basis of the clear and useful images, calculating the relative positions of the impact water columns and the floating target, calculating the shell dispersion error, and evaluating the miss distance of the naval gun fire.
The safe height calculation method provided by the embodiment of the present invention comprises:
first, analyzing the hovering height of the UAV on the basis of the firing height of the naval gun:
H ≥ h;
where H denotes the hovering height of the UAV and h denotes the height of the shell at its highest point; the height h of the shell at its highest point is determined by the relationship between the firing angle, firing range, and firing height of the naval gun;
second, analyzing the hovering height of the UAV on the basis of the observation range of the optoelectronic payload: according to the type of gun, the firing method, and the average firing distance, and by consulting the corresponding table of scoring parameters for firing at floating targets and simulated targets, the range of the judgment interval is determined; the hovering height at which the UAV works is determined from the rectangular boundary of the judgment interval, the rectangular boundary of the judgment interval being the boundary of the rectangular area within which, according to the corresponding firing score parameter table, a shell is judged to have fallen into the effective hit area when the naval gun fires at a floating target or simulated target;
finally, determining the safe height of the UAV on the basis of the result of the hovering-height analysis based on the naval gun firing height, the rectangular boundary limit of the judgment interval, the field-of-view range of the camera, and the flight altitude limit of the UAV itself.
Preprocessing the converted images to obtain clear and useful image information, as provided by the embodiment of the present invention, comprises:
(1) segmenting the converted images with a segmentation method based on color features, and identifying and extracting the target area;
(2) on the basis of the target information obtained after segmentation, performing curve fitting by the least-squares method to determine the center and radius of the target and thereby locate the target (see the sketch following this list);
(3) performing grayscale conversion, median filtering, and contrast enhancement on the extracted and located target images.
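The following Python sketch illustrates the color-based segmentation of step (1) and the least-squares circle fit of step (2). It is not code from the patent: the HSV thresholds, the function name, and the use of OpenCV/NumPy are illustrative assumptions, and the grayscale/median-filter/contrast step (3) is omitted.

```python
import cv2
import numpy as np

def extract_target(bgr_image):
    """Return (cx, cy, r): fitted center and radius of the red floating target (sketch)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so two hue ranges are combined (assumed thresholds).
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    ys, xs = np.nonzero(mask)
    xs, ys = xs.astype(float), ys.astype(float)

    # Least-squares circle fit: x^2 + y^2 + D*x + E*y + F = 0 is linear in D, E, F.
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = -(xs ** 2 + ys ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - F)
    return cx, cy, r
```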
Calculating the relative positions of the impact water columns and the floating target on the basis of the clear and useful images, as provided by the embodiment of the present invention, comprises:
calculating a threshold by an adaptive, iterative method to segment the impact water-column image and obtain the impact water-column image with the background removed; binarizing the background-removed impact water-column image; and locating the impact water column with an improved centroid method.
Calculating the threshold by the adaptive, iterative method to segment the impact water-column image, as provided by the embodiment of the present invention, comprises:
1.1) counting the minimum gray value Tmin and the maximum gray value Tmax of the impact water-column image, and taking their average as the initial threshold T: T = (Tmin + Tmax) / 2;
1.2) segmenting the image according to the threshold T to obtain two pixel sets:
G1 = {f(x, y) ≥ T}, G2 = {f(x, y) < T};
1.3) calculating the gray-level means μ1 and μ2 of the pixel sets G1 and G2;
1.4) calculating a new threshold T = (μ1 + μ2) / 2 from μ1 and μ2, and repeating steps 1.2), 1.3), and 1.4) until the threshold T converges to within a given range (a sketch of this iteration follows below).
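A minimal Python sketch of the iterative threshold of steps 1.1)-1.4) could look as follows; the convergence tolerance eps and the function name are assumptions, since the text only requires that T converge to within some range.

```python
import numpy as np

def iterative_threshold(gray, eps=0.5):
    """Iterate the threshold T until it converges, then return the binary segmentation."""
    t = (float(gray.min()) + float(gray.max())) / 2.0   # step 1.1): initial threshold
    while True:
        g1 = gray[gray >= t]              # step 1.2): pixel set G1
        g2 = gray[gray < t]               #            pixel set G2
        mu1 = g1.mean() if g1.size else t # step 1.3): gray-level means of G1 and G2
        mu2 = g2.mean() if g2.size else t
        t_new = (mu1 + mu2) / 2.0         # step 1.4): new threshold from mu1 and mu2
        if abs(t_new - t) < eps:          # convergence test (eps is an assumed tolerance)
            return (gray >= t_new).astype(np.uint8) * 255, t_new
        t = t_new
```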
Locating the impact water column with the improved centroid method, as provided by the embodiment of the present invention, comprises:
performing edge detection on the impact water column, extracting the edge information of the impact water column, and calculating the centroid of the impact water column as the arithmetic mean of the edge information.
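As a hedged illustration of this centroid step, the sketch below runs Canny edge detection on the binarized water-column image and averages the edge-pixel coordinates; the specific edge detector and its thresholds are assumptions, since the text does not name them.

```python
import cv2
import numpy as np

def water_column_centroid(binary_column):
    """Return (cx, cy), the centroid of the segmented water-column region (sketch)."""
    edges = cv2.Canny(binary_column, 50, 150)   # edge detection on the binary image
    ys, xs = np.nonzero(edges)                  # coordinates of the edge pixels
    if xs.size == 0:
        return None                             # no water column found in this frame
    return float(xs.mean()), float(ys.mean())   # arithmetic mean of the edge coordinates
```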
Calculating the shell dispersion error and evaluating the miss distance of the naval gun fire, as provided by the embodiment of the present invention, comprises:
(1) calculating the dispersion error:
on the basis of the obtained positions of the target and the impact water columns in the image, establishing an impact dispersion coordinate system with the target as origin, and calculating the dispersion errors as
E_X = 0.6745 · √( Σ(X_i − X̄)² / (n − 1) ),  E_Z = 0.6745 · √( Σ(Z_i − Z̄)² / (n − 1) )  (formula 4.25)
where (X_i, Z_i) denotes the actual coordinates of each impact water column, (X̄, Z̄) denotes the coordinates of the center of the impact water-column group, equal to the averages of the actual coordinates of all impact water columns, 0.6745 is the probability coefficient, and n is the total number of shells;
(2) comparing the calculated values of E_X and E_Z with the standard impact-point dispersion error value K to determine the naval gun firing score, and obtaining the miss-distance evaluation result of the naval gun fire (a sketch of this calculation follows below).
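The following sketch shows one way the dispersion errors E_X and E_Z and the comparison with K could be computed. It is a sketch only: the (n − 1) denominator follows the usual probable-error formula and is an assumption, as the text gives only the coefficient 0.6745 and the symbol definitions, and the pass/fail rule against K is simplified.

```python
import numpy as np

def dispersion_errors(impacts):
    """impacts: array of shape (n, 2) with the (X_i, Z_i) impact coordinates in meters."""
    impacts = np.asarray(impacts, dtype=float)
    n = impacts.shape[0]
    center = impacts.mean(axis=0)                 # group center (X_bar, Z_bar)
    dev = impacts - center
    ex = 0.6745 * np.sqrt((dev[:, 0] ** 2).sum() / (n - 1))
    ez = 0.6745 * np.sqrt((dev[:, 1] ** 2).sum() / (n - 1))
    return ex, ez

def grade(ex, ez, k):
    """Compare the computed errors with the standard value K (simplified pass/fail)."""
    return ex <= k and ez <= k
```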
The technical solution of the present invention is further described below with reference to specific embodiments.
Embodiment 1:
1. Composition and working principle of the UAV target detection system
1.1 System composition
The UAV target detection system consists of a UAV system, an optoelectronic payload, a wireless image transmission system, a wireless remote control system, and an integrated target detection console.
A UAV (unmanned aerial vehicle) is an unmanned aircraft operated by radio remote control equipment and its own program control device. The UAV platform system is the airborne part of the UAV system and includes the airframe, propulsion device, flight control device, and power supply. The airborne part of the communication data link and the mission payload are also installed on the aircraft. The present invention selects a quadrotor UAV powered by an aviation lithium battery, mainly because quadrotor UAVs are widely used, have a solid technical foundation, low cost, good maneuverability, strong survivability, and low environmental requirements, can perform observation tasks that are difficult to carry out manually, and are easy to use. It is therefore convenient to hover and record video of the impact-point water columns; in this system, the UAV carries the optical measurement platform to complete fixed-point aerial observation tasks and realize fixed observation of the range area.
The optoelectronic payload consists of a CCD camera, an infrared detection device, and a pod. A CCD (charge-coupled device image sensor) is made of a highly photosensitive semiconductor material that converts light into electric charge, which is converted into a digital signal by an analog-to-digital converter chip; after compression, the digital signal is stored in the camera's internal flash memory or built-in hard disk card, so the data can easily be transferred to a computer. Mounted on the quadrotor UAV, the payload records the naval gun firing within the sea range area, continuously photographs the water columns raised by shells entering the water, and obtains the relative relationship between the impact water columns and the target in the range, i.e. the dispersion of the naval gun fire.
The wireless image transmission system is used for timely image transmission. It converts the video image information captured by the UAV optical measurement platform into electrical signals, which are sampled, quantized, and encoded by an analog-to-digital converter into digital signals, temporarily stored in dynamic memory, and then sent wirelessly in real time by the wireless data transmission chip to the information receiving and processing system for later image processing and data computation. The present invention adopts an HHLM-type wireless image transmission system with a maximum transmission distance of up to 50 km. The HHLM microwave image transmission system is a high-performance, high-quality wireless image transmission system designed specifically for long distances or environments without wired transmission. The system is small and light, can transmit high-quality video images in real time without distortion, has stable modulation and demodulation performance, transmits bright and clear images, and is easy to install and commission.
The function of the wireless remote control system is to control the UAV and the optoelectronic payload. The ground control center issues instructions, which are transmitted to the receiving module of the wireless remote control system and read out by the decoder, thereby realizing control of the UAV and of the optoelectronic payload.
The task of the integrated target detection console is to receive, store, and forward the video images and to process them effectively; it consists mainly of a PC and image processing and numerical computation software. After receiving the information from the image transmission system, it preprocesses the images, removes interference and noise, extracts clear and useful image information, and uses this information to calculate, according to a reasonable mathematical model, the relative positions of the impact water columns and the floating target and the shell dispersion error.
1.2 Working principle
The traditional evaluation method is cumbersome, has large accidental errors, and places a heavy workload on support personnel. The automatic UAV target detection system for naval gun firing can, from video image acquisition through to later data processing, remove the drawbacks of traditional target detection work, reduce working time, and improve efficiency. The basic workflow of the system is roughly divided into three steps, as shown schematically in Fig. 1.
(1) The UAV, as the platform carrying the optoelectronic payload, flies with the payload to a safe height above the target and hovers.
(2) The UAV optoelectronic payload adjusts its shooting attitude and begins work, and the target is imaged by the camera onto the image plane.
(3) The video image information of the water columns raised by the naval gun shells is transmitted in real time by the wireless transmission module to the integrated task platform of the target-laying ship for real-time storage and forwarding, and then transmitted through the wireless image transmission system to the image acquisition and display board, where the video signal is digitized; the developed integrated task software processes the video images quickly and accurately to obtain the hit rate and miss rate of the naval gun fire and evaluate the firing results.
2. Analysis of the target detection task area of the UAV system
To ensure the safety of the UAV and meet the requirements of the target detection task, the UAV target detection task area must be clearly defined. When the UAV system performs target detection, the UAV hovers directly above the floating target, so the target detection task area is the area directly above the floating target, and the main task is to determine the height of this area.
2.1 Analysis of UAV hovering height based on the naval gun firing height
During naval gun firing, the trajectory of a shell after leaving the barrel is approximately parabolic. When the UAV target detection system is working, the UAV hovers directly above the floating target. To ensure the safety of the UAV, the height that keeps the UAV safe must be determined according to the firing envelope of the naval gun so that the UAV is not shot down by a shell. The positioning of the target detection UAV is shown in Fig. 5. It can be seen from Fig. 5 that the UAV is hardly threatened in the horizontal direction; the safety threat comes mainly from shells passing directly above the floating target. The choice of hovering height is therefore critical during naval gun firing: a wrong choice would cause a serious accident and put the target detection work in a passive position.
The relationship between the firing angle, firing range, and firing height of the naval gun is analyzed to find the best hovering height.
The muzzle velocity of the 76 mm main gun shell is 980 m/s, the gravitational acceleration is g = 9.8 m/s², the firing elevation angle is α, and the firing angle of the naval gun ranges from -15° to 85°. Matlab simulation yields curves of firing elevation versus firing distance, firing elevation versus firing height, and firing distance versus firing height for different elevations; the curves and simulation results are analyzed to determine the safe firing envelope when the naval gun fires at the floating target.
First, Matlab simulation yields the curve of firing elevation versus firing distance shown in Fig. 6. From Fig. 6 it can be seen that the firing distance is greatest when the firing elevation is 45°; when the distance between the ship and the floating target is 3 nautical miles (5.556 km) and the firing elevation is between 0° and 3°, the firing distance stays within 10 km, which meets the firing requirements under these conditions.
Matlab simulation then yields the curve of firing elevation versus maximum firing height shown in Fig. 7. From Fig. 7 it can be seen that the maximum firing height increases with the firing elevation; between 0° and 10°, the change in maximum firing height is not significant.
Based on the analysis of the elevation-distance curve in Fig. 6 and the elevation-maximum-height curve in Fig. 7, and using Matlab simulation, the corresponding curves of firing distance and firing height at different firing elevations are obtained, as shown in Fig. 8.
From the firing curves for different elevations in Fig. 8 and the simulation results, the maximum height the shell can reach and its height when passing directly above the floating target are obtained for different firing elevations; the parameters are listed in Table 1.
Table 1. Shell firing distance and height parameters
From the firing distance and height parameters for different elevations and the corresponding curves, the danger zone of naval gun fire can be read off directly, and hence the firing envelope of the naval gun in height. For this task the firing distance is 3 nautical miles (5.556 km); from the parameter table in Table 1 and the firing curves for different elevations in Fig. 8, the firing envelope of the naval gun is obtained. The firing envelope is the region, taking the firing ship as the base point, below 1.5 times the maximum firing height of the naval gun. From the above, the maximum firing height is 40 m, so the minimum height of the naval gun firing envelope is 1.5 times the maximum firing height, i.e. 40 × 1.5 = 60 m.
The minimum safe height of the UAV is then calculated. During target detection the UAV hovers directly above the floating target. The shell trajectory is approximately parabolic; assuming the height of the shell at its highest point is h and the hovering height of the UAV is H, safety requires H ≥ h. With a distance of 3 nautical miles (5.556 km) between the ship and the floating target, the minimum firing-envelope height of the 76 mm naval gun is 60 m; to keep the UAV safe from shells, the minimum safe hovering height of the UAV is set to 60 m, with the UAV directly above the floating target.
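For reference, the drag-free parabola assumed in this analysis can be reproduced with the short sketch below. This is an illustrative recomputation, not the patent's Matlab code, and real exterior ballistics with air drag would give different values.

```python
import math

V0, G = 980.0, 9.8   # muzzle velocity (m/s) and gravitational acceleration (m/s^2) from the text

def max_height(alpha_deg):
    """Apex height of the drag-free trajectory for firing elevation alpha (degrees)."""
    vy = V0 * math.sin(math.radians(alpha_deg))
    return vy ** 2 / (2.0 * G)

def elevation_for_range(target_range_m):
    """Elevation (degrees) of the flat trajectory whose range equals target_range_m."""
    return 0.5 * math.degrees(math.asin(G * target_range_m / V0 ** 2))

alpha = elevation_for_range(5556.0)    # ~1.6 degrees for the 3 nautical mile shoot
apex = max_height(alpha)               # ~40 m, matching the figure quoted in the text
print(alpha, apex, 1.5 * apex)         # 1.5 x apex ~ 60 m minimum safe hover height
```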
2.2 Analysis of UAV hovering height based on the observation range of the optoelectronic payload
The UAV hovering height based on the observation range of the optoelectronic payload is the height at which the imaging requirements are met and the captured image contains the elements needed for target detection. When determining the UAV working height it is therefore not enough to consider only the height that keeps the UAV safe; the working height that satisfies the imaging task must also be considered.
The concept of the judgment interval is introduced here. The rectangular boundary of the judgment interval is the boundary of the rectangular area within which, according to the corresponding firing score parameter table, a shell is judged to have fallen into the effective hit area when the naval gun fires at a floating (reflector) target or simulated target.
According to the type of gun, the firing method, and the average firing distance, the range of the judgment interval is determined from the corresponding table of scoring parameters for firing at floating (reflector) targets and simulated targets. A schematic of the judgment interval is shown in Fig. 9.
In Fig. 9, 2X is the range-direction side length of the rectangular judgment interval and 2Z is the deflection-direction side length. The table of scoring parameters for firing at sea floating targets lists X and Z, the half side lengths of the rectangular judgment area in range and deflection. The scoring parameters for firing at sea floating targets are shown in Table 2.
Table 2. Scoring parameters for firing at sea floating targets
For this firing task, consulting the parameter table gives the boundary values of the judgment-interval rectangle at a firing distance of 3 nautical miles (5.556 km): range X = 114.43 m and deflection Z = 32.651 m.
Considering the camera's viewing angle and imaging requirements, the hovering height at which the UAV works must also be determined from the boundary of the judgment-interval rectangle. The detection layout of the target detection system is shown in Fig. 10. The field of view of the camera is 94° (i.e. ∠ACB = 94°), so the half angle of view is ∠ACO = ∠BCO = 1/2 ∠ACB = 47°. In live-fire naval gun assessments, the effective scoring area is the judgment-interval rectangle centered on the floating target O, and the judgment-interval rectangle is inscribed in circle O.
From these known quantities, OC can easily be computed to determine the hovering height of the UAV. By the Pythagorean theorem:
c² = a² + b²
where a = X = 114.43 meters and b = Z = 32.651 meters, which gives
c = √(114.43² + 32.651²) ≈ 119 meters.
That is, the radius of circle ⊙O equals 119 meters, and OA = OB = OD = OE = OF = OG = 119 meters.
From the camera's field of view, the length OC follows from the tangent relation:
OC = OA / tan θ
where OA = 119 meters and θ = ∠ACO = 47°, which gives
OC = 119 / tan 47° ≈ 111 meters.
That is, OC = 111 meters, so the hovering height of the UAV is 111 meters.
If water columns raised by shells falling outside the "judgment interval" rectangle are also to be observed, the camera's visible range must be enlarged. Assuming the visible area is 1.5 times the length and width of the "judgment interval" rectangle, the boundary values become X′ = 1.5X = 171.645 meters in range and Z′ = 1.5Z = 48.98 meters in deflection. Applying the same method used for the "judgment interval" rectangle gives the hovering height under this condition: OC = 166.5 meters, i.e., the UAV hovers at 166.5 meters.
The maximum flight altitude of the UAV is 500 meters. Taking into account the altitude that guarantees the UAV's safety, the "judgment interval" rectangular boundary, the camera's field of view and the UAV's own altitude limit, the hovering height during operation is finally set to 111 meters.
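As a quick check of the geometry above, the sketch below recomputes the circumscribed-circle radius and the hovering height from the judgment-interval half-lengths and the camera's half field of view; the numeric values are those assumed in the text, and the script itself is only an illustration.

```python
import math

X = 114.43                      # half-length of the judgment rectangle in range (m)
Z = 32.651                      # half-length of the judgment rectangle in deflection (m)
half_fov = math.radians(47.0)   # half of the 94-degree camera field of view

r = math.hypot(X, Z)            # radius of the circumscribed circle, about 119 m
h = r / math.tan(half_fov)      # hovering height OC, about 111 m

# Enlarged visible area: rectangle scaled by 1.5 in both directions
r_wide = math.hypot(1.5 * X, 1.5 * Z)
h_wide = r_wide / math.tan(half_fov)   # about 166.5 m

print(round(r), round(h), round(h_wide, 1))
```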
3、靶标提取定位与弹着水柱测量3. Target extraction and positioning and impact water column measurement
To evaluate the miss distance of naval-gun firing from video images, the target image information and the impact water-column image information must be extracted from the impact water-column images. The target used for naval gunfire at sea is a red spherical floating target three meters in diameter, whose color contrasts clearly with the ocean background in the image; the color region of interest can therefore be segmented directly from the RGB impact water-column image acquired by the UAV, i.e., the target can be extracted and located.
如果进行弹着水柱的提取和定位,则需要首先对输入的弹着水柱图像进行预处理以得到清晰的目标信息。本章首先阐述了靶标的提取定位方法、图像距离与实际距离比例关系,对弹着水柱图像进行了预处理,获得了更加清晰的弹着水柱信息。然后采用不同于提取靶标的阈值分割方法提取弹着水柱,实现了简单背景和复杂背景下的弹着水柱的自动提取与定位。In order to extract and locate the impact water column, it is necessary to preprocess the input impact water column image first to obtain clear target information. In this chapter, the extraction and positioning method of the target, the relationship between the image distance and the actual distance are first described, and the impact water column image is preprocessed to obtain clearer impact water column information. Then, a threshold segmentation method different from the extraction target is used to extract the impact water column, which realizes the automatic extraction and positioning of the impact water column under simple and complex backgrounds.
3.1靶标的提取定位与数据预处理3.1 Target extraction and positioning and data preprocessing
Image segmentation is an important step in image processing and directly determines the subsequent positioning accuracy. The target must first be extracted, separating it from the background of the image. Because the red floating target used in naval-gun firing differs clearly in color from the ocean background, and because the target color is consistent and contrasts strongly with the background, the target can be extracted by segmenting the region within the target's color range directly in the RGB image. The present invention therefore segments the image with a color-feature-based method.
The discrimination rule adopted for target extraction judges a pixel to be red when its R component among R, G and B is clearly not smaller than the other components, with a threshold controlling the strictness of this color condition. The most critical, and hardest to define strictly, part of the region-based segmentation used here is the RGB color decision: if the threshold is set too large, the admissible range of target colors shrinks and the tolerance for color distortion introduced by the camera decreases; if the threshold is set too small, the admissible color range grows, and the segmented target may expand or regions unrelated to the target may be segmented.
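A minimal sketch of this red-dominance rule is given below; the function name and the margin parameter are illustrative assumptions, not thresholds specified by the invention.

```python
import numpy as np

def segment_red_target(rgb, margin=40):
    """Boolean mask of pixels whose R channel clearly dominates G and B.

    rgb    : H x W x 3 uint8 image
    margin : assumed threshold controlling how strongly R must exceed G and B
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    return (r - g > margin) & (r - b > margin)
```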
随后需要进一步确定靶标的位置,确定了靶标的位置才能确定弹着点与靶标的相对位置关系,计算散布误差。经过图像分割得到的靶标信息是一个近似的圆,对靶标的定位就是获取靶标的圆心和半径。Subsequently, the position of the target needs to be further determined, and the relative positional relationship between the impact point and the target can be determined only after the position of the target is determined, and the dispersion error can be calculated. The target information obtained by image segmentation is an approximate circle, and the positioning of the target is to obtain the center and radius of the target.
圆检测是数字图像处理中非常重要的应用,对于圆或者类圆目标的圆心提取方法有很多,如三点定圆法、Hough变换法、曲线拟合法等。Circle detection is a very important application in digital image processing. There are many methods for extracting the center of a circle or a circle-like object, such as three-point circle method, Hough transform method, curve fitting method, etc.
(1)三点定圆法是一种外部区域边界表示方法,该算法只关注目标圆形区域的边界信息。在实际应用中该算法的劣势是:选取目标圆形区域边界上任意的三个点,任意的选取会导致计算的随机性较强,对计算结果影响很大。但其优势是:该算法可以对不完整、有部分缺损的目标圆形区域进行处理,并且该算法相对简单,易于编码实现,且计算量较小。(1) The three-point circle method is an external area boundary representation method, and the algorithm only pays attention to the boundary information of the target circular area. The disadvantage of this algorithm in practical application is that any three points on the boundary of the target circular area are selected, and the arbitrary selection will lead to strong randomness of the calculation, which has a great influence on the calculation result. But its advantage is that the algorithm can process incomplete and partially defective target circular areas, and the algorithm is relatively simple, easy to code and implement, and requires less computation.
(2)Hough变换法是一种聚类分类思想,将图像空间内具有一定关系的象元信息执行聚类操作,寻找能把这些象元用某种解析形式联系起来的参数空间累积对应点。该算法的优势是抗噪能力较强、运算精度较高。但其劣势是:需要存储的数据量较大,这导致算法的运算速度较慢。(2) Hough transform method is a kind of clustering classification idea, which performs clustering operation on the pixel information with certain relationship in the image space, and finds the parameter space accumulation corresponding point that can connect these pixels in a certain analytical form. The advantages of this algorithm are strong anti-noise ability and high calculation accuracy. But its disadvantage is: the amount of data that needs to be stored is large, which leads to a slower operation speed of the algorithm.
(3) The curve-fitting method is also an external boundary representation method, usually based on least squares to find the best-fitting point set and solved iteratively. Its advantage is higher accuracy: it takes into account every boundary pixel of the target circular region and avoids the overly flexible, hard-to-control point selection of the three-point circle method. Its disadvantage is that when the target circular region is large and has many boundary pixels, the computation becomes heavy, which hinders fast localization of the circle center and reduces execution efficiency.
Comparing the above methods for locating the center of circular or near-circular objects: after region-based segmentation the target occupies relatively few pixels and may be partially covered by waves. To meet the system's requirements on the speed and accuracy of target localization, the present invention therefore detects the target center with a least-squares curve-fitting algorithm.
找到一段圆弧后,通过圆的方程来计算相应的参数,圆的方程如式3.1。After a segment of arc is found, the corresponding parameters are calculated by the equation of the circle, and the equation of the circle is shown in Equation 3.1.
(x − x_c)² + (y − y_c)² = r²  (Equation 3.1)
where (x_c, y_c) is the center of the circle. Expanding gives Equation 3.2:
x² + y² + ax + by + c = 0  (Equation 3.2)
where a = −2x_c, b = −2y_c and c = x_c² + y_c² − r². The parameters a, b and c are computed by least squares, from which the center and radius of the circle follow:
x_c = −a/2, y_c = −b/2, r = √(x_c² + y_c² − c).
根据最小二乘法得出的靶标圆心坐标和半径对靶标进行拟合,所得半径大小与靶标的半径拟合情况较好。The target is fitted according to the coordinates and radius of the target center obtained by the least squares method, and the obtained radius fits well with the radius of the target.
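A small sketch of this least-squares fit, written in terms of Equation 3.2, is shown below; the function name and interface are illustrative assumptions.

```python
import numpy as np

def fit_circle_lsq(xs, ys):
    """Least-squares circle fit to boundary pixel coordinates.

    Solves x^2 + y^2 + a*x + b*y + c = 0 (Equation 3.2) for a, b, c,
    then recovers the center (xc, yc) and radius r.
    """
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    rhs = -(xs**2 + ys**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    xc, yc = -a / 2.0, -b / 2.0
    r = np.sqrt(xc**2 + yc**2 - c)
    return xc, yc, r
```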
After target extraction and localization, the acquired image information must be preprocessed before the impact water column is extracted. Image preprocessing is a basic step of image processing: the water-column video captured after firing is obtained through the acquisition system, and in order to measure the naval gun's miss distance the impact water column must be extracted. Preprocessing overcomes difficulties in the image, in particular cases where the impact water column differs little from the background, and makes extraction of the water-column target easier; the impact water-column images are therefore processed in advance.
图像预处理的过程主要包括图像灰度化,中值滤波进行噪声的消除和图像对比度增强等,为弹着水柱的识别做准备。The process of image preprocessing mainly includes image grayscale, median filtering for noise elimination and image contrast enhancement, etc., to prepare for the identification of the impacting water column.
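The following sketch illustrates one plausible form of this preprocessing chain (luminance grayscale conversion, 3×3 median filtering, contrast stretching); the specific operations and parameters are assumptions, not values fixed by the invention.

```python
import numpy as np
from scipy import ndimage

def preprocess(rgb):
    """Grayscale conversion, median filtering and a simple contrast stretch."""
    gray = rgb.astype(float) @ np.array([0.299, 0.587, 0.114])  # luminance grayscale
    gray = ndimage.median_filter(gray, size=3)                  # remove impulse noise
    lo, hi = gray.min(), gray.max()
    return (gray - lo) / (hi - lo + 1e-9)                       # stretch to [0, 1]
```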
3.2图像距离与实际距离的关系3.2 The relationship between image distance and actual distance
对炮弹的脱靶量进行评估,需要根据实际的距离计算弹着偏差量。根据系统的工作要求,光测设备从高空垂直向下拍摄,获得包括整个海上靶场区域、海上浮标弹着水柱的图像。基于透视几何和摄影测量的原理,对拍摄到的图像中的像素和实际距离的关系进行推导与计算。To evaluate the miss distance of the shell, it is necessary to calculate the impact deviation according to the actual distance. According to the working requirements of the system, the photometric equipment shoots vertically downwards from a high altitude to obtain images including the entire area of the marine shooting range and the water column of the marine buoys. Based on the principles of perspective geometry and photogrammetry, the relationship between the pixels in the captured image and the actual distance is derived and calculated.
The present invention computes according to the pinhole-camera imaging model, treating the camera's capture of the range area as a proportional scaling, and neglects the error that image distortion contributes to distance measurement, because the farther the image of the impact water column lies from the image edge, the less it is affected by lens distortion.
如图11所示,O为镜头中心,a为物距,它是光心O到物平面的距离,f为镜头焦距,它是光心O到像平面的距离。As shown in Figure 11, O is the center of the lens, a is the object distance, which is the distance from the optical center O to the object plane, and f is the focal length of the lens, which is the distance from the optical center O to the image plane.
已知实际靶标是一个直径三米的球体,可以作为非接触距离测量的参考物体,根据靶标在图像中所占用的像素个数可以求得每一个像素代表的实际长度,即比例关系e的大小。It is known that the actual target is a sphere with a diameter of three meters, which can be used as a reference object for non-contact distance measurement. According to the number of pixels occupied by the target in the image, the actual length represented by each pixel can be obtained, that is, the size of the proportional relationship e .
The distance between the impact water column and the target is computed by taking the Euclidean distance between the two target centers in the image and converting it with the ratio between pixel size and actual object length. The relationship between image distance and actual distance is therefore D = d·e, where d is the distance in pixels and e is the actual length represented by one pixel.
在之前的叙述中已经确定了靶标的圆心坐标和半径,靶标半径是14.1032个像素大小,靶标的实际半径为1.5米,因此e=0.1064米/像素。The center coordinates and radius of the target have been determined in the previous description, the target radius is 14.1032 pixels, and the actual radius of the target is 1.5 meters, so e=0.1064 meters/pixel.
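The scale factor and the pixel-to-meter conversion can be written as the short sketch below, using the target-radius values quoted above; the function name is an illustrative assumption.

```python
target_radius_m = 1.5        # actual target radius (3 m diameter sphere)
target_radius_px = 14.1032   # fitted target radius in pixels (Section 3.1)

e = target_radius_m / target_radius_px   # about 0.1064 m per pixel

def pixels_to_metres(distance_px, scale=e):
    """Convert an image distance in pixels to an actual distance in metres."""
    return distance_px * scale
```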
3.3弹着水柱自动提取与定位3.3 Automatic extraction and positioning of the impact water column
A simple background is the ideal, most favorable environment for recognizing the impact water column: ideally the image contains only an ocean background of a single gray level. If the sky appears in the background, its gray level is close to that of the water column and segmentation becomes more difficult. Under good weather with low wind the impact water column keeps an essentially cylindrical shape, so the impact point can be represented by the center of the water column; with good sea state there is little surf on the surface and little interference with target extraction. An impact water-column image obtained under such conditions is called a simple-background image, for which digital image processing is efficient and the error is small. The design of the automatic extraction and localization flow for the impact water column under a simple background is shown in Figure 12.
在舰炮对海射击时,炮弹入水后产生水柱,因此可以根据弹着水柱的位置确定弹着偏差。对弹着水柱的提取主要的从海洋背景中将目标分割出来,其次滤除海洋中浪花的影响,由于水柱与背景的对比度的差别,通过约定灰度阈值来分割目标与背景。When a naval gun is fired at the sea, a water column is generated after the projectile enters the water, so the impact deviation can be determined according to the position of the impact water column. The extraction of the impacting water column mainly separates the target from the ocean background, and then filters out the influence of the waves in the ocean. Due to the difference in the contrast between the water column and the background, the target and the background are segmented by a predetermined grayscale threshold.
Suppose the image f(x, y) consists of bright objects on a dark background, so that the gray levels of object and background pixels fall into two dominant modes. A common way to extract the objects from the background is to choose a threshold T that separates the two modes: any point (x, y) with f(x, y) > T is called an object point, and the remaining points are background points (and conversely for dark objects on a bright background). The thresholded (binary) image g(x, y) is defined by Equation 4.1:
g(x, y) = 1 if f(x, y) > T, and g(x, y) = 0 if f(x, y) ≤ T.  (Equation 4.1)
因此,阈值分割的最难点在于如何确定阈值T。阈值计算方法通常分为两种:全局阈值和基本自适应阈值。Therefore, the most difficult part of threshold segmentation is how to determine the threshold T. Threshold calculation methods are usually divided into two types: global threshold and basic adaptive threshold.
全局阈值是最常用的阈值计算方法,特别是当图像的灰度直方图分布呈现双峰时,全局阈值可以明显的将目标和背景分量,得到较为理想的图像分割效果。基本自适应阈值是一种比较基础的图像自适应分割方法,它一般以图像像素自身及其邻域灰度变化的特征为基础进行阈值分割,进而实现灰度图像的二值化。该方法充分考虑了每个像素邻域的特征,所以一般能更好地突出目标和背景的边界。The global threshold is the most commonly used threshold calculation method, especially when the gray histogram distribution of the image presents two peaks, the global threshold can obviously divide the target and background components to obtain a more ideal image segmentation effect. The basic adaptive threshold is a relatively basic image adaptive segmentation method. It generally performs threshold segmentation based on the characteristics of the image pixel itself and the grayscale change of its neighborhood, and then realizes the binarization of the grayscale image. This method fully considers the characteristics of each pixel neighborhood, so it can generally better highlight the boundaries of the target and the background.
The background of the impact water column is the sea, and the water column itself is white. Ideally the contrast between the impact water column and the background is large, and the maximum between-class variance (Otsu) method could separate the water column from the background. In practice, however, environmental effects during image acquisition, such as reflections from the sea surface, the shooting angle of the optical platform and the spray density of the water column, cause the gray-level histogram to be unimodal. Figure 13 shows the gray-level histogram of such an impact water column, which exhibits a single peak.
根据单峰灰度直方图所具有的特点,本发明采取的自适应阈值计算步骤如下:According to the characteristics of the single-peak grayscale histogram, the adaptive threshold calculation steps adopted by the present invention are as follows:
(1) Initial value. Find the minimum gray value Tmin and the maximum gray value Tmax of the impact water-column image, and take their mean as the initial threshold T = (Tmin + Tmax)/2.
(2)分割。根据阈值T对图像进行分割,得到两个像素集合分别为。(2) Segmentation. The image is segmented according to the threshold T, and two sets of pixels are obtained.
G1 = {f(x, y) ≥ T}, G2 = {f(x, y) < T}  (Equation 4.6)
(3)均值。计算像素集合G1和G2的灰度平均值μ1和μ2。(3) Mean. The grayscale average values μ 1 and μ 2 of the pixel sets G 1 and G 2 are calculated.
(4) Iteration. Compute a new threshold T = (μ1 + μ2)/2 from μ1 and μ2, and repeat steps (2) to (4) until the threshold T converges.
通过自适应和迭代的方法计算阈值对弹着水柱图像进行了分割。The water column images are segmented by calculating thresholds by adaptive and iterative methods.
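The iterative class-mean thresholding described in steps (1) to (4) can be sketched as follows; the convergence tolerance and iteration cap are illustrative assumptions.

```python
import numpy as np

def iterative_threshold(gray, eps=0.5, max_iter=100):
    """Adaptive threshold by iterated class means (steps (1)-(4) above)."""
    t = 0.5 * (float(gray.min()) + float(gray.max()))   # (1) initial threshold
    for _ in range(max_iter):
        g1 = gray[gray >= t]                             # (2) split into two pixel sets
        g2 = gray[gray < t]
        if g1.size == 0 or g2.size == 0:
            break
        t_new = 0.5 * (g1.mean() + g2.mean())            # (3)-(4) new threshold
        if abs(t_new - t) < eps:                         # stop when T converges
            break
        t = t_new
    return gray >= t                                     # binary image g(x, y)
```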
After binarizing the image with the above method, the impact water column is separated from the background reasonably well, but some interference noise remains. This noise consists of spurious small target points formed after binarization where background regions have gray levels close to that of the water column, and of noise inside the water-column region of the binary image caused by the column's uneven gray level. Such noise affects both the subsequent extraction of the impact water column and the localization of the impact point, so the binary image is filtered with mathematical morphological operations.
数学形态学理论不同于传统关于数值建模和分析的观点,主要利用结构元素来分析和探测图像。数学形态学是由一系列形态学中的代数运算所组成,包含4个基本运算:膨胀(Dilation)、腐蚀(Erosion)、开运算(Opening)、闭运算(Closing)。Mathematical morphology theory differs from the traditional view on numerical modeling and analysis, mainly using structural elements to analyze and detect images. Mathematical morphology consists of a series of algebraic operations in morphology, including four basic operations: Dilation, Erosion, Opening, and Closing.
膨胀操作可以将物体接触的所有背景点合并到该物体中并使目标物体边界向外部扩张。利用该操作,可以填补物体中的空间;腐蚀操作可以消除目标边界点并使目标边界向内部收缩,在图像中小且无意义的物体可以被该操作消除掉。二值图像的滤波通过两个阶段实现噪声的消除:第一阶段是对二值图像进行闭操作,目的是为了消除弹着水柱内的黑色黑点,使得弹着水柱区域更加完整。第二阶段是对图像进行开操作,目的是除去背景中的小干扰噪声和去除弹着水柱边缘的毛刺,平滑弹着水柱边缘。The expansion operation merges all background points that an object touches into the object and expands the bounds of the target object to the outside. Using this operation, the space in the object can be filled; the erosion operation can eliminate the target boundary points and shrink the target boundary inward, and the small and meaningless objects in the image can be eliminated by this operation. The filtering of the binary image realizes the elimination of noise through two stages: the first stage is to perform the closing operation on the binary image, the purpose is to eliminate the black dots in the impacting water column, so that the impacting water column area is more complete. The second stage is to open the image, the purpose is to remove the small interfering noise in the background and remove the burr on the edge of the water column, and smooth the edge of the water column.
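The two-stage morphological cleanup (closing first, then opening) might look like the sketch below; the 3×3 structuring element is an assumed choice rather than one prescribed by the invention.

```python
import numpy as np
from scipy import ndimage

def clean_binary(mask, se_size=3):
    """Close first (fill dark holes inside the water column), then open
    (remove small background speckle and smooth the column edge)."""
    se = np.ones((se_size, se_size), dtype=bool)
    closed = ndimage.binary_closing(mask, structure=se)
    return ndimage.binary_opening(closed, structure=se)
```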
After the impact water-column image is obtained, the center coordinates of the water column must be found. With the preceding image processing the interference noise has essentially been removed, so the water column can now be located. Because the impact water column is affected by gravity and wind, the image captured from altitude is not necessarily a regular circle; the present invention uses the centroid method, taking the centroid of the irregular region as the center of the impact water column. The centroid method represents the water column's position in the image by the horizontal and vertical coordinates of its centroid. For an image f(x, y), let x_k and y_k be the x and y coordinates of the k-th pixel and f(x_k, y_k) its gray value; with n pixels in total, the centroid (x0, y0) is given by
x0 = Σ x_k·f(x_k, y_k) / Σ f(x_k, y_k),  y0 = Σ y_k·f(x_k, y_k) / Σ f(x_k, y_k),
with the sums taken over all n pixels.
Since the gray values of all pixels in the image take part in this calculation, every pixel in regions unrelated to the target must also be processed, and for a region with an irregular contour the outermost pixels have the greatest influence on the computed centroid. The centroid algorithm can therefore be improved: first find the rows and columns of the edge pixels belonging to the target, then take their arithmetic mean as the centroid of the target image.
An edge is the set of pixels at which the surrounding gray level shows a step or roof change. Edge detection measures, detects and locates gray-level changes in the image, i.e., it segments the image at its discontinuities using the gradient. The derivative is usually approximated by gray-value differences over a small neighborhood of the image, for example the following 3×3 neighborhood of pixel values:
Z1 Z2 Z3
Z4 Z5 Z6
Z7 Z8 Z9
简单的边缘检测是利用一阶微分算子对原图像进行卷积运算,如Roberts算子、Sobel算子、Prewitt算子。Simple edge detection is to use first-order differential operators to perform convolution operations on the original image, such as Roberts operator, Sobel operator, and Prewitt operator.
(1)Roberts算子是一种最简单的算子,是一种利用局部差分算子寻找边缘的算子。(1) Roberts operator is one of the simplest operators, and it is an operator that uses local difference operators to find edges.
Its gradient components are approximated as
g_x = Z9 − Z5  (Equation 4.11)
g_y = Z8 − Z6  (Equation 4.12)
with the corresponding 2×2 convolution kernels
g_x:
[-1  0]
[ 0  1]
g_y:
[ 0 -1]
[ 1  0]
(2) The Sobel operator approximates the two gradient components as
g_x = (Z7 + 2Z8 + Z9) − (Z1 + 2Z2 + Z3)  (Equation 4.13)
g_y = (Z3 + 2Z6 + Z9) − (Z1 + 2Z4 + Z7)  (Equation 4.14)
with convolution kernels
g_x:
[-1 -2 -1]
[ 0  0  0]
[ 1  2  1]
g_y:
[-1  0  1]
[-2  0  2]
[-1  0  1]
(3)Prewitt算子(3) Prewitt operator
The Prewitt operator is computationally simpler than the Sobel operator, but the result may be slightly noisier. Its gradient components are approximated as
g_x = (Z7 + Z8 + Z9) − (Z1 + Z2 + Z3)  (Equation 4.15)
g_y = (Z3 + Z6 + Z9) − (Z1 + Z4 + Z7)  (Equation 4.16)
with convolution kernels
g_x:
[-1 -1 -1]
[ 0  0  0]
[ 1  1  1]
g_y:
[-1  0  1]
[-1  0  1]
[-1  0  1]
通过边缘检测,将目标边缘信息的提取出来,进行质心位置的求取。Through edge detection, the edge information of the target is extracted, and the position of the centroid is obtained.
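A sketch of the improved centroid computation from edge pixels, using a Sobel gradient on the binary water-column mask, is given below; the use of scipy's Sobel filter and the function interface are implementation assumptions.

```python
import numpy as np
from scipy import ndimage

def edge_centroid(mask):
    """Locate the water-column center from its edge pixels: Sobel gradient
    magnitude on the binary mask, then the arithmetic mean of the edge-pixel
    column and row coordinates (improved centroid method)."""
    img = mask.astype(float)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    edges = np.hypot(gx, gy) > 0
    rows, cols = np.nonzero(edges)
    if rows.size == 0:
        return None
    return cols.mean(), rows.mean()   # (x, y) center in pixel coordinates
```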
4脱靶量评估4 Off-target assessment
4.1散布误差的计算与误差分析4.1 Calculation and error analysis of dispersion error
Miss distance, also called impact deviation. The firing process of a weapon system is affected by various random factors that continually produce errors, causing the impact points to deviate from the target. When evaluating the miss distance, the dispersion error, i.e., the probable deviation or mean square deviation of the impact points about the dispersion center, is used to characterize firing density. When firing at a target on a horizontal plane, the projectile dispersion is characterized by the range probable deviation Ex and the deflection probable deviation Ez, or by the relative probable deviations Ex/x and Ez/x. The smaller the deviation of the impact (burst) points from the mean impact point or dispersion center, the smaller the weapon's projectile dispersion, i.e., the higher the firing density.
在进行散布误差计算前,考虑到图像坐标系可能与弹着散布坐标系存在不吻合的情况,需要对可能存在的坐标变换的问题进行讨论。Before calculating the dispersion error, considering that the image coordinate system may not match the projectile dispersion coordinate system, it is necessary to discuss the possible coordinate transformation.
CCD摄像机获取的图像信息经模数转换后变换成数字图像。数字图像的坐标表示为(u,v),是以像素为单位代表图像坐标值的。还需要另一个坐标系(x,y),用物理单位表示该像素在图像中的位置。该坐标系的原点为图像中的某一点O1,x轴平行于u轴,y轴平行于v轴。The image information obtained by the CCD camera is converted into a digital image after analog-to-digital conversion. The coordinates of a digital image are expressed as (u, v), which represent image coordinate values in pixel units. Another coordinate system (x, y) is also required, representing the position of this pixel in the image in physical units. The origin of the coordinate system is a certain point O 1 in the image, the x-axis is parallel to the u-axis, and the y-axis is parallel to the v-axis.
In the (x, y) coordinate system the origin O1 is usually at the center of the image, the intersection of the image plane with the camera's optical axis. If O1 lies at position (u0, v0) of the (u, v) coordinate system and each pixel has physical size Δx in the x direction and Δy in the y direction, the relationship between the two coordinate systems for any pixel in the image is given by Equations 4.17 and 4.18:
u = x/Δx + u0  (Equation 4.17)
v = y/Δy + v0  (Equation 4.18)
在本发明中需要建立一个以靶标为原点,u轴为x轴,v轴为y轴的图像坐标系O1-xy。In the present invention, it is necessary to establish an image coordinate system O 1 -xy with the target as the origin, the u-axis as the x-axis, and the v-axis as the y-axis.
下面推导计算弹着水柱坐标公式。The formula for calculating the water column coordinates of the bullet is deduced below.
As shown in Figure 15, in the image coordinate system O1-xy the center of the impact water column m has coordinates (x_m, y_m); in the water-column world coordinate system O′-X′Z′ (whose axes are not necessarily parallel to the range-dispersion and deflection-dispersion axes) the center of the impact water column M has coordinates (X′_M, Z′_M), where m is the image of M. e is the ratio between actual distance and image distance.
X′M=xm·e (式4.19)X′ M = x m ·e (Equation 4.19)
Z′M=ym·e (式4.20)Z′ M = y m ·e (Equation 4.20)
Because the target is far from the firing ship, the range-dispersion X axis is difficult to determine, so the photographed image coordinate system may not coincide with the impact-dispersion coordinate system; that is, the water-column coordinate system O′-X′Z′ is rotated relative to the dispersion coordinate system O-XZ, as shown in Figure 16. (X_M, Z_M) are the coordinates of the impact water column in the impact-dispersion coordinate system.
直角坐标系的旋转公式Rotation formula of Cartesian coordinate system
XM=X′M·cos(θ)+Z′M·sin(θ) (式4.21)X M =X′ M ·cos(θ)+Z′ M ·sin(θ) (Equation 4.21)
ZM=-X′M·sin(θ)+Z′M·cos(θ) (式4.22)Z M = -X′ M ·sin(θ)+Z′ M ·cos(θ) (Equation 4.22)
Θ is the rotation angle of the image coordinate system relative to the world (dispersion) coordinate system that the image reflects. This angle can be computed from the UAV heading, the firing ship's heading and the firing broadside angle, provided the x-axis of the image coordinate system is aligned with the UAV's heading. Θ is computed by Equation 4.23.
Θ=TC2-TC1-Q (式4.23)Θ=TC2-TC1-Q (Equation 4.23)
where TC1 and TC2 are the headings of the firing ship and of the UAV, respectively, and Q is the firing broadside angle. These parameters are available from the UAV and the firing ship.
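The mapping from image coordinates to the dispersion coordinate system (Equations 4.19 to 4.23) can be sketched as follows; the function interface and the use of degrees for the headings are assumptions made for illustration.

```python
import math

def image_to_dispersion(xm, ym, e, tc_ship_deg, tc_uav_deg, q_broadside_deg):
    """Map a water-column image position (pixels, target at the origin)
    into the impact-dispersion coordinate system."""
    Xp, Zp = xm * e, ym * e                                            # Eqs. 4.19, 4.20
    theta = math.radians(tc_uav_deg - tc_ship_deg - q_broadside_deg)   # Eq. 4.23
    X = Xp * math.cos(theta) + Zp * math.sin(theta)                    # Eq. 4.21
    Z = -Xp * math.sin(theta) + Zp * math.cos(theta)                   # Eq. 4.22
    return X, Z
```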
Through the extraction and localization of the target and of the impact water columns, the positions of the target and the water columns in the image are obtained, and an impact-dispersion coordinate system with the target at the origin is established. The dispersion error is computed from the water-column coordinates derived in the previous section, using Equations 4.24 and 4.25:
Ex = 0.6745 · √( Σ (Xi − X̄)² / (n − 1) )  (Equation 4.24)
Ez = 0.6745 · √( Σ (Zi − Z̄)² / (n − 1) )  (Equation 4.25)
where (Xi, Zi) are the actual coordinates of each impact water column, X̄ = (1/n) Σ Xi and Z̄ = (1/n) Σ Zi are the coordinates of the center of the group of impact water columns (the mean of all actual coordinates), 0.6745 is the probability coefficient, and n is the total number of projectiles.
According to the assessment criteria for naval-gun firing results in the firing tables, there is a standard value K for the impact-point dispersion error; comparing Ex and Ez with this standard value allows the firing result to be judged.
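A short sketch of this dispersion-error computation is given below; the sample normalization by n − 1 follows the common form of the probable-error formula and is an assumption about Equations 4.24 and 4.25, as is the function interface.

```python
import numpy as np

def dispersion_errors(X, Z, k=0.6745):
    """Probable errors Ex, Ez of the impact points about their mean center."""
    X = np.asarray(X, dtype=float)
    Z = np.asarray(Z, dtype=float)
    n = X.size
    Ex = k * np.sqrt(np.sum((X - X.mean()) ** 2) / (n - 1))
    Ez = k * np.sqrt(np.sum((Z - Z.mean()) ** 2) / (n - 1))
    return Ex, Ez
```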
完成了对散布误差的计算,根据误差理论的知识并结合整个系统设计过程,分析了误差的来源主要如下:The calculation of the dispersion error is completed. According to the knowledge of the error theory and the whole system design process, the main sources of the error are analyzed as follows:
(1)准备阶段:光电载荷图像X轴与无人机航向方向不平行不固定;飞行高度不合理,使得拍摄角度不佳。(1) Preparation stage: The X-axis of the photoelectric load image is not parallel to the UAV heading direction and is not fixed; the flying height is unreasonable, which makes the shooting angle poor.
(2) Shooting stage: when the UAV's photoelectric payload is capturing images, the long distance between the UAV and the control platform and the complex electromagnetic environment cause some deviation in its control. The payload itself also introduces errors: limited camera accuracy, lens distortion from the manufacturing process, and vibration induced by the airflow during shooting that is transmitted through the pod's mounting points to the optical system all degrade imaging quality. In addition, the image obtained by the payload cannot be an exact proportional scaling of the actual sea area, which introduces error when computing the dispersion distances of the impact water columns.
(3)图像传输阶段:获取的图像在无线传输过程中后受到图像传输系统性能的影响,传输后图像质量的好坏会给后面的图像处理中带来误差。(3) Image transmission stage: the acquired image is affected by the performance of the image transmission system after the wireless transmission process, and the quality of the image after transmission will bring errors to the subsequent image processing.
(4)图像处理阶段:图像的预处理工作的好坏会对后继的靶标和弹着水柱的提取精度产生影响,靶标和弹着水柱的定位精度直接关系弹着水柱散布距离的误差大小。(4) Image processing stage: The quality of the image preprocessing will affect the extraction accuracy of the subsequent target and the impact water column. The positioning accuracy of the target and the impact water column is directly related to the size of the error in the dispersion distance of the impact water column.
(5)计算阶段:视频图像与实际海区并非完全是等比例缩放获取的,而在由图上距离向实际距离转换时是按等比例计算会引起一定的误差。此外在对弹着水柱的散布距离进行计算的过程中对有效数字位数的确定和对数字的四舍五入都会对结果产生误差。(5) Calculation stage: The video image and the actual sea area are not completely obtained by proportional scaling, but when the distance on the map is converted to the actual distance, the calculation is proportional to the actual distance, which will cause certain errors. In addition, the determination of the number of significant digits and the rounding of the numbers in the calculation of the dispersion distance of the impact water column can cause errors in the results.
4.2基于MATLAB GUI的射击脱靶量评估软件设计4.2 The software design of shooting misses evaluation software based on MATLAB GUI
The rapid modeling and strong computing capability of MATLAB are used to carry out fast post-event analysis of the video collected during target inspection, including reading the naval-gun firing video, acquiring its information, extracting image sequences, playback, pause, and interactive measurement of impact-point distances. The graphical user interface (GUI) provided by MATLAB is used to build a visual, object-oriented operating interface for selecting, processing and displaying the video images and frame-by-frame pictures required for miss-distance evaluation, as well as for computing the dispersion error.
The system stores the acquired video of naval-gun firing as video frames. To determine the impact-point positions, images are selected from the video frames and the impact water column is processed and located; after all impact points have been located, the dispersion error is computed and the firing result is judged.
具体的步骤操作步骤如图3所示。The specific steps are shown in Figure 3.
在系统的设计中,为了提醒使用人员按照规范进行操作,减少操作错误动作,在软件设计时增加了关于系统介绍的部分,可以从软件的界面上查看操作流程。同时,若操作步骤与设计的流程不同,系统会发出提示信息。In the design of the system, in order to remind the user to operate according to the specification and reduce the wrong operation, a part about the system introduction has been added in the software design, and the operation process can be viewed from the software interface. At the same time, if the operation steps are different from the designed process, the system will issue a prompt message.
The GUI (Graphical User Interfaces) platform provided by MATLAB is used to implement the video-image-based naval-gun miss-distance evaluation system of the present invention. The platform offers a variety of control tools for interface design, allowing the required operating interface to be built conveniently and quickly in a friendly interactive way. Following the GUI implementation method, the work is divided into the following three steps.
(1)建立空白GUI。(1) Create a blank GUI.
(2)对系统模块进行布局和功能设计。(2) Layout and functional design of system modules.
完成系统所需界面设计后,对界面设计进行保存后,MATLAB GUI将生成2个与界面设计相关的文件。After completing the interface design required by the system, after saving the interface design, the MATLAB GUI will generate two files related to the interface design.
.Fig文件:该文件包括GUI的图像窗口和所有子对象的完全描述以及所有对象的属性值,子对象包括用户控件和坐标轴。.Fig file: This file includes the GUI's image window and a full description of all sub-objects and property values for all objects, including user controls and axes.
.M文件:它包含运行GUI需要的所有代码,可以控制GUI并决定GUI对用户的操作响应。.M file: It contains all the code needed to run the GUI, control the GUI and decide how the GUI responds to the user's actions.
(3)编译回调函数。(3) Compile the callback function.
用户可以在GUI设计生成的M文件框架内,编写GUI组件所需的回调函数。该M文件中含有一系列子函数:主函数、Opening函数、Output函数和回调函数,需要特别注意的是,主函数不能修改,否则将会导致GUI界面初始化的失败。The user can write the callback functions required by the GUI components within the framework of the M file generated by the GUI design. The M file contains a series of sub-functions: main function, Opening function, Output function and callback function. It should be noted that the main function cannot be modified, otherwise the GUI interface initialization will fail.
Test the system according to the operating steps: run the program, enter the miss-distance evaluation interface, and input the known rotation angle between the image coordinate system and the impact-dispersion coordinate system (ideally 0). The operating steps of the system can be viewed by clicking "System Introduction":
完成脱靶量评估系统的GUI实现,载入视频信息并获取视频信息和图像序列后,可以在当前路径中查看视频逐帧图像信息。After completing the GUI implementation of the missed target quantity assessment system, after loading the video information and obtaining the video information and image sequence, you can view the video frame-by-frame image information in the current path.
Select and load a clear image of the impact water column for processing and click the "impact water column extraction" button; the image display area shows the identified target. Select the position of the impact water column with the mouse cursor; a mark is shown in the display area, and the direction and distance deviation of the impact point relative to the target are computed and displayed. As shown in Figures 5-6, one impact water-column extraction operation is performed.
The image data used in the GUI implementation were cut from video obtained from the target-laying ship and reflect the range dispersion of the projectiles; the result is 61.99 m, which is close to the data obtained by traditional measurement and meets the requirements for measuring the miss distance of naval gunfire at sea.
Following the same procedure, the image that best shows the impact water column for each shot is selected in turn and the water column is extracted, until the information of all impact water columns has been obtained; finally, clicking "dispersion error calculation" gives the dispersion error of the group of shots, and the firing result is then judged from the dispersion-error data.
5. Evaluation of naval-gun miss distance plays an indispensable role for naval forces in the performance appraisal of naval-gun weapons and in military competition training. The present invention first introduces the composition and working principle of the UAV target-inspection system and the UAV target-inspection task area, and then focuses on evaluating the miss distance of naval-gun firing from video images: the UAV automatic target-inspection system acquires, from a fixed point at altitude, video images of the relative positions of the impact water columns and the target, transmits them to the digital processing module, extracts and locates the target and the impact water columns by digital image processing, computes the coordinates of the impact water columns, obtains the firing dispersion error to evaluate the firing result, and finally completes the GUI implementation of the miss-distance evaluation system.
在本发明的描述中,除非另有说明,“多个”的含义是两个或两个以上;术语“上”、“下”、“左”、“右”、“内”、“外”、“前端”、“后端”、“头部”、“尾部”等指示的方位或位置关系为基于附图所示的方位或位置关系,仅是为了便于描述本发明和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此不能理解为对本发明的限制。此外,术语“第一”、“第二”、“第三”等仅用于描述目的,而不能理解为指示或暗示相对重要性。In the description of the present invention, unless otherwise stated, "plurality" means two or more; the terms "upper", "lower", "left", "right", "inner", "outer" The orientation or positional relationship indicated by , "front end", "rear end", "head", "tail", etc. are based on the orientation or positional relationship shown in the accompanying drawings, and are only for the convenience of describing the present invention and simplifying the description, not An indication or implication that the referred device or element must have a particular orientation, be constructed and operate in a particular orientation, is not to be construed as a limitation of the invention. Furthermore, the terms "first," "second," "third," etc. are used for descriptive purposes only and should not be construed to indicate or imply relative importance.
应当注意,本发明的实施方式可以通过硬件、软件或者软件和硬件的结合来实现。硬件部分可以利用专用逻辑来实现;软件部分可以存储在存储器中,由适当的指令执行系统,例如微处理器或者专用设计硬件来执行。本领域的普通技术人员可以理解上述的设备和方法可以使用计算机可执行指令和/或包含在处理器控制代码中来实现,例如在诸如磁盘、CD或DVD-ROM的载体介质、诸如只读存储器(固件)的可编程的存储器或者诸如光学或电子信号载体的数据载体上提供了这样的代码。本发明的设备及其模块可以由诸如超大规模集成电路或门阵列、诸如逻辑芯片、晶体管等的半导体、或者诸如现场可编程门阵列、可编程逻辑设备等的可编程硬件设备的硬件电路实现,也可以用由各种类型的处理器执行的软件实现,也可以由上述硬件电路和软件的结合例如固件来实现。It should be noted that the embodiments of the present invention may be implemented by hardware, software, or a combination of software and hardware. The hardware portion may be implemented using special purpose logic; the software portion may be stored in memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those of ordinary skill in the art will appreciate that the apparatus and methods described above may be implemented using computer-executable instructions and/or embodied in processor control code, for example on a carrier medium such as a disk, CD or DVD-ROM, such as a read-only memory Such code is provided on a programmable memory (firmware) or a data carrier such as an optical or electronic signal carrier. The device of the present invention and its modules can be implemented by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, etc., or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., It can also be implemented by software executed by various types of processors, or by a combination of the above-mentioned hardware circuits and software, such as firmware.
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,凡在本发明的精神和原则之内所作的任何修改、等同替换和改进等,都应涵盖在本发明的保护范围之内。The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited to this. Any person skilled in the art is within the technical scope disclosed by the present invention, and all within the spirit and principle of the present invention Any modifications, equivalent replacements and improvements made within the scope of the present invention should be included within the protection scope of the present invention.