CN115993602A - Underwater target detection and positioning method based on forward-looking sonar

Info

Publication number
CN115993602A
CN115993602A (application CN202211414390.8A)
Authority
CN
China
Prior art keywords
target
image
src
sonar
underwater
Prior art date
Legal status
Pending
Application number
CN202211414390.8A
Other languages
Chinese (zh)
Inventor
汪韵怡
杜俭业
鲍永亮
Current Assignee
Aerospace Science and Industry Shenzhen Group Co Ltd
Original Assignee
Aerospace Science and Industry Shenzhen Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Aerospace Science and Industry Shenzhen Group Co Ltd filed Critical Aerospace Science and Industry Shenzhen Group Co Ltd
Priority to CN202211414390.8A
Publication of CN115993602A


Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30 - Assessment of water resources

Landscapes

  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention relates to an underwater target detection and positioning method based on forward-looking sonar, belonging to the technical field of underwater sonar detection and comprising the following steps: S1) mapping the multi-beam sonar echo signal intensity to a two-dimensional image; S2) preprocessing the two-dimensional echo intensity image generated in step S1; S3) performing underwater target recognition on the image processed in step S2; S4) performing target positioning; S5) detecting targets across the image sequence; S6) outputting the target information and the longitude and latitude (X2, Y2) of each real target, thereby realizing underwater target detection and positioning based on forward-looking sonar. Starting from the echo intensity image of the forward-looking sonar, the method extracts effective target feature information, enabling autonomous detection and identification of underwater targets and solving the problem of accurately positioning them.

Description

Underwater target detection and positioning method based on forward-looking sonar
Technical Field
The invention belongs to the technical field of underwater sonar detection, and particularly relates to an underwater target detection and positioning method based on forward-looking sonar.
Background
Modern naval warfare makes extensive use of underwater electronic countermeasures. The underwater environment is inherently very complex, with all kinds of natural random interference, and as underwater electronic countermeasure technology develops, targets can adopt a variety of artificial deception and jamming measures, so modern underwater weapon systems place ever higher demands on target detection and identification. Research on underwater target recognition technology is therefore essential, and it is of great significance for improving combat capability in complex underwater environments.
At present, sonar is the most widely used equipment for underwater target identification, and new sonar technologies are being developed vigorously in many countries, providing the technical support needed to find underwater targets promptly and accurately. Target identification is an important link in sonar post-processing: a sonar transmits pulsed acoustic signals and judges the class attributes of a target from the characteristics of the received echoes. Underwater target recognition uses various parameters of the target information as feature quantities; for example, the reflected echo is a function of target type, range, bearing, and so on, and carries characteristic information such as echo broadening, amplitude, phase, reflection coefficient, target scale, and energy spectrum. Target recognition plays two main roles. First, rejecting false alarms and identifying true targets, such as distinguishing man-made objects from natural "decoys" like fish shoals and corals. Second, classifying multiple targets: when several targets are found, they can be classified and attacked selectively. With the rapid development of technology, unattended underwater weapons urgently need automatic underwater target identification and positioning systems, and as these systems mature, the volume of data to be analyzed grows significantly. Developing underwater target identification and positioning technology has therefore become particularly important.
In the prior art, sonar image processing screens candidate target areas by thresholding, but the threshold is closely tied to the underwater environment; with inadequate parameter settings, noise cannot be filtered effectively, easily causing false detections and a high false alarm rate. Deep-learning neural network models have also been used to detect underwater targets, but sonar-based underwater target identification and positioning has developed slowly and must operate against an extremely complex underwater background, where the influence of environmental interference on an underwater weapon system and on target identification cannot be ignored. The main interference factors are numerous and complex propagation channels, reverberation, the highly complex radiated sound field of the target and the scattered sound field it excites, and ambient noise.
Disclosure of Invention
The invention aims to provide an underwater target detection and positioning method based on forward-looking sonar that extracts effective target feature information from the forward-looking sonar echo intensity image, so that autonomous detection and identification of underwater targets can be realized and the problem of accurately positioning them can be solved.
The specific technical scheme of the invention is an underwater target detection and positioning method based on forward-looking sonar, characterized by comprising the following steps:
S1) mapping the multi-beam sonar echo signal intensity to a two-dimensional image;
S2) preprocessing the two-dimensional echo intensity image generated in step S1, the preprocessing comprising converting the sonar two-dimensional signal intensity image from a two-dimensional polar coordinate system to a Cartesian coordinate system, filling the converted image, and then filtering, enhancing, and pseudo-color processing the filled image;
S3) performing underwater target recognition on the images processed in step S2 and assigning corresponding target information to the targets recognized in each frame, the target information including the target center point pixel value (x, y);
S4) performing target positioning: reading the target center point pixel value (x, y) from the target information, converting it into a range-azimuth value (r, θ) in the polar coordinate system, and calculating the target longitude and latitude (X2, Y2) from the sonar longitude and latitude (X1, Y1) known from the underwater acoustic positioning instrument and the target's range and azimuth (r, θ) relative to the sonar;
S5) detecting targets across the image sequence: given n frames of two-dimensional echo intensity images so far, comparing the distance between each target identified in the current frame n and all targets identified in frames 1 to n−1; if the distance is smaller than a set distance threshold L, judging them to be the same target and updating the hit count, the hit count being the number of times a target has been judged to be the same target, otherwise judging them to be different targets; then setting a hit-count threshold P and screening out, from the identified targets, those whose hit count is greater than P, which are judged to be real targets;
S6) outputting the target information and longitude and latitude (X2, Y2) of each real target, thereby realizing underwater target detection and positioning based on forward-looking sonar.
Furthermore, the image converted into the Cartesian coordinate system in step S2 is filled using a bilinear interpolation algorithm, with the following specific steps:
S221) let the image converted into the Cartesian coordinate system be the original image Src, with width src_w and height src_h, and let the output image be the target image Dst, with width dst_w and height dst_h; for any point Psrc (src_x, src_y) in the original image Src corresponding to Pdst (dst_x, dst_y) in the target image Dst, the pixel position Pdst (dst_x, dst_y) in Dst is first mapped back to the position Psrc (src_x, src_y) in Src by the following formula (IV):
src_x = dst_x · (src_w / dst_w), src_y = dst_y · (src_h / dst_h)    (IV)
S222) obtaining the neighboring points of Psrc (src_x, src_y) in the original image Src from the coordinate relation of the diagonal four-neighborhood: Q11, Q12, Q21, Q22;
a pixel point, i.e. Psrc (src_x, src_y), has 4 diagonally adjacent pixels, and this group of pixels is referred to as the 4-neighborhood of Psrc (src_x, src_y);
S223) performing one-dimensional linear interpolation along the X and Y directions with the obtained neighboring points to complete the bilinear interpolation: first, one-dimensional linear interpolation is performed along the X direction on Q11, Q21 and on Q12, Q22, yielding R1 (src_x, y1) and R2 (src_x, y2); then one-dimensional linear interpolation is performed along the Y direction using R1 and R2, yielding the final bilinear interpolation result f(Pdst), calculated by the following formula (V):
f(R1) = ((x2 − src_x) / (x2 − x1)) · f(Q11) + ((src_x − x1) / (x2 − x1)) · f(Q21)
f(R2) = ((x2 − src_x) / (x2 − x1)) · f(Q12) + ((src_x − x1) / (x2 − x1)) · f(Q22)
f(Pdst) = ((y2 − src_y) / (y2 − y1)) · f(R1) + ((src_y − y1) / (y2 − y1)) · f(R2)    (V)
wherein f is the gray value of the pixel point.
Still further, the image enhancement in step S2 uses a dynamic brightness allocation method, enhancing the contrast between target and background as well as the image brightness by optimizing the mapping function, which maps according to the following formula (VI):
z_out = 255 · ((z_in − L) / (H − L))^γ    (VI)
wherein: l is the minimum value of the echo intensity of the image; h is the maximum value of the echo intensity, zin is the image input intensity value, zout is the image output intensity value, in order to avoid the generation of isolated single peaks, the average of the first 1% of the retrieved wave intensity is the maximum value; gamma is a mapping parameter, gamma=1 generates linear transformation, gamma <1 enhances the overall brightness of the image, and gamma >1 reduces the overall brightness of the image.
Further, the specific steps of the target positioning in step S4 are as follows:
S41) target coordinate conversion: the identified target center point pixel value (x, y) is read and converted into a range-azimuth value (r, θ) in the polar coordinate system. Let the center point pixel value of target P be (n, m); the coordinate conversion is given by the following formula (VIII):
r = m · c · T / 2,  θ = n · (α / N) − α / 2    (VIII)
wherein: α is the horizontal field of view of the multi-beam forward-looking sonar; N is the number of beams; M is the number of range samples; T is the time sampling interval; c is the sound speed in water; r is the range of the target; θ is the beam angle.
S42) target longitude and latitude calculation: from the sonar longitude and latitude (X1, Y1) known from the underwater acoustic positioning instrument and the target's range and azimuth (r, θ) relative to the sonar, the target longitude and latitude (X2, Y2) are calculated by the following formulas (IX) and (X):
ΔX = (180 / π) · r · sin θ / (R · cos Y1),  X2 = X1 + ΔX    (IX)
ΔY = (180 / π) · r · cos θ / R,  Y2 = Y1 + ΔY    (X)
wherein: r is the detection range of the target; R is the Earth radius; θ is the azimuth of the target; ΔX is the longitude difference between the two points; ΔY is the latitude difference between the two points.
The invention has the following beneficial effects. 1) A single processing pipeline covers the complete sonar image workflow of image generation, preprocessing, underwater target identification, underwater target positioning, and image-sequence target detection; effective target feature information is extracted from the forward-looking sonar echo intensity image, so autonomous detection and identification of underwater targets can be realized and the problem of accurately positioning them can be solved. 2) The two-dimensional echo intensity image is effectively preprocessed: the sonar two-dimensional signal intensity image is converted from a two-dimensional polar coordinate system to a Cartesian coordinate system, the converted image is filled with a bilinear interpolation algorithm, and the filled image is enhanced by filtering, pseudo-color processing, and dynamic brightness allocation, which effectively improves the target recognition rate of the neural network; the neural network model processes about 40 FPS, so both real-time performance and accuracy meet the requirements of an underwater target detection and positioning system. 3) The image-sequence detection step further improves the accuracy of underwater target identification and reduces the false alarm and false detection rates.
The method provides effective technical support and a basis for detection and perception against complex underwater noise backgrounds; the feature extraction and recognition technology for underwater targets thus has very important theoretical significance and engineering application value in both the military and civilian fields.
Drawings
FIG. 1 is a flow chart of a forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 2 is a flow chart of multi-beam sonar image mapping of the forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 3 is a flow chart of sonar image preprocessing of the forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 4 is a flow chart of the underwater target identification of the forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 5 is a flow chart of the underwater target positioning method based on forward looking sonar of the present invention;
FIG. 6 is a flow chart of an image sequence detection target of the forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 7 is a schematic diagram of coordinate transformation used in the forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 8 is a schematic diagram of a bilinear interpolation method used in the forward-looking sonar-based underwater target detection and localization method of the present invention;
FIG. 9 is a raw two-dimensional echo intensity image generated in one embodiment of a forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 10 is a pre-processed image of an original two-dimensional echo intensity image in one embodiment of a forward looking sonar-based underwater target detection and localization method of the present invention;
FIG. 11 shows a result of identifying and detecting a preprocessed two-dimensional echo intensity image in an embodiment of a forward-looking sonar-based underwater target detection and localization method according to the present invention;
Detailed Description
The technical scheme of the invention is further described below with reference to the attached drawings.
As shown in FIGS. 1-6, the underwater target detection and positioning method based on forward-looking sonar provided by the invention comprises the following steps:
s1) mapping the multi-beam sonar echo signal intensity to a two-dimensional image.
S11) reading sonar parameters such as the number of beams N, the number of range samples M, and the data-block echo intensities.
The echo intensities reflect the change in acoustic energy produced by the interaction of each beam with its projection point on the seabed. An echo intensity measurement is a sequence of backscattered intensities, and the number of time-series samples per unit time is several to tens of times the number of sounding samples. For a continuous wave (CW) pulse, each time sample corresponds to the intersection, formed during (t, t + dt) with dt smaller than the pulse length, between the spherical transmit beam pattern and the annular receive beam pattern. Unlike a depth measurement, an echo intensity sample measures the circular footprint around the beam projection point enclosed by multiple beam patterns. The working principle is as follows: after each ping of the sonar scanning system, a group of echo-intensity time-series observations is formed along the intersection line of the fan and the seabed, and repeated measurements yield the echo intensities at different positions in the survey area.
S12) filtering the data-block echo intensities.
The intensity within the acoustic transmission region can be described by gray levels, exhibiting variations in brightness; these reflect both the echo intensity level and the attributes of the underwater target. To obtain the true echo intensity, the various factors influencing it must first be considered, and methods such as filtering are used to compensate for seabed noise. Because the volume of acoustic data is large, a simple filtering algorithm is usually adopted; this method uses moving average filtering, modeled by the following formula (I):
BS_i = (1/N) · Σ_j BS_j,  summed over the N samples in the selected window    (I)
wherein: n is the sampling number of sound intensity in the selected window; BSi is the ith smoothed object; BSj is the intra-window intensity sample.
S13) generating the two-dimensional echo intensity image from the number of beams N, the number of range samples M, and the echo intensity of each sampling point: the echo intensity values are mapped onto a two-dimensional coordinate system whose abscissa is the beam index N and whose ordinate is the sample index M, producing the original two-dimensional echo intensity image.
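A hedged sketch of this mapping, assuming the echoes arrive as an (N beams × M samples) array and that a simple min-max normalization to 8-bit gray is acceptable:

```python
import numpy as np

def build_echo_image(echoes: np.ndarray) -> np.ndarray:
    """Map per-beam echo intensities into an M x N gray image.

    echoes : array of shape (N_beams, M_samples) read from the sonar.
    Returns a uint8 image whose columns are beams and rows are range samples.
    """
    img = echoes.T.astype(np.float32)   # rows = sample index m, cols = beam n
    img -= img.min()
    img /= max(img.max(), 1e-6)         # normalise to [0, 1]
    return (img * 255).astype(np.uint8)
```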
S2) preprocessing the two-dimensional echo intensity image generated in the step S1.
S21) converting the sonar two-dimensional signal intensity image from a two-dimensional polar coordinate system to a Cartesian coordinate system.
As shown in FIG. 7, the left diagram has the beam index N as abscissa and the sample index M as ordinate, each square representing an echo point; to better represent the target characteristics and their distribution in the sector scan area of the forward-looking sonar, the echo intensity data in two-dimensional polar coordinates must be converted into a Cartesian coordinate system, as shown in the right diagram of FIG. 7.
Let the horizontal field of view of the multi-beam forward-looking sonar be α, the number of beams N, the number of range samples M, the time sampling interval T, and the underwater sound speed c; assume the two-dimensional visible area of the sonar is symmetric about the y-axis and that the origins of the polar and Cartesian coordinate systems coincide.
For a point A located at row m, column n of the sonar imaging matrix, its coordinates in the polar coordinate system are given by formula (II):
r = m · c · T / 2,  θ = n · (α / N) − α / 2    (II)
and its coordinates in the Cartesian coordinate system by formula (III):
x = r · sin θ,  y = r · cos θ    (III)
by the above conversion of the coordinate system, the intensity of the echo point (r, θ) in the polar coordinate system is converted into the intensity value (x, y) in the cartesian coordinate system to be expressed.
S22) filling by bilinear interpolation. Comparing the left and right diagrams of FIG. 7 shows that, along the angular direction, the image is compressed near the fan center and stretched far from it. This distortion leaves "holes" in the image (not shown in the figure), which must be filled by data interpolation. Common interpolation algorithms include nearest-neighbor, bilinear, and bicubic interpolation; their filling effects are similar, but bilinear interpolation is more efficient, so it is adopted to fill the sonar image.
Bilinear interpolation also has a smoothing effect on the image. The specific filling steps are as follows:
S221) let the point in the original image Src corresponding to Pdst (dst_x, dst_y) in the target image Dst be Psrc (src_x, src_y), the width and height of the original image be src_w and src_h, and the width and height of the target image be dst_w and dst_h; each pixel position Pdst (dst_x, dst_y) in the target image Dst is first mapped back to the position Psrc (src_x, src_y) in the original image Src by formula (IV):
src_x = dst_x · (src_w / dst_w), src_y = dst_y · (src_h / dst_h)    (IV)
S222) obtaining the neighboring points of Psrc (src_x, src_y) in the original image Src from the coordinate relation of the diagonal four-neighborhood: Q11, Q12, Q21, Q22;
a pixel point, i.e. Psrc (src_x, src_y), has 4 diagonally adjacent pixels; this group of pixels is called the 4-neighborhood of Psrc, and the coordinate relationship is shown in the left diagram of FIG. 8.
S223) as shown in the right diagram of FIG. 8, the bilinear interpolation is completed by one-dimensional linear interpolation along the X and Y directions using the obtained neighboring points: first, one-dimensional linear interpolation is performed along the X direction on Q11, Q21 and on Q12, Q22, yielding R1 (src_x, y1) and R2 (src_x, y2); then one-dimensional linear interpolation is performed along the Y direction using R1 and R2, yielding the final bilinear interpolation result f(Pdst). The specific calculation is formula (V):
f(R1) = ((x2 − src_x) / (x2 − x1)) · f(Q11) + ((src_x − x1) / (x2 − x1)) · f(Q21)
f(R2) = ((x2 − src_x) / (x2 − x1)) · f(Q12) + ((src_x − x1) / (x2 − x1)) · f(Q22)
f(Pdst) = ((y2 − src_y) / (y2 − y1)) · f(R1) + ((src_y − y1) / (y2 − y1)) · f(R2)    (V)
wherein f is the gray value of the pixel point.
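A direct, loop-based sketch of this fill, combining the back-mapping of formula (IV) with the two one-dimensional interpolations of formula (V); the edge clamping and the (src_w − 1)/(dst_w − 1) scaling are implementation choices assumed here:

```python
import numpy as np

def bilinear_resize(src: np.ndarray, dst_h: int, dst_w: int) -> np.ndarray:
    """Fill a dst_h x dst_w grid by bilinear interpolation from src."""
    src_h, src_w = src.shape
    dst = np.zeros((dst_h, dst_w), dtype=np.float32)
    for dst_y in range(dst_h):
        for dst_x in range(dst_w):
            # Map the destination pixel Pdst back into the source image (IV)
            src_x = dst_x * (src_w - 1) / max(dst_w - 1, 1)
            src_y = dst_y * (src_h - 1) / max(dst_h - 1, 1)
            x1, y1 = int(src_x), int(src_y)
            x2 = min(x1 + 1, src_w - 1)
            y2 = min(y1 + 1, src_h - 1)
            dx, dy = src_x - x1, src_y - y1
            # 1-D interpolation along X (giving R1, R2), then along Y (V)
            r1 = (1 - dx) * src[y1, x1] + dx * src[y1, x2]
            r2 = (1 - dx) * src[y2, x1] + dx * src[y2, x2]
            dst[dst_y, dst_x] = (1 - dy) * r1 + dy * r2
    return dst.astype(src.dtype)
```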
The sonar image is then passed through filtering and denoising and through preprocessing algorithms such as image enhancement and pseudo-color mapping. The purpose is to facilitate both model training for subsequent underwater target detection and human observation at the display and control terminal: compared with the raw sonar echo intensity image, the preprocessed image is easier for the human eye to interpret, which facilitates labeling of data samples.
S23) noise in sonar images is mostly salt-and-pepper noise caused by seabed reflections, moisture, or floating matter. Median filtering handles salt-and-pepper noise well: it smooths the image while limiting the loss of image information, and it is simple and effective.
Median filtering is a mature nonlinear smoothing technique that sets the gray value of each pixel to the median of the gray values of all pixels in a neighborhood window around that point. The basic principle is to replace the value of a point in a digital image or sequence with the median of the values in its neighborhood, letting the surrounding pixel values approach the true value and thereby eliminating isolated noise points. Using a two-dimensional sliding template of some structure, the pixels in the template are sorted by value, generating a monotonically rising (or falling) two-dimensional data sequence. The output of the two-dimensional median filter is g(x, y) = med{ f(x − k, y − l), (k, l) ∈ W }, where f(x, y) is the input image, g(x, y) the processed image, and W a two-dimensional template, typically a 3×3 or 5×5 region.
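In practice this is a one-call operation in OpenCV; the synthetic noisy frame below is only a stand-in for a real sonar image:

```python
import numpy as np
import cv2

# Build a placeholder frame corrupted with salt-and-pepper noise
img = np.full((200, 200), 128, dtype=np.uint8)
mask = np.random.rand(200, 200)
img[mask < 0.02] = 0            # pepper
img[mask > 0.98] = 255          # salt

# ksize=3 applies a 3x3 template W; ksize=5 would apply a 5x5 template
denoised = cv2.medianBlur(img, 3)
```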
S24) optimizing the image with an image enhancement algorithm to improve its visual clarity. The human eye is more sensitive to images rich in detail texture than to smooth, featureless ones, and noise is less visible in detailed regions than in smooth ones. If detail textures are enhanced to a much greater extent than smooth regions, an overall improvement in the visual effect of the image is achieved.
The invention enhances the image with a dynamic brightness allocation method. Dynamic brightness allocation maps the echo intensities onto the linear gray scale [0, 255], enhancing the contrast between target and background and the image brightness by optimizing the mapping function, which maps according to the following formula (VI):
z_out = 255 · ((z_in − L) / (H − L))^γ    (VI)
wherein: l is the minimum value of the echo intensity of the image; h is the maximum of the echo intensities (to avoid isolated single peaks, the average of the first 1% of the retrieved wave intensities is the maximum); z in Is the image input intensity value, z out Is the image output intensity value; gamma is the mapping parameter, gamma=1 produces a linear transformation, gamma<1 enhancing the overall brightness of the image, gamma>1 reduces the overall brightness of the image.
S25) pseudo-color processing converts the gray image into a color image according to a mapping rule. Multi-beam sonar imaging generally yields a gray image, but since the human eye is more sensitive to color, the image is mapped to a color image when displayed at the display and control terminal; the color image is formed by mapping gray values to intensity values of the R, G, B channels according to a color mapping table.
Let the pixel value of a point in the gray image be f(x, y), and let the three channel intensities of the mapped pseudo-color image be R(x, y), G(x, y), B(x, y), which together represent one pixel of the color image. This method uses the jet mapping, which runs from blue to red. Writing v = f(x, y)/255 and clamp(t, 0, 1) for limiting t to [0, 1], a standard piecewise-linear form of the jet mapping is:
R(x, y) = 255 · clamp(min(4v − 1.5, −4v + 4.5), 0, 1)
G(x, y) = 255 · clamp(min(4v − 0.5, −4v + 3.5), 0, 1)
B(x, y) = 255 · clamp(min(4v + 0.5, −4v + 2.5), 0, 1)
wherein f(x, y) is the gray value of the pixel (x, y), and R(x, y), G(x, y), B(x, y) are the three primary color components of the color image.
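OpenCV ships a jet-style lookup table, so the mapping can also be done in one call rather than with explicit piecewise formulas; the random frame below is only a placeholder for a real gray sonar image:

```python
import numpy as np
import cv2

gray = np.random.randint(0, 256, (400, 600), dtype=np.uint8)  # placeholder frame
# Gray 0 maps toward blue and gray 255 toward red, as in the jet mapping
color = cv2.applyColorMap(gray, cv2.COLORMAP_JET)             # BGR color image
```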
S3) underwater target identification
A frame of the preprocessed two-dimensional echo intensity image is read and the underwater target is detected with a deep-learning neural network model; the general-purpose YOLOv5 detection model from the prior art is adopted. It requires no extraction of candidate target regions, performing regression training directly on the whole image, which makes it fast enough to meet the real-time requirement of underwater target identification. After the preprocessed two-dimensional echo intensity image passes through the model, each recognized target is assigned target information including the target ID, the target center point pixel value (x, y), the target size (w, h), and the target classification.
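A hedged sketch of running such a detector through the public YOLOv5 torch.hub interface; the checkpoint name sonar_yolov5.pt and the input file name are placeholders for the trained weights and a preprocessed frame:

```python
import torch

# Load a YOLOv5 model fine-tuned on labelled sonar images (assumed checkpoint)
model = torch.hub.load("ultralytics/yolov5", "custom", path="sonar_yolov5.pt")
results = model("preprocessed_frame.png")

# Each row of results.xyxy[0] is [x1, y1, x2, y2, confidence, class]
for x1, y1, x2, y2, conf, cls in results.xyxy[0].tolist():
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2   # target centre point (x, y)
    w, h = x2 - x1, y2 - y1                 # target size (w, h)
    print(f"class={int(cls)} conf={conf:.2f} "
          f"centre=({cx:.0f},{cy:.0f}) size=({w:.0f},{h:.0f})")
```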
S4) underwater target positioning
S41) target coordinate conversion: the identified target center point pixel value (x, y) is read and converted into a (r, θ) distance-azimuth value in a polar coordinate system. Setting a target P center point pixel value as (n, m), and according to a coordinate conversion formula, the following formula (VIII):
r = m · c · T / 2,  θ = n · (α / N) − α / 2    (VIII)
wherein: α is the horizontal field of view of the multi-beam forward-looking sonar; N is the number of beams; M is the number of range samples; T is the time sampling interval; c is the sound speed in water; r is the range of the target; θ is the beam angle.
S42) target longitude and latitude calculation: from the sonar longitude and latitude (X1, Y1) known from the underwater acoustic positioning instrument and the target's range and azimuth (r, θ) relative to the sonar, the target longitude and latitude (X2, Y2) are calculated by the following formulas (IX) and (X):
ΔX = (180 / π) · r · sin θ / (R · cos Y1),  X2 = X1 + ΔX    (IX)
ΔY = (180 / π) · r · cos θ / R,  Y2 = Y1 + ΔY    (X)
wherein: r is the detection range of the target; R is the Earth radius; θ is the azimuth of the target; ΔX is the longitude difference between the two points; ΔY is the latitude difference between the two points.
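Steps S41 and S42 together can be sketched as below, assuming the sonar sits at the bottom-centre pixel of the Cartesian image, range grows upward at range_per_px metres per pixel, and θ is referenced to the fan axis; the mean Earth radius value and the small-offset approximation follow formulas (IX) and (X):

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius in metres (assumed value)

def locate_target(cx, cy, img_w, img_h, range_per_px, lon1, lat1):
    """Pixel centre (cx, cy) -> range/azimuth (r, theta) -> target (X2, Y2)."""
    dx = cx - img_w / 2.0                  # offset from the fan axis, pixels
    dy = img_h - cy                        # pixels from the sonar origin
    r = math.hypot(dx, dy) * range_per_px  # target range, metres
    theta = math.atan2(dx, dy)             # azimuth from the fan axis, radians
    # Small-offset approximation of formulas (IX) and (X)
    dlat = math.degrees(r * math.cos(theta) / R_EARTH)
    dlon = math.degrees(r * math.sin(theta) /
                        (R_EARTH * math.cos(math.radians(lat1))))
    return lon1 + dlon, lat1 + dlat

# e.g. a detection at pixel (520, 300) in a 1024x800 image, 0.25 m per pixel
print(locate_target(520, 300, 1024, 800, 0.25, 121.6656, 38.8596))
```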
S5) image sequence target detection: the purpose of this step is to reduce the false alarm, false detection, and missed detection rates. By setting a hit-count threshold, real underwater targets are screened out and the influence of image noise, water vapor, seabed noise, and the like is filtered away. The procedure is as follows (a bookkeeping sketch follows the steps below):
S51) reading the target information obtained in the step S3), and storing the target information into an identification target queue targets;
s52) traversing the Shi Shengna queue, calculating a distance between the historically stored target and the current lot target;
s53) judging whether the target distance between two points is smaller than a set distance threshold L (specific numerical values can be set according to different water conditions), if so, considering the two points as the same target, and updating the hit times and the target ID, wherein the hit times are the times of judging the same target; if the target storage queue information is larger than L, the target storage queue information is regarded as different targets, and the target storage queue information is updated;
s54) setting a hit number threshold value P, screening targets with hit numbers larger than P from the recognition target queues targets, and judging the targets as real targets.
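A minimal sketch of this bookkeeping, with a list of track dictionaries standing in for the recognition target queue; the field names and the nearest-match policy are illustrative assumptions:

```python
import math

def update_tracks(tracks, detections, dist_thresh=20.0):
    """Associate current-frame detections with historical targets.

    tracks     : list of dicts {"id", "x", "y", "hits"}
    detections : list of (x, y) centre pixels from the current frame
    """
    next_id = max((t["id"] for t in tracks), default=-1) + 1
    for x, y in detections:
        match = next((t for t in tracks
                      if math.hypot(t["x"] - x, t["y"] - y) < dist_thresh), None)
        if match:                         # same target: update the hit count
            match["hits"] += 1
            match["x"], match["y"] = x, y
        else:                             # different target: open a new track
            tracks.append({"id": next_id, "x": x, "y": y, "hits": 1})
            next_id += 1
    return tracks

def real_targets(tracks, hit_thresh=10):
    """Screen out targets whose hit count exceeds the threshold P."""
    return [t for t in tracks if t["hits"] > hit_thresh]
```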
S6) outputting, for each real target, information such as the target ID, the target center point pixel value (x, y), the target size (w, h), the target classification, and the longitude and latitude, thereby realizing underwater target detection and positioning based on forward-looking sonar.
A specific embodiment of the forward-looking sonar based underwater target detection and positioning method of the invention is as follows. A 700D multi-beam forward-looking sonar was used to detect static underwater targets, and part of the data collected by the detection system in a test water area was used for model training and algorithm evaluation. The forward-looking sonar two-dimensional raw echo image shown in FIG. 9 was obtained through the image mapping of step S1, and the image preprocessing of step S2 produced the pseudo-color image shown in FIG. 10. This series of raw and pseudo-color images served as sample data for training the YOLOv5 model: about 1400 sonar pictures containing targets were collected, 1000 of which were labeled with Labelimg to form the training data set, while the remaining 400 formed 100 groups of test data sets. Training and testing ran on a server equipped with an NVIDIA 3080 Ti graphics card. Step S3 then produced the example sonar-image detection result shown in FIG. 11.
The test data were divided into 100 groups of data sets containing two types of small targets, bottom mines and anchor mines, with 4 pictures per group: 400 pictures in total containing 481 targets, of which 215 were bottom-mine test samples and 266 anchor-mine test samples. The trained model was used for inference, and the test time and target position information were recorded. The average test accuracy was 0.8725; the target identification test results are shown in the following table:
TABLE 1 target recognition test results
(The data of Table 1 are provided as an image in the original publication.)
Target positioning test: the test analyzed the target position deviation over the 100 groups of test data sets (4 pictures per group, 400 pictures in total). On the basis of accurate target identification, the longitude and latitude of each target in the sample data were calculated through step S4 and compared with the longitude and latitude recorded when the targets were placed (bottom mine: (121.6656335, 38.8595881); anchor mine: (121.6656911, 38.8596661)), giving the target position deviation. According to the task scenario, the distance threshold was set to L = 20 and the hit-count threshold to P = 10, and real target information was screened and output through step S5. A selection of the test results is shown in Table 2, covering the identification results, longitude and latitude information, and position deviation for 9 pictures.
TABLE 2 target positioning test results
(The data of Table 2 are provided as images in the original publication.)
Note: sample target placement longitude and latitude: bottom mine (121.6656335, 38.8595881); anchor mine (121.6656911, 38.8596661).
The test results show that the target recognition accuracy on the test data set is 87.25% and that the average target position deviation is 10.2 m.
While the invention has been disclosed in terms of preferred embodiments, these embodiments are not intended to limit it. Any equivalent change or modification made without departing from the spirit and scope of the invention falls within the scope of protection, which should therefore be determined by the following claims.

Claims (4)

1. The method for detecting and positioning the underwater target based on the forward-looking sonar is characterized by comprising the following steps of:
S1) mapping the multi-beam sonar echo signal intensity to a two-dimensional image;
S2) preprocessing the two-dimensional echo intensity image generated in step S1, the preprocessing comprising converting the sonar two-dimensional signal intensity image from a two-dimensional polar coordinate system to a Cartesian coordinate system, filling the converted image, and then filtering, enhancing, and pseudo-color processing the filled image;
S3) performing underwater target recognition on the images processed in step S2 and assigning corresponding target information to the targets recognized in each frame, the target information including the target center point pixel value (x, y);
S4) performing target positioning: reading the target center point pixel value (x, y) from the target information, converting it into a range-azimuth value (r, θ) in the polar coordinate system, and calculating the target longitude and latitude (X2, Y2) from the sonar longitude and latitude (X1, Y1) known from the underwater acoustic positioning instrument and the target's range and azimuth (r, θ) relative to the sonar;
S5) detecting targets across the image sequence: given n frames of two-dimensional echo intensity images so far, comparing the distance between each target identified in the current frame n and all targets identified in frames 1 to n−1; if the distance is smaller than a set distance threshold L, judging them to be the same target and updating the hit count, the hit count being the number of times a target has been judged to be the same target, otherwise judging them to be different targets; then setting a hit-count threshold P and screening out, from the identified targets, those whose hit count is greater than P, which are judged to be real targets;
S6) outputting the target information and longitude and latitude (X2, Y2) of each real target, thereby realizing underwater target detection and positioning based on forward-looking sonar.
2. The underwater target detection and positioning method based on forward-looking sonar according to claim 1, wherein the image converted into the Cartesian coordinate system in step S2 is filled using a bilinear interpolation algorithm, with the following specific steps:
S221) let the image converted into the Cartesian coordinate system be the original image Src, with width src_w and height src_h, and let the output image be the target image Dst, with width dst_w and height dst_h; for any point Psrc (src_x, src_y) in the original image Src corresponding to Pdst (dst_x, dst_y) in the target image Dst, the pixel position Pdst (dst_x, dst_y) in Dst is first mapped back to the position Psrc (src_x, src_y) in Src by the following formula (IV):
src_x = dst_x · (src_w / dst_w), src_y = dst_y · (src_h / dst_h)    (IV)
S222) obtaining the neighboring points of Psrc (src_x, src_y) in the original image Src from the coordinate relation of the diagonal four-neighborhood: Q11, Q12, Q21, Q22;
a pixel point, i.e. Psrc (src_x, src_y), has 4 diagonally adjacent pixels, and this group of pixels is referred to as the 4-neighborhood of Psrc (src_x, src_y);
S223) performing one-dimensional linear interpolation along the X and Y directions with the obtained neighboring points to complete the bilinear interpolation: first, one-dimensional linear interpolation is performed along the X direction on Q11, Q21 and on Q12, Q22, yielding R1 (src_x, y1) and R2 (src_x, y2); then one-dimensional linear interpolation is performed along the Y direction using R1 and R2, yielding the final bilinear interpolation result f(Pdst), calculated by the following formula (V):
f(R1) = ((x2 − src_x) / (x2 − x1)) · f(Q11) + ((src_x − x1) / (x2 − x1)) · f(Q21)
f(R2) = ((x2 − src_x) / (x2 − x1)) · f(Q12) + ((src_x − x1) / (x2 − x1)) · f(Q22)
f(Pdst) = ((y2 − src_y) / (y2 − y1)) · f(R1) + ((src_y − y1) / (y2 − y1)) · f(R2)    (V)
wherein f is the gray value of the pixel point.
3. The underwater target detection and positioning method based on forward-looking sonar according to claim 1, wherein the image enhancement in step S2 uses a dynamic brightness allocation method, enhancing the contrast between target and background as well as the image brightness by optimizing the mapping function, which maps according to the following formula (VI):
z_out = 255 · ((z_in − L) / (H − L))^γ    (VI)
wherein: l is the minimum value of the echo intensity of the image; h is the maximum value of the echo intensity, zin is the image input intensity value, zout is the image output intensity value, in order to avoid the generation of isolated single peaks, the average of the first 1% of the retrieved wave intensity is the maximum value; gamma is a mapping parameter, gamma=1 generates linear transformation, gamma <1 enhances the overall brightness of the image, and gamma >1 reduces the overall brightness of the image.
4. The underwater target detection and positioning method based on forward-looking sonar according to claim 1, wherein the specific steps of the target positioning in step S4 are as follows:
S41) target coordinate conversion: the identified target center point pixel value (x, y) is read and converted into a range-azimuth value (r, θ) in the polar coordinate system. Let the center point pixel value of target P be (n, m); the coordinate conversion is given by the following formula (VIII):
r = m · c · T / 2,  θ = n · (α / N) − α / 2    (VIII)
wherein: α is the horizontal field of view of the multi-beam forward-looking sonar; N is the number of beams; M is the number of range samples; T is the time sampling interval; c is the sound speed in water; r is the range of the target; θ is the beam angle.
S42) target longitude and latitude calculation: from the sonar longitude and latitude (X1, Y1) known from the underwater acoustic positioning instrument and the target's range and azimuth (r, θ) relative to the sonar, the target longitude and latitude (X2, Y2) are calculated by the following formulas (IX) and (X):
ΔX = (180 / π) · r · sin θ / (R · cos Y1),  X2 = X1 + ΔX    (IX)
ΔY = (180 / π) · r · cos θ / R,  Y2 = Y1 + ΔY    (X)
wherein: r is the detection range of the target; R is the Earth radius; θ is the azimuth of the target; ΔX is the longitude difference between the two points; ΔY is the latitude difference between the two points.
CN202211414390.8A 2022-11-11 2022-11-11 Underwater target detection and positioning method based on forward-looking sonar Pending CN115993602A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211414390.8A CN115993602A (en) 2022-11-11 2022-11-11 Underwater target detection and positioning method based on forward-looking sonar


Publications (1)

Publication Number Publication Date
CN115993602A true CN115993602A (en) 2023-04-21

Family

ID=85989520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211414390.8A Pending CN115993602A (en) 2022-11-11 2022-11-11 Underwater target detection and positioning method based on forward-looking sonar

Country Status (1)

Country Link
CN (1) CN115993602A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437409A (en) * 2023-12-14 2024-01-23 华中师范大学 Deep learning target automatic identification method and system based on multi-view sound image
CN117970338A (en) * 2024-01-09 2024-05-03 南通海狮船舶机械有限公司 Underwater positioning detector and control method thereof
CN118330623A (en) * 2024-06-13 2024-07-12 博雅工道(北京)机器人科技有限公司 Method and device for processing scanning sonar data and storage medium


Similar Documents

Publication Publication Date Title
CN115993602A (en) Underwater target detection and positioning method based on forward-looking sonar
CN109670411B (en) Ship point cloud depth image processing method and system based on generation countermeasure network
Zhao et al. A coupled convolutional neural network for small and densely clustered ship detection in SAR images
CN108444447B (en) Real-time autonomous detection method for fishing net in underwater obstacle avoidance system
CN109871902B (en) SAR small sample identification method based on super-resolution countermeasure generation cascade network
US6943724B1 (en) Identification and tracking of moving objects in detected synthetic aperture imagery
CN110428008A (en) A kind of target detection and identification device and method based on more merge sensors
US8154952B1 (en) Method and system for real-time automated change detection and classification for images
CN101915910B (en) Method and system for identifying marine oil spill object by marine radar
Buscombe et al. Optical wave gauging using deep neural networks
CN112766221B (en) Ship direction and position multitasking-based SAR image ship target detection method
CN109213204B (en) AUV (autonomous underwater vehicle) submarine target searching navigation system and method based on data driving
CN102975826A (en) Portable ship water gauge automatic detection and identification method based on machine vision
CN109359787A (en) A kind of multi-modal wave forecasting system in small range sea area and its prediction technique
CN105574529B (en) A kind of side-scan sonar object detection method
CN107862271B (en) Detection method of ship target
CN112949380B (en) Intelligent underwater target identification system based on laser radar point cloud data
CN105741284B (en) A kind of multi-beam Forward-looking Sonar object detection method
CN110706177A (en) Method and system for equalizing gray level of side-scan sonar image
CN112435249A (en) Dynamic small target detection method based on periodic scanning infrared search system
CN115983141A (en) Method, medium and system for inverting wave height based on deep learning
CN105824024B (en) A kind of anti-frogman&#39;s solid early warning identifying system of new submerged gate
CN116299492A (en) Bistatic submarine topography acoustic imaging method based on pixel statistical distribution weighting
CN116343057A (en) Ship target detection and identification method combining SAR (synthetic aperture radar) with optical image
CN116243289A (en) Unmanned ship underwater target intelligent identification method based on imaging sonar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination