CN113378932A - Side-scan sonar position correction method based on high-precision heterogeneous common-view image - Google Patents
- Publication number
- CN113378932A (application CN202110655786.0A)
- Authority
- CN
- China
- Prior art keywords
- scan sonar
- image
- view image
- sonar image
- feature point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/232—Non-hierarchical techniques
- G06F18/2321—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06F18/23213—Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
Abstract
The invention discloses a side-scan sonar position correction method based on a high-precision heterogeneous common-view image, which comprises the following steps: step S1, acquiring fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image; step S2, acquiring fine matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image; step S3, constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image, and solving it by the least square method to obtain the rotation parameters and translation parameters of the model; and step S4, applying the rotation and translation transformation in turn to each pixel of the side-scan sonar image, so as to correct the position of the side-scan sonar image as a whole. The method yields high-quality seabed landform images and supports offshore wind farm construction and subsequent safe-operation monitoring.
Description
Technical Field
The invention relates to a side-scan sonar position correction method based on high-precision heterogeneous common-view images.
Background
Ocean engineering construction, such as offshore wind power and bridges, places high requirements on characterizing the distribution and change of underwater topographic features and geology. Compared with the traditional acquisition of seabed topographic information by single-beam or multi-beam sounding systems, a side-scan sonar detects the seabed by the acoustic scattering principle: in operation, the transducer array emits pulsed sound waves to both sides, the waves spread outward as spherical waves and are scattered on touching the seabed or an object in the water, and the backscattered waves return along the original propagation path, are received by the transducer, and are rendered on recording paper or a display. The resolution of a side-scan sonar is 50-100 times the multi-beam sounding resolution, making it an effective means of acquiring fine seabed topographic information.
To avoid interference from bubbles generated by the survey ship while under way, a side-scan sonar system often adopts a towed operation mode. The transducer is the measurement reference of the system, and its position can be provided by an ultra-short baseline positioning system; however, the positioning accuracy achievable for a towed transducer is limited, so the positions recorded in the side-scan sonar image carry errors that degrade the resulting seabed landform image. At present, no good solution to this problem exists.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a side-scan sonar position correction method based on a high-precision heterogeneous common-view image, so as to obtain a high-quality seabed landform image and effectively support offshore wind farm construction and subsequent safe-operation monitoring.
The invention relates to a side-scan sonar position correction method based on a high-precision heterogeneous common-view image, which comprises the following steps:
step S1, acquiring a fine matching feature point pair of the heterogeneous common-view image and the side scan sonar image;
step S2, adopting a k-means++ unsupervised classification method to classify the seabed substrate of the heterogeneous common-view image and of the side-scan sonar image respectively, and acquiring fine matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image;
step S3, constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image based on the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image and the fine matching feature point pairs of the substrate classification results of the two images, and solving by the least square method to obtain the rotation parameters and translation parameters of the rigid transformation model; and
and step S4, sequentially carrying out rotation and translation transformation on each pixel of the side-scan sonar image by using the rotation parameters and the translation parameters of the rigid transformation model so as to realize the integral correction of the position of the side-scan sonar image.
In the above-mentioned side-scan sonar position correction method based on a high-precision heterogeneous common-view image, step S1 includes the steps of:
step S11, acquiring rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image based on the SURF algorithm; and
step S12, rejecting mismatching feature point pairs from the rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image by using the RANSAC algorithm, to obtain fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image.
In the above-mentioned side-scan sonar position correction method based on a high-precision heterogeneous common-view image, step S11 includes the steps of:
step S111, respectively extracting feature points of the heterogeneous common-view image and the side-scan sonar image, and calculating the Euclidean distance d between each feature point of the heterogeneous common-view image and all feature points of the side-scan sonar image according to the following formula (1):

$d=\sqrt{\sum_{i=1}^{n}(P_{1i}-P_{2i})^2}$ (1)

where $P_1$ is an n-dimensional feature vector in the heterogeneous common-view image and $P_2$ is an n-dimensional feature vector in the side-scan sonar image; and
step S112, defining each feature point of the heterogeneous common-view image and the feature point of the side-scan sonar image with the minimum Euclidean distance d from it as a rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image.
In the above-mentioned side-scan sonar position correction method based on a high-precision heterogeneous common-view image, step S12 includes the steps of:
step S121, randomly choosing 4 groups of matching point pairs from the rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image;
step S122, obtaining a transformation matrix M from the 4 groups of matching point pairs selected in step S121 according to the following formula (2):

$s\begin{bmatrix}x'\\y'\\1\end{bmatrix}=M\begin{bmatrix}x\\y\\1\end{bmatrix},\quad M=\begin{bmatrix}h_{11}&h_{12}&h_{13}\\h_{21}&h_{22}&h_{23}\\h_{31}&h_{32}&h_{33}\end{bmatrix}$ (2)

where $(x, y)$ is the position of a feature point in the side-scan sonar image, $(x', y')$ is the position of the feature point in the heterogeneous common-view image matching it, and $s$ is a scale parameter;
step S123, calculating in turn the projection error L between each rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image and the transformation matrix M according to the following formula (3):

$L=\sqrt{(x'_i-\hat{x}_i)^2+(y'_i-\hat{y}_i)^2}$ (3)

where $(x_i, y_i)$ is the position of the ith feature point in the side-scan sonar image, $(x'_i, y'_i)$ is the position of the ith feature point in the heterogeneous common-view image, and $(\hat{x}_i, \hat{y}_i)$ is the projection of $(x_i, y_i)$ under M;
if the projection error L is smaller than a preset threshold, adding the corresponding rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image to the interior point set I, and recording the number of matching point pairs added to I;
step S124, if the number of matching point pairs in the current interior point set I is larger than the number of elements in the optimal interior point set $I_{best}$, updating $I_{best}$ to the current set I; if the preset number of iterations has been reached, finishing the iteration, removing all rough matching feature point pairs outside the interior point set, and defining the rough matching feature point pairs in it as the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image; otherwise, adding 1 to the iteration count and returning to step S121.
In the above-mentioned side-scan sonar position correction method based on a high-precision heterogeneous common-view image, step S2 includes the steps of:
step S21, initializing k cluster centers by the k-means++ unsupervised classification method, and randomly generating the positions $u_k$ of the k cluster centers;
step S22, calculating the Euclidean distance from each pixel position of the side-scan sonar image and the heterogeneous common-view image to each cluster center, and assigning each pixel to the class of the cluster center closest to it;
step S23, updating the positions $u'_k$ of the k cluster centers according to the following formula (4):

$u'_k=\frac{1}{n_k}\sum_{i=1}^{n_k}x_{ki}$ (4)

where $x_{ki}$ is the position of the ith pixel in the class of the kth cluster center and $n_k$ is the number of pixels in that class;
step S24, repeating steps S22 to S23 until the assignment of all classes no longer changes or the preset maximum number of iterations is reached, and then performing step S25;
step S25, acquiring rough matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image based on the SURF algorithm; and
step S26, rejecting mismatching feature point pairs from those rough matching feature point pairs by using the RANSAC algorithm, to obtain the fine matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image.
In the above-described side-scan sonar position correction method based on a high-precision heterogeneous common-view image, step S3 includes:
constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image according to the following formula (5):

$X_2=aX_1-bY_1+c,\quad Y_2=bX_1+aY_1+d$ (5)

where a and b are rotation parameters, c and d are translation parameters, $(X_1, Y_1)$ is the position of a feature point in the side-scan sonar image, and $(X_2, Y_2)$ is the position of the feature point in the heterogeneous common-view image matching it;
substituting the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image into formula (5) and expanding to obtain the over-determined equation set (6):

$X_{2i}=aX_{1i}-bY_{1i}+c,\quad Y_{2i}=bX_{1i}+aY_{1i}+d,\quad i=1,\ldots,n$ (6)

converting the over-determined equation set (6) into the matrix equation (7):

$y=Xt$ (7)

letting:

$t=\begin{bmatrix}a&b&c&d\end{bmatrix}^T,\quad y=\begin{bmatrix}X_{21}&Y_{21}&\cdots&X_{2n}&Y_{2n}\end{bmatrix}^T,\quad X=\begin{bmatrix}X_{11}&-Y_{11}&1&0\\Y_{11}&X_{11}&0&1\\\vdots&\vdots&\vdots&\vdots\\X_{1n}&-Y_{1n}&1&0\\Y_{1n}&X_{1n}&0&1\end{bmatrix}$ (8)

where $(X_{1n}, Y_{1n})$ is the position of the nth feature point in the side-scan sonar image and $(X_{2n}, Y_{2n})$ is the position of the feature point in the heterogeneous common-view image matching the nth feature point in the side-scan sonar image;
the least-squares solution for the rotation parameters and translation parameters of the rigid transformation model then follows from formula (8) as:

$t=(X^TX)^{-1}X^Ty$ (9)

where $X^T$ is the transpose of the matrix X and $(X^TX)^{-1}$ is the inverse of the matrix $X^TX$;
the rotation parameters a, b and translation parameters c, d of the rigid transformation model are calculated from formulas (6) to (9).
Based on the technical scheme above, the invention matches a heterogeneous common-view image carrying high-precision position information, such as a multi-beam sonar image, against the side-scan sonar image of the same seabed landform to form the rigid transformation parameters, thereby correcting the position of the side-scan sonar image. This improves the quality of side-scan sonar image processing, reduces the difficulty, risk and cost of construction operations based on sonar images, improves efficiency, and has practical significance for the construction and operation of offshore wind power. In conclusion, the invention has the advantages of high precision, high quality, high resolution, low cost, low risk and convenient implementation.
Drawings
Fig. 1 is a schematic diagram of step S1 in the side-scan sonar position correction method based on a high-precision heterogeneous common-view image according to the present invention, where (a) shows the rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image obtained after step S11 is performed, and (b) shows the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image obtained after step S12 is performed.
Detailed Description
The invention will be further explained with reference to the drawings.
Referring to fig. 1, the invention, namely a side-scan sonar position correction method based on high-precision heterogeneous common-view images, includes the following steps:
step S1, acquiring a fine matching feature point pair of the heterogeneous common-view image and the side scan sonar image;
step S2, considering that, although the two images are heterogeneous, both reflect the topographic features of the same seabed, corresponding feature point pairs can be acquired not only from surface features but also from the substrate distribution information of the same seabed; therefore, a k-means++ unsupervised classification method is adopted to classify the seabed substrate of the heterogeneous common-view image and of the side-scan sonar image respectively, and the fine matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image are acquired;
step S3, constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image based on the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image and the fine matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image, and solving by the least square method to obtain the rotation parameters and translation parameters of the rigid transformation model; and
and step S4, sequentially carrying out rotation and translation transformation on each pixel of the side-scan sonar image by using the rotation parameters and the translation parameters of the rigid transformation model so as to realize the integral correction of the position of the side-scan sonar image and finally improve the precision of the position of the side-scan sonar image.
Specifically, step S1 includes:
step S11, acquiring rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image based on the SURF (Speeded-Up Robust Features) algorithm; and
and step S12, removing mismatching feature point pairs from the rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image by using a RANSAC (Random Sample Consensus) algorithm to obtain fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image.
Further, the step S11 includes:
step S111, respectively extracting feature points of the heterogeneous common-view image and the side-scan sonar image, and calculating the Euclidean distance d between each feature point of the heterogeneous common-view image and all feature points of the side-scan sonar image according to the following formula (1):

$d=\sqrt{\sum_{i=1}^{n}(P_{1i}-P_{2i})^2}$ (1)

where $P_1$ is an n-dimensional feature vector in the heterogeneous common-view image and $P_2$ is an n-dimensional feature vector in the side-scan sonar image; and
step S112, defining each feature point of the heterogeneous common-view image and the feature point of the side-scan sonar image with the minimum Euclidean distance d from it as a rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image.
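The nearest-neighbour pairing of steps S111-S112 can be sketched as follows (a minimal NumPy illustration; `coarse_match` and the toy descriptors are assumed names for this sketch, and real SURF descriptors are 64-dimensional):

```python
import numpy as np

def coarse_match(desc_a, desc_b):
    """Pair each descriptor of the heterogeneous common-view image (desc_a)
    with its nearest descriptor of the side-scan sonar image (desc_b)
    under the Euclidean distance d of formula (1)."""
    # pairwise Euclidean distances, shape (len(desc_a), len(desc_b))
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    # index of the closest sonar descriptor for every common-view descriptor
    nearest = d.argmin(axis=1)
    return [(i, int(j)) for i, j in enumerate(nearest)]

# toy 3-dimensional descriptors for illustration
a = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
b = np.array([[1.1, 1.0, 0.9], [0.1, 0.0, 0.0]])
print(coarse_match(a, b))  # [(0, 1), (1, 0)]
```

Note that this one-directional minimum-distance rule can pair several common-view points with the same sonar point; the RANSAC stage of step S12 is what removes such mismatches.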
Further, the step S12 includes:
step S121, randomly choosing 4 groups of matching point pairs (namely, a RANSAC sample) from the rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image;
step S122, obtaining a transformation matrix M from the 4 groups of matching point pairs selected in step S121 according to the following formula (2):

$s\begin{bmatrix}x'\\y'\\1\end{bmatrix}=M\begin{bmatrix}x\\y\\1\end{bmatrix},\quad M=\begin{bmatrix}h_{11}&h_{12}&h_{13}\\h_{21}&h_{22}&h_{23}\\h_{31}&h_{32}&h_{33}\end{bmatrix}$ (2)

where $(x, y)$ is the position of a feature point in the side-scan sonar image, $(x', y')$ is the position of the feature point in the heterogeneous common-view image matching it, $s$ is a scale parameter, and $h_{11}$ to $h_{33}$ are coefficient parameters (the scale parameter $s$ and the coefficients $h_{11}$ to $h_{33}$ can be solved from the concrete data of the 4 groups of matching point pairs);
step S123, calculating in turn the projection error L between each rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image and the transformation matrix M according to the following formula (3):

$L=\sqrt{(x'_i-\hat{x}_i)^2+(y'_i-\hat{y}_i)^2}$ (3)

where $(x_i, y_i)$ is the position of the ith feature point in the side-scan sonar image, $(x'_i, y'_i)$ is the position of the ith feature point in the heterogeneous common-view image, and $(\hat{x}_i, \hat{y}_i)$ is the projection of $(x_i, y_i)$ under M;
if the projection error L is smaller than a preset threshold (which can be set based on the projection errors of the 4 groups of matching point pairs drawn in step S122 and the transformation matrix M), adding the corresponding rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image to the interior point set I, and recording the number of matching point pairs added to I;
step S124, if the number of matching point pairs in the current interior point set I is larger than the number of elements in the optimal interior point set $I_{best}$, updating $I_{best}$ to the current set I; if the preset number of iterations has been reached, finishing the iteration, removing all rough matching feature point pairs outside the interior point set, and defining the rough matching feature point pairs in it as the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image; otherwise, adding 1 to the iteration count and returning to step S121.
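Steps S121-S124 can be sketched as a compact RANSAC loop (an illustrative implementation, assuming the matched point coordinates are available as NumPy arrays and that M is the 3×3 projective matrix of formula (2), here solved by the direct linear transform; function names, iteration count, and the threshold value are assumptions, not taken from the patent):

```python
import numpy as np

def fit_homography(src, dst):
    """Solve the 3x3 matrix M of formula (2) from 4 point pairs (DLT)."""
    rows = []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp])
        rows.append([0, 0, 0, x, y, 1, -yp * x, -yp * y, -yp])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    M = vt[-1].reshape(3, 3)
    return M / M[2, 2]

def project(M, pts):
    """Apply M to homogeneous points and renormalise by the scale s."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ M.T
    return p[:, :2] / p[:, 2:3]

def ransac(src, dst, thresh=1.0, iters=200, seed=0):
    """Steps S121-S124: keep the largest interior point set I_best."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), bool)
    for _ in range(iters):
        idx = rng.choice(len(src), 4, replace=False)         # S121: 4 random pairs
        M = fit_homography(src[idx], dst[idx])               # S122: solve M
        err = np.linalg.norm(project(M, src) - dst, axis=1)  # S123: projection error L
        inliers = err < thresh
        if inliers.sum() > best.sum():                       # S124: update I_best
            best = inliers
    return best
```

As a usage sketch, feeding in eight consistent point pairs plus one gross mismatch leaves a mask that keeps the eight and drops the mismatch.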
Specifically, step S2 includes:
step S21, initializing k cluster centers by the k-means++ unsupervised classification method, and randomly generating the positions $u_k$ of the k cluster centers;
Step S22, calculating the Euclidean distance from each pixel position in the side scan sonar image and the heterogeneous co-view image to each clustering center (which can be calculated according to the formula (1)) and dividing each pixel into a class in which the clustering center closest to the pixel position is located;
step S23, updating the positions $u'_k$ of the k cluster centers according to the following formula (4):

$u'_k=\frac{1}{n_k}\sum_{i=1}^{n_k}x_{ki}$ (4)

where $x_{ki}$ is the position of the ith pixel in the class of the kth cluster center and $n_k$ is the number of pixels in that class;
step S24, repeating steps S22 to S23 until the distribution of all classes is not changed or the preset maximum iteration number is reached (the maximum iteration number is generally set according to experimental experience), then executing step S25;
step S25, acquiring rough matching feature point pairs of the substrate classification result of the heterogeneous co-view image and the substrate classification result of the side scan sonar image based on the SURF algorithm (refer to step S11 specifically); and
and step S26, removing the mismatched characteristic point pairs from the rough matching characteristic point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image by using a RANSAC algorithm to obtain the fine matching characteristic point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image (refer to step S12 specifically).
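The clustering core of steps S21-S24 might look like the following sketch (`kmeans_pp` and all parameter values are illustrative assumptions; formula (4) appears as the class-mean update):

```python
import numpy as np

def kmeans_pp(pixels, k, max_iters=100, seed=0):
    """Steps S21-S24: k-means++ seeding followed by iteration until the
    class assignments no longer change."""
    rng = np.random.default_rng(seed)
    # S21: first centre random; each further centre is drawn with probability
    # proportional to the squared distance to the centres chosen so far
    centers = [pixels[rng.integers(len(pixels))]]
    while len(centers) < k:
        d2 = np.min([np.sum((pixels - c) ** 2, axis=1) for c in centers], axis=0)
        centers.append(pixels[rng.choice(len(pixels), p=d2 / d2.sum())])
    centers = np.asarray(centers, float)
    labels = np.full(len(pixels), -1)
    for _ in range(max_iters):
        # S22: assign each pixel to its nearest cluster centre
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        new_labels = d.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # S24: assignments unchanged, clustering has converged
        labels = new_labels
        # S23: formula (4) - move each centre to the mean of its class
        for j in range(k):
            if (labels == j).any():
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers
```

In the method itself the "pixels" would be positions (or position-plus-intensity features) from the two images, and the resulting class maps are the substrate classification results that feed steps S25-S26.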
Specifically, step S3 includes:
constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image according to the following formula (5):

$X_2=aX_1-bY_1+c,\quad Y_2=bX_1+aY_1+d$ (5)

where a and b are rotation parameters, c and d are translation parameters, $(X_1, Y_1)$ is the position of a feature point in the side-scan sonar image, and $(X_2, Y_2)$ is the position of the feature point in the heterogeneous common-view image matching it;
substituting the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image into formula (5) and expanding to obtain the over-determined equation set (6):

$X_{2i}=aX_{1i}-bY_{1i}+c,\quad Y_{2i}=bX_{1i}+aY_{1i}+d,\quad i=1,\ldots,n$ (6)

converting the over-determined equation set (6) into the matrix equation (7):

$y=Xt$ (7)

letting:

$t=\begin{bmatrix}a&b&c&d\end{bmatrix}^T,\quad y=\begin{bmatrix}X_{21}&Y_{21}&\cdots&X_{2n}&Y_{2n}\end{bmatrix}^T,\quad X=\begin{bmatrix}X_{11}&-Y_{11}&1&0\\Y_{11}&X_{11}&0&1\\\vdots&\vdots&\vdots&\vdots\\X_{1n}&-Y_{1n}&1&0\\Y_{1n}&X_{1n}&0&1\end{bmatrix}$ (8)

where $(X_{1n}, Y_{1n})$ is the position of the nth feature point in the side-scan sonar image and $(X_{2n}, Y_{2n})$ is the position of the feature point in the heterogeneous common-view image matching the nth feature point in the side-scan sonar image;
the least-squares solution for the rotation parameters and translation parameters of the rigid transformation model then follows from formula (8) as:

$t=(X^TX)^{-1}X^Ty$ (9)

where $X^T$ is the transpose of the matrix X and $(X^TX)^{-1}$ is the inverse of the matrix $X^TX$;
the rotation parameters a, b and the translation parameters c, d of the rigid transformation model can thus be calculated from formulas (6) to (9).
In conclusion, the invention, namely the side-scan sonar position correction method based on a high-precision heterogeneous common-view image, corrects the position of the side-scan sonar image and improves the precision and accuracy with which the side-scan sonar image describes seabed topographic information, while also improving working efficiency, reducing operation cost, and being more convenient and less time-consuming to implement.
The above embodiments are provided only for illustrating the present invention and not for limiting the present invention, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and therefore all equivalent technical solutions should also fall within the scope of the present invention, and should be defined by the claims.
Claims (6)
1. A side-scan sonar image position correction method based on a high-precision heterogeneous common-view image, characterized in that the correction method comprises the following steps:
step S1, acquiring a fine matching feature point pair of the heterogeneous common-view image and the side scan sonar image;
step S2, adopting a k-means++ unsupervised classification method to classify the seabed substrate of the heterogeneous common-view image and of the side-scan sonar image respectively, and acquiring fine matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image;
step S3, constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image based on the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image and the fine matching feature point pairs of the substrate classification results of the two images, and solving by the least square method to obtain the rotation parameters and translation parameters of the rigid transformation model; and
and step S4, sequentially carrying out rotation and translation transformation on each pixel of the side-scan sonar image by using the rotation parameters and the translation parameters of the rigid transformation model so as to realize the integral correction of the position of the side-scan sonar image.
2. The side-scan sonar image position correction method based on a high-precision heterogeneous common-view image according to claim 1, characterized in that step S1 includes the steps of:
step S11, acquiring rough matching feature point pairs of the heterogeneous common-view image and the side scan sonar image based on the SURF algorithm; and
and step S12, rejecting mismatching feature point pairs in the rough matching feature point pairs of the heterogeneous co-view image and the side-scan sonar image by using a RANSAC algorithm to obtain fine matching feature point pairs of the heterogeneous co-view image and the side-scan sonar image.
3. The side-scan sonar image position correction method based on a high-precision heterogeneous common-view image according to claim 2, characterized in that step S11 includes the steps of:
step S111, respectively extracting feature points of the heterogeneous common-view image and the side-scan sonar image, and calculating the Euclidean distance d between each feature point of the heterogeneous common-view image and all feature points of the side-scan sonar image according to the following formula (1):

$d=\sqrt{\sum_{i=1}^{n}(P_{1i}-P_{2i})^2}$ (1)

where $P_1$ is an n-dimensional feature vector in the heterogeneous common-view image and $P_2$ is an n-dimensional feature vector in the side-scan sonar image; and
step S112, defining each feature point of the heterogeneous common-view image and the feature point of the side-scan sonar image with the minimum Euclidean distance d from it as a rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image.
4. The side-scan sonar image position correction method based on a high-precision heterogeneous common-view image according to claim 2, characterized in that step S12 includes the steps of:
step S121, randomly choosing 4 groups of matching point pairs from the rough matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image;
step S122, obtaining a transformation matrix M from the 4 groups of matching point pairs selected in step S121 according to the following formula (2):

$s\begin{bmatrix}x'\\y'\\1\end{bmatrix}=M\begin{bmatrix}x\\y\\1\end{bmatrix},\quad M=\begin{bmatrix}h_{11}&h_{12}&h_{13}\\h_{21}&h_{22}&h_{23}\\h_{31}&h_{32}&h_{33}\end{bmatrix}$ (2)

where $(x, y)$ is the position of a feature point in the side-scan sonar image, $(x', y')$ is the position of the feature point in the heterogeneous common-view image matching it, and $s$ is a scale parameter;
step S123, calculating in turn the projection error L between each rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image and the transformation matrix M according to the following formula (3):

$L=\sqrt{(x'_i-\hat{x}_i)^2+(y'_i-\hat{y}_i)^2}$ (3)

where $(x_i, y_i)$ is the position of the ith feature point in the side-scan sonar image, $(x'_i, y'_i)$ is the position of the ith feature point in the heterogeneous common-view image, and $(\hat{x}_i, \hat{y}_i)$ is the projection of $(x_i, y_i)$ under M;
if the projection error L is smaller than a preset threshold, adding the corresponding rough matching feature point pair of the heterogeneous common-view image and the side-scan sonar image to the interior point set I, and recording the number of matching point pairs added to I;
step S124, if the number of matching point pairs in the current interior point set I is larger than the number of elements in the optimal interior point set $I_{best}$, updating $I_{best}$ to the current set I; if the preset number of iterations has been reached, finishing the iteration, removing all rough matching feature point pairs outside the interior point set, and defining the rough matching feature point pairs in it as the fine matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image; otherwise, adding 1 to the iteration count and returning to step S121.
5. The method for correcting the position of a side-scan sonar image based on a high-precision heterogeneous common-view image according to claim 1, wherein step S2 comprises the following steps:
step S21, initializing k clustering centers by adopting the k-means++ unsupervised classification method, and randomly generating the positions u_k of the k clustering centers;
step S22, calculating the Euclidean distance from each pixel position in the side-scan sonar image and the heterogeneous common-view image to each clustering center, and assigning each pixel to the class of its nearest clustering center;
step S23, updating the positions u'_k of the k clustering centers according to the following formula (4):

$$u'_k=\frac{1}{n_k}\sum_{i=1}^{n_k}x_{ki}\quad(4)$$

in the formula, x_{ki} is the position of the i-th pixel in the class of the k-th clustering center, and n_k is the number of pixels in that class;
step S24, repeating the steps S22 to S23 until the distribution of all classes does not change or reaches the preset maximum iteration number, and then performing step S25;
step S25, acquiring rough matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image based on the SURF algorithm; and
step S26, rejecting mismatched feature point pairs from the rough-matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image by using the RANSAC algorithm, to obtain the fine-matching feature point pairs of the substrate classification result of the heterogeneous common-view image and the substrate classification result of the side-scan sonar image.
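The clustering in steps S21–S24 can be sketched as below. Two simplifying assumptions: plain random initialization stands in for the full k-means++ seeding named in step S21, and `pixels` is a generic (N, d) array standing in for whatever per-pixel values the method clusters for substrate classification.

```python
import numpy as np

def kmeans(pixels, k, max_iter=100, rng=None):
    """Cluster pixel values into k classes; returns (labels, centers)."""
    rng = rng or np.random.default_rng(0)
    # Step S21 (simplified): pick k distinct pixels as initial centers.
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    labels = np.zeros(len(pixels), dtype=int)
    for it in range(max_iter):
        # Step S22: Euclidean distance of every pixel to every center,
        # then assign each pixel to its nearest cluster center.
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        new_labels = d.argmin(axis=1)
        # Step S24: stop once the class assignments no longer change.
        if it > 0 and np.array_equal(new_labels, labels):
            break
        labels = new_labels
        # Step S23 (formula 4): each center becomes the mean of its members.
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels, centers
```

The resulting label maps of the two images would then be fed to the SURF matching of step S25; that part is omitted here since SURF requires an external library.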
6. The method for correcting the position of a side-scan sonar image based on a high-precision heterogeneous common-view image according to claim 1, wherein step S3 comprises:
constructing a rigid transformation model of the heterogeneous common-view image and the side-scan sonar image according to the following formula (5):

$$\begin{cases}X_2=aX_1-bY_1+c\\ Y_2=bX_1+aY_1+d\end{cases}\quad(5)$$

in the formula, a and b are rotation parameters, c and d are translation parameters, (X_1, Y_1) is the position of a feature point in the side-scan sonar image, and (X_2, Y_2) is the position of the feature point in the heterogeneous common-view image matched with that feature point;
substituting the fine-matching feature point pairs of the heterogeneous common-view image and the side-scan sonar image into formula (5) and expanding to obtain the over-determined equation set (6):

$$\begin{cases}X_{21}=aX_{11}-bY_{11}+c\\ Y_{21}=bX_{11}+aY_{11}+d\\ \qquad\vdots\\ X_{2n}=aX_{1n}-bY_{1n}+c\\ Y_{2n}=bX_{1n}+aY_{1n}+d\end{cases}\quad(6)$$

converting the over-determined equation set (6) into the matrix equation (7):

$$Xt=y\quad(7)$$

let:

$$X=\begin{pmatrix}X_{11}&-Y_{11}&1&0\\ Y_{11}&X_{11}&0&1\\ \vdots&\vdots&\vdots&\vdots\\ X_{1n}&-Y_{1n}&1&0\\ Y_{1n}&X_{1n}&0&1\end{pmatrix},\qquad t=\begin{pmatrix}a\\ b\\ c\\ d\end{pmatrix},\qquad y=\begin{pmatrix}X_{21}\\ Y_{21}\\ \vdots\\ X_{2n}\\ Y_{2n}\end{pmatrix}\quad(8)$$

in the formula, (X_{1n}, Y_{1n}) is the position of the n-th feature point in the side-scan sonar image, and (X_{2n}, Y_{2n}) is the position of the feature point in the heterogeneous common-view image matched with the n-th feature point;
obtaining the least-squares solution of the rotation parameters and the translation parameters of the rigid transformation model according to formula (8) as:

$$t=(X^TX)^{-1}X^Ty\quad(9)$$

in the formula, X^T is the transpose of the matrix X, and (X^TX)^{-1} is the inverse of the matrix X^TX;
and calculating the rotation parameters a and b and the translation parameters c and d of the rigid transformation model according to formulas (6) to (9).
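The least-squares step of claim 6 can be sketched as follows. It assumes the sign convention X2 = aX1 − bY1 + c, Y2 = bX1 + aY1 + d for formula (5) (one common form of the rigid model; the patent's figure-borne formula is not recoverable from the text), and the function name and array layout are illustrative.

```python
import numpy as np

def solve_rigid(p1, p2):
    """Fit a, b, c, d from matched points.

    p1: (n, 2) feature positions in the side-scan sonar image.
    p2: (n, 2) matched positions in the heterogeneous common-view image.
    """
    n = len(p1)
    # Build the design matrix X and observation vector y of formula (8):
    # one row pair [X1, -Y1, 1, 0] / [Y1, X1, 0, 1] per matched point.
    X = np.empty((2 * n, 4))
    y = np.empty(2 * n)
    X[0::2] = np.column_stack([p1[:, 0], -p1[:, 1], np.ones(n), np.zeros(n)])
    X[1::2] = np.column_stack([p1[:, 1],  p1[:, 0], np.zeros(n), np.ones(n)])
    y[0::2], y[1::2] = p2[:, 0], p2[:, 1]
    # Formula (9): t = (X^T X)^{-1} X^T y, computed via a linear solve
    # rather than an explicit inverse for numerical stability.
    t = np.linalg.solve(X.T @ X, X.T @ y)
    a, b, c, d = t
    return a, b, c, d
```

With noise-free synthetic correspondences the solver recovers the generating rotation and translation exactly, which is a quick sanity check before applying it to real fine-matched pairs.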
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110655786.0A CN113378932A (en) | 2021-06-11 | 2021-06-11 | Side-scan sonar position correction method based on high-precision heterogeneous common-view image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113378932A true CN113378932A (en) | 2021-09-10 |
Family
ID=77574144
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114627367A (en) * | 2022-05-17 | 2022-06-14 | 国家海洋局北海海洋技术保障中心 | Sea bottom line detection method for side-scan sonar image |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111486845A (en) * | 2020-04-27 | 2020-08-04 | 中国海洋大学 | AUV multi-strategy navigation method based on submarine topography matching |
CN112837252A (en) * | 2021-01-27 | 2021-05-25 | 中交三航(上海)新能源工程有限公司 | Side-scan sonar strip image public coverage area image fusion method and system |
Non-Patent Citations (2)
Title |
---|
Yan Jun: "High-Quality Measurement Information Acquisition and Superposition of Multibeam and Side-Scan Sonar", China Doctoral Dissertations Full-text Database, Basic Sciences, no. 06, pages 86-97 *
勿在浮沙筑高台: "[Feature Matching] RANSAC Algorithm Principle and Source Code Analysis", Retrieved from the Internet <URL:http://blog.csdn.net/luoshixian099/article/details/50217655> *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||