CN115908930A - Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method - Google Patents

Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method

Info

Publication number
CN115908930A
CN115908930A (application CN202211530587.8A)
Authority
CN
China
Prior art keywords
image, particles, cfwpso, improved, svm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211530587.8A
Other languages
Chinese (zh)
Inventor
胡家祯
夏梓铭
于松
朱国豪
赵思聪
孙佳龙
鞠海建
吉方正
沈舟
鞠子夏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nantong Jianghai Surveying And Mapping Institute Co ltd
Jiangsu Ocean University
Original Assignee
Nantong Jianghai Surveying And Mapping Institute Co ltd
Jiangsu Ocean University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nantong Jianghai Surveying And Mapping Institute Co ltd, Jiangsu Ocean University filed Critical Nantong Jianghai Surveying And Mapping Institute Co ltd
Priority to CN202211530587.8A priority Critical patent/CN115908930A/en
Publication of CN115908930A publication Critical patent/CN115908930A/en
Pending legal-status Critical Current

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a forward-looking sonar image recognition and classification method based on an improved CFWPSO-SVM, which comprises the following steps: S1: adaptive median-mean filtering; S2: gray-scale linear transformation; S3: binarization of the sonar image; S4: segmentation of the sonar image; S5: feature extraction; S6: classification by a particle swarm algorithm (CFWPSO_GA) improved based on a genetic algorithm. The method can accurately classify target objects in forward-looking sonar images. It first uses adaptive median-mean filtering and gray-scale linear transformation to denoise and enhance the image, then extracts the features of the targets in the image, and introduces the genetic-algorithm-improved particle swarm algorithm (CFWPSO_GA) when optimizing the parameters of the SVM, improving both the accuracy and the efficiency of parameter optimization. It is an efficient classification method for sonar image target recognition.

Description

Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method
Technical Field
The invention relates to the field of image recognition, in particular to a forward-looking sonar image recognition and classification method based on an improved CFWPSO-SVM.
Background
In recent years, the vast development potential of the oceans and its implications have attracted attention in countries around the world. Developing the ocean requires marine surveying, which must provide richer and more reliable seabed data. The seabed, however, is not like land: the pitch-dark underwater environment, complex and changeable terrain, and ever-present potential risks add considerable difficulty to marine surveying. In underwater measurement, sonar therefore transmits information more stably and reliably than optical or electromagnetic methods. Among the many types of sonar, forward-looking sonar (FLS) is particularly effective at capturing underwater objects and scenes, and it is unaffected by underwater optical visibility and water turbidity. As a result, research on sonar image recognition has become a focus today.
In view of the complexity and variability of the underwater environment, sonar images inevitably contain various kinds of noise, and sonar targets may even be occluded, which seriously degrades image quality. To improve image quality, a forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM is provided.
Disclosure of Invention
The purpose of the invention is to provide a forward-looking sonar image recognition and classification method based on an improved CFWPSO-SVM.
In order to achieve the purpose, the invention provides the following technical scheme:
A forward-looking sonar image recognition and classification method based on an improved CFWPSO-SVM comprises the following steps:
S1: adaptive median-mean filtering;
S2: gray-scale linear transformation;
S3: binarization of the sonar image;
S4: segmentation of the sonar image;
S5: feature extraction;
S6: classification by a particle swarm algorithm (CFWPSO_GA) improved based on a genetic algorithm.
As a preferred embodiment of the present invention, the adaptive median-mean filtering in S1 changes the size of the filtering window according to preset conditions on the basis of ordinary median filtering, judges whether the current pixel is noise according to certain conditions, replaces it with the neighborhood median if so, and finally uses mean filtering to eliminate burrs on the image contour;
as a preferred technical solution of the present invention, the gray scale linear transformation in S2 is to linearly expand each pixel in the image by using a linear single-valued function, so as to effectively enhance the visual effect of the image, increase the difference between the background and the target object, and improve the accuracy of the subsequent binarization processing and segmentation of the image, and the formula is as follows:
g(x,y)=k×f(x,y)+b
when k >1, the pixel values of the image all increase after transformation, and the contrast of the image also increases.
As a preferred technical solution of the present invention, the sonar image in S3 is binarized as follows: a fixed threshold T is set, and for each candidate threshold t greater than T the histogram is divided into background and foreground; the gray variances σ_B(t) and σ_F(t) of the background and foreground are calculated, as are the probabilities ω_B(t) and ω_F(t) that a pixel belongs to the background and the foreground; let σ_t = ω_B(t)σ_B(t) + ω_F(t)σ_F(t), and select the optimal threshold t_a = argmin(σ_t); the optimal threshold t_a obtained by Otsu's algorithm is then used as the threshold for binarizing the image.
As a preferred embodiment of the present invention, the specific steps of sonar image segmentation in S4 are as follows:
S41: each bright spot is segmented by a region-growing segmentation algorithm; after segmentation the image yields a number of region stacks, the number of elements in each region stack is counted, and this number is regarded as the area of the bright spot;
S42: because the area of a bright spot formed by the target to be identified lies within a certain range (neither too small nor too large), a threshold is set to remove regions that are too small or too large, yielding the suspected target regions;
S43: the mean of the stack of each suspected target region is calculated to obtain the centroid of each bright spot; a rectangle centered on the bright-spot centroid is cut from the unprocessed image to complete the segmentation of the sonar image;
S44: according to the coordinates of the target object determined from the bright-spot regions, the corresponding region is cut from the image before binarization.
As a preferred embodiment of the present invention, the feature extraction in S5 is to perform feature extraction on the segmented image to obtain an area, a perimeter, a shape parameter, a gray level mean, a gray level variance, and an HOG feature of the segmented image.
As a preferred technical solution of the present invention, the process of the HOG feature extraction algorithm is as follows:
S51: the sizes of the images after target segmentation are inconsistent, so the images are first scaled to 256 × 256; otherwise the extracted HOG feature dimensions would differ and affect subsequent SVM prediction;
S52: the image is grayed; the color space of the image is normalized and corrected, which adjusts the contrast of the image and reduces the influence of noise;
S53: the image gradient is calculated; the vertical edges, horizontal edges, edge strength and edge slope of the image are obtained;
S54: cells are constructed from 8×8 pixels; a histogram of gradient directions is computed for each cell and finally normalized;
S55: blocks are synthesized from 2×2 cells; the features of all blocks are concatenated.
As a preferred embodiment of the present invention, in S6 classification is performed by a particle swarm algorithm (CFWPSO_GA) improved based on a genetic algorithm;
the method comprises the following specific steps:
s61, initializing relevant parameters and the speed of the population particles, and calculating the initial fitness value of the particles;
s62, searching for the initial extrema, and determining the individual extremum pbest and the group extremum gbest;
S63, updating the inertia weight according to the formula 1; updating the speed and position of the particles according to equations 2 and 3;
formula 1:
ω = ω_min + (ω_max − ω_min)*(f − f_min)/(f_avg − f_min), when f ≤ f_avg; ω = ω_max, when f > f_avg
formula 2: v_i(t+1) = ω*v_i(t) + c_1*r_1*(pbest − x_i(t)) + c_2*r_2*(gbest − x_i(t))
formula 3: x_i(t+1) = x_i(t) + v_i(t+1)
where ω is the inertia weight; i is the particle index; x represents the particle position; v represents the particle velocity; t is the number of update iterations; gbest is the position of the global optimal solution; pbest is the position of the individual optimal solution; c_1 and c_2 are acceleration factors; f is the current objective-function value of the particle, and f_avg and f_min are the average and minimum values over all current particles;
s64, calculating the fitness value of the current position of each particle; sorting the fitness values of the particles from large to small, selecting the first half of particles with higher fitness values to directly enter the next generation, assigning the speed and the position to the second half of particles with lower fitness values, randomly crossing every two particles to generate particles with the same number, and replacing parent particles with child particles;
s65, updating the individual extremum pbest and the group extremum gbest according to the fitness values of the new particles;
S66, if the stopping condition is met or the maximum iteration number is exceeded, outputting an optimal solution; otherwise, jumping to step S63 to continue searching.
As a preferred technical solution of the present invention, the specific idea of the crossover in S64 is: in each iteration, the fitness values of the particles are sorted from large to small; the larger the fitness value, the better the particle, and the fitness function is the average classification accuracy under 3-fold cross validation; the particles with higher fitness values are first selected to enter the next generation directly, so that the next generation of particles is relatively better; at the same time, the positions and velocities of the better half of the particles are assigned to the worse half, the particles are randomly crossed in pairs to generate the same number of child particles, and the child particles replace the parent particles; the position and velocity of a child particle are calculated as:
child(x) = p*parent_1(x) + (1 − p)*parent_2(x)
child(v) = p*parent_1(v) + (1 − p)*parent_2(v)
where p is a random number between 0 and 1; parent_1(v) and parent_2(v) are the velocities of two different parent particles; parent_1(x) and parent_2(x) are the positions of two different parent particles.
The invention has the following beneficial effects: by studying the optimization of the SVM parameters, a hybrid optimization algorithm (CFWPSO_GA) that introduces the genetic algorithm (GA) crossover operator on the basis of the adaptive-weight particle swarm algorithm with contraction factor (CFWPSO) is proposed; it searches for the optimal parameters of the SVM, avoids the problem of other optimization algorithms being trapped in local optima, improves the accuracy of parameter optimization, and classifies sonar image targets well.
The method can accurately classify target objects in forward-looking sonar images; it first uses adaptive median-mean filtering and gray-scale linear transformation to denoise and enhance the image, then extracts the features of the targets in the image, and introduces the genetic-algorithm-improved particle swarm algorithm (CFWPSO_GA) when optimizing the SVM parameters, improving both the accuracy and the efficiency of parameter optimization.
Drawings
FIG. 1 is a flow chart of the improved algorithm of the present invention;
FIG. 2 is a graph of the classification prediction of three algorithms of the present invention;
FIG. 3 is a graph of the confusion matrix for three algorithms of the present invention.
Detailed Description
Examples of the present invention are described in detail below with reference to the accompanying drawings so that the advantages and features of the invention may be more readily understood by those skilled in the art.
Example: the invention provides a forward-looking sonar image recognition and classification method based on an improved CFWPSO-SVM, which comprises the following specific steps:
S1: adaptive median-mean filtering;
the self-adaptive median filtering is to change the size of a filtering window according to a preset condition on the basis of the common median filtering, judge whether the current pixel is noise or not according to a certain condition, replace the current pixel by the neighborhood median if the current pixel is noise, and finally eliminate burrs of the image contour by mean filtering.
S2: carrying out linear transformation on gray scale;
the gray scale of the sonar image is limited in a small range, and the sonar image can be seen on a display to be a blurred image without gray scale. Each pixel in the image is linearly expanded by a linear single-value function, so that the visual effect of the image is effectively enhanced, the difference between a background and a target object is increased, and the accuracy of the subsequent binarization processing and segmentation of the image is improved, namely the gray level linear transformation, and the formula is as follows: g (x, y) = k × f (x, y) + b
When k is larger than 1, the pixel values of the image are all increased after transformation, and the contrast of the image is increased;
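A short sketch of the transformation, assuming 8-bit images; clipping to [0, 255] and the example values of k and b are added assumptions.

```python
import numpy as np

def gray_linear_transform(img, k=1.5, b=10.0):
    """g(x, y) = k * f(x, y) + b, clipped to the 8-bit range."""
    g = k * img.astype(np.float64) + b
    return np.clip(g, 0, 255).astype(np.uint8)
```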
s3: carrying out binarization on the sonar image;
setting a fixed threshold T, dividing the histogram into a background and a foreground for each possible threshold T greater than T, and calculating the gray variance sigma of the background and the foreground respectively B (t)、σ F (t) calculating probabilities ω that a pixel is background and foreground, respectively B (t)、ω F (t) of (d). Let sigma t =ω B (t)σ B (t)+σ F (t)ω F (t) selecting an optimal threshold t a =min(σ t ) Optimal threshold t obtained by Otus's algorithm a With t a Performing binarization processing on the image as a threshold value;
s4: segmenting sonar images;
through a segmentation algorithm of region growing, in order to segment each bright spot, the image is segmented to obtain a plurality of region stacks, the number of elements in each region stack is counted, and the number is regarded as the area of each bright spot. Because the area of the bright spot formed by the target to be identified is in a certain range and is not too small or too large, a threshold value is set, and a small area and a large area are removed to obtain a suspected target area. Calculating the average value of the stack of each suspected target area to obtain the mass center of each bright spot; and setting the obtained center of mass of the bright spot as the center of mass of the captured bright spot, and cutting a rectangle which takes the center of mass of the bright spot as the center on the unprocessed image to finish the segmentation of the sonar image. Then, the image is divided according to the coordinates of the target object divided according to the bright spot area before binarization.
S5: extracting characteristics;
extracting the features of the segmented image to obtain 6 types of features including the area, the perimeter, the shape parameters, the gray mean, the gray variance and the HOG of the segmented image;
binarizing the segmented image to obtain the area, perimeter and shape parameters of the target, and defining the area A of the binary image with the size of M multiplied by N as
A = Σ (x=1 to M) Σ (y=1 to N) f(x, y)
that is, the total number of white pixels. The edge of the binary image is found with the bwperim function, and the number of boundary pixels of the connected region is then counted to obtain the perimeter P of the target object. The shape parameter of a region can be quantified by comparing the area of the region with the area of a circle having the same perimeter; the shape parameter is then defined as:
F = P² / (4πA)
which describes the degree to which the target region deviates from a circle: when the target region is circular, F = 1; for other shapes F > 1, and the larger the deviation, the larger F;
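A sketch of these three shape features on a binary crop follows; taking the perimeter as the count of boundary pixels (the role played by the bwperim function in the text) is an approximation.

```python
import numpy as np
from scipy import ndimage

def shape_features(binary):
    """Return area A, perimeter P and shape parameter F = P^2 / (4*pi*A)."""
    mask = binary > 0
    area = int(mask.sum())                              # number of white pixels
    eroded = ndimage.binary_erosion(mask)
    perimeter = int((mask & ~eroded).sum())             # boundary pixel count
    shape_param = perimeter ** 2 / (4.0 * np.pi * area) if area else 0.0
    return area, perimeter, shape_param
```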
the gray mean value and the gray variance can reflect the total gray level of the segmentation area to a certain extent, and related targets can be classified according to different gray value distributions. In order to obtain the gray mean value and the gray variance of the target, a gray histogram of the target object is drawn firstly.
Taking the gray mean and the gray variance of the target image as the characteristics of the target object; the formula for the mean and variance of gray levels is as follows:
S_gray = (1/(m × n)) Σ (i) i × H(i)
S_var = (1/(m × n)) Σ (x=1 to m) Σ (y=1 to n) (g(x, y) − S_gray)²
where H(i) is the number of pixels with gray level i; S_gray is the gray mean; m and n are the length and width of the segmented image; g(x, y) represents the gray value of the pixel at (x, y); and S_var is the gray variance;
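A short sketch of the two gray-level statistics over a segmented crop:

```python
import numpy as np

def gray_statistics(crop):
    """Gray mean S_gray and gray variance S_var of the segmented image."""
    g = crop.astype(np.float64)
    s_gray = g.mean()                 # equivalent to sum over i of i * H(i) / (m * n)
    s_var = ((g - s_gray) ** 2).mean()
    return s_gray, s_var
```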
The histogram of oriented gradients (HOG) is a common feature extraction algorithm; combined with an SVM classifier, HOG features are widely used for image classification. The implementation process of the HOG feature extraction algorithm is as follows (a code sketch is given after the steps):
s51: the sizes of the images after target segmentation are inconsistent, so the images are first scaled to 256 × 256; otherwise the extracted HOG feature dimensions would differ and affect subsequent SVM prediction;
s52: graying the image, and normalizing and correcting the color space of the image, wherein the step is used for adjusting the contrast of the image and reducing the noise influence;
s53: calculating image gradient, and solving vertical edge, horizontal edge, edge strength and edge slope of the image;
s54: cells are constructed from 8×8 pixels, and a histogram of gradient directions is computed for each cell, usually with 9 direction bins (other numbers are possible), i.e. every 360/9 = 40 degrees forms one direction; each direction is weighted by the edge intensity of the pixel, and finally the histogram is normalized;
s55: solving a gradient direction histogram of each cell, and finally normalizing the histogram;
s56: a block is synthesized from 2×2 cells, and the features of all blocks are concatenated;
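The steps S51-S56 can be sketched with scikit-image's hog implementation, using the stated cell, block and bin settings; treating this as equivalent to the invention's own implementation is an assumption.

```python
from skimage.transform import resize
from skimage.feature import hog

def hog_features(crop):
    """Resize to 256x256, then extract a normalized HOG descriptor."""
    img = resize(crop, (256, 256), anti_aliasing=True)   # S51: unify image size
    return hog(img,
               orientations=9,            # 9 direction bins
               pixels_per_cell=(8, 8),    # 8x8-pixel cells
               cells_per_block=(2, 2),    # 2x2-cell blocks
               block_norm='L2-Hys')       # per-block normalization
```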
s6: classification by a particle swarm algorithm (CFWPSO_GA) improved based on a genetic algorithm;
the population inheritance and cross variation process is simulated through the genetic algorithm, so that when the PSO finds the optimal parameters for the SVM algorithm, the PSO not only can give consideration to local optimization while carrying out global optimization, but also can increase the diversity of population particles, and solve the problems that the particle swarm algorithm is easy to precocious convergence and falls into the local optimal solution.
The support vector machine is a machine learning method that handles small-sample, nonlinear and high-dimensional problems well; the idea of the SVM is to find, based on the training set, an optimal hyperplane in the sample space that maximizes the classification margin; the optimal hyperplane equation is set as:
w^T x + b = 0
for the non-linear problem, the constraints are:
y_i(w^T x_i + b) ≥ 1 − ε_i, ε_i ≥ 0
the objective function is:
min (1/2)‖w‖² + C Σ (i=1 to n) ε_i
for SVM algorithms, the selection of the kernel parameters is determined to be important, and the invention adopts a radial basis function kernel.
K(x_i, x_j) = exp(−γ‖x_i − x_j‖²)
In the radial basis function kernel, the penalty coefficient C and the gamma coefficient are of great importance, and the optimal C and g parameters need to be searched for by an algorithm.
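A sketch of the parameter-fitness evaluation, assuming scikit-learn: each candidate (C, g) pair is scored by the mean 3-fold cross-validation accuracy of an RBF-kernel SVM, matching the fitness function described later for the improved particle swarm search.

```python
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def svm_fitness(params, X, y):
    """Mean 3-fold CV accuracy of an RBF-kernel SVM for a (C, gamma) candidate."""
    C, gamma = params
    clf = SVC(kernel='rbf', C=C, gamma=gamma)
    return cross_val_score(clf, X, y, cv=3, scoring='accuracy').mean()
```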
The particle swarm optimization algorithm is essentially a global optimization algorithm for random search, has high convergence rate, is easy to realize, and has good universality, and the formula of the particle swarm optimization algorithm is as follows:
formula 2: v_i(t+1) = ω*v_i(t) + c_1*r_1*(pbest − x_i(t)) + c_2*r_2*(gbest − x_i(t))
formula 3: x_i(t+1) = x_i(t) + v_i(t+1)
where ω is the inertia weight; i is the particle index; x represents the particle position; v represents the particle velocity; t is the number of update iterations; gbest is the position of the global optimal solution; pbest is the position of the individual optimal solution; and c_1 and c_2 are acceleration factors.
The PSO guides the search according to the particles' own velocities, retains a population-based global search, and uses a velocity-displacement model. The algorithm is easy to implement, has few parameters, and avoids the complex crossover and mutation operations of genetic algorithms.
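One velocity/position update (formulas 2 and 3) for a swarm whose positions encode the (C, g) pair can be sketched as follows; the parameter bounds are illustrative assumptions, while c1 = c2 = 2.3 follows the experimental settings given later.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w, c1=2.3, c2=2.3,
             bounds=((0.01, 100.0), (1e-4, 10.0))):
    """One velocity/position update for the swarm (formulas 2 and 3)."""
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # formula 2
    x = x + v                                                    # formula 3
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return np.clip(x, lo, hi), v
```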
The particle swarm optimization of the self-adaptive inertia weight improves the inertia weight in the standard particle swarm optimization, adopts a nonlinear dynamic inertia weight coefficient formula to balance the global and local searching capabilities of the particle swarm optimization, and the formula is as follows:
formula 1:
ω = ω_min + (ω_max − ω_min)*(f − f_min)/(f_avg − f_min), when f ≤ f_avg; ω = ω_max, when f > f_avg
where f is the current objective-function value of the particle, and f_avg and f_min are respectively the average and minimum values of all current particles.
The inertia weight describes how strongly the particle's current velocity is influenced by its previous velocity: when ω is large, the influence of the previous velocity is large, which benefits global search; when ω is small, the influence of the previous velocity is small, which benefits later local search. Local minima can be escaped by adjusting the value of the inertia weight ω, which changes as the particle's objective-function value changes.
The PSO model with the introduced contraction factor adopts the adjustment method of the contraction factor, can ensure the convergence of the PSO algorithm by selecting proper parameters, and can cancel the boundary limit of the speed. The self-adaptive inertia weight particle swarm algorithm with the introduced shrinkage factor can effectively control the flight speed of the particles, so that the algorithm achieves effective balance between global search and local search.
The invention provides a hybrid optimization algorithm (CFWPSO_GA) which introduces a genetic algorithm (GA) crossover operator on the basis of the adaptive-weight particle swarm algorithm with contraction factor (CFWPSO).
In the genetic algorithm, crossover and mutation are the two main operations; they transfer information from parent to child individuals and guarantee the optimization of the individuals. The invention therefore combines the crossover idea of the genetic algorithm with the particle swarm algorithm to increase the diversity of the population particles and alleviate the tendency of the particle swarm algorithm toward premature convergence and getting trapped in local optima. In the improved algorithm, the specific idea of crossover is as follows: in each iteration, the fitness values of the particles are sorted from large to small; the larger the fitness value, the better the particle, and the fitness function is the average classification accuracy under 3-fold cross validation. The particles with higher fitness values are first selected to enter the next generation directly, so that the next generation of particles is relatively better; at the same time, the positions and velocities of the better half of the particles are assigned to the worse half, the particles are randomly crossed in pairs to generate the same number of child particles, and the child particles replace the parent particles; the position and velocity of a child particle are calculated as
child(x) = p*parent_1(x) + (1 − p)*parent_2(x)
child(v) = p*parent_1(v) + (1 − p)*parent_2(v)
where p is a random number between 0 and 1; parent_1(v) and parent_2(v) are the velocities of two different parent particles; parent_1(x) and parent_2(x) are the positions of two different parent particles.
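A sketch of this crossover step follows: particles are ranked by fitness, the better half passes to the next generation unchanged, and randomly paired copies of it produce children by the arithmetic crossover above that replace the worse half; the pairing scheme is an interpretation of the description, not a verified detail.

```python
import numpy as np

def ga_crossover(x, v, fitness):
    """Replace the worse half of the swarm with crossover children of the better half."""
    order = np.argsort(fitness)[::-1]                # sort fitness from large to small
    n_half = len(order) // 2
    top, bottom = order[:n_half], order[n_half:]
    perm = np.random.permutation(n_half)             # random pairing among the better half
    for child_idx, (a, b) in zip(bottom, zip(top, top[perm])):
        p = np.random.rand()
        x[child_idx] = p * x[a] + (1.0 - p) * x[b]   # child position
        v[child_idx] = p * v[a] + (1.0 - p) * v[b]   # child velocity
    return x, v
```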
In this way the fast convergence of the PSO algorithm is retained, while the new particles generated by the GA crossover step increase particle diversity and help the swarm jump out of local optima;
the improved particle swarm algorithm (CFWPSO-GA) comprises the following specific steps:
s61, initializing relevant parameters and the speed of the population particles, and calculating the initial fitness value of the particles;
s62, searching for the initial extrema, and determining the individual extremum pbest and the group extremum gbest;
S63, updating the inertia weight according to the formula 1; updating the speed and the position of the particles according to the formulas 2 and 3;
s64, calculating the fitness value of the current position of each particle; sorting the fitness values of the particles from big to small, selecting the first half of the particles with higher fitness values to directly enter the next generation, assigning the speed and the position to the second half of the particles with lower fitness values, randomly crossing the particles in pairs to generate particles with the same number, and replacing parent particles with child particles;
s65, updating the individual extremum pbest and the group extremum gbest according to the fitness values of the new particles;
S66, if the stopping condition is met or the maximum iteration number is exceeded, outputting an optimal solution; otherwise, jumping to the step S63 to continue searching;
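The loop below ties steps S61-S66 together as a sketch; it reuses the svm_fitness, pso_step and ga_crossover sketches given earlier, and the adaptive inertia weight follows the form assumed above for formula 1 (applied to 1 − accuracy as the objective), so it should be read as illustrative rather than as the exact implementation of the invention.

```python
import numpy as np

def cfwpso_ga(X, y, n_particles=20, n_iter=50, w_max=1.2, w_min=0.8,
              bounds=((0.01, 100.0), (1e-4, 10.0))):
    """Search (C, gamma) for the SVM with the GA-improved adaptive-weight PSO (sketch)."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = lo + np.random.rand(n_particles, 2) * (hi - lo)        # S61: init positions
    v = np.zeros_like(x)
    fit = np.array([svm_fitness(p, X, y) for p in x])          # initial fitness values
    pbest, pbest_fit = x.copy(), fit.copy()                    # S62: initial extrema
    gbest = x[fit.argmax()].copy()
    for _ in range(n_iter):                                    # S63-S66
        err = 1.0 - fit                                        # treat 1 - accuracy as the
        f_avg, f_min = err.mean(), err.min()                   # objective f of formula 1 (assumption)
        for i in range(n_particles):
            if err[i] <= f_avg and f_avg > f_min:              # assumed form of formula 1
                w = w_min + (w_max - w_min) * (err[i] - f_min) / (f_avg - f_min)
            else:
                w = w_max
            x[i:i+1], v[i:i+1] = pso_step(x[i:i+1], v[i:i+1],  # formulas 2 and 3
                                          pbest[i:i+1], gbest, w, bounds=bounds)
        fit = np.array([svm_fitness(p, X, y) for p in x])      # S64: fitness of new positions
        x, v = ga_crossover(x, v, fit)                         # crossover replaces worse half
        fit = np.array([svm_fitness(p, X, y) for p in x])
        better = fit > pbest_fit                               # S65: update extrema
        pbest[better] = x[better]
        pbest_fit[better] = fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest, pbest_fit.max()                              # S66: best (C, gamma) found
```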
the technical flow of the improved algorithm of the invention is shown in figure 1:
Six hundred target objects of three types (bottles, cans and beverage boxes) are selected, 200 of each type, yielding 600 segmented images, of which 60 serve as the training set and 540 as the test set. Tests are performed with the standard particle swarm algorithm, the genetic algorithm and the CFWPSO-GA algorithm improved according to the present invention (with learning factors c1 = 2.3, c2 = 2.3, inertia weights wmax = 1.2 and wmin = 0.8, crossover probability pc = 0.6 and mutation probability pm = 0.002), and the target objects are classified; bottles, cans and beverage boxes are represented by the numbers 1, 2 and 3 respectively. The classification results of the three algorithms are shown in FIG. 2.
When the standard particle swarm algorithm is used to classify the target objects, 393 of the 540 test samples are classified correctly, an accuracy of 72.78%; when the genetic algorithm is used, 337 targets are classified correctly, an accuracy of 62.41%; when the CFWPSO-GA algorithm proposed by the invention is used, 435 targets are classified correctly, a classification accuracy of 80.56%. Confusion matrices of the three algorithms are drawn from the prediction results; the confusion matrix, also called an error matrix, is a standard format for accuracy evaluation in which each column represents a predicted category and each row represents the true category of the data, as shown in FIG. 3.
The macro precision (macroP), macro recall (macroR) and macro F1 (macroF1) of the three algorithms are calculated from the confusion matrices. First, the bottle is regarded as the target class and the other objects as non-target classes; the true positives tp are calculated to be 160, the false negatives fn 20, the false positives fp 48 and the true negatives tn 275, and the precision and recall are calculated with the following formulas:
Precision = tp / (tp + fp)
Recall = tp / (tp + fn)
it was found that Precision1 was 0.7692 and Recall1 was 0.8889. Then, regarding the tank as a target object, and regarding the other tanks as non-target objects, calculating the Precision ratio Precision2 of 0.8671 and the Recall ratio Recall2 of 0.7611; then, the beverage box is regarded as a target object, and the other objects are non-target objects, the Precision ratio Precision3 is 0.7931, and the Recall ratio Recall3 is 0.7667.
And respectively calculating the classification results of the particle swarm algorithm and the genetic algorithm in the same way to obtain the accuracy and the recall rate of the 3 algorithms.
Table 1: precision, recall and accuracy of three classification models
Recall represents the proportion of actual positives that are predicted as positive. From the table above it can be seen that the Recall values of the improved CFWPSO_GA algorithm are larger than those of the other two algorithms, showing that when the improved algorithm is used for classification a higher proportion of positives is correctly predicted and the classification is more accurate; its Recall1 is larger than Recall2 and Recall3, indicating that the improved algorithm identifies the bottle class easily. In the other two algorithms, Recall2 is larger than Recall1 and Recall3, indicating that those algorithms identify the can class easily. In all three algorithms, Precision1 is smaller than Precision2 and Precision3; since Precision represents the proportion of correct predictions among all samples predicted as positive, this indicates that other targets are more likely to be predicted as bottles, i.e. all three algorithms tend to confuse bottles with other objects.
From the precision and recall of the three algorithms in the table above, the macro precision (macroP), macro recall (macroR) and macro F1 (macroF1) of the whole multi-class model can be calculated; the formulas are as follows:
macroP = (1/n) Σ (i=1 to n) Precision_i
macroR = (1/n) Σ (i=1 to n) Recall_i
macroF1 = 2 × macroP × macroR / (macroP + macroR)
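A short sketch computing the macro-averaged metrics from per-class precision and recall, using the CFWPSO_GA per-class values reported above as the example input:

```python
def macro_metrics(precisions, recalls):
    """Macro precision, macro recall and macro F1 over the per-class values."""
    macro_p = sum(precisions) / len(precisions)
    macro_r = sum(recalls) / len(recalls)
    macro_f1 = 2 * macro_p * macro_r / (macro_p + macro_r)
    return macro_p, macro_r, macro_f1

# CFWPSO_GA per-class values reported in the text
print(macro_metrics([0.7692, 0.8671, 0.7931], [0.8889, 0.7611, 0.7667]))
```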
Table 2: Macro precision, macro recall and macro F1 of the three classification models
According to the calculated macro precision, macro recall and macro F1, the GA algorithm gives the worst classification result of the three algorithms, while the particle swarm algorithm improved by introducing the genetic algorithm, proposed by the invention, gives the best. The GA algorithm has difficulty finding the optimal parameters accurately and therefore cannot reach a good classification accuracy. The particle swarm algorithm performs better than the genetic algorithm in cyclic optimization, but it is prone to premature convergence or getting trapped in local optima, and its optimization time is longer than that of the genetic algorithm; the adaptive inertia-weight particle swarm algorithm with contraction factor effectively alleviates this problem. The particle swarm algorithm improved by introducing the genetic algorithm, proposed by the invention, combines the balance of global and local optimization provided by the adaptive inertia weight with the faster convergence brought by the contraction factor, while the genetic crossover increases population diversity.
In conclusion, the method can accurately identify and classify targets in forward-looking sonar images, with an accuracy high enough to meet the requirements of image classification.
The above example shows only some embodiments of the present invention, and its description is relatively specific and detailed, but it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention.

Claims (9)

1. A forward-looking sonar image recognition and classification method based on an improved CFWPSO-SVM is characterized by comprising the following steps:
S1: adaptive median-mean filtering;
S2: gray-scale linear transformation;
S3: binarization of the sonar image;
S4: segmentation of the sonar image;
S5: feature extraction;
S6: classification by a particle swarm algorithm (CFWPSO_GA) improved based on a genetic algorithm.
2. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM is characterized in that: and S1: the self-adaptive median-average filtering is based on the common median filtering, the size of a filtering window is changed according to a preset condition, whether a current pixel is noise or not is judged according to a certain condition, if yes, the current pixel is replaced by a neighborhood median, and finally the average filtering is used for eliminating burrs of an image contour.
3. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM is characterized in that: the gray scale linear transformation in the S2 is to utilize a linear single-value function to linearly expand each pixel in the image, so that the visual effect of the image is effectively enhanced, the difference between a background and a target object is increased, and the accuracy of the subsequent binaryzation processing and segmentation of the image is improved, wherein the formula is as follows:
g(x,y)=k×f(x,y)+b
when k >1, the pixel values of the image all increase after transformation, and the contrast of the image also increases.
4. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM is characterized in that: the sonar image in S3 is binarized as follows: a fixed threshold T is set, and for each candidate threshold t greater than T the histogram is divided into background and foreground; the gray variances σ_B(t) and σ_F(t) of the background and foreground are calculated, as are the probabilities ω_B(t) and ω_F(t) that a pixel belongs to the background and the foreground; let σ_t = ω_B(t)σ_B(t) + ω_F(t)σ_F(t), and select the optimal threshold t_a = argmin(σ_t); the optimal threshold t_a obtained by Otsu's algorithm is used as the threshold for binarizing the image.
5. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM according to claim 1 is characterized in that the specific steps of sonar image segmentation in S4 are as follows:
S41: each bright spot is segmented by a region-growing segmentation algorithm; after segmentation the image yields a number of region stacks, the number of elements in each region stack is counted, and this number is regarded as the area of the bright spot;
S42: because the area of a bright spot formed by the target to be identified lies within a certain range (neither too small nor too large), a threshold is set to remove regions that are too small or too large, yielding the suspected target regions;
S43: the mean of the stack of each suspected target region is calculated to obtain the centroid of each bright spot; a rectangle centered on the bright-spot centroid is cut from the unprocessed image to complete the segmentation of the sonar image;
S44: according to the coordinates of the target object determined from the bright-spot regions, the corresponding region is cut from the image before binarization.
6. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM is characterized in that: and the characteristic extraction of S5 is to perform characteristic extraction on the segmented image to obtain the area, the perimeter, the shape parameters, the gray mean, the gray variance and the HOG characteristic of the segmented image.
7. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM according to claim 6 is characterized in that the process of the HOG feature extraction algorithm is as follows:
S51: the sizes of the images after target segmentation are inconsistent, so the images are first scaled to 256 × 256; otherwise the extracted HOG feature dimensions would differ and affect subsequent SVM prediction;
S52: the image is grayed; the color space of the image is normalized and corrected, which adjusts the contrast of the image and reduces the influence of noise;
S53: the image gradient is calculated; the vertical edges, horizontal edges, edge strength and edge slope of the image are obtained;
S54: cells are constructed from 8×8 pixels; a histogram of gradient directions is computed for each cell and finally normalized;
S55: blocks are synthesized from 2×2 cells; the features of all blocks are concatenated.
8. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM is characterized in that: in S6, classification is carried out by a particle swarm optimization algorithm (CFWPSO_GA) improved based on a genetic algorithm;
the method comprises the following specific steps:
s61, initializing relevant parameters and the speed of the population particles, and calculating the initial fitness value of the particles;
s62, searching for the initial extrema, and determining the individual extremum pbest and the group extremum gbest;
S63, updating the inertia weight according to the formula 1; updating the speed and position of the particles according to equations 2 and 3;
formula 1:
ω = ω_min + (ω_max − ω_min)*(f − f_min)/(f_avg − f_min), when f ≤ f_avg; ω = ω_max, when f > f_avg
formula 2: v_i(t+1) = ω*v_i(t) + c_1*r_1*(pbest − x_i(t)) + c_2*r_2*(gbest − x_i(t))
formula 3: x_i(t+1) = x_i(t) + v_i(t+1)
where ω is the inertia weight; i is the particle index; x represents the particle position; v represents the particle velocity; t is the number of update iterations; gbest is the position of the global optimal solution; pbest is the position of the individual optimal solution; c_1 and c_2 are acceleration factors; f is the current objective-function value of the particle, and f_avg and f_min are the average and minimum values over all current particles;
s64, calculating the fitness value of the current position of each particle; sorting the fitness values of the particles from big to small, selecting the first half of the particles with higher fitness values to directly enter the next generation, assigning the speed and the position to the second half of the particles with lower fitness values, randomly crossing the particles in pairs to generate particles with the same number, and replacing parent particles with child particles;
s65, updating the individual extremum pbest and the group extremum gbest according to the fitness values of the new particles;
S66, if the stopping condition is met or the maximum iteration number is exceeded, outputting an optimal solution; otherwise, jumping to step S63 to continue searching.
9. The forward-looking sonar image recognition and classification method based on the improved CFWPSO-SVM according to claim 8 is characterized in that the specific idea of the crossover in S64 is as follows: in each iteration, the fitness values of the particles are sorted from large to small; the larger the fitness value, the better the particle, and the fitness function is the average classification accuracy under 3-fold cross validation; the particles with higher fitness values are first selected to enter the next generation directly, so that the next generation of particles is relatively better; at the same time, the positions and velocities of the better half of the particles are assigned to the worse half, the particles are randomly crossed in pairs to generate the same number of child particles, and the child particles replace the parent particles; the position and velocity of a child particle are calculated as:
child(x) = p*parent_1(x) + (1 − p)*parent_2(x)
child(v) = p*parent_1(v) + (1 − p)*parent_2(v)
where p is a random number between 0 and 1; parent_1(v) and parent_2(v) are the velocities of two different parent particles; parent_1(x) and parent_2(x) are the positions of two different parent particles.
CN202211530587.8A 2022-12-01 2022-12-01 Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method Pending CN115908930A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211530587.8A CN115908930A (en) 2022-12-01 2022-12-01 Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211530587.8A CN115908930A (en) 2022-12-01 2022-12-01 Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method

Publications (1)

Publication Number Publication Date
CN115908930A true CN115908930A (en) 2023-04-04

Family

ID=86474442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211530587.8A Pending CN115908930A (en) 2022-12-01 2022-12-01 Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method

Country Status (1)

Country Link
CN (1) CN115908930A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070263936A1 (en) * 2004-08-14 2007-11-15 Yuri Owechko Cognitive signal processing system
CN103049805A (en) * 2013-01-18 2013-04-17 中国测绘科学研究院 Vehicle route optimization method with time window constraint based on improved particle swarm optimization (PSO)
CN103972908A (en) * 2014-05-23 2014-08-06 国家电网公司 Multi-target reactive power optimization method based on adaptive chaos particle swarm algorithm
CN106372756A (en) * 2016-09-07 2017-02-01 南京工程学院 Thermal power plant load optimization distribution method based on breeding particle swarm optimization
CN108399450A (en) * 2018-02-02 2018-08-14 武汉理工大学 Improvement particle cluster algorithm based on biological evolution principle
CN113887114A (en) * 2021-09-18 2022-01-04 湖南大学 Automobile transmission parameter optimization method based on improved multi-target hybrid particle swarm algorithm
CN115240059A (en) * 2022-07-13 2022-10-25 江苏海洋大学 Improved PSO-SVM-based forward-looking sonar target recognition method
CN115271237A (en) * 2022-08-09 2022-11-01 淮阴工学院 Industrial data quality prediction method based on improved PSO-GA and SVM

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yu Mengxin: "Research on Remote Sensing Image Classification Based on GA-PSO Optimized Support Vector Machine", China Master's Theses Full-text Database, Engineering Science and Technology II *
Jiang Wen et al.: "Remote Sensing Image Classification with a Support Vector Machine Based on an Improved Particle Swarm Algorithm", Journal of Jiangsu University of Science and Technology (Natural Science Edition) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116864109A (en) * 2023-07-13 2023-10-10 中世康恺科技有限公司 Medical image artificial intelligence auxiliary diagnosis system
CN116864109B (en) * 2023-07-13 2024-06-18 中世康恺科技有限公司 Medical image artificial intelligence auxiliary diagnosis system
CN116664565A (en) * 2023-07-28 2023-08-29 江苏森标科技有限公司 Hidden crack detection method and system for photovoltaic solar cell

Similar Documents

Publication Publication Date Title
CN115908930A (en) Improved CFWPSO-SVM-based forward-looking sonar image recognition and classification method
Garg et al. Binarization techniques used for grey scale images
CN106373146B (en) A kind of method for tracking target based on fuzzy learning
CN110263660A (en) A kind of traffic target detection recognition method of adaptive scene changes
CN113486764B (en) Pothole detection method based on improved YOLOv3
CN115240059A (en) Improved PSO-SVM-based forward-looking sonar target recognition method
CN113139979A (en) Edge identification method based on deep learning
CN112329832B (en) Passive positioning target track data enhancement method and system based on deep convolution generation countermeasure network
CN108399420A (en) A kind of visible light naval vessel false-alarm elimination method based on depth convolutional network
Zheng et al. Improvement of grayscale image 2D maximum entropy threshold segmentation method
CN106056165B (en) A kind of conspicuousness detection method based on super-pixel relevance enhancing Adaboost classification learning
CN111274964B (en) Detection method for analyzing water surface pollutants based on visual saliency of unmanned aerial vehicle
CN114022586B (en) Defect image generation method based on countermeasure generation network
CN112329771B (en) Deep learning-based building material sample identification method
CN115131590B (en) Training method of target detection model, target detection method and related equipment
CN110633727A (en) Deep neural network ship target fine-grained identification method based on selective search
Vil’kin et al. Algorithm for segmentation of documents based on texture features
CN105512675B (en) A kind of feature selection approach based on the search of Memorability multiple point crossover gravitation
CN116071339A (en) Product defect identification method based on improved whale algorithm optimization SVM
CN115880572A (en) Forward-looking sonar target identification method based on asynchronous learning factor
CN110097067B (en) Weak supervision fine-grained image classification method based on layer-feed feature transformation
CN116823725A (en) Aeroengine blade surface defect detection method based on support vector machine
CN114139631A (en) Multi-target training object-oriented selectable ash box confrontation sample generation method
CN113838099A (en) Twin neural network-based single target tracking method
CN111429419A (en) Insulator contour detection method based on hybrid ant colony algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230404