CN111222470A - Visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet - Google Patents


Publication number: CN111222470A (application CN202010020806.2A)
Authority: CN (China)
Prior art keywords: image, PCANet, representing, Gaussian distribution, ship
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202010020806.2A
Other languages: Chinese (zh)
Inventors: 王楠, 王越, 李波, 韦星星, 王永华
Current assignee: Beihang University (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Beihang University
Application filed by Beihang University; priority to CN202010020806.2A; publication of CN111222470A

Classifications

    • G06V20/13 Satellite images (G06V20/00 Scenes; G06V20/10 Terrestrial scenes)
    • G06F18/24 Classification techniques (G06F18/00 Pattern recognition; G06F18/20 Analysing)
    • G06F18/2433 Single-class perspective, e.g. one-against-all classification; novelty detection; outlier detection
    • G06V10/267 Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
    • G06V2201/07 Target detection (indexing scheme)

Abstract

The invention discloses a visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet. First, an anomaly detection algorithm based on the multivariate Gaussian distribution treats a ship target as an anomaly against the sea background, effectively extracting ship candidate regions while reducing the number of false alarms; then a PCANet strategy confirms the ship targets and eliminates the remaining false alarms; finally, a non-maximum suppression strategy removes overlapping boxes on the same target. The method maintains good detection performance under complex ocean background conditions and is robust to external complicating factors of satellite image data such as uneven illumination and low contrast.

Description

Visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet.
Background
Ship target detection has extremely important significance and application value in both the military and civilian fields. Major applications in the military field include battlefield environment assessment and monitoring of terrorist activity. Rapidly finding ship targets of interest in large volumes of optical remote sensing data is a technical prerequisite for tracking, locating and identifying ship targets. In the civilian field, real-time knowledge of the locations of vessels in a target sea area helps to manage that sea area better and plays an important role in marine traffic, safety and rescue. Therefore, ship target detection is of great significance for satellite optical imaging in ocean monitoring.
In recent years, a number of ship candidate region extraction methods have been proposed among the conventional approaches. Ship candidate regions are extracted by methods such as Bayesian decision, compressed-domain analysis, sparse features based on multilayer sparse coding, Gamma distribution modelling, and constant false alarm rate (CFAR) detection. In the fine-detection stage, false alarms are further eliminated by means of texture and local shape features, an Extreme Learning Machine (ELM), a Support Vector Machine (SVM), a maximum-likelihood decision algorithm, a visual saliency model, a sparse bag-of-words model, and the like. When the sea surface scene is relatively simple, these methods obtain good detection results. However, they have limitations in the following three cases: (a) the contrast of the ship is low; (b) the sea surface exhibits complex sea conditions, such as heavy waves or uneven illumination intensity; (c) false alarm sources such as clouds, reefs, harbours and islands are present. In addition, the above algorithms also produce false positives to varying degrees when multiple ships are docked together.
Traditional object detection techniques consist of two steps: search and classification. A sliding window searches for and determines the location of the target, and hand-designed features and classifiers then classify the contents of the window based on those features. Compared with such traditional methods, deep learning approaches are more effective and faster for target detection. For target detection in natural images, researchers have proposed deep-learning-based techniques, such as RCNN, Fast-RCNN, R-FCN, YOLO, SSD, FPN and Mask-RCNN, whose pipelines include raw data set labelling and neural network training, and which detect targets with models trained on public natural image data sets. However, these methods have a drawback here: the imaging mechanism of remote sensing images differs from that of natural images, and the target data do not resemble the corresponding target classes in natural images.
Therefore, how to provide a visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet is a problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the invention provides a visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet. It first extracts candidate regions with an anomaly detection technique based on the multivariate Gaussian distribution, effectively shrinking the subsequent search area and reducing the missed-detection rate; the candidate region slices are then used as the input of a PCANet network for feature extraction and classification, further eliminating false alarms and improving detection accuracy.
In order to achieve the purpose, the invention adopts the following technical scheme:
a ship detection method based on multivariate Gaussian distribution and PCANet visible light remote sensing images comprises the following steps:
(1) an original image of M × N pixels is evenly divided into image blocks of P × P pixels, so the original image is divided into (M × N)/(P × P) = L blocks; the pixels of each P × P image are rearranged through a sliding window of K × K pixels with step size S, i.e. each pixel is replaced by the (K × K) × 1 vector of its sliding window;
(2) through step (1), an image block of P × P pixels finally forms an image block I_i (i = 1, 2, 3, …, L) of ((P−K)/S + 1) × ((P−K)/S + 1) pixels, each pixel in the image block corresponding to the centre of its respective sliding window in the original image;
(3) each row of the image block I_i is taken as one sample x_i; suppose each sample x_i obeys a Gaussian distribution, and use the samples to estimate the overall sample block parameters, the mean μ and the covariance Σ, in order to model the feature vectors;
(4) anomaly detection is carried out on the image block I_i with the multivariate Gaussian distribution; the anomaly score is M(x) = (x − μ)^T Σ^(−1) (x − μ), where μ denotes the sample mean and Σ^(−1) denotes the inverse of the covariance matrix; M(x) is compared with a threshold as the condition for judging abnormality;
(5) the pixel points whose M(x) exceeds the threshold in step (4) are mapped back to their original image positions and set to 1 as abnormal points, while pixel points below the threshold are set to 0 as non-abnormal areas, giving a binary image extracted from the original image by anomaly detection;
(6) isolated holes are then eliminated with morphological operations, and 8-connectivity labelling is performed on the connected domains; the position of the maximum bounding rectangle of each connected domain is obtained as a four-dimensional vector, and target candidate slices whose connected domain area falls within a certain threshold interval are screened out according to the area threshold;
(7) the target candidate region slices obtained in step (6) are scaled to slices of the same size to form a test set, and the test set is used as the input of a PCANet model to classify the candidate slices;
(8) according to the classification results obtained in step (7), the target index numbers corresponding to the ship-target and non-target labels are extracted, and the ship targets are marked with boxes at the target positions from step (6);
(9) a non-maximum suppression strategy eliminates overlapping boxes on the same ship: if the intersection area of overlapping boxes is larger than a certain threshold, the boxes are merged into one valid box; finally, the partitioned image blocks are stitched back into the original image, and the combined detection results of the small blocks constitute the detection result of the original large image.
Preferably, in step (2), when each pixel is replaced by the (K × K) × 1 vector of its sliding window, rearranging the pixels of the original image yields a sample set whose size is given by the following formula:
[((P−K)/S) + 1]² × (K × K) (1)
where S is the step size of the sliding window, in pixels.
Preferably, in step (4), when extracting the target candidate region, the formula of the multivariate Gaussian distribution is as follows:
p(x) = 1 / ((2π)^(n/2) |Σ|^(1/2)) · exp(−(1/2)(x − μ)^T Σ^(−1) (x − μ)) (2)
μ = (1/m) · Σ_{i=1}^{m} x_i (3)
Σ = (1/m) · Σ_{i=1}^{m} (x_i − μ)(x_i − μ)^T (4)
where μ is the sample feature mean; Σ is the covariance, obtained by subtracting the mean μ from every sample, forming the outer product of each centred sample with itself, and averaging; x is a sample after pixel rearrangement; i is the sample index; m is the total number of samples; n is the dimension of each sample; and T denotes the matrix transpose.
Preferably, in step (7), the filters of the first stage of the PCANet model are expressed by the following formula:
W_f^1 = e_f(X X^T), f = 1, 2, …, L_1 (6)
where e_f denotes extracting the eigenvector corresponding to the f-th largest eigenvalue of the matrix; X is the matrix whose columns are the sample vectors obtained by preprocessing each slice image I after the scaling of step (7); X^T denotes its transpose; W_f^1 denotes a filter of the first stage; N is the number of training samples; f is the filter index of the first stage; and L_1 denotes the total number of filters in the first stage.
Preferably, in step (7), the filters used in the second stage are expressed by the following formula:
W_f^2 = e_f(Y Y^T), f = 1, 2, …, L_2 (7)
where e_f denotes extracting the eigenvector corresponding to the f-th largest eigenvalue of the matrix; Y_i denotes the convolution result of the N pictures with one first-stage filter; Y denotes the block-sample form of the second-stage input data and Y^T the transpose of Y; the eigenvectors corresponding to the first L_2 largest eigenvalues are used as the filters W_f^2 of the second stage; and L_2 denotes the number of filters in the second stage.
Preferably, in step (7), the binarized hash code can be expressed by the following formula:
T_i^l = Σ_{k=1}^{L_2} 2^(k−1) · H(P_i^{l,k}) (8)
where H denotes the unit step function; k = 1, 2, …, L_2 indexes the filters of the second stage; P_i^{l,k} denotes the principal component (PCA) map output of the first layer's l-th channel filtered by the k-th second-stage filter; and T_i^l denotes the result of the hash-coding operation.
Preferably, in step (7), the block histograms form the feature vector; the block-expansion histogram feature is obtained with the following formula:
λ_i = [Bhist(T_i^1), Bhist(T_i^2), …, Bhist(T_i^{L_1})]^T (9)
where λ_i belongs to the space R^((2^{L_2}) · L_1 · B); B is the number of divided blocks; Bhist denotes the histogram statistics function; and L_1, L_2 are reflected in that, for the first stage, each output matrix T_i^l (l = 1, 2, …, L_1) is divided into blocks and its histogram matrix counted, the value range of the histogram being [0, 2^{L_2} − 1].
Preferably, the PCANet model in step (7) mainly comprises three parts: (a) PCA filters; (b) binarized hash coding; (c) block histograms forming the feature vector.
Preferably, in step (9), the non-maximum suppression strategy eliminates overlapping boxes with the following formula:
S = { S_k, if A_IOU < N_t; 0, if A_IOU ≥ N_t } (10)
where S_k denotes the target candidate box corresponding to the classification score of the k-th candidate box and S_{k+1} that of the (k+1)-th candidate box; A_IOU denotes the intersection area of the k-th and (k+1)-th candidate boxes; N_t denotes the threshold; and S denotes the target candidate box retained after eliminating overlapping boxes according to the non-maximum suppression strategy.
Preferably, the following autocorrelation function is used to reflect the texture roughness of the ship target under different sea conditions:
C(ε, η, p, q) = [ Σ_{u=p−w}^{p+w} Σ_{v=q−w}^{q+w} f(u, v) · f(u+ε, v+η) ] / [ Σ_{u=p−w}^{p+w} Σ_{v=q−w}^{q+w} f²(u, v) ] (11)
This formula computes the correlation between the pixels inside the (2w+1) × (2w+1) window centred on (p, q) in the image f and the pixels offset by (ε, η), where ε, η = 0, ±1, ±2, …, ±T and T denotes the maximum offset; (u, v) denotes a pixel position in the image f; f(u, v) denotes the image pixel value at that position; (p, q) denotes the centre of the (2w+1) × (2w+1) window; and C(ε, η, p, q) denotes the autocorrelation of the window at (p, q) for a given offset (ε, η).
The invention has the beneficial effects that:
two main obstacles of ship target detection are how to extract candidate areas under a complex background and how to confirm a target under the condition that the target is similar to a false alarm, so that the invention provides a ship detection scheme based on multivariate Gaussian distribution anomaly detection and PCANet technology for ships in the ocean aiming at the sea surface with uneven illumination and complex changes of sea condition scenes, wherein the multivariate Gaussian distribution is adopted as the anomaly detection technology, and the method can effectively and comprehensively extract the ship candidate areas and reduce the omission ratio; the PCANet technology can better classify ships and non-ships, further reduce false alarm rate, improve detection rate, and simultaneously avoid traditional complex feature selection and extraction processes.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. Obviously, the following drawings only depict embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a diagram of a binary image of an optical image subjected to an anomaly detection algorithm based on multivariate Gaussian distribution according to the present invention.
Fig. 3 is a diagram showing the final result of the false alarm rejection by PCANet under different sea state backgrounds.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to the attached figure 1, the invention provides a visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet, which comprises the following steps:
step 1: dividing an original image of size M × N into image blocks of size P × P, the original image may be divided into (M × N)/(P × P) ═ L blocks, each for an image block Ii(i ═ 1,2, 3.., L). Each block of images is pixel rearranged by a step size S through a sliding window of size K × K. The resulting image size of the block of P × P images is (P-K +1) × (P-K +1), and each pixel corresponds to the center of its corresponding sliding window. In this scheme, the value of K is 5.
Step 2: when each pixel is replaced by a K × 1 vector in the sliding window, the pixels in the P × P original image are rearranged to form a sample size set of equation (4).
[((P-K)/S)+1]2×(K×K) (1)
Where S is the step size of the sliding window.
Step 3: Anomaly detection is carried out on the small image blocks with a Gaussian model. Suppose each sample x_i obeys a Gaussian distribution, and use the samples to estimate the parameters, mean μ and covariance Σ, in order to model the feature vectors:
p(x) = 1 / ((2π)^(n/2) |Σ|^(1/2)) · exp(−(1/2)(x − μ)^T Σ^(−1) (x − μ)) (2)
μ = (1/m) · Σ_{i=1}^{m} x_i (3)
Σ = (1/m) · Σ_{i=1}^{m} (x_i − μ)(x_i − μ)^T (4)
where μ is the sample feature mean; Σ is the covariance, obtained by subtracting the mean μ from every sample, forming the outer product of each centred sample with itself, and averaging; x is a sample after pixel rearrangement; i is the sample index; m is the total number of samples; n is the dimension of each sample; and T denotes the matrix transpose.
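The parameter estimates of equations (3) and (4) translate directly into code; a small sketch with illustrative names:

```python
import numpy as np

def fit_gaussian(samples):
    """Estimate the mean vector mu (eq. 3) and covariance matrix Sigma
    (eq. 4) from rearranged pixel samples, one K*K-dim sample per row."""
    mu = samples.mean(axis=0)
    centered = samples - mu
    # (1/m) * sum_i (x_i - mu)(x_i - mu)^T
    sigma = centered.T @ centered / samples.shape[0]
    return mu, sigma
```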
Step 4: Since the denominator part in equation (2) is close to 0, and experiments show that thresholding with equation (2) is equivalent to thresholding with equation (5), equation (2) can be simplified as:
M(x) = (x − μ)^T Σ^(−1) (x − μ) (5)
M(x) is the anomaly score compared against the threshold. A suitable threshold is selected; the area where pixels whose score exceeds the threshold are located is taken as the abnormal area and set to 1, while the area of pixels below the threshold is set to 0, giving a binary image extracted from the original image by anomaly detection.
Step 5: Isolated holes in the result of step 4 are eliminated with morphological operations, and 8-connectivity labelling is performed on the connected domains. The position of the maximum bounding rectangle of each connected domain is also obtained as a four-dimensional vector (x, y, w, h) (containing the coordinates of the top-left vertex of the maximum bounding rectangle, its width and its height). Target candidate slices whose connected domain area in the binary image falls within a certain threshold interval are screened out according to the area threshold.
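An illustrative pure-Python version of the 8-connectivity labelling and area screening. The BFS flood fill and the (x, y, w, h) bounding-box ordering follow the description in the text; the function and variable names are our own:

```python
from collections import deque

def connected_boxes(mask, area_min, area_max):
    """8-connected component labelling on a binary mask (list of rows of
    0/1), keeping the bounding box (x, y, w, h) of every component whose
    area lies inside [area_min, area_max]."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for r0 in range(h):
        for c0 in range(w):
            if mask[r0][c0] and not seen[r0][c0]:
                q = deque([(r0, c0)])
                seen[r0][c0] = True
                rmin = rmax = r0
                cmin = cmax = c0
                area = 0
                while q:
                    r, c = q.popleft()
                    area += 1
                    rmin, rmax = min(rmin, r), max(rmax, r)
                    cmin, cmax = min(cmin, c), max(cmax, c)
                    for dr in (-1, 0, 1):          # 8-neighbourhood
                        for dc in (-1, 0, 1):
                            rr, cc = r + dr, c + dc
                            if 0 <= rr < h and 0 <= cc < w \
                                    and mask[rr][cc] and not seen[rr][cc]:
                                seen[rr][cc] = True
                                q.append((rr, cc))
                if area_min <= area <= area_max:   # area screening
                    boxes.append((cmin, rmin, cmax - cmin + 1,
                                  rmax - rmin + 1))
    return boxes
```

In practice a library routine (e.g. an 8-connectivity connected-components function from an image-processing package) would replace this loop; the sketch only makes the labelling explicit.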
Step 6: The positions corresponding to the target candidate areas of step 5 are marked with square boxes at the corresponding positions in the original image. So that the subsequent steps can scale the image proportionally, the larger of the length and width is selected as the side of the square according to the position coordinates when marking. The areas marked by the square boxes (containing both targets and non-targets) are the target candidate areas.
Step 7: The candidate area blocks (sample blocks) obtained in step 6 are rescaled to slices of the same size, used as a training set, and fed as the input of the PCANet model. The model mainly comprises three parts: (a) PCA filters; (b) binarized hash coding; (c) block histograms forming the feature vector.
(I) The filters of the first stage can be represented as:
W_f^1 = e_f(X X^T), f = 1, 2, …, L_1 (6)
where e_f denotes extracting the eigenvector corresponding to the f-th largest eigenvalue of the matrix; X is the matrix whose columns are the sample vectors obtained by preprocessing each slice; W_f^1 represents a filter of the first stage; N is the number of training samples; and f is the filter index. In the first stage, the eigenvectors corresponding to the first L_1 largest eigenvalues of X X^T are obtained and used as the filters of the first stage.
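Learning the first-stage filters amounts to an eigendecomposition of the patch scatter matrix. A sketch under the assumption that X holds one mean-removed k × k patch per column (names are illustrative):

```python
import numpy as np

def pca_filters(X, num_filters, k):
    """Learn first-stage PCA filters per eq. (6): the eigenvectors of
    X X^T belonging to the num_filters largest eigenvalues, each
    reshaped into a k x k filter kernel."""
    cov = X @ X.T
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:num_filters]]
    return [top[:, i].reshape(k, k) for i in range(num_filters)]
```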
Similarly, the second stage employs an analogous strategy: at each stage the filters are eigenvectors of the covariance matrix of all sample images, the input of the second stage being the output of the first, and the second-layer PCA filters are likewise constructed from eigenvectors of the corresponding covariance matrix. Since the first layer has L_1 filter kernels, it generates L_1 output matrices; in the second layer, each feature matrix output by the first layer produces L_2 feature outputs. Finally, for each sample, the two-stage PCANet generates L_1 × L_2 output feature matrices.
W_f^2 = e_f(Y Y^T), f = 1, 2, …, L_2 (7)
where e_f denotes extracting the eigenvector corresponding to the f-th largest eigenvalue of the matrix, and Y_i denotes the convolution result of the N pictures with one filter. Similarly, by solving for the eigenvectors of Y Y^T, the eigenvectors corresponding to the first L_2 largest eigenvalues are used as the filters W_f^2 of the second stage; Y denotes the block-sample form of the second-stage input data and Y^T the transpose of Y; L_2 denotes the number of filters in the second stage.
(II) After the two stages, binarized hash coding is carried out and the result normalized to obtain the final feature vector:
T_i^l = Σ_{k=1}^{L_2} 2^(k−1) · H(P_i^{l,k}) (8)
where H is the unit step function; k = 1, 2, …, L_2 indexes the filters of the second stage; P_i^{l,k} denotes the principal component (PCA) map output of the first layer's l-th channel filtered by the k-th second-stage filter; and T_i^l denotes the result of the hash-coding operation.
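The binarized hashing of equation (8) can be sketched as follows: the L_2 binarized output maps of one channel are fused into a single integer map with values in [0, 2^L_2 − 1]. The 0-indexed shift below is equivalent to the 2^(k−1) weights of the formula; the function name is our own:

```python
import numpy as np

def binary_hash(outputs):
    """Fuse the L2 second-stage output maps of one first-stage channel
    into one integer map: T = sum_k 2^(k-1) * H(O_k), H the unit step."""
    T = np.zeros(outputs[0].shape, dtype=np.int64)
    for k, O in enumerate(outputs):          # k is 0-indexed here
        T += (O > 0).astype(np.int64) << k   # weight 2^k == 2^(k'-1)
    return T
```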
(III) Each output matrix of the previous stage is divided into B blocks; the histogram information of each block is computed and counted, and the per-block histogram features are then concatenated to obtain the block-expansion histogram feature:
λ_i = [Bhist(T_i^1), Bhist(T_i^2), …, Bhist(T_i^{L_1})]^T (9)
where λ_i belongs to the space R^((2^{L_2}) · L_1 · B); B is the number of divided blocks; Bhist denotes the histogram statistics function; and L_1, L_2 are reflected in that, for the first stage, each output matrix T_i^l (l = 1, 2, …, L_1) is divided into blocks and its histogram matrix counted, the value range of the histogram being [0, 2^{L_2} − 1].
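The block-expansion histogram of equation (9) can be sketched as follows. Non-overlapping blocks are assumed (the patent does not state whether blocks overlap), and `num_bins` plays the role of 2^L_2:

```python
import numpy as np

def block_histogram_feature(T_maps, num_bins, block_shape):
    """Split every hashed map into non-overlapping blocks, histogram each
    block over [0, num_bins) (num_bins = 2^L2), and concatenate all the
    block histograms into one feature vector, as in eq. (9)."""
    feats = []
    bh, bw = block_shape
    for T in T_maps:
        H, W = T.shape
        for r in range(0, H - bh + 1, bh):
            for c in range(0, W - bw + 1, bw):
                block = T[r:r + bh, c:c + bw]
                hist, _ = np.histogram(block, bins=num_bins,
                                       range=(0, num_bins))
                feats.append(hist)
    return np.concatenate(feats)
```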
Step 8: A non-maximum suppression strategy is employed to eliminate overlapping frames on the same ship target. If the intersection area of overlapping boxes is greater than a certain threshold, the boxes are merged into one valid box. The merging strategy is:
S = { S_k, if A_IOU < N_t; 0, if A_IOU ≥ N_t } (10)
where S_k denotes the target candidate box corresponding to the classification score of the k-th candidate box; A_IOU denotes the intersection area of the k-th and (k+1)-th candidate boxes; N_t denotes the threshold; and S denotes the target candidate box retained after eliminating overlapping boxes according to the non-maximum suppression strategy.
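A standard greedy non-maximum suppression in the spirit of equation (10); intersection-over-union is used for A_IOU, and the (x, y, w, h) box format is assumed to match step 5:

```python
def nms(boxes, scores, iou_threshold):
    """Greedy non-maximum suppression: keep the highest-scoring box and
    drop any remaining box whose IoU with it exceeds the threshold.
    Boxes are (x, y, w, h) tuples; returns indices of kept boxes."""
    def iou(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
        iy = max(0, min(ay + ah, by + bh) - max(ay, by))
        inter = ix * iy
        union = aw * ah + bw * bh - inter
        return inter / union if union else 0.0
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order
                 if iou(boxes[best], boxes[i]) <= iou_threshold]
    return keep
```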
Step 9: In order to better evaluate the effectiveness of the scheme on remote sensing images, the method adopts an autocorrelation function to reflect the texture roughness of the ship target under different sea conditions, thereby measuring the detection performance of the scheme under different complex sea surface backgrounds.
The autocorrelation function is defined as follows:
C(ε, η, p, q) = [ Σ_{u=p−w}^{p+w} Σ_{v=q−w}^{q+w} f(u, v) · f(u+ε, v+η) ] / [ Σ_{u=p−w}^{p+w} Σ_{v=q−w}^{q+w} f²(u, v) ] (11)
This formula computes the correlation between the pixels inside each (2w+1) × (2w+1) window of the image f and the pixels offset by (ε, η), where ε, η = 0, ±1, ±2, …, ±T and T denotes the maximum offset. (u, v) denotes a pixel position in the image f; f(u, v) denotes the image pixel value at that position; (p, q) denotes the position of the window centre; and C(ε, η, p, q) denotes the autocorrelation at (p, q) for a given offset (ε, η).
Experiments with the method were carried out on a PC platform, using remote sensing images with a resolution of 0.5 m; the image sample size is 18192 × 18000 pixels. FIG. 2 shows the binary image obtained by applying the anomaly detection algorithm based on multivariate Gaussian distribution to an optical image. FIG. 3 shows the final results after false alarm rejection by PCANet under different sea state backgrounds. As the result graphs show, the method extracts the target candidate areas comprehensively and effectively, and remains strongly robust under more complex sea state backgrounds (such as cloud, fog and heavy wave interference). The first column of FIG. 2 shows the original images, and the second column shows the results of the multivariate-Gaussian anomaly detection adopted in this embodiment. The red ovals in the first and third rows mark ship targets. The first, second and third rows of FIG. 2 respectively show the results of anomaly detection under cloud layer interference, under heavy sea surface waves, and with a ship target whose contrast is relatively close to the sea background. In FIG. 3, (a), (b), (c) and (d) respectively show the final ship detection results of the method for complex sea surface scenes with cloud interference, with the contrast between ship target and ocean background relatively close, with heavy sea surface waves, and near a port. The red rectangular boxes enclose the ship targets. In FIG. 3(b), a red oval circles a low-contrast ship target, a white circle marks the enlarged portion of that target, and a black rectangle encloses the ship target after enlargement.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet, characterized by comprising the following steps:
(1) an original image of M × N pixels is evenly divided into image blocks of P × P pixels, so the original image is divided into (M × N)/(P × P) = L blocks; the pixels of each P × P image are rearranged through a sliding window of K × K pixels with step size S, i.e. each pixel is replaced by the (K × K) × 1 vector of its sliding window;
(2) through the above step (1), an image block of P × P pixels finally forms an image block I_i (i = 1, 2, 3, …, L) of ((P−K)/S + 1) × ((P−K)/S + 1) pixels, each pixel in the image block corresponding to the centre of its respective sliding window in the original image;
(3) each row of the image block I_i is taken as one sample x_i; suppose each sample x_i obeys a Gaussian distribution, and use the samples to estimate the overall sample block parameters, the mean μ and the covariance Σ, in order to model the feature vectors;
(4) anomaly detection is carried out on the image block I_i with the multivariate Gaussian distribution; the anomaly score is M(x) = (x − μ)^T Σ^(−1) (x − μ), where μ denotes the sample mean and Σ^(−1) denotes the inverse of the covariance matrix; M(x) is compared with a threshold as the condition for judging abnormality;
(5) mapping pixel points larger than the threshold value of M (x) in the step (4) to original image positions, setting pixel points corresponding to the original image positions as 1 as abnormal points, and setting pixel points smaller than the threshold value as 0 as non-abnormal areas, so as to obtain binary images extracted from the original images through abnormal detection;
(6) then eliminating isolated holes by using a morphological method, and carrying out 8-way communication operation on the communication domain; the position of the maximum external rectangle of each connected domain area can be obtained, and the position is a four-dimensional vector; screening out target candidate slices with the connected domain area size in a certain threshold interval according to the threshold of the connected domain area;
(7) scaling the target candidate region slice obtained in the step (6) to a slice with the same size as a test set, and classifying the candidate slice by taking the test set as the input of a PCANet model;
(8) extracting target index numbers corresponding to the ship targets and the non-target labels in the classification results according to the classification results obtained in the step (7), and performing frame marking on the ship targets according to the target positions in the step (6);
(9) adopting a non-maximum inhibition strategy to eliminate overlapped frames on the same ship; if the intersection area of the overlapped boxes is larger than a certain threshold value, combining the boxes into a valid box; and finally, splicing the image blocks after being partitioned into original images, wherein the detection result after each small block is combined is the detection result of the original large image.
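Steps (1)-(5) amount to sliding-window pixel rearrangement followed by Mahalanobis-distance anomaly detection. A minimal NumPy sketch (the function names, the covariance regularization term, the 95th-percentile threshold, and the toy image are illustrative choices, not taken from the claims):

```python
import numpy as np

def rearrange_patches(block, K, S):
    """Slide a K x K window with step S over a P x P block;
    each window becomes one (K*K)-dimensional row sample."""
    P = block.shape[0]
    n = (P - K) // S + 1
    samples = np.empty((n * n, K * K))
    for r in range(n):
        for c in range(n):
            samples[r * n + c] = block[r*S:r*S+K, c*S:c*S+K].ravel()
    return samples

def mahalanobis_scores(samples):
    """M(x) = (x - mu)^T Sigma^{-1} (x - mu) for every row sample."""
    mu = samples.mean(axis=0)
    d = samples - mu
    sigma = d.T @ d / samples.shape[0]
    sigma += 1e-6 * np.eye(sigma.shape[0])  # regularize so Sigma is invertible
    return np.einsum('ij,jk,ik->i', d, np.linalg.inv(sigma), d)

# Toy 16 x 16 "sea" block with one bright ship-like region
block = np.random.default_rng(0).normal(100.0, 2.0, (16, 16))
block[8:10, 8:10] = 200.0
scores = mahalanobis_scores(rearrange_patches(block, K=3, S=1))
binary = (scores > np.percentile(scores, 95)).astype(np.uint8)  # step (5) mask
```

The binary mask here is indexed by window centers; mapping back to original image positions, hole removal and 8-connected component labeling (step (6)) would follow, e.g. with scipy.ndimage.label.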
2. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 1, wherein in step (2), when each pixel is replaced by the K × K × 1 vector of its sliding window, rearranging the pixels of the original image forms a sample set of the size given by the following formula:

[((P-K)/S)+1]^2 × (K × K)    (1)

where S is the step size of the sliding window, in pixels.
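Formula (1) simply counts sliding-window positions: ((P-K)/S)+1 positions per axis, each contributing one (K × K)-dimensional sample. A quick check (the helper name and example values are illustrative):

```python
def sample_set_shape(P, K, S):
    """Rows x columns of the rearranged sample set from formula (1):
    [((P-K)/S)+1]^2 samples, each of dimension K*K."""
    n = (P - K) // S + 1
    return (n * n, K * K)

# A 32 x 32 block with a 5 x 5 window and step 1 yields 784 samples of dim 25
shape = sample_set_shape(32, 5, 1)
```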
3. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 2, wherein in step (4), when extracting the target candidate region, the multivariate Gaussian distribution formulas are as follows:

p(x) = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)    (2)

\mu = \frac{1}{m}\sum_{i=1}^{m} x^{(i)}    (3)

\Sigma = \frac{1}{m}\sum_{i=1}^{m} \left(x^{(i)}-\mu\right)\left(x^{(i)}-\mu\right)^T    (4)

where μ is the sample feature mean, Σ is the covariance obtained by subtracting the mean μ from each sample, squaring and summing, x is a sample after pixel rearrangement, i is the sample index, m is the total number of samples, n is the dimension of each sample, and T denotes matrix transpose.
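Under these definitions, parameter estimation and density evaluation can be sketched as follows (the helper names and the synthetic data are illustrative):

```python
import numpy as np

def fit_gaussian(X):
    """Estimate mean mu and covariance Sigma from the m x n sample
    matrix X (one sample per row), as in the claim's formulas."""
    mu = X.mean(axis=0)
    d = X - mu
    sigma = d.T @ d / X.shape[0]
    return mu, sigma

def gaussian_pdf(x, mu, sigma):
    """Multivariate Gaussian density p(x)."""
    n = mu.size
    d = x - mu
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(sigma))
    return float(np.exp(-0.5 * d @ np.linalg.inv(sigma) @ d) / norm)

X = np.random.default_rng(1).normal(0.0, 1.0, (5000, 2))  # 2-D standard normal
mu, sigma = fit_gaussian(X)
p_center = gaussian_pdf(mu, mu, sigma)  # close to 1/(2*pi) for this data
```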
4. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 3, wherein in step (7), the filters of the first stage of the PCANet model are expressed by the following formulas:

\bar{I} = [\bar{I}_1, \bar{I}_2, \ldots, \bar{I}_N]    (5)

W_f^1 = e_f\left(\bar{I}\,\bar{I}^T\right), \quad f = 1, 2, \ldots, L_1    (6)

where e_f denotes extracting the eigenvector corresponding to the f-th largest eigenvalue of the matrix; \bar{I}_i is the sample vector obtained by preprocessing each slice; W_f^1 denotes a filter of the first stage; N is the number of training samples, f indexes the filters of the first stage, I is the slice image scaled in step (7), \bar{I}^T denotes the transpose of \bar{I}, and L_1 denotes the total number of filters in the first stage.
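On this reading, the first-stage filters are the leading eigenvectors of the patch scatter matrix. A sketch (names are illustrative; patch extraction from the slices and reshaping back to square kernels are omitted):

```python
import numpy as np

def pca_filters(patches, num_filters):
    """PCANet-style filters: eigenvectors of X X^T belonging to the
    num_filters largest eigenvalues, one filter (row) per eigenvector.
    patches: (dim, num_samples) matrix of mean-removed patch vectors."""
    vals, vecs = np.linalg.eigh(patches @ patches.T)  # ascending eigenvalues
    order = np.argsort(vals)[::-1][:num_filters]
    return vecs[:, order].T                           # (num_filters, dim)

rng = np.random.default_rng(2)
X = rng.normal(size=(9, 500))        # e.g. 3 x 3 patches, 500 samples
X -= X.mean(axis=0, keepdims=True)   # remove the mean of each patch column
W1 = pca_filters(X, num_filters=4)   # L1 = 4 filters of dimension 9
```

Because the eigenvectors of a symmetric matrix are orthonormal, the resulting filter bank is orthonormal as well.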
5. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 4, wherein in step (7), the filters used in the second stage are expressed by the following formulas:

Y = [\bar{Y}_1, \bar{Y}_2, \ldots, \bar{Y}_{N L_1}]    (7)

W_f^2 = e_f\left(Y\,Y^T\right), \quad f = 1, 2, \ldots, L_2    (8)

where e_f denotes extracting the eigenvector corresponding to the f-th largest eigenvalue of the matrix; \bar{Y}_i denotes the convolution result of the N pictures with a first-stage filter; the eigenvectors corresponding to the L_2 largest eigenvalues are taken as the second-stage filters W_f^2; Y denotes the block-sample form of the second-stage input data, Y^T denotes the transpose of Y; L_2 denotes the number of filters in the second stage.
6. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 5, wherein in step (7), the binarized hash code can be expressed by the following formula:

T_i^j = \sum_{\ell=1}^{L_2} 2^{\ell-1}\, H\left(O_i^j * W_\ell^2\right)    (9)

where H denotes the unit step function; ℓ = 1, 2, ..., L_2 indexes the filters of the second stage; O_i^j denotes the j-th principal-component output (PCA mapping) of the first layer; T_i^j denotes the result of the hash-coding operation.
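The hash binarizes each second-stage response and packs the L_2 bits into one integer per pixel. A sketch (the response maps are illustrative):

```python
import numpy as np

def binary_hash(responses):
    """T = sum_l 2^(l-1) * H(response_l): threshold each response map at 0
    (unit step H) and pack the resulting bits into one integer code map."""
    code = np.zeros(responses[0].shape, dtype=np.int64)
    for l, r in enumerate(responses):
        code += (r > 0).astype(np.int64) << l  # weight 2^l for filter l+1
    return code

# Two 2 x 2 response maps -> per-pixel codes in [0, 2^2 - 1]
r1 = np.array([[1.0, -1.0], [0.5, -0.2]])
r2 = np.array([[-1.0, 1.0], [0.3, -0.4]])
codes = binary_hash([r1, r2])  # [[1, 2], [3, 0]]
```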
7. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 6, wherein in step (7), the block histograms form the feature vector, and the block-expansion histogram feature is obtained by the following formula:

\lambda_i = \left[\mathrm{Bhist}(T_i^1), \mathrm{Bhist}(T_i^2), \ldots, \mathrm{Bhist}(T_i^{L_1})\right]^T    (10)

where \lambda_i belongs to the space \mathbb{R}^{(2^{L_2}) L_1 B}; B is the number of divided blocks, and Bhist denotes the histogram statistic; for each of the L_1 output matrices T_i^j of the first stage, blocks are divided and the histogram matrix is counted, the value range of the histogram being [0, 2^{L_2} - 1].
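The block-expansion histogram can be sketched as tiling the code map and concatenating per-tile histograms over the range [0, 2^{L_2} - 1] (names and sizes are illustrative):

```python
import numpy as np

def block_histogram(code_map, block, n_bins):
    """Split an integer code map into non-overlapping block x block tiles
    and concatenate each tile's histogram over the range [0, n_bins - 1]."""
    h, w = code_map.shape
    feats = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            tile = code_map[r:r + block, c:c + block]
            feats.append(np.bincount(tile.ravel(), minlength=n_bins))
    return np.concatenate(feats)

codes = np.array([[0, 1, 2, 3],
                  [3, 2, 1, 0],
                  [0, 0, 1, 1],
                  [2, 2, 3, 3]])
feat = block_histogram(codes, block=2, n_bins=4)  # 4 tiles x 4 bins = 16 dims
```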
8. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 1 or 7, wherein the PCANet model in step (7) mainly comprises three parts: (a) PCA filters; (b) binary hash coding; (c) block histograms forming the feature vector.
9. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 1, wherein in step (9), the non-maximum suppression strategy eliminates overlapping boxes by the following formula:

S_{k+1} = \begin{cases} S_{k+1}, & A_{IOU}(S_k, S_{k+1}) < N_t \\ 0, & A_{IOU}(S_k, S_{k+1}) \geq N_t \end{cases}    (11)

where S_k denotes the target candidate box corresponding to the classification accuracy of the k-th candidate box, and S_{k+1} denotes the target candidate box corresponding to the classification accuracy of the (k+1)-th candidate box; A_{IOU} denotes the intersection area of the k-th and (k+1)-th candidate boxes, N_t denotes the threshold, and S denotes the set of target candidate boxes remaining after overlapping boxes are eliminated according to the non-maximum suppression strategy.
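Greedy non-maximum suppression of this kind can be sketched as follows (the box format and threshold value are illustrative):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union

def nms(boxes, scores, nt):
    """Visit boxes in descending score order; drop any box whose overlap
    with an already-kept box reaches the threshold N_t."""
    keep = []
    for k in np.argsort(scores)[::-1]:
        if all(iou(boxes[k], boxes[j]) < nt for j in keep):
            keep.append(int(k))
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
kept = nms(boxes, scores, nt=0.5)  # the two overlapping boxes collapse to one
```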
10. The visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet according to claim 1, wherein the following autocorrelation function formula is adopted to reflect the texture roughness of ship targets under different sea conditions:

C(\varepsilon, \eta, p, q) = \frac{\sum_{u=p-w}^{p+w} \sum_{v=q-w}^{q+w} f(u, v)\, f(u + \varepsilon, v + \eta)}{\sum_{u=p-w}^{p+w} \sum_{v=q-w}^{q+w} f^2(u, v)}    (12)

The formula computes, for each pixel point (p, q) within a (2w+1) × (2w+1) window of the image f(u, v), the correlation value with the pixels offset by (ε, η), where ε, η = 0, ±1, ±2, ..., ±T; T denotes the maximum offset; (u, v) denotes a pixel position in the image f; f(u, v) denotes the image pixel value at that position; (p, q) denotes a pixel point position within the (2w+1) × (2w+1) window; and C(ε, η, p, q) denotes the autocorrelation function of pixel point (p, q) at the given offset (ε, η).
CN202010020806.2A 2020-01-09 2020-01-09 Visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet Pending CN111222470A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010020806.2A CN111222470A (en) 2020-01-09 2020-01-09 Visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet


Publications (1)

Publication Number Publication Date
CN111222470A true CN111222470A (en) 2020-06-02

Family

ID=70809736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010020806.2A Pending CN111222470A (en) 2020-01-09 2020-01-09 Visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet

Country Status (1)

Country Link
CN (1) CN111222470A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778495A (en) * 2016-11-21 2017-05-31 北京航天宏图信息技术股份有限公司 Ship Detection in remote sensing image under complicated sea background
CN108734111A (en) * 2018-04-26 2018-11-02 西南电子技术研究所(中国电子科技集团公司第十研究所) SAR image surface vessel recognition methods
CN109427055A (en) * 2017-09-04 2019-03-05 长春长光精密仪器集团有限公司 The remote sensing images surface vessel detection method of view-based access control model attention mechanism and comentropy
CN110570450A (en) * 2019-09-18 2019-12-13 哈尔滨工业大学 Target tracking method based on cascade context-aware framework


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NAN WANG et al.: "Automatic Ship Detection in Optical Remote Sensing Images Based on Anomaly Detection and SPP-PCANet", Remote Sensing *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112370078A (en) * 2020-11-10 2021-02-19 安徽理工大学 Image detection method based on ultrasonic imaging and Bayesian optimization
CN112370078B (en) * 2020-11-10 2024-01-26 安徽理工大学 Image detection method based on ultrasonic imaging and Bayesian optimization
CN112580431A (en) * 2020-11-20 2021-03-30 北京航空航天大学 High-bandwidth remote sensing image target extraction method suitable for on-satellite on-orbit processing
CN112580431B (en) * 2020-11-20 2022-06-24 北京航空航天大学 High-bandwidth remote sensing image target extraction method suitable for on-satellite on-orbit processing

Similar Documents

Publication Publication Date Title
Chen et al. Ship detection from coastal surveillance videos via an ensemble Canny-Gaussian-morphology framework
Qi et al. A robust directional saliency-based method for infrared small-target detection under various complex backgrounds
Chitradevi et al. An overview on image processing techniques
Nasiri et al. Infrared small target enhancement based on variance difference
CN110031843B (en) ROI (region of interest) -based SAR (synthetic Aperture Radar) image target positioning method, system and device
CN109427055B (en) Remote sensing image sea surface ship detection method based on visual attention mechanism and information entropy
CN111079596A (en) System and method for identifying typical marine artificial target of high-resolution remote sensing image
CN105512622A (en) Visible remote-sensing image sea-land segmentation method based on image segmentation and supervised learning
Corbane et al. Fully automated procedure for ship detection using optical satellite imagery
Soundrapandiyan et al. An approach to adaptive pedestrian detection and classification in infrared images based on human visual mechanism and support vector machine
CN113674308A (en) SAR image ship target rapid detection method based on image enhancement and multiple detection
CN114764801A (en) Weak and small ship target fusion detection method and device based on multi-vision significant features
CN111222470A (en) Visible light remote sensing image ship detection method based on multivariate Gaussian distribution and PCANet
Soundrapandiyan et al. Adaptive pedestrian detection in infrared images using fuzzy enhancement and top-hat transform
Chen et al. Attention-based hierarchical fusion of visible and infrared images
Aghababaee et al. Contextual PolSAR image classification using fractal dimension and support vector machines
Selvi et al. A novel approach for ship recognition using shape and texture
Li et al. Detection of oil spills based on gray level co-occurrence matrix and support vector machine
Cai et al. Man-made object detection based on texture clustering and geometric structure feature extracting
Yu et al. Visual saliency using binary spectrum of Walsh–Hadamard transform and its applications to ship detection in multispectral imagery
Kekre et al. SAR Image Segmentation using co-occurrence matrix and slope magnitude
Lee et al. Infrared small target detection algorithm using an augmented intensity and density-based clustering
Li et al. Infrared Small Target Detection Based on Gradient-Intensity Joint Saliency Measure
Yu et al. Visual Saliency via Multiscale Analysis in Frequency Domain and Its Applications to Ship Detection in Optical Satellite Images
Sandirasegaram Spot SAR ATR using wavelet features and neural network classifier

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200602