CN116843906A - Target multi-angle intrinsic feature mining method based on Laplace feature mapping - Google Patents

Target multi-angle intrinsic feature mining method based on Laplace feature mapping

Info

Publication number
CN116843906A
Authority
CN
China
Prior art keywords: feature, angle, dimensional, target, matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310692714.2A
Other languages
Chinese (zh)
Inventor
涂尚坦
范季夏
陈占胜
黄金生
胡广清
薛伶玲
艾韶杰
徐莹
姜岩
李科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Institute of Satellite Engineering
Original Assignee
Shanghai Institute of Satellite Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Institute of Satellite Engineering filed Critical Shanghai Institute of Satellite Engineering
Priority to CN202310692714.2A priority Critical patent/CN116843906A/en
Publication of CN116843906A publication Critical patent/CN116843906A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/54 Extraction of image or video features relating to texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Pure & Applied Mathematics (AREA)
  • Software Systems (AREA)
  • Mathematical Optimization (AREA)
  • Computing Systems (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Algebra (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a target multi-angle intrinsic feature mining method and system based on Laplace feature mapping, comprising the following steps: extracting multi-angle SAR image features of a target; constructing the target's original high-dimensional observation feature space from the multi-angle SAR image features; applying Laplace feature mapping to the original high-dimensional observation feature space to obtain the mapped low-dimensional intrinsic feature space; and obtaining the distribution relation between the target angle and the manifold surface from the low-dimensional intrinsic feature space. Oriented to multi-angle SAR image samples of a target, the method uses Laplace feature mapping to find the low-dimensional manifold surface embedded in the high-dimensional observation feature space and verifies that the way the target varies with angle is consistent with the structure of the manifold surface, laying a foundation for subsequent feature characterization and feature matching of multi-angle targets.

Description

Target multi-angle intrinsic feature mining method based on Laplace feature mapping
Technical Field
The application relates to the technical field of intrinsic feature mining, in particular to a target multi-angle intrinsic feature mining method and system based on Laplace feature mapping.
Background
Synthetic Aperture Radar (SAR) is widely used in fields such as flood monitoring, ocean monitoring, agricultural census, and topographic mapping, thanks to its ability to observe the Earth at high resolution around the clock. Multi-angle SAR acquires richer angular information about targets, so the interpretation advantages of multi-angle SAR images are even more pronounced.
In multi-angle SAR images, a target's backscattering of electromagnetic waves differs greatly between observation angles, and the image appearance of the same target likewise differs greatly from angle to angle, which makes SAR image interpretation difficult. The application therefore searches for consistency between angle and feature regularities in multi-angle SAR image samples of typical targets acquired at different observation angles, in order to find a feature description that effectively characterizes how a target changes with angle, laying a foundation for subsequent feature characterization and feature matching of multi-angle targets.
The paper in Journal of Radars, Vol. 10, No. 6, 2021, proposes a multi-angle SAR image target recognition method based on EfficientNet and BiGRU: it builds a recognition model from the two networks, trains it with an island loss, and runs multi-angle SAR target recognition experiments on the same MSTAR data set used by the present application. The target features used for recognition in that paper are produced by an EfficientNet that extracts single-image features and a BiGRU that further extracts temporal features from the multi-angle SAR image sequence, which differs from the gray, shape, and texture features adopted here; and its training regime of island loss plus cross-entropy loss for a recognition task differs from the manifold learning approach here, which uses Laplace feature mapping to find low-dimensional intrinsic features within a high-dimensional feature space. By comparison, the present method can uncover how the intrinsic feature space varies with angular distribution across targets at different angles, which is more useful for classifying and identifying targets according to that regularity.
The paper on multi-azimuth-angle observation spaceborne SAR technology in Journal of Radars, Vol. 9, No. 2, April 2020, surveys this novel class of spaceborne SAR, covering imaging processing algorithms, image radiometric quality improvement, sidelobe suppression, and other application directions. The present application instead mines features and angle-variation regularities of image targets after the multi-angle SAR images have been acquired; the two belong to different but complementary research directions.
The paper on multi-angle SAR moving target detection and its experimental verification with Gaofen-3, in Journal of Radars, Vol. 9, No. 2, April 2020, studies moving target detection based on two advantages of multi-angle SAR, namely long observation time over a scene and a large synthetic aperture angle, and verifies multi-angle moving target detection with a proposed logarithmic background differencing detection algorithm in the Gaofen-3 staring spotlight mode. However, the image targets addressed in that work are moving targets; mining of stationary targets is lacking.
The paper on UAV classification and recognition based on fused micro-motion features from two radars, in Journal of Radars, Vol. 7, No. 5, October 2018, observes targets from different angles with multiple radars simultaneously; applies a short-time Fourier transform to the radar data acquired by each to obtain time-frequency spectrograms; extracts features from the spectrograms with principal component analysis and fuses the features obtained by the two differently-angled radar sensors; and finally trains a support vector machine for classification and recognition. However, the feature extraction object in that paper is not a SAR image but the time-frequency spectrum of the acquired radar data, and moreover the principal component analysis it adopts assumes the embedded manifold surface is linear, an assumption unsuitable for natural images.
Patent document CN114519778A discloses a target three-dimensional reconstruction method for multi-angle SAR data, comprising: acquiring a sub-aperture image sequence of the multi-angle SAR; dividing each sub-aperture image into a first target area and a first background area and marking them respectively to obtain a first mask image; determining a shadow protection area in the first background area according to the first mask image and marking it to obtain a second mask image; creating an initial 3D voxel grid corresponding to the sub-aperture image sequence; removing non-target voxels in the initial 3D voxel grid according to the second mask image and generating the corresponding 3D point cloud model; and carrying out three-dimensional reconstruction of the target from the 3D point cloud model. However, CN114519778A is concerned with extracting scene elevation information from a multi-angle SAR sub-aperture image sequence, and the information it extracts is of limited use for classifying and identifying multi-angle targets.
Patent document CN106897985B discloses a multi-angle SAR image fusion method based on visibility classification, which computes the multi-view visibility LI_non_overlap of non-overlapping-mask pixels of each azimuth-view image from the binary segmentation of image pixels into overlapping and non-overlapping masks, and automatically achieves fine classification of image pixels through visibility indexes. However, CN106897985B is aimed at image fusion, and its processing objects are image pixels rather than the image features used in the present application to construct the original high-dimensional observation space.
Disclosure of Invention
Aiming at the defects in the prior art, the application aims to provide a target multi-angle intrinsic feature mining method and system based on Laplace feature mapping.
The target multi-angle intrinsic feature mining method based on Laplace feature mapping provided by the application comprises the following steps:
step S1: extracting target multi-angle SAR image characteristics;
step S2: constructing a target original high-dimensional observation feature space according to the multi-angle SAR image features;
step S3: carrying out Laplace feature mapping on the original high-dimensional observation feature space to obtain a mapped low-dimensional intrinsic feature space;
step S4: obtaining the distribution relation between the target angle and the manifold surface through the low-dimensional intrinsic feature space.
Preferably, the SAR image features include: basic pixel characteristics, gray characteristics, shape characteristics and texture characteristics;
and the SAR image features are acquired from the filtered image under a set scale.
Preferably, step S2 includes:
step S2.1: performing max-min normalization on the multi-angle SAR image features, with the calculation formula:
F(i) = (f(i) - f_min) / (f_max - f_min)
where f(i) is the i-th feature value, f_min is the minimum of all feature values, f_max is the maximum of all feature values, and F(i) is the feature value normalized to the range [0, 1];
step S2.2: and (3) connecting all the processed features in series to obtain an original high-dimensional observation space feature matrix, namely HighDimF (N, M), wherein N is the number of samples, and M is the feature dimension after the series connection.
Preferably, step S3 includes:
step S3.1: taking the original high-dimensional observation space feature matrix as a data set X in the original high-dimensional observation space, and calculating the neighborhood of each data point so as to construct an adjacency graph;
step S3.2: weighting the adjacency graph to obtain the weight matrix W, given by:
W_ij = e^(-||x_i - x_j||^2 / t), if x_j ∈ J_i; W_ij = 0 otherwise
where W_ij is the weight assigned to the edge connecting each point with its neighboring points in the adjacency graph constructed in the original high-dimensional observation space; e is the natural base; t is the heat-kernel coefficient; x_i is the i-th data point in the data set in the original high-dimensional observation space; x_j is the j-th data point; and J_i is the set of neighbors of sample point x_i in the original high-dimensional observation space;
step S3.3: computing the degree matrix D from the weight matrix, with the calculation formula:
D_ii = Σ_j W_ij
where D_ii denotes the degree matrix of the weighted adjacency graph; the degree matrix is diagonal, and each diagonal entry is the sum of the weights of the edges between that sample point and its neighboring points;
step S3.4: obtaining a Laplacian matrix L through the difference between the degree matrix and the weight matrix, wherein the calculation formula is as follows:
L=D-W
step S3.5: performing generalized eigenvalue decomposition of the Laplacian matrix L, with the calculation formula:
Ly = λDy
taking the eigenvectors corresponding to the 2nd through (d+1)-th smallest eigenvalues λ_2, …, λ_{d+1}, i.e. U = [u_2, …, u_{d+1}], which yields the low-dimensional embedding y_i = (u_2(i), …, u_{d+1}(i)), where λ denotes an eigenvalue, y an eigenvector, d the number of leading dimensions retained after sorting the eigenvalues in ascending order, and y_i the low-dimensional representation of the i-th sample point;
and obtaining the data set Y in the low-dimensional intrinsic feature space as the low-dimensional intrinsic feature matrix LowDimF(N, d), where Y = {y_i, i = 1, …, N}.
Preferably, the distribution relation between the target angle and the manifold surface is obtained by forming a low-dimensional manifold surface from three intrinsic dimensions of the obtained low-dimensional intrinsic feature space, building an index of the multi-angle target samples, and searching for the regularity relating the target angle to the manifold surface;
and the search for the regularity between target angle and manifold surface is carried out by synchronously and dynamically playing the images alongside scatter plots of the low-dimensional intrinsic feature space.
According to the application, the target multi-angle intrinsic feature mining system based on Laplace feature mapping comprises:
module M1: extracting target multi-angle SAR image characteristics;
module M2: constructing a target original high-dimensional observation feature space according to the multi-angle SAR image features;
module M3: carrying out Laplace feature mapping on the original high-dimensional observation feature space to obtain a mapped low-dimensional intrinsic feature space;
module M4: obtaining the distribution relation between the target angle and the manifold surface through the low-dimensional intrinsic feature space.
Preferably, the SAR image features include: basic pixel characteristics, gray characteristics, shape characteristics and texture characteristics;
and the SAR image features are acquired from the filtered image under a set scale.
Preferably, the module M2 comprises:
module M2.1: performing max-min normalization on the multi-angle SAR image features, with the calculation formula:
F(i) = (f(i) - f_min) / (f_max - f_min)
where f(i) is the i-th feature value, f_min is the minimum of all feature values, f_max is the maximum of all feature values, and F(i) is the feature value normalized to the range [0, 1];
module M2.2: concatenating all the processed features to obtain the original high-dimensional observation space feature matrix HighDimF(N, M), where N is the number of samples and M is the feature dimension after concatenation.
Preferably, the module M3 comprises:
module M3.1: taking the original high-dimensional observation space feature matrix as a data set X in the original high-dimensional observation space, and calculating the neighborhood of each data point so as to construct an adjacency graph;
module M3.2: weighting the adjacency graph to obtain the weight matrix W, given by:
W_ij = e^(-||x_i - x_j||^2 / t), if x_j ∈ J_i; W_ij = 0 otherwise
where W_ij is the weight assigned to the edge connecting each point with its neighboring points in the adjacency graph constructed in the original high-dimensional observation space; e is the natural base; t is the heat-kernel coefficient; x_i is the i-th data point in the data set in the original high-dimensional observation space; x_j is the j-th data point; and J_i is the set of neighbors of sample point x_i in the original high-dimensional observation space;
module M3.3: computing the degree matrix D from the weight matrix, with the calculation formula:
D_ii = Σ_j W_ij
where D_ii denotes the degree matrix of the weighted adjacency graph; the degree matrix is diagonal, and each diagonal entry is the sum of the weights of the edges between that sample point and its neighboring points;
module M3.4: obtaining a Laplacian matrix L through the difference between the degree matrix and the weight matrix, wherein the calculation formula is as follows:
L=D-W
module M3.5: performing generalized eigenvalue decomposition of the Laplacian matrix L, with the calculation formula:
Ly = λDy
taking the eigenvectors corresponding to the 2nd through (d+1)-th smallest eigenvalues λ_2, …, λ_{d+1}, i.e. U = [u_2, …, u_{d+1}], which yields the low-dimensional embedding y_i = (u_2(i), …, u_{d+1}(i)), where λ denotes an eigenvalue, y an eigenvector, d the number of leading dimensions retained after sorting the eigenvalues in ascending order, and y_i the low-dimensional representation of the i-th sample point;
and obtaining the data set Y in the low-dimensional intrinsic feature space as the low-dimensional intrinsic feature matrix LowDimF(N, d), where Y = {y_i, i = 1, …, N}.
Preferably, the distribution relation between the target angle and the manifold surface is obtained by forming a low-dimensional manifold surface from three intrinsic dimensions of the obtained low-dimensional intrinsic feature space, building an index of the multi-angle target samples, and searching for the regularity relating the target angle to the manifold surface;
and the search for the regularity between target angle and manifold surface is carried out by synchronously and dynamically playing the images alongside scatter plots of the low-dimensional intrinsic feature space.
Compared with the prior art, the application has the following beneficial effects:
1. Oriented to multi-angle SAR image samples of a target, the method uses Laplace feature mapping to find the low-dimensional manifold surface embedded in the high-dimensional observation feature space, verifies that the way the target varies with angle is consistent with the structure of the manifold surface, and lays a foundation for subsequent feature characterization and feature matching of multi-angle targets.
2. For the acquired multi-angle SAR target image data set, the regularity between angle and the manifold surface in feature space is searched from the viewpoint of image features, and the search means provided by the application can be widely applied to further tasks such as classification and identification of multi-angle targets.
3. The Laplace feature mapping adopted by the application is a nonlinear manifold learning method; it does not need the feature space to satisfy a linearization assumption and is therefore better suited to natural images.
4. The search for the regularity between target angle and manifold surface is carried out by synchronously and dynamically playing the images alongside scatter plots of the low-dimensional intrinsic feature space, so the way targets at different angles travel correspondingly along the low-dimensional manifold surface can be clearly observed.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic flow chart of the present application.
FIG. 2 is a schematic diagram of finding the angular relationship between the low-dimensional manifold surface and the multi-angle target.
Detailed Description
The present application will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present application, but are not intended to limit the application in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present application.
Oriented to multi-angle SAR image samples of a target, the method extracts suitable target features to construct an original high-dimensional observation feature space, uses Laplace feature mapping to find the low-dimensional manifold surface embedded in that space, verifies that the way the target varies with angle is consistent with the structure of the manifold surface, and lays a foundation for subsequent feature characterization and feature matching of multi-angle targets.
Example 1
According to the target multi-angle intrinsic feature mining method based on Laplace feature mapping, as shown in FIG. 1, the method comprises the following steps:
step S1: extracting target multi-angle SAR image characteristics; considering that the high-resolution SAR image target is oriented, the traditional statistical features based on SAR image distribution are not applicable any more, so that more structures, textures and shape features applicable to the high-resolution SAR image target are introduced. In addition, in order to construct a target original high-dimensional observation feature space which is as complete as possible, different SAR image features need to be considered respectively, and the features include: basic pixel features, gray scale features, shape features, texture features. Specifically, a basic pixel characteristic reflecting an image, a gray characteristic reflecting target radiation intensity information, such as a gray histogram, a shape characteristic reflecting target shape characteristics, such as an edge direction histogram EOH, a texture characteristic reflecting characteristics between a target and a background, such as a gray co-occurrence matrix GLCM and a local binary pattern LBP operator.
Considering the influence of the speckle noise specific to SAR images on feature extraction, the image features used to construct the original high-dimensional observation feature space are obtained from the filtered image at a given scale.
Specifically, the extraction modes of different features are further described:
basic pixel characteristics
The multi-angle target image samples used are image blocks of the same pixel size, e.g. 128 x 128; for samples of other sizes, the pixel size can be normalized by downsampling or upsampling. For a normalized image sample, the two-dimensional matrix is flattened into a one-dimensional row vector using a fixed rule (e.g. row by row or column by column); for example, I2D(128, 128) becomes I1D(1, 128 x 128), and the basic pixel feature matrix formed by N multi-angle target image samples can be denoted F_OrigPixel(N, 128 x 128).
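As a minimal sketch of this flattening step (the helper name is illustrative; the patent does not prescribe an implementation), the fixed row-by-row rule can be written as:

```python
import numpy as np

def basic_pixel_features(samples):
    """Flatten 2-D image blocks into row vectors, giving F_OrigPixel(N, H*W).

    `samples` is assumed to be an array of N image blocks that have already
    been normalized to a common pixel size (e.g. 128 x 128)."""
    samples = np.asarray(samples, dtype=np.float64)
    n = samples.shape[0]
    # Row-major flattening: I2D(H, W) -> I1D(1, H*W), same rule for every sample
    return samples.reshape(n, -1)

# Three 4x4 dummy "image blocks" stand in for 128x128 SAR samples
blocks = np.arange(3 * 4 * 4).reshape(3, 4, 4)
feat = basic_pixel_features(blocks)
print(feat.shape)  # (3, 16)
```

For real 128 x 128 samples the result would be F_OrigPixel(N, 16384), as stated above.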
Gray histogram
The gray histogram is an n-bin statistic of the pixel gray values, obtained as:
H(p) = #{ pixels x : p·A/n ≤ x < (p+1)·A/n }, p = 0, …, n-1
where A is the gray-level range coefficient, e.g. A = 255 for an 8-bit image; n is the total number of bins of the histogram; p indexes the quantization bins; and x is the pixel gray value. The gray histogram feature matrix of N multi-angle target image samples can be denoted F_Hist(N, n).
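A small NumPy sketch of this n-bin gray histogram (the helper name and default bin count are illustrative, not from the patent):

```python
import numpy as np

def gray_histogram(image, n_bins=16, gray_max=255):
    """n-bin gray-level histogram over [0, A], A = gray_max.

    Bin p counts pixels x with p*A/n <= x < (p+1)*A/n
    (NumPy's last bin is closed on the right)."""
    image = np.asarray(image, dtype=np.float64)
    edges = np.linspace(0.0, gray_max, n_bins + 1)   # n equal-width bins
    hist, _ = np.histogram(image.ravel(), bins=edges)
    return hist

img = np.array([[0, 64], [128, 255]], dtype=np.uint8)
print(gray_histogram(img, n_bins=4))  # one pixel falls in each quarter of [0, 255]
```

Stacking the histograms of N samples row by row gives the F_Hist(N, n) matrix described above.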
Edge direction histogram EOH
The edge direction histogram (Edge Orientation Histogram, EOH), a statistic over image edges, reflects the shape and edge information of a target well. Each pixel on an edge has an edge gradient direction; an edge can be regarded as being formed by edge pixels in specific directions, so statistics of the gradient directions of edge pixels can characterize the target's shape. First, edge detection with a Canny operator is applied to the image sample to obtain an edge image; after Gaussian filtering, the normal-vector direction angle of each edge pixel is computed as:
θ(i, j) = arctan( g_y(i, j) / g_x(i, j) )
where g(i, j) is the gray level of the Gaussian-filtered image and g_x, g_y are its horizontal and vertical gradients. Taking 45° as a unit, the 360° angle space is quantized into 8 directions, and the number of edge points whose direction angle θ falls into each of the 8 directions is counted, giving the edge direction histogram. The edge direction histogram feature matrix of N multi-angle target image samples can be denoted F_Eoh(N, 8).
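A possible NumPy sketch of the EOH computation, assuming the edge mask comes from a Canny detector run elsewhere (function and variable names are illustrative):

```python
import numpy as np

def edge_orientation_histogram(gray, edge_mask):
    """8-bin edge-direction histogram (F_Eoh) over 45-degree sectors.

    `gray` is the Gaussian-filtered image; `edge_mask` marks edge pixels,
    e.g. the binary output of a Canny detector computed beforehand."""
    gy, gx = np.gradient(np.asarray(gray, dtype=np.float64))  # axis0 = rows = y
    theta = np.degrees(np.arctan2(gy, gx)) % 360.0            # direction in [0, 360)
    bins = (theta[edge_mask.astype(bool)] // 45).astype(int)  # quantize to 8 sectors
    return np.bincount(bins, minlength=8)

gray = np.tile(np.arange(5.0), (5, 1))   # gray increases left to right
mask = np.ones((5, 5), dtype=bool)       # pretend every pixel is an edge pixel
print(edge_orientation_histogram(gray, mask))  # all mass lands in bin 0 (0 degrees)
```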
Gray level co-occurrence matrix GLCM
For the Gray Level Co-occurrence Matrix (GLCM), the gray-level co-occurrence matrix of the multi-angle image sample is computed first, and then the contrast, correlation, energy, and homogeneity feature values are computed from the matrix as follows:
Contrast = Σ_{i,j} (i - j)^2 p(i, j)
Energy = Σ_{i,j} p(i, j)^2
Correlation = Σ_{i,j} (i - μ_i)(j - μ_j) p(i, j) / (σ_i σ_j)
Homogeneity = Σ_{i,j} p(i, j) / (1 + |i - j|)
where p(i, j) is the (normalized) gray-level co-occurrence matrix, and μ_i, μ_j, σ_i, σ_j are the means and standard deviations of its row and column marginals. The gray-level co-occurrence feature matrix of N multi-angle target image samples is denoted F_Glcm(N, 4).
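A self-contained NumPy sketch of the four GLCM features, for a single offset only (the patent does not specify offsets or the number of gray levels, so `levels` and `offset` here are assumptions; a production version would average several directions):

```python
import numpy as np

def glcm_features(image, levels=8, offset=(0, 1)):
    """GLCM at one pixel offset plus the four scalar features
    (contrast, correlation, energy, homogeneity) that form F_Glcm."""
    img = np.asarray(image)
    dr, dc = offset                     # non-negative row/col displacement assumed
    P = np.zeros((levels, levels), dtype=np.float64)
    rows, cols = img.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            P[img[r, c], img[r + dr, c + dc]] += 1
    P /= P.sum()                        # normalize to joint probabilities
    i, j = np.indices(P.shape)
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    contrast = ((i - j) ** 2 * P).sum()
    energy = (P ** 2).sum()
    corr = (((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j)) if sd_i * sd_j > 0 else 1.0
    homogeneity = (P / (1.0 + np.abs(i - j))).sum()
    return contrast, corr, energy, homogeneity

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
print(glcm_features(img, levels=4))
```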
Local binary pattern LBP
The local binary pattern (Local Binary Pattern, LBP) operates within a window: taking the window's center pixel as a threshold, the gray value of each adjacent pixel is compared with the center pixel; if the surrounding pixel value is greater than the center pixel value, that position is marked 1, otherwise 0. Histogram statistics of the codes of all pixels in an image block (LBP code on the horizontal axis, occurrence count on the vertical axis) give the LBP feature, which reflects the texture information of the image. The local binary pattern feature matrix of N multi-angle target image samples is denoted F_Lbp(N, m), where m is the number of local binary pattern features.
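A plain NumPy sketch of the basic 8-neighbor LBP coding and its histogram, following the strict "greater than" comparison described above (names and the 256-bin histogram size are illustrative):

```python
import numpy as np

def lbp_histogram(image):
    """Basic 8-neighbor LBP codes plus their 256-bin histogram (F_Lbp).

    Each interior pixel is compared with its 8 neighbors; neighbors whose
    gray value is strictly greater than the center are marked 1, others 0,
    and the 8 bits form the LBP code of that pixel."""
    img = np.asarray(image, dtype=np.float64)
    center = img[1:-1, 1:-1]
    # 8 neighbors, clockwise from top-left; each contributes one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(center, dtype=np.int64)
    for bit, (dr, dc) in enumerate(offsets):
        neigh = img[1 + dr: img.shape[0] - 1 + dr, 1 + dc: img.shape[1] - 1 + dc]
        code += (neigh > center).astype(np.int64) << bit
    return np.bincount(code.ravel(), minlength=256)

img = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
hist = lbp_histogram(img)
print(hist.sum())  # 1 (a 3x3 block has a single interior pixel)
```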
Step S2: constructing the target original high-dimensional observation feature space from the multi-angle SAR image features. Step S2 comprises: step S2.1: performing max-min normalization on the multi-angle SAR image features, with the calculation formula:
F(i) = (f(i) - f_min) / (f_max - f_min)
where f(i) is the i-th feature value, f_min is the minimum of all feature values, f_max is the maximum of all feature values, and F(i) is the feature value normalized to the range [0, 1]. The original high-dimensional observation space feature matrix formed by concatenating all features of the N multi-angle target image samples is denoted HighDimF(N, M), where N is the number of samples and M is the feature dimension after concatenation.
Step S2.2: concatenating all the processed features to obtain the original high-dimensional observation space feature matrix HighDimF(N, M), where N is the number of samples and M is the feature dimension after concatenation.
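Steps S2.1 and S2.2 can be sketched as follows (one plausible reading normalizes each feature matrix over all its values before concatenation; per-dimension normalization would be an equally valid reading of the patent):

```python
import numpy as np

def min_max_normalize(feature_matrix):
    """Max-min normalization F(i) = (f(i) - f_min) / (f_max - f_min),
    applied over all values of one feature matrix, as in step S2.1."""
    f = np.asarray(feature_matrix, dtype=np.float64)
    f_min, f_max = f.min(), f.max()
    if f_max == f_min:              # degenerate feature: map everything to 0
        return np.zeros_like(f)
    return (f - f_min) / (f_max - f_min)

def build_high_dim_space(feature_matrices):
    """Concatenate normalized per-feature matrices into HighDimF(N, M)."""
    return np.hstack([min_max_normalize(f) for f in feature_matrices])

# Two toy feature blocks for N = 3 samples
f_a = np.array([[0.0, 10.0], [5.0, 5.0], [10.0, 0.0]])
f_b = np.array([[1.0], [2.0], [3.0]])
high_dim = build_high_dim_space([f_a, f_b])
print(high_dim.shape)  # (3, 3), all values in [0, 1]
```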
Step S3: under the optimization criterion of keeping feature-space neighborhoods unchanged, applying Laplacian feature mapping to the original high-dimensional observation feature space to obtain the mapped low-dimensional intrinsic feature space. Laplacian eigenmaps (LE) preserve local neighbor relations; the criterion can be stated as: data points that are close in the original high-dimensional observation space should remain close after mapping.
Specifically, an adjacency graph G is first constructed: and taking the characteristic matrix of the original high-dimensional observation space as a data set X in the original high-dimensional observation space, and calculating the neighborhood of each data point so as to construct an adjacency graph. That is, the data set X in the original high-dimensional observation space is an original high-dimensional observation space feature matrix HighDimF (N, M) formed by connecting all features of the N multi-angle target image samples in series.
Then construct the weighted adjacency graph W: weight the edges of the adjacency graph to obtain the weight matrix W, with the calculation formula:
W_ij = exp(-||x_i - x_j||^2 / t) if x_j ∈ J_i, and W_ij = 0 otherwise
wherein W_ij denotes the weight assigned to the edge connecting a point with one of its neighboring points in the adjacency graph constructed in the original high-dimensional observation space; exp denotes the exponential with natural base e; t denotes the heat-kernel coefficient, with an empirical value of 1; x_i and x_j denote the i-th and j-th data points of the data set in the original high-dimensional observation space; and J_i denotes the set of points adjacent to sample point x_i in the original high-dimensional observation space;
A degree matrix D is then calculated from the weight matrix, with the calculation formula:
D_ii = Σ_j W_ij
wherein D_ii denotes a diagonal entry of the degree matrix of the weighted adjacency graph; the degree matrix is a diagonal matrix, and each value on the diagonal is the sum of the weights of the edges between the corresponding sample point and its adjacent points;
then solve the Laplacian matrix L: obtaining a Laplacian matrix L through the difference between the degree matrix and the weight matrix, wherein the calculation formula is as follows:
L=D-W
Finally solve the d-dimensional embedding Y: under the LE criterion, solve the optimization problem
min Σ_ij ||y_i - y_j||^2 W_ij, subject to Y^T D Y = I
This optimization problem can be converted into the generalized eigenvalue decomposition of the Laplacian matrix L, with the calculation formula:
Ly=λDy
The eigenvectors U = [u_2, …, u_{d+1}] corresponding to the 2nd through the (d+1)-th smallest eigenvalues λ_2, …, λ_{d+1} are taken, giving the low-dimensional embedded representation y_i = (u_2(i), …, u_{d+1}(i)), wherein λ denotes an eigenvalue, y denotes an eigenvector, d denotes the number of leading dimensions kept after sorting the eigenvalues from small to large following the eigendecomposition, k denotes the low-dimensional intrinsic feature dimension, and y_i denotes the representation of the i-th sample point formed from the eigenvectors corresponding to the 2nd through (d+1)-th smallest eigenvalues. The data set Y in the low-dimensional eigenspace is then obtained as the low-dimensional eigenspace feature matrix LowDimF(N, d), wherein Y = {y_i, i = 1, …, N}.
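The whole of step S3 can be sketched in a few lines of NumPy/SciPy; the k-nearest-neighbor rule for building the adjacency graph and the symmetrization of W are common choices that the text does not fix, so they are assumptions here:

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, d=3, k=10, t=1.0):
    """Laplacian Eigenmaps sketch: k-NN adjacency graph, heat-kernel weights,
    then the generalized eigenproblem L y = lambda D y; returns an (N, d) embedding."""
    N = X.shape[0]
    # pairwise squared Euclidean distances between all samples
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbor adjacency with heat-kernel weights (self excluded)
    W = np.zeros((N, N))
    for i in range(N):
        nn = np.argsort(sq[i])[1:k + 1]
        W[i, nn] = np.exp(-sq[i, nn] / t)
    W = np.maximum(W, W.T)              # symmetrize the graph
    D = np.diag(W.sum(axis=1))          # degree matrix D_ii = sum_j W_ij
    L = D - W                           # graph Laplacian
    # generalized eigenproblem L y = lambda D y; eigh returns ascending eigenvalues
    vals, vecs = eigh(L, D)
    # drop the trivial constant eigenvector (lambda_1 ~ 0), keep the next d
    return vecs[:, 1:d + 1]
```

Discarding the first eigenvector and keeping those for the 2nd through (d+1)-th smallest eigenvalues matches the selection rule stated above.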
Step S4: obtaining the distribution relation between the target angle and the manifold curved surface through the low-dimensional intrinsic feature space. For this distribution relation, three intrinsic dimensions of the obtained low-dimensional intrinsic feature space form a low-dimensional manifold curved surface, a multi-angle target sample index is established, and the relation between the target angle and the manifold curved surface rule is searched. On the multi-angle target feature analysis platform developed for the application, the search is carried out by synchronously and dynamically playing the images and the low-dimensional intrinsic feature space scatter diagram, so that the rule that targets at different angles travel correspondingly along the low-dimensional manifold curved surface can be clearly observed.
The application is further described below with reference to the multi-angle sample library of the MSTAR dataset as a research example:
Step one: extracting features of the target multi-angle SAR images to obtain, for N multi-angle target image samples, a basic pixel feature matrix F_Origpixel(N, 128×128), a gray histogram feature matrix F_Hist(N, N), an edge orientation histogram feature matrix F_Eoh(N, 8), a gray-level co-occurrence matrix feature matrix F_Glcm(N, 4) and a local binary pattern feature matrix F_Lbp(N, m).
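Two of the step-one features, the gray histogram and the 8-bin edge orientation histogram, can be sketched in plain NumPy as below; the bin count of 64 and the gradient-based orientation estimate are illustrative assumptions, and the GLCM and LBP features would typically come from a library such as scikit-image:

```python
import numpy as np

def gray_histogram(img, bins=64):
    """Gray-level histogram feature; the bin count of 64 is an illustrative choice."""
    h, _ = np.histogram(img, bins=bins, range=(0, 256), density=True)
    return h

def edge_orientation_histogram(img, bins=8):
    """8-bin edge orientation histogram: gradient directions weighted by magnitude."""
    gy, gx = np.gradient(img.astype(float))       # row and column gradients
    angles = np.arctan2(gy, gx)                   # orientation in (-pi, pi]
    h, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi),
                        weights=np.hypot(gx, gy))
    s = h.sum()
    return h / s if s > 0 else h                  # normalize to a distribution
```

Weighting each orientation by its gradient magnitude makes strong edges dominate the histogram, which is the usual convention for edge orientation features.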
Step two: establishing the original high-dimensional observation feature space; the features from step one are each max-min normalized and concatenated into one high-dimensional feature vector, giving the original high-dimensional observation space feature matrix HighDimF(N, M) formed by concatenating all normalized features of the N multi-angle target image samples.
Step three: Laplacian eigenmap dimensionality reduction; the original high-dimensional observation space feature matrix HighDimF(N, M) from step two is reduced in dimension through the Laplacian feature mapping to obtain the low-dimensional eigenspace feature matrix LowDimF(N, d).
Step four: as shown in fig. 2, searching for the angle relation between the low-dimensional manifold curved surface and the multi-angle targets; the low-dimensional eigenspace feature matrix LowDimF(N, d) obtained in step three is drawn as a three-dimensional scatter diagram to obtain the manifold curved surface of the multi-angle target samples in the low-dimensional eigenspace, and the change rule of the targets on the manifold curved surface as the angle changes is observed through the association between point indexes on the manifold curved surface and the image samples.
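The step-four association between angles and positions on the manifold curved surface can be checked quantitatively with a sketch like the following, which measures whether each sample's nearest neighbors in the low-dimensional eigenspace are also close in azimuth angle; the function and the circular-distance measure are illustrative, not part of the application:

```python
import numpy as np

def angle_consistency(embedding, angles_deg, k=3):
    """Mean angular gap (degrees) between each sample and its k nearest
    neighbors on the low-dimensional manifold. A small value indicates that
    samples at neighboring angles lie close together on the surface."""
    N = embedding.shape[0]
    sq = ((embedding[:, None, :] - embedding[None, :, :]) ** 2).sum(-1)
    gaps = []
    for i in range(N):
        nn = np.argsort(sq[i])[1:k + 1]                  # k nearest, self excluded
        diff = np.abs(angles_deg[nn] - angles_deg[i])
        gaps.append(np.minimum(diff, 360.0 - diff).mean())  # circular angle distance
    return float(np.mean(gaps))
```

On an ideal circular manifold (samples every 10 degrees of azimuth) the mean gap stays near the sampling interval, consistent with the rule that targets at adjacent angles travel together along the surface.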
Example two
The application also provides a target multi-angle intrinsic feature mining system based on the Laplace feature map. A person skilled in the art can realize the system by executing the step flow of the target multi-angle intrinsic feature mining method based on the Laplace feature map; that is, the method can be understood as a preferred implementation of the system.
According to the application, the target multi-angle intrinsic feature mining system based on Laplace feature mapping comprises:
module M1: extracting target multi-angle SAR image characteristics; the SAR image features include: basic pixel characteristics, gray characteristics, shape characteristics and texture characteristics; and the SAR image features are acquired from the filtered image under a set scale.
Module M2: constructing a target original high-dimensional observation feature space according to the multi-angle SAR image features; the module M2 includes: Module M2.1: respectively carrying out maximum-minimum normalization on the multi-angle SAR image features, with the calculation formula:
F(i) = (f(i) - f_min) / (f_max - f_min)
wherein f(i) is the i-th feature value, f_min is the minimum of all feature values, f_max is the maximum of all feature values, and F(i) is the normalized feature value scaled to the [0, 1] range.
Module M2.2: and (3) connecting all the processed features in series to obtain an original high-dimensional observation space feature matrix, namely HighDimF (N, M), wherein N is the number of samples, and M is the feature dimension after the series connection.
Module M3: carrying out Laplace feature mapping on the original high-dimensional observation feature space to obtain a mapped low-dimensional intrinsic feature space; preferably, the module M3 comprises:
module M3.1: and taking the characteristic matrix of the original high-dimensional observation space as a data set X in the original high-dimensional observation space, and calculating the neighborhood of each data point so as to construct an adjacency graph. Module M3.2: weighting the adjacency graph to obtain a weight matrix W, wherein the weight matrix W is represented by the following formula:
wherein W is ij Representing a weight matrix given by the edge connected with the neighboring point of each point in the adjacency graph constructed in the original high-dimensional observation space; e represents a natural base; t represents a thermonuclear function coefficient, and the empirical value is 1; x is x i Representing an ith data point in the dataset in the original high-dimensional observation space; x is x j Representing a jth data point in a dataset in an original high-dimensional observation space; j (J) i Representing sample point x i A set of adjacent points in an original high-dimensional observation space;
module M3.3: according to the weight matrix calculation degree matrix D, the calculation formula is as follows:
D ii =∑ j W ij
wherein D is ii A measurement matrix for representing the weighted adjacency graph, wherein the measurement matrix is in the form of a diagonal matrix, and the value on the diagonal of the matrix is the sum of the weight of each sample point and the edge between adjacent points;
module M3.4: obtaining a Laplacian matrix L through the difference between the degree matrix and the weight matrix, wherein the calculation formula is as follows:
L=D-W
module M3.5: the generalized eigenvalue decomposition of Laplacian matrix L is calculated as follows:
Ly=λDy
The eigenvectors U = [u_2, …, u_{d+1}] corresponding to the 2nd through the (d+1)-th smallest eigenvalues λ_2, …, λ_{d+1} are taken, giving the low-dimensional embedded representation y_i = (u_2(i), …, u_{d+1}(i)), wherein λ denotes an eigenvalue, y denotes an eigenvector, d denotes the number of leading dimensions kept after sorting the eigenvalues from small to large following the eigendecomposition, k denotes the low-dimensional intrinsic feature dimension, and y_i denotes the representation of the i-th sample point formed from the eigenvectors corresponding to the 2nd through (d+1)-th smallest eigenvalues. The data set Y in the low-dimensional eigenspace is then obtained as the low-dimensional eigenspace feature matrix LowDimF(N, d), wherein Y = {y_i, i = 1, …, N}.
Module M4: and obtaining the distribution relation between the target angle and the manifold curved surface through the low-dimensional intrinsic characteristic space. The distribution relation of the target angle and the manifold curved surface forms a low-dimensional manifold curved surface through three intrinsic dimensions in the obtained low-dimensional intrinsic feature space, a multi-angle target sample index is established, and the relation of the target angle and the manifold curved surface rule is searched; and when the target angle and the manifold curved surface rule are searched, the method is carried out in a mode of synchronous dynamic playing of the image and the low-dimensional intrinsic characteristic space scatter diagram.
Those skilled in the art will appreciate that the systems, apparatus, and their respective modules provided herein may be implemented entirely by logic programming of method steps such that the systems, apparatus, and their respective modules are implemented as logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc., in addition to the systems, apparatus, and their respective modules being implemented as pure computer readable program code. Therefore, the system, the apparatus, and the respective modules thereof provided by the present application may be regarded as one hardware component, and the modules included therein for implementing various programs may also be regarded as structures within the hardware component; modules for implementing various functions may also be regarded as being either software programs for implementing the methods or structures within hardware components.
The foregoing describes specific embodiments of the present application. It is to be understood that the application is not limited to the particular embodiments described above, and that various changes or modifications may be made by those skilled in the art within the scope of the appended claims without affecting the spirit of the application. The embodiments of the application and the features of the embodiments may be combined with each other arbitrarily without conflict.

Claims (10)

1. A target multi-angle intrinsic feature mining method based on Laplace feature mapping, characterized by comprising the following steps:
step S1: extracting target multi-angle SAR image characteristics;
step S2: constructing a target original high-dimensional observation feature space according to the multi-angle SAR image features;
step S3: carrying out Laplace feature mapping on the original high-dimensional observation feature space to obtain a mapped low-dimensional intrinsic feature space;
step S4: and obtaining the distribution relation between the target angle and the manifold curved surface through the low-dimensional intrinsic characteristic space.
2. The target multi-angle eigen feature mining method based on laplacian feature mapping of claim 1, wherein the SAR image features comprise: basic pixel characteristics, gray characteristics, shape characteristics and texture characteristics;
and the SAR image features are acquired from the filtered image under a set scale.
3. The target multi-angle eigen feature mining method based on the laplace feature mapping of claim 1, wherein step S2 includes:
step S2.1: and respectively carrying out maximum and minimum value normalization processing on the multi-angle SAR image characteristics, wherein the calculation formula is as follows:
wherein f (i) is the ith eigenvalue, f min For the minimum value of all characteristic values, f max For the maximum of all eigenvalues, F (i) is normalized to [0,1]Normalized eigenvalues in the range;
step S2.2: and (3) connecting all the processed features in series to obtain an original high-dimensional observation space feature matrix, namely HighDimF (N, M), wherein N is the number of samples, and M is the feature dimension after the series connection.
4. The target multi-angle eigen feature mining method based on the laplace feature mapping of claim 1, wherein step S3 includes:
step S3.1: taking the original high-dimensional observation space feature matrix as a data set X in the original high-dimensional observation space, and calculating the neighborhood of each data point so as to construct an adjacency graph;
step S3.2: weighting the adjacency graph to obtain the weight matrix W, with the calculation formula:
W_ij = exp(-||x_i - x_j||^2 / t) if x_j ∈ J_i, and W_ij = 0 otherwise
wherein W_ij denotes the weight assigned to the edge connecting a point with one of its neighboring points in the adjacency graph constructed in the original high-dimensional observation space; exp denotes the exponential with natural base e; t denotes the heat-kernel coefficient; x_i and x_j denote the i-th and j-th data points of the data set in the original high-dimensional observation space; and J_i denotes the set of points adjacent to sample point x_i in the original high-dimensional observation space;
step S3.3: according to the weight matrix calculation degree matrix D, the calculation formula is as follows:
D ii =Σ j W ij
wherein D is ii A measurement matrix for representing the weighted adjacency graph, wherein the measurement matrix is in the form of a diagonal matrix, and the value on the diagonal of the matrix is the sum of the weight of each sample point and the edge between adjacent points;
step S3.4: obtaining a Laplacian matrix L through the difference between the degree matrix and the weight matrix, wherein the calculation formula is as follows:
L=D-W
step S3.5: the generalized eigenvalue decomposition of Laplacian matrix L is calculated as follows:
Ly=λDy
the eigenvectors U = [u_2, …, u_{d+1}] corresponding to the 2nd through the (d+1)-th smallest eigenvalues λ_2, …, λ_{d+1} are taken, giving the low-dimensional embedded representation y_i = (u_2(i), …, u_{d+1}(i)), wherein λ denotes an eigenvalue, y denotes an eigenvector, d denotes the number of leading dimensions kept after sorting the eigenvalues from small to large following the eigendecomposition, k denotes the low-dimensional intrinsic feature dimension, and y_i denotes the representation of the i-th sample point formed from the eigenvectors corresponding to the 2nd through (d+1)-th smallest eigenvalues;
and the data set Y in the low-dimensional eigenspace is obtained as the low-dimensional eigenspace feature matrix LowDimF(N, d), wherein Y = {y_i, i = 1, …, N}.
5. The target multi-angle intrinsic feature mining method based on Laplace feature mapping according to claim 1, wherein for the distribution relation between the target angle and the manifold curved surface, three intrinsic dimensions of the obtained low-dimensional intrinsic feature space form a low-dimensional manifold curved surface, a multi-angle target sample index is established, and the relation between the target angle and the manifold curved surface rule is searched;
and when the target angle and the manifold curved surface rule are searched, the search is carried out in a mode of synchronous dynamic playing of images and low-dimensional intrinsic feature space scatter diagrams.
6. A target multi-angle intrinsic feature mining system based on Laplace feature mapping, characterized by comprising:
module M1: extracting target multi-angle SAR image characteristics;
module M2: constructing a target original high-dimensional observation feature space according to the multi-angle SAR image features;
module M3: carrying out Laplace feature mapping on the original high-dimensional observation feature space to obtain a mapped low-dimensional intrinsic feature space;
module M4: and obtaining the distribution relation between the target angle and the manifold curved surface through the low-dimensional intrinsic characteristic space.
7. The laplacian feature mapping-based target multi-angle eigenfeature mining system of claim 6 wherein the SAR image features comprise: basic pixel characteristics, gray characteristics, shape characteristics and texture characteristics;
and the SAR image features are acquired from the filtered image under a set scale.
8. The target multi-angle eigen feature mining system based on laplacian feature mapping of claim 6 wherein module M2 comprises:
module M2.1: and respectively carrying out maximum and minimum value normalization processing on the multi-angle SAR image characteristics, wherein the calculation formula is as follows:
wherein f (i) is the ith eigenvalue, f min For the minimum value of all characteristic values, f max For the maximum of all eigenvalues, F (i) is normalized to [0,1]Normalized eigenvalues in the range;
module M2.2: and (3) connecting all the processed features in series to obtain an original high-dimensional observation space feature matrix, namely HighDimF (N, M), wherein N is the number of samples, and M is the feature dimension after the series connection.
9. The target multi-angle eigen feature mining system based on laplacian feature mapping of claim 6 wherein module M3 comprises:
module M3.1: taking the original high-dimensional observation space feature matrix as a data set X in the original high-dimensional observation space, and calculating the neighborhood of each data point so as to construct an adjacency graph;
module M3.2: weighting the adjacency graph to obtain a weight matrix W, wherein the weight matrix W is represented by the following formula:
wherein W is ij Representing a weight matrix given by the edge connected with the neighboring point of each point in the adjacency graph constructed in the original high-dimensional observation space; e represents a natural base; t represents a thermonuclear function coefficient; x is x i Representing the first of the data sets in the original high-dimensional observation spacei data points; x is x j Representing a jth data point in a dataset in an original high-dimensional observation space; j (J) i Representing sample point x i A set of adjacent points in an original high-dimensional observation space;
module M3.3: according to the weight matrix calculation degree matrix D, the calculation formula is as follows:
D ii =Σ j W ij
wherein D is ii A measurement matrix for representing the weighted adjacency graph, wherein the measurement matrix is in the form of a diagonal matrix, and the value on the diagonal of the matrix is the sum of the weight of each sample point and the edge between adjacent points;
module M3.4: obtaining a Laplacian matrix L through the difference between the degree matrix and the weight matrix, wherein the calculation formula is as follows:
L=D-W
module M3.5: the generalized eigenvalue decomposition of Laplacian matrix L is calculated as follows:
Ly=λDy
the eigenvectors U = [u_2, …, u_{d+1}] corresponding to the 2nd through the (d+1)-th smallest eigenvalues λ_2, …, λ_{d+1} are taken, giving the low-dimensional embedded representation y_i = (u_2(i), …, u_{d+1}(i)), wherein λ denotes an eigenvalue, y denotes an eigenvector, d denotes the number of leading dimensions kept after sorting the eigenvalues from small to large following the eigendecomposition, k denotes the low-dimensional intrinsic feature dimension, and y_i denotes the representation of the i-th sample point formed from the eigenvectors corresponding to the 2nd through (d+1)-th smallest eigenvalues;
and the data set Y in the low-dimensional eigenspace is obtained as the low-dimensional eigenspace feature matrix LowDimF(N, d), wherein Y = {y_i, i = 1, …, N}.
10. The target multi-angle intrinsic feature mining system based on Laplace feature mapping according to claim 6, wherein for the distribution relation between the target angle and the manifold curved surface, three intrinsic dimensions of the obtained low-dimensional intrinsic feature space form a low-dimensional manifold curved surface, a multi-angle target sample index is established, and the relation between the target angle and the manifold curved surface rule is searched;
and when the target angle and the manifold curved surface rule are searched, the search is carried out in a mode of synchronous dynamic playing of images and low-dimensional intrinsic feature space scatter diagrams.
CN202310692714.2A 2023-06-12 2023-06-12 Target multi-angle intrinsic feature mining method based on Laplace feature mapping Pending CN116843906A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310692714.2A CN116843906A (en) 2023-06-12 2023-06-12 Target multi-angle intrinsic feature mining method based on Laplace feature mapping

Publications (1)

Publication Number Publication Date
CN116843906A true CN116843906A (en) 2023-10-03

Family

ID=88169697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310692714.2A Pending CN116843906A (en) 2023-06-12 2023-06-12 Target multi-angle intrinsic feature mining method based on Laplace feature mapping

Country Status (1)

Country Link
CN (1) CN116843906A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117409275A (en) * 2023-12-06 2024-01-16 华能澜沧江水电股份有限公司 Multi-angle radar image processing method
CN117409275B (en) * 2023-12-06 2024-04-05 华能澜沧江水电股份有限公司 Multi-angle radar image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination