CN114494244A - Scene adaptability analysis method based on support vector machine - Google Patents

Scene adaptability analysis method based on support vector machine

Info

Publication number
CN114494244A
Authority
CN
China
Prior art keywords
image
matching
samples
sample
feature
Prior art date
Legal status
Pending
Application number
CN202210324127.3A
Other languages
Chinese (zh)
Inventor
何浩东
吴明强
占必超
王才红
宫树香
许馨月
高军强
刘庆国
何向晨
刘青
赵云飞
Current Assignee
Pla 96901
Original Assignee
Pla 96901
Priority date
Filing date
Publication date
Application filed by Pla 96901
Priority to CN202210324127.3A
Publication of CN114494244A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches, based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

A scene adaptability analysis method based on a support vector machine comprises the following steps: S1, image feature selection; S2, sample generation; S3, SVM model training and output; S4, suitability analysis using the SVM model. The method can be used to optimize guidance schemes and the design of matching and positioning algorithms, can also be applied to guidance mission planning, helps to maximize weapon performance, and offers good economic and military benefits.

Description

Scene adaptability analysis method based on support vector machine
Technical Field
The invention relates to an image adaptability analysis method, in particular to a scene adaptability analysis method based on a support vector machine.
Background
A radar scene-matching positioning seeker aids navigation by matching the real-time radar image taken by the missile in flight against a prepared reference image of the target, thereby achieving accurate positioning of the target. Besides the performance of the matching algorithm, the suitability of the reference map is an important factor affecting matching accuracy. It is therefore particularly important to improve the probability of correct matching by performing suitability analysis on the reference image during combat mission planning.
At present, scene matching areas are mainly selected manually. When the amount of scene matching data is large, however, manual selection is labor-intensive and slow, and a satisfactory matching area is often hard to find because of subjective factors such as the operator's knowledge and experience. Existing automatic selection methods follow the four principles of richness, stability, uniqueness and significance of scene information and analyze matching-area selection from characteristic parameters of the reference map and the elevation map; but each individual index reflects only one aspect of image suitability, and the combined influence of multiple factors on matching-area selection is not considered comprehensively and systematically, so the generality of such scene matching area selection models is poor.
Disclosure of Invention
The invention provides a scene adaptability analysis method based on a support vector machine, which addresses two problems of existing automatic matching-area selection methods: poor generality, and the difficulty of linearly separating the many suitability-criterion parameters on the basis of a single characteristic parameter.
The technical scheme is as follows: characteristic parameters closely related to the performance of the matching algorithm are extracted by combining the scene feature information used by the SAR scene matching algorithm, learning and training are performed with a support vector machine method, and an adaptive-region analysis model is established.
A scene adaptability analysis method based on a support vector machine comprises the following steps:
S1: image feature selection; the DEM variance, DEM range, Sobel edge density, OTSU surface target density and image mean matrix variance are selected as the image features. S2: sample generation; radar images of various terrains are selected as sample base images, each base image is divided on a grid to generate a set of matching-subarea images, this set is taken as the sample set, invalid samples are removed from it, and the valid samples are split into training and test samples at a ratio of 2:1. S3: training and outputting the SVM model, specifically: S31: constructing the sample feature vectors by computing the image features of each valid sample; S32: performing uplink/downlink image matching on each valid sample and taking the matching result as its class attribute value; S33: normalizing the feature vectors; S34: training the SVM model, selecting a polynomial kernel as the kernel function and searching for the optimal parameters of the SVM algorithm by 5-fold cross validation on the normalized feature vectors and class attribute values; S35: outputting the SVM model and generating an SVM model file. S4: suitability analysis with the SVM model; the SVM model file is read to obtain the model parameters, which are substituted together with the feature vector of the sample to be analyzed into the decision function; if the result is 1 the image is adaptable, and if the result is -1 it is not adaptable. The decision function is:
f(x) = sgn( Σ_{i=1}^{l} a_i* · y_i · K(x, x_i) + b* )
where x is the feature vector of the sample to be analyzed, b* is the offset value of the decision function, x_i and a_i* are the i-th support vector and its corresponding Lagrange multiplier, y_i is the class attribute value of the sample corresponding to x_i, sgn denotes the sign function, K(x, x_i) is the kernel function, i is the index of a support vector, and l is the number of support vectors.
In step S1, feature selection is carried out with an SVM-based Filter & Wrapper feature selection method: a Filter feature selection method first selects the initial image features, namely the uplink/downlink image matching coefficient, uplink/downlink image matching deviation, Harris corner response values at 6 scales, Sobel edge density, Canny edge density, VOM mean matrix variance, OTSU surface target density, DEM variance and DEM range; a Wrapper feature selection method then selects, from these initial features, the image features that are beneficial to classification.
In step S2, the ratio of positive to negative samples in the training samples is 1:1.
The uplink/downlink image matching criteria in step S32 are: the matching error of the uplink and downlink images is less than 10 pixels, and the correlation coefficient is greater than 0.03; if both are satisfied, the class attribute value is set to 1, otherwise it is set to 0.
In step S33, normalization is performed dimension by dimension: given N training samples, the i-th dimension of the feature vector is rescaled so that its minimum value over the N samples becomes 0 (or -1) and its maximum value becomes 1.
The invention has the beneficial effects that:
the method can be used for optimizing a guidance scheme and matching positioning algorithm design, can also be applied to guidance mission planning, is beneficial to maximizing weapon performance, and has good economic benefit and military benefit.
Drawings
FIG. 1 shows the scene adaptability analysis method based on a support vector machine according to the present invention.
FIG. 2 shows the SVM-based Filter & Wrapper feature selection process.
FIG. 3 shows an example SVM model file.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, a scene adaptability analysis method based on a support vector machine includes the following specific steps:
s1: image feature selection
To classify images with an SVM into two classes, adaptable and non-adaptable, an image feature vector must first be constructed, selecting image features that characterize the image comprehensively and are beneficial to correct classification. The initial features of the radar image are chosen according to the missile's imaging and matching characteristics, and each feature reflects the suitability of the image from a different angle. For example, terrain relief affects the missile-borne radar image: too much relief introduces excessive error, while ground that is too flat is unfavorable for imaging. The ground elevation range and standard deviation are therefore chosen as image features. The richness of the image content is another important factor affecting matching; the edges and connected regions of the scene represent this richness. In addition, the matching performance between the radar image taken by the satellite in the uplink direction and the radar image taken in the downlink direction can be used as a measure of image suitability.
In summary, the selected initial image features are: number of independent pixels, image variance, information entropy, edge density, main-to-secondary peak ratio of the correlation surface, Harris corner response features, elevation standard deviation, elevation range, uplink/downlink image matching coefficient, uplink/downlink image matching deviation, Harris corner response values at 6 scales, Sobel edge density, Canny edge density, VOM mean matrix variance, OTSU surface target density, DEM variance and DEM range.
The uplink/downlink image matching coefficient is obtained by matching the uplink image against the downlink image; the maximum correlation value found during this matching is taken as the matching coefficient.
The uplink/downlink image matching deviation is computed as follows: 1) obtain the best matching position, i.e. the pixel position at which the uplink/downlink matching coefficient is reached; 2) determine the geographic coordinate range of the corresponding downlink map from that of the uplink map, and from it the center pixel position of the downlink map; 3) compute the matching deviation as the geometric distance between the two positions obtained above.
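As a non-limiting illustration, the two uplink/downlink features above could be computed as in the sketch below. The patent does not specify the similarity measure or how the geographic coordinate ranges are converted, so normalized cross-correlation and a purely pixel-based downlink center are assumed here.

```python
# Sketch (assumed measure: normalized cross-correlation) of the uplink/downlink
# matching coefficient and matching deviation.
import cv2
import numpy as np

def up_down_matching(up_img, down_img):
    """up_img: uplink radar image (template); down_img: downlink radar image
    (search area, at least as large as up_img). Both 8-bit or float32."""
    corr = cv2.matchTemplate(down_img, up_img, cv2.TM_CCOEFF_NORMED)
    _, match_coef, _, best_xy = cv2.minMaxLoc(corr)   # maximum = matching coefficient

    # Best matching position, expressed as the center of the matched window.
    best_center = (best_xy[0] + up_img.shape[1] / 2.0,
                   best_xy[1] + up_img.shape[0] / 2.0)
    # Center pixel of the downlink map (stand-in for the position derived from
    # the geographic coordinate ranges, which are not modelled here).
    down_center = (down_img.shape[1] / 2.0, down_img.shape[0] / 2.0)

    # Matching deviation: geometric (Euclidean) distance between the two points.
    deviation = float(np.hypot(best_center[0] - down_center[0],
                               best_center[1] - down_center[1]))
    return match_coef, deviation
```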
The DEM variance is a statistical feature of the elevation-data matrix of the ground Digital Elevation Model (DEM) corresponding to the sample region. The feature value is obtained by traversing the DEM matrix and computing the variance of its element values:
DEMV = (1/(M·N)) · Σ_{i=1}^{M} Σ_{j=1}^{N} ( I(i,j) − Ī )²
where I(i,j) is the element in row i, column j of the M×N DEM data matrix and Ī is the mean of all matrix elements.
The DEM range is obtained by traversing the whole DEM grey image to find its maximum and minimum values, i.e. the highest and lowest elevation points of the actual ground, and taking their difference as the range of the ground elevation:
DEMR = Max(I_DemImage) − Min(I_DemImage)
where I_DemImage is the elevation-data matrix of the region corresponding to the sample.
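A minimal sketch of these two DEM features, assuming the DEM of the matching subarea is supplied as a 2-D NumPy array of elevation values:

```python
import numpy as np

def dem_features(dem):
    """dem: 2-D array of elevation values for one matching subarea."""
    dem = np.asarray(dem, dtype=np.float64)
    dem_variance = float(np.var(dem))          # variance of all matrix elements
    dem_range = float(dem.max() - dem.min())   # DEMR = Max(I) - Min(I)
    return dem_variance, dem_range
```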
The Sobel edge density and the OTSU surface target density are obtained by applying Sobel edge detection and Otsu threshold segmentation to the image respectively, extracting the connected components after segmentation, and computing the ratio of connected-component pixels to the total number of pixels; this ratio is the density feature.
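The sketch below illustrates the two density features. The patent does not state how the Sobel gradient magnitude is binarized or whether small connected components are filtered out; Otsu thresholding of the magnitude and no size filtering are assumed here.

```python
# Sketch of the Sobel edge density and Otsu surface-target density features.
import cv2
import numpy as np

def sobel_edge_density(gray):
    """gray: 8-bit single-channel image of one matching subarea."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    mag8 = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Binarize the gradient magnitude (Otsu threshold assumed).
    _, edges = cv2.threshold(mag8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return float(np.count_nonzero(edges)) / edges.size

def otsu_target_density(gray):
    """gray: 8-bit single-channel image of one matching subarea."""
    _, seg = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Connected components of the segmented foreground; their pixel count
    # divided by the total pixel count gives the density feature.
    _, labels = cv2.connectedComponents(seg)
    return float(np.count_nonzero(labels)) / labels.size
```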
Feature selection is then performed with the SVM-based Filter & Wrapper method; the specific process is shown in FIG. 2. The Filter method selects a feature subset from the initial feature set, where the usefulness of a feature is expressed by its classification effect on the data set. As one option, a feature evaluation criterion based on Euclidean distance uses the between-class and within-class distances of the samples as the index of a feature's discriminative power. To account for the influence of the different scales of the features on this index, the D-SCORE criterion among the Euclidean-distance evaluation indices is adopted. In the initial stage, the Filter method, based on the D-SCORE values of all feature dimensions, selects the initial features from candidate suitability parameters such as number of independent pixels, image variance, information entropy, edge density, main-to-secondary peak ratio of the correlation surface, Harris corner response features, DEM variance, DEM range, uplink/downlink image matching coefficient and uplink/downlink image matching deviation; the uplink/downlink image matching coefficient, uplink/downlink image matching deviation, Harris corner response values at 6 scales, Sobel edge density, Canny edge density, VOM mean matrix variance, OTSU surface target density, DEM variance and DEM range are selected as the initial feature set.
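The patent names the D-SCORE criterion but does not reproduce its formula. The sketch below therefore ranks features with a generic Euclidean between-class to within-class ratio (a Fisher-score-like stand-in, not the exact D-SCORE), and the helper names are illustrative.

```python
# Hypothetical stand-in for the Filter-stage scoring: higher score = more
# discriminative feature for the adaptable/non-adaptable classes.
import numpy as np

def filter_scores(X, y):
    """X: (n_samples, n_features) feature matrix; y: 0/1 class labels."""
    X, y = np.asarray(X, float), np.asarray(y)
    pos, neg = X[y == 1], X[y == 0]
    between = (pos.mean(axis=0) - neg.mean(axis=0)) ** 2   # between-class distance
    within = pos.var(axis=0) + neg.var(axis=0) + 1e-12     # within-class scatter
    return between / within

def filter_select(X, y, k):
    order = np.argsort(filter_scores(X, y))[::-1]
    return order[:k]            # indices of the k highest-scoring features
```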
The feature selection of the Wrapper method is closely tied to the classifier's learning process. The data set is divided into a training set and a test set; each time a sub-feature set is drawn from the initial feature set, a classifier is trained on it, and the quality of the selected feature set is evaluated by the classifier's performance on the test set. First, features are chosen according to the search strategy, here a sequential forward floating search: the search starts from an empty feature subset, and a temporary SVM is built and trained on the current feature subset. Second, the temporary SVM classifies the test set, giving the classification accuracy on the test samples. Third, a termination check is made: whenever a new feature is added to the subset, it is checked whether the enlarged subset improves the objective function (classification accuracy); if not, the feature is not added. The search then continues with the next feature dimension according to the strategy, and the above steps are repeated until the stopping condition is met (all feature subsets in the feature set have been traversed). The features finally retained for the image feature vector are: DEM variance, DEM range, Sobel edge density, OTSU surface target density, and image mean matrix variance.
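A sketch of the Wrapper stage follows. For brevity it implements plain sequential forward selection scored by cross-validated SVM accuracy; the conditional exclusion (floating) step of full SFFS is omitted, scikit-learn is assumed as the SVM implementation, and the function name is illustrative.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def wrapper_forward_select(X, y, cv=5):
    """Greedy forward selection: add a feature only if accuracy improves."""
    X = np.asarray(X, float)
    remaining = list(range(X.shape[1]))
    selected, best_score = [], 0.0
    improved = True
    while improved and remaining:
        improved = False
        for f in list(remaining):
            trial = selected + [f]
            score = cross_val_score(SVC(kernel="poly"), X[:, trial], y, cv=cv).mean()
            if score > best_score:          # candidate improves the objective
                best_score, best_f, improved = score, f, True
        if improved:
            selected.append(best_f)
            remaining.remove(best_f)
    return selected, best_score
```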
S2: sample generation.
To improve the generality of the SVM model, radar images of various terrains, such as mountains, oceans, lakes, plains, forests and hills, are selected as samples. A sample file is first generated: the image is divided on a grid to produce the set of matching-subarea images. The samples are then processed as follows. 1) Invalid-sample rejection: before the features are computed, the data sufficiency of each matching subarea is checked, and features are not computed for subareas with insufficient data so that they do not affect classifier training; such subareas are not added to the valid samples, i.e. they are not used as training or test samples. The data-sufficiency criterion is that the number of pixels whose value is 0 in the image or DEM of the matching subarea must not exceed a certain proportion of the subarea's total pixel count, usually set to 10% (an empirical value). 2) Splitting into training and test samples: the computed data-sample file is split into training and test samples according to the following principles: positive and negative samples in the training set are in a 1:1 ratio, and the ratio of training samples to test samples is 2:1.
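An illustrative sketch of the sample-generation step is given below. The tile size of 64 pixels is an assumed value (the patent does not fix the grid size), and the 1:1 positive/negative balancing of the training set would be applied once the class labels from step S32 are available.

```python
# Sketch: tile the base image and DEM into matching subareas, discard tiles
# whose zero-valued pixels exceed 10% (data-sufficiency check), and split the
# valid samples 2:1 into training and test sets (no shuffling shown).
import numpy as np

def generate_samples(image, dem, tile=64, max_zero_ratio=0.10):
    tiles = []
    for r in range(0, image.shape[0] - tile + 1, tile):
        for c in range(0, image.shape[1] - tile + 1, tile):
            img_t = image[r:r + tile, c:c + tile]
            dem_t = dem[r:r + tile, c:c + tile]
            zero_ratio = max((img_t == 0).mean(), (dem_t == 0).mean())
            if zero_ratio <= max_zero_ratio:       # keep only sufficient data
                tiles.append((img_t, dem_t))
    n_train = int(round(len(tiles) * 2 / 3))       # train : test = 2 : 1
    return tiles[:n_train], tiles[n_train:]
```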
S3: training and outputting the SVM model.
S31: constructing the sample feature vector. The following features are computed for each matching subarea: DEM variance, DEM range, Sobel edge density, OTSU surface target density, and image mean matrix variance.
S32: determining the SVM classification category attribute. A category attribute value is generated from the result of uplink/downlink image matching, with the criteria: 1) the matching error is less than 10 pixels; 2) the correlation coefficient is greater than 0.03. If both are satisfied, the category attribute value is set to 1; otherwise it is set to 0.
S33: normalizing the feature vectors. Normalization is performed per feature dimension: given N training samples, the i-th dimension is rescaled so that its minimum value over the N samples becomes 0 (or -1) and its maximum value becomes 1.
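A minimal sketch of the per-dimension normalization; the [-1, 1] target range is one of the two options mentioned above, and the map fitted on the training samples is reused for test and to-be-analyzed samples.

```python
import numpy as np

def fit_minmax(X_train, lower=-1.0, upper=1.0):
    """Fit a per-dimension min-max map on the training samples and return it."""
    X_train = np.asarray(X_train, float)
    mins, maxs = X_train.min(axis=0), X_train.max(axis=0)
    span = np.where(maxs > mins, maxs - mins, 1.0)   # guard constant columns
    def transform(X):
        return lower + (upper - lower) * (np.asarray(X, float) - mins) / span
    return transform
```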
S34: training the SVM model. A polynomial kernel is chosen as the kernel function, and the optimal parameters of the SVM algorithm, comprising the penalty factor and the kernel-function parameters, are found by 5-fold cross validation.
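The following sketch uses scikit-learn as a stand-in SVM implementation for step S34; the parameter-grid values are assumptions, not taken from the patent.

```python
# Sketch: polynomial-kernel SVM with 5-fold cross-validated parameter search.
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

def train_svm(X_train, y_train):
    param_grid = {
        "C": [0.1, 1, 10, 100],        # penalty factor (assumed grid)
        "degree": [2, 3, 4],           # polynomial degree
        "gamma": [0.01, 0.1, 1.0],     # kernel scaling parameter
        "coef0": [0.0, 1.0],           # kernel offset
    }
    search = GridSearchCV(SVC(kernel="poly"), param_grid, cv=5)
    search.fit(X_train, y_train)
    return search.best_estimator_, search.best_params_
```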
S35: outputting the SVM classification model, and saving the trained SVM model into a model file, as shown in FIG. 3.
In the model file, the first row gives the selected SVM model type; the second row gives the kernel-function type, where polynomial denotes the polynomial kernel; the third row gives the degree of the polynomial kernel; the fourth row the scaling parameter of the polynomial kernel; the fifth row the offset value of the polynomial kernel; the sixth row the number of classes; the seventh row the total number of support vectors; the eighth row the offset value of the decision function; the ninth row the label set; the tenth row the numbers of positive and negative support vectors; the eleventh row is the support-vector flag marking the beginning of the support-vector list, which is followed by all the support vectors.
S4: fitness analysis using SVM models
The SVM model file is read to obtain the model parameters, which are substituted together with the five-dimensional feature vector of the sample to be analyzed into the decision function; if the result is 1 the image is adaptable, and if the result is -1 it is not adaptable.
The SVM decision function is:
f(x) = sgn( Σ_{i=1}^{l} a_i* · y_i · K(x, x_i) + b* )
where x is the feature vector of the sample to be analyzed, b* is the offset value of the decision function, x_i and a_i* are the i-th support vector and its corresponding Lagrange multiplier, y_i is the class attribute value of the sample corresponding to x_i, sgn denotes the sign function, K(x, x_i) is the kernel function, i is the index of a support vector, and l is the number of support vectors.
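The decision function can be evaluated directly from the parameters stored in the model file, as in the sketch below; a polynomial kernel K(u, v) = (gamma·⟨u, v⟩ + coef0)^degree is assumed, and the function names and argument layout are illustrative rather than taken from the patent.

```python
import numpy as np

def poly_kernel(u, v, gamma=1.0, coef0=0.0, degree=3):
    # Assumed polynomial kernel form; the exact parameterization is not given
    # in the patent.
    return (gamma * np.dot(u, v) + coef0) ** degree

def decide(x, support_vectors, alphas, labels, b, **kern):
    """Evaluate f(x) = sgn(sum_i a_i* y_i K(x, x_i) + b*)."""
    s = sum(a * y * poly_kernel(x, sv, **kern)
            for sv, a, y in zip(support_vectors, alphas, labels))
    return 1 if s + b >= 0 else -1     # 1: adaptable, -1: not adaptable
```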
The SVM-based scene suitability analysis is completed.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalents, improvements, etc. made within the principle of the present invention are included in the scope of the present invention.

Claims (5)

1. A scene adaptability analysis method based on a support vector machine is characterized by comprising the following steps:
S1: image feature selection;
selecting DEM variance, DEM range, Sobel edge density, OTSU surface target density and image mean matrix variance as the image features;
S2: sample generation;
selecting radar images of various terrains as sample basic images, dividing the basic images in a grid form to generate a matching subarea image set, taking the matching subarea image set as a sample set, and removing invalid samples from the matching subarea image set to enable the proportion of training samples to test samples in the valid samples to be 2 to 1;
S3: training and outputting an SVM model, specifically comprising:
S31: constructing a sample feature vector; calculating the image features of the effective samples as feature vectors of the effective samples;
S32: determining the SVM classification category attribute; carrying out uplink/downlink image matching on the valid samples, and taking the matching result as the category attribute value;
S33: normalizing the feature vectors;
S34: training the SVM model; selecting a polynomial kernel as the kernel function, and searching for the optimal parameters of the SVM algorithm by 5-fold cross validation based on the normalized feature vectors and the category attribute values;
S35: outputting the SVM model to generate an SVM model file;
S4: carrying out adaptability analysis by applying the SVM model;
reading the SVM model file, obtaining the model parameters, and substituting the model parameters and the feature vector of the sample to be analyzed into a decision function, wherein if the result is 1, the image is adaptable, and if the result is -1, the image is not adaptable; the decision function is:
f(x) = sgn( Σ_{i=1}^{l} a_i* · y_i · K(x, x_i) + b* )
where x is the feature vector of the sample to be analyzed, b* is the offset value of the decision function, x_i and a_i* are the i-th support vector and its corresponding Lagrange multiplier, y_i is the class attribute value of the sample corresponding to x_i, sgn denotes the sign function, K(x, x_i) is the kernel function, i is the index of a support vector, and l is the number of support vectors.
2. The support vector machine-based scene suitability analysis method according to claim 1, wherein in step S1, feature selection is performed with an SVM-based Filter & Wrapper feature selection method: a Filter feature selection method first selects the initial image features, namely the uplink/downlink image matching coefficient, uplink/downlink image matching deviation, Harris corner response values at 6 scales, Sobel edge density, Canny edge density, VOM mean matrix variance, OTSU surface target density, DEM variance and DEM range, and a Wrapper feature selection method then selects the image features beneficial to classification from the initial image features.
3. The method for scene adaptability analysis based on support vector machine according to claim 1, wherein in step S2, the ratio of positive and negative samples in the training samples is 1: 1.
4. The method for scene adaptability analysis based on support vector machine according to claim 1, wherein in step S32, the up-down image matching criteria are: the matching error of the uplink and downlink images is less than 10 pixels and the correlation coefficient is more than 0.03; if both are satisfied, the category attribute value is set to 1, otherwise the category attribute value is set to 0.
5. The method for scene adaptability analysis based on a support vector machine according to claim 1, wherein in step S33, normalization is performed per dimension of the feature vector, i.e. given N training samples, the i-th dimension of the feature vector is normalized so that its minimum value over the N samples becomes 0 or -1 and its maximum value becomes 1.
CN202210324127.3A 2022-03-30 2022-03-30 Scene adaptability analysis method based on support vector machine Pending CN114494244A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210324127.3A CN114494244A (en) 2022-03-30 2022-03-30 Scene adaptability analysis method based on support vector machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210324127.3A CN114494244A (en) 2022-03-30 2022-03-30 Scene adaptability analysis method based on support vector machine

Publications (1)

Publication Number Publication Date
CN114494244A true CN114494244A (en) 2022-05-13

Family

ID=81488892

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210324127.3A Pending CN114494244A (en) 2022-03-30 2022-03-30 Scene adaptability analysis method based on support vector machine

Country Status (1)

Country Link
CN (1) CN114494244A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112668613A (en) * 2020-12-07 2021-04-16 中国西安卫星测控中心 Satellite infrared imaging effect prediction method based on weather forecast and machine learning

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112668613A (en) * 2020-12-07 2021-04-16 中国西安卫星测控中心 Satellite infrared imaging effect prediction method based on weather forecast and machine learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
杨朝辉 et al., "Scene matching area selection method based on support vector machine", Journal of Tongji University (Natural Science) *
苏娟 et al., "SAR scene MSA selection based on terrain relief and SVM", Systems Engineering and Electronics *

Similar Documents

Publication Publication Date Title
CN110298298B (en) Target detection and target detection network training method, device and equipment
CN109978035B (en) Pedestrian detection method based on improved k-means and loss function
CN107633226B (en) Human body motion tracking feature processing method
CN110363165B (en) Multi-target tracking method and device based on TSK fuzzy system and storage medium
CN109344695B (en) Target re-identification method and device based on feature selection convolutional neural network
CN108428220A (en) Satellite sequence remote sensing image sea island reef region automatic geometric correction method
CN110633727A (en) Deep neural network ship target fine-grained identification method based on selective search
CN111738319A (en) Clustering result evaluation method and device based on large-scale samples
CN109508674B (en) Airborne downward-looking heterogeneous image matching method based on region division
CN112164087B (en) Super-pixel segmentation method and device based on edge constraint and segmentation boundary search
CN107194917B (en) DAP and ARE L M-based on-orbit SAR image change detection method
CN115588178B (en) Automatic extraction method for high-precision map elements
Wang et al. Research on vehicle detection based on faster R-CNN for UAV images
CN112070151A (en) Target classification and identification method of MSTAR data image
CN115953371A (en) Insulator defect detection method, device, equipment and storage medium
CN109063543B (en) Video vehicle weight recognition method, system and device considering local deformation
CN114494244A (en) Scene adaptability analysis method based on support vector machine
CN113343819B (en) Efficient unmanned airborne SAR image target segmentation method
CN105373809B (en) SAR target identification methods based on non-negative least square rarefaction representation
CN115409705A (en) Countermeasure sample generation method for SAR image target identification model
CN111401252B (en) Book spine matching method and equipment of book checking system based on vision
CN111429419B (en) Insulator contour detection method based on hybrid ant colony algorithm
CN109447954B (en) Camouflage effect evaluation method based on kernel density estimation
Yu et al. SAR image segmentation by merging multiple feature regions
CN112508970A (en) Point cloud data segmentation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220513)