CN102902982B - SAR image texture classification method based on observation vector differences

SAR image texture classification method based on observation vector differences

Info

Publication number
CN102902982B
CN102902982B (application CN201210344066.3A / CN201210344066A)
Authority
CN
China
Prior art keywords
image
texton
classification
test image
histogram
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210344066.3A
Other languages
Chinese (zh)
Other versions
CN102902982A (en)
Inventor
侯彪
焦李成
李邵利
王爽
张向荣
马文萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201210344066.3A priority Critical patent/CN102902982B/en
Publication of CN102902982A publication Critical patent/CN102902982A/en
Application granted granted Critical
Publication of CN102902982B publication Critical patent/CN102902982B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a SAR image texture classification method based on observation vector differences, mainly solving the problem of SAR image terrain classification. The classification process is: (1) randomly select r images from the training set, partition them into patches, and convert them into a column-vector difference matrix P; (2) observe P with an observation matrix to obtain the texton observation-vector difference matrix X, and cluster it to obtain the texton dictionary D; (3) following step (2), compute the observation-vector difference matrix X_tr of the training-set images; (4) project X_tr onto the texton dictionary D to form the training-image texton histograms h; (5) represent each test-set image as a test-image texton histogram h_e; (6) compute the distance between h_e and h, and assign h_e to a class according to this distance; (7) process all test images according to step (6) to obtain the final classification rate. The invention applies recent compressed sensing theory; the process is simple, the classification recognition rate is high, and the method can be used for SAR image terrain texture classification.

Description

SAR image texture classification method based on observation vector differences
Technical field
The invention belongs to the technical field of image processing and relates to SAR image classification; it can be applied to SAR image ground-object target recognition and classification.
Background technology
Synthetic aperture radar (SAR) is a high-resolution radar system applicable to military, agricultural, navigation, geographic monitoring and many other fields, and it differs in many respects from other remote sensing and optical imaging systems. In military target recognition, SAR images are widely used for target detection, and SAR image terrain classification extends traditional automatic terrain classification technology. A SAR image, however, carries very rich information with a complex feature structure: it contains both natural features such as terrain, vegetation and hydrology, and man-made features such as houses and roads. The relationships among these features, whether geometric or semantic, are also quite complex and cannot be described by simple methods. Traditional manual visual interpretation is time-consuming and labour-intensive and cannot meet production needs, so a fast and highly accurate classification and recognition technique is urgently needed. Spectral features and texture features are the two basic characteristics of remote sensing images and the two fundamental elements on which remote sensing image analysis relies. Because many factors interfere with and affect the spectral information of ground objects, and because phenomena such as different objects sharing the same spectrum and the same object showing different spectra exist to some extent, the spectral information that can be extracted in most applications is very limited and falls far short of the needs of ever-growing remote sensing applications. Texture, by contrast, reflects the spatial distribution of image gray-level patterns, contains the surface information of the image and its relation to the surrounding environment, better captures both the macro-structure and micro-structure of the image, and helps suppress the same-spectrum/different-object and same-object/different-spectrum phenomena. It therefore plays a very important role in image analysis.
For the above characteristics of SAR images, classical texture classification methods include:
S. Kuttikkad, R. Chellappa and colleagues at the University of Maryland use the information provided by multi-channel fully polarimetric SAR images, extract regions of interest with a CFAR detector, segment them with a maximum-likelihood segmentation algorithm, and finally use a "site model" (the geometric and positional characteristics of targets in polarimetric SAR images) to classify the image terrain.
Steven K. Rogers, Dennis W. Ruck and other scholars at the US Air Force Institute of Technology, together with Kevin J. Willy of Wright Laboratory, successfully used wavelet and fractal techniques to divide polarimetric ADTS SAR images into classes such as shadow, background, trees and man-made targets.
Weisenseel, Robert A. and others use Markov random field methods, converting the Markov random field conditional probability into a Gibbs-distribution energy-minimisation problem, to divide the image into target, background and shadow regions. Hans-Christoph Quelle and others in France propose using the Pearson distribution system, in essence an improved EM (Expectation Maximization) algorithm, to perform adaptive SAR image segmentation.
The compressed sensing theory proposed in recent years has attracted the attention of many scholars. Non-adaptive linear observations capture the information of sparse or compressible high-dimensional signals and allow almost perfect reconstruction and processing. Compressed sensing theory has been extended to signal and information processing, pattern recognition, computer vision, machine learning and broader fields. Li Liu and Paul Fieguth applied observation-vector techniques to natural-image texture classification and showed clear advantages. However, owing to the complexity of SAR images themselves, directly applying existing natural-image texture processing methods to SAR images does not give satisfactory results.
Summary of the invention
The object of the invention is to address the deficiencies of the above prior art. Starting from the characteristics of SAR images themselves, it proposes a SAR image texture classification method based on observation vector differences, which uses the observation-vector concept from compressed sensing to improve SAR image classification accuracy while reducing computational complexity.
The technical scheme of the invention is: partition the original SAR image into patches, take the observation-vector difference of each pair of adjacent image patches as a feature vector of the image, and cluster all observation-vector differences. Each cluster centre is called a texton, and all cluster centres together form the texton dictionary. Training and test images are projected onto this texton dictionary to obtain texton histograms, and comparing the training histograms with the test histograms gives the final classification result. The specific implementation process is as follows:
(1) randomly select r images from each texture class of the training set, 8 < r < 15, and partition each image into patches; each patch is called a texton and denoted I_p;
(2) concatenate the columns of each patch I_p end to end into a single column vector p; subtract adjacent column vectors to obtain column-vector differences, denoted p_diff; and combine the column-vector differences of all patches into the column-vector difference matrix P_diff;
(3) compute the texton observation-vector difference matrix X from the column-vector difference matrix P_diff by the formula
X = M · P_diff,
where M is the observation matrix;
(4) cluster the vectors in the texton observation-vector difference matrix X; each cluster centre is denoted d, and all centres together form the texton dictionary D of size CK, where C is the number of image classes and K is the number of clusters per class;
(5) for every image in the training set, obtain the observation-vector difference matrix X_tr of the training-set image by the method of steps (1) to (3);
(6) for each vector x in X_tr, compute its Euclidean distance to every texton d in the texton dictionary D, and add 1 to the frequency of the nearest texton, forming the texton histogram h of each image in the training set;
(7) combine the texton histograms of all images in the training set into the set H = {h_{c,s}}, c = 1, ..., C, s = 1, ..., S, where S is the number of images of each class in the training set and C is the number of classes in the training set;
(8) normalise each texton histogram in the training histogram set H according to
h_{c,s}(k) = h_{c,s}(k) / Σ_{k=1}^{CK} h_{c,s}(k),
where c is the class index, s is the index of the image within class c, C is the total number of image classes, and K is the number of clusters per class;
(9) for each new image in the test set, repeat steps (5) to (8) to form the texton histogram of the test image, denoted h_e;
(10) compute the χ² distance between the test-image texton histogram h_e and every training-image texton histogram h according to
χ²(h, h_e) = (1/2) Σ_{k=1}^{CK} [h(k) − h_e(k)]² / [h(k) + h_e(k)],
where C is the total number of image classes and K is the number of clusters per class;
(11) among all training-image histogram distances χ²(h, h_e), denote the training texton histogram at minimum distance from the test-image histogram h_e as h_m; the class c_m of the image to which h_m belongs is the class assigned to the test image, and if c_m equals the test image's true class, the test image is correctly classified;
(12) repeat steps (9) to (11) for all test-image texton histograms; the number of correctly classified test images is N_d;
(13) from N_d in step (12) and the total number of test images N_t, obtain the final correct classification rate r:
r = N_d / N_t.
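The data flow of steps (1)-(13) can be sketched end to end on synthetic data. This is a toy illustration only: the image size, patch side, observation dimension and the "dictionary" (random feature columns standing in for K-means centres) are all my simplifications, not the patent's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 5                                     # toy patch side and observation dim
M = rng.standard_normal((m, n * n))             # observation matrix (Gaussian assumed)

def features(img):
    """Steps (1)-(3): patches -> adjacent-vector differences -> X = M @ P_diff."""
    h, w = img.shape
    half = n // 2
    cols = [img[i - half:i + half + 1, j - half:j + half + 1].flatten(order="F")
            for i in range(half, h - half) for j in range(half, w - half)]
    P = np.stack(cols, axis=1)                  # n^2 x (number of patches)
    return M @ (P[:, 1:] - P[:, :-1])

def histogram(X, D):
    """Steps (5)-(8): nearest-texton voting, then normalisation."""
    d2 = ((X[:, None, :] - D[:, :, None]) ** 2).sum(axis=0)
    h = np.bincount(d2.argmin(axis=0), minlength=D.shape[1]).astype(float)
    return h / h.sum()

train_imgs = [rng.random((8, 8)) for _ in range(4)]          # stand-in training set
X_all = np.hstack([features(im) for im in train_imgs])
D = X_all[:, rng.choice(X_all.shape[1], 4, replace=False)]   # step (4), simplified
train_hists = [histogram(features(im), D) for im in train_imgs]

def chi2(h, he):
    """Step (10): chi-squared histogram distance."""
    return 0.5 * np.sum((h - he) ** 2 / (h + he + 1e-12))

test_h = histogram(features(rng.random((8, 8))), D)          # step (9)
pred = int(np.argmin([chi2(h, test_h) for h in train_hists]))  # step (11)
```

With real data, step (4) would use per-class K-means centres rather than random columns, and step (13) would average the correct/incorrect decisions over the whole test set.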
Compared with the prior art, the present invention has the following advantages:
1. The invention uses recent compressed sensing techniques: a small number of observations retains most of the information in the raw data, so the raw data are greatly compressed and the amount of computation is reduced, while nearly all of the raw information is preserved;
2. The invention only needs to consider the size of the feature space in the feature extraction stage and avoids the feature selection problem of classical methods, thereby overcoming the defect of earlier methods that discard important information during compression;
3. The classification process is simple, and the invention achieves a higher classification recognition rate than the traditional MR8 filter bank method and the texture classification method based on compressed sensing.
Accompanying drawing explanation
Fig. 1 is the flow diagram of the invention;
Fig. 2 shows the single-texture SAR image database used by the invention;
Fig. 3 shows the SAR image database with between-class interference used by the invention;
Fig. 4 shows the simulation results of the invention on the single-texture SAR image database;
Fig. 5 shows the simulation results of the invention on the SAR image database with between-class interference.
Embodiment
With reference to Fig. 1, the specific implementation steps of the invention are as follows:
Step 1. Randomly select r images from each texture class of the training set, 8 < r < 15, and partition every image into patches: centred on each pixel, take a square patch of side length n; each patch is called a texton and denoted I_p;
Step 2. Concatenate the columns of each patch I_p end to end into a single column vector p; subtract adjacent column vectors to obtain column-vector differences, denoted p_diff; and combine the column-vector differences of all patches obtained in step 1 into the column-vector difference matrix P_diff;
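Steps 1 and 2 (patch extraction and adjacent-vector differencing) can be sketched as follows; the patch side n, the border handling and the row-major scan order are illustrative choices, since the patent only fixes square patches of side n centred on each pixel:

```python
import numpy as np

def patch_column_differences(image, n=3):
    """Extract n-by-n patches centred on each interior pixel, flatten
    each patch column-wise into a vector p, and subtract adjacent
    vectors to build the column-vector difference matrix P_diff."""
    h, w = image.shape
    half = n // 2
    cols = []
    for i in range(half, h - half):          # border pixels are dropped
        for j in range(half, w - half):
            patch = image[i - half:i + half + 1, j - half:j + half + 1]
            cols.append(patch.flatten(order="F"))   # columns end to end
    P = np.stack(cols, axis=1)                      # n^2 x (number of patches)
    return P[:, 1:] - P[:, :-1]                     # adjacent differences
```

On a linear intensity ramp, adjacent patches differ by a constant, so every entry of the first difference column is that constant, which is an easy sanity check.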
Step 3. Compute the texton observation-vector difference matrix X from the column-vector difference matrix P_diff by the formula
X = M · P_diff,
where M is the observation matrix;
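Step 3 is a single matrix product. The patent does not pin down the distribution of the observation matrix M, so the Gaussian random matrix below (a standard compressed-sensing choice) and the feature dimension m = 27 (the best dimension in the patent's simulations) are assumptions:

```python
import numpy as np

def observe(P_diff, m=27, seed=0):
    """Project column-vector differences into an m-dimensional feature
    space: X = M @ P_diff, with M an m x n^2 random observation matrix.
    The Gaussian distribution and 1/sqrt(m) scaling are assumptions."""
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((m, P_diff.shape[0])) / np.sqrt(m)
    return M @ P_diff
```

Because the projection is linear, identical input columns map to identical observation vectors, which the test below checks.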
Step 4. Cluster the vectors in the texton observation-vector difference matrix X with the K-means clustering algorithm, the number of clusters being K_c, 5 < K_c < 40; each cluster centre is denoted d, and all centres together form the texton dictionary D of size CK_c, where C is the number of image classes and K_c is the number of clusters per class.
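Step 4's clustering can be sketched with a plain K-means loop; initialisation by evenly spaced columns is my simplification, and any standard K-means implementation (e.g. scikit-learn's `KMeans`) would serve equally well:

```python
import numpy as np

def kmeans_textons(X, K, iters=25):
    """K-means on the columns of X (m x N); the K cluster centres are
    the textons for one class. Stacking the centres of all C classes
    column-wise gives the texton dictionary D (m x C*K)."""
    idx = np.linspace(0, X.shape[1] - 1, K).round().astype(int)
    centres = X[:, idx].astype(float).copy()        # simple initialisation
    for _ in range(iters):
        # squared distances between every centre and every sample: K x N
        d2 = ((X[:, None, :] - centres[:, :, None]) ** 2).sum(axis=0)
        labels = d2.argmin(axis=0)
        for k in range(K):
            if np.any(labels == k):                 # skip empty clusters
                centres[:, k] = X[:, labels == k].mean(axis=1)
    return centres
```

On two well-separated point clouds the centres converge to the cloud means, which the test checks.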
Step 5. For every image in the training set, obtain the observation-vector difference matrix X_tr of the training-set image by the method of steps 1 to 3;
Step 6. For each vector x in X_tr, compute its Euclidean distance l_{x,d} to every texton d in the texton dictionary D by the formula
l_{x,d} = √( Σ_{i=1}^{I} (x_i − d_i)² ),
where I is the dimension of the vector x. Add 1 to the frequency of the texton d at minimum distance l_{x,d}, forming the texton histogram h of each image in the training set;
Step 7. Combine the texton histograms of all images in the training set into the set H = {h_{c,s}}, c = 1, ..., C, s = 1, ..., S, where S is the number of images of each class in the training set and C is the number of classes in the training set;
Step 8. Normalise each texton histogram in the training histogram set H according to
h_{c,s}(k) = h_{c,s}(k) / Σ_{k=1}^{CK_c} h_{c,s}(k),
where c is the class index, s is the index of the image within class c, C is the total number of image classes, and K_c is the number of clusters per class;
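Step 8's normalisation is a one-line division; the guard against an all-zero histogram is my addition:

```python
import numpy as np

def normalize_histogram(h):
    """Divide each bin by the total count so that histograms from
    images with different numbers of patches are comparable."""
    total = h.sum()
    return h / total if total > 0 else h
```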
Step 9. For each new image in the test set, repeat steps 5 to 8 to form the texton histogram of the test image, denoted h_e;
Step 10. Compute the χ² distance between the test-image texton histogram h_e and every training-image texton histogram h according to
χ²(h, h_e) = (1/2) Σ_{k=1}^{CK_c} [h(k) − h_e(k)]² / [h(k) + h_e(k)],
where C is the total number of image classes and K_c is the number of clusters per class;
Step 11. Among all training-image histogram distances χ²(h, h_e), denote the training texton histogram at minimum distance from the test-image histogram h_e as h_m; the class c_m of the image to which h_m belongs is the class assigned to the test image, and if c_m equals the test image's true class, the test image is correctly classified;
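Steps 10 and 11 together form a nearest-neighbour classifier under the χ² distance; a minimal sketch (the skip over bins empty in both histograms is my guard against 0/0, which the patent's formula leaves implicit):

```python
import numpy as np

def chi2_distance(h, h_e):
    """Chi-squared distance between two normalised texton histograms."""
    mask = (h + h_e) > 0                 # avoid 0/0 on jointly empty bins
    return 0.5 * np.sum((h[mask] - h_e[mask]) ** 2 / (h[mask] + h_e[mask]))

def classify(h_e, train_hists, train_classes):
    """The test image takes the class of the training histogram at
    minimum chi-squared distance (1-nearest-neighbour)."""
    d = [chi2_distance(h, h_e) for h in train_hists]
    return train_classes[int(np.argmin(d))]
```

For normalised histograms the χ² distance is 0 for identical histograms and reaches its maximum of 1 for disjoint ones.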
Step 12. Repeat steps 9 to 11 for all test-image texton histograms; the number of correctly classified test images is N_d;
Step 13. From N_d in step 12 and the total number of test images N_t, obtain the final correct classification rate r:
r = N_d / N_t.
The effect of the invention is further illustrated by the following simulations:
1. Simulation content: on the single-texture SAR database and the SAR database with between-class interference, comparative experiments are carried out with the proposed method (CS-diff), the existing MR8 filter bank method (MR8) and the existing texture classification method based on compressed sensing (CS), and the performance of these methods is evaluated by the final classification rate.
Fig. 2 shows the single-texture SAR image database, containing three ground-object classes: farmland, towns and mountains, with 150 images per class, 450 images in total. The texture in each image is uniform, with no interference from other classes.
Fig. 3 shows the SAR image database with between-class interference, containing the same three ground-object classes: farmland, towns and mountains, with 150 images per class, 450 images in total. The images are not single ground objects; other classes interfere.
Simulation 1. Classification results on the single-texture SAR image database
The proposed method (CS-diff), the existing MR8 filter bank method (MR8) and the texture classification method based on compressed sensing (CS) were each run 50 times, and the mean of the 50 results was taken as the final classification rate; the results are shown in Fig. 4.
The results in Fig. 4 show that the feature space dimension affects the final classification result. For both the proposed method (CS-diff) and the texture classification method based on compressed sensing (CS), the maximum is reached near dimension 27. When the dimension is too small, the global information of the texture cannot be captured; when it is too large, the local information of the texture is discarded; neither extreme gives good results. Fig. 4 also shows that the classification performance of the proposed method (CS-diff) is better than that of the compressed-sensing method (CS), and its accuracy is about 4 percentage points higher than that of the currently popular MR8 filter bank method (MR8).
Simulation 2. Classification results on the SAR image database with between-class interference
In this noisy setting, the proposed method (CS-diff), the existing MR8 filter bank method (MR8) and the texture classification method based on compressed sensing (CS) were again each run 50 times, and the mean of the 50 results was taken as the final classification rate; the results are shown in Fig. 5.
As can be seen from Fig. 5, the proposed method (CS-diff) reaches its highest classification rate of 98.33% when the feature space has 27 dimensions. As the feature space dimension increases, the classification rate gradually declines, but it remains about one percentage point higher than the existing texture classification method based on compressed sensing (CS) and clearly surpasses the currently popular MR8 filter bank method (MR8), which shows that the proposed method (CS-diff) has good robustness.

Claims (1)

1. A SAR image texture classification method based on observation vector differences, comprising the steps of:
(1) randomly selecting r images from each texture class of the training set, 8 < r < 15, and partitioning each image into patches, each patch being called a texton and denoted I_p;
(2) concatenating the columns of each patch I_p end to end into a single column vector p, subtracting adjacent column vectors to obtain column-vector differences, denoted p_diff, and combining the column-vector differences of all patches into the column-vector difference matrix P_diff;
(3) computing the texton observation-vector difference matrix X from the column-vector difference matrix P_diff by the formula
X = M · P_diff,
where M is the observation matrix;
(4) clustering the vectors in the texton observation-vector difference matrix X, each cluster centre being denoted d and all centres together forming the texton dictionary D of size CK, where C is the number of image classes and K is the number of clusters per class;
(5) for every image in the training set, obtaining the observation-vector difference matrix X_tr of the training-set image by the method of steps (1) to (3);
(6) for each vector x in X_tr, computing its Euclidean distance to every texton in the texton dictionary D and adding 1 to the frequency of the nearest texton, forming the texton histogram h of each image in the training set;
(7) combining the texton histograms of all images in the training set into the set H = {h_{c,s}}, where S is the number of images of each class in the training set;
(8) normalising each texton histogram in the training histogram set H according to
h_{c,s}(k) = h_{c,s}(k) / Σ_{k=1}^{CK} h_{c,s}(k),
where c is the class index, s is the index of the image within class c, and K is the number of clusters per class;
(9) for each new image in the test set, repeating steps (5) to (8) to form the texton histogram of the test image, denoted h_e;
(10) computing the χ² distance between the test-image texton histogram h_e and every training-image texton histogram h according to
χ²(h, h_e) = (1/2) Σ_{k=1}^{CK} [h(k) − h_e(k)]² / [h(k) + h_e(k)],
where K is the number of clusters per class;
(11) among all training-image histogram distances χ²(h, h_e), denoting the training texton histogram at minimum distance from the test-image histogram h_e as h_m, the class c_m of the image to which h_m belongs being the class assigned to the test image, and if c_m equals the test image's true class, the test image being correctly classified;
(12) repeating steps (9) to (11) for all test-image texton histograms, the number of correctly classified test images being N_d;
(13) from N_d in step (12) and the total number of test images N_t, obtaining the final correct classification rate r:
r = N_d / N_t.
CN201210344066.3A 2012-09-17 2012-09-17 SAR image texture classification method based on observation vector differences Active CN102902982B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210344066.3A CN102902982B (en) 2012-09-17 2012-09-17 SAR image texture classification method based on observation vector differences

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210344066.3A CN102902982B (en) 2012-09-17 2012-09-17 SAR image texture classification method based on observation vector differences

Publications (2)

Publication Number Publication Date
CN102902982A CN102902982A (en) 2013-01-30
CN102902982B true CN102902982B (en) 2015-09-30

Family

ID=47575203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210344066.3A Active CN102902982B (en) 2012-09-17 2012-09-17 SAR image texture classification method based on observation vector differences

Country Status (1)

Country Link
CN (1) CN102902982B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955914A (en) * 2014-02-27 2014-07-30 Xidian University SAR image segmentation method based on random projection and Signature/EMD framework
CN104361351B (en) * 2014-11-12 2017-09-26 National University of Defense Technology Synthetic aperture radar image classification method based on distance-statistics similarity
CN104715265B (en) * 2015-04-10 2018-05-08 Suzhou Wenjie Sensing Technology Co., Ltd. Radar scene classification method based on compressed sampling and an integrated coding classifier
CN106157240B (en) * 2015-04-22 2020-06-26 Nanjing University of Science and Technology Remote sensing image super-resolution method based on dictionary learning

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298710A (en) * 2011-05-13 2011-12-28 Beihang University Cat-eye effect target recognition method based on compressive sensing theory

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7646924B2 (en) * 2004-08-09 2010-01-12 David Leigh Donoho Method and apparatus for compressed sensing

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298710A (en) * 2011-05-13 2011-12-28 Beihang University Cat-eye effect target recognition method based on compressive sensing theory

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Texture Classification Using Compressed Sensing; Li Liu et al.; 2010 Canadian Conference on Computer and Robot Vision; 2010-12-31; pp. 71-78 *
Research on Image Compressed Sensing Algorithms Based on Adaptive Sampling; Zhou Ting; China Master's Theses Full-text Database; 2012-08-15 (No. 8); pp. 16-17 *
Research and Application of Texture Synthesis Algorithms; Wang Jiangtao; China Master's Theses Full-text Database; 2008-11-15 (No. 11); pp. 6-9 *

Also Published As

Publication number Publication date
CN102902982A (en) 2013-01-30

Similar Documents

Publication Publication Date Title
CN107239751B (en) High-resolution SAR image classification method based on non-subsampled contourlet full convolution network
CN104361590B (en) High-resolution remote sensing image registration method with control points distributed in adaptive manner
CN103578119B (en) Target detection method in Codebook dynamic scene based on superpixels
CN101551809B (en) Search method of SAR images classified based on Gauss hybrid model
CN103177458B (en) A kind of visible remote sensing image region of interest area detecting method based on frequency-domain analysis
CN102402685B (en) Method for segmenting three Markov field SAR image based on Gabor characteristic
CN103500329B (en) Street lamp automatic extraction method based on vehicle-mounted mobile laser scanning point cloud
CN102622607A (en) Remote sensing image classification method based on multi-feature fusion
CN104376330A (en) Polarization SAR image ship target detection method based on superpixel scattering mechanism
CN102629380B (en) Remote sensing image change detection method based on multi-group filtering and dimension reduction
CN109635789B (en) High-resolution SAR image classification method based on intensity ratio and spatial structure feature extraction
Zhang et al. Learning from GPS trajectories of floating car for CNN-based urban road extraction with high-resolution satellite imagery
CN112183432A (en) Building area extraction method and system based on medium-resolution SAR image
CN111046772A (en) Multi-temporal satellite remote sensing island shore line and development and utilization information extraction method
CN112287983B (en) Remote sensing image target extraction system and method based on deep learning
CN103020649A (en) Forest type identification method based on texture information
CN102902982B (en) SAR image texture classification method based on observation vector differences
CN112633140A (en) Multi-spectral remote sensing image urban village multi-category building semantic segmentation method and system
CN103106658A (en) Island or reef coastline rapid obtaining method
CN110991418A (en) Synthetic aperture radar target image identification method and system
CN103366365A (en) SAR image varying detecting method based on artificial immunity multi-target clustering
CN102314610B (en) Object-oriented image clustering method based on probabilistic latent semantic analysis (PLSA) model
CN104299232A (en) SAR image segmentation method based on self-adaptive window directionlet domain and improved FCM
CN105913090A (en) SAR image object classification method based on SDAE-SVM
CN109635726A (en) A kind of landslide identification method based on the symmetrical multiple dimensioned pond of depth network integration

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant