CN102175993A - Radar scene matching feature reference map preparation method based on satellite SAR (synthetic aperture radar) images - Google Patents


Info

Publication number
CN102175993A
Authority
CN
China
Prior art keywords
radar
image
regional
feature
reference map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201110031918
Other languages
Chinese (zh)
Inventor
杨卫东
孙向东
张天序
刘瑞涛
武斌
刘建华
吴洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN 201110031918 priority Critical patent/CN102175993A/en
Publication of CN102175993A publication Critical patent/CN102175993A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a method for preparing a radar scene-matching feature reference map from satellite SAR (synthetic aperture radar) images. The method comprises the following steps: extracting line features and region features from an orthorectified SAR image to obtain a binary line-feature map and a binary region-feature map, and merging the two feature maps to obtain the radar scene-matching feature reference map. The method gives good results with a small computational load and good real-time performance, places low demands on supporting intelligence data, and can be effectively applied in the field of radar matching reference map preparation.

Description

Method for preparing a radar scene-matching feature reference map from satellite SAR images
Technical field
The invention belongs to the field of radar image processing, and specifically relates to a method for preparing radar scene-matching feature reference maps from satellite synthetic aperture radar (SAR) images.
Background art
The terminal guidance systems of advanced aircraft all use images from different sensors for matching guidance. In this process the reference map is crucial: it provides the common reference, or coordinate system, that relates the aircraft's current position to its destination.
As regards data acquisition and assurance, countries such as the United States and European states have all established spaceborne SAR sensing systems that can acquire satellite radar image data of an area of interest in real time or near real time; airborne synthetic aperture radar imaging can also be used when needed, so data are easy to obtain. Years of target and background signature measurement research abroad have accumulated a large amount of raw data and built scattering-property databases for all kinds of targets under various imaging conditions, including different imaging bands, polarization modes, incidence angles and weather. According to reports, foreign radar-imaging matching terminal guidance basically adopts the same kind of imaging sensor technology, but the concrete reference map preparation techniques are rarely reported.
RADSIM, developed by the U.S. company SAIC, is a digital radar landmass simulation (DRLMS) system that can automatically simulate high-resolution radar scene images in real time. The system is built on databases of ground-object electromagnetic scattering properties and digital terrain models: it segments high-resolution visible-light images to obtain terrain-object category data, simulates the SAR and DBS radar imaging of the corresponding scene, and predicts the high-resolution radar image under different viewpoints. Besides the radar imaging model, the system contains a detailed, high-precision target-property database whose fields include ground-object scattering coefficients and the many factors that influence radar imaging, such as object type, radar wavelength, polarization mode, incidence angle, wind speed, soil moisture and vegetation height.
At present China still lacks measured electromagnetic scattering data for typical targets, so it is difficult to predict and prepare radar scene-matching reference maps from visible images in the manner of the RADSIM system. Foreign radar-imaging guidance (for example in the United States) relies on advanced earth-observation means and detailed target-property databases, and basically adopts the same kind of imaging sensor. By contrast, owing to reconnaissance limitations, China could previously use only optical images as the reference source. But radar imaging and optical imaging differ in mechanism and in acquisition conditions, and the two kinds of image differ greatly. The launch of SAR imaging satellites in recent years provides a data basis for preparing radar-seeker reference maps, but verification work on image quality and application is still needed. With little technical information on this subject reported at home or abroad, the present invention meets a pressing current demand.
Summary of the invention
The invention provides a method for preparing a radar scene-matching feature reference map from satellite SAR images, so that satellite information can directly serve guidance system platforms with timely and accurate technical support.
The method for preparing a radar scene-matching feature reference map from satellite SAR images is specifically: extract line features and region features from an orthorectified SAR image to obtain a binary line-feature map and a binary region-feature map, then merge the two feature maps to obtain the radar scene-matching feature reference map.
The line feature extraction method is as follows:
Suppose an edge with direction d exists at any point x0 of the orthorectified SAR image. The local window Ωx0 at x0 is divided along the edge direction d into three regions d1, d2, d3 (left, middle, right), with x0 located in region d2.
Nonlinear line-feature filtering is applied at x0 to obtain the filtering result ρ*; if ρ* is greater than a threshold T, a line feature exists at x0.
The filtering result ρ* is expressed as:

ρ* = ρ · μ / μx0

where μx0 is the pixel gray mean of the local region at x0,
and ρ² = min(ρ21², ρ23²),

ρ21 = n1·n2·(μ1−μ2)² / [n12·(n1·σ1² + n2·σ2²) + n1·n2·(μ1−μ2)²],

ρ23 = n2·n3·(μ2−μ3)² / [n23·(n2·σ2² + n3·σ3²) + n2·n3·(μ2−μ3)²]

μ is the gray mean of the orthorectified SAR image; n1, n2, n3 are the pixel counts of regions d1, d2, d3; μ1, μ2, μ3 are the pixel gray means of regions d1, d2, d3; σ1, σ2, σ3 are the pixel gray standard deviations of regions d1, d2, d3; n12 is the sum of the pixel counts of regions d1 and d2; n23 is the sum of the pixel counts of regions d2 and d3;
The threshold T is taken as T = mean(ρ) + k·std(ρ), where mean(ρ) and std(ρ) are the mean and standard deviation of ρ over the image and k is a preset constant.
The technical effect of the invention is as follows: the invention proposes a new method for preparing radar scene-matching feature reference maps from satellite SAR images. The method extracts stable line features and region features from the SAR image, which facilitates real-time map matching and greatly improves the precision and reliability of matching.
Description of drawings
Fig. 1 is the flow chart of the invention;
Fig. 2 illustrates the technical content of the invention;
Fig. 3 is a RadarSat SAR orthoimage of the Hanzhong area;
Fig. 4 is the DBS radar matching feature reference map prepared from the SAR orthoimage.
Embodiment
Taking the Hanzhong area as an example, a DBS radar matching feature reference map is prepared from a RadarSat satellite SAR image using the flow of the invention.
The specific implementation process of the invention is:
(1) Line feature extraction from the orthorectified SAR image.
In the line feature extraction step, the invention selects a line feature extraction algorithm based on a nonlinear cross-correlation filter operator. The algorithm first high-pass filters the SAR image, then enhances weak line features with a nonlinear cross-correlation (CR) operator, then applies constant false alarm rate (CFAR) segmentation to the gray-level feature map to obtain a binary image, and finally denoises the binary image to obtain the binary line-feature map. Specifically:
(11) High-pass filtering
The purpose of high-pass filtering in the line feature extraction step is to suppress the relatively unstable low-frequency gray variations in the SAR image and reduce their effect on the performance of the line filter operator.
(12) Line feature extraction based on the nonlinear cross-correlation (CR) operator.
The key to the line feature extraction algorithm is the cross-correlation filter operator, which extracts line features effectively. The cross-correlation matched filter method for radar-image line detection approximates the gray levels in a local window with a step-edge window function, then computes a cross-correlation measure against the original local window to detect line features. This cross-correlation line detection method works well for line feature extraction in radar images.
The cross-correlation matched filter is defined as follows:
Suppose an edge with direction d exists at any point x0 in the image. The local window Ωx0 at x0 is divided along the edge direction d into left and right parts Ωi and Ωj, i.e. Ωx0 = Ωi ∪ Ωj and Ωi ∩ Ωj = ∅, as shown in Fig. 2(a), where Ωi and Ωj are homogeneous regions.
Let f(x) be the pixel gray level of region Ωx0 in the original image. Regard f(x) as the image obtained after the step window function s(x) is polluted by noise n(x), i.e. f(x) = s(x) + n(x). The normalized cross-correlation measure between f(x) and s(x) is ρij, which is taken as the output of the cross-correlation operator. One can derive:
ρij = ni·nj·(μi−μj)² / [n·(ni·σi² + nj·σj²) + ni·nj·(μi−μj)²]    (1)

According to the definition of the image signal-to-noise ratio SNR, one can derive:

SNR = ni·nj·(μi−μj)² / [n·(ni·σi² + nj·σj²)]    (2)

From formulas (1) and (2) it also follows that:

ρij = 1 / (1 + SNR^(-1))    (3)

where σk is the standard deviation of the pixel gray levels in region Ωk of the original image, nk is the pixel count of region Ωk, k = i, j, and n = ni + nj. For convenience of description below, the SNR above is written SNRij. The size of ρij indicates how well the gray variation at x0 matches the best-approximating step edge. Formula (3) shows that both the correlation value ρij and the signal-to-noise ratio SNRij can serve as measures of approximation to a step edge. The normalized cross-correlation measure is linearly invariant, so edges of both low and high contrast can be detected. The defect of the cross-correlation matched filter, however, is that it responds only to local edges and cannot detect line features from the viewpoint of the whole image.
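The identity between formulas (1)–(3) can be checked numerically. The sketch below (function names are illustrative, not from the patent) evaluates both measures on two synthetic homogeneous regions:

```python
import numpy as np

def rho_measure(left, right):
    """Normalized cross-correlation measure of formula (1)."""
    ni, nj = left.size, right.size
    n = ni + nj
    a = ni * nj * (left.mean() - right.mean()) ** 2
    b = n * (ni * left.var() + nj * right.var())
    return a / (b + a)

def snr_measure(left, right):
    """Image signal-to-noise ratio of formula (2)."""
    ni, nj = left.size, right.size
    n = ni + nj
    a = ni * nj * (left.mean() - right.mean()) ** 2
    b = n * (ni * left.var() + nj * right.var())
    return a / b

rng = np.random.default_rng(0)
dark = rng.normal(10.0, 1.0, 50)    # homogeneous region Omega_i
bright = rng.normal(14.0, 1.0, 50)  # homogeneous region Omega_j
rho = rho_measure(dark, bright)
snr = snr_measure(dark, bright)
# formula (3): rho = 1 / (1 + 1/SNR)
print(abs(rho - 1.0 / (1.0 + 1.0 / snr)) < 1e-12)  # True
```

Because the correlation saturates toward 1 as SNR grows, both quantities rank candidate edges identically.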
Human vision has almost ideal perception and can effectively perceive complex, changing environments. The invention introduces the nonlinearity and adaptability with which the biological visual system perceives brightness change into the enhancement of the line-feature filter: an absolute response of the same strength produces different relative perceptual responses in image regions of different brightness. The relative perceptual response is stronger in dark regions and weaker in bright regions. Based on this visual nonlinearity, the cross-correlation matched filter above is improved, and a line-feature saliency measure with a visual nonlinear characteristic is proposed.
To extract line features of a certain width, the invention merges two edge-extraction templates into a linear feature template composed of three successive regions, first, second and third from left to right, with x0 located in the second region, as shown in Fig. 2(b).
Filtering with the linear feature template means performing cross-correlation matching on the region composed of d1 and d2 and, at the same time, on the region composed of d2 and d3. The filtering result with the cross-correlation measure is then:
ρ² = min(ρ21², ρ23²)

According to formula (3), the following equivalent signal-to-noise measure can also be adopted:

ρ = min(SNR21, SNR23)

Here ρij is the line-feature saliency computed with formula (1) and SNRij the line-feature saliency computed with formula (2). Considering that the visual nonlinear characteristic improves the ability to detect weak-contrast edges, the nonlinear line-feature filtering result at any point x0 of the image is:

ρ* = ρ · μ / μx0

where μ is the gray mean of the image and μx0 is the pixel gray mean of the local region at x0.
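As an illustration only (not part of the claims), the three-region filtering step can be sketched for a single (vertical) line direction. The window sizes and the μ/μx0 scaling form are assumptions based on the description above:

```python
import numpy as np

def rho_pair(a, b):
    """Cross-correlation measure of formula (1) between two regions."""
    na, nb = a.size, b.size
    num = na * nb * (a.mean() - b.mean()) ** 2
    den = (na + nb) * (na * a.var() + nb * b.var()) + num
    return num / den if den > 0 else 0.0

def nonlinear_line_filter(img, half=1, wing=2):
    """rho* for vertical lines: d1/d2/d3 are left/center/right column bands
    of a 3-row window; rho^2 = min(rho21^2, rho23^2) is scaled by mu/mu_x0
    (the visual-nonlinearity factor, stronger in dark regions)."""
    img = np.asarray(img, float)
    H, W = img.shape
    mu = img.mean()
    out = np.zeros_like(img)
    r = half + wing
    for y in range(1, H - 1):
        for x in range(r, W - r):
            d1 = img[y - 1:y + 2, x - r:x - half]          # left region
            d2 = img[y - 1:y + 2, x - half:x + half + 1]   # center region
            d3 = img[y - 1:y + 2, x + half + 1:x + r + 1]  # right region
            rho2 = min(rho_pair(d2, d1), rho_pair(d2, d3))
            mu_x0 = np.mean(np.concatenate(
                [d1.ravel(), d2.ravel(), d3.ravel()]))
            out[y, x] = rho2 * (mu / mu_x0) if mu_x0 > 0 else rho2
    return out

# a bright vertical line on a flat background responds strongly
img = np.full((9, 15), 10.0)
img[:, 7] = 20.0
resp = nonlinear_line_filter(img, half=1, wing=2)
print(resp[4, 7] > resp[4, 3])  # True
```

A full implementation would evaluate several edge directions d per pixel and keep the maximum response.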
The threshold T is generally selected as T = mean(ρ) + k·std(ρ), where mean(ρ) and std(ρ) are the mean and standard deviation of ρ over the image and k is a preset constant. When ρ is greater than the threshold T, a line feature exists at that point; ρ is therefore equivalent to a line-feature saliency measure in a given direction, and line-feature filtering and extraction are based on this.
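A minimal sketch of the threshold selection, assuming the form T = mean(ρ) + k·std(ρ) with k a preset constant (the exact formula appears only as a figure in the original and is not reproduced in the text):

```python
import numpy as np

def line_threshold(rho_map, k=1.0):
    """T = mean(rho) + k * std(rho); k is an assumed preset constant."""
    vals = np.asarray(rho_map, float).ravel()
    return vals.mean() + k * vals.std()

# one salient response among weak ones exceeds the adaptive threshold
rho_map = np.array([0.0, 0.1, 0.1, 0.9])
T = line_threshold(rho_map, k=1.0)
print((rho_map > T).sum())  # 1
```

Because T adapts to the statistics of ρ over the whole image, the same rule works across scenes of different contrast.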
(13) CFAR segmentation of the gray-level feature map gives a binary image, which is then denoised to obtain the binary line-feature map.
(2) Region feature extraction from the orthorectified SAR image. Feature extraction methods based on support vector machines, neural networks and the like can be adopted; the invention preferably uses a SAR dark-region feature extraction method based on iterative segmentation to obtain the region-feature map.
The SAR image is first high-pass filtered. A region feature extraction algorithm based on iterative segmentation then splits off the target regions of the SAR image, yielding a multi-valued region-feature map. The image is then inverted, CFAR segmentation of the gray-level feature map gives a binary image, and finally the binary image is denoised to obtain the binary region-feature map.
In the region feature extraction step, the invention adopts a SAR dark-region feature extraction method based on iterative segmentation. The algorithm comprises, in order, high-pass filtering and iterative maximum between-class variance segmentation:
(21) high-pass filtering
The purpose of high-pass filtering is to suppress the relatively unstable low-frequency gray variations in the image and emphasize the high-frequency information, such as edges, that reflects stable invariant features.
(22) the maximum between-cluster variance iteration is cut apart
Gray-level thresholding is a principal method of image segmentation and applies effectively to many types of image. Maximum between-class variance clustering is a commonly used method, but it cannot give good segmentation when the histogram peak is narrow; in particular, when the target region occupies a small proportion of the image the segmentation often fails. After high-pass filtering, the low-frequency gray component of the SAR image is suppressed and the gray histogram becomes unimodal. Dark targets such as runways and roads show fairly high gray contrast against the local background but occupy few pixels in the image, and cluster segmentation aims to split off exactly such targets. One approach, based on the dark gray character of the targets, is CFAR segmentation: set the expected proportion of target pixels in the image, accumulate the histogram statistics, and obtain the segmentation threshold; but errors in the set proportion strongly affect segmentation performance. Maximum between-class variance segmentation uses global pixel statistics, whereas the targets are characterized by locally dark gray levels; we therefore adopt local segmentation, propose an iterative maximum between-class variance segmentation method, and combine it with CFAR segmentation to compute the threshold automatically.
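The ratio-setting CFAR segmentation mentioned above can be sketched as a quantile threshold (a simplified illustration; the patent does not spell out the exact histogram procedure):

```python
import numpy as np

def cfar_dark_threshold(img, target_ratio=0.25):
    """Pick the gray level below which the preset fraction of pixels falls,
    so that roughly target_ratio of the image is labeled dark target."""
    return np.quantile(np.asarray(img, float).ravel(), target_ratio)

img = np.arange(100).reshape(10, 10)   # flat histogram, gray levels 0..99
t = cfar_dark_threshold(img, 0.25)
dark = img <= t
print(dark.mean())  # 0.25
```

The sensitivity to the chosen ratio is exactly the weakness noted in the text: a badly set target_ratio shifts the threshold directly.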
A. Maximum between-class variance cluster segmentation
Let the image contain L gray levels (0, 1, …, L−1), let the number of pixels with gray value i be Ni, and let the total pixel count be N = N0 + N1 + … + NL−1. The probability of a point having gray value i is:
Pi = Ni / N    (5)
A threshold t divides the whole image into two classes, the dark region c1 and the bright region c2. The between-class variance σb² is then a function of t:

σb²(t) = a1·a2·(u1 − u2)²    (6)

where aj is the ratio of the area of class cj to the total image area, j = 1, 2, a2 = 1 − a1, and uj is the mean of class cj:

u1 = Σ(i=0..t) i·Pi / a1,  u2 = Σ(i=t+1..L−1) i·Pi / a2
The method selects the optimal threshold t̂ that maximizes the between-class variance, that is:

σb²(t̂) = max{σb²(t)}    (7)

Let Δu = |u1 − u2|; then from formulas (6) and (7):

σb²(t̂) = max{a1(t)·a2(t)·Δu²(t)}    (8)
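Formulas (5)–(8) correspond to the standard Otsu criterion and can be sketched directly:

```python
import numpy as np

def otsu_threshold(img, L=256):
    """Maximum between-class variance threshold of formulas (5)-(8)."""
    vals = np.asarray(img).astype(int).ravel()
    p = np.bincount(vals, minlength=L).astype(float) / vals.size  # (5)
    idx = np.arange(L, dtype=float)
    best_t, best_var = 0, -1.0
    for t in range(L - 1):
        a1 = p[:t + 1].sum()
        a2 = 1.0 - a1
        if a1 <= 0.0 or a2 <= 0.0:
            continue
        u1 = (idx[:t + 1] * p[:t + 1]).sum() / a1
        u2 = (idx[t + 1:] * p[t + 1:]).sum() / a2
        var_b = a1 * a2 * (u1 - u2) ** 2                          # (6)
        if var_b > best_var:                                      # (7)/(8)
            best_t, best_var = t, var_b
    return best_t

# bimodal test image: dark class at 10, bright class at 200
img = np.concatenate([np.full(100, 10), np.full(100, 200)])
t = otsu_threshold(img)
print((img <= t).sum(), (img > t).sum())  # 100 100
```

On such a clean bimodal histogram the criterion recovers the two classes exactly, which is the favorable case described in the next paragraph.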
Segmenting the image with this clustering criterion gives fairly good results when the target occupies a suitable proportion of the image; the algorithm is simple and suits real-time processing. For images with small targets, however, it cannot separate the target from the background, and much of the background is often misclassified as target.
B. Iterative maximum between-class variance segmentation
Because the target occupies a small proportion of the image, its gray information contributes little to the whole image, and a global threshold cannot separate the target from the surrounding background. A local threshold method must be adopted, segmenting within small regions so that the target occupies a certain proportion of each region. The simplest way is to divide the image plane into equal sub-blocks and segment each with the clustering criterion; but when a target straddles different sub-blocks the block artifacts of the segmentation are severe, and when a sub-block consists almost entirely of background or of target the premise of the criterion is violated, so this approach rarely yields a satisfactory result. We instead introduce repeated segmentation of the image to obtain a spatially variable threshold model. The model resembles human visual behavior: first the whole image region is scanned, attention is fixed on sub-regions of interest (regions with relatively rich information and relatively salient features), and then local scanning recognizes the target, so that the discrimination ability adapts both to the whole image and to each sub-region. The segmentation method is: on the basis of the previous segmentation, compute the information content of each resulting region; if the information content is small, the region can be regarded as one object and is processed no further; otherwise it is segmented again, until no region can be segmented further or two successive segmentations produce no change.
In the image after the above filtering, object brightness is darker than background brightness. The original image is segmented with the clustering criterion into target and background regions. The dark regions roughly locate the targets, so further segmentation is needed only within them. After several iterations of segmentation, the target regions are finally extracted.
A key point of iterative segmentation is how to control the number of iterations. A preset target-pixel ratio threshold Ta is used: when the area ratio of the extracted target pixels in the image falls below this threshold, iteration stops, and the current threshold is the image segmentation threshold. The specific flow is as follows:
a. Set the iteration count n (recommended range −2 to 2; negative values denote inverse iteration);
b. Set the target-pixel ratio threshold Ta (recommended value 20%–30%);
c. Perform an initial segmentation of the image with the clustering criterion to obtain threshold T, and compute the area ratio of the dark target region relative to the full image; if it is less than Ta, segmentation ends; otherwise, continue segmenting with the clustering criterion in the original-image region corresponding to the target region of this pass.
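Steps a–c above can be sketched as follows (forward iteration only; the inverse-iteration case for negative n is omitted, and the Otsu helper repeats the clustering criterion of section A):

```python
import numpy as np

def otsu(vals, L=256):
    """Clustering criterion of section A (maximum between-class variance)."""
    p = np.bincount(vals, minlength=L).astype(float) / vals.size
    idx = np.arange(L, dtype=float)
    best_t, best_var = 0, -1.0
    for t in range(L - 1):
        a1 = p[:t + 1].sum()
        a2 = 1.0 - a1
        if a1 <= 0.0 or a2 <= 0.0:
            continue
        u1 = (idx[:t + 1] * p[:t + 1]).sum() / a1
        u2 = (idx[t + 1:] * p[t + 1:]).sum() / a2
        var_b = a1 * a2 * (u1 - u2) ** 2
        if var_b > best_var:
            best_t, best_var = t, var_b
    return best_t

def iterative_dark_segmentation(img, Ta=0.25, n=2):
    """Steps a-c: re-segment inside the current dark region until the
    dark-pixel area ratio over the full image drops below Ta."""
    img = np.asarray(img).astype(int)
    mask = np.ones(img.shape, bool)
    for _ in range(n):
        t = otsu(img[mask].ravel())
        dark = mask & (img <= t)
        if dark.mean() < Ta:          # step c: stop when ratio < Ta
            return dark
        mask = dark                   # otherwise segment the dark region again
    return mask

# bright background, mid-gray area, small dark target (4% of pixels)
img = np.full((10, 10), 200)
img[0:4, :] = 100
img[0, 0:4] = 10
seg = iterative_dark_segmentation(img, Ta=0.25, n=2)
print(np.array_equal(seg, img == 10))  # True
```

The first pass separates the mid-gray area from the bright background; only the second, local pass isolates the small dark target, which a single global threshold misses.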
(23) The image is then inverted, CFAR segmentation of the gray-level feature map gives a binary image, and finally the binary image is denoised to obtain the binary region-feature map.
(3) The line features and region features of the SAR image are merged to prepare the radar matching feature reference map.
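Step (3) can be sketched as a per-pixel merge of the two binary maps; a logical OR is assumed here, since the text does not state the merge operator:

```python
import numpy as np

def fuse_feature_maps(line_map, region_map):
    """Merge the binary line-feature and region-feature maps into one
    reference map (logical OR is an assumption, not stated in the text)."""
    return (np.asarray(line_map, bool) |
            np.asarray(region_map, bool)).astype(np.uint8)

line_map = np.array([[1, 0], [0, 0]])
region_map = np.array([[0, 1], [0, 1]])
print(fuse_feature_maps(line_map, region_map))  # [[1 1] [0 1]]
```

An OR keeps every feature pixel from either map, so both linear structures (roads, runways) and dark regions survive into the reference map.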

Claims (2)

1. A method for preparing a radar scene-matching feature reference map from satellite SAR images, specifically: extracting line features and region features from an orthorectified SAR image to obtain a binary line-feature map and a binary region-feature map, then merging the two feature maps to obtain the radar scene-matching feature reference map.
2. The radar scene-matching feature reference map preparation method according to claim 1, characterized in that the line feature extraction method is:
Suppose an edge with direction d exists at any point x0 of the orthorectified SAR image; the local window Ωx0 at x0 is divided along the edge direction d into three regions d1, d2, d3 (left, middle, right), with x0 located in region d2.
Nonlinear line-feature filtering is applied at x0 to obtain the filtering result ρ*; if ρ* is greater than a threshold T, a line feature exists at x0;
The filtering result ρ* is expressed as:

ρ* = ρ · μ / μx0

where μx0 is the pixel gray mean of the local region at x0, and ρ² = min(ρ21², ρ23²),

ρ21 = n1·n2·(μ1−μ2)² / [n12·(n1·σ1² + n2·σ2²) + n1·n2·(μ1−μ2)²],

ρ23 = n2·n3·(μ2−μ3)² / [n23·(n2·σ2² + n3·σ3²) + n2·n3·(μ2−μ3)²]

μ is the gray mean of the orthorectified SAR image; n1, n2, n3 are the pixel counts of regions d1, d2, d3; μ1, μ2, μ3 are the pixel gray means of regions d1, d2, d3; σ1, σ2, σ3 are the pixel gray standard deviations of regions d1, d2, d3; n12 is the sum of the pixel counts of regions d1 and d2; n23 is the sum of the pixel counts of regions d2 and d3;
The threshold T is taken as T = mean(ρ) + k·std(ρ), where mean(ρ) and std(ρ) are the mean and standard deviation of ρ over the image and k is a preset constant.
CN 201110031918 2011-01-28 2011-01-28 Radar scene matching feature reference map preparation method based on satellite SAR (synthetic aperture radar) images Pending CN102175993A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110031918 CN102175993A (en) 2011-01-28 2011-01-28 Radar scene matching feature reference map preparation method based on satellite SAR (synthetic aperture radar) images


Publications (1)

Publication Number Publication Date
CN102175993A 2011-09-07

Family

ID=44519196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110031918 Pending CN102175993A (en) 2011-01-28 2011-01-28 Radar scene matching feature reference map preparation method based on satellite SAR (synthetic aperture radar) images

Country Status (1)

Country Link
CN (1) CN102175993A (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493520A (en) * 2009-01-16 2009-07-29 北京航空航天大学 SAR image variation detecting method based on two-dimension gamma distribution


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Journal of Huazhong University of Science and Technology (Natural Science Edition), Feb. 2005, Yang Weidong et al., "Real-aperture radar scene matching and localization method based on detection and recognition", pp. 25-27, Fig. 1, vol. 33, no. 2; relevant to claims 1-2 *
Infrared and Laser Engineering, 30 June 1996, Zhang Tianxu et al., "Research on common feature extraction and matching algorithms for radar and optical scenes", p. 17, lines 19-33, vol. 25, no. 3; relevant to claims 1-2 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103942803A (en) * 2014-05-05 2014-07-23 北京理工大学 SAR (Synthetic Aperture Radar) image based automatic water area detection method
CN103942803B (en) * 2014-05-05 2017-05-17 北京理工大学 SAR (Synthetic Aperture Radar) image based automatic water area detection method
CN106910178A (en) * 2017-01-20 2017-06-30 中国人民解放军装备学院 A kind of multi-angle SAR image fusion method based on hue statistical property sort
CN106910177A (en) * 2017-01-20 2017-06-30 中国人民解放军装备学院 The multi-angle SAR image fusion method that a kind of local image index is optimized
CN106910177B (en) * 2017-01-20 2019-10-29 中国人民解放军装备学院 A kind of multi-angle SAR image fusion method that local image index optimizes
CN106910178B (en) * 2017-01-20 2020-03-06 中国人民解放军装备学院 Multi-angle SAR image fusion method based on tone statistical characteristic classification
CN110826564A (en) * 2019-11-01 2020-02-21 山东浪潮人工智能研究院有限公司 Small target semantic segmentation method and system in complex scene image
CN112817339B (en) * 2019-11-15 2022-12-23 中国北方工业有限公司 Instruction fusion algorithm for composite guided aircraft
CN112817339A (en) * 2019-11-15 2021-05-18 中国北方工业有限公司 Instruction fusion algorithm for composite guided aircraft
CN111832486A (en) * 2020-07-14 2020-10-27 华东师范大学 Large-scale intertidal vegetation classification method based on synthetic aperture radar
CN114076923A (en) * 2020-08-20 2022-02-22 西安电子科技大学 Target identification method based on time domain echo model of multilayer background and target
CN114076923B (en) * 2020-08-20 2024-05-03 西安电子科技大学 Target recognition method based on time domain echo model of multilayer background and target
CN112213264A (en) * 2020-09-22 2021-01-12 武汉工程大学 Airport reference map preparation method for scene matching guidance
CN112213264B (en) * 2020-09-22 2024-04-05 武汉工程大学 Airport reference map preparation method for scene matching guidance
CN112967308A (en) * 2021-02-26 2021-06-15 湖南南方水利水电勘测设计院有限公司 Amphibious SAR image boundary extraction method and system
CN112967308B (en) * 2021-02-26 2023-09-19 湖南南方水利水电勘测设计院有限公司 Amphibious boundary extraction method and system for dual-polarized SAR image

Similar Documents

Publication Publication Date Title
CN102175993A (en) Radar scene matching feature reference map preparation method based on satellite SAR (synthetic aperture radar) images
Ouma et al. Pothole detection on asphalt pavements from 2D-colour pothole images using fuzzy c-means clustering and morphological reconstruction
Tolt et al. A shadow detection method for remote sensing images using VHR hyperspectral and LIDAR data
Kwak et al. Automatic representation and reconstruction of DBM from LiDAR data using Recursive Minimum Bounding Rectangle
CN109657632B (en) Lane line detection and identification method
US9734398B2 (en) Method and apparatus for identifying object
Xu et al. Automatic reconstruction of building objects from multiaspect meter-resolution SAR images
KR101258668B1 (en) Korea local radar processing system
US9576373B2 (en) Geospatial imaging system providing segmentation and classification features and related methods
CN108197583A (en) The building change detecting method of optimization and image structure feature is cut based on figure
CN102073873B (en) Method for selecting SAR (spaceborne synthetic aperture radar) scene matching area on basis of SVM (support vector machine)
CN110197173B (en) Road edge detection method based on binocular vision
CN101650439A (en) Method for detecting change of remote sensing image based on difference edge and joint probability consistency
Wan et al. Automatic extraction of flood inundation areas from SAR images: A case study of Jilin, China during the 2017 flood disaster
Zhang et al. Filtering photogrammetric point clouds using standard lidar filters towards dtm generation
CN107341781A (en) Based on the SAR image correcting methods for improving the matching of phase equalization characteristic vector base map
Jiang et al. Application of multitemporal InSAR covariance and information fusion to robust road extraction
Uzar Automatic building extraction with multi-sensor data using rule-based classification
Li et al. Lane marking quality assessment for autonomous driving
Le Bris et al. Change detection in a topographic building database using submetric satellite images
Korpela et al. The performance of a local maxima method for detecting individual tree tops in aerial photographs
CN106530326B (en) Change detecting method based on image texture feature and DSM
Jin et al. Automated road pavement marking detection from high resolution aerial images based on multi-resolution image analysis and anisotropic Gaussian filtering
CN114299247A (en) Rapid detection and problem troubleshooting method for road traffic sign lines
Song et al. Real-time visibility distance evaluation based on monocular and dark channel prior

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20110907