CN102436652A - Automatic registering method of multisource remote sensing images - Google Patents
- Publication number: CN102436652A (application CN201110254958A)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Abstract
The invention discloses an automatic registration method for multi-source remote sensing images, comprising the following steps: cropping the overlapping area of a reference image and an image to be registered; applying a wavelet scale transformation to the image to be registered so that its resolution matches that of the reference image; slicing the reference image, performing feature point detection, extracting uniformly distributed feature points, and obtaining initial matched tie-point (homologous point) pairs; using the initial matched pairs to estimate the overall offsets of the image to be registered along the longitude and latitude directions; correcting the initial matched pairs according to these overall offsets, performing bidirectional cross-correlation feature-point matching on the corrected points to obtain finely matched tie points with sub-pixel positioning accuracy; and finally resampling to obtain the registered image. The method effectively copes with the wide field of view, large resolution span and other particularities of HJ (Huanjing) satellite multi-source remote sensing images, can quickly detect a large number of uniformly distributed, stable, reliable and highly accurate matched tie-point pairs, and performs differential rectification based on a dense triangulation network.
Description
Technical field
The present invention relates to a multi-source remote sensing image automatic registration method, belonging to the field of satellite image processing. It can be used for the automatic registration of HJ satellite infrared, CCD and hyperspectral imagery, both between sensors of the same type and between heterogeneous sensors.
Background art
Image registration is one of the basic tasks of image processing: the process of matching and superimposing two or more images of the same scene taken at different times, by different sensors, or from different viewing angles. It is widely used in fields such as image analysis, change detection and information fusion. If the reference image and the image to be registered are denoted I1(x1, y1) and I2(x2, y2) respectively, then image registration can be expressed mathematically as

I1(x1, y1) = g(I2(T(x2, y2)))

where g(.) is a one-dimensional grayscale transformation and T(.) is a two-dimensional spatial transformation. In most cases the two registered images are not required to have identical gray levels, and registration cannot achieve complete equality of the two images through one or more transformations; it can only approximate it to some degree.
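To make the registration model concrete, the sketch below (an illustration added to this rewrite, not part of the patent) applies a two-dimensional affine transformation as T and a linear gray map as g; the coefficient values are invented:

```python
def affine_T(x, y, a):
    """Two-dimensional spatial transformation T: (x, y) -> (x', y').
    a = (a0, a1, a2, b0, b1, b2) are affine coefficients (hypothetical)."""
    a0, a1, a2, b0, b1, b2 = a
    return a0 + a1 * x + a2 * y, b0 + b1 * x + b2 * y

def grayscale_g(v, gain=1.0, bias=0.0):
    """One-dimensional grayscale transformation g (here a linear map)."""
    return gain * v + bias

# Pure translation by (+2, -1) with a linear gray map:
coeffs = (2.0, 1.0, 0.0, -1.0, 0.0, 1.0)
print(affine_T(10, 20, coeffs))             # -> (12.0, 19.0)
print(grayscale_g(100, gain=0.5, bias=8))   # -> 58.0
```

In practice g and T are estimated from the data; the affine form of T matches the per-level transformations used later in the pyramid matching.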
Registration of HJ satellite multi-source remote sensing images has its own particularities. The HJ CCD camera and infrared camera achieve a wide field of view through, respectively, a spliced double-camera layout (two fields of view of ±30 degrees) and double-sided mirror scanning (±29 degree scan range); the hyperspectral imager has a 50 km swath and a ±30 degree side-looking range. The spatial resolution of HJ multi-source images ranges from 30 m through 100 m and 150 m to 300 m, a large span. Converting a low-resolution image to a much higher resolution by interpolation, bandwidth-increasing or similar methods fills in a large amount of spurious information, which can distort the image and degrade registration accuracy.
Typical image registration algorithms can be roughly divided into pixel-based, gray-level-based, feature-based and physical-model-based methods. At present, registration of multi-source remote sensing images mainly adopts feature-based methods, in which the selection and extraction of features undoubtedly has a great influence on registration quality and greatly affects the registration success rate. Traditional registration techniques usually require some manual intervention to reach higher precision; manual operation not only reduces system efficiency but also limits system deployment. So far, no fast automatic registration method for HJ data has been applied in an operational system.
How to extract and match feature points automatically, how to improve matching efficiency for wide-swath remote sensing images, and how to design an automatic registration that takes the imaging geometry into account so as to better handle the wide field of view, large resolution span and other particularities: these are the problems this invention focuses on solving.
Summary of the invention
The technical problem solved by the present invention is: to overcome the deficiencies of the prior art and provide a multi-source remote sensing image automatic registration method. The method effectively copes with the wide field of view, large resolution span and other particularities of HJ satellite multi-source images; it automatically and quickly detects a large number of uniformly distributed, stable and reliable high-precision matched tie-point pairs, performs differential rectification based on a dense triangulation network, and supports registration of HJ infrared, CCD and hyperspectral imagery between sensors of the same type and between heterogeneous sensors. The invention removes the need for human intervention found in traditional multi-source registration: once parameters are preset at the start of processing, the whole registration runs automatically.
The technical solution of the present invention is a multi-source remote sensing image automatic registration method comprising the following steps:
1. According to the latitude and longitude information carried by the remote sensing data, crop the overlapping region of the reference image I1 and the image to be registered I2, obtaining the cropped reference image I1* and the cropped image to be registered I2*.
2. According to the resolution information R1 and R2 of the reference image and the image to be registered, judge whether the resolutions are consistent. If consistent, go to the next step; if not, apply a scale transformation to the cropped image I2*:
(1) Perform a Mallat wavelet decomposition on I2*, obtaining one low-frequency image signal and high-frequency image signals at different resolutions.
(2) After wavelet decomposition, I2* is decomposed into a series of high-frequency signal images along the horizontal, vertical and diagonal directions plus one low-frequency image; the high-frequency images along the same direction are similar to each other. The wavelet scale transformation exploits exactly this similarity to extrapolate and interpolate the high frequencies.
(3) Reconstruct, via the inverse wavelet transform, an interpolated image WD with higher resolution than the original image; the interpolation used during high-frequency extrapolation is bicubic.
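The decomposition/reconstruction idea can be sketched in one dimension with the Haar wavelet (a deliberate simplification of the Mallat scheme used by the patent; the signal values are invented):

```python
import math

def haar_decompose(s):
    """One level of 1-D Haar wavelet decomposition:
    returns (low-frequency approximation, high-frequency detail)."""
    k = 1 / math.sqrt(2)
    low = [(s[2 * i] + s[2 * i + 1]) * k for i in range(len(s) // 2)]
    high = [(s[2 * i] - s[2 * i + 1]) * k for i in range(len(s) // 2)]
    return low, high

def haar_reconstruct(low, high):
    """Inverse transform: rebuilds the original signal from the two bands."""
    k = 1 / math.sqrt(2)
    s = []
    for a, d in zip(low, high):
        s.append((a + d) * k)
        s.append((a - d) * k)
    return s

signal = [4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 6.0, 0.0]
lo, hi = haar_decompose(signal)
print([round(v, 6) for v in haar_reconstruct(lo, hi)])
# -> [4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 6.0, 0.0]
```

The 2-D case applies the same pair of filters along rows and then columns, giving the horizontal, vertical and diagonal high-frequency images mentioned in step (2).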
3. Perform FAST feature point detection on the cropped reference image I1* and extract uniformly distributed feature points; locate the matching points on the image to be corrected according to the geographic correspondence of the level-2 images, obtaining initial matched tie-point pairs:
(1) Slice the cropped reference image I1* according to a preset size, obtaining image slices I1i (i = 1, 2, ..., M, where M is the number of slices). One thread performs FAST feature point detection on one slice I1i at a time, parallelizing the feature-detection stage.
(2) Denote the feature points detected on I1* by PixI1i (i = 1, 2, 3, ..., N), where N is the total number of feature points. The OpenGIS coordinate transformation facilities of the GDAL remote sensing image library provide services for describing coordinate systems (projections and datums) and converting between them; combining this coordinate conversion with the geolocation information, the initial matching points PixWi (i = 1, 2, 3, ..., N) are obtained on the scale-transformed image WD of the cropped image to be registered.
4. Choose a subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) of the initial matched tie-point pairs {PixI1i, PixWi} (i = 1, 2, 3, ..., N). Perform feature-point matching based on an image pyramid and the correlation coefficient (the match search range is bounded by the accuracy of the HJ systematic geometric correction); one thread matches one point at a time, achieving parallel computation. After gross-error elimination, more accurate tie-point pairs are obtained, from which the overall offsets of the image to be registered along the longitude and latitude directions are computed:
(1) Build 3-level pyramids from the cropped reference image I1* and the scale-transformed image WD of the cropped image to be registered. From bottom to top the levels are I1*, I11*, I12* and WD, WD1, WD2; each level going up is reduced to 1/3 of the one below.
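A 3-level factor-3 pyramid as described in step (1) can be sketched as follows (3x3 block averaging is an assumed reduction kernel; the patent does not specify one):

```python
def reduce3(img):
    """Shrink a 2-D image (list of rows) by 3 per axis via 3x3 block averaging."""
    h, w = len(img) // 3, len(img[0]) // 3
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            block = [img[3 * i + di][3 * j + dj]
                     for di in range(3) for dj in range(3)]
            row.append(sum(block) / 9.0)
        out.append(row)
    return out

def build_pyramid(img, levels=3):
    """Bottom level is the full image; each level above is 1/3 the size."""
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(reduce3(pyr[-1]))
    return pyr

base = [[float((r + c) % 7) for c in range(27)] for r in range(27)]
pyr = build_pyramid(base)
print([len(level) for level in pyr])  # -> [27, 9, 3]
```

Matching then proceeds coarse-to-fine: a coordinate at the bottom level maps to the top level by dividing by 9, which is why the step descriptions below use PixI1j/9 and PixI1j/3.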
(2) Level 1 (top):
(2.1) Take an initial matched pair {PixI1j, PixWj} from the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M).
(2.2) On the top-level image I12*, take a 127*127 image block centered at PixI1j/9; on WD2, slide a window over a range of 5 pixels around PixWj/9 and perform correlation matching, obtaining the correlation matrix C, its peak position (p, q) and the peak correlation coefficient Cmax. If Cmax > 0.5, go to the next step; otherwise return to step (2.1) and process the next point.
(2.3) On WD2, take a 127*127 image block centered at (p, q); on the top-level image I12*, slide a window around PixI1j/9 and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the point corresponding to PixI1j/9, consider PixI1j/9 and (p, q) a matched tie-point pair; otherwise return to step (2.1) and process the next point.
(2.4) After traversing every point of the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) on the top-level image, perform RANSAC (Random Sample Consensus) rejection, obtaining reliable matched tie-point pairs {LvI2t, LvW2t} (t = 1, 2, ..., T).
(3) Level 2:
(3.1) From the level-1 (top-level) matched tie points {LvI2t, LvW2t} (t = 1, 2, ..., T), multiply each point position by the pyramid magnification factor 3 and compute the level-2 affine transformation from the image to be registered to the reference image;
(3.2) Take an initial matched pair {PixI1j, PixWj} from the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M).
(3.3) On the level-2 image I11*, compute from the affine relation the position PixW*j on WD1 corresponding to PixI1j/3; compute the distances between PixWj/3 and PixW*j on WD1 in the x and y directions. If both are less than 15, go to the next step; otherwise return to step (3.2) and process the next point.
(3.4) On the level-2 image I11*, take a 127*127 image block centered at PixI1j/3; on WD1, slide a window over a range of 5 pixels around PixW*j and perform correlation matching, obtaining the correlation matrix C and its peak position (p, q).
(3.5) On WD1, take a 127*127 image block centered at (p, q); on the level-2 image I11*, slide a window around PixI1j/3 and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the point corresponding to PixI1j/3, consider PixI1j/3 and (p, q) a matched tie-point pair; otherwise return to step (3.2) and process the next point.
(3.6) After traversing every point of the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) on the level-2 image, obtain matched tie-point pairs {LvI1t, LvW1t} (t = 1, 2, ..., T).
(4) Level 3 (bottom):
(4.1) From the level-2 matched tie points {LvI1t, LvW1t} (t = 1, 2, ..., T), multiply each point position by the pyramid magnification factor 3 and compute the level-3 affine transformation from the image to be registered to the reference image;
(4.2) Take an initial matched pair {PixI1j, PixWj} from the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M).
(4.3) On the level-3 image I1*, compute from the affine relation the position PixW*j on WD corresponding to PixI1j; compute the distances between PixWj and PixW*j in the x and y directions. If both are less than 15, go to the next step; otherwise return to step (4.2) and process the next point.
(4.4) On the level-3 image I1*, take a 127*127 image block centered at PixI1j; on WD, slide a window over a range of 5 pixels around PixW*j and perform correlation matching, obtaining the correlation matrix C and its peak position (p, q).
(4.5) On WD, take a 127*127 image block centered at (p, q); on the level-3 image I1*, slide a window around PixI1j and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the point corresponding to PixI1j, consider PixI1j and (p, q) a matched tie-point pair; otherwise return to step (4.2) and process the next point.
(4.6) After traversing every point of the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) on the level-3 image, obtain matched tie-point pairs {LvIt, LvWt} (t = 1, 2, ..., T).
(5) Compute the overall offsets Δx, Δy along the longitude and latitude directions: by construction of the matching process, the bottom-level matched tie points LvIt (t = 1, 2, ..., T) in {LvIt, LvWt} form a subset of the points PixI1j in the initial subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M), so the initial matching points PixWt (t = 1, 2, ..., T) corresponding to LvIt can be retrieved. Compute the mean offsets Δx and Δy between LvWt and PixWt (t = 1, 2, ..., T) along the longitude and latitude directions (corresponding to the image x axis and y axis respectively).
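Step (5) reduces to averaging coordinate differences over the matched pairs; a minimal sketch (point coordinates invented for illustration):

```python
def mean_offset(refined, initial):
    """Mean (dx, dy) between refined tie points LvW_t and their
    corresponding initial matching points PixW_t."""
    n = len(refined)
    dx = sum(r[0] - i[0] for r, i in zip(refined, initial)) / n
    dy = sum(r[1] - i[1] for r, i in zip(refined, initial)) / n
    return dx, dy

lvw = [(103.0, 51.5), (207.0, 149.5), (310.0, 252.5)]   # refined matches
pixw = [(100.0, 50.0), (204.0, 148.0), (307.0, 251.0)]  # initial matches
print(mean_offset(lvw, pixw))  # -> (3.0, 1.5)
```

The resulting (Δx, Δy) is the global translation correction applied in step 5 below.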
5. Correct the initial matched tie-point pairs {PixI1i, PixWi} (i = 1, 2, 3, ..., N) using the overall offsets Δx, Δy along the longitude and latitude directions: for each matching point PixWi on the image to be registered, subtract Δx from its x coordinate and Δy from its y coordinate, obtaining the corrected matched pairs {PixI1i, PixW*i} (i = 1, 2, 3, ..., N).
6. Perform bidirectional cross-correlation feature-point matching on the corrected pairs; at the match peak, apply bivariate three-point interpolation to obtain finely matched tie points with sub-pixel positioning accuracy. At the same time, verify the fine matches with a polynomial fit and reject obvious mismatches:
(1) Take a corrected matched pair {PixI1i, PixW*i} from {PixI1i, PixW*i} (i = 1, 2, 3, ..., N).
(2) On the cropped reference image I1*, take a 127*127 image block centered at PixI1i; on the scale-transformed image WD of the cropped image to be registered, slide a window over a range of 5 pixels around PixW*i and perform correlation matching, obtaining the correlation matrix C, its peak position (p, q) and the peak correlation coefficient Cmax. If Cmax > 0.5, go to the next step; otherwise return to step (1) and process the next point.
(3) On WD, take a 127*127 image block centered at (p, q); on the cropped reference image I1*, slide a window around PixI1i and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the point corresponding to PixI1i, consider PixI1i and (p, q) a matched tie-point pair and go to the next step; otherwise return to step (1) and process the next point.
(4) Around the peak position (p, q) of the correlation matrix C, interpolate the correlation coefficients within the 3*3 neighborhood using the bivariate three-point interpolation algorithm with an interpolation interval of 1/20, obtaining the interpolated 3*3 neighborhood correlation matrix Z and the position (p*, q*) of its peak. At this point PixI1i and (p*, q*) form a high-precision matched tie-point pair.
(5) Return to step (1) and traverse every corrected matched pair in {PixI1i, PixW*i} (i = 1, 2, 3, ..., N), obtaining sub-pixel matched tie-point pairs {PixID1i, PixWDi} (i = 1, 2, 3, ..., M).
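Sub-pixel peak localization as in step (4) can be approximated per axis by fitting a parabola through the peak and its two neighbors (a common simplification of the bivariate three-point interpolation the patent names; the correlation values are invented):

```python
def parabolic_peak(c_minus, c0, c_plus):
    """Sub-pixel offset of the peak of a parabola through
    (-1, c_minus), (0, c0), (+1, c_plus); the result lies in
    (-0.5, 0.5) when c0 is the discrete maximum."""
    denom = c_minus - 2.0 * c0 + c_plus
    if denom == 0.0:
        return 0.0
    return 0.5 * (c_minus - c_plus) / denom

# One row of correlation values around a discrete peak:
print(parabolic_peak(0.80, 0.95, 0.90))  # about 0.25, toward the larger neighbor
```

Applying this along both axes of the 3*3 neighborhood refines (p, q) to a fractional position, playing the role of (p*, q*) in the step above.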
(6) Perform RANSAC rejection on the fine matched tie-point pairs {PixID1i, PixWDi} (i = 1, 2, 3, ..., M), obtaining reliable fine matched tie-point pairs {PixID1i, PixWDi} (i = 1, 2, 3, ..., D).
7. Using the large number of fine matched tie-point pairs {PixID1i, PixWDi} (i = 1, 2, 3, ..., D), build a dense Delaunay triangulation, rectify each small facet separately, and obtain a high-accuracy registered image.
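Differential rectification over the triangulation amounts to solving, for each triangle, the affine transform that maps its three reference vertices onto the corresponding vertices in the image to be registered; a minimal sketch with invented vertex coordinates:

```python
def triangle_affine(src, dst):
    """Solve the 6 affine coefficients mapping the 3 points of `src`
    onto the 3 points of `dst` (closed-form solution of the 3x3 system)."""
    (x0, y0), (x1, y1), (x2, y2) = src
    det = (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)
    coeffs = []
    for k in range(2):  # k=0 solves for x', k=1 for y'
        v0, v1, v2 = dst[0][k], dst[1][k], dst[2][k]
        a = ((v1 - v0) * (y2 - y0) - (v2 - v0) * (y1 - y0)) / det
        b = ((v2 - v0) * (x1 - x0) - (v1 - v0) * (x2 - x0)) / det
        c = v0 - a * x0 - b * y0
        coeffs.append((c, a, b))  # v' = c + a*x + b*y
    return coeffs

src = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dst = [(2.0, 1.0), (12.0, 1.5), (2.5, 11.0)]
fx, fy = triangle_affine(src, dst)
x, y = src[1]
print(fx[0] + fx[1] * x + fx[2] * y,
      fy[0] + fy[1] * x + fy[2] * y)  # -> 12.0 1.5  (maps onto dst[1])
```

Every pixel inside a triangle is then resampled through that triangle's transform, so the warp adapts locally rather than using one global model.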
Compared with the prior art, the beneficial effects of the present invention are:
(1) Based on an in-depth analysis of the characteristics of HJ satellite multi-source remote sensing images, the invention designs a fully automatic registration flow comprising scale transformation, imaging-geometry-based acquisition of initial matched tie-point pairs, elimination of the overall longitude/latitude offset, fine tie-point matching, and dense-triangulation rectification. The whole flow takes the HJ data volume and the available computing performance into account: the feature detection and matching stages are designed to run in parallel, improving registration efficiency and meeting operational requirements.
(2) The registration error of HJ multi-source images is mainly a longitude/latitude offset error. Taking the influence of the imaging geometry on registration error into account, the invention uses pyramid matching and bidirectional cross-correlation to estimate the global offsets along the longitude and latitude directions, improving the initial matching accuracy and reducing the subsequent matching workload.
(3) For fine tie-point matching, the combination of bidirectional cross-correlation matching and bivariate three-point interpolation raises the matching precision to the sub-pixel level.
(4) Exploiting the fact that the registration error of HJ systematically geo-corrected data is mainly a translation error, the method yields tie points that are uniformly distributed, stable, reliable and accurately positioned, and can detect large numbers of matched tie points. It supports registration of HJ infrared, CCD and hyperspectral imagery between same-type and heterogeneous sensors. Extensive tests show a registration accuracy better than 2 pixels over flat terrain and better than 5 pixels over mountainous terrain.
Description of drawings
Fig. 1 is the overall flow chart of the present invention;
Fig. 2 is the flow chart of locating the longitude/latitude offsets based on the pyramid and the correlation coefficient;
Fig. 3 is the flow chart of fine feature-point matching of the present invention.
Embodiment
Specific embodiments of the invention are described in detail below with reference to Fig. 1, Fig. 2 and Fig. 3:
1. According to the latitude and longitude information carried by the remote sensing data, crop the overlapping region of the reference image I1 and the image to be registered I2, obtaining the cropped reference image I1* and the cropped image to be registered I2*.
2. According to the resolution information R1 and R2 of the reference image and the image to be registered, judge whether the resolutions are consistent. If consistent, go to the next step; if not, apply a scale transformation to the cropped image I2*:
(1) The wavelet decomposition process is expressed as follows. Let the cropped image to be registered I2* denote the original image of size M × N; let l(i) denote the low-pass filter coefficients of the analysis wavelet, i = 0, 1, 2, ..., Nl - 1, where Nl is the support length of the filter L; let h(i) denote the high-pass filter coefficients of the analysis wavelet, i = 0, 1, 2, ..., Nh - 1, where Nh is the support length of the filter H. One decomposition level applies, along the rows and then along the columns of the image, the filter-and-downsample pair

A(n) = Σi l(i) s(2n + i),  D(n) = Σi h(i) s(2n + i)

where s is the 1-D signal being filtered, A its low-frequency approximation and D its high-frequency detail.
Repeating this process on the LL channel down to the chosen decomposition level realizes the wavelet decomposition of the image from high resolution to low resolution. The low-frequency signal reflects the basic waveform of the original signal (its representation at a lower resolution), while the high-frequency signals contain the detail information lost in going from high resolution to low resolution. If the output low-frequency signal is taken as a new input to the same filter bank and the wavelet transform applied again, and this is repeated several times, one finally obtains one low-frequency signal plus the high-frequency details at the different resolutions.
(2) After wavelet decomposition, the cropped image I2* is decomposed into a series of high-frequency signal images along the horizontal, vertical and diagonal directions plus one low-frequency image; the high-frequency images along the same direction are similar to each other. The wavelet scale transformation exploits exactly this similarity to extrapolate the high frequencies, then reconstructs, via the inverse wavelet transform, an interpolated image WD with higher resolution than the original.
Suppose an image IL is obtained after low-pass filtering. First decompose IL by the wavelet transform into the high-frequency subbands IH1, IV1, ID1 and the low-frequency image IL1; then obtain the high-frequency details IH, IV, ID by similarity-based extrapolation; finally recover, by the inverse wavelet transform, an image I of higher resolution than the original:

I = DWT^(-1)(IL, IH, IV, ID)

The interpolation used during high-frequency extrapolation is bicubic.
3. Perform FAST feature point detection on the cropped reference image I1* and extract uniformly distributed feature points; locate the matching points on the image to be corrected according to the geographic correspondence of the level-2 images, obtaining initial matched tie-point pairs:
(1) Slice the cropped reference image I1* according to a preset size, obtaining image slices I1i (i = 1, 2, ..., M, where M is the number of slices). One thread performs FAST feature point detection on one slice I1i at a time, parallelizing the feature-detection stage.
FAST feature detection on a slice is expressed as follows: examine the circle of pixels around a candidate point c and find the longest contiguous arc on it; if the gray values of all points on the arc are all greater than I(c) or all less than I(c), the point is judged a corner, where I(c) is the gray value of c. In the discrete case the circle is a radius-3 ring of pixels and the arc length is counted in discrete points. A point c' on the ring is considered similar to the center when

|I(c') - I(c)| ≤ t

where I(c') is the gray value of the point on the ring and t is a threshold. As long as the longest contiguous arc of points that are all brighter or all darker than the center by more than t has length at least 9, the central point is a corner.
When computing the FAST feature extraction on a slice I1i, in order to obtain sparse and uniformly distributed feature points, the slice is divided at a fine granularity of 20*20 pixels, and the corner response Rj (j = 1, 2, ..., 400) is computed inside each 20*20-pixel cell. Take the maximum corner response Rmax = max(Rj); if Rmax > 0, mark that corner as a feature point. This calculation guarantees that each 20*20-pixel region of the imaged area on the reference image contributes one feature point.
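The segment test described above can be sketched as follows (a radius-3 ring of 16 pixels and an arc length of 9, as in the common FAST-9 variant; the toy image is invented):

```python
# Offsets of the 16-pixel radius-3 ring used by the FAST segment test.
RING = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
        (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def longest_run(flags, s):
    """Longest contiguous run of value s, counting wrap-around."""
    best = run = 0
    for f in flags + flags:  # doubling handles runs crossing the seam
        run = run + 1 if f == s else 0
        best = max(best, run)
    return min(best, len(flags))

def is_fast_corner(img, x, y, t=20, arc=9):
    """True if at least `arc` contiguous ring pixels are all brighter
    than I(c)+t or all darker than I(c)-t."""
    c = img[y][x]
    flags = []
    for dx, dy in RING:
        v = img[y + dy][x + dx]
        flags.append(1 if v > c + t else (-1 if v < c - t else 0))
    return longest_run(flags, 1) >= arc or longest_run(flags, -1) >= arc

# A small bright square on a dark background: its center passes the test.
img = [[0] * 9 for _ in range(9)]
for r in range(3, 6):
    for col in range(3, 6):
        img[r][col] = 100
print(is_fast_corner(img, 4, 4))  # -> True (all 16 ring pixels are darker)
```

Keeping only the strongest response per 20*20 cell, as the paragraph above describes, then thins these corners into a uniform grid of feature points.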
(2) Denote the feature points detected on the cropped reference image I1* by PixI1i (i = 1, 2, 3, ..., N), where N is the total number of feature points. The OpenGIS coordinate transformation facilities of the GDAL remote sensing image library provide services for describing coordinate systems (projections and datums) and converting between them; combining this coordinate conversion with the geolocation information, the initial matching points PixWi (i = 1, 2, 3, ..., N) are obtained on the scale-transformed image WD of the cropped image to be registered. The detailed process is as follows:
Take a point PixI1i of PixI1i (i = 1, 2, 3, ..., N) and compute its longitude/latitude Geoi from its coordinates on the image and the geographic information of the reference image;
From Geoi and the geographic information of the scale-transformed image WD, compute the corresponding image coordinates PixWi;
Traverse every point of PixI1i (i = 1, 2, 3, ..., N), obtaining the corresponding initial matching points PixWi (i = 1, 2, 3, ..., N).
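The pixel-to-geographic-to-pixel hop can be sketched with GDAL-style six-coefficient geotransforms (north-up images assumed; the geotransform values themselves are invented samples, and real code would use GDAL's OpenGIS services as the patent states):

```python
def pixel_to_geo(gt, px, py):
    """GDAL-style geotransform gt = (originX, pixW, 0, originY, 0, -pixH)."""
    return gt[0] + px * gt[1] + py * gt[2], gt[3] + px * gt[4] + py * gt[5]

def geo_to_pixel(gt, gx, gy):
    """Inverse mapping for a north-up image (gt[2] == gt[4] == 0)."""
    return (gx - gt[0]) / gt[1], (gy - gt[3]) / gt[5]

gt_ref = (116.0, 0.0003, 0.0, 40.0, 0.0, -0.0003)   # reference image I1*
gt_wd = (116.05, 0.0003, 0.0, 39.95, 0.0, -0.0003)  # scale-transformed WD

geo = pixel_to_geo(gt_ref, 500, 400)      # feature point PixI1i -> Geo_i
print(geo)                                 # longitude/latitude of the point
print(geo_to_pixel(gt_wd, *geo))           # -> initial matching point PixW_i on WD
```

The round trip through geographic coordinates is what links a reference feature point to its initial match on WD without any image content being compared yet.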
4. Choose a subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M, M < N; in the present invention M = 50) of the initial matched tie-point pairs {PixI1i, PixWi} (i = 1, 2, 3, ..., N). Perform feature-point matching based on an image pyramid and the correlation coefficient (the match search range is bounded by the accuracy of the HJ systematic geometric correction); one thread matches one point at a time, achieving parallel computation. After gross-error elimination, more accurate tie-point pairs are obtained, from which the overall offsets of the image to be registered along the longitude and latitude directions are computed:
(1) Build 3-level pyramids from the cropped reference image I1* and the scale-transformed image WD of the cropped image to be registered. From bottom to top the levels are I1*, I11*, I12* and WD, WD1, WD2; each level going up is reduced to 1/3 of the one below.
(2) Level 1 (top):
(2.1) Take an initial matched pair {PixI1j, PixWj} from the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M).
(2.2) On the top-level image I12*, take a 127*127 image block centered at PixI1j/9; on WD2, slide a window over a range of 5 pixels around PixWj/9 and perform correlation matching, obtaining the correlation matrix C, its peak position (p, q) and the peak correlation coefficient Cmax. If Cmax > 0.5, go to the next step; otherwise return to step (2.1) and process the next point.
Correlation-coefficient matching is expressed as follows: it is the most frequently used algorithm among the gray-level correlation matching algorithms. After the mean is subtracted and the values normalized, the correlation coefficient becomes insensitive to image contrast and brightness, making the normalized cross-correlation coefficient more reliable and more adaptable. The normalized cross-correlation coefficient is defined as

C(A, B) = Σi Σj (A(i, j) - Ā)(B(i, j) - B̄) / sqrt(Σi Σj (A(i, j) - Ā)² · Σi Σj (B(i, j) - B̄)²)

where A and B are the reference sub-image and the candidate sub-image, each of size N × N, and Ā, B̄ are the means of the two-dimensional matrices A and B respectively. By the definition of normalized cross-correlation, the position where the correlation coefficient C(A, B) attains its maximum is the matched position of the two images.
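The definition above can be computed directly; a minimal sketch on tiny invented patches, showing the insensitivity to a brightness shift:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-size 2-D patches
    (lists of rows); returns a value in [-1, 1]."""
    flat_a = [v for row in a for v in row]
    flat_b = [v for row in b for v in row]
    ma = sum(flat_a) / len(flat_a)
    mb = sum(flat_b) / len(flat_b)
    num = sum((x - ma) * (y - mb) for x, y in zip(flat_a, flat_b))
    da = sum((x - ma) ** 2 for x in flat_a)
    db = sum((y - mb) ** 2 for y in flat_b)
    return num / (da * db) ** 0.5

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[11.0, 12.0], [13.0, 14.0]]  # same pattern under a brightness shift
c_ = [[2.0, 5.0], [9.0, 4.0]]     # a different pattern
print(ncc(a, b))   # -> 1.0: the brightness shift does not affect NCC
print(ncc(a, c_))  # a value below 1
```

In the matching steps, this score is evaluated at every window position to form the correlation matrix C whose peak gives (p, q).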
(2.3) On WD2, take a 127*127 image block centered at (p, q); on the top-level image I12*, slide a window around PixI1j/9 and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the point corresponding to PixI1j/9, consider PixI1j/9 and (p, q) a matched tie-point pair; otherwise return to step (2.1) and process the next point.
(2.4) After traversing every point of the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) on the top-level image, perform RANSAC (Random Sample Consensus) rejection, obtaining reliable matched tie-point pairs {LvI2t, LvW2t} (t = 1, 2, ..., T).
RANSAC rejection is expressed as follows: iteratively sample from the input data {PixI1j, PixWj} (j = 1, 2, 3, ..., M) a so-called minimal point set (3 matched tie-point pairs), and use each sampled minimal set to estimate the parameters to be determined (the affine transformation parameters). Then, according to a threshold, decide which of the input data are consistent with these parameters (the inliers) and which are not (the outliers). After a certain number of iterations, the parameter estimate with the highest inlier ratio among the input data is taken as the solution screened out by RANSAC. This solution serves as the initial value for further optimization by other methods, yielding the final parameter estimate. The number of sampling iterations N is computed as

N = log(1 - s) / log(1 - (1 - σ)³)

where σ is the expected error (outlier) rate of the raw data and s is the required probability that at least one minimal subset consists entirely of inliers.
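The consensus loop can be sketched with a deliberately simplified toy model: a pure translation, whose minimal set is a single pair, tried exhaustively instead of sampled at random (the patent samples 3-pair minimal sets for an affine model); the point pairs and the outlier are invented:

```python
def ransac_translation(pairs, thresh=2.0):
    """Estimate (dx, dy) between point pairs while tolerating outliers:
    try each pair as a hypothesis, count its inliers, keep the best."""
    best, best_inliers = None, []
    for (rx, ry), (wx, wy) in pairs:
        dx, dy = wx - rx, wy - ry
        inliers = [p for p in pairs
                   if abs(p[1][0] - p[0][0] - dx) < thresh
                   and abs(p[1][1] - p[0][1] - dy) < thresh]
        if len(inliers) > len(best_inliers):
            best, best_inliers = (dx, dy), inliers
    return best, best_inliers

pairs = [((x, y), (x + 5.0, y + 3.0))
         for x, y in [(0, 0), (10, 4), (20, 9), (7, 31)]]
pairs.append(((50, 50), (90.0, 12.0)))  # one gross mismatch
model, inliers = ransac_translation(pairs)
print(model, len(inliers))  # -> (5.0, 3.0) 4: the mismatch is rejected
```

The sampling-count formula above answers how many random minimal sets would be needed in the non-exhaustive case to hit an all-inlier set with probability s.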
(3) Level 2:
(3.1) From the level-1 (top-level) matched tie points {LvI2t, LvW2t} (t = 1, 2, ..., T), multiply each point position by the pyramid magnification factor 3 and compute the level-2 affine transformation from the image to be registered to the reference image;
(3.2) Take an initial matched pair {PixI1j, PixWj} from the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M).
(3.3) On the level-2 image I11*, compute from the affine relation the position PixW*j on WD1 corresponding to PixI1j/3; compute the distances between PixWj/3 and PixW*j on WD1 in the x and y directions. If both are less than 15, go to the next step; otherwise return to step (3.2) and process the next point.
(3.4) On the level-2 image I11*, take a 127*127 image block centered at PixI1j/3; on WD1, slide a window over a range of 5 pixels around PixW*j and perform correlation matching, obtaining the correlation matrix C and its peak position (p, q).
(3.5) On WD1, take a 127*127 image block centered at (p, q); on the level-2 image I11*, slide a window around PixI1j/3 and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the point corresponding to PixI1j/3, consider PixI1j/3 and (p, q) a matched tie-point pair; otherwise return to step (3.2) and process the next point.
(3.6) After traversing every point of the subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) on the level-2 image, obtain matched tie-point pairs {LvI1t, LvW1t} (t = 1, 2, ..., T).
(4) Layer 3 (bottom):
(4.1) From the layer-2 matches {LvI1t, LvW1t} (t = 1, 2, ..., T), multiply each point position by the pyramid magnification factor 3 and compute the layer-3 affine transformation from the image to be registered to the reference image;
(4.2) Extract the next pair {PixI1j, PixWj} from the initial matching homologous-point subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M).
(4.3) On the layer-3 image I1*, compute from the affine relation the position PixW*j on WD corresponding to PixI1j; compute the distance between PixWj and PixW*j on WD in the x and y directions. If both are less than 15, proceed to the next step; otherwise, return to step (4.2) and process the next point.
(4.4) On the layer-3 image I1*, take the 127*127 image block centred at PixI1j; on WD, slide a window over the range of 5 pixels around PixW*j and perform correlation matching, obtaining the correlation matrix C and its peak position (p, q).
(4.5) On WD, take the 127*127 image block centred at (p, q); on the layer-3 image I1*, slide a window around PixI1j and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the position of PixI1j, then PixI1j and (p, q) are taken as one pair of matched homologous points; otherwise, return to step (4.2) and process the next point.
(4.6) Traverse every point of the matched homologous-point subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M) on the layer-3 image, obtaining matched homologous point pairs {LvIt, LvWt} (t = 1, 2, ..., T).
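The coarse-to-fine scheme of layers 1-3 scales point coordinates by the pyramid factor 3 each time a match is propagated down one level. A minimal sketch, in which plain decimation stands in for the actual pyramid construction (a production implementation would low-pass filter before subsampling):

```python
import numpy as np

def build_pyramid(img, levels=3, factor=3):
    """Build a 3-level pyramid with magnification factor 3 by simple
    decimation. pyr[0] is the full-resolution bottom layer, pyr[-1]
    the top layer where matching starts."""
    pyr = [np.asarray(img)]
    for _ in range(levels - 1):
        pyr.append(pyr[-1][::factor, ::factor])
    return pyr

def propagate_down(points, factor=3):
    """Map matched point coordinates from one layer to the next finer
    layer by scaling with the pyramid factor, as in steps (3.1) and (4.1)."""
    return [(y * factor, x * factor) for (y, x) in points]
```

The scaled positions only seed the finer level; each level then re-runs the bidirectional correlation matching to refine them.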
(5) Compute the global offsets Δx, Δy along the longitude and latitude directions. By construction of the matching process, the points LvIt (t = 1, 2, ..., T) of the bottom-layer matches {LvIt, LvWt} form a subset of the points PixI1j of the initial matching subset {PixI1j, PixWj} (j = 1, 2, 3, ..., M); hence for each LvIt the corresponding initial matching point PixWt (t = 1, 2, ..., T) can be obtained. Compute the mean offsets Δx and Δy between LvWt and PixWt (t = 1, 2, ..., T) along the longitude and latitude directions (corresponding to the image x axis and y axis, respectively).
5. Correct the initial matching homologous point pairs {PixI1i, PixWi} (i = 1, 2, 3, ..., N) by the global longitude-latitude offsets Δx, Δy: for each matching point PixWi (i = 1, 2, 3, ..., N) on the image to be registered, subtract Δx from its x coordinate and Δy from its y coordinate, obtaining the corrected matched homologous point pairs {PixI1i, PixW*i} (i = 1, 2, 3, ..., N).
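Steps (5) and 5. amount to a mean-shift estimate followed by a translation correction. A minimal sketch with (x, y) point tuples; the sign convention here assumes Δ is the mean of the initial minus the matched positions, so that subtracting it moves the initial points onto the matched ones:

```python
import numpy as np

def global_offset(initial_pts, matched_pts):
    """Mean shift (dx, dy) of the geolocation-predicted initial points
    relative to the pyramid-matched points, per image axis."""
    d = np.asarray(initial_pts, float) - np.asarray(matched_pts, float)
    return float(d[:, 0].mean()), float(d[:, 1].mean())

def correct_points(initial_pts, dx, dy):
    """Subtract the global offset from every initial matching point on
    the image to be registered, as in step 5."""
    return [(x - dx, y - dy) for (x, y) in initial_pts]
```

Averaging over all reliable bottom-layer matches makes the translation estimate robust to the residual error of any single point pair.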
6. Perform bidirectional cross-correlation feature-point matching; at the matching peak, apply bivariate three-point interpolation to obtain finely matched homologous points with sub-pixel positioning accuracy. Meanwhile, verify the finely matched homologous points by polynomial fitting and reject gross mismatches:
(1) Extract the next pair {PixI1i, PixW*i} from the corrected matched homologous point pairs {PixI1i, PixW*i} (i = 1, 2, 3, ..., N).
(2) On the cropped reference image I1*, take the 127*127 image block centred at PixI1i; on the scale-transformed image WD of the cropped image to be registered, slide a window over the range of 5 pixels around PixW*i and perform correlation matching, obtaining the correlation matrix C, its peak position (p, q) and the peak correlation coefficient Cmax. If Cmax > 0.5, proceed to the next step; otherwise, return to step (1) and process the next point.
(3) On WD, take the 127*127 image block centred at (p, q); on the cropped reference image I1*, slide a window around PixI1i and perform correlation matching, obtaining the correlation matrix and its peak position (m, n). If (m, n) coincides with the position of PixI1i, then PixI1i and (p, q) are taken as one pair of matched homologous points and the next step follows; otherwise, return to step (1) and process the next point.
(4) Around the peak position (p, q) of the correlation matrix C, interpolate the correlation coefficients within the 3*3 neighbourhood using the bivariate three-point interpolation algorithm with an interpolation interval of 1/20, obtaining the interpolated 3*3-neighbourhood correlation matrix Z and its peak position (p*, q*). At this point PixI1i and (p*, q*) form one pair of high-precision matched homologous points.
(5) Return to step (1) and traverse every homologous point of the corrected matched pairs {PixI1i, PixW*i} (i = 1, 2, 3, ..., N), obtaining sub-pixel matched homologous point pairs {PixID1i, PixWDi} (i = 1, 2, ..., M).
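The bivariate three-point interpolation of the 3*3 correlation neighbourhood at 1/20 steps can be sketched as follows, reading "three-point" as three-node Lagrange interpolation per axis (our interpretation of the machine-translated term):

```python
import numpy as np

def subpixel_peak(C3, step=1 / 20):
    """Interpolate a 3x3 correlation neighbourhood with bivariate
    three-point (Lagrange) interpolation on a `step` grid and return
    the sub-pixel peak offset (du, dv) relative to the integer peak."""
    def lag(t):                      # Lagrange basis at nodes -1, 0, +1
        return np.array([t * (t - 1) / 2, 1 - t * t, t * (t + 1) / 2])
    grid = np.arange(-1.0, 1.0 + step / 2, step)
    best, du, dv = -np.inf, 0.0, 0.0
    for u in grid:
        wu = lag(u)
        for v in grid:
            z = float(wu @ C3 @ lag(v))   # interpolated correlation Z(u, v)
            if z > best:
                best, du, dv = z, u, v
    return du, dv
```

Because the interpolant reproduces any quadratic surface exactly, a correlation peak lying between pixel centres is recovered to within the 1/20 grid step.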
(6) Apply RANSAC rejection to the finely matched homologous point pairs {PixID1i, PixWDi} (i = 1, 2, ..., M), obtaining reliable finely matched homologous point pairs {PixID1i, PixWDi} (i = 1, 2, ..., D).
7. Using the large set of finely matched homologous point pairs {PixID1i, PixWDi} (i = 1, 2, ..., D) thus obtained, construct a dense Delaunay triangulation network and rectify each facet unit separately, obtaining the image registered with high accuracy.
Details not elaborated in the present invention belong to techniques well known to those skilled in the art.
Claims (5)
1. An automatic registration method for multi-source remote sensing images, characterized in that the method comprises the following steps:
A. cropping the longitude-latitude overlap region of a reference image and an image to be registered; if the resolution of the cropped reference image is inconsistent with that of the cropped image to be registered, applying a Mallat wavelet scale transformation to the cropped image to be registered;
B. performing FAST feature-point detection on the cropped reference image and extracting uniformly distributed feature points; locating, through the geographic correspondence between the cropped reference image and the cropped image to be registered, the matching points of the feature points on the cropped image to be registered, thereby obtaining initial matching homologous point pairs;
C. extracting a subset of the initial matching homologous point pairs and performing, on this subset, feature-point matching based on an image pyramid and correlation coefficients; rejecting gross error points and statistically computing the global offsets along the longitude and latitude directions;
D. correcting the initial matching homologous point pairs by the global longitude-latitude offsets to remove the translation error; performing bidirectional cross-correlation matching on each matched homologous point with bivariate three-point interpolation at the peak, obtaining matched homologous point pairs with sub-pixel matching accuracy; and using these sub-pixel matched homologous point pairs to obtain the registered image.
2. The automatic registration method for multi-source remote sensing images according to claim 1, characterized in that the Mallat wavelet scale transformation applied to the cropped image to be registered in step A is performed as follows:
A1. applying a Mallat wavelet decomposition to the original cropped image to be registered, obtaining one low-frequency image signal and high-frequency image signals at different resolutions;
A2. performing high-frequency extrapolation interpolation according to the similarity between the high-frequency image signals of the same direction;
A3. reconstructing, by inverse wavelet transform from the interpolated low-frequency image signal and the high-frequency image signals at different resolutions, a cropped image to be registered with a resolution higher than that of the original cropped image to be registered.
3. The automatic registration method for multi-source remote sensing images according to claim 1, characterized in that, in the step of obtaining initial matching homologous point pairs in step B:
the OpenGIS coordinate transformation methods of the GDAL remote sensing image library and its services for mutual conversion between coordinate systems are used to determine, via geolocation information, the corresponding initial matching homologous points on the scale-transformed image of the cropped image to be registered.
4. The automatic registration method for multi-source remote sensing images according to claim 1, characterized in that, in the step of obtaining the global longitude-latitude offsets in step C:
a 3-level image pyramid is built for the cropped reference image and the cropped image to be registered; after transformation, the subset of initial matching homologous points is first matched by bidirectional cross-correlation at the top pyramid layer, and the matching relationship is then mapped level by level down to the lower pyramid layers.
5. The automatic registration method for multi-source remote sensing images according to claim 1, characterized in that the step of obtaining the matched homologous point pairs with sub-pixel matching accuracy in step D is:
D1. after correcting the initial matching homologous point pairs by the global longitude-latitude offsets to remove the translation error, performing correlation matching on each matched homologous point;
D2. applying bivariate three-point interpolation at the correlation-coefficient peak to achieve sub-pixel positioning accuracy; after all corrected matched homologous points have been traversed, applying RANSAC rejection to the finely matched homologous point pairs to remove gross mismatches, obtaining the final matched homologous point pairs.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110254958.XA CN102436652B (en) | 2011-08-31 | 2011-08-31 | Automatic registering method of multisource remote sensing images |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110254958.XA CN102436652B (en) | 2011-08-31 | 2011-08-31 | Automatic registering method of multisource remote sensing images |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102436652A true CN102436652A (en) | 2012-05-02 |
CN102436652B CN102436652B (en) | 2014-08-27 |
Family
ID=45984695
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201110254958.XA Active CN102436652B (en) | 2011-08-31 | 2011-08-31 | Automatic registering method of multisource remote sensing images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102436652B (en) |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102750668A (en) * | 2012-07-04 | 2012-10-24 | 西南交通大学 | Digital image triple interpolation amplification method by combining local direction features |
CN103115614A (en) * | 2013-01-21 | 2013-05-22 | 武汉大学 | Associated parallel matching method for multi-source multi-track long-strip satellite remote sensing images |
CN103337052A (en) * | 2013-04-17 | 2013-10-02 | 国家测绘地理信息局卫星测绘应用中心 | Automatic geometric correction method for wide remote-sensing images |
CN103679675A (en) * | 2013-11-29 | 2014-03-26 | 航天恒星科技有限公司 | Remote sensing image fusion method oriented to water quality quantitative remote sensing application |
CN103905746A (en) * | 2012-12-28 | 2014-07-02 | 清华大学 | Method and device for localization and superposition of sub-pixel-level image offset and video device |
CN104021556A (en) * | 2014-06-13 | 2014-09-03 | 西南交通大学 | Heterological remote-sensing image registration method based on geometric structure similarity |
CN104809724A (en) * | 2015-04-21 | 2015-07-29 | 电子科技大学 | Automatic precise registration method for multiband remote sensing images |
CN104966283A (en) * | 2015-05-22 | 2015-10-07 | 北京邮电大学 | Imaging layered registering method |
CN105136164A (en) * | 2015-08-13 | 2015-12-09 | 航天恒星科技有限公司 | Staring imaging simulation and quality evaluation method and device taking regard of satellite comprehensive motion |
CN105160624A (en) * | 2015-08-20 | 2015-12-16 | 中电科海洋信息技术研究院有限公司 | Geographic information picture automatic registration method and apparatus |
CN105205812A (en) * | 2015-09-01 | 2015-12-30 | 哈尔滨工业大学 | Multiframe image reconstruction method based on microsatellite constellation |
CN105913435A (en) * | 2016-04-13 | 2016-08-31 | 西安航天天绘数据技术有限公司 | Multidimensional remote sensing image matching method and multidirectional remote sensing image matching system suitable for large area |
CN106056625A (en) * | 2016-05-25 | 2016-10-26 | 中国民航大学 | Airborne infrared moving target detection method based on geographical homologous point registration |
CN106228593A (en) * | 2015-05-28 | 2016-12-14 | 长沙维纳斯克信息技术有限公司 | A kind of image dense Stereo Matching method |
CN103823889B (en) * | 2014-03-10 | 2017-02-01 | 北京大学 | L1 norm total geometrical consistency check-based wrong matching detection method |
CN106447613A (en) * | 2016-09-27 | 2017-02-22 | 西安蒜泥电子科技有限责任公司 | Image local registration based method and system for removing blur shadow of panorama |
CN107016695A (en) * | 2017-04-13 | 2017-08-04 | 首都师范大学 | A kind of sub-pixel Image registration method and system |
CN107609183A (en) * | 2017-09-29 | 2018-01-19 | 浙江科澜信息技术有限公司 | Original coordinates data are converted to the method, apparatus and equipment of spherical coordinate data |
CN107705244A (en) * | 2017-09-11 | 2018-02-16 | 中国国土资源航空物探遥感中心 | A kind of edge fit correcting method suitable for big several remote sensing images of region |
CN107945216A (en) * | 2017-11-10 | 2018-04-20 | 西安电子科技大学 | More images joint method for registering based on least-squares estimation |
CN108629798A (en) * | 2018-04-28 | 2018-10-09 | 安徽大学 | Rapid Image Registration method based on GPU |
CN109144095A (en) * | 2018-04-03 | 2019-01-04 | 奥瞳系统科技有限公司 | The obstacle avoidance system based on embedded stereoscopic vision for unmanned vehicle |
CN109493298A (en) * | 2018-11-13 | 2019-03-19 | 中国国土资源航空物探遥感中心 | A kind of airborne sweep type high-spectral data fast geometric bearing calibration |
CN110009670A (en) * | 2019-03-28 | 2019-07-12 | 上海交通大学 | The heterologous method for registering images described based on FAST feature extraction and PIIFD feature |
CN110163896A (en) * | 2019-03-29 | 2019-08-23 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | SAR image matching process |
CN110310308A (en) * | 2019-06-18 | 2019-10-08 | 中南林业科技大学 | A kind of method for registering images based on subgraph |
CN110543886A (en) * | 2018-05-28 | 2019-12-06 | 天津理工大学 | Pyramid imaging matching algorithm based on lunar CCD image |
CN111062976A (en) * | 2019-12-25 | 2020-04-24 | 中国科学院长春光学精密机械与物理研究所 | FMT-based low-orbit satellite solar telescope remote sensing image registration method |
CN111242006A (en) * | 2020-01-10 | 2020-06-05 | 长江水利委员会长江科学院 | Method for realizing geographic WPS (Wireless personal storage System) service based on Mask R-CNN (remote sensing image ground object detection) |
CN113223065A (en) * | 2021-03-30 | 2021-08-06 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Automatic matching method for SAR satellite image and optical image |
CN113344937A (en) * | 2021-04-25 | 2021-09-03 | 中国科学院空天信息创新研究院 | Method for automatically identifying and removing black edge of remote sensing image |
CN113781529A (en) * | 2021-09-24 | 2021-12-10 | 中国科学院精密测量科学与技术创新研究院 | Wide-area SAR complex image sequence rapid registration method adopting twice blocking strategy |
CN115265424A (en) * | 2022-09-27 | 2022-11-01 | 威海晶合数字矿山技术有限公司 | Geological disaster side slope displacement monitoring method based on synthetic aperture radar technology |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6681057B1 (en) * | 2000-02-22 | 2004-01-20 | National Instruments Corporation | Image registration system and method implementing PID control techniques |
CN102117483A (en) * | 2009-12-31 | 2011-07-06 | 核工业北京地质研究院 | Fusion method of multispectral remote sensing images with different spatial resolutions |
Non-Patent Citations (4)
Title |
---|
屈有山 (Qu Youshan): "Research on Improving the Spatial Resolution of Optical Remote Sensing Images Based on Wavelet Bicubic Interpolation", Acta Photonica Sinica, vol. 33, no. 5, 31 May 2004 (2004-05-31), pages 601-604 *
张继贤 (Zhang Jixian): "Research on Methods of High-Precision Automatic Registration of Multi-Source Remote Sensing Imagery", Journal of Remote Sensing, vol. 9, no. 1, 31 January 2005 (2005-01-31), pages 73-77 *
成英燕 (Cheng Yingyan): "Research on Deformation Monitoring Using InSAR Technology", Science of Surveying and Mapping, vol. 31, no. 3, 31 May 2006 (2006-05-31), pages 56-59 *
蔡志凌 (Cai Zhiling): "Water Vapor Content Retrieved by MODIS Near-Infrared Remote Sensing and Regression Analysis of Water Vapor Content Against Meteorological Data", Sciencepaper Online, 12 May 2008 (2008-05-12), pages 1-6 *
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102750668B (en) * | 2012-07-04 | 2014-07-02 | 西南交通大学 | Digital image triple interpolation amplification method by combining local direction features |
CN102750668A (en) * | 2012-07-04 | 2012-10-24 | 西南交通大学 | Digital image triple interpolation amplification method by combining local direction features |
CN103905746A (en) * | 2012-12-28 | 2014-07-02 | 清华大学 | Method and device for localization and superposition of sub-pixel-level image offset and video device |
CN103905746B (en) * | 2012-12-28 | 2017-02-22 | 清华大学 | Method and device for localization and superposition of sub-pixel-level image offset and video device |
CN103115614A (en) * | 2013-01-21 | 2013-05-22 | 武汉大学 | Associated parallel matching method for multi-source multi-track long-strip satellite remote sensing images |
CN103115614B (en) * | 2013-01-21 | 2014-12-17 | 武汉大学 | Associated parallel matching method for multi-source multi-track long-strip satellite remote sensing images |
CN103337052A (en) * | 2013-04-17 | 2013-10-02 | 国家测绘地理信息局卫星测绘应用中心 | Automatic geometric correction method for wide remote-sensing images |
CN103337052B (en) * | 2013-04-17 | 2016-07-13 | 国家测绘地理信息局卫星测绘应用中心 | Automatic geometric correcting method towards wide cut remote sensing image |
CN103679675A (en) * | 2013-11-29 | 2014-03-26 | 航天恒星科技有限公司 | Remote sensing image fusion method oriented to water quality quantitative remote sensing application |
CN103679675B (en) * | 2013-11-29 | 2017-01-11 | 航天恒星科技有限公司 | Remote sensing image fusion method oriented to water quality quantitative remote sensing application |
CN103823889B (en) * | 2014-03-10 | 2017-02-01 | 北京大学 | L1 norm total geometrical consistency check-based wrong matching detection method |
CN104021556A (en) * | 2014-06-13 | 2014-09-03 | 西南交通大学 | Heterological remote-sensing image registration method based on geometric structure similarity |
CN104809724A (en) * | 2015-04-21 | 2015-07-29 | 电子科技大学 | Automatic precise registration method for multiband remote sensing images |
CN104966283A (en) * | 2015-05-22 | 2015-10-07 | 北京邮电大学 | Imaging layered registering method |
CN106228593B (en) * | 2015-05-28 | 2019-05-17 | 长沙维纳斯克信息技术有限公司 | A kind of image dense Stereo Matching method |
CN106228593A (en) * | 2015-05-28 | 2016-12-14 | 长沙维纳斯克信息技术有限公司 | A kind of image dense Stereo Matching method |
CN105136164B (en) * | 2015-08-13 | 2019-04-05 | 航天恒星科技有限公司 | Consider the comprehensive staring imaging emulation moved of satellite and method for evaluating quality and device |
CN105136164A (en) * | 2015-08-13 | 2015-12-09 | 航天恒星科技有限公司 | Staring imaging simulation and quality evaluation method and device taking regard of satellite comprehensive motion |
CN105160624A (en) * | 2015-08-20 | 2015-12-16 | 中电科海洋信息技术研究院有限公司 | Geographic information picture automatic registration method and apparatus |
CN105205812A (en) * | 2015-09-01 | 2015-12-30 | 哈尔滨工业大学 | Multiframe image reconstruction method based on microsatellite constellation |
CN105913435A (en) * | 2016-04-13 | 2016-08-31 | 西安航天天绘数据技术有限公司 | Multidimensional remote sensing image matching method and multidirectional remote sensing image matching system suitable for large area |
CN105913435B (en) * | 2016-04-13 | 2019-05-28 | 西安航天天绘数据技术有限公司 | A kind of multiscale morphology image matching method and system suitable for big region |
CN106056625B (en) * | 2016-05-25 | 2018-11-27 | 中国民航大学 | A kind of Airborne IR moving target detecting method based on geographical same place registration |
CN106056625A (en) * | 2016-05-25 | 2016-10-26 | 中国民航大学 | Airborne infrared moving target detection method based on geographical homologous point registration |
CN106447613A (en) * | 2016-09-27 | 2017-02-22 | 西安蒜泥电子科技有限责任公司 | Image local registration based method and system for removing blur shadow of panorama |
CN107016695A (en) * | 2017-04-13 | 2017-08-04 | 首都师范大学 | A kind of sub-pixel Image registration method and system |
CN107016695B (en) * | 2017-04-13 | 2019-09-17 | 首都师范大学 | A kind of sub-pixel Image registration method and system |
CN107705244A (en) * | 2017-09-11 | 2018-02-16 | 中国国土资源航空物探遥感中心 | A kind of edge fit correcting method suitable for big several remote sensing images of region |
CN107609183A (en) * | 2017-09-29 | 2018-01-19 | 浙江科澜信息技术有限公司 | Original coordinates data are converted to the method, apparatus and equipment of spherical coordinate data |
CN107945216A (en) * | 2017-11-10 | 2018-04-20 | 西安电子科技大学 | More images joint method for registering based on least-squares estimation |
CN107945216B (en) * | 2017-11-10 | 2019-10-11 | 西安电子科技大学 | More images based on least-squares estimation combine method for registering |
CN109144095A (en) * | 2018-04-03 | 2019-01-04 | 奥瞳系统科技有限公司 | The obstacle avoidance system based on embedded stereoscopic vision for unmanned vehicle |
CN108629798A (en) * | 2018-04-28 | 2018-10-09 | 安徽大学 | Rapid Image Registration method based on GPU |
CN110543886A (en) * | 2018-05-28 | 2019-12-06 | 天津理工大学 | Pyramid imaging matching algorithm based on lunar CCD image |
CN109493298A (en) * | 2018-11-13 | 2019-03-19 | 中国国土资源航空物探遥感中心 | A kind of airborne sweep type high-spectral data fast geometric bearing calibration |
CN110009670A (en) * | 2019-03-28 | 2019-07-12 | 上海交通大学 | The heterologous method for registering images described based on FAST feature extraction and PIIFD feature |
CN110163896B (en) * | 2019-03-29 | 2023-02-03 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | SAR image matching method |
CN110163896A (en) * | 2019-03-29 | 2019-08-23 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | SAR image matching process |
CN110310308A (en) * | 2019-06-18 | 2019-10-08 | 中南林业科技大学 | A kind of method for registering images based on subgraph |
CN111062976A (en) * | 2019-12-25 | 2020-04-24 | 中国科学院长春光学精密机械与物理研究所 | FMT-based low-orbit satellite solar telescope remote sensing image registration method |
CN111062976B (en) * | 2019-12-25 | 2023-02-28 | 中国科学院长春光学精密机械与物理研究所 | FMT-based low-orbit satellite solar telescope remote sensing image registration method |
CN111242006B (en) * | 2020-01-10 | 2021-04-09 | 长江水利委员会长江科学院 | Method for realizing geographic WPS (Wireless personal storage System) service based on Mask R-CNN (remote sensing image ground object detection) |
CN111242006A (en) * | 2020-01-10 | 2020-06-05 | 长江水利委员会长江科学院 | Method for realizing geographic WPS (Wireless personal storage System) service based on Mask R-CNN (remote sensing image ground object detection) |
CN113223065A (en) * | 2021-03-30 | 2021-08-06 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Automatic matching method for SAR satellite image and optical image |
CN113223065B (en) * | 2021-03-30 | 2023-02-03 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Automatic matching method for SAR satellite image and optical image |
CN113344937A (en) * | 2021-04-25 | 2021-09-03 | 中国科学院空天信息创新研究院 | Method for automatically identifying and removing black edge of remote sensing image |
CN113781529A (en) * | 2021-09-24 | 2021-12-10 | 中国科学院精密测量科学与技术创新研究院 | Wide-area SAR complex image sequence rapid registration method adopting twice blocking strategy |
CN115265424A (en) * | 2022-09-27 | 2022-11-01 | 威海晶合数字矿山技术有限公司 | Geological disaster side slope displacement monitoring method based on synthetic aperture radar technology |
Also Published As
Publication number | Publication date |
---|---|
CN102436652B (en) | 2014-08-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102436652B (en) | Automatic registering method of multisource remote sensing images | |
US11244197B2 (en) | Fast and robust multimodal remote sensing image matching method and system | |
CN103679714B (en) | A kind of optics and SAR automatic image registration method based on gradient cross-correlation | |
CN103914678B (en) | Abandoned land remote sensing recognition method based on texture and vegetation indexes | |
CN105651263B (en) | Shallow water depth multi-source remote sensing merges inversion method | |
CN104021556A (en) | Heterological remote-sensing image registration method based on geometric structure similarity | |
CN103839265A (en) | SAR image registration method based on SIFT and normalized mutual information | |
CN103093459B (en) | Utilize the method that airborne LiDAR point cloud data assisted image mates | |
CN103514606A (en) | Heterology remote sensing image registration method | |
CN105184801A (en) | Optical and SAR image high-precision registration method based on multilevel strategy | |
CN104318583B (en) | Visible light broadband spectrum image registration method | |
CN103295239A (en) | Laser-point cloud data automatic registration method based on plane base images | |
CN111008664B (en) | Hyperspectral sea ice detection method based on space-spectrum combined characteristics | |
CN102063715A (en) | Method for fusing typhoon cloud pictures based on NSCT (Nonsubsampled Controurlet Transformation) and particle swarm optimization algorithm | |
CN104809724A (en) | Automatic precise registration method for multiband remote sensing images | |
CN103489178A (en) | Method and system for image registration | |
CN101556694B (en) | Matching method of rotating images | |
CN103218811A (en) | Statistical distribution-based satellite multi-spectral image waveband registration method | |
CN107688776A (en) | A kind of urban water-body extracting method | |
Gao et al. | A general deep learning based framework for 3D reconstruction from multi-view stereo satellite images | |
CN116878748A (en) | Laser and image fusion intelligent gas leakage positioning method and device | |
CN113850769B (en) | Hyperspectral change detection method based on Simese space spectrum joint convolution network | |
Ni et al. | Hurricane eye morphology extraction from SAR images by texture analysis | |
Parmehr et al. | Automatic registration of optical imagery with 3d lidar data using local combined mutual information | |
Kim et al. | Automatic pseudo-invariant feature extraction for the relative radiometric normalization of hyperion hyperspectral images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |