CN102446356A - Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points - Google Patents


Info

Publication number
CN102446356A
CN102446356A, CN2011102861001A, CN201110286100A
Authority
CN
China
Prior art keywords
image
point
gauss
remote sensing
tower
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011102861001A
Other languages
Chinese (zh)
Inventor
宁晓刚
孙钰珊
谢文寒
杨景辉
吴宏安
张俊辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chinese Academy of Surveying and Mapping
Original Assignee
Chinese Academy of Surveying and Mapping
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chinese Academy of Surveying and Mapping filed Critical Chinese Academy of Surveying and Mapping
Priority to CN2011102861001A priority Critical patent/CN102446356A/en
Publication of CN102446356A publication Critical patent/CN102446356A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a parallel, adaptive matching method for acquiring uniformly distributed matched points from remote sensing images. The method comprises the following steps: dividing the overlapping region of remote sensing images of different resolutions into blocks; extracting the features of the images to be matched in parallel for each block; and, during extraction, automatically controlling the number of matched point pairs by grading the number of Gaussian image pyramid and difference-of-Gaussians image layers, so that uniformly distributed matched points are acquired from the remote sensing images in a parallel and adaptive manner and the matching of remote sensing images of different resolutions is achieved. The method realizes parallel and efficient matching of remote sensing images while ensuring an even distribution of the matched points.

Description

A parallel, adaptive matching method for remote sensing images that obtains uniformly distributed matched points
Technical field
The goal of image matching is to establish the best geometric transformation between two or more images. Image matching theory and methods not only support the analysis of a single image, but also the comparison and integrated information analysis of multiple images. Image matching is therefore a fundamental and core topic in digital image processing and remote sensing. Matching methods are widely applied in remote sensing, map updating, medical imaging, computer vision and many other fields, and numerous matching methods have been derived to suit different application requirements.
The present invention addresses the problem of matching remote sensing images of different resolutions. Taking the SIFT matching algorithm as its basis, it obtains uniformly distributed matched points from remote sensing images of different resolutions in a parallel and adaptive manner, and therefore has strong practical value.
Background art
Remote sensing image matching methods can be classified from different perspectives; the table below lists eight types of image matching methods according to four classification criteria.
Table 1. Classification of remote sensing image matching methods.
Classification criterion | Method types
Source of the similarity measure | Gray-level (intensity) correlation matching; feature-based matching
Domain of operation | Spatial-domain matching; frequency-domain matching
Transformation model | Linear matching; nonlinear matching
Degree of automation | Interactive matching; automatic matching
Gray-level correlation is a mature and widely accepted stable matching approach; its positioning accuracy is high, it can produce dense disparity surfaces, and the correlation coefficient is often combined with least-squares matching. However, gray-level correlation methods are very sensitive to image rotation and to changes in illumination and contrast, are prone to errors when the image contains repetitive texture or the relevant pixels are occluded, and their computational load is large.
Feature-based methods first extract image features, including point features, line features and contour features, and then match the images according to the features extracted from them. Feature matching is of limited use in photogrammetry: the accuracy of the features is difficult to raise to the sub-pixel level required by photogrammetry, and the successfully matched features often account for only a small fraction of the total number of features, so the matched features are sparsely distributed in space and the interpolated surface is distorted.
Spatial-domain matching methods match image gray levels or feature information directly in the image domain.
Frequency-domain matching methods first transform the images to be matched into the frequency domain through an algorithm such as the fast Fourier transform (FFT), and then carry out the matching between images in the frequency domain.
Linear matching methods cover translation, rotation, scaling and other affine transformations; because the linear model is global, it cannot model local differences that may exist between the images.
Nonlinear models, also called non-rigid or rubber-sheet models, can realize local re-mapping of the image, but the computation is complex.
Interactive matching methods require some manual intervention before the matching operation, such as specifying initial seed points for the images and manually removing gross errors.
Automatic matching methods complete the matching automatically, without user intervention.
At present, the SIFT (Scale Invariant Feature Transform) algorithm is widely adopted, both in China and abroad, for matching images with large scale differences. The SIFT algorithm was proposed by D.G. Lowe in 1999 and refined and summarized in 2004. Later, Y. Ke improved the SIFT algorithm by replacing the histogram-based part of the SIFT descriptor with PCA. The idea of the SIFT algorithm is as follows: first, through a multi-scale representation of the image, extract the extreme points in scale space (maxima or minima of the DoG space); then locate these extreme points precisely, down to sub-pixel level; then extract the SIFT feature at each of these feature points; and finally match the images by comparing the SIFT features in a high-dimensional space. The SIFT feature extracted through the above steps is a scale-space-based local image feature descriptor that is invariant to image scaling and rotation and, to a degree, to affine transformation.
The SIFT matching algorithm has the following advantages: the SIFT feature is a local image feature that is invariant to rotation, scale and brightness changes, and remains stable to a certain degree under viewpoint change, affine transformation and noise; it is rich in information and suitable for fast and accurate matching against massive feature databases, and even a small number of objects can produce a large number of SIFT feature vectors; an optimized SIFT matching algorithm can even meet real-time requirements; and it can easily be combined with other forms of feature vectors.
Remote sensing image matching is one of the most important links in the integration and analysis of remote sensing image information, and an essential step and means for image fusion, change detection and super-resolution imaging. Although many problems have been solved, automatic matching of remote sensing images remains an important fundamental research topic in the remote sensing field. In particular, matching remote sensing images with large scale differences, matching algorithms that obtain uniformly distributed matched points, matching theory and methods for complex nonlinear and local deformations, and the matching of N-dimensional (N>2) images are all challenging problems in the field of image matching.
Developing a matching expert system that can automatically recognize the matching task and autonomously select the matching algorithm is a development trend in the field of image matching. Such a matching expert system should be able to integrate various matching methods and provide consistent solutions for various application demands.
The method proposed by the present invention solves the matching problem for remote sensing images of different resolutions well, and the corresponding points obtained by matching are evenly distributed over the overlapping region of the two images. Another advantage of the present invention is that, by dividing the overlapping region into blocks, the image feature points can be extracted in parallel for each block, thereby greatly improving the efficiency of remote sensing image matching.
Classified by the source of the similarity measure, the invention belongs to the feature-based matching methods; by the domain of operation, to the spatial-domain matching methods; by the transformation model, to the nonlinear matching methods; and by the degree of automation, to the automatic matching methods.
Summary of the invention
The invention provides a parallel, adaptive matching method for remote sensing images that obtains uniformly distributed matched points, comprising the following steps:
Step 1: obtain the overlapping region of the remote sensing images to be matched;
Step 2: divide the overlapping region into blocks, in order from left to right and from top to bottom;
Step 3: match each block based on the Gaussian image pyramid; if the number of matched points obtained for a block reaches the preset per-block requirement, the matching of that block is finished;
Step 4: for blocks that do not reach the required number of matched points, add Gaussian image pyramids or difference-of-Gaussians (DoG) image layers and continue to match each block in parallel, until the required number of matched points is satisfied or the matching algorithm has been completed for all three groups of Gaussian image pyramids.
Description of the drawings
Fig. 1 is a schematic diagram of determining the overlapping region of two remote sensing images to be matched according to the present invention;
Fig. 2 is a schematic diagram of dividing the overlapping region into blocks according to the present invention;
Fig. 3 shows the Gaussian image pyramids used by the SIFT algorithm in the present invention;
Fig. 4 is a detailed flowchart of the parallel adaptive matching method of the present invention for obtaining uniformly distributed matched points from remote sensing images of different resolutions.
Embodiment
A preferred embodiment of the present invention is realized by the following steps:
Step 1: obtain the overlapping region of the remote sensing images to be matched. According to the metadata file of each remote sensing image, obtain its geographic extent; then compare the geographic extents of the two remote sensing images to be matched to obtain the overlapping region, as shown in Fig. 1.
Step 2: divide the overlapping region into blocks. Within the overlapping region, divide the region into blocks of fixed size 256 × 256 in order from left to right and from top to bottom; the parts at the end of the overlapping region with fewer than 256 rows (or columns) remaining are kept as blocks of the remaining size, as shown in Fig. 2.
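A sketch of the row-major tiling described above, with block coordinates given in pixels of the overlapping region; blocks at the right and bottom edges simply keep whatever rows or columns remain:

```python
def tile_region(height: int, width: int, block: int = 256):
    """Yield (row0, col0, rows, cols) blocks covering a height x width region,
    left to right and top to bottom; edge blocks keep the remaining size."""
    for r in range(0, height, block):
        for c in range(0, width, block):
            yield r, c, min(block, height - r), min(block, width - c)

# Example: a 600 x 500 overlap region gives 3 x 2 blocks, the last row/column smaller.
blocks = list(tile_region(600, 500))
print(len(blocks), blocks[-1])   # 6 (512, 256, 88, 244)
```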
Step 3: match each block in parallel using the SIFT algorithm. At this stage, matching is carried out only on the first three layers of DoG images of the Gaussian image pyramid at original size. (As shown in Fig. 3, there are three Gaussian image pyramids, built at the original size, at twice the original size and at half the original size, each with five layers; the DoG images generated from each Gaussian pyramid have four layers, and during matching the feature operations can be carried out separately on the first three layers and on the last three layers of the DoG images.) If the number of matched points obtained for a block reaches the preset per-block requirement, the matching of that block is finished.
Step 4: for blocks that do not reach the required number of matched points, continue to match each block in parallel. As shown in Fig. 4, Gaussian image pyramids and DoG image layers are added adaptively, until the requirement on the number of matched points is satisfied or the SIFT algorithm has been completed for all three groups of Gaussian image pyramids.
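The adaptive loop of steps 3 and 4 could be sketched as follows. Here match_block is a placeholder for the SIFT matching of one block restricted to a given pyramid and DoG layer range; the exact order in which pyramids and layer ranges are added is an assumption, since the text only requires that layers and pyramids are added until the per-block quota is met or all three pyramids are exhausted:

```python
from concurrent.futures import ProcessPoolExecutor

# Pyramid 0 = original size, pyramid 1 = upsampled x2, pyramid 2 = downsampled x2.
# Each pyramid has 4 DoG layers; feature operations run on the first 3 or last 3 of them.
SCHEDULE = [(0, slice(0, 3)), (0, slice(1, 4)),
            (1, slice(0, 3)), (1, slice(1, 4)),
            (2, slice(0, 3)), (2, slice(1, 4))]

def match_block_adaptive(block, min_points, match_block):
    """Enlarge the search (more DoG layers, more pyramids) until the block yields
    at least min_points matches or the schedule is exhausted."""
    matches = []
    for pyramid, layers in SCHEDULE:
        matches = match_block(block, pyramid, layers)
        if len(matches) >= min_points:
            break
    return matches

def match_all_blocks(blocks, min_points, match_block, workers=8):
    """Run the adaptive matching of every block in parallel."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(match_block_adaptive, b, min_points, match_block)
                   for b in blocks]
        return [f.result() for f in futures]
```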
The SIFT matching procedure is carried out with the following steps and parameters:
Step 1: generate the Gaussian images and DoG images in scale space. First, the original image is up-sampled and convolved with a Gaussian template with σ = 1.5. On this basis, three Gaussian image pyramids are built, each containing five image layers. A difference-of-Gaussians operation is applied to the five layers of each Gaussian pyramid, i.e. every two adjacent images within the same group are subtracted (upper layer minus lower layer); in this way each Gaussian pyramid generates four layers of DoG images, and the three Gaussian pyramids generate three groups of DoG images.
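This construction could be sketched with NumPy and OpenCV as below, assuming a single-band input image. The σ = 1.5 template, the three pyramids and the five layers per pyramid follow the text; the per-layer scale step k is an assumption, since the text does not state it:

```python
import cv2
import numpy as np

def gaussian_and_dog_pyramids(img, n_layers=5, sigma=1.5, k=2 ** 0.5):
    """Build three Gaussian pyramids (upsampled x2, original size, downsampled x2),
    each with n_layers blurred images, and the (n_layers - 1) DoG images per pyramid."""
    bases = [cv2.resize(img, None, fx=2.0, fy=2.0, interpolation=cv2.INTER_LINEAR),
             img,
             cv2.resize(img, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_LINEAR)]
    gaussians, dogs = [], []
    for base in bases:
        layers = [cv2.GaussianBlur(base.astype(np.float32), (0, 0), sigma * k ** i)
                  for i in range(n_layers)]
        gaussians.append(layers)
        dogs.append([layers[i + 1] - layers[i] for i in range(n_layers - 1)])  # upper minus lower
    return gaussians, dogs
```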
Step 2: detect extrema, locate the extreme points precisely, and remove low-contrast points and edge response points. Extremum detection is carried out across three consecutive DoG images of adjacent scales: each candidate point is compared with its 8 neighbours at the same scale and the 9 × 2 corresponding points at the scales above and below, 26 points in total. If the gray value of the candidate point is the maximum or the minimum among them, the point is retained as a candidate feature point. To enhance matching stability and noise resistance, the SIFT algorithm fits a three-dimensional quadratic function to determine the position and scale of the keypoint precisely (to sub-pixel accuracy) while removing low-contrast points and unstable edge response points. For the detected feature points, the precise localization is completed through a Taylor expansion and extremum solving, and edge response points are rejected by means of the Hessian matrix computed at the extreme point.
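The 26-neighbour test could be sketched as follows; this is the brute-force comparison only, with the sub-pixel Taylor refinement and the Hessian-based edge rejection omitted:

```python
import numpy as np

def detect_extrema(dog_prev, dog_cur, dog_next, border=1):
    """Return (row, col) positions in dog_cur whose value is the maximum or minimum
    of the 3 x 3 x 3 neighbourhood spanning three adjacent DoG layers (26 neighbours)."""
    points = []
    h, w = dog_cur.shape
    for r in range(border, h - border):
        for c in range(border, w - border):
            cube = np.stack([dog_prev[r-1:r+2, c-1:c+2],
                             dog_cur[r-1:r+2, c-1:c+2],
                             dog_next[r-1:r+2, c-1:c+2]])
            v = dog_cur[r, c]
            if v == cube.max() or v == cube.min():
                points.append((r, c))
    return points
```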
Step 3: assign an orientation to each feature point, so that the operator is rotation invariant. A neighbourhood is taken around the feature point; the gradient magnitude and weight of each pixel in the neighbourhood are computed and accumulated into an orientation histogram; the orientation of the feature point is then determined from the histogram energy: the bin with the highest energy gives the principal orientation of the feature point, and any bin whose energy exceeds 80% of the principal orientation's energy is taken as an auxiliary orientation of the feature point.
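A sketch of the orientation assignment; the neighbourhood radius and the absence of Gaussian weighting are simplifying assumptions, while the 80% rule for auxiliary orientations follows the text:

```python
import numpy as np

def assign_orientations(gauss, r, c, radius=8, n_bins=36):
    """Accumulate a gradient-orientation histogram around (r, c) of one Gaussian layer;
    return the principal orientation and every orientation above 80% of its energy."""
    patch = gauss[r - radius:r + radius + 1, c - radius:c + radius + 1].astype(np.float32)
    dy, dx = np.gradient(patch)
    mag = np.hypot(dx, dy)                                   # gradient magnitude (weight)
    ang = np.mod(np.degrees(np.arctan2(dy, dx)), 360.0)      # gradient direction in degrees
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 360), weights=mag)
    principal = hist.argmax()
    keep = [b for b in range(n_bins) if hist[b] >= 0.8 * hist[principal]]
    return [(b + 0.5) * 360.0 / n_bins for b in keep]        # principal bin is included
```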
Step 4: generate the SIFT feature point descriptor. A 16 × 16 window centred on the SIFT feature point is taken; on each 4 × 4 sub-block, an 8-direction gradient histogram is computed and the accumulated gradient value of each direction is obtained, forming one seed point; there are 16 seed points in total, each with 8 directions, which together form a 128-dimensional vector, the descriptor of the feature point.
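The 4 × 4 × 8 descriptor layout could be sketched as follows; rotating the window to the principal orientation and Gaussian-weighting the window, which full SIFT performs, are omitted here for brevity:

```python
import numpy as np

def sift_descriptor(gauss, r, c):
    """Build a 128-dimensional descriptor from a 16 x 16 window centred at (r, c):
    4 x 4 sub-blocks, each contributing an 8-bin gradient-orientation histogram."""
    win = gauss[r - 8:r + 8, c - 8:c + 8].astype(np.float32)          # 16 x 16 window
    dy, dx = np.gradient(win)
    mag = np.hypot(dx, dy)
    ang = np.mod(np.degrees(np.arctan2(dy, dx)), 360.0)
    desc = []
    for br in range(0, 16, 4):
        for bc in range(0, 16, 4):                                    # one "seed point" per 4 x 4 block
            hist, _ = np.histogram(ang[br:br+4, bc:bc+4], bins=8, range=(0, 360),
                                   weights=mag[br:br+4, bc:bc+4])
            desc.extend(hist)
    desc = np.asarray(desc)
    return desc / (np.linalg.norm(desc) + 1e-12)                      # 16 x 8 = 128 dimensions
```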
Step 5: match the images based on the feature point descriptors. The Euclidean distance between feature point descriptors is used as the similarity measure. For the descriptor of a feature point in image A, the descriptors of the two feature points in image B with the smallest Euclidean distances are found; if the nearest distance divided by the second-nearest distance is smaller than a threshold, the pair of points is accepted as a match. Lowering this ratio threshold reduces the number of SIFT matched points but makes the matching more stable; the threshold can be set to 0.6.
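A sketch of the nearest / second-nearest ratio test with the 0.6 threshold (brute-force search, assuming image B contributes at least two descriptors):

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, ratio=0.6):
    """Match two descriptor sets (n x 128 and m x 128 arrays) with the ratio test:
    accept a pair only if nearest distance < ratio * second-nearest distance."""
    matches = []
    for i, d in enumerate(desc_a):
        dist = np.linalg.norm(desc_b - d, axis=1)   # Euclidean distance to every descriptor in B
        j1, j2 = np.argsort(dist)[:2]
        if dist[j1] < ratio * dist[j2]:
            matches.append((i, j1))                 # (index in A, index in B)
    return matches
```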
The present invention realizes parallel and efficient remote sensing image matching and achieves an even distribution of matched points. In addition, the present invention is also applicable to matching remote sensing images of the same resolution.

Claims (12)

1. A parallel, adaptive matching method for remote sensing images that obtains uniformly distributed matched points, comprising the following steps:
Step 1: obtaining the overlapping region of the remote sensing images to be matched;
Step 2: dividing the overlapping region into blocks, in order from left to right and from top to bottom;
Step 3: matching each block based on the Gaussian image pyramid; if the number of matched points obtained for a block reaches the preset per-block requirement, the matching of that block is finished;
Step 4: for blocks that do not reach the required number of matched points, adding Gaussian image pyramids or difference-of-Gaussians (DoG) image layers and continuing to match each block in parallel, until the required number of matched points is satisfied or the matching algorithm has been completed for all three groups of Gaussian image pyramids.
2. The method of claim 1, characterized in that in the first step the geographic extent of each remote sensing image is obtained according to its metadata file, and the geographic extents of the two remote sensing images to be matched are compared to obtain the overlapping region.
3. The method of claim 1, characterized in that in the second step the overlapping region can be divided into 256 × 256 blocks, with the insufficient parts at the end of the overlapping region kept as blocks of the remaining size.
4. The method of claim 1, characterized in that the matching in the third step is carried out on three groups of Gaussian image pyramids: the Gaussian image pyramid at original size, the pyramid enlarged by a factor of two, and the pyramid reduced by a factor of two.
5. The method of claim 4, characterized in that each group of Gaussian image pyramids contains n image layers; the DoG images generated from it have (n-1) layers; during matching, the feature operations can be carried out on the first m layers or the last m layers of the DoG images, where m and n are natural numbers and m < n.
6. The method of claim 1, characterized in that the matching in the third step is specifically divided into the following steps:
first, generating the Gaussian images and DoG images in scale space;
second, detecting extrema, locating the extreme points precisely, and removing low-contrast points and edge response points;
third, assigning an orientation to each feature point, so that the operator is rotation invariant;
fourth, generating the feature point descriptors;
fifth, matching the remote sensing images based on the feature point descriptors.
7. The method of claim 6, characterized in that in the first step the original image is first up-sampled and convolved with a Gaussian template with σ = 1.5; on this basis, three groups of Gaussian image pyramids are built, each containing five image layers; a difference-of-Gaussians operation is applied to the five layers of each group, i.e. each image is subtracted from the adjacent image above it within the same group, so that each Gaussian pyramid generates four layers of DoG images.
8. The method of claim 6, characterized in that in the second step the extremum detection is carried out across three consecutive DoG images of adjacent scales; each candidate point is compared with its 8 neighbours at the same scale and the 9 × 2 corresponding points at the scales above and below; if the gray value of the candidate point is a maximum or a minimum, the point is retained as a candidate feature point; for the detected feature points, the precise localization is completed through a Taylor expansion and extremum solving, and edge response points are rejected by means of the Hessian matrix computed at the extreme point.
9. The method of claim 6, characterized in that the orientation assignment in the third step takes a neighbourhood around the feature point, computes the gradient magnitude and weight of each pixel in the neighbourhood, accumulates them into an orientation histogram, and determines the orientation of the feature point from the histogram energy: the bin with the highest energy gives the principal orientation of the feature point, and any bin whose energy exceeds 80% of the principal orientation's energy is taken as an auxiliary orientation of the feature point.
10. The method of claim 6, characterized in that in the fourth step a 16 × 16 window centred on the feature point is taken; on each 4 × 4 sub-block, an 8-direction gradient histogram is computed and the accumulated gradient value of each direction is obtained, forming one seed point; there are 16 seed points in total, each with 8 directions, which together form a 128-dimensional vector, the descriptor of the feature point.
11. The method of claim 6, characterized in that the matching in the fifth step uses the Euclidean distance between feature point descriptors as the similarity measure; the descriptors of the two feature points with the smallest Euclidean distances are taken as candidate matches; if the nearest distance divided by the second-nearest distance is smaller than a threshold, the pair of points is accepted as a match.
12. The method of claim 11, characterized in that the smaller the threshold, the fewer the matched points but the more stable the matching; the threshold can be set to 0.6.
CN2011102861001A 2011-09-24 2011-09-24 Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points Pending CN102446356A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011102861001A CN102446356A (en) 2011-09-24 2011-09-24 Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011102861001A CN102446356A (en) 2011-09-24 2011-09-24 Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points

Publications (1)

Publication Number Publication Date
CN102446356A true CN102446356A (en) 2012-05-09

Family

ID=46008834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011102861001A Pending CN102446356A (en) 2011-09-24 2011-09-24 Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points

Country Status (1)

Country Link
CN (1) CN102446356A (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101251926A (en) * 2008-03-20 2008-08-27 北京航空航天大学 Remote sensing image registration method based on local configuration covariance matrix
CN101847258A (en) * 2009-03-26 2010-09-29 陈贤巧 Optical remote sensing image registration method
CN101957991A (en) * 2010-09-17 2011-01-26 中国科学院上海技术物理研究所 Remote sensing image registration method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DAVID G. LOWE: "Distinctive Image Features from Scale-Invariant Keypoints", 《INTERNATIONAL JOURNAL OF COMPUTER VISION》, vol. 60, no. 2, 31 December 2004 (2004-12-31), pages 91 - 110, XP002377338 *
LUO Zhengyu, NING Xiaogang: "不同分辨率遥感影像获取均匀分布匹配点的匹配方法研究" (Research on a matching method for obtaining uniformly distributed matched points from remote sensing images of different resolutions), 《测绘通报》 (Bulletin of Surveying and Mapping), no. 4, 25 April 2011 (2011-04-25) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104778682A (en) * 2014-01-14 2015-07-15 三星泰科威株式会社 Method of sampling feature points, image matching method, and image matching apparatus
CN104268169A (en) * 2014-09-11 2015-01-07 浙江中测新图地理信息技术有限公司 Remote sensing image data rapidly processing method based on PS (Photoshop)
CN104268169B (en) * 2014-09-11 2017-06-09 浙江中测新图地理信息技术有限公司 A kind of remote sensing image data immediate processing method based on PS softwares
CN105069463A (en) * 2015-07-17 2015-11-18 重庆交通大学 Object-oriented multiple scale mountainous city land coverage information obtaining method
CN107945218B (en) * 2017-11-22 2020-05-22 中国资源卫星应用中心 Edge large distortion image matching method
CN107945218A (en) * 2017-11-22 2018-04-20 中国资源卫星应用中心 The big distorted image matching process in edge
CN108492328A (en) * 2018-03-23 2018-09-04 云南大学 Video interframe target matching method, device and realization device
CN108492711A (en) * 2018-04-08 2018-09-04 黑龙江工业学院 A kind of drawing electronic map method and device
CN109658446A (en) * 2018-10-30 2019-04-19 武汉珈和科技有限公司 A kind of high-resolution remote sensing image geometrical registration method and device
CN109658446B (en) * 2018-10-30 2023-04-07 武汉珈和科技有限公司 Geometric registration method and device for high-resolution remote sensing image
CN109598750A (en) * 2018-12-07 2019-04-09 中国地质大学(武汉) One kind being based on the pyramidal large scale differential image characteristic point matching method of deformation space
CN109598750B (en) * 2018-12-07 2023-05-23 中国地质大学(武汉) Large-scale difference image feature point matching method based on deformation space pyramid
CN117575979A (en) * 2023-08-01 2024-02-20 广东省国土资源测绘院 Remote sensing image change detection method and device

Similar Documents

Publication Publication Date Title
CN102446356A (en) Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points
CN105184801B (en) It is a kind of based on multi-level tactful optics and SAR image high-precision method for registering
CN104574347B (en) Satellite in orbit image geometry positioning accuracy evaluation method based on multi- source Remote Sensing Data data
CN105261014B (en) A kind of multisensor Remote Sensing Images Matching Method
CN110070567B (en) Ground laser point cloud registration method
CN103136525B (en) High-precision positioning method for special-shaped extended target by utilizing generalized Hough transformation
CN105354841B (en) A kind of rapid remote sensing image matching method and system
CN109101981B (en) Loop detection method based on global image stripe code in streetscape scene
CN104063711B (en) A kind of corridor end point fast algorithm of detecting based on K means methods
CN103426186A (en) Improved SURF fast matching method
Chen et al. Robust affine-invariant line matching for high resolution remote sensing images
CN112396643A (en) Multi-mode high-resolution image registration method with scale-invariant features and geometric features fused
CN103295239A (en) Laser-point cloud data automatic registration method based on plane base images
CN103218787A (en) Multi-source heterogeneous remote-sensing image control point automatic collecting method
CN104123554B (en) SIFT image characteristic extracting methods based on MMTD
Yuan et al. Learning to count buildings in diverse aerial scenes
CN104408772A (en) Grid projection-based three-dimensional reconstructing method for free-form surface
CN110197113B (en) Face detection method of high-precision anchor point matching strategy
CN110443261A (en) A kind of more figure matching process restored based on low-rank tensor
CN113409332B (en) Building plane segmentation method based on three-dimensional point cloud
Li et al. Image Matching Algorithm based on Feature-point and DAISY Descriptor.
Liang et al. An extraction and classification algorithm for concrete cracks based on machine vision
CN114332172A (en) Improved laser point cloud registration method based on covariance matrix
CN117078726A (en) Different spectrum image registration method based on edge extraction
Ni et al. A fully automatic registration approach based on contour and SIFT for HJ-1 images

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20120509