CN103400384B - Wide-angle image matching method combining region matching and point matching - Google Patents

Wide-angle image matching method combining region matching and point matching

Info

Publication number: CN103400384B
Authority: CN (China)
Prior art keywords: region, MSER, point, feature point
Application number: CN201310325400.5A
Other languages: Chinese (zh)
Other versions: CN103400384A (en)
Inventors: 张强, 郑元世, 陈月玲, 王亚彬, 王龙
Original assignee: 西安电子科技大学 (Xidian University)
Priority date: 2013-07-22 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2013-07-22
Publication date: 2016-05-25
Application filed by 西安电子科技大学; priority to CN201310325400.5A
Publication of CN103400384A: 2013-11-20
Application granted; publication of CN103400384B: 2016-05-25
Abstract

A wide-angle image matching method combining region matching and point matching. The steps are: 1. input two images related by a large viewing-angle change; 2. perform region detection on the images with MSER, and fit an elliptical region to each detected region using its mean and covariance; 3. normalize each elliptical region to a circular region and describe it with a SIFT descriptor; 4. select initial region matching pairs with the nearest-neighbor to second-nearest-neighbor ratio strategy; 5. inside the matched regions, detect feature points with the SIFT method; 6. describe each feature point, obtaining a 128-dimensional descriptor and a 2-dimensional MSER-based spatial descriptor; 7. select accurate matching point pairs in the two images with a combined-distance similarity strategy. The invention overcomes the defects of prior-art feature-point descriptors, which lack affine invariance and ignore spatial information; it extracts matching point pairs with higher accuracy and is better suited to image registration.

Description

Wide-angle image matching method combining region matching and point matching

Technical field

The invention belongs to the technical field of image processing, and more particularly relates to a wide-angle image matching method combining region matching and point matching in the field of image registration. The invention can extract consistent feature points from the input images more accurately, and is applicable to image registration under large viewing-angle changes.

Background technology

Image matching is a special area of image processing: consistent feature points are extracted between different images of the same scene, the corresponding geometric relationship between the images is determined, and a matched image is obtained. The matched image describes the scene more completely than a single image. In general, image matching can be carried out with methods based on local feature extraction and matching. These methods mainly address the scale and rotation invariance of the input images, but they often ignore the distortion caused by changes of viewpoint, and therefore cannot obtain valid, accurate corresponding matching feature points.

The patent "Image matching method" filed by Tsinghua University (application number CN200710176039.9, publication number CN101140624) discloses an image matching method. Its concrete steps are: first, the MSER method detects homogeneous regions in the images to be matched, and each detected homogeneous region is fitted with an elliptical region equation; then, an elliptical gradient histogram and a multi-scale image decomposition are used to obtain the feature vector of the measurement region around each fitted homogeneous region; finally, a hierarchical k-means clustering method builds a binary tree over the set of homogeneous regions and image matching is carried out. The method applies MSER region detection to image matching, but it still has a shortcoming: because images are often acquired by different sensors and from different viewpoints, simply matching MSER regions yields regions that are not robust to scale changes and cannot cope with the image distortion caused by large viewing-angle changes. This reduces, to some extent, the ability of the matching method to obtain robust consistent points in images with large viewing-angle changes.

The paper "Distinctive image features from scale-invariant keypoints" by D. Lowe (International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110) discloses a feature extraction and matching method. The method first uses scale-space extremum detection to search image positions over all scales; then, at each candidate position, it determines position and scale by fitting a fine model, and assigns one or more orientations to each keypoint position based on the local image gradient directions; finally, in the neighborhood around each keypoint, it builds the keypoint description from the local image gradients, and matches the described feature points with the nearest-neighbor to second-nearest-neighbor ratio strategy to obtain matched feature points. The shortcoming of this method is that it only considers scale and rotation invariance, not affine invariance. Therefore, when the images contain geometric distortion, the description regions of corresponding points in the two images do not cover the same content, and the method often fails to produce satisfactory results.

The paper "ASIFT: A new framework for fully affine invariant image comparison" by J.-M. Morel and G. Yu (SIAM Journal on Imaging Sciences, vol. 2, pp. 438-469, 2009) discloses an affine- and scale-invariant image matching method. The method first transforms the two direction parameters of the camera axes, namely the longitude and latitude angles, and samples the two parameter spaces appropriately to simulate two groups of views from the two images to be matched; then, the SIFT method is used to match these two groups of views; finally, all match points in the two groups of views are extracted and mapped back onto the two original images. The shortcoming of this method is that simulating an image for every viewpoint in the affine space consumes a large amount of memory; at the same time, while extracting a large number of correct matches it also introduces many false matches, which lowers the correct matching rate.

Summary of the invention

The object of the invention is to overcome the shortcomings of the above prior art by proposing a wide-angle image matching method combining region matching and point matching, so as to extract more robust consistent feature points from the input images and achieve a higher correct matching rate.

To achieve the above object, the present invention combines region matching and point matching when performing feature detection on the input images, and introduces the spatial information of the feature points into point matching to describe them. The technical scheme is: first perform MSER region detection and matching on the two images, and detect scale-invariant feature points on the matched MSER regions; then describe each feature point with a 128-dimensional MSER-based SIFT descriptor and an MSER-based spatial descriptor; finally, apply a combined-distance similarity strategy to obtain comparatively accurate matching point pairs.

The concrete steps of the present invention are as follows:

(1) Input images: input two images related by a large viewing-angle change;

(2) MSER detection:

Perform maximally stable extremal region (MSER) detection on each of the two images, obtaining multiple affine-invariant irregular extremal regions;

(3) MSER fitting:

3a) For each irregular extremal region, compute the points of the fitted region according to the following formula:

(x − μ)ᵀU⁻¹(x − μ) = (x − μ)ᵀM(x − μ) = 1

where x denotes a point of the fitted region, μ denotes the mean of the irregular extremal region, T denotes transposition, U denotes the covariance of the irregular extremal region, and M denotes the second-moment matrix of the irregular extremal region;

3b) Form the elliptical fitted region from all the fitted-region points obtained;

(4) MSER normalization:

4a) Compute the points of the normalized region according to the following formula:

z = sH⁻¹M^(1/2)(x − μ)

where z denotes a point of the normalized region, s denotes the scaling factor, 6 ≤ s ≤ 10, H denotes the real unitary matrix obtained from the singular value decomposition of the second-moment matrix M, M denotes the second-moment matrix of the irregular extremal region, x denotes a point of the fitted region, and μ denotes the mean of the irregular extremal region;

4b) Form the normalized region from all the normalized-region points obtained;

(5) MSER description:

Describe the normalized region with the SIFT feature descriptor, forming a 128-dimensional SIFT descriptor;

(6) MSER matching:

Select initial matching region pairs in the two images with the nearest-neighbor to second-nearest-neighbor ratio strategy;

(7) Scale-invariant feature point detection:

Inside the initial matching regions, detect scale-invariant feature points with the SIFT method, determining the precise position information and characteristic scale information of each feature point;

(8) Feature point description:

8a) Obtain the 128-dimensional MSER-based SIFT descriptor of each feature point;

8b) Obtain the 2-dimensional MSER-based spatial descriptor of each feature point;

(9) Select fine feature-point matching pairs in the two images with the combined-distance similarity strategy.

Compared with the prior art, the present invention has the following advantages:

First, the present invention performs MSER detection and matching on the two input images, carries out scale-invariant feature detection on the matched regions, constructs affine-invariant elliptical neighboring regions, and applies SIFT description to the normalized regions. This overcomes the defect of the prior art, which is limited to circular neighboring regions that lack affine invariance, so that the matching method designed by the present invention has more stable feature-point dominant orientations and more accurate affine-invariant descriptors than the prior art.

Second, the present invention exploits the distinctive spatial relationship of each feature point within its MSER region to form a 2-dimensional affine-invariant spatial descriptor. This overcomes the defect of prior-art matching methods, which consider only photometric characteristics and ignore spatial relationships, so that the description of feature points in the present invention is more discriminative than in the prior art.

Brief description of the drawings

Fig. 1 is the flow chart of the present invention;

Fig. 2 shows the simulated wide-angle image matching results for a structured scene;

Fig. 3 shows the simulated wide-angle image matching results for a textured scene.

Detailed description of the invention

The present invention is further described below in conjunction with the accompanying drawings.

With reference to Fig. 1, the concrete steps of the present invention are as follows.

Step 1, input images: input two images related by a large viewing-angle change.

Step 2, MSER detection:

Perform maximally stable extremal region (MSER) detection on each of the two images, obtaining multiple affine-invariant irregular extremal regions.
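As an illustration of this step, the sketch below detects MSER regions in the two input images. The patent does not specify an implementation; OpenCV's MSER detector with default parameters and the file names view_0deg.png and view_60deg.png are assumptions made for the example.

```python
import cv2

def detect_mser_regions(gray):
    """Detect maximally stable extremal regions; each region is returned as an
    array of (x, y) pixel coordinates of one irregular extremal region."""
    mser = cv2.MSER_create()                   # default parameters, illustrative only
    regions, _bboxes = mser.detectRegions(gray)
    return regions

# Hypothetical input files: two views of one scene under a large viewing-angle change.
img1 = cv2.imread("view_0deg.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_60deg.png", cv2.IMREAD_GRAYSCALE)
regions1 = detect_mser_regions(img1)
regions2 = detect_mser_regions(img2)
```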

Step 3, MSER fitting:

For each irregular extremal region, compute the points of the fitted region according to the following formula:

(x − μ)ᵀU⁻¹(x − μ) = (x − μ)ᵀM(x − μ) = 1

where x denotes a point of the fitted region, μ denotes the mean of the irregular extremal region, T denotes transposition, U denotes the covariance of the irregular extremal region, and M denotes the second-moment matrix of the irregular extremal region.

Form the elliptical fitted region from all the fitted-region points obtained.
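A minimal sketch of this fit: the mean μ and covariance U of one irregular extremal region give the second-moment matrix M = U⁻¹ of the formula above. NumPy is assumed; the region is an array of pixel coordinates as produced by the MSER step.

```python
import numpy as np

def fit_ellipse(region_pts):
    """Fit the elliptical region (x - mu)^T M (x - mu) = 1 to one MSER region."""
    pts = np.asarray(region_pts, dtype=np.float64)
    mu = pts.mean(axis=0)                 # mean of the irregular extremal region
    U = np.cov(pts, rowvar=False)         # 2x2 covariance of the region
    M = np.linalg.inv(U)                  # second-moment matrix of the fit
    return mu, M                          # boundary points x satisfy (x-mu)^T M (x-mu) = 1
```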

Step 4, MSER normalization:

Compute the points of the normalized region according to the following formula:

z = sH⁻¹M^(1/2)(x − μ)

where z denotes a point of the normalized region, s denotes the scaling factor, 6 ≤ s ≤ 10 (s = 10 is adopted in the experiments), H denotes the real unitary matrix obtained from the singular value decomposition of the second-moment matrix M, M denotes the second-moment matrix of the irregular extremal region, x denotes a point of the fitted region, and μ denotes the mean of the irregular extremal region.

Form the normalized region from all the normalized-region points obtained.
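The normalization can be sketched directly from the formula: H and the matrix square root M^(1/2) come from the singular value decomposition of the symmetric matrix M, and s = 10 follows the value used in the experiments.

```python
import numpy as np

def normalize_region(region_pts, mu, M, s=10.0):
    """Map the points of a fitted elliptical region onto a circular region of
    radius s via z = s * H^{-1} * M^{1/2} * (x - mu)."""
    H, S, _ = np.linalg.svd(M)                      # M = H diag(S) H^T for symmetric M
    M_sqrt = H @ np.diag(np.sqrt(S)) @ H.T          # matrix square root of M
    x = np.asarray(region_pts, dtype=np.float64)
    z = s * (np.linalg.inv(H) @ M_sqrt @ (x - mu).T).T
    return z
```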

Step 5, MSER description:

Describe the normalized region with the SIFT feature descriptor, forming a 128-dimensional SIFT descriptor.
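One way to obtain the 128-dimensional SIFT descriptor of a normalized circular region is sketched below: the normalized region is rasterized into a small grayscale patch and described by a single SIFT keypoint placed at its center. The rasterization step and the use of OpenCV's SIFT are assumptions; the patent only states that the normalized region is described with the SIFT descriptor.

```python
import cv2

def describe_normalized_region(patch, radius):
    """128-D SIFT descriptor of a normalized circular region.

    patch  : 8-bit grayscale image containing the rasterized normalized region, centered
    radius : radius (in pixels) of the normalized circular region
    """
    sift = cv2.SIFT_create()
    cx, cy = patch.shape[1] / 2.0, patch.shape[0] / 2.0
    keypoint = [cv2.KeyPoint(cx, cy, 2.0 * radius)]   # x, y, diameter
    _, desc = sift.compute(patch, keypoint)
    return desc[0]                                    # shape (128,)
```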

Step 6, MSER matching:

Select initial matching region pairs in the two images with the nearest-neighbor to second-nearest-neighbor ratio strategy.

In this strategy, for each region descriptor the nearest-neighbor distance is divided by the second-nearest-neighbor distance; region pairs for which this ratio is less than 0.85 are taken as initial matching region pairs.
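A sketch of this ratio test applied to the 128-dimensional region descriptors of the two images; brute-force L2 matching with OpenCV is an assumption, and only the 0.85 ratio threshold comes from the patent.

```python
import cv2

def match_regions(desc1, desc2, ratio=0.85):
    """Initial region matching: keep pairs whose nearest-neighbor distance divided
    by the second-nearest-neighbor distance is below the ratio threshold.

    desc1, desc2: float32 arrays of shape (n, 128), one region descriptor per row.
    Returns (index_in_image1, index_in_image2) pairs."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = []
    for candidates in matcher.knnMatch(desc1, desc2, k=2):
        if len(candidates) < 2:
            continue
        m, n = candidates                       # nearest and second-nearest neighbor
        if n.distance > 0 and m.distance / n.distance < ratio:
            pairs.append((m.queryIdx, m.trainIdx))
    return pairs
```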

Step 7, scale-invariant feature point detection:

Inside the initial matching regions, detect scale-invariant feature points with the SIFT method, determining the precise position information and characteristic scale information of each feature point.
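This step restricts SIFT detection to a matched region; the sketch below does so with a binary mask built from the region's pixels. The masking approach is an assumption, as the patent only requires that the scale-invariant points be detected inside the initial matching regions.

```python
import cv2
import numpy as np

def detect_points_in_region(gray, region_pts):
    """Detect scale-invariant SIFT keypoints only inside one MSER region;
    each keypoint carries its precise position (kp.pt) and scale (kp.size)."""
    mask = np.zeros(gray.shape, dtype=np.uint8)
    pts = np.asarray(region_pts)
    mask[pts[:, 1], pts[:, 0]] = 255      # region coordinates are stored as (x, y)
    sift = cv2.SIFT_create()
    return sift.detect(gray, mask)
```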

Step 8, feature point description:

Obtain the 128-dimensional MSER-based SIFT descriptor of each feature point; the concrete implementation steps are as follows.

Choose the elliptical neighboring region: in the Gaussian image at the scale of the feature point, choose an elliptical neighboring region centered on the feature point that simultaneously satisfies two conditions, on its ellipticity and on its area. The ellipticity of the elliptical neighboring region equals the ellipticity of the MSER fitted region to which the feature point belongs, and the area of the elliptical neighboring region equals 4.5²σ², i.e. 20.25σ², where σ is the characteristic scale of the feature point.

Normalize the elliptical neighboring region: normalize the elliptical neighboring region to a circular neighboring region of radius 4.5σ, where σ is the characteristic scale of the feature point.

Describe the circular neighboring region: describe the circular neighboring region with the SIFT feature descriptor, forming the 128-dimensional MSER-based SIFT descriptor.
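The sketch below outlines these substeps: an elliptical neighborhood with the ellipticity of the feature point's MSER fit is warped to a circular patch of radius 4.5σ and described with SIFT. Taking σ as kp.size / 2 and using a unit-determinant whitening transform derived from M are assumptions made to keep the example short; it is not the patent's exact construction.

```python
import cv2
import numpy as np

def mser_sift_descriptor(gray, kp, M):
    """128-D MSER-based SIFT descriptor of one feature point, an illustrative sketch."""
    sigma = kp.size / 2.0                       # assumed characteristic scale
    r = 4.5 * sigma                             # radius of the circular neighboring region
    H, S, _ = np.linalg.svd(M)
    A = H @ np.diag(np.sqrt(S)) @ H.T           # whitening transform of the MSER fit
    A = A / np.sqrt(np.linalg.det(A))           # keep only the shape (unit determinant)
    side = int(np.ceil(2.0 * r)) + 1
    cx, cy = kp.pt
    shift = np.array([side / 2.0, side / 2.0]) - A @ np.array([cx, cy])
    T = np.hstack([A, shift.reshape(2, 1)])     # affine warp sending kp to the patch center
    patch = cv2.warpAffine(gray, T, (side, side))
    center_kp = [cv2.KeyPoint(side / 2.0, side / 2.0, 2.0 * r)]
    _, desc = cv2.SIFT_create().compute(patch, center_kp)
    return desc[0]
```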

Obtain the 2-dimensional MSER-based spatial descriptor of each feature point; the concrete implementation steps are as follows.

Normalize the MSER region containing the feature point to a unit-circle region.

In this unit-circle region, describe the feature point with the 2-dimensional spatial descriptors l and θ, where l is the distance descriptor of the feature point from the circle center, and θ is the angle descriptor given by the dominant orientation of the feature point minus the direction from the circle center to the feature point.
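A sketch of this spatial descriptor: the feature point is mapped into the unit-circle normalization of its MSER region, l is its distance from the circle center, and θ is the dominant orientation minus the center-to-point direction. Converting kp.angle from degrees and wrapping θ to (−π, π] are assumptions of the example.

```python
import numpy as np

def spatial_descriptor(kp, mu, M):
    """2-D MSER-based spatial descriptor (l, theta) of one feature point."""
    H, S, _ = np.linalg.svd(M)
    W = H @ np.diag(np.sqrt(S)) @ H.T           # maps the fitted ellipse onto the unit circle
    z = W @ (np.asarray(kp.pt) - mu)            # feature point in the unit-circle region
    l = float(np.linalg.norm(z))                # distance descriptor
    radial = np.arctan2(z[1], z[0])             # direction from the circle center to the point
    theta = np.deg2rad(kp.angle) - radial       # angle descriptor
    theta = (theta + np.pi) % (2.0 * np.pi) - np.pi
    return l, theta
```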

Step 9, select fine feature-point matching pairs in the two images with the combined-distance similarity strategy; the concrete implementation steps are as follows.

First, compute according to the following formula the combined distance of any two feature points in corresponding regions of the two input images:

D(p, q) = α₁E₁(p, q) + α₂E₂(p, q) + (1 − α₁ − α₂)E₃(p, q)

where D(p, q) is the combined distance of feature points p and q in corresponding regions of the two input images; α₁ and α₂ are two weighting factors with 0 ≤ α₁ ≤ 1, 0 ≤ α₂ ≤ 1, 0 ≤ α₁ + α₂ ≤ 1 (α₁ = 0.3 and α₂ = 0.6 are adopted in the experiments); E₁(p, q) is the Euclidean distance between the SIFT descriptors of p and q; E₂(p, q) is the distance between their distance descriptors; and E₃(p, q) is the distance between their angle descriptors.
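Written out as code, the combined distance reads as follows; this is a sketch with the experimental weights α₁ = 0.3 and α₂ = 0.6, and E₂ and E₃ are taken as absolute differences of the spatial descriptors, which the patent does not spell out.

```python
import numpy as np

def combined_distance(desc_p, desc_q, l_p, l_q, theta_p, theta_q, a1=0.3, a2=0.6):
    """Combined distance D(p, q) between two feature points from corresponding regions."""
    E1 = float(np.linalg.norm(np.asarray(desc_p, float) - np.asarray(desc_q, float)))
    E2 = abs(l_p - l_q)                         # distance between the distance descriptors
    E3 = abs(theta_p - theta_q)                 # distance between the angle descriptors
    return a1 * E1 + a2 * E2 + (1.0 - a1 - a2) * E3
```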

Second, select fine feature-point matching pairs according to the following formula:

D(k, m) / D(k, n) < T

where D(k, m) is the combined distance between feature point k in one MSER region and feature point m in the matched MSER region, D(k, n) is the combined distance between feature point k and feature point n in the matched MSER region, and T is a threshold with value between 0.7 and 0.85 (T = 0.85 is adopted in the experiments).
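The fine selection can be sketched as a ratio test under the combined distance. Interpreting m and n as the best and second-best candidates in the matched MSER region is an assumption consistent with the nearest-neighbor ratio strategy used elsewhere in the method; points1 and points2 hold per-point feature records, and dist is a callable such as combined_distance above wrapped to take two such records.

```python
def select_fine_matches(points1, points2, dist, T=0.85):
    """Keep the pair (k, m) when D(k, m) / D(k, n) < T, where m and n are the best
    and second-best candidates for k under the combined distance."""
    matches = []
    for i, p in enumerate(points1):
        ranked = sorted((dist(p, q), j) for j, q in enumerate(points2))
        if len(ranked) < 2:
            continue
        (d_m, j_m), (d_n, _) = ranked[0], ranked[1]
        if d_n > 0 and d_m / d_n < T:
            matches.append((i, j_m))
    return matches
```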

The effect of the present invention is further described below in conjunction with the simulation figures, Figs. 2 and 3.

To verify the validity and correctness of the present invention, wide-angle image matching simulation experiments were carried out on two groups of scenes of different types, and the present method was compared in simulation with three existing classical matching methods: the SIFT method, the MSER method and the ASIFT method. All simulation experiments were implemented with Visual Studio 2010 software under the Windows XP operating system.

Simulation 1

Fig. 2(a) shows the grayscale images of six structured-scene views acquired at different viewing angles of 0, 20, 30, 40, 50 and 60 degrees; the image size is 800 × 600, one image is taken as the reference image, and the other images are the images to be matched. Fig. 2(b) shows the number of correct matches for the structured-scene images, with the viewing-angle change on the abscissa and the number of correct match points on the ordinate; Fig. 2(c) shows the correct matching rate for the structured-scene images, with the viewing-angle change on the abscissa and the correct matching rate on the ordinate. From Figs. 2(b) and 2(c) it can be seen that the number of correct matches obtained by the method of the invention is clearly higher than that of the SIFT and MSER methods for all viewing-angle changes, and that the method of the invention achieves a correct matching rate close to 100%. In particular, for a sufficiently large viewing-angle change, namely 60 degrees, the method of the invention obtains 162 correct matching point pairs, while the SIFT and MSER methods obtain 1 and 39 correct matching point pairs respectively. Comparing the method of the invention with the ASIFT method, Figs. 2(b) and 2(c) show that although the method of the invention yields fewer correct matching point pairs than ASIFT, its correct matching rate is clearly higher; the method of the invention therefore obtains a large number of correct matching point pairs together with a higher correct matching rate.

Simulation 2

Fig. 3(a) shows the grayscale images of six textured-scene views acquired at different viewing angles of 0, 20, 30, 40, 50 and 60 degrees; the image size is 800 × 600, one image is taken as the reference image, and the other images are the images to be matched. Fig. 3(b) shows the number of correct matches for the textured-scene images, with the viewing-angle change on the abscissa and the number of correct match points on the ordinate; Fig. 3(c) shows the correct matching rate for the textured-scene images, with the viewing-angle change on the abscissa and the correct matching rate on the ordinate. From Figs. 3(b) and 3(c) it can be seen that, for viewing-angle changes below 40 degrees, the number of correct matches obtained by the method of the invention is comparable to that of the SIFT method and clearly higher than that of the MSER method; as the viewing-angle change increases, the number of correct matches obtained by the method of the invention becomes clearly higher than those of the SIFT and MSER methods. In particular, at a viewing-angle change of 60 degrees, the method of the invention obtains 25 correct matching point pairs, while the MSER method obtains 7 and the SIFT method obtains 4. Comparing the present invention with the ASIFT method, Figs. 3(b) and 3(c) show that although the method of the invention yields fewer correct matching point pairs than ASIFT, its correct matching rate is clearly higher; the method of the invention therefore obtains a sufficient number of correct matching point pairs together with a higher correct matching rate.

Claims (5)

1. A wide-angle image matching method combining region matching and point matching, comprising the steps of:
(1) Input images: input two images related by a large viewing-angle change;
(2) MSER detection:
Perform maximally stable extremal region (MSER) detection on each of the two images, obtaining multiple affine-invariant irregular extremal regions;
(3) MSER fitting:
3a) For each irregular extremal region, compute the points of the fitted region according to the following formula:
(x − μ)ᵀU⁻¹(x − μ) = (x − μ)ᵀM(x − μ) = 1
where x denotes a point of the fitted region, μ denotes the mean of the irregular extremal region, T denotes transposition, U denotes the covariance of the irregular extremal region, and M denotes the second-moment matrix of the irregular extremal region;
3b) Form the elliptical fitted region from all the fitted-region points obtained;
(4) MSER normalization:
4a) Compute the points of the normalized region according to the following formula:
z = sH⁻¹M^(1/2)(x − μ)
where z denotes a point of the normalized region, s denotes the scaling factor, 6 ≤ s ≤ 10, H denotes the real unitary matrix obtained from the singular value decomposition of the second-moment matrix M, M denotes the second-moment matrix of the irregular extremal region, x denotes a point of the fitted region, and μ denotes the mean of the irregular extremal region;
4b) Form the normalized region from all the normalized-region points obtained;
(5) MSER description:
Describe the normalized region with the SIFT feature descriptor, forming a 128-dimensional SIFT descriptor;
(6) MSER matching:
Select initial matching region pairs in the two images with the nearest-neighbor to second-nearest-neighbor ratio strategy;
(7) Scale-invariant feature point detection:
Inside the initial matching regions, detect scale-invariant feature points with the SIFT method, determining the precise position information and characteristic scale information of each feature point;
(8) Feature point description:
8a) Obtain the 128-dimensional MSER-based SIFT descriptor of each feature point;
8b) Obtain the 2-dimensional MSER-based spatial descriptor of each feature point;
(9) Select fine feature-point matching pairs in the two images with the combined-distance similarity strategy.
2. The wide-angle image matching method combining region matching and point matching according to claim 1, characterized in that the nearest-neighbor to second-nearest-neighbor ratio strategy of step (6) means that, in the two images, the nearest-neighbor descriptor distance of a region is divided by its second-nearest-neighbor descriptor distance, and if this ratio is less than 0.85, the region and its nearest-neighbor region are taken as an initial matching region pair.
3. The wide-angle image matching method combining region matching and point matching according to claim 1, characterized in that the 128-dimensional MSER-based SIFT descriptor of step 8a) is obtained as follows:
First, choose the elliptical neighboring region: in the Gaussian image at the scale of the feature point, choose an elliptical neighboring region centered on the feature point that simultaneously satisfies two conditions, on its ellipticity and on its area, wherein the ellipticity of the elliptical neighboring region equals the ellipticity of the MSER fitted region to which the feature point belongs, and the area of the elliptical neighboring region equals 20.25 multiplied by the square of the characteristic scale of the feature point;
Second, normalize the elliptical neighboring region: normalize the elliptical neighboring region to a circular neighboring region of radius 4.5σ, where σ is the characteristic scale of the feature point;
Third, describe the circular neighboring region: describe the circular neighboring region with the SIFT feature descriptor, forming the 128-dimensional MSER-based SIFT descriptor.
4. The wide-angle image matching method combining region matching and point matching according to claim 1, characterized in that the 2-dimensional MSER-based spatial descriptor of step 8b) is obtained as follows:
First, normalize the MSER region containing the feature point to a unit-circle region;
Second, using the spatial relationship between the feature point and the unit-circle region, form the 2-dimensional MSER-based spatial descriptors l and θ, where l is the distance descriptor of the feature point from the circle center, and θ is the angle descriptor given by the dominant orientation of the feature point minus the direction from the circle center to the feature point.
5. The wide-angle image matching method combining region matching and point matching according to claim 1, characterized in that the combined-distance similarity strategy of step (9) comprises the following steps:
First, compute according to the following formula the combined distance of any two feature points in corresponding regions of the two input images:
D(p, q) = α₁E₁(p, q) + α₂E₂(p, q) + (1 − α₁ − α₂)E₃(p, q)
where D(p, q) is the combined distance of feature points p and q in corresponding regions of the two input images; α₁ and α₂ are two weighting factors with 0 ≤ α₁ ≤ 1, 0 ≤ α₂ ≤ 1, 0 ≤ α₁ + α₂ ≤ 1; E₁(p, q) is the Euclidean distance between the SIFT descriptors of p and q; E₂(p, q) is the distance between their distance descriptors; and E₃(p, q) is the distance between their angle descriptors;
Second, select fine feature-point matching pairs according to the following formula:
D(k, m) / D(k, n) < th
where D(k, m) is the combined distance between feature point k in one MSER region and feature point m in the matched MSER region, D(k, n) is the combined distance between feature point k and feature point n in the matched MSER region, and th is a threshold with value between 0.7 and 0.85.
CN201310325400.5A 2013-07-22 2013-07-22 Wide-angle image matching method combining region matching and point matching CN103400384B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201310325400.5A | 2013-07-22 | 2013-07-22 | Wide-angle image matching method combining region matching and point matching


Publications (2)

Publication Number | Publication Date
CN103400384A (en) | 2013-11-20
CN103400384B | 2016-05-25

Family

ID=49563996

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201310325400.5A | Wide-angle image matching method combining region matching and point matching | 2013-07-22 | 2013-07-22

Country Status (1)

Country Link
CN (1) CN103400384B (en)


Citations (3)

Publication number | Priority date | Publication date | Assignee | Title
CN101140624A | 2007-10-18 | 2008-03-12 | 清华大学 | Image matching method
CN101667293A | 2009-09-24 | 2010-03-10 | 哈尔滨工业大学 | Method for conducting high-precision and steady registration on diversified sensor remote sensing images
CN102270303A | 2011-07-27 | 2011-12-07 | 重庆大学 | Joint detection method for sensitive images



Also Published As

Publication number Publication date
CN103400384A (en) 2013-11-20


Legal Events

Code | Title
PB01 | Publication
C06 | Publication
SE01 | Entry into force of request for substantive examination
C10 | Entry into substantive examination
GR01 | Patent grant
C14 | Grant of patent or utility model