CN103324948B - A robust line-feature-based matching method for low-altitude remote sensing images - Google Patents

A robust line-feature-based matching method for low-altitude remote sensing images

Info

Publication number: CN103324948B (application number CN201310272696.9A)
Authority: CN (China)
Prior art keywords: image, line segment, intersection point
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.): Active
Other languages: Chinese (zh)
Other versions: CN103324948A (en)
Inventors: 邵振峰 (Shao Zhenfeng), 郭舒 (Guo Shu), 贺晓波 (He Xiaobo), 陈敏 (Chen Min)
Current and original assignee: Wuhan University (WHU)
Application CN201310272696.9A filed by Wuhan University (WHU); published as application CN103324948A; application granted and published as CN103324948B; legal status: Active

Abstract

A robust line-feature-based matching method for low-altitude remote sensing images, comprising the following steps. Step 1: extract the line segments in the images, where the images comprise a reference image and an image to be matched. Step 2: for the reference image and the image to be matched respectively, use the saliency and length of each segment to divide all segments obtained in step 1 into salient and non-salient segments. Step 3: match the salient segments of the reference image and the image to be matched with an image matching method based on segment groups, and add the successfully matched salient segments to the non-salient segment set. Step 4: from the salient segments matched in step 3, estimate the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched. Step 5: match the non-salient segments with an image matching method based on coplanar segment intersections. The method makes full use of the segment features in the image and the texture information of their possible local feature regions, improving both the efficiency and the success rate of image matching.

Description

A robust line-feature-based matching method for low-altitude remote sensing images
Technical field
The invention belongs to the technical field of surveying and mapping, and specifically relates to a robust line-feature-based matching method for low-altitude remote sensing images.
Background art
Low-altitude remote sensing has become a strong complement to traditional aerial remote sensing because of its low cost and the high resolution of the images it acquires, and it has broad application prospects in many fields. In smart-city construction, the high resolution of UAV low-altitude imagery makes large-scale urban digital mapping feasible; in disaster monitoring and emergency response, low-altitude remote sensing is flexible, convenient and near real-time, providing a favourable guarantee for quickly acquiring imagery of a specific region. However, low-altitude images have small frame sizes and large distortions, so traditional image processing methods struggle to meet practical requirements in both efficiency and accuracy; in the image matching stage in particular, traditional matching methods cannot obtain ideal results on low-altitude imagery. Mature point-feature matching methods are mostly suited to images with relatively rich texture; on the low-texture images common in low-altitude remote sensing, keypoints are hard to detect and feature descriptors are not distinctive enough, so the mismatch rate rises or matching fails altogether, whereas line features can address this problem well.
Matching images with line features mainly faces the following problems: (1) differences between the images make the appearance of the same segment inconsistent across images, so it is difficult to describe a segment consistently, and matching segments by endpoint or length information can produce unforeseen errors; (2) segments lack a global geometric constraint to restrict the matching process, which adds to the matching difficulty.
Existing line-feature matching methods fall roughly into three classes. The first matches line features using the camera's geometric parameters: early methods mostly relied on these parameters to constrain the matching process, but the camera geometry is often unknown, which increases the difficulty of matching. The second matches line features using the grey-level information of the segment neighbourhood: such algorithms need no knowledge of the camera parameters and match only by the grey or colour information of pixels near the segment, but they are computationally expensive and very sensitive to illumination and geometric changes, making them unsuitable for images with large differences in imaging conditions. The third matches line features using the geometric relationships between segments: this avoids the mismatches caused by missing or miscalculated camera parameters and improves robustness under illumination changes, so it has great practical significance; however, such methods require topological-relationship computation or analysis of large neighbourhood regions, so their storage and computation costs are considerable.
Summary of the invention
Aiming at the deficiencies of existing line-feature-based matching methods, the present invention proposes a robust line-feature-based matching method for low-altitude remote sensing images. The method makes full use of the segment features in the image and the texture information of their possible local feature regions, improving both the efficiency of image matching and the success rate of the matching result.
The technical solution adopted by the present invention is a robust line-feature-based matching method for low-altitude remote sensing images, comprising the following steps:
Step 1: extract the line segments in the images, where the images comprise a reference image and an image to be matched;
Step 2: for the reference image and the image to be matched respectively, use the saliency and length of each segment to divide all segments obtained in step 1 into salient and non-salient segments, the non-salient segments forming a non-salient segment set;
Step 3: match the salient segments of the reference image and the image to be matched with an image matching method based on segment groups, and add the successfully matched salient segments to the non-salient segment set;
Step 4: from the salient segments matched in step 3, estimate the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched;
Step 5: match the non-salient segments in the non-salient segment set obtained in step 3 with an image matching method based on coplanar segment intersections, comprising the following sub-steps:
Step 5.1: for the reference image and the image to be matched respectively, find the intersections with non-salient segments in the neighbourhood of each salient segment matched in step 3;
Step 5.2: generate an intersection descriptor for each intersection obtained in step 5.1, and obtain the principal direction of each intersection;
Step 5.3: based on the initial transformation matrix and relative rotation angle obtained in step 4, match the intersections of the reference image and the image to be matched according to the intersection descriptors and principal directions obtained in step 5.2, yielding the matched intersections;
Step 5.4: use the correspondence between each matched intersection obtained in step 5.3 and its non-salient segment to obtain the successfully matched non-salient segments; all successfully matched non-salient segments, together with the salient segments matched in step 3, form the final matched segment set.
Furthermore, in step 2 the segments in an image are divided into salient and non-salient segments as follows.
For any segment l in the image, take each pixel l(x_n, y_n) of l in turn as a starting point. Search along the gradient direction of l(x_n, y_n) for the nearest neighbouring segment and compute the distance from l(x_n, y_n) to the segment found, denoted d_n^1; then search along the opposite gradient direction for the nearest neighbouring segment and compute the distance from l(x_n, y_n) to the segment found, denoted d_n^2. Take the larger of the two as the support-region distance d_n of pixel l(x_n, y_n). If the length of segment l is N, then n takes the values 1, 2, ..., N.
Let gr_n be the gradient of pixel l(x_n, y_n) in segment l, and let gr̄ be the mean gradient of all pixels in l. The saliency s of segment l is computed as:

    s = Σ_{n=1}^{N} (gr_n − gr̄) · d_n

After the saliency of every segment in the image has been obtained, compute the mean saliency s̄ and the mean length l̄ of all segments in the image. Any segment l whose saliency and length both exceed these means is labelled a salient segment; otherwise it is labelled a non-salient segment.
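For illustration only (not part of the patent text), the saliency computation and the salient/non-salient split above can be sketched in Python. The function names and the segment representation (per-pixel gradients, per-pixel support distances, length) are assumptions for this sketch:

```python
import numpy as np

def significance(gradients, supports):
    """Saliency s of one segment: sum over its N pixels of
    (pixel gradient minus the segment's mean gradient) times the
    pixel's support-region distance d_n."""
    gr = np.asarray(gradients, dtype=float)
    d = np.asarray(supports, dtype=float)
    return float(np.sum((gr - gr.mean()) * d))

def classify(segments):
    """Split segments into salient / non-salient by comparing each
    segment's saliency and length against the image-wide means.
    Each segment is a (gradients, supports, length) tuple -- an
    assumed input format."""
    s = np.array([significance(g, d) for g, d, _ in segments])
    lengths = np.array([n for _, _, n in segments], dtype=float)
    s_mean, len_mean = s.mean(), lengths.mean()
    salient = [i for i in range(len(segments))
               if s[i] > s_mean and lengths[i] > len_mean]
    non_salient = [i for i in range(len(segments)) if i not in salient]
    return salient, non_salient
```

Note that a segment with uniform support distances has zero saliency, since the gradient deviations from the segment mean sum to zero; saliency therefore rewards segments whose strong-gradient pixels also have large support regions.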
Furthermore, step 4 is implemented as follows.
Let {L_1, L_2, L_3, ..., L_μ} be the set of salient segments successfully matched in the reference image and {L'_1, L'_2, L'_3, ..., L'_μ} be the set of salient segments successfully matched in the image to be matched, where μ is the number of matched segment pairs between the two images and (L_w, L'_w) is the w-th matched segment pair, w = 1, 2, ..., μ.
Let I_p be the position in the reference image of the intersection of a segment pair (L_i, L_j) from {L_1, ..., L_μ}, and let I'_p be the position in the image to be matched of the intersection of the corresponding segment pair (L'_i, L'_j) from {L'_1, ..., L'_μ}, with i = 1, 2, ..., μ, j = 1, 2, ..., μ, i ≠ j.
Regard I_p and I'_p as a pair of matched points and store them as the point pair (I_p, I'_p). Processing the corresponding segment pairs of {L_1, ..., L_μ} and {L'_1, ..., L'_μ} in turn yields a set of matched point pairs, from which the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched are estimated with the RANSAC algorithm.
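For illustration only, a minimal RANSAC sketch in the spirit of step 4 is given below. It estimates a 2-D similarity transform (rotation, scale, translation) from matched intersection points and extracts the relative rotation angle; the patent itself estimates a general transformation matrix, so this simplified model and all function names are assumptions of the sketch:

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform mapping src -> dst,
    both (N, 2) arrays.  Uses the complex-number trick: each point is
    x + iy, and the rotation+scale is a single complex gain."""
    sc, dc = src.mean(0), dst.mean(0)
    s0, d0 = src - sc, dst - dc
    a = s0[:, 0] + 1j * s0[:, 1]
    b = d0[:, 0] + 1j * d0[:, 1]
    g = (np.conj(a) @ b) / (np.conj(a) @ a)   # complex gain = scale*e^{i*angle}
    R = np.array([[g.real, -g.imag], [g.imag, g.real]])
    t = dc - R @ sc
    return R, t

def ransac_transform(src, dst, n_iter=200, tol=2.0, seed=0):
    """RANSAC over matched intersection pairs: fit from minimal 2-point
    samples, keep the model with the most inliers, refit on all inliers.
    Returns (R, t, rotation angle in degrees)."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_iter):
        idx = rng.choice(len(src), 2, replace=False)
        R, t = fit_similarity(src[idx], dst[idx])
        err = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inliers = err < tol
        if best is None or inliers.sum() > best.sum():
            best = inliers
    R, t = fit_similarity(src[best], dst[best])
    angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    return R, t, angle
```

In a full homography setting the minimal sample would be four point pairs rather than two; the two-point similarity model keeps the sketch short while showing the sample/score/refit structure of RANSAC.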
Furthermore, in step 5.1 the neighbourhood of a salient segment is constructed as follows.
Let p_1 and p_2 be the two endpoints of a salient segment L_s, and let d_th be a preset distance threshold. First extend L_s beyond its two endpoints p_1 and p_2 to points a and b respectively, with |ap_1| = |bp_2| = d_th. Then construct two semicircles of radius d_th centred at a and b, each symmetric about the line p_1p_2, and connect the endpoints of the two semicircles on the same side of the line p_1p_2; the resulting closed region is the neighbourhood of the salient segment L_s.
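For illustration only, the "stadium"-shaped neighbourhood above reduces to a simple point-in-region test: a point belongs to the neighbourhood iff its distance to the segment extended by d_th beyond each endpoint is at most d_th. A minimal sketch (function name assumed):

```python
import numpy as np

def in_neighborhood(p, p1, p2, d_th):
    """True if point p lies in the closed region around segment p1-p2:
    two semicircles of radius d_th at the extended endpoints a and b,
    joined by a band of half-width d_th about the line p1-p2."""
    p, p1, p2 = (np.asarray(q, float) for q in (p, p1, p2))
    u = (p2 - p1) / np.linalg.norm(p2 - p1)   # unit direction of the segment
    a, b = p1 - d_th * u, p2 + d_th * u       # endpoints extended by d_th
    t = np.clip(np.dot(p - a, u), 0.0, np.linalg.norm(b - a))
    closest = a + t * u                       # nearest point on segment a-b
    return np.linalg.norm(p - closest) <= d_th
```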
Furthermore, the intersection descriptor obtained in step 5.2 is Descr = {Position, Gradient}, where Position is the coordinate position of the intersection in the image and Gradient represents the features of the local region centred at the intersection.
Step 5.3 is implemented as follows.
(1) Matching of intersection position information: use the initial transformation matrix obtained in step 4 to compute, from the coordinate position Position of an intersection on the reference image, the corresponding position coordinate A on the image to be matched; take a circular region of radius dis (a preset value) centred at A, and retain the candidate intersections of the image to be matched that fall inside this circle, hereinafter called retained intersections. (2) Matching of intersection local features, implemented as follows.
Let θ be the principal direction of an intersection on the reference image, θ' the principal direction of a corresponding retained intersection on the image to be matched, and φ the relative rotation angle between the reference image and the image to be matched obtained in step 4. First judge whether θ' lies within [θ + φ − δ, θ + φ + δ], where δ is a preset angle threshold. If it does, compute the Euclidean distance between the local features Gradient of the two intersections; otherwise discard the retained intersection. Finally, take the retained intersection with the smallest computed Euclidean distance to form a matched intersection.
The beneficial effects of the technical solution provided by the invention are: (1) dividing the segments in the image into salient and non-salient segments and matching the two classes separately avoids exhaustive iteration over all straight-segment features and improves the success rate of image matching; (2) describing and matching segments by the geometric relationships between them avoids the unnecessary errors caused by unknown camera parameters or by errors made while estimating them; (3) using the intersections of the two classes of segments in place of the segments themselves improves matching efficiency; (4) using the coarse matching result to constrain the subsequent fine matching improves matching accuracy. Moreover, in the intersection matching stage, any point matching algorithm that meets the accuracy and efficiency requirements can be incorporated into the technical solution of the invention.
Brief description of the drawings
Fig. 1 is the flow chart of the embodiment of the present invention.
Fig. 2 is a schematic diagram of the line-pair descriptor of the embodiment.
Fig. 3 is a schematic diagram of the intersection search of the embodiment.
Fig. 4 is a schematic diagram of the intersection-position matching constraint of the embodiment.
Detailed description of the embodiments
To better understand the technical solution of the present invention, the invention is described in further detail below with reference to the drawings. The embodiment uses unmanned-airship images taken with a Canon EOS 5D Mark II at a resolution of 0.05 m. With reference to Fig. 1, the steps of the embodiment are as follows:
Step 1, line feature extraction.
The embodiment first applies the Canny operator to the reference image and the image to be matched for edge detection, then uses the Hough transform to obtain the segment sets of the two images; if they contain m and n segments respectively, the sets are denoted {l_1, l_2, l_3, ..., l_m} and {l'_1, l'_2, l'_3, ..., l'_n}. Canny edge detection and Hough-transform line detection are conventional methods in the field of remote sensing image processing, and their details are not repeated here.
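For illustration only, the Hough voting scheme used for line detection can be sketched in NumPy. This toy version takes an already-binarised edge mask (the Canny step is omitted), votes each edge pixel into a (rho, theta) accumulator, and returns the strongest line parameters; the function name and interface are assumptions of the sketch:

```python
import numpy as np

def hough_lines(edges, n_theta=180, top=1):
    """Toy Hough transform on a binary edge mask.  Each edge pixel (x, y)
    votes for rho = x*cos(theta) + y*sin(theta) over a grid of thetas;
    the strongest accumulator cells give (rho, theta) line parameters."""
    ys, xs = np.nonzero(edges)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*edges.shape)))        # max |rho|
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    rhos = np.round(xs[:, None] * np.cos(thetas) +
                    ys[:, None] * np.sin(thetas)).astype(int) + diag
    for j in range(n_theta):                           # vote per theta column
        np.add.at(acc[:, j], rhos[:, j], 1)
    peaks = np.argsort(acc, axis=None)[::-1][:top]
    r, t = np.unravel_index(peaks, acc.shape)
    return [(int(ri) - diag, float(thetas[ti])) for ri, ti in zip(r, t)]
```

A production pipeline would use a library implementation (e.g. a probabilistic Hough variant that returns finite segments rather than infinite lines), but the accumulator idea is the same.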
Step 2, segment saliency discrimination.
For the segment set of the reference image and the segment set of the image to be matched extracted in step 1, the embodiment applies the following procedure to each image to obtain its salient and non-salient segments; the non-salient segments form the initial non-salient segment set. The specific implementation is as follows.
For a segment l in the reference image, take each pixel l(x_n, y_n) of l in turn as a starting point. Search along the gradient direction of l(x_n, y_n) for the nearest neighbouring segment and compute the distance from l(x_n, y_n) to it, denoted d_n^1; then search along the opposite gradient direction and compute the distance to the nearest segment found, denoted d_n^2; take the larger of the two as the support-region distance d_n of pixel l(x_n, y_n). If the length of segment l is N, i.e. it comprises N pixels, then l has N pixel support-region distances and n takes the values 1, 2, ..., N. Let gr_n be the gradient of pixel l(x_n, y_n) and gr̄ the mean gradient of all pixels of the segment; the saliency s of segment l is computed with formula (1):

    s = Σ_{n=1}^{N} (gr_n − gr̄) · d_n    (1)

The above procedure yields the saliency of every segment in the reference image. Then compute the mean saliency s̄ and the mean length l̄ of all segments in the reference image, and judge each segment of the reference image against condition (2), i.e. whether its saliency and length both exceed these means; if it does, the segment is labelled salient, and if not, it is labelled non-salient.
The same procedure is applied to the image to be matched to obtain its salient and non-salient segments.
Step 3, salient segment matching.
Step 3 of the embodiment processes the salient segments obtained in step 2 with the existing LS (Line Signature) method, i.e. matching with an image matching method based on segment groups, to obtain the salient segments successfully matched between the reference image and the image to be matched, which form the coarse matching result; the successfully matched salient segments are then also treated as non-salient segments and added to the non-salient segment set. For ease of implementation, the implementation of the embodiment is given as follows.
Step 3.1, segment grouping. The embodiment groups salient segments by the proximity of their endpoints, forming the segment groups that serve as matching units, as follows:
(1) Based on the specific matching application, a preset integer threshold k limits the number of segments in each group; the embodiment sets k = 3. Each salient segment is regarded as a centre segment, and the two endpoints of the centre segment are regarded as centre points. Each centre point corresponds to one segment group, so each salient segment corresponds to two groups.
(2) For each centre point of a centre segment, compute the distances to the endpoints of the other salient segments, and select, in order of increasing distance, k salient segments whose saliency is not less than s × 0.5, where s is the saliency of the centre segment.
(3) The centre segment and the k segments found form one segment group containing k + 1 segments, which serves as a primitive of image matching.
Step 3.2, segment group description. The centre segment of a group is paired in turn with each of the other k segments, and the group is described by the geometric relationships within each segment pair, generating the group descriptor desc = {v_1, v_2, ..., v_k}, where v_1, v_2, ..., v_k are the descriptors of the k segment pairs. In the embodiment desc = {v_1, v_2, v_3}. The descriptor of each segment pair can be designed by those skilled in the art; for example, a 13-dimensional descriptor v = {r_{1~2}, l_{1~5}, θ_{1~5}, g} may be adopted, introduced here with reference to Fig. 2. Let the segment q_1q_2 and the centre segment p_1p_2 intersect at point c; r_1 and r_2 are two ratios determined by the position of the intersection c relative to the two segments (see Fig. 2), and the remaining parameters are: l_1 = |q_1q_2|/|p_1p_2|, l_2 = |q_1p_1|/|p_1p_2|, l_3 = |q_1p_2|/|p_1p_2|, l_4 = |q_2p_1|/|p_1p_2|, l_5 = |q_2p_2|/|p_1p_2|; g = g_1/g_2 represents the gradient relation between the segments, g_1 and g_2 being the mean gradients of the two segments; θ_1, θ_2, θ_3, θ_4, θ_5, shown in Fig. 2, are the angles between the centre segment p_1p_2 and, respectively, the segment q_1q_2 and the four endpoint-connection lines q_1p_1, q_1p_2, q_2p_1 and q_2p_2.
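For illustration only, the length-ratio and angle part of the pair descriptor above can be computed directly from the endpoints. This sketch omits the r_1, r_2 intersection ratios (whose exact definition is only given by Fig. 2) and takes the mean gradients g_1, g_2 as given; the function names are assumptions:

```python
import numpy as np

def angle_between(u, v):
    """Unsigned angle between two direction vectors, in [0, pi/2]."""
    c = abs(np.dot(u, v)) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(c, -1.0, 1.0)))

def pair_descriptor(p1, p2, q1, q2, g1, g2):
    """Ratios l1..l5 normalise |q1q2| and the four endpoint-connection
    lengths by the centre-segment length |p1p2|; theta1..theta5 are the
    angles of those five lines against the centre segment; g = g1/g2 is
    the mean-gradient ratio of the two segments."""
    p1, p2, q1, q2 = (np.asarray(a, float) for a in (p1, p2, q1, q2))
    base = np.linalg.norm(p2 - p1)
    l = [np.linalg.norm(q2 - q1) / base,
         np.linalg.norm(p1 - q1) / base,   # |q1 p1|
         np.linalg.norm(p2 - q1) / base,   # |q1 p2|
         np.linalg.norm(p1 - q2) / base,   # |q2 p1|
         np.linalg.norm(p2 - q2) / base]   # |q2 p2|
    axis = p2 - p1
    dirs = [q2 - q1, q1 - p1, q1 - p2, q2 - p1, q2 - p2]
    theta = [angle_between(axis, d) for d in dirs]
    return l, theta, g1 / g2
```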
Step 3.3, descriptor similarity measurement: the similarity of two whole segment groups is determined by computing the similarity between each segment pair of the two groups.
The embodiment uses a threshold to divide the descriptors into two types and computes the similarity with a quasi-projective-invariant formula or an affine similarity formula respectively, then matches the segment groups. Let a segment-pair descriptor of a group in the reference image be v = {r_{1~2}, l_{1~5}, θ_{1~5}, g}, and let the corresponding segment-pair descriptor in the image to be matched be v' = {r'_{1~2}, l'_{1~5}, θ'_{1~5}, g'}. The threshold T_r = 0.3 divides the similarity measurement of a segment pair into two cases, as follows:
(1) If |r_1 − r'_1| and |r_2 − r'_2| are both less than or equal to T_r, the similarity S_a is computed with formula (3) from the following components, which are threshold-like terms set to improve the matching result and can exclude the mismatches caused by unrealistic distortions; their values are given by formula (4):

    d_{r_u} = 1 − |r_u − r'_u| / T_r,  u ∈ {1, 2}
    d_{θ_1} = 1 − |θ_1 − θ'_1| / (π/2)
    d_{l_1} = 1 − (max(l_1, l'_1) / min(l_1, l'_1) − 1) / 3
    d_g = 1 − (max(g, g') / min(g, g') − 1) / 2    (4)
(2) If |r_1 − r'_1| or |r_2 − r'_2| is greater than T_r, the similarity S_g is computed with formula (5) from the following components, where d_g likewise excludes the mismatches caused by unrealistic distortions, computed as in formula (6):

    d_{l_v} = 1 − (max(l_v, l'_v) / min(l_v, l'_v) − 1) / 3
    d_{θ_v} = 1 − |θ_v − θ'_v| / (π/2)
    d_g = 1 − (max(g, g') / min(g, g') − 1) / 3,  v ∈ [1, 5]    (6)
Finally, formula (7) combines the two measures into the final similarity S_LP of the two segment pairs; in it, the weight of S_g is reduced so that, while the influence of other transformations is weakened, the affine-transformation constraint is strengthened and segment pairs satisfying an affine transformation are found more easily.
Suppose a segment-group descriptor in the reference image of the embodiment is desc = {v_1, v_2, v_3} and the corresponding group descriptor in the image to be matched is desc' = {v'_1, v'_2, v'_3}. There are six possible combinations of pair correspondences between the two descriptors, namely: {(v_1, v'_1), (v_2, v'_2), (v_3, v'_3)}, {(v_1, v'_1), (v_2, v'_3), (v_3, v'_2)}, {(v_1, v'_2), (v_2, v'_1), (v_3, v'_3)}, {(v_1, v'_2), (v_2, v'_3), (v_3, v'_1)}, {(v_1, v'_3), (v_2, v'_1), (v_3, v'_2)} and {(v_1, v'_3), (v_2, v'_2), (v_3, v'_1)}. For each combination, compute the similarity S_LP of its three segment pairs with the above procedure and take their sum, obtaining the similarity of the six assignments; the maximum of these is taken as the final similarity of the group descriptors desc and desc'.
For a segment group in the reference image, compute in turn its similarity with every segment group in the image to be matched, and regard the group with the maximum similarity as the group matching the reference group; then match the individual segments within the two groups according to the relations between the centre segment and the other segments, obtaining the individually matched segments. All successfully matched segments form the matched salient-segment set. The matched segments of the reference image are added to the non-salient segment set of the reference image obtained in step 2, and the matched segments of the image to be matched are added to the non-salient segment set of the image to be matched obtained in step 2.
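For illustration only, the best-assignment search over the six combinations of step 3.3 is a maximisation over permutations. A minimal sketch, with the pair-similarity measure passed in as a function (names assumed):

```python
from itertools import permutations

def group_similarity(desc, desc_p, pair_sim):
    """Similarity of two segment groups: try every assignment between
    the k pair-descriptors (k = 3 gives the six combinations), sum the
    pairwise similarities S_LP, and keep the best assignment."""
    best_sum, best_assign = float("-inf"), None
    for perm in permutations(range(len(desc_p))):
        total = sum(pair_sim(desc[i], desc_p[j])
                    for i, j in enumerate(perm))
        if total > best_sum:
            best_sum, best_assign = total, perm
    return best_sum, best_assign
```

Enumerating all k! assignments is cheap for k = 3; for larger groups an assignment solver would be preferable.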
Step 4, estimating the initial transformation matrix (i.e. the initial homography) and the relative rotation angle between the images.
Step 4 of the embodiment processes the segment set successfully matched in step 3 to obtain the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched. The specific implementation is as follows.
Let {L_1, L_2, L_3, ..., L_μ} be the set of salient segments successfully matched in the reference image and {L'_1, L'_2, L'_3, ..., L'_μ} be the set of salient segments successfully matched in the image to be matched, where μ is the number of matched segment pairs between the two images and (L_w, L'_w) is the w-th matched segment pair, w = 1, 2, ..., μ. Let I_p be the position in the reference image of the intersection of a segment pair (L_i, L_j) of {L_1, ..., L_μ}, and let I'_p be the position in the image to be matched of the intersection of the corresponding pair (L'_i, L'_j) of {L'_1, ..., L'_μ}, with i = 1, 2, ..., μ, j = 1, 2, ..., μ, i ≠ j. Regard I_p and I'_p as a pair of matched points and store them as the point pair (I_p, I'_p); processing the corresponding segment pairs of {L_1, ..., L_μ} and {L'_1, ..., L'_μ} in turn yields a set of matched point pairs. The RANSAC algorithm, with this matched point-pair set as its input, then yields the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched. RANSAC (Random Sample Consensus) is a conventional method in the field of remote sensing image processing, and its details are not repeated here.
Step 5, non-salient segment matching.
Step 5 of the embodiment processes the non-salient segments of the reference image and the image to be matched, matching them with a prior-art image matching method based on coplanar segment intersections. The non-salient segments here are all the segments in the non-salient segment set obtained in step 3. There are four sub-steps: steps 5.1 and 5.2 perform the same intersection search and description on the reference image and on the image to be matched respectively, and steps 5.3 and 5.4 find the matching intersections between the reference image and the image to be matched and obtain the final matched segment set. The implementation is as follows.
Step 5.1, intersection search on the reference image and the image to be matched. In each image, take every salient segment successfully matched in step 3 (i.e. L_w or L'_w) as a centre, find within a certain neighbourhood around it the coplanar non-salient segments that may intersect it, and compute the intersections. The principle is shown in Fig. 3. Let L_s be a successfully matched salient segment with endpoints p_1 and p_2, and suppose there are several non-salient segments around it (as in Fig. 3); let d_th be an empirically preset distance threshold. First extend L_s beyond its endpoints p_1 and p_2 to points a and b, with |ap_1| = |bp_2| = d_th; then construct two semicircles of radius d_th centred at a and b, symmetric about the line p_1p_2, and connect their endpoints on the same side of the line p_1p_2 to form a closed region, the neighbourhood used for the judgement. Judge whether any pixel in this region belongs to a non-salient segment; if so, compute the intersection of that non-salient segment with L_s (three cases can occur: the intersection lies on the non-salient segment, on L_s, or on the extensions of the non-salient segment and L_s). If this intersection also lies within the region, mark the non-salient segment with the corresponding intersection and record the position of the intersection in the image. In Fig. 3, some segments are non-salient segments to be matched within the neighbourhood of L_s, while others are rejected because they do not fall within the neighbourhood. After this step, each coarsely matched salient segment obtains a set of intersections, and each intersection corresponds to one non-salient segment.
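For illustration only, the intersection computation of step 5.1 is the standard intersection of the two infinite lines through the segments, which also covers the "on the extension" cases described above. A minimal sketch (function name assumed):

```python
import numpy as np

def line_intersection(p1, p2, q1, q2):
    """Intersection of the infinite lines through segments p1-p2 and
    q1-q2; the point may lie on either segment or on an extension.
    Returns None for (near-)parallel lines."""
    p1, p2, q1, q2 = (np.asarray(a, float) for a in (p1, p2, q1, q2))
    d1, d2 = p2 - p1, q2 - q1
    denom = d1[0] * d2[1] - d1[1] * d2[0]      # 2-D cross product d1 x d2
    if abs(denom) < 1e-12:
        return None                             # parallel: no intersection
    # Solve p1 + t*d1 = q1 + s*d2 for t by crossing both sides with d2.
    t = ((q1[0] - p1[0]) * d2[1] - (q1[1] - p1[1]) * d2[0]) / denom
    return p1 + t * d1
```

Whether the returned point is accepted would then be decided by the neighbourhood test of step 5.1.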
Step 5.2, generating intersection descriptors from the intersections obtained in step 5.1. This process uses the geometric invariance and local illumination invariance of the intersections to describe them consistently for the subsequent intersection matching; the resulting descriptor is written Descr = {Position, Gradient}. Position is the coordinate position of the intersection in the image, computed during the intersection search; Gradient represents the features of a local region centred at the intersection (a size of 16 × 16 pixels is suggested) and can be obtained with existing mature methods. The embodiment uses the SIFT (Scale-Invariant Feature Transform) method to compute the feature information of the local region: the image coordinates of the intersection are input, and the principal direction of the intersection and the corresponding intersection descriptor are obtained. SIFT is a classical method in the image matching field, and its details are not repeated here.
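For illustration only, the principal-direction computation can be sketched with a SIFT-style orientation histogram. This is a simplified stand-in for SIFT's orientation assignment (no Gaussian weighting, no peak interpolation); the function name is an assumption:

```python
import numpy as np

def principal_direction(patch, n_bins=36):
    """Dominant gradient orientation of a local patch: build an n_bins
    histogram of gradient angles weighted by gradient magnitude and
    return the centre angle (degrees) of the peak bin."""
    gy, gx = np.gradient(patch.astype(float))   # row- and column-gradients
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 360.0
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 360), weights=mag)
    peak = int(np.argmax(hist))
    return (peak + 0.5) * (360.0 / n_bins)
```

In the embodiment, this direction is what the rotation-angle gate of step 5.3(2) compares between the two images.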
Step 5.3, matches the intersection descriptors. This process consists of two parts, the matching of intersection position information and the matching of intersection local features, as follows:
(1) Matching of intersection position information: in theory, if two intersections match, their positions satisfy the initial homography within a certain tolerance. The initial transformation matrix between the reference image and the image to be matched obtained in step 4 is therefore used to constrain this process and exclude obviously mismatched intersections; the process is described in detail with reference to Fig. 4. In Fig. 4, L_s' and the segment around it are, respectively, the significant segment and the non-significant segment corresponding to an intersection to be matched on the image to be matched, and dis is an empirically preset position tolerance. First, the initial transformation matrix maps the coordinate Position of an intersection on the reference image to the corresponding position on the image to be matched, point A in Fig. 4. Next, a circular region of radius dis centered at A is constructed on the image to be matched, as shown in Fig. 4; only the candidate intersections on the image to be matched that fall inside this circle are retained (hereinafter called retained intersections) and passed to the local-feature matching of the next step.
(2) Matching of intersection local features: the Euclidean distance is used to match the local features Gradient of the intersections retained in step 5.3(1) on the image to be matched. In this process, the relative rotation angle θ between the images obtained in step 4 is used to narrow the search range and accelerate matching. Let φ be the principal direction of an intersection on the reference image and φ' the principal direction of a corresponding retained intersection on the image to be matched. First check whether φ' − φ lies within [θ − δ, θ + δ], where δ is an empirically preset angle threshold; if so, compute the Euclidean distance between the local features Gradient of the two intersections, otherwise discard the retained intersection. For each intersection on the reference image, the retained intersection on the image to be matched with the smallest computed Euclidean distance forms a matching intersection pair with it.
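Steps (1) and (2) can be sketched together as follows, assuming a 3 × 3 homography H estimated in step 4, a rotation angle theta in degrees, and intersections stored as (position, principal_direction, gradient_descriptor) triples; the thresholds dis and delta, the data layout, and the function names are all assumptions made for this example.

```python
import math

def project(H, pt):
    """Apply a 3x3 homography (row-major nested lists) to point pt."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def match_intersections(ref_pts, cand_pts, H, theta, dis=10.0, delta=15.0):
    """ref_pts / cand_pts: lists of (position, principal_dir_deg, descriptor).
    Position gate: a candidate must fall within radius dis of H * ref_position.
    Direction gate: the principal-direction difference must lie in
    [theta - delta, theta + delta].  Among the survivors, the candidate with
    the minimum Euclidean distance between descriptors wins.
    Returns (ref_index, cand_index) pairs."""
    matches = []
    for i, (p, phi, g) in enumerate(ref_pts):
        a = project(H, p)                         # predicted position A
        best, best_d = None, float("inf")
        for j, (q, phi2, g2) in enumerate(cand_pts):
            if math.hypot(q[0] - a[0], q[1] - a[1]) > dis:
                continue                          # outside circular gate
            diff = (phi2 - phi - theta + 180.0) % 360.0 - 180.0
            if abs(diff) > delta:
                continue                          # fails rotation gate
            d = math.sqrt(sum((u - v) ** 2 for u, v in zip(g, g2)))
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            matches.append((i, best))
    return matches
```

The two gates discard most candidates cheaply before any descriptor distance is evaluated, which is the speed-up the patent attributes to the constraints from step 4.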
Step 5.4, uses the correspondences between the matching intersections obtained in step 5.3 and the non-significant segments to obtain the successfully matched non-significant segments; all successfully matched non-significant segments, together with the significant segments successfully matched in step 3, form the final set of matched segments.
Compared with traditional linear-feature matching methods, the method of the present invention has clear advantages: it achieves both a higher feature repetition rate and a higher matching probability, making it a feasible linear-feature matching method.
The above embodiment serves only as a further illustration of the present invention and does not limit it; the specific implementation of the invention is not restricted to the above description. Those skilled in the art will appreciate that various modifications of detail may be made without departing from the scope defined by the appended claims, and all equivalent technical schemes likewise fall within the scope of the present invention.

Claims (4)

1. A robust matching method for low-altitude remote sensing images based on linear features, characterized by comprising the following steps:
Step 1, extracting the line segments in the images, the images comprising a reference image and an image to be matched;
Step 2, for the reference image and the image to be matched respectively, using the saliency and length of the segments to classify all segments obtained in step 1 as significant or non-significant, the non-significant segments forming a non-significant segment set;
wherein the classification of all segments in an image into significant or non-significant is implemented as follows:
for any segment l in the image, take each pixel l(x_n, y_n) of l in turn as a starting point; search along the gradient direction of l(x_n, y_n) for the nearest neighboring segment to l and compute the distance from l(x_n, y_n) to the found segment; search again along the opposite gradient direction for the nearest neighboring segment and compute the corresponding distance; take the maximum of the two distances as the support-region width d_n of pixel l(x_n, y_n); let N be the length of segment l, with n = 1, 2, …, N;
let gr_n be the gradient of pixel l(x_n, y_n) in segment l and ḡr the mean gradient of all pixels in l; the saliency s of segment l is computed by the following formula:
s = Σ_{n=1}^{N} ((gr_n − ḡr) × d_n)
after the saliency of every segment in the image is obtained, compute the mean saliency of all segments in the image and the mean length of all segments; if a segment l has saliency above the mean saliency and length above the mean length, it is labeled significant, otherwise it is labeled non-significant;
Step 3, matching the significant segments of the reference image and the image to be matched using an image matching method based on segment combination, and adding the successfully matched significant segments to the non-significant segment sets;
wherein the image matching method based on segment combination is implemented as follows:
first, segment combination: neighboring significant segments are combined using the proximity of segment endpoints, forming segment groups that serve as matching units;
then, segment-group description: the central segment of a group is paired with each of the other segments in turn, the group is described by the geometric relations between the segments of each pair, and a descriptor of the segment group is generated;
finally, descriptor similarity measurement: the similarity of two segment groups is determined from the similarities between their corresponding segment pairs; for a segment group on the reference image, its similarity to every segment group on the image to be matched is computed in turn, and the group with the maximum similarity is regarded as the group matching the reference group; the segments within the two groups are then matched according to the relations between the central segment and the other segments, yielding the individually matched segments, and all successfully matched segments form the set of successfully matched significant segments; the successfully matched segments on the reference image are added to the non-significant segment set of the reference image obtained in step 2, and the successfully matched segments on the image to be matched are added to the non-significant segment set of the image to be matched obtained in step 2;
Step 4, estimating the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched from the significant segments successfully matched in step 3;
Step 5, matching the non-significant segments in the non-significant segment sets obtained in step 3 using an image matching method based on coplanar segment intersections, comprising the following sub-steps:
Step 5.1, for the reference image and the image to be matched respectively, finding, within the neighborhood of each significant segment successfully matched in step 3, its intersections with non-significant segments;
Step 5.2, generating an intersection descriptor for each intersection obtained in step 5.1, and obtaining the principal direction of each intersection;
Step 5.3, based on the initial transformation matrix and relative rotation angle obtained in step 4, matching the intersections on the reference image and the image to be matched according to the intersection descriptors and principal directions obtained in step 5.2, obtaining matching intersections;
Step 5.4, using the correspondences between the matching intersections obtained in step 5.3 and the non-significant segments to obtain the successfully matched non-significant segments; all successfully matched non-significant segments, together with the significant segments successfully matched in step 3, form the final set of matched segments.
2. The robust matching method for low-altitude remote sensing images based on linear features according to claim 1, characterized in that step 4 is implemented as follows:
let {L_1, L_2, L_3, …, L_μ} be the set formed by the successfully matched significant segments on the reference image and {L'_1, L'_2, L'_3, …, L'_μ} the set formed by the successfully matched significant segments on the image to be matched, where μ is the number of matched segment pairs between the reference image and the image to be matched, and (L_w, L'_w) is the w-th matched segment pair, with w = 1, 2, …, μ;
let I_p be the position in the reference image of the intersection of a segment pair (L_i, L_j) from {L_1, L_2, L_3, …, L_μ}, and I'_p the position in the image to be matched of the intersection of the corresponding segment pair (L'_i, L'_j) from {L'_1, L'_2, L'_3, …, L'_μ}, with i = 1, 2, …, μ, j = 1, 2, …, μ, and i ≠ j;
regard I_p and I'_p as a pair of matching points saved as the point pair (I_p, I'_p); process all corresponding segment pairs of {L_1, L_2, L_3, …, L_μ} and {L'_1, L'_2, L'_3, …, L'_μ} in this way to obtain a set of matching point pairs, and estimate the initial transformation matrix and relative rotation angle between the reference image and the image to be matched from this set using the RANSAC algorithm.
3. The robust matching method for low-altitude remote sensing images based on linear features according to claim 1 or 2, characterized in that in step 5.1 the neighborhood of a significant segment is constructed as follows:
let p_1 and p_2 be the endpoints of a significant segment L_s and d_th a preset distance threshold; first extend L_s beyond p_1 and p_2 to points a and b, with ap_1 = bp_2 = d_th; then construct two semicircles of radius d_th centered at a and b, symmetric about the line through L_s, and connect the endpoints of the two semicircles lying on the same side of that line; the resulting closed region is the neighborhood of the significant segment L_s.
4. The robust matching method for low-altitude remote sensing images based on linear features according to claim 3, characterized in that the intersection descriptor obtained in step 5.2 is Descr = {Position, Gradient}, where Position represents the coordinate position of the intersection in the image and Gradient represents the features of a local region centered on the intersection;
and step 5.3 is implemented as follows:
(1) matching of intersection position information: the initial transformation matrix obtained in step 4 maps the coordinate position Position of an intersection on the reference image to the corresponding position A on the image to be matched; a circular region of radius dis (a preset value) centered at A is constructed, and the candidate intersections on the image to be matched that fall inside this circle are retained, called retained intersections;
(2) matching of intersection local features, implemented as follows:
let φ be the principal direction of an intersection on the reference image, φ' the principal direction of a corresponding retained intersection on the image to be matched, and θ the relative rotation angle between the reference image and the image to be matched obtained in step 4; first check whether φ' − φ lies within [θ − δ, θ + δ], where δ is a preset angle threshold; if so, compute the Euclidean distance between the local features Gradient of the two intersections, otherwise discard the corresponding retained intersection; then take the retained intersection with the smallest computed Euclidean distance to form a matching intersection pair.
CN201310272696.9A 2013-07-01 2013-07-01 A robust matching method for low-altitude remote sensing images based on linear features Active CN103324948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310272696.9A CN103324948B (en) 2013-07-01 2013-07-01 A robust matching method for low-altitude remote sensing images based on linear features


Publications (2)

Publication Number Publication Date
CN103324948A CN103324948A (en) 2013-09-25
CN103324948B true CN103324948B (en) 2016-04-27

Family

ID=49193676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310272696.9A Active CN103324948B (en) 2013-07-01 2013-07-01 The sane matching process of a kind of low altitude remote sensing image based on line features

Country Status (1)

Country Link
CN (1) CN103324948B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105957074B (en) * 2016-04-27 2018-12-14 深圳积木易搭科技技术有限公司 Line match method and system based on the description of V-type intersection point and local homography matrix
CN108305277B (en) * 2017-12-26 2020-12-04 中国航天电子技术研究院 Heterogeneous image matching method based on straight line segments
CN108830781B (en) * 2018-05-24 2022-05-24 桂林航天工业学院 Wide baseline image straight line matching method under perspective transformation model
CN109886124B (en) * 2019-01-23 2021-01-08 浙江大学 Non-texture metal part grabbing method based on wire harness description subimage matching

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101251926A (en) * 2008-03-20 2008-08-27 北京航空航天大学 Remote sensing image registration method based on local configuration covariance matrix
CN101833767A (en) * 2010-05-10 2010-09-15 河南理工大学 Gradient and color characteristics-based automatic straight line matching method in digital image
CN102521597A (en) * 2011-12-14 2012-06-27 武汉大学 Hierarchical strategy-based linear feature matching method for images

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN101110101A (en) * 2006-07-17 2008-01-23 松下电器产业株式会社 Method for recognizing picture pattern and equipment thereof
US8588547B2 (en) * 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area


Non-Patent Citations (1)

Title
"一种基于直线特征的遥感图像匹配方法" ("A Remote Sensing Image Matching Method Based on Straight-Line Features"); Sui Xinhang, et al.; 《信号处理》 (Signal Processing); 20121231; Vol. 28, No. 12A; full text *

Also Published As

Publication number Publication date
CN103324948A (en) 2013-09-25

Similar Documents

Publication Publication Date Title
Yan et al. Sparse single sweep lidar point cloud segmentation via learning contextual shape priors from scene completion
CN106780557B (en) Moving object tracking method based on optical flow method and key point features
CN104766084B (en) A kind of nearly copy image detection method of multiple target matching
CN103400384B (en) The wide-angle image matching process of calmodulin binding domain CaM coupling and some coupling
CN111028277A (en) SAR and optical remote sensing image registration method based on pseudo-twin convolutional neural network
CN103324948B (en) A robust matching method for low-altitude remote sensing images based on linear features
CN103679702A (en) Matching method based on image edge vectors
CN104050675B (en) Feature point matching method based on triangle description
CN108305277B (en) Heterogeneous image matching method based on straight line segments
CN101556692A (en) Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points
CN105427298A (en) Remote sensing image registration method based on anisotropic gradient dimension space
CN102722887A (en) Image registration method and device
Wu et al. Psdet: Efficient and universal parking slot detection
CN109934857B (en) Loop detection method based on convolutional neural network and ORB characteristics
CN107818598A (en) A kind of three-dimensional point cloud map amalgamation method of view-based access control model correction
CN105654421A (en) Projection transform image matching method based on transform invariant low-rank texture
CN103914690B (en) Shape matching method based on projective invariant
CN103679193A (en) FREAK-based high-speed high-density packaging component rapid location method
CN102663733B (en) Characteristic points matching method based on characteristic assembly
CN110070012A (en) A kind of refinement extracted applied to remote sensing image road network and global connection method
CN104392434A (en) Triangle constraint-based image matching diffusion method
CN102542555B (en) Method and system for generating edge seam path and edge seam topological structure of raster image
CN112581368A (en) Multi-robot grid map splicing method based on optimal map matching
WO2023131203A1 (en) Semantic map updating method, path planning method, and related apparatuses
Zhang et al. Detection of road surface identifiers based on deep learning

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant