CN103324948A - Low altitude remote sensing image robust matching method based on line feature - Google Patents


Info

Publication number
CN103324948A
Authority
CN
China
Prior art keywords
image
line segment
intersection point
line
remarkable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013102726969A
Other languages
Chinese (zh)
Other versions
CN103324948B (en)
Inventor
邵振峰
郭舒
贺晓波
陈敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201310272696.9A priority Critical patent/CN103324948B/en
Publication of CN103324948A publication Critical patent/CN103324948A/en
Application granted granted Critical
Publication of CN103324948B publication Critical patent/CN103324948B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a robust line-feature-based matching method for low-altitude remote sensing images. The method comprises the following steps: (1) extracting the line segments in the images, the images comprising a reference image and an image to be matched; (2) for the reference image and the image to be matched, using the significance and length information of the segments to divide all segments obtained in step (1) into significant and non-significant segments; (3) matching the significant segments of the reference image and the image to be matched with an image matching method based on line segment groups, and adding the significant segments that are not successfully matched to the non-significant segment set; (4) computing, from the significant segments successfully matched in step (3), an initial transformation matrix and the relative rotation angle between the reference image and the image to be matched; and (5) matching the non-significant segments by means of coplanar segment intersections. The method makes full use of the segment features in the images and of the texture information of their local feature regions, improving both the efficiency and the success rate of image matching.

Description

A robust line-feature-based matching method for low-altitude remote sensing images
Technical field
The invention belongs to the technical field of surveying and mapping, and specifically relates to a robust line-feature-based matching method for low-altitude remote sensing images.
Background technology
Low-altitude remote sensing, thanks to its low cost and the high resolution of the images it acquires, has become a strong complement to traditional aerial remote sensing and has broad application prospects in many fields. In smart-city construction, the high resolution of UAV low-altitude imagery makes large-scale urban digital mapping feasible; in disaster monitoring and emergency response, low-altitude remote sensing is flexible, convenient and timely, providing a favorable guarantee for quickly acquiring imagery of a specific region. However, the small footprint and large distortion of low-altitude images make it difficult for traditional image processing methods to meet the requirements of practical applications in efficiency and precision. In the image matching stage especially, traditional matching methods cannot obtain ideal results on low-altitude imagery. Mature point-feature-based matching methods are mostly suited to richly textured images; on the low-texture images common in low-altitude remote sensing, keypoints are hard to detect and feature descriptors lack distinctiveness, so the false match rate rises or matching fails altogether, whereas line features can handle this problem well.
Matching images with line features mainly faces the following problems: (1) differences between images mean that the same segment appears inconsistently on different images, so it is difficult to describe it consistently, and unforeseen errors occur when segments are matched by endpoint, length and similar information; (2) segments lack a global geometric constraint on the matching process, which increases the difficulty of matching.
Existing line-feature-based image matching methods fall roughly into three classes. The first is line matching based on camera geometric parameters: early methods mostly relied on the geometric parameters of the camera to constrain the matching process, yet these parameters are often unknown, which increases the difficulty of matching. The second is line matching based on the gray levels of the segment neighborhood: such algorithms need no camera parameters and match only on the gray or color information of the neighborhood pixels, but they are computationally expensive, very sensitive to illumination and geometric changes, and thus unsuitable for images taken under widely differing imaging conditions. The third is line matching based on the geometric relationships between segments: this avoids the false matches caused by missing or wrongly estimated camera parameters and improves robustness under illumination change, so it has great practical significance; however, it requires topological computation or the analysis of large neighborhood regions, so its storage and computation costs are considerable.
Summary of the invention
Addressing the deficiencies of existing line-feature-based image matching methods, the invention proposes a robust line-feature-based matching method for low-altitude remote sensing images. The method makes full use of the segment features in the images and of the texture information of their local feature regions, improving both the efficiency of image matching and the success rate of the matching result.
The technical solution adopted by the invention is a robust line-feature-based matching method for low-altitude remote sensing images, comprising the following steps:

Step 1: extract the line segments in the images, the images comprising a reference image and an image to be matched.

Step 2: for the reference image and the image to be matched respectively, use the significance and length information of the segments to divide all segments obtained in step 1 into significant and non-significant segments; the non-significant segments form a non-significant segment set.

Step 3: match the significant segments of the reference image and the image to be matched with an image matching method based on line segment groups, and add the significant segments that are not successfully matched to the non-significant segment sets.

Step 4: estimate the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched from the significant segments successfully matched in step 3.

Step 5: match the non-significant segments in the sets obtained in step 3 with an image matching method based on coplanar segment intersections, comprising the following sub-steps:

Step 5.1: for the reference image and the image to be matched respectively, search the neighborhood of each significant segment successfully matched in step 3 for its intersections with non-significant segments.

Step 5.2: generate an intersection descriptor for each intersection obtained in step 5.1, and obtain the principal direction of each intersection.

Step 5.3: based on the initial transformation matrix and relative rotation angle obtained in step 4, match the intersections of the reference image and the image to be matched according to the descriptors and principal directions obtained in step 5.2, obtaining matched intersections.

Step 5.4: use the correspondence between each matched intersection obtained in step 5.3 and its non-significant segment to obtain the successfully matched non-significant segments; these, together with the significant segments matched in step 3, form the final matched segment set.
Moreover, in step 2, all segments in the image are divided into significant and non-significant segments as follows.

For any segment l in the image, take each pixel l(x_n, y_n) on l in turn as a starting point: search along the gradient direction of l(x_n, y_n) for the nearest neighboring segment and compute the distance from l(x_n, y_n) to it, denoted d_n^+; then search along the opposite (anti-gradient) direction for the nearest neighboring segment and compute the distance d_n^-. The maximum of the two, d_n = max(d_n^+, d_n^-), is taken as the support region of pixel l(x_n, y_n). If the length of segment l is N, then n takes the values 1, 2, ..., N.

Let gr_n be the gradient of pixel l(x_n, y_n) on segment l and gr_mean the mean gradient of all pixels on l. The significance s of segment l is computed with the following formula:

    s = Σ_{n=1}^{N} ((gr_n − gr_mean) × d_n)

After the significance of every segment in the image has been obtained, compute the mean significance s_mean and the mean length N_mean of all segments in the image. Any segment l satisfying both s ≥ s_mean and N ≥ N_mean is labeled significant; otherwise it is labeled non-significant.
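For illustration only, the significance computation and the two-mean test above can be sketched in Python as follows; the per-pixel gradients and support distances are assumed to be precomputed, and the data layout is hypothetical rather than part of the invention:

```python
# Sketch of step 2. For each segment, the per-pixel gradient
# magnitudes gr_n and the support-region widths d_n (found by
# searching along the gradient / anti-gradient directions) are
# assumed given.

def significance(gradients, supports):
    """s = sum_n (gr_n - mean(gr)) * d_n over the segment's pixels."""
    mean_gr = sum(gradients) / len(gradients)
    return sum((g - mean_gr) * d for g, d in zip(gradients, supports))

def classify(segments):
    """Split segments into significant / non-significant lists.

    Each segment is a dict with 'gradients' and 'supports' lists; a
    segment is significant when both its significance and its length
    reach the image-wide means.
    """
    scores = [significance(s["gradients"], s["supports"]) for s in segments]
    lengths = [len(s["gradients"]) for s in segments]
    mean_s = sum(scores) / len(scores)
    mean_len = sum(lengths) / len(lengths)
    sig, non_sig = [], []
    for seg, sc, ln in zip(segments, scores, lengths):
        (sig if sc >= mean_s and ln >= mean_len else non_sig).append(seg)
    return sig, non_sig
```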
Moreover, step 4 is implemented as follows.

Let {L_1, L_2, L_3, ..., L_μ} be the set of successfully matched significant segments in the reference image and {L'_1, L'_2, L'_3, ..., L'_μ} the corresponding set in the image to be matched, where μ is the number of matched segment pairs between the two images and (L_w, L'_w) is the w-th matched pair, with w = 1, 2, ..., μ.

Let I_p be the position in the reference image of the intersection of a segment pair (L_i, L_j) in {L_1, L_2, L_3, ..., L_μ}, and I'_p the position in the image to be matched of the intersection of the corresponding pair (L'_i, L'_j) in {L'_1, L'_2, L'_3, ..., L'_μ}, with i = 1, 2, ..., μ, j = 1, 2, ..., μ, and i ≠ j.

Treat I_p and I'_p as a pair of matched points and save them as the matched point pair (I_p, I'_p). Processing the corresponding segments of the two sets in turn yields a set of matched point pairs, from which the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched are estimated with the RANSAC algorithm.
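The construction of point pairs from matched segment pairs described above can be sketched as follows; the segment representation ((x1, y1), (x2, y2)) is an assumption for illustration, and a robust estimator such as RANSAC would then consume the returned point pairs:

```python
# Sketch of step 4's point construction: every pair of matched
# significant segments contributes one putative point match, namely
# the intersection of the two (extended) lines in each image.

def line_intersection(p1, p2, p3, p4):
    """Intersection of the infinite lines p1p2 and p3p4 (None if parallel)."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if den == 0:
        return None
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / den,
            (a * (y3 - y4) - (y1 - y2) * b) / den)

def point_pairs(ref_segments, tgt_segments):
    """Intersect every segment pair (i < j) in both images in parallel."""
    pairs = []
    for i in range(len(ref_segments)):
        for j in range(i + 1, len(ref_segments)):
            ip = line_intersection(*ref_segments[i], *ref_segments[j])
            iq = line_intersection(*tgt_segments[i], *tgt_segments[j])
            if ip is not None and iq is not None:
                pairs.append((ip, iq))
    return pairs
```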
Moreover, in step 5.1 the neighborhood of a significant segment is constructed as follows.

Let p_1 and p_2 be the endpoints of significant segment L_s and d_th a preset distance threshold. First extend L_s outward beyond p_1 and p_2 to points a and b respectively, with |ap_1| = |bp_2| = d_th. Then, centered at a and b and with the line p_1p_2 as symmetry axis, construct two semicircles of radius d_th; connecting the endpoints of the two semicircles lying on the same side of line p_1p_2 yields a closed region, which is taken as the neighborhood of significant segment L_s.
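Geometrically, the closed region above is the set of points within distance d_th of the extended segment ab, so membership can be tested as sketched below (a hypothetical helper, not the patent's construction procedure):

```python
import math

# Membership test for the neighborhood of a significant segment:
# extend p1p2 by d_th on both ends to obtain a and b, then keep the
# points whose distance to segment ab is at most d_th (the capsule
# formed by the two semicircles and the connecting lines).

def in_neighborhood(pt, p1, p2, d_th):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length
    a = (p1[0] - d_th * ux, p1[1] - d_th * uy)
    b = (p2[0] + d_th * ux, p2[1] + d_th * uy)
    # distance from pt to segment ab
    abx, aby = b[0] - a[0], b[1] - a[1]
    t = ((pt[0] - a[0]) * abx + (pt[1] - a[1]) * aby) / (abx ** 2 + aby ** 2)
    t = max(0.0, min(1.0, t))
    cx, cy = a[0] + t * abx, a[1] + t * aby
    return math.hypot(pt[0] - cx, pt[1] - cy) <= d_th
```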
Moreover, the intersection descriptor obtained in step 5.2 is Descr = {Position, Gradient}, where Position is the coordinate position of the intersection in the image and Gradient represents the features of the local region centered on the intersection.

Step 5.3 is implemented as follows:

(1) Matching on intersection position: use the initial transformation matrix obtained in step 4 to compute the position A on the image to be matched that corresponds to the coordinate position Position of an intersection on the reference image; take a circular region of preset radius dis centered on A, and keep the intersections of the image to be matched that fall inside this region, called the retained intersections.

(2) Matching on local intersection features: let φ be the principal direction of an intersection on the reference image, φ' the principal direction of a corresponding retained intersection on the image to be matched, and θ the relative rotation angle between the reference image and the image to be matched obtained in step 4. First judge whether φ' − φ − θ lies in the range [−δ, +δ], where δ is a preset angle threshold. If so, compute the Euclidean distance between the local feature vectors Gradient of the two intersections; otherwise discard the retained intersection. Then take the retained intersection with the smallest Euclidean distance to form the matched intersection pair.
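The two-stage intersection matching of step 5.3 can be sketched as follows under assumed data layouts (each intersection as a small record, the transformation as a 3 x 3 homography); these layouts are illustrative, not the patented implementation:

```python
import math

# Sketch of step 5.3: each intersection is a dict with 'pos' (x, y),
# 'dir' (principal direction, radians) and 'grad' (local descriptor
# vector). H maps reference coordinates to the image to be matched;
# theta is the relative rotation angle.

def match_intersection(ref_pt, candidates, H, theta, dis, delta):
    # (1) position constraint: project ref_pt['pos'] with the 3x3
    # matrix H and keep candidates inside a circle of radius dis.
    x, y = ref_pt["pos"]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    px = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    py = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    kept = [c for c in candidates
            if math.hypot(c["pos"][0] - px, c["pos"][1] - py) <= dis]
    # (2) direction constraint: the principal-direction difference
    # must agree with the relative rotation angle to within delta.
    kept = [c for c in kept
            if abs(c["dir"] - ref_pt["dir"] - theta) <= delta]
    if not kept:
        return None
    # the retained intersection nearest in descriptor space wins
    return min(kept, key=lambda c: math.dist(c["grad"], ref_pt["grad"]))
```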
The beneficial effects of the technical solution provided by the invention are: (1) by dividing the segments in the image into significant and non-significant segments and matching each class separately, the exhaustive iteration over all segment features is avoided and the success rate of image matching is improved; (2) describing and matching segments through the geometric relationships between them avoids the unnecessary errors caused by unknown camera parameters or by the process of estimating them; (3) matching the intersections of the two segment classes instead of the segments themselves improves the efficiency of image matching; (4) constraining the subsequent fine matching with the coarse matching result improves matching accuracy. Moreover, in the intersection matching stage, any point-feature matching algorithm that meets the precision and efficiency requirements can be incorporated into the technical solution of the invention.
Description of drawings
Fig. 1 is the flowchart of the embodiment of the invention.
Fig. 2 is the schematic diagram of the segment pair descriptor of the embodiment.
Fig. 3 is the schematic diagram of the intersection search of the embodiment.
Fig. 4 is the schematic diagram of the intersection position matching constraint of the embodiment.
Embodiment
For a better understanding of the technical solution, the invention is described in further detail below with reference to the drawings. The embodiment uses unmanned-airship images with a resolution of 0.05 m taken by a Canon EOS 5D Mark II. With reference to Fig. 1, the steps of the embodiment are as follows.
Step 1: line feature extraction.
The embodiment first applies the Canny operator to the reference image and the image to be matched for edge detection, then uses the Hough transform to obtain the segment sets of the two images. Supposing they contain m and n segments respectively, the sets are denoted {l_1, l_2, l_3, ..., l_m} and {l'_1, l'_2, l'_3, ..., l'_n}. The Canny operator for edge detection and the Hough transform for line extraction are conventional methods in the field of remote sensing image processing, and their details are not repeated here.
Step 2: segment significance discrimination.
For the segment sets extracted from the reference image and the image to be matched in step 1, the embodiment applies the following procedure to each set, obtaining the significant and non-significant segments of each image; the non-significant segments form the initial non-significant segment sets. The specific implementation is as follows.
For a segment l in the reference image, take each pixel l(x_n, y_n) on l in turn as a starting point: search along the gradient direction for the nearest neighboring segment and compute the distance from l(x_n, y_n) to it, denoted d_n^+; then search along the anti-gradient direction for the nearest neighboring segment and obtain the distance d_n^-. The maximum of the two, d_n = max(d_n^+, d_n^-), is the support region of pixel l(x_n, y_n). If segment l has length N, that is, contains N pixels, then l has N corresponding pixel support regions and n takes the values 1, 2, ..., N. Let gr_n be the gradient of pixel l(x_n, y_n) and gr_mean the mean gradient of all pixels on the segment; the significance s of segment l is computed with formula (1):

    s = Σ_{n=1}^{N} ((gr_n − gr_mean) × d_n)    (1)

The above procedure gives the significance of every segment in the reference image; the mean significance s_mean and the mean length N_mean of all its segments are then computed. Each segment of the reference image is tested against formula (2): a segment that satisfies it is labeled significant, and a segment that does not is labeled non-significant.

    s ≥ s_mean and N ≥ N_mean    (2)

Applying the same processing to the image to be matched yields its significant and non-significant segments.
Step 3: significant segment matching.

Step 3 of the embodiment processes the significant segments obtained in step 2 mainly with the existing LS (Line Signature) method, that is, the image matching method based on line segment groups, obtaining the significant segments successfully matched between the reference image and the image to be matched as a coarse matching result; the significant segments that are not successfully matched are added to the non-significant segment sets as non-significant segments. For ease of implementation, the embodiment is described as follows.
Step 3.1: segment grouping. The embodiment groups significant segments by the proximity of their endpoints, forming the segment groups used as matching primitives, through the following process:

(1) A preset integer threshold k limits the number of segments in each group; the embodiment sets k = 3. Each significant segment is regarded as a center segment, and the two endpoints of the center segment as center points. Each center point corresponds to one segment group, so each significant segment corresponds to two groups.

(2) For each center point of a center segment, compute the distances to the endpoints of the other significant segments, and select, in increasing order of distance, k significant segments whose significance is not less than 0.5 × s, where s is the significance of the center segment.

(3) The center segment and the k segments found constitute a segment group containing k + 1 segments, which serves as a primitive for image matching.
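The grouping rule of step 3.1 can be sketched as follows; the segment layout ((x1, y1), (x2, y2), significance) and the per-segment (rather than per-endpoint) grouping are simplifying assumptions made for illustration:

```python
import math

# Illustrative grouping: for a center segment, pick, in increasing
# order of endpoint distance, k significant segments whose
# significance is at least half the center's (the 0.5 * s floor of
# step 3.1), yielding a group of k + 1 segments.

def build_group(center, others, k=3):
    c1, c2, cs = center

    def endpoint_dist(seg):
        # smallest endpoint-to-endpoint distance between the segments
        return min(math.dist(p, q) for p in (c1, c2) for q in (seg[0], seg[1]))

    eligible = [s for s in others if s[2] >= 0.5 * cs]
    eligible.sort(key=endpoint_dist)
    return [center] + eligible[:k]   # k + 1 segments per group
```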
Step 3.2: segment group description. The center segment of a group is paired in turn with each of the other k segments, and the group is described through the geometric relationships within the segment pairs, generating the group descriptor desc = {v_1, v_2, ..., v_k}, where v_1, v_2, ..., v_k are the k segment pair descriptors; in the embodiment desc = {v_1, v_2, v_3}. The pair descriptor can be designed by those skilled in the art; for example, a 13-dimensional descriptor v = {r_1, r_2, l_1, ..., l_5, θ_1, ..., θ_5, g} may be adopted, introduced here with the example of Fig. 2. Suppose segment q_1q_2 and the center segment p_1p_2 intersect at point c. The parameters of the descriptor are then: r_1 and r_2, which record the position of the intersection point c on the two segments; the length ratios l_1 = |q_1q_2| / |p_1p_2|, l_2 = |q_1p_1| / |p_1p_2|, l_3 = |q_1p_2| / |p_1p_2|, l_4 = |q_2p_1| / |p_1p_2|, l_5 = |q_2p_2| / |p_1p_2|; the gradient feature between the segments g = g_1 / g_2, where g_1 and g_2 are the average gradients of the two segments; and the angles θ_1, θ_2, θ_3, θ_4, θ_5 shown in Fig. 2, which are the angles between the center segment and the lines joining the four endpoints, namely the angle between p_1p_2 and q_1q_2, between p_1q_1 and p_1p_2, between p_1q_2 and p_1p_2, between p_2q_1 and p_1p_2, and between p_2q_2 and p_1p_2.
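The length-ratio and gradient-ratio part of this pair descriptor can be sketched as follows (the angle terms θ_1 to θ_5 and the intersection ratios r_1, r_2 are omitted for brevity; endpoint names follow the text):

```python
import math

# Sketch of the l_1..l_5 and g components of the pair descriptor:
# center segment p1p2, second segment q1q2, average gradients g1, g2.

def pair_descriptor(p1, p2, q1, q2, g1, g2):
    base = math.dist(p1, p2)
    l = [math.dist(q1, q2) / base,   # l1
         math.dist(q1, p1) / base,   # l2
         math.dist(q1, p2) / base,   # l3
         math.dist(q2, p1) / base,   # l4
         math.dist(q2, p2) / base]   # l5
    g = g1 / g2                      # gradient ratio between the segments
    return l + [g]
```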
Step 3.3: descriptor similarity measurement. The similarity of two segment groups is determined by computing the similarity of each segment pair in them.

The embodiment uses a threshold to divide the pair descriptors into two types and computes their similarity with a quasi-projective-invariance formula or an affine-similarity formula respectively, then matches the groups accordingly. Let v = {r_1, r_2, l_1, ..., l_5, θ_1, ..., θ_5, g} be a pair descriptor of a group in the reference image and v' = {r'_1, r'_2, l'_1, ..., l'_5, θ'_1, ..., θ'_5, g'} the corresponding pair descriptor in the image to be matched. The threshold T_r = 0.3 first splits the pair similarity computation into two cases, as follows:
(1) If |r_1 − r'_1| and |r_2 − r'_2| are both less than or equal to T_r, the similarity S_a is computed with formula (3) from the terms given in formula (4). These terms are threshold parameters set to improve the matching result; they exclude the false matches caused by unrealistic distortions. Their values are:

    d_{r_u} = 1 − |r_u − r'_u| / T_r,  u ∈ {1, 2}
    d_{θ_1} = 1 − |θ_1 − θ'_1| / (π/2)
    d_{l_1} = 1 − (max(l_1, l'_1) / min(l_1, l'_1) − 1) / 3
    d_g = 1 − (max(g, g') / min(g, g') − 1) / 2    (4)
(2) If |r_1 − r'_1| or |r_2 − r'_2| is greater than T_r, the similarity S_g is computed with formula (5) from the terms given in formula (6), where d_g again excludes the false matches caused by unrealistic distortions:

    d_{l_v} = 1 − (max(l_v, l'_v) / min(l_v, l'_v) − 1) / 3
    d_{θ_v} = 1 − |θ_v − θ'_v| / (π/2)
    d_g = 1 − (max(g, g') / min(g, g') − 1) / 3,  v ∈ {1, ..., 5}    (6)
Finally, formula (7) gives the final similarity S_LP of the two segment pairs: S_LP equals S_a in the first case and S_g with its weight reduced to 1/2, that is S_g / 2, in the second case. Reducing the weight of S_g weakens the influence of other transformations while strengthening the affine constraint, so segment pairs satisfying an affine transformation are found more easily.
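The truncated similarity terms of formulas (4) and (6) can be sketched as follows; clamping negative values to zero is an assumption about the intended behaviour, so that a badly mismatched term vetoes the pair:

```python
import math

# Sketch of the per-term scores of formulas (4) and (6).

def d_ratio(a, b, spread):
    """1 - (max/min - 1)/spread, clamped to [0, 1]; used for l_v and g."""
    return max(0.0, 1.0 - (max(a, b) / min(a, b) - 1.0) / spread)

def d_angle(t, t2):
    """1 - |t - t2| / (pi/2), clamped to [0, 1]; used for theta_v."""
    return max(0.0, 1.0 - abs(t - t2) / (math.pi / 2))
```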
Suppose a group descriptor in the reference image of the embodiment is desc = {v_1, v_2, v_3} and the corresponding group descriptor in the image to be matched is desc' = {v'_1, v'_2, v'_3}. The possible assignments of coupled pairs between the two descriptors come in six combinations, namely: {(v_1, v'_1), (v_2, v'_2), (v_3, v'_3)}, {(v_1, v'_1), (v_2, v'_3), (v_3, v'_2)}, {(v_1, v'_2), (v_2, v'_1), (v_3, v'_3)}, {(v_1, v'_2), (v_2, v'_3), (v_3, v'_1)}, {(v_1, v'_3), (v_2, v'_1), (v_3, v'_2)}, and {(v_1, v'_3), (v_2, v'_2), (v_3, v'_1)}. For each combination, compute with the above procedure the similarities S_LP of its three segment pairs and sum them, giving the similarities of the six combinations; the maximum is taken as the final similarity of the group descriptors desc and desc'.
For each segment group in the reference image, compute in turn its similarity to all segment groups in the image to be matched; the group with maximum similarity is regarded as the group matching the reference group. Then, from the relations between the center segment and the other segments within the groups, match the individual segments of the two groups, obtaining the successfully matched single segments; all successfully matched segments form the set of matched significant segments. The segments of the reference image that are not successfully matched are added to the non-significant segment set of the reference image obtained in step 2, and the segments of the image to be matched that are not successfully matched are added to the non-significant segment set of the image to be matched obtained in step 2.
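The six-combination search above generalizes to trying all permutations of the pair descriptors, as sketched below with a caller-supplied pair similarity such as S_LP:

```python
from itertools import permutations

# Sketch of the group-to-group score: with k = 3 pair descriptors per
# group there are 3! = 6 assignments; the best summed pair similarity
# is kept as the similarity of the two groups.

def group_similarity(desc, desc2, pair_sim):
    best = float("-inf")
    for perm in permutations(range(len(desc2))):
        best = max(best, sum(pair_sim(desc[i], desc2[j])
                             for i, j in enumerate(perm)))
    return best
```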
Step 4: estimating the initial transformation matrix and the relative rotation angle between the images.

Step 4 of the embodiment processes the matched segment set obtained in step 3 to derive the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched. The specific implementation is as follows.

Let {L_1, L_2, L_3, ..., L_μ} be the set of successfully matched significant segments in the reference image and {L'_1, L'_2, L'_3, ..., L'_μ} the corresponding set in the image to be matched, where μ is the number of matched segment pairs between the two images and (L_w, L'_w) is the w-th matched pair, with w = 1, 2, ..., μ. Let I_p be the position in the reference image of the intersection of segment pair (L_i, L_j) in {L_1, L_2, L_3, ..., L_μ}, and I'_p the position in the image to be matched of the intersection of the corresponding pair (L'_i, L'_j) in {L'_1, L'_2, L'_3, ..., L'_μ}, with i = 1, 2, ..., μ, j = 1, 2, ..., μ, and i ≠ j. Treat I_p and I'_p as a pair of matched points and save them as the matched point pair (I_p, I'_p); processing the corresponding segments of the two sets in turn yields a set of matched point pairs. The RANSAC (Random Sample Consensus) algorithm, taking this point pair set as input, then yields the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched. RANSAC is a conventional method in the field of remote sensing image processing, and its details are not repeated here.
Step 5: non-significant segment matching.

Step 5 of the embodiment processes the non-significant segments of the reference image and the image to be matched, matching them with a prior-art method based on coplanar segment intersections; the non-significant segments here are all segments in the non-significant segment sets obtained in step 3. There are four sub-steps: steps 5.1 and 5.2 perform the same intersection search and description on the reference image and the image to be matched separately, while steps 5.3 and 5.4 find the matching intersections between the two images and obtain the final matched segment set. The implementation is as follows.
Step 5.1: perform the intersection search on the reference image and on the image to be matched respectively. In either image, take each successfully matched significant line segment obtained in step 3 (i.e. L_w or L'_w) as the center, search a certain neighborhood around it for coplanar non-significant line segments that may intersect it, and compute the intersections. The principle is shown in Fig. 3: let L_s be a successfully matched significant line segment with endpoints p_1 and p_2, and suppose several non-significant line segments lie around it. Let d_th be a distance threshold preset from experience. First extend L_s outward from its endpoints p_1 and p_2 to points a and b respectively, with ap_1 = bp_2 = d_th; then construct two semicircles of radius d_th centered at a and b, symmetric about the straight line through L_s; connecting the semicircle endpoints lying on the same side of this line yields a closed region, which serves as the neighborhood for the judgment. For every pixel in this region, judge whether it belongs to a non-significant line segment; if so, compute the intersection of that non-significant segment with L_s (three cases are possible: the intersection lies on the non-significant segment, on L_s, or on the extensions of both). If the intersection also lies within the region, mark the non-significant segment with the corresponding intersection and record the position of the intersection in the image. In Fig. 3, the non-significant segments inside the neighborhood are retained as candidates to be matched against L_s, while the segment outside the neighborhood is rejected. After this step, each significant line segment matched in the coarse stage has a set of intersections, each intersection corresponding to one non-significant line segment.
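A minimal sketch of the step 5.1 geometry, under the Fig. 3 construction described above: the capsule-shaped neighborhood (the segment extended by d_th at both ends and swept with radius d_th) is tested by a point-to-segment distance, and intersections are taken on the infinite supporting lines. All function names are illustrative.

```python
import numpy as np

def point_segment_distance(p, a, b):
    # Distance from point p to the closed segment a-b.
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def in_neighborhood(p, p1, p2, d_th):
    # The closed region of Fig. 3: segment p1-p2 extended by d_th to a and b,
    # bounded by two semicircles of radius d_th — i.e. a capsule around a-b.
    u = (p2 - p1) / np.linalg.norm(p2 - p1)
    a, b = p1 - d_th * u, p2 + d_th * u
    return point_segment_distance(p, a, b) <= d_th

def line_intersection(p1, p2, q1, q2):
    # Intersection of the supporting lines of segments (p1,p2) and (q1,q2);
    # returns None for (near-)parallel lines. Covers all three cases of the
    # text, since the intersection may fall on either segment or on extensions.
    d1, d2 = p2 - p1, q2 - q1
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None
    t = ((q1[0] - p1[0]) * d2[1] - (q1[1] - p1[1]) * d2[0]) / denom
    return p1 + t * d1
```

A candidate non-significant segment is kept when both one of its pixels and its computed intersection with L_s pass `in_neighborhood`.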
Step 5.2: generate intersection descriptors from the intersections obtained in step 5.1. This process exploits the geometric invariance and local illumination invariance of an intersection to describe it consistently for the subsequent intersection matching. The resulting descriptor is denoted Descr = {Position, Gradient}, where Position is the coordinate position of the intersection in the image, computed during the intersection search, and Gradient represents the features of a local region centered on the intersection (a 16 × 16 pixel region is suggested). Gradient can be obtained with any existing mature method; the embodiment uses the SIFT (Scale-Invariant Feature Transform) method to compute the feature information in the local region of the intersection: the image coordinates of the intersection are input, and the principal direction of the intersection and the corresponding intersection descriptor are obtained. The SIFT algorithm is a classical method in the image matching field, so its detailed workflow is not repeated here.
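The embodiment uses standard SIFT for the descriptor and principal direction. As a hedged illustration of where a principal direction comes from, the sketch below computes a simplified orientation histogram over a local window — without Gaussian weighting, histogram interpolation, or the full SIFT descriptor, so it only approximates SIFT's orientation assignment.

```python
import numpy as np

def principal_orientation(img, x, y, radius=8, bins=36):
    # Dominant gradient orientation (degrees) in a window around (x, y),
    # in the spirit of SIFT's orientation assignment, simplified.
    h, w = img.shape
    x0, x1 = max(x - radius, 1), min(x + radius, w - 1)
    y0, y1 = max(y - radius, 1), min(y + radius, h - 1)
    patch = img[y0 - 1:y1 + 1, x0 - 1:x1 + 1].astype(float)
    gy, gx = np.gradient(patch)                       # row, column derivatives
    mag = np.hypot(gx, gy)[1:-1, 1:-1]                # trim one-sided border estimates
    ang = np.degrees(np.arctan2(gy, gx))[1:-1, 1:-1] % 360.0
    hist = np.zeros(bins)
    idx = (ang / (360.0 / bins)).astype(int) % bins
    np.add.at(hist, idx, mag)                         # magnitude-weighted histogram
    return (np.argmax(hist) + 0.5) * (360.0 / bins)   # center of the dominant bin
```

On a left-to-right intensity ramp the gradient points along +x everywhere, so the dominant bin is the one containing 0°.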
Step 5.3: intersection descriptor matching. This process comprises matching of intersection position information and matching of intersection local region features, as follows:
(1) Matching of intersection position information. In theory, if two intersections are a matching pair, their position attributes satisfy the initial homography matrix within a certain tolerance. The initial transformation matrix between the reference image and the image to be matched obtained in step 4 is therefore used to constrain this process and eliminate obviously mismatched intersections; the process is described in detail with Fig. 4. In Fig. 4, L_s' and the segment crossing it are, respectively, the significant and the non-significant line segment corresponding to an intersection to be matched in the image to be matched, and dis is a position tolerance preset from experience. First, the coordinate position Position of an intersection in the reference image is mapped through the initial transformation matrix to the corresponding position coordinate in the image to be matched, the point A in Fig. 4. Then, in the image to be matched, a circular region of radius dis centered at A is formed, as shown in Fig. 4; the intersections to be matched that fall inside this circular region, hereinafter called retained intersections, are kept for the next step of local feature matching.
(2) Matching of intersection local region features: the Euclidean distance is used to match the local feature Gradient of each intersection retained in step 5.3(1). In this process, the relative rotation angle between the images obtained in step 4 is used to narrow the search range and speed up matching. Let θ be the principal direction of an intersection in the reference image and θ' the principal direction of a corresponding retained intersection in the image to be matched. First judge whether the difference between θ' and θ, compensated by the relative rotation angle, falls within [−δ, +δ], where δ is an angle threshold preset from experience. If so, compute the Euclidean distance between the local features Gradient of the two intersections; otherwise discard the candidate. For each intersection in the reference image, the retained intersection in the image to be matched with the smallest computed Euclidean distance forms the matching intersection.
Step 5.4: use the matching intersections obtained in step 5.3 and the correspondence between intersections and non-significant line segments to obtain the successfully matched non-significant line segments. All successfully matched non-significant line segments, together with the significant line segments successfully matched in step 3, constitute the final set of matched line segments.
Compared with traditional line feature matching methods, the method of the present invention has clear advantages: it achieves both a higher feature repetition rate and a higher matching probability, and is a feasible line feature matching method.
The above embodiment serves only to further illustrate the present invention and does not limit it; the implementation of the present invention is not confined to the above description. Those skilled in the art will appreciate that various modifications of detail may be made without departing from the scope defined by the appended claims, and all equivalent technical schemes likewise fall within the scope of the present invention.

Claims (5)

1. A robust matching method for low-altitude remote sensing images based on line features, characterized by comprising the following steps:
Step 1: extracting the line segments in the images, said images comprising a reference image and an image to be matched;
Step 2: for the reference image and the image to be matched respectively, dividing all line segments obtained in step 1 into significant line segments and non-significant line segments according to the saliency and length information of the segments, the non-significant line segments constituting a non-significant segment set;
Step 3: matching the significant line segments of the reference image and the image to be matched with an image matching method based on line segment combinations, and adding the significant line segments that are not successfully matched into the non-significant segment set;
Step 4: estimating the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched from the significant line segments successfully matched in step 3;
Step 5: matching the non-significant line segments in the non-significant segment set obtained in step 3 with an image matching method based on coplanar line segment intersections, comprising the following sub-steps:
Step 5.1: for the reference image and the image to be matched respectively, searching the neighborhood of each significant line segment successfully matched in step 3 for non-significant line segments and finding their intersections with it;
Step 5.2: generating an intersection descriptor for each intersection obtained in step 5.1, and obtaining the principal direction of each intersection;
Step 5.3: based on the initial transformation matrix and the relative rotation angle obtained in step 4, matching the intersections of the reference image and the image to be matched according to the intersection descriptors and principal directions obtained in step 5.2, to obtain matching intersections;
Step 5.4: obtaining the successfully matched non-significant line segments from the correspondence between each matching intersection obtained in step 5.3 and its non-significant line segment; all successfully matched non-significant line segments, together with the significant line segments successfully matched in step 3, constitute the final set of matched line segments.
2. The robust matching method for low-altitude remote sensing images based on line features according to claim 1, characterized in that in step 2 the division of all line segments in the image into significant and non-significant line segments is implemented as follows:
For any line segment l in the image, take each pixel l(x_n, y_n) of l in turn as a starting point: search along the gradient direction of l(x_n, y_n) for the adjacent segment nearest to l and compute the distance d_n⁺ from l(x_n, y_n) to the found segment; search likewise along the opposite gradient direction for the adjacent segment nearest to l and obtain the distance d_n⁻; take the maximum of the two, d_n = max(d_n⁺, d_n⁻), as the support region width of pixel l(x_n, y_n). If segment l has length N, n takes the values 1, 2, ..., N.
Let gr_n be the gradient of pixel l(x_n, y_n) of segment l and gr̄ the mean gradient of all pixels of l; the saliency s of l is computed as
s = Σ_{n=1}^{N} ((gr_n − gr̄) × d_n)
After the saliency of every line segment in the image has been obtained, compute the mean saliency s̄ of all line segments in the image and the mean length l̄ of all line segments. Any line segment l whose saliency is not less than s̄ and whose length is not less than l̄ is labeled a significant line segment; otherwise it is labeled a non-significant line segment.
3. The robust matching method for low-altitude remote sensing images based on line features according to claim 1, characterized in that step 4 is implemented as follows:
Let {L_1, L_2, L_3, ..., L_μ} be the set of significant line segments successfully matched in the reference image and {L'_1, L'_2, L'_3, ..., L'_μ} the set of significant line segments successfully matched in the image to be matched, where μ is the number of matched segment pairs between the reference image and the image to be matched and (L_w, L'_w) is the w-th matched segment pair, w = 1, 2, ..., μ;
Let I_p be the position in the reference image of the intersection of a segment pair (L_i, L_j) of {L_1, L_2, L_3, ..., L_μ}, and I'_p the position in the image to be matched of the intersection of the corresponding pair (L'_i, L'_j) of {L'_1, L'_2, L'_3, ..., L'_μ}, where i = 1, 2, ..., μ, j = 1, 2, ..., μ and i ≠ j;
Regard I_p and I'_p as a pair of matched points and save them as the matched point pair (I_p, I'_p); processing the corresponding segments of the sets {L_1, L_2, L_3, ..., L_μ} and {L'_1, L'_2, L'_3, ..., L'_μ} in turn in this way yields a set of matched point pairs, from which the RANSAC algorithm estimates the initial transformation matrix and the relative rotation angle between the reference image and the image to be matched.
4. The robust matching method for low-altitude remote sensing images based on line features according to claim 1, 2 or 3, characterized in that in step 5.1 the neighborhood of a significant line segment is constructed as follows:
Let p_1 and p_2 be the endpoints of the significant line segment L_s and d_th a preset distance threshold. First extend L_s outward from its endpoints p_1 and p_2 to points a and b respectively, with ap_1 = bp_2 = d_th; then construct two semicircles of radius d_th centered at a and b, symmetric about the straight line through L_s; connecting the semicircle endpoints lying on the same side of this line yields a closed region, which serves as the neighborhood of the significant line segment L_s.
5. The robust matching method for low-altitude remote sensing images based on line features according to claim 4, characterized in that the intersection descriptor obtained in step 5.2 is Descr = {Position, Gradient}, where Position is the coordinate position of the intersection in the image and Gradient represents the features of a local region centered on the intersection;
and in that step 5.3 is implemented as follows:
(1) Matching of intersection position information: using the initial transformation matrix obtained in step 4, compute the position coordinate A in the image to be matched corresponding to the coordinate position Position of an intersection in the reference image; form a circular region of radius dis (a preset value) centered at A; keep the intersections to be matched that fall inside this circular region, referred to as retained intersections;
(2) Matching of intersection local region features: let θ be the principal direction of an intersection in the reference image and θ' the principal direction of a corresponding retained intersection in the image to be matched. With the relative rotation angle between the reference image and the image to be matched obtained in step 4, first judge whether the difference between θ' and θ, compensated by this rotation angle, falls within [−δ, +δ], where δ is a preset angle threshold; if so, compute the Euclidean distance between the local features Gradient of the two intersections, otherwise discard the corresponding retained intersection; then take the retained intersection with the smallest computed Euclidean distance to form the matching intersection.
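As a hedged reading of claim 2, the saliency measure and the significant/non-significant split can be sketched as follows. Note that the threshold inequality is rendered as an image in the source; it is assumed here to compare each segment's saliency and length against the image-wide means, and the data layout is illustrative.

```python
import numpy as np

def saliency(gradients, support_widths):
    # s = sum_n (gr_n - mean(gr)) * d_n, per the formula of claim 2.
    gr = np.asarray(gradients, dtype=float)
    d = np.asarray(support_widths, dtype=float)
    return float(np.sum((gr - gr.mean()) * d))

def split_salient(segments):
    # segments: list of {"gradients": [...], "widths": [...]}, one entry per
    # pixel of the segment. A segment is labeled significant when both its
    # saliency and its length reach the image-wide means (assumed rule).
    s_vals = np.array([saliency(seg["gradients"], seg["widths"])
                       for seg in segments])
    lens = np.array([len(seg["gradients"]) for seg in segments])
    return (s_vals >= s_vals.mean()) & (lens >= lens.mean())
```

Note that with uniform support widths the sum telescopes to zero; the measure only responds when high-gradient pixels also carry wide support regions, which is what makes it a saliency score rather than a plain contrast sum.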
CN201310272696.9A 2013-07-01 2013-07-01 The sane matching process of a kind of low altitude remote sensing image based on line features Active CN103324948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310272696.9A CN103324948B (en) Robust matching method for low-altitude remote sensing images based on line features


Publications (2)

Publication Number Publication Date
CN103324948A true CN103324948A (en) 2013-09-25
CN103324948B CN103324948B (en) 2016-04-27

Family

ID=49193676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310272696.9A Active Robust matching method for low-altitude remote sensing images based on line features CN103324948B (en)

Country Status (1)

Country Link
CN (1) CN103324948B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105957074A (en) * 2016-04-27 2016-09-21 武汉大学 Line segment matching method and system based on V-shape intersection description and local homography matrix
CN108305277A (en) * 2017-12-26 2018-07-20 中国航天电子技术研究院 Heterogeneous image matching method based on straight line segments
CN108830781A (en) * 2018-05-24 2018-11-16 桂林航天工业学院 Wide baseline image straight line matching method under a perspective transformation model
WO2020151454A1 (en) * 2019-01-23 2020-07-30 何再兴 Textureless metal part grasping method based on line bundle descriptor image matching

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101251926A (en) * 2008-03-20 2008-08-27 北京航空航天大学 Remote sensing image registration method based on local configuration covariance matrix
US20090304285A1 (en) * 2006-07-17 2009-12-10 Panasonic Corporation Image processing device and image processing method
US20100034483A1 (en) * 2008-08-05 2010-02-11 Frank Giuffrida Cut-line steering methods for forming a mosaic image of a geographical area
CN101833767A (en) * 2010-05-10 2010-09-15 河南理工大学 Gradient and color characteristics-based automatic straight line matching method in digital image
CN102521597A (en) * 2011-12-14 2012-06-27 武汉大学 Hierarchical strategy-based linear feature matching method for images


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUI Xinhang, et al.: "A Remote Sensing Image Matching Method Based on Straight Line Features", Signal Processing (《信号处理》), vol. 28, no. 12, 31 December 2012 (2012-12-31), pages 11-14 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105957074A (en) * 2016-04-27 2016-09-21 武汉大学 Line segment matching method and system based on V-shape intersection description and local homography matrix
CN105957074B (en) * 2016-04-27 2018-12-14 深圳积木易搭科技技术有限公司 Line match method and system based on the description of V-type intersection point and local homography matrix
CN108305277A (en) * 2017-12-26 2018-07-20 中国航天电子技术研究院 Heterogeneous image matching method based on straight line segments
CN108305277B (en) * 2017-12-26 2020-12-04 中国航天电子技术研究院 Heterogeneous image matching method based on straight line segments
CN108830781A (en) * 2018-05-24 2018-11-16 桂林航天工业学院 Wide baseline image straight line matching method under a perspective transformation model
CN108830781B (en) * 2018-05-24 2022-05-24 桂林航天工业学院 Wide baseline image straight line matching method under perspective transformation model
WO2020151454A1 (en) * 2019-01-23 2020-07-30 何再兴 Textureless metal part grasping method based on line bundle descriptor image matching

Also Published As

Publication number Publication date
CN103324948B (en) 2016-04-27

Similar Documents

Publication Publication Date Title
CN106780557B (en) Moving object tracking method based on optical flow method and key point features
Naseer et al. Robust visual SLAM across seasons
CN104766084B (en) A kind of nearly copy image detection method of multiple target matching
CN103400384B (en) Wide-angle image matching method combining region matching and point matching
CN104050675B (en) Feature point matching method based on triangle description
CN101833765B (en) Characteristic matching method based on bilateral matching and trilateral restraining
Neubert et al. Beyond holistic descriptors, keypoints, and fixed patches: Multiscale superpixel grids for place recognition in changing environments
CN104751465A (en) ORB (oriented brief) image feature registration method based on LK (Lucas-Kanade) optical flow constraint
CN108305277B (en) Heterogeneous image matching method based on straight line segments
Wu et al. Psdet: Efficient and universal parking slot detection
CN102651069B (en) Contour-based local invariant region detection method
CN103324948B (en) Robust matching method for low-altitude remote sensing images based on line features
CN103914690B (en) Shape matching method based on projective invariant
CN101488223A (en) Image curve characteristic matching method based on average value standard deviation descriptor
CN107818598A (en) A kind of three-dimensional point cloud map amalgamation method of view-based access control model correction
CN104851095A (en) Workpiece image sparse stereo matching method based on improved-type shape context
CN109034237B (en) Loop detection method based on convolutional neural network signposts and sequence search
Chou et al. 2-point RANSAC for scene image matching under large viewpoint changes
Luo et al. Dynamic multitarget detection algorithm of voxel point cloud fusion based on pointrcnn
CN113989308A (en) Polygonal target segmentation method based on Hough transform and template matching
CN103208003B (en) Geometric graphic feature point-based method for establishing shape descriptor
CN102306383B (en) Descriptor constructing method suitable for dense matching of wide baseline image
CN102592277A (en) Curve automatic matching method based on gray subset division
CN103700119A (en) Local texture description method based on local grouping comparison mode column diagram
CN103310456A (en) Multi-temporal/multi-mode remote sensing image registration method based on Gaussian-Hermite moments

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant