CN101882308A - Method for improving accuracy and stability of image mosaic - Google Patents

Method for improving accuracy and stability of image mosaic

Info

Publication number
CN101882308A
Authority
CN
China
Prior art keywords
point
image
individual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 201010216340
Other languages
Chinese (zh)
Inventor
姚达
周军
安然
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN 201010216340 priority Critical patent/CN101882308A/en
Publication of CN101882308A publication Critical patent/CN101882308A/en
Pending legal-status Critical Current

Abstract

The invention relates to a method for improving the accuracy and stability of image mosaic, belonging to the technical field of video image processing. The method comprises the following steps: extracting and matching SIFT feature points of the images to be stitched to obtain an initial set of matching point pairs; eliminating false matches to obtain a consistent maximum inlier set that satisfies the same homography matrix; obtaining an initial estimate of the homography matrix by normalized direct linear transformation (DLT), and refining it by LM iterative optimization to obtain the optimal estimate of the homography matrix; stitching the images; determining the common area of the two images to be stitched and extracting the sub-images of the two images contained in the minimum bounding rectangle of that area; fusing the sub-images seamlessly; and pasting the fused image back onto the common area to obtain the final mosaic. The invention can stitch images even when a large proportion of the initial matching point pairs are false matches, and the resulting mosaic has high accuracy, short running time and good stability.

Description

Method for improving the accuracy and stability of image mosaic
Technical field
The present invention relates to a method in the technical field of video image processing, and specifically to a method for improving the accuracy and stability of image mosaic.
Background technology
Image mosaic technology can be used to build high-resolution images with a wide field of view and is widely applied in virtual reality, medical image processing, remote sensing and other fields. The image mosaic process consists of two steps, image registration and image compositing, of which image registration is the foundation of the whole mosaic. The quality of a mosaic depends mainly on the registration accuracy, so image registration is the core and key of any stitching method. According to the registration technique used, stitching methods are generally divided into two types: region-based methods and feature-based methods. In general, region-based methods have limited registration accuracy and are easily affected by rotation, deformation and illumination changes; feature-based methods, which exploit the feature information of the images, are more robust to rotation, deformation and illumination, and usually achieve higher registration accuracy. A feature-based stitching method typically comprises the following steps: extracting features from the images, matching features between different images, estimating the homography matrix between the images, and transferring the image pixels using the homography matrix. Because feature extraction suffers from localization errors and feature matching produces false matches in practice, the estimation accuracy of the homography matrix used for stitching is affected. In recent years, with the further development of image mosaic technology, many methods have appeared for reducing feature localization errors and eliminating false matches.
A search of the prior art shows that David Lowe, in "Distinctive image features from scale-invariant keypoints" published in the International Journal of Computer Vision in 2004, proposed a method for extracting scale-invariant feature points of an image, i.e. SIFT feature points. By introducing a scale space, the two-dimensional image is expanded into a three-dimensional pyramid structure, and points whose gray-level variation is large in all three directions are taken as feature points, which greatly reduces the feature localization error. However, a large number of false matches still remain after the SIFT feature points are matched.
A further search shows that Matthew Brown and David Lowe, in "Automatic panoramic image stitching using invariant features" published in the International Journal of Computer Vision in 2007, proposed a method for automatic panorama stitching using SIFT feature points. This method uses the random sample consensus (RANSAC) method to eliminate false matches and thereby improve the estimation accuracy of the homography matrix. RANSAC calls a matching point pair that satisfies a given homography matrix within a certain precision an inlier of that homography, and otherwise an outlier. RANSAC can handle false matches among the point pairs, but when false matches make up a large proportion of all matching pairs, RANSAC becomes very time-consuming and the accuracy of the homography matrix also suffers from an insufficient number of sampling iterations.
Summary of the invention
The object of the present invention is to overcome the above deficiencies of the prior art and to propose a method for improving the accuracy and stability of image mosaic. The method exploits the inherent parallelism and global optimality of genetic methods to improve the efficiency, estimation accuracy and robustness of homography estimation, thereby improving the accuracy and stability of image mosaic; it can be widely applied in photogrammetry, remote sensing image processing, medical image analysis, virtual reality and other fields.
The present invention is achieved by the following technical solution, which comprises the following steps:
In the first step, SIFT feature points are extracted from each of the two images to be stitched, the two groups of SIFT feature points are matched, and an initial set of matching point pairs is obtained (a code sketch of this step is given below).
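For illustration only (not part of the original disclosure), a minimal Python sketch of this first step, assuming grayscale input images and using OpenCV's SIFT implementation (128-dimensional descriptors) in place of the 32-dimensional descriptor described in the embodiment; the ratio-test threshold 0.8 follows the text:

```python
import cv2
import numpy as np

def initial_matches(img_left, img_right, ratio=0.8):
    """SIFT extraction and initial matching; returns homogeneous match coordinates."""
    sift = cv2.SIFT_create()
    kp_l, des_l = sift.detectAndCompute(img_left, None)
    kp_r, des_r = sift.detectAndCompute(img_right, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des_l, des_r, k=2)        # two nearest neighbours per feature
    pairs = []
    for m, n in knn:
        if m.distance < ratio * n.distance:          # ratio test (distance form)
            pairs.append((kp_l[m.queryIdx].pt, kp_r[m.trainIdx].pt))
    # homogeneous coordinates (x, y, 1), origin at the top-left pixel of each image
    x_l = np.array([(x, y, 1.0) for (x, y), _ in pairs])
    x_r = np.array([(x, y, 1.0) for _, (x, y) in pairs])
    return x_l, x_r
```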
In the second step, based on genetic consensus estimation of the homography matrix, false matches are rejected from the initial matching point set, yielding the consistent maximum inlier set that satisfies the same homography matrix.
The rejection of false matches comprises the following steps (a condensed code sketch of the whole procedure is given after step 6) below):
1) Select P individuals from the initial matching point set to form an initial population, and compute the inlier count of each individual; at least one individual must have an inlier count greater than or equal to the threshold T_in.
An individual consists of s non-singular matching point pairs selected at random from the initial matching point set.
The value of s is an integer on the interval [4, 10].
The threshold T_in lies in the range [2s, 4s], where s is the number of matching point pairs contained in an individual.
The value of P lies in the range [s+2, 20(s+2)], where s is the number of matching point pairs contained in an individual.
An inlier is a matching point pair whose symmetric transfer error is less than the inlier threshold Thr.
The inlier threshold Thr lies in the range [0.5, 2].
The symmetric transfer error is defined as:

D_ij = ||x_Li - H_j·x_Ri||^2 + ||x_Ri - H_j^(-1)·x_Li||^2,

where D_ij is the symmetric transfer error of the i-th matching point pair (x_Li, x_Ri) of the initial set with respect to the j-th individual I_j, 1 ≤ i ≤ N, and N is the total number of matching point pairs in the initial set; x_Li = (x_Li, y_Li, 1)^T are the homogeneous coordinates of the match point of the first image in the i-th pair, (x_Li, y_Li)^T being the pixel coordinates in a coordinate system whose origin is the top-left pixel of the first image; x_Ri = (x_Ri, y_Ri, 1)^T are the homogeneous coordinates of the match point of the second image in the i-th pair, (x_Ri, y_Ri)^T being the pixel coordinates in a coordinate system whose origin is the top-left pixel of the second image; and H_j is the homography estimate obtained by applying the normalized direct linear transformation to the s matching point pairs contained in the j-th individual I_j, 1 ≤ j ≤ P.
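For illustration only, a numpy sketch of the symmetric transfer error; the points are de-homogenised (divided by their third component) before the pixel distances are taken, which the formula above leaves implicit:

```python
import numpy as np

def symmetric_transfer_error(x_l, x_r, H):
    """Per-pair error D_i = ||x_Li - H x_Ri||^2 + ||x_Ri - H^(-1) x_Li||^2.
    x_l, x_r: (N, 3) arrays of homogeneous coordinates; H: 3x3 homography
    mapping second-image points onto the first image."""
    H_inv = np.linalg.inv(H)
    fwd = (H @ x_r.T).T
    bwd = (H_inv @ x_l.T).T
    fwd = fwd / fwd[:, 2:3]          # de-homogenise (assumption, see lead-in)
    bwd = bwd / bwd[:, 2:3]
    return np.sum((x_l - fwd) ** 2, axis=1) + np.sum((x_r - bwd) ** 2, axis=1)

# Inliers of an individual with homography H_j and threshold Thr:
# inlier_mask = symmetric_transfer_error(x_l, x_r, H_j) < Thr
```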
The normalized direct linear transformation comprises the following steps (a code sketch follows step E)):
A) Compute the normalization matrix T_L of the first image:

T_L = [ 1/f_L   0       -u_L/f_L ]
      [ 0       1/f_L   -v_L/f_L ]
      [ 0       0        1       ],

where u_L and v_L are the coordinate offsets of the first image (their defining formulas appear as figures in the original), f_L = W_L + F_L/2, W_L is the width of the first image in pixels, F_L is the length of the first image in pixels, and f_L is the scale normalization factor of the first image;
B) Compute the normalization matrix T_R of the second image:

T_R = [ 1/f_R   0       -u_R/f_R ]
      [ 0       1/f_R   -v_R/f_R ]
      [ 0       0        1       ],

where u_R and v_R are the coordinate offsets of the second image (their defining formulas appear as figures in the original), f_R = W_R + F_R/2, W_R is the width of the second image in pixels, F_R is the length of the second image in pixels, and f_R is the scale normalization factor of the second image;
C) Normalize the matching point pairs:

x_NLi = T_L·x'_Li,  x_NRi = T_R·x'_Ri,

where x_NLi are the normalized homogeneous coordinates of the match point of the first image, x_NRi are the normalized homogeneous coordinates of the match point of the second image, x'_Li = (x'_Li, y'_Li, 1)^T, x'_Ri = (x'_Ri, y'_Ri, 1)^T, and (x'_Li, x'_Ri) is a given matching point pair;
D) Perform singular value decomposition on the matrix A formed by stacking the matrices A_i of all given matching point pairs:

A = U·D·V^T,

where

A_i = [ 0^T         -x_NRi^T        y_NLi·x_NRi^T  ]
      [ x_NRi^T      0^T            -x_NLi·x_NRi^T ],

x_NLi = (x_NLi, y_NLi, 1)^T, x_NRi = (x_NRi, y_NRi, 1)^T.
The diagonal elements of the diagonal matrix D are sorted in descending order along the diagonal, and h is the last column of V;
E) Remove the normalization to obtain the homography matrix:

H = T_L^(-1)·H~·T_R,

where

H~ = [ h_1  h_2  h_3 ]
     [ h_4  h_5  h_6 ]
     [ h_7  h_8  h_9 ],

h_i is the i-th element of h, and H is the homography estimate produced from the given matching point pairs.
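For illustration only, a numpy sketch of the normalized DLT estimation. Two details are assumptions not fixed by the text above: the translation offsets u, v are taken here as the centroid of the match-point coordinates (their formulas are figures in the original), and the scale factor is reproduced as written; both only condition the linear system.

```python
import numpy as np

def normalisation_matrix(pts, width, length):
    """Normalization matrix T for one image; pts is an (n, 3) array of points."""
    f = width + length / 2.0                    # scale factor as written in the text
    u, v = pts[:, 0].mean(), pts[:, 1].mean()   # offsets: centroid (assumption)
    return np.array([[1.0/f, 0.0, -u/f],
                     [0.0, 1.0/f, -v/f],
                     [0.0, 0.0, 1.0]])

def dlt_homography(x_l, x_r, T_l, T_r):
    """Normalized DLT: returns H mapping second-image points onto the first image."""
    n_l = (T_l @ x_l.T).T
    n_r = (T_r @ x_r.T).T
    rows = []
    for (xl, yl, _), xr in zip(n_l, n_r):
        rows.append(np.concatenate([np.zeros(3), -xr, yl * xr]))   # first row of A_i
        rows.append(np.concatenate([xr, np.zeros(3), -xl * xr]))   # second row of A_i
    A = np.vstack(rows)
    _, _, Vt = np.linalg.svd(A)        # singular values come out in descending order
    h = Vt[-1]                         # right singular vector of the smallest singular value
    return np.linalg.inv(T_l) @ h.reshape(3, 3) @ T_r
```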
2) Randomly divide the initial population into subgroups of s+2 individuals each, and in each subgroup select the two individuals with the largest and second-largest inlier counts as parents;
3) In each subgroup, let the two parents exchange q randomly chosen matching point pairs to generate 2 offspring; the offspring and parents together form 4 intermediate individuals.
q is a random integer on the interval [1, s-1].
4) In each subgroup, keep the intermediate individual with the largest inlier count among the 4 as the candidate individual.
5) Apply mutation to the candidate individual of each subgroup so that each candidate produces s+1 mutated individuals, then apply an adjustment to the mutated individuals; the adjusted mutated individuals together with the candidate individuals form the new population.
In the mutation, the matching point pairs of each candidate individual are divided into inliers and outliers, and a mutated individual is formed by selecting m pairs from the inliers and s-m pairs from the outliers, 0 ≤ m ≤ s. Each candidate individual thus generates s+1 mutated individuals.
An outlier is a matching point pair whose symmetric transfer error is greater than or equal to the inlier threshold Thr.
In the adjustment, each mutated individual generated by a candidate is compared one by one with a discrimination individual; if and only if the inlier count of the discrimination individual is greater than that of the mutated individual, the mutated individual is replaced by the discrimination individual.
A discrimination individual consists of s non-singular matching point pairs selected at random from the inliers of the candidate individual.
6) After steps 2)-5) have been repeated on the new population G times in total, the inliers of the individual with the largest inlier count in the population form the consistent maximum inlier set that satisfies the same homography matrix.
G is given by:

G = log(1 - C) / (P·log(1 - (μ/N)^s)),

where P is the number of individuals in the population, μ is the inlier count of the individual with the largest inlier count after the G-th execution of step 5), N is the total number of matching point pairs in the initial set, and C is the probability that an individual composed of at least s correct matching point pairs has appeared at least once after the G-th execution of step 5).
The value of C lies in the range [0.9, 0.99].
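For illustration only, a much-simplified Python sketch of this rejection procedure. It reuses symmetric_transfer_error and dlt_homography from the sketches above, computes G from the formula just given, and collapses the subgroup crossover and the (s+1)-fold mutation into a single re-sampling of each individual from its current inliers; it is therefore only an outline of the consensus idea, not the full genetic scheme.

```python
import numpy as np

def genetic_inliers(x_l, x_r, T_l, T_r, s=4, P=12, thr=1.0, C=0.99):
    """Return indices of the consistent maximum inlier set (simplified sketch)."""
    N = len(x_l)
    rng = np.random.default_rng()

    def inlier_count(idx):
        H = dlt_homography(x_l[idx], x_r[idx], T_l, T_r)
        return int(np.sum(symmetric_transfer_error(x_l, x_r, H) < thr))

    # initial population: P individuals of s random pairs (degenerate draws not filtered here)
    pop = [rng.choice(N, size=s, replace=False) for _ in range(P)]
    best = max(pop, key=inlier_count)
    mu = max(inlier_count(best), 1)                    # guard so the logarithm below is defined
    G = int(np.ceil(np.log(1 - C) / (P * np.log(1 - (mu / N) ** s))))

    for _ in range(G):
        new_pop = []
        for idx in pop:
            H = dlt_homography(x_l[idx], x_r[idx], T_l, T_r)
            inl = np.flatnonzero(symmetric_transfer_error(x_l, x_r, H) < thr)
            if len(inl) >= s:                          # condensed stand-in for crossover + mutation
                idx = rng.choice(inl, size=s, replace=False)
            new_pop.append(idx)
        pop = new_pop
        best = max(pop, key=inlier_count)

    H_best = dlt_homography(x_l[best], x_r[best], T_l, T_r)
    return np.flatnonzero(symmetric_transfer_error(x_l, x_r, H_best) < thr)
```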
In the third step, all matching point pairs of the consistent maximum inlier set are processed by the normalized direct linear transformation to obtain the initial estimate H_init of the homography matrix, and the initial estimate is refined by LM (Levenberg-Marquardt) iterative optimization to obtain the optimal estimate H_opt of the homography matrix.
In the LM iterative optimization, H_opt is the optimal estimate of the homography matrix when the total symmetric transfer error E of the following formula no longer changes:

E = Σ_{j=1..M} ( ||x_INLj - H_opt·x_INRj||^2 + ||x_INRj - H_opt^(-1)·x_INLj||^2 ),  subject to ||H_opt||_F^2 = 1,

where (x_INLj, x_INRj) is the j-th matching point pair of the consistent maximum inlier set, 1 ≤ j ≤ M, M is the number of matching point pairs in the consistent maximum inlier set, ||·||_F denotes the Frobenius norm of a matrix, the initial value of H_opt is taken as H_init/||H_init||_F, and H_init is the initial estimate of the homography matrix.
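For illustration only, a sketch of this refinement using scipy's least_squares; the unit Frobenius norm constraint is enforced by renormalizing the parameter vector inside the residual function, and enough inliers are assumed so that the number of residuals exceeds the nine parameters.

```python
import numpy as np
from scipy.optimize import least_squares

def refine_homography(H_init, x_l_in, x_r_in):
    """Levenberg-Marquardt refinement of H over the consistent inlier set."""
    def residuals(h):
        H = (h / np.linalg.norm(h)).reshape(3, 3)     # keep ||H||_F = 1
        H_inv = np.linalg.inv(H)
        fwd = (H @ x_r_in.T).T
        bwd = (H_inv @ x_l_in.T).T
        fwd = fwd / fwd[:, 2:3]
        bwd = bwd / bwd[:, 2:3]
        # stack the symmetric transfer residuals of every inlier pair
        return np.concatenate([(x_l_in - fwd)[:, :2].ravel(),
                               (x_r_in - bwd)[:, :2].ravel()])

    h0 = (H_init / np.linalg.norm(H_init)).ravel()
    res = least_squares(residuals, h0, method='lm')
    h = res.x / np.linalg.norm(res.x)
    return h.reshape(3, 3)
```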
In the fourth step, the optimal estimate H_opt of the homography matrix is used to stitch the images: the common area of the two images to be stitched is determined, and the sub-image I_L^RPReg of the first image and the sub-image I_R^RPReg of the second image contained in the minimum bounding rectangle of the common area are obtained.
The image stitching comprises the following steps (a code sketch follows step 2)):
1) Stitch the second image onto the first image. The region Reg2 occupied by the second image is obtained according to Reg2 = Quad{R_k / R_k(3)}, where R_k = H_opt·V_k, R_k are the homogeneous coordinates of the k-th vertex after the second image is stitched onto the first image, 1 ≤ k ≤ 4, V_k are the homogeneous coordinates of the k-th vertex of the second image itself, V_1 = (1, 1, 1)^T, V_2 = (1, F_L, 1)^T, V_3 = (W_L, F_L, 1)^T, V_4 = (W_L, 1, 1)^T, W_L is the width of the first image in pixels, F_L is the length of the first image in pixels, Quad{·} denotes the region enclosed by the quadrilateral formed by connecting the 4 points in order, and R_k(3) denotes the third component of the vector R_k;
2) The region Reg1 occupied by the first image is obtained by the method of 1); the intersection of Reg2 and Reg1 is the common area of the two images to be stitched, and the content of the second image outside the common area is stitched onto the first image.
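For illustration only, a numpy sketch of the vertex mapping of step 1): the four corners V_k of the image being warped are mapped by H_opt and divided by their third component; the quadrilateral they bound is the occupied region.

```python
import numpy as np

def mapped_region(H_opt, width, length):
    """Corners of the warped image in the target frame: R_k = H_opt V_k, then R_k / R_k(3)."""
    V = np.array([[1.0, 1.0, 1.0],
                  [1.0, length, 1.0],
                  [width, length, 1.0],
                  [width, 1.0, 1.0]]).T        # V_1..V_4 as columns
    R = H_opt @ V
    return (R[:2] / R[2]).T                    # 4 x 2 pixel coordinates bounding Reg2

# The intersection of this quadrilateral with the target image's own rectangle
# is the common area of the two images to be stitched.
```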
In the fifth step, the sub-image I_L^RPReg of the first image and the sub-image I_R^RPReg of the second image are fused seamlessly, and the fused image is pasted back onto the common area to obtain the final result of the image mosaic.
Compared with the prior art, the beneficial effects of the invention are as follows. The homography matrix used for image mosaic can be estimated quickly, stably and accurately even when a large number of false matches exist among the initial matching point pairs. By controlling the composition of the initial population and further dividing it into subgroups, several regions of the solution space are searched simultaneously, in sharp contrast to the prior-art RANSAC, which searches only a single region at a time; this makes fast homography estimation possible when many false matches are present. The mutated individuals search purposefully according to the likely distribution of the inliers, so the consistent maximum inlier set can be found more accurately in a global sense, in sharp contrast to RANSAC, which chooses its search region blindly at random every time; this significantly improves the estimation accuracy and robustness of the homography and thus guarantees the accuracy of the mosaic. In the normalized direct linear transformation adopted here, the scale normalization factors of the two images depend only on the length and width of the images themselves, so the result of the processing is independent of the specific distribution of the points, which improves stability.
Description of drawings
Fig. 1 is a schematic diagram of the images to be stitched and of the stitched image in the embodiment;
wherein: (a) is the first image to be stitched; (b) is the second image to be stitched; (c) is the stitched image.
Fig. 2 is a schematic diagram of the Gaussian pyramid and DOG pyramid structures of the scale space in the embodiment.
Fig. 3 is a schematic diagram of scale-space extremum detection in the embodiment.
Fig. 4 is a schematic diagram of the feature descriptor in the embodiment;
wherein: (a) is a schematic diagram of rotating the coordinate axes to the direction of the keypoint; (b) is a schematic diagram of the 32-dimensional descriptor of a feature point.
Fig. 5 is a comparison of the inlier detection capability obtained in the embodiment with the method of the embodiment and with the prior-art RANSAC method.
Fig. 6 is a comparison of the estimation errors obtained in the embodiment with the method of the embodiment and with the prior-art RANSAC method.
Fig. 7 is a comparison of the running times obtained in the embodiment with the method of the embodiment and with the prior-art RANSAC method.
Embodiment
The method of the present invention is further described below in conjunction with the accompanying drawings. The present embodiment is implemented on the premise of the technical solution of the present invention and gives a detailed implementation and concrete operating procedure, but the scope of protection of the present invention is not limited to the following embodiment.
Embodiment
The present embodiment stitches the image shown in Fig. 1(a) to the image shown in Fig. 1(b); the resulting stitched image is shown in Fig. 1(c). The embodiment specifically comprises the following steps:
In the first step, SIFT feature points are extracted from each of the two images to be stitched, the two groups of SIFT feature points are matched, and the initial matching point set {(x_Li, x_Ri) | 1 ≤ i ≤ N} is obtained, where N is the total number of matching point pairs in the initial set; x_Li = (x_Li, y_Li, 1)^T are the homogeneous coordinates of the match point of the first image in the i-th pair, (x_Li, y_Li)^T being the pixel coordinates in a coordinate system whose origin is the top-left pixel of the first image; x_Ri = (x_Ri, y_Ri, 1)^T are the homogeneous coordinates of the match point of the second image in the i-th pair, (x_Ri, y_Ri)^T being the pixel coordinates in a coordinate system whose origin is the top-left pixel of the second image.
The extraction and matching of the SIFT feature points proceed as follows:
1) Build the Gaussian pyramid
The Gaussian pyramid has o octaves, each octave containing l scale images; the present embodiment builds a Gaussian pyramid with (o, l) = (4, 5). The 1st layer of the 1st octave is the original image magnified 2 times; within an octave the scale factors of adjacent layers differ by a factor k, so if the scale factor of the 1st layer of the 1st octave is σ, the 2nd layer of the 1st octave has scale factor kσ, and so on for the remaining layers. The 1st layer of the 2nd octave is obtained by subsampling the middle layer of the 1st octave and has scale factor k^2·σ; the 2nd layer of the 2nd octave then has scale factor k times that of its 1st layer, i.e. k^3·σ. The 1st layer of the 3rd octave is obtained by subsampling the middle layer of the 2nd octave, and the remaining octaves are formed analogously. The structure of the resulting Gaussian pyramid is shown in the left half of Fig. 2. For a two-dimensional image I(x, y), its scale-space representation L(x, y, σ) at scale σ is obtained by convolving I(x, y) with the Gaussian kernel G(x, y, σ):

L(x, y, σ) = G(x, y, σ) * I(x, y),

where

G(x, y, σ) = (1 / (2πσ^2))·exp(-(x^2 + y^2) / (2σ^2)).

The present embodiment takes σ = 1.6.
2) Build the DOG pyramid
The DOG (difference-of-Gaussian) pyramid is obtained by subtracting adjacent scale-space images of the Gaussian pyramid. The scale factor of the 1st layer of the 1st octave of the DOG pyramid coincides with that of the Gaussian pyramid, and likewise for the other octaves. The DOG pyramid obtained in the present embodiment is shown in the right half of Fig. 2.
3) Extremum detection in the DOG space
To detect the maxima and minima of the DOG space, each pixel of a DOG scale-space image is compared with the values of its 26 neighbouring pixels. As shown in Fig. 3, if the pixel marked with a cross is larger than, or smaller than, the DOG values of all 26 neighbouring pixels, this point is taken as a SIFT feature point, and its position, octave and layer are recorded.
4) Determine the principal direction of a feature point
According to the octave and layer of the feature point, the corresponding image layer is found in the Gaussian pyramid. Centred at the position of the feature point, the magnitude and direction of the gradient at every point of a 9 × 9 square neighbourhood are computed, and the gradient directions of the neighbourhood pixels are accumulated in a gradient histogram. The histogram covers 0°-360°, each 10 degrees forming one direction, i.e. 36 directions in total. The direction of the histogram peak represents the principal direction of the feature point. The gradient magnitude m(x, y, σ) and direction θ(x, y, σ) of a point in the square neighbourhood are computed as:

m(x, y, σ) = sqrt( (L(x+1, y, σ) - L(x-1, y, σ))^2 + (L(x, y+1, σ) - L(x, y-1, σ))^2 ),

θ(x, y, σ) = arctan( (L(x, y+1, σ) - L(x, y-1, σ)) / (L(x+1, y, σ) - L(x-1, y, σ)) ).
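For illustration only, a small numpy sketch of this orientation assignment; arctan2 is used instead of the plain arctan so that all four quadrants are handled, the keypoint is assumed to lie far enough from the image border, and no Gaussian weighting is applied:

```python
import numpy as np

def principal_direction(L, x, y, radius=4):
    """Peak of the 36-bin (10 degrees per bin) gradient-orientation histogram
    over the 9x9 neighbourhood of a keypoint in the Gaussian-blurred layer L."""
    hist = np.zeros(36)
    for j in range(y - radius, y + radius + 1):
        for i in range(x - radius, x + radius + 1):
            dx = L[j, i + 1] - L[j, i - 1]
            dy = L[j + 1, i] - L[j - 1, i]
            m = np.hypot(dx, dy)                       # gradient magnitude
            theta = np.degrees(np.arctan2(dy, dx)) % 360.0
            hist[int(theta // 10) % 36] += m
    return np.argmax(hist) * 10                        # principal direction in degrees
```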
5) Determine the descriptor of a feature point
To ensure rotational invariance, the coordinate axes are first rotated to the direction of the keypoint. Centred at the keypoint, an 8 × 8 window is taken and divided into 2 × 2 sub-windows, as shown in Fig. 4.
The centre of Fig. 4(a) represents the position of the current keypoint; each small cell represents a pixel of the keypoint neighbourhood at the same scale as the keypoint; the direction of an arrow in a cell represents the gradient direction of that pixel, the length of the arrow represents the gradient magnitude, and the circle represents the range of the Gaussian weighting. In Fig. 4(b), each of the 2 × 2 sub-windows is a 4 × 4 block; on each 4 × 4 block an 8-direction gradient orientation histogram is computed and the accumulated value of each direction is recorded, forming one seed point. A feature point is thus composed of 2 × 2 = 4 seed points, each containing the information of 8 directions, so the descriptor of each feature point has 4 × 8 = 32 dimensions.
The matching proceeds as follows. First, the descriptors of all feature points of the first image and the second image are length-normalized; then, for each feature point of the first image, the two feature points of the second image with the largest similarity measures are found. If the second-largest similarity measure divided by the largest similarity measure is less than the proportion threshold 0.8, the feature point of the first image and its most similar feature point of the second image form a matching point pair. The similarity measure Sd between two feature points is:

Sd(Desc1, Desc2) = Desc1^T · Desc2,

where Desc1 and Desc2 denote descriptors represented as 32-dimensional vectors. The larger the value of Sd, the more similar the two feature points.
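For illustration only, a numpy sketch of this matching rule: descriptors are length-normalized, the similarity is the dot product Sd, and a pair is accepted when the second-best similarity is below 0.8 times the best one (assuming non-negative descriptor entries, as for gradient-histogram descriptors, and at least two candidates per feature):

```python
import numpy as np

def match_descriptors(desc_l, desc_r, ratio=0.8):
    """For each left descriptor, return its best right match passing the ratio test."""
    desc_l = desc_l / np.linalg.norm(desc_l, axis=1, keepdims=True)
    desc_r = desc_r / np.linalg.norm(desc_r, axis=1, keepdims=True)
    sim = desc_l @ desc_r.T                 # Sd(Desc1, Desc2) = Desc1^T . Desc2
    matches = []
    for i, row in enumerate(sim):
        best, second = np.argsort(row)[-2:][::-1]
        if row[second] < ratio * row[best]:
            matches.append((i, int(best)))
    return matches
```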
In the second step, genetic consensus estimation based on the homography matrix is applied to the initial matching point set to reject false matches, yielding the consistent maximum inlier set that satisfies the same homography matrix.
The rejection of false matches comprises the following steps:
1) Select P (P = 12) individuals from the initial matching point set to form an initial population, and compute the inlier count of each individual; at least one individual must have an inlier count greater than or equal to the threshold T_in (T_in = 12);
An individual consists of s (s = 4) non-singular matching point pairs selected at random from the initial matching point set.
An inlier is a matching point pair whose symmetric transfer error is less than the inlier threshold Thr (Thr = 1).
The symmetric transfer error is defined as:

D_ij = ||x_Li - H_j·x_Ri||^2 + ||x_Ri - H_j^(-1)·x_Li||^2,

where D_ij is the symmetric transfer error of the i-th matching point pair (x_Li, x_Ri) of the initial set with respect to the j-th individual I_j, 1 ≤ i ≤ N, N is the total number of matching point pairs in the initial set, x_Li and x_Ri are the homogeneous coordinates of the match points of the first and second images as defined in the first step, and H_j is the homography estimate obtained by applying the normalized direct linear transformation to the s matching point pairs contained in the j-th individual I_j, 1 ≤ j ≤ P.
The normalized direct linear transformation is carried out exactly as in steps A)-E) of the second step described in the summary above and is not repeated here.
2) Randomly divide the initial population into subgroups of 6 individuals each, and in each subgroup select the two individuals with the largest and second-largest inlier counts as parents;
3) In each subgroup, let the two parents exchange q (q = 2) randomly chosen matching point pairs to generate 2 offspring; the offspring and parents together form 4 intermediate individuals.
4) In each subgroup, keep the intermediate individual with the largest inlier count among the 4 as the candidate individual.
5) Apply mutation to the candidate individual of each subgroup so that each candidate produces 5 mutated individuals, then apply an adjustment to the mutated individuals; the adjusted mutated individuals together with the candidate individuals form the new population.
In the mutation, the matching point pairs of each candidate individual are divided into inliers and outliers, and a mutated individual is formed by selecting m pairs from the inliers and 4-m pairs from the outliers, 0 ≤ m ≤ 4. Each candidate individual thus generates 5 mutated individuals.
An outlier is a matching point pair whose symmetric transfer error is greater than or equal to the inlier threshold Thr (Thr = 1).
In the adjustment, each mutated individual generated by a candidate is compared one by one with a discrimination individual; if and only if the inlier count of the discrimination individual is greater than that of the mutated individual, the mutated individual is replaced by the discrimination individual.
A discrimination individual consists of 4 non-singular matching point pairs selected at random from the inliers of the candidate individual.
6) After steps 2)-5) have been repeated on the new population G times in total, the inliers of the individual with the largest inlier count in the population form the consistent maximum inlier set that satisfies the same homography matrix.
G is given by:

G = log(1 - C) / (P·log(1 - (μ/N)^s)),

where P is the number of individuals in the population, μ is the inlier count of the individual with the largest inlier count after the G-th execution of step 5), N is the total number of matching point pairs in the initial set, and C is the probability that an individual composed of 4 correct matching point pairs has appeared at least once after the G-th execution of step 5); C = 0.99.
In the third step, all matching point pairs of the consistent maximum inlier set are processed by the normalized direct linear transformation to obtain the initial estimate H_init of the homography matrix, and the initial estimate is refined by LM iterative optimization to obtain the optimal estimate H_opt of the homography matrix.
The normalized direct linear transformation here is carried out with reference to the normalized direct linear transformation of the second step.
In the LM iterative optimization, according to the following formula, H_opt is the optimal estimate of the homography matrix when the total symmetric transfer error E no longer changes:

E = Σ_{j=1..M} ( ||x_INLj - H_opt·x_INRj||^2 + ||x_INRj - H_opt^(-1)·x_INLj||^2 ),  subject to ||H_opt||_F^2 = 1,

where (x_INLj, x_INRj) is the j-th matching point pair of the consistent maximum inlier set, 1 ≤ j ≤ M, M is the number of matching point pairs in the consistent maximum inlier set, ||·||_F denotes the Frobenius norm of a matrix, the initial value of H_opt is taken as H_init/||H_init||_F, and H_init is the initial estimate of the homography matrix.
In the fourth step, the optimal estimate H_opt of the homography matrix is used to stitch the images: the common area of the two images to be stitched is determined, and the sub-image I_L^RPReg of Fig. 1(b) and the sub-image I_R^RPReg of Fig. 1(a) contained in the minimum bounding rectangle of the common area are obtained.
The image stitching comprises the following steps:
1) Stitch Fig. 1(a) onto Fig. 1(b). The region Reg2 occupied by Fig. 1(a) is obtained according to Reg2 = Quad{R_k / R_k(3)}, where R_k = H_opt·V_k, R_k are the homogeneous coordinates of the k-th vertex after Fig. 1(a) is stitched onto Fig. 1(b), 1 ≤ k ≤ 4, V_k are the homogeneous coordinates of the k-th vertex of Fig. 1(a) itself, V_1 = (1, 1, 1)^T, V_2 = (1, F_L, 1)^T, V_3 = (W_L, F_L, 1)^T, V_4 = (W_L, 1, 1)^T, W_L is the width of the first image in pixels, F_L is the length of the first image in pixels, Quad{·} denotes the region enclosed by the quadrilateral formed by connecting the 4 points in order, and R_k(3) denotes the third component of the vector R_k;
For the homogeneous coordinates x of any integer pixel of the region Reg2, its gray value can be determined from the gray value of the corresponding point x_R = H_opt^(-1)·x of the second image. Let x_R = x_R / x_R(3); then (x_R(1), x_R(2))^T is the pixel coordinate of that point. Since (x_R(1), x_R(2))^T is in general not integer-valued, bilinear interpolation is used to determine the pixel value at that position and assign it to the point x of the region Reg2.
The bilinear interpolation uses the following formula:

I_R(x_R(1), x_R(2)) = (1-a)(1-b)·I_R(c, d) + a(1-b)·I_R(c, d+1) + a·b·I_R(c+1, d+1) + (1-a)·b·I_R(c+1, d),

where I_R(x_R(1), x_R(2)) denotes the gray value of the second image at the point (x_R(1), x_R(2)); c is the largest integer smaller than or equal to x_R(1); d is the largest integer smaller than or equal to x_R(2); a = x_R(2) - d; b = x_R(1) - c.
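For illustration only, a direct transcription of this interpolation formula in Python (the sampled position is assumed to lie strictly inside the image so that the four surrounding pixels exist; whether the first index runs over rows or columns depends on the image convention used):

```python
import numpy as np

def bilinear(I, p, q):
    """Gray value of image I at the non-integer position (p, q):
    c = floor(p), d = floor(q), b = p - c, a = q - d, then the weighted
    sum of the four surrounding pixels as in the formula above."""
    c, d = int(np.floor(p)), int(np.floor(q))
    b, a = p - c, q - d
    return ((1 - a) * (1 - b) * I[c, d] + a * (1 - b) * I[c, d + 1]
            + a * b * I[c + 1, d + 1] + (1 - a) * b * I[c + 1, d])
```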
2) The region Reg1 occupied by Fig. 1(b) is obtained by the method of 1); the intersection of Reg2 and Reg1 is the common area of the two images to be stitched, and the content of Fig. 1(a) outside the common area is stitched onto Fig. 1(b).
In the fifth step, the Multi-band Blending method is adopted to fuse the sub-image I_L^RPReg of Fig. 1(b) and the sub-image I_R^RPReg of Fig. 1(a) seamlessly, and the fused image is pasted back onto the common area to obtain the final result of the image mosaic.
The seamless fusion specifically comprises the following steps:
1) Build the Gaussian pyramids of the two sub-images
Taking the Gaussian pyramid of the sub-image I_L^RPReg as an example: I_L^RPReg is used as the first (bottom) layer of the Gaussian pyramid, and the k-th layer I_Lk^RPReg of the Gaussian pyramid is constructed as

I_Lk^RPReg(i, j) = Σ_{m=-2..2} Σ_{n=-2..2} w(m, n)·I_L(k-1)^RPReg(2i + m, 2j + n),

where w(m, n) = w(m)·w(n), w(0) = 3/8, w(±1) = 1/4, w(±2) = 1/16, 1 ≤ k ≤ N, and N is the number of layers of the Gaussian pyramid.
The Gaussian pyramid I_Rk^RPReg of the sub-image I_R^RPReg is built in the same manner.
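For illustration only, a sketch of this reduction step with numpy and scipy (the 'nearest' border handling and even image dimensions are assumptions made for brevity):

```python
import numpy as np
from scipy.ndimage import convolve

W1D = np.array([1/16, 1/4, 3/8, 1/4, 1/16])   # w(0)=3/8, w(+-1)=1/4, w(+-2)=1/16
W2D = np.outer(W1D, W1D)                      # separable weights w(m, n) = w(m) w(n)

def reduce_level(img):
    """One Gaussian-pyramid reduction: 5x5 weighting followed by subsampling by 2."""
    return convolve(img, W2D, mode='nearest')[::2, ::2]

def gaussian_pyramid(img, levels):
    """Pyramid with the sub-image as the bottom layer and `levels` layers in total."""
    pyr = [img.astype(float)]
    for _ in range(levels - 1):
        pyr.append(reduce_level(pyr[-1]))
    return pyr
```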
2) Build the Laplacian pyramids of the two sub-images
Taking the Laplacian pyramid of the sub-image I_L^RPReg as an example, let the interpolated enlargement of the k-th layer of the Gaussian pyramid be I*_Lk^RPReg:

I*_Lk^RPReg(i, j) = 4·Σ_{m=-2..2} Σ_{n=-2..2} w(m, n)·I'_Lk^RPReg((i + m)/2, (j + n)/2),

where I'_Lk^RPReg((i + m)/2, (j + n)/2) equals I_Lk^RPReg((i + m)/2, (j + n)/2) when (i + m)/2 and (j + n)/2 are both integers and 0 otherwise. Introducing the enlargement operator Expand, the interpolated enlargement is abbreviated as I*_Lk^RPReg = Expand(I_Lk^RPReg).
With this, the Laplacian pyramid LPI_Lk^RPReg of the sub-image I_L^RPReg is constructed as

LPI_Lk^RPReg = I_Lk^RPReg - Expand(I_L(k+1)^RPReg), 1 ≤ k ≤ N-1,  LPI_LN^RPReg = I_LN^RPReg.

The Laplacian pyramid LPI_Rk^RPReg of the sub-image I_R^RPReg is built in the same manner.
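For illustration only, a sketch of the Expand operator and of the Laplacian pyramid built from it, using the same 5x5 kernel as in the previous sketch (even layer dimensions are assumed so that the shapes line up):

```python
import numpy as np
from scipy.ndimage import convolve

W1D = np.array([1/16, 1/4, 3/8, 1/4, 1/16])
W2D = np.outer(W1D, W1D)                      # same kernel as in the reduction sketch

def expand_level(img, shape):
    """Expand: insert zeros between samples, then apply the 5x5 kernel scaled by 4."""
    up = np.zeros(shape)
    up[::2, ::2] = img            # I' keeps the original values at even positions, 0 elsewhere
    return 4.0 * convolve(up, W2D, mode='nearest')

def laplacian_pyramid(gauss_pyr):
    """LPI_k = I_k - Expand(I_{k+1}) for k < N; the top layer is kept as is."""
    lap = []
    for k in range(len(gauss_pyr) - 1):
        lap.append(gauss_pyr[k] - expand_level(gauss_pyr[k + 1], gauss_pyr[k].shape))
    lap.append(gauss_pyr[-1])
    return lap
```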
3) Build the Gaussian pyramid of the fusion coefficients
Taking the fusion-coefficient Gaussian pyramid of the sub-image I_L^RPReg as an example: first build a coefficient image IC_L of the same size as the region RPReg, set the gray value of IC_L to 1 on the half nearer Fig. 1(b) and to 0 on the half nearer the other image, and build the Gaussian pyramid IC_Lk of the image IC_L; this is the fusion-coefficient Gaussian pyramid of the sub-image I_L^RPReg. At the same time, the fusion-coefficient Gaussian pyramid of the sub-image I_R^RPReg is IC_Rk = 1 - IC_Lk.
4) Build the Laplacian pyramid of the fused image
Let the fused image be I^RPReg; its Laplacian pyramid LPI_k^RPReg is constructed as

LPI_k^RPReg(i, j) = IC_Lk(i, j)·LPI_Lk^RPReg(i, j) + IC_Rk(i, j)·LPI_Rk^RPReg(i, j).
5) Reconstruct the fused image
From the Laplacian pyramid LPI_k^RPReg of the fused image, the Gaussian pyramid I_k^RPReg of the fused image is rebuilt from the top layer downwards as

I_k^RPReg = LPI_k^RPReg + Expand(I_(k+1)^RPReg), 1 ≤ k ≤ N-1,  I_N^RPReg = LPI_N^RPReg.

The first layer I_1^RPReg of the rebuilt Gaussian pyramid is the fused image of the sub-images I_L^RPReg and I_R^RPReg.
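For illustration only, a sketch that strings steps 3)-5) together using the pyramid helpers from the previous sketches (gaussian_pyramid, laplacian_pyramid, expand_level); the 0/1 coefficient image, with 1 on the half nearer the first image, is assumed to be given:

```python
import numpy as np

def multiband_blend(sub_l, sub_r, coeff_l, levels=4):
    """Blend the Laplacian pyramids of the two sub-images with the Gaussian
    pyramid of the coefficient image, then reconstruct the fused image."""
    gl = gaussian_pyramid(sub_l, levels)
    gr = gaussian_pyramid(sub_r, levels)
    ll = laplacian_pyramid(gl)
    lr = laplacian_pyramid(gr)
    gc = gaussian_pyramid(coeff_l.astype(float), levels)       # IC_Lk; IC_Rk = 1 - IC_Lk
    blended = [c * a + (1 - c) * b for a, b, c in zip(ll, lr, gc)]
    fused = blended[-1]                                         # top layer
    for k in range(levels - 2, -1, -1):                         # I_k = LPI_k + Expand(I_{k+1})
        fused = blended[k] + expand_level(fused, blended[k].shape)
    return fused
```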
An experiment was carried out on the Graffiti database provided by the Visual Geometry Group. 100 pairs of correctly matching data points were obtained in the two images by the SIFT method, and different numbers of erroneous matching point pairs were then introduced at random to form different proportions of correct matching pairs; the resulting performance differences between the method of the present embodiment (HM-GCE) and the existing RANSAC method under the different proportions of correct data points are shown in Fig. 5, Fig. 6 and Fig. 7. As can be seen from Fig. 5, at every proportion of correct matching data points the number of inliers detected by the present method is larger than that of the RANSAC method and remains stable near 85, which shows that the present method detects inliers more robustly and more stably and thus provides the basis for the accuracy and stability of the estimation result. Fig. 6 shows that, at every proportion of correct matching data points, the estimation error of the present method with respect to the target homography matrix is smaller than that of the RANSAC method; the estimation error is measured by the Frobenius norm ||Ĥ - H||_F, where Ĥ is the homography estimate and H is the true homography matrix. Fig. 7 shows that, although the time consumption of the present method is slightly larger than that of the RANSAC method when the proportion of correct matching pairs is large, the time consumption of the present method is much smaller than that of the RANSAC method when a large number of erroneous matching points exist.
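For illustration only, a sketch of such an error measure; the original does not state how the two matrices are normalized before being compared, so unit Frobenius norm and the better of the two possible signs are assumed here:

```python
import numpy as np

def homography_error(H_est, H_true):
    """Frobenius-norm distance between the estimated and true homographies,
    both scaled to unit Frobenius norm to remove the projective scale."""
    A = H_est / np.linalg.norm(H_est)
    B = H_true / np.linalg.norm(H_true)
    return min(np.linalg.norm(A - B), np.linalg.norm(A + B))
```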

Claims (9)

1. A method for improving the accuracy and stability of image mosaic, characterized in that it comprises the following steps:
in the first step, SIFT feature points are extracted from each of the two images to be stitched, the two groups of SIFT feature points are matched, and an initial set of matching point pairs is obtained;
in the second step, based on genetic consensus estimation of the homography matrix, false matches are rejected from the initial matching point set, yielding the consistent maximum inlier set that satisfies the same homography matrix;
in the third step, all matching point pairs of the consistent maximum inlier set are processed by the normalized direct linear transformation to obtain the initial estimate H_init of the homography matrix, and the initial estimate of the homography matrix is refined by LM iterative optimization to obtain the optimal estimate H_opt of the homography matrix;
in the fourth step, the optimal estimate H_opt of the homography matrix is used to stitch the images, the common area of the two images to be stitched is determined, and the sub-image I_L^RPReg of the first image and the sub-image I_R^RPReg of the second image contained in the minimum bounding rectangle of the common area are obtained;
in the fifth step, the sub-image I_L^RPReg of the first image and the sub-image I_R^RPReg of the second image are fused seamlessly, and the fused image is pasted back onto the common area to obtain the final result of the image mosaic.
2. The method for improving the accuracy and stability of image mosaic according to claim 1, characterized in that the rejection of false matches in the second step comprises the following steps:
1) select P individuals from the initial matching point set to form an initial population, and compute the inlier count of each individual; at least one individual must have an inlier count greater than or equal to the threshold T_in;
an individual consists of s non-singular matching point pairs selected at random from the initial matching point set;
an inlier is a matching point pair whose symmetric transfer error is less than the inlier threshold Thr;
2) randomly divide the initial population into subgroups of s+2 individuals each, and in each subgroup select the two individuals with the largest and second-largest inlier counts as parents;
3) in each subgroup, let the two parents exchange q randomly chosen matching point pairs to generate 2 offspring; the offspring and parents together form 4 intermediate individuals;
4) in each subgroup, keep the intermediate individual with the largest inlier count among the 4 as the candidate individual;
5) apply mutation to the candidate individual of each subgroup so that each candidate produces s+1 mutated individuals, then apply an adjustment to the mutated individuals; the adjusted mutated individuals together with the candidate individuals form the new population;
6) after steps 2)-5) have been repeated on the new population G times in total, the inliers of the individual with the largest inlier count in the population form the consistent maximum inlier set that satisfies the same homography matrix;
G is given by:

G = log(1 - C) / (P·log(1 - (μ/N)^s)),

where P is the number of individuals in the population, μ is the inlier count of the individual with the largest inlier count after the G-th execution of step 5), N is the total number of matching point pairs in the initial set, and C is the probability that an individual composed of at least s correct matching point pairs has appeared at least once after the G-th execution of step 5);
the value of C lies in the range [0.9, 0.99].
3. The method for improving the accuracy and stability of image mosaic according to claim 2, characterized in that the symmetric transfer error is:

D_ij = ||x_Li - H_j·x_Ri||^2 + ||x_Ri - H_j^(-1)·x_Li||^2,

where D_ij is the symmetric transfer error of the i-th matching point pair (x_Li, x_Ri) of the initial set with respect to the j-th individual I_j, 1 ≤ i ≤ N, and N is the total number of matching point pairs in the initial set; x_Li = (x_Li, y_Li, 1)^T are the homogeneous coordinates of the match point of the first image in the i-th pair, (x_Li, y_Li)^T being the pixel coordinates in a coordinate system whose origin is the top-left pixel of the first image; x_Ri = (x_Ri, y_Ri, 1)^T are the homogeneous coordinates of the match point of the second image in the i-th pair, (x_Ri, y_Ri)^T being the pixel coordinates in a coordinate system whose origin is the top-left pixel of the second image; and H_j is the homography estimate obtained by applying the normalized direct linear transformation to the s matching point pairs contained in the j-th individual I_j, 1 ≤ j ≤ P.
4. The method for improving the accuracy and stability of image mosaic according to claim 1 or 3, characterized in that the normalized direct linear transformation comprises the following steps:
A) obtain the normalization matrix T_L of the first image:

T_L = [ 1/f_L   0       -u_L/f_L ]
      [ 0       1/f_L   -v_L/f_L ]
      [ 0       0        1       ],

where u_L and v_L are the coordinate offsets of the first image (their defining formulas appear as figures in the original), f_L = W_L + F_L/2, W_L is the width of the first image in pixels, F_L is the length of the first image in pixels, and f_L is the scale normalization factor of the first image;
B) obtain the normalization matrix T_R of the second image:

T_R = [ 1/f_R   0       -u_R/f_R ]
      [ 0       1/f_R   -v_R/f_R ]
      [ 0       0        1       ],

where u_R and v_R are the coordinate offsets of the second image (their defining formulas appear as figures in the original), f_R = W_R + F_R/2, W_R is the width of the second image in pixels, F_R is the length of the second image in pixels, and f_R is the scale normalization factor of the second image;
C) normalize the matching point pairs:

x_NLi = T_L·x'_Li,  x_NRi = T_R·x'_Ri,

where x_NLi are the normalized homogeneous coordinates of the match point of the first image, x_NRi are the normalized homogeneous coordinates of the match point of the second image, x'_Li = (x'_Li, y'_Li, 1)^T, x'_Ri = (x'_Ri, y'_Ri, 1)^T, and (x'_Li, x'_Ri) is a given matching point pair;
D) perform singular value decomposition on the matrix A formed by stacking the matrices A_i of all given matching point pairs:

A = U·D·V^T,

where

A_i = [ 0^T         -x_NRi^T        y_NLi·x_NRi^T  ]
      [ x_NRi^T      0^T            -x_NLi·x_NRi^T ],

x_NLi = (x_NLi, y_NLi, 1)^T, x_NRi = (x_NRi, y_NRi, 1)^T;
the diagonal elements of the diagonal matrix D are sorted in descending order along the diagonal, and h is the last column of V;
E) remove the normalization to obtain the homography matrix:

H = T_L^(-1)·H~·T_R,

where

H~ = [ h_1  h_2  h_3 ]
     [ h_4  h_5  h_6 ]
     [ h_7  h_8  h_9 ],

h_i is the i-th element of h, and H is the homography estimate produced from the given matching point pairs.
5. The method for improving the accuracy and stability of image mosaic according to claim 2, characterized in that, in the mutation, the matching point pairs of each candidate individual are divided into inliers and outliers and a mutated individual is formed by selecting m pairs from the inliers and s-m pairs from the outliers, 0 ≤ m ≤ s, each candidate individual generating s+1 mutated individuals;
an outlier is a matching point pair whose symmetric transfer error is greater than or equal to the inlier threshold Thr.
6. The method for improving the accuracy and stability of image mosaic according to claim 2, characterized in that, in the adjustment, each mutated individual generated by a candidate is compared one by one with a discrimination individual; if and only if the inlier count of the discrimination individual is greater than that of the mutated individual, the mutated individual is replaced by the discrimination individual;
a discrimination individual consists of s non-singular matching point pairs selected at random from the inliers of the candidate individual.
7. The method for improving the accuracy and stability of image mosaic according to claim 1, characterized in that, in the LM iterative optimization of the third step, H_opt is the optimal estimate of the homography matrix when the total symmetric transfer error E of the following formula no longer changes:

E = Σ_{j=1..M} ( ||x_INLj - H_opt·x_INRj||^2 + ||x_INRj - H_opt^(-1)·x_INLj||^2 ),  subject to ||H_opt||_F^2 = 1,

where (x_INLj, x_INRj) is the j-th matching point pair of the consistent maximum inlier set, 1 ≤ j ≤ M, M is the number of matching point pairs in the consistent maximum inlier set, ||·||_F denotes the Frobenius norm of a matrix, the initial value of H_opt is taken as H_init/||H_init||_F, and H_init is the initial estimate of the homography matrix.
8. The image stitching method based on genetic consensus estimation of the homography matrix according to claim 1, characterized in that the image stitching in the fourth step comprises the following steps:
1) stitch the second image onto the first image: the region Reg2 occupied by the second image is obtained according to Reg2 = Quad{R_k / R_k(3)}, where R_k = H_opt·V_k, R_k are the homogeneous coordinates of the k-th vertex after the second image is stitched onto the first image, 1 ≤ k ≤ 4, V_k are the homogeneous coordinates of the k-th vertex of the second image itself, V_1 = (1, 1, 1)^T, V_2 = (1, F_L, 1)^T, V_3 = (W_L, F_L, 1)^T, V_4 = (W_L, 1, 1)^T, W_L is the width of the first image in pixels, F_L is the length of the first image in pixels, Quad{·} denotes the region enclosed by the quadrilateral formed by connecting the 4 points in order, and R_k(3) denotes the third component of the vector R_k;
2) the region Reg1 occupied by the first image is obtained by the method of 1); the intersection of Reg2 and Reg1 is the common area of the two images to be stitched, and the content of the second image outside the common area is stitched onto the first image.
9. The method for improving the accuracy and stability of image mosaic according to claim 1, characterized in that the seamless fusion in the fifth step comprises the following steps:
1) build the Gaussian pyramids of the two sub-images:
the sub-image I_L^RPReg is used as the first layer of the Gaussian pyramid, and the k-th layer I_Lk^RPReg of the Gaussian pyramid is

I_Lk^RPReg(i, j) = Σ_{m=-2..2} Σ_{n=-2..2} w(m, n)·I_L(k-1)^RPReg(2i + m, 2j + n),

where w(m, n) = w(m)·w(n), w(0) = 3/8, w(±1) = 1/4, w(±2) = 1/16, 1 ≤ k ≤ N, and N is the number of layers of the Gaussian pyramid; the Gaussian pyramid I_Rk^RPReg of the sub-image I_R^RPReg is built in the same manner;
2) build the Laplacian pyramids of the two sub-images:
the Laplacian pyramid LPI_Lk^RPReg of the sub-image I_L^RPReg is constructed as

LPI_Lk^RPReg = I_Lk^RPReg - Expand(I_L(k+1)^RPReg), 1 ≤ k ≤ N-1,  LPI_LN^RPReg = I_LN^RPReg,

where Expand denotes the interpolating enlargement operator; the Laplacian pyramid LPI_Rk^RPReg of the sub-image I_R^RPReg is built in the same manner;
3) build the Gaussian pyramid of the fusion coefficients:
first build a coefficient image IC_L of the same size as the region RPReg, set the gray value of IC_L to 1 on the half nearer the first image and to 0 on the half nearer the second image, and build the Gaussian pyramid IC_Lk of the image IC_L; this is the fusion-coefficient Gaussian pyramid of the sub-image I_L^RPReg, and at the same time the fusion-coefficient Gaussian pyramid of the sub-image I_R^RPReg is IC_Rk = 1 - IC_Lk;
4) build the Laplacian pyramid of the fused image:
the fused image is I^RPReg, and its Laplacian pyramid LPI_k^RPReg is constructed as

LPI_k^RPReg(i, j) = IC_Lk(i, j)·LPI_Lk^RPReg(i, j) + IC_Rk(i, j)·LPI_Rk^RPReg(i, j);

5) reconstruct the fused image:
from the Laplacian pyramid LPI_k^RPReg of the fused image, the Gaussian pyramid I_k^RPReg of the fused image is rebuilt as

I_k^RPReg = LPI_k^RPReg + Expand(I_(k+1)^RPReg), 1 ≤ k ≤ N-1,  I_N^RPReg = LPI_N^RPReg;

the first layer I_1^RPReg of the rebuilt Gaussian pyramid is the fused image of the sub-images I_L^RPReg and I_R^RPReg.
CN 201010216340 2010-07-02 2010-07-02 Method for improving accuracy and stability of image mosaic Pending CN101882308A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010216340 CN101882308A (en) 2010-07-02 2010-07-02 Method for improving accuracy and stability of image mosaic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010216340 CN101882308A (en) 2010-07-02 2010-07-02 Method for improving accuracy and stability of image mosaic

Publications (1)

Publication Number Publication Date
CN101882308A true CN101882308A (en) 2010-11-10

Family

ID=43054319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010216340 Pending CN101882308A (en) 2010-07-02 2010-07-02 Method for improving accuracy and stability of image mosaic

Country Status (1)

Country Link
CN (1) CN101882308A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1995016977A2 (en) * 1993-12-17 1995-06-22 Xaos Tools Image processing using genetic mutation of neural network parameters
CN1766907A (en) * 2005-10-24 2006-05-03 中国电子科技集团公司第四十五研究所 Multi-target image recognition method based on cluster genetic algorithm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xia Chaogui et al., "An automatic image mosaic method based on SIFT feature matching" (一种基于SIFT特征匹配的自动图像拼接方法), Hydrographic Surveying and Charting (海洋测绘), Vol. 30, No. 1, 31 January 2010, pp. 66-69. 2 *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102480600A (en) * 2010-11-22 2012-05-30 上海银晨智能识别科技有限公司 Double-path monitoring video information fusion method and system
CN102480600B (en) * 2010-11-22 2014-07-02 上海银晨智能识别科技有限公司 Double-path monitoring video information fusion method and system
CN103793891A (en) * 2012-10-26 2014-05-14 海法科技有限公司 Low-complexity panorama image joint method
CN103020934B (en) * 2012-12-12 2015-10-21 武汉大学 The image seamless method for automatically split-jointing of anti-captions interference
CN103020934A (en) * 2012-12-12 2013-04-03 武汉大学 Seamless automatic image splicing method resistant to subtitle interference
CN103236048B (en) * 2013-04-18 2016-05-04 上海交通大学 A kind of based on mutual information and mutual medical image joining method
CN103236048A (en) * 2013-04-18 2013-08-07 上海交通大学 Mutual information and interaction-based medical image splicing method
CN105005964A (en) * 2015-06-30 2015-10-28 南京师范大学 Video sequence image based method for rapidly generating panorama of geographic scene
CN105005964B (en) * 2015-06-30 2018-11-16 南京师范大学 Geographic scenes panorama sketch rapid generation based on video sequence image
CN104933687A (en) * 2015-07-09 2015-09-23 武汉大学 Seam line multiscale feather algorithm of considering changed area
CN104933687B (en) * 2015-07-09 2018-01-23 武汉大学 A kind of multiple dimensioned emergence algorithm of jointing line for considering region of variation
CN105894505A (en) * 2016-03-30 2016-08-24 南京邮电大学 Quick pedestrian positioning method based on multi-camera geometrical constraint
CN106204456A (en) * 2016-07-18 2016-12-07 电子科技大学 Panoramic video sequences estimation is crossed the border folding searching method
CN106447612A (en) * 2016-09-21 2017-02-22 湖南子午天地科技文化发展有限公司 Image splicing method and image splicing generator
CN106843274A (en) * 2017-03-21 2017-06-13 广东容祺智能科技有限公司 One kind is based on the safe cruising system and methods of unmanned plane VR
CN108154476A (en) * 2017-12-22 2018-06-12 成都华栖云科技有限公司 The method of video-splicing correction
CN109982126A (en) * 2017-12-27 2019-07-05 艾迪普(北京)文化科技股份有限公司 A kind of stacking method of associated video
CN108846861A (en) * 2018-06-12 2018-11-20 广州视源电子科技股份有限公司 Image homography matrix calculation method, device, mobile terminal and storage medium
CN108846861B (en) * 2018-06-12 2020-12-29 广州视源电子科技股份有限公司 Image homography matrix calculation method and device, mobile terminal and storage medium
CN109283679B (en) * 2018-10-18 2020-05-19 中国科学院自动化研究所 Large-view-field optical microscopic image imaging device and method
CN109283679A (en) * 2018-10-18 2019-01-29 中国科学院自动化研究所 Big visual field optical microscopic image imaging device and method
CN110111287A (en) * 2019-04-04 2019-08-09 上海工程技术大学 A kind of fabric multi-angle image emerging system and its method
CN110136092A (en) * 2019-05-21 2019-08-16 北京三快在线科技有限公司 Image processing method, device and storage medium
CN110717861A (en) * 2019-12-12 2020-01-21 成都纵横大鹏无人机科技有限公司 Image splicing method and device, electronic equipment and computer readable storage medium
CN110717861B (en) * 2019-12-12 2020-03-20 成都纵横大鹏无人机科技有限公司 Image splicing method and device, electronic equipment and computer readable storage medium
CN111693025A (en) * 2020-06-12 2020-09-22 深圳大学 Remote sensing image data generation method, system and equipment
CN112288852A (en) * 2020-10-28 2021-01-29 华润电力技术研究院有限公司 Coal yard three-dimensional reconstruction method and system and intelligent control method of thermal power generating unit
CN112532884A (en) * 2020-11-27 2021-03-19 维沃移动通信有限公司 Identification method and device and electronic equipment
WO2022111461A1 (en) * 2020-11-27 2022-06-02 维沃移动通信有限公司 Recognition method and apparatus, and electronic device
CN112532884B (en) * 2020-11-27 2022-10-14 维沃移动通信有限公司 Identification method and device and electronic equipment
CN113643365A (en) * 2021-07-07 2021-11-12 紫东信息科技(苏州)有限公司 Camera pose estimation method, device, equipment and readable storage medium
CN113643365B (en) * 2021-07-07 2024-03-19 紫东信息科技(苏州)有限公司 Camera pose estimation method, device, equipment and readable storage medium
WO2023024697A1 (en) * 2021-08-26 2023-03-02 北京旷视科技有限公司 Image stitching method and electronic device

Similar Documents

Publication Publication Date Title
CN101882308A (en) Method for improving accuracy and stability of image mosaic
US10198858B2 (en) Method for 3D modelling based on structure from motion processing of sparse 2D images
Brenner et al. Coarse orientation of terrestrial laser scans in urban environments
US9740950B1 (en) Method and system for automatic registration of images
Fathi et al. Automated sparse 3D point cloud generation of infrastructure using its distinctive visual features
Fan et al. Registration of optical and SAR satellite images by exploring the spatial relationship of the improved SIFT
Griffiths et al. Comparison of pre-and self-calibrated camera calibration models for UAS-derived nadir imagery for a SfM application
CN109409292A (en) The heterologous image matching method extracted based on fining characteristic optimization
US8666170B2 (en) Computer system and method of matching for images and graphs
CN101826206B (en) Camera self-calibration method
CN103886611A (en) Image matching method suitable for automatically detecting flight quality of aerial photography
CN102903109B (en) A kind of optical image and SAR image integration segmentation method for registering
Urban et al. Finding a good feature detector-descriptor combination for the 2D keypoint-based registration of TLS point clouds
Jiang et al. Robust image matching via local graph structure consensus
Cao et al. An edge-based scale-and affine-invariant algorithm for remote sensing image registration
CN102446356A (en) Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points
CN102122359A (en) Image registration method and device
CN102404595A (en) Epipolar line rectification method capable of providing instruction for shooting of 3-dimensional programs
JP2023530449A (en) Systems and methods for air and ground alignment
Zhu et al. Research on mosaic method of uav low-altitude remote sensing image based on sift and surf
Yao et al. Registrating oblique SAR images based on complementary integrated filtering and multilevel matching
Aktar et al. Robust mosaicking of maize fields from aerial imagery
Wang et al. Fast and accurate satellite multi-view stereo using edge-aware interpolation
Cui et al. Approach of spectral information-based image registration similarity
CN104992432A (en) Multimodal image registration method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 2010-11-10