CN105678722A - Panoramic stitched image bending correction method and panoramic stitched image bending correction device - Google Patents

Panoramic stitched image bending correction method and panoramic stitched image bending correction device

Info

Publication number
CN105678722A
Authority
CN
China
Prior art keywords
image
matrix
module
matching
gained
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410667807.0A
Other languages
Chinese (zh)
Inventor
魏园波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Infinova Ltd
Original Assignee
Shenzhen Infinova Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Infinova Ltd filed Critical Shenzhen Infinova Ltd
Priority to CN201410667807.0A
Publication of CN105678722A
Pending legal-status Current

Abstract

The invention relates to the technical field of image processing, and especially relates to a panoramic stitched image bending correction method and a panoramic stitched image bending correction device. The method comprises the following steps: S101, acquiring more than two panoramic partition images to be stitched; S102, extracting the features of the panoramic partition images acquired in S101; S103, carrying out feature matching on the panoramic partition images obtained in S102 to obtain the number of pairs of matching feature points; S104, matching the panoramic partition images obtained in S103 according to the number of pairs of matching feature points; S105, calculating a perspective transformation matrix of the panoramic partition images obtained in S104, calculating the initial depth of field and the rotation matrix of each image according to the perspective transformation matrix, and acquiring a global transformation matrix through rotation matrix transformation; S106, performing multiplication on the global transformation matrix and the rotation matrices; S107, projecting the panoramic partition images obtained in S106 to the same coordinate system and carrying out image fusion; and S108, outputting the image obtained in S107. Through the method, the bending of a panoramic stitched image is eliminated, bending correction is achieved, and the stitching quality is improved.

Description

Panoramic stitched image bending correction method and device
Technical field
The present invention relates to the technical field of image processing, and in particular to a panoramic stitched image bending correction method and device.
Background technology
As monitoring systems develop rapidly, system scale grows and the number of monitored points keeps increasing, so that operators can no longer attend to every monitoring image. With the continuous expansion of the monitoring range, long bridges, airports, very long stretches of highway and high-rise buildings need to be monitored, while the field of view of most cameras cannot cover such a large range; the demand for wide-field-of-view video surveillance has therefore grown steadily. Video panorama stitching, as one solution, has received increasing attention. Because the chosen world coordinate system contains an unknown 3D rotation, the stitched panoramic image often appears bent. A panoramic stitched image bending correction method and device are therefore needed to eliminate this bending effect.
Summary of the invention
The technical problem to be solved by the present invention is to provide a panoramic stitched image bending correction method and device that correct the bending of the image and eliminate the image bending effect.
To solve the above technical problem, the technical solution adopted by the present invention is:
A panoramic stitched image bending correction method, comprising:
S101, acquiring more than two panoramic partition images to be stitched;
S102, performing feature extraction on the panoramic partition images obtained in S101;
S103, performing feature matching on the panoramic partition images obtained in S102 to obtain the number of pairs of matching feature points;
S104, performing image matching on the panoramic partition images obtained in S103 according to the number of pairs of matching feature points;
S105, calculating a perspective transformation matrix from the panoramic partition images obtained in S104, calculating the initial depth of field and the rotation matrix of each image from the perspective transformation matrix, and obtaining a global transformation matrix through rotation matrix transformation;
S106, multiplying the global transformation matrix by the rotation matrices;
S107, projecting the panoramic partition images obtained in S106 into the same coordinate system and performing image fusion;
S108, outputting the image obtained in S107.
Another technical solution adopted by the present invention is:
A panoramic stitched image bending correction device, comprising an image capture module, a feature extraction module, a feature matching module, an image matching module, a global transformation matrix acquisition module, a computing module, an image fusion module and an image output module which are connected in sequence;
the image capture module is used for acquiring more than two panoramic partition images to be stitched;
the feature extraction module is used for performing feature extraction on the acquired panoramic partition images;
the feature matching module is used for performing feature matching on the panoramic partition images obtained by feature extraction to obtain the number of pairs of matching feature points;
the image matching module is used for performing image matching on the panoramic partition images obtained by feature matching according to the number of pairs of matching feature points;
the global transformation matrix acquisition module is used for calculating a perspective transformation matrix from the panoramic partition images obtained by the image matching module, calculating the initial depth of field and the rotation matrix of each image from the perspective transformation matrix, and obtaining a global transformation matrix through rotation matrix transformation;
the computing module is used for multiplying the global transformation matrix by the rotation matrices;
the image fusion module is used for projecting the panoramic partition images obtained by the computing module onto a reference plane and performing image fusion;
the image output module is used for outputting the image obtained by the image fusion module.
The beneficial effects of the present invention are: a perspective transformation matrix is calculated from the panoramic partition images obtained by image feature matching, the initial depth of field and the rotation matrix of each image are calculated from the perspective transformation matrix, a global transformation matrix is then obtained through rotation matrix transformation, and the global transformation matrix is multiplied by the rotation matrices; this eliminates the 3D rotation and therefore the image bending, achieves bending correction of the image and improves the stitching quality.
Brief description of the drawings
Fig. 1 is a flowchart of the panoramic stitched image bending correction method in an embodiment of the present invention;
Fig. 2 is a schematic diagram of scale-space extremum detection in an embodiment of the present invention;
Fig. 3 is a schematic diagram of generating the local feature descriptor when extracting sift feature points in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the integral over the grey area in an embodiment of the present invention;
Fig. 5 is a schematic diagram of generating the local feature descriptor when extracting surf feature points in an embodiment of the present invention;
Fig. 6 is a schematic diagram of the harris corner points extracted in an embodiment of the present invention;
Fig. 7 is a schematic diagram of the average blending in step S107 in an embodiment of the present invention;
Fig. 8 is a schematic diagram of d1 and d2 in an embodiment of the present invention;
Fig. 9 is a structural diagram of the panoramic stitched image bending correction device in an embodiment of the present invention;
Reference numerals:
10, image capture module; 20, feature extraction module; 30, feature matching module; 40, image matching module; 50, global transformation matrix acquisition module; 60, computing module; 70, image fusion module; 80, image output module.
Detailed description of the invention
To explain in detail the technical content, objects and effects of the present invention, the following description is given with reference to the embodiments and the accompanying drawings.
The key idea of the present invention is: a perspective transformation matrix is calculated from the panoramic partition images obtained by image feature matching, the initial depth of field and the rotation matrix of each image are calculated from the perspective transformation matrix, a global transformation matrix is then obtained through rotation matrix transformation, and the global transformation matrix is multiplied by the rotation matrices, thereby correcting the image bending.
Referring to Fig. 1, which is a flowchart of the panoramic stitched image bending correction method in an embodiment of the present invention, the method is as follows:
A panoramic stitched image bending correction method, comprising:
S101, acquiring more than two panoramic partition images to be stitched;
S102, performing feature extraction on the panoramic partition images obtained in S101;
S103, performing feature matching on the panoramic partition images obtained in S102 to obtain the number of pairs of matching feature points;
S104, performing image matching on the panoramic partition images obtained in S103 according to the number of pairs of matching feature points;
S105, calculating a perspective transformation matrix from the panoramic partition images obtained in S104, calculating the initial depth of field and the rotation matrix of each image from the perspective transformation matrix, and obtaining a global transformation matrix through rotation matrix transformation;
S106, multiplying the global transformation matrix by the rotation matrices;
S107, projecting the panoramic partition images obtained in S106 into the same coordinate system and then performing image fusion;
S108, outputting the image obtained in S107.
As can be seen from the above, the beneficial effects of the present invention are: a perspective transformation matrix is calculated from the panoramic partition images obtained by image feature matching, the initial depth of field and the rotation matrix of each image are calculated from the perspective transformation matrix, a global transformation matrix is then obtained through rotation matrix transformation, and the global transformation matrix is multiplied by the rotation matrices; this eliminates the 3D rotation and therefore the image bending, achieves bending correction of the image and improves the stitching quality.
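To make the S101–S108 flow above concrete, the following is a minimal Python/OpenCV sketch for the two-image case. It is an illustration only, not the patented implementation: the function name stitch_two and the file handling are hypothetical, cv2.SIFT_create requires OpenCV 4.4 or newer (or opencv-contrib for older versions), and OpenCV's homography estimation together with a simple average blend stand in for steps S105–S107.

```python
import cv2
import numpy as np

def stitch_two(path_left, path_right):
    """Rough two-image version of steps S101-S108 (illustrative only)."""
    img1 = cv2.imread(path_left)                     # S101: acquire partition images
    img2 = cv2.imread(path_right)

    sift = cv2.SIFT_create()                         # S102: feature extraction
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)             # S103: feature matching (ratio test)
    pairs = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in pairs if m.distance < 0.7 * n.distance]

    src = np.float32([kp2[m.trainIdx].pt for m in good])   # S104/S105: perspective
    dst = np.float32([kp1[m.queryIdx].pt for m in good])   # transform via RANSAC
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    h, w = img1.shape[:2]                            # S106/S107: project into one
    canvas = cv2.warpPerspective(img2, H, (2 * w, h))  # coordinate system and fuse
    left = canvas[:, :w]
    overlap = left > 0
    left[overlap] = left[overlap] // 2 + img1[overlap] // 2   # average blend
    left[~overlap] = img1[~overlap]
    return canvas                                    # S108: output stitched image
```

A call such as panorama = stitch_two("cam_left.jpg", "cam_right.jpg") (file names hypothetical) returns the fused canvas; the global transformation step that removes the residual bending is sketched separately further below.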
Further, the "feature extraction" in step S102 extracts sift, surf or harris feature points.
The steps for extracting sift feature points are:
1) Scale-space extremum detection, which preliminarily determines the positions and scales of the keypoints. The scale space of a two-dimensional image is obtained by convolving the image with a Gaussian kernel:
G(x, y, σ) = (1 / (2πσ²)) · e^(−(x² + y²) / (2σ²));
When detecting scale-space extrema, the pixel marked with a cross in Fig. 2 is compared with its 26 neighbours, namely the 8 surrounding pixels at the same scale and the 9 pixels at the corresponding positions in each of the two adjacent scales, to ensure that extrema are detected both in scale space and in the two-dimensional image space;
2) Accurate keypoint localization: the position and scale of each keypoint are determined precisely by fitting a three-dimensional quadratic function, while keypoints with low contrast and unstable edge response points are removed;
A Taylor expansion at the keypoint gives:
D(X) = D + (∂Dᵀ/∂X) · X + (1/2) · Xᵀ · (∂²D/∂X²) · X;
where X = (x, y, σ)ᵀ is the offset of the keypoint and D is the value of D(x, y, σ) at the keypoint;
3) Determining the principal orientation of each feature point:
m(x, y) = sqrt( (L(x+1, y) − L(x−1, y))² + (L(x, y+1) − L(x, y−1))² );
θ(x, y) = atan2( L(x, y+1) − L(x, y−1), L(x+1, y) − L(x−1, y) );
Each keypoint then carries three pieces of information: its position, its scale and its orientation;
4) Generating the local feature descriptor:
The coordinate axes are first rotated to the orientation of the keypoint, and a 4×4 window centred on the keypoint is taken, as shown in Fig. 3. In the left part of Fig. 3, the black dot marks the position of the current keypoint, each small cell represents one pixel of the scale-space neighbourhood of the keypoint, the direction of an arrow represents the gradient orientation of that pixel, the length of the arrow represents the gradient magnitude, and the circle represents the range of the Gaussian weighting (pixels closer to the keypoint contribute more gradient orientation information). A gradient orientation histogram with 8 bins is then computed on each 4×4 sub-block, and the accumulated value of each orientation bin is drawn, forming one seed point. A keypoint is thus described by 4×4 = 16 seed points, each with 8 orientation components, which finally yields a 128-dimensional sift feature vector.
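As an illustrative aside (not part of the patent text), the four sift steps above are available as a single call in OpenCV; the sketch below assumes OpenCV 4.4+ and a hypothetical input file name, and its two threshold parameters correspond to the low-contrast and edge-response rejection of step 2).

```python
import cv2

# Scale-space extrema, keypoint refinement, orientation assignment and the
# 4x4x8 = 128-dimensional descriptor are all computed inside SIFT.
img = cv2.imread("partition_0.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical file
sift = cv2.SIFT_create(contrastThreshold=0.04, edgeThreshold=10)
keypoints, descriptors = sift.detectAndCompute(img, None)
print(len(keypoints), descriptors.shape)   # descriptors: (number of keypoints, 128)
```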
The steps for extracting surf feature points are:
1) Integral images;
The integral image is used mainly to compute the sum of the pixels over a region of the image; the integral image at position x = (x, y) is defined as:
IΣ(x) = Σ_{i=0}^{i≤x} Σ_{j=0}^{j≤y} I(i, j);
As shown in Fig. 4, the integral over the grey area is obtained as A − B − C + D;
2) Approximate Hessian matrix;
At a given point X = (x, y) of the image I(x, y) and at scale σ, the Hessian matrix is formed from the entries Lxx, Lxy and Lyy, where for example:
Lxx = ∂²g(σ)/∂x² * I(x, y),
i.e. the convolution of the second-order Gaussian derivative with the image at that point (Lxy and Lyy are defined analogously);
3) Scale-space representation;
In surf, the scale space is built by varying the size of the box filter instead of rescaling the image;
4) Feature point localization;
After the extremum of the scale image at (x, y) has been obtained from the Hessian matrix, non-maximum suppression is first carried out in the 3 × 3 × 3 neighbourhood of the extreme point, and interpolation in scale space and image space is then performed with a quadratic fitting function:
D(X) = D + (∂Dᵀ/∂X) · X + (1/2) · Xᵀ · (∂²D/∂X²) · X;
Differentiating the above expression and setting the derivative to zero gives the offset of the extreme point, X̂ = −(∂²D/∂X²)⁻¹ · (∂D/∂X), and the value at the extreme point is:
D(X̂) = D + (1/2) · (∂Dᵀ/∂X) · X̂;
When this extreme value is ≥ 0.03, the point is taken as a feature point;
5) Feature descriptor;
As shown in Fig. 5, a square with a side length of 20, oriented along the dominant orientation, is constructed around the feature point and divided into 4×4 sub-regions; each sub-region is further sampled at 5×5 points, and the Haar wavelet responses in the horizontal and vertical directions relative to the dominant orientation are computed;
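As an illustrative aside (again not the patented implementation), surf is available in the OpenCV contrib package; the sketch below assumes opencv-contrib-python built with the non-free modules and a hypothetical input file, and hessianThreshold plays the role of the extreme-value test above.

```python
import cv2

# SURF descriptor extraction; the integral image, box-filter Hessian and
# Haar-wavelet descriptor described above are computed internally.
img = cv2.imread("partition_0.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical file
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
keypoints, descriptors = surf.detectAndCompute(img, None)
print(len(keypoints), descriptors.shape)   # 64-D (128-D if extended=True)
```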
The steps for extracting harris feature points are:
1) The correlation matrix m is computed for each pixel:
m = [ Ix², IxIy ; IxIy, Iy² ], where Ix² = Ix · Ix and Iy² = Iy · Iy;
2) The four elements of m are smoothed with a Gaussian filter to obtain a new m; the Gaussian function is:
Gauss = exp(−(x² + y²) / (2σ²));
3) The corner response cim of each pixel is computed from m:
cim = (Ix² · Iy² − (IxIy)²) / (Ix² + Iy²);
4) A pixel whose cim is greater than a given threshold and is the local maximum within a certain neighbourhood is taken as a corner point;
Fig. 6 shows the harris corner points extracted from one image.
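For illustration only, the following sketch applies the same idea with cv2.cornerHarris, which computes the gradient correlation matrix, applies the smoothing window and produces a corner response (note that OpenCV uses the response det(m) − k·trace(m)² rather than the det/trace ratio given above). The input file name and the thresholds are assumptions.

```python
import cv2
import numpy as np

img = cv2.imread("partition_0.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical file
cim = cv2.cornerHarris(np.float32(img), blockSize=2, ksize=3, k=0.04)
dilated = cv2.dilate(cim, None)                       # neighbourhood maximum (step 4)
corners = (cim > 0.01 * cim.max()) & (cim == dilated)
ys, xs = np.nonzero(corners)
print("harris corners:", len(xs))
```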
As can be seen from the above description, when the "feature extraction" uses sift feature points, "determining the position and scale of each keypoint precisely by fitting a three-dimensional quadratic function while removing low-contrast keypoints and unstable edge response points" strengthens matching stability and improves robustness to noise; when the "feature extraction" uses surf feature points, the "interpolation in scale space and image space" allows the candidate feature points to be localized with sub-pixel accuracy.
Further, in step S103 the "feature matching" is carried out using the Euclidean distance, and the "number of pairs of matching feature points" in step S103 is calculated with the RANSAC algorithm.
The concrete steps of the Euclidean-distance "feature matching" in step S103 are:
1) Suppose the images to be matched are image 1, with m feature points, and image 2, with n feature points; for each feature point i = 1, 2, …, m of image 1, a matching pair is searched for in image 2;
2) For feature point i = 1 of image 1, the nearest point d1 and the second nearest point d2 are found in image 2. As noted above, each feature point carries a feature descriptor (128-dimensional for sift); the Euclidean distance between descriptors is computed, and if the following condition is satisfied, feature i = 1 is regarded as matched with its Euclidean nearest point in image 2:
d1 / d2 < ε;
3) Step 2) is repeated for i = 2, 3, 4, …, m to find the matching pairs in image 2 for the remaining feature points of image 1.
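A minimal sketch of this nearest/second-nearest ratio test with a brute-force Euclidean matcher is given below (for illustration; the function name and the value of ε are assumptions, and desc1/desc2 denote the descriptor arrays of image 1 and image 2):

```python
import cv2

def ratio_match(desc1, desc2, eps=0.7):
    """Keep a match only if d1/d2 < eps for its two nearest neighbours."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)          # Euclidean distance
    good = []
    for pair in matcher.knnMatch(desc1, desc2, k=2):
        if len(pair) == 2 and pair[0].distance < eps * pair[1].distance:
            good.append(pair[0])                  # unambiguous nearest neighbour
    return good
```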
" obtaining matching characteristic point logarithm " in described step S103 adopts RANSAC algorithm specific as follows:
For adjacent two two field pictures, if number is N by matching characteristic point, matching characteristic point set is designated as P (1, N), P (2 respectively, N), the characteristic point set that wherein P (1, N) is benchmark image, P (2, N) for the characteristic point set of image to be matched, specifically comprise the following steps that
1) from initial N to matching characteristic point randomly selects 4 pairs of matching characteristic points;
2) perspective transformation matrix M12 between benchmark image and image to be matched is calculated by the 4 pairs of matching characteristic points chosen, utilize the transformation matrix M12 characteristic point set P (2 to image to be matched, N) N-4 characteristic point P (2 of residue in, N-2) transform under benchmark image coordinate system, it is designated as P ' (2, N-4);
3) error of coordinate between characteristic point P ' (2, N-4) and the characteristic point P (1, N-4) after conversion is calculated;
4) from N, error of coordinate feature point pairs number in certain error threshold value is found out in matching characteristic point centering, be designated as i, be interior point;
5) 1 is iterated) to 4) step n time, find set maximum for i for most imperial palace point set, all the other N-i are Mismatching point, are exterior point. Reduce error with method of least square, eliminate the impact of error hiding, obtain space conversion matrices M.
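For illustration, OpenCV's findHomography with the RANSAC flag performs steps 1) to 5) in one call (random 4-point sampling, inlier counting against a reprojection threshold, and least-squares refinement on the largest inlier set); the helper name, the threshold value and the reuse of the matches from the previous ratio test are assumptions:

```python
import cv2
import numpy as np

def ransac_homography(kp_ref, kp_src, matches, thresh=3.0):
    """Estimate the perspective transform M from matched keypoints."""
    pts_ref = np.float32([kp_ref[m.queryIdx].pt for m in matches])
    pts_src = np.float32([kp_src[m.trainIdx].pt for m in matches])
    M, mask = cv2.findHomography(pts_src, pts_ref, cv2.RANSAC, thresh)
    inliers = int(mask.sum()) if mask is not None else 0   # size of the inlier set
    return M, inliers
```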
Further, the "obtaining of the global transformation matrix" in step S106 is specifically as follows:
1. Based on the feature matching, the perspective transformation matrix Hij between each image and its matched image is obtained by the method of least squares; for example, a 3 × 3 perspective transformation matrix H (i = 3, j = 3):
H = [ h11, h12, h13 ; h21, h22, h23 ; h31, h32, 1 ];
2. From the perspective transformation matrix Hij, the initial depth of field fi of each image and a 3 × 3 rotation matrix Ri are calculated; the three satisfy the following relation:
Hij = Ki · Ri · Rjᵀ · Kj⁻¹, where Ki = [ fi, 0, 0 ; 0, fi, 0 ; 0, 0, 1 ];
3. The 3 × 1 matrices r0i and avgzi are defined:
r0i = [ Ri[0] ; Ri[3] ; Ri[6] ], avgzi = [ Ri[2] ; Ri[5] ; Ri[8] ];
covsum = Σ_{i=0}^{i<image number} (r0i · r0iᵀ), avgzsum = Σ_{i=0}^{i<image number} avgzi;
4. covsum is decomposed by SVD and the orthogonal matrix is taken; r1 is defined as the third row of this orthogonal matrix; r0 = avgzsumᵀ × r1, r0 is normalized, and r2 = r0 × r1;
5. The global transformation matrix u is obtained:
u = [ r0 | r1 | r2 ].
As can be seen from the above description, the beneficial effect of obtaining the global transformation matrix is: by obtaining such a global transformation matrix and multiplying it with the rotation matrices, the bending effect of the output panorama can be removed effectively. No matter how the images are rotated or tilted, the camera always keeps one component within a common plane; by finding the vector perpendicular to this plane, the panoramic image is corrected.
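A minimal NumPy sketch of steps 1–5 and of the multiplication in step S106 is given below. It is an interpretation, not the patented code: "×" in step 4 is read as the vector cross product, the "third row" of the orthogonal SVD factor is taken as the direction of smallest variance, and the procedure mirrors the horizontal wave correction found in common stitching pipelines. The rotation matrices Ri are assumed to have been estimated in step S105.

```python
import numpy as np

def global_transform(rotations):
    """Build u = [r0 | r1 | r2] from the rotation matrices and apply u * Ri."""
    covsum = sum(np.outer(R[:, 0], R[:, 0]) for R in rotations)   # sum of r0_i r0_i^T
    avgzsum = sum(R[:, 2] for R in rotations)                     # sum of avgz_i
    _, _, vt = np.linalg.svd(covsum)
    r1 = vt[2]                              # "third row" of the orthogonal factor
    r0 = np.cross(avgzsum, r1)              # interpreted as a cross product
    r0 /= np.linalg.norm(r0)                # normalize r0
    r2 = np.cross(r0, r1)
    u = np.stack([r0, r1, r2], axis=1)      # u = [r0 | r1 | r2]
    return u, [u @ R for R in rotations]    # step S106: multiply u with each Ri
```

Projecting the partition images with the corrected rotations (step S107) then yields a panorama whose horizon is no longer bent.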
Further, the "image fusion" in step S107 uses average blending, multi-resolution fusion or weighted fusion.
The average blending used for the "image fusion" in step S107 is specifically:
As shown in Fig. 7, in the overlapping region of the images, the pixel value Pixel of a pixel is obtained by averaging the pixel values Pixel_L and Pixel_R of the corresponding points in the two images:
Pixel=0.5* (Pixel_L+Pixel_R);
" image co-registration " in described step S107 adopt multi-resolution Fusion particularly as follows:
First build image pyramid hierarchy, input picture is decomposed into the bandpass layer of a series of different frequency bands; Then each bandpass layer is merged respectively the bandpass layer obtaining stitching image: last combination by the bandpass layer of stitching image obtains final stitching image. Because image co-registration carries out respectively at different frequency bands, so the fusion ghost of image can be prevented effectively from; And decompose due to the multiband of image and combination is a reciprocal process, make to can ensure that in this way the correctness of gained stitching image.
1) build image pyramid hierarchy, obtain the low pass layer G of every piece image0,G1,....GN-1:
Gl(x, y) = Σ_{m,n=−2}^{2} w(m, n) · Gl−1(2x + m, 2y + n), where G0 is the original image and w(m, n) = w(m) · w(n) is the weighting function of a 5×5 window.
2) Using the low-pass layers of each image, the band-pass layers L0, L1, …, LN−1 of that image are obtained by decomposition:
Ll(x, y) = Gl(x, y) − 4 · Σ_{m,n=−2}^{2} w(m, n) · Gl+1((x + m)/2, (y + n)/2), i.e. each band-pass layer is the difference between a low-pass layer and the expanded next low-pass layer;
3) The image fusion operation is carried out separately in each band-pass layer; weighted averaging is used here. For the current layer Lk:
Lkout(x, y) = ( Σ_{i=0}^{N−1} Lki(x′, y′) · wi(x′) · wi(y′) ) / ( Σ_{i=0}^{N−1} wi(x′) · wi(y′) );
After this step, the output band-pass layers L0out, L1out, …, LN−1out corresponding to the image are obtained.
4) The resulting band-pass layers are combined to obtain the final stitched image:
Gout = Σ_{k=0}^{N} Lkout;
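A compact sketch of this multi-resolution (Laplacian pyramid) fusion for two already-aligned images is given below; it is an illustration rather than the patented implementation, the number of levels and the binary blend mask are assumptions, and cv2.pyrDown/cv2.pyrUp stand in for the low-pass and expand operations above.

```python
import cv2
import numpy as np

def pyramid_blend(img_a, img_b, mask, levels=5):
    """Blend two aligned images; mask is 1.0 where img_a should dominate."""
    if mask.ndim == 2 and img_a.ndim == 3:
        mask = np.repeat(mask[:, :, None], img_a.shape[2], axis=2)
    ga = [img_a.astype(np.float32)]
    gb = [img_b.astype(np.float32)]
    gm = [mask.astype(np.float32)]
    for _ in range(levels):                       # low-pass layers G0..GN-1
        ga.append(cv2.pyrDown(ga[-1]))
        gb.append(cv2.pyrDown(gb[-1]))
        gm.append(cv2.pyrDown(gm[-1]))
    blended = ga[-1] * gm[-1] + gb[-1] * (1.0 - gm[-1])
    for l in range(levels - 1, -1, -1):           # band-pass layers, fuse, recombine
        size = (ga[l].shape[1], ga[l].shape[0])
        la = ga[l] - cv2.pyrUp(ga[l + 1], dstsize=size)
        lb = gb[l] - cv2.pyrUp(gb[l + 1], dstsize=size)
        lk = la * gm[l] + lb * (1.0 - gm[l])      # fuse within this frequency band
        blended = cv2.pyrUp(blended, dstsize=size) + lk
    return np.clip(blended, 0, 255).astype(np.uint8)
```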
" image co-registration " in described step S107 adopt Weighted Fusion particularly as follows:
In image co-registration region, the pixel value Pixel of pixel is obtained by the pixel value Pixel_L of corresponding point in two width images and Pixel_R weighted average, it may be assumed that Pixel=k × Pixel_L+ (1-k) × Pixel_R, and wherein k is adjustable factors;
0≤k≤1 under normal circumstances, namely in integration region, the direction along image 1 to image 2, k is faded to 0 by 1, thus realizing the smooth registration of integration region. For making point in image co-registration region and two width images set up bigger dependency, make k=d1/ (d1+d2), such as Fig. 8, wherein: d1, d2 represent respectively in integration region o'clock to the distance of the left margin in two width image co-registration regions and right margin. Overlapping region is also not equal to integration region, once 2 original images are determined, then overlay region field width is fixing certainly, but integration region is manual control, it is possible to as long as it is just passable to select arbitrarily to meet integration region W1≤W2 within the scope of overlapping region.
Namely using formula Pixel=d1/ (d1+d2) × Pixel_L+d2/ (d1+d2) × Pixel_R to carry out stitching thread process, non-fused region directly maps from artwork and obtains.
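The sketch below illustrates this weighted (feathered) fusion for the simple case where the fusion region is a vertical strip between columns x0 and x1; the function name, the strip assumption and the linear fade from 1 to 0 are illustrative assumptions consistent with the description above.

```python
import numpy as np

def feather_blend(left, right, x0, x1):
    """Weighted fusion of two aligned images over the strip [x0, x1)."""
    out = left.astype(np.float32)
    out[:, x1:] = right[:, x1:]                     # non-fused region: direct copy
    k = np.linspace(1.0, 0.0, x1 - x0)              # k = d1 / (d1 + d2), fading 1 -> 0
    k = k[None, :, None] if left.ndim == 3 else k[None, :]
    out[:, x0:x1] = k * left[:, x0:x1] + (1.0 - k) * right[:, x0:x1]
    return out.astype(left.dtype)
```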
Referring to Fig. 9, which is a structural diagram of the panoramic stitched image bending correction device in an embodiment of the present invention:
A panoramic stitched image bending correction device comprises an image capture module 10, a feature extraction module 20, a feature matching module 30, an image matching module 40, a global transformation matrix acquisition module 50, a computing module 60, an image fusion module 70 and an image output module 80 which are connected in sequence;
the image capture module 10 is used for acquiring more than two panoramic partition images to be stitched;
the feature extraction module 20 is used for performing feature extraction on the acquired panoramic partition images;
the feature matching module 30 is used for performing feature matching on the panoramic partition images obtained by feature extraction to obtain the number of pairs of matching feature points;
the image matching module 40 is used for performing image matching on the panoramic partition images obtained by feature matching according to the number of pairs of matching feature points;
the global transformation matrix acquisition module 50 is used for calculating a perspective transformation matrix from the panoramic partition images obtained by the image matching module, calculating the initial depth of field and the rotation matrix of each image from the perspective transformation matrix, and obtaining a global transformation matrix through rotation matrix transformation;
the computing module 60 is used for multiplying the global transformation matrix by the rotation matrices;
the image fusion module 70 is used for projecting the panoramic partition images obtained by the computing module 60 onto a reference plane and then performing image fusion;
the image output module 80 is used for outputting the image obtained by the image fusion module 70.
As can be seen from the above description, the image capture module acquires two or more panoramic partition images and passes them to the feature extraction module for feature extraction; after feature extraction, feature matching is carried out by the feature matching module; the image matching module performs image matching according to the number of pairs of matching feature points obtained by the feature matching; the perspective transformation matrix is then calculated and the global transformation matrix is obtained by the global transformation matrix acquisition module; after the computing module multiplies the global transformation matrix by the rotation matrices, image fusion is carried out by the image fusion module, and the fused image is output by the image output module. This eliminates the 3D rotation and therefore the image bending, achieves bending correction of the image and improves the stitching quality.
Further, the feature extraction module 20 is a sift feature point extraction unit, a surf feature point extraction unit or a harris feature point extraction unit.
When the sift feature point extraction unit is used, "determining the position and scale of each keypoint precisely by fitting a three-dimensional quadratic function while removing low-contrast keypoints and unstable edge response points" strengthens matching stability and improves robustness to noise; when the surf feature point extraction unit is used, the "interpolation in scale space and image space" allows the candidate feature points to be localized with sub-pixel accuracy.
Further, the image fusion module 70 includes an arithmetic unit;
the arithmetic unit includes a first operation unit and a second operation unit which are connected in sequence; the first operation unit is used for superposing the pixel values of the corresponding points in the overlapping region of the two or more panoramic partition images; the second operation unit is used for averaging the superposed pixel values of the corresponding points in the overlapping region of the two or more panoramic partition images.
Further, the fusion module also includes a setting unit;
the setting unit is used for presetting an adjustable parameter and adjusting the parameter according to the pixel values of the corresponding points of the two or more panoramic partition images.
In summary, in the panoramic stitched image bending correction method and device provided by the present invention, the image capture module acquires two or more panoramic partition images and passes them to the feature extraction module for feature extraction; after feature extraction, feature matching is carried out by the feature matching module; the image matching module performs image matching according to the number of pairs of matching feature points obtained by the feature matching, and the perspective transformation matrix is then calculated. The global transformation matrix is obtained by the global transformation matrix acquisition module; after the computing module multiplies the global transformation matrix by the rotation matrices, image fusion is carried out by the image fusion module, and the fused image is output by the image output module. This eliminates the 3D rotation and therefore the image bending, achieves bending correction of the image and improves the stitching quality. When the "feature extraction" uses sift feature points, "determining the position and scale of each keypoint precisely by fitting a three-dimensional quadratic function while removing low-contrast keypoints and unstable edge response points" strengthens matching stability and improves robustness to noise; when the "feature extraction" uses surf feature points, the "interpolation in scale space and image space" allows the candidate feature points to be localized with sub-pixel accuracy.
The above are only embodiments of the present invention and do not limit the scope of the claims of the present invention; any equivalent transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (9)

1. A panoramic stitched image bending correction method, characterized by comprising:
S101, acquiring more than two panoramic partition images to be stitched;
S102, performing feature extraction on the panoramic partition images obtained in S101;
S103, performing feature matching on the panoramic partition images obtained in S102 to obtain the number of pairs of matching feature points;
S104, performing image matching on the panoramic partition images obtained in S103 according to the number of pairs of matching feature points;
S105, calculating a perspective transformation matrix from the panoramic partition images obtained in S104, calculating the initial depth of field and the rotation matrix of each image from the perspective transformation matrix, and obtaining a global transformation matrix through rotation matrix transformation;
S106, multiplying the global transformation matrix by the rotation matrices;
S107, projecting the panoramic partition images obtained in S106 into the same coordinate system and then performing image fusion;
S108, outputting the image obtained in S107.
2. The panoramic stitched image bending correction method according to claim 1, characterized in that the "feature extraction" in step S102 extracts sift, surf or harris feature points.
3. The panoramic stitched image bending correction method according to claim 1, characterized in that the "feature matching" in step S103 is carried out using the Euclidean distance, and the "number of pairs of matching feature points" in step S103 is calculated with the RANSAC algorithm.
4. The panoramic stitched image bending correction method according to claim 1, characterized in that the "obtaining of the global transformation matrix" in step S106 is specifically:
1) based on the feature matching, the perspective transformation matrix Hij between each image and its matched image is obtained by the method of least squares;
2) from the perspective transformation matrix Hij, the initial depth of field fi of each image and a 3 × 3 rotation matrix Ri are calculated; the three satisfy the following relation:
Hij = Ki · Ri · Rjᵀ · Kj⁻¹, where Ki = [ fi, 0, 0 ; 0, fi, 0 ; 0, 0, 1 ];
3) the 3 × 1 matrices r0i and avgzi are defined:
r0i = [ Ri[0] ; Ri[3] ; Ri[6] ], avgzi = [ Ri[2] ; Ri[5] ; Ri[8] ],
covsum = Σ_{i=0}^{i<image number} (r0i · r0iᵀ), avgzsum = Σ_{i=0}^{i<image number} avgzi;
4) covsum is decomposed by SVD and the orthogonal matrix is taken; r1 is defined as the third row of this orthogonal matrix; r0 = avgzsumᵀ × r1, r0 is normalized, and r2 = r0 × r1;
5) the global transformation matrix u is obtained:
u = [ r0 | r1 | r2 ].
5. The panoramic stitched image bending correction method according to claim 1, characterized in that the "image fusion" in step S107 uses average blending, multi-resolution fusion or weighted fusion.
6. A panoramic stitched image bending correction device, characterized by comprising an image capture module, a feature extraction module, a feature matching module, an image matching module, a global transformation matrix acquisition module, a computing module, an image fusion module and an image output module which are connected in sequence;
the image capture module is used for acquiring more than two panoramic partition images to be stitched;
the feature extraction module is used for performing feature extraction on the acquired panoramic partition images;
the feature matching module is used for performing feature matching on the panoramic partition images obtained by feature extraction to obtain the number of pairs of matching feature points;
the image matching module is used for performing image matching on the panoramic partition images obtained by feature matching according to the number of pairs of matching feature points;
the global transformation matrix acquisition module is used for calculating a perspective transformation matrix from the panoramic partition images obtained by the image matching module, calculating the initial depth of field and the rotation matrix of each image from the perspective transformation matrix, and obtaining a global transformation matrix through rotation matrix transformation;
the computing module is used for multiplying the global transformation matrix by the rotation matrices;
the image fusion module is used for projecting the panoramic partition images obtained by the computing module onto a reference plane and then performing image fusion;
the image output module is used for outputting the image obtained by the image fusion module.
7. The panoramic stitched image bending correction device according to claim 6, characterized in that the feature extraction module is a sift feature point extraction unit, a surf feature point extraction unit or a harris feature point extraction unit.
8. The panoramic stitched image bending correction device according to claim 6, characterized in that the image fusion module includes an arithmetic unit;
the arithmetic unit includes a first operation unit and a second operation unit which are connected in sequence; the first operation unit is used for superposing the pixel values of the corresponding points in the overlapping region of the two or more panoramic partition images; the second operation unit is used for averaging the superposed pixel values of the corresponding points in the overlapping region of the two or more panoramic partition images.
9. The panoramic stitched image bending correction device according to claim 6, characterized in that the fusion module also includes a setting unit;
the setting unit is used for presetting an adjustable parameter and adjusting the parameter according to the pixel values of the corresponding points of the two or more panoramic partition images.
CN201410667807.0A 2014-11-20 2014-11-20 Panoramic stitched image bending correction method and panoramic stitched image bending correction device Pending CN105678722A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410667807.0A CN105678722A (en) 2014-11-20 2014-11-20 Panoramic stitched image bending correction method and panoramic stitched image bending correction device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410667807.0A CN105678722A (en) 2014-11-20 2014-11-20 Panoramic stitched image bending correction method and panoramic stitched image bending correction device

Publications (1)

Publication Number Publication Date
CN105678722A true CN105678722A (en) 2016-06-15

Family

ID=56957848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410667807.0A Pending CN105678722A (en) 2014-11-20 2014-11-20 Panoramic stitched image bending correction method and panoramic stitched image bending correction device

Country Status (1)

Country Link
CN (1) CN105678722A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101877140A (en) * 2009-12-18 2010-11-03 北京邮电大学 Panorama-based panoramic virtual tour method
CN102006425A (en) * 2010-12-13 2011-04-06 交通运输部公路科学研究所 Method for splicing video in real time based on multiple cameras
CN104103051A (en) * 2013-04-03 2014-10-15 华为技术有限公司 Image splicing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI, JIA et al.: "Panoramic image stitching method based on an uncalibrated ordinary camera", Journal of System Simulation *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106161937A (en) * 2016-07-23 2016-11-23 徐荣婷 A kind of panoramic shooting machine people
CN106339981A (en) * 2016-08-25 2017-01-18 安徽协创物联网技术有限公司 Panorama stitching method
CN106412669A (en) * 2016-09-13 2017-02-15 微鲸科技有限公司 Method and device for rendering panoramic video
CN106412669B (en) * 2016-09-13 2019-11-15 微鲸科技有限公司 A kind of method and apparatus of panoramic video rendering
CN106530235A (en) * 2016-10-24 2017-03-22 深圳市前海视微科学有限责任公司 Image rotating processing method and device
CN106530235B (en) * 2016-10-24 2019-06-28 深圳市前海视微科学有限责任公司 Image rotation processing method and device
KR101868740B1 (en) * 2017-01-04 2018-06-18 명지대학교 산학협력단 Apparatus and method for generating panorama image
CN109064385A (en) * 2018-06-20 2018-12-21 何中 360 degree of panorama bandwagon effect Core Generators and delivery system
CN110796690A (en) * 2018-08-01 2020-02-14 株式会社理光 Image matching method and image matching device
CN110796690B (en) * 2018-08-01 2023-04-07 株式会社理光 Image matching method and image matching device
CN109671045A (en) * 2018-12-28 2019-04-23 广东美电贝尔科技集团股份有限公司 A kind of more image interfusion methods
CN110738599A (en) * 2019-10-14 2020-01-31 北京百度网讯科技有限公司 Image splicing method and device, electronic equipment and storage medium
CN112037130A (en) * 2020-08-27 2020-12-04 江苏提米智能科技有限公司 Adaptive image splicing and fusing method and device, electronic equipment and storage medium
CN112037130B (en) * 2020-08-27 2024-03-26 江苏提米智能科技有限公司 Self-adaptive image stitching fusion method and device, electronic equipment and storage medium
CN113066010A (en) * 2021-04-06 2021-07-02 无锡安科迪智能技术有限公司 Secondary adjustment method and device for panoramic stitching image, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN105678722A (en) Panoramic stitched image bending correction method and panoramic stitched image bending correction device
CN105608667A (en) Method and device for panoramic stitching
Deshmukh et al. A survey of image registration
CN110135455B (en) Image matching method, device and computer readable storage medium
US10445616B2 (en) Enhanced phase correlation for image registration
Juan et al. SURF applied in panorama image stitching
US8675997B2 (en) Feature based image registration
KR101175097B1 (en) Panorama image generating method
CN107507277B (en) Three-dimensional point cloud reconstruction method and device, server and readable storage medium
CN105678721A (en) Method and device for smoothing seams of panoramic stitched image
CN108765476B (en) Polarized image registration method
Mistry et al. Image stitching using Harris feature detection
CN105678720A (en) Image matching judging method and image matching judging device for panoramic stitching
CN110211169B (en) Reconstruction method of narrow baseline parallax based on multi-scale super-pixel and phase correlation
CN110084743B (en) Image splicing and positioning method based on multi-flight-zone initial flight path constraint
CN105608689B (en) A kind of panoramic mosaic elimination characteristics of image error hiding method and device
US20150254854A1 (en) Camera calibration method and apparatus using a color-coded structure
Misra et al. Feature based remote sensing image registration techniques: A comprehensive and comparative review
CN112163996B (en) Flat angle video fusion method based on image processing
Guiqin et al. A fast aerial images mosaic method based on ORB feature and homography matrix
CN109961092B (en) Binocular vision stereo matching method and system based on parallax anchor point
CN105184736A (en) Image registration method for narrow overlapping dual field high spectral imaging instrument
Xie et al. A new registration algorithm for multimodal remote sensing images
Kim Survey on registration techniques of visible and infrared images
Ishii et al. Wide-baseline stereo matching using ASIFT and POC

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20160615