CN102984453A - Method and system for generating hemispherical panoramic video images in real time with a single camera - Google Patents

Method and system for generating hemispherical panoramic video images in real time with a single camera

Info

Publication number
CN102984453A
Authority
CN
China
Prior art keywords
band
panorama
image
hemisphere
pixel
Prior art date
Legal status
Granted
Application number
CN2012104305118A
Other languages
Chinese (zh)
Other versions
CN102984453B (en)
Inventor
裴继红
谢维信
杨烜
杨焰
Current Assignee
Shenzhen University
Original Assignee
Shenzhen University
Priority date
Filing date
Publication date
Application filed by Shenzhen University filed Critical Shenzhen University
Priority to CN201210430511.8A
Publication of CN102984453A
Application granted
Publication of CN102984453B
Legal status: Active
Anticipated expiration

Landscapes

  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention belongs to the technical field of image processing and provides a method and a system for generating hemispherical panoramic video images in real time with a single camera. An ordinary single camera is mounted on a pan-tilt head that rotates through 360° horizontally and 90° in pitch; video is captured continuously as the head rotates, and the continuously captured frames are stitched by a computer into a hemispherical panoramic video image in real time. The approach is fast in computation, stitching and imaging, uses simple equipment and is low in cost, and can be widely applied to large-scene video surveillance, traffic monitoring and panoramic security monitoring of venues such as station halls and airport terminals.

Description

Method and system for generating hemispherical panoramic video images in real time with a single camera
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a method and a system for generating hemispherical panoramic video images in real time with a single camera.
Background art
Generating hemispherical panoramic video images in real time is of great significance for security surveillance applications and can serve other purposes as well. In the prior art, methods for generating hemispherical panoramic video images include the following.
In 1992, the University of Alabama (USA), using a panoramic annular lens (PAL) and its flat-cylinder imaging property, proposed a hemispherical imaging and tracking system, a new scheme for real-time target acquisition and tracking within a near-hemispherical (160°) field of view.
In 2001, the Institute of Computing Technology of the Chinese Academy of Sciences implemented an indoor roaming system based on a spherical panorama with a fixed viewpoint; the system projects horizontally captured images onto a sphere and provides a roaming function. It works on still images and does not realize a complete hemispherical model.
In 2005, the CAD&CG State Key Laboratory of Zhejiang University developed a desktop real-time roaming system for virtual building environments; the system reads several still images, performs horizontal and vertical stitching to generate a panorama, and provides a roaming function. It also works on still images, its real-time performance is weak, and the viewing angles it can cover are limited.
An article on "panoramic video surveillance" published in issue No. 3 of "China Security & Protection" in December 2006 refers only to a single camera that, without any mechanical displacement mechanism and without image stitching, can capture the whole hemisphere (180° x 360°); the article gives no concrete implementation.
In summary, in the prior art there is no method for generating a complete hemispherical panoramic video image in real time with a single ordinary camera.
Summary of the invention
The first technical problem to be solved by the invention is to provide a method for generating hemispherical panoramic video images in real time with a single camera, so that a complete hemispherical panoramic video image can be generated in real time with a single ordinary camera.
The invention is realized as follows: a method for generating hemispherical panoramic video images in real time with a single camera, comprising the following steps:
Step A: select a meridian on the hemisphere as the 0° meridian and arrange several preset points along the 0° meridian within the 90° pitch range, the latitude difference between adjacent preset points being essentially equal;
Step B: at each preset point, adjust the pitch of the camera optical axis to the direction of the parallel through the preset point, rotate the camera about the vertical axis through 360° horizontally starting from the preset point, capture images at regular intervals during the rotation, and preprocess each captured video frame;
Step C: stitch the preprocessed video frames captured at the same preset-point latitude into a horizontally arranged band panorama; the band panoramas of the preset points are arranged vertically;
Step D: for each video frame embedded in a band panorama, adjust the illumination consistency of the regions where it overlaps the frames to its left and right;
Step E: align the band panoramas at 0° longitude and stitch them into a hemisphere unfolded image;
Step F: display the hemisphere unfolded image as a disk by orthographic projection;
Step G: display the hemisphere unfolded image as a hemisphere by side projection.
The second technical problem to be solved by the invention is to provide a system for generating hemispherical panoramic video images in real time with a single camera, comprising:
a camera mounted on a pan-tilt head that rotates through a 90° pitch range and 360° horizontally;
a pan-tilt control unit for controlling the head so that the pitch of the camera optical axis is adjusted to the direction of the parallel through the preset point, and for driving the camera to rotate about the vertical axis through 360° horizontally starting from the preset point; the preset points are arranged along a 0° meridian selected on the hemisphere, the latitude difference between adjacent preset points being essentially equal;
an image acquisition unit for capturing images at regular intervals while the camera rotates and preprocessing each captured video frame;
a band panorama stitching unit for stitching the preprocessed video frames corresponding to the same preset point into a horizontally arranged band panorama and, for each video frame embedded in a band panorama, adjusting the illumination consistency of the regions where it overlaps the frames to its left and right; the band panoramas of the preset points are arranged vertically;
a hemisphere unfolded image stitching unit for aligning the band panoramas produced by the band panorama stitching unit at 0° longitude and stitching them into a hemisphere unfolded image;
a hemispherical panoramic video image generation unit for displaying the hemisphere unfolded image as a disk by orthographic projection and as a hemisphere by side projection.
In the invention, the hemispherical panoramic video image is generated by fusing several horizontal and vertical image stitches. Computation, stitching and imaging are fast, the equipment is simple and the cost is low, so the invention can be widely applied to large-scene video surveillance, traffic monitoring and panoramic security monitoring of venues such as station halls and airport terminals.
Brief description of the drawings
Fig. 1 is a flow chart of the method for generating a hemispherical panoramic video image in real time with a single camera provided by an embodiment of the invention;
Fig. 2 is a schematic diagram of the spherical transform of an image provided by the embodiment of the invention;
Fig. 3 is a schematic diagram of projecting a band panorama onto the sphere provided by the embodiment of the invention;
Fig. 4 is a schematic diagram of the hemisphere unfolded image provided by the embodiment of the invention;
Fig. 5 is a schematic diagram of band fusion in the hemisphere unfolded image provided by the embodiment of the invention;
Fig. 6 is a schematic diagram of the row coordinate in the side projection of the hemisphere unfolded image provided by the embodiment of the invention;
Fig. 7 is a schematic diagram of the column coordinate in the side projection of the hemisphere unfolded image provided by the embodiment of the invention;
Fig. 8 is a principle diagram of the method for generating a hemispherical panoramic video image in real time with a single camera provided by the embodiment of the invention;
Fig. 9 is a schematic structural diagram of the system for generating hemispherical panoramic video images in real time with a single camera provided by the embodiment of the invention;
Fig. 10 is the control principle diagram of Fig. 9;
Fig. 11 is a schematic structural diagram of the band panorama stitching unit provided by the embodiment of the invention;
Fig. 12 is a schematic structural diagram of the hemisphere unfolded image stitching unit provided by the embodiment of the invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the invention clearer, the invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only serve to explain the invention and are not intended to limit it.
The invention uses an ordinary single camera mounted on a pan-tilt head that rotates through 360° horizontally and 90° in pitch; video is captured continuously as the head rotates, and the continuously captured frames are stitched by a computer into a hemispherical panoramic video image in real time.
The hemispherical panoramic video image is generated by fusing several horizontal and vertical image stitches; in the embodiment of the invention there are 9 image bands in the vertical direction and about 161 images in each band in the horizontal direction. The invention first selects a meridian as the 0° meridian and arranges 9 preset points along it. For each band, starting from its preset point, images are captured along the same latitude circle while the head rotates horizontally, and each captured image is preprocessed and then embedded into the band. The band panorama whose preset point is at 0° latitude is band 1, and, going upward, the band panoramas corresponding to the 9 preset points are numbered 1, 2, ..., 9 in turn. The position of each band panorama in the hemisphere unfolded image is computed from the radius of its latitude circle; the band panoramas are aligned at 0° longitude and embedded into the hemisphere unfolded image, and the illumination consistency of the overlap regions of adjacent band panoramas is adjusted. The hemisphere unfolded image is displayed as a disk by orthographic projection and as a hemisphere by side projection.
Fig. 1 shows the flow of the method for generating a hemispherical panoramic video image in real time with a single camera provided by the invention, detailed as follows.
Step A: select a meridian on the hemisphere as the 0° meridian and arrange several preset points along the 0° meridian within the 90° pitch range, the latitude difference between adjacent preset points being essentially equal.
Step B: at each preset point, adjust the pitch of the camera optical axis to the direction of the parallel through the preset point, rotate the camera about the vertical axis through 360° horizontally starting from the preset point, capture images at regular intervals during the rotation, and preprocess each captured video frame.
The preprocessing comprises: removing the odd/even field jagging of the video, deleting the more heavily distorted vertical strips at the left and right borders of each video frame, and keeping the less distorted central vertical strip.
The concrete scheme and purpose of removing the odd/even field jagging are as follows. For a progressive-scan camera this step can be omitted. For an interlaced camera, only the odd field of each frame is taken; in every row of the odd field, the mean of each odd-column pixel and its neighboring even-column pixel is computed, and the even-column pixels are then discarded. The purpose of this step is to eliminate the field jagging offset that motion introduces in some cameras.
The reason for keeping the less distorted central vertical strip of each video frame is that ordinary cameras have non-imaging black strips at the image borders and larger distortion near the borders. Depending on the quality of the camera, vertical strips 10 to 30 pixels wide are cut off from the left and right of each frame.
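Purely as an illustration (not the patented implementation), a minimal preprocessing sketch in Python/NumPy could look as follows; the assumption that the odd field consists of rows 0, 2, 4, ..., the 20-pixel crop width and the function name are choices made for the sketch.

```python
# Illustrative sketch of the frame preprocessing described above (assumptions:
# grayscale uint8 frames, odd field = rows 0, 2, 4, ..., crop width 20 px).
import numpy as np

def preprocess_frame(frame: np.ndarray, border: int = 20, interlaced: bool = True) -> np.ndarray:
    if interlaced:
        odd = frame[0::2, :].astype(np.float32)      # keep only the odd field
        w = (odd.shape[1] // 2) * 2
        # average each odd-column pixel with its neighboring even-column pixel,
        # then discard the even columns (width halved to match the halved height)
        odd = (odd[:, 0:w:2] + odd[:, 1:w:2]) / 2.0
        frame = odd.astype(np.uint8)
    # keep only the less distorted central vertical strip (10-30 px cut per side)
    return frame[:, border:frame.shape[1] - border]
```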
Step C: stitch the preprocessed video frames on the latitude circle of the same preset point into a band panorama arranged in the horizontal direction; the band panoramas of the preset points are arranged vertically.
N images are captured in the horizontal direction along the latitude circle of the preset point. According to the inter-frame correlation distance d of the current band panorama, the position in the current band panorama of the retained, less distorted central vertical strip of each video frame is computed, and the strip is embedded at the computed position. If one period of the panorama band is stitched from N_0 frames, N is taken as the integer part of N_0 plus 20 (N can also take other values; the purpose of adding 20 is to make the number of frames captured during the camera rotation cover slightly more than one circumference period of the field of view); N is the number of frames actually captured while generating each band panorama. Because the angular speed at which the head rotates about the vertical axis is constant and independent of the latitude of the camera optical axis, the number of frames N captured on the latitude circle of every preset point is the same. The inter-frame correlation distance d of the current band panorama is rounded to obtain d_I, and d_I is used to compute the position in the current band panorama of the retained central vertical strip of each frame, into which the strip is embedded.
Let D_0 be the circumference of the band panorama whose center is at 0° latitude and d_0 the inter-frame correlation distance of that band panorama; N_0 is computed by formula (1):

N_0 = D_0 / d_0    (1)

The way of computing the circumference D_0 of the band panorama centered at 0° latitude is described in ZL200710125400.5; for ease of understanding, it is only summarized here as follows.
A. Obtain the video sequence captured while the pan-tilt head rotates uniformly for three to four cycles.
B. Stitch the video sequence obtained in step A into a panorama according to the inter-frame correlation distance d.
C. For the multi-cycle panoramic image obtained by the stitching of step B, compute the gradient projection histogram P(t) of its grayscale image.
D. Apply one-dimensional low-pass filtering to the gradient projection histogram obtained in step C and compute the local maximum position t0 of the filtered signal.
The purpose and technical scheme of the one-dimensional low-pass filtering are as follows: after low-pass filtering, the maximum position in the signal can be located more accurately; this position is usually where the vertical edges in the image are strongest. One implementation first takes the absolute value of each entry of the gradient vertical projection histogram, giving a non-negative one-dimensional sequence, and then convolves this sequence with a one-dimensional low-pass filter (for example a length-7 filter [1, 1, 1, 1, 1, 1, 1]).
E. Take a signal segment of suitable length centered at the local maximum position t0 as the local signal L(t) to be correlated.
F. In the gradient projection histogram P(t), find the 3 positions whose correlation coefficient with the local signal L(t) is greater than 0.9; the 3 points are denoted t1, t2 and t3.
G. Check the distances between the neighboring points t1, t2 and t3: if the distance between t1 and t2 equals the distance between t2 and t3, that distance is the circumference period D_0 of the panoramic video image.
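The following simplified Python/NumPy sketch illustrates this kind of period estimation; the gradient operator, the 0.9 threshold, the window length and the use of the median peak spacing are assumptions of the sketch rather than the method of ZL200710125400.5.

```python
# Simplified sketch of the circumference-period (D0) estimation summarized above.
import numpy as np

def estimate_period_D0(pano_gray: np.ndarray, win: int = 64) -> int:
    gx = np.abs(np.diff(pano_gray.astype(np.float32), axis=1))  # horizontal gradient magnitude
    P = gx.sum(axis=0)                                          # gradient projection histogram P(t)
    P_s = np.convolve(P, np.ones(7) / 7.0, mode="same")         # one-dimensional low-pass filtering
    t0 = int(np.argmax(P_s))                                    # local maximum position
    L = P[max(0, t0 - win): t0 + win]                           # local signal L(t) to be correlated
    corr = np.full(len(P) - len(L), -1.0)
    for t in range(len(corr)):                                  # correlate L(t) with every position of P(t)
        corr[t] = np.corrcoef(P[t: t + len(L)], L)[0, 1]
    hits = np.where(corr > 0.9)[0]                              # positions matching the local signal
    if len(hits) == 0:
        return 0
    # group neighbouring hits into clusters and keep the strongest position of each cluster
    peaks, cluster = [], [hits[0]]
    for t in hits[1:]:
        if t - cluster[-1] > win:
            peaks.append(max(cluster, key=lambda u: corr[u]))
            cluster = []
        cluster.append(t)
    peaks.append(max(cluster, key=lambda u: corr[u]))
    # with three (or more) equally spaced peaks, the spacing is the period D0
    return int(np.median(np.diff(peaks))) if len(peaks) > 1 else 0
```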
Likewise, the way of computing the inter-frame correlation distance d_0 of the band panorama centered at 0° latitude is described in ZL200710125400.5; for ease of understanding, it is only summarized here as follows.
A. Obtain the current video frame, compute the gradient of this frame along the camera motion direction, and obtain the horizontal gradient map.
The purpose of computing the gradient of the frame along the camera motion direction is to represent the image features of the frame with a mathematical model so that the correlation between frames can be judged. The horizontal gradient map in the invention is the image obtained by convolving the image with a horizontal gradient operator (such as the Sobel operator [1, 0, -1; 2, 0, -2; 1, 0, -1]).
B. Apply vertical gradient projection to the horizontal gradient map of step A to obtain the gradient vertical projection histogram. The gradient vertical projection histogram in the invention is the one-dimensional signal sequence along the row direction formed by accumulating the values at each pixel of the gradient map along the column direction.
C. Take one segment of the projection histogram within the overlap region of adjacent frames.
D. Compare the segment taken in step C with the corresponding part of the projection histogram of the adjacent previous video frame, and compute the correlation distance d_0 of the two adjacent frames.
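A hedged Python/NumPy sketch of this inter-frame correlation distance estimation is given below; the search range d_max, the segment length and the direction of the shift are assumptions of the sketch.

```python
# Sketch of the inter-frame correlation distance (d0) estimation summarized above.
import numpy as np
from scipy.ndimage import sobel

def gradient_projection(frame_gray: np.ndarray) -> np.ndarray:
    gx = sobel(frame_gray.astype(np.float32), axis=1)   # horizontal gradient (camera motion direction)
    return np.abs(gx).sum(axis=0)                        # gradient vertical projection histogram

def estimate_d0(prev_gray: np.ndarray, cur_gray: np.ndarray, d_max: int = 200, seg: int = 256) -> int:
    p_prev = gradient_projection(prev_gray)
    p_cur = gradient_projection(cur_gray)
    template = p_cur[:seg]                               # segment inside the overlap region of adjacent frames
    best_d, best_c = 0, -1.0
    for d in range(1, d_max):                            # candidate displacement between the two frames
        ref = p_prev[d: d + seg]
        if len(ref) < seg:
            break
        c = np.corrcoef(template, ref)[0, 1]
        if c > best_c:
            best_c, best_d = c, d
    return best_d
```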
Further, with d_0 known, the value of the inter-frame correlation distance d of the current band panorama is obtained by the following steps.
Step C11: obtain the current video frame, compute the gradient of this frame along the camera motion direction, and obtain the horizontal gradient map.
As above, the purpose of computing the gradient along the camera motion direction is to represent the image features of the frame with a mathematical model so that the correlation between frames can be judged; the horizontal gradient map is obtained by convolving the image with a horizontal gradient operator such as the Sobel operator [1, 0, -1; 2, 0, -2; 1, 0, -1].
Step C12: project the image onto the hemisphere and compute the viewing angle of the bottom row of the image after the spherical transform.
Let W be the image width, H the image height, d_0 the inter-frame correlation distance of the band panorama centered at 0° latitude, d the inter-frame correlation distance of the current band, and R the sphere radius. Let φ_b be the angle, measured from the pole, of the latitude circle of the bottom-row pixels of the current gradient image, and r_b the radius of that latitude circle. The bottom-row field-of-view angle θ_b of the current frame is computed by the following formulas:

R = D_0 / (2π)    (2)

φ_b = arcsin(d / d_0) + arctan(H / (2R)),  1 ≤ d ≤ d_0    (3)

r_b = R sin(φ_b)    (4)

θ_b = 2 arctan(W / (2 r_b))    (5)

θ_b is a decimal of the same order of magnitude as 2π/N_0. θ_b is multiplied by 1000 and rounded, and the result is denoted W_I; W_I is used as the width of the gradient map after the spherical transform.
Step C13: apply the spherical transform to the gradient map of step C11 according to the bottom-row viewing angle of the image, so that pixels with the same vertical viewing angle are arranged on the same vertical column.
Let y be the position, from left to right, of a bottom-row pixel of the step C11 gradient map, with y ranging from 1 to W; let θ be the viewing angle corresponding to y, ranging from 0 to θ_b; and let h be the vertical distance of a pixel of the gradient map from the central pixel of the gradient map, with h ranging from -H/2 to H/2. The radius r of the sphere latitude circle of a pixel at vertical distance h in the gradient map is computed as

r = R sin( (R·arcsin(d/d_0) + h) / R )    (6)

The gradient map G_y(h, y) is spherically transformed into the gradient map g_y(h, θ) by the spherical transform relations

θ = θ_b/2 + arctan( (y - W/2) / r ),  y = 1, ..., W    (7)

y = W/2 + r·tan( θ - θ_b/2 ),  θ = 0, ..., θ_b    (8)

The purpose of the spherical transform of the gradient map is as follows: when the original gradient map is projected onto the hemisphere, the pixels with the same vertical viewing angle lie on an arc; the spherical transform rearranges the pixels on such an arc onto a single vertical column, as shown in Fig. 2.
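For illustration, the spherical transform of formulas (6) to (8) can be sketched in Python/NumPy as follows; nearest-neighbour sampling and the argument names are assumptions of the sketch.

```python
# Illustrative implementation of the spherical transform of formulas (6)-(8).
import numpy as np

def spherical_transform(grad: np.ndarray, d: float, d0: float, R: float) -> np.ndarray:
    H, W = grad.shape
    r_b = R * np.sin(np.arcsin(d / d0) + np.arctan(H / (2.0 * R)))   # bottom-row latitude-circle radius
    theta_b = 2.0 * np.arctan(W / (2.0 * r_b))                       # bottom-row viewing angle, formula (5)
    W_I = int(round(theta_b * 1000.0))                               # width of the transformed gradient map
    out = np.zeros((H, W_I), dtype=grad.dtype)
    hs = np.arange(H) - H / 2.0                                      # vertical offset h from the central row
    thetas = np.linspace(0.0, theta_b, W_I)
    for row, h in enumerate(hs):
        r = R * np.sin((R * np.arcsin(d / d0) + h) / R)              # latitude-circle radius, formula (6)
        y = W / 2.0 + r * np.tan(thetas - theta_b / 2.0)             # source column per viewing angle, formula (8)
        y = np.clip(np.round(y).astype(int), 0, W - 1)
        out[row, :] = grad[row, y]                                   # equal vertical viewing angle -> same column
    return out
```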
Step C14: apply vertical gradient projection to the gradient map obtained after the spherical transform of step C13; concretely, the values at each pixel of the gradient map are accumulated along the column direction to form a one-dimensional signal sequence along the row direction. In the invention this sequence is represented by the gradient vertical projection histogram.
Step C15: compute the inter-frame correlation distance of the gradient maps after the spherical transform of step C13; this distance is in fact an inter-frame viewing-angle correlation distance. During panorama generation the angular speed of the head rotation is constant, so if the capture interval is fixed, the viewing-angle displacement between adjacent frames captured on different latitude circles is the same. Let Φ_0 be the viewing-angle displacement between adjacent frames and N_0 the number of frames in one circumference period; the pixel displacement S_θ of the corresponding spherically transformed gradient maps is

S_θ = (Φ_0 / θ_b)·W_I = (2π·W_I) / (N_0·θ_b)    (9)

Step C16: using the pixel displacement S_θ of the spherically transformed gradient maps obtained in step C15, take one segment from each of the one-dimensional signal sequences of two adjacent gradient maps and compute the correlation coefficient ρ of the two segments.
Step C17: increase the inter-frame correlation distance d of the current band panorama from 1 to d_0. For each d, execute steps C12 to C16 to obtain a series of correlation coefficients ρ(d); the d corresponding to the maximum of ρ(d) is taken as the inter-frame correlation distance of the current band panorama. Because d is an integer here, the precision of the inter-frame correlation distance is pixel level.
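The pixel-level search of steps C12 to C17 can be sketched as follows (Python/NumPy); it reuses the spherical_transform() sketch above, and the segment length and the recovery of θ_b from W_I are assumptions of the sketch.

```python
# Sketch of the pixel-level search of steps C12-C17 over candidate distances d.
import numpy as np

def find_band_correlation_distance(grad_prev, grad_cur, d0, R, N0):
    best_d, best_rho = 1, -1.0
    for d in range(1, int(d0) + 1):                        # step C17: try every integer d in [1, d0]
        g1 = spherical_transform(grad_prev, d, d0, R)      # step C13 (sketch defined above)
        g2 = spherical_transform(grad_cur, d, d0, R)
        p1 = np.abs(g1).sum(axis=0)                        # step C14: vertical gradient projection
        p2 = np.abs(g2).sum(axis=0)
        W_I = g1.shape[1]
        theta_b = W_I / 1000.0                             # inverse of W_I = round(1000 * theta_b)
        S = int(round(2.0 * np.pi * W_I / (N0 * theta_b))) # step C15: pixel displacement, formula (9)
        seg = min(256, W_I - S - 1)
        if seg <= 1:
            continue
        rho = np.corrcoef(p2[:seg], p1[S:S + seg])[0, 1]   # step C16: correlate shifted segments
        if rho > best_rho:
            best_rho, best_d = rho, d
    return best_d                                          # pixel-level inter-frame correlation distance
```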
Further, to obtain an inter-frame correlation distance of sub-pixel precision (accurate to one decimal place) and meet the accuracy requirement of the panorama stitching, step C18 is performed after step C17.
Step C18: take the correlation coefficient sequence ρ(d), d = 1, ..., d_0, obtained in step C17, and fit it with a window function to obtain the fitted correlation function ρ_0(t):

ρ_0(t) = Σ_{d=1}^{d_0} ρ(d)·exp( -(t - d)² / (2 h_f²) )    (10)

where t is the continuous argument of ρ_0(t) and h_f is the window width of the fitting function; in the embodiment of the invention h_f = 2. The following interpolated correlation sequence is then computed:

ρ_1[i] = ρ_0(t = i/10),  i = 1, 2, ..., 10·d_0

The sequence ρ_1[i] is digitally low-pass filtered:

ρ_2[i] = (1/(2m+1))·Σ_{j=-m}^{m} ρ_1[i+j],  i = 1, 2, ..., 10·d_0    (11)

In the computation of formula (11), ρ_1[i+j] = 0 when i + j < 1 or i + j > 10·d_0; in the embodiment of the invention m = 4 in formula (11).
ρ_2[i] is a correlation function sequence accurate to one decimal place; if i* is the index at which ρ_2 attains its maximum, d = i*/10 is the inter-frame correlation distance of sub-pixel precision (accurate to one decimal place).
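A sketch of this sub-pixel refinement in Python/NumPy is shown below, with h_f = 2 and m = 4 as in the embodiment; the input rho is assumed to be the sequence ρ(d), d = 1, ..., d_0, from step C17.

```python
# Sketch of the sub-pixel refinement of step C18, formulas (10)-(11).
import numpy as np

def subpixel_correlation_distance(rho: np.ndarray, h_f: float = 2.0, m: int = 4) -> float:
    d0 = len(rho)
    d = np.arange(1, d0 + 1, dtype=np.float64)
    t = np.arange(1, 10 * d0 + 1, dtype=np.float64) / 10.0          # t = i/10, i = 1..10*d0
    # formula (10): Gaussian-window fit of the discrete correlation coefficients
    rho1 = (rho[None, :] * np.exp(-(t[:, None] - d[None, :]) ** 2 / (2.0 * h_f ** 2))).sum(axis=1)
    # formula (11): digital low-pass filtering with zero padding outside 1..10*d0
    padded = np.concatenate([np.zeros(m), rho1, np.zeros(m)])
    rho2 = np.array([padded[i:i + 2 * m + 1].mean() for i in range(len(rho1))])
    i_max = int(np.argmax(rho2)) + 1                                 # 1-based index of the maximum
    return i_max / 10.0                                              # sub-pixel correlation distance
```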
Step D: for each video frame embedded in a band panorama, adjust the illumination consistency of the regions where it overlaps the frames to its left and right.
Step E: align the band panoramas at 0° longitude and stitch them into a hemisphere unfolded image. Specifically, step E comprises the following steps.
Step E1: for each band panorama, relative to the bottom row of the band panorama centered at 0° latitude, shift each row of the current band panorama to the left by s pixels, where s is obtained from

s = (D_0 - W_h)/2 = (D_0/2)·(1 - r_h/R)    (12)

where D_0 is the circumference of the band panorama centered at 0° latitude; R is the sphere radius; W_h is the circumference (length) of the image row whose vertical distance from the central row of the current band panoramic image is h; and r_h is the radius of the sphere latitude circle of that row, obtained from

r_h = R sin( (R·arcsin(d/d_0) + h) / R )    (13)

where d is the inter-frame correlation distance of the current band panoramic image.
Step E2: project each band panorama onto the hemisphere and unfold it into the hemisphere unfolded image; Fig. 3 is the schematic diagram of the spherical projection of the image and Fig. 4 the schematic diagram of the hemisphere unfolded image. The concrete realization is as follows: from the band inter-frame correlation distance d and the sphere radius R, compute the position H_h of the top row and the position H_l of the bottom row of the transformed band in the hemisphere unfolded image:

H_h = R[ arcsin(d/d_0) - arctan(H/(2R)) ]    (14)

H_l = R[ arcsin(d/d_0) + arctan(H/(2R)) ]    (15)

where d is the inter-frame correlation distance, R is the sphere radius, and H_h and H_l are respectively the positions of the top row and the bottom row of the transformed band panorama in the hemisphere unfolded image.
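The placement quantities of formulas (12) to (15) can be computed, for example, as in the following Python/NumPy sketch; the function name and the per-row output layout are assumptions of the sketch.

```python
# Sketch computing where a band panorama goes in the hemisphere unfolded image, formulas (12)-(15).
import numpy as np

def band_placement(d: float, d0: float, D0: float, R: float, H_band: int):
    """Return the left shift per row and the top/bottom row positions of one band."""
    H_h = R * (np.arcsin(d / d0) - np.arctan(H_band / (2.0 * R)))   # top row, formula (14)
    H_l = R * (np.arcsin(d / d0) + np.arctan(H_band / (2.0 * R)))   # bottom row, formula (15)
    shifts = []
    for row in range(H_band):
        h = row - H_band / 2.0                                       # offset from the central row
        r_h = R * np.sin((R * np.arcsin(d / d0) + h) / R)            # latitude-circle radius, formula (13)
        W_h = 2.0 * np.pi * r_h                                      # circumference of this image row
        shifts.append(0.5 * (D0 - W_h))                              # left shift s, formula (12)
    return shifts, H_h, H_l
```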
Step E3: adjust the illumination consistency of adjacent bands:

f(x, y) = (Δx/T)·f_1(x, y) + (1 - Δx/T)·f_2(x, y)    (16)

In the formula above, f_1(x, y) and f_2(x, y) are respectively the image functions of the upper and lower adjacent band panoramas, and (x, y) is the pixel coordinate of the image. Fig. 5 shows the fusion of bands in the hemisphere unfolded image; the overlap region of the two bands is the shaded band in Fig. 5, which is the fusion region. The height of the fusion region is T = H_1l - H_2h, where H_1l is the bottom-row position of the upper band and H_2h is the top-row position of the lower band; Δx is the vertical distance of the overlap-region pixel (x, y) from the bottom border of the overlap region; f(x, y) is the fused pixel value, obtained as the weighted sum of the values of the corresponding pixels of the two original images.
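A minimal Python/NumPy sketch of the blend of formula (16), assuming the two overlap regions have already been cut out as arrays of identical size, is:

```python
# Sketch of the illumination-consistency blend of formula (16) for two vertically adjacent bands.
import numpy as np

def blend_overlap(upper_overlap: np.ndarray, lower_overlap: np.ndarray) -> np.ndarray:
    """upper_overlap / lower_overlap: the same T x W overlap region cut from f1 (upper) and f2 (lower)."""
    T = upper_overlap.shape[0]
    # dx = vertical distance of each row from the bottom border of the overlap region, normalized to [0, 1]
    dx = (T - 1 - np.arange(T, dtype=np.float32)) / max(T - 1, 1)
    w = dx[:, None]                                                  # weight of the upper band f1
    return w * upper_overlap.astype(np.float32) + (1.0 - w) * lower_overlap.astype(np.float32)
```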
Step F: display the hemisphere unfolded image as a disk by orthographic projection.
Because the hemisphere unfolded image is obtained by unfolding the hemisphere panorama, it can be mapped directly to a disk display. Let (i, j) be the pixel coordinates of the hemisphere unfolded image and C_i the width of row i of the hemisphere unfolded image; the corresponding polar coordinates of the pixel in the disk panoramic image are (r(i, j), θ(i, j)):

r(i, j) = i    (17)

θ(i, j) = 2π·j / C_i    (18)
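For illustration, the disk display of formulas (17) and (18) can be rasterized as in the following Python/NumPy sketch; the canvas size and nearest-pixel rounding are assumptions of the sketch.

```python
# Sketch of the disk display of formulas (17)-(18): pixel (i, j) of the hemisphere
# unfolded image is drawn at polar coordinates (r, theta) = (i, 2*pi*j / C_i).
import numpy as np

def unfolded_to_disk(unfolded: np.ndarray, row_widths: np.ndarray) -> np.ndarray:
    n_rows = unfolded.shape[0]
    size = 2 * n_rows + 1
    disk = np.zeros((size, size), dtype=unfolded.dtype)
    cx = cy = n_rows                                       # disk center
    for i in range(n_rows):
        C_i = int(row_widths[i])
        for j in range(C_i):
            theta = 2.0 * np.pi * j / C_i                  # formula (18)
            x = int(round(cx + i * np.cos(theta)))         # radius r = i, formula (17)
            y = int(round(cy + i * np.sin(theta)))
            disk[y, x] = unfolded[i, j]
    return disk
```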
Step G: display the hemisphere unfolded image as a hemisphere by side projection.
Figs. 6 and 7 show respectively the row and column coordinate relations of the side projection of the hemisphere unfolded image. Let R be the sphere radius, (i, j) the pixel coordinates of the hemisphere unfolded image, C_i the width of row i of the hemisphere unfolded image, and v the initial longitude intercepted in the hemisphere display; the rectangular coordinates of the displayed panorama pixel after the side projection are (X(i, j), Y(i, j)):

X(i, j) = R( 1 - cos(i/R) )    (19)

Y(i, j) = R - (C_i/(2π))·cos( 2π·j/C_i - v ),  0 ≤ 2π·j/C_i - v < π    (20)

If, for row i, a value of j satisfying the condition of formula (20) is greater than C_i, the corresponding actual pixel in the hemisphere unfolded image has coordinates (i, j - C_i). By changing the value of the initial longitude v, the observer can view the whole hemispherical field and obtain the impression of a rotating hemisphere.
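Similarly, the side projection of formulas (19) and (20) can be sketched as follows (Python/NumPy); the output canvas size, the rounding and the handling of the wrapped column index are assumptions of the sketch.

```python
# Sketch of the hemisphere side-projection display of formulas (19)-(20).
import numpy as np

def unfolded_to_hemisphere(unfolded: np.ndarray, row_widths: np.ndarray, R: float, v: float = 0.0) -> np.ndarray:
    size = int(np.ceil(2 * R)) + 2
    out = np.zeros((size, size), dtype=unfolded.dtype)
    for i in range(unfolded.shape[0]):
        C_i = int(row_widths[i])
        for j in range(C_i):
            ang = (2.0 * np.pi * j / C_i - v) % (2.0 * np.pi)  # wrap by C_i when j runs past the row width
            if not (0.0 <= ang < np.pi):                       # only the front half of the latitude circle is visible
                continue
            X = R * (1.0 - np.cos(i / R))                      # formula (19)
            Y = R - (C_i / (2.0 * np.pi)) * np.cos(ang)        # formula (20)
            out[int(round(X)), int(round(Y))] = unfolded[i, j]
    return out
```

Varying the initial longitude v in successive calls gives the rotating-hemisphere effect mentioned above.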
Fig. 8 shows the principle of the method for generating a hemispherical panoramic video image in real time with a single camera provided by the embodiment of the invention; it is not described again here.
Fig. 9 shows the structure of the system for generating hemispherical panoramic video images in real time with a single camera provided by the embodiment of the invention, and Fig. 10 shows the control principle of Fig. 9. Referring to Figs. 9 and 10 together, the system comprises a camera 1, a pan-tilt control unit 2, an image acquisition unit 3, a band panorama stitching unit 4, a hemisphere unfolded image stitching unit 5 and a hemispherical panoramic video image generation unit 6, where the band panorama stitching unit 4, the hemisphere unfolded image stitching unit 5 and the hemispherical panoramic video image generation unit 6 can be software units built into a computer or a DSP processor.
The camera 1 is mounted on a pan-tilt head that rotates through a 90° pitch range and 360° horizontally. The pan-tilt control unit 2 controls the head so that the pitch of the camera optical axis is adjusted to the direction of the parallel through the preset point, and drives the camera 1 to rotate about the vertical axis through 360° horizontally starting from the preset point; as described above, the preset points are arranged along the 0° meridian selected on the hemisphere, the latitude difference between adjacent preset points being essentially equal.
The image acquisition unit 3 captures images at regular intervals while the camera 1 rotates and preprocesses each captured video frame. The band panorama stitching unit 4 stitches the preprocessed video frames corresponding to the same preset point into a horizontally arranged band panorama and, for each video frame embedded in a band panorama, adjusts the illumination consistency of the regions where it overlaps the frames to its left and right; the band panoramas of the preset points are arranged vertically.
The hemisphere unfolded image stitching unit 5 then aligns the band panoramas produced by the band panorama stitching unit 4 at 0° longitude and stitches them into a hemisphere unfolded image, and the hemispherical panoramic video image generation unit 6 finally displays the hemisphere unfolded image as a disk by orthographic projection and as a hemisphere by side projection.
The working principle of the hemispherical panoramic video image generation unit 6 has been described above and is not repeated here.
Further, the image acquisition unit 3 deletes the more heavily distorted vertical strips at the left and right borders of each video frame and keeps the less distorted central vertical strip; and/or, for an interlaced camera, it takes only the odd field of each frame and, in every row of the odd field, computes the mean of each odd-column pixel and its neighboring even-column pixel and then discards the even-column pixels, so as to remove the odd/even field jagging offset of the video.
Further, the band panorama stitching unit 4 captures N images in the horizontal direction and, according to the inter-frame correlation distance d of the current band panorama, computes the position in the current band panorama of the retained, less distorted central vertical strip of each video frame and embeds the strip at the computed position;
where N is greater than the number N_0 of video frames contained in one period of the band panorama, and N_0 can be obtained from

N_0 = D_0 / d_0

where D_0 is the circumference of the band panorama centered at 0° latitude and d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude.
As shown in Fig. 11, the band panorama stitching unit 4 comprises: a horizontal gradient map computing module 41, a bottom-row viewing angle computing module 42, a spherical transform module 43, a gradient vertical projection extraction module 44, a pixel displacement computing module 45, a correlation coefficient computing module 46, a band panorama inter-frame correlation distance computing module 47, a fitting module 48 and a sub-pixel precision correlation distance computing module 49.
The horizontal gradient map computing module 41 obtains the current video frame, computes the gradient of this frame along the camera motion direction and obtains the horizontal gradient map. Let W be the image width, H the image height, d_0 the inter-frame correlation distance of the band panorama centered at 0° latitude, d the inter-frame correlation distance of the current band panorama and R the sphere radius, and let φ_b be the angle, measured from the pole, of the latitude circle of the bottom-row pixels of the current gradient image and r_b the radius of that latitude circle. The bottom-row viewing angle computing module 42 computes the bottom-row viewing-angle width θ_b of the image by

R = D_0 / (2π)

φ_b = arcsin(d / d_0) + arctan(H / (2R)),  r_b = R sin(φ_b)

θ_b = 2 arctan(W / (2 r_b))

θ_b is a decimal of the same order of magnitude as 2π/N_0; θ_b is multiplied by 1000 and rounded, and the result, denoted W_I, is used as the width of the gradient map after the spherical transform.
The spherical transform module 43 applies the spherical transform to the gradient map computed by the horizontal gradient map computing module according to the bottom-row viewing angle of the image, arranging pixels with the same vertical viewing angle on the same vertical column. The gradient vertical projection extraction module 44 applies vertical gradient projection to the gradient map spherically transformed by the spherical transform module, i.e. accumulates the values at each pixel of the transformed map along the column direction to form a one-dimensional signal sequence along the row direction. The pixel displacement computing module 45 computes the pixel displacement S_θ of the gradient maps transformed by the spherical transform module according to

S_θ = (Φ_0 / θ_b)·W_I = (2π·W_I) / (N_0·θ_b)

where Φ_0 is the viewing-angle displacement of adjacent frames in the band panoramic image at 0° latitude.
The correlation coefficient computing module 46 uses the pixel displacement S_θ of the spherically transformed gradient maps obtained by the pixel displacement computing module to take one segment from each of the one-dimensional signal sequences of two adjacent gradient maps and compute the correlation coefficient ρ of the two segments. The band panorama inter-frame correlation distance computing module 47 increases the inter-frame correlation distance d of the current band panorama from 1 to d_0 (the inter-frame correlation distance of the 0°-latitude band); for each d, a correlation coefficient ρ(d) is computed by passing in turn through the bottom-row viewing angle computing module, the spherical transform module, the gradient vertical projection extraction module, the pixel displacement computing module and the correlation coefficient computing module; the d at which ρ(d) is maximal is taken as the inter-frame correlation distance of the current band panorama. Because d is an integer here, the precision of the computed inter-frame correlation distance is pixel level.
Further, the band panorama stitching unit 4 also comprises a fitting module 48 for fitting, with a window function, the correlation coefficient sequence ρ(d) over the inter-frame correlation distances d obtained by the band panorama inter-frame correlation distance computing module, to obtain the fitted correlation function ρ_0(t):

ρ_0(t) = Σ_{d=1}^{d_0} ρ(d)·exp( -(t - d)² / (2 h_f²) )

where t is the continuous argument of ρ_0(t) and h_f is the window width of the fitting function; in the embodiment of the invention h_f = 2.
Still further, the band panorama stitching unit 4 comprises a sub-pixel precision correlation distance computing module 49. This module uses the correlation function ρ_0(t) obtained by the fitting module to compute the interpolated correlation sequence

ρ_1[i] = ρ_0(t = i/10),  i = 1, 2, ..., 10·d_0

and then applies digital low-pass filtering to the sequence ρ_1[i]:

ρ_2[i] = (1/(2m+1))·Σ_{j=-m}^{m} ρ_1[i+j],  i = 1, 2, ..., 10·d_0

In this computation ρ_1[i+j] = 0 when i + j < 1 or i + j > 10·d_0, and m = 4 is used.
The above ρ_2[i] is an inter-frame correlation function sequence accurate to one decimal place; if i* is the index at which ρ_2 attains its maximum, d = i*/10 is the inter-frame correlation distance of sub-pixel precision (accurate to one decimal place).
As described above, let y be the position, from left to right, of a bottom-row pixel of the gradient map obtained by the horizontal gradient map computing module 41, with y ranging from 1 to W; let θ be the viewing angle corresponding to y, ranging from 0 to θ_b; and let h be the vertical distance of a pixel of the gradient map from the central pixel of the gradient map, with h ranging from -H/2 to H/2. The radius r of the sphere latitude circle of any pixel at vertical distance h in the gradient map is

r = R sin( (R·arcsin(d/d_0) + h) / R )

The spherical transform module 43 transforms the gradient map G_y(h, y) into the gradient map g_y(h, θ) according to the spherical transform relations

θ = θ_b/2 + arctan( (y - W/2) / r ),  y = 1, ..., W

y = W/2 + r·tan( θ - θ_b/2 ),  θ = 0, ..., θ_b
Fig. 12 shows the structure of the hemisphere unfolded image stitching unit 5, which comprises a band 0° longitude alignment module 51 and a hemisphere unfolded image fusion module 52. For each band panorama, the band 0° longitude alignment module 51 shifts each row of the current band panorama to the left by s pixels, relative to the bottom row of the band panorama centered at 0° latitude, where s is obtained from

s = (D_0 - W_h)/2 = (D_0/2)·(1 - r_h/R)

where D_0 is the circumference of the band panorama centered at 0° latitude; R is the sphere radius; W_h is the circumference (length) of the row of the current band image whose vertical distance from the central row of the current band panoramic image is h; and r_h is the radius of the sphere latitude circle of that row, obtained from

r_h = R sin( (R·arcsin(d/d_0) + h) / R )

where d is the inter-frame correlation distance of the current band panoramic image.
The hemisphere unfolded image fusion module 52 projects each band panorama onto the hemisphere and unfolds it into the hemisphere unfolded image according to

H_h = R[ arcsin(d/d_0) - arctan(H/(2R)) ]

H_l = R[ arcsin(d/d_0) + arctan(H/(2R)) ]

where d is the band inter-frame correlation distance, R is the sphere radius, and H_h and H_l are respectively the top-row and bottom-row positions of the transformed band panorama in the hemisphere unfolded image.
Further, the hemisphere unfolded image stitching unit 5 also comprises an adjacent-band consistency adjusting module 53 for adjusting the illumination consistency of the overlap region of adjacent band panoramas according to

f(x, y) = (Δx/T)·f_1(x, y) + (1 - Δx/T)·f_2(x, y)

where f_1(x, y) and f_2(x, y) are the image functions of the upper and lower adjacent band panoramas, T = H_1l - H_2h is the height of the overlap region, Δx is the distance of the overlap-region pixel (x, y) from the bottom border of the overlap region, and f(x, y) is the fused pixel value.
The key technical point of the invention is that the band inter-frame correlation distance d is used to stitch the continuously captured video frames into bands. From the inter-frame correlation distance d_0 of the band panorama centered at 0° latitude and the circumference D_0 of the panoramic video image, the number of frames per period N_0 and the band inter-frame correlation distance d are computed; each row of a band panorama is preprocessed by a left shift of s pixels, taking the bottom row of the band panorama centered at 0° latitude as the reference; and the band is mapped and embedded into the panorama, achieving real-time seamless stitching of the hemispherical panoramic video image. This technical solution greatly accelerates the computation required for seamless stitching, greatly increases the imaging speed and reduces the equipment manufacturing cost.
The above are only preferred embodiments of the invention and are not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall be included within the protection scope of the invention.

Claims (9)

1. A method for generating hemispherical panoramic video images in real time with a single camera, characterized in that it comprises the following steps:
Step A: selecting a meridian on the hemisphere as the 0° meridian and arranging several preset points along the 0° meridian within the 90° pitch range, the latitude difference between adjacent preset points being essentially equal;
Step B: at each preset point, adjusting the pitch of the camera optical axis to the direction of the parallel through the preset point, rotating the camera about the vertical axis through 360° horizontally starting from the preset point, capturing images at regular intervals during the rotation, and preprocessing each captured video frame;
Step C: stitching the preprocessed video frames captured at the same preset-point latitude into a horizontally arranged band panorama, the band panoramas of the preset points being arranged vertically;
Step D: for each video frame embedded in a band panorama, adjusting the illumination consistency of the regions where it overlaps the frames to its left and right;
Step E: aligning the band panoramas at 0° longitude and stitching them into a hemisphere unfolded image;
Step F: displaying the hemisphere unfolded image as a disk by orthographic projection;
Step G: displaying the hemisphere unfolded image as a hemisphere by side projection.
2. The method of claim 1, characterized in that step C specifically comprises the following steps:
while the camera rotates about the vertical axis in the horizontal direction, capturing N images at equal time intervals; according to the inter-frame correlation distance d of the current band panorama, computing the less distorted central vertical strip of each video frame and the position of this central vertical strip in the current band panorama, and embedding the strip at the computed position;
where N is a constant greater than N_0, N_0 is the number of video frames contained in one period of the band panorama, and N_0 can be obtained from

N_0 = D_0 / d_0

where D_0 is the circumference of the band panorama centered at 0° latitude and d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude.
3. The method of claim 2, characterized in that the inter-frame correlation distance d of the current band panorama is obtained by the following steps:
Step C11: obtaining the current video frame, computing the gradient of this frame along the direction of the camera rotation, and obtaining the horizontal gradient map;
Step C12: letting W be the image width, H the image height, d_0 the inter-frame correlation distance of the band panorama centered at 0° latitude, d the inter-frame correlation distance of the current band and R the sphere radius, and letting φ_b be the angle, measured from the pole, of the latitude circle of the bottom-row pixels of the image and r_b the radius of that latitude circle, computing the bottom-row field-of-view angle θ_b of the current frame by

R = D_0 / (2π)

φ_b = arcsin(d / d_0) + arctan(H / (2R)),  1 ≤ d ≤ d_0

r_b = R sin(φ_b)

θ_b = 2 arctan(W / (2 r_b));

θ_b is a decimal of the same order of magnitude as 2π/N_0; θ_b is multiplied by 1000 and rounded, the result being denoted W_I, which is used as the width of the gradient map after the spherical transform;
Step C13: applying the spherical transform to the gradient map of step C11 according to the bottom-row viewing angle of the image, so that pixels with the same vertical viewing angle are arranged on the same vertical column; the gradient map G_y(h, y) is transformed into the gradient map g_y(h, θ) by the spherical transform relations

θ = θ_b/2 + arctan( (y - W/2) / r ),  y = 1, ..., W

y = W/2 + r·tan( θ - θ_b/2 ),  θ = 0, ..., θ_b

where

r = R sin( (R·arcsin(d/d_0) + h) / R )

y is the position, from left to right, of a bottom-row pixel of the gradient map; θ is the viewing angle corresponding to y; h is the vertical distance of a pixel of the gradient map from the central pixel of the gradient map, ranging from -H/2 to H/2; W is the gradient map width and H is the gradient map height; θ_b is the bottom-row field-of-view angle of the image; r is the radius of the latitude circle of the row at vertical distance h from the central pixel of the gradient map; R is the sphere radius, d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude, and d is the inter-frame correlation distance of the current band;
Step C14: applying vertical gradient projection to the gradient map obtained after the spherical transform of step C13, i.e. accumulating the pixel values of the gradient map along the column direction to form a one-dimensional signal sequence along the row direction;
Step C15: computing the pixel displacement S_θ of the spherically transformed gradient maps of step C13 by

S_θ = (Φ_0 / θ_b)·W_I = (2π·W_I) / (N_0·θ_b)

where Φ_0 is the viewing-angle displacement between adjacent frames of the band panorama centered at 0° latitude;
Step C16: using the pixel displacement S_θ of the spherically transformed gradient maps obtained in step C15, taking one segment from each of the one-dimensional signal sequences of two adjacent gradient maps and computing the correlation coefficient ρ of the two segments;
Step C17: increasing the inter-frame correlation distance d of the current band panorama from 1 to d_0 and, for each d, executing steps C12 to C16 to obtain a series of correlation coefficients ρ(d); the d corresponding to the maximum of ρ(d) is taken as the inter-frame correlation distance of the current band panorama;
Step C18: taking the correlation coefficient sequence ρ(d), d = 1, ..., d_0, obtained in step C17 and fitting it with a window function to obtain the fitted correlation function ρ_0(t):

ρ_0(t) = Σ_{d=1}^{d_0} ρ(d)·exp( -(t - d)² / (2 h_f²) )

where t is the continuous argument of ρ_0(t) and h_f is the window width of the fitting function, typically h_f = 2; the following interpolated correlation sequence is then computed:

ρ_1[i] = ρ_0(t = i/10),  i = 1, 2, ..., 10·d_0

and digitally low-pass filtered:

ρ_2[i] = (1/(2m+1))·Σ_{j=-m}^{m} ρ_1[i+j],  i = 1, 2, ..., 10·d_0

in the digital low-pass filtering, ρ_1[i+j] = 0 when i + j < 1 or i + j > 10·d_0, and m is the filter constant, m = 4;
ρ_2[i] is an inter-frame correlation function sequence accurate to one decimal place; if i* is the index at which ρ_2 attains its maximum, d = i*/10 is the inter-frame correlation distance of sub-pixel precision.
4. The method of claim 1, characterized in that step E specifically comprises the following steps:
Step E1: for each band panorama, relative to the bottom row of the band panorama centered at 0° latitude, shifting each row of the current band panorama to the left by s pixels, where s is obtained from

s = (D_0 - W_h)/2 = (D_0/2)·(1 - r_h/R)

where D_0 is the circumference of the band panorama centered at 0° latitude; R is the sphere radius; W_h is the circumference (length) of the image row whose vertical distance from the central row of the current band panoramic image is h; and r_h is the radius of the sphere latitude circle of that row, obtained from

r_h = R sin( (R·arcsin(d/d_0) + h) / R )

where d is the inter-frame correlation distance of the current band panoramic image;
Step E2: projecting each band panorama onto the hemisphere and unfolding it into the hemisphere unfolded image:

H_h = R[ arcsin(d/d_0) - arctan(H/(2R)) ]

H_l = R[ arcsin(d/d_0) + arctan(H/(2R)) ]

where d is the inter-frame correlation distance of the current band, d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude, H is the height of the current band panorama, and H_h and H_l are respectively the positions of the top row and the bottom row of the transformed band panorama in the hemisphere unfolded image;
Step E3: adjusting the illumination consistency of adjacent bands in the hemisphere unfolded image.
5. The method of claim 1, characterized in that:
letting (i, j) be the rectangular pixel coordinates of the hemisphere unfolded image, C_i the width of row i of the hemisphere unfolded image and (r(i, j), θ(i, j)) the polar coordinates of the corresponding pixel in the disk panoramic image, step F specifically displays the hemisphere unfolded image as a disk by orthographic projection according to

r(i, j) = i

θ(i, j) = 2π·j / C_i;

letting R be the sphere radius, (i, j) the rectangular pixel coordinates of the hemisphere unfolded image, C_i the width of row i, v the initial longitude intercepted in the hemisphere display and (X(i, j), Y(i, j)) the rectangular coordinates of the displayed image pixel after the side projection, step G specifically displays the hemisphere unfolded image as a hemisphere by side projection according to

X(i, j) = R( 1 - cos(i/R) )

Y(i, j) = R - (C_i/(2π))·cos( 2π·j/C_i - v ),  0 ≤ 2π·j/C_i - v < π

where, if a value of j satisfying the above condition for row i is greater than C_i, the corresponding actual pixel in the hemisphere unfolded image has coordinates (i, j - C_i).
6. A system for generating hemispherical panoramic video images in real time with a single camera, characterized in that it comprises:
a camera mounted on a pan-tilt head that rotates through a 90° pitch range and 360° horizontally;
a pan-tilt control unit for controlling the head so that the pitch of the camera optical axis is adjusted to the direction of the parallel through the preset point, and for driving the camera to rotate about the vertical axis through 360° horizontally starting from the preset point; the preset points are arranged along a 0° meridian selected on the hemisphere, the latitude difference between adjacent preset points being essentially equal;
an image acquisition unit for capturing images at regular intervals while the camera rotates and preprocessing each captured video frame;
a band panorama stitching unit for stitching the preprocessed video frames corresponding to the same preset point into a horizontally arranged band panorama and, for each video frame embedded in a band panorama, adjusting the illumination consistency of the regions where it overlaps the frames to its left and right, the band panoramas of the preset points being arranged vertically;
a hemisphere unfolded image stitching unit for aligning the band panoramas produced by the band panorama stitching unit at 0° longitude and stitching them into a hemisphere unfolded image;
a hemispherical panoramic video image generation unit for displaying the hemisphere unfolded image as a disk by orthographic projection and as a hemisphere by side projection.
7. The system of claim 6, characterized in that said band panorama stitching unit acquires N images in the horizontal direction and, according to the inter-frame correlation distance d of the current band panorama, computes the position in the current band panorama of the low-distortion middle vertical strip retained from each video frame, and embeds that strip at the computed position;
where N is greater than the number N_0 of video frames contained in one full period of the band panorama, and N_0 is obtained from the following formula:
N_0 = D_0 / d_0
where D_0 is the circumference of the band panorama centered at 0° latitude, and d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude.
Said band panorama stitching unit comprises:
a parallel gradient map computing module, used to obtain the current video frame and compute the gradient of that frame along the camera motion direction, yielding a parallel gradient map;
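As a small illustration (the forward-difference operator is an assumption; the claims do not fix a particular gradient operator), a horizontal, i.e. motion-direction, gradient map could be computed as:

```python
import numpy as np

def parallel_gradient_map(frame):
    """Gradient of a grayscale frame along the (horizontal) camera motion
    direction, as an absolute forward difference between neighbouring columns."""
    frame = frame.astype(np.float32)
    grad = np.zeros_like(frame)
    grad[:, :-1] = np.abs(frame[:, 1:] - frame[:, :-1])
    return grad
```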
a bottom-row view angle computing module, used to compute the bottom-row view angle width θ_b of the image by the following formulas:
R = D_0 / (2π)
[formula image not reproduced in the text: it gives the latitude of the bottom pixel row of the current image and the radius r_b of its latitude circle]
θ_b = 2·arctan(W / (2·r_b))
where R is the sphere radius, W is the image width, H is the image height, d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude, d is the inter-frame correlation distance of the current band, and r_b is the radius of the latitude circle on which the bottom row of pixels of the current image lies.
The computed θ_b is a decimal of the same order of magnitude as 2π/N_0; θ_b is multiplied by 1000 and rounded, and the result is denoted W_I, which is used as the width of the gradient map after the spherical transform.
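A short sketch of this step (r_b is taken as an input here, since its defining formula appears only as an image in the source text):

```python
import math

def bottom_row_view_angle(D0, W, r_b):
    """Sphere radius R, bottom-row view angle width theta_b, and the
    spherical gradient-map width W_I = round(1000 * theta_b)."""
    R = D0 / (2.0 * math.pi)                  # from the 0° latitude circumference D0
    theta_b = 2.0 * math.atan(W / (2.0 * r_b))
    W_I = int(round(1000.0 * theta_b))
    return R, theta_b, W_I
```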
a spherical transform module, used to apply a spherical transform, according to the bottom-row view angle of the image, to the gradient map computed by said parallel gradient map computing module, so that pixels with the same vertical view angle are arranged in the same vertical column; specifically, let G_y(h, y) be the gradient map before the transform and g_y(h, θ) the gradient map after the transform; the spherical transform relations are:
θ = θ_b/2 + arctan((y - W/2) / r),    y = 1, …, W
y = W/2 + r·tan(θ - θ_b/2),    θ = 0, …, θ_b
where y is the left-to-right position of a bottom-row pixel in the gradient map, θ is the view angle corresponding to y, h is the vertical distance between a pixel and the central pixel of the gradient map and ranges from -H/2 to H/2, W is the gradient map width, H is the gradient map height, θ_b is the bottom-row view angle of the image, and r is the radius of the spherical latitude circle of the pixel at vertical distance h in the gradient map, obtained from:
r = R·sin(arcsin(d/d_0) + h/R)
where R is the sphere radius, d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude, and d is the inter-frame correlation distance of the current band;
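A sketch of how this resampling could be carried out (inverse mapping with nearest-neighbour sampling; apart from the formulas above, the sampling strategy and function names are assumptions):

```python
import numpy as np

def spherical_transform(grad, R, d, d0, theta_b, W_I):
    """Resample a gradient map so that pixels with the same vertical view angle
    fall in the same output column, via y = W/2 + r*tan(theta - theta_b/2)."""
    H, W = grad.shape
    out = np.zeros((H, W_I), dtype=grad.dtype)
    thetas = np.linspace(0.0, theta_b, W_I)
    for row in range(H):
        h = row - H / 2.0                              # vertical distance from the central row
        r = R * np.sin(np.arcsin(d / d0) + h / R)      # latitude-circle radius of this row
        y = W / 2.0 + r * np.tan(thetas - theta_b / 2.0)
        y = np.clip(np.round(y).astype(int), 0, W - 1)
        out[row] = grad[row, y]
    return out
```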
a gradient vertical projection information extraction module, used to project the spherically transformed gradient map produced by said spherical transform module in the vertical direction, i.e. to accumulate the pixel values of the gradient map along the column direction, forming a one-dimensional signal sequence along the row direction;
a pixel displacement computing module, used to compute, by the following formula, the pixel displacement S_θ of the gradient map after the spherical transform:
S_θ = (Φ_0 / θ_b)·W_I = 2π·W_I / (N_0·θ_b)
where Φ_0 is the field-of-view angular displacement between consecutive frames in the 0° latitude band panoramic image;
a correlation coefficient computing module, used to take, according to the post-transform pixel displacement S_θ obtained by said pixel displacement computing module, one segment from each of the one-dimensional signal sequences of two adjacent frame gradient maps, and to compute the correlation coefficient ρ of the two segments;
a band panorama inter-frame correlation distance computing module, used to increase the inter-frame correlation distance d of the current band panorama successively from 1 to d_0, where d_0 is the inter-frame correlation distance of the 0° latitude band; for each d, a correlation coefficient ρ(d) is computed by passing in turn through said bottom-row view angle computing module, said spherical transform module, said gradient vertical projection information extraction module, said pixel displacement computing module and said correlation coefficient computing module, and the d corresponding to the maximum ρ(d) is taken as the inter-frame correlation distance of the current band panorama;
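The search over d might look as follows (a condensed sketch reusing the spherical_transform sketch above; `theta_b_of` and `W_I_of` stand in for the bottom-row view angle module, and these helper names are assumptions):

```python
import numpy as np

def vertical_projection(grad_sph):
    """Column-wise sum of a spherically transformed gradient map."""
    return grad_sph.sum(axis=0)

def segment_correlation(p_prev, p_cur, S_theta):
    """Correlation coefficient of two projection segments offset by S_theta."""
    n = min(len(p_prev) - S_theta, len(p_cur))
    if n < 2:
        return -1.0
    return float(np.corrcoef(p_prev[S_theta:S_theta + n], p_cur[:n])[0, 1])

def interframe_correlation_distance(grad_prev, grad_cur, R, d0, N0, theta_b_of, W_I_of):
    """Try d = 1..d0 and keep the d whose adjacent-frame projections correlate best."""
    best_d, best_rho, rho_seq = 1, -np.inf, []
    for d in range(1, int(d0) + 1):
        theta_b, W_I = theta_b_of(d), W_I_of(d)
        S_theta = int(round(2.0 * np.pi * W_I / (N0 * theta_b)))
        p_prev = vertical_projection(spherical_transform(grad_prev, R, d, d0, theta_b, W_I))
        p_cur = vertical_projection(spherical_transform(grad_cur, R, d, d0, theta_b, W_I))
        rho = segment_correlation(p_prev, p_cur, S_theta)
        rho_seq.append(rho)
        if rho > best_rho:
            best_d, best_rho = d, rho
    return best_d, rho_seq
```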
a fitting module, used to fit, with a window function, the correlation coefficient sequence ρ(d) over the series of inter-frame correlation distances d obtained by said band panorama inter-frame correlation distance computing module, yielding the fitted correlation function ρ_0(t):
ρ_0(t) = Σ_{d=1}^{d_0} ρ(d)·exp(-(t - d)² / (2·h_f²))
where t is the continuous-valued independent variable of the correlation function ρ_0(t), and h_f is the window width of the fitting function, typically h_f = 2;
a sub-pixel precision correlation distance computing module, used to compute, from the correlation function ρ_0(t) obtained by said fitting module, the inter-frame correlation distance with sub-pixel precision; this module first computes the correlation function interpolation sequence:
ρ_1[i] = ρ_0(t = i/10),    i = 1, 2, …, 10·d_0
and then applies digital low-pass filtering to the sequence ρ_1[i]:
ρ_2[i] = (1/(2m+1))·Σ_{j=-m}^{m} ρ_1[i+j],    i = 1, 2, …, 10·d_0
In the above formula, when i+j < 1 or i+j > 10·d_0, ρ_1[i+j] is set to 0; m is the filter constant, m = 4; ρ_2[i] is a correlation function sequence accurate to one decimal place, and if ρ_2 attains its maximum at index i_max, then d = i_max/10 is the inter-frame correlation distance with sub-pixel precision.
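For illustration, the sub-pixel refinement under these formulas could be sketched as follows (the minus sign in the Gaussian window, lost in the source text, is restored here as an assumption, and the zero padding matches the boundary rule above):

```python
import numpy as np

def subpixel_correlation_distance(rho, d0, h_f=2.0, m=4):
    """Refine the integer correlation distance to 0.1-pixel precision.
    rho[d-1] holds the correlation coefficient rho(d) for d = 1..d0."""
    d = np.arange(1, int(d0) + 1, dtype=float)

    def rho0(t):  # Gaussian-window fit of the discrete sequence rho(d)
        return float(np.sum(rho * np.exp(-(t - d) ** 2 / (2.0 * h_f ** 2))))

    # interpolation sequence rho1[i] = rho0(i/10), i = 1..10*d0
    rho1 = np.array([rho0(i / 10.0) for i in range(1, 10 * int(d0) + 1)])

    # moving-average low-pass filter, zero padded outside 1..10*d0
    padded = np.concatenate([np.zeros(m), rho1, np.zeros(m)])
    rho2 = np.array([padded[i:i + 2 * m + 1].mean() for i in range(len(rho1))])

    i_max = int(np.argmax(rho2)) + 1          # 1-based index of the maximum
    return i_max / 10.0                       # sub-pixel inter-frame correlation distance
```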
8. The system of claim 6, characterized in that said hemisphere expanded image stitching unit comprises:
a band 0° longitude alignment module, used to snap the pixels at 0° longitude to the first vertical column of the hemisphere expanded image;
specifically, taking the bottom row of the band panorama centered at 0° latitude as the reference, each row in each band panorama is shifted left by s pixels in turn, where s is obtained from the following formula:
s = (1/2)·(D_0 - W_h) = (D_0/2)·(1 - r_h/R)
where D_0 is the circumference of the band panorama centered at 0° latitude, R is the sphere radius, W_h is the circumference of the current band image row whose vertical distance from the central row of the current band panoramic image is h, and r_h is the radius of the spherical latitude circle on which that image row lies, obtained from:
r_h = R·sin(arcsin(d/d_0) + h/R)
where d is the inter-frame correlation distance of the current band panoramic image;
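A minimal sketch of this row-wise alignment for one band (the circular left shift via np.roll and the integer rounding of s are assumptions):

```python
import numpy as np

def align_band_to_zero_longitude(band, R, D0, d, d0):
    """Shift each row of a band panorama left by s = (D0/2)*(1 - r_h/R) pixels
    so that the 0° longitude pixels line up with the first column."""
    H = band.shape[0]
    aligned = np.empty_like(band)
    for row in range(H):
        h = row - H / 2.0                              # distance from the central row
        r_h = R * np.sin(np.arcsin(d / d0) + h / R)    # latitude-circle radius of this row
        s = int(round(0.5 * D0 * (1.0 - r_h / R)))
        aligned[row] = np.roll(band[row], -s, axis=0)  # circular left shift by s pixels
    return aligned
```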
a hemisphere expanded image merging module, used to stretch each band panorama and embed it into the hemisphere expanded view according to the following formulas:
H_h = R[arcsin(d/d_0) - arctan(H/(2R))]
H_l = R[arcsin(d/d_0) + arctan(H/(2R))]
where R is the sphere radius, d is the inter-frame correlation distance, d_0 is the inter-frame correlation distance of the band panorama centered at 0° latitude, H is the height of the current band panorama, and H_h and H_l are respectively the row positions of the top line and the bottom line of the band panorama in the hemisphere expanded image after the transform;
an adjacent band consistency adjusting module, used to adjust the illumination consistency of the overlap regions between adjacent band panoramas.
9. The system of claim 6, characterized in that said hemisphere panoramic video image generation unit orthographically projects the hemisphere expanded image into a disk display and frontally projects it into a hemisphere display, wherein:
the relation between the rectangular coordinates (i, j) of a pixel in the hemisphere expanded image and the polar coordinates (r(i, j), θ(i, j)) of the pixel in the disk display under the orthographic projection is computed by the following formulas:
r(i, j) = i
θ(i, j) = 2πj / C_i ;
where C_i is the width of row i of the hemisphere expanded image;
the relation between the rectangular coordinates (i, j) of a pixel in the hemisphere expanded image and the rectangular coordinates (X(i, j), Y(i, j)) of the corresponding panorama pixel in the hemisphere display under the frontal side projection is computed by the following formulas:
X(i, j) = R[1 - cos(i/R)]
Y(i, j) = R - (C_i / (2π))·cos(2πj/C_i - v),    0 ≤ 2πj/C_i - v < π
where R is the sphere radius, C_i is the width of row i of the hemisphere expanded image, and v is the starting longitude of the view intercepted in the hemisphere display. In the calculation, if the value of j in row i that satisfies the above condition exceeds C_i, the coordinate of the corresponding actual pixel in the hemisphere expanded image is (i, j - C_i).
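A sketch of the frontal side projection under the formulas above (illustrative only; here the wrap to (i, j - C_i) is handled by taking the angle modulo 2π, and the forward mapping, rounding and RGB row format are assumptions):

```python
import numpy as np

def expanded_to_hemisphere_view(rows, R, v, out_size):
    """Frontal side projection of the hemisphere expanded image:
    X = R*(1 - cos(i/R)), Y = R - (C_i/(2*pi))*cos(2*pi*j/C_i - v),
    restricted to the half of the sphere facing the viewer."""
    view = np.zeros((out_size, out_size, 3), dtype=np.uint8)
    for i, row in enumerate(rows):                     # row i of the expanded image
        C_i = len(row)
        for j in range(C_i):
            ang = (2.0 * np.pi * j / C_i - v) % (2.0 * np.pi)
            if ang >= np.pi:                           # only the visible half is projected
                continue
            X = R * (1.0 - np.cos(i / R))
            Y = R - (C_i / (2.0 * np.pi)) * np.cos(ang)
            x, y = int(round(X)), int(round(Y))        # x: vertical index, y: horizontal index
            if 0 <= x < out_size and 0 <= y < out_size:
                view[x, y] = row[j]
    return view
```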
CN201210430511.8A 2012-11-01 2012-11-01 Single camera is utilized to generate the method and system of hemisphere full-view video image in real time Active CN102984453B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210430511.8A CN102984453B (en) 2012-11-01 2012-11-01 Single camera is utilized to generate the method and system of hemisphere full-view video image in real time

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210430511.8A CN102984453B (en) 2012-11-01 2012-11-01 Single camera is utilized to generate the method and system of hemisphere full-view video image in real time

Publications (2)

Publication Number Publication Date
CN102984453A true CN102984453A (en) 2013-03-20
CN102984453B CN102984453B (en) 2015-10-28

Family

ID=47858157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210430511.8A Active CN102984453B (en) 2012-11-01 2012-11-01 Single camera is utilized to generate the method and system of hemisphere full-view video image in real time

Country Status (1)

Country Link
CN (1) CN102984453B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247513A (en) * 2007-12-25 2008-08-20 谢维信 Method for real-time generating 360 degree seamless full-view video image by single camera
CN101859433A (en) * 2009-04-10 2010-10-13 日电(中国)有限公司 Image mosaic device and method
CN101924921A (en) * 2009-12-29 2010-12-22 天津市亚安科技电子有限公司 Panoramic monitoring pan tilt camera and annular or semispherical panoramic monitoring control method
CN102314686A (en) * 2011-08-03 2012-01-11 深圳大学 Reference view field determination method, system and device of splicing type panoramic video

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103841333A (en) * 2014-03-27 2014-06-04 成都动力视讯科技有限公司 Preset bit method and control system
CN105635651B (en) * 2014-10-29 2019-05-24 浙江大华技术股份有限公司 A kind of ball machine localization method and device
CN105635651A (en) * 2014-10-29 2016-06-01 浙江大华技术股份有限公司 Pan/tilt/zoom positioning method and pan/tilt/zoom positioning device
CN105357433B (en) * 2015-10-13 2018-12-07 哈尔滨工程大学 A kind of adaptive method for panoramic imaging of high speed rotation focal length
CN105357433A (en) * 2015-10-13 2016-02-24 哈尔滨工程大学 High-speed rotating focal length self-adaptive panoramic imaging method
CN105516755A (en) * 2015-12-14 2016-04-20 成都易瞳科技有限公司 Video previewing method and apparatus
CN105827978A (en) * 2016-04-28 2016-08-03 努比亚技术有限公司 Semispherical panorama photographing method, apparatus and terminal
WO2017185309A1 (en) * 2016-04-28 2017-11-02 SZ DJI Technology Co., Ltd. System and method for obtaining spherical panorama image
US10805532B2 (en) 2016-04-28 2020-10-13 SZ DJI Technology Co., Ltd. System and method for obtaining spherical panorama image
CN109362234A (en) * 2016-04-28 2019-02-19 深圳市大疆创新科技有限公司 System and method for obtaining Spherical Panorama Image
WO2017211089A1 (en) * 2016-06-07 2017-12-14 深圳市灵动飞扬科技有限公司 Vehicle panoramic view system and method thereof
WO2018035764A1 (en) * 2016-08-24 2018-03-01 深圳市大疆灵眸科技有限公司 Method for taking wide-angle pictures, device, cradle heads, unmanned aerial vehicle and robot
CN106331653A (en) * 2016-09-29 2017-01-11 浙江宇视科技有限公司 Method and apparatus for locating panorama camera sub-picture display area
CN107995408A (en) * 2016-10-26 2018-05-04 贵州火星探索科技有限公司 A kind of 360 ° of panoramic shooting systems and method
CN108024088A (en) * 2016-10-31 2018-05-11 杭州海康威视系统技术有限公司 A kind of video taking turn method and device
US11138846B2 (en) 2016-10-31 2021-10-05 Hangzhou Hikvision System Technology Co., Ltd. Method and apparatus for video patrol
CN106791395A (en) * 2016-12-20 2017-05-31 暴风集团股份有限公司 The hemisphere face player method and system of video
CN106952225B (en) * 2017-02-15 2020-07-07 山东科技大学 Panoramic splicing method for forest fire prevention
CN106952225A (en) * 2017-02-15 2017-07-14 山东科技大学 A kind of panorama mosaic method towards forest fire protection
CN107341845B (en) * 2017-03-03 2020-12-15 深圳市德赛微电子技术有限公司 Vehicle-mounted panoramic image shadow covering method
CN107341845A (en) * 2017-03-03 2017-11-10 深圳市德赛微电子技术有限公司 A kind of vehicle-mounted panoramic image shade covering method
CN107424118A (en) * 2017-03-28 2017-12-01 天津大学 Based on the spherical panorama mosaic method for improving Lens Distortion Correction
CN107087154A (en) * 2017-04-25 2017-08-22 中国科学技术大学先进技术研究院 A kind of method and apparatus that stereoscopic panoramic image collection is carried out using single camera
CN108322625B (en) * 2017-12-28 2020-06-23 杭州蜜迩科技有限公司 Panoramic video production method based on panoramic image
CN108322625A (en) * 2017-12-28 2018-07-24 杭州蜜迩科技有限公司 A kind of panoramic video production method based on panorama sketch
CN108665502B (en) * 2018-03-22 2019-02-15 上海赛邮云计算有限公司 Feasibility cloud computing analysis method
CN109147082A (en) * 2018-03-22 2019-01-04 韩明 Feasibility cloud computing analysis system
CN108665502A (en) * 2018-03-22 2018-10-16 韩明 Feasibility cloud computing analysis method
CN108650461B (en) * 2018-05-11 2020-09-08 普宙飞行器科技(深圳)有限公司 Control method, device and equipment for variable field angle camera holder
CN108650461A (en) * 2018-05-11 2018-10-12 普宙飞行器科技(深圳)有限公司 Control method, device and the equipment of a kind of variable field of view angle camera head
WO2019223158A1 (en) * 2018-05-23 2019-11-28 平安科技(深圳)有限公司 Vr image production method, apparatus, computer device, and storage medium
CN108805988A (en) * 2018-05-23 2018-11-13 平安科技(深圳)有限公司 VR image generating methods, device, computer equipment and storage medium
CN112085650A (en) * 2020-09-09 2020-12-15 南昌虚拟现实研究院股份有限公司 Image processing method, image processing device, storage medium and computer equipment

Also Published As

Publication number Publication date
CN102984453B (en) 2015-10-28

Similar Documents

Publication Publication Date Title
CN102984453A (en) Method and system of real-time generating hemisphere panoramic video images through single camera
CN100576907C (en) Utilize the method for single camera real-time generating 360 degree seamless full-view video image
CN103763479B (en) The splicing apparatus and its method of real time high-speed high definition panorama video
CN102510474B (en) 360-degree panorama monitoring system
CN110248079A (en) A kind of full view image generating system and panorama image generation method
CN101605211B (en) Method for seamlessly composing virtual three-dimensional building and real-scene video of real environment
CN204090039U (en) Integration large scene panoramic video monitoring device
CN102291527B (en) Panoramic video roaming method and device based on single fisheye lens
CN101916455B (en) Method and device for reconstructing three-dimensional model of high dynamic range texture
CN105678693A (en) Panorama video browsing-playing method
CN105488775A (en) Six-camera around looking-based cylindrical panoramic generation device and method
CN206563985U (en) 3-D imaging system
CN101146231A (en) Method for generating panoramic video according to multi-visual angle video stream
CN101661162B (en) Distortion compensation method based on wide-angle lens
CN105791751A (en) Image privacy occlusion method based on ball machine and ball machine
CN103905792A (en) 3D positioning method and device based on PTZ surveillance camera
CN104809719A (en) Virtual view synthesis method based on homographic matrix partition
CN106651859A (en) Multipath fisheye camera calibration device and method
CN107358577B (en) Rapid splicing method of cubic panoramic image
CN104038740A (en) Method and device for shielding privacy region of PTZ (Pan/Tilt/Zoom) surveillance camera
CN106534670B (en) It is a kind of based on the panoramic video generation method for connecting firmly fish eye lens video camera group
CN105046649A (en) Panorama stitching method for removing moving object in moving video
CN104094318A (en) System for filming a video movie
CN106899782A (en) A kind of method for realizing interactive panoramic video stream map
CN109788270B (en) 3D-360-degree panoramic image generation method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant