CN101794448A - Full automatic calibration method of master-slave camera chain - Google Patents

Full automatic calibration method of master-slave camera chain

Info

Publication number
CN101794448A
Authority
CN
China
Prior art keywords
image
camera
subimage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010139976
Other languages
Chinese (zh)
Other versions
CN101794448B (en)
Inventor
宋利
王嘉
徐奕
李铀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN2010101399769A priority Critical patent/CN101794448B/en
Publication of CN101794448A publication Critical patent/CN101794448A/en
Application granted granted Critical
Publication of CN101794448B publication Critical patent/CN101794448B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to a fully automatic calibration method for a master-slave camera system in the technical field of image processing. Images obtained automatically while the dynamic (PTZ) camera rotates are stitched into a mosaic image; feature points are then automatically extracted from this mosaic and from the master camera image and matched, yielding a calibration between the pixel coordinates of the static camera image and the control parameters of the dynamic camera, and thereby realizing fully automatic calibration of the master-slave camera system. The mosaic image represents the field of view of the dynamic camera, and SURF feature points are used to estimate the relationship between the two different imaging planes. Only the motion trajectory of the dynamic camera needs to be specified in advance; the master-slave camera system is then calibrated automatically while maintaining good calibration accuracy (an error rate of 3 to 5 percent), saving labor and time and reducing the complexity of the calibration process.

Description

Fully automatic calibration method for a master-slave camera system
Technical field
The present invention relates to a method in the technical field of image processing, and specifically to a fully automatic calibration method for a master-slave camera system.
Background art
At present, public places are generally equipped with camera surveillance systems, with large numbers of cameras installed to cover wide areas. As surveillance systems develop, the demand for high-resolution monitoring images keeps growing. The master-slave camera surveillance system is one scheme that addresses this problem: one (or more) fixed master cameras direct one (or more) dynamic pan-tilt-zoom (PTZ) cameras toward targets of interest. In this way the system obtains both a macroscopic monitoring image of a wide area and high-resolution images of the target. Calibration of the master-slave camera system is the key to making this work: in a calibrated master-slave surveillance system, when the master camera captures a target of interest, the dynamic camera can be aimed at that target automatically and thereby obtain a detailed picture.
A search of the prior art literature finds the following relevant publications:
1. X. Zhou et al., in "A master-slave system to acquire biometric imagery of humans at distance" (The First ACM International Workshop on Video Surveillance), choose sample points in the master camera image and manually move the dynamic camera so that its image centre corresponds to each chosen sample point, recording the sample point coordinates and control parameters. The dynamic camera rotation parameters for other points of the still camera image are then obtained by interpolating the recorded parameters. However, this technique requires manual adjustment of the dynamic camera parameters, which is time-consuming and impractical for large-scale deployment.
2. A. W. Senior et al., in "Acquiring Multi-Scale Images by Pan-Tilt-Zoom Control and Automatic Multi-Camera Calibration" (The Seventh IEEE Workshop on Applications of Computer Vision, 2005), estimate the relation between the master camera and the dynamic camera by computing the homography matrix between the two camera images. This technique likewise requires manual adjustment of the dynamic camera parameters and is equally time-consuming and impractical for large-scale deployment.
Summary of the invention
In view of the above deficiencies of the prior art, the present invention proposes a fully automatic calibration method for a master-slave camera system. Images obtained automatically while the dynamic camera rotates are stitched into a mosaic image; feature points are then automatically extracted from this mosaic and from the master camera image and matched, yielding a calibration between the pixel coordinates of the still camera image and the positioning parameters of the dynamic camera, and thereby calibrating the master-slave camera system fully automatically.
The present invention is achieved by the following technical solution, which comprises the steps of:
Step 1: specify the motion trajectory of the dynamic camera in advance. The dynamic camera rotates automatically along this trajectory, sampling one sub-image every time interval t. The sub-images are numbered in order of acquisition, and the rotation parameters of the dynamic camera at the moment each sub-image is captured, namely the pan (horizontal rotation) angle and the tilt (vertical rotation) angle, are recorded; each sub-image is stored together with its number and the corresponding rotation parameters and position information.
Step 2: after the dynamic camera finishes rotating, group the N sub-images obtained by shooting position into K sub-image groups. Feature points are extracted from and matched between each pair of adjacent sub-images within a group, producing matched feature point pairs, from which the transformation matrix M between two adjacent sub-images of the same group is obtained.
The grouping places sub-images whose corresponding dynamic camera pan angles differ by less than a threshold into one group, thereby yielding K sub-image groups; within each group the sub-images are ordered by the tilt angle of the corresponding dynamic camera position.
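The grouping step can be sketched as follows. This is a minimal illustration, not the patent's implementation: the record format `(label, pan, tilt)` and the threshold value are assumptions for the sketch.

```python
def group_subimages(records, pan_threshold):
    """Group sub-image records whose pan angles differ by less than
    pan_threshold, then sort each group by tilt angle.

    records: list of (label, pan_deg, tilt_deg) tuples, one per sub-image.
    Returns a list of groups, each a list of records.
    """
    groups = []
    for rec in sorted(records, key=lambda r: r[1]):  # scan in pan order
        if groups and abs(rec[1] - groups[-1][0][1]) < pan_threshold:
            groups[-1].append(rec)   # close enough in pan: same group
        else:
            groups.append([rec])     # pan jump: start a new group
    for g in groups:                 # order each group by tilt angle
        g.sort(key=lambda r: r[2])
    return groups
```

With 64 sub-images taken on a regular pan/tilt scan this produces the 8 groups of 8 described in the embodiment.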
The feature point extraction and matching are performed by the SURF method: the feature points of each pair of adjacent sub-images in a group are detected by the fast-Hessian extractor, and the SURF feature vector of each feature point is then computed.
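The fast-Hessian detector at the heart of SURF scores each pixel by the determinant of an approximated Hessian matrix. A toy single-scale version of that response map may clarify the idea; it uses plain finite differences where SURF uses box filters on an integral image, and keeps the 0.9 weighting on the mixed term from the original SURF formulation. It is a sketch of the detection criterion only, not the patent's extractor.

```python
import numpy as np

def fast_hessian_response(img):
    """Single-scale determinant-of-Hessian response, the quantity the
    fast-Hessian detector thresholds to find interest points.
    img: 2-D float array (grayscale image)."""
    # second derivatives via finite differences (SURF approximates these
    # with box filters on an integral image for speed)
    dxx = np.gradient(np.gradient(img, axis=1), axis=1)
    dyy = np.gradient(np.gradient(img, axis=0), axis=0)
    dxy = np.gradient(np.gradient(img, axis=0), axis=1)
    # 0.9 compensates the box-filter approximation in the SURF paper
    return dxx * dyy - (0.9 * dxy) ** 2

# a bright blob gives a strong positive response at its centre
y, x = np.mgrid[0:21, 0:21]
blob = np.exp(-((x - 10) ** 2 + (y - 10) ** 2) / 8.0)
resp = fast_hessian_response(blob)
```

Pixels where this response is large and locally maximal become the interest points that are then described and matched.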
The transformation matrix M between two adjacent sub-images of the same group is obtained by the RANSAC method; specifically:

$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} m_0 & m_1 & m_2 \\ m_3 & m_4 & m_5 \\ m_6 & m_7 & m_8 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix}, \quad \text{i.e. } u' = Mu,$$

where (x', y', z') and (x, y, z) are the coordinates of the same feature point in the coordinate systems of the two adjacent sub-images. Each sub-image coordinate system takes the upper-left corner of the image as the origin, with the x axis positive horizontally to the right and the y axis positive vertically downward.
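The relation u' = Mu operates on homogeneous coordinates, so the pixel position in the neighbouring sub-image is recovered by dividing out the third component. A small sketch of applying such a matrix (the M used here is a pure translation, standing in for a RANSAC-estimated homography, which in practice could come from e.g. OpenCV's `cv2.findHomography(src, dst, cv2.RANSAC)`):

```python
import numpy as np

def apply_homography(M, pt):
    """Map pixel pt = (x, y) through the 3x3 matrix M, returning (x', y')."""
    u = np.array([pt[0], pt[1], 1.0])   # lift to homogeneous coordinates
    xp, yp, zp = M @ u
    return (xp / zp, yp / zp)           # divide out z' to return to pixels

# an M that shifts 100 px right and 20 px down, as between two
# neighbouring sub-images taken one small pan/tilt step apart
M = np.array([[1.0, 0.0, 100.0],
              [0.0, 1.0,  20.0],
              [0.0, 0.0,   1.0]])
```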
Step 3: according to the transformation matrices obtained, stitch each sub-image group into one row-stitched image, giving K row-stitched images in total.
The image stitching proceeds as follows: the middle sub-image of each group serves as the reference image for that group; the other sub-images of the group are transformed into the reference image's plane coordinate system using the transformation matrices obtained; pixels in the overlapping regions of the images undergo weighted gray-scale stitching, while the gray levels of pixels in non-overlapping regions remain unchanged. Each group is thus stitched into one row-stitched image.
The weighted gray-scale stitching is, specifically:

$$f_{res}(P) = \frac{\sum_{i=1}^{W} f_i(P)\, d_i^{\,n}}{\sum_{i=1}^{W} d_i^{\,n}},$$

where f_res(P) is the pixel value at P after stitching, f_i(P) is the pixel value at P in the i-th image, W is the number of images participating in the stitching, n is a constant, and d_i is the shortest distance from P to the boundary of the i-th participating image.
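A minimal NumPy sketch of the distance-weighted blend above, applied to a single overlapping pixel:

```python
import numpy as np

def blend_pixel(values, dists, n):
    """Distance-weighted blend of one overlapping pixel.
    values: f_i(P), the pixel value at P in each participating image
    dists:  d_i, shortest distance from P to each image's boundary
    n:      weighting exponent (a constant; the embodiment uses n = 3)"""
    w = np.asarray(dists, dtype=float) ** n   # weights d_i^n
    return float(np.dot(values, w) / w.sum())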
Step 4: treat the K row-stitched images as an image set and apply feature point extraction, matching, and image stitching to them in turn, obtaining a single mosaic image.
Step 5: capture a still image with the still camera; extract and match feature points between this still image and the mosaic image, and use the epipolar geometry principle to search for further feature points, obtaining matched feature point pairs between the still image and the mosaic image, with the matched points distributed as uniformly as possible over the still image and the mosaic image.
The epipolar geometry principle states that the point in one image corresponding to a given point in the other image lies on the corresponding epipolar line.
The further feature point search exploits the fact that the point on the mosaic image corresponding to a feature point of the still image lies on the corresponding epipolar line, thereby increasing the number of feature points found.
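The epipolar search can be sketched with a known fundamental matrix F: a feature at x in the still image constrains its match in the mosaic to the line l' = F x, so a candidate point is accepted when its point-to-line distance is small. F itself would be estimated from the initial SURF matches (e.g. with the normalized 8-point algorithm); the F below, corresponding to a pure horizontal shift between the views, is hard-coded purely for illustration.

```python
import numpy as np

def epipolar_distance(F, x_still, x_mosaic):
    """Distance (in pixels) from x_mosaic to the epipolar line F @ x_still.
    Points are (x, y); a small distance indicates a consistent match."""
    a, b, c = F @ np.array([x_still[0], x_still[1], 1.0])  # line ax+by+c=0
    return abs(a * x_mosaic[0] + b * x_mosaic[1] + c) / np.hypot(a, b)

# fundamental matrix of a pure horizontal translation between views:
# corresponding points then share the same row (y' = y)
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
```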
Step 6: perform global calibration on the still image and the mosaic image, thereby obtaining the mapping between the still image and the mosaic image, i.e., the calibration relation between the master and slave cameras.
The global calibration comprises the steps of:
1) For any point P_s(x_s, y_s) in the still image, search its neighbourhood for still-image feature points. Let N_R(P_s) be the set of all still-image feature points whose distance from P_s is at most R, that is:

$$N_R(P_s) = \{(P_s^1, r_1), (P_s^2, r_2), \ldots\},$$

where P_s^i is a feature point at distance r_i from P_s.
2) For each still-image feature point P_s^i in the set N_R(P_s), find the corresponding mosaic-image feature point P_d^i, determine the sub-images that contain P_d^i, and among them select the sub-image I_r^i whose centre is nearest to P_d^i.
3) Look up the label of sub-image I_r^i among all sub-image groups to obtain the positioning parameters S_i of the dynamic camera at the moment sub-image I_r^i was captured.
4) Interpolate the positioning parameters S_i of the dynamic camera to obtain the correspondence between any point P_s(x_s, y_s) of the still image and the positioning parameters of the dynamic camera; specifically:

$$S = S_1 f_1(r_1, r_2, \ldots, r_n) + S_2 f_2(r_1, r_2, \ldots, r_n) + \cdots + S_n f_n(r_1, r_2, \ldots, r_n),$$

where f_i is an interpolating function.
Compared with the prior art, the beneficial effects of the invention are: the mosaic image represents the field of view of the dynamic camera, and SURF feature points are used to estimate the relation between the two different imaging planes. Only the motion trajectory of the dynamic camera needs to be specified in advance; the master-slave camera system is then calibrated automatically. While maintaining good calibration accuracy (an error rate of 3%-5%), the automatic calibration of the master-slave camera system saves labor and time and reduces the complexity of the calibration process.
Description of drawings
Fig. 1 is a schematic diagram of the rotation trajectory of the dynamic camera in the embodiment, wherein (a) shows the positions of the dynamic camera when the 64 sub-images were taken, and (b) shows several of the sub-images taken by the dynamic camera.
Fig. 2 is a schematic diagram of the weighted gray-scale stitching.
Fig. 3 is the mosaic image obtained in the embodiment.
Fig. 4 shows the matched feature points obtained in the embodiment, wherein (a) shows the matched feature points obtained with the SURF method alone, and (b) shows the matched feature points obtained by further applying the epipolar geometry principle.
Fig. 5 is a schematic diagram of the global calibration.
Fig. 6 shows calibration results of the embodiment, wherein (a) is an image taken by the still camera, (b) is the image taken by the dynamic camera corresponding to (a), (c) is another image taken by the still camera, and (d) is the image taken by the dynamic camera corresponding to (c).
Fig. 7 is a schematic diagram of the calibration error rate of the embodiment, wherein (a) shows the error rate in the X direction and (b) shows the error rate in the Y direction.
Detailed description
An embodiment of the invention is described in detail below with reference to the accompanying drawings. The embodiment is implemented on the premise of the technical solution of the present invention and gives a detailed implementation and concrete procedure, but the protection scope of the invention is not limited to the following embodiment.
Embodiment
In this embodiment, a still camera and a dynamic camera installed at different indoor locations are calibrated. The resolution of both the still camera and the dynamic camera is 320*240.
This embodiment comprises the following steps:
Step 1: the motion trajectory of the dynamic camera is specified in advance as a zigzag path. The dynamic camera rotates automatically along the given trajectory, sampling one sub-image every time interval t. The sub-images are numbered in order of acquisition, and the rotation parameters of the dynamic camera at each capture, namely the pan angle and the tilt angle, are recorded; each sub-image is stored together with its number and the corresponding rotation parameters and position information.
The motion trajectory of the dynamic camera in this embodiment is shown by the arrows in Fig. 1(a); one sub-image is sampled at each dot along the trajectory, giving 64 sub-images in total. Several of the sub-images obtained are shown in Fig. 1(b).
Step 2: after the dynamic camera finishes rotating, the N sub-images obtained are grouped by shooting position into K sub-image groups; feature points are extracted from and matched between each pair of adjacent sub-images within a group, producing matched feature point pairs, from which the transformation matrix M between two adjacent sub-images of the same group is obtained.
The grouping divides the 64 sub-images into 8 groups in ascending order of the camera's pan angle, each group containing 8 sub-images, and sorts the sub-images within each group in ascending order of the tilt angle of the corresponding dynamic camera position.
The feature point extraction and matching are performed by the SURF method: the feature points of each pair of adjacent sub-images in a group are detected by the fast-Hessian extractor, and the SURF feature vector of each feature point is then computed.
The transformation matrix M between two adjacent sub-images of the same group is obtained by the RANSAC method; specifically:

$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} m_0 & m_1 & m_2 \\ m_3 & m_4 & m_5 \\ m_6 & m_7 & m_8 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix}, \quad \text{i.e. } u' = Mu,$$

where (x', y', z') and (x, y, z) are the coordinates of the same feature point in the coordinate systems of the two adjacent sub-images. Each sub-image coordinate system takes the upper-left corner of the image as the origin, with the x axis positive horizontally to the right and the y axis positive vertically downward.
Step 3: according to the transformation matrices obtained, each sub-image group is stitched into one row-stitched image, giving 8 row-stitched images in total.
The image stitching proceeds as follows: the middle sub-image of each group (in this embodiment the 5th sub-image) serves as the reference image for that group; the other sub-images of the group are transformed into the reference image's plane coordinate system using the transformation matrices obtained; pixels in the overlapping regions of the images undergo weighted gray-scale stitching, while the gray levels of pixels in non-overlapping regions remain unchanged. Each group is thus stitched into one row-stitched image.
As shown in Fig. 2, the weighted gray-scale stitching is, specifically:

$$f_{res}(P) = \frac{\sum_{i=1}^{8} f_i(P)\, d_i^{\,n}}{\sum_{i=1}^{8} d_i^{\,n}},$$

where f_res(P) is the pixel value at P after stitching, f_i(P) is the pixel value at P in the i-th image, n is a constant, and d_i is the shortest distance from P to the boundary of the i-th participating image.
In this embodiment n is 3.
Step 4: the 8 row-stitched images are treated as an image set, and feature point extraction, matching, and image stitching are applied to them in turn according to the methods of Step 2 and Step 3, obtaining a single mosaic image.
The mosaic image obtained in this embodiment is shown in Fig. 3, in which the white lines mark the boundaries of the individual sub-images.
Step 5: a still image is captured with the still camera; feature points are extracted from and matched between this still image and the mosaic image, and the epipolar geometry principle is used to search for further feature points, obtaining matched feature point pairs between the still image and the mosaic image, with the matched points distributed as uniformly as possible over the two images.
The epipolar geometry principle states that the point in one image corresponding to a given point in the other image lies on the corresponding epipolar line.
The matched feature points obtained in this embodiment with the SURF method alone are shown in Fig. 4(a); those obtained by further applying the epipolar geometry principle are shown in Fig. 4(b).
Step 6: global calibration is performed on the still image and the mosaic image, thereby obtaining the mapping between the still image and the mosaic image, i.e., the calibration relation between the master and slave cameras.
As shown in Fig. 5, the global calibration comprises the steps of:
1) For any point P_s(x_s, y_s) in the still image, search its neighbourhood for still-image SURF feature points. Let N_R(P_s) be the set of all still-image SURF feature points whose distance from P_s is at most R:

$$N_R(P_s) = \{(P_s^1, r_1), (P_s^2, r_2), \ldots\},$$

where P_s^i is a SURF feature point at distance r_i from P_s.
2) For each still-image SURF feature point P_s^i in the set N_R(P_s), find the corresponding mosaic-image feature point P_d^i, determine the sub-images that contain P_d^i, and among them select the sub-image I_r^i whose centre is nearest to P_d^i.
3) Look up the label of sub-image I_r^i among all sub-image groups to obtain the position S_i(α_i, β_i, Z_i), i = 1, 2, 3, ..., of the dynamic camera at the moment sub-image I_r^i was captured.
4) Interpolate S_i(α_i, β_i, Z_i) to obtain the correspondence between any point P_s(x_s, y_s) of the still image and the positioning parameters of the dynamic camera; specifically:

$$S = S_1 f_1(r_1, r_2, \ldots, r_n) + S_2 f_2(r_1, r_2, \ldots, r_n) + \cdots + S_n f_n(r_1, r_2, \ldots, r_n),$$

where f_i is an interpolating function.
The interpolating function in this embodiment is specifically:

$$f_i(r_1, r_2, \ldots, r_n) = r_i \Big/ \sum_{i=1}^{n} r_i^2.$$
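The interpolation of step 4) with this weighting can be sketched as follows. The (pan, tilt, zoom) triple format for each S_i follows the embodiment's S_i(α_i, β_i, Z_i); the weight formula is used exactly as stated above.

```python
import numpy as np

def interpolate_position(S_list, r_list):
    """Blend the recorded PTZ positions S_i of the nearby sub-images into a
    position for point P_s, using the embodiment's weights
    f_i = r_i / sum_j r_j**2.
    S_list: array of shape (n, 3) with rows (alpha_i, beta_i, Z_i)
    r_list: distances r_i of the n neighbouring feature points from P_s"""
    r = np.asarray(r_list, dtype=float)
    f = r / np.sum(r ** 2)                   # interpolating weights f_i
    return np.asarray(S_list, float).T @ f   # S = sum_i S_i * f_i
```

For example, a point equidistant from two recorded positions receives the average of their pan/tilt/zoom parameters.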
A still image is taken with the still camera as shown in Fig. 6(a); the method of this embodiment is used to aim the dynamic camera at the target of interest (the person) in Fig. 6(a), and the image obtained by rotating the dynamic camera is shown in Fig. 6(b). Similarly, the still camera takes another image, shown in Fig. 6(c), and the corresponding image obtained by the dynamic camera is shown in Fig. 6(d).
In this embodiment, 8 arbitrary points are chosen in the still image and the corresponding rotation angles of the dynamic camera are computed from the calibration relation obtained between the master and slave cameras. After the dynamic camera rotates, the distances in the X and Y directions between the centre of the dynamic camera image and the point corresponding to the selected point are computed; the ratio of these distances to the dynamic camera resolution gives the error rate of the master-slave camera calibration. The X-direction error rates obtained in this embodiment are shown in Fig. 7(a) and the Y-direction error rates in Fig. 7(b); as Fig. 7 shows, the method of this embodiment is simple and achieves high calibration accuracy.
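The evaluation described above reduces to comparing where the selected target landed in the dynamic camera's frame against the frame centre, as a fraction of the resolution on each axis. A sketch (the target coordinates in the example are illustrative, not taken from the patent's data):

```python
def calibration_error_rate(target_xy, frame_wh=(320, 240)):
    """Per-axis error rate: offset of the imaged target from the frame
    centre, divided by the frame resolution on that axis."""
    cx, cy = frame_wh[0] / 2, frame_wh[1] / 2
    ex = abs(target_xy[0] - cx) / frame_wh[0]
    ey = abs(target_xy[1] - cy) / frame_wh[1]
    return ex, ey

# e.g. a target imaged 16 px right of and 12 px above centre gives a
# 5% error rate on each axis, within the 3%-5% range reported above
```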

Claims (9)

1. A fully automatic calibration method for a master-slave camera system, characterized by comprising the following steps:
Step 1: specifying the motion trajectory of the dynamic camera in advance; the dynamic camera rotates automatically along this trajectory, sampling one sub-image every time interval t; the sub-images are numbered in order of acquisition, the rotation parameters of the dynamic camera at the moment each sub-image is captured are recorded, and each sub-image is stored together with its number and the corresponding rotation parameters and position information;
Step 2: after the dynamic camera finishes rotating, grouping the N sub-images obtained by shooting position into K sub-image groups, extracting and matching feature points between each pair of adjacent sub-images within a group to obtain matched feature point pairs, and thereby obtaining the transformation matrix M between two adjacent sub-images of the same group;
Step 3: according to the transformation matrices obtained, stitching each sub-image group into one row-stitched image, giving K row-stitched images in total;
Step 4: treating the K row-stitched images as an image set and applying feature point extraction, matching, and image stitching to them in turn, obtaining a single mosaic image;
Step 5: capturing a still image with the still camera, extracting and matching feature points between this still image and the mosaic image, and using the epipolar geometry principle to search for further feature points, obtaining matched feature point pairs between the still image and the mosaic image, with the matched points distributed as uniformly as possible over the still image and the mosaic image;
Step 6: performing global calibration on the still image and the mosaic image, thereby obtaining the mapping between the still image and the mosaic image, i.e., the calibration relation between the master and slave cameras.
2. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the rotation parameters of the dynamic camera described in Step 1 are the pan (horizontal rotation) angle and the tilt (vertical rotation) angle.
3. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the grouping described in Step 2 places sub-images whose corresponding dynamic camera pan angles differ by less than a threshold into one group, thereby yielding K sub-image groups, and orders the sub-images within each group by the tilt angle of the corresponding dynamic camera position.
4. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the feature point extraction and matching described in Step 2 detect the feature points of each pair of adjacent sub-images in a group with the fast-Hessian extractor and then compute the SURF feature vector of each feature point.
5. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the transformation matrix M between two adjacent sub-images of the same group described in Step 2 satisfies:

$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} m_0 & m_1 & m_2 \\ m_3 & m_4 & m_5 \\ m_6 & m_7 & m_8 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix}, \quad \text{i.e. } u' = Mu,$$

where (x', y', z') and (x, y, z) are the coordinates of the same feature point in the coordinate systems of the two adjacent sub-images; each sub-image coordinate system takes the upper-left corner of the image as the origin, with the x axis positive horizontally to the right and the y axis positive vertically downward.
6. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the image stitching described in Step 3 takes the middle sub-image of each group as the reference image for that group, transforms the other sub-images of the group into the reference image's plane coordinate system using the transformation matrices obtained, applies weighted gray-scale stitching to the pixels in the overlapping regions of the images, and leaves the gray levels of pixels in non-overlapping regions unchanged, thereby stitching each group into one row-stitched image.
7. The fully automatic calibration method for a master-slave camera system according to claim 6, characterized in that the weighted gray-scale stitching is:

$$f_{res}(P) = \frac{\sum_{i=1}^{W} f_i(P)\, d_i^{\,n}}{\sum_{i=1}^{W} d_i^{\,n}},$$

where f_res(P) is the pixel value at P after stitching, f_i(P) is the pixel value at P in the i-th image, W is the number of images participating in the stitching, n is a constant, and d_i is the shortest distance from P to the boundary of the i-th participating image.
8. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the further feature point search described in Step 5 exploits the fact that the point on the mosaic image corresponding to a feature point of the still image lies on the corresponding epipolar line, thereby increasing the number of feature points found.
9. The fully automatic calibration method for a master-slave camera system according to claim 1, characterized in that the global calibration described in Step 6 comprises the steps of:
1) For any point P_s(x_s, y_s) in the still image, search its neighbourhood for still-image feature points, where N_R(P_s) is the set of all still-image feature points whose distance from P_s is at most R, that is:

$$N_R(P_s) = \{(P_s^1, r_1), (P_s^2, r_2), \ldots\},$$

where P_s^i is a feature point at distance r_i from P_s;
2) For each still-image feature point P_s^i in the set N_R(P_s), find the corresponding mosaic-image feature point P_d^i, determine the sub-images that contain P_d^i, and among them select the sub-image I_r^i whose centre is nearest to P_d^i;
3) Look up the label of sub-image I_r^i among all sub-image groups to obtain the positioning parameters S_i of the dynamic camera at the moment sub-image I_r^i was captured;
4) Interpolate the positioning parameters S_i of the dynamic camera to obtain the correspondence between any point P_s(x_s, y_s) of the still image and the positioning parameters of the dynamic camera:

$$S = S_1 f_1(r_1, r_2, \ldots, r_n) + S_2 f_2(r_1, r_2, \ldots, r_n) + \cdots + S_n f_n(r_1, r_2, \ldots, r_n),$$

where f_i is an interpolating function.
CN2010101399769A 2010-04-07 2010-04-07 Full automatic calibration method of master-slave camera chain Expired - Fee Related CN101794448B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010101399769A CN101794448B (en) 2010-04-07 2010-04-07 Full automatic calibration method of master-slave camera chain


Publications (2)

Publication Number Publication Date
CN101794448A true CN101794448A (en) 2010-08-04
CN101794448B CN101794448B (en) 2012-07-04

Family

ID=42587120


Country Status (1)

Country Link
CN (1) CN101794448B (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693543A (en) * 2012-05-21 2012-09-26 南开大学 Method for automatically calibrating Pan-Tilt-Zoom in outdoor environments
CN102842121A (en) * 2011-06-24 2012-12-26 鸿富锦精密工业(深圳)有限公司 Picture splicing system and picture splicing method
CN103024350A (en) * 2012-11-13 2013-04-03 清华大学 Master-slave tracking method for binocular PTZ (Pan-Tilt-Zoom) visual system and system applying same
CN103105858A (en) * 2012-12-29 2013-05-15 上海安维尔信息科技有限公司 Method capable of amplifying and tracking goal in master-slave mode between fixed camera and pan tilt zoom camera
CN103438798A (en) * 2013-08-27 2013-12-11 北京航空航天大学 Initiative binocular vision system overall calibration method
CN103841333A (en) * 2014-03-27 2014-06-04 成都动力视讯科技有限公司 Preset bit method and control system
CN104301674A (en) * 2014-09-28 2015-01-21 北京正安融翰技术有限公司 Panoramic monitoring and PTZ camera linkage method based on video feature matching
CN104537659A (en) * 2014-12-23 2015-04-22 金鹏电子信息机器有限公司 Automatic two-camera calibration method and system
CN104574425A (en) * 2015-02-03 2015-04-29 中国人民解放军国防科学技术大学 Calibration and linkage method for primary camera system and secondary camera system on basis of rotary model
CN105096324A (en) * 2015-07-31 2015-11-25 深圳市大疆创新科技有限公司 Camera device and calibration method thereof
CN105430333A (en) * 2015-11-18 2016-03-23 苏州科达科技股份有限公司 Method and device for calculating gun-type camera distortion coefficient in real time
CN105453600A (en) * 2013-08-02 2016-03-30 高通股份有限公司 Identifying IoT devices/objects/people using out-of-band signaling/metadata in conjunction with optical images
CN105516661A (en) * 2015-12-10 2016-04-20 吴健辉 Master-slave target monitoring system and method in combination of fisheye camera and PTZ camera
CN106652026A (en) * 2016-12-23 2017-05-10 安徽工程大学机电学院 Three-dimensional space automatic calibration method based on multi-sensor fusion
CN108469254A (en) * 2018-03-21 2018-08-31 南昌航空大学 A kind of more visual measuring system overall calibration methods of big visual field being suitable for looking up and overlooking pose
CN109613462A (en) * 2018-11-21 2019-04-12 河海大学 A kind of scaling method of CT imaging
CN111243035A (en) * 2020-04-29 2020-06-05 成都纵横自动化技术股份有限公司 Camera calibration method and device, electronic equipment and computer-readable storage medium
CN112308924A (en) * 2019-07-29 2021-02-02 浙江宇视科技有限公司 Method, device and equipment for calibrating camera in augmented reality and storage medium
CN113393529A (en) * 2020-03-12 2021-09-14 浙江宇视科技有限公司 Camera calibration method, device, equipment and medium
CN113781548A (en) * 2020-06-10 2021-12-10 华为技术有限公司 Multi-device pose measurement method, electronic device and system
US12073071B2 (en) 2020-07-29 2024-08-27 Huawei Technologies Co., Ltd. Cross-device object drag method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101021947A (en) * 2006-09-22 2007-08-22 东南大学 Double-camera calibrating method in three-dimensional scanning system
KR100788643B1 (en) * 2001-01-09 2007-12-26 삼성전자주식회사 Searching method of image based on combination of color and texture


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Robot (《机器人》), 2007-03-31, Yang Guanglin et al., "Tracking of moving targets by a dual-camera system", Vol. 29, No. 2, pp. 133-139. 2 *
Journal of Computer Applications (《计算机应用》), 2007-11-30, Jiang Lulu et al., "Stereo matching of uncalibrated images based on epipolar geometry constraints", Vol. 27, No. 11. 2 *

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102842121A (en) * 2011-06-24 2012-12-26 鸿富锦精密工业(深圳)有限公司 Picture splicing system and picture splicing method
CN102693543B (en) * 2012-05-21 2014-08-20 南开大学 Method for automatically calibrating Pan-Tilt-Zoom in outdoor environments
CN102693543A (en) * 2012-05-21 2012-09-26 南开大学 Method for automatically calibrating Pan-Tilt-Zoom in outdoor environments
CN103024350A (en) * 2012-11-13 2013-04-03 清华大学 Master-slave tracking method for binocular PTZ (Pan-Tilt-Zoom) visual system and system applying same
CN103024350B (en) * 2012-11-13 2015-07-29 清华大学 A kind of principal and subordinate's tracking of binocular PTZ vision system and the system of application the method
CN103105858A (en) * 2012-12-29 2013-05-15 上海安维尔信息科技有限公司 Method capable of amplifying and tracking goal in master-slave mode between fixed camera and pan tilt zoom camera
CN105453600B (en) * 2013-08-02 2019-07-05 高通股份有限公司 IoT equipment/object/people is identified using out-of-band signalling/metadata combination optical imagery
CN105453600A (en) * 2013-08-02 2016-03-30 高通股份有限公司 Identifying IoT devices/objects/people using out-of-band signaling/metadata in conjunction with optical images
CN103438798B (en) * 2013-08-27 2016-01-20 北京航空航天大学 Initiative binocular vision system overall calibration
CN103438798A (en) * 2013-08-27 2013-12-11 北京航空航天大学 Initiative binocular vision system overall calibration method
CN103841333A (en) * 2014-03-27 2014-06-04 成都动力视讯科技有限公司 Preset bit method and control system
CN104301674A (en) * 2014-09-28 2015-01-21 北京正安融翰技术有限公司 Panoramic monitoring and PTZ camera linkage method based on video feature matching
CN104537659A (en) * 2014-12-23 2015-04-22 金鹏电子信息机器有限公司 Automatic two-camera calibration method and system
CN104574425A (en) * 2015-02-03 2015-04-29 中国人民解放军国防科学技术大学 Calibration and linkage method for primary camera system and secondary camera system on basis of rotary model
US10546390B2 (en) 2015-07-31 2020-01-28 SZ DJI Technology Co., Ltd. Method for calibrating an imaging device and an imaging device
CN105096324A (en) * 2015-07-31 2015-11-25 深圳市大疆创新科技有限公司 Camera device and calibration method thereof
WO2017020609A1 (en) * 2015-07-31 2017-02-09 深圳市大疆创新科技有限公司 Method of calibrating camera device and camera device utilizing same
CN105096324B (en) * 2015-07-31 2017-11-28 深圳市大疆创新科技有限公司 A kind of camera device scaling method and camera device
US10192325B2 (en) 2015-07-31 2019-01-29 SZ DJI Technology Co., Ltd. Method for calibrating an imaging device and an imaging device
CN105430333B (en) * 2015-11-18 2018-03-23 苏州科达科技股份有限公司 A kind of method and device for being back-calculated gunlock distortion factor in real time
CN105430333A (en) * 2015-11-18 2016-03-23 苏州科达科技股份有限公司 Method and device for calculating gun-type camera distortion coefficient in real time
CN105516661A (en) * 2015-12-10 2016-04-20 吴健辉 Master-slave target monitoring system and method in combination of fisheye camera and PTZ camera
CN105516661B (en) * 2015-12-10 2019-03-29 吴健辉 Principal and subordinate's target monitoring method that fisheye camera is combined with ptz camera
CN106652026A (en) * 2016-12-23 2017-05-10 安徽工程大学机电学院 Three-dimensional space automatic calibration method based on multi-sensor fusion
CN108469254A (en) * 2018-03-21 2018-08-31 南昌航空大学 A kind of more visual measuring system overall calibration methods of big visual field being suitable for looking up and overlooking pose
CN109613462A (en) * 2018-11-21 2019-04-12 河海大学 A kind of scaling method of CT imaging
CN112308924A (en) * 2019-07-29 2021-02-02 浙江宇视科技有限公司 Method, device and equipment for calibrating camera in augmented reality and storage medium
CN112308924B (en) * 2019-07-29 2024-02-13 浙江宇视科技有限公司 Method, device, equipment and storage medium for calibrating camera in augmented reality
CN113393529A (en) * 2020-03-12 2021-09-14 浙江宇视科技有限公司 Camera calibration method, device, equipment and medium
CN113393529B (en) * 2020-03-12 2024-05-10 浙江宇视科技有限公司 Method, device, equipment and medium for calibrating camera
CN111243035A (en) * 2020-04-29 2020-06-05 成都纵横自动化技术股份有限公司 Camera calibration method and device, electronic equipment and computer-readable storage medium
CN113781548A (en) * 2020-06-10 2021-12-10 华为技术有限公司 Multi-device pose measurement method, electronic device and system
CN113781548B (en) * 2020-06-10 2024-06-14 华为技术有限公司 Multi-equipment pose measurement method, electronic equipment and system
US12073071B2 (en) 2020-07-29 2024-08-27 Huawei Technologies Co., Ltd. Cross-device object drag method and device

Also Published As

Publication number Publication date
CN101794448B (en) 2012-07-04

Similar Documents

Publication Publication Date Title
CN101794448B (en) Full automatic calibration method of master-slave camera chain
CN111062873B (en) Parallax image splicing and visualization method based on multiple pairs of binocular cameras
Aghaei et al. PV power plant inspection by image mosaicing techniques for IR real-time images
CN109348119B (en) Panoramic monitoring system
CN103517041B (en) Based on real time panoramic method for supervising and the device of polyphaser rotation sweep
US8848035B2 (en) Device for generating three dimensional surface models of moving objects
CN102984453B (en) Single camera is utilized to generate the method and system of hemisphere full-view video image in real time
CN103971375B (en) A kind of panorama based on image mosaic stares camera space scaling method
CN109308174B (en) Cross-screen image splicing control method
CN105809701A (en) Panorama video posture calibrating method
CN1712891A (en) Method for associating stereo image and three-dimensional data preparation system
US20210385381A1 (en) Image synthesis system
CN105741233B (en) Video image spherical surface splicing method and system
CN104881869A (en) Real time panorama tracing and splicing method for mobile platform
JP2017182695A (en) Information processing program, information processing method, and information processing apparatus
CN102148965A (en) Video monitoring system for multi-target tracking close-up shooting
CN112348775B (en) Vehicle-mounted looking-around-based pavement pit detection system and method
CN111815672A (en) Dynamic tracking control method, device and control equipment
US11410459B2 (en) Face detection and recognition method using light field camera system
CN108430032A (en) A kind of method and apparatus for realizing that VR/AR device locations are shared
CN114372992A (en) Edge corner point detection four-eye vision algorithm based on moving platform
EP4071713A1 (en) Parameter calibration method and apapratus
CN116132610A (en) Fully-mechanized mining face video stitching method and system
CN112488022B (en) Method, device and system for monitoring panoramic view
CN111080523B (en) Infrared peripheral vision search system and infrared peripheral vision image splicing method based on angle information

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120704

Termination date: 20150407

EXPY Termination of patent right or utility model