CN104376558B - Cuboid-based intrinsic parameter calibration method for Kinect depth camera - Google Patents


Info

Publication number
CN104376558B
CN104376558B (application CN201410642053.3A)
Authority
CN
China
Prior art keywords
plane
depth
cuboid
angle
kinect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410642053.3A
Other languages
Chinese (zh)
Other versions
CN104376558A (en)
Inventor
耿卫东 (Geng Weidong)
金秉文 (Jin Bingwen)
雷昊 (Lei Hao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201410642053.3A priority Critical patent/CN104376558B/en
Publication of CN104376558A publication Critical patent/CN104376558A/en
Application granted granted Critical
Publication of CN104376558B publication Critical patent/CN104376558B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a cuboid-based intrinsic parameter calibration method for a Kinect depth camera. In the method, high-precision cuboids are placed on the ground and photographed with the Kinect depth camera; normal maps, one per frame, are generated from the multi-frame depth data; the planes of the calibration object are segmented to obtain per-plane depth data; each plane's depth data is back-projected into a three-dimensional point set by inverting the perspective projection; a plane is fitted to each point set by least squares; the included angles and distances between the marked planes are computed; the angles and distances of the high-precision cuboids are measured physically and compared, an objective function that minimizes the difference between computed and measured values is constructed, and the intrinsic parameters of the Kinect depth camera are optimized to minimize the objective function, completing the calibration of the Kinect depth camera. The method uses depth information only and can therefore be applied to a wide range of depth cameras and application scenarios; the calibrated intrinsic parameters can be used for three-dimensional reconstruction, with improved accuracy over existing calibration methods.

Description

A cuboid-based intrinsic parameter calibration method for a Kinect depth camera
Technical field
The present invention relates to a camera calibration method in the field of computer vision, and in particular to a cuboid-based intrinsic parameter calibration method for a Kinect depth camera.
Background technology
The Kinect, a motion-sensing accessory for the Xbox 360 game console released by Microsoft at the end of 2010, has attracted the attention of computer-science researchers. The Kinect comprises an ordinary optical camera and a depth camera consisting of an infrared camera and an infrared projector; its depth-sensing method returns the depth of the photographed scene in real time at 30 frames per second. As a consumer-grade depth camera it has a substantial price advantage over traditional depth cameras and has therefore been welcomed by researchers, with corresponding work appearing in human pose recognition, robotics, object recognition, 3D measurement, and other fields. More recently, with the rise of 3D printing technology, work has appeared that uses the Kinect depth camera for real-time three-dimensional reconstruction of scenes or objects; a well-known example is Microsoft's KinectFusion work of 2012, which uses the Kinect as a depth-scanning device and, combined with the parallel processing capability of the GPU, can reconstruct a three-dimensional model of the scanned scene in real time.
Calibration is essential to camera applications. In 3D vision we need to know the relationship between an object's coordinates in three-dimensional space and its coordinates on the imaging plane. Three-dimensional reconstruction from depth information consists mainly of projecting depth measurements into three-dimensional space; the intrinsic parameters of the depth camera participate directly in this projection, so they directly affect the accuracy of the back-projected reconstruction. Camera calibration is the process of solving for the parameters used to model the projection in the camera model. Different application scenarios place different requirements on calibration. The manufacturer calibrates each Kinect device once at the factory and writes the result into the device firmware; this set of camera parameters is referred to as the manufacturer intrinsics. The manufacturer intrinsics satisfy the Kinect's everyday entertainment applications, but for measurement-grade applications such as three-dimensional reconstruction they cannot meet the precision requirements, so the Kinect depth camera must be recalibrated to obtain more accurate depth-camera intrinsics and improve the accuracy of three-dimensional reconstruction.
Content of the invention
The purpose of the present invention is to address the deficiencies of the prior art by providing a cuboid-based intrinsic parameter calibration method for the Kinect depth camera, which uses a three-dimensional object to calibrate the depth-camera parameters and performs accurate intrinsic calibration of the Kinect depth camera.
The purpose of the present invention is achieved by the following technical scheme:
1) At least three high-precision cuboids of known dimensions are placed on the ground to form a cuboid group, and the group is photographed with the Kinect depth camera from multiple different angles. The flatness of each surface of the high-precision cuboids reaches 0.1 mm, and the surfaces of the high-precision cuboids reflect the infrared light emitted by the Kinect depth camera.
2) From the multi-frame depth data in the depth images obtained in step 1), the same number of normal maps are generated by back-projecting the data into the three-dimensional space of the camera's local coordinate system via the inverse of the perspective projection.
3) The planes of the calibration object are segmented on the normal maps to obtain per-plane depth data.
4) The plane depth data obtained in step 3) are back-projected into the three-dimensional space of the camera's local coordinate system via the inverse of the perspective projection, yielding a three-dimensional point set Q for each plane.
5) A least-squares fit is applied to each three-dimensional point set Q obtained in step 4), yielding the plane corresponding to each point set Q.
6) From the planes obtained in step 5), the angles and distances between the marked planes are computed.
7) The angles and distances between the marked planes of the high-precision cuboids are measured physically and compared with the angles and distances obtained in step 6). An objective function that minimizes the difference is constructed, and a nonlinear iterative optimization method is applied to this objective function, repeating steps 4) to 6) while optimizing the intrinsic parameters of the Kinect depth camera, so that the objective function is minimized and the intrinsic calibration of the Kinect depth camera is completed.
In step 1), the at least three high-precision cuboids are arranged on the ground so that the Kinect depth camera can photograph common-vertex face groups, each composed of three adjacent faces sharing a common vertex, and parallel face groups, each composed of two mutually parallel faces; there is at least one common-vertex face group and at least two parallel face groups.
In step 1), the Kinect depth camera is rotated about the center of the cuboid group to shoot from the multiple different angles, rotating at least 9 degrees each time.
In step 1), for each shooting angle, several different ranges of distance between the Kinect depth camera and the center of the cuboid group are set, and one shooting distance is chosen at random from each range for shooting.
The detailed process of generating the same number of normal maps from the depth images captured in step 2) is: 2.1) for each pixel in each depth image I, take a window W of width r centered on that pixel as the normal-computation window;
2.2) using the intrinsic empirical values set by the manufacturer, back-project each depth point q′ in the window into the three-dimensional space of the camera's local coordinate system via the inverse of the perspective projection, obtaining a three-dimensional point set Q = {q_i | i = 1, 2, …, r²} containing r² points, with q = (x, y, z); each depth point is written q′ = (u, v, d(u, v)), where (u, v) is the pixel's coordinate in the depth image and d(u, v) is the depth value of pixel (u, v) in the depth map;
2.3) from the three-dimensional point set Q, fit a plane by the least-squares method; its normal vector is the three-dimensional vector n = (a, b, c) and its offset is m;
2.4) map the three components a, b, c of the three-dimensional vector n into the red, green, and blue values of RGB space by the following formulas, obtaining the RGB color value of the center pixel of window W:
R = 125.5 ( a/√(a² + b² + c²) + 1 )
G = 125.5 ( b/√(a² + b² + c²) + 1 )
B = 125.5 ( c/√(a² + b² + c²) + 1 )
2.5) repeat steps 2.1) to 2.4) for every pixel in depth image I, generating one RGB color value per pixel in I; the resulting color image is the normal map N of the depth image I.
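The per-window plane fit of step 2.3) and the color mapping of step 2.4) can be sketched as follows. This is an illustrative Python sketch under stated assumptions, not the patent's implementation: the SVD-based fit and the function names are our own (the embodiment uses vcglib for the fit).

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit (step 2.3): returns the unit normal
    n = (a, b, c) and offset m of the plane a*x + b*y + c*z = m.
    Sketch via SVD: the singular vector of the smallest singular
    value of the centered point cloud is the fitted normal."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, float(n @ centroid)

def normal_to_rgb(n):
    """Step 2.4: map each component of the unit-normalized normal
    from [-1, 1] into RGB using the patent's 125.5 scale factor."""
    a, b, c = np.asarray(n, dtype=float) / np.linalg.norm(n)
    return (125.5 * (a + 1), 125.5 * (b + 1), 125.5 * (c + 1))
```

Applied to every r×r window of a depth frame, these two steps yield the per-pixel colors of the normal map N.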
Segmenting the planes of the calibration object on the normal maps in step 3) is specifically:
3.1) in the cuboid group, select at least one common-vertex face group and at least two parallel face groups, and mark all faces in them with Arabic numerals;
3.2) on each normal map generated in step 2), draw on each marked face a polygon representing the outline of the valid-depth pixel region of that face;
3.3) map the polygons back onto the depth map corresponding to each normal map, obtaining the plane depth data corresponding to each polygon.
The inverse perspective projection in step 2) is the same as the inverse perspective projection in step 4). The detailed process is: using the initially set intrinsic empirical values, back-project each depth point q′ in the window into the three-dimensional space of the camera's local coordinate system by the following formulas, obtaining the three-dimensional point set Q = {q_i | i = 1, 2, …, r²} of r² points:
x = z(u − c_x)/f_x ...... (1)
y = z(v − c_y)/f_y ...... (2)
z = 1 / ( γ_1 ( d(u, v) + D(u, v)·e^(α_0 − α_1·d(u, v)) ) + γ_0 ) ...... (3)
Here D is a two-dimensional matrix whose numbers of rows and columns equal the vertical and horizontal pixel counts of depth image I, and D(u, v) is one element of D; f_x and f_y are the two focal lengths of the Kinect depth camera along the x-axis and y-axis respectively; c_x and c_y are the horizontal and vertical coordinates of the optical center of the Kinect depth camera on the imaging plane; γ_0 and γ_1 are the first and second depth-conversion parameters of the Kinect depth camera; and α_0, α_1, and D are its first, second, and third distortion-correction parameters respectively.
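Under these definitions, the inverse perspective projection of equations (1) to (3) can be sketched as follows. This is illustrative Python; the function name is our own, and the parameter values used in testing are toy numbers, not real Kinect intrinsics.

```python
import numpy as np

def backproject(u, v, d, fx, fy, cx, cy, gamma0, gamma1, alpha0, alpha1, D):
    """Inverse perspective projection of one depth pixel (Eqs. 1-3).

    (u, v) is the pixel coordinate, d = d(u, v) the raw depth reading,
    and D the per-pixel distortion-correction matrix (indexed row = v,
    column = u). Returns the 3D point (x, y, z) in the camera's local
    coordinate system."""
    # Eq. (3): distortion-corrected disparity-to-depth conversion.
    z = 1.0 / (gamma1 * (d + D[v, u] * np.exp(alpha0 - alpha1 * d)) + gamma0)
    # Eqs. (1)-(2): pinhole back-projection at depth z.
    x = z * (u - cx) / fx
    y = z * (v - cy) / fy
    return np.array([x, y, z])
```

Running this over every pixel of a segmented plane region produces the point set Q of step 4).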
The angles and distances between the marked planes of the high-precision cuboids in step 7) are measured as follows: the angle between each pair of planes in the marked common-vertex face groups is measured with an angle gauge, and the distance between the planes of the marked parallel face groups is measured with a vernier caliper.
The angles and distances between the marked planes in step 6) are specifically: the angle between each pair of planes in a marked common-vertex face group and the distance between the two planes of a marked parallel face group, obtained respectively by the following two formulas:
θ_{i,j,k} = arccos( (n_{i,k} · n_{j,k}) / (|n_{i,k}| |n_{j,k}|) ) ...... (4)
δ_{i,j,k} = (1/2) [ (1/|Q_{i,k}|) Σ_{q∈Q_{i,k}} |P_{j,k} · q̂| / |n_{j,k}| + (1/|Q_{j,k}|) Σ_{q∈Q_{j,k}} |P_{i,k} · q̂| / |n_{i,k}| ] ...... (5)
Here θ_{i,j,k} is the angle between planes P_{i,k} and P_{j,k}; δ_{i,j,k} is the distance between planes P_{i,k} and P_{j,k}; k is the index of the depth map; Q_{i,k} and Q_{j,k} are any two three-dimensional point sets of the k-th depth frame; q is a three-dimensional point and q̂ its homogeneous coordinates; n_{i,k} = (a_{i,k}, b_{i,k}, c_{i,k}) and n_{j,k} = (a_{j,k}, b_{j,k}, c_{j,k}) are the plane normal vectors corresponding to Q_{i,k} and Q_{j,k}; P_{i,k} = (a_{i,k}, b_{i,k}, c_{i,k}, −m_{i,k}) and P_{j,k} = (a_{j,k}, b_{j,k}, c_{j,k}, −m_{j,k}) are the four-dimensional plane-parameter vectors corresponding to Q_{i,k} and Q_{j,k}; the general plane equations of P_{i,k} and P_{j,k} are a_{i,k}x + b_{i,k}y + c_{i,k}z = m_{i,k} and a_{j,k}x + b_{j,k}y + c_{j,k}z = m_{j,k} respectively, where a_{i,k}, b_{i,k}, c_{i,k}, and −m_{i,k} are the first through fourth parameters of the plane equation corresponding to Q_{i,k}, and a_{j,k}, b_{j,k}, c_{j,k}, and −m_{j,k} are the first through fourth parameters of the plane equation corresponding to Q_{j,k}.
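Equations (4) and (5) can be sketched as follows. This is illustrative Python with function names of our own choosing; P_i and P_j are the four-dimensional plane-parameter vectors (a, b, c, −m) defined above.

```python
import numpy as np

def plane_angle(n_i, n_j):
    """Eq. (4): included angle (in degrees) between two plane normals."""
    n_i, n_j = np.asarray(n_i, float), np.asarray(n_j, float)
    c = n_i @ n_j / (np.linalg.norm(n_i) * np.linalg.norm(n_j))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def plane_distance(Q_i, Q_j, P_i, P_j):
    """Eq. (5): symmetric mean point-to-plane distance between two
    fitted planes, averaging each point set's distance to the other
    set's plane."""
    P_i, P_j = np.asarray(P_i, float), np.asarray(P_j, float)
    def mean_dist(Q, P):
        Q = np.asarray(Q, float)
        q_hat = np.hstack([Q, np.ones((len(Q), 1))])  # homogeneous coords
        return np.mean(np.abs(q_hat @ P)) / np.linalg.norm(P[:3])
    return 0.5 * (mean_dist(Q_i, P_j) + mean_dist(Q_j, P_i))
```

For an ideal cuboid, adjacent faces should give 90° from plane_angle and opposite faces should give the caliper-measured thickness from plane_distance.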
The objective function in step 7) is:
E = S Σ_{k∈σ_view} [ (λ/|σ_angle|) Σ_{(i,j)∈σ_angle} ((θ_{i,j,k} − θ̄_{i,j})/θ̄_{i,j})² + ((1 − λ)/|σ_distance|) Σ_{(i,j)∈σ_distance} ((δ_{i,j,k} − δ̄_{i,j})/δ̄_{i,j})² ] ...... (6)
In the formula above, S is a count-normalization parameter; σ_view is the set of indices k of the captured depth images; σ_angle is the set of Arabic-numeral mark pairs of the faces used for angle computation; σ_distance is the set of Arabic-numeral mark pairs of the faces used for distance computation; λ ∈ [0, 1] is the weight of the angle-difference term; and θ̄_{i,j} and δ̄_{i,j} are the measured values of the angles and distances respectively.
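The objective of equation (6) can be sketched as follows. This is illustrative Python; the dictionary-based data layout (per-view dictionaries keyed by mark pairs) is an assumption of this sketch, not part of the patent.

```python
import numpy as np

def objective(theta, delta, theta_bar, delta_bar, lam=0.5, S=1.0):
    """Eq. (6): weighted sum of squared relative errors over all views.

    theta[k][(i, j)] / delta[k][(i, j)] are the angles and distances
    computed from frame k; theta_bar / delta_bar hold the physically
    measured values; lam weights angle vs. distance terms and S is
    the count-normalization parameter."""
    E = 0.0
    for k in theta:
        ang = [((theta[k][ij] - theta_bar[ij]) / theta_bar[ij]) ** 2
               for ij in theta[k]]
        dis = [((delta[k][ij] - delta_bar[ij]) / delta_bar[ij]) ** 2
               for ij in delta[k]]
        E += lam * np.mean(ang) + (1.0 - lam) * np.mean(dis)
    return S * E
```

The intrinsics enter implicitly: changing them changes the back-projected point sets, hence the fitted planes, hence every theta[k] and delta[k] value.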
The beneficial effects of the invention are as follows:
The method of the invention improves the precision of three-dimensional modeling based on the Kinect depth camera, achieving high-precision three-dimensional modeling at low cost. At present, high-precision three-dimensional scanning equipment is relatively expensive, while cheap three-dimensional scanning equipment is usually of low precision. The present invention achieves high-precision three-dimensional modeling with the comparatively cheap Kinect depth camera, greatly reducing the cost of high-precision three-dimensional scanning and helping to popularize high-precision three-dimensional scanning technology in more applications such as 3D printing and non-contact dimensional measurement.
Brief description
Fig. 1 shows the placement of the high-precision cuboids and their surface marks.
Fig. 2 shows the RGB images of the four high-precision calibration cuboids and the corresponding depth images.
Fig. 3 shows the depth images captured for calibration.
Fig. 4 shows a normal map and the result of the user's segmentation of the calibration-object planes.
Fig. 5 shows the placement and surface marks of a high-precision cuboid that did not participate in the calibration.
Specific embodiment
The invention is further described below with reference to the accompanying drawings and embodiments.
The calibration process of the present invention is based on the angles and distances between space planes computed via the projection. The method first back-projects the depth data, using the camera intrinsics, into point sets in the camera's local coordinate system, then constructs the optimization objective from the space planes fitted to the point sets, and obtains the optimal camera intrinsics by a nonlinear iterative optimization method.
An embodiment of the invention is as follows:
1) Four high-precision cuboids of known dimensions are placed on the ground to form a cuboid group, arranged as shown in Fig. 1; the appearance of each of the four high-precision cuboids is shown in the RGB images and corresponding depth images of Fig. 2. The group is photographed with the Kinect depth camera from 10 different angles.
The flatness of the four high-precision cuboids reaches 0.1 mm, and their surfaces reflect the infrared light emitted by the Kinect depth camera.
With the four high-precision cuboids arranged on the ground as in step 1) above, the Kinect depth camera can photograph common-vertex face groups, each composed of three adjacent faces sharing a common vertex, and parallel face groups, each composed of two mutually parallel faces; there are 3 common-vertex face groups and 12 parallel face groups.
In step 1) above, the Kinect depth camera is rotated about the center of the cuboid group to shoot from 10 different angles, rotating about 15 degrees horizontally each time, and at least 9 degrees.
In step 1) above, for each shooting angle, 3 different ranges of distance between the Kinect depth camera and the center of the cuboid group are set: 0.75 m to 1.15 m, 0.9 m to 1.35 m, and 1.15 m to 1.55 m; one shooting distance is chosen at random from each range for shooting. All captured depth images are shown in Fig. 3.
2) From the multi-frame depth data in the images obtained in step 1), back-project into the three-dimensional space of the camera's local coordinate system via the inverse of the perspective projection and generate the same number of normal maps;
2.1) for each pixel in each depth image I, take a window W of width 7 pixels centered on that pixel as the normal-computation window;
2.2) using the initial intrinsic empirical values set by the manufacturer, back-project each depth point q′ in the window into the three-dimensional space of the camera's local coordinate system via the inverse perspective projection expressed by formulas 1, 2, and 3, obtaining a three-dimensional point set Q = {q_i | i = 1, 2, …, r²} of 49 points, with q = (x, y, z); each depth point q′ is written q′ = (u, v, d(u, v)), where (u, v) is the pixel's coordinate in the depth image and d(u, v) is the depth value of pixel (u, v) in the depth map:
x = z(u − c_x)/f_x ...... (1)
y = z(v − c_y)/f_y ...... (2)
z = 1 / ( γ_1 ( d(u, v) + D(u, v)·e^(α_0 − α_1·d(u, v)) ) + γ_0 ) ...... (3)
Here (f_x, f_y, c_x, c_y, γ_0, γ_1, α_0, α_1, D) are the intrinsics of the Kinect depth camera, whose values are obtained by the Kinect depth-camera intrinsic calibration method proposed in Smisek J, Jancosek M, Pajdla T. 3D with Kinect [C]. IEEE, 2011.
2.3) from the three-dimensional point set Q, fit a plane by the least-squares method; its normal vector is the three-dimensional vector n = (a, b, c) and its offset is m;
2.4) map the three components a, b, c of the three-dimensional vector n into the red, green, and blue values of RGB space according to the formulas R = 125.5 ( a/√(a² + b² + c²) + 1 ), G = 125.5 ( b/√(a² + b² + c²) + 1 ), and B = 125.5 ( c/√(a² + b² + c²) + 1 ), obtaining the RGB color value of the center pixel of window W; at the same time, since the depth value of some pixels in depth image I is the invalid value 2047, the RGB value corresponding to these invalid pixels is set to (0, 0, 0);
2.5) repeat steps 2.1) to 2.4) for every pixel in depth image I, generating one RGB color value per pixel in I; the resulting color image is the normal map N of the depth image I. The normal map is shown in Fig. 4.
3) Segment the planes of the calibration object on the normal maps to obtain per-plane depth data; the result of segmenting the calibration-object planes on a normal map is shown in Fig. 4;
3.1) in the cuboid group, select 3 common-vertex face groups and 5 parallel face groups and mark all faces in them with Arabic numerals; the finished marks are shown in Fig. 1. Any two faces in each common-vertex face group then form one face pair used for angle computation, and the set of mark pairs of these face pairs can be written σ_angle = {(1,2), (1,3), (2,3), (4,5), (4,6), (5,6), (7,8), (7,9), (8,9)}; each parallel face group forms one face pair used for distance computation, and the set of mark pairs of these face pairs can be written σ_distance = {(1,10), (4,10), (5,2), (7,10), (9,3)};
3.2) on each normal map generated in step 2), draw on each marked face a quadrilateral representing the outline of the valid-depth pixel region of that face; the quadrilateral lies inside the cuboid face and covers no black pixels. The quadrilaterals are drawn on the normal map, rather than on the photo or depth map, because valid pixels are easier to distinguish on the normal map;
3.3) map the polygons back onto the depth map corresponding to each normal map, obtaining the plane depth data corresponding to each polygon.
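The construction of σ_angle from the common-vertex face groups can be sketched as follows (illustrative Python; the grouping of marks into triples mirrors the σ_angle set listed in the text):

```python
from itertools import combinations

def angle_pairs(vertex_groups):
    """Enumerate the face-mark pairs used for angle computation:
    every pair of faces within one common-vertex face group forms
    one angle pair."""
    pairs = []
    for group in vertex_groups:
        pairs.extend(combinations(sorted(group), 2))
    return pairs
```

With the three common-vertex groups (1, 2, 3), (4, 5, 6), and (7, 8, 9), this reproduces the nine pairs of σ_angle above.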
4) The plane depth data obtained in step 3) are back-projected into the three-dimensional space of the camera's local coordinate system via the inverse of the perspective projection, yielding a three-dimensional point set O for each plane; the inverse perspective projection of step 4) is the same as that of step 2).
5) A least-squares fit is applied to each three-dimensional point set O obtained in step 4) using vcglib (the Visualization and Computer Graphics Library of the Italian National Research Council), yielding the plane corresponding to each point set O;
6) From the planes obtained in step 5), the angles and distances between the marked planes are computed.
The angles and distances between the marked planes in step 6) above are specifically: the angle between each pair of planes in a marked common-vertex face group and the distance between the two planes of a marked parallel face group, obtained respectively by the following formulas 4 and 5:
θ_{i,j,k} = arccos( (n_{i,k} · n_{j,k}) / (|n_{i,k}| |n_{j,k}|) ) ...... (4)
δ_{i,j,k} = (1/2) [ (1/|Q_{i,k}|) Σ_{q∈Q_{i,k}} |P_{j,k} · q̂| / |n_{j,k}| + (1/|Q_{j,k}|) Σ_{q∈Q_{j,k}} |P_{i,k} · q̂| / |n_{i,k}| ] ...... (5)
Here θ_{i,j,k} is the angle between planes P_{i,k} and P_{j,k}; δ_{i,j,k} is the distance between them; k is the index of the depth map; Q_{i,k} and Q_{j,k} are any two of the three-dimensional point sets obtained from the k-th depth frame in step 4); q is a three-dimensional point and q̂ its homogeneous coordinates; n_{i,k} = (a_{i,k}, b_{i,k}, c_{i,k}) and n_{j,k} = (a_{j,k}, b_{j,k}, c_{j,k}) are the plane normal vectors corresponding to Q_{i,k} and Q_{j,k}; P_{i,k} = (a_{i,k}, b_{i,k}, c_{i,k}, −m_{i,k}) and P_{j,k} = (a_{j,k}, b_{j,k}, c_{j,k}, −m_{j,k}) are the four-dimensional plane-parameter vectors corresponding to Q_{i,k} and Q_{j,k}; the general plane equations of P_{i,k} and P_{j,k} are a_{i,k}x + b_{i,k}y + c_{i,k}z = m_{i,k} and a_{j,k}x + b_{j,k}y + c_{j,k}z = m_{j,k}, both obtained by fitting Q_{i,k} and Q_{j,k} in step 5); a_{i,k}, b_{i,k}, c_{i,k}, and −m_{i,k} are the first through fourth parameters of the plane equation corresponding to Q_{i,k}, and a_{j,k}, b_{j,k}, c_{j,k}, and −m_{j,k} are the first through fourth parameters of the plane equation corresponding to Q_{j,k}.
7) The angles and distances between the marked planes of the high-precision cuboids are measured physically: the angle between each pair of planes in the marked common-vertex face groups is measured with an angle gauge, and the distance between the planes of the marked parallel face groups is then measured with a vernier caliper. All measured values are listed in Table 1:
Table 1: measured values of the distances and angles between the planes of the cuboid group used for calibration
These measured values are then compared with the angles and distances obtained in step 6), and an objective function that minimizes the difference is constructed; the optimization objective function is:
E = S Σ_{k∈σ_view} [ (λ/|σ_angle|) Σ_{(i,j)∈σ_angle} ((θ_{i,j,k} − θ̄_{i,j})/θ̄_{i,j})² + ((1 − λ)/|σ_distance|) Σ_{(i,j)∈σ_distance} ((δ_{i,j,k} − δ̄_{i,j})/δ̄_{i,j})² ] ...... (6)
In the formula above, S is the count-normalization parameter; the set of indices of the captured depth images is σ_view = [1, 30] (the 30 captured frames); σ_angle is the set of Arabic-numeral mark pairs of the faces used for angle computation; σ_distance is the set of Arabic-numeral mark pairs of the faces used for distance computation; λ ∈ [0, 1] is the weight of the angle-difference term; and θ̄_{i,j} and δ̄_{i,j} are the measured values of the angles and distances respectively.
This function is nonlinear in the depth-camera intrinsics. A nonlinear iterative optimization method is applied to this objective function (the nonlinear iterative optimization method of step 7) above is the levmar method: Lourakis M. levmar: Levenberg-Marquardt nonlinear least squares algorithms in C/C++ [J], http://www.ics.forth.gr/~lourakis/levmar, 2004), repeating steps 4) to 6) while optimizing the intrinsics of the Kinect depth camera so that the objective function is minimized, which completes the intrinsic calibration of the Kinect depth camera.
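The Levenberg-Marquardt iteration that levmar implements can be sketched in miniature as follows. This is an illustrative numpy-only sketch with a numeric Jacobian, not the levmar library itself; in the patent, the residual vector would contain the relative angle and distance errors of equation (6) as a function of the intrinsics.

```python
import numpy as np

def levmar(residuals, x0, iters=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop minimizing sum(residuals(x)**2)
    over the parameter vector x (here, the intrinsics being refined)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residuals(x)
        # Forward-difference Jacobian of the residual vector.
        J = np.empty((len(r), len(x)))
        for j in range(len(x)):
            dx = np.zeros_like(x)
            dx[j] = 1e-6
            J[:, j] = (residuals(x + dx) - r) / 1e-6
        # Damped normal equations: (J^T J + lam*I) step = -J^T r.
        A = J.T @ J + lam * np.eye(len(x))
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum(residuals(x + step) ** 2) < np.sum(r ** 2):
            x = x + step
            lam *= 0.5   # good step: move toward Gauss-Newton
        else:
            lam *= 10.0  # bad step: move toward gradient descent
    return x
```

The damping parameter lam interpolates between Gauss-Newton (fast near the optimum) and gradient descent (robust far from it), which is the defining trait of the Levenberg-Marquardt family.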
8) Finally, a new high-precision cuboid that did not participate in the calibration process is used to evaluate the calibration accuracy of the Kinect depth-camera intrinsics finally obtained by this embodiment.
8.1) The new high-precision cuboid that did not participate in calibration is placed on the larger face of another high-precision cuboid; the flatness of every surface of these two high-precision cuboids reaches 0.1 mm, as shown in Fig. 5. These two cuboids are called the evaluation cuboid group.
8.2) The distance between the faces marked 11 and 14 in Fig. 5 is measured with a vernier caliper; the measured distance is 349.25 mm. The angles between the three face pairs in Fig. 5, marked (11, 12), (11, 13), and (12, 13), are measured with a protractor; the measured angles are 89.99°, 90.02°, and 90.02° respectively.
8.3) The evaluation cuboid group is photographed with the Kinect depth camera from 10 different angles.
8.4) The depth data captured in step 8.3) are processed with the same methods as steps 2) and 3), obtaining the plane depth data of the faces marked 11, 12, 13, and 14 in Fig. 5.
8.5) The intrinsics used by the inverse perspective projection of step 4) are replaced with the intrinsics obtained in step 7), and this inverse perspective projection is applied to the plane depth data obtained in step 8.4), yielding the 4 three-dimensional point sets corresponding to the faces marked 11, 12, 13, and 14 in Fig. 5.
8.6) The least-squares fitting method used in step 5) is applied to each of the 4 three-dimensional point sets obtained in step 8.5), yielding the plane equations of the 4 planes corresponding to the 4 point sets.
8.7) Using the plane-angle computation of step 6), the angles between the 3 plane pairs marked (11, 12), (11, 13), and (12, 13) among the 4 planes obtained in step 8.6) are computed; the mean absolute difference between the computed angles and the angles measured in step 8.2) is 0.307°. Using the plane-distance computation of step 6), the distance between the two planes marked 11 and 14 among the 4 planes obtained in step 8.6) is computed; the mean absolute difference between the computed distance and the distance measured in step 8.2) is 1.331 mm. It can be seen that the non-contact measurement precision of a Kinect depth camera calibrated by this method is very high, which is a significant technical effect.
The specific embodiment above is used to illustrate the present invention rather than to limit it; any modifications and changes made to the present invention within its spirit and the scope of the claims fall within the protection scope of the present invention.

Claims (9)

1. A cuboid-based intrinsic parameter calibration method for a Kinect depth camera, characterized by comprising the steps of:
1) placing on the ground a cuboid group composed of at least three high-precision cuboids of known dimensions, and shooting it with a Kinect depth camera from multiple different angles; the flatness of each surface of the high-precision cuboids reaches 0.1 mm, and the surfaces of the high-precision cuboids reflect the infrared light emitted by the Kinect depth camera;
2) back-projecting the multiple frames of depth data in the depth images captured in step 1) into the three-dimensional space of the camera local coordinate system by the inverse perspective projection, and generating a normal map for each frame;
3) marking out the planes of the calibration object on the normal maps to obtain plane depth data;
4) back-projecting the plane depth data obtained in step 3) into the three-dimensional space of the camera local coordinate system by the inverse perspective projection, obtaining the three-dimensional point set Q corresponding to each plane;
5) fitting each three-dimensional point set Q obtained in step 4) by the least-squares method, obtaining the plane corresponding to each point set Q;
6) computing the angles and distances between the labeled planes from the planes obtained in step 5);
7) physically measuring the angles and distances between the labeled planes of the high-precision cuboids, comparing them with the angles and distances obtained in step 6), and constructing an optimization objective function whose purpose is to minimize the differences; applying a nonlinear iterative optimization method to this objective function and repeating steps 4) to 6), so that the intrinsic parameters of the Kinect depth camera are optimized until the objective function is minimized, completing the intrinsic parameter calibration of the Kinect depth camera;
wherein in step 1) the at least three high-precision cuboids are arranged on the ground so that the Kinect depth camera can capture both common-vertex face groups, each consisting of three adjacent faces sharing a common vertex, and parallel face groups, each consisting of two mutually parallel faces, with at least one common-vertex face group and at least two parallel face groups.
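Steps 4) and 5) of claim 1 amount to back-projecting each labeled region into camera space and fitting a plane to it by least squares. The following numpy sketch is for illustration only and is not part of the patent; the function name `fit_plane` is ours:

```python
import numpy as np

def fit_plane(points):
    """Total-least-squares plane fit to an N x 3 point set Q.

    Returns (n, m) where n = (a, b, c) is the unit normal and m the
    offset, so the fitted plane satisfies a*x + b*y + c*z = m.
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # normal of the best-fit plane through the centroid.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    m = float(n @ centroid)
    return n, m
```

The sign of n (and hence of m) is arbitrary; only the plane itself is determined by the fit.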
2. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: in step 1), the Kinect depth camera is rotated about the center of the cuboid group to shoot from multiple different angles, the angle of each rotation being at least 9 degrees.
3. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: in step 1), at each shooting angle, multiple different ranges of the distance between the Kinect depth camera and the center of the cuboid group are set, and one shooting distance is chosen at random from each distance range for shooting.
4. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: in step 2), the normal map of the same frame is generated from each captured depth image as follows:
2.1) for each pixel of each depth image I, take the window W centered on that pixel as the normal-computation window, the window width being r;
2.2) back-project each depth point q′ in the window into the three-dimensional space of the camera local coordinate system by the inverse perspective projection, using the nominal intrinsic parameter values set by the manufacturer, obtaining the three-dimensional point set Q = {q_i | i = 1, 2, …, r²} of r² points, with q = (x, y, z); each depth point is written q′ = (u, v, d(u, v)), where (u, v) is a pixel coordinate in the depth image and d(u, v) is the depth value of pixel (u, v) in the depth map;
2.3) fit a plane to the three-dimensional point set Q by the least-squares method; its normal vector is the three-dimensional vector n = (a, b, c) and its offset is m;
2.4) map the three components a, b, c of the three-dimensional vector n into the red, green and blue values of RGB space by the following formulas, obtaining the RGB color value of the center pixel of window W:
$$R = 125.5\left(\frac{a}{\sqrt{a^2 + b^2 + c^2}} + 1\right)$$
$$G = 125.5\left(\frac{b}{\sqrt{a^2 + b^2 + c^2}} + 1\right)$$
$$B = 125.5\left(\frac{c}{\sqrt{a^2 + b^2 + c^2}} + 1\right)$$
2.5) repeat steps 2.1) to 2.4) for every pixel of the depth image I, generating one group of RGB color values for each pixel of I and thereby obtaining a color image, which is the normal map N of the depth image I.
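The per-pixel procedure of claim 4 can be sketched as follows. This is an illustrative numpy version, not the patent's code: it uses a plain pinhole back-projection (the full method of claim 6 additionally applies the depth distortion correction), and the name `normal_map` is ours:

```python
import numpy as np

def normal_map(depth, fx, fy, cx, cy, r=5):
    """Per-pixel windowed plane fit -> RGB normal map (claim 4 sketch).

    depth: H x W array of metric depth values z(u, v); r: window width.
    """
    h, w = depth.shape
    rgb = np.zeros((h, w, 3))
    half = r // 2
    for v in range(half, h - half):
        for u in range(half, w - half):
            # 2.1-2.2: back-project the r x r window to camera space
            us, vs = np.meshgrid(np.arange(u - half, u + half + 1),
                                 np.arange(v - half, v + half + 1))
            z = depth[v - half:v + half + 1, u - half:u + half + 1]
            pts = np.stack([z * (us - cx) / fx,
                            z * (vs - cy) / fy, z], axis=-1).reshape(-1, 3)
            # 2.3: least-squares plane normal via SVD of the centered points
            _, _, vt = np.linalg.svd(pts - pts.mean(axis=0))
            n = vt[-1]
            # 2.4: map each normal component into RGB per the claim's formulas
            rgb[v, u] = 125.5 * (n + 1.0)
    return rgb
```

For a fronto-parallel plane the fitted normal is (0, 0, ±1), so the R and G channels come out at the mid-gray value 125.5.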
5. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: in step 3), the planes of the calibration object are marked out on the normal maps as follows:
3.1) in the cuboid group, select at least one common-vertex face group and at least two parallel face groups, and label all of their faces with Arabic numerals;
3.2) on each normal map generated in step 2), draw on each labeled face a polygon representing the outline of the region of valid depth pixels on that face;
3.3) map the polygons back onto the depth map corresponding to each normal map, obtaining the plane depth data corresponding to each polygon.
6. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: the inverse perspective projection in step 2) and the inverse perspective projection in step 4) are identical, and proceed as follows: according to the initially set intrinsic parameter values, back-project each depth point q′ in the window into the three-dimensional space of the camera local coordinate system by the following formulas, obtaining the three-dimensional point set Q = {q_i | i = 1, 2, …, r²} of r² points:
$$x = \frac{z\,(u - c_x)}{f_x} \qquad (1)$$
$$y = \frac{z\,(v - c_y)}{f_y} \qquad (2)$$
$$z = \frac{1}{\gamma_1\left(d(u,v) + D(u,v)\,e^{\alpha_0 - \alpha_1 d(u,v)}\right) + \gamma_0} \qquad (3)$$
where D is a two-dimensional matrix whose numbers of rows and columns equal the numbers of pixels along the vertical and horizontal directions of the depth image I, and D(u, v) is one element of D; f_x and f_y are the two focal lengths of the Kinect depth camera along the x-axis and y-axis; c_x and c_y are the horizontal and vertical coordinates of the optical center of the Kinect depth camera on the imaging plane; γ_0 and γ_1 are respectively the first and second depth conversion parameters of the Kinect depth camera; and α_0, α_1 and D are respectively the first, second and third distortion correction parameters of the Kinect depth camera.
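Formulas (1)–(3) can be sketched as a single back-projection routine. This is an interpretation, not the patent's code: the grouping of formula (3) is read here as 1/z being linear in the distortion-corrected depth, in the style of disparity-based Kinect depth models, and all names are ours:

```python
import numpy as np

def backproject(u, v, d, D, fx, fy, cx, cy, g0, g1, a0, a1):
    """Back-project one raw depth sample per formulas (1)-(3) of claim 6.

    d is the raw depth value d(u, v); D is the per-pixel distortion
    term D(u, v).  g0, g1 stand for gamma_0, gamma_1 and a0, a1 for
    alpha_0, alpha_1.
    """
    d_corr = d + D * np.exp(a0 - a1 * d)   # distortion-corrected depth
    z = 1.0 / (g1 * d_corr + g0)           # formula (3), as interpreted
    x = z * (u - cx) / fx                  # formula (1)
    y = z * (v - cy) / fy                  # formula (2)
    return x, y, z
```

With D = 0 the model reduces to the plain conversion z = 1 / (γ₁ d + γ₀), so the pinhole part of the sketch can be checked independently of the distortion term.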
7. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: the angles and distances between the labeled planes in step 6) are specifically the angle between each pair of planes in a labeled common-vertex face group and the distance between the two planes of a labeled parallel face group, the angle and the distance being obtained by the following two formulas, respectively:
$$\theta_{i,j,k} = \arccos\!\left(\frac{n_{i,k}\cdot n_{j,k}}{|n_{i,k}|\,|n_{j,k}|}\right) \qquad (4)$$
$$\delta_{i,j,k} = \frac{1}{2}\left(\frac{1}{|Q_{i,k}|}\sum_{q\in Q_{i,k}}\frac{|P_{j,k}\cdot \hat{q}|}{|n_{j,k}|} + \frac{1}{|Q_{j,k}|}\sum_{q\in Q_{j,k}}\frac{|P_{i,k}\cdot \hat{q}|}{|n_{i,k}|}\right) \qquad (5)$$
where θ_{i,j,k} is the angle between planes P_{i,k} and P_{j,k}; δ_{i,j,k} is the distance between planes P_{i,k} and P_{j,k}; k is the index of the depth map; Q_{i,k} and Q_{j,k} are any two three-dimensional point sets of the k-th depth frame; q is a three-dimensional point and q̂ its homogeneous coordinates; n_{i,k} = (a_{i,k}, b_{i,k}, c_{i,k}) and n_{j,k} = (a_{j,k}, b_{j,k}, c_{j,k}) are the plane normal vectors corresponding to Q_{i,k} and Q_{j,k}; P_{i,k} = (a_{i,k}, b_{i,k}, c_{i,k}, −m_{i,k}) and P_{j,k} = (a_{j,k}, b_{j,k}, c_{j,k}, −m_{j,k}) are the four-dimensional plane parameter vectors corresponding to Q_{i,k} and Q_{j,k}; the general plane equations corresponding to P_{i,k} and P_{j,k} are a_{i,k}x + b_{i,k}y + c_{i,k}z = m_{i,k} and a_{j,k}x + b_{j,k}y + c_{j,k}z = m_{j,k}, respectively; a_{i,k}, b_{i,k}, c_{i,k} and −m_{i,k} are respectively the first, second, third and fourth parameters of the general plane equation corresponding to Q_{i,k}, and a_{j,k}, b_{j,k}, c_{j,k} and −m_{j,k} are respectively the first, second, third and fourth parameters of the general plane equation corresponding to Q_{j,k}.
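Formulas (4) and (5) compute, respectively, the angle between two fitted plane normals and the symmetric mean point-to-plane distance between two parallel planes. An illustrative numpy sketch (the patent prescribes only the formulas; the names are ours, and the angle is returned in degrees for readability):

```python
import numpy as np

def plane_angle(n_i, n_j):
    """Formula (4): angle between two plane normals, in degrees."""
    c = np.dot(n_i, n_j) / (np.linalg.norm(n_i) * np.linalg.norm(n_j))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def plane_distance(Q_i, Q_j, P_i, P_j):
    """Formula (5): symmetric mean point-to-plane distance.

    Q_i, Q_j: N x 3 point sets; P_i, P_j: four-dimensional plane
    parameter vectors (a, b, c, -m) as defined in claim 7.
    """
    def mean_dist(Q, P):
        # Homogeneous coordinates q_hat = (x, y, z, 1); |P . q_hat| / |n|
        q_hat = np.hstack([Q, np.ones((len(Q), 1))])
        return np.mean(np.abs(q_hat @ P)) / np.linalg.norm(P[:3])
    return 0.5 * (mean_dist(Q_i, P_j) + mean_dist(Q_j, P_i))
```

Averaging each point set's distance against the other set's plane makes the measure symmetric in i and j, which is why formula (5) carries the leading factor 1/2.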
8. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 7, characterized in that: in step 7), the angles and distances between the labeled planes of the high-precision cuboids are physically measured as follows: the angle between each pair of planes in the above labeled common-vertex face groups is measured with an angle gauge, and the distance between the planes of the above labeled parallel face groups is measured with vernier calipers.
9. The cuboid-based intrinsic parameter calibration method for a Kinect depth camera according to claim 1, characterized in that: the objective function in step 7) is:
$$E = S\sum_{k\in\sigma_{view}}\left(\frac{\lambda}{|\sigma_{angle}|}\sum_{(i,j)\in\sigma_{angle}}\left(\frac{\theta_{i,j,k}-\bar{\theta}_{i,j}}{\bar{\theta}_{i,j}}\right)^{2} + \frac{1-\lambda}{|\sigma_{distance}|}\sum_{(i,j)\in\sigma_{distance}}\left(\frac{\delta_{i,j,k}-\bar{\delta}_{i,j}}{\bar{\delta}_{i,j}}\right)^{2}\right) \qquad (6)$$
In the above formula, S is a quantity normalization parameter; σ_view is the set of indices k of the captured depth images; σ_angle is the set of Arabic-numeral label pairs of the faces used for computing angles; σ_distance is the set of Arabic-numeral label pairs of the faces used for computing distances; λ ∈ [0, 1] is the weight of the angle-difference term; and θ̄_{i,j} and δ̄_{i,j} are the measured values of the angle and the distance, respectively.
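An illustrative evaluation of formula (6) — not the patent's implementation; in practice E would be handed to a nonlinear iterative optimizer as in claim 1, step 7). Per-view computed values and measured references are passed as dictionaries keyed by face-label pairs (i, j); all names are ours:

```python
import numpy as np

def objective(theta, delta, theta_ref, delta_ref, lam=0.5, S=1.0):
    """Formula (6): normalized relative-error objective over all views.

    theta[k][(i, j)] / delta[k][(i, j)]: computed angle / distance for
    view k and face pair (i, j); theta_ref / delta_ref: measured values
    (theta_bar, delta_bar); lam: angle weight in [0, 1]; S: overall
    quantity normalization parameter.
    """
    E = 0.0
    for k in theta:
        ang = [((theta[k][p] - theta_ref[p]) / theta_ref[p]) ** 2
               for p in theta_ref]
        dis = [((delta[k][p] - delta_ref[p]) / delta_ref[p]) ** 2
               for p in delta_ref]
        # Per-view term: angle part weighted by lam, distance by 1 - lam,
        # each averaged over its set of face pairs.
        E += lam * np.mean(ang) + (1.0 - lam) * np.mean(dis)
    return S * E
```

Using relative errors (division by the measured θ̄ and δ̄) makes the angle and distance terms dimensionless and therefore comparable under a single weight λ.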
CN201410642053.3A 2014-11-13 2014-11-13 Cuboid-based intrinsic parameter calibration method for Kinect depth camera Expired - Fee Related CN104376558B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410642053.3A CN104376558B (en) 2014-11-13 2014-11-13 Cuboid-based intrinsic parameter calibration method for Kinect depth camera


Publications (2)

Publication Number Publication Date
CN104376558A CN104376558A (en) 2015-02-25
CN104376558B true CN104376558B (en) 2017-02-08

Family

ID=52555448

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410642053.3A Expired - Fee Related CN104376558B (en) 2014-11-13 2014-11-13 Cuboid-based intrinsic parameter calibration method for Kinect depth camera

Country Status (1)

Country Link
CN (1) CN104376558B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11810314B2 (en) 2019-06-11 2023-11-07 Cognex Corporation System and method for refining dimensions of a generally cuboidal 3D object imaged by 3D vision system and controls for the same

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104783801A (en) * 2015-04-14 2015-07-22 精迪测量技术(上海)有限公司 Human body high-speed scanning instrument and three-dimensional modeling method
CN106839975B (en) * 2015-12-03 2019-08-30 杭州海康威视数字技术股份有限公司 Volume measuring method and its system based on depth camera
CN106124058A (en) * 2016-06-27 2016-11-16 上海电力学院 Power equipment infrared temperature measurement apparatus based on Kinect depth detection
CN106251282B (en) * 2016-07-19 2019-03-12 中国人民解放军63920部队 A kind of generation method and device of mechanical arm sampling environment analogous diagram
WO2018081967A1 (en) * 2016-11-03 2018-05-11 Intel Corporation Real-time three-dimensional camera calibration
CN106441104B (en) * 2016-12-26 2019-02-01 中国科学院长春光学精密机械与物理研究所 A kind of Size Measuring System
CN108460797B (en) * 2017-02-22 2020-08-25 深圳市鸿合创新信息技术有限责任公司 Method and device for calculating relative pose of depth camera and height of scene plane
CN110506297B (en) * 2017-04-17 2023-08-11 康耐视公司 High accuracy calibration system and method
CN107564051B (en) * 2017-09-05 2020-06-02 歌尔股份有限公司 Depth information acquisition method and system
CN108197571B (en) * 2018-01-02 2021-09-14 联想(北京)有限公司 Mask shielding detection method and electronic equipment
CN108510578B (en) * 2018-03-13 2023-04-07 北京航空航天大学青岛研究院 Three-dimensional model building method and device and electronic equipment
CN108844489A (en) * 2018-06-25 2018-11-20 苏州乐佰图信息技术有限公司 Using the method and camera calibration method of line laser structured light contour of object
CN110232714B (en) * 2019-04-30 2021-07-20 北京云迹科技有限公司 Calibration method and system of depth camera
CN110209997A (en) * 2019-06-10 2019-09-06 成都理工大学 Depth camera automatic Calibration algorithm based on three-dimensional feature point
US11605177B2 (en) 2019-06-11 2023-03-14 Cognex Corporation System and method for refining dimensions of a generally cuboidal 3D object imaged by 3D vision system and controls for the same
CN112150372B (en) * 2019-06-28 2024-02-02 深圳创想未来机器人有限公司 Depth map correction method, depth map correction device and robot
CN112541950A (en) * 2019-09-20 2021-03-23 杭州海康机器人技术有限公司 Method and device for calibrating external parameter of depth camera
CN112147986A (en) * 2020-09-30 2020-12-29 徐州徐工挖掘机械有限公司 Parameter calibration control method suitable for excavator
CN112686961B (en) * 2020-12-31 2024-06-04 杭州海康机器人股份有限公司 Correction method and device for calibration parameters of depth camera
CN113077521B (en) * 2021-03-19 2022-11-01 浙江华睿科技股份有限公司 Camera calibration method and device
CN116805338A (en) * 2022-03-18 2023-09-26 华为技术有限公司 Method and device for calibrating internal parameters of camera

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6081273A (en) * 1996-01-31 2000-06-27 Michigan State University Method and system for building three-dimensional object models
US6094198A (en) * 1994-01-10 2000-07-25 Cognitens, Ltd. System and method for reconstructing surface elements of solid objects in a three-dimensional scene from a plurality of two dimensional images of the scene
CN103646394A (en) * 2013-11-26 2014-03-19 福州大学 Mixed visual system calibration method based on Kinect camera


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000. *
R. Y. Tsai, "An efficient and accurate camera calibration technique for 3D machine vision," IEEE Conference on Computer Vision and Pattern Recognition, pp. 364-374, 1986. *
Jiang Guangwen et al., "High-precision camera calibration technique based on controllable rotation," Acta Optica Sinica, vol. 30, no. 5, pp. 1308-1314, 2010. *
Yang Bowen et al., "Camera calibration technology for vision measurement with a large field of view," Acta Optica Sinica, vol. 32, no. 9, sections 1-3, 2012. *


Also Published As

Publication number Publication date
CN104376558A (en) 2015-02-25

Similar Documents

Publication Publication Date Title
CN104376558B (en) Cuboid-based intrinsic parameter calibration method for Kinect depth camera
CN107063129B (en) A kind of array parallel laser projection three-dimensional scan method
CN103530880B (en) Based on the camera marking method of projection Gaussian network pattern
CN110068270A (en) A kind of monocular vision box volume measurement method based on multi-line structured light image recognition
CN106643563B (en) A kind of Table top type wide view-field three-D scanning means and method
CN110956660B (en) Positioning method, robot, and computer storage medium
CN104266608B (en) Field calibration device for visual sensor and calibration method
Yau et al. Underwater camera calibration using wavelength triangulation
CN105403173B (en) A kind of light field deviation art measuring system calibration mirror and its application process
CN109323650A (en) Image visual transducer and the unified approach for putting ligh-ranging sensor measurement coordinate system
CN104279960B (en) Method for measuring size of object by mobile equipment
CN105303574A (en) Integrated imaging camera array calibration method based on homography transformation
CN109859272A (en) A kind of auto-focusing binocular camera scaling method and device
CN102032878A (en) Accurate on-line measurement method based on binocular stereo vision measurement system
CN109141226A (en) The spatial point coordinate measuring method of one camera multi-angle
CN107421462A (en) Object three-dimensional contour outline measuring system based on line laser structured light
CN109559349A (en) A kind of method and apparatus for calibration
CN110378969A (en) A kind of convergence type binocular camera scaling method based on 3D geometrical constraint
CN109341537A (en) Dimension measurement method and device based on binocular vision
CN104197846B (en) Method for measuring displacement of reflector based on characteristic pattern reflection
CN106709955A (en) Space coordinate system calibrate system and method based on binocular stereo visual sense
CN109272555A (en) A kind of external parameter of RGB-D camera obtains and scaling method
CN103985121A (en) Method for calibrating underwater projector
CN103424087A (en) Three-dimensional measurement splicing system and method for large-scale steel plate
CN103278104B (en) A kind of scaling board of the dual camera systems for DIC measurement and scaling method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170208

Termination date: 20201113