CN109827502A - High-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation


Info

Publication number: CN109827502A (application CN201811619300.2A)
Authority: CN (China)
Prior art keywords: point, striation, calibration, target, coordinate
Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN109827502B (en)
Inventors: 刘震, 胡杨, 任一鸣, 阎峰, 吴穗宁
Current and original assignee: Beihang University
Application filed by Beihang University; priority to CN201811619300.2A; granted as CN109827502B
Current legal status: Active


Abstract

The present invention relates to a high-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation, comprising: calibrating the intrinsic parameters of the camera in the structured-light vision sensor; using a planar metal target inlaid with LED luminous feature points, capturing with the camera images of the planar target that contain the light stripe; extracting the coordinates of the target feature points and the stripe calibration points; separately computing the positioning uncertainty of the target feature points and of the stripe calibration points; with the feature-point positioning uncertainty as a constraint, solving the positioning deviations of all feature points by nonlinear optimization; compensating the target feature-point and stripe calibration-point coordinates; moving the target two or more times to obtain the three-dimensional coordinates of all stripe calibration points in the camera frame; and fitting these three-dimensional points to solve the light-plane equation, completing the calibration. The method suits complex on-site lighting environments and still achieves high-precision calibration of the line-structured-light vision sensor under strong image noise or feature-point positioning deviations.

Description

High-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation
Technical field
The present invention relates to the technical field of sensor calibration, and in particular to a high-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation.
Background technique
As an important means of acquiring three-dimensional data, line-structured-light vision sensors offer a large measurement range, non-contact operation, high speed, and high accuracy, and are widely used in on-line dynamic measurement, for example on-line dynamic measurement of train wheelset dimensions, on-line measurement of pantograph wear, and on-line inspection of train car bodies. These on-site measurement environments are complex and varied; sensor calibration and measurement are easily affected by factors such as sensor placement and the ambient environment, so ideal calibration conditions are hard to satisfy and calibration accuracy is low, which has become a bottleneck for measurement accuracy. At present, calibration methods based on planar targets are widely used on site thanks to their high accuracy, low cost, and flexibility. Their accuracy, however, is still easily degraded by the on-site calibration environment: the image feature points and the stripe calibration points can rarely both be imaged optimally at the same time, which readily produces positioning deviations. Image-processing methods can only reduce such deviations, not eliminate them at the root, so calibration in complex field environments cannot be achieved this way. Studying a method that is insensitive to image noise and can calibrate under complex lighting conditions has therefore become a new direction in the development of structured-light vision sensors.
Line-structured-light vision sensor calibration comprises two main parts: camera intrinsic-parameter calibration and light-plane parameter calibration. Camera intrinsic calibration has been researched extensively for a long time, so the discussion here focuses on light-plane parameter calibration, for which many methods exist. In the article "Self-generated targets for spatial calibration of structured light optical sectioning sensors with respect to an external coordinate system", Dewar R. proposed a wire-drawing calibration method; because the brightness of the bright spots is unevenly distributed, or highlights and reflections occur on the measuring device, the spatial bright spots are difficult to match strictly with the bright spots in the image, so this method yields few calibration points and low calibration accuracy. The article "Calibrating a structured light stripe system: a novel approach" proposed the cross-ratio invariance principle: with at least three collinear points of known coordinates placed on a three-dimensional target, the coordinates of the intersection of the structured-light stripe with the line through these three known points are obtained from cross-ratio invariance. This method obtains light-plane calibration points of fairly high accuracy and suits on-site calibration, but it requires a high-precision three-dimensional target composed of at least two mutually perpendicular planes; because the planes occlude each other's illumination, high-quality calibration images are hard to obtain, which also limits the number of calibration points. In "A novel calibration method for multi-sensor visual measurement system based on structured light", Liu et al. proposed representing the stripe line by Plücker equations; compared with methods using fewer calibration points, this effectively improves calibration accuracy. In "A novel 1D target-based calibration method with unknown orientation for structured light vision sensor", Wei et al. proposed a structured-light vision sensor calibration method based on a one-dimensional target: the distances between the target's feature points are used to solve the three-dimensional coordinates of the intersections of the light plane with the target, and the light-plane equation is obtained by fitting multiple such intersections. In recent years, spatial geometry has been used as an auxiliary constraint to calibrate structured-light vision sensors fitted with optical filters in complex field environments. In "Calibration method for line-structured light vision sensor based on a single ball target", Liu et al. proposed a calibration method based on a ball target: the outer contour of the ball target is extracted to obtain its pose in the camera coordinate system, and the conical profile formed by the intersection of the light plane with the ball target is used to solve the light-plane equation. The extracted ball-target contour is unaffected by the target's placement angle, but contour extraction is easily disturbed by the background and surroundings. "An on-site calibration of line-structured light vision in complex light environment" used a parallel bi-cylindrical target to achieve on-site calibration of the structured-light parameters with a filter-equipped camera in a complex external environment: the correspondence between the parallel elliptical sections that the light plane cuts in the two cylinders and their images is exploited, with the constraint that the cylinder radius equals the semi-minor axis of the spatial ellipse, to further optimize the light-plane equation. Methods for improving calibration accuracy have also been proposed: in "Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations", Zhou et al. proposed a light-plane parameter calibration method based on a planar target, in which calibration points on the light plane are obtained by cross-ratio invariance, the planar target is moved repeatedly to collect the three-dimensional coordinates of calibration points on the light plane, and fitting yields the light-plane equation. With its low cost, flexibility, and high accuracy, this method is widely used in on-site high-precision measurement, but it still cannot remove the calibration error caused by positioning deviations due to image noise, and so cannot achieve higher-accuracy calibration.
Analyzing current calibration methods, they all treat the extracted target feature points or stripe calibration points as the true imaging coordinates, obtain a linear solution, and then refine it by minimizing the feature-point reprojection error, while also trying to make the calibration volume coincide with the measurement volume. In on-site calibration, however, factors such as complex lighting, laser quality, target surface roughness, target placement angle, defocus blur, image noise, and feature-point and stripe extraction deviations inevitably reduce calibration accuracy. If only the target feature-point image quality is attended to, the stripe easily becomes dim or too wide; if only the stripe imaging is attended to, the target features easily become defocused or under-exposed. The two requirements conflict and cannot both be satisfied optimally at once. Studying a general, high-precision calibration method for structured-light vision sensors that works in complex outdoor field conditions and is robust to image noise has therefore become a problem to be solved urgently.
Summary of the invention
The technical problem solved by the present invention: overcoming the deficiencies of the prior art by providing a high-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation, which achieves high-precision calibration in complex on-site lighting environments, in particular in the presence of image blur, defocus, and noise interference.
In order to achieve the above objectives, the technical scheme of the present invention is realized as follows:
A high-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation, the method comprising:
A. With the laser switched off, calibrate the camera of the line-structured-light vision sensor;
B. Using a planar metal target inlaid with LED luminous feature points, capture with the camera images of the planar target containing the light stripe; extract the coordinates of the target feature points and the stripe calibration points;
C. Separately compute the positioning uncertainty of the target feature points and of the stripe calibration points. With the feature-point positioning uncertainty as a constraint, solve the positioning deviations of all feature points by nonlinear optimization and compensate them to obtain accurate feature-point coordinates;
D. Move the target two or more times, obtain the three-dimensional coordinates of all stripe calibration points in the camera frame, reject outliers with RANSAC, and fit the remaining points to solve the light-plane equation.
In step A, with the laser switched off, the camera of the line-structured-light vision sensor is calibrated: the camera intrinsic parameters and the second-order radial lens distortion coefficients are calibrated with the improved Zhang Zhengyou method proposed by Liu Zhen et al.
In step B, images in which the light stripe intersects the planar target are captured, and the image coordinates of the target feature points and stripe calibration points are extracted as follows:
(b1) adjust the metal target so that it intersects the light plane in space, ensuring that the stripe does not pass through the target's luminous feature points;
(b2) extract the target feature-point image coordinates with a multi-scale spot-center method and take them as initial values; extract the stripe center points, fit straight lines through the columns of target feature points perpendicular to the stripe direction, and take their intersections with the stripe as the initial values of the stripe calibration-point coordinates.
In step C, the positioning uncertainties of the target feature points and stripe calibration points are solved as follows:
(c1) process the acquired image with several Gaussian convolution kernels, solve the multiple resulting locations of each feature point, and obtain each feature point's positioning uncertainty statistically;
(c2) establish the positioning-uncertainty model of the local image around each stripe calibration point, estimate the image noise with a mean-filter method, and thereby obtain each stripe calibration point's positioning uncertainty.
Also in step C, with the feature-point positioning uncertainty as a constraint, the positioning deviations of the target feature points and stripe calibration points are solved by nonlinear optimization, and accurate feature-point coordinates are obtained after compensation.
Finally, the sensor is calibrated in step D as follows:
(d1) based on the compensated stripe calibration-point coordinates, obtain the three-dimensional coordinates of the stripe calibration points in the camera coordinate system using the planar-target calibration method;
(d2) after the three-dimensional coordinates of all stripe calibration points in the camera frame have been obtained, reject outliers with the RANSAC method and fit the remaining points to obtain an initial light-plane equation;
(d3) obtain the optimal light-plane equation with the Levenberg-Marquardt nonlinear optimization method, completing the calibration of the line-structured-light vision sensor.
The advantages of the present invention over the prior art are that:
The present invention proposes a structured-light vision sensor calibration method based on an uncertainty model. First, positioning-uncertainty models of the target feature points and stripe calibration points are established and each feature point's positioning uncertainty is solved. Second, with the feature-point positioning uncertainty as a constraint, an objective is built from the back-projection of the stripe calibration points onto the spatial target and the minimization of their distance to the target plane, and the positioning deviation of each feature point is obtained by nonlinear optimization. Finally, after compensation, the corrected points are substituted into the planar-target calibration method to solve a high-precision light-plane equation. The invention effectively compensates the positioning deviations introduced when extracting target feature points and stripe calibration points, and is particularly suited to the accuracy loss caused by poor image quality, defocus, or image noise in complex field environments, improving calibration accuracy.
Detailed description of the invention
Fig. 1 is the flow chart of the structured-light vision sensor high-precision calibration method of the present invention based on the uncertainty model;
Fig. 2 is the calibration schematic of the line-structured-light vision sensor of the present invention;
Fig. 3 is the schematic of the positioning uncertainty of the target feature points and stripe calibration points of the present invention.
Specific embodiment
The basic idea of the invention is: extract initial coordinates of the target feature points and stripe calibration points, and determine the positioning uncertainty of every feature point. With the feature-point positioning uncertainty as a constraint, obtain each feature point's positioning deviation by nonlinear optimization. After compensation, substitute the corrected points into the planar-target calibration method to obtain the three-dimensional coordinates of the stripe calibration points in the camera coordinate system. Reject outliers with the RANSAC method, then solve the light-plane equation accurately by optimization, achieving high-precision calibration of the structured-light vision sensor.
The present invention is further described below, taking as an example a line-structured-light vision sensor composed of one camera and one line laser.
As shown in Fig. 1, the structured-light vision sensor high-precision calibration method of the present invention based on the uncertainty model mainly comprises the following steps:
Step 11: with the line laser switched off, calibrate the camera of the line-structured-light vision sensor.
Calibrating the camera of the vision sensor here means solving the camera's intrinsic parameters; the specific method is the improved Zhang Zhengyou calibration method proposed by Liu Zhen et al., described in detail in [Liu Z, Wu Q, Chen X, et al. High-accuracy calibration of low-cost camera using image disturbance factor [J]. Optics Express, 2016, 24(21): 24321-24336].
Step 12: switch on the laser and place the planar target directly in front of the camera so that the light plane projected by the line laser intersects the planar target; the camera captures planar-target images containing the light stripe.
As shown in Fig. 2, let O_c-x_c y_c z_c be the camera coordinate system and O_t-x_t y_t z_t the target coordinate system; let Y_t be the target plane, Y the laser light plane, L_i the intersection line of Y and Y_t, and l_i the image of L_i. Q_1j, Q_2j, Q_3j denote three luminous points in one column of the target; the intersection of the line through these target points with L_i is used later for solving the light-plane equation. p_1j, p_2j, p_3j are the image points of Q_1j, Q_2j, Q_3j respectively, and the image point of the intersection is defined as the stripe calibration point. The light-plane equation can be expressed as ax + by + cz + d = 0.
Step 13: extract the feature-point image coordinates, i.e. the coordinates of the target feature points and the stripe calibration points.
This specifically includes the following steps:
Step 131: extract the image coordinates of the target feature points in the captured image with a multi-scale extraction method (as described in Liu Zhen, Shang Yanna, "High-accuracy positioning of multi-scale dot-image centers [J]", Optical Precision Engineering, 2013, 21(6): 1586-1591), obtaining the optimal image coordinates of all target feature points in the image.
Step 132: extract the stripe centers using the method described in Steger C., "Unbiased Extraction of Curvilinear Structures from 2D and 3D Images [J]", 1998.
Step 14: compute the positioning uncertainties of the target feature points and stripe calibration points.
Step 141: as shown in Fig. 3, the present invention uses a multi-scale extraction method to obtain both the initial target feature-point positions and their uncertainties. The image is processed with several different Gaussian convolution kernels; after the center point has been extracted repeatedly, the target feature-point positioning uncertainty is obtained statistically, and the coordinate at the best scale is chosen as the initial center coordinate. Specifically, m different Gaussian convolution kernels are applied to the j-th feature point at position i, yielding m target feature-point coordinates p_ij that form the point set P_m (e.g. the red cross nodes among the feature points in Fig. 2). The mean coordinates of the set in the u and v directions and the standard deviations σ_u, σ_v are then computed, and σ_u, σ_v are taken as the positioning uncertainty of the target feature point.
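The multi-scale statistics of step 141 can be sketched in code. The following is a minimal numpy illustration under simplifying assumptions: a synthetic LED spot, a separable Gaussian blur standing in for the patent's Gaussian convolution kernels, and a plain intensity centroid standing in for the multi-scale dot-center extractor of Liu and Shang. The mean over scales gives the initial coordinate and the standard deviations give σ_u, σ_v:

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D Gaussian kernel, truncated at 3 sigma and normalised to sum 1
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def blur(img, sigma):
    # separable Gaussian convolution, same-size output
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, tmp)

def centroid(img):
    # intensity-weighted spot centre, returned as (u, v) = (column, row)
    v, u = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    s = img.sum()
    return np.array([(u * img).sum() / s, (v * img).sum() / s])

def positioning_uncertainty(img, sigmas):
    # locate the same feature at m Gaussian scales, then take the mean
    # coordinate as initial value and the std deviations as (sigma_u, sigma_v)
    pts = np.array([centroid(blur(img, s)) for s in sigmas])
    return pts.mean(axis=0), pts.std(axis=0)

# synthetic LED spot centred at (u, v) = (20.3, 14.7) with mild sensor noise
rng = np.random.default_rng(0)
v, u = np.mgrid[0:32, 0:48]
img = np.exp(-((u - 20.3)**2 + (v - 14.7)**2) / (2.0 * 2.0**2))
img = np.clip(img + 0.002 * rng.standard_normal(img.shape), 0.0, None)
mean_uv, sigma_uv = positioning_uncertainty(img, [1.0, 1.5, 2.0, 2.5])
```

In the patent's pipeline the same statistics would be collected per feature point p_ij over the m kernels; only the locator differs.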
Step 142: during on-site calibration, the extraction of stripe center points is affected by laser quality and power, the target surface material, and ambient illumination, and is easily perturbed by image noise, producing positioning deviations. The present invention provides a method for solving the stripe calibration-point uncertainty along an arbitrary direction. The solution procedure is as follows.
Let the partial derivatives obtained by convolving the image with a Gaussian kernel be g_u, g_v, g_uu, g_uv, g_vv. Computing the Hessian matrix yields the stripe normal vector n(u, v). Let (u_0, v_0) be an image point on the stripe, let the stripe normal direction be denoted (n_u, n_v) with ||(n_u, n_v)|| = 1, and denote the orthogonal (tangent) direction accordingly. The grey-value cross-section of the stripe along the normal can then be expressed as the second-order Taylor expansion
g(u_0 + t n_u, v_0 + t n_v) ≈ g(u_0, v_0) + t (n_u g_u + n_v g_v) + (t²/2)(n_u² g_uu + 2 n_u n_v g_uv + n_v² g_vv).
Setting its first derivative with respect to t to zero gives:
t = −(n_u g_u + n_v g_v) / (n_u² g_uu + 2 n_u n_v g_uv + n_v² g_vv).
Therefore the grey-value maximum, i.e. the stripe center-point coordinate, is (p_u, p_v) = (u_0 + t n_u, v_0 + t n_v).
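For a one-dimensional cross-section, the center formula above reduces to one Newton step t = −h′/h″ on the smoothed grey profile. A small numpy sketch with a hypothetical stripe profile (the 2-D case adds the Hessian-derived normal direction):

```python
import numpy as np

def subpixel_peak_1d(profile, i):
    # extremum of the second-order Taylor expansion around sample i:
    # offset t = -h'(i) / h''(i), the 1-D analogue of the Steger formula
    d1 = (profile[i + 1] - profile[i - 1]) / 2.0   # central first difference
    d2 = profile[i + 1] - 2.0 * profile[i] + profile[i - 1]  # second difference
    return i - d1 / d2

# Gaussian stripe cross-section whose true centre lies between samples
x = np.arange(21, dtype=float)
center_true = 10.35
profile = np.exp(-(x - center_true)**2 / (2.0 * 1.8**2))
i_max = int(np.argmax(profile))
center_est = subpixel_peak_1d(profile, i_max)
```

The estimate lands within a few hundredths of a pixel of the true center, which is why the derivation that follows treats the noise-induced shift of this extremum, not the pixel grid, as the dominant error source.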
Taking the stripe center point (0, 0) of the noise-free image as the origin, a coordinate system o-cr is established with the c axis along the stripe normal (n_u, n_v) and the r axis along the stripe. Let h(c, r) be the grey-value curve in the normal direction, so that solving the uncertainty of (p_u, p_v) reduces to solving the uncertainty of the extremum position c_0 of h(c, r) at r = 0. Let h(c, r) = I(c, r) + N(c, r), where I(c, r) is the ideal image and N(c, r) is image noise with zero mean and variance σ_n². From [C. Steger], the first derivative of the smoothed profile vanishes at (c_0, r_0):
∂h_w/∂c (c_0, r_0) = ∂I_w/∂c (c_0, r_0) + ∂N_w/∂c (c_0, r_0) = 0,
where h_w, I_w, N_w denote the grey distribution, the ideal image, and the image noise after convolution with a Gaussian kernel of variance σ_w², and the relevant quantities are the first- and second-order partial derivatives with respect to the c axis at (c_0, r_0). A Taylor expansion of ∂I_w/∂c is then carried out at (0, 0).
Since ideal image can be expressed asM is gray scale maximum value, σwFor Gaussian kernel, and meetAndIt acquires:
Corresponding varianceFor,
Location of the core variance is obtained by formula 8,9For,
Formula (11) substitution formula (10), which is obtained center point coordinate positioning variances, is,
Here σ_w can be obtained by fitting the grey-value curve along the normal direction with the fittype() function in MATLAB, using a Gaussian fitting prototype. Decomposing the resulting variance into the o-uv image coordinate system yields the positioning uncertainty of the stripe calibration point in the u and v directions.
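Step 142 needs the image-noise variance σ_n², which the summary obtains with a mean-filter method. A minimal numpy sketch under the assumption of i.i.d. Gaussian noise on a locally smooth patch: the residual against a k×k mean filter has variance σ_n²(1 − 1/k²), which can be inverted to recover σ_n:

```python
import numpy as np

def estimate_noise_std(img, k=3):
    # estimate the noise standard deviation from the residual between the
    # image and its k x k mean-filtered version; for i.i.d. noise the
    # residual variance is sigma^2 * (1 - 1/k^2), which is inverted here
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    smooth = np.zeros((h, w))
    for dv in range(k):
        for du in range(k):
            smooth += padded[dv:dv + h, du:du + w]
    smooth /= k * k
    resid = img - smooth
    return resid.std() / np.sqrt(1.0 - 1.0 / (k * k))

# flat patch (locally smooth signal) plus Gaussian noise of known sigma
rng = np.random.default_rng(3)
patch = 100.0 + 2.0 * rng.standard_normal((200, 200))
sigma_n = estimate_noise_std(patch)
```

On image regions with real structure the residual also contains signal, so in practice the estimate would be taken on a background window near the stripe.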
Step 15: based on the positioning uncertainties determined in step 14, with the feature-point uncertainty as constraint, an objective is established that simultaneously minimizes the back-projection error between the target feature-point images and the spatial target and the distance from the back-projections of the stripe calibration points to the fitted target plane; the positioning deviations of the feature points are then obtained by nonlinear optimization.
Step 151: decompose the target feature-point imaging process and establish the transformation equations between each feature point's perspective-projection point, distortion point, and final noisy point.
Let p_u = [u_u, v_u, 1]^T, p_d = [u_d, v_d, 1]^T, and p_n = [u_n, v_n, 1]^T be, respectively, the undistorted point, the distorted point, and the actual imaging point in image coordinates. From the imaging light path, the imaging of a planar target point Q = [x, y, 1]^T decomposes into three processes: first the perspective projection model, second the lens distortion model, third the superposition of image noise. The perspective projection model may be expressed as:
ρ p_u = K [r_1  r_2  t] Q = H_{3×3} Q   (13)
where H_{3×3} is the homography matrix between the target plane and the image plane, ρ is a constant factor, K is the camera intrinsic matrix with principal-point coordinates u_0, v_0 and skew factor γ between the image u and v axes, and r_1, r_2, t are respectively the first two columns of the rotation matrix and the translation vector. The lens distortion model may be expressed as:
[x_d, y_d]^T = (1 + k_1 r² + k_2 r⁴)[x_n, y_n]^T,  r² = x_n² + y_n²   (14)
where k_1, k_2 are the first two radial distortion coefficients of the lens and [x_n, y_n] are the normalized image coordinates. According to practical experience, the first two radial distortion terms describe lens distortion accurately enough to reach high precision. If the image deviations caused by image noise and similar factors are Δu, Δv, the distorted point maps to the actual imaging point as:
p_n = p_d + [Δu, Δv, 0]^T   (15)
At the i-th placement position of the target, let the homogeneous coordinates of the j-th target point in the target coordinate system and in image coordinates be Q_j = [x_j, y_j, 1]^T and p_ij respectively. From formulas (14) and (15), p_ij yields p_u(ij); p_u(ij) and Q_j = [x_j, y_j, 1]^T are then used to solve the matrix H_i through formula (13), with the initial values of Δu_ij and Δv_ij in formula (15) set to 0.
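Formulas (13)-(15) chain a homography projection, two-term radial distortion, and a noise offset. A self-contained numpy sketch of the first two stages, with made-up intrinsics and pose (skew γ = 0 assumed, and distortion applied in normalized coordinates as in formula (14)):

```python
import numpy as np

def project_planar_point(K, r1, r2, t, Q, k1=0.0, k2=0.0):
    # homography projection of a planar target point Q = [x, y, 1]^T
    # (formula 13) followed by two-term radial distortion (formula 14)
    H = K @ np.column_stack([r1, r2, t])
    p = H @ Q
    u_u, v_u = p[0] / p[2], p[1] / p[2]          # undistorted pixel
    x_n = (u_u - K[0, 2]) / K[0, 0]              # normalised image coords
    y_n = (v_u - K[1, 2]) / K[1, 1]
    r2n = x_n**2 + y_n**2
    scale = 1.0 + k1 * r2n + k2 * r2n**2         # radial distortion factor
    u_d = K[0, 2] + K[0, 0] * x_n * scale
    v_d = K[1, 2] + K[1, 1] * y_n * scale
    return np.array([u_d, v_d])

# hypothetical intrinsics and a frontal target pose
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
r1 = np.array([1.0, 0.0, 0.0])
r2 = np.array([0.0, 1.0, 0.0])
t = np.array([0.0, 0.0, 5.0])
Q = np.array([0.5, 0.25, 1.0])
p_ideal = project_planar_point(K, r1, r2, t, Q)
p_dist = project_planar_point(K, r1, r2, t, Q, k1=-0.1, k2=0.01)
```

The noise stage of formula (15) would simply add (Δu, Δv) to the distorted pixel; it is those offsets the optimization of step 152 recovers.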
Step 152: establish the objective that minimizes the back-projection error between the target feature-point images and the spatial target and the distance from the back-projected stripe calibration points to the target plane, and obtain the feature-point positioning deviations by nonlinear optimization.
According to the mapping matrix H_i from the target plane at position i to the image plane, formula (13) projects Q_j to the homogeneous coordinates p_n(ij) of the j-th target feature point in image coordinates. With the constraints that the distance between p_ij and p_n(ij) and the distance to the statistical center point of the image detections are minimal, the first objective function is established as follows:
where D(p_ij, p_n(ij)) denotes the distance between p_ij and p_n(ij), M denotes the number of target placement positions, and N the number of target feature points.
p_ij is back-projected through formula (13) to homogeneous coordinates in the target coordinate system; minimizing the distance between Q_j and this back-projected point, and the distance between the target point and the statistical center of the back-projected points, establishes the second objective function:
The cross-ratio invariance constraint, an important constraint in projective geometry, uses the target's rigidity to optimize the coordinates among image feature points and is widely used in steps such as camera intrinsic calibration. The present invention establishes cross-ratio constraints between the target feature-point image coordinates and the target, giving the third objective function:
where CR_h and CR_v denote the cross ratios in the horizontal and vertical directions, respectively. To strengthen the constraint while keeping computation efficient, any four points in the horizontal direction are taken to compute a cross ratio, giving K combinations in total; likewise any four points in the vertical direction, giving L combinations. The specific values of K and L are determined by the number of points, ensuring that the target feature points uniformly cover the whole image plane.
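The cross-ratio constraint can be checked numerically: for four collinear points the cross ratio survives any one-dimensional projective map. A short numpy sketch with hypothetical point positions:

```python
import numpy as np

def cross_ratio(a, b, c, d):
    # cross ratio of four collinear points given by scalar positions
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

# cross ratio is preserved by any 1-D projective map x -> (px + q)/(rx + s)
pts = np.array([0.0, 1.0, 3.0, 7.0])
proj = (2.0 * pts + 1.0) / (0.3 * pts + 1.0)
cr_before = cross_ratio(*pts)
cr_after = cross_ratio(*proj)
```

Because the camera projection restricted to a target line is exactly such a map, CR_h and CR_v computed from the (deviation-corrected) image points should match the known cross ratios of the target grid, which is what the third objective penalizes.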
Let the forward projection of each stripe calibration point intersect the target at a spatial three-dimensional point, which, together with p_ij, is projected to homogeneous coordinates under the target coordinate system. With the constraint that the point-to-plane distance from these points to the fitted target plane is minimal, the fourth objective function is established as follows:
where F_i denotes the plane equation fitted to the target points at the i-th placement position, and the distance term measures the distance between a point and that plane.
Meanwhile, a straight line is fitted through the k-th column of target feature points and another through the stripe; minimizing the distance between their intersection and the corresponding stripe calibration point establishes the fifth objective function as follows:
Combining the five objective functions gives:
E(a) = e_1 + e_2 + e_3 + e_4 + e_5   (21)
For Δu_ij, Δv_ij and the stripe calibration-point deviations, an optimization range constraint is added, as shown in formula (22):
|Δu_ij| ≤ n σ_u(ij),  |Δv_ij| ≤ n σ_v(ij)   (22)
and likewise for the stripe calibration-point deviations, where σ_u(ij) and σ_v(ij) are the positioning uncertainties in the image of the j-th feature point at the i-th target placement position, and the corresponding quantities for the k-th stripe calibration point at target position i are its positioning uncertainties. n is a non-zero scale coefficient; based on extensive repeated experiments, the present invention sets n = 9.
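The effect of the box constraint (22) can be illustrated on a toy problem. The sketch below estimates per-point deviations with projected gradient descent standing in for the Levenberg-Marquardt solver, and a single reprojection residual in place of the full E(a) = e_1 + … + e_5; deviations larger than n·σ are clipped to the uncertainty bound (n = 9 as in the patent):

```python
import numpy as np

def solve_deviations(p_obs, p_proj, sigma, n=9, iters=200, lr=0.25):
    # estimate per-point deviations d = (du, dv) minimising the residual
    # ||(p_obs - d) - p_proj||^2 subject to the box constraint |d| <= n*sigma
    # (formula 22); projected gradient descent replaces Levenberg-Marquardt
    d = np.zeros_like(p_obs)
    for _ in range(iters):
        grad = -2.0 * (p_obs - d - p_proj)         # gradient w.r.t. d
        d = np.clip(d - lr * grad, -n * sigma, n * sigma)
    return d

# hypothetical feature points: small deviations plus one gross error
p_true = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
dev_true = np.array([[0.40, -0.30], [2.00, 0.10], [0.05, 0.20]])
sigma = np.full((3, 2), 0.05)                      # n * sigma = 0.45 px bound
p_obs = p_true + dev_true
d = solve_deviations(p_obs, p_true, sigma)
p_comp = p_obs - d                                 # compensated coordinates
```

Deviations within the uncertainty band are recovered and compensated fully; the gross 2-pixel error is capped at the 0.45-pixel bound, which is exactly the role the constraint plays in keeping the optimization from chasing outliers.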
Step 16: obtain each feature point's positioning deviation from the nonlinear optimization of step 15; after compensation, the accurate feature-point location coordinates are obtained, and undistorted coordinates are obtained by removing distortion.
Step 17: map the stripe calibration points into three-dimensional space through the homography determined by the target feature points, obtaining the list of three-dimensional light-plane points. Reject gross-error points with RANSAC, then obtain an initial light plane by least squares. Finally, obtain the maximum-likelihood light-plane solution by nonlinear optimization.
Let the target feature-point coordinates after distortion removal, the stripe calibration-point coordinates after distortion removal, the stripe calibration-point coordinates in the target coordinate system, and the three-dimensional coordinates in the camera frame of the intersection points of the light plane with the target be denoted accordingly, with Q_ij the target feature-point coordinates and H_i the homography matrix between the target plane and the image plane. Then image points and target points satisfy s p = H_i q, where s is a nonzero coefficient. From the invertibility of the homography mapping, the stripe calibration points can be mapped back into the target plane.
Decomposing H_i into its rotation-matrix columns and translation vector then gives the three-dimensional point coordinates in the camera frame.
Let the light-plane equation be expressed as ax + by + cz + d = 0. Substituting the three-dimensional points into RANSAC-constrained least squares gives the initial light-plane solution. Based on the constraint that the distance from the points on the light plane to the plane is minimal, the following objective function can be established:
where a, b, c, d are the four coefficients of the light-plane equation, x_ik = [x_ik, y_ik, z_ik, 1] denotes the k-th calibration point on the stripe at the i-th target position, obtained as the intersection of the light plane with the target, S denotes the number of target placements, and M the number of stripe calibration points obtained at each position. The optimal solution of a, b, c, d is obtained by maximum-likelihood estimation through nonlinear optimization.
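Steps (d1)-(d3), RANSAC outlier rejection followed by a least-squares plane fit, can be sketched with numpy alone. This is a minimal illustration on synthetic coplanar calibration points with two injected gross errors; total least squares via SVD stands in for the patent's fit, and the final Levenberg-Marquardt refinement is omitted:

```python
import numpy as np

def fit_plane_lsq(pts):
    # total least-squares plane: normal = singular vector of the smallest
    # singular value of the centred point cloud; returns (a, b, c, d)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                      # unit normal
    return np.append(normal, -normal @ centroid)

def ransac_plane(pts, thresh=0.05, iters=100, seed=0):
    # RANSAC outlier rejection before the final least-squares fit
    rng = np.random.default_rng(seed)
    best_mask = None
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        plane = fit_plane_lsq(sample)
        mask = np.abs(pts @ plane[:3] + plane[3]) < thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    return fit_plane_lsq(pts[best_mask]), best_mask

# calibration points on the plane z = 0.5x + 0.2y + 1, plus gross outliers
rng = np.random.default_rng(2)
xy = rng.uniform(-1, 1, size=(40, 2))
z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + 1.0
pts = np.column_stack([xy, z])
pts[::20] += np.array([0.0, 0.0, 3.0])   # inject outliers at rows 0 and 20
plane, inliers = ransac_plane(pts)
```

The refit on the consensus set plays the role of the initial light-plane solution; the patent then refines it by minimizing the point-to-plane distances with Levenberg-Marquardt.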

Claims (6)

1. A high-precision calibration method for a line-structured-light vision sensor based on calibration-point image compensation, characterized in that the implementation steps are as follows:
Step a, with the laser switched off, calibrate the camera of the line-structured-light vision sensor;
Step b, using a planar metal target inlaid with LED luminous feature points, capture with the calibrated camera images of the planar metal target onto which the laser stripe has been projected; compute the image coordinates of all target LED feature points and the initial coordinates of all stripe calibration points;
Step c, solve the uncertainties of the target feature points and stripe calibration points, use the uncertainties to solve the positioning deviations of the target feature points and stripe calibration points, and compensate them to obtain the feature-point coordinates;
Step d, move the planar metal target two or more times, obtain the three-dimensional coordinates of all stripe calibration points in the camera frame, reject outliers, fit the remaining points to solve the light-plane equation, and complete the calibration of the line-structured-light vision sensor.
2. The high-precision calibration method for a line-structured light vision sensor with calibration-point image compensation according to claim 1, characterized in that: in step a, with the laser switched off, the camera in the line-structured light vision sensor is calibrated to obtain the camera intrinsic parameters and the second-order radial distortion coefficients of the lens.
3. The high-precision calibration method for a line-structured light vision sensor with calibration-point image compensation according to claim 1, characterized in that: in step b, an image of the laser stripe intersecting the planar target is captured, the image coordinates of the target feature points are computed, and the initial image coordinates of the stripe calibration points are computed from them, as follows:
(b1) performing distortion correction on the captured image using the second-order lens distortion coefficients obtained by calibration;
(b2) adjusting the planar metal target so that it intersects the light plane in space, while ensuring that the laser stripe does not pass through the target feature points;
(b3) extracting the image coordinates of the target feature points with a multi-scale spot-center method as the initial target feature-point coordinates; extracting the stripe-center points, fitting a straight line through the column of target feature points perpendicular to the stripe direction, and solving the intersection of this line with the stripe as the initial stripe-calibration-point coordinates.
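A minimal sketch of the intersection computation in step (b3), assuming total-least-squares line fits over 2-D image points; the function names and point format are illustrative assumptions:

```python
import numpy as np

def fit_line_tls(pts):
    """Total-least-squares 2-D line fit: returns (n, c) with n . x = c."""
    pts = np.asarray(pts, dtype=float)
    ctr = pts.mean(axis=0)
    d = np.linalg.svd(pts - ctr)[2][0]    # principal (line) direction
    n = np.array([-d[1], d[0]])           # unit normal to the line
    return n, n @ ctr

def stripe_calibration_point(column_pts, stripe_pts):
    """Intersect the feature-point column line with the stripe-center line
    to obtain the initial stripe-calibration-point coordinate."""
    n1, c1 = fit_line_tls(column_pts)
    n2, c2 = fit_line_tls(stripe_pts)
    return np.linalg.solve(np.vstack([n1, n2]), np.array([c1, c2]))
```

Total least squares is used rather than ordinary regression so that near-vertical feature-point columns are handled without special cases.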
4. The high-precision calibration method for a line-structured light vision sensor with calibration-point image compensation according to claim 1, characterized in that: in step c, the localization uncertainties of the target feature points and the stripe calibration points are solved as follows:
(c1) processing the calibration image with multiple Gaussian convolution kernels, solving multiple localization coordinates for each feature point, and obtaining the localization uncertainty of each feature point by statistics;
(c2) establishing a mathematical model of the localization uncertainty for the local image of each stripe calibration point, and solving the image noise to obtain the localization uncertainty of each stripe calibration point.
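Step (c1) can be sketched as follows: the same local patch is smoothed with several Gaussian kernels, the spot centroid is located in each smoothed image, and the scatter of the estimates serves as the localization uncertainty. The kernel sigmas, zero-padded borders, and function names are illustrative assumptions:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing with zero-padded borders."""
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    tmp = np.apply_along_axis(np.convolve, 0, img, k, mode='same')
    return np.apply_along_axis(np.convolve, 1, tmp, k, mode='same')

def locate_with_uncertainty(patch, sigmas=(0.8, 1.2, 1.6, 2.0)):
    """Locate the spot centre under several Gaussian kernels; the scatter of
    the per-kernel centroids is taken as the localization uncertainty."""
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    locs = []
    for s in sigmas:
        g = gaussian_blur(patch, s)
        w = g.sum()
        locs.append([(xs * g).sum() / w, (ys * g).sum() / w])
    locs = np.array(locs)
    return locs.mean(axis=0), locs.std(axis=0)
```

A well-exposed, symmetric spot yields nearly the same centroid at every scale (small uncertainty); noise or asymmetry spreads the per-kernel estimates apart.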
5. The high-precision calibration method for a line-structured light vision sensor with calibration-point image compensation according to claim 1, characterized in that: in step c, the coordinates of the target feature points and the stripe calibration points are compensated under the constraint of the feature-point localization uncertainties:
(c1) with the uncertainties of the target feature points and the stripe localization points as constraints, solving the localization deviations of the target feature points and the stripe calibration points by nonlinear optimization;
(c2) compensating the coordinates of the target feature points and the stripe calibration points with the localization deviations.
6. The high-precision calibration method for a line-structured light vision sensor with calibration-point image compensation according to claim 1, characterized in that: in step d, the line-structured light vision sensor is calibrated as follows:
(d1) based on the compensated stripe-calibration-point coordinates, obtaining the three-dimensional coordinates of the stripe calibration points in the camera coordinate frame by the planar-target calibration method, using the camera intrinsic parameters obtained by calibration;
(d2) after the three-dimensional coordinates of all stripe calibration points in camera coordinates have been obtained, rejecting outlier points with the RANSAC method and then fitting to obtain the initial light-plane equation;
(d3) from the initial light-plane equation, obtaining the optimal light-plane equation with the Levenberg-Marquardt nonlinear optimization method, completing the calibration of the line-structured light vision sensor.
CN201811619300.2A 2018-12-28 2018-12-28 High-precision calibration method for line-structured light vision sensor for calibration point image compensation Active CN109827502B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811619300.2A CN109827502B (en) 2018-12-28 2018-12-28 High-precision calibration method for line-structured light vision sensor for calibration point image compensation


Publications (2)

Publication Number Publication Date
CN109827502A true CN109827502A (en) 2019-05-31
CN109827502B CN109827502B (en) 2020-03-17

Family

ID=66861331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811619300.2A Active CN109827502B (en) 2018-12-28 2018-12-28 High-precision calibration method for line-structured light vision sensor for calibration point image compensation

Country Status (1)

Country Link
CN (1) CN109827502B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030007159A1 (en) * 2001-06-27 2003-01-09 Franke Ernest A. Non-contact apparatus and method for measuring surface profile
CN1566906A (en) * 2003-06-11 2005-01-19 北京航空航天大学 Construction optical visual sense transducer calibration method based on plane targets
CN101943563A (en) * 2010-03-26 2011-01-12 天津大学 Rapid calibration method of line-structured light vision sensor based on space plane restriction
CN104848801A (en) * 2015-06-05 2015-08-19 北京航空航天大学 Line structure light vision sensor calibration method based on parallel bicylindrical target
CN106705849A (en) * 2017-01-25 2017-05-24 上海新时达电气股份有限公司 Calibration method of linear-structure optical sensor
CN107218904A (en) * 2017-07-14 2017-09-29 北京航空航天大学 A kind of line structured light vision sensor calibration method based on sawtooth target
CN107255443A (en) * 2017-07-14 2017-10-17 北京航空航天大学 Binocular vision sensor field calibration method and device under a kind of complex environment


Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
ZHEN LIU et al.: "On-site calibration of line-structured light vision sensor in complex light environments", OPTICS EXPRESS *
ZHENZHONG WEI et al.: "Calibration Method for Line Structured Light Vision Sensor Based on Vanish Points and Lines", 2010 INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION *
ZHENZHONG WEI et al.: "Line structured light vision sensor calibration using parallel straight lines features", OPTIK *
LIU Chong et al.: "On-site calibration of a large-field-of-view line-structured light vision sensor", Opto-Electronic Engineering *
LIU Ke et al.: "Uncertainty estimation for line-structured light sensor calibration", Opto-Electronic Engineering *
LIU Zhen et al.: "A high-precision on-site calibration method for line-structured light vision sensors", Acta Optica Sinica *
LIN Na et al.: "A new calibration method for line-structured light vision sensors based on a robot system", Transducer and Microsystem Technologies *
KUANG Yongcong, CUI Liangchun: "A new calibration method for line-structured light vision sensors based on a line scale", Journal of South China University of Technology (Natural Science Edition) *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110942434A (en) * 2019-11-22 2020-03-31 华兴源创(成都)科技有限公司 Display compensation system and method of display panel
CN110942434B (en) * 2019-11-22 2023-05-05 华兴源创(成都)科技有限公司 Display compensation system and method of display panel
CN111311686A (en) * 2020-01-15 2020-06-19 浙江大学 Projector out-of-focus correction method based on edge perception
CN111311686B (en) * 2020-01-15 2023-05-02 浙江大学 Projector defocus correction method based on edge perception
CN111275770A (en) * 2020-01-20 2020-06-12 南昌航空大学 Global calibration method of four-eye stereoscopic vision system based on one-dimensional target rotation motion
CN111207670A (en) * 2020-02-27 2020-05-29 河海大学常州校区 Line structured light calibration device and method
CN112229420A (en) * 2020-08-31 2021-01-15 南京航空航天大学 Line laser calibration method for aircraft skin butt seam measurement
WO2022088039A1 (en) * 2020-10-30 2022-05-05 Harman International Industries, Incorporated Unified calibration between dvs and camera
CN112484746A (en) * 2020-11-26 2021-03-12 上海电力大学 Monocular vision-assisted laser radar odometer method based on ground plane
CN112767492A (en) * 2020-12-25 2021-05-07 江苏集萃智能光电系统研究所有限公司 Railway wheel set size detection device and calibration method thereof
CN113155057A (en) * 2021-03-16 2021-07-23 广西大学 Line structured light plane calibration method using non-purpose-made target
CN113639633A (en) * 2021-07-26 2021-11-12 中国航空工业集团公司北京航空精密机械研究所 Method for aligning angular zero position of clamp in multi-axis vision measuring device
CN113639633B (en) * 2021-07-26 2023-07-07 中国航空工业集团公司北京航空精密机械研究所 Clamp angular zero alignment method in multi-axis vision measurement device

Also Published As

Publication number Publication date
CN109827502B (en) 2020-03-17

Similar Documents

Publication Publication Date Title
CN109827502A (en) A kind of line structured light vision sensor high-precision calibrating method of calibration point image compensation
CN111414798B (en) Head posture detection method and system based on RGB-D image
CN105956539B (en) A kind of Human Height measurement method of application background modeling and Binocular Vision Principle
CN103971353B (en) Splicing method for measuring image data with large forgings assisted by lasers
CN104484648B (en) Robot variable visual angle obstacle detection method based on outline identification
CN107255443A (en) Binocular vision sensor field calibration method and device under a kind of complex environment
CN109598762A (en) A kind of high-precision binocular camera scaling method
CN105894574B (en) A kind of binocular three-dimensional reconstruction method
CN108986070B (en) Rock crack propagation experiment monitoring method based on high-speed video measurement
CN106705849B (en) Calibrating Technique For The Light-strip Sensors
CN103578088B (en) A kind of starry sky image processing method
CN110332887A (en) A kind of monocular vision pose measurement system and method based on characteristic light punctuate
CN111563878B (en) Space target positioning method
CN108662987B (en) Calibration method of 2D camera type laser measuring head
CN104266608B (en) Field calibration device for visual sensor and calibration method
CN103278138A (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN106996748A (en) A kind of wheel footpath measuring method based on binocular vision
CN107356202A (en) A kind of laser scanning measurement system target sights method automatically
CN110060304B (en) Method for acquiring three-dimensional information of organism
CN108491810A (en) Vehicle limit for height method and system based on background modeling and binocular vision
CN107595388A (en) A kind of near infrared binocular visual stereoscopic matching process based on witch ball mark point
CN109631912A (en) A kind of deep space spherical object passive ranging method
CN107869954A (en) A kind of binocular vision volume weight measuring system and its implementation
CN109727291A (en) A kind of high-precision online calibration method of zoom camera
CN110223355A (en) A kind of feature mark poiX matching process based on dual epipolar-line constraint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant