CN106705849A - Calibration method of linear-structure optical sensor - Google Patents

Calibration method of linear-structure optical sensor

Info

Publication number
CN106705849A
CN106705849A (application CN201710056155.0A)
Authority
CN
China
Prior art keywords
preset number
characteristic point
light
line
video camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710056155.0A
Other languages
Chinese (zh)
Other versions
CN106705849B (en)
Inventor
汤宜军
张慧
陆娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Step Electric Corp
Original Assignee
Shanghai Step Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Step Electric Corp filed Critical Shanghai Step Electric Corp
Priority to CN201710056155.0A priority Critical patent/CN106705849B/en
Publication of CN106705849A publication Critical patent/CN106705849A/en
Application granted granted Critical
Publication of CN106705849B publication Critical patent/CN106705849B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to the technical field of line-structured light sensors and discloses a calibration method of a line-structured light sensor. The calibration method comprises the following steps: acquiring N 3D (three-dimensional) target images with a line-structured light sensor to be calibrated, wherein each 3D target image contains at least a first preset number of non-coplanar feature points and two structured-light stripes, and the structured-light stripes contain at least a second preset number of non-coplanar feature points; selecting the first preset number of non-coplanar feature points from the N 3D target images; calibrating a camera according to the selected first preset number of non-coplanar feature points; selecting the second preset number of non-coplanar feature points from the structured-light stripes of the N 3D target images; and calibrating the line-structured light plane according to the selected second preset number of non-coplanar feature points and the camera calibration result. With this calibration method, the camera and the structured-light plane of the line-structured light sensor are calibrated jointly, which makes it easier to improve calibration accuracy and simplifies the calibration process.

Description

Calibration Method of a Line-Structured Light Sensor
Technical field
The present invention relates to the technical field of line-structured light sensors, and more particularly to a calibration method of a line-structured light sensor.
Background technology
Calibration of a line-structured light sensor is an indispensable step for surface digitization and profile measurement, and accurate calibration of the structured-light parameters is a prerequisite for accurate measurement. The structured-light method is an active optical measurement technique: a controllable light point, light stripe, or light plane is projected from a structured-light projector onto the surface of the measured object, an image is acquired with an image sensor (such as a camera), and the three-dimensional coordinates of the object are computed from the geometry of the system using the triangulation principle.
A line-structured light generating device comprises a laser projector and a camera. The mathematical model of line-structured light consists of: 1. the perspective projection model of the camera; 2. the structured-light plane equation. The contents to be calibrated are the camera intrinsic parameters (focal length, principal point, distortion coefficients) and the structural parameters of the line-structured light sensor (the light-plane equation). Calibration of the structural parameters mainly refers to determining the position of the light plane relative to the camera.
As is well known, two calibration processes must be carried out before a robot can use a line-structured light sensor: calibration of the line-structured light sensor and hand-eye calibration. Hand-eye calibration is the process of computing the spatial relationship between the sensor system and the robot system; line-structured light sensor calibration is the process of computing the spatial relationship between the camera inside the sensor and the structured-light plane. A calibration method has two important parts: the calibration algorithm and the auxiliary calibration target.
In recent years, more and more calibration methods for line-laser sensors have appeared, and both the calibration algorithms and the auxiliary calibration devices have improved greatly. However, every calibration method has, to a greater or lesser extent, its own design defects, such as: the camera and the structured-light plane are calibrated separately, the calibration accuracy is poor, an external tooling platform is required, manual participation is needed, and the calibration process is cumbersome.
The inventors found, in the course of implementing the present invention, that in the calibration methods for line-structured light sensors provided by the prior art, the camera and the structured-light plane are generally calibrated separately; such calibration methods have low accuracy and a cumbersome calibration process.
Summary of the invention
The purpose of embodiments of the present invention is to provide a calibration method of a line-structured light sensor, so that the camera and the structured-light plane in the line-structured light sensor are calibrated jointly, which is conducive to improving calibration accuracy and simplifying the calibration process.
To solve the above technical problem, embodiments of the present invention provide a calibration method of a line-structured light sensor, comprising: acquiring N 3D target images with the line-structured light sensor to be calibrated, wherein each 3D target image contains at least a first preset number of non-coplanar feature points and two structured-light stripes, the structured-light stripes contain at least a second preset number of non-coplanar feature points, and N is a positive integer; selecting the first preset number of non-coplanar feature points from the N 3D target images; calibrating the camera according to the selected first preset number of non-coplanar feature points; selecting the second preset number of non-coplanar feature points from the structured-light stripes of the N 3D target images; and completing the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the camera calibration result.
Compared with the prior art, embodiments of the present invention calibrate the camera according to the first preset number of non-coplanar feature points selected from the N 3D target images, and calibrate the line-structured light plane according to the second preset number of non-coplanar feature points selected from the structured-light stripes of the N 3D target images together with the camera calibration result. The camera and the structured-light plane in the line-structured light sensor are therefore calibrated jointly, which is conducive to improving calibration accuracy and simplifying the calibration process.
In addition, calibrating the camera according to the selected first preset number of non-coplanar feature points specifically includes: using the first preset number of non-coplanar feature points to solve the homography matrix between the stereo target coordinate system and the image pixel coordinate system, thereby calibrating the camera. This provides a feasible way to calibrate the camera.
In addition, completing the calibration of the structured-light plane according to the selected second preset number of non-coplanar feature points and the camera calibration result specifically includes: using the second preset number of non-coplanar feature points and the camera calibration result to compute the coordinates, in the camera coordinate system, of the feature points on the line-structured light plane, establishing the linear equation of the line-structured light plane, and completing the calibration of the plane equation by the least-squares method. This makes the calibration of the line-structured light plane more convenient and feasible.
In addition, when the N 3D target images are acquired with the line-structured light sensor to be calibrated, the acquisition angle of each 3D target image is different. This improves the reliability of the calibration of the line-structured light sensor.
Brief description of the drawings
Fig. 1 is a flow chart of the calibration method of a line-structured light sensor according to the first embodiment of the present invention;
Fig. 2 is a schematic diagram of the geometric relationships of the line-structured light sensor model in the first embodiment of the present invention;
Fig. 3 is a flow chart of the calibration method of a line-structured light sensor according to the second embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are explained in detail below with reference to the accompanying drawings. However, those skilled in the art will understand that many technical details are set forth in the embodiments so that the reader may better understand the application; the technical solutions claimed by the application can nevertheless be realized even without these technical details and with various changes and modifications based on the following embodiments.
The first embodiment of the present invention relates to a calibration method of a line-structured light sensor which, as shown in Fig. 1, includes:
Step 101: Acquire N 3D target images with the line-structured light sensor to be calibrated. Each 3D target image contains at least a first preset number of non-coplanar feature points and two structured-light stripes, and the two structured-light stripes contain at least a second preset number of non-coplanar feature points. The number N of acquired 3D target images is a positive integer; acquiring an appropriate number of 3D target images helps improve calibration efficiency and calibration accuracy. In one example, N takes a value in [6, 9], for example N = 6.
In the embodiment of the present invention, when the N 3D target images are acquired with the line-structured light sensor to be calibrated, the acquisition angle of each 3D target image may differ; for example, the 3D target images may be acquired under different poses. Because the 3D target images acquired under different poses have different angles, the collected images of different angles are not limited by the depth of field and specific position of any single pose, so the reliability of the calibration of the line-structured light sensor can be improved.
Step 102: Select the first preset number of non-coplanar feature points from the N 3D target images. The first preset number is determined by the mathematical model of the camera calibration algorithm; with the camera calibration algorithm of the present embodiment (see below), the first preset number is 6.
Step 103: Calibrate the camera according to the selected first preset number of non-coplanar feature points. Specifically, the homography matrix between the stereo target coordinate system and the image pixel coordinate system is solved from the selected first preset number of non-coplanar feature points, thereby calibrating the camera.
Step 104: Select the second preset number of non-coplanar feature points from the structured-light stripes of the N 3D target images. The second preset number may be 5.
Step 105: Complete the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the camera calibration result. Specifically, the coordinates, in the camera coordinate system, of the feature points on the line-structured light plane are computed from the second preset number of non-coplanar feature points and the camera calibration result, the linear equation of the line-structured light plane is established, and the calibration of the light-plane equation is completed by the least-squares method.
The calibration algorithm of the present embodiment is described in detail below with reference to Fig. 2.
Fig. 2 is a schematic diagram of the geometric relationships of the line-structured light sensor model. In Fig. 2, O_w x_w y_w z_w is the world coordinate system, O_u x_u y_u is the image coordinate system, and O_c x_c y_c z_c is the camera coordinate system. As shown in Fig. 2, the line-structured light sensor model includes the camera 201, the image plane 202, the line-structured light sensor 203, the structured-light projector 204, the structured-light plane 205, and the measured workpiece 206.
Suppose there is a point P on the structured-light plane 205 whose projection in the image coordinate system is p. Its homogeneous coordinates in the image coordinate system are p~ = [u, v, 1]^T, and its homogeneous coordinates in the world coordinate system are P~ = [x_w, y_w, z_w, 1]^T. From the camera pinhole imaging model:

κ p~ = A [R t] P~ = M P~   (1)

where κ is a proportionality constant; [R t] is the extrinsic parameter matrix of the camera, describing the rotation matrix and translation vector from the world coordinate system to the camera coordinate system; A = [f_x γ u_0; 0 f_y v_0; 0 0 1] is the intrinsic matrix of the camera, in which (u_0, v_0) are the pixel coordinates of the principal point, f_x and f_y are the equivalent focal lengths along the x and y axes, and γ is the skew factor of the imaging plane; M is the projection matrix of the camera.
Meanwhile, the point P lies on the structured-light plane and satisfies the light-plane equation. Let the pixel coordinates of P be (u, v); the equation of the structured-light plane in the camera coordinate system is:

A_l x_c + B_l y_c + C_l z_c = D_l   (2)

where A_l, B_l, C_l, D_l are the four coefficients of the structured-light plane.
The point P is then converted from the structured-light plane to its coordinates (x_c, y_c, z_c) in the camera coordinate system O_c x_c y_c z_c as follows. Calibration yields the camera intrinsic matrix A, the distortion coefficients k_1, k_2, and the light-plane equation A_l x_c + B_l y_c + C_l z_c = D_l. The pixel (u, v) is back-projected through A into a viewing ray in the camera coordinate system, and intersecting this ray with the plane A_l x_c + B_l y_c + C_l z_c = D_l yields (x_c, y_c, z_c).
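As an illustration of this conversion, the following Python/NumPy sketch intersects the back-projected ray of a pixel with the calibrated light plane. It is a minimal sketch that neglects lens distortion, and the numeric intrinsics and plane coefficients in the example are assumed values, not taken from the patent.

```python
import numpy as np

def pixel_to_camera_point(u, v, A, plane):
    """Intersect the back-projected camera ray of pixel (u, v) with the light plane
    Al*xc + Bl*yc + Cl*zc = Dl, returning (xc, yc, zc) in camera coordinates.
    Lens distortion is neglected here for brevity."""
    Al, Bl, Cl, Dl = plane
    # Normalized image coordinates of the pixel: [x, y, 1]^T = A^-1 [u, v, 1]^T
    x, y, _ = np.linalg.solve(A, np.array([u, v, 1.0]))
    # Along the ray (xc, yc, zc) = zc * (x, y, 1); substitute into the plane equation.
    zc = Dl / (Al * x + Bl * y + Cl)
    return np.array([x * zc, y * zc, zc])

# Example with assumed intrinsics and plane coefficients:
A = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
plane = (0.0, 0.5, 1.0, 400.0)          # Al, Bl, Cl, Dl
print(pixel_to_camera_point(350.0, 260.0, A, plane))
```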
The process of solving the camera calibration from the selected non-coplanar feature points is as follows.
Denote the homography (projection) matrix M = A [R t]. From (1) we obtain:

κ [u, v, 1]^T = M [x_w, y_w, z_w, 1]^T   (4)

Clearly the 3 × 4 matrix M is determined only up to a scale factor. Write M = [m_1 m_2 m_3 m_4], where m_i = [m_i1, m_i2, m_i3]^T is the i-th column of M, and R = [r_1 r_2 r_3], where r_i is the i-th column of R. From (4):

[m_1 m_2 m_3 m_4] = λ A [r_1 r_2 r_3 t]   (5)

where λ is an arbitrary scalar.
A camera system consists of an imaging chip and an optical lens and generally exhibits both radial and tangential distortion, of which radial distortion dominates.
Let (x, y) be the ideal normalized image coordinates of a projected point in the absence of distortion, and (x', y') the actual normalized image coordinates in the presence of distortion. Keeping only the radial terms:

x' = x (1 + k_1 r^2 + k_2 r^4), y' = y (1 + k_1 r^2 + k_2 r^4), with r^2 = x^2 + y^2

where k_1 and k_2 are the first- and second-order radial distortion coefficients. From (1), the actual pixel coordinates (u', v') then follow as [u', v', 1]^T = A [x', y', 1]^T.
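A minimal Python/NumPy sketch of this radial distortion model, assuming only the two coefficients k_1, k_2 mentioned in the text; the numeric values in the example are assumptions used purely for illustration.

```python
import numpy as np

def distort_normalized(x, y, k1, k2):
    """Apply the radial distortion model to ideal normalized image coordinates (x, y),
    returning the distorted normalized coordinates."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def to_pixel(x, y, A):
    """Map (possibly distorted) normalized coordinates to pixel coordinates via the intrinsic matrix A."""
    u, v, w = A @ np.array([x, y, 1.0])
    return u / w, v / w

# Example with assumed intrinsics and distortion coefficients:
A = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
xd, yd = distort_normalized(0.1, -0.05, k1=-0.2, k2=0.05)
print(to_pixel(xd, yd, A))
```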
In the embodiment of the present invention, P_i = (X_i, Y_i, Z_i)^T and p_i = (u_i, v_i)^T denote, respectively, a target point in the world coordinate system and the corresponding point in the image coordinate system, with homogeneous coordinates P~_i and p~_i. In practice, noise in the extraction of the image points means that P_i and p_i do not satisfy equation (1) exactly. Assuming that each image point is corrupted by Gaussian noise with zero mean and covariance matrix Λ_pi, the maximum-likelihood estimate of the projection matrix M is obtained by minimizing the function

Σ_i (p_i − p̂_i)^T Λ_pi^{-1} (p_i − p̂_i), where p̂_i = (1 / (m̄_3^T P~_i)) [m̄_1^T P~_i, m̄_2^T P~_i]^T

and m̄_j^T is the j-th row of the matrix M.
In practical applications, if the extraction of each point is independent, one can simply assume Λ_pi = σ^2 I for all i. In this case the solution of the projection matrix M becomes the nonlinear least-squares problem min_M Σ_i ||p_i − p̂_i||^2. The nonlinear minimisation can be solved using the Levenberg-Marquardt algorithm, which requires an initial value; the initial value can be obtained by the following method.
Without loss of generality, set m_43 = 1 in (5), i.e. fix the last element of M. Equation (4) then becomes a linear system in the remaining 11 unknown elements of M: each point contributes two linear equations, one for u and one for v. Assuming there are n points, the system of equations is written as L_{2n×11} X = K_{2n×1}, where X collects the 11 unknowns. As long as n ≥ 6, X can be solved, and X = (L^T L)^{-1} L^T K, which yields the initial value of the projection matrix M.
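A Python/NumPy sketch of this linear initialization. The 3 × 4 projection matrix is stored row-major with its last element fixed to 1 (the element the text denotes m_43), and the synthetic example at the end uses an assumed projection matrix only to check that the routine reproduces it; none of the values come from the patent.

```python
import numpy as np

def initial_projection_matrix(world_pts, pixel_pts):
    """Linear initial estimate of the 3x4 projection matrix with its last element
    fixed to 1, solved as X = (L^T L)^-1 L^T K. Needs n >= 6 non-coplanar points
    (the first preset number). world_pts: (n, 3), pixel_pts: (n, 2)."""
    n = len(world_pts)
    L = np.zeros((2 * n, 11))
    K = np.zeros(2 * n)
    for i, ((X, Y, Z), (u, v)) in enumerate(zip(world_pts, pixel_pts)):
        # u-equation: row1 . P - u * (row3 . P) = 0, with row3's last element fixed to 1
        L[2 * i] = [X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z]
        K[2 * i] = u
        # v-equation
        L[2 * i + 1] = [0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z]
        K[2 * i + 1] = v
    x, *_ = np.linalg.lstsq(L, K, rcond=None)      # equivalent to (L^T L)^-1 L^T K
    M = np.append(x, 1.0).reshape(3, 4)            # rows of the projection matrix, last element 1
    return M

# Synthetic check with an assumed projection matrix (last element already 1):
rng = np.random.default_rng(0)
M_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 1.0]])
Pw = rng.uniform(-1, 1, (8, 3)) + np.array([0.0, 0.0, 5.0])   # points in front of the camera
ph = (M_true @ np.c_[Pw, np.ones(8)].T).T
px = ph[:, :2] / ph[:, 2:3]
print(initial_projection_matrix(Pw, px))                       # ≈ M_true
```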
The procedure for solving the calibration parameters A and [R t] is as follows.
From (5):

m_1 = λ A r_1, m_2 = λ A r_2, m_3 = λ A r_3, m_4 = λ A t   (6)
Since R is an orthogonal matrix, its columns satisfy r_i^T r_j = 0 for i ≠ j and ||r_1|| = ||r_2|| = ||r_3||. From (6), r_i = (1/λ) A^{-1} m_i, so these constraints become

m_i^T A^{-T} A^{-1} m_j = 0 (i ≠ j), and m_1^T A^{-T} A^{-1} m_1 = m_2^T A^{-T} A^{-1} m_2 = m_3^T A^{-T} A^{-1} m_3.

Let B = A^{-T} A^{-1}. B is a symmetric matrix; define the vector b = [B_11 B_12 B_22 B_13 B_23 B_33]^T. Each of the above constraints is then linear in b and can be written in the form

v^T b = 0   (7)

where v is a 6-vector formed from the elements of the corresponding columns m_i and m_j.
Assuming that n target images are acquired, stacking the n groups of constraint equations of the form (7) gives the system:

V b = 0   (8)

where V is a 5n × 6 matrix. If n ≥ 2, a unique solution b (up to scale) can be obtained. If n = 1, the skew factor of the imaging plane can be set to γ = 0, i.e. the additional constraint [0 1 0 0 0 0] b = 0 is appended to equation (8). The solution of equation (8) is the eigenvector of V^T V corresponding to its smallest eigenvalue.
The matrix B is computed from b. B is determined only up to a scale factor, i.e. B = λ A^{-T} A^{-1}, where λ is an arbitrary scalar. The intrinsic parameters u_0, v_0, f_x, f_y, and γ can then be computed from the elements of B in closed form.
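The closed-form expressions themselves are not reproduced legibly here. The following Python/NumPy sketch implements the standard Zhang-style closed-form recovery of the intrinsics from B, which is presumably what is intended; this reconstruction is an assumption, not text taken from the patent.

```python
import numpy as np

def intrinsics_from_B(B):
    """Closed-form recovery of the intrinsic matrix A from the symmetric matrix
    B = lambda * A^-T A^-1 (standard Zhang-style formulas, assumed here)."""
    B11, B12, B13 = B[0, 0], B[0, 1], B[0, 2]
    B22, B23, B33 = B[1, 1], B[1, 2], B[2, 2]
    v0 = (B12 * B13 - B11 * B23) / (B11 * B22 - B12 ** 2)
    lam = B33 - (B13 ** 2 + v0 * (B12 * B13 - B11 * B23)) / B11
    fx = np.sqrt(lam / B11)
    fy = np.sqrt(lam * B11 / (B11 * B22 - B12 ** 2))
    gamma = -B12 * fx ** 2 * fy / lam
    u0 = gamma * v0 / fy - B13 * fx ** 2 / lam
    return np.array([[fx, gamma, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

# Quick self-check with an assumed intrinsic matrix and an arbitrary positive scale:
A = np.array([[800.0, 0.5, 320.0],
              [0.0, 810.0, 240.0],
              [0.0, 0.0, 1.0]])
B = 2.0 * np.linalg.inv(A).T @ np.linalg.inv(A)
print(intrinsics_from_B(B))        # ≈ A
```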
After the intrinsic matrix A has been computed, the extrinsic parameter matrix of each image is computed from (6):

r_1 = κ A^{-1} m_1, r_2 = κ A^{-1} m_2, r_3 = κ A^{-1} m_3, t = κ A^{-1} m_4

where κ = 1/||A^{-1} m_1|| = 1/||A^{-1} m_2|| = 1/||A^{-1} m_3||. Because of the interference of noise, the matrix [r_1 r_2 r_3] computed in this way is not exactly a unit orthogonal matrix. Let Q = [r_1 r_2 r_3] and let its singular value decomposition be Q = U S V^T with S = diag(σ_1, σ_2, σ_3); then R = U V^T is the orthogonal matrix closest to Q, i.e. it minimizes ||R − Q|| in the Frobenius norm.
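A Python/NumPy sketch of this extrinsic-parameter recovery and SVD-based orthogonalization, under the notation above (M is the estimated projection matrix, A the intrinsic matrix); it is illustrative only.

```python
import numpy as np

def extrinsics_from_projection(M, A):
    """Recover [R | t] from the projection matrix M = kappa * A [R t] (equation (6)),
    then replace the noisy rotation estimate by the closest orthogonal matrix via SVD."""
    A_inv = np.linalg.inv(A)
    kappa = 1.0 / np.linalg.norm(A_inv @ M[:, 0])   # kappa = 1 / ||A^-1 m1||
    Q = kappa * (A_inv @ M[:, :3])                  # noisy estimate of [r1 r2 r3]
    t = kappa * (A_inv @ M[:, 3])
    U, _, Vt = np.linalg.svd(Q)
    R = U @ Vt                                      # orthogonal matrix closest to Q (Frobenius norm)
    return R, t
```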
In the embodiment of the present invention, the camera parameters obtained above can be further optimized; the optimization process is as follows.
Assume that n target images are acquired, that m calibration points are taken on each image, that the image points are extracted independently, and that all points are subject to noise of the same distribution. The optimal parameters are obtained by minimizing the function

f = Σ_i Σ_j || I(m_ij, A, k_1, k_2) − Î(M_ij, R_i, t_i) ||^2

In the function f, m_ij is the pixel coordinate of the j-th point of the i-th image; k_1 and k_2 are the first- and second-order distortion coefficients introduced above; M_ij is the target reference coordinate corresponding to the j-th point of the i-th image; A is the camera intrinsic matrix; and [R_i, t_i] are the camera extrinsic parameters of the i-th image. I(m_ij, A, k_1, k_2) is the normalized image coordinate computed from the image pixel coordinates, and Î(M_ij, R_i, t_i) is the normalized image coordinate computed from the target reference coordinate and the extrinsic parameter matrix. During the optimization, each rotation matrix R_i is converted to roll-pitch-yaw (RPY) angles φ, θ, ψ, and φ, θ, ψ are optimized as parameters.
The nonlinear minimisation can be solved using the Levenberg-Marquardt algorithm, which requires an initial value. The initial value of the distortion coefficients can be obtained by linearizing the distortion model: for every observed point, the radial model gives two equations that are linear in k_1 and k_2,

u' − u = (u − u_0)(k_1 r^2 + k_2 r^4), v' − v = (v − v_0)(k_1 r^2 + k_2 r^4)

where (u, v) are the ideal (undistorted) pixel coordinates and (u', v') the observed ones. Assuming that n images are acquired and m calibration points are taken on each image, 2mn such equations are obtained. Writing this system as M k = d, the initial value of the distortion coefficients is k = [k_1 k_2]^T = (M^T M)^{-1} M^T d; the initial values of the remaining parameters were obtained above.
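A sketch of the Levenberg-Marquardt refinement in Python with SciPy. For brevity the skew factor γ is held at zero, the parameter vector contains only f_x, f_y, u_0, v_0, k_1, k_2 and the per-image RPY angles and translations, and the residuals are compared in pixel coordinates rather than normalized coordinates; this parameterization and the use of scipy.optimize.least_squares are assumptions for illustration, not the patent's MATLAB implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pack(A, k, rts):
    """Flatten intrinsics (fx, fy, u0, v0), distortion (k1, k2) and per-image RPY + t into one vector."""
    p = [A[0, 0], A[1, 1], A[0, 2], A[1, 2], k[0], k[1]]
    for R, t in rts:
        p.extend(Rotation.from_matrix(R).as_euler("xyz"))   # roll-pitch-yaw angles, as in the text
        p.extend(t)
    return np.array(p)

def residuals(p, world_pts, pixel_pts):
    """Reprojection residuals over all images and calibration points."""
    fx, fy, u0, v0, k1, k2 = p[:6]
    res = []
    for i, (Pw, px) in enumerate(zip(world_pts, pixel_pts)):
        q = p[6 + 6 * i: 12 + 6 * i]
        R = Rotation.from_euler("xyz", q[:3]).as_matrix()
        t = q[3:]
        Pc = Pw @ R.T + t                        # target points in camera coordinates
        x, y = Pc[:, 0] / Pc[:, 2], Pc[:, 1] / Pc[:, 2]
        r2 = x * x + y * y
        d = 1.0 + k1 * r2 + k2 * r2 * r2         # radial distortion factor
        u = fx * x * d + u0
        v = fy * y * d + v0
        res.append(np.column_stack([u, v]) - px)
    return np.concatenate(res).ravel()

# Given initial A0, k0 = (k1, k2) and rts0 = [(R_i, t_i), ...] from the linear steps above,
# plus lists of per-image world_pts / pixel_pts, the refinement would be:
# sol = least_squares(residuals, pack(A0, k0, rts0), args=(world_pts, pixel_pts), method="lm")
```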
The coordinates (x_c, y_c, z_c), in the camera coordinate system O_c x_c y_c z_c, of every point on the structured-light plane satisfy the light-plane equation A_l x_c + B_l y_c + C_l z_c = D_l. Image coordinates of points lying on the structured-light plane are selected in the images; using the camera calibration parameters computed above, they are converted into the corresponding point coordinates in the camera coordinate system; substituting these camera-coordinate points into the above light-plane equation, the unknown parameters of the equation are solved by least squares. In this way, the calibration of the line-structured light plane can be conveniently realized.
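A minimal sketch of this least-squares plane fit in Python/NumPy. Normalizing D_l to 1 assumes the light plane does not pass through the camera origin; the example points are assumed values.

```python
import numpy as np

def fit_light_plane(points_c):
    """Least-squares fit of the light-plane equation Al*x + Bl*y + Cl*z = Dl to stripe
    feature points given in camera coordinates, with Dl normalized to 1."""
    P = np.asarray(points_c, dtype=float)                 # shape (n, 3); the text uses n >= 5
    coeffs, *_ = np.linalg.lstsq(P, np.ones(len(P)), rcond=None)
    Al, Bl, Cl = coeffs
    return Al, Bl, Cl, 1.0

# Example with assumed stripe points lying on the plane 0.1*x + 0.2*y + z = 500 (camera coords, mm):
pts = [[10.0, 5.0, 498.0], [-20.0, 8.0, 500.4], [0.0, -15.0, 503.0],
       [30.0, 20.0, 493.0], [-5.0, 25.0, 495.5]]
print(fit_light_plane(pts))    # ≈ (0.0002, 0.0004, 0.002, 1.0), i.e. the same plane rescaled
```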
Compared with the prior art, the embodiment of the present invention calibrates the camera according to the first preset number of non-coplanar feature points selected from the N 3D target images, and calibrates the line-structured light plane according to the second preset number of non-coplanar feature points selected from the structured-light stripes of the N 3D target images together with the camera calibration result. The present embodiment therefore realizes the joint calibration of the camera and the structured-light plane in the line-structured light sensor, and the calibration can be carried out with a calibration algorithm based on a mathematical model, which is conducive to improving calibration accuracy and simplifying the calibration process.
The second embodiment of the present invention relates to a calibration method of a line-structured light sensor. The second embodiment is a further improvement on the basis of the first embodiment, the main improvement being that, in the second embodiment, after the calibration of the line-structured light plane is completed according to the selected second preset number of non-coplanar feature points and the camera calibration result, the calibration error is additionally computed and displayed. As shown in Fig. 3, the calibration method of the present embodiment includes:
Step 301: Acquire N 3D target images with the line-structured light sensor to be calibrated. Each 3D target image contains at least a first preset number of non-coplanar feature points and two structured-light stripes.
Step 302: Select the first preset number of non-coplanar feature points from the N 3D target images. The first preset number is determined by the calibration algorithm.
Step 303: Calibrate the camera according to the selected first preset number of non-coplanar feature points. Specifically, the homography matrix between the stereo target coordinate system and the image pixel coordinate system is solved from the selected first preset number of non-coplanar feature points, thereby calibrating the camera.
Step 304: Select the second preset number of non-coplanar feature points from the structured-light stripes of the N 3D target images. The second preset number may be 5.
Step 305: Complete the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the camera calibration result. Specifically, the coordinates, in the camera coordinate system, of the feature points on the line-structured light plane are computed from the second preset number of non-coplanar feature points and the camera calibration result, the linear equation of the line-structured light plane is established, and the calibration of the light-plane equation is completed by the least-squares method.
Step 306: The calibration error is computed and displayed.
In step 301, the operation of acquiring the 3D target images includes: placing the 3D target on the calibration tooling platform, connecting the sensor, switching on the power, acquiring 22 images under different poses (the number of acquired images is not specifically limited), and saving the acquired images in bmp format, named 1 to 22.
Next, after the line-structured light sensor to be calibrated has acquired the multiple 3D target images, the MATLAB calibration program is opened, getpixel(n) is entered in the command window (n being the corresponding picture number), and the preset instructions are completed in sequence according to the prompts. In this way, the coordinates of the target coordinate system in the image coordinate system, and the coordinates of the points on the structured-light plane in the image coordinate system, can be obtained.
After the coordinates of the target coordinate system in the image coordinate system and the coordinates of the points on the structured-light plane in the image coordinate system have been obtained, STEPLaserVisionCali(m) can be entered in the MATLAB command window (m being the total number of processed pictures) to complete the calibration of the line-structured light sensor and to generate the final calibration result file STEPSensor.cali. The solution of the image-to-object homography matrix, the optimization of the camera distortion, the camera calibration, and the calibration of the structured-light plane equation can all be completed in MATLAB.
Finally, error analysis can be carried out on the obtained calibration results. The calibration errors of the present embodiment and the methods of computing them include:
1. Camera calibration error and its computation method
The 3D target feature points (the selected first preset number of non-coplanar feature points) are transformed into the image coordinate system. In the calibration algorithm, one group of actual target feature points and the ideal target feature points after distortion correction are obtained; the position residual between the two is then computed with the 2-norm, and this position residual is taken as the camera calibration error.
2. Structured-light plane fitting error
The structured-light plane (A_l x_c + B_l y_c + C_l z_c = D_l) is fitted from all the obtained laser-stripe feature points (i.e. the second preset number of non-coplanar feature points); the distance from each non-coplanar feature point on each structured-light stripe to the structured-light plane is then computed with the point-to-plane distance formula:

d = |A x_c0 + B y_c0 + C z_c0 − D| / sqrt(A^2 + B^2 + C^2)

The distances of all the non-coplanar feature points on the structured-light stripes are taken as the fitting error of the structured-light plane. Here x_c0, y_c0, z_c0 are the coordinates of a structured-light stripe point on the structured-light plane, and A, B, C, D are the coefficients of the structured-light plane equation.
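A minimal Python/NumPy sketch of these two error measures; the choice of the mean as the summary statistic for the camera calibration error is an assumption (the text only specifies the 2-norm residual), and all numeric values in the examples are assumed.

```python
import numpy as np

def camera_calibration_error(actual_px, ideal_px):
    """2-norm position residuals between the extracted target feature points and the
    ideal (distortion-corrected / reprojected) points, plus their mean as a summary value."""
    r = np.linalg.norm(np.asarray(actual_px, dtype=float) - np.asarray(ideal_px, dtype=float), axis=1)
    return r, r.mean()

def plane_fitting_error(points_c, plane):
    """Point-to-plane distances of the stripe feature points to the fitted light plane
    A*x + B*y + C*z = D, used as the structured-light plane fitting error."""
    A, B, C, D = plane
    P = np.asarray(points_c, dtype=float)
    return np.abs(P @ np.array([A, B, C]) - D) / np.sqrt(A * A + B * B + C * C)

# Examples with assumed values:
print(camera_calibration_error([[100.2, 50.1], [200.0, 80.5]], [[100.0, 50.0], [200.3, 80.0]]))
print(plane_fitting_error([[10.0, 5.0, 498.0], [0.0, -15.0, 503.5]], (0.1, 0.2, 1.0, 500.0)))
```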
It should be noted that in the embodiment of the present invention the non-coplanar feature points of the first preset number and of the second preset number are selected in MATLAB. Moreover, the calibration of the camera according to the selected first preset number of non-coplanar feature points, and the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the camera calibration result, are performed in MATLAB software. Since the target feature points (i.e. the first preset number of non-coplanar feature points to be selected) and the laser-plane feature points (i.e. the second preset number of non-coplanar feature points to be selected) are all intersections of the marker lines of the target coordinate system, or intersections of the laser stripes with those marker lines, in one example the steps of obtaining these feature points are:
According to the prescribed order, click with the mouse on the two ends of the straight line on which the feature point to be calibrated lies; the two points determine a straight line, and the target marker line or the laser stripe line on which the feature point lies is thereby obtained.
Proceeding in the same way, all target marker lines and laser stripe lines satisfying the feature-point attributes are obtained; every two non-parallel straight lines determine an intersection point (computed with the solve() function in MATLAB); finally, the coordinate positions of all the feature points are computed. Improving the coordinate accuracy of the features in this way not only improves the calibration accuracy but also improves the calibration efficiency.
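The intersection of two such lines, each defined by its two clicked end points, can be computed directly; the patent does this with MATLAB's solve(), and the following Python/NumPy sketch shows the equivalent geometric computation with assumed coordinates.

```python
import numpy as np

def intersect_lines(p1, p2, q1, q2):
    """Intersection of the line through p1, p2 with the line through q1, q2
    (all 2D pixel points), via homogeneous coordinates and cross products."""
    def line(a, b):
        return np.cross(np.append(a, 1.0), np.append(b, 1.0))   # line as a homogeneous 3-vector
    x = np.cross(line(p1, p2), line(q1, q2))
    if abs(x[2]) < 1e-12:
        raise ValueError("lines are parallel")
    return x[:2] / x[2]

# Example: a target marker line and a laser stripe line clicked at their end points (assumed values):
print(intersect_lines([0.0, 0.0], [100.0, 100.0], [0.0, 100.0], [100.0, 0.0]))   # -> [50. 50.]
```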
In the embodiment of the present invention, after the calibration of the line-structured light plane is completed according to the selected second preset number of non-coplanar feature points and the camera calibration result, the calibration error can also be computed and displayed, which makes it convenient for the user to understand the calibration result.
The division of the above methods into steps is only for clarity of description; when implemented, the steps may be merged into one step or some steps may be split into several steps. As long as the same logical relationship is included, they are all within the protection scope of this patent; adding insignificant modifications to the algorithm or the flow, or introducing insignificant designs, without changing the core design of the algorithm and the flow, is also within the protection scope of this patent.
Those skilled in the art can understand that all or part of the steps for realizing the methods of the above embodiments may be completed by a program instructing the relevant hardware. The program is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Those skilled in the art will understand that the above embodiments are specific embodiments for realizing the present invention, and that in practical applications various changes in form and detail may be made thereto without departing from the spirit and scope of the present invention.

Claims (10)

1. A calibration method of a line-structured light sensor, characterised by comprising:
acquiring N 3D target images with the line-structured light sensor to be calibrated, wherein each 3D target image contains at least a first preset number of non-coplanar feature points and two structured-light stripes, the structured-light stripes contain at least a second preset number of non-coplanar feature points, and N is a positive integer;
selecting the first preset number of non-coplanar feature points from the N 3D target images;
calibrating a camera according to the selected first preset number of non-coplanar feature points;
selecting the second preset number of non-coplanar feature points from the structured-light stripes of the N 3D target images;
completing the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera.
2. The calibration method of a line-structured light sensor according to claim 1, characterised in that calibrating the camera according to the selected first preset number of non-coplanar feature points specifically comprises:
using the first preset number of non-coplanar feature points to solve the homography matrix between the stereo target coordinate system and the image pixel coordinate system, thereby calibrating the camera.
3. The calibration method of a line-structured light sensor according to claim 2, characterised in that the first preset number is 6.
4. The calibration method of a line-structured light sensor according to claim 1, characterised in that completing the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera specifically comprises:
computing, from the second preset number of non-coplanar feature points and the calibration result of the camera, the coordinates in the camera coordinate system of the feature points on the line-structured light plane, establishing the linear equation of the line-structured light plane, and completing the calibration of the line-structured light plane equation by the least-squares method.
5. The calibration method of a line-structured light sensor according to claim 4, characterised in that the second preset number is 5.
6. The calibration method of a line-structured light sensor according to claim 1, characterised in that, when the N 3D target images are acquired with the line-structured light sensor to be calibrated, the acquisition angle of each 3D target image is different.
7. The calibration method of a line-structured light sensor according to claim 6, characterised in that N takes a value in [6, 9].
8. The calibration method of a line-structured light sensor according to claim 1, characterised in that, after completing the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera, the method further comprises:
computing and displaying the calibration error.
9. The calibration method of a line-structured light sensor according to claim 1, characterised in that the first preset number and the second preset number of non-coplanar feature points are selected in MATLAB.
10. The calibration method of a line-structured light sensor according to claim 1, characterised in that the calibration of the camera according to the selected first preset number of non-coplanar feature points and the calibration of the line-structured light plane according to the selected second preset number of non-coplanar feature points and the calibration result of the camera are performed in MATLAB software.
CN201710056155.0A 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors Active CN106705849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710056155.0A CN106705849B (en) 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710056155.0A CN106705849B (en) 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors

Publications (2)

Publication Number Publication Date
CN106705849A true CN106705849A (en) 2017-05-24
CN106705849B CN106705849B (en) 2019-06-21

Family

ID=58908954

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710056155.0A Active CN106705849B (en) 2017-01-25 2017-01-25 Calibrating Technique For The Light-strip Sensors

Country Status (1)

Country Link
CN (1) CN106705849B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717715A (en) * 2018-06-11 2018-10-30 华南理工大学 A kind of line-structured light vision system automatic calibration method for arc welding robot
CN108981608A (en) * 2018-05-29 2018-12-11 华南理工大学 A kind of Novel wire Constructed Lighting Vision System and scaling method
CN109443209A (en) * 2018-12-04 2019-03-08 四川大学 A kind of line-structured light system calibrating method based on homography matrix
CN109697736A (en) * 2017-10-20 2019-04-30 杭州海康机器人技术有限公司 Scaling method, device, electronic equipment and the readable storage medium storing program for executing of measuring system
CN109827502A (en) * 2018-12-28 2019-05-31 北京航空航天大学 A kind of line structured light vision sensor high-precision calibrating method of calibration point image compensation
CN110470320A (en) * 2019-09-11 2019-11-19 河北科技大学 The scaling method and terminal device of oscillatory scanning formula line-structured light measuring system
CN111256591A (en) * 2020-03-13 2020-06-09 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
CN112361958A (en) * 2020-11-04 2021-02-12 同济大学 Line laser and mechanical arm calibration method
CN112894209A (en) * 2021-01-19 2021-06-04 常州英迈乐智能系统有限公司 Automatic plane correction method for intelligent tube plate welding robot based on cross laser
WO2021253940A1 (en) 2020-06-19 2021-12-23 Zhejiang Huaray Technology Co., Ltd. Systems and methods for object measurement

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5146690A (en) * 1990-04-28 1992-09-15 Focus Messtechnik Gmbh & Co Kg Scanning head
US6295866B1 (en) * 1998-09-02 2001-10-02 Mitutoyo Corporation Surface-tracking measuring machine
CN1508511A (en) * 2002-12-17 2004-06-30 北京航空航天大学 Method for calibrating structure optical vision sensor
CN1566906A (en) * 2003-06-11 2005-01-19 北京航空航天大学 Construction optical visual sense transducer calibration method based on plane targets
CN101526336A (en) * 2009-04-20 2009-09-09 陈炳生 Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks
CN101526338A (en) * 2009-04-15 2009-09-09 北京信息科技大学 Field calibration method of structural parameter of line structure light vision sensing system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5146690A (en) * 1990-04-28 1992-09-15 Focus Messtechnik Gmbh & Co Kg Scanning head
US6295866B1 (en) * 1998-09-02 2001-10-02 Mitutoyo Corporation Surface-tracking measuring machine
CN1508511A (en) * 2002-12-17 2004-06-30 北京航空航天大学 Method for calibrating structure optical vision sensor
CN1566906A (en) * 2003-06-11 2005-01-19 北京航空航天大学 Construction optical visual sense transducer calibration method based on plane targets
CN101526338A (en) * 2009-04-15 2009-09-09 北京信息科技大学 Field calibration method of structural parameter of line structure light vision sensing system
CN101526336A (en) * 2009-04-20 2009-09-09 陈炳生 Calibration method of linear structured light three-dimensional visual sensor based on measuring blocks

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhang Weiguang et al.: "Error correction method for a line-structured-light multi-sensor three-dimensional measurement system", Journal of Xi'an Jiaotong University *
Wei Zhenzhong et al.: "A calibration method for a line-structured light vision sensor", Journal of Mechanical Engineering *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109697736A (en) * 2017-10-20 2019-04-30 杭州海康机器人技术有限公司 Scaling method, device, electronic equipment and the readable storage medium storing program for executing of measuring system
CN108981608B (en) * 2018-05-29 2020-09-22 华南理工大学 Novel line structured light vision system and calibration method
CN108981608A (en) * 2018-05-29 2018-12-11 华南理工大学 A kind of Novel wire Constructed Lighting Vision System and scaling method
CN108717715A (en) * 2018-06-11 2018-10-30 华南理工大学 A kind of line-structured light vision system automatic calibration method for arc welding robot
CN108717715B (en) * 2018-06-11 2022-05-31 华南理工大学 Automatic calibration method for linear structured light vision system of arc welding robot
CN109443209A (en) * 2018-12-04 2019-03-08 四川大学 A kind of line-structured light system calibrating method based on homography matrix
CN109443209B (en) * 2018-12-04 2019-12-31 四川大学 Line structured light system calibration method based on homography matrix
CN109827502A (en) * 2018-12-28 2019-05-31 北京航空航天大学 A kind of line structured light vision sensor high-precision calibrating method of calibration point image compensation
CN110470320B (en) * 2019-09-11 2021-03-05 河北科技大学 Calibration method of swinging scanning type line structured light measurement system and terminal equipment
CN110470320A (en) * 2019-09-11 2019-11-19 河北科技大学 The scaling method and terminal device of oscillatory scanning formula line-structured light measuring system
CN111256591A (en) * 2020-03-13 2020-06-09 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
WO2021253940A1 (en) 2020-06-19 2021-12-23 Zhejiang Huaray Technology Co., Ltd. Systems and methods for object measurement
EP4136403A4 (en) * 2020-06-19 2023-10-18 Zhejiang Huaray Technology Co., Ltd. Systems and methods for object measurement
CN112361958A (en) * 2020-11-04 2021-02-12 同济大学 Line laser and mechanical arm calibration method
CN112361958B (en) * 2020-11-04 2022-06-21 同济大学 Line laser and mechanical arm calibration method
CN112894209A (en) * 2021-01-19 2021-06-04 常州英迈乐智能系统有限公司 Automatic plane correction method for intelligent tube plate welding robot based on cross laser

Also Published As

Publication number Publication date
CN106705849B (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN106705849A (en) Calibration method of linear-structure optical sensor
US9965870B2 (en) Camera calibration method using a calibration target
CN101292255B (en) Artifact mitigation in three-dimensional imaging
CN104173054B (en) Measuring method and measuring device for height of human body based on binocular vision technique
US10225473B2 (en) Threshold determination in a RANSAC algorithm
CN111126304A (en) Augmented reality navigation method based on indoor natural scene image deep learning
US8208029B2 (en) Method and system for calibrating camera with rectification homography of imaged parallelogram
JP4284664B2 (en) Three-dimensional shape estimation system and image generation system
CN112270249A (en) Target pose estimation method fusing RGB-D visual features
JP4374625B2 (en) Image generating apparatus and method
EP3273412B1 (en) Three-dimensional modelling method and device
CN106651767A (en) Panoramic image obtaining method and apparatus
US20050128196A1 (en) System and method for three dimensional modeling
CN107169475A (en) A kind of face three-dimensional point cloud optimized treatment method based on kinect cameras
CN100417231C (en) Three-dimensional vision semi-matter simulating system and method
CN108225216A (en) Structured-light system scaling method and device, structured-light system and mobile equipment
JP2008537190A (en) Generation of three-dimensional image of object by irradiating with infrared pattern
JP2009017480A (en) Camera calibration device and program thereof
CN109559349A (en) A kind of method and apparatus for calibration
CN106997605A (en) It is a kind of that the method that foot type video and sensing data obtain three-dimensional foot type is gathered by smart mobile phone
US10186051B2 (en) Method and system for calibrating a velocimetry system
CN107869968A (en) A kind of quick three-dimensional scan method and system suitable for complex object surface
CN101996416A (en) 3D face capturing method and equipment
CN112184793B (en) Depth data processing method and device and readable storage medium
CN113223078A (en) Matching method and device of mark points, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant