CN105783773A - Numerical value calibration method for line structured light vision sensor - Google Patents


Publication number: CN105783773A
Authority: CN (China)
Legal status: Granted
Application number: CN201610169495.XA
Original language: Chinese (zh)
Other versions: CN105783773B (granted publication)
Inventors: 周京博 (Zhou Jingbo), 李玥华 (Li Yuehua), 黄风山 (Huang Fengshan)
Assignee: Hebei University of Science and Technology
Application filed by Hebei University of Science and Technology; priority to CN201610169495.XA; published as CN105783773A and, after grant, as CN105783773B. Legal status: Active.

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Abstract

The invention discloses a numerical calibration method for a line structured light vision sensor. The method comprises the following steps: an equidistantly arranged dot array is made as the calibration target, and the target position and lens focal length are coarsely adjusted so that the target lies completely within the camera's field of view and the target-dot image is sharp; the relative position between the line-structured laser plane and the calibration target is precisely adjusted so that the laser plane and the target plane are coplanar; the laser power supply is cut off, the camera focal length is precisely readjusted to ensure that the target image captured by the camera is sharp, an ellipse-fitting method is used to obtain the pixel coordinates of each target-dot centre, and these coordinates are stored in a matrix Q; Delaunay triangulation of the pixel plane is carried out using the target-centre coordinates in Q, the dot at the lower-left corner of the target is taken as the origin of the world coordinate system O_W-XY, and the world coordinates corresponding to each vertex are determined; and the linear transform coefficients corresponding to each triangulated region are calculated and stored in a transform-coefficient matrix D. The calibration method of the invention is simple and easy to operate, computationally efficient, and of good calibration accuracy.

Description

A numerical calibration method for a line structured light vision sensor
Technical field
The invention belongs to the field of computer vision measurement and relates to a high-accuracy numerical calibration method for line structured light vision sensors.
Background technology
A line structured light vision sensor is a non-contact measurement apparatus consisting mainly of a camera and a line laser. It offers a simple construction, non-contact operation and high measuring speed, and is widely applied in many industrial fields such as profile measurement, reverse engineering and on-line monitoring.
A line structured light vision sensor computes the surface profile of the measured object by analysing the deformed light stripe produced where the laser plane emitted by the line laser intersects the object. To establish the correspondence between the laser-stripe image and the profile coordinates, the line structured light vision sensor must first be calibrated. The calibration accuracy directly affects the accuracy of the measurement results, so developing a high-precision calibration method is the key problem in achieving high-accuracy industrial measurement with line structured light vision sensors.
Traditional calibration methods for line structured light vision sensors mainly include: the wire-drawing (filament) method, the sawtooth target method, the double cross-ratio invariance method, the perspective-three-point method, the double virtual circle method, the coplanar reference target method, and the direct calibration method based on standard gauge blocks.
The wire-drawing method calibrates the parameters of the line structured light vision sensor by analysing the pixel coordinates and actual coordinates of the bright spots formed where the laser plane intersects stretched filaments; the number of bright spots equals the number of filaments. In this method the actual coordinates of the bright spots where the light plane intersects the filaments are difficult to measure accurately, and the number of filaments is small, which limits the calibration accuracy of the line structured light vision sensor.
Duan Fajie and Liu Fengmei of Tianjin University proposed the sawtooth target method ("A new calibration method for the structural parameters of light-stripe sensors", Chinese Journal of Scientific Instrument, 2000, 20(1): 108-110). Using a sawtooth target and a one-dimensional stage, the method computes the pixel and actual coordinates of the intersection points between the sawtooth crest lines and the laser plane, solves for the equation of the laser plane, and thereby calibrates the structured-light sensor. Its shortcoming is that the number of fixed points obtained is small, so the calibration accuracy is not high.
To overcome the difficulty that target points must be made to coincide with the laser plane manually, which affects calibration accuracy, Wei Zhenzhong et al. of Beijing University of Aeronautics and Astronautics proposed a light-stripe sensor calibration technique based on double cross-ratio invariance ("A calibration method for line structured light vision sensors", Journal of Mechanical Engineering, 2005, 41(2): 210-214). Its disadvantage is that the high-precision three-dimensional target required for calibration is relatively difficult to machine, and the effect of lens distortion on the measurement results is not considered in depth.
Han Jiandong et al. of Beijing University of Posts and Telecommunications proposed a perspective-three-point method for calibrating line structured light vision sensors ("Fast calibration method for line structured light sensor systems", Optics and Precision Engineering, 2009, 17(5): 958-963). The method calibrates the line structured light sensor based on the perspective-three-point model; it is simple and fast, and the average relative measurement error of a structured-light sensor calibrated with it is about 0.72%. Its disadvantage is that the calibration accuracy is not particularly high.
Lu Minxun et al. of Tongji University proposed a double virtual circle method for line structured light vision sensor calibration ("Calibration of a laser-scanning probe based on a double-virtual-circle target", Acta Optica Sinica, 2014, 34(10): 1015005). The method calibrates the parametric equation of the laser plane relative to the camera using double virtual circles, completing the calibration of the sensor. It relies on the cross-ratio invariance principle, and during calibration the camera's intrinsic parameters and lens distortion coefficients must also be corrected separately, so the calibration process is complicated.
Chen Tianfei et al. of Henan University of Technology proposed a line structured light calibration method based on a coplanar reference target ("A new calibration method for line structured light sensors based on a coplanar reference target", Acta Optica Sinica, 2015, 35(1): 0112004). By repeatedly moving the coplanar target, the method computes the vanishing points of the laser-stripe lines in different directions and fits a straight line through them to obtain the vanishing line of the light plane, thereby calibrating the normal direction of the light plane. Its shortcomings are that the required target is unlike common targets and solving for the light plane is complex, and that the intrinsic, extrinsic and distortion parameters of the sensor must still be calibrated separately, so the calibration method is complicated.
The basic idea of the above calibration methods is to first establish mathematical models of the camera's intrinsic parameters, extrinsic parameters and distortion, and then complete the calibration by solving for the undetermined parameters in the model with an optimisation algorithm. As measurement-accuracy requirements grow, the models to be established become increasingly complex.
To avoid establishing a mathematical model of the line structured light vision sensor, Zou Yuanyuan et al. of the Shenyang Institute of Automation, Chinese Academy of Sciences, proposed a direct calibration method and applied for the invention patent "A direct calibration method for line structured light vision sensors", patent No. 201210559444.X. The method uses standard gauge blocks as the calibration target and, combined with the movement of a precision stage, builds a look-up table to calibrate the line structured light vision sensor. Its disadvantages are: (1) during calibration, feature-point extraction errors caused by reflections at the gauge-block chamfers, together with the errors of the interpolation algorithm used, significantly affect the calibration accuracy; (2) the position of the precision stage must be adjusted repeatedly to acquire many structured-light images, so the calibration process is cumbersome; and (3) because the travel of the precision stage is limited, the measurement range of a line structured light vision sensor calibrated with this method is small.
In summary, traditional calibration methods suffer from one or more of the following problems: low calibration accuracy, complex calibration models, tedious calibration procedures, and limited measurement range. These shortcomings make it difficult to further improve the calibration accuracy of line structured light vision sensors and restrict their adoption in high-precision measurement.
Summary of the invention
The present invention addresses the deficiencies of the above calibration methods and proposes a simple, accurate and fast numerical calibration method for line structured light vision sensors based on linear transformations over triangular regions.
To solve the above technical problem, the technical scheme of the invention is as follows.
A numerical calibration method for a line structured light vision sensor, characterised in that the method comprises the following steps:
Step (1): make the calibration target, which is an equidistantly arranged dot array, and coarsely adjust the target position and lens focal length so that the calibration target lies completely within the camera's field of view and the target-dot image is sharp;
Step (2): precisely adjust the relative position between the line laser and the calibration target so that the laser plane emitted by the line laser and the calibration target are coplanar;
Step (3): turn off the laser power supply, precisely readjust the camera focal length to ensure that the calibration-target image captured by the camera is sharp, obtain the pixel coordinates of each target-dot centre by ellipse fitting, and store them in a matrix Q;
Step (4): carry out Delaunay triangulation of the pixel plane using the target-centre coordinates in matrix Q, take the centre of the dot at the lower-left corner of the calibration target as the origin of the world coordinate system O_W-XYZ, and determine the world coordinates corresponding to each triangle vertex;
Step (5): calculate the linear transform coefficients corresponding to each triangulated region and store them in a transform-coefficient matrix D, which completes the calibration of the line structured light sensor.
Preferably, in the above numerical calibration method, Delaunay triangulation with the centre of each target dot as a vertex splits the whole pixel plane into many very small triangular regions; a linear transformation is then used to compute, for any pixel coordinate within each small triangular region, the corresponding coordinates in the world coordinate system, completing the measurement of the measured object's profile.
Preferably, in the above numerical calibration method, the calibration-target plane and the laser plane must be adjusted to be coplanar at numerical calibration time.
Preferably, in the above numerical calibration method, the line structured light vision sensor has a device for adjusting the relative position between the laser plane and the target plane. The device comprises a line laser (3), a laser fixing sleeve (2) and a connecting piece (4). The line laser (3) is cylindrical in shape and can be rotated by a specific angle inside the laser fixing sleeve (2). The laser fixing sleeve (2) is a circular sleeve whose inner diameter is slightly larger than the diameter of the line laser (3); it carries a locking screw, with which the laser is locked once its rotation angle has been determined. The laser fixing sleeve (2) and the connecting piece (4) are connected by a screw: a threaded hole is machined in the laser fixing sleeve (2), and a through hole slightly larger than that threaded hole is machined in the connecting piece (4), so that the laser fixing sleeve (2) can rotate relative to the connecting piece (4) and the screw can be tightened once the required angle is reached, ensuring that no further relative motion occurs between the laser fixing sleeve (2) and the connecting piece (4). The calibration target (1) can be translated along the normal direction of the target plane.
Preferably, in the above numerical calibration method, the procedure used in step (2) to precisely align the laser plane with the calibration-target plane is:
Step 2.1: translate the calibration target (1) back and forth along the O_W Z direction so that it lies directly below the line laser (3), and observe the relative position between the laser stripe and the top edge (1-1) of the target plate; if the laser stripe intersects this edge, loosen the screw fixing the line laser (3) and rotate the laser until the emitted laser stripe is parallel to the top edge (1-1) of the target plate;
Step 2.2: translate the calibration target (1) back and forth along the O_W Z direction until the laser stripe coincides with the top edge (1-1) of the target plate;
Step 2.3: observe how the plane of the calibration target (1) is illuminated by the laser. If the whole target plane is uniformly illuminated, the laser plane coincides well with the target plane. If the target plane is not illuminated, rotate the laser fixing sleeve (2) anticlockwise and repeat steps 2.1 and 2.2 until the effect described in step 2.3 is reached; if the stripe falls in front of the target plane, onto the base of the calibration target, rotate the laser fixing sleeve (2) clockwise and repeat steps 2.1 and 2.2 until the effect described in step 2.3 is reached.
Preferably, in the above numerical calibration method, there is no need to establish models of the line structured light vision sensor's intrinsic parameters, extrinsic parameters or lens-distortion parameters, nor to obtain calibration parameters by running an optimisation algorithm over such models; it suffices to solve in turn for the six linear transform coefficients corresponding to each triangle obtained by Delaunay triangulation and to store these coefficients in a matrix, which completes the calibration of the line structured light vision sensor.
Preferably, in the above numerical calibration method, the origin O_W of the world coordinate system is set at the centre of the lower-left corner dot of the calibration target, with the XY plane of the world coordinate system coinciding with the target plane. For each triangle of the triangulation, the linear transform equations are set up from the world coordinates of its vertices P_i(x_i, y_i), P_{i+1}(x_{i+1}, y_{i+1}), P_{i+2}(x_{i+2}, y_{i+2}) and the coordinates of the three corresponding vertices in the pixel plane Q_i(u_i, v_i), Q_{i+1}(u_{i+1}, v_{i+1}), Q_{i+2}(u_{i+2}, v_{i+2}), and solved for the linear transform coefficients of that triangular region.
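As a concrete illustration of this step, the following NumPy sketch solves the six coefficients of one triangle from its three vertex correspondences; the function name and the coordinate values are hypothetical, chosen so the result can be checked by hand.

```python
import numpy as np

def affine_coefficients(world_pts, pixel_pts):
    """Solve the six coefficients a_{r,1}..a_{r,6} of one triangle
    from its three vertex correspondences.
    world_pts, pixel_pts: 3x2 arrays of (x, y) and (u, v)."""
    W = np.vstack([np.asarray(world_pts, dtype=float).T, np.ones(3)])  # rows x, y, 1
    U = np.asarray(pixel_pts, dtype=float).T                           # rows u, v
    # [u; v] = A @ [x; y; 1]  =>  A = U @ W^{-1}
    return U @ np.linalg.inv(W)                                        # 2x3 matrix

# check with a known affine map u = 2x + 3, v = -y + 1 (hypothetical values)
world = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
pixel = [(3.0, 1.0), (5.0, 1.0), (3.0, 0.0)]
A = affine_coefficients(world, pixel)  # A == [[2, 0, 3], [0, -1, 1]]
```

Because three non-collinear vertex pairs determine an affine map exactly, no least-squares fitting is needed here; the 3 x 3 homogeneous matrix of world vertices is simply inverted.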
Compared with the prior art, the technical effect of the present invention is mainly as follows: the pixel plane is split by Delaunay triangulation into many small triangular regions, within each of which the pixel coordinates of a point and its world coordinates are taken to satisfy a linear transformation. The pixel coordinates of each triangle vertex and its corresponding world coordinates are obtained through ellipse fitting and the target design parameters respectively, so the linear transform coefficients valid for any pixel inside a triangle can be computed from that triangle's vertex coordinates; storing these coefficients in a matrix completes the calibration of the line structured light vision sensor. When computing the world coordinates corresponding to a laser-stripe centre point, it is only necessary to determine the index of the triangle containing it and extract the corresponding linear transform coefficients; the result is then obtained by direct computation. The method has the following advantages:
1. The number of feature points is determined by the number of dots on the target and can be chosen flexibly; the target is simple and cheap to make.
2. With the triangular linear-transformation approach, there is no need to establish complex models of the line structured light vision sensor's intrinsic parameters, extrinsic parameters or lens distortion, nor to solve for the parameters of such models with an optimisation algorithm; it suffices to determine the triangular region containing each stripe point and extract the corresponding linear transform coefficients for direct computation, so the method is simple and fast.
3. Because the pixel plane is split into small triangular regions, lens distortion within each small region is essentially negligible, which helps to further improve the measurement accuracy of the line structured light vision sensor.
Accompanying drawing explanation
Fig. 1 is the line structured light vision sensor calibration diagram;
Fig. 2 is the line structured light vision sensor calibration flow chart;
Fig. 3 shows the dot-array calibration target adopted;
Fig. 4 shows the captured target and its triangulation result;
Fig. 5 is a schematic diagram of the triangular linear-transformation computation;
Fig. 6 is the line structured light vision sensor measurement flow chart.
Detailed description of the invention
The specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings, so that the technical solution of the invention may be more readily understood and mastered.
Fig. 1 is the line structured light vision sensor calibration diagram. The setup consists of the calibration target 1, the laser fixing sleeve 2, the line laser 3, the connecting piece 4 between the line laser and the camera, the camera 5, and the vertical support 6, characterised in that:
The calibration target 1 is placed vertically, with the target plane coinciding with the laser plane emitted by the line laser 3;
To meet the requirement that the target plane and the laser plane coincide, the line laser 3 is cylindrical in shape and can be rotated by a specific angle inside the laser fixing sleeve 2. The laser fixing sleeve 2 is a circular sleeve whose inner diameter is slightly larger than the diameter of the line laser 3; it carries a locking screw, with which the laser is locked once its rotation angle has been determined. Also to this end, the laser fixing sleeve 2 and the connecting piece 4 are connected by a screw: a threaded hole is machined in the laser fixing sleeve 2 and a slightly larger through hole in the connecting piece 4, so that the laser fixing sleeve 2 can rotate relative to the connecting piece 4 and the screw can be tightened once the required angle is reached, ensuring that no further relative motion occurs between them. The calibration target 1 can be translated along the normal direction of the target plane (the O_W Z direction);
The focal length of the camera 5 lens can be adjusted manually, ensuring that all target dots on the calibration target 1 lie within the field of view of the camera 5 and are imaged sharply.
With reference to the calibration diagram of Fig. 1, the calibration flow of the line structured light vision sensor is shown in Fig. 2. It is a numerical calibration method based on triangulated linear transformations, characterised by the following steps:
Step (1): make the calibration target, which is an equidistantly arranged dot array, and coarsely adjust the target position and lens focal length so that the target lies completely within the camera's field of view and the target-dot image is sharp;
Step (2): precisely adjust the relative position between the line laser and the calibration target so that the laser plane emitted by the line laser and the target are coplanar;
Step (3): turn off the laser power supply, precisely readjust the camera focal length to ensure that the target image captured by the camera is sharp, obtain the pixel coordinates of each target-dot centre by ellipse fitting, and store them in a matrix Q;
Step (4): carry out Delaunay triangulation of the pixel plane using the target-centre coordinates in matrix Q, take the centre of the target's lower-left corner dot as the origin of the world coordinate system O_W-XYZ, and determine the world coordinates corresponding to each triangle vertex;
Step (5): calculate the linear transform coefficients corresponding to each triangulated region and store them in a transform-coefficient matrix D, which completes the calibration of the line structured light sensor.
When making the target in step (1), the calibration target is fabricated as an equidistantly arranged dot array, each dot being a target point, as shown in the dot-array target of Fig. 3. The plane containing the dots is taken as the XY plane of the world coordinate system, and the origin O_W of the world coordinate system is set at the centre of the lower-left corner dot. If the row and column pitch between dots is l, then relative to the origin O_W the centre P of the dot in row m, column n has world coordinates P(x, y) with x = n·l and y = m·l; as long as the row and column of a dot relative to the origin O_W are determined, its centre coordinates can be computed in this way. The dots are 8 mm in diameter, arranged in 11 rows and 11 columns with a pitch of 12 mm between adjacent dots, and the plane containing the dots is flat;
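The row/column-to-world-coordinate rule above can be reproduced in a short NumPy sketch; the 11 x 11 layout and 12 mm pitch are as described, while the function name is ours.

```python
import numpy as np

# Target geometry as described: an 11 x 11 dot array, pitch l = 12 mm,
# world origin O_W at the centre of the lower-left dot.
ROWS, COLS, PITCH = 11, 11, 12.0

def world_coordinates(m, n, pitch=PITCH):
    """World coordinates (x, y) in mm of the dot in row m, column n,
    counted from the lower-left dot (m = n = 0): x = n*l, y = m*l."""
    return n * pitch, m * pitch

# world coordinates of every dot centre, shape (11, 11, 2)
grid = np.array([[world_coordinates(m, n) for n in range(COLS)]
                 for m in range(ROWS)])
```

These (x, y) values are exactly the world coordinates paired later with the pixel centres stored in matrix Q.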
When coarsely adjusting the target position, first switch on the laser power and make the laser plane perpendicular to the horizontal by rotating the laser fixing sleeve 2; then place the calibration target directly below the line laser 3; then loosen the screw on the laser fixing sleeve 2 that fixes the line laser 3 and rotate the laser so that the laser plane coincides with the calibration plane on the calibration target 1;
After the laser plane emitted by the line laser has been coarsely aligned with the target plane, the laser must be turned off; the lens focal length is then adjusted manually so that the whole target lies within the camera's field of view and the target dots are imaged sharply.
In step (2), the laser plane emitted by the line laser 3 is adjusted to be coplanar with the target, which involves the following procedure:
(2-1) Translate the calibration target 1 back and forth along the O_W Z direction so that it lies directly below the line laser 3, and observe the relative position between the laser stripe and the top edge 1-1 of the target plate; if the laser stripe intersects this edge, loosen the screw fixing the line laser 3 and rotate the laser until the emitted laser stripe is parallel to the top edge 1-1 of the target plate;
(2-2) Translate the calibration target 1 back and forth along the O_W Z direction until the laser stripe coincides with the top edge 1-1 of the target plate;
(2-3) Observe how the plane of the calibration target 1 is illuminated by the laser. If the whole target plane is uniformly illuminated, the laser plane coincides well with the target plane. If the target plane is not illuminated, rotate the laser fixing sleeve 2 anticlockwise and repeat steps (2-1) and (2-2) until the effect described in (2-3) is reached; if the stripe falls in front of the target plane, onto the base of the calibration target, rotate the laser fixing sleeve 2 clockwise and repeat steps (2-1) and (2-2) until the effect described in (2-3) is reached.
When the laser power is turned off and the lens focal length precisely adjusted in step (3), ensure that the positions of the line laser 3 and the calibration target 1 do not change. The target captured after focusing, together with its triangulation result, is shown in Fig. 4. To guarantee the quality of the target image and avoid background interference when extracting the target dots, a white backing plate should be placed behind the target when photographing it with the camera 5.
The pixel in the lower-left corner of the pixel plane is defined as the origin of the pixel coordinate system, with horizontal axis u and vertical axis v, so that the u and v coordinates of any pixel are its column and row numbers. For each target dot in the photographed target plane, the Canny edge detector is first used to extract the dot's contour; an ellipse is then fitted to this contour, and the centre of the ellipse is taken as the centre of the dot. The centre of every dot is computed in this way in turn. The centre of the dot in row m, column n relative to the pixel-coordinate origin is denoted Q(u, v); it corresponds to the dot centre P(x, y) in row m, column n of the dot-array target of Fig. 3. The computed centre coordinates of all target dots are stored in the matrix Q.
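A minimal, library-free stand-in for the centre extraction can be sketched by fitting a general conic to the contour points in a least-squares sense and reading off the ellipse centre; this only illustrates the Canny-plus-ellipse-fit idea described above, and the synthetic edge points (a tilted ellipse centred at hypothetical coordinates (40, 25)) replace real image contours.

```python
import numpy as np

def ellipse_center(pts):
    """Least-squares conic fit A x^2 + B xy + C y^2 + D x + E y = 1 to the
    edge points of one dot; the fitted ellipse's centre approximates the
    dot centre even when the dot images as an ellipse."""
    pts = np.asarray(pts, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x * x, x * y, y * y, x, y])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones(len(pts)), rcond=None)[0]
    # centre: the conic's gradient vanishes there ->
    #   2A xc + B yc = -D,   B xc + 2C yc = -E
    return np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])

# synthetic edge points of a tilted ellipse centred at (40, 25) (hypothetical)
t = np.linspace(0, 2 * np.pi, 60, endpoint=False)
xe = 40 + 8 * np.cos(t) - 3 * np.sin(t)
ye = 25 + 2 * np.cos(t) + 6 * np.sin(t)
cx, cy = ellipse_center(np.column_stack([xe, ye]))  # ~ (40, 25)
```

In a real pipeline the input points would be the Canny contour of one dot; OpenCV's `fitEllipse` is a common production alternative to this algebraic fit.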
The method used in step (4) to triangulate the pixel plane is Delaunay triangulation; the result is shown in Fig. 4. The region of the pixel plane covered by the target dots is divided into many very small triangular regions; the number of triangles depends on the number of dots — the more dots and the smaller their pitch, the more numerous and smaller the triangles. The triangle vertices after subdivision are the dot centres stored in matrix Q. The world coordinates corresponding to each dot centre are determined by ranking: first the row and column of each dot within the target plane are determined, the centre of the lower-left corner dot is set as the origin of the world coordinate system O_W-XY, and the world coordinates (x, y) of each dot centre are then computed from the row and column pitch of the target design.
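The subdivision step can be sketched with SciPy's Delaunay triangulation; the pixel centres below are an idealised stand-in for matrix Q (a regular 11 x 11 grid with a hypothetical 90-pixel pitch — real centres would come from the ellipse-fit step).

```python
import numpy as np
from scipy.spatial import Delaunay

# hypothetical pixel centres of the 11 x 11 dot grid, standing in for matrix Q
rows, cols = 11, 11
u, v = np.meshgrid(np.arange(cols) * 90.0 + 100, np.arange(rows) * 90.0 + 80)
Q = np.column_stack([u.ravel(), v.ravel()])   # shape (121, 2)

tri = Delaunay(Q)                  # splits the dot region into small triangles
n_triangles = len(tri.simplices)   # each simplex row: 3 vertex indices into Q
```

For an 11 x 11 grid this yields 200 triangles (two per grid cell); `tri.find_simplex` later locates the triangle containing any query pixel, returning -1 outside the dot region.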
Step (5) computes the transform coefficients corresponding to each triangular region. Since step (4) has split the pixel plane into many small triangles, the effect of nonlinear factors such as lens distortion on the calibration within each very small triangular region can be neglected, and the relation between pixel coordinates and world coordinates in each region is approximated directly as a linear transformation; solving for the coefficients of the linear transformation of each triangular region then completes the calibration of the line structured light vision sensor. The procedure is as follows:
(5-1) Establish the linear relation between the world coordinates and the pixel coordinates of a point. Suppose that in the r-th triangular region the linear transformation between a measured point P(x, y) and its corresponding pixel coordinates Q(u, v) is
    [u]   [a_{r,1}  a_{r,2}  a_{r,3}]   [x]
    [v] = [a_{r,4}  a_{r,5}  a_{r,6}] · [y]          (1)
                                        [1]
where a_{r,1}, a_{r,2}, a_{r,3}, a_{r,4}, a_{r,5}, a_{r,6} in equation (1) are the linear transform coefficients of the r-th triangular region, r being the index of the triangle after triangulation;
(5-2) Solve for the linear transform coefficients in equation (1). To determine the six linear transform coefficients of equation (1), the world coordinate values and pixel coordinate values of three points in the triangulated region are required. Figure 5 is a schematic of the triangle linear-transformation calculation. Let the vertices of the r-th triangle after triangulation be P_i(x_i, y_i), P_{i+1}(x_{i+1}, y_{i+1}), P_{i+2}(x_{i+2}, y_{i+2}) in the world coordinate system, with corresponding vertices Q_i(u_i, v_i), Q_{i+1}(u_{i+1}, v_{i+1}), Q_{i+2}(u_{i+2}, v_{i+2}) in the pixel plane. Each pair satisfies the linear mapping of equation (1), therefore
\[
\begin{bmatrix} u_i & u_{i+1} & u_{i+2} \\ v_i & v_{i+1} & v_{i+2} \end{bmatrix} =
\begin{bmatrix} a_{r,1} & a_{r,2} & a_{r,3} \\ a_{r,4} & a_{r,5} & a_{r,6} \end{bmatrix}
\begin{bmatrix} x_i & x_{i+1} & x_{i+2} \\ y_i & y_{i+1} & y_{i+2} \\ 1 & 1 & 1 \end{bmatrix} \tag{2}
\]
Solving equation (2) yields the linear transform coefficients a_{r,1}, a_{r,2}, a_{r,3}, a_{r,4}, a_{r,5}, a_{r,6} corresponding to the r-th triangle obtained by Delaunay triangulation of the pixel plane. Applying the same method to each triangle of the triangulation in turn and storing the coefficients in the linear transformation matrix D completes the calibration of the line structured light vision sensor. Each row of the matrix D holds the six transform coefficients of one triangle's linear transformation, and the number of rows equals the number of triangles obtained by the Delaunay triangulation.
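The per-triangle solve of equation (2) is a 2×3 affine fit from three point pairs. The following is a minimal NumPy sketch (function names are mine, not from the patent):

```python
import numpy as np

def triangle_affine_coeffs(P, Q):
    """Solve equation (2) for one triangle: find the 2x3 matrix
    A = [[a1, a2, a3], [a4, a5, a6]] mapping the world vertices P (3x2)
    to the pixel vertices Q (3x2), i.e. [u; v] = A @ [x; y; 1]."""
    X = np.vstack([np.asarray(P, dtype=float).T, np.ones(3)])  # 3x3: rows x, y, 1
    U = np.asarray(Q, dtype=float).T                           # 2x3: rows u, v
    return U @ np.linalg.inv(X)                                # 2x3 coefficients

def build_transform_matrix(world_pts, pixel_pts, simplices):
    """Stack the six coefficients of every triangle into matrix D
    (one row per triangle), completing the calibration of step (5).
    simplices: list of 3-element vertex-index arrays from the triangulation."""
    D = np.zeros((len(simplices), 6))
    for r, tri in enumerate(simplices):
        D[r] = triangle_affine_coeffs(world_pts[tri], pixel_pts[tri]).ravel()
    return D
```

The vertex-index triples can come from any Delaunay implementation; the solve itself assumes only that the three vertices are not collinear.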
After the line structured light vision sensor has been calibrated, the actual coordinates of the profile to be measured are computed according to the measurement flow chart of Figure 6. First, the measured object is placed below the laser so that the laser plane intersects the measured profile; the camera then captures the laser stripe image, and the coordinates of the stripe centre points in the pixel plane are computed by the grey-scale centre-of-gravity method; a search then determines the index of the triangular region containing each laser-stripe centre point. Let P_{M,k}(x_k, y_k) be the k-th measured point on the laser stripe and Q_{M,k}(u_k, v_k) its corresponding pixel (k = 1, 2, 3, ..., K), where K is the number of measured points, as shown in Figure 5. If the triangle containing Q_{M,k} has index r, the coordinates of P_{M,k} in the world coordinate system are:
\[
x_k = \frac{a_{r,5}u_k - a_{r,2}v_k + a_{r,2}a_{r,6} - a_{r,3}a_{r,5}}{a_{r,1}a_{r,5} - a_{r,2}a_{r,4}}, \qquad
y_k = \frac{a_{r,4}u_k - a_{r,1}v_k + a_{r,1}a_{r,6} - a_{r,3}a_{r,4}}{a_{r,2}a_{r,4} - a_{r,1}a_{r,5}} \tag{3}
\]
Computing the world coordinate value corresponding to the pixel coordinates of every stripe centre point with formula (3) completes the measurement of the profile.
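A sketch of the measurement side follows. The function names and the simple column-wise centroid are my assumptions; the patent itself only specifies the grey-scale centre-of-gravity method and formula (3). The triangle index r for a stripe pixel can be found with, e.g., scipy.spatial.Delaunay.find_simplex.

```python
import numpy as np

def stripe_centers_gray_gravity(img):
    """Grey-scale centre-of-gravity extraction of the laser stripe:
    for each image column u, the stripe centre v is the intensity-weighted
    mean row. img: 2D array; columns with no signal return NaN.
    (A minimal sketch; practical versions threshold and window the stripe.)"""
    img = np.asarray(img, dtype=float)
    rows = np.arange(img.shape[0])[:, None]
    mass = img.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        return (img * rows).sum(axis=0) / mass

def pixel_to_world(u, v, coeffs):
    """Apply formula (3): invert [u; v] = A [x; y; 1] for the triangle whose
    six coefficients a1..a6 are given (one row of matrix D)."""
    a1, a2, a3, a4, a5, a6 = coeffs
    det = a1 * a5 - a2 * a4
    x = (a5 * u - a2 * v + a2 * a6 - a3 * a5) / det
    y = (a4 * u - a1 * v + a1 * a6 - a3 * a4) / (a2 * a4 - a1 * a5)
    return x, y
```

Since formula (3) is just the inverse of the affine map of equation (1), mapping a point forward through A and back through pixel_to_world should reproduce the original world coordinates.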
The calibration process above shows that the method of the present invention does not need to establish models of the calibration system's intrinsic parameters, extrinsic parameters, or lens distortion parameters, nor to determine the undetermined coefficients of a line-structured-light measurement model by solving an optimization problem. Instead, by triangulating the pixel plane, it produces a matrix D that describes the transformation from pixel coordinates to the corresponding world coordinates; within each triangular region the linear transform coefficients are read from D, and the world coordinates of the profile to be measured are obtained directly from the linear transformation formula (3). It is therefore a numerical calibration method.
To verify the beneficial effects of the present invention, an "M"-shaped specimen was moved up, down, left, and right within the measurement range of the line structured light vision sensor and measured at 6 different positions. The horizontal distance between the two cusps of the specimen, obtained with a coordinate measuring machine, is 42.8803 mm. The measurement results of the sensor at the different positions are listed in Table 1: the RMS value of the measurement error is 0.0285 mm and the RMS value of the relative measurement error is 0.0664%, showing that the line structured light sensor has good measurement accuracy in the horizontal direction throughout the measurement range.
Table 1: Measurement error in the horizontal direction
A step part was then measured with the coordinate measuring machine; the step height was taken as the average of 10 points, giving a theoretical height value of 29.7830 mm. The same step profile at the same measurement position was measured with the line structured light vision sensor calibrated by the method of the invention, again taking the average of 10 height values from one step to the next as a single measurement. The step was moved up, down, left, and right to 6 different positions within the measurement range and the same step profile was measured at each; the results are listed in Table 2. The RMS value of the step-height error is 0.0276 mm and the RMS value of the relative error is 0.0927%, showing that the line structured light vision sensor also has good accuracy in the vertical direction.
Table 2: Measurement error in the vertical direction
The above measurement results show that the numerical calibration method of the present invention accomplishes the calibration of a line structured light vision sensor without establishing models of intrinsic parameters, extrinsic parameters, or lens distortion parameters, and without obtaining calibration parameters by solving optimization algorithms over such models; only the simple linear transform coefficients of the triangles need to be solved. The method therefore has the advantages of a simple algorithm and high calibration accuracy.
The above is a representative embodiment of the present invention. The invention may also take various other detailed forms, and all technical solutions formed by equivalent substitution or equivalent transformation fall within the scope of protection of the present invention.

Claims (7)

1. A numerical calibration method for a line structured light vision sensor, characterized in that the calibration method comprises the following steps:
Step (1): fabricate a calibration target consisting of an equidistant array of dots; adjust the position of the calibration target and the lens focus so that the calibration target lies entirely within the field of view of the camera and the dot image is sharp;
Step (2): precisely adjust the relative position between the line laser and the calibration target so that the laser plane emitted by the line laser is coplanar with the target plane;
Step (3): switch off the line laser power supply, readjust the camera focus to ensure a sharp image of the calibration target, obtain the pixel coordinates of each dot centre by ellipse fitting, and store them in matrix Q;
Step (4): perform Delaunay triangulation of the pixel plane with the dot-centre coordinates in matrix Q, take the centre of the lower-left dot of the calibration target as the origin of the world coordinate system O_W XYZ, and determine the coordinates in the world coordinate system corresponding to the triangle vertices;
Step (5): compute the linear transform coefficients corresponding to each triangulated region and store them in the transform coefficient matrix D, thereby completing the calibration of the line structured light sensor.
2. The numerical calibration method for a line structured light sensor according to claim 1, characterized in that: Delaunay triangulation, with the centre of each target dot as a vertex, splits the whole pixel plane into many small triangular regions, and a linear transformation computes the coordinates in the world coordinate system corresponding to any pixel coordinate within each small triangular region, completing the measurement of the measured object's profile.
3. The numerical calibration method for a line structured light sensor according to claim 1, characterized in that: during this numerical calibration, the calibration target plane and the laser plane must be adjusted to be coplanar.
4. The numerical calibration method for a line structured light sensor according to claim 3, characterized in that: the line structured light vision sensor has a device for adjusting the relative position of the laser plane and the target plane, comprising a line laser (3), a laser fixing sleeve (2), and a connecting frame (4). The line laser (3) is cylindrical and can be rotated by a given angle within the laser fixing sleeve (2). The laser fixing sleeve (2) is a circular sleeve whose inner diameter is slightly larger than the diameter of the line laser (3) and which carries a locking screw; once the rotation angle of the laser is set, it is locked by the screw. The laser fixing sleeve (2) and the connecting frame (4) are joined by screws: the laser fixing sleeve (2) carries threaded holes and the connecting frame (4) carries through-holes slightly larger than those threaded holes, so that the laser fixing sleeve (2) can be rotated relative to the connecting frame (4) and the screws tightened once the angle meets the requirement, ensuring that no further relative motion occurs between the laser fixing sleeve (2) and the connecting frame (4). The calibration target (1) can translate along the normal direction of the target plane.
5. The numerical calibration method for a line structured light sensor according to claim 1, characterized in that the method of precisely adjusting the laser plane and the calibration target plane in step (2) is:
Step 2.1: translate the calibration target (1) back and forth along the O_w Z direction to place it directly below the line laser (3), and observe the relation between the laser stripe and the upper edge (1-1) of the target plate; if the laser stripe intersects this edge, loosen the screw fixing the line laser (3) and rotate the laser until the emitted laser stripe is parallel to the upper edge (1-1) of the target plate;
Step 2.2: translate the calibration target (1) back and forth along the O_w Z direction until the laser stripe coincides with the upper edge (1-1) of the target plate;
Step 2.3: observe how the calibration target (1) is illuminated by the laser. If the whole target plane is uniformly illuminated, the laser plane coincides well with the target plane. If the target plane is not illuminated, rotate the laser fixing sleeve (2) counterclockwise and repeat steps 2.1 and 2.2 until the effect described in step 2.3 is achieved; if the stripe falls on the base in front of the target plane, rotate the laser fixing sleeve (2) clockwise and repeat steps 2.1 and 2.2 until the effect described in step 2.3 is achieved.
6. The numerical calibration method for a line structured light sensor according to claim 1, characterized in that: no models of the intrinsic parameters, extrinsic parameters, or lens distortion parameters of the line structured light vision sensor need be established, nor are calibration parameters obtained by solving optimization algorithms over such models; only the six linear transform coefficients corresponding to each triangle obtained by Delaunay triangulation are solved in turn and stored in a matrix, thereby completing the calibration of the line structured light vision sensor.
7. The numerical calibration method for a line structured light sensor according to claim 1, characterized in that: the origin O_W of the world coordinate system is set at the centre of the lower-left dot of the calibration target, with the XY plane of the world coordinate system coinciding with the target plane; from the vertices P_i(x_i, y_i), P_{i+1}(x_{i+1}, y_{i+1}), P_{i+2}(x_{i+2}, y_{i+2}) of a triangle in the world coordinate system after triangulation, and the coordinate values of their three corresponding pixel-plane vertices Q_i(u_i, v_i), Q_{i+1}(u_{i+1}, v_{i+1}), Q_{i+2}(u_{i+2}, v_{i+2}), the linear transformation equation is established and solved for the linear transform coefficients corresponding to this triangular region.
CN201610169495.XA 2016-03-18 2016-03-18 A kind of numerical value scaling method of line structured light vision sensor Active CN105783773B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610169495.XA CN105783773B (en) 2016-03-18 2016-03-18 A kind of numerical value scaling method of line structured light vision sensor

Publications (2)

Publication Number Publication Date
CN105783773A true CN105783773A (en) 2016-07-20
CN105783773B CN105783773B (en) 2019-05-10

Family

ID=56391091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610169495.XA Active CN105783773B (en) 2016-03-18 2016-03-18 A kind of numerical value scaling method of line structured light vision sensor

Country Status (1)

Country Link
CN (1) CN105783773B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106403828A (en) * 2016-08-30 2017-02-15 成都唐源电气股份有限公司 Monorail contact line remain height measurement method based on checkerboard calibration and monorail contact line remain height measurement system thereof
CN107121109A (en) * 2017-06-12 2017-09-01 北京航空航天大学 A kind of structure light parameter calibration device and method based on preceding plated film level crossing
CN107123148A (en) * 2017-05-09 2017-09-01 广东工业大学 A kind of camera parameter scaling method and device
CN108510546A (en) * 2017-02-28 2018-09-07 北京航空航天大学 A kind of camera calibration method being suitable for collection of illustrative plates and structural information synchronizing detection system
CN108921890A (en) * 2018-06-15 2018-11-30 广东拓斯达科技股份有限公司 The screwed lock method, apparatus and computer readable storage medium
CN108981608A (en) * 2018-05-29 2018-12-11 华南理工大学 A kind of Novel wire Constructed Lighting Vision System and scaling method
CN109443214A (en) * 2018-12-19 2019-03-08 广东工业大学 A kind of scaling method of structured light three-dimensional vision, device and measurement method, device
CN109596059A (en) * 2019-01-07 2019-04-09 南京航空航天大学 A kind of aircraft skin gap based on parallel lines structure light and scale measurement method
CN110487213A (en) * 2019-08-19 2019-11-22 杭州电子科技大学 Full view line laser structured light three-dimensional image forming apparatus and method based on spatial offset
CN110793458A (en) * 2019-10-30 2020-02-14 成都安科泰丰科技有限公司 Coplane adjusting method for two-dimensional laser displacement sensor
CN110815201A (en) * 2018-08-07 2020-02-21 广明光电股份有限公司 Method for correcting coordinates of robot arm
CN111006706A (en) * 2019-11-12 2020-04-14 长沙长泰机器人有限公司 Rotating shaft calibration method based on line laser vision sensor
CN112393882A (en) * 2020-04-21 2021-02-23 哈尔滨工业大学 Fly-eye imaging adjusting method based on micro-imaging micro-lens parameter detection
CN112669394A (en) * 2020-12-30 2021-04-16 凌云光技术股份有限公司 Automatic calibration method for vision detection system
CN112747687A (en) * 2020-12-18 2021-05-04 中广核核电运营有限公司 Line structure light vision measurement calibration method and system
CN112902878A (en) * 2021-01-21 2021-06-04 中国铁道科学研究院集团有限公司基础设施检测研究所 Method and device for adjusting laser plane of track geometry detection system
CN113251951A (en) * 2021-04-26 2021-08-13 黄淮学院 Calibration method of line structured light vision measurement system based on single calibration surface mapping
WO2022143796A1 (en) * 2020-12-29 2022-07-07 杭州海康机器人技术有限公司 Calibration method and calibration device for line structured light measurement system, and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09274668A (en) * 1996-04-08 1997-10-21 Canon Inc Method and device for reconstituting three-dimensional object surface
CN1508511A (en) * 2002-12-17 2004-06-30 北京航空航天大学 Method for calibrating structure optical vision sensor
CN103884271A (en) * 2012-12-20 2014-06-25 中国科学院沈阳自动化研究所 Direct calibration method for line structured light vision sensor
CN104567727A (en) * 2014-12-24 2015-04-29 天津大学 Three-dimensional target and global unified calibration method for linear structured light profile sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GAO Zhen: "Research on Data Processing and Calibration Methods for Light Field Cameras", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106403828A (en) * 2016-08-30 2017-02-15 成都唐源电气股份有限公司 Monorail contact line remain height measurement method based on checkerboard calibration and monorail contact line remain height measurement system thereof
CN108510546B (en) * 2017-02-28 2021-10-01 北京航空航天大学 Camera calibration method suitable for map and structure information synchronous detection system
CN108510546A (en) * 2017-02-28 2018-09-07 北京航空航天大学 A kind of camera calibration method being suitable for collection of illustrative plates and structural information synchronizing detection system
CN107123148A (en) * 2017-05-09 2017-09-01 广东工业大学 A kind of camera parameter scaling method and device
US10690492B2 (en) 2017-06-12 2020-06-23 Beihang University Structural light parameter calibration device and method based on front-coating plane mirror
CN107121109B (en) * 2017-06-12 2019-12-06 北京航空航天大学 structural optical parameter calibration device and method based on front coated plane mirror
WO2018228013A1 (en) * 2017-06-12 2018-12-20 北京航空航天大学 Front coated plane mirror-based structured light parameter calibration device and method
CN107121109A (en) * 2017-06-12 2017-09-01 北京航空航天大学 A kind of structure light parameter calibration device and method based on preceding plated film level crossing
CN108981608B (en) * 2018-05-29 2020-09-22 华南理工大学 Novel line structured light vision system and calibration method
CN108981608A (en) * 2018-05-29 2018-12-11 华南理工大学 A kind of Novel wire Constructed Lighting Vision System and scaling method
CN108921890A (en) * 2018-06-15 2018-11-30 广东拓斯达科技股份有限公司 The screwed lock method, apparatus and computer readable storage medium
CN110815201A (en) * 2018-08-07 2020-02-21 广明光电股份有限公司 Method for correcting coordinates of robot arm
CN110815201B (en) * 2018-08-07 2022-04-19 达明机器人股份有限公司 Method for correcting coordinates of robot arm
CN109443214A (en) * 2018-12-19 2019-03-08 广东工业大学 A kind of scaling method of structured light three-dimensional vision, device and measurement method, device
CN109596059A (en) * 2019-01-07 2019-04-09 南京航空航天大学 A kind of aircraft skin gap based on parallel lines structure light and scale measurement method
CN110487213A (en) * 2019-08-19 2019-11-22 杭州电子科技大学 Full view line laser structured light three-dimensional image forming apparatus and method based on spatial offset
CN110793458A (en) * 2019-10-30 2020-02-14 成都安科泰丰科技有限公司 Coplane adjusting method for two-dimensional laser displacement sensor
CN111006706A (en) * 2019-11-12 2020-04-14 长沙长泰机器人有限公司 Rotating shaft calibration method based on line laser vision sensor
CN112393882A (en) * 2020-04-21 2021-02-23 哈尔滨工业大学 Fly-eye imaging adjusting method based on micro-imaging micro-lens parameter detection
CN112393882B (en) * 2020-04-21 2022-08-23 哈尔滨工业大学 Compound eye imaging adjustment method based on micro-imaging micro-lens parameter detection
CN112747687A (en) * 2020-12-18 2021-05-04 中广核核电运营有限公司 Line structure light vision measurement calibration method and system
CN112747687B (en) * 2020-12-18 2022-06-14 中广核核电运营有限公司 Line structure light vision measurement calibration method and system
WO2022143796A1 (en) * 2020-12-29 2022-07-07 杭州海康机器人技术有限公司 Calibration method and calibration device for line structured light measurement system, and system
CN112669394A (en) * 2020-12-30 2021-04-16 凌云光技术股份有限公司 Automatic calibration method for vision detection system
CN112669394B (en) * 2020-12-30 2023-11-10 凌云光技术股份有限公司 Automatic calibration method for vision detection system
CN112902878A (en) * 2021-01-21 2021-06-04 中国铁道科学研究院集团有限公司基础设施检测研究所 Method and device for adjusting laser plane of track geometry detection system
CN113251951A (en) * 2021-04-26 2021-08-13 黄淮学院 Calibration method of line structured light vision measurement system based on single calibration surface mapping
CN113251951B (en) * 2021-04-26 2024-03-01 湖北汽车工业学院 Calibration method of line structured light vision measurement system based on single calibration surface mapping

Also Published As

Publication number Publication date
CN105783773B (en) 2019-05-10

Similar Documents

Publication Publication Date Title
CN105783773A (en) Numerical value calibration method for line structured light vision sensor
CN107167169B (en) Readings of pointer type meters based on NI Vision Builder for Automated Inspection identifies measurement method
CN103499302B (en) The camshaft diameter dimension On-line Measuring Method of structure based light Vision imaging system
CN100562707C (en) Binocular vision rotating axis calibration method
CN107274453A (en) Video camera three-dimensional measuring apparatus, system and method for a kind of combination demarcation with correction
CN103759669B (en) A kind of monocular vision measuring method of heavy parts
CN110148180A (en) A kind of laser radar and camera fusing device and scaling method
CN106643492B (en) A kind of aero-engine damaged blade 3-dimensional digital speckle formative method
CN105678785A (en) Method for calibrating posture relation of laser and camera
CN102944559B (en) Vision measurement method for anisotropic performance parameters in sheet forming
CN104634248B (en) Revolving shaft calibration method under binocular vision
CN109141226A (en) The spatial point coordinate measuring method of one camera multi-angle
CN112985293B (en) Binocular vision measurement system and measurement method for single-camera double-spherical mirror image
CN111964694A (en) Laser range finder calibration method for three-dimensional measurement
CN110378969A (en) A kind of convergence type binocular camera scaling method based on 3D geometrical constraint
CN104197960A (en) Global calibration method for vision-guided camera of laser tracker
CN104390584B (en) Binocular vision laser calibration measurement apparatus and measuring method
CN105139411A (en) Large visual field camera calibration method based on four sets of collinear constraint calibration rulers
CN101650156B (en) Device and method for measuring geometric parameter of superplastic non-spherical free bulge
CN101509759B (en) Self-demarcating system and method for vision detecting system
CN110672037A (en) Linear light source grating projection three-dimensional measurement system and method based on phase shift method
WO2022134939A1 (en) Data splicing and system calibration method for human body digital measurement device
CN107084671A (en) A kind of recessed bulb diameter measuring system and measuring method based on three wire configuration light
CN103278095B (en) The production method of one-dimensional extension type target apparatus and feature point for calibration thereof
CN110146032B (en) Synthetic aperture camera calibration method based on light field distribution

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant