CN107505324A - 3D scanning device and scanning method based on binocular collaborative laser - Google Patents

3D scanning device and scanning method based on binocular collaborative laser

Info

Publication number
CN107505324A
CN107505324A (application CN201710681112.1A)
Authority
CN
China
Prior art keywords
camera
image
laser
binocular
right camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710681112.1A
Other languages
Chinese (zh)
Other versions
CN107505324B (en)
Inventor
王兴 (Wang Xing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Li Jie
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201710681112.1A priority Critical patent/CN107505324B/en
Publication of CN107505324A publication Critical patent/CN107505324A/en
Application granted granted Critical
Publication of CN107505324B publication Critical patent/CN107505324B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 - Systems specially adapted for particular applications
    • G01N 21/88 - Investigating the presence of flaws or contamination
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 - Systems specially adapted for particular applications
    • G01N 21/88 - Investigating the presence of flaws or contamination
    • G01N 21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8887 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques

Abstract

A 3D scanning device and scanning method based on a binocular camera working in collaboration with a laser; the present invention relates to 3D scanning devices and scanning methods. The purpose of the invention is to solve the problems that existing point cloud scanning schemes have complex structures and cumbersome operation, and that the point clouds they generate are of low precision and can hardly meet the demands of industrial production. The 3D scanning device based on binocular collaborative laser comprises: a binocular camera, a stepper motor, a laser and a motor controller. The binocular camera comprises a left camera, a right camera and a binocular connection fixture; the left camera and the right camera are connected through the binocular connection fixture. The laser is mounted on the stepper motor, and the stepper motor is connected to the motor controller through a connector. The motor controller drives the stepper motor, the stepper motor moves the laser, and after the laser projects its beam onto the scanned object, the binocular camera photographs the object. The present invention is applied in the field of 3D scanning.

Description

3D scanning device and scanning method based on binocular collaborative laser
Technical field
The present invention relates to a 3D scanning device and a scanning method.
Background technology
3D point cloud data are the basis of intelligent workpiece recognition, grasping and defect detection; high-quality 3D point cloud data can accelerate recognition and satisfy a wider range of requirements.
At present there are three common point cloud scanning schemes. The first uses a single camera together with a translating laser line. The second uses a calibrated binocular (or multi-view) rig to photograph the scanned object and then processes the photographs to generate point cloud data. The third is the area-array structured light scheme, in which a camera and a projector are combined into a binocular-like system; the projector projects patterns and the camera captures them to obtain the corresponding point cloud data.
The single-camera-plus-laser scheme generates point clouds of relatively high precision, but the scanning speed is low, and during scanning an encoder is needed to transmit position signals according to the motion of the overall structure and to trigger the camera. The photographs are then processed; during the computation the relative position between the camera and the scanned object keeps changing, so the calculation position of each frame has to be converted with a dedicated algorithm. In general this scheme is accurate but structurally complex and cumbersome to operate.
The binocular (multi-view) scheme photographs the scanned object directly with the binocular rig, processes the left and right images to obtain corresponding feature point pairs, computes the 3D positions of the feature points directly from the calibration parameters and the disparity of each pair, and fills in the remaining positions by calculation. Its advantages are fast scanning, fast computation, a unified coordinate system and a relatively simple workflow. However, mismatches occur when selecting feature point pairs in the images, so the point clouds generated by the binocular scheme are of low precision and can hardly meet the demands of industrial production.
The structured light scheme is essentially a variant of the binocular scheme: a projector projects a prescribed coded pattern while a camera photographs the corresponding structured light. The photographs are then decoded against the projected pattern so that every pixel in a photograph carries a unique code. This is equivalent to artificially adding corresponding feature point pairs for the binocular camera, so the scheme has all the advantages of the binocular scheme and higher computational accuracy. However, the projected structured light is sensitive to external light sources and to the reflectivity of the scanned object itself, and shadow occlusion occurs during scanning, so the precision of the generated point cloud is low. If the scanned object is a stack of metal workpieces, the scanning result is greatly degraded.
The content of the invention
The purpose of the invention is to solve the problems that existing point cloud scanning schemes have complex structures and cumbersome operation and that the point clouds they generate are of low precision and can hardly meet the demands of industrial production; to this end, a 3D scanning device and a scanning method based on binocular collaborative laser are proposed.
The 3D scanning device based on binocular collaborative laser comprises: a binocular camera, a stepper motor, a laser and a motor controller;
The binocular camera comprises a left camera, a right camera and a binocular connection fixture;
The left camera and the right camera are connected through the binocular connection fixture;
The laser is mounted on the stepper motor, and the stepper motor is connected to the motor controller through a connector;
The motor controller controls the stepper motor; the stepper motor drives the laser, and after the laser projects its beam onto the scanned object, the binocular camera photographs the scanned object.
The detailed process of the 3D scanning method based on binocular collaborative laser is as follows:
Step 1: the motor controller drives the stepper motor, the stepper motor rotates the laser, and the laser projects a laser line onto the scanned object;
Step 2: the current binocular camera is calibrated; the calibration yields the intrinsic matrix and distortion matrix of each of the left and right cameras of the binocular camera, as well as the binocular rectification matrices and the relative position matrix;
Step 3: the aperture size and exposure time of the binocular camera are adjusted so that only the laser line is visible in the left and right camera images;
Step 4: the rectification matrices obtained in step 2 are used to rectify the left and right camera images in which only the laser line is visible; the rectified images are then cropped, i.e. rectangular effective regions at the same position and of the same size are cut out of the left and right images of the scanned object, giving the left and right cropped images;
Step 5: the left and right cropped images are converted to grayscale; for each grayscale image the pixel values of every row are examined, the position of the maximum pixel value in a row being taken as the mean position of the brightest location of that row; from these values the mean position of the laser line in each of the left and right images is obtained; centred on that mean position, an image strip of a designated length is cut out of each of the left and right camera images, giving the centre position of the laser line in the left and right cropped images;
Step 6: the centre positions of the laser line in the left and right cropped images obtained in step 5 are matched row by row, i.e. the X position of the first row of pixels in the left cropped image corresponds to the X position of the first row of pixels in the right cropped image, the X position of the second row in the left cropped image corresponds to the X position of the second row in the right cropped image, and so on until the X position of row N of the left cropped image corresponds to the X position of row N of the right cropped image; from the position difference of each corresponding point pair, its row number and the calibrated intrinsic matrices of the left and right cameras, the three-dimensional coordinates of each pair of corresponding points relative to the left camera coordinate system are computed;
N is the total number of rows of the left and right cropped images; its value is a positive integer;
the corresponding point position difference is the right camera X position value minus the left camera X position value;
the origin is the optical centre of the left camera, the Y axis points upward in the binocular camera image view, the X axis points to the right in the binocular camera image view, and the Z axis is perpendicular to the XY plane;
Step 7: a 3D coordinate point container is created, and the three-dimensional coordinates, relative to the left camera coordinate system, of every corresponding point pair computed in step 6 are put into the 3D coordinate point container;
Step 8: steps 4 to 6 are repeated, and the three-dimensional coordinates of every corresponding point pair relative to the left camera coordinate system are put into the container created in step 7, until the scanned object has been completely scanned and a complete point cloud is obtained (a compact code sketch of one pass of this loop is given below).
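A minimal, self-contained C++/OpenCV sketch of one pass of steps 4 to 8, under simplifying assumptions: the per-row laser centre is taken as the brightest pixel of the row rather than the refined estimate of step 5; f, T, cx and cy stand for the calibrated focal length in pixels, the baseline and the left-camera principal point; and the disparity is taken as left minus right, the usual rectified-stereo convention, whereas the method above defines the difference as right minus left, so the sign may need flipping for a particular camera arrangement.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // One rectified frame pair -> one row of 3D points appended to `cloud`.
    // grayL/grayR must already be rectified so that corresponding rows are aligned.
    static void addLaserRow(const cv::Mat& grayL, const cv::Mat& grayR,
                            double f, double T, double cx, double cy,
                            std::vector<cv::Point3f>& cloud)
    {
        for (int y = 0; y < grayL.rows; ++y) {
            // Step 5 (simplified): brightest pixel per row approximates the laser centre.
            cv::Point maxLocL, maxLocR;
            cv::minMaxLoc(grayL.row(y), nullptr, nullptr, nullptr, &maxLocL);
            cv::minMaxLoc(grayR.row(y), nullptr, nullptr, nullptr, &maxLocR);
            double xl = maxLocL.x, xr = maxLocR.x;
            double d = xl - xr;                    // disparity (left minus right)
            if (d <= 0) continue;                  // skip invalid correspondences
            // Step 6: standard stereo triangulation Z = f*T/d.
            double Z = f * T / d;
            double X = (xl - cx) * Z / f;
            double Y = (y  - cy) * Z / f;
            cloud.emplace_back((float)X, (float)Y, (float)Z);  // steps 7-8: accumulate
        }
    }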
The beneficial effects of the present invention are:
The present invention uses a laser line to assist in locating corresponding points. The laser line scans from a specified starting position while the binocular camera takes pictures; the centre positions of the laser line in the left and right images are computed, and binocular ranging is used to obtain the line of 3D points corresponding to the laser line. The laser line keeps moving and the cameras keep taking pictures; by repeating this process a 3D point cloud model made up of many such lines is finally obtained. The camera position is fixed, so all frames share the same coordinate system, there is no stitching error, there is no dependency between one frame of data and the next, and the accumulation of per-frame errors into the final model is reduced. The method does not depend strongly on the camera frame rate or the motor speed and can tolerate dropped frames. The coordinate system of the generated point cloud is relative to the optical centre of the left camera of the binocular camera.
The present invention is based on binocular ranging, which can accurately measure the distance from a specified point in the binocular images to the camera, and the corresponding points in the left and right cameras are simple to obtain. Although a laser must be used, the binocular rig and the laser can be fixed relative to each other, so no encoder is needed to transmit signals during scanning and no requirement is placed on the laser line position of each captured frame; the structure is simple. A high-intensity laser is used for projection, which gives good immunity to external light sources; the generated point cloud is of high precision and meets the demands of industrial production. If the user finds the current scanning process too slow, an auxiliary laser can be added to the structure to obtain another laser line, or the present laser line can be split into two with a prism; this can increase the scanning speed by 50%.
Two MV-GE132GM-T industrial cameras are used (resolution 1280*1024, frame rate 92 FPS); a hardware circuit board splits the same trigger signal in two and feeds it to the two cameras, which guarantees synchronized image acquisition. For scanning a 30*30 cm bin, the scanning time is about 5 s and the scanning accuracy is about x: 0.6 mm, y: 0.3 mm, z: 0.15 mm. If a faster scanning speed is needed in industrial production, an auxiliary laser can be installed on the structure, reducing the scanning time to 2.5 s to 3 s.
Brief description of the drawings
Fig. 1 is a front view of the 3D scanning device based on binocular collaborative laser of the present invention;
Fig. 2 is a side view of the 3D scanning device based on binocular collaborative laser of the present invention;
Fig. 3a is a diagram of the laser line positioning principle (1) of the present invention;
Fig. 3b is a diagram of the laser line positioning principle (2) of the present invention;
Fig. 4 is a front view of the scanning system of the present invention;
Fig. 5 is a side view of the scanning system of the present invention;
Fig. 6 is a schematic view of the laser line positions when the present invention starts scanning;
Fig. 7 is a schematic view of the laser line positions when the present invention finishes scanning.
Specific embodiments
Embodiment one: this embodiment is described with reference to Fig. 1 and Fig. 2. The 3D scanning device based on binocular collaborative laser of this embodiment comprises: a binocular camera 1, a stepper motor 2, a laser 3 and a motor controller 4;
The binocular camera 1 comprises a left camera 1-1, a right camera 1-2 and a binocular connection fixture 1-3;
The left camera 1-1 and the right camera 1-2 are connected through the binocular connection fixture 1-3;
The laser 3 is mounted on the stepper motor 2;
The motor controller 4 is connected to the stepper motor 2 by a signal line and controls the movement of the stepper motor 2; the stepper motor 2 drives the laser 3, and after the laser 3 projects its beam onto the scanned object 5, the binocular camera 1 photographs the scanned object 5.
Binocular camera: the main measuring tool, used for three-dimensional coordinate measurement.
Laser (laser line generator): used to emit the laser line for auxiliary positioning.
Stepper motor: used to rotate the laser line generator so that the laser line sweeps the model surface at a uniform speed.
Instrument support: used to build the mechanical environment of the scanning experiment; it can be ignored in a real environment.
Connector: the mechanical parts used to connect the cameras, the motor and the laser.
Embodiment two: this embodiment is described with reference to Fig. 4 and Fig. 5. The detailed process of the 3D scanning method based on binocular collaborative laser of this embodiment is as follows:
Step 1: the scanning program PtCreator is developed in C++ using the OpenCV library; the operating platform is Windows.
The scanning program mainly consists of five parts: a serial port module, a camera SDK module, a calibration module, a computation module and a human-machine interaction module.
The serial port module is mainly responsible for communicating with the motor controller and controlling operations such as advancing, stopping and resetting the stepper motor. Its main operations are as follows:
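A minimal sketch of such a serial-port module, assuming a Windows COM port opened through the Win32 API; the port name, baud rate and the command strings are placeholders, since the controller's actual protocol is not specified here.

    #include <windows.h>
    #include <string>

    // Open the COM port that the motor controller is attached to (placeholder settings).
    HANDLE openMotorPort(const char* portName = "COM3", DWORD baud = CBR_9600)
    {
        HANDLE h = CreateFileA(portName, GENERIC_READ | GENERIC_WRITE, 0, nullptr,
                               OPEN_EXISTING, 0, nullptr);
        if (h == INVALID_HANDLE_VALUE) return h;
        DCB dcb = { sizeof(DCB) };
        GetCommState(h, &dcb);
        dcb.BaudRate = baud;  dcb.ByteSize = 8;  dcb.Parity = NOPARITY;  dcb.StopBits = ONESTOPBIT;
        SetCommState(h, &dcb);
        return h;
    }

    // Send one command, e.g. a hypothetical "START", "STOP" or "RESET" string.
    bool sendMotorCommand(HANDLE h, const std::string& cmd)
    {
        DWORD written = 0;
        return WriteFile(h, cmd.c_str(), (DWORD)cmd.size(), &written, nullptr)
               && written == cmd.size();
    }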
The camera SDK module is mainly responsible for communicating with the cameras: setting the camera image size, setting the exposure time, and acquiring camera images. Its main operations are as follows:
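The device uses the vendor (MindVision) SDK, whose calls are not reproduced here; as a stand-in, the sketch below uses cv::VideoCapture purely to illustrate the three operations the text names.

    #include <opencv2/opencv.hpp>

    // Configure image size and exposure, then grab one frame (stand-in for the vendor SDK).
    cv::Mat grabFrame(cv::VideoCapture& cam, int width, int height, double exposure)
    {
        cam.set(cv::CAP_PROP_FRAME_WIDTH,  width);     // set camera image size
        cam.set(cv::CAP_PROP_FRAME_HEIGHT, height);
        cam.set(cv::CAP_PROP_EXPOSURE,     exposure);  // set exposure time (driver-dependent units)
        cv::Mat frame;
        cam >> frame;                                  // obtain one camera image
        return frame;
    }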
The main function of the calibration module is to calibrate the binocular camera, obtain the intrinsic parameters of the left and right cameras and the relative position between them, and save the calibration data locally for the computation module. Its main operations are as follows:
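A sketch of the calibration module using OpenCV's open-source stereo calibration, as the text states; the object and image point lists are assumed to come from chessboard detection (cv::findChessboardCorners) on the black-and-white calibration board mentioned in step 2.

    #include <opencv2/opencv.hpp>
    #include <string>
    #include <vector>

    void calibrateAndSave(const std::vector<std::vector<cv::Point3f>>& objPts,
                          const std::vector<std::vector<cv::Point2f>>& imgPtsL,
                          const std::vector<std::vector<cv::Point2f>>& imgPtsR,
                          cv::Size imageSize, const std::string& file)
    {
        cv::Mat KL, DL, KR, DR;                        // intrinsic and distortion matrices
        std::vector<cv::Mat> rv, tv;
        cv::calibrateCamera(objPts, imgPtsL, imageSize, KL, DL, rv, tv);   // left camera
        cv::calibrateCamera(objPts, imgPtsR, imageSize, KR, DR, rv, tv);   // right camera

        cv::Mat R, T, E, F;                            // relative position of right vs. left camera
        cv::stereoCalibrate(objPts, imgPtsL, imgPtsR, KL, DL, KR, DR, imageSize,
                            R, T, E, F, cv::CALIB_FIX_INTRINSIC);

        cv::Mat R1, R2, P1, P2, Q;                     // rectification matrices and reprojection matrix
        cv::stereoRectify(KL, DL, KR, DR, imageSize, R, T, R1, R2, P1, P2, Q);

        cv::FileStorage fs(file, cv::FileStorage::WRITE);   // persist for the computation module
        fs << "KL" << KL << "DL" << DL << "KR" << KR << "DR" << DR
           << "R"  << R  << "T"  << T  << "R1" << R1 << "R2" << R2
           << "P1" << P1 << "P2" << P2 << "Q"  << Q;
    }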
The computation module is mainly responsible for processing the images and converting the laser line positions in the left and right images into 3D point coordinates. Its main operations are as follows:
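A sketch of the computation module under the assumption that the "inverse projection matrix" mentioned later corresponds to the reprojection matrix Q produced by cv::stereoRectify in the calibration sketch above; xl and xr are the per-row laser positions in the rectified left and right images, with 0 marking a row that has no valid laser centre, and the disparity is taken as left minus right.

    #include <opencv2/opencv.hpp>
    #include <vector>

    std::vector<cv::Point3f> laserRowsTo3D(const std::vector<float>& xl,
                                           const std::vector<float>& xr,
                                           const cv::Mat& Q)
    {
        std::vector<cv::Point3f> pts;
        for (size_t y = 0; y < xl.size() && y < xr.size(); ++y) {
            if (xl[y] == 0.f || xr[y] == 0.f) continue;      // discarded rows (see step 5)
            std::vector<cv::Point3f> in{ { xl[y], (float)y, xl[y] - xr[y] } };  // (x, y, disparity)
            std::vector<cv::Point3f> out(1);
            cv::perspectiveTransform(in, out, Q);            // [X Y Z W] = Q * [x y d 1]
            pts.push_back(out[0]);                           // relative to the left camera optical centre
        }
        return pts;
    }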
The main function of the human-machine interaction module is to receive user input and call the corresponding code to complete the corresponding work; for example, when the user clicks the start-scan button, the program controls the motor to start rotating, controls the cameras to acquire images, and controls the computation module to compute the point cloud.
The motor controller 4 controls the stepper motor 2; the stepper motor 2 rotates the laser 3, and the laser 3 projects the laser line onto the scanned object;
The aperture and focal length of the left and right cameras of the binocular camera 1 are adjusted, and the left and right camera images of the scanned object are observed in real time with the image-viewing software that matches the cameras (the software depends on the camera model; image-browsing software is normally supplied with industrial cameras, and the cameras used in this experiment use the "MindVision demonstration program"). When the image of the scanned object is sharp and its brightness matches the brightness of the object as seen by the human eye, the brightness and focal length are suitable;
That is, a person can see the image clearly: no image regions are washed out because the aperture is too large, and no regions are too dark to see because the aperture is too small.
Step 2:
The scanning program PtCreator (the test program written according to the principle of this patent) is opened and its calibration function is used;
The current binocular camera 1 is calibrated (the calibration uses the open-source binocular calibration module of the OpenCV library and the required black-and-white chessboard calibration board); the calibration yields the intrinsic matrix and distortion matrix of each of the left and right cameras of the binocular camera 1, as well as the binocular rectification matrices and the relative position matrix;
Step 3: the aperture size of the binocular camera is adjusted (manually, physically) and the exposure time is adjusted (with the camera's own software or with the camera operation module in the scanning program PtCreator) so that, as far as possible, only the laser line is visible in the left and right camera images and everything else is black. If the camera used has no physical aperture adjustment and no corresponding SDK function, this step is omitted.
Step 4: the rectification matrices obtained in step 2 are used to rectify the left and right camera images in which only the laser line is visible; the rectified images are then cropped, i.e. rectangular effective regions at the same position and of the same size are cut out of the left and right images of the scanned object, giving the left and right cropped images;
(After rectification, black broken edges appear on the four sides of the left and right camera images; these black regions are produced by camera distortion and by errors in the installation of the left and right cameras, and they would interfere with the subsequent computation. These regions are called non-effective image areas, and the rest is called the effective image. Note that in order to keep the positions and sizes of the cropped left and right images consistent, some effective image area may also be cropped away.)
Step 5: the left and right cropped images are converted to grayscale; for each grayscale image the pixel values of every row are examined, the position of the maximum pixel value in a row being taken as the mean position of the brightest location of that row; from these values the mean position of the laser line in each of the left and right images is obtained; centred on that mean position, an image strip of a designated length is cut out of each of the left and right camera images, giving the centre position of the laser line in the left and right cropped images;
After a camera image is loaded into computer memory it is converted to a grayscale image, i.e. an image without colour, what is commonly understood as a black-and-white photograph. The value of each pixel of the grayscale image is the brightness of the corresponding point; it is stored as a number in the range 0 to 255.
Therefore, finding the brightest position only requires comparing the pixel brightness values within each row and taking the position of the maximum brightness value.
Each row may contain one maximum value or several; in the latter case the exact position corresponding to the brightness peak is computed with the grey-scale centring method.
Each row then yields one value; as many rows as there are, that many values are obtained, and saving all of them gives the centre position corresponding to one laser line.
Step 6: the centre positions of the laser line in the left and right cropped images obtained in step 5 are matched row by row, i.e. the X position of the first row of pixels in the left cropped image corresponds to the X position of the first row of pixels in the right cropped image, the X position of the second row in the left cropped image corresponds to the X position of the second row in the right cropped image, and so on until the X position of row N of the left cropped image corresponds to the X position of row N of the right cropped image (as many rows as the image has, that many corresponding point pairs exist); from the position difference of each corresponding point pair, its row number and the calibrated intrinsic matrices of the left and right cameras, the three-dimensional coordinates of each pair of corresponding points relative to the left camera coordinate system are computed. The specific principle of binocular ranging is:
f is the camera focal length, T is the distance between the two optical centres (the baseline), Xl and Xr are the imaging positions of a point in space in the left and right cameras (after rectification), and Z is the distance to be found;
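With the symbols just defined, the standard similar-triangles relation of binocular ranging is:

    Z = f * T / (Xl - Xr)

where Xl - Xr is the disparity of the corresponding point pair; under the pinhole model the remaining coordinates follow as X = (Xl - cx) * Z / f and Y = (y - cy) * Z / f, with (cx, cy) the principal point from the intrinsic matrix and y the row number.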
The binocular calibration yields f, T and the camera distortion coefficients; processing the images yields Xl and Xr.
In summary, the point ranging accuracy depends on the selection of corresponding points; in the present invention, the laser line auxiliary positioning method is used to select the corresponding points;
The approach used in the present invention is based on the basic principle of binocular ranging and the OpenCV class library: using the position difference, the row number and the inverse projection matrix (i.e. the relative position matrix), the three-dimensional coordinates of the corresponding points can be computed directly; for the detailed process see the "3D point coordinate computation" part of the program description above;
N is the total number of rows of the left and right cropped images; its value is a positive integer;
the corresponding point position difference is the right camera X position value minus the left camera X position value;
the origin is the optical centre of the left camera, the Y axis points upward in the binocular camera image view, the X axis points to the right in the binocular camera image view, and the Z axis is perpendicular to the XY plane, pointing away from the camera;
Step 7: a 3D coordinate point container (an empty array) is created, and the three-dimensional coordinates (3D coordinate points), relative to the left camera coordinate system, of every corresponding point pair computed in step 6 are put into the 3D coordinate point container (each frame contributes up to image-height coordinate points).
A container is simply a place to hold data. Suppose each frame of data yields 100 points and 100 frames are scanned in total; 10 000 points are obtained, but these 10 000 points are scattered and hard to manage. We can first create an empty array that can hold 10 000 points and then put the 100 points obtained each time into this array, which makes transferring and storing the data much more convenient;
Step 8: steps 4 to 6 are repeated, and the three-dimensional coordinates (3D coordinate points) of every corresponding point pair relative to the left camera coordinate system are put into the container created in step 7, until the scanned object has been completely scanned and a complete point cloud is obtained.
The motor moves to the end position (determined by the scanning range; the scan can end once the laser reaches the edge) and the scan finishes;
The specific scanning process is controlled by the user, but the overall process must satisfy certain requirements. With the dual-laser scanning of this scheme, at the start of the scan the two laser lines should be positioned one at the left and one in the middle, with the left laser line at the left edge of the scanned object; see Fig. 6 and Fig. 7;
At the end of the scan, the left laser line should have moved to the middle position (note that this position should be to the right of where the right laser line was at the start of the scan), and the right laser line should be at the right edge of the scanned object.
Only in this way is it guaranteed that the scanned object is fully covered and that the resulting point cloud is a complete point cloud.
According to the number of points in the 3D coordinate point container and their positions, a file in .pcd format is created and the point data are written into it;
The saved .pcd file is opened with third-party software (CloudCompare) to check the quality of the generated 3D point cloud and verify the scanning result.
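A sketch of writing the accumulated container to an ASCII .pcd file that CloudCompare (or any PCL-compatible viewer) can open, assuming the container is a std::vector<cv::Point3f> as in the sketches above; the header follows the standard PCD v0.7 ASCII layout.

    #include <opencv2/core.hpp>
    #include <cstdio>
    #include <vector>

    bool savePcd(const std::vector<cv::Point3f>& cloud, const char* path)
    {
        std::FILE* f = std::fopen(path, "w");
        if (!f) return false;
        std::fprintf(f, "# .PCD v0.7 - Point Cloud Data file format\n");
        std::fprintf(f, "VERSION 0.7\nFIELDS x y z\nSIZE 4 4 4\nTYPE F F F\nCOUNT 1 1 1\n");
        std::fprintf(f, "WIDTH %zu\nHEIGHT 1\nVIEWPOINT 0 0 0 1 0 0 0\nPOINTS %zu\nDATA ascii\n",
                     cloud.size(), cloud.size());
        for (const cv::Point3f& p : cloud)
            std::fprintf(f, "%f %f %f\n", p.x, p.y, p.z);   // one point per line
        std::fclose(f);
        return true;
    }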
Principle of obtaining corresponding points
Corresponding points exist on the imaging plane in the form of two-dimensional points. Here they are obtained first in the horizontal direction and then in the vertical direction: the horizontal correspondence uses the epipolar plane, and the vertical correspondence uses the laser line for auxiliary positioning.
Horizontal epipolar principle: the epipolar geometry functions provided by OpenCV are used; their effect is that, after rectification, the rows of the left and right camera images are aligned one by one.
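A sketch of this rectification step, assuming the R1/R2/P1/P2 matrices produced by the calibration sketch above: OpenCV's rectification maps are built once and applied to each frame pair, after which the rows of the left and right images correspond one to one.

    #include <opencv2/opencv.hpp>

    void rectifyPair(const cv::Mat& KL, const cv::Mat& DL, const cv::Mat& R1, const cv::Mat& P1,
                     const cv::Mat& KR, const cv::Mat& DR, const cv::Mat& R2, const cv::Mat& P2,
                     cv::Size size, const cv::Mat& rawL, const cv::Mat& rawR,
                     cv::Mat& rectL, cv::Mat& rectR)
    {
        cv::Mat mLx, mLy, mRx, mRy;
        cv::initUndistortRectifyMap(KL, DL, R1, P1, size, CV_32FC1, mLx, mLy);
        cv::initUndistortRectifyMap(KR, DR, R2, P2, size, CV_32FC1, mRx, mRy);
        cv::remap(rawL, rectL, mLx, mLy, cv::INTER_LINEAR);   // rows now align across images
        cv::remap(rawR, rectR, mRx, mRy, cv::INTER_LINEAR);
    }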
The vertical laser line auxiliary principle is shown in Fig. 3a and Fig. 3b:
The grey-scale centring method is used to locate the position of the laser line, and the laser line centres of corresponding rows in the left and right cameras are taken as the corresponding points of each row. Sweeping the laser line from left to right then yields the 3D point cloud model of the scanned area.
A point closer to the image plane has a larger parallax between the left and right cameras, and a point farther from the image plane has a smaller parallax between the left and right cameras.
Embodiment three: this embodiment differs from embodiment two in that the intrinsic matrix of step 2 is the 3*3 matrix recording the intrinsic parameters of the binocular camera; it contains the focal lengths of the binocular camera 1 in the X and Y directions and the position of the optical centre in the image captured by the binocular camera;
The distortion matrix is the 1*5 matrix recording the distortion of the binocular camera; it records parameters such as the radial distortion and tangential distortion of the binocular camera;
The binocular rectification matrices are such that, after the left and right camera images have been rectified with their respective rectification matrices, the images are consistent in height in the horizontal direction, i.e. the same photographed object appears at the same height in the left and right camera images;
Strictly speaking the binocular rectification matrices are six matrices: the left camera rectification matrix, the left camera intrinsic (projection) matrix, the right camera rectification matrix and the right camera intrinsic (projection) matrix are obtained directly from calibration; combining the left rectification and left intrinsic matrices gives the left camera mapping matrix, and likewise for the right camera. A mapping matrix has the same size as the image and maps each image point to its new position. What is finally used is just the pair of left and right mapping matrices, but these two matrices are large (if the image is 2048*2048 the matrix is equally large), whereas the rectification and intrinsic matrices are comparatively much smaller (rectification matrix 3*3, intrinsic matrix 3*4), so the first four matrices are what is saved and the last two are computed from them when used. Since image rectification uses only the last two matrices, the first four can be ignored in the description of the principle and regarded as intermediate computation variables.
Because of the installation error of the binocular camera, the same point appears at different heights in the left and right camera images; after rectification with the rectification matrices, points at the same position are at the same height in the left and right camera images, and the rectification also corrects the distortion of the left and right images at the same time.
The relative position matrix is the spatial transformation matrix of the right camera coordinate system relative to the left camera coordinate system;
The other steps and parameters are identical to those of embodiment two.
Embodiment four: this embodiment differs from embodiment two or three in that in step 5 the left and right cropped images are converted to grayscale; for each grayscale image the pixel values of every row are examined, the position of the maximum pixel value in a row being taken as the mean position of the brightest location of that row; from these values the mean position of the laser line in each of the left and right images is obtained; centred on the mean position, an image strip of a designated length is cut out of each of the left and right camera images, giving the centre position of the laser line in the left and right cropped images;
A laser shining on the surface of an object looks like a single line, but after magnification it can be seen that its brightness follows a Gaussian distribution and the brightest region may be 2 to 5 pixels wide; the position described above is an average, and the value computed below uses the grey-scale centroid method;
Each image has only one crop position (only the X direction has to be handled, because the Y direction is aligned automatically during image rectification); below, Cl denotes the crop position of the left image and Cr denotes the crop position of the right image. The data of a corresponding point pair are (left image laser position + Cl, right image laser position + Cr). If the horizontal coordinate of one or both points of a corresponding pair in the left and right images is 0, the point is discarded;
The detailed process is as follows (a code sketch of this procedure is given after the sub-steps):
1) the left and right cropped images are converted to grayscale images;
2) the pixel values of the first row of each of the grayscale images corresponding to the left and right cameras are computed; the position of the maximum pixel value is the mean position of the brightest location of the first row, i.e. the position of the laser line in the first row;
If there are M maximum points, the maximum span of the M maximum positions is computed; the maximum span is the difference between the X position of the M-th maximum point and the X position of the first maximum point;
If the maximum span is less than the limit value (specified by the user and related to the laser line width; a typical value is 8 to 10), the position of the maximum pixel value is taken as (X position of the M-th maximum point + X position of the first maximum point)/2, which is the position of the laser line in the first row;
If the maximum span is greater than or equal to the limit value, the position is rejected and set to 0, which then stands for the position of the laser line in the first row;
M is a positive integer;
the limit value is 8 to 10;
For example, suppose there are several maxima: the first X position is 100, the second 104, the third 150 and the fourth 152; then the maximum span is 152-100=52, which exceeds the limit, so the position is set to 0. If the first X position is 100, the second 101 and the third 102 and there are no other maxima, then the maximum span is 102-100=2, which is below the limit, so the maximum position is (100+102)/2=101.
A single image is processed at a time; the left and right images are each processed in this way.
3) if the number of rows of the grayscale images corresponding to the left and right cameras is N, step 2) is repeated for every row of each grayscale image, giving N laser line positions, which are assembled into a one-dimensional matrix P, P = {P0, P1, P2, ..., Pn}, where Pn is the position of the laser line in row N;
Median filtering is applied to the one-dimensional matrix P, the 0 values are filtered out, and the mean of the remaining positions is taken, giving the mean position Pf of the laser line in the image for each of the left and right cameras;
4) centred on the mean position Pf of the laser line in the image of each camera, a strip of the designated length (the specific length is related to the total image width; typically one twentieth of the total image width is taken) is cut out of each of the left and right camera images. If the distance from Pf to the left edge of the image is less than the designated length, the crop start position is shifted right (towards the right of the image view) and the current crop start position is 0;
If the distance from Pf to the right edge of the image is less than the designated length, the crop position is shifted left (towards the left of the image view) and the current crop start position is the source image width minus the crop width;
If Pf lies in the middle of the image, the current crop start position is Pf minus half the designated length;
The crop start position Cl of the left image and the crop start position Cr of the right image are recorded;
5) because the brightness of the laser line in the image follows a Gaussian distribution (known), the grey-scale centring method (known; the grey-scale centroid method may also be used, and the results differ little) is applied to the cropped strips obtained in 4) to compute the precise laser line position in the left and right cropped strips (the positions computed above are only rough estimates used for cropping; the precision of the position computed here can reach sub-pixel level);
Adding the corresponding crop start position Cl to the laser line position of the left cropped strip gives the centre position of the laser line in the left cropped image;
Both the left and the right image must go through this computation; for n corresponding rows of the left and right images we then obtain n corresponding point pairs, distributed in the left and right cropped images as: {PTl0(Pl0, 0), PTr0(Pr0, 0)}, {PTl1(Pl1, 1), PTr1(Pr1, 1)}, ..., {PTln(Pln, n), PTrn(Prn, n)},
where PTl0 is point 0 on the left, PTr0 is point 0 on the right, Pl0 is the abscissa of PTl0 and 0 is its ordinate, (Pl0, 0) is the coordinate of PTl0, (Pr0, 0) is the coordinate of PTr0, and Pr0 is the abscissa of PTr0 with 0 its ordinate;
PTl1 is point 1 on the left, PTr1 is point 1 on the right, Pl1 is the abscissa of PTl1 and 1 is its ordinate, (Pl1, 1) is the coordinate of PTl1, (Pr1, 1) is the coordinate of PTr1, and Pr1 is the abscissa of PTr1 with 1 its ordinate;
PTln is point n on the left, PTrn is point n on the right, Pln is the abscissa of PTln and n is its ordinate, (Pln, n) is the coordinate of PTln, (Prn, n) is the coordinate of PTrn, and Prn is the abscissa of PTrn with n its ordinate;
Note that the X coordinates computed here are relative to the source image;
Adding the corresponding crop start position Cr to the laser line position of the right cropped strip gives the centre position of the laser line in the right cropped image;
If the X coordinate of one or both of the points in the centre positions of the laser line in the left and right cropped images is 0, the centre point of that laser line row is discarded.
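A condensed C++/OpenCV sketch of this laser-centre extraction, under simplifying assumptions: the median of the non-zero rough positions stands in for the text's median-filter-then-mean estimate of Pf; a fixed window half-width of 32 pixels (roughly one twentieth of a 1280-pixel image) replaces the explicit crop bookkeeping of sub-steps 4) and 5); and a grey-scale centroid is used for the sub-pixel refinement, which the text notes gives nearly the same result as the grey-scale centring method.

    #include <opencv2/opencv.hpp>
    #include <algorithm>
    #include <vector>

    // Rough per-row position: midpoint of the first and last maximum, or 0 if the maxima
    // span `limit` pixels or more (sub-step 2); grayRow is one CV_8UC1 image row.
    static float roughRowPos(const cv::Mat& grayRow, int limit)
    {
        double maxVal;  cv::minMaxLoc(grayRow, nullptr, &maxVal);
        int first = -1, last = -1;
        for (int x = 0; x < grayRow.cols; ++x)
            if (grayRow.at<uchar>(0, x) == (uchar)maxVal) { if (first < 0) first = x; last = x; }
        return (last - first < limit) ? (first + last) / 2.0f : 0.0f;
    }

    // Full image: rough positions -> robust central position Pf -> grey-centroid refinement
    // in a window around Pf; a returned value of 0 marks an invalid row.
    std::vector<float> laserCenters(const cv::Mat& gray, int limit = 9, int halfWin = 32)
    {
        std::vector<float> rough(gray.rows), centers(gray.rows, 0.0f);
        for (int y = 0; y < gray.rows; ++y) rough[y] = roughRowPos(gray.row(y), limit);

        std::vector<float> valid;                    // drop zeros, take the median as Pf
        for (float p : rough) if (p > 0.0f) valid.push_back(p);
        if (valid.empty()) return centers;
        std::nth_element(valid.begin(), valid.begin() + valid.size() / 2, valid.end());
        float Pf = valid[valid.size() / 2];

        for (int y = 0; y < gray.rows; ++y) {
            int x0 = std::max(0, (int)Pf - halfWin), x1 = std::min(gray.cols, (int)Pf + halfWin);
            double wsum = 0, xsum = 0;
            for (int x = x0; x < x1; ++x) {          // grey centroid: intensity-weighted column
                double w = gray.at<uchar>(y, x);
                wsum += w;  xsum += w * x;
            }
            centers[y] = (wsum > 0) ? (float)(xsum / wsum) : 0.0f;
        }
        return centers;                              // positions are relative to the source image
    }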
The other steps and parameters are identical to those of embodiment two or three.
Embodiment five: this embodiment differs from one of embodiments two to four in that the designated width in step 5 is determined by the image size and may be set to one tenth of the size of the left and right cropped images;
The other steps and parameters are identical to those of one of embodiments two to four.
Embodiment six: this embodiment differs from one of embodiments two to five in that the centre position of the laser line is the position relative to the source image rather than the position within the intercepted strip; the strip is intercepted only to speed up image processing;
The source images are the left and right camera images of step 4 before cropping.
The other steps and parameters are identical to those of one of embodiments two to five.
The beneficial effects of the present invention are verified with the following example:
Example one:
The 3D scanning device and method based on binocular collaborative laser of this example are specifically carried out according to the following steps:
Two MV-GE132GM-T industrial cameras are used (resolution 1280*1024, frame rate 92 FPS); a hardware circuit board splits the same trigger signal in two and feeds it to the two cameras, which guarantees synchronized image acquisition. For scanning a 30*30 cm bin, the scanning time is about 5 s and the scanning accuracy is about x: 0.6 mm, y: 0.3 mm, z: 0.15 mm. If a faster scanning speed is needed in industrial production, an auxiliary laser can be installed on the structure, reducing the scanning time to 2.5 s to 3 s. If higher accuracy is needed, cameras of higher resolution and higher frame rate can be substituted.
The present invention may also have various other embodiments; without departing from the spirit and essence of the present invention, those skilled in the art can make various corresponding changes and variations according to the present invention, but all such corresponding changes and variations shall fall within the protection scope of the appended claims of the present invention.

Claims (6)

1. A 3D scanning device based on binocular collaborative laser, characterized in that the 3D scanning device based on binocular collaborative laser comprises: a binocular camera (1), a stepper motor (2), a laser (3) and a motor controller (4);
the binocular camera (1) comprises a left camera (1-1), a right camera (1-2) and a binocular connection fixture (1-3);
the left camera (1-1) and the right camera (1-2) are connected through the binocular connection fixture (1-3);
the laser (3) is mounted on the stepper motor (2);
the motor controller (4) is connected to the stepper motor (2) by a signal line and controls the movement of the stepper motor (2); the stepper motor (2) drives the laser (3), and after the laser (3) projects its beam onto the scanned object (5), the binocular camera (1) photographs the scanned object (5).
2. A 3D scanning method based on binocular collaborative laser, characterized in that the detailed process of the 3D scanning method based on binocular collaborative laser is as follows:
Step 1: the motor controller (4) controls the stepper motor (2); the stepper motor (2) rotates the laser (3), and the laser (3) projects a laser line onto the scanned object;
Step 2: the current binocular camera (1) is calibrated; the calibration yields the intrinsic matrix and distortion matrix of each of the left and right cameras of the binocular camera (1), as well as the binocular rectification matrices and the relative position matrix;
Step 3: the aperture size and exposure time of the binocular camera (1) are adjusted so that only the laser line is visible in the left and right camera images;
Step 4: the rectification matrices obtained in step 2 are used to rectify the left and right camera images in which only the laser line is visible; the rectified images are then cropped, i.e. rectangular effective regions at the same position and of the same size are cut out of the left and right images of the scanned object, giving the left and right cropped images;
Step 5: the left and right cropped images are converted to grayscale; for each grayscale image the pixel values of every row are examined, the position of the maximum pixel value in a row being taken as the mean position of the brightest location of that row; from these values the mean position of the laser line in each of the left and right images is obtained; centred on the mean position, an image strip of a specified width is cut out of each of the left and right camera images, giving the centre position of the laser line in the left and right cropped images;
Step 6: the centre positions of the laser line in the left and right cropped images obtained in step 5 are matched row by row, i.e. the X position of the first row of pixels in the left cropped image corresponds to the X position of the first row of pixels in the right cropped image, the X position of the second row in the left cropped image corresponds to the X position of the second row in the right cropped image, and so on until the X position of row N of the left cropped image corresponds to the X position of row N of the right cropped image; from the position difference of each corresponding point pair, its row number and the calibrated intrinsic matrices of the left and right cameras, the three-dimensional coordinates of each pair of corresponding points relative to the left camera coordinate system are computed;
N is the total number of rows of the left and right cropped images; its value is a positive integer;
the corresponding point position difference is the right camera X position value minus the left camera X position value;
the origin is the optical centre of the left camera, the Y axis points upward in the binocular camera image view, the X axis points to the right in the binocular camera image view, and the Z axis is perpendicular to the XY plane;
Step 7: a 3D coordinate point container is created, and the three-dimensional coordinates, relative to the left camera coordinate system, of every corresponding point pair computed in step 6 are put into the 3D coordinate point container;
Step 8: steps 4 to 6 are repeated, and the three-dimensional coordinates of every corresponding point pair relative to the left camera coordinate system are put into the container created in step 7, until the scanned object has been completely scanned and a complete point cloud is obtained.
3. The 3D scanning method based on binocular collaborative laser according to claim 2, characterized in that the intrinsic matrix of step 2 is the 3*3 matrix recording the intrinsic parameters of the binocular camera; it contains the focal lengths of the binocular camera (1) in the X-axis and Y-axis directions and the position of the optical centre in the image captured by the binocular camera;
the distortion matrix is the 1*5 matrix recording the distortion of the binocular camera; it records the radial distortion and tangential distortion parameters of the binocular camera;
the relative position matrix is the spatial transformation matrix of the right camera coordinate system relative to the left camera coordinate system.
4. The 3D scanning method based on binocular collaborative laser according to claim 3, characterized in that in step 5 the left and right cropped images are converted to grayscale; for each grayscale image the pixel values of every row are examined, the position of the maximum pixel value in a row being taken as the mean position of the brightest location of that row; from these values the mean position of the laser line in each of the left and right images is obtained; centred on the mean position, an image strip of the specified width is cut out of each of the left and right camera images, giving the centre position of the laser line in the left and right cropped images; the detailed process is as follows:
1) the left and right cropped images are converted to grayscale images;
2) the pixel values of the first row of each of the grayscale images corresponding to the left and right cameras are computed; the position of the maximum pixel value is the mean position of the brightest location of the first row, i.e. the position of the laser line in the first row;
if there are M maximum points, the maximum span of the M maximum positions is computed; the maximum span is the difference between the X position of the M-th maximum point and the X position of the first maximum point;
if the maximum span is less than the limit value, the position of the maximum pixel value is taken as (X position of the M-th maximum point + X position of the first maximum point)/2, which is the position of the laser line in the first row;
if the maximum span is greater than or equal to the limit value, the position is rejected and set to 0, which then stands for the position of the laser line in the first row;
M is a positive integer;
the limit value is 8 to 10;
3) if the number of rows of the grayscale images corresponding to the left and right cameras is N, step 2) is repeated for every row of each grayscale image, giving N laser line positions, which are assembled into a one-dimensional matrix P, P = {P0, P1, P2, ..., Pn}, where Pn is the position of the laser line in row N;
median filtering is applied to the one-dimensional matrix P, the 0 values are filtered out, and the mean of the remaining positions is taken, giving the mean position Pf of the laser line in the image for each of the left and right cameras;
4) centred on the mean position Pf of the laser line in the image of each camera, a strip of the specified width is cut out of each of the left and right camera images; if the distance from Pf to the left edge of the image is less than the specified width, the crop start position is shifted right and the current crop start position is 0, giving the cropped strip;
if the distance from Pf to the right edge of the image is less than the specified width, the crop position is shifted left and the current crop start position is the source image width minus the crop width, giving the cropped strip;
if Pf lies in the middle of the image, the current crop start position is Pf minus half the specified width, giving the cropped strip;
the crop start position Cl of the left image and the crop start position Cr of the right image are recorded;
5) the grey-scale centring method is applied to the cropped strips obtained in 4) to compute the laser line position in the left and right cropped strips;
adding the corresponding crop start position Cl to the laser line position of the left cropped strip gives the centre position of the laser line in the left cropped image;
adding the corresponding crop start position Cr to the laser line position of the right cropped strip gives the centre position of the laser line in the right cropped image;
if the X coordinate of one or both of the points in the centre positions of the laser line in the left and right cropped images is 0, the centre point of that laser line row is discarded.
5. The 3D scanning method based on binocular collaborative laser according to claim 4, characterized in that the specified width in step 5 is set to one tenth of the size of the left and right cropped images.
6. The 3D scanning method based on binocular collaborative laser according to claim 5, characterized in that the centre position of the laser line in step 5 is the position relative to the source image;
the source images are the left and right camera images of step 4 before cropping.
CN201710681112.1A 2017-08-10 2017-08-10 3D scanning device and scanning method based on binocular collaborative laser Active CN107505324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710681112.1A CN107505324B (en) 2017-08-10 2017-08-10 3D scanning device and scanning method based on binocular collaborative laser

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710681112.1A CN107505324B (en) 2017-08-10 2017-08-10 3D scanning device and scanning method based on binocular collaborative laser

Publications (2)

Publication Number Publication Date
CN107505324A true CN107505324A (en) 2017-12-22
CN107505324B CN107505324B (en) 2020-06-16

Family

ID=60690689

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710681112.1A Active CN107505324B (en) 2017-08-10 2017-08-10 3D scanning device and scanning method based on binocular collaborative laser

Country Status (1)

Country Link
CN (1) CN107505324B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108534708A (en) * 2018-03-30 2018-09-14 深圳积木易搭科技技术有限公司 A kind of binocular three-dimensional scanner assemblies and scan method
CN109590231A (en) * 2018-12-19 2019-04-09 上海易持自动系统有限公司 A kind of non-regular shape material image measurement measuring and controlling device and method
CN110555872A (en) * 2019-07-09 2019-12-10 牧今科技 Method and system for performing automatic camera calibration of a scanning system
CN110595391A (en) * 2019-09-26 2019-12-20 桂林电子科技大学 Cross line structured light binocular vision scanning device
CN111452036A (en) * 2019-03-19 2020-07-28 北京伟景智能科技有限公司 Workpiece grabbing method based on line laser binocular stereo vision
CN111479053A (en) * 2020-03-25 2020-07-31 清华大学 Software control system and method for scanning light field multicolor microscopic imaging
CN111738971A (en) * 2019-03-19 2020-10-02 北京伟景智能科技有限公司 Circuit board stereo scanning detection method based on line laser binocular stereo vision
CN112285125A (en) * 2020-11-11 2021-01-29 安徽锦希自动化科技有限公司 Detection device for collecting dust deposition degree on solar panel
CN112304951A (en) * 2019-08-01 2021-02-02 唐山英莱科技有限公司 Visual detection device and method for high-reflection welding seam through binocular single-line light path
CN112770046A (en) * 2020-12-21 2021-05-07 深圳市瑞立视多媒体科技有限公司 Generation method of control SDK of binocular USB camera and control method of binocular USB camera
CN113048908A (en) * 2021-03-08 2021-06-29 中国海洋大学 Submarine landform detection image generation system based on laser scanning
CN114111574A (en) * 2021-11-23 2022-03-01 西安理工大学 High-temperature red-hot target binocular line laser vision three-dimensional measurement method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07294443A (en) * 1994-04-25 1995-11-10 Central Japan Railway Co Ballast condition inspection device for roadbed shoulder part
WO1999001988A1 (en) * 1997-07-02 1999-01-14 Ericsson, Inc. Three-dimensional imaging and display system
CN102012217A (en) * 2010-10-19 2011-04-13 南京大学 Method for measuring three-dimensional geometrical outline of large-size appearance object based on binocular vision
CN103940369A (en) * 2014-04-09 2014-07-23 大连理工大学 Quick morphology vision measuring method in multi-laser synergic scanning mode
CN104390584A (en) * 2014-05-22 2015-03-04 北京中天荣泰科技发展有限公司 Binocular vision laser calibration measurement device and measurement method
CN105157602A (en) * 2015-07-13 2015-12-16 西北农林科技大学 Remote three-dimensional scanning system and method based on machine vision
CN105300316A (en) * 2015-09-22 2016-02-03 大连理工大学 Light stripe center rapid extraction method based on gray centroid method
CN105698699A (en) * 2016-01-26 2016-06-22 大连理工大学 A binocular visual sense measurement method based on time rotating shaft constraint

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07294443A (en) * 1994-04-25 1995-11-10 Central Japan Railway Co Ballast condition inspection device for roadbed shoulder part
WO1999001988A1 (en) * 1997-07-02 1999-01-14 Ericsson, Inc. Three-dimensional imaging and display system
CN102012217A (en) * 2010-10-19 2011-04-13 南京大学 Method for measuring three-dimensional geometrical outline of large-size appearance object based on binocular vision
CN103940369A (en) * 2014-04-09 2014-07-23 大连理工大学 Quick morphology vision measuring method in multi-laser synergic scanning mode
CN104390584A (en) * 2014-05-22 2015-03-04 北京中天荣泰科技发展有限公司 Binocular vision laser calibration measurement device and measurement method
CN105157602A (en) * 2015-07-13 2015-12-16 西北农林科技大学 Remote three-dimensional scanning system and method based on machine vision
CN105300316A (en) * 2015-09-22 2016-02-03 大连理工大学 Light stripe center rapid extraction method based on gray centroid method
CN105698699A (en) * 2016-01-26 2016-06-22 大连理工大学 A binocular visual sense measurement method based on time rotating shaft constraint

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SUMANDEEP BANERJEE 等: "A Low-Cost Portable 3D Laser Scanning System with Aptness from Acquisition to Visualization", 《IEEE》 *
姜雨彤 (Jiang Yutong): "Research on Binocular Ranging System and Calibration Method", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108534708A (en) * 2018-03-30 2018-09-14 深圳积木易搭科技技术有限公司 A kind of binocular three-dimensional scanner assemblies and scan method
CN109590231A (en) * 2018-12-19 2019-04-09 上海易持自动系统有限公司 A kind of non-regular shape material image measurement measuring and controlling device and method
CN111452036B (en) * 2019-03-19 2023-08-04 北京伟景智能科技有限公司 Workpiece grabbing method based on line laser binocular stereoscopic vision
CN111738971A (en) * 2019-03-19 2020-10-02 北京伟景智能科技有限公司 Circuit board stereo scanning detection method based on line laser binocular stereo vision
CN111738971B (en) * 2019-03-19 2024-02-27 北京伟景智能科技有限公司 Circuit board stereoscopic scanning detection method based on line laser binocular stereoscopic vision
CN111452036A (en) * 2019-03-19 2020-07-28 北京伟景智能科技有限公司 Workpiece grabbing method based on line laser binocular stereo vision
CN110555872A (en) * 2019-07-09 2019-12-10 牧今科技 Method and system for performing automatic camera calibration of a scanning system
CN110555872B (en) * 2019-07-09 2023-09-05 牧今科技 Method and system for performing automatic camera calibration of scanning system
US11967113B2 (en) 2019-07-09 2024-04-23 Mujin, Inc. Method and system for performing automatic camera calibration for a scanning system
CN111199559A (en) * 2019-07-09 2020-05-26 牧今科技 Method and system for performing automatic camera calibration of a scanning system
US11074722B2 (en) 2019-07-09 2021-07-27 Mujin, Inc. Method and system for performing automatic camera calibration for a scanning system
CN112304951A (en) * 2019-08-01 2021-02-02 唐山英莱科技有限公司 Visual detection device and method for high-reflection welding seam through binocular single-line light path
CN110595391A (en) * 2019-09-26 2019-12-20 桂林电子科技大学 Cross line structured light binocular vision scanning device
CN111479053B (en) * 2020-03-25 2021-07-16 清华大学 Software control system and method for scanning light field multicolor microscopic imaging
CN111479053A (en) * 2020-03-25 2020-07-31 清华大学 Software control system and method for scanning light field multicolor microscopic imaging
CN112285125A (en) * 2020-11-11 2021-01-29 安徽锦希自动化科技有限公司 Detection device for collecting dust deposition degree on solar panel
CN112770046A (en) * 2020-12-21 2021-05-07 深圳市瑞立视多媒体科技有限公司 Generation method of control SDK of binocular USB camera and control method of binocular USB camera
CN113048908A (en) * 2021-03-08 2021-06-29 中国海洋大学 Submarine landform detection image generation system based on laser scanning
CN114111574A (en) * 2021-11-23 2022-03-01 西安理工大学 High-temperature red-hot target binocular line laser vision three-dimensional measurement method
CN114111574B (en) * 2021-11-23 2024-01-09 西安理工大学 High-temperature red-hot target binocular line laser vision three-dimensional measurement method

Also Published As

Publication number Publication date
CN107505324B (en) 2020-06-16

Similar Documents

Publication Publication Date Title
CN107505324A (en) 3D scanning device and scanning method based on binocular collaborative laser
CN110763152B (en) Underwater active rotation structure light three-dimensional vision measuring device and measuring method
CN106949845B (en) Two dimension laser galvanometer scanning system and scaling method based on binocular stereo vision
CN106097300B (en) A kind of polyphaser scaling method based on high-precision motion platform
CN109215108B (en) Panoramic three-dimensional reconstruction system and method based on laser scanning
CN110044300A (en) Amphibious 3D vision detection device and detection method based on laser
CN1158684A (en) Method and appts. for transforming coordinate systems in an automated video monitor alignment system
CN109297433A (en) 3D vision guide de-stacking measuring system and its control method
CN111402411A (en) Scattered object identification and grabbing method based on line structured light
CN110966956A (en) Binocular vision-based three-dimensional detection device and method
CN1561502A (en) Strapdown system for three-dimensional reconstruction
CN111189415B (en) Multifunctional three-dimensional measurement reconstruction system and method based on line structured light
CN106500626A (en) A kind of mobile phone stereoscopic imaging method and three-dimensional imaging mobile phone
CN106780593B (en) A kind of acquisition methods of color depth image obtain equipment
CN107241592A (en) A kind of projecting unit and filming apparatus, processor, imaging device including the unit
CN107564051B (en) Depth information acquisition method and system
CN108340405B (en) Robot three-dimensional scanning system and method
CN109506629B (en) Method for calibrating rotation center of underwater nuclear fuel assembly detection device
CN116645476B (en) Rod three-dimensional data model reconstruction method and system based on multi-view vision
CN209342062U (en) 3D vision guide de-stacking measuring system
CN116363226A (en) Real-time multi-camera multi-projector 3D imaging processing method and device
Huang et al. Line laser based researches on a three-dimensional measuring system
KR102080506B1 (en) 3D optical scanner
CN106934861B (en) Object three-dimensional reconstruction method and device
CN109612408A (en) Semiconductor laser angle measurement method, apparatus and readable storage medium storing program for executing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information
    Inventor after: Li Jie
    Inventor before: Wang Xing
TA01 Transfer of patent application right
    Effective date of registration: 20180321
    Address after: 150000 Nantong street, Nangang District, Harbin, Heilongjiang Province, No. 145-11
    Applicant after: Li Jie
    Address before: 150000 Harbin City, Heilongjiang 150000
    Applicant before: Wang Xing
GR01 Patent grant