CN102506711B - Line laser vision three-dimensional rotate scanning method - Google Patents
Line laser vision three-dimensional rotate scanning method

- Publication number: CN102506711B (also published as CN102506711A)
- Application number: CN 201110341204 (CN201110341204A)
- Authority: CN (China)
- Prior art keywords: pilot hole, laser, coordinate, point, robotic arm
- Legal status: Expired - Fee Related
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
To address the drawbacks of existing laser scanning systems, namely difficult field calibration, a narrow depth-of-field range, and low scanning efficiency, the invention provides a line laser vision three-dimensional rotary scanning method. With this method the depth of field can be adapted to the dimensions of the object, field calibration can be performed rapidly, the object can be virtually reconstructed, and specified feature points of the object can be tracked in real time. A line laser scanning scheme improves scanning efficiency and real-time performance. A positioning-platform technique enables rapid field calibration of the system model and lets users conveniently adjust the camera's viewing angle and depth of field, making the method more flexible in application. Virtual reconstruction of the object with OpenGL allows any dimension of the object to be measured. A robotic-arm positioning hole on the positioning platform allows real-time or time-delayed tracking of points of interest on the object. The method is of great significance for industrial production and inspection.
Description
Technical field
The present invention relates to a line laser vision three-dimensional rotary scanning method, specifically a method that performs three-dimensional rotary scanning of an object to measure its dimensions and track its feature points.
Background technology
Laser three-dimensional scanning reconstructs panoramic three-dimensional data and models of objects from complex entities or real scenes; it mainly acquires three-dimensional measurement data of lines, surfaces, bodies and spaces of a target and performs high-accuracy three-dimensional reverse modeling. Applications include target modeling, duplication, copying, simulation, repair, renovation, structural analysis, retrofitting, maintenance and archiving, as well as training, simulation, virtual reality, deduction, testing, inspection, deformation analysis, CAD, CAM, CMMS, finite element stress analysis, fluid dynamics analysis and other reverse-engineering tasks. Laser combined with vision technology offers high measurement accuracy, a simple principle and low cost relative to other vision schemes, and is finding increasingly wide application.
According to the type of laser, laser three-dimensional scanning can be divided into point scanning, line scanning and coded-surface scanning. Point scanning has a small scanning range and low efficiency; coded-surface scanning is efficient, but coded structured light is difficult to realize; line scanning can scan many points at once, is simple to implement, simplifies the system structure and reduces system cost.
Summary of the invention
The object of the invention is to address the shortcomings of the point and surface scanning modes by proposing a line laser vision three-dimensional rotary scanning method that can change the depth of field according to the dimensions of the object, achieve rapid on-site calibration, virtually reconstruct the object, and track specified feature points of the object in real time.
According to the technical scheme provided by the invention, the line laser vision three-dimensional rotary scanning method comprises the following steps:
First, design the positioning platform. The positioning platform contains a positioning hole for each part of the system: a rotating-shaft positioning hole, calibration-bar positioning holes, a camera positioning hole, a laser positioning hole and a robotic-arm positioning hole. The origin of the positioning-platform coordinate system coincides with the origin of the robotic-arm coordinate system.
Second, install each component at its positioning hole and install the calibration bar. The line laser illuminates the calibration bar, and the laser stripe breaks at the calibration holes of the bar. The camera captures an image of the calibration holes; the line laser stripe is thinned using the medial-axis transform, the thinned laser trace is serialized, breakpoints are searched according to the point sequence, and the coordinates of the calibration holes in the image coordinate system are determined. Combined with the spatial coordinates of the positioning holes in the system, the system calibration is completed.
Third, remove the calibration bar and place the object to be measured or tracked. Set the rotating shaft to angular velocity ω so the object undergoes passive rotary scanning. The camera acquires in real time the laser stripe projected onto the surface of the measured object; the stripe is thinned and each pixel of the obtained stripe is back-projected (reverse-engineering operation) to obtain the coordinates of the pixel in the positioning platform.
Fourth, measure, virtually reconstruct and track the object. The physical dimensions of the object are obtained directly from the obtained coordinate information. All obtained coordinate points are rendered in the computer with OpenGL to obtain the point cloud of the object; the point cloud is gridded, and a texture is attached to obtain a virtual image of the object. Points of interest selected from the point cloud are sent to the robotic arm to track the feature points of the object.
The positioning platform is designed as follows:
(1.1) Determine the robotic-arm positioning hole; the origin of the robotic-arm coordinate system is the center of the robotic-arm positioning hole, the origin of the platform coordinate system coincides with the origin of the robotic-arm coordinate system, and the positioning-platform coordinate system is established. The positioning-platform coordinate system is an XYZ rectangular space coordinate system coinciding with the robotic-arm coordinate system. The laser positioning hole and the camera positioning hole are set in turn along the X direction, the rotating-shaft positioning hole is set along the Y direction, and the first and second calibration-bar positioning holes are set on the line between the rotating-shaft positioning hole and the laser positioning hole;
(1.2) On the positioning platform, determine in turn the rotating-shaft positioning hole (x0r, y0r), the first calibration-bar positioning hole (x1c, y1c), the second calibration-bar positioning hole (x2c, y2c), the camera positioning hole and the laser positioning hole (x4l, y4l); the laser positioning hole (x4l, y4l), the first calibration-bar positioning hole (x1c, y1c), the second calibration-bar positioning hole (x2c, y2c) and the rotating-shaft positioning hole lie on the same straight line;
(1.3) Determine the position coordinates, in the XYZ rectangular space coordinate system, of the six calibration holes arranged on the calibration bar: (x1c, y1c, z1c), (x1c, y1c, z2c), (x1c, y1c, z3c), (x2c, y2c, z4c), (x2c, y2c, z5c), (x2c, y2c, z6c).
The system calibration of step 2 comprises the following steps:
(2.1) The camera acquires the laser stripe image through an optical filter; the image is binarized using the bimodal (two-peak) method to accurately extract the laser stripe. The laser stripe is thinned using the medial-axis transform to obtain the thinned laser stripe;
(2.2) The thinned light points forming the laser stripe are serialized. Serialization method: compute the second-order norm ||(u, v)|| of the pixel coordinates of each laser point and order the points by the magnitude of this norm. Traverse the serialized points; wherever the spacing between two consecutive points is much larger than the average spacing, average the pixel coordinates of the two separated points to obtain the six calibration-hole pixel coordinates (u1, v1), (u2, v2), (u3, v3), (u4, v4), (u5, v5), (u6, v6);
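The serialization and breakpoint search of step (2.2) can be illustrated with a minimal NumPy sketch, assuming the thinned stripe is given as an array of (u, v) pixel coordinates; the gap threshold and the function name are illustrative and not specified by the patent:

```python
import numpy as np

def find_calibration_holes(points, gap_factor=2.0):
    """Order thinned laser points by their 2-norm and locate stripe breaks.

    points: (N, 2) array of (u, v) pixel coordinates of the thinned stripe.
    Returns one (u, v) pixel coordinate per detected break (calibration hole),
    taken as the mean of the two points that straddle the gap.
    """
    pts = np.asarray(points, dtype=float)
    # Serialize: sort the light points by the second-order norm ||(u, v)||.
    order = np.argsort(np.linalg.norm(pts, axis=1))
    seq = pts[order]
    # Spacing between consecutive points along the serialized stripe.
    gaps = np.linalg.norm(np.diff(seq, axis=0), axis=1)
    mean_gap = gaps.mean()
    holes = []
    for i, g in enumerate(gaps):
        if g > gap_factor * mean_gap:                  # spacing much larger than average
            holes.append((seq[i] + seq[i + 1]) / 2.0)  # average the two separated points
    return np.array(holes)                             # ideally six rows: (u1, v1) ... (u6, v6)
```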
(2.3) Let H denote the projection matrix from the space coordinate system to the imaging-plane pixel coordinate system (the matrix equation is given in the original as a figure). The equation of the light plane formed by the laser stripe in space is a3X + b3Y + c3Z = 1. Combining this with the projection matrix from the space coordinate system to the imaging-plane pixel coordinate system gives the calibration equations (given in the original as a figure). Substituting the coordinate values obtained in steps 1.3 and 2.2 into these equations and solving yields the matrix; this matrix is the projection matrix of the system, which completes the system calibration.
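Since the exact matrix equations appear only in the original figures, the calibration solve can be sketched under an assumed model: a 3x4 homogeneous projection (u, v, 1) ~ H (X, Y, Z, 1) estimated by direct linear transformation from the six hole correspondences, plus a least-squares fit of the light-plane coefficients (a3, b3, c3). This is a hedged illustration, not the patent's exact formulation:

```python
import numpy as np

def fit_light_plane(space_pts):
    """Fit a3*X + b3*Y + c3*Z = 1 to the six calibration-hole space coordinates."""
    A = np.asarray(space_pts, dtype=float)             # (6, 3)
    coeffs, *_ = np.linalg.lstsq(A, np.ones(len(A)), rcond=None)
    return coeffs                                      # (a3, b3, c3)

def estimate_projection(space_pts, pixel_pts):
    """Estimate a 3x4 projection H with (u, v, 1)^T ~ H (X, Y, Z, 1)^T by DLT."""
    rows = []
    for (X, Y, Z), (u, v) in zip(space_pts, pixel_pts):
        P = [X, Y, Z, 1.0]
        rows.append(np.r_[P, np.zeros(4), [-u * p for p in P]])
        rows.append(np.r_[np.zeros(4), P, [-v * p for p in P]])
    A = np.array(rows)                                 # (12, 12) for six holes
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 4)                           # smallest singular vector
    return H / H[2, 3]                                 # fix the overall scale
```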
Each pixel of the laser stripe is back-projected (reverse-engineering operation) as follows:
(3.1) Remove the calibration bar and place the object to be measured or tracked; set the rotating shaft to angular velocity ω so the object undergoes passive rotary scanning, and acquire with the camera, in real time, the laser stripe projected onto the surface of the measured object;
(3.2) Solve the inverse projection matrix H⁻¹ = (HHᵀ)⁻¹;
(3.3) Thin the laser stripe on the object surface and, for each pixel mi(ui, vi) of the thinned stripe, perform the reverse-engineering operation to recover its space coordinates Mi(xi, yi, zi).
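Under the same assumed model, the per-pixel reverse-engineering step amounts to combining the pixel's two projection equations with the light-plane equation and solving a small linear system for (X, Y, Z); the patent's H⁻¹ = (HHᵀ)⁻¹ notation plays this role. A sketch:

```python
import numpy as np

def back_project(H, plane, u, v):
    """Recover M = (X, Y, Z) for a stripe pixel (u, v).

    H:     3x4 projection with (u, v, 1)^T ~ H (X, Y, Z, 1)^T
    plane: (a3, b3, c3) with a3*X + b3*Y + c3*Z = 1
    The pixel's two projection equations are combined with the light-plane
    equation, giving three linear equations in (X, Y, Z).
    """
    a3, b3, c3 = plane
    # u * (H[2] . M~) = H[0] . M~   and   v * (H[2] . M~) = H[1] . M~
    A = np.array([
        H[0, :3] - u * H[2, :3],
        H[1, :3] - v * H[2, :3],
        [a3, b3, c3],
    ])
    b = np.array([u * H[2, 3] - H[0, 3],
                  v * H[2, 3] - H[1, 3],
                  1.0])
    return np.linalg.solve(A, b)        # (X, Y, Z) in the platform coordinate system
```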
Measuring, virtually reconstructing and tracking the object comprises the following steps:
(4.1) The camera samples the laser stripe on the object surface at time interval ti; the laser pixels of the n-th acquisition are converted by the reverse-engineering operation into the spatial point sequence seq(n), where seq(n) = [xn yn zn]. At any moment, the rotation of the shaft is applied to seq(n) as a rotation transform with matrix R(θ), θ = n·ti·ω, so the coordinates of seq(n) at that moment are recalculated as seq(n)·R(θ). After a full 360° scan of the object, the complete point cloud of the object in space is obtained.
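A sketch of step (4.1), assuming rotation about a vertical axis through the turntable center; the axis placement, argument names and stripe container are illustrative assumptions:

```python
import numpy as np

def accumulate_cloud(stripes, t_i, omega, center_xy=(0.0, 0.0)):
    """Rotate each acquired stripe by the turntable angle and merge into one cloud.

    stripes:   list where stripes[n] is a (k, 3) array seq(n) of [x, y, z] points
               recovered by back-projection for the n-th acquisition
    t_i:       acquisition time interval
    omega:     angular velocity of the rotating shaft (rad/s)
    center_xy: (x, y) of the rotation axis in the platform coordinate system
    """
    cx, cy = center_xy
    cloud = []
    for n, seq in enumerate(stripes):
        theta = n * t_i * omega                       # theta = n * ti * omega
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s, 0.0],                   # rotation about the vertical axis
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
        shifted = seq - [cx, cy, 0.0]                 # move the rotation axis to the origin
        cloud.append(shifted @ R.T + [cx, cy, 0.0])   # seq(n) * R(theta), axis restored
    return np.vstack(cloud)                           # full 360-degree point cloud
```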
(4.2) For any two points Mp and Mq on the object, the distance between them is ||Mp - Mq||.
(4.3) Render all the obtained object space coordinate points with OpenGL to obtain the point cloud of the object; triangulate the point cloud data with the Delaunay algorithm to mesh it into facets, then attach the texture to complete the virtual reconstruction of the object.
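Delaunay meshing of the rotary-scan cloud can be sketched with SciPy by triangulating in a cylindrical (angle, height) parameterization around the rotation axis; this parameterization is an assumption chosen for illustration and is not prescribed by the patent:

```python
import numpy as np
from scipy.spatial import Delaunay

def mesh_rotary_cloud(cloud, center_xy=(0.0, 0.0)):
    """Triangulate a rotary-scan point cloud into facets.

    cloud: (N, 3) array of [x, y, z] points; returns (cloud, faces) where faces
    is an (M, 3) array of vertex indices usable as an OpenGL triangle list.
    """
    cx, cy = center_xy
    # Cylindrical parameterization around the rotation axis: (angle, height).
    theta = np.arctan2(cloud[:, 1] - cy, cloud[:, 0] - cx)
    param = np.column_stack([theta, cloud[:, 2]])
    faces = Delaunay(param).simplices          # 2-D Delaunay triangulation
    return cloud, faces
```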
(4.4) The positioning platform provides the robotic-arm positioning hole, so the robotic arm can track specified points on the object.
Tracking of specified object points by the robotic arm comprises:
Real-time tracking: the points of interest obtained from the scan are sent directly to the robotic arm to track the feature points of the object in real time;
Time-delayed tracking: when the scanning point and the tracking point must be isolated, the tracking plane of the robotic arm is set to X = A, where A is a constant, and the normal vector of this plane is (a4 = 1, b4 = 0, c4 = 0). The angular velocity of the rotating shaft is ω and the normal vector of the scanning plane is (a3, b3, c3), so the angle between the scanning plane and the tracking plane is Q. Tracking is timed to begin a time ts after scanning starts (the timing expression is given in the original as a figure). For an arbitrary point Mp, the distance to the rotation axis is L = ||Mp - M0||, and the tracking coordinate of point Mp at that moment is [A, y0 + L·cosQ, zp], which realizes the time-delayed tracking of point Mp.
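The delayed tracking coordinate [A, y0 + L·cosQ, zp] stated above can be sketched directly; M0 is taken here as the corresponding point on the rotation axis, and the function and argument names are illustrative:

```python
import numpy as np

def delayed_tracking_point(M_p, M_0, A, y_0, Q):
    """Compute the robot tracking coordinate for scanned point Mp.

    M_p: [x, y, z] of the scanned point of interest
    M_0: [x, y, z] of its corresponding point on the rotation axis
    A:   constant X of the tracking plane X = A
    y_0: y-coordinate of the rotation axis
    Q:   angle between the scanning plane and the tracking plane (rad)
    """
    L = np.linalg.norm(np.asarray(M_p) - np.asarray(M_0))  # distance to the rotation axis
    z_p = M_p[2]
    return np.array([A, y_0 + L * np.cos(Q), z_p])          # [A, y0 + L*cosQ, zp]
```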
Compared with the prior art, the line laser vision three-dimensional rotary scanning method of the invention has the following advantages: the line laser scanning mode gives high scanning efficiency and real-time scanning; the positioning-platform technique enables rapid field calibration of the system model and lets the user conveniently adjust the camera's viewing angle and depth of field on site, making the method more flexible in application; OpenGL is used to virtually reconstruct the object, so any physical dimension of the object can be measured; and the positioning platform provides a robotic-arm positioning hole, so points of interest on the object can be tracked in real time or with a delay.
Description of drawings
Fig. 1 is a structural schematic of the positioning platform.
Fig. 2 is a schematic of the calibration bar.
Fig. 3 is a schematic of the system coordinate model and the time-delay tracking model.
Embodiment
To overcome the shortcomings of the point and surface scanning modes, the invention uses a line laser together with vision to scan objects in three dimensions. A rotating electric motor drives the object through full 360° scans to obtain detailed spatial structure information of the object. The camera lens can be changed according to the size of the scanned object, changing the camera's depth of field. The positioning platform, together with the calibration algorithm, allows rapid on-site calibration of the scanning system. OpenGL is used to draw the point cloud, mesh and textured model of the scanned object to achieve virtual reconstruction. The positioning platform provides a robotic-arm positioning hole, so feature points of the object can be tracked for purposes such as automatic welding and engraving.
The invention is further described below with reference to the drawings and embodiments.
As shown in Figs. 1 and 2, the positioning platform design of the invention determines the rotating-shaft positioning hole 1 (x0r, y0r), the first calibration-bar positioning hole 2 (x1c, y1c), the second calibration-bar positioning hole 3 (x2c, y2c), the laser positioning hole 4 (x4l, y4l) (ensuring that the laser positioning hole, the calibration-bar positioning holes and the rotating-shaft positioning hole lie on the same straight line), the robotic-arm positioning hole 5 and the camera positioning hole 6. The six calibration holes on the calibration bar are determined as (x1c, y1c, z1c), (x1c, y1c, z2c), (x1c, y1c, z3c), (x2c, y2c, z4c), (x2c, y2c, z5c), (x2c, y2c, z6c). Each hole on the positioning platform is a screw hole, and each device is fixed to the positioning platform with studs and connecting screws. The robotic-arm positioning hole 5 is placed at the lower-left corner of the platform; by the construction principle of the mechanical arm, the origin of the robotic-arm coordinate system is the center of the robotic-arm positioning hole. An XYZ rectangular space coordinate system is established (the coordinate values of the positioning holes above are given in this coordinate system), coinciding with the robotic-arm coordinate system. The laser positioning hole 4 and the camera positioning hole 6 are set in turn along the X direction, the rotating-shaft positioning hole 1 is set along the Y direction, and the first and second calibration-bar positioning holes 2, 3 are set on the line between the rotating-shaft positioning hole 1 and the laser positioning hole 4. The maximum radius of an object this system can scan is Rmax and the maximum height is Hmax. The coordinate values are determined by the following principle: in the coordinate system XYZ it suffices that x0r = 0, y4l = 0 and y0r > Rmax; x1c = y1c and x2c = y2c, each subject to a further condition given in the original as a figure; and z1c through z6c are respectively given in the original as a figure.
The line laser vision three-dimensional rotary scanning system of the invention is calibrated as follows: the camera acquires the laser stripe image through an optical filter, and the image is binarized with the bimodal method to accurately extract the laser stripe. The laser stripe is thinned using the medial-axis transform to obtain the thinned stripe, and the thinned light points forming the stripe are serialized. The serialized points are traversed; wherever the spacing between two consecutive points is much larger than the average spacing, the pixel coordinates of the two separated points are averaged to obtain the six calibration-hole pixel coordinates (u1, v1), (u2, v2), (u3, v3), (u4, v4), (u5, v5), (u6, v6). Combined with the space coordinates of the six calibration holes, the projection matrix H of the system is obtained, which completes the system calibration.
In the invention, the reverse-engineering operation recovers the position of each laser-stripe pixel in space coordinates by solving the inverse projection matrix H⁻¹ = (HHᵀ)⁻¹; each pixel mi(ui, vi) of the laser stripe then has space coordinates Mi = H⁻¹·mi.
In the invention, measuring, virtually reconstructing and tracking the object uses the rotation of the shaft to realize a 360° scan of the object and obtain the point cloud data of its contour, and uses OpenGL to realize the virtual reconstruction of the object. Depending on field requirements, real-time or time-delayed tracking of the object can be realized through the robotic-arm positioning hole on the platform.
The working process of the invention is described in detail as follows:
First, design the positioning platform. The positioning platform contains a positioning hole for each part of the system: a rotating-shaft positioning hole, calibration-bar positioning holes, a camera positioning hole, a laser positioning hole and a robotic-arm positioning hole; the origin of the platform coordinate system coincides with the origin of the robotic arm in the x-y plane.
Second, install each component at its positioning hole and install the calibration bar, as shown in Fig. 2. The line laser illuminates the calibration bar, and the laser stripe breaks at the calibration holes of the bar. The camera captures an image of the calibration holes; the line laser stripe is thinned using the medial-axis transform, the thinned laser trace is serialized, breakpoints are searched according to the point sequence, and the coordinates of the calibration holes in the image coordinate system are determined. Combined with the spatial coordinates of the positioning holes in the system, the system calibration is completed.
Third, remove the calibration bar and place the object to be measured or tracked. Set the rotating shaft to angular velocity ω so the object undergoes passive rotary scanning. The camera acquires in real time the laser stripe projected onto the surface of the measured object; the stripe is thinned and each pixel of the obtained stripe is back-projected to obtain its coordinates in the positioning platform.
Fourth: from the obtained coordinate information, the physical dimensions of the object can be read directly; all obtained coordinate points are rendered in the computer with OpenGL to obtain the point cloud of the object, the point cloud is gridded, and a texture is attached to obtain a virtual image of the object; points of interest selected from the point cloud are sent to the robotic arm to track the feature points of the object.
The design of the positioning platform comprises the following steps:
(1.1) Determine the robotic-arm positioning hole and, following the principle that the origin of the platform coordinate system coincides with the origin of the robotic arm in the x-y plane, establish the positioning-platform coordinate system;
(1.2) On the positioning platform, determine in turn the rotating-shaft positioning hole (x0r, y0r), the first calibration-bar positioning hole (x1c, y1c), the second calibration-bar positioning hole (x2c, y2c), the camera positioning hole and the laser positioning hole (x4l, y4l) (ensuring that the laser positioning hole, the calibration-bar positioning holes and the rotating-shaft positioning hole lie on the same straight line);
(1.3) Determine, as shown in Fig. 2, the six calibration holes on the calibration bar: (x1c, y1c, z1c), (x1c, y1c, z2c), (x1c, y1c, z3c), (x2c, y2c, z4c), (x2c, y2c, z5c), (x2c, y2c, z6c).
As shown in Fig. 3, the calibration bar and the positioning holes lie in the light plane; the light plane and the tracking plane form a fixed angle Q; the angular velocity of the rotating shaft is ω; the robotic-arm coordinate system is the rectangular space coordinate system XYZ and the image coordinate system is the plane rectangular coordinate system UV; the projection matrix and inverse projection matrix between them are H and H⁻¹ respectively.
The calibration of the line laser three-dimensional rotary scanning system comprises the following steps:
(2.1) The camera acquires the laser stripe image through an optical filter; the image is binarized with the bimodal method to accurately extract the laser stripe. The laser stripe is thinned using the medial-axis transform to obtain the thinned laser stripe;
(2.2) The thinned light points forming the laser stripe are serialized. Serialization method: compute the second-order norm ||(u, v)|| of the pixel coordinates of each laser point and order the points by the magnitude of this norm. Traverse the serialized points; wherever the spacing between two consecutive points is much larger than the average spacing, average the pixel coordinates of the two separated points to obtain the six calibration-hole pixel coordinates (u1, v1), (u2, v2), (u3, v3), (u4, v4), (u5, v5), (u6, v6);
(2.3) Let H denote the projection matrix from the space coordinate system to the imaging-plane pixel coordinate system (the matrix equation is given in the original as a figure); the equation of the light plane formed by the laser stripe in space is a3X + b3Y + c3Z = 1. Substituting the coordinate values obtained in (1.3) and (2.2) into the resulting equations and solving yields the matrix; this matrix is the projection matrix of the system, which completes the system calibration.
The reverse-engineering operation that recovers, for each pixel of the laser stripe, its position in space coordinates comprises the following steps:
(3.1) Remove the calibration bar and place the object to be measured or tracked; set the rotating shaft to angular velocity ω so the object undergoes passive rotary scanning, and acquire with the camera, in real time, the laser stripe projected onto the surface of the measured object;
(3.2) Solve the inverse projection matrix H⁻¹ = (HHᵀ)⁻¹;
(3.3) Thin the laser stripe on the object surface and, for each pixel mi(ui, vi) of the thinned stripe, perform the reverse-engineering operation to recover its space coordinates Mi(xi, yi, zi).
Measuring, virtually reconstructing and tracking the object comprises the following steps:
(4.1) The camera samples the laser stripe on the object surface at time interval ti; the laser pixels of the n-th acquisition are converted by reverse engineering into the spatial point sequence seq(n). At any moment t, the rotation of the shaft is applied to seq(n) as a rotation transform with matrix R(θ), θ = n·ti·ω, so the coordinates of seq(n) at that moment are recalculated as seq(n)·R(θ). After a full 360° scan of the object, the complete point cloud of the object in space is obtained.
(4.2) For any two points Mp and Mq on the object, the distance between them is ||Mp - Mq||.
(4.3) Render all the obtained object space coordinate points with OpenGL to obtain the point cloud of the object. Triangulate the point cloud data with the Delaunay algorithm to mesh it into facets, then attach the texture to complete the virtual reconstruction of the object.
(4.4) The positioning platform provides the robotic-arm positioning hole, so the robotic arm can track specified points on the object to accomplish tasks such as engraving and welding. Real-time tracking: the points of interest obtained from the scan are sent directly to the robotic arm to track the feature points of the object in real time. Time-delayed tracking: in many situations the scanning point and the tracking point need to be isolated; for example, in welding the tracking point is the weld spot, where heavy arcing and sparks occur. To maintain laser-stripe extraction accuracy the image quality must be guaranteed, so the scanning point and the tracking point must be isolated.
To simplify the tracking model, the tracking plane of the robotic arm is set to X = A (A is a constant, the lateral coordinate value of the rotating shaft), and the normal vector of this plane is (a4 = 1, b4 = 0, c4 = 0). The angular velocity of the rotating shaft is ω and the normal vector of the scanning plane is (a3, b3, c3), so the angle between the scanning plane and the tracking plane is Q.
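The angle Q between the scanning plane and the tracking plane follows from the two normal vectors by the standard plane-angle formula; the original gives the expression only as a figure, so the sketch below is a conventional reconstruction:

```python
import numpy as np

def plane_angle(n_scan, n_track=(1.0, 0.0, 0.0)):
    """Angle Q between the scanning plane (normal n_scan = (a3, b3, c3))
    and the tracking plane X = A (normal n_track = (a4, b4, c4) = (1, 0, 0))."""
    n1 = np.asarray(n_scan, dtype=float)
    n2 = np.asarray(n_track, dtype=float)
    cosq = np.dot(n1, n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return np.arccos(np.clip(cosq, -1.0, 1.0))   # Q in radians
```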
Claims (5)
1. A line laser vision three-dimensional rotary scanning method, characterized by comprising the following steps:
First, design the positioning platform. The positioning platform contains a positioning hole for each part of the system: a rotating-shaft positioning hole, calibration-bar positioning holes, a camera positioning hole, a laser positioning hole and a robotic-arm positioning hole; the origin of the positioning-platform coordinate system coincides with the origin of the robotic-arm coordinate system;
Second, install each component at its positioning hole and install the calibration bar; the line laser illuminates the calibration bar, and the laser stripe breaks at the calibration holes of the bar; the camera captures an image of the calibration holes, the line laser stripe is thinned using the medial-axis transform, the thinned laser trace is serialized, breakpoints are searched according to the point sequence, and the coordinates of the calibration holes in the image coordinate system are determined; combined with the spatial coordinates of the positioning holes in the system, the system calibration is completed;
Third, remove the calibration bar and place the object to be measured or tracked; set the rotating shaft to angular velocity ω so the object undergoes passive rotary scanning; the camera acquires in real time the laser stripe projected by the laser onto the surface of the measured object; the laser stripe is thinned and each pixel of the obtained stripe is back-projected (reverse-engineering operation) to obtain the coordinates of the pixel in the positioning platform;
Fourth, measure, virtually reconstruct and track the object: the physical dimensions of the object are obtained directly from the obtained coordinate information; all obtained coordinate points are rendered in the computer with OpenGL to obtain the point cloud of the object, the point cloud is gridded, and a texture is attached to obtain a virtual image of the object; points of interest selected from the point cloud are sent to the robotic arm to track the feature points of the object;
The positioning platform is designed as follows:
(1.1) Determine the robotic-arm positioning hole; the origin of the robotic-arm coordinate system is the center of the robotic-arm positioning hole, the origin of the platform coordinate system coincides with the origin of the robotic-arm coordinate system, and the positioning-platform coordinate system is established; the positioning-platform coordinate system is an XYZ rectangular space coordinate system coinciding with the robotic-arm coordinate system; the laser positioning hole and the camera positioning hole are set in turn along the X direction, the rotating-shaft positioning hole is set along the Y direction, and the first and second calibration-bar positioning holes are set on the line between the rotating-shaft positioning hole and the laser positioning hole;
(1.2) On the positioning platform, determine in turn the rotating-shaft positioning hole (x0r, y0r), the first calibration-bar positioning hole (x1c, y1c), the second calibration-bar positioning hole (x2c, y2c), the camera positioning hole and the laser positioning hole (x4l, y4l); the laser positioning hole (x4l, y4l), the first calibration-bar positioning hole (x1c, y1c), the second calibration-bar positioning hole (x2c, y2c) and the rotating-shaft positioning hole lie on the same straight line;
(1.3) Determine the position coordinates, in the XYZ rectangular space coordinate system, of the six calibration holes arranged on the calibration bar: (x1c, y1c, z1c), (x1c, y1c, z2c), (x1c, y1c, z3c), (x2c, y2c, z4c), (x2c, y2c, z5c), (x2c, y2c, z6c).
2. The line laser vision three-dimensional rotary scanning method according to claim 1, characterized in that the system calibration of step 2 comprises the following steps:
(2.1) The camera acquires the laser stripe image through an optical filter; the image is binarized with the bimodal method to accurately extract the laser stripe; the laser stripe is thinned using the medial-axis transform to obtain the thinned laser stripe;
(2.2) The thinned light points forming the laser stripe are serialized. Serialization method: compute the second-order norm ||(u, v)|| of the pixel coordinates of each laser point and order the points by the magnitude of this norm; traverse the serialized points, and wherever the spacing between two consecutive points is much larger than the average spacing, average the pixel coordinates of the two separated points to obtain the six calibration-hole pixel coordinates (u1, v1), (u2, v2), (u3, v3), (u4, v4), (u5, v5), (u6, v6);
(2.3) Let H denote the projection matrix from the space coordinate system to the imaging-plane pixel coordinate system (the matrix equation is given in the original as a figure); the equation of the light plane formed by the laser stripe in space is a3X + b3Y + c3Z = 1; combining this with the projection matrix from the space coordinate system to the imaging-plane pixel coordinate system gives the calibration equations (given in the original as a figure); substituting the coordinate values obtained in steps 1.3 and 2.2 into these equations and solving yields the matrix, which is the projection matrix of the system, completing the system calibration.
3. The line laser vision three-dimensional rotary scanning method according to claim 2, characterized in that each pixel of the laser stripe is back-projected (reverse-engineering operation) as follows:
(3.1) Remove the calibration bar and place the object to be measured or tracked; set the rotating shaft to angular velocity ω so the object undergoes passive rotary scanning, and acquire with the camera, in real time, the laser stripe projected onto the surface of the measured object;
(3.2) Solve the inverse projection matrix H⁻¹ = (HHᵀ)⁻¹;
(3.3) Thin the laser stripe on the object surface and, for each pixel mi(ui, vi) of the thinned stripe, perform the reverse-engineering operation to recover its space coordinates Mi(xi, yi, zi).
4. The line laser vision three-dimensional rotary scanning method according to claim 1, characterized in that measuring, virtually reconstructing and tracking the object comprises the following steps:
(4.1) The camera samples the laser stripe on the object surface at time interval ti; the laser pixels of the n-th acquisition are converted by the reverse-engineering operation into the spatial point sequence seq(n), where seq(n) = [xn yn zn]; at any moment, the rotation of the shaft is applied to seq(n) as a rotation transform with matrix R(θ), θ = n·ti·ω, so the coordinates of seq(n) at that moment are recalculated as seq(n)·R(θ); after a full 360° scan of the object, the complete point cloud of the object in space is obtained;
(4.2) For any two points Mp and Mq on the object, the distance between them is ||Mp - Mq||;
(4.3) Render all the obtained object space coordinate points with OpenGL to obtain the point cloud of the object; triangulate the point cloud data with the Delaunay algorithm to mesh it into facets, then attach the texture to complete the virtual reconstruction of the object;
(4.4) The positioning platform provides the robotic-arm positioning hole, so the robotic arm can track specified points on the object.
5. The line laser vision three-dimensional rotary scanning method according to claim 4, characterized in that tracking of specified object points by the robotic arm comprises:
Real-time tracking: the points of interest obtained from the scan are sent directly to the robotic arm to track the feature points of the object in real time;
Time-delayed tracking: when the scanning point and the tracking point must be isolated, the tracking plane of the robotic arm is set to X = A, where A is a constant, and the normal vector of this plane is (a4 = 1, b4 = 0, c4 = 0); the angular velocity of the rotating shaft is ω, the normal vector of the scanning plane is (a3, b3, c3), and the angle between the scanning plane and the tracking plane is Q (the remaining expressions are given in the original as figures).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN 201110341204 CN102506711B (en) | 2011-11-01 | 2011-11-01 | Line laser vision three-dimensional rotate scanning method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102506711A CN102506711A (en) | 2012-06-20 |
CN102506711B true CN102506711B (en) | 2013-07-17 |
Family
ID=46218819
Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN 201110341204 (CN102506711B, Expired - Fee Related) | Line laser vision three-dimensional rotate scanning method | 2011-11-01 | 2011-11-01
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102506711B (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1782659A (en) * | 2004-12-02 | 2006-06-07 | 中国科学院自动化研究所 | Welding seam tracking sight sensor based on laser structure light |
CN101532827A (en) * | 2009-04-15 | 2009-09-16 | 北京航空航天大学 | Deviation correction method for measurement of rail wear based on laser vision |
CN201552943U (en) * | 2009-09-16 | 2010-08-18 | 济南星辉数控机械科技有限公司 | Laser three-dimensional scanning system |
Non-Patent Citations (4)

Title |
---|
Li Zhi, "Research on a Three-Dimensional Machine Vision System Based on Laser Scanning" (基于激光扫描的三维机器视觉系统研究), 2007-09-15; sections 2.4, 3.1, 3.6, 5.1, 5.6, Fig. 3.6 * |
Zhao Liu, "Research and Design of Reverse Engineering Technology Based on Point Cloud Data" (基于点云数据的逆向技术研究与设计), 2011-05-15; section 2.1 * |
Also Published As
Publication number | Publication date |
---|---|
CN102506711A (en) | 2012-06-20 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| C06 | Publication |
| PB01 | Publication |
| C10 | Entry into substantive examination |
| SE01 | Entry into force of request for substantive examination |
| C14 | Grant of patent or utility model |
| GR01 | Patent grant |
| EE01 | Entry into force of recordation of patent licensing contract | Application publication date: 20120620; Assignee: WUXI XINJE ELECTRONIC Co.,Ltd.; Assignor: Jiangnan University; Contract record no.: 2013320000794; Denomination of invention: Line laser vision three-dimensional rotate scanning method; Granted publication date: 20130717; License type: Exclusive License; Record date: 20131127
| LICC | Enforcement, change and cancellation of record of contracts on the licence for exploitation of a patent or utility model |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20130717