CN116309874A - Calibration and point cloud generation method based on binocular single-line laser - Google Patents
- Publication number: CN116309874A (application CN202310278283.5A)
- Authority: CN (China)
- Prior art keywords: laser; camera; point cloud; point; calibration
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/85 — Stereo camera calibration (under G06T7/80: analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration)
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10028 — Range image; depth image; 3D point clouds (image acquisition modality)
- G06T2207/20221 — Image fusion; image merging (special algorithmic details: image combination)
- Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention belongs to the field of computer three-dimensional vision and in particular discloses a calibration and point cloud generation method based on a binocular single-line laser. It addresses the shortcomings of existing calibration methods, which rely on a planar calibration plate that must be moved ten or even dozens of times to complete one calibration and can calibrate only one laser line at one position per move, making the procedure cumbersome and troublesome in practice. The proposed method comprises the following steps: S1: accurately calibrating the binocular camera with a conventional calibration algorithm, on which all subsequent steps are based; S2: fixing the laser sensor and placing the arc-shaped calibration block in front of it, with the arc surface facing the sensor. The method also improves scanning flexibility, so that the scanning process can be completed with either a single camera or both cameras of the binocular pair, and it enlarges the field of view so that more point cloud data can be reconstructed.
Description
Technical Field
The invention relates to the technical field of computer three-dimensional vision, and in particular to a calibration and point cloud generation method based on a binocular single-line laser.
Background
Three-dimensional shape measurement by laser scanning has become indispensable in the industrial field thanks to its non-contact operation, high precision, and wide applicability, and it remains of high research value. The technology is widely applied to product defect inspection, automatic assembly, dimensional measurement, cultural-relic reconstruction, visual navigation, and similar fields, and has very high practical value.
Conventional calibration methods rely on a planar calibration plate: completing a single calibration requires moving the plate ten or even dozens of times, and each move calibrates only one laser line at one position, so the steps are very cumbersome and practical use is very troublesome.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, in which calibration relies on a planar calibration plate that must be moved ten or even dozens of times to complete one calibration and can calibrate only one laser line at one position per move, making the steps very cumbersome and practical use very troublesome.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
the method for calibrating and generating the point cloud based on the binocular single line laser comprises the following steps of:
s1: accurately calibrating the binocular camera by using a conventional calibration algorithm, wherein all the follow-up steps are developed on the basis of the calibration of the binocular camera;
s2: fixing a laser sensor, and then placing an arc calibration block in front of the laser sensor, so that the arc surface is opposite to the laser sensor;
s3: rotating the laser transmitter to move to an initial transmitting position at one side;
s4: the laser sensor sends a trigger signal, and the left camera and the right camera capture images and transmit the images to the PC;
s5: extracting the center points of laser lines in the image, reconstructing three-dimensional coordinates of the laser lines by utilizing binocular reconstruction, positioning all points on an arc line, and fitting a laser plane by utilizing the points;
s6: the laser transmitter is turned to reach the next emission bit and S4, S5 are performed again.
S7: and S6, continuously repeating until the laser transmitter passes through all the transmitting positions, and finally fitting the laser planes of all the transmitting positions.
S8: and (3) moving the position of the arc-shaped calibration block backwards, and executing the third step to the seventh step again to finally obtain another group of laser planes, wherein the step is performed to improve the calibrated depth of field and the calibrated precision, and the position of the arc-shaped calibration block can be moved for multiple times to obtain more groups of laser planes so as to obtain higher precision.
S9: and fitting a plurality of laser planes corresponding to each emission position into one, and finally obtaining a group of laser plane parameters with higher precision, namely laser calibration data.
The reconstruction comprises the following steps:
T1: rotating the laser emitter to a certain emission position; this need not be the initial emission position, and the finally reconstructed point cloud contains only the area scanned by the laser;
T2: the laser sensor sends a trigger signal to one camera, which captures an image and transmits it to the PC; the trigger signal may also be sent to both the left and right cameras, so that both capture images and transmit them to the PC;
T3: extracting the centre points on the laser line, computing their coordinates in the image coordinate system and converting them to the camera coordinate system; from each imaged laser centre point, fitting a ray through the camera optical centre and computing its intersection with the laser plane of the current emission position, which is the corresponding scanned object point; repeating this operation for all remaining laser-line centre points finally reconstructs the point cloud of the laser line;
T4: rotating the laser emitter to the next emission position and executing T3 again;
T5: repeating T4 until the laser emitter reaches the final emission position, finally obtaining the point clouds of the laser lines at all emission positions;
T6: if only one camera captured images, its point cloud is the final point cloud; if both the left and right cameras captured images, either the point cloud data of one camera alone is kept as the final data, or the point clouds of the two cameras are fused to obtain point cloud data with a larger field of view.
Preferably, in the step S5, the laser plane refers to the plane in which the laser line projected by the laser emitter at one emission position lies.
Preferably, in the T6, fusing the point cloud data comprises the following steps: according to the parameters obtained from the binocular calibration, the point cloud coordinates in the right camera coordinate system are transformed into the left camera coordinate system; if a point exists in only one camera, it is kept; if a point exists in both cameras, the point from the left camera is kept and the point from the right camera is removed.
Preferably, in the step S2, the laser emitter has a series of fixed emission positions whose locations are determined by the angle sensor on the emitter. Driven by the motor, the emitter moves only from one emission position to the next at each step; the position of every emission position relative to the two cameras is strictly fixed and does not change, and the relative pose between the left and right cameras is likewise fixed. At any emission position the emitter can send trigger signals to the left and right cameras; on receiving a trigger signal, each camera immediately captures an image and transmits it to the PC for subsequent processing.
Preferably, the calibration is assisted by a calibration block with an arc-shaped surface, and the block must be calibrated at no fewer than two positions.
Preferably, in the step S1, after the camera calibration is completed the internal parameter matrix of the left camera is obtained:

$$K_l = \begin{bmatrix} \alpha_l & \gamma_l & u_0^{l} \\ 0 & \beta_l & v_0^{l} \\ 0 & 0 & 1 \end{bmatrix}$$

and the internal parameter matrix of the right camera:

$$K_r = \begin{bmatrix} \alpha_r & \gamma_r & u_0^{r} \\ 0 & \beta_r & v_0^{r} \\ 0 & 0 & 1 \end{bmatrix}$$

where $(u_0^{l}, v_0^{l})$ and $(u_0^{r}, v_0^{r})$ are the imaging centre coordinates of the left and right cameras; $\alpha_l$ and $\alpha_r$ are their effective focal lengths along the $u$ axis; $\beta_l$ and $\beta_r$ their effective focal lengths along the $v$ axis; and $\gamma_l$ and $\gamma_r$ the skew coefficients between the $u$ and $v$ axes; together with the rotation matrix $R$ and translation vector $t$ between the left and right cameras.
Preferably, in the step S5, the centre points of the laser lines in the images are extracted and the three-dimensional coordinates of the laser lines are reconstructed binocularly; all the points lie on an arc, and a laser plane is fitted to them.

The essential matrix between the left and right cameras is computed:

$$E = [t]_{\times} R$$

where $[t]_{\times}$ is the antisymmetric cross-product matrix of $t$. Taking a laser-line centre point $p_l$ in the left camera, the epipolar constraint

$$p_r^{T} E\, p_l = 0$$

is used, where $p_r$ denotes the candidate matching points in the right camera, which together form an epipolar line; the intersection of this epipolar line with the laser line in the right camera is the true matching point.

The true coordinates of the point in the camera coordinate system are then obtained using the triangle-similarity principle.

These steps yield the camera-frame coordinates of all centre points on one laser line; because the laser line falls on a curved surface, the reconstructed points are not collinear, so a plane can be fitted:

$$A_i x + B_i y + C_i z + D_i = 0$$

where the subscript $i$ denotes the $i$-th emission position counted from left to right.
Preferably, in the T3, the centre points on the laser line are extracted, their coordinates in the image coordinate system are computed and converted to the camera coordinate system; from each imaged laser centre point a ray is fitted through the camera optical centre, and its intersection with the laser plane of the current emission position is the corresponding scanned object point; repeating this operation for all remaining laser-line centre points finally reconstructs the point cloud of the laser line.

After conversion to the camera coordinate system, the centre point can be written $(x_0, y_0, z_0)$; the equation of the straight line through it and the optical centre is combined with the laser-plane equation of the emission position:

$$\frac{x}{x_0} = \frac{y}{y_0} = \frac{z}{z_0}, \qquad A_i x + B_i y + C_i z + D_i = 0$$

The solution of this system is the coordinates of the scanned object point in the camera coordinate system.
The calibration and point cloud generation method based on the binocular single-line laser of the invention has the following beneficial effects:
By combining a binocular camera with a laser sensor into a visual binocular laser system, the laser plane of every emission position is calibrated on top of the binocular camera calibration, and a stable calibration result is obtained by moving the calibration block as few as two times. This greatly simplifies the laser calibration process, makes laser calibration easier to perform, and greatly lowers the difficulty of manual operation for non-professional staff.
In the reconstruction method used in the invention, either one camera or both cameras may take part in scanning: single-camera scanning suits scenes with a small field of view, while scanning with both cameras allows their respective point cloud data to be fused for a larger field of view. The scanning mode is therefore flexible and can meet different customer demands, and products of different forms can be conveniently designed by combining different numbers of cameras.
Drawings
FIG. 1 is the calibration flow chart of the calibration and point cloud generation method based on the binocular single-line laser;
FIG. 2 is the reconstruction flow chart of the method;
FIG. 3 is a schematic structural diagram of the calibration block of the method;
FIG. 4 is a schematic structural diagram of the laser device of the method.
In the figure: 1. a laser emitter; 2. an angle sensor; 3. and a motor.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings; plainly, the embodiments described are only some, and not all, of the embodiments of the invention.
Referring to FIGS. 1-4, the method for calibrating and generating a point cloud based on the binocular single-line laser comprises a calibration stage and a reconstruction stage. Calibration:
S1: The binocular camera is accurately calibrated using a conventional calibration algorithm; all subsequent steps build on this calibration.
After the camera calibration is completed, the internal parameter matrix of the left camera is obtained:

$$K_l = \begin{bmatrix} \alpha_l & \gamma_l & u_0^{l} \\ 0 & \beta_l & v_0^{l} \\ 0 & 0 & 1 \end{bmatrix}$$

and the internal parameter matrix of the right camera:

$$K_r = \begin{bmatrix} \alpha_r & \gamma_r & u_0^{r} \\ 0 & \beta_r & v_0^{r} \\ 0 & 0 & 1 \end{bmatrix}$$

where $(u_0^{l}, v_0^{l})$ and $(u_0^{r}, v_0^{r})$ are the imaging centre coordinates of the left and right cameras; $\alpha_l$ and $\alpha_r$ are their effective focal lengths along the $u$ axis; $\beta_l$ and $\beta_r$ their effective focal lengths along the $v$ axis; and $\gamma_l$ and $\gamma_r$ the skew coefficients between the $u$ and $v$ axes; together with the rotation matrix $R$ and translation vector $t$ between the left and right cameras.
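As a minimal illustration (not part of the patent), the calibrated quantities above can be assembled in a few lines of NumPy; all numeric values below are invented for the sketch:

```python
import numpy as np

# Invented intrinsics for the left camera: effective focal lengths
# alpha, beta, skew gamma, and imaging centre (u0, v0).
alpha_l, beta_l, gamma_l, u0_l, v0_l = 1200.0, 1198.0, 0.0, 640.0, 512.0
K_l = np.array([[alpha_l, gamma_l, u0_l],
                [0.0,     beta_l,  v0_l],
                [0.0,     0.0,     1.0]])

def skew(t):
    """Antisymmetric cross-product matrix [t]x, so skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0,  -t[2],  t[1]],
                     [t[2],  0.0,  -t[0]],
                     [-t[1], t[0],  0.0]])

# Invented stereo extrinsics: rotation R and baseline translation t (mm).
R = np.eye(3)
t = np.array([-120.0, 0.0, 0.0])

# Essential matrix E = [t]x R, used for the epipolar constraint later.
E = skew(t) @ R
```

With the identity rotation used here, `E` reduces to the antisymmetric matrix of `t`; in practice `R` and `t` come out of the binocular calibration.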
S2: and fixing the laser sensor, and then placing the arc calibration block in front of the laser sensor, so that the arc surface is opposite to the laser sensor.
S3: the laser transmitter 1 is turned to move to an initial transmitting position at one side.
S4: the laser sensor sends a trigger signal, and the left and right cameras capture images and transmit to the PC.
S5: and extracting the center point of the laser lines in the image, and reconstructing three-dimensional coordinates of the laser lines by using binocular. All points lie on an arc, and a laser plane is fitted with the points. The laser plane here refers to the plane in which the laser light line projected from the laser emitter 1 at one emission position.
The essential matrix between the left and right cameras is computed:

$$E = [t]_{\times} R$$

where $[t]_{\times}$ is the antisymmetric cross-product matrix of $t$. Taking a laser-line centre point $p_l$ in the left camera, the epipolar constraint

$$p_r^{T} E\, p_l = 0$$

is used, where $p_r$ denotes the candidate matching points in the right camera, which together form an epipolar line. The intersection of this epipolar line with the laser line in the right camera is the true matching point.
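The matching step can be sketched as follows, assuming normalised homogeneous image coordinates $(x, y, 1)$; the function name, tolerance handling, and all numbers are illustrative, not from the patent:

```python
import numpy as np

def epipolar_match(E, p_l, right_candidates):
    """Pick the right-image laser point closest to the epipolar line of p_l.

    p_l and the candidate points are homogeneous normalised coordinates
    (x, y, 1). The epipolar line in the right image is l = E @ p_l, and a
    true match p_r satisfies p_r^T E p_l = 0.
    """
    l = E @ p_l
    # Distance of each candidate from the line a*x + b*y + c = 0.
    dists = np.abs(right_candidates @ l) / np.linalg.norm(l[:2])
    return right_candidates[np.argmin(dists)]
```

In the method of the text, the candidates would be the extracted centre points of the laser line in the right image, so the minimiser is the intersection of the epipolar line with that laser line.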
The true coordinates of the point in the camera coordinate system are then obtained using the triangle-similarity principle.
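For intuition, in the special case of a rectified stereo pair the triangle-similarity principle reduces to depth from disparity, $z = f\,b/(u_l - u_r)$; the following sketch and its numbers are illustrative, not the patent's general formulation:

```python
import numpy as np

def triangulate_rectified(u_l, v_l, u_r, f, b, u0, v0):
    """Similar-triangles triangulation for a rectified pair:
    disparity d = u_l - u_r, depth z = f * b / d, then back-project."""
    d = u_l - u_r
    z = f * b / d
    x = (u_l - u0) * z / f
    y = (v_l - v0) * z / f
    return np.array([x, y, z])
```

For a general (unrectified) pair, the same similar-triangles reasoning is applied after bringing both rays into a common frame with the calibrated $R$ and $t$.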
These steps yield the camera-frame coordinates of all centre points on one laser line. Because the laser line strikes a curved surface, the reconstructed points are not collinear, so a plane can be fitted:

$$A_i x + B_i y + C_i z + D_i = 0$$

where the subscript $i$ denotes the $i$-th emission position counted from left to right.
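A least-squares fit of the plane $A_i x + B_i y + C_i z + D_i = 0$ to the reconstructed centre points can be sketched as follows; the SVD-based fit and the synthetic arc data are illustrative, not taken from the patent:

```python
import numpy as np

def fit_laser_plane(points):
    """Fit A x + B y + C z + D = 0 to 3-D points by total least squares:
    the plane normal is the right singular vector of the centred data
    with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    n = vt[-1]                      # unit normal (A, B, C)
    return np.append(n, -n @ c)     # (A, B, C, D)

# Synthetic arc-shaped points lying exactly on the plane x = 2.
theta = np.linspace(0.0, 1.0, 20)
pts = np.stack([np.full_like(theta, 2.0), np.cos(theta), np.sin(theta)], axis=1)
plane = fit_laser_plane(pts)
```

The arc is what the calibration block produces: a curved, non-collinear set of points, which is exactly what makes the plane fit well-posed.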
S6: the laser transmitter 1 is turned to reach the next emission position. The fourth and fifth steps are performed again.
S7: s6 is repeated until the laser transmitter 1 passes through all the transmission bits. And finally fitting the laser planes of all emission positions.
S8: and (3) moving the position of the arc calibration block backwards, and executing S3 to S7 again. Another set of laser planes will eventually be obtained. This step is performed in order to improve the depth of field and the accuracy of the calibration, and in order to obtain higher accuracy, the positions of the arc calibration blocks can be moved multiple times to obtain more groups of laser planes.
S9: and fitting a plurality of laser planes corresponding to each emission position into one, and finally obtaining a group of laser plane parameters with higher precision, namely laser calibration data.
The laser transmitter 1 has a series of fixed emission positions whose locations are determined by the angle sensor 2 on the transmitter. Driven by the motor 3, the transmitter moves only from one emission position to the next at each step; the position of every emission position relative to the two cameras is strictly fixed and does not change, and the relative pose between the left and right cameras is likewise fixed. At any emission position the transmitter can send trigger signals to the left and right cameras; on receiving a trigger signal, each camera immediately captures an image and transmits it to the PC for subsequent processing.
And (3) reconstruction:
t1: the laser transmitter 1 is turned to reach a certain emission position. This emission bit may not be the initial emission bit, and the final reconstructed point cloud includes only the laser scanned area.
T2: the laser sensor sends a trigger signal to a certain camera, which captures an image and transmits it to the PC. The trigger signal may also be sent to both the left and right cameras causing both cameras to capture images and transmit to the PC.
T3: the center points on the laser lines are extracted, their coordinates in the image coordinate system are calculated, and then this coordinate is converted to the camera coordinate system. From an imaging laser center point, a ray is fitted through the camera optical center (i.e., the origin of the camera coordinate system). An intersection point of the ray and the emission bit laser plane is calculated, and the intersection point is the corresponding scanned object point. And then repeatedly executing the operation on all the rest laser line central points, and finally reconstructing a point cloud of the laser line.
After conversion to the camera coordinate system, the centre point can be written $(x_0, y_0, z_0)$; the equation of the straight line through it and the optical centre is combined with the laser-plane equation of the emission position:

$$\frac{x}{x_0} = \frac{y}{y_0} = \frac{z}{z_0}, \qquad A_i x + B_i y + C_i z + D_i = 0$$

The solution of this system is the coordinates of the scanned object point in the camera coordinate system.
T4: the laser transmitter 1 is turned to reach the next emission position and T3 is performed again.
T5: and repeating the step T4 until the laser transmitter 1 reaches the transmitting bit of the target, and finally obtaining the point cloud of all the transmitting bits to the corresponding line.
T6: if only one camera captures an image, a final point cloud is obtained, and if both the left camera and the right camera capture the image, only the point cloud data of the left camera or the right camera can be selected to be reserved as the final point cloud data, and the point cloud data of the left camera and the right camera can be fused to obtain the point cloud data with a larger visual field range.
The process of point cloud fusion is described as follows:
according to the parameters obtained by the double-target determination, the point cloud coordinates in the right camera coordinate system are transformed into the left camera coordinate system, if one point exists in only one camera, the point is reserved, if one point exists in both the left camera and the right camera, the point in the left camera is reserved, and the midpoint of the right camera is removed.
Patents related to the present invention are:
line laser scanning three-dimensional measurement calibration method publication number: CN111272102a;
the present invention differs from the above-mentioned patent in that:
the above patent uses a monocular camera for calibration, and multiple calibration plates are required for calibration, and the calibration plates are required to be placed in different positions and cannot be located on the same plane.
The invention uses the binocular camera to calibrate, only needs to resort to an arc calibration block during calibration, and the calibration block needs to move twice at least.
Line structure light calibration method publication number based on chessboard target: CN10118528A;
the present invention differs from the above-mentioned patent in that:
the above patent uses a monocular camera for calibration, and a checkerboard calibration plate is needed during calibration, and the pose of the calibration plate needs to be changed multiple times during the calibration process, and only one laser plane can be calibrated at a time.
The invention uses the binocular camera to calibrate, only one arc calibration block is needed to be used for calibrating, the calibration block needs to be moved twice at least, and the laser planes of all emission positions can be calibrated once.
Calibration plate for calibrating line laser position and method for calibrating line laser camera measurement system are disclosed in: CN106056620a;
the present invention differs from the above-mentioned patent in that:
the above patent uses a monocular camera for calibration, which requires the use of a special-shaped checkerboard calibration plate with raised areas.
The invention uses a binocular camera for calibration, and only one arc calibration block is needed for calibration.
The foregoing is only a preferred embodiment of the present invention, but the scope of the invention is not limited thereto; any equivalent substitution or modification made, within the scope of the invention, by a person skilled in the art according to its technical scheme and inventive concept shall be covered by the scope of protection of the invention.
Claims (8)
1. A method for calibrating and generating a point cloud based on a binocular single-line laser, characterized by comprising the following steps:
s1: accurately calibrating the binocular camera by using a conventional calibration algorithm, wherein all the follow-up steps are developed on the basis of the calibration of the binocular camera;
s2: fixing a laser sensor, and then placing an arc calibration block in front of the laser sensor, so that the arc surface is opposite to the laser sensor;
s3: rotating the laser transmitter (1) to move to an initial transmitting position at one side;
s4: the laser sensor sends a trigger signal, and the left camera and the right camera capture images and transmit the images to the PC;
s5: extracting the centre points of the laser lines in the images and reconstructing the three-dimensional coordinates of the laser lines binocularly, all the points lying on an arc, and fitting a laser plane with the points;
s6: rotating the laser transmitter (1) to reach the next transmitting position, and executing S4 and S5 again;
s7: continuously repeating the step S6 until the laser transmitter (1) passes through all the transmitting positions, and finally fitting out laser planes of all the transmitting positions;
s8: moving the position of the arc calibration block backwards, and executing the third step to the seventh step again to finally obtain another group of laser planes;
s9: fitting a plurality of laser planes corresponding to each emission position into one, and finally obtaining a group of laser plane parameters with higher precision, namely laser calibration data;
the reconstruction comprises the following steps:
t1: rotating the laser emitter (1) to a certain emission position, which need not be the initial emission position, the finally reconstructed point cloud comprising only the area scanned by the laser;
t2: the laser sensor sends a trigger signal to one camera, the camera captures an image and transmits the image to the PC, and the trigger signal can also be sent to the left camera and the right camera, so that the two cameras capture the image and transmit the image to the PC;
t3: extracting central points on laser lines, calculating coordinates of the central points in an image coordinate system, converting the coordinates into a camera coordinate system, starting from an imaging laser central point, fitting a ray through a camera optical center, calculating an intersection point of the ray and a laser plane of an emission position, wherein the intersection point is a corresponding scanned object point, and repeatedly executing the operation on all other laser line central points to finally reconstruct a point cloud of the laser line;
t4: rotating the laser transmitter (1) to reach the next transmitting position, and executing the third step again;
t5: continuously repeating T4 until the laser transmitter (1) reaches the final emission position, finally obtaining the point clouds of the laser lines at all emission positions;
t6: if only one camera captured images, its point cloud is the final point cloud; if both the left and right cameras captured images, either the point cloud data of one camera alone is kept as the final data, or the point clouds of the two cameras are fused to obtain point cloud data with a larger field of view.
2. The method for calibrating and generating point cloud based on binocular single line laser according to claim 1, wherein in S5, the laser plane refers to a plane where a laser line projected by the laser emitter (1) at one emission position is located.
3. The method for calibrating and generating a point cloud based on the binocular single-line laser according to claim 1, wherein in the T6, fusing the point cloud data comprises the following steps: according to the parameters obtained from the binocular calibration, the point cloud coordinates in the right camera coordinate system are transformed into the left camera coordinate system; if a point exists in only one camera, it is kept; if a point exists in both cameras, the point from the left camera is kept and the point from the right camera is removed.
4. The method for calibrating and generating a point cloud based on the binocular single-line laser according to claim 1, wherein in the step S2, the laser transmitter (1) has a series of fixed emission positions whose locations are determined by the angle sensor (2) on the transmitter; driven by the motor (3), the transmitter rotates only from one emission position to the next at each step; the position of every emission position relative to the two cameras is strictly fixed and does not change, and the relative pose between the left and right cameras is likewise fixed; at any emission position the transmitter can send trigger signals to both cameras, and on receiving a trigger signal each camera immediately captures an image and transmits it to the PC for subsequent processing.
5. The method for calibration and point cloud generation based on binocular single-line laser according to claim 1, wherein the calibration is assisted by a calibration block with an arc-shaped surface, and at least two calibration blocks are used for the calibration.
6. The method for calibration and point cloud generation based on binocular single-line laser according to claim 1, wherein in S1, after camera calibration is completed, the internal parameter matrix of the left camera is obtained:

K_l = [ α_l  γ_l  u_0l
         0   β_l  v_0l
         0    0    1   ]

and the internal parameter matrix of the right camera:

K_r = [ α_r  γ_r  u_0r
         0   β_r  v_0r
         0    0    1   ]

wherein (u_0l, v_0l) and (u_0r, v_0r) respectively represent the imaging center coordinates of the left and right cameras; α_l and α_r respectively represent the effective focal lengths of the left and right cameras along the u axis; β_l and β_r respectively represent the effective focal lengths of the left and right cameras along the v axis; γ_l and γ_r respectively represent the skew coefficients between the u axis and the v axis of the left and right cameras; a rotation matrix R and a translation vector t between the left and right cameras are also obtained.
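As a small illustration of how the parameters listed in claim 6 assemble into a pinhole intrinsic matrix and map camera-frame points to pixels (the function names and numeric values below are illustrative assumptions, not from the patent):

```python
import numpy as np

def intrinsic_matrix(alpha, beta, gamma, u0, v0):
    """Assemble a pinhole intrinsic matrix from the calibrated parameters
    named in claim 6: focal lengths along u/v, skew, and imaging center."""
    return np.array([[alpha, gamma, u0],
                     [0.0,   beta,  v0],
                     [0.0,   0.0,   1.0]])

def project(K, p):
    """Project a camera-frame point (X, Y, Z) to pixel coordinates (u, v)."""
    uvw = K @ p
    return uvw[:2] / uvw[2]
```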
7. The method for calibration and point cloud generation based on binocular single-line laser according to claim 1, wherein in S5, the center points of the laser line in the images are extracted, the three-dimensional coordinates of the laser line are reconstructed using the binocular cameras, all the points lie on an arc, and a laser plane is fitted from these points;
the essential matrix between the left and right cameras is computed:

E = [t]_× R

where [t]_× denotes the skew-symmetric matrix of the translation vector t; taking a laser line center point p_l in the left camera, the epipolar constraint equation

p_r^T E p_l = 0

is applied, where p_r ranges over the set of candidate matching points in the right camera; these points together form an epipolar line, and the intersection of the epipolar line with the laser line in the right camera is the true matching point;
then the true coordinates of the point in the camera coordinate system can be obtained using the principle of similar triangles;
the coordinates of all center points on one laser line in the camera coordinate system are obtained by the above steps; since the laser line falls on a curved surface, the reconstructed points do not lie on a straight line, so a plane can be fitted:

A_i x + B_i y + C_i z + D_i = 0

where the subscript i denotes the i-th emission position, counted from left to right.
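The essential-matrix and plane-fitting steps of claim 7 can be sketched as follows. This is a minimal illustration under assumed names; the patent does not specify the fitting method, so a standard SVD least-squares plane fit is used here as one common choice.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential_matrix(R, t):
    # E = [t]_x R, as in claim 7.
    return skew(t) @ R

def epipolar_line(E, p_l):
    """Epipolar line l = E @ p_l in the right image (homogeneous coords);
    candidate matches p_r satisfy p_r^T E p_l = 0."""
    return E @ p_l

def fit_plane(points):
    """Least-squares plane (A, B, C)·x + D = 0 through (N, 3) points."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    n = vt[-1]                 # normal = direction of least variance
    return n, -n @ c           # (A, B, C) and D
```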
8. The method for calibration and point cloud generation based on binocular single-line laser according to claim 1, wherein in T3, the center points on the laser line are extracted and their coordinates in the image coordinate system are computed; the coordinates are then converted into the camera coordinate system; a ray is fitted from an imaged laser center point through the camera optical center; the intersection of this ray with the laser plane of the corresponding emission position is computed, and this intersection is the corresponding scanned object point; the operation is then repeated for all remaining laser line center points, and finally the point cloud of one laser line is reconstructed;
after conversion to the camera coordinate system, the center point coordinates can be expressed as (x_0, y_0, z_0); the equation of the ray through the optical center and this point is combined with the laser plane equation of the emission position:

x / x_0 = y / y_0 = z / z_0
A_i x + B_i y + C_i z + D_i = 0

The solution of this system of equations gives the coordinates of the scanned object point in the camera coordinate system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310278283.5A CN116309874A (en) | 2023-03-21 | 2023-03-21 | Calibration and point cloud generation method based on binocular single-line laser |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116309874A true CN116309874A (en) | 2023-06-23 |
Family
ID=86833957
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310278283.5A Pending CN116309874A (en) | 2023-03-21 | 2023-03-21 | Calibration and point cloud generation method based on binocular single-line laser |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116309874A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118091693A (en) * | 2024-04-25 | 2024-05-28 | 武汉汉宁轨道交通技术有限公司 | Adaptive feature matching line structured light and Lidar track multi-scale point cloud fusion method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103971353B (en) | Splicing method for measuring image data with large forgings assisted by lasers | |
CN106949845B (en) | Two dimension laser galvanometer scanning system and scaling method based on binocular stereo vision | |
CN111325801B (en) | Combined calibration method for laser radar and camera | |
CN106091984B (en) | A kind of three dimensional point cloud acquisition methods based on line laser | |
CN105547189B (en) | High-precision optical method for three-dimensional measurement based on mutative scale | |
CN114998499B (en) | Binocular three-dimensional reconstruction method and system based on line laser galvanometer scanning | |
US11587252B2 (en) | Positioning method and system combining mark point positioning and intelligent reverse positioning | |
CN102519434B (en) | Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data | |
CN110703230B (en) | Position calibration method between laser radar and camera | |
CN109579695B (en) | Part measuring method based on heterogeneous stereoscopic vision | |
CN113175899B (en) | Camera and galvanometer combined three-dimensional imaging model of variable sight line system and calibration method thereof | |
CN110966932B (en) | Structured light three-dimensional scanning method based on known mark points | |
CN109325981B (en) | Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points | |
CN112985293B (en) | Binocular vision measurement system and measurement method for single-camera double-spherical mirror image | |
CN104134188A (en) | Three-dimensional visual information acquisition method based on two-dimensional and three-dimensional video camera fusion | |
CN106340045B (en) | Calibration optimization method in three-dimensional facial reconstruction based on binocular stereo vision | |
CN116051659B (en) | Linear array camera and 2D laser scanner combined calibration method | |
CN106500625A (en) | A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies | |
CN116309874A (en) | Calibration and point cloud generation method based on binocular single-line laser | |
CN108154536A (en) | The camera calibration method of two dimensional surface iteration | |
CN115880344A (en) | Binocular stereo matching data set parallax truth value acquisition method | |
CN110849269A (en) | System and method for measuring geometric dimension of field corn cobs | |
CN115082446B (en) | Method for measuring aircraft skin rivet based on image boundary extraction | |
CN110728745B (en) | Underwater binocular stereoscopic vision three-dimensional reconstruction method based on multilayer refraction image model | |
CN111833392A (en) | Multi-angle scanning method, system and device for mark points |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||