CN108020826A - Multi-line laser radar and multichannel camera mixed calibration method - Google Patents
Multi-line laser radar and multichannel camera mixed calibration method
- Publication number: CN108020826A (application CN201711012232.9A)
- Authority: CN (China)
- Prior art keywords: camera, laser radar, point cloud
- Prior art date: 2017-10-26
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01S7/497 — Details of lidar systems; means for monitoring or calibrating
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G06T5/80 — Image enhancement or restoration; geometric correction
- G06T7/70 — Image analysis; determining position or orientation of objects or cameras
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- H04L67/10 — Protocols in which an application is distributed across nodes in the network
- H04N23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- G06T2207/30244 — Indexing scheme for image analysis or image enhancement; camera pose
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a multi-line laser radar and multi-channel camera mixed calibration method, comprising the following steps: S1, collecting the raw image data of the multi-channel cameras, the multi-line laser radar point cloud data and the static laser radar point cloud data; S2, solving the intrinsic parameter model of each camera; S3, removing distortion from the images collected by each camera to obtain corrected images; S4, registering the static laser radar point cloud data into the multi-line laser radar point cloud coordinate system; S5, obtaining the position (X_s, Y_s, Z_s) of each camera in the multi-line laser radar point cloud coordinate system from the point cloud data registered in S4; S6, selecting, in the corrected image of each camera, the pixel coordinates (u, v) of at least four targets and the corresponding three-dimensional coordinates (X_p, Y_p, Z_p) of those targets in the point cloud whose origin is the multi-line laser radar; S7, establishing collinearity equations from the intrinsic model of each camera, the camera position (X_s, Y_s, Z_s), and the pixel coordinates (u, v) and three-dimensional coordinates (X_p, Y_p, Z_p) of the targets corresponding to that camera, and solving for the attitude angle elements and the nine direction cosines of each camera to complete the calibration.
Description
Technical Field
The invention relates to the technical field of calibration, and in particular to a multi-line laser radar and multi-channel camera mixed calibration method.
Background
Laser radar (lidar) detects the position of an object by emitting laser pulses and receiving the reflected light. To improve detection range and accuracy, the multi-line laser radar was developed on the basis of the single-line laser radar: it can emit and receive several laser beams simultaneously, so a single scan yields several concentric scan lines.
In three-dimensional reconstruction applications based on multi-line laser radar scanning, fusing the data of a multi-line laser radar with that of multiple cameras captures more three-dimensional detail of the environment and provides richer spatial data for further processing. In such a system, however, the multi-line laser radar and each camera have their own local coordinate systems, and a calibration algorithm is needed to find the three-dimensional coordinate transformation between them. At present, most hybrid calibration algorithms require each camera to be calibrated against the multi-line laser radar separately. Because the point cloud acquired by the laser radar is sparse and the intrinsic parameters of the cameras are unknown, calibration is difficult, and no existing calibration technique solves the hybrid calibration of a multi-line laser radar and multiple cameras in a single pass.
Disclosure of Invention
The invention aims to provide a multi-line laser radar and multi-channel camera mixed calibration method for realizing calibration between a multi-line laser radar and multiple cameras.
In order to achieve the purpose, the invention adopts the following technical scheme:
The mixed calibration method of the multi-line laser radar and the multi-channel cameras comprises the following steps:
S1, collecting the original image data of the multi-channel cameras, the multi-line laser radar point cloud data and the static laser radar point cloud data;
S2, solving the intrinsic parameter model of each camera;
S3, removing distortion from the images collected by each camera to obtain corrected images;
S4, registering the static laser radar point cloud data into the multi-line laser radar point cloud coordinate system;
S5, acquiring the position (X_s, Y_s, Z_s) of each camera in the multi-line laser radar point cloud coordinate system from the point cloud data registered in S4;
S6, selecting, in the corrected image of each camera, the pixel coordinates (u, v) of at least 4 targets and the corresponding three-dimensional coordinates (X_p, Y_p, Z_p) of those targets in the scene point cloud whose origin is the multi-line laser radar;
S7, establishing collinearity equations from the intrinsic model of each camera, the camera position (X_s, Y_s, Z_s), and the pixel coordinates (u, v) and three-dimensional coordinates (X_p, Y_p, Z_p) of the targets corresponding to that camera, solving for the attitude angle elements and the nine direction cosines of each camera, and completing the calibration.
Further, step S1 specifically comprises:
S11, acquisition of multi-channel camera image data:
parking the vehicle, then placing a plurality of targets evenly within the field of view of each camera in turn, and acquiring the original image data of the multiple cameras;
S12, acquisition of multi-line laser radar point cloud data:
powering on the multi-line laser radar on the vehicle roof and scanning, to obtain multi-line laser radar point cloud data whose three-dimensional coordinate origin is the position of the multi-line laser radar;
S13, acquisition of static laser radar point cloud data:
scanning the whole scene with a ground-based static laser radar, to obtain static laser radar point cloud data and the positions of the cameras within it.
Further, in step S2, the camera intrinsic model is expressed as

$$K=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where f_x, f_y are the focal lengths of the camera and (c_x, c_y) is the principal point of the camera (the intersection of the principal optical axis with the image plane); the camera intrinsics and distortion factors are solved with Zhang Zhengyou's checkerboard calibration method, giving the camera intrinsic model.
Further, in step S7, the collinearity equations are:

$$u-c_x=-f\,\frac{a_1(X_p-X_s)+b_1(Y_p-Y_s)+c_1(Z_p-Z_s)}{a_3(X_p-X_s)+b_3(Y_p-Y_s)+c_3(Z_p-Z_s)};$$

$$v-c_y=-f\,\frac{a_2(X_p-X_s)+b_2(Y_p-Y_s)+c_2(Z_p-Z_s)}{a_3(X_p-X_s)+b_3(Y_p-Y_s)+c_3(Z_p-Z_s)};$$

where f is the perpendicular distance from the lens centre to the image plane, and a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 are the nine direction cosines of each camera. The relationship between the direction cosines and the image attitude angles φ, ω and γ is as follows:

b_1 = cos ω sin γ;
b_2 = cos ω cos γ;
b_3 = −sin ω;

where φ, ω and γ are the rotation angles about, respectively, the Y axis, the X axis and the Z axis as principal axes, with the multi-line laser radar as the coordinate origin; the attitude angle elements and the nine direction cosines of each camera are estimated, completing the calibration.
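For reference only (the original text spells out just the b-row relations; the rest appeared as figures), the full set of nine direction cosines for this Y–X–Z (φ, ω, γ) rotation order is the standard photogrammetric rotation matrix, which is consistent with the b_1, b_2, b_3 relations given above:

$$
R=\begin{bmatrix} a_1 & a_2 & a_3\\ b_1 & b_2 & b_3\\ c_1 & c_2 & c_3 \end{bmatrix}
=\begin{bmatrix}
\cos\varphi\cos\gamma-\sin\varphi\sin\omega\sin\gamma & -\cos\varphi\sin\gamma-\sin\varphi\sin\omega\cos\gamma & -\sin\varphi\cos\omega\\
\cos\omega\sin\gamma & \cos\omega\cos\gamma & -\sin\omega\\
\sin\varphi\cos\gamma+\cos\varphi\sin\omega\sin\gamma & -\sin\varphi\sin\gamma+\cos\varphi\sin\omega\cos\gamma & \cos\varphi\cos\omega
\end{bmatrix}
$$

Reading off the a-, b- and c-rows of R gives the nine direction cosines used in the collinearity equations.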
Compared with the background art, adopting the above technical scheme gives the invention the following advantages:
The invention provides a new calibration method that performs simultaneous hybrid calibration of one multi-line laser radar and multiple cameras, filling a gap in the related art. The setup is simple to install and the calibration algorithm is easy to implement; the difficulty of calibrating a multi-line laser radar and multiple cameras at the same time is resolved even though the camera intrinsics are unknown and the laser radar point cloud is sparse, which promotes the development of unmanned-driving technology towards low cost, generality and civilian use.
Drawings
FIG. 1 is a schematic diagram of an example of the installation of the multiline lidar and the multi-channel camera of the present invention on a vehicle.
FIG. 2 is a flow chart of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the present invention, it should be noted that the terms "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are all based on the orientation or positional relationship shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the apparatus or element of the present invention must have a specific orientation, and thus, should not be construed as limiting the present invention.
Examples
The invention discloses a mixed calibration method for a multi-line laser radar and multiple cameras. Fig. 1 shows an example installation of the multi-line laser radar and the cameras on a vehicle: one multi-line laser radar 5 is mounted on the roof, and four cameras 1, 2, 3 and 4 are mounted at the front, the rear and the left and right sides of the vehicle, respectively.
Fig. 2 shows the flow chart of the present invention; the method comprises the following steps:
S1, collecting the original image data of the multi-channel cameras, the multi-line laser radar point cloud data and the static laser radar point cloud data. Specifically, step S1 includes:
S11, acquisition of multi-channel camera image data:
First, find an open calibration field around which targets can be hung or placed. Place 5 or more targets evenly within the field of view of each camera in turn, and acquire the original image data of the multiple cameras.
S12, acquisition of multi-line laser radar point cloud data:
Power on the multi-line laser radar on the roof and scan, obtaining multi-line laser radar point cloud data whose three-dimensional coordinate origin is the position of the multi-line laser radar, i.e. the coordinates (x, y, z) of the laser radar position are (0, 0, 0).
S13, acquisition of static laser radar point cloud data:
Scan the whole calibration scene with a high-precision ground-based static laser radar (accuracy within 5 mm) to obtain the static laser radar point cloud data and the positions of the cameras within it. The static laser radar point cloud obtained here takes the position of the static laser radar as its three-dimensional coordinate origin, i.e. the coordinate values (X, Y, Z) of every point are relative to the position of the static laser radar.
S2, solving the intrinsic parameter model of each camera:
The camera intrinsic model is expressed as

$$K=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where f_x, f_y are the focal lengths of the camera and (c_x, c_y) is the principal point of the camera, and the distortion is described by the factors (k_1, k_2, k_3, k_4). The camera intrinsics and distortion factors are solved with Zhang Zhengyou's checkerboard calibration method, giving the camera intrinsic model.
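As an illustrative sketch only (not part of the patent text), step S2 can be carried out with OpenCV's implementation of Zhang Zhengyou's checkerboard calibration; the checkerboard geometry, image folder and output file names below are assumed placeholders, and OpenCV's default distortion model returns (k1, k2, p1, p2, k3) rather than the four factors (k1, k2, k3, k4) named above.

```python
import glob
import cv2
import numpy as np

# Assumed checkerboard: 9x6 inner corners, 25 mm squares (placeholders).
PATTERN = (9, 6)
SQUARE_MM = 25.0

# 3D corner coordinates of the flat board in its own frame (Z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points, image_size = [], [], None
for path in glob.glob("camera1_checkerboard/*.png"):  # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]  # (width, height)

# K contains fx, fy, cx, cy; dist holds the distortion coefficients used in S3.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("reprojection RMS:", rms)
np.save("camera1_K.npy", K)
np.save("camera1_dist.npy", dist)
```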
S3, using the camera intrinsics and distortion factors obtained in S2, remove the distortion from the images collected by each camera to obtain the corrected images.
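A minimal sketch of the distortion removal in S3, assuming the intrinsics and distortion coefficients were saved as in the sketch above; the image file names are placeholders.

```python
import cv2
import numpy as np

# Assumed outputs of the S2 calibration sketch.
K = np.load("camera1_K.npy")
dist = np.load("camera1_dist.npy")

raw = cv2.imread("camera1_raw.png")          # hypothetical raw frame from camera 1
corrected = cv2.undistort(raw, K, dist)      # apply intrinsics + distortion factors
cv2.imwrite("camera1_corrected.png", corrected)
```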
S4, registering the static laser radar point cloud data into the multi-line laser radar point cloud coordinate system: manually register the static laser radar point cloud against the multi-line laser radar point cloud obtained in S1, converting the point cloud coordinate system whose origin is the static laser radar position into the point cloud coordinate system whose origin is the multi-line laser radar position. This yields point cloud data of the whole calibration scene (including the vehicle, the cameras, the targets and other objects). This step is completed with professional software (e.g. RiPROCESS).
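The registration itself is done interactively in the point-cloud software; the sketch below only illustrates the final coordinate conversion, assuming the software can export the resulting 4×4 rigid transform (static-lidar frame to multi-line-lidar frame) and the point cloud as plain-text files with the hypothetical names shown.

```python
import numpy as np

# Assumed exports from the registration software (hypothetical file names).
T = np.loadtxt("static_to_multiline_T.txt").reshape(4, 4)   # 4x4 rigid transform
static_xyz = np.loadtxt("static_lidar_points.xyz")          # N x 3 points, static-lidar frame

# Homogeneous transform into the multi-line lidar coordinate system.
homog = np.hstack([static_xyz, np.ones((len(static_xyz), 1))])
registered = (T @ homog.T).T[:, :3]
np.savetxt("scene_in_multiline_frame.xyz", registered)
```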
S5, acquiring the position (X_s, Y_s, Z_s) of each camera in the multi-line laser radar point cloud coordinate system from the point cloud data registered in S4.
S6, selecting, in the corrected image of each camera, the pixel coordinates (u, v) of at least 4 targets and the corresponding three-dimensional coordinates (X_p, Y_p, Z_p) of those targets in the scene point cloud whose origin is the multi-line laser radar.
S7, establishing the collinearity equations from the intrinsic model of each camera, the camera position (X_s, Y_s, Z_s), and the pixel coordinates (u, v) and three-dimensional coordinates (X_p, Y_p, Z_p) of the targets corresponding to that camera, solving for the attitude angle elements and the nine direction cosines of each camera, and completing the calibration.
The collinearity equations are:

$$u-c_x=-f\,\frac{a_1(X_p-X_s)+b_1(Y_p-Y_s)+c_1(Z_p-Z_s)}{a_3(X_p-X_s)+b_3(Y_p-Y_s)+c_3(Z_p-Z_s)};$$

$$v-c_y=-f\,\frac{a_2(X_p-X_s)+b_2(Y_p-Y_s)+c_2(Z_p-Z_s)}{a_3(X_p-X_s)+b_3(Y_p-Y_s)+c_3(Z_p-Z_s)};$$

where f is the perpendicular distance from the lens centre to the image plane (i.e. the focal length), and a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 are the nine direction cosines of each camera;
The relationship between each direction cosine and the image attitude angles φ, ω and γ is as follows:

b_1 = cos ω sin γ;
b_2 = cos ω cos γ;
b_3 = −sin ω;

where φ, ω and γ are the rotation angles about, respectively, the Y axis, the X axis and the Z axis, with the multi-line laser radar as the coordinate origin (i.e. the frame is first rotated by the angle φ about the Y axis, then by the angle ω about the X axis, and finally by the angle γ about the Z axis); each principal axis is a fixed axis whose spatial direction does not change during the rotation. The collinearity condition equations are solved by least squares to estimate the attitude angle elements and the nine direction cosines of each camera, completing the calibration.
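A sketch of the S7 estimation under stated assumptions, not the patent's reference implementation: the rotation order follows the text (φ about Y, then ω about X, then γ about Z), the single focal length f is replaced by the per-axis f_x, f_y from the intrinsic model, and all numeric inputs are placeholders. SciPy's least_squares minimizes the collinearity residuals over (φ, ω, γ); the rows of the resulting rotation matrix are the nine direction cosines.

```python
import numpy as np
from scipy.optimize import least_squares

def rotation_from_angles(phi, omega, gamma):
    """Rotation for the stated order: phi about Y, omega about X, gamma about Z."""
    Ry = np.array([[np.cos(phi), 0, -np.sin(phi)],
                   [0,           1,  0          ],
                   [np.sin(phi), 0,  np.cos(phi)]])
    Rx = np.array([[1, 0,             0             ],
                   [0, np.cos(omega), -np.sin(omega)],
                   [0, np.sin(omega),  np.cos(omega)]])
    Rz = np.array([[np.cos(gamma), -np.sin(gamma), 0],
                   [np.sin(gamma),  np.cos(gamma), 0],
                   [0,              0,             1]])
    return Ry @ Rx @ Rz  # rows are (a1, a2, a3), (b1, b2, b3), (c1, c2, c3)

def residuals(angles, uv, xyz, cam_pos, fx, fy, cx, cy):
    """Collinearity residuals: (u - cx) + f*N_u/D and (v - cy) + f*N_v/D should be 0."""
    R = rotation_from_angles(*angles)
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = R
    dX, dY, dZ = (xyz - cam_pos).T
    den = a3 * dX + b3 * dY + c3 * dZ
    ru = (uv[:, 0] - cx) + fx * (a1 * dX + b1 * dY + c1 * dZ) / den
    rv = (uv[:, 1] - cy) + fy * (a2 * dX + b2 * dY + c2 * dZ) / den
    return np.concatenate([ru, rv])

# Placeholder inputs: >= 4 target pixel coords (S6), their 3D coords in the lidar
# frame (S6), the camera position from S5, and intrinsics from S2.
uv  = np.array([[612.0, 402.0], [880.0, 391.0], [595.0, 655.0], [902.0, 660.0]])
xyz = np.array([[5.1, -1.2, 0.4], [5.0, 1.1, 0.4], [5.2, -1.2, -0.9], [5.1, 1.2, -0.9]])
cam = np.array([1.8, 0.0, -0.6])
fx, fy, cx, cy = 1400.0, 1400.0, 960.0, 540.0

sol = least_squares(residuals, x0=np.zeros(3), args=(uv, xyz, cam, fx, fy, cx, cy))
phi, omega, gamma = sol.x
print("attitude angles (phi, omega, gamma):", phi, omega, gamma)
print("nine direction cosines:\n", rotation_from_angles(phi, omega, gamma))
```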
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (4)
1. A mixed calibration method for a multi-line laser radar and multi-channel cameras, characterized by comprising the following steps:
S1, collecting the original image data of the multi-channel cameras, the multi-line laser radar point cloud data and the static laser radar point cloud data;
S2, solving the intrinsic parameter model of each camera;
S3, removing distortion from the images collected by each camera to obtain corrected images;
S4, registering the static laser radar point cloud data into the multi-line laser radar point cloud coordinate system;
S5, acquiring the position (X_s, Y_s, Z_s) of each camera in the multi-line laser radar point cloud coordinate system from the point cloud data registered in S4;
S6, selecting, in the corrected image of each camera, the pixel coordinates (u, v) of at least 4 targets and the corresponding three-dimensional coordinates (X_p, Y_p, Z_p) of those targets in the scene point cloud whose origin is the multi-line laser radar;
S7, establishing collinearity equations from the intrinsic model of each camera, the camera position (X_s, Y_s, Z_s), and the pixel coordinates (u, v) and three-dimensional coordinates (X_p, Y_p, Z_p) of the targets corresponding to that camera, solving for the attitude angle elements and the nine direction cosines of each camera, and completing the calibration.
2. The multi-line laser radar and multi-channel camera hybrid calibration method of claim 1, wherein step S1 specifically comprises:
S11, acquisition of multi-channel camera image data:
parking the vehicle, then placing a plurality of targets evenly within the field of view of each camera in turn, and acquiring the original image data of the multiple cameras;
S12, acquisition of multi-line laser radar point cloud data:
powering on the multi-line laser radar on the vehicle roof and scanning, to obtain multi-line laser radar point cloud data whose three-dimensional coordinate origin is the position of the multi-line laser radar;
S13, acquisition of static laser radar point cloud data:
scanning the whole scene with a ground-based static laser radar, to obtain static laser radar point cloud data and the positions of the cameras within it.
3. The multi-line laser radar and multi-channel camera hybrid calibration method of claim 1, wherein in step S2 the camera intrinsic model is expressed as

$$K=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

where f_x, f_y are the focal lengths of the camera and (c_x, c_y) is the principal point of the camera; the camera intrinsics and distortion factors are solved with Zhang Zhengyou's checkerboard calibration method, giving the camera intrinsic model.
4. The multi-line laser radar and multi-channel camera hybrid calibration method of claim 3, wherein in step S7 the collinearity equations are:

$$u-c_x=-f\,\frac{a_1(X_p-X_s)+b_1(Y_p-Y_s)+c_1(Z_p-Z_s)}{a_3(X_p-X_s)+b_3(Y_p-Y_s)+c_3(Z_p-Z_s)};$$

$$v-c_y=-f\,\frac{a_2(X_p-X_s)+b_2(Y_p-Y_s)+c_2(Z_p-Z_s)}{a_3(X_p-X_s)+b_3(Y_p-Y_s)+c_3(Z_p-Z_s)};$$

where f is the perpendicular distance from the lens centre to the image plane, and a_1, a_2, a_3, b_1, b_2, b_3, c_1, c_2, c_3 are the nine direction cosines of each camera; the relationship between each direction cosine and the image attitude angles φ, ω and γ is as follows:

b_1 = cos ω sin γ;
b_2 = cos ω cos γ;
b_3 = −sin ω;

where φ, ω and γ are the rotation angles about, respectively, the Y axis, the X axis and the Z axis as principal axes, with the multi-line laser radar as the coordinate origin; the attitude angle elements and the nine direction cosines of each camera are estimated, completing the calibration.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201711012232.9A (granted as CN108020826B) | 2017-10-26 | 2017-10-26 | Multi-line laser radar and multichannel camera mixed calibration method

Publications (2)

Publication Number | Publication Date
---|---
CN108020826A | 2018-05-11
CN108020826B | 2019-11-19

Family ID: 62080304

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN201711012232.9A (Active, granted as CN108020826B) | Multi-line laser radar and multichannel camera mixed calibration method | 2017-10-26 | 2017-10-26

Country Status (1)

Country | Link
---|---
CN | CN108020826B (en)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104142157A (en) * | 2013-05-06 | 2014-11-12 | 北京四维图新科技股份有限公司 | Calibration method, device and equipment |
CN103500338A (en) * | 2013-10-16 | 2014-01-08 | 厦门大学 | Road zebra crossing automatic extraction method based on vehicle-mounted laser scanning point cloud |
CN105678783A (en) * | 2016-01-25 | 2016-06-15 | 西安科技大学 | Data fusion calibration method of catadioptric panorama camera and laser radar |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109300162A (en) * | 2018-08-17 | 2019-02-01 | 浙江工业大学 | A kind of multi-line laser radar and camera combined calibrating method based on fining radar scanning marginal point |
CN109300162B (en) * | 2018-08-17 | 2021-08-03 | 浙江工业大学 | Multi-line laser radar and camera combined calibration method based on refined radar scanning edge points |
CN109345596B (en) * | 2018-09-19 | 2024-07-12 | 阿波罗智能技术(北京)有限公司 | Multi-sensor calibration method, device, computer equipment, medium and vehicle |
CN109343061A (en) * | 2018-09-19 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | Transducer calibration method, device, computer equipment, medium and vehicle |
CN109345596A (en) * | 2018-09-19 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | Multisensor scaling method, device, computer equipment, medium and vehicle |
US11002840B2 (en) | 2018-09-19 | 2021-05-11 | Baidu Online Network Technology (Beijing) Co., Ltd. | Multi-sensor calibration method, multi-sensor calibration device, computer device, medium and vehicle |
US11042762B2 (en) | 2018-09-19 | 2021-06-22 | Baidu Online Network Technology (Beijing) Co., Ltd. | Sensor calibration method and device, computer device, medium, and vehicle |
CN109633612A (en) * | 2018-10-18 | 2019-04-16 | 浙江大学 | A kind of single line laser radar that nothing is observed jointly and Camera extrinsic scaling method |
CN110609268B (en) * | 2018-11-01 | 2022-04-29 | 驭势科技(北京)有限公司 | Laser radar calibration method, device and system and storage medium |
CN110609268A (en) * | 2018-11-01 | 2019-12-24 | 驭势科技(北京)有限公司 | Laser radar calibration method, device and system and storage medium |
CN109584183A (en) * | 2018-12-05 | 2019-04-05 | 吉林大学 | A kind of laser radar point cloud goes distortion method and system |
CN111538008A (en) * | 2019-01-18 | 2020-08-14 | 杭州海康威视数字技术股份有限公司 | Transformation matrix determining method, system and device |
CN109900205A (en) * | 2019-02-21 | 2019-06-18 | 武汉大学 | A kind of quick calibrating method of high-precision single line laser device and optical camera |
CN109900205B (en) * | 2019-02-21 | 2020-04-24 | 武汉大学 | High-precision single-line laser and optical camera rapid calibration method |
CN109949371A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | A kind of scaling method for laser radar and camera data |
CN110149463A (en) * | 2019-04-22 | 2019-08-20 | 上海大学 | It is a kind of to carry the hand-held line-structured light camera for turning station measurement target |
CN110244282A (en) * | 2019-06-10 | 2019-09-17 | 于兴虎 | A kind of multicamera system and laser radar association system and its combined calibrating method |
CN110200552A (en) * | 2019-06-20 | 2019-09-06 | 小狗电器互联网科技(北京)股份有限公司 | The measurement terminals of laser radar are gone with the method and sweeper of distortion |
CN110353577A (en) * | 2019-08-09 | 2019-10-22 | 小狗电器互联网科技(北京)股份有限公司 | A kind of laser radar point cloud data goes the method and Floor-sweeping device of distortion |
CN110353577B (en) * | 2019-08-09 | 2020-12-08 | 小狗电器互联网科技(北京)股份有限公司 | Laser radar point cloud data distortion removal method and sweeping device |
CN112241989A (en) * | 2019-08-22 | 2021-01-19 | 北京新能源汽车技术创新中心有限公司 | External parameter calibration method and device, computer equipment and storage medium |
CN112823294A (en) * | 2019-09-18 | 2021-05-18 | 北京嘀嘀无限科技发展有限公司 | System and method for calibrating camera and multiline lidar |
CN112823294B (en) * | 2019-09-18 | 2024-02-02 | 北京航迹科技有限公司 | System and method for calibrating cameras and multi-line lidar |
CN110765894A (en) * | 2019-09-30 | 2020-02-07 | 杭州飞步科技有限公司 | Target detection method, device, equipment and computer readable storage medium |
CN110765894B (en) * | 2019-09-30 | 2022-07-08 | 杭州飞步科技有限公司 | Target detection method, device, equipment and computer readable storage medium |
CN111325801B (en) * | 2020-01-23 | 2022-03-15 | 天津大学 | Combined calibration method for laser radar and camera |
CN111325801A (en) * | 2020-01-23 | 2020-06-23 | 天津大学 | Combined calibration method for laser radar and camera |
CN111311742A (en) * | 2020-03-27 | 2020-06-19 | 北京百度网讯科技有限公司 | Three-dimensional reconstruction method, three-dimensional reconstruction device and electronic equipment |
CN111413689B (en) * | 2020-05-07 | 2023-04-07 | 沃行科技(南京)有限公司 | Efficient static calibration method for realizing multi-laser radar point cloud alignment based on rviz |
CN111413689A (en) * | 2020-05-07 | 2020-07-14 | 沃行科技(南京)有限公司 | Efficient static calibration method for realizing multi-laser radar point cloud alignment based on rviz |
CN112230241A (en) * | 2020-10-23 | 2021-01-15 | 湖北亿咖通科技有限公司 | Calibration method based on random scanning type radar |
CN112396663A (en) * | 2020-11-17 | 2021-02-23 | 广东电科院能源技术有限责任公司 | Visual calibration method, device, equipment and medium for multi-depth camera |
CN113129590A (en) * | 2021-04-12 | 2021-07-16 | 武汉理工大学 | Traffic facility information intelligent analysis method based on vehicle-mounted radar and graphic measurement |
WO2022256976A1 (en) * | 2021-06-07 | 2022-12-15 | 深圳市大疆创新科技有限公司 | Method and system for constructing dense point cloud truth value data and electronic device |
CN113625288A (en) * | 2021-06-15 | 2021-11-09 | 中国科学院自动化研究所 | Camera and laser radar pose calibration method and device based on point cloud registration |
CN113534110B (en) * | 2021-06-24 | 2023-11-24 | 香港理工大学深圳研究院 | Static calibration method for multi-laser radar system |
CN113534110A (en) * | 2021-06-24 | 2021-10-22 | 香港理工大学深圳研究院 | Static calibration method for multi-laser radar system |
CN114047487B (en) * | 2021-11-05 | 2022-07-26 | 深圳市镭神智能系统有限公司 | Radar and vehicle body external parameter calibration method and device, electronic equipment and storage medium |
CN114047487A (en) * | 2021-11-05 | 2022-02-15 | 深圳市镭神智能系统有限公司 | Radar and vehicle body external parameter calibration method and device, electronic equipment and storage medium |
CN116499364A (en) * | 2023-06-30 | 2023-07-28 | 济南作为科技有限公司 | Method and system for cloud adjustment distortion of three-dimensional laser point of coal-coiling instrument |
CN116499364B (en) * | 2023-06-30 | 2023-09-12 | 济南作为科技有限公司 | Method and system for cloud adjustment distortion of three-dimensional laser point of coal-coiling instrument |
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant