CN110221275A - Calibration method and device between laser radar and camera - Google Patents

Calibration method and device between laser radar and camera

Info

Publication number
CN110221275A
CN110221275A (application CN201910425720.5A)
Authority
CN
China
Prior art keywords
camera
preset
calibration board
laser radar
rotation vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910425720.5A
Other languages
Chinese (zh)
Other versions
CN110221275B (en)
Inventor
温英杰
孙孟孟
李凯
张斌
李吉利
林巧
曹丹
李卫斌
周光祥
余辉
蓝天翔
顾敏奇
吴紫薇
梁庆羽
毛非一
刘宿东
张善康
李文桐
张成华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cainiao Smart Logistics Holding Ltd
Original Assignee
Cainiao Smart Logistics Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cainiao Smart Logistics Holding Ltd filed Critical Cainiao Smart Logistics Holding Ltd
Priority to CN201910425720.5A priority Critical patent/CN110221275B/en
Publication of CN110221275A publication Critical patent/CN110221275A/en
Priority to PCT/CN2020/089722 priority patent/WO2020233443A1/en
Application granted granted Critical
Publication of CN110221275B publication Critical patent/CN110221275B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of the present application provide a calibration method and a calibration device between a laser radar and a camera. The method includes: acquiring an image collected by the camera of a calibration board and a point cloud collected by the laser radar of the calibration board; determining a plurality of first rotation vectors within a preset first rotation vector interval; calculating, for each first rotation vector, the degree of coincidence between the corresponding image and point cloud; and determining the first rotation vector corresponding to the maximum degree of coincidence as the rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments, calibrating a mid- or low-precision laser radar to a camera can still meet the calibration accuracy requirements of an unmanned vehicle.

Description

Calibration method and device between a laser radar and a camera
Technical field
This application relates to the field of computer technology, and in particular to a calibration method between a laser radar and a camera, a calibration method, a calibration device between a laser radar and a camera, and a calibration device.
Background technique
With the development of unmanned driving technology, almost all current unmanned vehicles adopt a multi-sensor fusion scheme and are equipped with multiple sensors such as laser radars and industrial cameras. In an unmanned driving scheme, the coordinate systems of the multiple sensors need to be transformed into one unified coordinate system to achieve spatial fusion of the multi-sensor data.
Current multi-sensor calibration is mainly divided into manual calibration and automatic calibration. In manual calibration, a professional with calibration experience calibrates by hand on offline sensor data using a specific calibration method, which is not suitable for mass calibration.
Automatic calibration realizes automated multi-sensor calibration through a specific algorithm, by selecting a specific calibration scene and calibration fixtures.
Most automated calibration schemes currently on the market are designed for unmanned automobiles equipped with high-end laser radars, and are not applicable to unmanned vehicles using mid- and low-end laser radars.
Since the ranging accuracy and number of laser lines of a mid- or low-end laser radar are far below those of a high-end laser radar, the environmental point cloud it produces is neither as rich nor as accurate as that of a high-end radar. A calibration algorithm designed for high-end radars therefore cannot meet the calibration accuracy requirements of an unmanned vehicle using a mid- or low-end laser radar.
Summary of the invention
In view of the above problems, embodiments of the present application are proposed to provide a calibration method between a laser radar and a camera, a calibration method, a calibration device between a laser radar and a camera, and a calibration device that overcome the above problems or at least partially solve them.
To solve the above problems, an embodiment of the present application discloses a calibration method between a laser radar and a camera, including:
acquiring an image collected by the camera of a calibration board, and a point cloud collected by the laser radar of the calibration board;
determining a plurality of first rotation vectors within a preset first rotation vector interval;
calculating, for each first rotation vector, the degree of coincidence between the corresponding image and point cloud;
determining the first rotation vector corresponding to the maximum degree of coincidence as the rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
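The steps above amount to an exhaustive search over candidate rotation vectors, scored by image/point-cloud coincidence. A minimal Python sketch under that reading (all names illustrative; the scoring function here is a toy stand-in for the projection-based coincidence measure described later):

```python
import numpy as np

def search_rotation(rotation_candidates, coincidence_fn):
    """Return the candidate rotation vector with the highest coincidence score.

    rotation_candidates: iterable of (roll, pitch, yaw) tuples in radians.
    coincidence_fn: maps a rotation vector to a scalar coincidence score.
    """
    best_rot, best_score = None, -np.inf
    for rot in rotation_candidates:
        score = coincidence_fn(rot)
        if score > best_score:
            best_rot, best_score = rot, score
    return best_rot, best_score

# Toy usage: the "true" rotation is (0.1, 0.2, 0.3); coincidence peaks there.
true_rot = np.array([0.1, 0.2, 0.3])
candidates = [(r, p, y)
              for r in np.arange(0.0, 0.31, 0.1)
              for p in np.arange(0.0, 0.31, 0.1)
              for y in np.arange(0.0, 0.31, 0.1)]
score = lambda rot: -np.linalg.norm(np.array(rot) - true_rot)
best, _ = search_rotation(candidates, score)
```

Since the translation is fixed, only this three-dimensional rotation space is searched, which keeps the candidate set tractable.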
Optionally, calculating, for each first rotation vector, the degree of coincidence between the corresponding image and point cloud includes:
acquiring the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and acquiring the intrinsic parameters of the camera;
determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector respectively;
for each first transformation matrix, calculating the degree of coincidence between the corresponding image and point cloud using the first transformation matrix and the intrinsic parameters of the camera.
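Each first transformation matrix combines one candidate rotation vector with the fixed, measured translation. A sketch, assuming roll/pitch/yaw Euler angles composed in Z-Y-X order (the patent does not fix the axis convention, so this ordering is an assumption):

```python
import numpy as np

def euler_to_matrix(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw angles (Z-Y-X composition assumed)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def transform_matrix(rvec, tvec):
    """4x4 homogeneous lidar-to-camera transform from a rotation vector
    (roll, pitch, yaw) and the measured, fixed translation vector (x, y, z)."""
    T = np.eye(4)
    T[:3, :3] = euler_to_matrix(*rvec)
    T[:3, 3] = tvec
    return T

# Sanity check: zero rotation leaves a point shifted only by the translation.
p = np.array([1.0, 2.0, 3.0, 1.0])
T = transform_matrix((0.0, 0.0, 0.0), (0.5, 0.0, -0.2))
moved = T @ p  # x shifted by 0.5, z by -0.2
```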
Optionally, calculating the degree of coincidence between the corresponding image and point cloud using the first transformation matrix and the intrinsic parameters of the camera includes:
acquiring the camera coordinate system of the camera;
determining the contour of the calibration board in the image, and determining the three-dimensional coordinates of the calibration-board point cloud located on the calibration board in the point cloud;
projecting the calibration-board point cloud onto the image using the first transformation matrix, the intrinsic parameters of the camera, and the three-dimensional coordinates of the calibration-board point cloud, to obtain a first projected point cloud;
determining the number of first target projected points in the first projected point cloud that fall within the contour of the calibration board in the image;
determining the degree of coincidence between the image and the point cloud using the number of first target projected points.
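The project-and-count step can be sketched as follows. The board contour is simplified here to an axis-aligned rectangle; a real implementation would test against the detected polygonal contour. All names are illustrative:

```python
import numpy as np

def project_points(points_lidar, T, K):
    """Project 3-D lidar points into pixel coordinates.

    points_lidar: (N, 3) board points in the lidar frame.
    T: 4x4 lidar-to-camera transform.  K: 3x3 camera intrinsic matrix.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T @ pts_h.T).T[:, :3]
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]        # perspective divide

def coincidence(points_lidar, T, K, board_rect):
    """Fraction of projected board points landing inside the board contour.

    board_rect: (u_min, v_min, u_max, v_max), a rectangular stand-in for the
    detected calibration-board contour in the image.
    """
    uv = project_points(points_lidar, T, K)
    u_min, v_min, u_max, v_max = board_rect
    inside = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
              (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
    return inside.mean()

# Toy usage: three board points 5 m ahead project near the image center.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 5.0], [0.1, 0.0, 5.0], [0.0, 0.1, 5.0]])
frac = coincidence(pts, np.eye(4), K, (300, 220, 340, 260))  # 1.0
```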
Optionally, determining the degree of coincidence between the image and the point cloud using the number of first target projected points includes:
calculating, for each calibration board, the first target projected point ratio between the number of corresponding first target projected points and the number of points in the calibration-board point cloud of that calibration board;
determining the degree of coincidence between the image and the point cloud using the first target projected point ratio.
Optionally, determining a plurality of first rotation vectors within the preset first rotation vector interval includes:
determining a plurality of first rotation vectors within the preset first rotation vector interval according to a preset radian step.
Optionally, the preset first rotation vector interval includes a preset first roll angle interval, a preset first pitch angle interval, and a preset first yaw angle interval; determining a plurality of first rotation vectors within the preset first rotation vector interval according to the preset radian step includes:
determining a plurality of roll angles within the preset first roll angle interval according to the preset radian step;
determining a plurality of pitch angles within the preset first pitch angle interval according to the preset radian step;
determining a plurality of yaw angles within the preset first yaw angle interval according to the preset radian step;
selecting one roll angle from the plurality of roll angles, one pitch angle from the plurality of pitch angles, and one yaw angle from the plurality of yaw angles, and combining them to obtain a plurality of first rotation vectors.
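Sampling each angle interval at the preset step and taking the Cartesian product of the three axes can be sketched as (names illustrative):

```python
import numpy as np
from itertools import product

def rotation_grid(roll_iv, pitch_iv, yaw_iv, step):
    """All (roll, pitch, yaw) combinations sampled at `step` radians within
    the three preset intervals; each interval is a (low, high) pair."""
    axes = [np.arange(lo, hi + 1e-9, step)   # 1e-9 keeps the endpoint
            for lo, hi in (roll_iv, pitch_iv, yaw_iv)]
    return list(product(*axes))

# 3 samples per axis -> 27 candidate rotation vectors.
grid = rotation_grid((-0.02, 0.02), (-0.02, 0.02), (-0.02, 0.02), 0.02)
```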
Optionally, the method further includes:
acquiring the horizontal field-of-view angle and the vertical field-of-view angle of the camera and the resolution of the image;
dividing the horizontal field-of-view angle by the width of the resolution to obtain a first radian;
dividing the vertical field-of-view angle by the height of the resolution to obtain a second radian;
taking the smaller of the first radian and the second radian as the preset radian step.
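This chooses the angular size of one image pixel as the sampling step, so neighboring candidate rotations differ by at most about one pixel in the projection. A sketch (field-of-view values here are illustrative):

```python
import math

def radian_step(h_fov_deg, v_fov_deg, width, height):
    """Angular size of one pixel along each image axis; the smaller one is
    taken as the sampling step so the rotation grid is at least pixel-fine."""
    first = math.radians(h_fov_deg) / width     # horizontal rad per pixel
    second = math.radians(v_fov_deg) / height   # vertical rad per pixel
    return min(first, second)

# e.g. a 90 x 60 degree camera at 1920x1080 resolution
step = radian_step(90, 60, 1920, 1080)  # ~0.000818 rad per pixel
```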
Optionally, the method further includes:
determining a reference rotation vector;
determining the preset first rotation vector interval using the reference rotation vector and the preset radian step.
Optionally, determining the reference rotation vector includes:
acquiring a preset second rotation vector interval, the preset second rotation vector interval including a preset second roll angle interval, a preset second pitch angle interval, and a preset second yaw angle interval;
adjusting the pitch angle within the preset second pitch angle interval, and adjusting the yaw angle within the preset second yaw angle interval;
determining the target pitch angle and target yaw angle at which the center of the calibration board in the image coincides with the center of the first projected point cloud;
adjusting the roll angle within the preset second roll angle interval under the target pitch angle and target yaw angle, to obtain a plurality of second rotation vectors;
determining the reference rotation vector from the plurality of second rotation vectors.
Optionally, determining the reference rotation vector from the plurality of second rotation vectors includes:
determining a plurality of second transformation matrices using the plurality of second rotation vectors and the translation vector between the coordinate system of the laser radar and the coordinate system of the camera respectively;
for each second transformation matrix, calculating the degree of coincidence between the corresponding image and point cloud using the second transformation matrix and the intrinsic parameters of the camera;
determining the second rotation vector corresponding to the maximum degree of coincidence as the reference rotation vector.
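The reference rotation vector is thus found in two coarse stages: pitch and yaw are chosen to align the projected cloud's center with the board center, then roll is swept and scored. A hypothetical sketch; the offset and scoring callbacks stand in for the actual image measurements:

```python
import numpy as np

def reference_rotation(center_offset_fn, roll_scores, pitch_iv, yaw_iv, step):
    """Two-stage coarse search for the reference rotation vector.

    Stage 1: sweep pitch and yaw over their preset intervals and keep the
    pair minimizing center_offset_fn(pitch, yaw), the pixel distance between
    the board center in the image and the projected-cloud center.
    Stage 2: under that pitch/yaw, keep the roll with the highest coincidence
    (roll_scores maps candidate roll -> coincidence score).
    """
    pitches = np.arange(pitch_iv[0], pitch_iv[1] + 1e-9, step)
    yaws = np.arange(yaw_iv[0], yaw_iv[1] + 1e-9, step)
    best_p, best_y = min(((p, y) for p in pitches for y in yaws),
                         key=lambda py: center_offset_fn(*py))
    best_r = max(roll_scores, key=roll_scores.get)
    return (best_r, best_p, best_y)

# Toy usage: centers align at pitch=0.1, yaw=-0.1; roll=0.05 scores best.
best = reference_rotation(lambda p, y: abs(p - 0.1) + abs(y + 0.1),
                          {-0.05: 0.4, 0.0: 0.5, 0.05: 0.9},
                          (-0.2, 0.2), (-0.2, 0.2), 0.1)
```

Searching pitch/yaw first and roll second avoids scoring the full three-dimensional grid during initialization.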
Optionally, determining the three-dimensional coordinates of the calibration-board point cloud located on the calibration board in the point cloud includes:
extracting the calibration-board point cloud located on the calibration board from the point cloud using a point cloud clustering algorithm;
determining the three-dimensional coordinates of the calibration-board point cloud.
Optionally, determining the three-dimensional coordinates of the calibration-board point cloud located on the calibration board in the point cloud includes:
acquiring the reflectivity of each point in the point cloud;
determining the calibration-board point cloud located on the calibration board using the points whose reflectivity is greater than a preset reflectivity threshold;
determining the three-dimensional coordinates of the calibration-board point cloud.
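Selecting the board points by a reflectivity threshold is a simple mask over the cloud; a sketch with illustrative data (a high-reflectivity board stands out from its surroundings):

```python
import numpy as np

def board_points_by_reflectivity(points, reflectivity, threshold):
    """Return the 3-D coordinates of points whose reflectivity exceeds the
    preset threshold.

    points: (N, 3) array of lidar points; reflectivity: (N,) array in [0, 1].
    """
    mask = reflectivity > threshold
    return points[mask]

cloud = np.array([[1.0, 0.0, 5.0], [2.0, 1.0, 5.0], [0.0, 3.0, 9.0]])
refl = np.array([0.9, 0.2, 0.95])
board = board_points_by_reflectivity(cloud, refl, 0.8)  # keeps 2 of 3 points
```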
Optionally, determining the three-dimensional coordinates of the calibration-board point cloud located on the calibration board in the point cloud includes:
acquiring the size information of the calibration board;
determining the calibration-board point cloud located on the calibration board in the point cloud using the size information of the calibration board;
determining the three-dimensional coordinates of the calibration-board point cloud.
An embodiment of the present application also discloses a calibration method applied to an unmanned vehicle, the unmanned vehicle including at least one camera and at least one laser radar, each of the at least one camera and the at least one laser radar having its own coordinate system, the method including:
selecting a target camera from the at least one camera, and taking the coordinate system of the target camera as a reference coordinate system;
determining, in the at least one laser radar, a first laser radar associated with the target camera, and calibrating the coordinate system of the first laser radar to the reference coordinate system;
determining, among the cameras other than the target camera, a first camera corresponding to the first laser radar, and calibrating the coordinate system of the first camera to the coordinate system of the corresponding first laser radar;
determining a second laser radar not associated with the target camera, and determining a second camera corresponding to the second laser radar;
calibrating the coordinate system of the second camera to the coordinate system of the associated first laser radar, and calibrating the coordinate system of the second laser radar to the coordinate system of the second camera.
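Calibrating each sensor to the previous one in the chain yields pairwise extrinsics that compose into a single transform to the reference (target-camera) frame. A sketch with translation-only toy transforms (names illustrative):

```python
import numpy as np

def compose(*transforms):
    """Chain 4x4 extrinsics: the result maps the last frame in the chain into
    the first.  E.g. compose(T_ref_lidar, T_lidar_cam1) maps points from
    camera-1 coordinates into the reference (target-camera) frame."""
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

def make_T(tvec):
    """Rotation-free 4x4 transform, enough for this toy example."""
    T = np.eye(4)
    T[:3, 3] = tvec
    return T

# Toy chain: reference camera <- first laser radar <- first surround-view camera.
T_ref_lidar = make_T([1.0, 0.0, 0.0])
T_lidar_cam1 = make_T([0.0, 2.0, 0.0])
T_ref_cam1 = compose(T_ref_lidar, T_lidar_cam1)  # net offset (1, 2, 0)
```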
Optionally, the at least one camera includes at least one industrial camera and at least one surround-view camera; selecting a target camera from the at least one camera includes:
selecting one of the at least one industrial camera as the target camera.
Optionally, determining, among the cameras other than the target camera, the first camera corresponding to the first laser radar includes:
determining, among the at least one surround-view camera, a first surround-view camera corresponding to the first laser radar.
Optionally, determining the second camera corresponding to the second laser radar includes:
determining a second surround-view camera corresponding to the second laser radar.
An embodiment of the present application also discloses a calibration device between a laser radar and a camera, including:
an image acquisition module, configured to acquire an image collected by the camera of a calibration board and a point cloud collected by the laser radar of the calibration board;
a first rotation vector determining module, configured to determine a plurality of first rotation vectors within a preset first rotation vector interval;
a first coincidence degree calculating module, configured to calculate, for each first rotation vector, the degree of coincidence between the corresponding image and point cloud;
a rotation vector calibrating module, configured to determine the first rotation vector corresponding to the maximum degree of coincidence as the rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
Optionally, the first coincidence degree calculating module includes:
a parameter acquiring submodule, configured to acquire the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and acquire the intrinsic parameters of the camera;
a first transformation matrix determining submodule, configured to determine a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector respectively;
a first coincidence degree calculating submodule, configured to calculate, for each first transformation matrix, the degree of coincidence between the corresponding image and point cloud using the first transformation matrix and the intrinsic parameters of the camera.
Optionally, the first coincidence degree calculating submodule includes:
a camera coordinate system acquiring unit, configured to acquire the camera coordinate system of the camera;
an image information determining unit, configured to determine the contour of the calibration board in the image, and determine the three-dimensional coordinates of the calibration-board point cloud located on the calibration board in the point cloud;
a projecting unit, configured to project the calibration-board point cloud onto the image using the first transformation matrix, the intrinsic parameters of the camera, and the three-dimensional coordinates of the calibration-board point cloud, to obtain a first projected point cloud;
a target projected point determining unit, configured to determine the number of first target projected points in the first projected point cloud that fall within the contour of the calibration board in the image;
a first coincidence degree determining unit, configured to determine the degree of coincidence between the image and the point cloud using the number of first target projected points.
Optionally, the first coincidence degree determining unit includes:
a projection ratio calculating subunit, configured to calculate, for each calibration board, the first target projected point ratio between the number of corresponding first target projected points and the number of points in the calibration-board point cloud of that calibration board;
a first coincidence degree determining subunit, configured to determine the degree of coincidence between the image and the point cloud using the first target projected point ratio.
Optionally, the first rotation vector determining module includes:
a first rotation vector determining submodule, configured to determine a plurality of first rotation vectors within the preset first rotation vector interval according to a preset radian step.
Optionally, the preset first rotation vector interval includes a preset first roll angle interval, a preset first pitch angle interval, and a preset first yaw angle interval; the first rotation vector determining submodule includes:
a roll angle determining unit, configured to determine a plurality of roll angles within the preset first roll angle interval according to the preset radian step;
a pitch angle determining unit, configured to determine a plurality of pitch angles within the preset first pitch angle interval according to the preset radian step;
a yaw angle determining unit, configured to determine a plurality of yaw angles within the preset first yaw angle interval according to the preset radian step;
a first rotation vector determining unit, configured to select one roll angle from the plurality of roll angles, one pitch angle from the plurality of pitch angles, and one yaw angle from the plurality of yaw angles, and combine them to obtain a plurality of first rotation vectors.
Optionally, the device further includes:
a camera parameter acquiring module, configured to acquire the horizontal field-of-view angle and vertical field-of-view angle of the camera and the resolution of the image;
a first radian determining module, configured to divide the horizontal field-of-view angle by the width of the resolution to obtain a first radian;
a second radian determining module, configured to divide the vertical field-of-view angle by the height of the resolution to obtain a second radian;
a radian step determining module, configured to take the smaller of the first radian and the second radian as the preset radian step.
Optionally, the device further includes:
a reference rotation vector determining module, configured to determine a reference rotation vector;
a first rotation vector interval determining module, configured to determine the preset first rotation vector interval using the reference rotation vector and the preset radian step.
Optionally, the reference rotation vector determining module includes:
a second rotation vector interval acquiring submodule, configured to acquire a preset second rotation vector interval, the preset second rotation vector interval including a preset second roll angle interval, a preset second pitch angle interval, and a preset second yaw angle interval;
an angle adjusting submodule, configured to adjust the pitch angle within the preset second pitch angle interval, and adjust the yaw angle within the preset second yaw angle interval;
a target angle determining submodule, configured to determine the target pitch angle and target yaw angle at which the center of the calibration board in the image coincides with the center of the first projected point cloud;
a second rotation vector determining submodule, configured to adjust the roll angle within the preset second roll angle interval under the target pitch angle and target yaw angle, to obtain a plurality of second rotation vectors;
a reference rotation vector determining submodule, configured to determine the reference rotation vector from the plurality of second rotation vectors.
Optionally, the reference rotation vector determining submodule includes:
a second transformation matrix determining unit, configured to determine a plurality of second transformation matrices using the plurality of second rotation vectors and the translation vector between the coordinate system of the laser radar and the coordinate system of the camera respectively;
a second coincidence degree calculating unit, configured to calculate, for each second transformation matrix, the degree of coincidence between the corresponding image and point cloud using the second transformation matrix and the intrinsic parameters of the camera;
a reference rotation vector determining unit, configured to determine the second rotation vector corresponding to the maximum degree of coincidence as the reference rotation vector.
Optionally, the image information determining unit includes:
a first calibration-board point cloud determining subunit, configured to extract the calibration-board point cloud located on the calibration board from the point cloud using a point cloud clustering algorithm;
a first point cloud coordinate determining subunit, configured to determine the three-dimensional coordinates of the calibration-board point cloud.
Optionally, the image information determining unit includes:
a reflectivity acquiring subunit, configured to acquire the reflectivity of each point in the point cloud;
a second calibration-board point cloud determining subunit, configured to determine the calibration-board point cloud located on the calibration board using the points whose reflectivity is greater than a preset reflectivity threshold;
a second point cloud coordinate determining subunit, configured to determine the three-dimensional coordinates of the calibration-board point cloud.
Optionally, the image information determining unit includes:
a size information acquiring subunit, configured to acquire the size information of the calibration board;
a third calibration-board point cloud determining subunit, configured to determine the calibration-board point cloud located on the calibration board in the point cloud using the size information of the calibration board;
a third point cloud coordinate determining subunit, configured to determine the three-dimensional coordinates of the calibration-board point cloud.
An embodiment of the present application also discloses a calibration device applied to an unmanned vehicle, the unmanned vehicle including at least one camera and at least one laser radar, each of the at least one camera and the at least one laser radar having its own coordinate system, the device including:
a reference coordinate system determining module, configured to select a target camera from the at least one camera, and take the coordinate system of the target camera as the reference coordinate system;
a first calibrating module, configured to determine, in the at least one laser radar, a first laser radar associated with the target camera, and calibrate the coordinate system of the first laser radar to the reference coordinate system;
a second calibrating module, configured to determine, among the cameras other than the target camera, a first camera corresponding to the first laser radar, and calibrate the coordinate system of the first camera to the coordinate system of the corresponding first laser radar;
a non-association determining module, configured to determine a second laser radar not associated with the target camera, and determine a second camera corresponding to the second laser radar;
a third calibrating module, configured to calibrate the coordinate system of the second camera to the coordinate system of the associated first laser radar, and calibrate the coordinate system of the second laser radar to the coordinate system of the second camera.
Optionally, the at least one camera includes at least one industrial camera and at least one surround-view camera; the reference coordinate system determining module includes:
a target camera selecting submodule, configured to select one of the at least one industrial camera as the target camera.
Optionally, the second calibrating module includes:
a first surround-view camera determining submodule, configured to determine, among the at least one surround-view camera, a first surround-view camera corresponding to the first laser radar.
Optionally, the non-association determining module includes:
a second surround-view camera determining submodule, configured to determine a second surround-view camera corresponding to the second laser radar.
An embodiment of the present application also discloses a device, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the device to perform one or more of the methods described above.
An embodiment of the present application also discloses one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause the processors to perform one or more of the methods described above.
Embodiments of the present application include the following advantages:
In the embodiments of the present application, with the translation vector between the camera and the laser radar fixed, the first rotation vector that maximizes the degree of coincidence between the image collected by the camera and the point cloud collected by the laser radar is determined within a preset first rotation vector interval, and the first rotation vector corresponding to the maximum degree of coincidence is taken as the final rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments, even when a low-precision laser radar is calibrated to a camera, the calibration accuracy requirements of an unmanned vehicle can be met.
Brief description of the drawings
Fig. 1 is a flow chart of the steps of Embodiment 1 of a calibration method between a laser radar and a camera of the present application;
Fig. 2 is a flow chart of the steps of Embodiment 2 of a calibration method between a laser radar and a camera of the present application;
Fig. 3 is a schematic diagram of projecting a calibration-board point cloud onto an image in an embodiment of the present application;
Fig. 4 is another schematic diagram of projecting a calibration-board point cloud onto an image in an embodiment of the present application;
Fig. 5 is a flow chart of the steps of an embodiment of a calibration method of the present application;
Fig. 6 is a schematic diagram of an unmanned vehicle calibration scene in an embodiment of the present application;
Fig. 7 is a structural block diagram of an embodiment of a calibration device between a laser radar and a camera of the present application;
Fig. 8 is a structural block diagram of an embodiment of a calibration device of the present application.
Specific embodiments
To make the above objects, features, and advantages of the present application more apparent and easier to understand, the present application is further described in detail below with reference to the accompanying drawings and specific embodiments.
Current logistics unmanned vehicles use mid- and low-end laser radars; a calibration algorithm designed for high-end radars cannot meet the calibration accuracy requirements of a logistics unmanned vehicle.
Calibrating the laser to a camera (an industrial camera or a surround-view camera) means determining the transformation matrix RT from the laser coordinate system to the camera coordinate system. The transformation matrix RT is uniquely determined by the translation vector T(x, y, z) and the rotation vector R(r, p, y). If all six variables are optimized simultaneously, the search solution space is huge and the algorithm very easily converges to a locally optimal solution.
Considering that once the camera and the laser radar are fixedly installed their relative position is fixed, a very accurate value of the translation vector T can be obtained by measurement. Therefore, the embodiments of the present application fix the translation vector and traverse the rotation vector solution space to find the optimal rotation vector, thereby obtaining the optimal transformation matrix. The specific implementation is described in detail below.
Referring to Fig. 1, a flow chart of the steps of Embodiment 1 of a calibration method between a laser radar and a camera of the present application is shown, which may specifically include the following steps:
Step 101: acquiring an image collected by the camera of a calibration board, and a point cloud collected by the laser radar of the calibration board;
The calibration method of the embodiments of the present application is proposed for mid- and low-end laser radars; it is suitable for mid- and low-end laser radars and is also applicable to high-end laser radars.
An unmanned vehicle may include multiple cameras and multiple laser radars, and the method of the embodiments can be used to calibrate between each camera and each laser radar. The cameras may include industrial cameras, surround-view cameras, and other cameras applied to unmanned vehicles.
The calibration board is captured by both the camera and the laser radar: the camera collects an image that contains the calibration board, and the laser radar collects a point cloud that contains the laser points directed at and reflected by the calibration board. The transmitter of the laser radar emits a laser beam; after the beam hits an object, it is diffusely reflected back to the laser receiver, producing a laser point.
The embodiments of the present application place no restriction on the number or color of calibration boards; calibration boards of any color and in any number may be used. For example, three red PVC foam boards of size 80 cm x 80 cm may be used as calibration boards.
Step 102: determine multiple first rotation vectors within a preset first rotation-vector interval;
A rotation vector is (r, p, y), where r is the roll angle, p is the pitch angle, and y is the yaw angle.
Once the relative position of the camera and the laser radar is fixed, the translation vector T between them can be obtained by accurate measurement; therefore only the optimal rotation vector needs to be found within the preset first rotation-vector interval in order to obtain the optimal transformation matrix.
Step 103: for each first rotation vector, calculate the corresponding registration between the image and the point cloud;
The image acquired by the camera contains objects whose positions in the image are determined; the point cloud is determined by the laser radar from the laser light reflected by those objects, so the coordinates of the point cloud also reflect the objects' positions. Registration (degree of coincidence) is a parameter describing how well the point-cloud coordinates coincide with the object positions in the image.
Under different rotation vectors, the relative position of the image and the point cloud changes, and so does the registration.
Step 104: determine the first rotation vector with the maximum registration as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera.
The larger the registration, the more accurate the calibration result. The first rotation vector that maximizes the registration can therefore be taken as the final rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
In the embodiments of the present application, with the translation vector between the camera and the laser radar fixed, the first rotation vector that maximizes the registration between the image acquired by the camera and the point cloud acquired by the laser radar is determined within the preset first rotation-vector interval and taken as the final rotation vector calibrating the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments of the present application, the calibration-accuracy requirement of the unmanned vehicle can be met even when calibrating a low-accuracy laser radar to a camera.
When calibrating the cameras and laser radars of an unmanned vehicle, a reference coordinate system can first be determined, for example by choosing the coordinate system of one camera. With the method of the embodiments of the present application, the coordinate system of every laser radar and every camera other than the reference can be calibrated to the reference coordinate system, thereby calibrating the unmanned vehicle.
Moreover, the calibration method of the embodiments of the present application enables automated calibration. In actual operation, after the factory calibration of an unmanned vehicle is completed, sensors are inevitably replaced once the vehicle enters service. The vehicle must then be recalibrated for the replaced sensors, and cannot operate until that calibration is complete. With the calibration method of the present application, sensors can be replaced immediately, calibrated immediately, and put back into operation immediately.
Referring to Fig. 2, a flow chart of the steps of embodiment two of a calibration method between a laser radar and a camera of the present application is shown; the method may specifically include the following steps:
Step 201: obtain an image acquired by the camera for a calibration board, and a point cloud acquired by the laser radar for the calibration board;
Step 202: determine multiple first rotation vectors within a preset first rotation-vector interval;
In the embodiments of the present application, step 202 may include: within the preset first rotation-vector interval, determining multiple first rotation vectors at a preset radian interval.
In implementation, the entire preset first rotation-vector interval can be traversed using the preset radian interval as the step size, thereby determining the multiple first rotation vectors.
Specifically, the preset first rotation-vector interval includes a preset first roll-angle interval, a preset first pitch-angle interval, and a preset first yaw-angle interval. Multiple roll angles are determined in the preset first roll-angle interval at the preset radian interval; multiple pitch angles in the preset first pitch-angle interval at the preset radian interval; and multiple yaw angles in the preset first yaw-angle interval at the preset radian interval. One roll angle is chosen from the multiple roll angles, one pitch angle from the multiple pitch angles, and one yaw angle from the multiple yaw angles; the combinations yield multiple first rotation vectors.
For example, let the preset first rotation-vector interval be [(r1, p1, y1), (r2, p2, y2)]: the preset first roll-angle interval is [r1, r2], from which n1 roll angles are determined at the preset radian interval; the preset first pitch-angle interval is [p1, p2], from which n2 pitch angles are determined; and the preset first yaw-angle interval is [y1, y2], from which n3 yaw angles are determined. Choosing one roll angle from the n1 roll angles, one pitch angle from the n2 pitch angles, and one yaw angle from the n3 yaw angles and combining them yields n1*n2*n3 first rotation vectors in total.
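The n1*n2*n3 enumeration described above amounts to a Cartesian-product grid over the three angle intervals. A minimal sketch, with illustrative interval bounds and step size that are not taken from any specific embodiment:

```python
import itertools

def rotation_grid(r_range, p_range, y_range, step):
    """Enumerate candidate rotation vectors (roll, pitch, yaw) on a regular grid.

    Each interval [lo, hi] is sampled every `step` radians, endpoints included.
    """
    def samples(lo, hi):
        n = int(round((hi - lo) / step)) + 1
        return [lo + i * step for i in range(n)]

    return list(itertools.product(samples(*r_range),
                                  samples(*p_range),
                                  samples(*y_range)))

# 31 samples per axis -> 31 * 31 * 31 = 29791 first rotation vectors
grid = rotation_grid((-0.015, 0.015), (-0.015, 0.015), (-0.015, 0.015), 0.001)
print(len(grid))
```

Each grid entry is one candidate (r, p, y); the registration is evaluated once per entry.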
In the embodiments of the present application, the preset radian interval can be determined as follows:
Obtain the horizontal field-of-view angle α and vertical field-of-view angle β of the camera and the resolution w*h of the image; divide the horizontal field-of-view angle α by the width w of the resolution to obtain a first radian; divide the vertical field-of-view angle β by the height h of the resolution to obtain a second radian; take the smaller of the first radian and the second radian as the preset radian interval.
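The computation above can be sketched in a few lines; the field-of-view angles and resolution below are illustrative values, not taken from the embodiments:

```python
import math

def preset_radian_interval(h_fov_rad, v_fov_rad, width, height):
    """Smaller of the per-pixel horizontal and vertical angles of the camera."""
    first = h_fov_rad / width    # first radian: alpha / w
    second = v_fov_rad / height  # second radian: beta / h
    return min(first, second)

# Hypothetical camera: 60 x 40 degree field of view at 1280 x 720
s = preset_radian_interval(math.radians(60.0), math.radians(40.0), 1280, 720)
print(round(s, 6))  # roughly 0.0008 rad
```

Choosing the smaller of the two per-pixel angles ensures the grid step is no coarser than what one image pixel can resolve.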
In the embodiments of the present application, the preset first rotation-vector interval can be determined as follows: determine a reference rotation vector, then use the reference rotation vector and the preset radian interval to determine the preset first rotation-vector interval.
Specifically, let the reference rotation vector be (r0, p0, y0), where r0 is the reference roll angle, p0 is the reference pitch angle, and y0 is the reference yaw angle.
The product of a preset first reference value M and the preset radian interval s can be subtracted from the reference roll angle r0 to obtain the lower limit of the roll-angle interval, r0 - M*s, and added to r0 to obtain the upper limit, r0 + M*s; from these two limits the preset first roll-angle interval [r0 - M*s, r0 + M*s] is determined.
Likewise, subtracting and adding the product of the preset first reference value M and the preset radian interval s at the reference pitch angle p0 gives the pitch-angle lower limit p0 - M*s and upper limit p0 + M*s, determining the preset first pitch-angle interval [p0 - M*s, p0 + M*s].
Similarly, subtracting and adding the product of the preset first reference value M and the preset radian interval s at the reference yaw angle y0 gives the yaw-angle lower limit y0 - M*s and upper limit y0 + M*s, determining the preset first yaw-angle interval [y0 - M*s, y0 + M*s].
The first reference value M is a positive integer. To guarantee a globally optimal solution, M usually needs to be set fairly large, for example M = 200.
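The interval construction is symmetric around each reference angle. A small sketch, with M and s set to the example values mentioned in the text:

```python
def first_interval(ref_angle, M, s):
    """Preset first interval [ref - M*s, ref + M*s] around a reference angle."""
    return (ref_angle - M * s, ref_angle + M * s)

# With M = 200 and s = 0.001 rad, a reference roll r0 = 0.0 gives [-0.2, 0.2]
lo, hi = first_interval(0.0, 200, 0.001)
print(lo, hi)
```

The same helper would be applied to r0, p0, and y0 to obtain the three intervals.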
In fact, taking into account the camera resolution, the field of view, and the angular resolution of the laser radar, obtaining relatively high calibration accuracy requires a very small preset radian interval, for example 0.001 rad, while a reasonable value interval for (r, p, y) is usually very large relative to that interval; for example [-0.1, 0.1] rad is a fairly typical interval. Traversing the entire solution space therefore takes (0.2/0.001) * (0.2/0.001) * (0.2/0.001) = 8,000,000 iterations. Even assuming each iteration takes only 1 ms (the actual value is much larger, about 3 to 4 ms), calibrating one group of parameters takes 8,000,000/1000/3600 ≈ 2.2 hours, and that is only one group; in actual scenarios multiple groups may need to be calibrated, so such a long running time is clearly unacceptable. Reducing the preset first rotation-vector interval, and thereby the running time of the program, is therefore especially crucial.
Therefore, in the embodiments of the present application, pitch and yaw are adjusted first, so that the center of the first point-cloud projection of the calibration-board point cloud onto the image coincides with the center of the target (the calibration board) in the image. This usually converges after only 50 to 100 iterations, yielding the reference values p0 and y0.
Then, with p0 and y0 fixed, roll is adjusted within the original roll interval to find the roll value for which the first point-cloud projection falls most completely within the calibration-board image region; that value is denoted r0. This step requires about 200 iterations.
In this way, a reference solution (r0, p0, y0) is found. Centered on this reference solution, the embodiments of the present application can find the optimal solution within a very small interval such as [-0.015, 0.015], and tests show that this solution is the global optimum.
In practice, roll can only be adjusted after p0 and y0 have been determined; it is not possible to first determine r0 and p0 and then adjust yaw, or to first determine r0 and y0 and then adjust pitch.
The optimized scheme requires 100 + 200 + (0.03/0.001) * (0.03/0.001) * (0.03/0.001) = 27,300 loop iterations, i.e. about 27,300/1000 = 27 s at 1 ms per iteration. Using multi-core acceleration via OpenMP, the time can be further reduced to about a quarter of that, so calibrating one group of parameters takes only about 6 to 8 seconds; the method of the embodiments of the present application can therefore achieve instant calibration.
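The iteration budget quoted above can be checked with a few lines of arithmetic; the 1 ms per iteration and 4-way parallel speedup are the document's stated assumptions:

```python
# Coarse stage: ~100 iterations for (pitch, yaw), ~200 for roll.
coarse = 100 + 200

# Fine stage: full grid over a [-0.015, 0.015] cube at 0.001 rad,
# i.e. (0.03 / 0.001) cells per axis.
per_axis = round(0.03 / 0.001)
fine = per_axis ** 3

total = coarse + fine
seconds = total * 1e-3          # assuming 1 ms per iteration
print(total, seconds)            # 27300 iterations, ~27 s
print(seconds / 4)               # roughly 7 s with 4-way parallelism
```

Compared with the 8,000,000 iterations of the naive full-interval search, the coarse-then-fine schedule cuts the work by nearly three orders of magnitude.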
In the embodiments of the present application, the step of determining the reference rotation vector may include:
obtaining a preset second rotation-vector interval, which includes a preset second roll-angle interval, a preset second pitch-angle interval, and a preset second yaw-angle interval;
adjusting the pitch angle within the preset second pitch-angle interval, and adjusting the yaw angle within the preset second yaw-angle interval;
determining the target pitch angle and target yaw angle at which the center of the calibration board in the image coincides with the center of the first point-cloud projection;
under the target pitch angle and target yaw angle, adjusting the roll angle within the preset second roll-angle interval to obtain multiple second rotation vectors;
determining a reference rotation vector from the multiple second rotation vectors.
The step of determining the reference rotation vector from the multiple second rotation vectors may include:
determining multiple second transformation matrices, using each of the multiple second rotation vectors together with the translation vector between the coordinate system of the laser radar and the coordinate system of the camera; for each second transformation matrix, calculating the corresponding registration between the image and the point cloud using the second transformation matrix and the intrinsic parameters of the camera; and determining the second rotation vector with the maximum registration as the reference rotation vector.
The step of calculating the corresponding registration between the image and the point cloud using the second transformation matrix and the intrinsic parameters of the camera may include:
projecting the calibration-board point cloud using the second transformation matrix, the intrinsic parameters of the camera, and the three-dimensional coordinates of the calibration-board points, to obtain a second point-cloud projection; determining the number of second target projection points in the second point-cloud projection that fall within the contour of the calibration board in the image; and determining the registration of the image and the point cloud from the number of second target projection points.
In one example, the number of second target projection points can be taken directly as the registration of the image and the point cloud: the more second target projection points, the higher the registration.
In another example, the ratio of second target projection points to calibration-board points can be used to determine the registration. Specifically, for each calibration board, the second-target-projection-point ratio of the number of its corresponding second target projection points to the number of its calibration-board points is calculated, and the registration of the image and the point cloud is determined from these ratios.
Step 203: obtain the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and obtain the intrinsic parameters of the camera;
The intrinsic parameters describe the characteristics of the camera. Since the camera coordinate system is expressed in millimeters while the image plane is expressed in pixels, the role of the intrinsic parameters is to perform a linear transformation between these two coordinate systems. The intrinsic parameters of a camera can be obtained with a camera-calibration tool.
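For the standard pinhole model, the intrinsic parameters are commonly collected into a 3x3 matrix K. A minimal sketch; the focal lengths and principal point below are hypothetical values, not output of any particular calibration tool:

```python
def intrinsic_matrix(fx, fy, cx, cy):
    """3x3 pinhole intrinsic matrix K mapping camera coordinates to pixels."""
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]

# Illustrative values; real intrinsics come from a camera-calibration tool.
K = intrinsic_matrix(fx=900.0, fy=900.0, cx=640.0, cy=360.0)
```

fx and fy are the focal lengths in pixel units, and (cx, cy) is the principal point, which performs the millimeter-to-pixel linear transformation described above.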
Step 204: determine multiple first transformation matrices, using each of the multiple first rotation vectors together with the translation vector;
In the embodiments of the present application, the translation vector between the camera and the laser radar is fixed; each first transformation matrix is composed of one first rotation vector and the fixed translation vector.
Step 205: for each first transformation matrix, calculate the corresponding registration between the image and the point cloud using the first transformation matrix and the intrinsic parameters of the camera;
Under different transformation matrices, the relative position of the image and the point cloud changes, and so does the registration.
In the embodiments of the present application, step 205 may include the following sub-steps:
Sub-step S11: obtain the camera coordinate system of the camera;
Sub-step S12: determine the contour of the calibration board in the image, and determine the three-dimensional coordinates of the calibration-board points in the point cloud, i.e. the points lying on the calibration board;
The point-cloud data acquired by the laser radar is three-dimensional, expressed in a Cartesian coordinate system (X, Y, Z).
In one example, the three-dimensional coordinates of the calibration-board points can be determined with a point-cloud clustering algorithm. Specifically, a point-cloud clustering algorithm is used to extract from the point cloud the calibration-board points lying on the calibration board, and their three-dimensional coordinates are determined.
In another example, the reflectivity of the calibration board to laser light can be used as prior information to determine the three-dimensional coordinates of the calibration-board points. Since objects of different materials reflect laser light to different degrees, a calibration board made of a highly reflective material can be chosen. In the collected laser point-cloud data, by setting a suitable reflectivity threshold, the laser points whose reflectivity exceeds the threshold can be determined to be points where the laser hit the calibration board.
Specifically, the reflectivity of each point in the point cloud can be obtained; the points whose reflectivity exceeds the preset reflectivity threshold are determined to be the calibration-board points lying on the calibration board; and the three-dimensional coordinates of those calibration-board points are determined.
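The reflectivity filter just described can be sketched as a simple threshold over (x, y, z, reflectivity) records; the threshold value and the sample points below are hypothetical:

```python
def board_points(cloud, reflectivity_threshold):
    """Keep the (x, y, z) of points whose reflectivity exceeds the threshold.

    `cloud` is a list of (x, y, z, reflectivity) tuples. The threshold of 0.8
    used below is a hypothetical choice for a high-reflectance board material.
    """
    return [(x, y, z) for (x, y, z, refl) in cloud
            if refl > reflectivity_threshold]

cloud = [(1.0, 0.2, 0.5, 0.92),   # bright return, on the board
         (4.0, 1.0, 0.1, 0.15),   # dull background return
         (1.1, 0.3, 0.5, 0.88)]   # bright return, on the board
print(board_points(cloud, 0.8))   # keeps the two high-reflectivity points
```

The surviving coordinates are the calibration-board points whose projection is later compared against the board contour in the image.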
In yet another example, the dimensions of the calibration board can be used as prior information to determine the three-dimensional coordinates of the calibration-board points. Specifically, the dimension information of the calibration board is obtained and used to determine which points in the point cloud lie on the calibration board, and the three-dimensional coordinates of those calibration-board points are determined.
Sub-step S13: project the calibration-board points to the image using the first transformation matrix, the intrinsic parameters of the camera, and the three-dimensional coordinates of the calibration-board points, obtaining a first point-cloud projection;
In practice, given the transformation matrix and the camera intrinsics, a dedicated software interface can be called to perform the projection; for example, the projection function projectPoints of the OpenCV library projects three-dimensional coordinates into a two-dimensional image.
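As a dependency-free sketch, the pinhole projection that cv2.projectPoints performs (with zero distortion coefficients) can also be written out directly; the rotation, translation, and intrinsics below are illustrative, not values from the embodiments:

```python
def project(point_lidar, R, T, fx, fy, cx, cy):
    """Project one LiDAR point into the image: Pc = R * Pl + T, then pinhole.

    R is a 3x3 rotation matrix (row-major nested lists) and T a 3-vector,
    together forming one first transformation matrix. Equivalent in spirit to
    cv2.projectPoints with zero distortion.
    """
    pc = [sum(R[i][j] * point_lidar[j] for j in range(3)) + T[i]
          for i in range(3)]
    x, y, z = pc
    if z <= 0:
        return None  # point is behind the camera, no valid projection
    return (fx * x / z + cx, fy * y / z + cy)

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
uv = project((0.0, 0.0, 2.0), I3, [0.0, 0.0, 0.0], 900.0, 900.0, 640.0, 360.0)
print(uv)  # a point on the optical axis lands at the principal point
```

Applying this to every calibration-board point yields the first point-cloud projection compared against the board contour in the next sub-steps.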
Referring to Fig. 3, a schematic diagram of projecting the calibration-board points onto the image in an embodiment of the present application is shown. As shown in Fig. 3, the registration between the point-cloud projection and the calibration board in the image is low. Under different transformation matrices, the position of the point-cloud projection in the image changes.
Sub-step S14: determine the number of first target projection points in the first point-cloud projection that fall within the contour of the calibration board in the image;
Sub-step S15: determine the registration of the image and the point cloud using the number of first target projection points.
In one example, the number of first target projection points can be taken directly as the registration of the image and the point cloud: the more first target projection points, the higher the registration.
For example, suppose two calibration boards are used and the laser beams emitted by the laser radar hit the two boards at 120 and 100 points respectively. Under some first transformation matrix, the calibration-board points project into the two board contours in the image with 90 and 80 first target projection points respectively. If the registration is the sum of the first target projection points over all calibration boards, the registration is 170.
In another example, the ratio of first target projection points to calibration-board points can be used to determine the registration. Specifically, sub-step S15 may include: calculating, for each calibration board, the first-target-projection-point ratio of the number of its corresponding first target projection points to the number of its calibration-board points; and determining the registration of the image and the point cloud using the first-target-projection-point ratios.
For example, continuing the example above, the first-target-projection-point ratios of the two calibration boards are 90/120 = 0.75 and 80/100 = 0.8 respectively. If the registration is the sum of these per-board ratios, the registration is 1.55.
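The two registration variants from the two-board example can be sketched directly:

```python
def count_registration(inside_counts):
    """Registration as the total number of target projection points."""
    return sum(inside_counts)

def ratio_registration(inside_counts, board_point_counts):
    """Registration as the sum of per-board projection-point ratios."""
    return sum(i / n for i, n in zip(inside_counts, board_point_counts))

# Two boards hit by 120 and 100 laser points, with 90 and 80 projected
# points falling inside the respective board contours in the image:
print(count_registration([90, 80]))              # 170
print(ratio_registration([90, 80], [120, 100]))  # 0.75 + 0.8 = 1.55
```

The ratio variant normalizes for the different numbers of laser points hitting each board, so a board that receives fewer points is not under-weighted.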
Step 206: determine the first rotation vector with the maximum registration as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera.
Referring to Fig. 4, another schematic diagram of projecting the calibration-board points onto the image in an embodiment of the present application is shown. In Fig. 4, at the highest registration, the point-cloud projection corresponds exactly to the calibration board in the image, and the whole image corresponds exactly to the point cloud.
In the embodiments of the present application, with the translation vector between the camera and the laser radar fixed, the first rotation vector that maximizes the registration between the image acquired by the camera and the point cloud acquired by the laser radar is determined within the preset first rotation-vector interval and taken as the final rotation vector calibrating the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments of the present application, the calibration-accuracy requirement of the unmanned vehicle can be met even when calibrating a low-accuracy laser radar to a camera, and automated calibration can be achieved.
Referring to Fig. 5, a flow chart of the steps of a calibration-method embodiment of the present application is shown. The method is applied to an unmanned vehicle including at least one industrial camera, at least one surround-view camera, and at least one laser radar, each camera and each laser radar having its own coordinate system. The method may specifically include the following steps:
Step 501: choose one target camera from the at least one camera, and take the coordinate system of the target camera as the reference coordinate system;
An unmanned vehicle can be equipped with multiple cameras, which may include at least one industrial camera and at least one surround-view camera.
Industrial cameras have high image stability, high transmission capability, and high anti-interference capability; they are generally mounted at the front of the unmanned vehicle to capture images of the space ahead.
Surround-view cameras have a large field of view; arranging multiple surround-view cameras on the unmanned vehicle can cover the 360-degree region around it, ensuring that the blind zones during driving are as small as possible.
The calibration process, and its complexity, differ depending on which camera's coordinate system is chosen as the reference. In practice, the target camera can be chosen from among the industrial cameras and surround-view cameras according to the relative positions of the industrial cameras, surround-view cameras, and laser radars on the unmanned vehicle.
Referring to Fig. 6, a schematic diagram of an unmanned-vehicle calibration scene in an embodiment of the present application is shown. Cameras or laser radars may be arranged in the four directions around the unmanned vehicle (front, rear, left, right); for each camera and laser radar to be calibrated, a calibration board can be placed in the corresponding direction. The camera captures an image of the calibration board, and the laser radar collects a point cloud of the calibration board.
In one example of the embodiments of the present application, the industrial cameras may include a left industrial camera at the front left and a right industrial camera at the front right, the two industrial cameras forming a binocular camera.
The laser radars may include a front laser radar at the front, a rear laser radar at the rear, a left laser radar on the left, and a right laser radar on the right.
The surround-view cameras may include a front surround-view camera at the front, a rear surround-view camera at the rear, a left surround-view camera on the left, and a right surround-view camera on the right.
For simplicity, when choosing the target camera, one of the industrial cameras can be chosen as the target camera.
In the example above, the left industrial camera can be chosen as the target camera and its coordinate system as the reference coordinate system. The coordinate system of the right industrial camera can then be calibrated directly to the reference coordinate system of the left industrial camera.
Step 502: among the at least one laser radar, determine the first laser radars associated with the target camera, and calibrate the coordinate system of each first laser radar to the reference coordinate system;
Association between a camera and a laser radar refers to association between their capture spaces. Only if the two share a common capture space are they associated and can be calibrated directly to each other; if they share no common capture space, they are not associated and cannot be calibrated directly. For example, a laser radar mounted at the rear of the unmanned vehicle collects the point cloud behind the vehicle, while an industrial camera mounted at the front captures the image ahead; the two share no common capture space and therefore cannot be calibrated directly to each other.
In the example above, the front, left, and right laser radars may share a common capture space with the left industrial camera and are therefore associated with it. The coordinate system of a first laser radar associated with the target camera can be calibrated directly to the reference coordinate system.
Step 503: among the cameras other than the target camera, determine the first camera corresponding to each first laser radar, and calibrate the coordinate system of each first camera to the coordinate system of the corresponding first laser radar;
The correspondence referred to here is correspondence in orientation. Specifically, this can be determining the first surround-view camera corresponding to a first laser radar.
In the example above, surround-view cameras and laser radars are used in corresponding pairs: the front laser radar corresponds to the front surround-view camera, the rear laser radar to the rear surround-view camera, the left laser radar to the left surround-view camera, and the right laser radar to the right surround-view camera.
The coordinate system of the front surround-view camera can be calibrated directly to that of the front laser radar, and thereby indirectly to the reference coordinate system; likewise, the coordinate system of the left surround-view camera can be calibrated directly to that of the left laser radar, and the coordinate system of the right surround-view camera to that of the right laser radar, each thereby indirectly calibrated to the reference coordinate system.
Step 504: determine the second laser radars not associated with the target camera, and determine the second camera corresponding to each second laser radar;
For a second laser radar not associated with the target camera, its coordinate system cannot be calibrated directly to the reference coordinate system, but it can be calibrated indirectly via the second camera corresponding to it. The second camera corresponding to the rear laser radar may specifically be the corresponding second surround-view camera.
For example, since the rear laser radar and the left industrial camera share no common capture space, the two are not associated; the rear surround-view camera corresponding to the rear laser radar can then be determined.
Step 505: calibrate the coordinate system of the second camera to the coordinate system of an associated first laser radar, and calibrate the coordinate system of the second laser radar to the coordinate system of the second camera.
In the embodiments of the present application, the already-calibrated coordinate system of a first laser radar can be used to achieve indirect calibration.
A first laser radar associated with the second camera is determined; the coordinate system of the second camera is calibrated to the coordinate system of that associated first laser radar, and the coordinate system of the second laser radar is then calibrated to the coordinate system of the second camera, thereby indirectly calibrating the coordinate system of the second laser radar to the reference coordinate system.
For example, the first laser radars associated with the rear surround-view camera are the left laser radar and the right laser radar; the coordinate system of the rear surround-view camera can be calibrated to that of the left laser radar, and the coordinate system of the rear laser radar then calibrated to that of the rear surround-view camera.
In the embodiments of the present application, the calibration between an industrial camera and a laser radar, and between a surround-view camera and a laser radar, can be realized using the embodiments of the calibration method between a laser radar and a camera described above.
The calibration method of the embodiments of the present application is suitable for unmanned vehicles with multiple sensors: the industrial cameras, surround-view cameras, and laser radars of the unmanned vehicle can be calibrated directly or indirectly to one reference coordinate system, with high calibration accuracy and automated calibration. Calibration of other sensors can also be realized via the reference coordinate system; for example, an inertial measurement unit (IMU) can be calibrated to the reference coordinate system.
It should be noted that, for simplicity of description, the method embodiments are expressed as series of action combinations, but those skilled in the art should understand that the embodiments of the present application are not limited by the described order of actions, since according to the embodiments of the present application some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions involved are not necessarily required by the embodiments of the present application.
Referring to Fig. 7, a structural block diagram of an embodiment of a calibration apparatus between a laser radar and a camera of the present application is shown, which may specifically include the following modules:
an image acquisition module 701, configured to acquire an image captured by the camera for a calibration board and a point cloud captured by the laser radar for the calibration board;
a first rotation vector determination module 702, configured to determine a plurality of first rotation vectors within a preset first rotation vector interval;
a first registration calculation module 703, configured to calculate, for each first rotation vector, the registration between the corresponding image and the point cloud;
a rotation vector calibration module 704, configured to determine the first rotation vector corresponding to the maximum registration as the rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
In the embodiment of the present application, the first registration calculation module 703 may include:
a parameter acquisition submodule, configured to acquire the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and to acquire the intrinsic parameters of the camera;
a first transformation matrix determination submodule, configured to determine a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector;
a first registration calculation submodule, configured to calculate, for each first transformation matrix, the registration between the corresponding image and the point cloud using the first transformation matrix and the intrinsic parameters of the camera.
In the embodiment of the present application, the first registration calculation submodule may include:
a camera coordinate system acquisition unit, configured to acquire the camera coordinate system of the camera;
an image information determination unit, configured to determine the contour of the calibration board in the image, and to determine the three-dimensional coordinates of the calibration board point cloud located on the calibration board in the point cloud;
a projection unit, configured to project the calibration board point cloud into the image using the first transformation matrix, the intrinsic parameters of the camera and the three-dimensional coordinates of the calibration board point cloud, obtaining a first projected point cloud;
a target projection point determination unit, configured to determine the number of first target projection points in the first projected point cloud that fall within the contour of the calibration board in the image;
a first registration determination unit, configured to determine the registration between the image and the point cloud using the number of first target projection points.
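As an illustration of the projection-and-count step performed by the projection unit and the target projection point determination unit, the following is a minimal Python sketch. It assumes a simple pinhole model without lens distortion, and all function names are hypothetical, not part of the patent:

```python
import numpy as np

def project_points(points_lidar, R, t, K):
    """Project 3-D lidar points into the image plane.

    points_lidar: (N, 3) array in the lidar frame.
    R: (3, 3) rotation matrix (lidar -> camera); t: (3,) translation.
    K: (3, 3) camera intrinsic matrix.
    Returns (N, 2) pixel coordinates.
    """
    pts_cam = points_lidar @ R.T + t   # transform into the camera frame
    uv = pts_cam @ K.T                 # apply the intrinsics
    return uv[:, :2] / uv[:, 2:3]      # perspective divide

def registration_score(points_lidar, R, t, K, board_mask):
    """Fraction of projected board points that land inside the board
    contour, given as a boolean image mask. Higher means better aligned."""
    uv = np.round(project_points(points_lidar, R, t, K)).astype(int)
    h, w = board_mask.shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    uv = uv[inside]
    hits = board_mask[uv[:, 1], uv[:, 0]].sum()
    return hits / len(points_lidar)
```

The candidate rotation whose score is highest would then be taken as the calibration result.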
In the embodiment of the present application, the first registration determination unit may include:
a projection ratio calculation subunit, configured to calculate, for a calibration board, the first target projection point ratio between the number of corresponding first target projection points and the number of points in the calibration board point cloud;
a first registration determination subunit, configured to determine the registration between the image and the point cloud using the first target projection point ratio.
In the embodiment of the present application, the first rotation vector determination module 702 may include:
a first rotation vector determination submodule, configured to determine a plurality of first rotation vectors within the preset first rotation vector interval according to a preset radian interval.
In the embodiment of the present application, the preset first rotation vector interval includes a preset first roll angle range, a preset first pitch angle range and a preset first yaw angle range; the first rotation vector determination submodule may include:
a roll angle determination unit, configured to determine a plurality of roll angles within the preset first roll angle range according to the preset radian interval;
a pitch angle determination unit, configured to determine a plurality of pitch angles within the preset first pitch angle range according to the preset radian interval;
a yaw angle determination unit, configured to determine a plurality of yaw angles within the preset first yaw angle range according to the preset radian interval;
a first rotation vector determination unit, configured to obtain a plurality of first rotation vectors by combining one roll angle chosen from the plurality of roll angles, one pitch angle chosen from the plurality of pitch angles, and one yaw angle chosen from the plurality of yaw angles.
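The grid of candidate rotation vectors described above is the Cartesian product of the three sampled angle axes. A minimal sketch (hypothetical function name; ranges and step in radians):

```python
import itertools
import numpy as np

def candidate_rotations(roll_range, pitch_range, yaw_range, step):
    """Enumerate candidate (roll, pitch, yaw) rotation vectors on a grid.

    Each *_range is a (low, high) pair in radians; step is the preset
    radian interval. Returns every combination of one sampled roll,
    one sampled pitch and one sampled yaw.
    """
    axes = [np.arange(lo, hi + 1e-12, step)
            for lo, hi in (roll_range, pitch_range, yaw_range)]
    return list(itertools.product(*axes))
```

Each tuple in the result would be scored for registration, and the best-scoring one retained.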
In the embodiment of the present application, the apparatus may also include:
a camera parameter acquisition module, configured to acquire the horizontal field-of-view angle and vertical field-of-view angle of the camera and the resolution of the image;
a first radian determination module, configured to obtain a first radian by dividing the horizontal field-of-view angle by the width of the resolution;
a second radian determination module, configured to obtain a second radian by dividing the vertical field-of-view angle by the height of the resolution;
a radian interval determination module, configured to take the smaller of the first radian and the second radian as the preset radian interval.
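This choice makes one grid step correspond to roughly one pixel of movement in the projection. A sketch of the computation (hypothetical function name; field-of-view angles taken in degrees for convenience):

```python
import math

def preset_radian_interval(h_fov_deg, v_fov_deg, width, height):
    """Radian interval = min(horizontal FOV / image width,
    vertical FOV / image height), both expressed in radians per pixel."""
    first_radian = math.radians(h_fov_deg) / width
    second_radian = math.radians(v_fov_deg) / height
    return min(first_radian, second_radian)
```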
In the embodiment of the present application, the apparatus may also include:
a reference rotation vector determination module, configured to determine a reference rotation vector;
a first rotation vector interval determination module, configured to determine the preset first rotation vector interval using the reference rotation vector and the preset radian interval.
In the embodiment of the present application, the reference rotation vector determination module may include:
a second rotation vector interval acquisition submodule, configured to acquire a preset second rotation vector interval, the preset second rotation vector interval including a preset second roll angle range, a preset second pitch angle range and a preset second yaw angle range;
an angle adjustment submodule, configured to adjust the pitch angle within the preset second pitch angle range and to adjust the yaw angle within the preset second yaw angle range;
a target angle determination submodule, configured to determine the target pitch angle and target yaw angle at which the center of the calibration board in the image coincides with the center of the first projected point cloud;
a second rotation vector determination submodule, configured to obtain a plurality of second rotation vectors by adjusting the roll angle within the preset second roll angle range at the target pitch angle and target yaw angle;
a reference rotation vector determination submodule, configured to determine the reference rotation vector from the plurality of second rotation vectors.
In the embodiment of the present application, the reference rotation vector determination submodule may include:
a second transformation matrix determination unit, configured to determine a plurality of second transformation matrices using the plurality of second rotation vectors and the translation vector between the coordinate system of the laser radar and the coordinate system of the camera;
a second registration calculation unit, configured to calculate, for each second transformation matrix, the registration between the corresponding image and the point cloud using the second transformation matrix and the intrinsic parameters of the camera;
a reference rotation vector determination unit, configured to determine the second rotation vector corresponding to the maximum registration as the reference rotation vector.
In the embodiment of the present application, the image information determination unit may include:
a first calibration board point cloud determination subunit, configured to extract the calibration board point cloud located on the calibration board from the point cloud using a point cloud clustering algorithm;
a first point cloud coordinate determination subunit, configured to determine the three-dimensional coordinates of the calibration board point cloud.
In the embodiment of the present application, the image information determination unit may include:
a reflectivity acquisition subunit, configured to acquire the reflectivity of each point in the point cloud;
a second calibration board point cloud determination subunit, configured to determine the calibration board point cloud located on the calibration board using the points whose reflectivity is greater than a preset reflectivity threshold;
a second point cloud coordinate determination subunit, configured to determine the three-dimensional coordinates of the calibration board point cloud.
In the embodiment of the present application, the image information determination unit may include:
a dimension information acquisition subunit, configured to acquire the dimension information of the calibration board;
a third calibration board point cloud determination subunit, configured to determine the calibration board point cloud located on the calibration board in the point cloud using the dimension information of the calibration board;
a third point cloud coordinate determination subunit, configured to determine the three-dimensional coordinates of the calibration board point cloud.
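Of the extraction variants above, the reflectivity-based one is the simplest to sketch. The following assumes per-point intensity values are available alongside the coordinates (function name and threshold are illustrative, not from the patent):

```python
import numpy as np

def board_points_by_reflectivity(points, reflectivity, threshold):
    """Keep only high-reflectivity returns, assuming the calibration
    board is markedly more reflective than its surroundings.

    points: (N, 3) xyz array; reflectivity: (N,) per-point intensity.
    Returns the 3-D coordinates of the candidate board points.
    """
    return points[reflectivity > threshold]
```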
Referring to Fig. 8, a structural block diagram of an embodiment of a calibration apparatus of the present application is shown. The calibration apparatus is applied to an unmanned vehicle, the unmanned vehicle including at least one camera and at least one laser radar, the at least one camera and the at least one laser radar each having its own coordinate system. The apparatus may specifically include the following modules:
a reference coordinate system determination module 801, configured to choose a target camera from the at least one camera and to take the coordinate system of the target camera as the reference coordinate system;
a first calibration module 802, configured to determine, among the at least one laser radar, a first laser radar associated with the target camera, and to calibrate the coordinate system of the first laser radar to the reference coordinate system;
a second calibration module 803, configured to determine, among the cameras other than the target camera, a first camera corresponding to the first laser radar, and to calibrate the coordinate system of the first camera to the coordinate system of the corresponding first laser radar;
a non-association determination module 804, configured to determine a second laser radar not associated with the target camera, and to determine a second camera corresponding to the second laser radar;
a third calibration module 805, configured to calibrate the coordinate system of the second camera to the coordinate system of an associated first laser radar, and to calibrate the coordinate system of the second laser radar to the coordinate system of the second camera.
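The indirect calibration chain implemented by modules 802 to 805 amounts to composing rigid-body transforms: a sensor with no direct link to the target camera is brought into the reference coordinate system through intermediate sensors. A minimal sketch (hypothetical function names; homogeneous 4x4 matrices assumed):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def chain_to_reference(T_ref_from_lidar, T_lidar_from_cam):
    """Calibrate a camera to the reference frame indirectly, by composing
    its camera->lidar extrinsics with the lidar->reference extrinsics."""
    return T_ref_from_lidar @ T_lidar_from_cam
```

The same composition extends to longer chains, e.g. second laser radar -> second camera -> first laser radar -> reference.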
In the embodiment of the present application, the at least one camera may include at least one industrial camera and at least one surround-view camera; the reference coordinate system determination module 801 may include:
a target camera choosing submodule, configured to choose one of the at least one industrial camera as the target camera.
In the embodiment of the present application, the second calibration module 803 may include:
a first surround-view camera determination submodule, configured to determine, among the at least one surround-view camera, a first surround-view camera corresponding to the first laser radar.
In the embodiment of the present application, the non-association determination module 804 may include:
a second surround-view camera determination submodule, configured to determine a second surround-view camera corresponding to the second laser radar.
Since the apparatus embodiments are basically similar to the method embodiments, they are described relatively simply; for related details, refer to the description of the method embodiments.
The embodiment of the present application also provides an apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the apparatus to perform the methods described in the embodiments of the present application.
The embodiment of the present application also provides one or more machine-readable media having instructions stored thereon which, when executed by one or more processors, cause the processors to perform the methods described in the embodiments of the present application.
The embodiments in this specification are described in a progressive manner. Each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to each other.
Those skilled in the art should understand that the embodiments of the present application may be provided as a method, an apparatus or a computer program product. Therefore, the embodiments of the present application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The embodiments of the present application are described with reference to flowcharts and/or block diagrams of the methods, terminal devices (systems) and computer program products according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing terminal device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing terminal device produce a device for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of guiding a computer or other programmable data processing terminal device to operate in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal device, such that a series of operation steps are executed on the computer or other programmable terminal device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable terminal device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the embodiments of the present application have been described, those skilled in the art, once they learn of the basic inventive concept, may make additional changes and modifications to these embodiments. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all changes and modifications falling within the scope of the embodiments of the present application.
Finally, it should also be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, such that a process, method, article or terminal device including a series of elements includes not only those elements but also other elements not explicitly listed, or also includes elements intrinsic to such a process, method, article or terminal device. In the absence of further limitations, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or terminal device that includes the element.
The calibration method between a laser radar and a camera, the calibration method, the calibration apparatus between a laser radar and a camera, and the calibration apparatus provided by the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific implementation and application scope according to the idea of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (36)

1. A calibration method between a laser radar and a camera, characterized by comprising:
acquiring an image captured by the camera for a calibration board and a point cloud captured by the laser radar for the calibration board;
determining a plurality of first rotation vectors within a preset first rotation vector interval;
calculating, for each first rotation vector, the registration between the corresponding image and the point cloud;
determining the first rotation vector corresponding to the maximum registration as the rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
2. The method according to claim 1, characterized in that the calculating, for each first rotation vector, the registration between the corresponding image and the point cloud comprises:
acquiring the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and acquiring the intrinsic parameters of the camera;
determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector;
calculating, for each first transformation matrix, the registration between the corresponding image and the point cloud using the first transformation matrix and the intrinsic parameters of the camera.
3. The method according to claim 2, characterized in that the calculating the registration between the corresponding image and the point cloud using the first transformation matrix and the intrinsic parameters of the camera comprises:
acquiring the camera coordinate system of the camera;
determining the contour of the calibration board in the image, and determining the three-dimensional coordinates of the calibration board point cloud located on the calibration board in the point cloud;
projecting the calibration board point cloud into the image using the first transformation matrix, the intrinsic parameters of the camera and the three-dimensional coordinates of the calibration board point cloud, obtaining a first projected point cloud;
determining the number of first target projection points in the first projected point cloud that fall within the contour of the calibration board in the image;
determining the registration between the image and the point cloud using the number of first target projection points.
4. The method according to claim 3, characterized in that the determining the registration between the image and the point cloud using the number of first target projection points comprises:
calculating, for a calibration board, the first target projection point ratio between the number of corresponding first target projection points and the number of points in the calibration board point cloud;
determining the registration between the image and the point cloud using the first target projection point ratio.
5. The method according to claim 1, characterized in that the determining a plurality of first rotation vectors within a preset first rotation vector interval comprises:
determining a plurality of first rotation vectors within the preset first rotation vector interval according to a preset radian interval.
6. The method according to claim 5, characterized in that the preset first rotation vector interval includes a preset first roll angle range, a preset first pitch angle range and a preset first yaw angle range; the determining a plurality of first rotation vectors within the preset first rotation vector interval according to a preset radian interval comprises:
determining a plurality of roll angles within the preset first roll angle range according to the preset radian interval;
determining a plurality of pitch angles within the preset first pitch angle range according to the preset radian interval;
determining a plurality of yaw angles within the preset first yaw angle range according to the preset radian interval;
obtaining a plurality of first rotation vectors by combining one roll angle chosen from the plurality of roll angles, one pitch angle chosen from the plurality of pitch angles, and one yaw angle chosen from the plurality of yaw angles.
7. The method according to claim 5, characterized by further comprising:
acquiring the horizontal field-of-view angle and vertical field-of-view angle of the camera and the resolution of the image;
obtaining a first radian by dividing the horizontal field-of-view angle by the width of the resolution;
obtaining a second radian by dividing the vertical field-of-view angle by the height of the resolution;
taking the smaller of the first radian and the second radian as the preset radian interval.
8. The method according to claim 5, characterized by further comprising:
determining a reference rotation vector;
determining the preset first rotation vector interval using the reference rotation vector and the preset radian interval.
9. The method according to claim 8, characterized in that the determining a reference rotation vector comprises:
acquiring a preset second rotation vector interval, the preset second rotation vector interval including a preset second roll angle range, a preset second pitch angle range and a preset second yaw angle range;
adjusting the pitch angle within the preset second pitch angle range, and adjusting the yaw angle within the preset second yaw angle range;
determining the target pitch angle and target yaw angle at which the center of the calibration board in the image coincides with the center of the first projected point cloud;
adjusting the roll angle within the preset second roll angle range at the target pitch angle and target yaw angle, obtaining a plurality of second rotation vectors;
determining the reference rotation vector from the plurality of second rotation vectors.
10. The method according to claim 9, characterized in that the determining the reference rotation vector from the plurality of second rotation vectors comprises:
determining a plurality of second transformation matrices using the plurality of second rotation vectors and the translation vector between the coordinate system of the laser radar and the coordinate system of the camera;
calculating, for each second transformation matrix, the registration between the corresponding image and the point cloud using the second transformation matrix and the intrinsic parameters of the camera;
determining the second rotation vector corresponding to the maximum registration as the reference rotation vector.
11. The method according to claim 3, characterized in that the determining the three-dimensional coordinates of the calibration board point cloud located on the calibration board in the point cloud comprises:
extracting the calibration board point cloud located on the calibration board from the point cloud using a point cloud clustering algorithm;
determining the three-dimensional coordinates of the calibration board point cloud.
12. The method according to claim 3, characterized in that the determining the three-dimensional coordinates of the calibration board point cloud located on the calibration board in the point cloud comprises:
acquiring the reflectivity of each point in the point cloud;
determining the calibration board point cloud located on the calibration board using the points whose reflectivity is greater than a preset reflectivity threshold;
determining the three-dimensional coordinates of the calibration board point cloud.
13. The method according to claim 3, characterized in that the determining the three-dimensional coordinates of the calibration board point cloud located on the calibration board in the point cloud comprises:
acquiring the dimension information of the calibration board;
determining the calibration board point cloud located on the calibration board in the point cloud using the dimension information of the calibration board;
determining the three-dimensional coordinates of the calibration board point cloud.
14. A calibration method, characterized in that it is applied to an unmanned vehicle, the unmanned vehicle including at least one camera and at least one laser radar, the at least one camera and the at least one laser radar each having its own coordinate system, the method comprising:
choosing a target camera from the at least one camera, and taking the coordinate system of the target camera as a reference coordinate system;
determining, among the at least one laser radar, a first laser radar associated with the target camera, and calibrating the coordinate system of the first laser radar to the reference coordinate system;
determining, among the cameras other than the target camera, a first camera corresponding to the first laser radar, and calibrating the coordinate system of the first camera to the coordinate system of the corresponding first laser radar;
determining a second laser radar not associated with the target camera, and determining a second camera corresponding to the second laser radar;
calibrating the coordinate system of the second camera to the coordinate system of an associated first laser radar, and calibrating the coordinate system of the second laser radar to the coordinate system of the second camera.
15. The method according to claim 14, characterized in that the at least one camera includes at least one industrial camera and at least one surround-view camera; the choosing a target camera from the at least one camera comprises:
choosing one of the at least one industrial camera as the target camera.
16. The method according to claim 15, characterized in that the determining, among the cameras other than the target camera, a first camera corresponding to the first laser radar comprises:
determining, among the at least one surround-view camera, a first surround-view camera corresponding to the first laser radar.
17. The method according to claim 15, characterized in that the determining a second camera corresponding to the second laser radar comprises:
determining a second surround-view camera corresponding to the second laser radar.
18. A calibration apparatus between a laser radar and a camera, characterized by comprising:
an image acquisition module, configured to acquire an image captured by the camera for a calibration board and a point cloud captured by the laser radar for the calibration board;
a first rotation vector determination module, configured to determine a plurality of first rotation vectors within a preset first rotation vector interval;
a first registration calculation module, configured to calculate, for each first rotation vector, the registration between the corresponding image and the point cloud;
a rotation vector calibration module, configured to determine the first rotation vector corresponding to the maximum registration as the rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
19. The apparatus according to claim 18, characterized in that the first registration calculation module comprises:
a parameter acquisition submodule, configured to acquire the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and to acquire the intrinsic parameters of the camera;
a first transformation matrix determination submodule, configured to determine a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector;
a first registration calculation submodule, configured to calculate, for each first transformation matrix, the registration between the corresponding image and the point cloud using the first transformation matrix and the intrinsic parameters of the camera.
20. The apparatus according to claim 19, characterized in that the first registration calculation submodule comprises:
a camera coordinate system acquisition unit, configured to acquire the camera coordinate system of the camera;
an image information determination unit, configured to determine the contour of the calibration board in the image, and to determine the three-dimensional coordinates of the calibration board point cloud located on the calibration board in the point cloud;
a projection unit, configured to project the calibration board point cloud into the image using the first transformation matrix, the intrinsic parameters of the camera and the three-dimensional coordinates of the calibration board point cloud, obtaining a first projected point cloud;
a target projection point determination unit, configured to determine the number of first target projection points in the first projected point cloud that fall within the contour of the calibration board in the image;
a first registration determination unit, configured to determine the registration between the image and the point cloud using the number of first target projection points.
21. The apparatus according to claim 20, characterized in that the first registration determination unit comprises:
a projection ratio calculation subunit, configured to calculate, for a calibration board, the first target projection point ratio between the number of corresponding first target projection points and the number of points in the calibration board point cloud;
a first registration determination subunit, configured to determine the registration between the image and the point cloud using the first target projection point ratio.
22. The apparatus according to claim 18, characterized in that the first rotation vector determination module comprises:
a first rotation vector determination submodule, configured to determine a plurality of first rotation vectors within the preset first rotation vector interval according to a preset radian interval.
23. The apparatus according to claim 22, characterized in that the preset first rotation vector interval includes a preset first roll angle range, a preset first pitch angle range and a preset first yaw angle range; the first rotation vector determination submodule comprises:
a roll angle determination unit, configured to determine a plurality of roll angles within the preset first roll angle range according to the preset radian interval;
a pitch angle determination unit, configured to determine a plurality of pitch angles within the preset first pitch angle range according to the preset radian interval;
a yaw angle determination unit, configured to determine a plurality of yaw angles within the preset first yaw angle range according to the preset radian interval;
a first rotation vector determination unit, configured to obtain a plurality of first rotation vectors by combining one roll angle chosen from the plurality of roll angles, one pitch angle chosen from the plurality of pitch angles, and one yaw angle chosen from the plurality of yaw angles.
24. The apparatus according to claim 22, characterized by further comprising:
a camera parameter acquisition module, configured to acquire the horizontal field-of-view angle and vertical field-of-view angle of the camera and the resolution of the image;
a first radian determination module, configured to obtain a first radian by dividing the horizontal field-of-view angle by the width of the resolution;
a second radian determination module, configured to obtain a second radian by dividing the vertical field-of-view angle by the height of the resolution;
a radian interval determination module, configured to take the smaller of the first radian and the second radian as the preset radian interval.
25. The device according to claim 22, further comprising:
a reference rotation vector determining module, configured to determine a reference rotation vector;
a first rotation vector interval determining module, configured to determine the preset first rotation vector interval using the reference rotation vector and the preset radian interval.
26. The device according to claim 25, wherein the reference rotation vector determining module comprises:
a second rotation vector interval obtaining submodule, configured to obtain a preset second rotation vector interval, the preset second rotation vector interval comprising a preset second roll angle range, a preset second pitch angle range, and a preset second yaw angle range;
an angle adjusting submodule, configured to adjust a pitch angle within the preset second pitch angle range and adjust a yaw angle within the preset second yaw angle range;
a target angle determining submodule, configured to determine a target pitch angle and a target yaw angle at which the center of the calibration board in the image coincides with the center of the first point cloud projection;
a second rotation vector determining submodule, configured to adjust, at the target pitch angle and the target yaw angle, a roll angle within the preset second roll angle range to obtain multiple second rotation vectors;
a reference rotation vector determining submodule, configured to determine the reference rotation vector from the multiple second rotation vectors.
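In the coarse stage of claim 26, pitch and yaw are first held at the values that make the projected point-cloud center coincide with the board center in the image; only the roll angle is then swept. A sketch, not from the patent text; the names, target angles, and step are illustrative:

```python
import numpy as np

def second_rotation_vectors(target_pitch, target_yaw, roll_range, step):
    """At the target pitch/yaw found by center alignment, sweep only the
    roll angle to produce the second rotation vector candidates."""
    rolls = np.arange(roll_range[0], roll_range[1] + 1e-9, step)
    return [np.array([roll, target_pitch, target_yaw]) for roll in rolls]

# Sweep roll over a full turn at pi/2 steps, with pitch/yaw already fixed.
vecs = second_rotation_vectors(0.02, -0.01, (-np.pi, np.pi), np.pi / 2)
print(len(vecs))  # 5 roll samples: -pi, -pi/2, 0, pi/2, pi
```

Fixing two of the three angles first reduces the coarse search from a cubic grid to a one-dimensional sweep.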
27. The device according to claim 26, wherein the reference rotation vector determining submodule comprises:
a second transformation matrix determining unit, configured to determine multiple second transformation matrices using the multiple second rotation vectors and the translation vector between the coordinate system of the laser radar and the coordinate system of the camera;
a second registration computing unit, configured to compute, for each second transformation matrix, the degree of registration between the corresponding image and point cloud using the second transformation matrix and the intrinsic parameters of the camera;
a reference rotation vector determining unit, configured to take the second rotation vector corresponding to the maximum degree of registration as the reference rotation vector.
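Claim 27 scores each candidate by projecting the point cloud through the candidate extrinsics and the camera intrinsics, then measuring how well the projection registers with the image. The sketch below uses a deliberately simple stand-in for the registration measure (fraction of projected points inside the board's image bounding box); the patent's actual measure may differ, and all names are illustrative:

```python
import numpy as np

def project_points(points_lidar, R, t, K):
    """Pinhole projection of Nx3 lidar-frame points into pixel coordinates."""
    cam = (R @ points_lidar.T).T + t   # lidar frame -> camera frame
    uv = (K @ cam.T).T                 # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]      # perspective divide

def registration_score(points_lidar, R, t, K, board_bbox):
    """Toy registration measure: fraction of projected points that fall
    inside the calibration board's bounding box in the image."""
    u, v = project_points(points_lidar, R, t, K).T
    x0, y0, x1, y1 = board_bbox
    inside = (u >= x0) & (u <= x1) & (v >= y0) & (v <= y1)
    return inside.mean()

# The candidate rotation vector with the highest score becomes the reference.
```

For instance, with identity extrinsics, focal length 100 and principal point (50, 50), a point at (0, 0, 1) projects to pixel (50, 50), inside a (0, 0, 100, 100) board box, while (1, 0, 1) projects to (150, 50), outside it.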
28. The device according to claim 20, wherein the image information determining unit comprises:
a first calibration board point cloud determining subunit, configured to extract, from the point cloud using a point cloud clustering algorithm, a calibration board point cloud located on the calibration board;
a first point cloud coordinate determining subunit, configured to determine the three-dimensional coordinates of the calibration board point cloud.
29. The device according to claim 20, wherein the image information determining unit comprises:
a reflectivity obtaining subunit, configured to obtain the reflectivity of each point in the point cloud;
a second calibration board point cloud determining subunit, configured to determine the calibration board point cloud located on the calibration board using the points whose reflectivity is greater than a preset reflectivity threshold;
a second point cloud coordinate determining subunit, configured to determine the three-dimensional coordinates of the calibration board point cloud.
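The reflectivity route of claim 29 exploits the fact that a calibration board can be made far more reflective than its surroundings, so a simple threshold on per-point reflectivity isolates the board returns. A sketch with made-up points and threshold:

```python
import numpy as np

def board_points_by_reflectivity(points_xyz, reflectivity, threshold):
    """Keep only points whose reflectivity exceeds the preset threshold;
    with a highly reflective calibration board these are the board returns."""
    mask = reflectivity > threshold
    return points_xyz[mask]

# Two nearby high-reflectivity returns (the board) and one dim background point.
cloud = np.array([[1.0, 0.0, 0.5], [1.1, 0.1, 0.5], [5.0, 2.0, 0.0]])
refl = np.array([0.9, 0.85, 0.2])
board = board_points_by_reflectivity(cloud, refl, 0.5)
print(board.shape)  # (2, 3): the two board returns survive
```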
30. The device according to claim 20, wherein the image information determining unit comprises:
a dimension information obtaining subunit, configured to obtain the dimension information of the calibration board;
a third calibration board point cloud determining subunit, configured to determine, using the dimension information of the calibration board, the calibration board point cloud located on the calibration board in the point cloud;
a third point cloud coordinate determining subunit, configured to determine the three-dimensional coordinates of the calibration board point cloud.
31. A calibration device, applied to an unmanned vehicle, the unmanned vehicle comprising at least one camera and at least one laser radar, each of the at least one camera and the at least one laser radar having its own coordinate system, the device comprising:
a reference coordinate system determining module, configured to select a target camera from the at least one camera and take the coordinate system of the target camera as a reference coordinate system;
a first calibration module, configured to determine, among the at least one laser radar, a first laser radar associated with the target camera, and calibrate the coordinate system of the first laser radar to the reference coordinate system;
a second calibration module, configured to determine, among the cameras other than the target camera, a first camera corresponding to the first laser radar, and calibrate the coordinate system of the first camera to the coordinate system of the corresponding first laser radar;
a non-association determining module, configured to determine a second laser radar not associated with the target camera, and determine a second camera corresponding to the second laser radar;
a third calibration module, configured to calibrate the coordinate system of the second camera to the coordinate system of the associated first laser radar, and calibrate the coordinate system of the second laser radar to the coordinate system of the second camera.
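The chain in claim 31 means every sensor reaches the reference frame by composing pairwise calibrations: second lidar to second camera, second camera to first lidar, first lidar to the target camera. A sketch with hypothetical planar rigid transforms; all names and numbers are illustrative:

```python
import numpy as np

def compose(T_ab, T_bc):
    """Chain two 4x4 rigid transforms: a<-b composed with b<-c gives a<-c."""
    return T_ab @ T_bc

def rigid(Rz_deg, tx):
    """A 4x4 rigid transform: rotation about z by Rz_deg, translation tx along x."""
    a = np.radians(Rz_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[0, 3] = tx
    return T

T_ref_lidar1 = rigid(90, 1.0)    # first lidar  -> reference (target camera)
T_lidar1_cam2 = rigid(0, 0.5)    # second camera -> first lidar
T_cam2_lidar2 = rigid(-90, 0.2)  # second lidar  -> second camera

# Second lidar expressed in the reference frame via the calibration chain.
T_ref_lidar2 = compose(compose(T_ref_lidar1, T_lidar1_cam2), T_cam2_lidar2)
```

Here the +90 and -90 degree rotations cancel, so the composed transform is a pure translation of the second lidar in the reference frame.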
32. The device according to claim 31, wherein the at least one camera comprises at least one industrial camera and at least one surround-view camera; and the reference coordinate system determining module comprises:
a target camera selecting submodule, configured to select one of the at least one industrial camera as the target camera.
33. The device according to claim 32, wherein the second calibration module comprises:
a first surround-view camera determining submodule, configured to determine, among the at least one surround-view camera, a first surround-view camera corresponding to the first laser radar.
34. The device according to claim 32, wherein the non-association determining module comprises:
a second surround-view camera determining submodule, configured to determine a second surround-view camera corresponding to the second laser radar.
35. A device, comprising:
one or more processors; and
one or more machine-readable media storing instructions that, when executed by the one or more processors, cause the device to perform one or more of the methods according to claims 1-13 or 14-17.
36. One or more machine-readable media storing instructions that, when executed by one or more processors, cause the processors to perform one or more of the methods according to claims 1-13 or 14-17.
CN201910425720.5A 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera Active CN110221275B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910425720.5A CN110221275B (en) 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera
PCT/CN2020/089722 WO2020233443A1 (en) 2019-05-21 2020-05-12 Method and device for performing calibration between lidar and camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910425720.5A CN110221275B (en) 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera

Publications (2)

Publication Number Publication Date
CN110221275A true CN110221275A (en) 2019-09-10
CN110221275B CN110221275B (en) 2023-06-23

Family

ID=67821629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910425720.5A Active CN110221275B (en) 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera

Country Status (2)

Country Link
CN (1) CN110221275B (en)
WO (1) WO2020233443A1 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110596683A (en) * 2019-10-25 2019-12-20 中山大学 Multi-group laser radar external parameter calibration system and method thereof
CN110853101A (en) * 2019-11-06 2020-02-28 深圳市巨力方视觉技术有限公司 Camera position calibration method and device and computer readable storage medium
CN110988801A (en) * 2019-10-25 2020-04-10 东软睿驰汽车技术(沈阳)有限公司 Radar installation angle adjusting method and device
CN111122128A (en) * 2020-01-03 2020-05-08 浙江大华技术股份有限公司 Calibration method and device of spherical camera
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111918203A (en) * 2020-07-03 2020-11-10 武汉万集信息技术有限公司 Target transport vehicle positioning method and device, storage medium and electronic equipment
WO2020233443A1 (en) * 2019-05-21 2020-11-26 菜鸟智能物流控股有限公司 Method and device for performing calibration between lidar and camera
CN112017250A (en) * 2020-08-31 2020-12-01 杭州海康威视数字技术股份有限公司 Calibration parameter determination method and device, radar vision equipment and radar ball joint system
CN112017251A (en) * 2020-10-19 2020-12-01 杭州飞步科技有限公司 Calibration method and device, road side equipment and computer readable storage medium
CN112180348A (en) * 2020-11-27 2021-01-05 深兰人工智能(深圳)有限公司 Attitude calibration method and device for vehicle-mounted multi-line laser radar
CN112233188A (en) * 2020-10-26 2021-01-15 南昌智能新能源汽车研究院 Laser radar-based roof panoramic camera and calibration method thereof
CN112363130A (en) * 2020-11-30 2021-02-12 东风汽车有限公司 Vehicle-mounted sensor calibration method, storage medium and system
CN112446927A (en) * 2020-12-18 2021-03-05 广东电网有限责任公司 Combined calibration method, device and equipment for laser radar and camera and storage medium
CN112578396A (en) * 2019-09-30 2021-03-30 上海禾赛科技股份有限公司 Method and device for coordinate transformation between radars and computer-readable storage medium
CN112669388A (en) * 2019-09-30 2021-04-16 上海禾赛科技股份有限公司 Calibration method and device for laser radar and camera device and readable storage medium
CN112785649A (en) * 2019-11-11 2021-05-11 北京京邦达贸易有限公司 Laser radar and camera calibration method and device, electronic equipment and medium
CN112823294A (en) * 2019-09-18 2021-05-18 北京嘀嘀无限科技发展有限公司 System and method for calibrating camera and multiline lidar
CN112819861A (en) * 2021-02-26 2021-05-18 广州小马慧行科技有限公司 Method and device for motion compensation of point cloud and computer readable storage medium
CN113077517A (en) * 2020-01-03 2021-07-06 湖南科天健光电技术有限公司 Spatial light measurement system calibration device and method based on light beam straight line characteristics
CN113188569A (en) * 2021-04-07 2021-07-30 东软睿驰汽车技术(沈阳)有限公司 Vehicle and laser radar coordinate system calibration method, device and storage medium
CN113740829A (en) * 2021-11-05 2021-12-03 新石器慧通(北京)科技有限公司 External parameter monitoring method and device for environment sensing equipment, medium and running device
CN114152935A (en) * 2021-11-19 2022-03-08 苏州一径科技有限公司 Method, device and equipment for evaluating radar external parameter calibration precision
CN114460552A (en) * 2022-01-21 2022-05-10 苏州皓宇云联科技有限公司 Road-end multi-sensor combined calibration method based on high-precision map
US20220214448A1 (en) * 2020-06-30 2022-07-07 Shanghai Sensetime Intelligent Technology Co., Ltd. Point cloud data fusion method and apparatus, electronic device, storage medium and computer program
WO2023040685A1 (en) * 2021-09-16 2023-03-23 杭州海康机器人股份有限公司 System calibration method and apparatus for line laser device
WO2023150961A1 (en) * 2022-02-10 2023-08-17 华为技术有限公司 Calibration method and device

Families Citing this family (28)

Publication number Priority date Publication date Assignee Title
CN112881999B (en) * 2021-01-25 2024-02-02 上海西虹桥导航技术有限公司 Semi-automatic calibration method for multi-line laser radar and vision sensor
US11418771B1 (en) 2021-01-31 2022-08-16 Techman Robot Inc. Method for calibrating 3D camera by employing calibrated 2D camera
EP4040391B1 (en) * 2021-02-09 2024-05-29 Techman Robot Inc. Method for calibrating 3d camera by employing calibrated 2d camera
CN113009456B (en) * 2021-02-22 2023-12-05 中国铁道科学研究院集团有限公司 Vehicle-mounted laser radar data calibration method, device and system
CN113156407B (en) * 2021-02-24 2023-09-05 长沙行深智能科技有限公司 Vehicle-mounted laser radar external parameter joint calibration method, system, medium and device
CN112946612B (en) * 2021-03-29 2024-05-17 上海商汤临港智能科技有限公司 External parameter calibration method and device, electronic equipment and storage medium
CN113177988B (en) * 2021-04-30 2023-12-05 中德(珠海)人工智能研究院有限公司 Spherical screen camera and laser calibration method, device, equipment and storage medium
CN113436278A (en) * 2021-07-22 2021-09-24 深圳市道通智能汽车有限公司 Calibration method, calibration device, distance measurement system and computer readable storage medium
CN113790738A (en) * 2021-08-13 2021-12-14 上海智能网联汽车技术中心有限公司 Data compensation method based on intelligent cradle head IMU
CN113744344B (en) * 2021-08-18 2023-09-08 富联裕展科技(深圳)有限公司 Calibration method, device, equipment and storage medium of laser equipment
CN113643382B (en) * 2021-08-22 2023-10-10 浙江大学 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera
CN113838141B (en) * 2021-09-02 2023-07-25 中南大学 External parameter calibration method and system for single-line laser radar and visible light camera
CN114022566A (en) * 2021-11-04 2022-02-08 安徽省爱夫卡电子科技有限公司 Combined calibration method for single line laser radar and camera
CN114022569B (en) * 2021-11-18 2024-06-07 湖北中烟工业有限责任公司 Method and device for measuring square accuracy of box body based on vision
CN114167393A (en) * 2021-12-02 2022-03-11 新境智能交通技术(南京)研究院有限公司 Position calibration method and device for traffic radar, storage medium and electronic equipment
CN114494806A (en) * 2021-12-17 2022-05-13 湖南国天电子科技有限公司 Target identification method, system, device and medium based on multivariate information fusion
CN114779188B (en) * 2022-01-24 2023-11-03 南京慧尔视智能科技有限公司 Method, device, equipment and medium for evaluating calibration effect
CN114755662B (en) * 2022-03-21 2024-04-30 北京航空航天大学 Road-vehicle fusion perception laser radar and GPS calibration method and device
CN114723715B (en) * 2022-04-12 2023-09-19 小米汽车科技有限公司 Vehicle target detection method, device, equipment, vehicle and medium
CN115100287A (en) * 2022-04-14 2022-09-23 美的集团(上海)有限公司 External reference calibration method and robot
CN115166701B (en) * 2022-06-17 2024-04-09 清华大学 System calibration method and device for RGB-D camera and laser radar
CN115856849B (en) * 2023-02-28 2023-05-05 季华实验室 Depth camera and 2D laser radar calibration method and related equipment
CN116540219B (en) * 2023-07-04 2023-09-22 北醒(北京)光子科技有限公司 Method and device for correcting radar emergent light angle, storage medium and electronic equipment
CN116630444B (en) * 2023-07-24 2023-09-29 中国矿业大学 Optimization method for fusion calibration of camera and laser radar
CN116740197B (en) * 2023-08-11 2023-11-21 之江实验室 External parameter calibration method and device, storage medium and electronic equipment
CN117073581B (en) * 2023-09-12 2024-01-26 梅卡曼德(北京)机器人科技有限公司 Calibration method and device of line laser profilometer system and electronic equipment
CN117607829B (en) * 2023-12-01 2024-06-18 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Ordered reconstruction method of laser radar point cloud and computer readable storage medium
CN117630892B (en) * 2024-01-25 2024-03-29 北京科技大学 Combined calibration method and system for visible light camera, infrared camera and laser radar

Citations (8)

Publication number Priority date Publication date Assignee Title
US20050228555A1 (en) * 2003-08-20 2005-10-13 Samsung Electronics Co., Ltd. Method of constructing artificial mark for autonomous driving, apparatus and method of determining position of intelligent system using artificial mark and intelligent system employing the same
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters
WO2018195999A1 (en) * 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Calibration of laser and vision sensors
CN109029284A * 2018-06-14 2018-12-18 大连理工大学 Three-dimensional laser scanner and camera calibration method based on geometric constraints
US20180372852A1 (en) * 2017-06-22 2018-12-27 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for calibration between laser radar and camera, device and storage medium
CN109151439A * 2018-09-28 2019-01-04 上海爱观视觉科技有限公司 Vision-based automatic tracking camera system and method
CN109215063A * 2018-07-05 2019-01-15 中山大学 Registration method for an event-triggered camera and three-dimensional lidar
US20190086524A1 (en) * 2017-09-17 2019-03-21 Baidu Online Network Technology (Beijing) Co., Ltd . Parameter calibration method and apparatus of multi-line laser radar, device and readable medium

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN107167790B * 2017-05-24 2019-08-09 北京控制工程研究所 Two-step lidar calibration method based on a calibration field
CN110221275B (en) * 2019-05-21 2023-06-23 菜鸟智能物流控股有限公司 Calibration method and device between laser radar and camera

Non-Patent Citations (1)

Title
JIA ZIYONG; REN GUOQUAN; LI DONGWEI; CHENG ZIYANG: "Camera and lidar calibration method based on a trapezoidal checkerboard", 计算机应用 (Journal of Computer Applications), no. 07 *


Also Published As

Publication number Publication date
WO2020233443A1 (en) 2020-11-26
CN110221275B (en) 2023-06-23

Similar Documents

Publication Publication Date Title
CN110221275A (en) Calibration method and device between laser radar and camera
CN111337947B (en) Instant mapping and positioning method, device, system and storage medium
CN106529495B (en) Obstacle detection method and device for aircraft
US8619144B1 (en) Automatic camera calibration
US9043146B2 (en) Systems and methods for tracking location of movable target object
CN112816949B (en) Sensor calibration method and device, storage medium and calibration system
US20190014310A1 (en) Hardware system for inverse graphics capture
CN109552665A Method for measuring and inspecting a structure using a suspended platform
Collins et al. Calibration of an outdoor active camera system
Läbe et al. Geometric stability of low-cost digital consumer cameras
CN102448679A (en) Method and system for extremely precise positioning of at least one object in the end position in space
CN105427288A (en) Calibration method and device of machine vision alignment system
CN111220126A (en) Space object pose measurement method based on point features and monocular camera
CN207766424U Photographing apparatus and imaging device
KR20130121290A (en) Georeferencing method of indoor omni-directional images acquired by rotating line camera
CN113361365B (en) Positioning method, positioning device, positioning equipment and storage medium
CN108007426A Camera distance measurement method and system
CN107534715B (en) Camera production method and advanced driving assistance system
EP4024340A1 (en) System for refining a six degrees of freedom pose estimate of a target object
CN114758011A (en) Zoom camera online calibration method fusing offline calibration results
CN110355758A Machine following method, device, and following robot system
Maas Dynamic photogrammetric calibration of industrial robots
CN111627048B (en) Multi-camera cooperative target searching method
CN112762929A (en) Intelligent navigation method, device and equipment
CN109489642B (en) Dynamic measurement method for relative attitude of two cube mirrors under any spatial attitude

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant