WO2022156176A1 - Multi-radar and camera joint calibration method, system, device, and storage medium - Google Patents

Multi-radar and camera joint calibration method, system, device, and storage medium

Info

Publication number
WO2022156176A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
sensor
center
millimeter
lidar
Prior art date
Application number
PCT/CN2021/108586
Other languages
English (en)
French (fr)
Inventor
章嵘
张大鹏
赵钊
Original Assignee
上海西井信息科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海西井信息科技有限公司
Priority to EP21920557.2A (EP4283328A1)
Publication of WO2022156176A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
        • G01S 7/40: Means for monitoring or calibrating (details of systems according to group G01S 13/00)
        • G01S 7/4004: Means for monitoring or calibrating of parts of a radar system
        • G01S 7/4026: Antenna boresight
        • G01S 7/4052: Means for monitoring or calibrating by simulation of echoes
        • G01S 7/4082: Simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
        • G01S 7/4086: Simulation of echoes using externally generated reference signals in a calibrating environment, e.g. anechoic chamber
        • G01S 7/497: Means for monitoring or calibrating (lidar systems)
        • G01S 7/4972: Alignment of sensor
        • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
        • G01S 13/865: Combination of radar systems with lidar systems
        • G01S 13/867: Combination of radar systems with cameras
        • G01S 13/88: Radar or analogous systems specially adapted for specific applications
        • G01S 13/931: Radar for anti-collision purposes of land vehicles
        • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
        • G01S 17/88: Lidar systems specially adapted for specific applications
        • G01S 17/931: Lidar for anti-collision purposes of land vehicles
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
        • G01M 11/02: Testing optical properties
        • G01M 11/0242: Testing optical properties by measuring geometrical properties or aberrations
        • G01M 11/0257: Testing optical properties by analyzing the image formed by the object to be tested
    • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
        • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
        • G06T 7/85: Stereo camera calibration
        • G06T 2207/10016: Video; image sequence
        • G06T 2207/20072: Graph-based image processing
        • G06T 2207/30204: Marker
        • G06T 2207/30208: Marker matrix
        • G06T 2207/30241: Trajectory
        • G06T 2207/30248: Vehicle exterior or interior
        • G06T 2207/30252: Vehicle exterior; vicinity of vehicle
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Definitions

  • The invention belongs to the technical field of multi-sensor calibration, and in particular relates to a joint calibration method, system, device, and storage medium for multiple radars and cameras.
  • Embodiments of the present invention provide a method for jointly calibrating multiple radars and cameras, including the following steps:
  • S100: arrange multiple types of radar sensors and an image sensor so that they form a common field of view;
  • S200: a calibration board is placed sequentially at a plurality of target positions in the common field of view; the calibration board carries a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center; detection data of the radar sensors and the image sensor for the calibration board at the different target positions is acquired to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor;
  • a millimeter-wave radar sensor, a lidar sensor, and an image sensor are arranged to jointly form a common field of view;
  • the same calibration board is placed sequentially at W target positions in the common field of view; the calibration board includes a reflector body, p centrally symmetric pattern reflection areas embedded flush in the reflector body, and a reflector arranged concentrically at the center of each centrally symmetric pattern reflection area and protruding from the surface of the reflector body, the reflectivities of the reflector body, the centrally symmetric pattern reflection areas, and the reflectors all differing; with the calibration board at each target position, the positioning information detected by the millimeter-wave radar sensor, the lidar sensor, and the image sensor is collected separately, the three sensors synchronously detecting the calibration board at each target position and each obtaining center positioning information for the centrally symmetric pattern reflection areas;
  • the step S100 includes: rigidly connecting the millimeter-wave radar sensor, the lidar sensor, and the image sensor, with the detection directions of the three sensors converging on the common field of view.
  • the step S200 includes the following steps: obtaining, by the image sensor, the center positioning information of the reflection area of the centrally symmetric pattern:
  • the image sensor captures an image of the reflection plate
  • the reflector body is a rectangle
  • the step S211 further includes performing rectangle correction on the image.
  • the step S200 includes the following steps: obtaining, by the lidar sensor, the center positioning information of the center-symmetric pattern reflection area:
  • the lidar sensor scans the calibration plate
  • the millimeter-wave radar sensor scans the shape of the reflector on the calibration plate
  • the step S300 includes the following steps:
  • the transformation matrix error between the image sensor and the lidar sensor is expressed as the Euclidean distance between each pair of corresponding circle centers
  • the 4×4 transformation matrix from the millimeter-wave radar sensor to the lidar sensor is
    $$T_{r,l} = \begin{bmatrix} R_2 & t_2 \\ 0 & 1 \end{bmatrix}$$
    where $R_2$ and $t_2$ are respectively the 3×3 rotation matrix and the 3×1 translation matrix from the millimeter-wave radar sensor to the lidar sensor;
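  • For illustration only (not part of the patent disclosure), the sketch below shows, in numpy, how such a 4×4 homogeneous transformation matrix is assembled from a 3×3 rotation matrix R and a 3×1 translation vector t and applied to a point; the numeric values are placeholders.

```python
import numpy as np

def make_homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Assemble T = [[R, t], [0, 1]] from R (3x3) and t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).reshape(3)
    return T

def apply_transform(T: np.ndarray, point: np.ndarray) -> np.ndarray:
    """Map a 3D point through T via homogeneous coordinates."""
    homogeneous = np.append(point, 1.0)   # [x, y, z, 1]
    return (T @ homogeneous)[:3]

# Placeholder extrinsics from the radar frame to the lidar frame (assumed values):
R2 = np.eye(3)
t2 = np.array([0.5, 0.0, -0.2])
T_rl = make_homogeneous(R2, t2)
print(apply_transform(T_rl, np.array([10.0, 1.0, 0.0])))
```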
  • Embodiments of the present invention also provide a multi-radar and camera joint calibration system for implementing the above multi-radar and camera joint calibration method.
  • the multi-radar and camera joint calibration system includes:
  • a calibration board placed sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center; detection data of the radar sensors and the image sensor for the calibration board at the different target positions is acquired to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor;
  • the joint calibration module performs joint calibration according to the coordinate information.
  • Embodiments of the present invention also provide a multi-radar and camera joint calibration device, including:
  • the processor is configured to execute the steps of the above multi-radar and camera joint calibration method by executing the executable instructions.
  • Embodiments of the present invention further provide a computer-readable storage medium storing a program which, when executed, implements the steps of the above multi-radar and camera joint calibration method.
  • the multi-radar and camera joint calibration method, system, device, and storage medium of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.
  • FIG. 1 is a flow chart of a multi-radar and camera joint calibration method according to an embodiment of the present invention.
  • FIGS. 2 to 4 are schematic diagrams of the implementation process of the multi-radar and camera joint calibration method according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a multi-radar and camera joint calibration system according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of a multi-radar and camera joint calibration device according to an embodiment of the present invention; and
  • FIG. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
  • FIG. 1 is a flow chart of the multi-radar and camera joint calibration method according to an embodiment of the present invention. As shown in FIG. 1, this embodiment provides a multi-radar and camera joint calibration method including the following steps:
  • a calibration board is placed sequentially at a plurality of target positions in the common field of view; the calibration board carries a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center; detection data of the radar sensors and the image sensor for the calibration board at the different target positions is acquired to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor.
  • At least one radar sensor obtains the position information of the track of each calibration pattern by analyzing reflection data, and then obtains the pattern center through image processing, but is not limited to this.
  • At least one radar sensor obtains the position information of a calibration object at the center of each calibration pattern by analyzing reflection data, and then obtains the pattern center from the preset spatial relationship between the calibration object and the pattern center, but is not limited to this.
  • At least one image sensor obtains the position information of the track of each calibration pattern through recognition, and then obtains the pattern center through image processing, but is not limited to this.
  • Traditional multi-sensor joint calibration based on separate pairwise hand-eye calibration has low accuracy and a complicated process.
  • The present system needs no manual adjustment of alignment parameters, the calibration results are highly accurate, and the workflow is simple and suitable for mass production, achieving high-precision mass production of joint calibration in a true sense.
  • the same calibration board is placed sequentially at W target positions in the common field of view (see the first target position 3A, the second target position 3B, and the third target position 3C in FIG. 2).
  • At each target position, the calibration board appears completely in the image of the image sensor 23 and within the detection ranges of the millimeter-wave radar sensor 21 and the lidar sensor 22.
  • the calibration board 3 includes a reflector body 31 and p centrally symmetric pattern reflection regions 32 embedded flush in the reflector body 31 (in this embodiment, four circular regions embedded flush in the reflector body 31).
  • the material of the reflector body 31 is low-reflection foam
  • the material of the centrally symmetric pattern reflection area 32 is high-reflection reflective paper
  • the reflector 33 is a metal triangular-pyramid reflector.
  • the reflectivity of the metal triangular pyramid reflector 33 is greater than that of the high-reflection reflective paper, and the reflectivity of the high-reflection reflective paper is greater than that of the low-reflection foam.
  • with the calibration board at each target position, the positioning information detected by the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 is collected separately.
  • the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 synchronously detect the calibration board at each target position (see the first target position 3A, the second target position 3B, and the third target position 3C in FIG. 2), each obtaining center positioning information for the centrally symmetric pattern reflection areas 32.
  • in step S300, joint calibration is performed according to the respective center positioning information of the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 at the different target positions.
  • Step S300 includes the following steps:
  • the reflector body 31 is rectangular; the image sensor 23 captures an image of the reflector plate and performs rectangle correction on the image.
  • obtaining the center positioning information of the center-symmetric pattern reflection area 32 by the lidar sensor 22 includes the following steps:
  • the lidar sensor 22 scans the calibration board.
  • the reflector body 31 and the centrally symmetric pattern reflection areas 32 are made of two media with very different reflectivities, so the boundary between the reflector body 31 and the centrally symmetric pattern reflection areas 32 can be obtained by analyzing the reflection data of the lidar sensor 22 and used as the second pattern track, but not limited thereto.
  • S223 obtaining the center of the center-symmetric pattern reflection area 32 according to the second pattern track.
  • the corresponding inner center position of each second pattern track is obtained, but not limited thereto.
  • S224 based on the three-dimensional coordinate system of the lidar sensor 22 , obtain a three-dimensional coordinate whose center is in the three-dimensional coordinate system of the lidar sensor 22 .
  • obtaining, by the millimeter-wave radar sensor 21, the center positioning information of the centrally symmetric pattern reflection areas 32 includes the following steps:
  • the millimeter wave radar sensor 21 scans the shape of the reflector 33 on the calibration plate.
  • a metal corner reflector with high reflectivity is used, so the position of the metal corner reflector can be obtained by analyzing the reflection data of the millimeter wave radar sensor 21, but it is not limited thereto.
  • S232 based on the three-dimensional coordinate system of the millimeter-wave radar sensor 21 , obtain the three-dimensional coordinates of the reflector 33 in the three-dimensional coordinate system of the millimeter-wave radar sensor 21 .
  • the center of the calibration pattern is obtained through the predetermined spatial positional relationship between the metal corner reflector and the center of the center-symmetric pattern reflection area 32 , but not limited thereto.
  • f(θ_{c,l}) and f(θ_{r,l}) are iteratively optimized by the least-squares method; when f(θ_{c,l}) is minimal, the transformation matrix between the image sensor 23 and the lidar sensor 22 is obtained, and when f(θ_{r,l}) is minimal, the transformation matrix between the millimeter-wave radar sensor 21 and the lidar sensor 22 is obtained, but not limited to this.
  • the technical solution adopted by the present invention to solve the technical problem is as follows: the millimeter-wave radar, the lidar, and the camera are rigidly connected to ensure that their relative positions remain unchanged, and the specially designed joint calibration board of this method is placed facing the sensors in their common field of view. Unlike previous approaches, in which every two sensors share one set of calibration boards and multiple different boards each calibrate different sensors, the dedicated calibration board of this method combines the target features of the three sensors in one, enabling arbitrary coordinate transformation between the three sensors; a single calibration board suffices for the entire calibration process, improving accuracy and robustness.
  • the data of the different sensors are associated frame by frame, extracting the four circle-center coordinates in the image, the three-dimensional coordinates of the three point-cloud center points from the lidar data, and the three-dimensional coordinates of the millimeter-wave radar corner-reflector 33 center point minus the corner-reflector 33 depth.
  • the calibration of the three sensors is converted into point-to-point matching; iterative optimization over multiple sets of data, combined with the iterative matching algorithm of the present invention, finally yields a minimum error.
  • the transformation matrix obtained at the minimum error is the extrinsic parameters.
  • the data of the three sensors are verified pairwise, and the results are output once no errors remain.
  • three different sensors simultaneously obtain the world coordinates of the circles on the calibration board; iterative matching with the algorithm described herein then yields an accurate coordinate transformation matrix.
  • the present invention can regularly arrange 1 to N (any number of) circular reflective materials with different reflectivities on a rectangular board, cut a triangular recess in each circular reflective material, and fix a metal corner reflector inside.
  • the three sensors of millimeter-wave radar, lidar and camera are rigidly connected, and the relative coordinates are kept unchanged, and then a specific calibration board is made and placed in the public field of view of the three sensors.
  • the millimeter-wave radar obtains, via the corner reflector, a stable and high-precision triangle center point, i.e., the circle center, together with the RCS value.
  • the lidar extracts the circles by reflectivity and obtains the three-dimensional coordinates of each circle center from the circle edges.
  • after undistortion, the circle-center coordinates in the image are likewise obtained through the Hough circle transform, and the x, y, z coordinates of the circle centers in the camera coordinate system are then obtained through ICP registration with the lidar.
  • the next step is to match the coordinates of the respective target circle centers in the three sensors.
  • the transformation matrix error between the camera and the lidar is expressed as the Euclidean distance between each pair of corresponding circle centers
  • T_{c,l} represents the rotation and translation matrix from the camera to the lidar
  • p represents the circle number (1 to 4) in the calibration plate.
  • the calibration of the millimeter-wave radar can be converted into an iterative optimization problem.
  • a maximum ψ_max can be defined, giving the constraint −ψ_max ≤ ψ_k ≤ ψ_max
  • the external parameter relationship of the three sensors is converted into an iterative optimization problem.
  • the total error between two sensors is denoted f(θ_{1,2})
  • when f(θ_{1,2}) is minimal, the corresponding solution is the optimal solution, i.e., the corresponding rotation-translation matrix.
  • the pose optimization between the three sensors can set any sensor coordinate as the reference coordinate. Therefore, an iterative optimization equation can be formed between two sensors, and the errors of all combinations can be expressed as,
  • the minimum error is finally optimized, and the transformation matrix corresponding to the minimum error is the external parameter.
  • the above invention can calibrate millimeter wave radar, lidar and camera with high efficiency and high precision, and realize mass production.
  • FIG. 5 is a schematic structural diagram of a multi-radar and camera joint calibration system according to an embodiment of the present invention. As shown in FIG. 5, this embodiment also provides a multi-radar and camera joint calibration system 5 for implementing the above multi-radar and camera joint calibration method.
  • the multi-radar and camera joint calibration system 5 includes:
  • the common field of view module 51, which arranges multiple types of radar sensors and an image sensor to form a common field of view.
  • the coordinate detection module 52, which places a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center, and acquires the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor.
  • the joint calibration module 53 performs joint calibration according to the coordinate information.
  • the multi-radar and camera joint calibration system of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, which reduces the steps of multi-sensor calibration and speeds up joint calibration.
  • Embodiments of the present invention also provide a multi-radar and camera joint calibration device, including a processor.
  • a memory in which executable instructions for the processor are stored.
  • the processor is configured to perform the steps of the multi-radar and camera joint calibration method by executing the executable instructions.
  • the multi-radar and camera joint calibration device of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, which reduces the steps of multi-sensor calibration and speeds up joint calibration.
  • aspects of the present invention may be implemented as a system, method, or program product. Therefore, aspects of the present invention may take the form of an entirely hardware implementation, an entirely software implementation (including firmware, microcode, etc.), or an implementation combining hardware and software, which may collectively be referred to herein as a "circuit", "module", or "platform".
  • the storage unit 620 may include a readable medium in the form of a volatile storage unit, such as a random access storage unit (RAM) 6201 and/or a cache storage unit 6202 , and may further include a read only storage unit (ROM) 6203 .
  • RAM random access storage unit
  • ROM read only storage unit
  • the storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, An implementation of a network environment may be included in each or some combination of these examples.
  • the bus 630 may represent one or more of several types of bus structures, including a storage-unit bus or storage-unit controller, a peripheral bus, a graphics acceleration port, a processing unit, or a local bus using any of a variety of bus structures.
  • the electronic device 600 may also communicate with one or more external devices 700 (eg, keyboards, pointing devices, Bluetooth devices, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with Any device (eg, router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through input/output (I/O) interface 650 . Also, the electronic device 600 may communicate with one or more networks (eg, a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 660 . Network adapter 660 may communicate with other modules of electronic device 600 through bus 630 . It should be appreciated that, although not shown, other hardware and/or software modules may be used in conjunction with electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives And data backup storage platform, etc.
  • Embodiments of the present invention further provide a computer-readable storage medium storing a program which, when executed, implements the steps of the multi-radar and camera joint calibration method.
  • aspects of the present invention may also be implemented in the form of a program product including program code; when the program product runs on a terminal device, the program code causes the terminal device to perform the steps according to various exemplary embodiments of the present invention described in the multi-radar and camera joint calibration method section of this specification.
  • FIG. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
  • a program product 800 for implementing the above method according to an embodiment of the present invention is described; it may take the form of a portable compact disc read-only memory (CD-ROM) including program code, and may run on a terminal device such as a personal computer.
  • CD-ROM compact disk read only memory
  • the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • the program product may employ any combination of one or more readable media.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • the readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above. More specific examples (non-exhaustive list) of readable storage media include: electrical connections with one or more wires, portable disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
  • a readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • a readable signal medium can also be any readable medium other than a readable storage medium that can send, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Program code embodied on a readable storage medium may be transmitted using any suitable medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (eg, using an Internet service provider business via an Internet connection).
  • LAN local area network
  • WAN wide area network
  • an external computing device (e.g., connected through the Internet using an Internet service provider)
  • the multi-radar and camera joint calibration method, system, device, and storage medium of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Mathematical Physics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Data Mining & Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Chemical & Material Sciences (AREA)
  • Algebra (AREA)
  • Analytical Chemistry (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Provided are a multi-radar and camera joint calibration method, system, device, and storage medium. The calibration method includes the following steps: arranging multiple types of radar sensors and an image sensor to form a common field of view (S100); placing a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center, and acquiring the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor (S200); and performing joint calibration according to the coordinate information (S300). The method can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.

Description

Multi-radar and camera joint calibration method, system, device, and storage medium

Technical Field

The present invention belongs to the technical field of multi-sensor calibration, and in particular relates to a joint calibration method, system, device, and storage medium for multiple radars and cameras.
Background

At present, multi-sensor fusion involving millimeter-wave radar, lidar, and cameras is widely used in autonomous driving, V2X roadside perception, and other industrial fields. Multi-sensor fusion requires synchronization in both space and time; the first step of fusion is spatial synchronization, i.e., joint calibration. Traditional joint calibration methods are fragmented: each pair of sensors is calibrated separately and the results are integrated at the end. The calibration workflow is complex and requires extensive manual optimization and iterative parameter tuning; it places demanding requirements on the scene and involves many kinds of calibration targets, so each calibration requires many professional engineers and a great deal of time, and the quality of the calibrated parameters is not high. Any slight rigid-body deformation between the sensors requires recalibration.

It should be noted that the information disclosed in this Background section is only intended to enhance understanding of the background of the present invention, and therefore may include information that does not constitute prior art already known to a person of ordinary skill in the art.
Summary of the Invention

In view of the problems in the prior art, an object of the present invention is to provide a multi-radar and camera joint calibration method, system, device, and storage medium that overcome the difficulties of the prior art: the world coordinates of reference points on the same calibration board can be obtained simultaneously through three different sensors, and joint calibration is then achieved through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.
An embodiment of the present invention provides a multi-radar and camera joint calibration method, including the following steps:

S100: arranging multiple types of radar sensors and an image sensor to form a common field of view;

S200: placing a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center; acquiring the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor;

S300: performing joint calibration according to the coordinate information.
In some embodiments, in step S100, a millimeter-wave radar sensor, a lidar sensor, and an image sensor are arranged to form a common field of view.

In step S200, the same calibration board is placed sequentially at W target positions in the common field of view. The calibration board includes a reflector body, p centrally symmetric pattern reflection areas embedded flush in the reflector body, and a reflector arranged concentrically at the center of each centrally symmetric pattern reflection area and protruding from the surface of the reflector body; the reflectivities of the reflector body, the centrally symmetric pattern reflection areas, and the reflectors all differ. With the calibration board at each target position, the positioning information detected by the millimeter-wave radar sensor, the lidar sensor, and the image sensor is collected separately; the three sensors synchronously detect the calibration board at each target position and each obtains center positioning information for the centrally symmetric pattern reflection areas.

In step S300, joint calibration is performed according to the respective center positioning information of the millimeter-wave radar sensor, the lidar sensor, and the image sensor at the different target positions.
In some embodiments, step S100 includes rigidly connecting the millimeter-wave radar sensor, the lidar sensor, and the image sensor, with the detection directions of the three sensors converging on the common field of view.
In some embodiments, step S200 includes that the calibration board at each target position appears completely in the image of the image sensor and within the detection ranges of the millimeter-wave radar sensor and the lidar sensor.
In some embodiments, in step S200, obtaining the center positioning information of the centrally symmetric pattern reflection areas by the image sensor includes the following steps:

S211: the image sensor captures an image of the reflector plate;

S212: image recognition is performed on the image to obtain a first pattern track of each centrally symmetric pattern reflection area;

S213: the center of each centrally symmetric pattern reflection area is obtained from the first pattern track;

S214: based on the three-dimensional coordinate system of the image sensor, the three-dimensional coordinates of the center in that coordinate system are obtained.

In some embodiments, the reflector body is rectangular, and step S211 further includes performing rectangle correction on the image.
In some embodiments, in step S200, obtaining the center positioning information of the centrally symmetric pattern reflection areas by the lidar sensor includes the following steps:

S221: the lidar sensor scans the calibration board;

S222: a second pattern track distinguishing the reflector body from the centrally symmetric patterns is obtained from the difference in reflectivity between the reflector body and the centrally symmetric pattern reflection areas as measured by the lidar sensor;

S223: the center of each centrally symmetric pattern reflection area is obtained from the second pattern track;

S224: based on the three-dimensional coordinate system of the lidar sensor, the three-dimensional coordinates of the center in that coordinate system are obtained.
In some embodiments, in step S200, obtaining the center positioning information of the centrally symmetric pattern reflection areas by the millimeter-wave radar sensor includes the following steps:

S231: the millimeter-wave radar sensor scans the shape of the reflectors on the calibration board;

S232: based on the three-dimensional coordinate system of the millimeter-wave radar sensor, the three-dimensional coordinates of each reflector in that coordinate system are obtained;

S233: the three-dimensional coordinates of the center in the millimeter-wave radar sensor's coordinate system are obtained from the preset positional relationship between the reflector and the center of the centrally symmetric pattern reflection area.
In some embodiments, step S300 includes the following steps.

Let the target features of each sensor be $y = \{y_1, \ldots, y_W\}$, and denote the calibration board at target position k by $y_k^c$, $y_k^l$, $y_k^r$, where k is the calibration board position index and c, l, r denote the image sensor, the lidar sensor, and the millimeter-wave radar sensor, respectively.

The 4×4 transformation matrix from the image sensor to the lidar sensor is

$$T_{c,l} = \begin{bmatrix} R_1 & t_1 \\ 0 & 1 \end{bmatrix}$$

where $R_1$ and $t_1$ are respectively the 3×3 rotation matrix and the 3×1 translation matrix from the image sensor to the lidar sensor.

For the k-th calibration board position, the transformation matrix error between the image sensor and the lidar sensor is expressed as the Euclidean distance between each pair of corresponding circle centers:

$$e_k(\theta_{c,l}) = \sum_{p} \left\| y_{k,p}^{c} - T_{c,l}\, y_{k,p}^{l} \right\| \tag{1}$$

where $e_k(\theta_{c,l})$ is the extrinsic matching error between the image sensor and the lidar sensor at target position k, $y_{k,p}^{c}$ is the three-dimensional coordinate of the center of circle p in the image sensor coordinate system when the board is at target position k, $T_{c,l}\, y_{k,p}^{l}$ is the lidar measurement of circle p transformed into the image sensor coordinate system when the board is at target position k, $T_{c,l}$ is the rotation and translation matrix from the image sensor to the lidar sensor, and p is the circle index on the calibration board.

The 4×4 transformation matrix from the millimeter-wave radar sensor to the lidar sensor is

$$T_{r,l} = \begin{bmatrix} R_2 & t_2 \\ 0 & 1 \end{bmatrix}$$

where $R_2$ and $t_2$ are respectively the 3×3 rotation matrix and the 3×1 translation matrix from the millimeter-wave radar sensor to the lidar sensor.

For the millimeter-wave radar sensor and the lidar sensor, the error equation at the k-th calibration board position is:

$$e_k(\theta_{r,l}) = \sum_{p} \left\| y_{k,p}^{r} - T_{r,l}\, y_{k,p}^{l} \right\| \tag{2}$$

where $e_k(\theta_{r,l})$ is the extrinsic matching error between the millimeter-wave radar sensor and the lidar at target position k, $y_{k,p}^{r}$ is the three-dimensional coordinate of the center of circle p in the millimeter-wave radar coordinate system when the board is at target position k, $T_{r,l}\, y_{k,p}^{l}$ is the lidar measurement of circle p transformed into the image sensor coordinate system when the board is at target position k, $T_{r,l}$ is the rotation and translation matrix from the millimeter-wave radar to the lidar sensor, and p is the circle index on the calibration board.

Converting the Euclidean representation to spherical coordinates $(r, \theta, \psi)$, formula (2) is expressed as formula (3):

$$e_k(\theta_{r,l}) = \sum_{p} \left\| \left(r, \theta, \psi_k\right)_{k,p}^{r} - T_{r,l}\, \left(r, \theta, \psi\right)_{k,p}^{l} \right\| \tag{3}$$

where $(r, \theta, \psi_k)_{k,p}^{r}$ are the spherical coordinates of the circle centers in the millimeter-wave radar coordinate system when the board is at target position k, with the pitch angle $\psi_k$ unknown and initialized to 0, and $(r, \theta, \psi)_{k,p}^{l}$ are the spherical coordinates of the board's circle centers in the lidar coordinate system.

The vertical field of view of the millimeter-wave radar sensor bounds a maximum $\psi_{max}$, giving:

$$-\psi_{max} \leq \psi_k \leq \psi_{max}$$

Over the W calibration board positions, the total error between the image sensor and the lidar sensor is denoted $f(\theta_{c,l})$:

$$f(\theta_{c,l}) = \sum_{k=1}^{W} \delta_k^{c,l}\, e_k(\theta_{c,l})$$

and the total error between the millimeter-wave radar sensor and the lidar sensor is denoted $f(\theta_{r,l})$:

$$f(\theta_{r,l}) = \sum_{k=1}^{W} \delta_k^{r,l}\, e_k(\theta_{r,l})$$

where $\delta_k^{c,l}$ indicates whether the parameters between the image sensor and the lidar sensor exist (1 if present, 0 if absent), and $\delta_k^{r,l}$ indicates whether the parameters between the millimeter-wave radar sensor and the lidar sensor exist (1 if present, 0 if absent).

$f(\theta_{c,l})$ and $f(\theta_{r,l})$ are iteratively optimized by the least-squares method; when $f(\theta_{c,l})$ is minimal, the transformation matrix between the image sensor and the lidar sensor is obtained, and when $f(\theta_{r,l})$ is minimal, the transformation matrix from the millimeter-wave radar sensor to the lidar sensor is obtained.
An embodiment of the present invention also provides a multi-radar and camera joint calibration system for implementing the above multi-radar and camera joint calibration method, including:

a common field of view module, which arranges multiple types of radar sensors and an image sensor to form a common field of view;

a coordinate detection module, which places a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center, and acquires the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor; and

a joint calibration module, which performs joint calibration according to the coordinate information.
An embodiment of the present invention also provides a multi-radar and camera joint calibration device, including:

a processor; and

a memory storing instructions executable by the processor;

wherein the processor is configured to execute the steps of the above multi-radar and camera joint calibration method by executing the executable instructions.
An embodiment of the present invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the above multi-radar and camera joint calibration method.
The multi-radar and camera joint calibration method, system, device, and storage medium of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.
Brief Description of the Drawings

Other features, objects, and advantages of the present invention will become more apparent from the following detailed description of non-limiting embodiments, read with reference to the accompanying drawings.

FIG. 1 is a flow chart of a multi-radar and camera joint calibration method according to an embodiment of the present invention.

FIGS. 2 to 4 are schematic diagrams of the implementation process of the multi-radar and camera joint calibration method according to an embodiment of the present invention.

FIG. 5 is a schematic structural diagram of a multi-radar and camera joint calibration system according to an embodiment of the present invention.

FIG. 6 is a schematic structural diagram of a multi-radar and camera joint calibration device according to an embodiment of the present invention; and

FIG. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description

Example embodiments will now be described more fully with reference to the accompanying drawings. The example embodiments can, however, be implemented in many forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and their repeated description will be omitted.
FIG. 1 is a flow chart of a multi-radar and camera joint calibration method according to an embodiment of the present invention. As shown in FIG. 1, this embodiment provides a multi-radar and camera joint calibration method including the following steps:

S100: arranging multiple types of radar sensors and an image sensor to form a common field of view.

S200: placing a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center; acquiring the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor.

S300: performing joint calibration according to the coordinate information.
In a preferred scheme, at least one radar sensor obtains the position information of the track of each calibration pattern by analyzing reflection data, and then obtains the pattern center through image processing, but is not limited to this.

In a preferred scheme, at least one radar sensor obtains the position information of a calibration object at the center of each calibration pattern by analyzing reflection data, and then obtains the pattern center from the preset spatial relationship between the calibration object and the pattern center, but is not limited to this.

In a preferred scheme, at least one image sensor obtains the position information of the track of each calibration pattern through recognition, and then obtains the pattern center through image processing, but is not limited to this. To overcome the low accuracy and complicated workflow of traditional multi-sensor joint calibration based on separate hand-eye calibration, the present invention provides a fully automatic joint calibration method for millimeter-wave radar, lidar, and camera. The system needs no manual adjustment of alignment parameters, the calibration results are highly accurate, and the workflow is simple enough for mass production, thereby achieving high-precision mass production of joint calibration in a true sense.
FIGS. 2 to 4 are schematic diagrams of the implementation process of the multi-radar and camera joint calibration method of this embodiment. As shown in FIGS. 2 to 4, the implementation process of the method of this embodiment includes the following.

In step S100, the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 are rigidly connected and mounted at the front of the truck 1, with their detection directions converging on the common field of view.

In step S200, the same calibration board is placed sequentially at W target positions in the common field of view (see the first target position 3A, the second target position 3B, and the third target position 3C in FIG. 2); at each target position the calibration board appears completely in the image of the image sensor 23 and within the detection ranges of the millimeter-wave radar sensor 21 and the lidar sensor 22. The calibration board 3 includes a reflector body 31, p centrally symmetric pattern reflection areas 32 embedded flush in the reflector body 31 (in this embodiment, four circular areas embedded flush in the reflector body 31), and a reflector 33 arranged concentrically at the center of each centrally symmetric pattern reflection area 32 and protruding from the surface of the reflector body 31; the reflectivities of the reflector body 31, the centrally symmetric pattern reflection areas 32, and the reflectors 33 all differ. In this embodiment, the reflector body 31 is made of low-reflectivity foam, the centrally symmetric pattern reflection areas 32 are made of high-reflectivity reflective paper, and the reflector 33 is a metal triangular-pyramid reflector 33; the reflectivity of the metal triangular-pyramid reflector 33 is greater than that of the high-reflectivity reflective paper, which in turn is greater than that of the low-reflectivity foam. With the calibration board at each target position, the positioning information detected by the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 is collected separately; the three sensors synchronously detect the calibration board at each target position (see the first target position 3A, the second target position 3B, and the third target position 3C in FIG. 2) and each obtains center positioning information for the centrally symmetric pattern reflection areas 32.

In step S300, joint calibration is performed according to the respective center positioning information of the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 at the different target positions. Step S300 includes the following steps.
In a preferred embodiment, obtaining the center positioning information of the centrally symmetric pattern reflection areas 32 by the image sensor 23 includes the following steps:

S211: the reflector body 31 being rectangular, the image sensor 23 captures an image of the reflector plate and performs rectangle correction on the image.
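As an illustration of the rectangle correction in S211, the sketch below warps the detected quadrilateral of the rectangular reflector body to a fronto-parallel rectangle with OpenCV; the board dimensions, the pixel scale, and the corner-detection step are assumptions for the example, not values from the patent.

```python
import cv2
import numpy as np

def rectify_board(image, corners_px, board_w_mm=1000, board_h_mm=800, px_per_mm=1.0):
    """Warp the four detected board corners (ordered TL, TR, BR, BL) to a
    fronto-parallel rectangle. Corner detection itself is assumed done."""
    w, h = int(board_w_mm * px_per_mm), int(board_h_mm * px_per_mm)
    src = np.asarray(corners_px, dtype=np.float32)          # (4, 2) image points
    dst = np.array([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]], dtype=np.float32)
    M = cv2.getPerspectiveTransform(src, dst)                # 3x3 homography
    return cv2.warpPerspective(image, M, (w, h))             # rectified view
```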
S212: image recognition is performed on the image to obtain the first pattern track of each centrally symmetric pattern reflection area 32. In this embodiment, an existing image recognition method can be used to recognize the regularly arranged centrally symmetric pattern reflection areas 32 in the image and obtain the region of each one, but the method is not limited to this.

S213: the center of each centrally symmetric pattern reflection area 32 is obtained from the first pattern track. A graphics processing algorithm derives the corresponding inner center position from each first pattern track, but the method is not limited to this.

S214: based on the three-dimensional coordinate system of the image sensor 23, the three-dimensional coordinates of the center in that coordinate system are obtained.
In a preferred embodiment, obtaining the center positioning information of the centrally symmetric pattern reflection areas 32 by the lidar sensor 22 includes the following steps:

S221: the lidar sensor 22 scans the calibration board.

S222: the second pattern track distinguishing the reflector body 31 from the centrally symmetric patterns is obtained from the reflectivity difference between the reflector body 31 and the centrally symmetric pattern reflection areas 32 as measured by the lidar sensor 22. In this embodiment, the reflector body 31 and the centrally symmetric pattern reflection areas 32 are made of two media with very different reflectivities, so the boundary between them can be obtained by analyzing the reflection data of the lidar sensor 22 and used as the second pattern track, but the method is not limited to this.

S223: the center of each centrally symmetric pattern reflection area 32 is obtained from the second pattern track. A graphics processing algorithm derives the corresponding inner center position from each second pattern track, but the method is not limited to this. S224: based on the three-dimensional coordinate system of the lidar sensor 22, the three-dimensional coordinates of the center in that coordinate system are obtained.
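A minimal sketch of the S222-S224 idea under stated assumptions: high-reflectivity lidar returns are separated by an intensity threshold, grouped per circle, and each group's centroid is taken as the circle center (a valid estimate for a centrally symmetric pattern). The threshold, clustering radius, and the choice of scikit-learn's DBSCAN are illustrative, not the patent's implementation.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def circle_centers_from_lidar(points_xyz, intensity, refl_thresh=0.6, eps=0.08):
    """Estimate the circle centers from one lidar sweep of the board.
    points_xyz: (N, 3) points; intensity: (N,) normalised reflectivity."""
    circle_pts = points_xyz[intensity > refl_thresh]   # reflective-paper returns
    labels = DBSCAN(eps=eps, min_samples=10).fit_predict(circle_pts)
    centers = [circle_pts[labels == c].mean(axis=0)    # centroid per circle
               for c in sorted(set(labels)) if c != -1]
    return np.array(centers)                           # one (x, y, z) per circle
```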
In a preferred embodiment, obtaining the center positioning information of the centrally symmetric pattern reflection areas 32 by the millimeter-wave radar sensor 21 includes the following steps:

S231: the millimeter-wave radar sensor 21 scans the shape of the reflectors 33 on the calibration board. In this embodiment, high-reflectivity metal corner reflectors are used, so the position of each metal corner reflector can be obtained by analyzing the reflection data of the millimeter-wave radar sensor 21, but the method is not limited to this. S232: based on the three-dimensional coordinate system of the millimeter-wave radar sensor 21, the three-dimensional coordinates of the reflector 33 in that coordinate system are obtained.

S233: the three-dimensional coordinates of the center in the millimeter-wave radar sensor 21's coordinate system are obtained from the preset positional relationship between the reflector 33 and the center of the centrally symmetric pattern reflection area 32. For example, the pattern center is obtained from the preset spatial relationship between the metal corner reflector and the center of the centrally symmetric pattern reflection area 32, but the method is not limited to this.
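A sketch of the S233 offset step, assuming the vector from the corner-reflector return to the pattern center is known from the board design; the offset values below are hypothetical.

```python
import numpy as np

def circle_center_from_reflector(reflector_xyz, preset_offset):
    """Shift the corner-reflector return by the preset reflector-to-center
    offset to obtain the circle center in the radar frame."""
    return np.asarray(reflector_xyz) + np.asarray(preset_offset)

# Hypothetical example: the reflector apex sits 0.05 m behind the board
# surface along the board normal (here assumed to be the radar x axis):
center = circle_center_from_reflector([12.3, -0.4, 0.0], [-0.05, 0.0, 0.0])
```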
In a preferred embodiment, in step S300, joint calibration is performed according to the respective center positioning information of the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 at the different target positions. Step S300 includes the following steps.

Let the target features of each sensor be $y = \{y_1, \ldots, y_W\}$, and denote the calibration board at target position k by $y_k^c$, $y_k^l$, $y_k^r$, where k is the calibration board position index and c, l, r denote the image sensor 23, the lidar sensor 22, and the millimeter-wave radar sensor 21, respectively.

The 4×4 transformation matrix from the image sensor 23 to the lidar sensor 22 is

$$T_{c,l} = \begin{bmatrix} R_1 & t_1 \\ 0 & 1 \end{bmatrix}$$

where $R_1$ and $t_1$ are respectively the 3×3 rotation matrix and the 3×1 translation matrix from the image sensor 23 to the lidar sensor 22.

For the k-th calibration board position, the transformation matrix error between the image sensor 23 and the lidar sensor 22 is expressed as the Euclidean distance between each pair of corresponding circle centers:

$$e_k(\theta_{c,l}) = \sum_{p} \left\| y_{k,p}^{c} - T_{c,l}\, y_{k,p}^{l} \right\| \tag{1}$$

where $e_k(\theta_{c,l})$ is the extrinsic matching error between the image sensor 23 and the lidar sensor 22, $y_{k,p}^{c}$ is the three-dimensional coordinate of the board's circle centers in the image sensor 23 coordinate system, $T_{c,l}\, y_{k,p}^{l}$ is the lidar sensor 22 measurement transformed into the image sensor 23 coordinate system, $T_{c,l}$ is the rotation and translation matrix from the image sensor 23 to the lidar sensor 22, and p is the circle index on the calibration board.

The 4×4 transformation matrix from the millimeter-wave radar sensor 21 to the lidar sensor 22 is

$$T_{r,l} = \begin{bmatrix} R_2 & t_2 \\ 0 & 1 \end{bmatrix}$$

where $R_2$ and $t_2$ are respectively the 3×3 rotation matrix and the 3×1 translation matrix from the millimeter-wave radar sensor 21 to the lidar sensor 22.

For the millimeter-wave radar sensor 21 and the lidar sensor 22, the error equation at the k-th calibration board position is:

$$e_k(\theta_{r,l}) = \sum_{p} \left\| y_{k,p}^{r} - T_{r,l}\, y_{k,p}^{l} \right\| \tag{2}$$

where $e_k(\theta_{r,l})$ is the extrinsic matching error between the millimeter-wave radar sensor 21 and the lidar, $y_{k,p}^{r}$ is the three-dimensional coordinate of the board's circle centers in the millimeter-wave radar sensor 21 coordinate system, $T_{r,l}\, y_{k,p}^{l}$ is the lidar sensor 22 measurement transformed into the image sensor 23 coordinate system, $T_{r,l}$ is the rotation and translation matrix from the millimeter-wave radar to the lidar sensor 22, and p is the circle index on the calibration board.

Converting the Euclidean representation to spherical coordinates $(r, \theta, \psi)$, formula (2) is expressed as formula (3):

$$e_k(\theta_{r,l}) = \sum_{p} \left\| \left(r, \theta, \psi_k\right)_{k,p}^{r} - T_{r,l}\, \left(r, \theta, \psi\right)_{k,p}^{l} \right\| \tag{3}$$

where $(r, \theta, \psi_k)_{k,p}^{r}$ are the spherical coordinates of the circle centers in the millimeter-wave radar sensor 21 coordinate system, with the pitch angle $\psi_k$ unknown and initialized to 0, and $(r, \theta, \psi)_{k,p}^{l}$ are the spherical coordinates of the board's circle centers in the lidar sensor 22 coordinate system.

The vertical field of view of the millimeter-wave radar sensor 21 bounds a maximum $\psi_{max}$, giving:

$$-\psi_{max} \leq \psi_k \leq \psi_{max}$$

Over the W calibration board positions, the total error between the image sensor 23 and the lidar sensor 22 is denoted $f(\theta_{c,l})$:

$$f(\theta_{c,l}) = \sum_{k=1}^{W} \delta_k^{c,l}\, e_k(\theta_{c,l})$$

and the total error between the millimeter-wave radar sensor 21 and the lidar sensor 22 is denoted $f(\theta_{r,l})$:

$$f(\theta_{r,l}) = \sum_{k=1}^{W} \delta_k^{r,l}\, e_k(\theta_{r,l})$$

where $\delta_k^{c,l}$ indicates whether the parameters between the image sensor 23 and the lidar sensor 22 exist (1 if present, 0 if absent), and $\delta_k^{r,l}$ indicates whether the parameters between the millimeter-wave radar sensor 21 and the lidar sensor 22 exist (1 if present, 0 if absent).

$f(\theta_{c,l})$ and $f(\theta_{r,l})$ are iteratively optimized by the least-squares method. When $f(\theta_{c,l})$ is minimal, the transformation matrix between the image sensor 23 and the lidar sensor 22 is obtained; when $f(\theta_{r,l})$ is minimal, the transformation matrix from the millimeter-wave radar sensor 21 to the lidar sensor 22 is obtained, but the method is not limited to this.
In a variant, the present invention may also use other existing or future joint calibration algorithms to perform joint calibration according to the respective center positioning information of the millimeter-wave radar sensor 21, the lidar sensor 22, and the image sensor 23 at the different target positions.
The technical solution adopted by the present invention to solve its technical problem is as follows: the millimeter-wave radar, the lidar, and the camera are rigidly connected so that their relative positions remain unchanged, and the specially designed joint calibration board of this method is placed facing the sensors within their common field of view. Unlike previous approaches, in which every two sensors share one set of calibration boards and multiple different boards each calibrate different sensors, the dedicated calibration board of this method combines the target features of all three sensors in one, enabling arbitrary coordinate transformation between the three sensors; a single calibration board suffices for the entire calibration workflow, thereby improving accuracy and robustness.

With the joint calibration board of this method, the data of the different sensors are associated frame by frame, extracting the four circle-center coordinates in the image, the three-dimensional coordinates of the three point-cloud center points from the lidar data, and the three-dimensional coordinates of the millimeter-wave radar corner-reflector 33 center point minus the corner-reflector 33 depth. The calibration of the three sensors is thus converted into point-to-point matching; iterative optimization over multiple sets of data with the iterative matching algorithm of the present invention finally yields a minimum error, and the transformation matrix obtained at the minimum error is the extrinsic parameters. The data of the three sensors are then verified pairwise, and the result is output once no errors remain. With the specific calibration board of this method, the three different sensors simultaneously obtain the world coordinates of the circles on the board, iterative matching with the algorithm described herein is performed, and an accurate coordinate transformation matrix is finally computed.
The present invention provides an efficient, high-precision method with little environmental dependence and a simple, fast calibration workflow, urgently needed to solve the problem of synchronously calibrating multiple sensors and cameras with a single calibration target. It offers a feasible scheme for mass production of joint calibration, effectively solves the calibration problem, reduces the steps of multi-sensor calibration, and speeds up joint calibration.

The present invention can regularly arrange 1 to N (any number of) circular reflective materials with different reflectivities on a rectangular board, cut a triangular recess in each circular reflective material, and fix a metal corner reflector inside.

In this scheme, the millimeter-wave radar, the lidar, and the camera are rigidly connected so that their relative coordinates remain unchanged; the specific calibration board is then made and placed within the common field of view of the three sensors.
The three sensors require different features. The millimeter-wave radar obtains, via the corner reflector, a stable and high-precision triangle center point, i.e., the circle center, together with the RCS value. The lidar exploits the difference between materials of different reflectivity to extract the circles by reflectivity, and obtains the three-dimensional coordinates of each circle center from the circle edges. Likewise, after the camera image is undistorted, the circle-center coordinates in the image are obtained through the Hough circle transform, and the x, y, z coordinates of the circle centers in the camera coordinate system are then obtained through ICP registration with the lidar.
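A compact sketch of the camera branch just described: undistortion followed by Hough circle detection with OpenCV. The intrinsics K and dist are assumed to come from a prior intrinsic calibration, the Hough parameters are illustrative and would need tuning to the actual board, and the subsequent ICP registration with the lidar is omitted.

```python
import cv2
import numpy as np

def image_circle_centers(img, K, dist):
    """Undistort the camera image, then locate the circle centers with the
    Hough circle transform. Returns (n, 2) pixel coordinates."""
    undistorted = cv2.undistort(img, K, dist)
    gray = cv2.cvtColor(undistorted, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                 # suppress speckle before Hough
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=80,
                               param1=120, param2=40, minRadius=20, maxRadius=120)
    return circles[0, :, :2] if circles is not None else np.empty((0, 2))
```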
After the coordinates of the common feature points of the three sensors are obtained, the next step is to match the coordinates of the corresponding target circle centers across the three sensors. During calibration, the board can be moved to W different positions, so each sensor has W target features $y = \{y_1, \ldots, y_W\}$; for the camera and the lidar each circle center carries x, y, z three-dimensional coordinate features, while for the millimeter-wave radar the features are x, y, and z = 0.

Since each sensor has a different field of view, the calibration board must be placed in the common field of view. The placement of each board is denoted $y_k^i$, where k is the board position index and i is the sensor index. In the present invention, the extrinsic matrix between two sensors is denoted $T_{1,2}$, with

$$T_{1,2} = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$

representing the rotation R and translation t from sensor 1 to sensor 2, i.e., a 4×4 homogeneous matrix that can express 6 degrees of freedom.
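A minimal sketch of one possible 6-degree-of-freedom parametrization implied here, mapping a rotation vector and a translation to the 4×4 homogeneous matrix $T_{1,2}$; the patent does not prescribe a particular parametrization, so this is an assumption.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def params_to_T(x6: np.ndarray) -> np.ndarray:
    """Map 6 free parameters (rotation vector rx, ry, rz and translation
    tx, ty, tz) to the homogeneous matrix T = [[R, t], [0, 1]]."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(x6[:3]).as_matrix()
    T[:3, 3] = x6[3:]
    return T
```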
For the k-th calibration board position, the transformation matrix error between the camera and the lidar is expressed as the Euclidean distance between each pair of corresponding circle centers:

$$e_k(\theta_{c,l}) = \sum_{p} \left\| y_{k,p}^{c} - T_{c,l}\, y_{k,p}^{l} \right\| \tag{1}$$

where $e_k(\theta_{c,l})$ is the extrinsic matching error between the camera and the lidar, $y_{k,p}^{c}$ is the three-dimensional coordinate of the board's circle centers in the camera coordinate system, $T_{c,l}\, y_{k,p}^{l}$ is the lidar measurement transformed into the camera coordinate system, $T_{c,l}$ is the rotation and translation matrix from the camera to the lidar, and p is the circle index on the board (1 to 4).
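A direct numpy transcription of formula (1), assuming the p circle centers from the two sensors are already matched one-to-one; this is an illustration, not the patent's code.

```python
import numpy as np

def e_k(T_cl, centers_cam, centers_lidar):
    """Formula (1): sum over the p circles of the Euclidean distance between
    each camera-frame center and the lidar center mapped into the camera
    frame. Inputs are matched (p, 3) arrays for one board position k."""
    ones = np.ones((len(centers_lidar), 1))
    mapped = (T_cl @ np.hstack([centers_lidar, ones]).T).T[:, :3]
    return np.linalg.norm(centers_cam - mapped, axis=1).sum()
```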
For the millimeter-wave radar and the lidar, the error equation at the k-th calibration board position is:

$$e_k(\theta_{r,l}) = \sum_{p} \left\| y_{k,p}^{r} - T_{r,l}\, y_{k,p}^{l} \right\| \tag{2}$$

where $e_k(\theta_{r,l})$ is the extrinsic matching error between the millimeter-wave radar and the lidar, $y_{k,p}^{r}$ is the three-dimensional coordinate of the board's circle centers in the millimeter-wave radar coordinate system (with z initially defaulting to 0), $T_{r,l}\, y_{k,p}^{l}$ is the lidar measurement transformed into the camera coordinate system, $T_{r,l}$ is the rotation and translation matrix from the millimeter-wave radar to the lidar, and p is the circle index on the board (1 to 4).
Formula (2) differs from formula (1) in that $y_{k,p}^{r}$ carries one more unknown than $y_{k,p}^{c}$: the value along the millimeter-wave radar's z axis. It is difficult to solve for the z value at each position directly from formula (2); therefore, the Euclidean representation is converted to spherical coordinates $(r, \theta, \psi)$, and formula (2) can then be written as:

$$e_k(\theta_{r,l}) = \sum_{p} \left\| \left(r, \theta, \psi_k\right)_{k,p}^{r} - T_{r,l}\, \left(r, \theta, \psi\right)_{k,p}^{l} \right\| \tag{3}$$

where $(r, \theta, \psi_k)_{k,p}^{r}$ are the spherical coordinates of the circle centers in the millimeter-wave radar coordinate system, with the pitch angle $\psi_k$ unknown and initialized to 0, and $(r, \theta, \psi)_{k,p}^{l}$ are the spherical coordinates of the board's circle centers in the lidar coordinate system.
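A sketch of the Cartesian-to-spherical conversion used in formula (3); the axis convention (azimuth θ in the x-y plane, pitch ψ above it) is an assumption consistent with a radar that reports z = 0.

```python
import numpy as np

def cart_to_spherical(xyz):
    """Convert (x, y, z) to the (r, theta, psi) form of formula (3):
    range r, azimuth theta, and pitch (elevation) psi. For the radar,
    whose reported z is 0, psi is replaced by the unknown psi_k."""
    x, y, z = xyz
    r = np.linalg.norm(xyz)
    theta = np.arctan2(y, x)    # azimuth in the x-y plane
    psi = np.arcsin(z / r)      # pitch above the x-y plane
    return np.array([r, theta, psi])
```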
Since the millimeter-wave radar has no pitch-angle parameter, calibrating the millimeter-wave radar can be converted into an iterative optimization problem: the vertical field of view of the millimeter-wave radar bounds a maximum $\psi_{max}$, so that

$$-\psi_{max} \leq \psi_k \leq \psi_{max}$$
The extrinsic relations of the three sensors are thereby converted into an iterative optimization problem. For the W calibration board positions, the total error between any two sensors is denoted $f(\theta_{1,2})$:

$$f(\theta_{1,2}) = \sum_{k=1}^{W} \delta_k^{1,2}\, e_k(\theta_{1,2})$$

where $\delta_k^{1,2}$ indicates whether the parameters between the two sensors exist (1 if present, 0 if absent), and $e_k(\theta_{1,2})$ is substituted from formulas (1) and (2); this equation is finally optimized iteratively by the least-squares method. When $f(\theta_{1,2})$ is minimal, the corresponding solution is the optimal solution, i.e., the corresponding rotation-translation matrix.
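A minimal least-squares sketch of minimizing $f(\theta_{1,2})$ for one sensor pair with scipy; for readability it covers the pure 6-DOF camera-lidar case and omits the extra bounded per-position pitch unknowns $\psi_k$ of the radar case. It is an illustration of the iterative step, not the patent's exact implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def calibrate_pair(centers_a, centers_b):
    """Find T_{a,b} minimizing the total center-matching error.
    centers_a / centers_b: lists of matched (p, 3) circle-center arrays,
    one array per board position k = 1..W."""
    A = np.vstack(centers_a)                  # stack all board positions
    B = np.vstack(centers_b)

    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        mapped = B @ R.T + x[3:]              # sensor-b centers in frame a
        return (A - mapped).ravel()           # stacked e_k terms

    sol = least_squares(residuals, np.zeros(6))   # iterative least squares
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    T[:3, 3] = sol.x[3:]
    return T                                  # extrinsics at minimum error
```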
In the present invention, the pose optimization among the three sensors can take any sensor's coordinate system as the reference; therefore, an iterative optimization equation can be formed between every pair of sensors, and the errors of all combinations are expressed as their sum:

$$f(\theta) = \sum_{(1,2)} f(\theta_{1,2})$$

In this equation, l denotes a closed loop, i.e., the product of the transforms around every closed loop equals the identity matrix:

$$\prod_{i=1}^{s_l} T_i = I$$

where $s_l$ is the total number of sensors in closed loop l. By adding all the errors together, a more robust optimization result is obtained.
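A small sketch of the closed-loop consistency idea: the product of the pairwise transforms around a loop should be the identity, and its deviation can serve as the extra robustness term (an illustrative check, not the patent's exact formulation).

```python
import numpy as np

def loop_consistency_error(transforms):
    """For a closed loop such as [T_cl, T_lr, T_rc], multiply the 4x4
    transforms around the loop and return the Frobenius deviation of the
    product from the identity matrix."""
    product = np.eye(4)
    for T in transforms:
        product = product @ T
    return np.linalg.norm(product - np.eye(4))
```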
Through the above equations, a minimum error is finally optimized, and the transformation matrix corresponding to the minimum error is the extrinsic parameters. The above invention can calibrate the millimeter-wave radar, the lidar, and the camera efficiently and with high precision, and enables mass production.
FIG. 5 is a schematic structural diagram of a multi-radar and camera joint calibration system according to an embodiment of the present invention. As shown in FIG. 5, this embodiment also provides a multi-radar and camera joint calibration system 5 for implementing the above multi-radar and camera joint calibration method. The multi-radar and camera joint calibration system 5 includes:

a common field of view module 51, which arranges multiple types of radar sensors and an image sensor to form a common field of view;

a coordinate detection module 52, which places a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center, and acquires the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor; and

a joint calibration module 53, which performs joint calibration according to the coordinate information.

The multi-radar and camera joint calibration system of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.

An embodiment of the present invention also provides a multi-radar and camera joint calibration device, including a processor and a memory storing instructions executable by the processor, wherein the processor is configured to execute the steps of the multi-radar and camera joint calibration method by executing the executable instructions.

As above, the multi-radar and camera joint calibration device of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.
Those skilled in the art will appreciate that aspects of the present invention may be implemented as a system, method, or program product. Therefore, aspects of the present invention may take the form of an entirely hardware implementation, an entirely software implementation (including firmware, microcode, etc.), or an implementation combining hardware and software, which may collectively be referred to herein as a "circuit", "module", or "platform".

FIG. 6 is a schematic structural diagram of a multi-radar and camera joint calibration device according to an embodiment of the present invention. An electronic device 600 according to this embodiment of the present invention is described below with reference to FIG. 6. The electronic device 600 shown in FIG. 6 is only an example and should not impose any limitation on the functions and scope of use of embodiments of the present invention.

As shown in FIG. 6, the electronic device 600 takes the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to, at least one processing unit 610, at least one storage unit 620, a bus 630 connecting the different platform components (including the storage unit 620 and the processing unit 610), a display unit 640, and so on.

The storage unit stores program code that can be executed by the processing unit 610, so that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention described in the multi-radar and camera joint calibration method section of this specification. For example, the processing unit 610 may perform the steps shown in FIG. 1.
The storage unit 620 may include a readable medium in the form of a volatile storage unit, such as a random-access storage unit (RAM) 6201 and/or a cache storage unit 6202, and may further include a read-only storage unit (ROM) 6203.

The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, including but not limited to an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a network environment.

The bus 630 may represent one or more of several types of bus structures, including a storage-unit bus or storage-unit controller, a peripheral bus, a graphics acceleration port, a processing unit, or a local bus using any of a variety of bus structures.

The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboards, pointing devices, Bluetooth devices, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any device that enables the electronic device 600 to communicate with one or more other computing devices (e.g., routers, modems, etc.). Such communication may occur through an input/output (I/O) interface 650. The electronic device 600 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 through the bus 630. It should be understood that, although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms.
An embodiment of the present invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the multi-radar and camera joint calibration method. In some possible implementations, aspects of the present invention may also take the form of a program product including program code; when the program product runs on a terminal device, the program code causes the terminal device to perform the steps according to various exemplary embodiments of the present invention described in the multi-radar and camera joint calibration method section of this specification.

As above, when the program of the computer-readable storage medium of this embodiment is executed, the world coordinates of reference points on the same calibration board can be obtained simultaneously through three different sensors, and joint calibration is then achieved through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.
FIG. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention. Referring to FIG. 7, a program product 800 for implementing the above method according to an embodiment of the present invention is described; it may take the form of a portable compact disc read-only memory (CD-ROM) including program code, and may run on a terminal device such as a personal computer. However, the program product of the present invention is not limited to this; herein, a readable storage medium may be any tangible medium containing or storing a program that can be used by, or in combination with, an instruction execution system, apparatus, or device.

The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.

A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. Program code contained on a readable medium may be transmitted over any suitable medium, including but not limited to wireless, wired, optical cable, RF, etc., or any suitable combination of the above.

Program code for carrying out the operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. Where a remote computing device is involved, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In summary, the multi-radar and camera joint calibration method, system, device, and storage medium of the present invention can simultaneously obtain, through three different sensors, the world coordinates of reference points on the same calibration board, and then achieve joint calibration through iterative matching, reducing the steps of multi-sensor calibration and speeding up joint calibration.

The above is a further detailed description of the present invention in conjunction with specific preferred embodiments, and it cannot be concluded that the specific implementation of the present invention is limited to these descriptions. For a person of ordinary skill in the art to which the present invention belongs, several simple deductions or substitutions may also be made without departing from the concept of the present invention, and all of them shall be regarded as falling within the protection scope of the present invention.

Claims (12)

  1. A multi-radar and camera joint calibration method, characterized by including the following steps:
    S100: arranging multiple types of radar sensors and an image sensor to form a common field of view;
    S200: placing a calibration board sequentially at a plurality of target positions in the common field of view, the calibration board carrying a plurality of calibration patterns, each formed of materials with different reflection coefficients arranged around the pattern center; acquiring the detection data of the radar sensors and the image sensor for the calibration board at the different target positions to obtain the coordinates of each pattern center in the respective three-dimensional coordinate system of each radar sensor and the image sensor;
    S300: performing joint calibration according to the coordinate information.
  2. The multi-radar and camera joint calibration method according to claim 1, characterized in that, in step S100, a millimeter-wave radar sensor, a lidar sensor, and an image sensor are arranged to form a common field of view;
    in step S200, the same calibration board is placed sequentially at W target positions in the common field of view, the calibration board including a reflector body, p centrally symmetric pattern reflection areas embedded flush in the reflector body, and a reflector arranged concentrically at the center of each centrally symmetric pattern reflection area and protruding from the surface of the reflector body, the reflectivities of the reflector body, the centrally symmetric pattern reflection areas, and the reflectors all differing; with the calibration board at each target position, the positioning information detected by the millimeter-wave radar sensor, the lidar sensor, and the image sensor is collected separately, the three sensors synchronously detecting the calibration board at each target position and each obtaining center positioning information for the centrally symmetric pattern reflection areas;
    in step S300, joint calibration is performed according to the respective center positioning information of the millimeter-wave radar sensor, the lidar sensor, and the image sensor at the different target positions.
  3. The multi-radar and camera joint calibration method according to claim 2, characterized in that step S100 includes rigidly connecting the millimeter-wave radar sensor, the lidar sensor, and the image sensor, with the detection directions of the three sensors converging on the common field of view.
  4. The multi-radar and camera joint calibration method according to claim 2, characterized in that step S200 includes that the calibration board at each target position appears completely in the image of the image sensor and within the detection ranges of the millimeter-wave radar sensor and the lidar sensor.
  5. The multi-radar and camera joint calibration method according to claim 2, characterized in that, in step S200, obtaining the center positioning information of the centrally symmetric pattern reflection areas by the image sensor includes the following steps:
    S211: the image sensor captures an image of the reflector plate;
    S212: image recognition is performed on the image to obtain a first pattern track of each centrally symmetric pattern reflection area;
    S213: the center of each centrally symmetric pattern reflection area is obtained from the first pattern track;
    S214: based on the three-dimensional coordinate system of the image sensor, the three-dimensional coordinates of the center in the three-dimensional coordinate system of the image sensor are obtained.
  6. The multi-radar and camera joint calibration method according to claim 5, characterized in that the reflector body is rectangular, and step S211 further includes performing rectangle correction on the image.
  7. The multi-radar and camera joint calibration method according to claim 2, characterized in that, in step S200, obtaining the center positioning information of the centrally symmetric pattern reflection areas by the lidar sensor includes the following steps:
    S221: the lidar sensor scans the calibration board;
    S222: a second pattern track distinguishing the reflector body from the centrally symmetric patterns is obtained from the difference in reflectivity between the reflector body and the centrally symmetric pattern reflection areas as measured by the lidar sensor;
    S223: the center of each centrally symmetric pattern reflection area is obtained from the second pattern track;
    S224: based on the three-dimensional coordinate system of the lidar sensor, the three-dimensional coordinates of the center in the three-dimensional coordinate system of the lidar sensor are obtained.
  8. The multi-radar and camera joint calibration method according to claim 2, characterized in that, in step S200, obtaining the center positioning information of the centrally symmetric pattern reflection areas by the millimeter-wave radar sensor includes the following steps:
    S231: the millimeter-wave radar sensor scans the shape of the reflectors on the calibration board;
    S232: based on the three-dimensional coordinate system of the millimeter-wave radar sensor, the three-dimensional coordinates of each reflector in the three-dimensional coordinate system of the millimeter-wave radar sensor are obtained;
    S233: the three-dimensional coordinates of the center in the three-dimensional coordinate system of the millimeter-wave radar sensor are obtained from the preset positional relationship between the reflector and the center of the centrally symmetric pattern reflection area.
  9. The joint calibration method for multiple radars and a camera according to claim 2, wherein step S300 comprises the following steps:
    let the target features of each sensor be y = {y_1, ..., y_W}, and let the observation of the calibration board at target position k be
    $y_k = \{y_k^c, y_k^l, y_k^r\}$
    where k is the calibration-board position index and c, l and r denote the image sensor, the LiDAR sensor and the millimeter-wave radar sensor, respectively;
    the 4×4 transformation matrix from the image sensor to the LiDAR sensor is then
    $T_{c,l} = \begin{bmatrix} R_1 & t_1 \\ 0 & 1 \end{bmatrix}$   (1)
    where R_1 and t_1 are, respectively, the 3×3 rotation matrix and the 3×1 translation matrix from the image sensor to the LiDAR sensor;
    for the k-th calibration-board position, the transformation-matrix error between the image sensor and the LiDAR sensor is expressed by the Euclidean distance between each pair of corresponding circle centers:
    $e_k^{c,l} = \sum_{p} \left\| y_{k,p}^{c} - T_{c,l}\, y_{k,p}^{l} \right\|_2$   (2)
    where e_k^{c,l} denotes the extrinsic-parameter matching error of the image sensor and the LiDAR sensor at target position k, y_{k,p}^c denotes the three-dimensional coordinates of the center of circle p of the calibration board in the image-sensor coordinate system when the board is at target position k, T_{c,l} y_{k,p}^l denotes the three-dimensional coordinates of circle p at target position k transformed from the LiDAR-sensor coordinate system into the image-sensor coordinate system, T_{c,l} denotes the rotation-and-translation matrix from the image sensor to the LiDAR sensor, and p is the index of a circle on the calibration board;
    the 4×4 transformation matrix from the millimeter-wave radar sensor to the LiDAR sensor is
    $T_{r,l} = \begin{bmatrix} R_2 & t_2 \\ 0 & 1 \end{bmatrix}$
    where R_2 and t_2 are, respectively, the 3×3 rotation matrix and the 3×1 translation matrix from the millimeter-wave radar sensor to the LiDAR sensor;
    for the millimeter-wave radar sensor and the LiDAR sensor, the error equation at the k-th calibration-board position is:
    $e_k^{r,l} = \sum_{p} \left\| y_{k,p}^{r} - T_{r,l}\, y_{k,p}^{l} \right\|_2$   (3)
    where e_k^{r,l} denotes the extrinsic-parameter matching error of the millimeter-wave radar sensor and the LiDAR sensor at target position k, y_{k,p}^r denotes the three-dimensional coordinates of the center of circle p of the calibration board in the millimeter-wave-radar coordinate system when the board is at target position k, T_{r,l} y_{k,p}^l denotes the three-dimensional coordinates of circle p at target position k transformed from the LiDAR-sensor coordinate system, T_{r,l} denotes the rotation-and-translation matrix from the millimeter-wave radar sensor to the LiDAR sensor, and p is the index of a circle on the calibration board;
    by converting the Euclidean representation to a spherical coordinate system, formula (3) expressed in (r, θ, ψ) coordinates becomes:
    $e_k^{r,l} = \sum_{p} \left\| (r,\theta,\psi)_{k,p}^{r} - T_{r,l}\,(r,\theta,\psi)_{k,p}^{l} \right\|$   (4)
    where (r, θ, ψ)_{k,p}^r are the spherical coordinates of the circle center in the millimeter-wave-radar coordinate system when the calibration board is at target position k, the pitch angle ψ_k being an unknown whose initial value is set to 0, and (r, θ, ψ)_{k,p}^l are the spherical coordinates of the circle center of the calibration board in the LiDAR-sensor coordinate system;
    the vertical field-of-view angle of the millimeter-wave radar sensor bounds the pitch angle by a maximum ψ_max, which gives:
    $-\psi_{\max} \le \psi_k \le \psi_{\max}$   (5)
    over the W calibration-board positions, the total error between the image sensor and the LiDAR sensor is expressed by f(θ_{c,l}):
    $f(\theta_{c,l}) = \sum_{k=1}^{W} \delta_k^{c,l}\, e_k^{c,l}$   (6)
    and the total error from the millimeter-wave radar sensor to the LiDAR sensor is expressed by f(θ_{r,l}):
    $f(\theta_{r,l}) = \sum_{k=1}^{W} \delta_k^{r,l}\, e_k^{r,l}$   (7)
    where δ_k^{c,l} indicates whether the parameters between the image sensor and the LiDAR sensor exist at target position k, being 1 if they exist and 0 if they do not, and δ_k^{r,l} indicates whether the parameters between the millimeter-wave radar sensor and the LiDAR sensor exist, being 1 if they exist and 0 if they do not;
    f(θ_{c,l}) and f(θ_{r,l}) are iteratively optimized by the least-squares method; when f(θ_{c,l}) is minimal, the transformation matrix between the image sensor and the LiDAR sensor is obtained, and when f(θ_{r,l}) is minimal, the transformation matrix from the millimeter-wave radar sensor to the LiDAR sensor is obtained.
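For one sensor pair, the least-squares iteration of claim 9 can be sketched as follows (camera to LiDAR shown; the radar branch additionally carries the bounded pitch unknowns ψ_k of formulas (4) and (5), omitted here). The sketch assumes scipy, a Rodrigues-vector rotation parameterization, and (W, p, 3) arrays of matched circle centers where NaN rows stand in for the 0/1 existence indicators δ; all names are illustrative, not taken from the patent.

```python
# Hedged sketch of the claim 9 least-squares optimization for one sensor pair.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(theta, src, dst):
    # theta[:3]: Rodrigues rotation vector, theta[3:]: translation.
    R = Rotation.from_rotvec(theta[:3]).as_matrix()
    pred = src @ R.T + theta[3:]               # src centers mapped into dst frame
    res = np.linalg.norm(pred - dst, axis=-1)  # per-center Euclidean distances
    return res[~np.isnan(res)]                 # indicator: skip unobserved boards

def calibrate_pair(centers_src, centers_dst):
    """Return the 4x4 transform minimizing the total error f(theta)."""
    src = centers_src.reshape(-1, 3)
    dst = centers_dst.reshape(-1, 3)
    sol = least_squares(residuals, x0=np.zeros(6), args=(src, dst))
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    T[:3, 3] = sol.x[3:]
    return T
```

Running this once per pair (camera to LiDAR, then radar to LiDAR) yields estimates of the two transformation matrices T_{c,l} and T_{r,l}.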
  10. A joint calibration system for multiple radars and a camera, characterized in that it is used to implement the joint calibration method for multiple radars and a camera according to claim 1, and comprises:
    a common field-of-view module, which arranges multiple radar sensors and an image sensor so that they jointly form a common field-of-view region;
    a coordinate detection module, which places a calibration board successively at a plurality of target positions in the common field-of-view region, the calibration board being provided with a plurality of calibration patterns, each calibration pattern consisting of materials of different reflection coefficients arranged around the center of the pattern, and which obtains, from the detection data of the radar sensors and the image sensor for the calibration board at each target position, the coordinate information of the center of each calibration pattern in the respective three-dimensional coordinate systems of the radar sensors and the image sensor;
    a joint calibration module, which performs joint calibration according to the coordinate information.
  11. A joint calibration device for multiple radars and a camera, characterized by comprising:
    a processor; and
    a memory storing instructions executable by the processor;
    wherein the processor is configured to execute, via the executable instructions, the steps of the joint calibration method for multiple radars and a camera according to any one of claims 1 to 9.
  12. A computer-readable storage medium for storing a program, characterized in that, when the program is executed, the steps of the joint calibration method for multiple radars and a camera according to any one of claims 1 to 9 are implemented.
PCT/CN2021/108586 2021-01-22 2021-07-27 Multi-radar and camera joint calibration method, system and device, and storage medium WO2022156176A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21920557.2A EP4283328A1 (en) 2021-01-22 2021-07-27 Multi-radar and camera joint calibration method, system and device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110089938.5 2021-01-22
CN202110089938.5A CN112859022B (zh) 2021-01-22 2021-01-22 Multi-radar and camera joint calibration method, system and device, and storage medium

Publications (1)

Publication Number Publication Date
WO2022156176A1 true WO2022156176A1 (zh) 2022-07-28

Family

ID=76008033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/108586 WO2022156176A1 (zh) Multi-radar and camera joint calibration method, system and device, and storage medium

Country Status (3)

Country Link
EP (1) EP4283328A1 (zh)
CN (1) CN112859022B (zh)
WO (1) WO2022156176A1 (zh)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112859022B (zh) * 2021-01-22 2023-08-18 上海西井科技股份有限公司 多种雷达和相机联合标定方法、系统、设备及存储介质
CN113205563B (zh) * 2021-06-03 2022-11-18 河南科技大学 一种自动驾驶传感器联合标定靶和标定方法
CN113359097B (zh) * 2021-06-21 2022-09-02 北京易航远智科技有限公司 一种毫米波雷达和相机联合标定的方法
CN113484830A (zh) * 2021-06-22 2021-10-08 上海智能网联汽车技术中心有限公司 复合标定板及标定方法
CN113359117A (zh) * 2021-06-29 2021-09-07 上海智能网联汽车技术中心有限公司 一种静态标定系统及方法
CN113487686A (zh) * 2021-08-02 2021-10-08 固高科技股份有限公司 一种多目相机的标定方法、装置、多目相机和存储介质
CN113589263B (zh) * 2021-08-06 2023-10-31 北京易航远智科技有限公司 一种多个同源传感器联合标定方法及系统
CN113744348A (zh) * 2021-08-31 2021-12-03 南京慧尔视智能科技有限公司 一种参数标定方法、装置及雷视融合检测设备
CN113888652A (zh) * 2021-10-22 2022-01-04 智能移动机器人(中山)研究院 4d毫米波雷达与热感相机的内外参自动标定技术
CN116897300A (zh) * 2022-02-10 2023-10-17 华为技术有限公司 一种标定方法及装置
CN114509776B (zh) * 2022-04-08 2022-07-29 探维科技(北京)有限公司 硬件级图像融合系统的同步测量装置、方法、设备及介质
CN114942039B (zh) * 2022-05-18 2024-05-07 清华大学深圳国际研究生院 一种多传感器联合标定装置及标定方法
CN218299035U (zh) * 2022-05-27 2023-01-13 华为技术有限公司 标定板以及标定控制设备
CN116416319B (zh) * 2022-11-17 2023-11-24 南京理工大学 面向智能驾驶多类型传感器标定的一次性联合标定方法
CN115598626B (zh) * 2022-12-14 2023-03-14 成都量芯集成科技有限公司 一种激光测距仪偏差校准装置及方法
CN117848403A (zh) * 2024-03-08 2024-04-09 宁德时代新能源科技股份有限公司 标定件和使用标定件标定检测系统的方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN209460400U (zh) 2019-02-01 2019-10-01 成都信息工程大学 Calibration board for extrinsic parameter calibration of a LiDAR and a camera
CN110097601B (zh) 2019-04-17 2021-01-19 清华大学深圳研究生院 Calibration board and calibration method for field-of-view calibration of a multimodal acquisition system
CN110322519B (zh) 2019-07-18 2023-03-31 天津大学 Calibration apparatus and calibration method for joint calibration of a LiDAR and a camera
CN111127561B (zh) 2019-12-05 2023-03-24 农芯(南京)智慧农业研究院有限公司 Multi-view image calibration apparatus and method
CN111366912B (zh) 2020-03-10 2021-03-16 上海西井信息科技有限公司 Laser sensor and camera calibration method, system, device and storage medium
CN111754583B (zh) 2020-05-29 2024-05-14 江苏大学 Automated method for joint extrinsic calibration of a vehicle-mounted 3D LiDAR and a camera

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006011570A (ja) 2004-06-23 2006-01-12 Daihatsu Motor Co Ltd Camera calibration method and camera calibration device
KR20180133745A (ko) 2017-06-07 2018-12-17 국방과학연구소 System for identifying a flying object using a LiDAR sensor and a pan-tilt-zoom camera, and control method therefor
CN107609522A (zh) 2017-09-19 2018-01-19 东华大学 Information-fusion vehicle detection system based on LiDAR and machine vision
CN108399643A (zh) 2018-03-15 2018-08-14 南京大学 Extrinsic parameter calibration system and method between a LiDAR and a camera
CN110390695A (zh) 2019-06-28 2019-10-29 东南大学 ROS-based fusion calibration system and calibration method for a LiDAR and a camera
CN211087307U (zh) 2019-12-11 2020-07-24 四川深瑞视科技有限公司 Target and system for camera calibration
CN111413983A (zh) 2020-04-08 2020-07-14 江苏盛海智能科技有限公司 Environment perception method for an unmanned vehicle, and control terminal
CN111427020A (zh) 2020-06-11 2020-07-17 交通运输部公路科学研究所 Joint calibration method, apparatus and system for environmental-information data acquisition devices
CN112241007A (zh) 2020-07-01 2021-01-19 北京新能源汽车技术创新中心有限公司 Calibration method and arrangement structure for autonomous-driving environment-perception sensors, and vehicle
CN112859022A (zh) 2021-01-22 2021-05-28 上海西井信息科技有限公司 Multi-radar and camera joint calibration method, system and device, and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113702967A (zh) 2021-09-24 2021-11-26 中国北方车辆研究所 Guided-vehicle target recognition and tracking method for an unmanned ground platform, and on-board system
CN115327532A (zh) 2022-08-04 2022-11-11 澳克诺(上海)汽车科技有限公司 Method and system for sensor data fusion
CN115661270A (zh) 2022-12-22 2023-01-31 陕西欧卡电子智能科技有限公司 Online calibration method for a millimeter-wave radar and a fisheye camera
CN116819469A (zh) 2023-08-28 2023-09-29 南京慧尔视智能科技有限公司 Multi-radar target position synchronization method, apparatus, device and storage medium
CN116819469B (zh) 2023-08-28 2023-11-10 南京慧尔视智能科技有限公司 Multi-radar target position synchronization method, apparatus, device and storage medium

Also Published As

Publication number Publication date
CN112859022A (zh) 2021-05-28
EP4283328A1 (en) 2023-11-29
CN112859022B (zh) 2023-08-18

Similar Documents

Publication Publication Date Title
WO2022156176A1 (zh) Multi-radar and camera joint calibration method, system and device, and storage medium
CN109521403B (zh) Parameter calibration method and apparatus for a multi-line LiDAR, device and readable medium
US11480443B2 (en) Method for calibrating relative pose, device and medium
CN111508021B (zh) Pose determination method and apparatus, storage medium and electronic device
CN109598765B (zh) Joint extrinsic calibration method for a monocular camera and a millimeter-wave radar based on a spherical calibration object
WO2022142992A1 (zh) Fusion positioning method, apparatus and device, and computer-readable storage medium
EP3506162A1 (en) Method and apparatus for determining matching relationship between point cloud data
CN110988849B (zh) Calibration method and apparatus for a radar system, electronic device and storage medium
CN111968121B (zh) 3D point-cloud scene segmentation method based on instance embedding and semantic fusion
CN110533726B (zh) Normal-vector estimation and correction method for 3D pose points in a LiDAR scene
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
US11367204B1 (en) Multi-sensor spatial data auto-synchronization system and method
Yin et al. SLAM-based self-calibration of a binocular stereo vision rig in real-time
Zhou et al. Review on millimeter-wave radar and camera fusion technology
WO2023284705A1 (zh) LiDAR point-cloud clustering method and apparatus, LiDAR and vehicle
CN115267718A (zh) Implementation method for 360° surround-view radar detection based on point-cloud stitching
Lee et al. Extrinsic calibration of multiple 3D LiDAR sensors by the use of planar objects
CN112465908B (zh) Object positioning method and apparatus, terminal device and storage medium
CN116844124A (zh) 3D object-detection-box annotation method and apparatus, electronic device and storage medium
Nie et al. Automatic extrinsic calibration of dual LiDARs with adaptive surface normal estimation
Song et al. Registration for 3D LiDAR datasets using pyramid reference object
CN114792343B (zh) Calibration method for an image acquisition device, and method and apparatus for acquiring image data
WO2022007465A1 (zh) Method for determining covariance and related apparatus
Sulaj et al. Examples of real-time UAV data processing with cloud computing
CN114140608A (zh) Photovoltaic panel marking method and apparatus, electronic device and storage medium

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21920557; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 2021920557; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2021920557; Country of ref document: EP; Effective date: 20230822)