WO2021098439A1 - Sensor calibration method and apparatus, storage medium, calibration system, and program product - Google Patents

Sensor calibration method and apparatus, storage medium, calibration system, and program product

Info

Publication number
WO2021098439A1
Authority
WO
WIPO (PCT)
Prior art keywords: camera, radar, calibration, point cloud data
Application number
PCT/CN2020/122559
Other languages
English (en)
French (fr)
Inventor
马政
闫国行
刘春晓
石建萍
Original Assignee
商汤集团有限公司
Application filed by 商汤集团有限公司 (SenseTime Group Limited)
Priority to JP2021530296A (published as JP2022510924A)
Publication of WO2021098439A1
Priority to US17/747,271 (published as US20220276360A1)

Classifications

    • G01S7/40: Means for monitoring or calibrating (details of radar systems, G01S13/00)
    • G01S7/497: Means for monitoring or calibrating (details of lidar systems, G01S17/00)
    • G01S7/4972: Alignment of sensor
    • G01S7/4082: Monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
    • G01S13/867: Combination of radar systems with cameras
    • G01S13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/10028: Range image; depth image; 3D point clouds
    • G06T2207/30204: Marker
    • G06T2207/30208: Marker matrix

Definitions

  • the present disclosure relates to the field of computer vision, in particular to sensor calibration methods and devices, storage media, calibration systems, and program products.
  • machines and devices often include a combination of a radar and a camera; based on the data provided by the radar and the camera, the machine can learn to perceive its surrounding environment.
  • the accuracy of the external parameters between the radar and the camera determines the accuracy of this environment perception.
  • the present disclosure provides a sensor calibration method and device, storage medium, and calibration system to realize the joint calibration of radar and camera.
  • a sensor calibration method, where the sensor includes a radar and a camera and a first calibration board is located within the common field of view of the radar and the camera; the method includes: collecting multiple first images by the camera, where the pose information of the first calibration board differs across the multiple first images; acquiring the pre-calibrated first internal parameter of the camera; determining, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board in each pose relative to the camera; acquiring multiple sets of radar point cloud data of the first calibration board in each pose; and determining the target external parameters between the radar and the camera according to the external parameters of the first calibration board in each pose relative to the camera and the multiple sets of radar point cloud data.
  • a sensor calibration device, where the sensor includes a radar and a camera and a first calibration board is located within the common field of view of the radar and the camera; the device includes:
  • an acquisition module, configured to collect multiple first images through the camera, where the pose information of the first calibration board differs across the multiple first images;
  • a first determination module, configured to acquire the pre-calibrated first internal parameter of the camera and, according to the first internal parameter and the multiple first images, determine the external parameters of the first calibration board in each pose relative to the camera;
  • a second determination module, configured to acquire multiple sets of radar point cloud data of the first calibration board in each pose, and to determine the target external parameters between the radar and the camera according to the external parameters of the first calibration board in each pose relative to the camera and the multiple sets of radar point cloud data.
  • a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the sensor calibration method of any one of the above first aspects.
  • a sensor calibration device, including: a processor; and a memory for storing instructions executable by the processor; where the processor is configured to call the executable instructions to implement the sensor calibration method of any one of the first aspects.
  • a calibration system including a camera, a radar, and a first calibration board, the first calibration board being located in the common field of view of the camera and the radar, where the pose information of the first calibration board differs at different collection moments.
  • a computer program product including computer-readable code which, when run on a device, prompts the device to execute the sensor calibration method of any one of the first aspects.
  • in this way, the pre-calibrated first internal parameters of the camera can be obtained, and the external parameters of the first calibration board relative to the camera can be determined from them; combined with the radar point cloud data, these yield the relative position and attitude relationship between the camera and the radar, so the calibration of the sensor can be achieved according to the first internal parameters of the pre-calibrated camera.
  • Fig. 1 is a flow chart of a method for calibrating a sensor according to an exemplary embodiment of the present disclosure.
  • Fig. 2 is a schematic diagram showing a common field of view according to an exemplary embodiment of the present disclosure.
  • Fig. 3 is a schematic diagram showing a calibration board with different postures according to an exemplary embodiment of the present disclosure.
  • Fig. 4 is a schematic diagram showing a radar transmission according to an exemplary embodiment of the present disclosure.
  • Fig. 5 is a flow chart of a method for calibrating a sensor according to another exemplary embodiment of the present disclosure.
  • Fig. 6 is a schematic diagram showing a field of view of a camera according to an exemplary embodiment of the present disclosure.
  • Fig. 7 is a flow chart of a method for calibrating a sensor according to another exemplary embodiment of the present disclosure.
  • Fig. 8 is a schematic diagram showing a second image including a second calibration plate according to an exemplary embodiment of the present disclosure.
  • Fig. 9 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 10A is a schematic diagram showing a scene of projecting a preset point according to an exemplary embodiment of the present disclosure.
  • Fig. 10B is a schematic diagram of a scene of determining coordinate pairs that have a corresponding relationship according to an exemplary embodiment of the present disclosure.
  • Fig. 11 is a flow chart showing a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 12 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 13 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 14 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 15 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 16 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 17 is a schematic diagram of determining multiple first vectors according to an exemplary embodiment of the present disclosure.
  • Fig. 18 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 19 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 20 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
  • Fig. 21A is a schematic diagram of projecting the first calibration plate by radar according to an exemplary embodiment of the present disclosure.
  • Fig. 21B is another schematic diagram of projecting the first calibration plate by radar according to an exemplary embodiment of the present disclosure.
  • Fig. 22 is a schematic diagram showing deployment of radars and cameras on a vehicle according to an exemplary embodiment of the present disclosure.
  • Fig. 23 is a schematic diagram showing the positions of the first calibration plate and the second calibration plate corresponding to the radar and the camera deployed on the vehicle according to an exemplary embodiment of the present disclosure.
  • Fig. 24 is a block diagram of a sensor calibration device according to an exemplary embodiment of the present disclosure.
  • Fig. 25 is a block diagram of a sensor calibration device according to another exemplary embodiment of the present disclosure.
  • the terms first, second, third, etc. may be used in this disclosure to describe various information, but the information should not be limited by these terms; these terms are only used to distinguish information of the same type from each other.
  • first information may also be referred to as second information, and similarly, second information may also be referred to as first information.
  • the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
  • the present disclosure provides a sensor calibration method.
  • sensor calibration refers to calibrating the internal parameters (intrinsic parameters) and external parameters (extrinsic parameters) of the sensor.
  • the internal parameters of the sensor are the parameters that reflect the characteristics of the sensor itself. After the sensor leaves the factory, the internal parameters are theoretically unchanged, but in actual use they may change. Take the camera as an example: as it is used, changes in the positional relationship between the camera's parts will cause its internal parameters to change.
  • a calibrated internal parameter is therefore usually only an approximation of the real internal parameter, not its true value.
  • the following takes the sensor including camera and radar as an example to illustrate the internal parameters of the sensor.
  • the internal parameters of the camera are the parameters that reflect the characteristics of the camera itself, and may include, but are not limited to, one or more of the following: u0, v0, Sx, Sy, f, and r.
  • u0 and v0 respectively represent the horizontal and vertical offsets, in pixels, between the origin of the pixel coordinate system and the origin of the camera coordinate system (the principal point).
  • Sx and Sy are the numbers of pixels per unit length in the horizontal and vertical directions, respectively; the unit length may be millimeters.
  • f is the focal length of the camera.
  • r is the distance of a pixel from the center of the imager, caused by image distortion.
  • the center of the imager is the focal center of the camera.
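The parameters above can be assembled into the standard 3x3 camera intrinsic matrix, where fx = f * Sx and fy = f * Sy convert the focal length into pixel units. A minimal numpy sketch with assumed, illustrative values (the patent does not prescribe any specific numbers):

```python
import numpy as np

# Illustrative values only (assumptions, not from the patent).
f = 4.0                  # focal length in mm
S_x, S_y = 250.0, 250.0  # pixels per mm, horizontal and vertical
u0, v0 = 320.0, 240.0    # principal point in pixels

# Intrinsic matrix K: maps camera-frame points to homogeneous pixels.
K = np.array([[f * S_x, 0.0,     u0],
              [0.0,     f * S_y, v0],
              [0.0,     0.0,     1.0]])

# Project a point given in the camera coordinate system (metres).
P_cam = np.array([0.1, -0.05, 2.0])
uv = K @ P_cam
u, v = uv[0] / uv[2], uv[1] / uv[2]   # pixel coordinates
```

With these values the point projects to (u, v) = (370, 215), i.e. right of and below the principal point as expected.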
  • the camera described in the present disclosure may be a camera, a video camera, or other equipment with a photographing function, which is not limited in the present disclosure.
  • the internal parameters of the radar are the parameters that reflect the characteristics of the radar itself, and may include, but are not limited to, one or more of the following: the power and type of the transmitter, the sensitivity and type of the receiver, the parameters and type of the antenna, the number and type of displays, etc.
  • the radar mentioned in the present disclosure may be a light detection and ranging (LiDAR) system or a radio radar, which is not limited in the present disclosure.
  • the external parameters of the sensor refer to the parameters of the conversion relationship between the position of the object in the world coordinate system and the position of the object in the sensor coordinate system. It should be noted that when multiple sensors are included, the sensor external parameters also include parameters for reflecting the conversion relationship between the multiple sensor coordinate systems. The following also takes the sensor including camera and radar as an example to illustrate the external parameters of the sensor.
  • the external parameters of the camera refer to the parameters used to transform a point from the world coordinate system to the camera coordinate system.
  • the external parameters of the calibration board relative to the camera can be used to reflect the change parameters of the position and/or posture required by the calibration board in the world coordinate system to be converted to the camera coordinate system.
  • the external parameters of the camera may include, but are not limited to, one or a combination of the following parameters: the position and/or posture change parameters required for the conversion of the calibration board in the world coordinate system to the camera coordinate system, etc.
  • the distortion parameters include radial distortion coefficients and tangential distortion coefficients. Radial distortion and tangential distortion are, respectively, the positional deviations of image pixels along the radial direction and the tangential direction about the distortion center, which cause the image to be deformed.
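A common way to express these deviations is the Brown model with two radial coefficients (k1, k2) and two tangential coefficients (p1, p2); this specific model is an illustrative assumption, since the patent does not fix one. In normalized image coordinates:

```python
def distort(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to a
    normalized image point (x, y); returns the distorted point."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d
```

With all coefficients zero the mapping is the identity, i.e. an undistorted image.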
  • the change parameters of the position and/or posture required for the calibration board in the world coordinate system to be converted to the camera coordinate system may include a rotation matrix R and a translation matrix T.
  • the rotation matrix R is the rotation angle parameter of the calibration board in the world coordinate system relative to the three coordinate axes of x, y, and z when the calibration board in the world coordinate system is converted to the camera coordinate system.
  • the translation matrix T contains the translation parameters of the origin when the calibration board in the world coordinate system is converted to the camera coordinate system.
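Converting a calibration-board point from the world coordinate system to the camera coordinate system is then P_cam = R @ P_world + T. A sketch with illustrative values (a 90-degree rotation about z and a 1.5 m translation along z, chosen only for the example):

```python
import numpy as np

theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([0.0, 0.0, 1.5])

P_world = np.array([1.0, 0.0, 0.0])   # a calibration-board corner
P_cam = R @ P_world + T               # the same point in camera coordinates
```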
  • the external parameters of the radar refer to the parameters used to convert a point from the world coordinate system to the radar coordinate system.
  • the external parameters of the calibration board relative to the radar can be used to reflect the change parameters of the position and/or attitude required for the calibration board in the world coordinate system to be converted to the radar coordinate system.
  • the target external parameters between the camera and the radar refer to the parameters used to reflect the conversion relationship between the radar coordinate system and the camera coordinate system.
  • the external parameters between the camera and the radar can reflect the changes in position, attitude, etc. of the radar coordinate system relative to the camera coordinate system.
  • the sensor may include a camera and a radar; calibrating the sensor then refers to calibrating one of, or a combination of, the internal parameters of the camera, the internal parameters of the radar, and the target external parameters between the camera and the radar.
  • the above internal parameters and/or external parameters can be determined by means of a calibration board; for example, the target external parameters between the camera and the radar can be determined by means of the external parameters of the calibration board relative to the camera and the external parameters of the calibration board relative to the radar.
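As a sketch of that composition idea (with illustrative identity rotations and made-up translations, not the patent's actual estimation procedure): if the board-to-camera and board-to-radar poses are written as 4x4 homogeneous transforms, the radar-to-camera transform follows by going radar to board to camera:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix R and translation t into a 4x4 transform."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# Assumed poses of the same calibration board seen from each sensor.
T_cam_board = to_homogeneous(np.eye(3), np.array([0.0, 0.0, 2.0]))
T_radar_board = to_homogeneous(np.eye(3), np.array([0.5, 0.0, 2.0]))

# radar -> camera transform: radar frame -> board frame -> camera frame.
T_cam_radar = T_cam_board @ np.linalg.inv(T_radar_board)
```

Here the result is a pure translation of (-0.5, 0, 0): the radar origin sits half a metre to the side of the camera origin.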
  • the actual calibrated parameters may include, but are not limited to, the conditions listed above.
  • the calibration method of the sensor may include the following steps.
  • in step 101, a plurality of first images are collected by the camera, where the pose information of the first calibration board differs across the plurality of first images.
  • the radar may be a lidar that detects characteristic quantities such as the position and speed of a target by emitting a laser beam, or a millimeter-wave radar that works in a millimeter-wave frequency band, or the like.
  • the field of view is the range that can be covered by the light, electromagnetic waves, etc. emitted by the sensor when the position is unchanged.
  • taking a sensor that includes a radar as an example, the field of view refers to the range that the laser beam or electromagnetic wave emitted by the radar can cover; taking a sensor that includes a camera as an example, the field of view refers to the range that can be captured by the camera.
  • the first calibration board 230 is located in the range of the common field of view 231 of the radar 210 and the camera 220, for example, as shown in FIG. 2.
  • the range of the common field of view 231 refers to the part where the ranges covered by the individual sensor elements overlap each other, that is, the overlapping part (indicated by the dashed line in the figure) of the range covered by the radar 210 (the radar field of view 211 in the figure) and the range photographed by the camera 220 (the camera field of view 221 in the figure).
  • the first calibration plate may be a circular, rectangular or square array plate with a fixed pitch pattern.
  • a rectangular array of black and white grid plates can be used.
  • the pattern of the calibration plate can also include other regular patterns, or patterns that are irregular but have characteristic parameters such as feature point sets, characteristic edges, etc.
  • the shape and pattern of the calibration plate are not limited here.
  • the number of first images collected by the camera may be multiple, for example, more than 20.
  • the poses of the first calibration board in the collected multiple first images may be different; that is, at least some of the multiple first images show different poses of the first calibration board, such as different positions and/or attitudes.
  • the first calibration plate has three-dimensional attitude changes of pitch angle, roll angle, and yaw angle.
  • first images may be collected when the first calibration board is in different positions and/or attitudes; that is, the pose information of the first calibration board in different first images may be the same or different, but at least two first images include different pose information of the first calibration board.
  • each first image needs to include a complete first calibration board.
  • the number of the first images may be m, and the number of poses of the first calibration plate may be n. Both m and n are integers greater than or equal to 2.
  • the pose information includes information used to reflect the pose of the first calibration board in the three-dimensional space.
  • the pose information of the first calibration board shown in FIG. 3 may be the attitude change in at least one of the three dimensions of the first calibration board's pitch angle, roll angle, and yaw angle.
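Such an attitude can be represented as a rotation matrix composed from the pitch, roll, and yaw angles; the axis assignment and multiplication order below are one common convention, assumed purely for illustration:

```python
import numpy as np

def euler_to_rotation(pitch, roll, yaw):
    """Compose a rotation matrix from pitch (about x), roll (about y),
    and yaw (about z), all in radians. Convention is an assumption."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(roll), np.sin(roll)
    cz, sz = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx
```

Any such matrix is orthogonal with determinant 1, so it changes only the board's attitude, never its shape.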
  • the first calibration plate can be in a static state.
  • a bracket can be used to fix the first calibration plate.
  • the pose information also includes position information.
  • the multiple first images collected may include images of the first calibration board in different poses at various distances (i.e., small distances, moderate distances, large distances, etc.).
  • so that the laser emitted by the radar can cover the complete first calibration board, the first calibration board is usually kept relatively far from the radar when it is deployed.
  • in the process of collecting images of first calibration boards deployed at different distances: in response to the distance d1 of the first calibration board from the camera being small, for example d1 less than a distance threshold D1, a plurality of first images including the first calibration board in different postures are collected; in response to the distance d1 being large, multiple additional first images including the first calibration board in different postures may be collected; and in response to the distance d1 being moderate, for example between the two distance thresholds, that is, D1 ≤ d1 ≤ D2, multiple additional first images including the first calibration board in different postures may also be collected. In this way, first images taken at various distances between the first calibration board and the camera can be obtained, and the first images at different distances have different pose information.
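The collection scheme above amounts to partitioning the board-to-camera distance by the two thresholds; a trivial sketch with hypothetical threshold values D1 and D2 (the patent does not specify any numbers):

```python
# Hypothetical thresholds in metres, for illustration only.
D1, D2 = 2.0, 5.0

def distance_band(d1):
    """Classify the board-camera distance d1 into the three bands
    used when collecting first images at various distances."""
    if d1 < D1:
        return "near"
    elif d1 <= D2:
        return "moderate"
    return "far"
```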
  • a complete first calibration plate may be included in the multiple first images.
  • the ratio of the area of the first calibration board to the area of the first image differs: when the distance d1 is larger, the area of the first calibration board in the first image is relatively small, and when the distance d1 is smaller, the area of the first calibration board in the first image is relatively large.
  • in step 102, a pre-calibrated first internal parameter of the camera is acquired, and according to the first internal parameter and the plurality of first images, the external parameters of the first calibration board in each pose relative to the camera are determined.
  • the distortion parameters need to be determined more accurately.
  • the distortion parameters have a great influence on the internal parameters of the camera. Therefore, if the internal parameters of the camera are calibrated directly based on multiple first images, the calibration results may not be accurate enough.
  • the first internal parameters of the pre-calibrated camera can be directly obtained; according to the first internal parameters of the camera and the multiple first images collected by the camera, the external parameters of the first calibration boards of different poses relative to the camera can be determined.
  • the first internal parameter of the camera is the internal parameter of the camera obtained by calibrating the camera when the sensor is calibrated for the first time.
  • the camera collects multiple second images including the complete second calibration plate, and calibrates the first internal parameters of the camera according to the multiple second images.
  • the pose information of the second calibration plate in the plurality of second images is different.
  • the second calibration board may be closer to the camera and close to the edge of the camera's field of view, so that the determined first internal reference of the camera is more accurate than the internal reference of the camera calibrated using multiple first images.
  • the first internal parameter of the pre-calibrated camera can be directly obtained.
  • methods such as Zhang Zhengyou calibration method can be used to calibrate the first internal parameter of the camera.
  • the external parameters of the first calibration plate relative to the camera are determined, including the rotation matrix R and the translation matrix T.
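Zhang's method begins by estimating, for each image, the homography between the planar calibration board and its projection; the intrinsics and the per-pose (R, T) are then recovered from those homographies. A minimal, illustrative sketch of just the homography step (direct linear transform), with synthetic point correspondences rather than real detections:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (lists of (x, y),
    at least 4 non-degenerate correspondences) by the direct linear
    transform: the null vector of the stacked constraint matrix."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Board corners and their images under a pure translation by (2, 3).
H = homography_dlt([(0, 0), (1, 0), (0, 1), (1, 1)],
                   [(2, 3), (3, 3), (2, 4), (3, 4)])
```

Here the recovered H is (up to numerical noise) the translation matrix [[1, 0, 2], [0, 1, 3], [0, 0, 1]].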
  • in step 103, multiple sets of radar point cloud data of the first calibration board in each pose are acquired, and the target external parameters between the radar and the camera are determined according to the external parameters of the first calibration board in each pose relative to the camera and the multiple sets of radar point cloud data.
  • the radar point cloud data is data including multiple radar points generated when the laser or electromagnetic waves emitted by the radar sweep over the first calibration boards of different poses.
  • the radar point cloud data includes the point cloud data obtained based on the complete first calibration board.
  • the edges of the first calibration board 420 are not parallel to the laser or electromagnetic waves emitted by the radar 410 and may form a certain angle with them, to ensure that the laser or electromagnetic waves emitted by the radar 410 pass over every edge of the first calibration board 420, so that the target radar point cloud data matching the first calibration board can later be better determined from the radar point cloud data.
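One common way to isolate the calibration board in the radar point cloud (an illustrative choice, not necessarily the patent's method) is to fit a plane to the candidate points; a least-squares sketch using the SVD, run here on noise-free synthetic board points:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through an Nx3 point cloud: returns the
    centroid and the unit normal (smallest right singular vector)."""
    centroid = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - centroid)
    normal = Vt[-1]
    return centroid, normal

# Synthetic board points lying exactly on the plane z = 0.5.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, 50),
                       rng.uniform(-1, 1, 50),
                       np.full(50, 0.5)])
centroid, normal = fit_plane(pts)
```

For these points the recovered normal is (0, 0, ±1) and the centroid sits on the plane, as expected.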
  • the target external parameters between the radar and the camera belong to the external parameters between the camera and the radar.
  • in this way, the external parameters of the first calibration board relative to the camera are obtained, and from them the relative position and attitude relationship between the camera and the radar, so the sensor can be calibrated according to the first internal parameters of the pre-calibrated camera.
  • before performing the above step 102 of obtaining the pre-calibrated first internal parameter of the camera, the method further includes step 100.
  • step 100 in response to the initial calibration of the sensor, the camera is calibrated to obtain the first internal parameter of the camera.
  • the camera can be calibrated to obtain the first internal parameter of the camera.
  • Obtaining the first internal parameter of the pre-calibrated camera in step 102 may include: in response to calibrating the sensor again, obtaining the first internal parameter of the camera obtained by calibrating the sensor for the first time.
  • the first internal parameters of the camera obtained from the initial calibration of the sensor can be directly obtained.
  • in response to the initial calibration of the sensor, the camera is calibrated to obtain the first internal parameter of the camera; in response to recalibration of the sensor, the first internal parameter of the camera obtained from the initial calibration can be directly obtained.
  • the camera internal parameter calibration process and the target external parameter calibration process between the radar and the camera can be distinguished, and in the process of re-calibrating the sensor, the sensor calibration can be realized directly based on the first internal parameter of the camera obtained from the initial calibration of the sensor.
  • in the case of calibrating the sensor for the first time, the second calibration board should be located in the field of view of the camera, and the second image may include the complete second calibration board, for example, as shown in FIG. 6.
  • the second calibration plate 620 can be located at the edge of the camera field of view 611 of the camera 610.
  • the above step 100 may include the following steps.
  • step 100-1 a plurality of second images are collected by the camera.
  • the pose information of the second calibration plate in the plurality of second images is different.
  • the second calibration board may be the same as or different from the first calibration board.
  • the first calibration board being the same as the second calibration board means that a single physical calibration board is used to realize the functions of both the first calibration board and the second calibration board.
  • the difference between the first calibration board and the second calibration board may mean that completely different or partially different calibration boards are used to realize the functions of the first calibration board and the second calibration board respectively.
  • the pose information may include the attitude of the second calibration board in a three-dimensional space, for example, the attitude changes in three dimensions of the pitch angle, the roll angle, and the yaw angle.
  • the second calibration board should be in a static state.
  • a bracket can be used to fix the second calibration plate.
  • the preset value may be a specific value or a range of values. Taking a preset value range as an example, the range affects the accuracy of each first internal parameter of the camera; therefore, in order to improve the accuracy of the camera's first internal parameter determined subsequently, the preset value can be set to a value between [0.8, 1]. For example, as shown in FIG. 8, the proportion of the second calibration plate in the entire image is within the preset value range, so this figure can be used as a second image.
  • the number of second images collected by the camera may be multiple, for example, more than 20.
  • the pose information of the second calibration plate in the collected multiple second images may be different, that is, at least some of the multiple second images respectively show different poses of the second calibration plate, for example, attitude changes in the three dimensions of pitch angle, roll angle, and yaw angle.
  • the number of second images can be c, and the number of poses of the second calibration plate can be d.
  • Both c and d are integers greater than or equal to 2.
  • c may be equal to the number m of the aforementioned first images, or not equal to m, similarly, d may be equal to the number n of the poses of the aforementioned second calibration plate, or not equal to n.
  • the multiple second images collected by the camera should not be blurred.
  • a blurred image may be caused by movement of the sensor, that is, movement of the camera that produces relative motion between the camera and the second calibration plate.
  • it can be determined whether there are motion-blurred images among the multiple second images collected by the camera, and such images are removed; alternatively, motion-blurred images can be filtered out through a preset script.
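  • the patent does not specify the preset script; one common heuristic for flagging motion-blurred frames is the variance of a Laplacian response (a minimal sketch under that assumption; the function names and threshold are illustrative):

```python
import numpy as np

def laplacian_variance(img):
    """Variance of a 4-neighbour Laplacian; low values suggest a blurred image."""
    i = np.asarray(img, dtype=np.float64)
    lap = (i[:-2, 1:-1] + i[2:, 1:-1] + i[1:-1, :-2] + i[1:-1, 2:]
           - 4.0 * i[1:-1, 1:-1])
    return float(lap.var())

def filter_blurred(images, threshold):
    """Keep only images whose Laplacian variance exceeds the threshold."""
    return [im for im in images if laplacian_variance(im) > threshold]
```

  • blurring suppresses high-frequency content, so a blurred copy of a frame scores lower than the sharp original; the threshold would be tuned on real footage.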
  • in step 100-2, a plurality of first candidate internal parameters of the camera are respectively determined according to the plurality of second images, and one of the plurality of first candidate internal parameters is determined as the first internal parameter.
  • a preset matlab toolbox may be used to calibrate multiple first candidate internal parameters of the camera according to multiple second images.
  • the camera can reproject a preset point in the camera coordinate system to the pixel coordinate system to obtain the corresponding projection point, and then compare the projection point with the corresponding preset point in the pixel coordinate system to obtain the error value of the preset point.
  • the error values obtained under the respective first candidate internal parameters are compared, and the first candidate internal parameter with the smallest error value is used as the first internal parameter of the camera.
  • steps 100-1 and 100-2 describe the process of calibrating the first internal parameter of the camera when the sensor is calibrated for the first time, and there is no restriction on their order of execution relative to step 101. If the sensor is calibrated again, the first internal parameter of the pre-calibrated camera can be directly obtained.
  • the first candidate internal parameters of the camera are the multiple candidate internal parameters respectively determined according to the multiple second images, collected by the camera, of the second calibration plate with different pose information.
  • the first candidate internal parameter with the smallest error value between the projection point determined in the above manner and the corresponding preset point in the pixel coordinate system is selected as the first internal parameter of the camera.
  • multiple first candidate internal parameters of the camera can be determined, and one of them is determined as the first internal parameter, which improves the accuracy of determining the camera's internal parameters and has high usability.
  • step 100-2 may include the following steps.
  • in step 100-21, the preset point located in the camera coordinate system is projected to the pixel coordinate system through the camera, according to each of the plurality of first candidate internal parameters respectively, to obtain multiple first coordinate values of the preset point in the pixel coordinate system.
  • the number of preset points can be one or more.
  • the camera can use different first candidate internal parameters to project the preset point in the camera coordinate system into the pixel coordinate system to obtain each preset point. Multiple first coordinate values of a preset point in a pixel coordinate system.
  • a preset point P in the 3D space is projected into the 2D space to obtain the corresponding first coordinate value P1.
  • in step 100-22, for each first candidate internal parameter, the second coordinate value of the preset point in the verification image is obtained, and the first coordinate value corresponding to the second coordinate value is determined, to obtain coordinate pairs with a corresponding relationship, wherein the verification image is one or more of the plurality of second images.
  • the second coordinate value of the preset point in the pixel coordinate system can be determined.
  • the second coordinate value shown in FIG. 10B is P2
  • the first coordinate value P1 corresponding to the second coordinate value P2 is determined.
  • multiple sets of coordinate pairs with corresponding relationships can be obtained.
  • P2 corresponds to P1
  • P1 and P2 form a set of coordinate pairs.
  • P2' corresponds to P1'
  • P1' and P2' constitute another set of coordinate pairs.
  • when the verification image is a plurality of second images, then for each verification image j, the second coordinate value of the preset point on that verification image can be obtained together with the corresponding first coordinate value to form a coordinate pair, so that multiple sets of coordinate pairs of the preset point P i on the multiple verification images are obtained.
  • in steps 100-23, for each first candidate internal parameter, the distance between the first coordinate value and the second coordinate value in each coordinate pair obtained under that first candidate internal parameter is determined, and the first candidate internal parameter with the smallest distance among the multiple first candidate internal parameters is determined as the first internal parameter of the camera.
  • the distance between the first coordinate value and the second coordinate value in each set of coordinate pairs can be calculated separately.
  • a first candidate internal parameter corresponding to the minimum distance can be used as the first internal parameter of the camera.
  • the first candidate internal parameter 2 can be used as the camera's first internal parameter.
  • when a first candidate internal parameter corresponds to multiple coordinate pairs, the distance of each coordinate pair can be calculated separately, and the distances of the multiple coordinate pairs can then be combined, for example added, to obtain a total distance.
  • the total distance of each first candidate parameter is compared, and the first candidate internal parameter with the smallest total distance among the plurality of first candidate internal parameters is determined as the first internal parameter of the camera.
  • the above description is based on one preset point.
  • the method for obtaining the first internal parameter is similar to that for a single preset point: for each first candidate internal parameter, the distance of the coordinate pair of each preset point can be calculated, the average of the distances over the multiple preset points can then be computed, and the first candidate internal parameter with the smallest average distance among the multiple first candidate internal parameters is determined as the first internal parameter of the camera.
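  • the selection rule of steps 100-21 to 100-23 can be sketched as follows; distortion is ignored for brevity, the helper names are illustrative, and the preset points are given directly in the camera coordinate system:

```python
import numpy as np

def project(K, pts_cam):
    """Project 3-D preset points (camera frame) to pixel coordinates with intrinsics K."""
    uvw = (K @ pts_cam.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def pick_first_internal(candidates, pts_cam, observed_px):
    """Return the candidate intrinsic matrix whose mean distance between the
    projected points (first coordinate values) and the observed points
    (second coordinate values) is smallest."""
    errs = [np.linalg.norm(project(K, pts_cam) - observed_px, axis=1).mean()
            for K in candidates]
    best = int(np.argmin(errs))
    return candidates[best], errs[best]
```

  • with noise-free observations generated from the true intrinsics, the true candidate wins with near-zero reprojection error.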
  • the first candidate internal parameter with the smallest reprojection error is used as the first internal parameter of the camera, making the first internal parameter more accurate.
  • step 102 may include the following steps.
  • in step 102-1, for each first image of the plurality of first images, the first image is de-distorted according to the first internal parameter to obtain the third image corresponding to the first image.
  • a device equipped with both a radar and a camera, such as a vehicle, and provided with image processing capability (the processing may run on the radar, the camera, or other equipment) can perform anti-distortion processing on the multiple first images; that is, the multiple first images may be de-distorted according to the first internal parameter of the pre-calibrated camera to obtain multiple third images.
  • the camera internal parameters can be represented by the internal parameter matrix A' (formula 1); in the usual pinhole form the internal parameter matrix A can be expressed as (formula 2)

    A = [ f_x  γ  u_0 ;  0  f_y  v_0 ;  0  0  1 ]

  • where f_x and f_y are the focal lengths in pixels, (u_0, v_0) is the principal point, and γ is the skew coefficient; each parameter can refer to the aforementioned description of camera parameters.
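  • the de-distortion of step 102-1 can be illustrated on a single normalized image point; the sketch assumes the common Brown-Conrady radial model with two coefficients k1, k2 (the patent does not state which distortion model is used):

```python
import numpy as np

def distort(pt, k1, k2):
    """Apply the radial part of the Brown-Conrady model to a normalized image point."""
    x, y = float(pt[0]), float(pt[1])
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return np.array([x * f, y * f])

def undistort(pt_d, k1, k2, iters=20):
    """Invert the radial model by fixed-point iteration (the usual approach)."""
    pt_d = np.asarray(pt_d, dtype=np.float64)
    p = pt_d.copy()
    for _ in range(iters):
        r2 = p[0] * p[0] + p[1] * p[1]
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        p = pt_d / f  # re-divide the observed point by the current distortion factor
    return p
```

  • a full image is de-distorted by applying this inversion per pixel (or, equivalently, by forward-mapping an undistorted pixel grid), which is what yields the third images.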
  • step 102-2 a second internal parameter of the camera is determined according to a plurality of third images.
  • the preset matlab toolbox can be used to determine the multiple second candidate internal parameters of the camera according to the multiple third images after the distortion processing.
  • the camera uses different second candidate internal parameters to project the preset point located in the camera coordinate system to the pixel coordinate system, obtaining multiple third coordinate values.
  • the fourth coordinate value of each preset point observed in the pixel coordinate system and the corresponding third coordinate value are regarded as a coordinate pair with a corresponding relationship, and the second candidate internal parameter corresponding to the smallest distance over the multiple sets of coordinate pairs is used as the second internal parameter of the camera.
  • the second internal parameter is the internal parameter of the camera determined according to the plurality of third images after de-distortion.
  • the multiple second candidate internal parameters of the camera are the multiple internal parameters of the camera in the ideal state, determined from the multiple third images obtained by de-distorting the multiple first images of the first calibration board with different pose information collected by the camera.
  • the second internal parameter is the second candidate internal parameter with the smallest error value between the projection point determined in the multiple second candidate internal parameters and the corresponding preset point in the pixel coordinate system.
  • the second internal parameter is the internal parameter of the camera in the ideal, distortion-free state.
  • step 102-3 according to a plurality of third images and the second internal parameters of the camera, the external parameters of the first calibration plate of each pose relative to the camera are determined.
  • the external parameters of the calibration board relative to the camera may include a rotation matrix R and a translation matrix T.
  • the homography matrix is a matrix describing the positional mapping relationship between the world coordinate system and the pixel coordinate system.
  • the multiple first images taken by the camera can be de-distorted according to the first internal parameter of the camera to obtain multiple third images, and the second internal parameter of the camera can be determined according to the multiple third images; this is equivalent to the internal parameter of a distortion-free camera under ideal conditions. Then, according to the multiple third images and the second internal parameter, the external parameters of the first calibration board relative to the camera are determined, and the external parameters obtained by the above method have higher accuracy.
  • the above step 102-3 may include the following steps.
  • steps 102-31 the homography matrix corresponding to each third image is determined respectively.
  • the homography matrix H corresponding to each third image can be calculated from the projection relation (formula 5):

    s · [u, v, 1]^T = A · [r1, r2, t] · [X, Y, 1]^T = H · [X, Y, 1]^T

  • r1, r2, and r3 are the rotation column vectors (each of dimension 3×1) that make up the rotation matrix R, and t is the vector form of the translation matrix T; since the calibration plate lies in its own plane Z = 0, only r1 and r2 enter H.
  • (u, v) are the pixel coordinates, (X, Y) are the corresponding coordinates on the calibration plate, and s is the scale factor.
  • the homography matrix H corresponding to each of the multiple third images can be calculated by formula 5.
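  • for step 102-31, H can be estimated from plate-to-image point correspondences with the direct linear transform; a minimal noise-free sketch (a production implementation would add coordinate normalization):

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate H from >=4 point pairs: src are (X, Y) plate coordinates,
    dst are (u, v) pixel coordinates. Each pair yields two linear constraints
    on the 9 entries of H; the solution is the SVD null vector."""
    rows = []
    for (X, Y), (u, v) in zip(src, dst):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the projective scale
```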
  • steps 102-32 according to the second internal parameters of the camera and a plurality of the homography matrices, the external parameters of the first calibration plate of each pose relative to the camera are determined.
  • λ represents the scale factor, and with h1, h2 denoting the first two columns of H, the external parameters are recovered column by column:

    r1 = λ · A⁻¹ · h1
    r2 = λ · A⁻¹ · h2
    r3 = r1 × r2
    λ = 1 / ‖A⁻¹ · h1‖ = 1 / ‖A⁻¹ · h2‖

  • r1, r2 and r3 constitute the 3×3 rotation matrix R.
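  • the column-wise recovery above can be checked numerically: the sketch below synthesizes H from a known R and t and recovers them exactly in the noise-free case (the translation column t = λ · A⁻¹ · h3 follows the same pattern, though it is not written out above):

```python
import numpy as np

def extrinsics_from_homography(A, H):
    """Recover R = [r1 r2 r3] and t from H = A [r1 r2 t], as in the formulas above."""
    Ainv = np.linalg.inv(A)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(Ainv @ h1)  # scale factor lambda
    r1 = lam * (Ainv @ h1)
    r2 = lam * (Ainv @ h2)
    r3 = np.cross(r1, r2)                  # complete the rotation matrix
    t = lam * (Ainv @ h3)
    return np.column_stack([r1, r2, r3]), t
```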
  • the homography matrix corresponding to each third image can be determined separately, and, according to the obtained multiple homography matrices and the second internal parameter of the camera, the external parameters of the first calibration plate of each pose relative to the camera are determined, which makes these external parameters more accurate.
  • the foregoing step 103 may include the following steps.
  • in step 103-1, for the first calibration board of each pose, a set of target radar point cloud data matching the first calibration board is determined from the radar point cloud data in that pose, according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameters between the radar and the camera.
  • the external parameter reference value may be a rough estimate of the external parameter value between the radar and the camera based on the approximate position and orientation between the radar and the camera.
  • the coordinate system of the radar can be superimposed with the camera coordinate system according to the reference value of the external parameters, and unified into the camera coordinate system.
  • for the first calibration board of each pose, according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameters between the radar and the camera, the M-estimator SAmple Consensus (MSAC) algorithm is used to determine the target plane where the first calibration board is located; further, a MeanShift clustering algorithm is used to determine, in the radar point cloud data corresponding to the target plane, the target radar point cloud data that matches the first calibration board.
  • step 103-2 the target external parameters between the radar and the camera are determined according to the matching relationship between the multiple sets of the target radar point cloud data and the first calibration plates of each pose.
  • although the first calibration board takes multiple poses, such as n, there is only one pose from the radar to the camera. Therefore, in step 103-1, n sets of target radar point cloud data can be obtained; each set of target radar point cloud data can be matched with the first calibration board of the corresponding pose, giving n matching relationships. Through these n matching relationships, the external parameters between the radar and the camera can be calculated.
  • the least square method may be used to determine the target external parameters between the radar and the camera.
  • the M-estimation algorithm determines the target plane where the first calibration board is located; further, a mean shift clustering algorithm is used to determine, in the radar point cloud data corresponding to the target plane, a group of target radar point cloud data matching the first calibration board. The target radar point cloud data matching the first calibration board is thus determined automatically from the radar point cloud data, which reduces matching error and improves the accuracy of point cloud matching. According to the matching relationships between the multiple sets of target radar point cloud data and the first calibration board, the target external parameters between the radar and the camera are determined; this allows the target external parameters to be determined quickly and improves their accuracy.
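  • the patent names only "least squares" for recovering the radar-to-camera external parameters from the matching relationships; one common closed-form least-squares solver for a rigid transform between matched point sets is the Kabsch/Umeyama SVD method, sketched here under that assumption:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares R, t such that dst ~= R @ src + t for matched 3-D points
    (Kabsch/Umeyama; the patent does not name a specific least-squares solver)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t
```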
  • the above-mentioned step 103-1 may include the following steps.
  • step 103-11 according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameters between the radar and the camera, the candidate position of the first calibration board is determined.
  • for each pose, the position of the first calibration board may first be estimated based on the external parameters of the first calibration board relative to the camera, the estimated reference value of the external parameters between the radar and the camera, and the radar point cloud data collected for the first calibration board, giving the approximate position and direction of the first calibration board; this approximate position and direction are used as the estimated candidate position.
  • the candidate position represents the approximate position of the first calibration plate in a map composed of radar point cloud data.
  • step 103-12 according to the candidate position, the target plane where the first calibration board is located is determined from the radar point cloud data in the pose.
  • multiple first radar points located in the area corresponding to the candidate position can be randomly selected, and a first plane composed of the multiple first radar points can be obtained; this selection is repeated multiple times to obtain multiple first planes.
  • the distances from other radar points except the multiple first radar points in the group of radar point cloud data to the first plane are respectively calculated.
  • the radar point whose distance value is less than the preset threshold among other radar points is used as the second radar point, and the second radar point is determined as the radar point in the first plane.
  • the first plane with the largest number of radar points is used as the target plane where the first calibration board is located.
  • the target plane represents the plane on which the first calibration board is located in a graph composed of radar point cloud data.
  • step 103-13 on the target plane, a set of target radar point cloud data matching the first calibration board of the pose is determined.
  • an initial first circular area is determined according to the size of the first calibration plate.
  • the initial first circular area may be the area corresponding to the circumscribed circle of the first calibration plate.
  • any radar point located in the initial first circular area is randomly selected as the first circle center of the first circular area, to adjust the position of the first circular area in the radar point cloud data.
  • the size of the first calibration board is the size of the first calibration board in the figure composed of radar point cloud data.
  • a plurality of first vectors are obtained respectively.
  • the second vector is obtained by adding the plurality of first vectors.
  • the target center position of the first calibration plate is determined.
  • the target center position of the first calibration board is the determined center position of the first calibration board in a map composed of radar point cloud data.
  • a group of target radar point cloud data matching the first calibration board is determined in the radar point cloud data.
  • steps 103-12 may include the following steps.
  • in steps 103-121, two or more first radar groups are determined from the radar point cloud data in the pose, and for each first radar group the corresponding first plane is determined, where each first radar group includes a plurality of randomly selected first radar points located in the area corresponding to the candidate position, and the first plane corresponding to each first radar group includes the plurality of first radar points of that group.
  • a plurality of first radar points located in the area corresponding to the candidate position may be randomly selected each time to obtain a first radar group, and each time a first plane composed of the plurality of first radar points of that group can be obtained; if multiple random selections are made, multiple first planes are obtained.
  • the radar points include 1, 2, 3, 4, 5, 6, 7, and 8
  • the first radar points 1, 2, 3, and 4 are randomly selected for the first time to form the first plane 1; the first radar points 1, 2, 4, and 6 are randomly selected for the second time to form the first plane 2; and the first radar points 2, 6, 7, and 8 are randomly selected for the third time to form the first plane 3.
  • in steps 103-122, for each first plane, the distances to that first plane from the other radar points in the radar point cloud data in the pose, that is, the points other than the first radar points of that plane, are determined.
  • the distance values from the other radar points 5, 6, 7, and 8 to the first plane 1 can be calculated; for the first plane 2, the distances from the other radar points 3, 5, 7, and 8 to the first plane 2 can be calculated; and the distance values of the other radar points 1, 3, 4, and 5 to the first plane 3 can be calculated.
  • a radar point whose distance is less than a threshold among the other radar points is used as a second radar point, and the second radar point is determined as a radar point included in the corresponding first plane.
  • it can be assumed that the first plane 1 includes radar points 1, 2, 3, 4, and 5; the first plane 2 includes radar points 1, 2, 4, 6, and 7; and the first plane 3 includes radar points 1, 3, 4, 5, 6, and 8.
  • the first plane that includes the largest number of radar points is used as the target plane.
  • the first plane with the largest number of radar points, for example the first plane 3, is used as the target plane where the first calibration board is located.
  • the above method can be used to determine a target plane where the first calibration plate of a certain pose is located for each group of radar point cloud data.
  • the fitted target plane is more accurate and has high availability.
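  • steps 103-121 to 103-123 follow the random-sample-consensus pattern; the patent uses the MSAC variant (which scores hypotheses by truncated distance rather than by inlier count), but a plain-RANSAC sketch with illustrative parameters conveys the idea:

```python
import numpy as np

def ransac_plane(points, threshold=0.02, iters=300, rng=None):
    """Fit the dominant plane: repeatedly sample 3 points (a 'first radar group'),
    count the points within `threshold` of the spanned plane ('second radar points'),
    and keep the plane with the most supporting points as the target plane."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-12:
            continue  # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        inliers = np.abs((points - p0) @ n) < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return best_inliers
```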
  • steps 103-13 may include the following steps.
  • an initial first circular area is determined according to the size of the first calibration plate.
  • the initial first circular area can be determined on the target plane according to the size of the first calibration board, and the size of the initial first circular area may adopt the size of the circumscribed circle of the first calibration plate.
  • the size of the first calibration board is the size of the first calibration board in the figure composed of radar point cloud data.
  • any radar point located in the initial first circular area is randomly selected as the first circle center of the first circular area, to determine the position of the first circular area in the radar point cloud data.
  • the position of the first circular area in the radar point cloud data is subsequently adjusted through the first circle center.
  • the radius of the first circular area is the same as the radius of the initial first circular area.
  • steps 103-133 using the first circle center as a starting point, and a plurality of third radar points located in the first circular area in the radar point cloud data as an end point, a plurality of first vectors are obtained respectively.
  • the first circle center 170 can be used as the starting point, and the multiple third radar points 171 located in the first circular area in the radar point cloud data can be used as the end points, so as to obtain the multiple first vectors 172.
  • the third radar point 171 can effectively cover a circular area. As shown in Figure 17.
  • steps 103-134 the multiple first vectors are added to obtain a second vector.
  • the result of the addition is a MeanShift vector, that is, the second vector.
  • the target center position of the first calibration plate is determined based on the second vector.
  • the end point of the second vector is taken as the second circle center, and the second circular area is obtained according to the size of the first calibration plate. Taking the second circle center as the starting point and multiple fourth radar points in the second circular area as the end points, multiple third vectors are obtained respectively. The multiple third vectors are added to obtain the fourth vector, and the end point of the fourth vector is then used as the new second circle center to obtain a new second circular area. The above steps of determining the fourth vector are repeated until the fourth vector converges to the preset value, and the second circle center at that time is used as the candidate center position of the first calibration plate.
  • the candidate center position is the candidate center position of the first calibration board in the map composed of radar point cloud data.
  • the candidate center position can be directly used as the target center position; otherwise, a new candidate center position is re-determined until the final target center position is determined.
  • in steps 103-136, according to the target center position of the first calibration board and the size of the first calibration board, a group of target radar point cloud data matching the first calibration board is determined in the radar point cloud data; the position corresponding to the first calibration plate can be determined according to its target center position and size and matched with the first calibration board of the pose, so that the radar point cloud data matching the position of the first calibration board can be used as the target radar point cloud data.
  • steps 103-135 may include:
  • steps 103-1351 the end point of the second vector is taken as the second circle center, and the second circular area is determined according to the size of the second circle center and the first calibration plate.
  • the end point of the second vector may be used as the second circle center, the second circle center is again taken as the new circle center, and the radius is the radius of the circumscribed circle of the first calibration plate to obtain the second circular area.
  • steps 103-1352 using the second circle center as a starting point, and multiple fourth radar points located in the second circular area in the radar point cloud data as an ending point, multiple third vectors are determined respectively.
  • the second circle center is used as a starting point, and multiple fourth radar points located in the second circular area in the radar point cloud data are used as end points, and multiple third vectors are obtained respectively.
  • steps 103-1353 the multiple third vectors are added to obtain a fourth vector.
  • steps 103-1354 it is determined whether the vector value of the fourth vector converges to a preset value.
  • the preset value may be close to zero.
  • steps 103-1355 use the end point of the fourth vector as the second circle center, determine the second circular area according to the second circle center and the size of the first calibration plate, and jump to Steps 103-1352.
  • the end point of the fourth vector can be used as the new second circle center again, and the new fourth vector can be calculated again according to the above steps 103-1352 to 103-1354, and the vector value of the fourth vector can be determined Whether to converge.
  • the above process is repeated continuously until the finally obtained vector value of the fourth vector converges to the preset value.
  • steps 103-1356: in response to the vector value of the fourth vector converging to the preset value, the second circle center corresponding to the converged fourth vector is used as the candidate center position of the first calibration board.
  • the second circle center corresponding to the fourth vector may be used as the candidate center position of the first calibration plate.
  • steps 103-1357: in response to the candidate center position coinciding with the center position of the first calibration board, the candidate center position is taken as the target center position.
  • steps 103-135 may further include the following steps.
  • steps 103-1358: in response to the candidate center position not coinciding with the center position of the first calibration board, the candidate center position is re-determined.
  • for example, all radar points in the second circular area may be deleted and a new second circular area determined again; alternatively, this group of radar point cloud data may be deleted directly, and the candidate center position of the first calibration board re-determined according to another group of radar point cloud data corresponding to another pose of the first calibration board, until the determined candidate center position coincides with the center position of the first calibration board.
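The loop in steps 103-1351 through 103-1358 is, in effect, a mean-shift search for the densest cluster of radar points around the board center. The sketch below is illustrative, not the patent's exact procedure: it works on points already projected to 2D coordinates on the target plane, the function name is hypothetical, and it moves the center by the mean of the in-circle points rather than the raw vector sum for numerical stability.

```python
import numpy as np

def mean_shift_center(points, start, radius, eps=1e-3, max_iter=100):
    """Iteratively move a circle (radius = circumscribed circle of the
    calibration board) toward the centroid of the radar points inside it.
    The shift is the summed vector from the current center to each
    in-circle point (the "fourth vector"); convergence means the mean
    shift is near zero (the preset value close to zero)."""
    center = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        in_circle = points[np.linalg.norm(points - center, axis=1) < radius]
        if len(in_circle) == 0:
            break  # empty circle: caller should re-seed (step 103-1358)
        shift = (in_circle - center).sum(axis=0)        # summed vector
        if np.linalg.norm(shift) / len(in_circle) < eps:  # converged
            break
        center = center + shift / len(in_circle)        # move to centroid
    return center
```

Once converged, the returned center is the candidate center position, to be checked against the board's known center before accepting it as the target center position.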
  • step 103-2 may include: determining a candidate external parameter between the radar and the camera according to every g matching relationships, and determining the target external parameter between the radar and the camera according to the multiple candidate external parameters between the radar and the camera.
  • the least squares method may be adopted to determine a candidate external parameter by minimizing the sum of squared external-parameter errors between the radar and the camera, where g is an integer greater than or equal to 3.
  • for example, the first calibration board of pose information 1 corresponds to target radar point cloud data 1, the first calibration board of pose information 2 corresponds to target radar point cloud data 2, and so on; there are n sets of matching relationships in total.
  • the candidate external parameter 1 can be determined based on the first three sets of matching relationships
  • the candidate external parameter 2 can be determined based on the first four sets of matching relationships
  • the candidate external parameter 3 can be determined based on the first two groups together with the fourth group of matching relationships, and so on.
  • the candidate external parameter with the best projection effect is determined as the target external parameter between the radar and the camera.
  • in this way, candidate external parameters between the radar and the camera can be determined from multiple matching relationships, and from among the multiple candidate external parameters the one with the best projection effect is selected as the target external parameter between the radar and the camera, which improves the accuracy of the target external parameter between the radar and the camera.
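One common way to realize a least-squares fit over g >= 3 matching relationships is to align corresponding 3D points (e.g., board centers expressed in the radar frame and in the camera frame) with the Kabsch algorithm. This is an illustrative sketch under that assumption, not the patent's exact formulation; the function name is hypothetical.

```python
import numpy as np

def fit_extrinsic(radar_pts, camera_pts):
    """Least-squares rigid transform (R, t) mapping radar-frame points to
    camera-frame points via the Kabsch/SVD method. Needs at least 3
    non-collinear correspondences, mirroring g >= 3 in the text."""
    P = np.asarray(radar_pts, dtype=float)
    Q = np.asarray(camera_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Each subset of g matching relationships would yield one such (R, t) candidate; the candidates are then ranked by projection effect as described below.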
  • step 103 may further include the following steps.
  • step 103-21: the first calibration board is projected by the radar based on each candidate external parameter onto any one of the first images, to generate a set of projection data.
  • specifically, each candidate external parameter between the radar and the camera, the camera intrinsic matrix, and the radar point cloud data are multiplied, so that the radar point cloud data is projected onto a given first image.
  • the radar point cloud data may be one of multiple groups of radar point cloud data collected before, or it may be newly collected radar point cloud data.
  • the collected data needs to include the first calibration board.
  • step 103-22 a group of projection data with the highest degree of projection matching with the first image among the multiple groups of projection data is determined to be the target projection data.
  • a set of projection data with the highest degree of projection matching with the first image is determined, and the set of projection data is determined as the target projection data.
  • for the projection data obtained by projecting onto the first image, for example, as shown in Figure 21A and Figure 21B, the projection effect of Figure 21A is better than that of Figure 21B, so the projection data corresponding to Figure 21A is the target projection data.
  • step 103-23 the candidate external parameter corresponding to the target projection data is determined to be the target external parameter between the radar and the camera.
  • the candidate external parameter corresponding to the target projection data is the target external parameter between the radar and the camera.
  • in this way, multiple candidate external parameters can be verified according to their projection effects, and the candidate external parameter with the best projection effect is used as the final target external parameter, which improves the accuracy of the target external parameter between the radar and the camera.
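The projection in steps 103-21 to 103-23 multiplies the camera intrinsic matrix, a candidate extrinsic, and the radar points, then divides by depth. A minimal sketch, assuming an ideal pinhole model with no distortion; names are illustrative:

```python
import numpy as np

def project_points(points_lidar, K, R, t):
    """Project lidar points into pixel coordinates: x = K (R X + t),
    followed by perspective division by depth. K is the camera intrinsic
    matrix; (R, t) is one candidate lidar-to-camera extrinsic."""
    X_cam = points_lidar @ R.T + t        # lidar frame -> camera frame
    z = X_cam[:, 2:3]                     # depths along the optical axis
    uv = (X_cam @ K.T)[:, :2] / z         # pixel coordinates
    return uv, z.ravel()
```

The "degree of projection matching" could then be scored, for instance, by how many projected board points fall inside the board region detected in the first image; the patent does not prescribe a specific score.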
  • the above-mentioned radar and camera can be deployed on the vehicle, and the radar can be a lidar.
  • the radar and camera can be deployed at different positions of the vehicle at the same time. For example, as shown in FIG. 22, the radar 2220 and the camera 2210 can be deployed at the rear, the front windshield, and other positions. After the first internal parameter of the camera 2210 is determined, if the target external parameter between the radar 2220 and the camera 2210 needs to be re-determined, the previously calibrated first internal parameter can be directly obtained to quickly determine the target external parameter, improving the accuracy of the target external parameter between the radar 2220 and the camera 2210.
  • the above-mentioned methods provided in the embodiments of the present disclosure can be used on machinery and equipment, which may be manually driven or unmanned, such as airplanes, vehicles, drones, unmanned vehicles, robots, and so on.
  • the two sensors, the radar and the camera, can be set above the center console, close to the front windshield. Because of the movement of the vehicle, the attitude of at least one of the radar and the camera will change, and the external parameters between the radar and the camera need to be recalibrated. Moreover, because the front windshield refracts light, the originally calibrated internal parameters of the camera become inaccurate during use, which affects the accuracy of the external parameters between the radar and the camera.
  • the target external parameter between the lidar and the camera is finally determined according to the external parameters of the first calibration board relative to the camera and the multiple sets of radar point cloud data of the different pose information; this allows the target external parameter between the lidar and the camera to be calibrated quickly, with high availability.
  • the radar is deployed on the front bumper of the vehicle, and the camera is deployed at the position of the rearview mirror of the vehicle.
  • the first calibration plate 2331 is located in the common field of view of the radar 2320 and the camera 2310.
  • the first calibration board can be fixed on the ground or held by the staff.
  • when the camera 2310 calibrates the first internal parameter using multiple first images containing the first calibration board 2331, since the radar 2320 and the camera 2310 are not on the same horizontal plane and the camera 2310 is far from the ground, the first calibration board 2331 may occupy only part of each first image; in this case, the accuracy of the internal parameters of the camera 2310 calibrated from the multiple first images is poor.
  • the camera internal parameters can be calibrated through the second calibration plate 2332 located within the field of view of the camera 2310 and at a relatively short distance from the camera 2310.
  • the horizontal distance between the second calibration board 2332 and the camera 2310 is less than the horizontal distance between the first calibration board 2331 and the camera 2310. The second calibration board 2332 can be fixed on the vehicle, so that the second images collected in this case can include the complete second calibration board 2332, and a more accurate first internal parameter of the camera 2310 can then be obtained.
  • the camera 2310 and the radar 2320 are deployed on the vehicle, the distance between the camera 2310 and the ground is greater than the distance between the radar 2320 and the ground, and the horizontal distance between the second calibration board 2332 and the camera 2310 is smaller than the horizontal distance between the first calibration board 2331 and the camera 2310; the multiple second images collected by the camera 2310 therefore include the complete second calibration board 2332, which improves the accuracy of calibrating the internal parameters of the camera.
  • the present disclosure also provides an embodiment of an apparatus.
  • FIG. 24 is a block diagram of a sensor calibration device according to an exemplary embodiment of the present disclosure.
  • the first calibration board is located in the common field of view of the radar and the camera, and the device includes:
  • the acquisition module 210 is used to acquire multiple first images through the camera, wherein the pose information of the first calibration board in the multiple first images is different;
  • the first determining module 220 is used to acquire the pre-calibrated first internal parameter of the camera, and determine the external parameters of the first calibration board of each pose relative to the camera according to the first internal parameter and the plurality of first images;
  • the second determining module 230 is used to acquire multiple sets of radar point cloud data of the first calibration board of each pose, and determine the target external parameter between the radar and the camera according to the external parameters of the first calibration board of each pose relative to the camera and the multiple sets of radar point cloud data.
  • the device further includes: a calibration module, configured to calibrate the camera in response to the initial calibration of the sensor to obtain the first internal parameter of the camera; the first determination The module includes: an acquisition sub-module for acquiring the first internal parameter of the camera obtained by calibrating the sensor for the first time in response to calibrating the sensor again.
  • the second calibration board is located within the field of view of the camera, and the calibration module includes: a collection sub-module configured to collect multiple second images through the camera, the multiple second images The pose information of the second calibration board in the image is different; the first determination sub-module is used to determine the multiple first candidate internal parameters of the camera according to the multiple second images, and compare the multiple One of the first candidate internal parameters is determined to be the first internal parameter, wherein each second image corresponds to one first candidate internal parameter.
  • the first determining submodule includes: a projection unit, configured to project, by means of the camera, a preset point located in the camera coordinate system to the pixel coordinate system according to each of the plurality of first candidate internal parameters, to obtain multiple first coordinate values of the preset point in the pixel coordinate system; and a first determining unit, configured to obtain, for each first candidate internal parameter, the second coordinate value of the preset point in a verification image, and determine the first coordinate value corresponding to the second coordinate value, to obtain a coordinate pair having a corresponding relationship.
  • the verification image is one or more of the plurality of second images.
  • the first determining module includes: an anti-distortion sub-module, configured to perform, for each first image in the plurality of first images, de-distortion processing on the first image according to the first internal parameter to obtain a third image corresponding to the first image; a second determining sub-module, configured to determine the second internal parameter of the camera according to the plurality of third images; and a third determining sub-module, configured to determine the external parameters of the first calibration board of each pose relative to the camera according to the plurality of third images and the second internal parameter of the camera.
  • the third determining submodule includes: a third determining unit, configured to determine the homography matrix corresponding to each third image; and a fourth determining unit, configured to determine, according to the second internal parameter of the camera and the plurality of homography matrices, the external parameters of the first calibration board of each pose relative to the camera.
  • the second determining module includes: a fourth determining sub-module, configured to determine, for the first calibration board in each pose, a set of target radar point cloud data matching the first calibration board in the radar point cloud data in this pose, according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameter between the radar and the camera; and a fifth determining sub-module, configured to determine the target external parameter between the radar and the camera according to the matching relationships between the multiple sets of target radar point cloud data and the first calibration board of each pose.
  • the fourth determining sub-module includes: a fifth determining unit, configured to determine the candidate position where the first calibration board is located, according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameter between the radar and the camera; a sixth determining unit, configured to determine, according to the candidate position, the target plane where the first calibration board is located from the radar point cloud data in this pose; and a seventh determining unit, configured to determine, on the target plane corresponding to the radar point cloud data in this pose, a set of target radar point cloud data matching the first calibration board.
  • the sixth determining unit includes: a first determining subunit, configured to determine two or more first radar groups from the radar point cloud data in this pose, wherein each first radar group includes a plurality of randomly selected first radar points located in the area corresponding to the candidate position, and, for each first radar group, to determine a corresponding first plane, wherein the first plane corresponding to each first radar group includes the plurality of first radar points of that first radar group; a second determining subunit, configured to determine, for each first plane, the distances between the other radar points of the radar point cloud data in this pose and the first plane; a third determining subunit, configured to take, for each first plane, the radar points among the other radar points whose distance is less than a threshold as second radar points, and determine the second radar points as radar points included in the first plane; and a fourth determining subunit, configured to take, among the plurality of first planes, the first plane including the largest number of radar points as the target plane.
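The sampling-and-voting procedure described by these subunits is essentially RANSAC plane fitting: fit a plane to each random group of points, count the points lying within a distance threshold of it, and keep the plane with the most supporters. A hypothetical sketch (parameter values and names are illustrative, not from the patent):

```python
import numpy as np

def ransac_plane(points, n_trials=200, threshold=0.05, seed=0):
    """Find the plane supported by the most radar points. Each trial
    fits a plane through 3 random points; points closer than the
    threshold are counted as belonging to that plane (second radar
    points); the plane with the largest count is the target plane."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_trials):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue  # degenerate (near-collinear) sample
        normal = normal / norm
        dist = np.abs((points - p0) @ normal)   # point-to-plane distance
        inliers = dist < threshold
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, p0)
    return best_plane, best_inliers
```

The inlier set of the winning plane is then searched for the board center as described for the seventh determining unit.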
  • the seventh determining unit includes: a fifth determining subunit, configured to determine an initial first circular area on the target plane according to the size of the first calibration board; a selecting subunit, configured to randomly select any radar point located in the initial first circular area in the radar point cloud data as the first circle center of the first circular area, so as to determine the position of the first circular area in the radar point cloud data; a sixth determining subunit, configured to take the first circle center as a starting point and a plurality of third radar points located in the first circular area in the radar point cloud data as end points, to obtain multiple first vectors respectively; a seventh determining subunit, configured to add the multiple first vectors to obtain a second vector; an eighth determining subunit, configured to determine the target center position of the first calibration board based on the second vector; and a ninth determining subunit, configured to determine, in the radar point cloud data, a set of target radar point cloud data matching the first calibration board according to the target center position and the size of the first calibration board.
  • the eighth determining subunit is configured to: take the end point of the second vector as the second circle center, and determine the second circular area according to the second circle center and the size of the first calibration board; take the second circle center as a starting point and a plurality of fourth radar points located in the second circular area in the radar point cloud data as end points, to determine multiple third vectors respectively; add the multiple third vectors to obtain a fourth vector; determine whether the vector value of the fourth vector converges to a preset value; in response to the vector value of the fourth vector converging to the preset value, take the second circle center corresponding to the converged fourth vector as the candidate center position of the first calibration board; and, in response to the candidate center position coinciding with the center position of the first calibration board, take the candidate center position as the target center position.
  • the eighth determining subunit is further configured to: in response to the vector value of the fourth vector not converging to the preset value, use the end point of the non-converged fourth vector as the second circle center, re-determine the multiple third vectors, and re-determine the fourth vector.
  • the eighth determining subunit further includes: re-determining the candidate center position in response to that the candidate center position does not coincide with the center position of the first calibration plate.
  • the fifth determining submodule includes: an eighth determining unit, configured to determine a candidate external parameter between the radar and the camera according to every g matching relationships, where g is an integer greater than or equal to 3, and to determine the target external parameter between the radar and the camera according to the multiple candidate external parameters between the radar and the camera.
  • the eighth determining unit includes: a tenth determining subunit, configured to project the first calibration board by means of the radar based on each candidate external parameter onto any one of the first images, to generate a set of projection data; an eleventh determining subunit, configured to determine, among the multiple sets of projection data, the set of projection data with the highest degree of projection matching with the first image as the target projection data; and a twelfth determining subunit, configured to determine the candidate external parameter corresponding to the target projection data as the target external parameter between the radar and the camera.
  • the radar and the camera are deployed on a vehicle, and the radar is a lidar.
  • the second calibration board is located within the field of view of the camera, and the second calibration board is used to calibrate the first internal parameter of the camera; the distance of the camera relative to the ground is greater than the distance of the radar relative to the ground, the horizontal distance between the second calibration board and the camera is less than the horizontal distance between the first calibration board and the camera, and the plurality of second images include the complete second calibration board.
  • the first image includes the complete first calibration board, and the radar point cloud data includes point cloud data obtained based on the complete first calibration board.
  • for relevant parts, reference may be made to the description of the method embodiments.
  • the device embodiments described above are merely illustrative; the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the present disclosure. Those of ordinary skill in the art can understand and implement them without creative work.
  • the embodiment of the present disclosure also provides a computer-readable storage medium, the storage medium stores a computer program, and when the computer program is executed by a processor, the processor is prompted to implement any one of the aforementioned sensor calibration methods.
  • the computer-readable storage medium may be a non-volatile storage medium.
  • the embodiments of the present disclosure provide a computer program product, including computer-readable code; when the computer-readable code runs on a device, it causes the device to execute the sensor calibration method of any of the above embodiments.
  • the embodiments of the present disclosure also provide another computer program product for storing computer-readable instructions, which, when executed, cause the computer to perform the operations of the sensor calibration method provided in any of the above-mentioned embodiments.
  • the computer program product can be specifically implemented by hardware, software, or a combination thereof.
  • the computer program product is specifically embodied as a computer storage medium.
  • the computer program product is specifically embodied as a software product, such as a software development kit (SDK), and so on.
  • An embodiment of the present disclosure also provides a sensor calibration device, including: a processor; a memory for storing executable instructions of the processor; wherein the processor is configured to call the executable instructions to implement any of the above The calibration method of the sensor.
  • FIG. 25 is a schematic diagram of the hardware structure of a sensor calibration device provided by an embodiment of the application.
  • the sensor calibration device 310 includes a processor 311, and may also include an input device 312, an output device 313, a memory 314, and a bus 315.
  • the input device 312, the output device 313, the memory 314, and the processor 311 are connected to each other through a bus 315.
  • the memory may include a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a compact disc read-only memory (CD-ROM).
  • the input device is used to input data and/or signals
  • the output device is used to output data and/or signals.
  • the output device and the input device can be independent devices or an integrated device.
  • the processor may include one or more processors, such as one or more central processing units (CPU).
  • the CPU may be a single-core CPU or a multi-core CPU.
  • the processor is used to call the program code and data in the memory to execute the steps in the foregoing method embodiment.
  • for details, please refer to the description in the method embodiments, which will not be repeated here.
  • FIG. 25 only shows a simplified design of a calibration device for a sensor.
  • the sensor calibration device may also include other necessary components, including but not limited to any number of input/output devices, processors, controllers, memories, etc.; all sensor calibration devices that can implement the embodiments of the present application fall within the scope of protection of this application.
  • the functions or modules contained in the device provided in the embodiments of the present disclosure can be used to execute the methods described in the above method embodiments.
  • the embodiment of the present disclosure also provides a calibration system; the calibration system includes a camera, a radar, and a first calibration board, the first calibration board is located in the common field of view of the camera and the radar, and the pose information of the first calibration board at different collection moments is different.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Image Processing (AREA)

Abstract

A sensor calibration method and apparatus (310), a storage medium, a calibration system, and a program product. The sensors include a radar (210, 410, 2220, 2320) and a camera (220, 610, 2210, 2310), and a first calibration board (230, 420, 620, 2331) is located within the common field of view (231) of the radar (210, 410, 2220, 2320) and the camera (220, 610, 2210, 2310). The method includes: collecting multiple first images by means of the camera (220, 610, 2210, 2310), wherein the pose information of the first calibration board (230, 420, 620, 2331) differs among the multiple first images; acquiring a pre-calibrated first internal parameter of the camera (220, 610, 2210, 2310), and determining, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board (230, 420, 620, 2331) of each pose relative to the camera (220, 610, 2210, 2310); and acquiring multiple sets of radar point cloud data of the first calibration board (230, 420, 620, 2331) of each pose, and determining the target external parameter between the radar (210, 410, 2220, 2320) and the camera (220, 610, 2210, 2310) according to the external parameters of the first calibration board (230, 420, 620, 2331) of each pose relative to the camera (220, 610, 2210, 2310) and the multiple sets of radar point cloud data.

Description

Sensor calibration method and apparatus, storage medium, calibration system, and program product

Cross-reference to related applications

This application claims priority to the Chinese patent application No. 201911126534.8, entitled "Sensor calibration method and apparatus, storage medium, and calibration system", filed on November 18, 2019, the entire contents of which are incorporated herein by reference.

Technical field

The present disclosure relates to the field of computer vision, and in particular to a sensor calibration method and apparatus, a storage medium, a calibration system, and a program product.

Background

With the continuous development of computer vision, machines and equipment are deployed with more and more sensors, and different sensors can provide different types of data; for example, a machine may include a combination of a radar and a camera. Based on the data provided by the radar and the camera, the machine can learn to perceive its surrounding environment.

When the radar and the camera are used at the same time, the accuracy of the extrinsic parameters between the radar and the camera determines the accuracy of environment perception.

Summary

The present disclosure provides a sensor calibration method and apparatus, a storage medium, and a calibration system, to realize joint calibration of a radar and a camera.

According to a first aspect of the embodiments of the present disclosure, a sensor calibration method is provided, in which a first calibration board is located within the common field of view of the radar and the camera, and the method includes: collecting multiple first images by means of the camera, wherein the pose information of the first calibration board differs among the multiple first images; acquiring a pre-calibrated first internal parameter of the camera; determining, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board of each pose information relative to the camera; acquiring multiple sets of radar point cloud data of the first calibration board of each pose; and determining the target external parameter between the radar and the camera according to the external parameters of the first calibration board of each pose relative to the camera and the multiple sets of radar point cloud data.

According to a second aspect of the embodiments of the present disclosure, a sensor calibration apparatus is provided, the sensors including a radar and a camera, with a first calibration board located within the common field of view of the radar and the camera. The apparatus includes: a first acquisition module, configured to collect multiple first images by means of the camera, wherein the pose information of the first calibration board differs among the multiple first images; a first determining module, configured to acquire a pre-calibrated first internal parameter of the camera, and determine, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board of each pose relative to the camera; and a second determining module, configured to acquire multiple sets of radar point cloud data of the first calibration board of each pose information, and determine the target external parameter between the radar and the camera according to the external parameters of the first calibration board of each pose information relative to the camera and the multiple sets of radar point cloud data.

According to a third aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided; the storage medium stores a computer program which, when executed by a processor, prompts the processor to implement the sensor calibration method of any implementation of the first aspect.

According to a fourth aspect of the embodiments of the present disclosure, a sensor calibration apparatus is provided, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to call the executable instructions to implement the sensor calibration method of any implementation of the first aspect.

According to a fifth aspect of the embodiments of the present disclosure, a calibration system is provided; the calibration system includes a camera, a radar, and a first calibration board, the first calibration board being located within the common field of view of the camera and the radar, with the pose information of the first calibration board differing at different collection moments.

According to a sixth aspect of the embodiments of the present disclosure, a computer program product is provided, including computer-readable code which, when run on a device, prompts the device to execute the sensor calibration method of any implementation of the first aspect.

In this embodiment, in the process of calibrating the sensors, for example when calibrating the target external parameter between the radar and the camera, the pre-calibrated first internal parameter of the camera can be acquired, and the external parameters of the first calibration board of each pose relative to the camera are determined according to the pre-calibrated first internal parameter, so that the target external parameter between the radar and the camera is determined according to the external parameters of the first calibration board of each pose relative to the camera and multiple sets of radar point cloud data. During calibration, the external parameters of the first calibration board relative to the camera are obtained from the pre-calibrated first internal parameter of the camera and the multiple first images; that is, when the relative position or the pitch/tilt angle between the camera and the radar changes, the sensors can be calibrated based on the pre-calibrated first internal parameter of the camera.
Brief description of the drawings

The accompanying drawings herein are incorporated into and constitute a part of this specification; they illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.

FIG. 1 is a flowchart of a sensor calibration method according to an exemplary embodiment of the present disclosure.

FIG. 2 is a schematic diagram of a common field of view according to an exemplary embodiment of the present disclosure.

FIG. 3 is a schematic diagram of a calibration board in different postures according to an exemplary embodiment of the present disclosure.

FIG. 4 is a schematic diagram of radar emission according to an exemplary embodiment of the present disclosure.

FIG. 5 is a flowchart of a sensor calibration method according to another exemplary embodiment of the present disclosure.

FIG. 6 is a schematic diagram of a camera field of view according to an exemplary embodiment of the present disclosure.

FIG. 7 is a flowchart of a sensor calibration method according to yet another exemplary embodiment of the present disclosure.

FIG. 8 is a schematic diagram of a second image including the second calibration board according to an exemplary embodiment of the present disclosure.

FIG. 9 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 10A is a schematic diagram of a scene in which a preset point is projected according to an exemplary embodiment of the present disclosure.

FIG. 10B is a schematic diagram of a scene of determining coordinate pairs having a corresponding relationship according to an exemplary embodiment of the present disclosure.

FIG. 11 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 12 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 13 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 14 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 15 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 16 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 17 is a schematic diagram of determining multiple first vectors according to an exemplary embodiment of the present disclosure.

FIG. 18 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 19 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 20 is a flowchart of a sensor calibration method according to still another exemplary embodiment of the present disclosure.

FIG. 21A is a schematic diagram of projecting the first calibration board by the radar according to an exemplary embodiment of the present disclosure.

FIG. 21B is another schematic diagram of projecting the first calibration board by the radar according to an exemplary embodiment of the present disclosure.

FIG. 22 is a schematic diagram of deploying a radar and a camera on a vehicle according to an exemplary embodiment of the present disclosure.

FIG. 23 is a schematic diagram of the positions of the first calibration board and the second calibration board corresponding to a radar and a camera deployed on a vehicle according to an exemplary embodiment of the present disclosure.

FIG. 24 is a block diagram of a sensor calibration apparatus according to an exemplary embodiment of the present disclosure.

FIG. 25 is a block diagram of a sensor calibration apparatus according to another exemplary embodiment of the present disclosure.

Detailed description
Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numbers in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.

The terms used in the present disclosure are for the purpose of describing particular embodiments only and are not intended to limit the present disclosure. The singular forms "a", "the", and "said" used in the present disclosure and the appended claims are also intended to include plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used herein refers to and includes any or all possible combinations of one or more of the associated listed items.

It should be understood that although the terms first, second, third, etc. may be used in the present disclosure to describe various pieces of information, such information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be called second information and, similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "while", or "in response to determining".
The present disclosure provides a sensor calibration method. Calibrating a sensor refers to calibrating the internal parameters (intrinsic parameters) and external parameters of the sensor.

The internal parameters of a sensor are parameters that reflect the sensor's own characteristics. After the sensor leaves the factory, the internal parameters are theoretically fixed, but in actual use they may change. Taking a camera as an example, with use, changes in the relative positions of the camera's components lead to changes in the internal parameters. The calibrated internal parameters are usually only an approximation of the true internal parameters, rather than their true values.

The following describes sensor internal parameters, taking sensors including a camera and a radar as an example. The internal parameters of the camera are parameters that reflect the camera's own characteristics, and may include but are not limited to at least one of the following, i.e., one or more of the parameters listed below: u0, v0, Sx, Sy, f, and r. Here, u0 and v0 respectively denote the numbers of horizontal and vertical pixels between the origin of the pixel coordinate system and the origin of the camera coordinate system in which the camera is located, in pixels. Sx and Sy are respectively the numbers of pixels per unit length in the horizontal and vertical directions, where the unit length may be millimeters. f is the focal length of the camera. r is the distance of a pixel from the center of the imager caused by image distortion; in the embodiments of the present disclosure, the center of the imager is the focusing center of the camera. The camera described in the present disclosure may be a still camera, a video camera, or another device with a photographing function, which is not limited in the present disclosure.
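As an illustration (not part of the patent text), the pinhole parameters listed above are commonly assembled into a 3x3 internal parameter matrix. The sketch below assumes zero skew and ignores the distortion term r; the function name is hypothetical.

```python
import numpy as np

def intrinsic_matrix(f_mm, sx, sy, u0, v0):
    """Assemble the pinhole internal parameter matrix from the focal
    length f (mm), the pixel densities Sx, Sy (pixels per mm), and the
    principal point offset (u0, v0) in pixels. Skew is assumed zero."""
    fx, fy = f_mm * sx, f_mm * sy   # focal length expressed in pixels
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])
```

For example, a 4 mm lens on a sensor with 200 pixels/mm yields fx = fy = 800 pixels.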
The internal parameters of the radar are parameters that reflect the radar's own characteristics, and may include but are not limited to at least one of the following, i.e., one or more of the parameters listed below: the power and type of the transmitter, the sensitivity and type of the receiver, the parameters and type of the antenna, the number and type of displays, etc. The radar described in the present disclosure may be a Light Detection and Ranging (LiDAR) system or a radio radar, which is not limited in the present disclosure.

The external parameters of a sensor are parameters describing the transformation between the position of an object in the world coordinate system and the position of that object in the sensor coordinate system. It should be noted that, in the case of multiple sensors, the sensor external parameters also include parameters reflecting the transformation relationships between the coordinate systems of the multiple sensors. The following likewise describes sensor external parameters, taking sensors including a camera and a radar as an example.
The external parameters of the camera are parameters used to transform points from the world coordinate system to the camera coordinate system. In the embodiments of the present disclosure, the external parameters of the calibration board relative to the camera can be used to reflect the position and/or attitude change parameters required to transform the calibration board from the world coordinate system to the camera coordinate system.

The external parameters of the camera may include but are not limited to one or a combination of the following parameters: the position and/or attitude change parameters required to transform the calibration board from the world coordinate system to the camera coordinate system, etc.

In addition, for the camera, distortion parameters also need to be considered, including radial distortion parameters and tangential distortion coefficients. Radial distortion and tangential distortion are positional deviations of image pixels along the radial direction or the tangential direction, with the distortion center as the center point, which cause the image to deform.

The position and/or attitude change parameters required to transform the calibration board from the world coordinate system to the camera coordinate system may include a rotation matrix R and a translation matrix T. The rotation matrix R consists of the rotation angle parameters about the three coordinate axes x, y, and z when the calibration board is transformed from the world coordinate system to the camera coordinate system, and the translation matrix T consists of the translation parameters of the origin in that transformation.

The external parameters of the radar are parameters used to transform points from the world coordinate system to the radar coordinate system. In the embodiments of the present application, the external parameters of the calibration board relative to the radar can be used to reflect the position and/or attitude change parameters required to transform the calibration board from the world coordinate system to the radar coordinate system.

The target external parameter between the camera and the radar refers to the parameters reflecting the transformation relationship between the radar coordinate system and the camera coordinate system; the external parameters between the camera and the radar can reflect the changes in position and attitude of the radar coordinate system relative to the camera coordinate system.

For example, the sensors may include a camera and a radar; calibrating the sensors then refers to calibrating one, or a combination, of the camera's internal parameters, the radar's internal parameters, and the target external parameter between the camera and the radar. A calibration board can be used to determine the above internal and/or external parameters; for example, the external parameters of the calibration board relative to the camera and the external parameters of the calibration board relative to the radar can be used to determine the target external parameter between the camera and the radar. It should be noted that the parameters actually calibrated may include but are not limited to the cases listed above.
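As a hedged illustration of how the two board extrinsics combine (not the patent's exact procedure, and with hypothetical function names): packing each (R, T) pair into a 4x4 homogeneous transform lets the radar-to-camera external parameter be computed by chaining the board-to-camera transform with the inverse of the board-to-radar transform.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix R (3x3) and translation t (3,) into a 4x4
    homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_from_radar(T_cam_board, T_radar_board):
    """A board point maps into the camera frame via T_cam_board and into
    the radar frame via T_radar_board, so the radar-to-camera external
    parameter is T_cam_board @ inv(T_radar_board)."""
    return T_cam_board @ np.linalg.inv(T_radar_board)
```

This is the algebraic core of using a shared calibration board to relate two sensor coordinate systems.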
例如图1所示,在传感器包括相机和雷达的情况下,传感器的标定方法可以包括以下步骤。
在步骤101中,通过所述相机采集多张第一图像。其中,所述多张第一图像中第一标定板的位姿信息不同。
在本公开实施例中,雷达可以是通过发射激光束来探测目标的位置、速度等特征量的激光雷达,或者是工作在毫米波频段的毫米波雷达等。
视野是传感器在位置不变的情况下,发射的光、电磁波等可以覆盖的范围。在本申请实施例中,以传感器包括雷达为例,视野指的是雷达发射的激光束或电磁波等可以覆盖的范围;以传感器包括相机为例,视野指的是相机的摄像头所能拍摄到的范围。
在本公开实施例中,第一标定板230位于雷达210和相机220的共同视野231的范围内,例如图2所示。其中,共同视野231的范围是指传感器中包括的传感元件各自覆盖的范围相互重叠的部分,即雷达210覆盖的范围(如图中的雷达视野211)与相机220拍摄的范围(如图中的相机视野221)重叠的部分(如图中的虚线示意的部分)。
在本公开实施例中,第一标定板可以采用圆形、长方形或正方形的固定间距图案的阵列平板。例如图3中的任一张第一图像所示,可以采用长方形的黑白格子相间的阵列平板。当然,标定板的 图案还可以包括其他规则图案,或是包括不规则但具备特征点集、特征边等特征参数的图案,在此对于标定板的形状、图案等内容不予限定。
在本步骤中,为了提高雷达和相机之间的目标外参的准确性,相机采集的第一图像的数目可以为多张,例如大于20张。在本公开实施例中,采集的多张第一图像中第一标定板的位姿可以不同,即多张第一图像中存在至少部分图像分别展示第一标定板的不同位姿,如不同的位置和/或姿态。例如图3中所示出的多张第一图像中,第一标定板存在俯仰(pitch)角、翻滚(roll)角、偏航(yaw)角三个维度的姿态变化。也就意味着,多张第一图像可以是在第一标定板处于不同位置和/或姿态下采集,即不同第一图像所包括的第一标定板的位姿信息可以相同或不同,但存在至少两张第一图像包括第一标定板不同的位姿信息。其中,每张第一图像中需要包括完整的第一标定板。
其中,第一图像的数量可以为m张,第一标定板的位姿的数量可以有n个。m,n均为大于等于2的整数。
其中,位姿信息包括用于反映第一标定板在三维空间中的姿态的信息。例如图3所示的第一标定板的位姿信息可以是,第一标定板俯仰角、翻滚角、偏航角三个维度中至少一个维度上的姿态变化。另外,在相机采集第一图像的过程中,可以使第一标定板处于静止状态。比如,可以使用支架固定第一标定板。
在一种实现方式中,位姿信息还包括位置信息。在所采集的多张第一图像中,可以包括多种距离(即距离较小、距离适中、距离较大等)下的第一标定板在不同位姿下的图像。为了确保雷达产生的激光可以覆盖到完整的第一标定板,通常在部署第一标定板的过程中,使第一标定板距离雷达较远。在对部署在不同距离的第一标定板的图像进行采集的过程中,在第一标定板距离相机的距离d1较小的情况下,例如距离d1小于距离阈值D1的情况下,采集包括不同姿态的第一标定板的多张第一图像;在距离d1较大的情况下,例如d1大于距离阈值D2的情况下,可以再额外采集包括不同姿态的第一标定板的多张第一图像;在距离d1适中的情况下,例如距离d1在上述两个距离阈值之间,即D1<d1<D2,同样可以再额外采集包括不同姿态的第一标定板的多张第一图像。这样就能得到第一标定板与相机之间在多种距离下所拍摄的第一图像。不同距离的第一图像,其位姿信息不同。
在本公开实施例中,为了后续更准确地确定第一标定板相对于相机的外参,多张第一图像中可以包括完整的第一标定板。例如图3所示多张第一图像中,第一标定板的面积占第一图像的面积的比例不同。如,当距离d1较远时,第一标定板在第一图像中的面积占比较小,当距离d1较近时,第一标定板在第一图像中的面积占比较大。
在步骤102中,获取预先标定的所述相机的第一内参,并根据所述第一内参和所述多张第一图像,确定各个位姿的第一标定板分别相对于所述相机的外参。
由于相机视野边缘畸变较大,需要更准确地确定畸变参数。同时畸变参数对相机的内参影响较大,因此如果直接根据多张第一图像来标定相机的内参,标定结果可能不够准确。在本公开实施例中,可以直接获取预先标定好的相机的第一内参,根据相机的第一内参和之前相机采集的多张第一图像,可以确定不同位姿的第一标定板相对于所述相机的外参。
在本公开实施例中,相机的第一内参是在初次标定传感器的情况下,对相机进行标定,得到的相机的内参。在初次标定相机内参的情况下,相机采集多张包括完整的第二标定板的第二图像,根据多张第二图像标定相机的第一内参。其中,多张第二图像中所述第二标定板的位姿信息不同。第二标定板可以距离相机较近,且靠近相机的视野边缘,这样确定出的相机的第一内参比采用多张第一图像标定出的相机的内参更加准确。本公开实施例中,初次标定相机的第一内参之后,再次对传感器进行标定的情况下,可以直接获取预先标定的相机的第一内参。进一步地,可以采用张正友标定法等方式标定相机的第一内参。根据第一内参和多张第一图像,确定第一标定板相对于相机的外参,包括旋转矩阵R和平移矩阵T。
在步骤103中,获取所述各个位姿的所述第一标定板的多组雷达点云数据,并根据所述各个位姿的所述第一标定板分别相对于所述相机的外参和所述多组雷达点云数据,确定所述雷达和所述相机之间的目标外参。
在本公开实施例中,已经通过相机采集了不同位姿的第一标定板的多张第一图像,针对每个位姿的第一标定板可以同时采集对应的雷达点云数据。其中,雷达点云数据是该雷达发射的激光或电磁波穿过不同位姿的第一标定板所产生的包括多个雷达点的数据。且为了提高最终确定的雷达和相机之间的目标外参的准确性,该雷达点云数据包括基于完整的第一标定板得到的点云数据。
例如图4所示,第一标定板420的边缘与雷达410发射的激光或电磁波不平行,可以有一定角度,确保第一标定板420每个边缘都有雷达410发射的激光或电磁波穿过,以便后续可以更好的确定第一标定板在雷达点云数据中匹配的目标雷达点云数据。
所述雷达和所述相机之间的目标外参,属于相机和雷达之间的外参。
上述实施例中,在标定过程中,根据预先标定的相机的第一内参,以及多张第一图像,得到第一标定板相对于相机的外参,即在相机与雷达之间相对位置关系或是俯/仰角度发生改变的情况下,可以根据预先标定的相机的第一内参来实现传感器的标定。
在一些可选实施例中,例如图5所示,在执行上述步骤102获取预先标定的所述相机的第一内参之前,该方法还包括步骤100。
在步骤100中,响应于初次标定所述传感器,对所述相机进行标定,得到所述相机的所述第一内参。
在本公开实施例中,如果是初次对传感器进行标定的情况,则可以对相机进行标定,得到相机的第一内参。
步骤102中获取预先标定的相机的第一内参可以包括:响应于再次标定所述传感器,获取初次标定所述传感器得到的所述相机的所述第一内参。
如果是再次对传感器进行标定的情况,例如需要再次标定传感器中雷达和相机之间的目标外参的情况下,可以直接获取初次标定传感器得到的相机的第一内参。
上述实施例中,响应于初次标定传感器,对相机进行标定,得到相机的第一内参,响应于再次标定传感器,可以直接获取初次标定传感器得到的相机的第一内参。这样可以将相机内参标定过程、以及雷达和相机之间的目标外参的标定过程区分开,并在再次标定传感器的过程中,直接基于初次标定传感器得到的相机的第一内参来实现传感器标定,无需反复对相机的内参进行标定,有效提高了确定目标外参的速度。
在一些可选实施例中,在初次标定传感器的情况下,第二标定板应位于相机的视野范围内,第二图像中可以包括完整的第二标定板,例如图6所示。为了提高初次标定相机的第一内参的准确性,可以让第二标定板620位于相机610的相机视野611的边缘。
例如图7所示,上述步骤100可以包括以下步骤。
在步骤100-1中,通过所述相机采集多张第二图像。
其中,所述多张第二图像中所述第二标定板的位姿信息不同。
第二标定板可以与第一标定板相同或不同。在本公开实施例中,第一标定板与第二标定板相同,指的可以是采用同一块标定板来分别实现第一标定板和第二标定板的功能;在该同一标定板作为第二标定板使用的情况下,其位姿可以与其作为第一标定板时所处的位姿相同,也可以不同。第一标定板与第二标定板不同,指的可以是采用完全不同或部分不同的标定板来分别实现第一标定板和第二标定板的功能。其中,位姿信息可以包括第二标定板在三维空间中的姿态,例如俯仰角、翻滚角、偏航角三个维度的姿态变化。
在相机采集第二图像的过程中,第二标定板应处于静止状态。可以使用支架固定第二标定板。
相机在采集第二图像的情况下,为了提高第一内参的准确性,让第二标定板尽可能靠近相机视野边缘,这样相机采集到的多张第二图像中,第二标定板在所述第二图像中占据的比例大于预设值。可选地,所述预设值可以是一个具体的数值,或是范围值。以预设值为范围值为例,预设值的范围值会影响相机的每个第一内参的准确性,因此,为了提高后续确定的相机的第一内参的准确性,可以将预设值设置为[0.8,1]之间的数值。例如图8所示,在该图中,第二标定板在整个图像中的占比在预设值范围内,因此可以将该图作为第二图像。
为了提高确定的相机的第一内参的准确性,相机采集的第二图像的数目可以为多张,例如大于20张。在本公开实施例中,采集的多张第二图像中第二标定板的位姿信息可以不同,即多张第二图像中存在至少部分图像分别展示第二标定板的不同位姿,例如存在俯仰角、翻滚角、偏航角三个维度的姿态变化。也就意味着,多张第二图像可以是在第二标定板处于不同位置和/或姿态下采集,即不同第二图像所包括的第二标定板的位姿信息可以相同或不同,但存在至少两张第二图像包括第二标定板不同的位姿信息。其中,每张第二图像中需要包括完整的第二标定板。
其中,第二图像的数量可以为c张,第二标定板的位姿的数量可以有d个。c,d均为大于等于2的整数。c可以等于前述第一图像的数量m,也可以不等于m,类似的,d可以等于前述第二标定板的位姿的数量n,也可以不等于n。
为了提高相机的第一内参的准确性,相机采集的多张第二图像不应出现图像模糊,其中,图像模糊可能是由于传感器的运动,即相机的运动造成相机与第二标定板之间出现相对运动所造成的。可选地,可以确定相机采集到的多张第二图像中是否有运动模糊的图像,去除运动模糊的图像。或者可以通过预设脚本过滤掉运动模糊的图像。
在步骤100-2中,根据所述多张第二图像,分别确定所述相机的多个第一备选内参,并将所述多个第一备选内参中的一个第一备选内参,确定为所述第一内参。
在本公开实施例中,可以采用预设的matlab工具箱根据多张第二图像来分别标定出相机的多个第一备选内参。
对于多个第一备选内参中的每个第一备选内参,可以通过所述相机,将位于相机坐标系中的预设点重投影到像素坐标系中,得到对应的投影点,然后比较投影点与对应的预设点在像素坐标系中两者之间的误差,可以得到预设点的误差值。比较各个第一备选内参分别得到的误差值,将误差值最小的一个第一备选内参作为相机的第一内参。
上述步骤100-1、100-2是在初次标定传感器的情况下,对相机的第一内参进行标定的过程,与步骤101在执行时没有先后顺序的限定。如果是再次标定传感器的情况,则可以直接获取预先标定的相机的第一内参。
在本公开实施例中,相机的第一备选内参,就是根据相机采集到的包括不同位姿信息的第二标定板的多张第二图像,分别确定出的相机的多个备选内参。其中,按照上述方式确定的投影点与对应的预设点在像素坐标系中的误差值最小的一个第一备选内参,被选为相机的第一内参。
上述实施例中,可以确定相机的多个第一备选内参,从而在多个第一备选内参中确定一个作为第一内参,提高了确定相机内参的精度和准确度,可用性高。
在一些可选实施例中,例如图9所示,步骤100-2可以包括以下步骤。
在步骤100-21中,通过所述相机,分别按照所述多个第一备选内参,将位于相机坐标系中的预设点投影到像素坐标系,获得所述预设点在所述像素坐标系中的多个第一坐标值。
预设点的数目可以是一个或多个,对于每个预设点,相机可以分别采用不同的第一备选内参,将位于相机坐标系中的预设点投影到像素坐标系中,获得每个预设点在像素坐标系中的多个第一坐标值。
例如图10A所示,将3D空间的一个预设点P投影到2D空间中,获得对应的第一坐标值P1。
在步骤100-22中,对于每个备选内参,获取所述预设点在验证图像中的第二坐标值,并确定与第二坐标值对应的第一坐标值,得到存在对应关系的坐标对,其中,所述验证图像为所述多张第二图像中的一张或多张第二图像。
对于每个备选内参,可以确定预设点在像素坐标系中的第二坐标值,例如图10B所示第二坐标值为P2,确定与第二坐标值P2对应的第一坐标值P1。对于各个备选内参,可以得到多组存在对应关系的坐标对。例如,对于第一备选内参,P2对应P1,P1和P2构成一组坐标对。再例如对于第二备选内参,P2’对应P1’,则P1’和P2’构成另一组坐标对。
当所述验证图像为多张第二图像时,对于备选内参i,可以得到预设点在验证图像j上的第二坐标值,以及与该第二坐标值对应的第一坐标值,组成一组坐标对 (P1^(i,j), P2^(i,j));然后得到该预设点在多张验证图像上的多组坐标对 {(P1^(i,j), P2^(i,j)) | j=1,2,…,J}(J为验证图像的数量),可以将其记为Pi。
在步骤100-23中,对于每个第一备选内参,确定该第一备选内参包括的坐标对中第一坐标值与第二坐标值之间的距离,并将多个第一备选内参中距离最小的一个第一备选内参,确定为所述相机的第一内参。
在本公开实施例中,可以分别计算每一组坐标对中第一坐标值与第二坐标值之间的距离。最小距离对应的一个第一备选内参,可以作为相机的第一内参。
假设第一坐标值和第二坐标值之间的距离分别为d1、d2和d3,其中d2最小,且d2对应的是第一备选内参2,则可以将第一备选内参2作为相机的第一内参。
当第一备选内参i包括多个坐标对时,可以分别计算Pi中每组坐标对的距离,然后可以得到多组坐标对的总距离,例如可以将每组坐标对的距离相加,得到该总距离。比较每个第一备选内参的总距离,将多个第一备选内参中总距离最小的一个第一备选内参,确定为所述相机的第一内参。
为了简单起见,上面以一个预设点为例进行说明。但本领域技术人员可知,预设点可以有多个;在多个预设点的情况下,获取第一内参的方法与一个预设点的情况类似。例如,对于每个第一备选内参,可以计算每个预设点的坐标对的距离,然后计算多个预设点的距离的平均值,将多个第一备选内参中距离平均值最小的一个第一备选内参确定为所述相机的第一内参。
上述实施例中,将重投影误差最小的一个第一备选内参作为相机的第一内参,使得相机的第一内参更加准确。
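上述"在多个第一备选内参中选出重投影误差(距离)最小者"的选取过程,可以用如下示意代码表达(函数名与数据组织方式均为假设,仅为说明思路,此处忽略畸变):

```python
import numpy as np

def reprojection_error(K, pts_cam, pts_pix):
    """用备选内参 K 将相机坐标系中的预设点投影到像素坐标系(第一坐标值),
    并计算其与观测到的像素坐标(第二坐标值)之间的平均距离。"""
    errs = []
    for P, p_obs in zip(pts_cam, pts_pix):
        uvw = K @ P                  # 针孔投影
        p_proj = uvw[:2] / uvw[2]    # 齐次坐标归一化
        errs.append(np.linalg.norm(p_proj - np.asarray(p_obs)))
    return float(np.mean(errs))

def select_first_intrinsic(candidates, pts_cam, pts_pix):
    """在多个第一备选内参中选出重投影误差最小的一个作为第一内参。"""
    return min(candidates, key=lambda K: reprojection_error(K, pts_cam, pts_pix))
```

当预设点有多个、验证图像有多张时,只需把各坐标对的距离求和或取平均后再比较即可。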
在一些可选实施例中,例如图11所示,步骤102可以包括以下步骤。
在步骤102-1中,对于所述多张第一图像中的每一张第一图像,根据所述第一内参对该第一图像进行去畸变处理,得到与该第一图像对应的第三图像。
例如,同时设置有雷达和相机的机器设备(例如同时设置了雷达和相机的车辆)上部署的具有图像处理功能的设备(该设备可以为雷达、相机或其他设备),可以对多张第一图像进行去畸变处理。
本公开实施例中,为了后续可以获得比较准确的第一标定板相对于相机的外参,可以先根据预先标定的相机的第一内参对多张第一图像去畸变,得到多张第三图像,根据多张第三图像确定相机的第二内参,即相机在没有畸变的理想情况下的内参,再根据相机的第二内参确定第一标定板相对于相机的外参。
其中,相机内参可以通过内参矩阵A’表示,如公式1所示:
A' = [[f·Sx, r, u0], [0, f·Sy, v0], [0, 0, 1]]    公式1
其中,各参数的含义可以参见前述关于相机参数的描述。
对多张第一图像进行去畸变处理的过程,就是在上述内参矩阵A'中忽略由于畸变造成的像素距离成像仪中心的距离值r的影响,让r尽可能为0,忽略畸变影响的内参矩阵A可以用公式2表示:
A = [[f·Sx, 0, u0], [0, f·Sy, v0], [0, 0, 1]]    公式2
从而可以得到多张与第一图像分别对应的第三图像。
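公式1、公式2中内参矩阵的构造可以示意如下(参数命名为假设,仅为说明r=0即为忽略畸变影响的矩阵A):

```python
import numpy as np

def intrinsic_matrix(f, sx, sy, u0, v0, r=0.0):
    """按公式1构造内参矩阵 A';取 r=0 即得到公式2中忽略畸变影响的矩阵 A。
    f 为焦距,sx、sy 为横向和纵向每单位长度的像素数目,(u0, v0) 为主点偏移。"""
    return np.array([
        [f * sx, r,      u0],
        [0.0,    f * sy, v0],
        [0.0,    0.0,    1.0],
    ])
```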
在步骤102-2中,根据多张第三图像,确定所述相机的第二内参。
可以通过预设的matlab工具箱根据去畸变处理后的多张第三图像,来分别确定相机的多个第二备选内参,相机分别采用不同的第二备选内参将位于相机坐标系的预设点投影到像素坐标系,获得多个第三坐标值。观测到的每个预设点在像素坐标系中的第四坐标值与相应的第三坐标值作为一组存在对应关系的坐标对,将多组坐标对中最小距离对应的一个第二备选内参作为相机的第二内参。
在本公开实施例中,第二内参是根据去畸变之后的多张第三图像确定出的相机的内参。
相机的多个第二备选内参,是根据相机采集到的包括不同位姿信息的第一标定板的多张第一图像进行去畸变后得到的多张第三图像,所确定出的相机在理想状态下的多个内参。第二内参是多个第二备选内参中,投影点与对应的预设点在像素坐标系中的误差值最小的一个第二备选内参,即相机在没有畸变的理想状态下的内参。
在步骤102-3中,根据多张第三图像以及所述相机的第二内参,确定各个位姿的所述第一标定板分别相对于所述相机的外参。
可以先分别计算每张第三图像对应的单应性(Homography)矩阵H,得到多个单应性矩阵H,再根据第二内参和多个单应性矩阵,计算出不同位姿的第一标定板相对于相机的外参,可以包括旋转矩阵R和平移矩阵T。
其中,单应性矩阵是描述世界坐标系和像素坐标系之间的位置映射关系的矩阵。
上述实施例中,可以根据相机的第一内参,对相机拍摄的多张第一图像进行去畸变,获得多张第三图像,根据多张第三图像来确定相机的第二内参,第二内参相当于理想情况下没有畸变的相机的内参。再根据多张第三图像和第二内参,确定第一标定板相对于相机的外参,通过上述方式得到的第一标定板相对于所述相机的外参准确性更高。
在一些可选实施例中,例如图12所示,上述步骤102-3可以包括以下步骤。
在步骤102-31中,分别确定每张所述第三图像对应的单应性矩阵。
在本公开实施例中,每张第三图像对应的单应性矩阵H可以采用以下方式计算:
s·[u, v, 1]^T = A[r1 r2 t]·[X, Y, 1]^T    公式3
H=A[r1 r2 t]    公式4
其中,r1、r2、r3是组成旋转矩阵R的旋转列向量,维度是3×1,t是平移矩阵T的向量形式。
根据公式3和公式4可以得出公式5:
s·[u, v, 1]^T = H·[X, Y, 1]^T    公式5
其中,(u,v)是像素坐标,(X,Y)对应标定板的坐标,s是尺度因子。
在本公开实施例中,通过公式5可以计算出多张第三图像分别对应的单应性矩阵H。
在步骤102-32中,根据所述相机的第二内参和多个所述单应性矩阵,确定各个位姿的所述第一标定板分别相对于所述相机的外参。
在计算得到多个单应性矩阵H之后,在确定不同位姿的所述第一标定板相对于所述相机的外参R和T的情况下,可以采用前述公式4进行计算,其中,单应性矩阵H是一个3×3的矩阵,公式4可以进一步表示为:
[h1 h2 h3]=λA[r1 r2 t]   公式6
其中,λ表示尺度因子。
计算得到r1=λA^(-1)h1,r2=λA^(-1)h2,r3=r1×r2,其中,λ=1/||A^(-1)h1||=1/||A^(-1)h2||。r1、r2和r3构成了3×3的旋转矩阵R。
根据公式6还可以计算得到t=λA^(-1)h3,t构成3×1的平移矩阵T。
上述实施例中,可以分别确定每张第三图像对应的单应性矩阵,根据得到的多个单应性矩阵和相机的第二内参,确定各个位姿的第一标定板分别相对于相机的外参,使得第一标定板相对于相机的外参更加准确。
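公式6中由单应性矩阵H恢复外参R、T的计算,可以按上述公式直接实现为如下草图(函数名为假设):

```python
import numpy as np

def extrinsic_from_homography(A, H):
    """按公式6从单应性矩阵 H 中恢复外参:
    r1 = lam*A^(-1)h1, r2 = lam*A^(-1)h2, r3 = r1 x r2, t = lam*A^(-1)h3,
    其中 lam = 1/||A^(-1)h1||。"""
    A_inv = np.linalg.inv(A)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(A_inv @ h1)
    r1 = lam * (A_inv @ h1)
    r2 = lam * (A_inv @ h2)
    r3 = np.cross(r1, r2)
    t = lam * (A_inv @ h3)
    R = np.column_stack([r1, r2, r3])   # 3x3 旋转矩阵 R
    return R, t                          # t 构成 3x1 平移矩阵 T
```

实际应用中,由噪声数据得到的R可能需要再做一次正交化处理,这里未展开。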
在一些可选实施例中,例如图13所示,上述步骤103可以包括以下步骤。
在步骤103-1中,针对每个位姿的第一标定板,根据该第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,在该位姿下的所述雷达点云数据中确定一组与该第一标定板匹配的目标雷达点云数据。
其中,外参参考值可以是根据雷达和相机之间的大致位置和朝向,得出的粗略估计的雷达和相机之间的外参值。可以根据外参参考值将雷达所在的坐标系与相机坐标系重合起来,统一到相机坐标系中。
在本公开实施例中,可以针对每个位姿的所述第一标定板,根据所述第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,采用M估计(M-estimator SAmple Consensus,MSAC)算法来确定第一标定板所在的目标平面。进一步地,使用均值偏移(MeanShift)聚类算法在所述目标平面上,在对应的雷达点云数据中确定出与第一标定板匹配的目标雷达点云数据。
在步骤103-2中,根据多组所述目标雷达点云数据分别与各个位姿的第一标定板之间的匹配关系,确定所述雷达和所述相机之间的目标外参。
在本公开实施例中,雷达到相机的位姿只有一个。但是在确定雷达到相机之间的目标外参时,使用的第一标定板的位姿有多个,例如n个,因此在步骤103-1中可以得到n组目标雷达点云数据,这n组目标雷达点云数据可以分别与n个位姿的第一标定板进行匹配,得到n个匹配关系。然后通过这n个匹配关系,可以计算出雷达和相机之间的外参。
在本公开实施例中,可以基于多组目标雷达点云数据与所述第一标定板之间的匹配关系,采用最小二乘法确定出所述雷达和所述相机之间的目标外参。
上述实施例中,针对每个位姿的第一标定板,可以根据该第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,采用M估计算法来确定第一标定板所在的目标平面。进一步地,使用均值偏移聚类算法在所述目标平面上,在对应的雷达点云数据中确定出与第一标定板匹配的一组目标雷达点云数据。自动在雷达点云数据中确定出与第一标定板匹配的目标雷达点云数据,降低了匹配误差,提高了点云匹配的准确度。根据多组所述目标雷达点云数据与所述第一标定板之间的匹配关系,确定所述雷达和所述相机之间的目标外参,可以快速确定出雷达和相机之间的目标外参,提高了目标外参的准确性。
在一些可选实施例中,例如图14所示,针对某一个位姿的第一标定板,上述步骤103-1可以包括以下步骤。
在步骤103-11中,根据该第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,确定该第一标定板所在的备选位置。
在本公开实施例中,可以先根据该位姿的第一标定板相对于所述相机的外参和预估的所述雷达和所述相机之间的外参参考值,在针对该第一标定板采集的雷达点云数据中预估第一标定板所在的位置,获得该第一标定板所在的大致位置和方向。将第一标定板所在的大致位置和方向作为预估的备选位置。该备选位置表示在由雷达点云数据组成的图中,该第一标定板大致的位置。
在步骤103-12中,根据所述备选位置,在该位姿下的雷达点云数据中确定该第一标定板所在的目标平面。
本公开实施例中,可以从该位姿下采集的一组雷达点云数据中,随机选取位于所述备选位置所对应的区域内的多个第一雷达点,可以获得由多个第一雷达点组成的第一平面。重复多次这样的选取,可以得到多个第一平面。
再针对每个第一平面,分别计算该组雷达点云数据中除多个第一雷达点以外的其他雷达点到第一平面的距离。将其他雷达点中所述距离值小于预设阈值的雷达点作为第二雷达点,并将第二雷达点确定为第一平面中的雷达点。将雷达点数目最多的一个第一平面作为第一标定板所在的目标平面。该目标平面表示在由雷达点云数据组成的图中,该第一标定板所在的平面。
在步骤103-13中,在所述目标平面上,确定一组与该位姿的第一标定板匹配的目标雷达点云数据。
在每个目标平面上,根据第一标定板的尺寸随机确定第一圆形区域。初始的第一圆形区域可以是第一标定板外接圆对应的区域。在每组所述雷达点云数据中,随机选取位于初始的第一圆形区域内的任一个雷达点作为第一圆形区域的第一圆心,以调整所述第一圆形区域在所述雷达点云数据中的位置。其中,第一标定板的尺寸是在由雷达点云数据组成的图中第一标定板的大小。
以所述第一圆心为起点,所述雷达点云数据中位于所述第一圆形区域内的多个第三雷达点为终点,分别得到多个第一向量。将所述多个第一向量相加后得到第二向量。再基于第二向量,确定第一标定板的目标中心位置。其中,第一标定板的目标中心位置是在由雷达点云数据组成的图中,所确定的第一标定板的中心位置。
进一步地,根据所述第一标定板的所述目标中心位置和所述第一标定板的尺寸,在所述雷达点云数据中确定与所述第一标定板匹配的一组目标雷达点云数据。
在一些可选实施例中,例如图15所示,步骤103-12可以包括以下步骤。
在步骤103-121中,从该位姿下的所述雷达点云数据中,确定两个或多个第一雷达组,并针对每个第一雷达组,确定相对应的第一平面,其中,每个第一雷达组包括多个随机选取的、位于备选位置对应区域内的第一雷达点,每个第一雷达组相对应的第一平面包括该第一雷达组的多个第一雷达点。
本公开实施例中,可以从某一位姿对应的所述雷达点云数据中,每次随机选取位于所述备选位置所对应区域内的多个第一雷达点,得到一个第一雷达组,每次可以获得由该第一雷达组的多个第一雷达点组成的第一平面。如果分多次随机选取多个第一雷达点,则可以得到多个第一平面。
例如,假设雷达点包括1、2、3、4、5、6、7和8,第一次随机选取第一雷达点1、2、3和4组成第一平面1,第二次随机选取第一雷达点1、2、4和6组成第一平面2,第三次随机选取了第一雷达点2、6、7和8组成第一平面3。
在步骤103-122中,针对每个所述第一平面,分别确定该位姿下的所述雷达点云数据中的除所述多个第一雷达点以外的其他雷达点到所述第一平面的距离。
例如,对于第一平面1,可以计算其他雷达点5、6、7、8分别到第一平面1的距离值,对于第一平面2,可以计算其他雷达点3、5、7、8分别到第一平面2的距离值,同样地,对于第一平面3,可以计算其他雷达点1、3、4、5分别到第一平面3的距离值。
在步骤103-123中,针对每个所述第一平面,将所述其他雷达点中所述距离小于阈值的雷达点作为第二雷达点,并将所述第二雷达点确定为所述第一平面中包括的雷达点。
例如,假设对于第一平面1,其他雷达点5到第一平面1的距离值小于预设阈值,则将雷达点5作为第二雷达点,最终第一平面1包括雷达点1、2、3、4和5,同样地,可以假设第一平面2包括雷达点1、2、4、6、7,以及可以假设第一平面3包括雷达点1、3、4、5、6、8。
在步骤103-124中,在两个或多个所述第一平面中,将包括的雷达点数目最多的一个第一平面作为所述目标平面。
将雷达点数目最多的一个第一平面,例如第一平面3作为第一标定板所在的目标平面。
采用上述方法可以针对每组雷达点云数据,确定某一位姿的第一标定板所在的一个目标平面。拟合出的目标平面更加准确,可用性高。
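步骤103-121至103-124的平面拟合过程(随机采样构成第一平面、按距离阈值统计第二雷达点、取点数最多的平面作为目标平面),可以用如下简化草图实现;这属于MSAC/RANSAC思想的示意,函数名与参数均为假设:

```python
import numpy as np

def fit_target_plane(points, region_mask, n_iters=100, threshold=0.05, seed=0):
    """在备选位置对应区域内随机选取第一雷达点拟合多个第一平面,
    将到平面距离小于阈值的其他雷达点计为第二雷达点(内点),
    取包含雷达点数目最多的第一平面作为目标平面。"""
    rng = np.random.default_rng(seed)
    region_idx = np.flatnonzero(region_mask)
    best_plane, best_inliers = None, np.array([], dtype=int)
    for _ in range(n_iters):
        sample = points[rng.choice(region_idx, size=3, replace=False)]
        # 由三个第一雷达点确定平面 n·p + d = 0
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # 三点共线,跳过本次采样
            continue
        n = n / norm
        d = -n @ sample[0]
        dist = np.abs(points @ n + d)        # 各雷达点到第一平面的距离
        inliers = np.flatnonzero(dist < threshold)
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = (n, d), inliers
    return best_plane, best_inliers
```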
在一些可选实施例中,例如图16所示,步骤103-13可以包括以下步骤。
在步骤103-131中,在所述目标平面上,根据所述第一标定板的尺寸确定初始第一圆形区域。
本公开实施例中,在确定了第一标定板所在的目标平面之后,可以在目标平面上按照第一标定板的尺寸确定初始第一圆形区域,该初始第一圆形区域的大小可以采用该第一标定板的外接圆大小。其中,第一标定板的尺寸是在由雷达点云数据组成的图中第一标定板的大小。
在步骤103-132中,在所述雷达点云数据中,随机选取位于所述初始第一圆形区域内的任一个雷达点作为第一圆形区域的第一圆心,以确定所述第一圆形区域在所述雷达点云数据中的位置。
本公开实施例中,在确定了初始第一圆形区域后,在所述初始第一圆形区域内的雷达点云数据中,随机选取一个雷达点作为该第一圆形区域的第一圆心。通过第一圆心后续调整第一圆形区域在雷达点云数据中的位置。第一圆形区域的半径与初始第一圆形区域的半径相同。
在步骤103-133中,以所述第一圆心为起点,所述雷达点云数据中位于所述第一圆形区域内的多个第三雷达点为终点,分别得到多个第一向量。
在本公开实施例中,例如图17所示,可以以第一圆心170为起点,雷达点云数据中位于所述第一圆形区域内的多个第三雷达点171为终点,得到多个第一向量172。
在一些例子中,如图17所示,第三雷达点171可以有效覆盖第一圆形区域。
在步骤103-134中,将所述多个第一向量相加后得到第二向量。
在本公开实施例中,将所有的第一向量相加后可以得到一个Meanshift向量,即第二向量。
在步骤103-135中,基于所述第二向量,确定所述第一标定板的目标中心位置。
本公开实施例中,将第二向量的终点作为第二圆心,按照第一标定板的尺寸,得到第二圆形区域。以第二圆心为起点,第二圆形区域内的多个第四雷达点为终点,分别得到多个第三向量。多个第三向量相加得到第四向量,再将第四向量的终点作为新的第二圆心,得到新的第二圆形区域,重复上述确定第四向量的步骤,直到第四向量收敛为预设值,将此时对应的第二圆心作为第一标定板的备选中心位置。该备选中心位置,是第一标定板在雷达点云数据组成的图中的备选中心位置。
可以确定该备选中心位置与第一标定板中心位置是否重合,如果重合可以直接将备选中心位置作为目标中心位置,否则可以重新确定新的所述备选中心位置,直到确定出最终的目标中心位置。
在步骤103-136中,根据所述第一标定板的所述目标中心位置和所述第一标定板的尺寸,在所述雷达点云数据中确定一组与该第一标定板匹配的所述目标雷达点云数据。
在本公开实施例中,在确定了第一标定板的所述目标中心位置之后,根据第一标定板的目标中心位置和尺寸,就可以确定出第一标定板对应的位置,第一标定板与实际上该位姿的第一标定板相匹配,从而可以将雷达点云数据中与该位姿的第一标定板位置匹配的雷达点云数据作为目标雷达点云数据。
在一些可选实施例中,例如图18所示,步骤103-135可以包括:
在步骤103-1351中,以所述第二向量的终点为第二圆心,根据所述第二圆心和所述第一标定板的尺寸确定第二圆形区域。
本公开实施例中,可以将第二向量的终点作为第二圆心,再次以第二圆心为新的圆心,半径为第一标定板外接圆半径得到第二圆形区域。
在步骤103-1352中,以所述第二圆心为起点,所述雷达点云数据中位于所述第二圆形区域内的多个第四雷达点为终点,分别确定多个第三向量。
本公开实施例中,再将第二圆心作为起点,雷达点云数据中位于第二圆心区域内的多个第四雷达点作为终点,分别得到多个第三向量。
在步骤103-1353中,将所述多个第三向量相加后得到第四向量。
在步骤103-1354中,判断第四向量的向量值是否收敛为预设值。
若第四向量的向量值收敛为预设值,则跳转到步骤103-1356。若第四向量的向量值未收敛为预设值,则跳转到步骤103-1355。可选地,预设值可以接近于零。
在步骤103-1355中,将所述第四向量的终点作为所述第二圆心,根据所述第二圆心和所述第一标定板的尺寸确定所述第二圆形区域,并跳转至步骤103-1352。
本公开实施例中,可以将第四向量的终点重新作为新的第二圆心,按照上述步骤103-1352至103-1354的方式再次计算得到新的第四向量,并判断第四向量的向量值是否收敛。不断重复上述过程,直到最终得到的第四向量的向量值收敛到预设值。
在步骤103-1356中,将收敛的所述第四向量所对应的所述第二圆心作为所述第一标定板的备选中心位置。
本公开实施例中,在第四向量的向量值收敛为所述预设值的情况下,第四向量所对应的第二圆心可以作为第一标定板的备选中心位置。
在步骤103-1357中,响应于所述备选中心位置与所述第一标定板的中心位置重合,将所述备选中心位置作为所述目标中心位置。
本公开实施例中,可以在雷达点云数据组成的图中判断备选中心位置与第一标定板的中心位置是否重合,如果重合,可以直接将该备选中心位置作为最终第一标定板的目标中心位置。

在一些可选实施例中,例如图19所示,步骤103-135还可以包括以下步骤。
在步骤103-1358中,响应于所述备选中心位置与所述第一标定板的所述中心位置不重合,重新确定所述备选中心位置。
可以在备选中心位置与所述第一标定板的所述中心位置不重合的情况下,删除该第二圆形区域内的所有雷达点,重新确定新的第二圆形区域。或者直接删除这一组雷达点云数据,根据另一组与第一标定板的其他姿态对应的雷达点云数据,来重新确定第一标定板的备选中心位置,直到确定出的备选中心位置与所述第一标定板的所述中心位置重合。
此时再执行步骤103-1357,将该备选中心位置作为与第一标定板的当前目标姿态对应的目标中心位置。
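步骤103-133至103-135所述"在圆形区域内求偏移向量并迭代移动圆心直至收敛"的过程,与经典的均值偏移(MeanShift)一致。下面给出一个用平均偏移代替向量求和的简化草图(仅为示意,非本公开原文实现,函数名为假设):

```python
import numpy as np

def meanshift_center(points, start, radius, max_iters=100, eps=1e-6):
    """从初始圆心出发,反复计算圆形区域内各雷达点相对圆心的偏移向量的均值
    (对应文中的第二/第四向量),并把圆心移动到新位置;偏移收敛为近似零时,
    此时的圆心即第一标定板的备选中心位置。"""
    center = np.asarray(start, dtype=float)
    for _ in range(max_iters):
        d = np.linalg.norm(points - center, axis=1)
        in_circle = points[d < radius]           # 圆形区域内的第三/第四雷达点
        if len(in_circle) == 0:
            break
        shift = in_circle.mean(axis=0) - center  # 平均偏移向量
        center = center + shift
        if np.linalg.norm(shift) < eps:          # 向量值收敛为预设值
            break
    return center
```

迭代停止后,还需如文中所述检查该备选中心位置与第一标定板中心位置是否重合,不重合时重新确定。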
在一些可选实施例中,步骤103-2可以包括:根据g个匹配关系,确定所述雷达和所述相机之间的一个备选外参,并根据所述雷达和所述相机之间的多个备选外参,确定所述雷达和所述相机之间的目标外参。
在本公开实施例中,可以根据g个匹配关系,采用最小二乘法,通过最小化雷达和所述相机之间的外参误差的平方和的方式,来确定一个备选外参,其中g为大于或等于3的整数。
例如,位姿信息1的第一标定板对应目标雷达点云数据1、位姿信息2的第一标定板对应目标雷达点云数据2,以此类推,共有n组匹配关系。可以根据第1至3组匹配关系,确定备选外参1;根据第1至4组匹配关系,确定备选外参2;根据第1、2、4组匹配关系,确定备选外参3;等等,从而确定出多个备选外参。
在上述确定出的多个备选外参中,确定出投影效果最好的一个备选外参作为所述雷达和所述相机之间的目标外参。
上述实施例中,可以根据多个匹配关系,分别确定出雷达和相机之间的备选外参,根据多个备选外参,选择投影效果最好的一个备选外参,作为雷达和相机之间的目标外参,提高了雷达和相机之间的目标外参的准确性。
在一些可选实施例中,例如图20所示,步骤103可以进一步包括以下步骤。
在步骤103-21中,通过所述雷达基于每个备选外参对所述第一标定板进行投影,投影到任一第一图像上,生成一组投影数据。
在相机坐标系中,将雷达和相机之间的每个备选外参、相机内参矩阵与雷达点云数据相乘,可以实现对雷达点云数据的投影,投影到某一第一图像上,得到一组投影数据,例如图21A所示。该雷达点云数据可以是之前采集的多组雷达点云数据中的一组,也可以是新采集的雷达点云数据。为了更好地进行后续的比较,被采集的目标中需要包括第一标定板。
在步骤103-22中,在多组投影数据中确定投影与所述第一图像匹配度最高的一组投影数据为目标投影数据。
在多组投影数据中,确定投影与第一图像匹配度最高的一组投影数据,将该组投影数据确定为目标投影数据。例如,两组投影数据分别投影到第一图像上的效果如图21A和图21B所示,图21A的投影效果比图21B的投影效果好,则图21A对应的投影数据就是目标投影数据。
在步骤103-23中,确定所述目标投影数据对应的备选外参,为所述雷达和所述相机之间的目标外参。
目标投影数据对应的备选外参,就是雷达和相机之间的目标外参。
上述实施例中,可以根据投影效果对多个备选外参进行验证,将投影效果最好的备选外参作为最终的目标外参,提高了雷达和相机之间的目标外参的准确性。
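上述"备选外参、内参矩阵与雷达点云相乘得到投影数据"的计算过程可示意如下(函数名为假设,仅为说明投影验证的原理):

```python
import numpy as np

def project_lidar_points(points_lidar, K, R, t):
    """将雷达点云按备选外参 (R, t) 变换到相机坐标系,
    再用相机内参矩阵 K 投影到像素坐标系,得到一组投影数据。"""
    pts_cam = points_lidar @ R.T + t          # 雷达坐标系 -> 相机坐标系
    pts_cam = pts_cam[pts_cam[:, 2] > 1e-6]   # 仅保留位于相机前方的点
    uv = pts_cam @ K.T
    return uv[:, :2] / uv[:, 2:3]             # 齐次归一化得到像素坐标
```

对每个备选外参分别投影后,将投影点与第一图像中第一标定板区域的匹配度最高者选为目标外参。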
在一些可选实施例中,上述雷达和相机可以部署在车辆上,雷达可以采用激光雷达,可选地,可以在车辆的不同位置同时部署雷达和相机,例如图22所示,在车辆前方和后方、车前挡风玻璃等位置都可以部署雷达2220和相机2210。在相机2210的第一内参确定之后,如果需要重新确定雷达2220和相机2210之间的目标外参,则可以直接获取之前标定的第一内参,快速确定目标外参,且提高了雷达2220和相机2210之间的目标外参的准确性。
本公开实施例提供的上述方法,可以用于机器设备上,该机器设备可以是手动驾驶或无人驾驶的交通工具,例如飞机、车辆、无人机、无人驾驶车辆、机器人等。以车辆为例,雷达和相机两个传感器可以设置在中控台上方、靠近前风挡玻璃的位置。由于车辆移动,会造成雷达和相机中的至少一个的姿态发生改变,此时需要重新标定雷达和相机之间的外参。由于前挡风玻璃对光线的折射等影响,会造成原先标定的相机内参在应用过程中的不准确,进而影响雷达和相机之间的外参的准确性。
在本公开实施例中,可以直接根据预先标定的相机的第一内参和相机采集的多张第一图像,确定不同位姿信息的第一标定板相对于相机的外参,再获取不同位置信息的第一标定板的多组雷达点云数据,根据所述不同位姿信息的所述第一标定板相对于所述相机的外参和所述多组雷达点云数据,最终确定出激光雷达和相机之间的目标外参。可以快速标定出激光雷达和相机之间的目标外参,可用性高。
在一些可选实施例中,雷达部署在车辆前保险杠上,相机部署在车辆的后视镜的位置,例如图23所示,第一标定板2331位于雷达2320和相机2310的共同视野范围内,第一标定板可以固定在地面上,或是由工作人员手持等。
如果相机2310在标定第一内参时采用包含第一标定板2331的多张第一图像,由于雷达2320和相机2310不在同一水平面上,相机2310距离地面的距离较远,多张第一图像中第一标定板2331可能只占第一图像的部分内容,此时根据多张第一图像标定的相机2310的内参准确性较差。
在本公开实施例中,可以通过位于相机2310视野范围内且距离相机2310的距离较近的第二标定板2332进行相机内参的标定,第二标定板2332与所述相机2310之间的水平距离小于所述第一标定板2331与所述相机2310之间的水平距离,第二标定板2332可以固定在车上,此时采集到的第二图像中可以包括完整的第二标定板2332,进而可以得到比较准确的相机2310的第一内参。
上述实施例中,相机2310和雷达2320部署在车辆上,相机2310相对于地面的距离大于雷达2320相对于地面的距离,第二标定板2332与相机2310之间的水平距离小于第一标定板2331与相机2310之间的水平距离,相机2310采集的多张第二图像包括了完整的第二标定板2332,从而提高了标定相机内参的准确性。
与前述方法实施例相对应,本公开还提供了装置的实施例。
如图24所示,图24是本公开根据一示例性实施例示出的一种传感器的标定装置框图,第一标定板位于所述雷达和所述相机的共同视野范围内,装置包括:第一采集模块210,用于通过所述相机采集多张第一图像,其中,所述多张第一图像中所述第一标定板的位姿信息不同;第一确定模块220,用于获取预先标定的所述相机的第一内参,并根据所述第一内参和所述多张第一图像,确定各个位姿的所述第一标定板分别相对于所述相机的外参;第二确定模块230,用于获取所述各个位姿的所述第一标定板的多组雷达点云数据,并根据所述各个位姿的所述第一标定板分别相对于所述相机的外参和所述多组雷达点云数据,确定所述雷达和所述相机之间的目标外参。
在一些可选实施例中,所述装置还包括:标定模块,用于响应于初次标定所述传感器,对所述相机进行标定,得到所述相机的所述第一内参;所述第一确定模块包括:获取子模块,用于响应于再次标定所述传感器,获取初次标定所述传感器得到的所述相机的所述第一内参。
在一些可选实施例中,第二标定板位于所述相机的视野范围内,所述标定模块包括:采集子模块,用于通过所述相机采集多张第二图像,所述多张第二图像中所述第二标定板的位姿信息不同;第一确定子模块,用于根据所述多张第二图像,分别确定所述相机的多个第一备选内参,并将所述多个第一备选内参中的一个第一备选内参,确定为所述第一内参,其中,每张所述第二图像对应一个第一备选内参。
在一些可选实施例中,所述第一确定子模块包括:投影单元,用于通过所述相机,分别按照所述多个第一备选内参,将位于相机坐标系中的预设点投影到像素坐标系,获得所述预设点在所述像素坐标系中的多个第一坐标值;第一确定单元,用于对于每个第一备选内参,获取所述预设点在验证图像中的第二坐标值,并确定与第二坐标值对应的第一坐标值,得到存在对应关系的坐标对,其中,所述验证图像为所述多张第二图像中的一张或多张第二图像;第二确定单元,用于对于每个第一备选内参,确定该第一备选内参包括的坐标对中第一坐标值与第二坐标值之间的距离,并将所述多个第一备选内参中距离最小的一个第一备选内参,确定为所述相机的第一内参。
在一些可选实施例中,所述第一确定模块包括:去畸变子模块,用于对于所述多张第一图像中的每一张第一图像,根据所述第一内参对该第一图像进行去畸变处理,得到与该第一图像对应的第三图像;第二确定子模块,用于根据多张所述第三图像,确定所述相机的第二内参;第三确定子模块,用于根据多张所述第三图像以及所述相机的第二内参,确定各个位姿的所述第一标定板分别相对于所述相机的外参。
在一些可选实施例中,所述第三确定子模块包括:第三确定单元,用于分别确定每张所述第三图像对应的单应性矩阵;第四确定单元,用于根据所述相机的第二内参和所述多个单应性矩阵,确定各个位姿的所述第一标定板分别相对于所述相机的外参。
在一些可选实施例中,所述第二确定模块包括:第四确定子模块,用于针对每个位姿的第一标定板,根据该第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,在该位姿下所述雷达点云数据中确定一组与该第一标定板匹配的目标雷达点云数据;第五确定子模块,用于根据多组所述目标雷达点云数据分别与各个位姿的所述第一标定板之间的匹配关系,确定所述雷达和所述相机之间的目标外参。
在一些可选实施例中,所述第四确定子模块包括:第五确定单元,用于根据该第一标定板相对于所述相机的所述外参、以及所述雷达和所述相机之间的所述外参参考值,确定所述第一标定板所在的备选位置;第六确定单元,用于根据所述备选位置,在该位姿下的所述雷达点云数据中确定所述第一标定板所在的目标平面;第七确定单元,用于在该位姿下的所述雷达点云数据对应的所述目标平面上,确定一组与该第一标定板匹配的目标雷达点云数据。
在一些可选实施例中,所述第六确定单元包括:第一确定子单元,用于从该位姿下的所述雷达点云数据中,确定两个或多个第一雷达组,其中,每个第一雷达组包括多个随机选取的、位于所述备选位置对应区域内的第一雷达点,针对每个第一雷达组,确定相对应的第一平面,其中,每个第一雷达组相对应的第一平面包括该第一雷达组的多个第一雷达点;第二确定子单元,用于针对每个所述第一平面,分别确定该位姿下的所述雷达点云数据中的除所述多个第一雷达点以外的其他雷达点到所述第一平面的距离;第三确定子单元,用于针对每个所述第一平面,将所述其他雷达点中所述距离小于阈值的雷达点作为第二雷达点,并将所述第二雷达点确定为所述第一平面中包括的雷达点;第四确定子单元,用于在所述两个或多个所述第一平面中,将包括雷达点数目最多的一个第一平面作为所述目标平面。
在一些可选实施例中,所述第七确定单元包括:第五确定子单元,用于在所述目标平面上,根据第一标定板的尺寸确定初始第一圆形区域;选取子单元,用于在所述雷达点云数据中,随机选取位于所述初始第一圆形区域内的任一个雷达点作为第一圆形区域的第一圆心,以确定所述第一圆形区域在所述雷达点云数据中的位置;第六确定子单元,用于以所述第一圆心为起点,所述雷达点云数据中位于所述第一圆形区域内的多个第三雷达点为终点,分别得到多个第一向量;第七确定子单元,用于将所述多个第一向量相加后得到第二向量;第八确定子单元,用于基于所述第二向量,确定所述第一标定板的目标中心位置;第九确定子单元,用于根据所述第一标定板的所述目标中心位置和所述第一标定板的尺寸,在所述雷达点云数据中确定一组与该第一标定板匹配的所述目标雷达点云数据。
在一些可选实施例中,所述第八确定子单元包括:以所述第二向量的终点为第二圆心,根据所述第二圆心和所述第一标定板的尺寸确定第二圆形区域;以所述第二圆心为起点,所述雷达点云数据中位于所述第二圆形区域内的多个第四雷达点为终点,分别确定多个第三向量;将所述多个第三向量相加后得到第四向量;判断所述第四向量的向量值是否收敛为预设值;响应于所述第四向量的向量值收敛为所述预设值,将收敛的所述第四向量对应的所述第二圆心作为所述第一标定板的备选中心位置;响应于所述备选中心位置与所述第一标定板的中心位置重合,将所述备选中心位置作为所述目标中心位置。
在一些可选实施例中,所述第八确定子单元还包括:响应于所述第四向量的向量值未收敛为所述预设值,将未收敛的所述第四向量的终点作为所述第二圆心,重新确定所述多个第三向量并重新确定所述第四向量。
在一些可选实施例中,所述第八确定子单元还包括:响应于所述备选中心位置与所述第一标定板的所述中心位置不重合,重新确定所述备选中心位置。
在一些可选实施例中,所述第五确定子模块包括:第八确定单元,用于根据g个匹配关系,确定所述雷达和所述相机之间的一个备选外参,其中,g为大于或等于3的整数,并根据所述雷达和所述相机之间的多个备选外参,确定所述雷达和所述相机之间的目标外参。
在一些可选实施例中,所述第八确定单元包括:第十确定子单元,用于通过所述雷达基于每个备选外参对所述第一标定板进行投影,投影到任一第一图像上,生成一组投影数据;第十一确定子单元,用于在多组投影数据中确定投影与所述第一图像匹配度最高的一组投影数据为目标投影数据;第十二确定子单元,用于确定所述目标投影数据对应的备选外参,为所述雷达和所述相机之间的目标外参。
在一些可选实施例中,所述雷达和所述相机部署在车辆上,所述雷达为激光雷达。
在一些可选实施例中,第二标定板位于所述相机的视野范围内,所述第二标定板用于标定所述相机的所述第一内参;所述相机相对于地面的距离大于所述雷达相对于地面的距离,所述第二标定板与所述相机之间的水平距离小于所述第一标定板与所述相机之间的水平距离,所述多张第二图像包括完整的所述第二标定板。
在一些可选实施例中,所述第一图像包括完整的所述第一标定板,且所述雷达点云数据包括基于完整的所述第一标定板得到的点云数据。
对于装置实施例而言,由于其基本对应于方法实施例,所以相关之处参见方法实施例的部分说明即可。以上所描述的装置实施例仅仅是示意性的,其中作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本公开方案的目的。本领域普通技术人员在不付出创造性劳动的情况下,即可以理解并实施。
本公开实施例还提供了一种计算机可读存储介质,存储介质存储有计算机程序,计算机程序被处理器执行时,促使所述处理器实现上述任一的传感器的标定方法。其中,所述计算机可读存储介质可以是非易失性存储介质。
在一些可选实施例中,本公开实施例提供了一种计算机程序产品,包括计算机可读代码,当计算机可读代码在设备上运行时,促使所述设备执行用于实现如上任一实施例提供的传感器的标定方法的指令。
在一些可选实施例中,本公开实施例还提供了另一种计算机程序产品,用于存储计算机可读指令,指令被执行时使得计算机执行上述任一实施例提供的传感器的标定方法的操作。
该计算机程序产品可以具体通过硬件、软件或其结合的方式实现。在一个可选实施例中,所述计算机程序产品具体体现为计算机存储介质,在另一个可选实施例中,计算机程序产品具体体现为软件产品,例如软件开发包(Software Development Kit,SDK)等等。
本公开实施例还提供了一种传感器的标定装置,包括:处理器;用于存储处理器可执行指令的存储器;其中,处理器被配置为调用所述可执行指令,以实现上述任一项所述的传感器的标定方法。
图25为本申请实施例提供的一种传感器的标定装置的硬件结构示意图。该传感器的标定装置310包括处理器311,还可以包括输入装置312、输出装置313、存储器314和总线315。该输入装置312、输出装置313、存储器314和处理器311之间通过总线315相互连接。
存储器包括但不限于是随机存储器(random access memory,RAM)、只读存储器(read-only memory,ROM)、可擦除可编程只读存储器(erasable programmable read only memory,EPROM)、或便携式只读存储器(compact disc read-only memory,CD-ROM),该存储器用于存储相关指令及数据。
输入装置用于输入数据和/或信号,以及输出装置用于输出数据和/或信号。输出装置和输入装置可以是独立的器件,也可以是一个整体的器件。
处理器可以包括一个或多个处理器,例如包括一个或多个中央处理器(central processing unit,CPU),在处理器是一个CPU的情况下,该CPU可以是单核CPU,也可以是多核CPU。
处理器用于调用该存储器中的程序代码和数据,执行上述方法实施例中的步骤。具体可参见方法实施例中的描述,在此不再赘述。
可以理解的是,图25仅仅示出了一种传感器的标定装置的简化设计。在实际应用中,传感器的标定装置还可以分别包含必要的其他元件,包含但不限于任意数量的输入/输出装置、处理器、控制器、存储器等,而所有可以实现本申请实施例的传感器的标定装置都在本申请的保护范围之内。
在一些实施例中,本公开实施例提供的装置具有的功能或包含的模块可以用于执行上文方法实施例描述的方法,其具体实现可以参照上文方法实施例的描述,为了简洁,这里不再赘述。
本公开实施例还提供了一种标定系统,所述标定系统包括相机、雷达和第一标定板,所述第一标定板位于所述相机和所述雷达的共同视野范围内,所述第一标定板在不同采集时刻的位姿信息不同。
本领域技术人员在考虑说明书及实践这里公开的发明后,将容易想到本公开的其它实施方案。本公开旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或者惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由下面的权利要求指出。
以上所述仅为本公开的较佳实施例而已,并不用以限制本公开,凡在本公开的精神和原则之内,所做的任何修改、等同替换、改进等,均应包含在本公开保护的范围之内。

Claims (23)

  1. 一种传感器标定方法,其特征在于,所述传感器包括雷达和相机,第一标定板位于所述雷达和所述相机的共同视野范围内,所述方法包括:
    通过所述相机采集多张第一图像,其中,所述多张第一图像中所述第一标定板的位姿信息不同;
    获取预先标定的所述相机的第一内参;
    根据所述第一内参和所述多张第一图像,确定各个位姿的第一标定板分别相对于所述相机的外参;
    获取所述各个位姿的所述第一标定板的多组雷达点云数据;
    根据所述各个位姿的所述第一标定板分别相对于所述相机的外参和所述多组雷达点云数据,确定所述雷达和所述相机之间的目标外参。
  2. 根据权利要求1所述的方法,其特征在于,所述获取预先标定的所述相机的第一内参之前,所述方法还包括:
    响应于初次标定所述传感器,对所述相机进行标定,得到所述相机的所述第一内参;
    所述获取预先标定的所述相机的第一内参,包括:
    响应于再次标定所述传感器,获取初次标定所述传感器得到的所述相机的所述第一内参。
  3. 根据权利要求2所述的方法,其特征在于,第二标定板位于所述相机的视野范围内,所述对所述相机进行标定,得到所述相机的所述第一内参,包括:
    通过所述相机采集多张第二图像,所述多张第二图像中所述第二标定板的位姿信息不同;
    根据所述多张第二图像,分别确定所述相机的多个第一备选内参,其中,每张所述第二图像对应一个第一备选内参;
    将所述多个第一备选内参中的一个第一备选内参,确定为所述第一内参。
  4. 根据权利要求3所述的方法,其特征在于,所述将所述多个第一备选内参中的一个第一备选内参,确定为所述第一内参,包括:
    通过所述相机,分别按照所述多个第一备选内参,将位于相机坐标系中的预设点投影到像素坐标系,获得所述预设点在所述像素坐标系中的多个第一坐标值;
    对于每个第一备选内参,获取所述预设点在验证图像中的第二坐标值,并确定与第二坐标值对应的第一坐标值,得到存在对应关系的坐标对,其中,所述验证图像为所述多张第二图像中的一张或多张第二图像;
    对于每个第一备选内参,确定该第一备选内参包括的坐标对中第一坐标值与第二坐标值之间的距离;
    将所述多个第一备选内参中距离最小的一个第一备选内参,确定为所述相机的第一内参。
  5. 根据权利要求1至4任一项所述的方法,其特征在于,所述根据所述第一内参和所述多张第一图像,确定各个位姿的第一标定板分别相对于所述相机的外参,包括:
    对于所述多张第一图像中的每一张第一图像,根据所述第一内参对该第一图像进行去畸变处理,得到与该第一图像对应的第三图像;
    根据多张所述第三图像,确定所述相机的第二内参;
    根据多张所述第三图像以及所述相机的第二内参,确定所述各个位姿的所述第一标定板分别相对于所述相机的外参。
  6. 根据权利要求5所述的方法,其特征在于,所述根据多张所述第三图像以及所述相机的第二内参,确定所述各个位姿的所述第一标定板分别相对于所述相机的外参,包括:
    分别确定每张所述第三图像对应的单应性矩阵;
    根据所述相机的第二内参和所述多个单应性矩阵,确定所述各个位姿的所述第一标定板分别相对于所述相机的外参。
  7. 根据权利要求6所述的方法,其特征在于,所述根据所述各个位姿的所述第一标定板分别相对于所述相机的外参和所述多组雷达点云数据,确定所述雷达和所述相机之间的目标外参,包括:
    针对每个位姿的第一标定板,根据该第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,在该位姿下的所述雷达点云数据中确定一组与该第一标定板匹配的目标雷达点云数据;
    根据多组所述目标雷达点云数据分别与所述各个位姿的所述第一标定板之间的匹配关系,确定所述雷达和所述相机之间的目标外参。
  8. 根据权利要求7所述的方法,其特征在于,所述根据该第一标定板相对于所述相机的外参、以及所述雷达和所述相机之间的外参参考值,在该位姿下的所述雷达点云数据中确定一组与该第一标定板匹配的目标雷达点云数据,包括:
    根据该第一标定板相对于所述相机的所述外参、以及所述雷达和所述相机之间的所述外参参考值,确定所述第一标定板所在的备选位置;
    根据所述备选位置,在该位姿下的所述雷达点云数据中确定所述第一标定板所在的目标平面;
    在该位姿下的所述雷达点云数据对应的所述目标平面上,确定一组与该第一标定板匹配的目标雷达点云数据。
  9. 根据权利要求8所述的方法,其特征在于,所述根据所述备选位置,在该位姿下的所述雷达点云数据中确定所述第一标定板所在的目标平面,包括:
    从该位姿下的所述雷达点云数据中,确定两个或多个第一雷达组,其中,每个第一雷达组包括多个随机选取的、位于所述备选位置对应区域内的第一雷达点;
    针对每个第一雷达组,确定相对应的第一平面,其中,每个第一雷达组相对应的第一平面包括该第一雷达组的多个第一雷达点;
    针对每个所述第一平面,分别确定该位姿下的所述雷达点云数据中的除所述多个第一雷达点以外的其他雷达点到所述第一平面的距离;
    针对每个所述第一平面,将所述其他雷达点中所述距离小于阈值的雷达点作为第二雷达点;
    针对每个所述第一平面,将所述第二雷达点确定为所述第一平面中包括的雷达点;
    在所述两个或多个所述第一平面中,将包括雷达点数目最多的一个第一平面作为所述目标平面。
  10. 根据权利要求8或9所述的方法,其特征在于,所述在该位姿下的所述雷达点云数据对应的所述目标平面上,确定一组与该第一标定板匹配的目标雷达点云数据,包括:
    在所述目标平面上,根据该第一标定板的尺寸确定初始第一圆形区域;
    在所述雷达点云数据中,随机选取位于所述初始第一圆形区域内的任一个雷达点作为第一圆形区域的第一圆心,以确定所述第一圆形区域在所述雷达点云数据中的位置;
    以所述第一圆心为起点,所述雷达点云数据中位于所述第一圆形区域内的多个第三雷达点为终点,分别得到多个第一向量;
    将所述多个第一向量相加后得到第二向量;
    基于所述第二向量,确定该第一标定板的目标中心位置;
    根据该第一标定板的所述目标中心位置和所述第一标定板的尺寸,在所述雷达点云数据中确定一组与该第一标定板匹配的所述目标雷达点云数据。
  11. 根据权利要求10所述的方法,其特征在于,所述基于所述第二向量,确定该第一标定板的目标中心位置,包括:
    以所述第二向量的终点为第二圆心,根据所述第二圆心和该第一标定板的尺寸确定第二圆形区域;
    以所述第二圆心为起点,所述雷达点云数据中位于所述第二圆形区域内的多个第四雷达点为终点,分别确定多个第三向量;
    将所述多个第三向量相加后得到第四向量;
    判断所述第四向量的向量值是否收敛为预设值;
    响应于所述第四向量的向量值收敛为所述预设值,将收敛的所述第四向量对应的所述第二圆心作为该第一标定板的备选中心位置;
    响应于所述备选中心位置与该第一标定板的中心位置重合,将所述备选中心位置作为所述目标中心位置。
  12. 根据权利要求11所述的方法,其特征在于,还包括:
    响应于所述第四向量的向量值未收敛为所述预设值,将未收敛的所述第四向量的终点作为所述第二圆心,重新确定所述多个第三向量并重新确定所述第四向量。
  13. 根据权利要求11所述的方法,其特征在于,还包括:
    响应于所述备选中心位置与该第一标定板的所述中心位置不重合,重新确定所述备选中心位置。
  14. 根据权利要求7至13任一所述的方法,其特征在于,所述根据多组所述目标雷达点云数据分别与所述各个位姿的所述第一标定板之间的匹配关系,确定所述雷达和所述相机之间的目标外参,包括:
    根据g个匹配关系,确定所述雷达和所述相机之间的一个备选外参,其中,g为大于或等于3的整数;
    根据所述雷达和所述相机之间的多个备选外参,确定所述雷达和所述相机之间的所述目标外参。
  15. 根据权利要求14所述的方法,其特征在于,所述根据所述雷达和所述相机之间的所述多个备选外参,确定所述雷达和所述相机之间的所述目标外参,包括:
    通过所述雷达基于每个备选外参对所述第一标定板进行投影,投影到任一所述第一图像上,生成一组投影数据;
    在多组投影数据中确定投影与该第一图像匹配度最高的一组投影数据为目标投影数据;
    确定所述目标投影数据对应的备选外参,为所述雷达和所述相机之间的目标外参。
  16. 根据权利要求1至15中任意一项所述的方法,其特征在于,所述雷达和所述相机部署在车辆上,所述雷达为激光雷达。
  17. 根据权利要求16所述的方法,其特征在于,第二标定板位于所述相机的视野范围内,所述第二标定板用于标定所述相机的所述第一内参;
    所述相机相对于地面的距离大于所述雷达相对于地面的距离;
    所述第二标定板与所述相机之间的水平距离小于所述第一标定板与所述相机之间的水平距离;
    所述多张第二图像包括完整的所述第二标定板。
  18. 根据权利要求1至17中任意一项所述的方法,其特征在于,所述第一图像包括完整的所述第一标定板,且所述雷达点云数据包括基于完整的所述第一标定板得到的点云数据。
  19. 一种传感器标定装置,其特征在于,所述传感器包括雷达和相机,第一标定板位于所述雷达和所述相机的共同视野范围内,所述装置包括:
    第一采集模块,用于通过所述相机采集多张第一图像,其中,所述多张第一图像中所述第一标定板的位姿信息不同;
    第一确定模块,用于获取预先标定的所述相机的第一内参,并根据所述第一内参和所述多张第一图像,确定各个位姿的所述第一标定板分别相对于所述相机的外参;
    第二确定模块,用于获取所述各个位姿的所述第一标定板的多组雷达点云数据,并根据所述各个位姿的所述第一标定板分别相对于所述相机的外参和所述多组雷达点云数据,确定所述雷达和所述相机之间的目标外参。
  20. 一种计算机可读存储介质,其特征在于,所述存储介质存储有计算机程序,所述计算机程序被处理器执行时,促使所述处理器实现如权利要求1至18任一项所述的传感器标定方法。
  21. 一种传感器标定装置,其特征在于,包括:
    处理器;
    用于存储所述处理器可执行指令的存储器;
    其中,所述处理器被配置为调用所述可执行指令,以实现如权利要求1至18中任一项所述的传感器标定方法。
  22. 一种标定系统,其特征在于,所述标定系统包括相机、雷达和第一标定板,所述第一标定板位于所述相机和所述雷达的共同视野范围内,所述第一标定板在不同采集时刻的位姿信息不同。
  23. 一种计算机程序产品,其特征在于,包括计算机可读代码,当所述计算机可读代码在设备上运行时,促使所述设备执行以实现如权利要求1-18任一项所述的传感器标定方法。
PCT/CN2020/122559 2019-11-18 2020-10-21 传感器标定方法及装置、存储介质、标定系统和程序产品 WO2021098439A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021530296A JP2022510924A (ja) 2019-11-18 2020-10-21 センサキャリブレーション方法及び装置、記憶媒体、キャリブレーションシステム並びにプログラム製品
US17/747,271 US20220276360A1 (en) 2019-11-18 2022-05-18 Calibration method and apparatus for sensor, and calibration system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911126534.8 2019-11-18
CN201911126534.8A CN112816949B (zh) 2019-11-18 2019-11-18 传感器的标定方法及装置、存储介质、标定系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/747,271 Continuation US20220276360A1 (en) 2019-11-18 2022-05-18 Calibration method and apparatus for sensor, and calibration system

Publications (1)

Publication Number Publication Date
WO2021098439A1 true WO2021098439A1 (zh) 2021-05-27

Family

ID=75852431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/122559 WO2021098439A1 (zh) 2019-11-18 2020-10-21 传感器标定方法及装置、存储介质、标定系统和程序产品

Country Status (4)

Country Link
US (1) US20220276360A1 (zh)
JP (1) JP2022510924A (zh)
CN (1) CN112816949B (zh)
WO (1) WO2021098439A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113436270A (zh) * 2021-06-18 2021-09-24 上海商汤临港智能科技有限公司 传感器标定方法及装置、电子设备和存储介质
CN113702931A (zh) * 2021-08-19 2021-11-26 中汽创智科技有限公司 一种车载雷达的外参标定方法、装置及存储介质
CN113724303A (zh) * 2021-09-07 2021-11-30 广州文远知行科技有限公司 点云与图像匹配方法、装置、电子设备和存储介质
CN113744348A (zh) * 2021-08-31 2021-12-03 南京慧尔视智能科技有限公司 一种参数标定方法、装置及雷视融合检测设备

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
US11967111B2 (en) * 2020-12-15 2024-04-23 Kwangwoon University Industry-Academic Collaboration Foundation Multi-view camera-based iterative calibration method for generation of 3D volume model
CN114782556B (zh) * 2022-06-20 2022-09-09 季华实验室 相机与激光雷达的配准方法、系统及存储介质

Citations (11)

Publication number Priority date Publication date Assignee Title
CN101882313A (zh) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 单线激光雷达与ccd相机之间相互关系的标定方法
US20160320476A1 (en) * 2015-04-28 2016-11-03 Henri Johnson Systems to track a moving sports object
CN106228537A (zh) * 2016-07-12 2016-12-14 北京理工大学 一种三维激光雷达与单目摄像机的联合标定方法
CN106840111A (zh) * 2017-03-27 2017-06-13 深圳市鹰眼在线电子科技有限公司 物体间位置姿态关系实时统一系统及方法
CN107976669A (zh) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 一种确定相机与激光雷达之间的外参数的装置
CN107976668A (zh) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 一种确定相机与激光雷达之间的外参数的方法
CN108198223A (zh) * 2018-01-29 2018-06-22 清华大学 一种激光点云与视觉图像映射关系快速精确标定方法
CN108509918A (zh) * 2018-04-03 2018-09-07 中国人民解放军国防科技大学 融合激光点云与图像的目标检测与跟踪方法
CN108964777A (zh) * 2018-07-25 2018-12-07 南京富锐光电科技有限公司 一种高速相机校准系统及方法
CN109146978A (zh) * 2018-07-25 2019-01-04 南京富锐光电科技有限公司 一种高速相机成像畸变校准装置及方法
CN109946680A (zh) * 2019-02-28 2019-06-28 北京旷视科技有限公司 探测系统的外参数标定方法、装置、存储介质及标定系统

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP5051493B2 (ja) * 2005-12-26 2012-10-17 株式会社Ihi 三次元計測用マーカとこれを用いた三次元計測方法
JP2014074632A (ja) * 2012-10-03 2014-04-24 Isuzu Motors Ltd 車載ステレオカメラの校正装置及び校正方法
EP2767846B1 (en) * 2013-02-18 2017-01-11 Volvo Car Corporation Method for calibrating a sensor cluster in a motor vehicle
US20190004178A1 (en) * 2016-03-16 2019-01-03 Sony Corporation Signal processing apparatus and signal processing method
JP6929123B2 (ja) * 2017-05-10 2021-09-01 日本放送協会 カメラ校正装置及びカメラ校正プログラム
CN108764024B (zh) * 2018-04-09 2020-03-24 平安科技(深圳)有限公司 人脸识别模型的生成装置、方法及计算机可读存储介质
CN109343061B (zh) * 2018-09-19 2021-04-02 百度在线网络技术(北京)有限公司 传感器标定方法、装置、计算机设备、介质和车辆

Patent Citations (11)

Publication number Priority date Publication date Assignee Title
CN101882313A (zh) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 单线激光雷达与ccd相机之间相互关系的标定方法
US20160320476A1 (en) * 2015-04-28 2016-11-03 Henri Johnson Systems to track a moving sports object
CN106228537A (zh) * 2016-07-12 2016-12-14 北京理工大学 一种三维激光雷达与单目摄像机的联合标定方法
CN107976669A (zh) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 一种确定相机与激光雷达之间的外参数的装置
CN107976668A (zh) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 一种确定相机与激光雷达之间的外参数的方法
CN106840111A (zh) * 2017-03-27 2017-06-13 深圳市鹰眼在线电子科技有限公司 物体间位置姿态关系实时统一系统及方法
CN108198223A (zh) * 2018-01-29 2018-06-22 清华大学 一种激光点云与视觉图像映射关系快速精确标定方法
CN108509918A (zh) * 2018-04-03 2018-09-07 中国人民解放军国防科技大学 融合激光点云与图像的目标检测与跟踪方法
CN108964777A (zh) * 2018-07-25 2018-12-07 南京富锐光电科技有限公司 一种高速相机校准系统及方法
CN109146978A (zh) * 2018-07-25 2019-01-04 南京富锐光电科技有限公司 一种高速相机成像畸变校准装置及方法
CN109946680A (zh) * 2019-02-28 2019-06-28 北京旷视科技有限公司 探测系统的外参数标定方法、装置、存储介质及标定系统

Cited By (6)

Publication number Priority date Publication date Assignee Title
CN113436270A (zh) * 2021-06-18 2021-09-24 上海商汤临港智能科技有限公司 传感器标定方法及装置、电子设备和存储介质
CN113702931A (zh) * 2021-08-19 2021-11-26 中汽创智科技有限公司 一种车载雷达的外参标定方法、装置及存储介质
CN113702931B (zh) * 2021-08-19 2024-05-24 中汽创智科技有限公司 一种车载雷达的外参标定方法、装置及存储介质
CN113744348A (zh) * 2021-08-31 2021-12-03 南京慧尔视智能科技有限公司 一种参数标定方法、装置及雷视融合检测设备
CN113724303A (zh) * 2021-09-07 2021-11-30 广州文远知行科技有限公司 点云与图像匹配方法、装置、电子设备和存储介质
CN113724303B (zh) * 2021-09-07 2024-05-10 广州文远知行科技有限公司 点云与图像匹配方法、装置、电子设备和存储介质

Also Published As

Publication number Publication date
JP2022510924A (ja) 2022-01-28
CN112816949A (zh) 2021-05-18
US20220276360A1 (en) 2022-09-01
CN112816949B (zh) 2024-04-16


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2021530296

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20888814

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20888814

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/11/2022)
