WO2021098439A1 - Sensor calibration method and apparatus, storage medium, calibration system, and program product
- Publication number
- WO2021098439A1 (PCT/CN2020/122559)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- radar
- calibration
- cloud data
- point cloud
- Prior art date
Classifications
- G01S7/40—Means for monitoring or calibrating (details of systems according to group G01S13/00)
- G01S7/497—Means for monitoring or calibrating (details of systems according to group G01S17/00)
- G01S7/4972—Alignment of sensor
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
- G01S13/931—Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S7/4082—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Definitions
- the present disclosure relates to the field of computer vision, in particular to sensor calibration methods and devices, storage media, calibration systems, and program products.
- machines and equipment may include a combination of a radar and a camera; based on the data provided by the radar and the camera, a machine can perceive its surrounding environment.
- the accuracy of the external parameters between the radar and the camera determines the accuracy of this environment perception.
- the present disclosure provides a sensor calibration method and device, storage medium, and calibration system to realize the joint calibration of radar and camera.
- a sensor calibration method, where a first calibration board is located within the common field of view of the radar and the camera, and the method includes: collecting multiple first images by the camera, wherein the pose information of the first calibration board differs across the multiple first images; obtaining the pre-calibrated first internal parameters of the camera; determining, according to the first internal parameters and the multiple first images, the external parameters of the first calibration board in each pose relative to the camera; acquiring multiple sets of radar point cloud data of the first calibration board in each pose; and determining the target external parameters between the radar and the camera according to the external parameters of the first calibration board in each pose relative to the camera and the multiple sets of radar point cloud data.
- a sensor calibration device, where the sensor includes a radar and a camera, a first calibration board is located within the common field of view of the radar and the camera, and the device includes:
- an acquisition module, used to acquire multiple first images through the camera, wherein the pose information of the first calibration board differs across the multiple first images;
- a first determination module, used to acquire the pre-calibrated first internal parameters of the camera and, according to the first internal parameters and the multiple first images, determine the external parameters of the first calibration board in each pose relative to the camera;
- a second determination module, used to acquire the multiple sets of radar point cloud data of the first calibration board in each pose, and to determine the target external parameters between the radar and the camera according to the external parameters of the first calibration board in each pose relative to the camera and the multiple sets of radar point cloud data.
- a computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the sensor calibration method of any one of the above-mentioned first aspects.
- a sensor calibration device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to call the executable instructions in order to realize the sensor calibration method of any one of the first aspects.
- a calibration system including a camera, a radar, and a first calibration board, the first calibration board being located in a common field of view of the camera and the radar, The pose information of the first calibration board at different collection moments is different.
- a computer program product including computer-readable code which, when run on a device, causes the device to execute the sensor calibration method according to any one of the first aspects.
- in this way, the pre-calibrated first internal parameters of the camera can be obtained, and according to them the external parameters of the first calibration board relative to the camera can be determined.
- combined with the radar point cloud data, this yields the relative position and pitch/tilt relationship between the camera and the radar.
- thus, calibration of the sensor can be achieved according to the pre-calibrated first internal parameters of the camera.
- Fig. 1 is a flow chart of a method for calibrating a sensor according to an exemplary embodiment of the present disclosure.
- Fig. 2 is a schematic diagram showing a common field of view according to an exemplary embodiment of the present disclosure.
- Fig. 3 is a schematic diagram showing a calibration board with different postures according to an exemplary embodiment of the present disclosure.
- Fig. 4 is a schematic diagram showing a radar transmission according to an exemplary embodiment of the present disclosure.
- Fig. 5 is a flow chart of a method for calibrating a sensor according to another exemplary embodiment of the present disclosure.
- Fig. 6 is a schematic diagram showing a field of view of a camera according to an exemplary embodiment of the present disclosure.
- Fig. 7 is a flow chart of a method for calibrating a sensor according to another exemplary embodiment of the present disclosure.
- Fig. 8 is a schematic diagram showing a second image including a second calibration plate according to an exemplary embodiment of the present disclosure.
- Fig. 9 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 10A is a schematic diagram showing a scene of projecting a preset point according to an exemplary embodiment of the present disclosure.
- Fig. 10B is a schematic diagram of a scene of determining coordinate pairs that have a corresponding relationship according to an exemplary embodiment of the present disclosure.
- Fig. 11 is a flow chart showing a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 12 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 13 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 14 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 15 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 16 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 17 is a schematic diagram of determining multiple first vectors according to an exemplary embodiment of the present disclosure.
- Fig. 18 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 19 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 20 is a flow chart of a method for calibrating a sensor according to still another exemplary embodiment of the present disclosure.
- Fig. 21A is a schematic diagram of projecting the first calibration plate by radar according to an exemplary embodiment of the present disclosure.
- Fig. 21B is another schematic diagram of projecting the first calibration plate by radar according to an exemplary embodiment of the present disclosure.
- Fig. 22 is a schematic diagram showing deployment of radars and cameras on a vehicle according to an exemplary embodiment of the present disclosure.
- Fig. 23 is a schematic diagram showing the positions of the first calibration plate and the second calibration plate corresponding to the radar and the camera deployed on the vehicle according to an exemplary embodiment of the present disclosure.
- Fig. 24 is a block diagram of a sensor calibration device according to an exemplary embodiment of the present disclosure.
- Fig. 25 is a block diagram of a sensor calibration device according to another exemplary embodiment of the present disclosure.
- the terms first, second, third, etc. may be used in this disclosure to describe various information, but the information should not be limited by these terms; they are only used to distinguish information of the same type from each other.
- first information may also be referred to as second information, and similarly, the second information may also be referred to as first information.
- the word "if" as used herein can be interpreted as "when," "upon," or "in response to determining."
- the present disclosure provides a sensor calibration method.
- sensor calibration refers to calibrating the internal parameters (intrinsic parameters) and external parameters of the sensor.
- the internal parameters of the sensor refer to the parameters used to reflect the characteristics of the sensor itself. After the sensor leaves the factory, the internal parameters are theoretically unchanged, but in actual use, the internal parameters may change. Take the camera as an example. As it is used, changes in the positional relationship of the various parts of the camera will cause changes in internal parameters.
- the calibrated internal parameter is usually only a parameter that approximates the real internal parameter, not the true value of the internal parameter.
- the following takes the sensor including camera and radar as an example to illustrate the internal parameters of the sensor.
- the internal parameters of the camera refer to the parameters used to reflect the characteristics of the camera itself, which may include, but are not limited to, one or more of the following: u 0 , v 0 , S x , S y , f and r.
- u 0 and v 0 respectively represent the number of horizontal and vertical pixels that differ between the origin of the pixel coordinate system and the origin of the camera coordinate system where the camera is located, in pixels.
- S x and Sy are the number of pixels per unit length in the horizontal and vertical directions, respectively, and the unit length can be millimeters.
- f is the focal length of the camera.
- r is the distance of a pixel from the center of the imager caused by image distortion.
- the center of the imager is the focus center of the camera.
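For illustration only (the numeric values below are hypothetical, not from the disclosure), these internal parameters are conventionally assembled into a 3x3 intrinsic matrix that maps a point in the camera coordinate system to pixel coordinates:

```python
import numpy as np

# Hypothetical internal parameters (illustrative only):
f = 8.0                    # focal length, mm
S_x, S_y = 100.0, 100.0    # pixels per unit length (mm), horizontal / vertical
u0, v0 = 320.0, 240.0      # principal-point offset from the image origin, pixels

# Conventional pinhole intrinsic matrix; focal length expressed in pixels.
K = np.array([
    [f * S_x, 0.0,     u0],
    [0.0,     f * S_y, v0],
    [0.0,     0.0,     1.0],
])

# Project a point given in the camera coordinate system to pixel coordinates.
P_cam = np.array([0.1, -0.05, 2.0])        # x, y, z in front of the camera
uvw = K @ P_cam                            # homogeneous pixel coordinates
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]    # perspective division
print(u, v)  # -> 360.0 220.0
```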
- the camera described in the present disclosure may be a camera, a video camera, or other equipment with a photographing function, which is not limited in the present disclosure.
- the internal parameters of the radar refer to the parameters used to reflect the characteristics of the radar itself, which may include, but are not limited to, one or more of the following: the power and type of the transmitter, the sensitivity and type of the receiver, the parameters and type of the antenna, the number and type of displays, etc.
- the radar mentioned in the present disclosure may be a light detection and ranging (LiDAR) system or a radio radar, which is not limited in the present disclosure.
- the external parameters of the sensor refer to the parameters of the conversion relationship between the position of the object in the world coordinate system and the position of the object in the sensor coordinate system. It should be noted that when multiple sensors are included, the sensor external parameters also include parameters for reflecting the conversion relationship between the multiple sensor coordinate systems. The following also takes the sensor including camera and radar as an example to illustrate the external parameters of the sensor.
- the external parameters of the camera refer to the parameters used to transform a point from the world coordinate system to the camera coordinate system.
- the external parameters of the calibration board relative to the camera can be used to reflect the change parameters of the position and/or posture required by the calibration board in the world coordinate system to be converted to the camera coordinate system.
- the external parameters of the camera may include, but are not limited to, one or a combination of the following parameters: the position and/or posture change parameters required for the conversion of the calibration board in the world coordinate system to the camera coordinate system, etc.
- the distortion parameters include radial distortion coefficients and tangential distortion coefficients. Radial distortion and tangential distortion are the positional deviations of image pixels, centered on the distortion center, along the radial direction and the tangential direction respectively, which cause the image to be deformed.
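As a hedged illustration (the patent does not name a specific model), one common parameterisation of these deviations is the Brown-Conrady model; the sketch below applies hypothetical radial coefficients (k1, k2) and tangential coefficients (p1, p2) to a normalised image point:

```python
import numpy as np

def distort(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to a
    normalised image point (x, y); all coefficients are illustrative."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# With all coefficients zero the point is unchanged.
print(distort(0.3, -0.2, 0.0, 0.0, 0.0, 0.0))   # -> (0.3, -0.2)
# A positive k1 shifts the point radially outward from the distortion center.
x_d, y_d = distort(0.3, -0.2, 0.1, 0.0, 0.0, 0.0)
```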
- the change parameters of the position and/or posture required for the calibration board in the world coordinate system to be converted to the camera coordinate system may include a rotation matrix R and a translation matrix T.
- the rotation matrix R comprises the rotation angle parameters, relative to the three coordinate axes x, y and z, applied when the calibration board in the world coordinate system is converted to the camera coordinate system.
- the translation matrix T comprises the translation parameters of the origin applied when the calibration board in the world coordinate system is converted to the camera coordinate system.
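A minimal sketch (NumPy, hypothetical values) of how R and T act on a point, converting it from the world coordinate system to the camera coordinate system:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z axis by theta radians (one factor of a full
    rotation matrix R built from the three axis angles)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R = rot_z(np.pi / 2)                  # hypothetical rotation matrix
T = np.array([0.0, 0.0, 2.0])         # hypothetical translation (metres)

P_world = np.array([1.0, 0.0, 0.0])   # a board corner in the world frame
P_cam = R @ P_world + T               # the same point in the camera frame
print(P_cam)                          # -> approximately [0, 1, 2]
```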
- the external parameters of the radar refer to the parameters used to convert a point from the world coordinate system to the radar coordinate system.
- the external parameters of the calibration board relative to the radar can be used to reflect the change parameters of the position and/or attitude required for the calibration board in the world coordinate system to be converted to the radar coordinate system.
- the target external parameters between the camera and the radar refer to the parameters used to reflect the conversion relationship between the radar coordinate system and the camera coordinate system.
- for example, the external parameters between the camera and the radar can reflect the changes in position, posture, etc. of the radar coordinate system relative to the camera coordinate system.
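As an illustration of how such target external parameters can be composed, assuming (this is a sketch, not the patent's stated procedure) the board's pose is known both relative to the camera and relative to the radar, the radar-to-camera transform follows by chaining the two:

```python
import numpy as np

def to_hom(R, t):
    """Pack a rotation R and translation t into a 4x4 homogeneous transform."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

# Hypothetical board poses: board -> camera and board -> radar.
R_c, t_c = np.eye(3), np.array([0.0, 0.0, 3.0])
R_r, t_r = np.eye(3), np.array([0.5, 0.0, 3.0])

T_board_cam = to_hom(R_c, t_c)
T_board_radar = to_hom(R_r, t_r)

# radar -> camera: go radar -> board (inverse), then board -> camera.
T_radar_cam = T_board_cam @ np.linalg.inv(T_board_radar)

p_radar = np.array([1.0, 2.0, 5.0, 1.0])   # a radar point, homogeneous
p_cam = T_radar_cam @ p_radar              # the same point in the camera frame
print(p_cam[:3])                           # -> approximately [0.5, 2.0, 5.0]
```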
- the sensor may include a camera and a radar; calibrating the sensor then refers to calibrating one of, or a combination of, the internal parameters of the camera, the internal parameters of the radar, and the target external parameters between the camera and the radar.
- the above-mentioned internal parameters and/or external parameters can be determined by means of a calibration board, for example, the external parameters of the target between the camera and the radar can be determined by means of the external parameters of the calibration board relative to the camera and the external parameters of the calibration board relative to the radar.
- the actual calibrated parameters may include, but are not limited to, the conditions listed above.
- the calibration method of the sensor may include the following steps.
- step 101: a plurality of first images are collected by the camera, wherein the pose information of the first calibration board in the plurality of first images is different.
- the radar may be a lidar that detects characteristic quantities such as the position and speed of a target by emitting a laser beam, or a millimeter-wave radar that works in a millimeter-wave frequency band, or the like.
- the field of view is the range that can be covered by the light, electromagnetic waves, etc. emitted by the sensor when the position is unchanged.
- taking a sensor that includes a radar as an example, the field of view refers to the range that the laser beam or electromagnetic wave emitted by the radar can cover; taking a sensor that includes a camera as an example, the field of view refers to the range that can be captured by the camera.
- the first calibration board 230 is located in the range of the common field of view 231 of the radar 210 and the camera 220, for example, as shown in FIG. 2.
- the common field of view 231 refers to the part where the ranges covered by the individual sensor elements overlap, that is, the overlapping part of the range covered by the radar 210 (the radar field of view 211 in the figure) and the range photographed by the camera 220 (the camera field of view 221 in the figure), indicated by the dashed line in the figure.
- the first calibration plate may be a circular, rectangular or square array plate with a fixed pitch pattern.
- a rectangular array of black and white grid plates can be used.
- the pattern of the calibration plate can also include other regular patterns, or patterns that are irregular but have characteristic parameters such as feature point sets, characteristic edges, etc.
- the shape and pattern of the calibration plate are not limited here.
- the number of first images collected by the camera may be multiple, for example, more than 20.
- the poses of the first calibration board in the collected multiple first images may be different, that is, at least some of the multiple first images respectively show different poses of the first calibration board, such as different positions and/or attitudes.
- the first calibration plate has three-dimensional attitude changes of pitch angle, roll angle, and yaw angle.
- first images may be collected when the first calibration board is in different positions and/or attitudes; that is, the pose information of the first calibration board in different first images may be the same or different, but at least two first images include different pose information of the first calibration board.
- each first image needs to include a complete first calibration board.
- the number of the first images may be m, and the number of poses of the first calibration plate may be n. Both m and n are integers greater than or equal to 2.
- the pose information includes information used to reflect the pose of the first calibration board in the three-dimensional space.
- the pose information of the first calibration board shown in FIG. 3 may be the attitude change in at least one of the three dimensions of the first calibration board's pitch angle, roll angle, and yaw angle.
- the first calibration plate can be in a static state.
- a bracket can be used to fix the first calibration plate.
- the pose information also includes position information.
- the multiple first images collected may include images of the first calibration board in different poses at various distances (i.e., small, moderate, and large distances).
- so that the laser generated by the radar can cover the complete first calibration board, the first calibration board is usually kept relatively far away from the radar when it is deployed.
- in the process of collecting images of the first calibration board deployed at different distances: in response to the distance d 1 between the first calibration board and the camera being small, for example d 1 less than a distance threshold D 1 , multiple first images including the first calibration board in different postures are acquired;
- in response to the distance d 1 being moderate, for example between the two distance thresholds, that is, D 1 ≤ d 1 ≤ D 2 , multiple first images including the first calibration board in different postures are additionally collected;
- similarly, further first images may be collected when the distance d 1 is large. In this way, first images taken at various distances between the first calibration board and the camera can be obtained, and the first images at different distances have different pose information.
- a complete first calibration plate may be included in the multiple first images.
- the ratio of the area of the first calibration plate to the area of the first image is different. For example, when the distance d 1 is farther, the area of the first calibration plate in the first image is relatively small, and when the distance d 1 is closer, the area of the first calibration plate in the first image is relatively large.
- step 102: the pre-calibrated first internal parameters of the camera are acquired, and according to the first internal parameters and the multiple first images, the external parameters of the first calibration board in each pose relative to the camera are determined.
- the distortion parameters need to be determined more accurately.
- the distortion parameters have a great influence on the internal parameters of the camera. Therefore, if the internal parameters of the camera are calibrated directly based on multiple first images, the calibration results may not be accurate enough.
- the pre-calibrated first internal parameters of the camera can be directly obtained, and according to the first internal parameters of the camera and the multiple first images previously collected by the camera, the external parameters of the first calibration boards of different poses relative to the camera can be determined.
- the first internal parameter of the camera is the internal parameter of the camera obtained by calibrating the camera when the sensor is calibrated for the first time.
- the camera collects multiple second images including the complete second calibration plate, and calibrates the first internal parameters of the camera according to the multiple second images.
- the pose information of the second calibration plate in the plurality of second images is different.
- the second calibration board may be closer to the camera and close to the edge of the camera's field of view, so that the determined first internal reference of the camera is more accurate than the internal reference of the camera calibrated using multiple first images.
- the first internal parameter of the pre-calibrated camera can be directly obtained.
- methods such as Zhang Zhengyou's calibration method can be used to calibrate the first internal parameters of the camera.
- the external parameters of the first calibration plate relative to the camera are determined, including the rotation matrix R and the translation matrix T.
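The patent does not fix a particular algorithm for this step. As a hedged, illustrative sketch with synthetic data: once the first internal parameters K are known, the R and T of a planar board can be recovered from the board-to-image homography H = K[r1 r2 t], which is the pose-recovery step of Zhang's method referenced above (a real pipeline would additionally handle noise, distortion, and refinement):

```python
import numpy as np

def board_pose_from_homography(H, K):
    """Recover the board's rotation R and translation t from a plane
    homography H, in the spirit of Zhang's method (illustrative sketch)."""
    Kinv = np.linalg.inv(K)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(Kinv @ h1)    # scale fixed by unit rotation column
    r1 = lam * (Kinv @ h1)
    r2 = lam * (Kinv @ h2)
    r3 = np.cross(r1, r2)                    # complete the rotation matrix
    return np.column_stack([r1, r2, r3]), lam * (Kinv @ h3)

# Synthetic check with hypothetical intrinsics and a known board pose.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
theta = 0.3
R_true = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
t_true = np.array([0.1, -0.2, 2.5])

# For board points with z = 0, the homography is K [r1 r2 t].
H = K @ np.column_stack([R_true[:, 0], R_true[:, 1], t_true])
R_est, t_est = board_pose_from_homography(H, K)
```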
- step 103: multiple sets of radar point cloud data of the first calibration board in each pose are acquired, and the target external parameters between the radar and the camera are determined according to the external parameters of the first calibration board in each pose relative to the camera and the multiple sets of radar point cloud data.
- the radar point cloud data is data including multiple radar points generated by the laser or electromagnetic waves emitted by the radar passing through the first calibration plates of different poses.
- the radar point cloud data includes the point cloud data obtained based on the complete first calibration board.
- the edges of the first calibration board 420 are not parallel to the laser or electromagnetic waves emitted by the radar 410, but form a certain angle with them, to ensure that the laser or electromagnetic waves emitted by the radar 410 pass through each edge of the first calibration board 420, so that the target radar point cloud data matched to the first calibration board can be better identified within the radar point cloud data later.
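One common way to use such point cloud data, offered here as an assumption rather than the patent's stated procedure, is to fit a plane to the radar points that hit the board; the board plane in the radar frame can then be matched against its pose in the camera frame. A minimal NumPy sketch with synthetic noisy points:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (unit normal, centroid). The normal
    is the right singular vector of the centred points with the smallest
    singular value."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return vt[-1], centroid

# Synthetic radar returns from a tilted board (illustrative only).
rng = np.random.default_rng(0)
xy = rng.uniform(-0.5, 0.5, size=(200, 2))
z = 0.3 * xy[:, 0] + 2.0                              # plane z = 0.3 x + 2
points = np.column_stack([xy, z])
points += rng.normal(scale=1e-3, size=points.shape)   # simulated radar noise

normal, centroid = fit_plane(points)
# Expected (unnormalised) normal of z = 0.3 x + 2 is (0.3, 0, -1).
expected = np.array([0.3, 0.0, -1.0])
expected /= np.linalg.norm(expected)
```

Note the SVD normal is defined only up to sign, so a comparison should accept either orientation.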
- the target external parameters between the radar and the camera belong to the external parameters between the camera and the radar.
- the external parameters of the first calibration board relative to the camera are obtained, that is, the relative position relationship between the camera and the radar, and the sensor can thus be calibrated according to the pre-calibrated first internal parameters of the camera.
- before performing the above step 102 of obtaining the pre-calibrated first internal parameters of the camera, the method further includes step 100.
- step 100: in response to the initial calibration of the sensor, the camera is calibrated to obtain the first internal parameters of the camera.
- the camera can be calibrated to obtain the first internal parameter of the camera.
- Obtaining the first internal parameter of the pre-calibrated camera in step 102 may include: in response to calibrating the sensor again, obtaining the first internal parameter of the camera obtained by calibrating the sensor for the first time.
- the first internal parameters of the camera obtained from the initial calibration of the sensor can be directly obtained.
- that is, in response to the initial calibration of the sensor, the camera is calibrated to obtain the first internal parameters of the camera, and in response to recalibration of the sensor, the first internal parameters of the camera obtained from the initial calibration can be obtained directly.
- in this way, the camera internal parameter calibration process can be decoupled from the calibration process for the target external parameters between the radar and the camera, and when re-calibrating the sensor, the sensor calibration can be realized directly based on the first internal parameters of the camera obtained from the initial calibration.
- in the case of calibrating the sensor for the first time, the second calibration board should be located in the field of view of the camera, and the second image may include the complete second calibration board, for example, as shown in FIG. 6.
- the second calibration plate 620 can be located at the edge of the camera field of view 611 of the camera 610.
- the above step 100 may include the following steps.
- step 100-1 a plurality of second images are collected by the camera.
- the pose information of the second calibration plate in the plurality of second images is different.
- the second calibration board may be the same as or different from the first calibration board.
- that the first calibration board is the same as the second calibration board means that the same calibration board is used to realize the functions of both the first calibration board and the second calibration board.
- the difference between the first calibration board and the second calibration board may mean that completely different, or partially different, calibration boards are used to realize the functions of the first calibration board and the second calibration board respectively.
- the pose information may include the attitude of the second calibration board in a three-dimensional space, for example, the attitude changes in three dimensions of the pitch angle, the roll angle, and the yaw angle.
- the second calibration board should be in a static state.
- a bracket can be used to fix the second calibration plate.
- the preset value may be a specific value or a range of values. Taking a preset value range as an example, the range affects the accuracy of each first internal parameter of the camera. Therefore, in order to improve the accuracy of the camera's first internal parameters determined subsequently, the preset range can be set to a value between [0.8, 1]. For example, as shown in FIG. 8, the proportion of the second calibration board in the entire image is within the preset value range, so this figure can be used as a second image.
- the number of second images collected by the camera may be multiple, for example, more than 20.
- the pose information of the second calibration board in the collected second images may differ, that is, at least some of the second images show different poses of the second calibration board, for example, attitude changes in the three dimensions of pitch angle, roll angle, and yaw angle.
- the number of second images can be c, and the number of poses of the second calibration plate can be d.
- Both c and d are integers greater than or equal to 2.
- c may be equal to the number m of the aforementioned first images, or not equal to m, similarly, d may be equal to the number n of the poses of the aforementioned second calibration plate, or not equal to n.
- the multiple second images collected by the camera should not be blurred.
- a blurred image may be caused by movement of the sensor, that is, movement of the camera produces relative motion between the camera and the second calibration board.
- it can be determined whether there are motion-blurred images among the multiple second images collected by the camera, and any such images are removed; alternatively, motion-blurred images can be filtered out by a preset script.
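As one possible realization of such a preset script (the patent does not specify one), a variance-of-Laplacian sharpness check is a common heuristic; the function names and threshold below are illustrative:

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of the discrete Laplacian of a grayscale image.

    Motion-blurred images have few sharp edges, so this value drops."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def filter_blurred(images, threshold):
    """Keep only images whose sharpness score reaches the threshold."""
    return [img for img in images if laplacian_variance(img) >= threshold]
```

The threshold would be tuned on sample captures; images scoring below it are discarded before step 100-2.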
- step 100-2 according to the plurality of second images, a plurality of first candidate internal parameters of the camera are respectively determined, and one of the plurality of first candidate internal parameters is determined as the first internal parameter.
- a preset MATLAB toolbox may be used to calibrate multiple first candidate internal parameters of the camera according to the multiple second images.
- for each first candidate internal parameter, the camera can reproject the preset point in the camera coordinate system to the pixel coordinate system to obtain the corresponding projection point, and then compare the projection point with the corresponding preset point in the pixel coordinate system to obtain the error value of the preset point.
- the error values obtained under the respective first candidate internal parameters are compared, and the first candidate internal parameter with the smallest error value is used as the first internal parameter of the camera.
- steps 100-1 and 100-2 describe the process of calibrating the first internal parameters of the camera when calibrating the sensor for the first time, and there is no restriction on their order of execution relative to step 101. If the sensor is calibrated again, the first internal parameters of the pre-calibrated camera can be obtained directly.
- the multiple first candidate internal parameters of the camera are the internal parameters respectively determined from the multiple second images, collected by the camera, of the second calibration board with different pose information.
- the first candidate internal parameter with the smallest error value between the projection point determined in the above manner and the corresponding preset point in the pixel coordinate system is selected as the first internal parameter of the camera.
- multiple first candidate internal parameters of the camera can be determined, and one of them is determined as the first internal parameter, which improves the accuracy of determining the camera internal parameters and has high usability.
- step 100-2 may include the following steps.
- step 100-21 the preset point located in the camera coordinate system is projected to the pixel coordinate system through the camera, according to each of the plurality of first candidate internal parameters respectively, to obtain multiple first coordinate values of the preset point in the pixel coordinate system.
- the number of preset points can be one or more.
- the camera can use the different first candidate internal parameters to project each preset point in the camera coordinate system into the pixel coordinate system, obtaining multiple first coordinate values of each preset point in the pixel coordinate system.
- a preset point P in the 3D space is projected into the 2D space to obtain the corresponding first coordinate value P1.
- step 100-22 for each first candidate internal parameter, the second coordinate value of the preset point in the verification image is obtained, and the first coordinate value corresponding to the second coordinate value is determined, to obtain coordinate pairs with a corresponding relationship, wherein the verification image is one or more second images among the plurality of second images.
- the second coordinate value of the preset point in the pixel coordinate system can be determined.
- the second coordinate value shown in FIG. 10B is P2
- the first coordinate value P1 corresponding to the second coordinate value P2 is determined.
- multiple sets of coordinate pairs with corresponding relationships can be obtained.
- P2 corresponds to P1
- P1 and P2 form a set of coordinate pairs.
- P2' corresponds to P1'
- P1' and P2' constitute another set of coordinate pairs.
- when the verification image comprises a plurality of second images, the second coordinate value of the preset point on each verification image j can be obtained, together with the corresponding first coordinate value, forming a set of coordinate pairs.
- a preset point Pi thus yields multiple sets of coordinate pairs over the multiple verification images.
- steps 100-23 for each first candidate internal parameter, the distance between the first coordinate value and the second coordinate value in the coordinate pairs obtained under that candidate is determined, and the first candidate internal parameter with the smallest distance among the multiple first candidate internal parameters is determined as the first internal parameter of the camera.
- the distance between the first coordinate value and the second coordinate value in each set of coordinate pairs can be calculated separately.
- a first candidate internal parameter corresponding to the minimum distance can be used as the first internal parameter of the camera.
- for example, the first candidate internal parameter 2 can be used as the first internal parameter of the camera.
- when a first candidate internal parameter corresponds to a plurality of coordinate pairs for the preset point Pi, the distance of each set of coordinate pairs can be calculated separately, and the total distance over the multiple sets of coordinate pairs can then be obtained, for example by adding the distances of the individual sets.
- the total distances of the first candidate internal parameters are compared, and the first candidate internal parameter with the smallest total distance among the multiple first candidate internal parameters is determined as the first internal parameter of the camera.
- the above description is based on one preset point.
- in the case of multiple preset points, the method for obtaining the first internal parameter is similar to that for a single preset point. For example, for each first candidate internal parameter, the distance of the coordinate pairs of each preset point can be calculated, the average of these distances over the multiple preset points can then be computed, and the first candidate internal parameter with the smallest average distance among the multiple first candidate internal parameters is determined as the first internal parameter of the camera.
- the first candidate internal parameter with the smallest reprojection error is used as the first internal parameter of the camera, making the first internal parameter more accurate.
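Steps 100-21 to 100-23 can be sketched as follows; this is a minimal numpy illustration (the function names and the pinhole projection are assumptions, since the patent does not fix an implementation):

```python
import numpy as np

def project(K, pts_cam):
    """Project Nx3 camera-frame points to pixels with intrinsic matrix K."""
    uvw = (K @ pts_cam.T).T
    return uvw[:, :2] / uvw[:, 2:3]

def select_first_intrinsics(candidates, pts_cam, observed_px):
    """Return the candidate intrinsic matrix whose mean reprojection
    distance to the observed (second) coordinate values is smallest."""
    errors = [np.linalg.norm(project(K, pts_cam) - observed_px, axis=1).mean()
              for K in candidates]
    return candidates[int(np.argmin(errors))]
```

With several preset points, the mean over their per-point distances plays the role of the averaged distance described above.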
- step 102 may include the following steps.
- step 102-1 for each first image of the plurality of first images, the first image is de-distorted according to the first internal parameters to obtain the third image corresponding to that first image.
- a device equipped with both radar and camera, such as a vehicle carrying both, or an image processing device (which can be the radar, the camera, or other equipment), can perform the de-distortion processing on the multiple first images.
- that is, the multiple first images may be de-distorted using the first internal parameters of the pre-calibrated camera to obtain multiple third images.
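The patent does not name a distortion model; assuming the common radial (Brown) model, per-point de-distortion can be sketched as a fixed-point inversion of the forward model (the coefficients k1, k2 are illustrative):

```python
import numpy as np

def distort_normalized(xy, k1, k2):
    """Apply the radial part of the Brown distortion model to Nx2
    normalized image coordinates (an assumed model, not the patent's)."""
    r2 = (xy ** 2).sum(axis=1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

def undistort_normalized(xy_d, k1, k2, iters=20):
    """Invert the radial model by fixed-point iteration, which is what
    de-distorting a first image into a third image does per pixel."""
    xy = xy_d.copy()
    for _ in range(iters):
        r2 = (xy ** 2).sum(axis=1, keepdims=True)
        xy = xy_d / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return xy
```

For moderate distortion the iteration converges quickly, since each step is a strong contraction.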
- the camera internal parameters can be represented by the internal parameter matrix A', as shown in formula 1:
- each parameter can refer to the aforementioned description of camera parameters.
- the internal parameter matrix A can be expressed by formula 2:
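Formulas 1 and 2 are not reproduced in this text; both internal parameter matrices are assumed here to have the standard pinhole form, e.g.:

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy, skew=0.0):
    """Standard 3x3 pinhole intrinsic matrix; assumed to be the general
    shape of the internal parameter matrices A' and A of formulas 1 and 2."""
    return np.array([[fx, skew, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])
```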
- step 102-2 a second internal parameter of the camera is determined according to a plurality of third images.
- the preset MATLAB toolbox can be used to determine the multiple second candidate internal parameters of the camera according to the multiple third images after de-distortion.
- the camera uses the different second candidate internal parameters to project the preset point in the camera coordinate system to the pixel coordinate system, obtaining multiple third coordinate values.
- the fourth coordinate value of each preset point observed in the pixel coordinate system and the corresponding third coordinate value are regarded as a set of coordinate pairs with a corresponding relationship, and the second candidate internal parameter corresponding to the smallest distance over the multiple sets of coordinate pairs is used as the second internal parameter of the camera.
- the second internal parameter is the internal parameter of the camera determined according to the plurality of third images after de-distortion.
- the multiple second candidate internal parameters of the camera are multiple internal parameters of the camera in the ideal state, determined from the multiple third images obtained by de-distorting the multiple first images, collected by the camera, of the first calibration board with different pose information.
- the second internal parameter is, among the multiple second candidate internal parameters, the one whose projection point has the smallest error value relative to the corresponding preset point in the pixel coordinate system.
- in other words, the second internal parameter is the internal parameter of the camera in the ideal, distortion-free state.
- step 102-3 according to a plurality of third images and the second internal parameters of the camera, the external parameters of the first calibration plate of each pose relative to the camera are determined.
- the external parameters of the calibration board relative to the camera may include a rotation matrix R and a translation matrix T.
- the homography matrix is a matrix describing the positional mapping relationship between the world coordinate system and the pixel coordinate system.
- the multiple first images captured by the camera can be de-distorted according to the first internal parameters of the camera to obtain multiple third images, and the second internal parameters of the camera, equivalent to the internal parameters of a distortion-free camera under ideal conditions, can be determined from the multiple third images. Then, according to the multiple third images and the second internal parameters, the external parameters of the first calibration board relative to the camera are determined; the external parameters of the first calibration board relative to the camera obtained in this way have higher accuracy.
- the above step 102-3 may include the following steps.
- steps 102-31 the homography matrix corresponding to each third image is determined respectively.
- the homography matrix H corresponding to each third image can be calculated from the relation s·[u, v, 1]^T = H·[X, Y, 1]^T, where H = A·[r1 r2 t], in which:
- r1, r2, and r3 are the rotation column vectors that make up the rotation matrix R, each of dimension 3×1, and t is the vector form of the translation matrix T;
- (u, v) are the pixel coordinates;
- (X, Y) are the corresponding coordinates on the calibration board plane;
- s is the scale factor.
- the homography matrix H corresponding to each of the multiple third images can be calculated from this relation (formula 5).
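As an illustration (the patent does not prescribe an implementation), H can be estimated from four or more board-plane/pixel correspondences with the direct linear transform:

```python
import numpy as np

def homography_dlt(board_xy, pixel_uv):
    """Estimate H satisfying s*[u,v,1]^T = H*[X,Y,1]^T from >= 4
    board-plane/pixel correspondences (a DLT sketch of formula 5)."""
    rows = []
    for (X, Y), (u, v) in zip(board_xy, pixel_uv):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    # The null vector of the stacked system gives H up to scale.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```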
- steps 102-32 according to the second internal parameters of the camera and a plurality of the homography matrices, the external parameters of the first calibration plate of each pose relative to the camera are determined.
- with the homography matrix written as H = [h1 h2 h3] and λ representing the scale factor:
- r1 = λ·A^(-1)·h1
- r2 = λ·A^(-1)·h2
- r3 = r1 × r2
- λ = 1/‖A^(-1)·h1‖ = 1/‖A^(-1)·h2‖
- t = λ·A^(-1)·h3
- r1, r2 and r3 constitute the 3×3 rotation matrix R.
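A minimal sketch of this recovery, assuming H = A·[r1 r2 t] and noise-free, already de-distorted data:

```python
import numpy as np

def extrinsics_from_homography(A, H):
    """Zhang-style recovery of R and t from a homography (steps 102-32)."""
    Ainv = np.linalg.inv(A)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(Ainv @ h1)   # scale factor lambda
    r1 = lam * (Ainv @ h1)
    r2 = lam * (Ainv @ h2)
    r3 = np.cross(r1, r2)                   # completes the rotation
    t = lam * (Ainv @ h3)
    return np.column_stack([r1, r2, r3]), t
```

With noisy data the resulting R would additionally be re-orthogonalized, e.g. via SVD.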
- the homography matrix corresponding to each third image can be determined separately, and the external parameters of the first calibration board of each pose relative to the camera can be determined from the obtained multiple homography matrices and the second internal parameters of the camera, which makes the external parameters of the first calibration board relative to the camera more accurate.
- the foregoing step 103 may include the following steps.
- step 103-1 for the first calibration board of each pose, according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameters between the radar and the camera, A set of target radar point cloud data matching the first calibration board is determined from the radar point cloud data in the pose.
- the external parameter reference value may be a rough estimate of the external parameter value between the radar and the camera based on the approximate position and orientation between the radar and the camera.
- the coordinate system of the radar can be aligned with the camera coordinate system according to the reference value of the external parameters, unifying them into the camera coordinate system.
- for the first calibration board of each pose, based on the external parameters of the first calibration board relative to the camera and the reference value of the external parameters between the radar and the camera, the M-estimator SAmple Consensus (MSAC) algorithm is used to determine the target plane where the first calibration board is located. Further, a MeanShift clustering algorithm is used to determine, among the radar point cloud data corresponding to the target plane, the target radar point cloud data that matches the first calibration board.
- MSAC is short for M-estimator SAmple Consensus.
- step 103-2 the target external parameters between the radar and the camera are determined according to the matching relationship between the multiple sets of the target radar point cloud data and the first calibration plates of each pose.
- there is only one pose from the radar to the camera, while the number of poses of the first calibration board used is, for example, n. Therefore, in step 103-1, n sets of target radar point cloud data can be obtained.
- each set of target radar point cloud data can be matched with the first calibration board of the corresponding pose, giving n matching relationships. Through these n matching relationships, the external parameters between the radar and the camera can be calculated.
- the least square method may be used to determine the target external parameters between the radar and the camera.
- an M-estimation algorithm determines the target plane where the first calibration board is located; further, a mean-shift clustering algorithm is used to determine, among the radar point cloud data corresponding to the target plane, a group of target radar point cloud data matching the first calibration board. The target radar point cloud data matching the first calibration board is thus determined automatically from the radar point cloud data, which reduces matching errors and improves the accuracy of the point cloud matching. According to the matching relationships between the multiple sets of target radar point cloud data and the first calibration board, the target external parameters between the radar and the camera are determined; this allows the target external parameters to be determined quickly and improves their accuracy.
- the above-mentioned step 103-1 may include the following steps.
- step 103-11 according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameters between the radar and the camera, the candidate position of the first calibration board is determined.
- based on the external parameters of the first calibration board relative to the camera in the pose and the estimated reference value of the external parameters between the radar and the camera, the position of the first calibration board can be estimated from the radar point cloud data collected on the first calibration board, giving its approximate position and direction. This approximate position and direction is used as the estimated candidate position.
- the candidate position represents the approximate position of the first calibration plate in a map composed of radar point cloud data.
- step 103-12 according to the candidate position, the target plane where the first calibration board is located is determined from the radar point cloud data in the pose.
- multiple first radar points located in the area corresponding to the candidate position can be randomly selected, and a first plane composed of the multiple first radar points can be obtained. This selection is repeated multiple times to obtain multiple first planes.
- the distances from other radar points except the multiple first radar points in the group of radar point cloud data to the first plane are respectively calculated.
- the radar point whose distance value is less than the preset threshold among other radar points is used as the second radar point, and the second radar point is determined as the radar point in the first plane.
- the first plane with the largest number of radar points is used as the target plane where the first calibration board is located.
- the target plane represents the plane on which the first calibration board is located in a graph composed of radar point cloud data.
- step 103-13 on the target plane, a set of target radar point cloud data matching the first calibration board of the pose is determined.
- the first circular area is randomly determined according to the size of the first calibration plate.
- the initial first circular area may be the area corresponding to the circumscribed circle of the first calibration plate.
- any radar point located in the initial first circular area is randomly selected as the first circle center of the first circular area, to adjust the position of the first circular area in the radar point cloud data.
- the size of the first calibration board is the size of the first calibration board in the figure composed of radar point cloud data.
- a plurality of first vectors are obtained respectively.
- the second vector is obtained by adding the plurality of first vectors.
- the target center position of the first calibration plate is determined.
- the target center position of the first calibration board is the determined center position of the first calibration board in a map composed of radar point cloud data.
- a group of target radar point cloud data matching the first calibration board is determined in the radar point cloud data.
- steps 103-12 may include the following steps.
- steps 103-121 two or more first radar groups are determined from the radar point cloud data in the pose, and for each first radar group, the corresponding first plane is determined, where each first radar group includes a plurality of randomly selected first radar points located in the area corresponding to the candidate position, and the first plane corresponding to each first radar group contains the plurality of first radar points of that group.
- a plurality of first radar points located in the area corresponding to the candidate position may be randomly selected each time to obtain a first radar group, and each time a first plane composed of the plurality of first radar points of that group can be obtained. By repeating the random selection multiple times, multiple first planes can be obtained.
- the radar points include 1, 2, 3, 4, 5, 6, 7, and 8
- for example, first radar points 1, 2, 3, and 4 are randomly selected the first time to form first plane 1; first radar points 1, 2, 4, and 6 are randomly selected the second time to form first plane 2; and first radar points 2, 6, 7, and 8 are randomly selected the third time to form first plane 3.
- steps 103-122 for each first plane, the distances to the first plane from the radar points in the pose's radar point cloud data other than the first radar points of that plane are determined.
- the distance values from the other radar points 5, 6, 7, and 8 to the first plane 1 can be calculated, and for the first plane 2, the distances from the other radar points 3, 5, 7, and 8 can be calculated respectively.
- the distance values of the other radar points 1, 3, 4, and 5 to the first plane 3 can be calculated.
- a radar point among the other radar points whose distance is less than a threshold is used as a second radar point, and the second radar point is determined to be included in the corresponding first plane.
- for example, it can be assumed that first plane 1 includes radar points 1, 2, 3, 4, and 5; first plane 2 includes radar points 1, 2, 4, 6, and 7; and first plane 3 includes radar points 1, 3, 4, 5, 6, and 8.
- the first plane that includes the largest number of radar points is used as the target plane.
- the first plane with the largest number of radar points, for example first plane 3, is used as the target plane where the first calibration board is located.
- the above method can be used to determine a target plane where the first calibration plate of a certain pose is located for each group of radar point cloud data.
- the fitted target plane is more accurate and has high availability.
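The plane search of steps 103-121 to 103-123 can be sketched with a plain RANSAC-style consensus loop; note the patent uses the related MSAC estimator, which scores inliers by their distances rather than by count, so the simpler count-based variant below is an illustrative substitute:

```python
import numpy as np

def ransac_plane(points, threshold, iterations=200, seed=0):
    """Repeatedly fit a plane to 3 random radar points and keep the plane
    that gathers the most points within `threshold` (steps 103-121..123)."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(iterations):
        p = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(p[1] - p[0], p[2] - p[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:                      # degenerate (collinear) sample
            continue
        n = n / norm
        dist = np.abs((points - p[0]) @ n)    # point-to-plane distances
        inliers = dist < threshold
        if inliers.sum() > best.sum():
            best = inliers
    return best
```

The inlier set returned corresponds to the radar points of the winning first plane, i.e. the target plane of the calibration board.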
- steps 103-13 may include the following steps.
- an initial first circular area is determined according to the size of the first calibration plate.
- in this step, the initial first circular area can be determined on the target plane according to the size of the first calibration board, and the size of the initial first circular area may adopt the size of the circumscribed circle of the first calibration board.
- the size of the first calibration board is the size of the first calibration board in the figure composed of radar point cloud data.
- any radar point located in the initial first circular area is randomly selected as the first circle center of the first circular area, to determine the position of the first circular area in the radar point cloud data.
- that is, in the initial first circular area, a radar point is randomly selected as the first circle center of the first circular area.
- the position of the first circular area in the radar point cloud data is subsequently adjusted through the first circle center.
- the radius of the first circular area is the same as the radius of the initial first circular area.
- steps 103-133 using the first circle center as a starting point, and a plurality of third radar points located in the first circular area in the radar point cloud data as an end point, a plurality of first vectors are obtained respectively.
- for example, the first circle center 170 can be used as the starting point, and the multiple third radar points 171 located in the first circular area in the radar point cloud data can be used as the end points, so as to obtain the multiple first vectors 172.
- the third radar points 171 effectively cover the circular area, as shown in FIG. 17.
- steps 103-134 the multiple first vectors are added to obtain a second vector.
- the addition yields a MeanShift vector, that is, the second vector.
- the target center position of the first calibration plate is determined based on the second vector.
- the end point of the second vector is taken as the second circle center, and the second circular area is obtained according to the size of the first calibration board. Taking the second circle center as the starting point and multiple fourth radar points in the second circular area as end points, multiple third vectors are obtained respectively. The multiple third vectors are added to obtain the fourth vector, and the end point of the fourth vector is then used as the new second circle center to obtain a new second circular area. The above steps for determining the fourth vector are repeated until the fourth vector converges to the preset value, at which point the corresponding second circle center is used as the candidate center position of the first calibration board.
- the candidate center position is the candidate center position of the first calibration board in the map composed of radar point cloud data.
- if the candidate center position coincides with the center position of the first calibration board, it can be directly used as the target center position; otherwise, a new candidate center position can be re-determined until the final target center position is found.
- steps 103-136 according to the target center position of the first calibration board and the size of the first calibration board, a group of target radar point cloud data matching the first calibration board is determined in the radar point cloud data.
- the region corresponding to the first calibration board can be determined according to its target center position and size and matched with the first calibration board of the pose, so that the radar point cloud data falling within the position of the first calibration board can be used as the target radar point cloud data.
- steps 103-135 may include:
- steps 103-1351 the end point of the second vector is taken as the second circle center, and the second circular area is determined according to the second circle center and the size of the first calibration board.
- that is, the end point of the second vector may be used as the second circle center, and a circle is drawn with the second circle center as the new center and the radius of the circumscribed circle of the first calibration board as the radius, to obtain the second circular area.
- steps 103-1352 using the second circle center as a starting point, and multiple fourth radar points located in the second circular area in the radar point cloud data as an ending point, multiple third vectors are determined respectively.
- that is, the second circle center is used as a starting point, and multiple fourth radar points located in the second circular area in the radar point cloud data are used as end points, and multiple third vectors are obtained respectively.
- steps 103-1353 the multiple third vectors are added to obtain a fourth vector.
- steps 103-1354 it is determined whether the vector value of the fourth vector converges to a preset value.
- the preset value may be close to zero.
- steps 103-1355 if not converged, the end point of the fourth vector is used as the new second circle center, the second circular area is determined according to the second circle center and the size of the first calibration board, and the process jumps to steps 103-1352.
- that is, the end point of the fourth vector can again be used as the new second circle center, a new fourth vector can be calculated according to the above steps 103-1352 to 103-1354, and it can be determined whether the vector value of the fourth vector converges.
- the above process is repeated continuously until the finally obtained vector value of the fourth vector converges to the preset value.
- steps 103-1356 the second circle center corresponding to the converged fourth vector is used as the candidate center position of the first calibration plate.
- the second circle center corresponding to the fourth vector may be used as the candidate center position of the first calibration plate.
- steps 103-1357 in response to the candidate center position being coincident with the center position of the first calibration plate, the candidate center position is taken as the target center position.
- steps 103-135 may further include the following steps.
- steps 103-1358 in response to the candidate center position being not coincident with the center position of the first calibration plate, the candidate center position is re-determined.
- all radar points in the second circular area may be deleted, and a new second circular area determined again. Alternatively, this group of radar point cloud data may be discarded, and the candidate center position of the first calibration board re-determined from another group of radar point cloud data corresponding to another pose of the first calibration board, until the determined candidate center position coincides with the center position of the first calibration board.
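The iteration of steps 103-131 to 103-1356 is essentially a mean-shift search with a flat circular kernel. In the sketch below, moving the circle center to the mean of the radar points inside the circle is equivalent to summing the "first vectors" from the center to each point and shifting by their average; the names and the planar (2-D) setting are illustrative:

```python
import numpy as np

def mean_shift_center(points, start, radius, tol=1e-6, max_iter=100):
    """Mean-shift search for the board center on the target plane:
    repeatedly move the circle center to the mean of the radar points
    inside the circle until the shift (fourth vector) converges."""
    center = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        inside = points[np.linalg.norm(points - center, axis=1) < radius]
        if len(inside) == 0:
            break
        shift = inside.mean(axis=0) - center   # averaged sum of the vectors
        center = center + shift
        if np.linalg.norm(shift) < tol:        # converged to the preset value
            break
    return center
```

The radius would be taken from the circumscribed circle of the first calibration board, as in steps 103-131 and 103-1351.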
- step 103-2 may include: determining an alternative external parameter between the radar and the camera according to g matching relationships, and according to the difference between the radar and the camera A plurality of candidate external parameters are used to determine the target external parameters between the radar and the camera.
- the least-squares method is adopted to determine a candidate external parameter by minimizing the sum of squares of the external parameter error between the radar and the camera, where g is an integer greater than or equal to 3.
- the first calibration board with pose information 1 corresponds to target radar point cloud data 1, the first calibration board with pose information 2 corresponds to target radar point cloud data 2, and so on; there are n sets of matching relationships.
- the candidate external parameter 1 can be determined based on the first three sets of matching relationships
- the candidate external parameter 2 can be determined based on the first four sets of matching relationships
- the candidate external parameter 3 can be determined based on the first two sets and the fourth set of matching relationships, and so on.
- the candidate external parameter with the best projection effect is determined as the target external parameter between the radar and the camera.
- the candidate external parameters between the radar and the camera can be determined according to multiple matching relationships, and based on the multiple candidate external parameters, the candidate external parameter with the best projection effect is selected as the target external parameter between the radar and the camera, which improves the accuracy of the target external parameter between the radar and the camera.
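As one concrete reading of the least-squares step above: if each matching relationship yields the board's position in both the camera frame (from the camera extrinsics) and the radar frame (from the matched point cloud), a candidate radar-to-camera extrinsic can be solved in closed form. The sketch below uses the Kabsch SVD solution on board-center correspondences; this choice of solver and correspondence, and all numeric values, are assumptions for illustration, not the patent's prescribed method.

```python
import numpy as np

def solve_extrinsic(radar_pts, cam_pts):
    """Least-squares rigid transform (R, t) with cam = R @ radar + t,
    solved via the Kabsch SVD method; needs >= 3 non-collinear
    correspondences, matching the g >= 3 condition in the text."""
    rc, cc = radar_pts.mean(axis=0), cam_pts.mean(axis=0)
    H = (radar_pts - rc).T @ (cam_pts - cc)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cc - R @ rc
    return R, t

# Recover a known radar-to-camera transform from 4 synthetic
# board-position correspondences (one per board pose).
rng = np.random.default_rng(1)
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([0.2, -0.1, 1.5])
radar_pts = rng.uniform(-2.0, 2.0, (4, 3))
cam_pts = radar_pts @ R_true.T + t_true
R, t = solve_extrinsic(radar_pts, cam_pts)
```

Solving the same problem on different subsets of the matching relationships (the first three sets, the first four sets, and so on, as described above) yields the multiple candidate external parameters that are then compared by projection effect.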
- step 103 may further include the following steps.
- step 103-21 the radar projects the first calibration plate based on each candidate external parameter onto any first image to generate a set of projection data.
- each candidate external parameter between the radar and the camera, the camera intrinsic parameter matrix, and the radar point cloud data are multiplied to project the radar point cloud data onto a certain first image.
- the radar point cloud data may be one of multiple groups of radar point cloud data collected before, or it may be newly collected radar point cloud data.
- the collected target needs to include the first calibration board.
- step 103-22 a group of projection data with the highest degree of projection matching with the first image among the multiple groups of projection data is determined to be the target projection data.
- a set of projection data with the highest degree of projection matching with the first image is determined, and the set of projection data is determined as the target projection data.
- for the projection data obtained by projecting onto the first image: for example, as shown in Figure 21A and Figure 21B, the projection effect of Figure 21A is better than that of Figure 21B, so the projection data corresponding to Figure 21A is the target projection data.
- step 103-23 the candidate external parameter corresponding to the target projection data is determined to be the target external parameter between the radar and the camera.
- the candidate external parameter corresponding to the target projection data is the target external parameter between the radar and the camera.
- multiple candidate external parameters can be verified according to the projection effect, and the candidate external parameter with the best projection effect is used as the final target external parameter, which improves the accuracy of the target external parameter between the radar and the camera.
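The verification projection described above (candidate extrinsic × camera intrinsic × radar point) reduces to a standard pinhole projection. A minimal sketch; the intrinsic values, the identity candidate extrinsic, and the sample points are illustrative assumptions:

```python
import numpy as np

def project_radar_points(points_radar, R, t, K):
    """Project radar-frame 3D points into the image with a candidate
    extrinsic (R, t) and the camera intrinsic matrix K, i.e. the
    'extrinsic x intrinsic x point cloud' product described above."""
    pts_cam = points_radar @ R.T + t          # radar frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]      # keep points in front of the camera
    uv = pts_cam @ K.T                        # pinhole projection
    return uv[:, :2] / uv[:, 2:3]             # divide by depth -> pixel coordinates

K = np.array([[800.0,   0.0, 320.0],          # illustrative fx, fy, cx, cy
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                                 # identity candidate extrinsic (demo)
t = np.array([0.0, 0.0, 0.0])
pts = np.array([[0.0, 0.0, 2.0],              # point on the optical axis
                [0.5, 0.0, 2.0]])
uv = project_radar_points(pts, R, t, K)       # -> [[320, 240], [520, 240]]
```

Repeating this projection for every candidate extrinsic and scoring how well the projected board points overlap the board in the first image yields the matching degrees compared in steps 103-21 to 103-23.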
- the above-mentioned radar and camera can be deployed on the vehicle, and the radar can be a lidar.
- the radar and camera can be deployed at different positions of the vehicle at the same time. For example, as shown in FIG. 22, the radar 2220 and the camera 2210 can be deployed at the rear, the front windshield, and other positions. After the first internal parameter of the camera 2210 is determined, if the target external parameter between the radar 2220 and the camera 2210 needs to be re-determined, the previously calibrated first internal parameter can be obtained directly to quickly determine the target external parameter, improving the accuracy of the target external parameter between the radar 2220 and the camera 2210.
- the above-mentioned methods provided in the embodiments of the present disclosure can be applied to machinery and equipment, which can be manually driven or unmanned, such as airplanes, vehicles, drones, unmanned vehicles, robots, and so on.
- the two sensors, the radar and the camera, can be set above the center console, close to the front windshield. Due to the movement of the vehicle, the attitude of at least one of the radar and the camera will change, and the external parameters between the radar and the camera need to be recalibrated. Due to the refraction of light by the front windshield, the originally calibrated internal parameters of the camera become inaccurate during application, which affects the accuracy of the external parameters between the radar and the camera.
- the target external parameter between the lidar and the camera is finally determined according to the external parameters of the first calibration board relative to the camera and the multiple sets of radar point cloud data under different pose information. This can quickly calibrate the target external parameter between the lidar and the camera, with high availability.
- the radar is deployed on the front bumper of the vehicle, and the camera is deployed at the position of the rearview mirror of the vehicle.
- the first calibration plate 2331 is located in the common field of view of the radar 2320 and the camera 2310.
- the first calibration board can be fixed on the ground or held by the staff.
- when the camera 2310 calibrates the first internal parameter, multiple first images containing the first calibration board 2331 are used. Since the radar 2320 and the camera 2310 are not on the same horizontal plane and the camera 2310 is far from the ground, the first calibration board 2331 may occupy only part of each first image. In this case, the accuracy of the internal parameters of the camera 2310 calibrated according to the multiple first images is poor.
- the camera internal parameters can be calibrated through the second calibration plate 2332 located within the field of view of the camera 2310 and at a relatively short distance from the camera 2310.
- the horizontal distance between the second calibration plate 2332 and the camera 2310 is less than the horizontal distance between the first calibration board 2331 and the camera 2310. The second calibration board 2332 can be fixed on the vehicle, so the second images collected in this case can include the complete second calibration board 2332, and a more accurate first internal parameter of the camera 2310 can then be obtained.
- since the camera 2310 and the radar 2320 are deployed on the vehicle, the distance between the camera 2310 and the ground is greater than the distance between the radar 2320 and the ground, and the horizontal distance between the second calibration plate 2332 and the camera 2310 is smaller than the horizontal distance between the first calibration plate 2331 and the camera 2310, the multiple second images collected by the camera 2310 include the complete second calibration plate 2332, thereby improving the accuracy of calibrating the camera internal parameters.
- the present disclosure also provides an embodiment of an apparatus.
- FIG. 24 is a block diagram of a sensor calibration device according to an exemplary embodiment of the present disclosure.
- the first calibration board is located in the common field of view of the radar and the camera, and the device includes:
- the acquisition module 210 is used to acquire multiple first images through the camera, wherein the pose information of the first calibration board in the multiple first images is different;
- the first determination module 220 is used to acquire the pre-calibrated first internal parameter of the camera, and to determine, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board of each pose relative to the camera;
- the second determination module 230 is used to acquire multiple sets of radar point cloud data of the first calibration board of each pose, and to determine the target external parameter between the radar and the camera according to those external parameters and the multiple sets of radar point cloud data.
- the device further includes: a calibration module, configured to calibrate the camera in response to the initial calibration of the sensor, to obtain the first internal parameter of the camera; the first determination module includes: an acquisition sub-module, configured to, in response to calibrating the sensor again, acquire the first internal parameter of the camera obtained when the sensor was calibrated for the first time.
- the second calibration board is located within the field of view of the camera, and the calibration module includes: a collection sub-module, configured to collect multiple second images through the camera, wherein the pose information of the second calibration board differs among the multiple second images; and a first determination sub-module, configured to determine multiple first candidate internal parameters of the camera according to the multiple second images, and to determine one of the multiple first candidate internal parameters as the first internal parameter, wherein each second image corresponds to one first candidate internal parameter.
- the first determining submodule includes: a projection unit, configured to project, through the camera and according to each of the multiple first candidate internal parameters, a preset point located in the camera coordinate system to the pixel coordinate system, to obtain multiple first coordinate values of the preset point in the pixel coordinate system; and a first determining unit, configured to, for each first candidate internal parameter, obtain the second coordinate value of the preset point in a verification image, and determine the first coordinate value corresponding to the second coordinate value to obtain a coordinate pair having a corresponding relationship, wherein the verification image is one or more of the multiple second images.
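The selection rule implemented by these units can be sketched as follows: each candidate intrinsic projects the preset camera-frame point to a first coordinate value, which is compared against the point's observed second coordinate value in the verification image, and the candidate with the smallest distance wins. The candidate matrices, the preset point, and the observed pixel below are illustrative assumptions:

```python
import numpy as np

def pick_intrinsic(candidates, point_cam, observed_uv):
    """Project a preset camera-frame point with each candidate intrinsic
    matrix (first coordinate value) and return the index of the candidate
    whose projection lies closest to the observed pixel location
    (second coordinate value) in the verification image."""
    def project(K, p):
        uv = K @ p
        return uv[:2] / uv[2]
    dists = [np.linalg.norm(project(K, point_cam) - observed_uv)
             for K in candidates]
    return int(np.argmin(dists))

K_good = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
K_bad  = np.array([[700.0, 0.0, 300.0], [0.0, 700.0, 260.0], [0.0, 0.0, 1.0]])
point_cam = np.array([0.1, -0.05, 2.0])       # preset point in the camera frame
observed = (K_good @ point_cam)[:2] / point_cam[2]  # pretend ground-truth pixel
best = pick_intrinsic([K_bad, K_good], point_cam, observed)  # -> 1
```

In practice the distance would be accumulated over several preset points and verification images rather than a single pair.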
- the first determining module includes: an anti-distortion sub-module, configured to, for each first image in the plurality of first images, perform de-distortion processing on the first image according to the first internal parameter to obtain a third image corresponding to the first image; a second determining sub-module, configured to determine the second internal parameter of the camera according to the plurality of third images; and a third determining sub-module, configured to determine the external parameters of the first calibration board of each pose relative to the camera according to the plurality of third images and the second internal parameter of the camera.
- the third determining submodule includes: a third determining unit, configured to determine the homography matrix corresponding to each third image; and a fourth determining unit, configured to determine, according to the second internal parameter of the camera and the plurality of homography matrices, the external parameters of the first calibration board of each pose relative to the camera.
- the second determination module includes: a fourth determining sub-module, configured to, for the first calibration board of each pose, determine a set of target radar point cloud data matching that first calibration board in the radar point cloud data in that pose, according to the external parameters of that first calibration board relative to the camera and the reference value of the external parameter between the radar and the camera; and a fifth determining sub-module, configured to determine the target external parameter between the radar and the camera according to the matching relationships between the multiple sets of target radar point cloud data and the first calibration boards of the respective poses.
- the fourth determining sub-module includes: a fifth determining unit, configured to determine the candidate position where the first calibration board is located, according to the external parameters of the first calibration board relative to the camera and the reference value of the external parameter between the radar and the camera; a sixth determining unit, configured to determine, according to the candidate position, the target plane where the first calibration board is located from the radar point cloud data in the pose; and a seventh determining unit, configured to determine, on the target plane corresponding to the radar point cloud data in the pose, a set of target radar point cloud data matching the first calibration board.
- the sixth determining unit includes: a first determining subunit, configured to determine two or more first radar groups from the radar point cloud data in the pose, wherein each first radar group includes a plurality of randomly selected first radar points located in the area corresponding to the candidate position, and to determine, for each first radar group, a corresponding first plane, wherein the first plane corresponding to each first radar group includes the plurality of first radar points of that group; a second determining subunit, configured to determine, for each first plane, the distances from the other radar points (other than the plurality of first radar points) in the radar point cloud data in the pose to that first plane; and a third determining subunit, configured to take, for each first plane, the radar points among the other radar points whose distance is less than a threshold as second radar points, to determine the second radar points as radar points included in that first plane, and to take, among the plurality of first planes, the first plane including the largest number of radar points as the target plane.
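The plane search performed by these subunits is in effect a RANSAC scheme: randomly drawn groups propose candidate planes, points within a distance threshold become inliers (the "second radar points"), and the plane gathering the most points becomes the target plane. A sketch under the assumption of three-point groups; the group count, threshold, and synthetic data are illustrative:

```python
import numpy as np

def fit_plane(p0, p1, p2):
    """Plane through three points, returned as (unit normal, point on plane)."""
    n = np.cross(p1 - p0, p2 - p0)
    return n / np.linalg.norm(n), p0

def ransac_target_plane(points, n_groups=50, threshold=0.02, rng=None):
    """Draw random three-point 'first radar groups', fit a first plane to
    each, count the radar points within `threshold` of it (the second
    radar points), and keep the plane with the most points as the
    target plane."""
    if rng is None:
        rng = np.random.default_rng()
    best_plane, best_inliers = None, None
    for _ in range(n_groups):
        idx = rng.choice(len(points), 3, replace=False)
        normal, origin = fit_plane(*points[idx])
        dist = np.abs((points - origin) @ normal)  # point-to-plane distances
        inliers = points[dist < threshold]
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_plane, best_inliers = (normal, origin), inliers
    return best_plane, best_inliers

# 200 radar points near the plane z = 1 (the board) plus 50 stray points.
rng = np.random.default_rng(2)
board = np.c_[rng.uniform(-1, 1, (200, 2)), 1.0 + rng.normal(0, 0.005, 200)]
stray = rng.uniform(-1, 1, (50, 3))
(normal, origin), inliers = ransac_target_plane(np.vstack([board, stray]), rng=rng)
```

In the patent's setting the random groups would be restricted to the area corresponding to the candidate position, which shrinks the search space before the plane vote.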
- the seventh determining unit includes: a fifth determining subunit, configured to determine an initial first circular area on the target plane according to the size of the first calibration plate; a selecting subunit, configured to randomly select any radar point located in the initial first circular area in the radar point cloud data as the first circle center of the first circular area, so as to determine the position of the first circular area in the radar point cloud data; a sixth determining subunit, configured to take the first circle center as a starting point and a plurality of third radar points located in the first circular area in the radar point cloud data as end points, to obtain multiple first vectors respectively; a seventh determining subunit, configured to add the multiple first vectors to obtain a second vector; an eighth determining subunit, configured to determine the target center position of the first calibration plate based on the second vector; and a ninth determining subunit, configured to determine, in the radar point cloud data according to the target center position and the size of the first calibration plate, a set of target radar point cloud data matching the first calibration plate.
- the eighth determining subunit is configured to: take the end point of the second vector as the second circle center, and determine the second circular area according to the second circle center and the size of the first calibration plate; take the second circle center as a starting point and a plurality of fourth radar points located in the second circular area in the radar point cloud data as end points, to determine multiple third vectors respectively; add the multiple third vectors to obtain a fourth vector; determine whether the vector value of the fourth vector converges to the preset value; in response to the vector value of the fourth vector converging to the preset value, take the second circle center corresponding to the converged fourth vector as the candidate center position of the first calibration plate; and in response to the candidate center position coinciding with the center position of the first calibration plate, take the candidate center position as the target center position.
- the eighth determining subunit is further configured to: in response to the vector value of the fourth vector not converging to the preset value, take the end point of the non-converged fourth vector as the second circle center, re-determine the multiple third vectors, and re-determine the fourth vector.
- the eighth determining subunit further includes: re-determining the candidate center position in response to that the candidate center position does not coincide with the center position of the first calibration plate.
- the fifth determining submodule includes: an eighth determining unit, configured to determine one candidate external parameter between the radar and the camera according to g matching relationships, where g is an integer greater than or equal to 3, and to determine the target external parameter between the radar and the camera according to multiple candidate external parameters between the radar and the camera.
- the eighth determining unit includes: a tenth determining subunit, configured to project, by the radar, the first calibration plate based on each candidate external parameter onto any one of the first images to generate a set of projection data; an eleventh determining subunit, configured to determine, among the multiple sets of projection data, the set of projection data whose projection has the highest degree of matching with that first image as the target projection data; and a twelfth determining subunit, configured to determine the candidate external parameter corresponding to the target projection data as the target external parameter between the radar and the camera.
- the radar and the camera are deployed on a vehicle, and the radar is a lidar.
- the second calibration board is located within the field of view of the camera, and the second calibration board is used to calibrate the first internal parameter of the camera; the distance of the camera relative to the ground is greater than the distance of the radar relative to the ground; the horizontal distance between the second calibration board and the camera is less than the horizontal distance between the first calibration board and the camera; and the plurality of second images include the complete second calibration board.
- the first image includes the complete first calibration board
- the radar point cloud data includes point cloud data obtained based on the complete first calibration board
- for the relevant parts, reference may be made to the description of the method embodiments.
- the device embodiments described above are merely illustrative, where the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the present disclosure. Those of ordinary skill in the art can understand and implement them without creative work.
- the embodiment of the present disclosure also provides a computer-readable storage medium, the storage medium stores a computer program, and when the computer program is executed by a processor, the processor is prompted to implement any one of the aforementioned sensor calibration methods.
- the computer-readable storage medium may be a non-volatile storage medium.
- the embodiments of the present disclosure provide a computer program product, including computer-readable code, and when the computer-readable code runs on a device, the device is caused to execute the sensor calibration method of any of the above embodiments.
- the embodiments of the present disclosure also provide another computer program product for storing computer-readable instructions, which, when executed, cause the computer to perform the operations of the sensor calibration method provided in any of the above-mentioned embodiments.
- the computer program product can be specifically implemented by hardware, software, or a combination thereof.
- the computer program product is specifically embodied as a computer storage medium.
- the computer program product is specifically embodied as a software product, such as a software development kit (SDK), etc.
- An embodiment of the present disclosure also provides a sensor calibration device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to call the executable instructions to implement the sensor calibration method of any of the above embodiments.
- FIG. 25 is a schematic diagram of the hardware structure of a sensor calibration device provided by an embodiment of the application.
- the sensor calibration device 310 includes a processor 311, and may also include an input device 312, an output device 313, a memory 314, and a bus 315.
- the input device 312, the output device 313, the memory 314, and the processor 311 are connected to each other through a bus 315.
- the memory may include a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a compact disc read-only memory (CD-ROM), and the like.
- the input device is used to input data and/or signals
- the output device is used to output data and/or signals.
- the output device and the input device can be independent devices or an integrated device.
- the processor may include one or more processors, such as one or more central processing units (CPU).
- the CPU may be a single-core CPU or a multi-core CPU.
- the processor is used to call the program code and data in the memory to execute the steps in the foregoing method embodiment.
- for details, please refer to the description in the method embodiment, which will not be repeated here.
- FIG. 25 only shows a simplified design of a calibration device for a sensor.
- the calibration device of the sensor may also include other necessary components, including but not limited to any number of input/output devices, processors, controllers, memories, etc.; all sensor calibration devices that can implement the embodiments of the present application are within the scope of protection of this application.
- the functions or modules contained in the device provided in the embodiments of the present disclosure can be used to execute the methods described in the above method embodiments.
- the embodiment of the present disclosure also provides a calibration system, which includes a camera, a radar and a first calibration board, the first calibration board being located in the common field of view of the camera and the radar, and the pose information of the first calibration board at different collection moments being different.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Radar Systems Or Details Thereof (AREA)
- Image Processing (AREA)
Abstract
Description
Claims (23)
- A sensor calibration method, wherein the sensor includes a radar and a camera, and a first calibration board is located in the common field of view of the radar and the camera, the method comprising: collecting multiple first images through the camera, wherein the pose information of the first calibration board differs among the multiple first images; acquiring a pre-calibrated first internal parameter of the camera; determining, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board of each pose relative to the camera; acquiring multiple sets of radar point cloud data of the first calibration board of each pose; and determining the target external parameter between the radar and the camera according to the external parameters of the first calibration board of each pose relative to the camera and the multiple sets of radar point cloud data.
- The method according to claim 1, wherein before the acquiring a pre-calibrated first internal parameter of the camera, the method further comprises: in response to calibrating the sensor for the first time, calibrating the camera to obtain the first internal parameter of the camera; and the acquiring a pre-calibrated first internal parameter of the camera comprises: in response to calibrating the sensor again, acquiring the first internal parameter of the camera obtained when the sensor was calibrated for the first time.
- The method according to claim 2, wherein a second calibration board is located within the field of view of the camera, and the calibrating the camera to obtain the first internal parameter of the camera comprises: collecting multiple second images through the camera, wherein the pose information of the second calibration board differs among the multiple second images; determining multiple first candidate internal parameters of the camera according to the multiple second images, wherein each second image corresponds to one first candidate internal parameter; and determining one of the multiple first candidate internal parameters as the first internal parameter.
- The method according to claim 3, wherein the determining one of the multiple first candidate internal parameters as the first internal parameter comprises: projecting, through the camera and according to each of the multiple first candidate internal parameters, a preset point located in the camera coordinate system to the pixel coordinate system, to obtain multiple first coordinate values of the preset point in the pixel coordinate system; for each first candidate internal parameter, acquiring the second coordinate value of the preset point in a verification image, and determining the first coordinate value corresponding to the second coordinate value to obtain a coordinate pair having a corresponding relationship, wherein the verification image is one or more of the multiple second images; for each first candidate internal parameter, determining the distance between the first coordinate value and the second coordinate value in the coordinate pair of that first candidate internal parameter; and determining the first candidate internal parameter with the smallest distance among the multiple first candidate internal parameters as the first internal parameter of the camera.
- The method according to any one of claims 1 to 4, wherein the determining, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board of each pose relative to the camera comprises: for each of the multiple first images, performing de-distortion processing on the first image according to the first internal parameter to obtain a third image corresponding to the first image; determining a second internal parameter of the camera according to the multiple third images; and determining, according to the multiple third images and the second internal parameter of the camera, the external parameters of the first calibration board of each pose relative to the camera.
- The method according to claim 5, wherein the determining, according to the multiple third images and the second internal parameter of the camera, the external parameters of the first calibration board of each pose relative to the camera comprises: determining the homography matrix corresponding to each third image; and determining, according to the second internal parameter of the camera and the multiple homography matrices, the external parameters of the first calibration board of each pose relative to the camera.
- The method according to claim 6, wherein the determining the target external parameter between the radar and the camera according to the external parameters of the first calibration board of each pose relative to the camera and the multiple sets of radar point cloud data comprises: for the first calibration board of each pose, determining, according to the external parameters of that first calibration board relative to the camera and a reference value of the external parameter between the radar and the camera, a set of target radar point cloud data matching that first calibration board in the radar point cloud data in that pose; and determining the target external parameter between the radar and the camera according to the matching relationships between the multiple sets of target radar point cloud data and the first calibration boards of the respective poses.
- The method according to claim 7, wherein the determining a set of target radar point cloud data matching that first calibration board comprises: determining, according to the external parameters of that first calibration board relative to the camera and the reference value of the external parameter between the radar and the camera, the candidate position where the first calibration board is located; determining, according to the candidate position, the target plane where the first calibration board is located in the radar point cloud data in that pose; and determining, on the target plane corresponding to the radar point cloud data in that pose, a set of target radar point cloud data matching that first calibration board.
- The method according to claim 8, wherein the determining the target plane comprises: determining two or more first radar groups from the radar point cloud data in that pose, wherein each first radar group includes a plurality of randomly selected first radar points located in the area corresponding to the candidate position; for each first radar group, determining a corresponding first plane, wherein the first plane corresponding to each first radar group includes the plurality of first radar points of that first radar group; for each first plane, determining the distances from the radar points other than the plurality of first radar points in the radar point cloud data in that pose to the first plane; for each first plane, taking the radar points among the other radar points whose distance is less than a threshold as second radar points; for each first plane, determining the second radar points as radar points included in the first plane; and taking, among the two or more first planes, the first plane including the largest number of radar points as the target plane.
- The method according to claim 8 or 9, wherein the determining, on the target plane, a set of target radar point cloud data matching that first calibration board comprises: determining an initial first circular area on the target plane according to the size of the first calibration board; randomly selecting, in the radar point cloud data, any radar point located in the initial first circular area as the first circle center of the first circular area, so as to determine the position of the first circular area in the radar point cloud data; taking the first circle center as a starting point and a plurality of third radar points located in the first circular area in the radar point cloud data as end points, obtaining multiple first vectors respectively; adding the multiple first vectors to obtain a second vector; determining the target center position of the first calibration board based on the second vector; and determining, in the radar point cloud data according to the target center position and the size of the first calibration board, the set of target radar point cloud data matching that first calibration board.
- The method according to claim 10, wherein the determining the target center position of the first calibration board based on the second vector comprises: taking the end point of the second vector as a second circle center, and determining a second circular area according to the second circle center and the size of the first calibration board; taking the second circle center as a starting point and a plurality of fourth radar points located in the second circular area in the radar point cloud data as end points, determining multiple third vectors respectively; adding the multiple third vectors to obtain a fourth vector; determining whether the vector value of the fourth vector converges to a preset value; in response to the vector value of the fourth vector converging to the preset value, taking the second circle center corresponding to the converged fourth vector as the candidate center position of the first calibration board; and in response to the candidate center position coinciding with the center position of the first calibration board, taking the candidate center position as the target center position.
- The method according to claim 11, further comprising: in response to the vector value of the fourth vector not converging to the preset value, taking the end point of the non-converged fourth vector as the second circle center, re-determining the multiple third vectors, and re-determining the fourth vector.
- The method according to claim 11, further comprising: in response to the candidate center position not coinciding with the center position of the first calibration board, re-determining the candidate center position.
- The method according to any one of claims 7 to 13, wherein the determining the target external parameter between the radar and the camera according to the matching relationships comprises: determining one candidate external parameter between the radar and the camera according to g matching relationships, wherein g is an integer greater than or equal to 3; and determining the target external parameter between the radar and the camera according to multiple candidate external parameters between the radar and the camera.
- The method according to claim 14, wherein the determining the target external parameter between the radar and the camera according to the multiple candidate external parameters comprises: projecting, by the radar, the first calibration board based on each candidate external parameter onto any one of the first images to generate a set of projection data; determining, among the multiple sets of projection data, the set of projection data whose projection has the highest degree of matching with that first image as the target projection data; and determining the candidate external parameter corresponding to the target projection data as the target external parameter between the radar and the camera.
- The method according to any one of claims 1 to 15, wherein the radar and the camera are deployed on a vehicle, and the radar is a lidar.
- The method according to claim 16, wherein a second calibration board is located within the field of view of the camera, and the second calibration board is used to calibrate the first internal parameter of the camera; the distance of the camera relative to the ground is greater than the distance of the radar relative to the ground; the horizontal distance between the second calibration board and the camera is less than the horizontal distance between the first calibration board and the camera; and the multiple second images include the complete second calibration board.
- The method according to any one of claims 1 to 17, wherein the first images include the complete first calibration board, and the radar point cloud data includes point cloud data obtained based on the complete first calibration board.
- A sensor calibration apparatus, wherein the sensor includes a radar and a camera, and a first calibration board is located in the common field of view of the radar and the camera, the apparatus comprising: a first acquisition module, configured to collect multiple first images through the camera, wherein the pose information of the first calibration board differs among the multiple first images; a first determination module, configured to acquire a pre-calibrated first internal parameter of the camera, and to determine, according to the first internal parameter and the multiple first images, the external parameters of the first calibration board of each pose relative to the camera; and a second determination module, configured to acquire multiple sets of radar point cloud data of the first calibration board of each pose, and to determine the target external parameter between the radar and the camera according to the external parameters of the first calibration board of each pose relative to the camera and the multiple sets of radar point cloud data.
- A computer-readable storage medium, wherein the storage medium stores a computer program, and when the computer program is executed by a processor, the processor is caused to implement the sensor calibration method according to any one of claims 1 to 18.
- A sensor calibration apparatus, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to call the executable instructions to implement the sensor calibration method according to any one of claims 1 to 18.
- A calibration system, comprising a camera, a radar and a first calibration board, wherein the first calibration board is located in the common field of view of the camera and the radar, and the pose information of the first calibration board differs at different collection moments.
- A computer program product, comprising computer-readable code, wherein when the computer-readable code runs on a device, the device is caused to implement the sensor calibration method according to any one of claims 1 to 18.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021530296A JP2022510924A (ja) | 2019-11-18 | 2020-10-21 | センサキャリブレーション方法及び装置、記憶媒体、キャリブレーションシステム並びにプログラム製品 |
US17/747,271 US20220276360A1 (en) | 2019-11-18 | 2022-05-18 | Calibration method and apparatus for sensor, and calibration system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911126534.8 | 2019-11-18 | ||
CN201911126534.8A CN112816949B (zh) | 2019-11-18 | 2019-11-18 | 传感器的标定方法及装置、存储介质、标定系统 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/747,271 Continuation US20220276360A1 (en) | 2019-11-18 | 2022-05-18 | Calibration method and apparatus for sensor, and calibration system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021098439A1 true WO2021098439A1 (zh) | 2021-05-27 |
Family
ID=75852431
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/122559 WO2021098439A1 (zh) | 2019-11-18 | 2020-10-21 | 传感器标定方法及装置、存储介质、标定系统和程序产品 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220276360A1 (zh) |
JP (1) | JP2022510924A (zh) |
CN (1) | CN112816949B (zh) |
WO (1) | WO2021098439A1 (zh) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113436270A (zh) * | 2021-06-18 | 2021-09-24 | 上海商汤临港智能科技有限公司 | 传感器标定方法及装置、电子设备和存储介质 |
CN113702931A (zh) * | 2021-08-19 | 2021-11-26 | 中汽创智科技有限公司 | 一种车载雷达的外参标定方法、装置及存储介质 |
CN113724303A (zh) * | 2021-09-07 | 2021-11-30 | 广州文远知行科技有限公司 | 点云与图像匹配方法、装置、电子设备和存储介质 |
CN113744348A (zh) * | 2021-08-31 | 2021-12-03 | 南京慧尔视智能科技有限公司 | 一种参数标定方法、装置及雷视融合检测设备 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11967111B2 (en) * | 2020-12-15 | 2024-04-23 | Kwangwoon University Industry-Academic Collaboration Foundation | Multi-view camera-based iterative calibration method for generation of 3D volume model |
CN114782556B (zh) * | 2022-06-20 | 2022-09-09 | 季华实验室 | 相机与激光雷达的配准方法、系统及存储介质 |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101882313A (zh) * | 2010-07-14 | 2010-11-10 | 中国人民解放军国防科学技术大学 | 单线激光雷达与ccd相机之间相互关系的标定方法 |
US20160320476A1 (en) * | 2015-04-28 | 2016-11-03 | Henri Johnson | Systems to track a moving sports object |
CN106228537A (zh) * | 2016-07-12 | 2016-12-14 | 北京理工大学 | 一种三维激光雷达与单目摄像机的联合标定方法 |
CN106840111A (zh) * | 2017-03-27 | 2017-06-13 | 深圳市鹰眼在线电子科技有限公司 | 物体间位置姿态关系实时统一系统及方法 |
CN107976669A (zh) * | 2016-10-21 | 2018-05-01 | 法乐第(北京)网络科技有限公司 | 一种确定相机与激光雷达之间的外参数的装置 |
CN107976668A (zh) * | 2016-10-21 | 2018-05-01 | 法乐第(北京)网络科技有限公司 | 一种确定相机与激光雷达之间的外参数的方法 |
CN108198223A (zh) * | 2018-01-29 | 2018-06-22 | 清华大学 | 一种激光点云与视觉图像映射关系快速精确标定方法 |
CN108509918A (zh) * | 2018-04-03 | 2018-09-07 | 中国人民解放军国防科技大学 | 融合激光点云与图像的目标检测与跟踪方法 |
CN108964777A (zh) * | 2018-07-25 | 2018-12-07 | 南京富锐光电科技有限公司 | 一种高速相机校准系统及方法 |
CN109146978A (zh) * | 2018-07-25 | 2019-01-04 | 南京富锐光电科技有限公司 | 一种高速相机成像畸变校准装置及方法 |
CN109946680A (zh) * | 2019-02-28 | 2019-06-28 | 北京旷视科技有限公司 | 探测系统的外参数标定方法、装置、存储介质及标定系统 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5051493B2 (ja) * | 2005-12-26 | 2012-10-17 | 株式会社Ihi | Three-dimensional measurement marker and three-dimensional measurement method using the same |
JP2014074632A (ja) * | 2012-10-03 | 2014-04-24 | Isuzu Motors Ltd | Calibration device and calibration method for a vehicle-mounted stereo camera |
EP2767846B1 (en) * | 2013-02-18 | 2017-01-11 | Volvo Car Corporation | Method for calibrating a sensor cluster in a motor vehicle |
US20190004178A1 (en) * | 2016-03-16 | 2019-01-03 | Sony Corporation | Signal processing apparatus and signal processing method |
JP6929123B2 (ja) * | 2017-05-10 | 2021-09-01 | 日本放送協会 | Camera calibration device and camera calibration program |
CN108764024B (zh) * | 2018-04-09 | 2020-03-24 | 平安科技(深圳)有限公司 | Apparatus and method for generating a face recognition model, and computer-readable storage medium |
CN109343061B (zh) * | 2018-09-19 | 2021-04-02 | 百度在线网络技术(北京)有限公司 | Sensor calibration method, apparatus, computer device, medium, and vehicle |
2019
- 2019-11-18 CN CN201911126534.8A patent/CN112816949B/zh active Active

2020
- 2020-10-21 WO PCT/CN2020/122559 patent/WO2021098439A1/zh active Application Filing
- 2020-10-21 JP JP2021530296A patent/JP2022510924A/ja not_active Ceased

2022
- 2022-05-18 US US17/747,271 patent/US20220276360A1/en not_active Abandoned
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113436270A (zh) * | 2021-06-18 | 2021-09-24 | 上海商汤临港智能科技有限公司 | Sensor calibration method and apparatus, electronic device, and storage medium |
CN113702931A (zh) * | 2021-08-19 | 2021-11-26 | 中汽创智科技有限公司 | Extrinsic parameter calibration method and apparatus for a vehicle-mounted radar, and storage medium |
CN113702931B (zh) * | 2021-08-19 | 2024-05-24 | 中汽创智科技有限公司 | Extrinsic parameter calibration method and apparatus for a vehicle-mounted radar, and storage medium |
CN113744348A (zh) * | 2021-08-31 | 2021-12-03 | 南京慧尔视智能科技有限公司 | Parameter calibration method and apparatus, and radar-vision fusion detection device |
CN113724303A (zh) * | 2021-09-07 | 2021-11-30 | 广州文远知行科技有限公司 | Point cloud and image matching method and apparatus, electronic device, and storage medium |
CN113724303B (zh) * | 2021-09-07 | 2024-05-10 | 广州文远知行科技有限公司 | Point cloud and image matching method and apparatus, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP2022510924A (ja) | 2022-01-28 |
CN112816949A (zh) | 2021-05-18 |
US20220276360A1 (en) | 2022-09-01 |
CN112816949B (zh) | 2024-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021098439A1 (zh) | Sensor calibration method and apparatus, storage medium, calibration system, and program product | |
WO2021098448A1 (zh) | Sensor calibration method and apparatus, storage medium, calibration system, and program product | |
US11461930B2 (en) | Camera calibration plate, camera calibration method and device, and image acquisition system | |
WO2021098608A1 (zh) | Sensor calibration method, apparatus, system, vehicle, device, and storage medium | |
CN110689581B (zh) | Structured light module calibration method, electronic device, and computer-readable storage medium | |
CN107167788B (zh) | Method and system for obtaining lidar calibration parameters and performing lidar calibration | |
KR101666959B1 (ko) | Image processing apparatus with an automatic correction function for images acquired from a camera, and method therefor | |
CN111815716A (zh) | Parameter calibration method and related apparatus | |
CN111383279A (zh) | Extrinsic parameter calibration method and apparatus, and electronic device | |
CN113034612B (zh) | Calibration apparatus and method, and depth camera | |
CN112308927B (zh) | Fusion apparatus for a panoramic camera and a lidar, and calibration method therefor | |
CN110136207B (zh) | Fisheye camera calibration system, method and apparatus, electronic device, and storage medium | |
CN113111513B (zh) | Sensor configuration scheme determination method and apparatus, computer device, and storage medium | |
CN111105465B (zh) | Camera calibration method, apparatus, system, electronic device, and storage medium | |
EP3967969B1 (en) | Fisheye camera calibration system, method and apparatus, electronic device, and storage medium | |
CN115359130A (zh) | Joint calibration method and apparatus for a radar and a camera, electronic device, and storage medium | |
CN111145264A (zh) | Multi-sensor calibration method, apparatus, and computing device | |
CN113077523B (zh) | Calibration method and apparatus, computer device, and storage medium | |
CN109741384B (zh) | Multi-distance detection apparatus and method for a depth camera | |
WO2021068723A1 (zh) | Sensor calibration method and sensor calibration apparatus | |
CN114693807A (zh) | Mapping data reconstruction method and system for power transmission line images and point clouds | |
CN113763457B (zh) | Calibration method and apparatus for drop terrain, electronic device, and storage medium | |
CN113421300A (зh) | Method and apparatus for determining the actual position of an object in a fisheye camera image | |
CN113219441A (zh) | Accuracy verification method and apparatus for calibration angles, device, and storage medium |
Chen et al. | Image registration with uncalibrated cameras in hybrid vision systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2021530296 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20888814 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20888814 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/11/2022) |
|