WO2023143132A1 - Calibration of sensor data - Google Patents

Calibration of sensor data

Info

Publication number
WO2023143132A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
reference system
test equipment
data
moving speed
Prior art date
Application number
PCT/CN2023/072107
Other languages
English (en)
Chinese (zh)
Inventor
王昌龙
庞勃
刘长江
臧波
Original Assignee
北京三快在线科技有限公司
Priority date
Filing date
Publication date
Application filed by 北京三快在线科技有限公司
Publication of WO2023143132A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass; initial alignment, calibration or starting-up of inertial devices
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - Details of systems according to group G01S13/00
    • G01S7/40 - Means for monitoring or calibrating

Definitions

  • the present application relates to the field of computer technology, in particular to calibration of sensor data.
  • obstacle detection and classification methods based on the fusion of image data and radar data are widely used in obstacle detection and classification scenarios because of their relatively accurate detection and classification results.
  • the premise of fusing image data and radar data is the calibration of the radar sensor and the image sensor.
  • This application provides a calibration method for sensor data, including:
  • acquiring sensor data at multiple moments during the movement of the test equipment, the sensor data at least including image data, radar data and inertial data;
  • determining a motion trajectory of the test equipment according to the inertial data and the image data, and determining a moving speed of the test equipment in an image reference system according to the motion trajectory; and
  • determining, according to the radar data, the Doppler velocity during the movement of the test equipment, and registering the Doppler velocity with the moving velocity of the test equipment in the image reference system, so as to calibrate the sensor data.
  • determining the motion trajectory of the test equipment includes:
  • determining, according to the inertial data and the image data, the angular velocity, acceleration and observation position of the image marker of the test equipment corresponding to each of the multiple moments in the inertial reference system; and
  • solving the motion trajectory of the test equipment in the world reference system according to those angular velocities, accelerations and observation positions, including:
  • determining, according to the motion trajectory to be solved of the test equipment in the world reference system, the first parameters to be solved corresponding to the multiple moments, the first parameters being used to solve the motion trajectory;
  • for each of the multiple moments, constructing constraint conditions that the angular velocity corresponding to this moment is the same as the estimated angular velocity, the acceleration is the same as the estimated acceleration, and the observed position of the image marker is the same as the estimated position, and solving the motion trajectory under those constraints.
  • determining the conversion relationship between the world reference frame and the inertial reference frame, and determining the estimated angular velocity, estimated acceleration and estimated position of the image marker, includes:
  • determining, according to the pose of the test equipment in the first parameter to be solved corresponding to this moment, the conversion relationship to be solved between the world reference frame and the inertial reference frame at this moment, wherein the first parameter includes the pose of the test equipment, the acceleration offset, the angular velocity offset and the observation position of the image marker; and
  • determining, according to the angular velocity offset to be solved, the acceleration offset to be solved and the observed position of the image marker to be solved in the first parameter, the estimated acceleration, estimated angular velocity and estimated position of the image marker of the test equipment.
  • determining the moving speed of the test equipment in the image reference system according to the motion trajectory includes:
  • determining the moving speeds of the test equipment at the multiple moments in the image reference system according to the moving speeds in the world reference system, the conversion relationships between the world reference system and the inertial reference system corresponding to the multiple moments, and the preset conversion relationship between the inertial reference system and the image reference system.
  • registering the Doppler velocity and the moving velocity of the test device in an image reference system to calibrate the sensor data includes:
  • the moving velocity to be solved in each direction component is registered with the Doppler velocity in each direction component, and the calibration relationship is solved, so as to calibrate the sensor data.
  • determining, according to the moving speed of the test equipment in the image reference system, the calibration relationship to be solved between the radar reference system and the image reference system, and the preset direction components, the moving speed of the test equipment to be solved on each direction component, includes:
  • determining the moving speed of the test equipment to be solved in the image reference system, the moving speed to be solved including the time difference to be solved;
  • determining, according to the conversion relationship to be solved between the radar reference system and the image reference system, the moving speed of the test equipment to be solved in the radar reference system;
  • determining, according to each preset direction component, the moving speed of the test equipment to be solved on each direction component; and
  • registering the moving velocity to be solved on each direction component with the Doppler velocity on each direction component, and calculating the conversion relationship and the time difference as the calibration relationship.
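  • The registration of per-direction velocities can be sketched as a small least-squares problem. The sketch below recovers only a rotation between the image reference system and the radar reference system from direction components and Doppler speeds; the translation and the clock time difference are omitted, orthogonality of the rotation is not enforced, and all names are illustrative rather than taken from the patent:

```python
import numpy as np

# For a static scene, the Doppler speed on direction u_i satisfies
# d_i = -u_i . (R v_i), which is linear in the 9 entries of the unknown
# rotation R: u^T R v = vec(R) . kron(u, v).  Solving by least squares is a
# simplified sketch of the registration step described above.
def solve_rotation(dirs, v_img, doppler):
    A = np.stack([np.kron(u, v) for u, v in zip(dirs, v_img)])
    R_flat, *_ = np.linalg.lstsq(A, -np.asarray(doppler), rcond=None)
    return R_flat.reshape(3, 3)

# Synthetic check: recover a known rotation from noiseless measurements.
rng = np.random.default_rng(0)
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dirs = rng.normal(size=(30, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
v_img = rng.normal(size=(30, 3))
doppler = [-u @ (R_true @ v) for u, v in zip(dirs, v_img)]
R_est = solve_rotation(dirs, v_img, doppler)
```

A full solver would additionally estimate the time difference and constrain R to be a proper rotation.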
  • the method also includes:
  • the application provides a calibration device for sensor data, including:
  • An acquisition module configured to acquire sensor data at multiple moments during the movement of the test equipment, the sensor data at least including image data, radar data and inertial data;
  • a trajectory determination module configured to determine the motion trajectory of the test device according to the inertial data and the image data, and determine the moving speed of the test device in the image reference system according to the motion trajectory;
  • a calibration module configured to determine the Doppler velocity during the moving process of the test equipment according to the radar data, and register the Doppler velocity with the moving velocity of the test equipment in the image reference system, to calibrate the sensor data.
  • the trajectory determination module is configured to determine, according to the inertial data and the image data, the angular velocity, acceleration and observation position of the image marker of the test equipment corresponding to the multiple moments in the inertial reference system, and to solve the motion trajectory of the test equipment in the world reference system according to the angular velocities, accelerations and observation positions of the image markers corresponding to the multiple moments.
  • the trajectory determination module is configured to determine, according to the motion trajectory to be solved of the test equipment in the world reference system, the first parameters to be solved corresponding to the multiple moments, the first parameters being used to solve the motion trajectory; for each of the multiple moments, to determine, according to the first parameter to be solved corresponding to the moment, the conversion relationship between the world reference frame and the inertial reference frame, as well as the estimated angular velocity, estimated acceleration and estimated position of the image marker of the test equipment in the inertial reference system; and to construct constraint conditions that the angular velocity corresponding to this moment is the same as the estimated angular velocity, the acceleration is the same as the estimated acceleration, and the observed position of the image marker is the same as the estimated position, and solve the motion trajectory.
  • the trajectory determination module is configured to determine, according to the pose of the test equipment in the first parameter to be solved corresponding to the moment, the conversion relationship to be solved between the world reference frame and the inertial reference frame at this moment.
  • the trajectory determination module is configured to determine, according to the motion trajectory, the moving speeds of the test equipment corresponding to the multiple moments in the world reference system; to determine, according to the poses of the test equipment at the multiple moments, the conversion relationships between the world reference system and the inertial reference system corresponding to the multiple moments; and to determine the moving speeds of the test equipment at the multiple moments in the image reference system according to the moving speeds of the test equipment at the multiple moments in the world reference system, the conversion relationships between the world reference system and the inertial reference system corresponding to the multiple moments, and the preset conversion relationship between the inertial reference system and the image reference system.
  • the calibration module is configured to, for each of the multiple moments: determine the Doppler velocity on each direction component at that moment according to the acquired radar data and the preset direction components; determine, according to the moving speed of the test equipment in the image reference system at that moment, the calibration relationship to be solved between the radar reference system and the image reference system, and the preset direction components, the moving speed of the test equipment to be solved on each direction component; and register the moving velocity to be solved on each direction component with the Doppler velocity on each direction component, solving the calibration relationship to calibrate the sensor data.
  • the calibration module is configured to determine the moving speed of the test equipment to be solved in the image reference system at this moment according to the time difference to be solved between the internal clocks of the radar sensor and the image sensor set on the test equipment, the moving speed to be solved including the time difference to be solved; to determine, according to the moving speed to be solved of the test equipment in the image reference system at this moment and the conversion relationship to be solved between the radar reference system and the image reference system, the moving speed of the test equipment to be solved in the radar reference system; to determine, according to the preset direction components, the moving speed of the test equipment to be solved on each direction component; and to register the moving velocity to be solved with the Doppler velocity on each direction component, calculating the conversion relationship and the time difference as the calibration relationship.
  • the calibration module is further configured to determine, according to the determined calibration relationship and the collected sensor data, the pose difference of the test equipment between the image reference system and the radar reference system corresponding to each moment, and to judge whether the pose difference is greater than a preset error threshold: if the pose difference is greater than the error threshold, it is determined that the sensor data needs to be calibrated and the sensor data is stored; if the pose difference is not greater than the error threshold, it is determined that the sensor data does not need to be calibrated.
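  • This verification step amounts to a simple threshold test per moment. A minimal illustrative sketch (the function name and data layout are ours, not the patent's):

```python
# Flag the moments whose pose difference between the image reference system
# and the radar reference system exceeds the preset error threshold; the
# sensor data for those moments would be stored for re-calibration.
def moments_needing_calibration(pose_diffs, error_threshold):
    return [i for i, diff in enumerate(pose_diffs) if diff > error_threshold]

flagged = moments_needing_calibration([0.01, 0.20, 0.03, 0.50],
                                      error_threshold=0.1)
```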
  • the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and operable on the processor.
  • the processor implements the above sensor data calibration method when executing the program.
  • the present application provides a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the above sensor data calibration method is realized.
  • the present application provides a computer program product, the computer program product includes a computer program or an instruction, and the computer program or instruction is executed by a processor, so that the computer implements the above sensor data calibration method.
  • the image data, radar data and inertial data at multiple moments during the movement of the test equipment are acquired, and the motion trajectory of the test equipment is determined according to the inertial data and the image data.
  • the moving speed of the test equipment in the image reference system is then determined; the Doppler velocity during the movement of the test equipment is determined according to the radar data, and the Doppler velocity is registered with the moving speed of the test equipment in the image reference system to calibrate the sensor data.
  • this scheme can also be applied when the acquisition ranges of the radar sensor and the image sensor do not overlap, and does not require a calibration object moving within an overlapping field of view; the calibration process is more convenient, and the efficiency of calibrating the sensor data is improved.
  • FIG. 1 is a schematic flow chart of the sensor data calibration method provided by the present application.
  • FIG. 2 is a schematic diagram of a sensor calibration scenario provided by the present application.
  • FIG. 3 is a schematic diagram of the sensor data calibration device provided by the present application.
  • FIG. 4 is a schematic diagram of an electronic device corresponding to FIG. 1 provided in the present application.
  • the commonly used sensor data calibration method is based on the overlapping areas of the acquisition ranges of image sensors and radar sensors.
  • the radar sensor and the image sensor can be controlled to be stationary, and the calibration object can be controlled to move. Then, according to the radar data collected during the movement of the calibration object, the first position of the calibration object at each time is determined, and according to the collected image data, the second position of the calibration object at each time is determined. After that, for each moment, according to the constraint condition that the first position and the second position are the same, the calibration parameters of the radar sensor and the image sensor are determined.
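  • The related-art constraint that the first position (radar frame) and the second position (image frame) of the calibration object coincide at each moment can be solved, for example, with the standard Kabsch/Procrustes algorithm; the patent does not name a particular solver, so the sketch below is only one common choice with illustrative names:

```python
import numpy as np

def rigid_transform(P, Q):
    """Kabsch algorithm: find R, t minimizing ||R @ P + t - Q|| for matched
    3xN point sets.  Here P would hold the calibration-object positions in
    the radar frame and Q the matching positions in the image frame."""
    cP, cQ = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cP) @ (Q - cQ).T
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic check: recover a known rigid transform from noiseless points.
rng = np.random.default_rng(1)
P = rng.normal(size=(3, 12))
angle = 0.7
R_true = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(angle), -np.sin(angle)],
                   [0.0, np.sin(angle),  np.cos(angle)]])
t_true = np.array([[0.5], [-1.0], [2.0]])
R_est, t_est = rigid_transform(P, R_true @ P + t_true)
```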
  • the related technology thus relies on an overlapping area between the acquisition ranges of the image sensor and the radar sensor, and on a moving calibration object within it, before the calibration parameters of the sensors can be determined, so the calibration efficiency of the related technology is poor.
  • Fig. 1 is a schematic flow chart of the calibration method of sensor data provided by the present application, including the following steps:
  • S100 Acquire sensor data at multiple moments during the movement of the testing device, where the sensor data at least includes image data, radar data and inertial data.
  • a moving calibration object is set in the overlapping area to calibrate the image sensor and the radar sensor.
  • This application provides a new sensor data calibration method: there is no need for the acquisition ranges of the image sensor and the radar sensor to overlap. Instead, the radar sensor and the image sensor are set on the test equipment, the test equipment is moved, and the image data, radar data and inertial data corresponding to multiple moments are determined.
  • the calibration relationship between the sensor data is then determined based on the sensor data respectively corresponding to the multiple moments.
  • sensor data at multiple moments during the moving process of the test equipment can be obtained, wherein the sensor data includes: image data, radar data and inertial data.
  • the unmanned equipment can acquire sensor data according to a preset frequency, and the sensor data is required for calibrating the relationship of the sensor data
  • the sensor data includes at least: image data, inertial data and radar data.
  • the test equipment can also send the collected sensor data to the server, and the server performs subsequent steps to determine the calibration relationship between the radar sensor and the image sensor.
  • the calibration process of the sensor data performed by the test equipment is taken as an example for description in the following.
  • the testing device can acquire image data, inertial data and radar data collected by itself.
  • the test equipment can be unmanned equipment, manned equipment or handheld equipment; the test equipment can be controlled to move, or held and moved, and sensor data is collected during its movement.
  • the sensor data calibration method is applied to a scenario in which calibration objects are placed on the ground, the test equipment is controlled to move, the image sensor collects the positions of the calibration objects, the inertial sensor determines the inertial data at multiple moments, and the radar sensor determines the Doppler velocity at multiple moments, as shown in FIG. 2.
  • FIG. 2 is a schematic diagram of a sensor calibration scenario provided by the present application.
  • the white cube is the test equipment.
  • the three gray cubes set on the white cube are the radar sensor, the inertial sensor and the image sensor, respectively; the image marker is fixed on the ground.
  • the image sensor can collect image data containing markers
  • the radar sensor can collect the Doppler velocity on each direction component through the Doppler effect of reflections from the wall.
  • the inertial sensor can collect inertial data corresponding to multiple moments.
  • the image markers can be two-dimensional codes, checkerboards, etc.
  • the test equipment, radar sensors, image sensors, and inertial sensors are all in simplified form.
  • the specific form and fixing method can be set according to needs, and are not limited in this application.
  • S102 Determine a motion track of the test device according to the inertial data and the image data, and determine a moving speed of the test device in an image reference system according to the motion track.
  • the motion trajectory of the test equipment in the image reference system and its motion trajectory in the radar reference system are actually the same trajectory; therefore, the calibration relationship between the image reference system and the radar reference system can be determined by registering the movement trajectory of the test equipment in the image reference system with its movement trajectory in the radar reference system.
  • the test device can determine its own motion trajectory according to the acquired inertial data and image data.
  • the acquired inertial data and image data correspond respectively to multiple moments; the purpose is to determine the trajectory of the test equipment in the world reference system based on the inertial data and image data corresponding to the multiple moments.
  • the motion trajectory corresponds to the inertial data and image data at multiple moments during the movement of the test equipment; that is to say, the determined motion trajectory is not the entire trajectory of the movement of the test equipment, but the continuous trajectory determined by the sensor data corresponding to the multiple moments acquired during the movement.
  • the Doppler velocity of the test equipment can be determined according to the Doppler effect of the acquired radar data.
  • since registering velocities involves a small amount of calculation and low difficulty, the test equipment can determine the Doppler velocity during its own movement, as well as its own moving velocity in the image reference system, and register the Doppler velocity with the moving velocity to obtain a more accurate calibration relationship.
  • the test device can determine the moving speed of the test device in the image reference system according to the determined motion track.
  • the test equipment can determine, according to the determined motion trajectory, the displacement between each pair of adjacent moments among the multiple moments, and then determine the moving speeds of the test equipment corresponding to the multiple moments according to those displacements.
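  • This displacement-to-speed step can be sketched as a finite difference; the patent does not fix a particular differencing scheme, so the following is only an illustrative choice:

```python
import numpy as np

# Recover per-interval moving speeds from the solved trajectory samples:
# displacement between adjacent moments divided by the corresponding time step.
def speeds_from_trajectory(times, positions):
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)
    disp = np.diff(positions, axis=0)     # displacement between adjacent moments
    dt = np.diff(times)[:, None]          # corresponding time steps
    return disp / dt                      # velocity on each interval

# Uniform straight-line motion yields a constant speed on every interval.
v = speeds_from_trajectory([0.0, 0.1, 0.2, 0.3],
                           [[0.0, 0, 0], [0.1, 0, 0], [0.2, 0, 0], [0.3, 0, 0]])
```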
  • the test device can determine the angular velocity, acceleration and observation position of the image marker corresponding to the test device at multiple moments in the inertial reference system according to the acquired inertial data and image data.
  • the testing equipment can solve the motion trajectory of the testing equipment in the world reference system according to the angular velocity, acceleration and observed position of the image marker corresponding to multiple moments.
  • the test device can determine its own motion trajectory in the world reference system.
  • the test device can determine the motion trajectory of the test device itself based on the poses corresponding to the multiple times.
  • the test equipment can determine, based on the poses corresponding to itself at the multiple moments and the motion trajectory to be solved in the world reference system, the first parameters to be solved corresponding to the multiple moments for determining the motion trajectory.
  • the test equipment can determine, according to the first parameter to be solved corresponding to the moment, the conversion relationship between the world reference frame and the inertial reference frame, and determine the estimated angular velocity, estimated acceleration and estimated position of the image marker of the test equipment in the inertial reference frame.
  • the test equipment can establish constraint conditions based on the angular velocity corresponding to the moment being the same as the estimated angular velocity, the acceleration being the same as the estimated acceleration, and the observed position of the image marker being the same as the estimated position, and solve the motion trajectory.
  • to construct these constraints, the test equipment needs to determine the estimated angular velocity, estimated acceleration and estimated position of the image marker corresponding to each moment.
  • the test equipment can determine the conversion relationship to be solved between the world reference frame and the inertial reference frame at the moment according to the position and attitude of the test equipment in the first parameter to be solved corresponding to the moment. Then, based on the conversion relationship to be solved, the angular velocity offset to be solved in the first parameter, the acceleration offset to be solved, the observation position of the image marker to be solved, etc., it can respectively determine the estimated acceleration, estimated angular velocity and estimated position of the image marker of the test equipment in the inertial reference system. The position and attitude of the test equipment may be referred to as the pose of the test equipment.
  • x_q is the attitude of the test equipment in the world reference system at that moment.
  • x_p is the position of the test equipment in the world reference system at this moment.
  • b_a is the acceleration offset corresponding to this moment.
  • b_w is the angular velocity offset corresponding to this moment.
  • l is the observation position of the image marker at this moment, where the observation position of the image marker can be characterized by the positions of the image feature points in the image marker, for example, the position of the central point of the image marker, the positions of the edge points of the image marker, etc.
  • the image marker can be of various types such as checkerboard, two-dimensional code, etc.
  • the shape of the marker can be various shapes such as triangle, rectangle, circle, polygon, etc., and the shape and type of the image marker can be set according to needs.
  • T represents the transpose of the matrix.
  • the estimated acceleration can be determined as $\hat{a}_k = R^{wb}_k(\ddot{p}^{\,w}_k - g^w) + b_a$, where the hat denotes an estimated quantity, $k$ is the $k$-th moment, $\hat{a}_k$ is the estimated acceleration corresponding to moment $k$, $R^{wb}_k$ is the rotation matrix from the world reference system to the inertial reference system at moment $k$, $\ddot{p}^{\,w}_k$ is the second derivative of the translation from the world reference system to the inertial reference system at moment $k$, that is, the acceleration of the test equipment in the world reference system at moment $k$, $g^w$ is the gravitational acceleration at the current moment (its downward direction accounts for the negative sign), and $b_a$ is the acceleration offset.
  • the rotation matrix from the world reference system to the inertial reference system and the translation matrix from the world reference system to the inertial reference system are determined based on the conversion relationship between the world reference system and the inertial reference system.
  • the estimated angular velocity can be determined as $\hat{\omega}_k = \big(R^{wb}_k \dot{R}^{wb\,T}_k\big)^{\vee} + b_w$, where $\hat{\omega}_k$ is the estimated angular velocity corresponding to moment $k$ and $\dot{R}^{wb}_k$ is the first derivative of the rotation matrix from the world reference system to the inertial reference system at moment $k$. By the derivation formula of the rotation matrix derivative, $\dot{R}^{wb}_k = -[\hat{\omega}_k - b_w]_{\times} R^{wb}_k$, and through the characteristics of the anti-symmetric matrix, with $(\cdot)^{\vee}$ mapping an anti-symmetric matrix back to its vector, the above determination formula of the estimated angular velocity is obtained.
  • the test equipment can also determine the estimated position of the image marker as $\hat{p}_j = w\big[\pi\big(R^{wc}_j l + t^{wc}_j\big)\big]$, where $j$ is a moment at which the image sensor collects an image, $\hat{p}_j$ represents the estimated pixel position in the image collected at the $j$-th moment, $t^{wc}_j$ is the translation matrix between the world reference system and the image reference system at this moment, $R^{wc}_j$ is the rotation matrix between the world reference system and the image reference system at this moment, $l$ is the observation position of the image marker in the world reference system at this moment, $\pi(\cdot)$ normalizes the contents in brackets, and $w[\cdot]$ transforms the observed position of the image marker in the image reference system into the pixel reference system, determining the position of the pixel at which the image marker is located. $R^{wc}_j$ and $t^{wc}_j$ can be determined according to the conversion relationship between the world reference frame and the inertial reference frame at moment $j$ and the predetermined conversion relationship between the inertial reference frame and the image reference frame.
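  • These estimated quantities can be sketched numerically as follows. The rotation conventions, the sign of gravity, and the use of a pinhole intrinsic matrix K for the pixel mapping w[.] are our assumptions for illustration, not details fixed by the patent:

```python
import numpy as np

def skew(w):
    # anti-symmetric matrix such that skew(w) @ x == np.cross(w, x)
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def vee(S):
    # inverse of skew(): recover the vector from an anti-symmetric matrix
    return np.array([S[2, 1], S[0, 2], S[1, 0]])

def estimated_acceleration(R_wb, accel_world, g_w, b_a):
    # rotate (world acceleration minus gravity) into the inertial frame,
    # then add the acceleration offset b_a
    return R_wb @ (accel_world - g_w) + b_a

def estimated_angular_velocity(R_wb, Rdot_wb, b_w):
    # from Rdot = -skew(omega) @ R it follows that skew(omega) = R @ Rdot.T
    return vee(R_wb @ Rdot_wb.T) + b_w

def estimated_marker_pixel(R_wc, t_wc, l_w, K):
    # project the marker's world position into the camera frame, normalize
    # (the pi() operation), then map into the pixel reference system (w[.])
    p_cam = R_wc @ l_w + t_wc
    xy = p_cam[:2] / p_cam[2]
    return (K @ np.array([xy[0], xy[1], 1.0]))[:2]

# Small usage example with identity orientation.
omega = np.array([0.1, -0.2, 0.3])
est_omega = estimated_angular_velocity(np.eye(3), -skew(omega), np.zeros(3))
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pixel = estimated_marker_pixel(np.eye(3), np.zeros(3),
                               np.array([0.0, 0.0, 2.0]), K)
```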
  • the test equipment can solve the first parameters with the objective of minimizing the differences between the estimated acceleration and the acceleration, between the estimated angular velocity and the angular velocity, and between the observed position and the estimated position of the image marker corresponding to the multiple moments, so as to determine the motion trajectory of the test equipment.
  • the conversion relationship between the inertial reference frame and the image reference frame is predetermined.
  • the rotation matrix and translation matrix between the world reference system and the inertial reference system can be determined according to the poses of the test equipment in the world reference frame at the multiple moments; the manner in which the rotation matrix and the translation matrix are determined from the pose is not limited in this application.
  • the motion trajectory may also be modeled in various ways, such as with Bezier curves; which trajectory construction method is used to solve the motion trajectory from the inertial data and image data corresponding to the multiple moments can be set as required, and is not limited in this application.
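  • As one example of such a modeling choice (illustrative only, not the patent's prescribed method), a Bezier trajectory can be sampled with the De Casteljau scheme:

```python
# Evaluate a Bezier curve at parameter s in [0, 1] by repeated linear
# interpolation of the control points (De Casteljau's algorithm).
def bezier_point(control_points, s):
    pts = [tuple(float(c) for c in p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - s) * a + s * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Midpoint of a quadratic Bezier arc through (0,0), (1,2), (2,0).
mid = bezier_point([(0, 0), (1, 2), (2, 0)], 0.5)
```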
  • S104 According to the radar data, determine the Doppler velocity during the movement of the test equipment, and register the Doppler velocity with the moving velocity of the test equipment in the image reference system, so as to calibrate the sensor data.
  • the test device can solve the conversion relationship between the image reference frame and the radar reference frame based on the Doppler velocity and the moving velocity.
  • the test equipment can determine the Doppler velocity in each direction component.
  • the test device can determine the Doppler velocity of the test device in each direction component according to the acquired radar data and the preset radar direction components for each time point.
  • the direction component is composed of a pitch angle and an azimuth angle
  • the Doppler velocity on the direction component is the Doppler velocity corresponding to the pitch angle and the azimuth angle.
  • the Doppler velocity on the direction component is the velocity component corresponding to a pitch angle of 30° and an azimuth angle of 60°.
  • Doppler velocity is determined by the radar sensor according to the Doppler effect, that is, for each direction component, the Doppler velocity on the direction component can be measured.
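  • A Doppler measurement on one direction component can be sketched as a projection. The exact pitch/azimuth angle convention below is our assumption; the sign follows the description that, for a static scene, the Doppler velocity is opposite and equal to the test equipment's own velocity:

```python
import math

def direction_unit(pitch, azimuth):
    # unit vector for a (pitch, azimuth) direction component
    cp = math.cos(pitch)
    return (cp * math.cos(azimuth), cp * math.sin(azimuth), math.sin(pitch))

def doppler_on_component(v_equipment, pitch, azimuth):
    # measured Doppler speed = -(u . v): projection of the relative velocity
    # of the static scene on the direction component
    u = direction_unit(pitch, azimuth)
    return -sum(ui * vi for ui, vi in zip(u, v_equipment))

# Equipment moving at 2 m/s along x, component looking straight along x.
d = doppler_on_component((2.0, 0.0, 0.0), pitch=0.0, azimuth=0.0)
```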
  • the Doppler velocity is the moving velocity of a stationary object relative to the test equipment; the Doppler velocity and the moving velocity of the test equipment are opposite in direction and equal in magnitude.
  • therefore, once the Doppler velocity is determined, the test equipment can register the Doppler velocity with the moving speed of the test equipment in the image reference system to determine the calibration relationship between the radar reference system and the image reference system.
  • the test equipment can, according to the direction of the moving velocity at each of the multiple moments, take the Doppler velocity on the direction component aligned with the moving velocity of the test equipment, and then register that Doppler velocity with the moving speed to determine the calibration relationship between the radar reference frame and the image reference frame.
  • the test equipment can calibrate the acquired sensor data according to the determined calibration relationship; for example, the radar data in the acquired sensor data is converted into the image reference system according to the calibration relationship, steps such as target recognition are performed, and the motion strategy of the test equipment is then determined according to the recognition results.
  • the calibration relationship can also be used in various scenarios such as obstacle detection and obstacle classification, and the application of the calibration relationship can be set according to needs, which is not limited in this application.
  • the calibrated sensor data may be sensor data collected during this movement of the test equipment, or may be sensor data collected during a subsequent movement of the test equipment. Of course, it can also be sensor data collected by an unmanned device similar in structure to the test device during delivery tasks and the like.
  • the sensor data to be calibrated can be set according to needs, which is not limited in this application.
  • there may be errors in the determined direction of the moving speed of the test equipment in the radar reference system.
  • therefore, the testing equipment can determine its moving speed in each direction component based on the direction components of the Doppler velocity, and then solve the calibration relationship with the goal of minimizing the difference between the moving speed and the Doppler velocity in each direction component.
  • the test device may determine the moving speed of the test device in the image reference system at multiple moments according to the motion trajectory of the test device in the world reference system determined in step S102.
  • the test device can determine the moving speed to be solved in each direction component of the radar reference system, based on the unsolved calibration relationship between the image reference system and the radar reference system and the moving speed of the test device in the image reference system.
  • the moving speed to be solved includes a calibration relationship to be solved.
  • the calibration relationship at least includes a rotation matrix and a translation matrix between the image reference frame and the radar reference frame.
  • the test device can register the moving velocity to be solved in each direction component with the Doppler velocity in each direction component, so as to solve the calibration relationship.
  • the registration may take minimizing the difference between the moving speed to be solved and the Doppler speed in each direction component as the optimization goal, and solve for that goal.
  • the moving speed of the test equipment in the image reference system can be determined as: v_c(t_i) = R_wc(t_i) v_w(t_i)
  • where v_c(t_i) is the moving speed of the test device in the image reference system at time i, and R_wc(t_i) is the rotation matrix between the world reference system and the image reference system at time i.
  • v_w(t_i) is the moving speed of the test equipment in the world reference system at time i, which can be determined as the first derivative of the displacement of the test equipment at this time; R_wc(t_i) can be determined according to the conversion relationship between the world reference system and the inertial reference system at time i and the predetermined conversion relationship between the inertial reference system and the image reference system.
  • v_r(t_i) is the moving speed of the test equipment in the radar reference system at time i, which can be written as v_r(t_i) = R_cr (v_c(t_i) + [w_c(t_i)]_× t_cr), where R_cr and t_cr are the rotation matrix and translation vector between the image reference system and the radar reference system.
  • w_c(t_i) denotes the angular velocity of the test equipment in the image reference system at time i, and [·]_× represents the corresponding antisymmetric matrix.
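The velocity determination above can be sketched as follows, under the assumption that the trajectory is available as sampled positions and that the first derivative is approximated by finite differences; the function and variable names are illustrative:

```python
import numpy as np

def image_frame_velocity(positions_w, times, R_wc_list):
    """positions_w: (N, 3) trajectory of the test equipment in the world
    reference system; times: (N,) timestamps; R_wc_list: list of (3, 3)
    rotation matrices mapping world coordinates to image-frame coordinates
    at each time. Returns (N, 3) velocities in the image reference system."""
    positions_w = np.asarray(positions_w, dtype=float)
    times = np.asarray(times, dtype=float)
    # First derivative of displacement (central differences inside the
    # trajectory, one-sided differences at the endpoints).
    v_w = np.gradient(positions_w, times, axis=0)
    # Rotate each world-frame velocity into the image reference system.
    return np.stack([R @ v for R, v in zip(R_wc_list, v_w)])
```

For uniform straight-line motion the finite differences are exact, so the recovered image-frame velocity equals the true velocity.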
  • the test equipment can determine the moving speed to be solved in each direction component of the radar reference system according to the preset direction components. Then, according to the Doppler velocity determined in step S104, a cost function of the following form can be determined: J = Σ_i Σ_(φ,θ) ( v_r(t_i)·d(φ,θ) − v_doppler(t_i, φ, θ) )², where φ is the pitch angle, θ is the azimuth angle, and d(φ,θ) is the unit vector of the direction component.
  • the conversion relationship between the radar reference system and the image reference system is solved with the goal of minimizing the cost function; the conversion relationship obtained by solving is then used as the calibration relationship between the radar reference system and the image reference system.
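A minimal sketch of the registration step, assuming the paired velocity samples have already been expressed as 3-D vectors in the two reference systems; it solves only the rotation part of the conversion relationship in closed form (Kabsch/SVD), whereas the patent's cost function also covers direction components, translation and, optionally, a clock offset:

```python
import numpy as np

def solve_rotation(v_image, v_radar):
    """Least-squares rotation R minimizing sum ||R v_image_i - v_radar_i||^2
    over paired velocity samples (Kabsch/SVD solution). This is a sketch of
    the registration idea only, not the patent's full cost function."""
    # 3x3 cross-covariance of the paired velocity samples.
    A = np.asarray(v_radar, dtype=float).T @ np.asarray(v_image, dtype=float)
    U, _, Vt = np.linalg.svd(A)
    # Sign correction so the result is a proper rotation (det(R) = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt
```

With noise-free samples related by a true rotation, the estimate recovers that rotation exactly; with noisy samples it returns the least-squares optimum.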
  • since the radar sensor and the image sensor have their own internal clock systems, the time difference between the internal clock system of the radar sensor and that of the image sensor can also be determined when determining the calibration relationship. That is to say, the calibration relationship includes the conversion relationship between the image reference system and the radar reference system and the time difference between the internal clock system of the radar sensor and the internal clock system of the image sensor.
  • the testing device can determine the moving speed of the testing device in the radar reference system according to the moving speed of the testing device in the image reference system at that moment and the conversion relationship to be solved between the radar reference system and the image reference system.
  • the test device can determine the moving speed of the test device to be resolved in each direction component according to the preset direction components.
  • t d is the time difference between the internal clock system of the radar sensor and the internal clock system of the image sensor.
  • the moving speed of the test equipment in the image reference system includes the time difference to be solved. Accordingly, the moving speed of the test equipment in the radar reference system includes the time difference to be solved as well as the rotation matrix and translation matrix to be solved between the image reference system and the radar reference system.
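One way to make the time difference t_d appear in the quantities to be solved, sketched here as an assumption rather than the patent's stated method, is to evaluate the image-frame velocity at timestamps shifted by t_d via interpolation:

```python
import numpy as np

def velocity_at_offset(times, v_image, t_d):
    """Image-frame velocity evaluated at timestamps shifted by the clock
    offset t_d (seconds). Linear interpolation per component is an
    illustrative choice; it makes the residuals depend smoothly on t_d, so
    the offset can be solved jointly with the rotation and translation."""
    times = np.asarray(times, dtype=float)
    v_image = np.asarray(v_image, dtype=float)
    shifted = times + t_d
    # Interpolate each velocity component at the shifted timestamps
    # (np.interp clamps at the ends of the sampled interval).
    return np.stack(
        [np.interp(shifted, times, v_image[:, k]) for k in range(3)], axis=1
    )
```

For a linearly varying velocity, the interpolated value at an interior shifted timestamp equals the true velocity at that shifted time.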
  • image data, radar data and inertial data at multiple moments during the moving process of the test equipment are obtained; the motion track of the test equipment is determined according to the inertial data and the image data, and the moving speed of the test equipment in the image reference system is determined according to the motion track; then the Doppler speed during the moving process of the test equipment is determined according to the radar data, and the Doppler speed is registered with the moving speed of the test equipment in the image reference system, so as to calibrate the sensor data.
  • This solution can be applied even when the acquisition ranges of the radar sensor and the image sensor do not overlap, and does not require a calibration object moving within an overlapping field of view; the calibration process is therefore more convenient, and the efficiency of sensor data calibration is improved.
  • the testing equipment can also determine whether the calibration relationship needs to be re-determined during the moving process.
  • the test device can determine, according to the determined calibration relationship and the collected sensor data, the pose difference of the test device between the image reference system and the radar reference system corresponding to multiple moments, and judge whether the pose difference is greater than a preset error threshold. If the pose difference is greater than the error threshold, the testing device may determine that the sensor data needs to be calibrated, and store the sensor data. If the pose difference is not greater than the error threshold, the testing device may determine that the sensor data does not require calibration.
  • the test equipment can also store the sensor data within a preset time period, and determine the calibration relationship according to the pre-stored sensor data of the preset duration when the pose difference is greater than the error threshold, so as to avoid the situation where, once a large difference appears, sensor data must still be collected for a period of time before calibration can be performed, which would make calibration inefficient.
  • the preset duration can be set as required, which is not limited in this application.
  • the test device can also record the number of times the pose difference is greater than the error threshold, so as to improve the accuracy of the judgment result. When the pose difference is less than the error threshold, the calibration relationship is considered to be still correct; when the number of times the pose difference is greater than the error threshold reaches a preset number threshold, the determined calibration relationship is considered to be no longer reliable.
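The threshold-and-count logic above, together with the rolling buffer of recent sensor data, can be sketched as follows; the reset-on-success behaviour and all names are interpretive assumptions, not the patent's own implementation:

```python
from collections import deque

class RecalibrationMonitor:
    """Counts how often the image/radar pose difference exceeds the error
    threshold; once the count reaches a preset number threshold, the stored
    recent sensor data (a rolling window of preset duration) can be handed
    to the calibration routine. Thresholds and data format are assumptions."""

    def __init__(self, error_threshold, count_threshold, window_seconds):
        self.error_threshold = error_threshold
        self.count_threshold = count_threshold
        self.window_seconds = window_seconds
        self.exceed_count = 0
        self.buffer = deque()  # (timestamp, sensor_frame) pairs

    def add_frame(self, timestamp, sensor_frame):
        """Store a sensor frame and drop frames older than the window."""
        self.buffer.append((timestamp, sensor_frame))
        while self.buffer and timestamp - self.buffer[0][0] > self.window_seconds:
            self.buffer.popleft()

    def report_pose_difference(self, diff):
        """Return True when recalibration should be triggered."""
        if diff > self.error_threshold:
            self.exceed_count += 1
        else:
            # Relationship still considered correct; reset the exceedance
            # count (one possible reading of the passage above).
            self.exceed_count = 0
        return self.exceed_count >= self.count_threshold
```

A single excursion below the threshold resets the count, so only a sustained run of exceedances triggers recalibration.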
  • the test equipment can determine the point cloud data and the projection of the point cloud data in the image reference system according to the acquired sensor data and the calibration relationship, then fuse the projection with the image data in the sensor data to determine a fusion result, and perform obstacle detection on the fusion result to determine the position of the obstacle.
  • the above method of merging image data and point cloud data to determine the position of obstacles is only one of the uses of the calibration relationship.
  • the calibration relationship can also be used for obstacle detection, obstacle classification, etc.
  • the application of the calibration relationship can be set as required, which is not limited in this application.
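For illustration, the projection of point cloud data into the image reference system using the calibration relationship can be sketched with a pinhole camera model; the intrinsic matrix K and all names are assumptions not given in the source:

```python
import numpy as np

def project_points(points_radar, R, t, K):
    """Project 3-D points from the radar reference system into pixel
    coordinates: transform each point with the calibration extrinsics
    (p_cam = R p + t), apply the pinhole intrinsics K, then divide by
    depth. Points behind the camera are flagged in the returned mask."""
    p_cam = (np.asarray(points_radar, dtype=float) @ R.T) + t
    in_front = p_cam[:, 2] > 0          # valid only with positive depth
    uvw = p_cam @ K.T                   # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]       # perspective division
    return uv, in_front
```

The resulting pixel coordinates can then be fused with the image data, for example by sampling image features at the projected locations before obstacle detection.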
  • the present application also provides a corresponding sensor data calibration device, as shown in FIG. 3 .
  • Fig. 3 shows the sensor data calibration apparatus provided by the present application, which includes:
  • the acquiring module 200 is configured to acquire sensor data at multiple moments during the moving process of the testing device, the sensor data at least including image data, radar data and inertial data.
  • the trajectory determination module 202 is configured to determine the motion trajectory of the test device according to the inertial data and the image data, and determine the moving speed of the test device in the image reference system according to the motion trajectory.
  • a calibration module 204 configured to determine the Doppler velocity during the moving process of the test equipment according to the radar data, register the Doppler velocity with the moving velocity of the test equipment in the image reference system, and determine a calibration relationship between the radar reference system and the image reference system, so as to calibrate the sensor data.
  • the trajectory determination module 202 is configured to determine, according to the inertial data and the image data, the angular velocity and acceleration of the test equipment in the inertial reference system and the observed positions of the image markers corresponding to the multiple moments, and to solve the motion track of the test equipment in the world reference system according to the angular velocity, acceleration and observed positions of the image markers corresponding to the multiple moments.
  • the trajectory determination module 202 is configured to determine, according to the motion trajectory to be solved of the test equipment in the world reference system, first parameters to be solved respectively corresponding to the multiple moments, the first parameters being used to solve the motion trajectory; for each of the multiple moments, to determine the conversion relationship between the world reference system and the inertial reference system according to the first parameter to be solved corresponding to that moment, and to determine the estimated angular velocity, estimated acceleration and estimated positions of the image markers of the test device in the inertial reference system; and to construct constraint conditions that the angular velocity corresponding to that moment is the same as the estimated angular velocity, the acceleration is the same as the estimated acceleration, and the observed positions of the image markers are the same as the estimated positions, so as to solve the motion track.
  • the trajectory determination module 202 is configured to determine, according to the pose of the test device in the first parameter to be solved corresponding to this moment, the conversion relationship to be solved between the world reference system and the inertial reference system at this moment.
  • the trajectory determination module 202 is configured to determine, according to the motion trajectory, the moving speeds of the test equipment in the world reference system corresponding to the multiple moments; to determine, according to the poses of the test equipment, the conversion relationships between the world reference system and the inertial reference system corresponding to the multiple moments; and to determine the moving speed of the test device in the image reference system at the multiple moments according to the moving speeds of the test equipment in the world reference system at the multiple moments, the conversion relationships between the world reference system and the inertial reference system corresponding to the multiple moments, and the preset conversion relationship between the inertial reference system and the image reference system.
  • the calibration module 204 is configured to, for each of the multiple moments: determine the Doppler velocity in each direction component at that moment according to the acquired radar data and the preset direction components; determine the moving speed of the test equipment to be solved in each direction component according to the moving speed of the test equipment in the image reference system at that moment, the calibration relationship to be solved between the radar reference system and the image reference system, and the preset direction components; and register the moving speed to be solved in each direction component with the Doppler velocity in each direction component, so as to solve the calibration relationship and calibrate the sensor data.
  • the calibration module 204 is configured to determine, according to the time difference to be solved between the internal clock systems of the radar sensor and the image sensor provided on the test device, the moving speed of the test device to be solved in the image reference system at this moment, the moving speed to be solved including the time difference to be solved; to determine the moving speed of the test equipment to be solved in the radar reference system according to the moving speed to be solved of the test equipment in the image reference system at this moment and the conversion relationship to be solved between the radar reference system and the image reference system; to determine, according to the preset direction components, the moving speed of the test equipment to be solved in each direction component; and to register the moving speed to be solved with the Doppler velocity in each direction component, and solve the conversion relationship and the time difference as the calibration relationship.
  • the calibration module 204 is further configured to determine, according to the determined calibration relationship and the collected sensor data, the pose difference of the test equipment between the image reference system and the radar reference system corresponding to each moment, and to judge whether the pose difference is greater than a preset error threshold; if the pose difference is greater than the error threshold, to determine that the sensor data needs to be calibrated and store the sensor data; and if the pose difference is not greater than the error threshold, to determine that the sensor data does not need to be calibrated.
  • the present application also provides a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium stores a computer program, and the computer program can be used to execute the sensor data calibration method provided in FIG. 1 above.
  • the present application also provides a computer program product, the computer program product includes a computer program or an instruction, and the computer program or instruction is executed by a processor, so that the computer implements the sensor data calibration method provided in FIG. 1 above.
  • the present application also provides a schematic structural diagram of the electronic device shown in FIG. 4 .
  • the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and of course may also include hardware required by other services.
  • the processor reads the corresponding computer program from the non-volatile memory into the memory and then runs it, so as to realize the sensor data calibration method described in FIG. 1 above.
  • this application does not exclude other implementations, such as logic devices or a combination of software and hardware; that is to say, the execution subject of the above processing flow is not limited to each logic unit, and may also be hardware or a logic device.
  • the improvement of a technology can be clearly distinguished as an improvement in hardware (for example, improvements in circuit structures such as diodes, transistors, and switches) or improvements in software (improvement in method flow).
  • improvements in many current method flows can be regarded as the direct improvement of the hardware circuit structure.
  • Designers almost always get the corresponding hardware circuit structure by programming the improved method flow into the hardware circuit. Therefore, it cannot be said that the improvement of a method flow cannot be realized by hardware physical modules.
  • a programmable logic device (Programmable Logic Device, PLD), such as a Field Programmable Gate Array (FPGA), is programmed by a designer using a hardware description language (HDL). There is not only one kind of HDL, but many kinds, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL; VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It should also be clear to those skilled in the art that a hardware circuit realizing the logic method flow can easily be obtained merely by slightly logically programming the method flow in the above hardware description languages and programming it into an integrated circuit.
  • the controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor and a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include but are not limited to the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicone Labs C8051F320. A memory controller may also be implemented as part of the control logic of the memory.
  • in addition to implementing the controller purely in computer-readable program code, it is entirely possible, by logically programming the method steps, to make the controller realize the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included in it for realizing various functions can also be regarded as structures within the hardware component; or even, the devices for realizing various functions can be regarded as both software modules implementing the method and structures within the hardware component.
  • a typical implementing device is a computer.
  • a computer may be, for example, a personal computer, laptop computer, cellular phone, camera phone, smart phone, personal digital assistant, media player, navigation device, email device, game console, tablet computer, wearable device, or a combination of any of these devices.
  • the embodiments of the present application may be provided as methods, systems, or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to operate in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction device, and the instruction device realizes the functions specified in one or more procedures of the flowchart and/or one or more blocks of the block diagram.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • Memory may include non-permanent storage in computer readable media, in the form of random access memory (RAM) and/or nonvolatile memory such as read-only memory (ROM) or flash RAM. Memory is an example of computer readable media.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology.
  • Information may be computer readable instructions, data structures, modules of a program, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, Magnetic tape cartridge, tape magnetic disk storage or other magnetic storage device or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
  • computer-readable media excludes transitory computer-readable media, such as modulated data signals and carrier waves.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including storage devices.
  • each embodiment in the present application is described in a progressive manner, the same and similar parts of each embodiment can be referred to each other, and each embodiment focuses on the differences from other embodiments.
  • for the device embodiments, the description is relatively simple; for relevant parts, refer to the description of the method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

During a sensor data calibration process, image data, radar data and inertial data at multiple moments during a movement process of a test device are acquired (S100); a motion trajectory of the test device is determined according to the inertial data and the image data, and the moving speed of the test device in an image reference system is determined according to the motion trajectory (S102); and the Doppler velocity during the movement process of the test device is determined according to the radar data, and the Doppler velocity and the moving speed of the test device in the image reference system are registered so as to calibrate sensor data (S104).
PCT/CN2023/072107 2022-01-29 2023-01-13 Étalonnage de données de capteur WO2023143132A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210112012.8 2022-01-29
CN202210112012.8A CN116558545A (zh) 2022-01-29 2022-01-29 一种传感器数据的标定方法及装置

Publications (1)

Publication Number Publication Date
WO2023143132A1 true WO2023143132A1 (fr) 2023-08-03

Family

ID=87470484

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/072107 WO2023143132A1 (fr) 2022-01-29 2023-01-13 Étalonnage de données de capteur

Country Status (2)

Country Link
CN (1) CN116558545A (fr)
WO (1) WO2023143132A1 (fr)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782496A (zh) * 2019-09-06 2020-02-11 深圳市道通智能航空技术有限公司 标定方法、装置、航拍设备和存储介质
US20210003693A1 (en) * 2018-04-12 2021-01-07 FLIR Belgium BVBA Adaptive doppler radar systems and methods
CN112815939A (zh) * 2021-01-04 2021-05-18 清华大学深圳国际研究生院 移动机器人的位姿估计方法及计算机可读存储介质
CN113091771A (zh) * 2021-04-13 2021-07-09 清华大学 一种激光雷达-相机-惯导联合标定方法及系统
CN113643321A (zh) * 2021-07-30 2021-11-12 北京三快在线科技有限公司 一种无人驾驶设备的传感器数据采集方法及装置
CN113655453A (zh) * 2021-08-27 2021-11-16 阿波罗智能技术(北京)有限公司 用于传感器标定的数据处理方法、装置及自动驾驶车辆
CN113933818A (zh) * 2021-11-11 2022-01-14 阿波罗智能技术(北京)有限公司 激光雷达外参的标定的方法、设备、存储介质及程序产品


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117589203A (zh) * 2024-01-18 2024-02-23 陕西太合智能钻探有限公司 一种陀螺仪标定方法
CN117589203B (zh) * 2024-01-18 2024-05-10 陕西太合智能钻探有限公司 一种陀螺仪标定方法

Also Published As

Publication number Publication date
CN116558545A (zh) 2023-08-08

Similar Documents

Publication Publication Date Title
US20210190497A1 (en) Simultaneous location and mapping (slam) using dual event cameras
WO2021213432A1 (fr) Fusion de données
US20200124421A1 (en) Method and apparatus for estimating position
WO2019119289A1 (fr) Procédé et dispositif de positionnement, appareil électronique et produit-programme d'ordinateur
JP2018523865A (ja) 情報処理方法、デバイス、および端末
KR20190041315A (ko) 관성 기반 항법 장치 및 상대사전적분에 따른 관성 기반 항법 방법
JP2022510418A (ja) 時間同期処理方法、電子機器及び記憶媒体
US11181379B2 (en) System and method for enhancing non-inertial tracking system with inertial constraints
WO2017008454A1 (fr) Procédé de positionnement de robot
CN108871311B (zh) 位姿确定方法和装置
WO2021169420A1 (fr) Positionnement visuel sur la base d'une pluralité de trames d'image
US20180075609A1 (en) Method of Estimating Relative Motion Using a Visual-Inertial Sensor
CN107917707B (zh) 一种任意姿态下行人方向的确定方法、装置及电子设备
CN108549376A (zh) 一种基于信标的导航定位方法及系统
CN114136315B (zh) 一种基于单目视觉辅助惯性组合导航方法及系统
WO2023143132A1 (fr) Étalonnage de données de capteur
Tomažič et al. Fusion of visual odometry and inertial navigation system on a smartphone
CN113933818A (zh) 激光雷达外参的标定的方法、设备、存储介质及程序产品
KR20180076441A (ko) 적응적 관심영역 및 탐색창을 이용한 객체 검출 방법 및 장치
Deng et al. Global optical flow-based estimation of velocity for multicopters using monocular vision in GPS-denied environments
CN113763549A (zh) 融合激光雷达和imu的同时定位建图方法、装置和存储介质
JP7351892B2 (ja) 障害物検出方法、電子機器、路側機器、及びクラウド制御プラットフォーム
WO2024001649A1 (fr) Procédé de positionnement de robot, appareil et support de stockage lisible par ordinateur
JP2021135286A (ja) 座標変換方法、装置及びデータ処理装置
CN109945864B (zh) 室内行车定位融合方法、装置、存储介质及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23746014

Country of ref document: EP

Kind code of ref document: A1