WO2021233309A1 - Method, device, electronic device and detection system for detecting changes in external parameters - Google Patents

Method, device, electronic device and detection system for detecting changes in external parameters (一种外参变化检测方法、装置、电子设备及检测系统)

Info

Publication number
WO2021233309A1
WO2021233309A1 (PCT/CN2021/094433)
Authority
WO
WIPO (PCT)
Prior art keywords
sensors
pose
data
difference
sensor
Prior art date
Application number
PCT/CN2021/094433
Other languages
English (en)
French (fr)
Inventor
万富华
吕吉鑫
孙杰
Original Assignee
杭州海康威视数字技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司
Priority to EP21807516.6A (EP4155673A4)
Publication of WO2021233309A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163: Determination of attitude
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652: Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C22/00: Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
    • G01C22/02: Measuring distance traversed on the ground by conversion into electric waveforms and subsequent integration, e.g. using tachometer generator
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Initial alignment, calibration or starting-up of inertial devices
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865: Combination of radar systems with lidar systems
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867: Combination of radar systems with cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/45: Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47: Determining position with a supplementary measurement, the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02: Details of systems according to group G01S13/00
    • G01S7/40: Means for monitoring or calibrating
    • G01S7/4052: Means for monitoring or calibrating by simulation of echoes
    • G01S7/4082: Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating
    • G01S7/4972: Alignment of sensor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • This application relates to the field of detection technology, and in particular to a method, device, electronic equipment and detection system for detecting changes in external parameters.
  • With the continuous development of detection technology, the detection functions of detection systems are becoming more and more abundant. A detection system often needs to detect multiple types of data, so it is equipped with multiple sensors, for example any two or more of: lidar, millimeter-wave radar, monocular/binocular camera, depth camera, event camera, wheel speed sensor, steering wheel angle sensor, IMU (Inertial Measurement Unit), and GNSS (Global Navigation Satellite System).
  • The external parameter refers to the relative position and attitude relationship between two sensors. External parameter calibration of multiple sensors calibrates the relative position and attitude relationship between every two sensors, and the accuracy of the external parameter calibration directly affects the detection accuracy of the detection system.
  • During use, the position and attitude of a sensor may change, causing the actual position and attitude relationship between two sensors to become inconsistent with the calibrated external parameters; that is, the calibrated external parameters are no longer accurate, which degrades the detection accuracy of the detection system. How to detect changes in the sensors' external parameters has therefore become an urgent technical problem to be solved.
  • the purpose of the embodiments of the present application is to provide a method, device, electronic equipment, and detection system for detecting changes in external parameters, so as to realize the detection of changes in external parameters of sensors.
  • the specific technical solutions are as follows:
  • In a first aspect, an embodiment of the present application provides a method for detecting changes in external parameters, the method including: acquiring the pose data of any two of a plurality of sensors at different moments; determining the difference in the pose changes of the two sensors according to the pose data; and if the difference in pose changes reaches a preset degree of difference, determining that the external parameters of the two sensors have changed.
  • In a second aspect, an embodiment of the present application provides an external parameter change detection device, which includes:
  • an acquisition module, configured to acquire the pose data of any two of a plurality of sensors at different moments;
  • an analysis module, configured to determine the difference in the pose changes of the two sensors according to the pose data;
  • a detection module, configured to determine that the external parameters of the two sensors have changed if the difference in pose changes reaches a preset degree of difference.
  • In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores machine executable instructions that can be executed by the processor, and the machine executable instructions are loaded and executed by the processor to implement the method provided in the first aspect of the embodiments of the present application.
  • In a fourth aspect, an embodiment of the present application provides a machine-readable storage medium storing machine-executable instructions which, when loaded and executed by a processor, implement the method provided in the first aspect.
  • In a fifth aspect, an embodiment of the present application provides a detection system, including an electronic device and multiple sensors;
  • the electronic device is configured to obtain the pose data of any two sensors at different moments; determine the difference in the pose changes of the two sensors according to the pose data; and if the difference in pose changes reaches a preset degree of difference, determine that the external parameters of the two sensors have changed.
  • With the external parameter change detection method, device, electronic device, and detection system provided by the embodiments of the present application, the pose data of any two of a plurality of sensors at different moments are obtained, the difference in the pose changes of the two sensors is determined according to the pose data, and if the difference reaches a preset degree of difference, it is determined that the external parameters of the two sensors have changed. Constrained by the external parameters, if one of two sensors with calibrated external parameters changes its pose between moments, the pose of the other sensor should change simultaneously, and the pose changes of the two sensors should be consistent: after the poses of the two sensors are converted into the same coordinate system via the calibrated external parameters, there should be no difference between their pose changes in that coordinate system. Therefore, if the difference in pose changes reaches the preset degree of difference, the external parameters must have changed.
  • FIG. 1 is a schematic flowchart of an external parameter change detection method according to an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a process for acquiring pose data according to an embodiment of the present application;
  • FIG. 3 is a schematic flowchart of a method for detecting changes in external parameters according to another embodiment of the present application;
  • FIG. 4 is a schematic flowchart of a method for detecting changes in external parameters according to another embodiment of the present application;
  • FIG. 5 is a schematic flowchart of a method for detecting changes in external parameters according to another embodiment of the present application;
  • FIG. 6 is a schematic flowchart of a method for detecting changes in external parameters according to another embodiment of the present application;
  • FIG. 7 is a schematic diagram of the execution framework of the external parameter change detection method according to an embodiment of the present application;
  • FIG. 8 is a schematic diagram of the data flow of the external parameter change detection method according to an embodiment of the present application;
  • FIG. 9 is a schematic flowchart of the pose interpolation and coordinate transformation sub-step according to an embodiment of the present application;
  • FIG. 10 is a schematic diagram of the flow of pose interpolation and coordinate transformation according to an embodiment of the present application;
  • FIG. 11 is a schematic flowchart of the external parameter change detection step according to an embodiment of the present application;
  • FIG. 12 is a schematic structural diagram of an external parameter change detection device according to an embodiment of the present application;
  • FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
  • FIG. 14 is a schematic structural diagram of a detection system according to an embodiment of the present application.
  • In order to detect changes in sensor external parameters, the embodiments of the present application provide an external parameter change detection method, device, electronic device, and detection system. The external parameter change detection method provided in the embodiments of the present application is introduced first.
  • The execution subject of the external parameter change detection method provided in the embodiments of the present application may be an external parameter calibration device with an external parameter calibration function, a detection device with a parameter change detection function, or any of the sensors in the detection system; this is not specifically limited here, and these are collectively referred to as the electronic device below. The external parameter change detection method provided in the embodiments of the present application may be implemented by at least one of software, a hardware circuit, and a logic circuit provided in the electronic device.
  • As shown in FIG. 1, the method for detecting changes in external parameters may include the following steps.
  • S101: Acquire the pose data of any two of a plurality of sensors at different moments.
  • S102: Determine the difference in the pose changes of the two sensors according to the pose data.
  • S103: If the difference in pose changes reaches a preset degree of difference, determine that the external parameters of the two sensors have changed.
  • With this method, the pose data of any two of the multiple sensors at different moments are obtained, the difference in the pose changes of the two sensors is determined according to the pose data, and if the difference reaches the preset degree of difference, it is determined that the external parameters of the two sensors have changed. Constrained by the external parameters, if one of two sensors with calibrated external parameters changes its pose between moments, the pose of the other sensor should change simultaneously, and the pose changes of the two sensors should be consistent: after the poses of the two sensors are converted into the same coordinate system via the calibrated external parameters, there should be no difference between their pose changes in that coordinate system.
  • the method for detecting changes in external parameters is applicable to a multi-sensor detection system, and detects changes in external parameters of every two sensors in the plurality of sensors.
  • Multiple sensors refers to two or more sensors whose pose data can be obtained directly or indirectly, such as lidar, millimeter-wave radar, monocular/binocular camera, depth camera, event camera, wheel speed sensor, steering wheel angle sensor, IMU, and GNSS.
  • Pose data is a general term for position data and attitude data: it may include only position data, only attitude data, or both. Pose data may be relative (that is, the pose of one sensor relative to another sensor) or absolute (that is, the pose obtained by a sensor at a certain moment); it may be output directly by the sensor or computed from the sensor's output by a specific algorithm; and it may be three-dimensional or two-dimensional.
  • In practice, the pose data of each sensor at different moments can be obtained, and whenever external parameter change detection is to be performed for a pair of sensors, the pose data of those two sensors at different moments can be extracted from it.
  • In an embodiment, S101 may specifically be: acquiring the sensor data collected by any two of the multiple sensors at different moments; and converting each piece of sensor data into pose data using the data conversion strategy corresponding to the type of that sensor data.
  • Each sensor has a data collection function and collects the corresponding type of sensor data; for example, a lidar collects point cloud data and a monocular/binocular camera collects image data. When a sensor collects sensor data, it can carry the type of the sensor data in the data information, so that when each piece of sensor data is obtained, its type can be read directly from it. Different sensor data types correspond to different data conversion strategies, and the corresponding data conversion strategy can be used to convert the sensor data into pose data.
  • FIG. 2 is a schematic diagram of a process for acquiring pose data according to an embodiment of the present application: obtain the sensor data collected by each sensor; select the data conversion strategy according to the type of the sensor data; and use the corresponding data conversion strategy to convert the sensor data into pose data.
  • If the sensor collects acceleration, linear velocity, and satellite positioning data, an integrated navigation method can be used to convert them into pose data; if the sensor collects image data, a visual odometry method can be used to convert the image data into pose data; if the sensor collects three-dimensional point cloud data, a point cloud odometry method can be used to convert the point cloud data into pose data; if the sensor collects angular velocity and linear velocity, an attitude-and-heading reckoning method can be used to convert them into pose data.
  • The following takes the monocular visual odometry method suitable for a monocular camera and the point cloud odometry method suitable for a lidar as examples to explain how pose data is obtained; the ways of obtaining pose data for other types of sensor data are not repeated here.
  • One way to implement the monocular visual odometry method is as follows. The method is divided into two stages: initialization and normal operation.
  • The initialization process is: extract image feature points, for example SIFT (Scale-Invariant Feature Transform) features, from two input frames of image data; match the SIFT feature descriptors of the two frames to obtain candidate feature point matching pairs; compute the E matrix (essential matrix) from these matching pairs; and decompose the E matrix, selecting the one valid combination of rotation matrix R and translation vector t, whose combination is the rotation-translation matrix T. The depth of each feature point (that is, the vertical distance between the corresponding three-dimensional point in space and the camera) can then be computed by triangulation, giving the three-dimensional coordinates of each feature point. At this point, the initialization process ends.
  • The normal operation process is: extract feature points from the current frame of image data and match them against those feature points of the previous frame whose depths have been computed by triangulation, obtaining the matching relationship between the feature points of the current frame and the three-dimensional points of the previous frame. From this matching relationship, the rotation-translation matrix of the current frame relative to the previous frame, i.e. the pose relationship between the two frames, can be solved. Using this rotation-translation matrix and the pose data of the previous frame, the pose data of the current frame can be calculated. Subsequent frames are processed in the same way, which realizes the process of obtaining pose data from image data.
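  • To make the initialization stage concrete, the following is a minimal Python sketch (not part of the patent) using standard OpenCV APIs; the intrinsic matrix K and the brute-force matching without a ratio test are simplifying assumptions:

```python
import cv2
import numpy as np

def monocular_vo_init(img1, img2, K):
    """Sketch of the initialization stage: match SIFT features between two
    frames, recover R/t from the essential matrix, triangulate 3D points."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Match SIFT descriptors between the two frames.
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix from the matching pairs, then the valid (R, t).
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Triangulate to obtain the 3D coordinates of the matched points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    pts3d = (pts4d[:3] / pts4d[3]).T
    return R, t, pts3d
```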
  • One way to implement the point cloud odometry method is as follows. The input is a series of point clouds in units of frames. The point cloud of the current frame is registered against the point cloud of the previous frame using the ICP (Iterative Closest Point) algorithm, so that the two frames of point clouds match together as well as possible; this registration yields the pose relationship between the previous and current frames. Using this pose relationship and the pose data of the previous frame's point cloud, the pose data of the current frame's point cloud can be calculated.
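  • As an illustration (not part of the patent; the use of the Open3D library and the 0.5 m correspondence distance are assumptions), one odometry step can be sketched as follows:

```python
import numpy as np
import open3d as o3d

def point_cloud_odometry_step(prev_cloud, curr_cloud, prev_pose):
    """Register the current frame against the previous frame with ICP,
    then chain the resulting relative pose onto the previous pose."""
    result = o3d.pipelines.registration.registration_icp(
        curr_cloud,                # source
        prev_cloud,                # target
        0.5,                       # max correspondence distance (assumed)
        np.eye(4),                 # initial guess
        o3d.pipelines.registration.TransformationEstimationPointToPoint(),
    )
    # result.transformation maps the current frame into the previous
    # frame, i.e. the pose relationship between the two frames.
    return prev_pose @ result.transformation
```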
  • Each sensor outputs sensor data or pose data at a certain frequency, so for each sensor the electronic device can obtain a series of pose data at different moments. For a given sensor, the pose data acquired at different moments will change, and the sensor's pose changes can be obtained by analyzing the pose data acquired at different moments. In this way the pose changes of each of the two sensors can be obtained and the difference between them compared.
  • The difference in pose changes is a metric value. It can be a specific amount of difference, such as a 50% difference in pose change or a 30-degree difference in angle change between the two sensors; it can be a statistic over the moments at which a difference exists, for example that the pose changes of the two sensors differ at 50% of the moments; or it can be another metric value, which is not repeated here.
  • Constrained by the external parameters, if one of two sensors with calibrated external parameters changes its pose between moments, the pose of the other sensor should change simultaneously, and the pose changes of the two sensors should be consistent. This consistency is reflected in that, after the poses of the two sensors are converted into the same coordinate system via the calibrated external parameters, there is no difference between their pose changes in that coordinate system. Conversely, if the pose changes differ significantly (reaching a preset degree of difference), the pose changes of the two sensors are no longer consistent, that is, the external parameters of the two sensors have changed.
  • A pose change may be a series of changes of the pose data over different moments, or the change of the pose data between two adjacent moments. Correspondingly, the difference in pose changes may be the difference in the overall pose changes of the two sensors over a period of time, or the difference in the pose changes of the two sensors at a certain moment.
  • To guarantee the accuracy of detection, an overall analysis over a time period is generally performed. Specifically, the differences in the pose changes of the two sensors at the various moments within a time period may be determined, and if the differences at most moments are large, the external parameters are considered to have changed. Alternatively, the respective pose change trends of the two sensors over a period of time may be determined, the two trends compared, and the degree of difference between them computed; if the degree of difference is large (greater than the preset degree), the external parameters are considered to have changed. There are many ways to determine whether the external parameters of two sensors have changed from the difference in pose changes, which are not enumerated here.
  • In an embodiment, S102 can be specifically implemented through the following steps (an illustrative sketch of this scoring computation is given after the threshold discussion below):
  • The first step is to calculate the relative poses of the two sensors at different moments based on the pose data, where a relative pose includes the relative attitude angle of the pose data at the current moment with respect to the pose data at the previous moment.
  • The second step is to calculate the relative pose deviations of the two sensors at different moments according to the relative poses, where a relative pose deviation includes an attitude deviation and a translation deviation.
  • The third step is to calculate the attitude deviation weights and translation deviation weights of the two sensors at different moments according to the relative attitude angles.
  • The fourth step is to weight and sum the attitude deviations and translation deviations of the two sensors at different moments according to these weights, obtaining the external parameter change scoring result of the two sensors, which characterizes the difference in the pose changes of the two sensors.
  • Correspondingly, S103 may specifically be: if the external parameter change scoring result is greater than a preset threshold, determining that the external parameters of the two sensors have changed.
  • A relative pose is the amount of change of the pose data at the current moment relative to the pose data at the previous moment, and this amount of change can include the relative attitude angle. From the pose data at different moments, the relative poses of the two sensors at different moments can be calculated. For example, the relative pose of the lidar at moment 2 is the amount of change of its pose data at moment 2 relative to its pose data at moment 1, which can be calculated using formula (1):

    $\Delta T_{L12} = T_{L1}^{-1}\, T_{L2}$    (1)

  • where $\Delta T_{L12}$ is the relative pose of lidar L at moment 2, $T_{L1}$ is the pose data of lidar L at moment 1, $T_{L1}^{-1}$ is the inverse of $T_{L1}$, $T_{L2}$ is the pose data of lidar L at moment 2, and moment 1 and moment 2 are adjacent moments.
  • According to the relative poses, the relative pose deviations of the two sensors at different moments can be calculated. For example, the relative pose deviation of the lidar and the monocular camera at moment 2 can be calculated using formula (2):

    $\Delta T_{CL12} = \Delta T_{C12}^{-1}\, \Delta T_{L12}$    (2)

  • where $\Delta T_{CL12}$ is the relative pose deviation of lidar L and monocular camera C at moment 2, $\Delta T_{C12}$ is the relative pose of monocular camera C at moment 2, and $\Delta T_{L12}$ is the relative pose of lidar L at moment 2, both expressed in the same coordinate system. The relative pose deviation $\Delta T_{CL12}$ includes the attitude deviation $\Delta R_{CL}$ and the translation deviation $\Delta t_{CL}$; the attitude deviation angle corresponding to $\Delta R_{CL}$ can be recorded as $\Delta\theta_{CL}$.
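  • As an illustrative sketch (not part of the patent; the composition in formula (2) follows the reconstruction above), formulas (1) and (2) can be computed with 4x4 homogeneous pose matrices as follows:

```python
import numpy as np

def relative_pose(T_prev, T_curr):
    """Formula (1): change of the pose at the current moment relative to
    the pose at the previous moment."""
    return np.linalg.inv(T_prev) @ T_curr

def relative_pose_deviation(dT_cam, dT_lidar):
    """Formula (2), as reconstructed above: deviation between the two
    sensors' relative poses, both already in the same coordinate system.
    Returns the attitude deviation angle and the translation deviation norm."""
    dT = np.linalg.inv(dT_cam) @ dT_lidar
    dR, dt = dT[:3, :3], dT[:3, 3]
    # Rotation angle of dR, recovered from the matrix trace.
    cos_angle = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)
    return np.arccos(cos_angle), np.linalg.norm(dt)
```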
  • If the output data types or dimensions of the two sensors are inconsistent, for example a single-axis IMU outputs only one-dimensional attitude information while a lidar outputs three-dimensional position and attitude information, only the deviation of the commonly output information is calculated, that is, only the one-dimensional attitude deviation is calculated, and the translation deviation is set to 0.
  • A relative pose includes the relative attitude angle θ ∈ (-π, π), and the relative attitude angle determines the weights of the attitude deviation and the translation deviation: according to the relative attitude angles, the attitude deviation weights and translation deviation weights of the two sensors at different moments can be calculated. Specifically, the sine of half the relative attitude angle can be taken as the attitude deviation weight, and the sum of the absolute value of the sine of half the relative attitude angle and the cosine of half the relative attitude angle can be taken as the translation deviation weight.
  • According to these weights, the attitude deviations and translation deviations of the two sensors at different moments are weighted and summed to obtain the external parameter change scoring result. The scoring result characterizes the difference in the pose changes of the two sensors: the larger the scoring result, the greater the difference in the pose changes of the two sensors.
  • For example, formula (3) can be used to calculate the external parameter change scoring result:

    $\text{score} = \sum_{i=1}^{n} \left( w_{R,i}\, \Delta\theta_i + w_{t,i}\, \lVert \Delta t_i \rVert \right)$    (3)

  • where score is the external parameter change scoring result, n is the number of relative poses, $\Delta\theta_i$ is the attitude deviation angle corresponding to the i-th relative pose, $\Delta t_i$ is the translation deviation corresponding to the i-th relative pose, and $w_{R,i}$ and $w_{t,i}$ are the attitude deviation weight and the translation deviation weight computed from the relative attitude angle $\theta_i$ as described above.
  • After the external parameter change scoring result is obtained, it is compared with a preset threshold; if the scoring result is greater than the preset threshold, the external parameters of the two sensors are considered to have changed. The preset threshold can be set according to actual usage requirements: if the tolerance for external parameter changes is high, the preset threshold can be set larger, and otherwise smaller.
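  • The following is a minimal sketch (not part of the patent) of the weighting and scoring steps above, assuming the half-angle weight expressions and the reconstructed formula (3); the sign conventions of the weights are assumptions:

```python
import numpy as np

def extrinsic_change_score(rel_angles, att_devs, trans_devs, threshold):
    """Weighted sum of attitude and translation deviations (formula (3)).

    rel_angles: relative attitude angles theta_i in (-pi, pi)
    att_devs:   attitude deviation angles dtheta_i
    trans_devs: translation deviation norms ||dt_i||
    """
    score = 0.0
    for theta, dtheta, dt in zip(rel_angles, att_devs, trans_devs):
        w_r = np.sin(abs(theta) / 2.0)  # attitude weight (sign assumed)
        w_t = abs(np.sin(theta / 2.0)) + np.cos(theta / 2.0)  # translation weight
        score += w_r * dtheta + w_t * dt
    # External parameters are considered changed when the score exceeds
    # the preset threshold.
    return score, score > threshold
```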
  • In an embodiment, the method may further include: outputting the external parameter change scoring result as the detection confidence.
  • After the external parameter change scoring result is calculated, it can also be output as the detection confidence, which characterizes the credibility of the detection result: the higher the scoring result, the greater the difference between the pose changes of the two sensors, and the higher the confidence that the external parameters have changed. The detection confidence can be used to intuitively judge whether the conclusion that the external parameters have changed is credible.
  • In the same way, the above processing can be performed for every pairwise combination of the sensors, obtaining, for all sensor pairs, the detection result of whether the external parameters have changed together with the detection confidence.
  • An embodiment of the present application also provides a method for detecting changes in external parameters, as shown in FIG. 3, which may include the following steps.
  • S301: Acquire the pose data of any two of a plurality of sensors at different moments and the time stamps of the pose data.
  • S302: Taking the time stamps of the pose data of the first sensor of the two sensors as the target, perform an interpolation operation on the pose data of the second sensor using a preset interpolation algorithm, where the output frequency of the first sensor is lower than the output frequency of the second sensor.
  • S303: Determine the difference in the pose changes of the two sensors according to the pose data of the first sensor and the interpolated pose data of the second sensor.
  • The time stamp of a piece of pose data refers to the moment at which the sensor acquired it. In general, the pose data output by different sensors do not correspond in time, that is, the time stamps of the various pose data differ; therefore, the pose data need to be processed so that the pose data of the two sensors coincide in time. In the embodiment of the present application, interpolation can be used to achieve this.
  • The interpolation process is: obtain the time stamps of the pose data of the two sensors and select one sensor as the interpolation target (usually the sensor with the lower output frequency, referred to as the first sensor in the embodiments of this application); then, taking the time stamps of the pose data of the first sensor as the target, interpolate the pose data of the second sensor, that is, at each moment corresponding to a time stamp of the first sensor's pose data, perform an interpolation operation using the second sensor's pose data near that moment to obtain the pose data of the second sensor at that moment.
  • For the position data, interpolation methods such as linear interpolation, bicubic interpolation, and spline interpolation may be used; for the attitude data, interpolation methods such as spherical linear interpolation may be used.
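  • As an illustration (not part of the patent), the following sketch resamples a higher-rate pose stream at a lower-rate sensor's time stamps, using spherical linear interpolation for attitude and linear interpolation for position; SciPy's Rotation and Slerp utilities are an assumed implementation choice:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_poses(times_src, rotmats_src, positions_src, times_target):
    """Resample the second sensor's (rotation, position) samples at the
    first sensor's time stamps: slerp for attitude, lerp for position.
    times_target must lie within the range of times_src."""
    slerp = Slerp(times_src, Rotation.from_matrix(rotmats_src))
    rot_interp = slerp(times_target).as_matrix()  # spherical interpolation
    pos_interp = np.stack(
        [np.interp(times_target, times_src, positions_src[:, k]) for k in range(3)],
        axis=1,
    )  # linear interpolation, per coordinate
    return rot_interp, pos_interp
```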
  • An embodiment of the present application also provides a method for detecting changes in external parameters, as shown in FIG. 4, which may include the following steps.
  • S401: Acquire the pose data of any two of a plurality of sensors at different moments and the external parameters pre-calibrated for the two sensors.
  • S402: Convert the pose data into the same coordinate system according to the external parameters.
  • S403: Determine the difference in the pose changes of the two sensors according to the pose data after the coordinate system conversion.
  • The pose data obtained from different sensors are often located in different coordinate systems, so before the difference in pose changes is calculated, the pose data of the two sensors need to be converted into the same coordinate system. The external parameters represent the relative position and attitude relationship between the two sensors and can be in two-dimensional or three-dimensional form; through the external parameters, the relationship between the coordinate systems of the two sensors can be determined. Therefore, according to the external parameters, the pose data of the two sensors can be converted into the same coordinate system, where the data are more convenient to compute with and a more accurate difference in pose changes can be obtained.
  • An embodiment of the present application also provides a method for detecting changes in external parameters, as shown in FIG. 5, which may include the following steps.
  • S501: Acquire the pose data of any two of a plurality of sensors at different moments and the external parameters pre-calibrated for the two sensors.
  • S502: Convert the pose data into the same coordinate system according to the external parameters.
  • S503: Calculate the relative poses of the two sensors at different moments according to the pose data after the coordinate system conversion, where a relative pose includes the translation amount of the pose data at the current moment relative to the pose data at the previous moment.
  • S504: For each sensor, sum the translation amounts of that sensor at different moments to obtain the accumulated translation result of that sensor.
  • S505: Calculate the ratio between the accumulated translation results of the two sensors to obtain a scale factor.
  • S506: Unify the change scales of the pose data of the two sensors according to the scale factor, obtaining pose data with a unified change scale.
  • S507: Determine the difference in the pose changes of the two sensors according to the pose data with the unified change scale.
  • For some sensors (for example, a monocular camera), the acquired pose data lack scale information. Scale information refers to the ratio between the sensor's pose scale and the real physical scale. Therefore, before the difference in pose changes is calculated, the change scales of such pose data need to be unified; the unification of the change scale is generally performed after the coordinate systems have been unified.
  • The specific process is: calculate the amount of change between adjacent items in a series of pose data (that is, the relative pose; the calculation is shown in formula (1)), from which a series of relative poses of the two sensors is obtained, where a relative pose includes the relative attitude angle and the translation amount. The translation amounts in the relative poses of each of the two sensors are summed separately, and the ratio of the two summed translation amounts is computed, giving the scale factor S.
  • For example, formula (4) can be used to calculate the scale factor:

    $S = \dfrac{\sum_{i=1}^{n} \lVert \Delta t_{Ci} \rVert}{\sum_{i=1}^{n} \lVert \Delta t_{Li} \rVert}$    (4)

  • where S is the scale factor, n is the number of relative poses, $\Delta t_{Ci}$ is the translation amount of the i-th relative pose of the monocular camera C, and $\Delta t_{Li}$ is the translation amount of the i-th relative pose of the lidar L.
  • Using the scale factor, the change scales of the pose data of the monocular camera and the lidar can be unified. At this point, a series of pose data of the two sensors in the same coordinate system and with a unified change scale has been obtained, from which an accurate difference in pose changes can be computed.
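  • As an illustration (not part of the patent), the scale factor of the reconstructed formula (4) and the subsequent rescaling can be computed as follows; the direction of the rescaling (dividing the camera positions by S) is an assumed convention:

```python
import numpy as np

def scale_factor(cam_translations, lidar_translations):
    """Formula (4): ratio of the accumulated translation norms of the
    two sensors; each argument is a sequence of 3-vectors."""
    num = sum(np.linalg.norm(t) for t in cam_translations)
    den = sum(np.linalg.norm(t) for t in lidar_translations)
    return num / den

def unify_scale(cam_positions, S):
    """Rescale the scale-free camera positions so that both sensors'
    pose changes share the same physical scale (convention assumed)."""
    return np.asarray(cam_positions) / S
```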
  • An embodiment of the present application also provides a method for detecting changes in external parameters, as shown in FIG. 6, which may include the following steps.
  • S601: Acquire the pose data of any two of a plurality of sensors at different moments.
  • S602: Determine the difference in the pose changes of the two sensors according to the pose data.
  • S603: If the difference in pose changes reaches a preset degree of difference, determine that the external parameters of the two sensors have changed.
  • S604: Re-calibrate the external parameters of the two sensors according to the newly acquired pose data of the two sensors.
  • After it is determined that the external parameters of the two sensors have changed, in order to improve the accuracy of the external parameter calibration and the detection accuracy of the detection system, the external parameters of the two sensors can be re-calibrated based on the latest acquired pose data of the two sensors. The re-calibrated external parameters match the actual current poses of the sensors, which improves the detection accuracy of the detection system and realizes automatic detection of external parameter changes together with automatic calibration of the external parameters. In an embodiment, the electronic device may instead notify a technician of the detection result, and the technician re-calibrates the corresponding sensor external parameters manually.
  • As shown in FIG. 7, the external parameter change detection method provided by the embodiments of the present application can be divided into two major steps: pose acquisition and alignment, and external parameter change detection. The pose acquisition and alignment step includes two sub-steps, pose acquisition and pose interpolation and coordinate transformation; the external parameter change detection step includes the sub-step of external parameter change scoring.
  • As shown in FIG. 8, in the pose acquisition sub-step, the input data are sensor data (such as point cloud data and image data) and the output data are pose data. The pose data of all sensors are then combined in pairs; the essence of a combination is to extract the pose data of two sensors. One of the combinations (the pose data of sensors A and D) is taken as an example below; the processing of the remaining combinations is similar.
  • In the pose interpolation and coordinate transformation sub-step, the input data are the pose data of sensors A and D, and the output data are the interpolated pose data of sensors A and D, aligned in time and located in the same coordinate system. In the external parameter change scoring sub-step, the input is the time-aligned pose data of sensors A and D in the same coordinate system, and the output is the external parameter change scoring result of sensors A and D.
  • The function of the pose acquisition sub-step is to process the various sensor data to obtain the corresponding pose data. Assuming that sensors A and D are a monocular camera and a lidar respectively, the specific data conversion process is as described in the example given for the embodiment shown in FIG. 1 and is not repeated here.
  • The role of the pose interpolation and coordinate transformation sub-step is to interpolate the pose data of the different sensors so that the poses of the different sensors are aligned in time and converted into the same coordinate system with a uniform scale. The flow of this sub-step is shown in FIG. 9.
  • For example, the typical image output frequency of the monocular camera C is 30 Hz, so the output frequency of the monocular camera's pose data obtained through the pose acquisition sub-step is also 30 Hz; the typical point cloud frame output frequency of the lidar L is 10 Hz, so the output frequency of the lidar's pose data obtained through the pose acquisition sub-step is also 10 Hz.
  • The external parameter obtained by pre-calibrating the two sensors is $T_{CL}$ (the basis transformation from the monocular camera to the lidar). Since the output frequency of the lidar's pose data is lower, the pose data of the lidar L is selected as the interpolation target, and the pose data of the monocular camera C is interpolated according to the time stamps of the lidar's pose data.
  • The specific interpolation method is as follows: let $t_C$ denote the time stamps of the monocular camera's pose data and $t_L$ the time stamps of the lidar's pose data, and consider three adjacent moments $t_{C1} < t_{L2} < t_{C3}$ whose corresponding pose data are $T_{C1}$, $T_{L2}$, and $T_{C3}$, where $T_C$ denotes pose data of the monocular camera and $T_{L2}$ denotes pose data of the lidar. The purpose of interpolation is to use $T_{C1}$ and $T_{C3}$ to obtain the pose data $T_{C2}$ of the monocular camera at time $t_{L2}$. The spherical linear interpolation method is used to interpolate the attitude data, and the linear interpolation method is used to interpolate the position data.
  • After interpolation, the pre-calibrated external parameter $T_{CL}$ is used to transform the lidar's pose data into the monocular camera coordinate system. The relative pose between the lidar's pose data 1 and pose data 2, transformed into the monocular camera coordinate system, can then be calculated as a conjugation by the extrinsic transform:

    $\Delta T'_{L12} = T_{CL}\, \Delta T_{L12}\, T_{CL}^{-1}$

  • The relative poses of the monocular camera are calculated with the similar formula (1). A series of relative poses of the two sensors is thus obtained.
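  • As an illustrative sketch (not part of the patent; the direction convention of $T_{CL}$ is an assumption), the conjugation above is a single matrix expression:

```python
import numpy as np

def to_camera_frame(dT_lidar, T_CL):
    """Express a lidar relative pose in the monocular camera coordinate
    system by conjugating with the extrinsic transform T_CL (4x4)."""
    return T_CL @ dT_lidar @ np.linalg.inv(T_CL)
```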
  • The scale factor is then calculated using formula (4). Using this scale factor, the pose change scales of the monocular camera and the lidar can be unified. At this point, a series of pose data of the two sensors in the same coordinate system with a unified change scale has been obtained.
  • In FIG. 10, curve 1 represents a series of pose data of the monocular camera in the monocular camera coordinate system; curve 3 represents a series of pose data of the lidar in the lidar coordinate system; curve 2 represents the result of interpolating the monocular camera's pose data according to the time stamps of the lidar's pose data; curve 4 represents the result of transforming the lidar's pose data into the monocular camera coordinate system using the pre-calibrated external parameters; and curve 5 represents the effect of unifying the change scales of the two pose data series after the scale factor is calculated, with the relative poses indicated.
  • The function of the external parameter change detection step is to calculate the external parameter change scoring result based on the two series of relative poses that have been transformed into the unified coordinate system. The specific flow is shown in FIG. 11: first, the deviation between each pair of relative poses is calculated along with the corresponding weights, and then the external parameter change score is calculated from the series of relative pose deviations and weights.
  • ⁇ T CL12 is the relative pose deviation between lidar and monocular camera.
  • the relative pose deviation ⁇ T CL12 includes the attitude deviation ⁇ R CL and the translation deviation ⁇ t CL , the attitude deviation angle corresponding to ⁇ R CL can be recorded as
  • the output data types or dimensions of the two sensors are inconsistent, such as the single-axis IMU only outputs one-dimensional attitude information, and the lidar can output three-dimensional position and attitude information
  • the deviation of the common output information is calculated, that is, only one-dimensional attitude deviation is calculated
  • the amount of translation deviation is set to 0.
  • the relative posture includes the relative posture angle ⁇ (- ⁇ , ⁇ ), the relative posture angle determines the weight of the posture deviation and translation deviation.
  • the postures of the two sensors at different moments can be calculated according to the relative posture angles.
  • the deviation weight and each translation deviation weight can be specifically taken as the half sine of the relative attitude angle as the attitude deviation weight, and the absolute value of the half sine of the relative attitude angle and more than half of the relative attitude angle can be taken The sum of the chord values is used as the translation deviation weight.
  • the weighted summation of the attitude deviations and translational deviations of the two sensors at different times can be obtained. Refer to the change score result.
  • the scoring result of the external parameter change characterizes the difference in the pose changes of the two sensors, that is, the larger the value of the scoring result of the external parameter change, the greater the difference in the pose changes of the two sensors.
  • Formula (9) can be used to calculate the external parameter change scoring result; it has the same form as formula (3) above:

    $\text{score} = \sum_{i=1}^{n} \left( w_{R,i}\, \Delta\theta_i + w_{t,i}\, \lVert \Delta t_i \rVert \right)$    (9)

  • where score is the external parameter change scoring result, n is the number of relative poses, $\Delta\theta_i$ is the attitude deviation angle corresponding to the i-th relative pose, $\Delta t_i$ is the translation deviation corresponding to the i-th relative pose, and $w_{R,i}$ and $w_{t,i}$ are the attitude deviation weight and the translation deviation weight computed from the relative attitude angle $\theta_i$.
  • After the external parameter change scoring result is obtained, it is compared with a preset threshold; if the scoring result is greater than the preset threshold, the external parameters of the two sensors are considered to have changed. The preset threshold is set according to actual usage requirements: it can be set larger when the tolerance for external parameter changes is high, and smaller otherwise.
  • An embodiment of the present application further provides an external parameter change detection device. As shown in FIG. 12, the device may include:
  • an acquisition module 1210, configured to acquire the pose data of any two of a plurality of sensors at different moments;
  • an analysis module 1220, configured to determine the difference in the pose changes of the two sensors according to the pose data;
  • a detection module 1230, configured to determine that the external parameters of the two sensors have changed if the difference in pose changes reaches a preset degree of difference.
  • In an embodiment, the acquisition module 1210 can be specifically configured to: acquire the sensor data collected by any two of the multiple sensors at different moments; and convert each piece of sensor data into pose data using the data conversion strategy corresponding to its type.
  • In an embodiment, the device may further include an interpolation module, configured to obtain the time stamps of the pose data and, taking the time stamps of the pose data of the first sensor of the two sensors as the target, perform an interpolation operation on the pose data of the second sensor of the two sensors using a preset interpolation algorithm, where the output frequency of the first sensor is lower than the output frequency of the second sensor. Correspondingly, the analysis module 1220 can be specifically configured to determine the difference in the pose changes of the two sensors according to the pose data of the first sensor and the interpolated pose data of the second sensor.
  • In an embodiment, the device may further include a coordinate conversion module, configured to obtain the external parameters pre-calibrated for the two sensors and convert the pose data into the same coordinate system according to the external parameters. Correspondingly, the analysis module 1220 can be specifically configured to determine the difference in the pose changes of the two sensors according to the pose data after the coordinate system conversion.
  • In an embodiment, the device may further include a scale unification module, configured to: calculate the relative poses of the two sensors at different moments according to the pose data after the coordinate system conversion, where a relative pose includes the translation amount of the pose data at the current moment relative to the pose data at the previous moment; for each sensor, sum the translation amounts of that sensor at different moments to obtain the accumulated translation result of that sensor; calculate the ratio between the accumulated results of the two sensors to obtain a scale factor; and unify the change scales of the pose data of the two sensors according to the scale factor to obtain pose data with a unified change scale. Correspondingly, the analysis module 1220 can be specifically configured to determine the difference in the pose changes of the two sensors according to the pose data with the unified change scale.
  • In an embodiment, the analysis module 1220 can be specifically configured to: calculate the relative poses of the two sensors at different moments according to the pose data, where a relative pose includes the relative attitude angle of the pose data at the current moment with respect to the pose data at the previous moment; calculate the relative pose deviations of the two sensors at different moments according to the relative poses, where a relative pose deviation includes an attitude deviation and a translation deviation; calculate the attitude deviation weights and translation deviation weights of the two sensors at different moments according to the relative attitude angles; and weight and sum the attitude deviations and translation deviations of the two sensors at different moments according to these weights, obtaining the external parameter change scoring result of the two sensors, where the scoring result characterizes the difference in the pose changes of the two sensors. Correspondingly, the detection module 1230 can be specifically configured to determine that the external parameters of the two sensors have changed if the external parameter change scoring result is greater than a preset threshold.
  • In an embodiment, the device may further include an output module, configured to output the external parameter change scoring result as the detection confidence.
  • In an embodiment, the device may further include a calibration module, configured to re-calibrate the external parameters of the two sensors according to the latest acquired pose data of the two sensors.
  • With the device provided by the embodiment of the present application, the pose data of any two of a plurality of sensors at different moments are obtained, the difference in the pose changes of the two sensors is determined according to the pose data, and if the difference reaches a preset degree of difference, it is determined that the external parameters of the two sensors have changed. Constrained by the external parameters, if one of two sensors with calibrated external parameters changes its pose between moments, the pose of the other sensor should change simultaneously, and the pose changes of the two sensors should be consistent: after the poses of the two sensors are converted into the same coordinate system via the calibrated external parameters, there should be no difference between their pose changes in that coordinate system. Therefore, if the difference in pose changes reaches the preset degree of difference, the external parameters must have changed.
  • An embodiment of the present application provides an electronic device, as shown in FIG. 13, including a processor 1301 and a memory 1302, where the memory 1302 stores machine executable instructions that can be executed by the processor 1301, and the machine executable instructions are loaded and executed by the processor 1301 to implement the external parameter change detection method provided in the embodiments of the present application.
  • The foregoing memory may include RAM (Random Access Memory) and may also include NVM (Non-Volatile Memory), for example at least one disk storage. In an embodiment, the memory may also be at least one storage device located far away from the foregoing processor.
  • the above-mentioned processor may be a general-purpose processor, including CPU (Central Processing Unit), NP (Network Processor), etc.; it may also be DSP (Digital Signal Processor), ASIC (Application Processor), etc. Specific Integrated Circuit, FPGA (Field-Programmable Gate Array, Field Programmable Gate Array) or other programmable logic devices, discrete gates or transistor logic devices, discrete hardware components.
  • CPU Central Processing Unit
  • NP Network Processor
  • DSP Digital Signal Processor
  • ASIC Application Processor
  • FPGA Field-Programmable Gate Array, Field Programmable Gate Array
  • other programmable logic devices discrete gates or transistor logic devices, discrete hardware components.
  • the memory 1302 and the processor 1301 may perform data transmission through a wired connection or a wireless connection, and the electronic device and other devices may communicate through a wired communication interface or a wireless communication interface. What is shown in FIG. 13 is only an example of data transmission through the bus, and is not intended to limit the specific connection mode.
  • the processor reads the machine-executable instructions stored in the memory and, by loading and executing them, achieves the following: pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system; however, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
  • an embodiment of the present application provides a machine-readable storage medium.
  • the machine-readable storage medium stores machine-executable instructions.
  • when the machine-executable instructions are loaded and executed by a processor, the extrinsic parameter change detection method provided in the embodiments of the present application is implemented.
  • the machine-readable storage medium stores machine-executable instructions that, at runtime, execute the extrinsic parameter change detection method provided in the embodiments of the present application. Therefore, the following can be achieved: from the pose data at different times, the pose change difference of the two sensors is determined; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system.
  • an embodiment of the present application provides a detection system, as shown in FIG. 14, including an electronic device 1401 and multiple sensors 1402;
  • the multiple sensors 1402 are configured to collect data and send the collected data to the electronic device 1401;
  • the electronic device 1401 is configured to acquire pose data of any two sensors at different times; determine, from the pose data, the pose change difference of the two sensors; and, if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
  • the pose change difference of the two sensors is determined, and if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (for example, coaxial cable, optical fiber, or DSL (Digital Subscriber Line)) or wirelessly (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD (Digital Versatile Disc)), or a semiconductor medium (such as an SSD (Solid State Disk)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An extrinsic parameter change detection method and apparatus, an electronic device, and a detection system. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor also changes synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if, from the pose data of any two of multiple sensors at different times, a pose change difference between the two sensors is determined to exist and the difference reaches a preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, thereby achieving detection of changes in sensor extrinsic parameters.

Description

Extrinsic parameter change detection method and apparatus, electronic device, and detection system
This application claims priority to Chinese patent application No. 202010436800.3, filed with the China Patent Office on May 21, 2020 and entitled "Extrinsic parameter change detection method and apparatus, electronic device, and detection system", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of detection technologies, and in particular to an extrinsic parameter change detection method and apparatus, an electronic device, and a detection system.
Background
With the continuous development of detection technology, the detection functions of detection systems are becoming increasingly rich, and a detection system is often required to detect multiple types of data; therefore, multiple sensors are provided in one detection system. For example, any two or more of a lidar, a millimeter-wave radar, a monocular/binocular camera, a depth camera, an event camera, a wheel-speed sensor, a steering-wheel-angle sensor, an IMU (Inertial Measurement Unit), a GNSS (Global Navigation Satellite System), and the like may be provided in one detection system.
When the system is set up, the extrinsic parameters of the multiple sensors need to be calibrated. An extrinsic parameter refers to the relative position and attitude relationship between two sensors, and calibrating the extrinsic parameters of multiple sensors means calibrating the relative position and attitude relationship between every two sensors. Whether the extrinsic calibration is accurate directly affects the detection accuracy of the detection system.
Due to vibration, impact, mechanical wear, and other causes, the position and attitude of a sensor may change, so that the actual pose relationship between two sensors no longer matches the calibrated extrinsic parameters; that is, the calibrated extrinsic parameters are no longer accurate, which affects the detection accuracy of the detection system. In order to adjust the extrinsic parameters as soon as possible and guarantee the detection accuracy of the detection system, how to detect changes in sensor extrinsic parameters has become a technical problem to be solved urgently.
Summary
The purpose of the embodiments of the present application is to provide an extrinsic parameter change detection method and apparatus, an electronic device, and a detection system, so as to detect changes in sensor extrinsic parameters. The specific technical solutions are as follows:
In a first aspect, an embodiment of the present application provides an extrinsic parameter change detection method, including:
acquiring pose data of any two of multiple sensors at different times;
determining, from the pose data, the pose change difference of the two sensors;
if the pose change difference reaches a preset degree of difference, determining that the extrinsic parameters of the two sensors have changed.
In a second aspect, an embodiment of the present application provides an extrinsic parameter change detection apparatus, including:
an acquisition module, configured to acquire pose data of any two of multiple sensors at different times;
an analysis module, configured to determine, from the pose data, the pose change difference of the two sensors;
a detection module, configured to determine that the extrinsic parameters of the two sensors have changed if the pose change difference reaches a preset degree of difference.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores machine-executable instructions executable by the processor, and the machine-executable instructions are loaded and executed by the processor to implement the method provided in the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a machine-readable storage medium storing machine-executable instructions which, when loaded and executed by a processor, implement the method provided in the first aspect of the embodiments of the present application.
In a fifth aspect, an embodiment of the present application provides a detection system, including an electronic device and multiple sensors;
the multiple sensors are configured to collect data and send the collected data to the electronic device;
the electronic device is configured to acquire pose data of any two sensors at different times; determine, from the pose data, the pose change difference of the two sensors; and, if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
According to the extrinsic parameter change detection method and apparatus, electronic device, and detection system provided by the embodiments of the present application, pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
Brief Description of the Drawings
In order to explain the technical solutions of the embodiments of the present application and of the prior art more clearly, the drawings required by the embodiments and the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flowchart of an extrinsic parameter change detection method according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of acquiring pose data according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of an extrinsic parameter change detection method according to another embodiment of the present application;
FIG. 4 is a schematic flowchart of an extrinsic parameter change detection method according to yet another embodiment of the present application;
FIG. 5 is a schematic flowchart of an extrinsic parameter change detection method according to still another embodiment of the present application;
FIG. 6 is a schematic flowchart of an extrinsic parameter change detection method according to still another embodiment of the present application;
FIG. 7 is a schematic diagram of the execution framework of the extrinsic parameter change detection method according to an embodiment of the present application;
FIG. 8 is a schematic diagram of the data flow of the extrinsic parameter change detection method according to an embodiment of the present application;
FIG. 9 is a schematic flowchart of the sub-steps of pose interpolation and coordinate transformation according to an embodiment of the present application;
FIG. 10 is a schematic flowchart of pose interpolation and coordinate transformation according to an embodiment of the present application;
FIG. 11 is a schematic flowchart of the extrinsic parameter change detection step according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of the extrinsic parameter change detection apparatus according to an embodiment of the present application;
FIG. 13 is a schematic structural diagram of the electronic device according to an embodiment of the present application;
FIG. 14 is a schematic structural diagram of the detection system according to an embodiment of the present application.
Detailed Description
To make the purposes, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to the drawings and embodiments. Obviously, the described embodiments are only a part rather than all of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In order to detect changes in sensor extrinsic parameters, the embodiments of the present application provide an extrinsic parameter change detection method and apparatus, an electronic device, and a detection system. The extrinsic parameter change detection method provided by the embodiments of the present application is introduced first.
The execution subject of the extrinsic parameter change detection method provided by the embodiments of the present application may be an extrinsic calibration device with an extrinsic calibration function, a detection device with a parameter change detection function, or each sensor in a detection system; no specific limitation is made here, and these are collectively referred to below as an electronic device. The method may be implemented by at least one of software, a hardware circuit, and a logic circuit provided in the electronic device.
As shown in FIG. 1, an extrinsic parameter change detection method provided by an embodiment of the present application may include the following steps.
S101: acquire pose data of any two of multiple sensors at different times.
S102: determine, from the pose data, the pose change difference of the two sensors.
S103: if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
By applying the solution provided by this embodiment of the present application, pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
The extrinsic parameter change detection method provided by the embodiments of the present application is applicable to a multi-sensor detection system and detects extrinsic parameter changes for every two of the multiple sensors. Multiple sensors means two or more sensors from which pose data can be obtained directly or indirectly, for example: a lidar, a millimeter-wave radar, a monocular/binocular camera, a depth camera, an event camera, a wheel-speed sensor, a steering-wheel-angle sensor, an IMU, a GNSS, and the like. Pose data is a general term for position data and attitude data; it may include only position data or only attitude data, or include both. Pose data may be relative (i.e., the pose data of one sensor relative to another sensor, or of one sensor at one time relative to another time) or absolute (i.e., the pose data obtained by a sensor at a certain time); it may be output by the sensor directly, or computed from the sensor's output data by a specific algorithm; and it may be three-dimensional or two-dimensional.
In one implementation of this embodiment of the present application, the pose data of each sensor at different times can be acquired, and when extrinsic parameter change detection is to be performed on two particular sensors, the pose data of these two sensors at different times are extracted therefrom.
Optionally, S101 may specifically be: acquiring sensor data collected at different times by any two of the multiple sensors; and converting the sensor data into pose data using the data conversion strategies corresponding to the types of the sensor data.
Each sensor has a data collection function and can collect sensor data of the corresponding type; for example, a lidar collects point cloud data, and a monocular/binocular camera collects image data. When collecting sensor data, a sensor can carry the type of the sensor data in the data information, so that when the sensor data is acquired, its type can be read from it directly. Different types of sensor data correspond to different data conversion strategies, and the corresponding strategy can be used to convert the sensor data into pose data. FIG. 2 is a schematic flowchart of acquiring pose data according to this embodiment: the sensor data collected by each sensor is acquired; a data conversion strategy is selected according to the type of the sensor data; and the sensor data is converted with the corresponding strategy to obtain pose data. For example, if a sensor collects acceleration, linear velocity, and satellite positioning data, an integrated navigation method can be used to convert them into pose data; if a sensor collects image data, a visual odometry algorithm can be used to convert the image data into pose data; if a sensor collects three-dimensional point cloud data, a point cloud odometry algorithm can be used to convert it into pose data; and if a sensor collects angular velocity and linear velocity, a dead-reckoning method can be used to convert them into pose data. A minimal sketch of this type-based dispatch is given below.
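For illustration only, the type-based dispatch described above might be organized as in the following Python sketch; the strategy names, the data-type strings, and the function signatures are assumptions made for this example and are not part of the present application:

```python
# Minimal sketch of selecting a data conversion strategy by sensor data type.
# Strategy names, type strings, and signatures are illustrative assumptions.

def integrated_navigation_to_pose(data):
    """Convert acceleration, linear velocity, and satellite positioning data."""
    raise NotImplementedError

def visual_odometry_to_pose(data):
    """Convert image data via a visual odometry algorithm."""
    raise NotImplementedError

def pointcloud_odometry_to_pose(data):
    """Convert 3D point cloud data via a point cloud odometry algorithm."""
    raise NotImplementedError

def dead_reckoning_to_pose(data):
    """Convert angular velocity and linear velocity via dead reckoning."""
    raise NotImplementedError

# The data type carried in the sensor data message picks the converter.
CONVERSION_STRATEGIES = {
    "imu_gnss": integrated_navigation_to_pose,
    "image": visual_odometry_to_pose,
    "pointcloud": pointcloud_odometry_to_pose,
    "wheel_odometry": dead_reckoning_to_pose,
}

def sensor_data_to_pose(data_type, data):
    """Convert raw sensor data to pose data using the strategy for its type."""
    return CONVERSION_STRATEGIES[data_type](data)
```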
For ease of understanding, the monocular visual odometry algorithm and the point cloud odometry algorithm applicable to a monocular camera and a lidar are taken below as examples to describe how pose data is obtained; the ways of obtaining pose data from other types of sensor data are not repeated here. One implementation of the monocular visual odometry algorithm is as follows: the algorithm is divided into an initialization phase and a normal-running phase.
The initialization process is: extract image feature points from two consecutive input frames of image data, for example extract SIFT (Scale-Invariant Feature Transform) features from the two frames respectively, then match the SIFT feature descriptors of the two frames to obtain candidate feature-point correspondences. From these correspondences the E matrix (Essential Matrix) can be computed; the E matrix can further be decomposed, and one valid rotation matrix R and translation vector t are selected, which together form the rotation-translation matrix T. After the pose relationship between the two frames (the rotation-translation matrix T), the feature point coordinates, and the correspondences are obtained, the depth of each feature point (i.e., the perpendicular distance of a 3D point in space from the camera) can be computed by triangulation, from which the 3D coordinates of each feature point are obtained. The initialization process then ends.
The normal-running process is: extract feature points from the current frame of image data and match them against the feature points of the previous frame whose 3D points can be computed by triangulation, thereby obtaining the correspondences between the feature points of the current frame and the 3D points of the previous frame. From these correspondences, PnP (Perspective-n-Point, a method that solves for the camera pose from 3D-2D point correspondences) can be used to solve the rotation-translation matrix of the current frame relative to the previous frame; this matrix is the pose relationship between the two frames, and applying it to the pose data of the previous frame yields the pose data of the current frame. Subsequent frames are processed in the same way, which realizes the process of obtaining pose data from image data. A sketch of the two-frame relative-pose step appears below.
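As a rough sketch of the two-frame relative-pose step only — using OpenCV, and ORB features rather than the SIFT features named above, purely for brevity; the parameter choices are assumptions, not the application's prescribed implementation:

```python
# Sketch of the two-frame relative-pose step of monocular visual odometry,
# using ORB features (instead of SIFT, for brevity) with OpenCV.
import cv2
import numpy as np

def relative_pose_from_images(img1, img2, K):
    """Estimate rotation R and up-to-scale translation t between two frames."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Match feature descriptors between the two frames.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix from the matches, then decompose into a valid R and t.
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t  # t is defined only up to scale for a monocular camera
```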
The input of the point cloud odometry algorithm is a series of point clouds in frames; the current frame of point cloud is registered to the previous frame with the ICP (Iterative Closest Point) algorithm so that the two frames match well. Point cloud registration simultaneously yields the pose relationship between the two frames, and applying this relationship to the pose data of the previous frame yields the pose data of the current frame, for example as sketched below.
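A minimal sketch of this registration step, assuming Open3D as the ICP implementation and an illustrative correspondence distance:

```python
# Sketch of one point-cloud odometry step with Open3D: register the current
# frame to the previous frame with point-to-point ICP. The correspondence
# distance is an assumption to be tuned per lidar.
import numpy as np
import open3d as o3d

def relative_pose_from_clouds(prev_cloud, curr_cloud, max_corr_dist=0.5):
    """Return the 4x4 transform aligning curr_cloud onto prev_cloud."""
    result = o3d.pipelines.registration.registration_icp(
        curr_cloud, prev_cloud, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # pose relationship between the two frames
```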
Each sensor outputs sensor data or pose data at a certain frequency, so for one sensor the electronic device can acquire a series of pose data at different times. For one sensor, the pose data acquired at different times will vary; by analyzing and operating on the pose data acquired at different times, the pose change of that sensor can be obtained. The pose change of each sensor can thus be obtained, and accordingly the pose change difference of two sensors can be compared. The pose change difference is a metric value: it may be a concrete difference amount, for example a 50% difference in the pose changes of the two sensors or a 30-degree difference in angle change; it may also be a statistic over the moments at which a difference exists, for example a pose change difference existing between the two sensors at 50% of the moments within a period; or it may be another metric value, which will not be enumerated here.
In general, constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between their pose changes in that coordinate system. However, if the pose change difference is large (reaches the preset degree of difference), the pose changes of the two sensors are no longer consistent; that is, the extrinsic parameters of the two sensors have changed.
In a specific implementation, the pose change may be the change over a series of pose data at different times, or the change between the pose data of two adjacent times. Correspondingly, the pose change difference may be the overall difference between the two sensors over a period, or the difference between the two sensors at a single moment. For a more comprehensive and accurate pose change analysis, an overall analysis over a period is generally adopted. Specifically, the pose change differences of the two sensors at different moments within a period can be determined, and if the pose changes differ at the vast majority of moments (more than a preset proportion), the extrinsic parameters are deemed to have changed; alternatively, the pose change trends of the two sensors within a period can each be determined and then compared, and if the degree of difference between the two trends is large (greater than a preset degree), the extrinsic parameters are deemed to have changed. There are many other ways to use the pose change difference to judge whether the extrinsic parameters of two sensors have changed, which are not enumerated here. A sketch of the proportion-of-moments criterion is given below.
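A minimal sketch of the proportion-of-moments criterion, with illustrative thresholds standing in for the preset proportion and preset degree:

```python
# Sketch of the proportion-of-moments criterion: flag an extrinsic change
# when pose changes differ at more than a preset proportion of the moments
# in a window. Both thresholds are illustrative assumptions.
import numpy as np

def extrinsics_changed(per_moment_differences, diff_threshold=0.1,
                       proportion_threshold=0.5):
    """per_moment_differences: pose-change difference measured at each moment."""
    differing = np.asarray(per_moment_differences) > diff_threshold
    return bool(differing.mean() > proportion_threshold)
```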
Optionally, S102 may be implemented through the following steps.
Step 1: from the pose data, compute the relative poses of the two sensors at different times, where a relative pose includes the relative attitude angle of the pose data at the current time relative to the pose data at the previous time.
Step 2: from the relative poses, compute the relative pose deviations of the two sensors at different times, where a relative pose deviation includes an attitude deviation and a translation deviation.
Step 3: from the relative attitude angles, compute the attitude deviation weights and translation deviation weights of the two sensors at different times.
Step 4: according to the attitude deviation weights and translation deviation weights of the two sensors at different times, perform a weighted summation of the attitude deviations and translation deviations of the two sensors at different times to obtain the extrinsic parameter change score of the two sensors; the extrinsic parameter change score characterizes the pose change difference of the two sensors.
Correspondingly, S103 may specifically be: if the extrinsic parameter change score is greater than a preset threshold, determine that the extrinsic parameters of the two sensors have changed.
A relative pose is the change of the pose data at the current time relative to the pose data at the previous time; specifically, this change may include a relative attitude angle. From the pose data of the two sensors, the relative poses of the two sensors at different times can be computed. For example, the relative pose of the lidar at time 2 is the change of its pose data at time 2 relative to its pose data at time 1, which can be computed with formula (1):

$$\Delta T_{L12} = T_{L1}^{-1}\,T_{L2} \qquad (1)$$

where ΔT_L12 is the relative pose of the lidar L at time 2, T_L1 is the pose data of the lidar L at time 1, T_L1^{-1} is the inverse of T_L1, T_L2 is the pose data of the lidar L at time 2, and time 1 and time 2 are adjacent times.
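In code, with poses represented as 4×4 homogeneous transform matrices, formula (1) is simply (a sketch; the matrix representation is an assumption of this example):

```python
# Sketch of formula (1) with 4x4 homogeneous pose matrices.
import numpy as np

def relative_pose(T_prev, T_curr):
    """Delta T = T_prev^{-1} @ T_curr, e.g. the lidar's relative pose at time 2."""
    return np.linalg.inv(T_prev) @ T_curr
```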
From the relative poses of the two sensors at different times, the relative pose deviations of the two sensors at different times can be computed. For example, the relative pose deviation of the lidar and the monocular camera at time 2 can be computed with formula (2):

$$\Delta T_{CL12} = \Delta T_{C12}^{-1}\,\Delta T_{L12} \qquad (2)$$

where ΔT_CL12 is the relative pose deviation of the lidar L and the monocular camera C at time 2, ΔT_C12^{-1} is the inverse of the relative pose of the monocular camera C at time 2, and ΔT_L12 is the relative pose of the lidar L at time 2. The relative pose deviation ΔT_CL12 includes an attitude deviation ΔR_CL and a translation deviation Δt_CL, and the attitude deviation angle corresponding to ΔR_CL may be denoted Δθ_CL.
When the output data types or dimensions of the two sensors are inconsistent — for example, a single-axis IMU outputs only one-dimensional attitude information while a lidar can output three-dimensional position and attitude information — the deviation is computed only over the commonly output information; that is, only the one-dimensional attitude deviation is computed, and the translation deviation is set to 0.
A relative pose includes a relative attitude angle θ ∈ (−π, π), and the relative attitude angle determines the weights of the attitude deviation and the translation deviation. Specifically, the attitude deviation weights and translation deviation weights of the two sensors at different times can be computed from the relative attitude angles: the sine of half the relative attitude angle may be taken as the attitude deviation weight, and the sum of the absolute value of the sine of half the relative attitude angle and the cosine of half the relative attitude angle may be taken as the translation deviation weight. Then, according to the attitude deviation weights and translation deviation weights of the two sensors at different times, a weighted summation of the attitude deviations and translation deviations of the two sensors at different times yields the extrinsic parameter change score of the two sensors. The extrinsic parameter change score characterizes the pose change difference of the two sensors; that is, the larger the score, the larger the pose change difference of the two sensors. Specifically, the score can be computed with formula (3):

$$\mathrm{score} = \sum_{i=1}^{n}\left[\sin\frac{\theta_i}{2}\,\Delta\theta_i + \left(\left|\sin\frac{\theta_i}{2}\right| + \cos\frac{\theta_i}{2}\right)\Delta t_i\right] \qquad (3)$$

where score is the extrinsic parameter change score, n is the number of relative poses, Δθ_i is the attitude deviation angle corresponding to the i-th relative pose, θ_i is the relative attitude angle corresponding to the i-th relative pose, Δt_i is the translation deviation corresponding to the i-th relative pose, sin(θ_i/2) is the attitude deviation weight, and |sin(θ_i/2)| + cos(θ_i/2) is the translation deviation weight.
The extrinsic parameter change score is compared with a preset threshold; if the score is greater than the preset threshold, the extrinsic parameters of the two sensors are considered to have changed. The preset threshold is set according to actual usage requirements: in scenarios with a high tolerance for extrinsic parameter changes the preset threshold can be set larger, and conversely it can be set smaller. A sketch of the full scoring computation follows.
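Putting formulas (2) and (3) together, the scoring computation might read as the following sketch; extracting the angle via the rotation-matrix trace, and using angle magnitudes, are simplifying assumptions of this example:

```python
# Sketch of formulas (2)-(3): per-moment relative pose deviations, the
# angle-dependent weights, and the weighted-sum score. Assumes time-aligned
# lists of 4x4 relative poses for the two sensors in one coordinate system.
import numpy as np

def rotation_angle(R):
    """Magnitude of the rotation angle of a 3x3 rotation matrix, in [0, pi]."""
    return float(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)))

def extrinsic_change_score(rel_poses_a, rel_poses_b):
    """Weighted sum of attitude and translation deviations over all moments."""
    score = 0.0
    for dTa, dTb in zip(rel_poses_a, rel_poses_b):
        dT = np.linalg.inv(dTa) @ dTb            # formula (2): relative pose deviation
        d_theta = rotation_angle(dT[:3, :3])     # attitude deviation angle
        d_trans = np.linalg.norm(dT[:3, 3])      # translation deviation
        theta = rotation_angle(dTa[:3, :3])      # relative attitude angle of the moment
        w_att = np.sin(theta / 2.0)                               # attitude weight
        w_trans = abs(np.sin(theta / 2.0)) + np.cos(theta / 2.0)  # translation weight
        score += w_att * d_theta + w_trans * d_trans              # formula (3)
    return score

# If extrinsic_change_score(...) exceeds the preset threshold, the extrinsic
# parameters of the two sensors are deemed to have changed.
```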
Optionally, after the determination that the extrinsic parameters have changed is obtained, the method may further include: outputting the extrinsic parameter change score as a detection confidence.
After it is determined that the extrinsic parameters of the two sensors have changed, the computed extrinsic parameter change score can also be output as a detection confidence. The detection confidence characterizes the credibility of the detection result: the higher the score, the larger the extrinsic parameter change difference of the two sensors, so whether the judgment that the extrinsic parameters have changed is credible can be judged directly from the magnitude of the detection confidence.
The above processing can be performed for every pairwise combination of all sensors, yielding the detection results and detection confidences of whether the extrinsic parameters between all sensors have changed.
Based on the embodiment shown in FIG. 1, an embodiment of the present application further provides an extrinsic parameter change detection method which, as shown in FIG. 3, may include the following steps.
S301: acquire pose data of any two of multiple sensors at different times and the timestamps of the pose data.
S302: taking the timestamps of the pose data of a first sensor of the two sensors as targets, perform an interpolation operation on the pose data of a second sensor of the two sensors using a preset interpolation algorithm, where the output frequency of the first sensor is lower than the output frequency of the second sensor.
S303: determine the pose change difference of the two sensors from the pose data of the first sensor and the interpolated pose data of the second sensor.
S304: if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
Since different sensors may have different output frequencies, when the electronic device acquires the pose data of the sensors at different times, there may be much data whose times do not correspond. The timestamp of a piece of pose data is the time stamped on the pose data when the sensor obtains it, and data whose times do not correspond means that the timestamps of the various types of pose data all differ. The pose data therefore needs to be processed so that the pose data of the two sensors coincide in time, and in this embodiment of the present application interpolation can be used for this purpose. The interpolation process is: obtain the timestamps of the pose data of the two sensors respectively; select one sensor as the interpolation target (usually the sensor with the lower output frequency, called the first sensor in this embodiment); and, taking the timestamps of the first sensor's pose data as targets, perform an interpolation operation on the second sensor's pose data, i.e., at the moment corresponding to each timestamp of the first sensor's pose data, compute by interpolation the second sensor's pose data at that moment from the second sensor's pose data near that moment. Specifically, linear interpolation, bicubic interpolation, spline interpolation, and similar methods can be used when interpolating position data, and spherical interpolation and similar methods when interpolating attitude data. After the pose data is interpolated, the pose data of the two sensors coincide in time, so a more accurate pose change difference can be obtained. A minimal interpolation sketch follows.
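A minimal sketch of this time alignment, assuming SciPy's `Slerp` for the spherical interpolation of attitudes and per-axis linear interpolation for positions (the target timestamps are assumed to lie within the source time range):

```python
# Sketch of time-aligning the higher-rate sensor's poses to the lower-rate
# sensor's timestamps: slerp for attitude, linear interpolation for position.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_poses(src_times, src_rotations, src_positions, target_times):
    """src_rotations: a scipy Rotation holding one rotation per source timestamp;
    src_positions: (N, 3) array; target_times must lie within src_times."""
    slerp = Slerp(src_times, src_rotations)
    rotations = slerp(target_times)                     # spherical interpolation
    positions = np.stack(
        [np.interp(target_times, src_times, src_positions[:, axis])
         for axis in range(3)], axis=1)                 # linear interpolation
    return rotations, positions
```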
Based on the embodiment shown in FIG. 1, an embodiment of the present application further provides an extrinsic parameter change detection method which, as shown in FIG. 4, may include the following steps.
S401: acquire pose data of any two of multiple sensors at different times and the extrinsic parameters pre-calibrated for the two sensors.
S402: transform the pose data into the same coordinate system according to the extrinsic parameters.
S403: determine the pose change difference of the two sensors from the coordinate-transformed pose data.
S404: if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
The pose data acquired from different sensors are often in different coordinate systems, so before the pose change difference is computed, the pose data of the sensors need to be transformed into the same coordinate system. Since the extrinsic parameters characterize the relative position and attitude relationship between the two sensors, in two-dimensional or three-dimensional form, the relationship between the coordinate systems of the two sensors can be determined through the extrinsic parameters; therefore, according to the extrinsic parameters, the pose data of the two sensors can be transformed into the same coordinate system. Data in the same coordinate system is easier to operate on and yields a more accurate pose change difference. A minimal sketch is given below.
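A minimal sketch of this step, assuming 4×4 homogeneous poses and the frame-conjugation form used in the worked example later in this description:

```python
# Sketch of expressing lidar poses in the monocular camera's coordinate
# system using the pre-calibrated extrinsics T_CL. The conjugation form
# matches formula (6) in the worked example below and is an assumption of
# this sketch; poses are 4x4 homogeneous matrices.
import numpy as np

def lidar_pose_in_camera_frame(T_L, T_CL):
    """T'_L = T_CL @ T_L @ T_CL^{-1}: the lidar pose re-expressed for the camera."""
    return T_CL @ T_L @ np.linalg.inv(T_CL)
```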
Based on the embodiment shown in FIG. 4, an embodiment of the present application further provides an extrinsic parameter change detection method which, as shown in FIG. 5, may include the following steps.
S501: acquire pose data of any two of multiple sensors at different times and the extrinsic parameters pre-calibrated for the two sensors.
S502: transform the pose data into the same coordinate system according to the extrinsic parameters.
S503: from the coordinate-transformed pose data, compute the relative poses of the two sensors at different times, where a relative pose includes the translation of the pose data at the current time relative to the pose data at the previous time.
S504: for each sensor, sum the translations of that sensor at different times to obtain the sensor's accumulated translation result.
S505: compute the ratio between the accumulated translation results of the two sensors to obtain a scale factor.
S506: according to the scale factor, unify the change scale of the pose data of the two sensors to obtain pose data with a unified change scale.
S507: determine the pose change difference of the two sensors from the pose data with the unified change scale.
S508: if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
For some special sensors, such as a monocular camera, the obtained pose data lacks scale information; scale information is the ratio between the sensor's pose scale and the real physical scale. For this type of pose data, the change scale needs to be unified based on the pose data of the other sensor. Scale unification usually follows coordinate-system unification. The specific process is: over the series of pose data, compute for each sensor the change of the pose data at the current time relative to the pose data at the previous time (i.e., the relative pose), computed as in formula (1); this yields a series of relative poses for the two sensors. A relative pose includes a relative attitude angle and a translation. Sum the translations in the relative poses of each of the two sensors separately, and compute the ratio of the two summed translations to obtain the scale factor S. Taking the lidar L and the monocular camera C as an example, the scale factor can be computed with formula (4):

$$S = \frac{\sum_{i=1}^{n}\left\lVert \Delta t_{Ci} \right\rVert}{\sum_{i=1}^{n}\left\lVert \Delta t_{Li} \right\rVert} \qquad (4)$$

where S is the scale factor, n is the number of relative poses, Δt_Ci is the translation of the i-th relative pose of the monocular camera C, and Δt_Li is the translation of the i-th relative pose of the lidar L.
Using this scale factor, the pose data of the monocular camera and the lidar can be unified in change scale; at this point, a series of pose data of the two sensors in a unified coordinate system and with a unified change scale has been obtained, so a more accurate pose change difference can be obtained. A minimal sketch follows.
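A minimal sketch of formula (4) and its application; which trajectory is rescaled, and the orientation of the ratio, follow the convention assumed here:

```python
# Sketch of formula (4): scale factor from summed relative-translation
# magnitudes, then applied to the scale-less monocular trajectory. The
# orientation of the ratio (camera over lidar) and dividing the camera
# positions by S are conventions assumed for this example.
import numpy as np

def scale_factor(cam_rel_translations, lidar_rel_translations):
    """Ratio of the two sensors' accumulated relative-translation magnitudes."""
    cam_sum = sum(np.linalg.norm(t) for t in cam_rel_translations)
    lidar_sum = sum(np.linalg.norm(t) for t in lidar_rel_translations)
    return cam_sum / lidar_sum

def unify_scale(cam_positions, S):
    """Bring the camera positions onto the lidar's change scale."""
    return np.asarray(cam_positions) / S
```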
Based on the embodiment shown in FIG. 1, an embodiment of the present application further provides an extrinsic parameter change detection method which, as shown in FIG. 6, may include the following steps.
S601: acquire pose data of any two of multiple sensors at different times.
S602: determine, from the pose data, the pose change difference of the two sensors.
S603: if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
S604: re-calibrate the extrinsic parameters of the two sensors according to the latest acquired pose data of the two sensors.
After it is determined that the extrinsic parameters of the two sensors have changed, in order to improve the accuracy of extrinsic calibration and the detection accuracy of the detection system, the extrinsic parameters of the two sensors can be re-calibrated according to the latest acquired pose data of the two sensors. The re-calibrated extrinsic parameters match the current actual poses of the sensors, thereby improving the detection accuracy of the detection system and realizing automatic detection of extrinsic parameter changes and automatic calibration of extrinsic parameters.
Of course, in another implementation, the electronic device may also notify technicians of the detection result that the extrinsic parameters have changed, and the technicians manually calibrate the extrinsic parameters of the corresponding sensors.
For ease of understanding, the extrinsic parameter change detection method provided by the embodiments of the present application is introduced in detail below with a concrete example.
As shown in FIG. 7, the extrinsic parameter change detection method provided by the embodiments of the present application can be divided into two major steps: pose acquisition and alignment, and extrinsic parameter change detection. The pose acquisition and alignment step includes two sub-steps: pose acquisition, and pose interpolation and coordinate transformation; the extrinsic parameter change detection step includes one sub-step: extrinsic parameter change scoring.
The data flow of the method provided by this embodiment is shown in FIG. 8. In the pose acquisition sub-step, the input data is sensor data (such as point cloud data, image data, etc.), and the output data is pose data, which may be two-dimensional or three-dimensional. The pose data of all sensors are combined pairwise; combination is essentially pose data extraction, i.e., extracting the pose data of two particular sensors at the same time. One combination (the pose data of sensors A and D) is selected below for explanation; the processing flow of the other combinations is similar. In the pose interpolation and coordinate transformation sub-step, the input data is the pose data of sensors A and D, and the output data is the interpolated, time-aligned pose data of sensors A and D in the same coordinate system. In the extrinsic parameter change scoring sub-step, the input is the time-aligned pose data of sensors A and D in the same coordinate system, and the output is the extrinsic parameter change score of sensors A and D.
The function of the pose acquisition sub-step is to process the various kinds of sensor data to obtain the corresponding pose data. Suppose sensors A and D are a monocular camera and a lidar, respectively; the specific data conversion process is as exemplified in the embodiment shown in FIG. 1 and is not repeated here.
The function of the pose interpolation and coordinate transformation sub-step is to interpolate the pose data of the different sensors so that the pose moments of the different sensors can be aligned and transformed into the same coordinate system with a unified scale.
The flow of the pose interpolation and coordinate transformation sub-step is shown in FIG. 9. Let sensor A be the monocular camera C and sensor D be the lidar L. The image output frequency of the monocular camera C is 30 Hz, so the output frequency of the monocular camera C's pose data obtained through the pose acquisition sub-step is also 30 Hz; the typical point cloud frame output frequency of the lidar L is 10 Hz, so the output frequency of the lidar L's pose data obtained through the pose acquisition sub-step is also 10 Hz. The extrinsic parameters calibrated in advance for the two are T_CL (the basis transformation from the monocular camera to the lidar). The lidar L's pose data is selected as the interpolation target, and the monocular camera C's pose data is interpolated at the timestamps of the lidar L's pose data. The specific interpolation method is as follows:
Suppose there are two pieces of monocular camera pose data and one piece of lidar pose data, where the lidar pose datum lies between the two monocular camera pose data in time. The monocular camera's pose data at the moment of the lidar pose datum needs to be obtained by interpolation. That is:

$$t_{C1} < t_{L2} < t_{C3} \qquad (5)$$

where t_C denotes the timestamp of the monocular camera's pose data and t_L denotes the timestamp of the lidar's pose data. The pose data corresponding to the three moments are denoted T_C1, T_L2, and T_C3, where T_C denotes monocular camera pose data and T_L2 denotes lidar pose data.
The purpose of the interpolation is to use T_C1 and T_C3 to obtain the monocular camera's pose data T_C2 at moment t_L2. Here, spherical interpolation is selected to interpolate the attitude data, and linear interpolation is selected to interpolate the position data.
Performing the above operation on the series of pose data of the two sensors yields a series of pose data pairs. At this point, the interpolation process of the pose data is complete.
During the coordinate transformation of the pose data, the pre-calibrated extrinsic parameters T_CL are used to transform the series of lidar pose data into the coordinate system of the monocular camera. The formula for transforming T_L2 into T'_L2 in the monocular camera coordinate system is:

$$T'_{L2} = T_{CL}\,T_{L2}\,T_{CL}^{-1} \qquad (6)$$

After all of the series of lidar pose data has been transformed into the monocular camera coordinate system, the coordinate transformation operation is complete.
Since the monocular camera's pose information lacks scale information, the scale factor needs to be estimated. Over the series of pose data pairs, compute for each sensor the change between the pose data of two adjacent moments (hereinafter the relative pose). For example, the relative pose between pose datum 1 and pose datum 2 of the lidar transformed into the monocular camera coordinate system can be computed as:

$$\Delta T'_{L12} = T'^{-1}_{L1}\,T'_{L2} \qquad (7)$$

The relative poses of the monocular camera are likewise computed with a similar formula. This yields a series of relative poses of the two sensors. Sum the translations in the relative poses of each of the two sensors separately, and compute the ratio of the two summed translations to obtain the scale factor S, computed specifically with formula (4). Using this scale factor, the pose change scales of the monocular camera and the lidar can be unified; at this point, a series of pose data of the two sensors in the same coordinate system and with a unified change scale has been obtained.
The flow of pose interpolation and coordinate transformation is shown in FIG. 10: curve 1 represents a series of monocular camera pose data in the monocular camera coordinate system; curve 3 represents a series of lidar pose data in the lidar coordinate system; curve 2 represents the result of interpolating the monocular camera's pose data at the timestamps of the lidar's pose data; curve 4 represents the result of transforming the lidar's pose data into the monocular camera coordinate system using the pre-calibrated extrinsic parameters; and curve 5 represents the effect after the change scales of the two sets of pose data are unified using the computed scale factor, with the relative poses indicated.
The function of the extrinsic parameter change detection step is to compute the extrinsic parameter change score from the series of relative pose pairs that have been transformed into the unified coordinate system. The specific flow is shown in FIG. 11: first compute the deviation of each pair of relative poses and the weight of each relative pose for the extrinsic parameter change, then compute the extrinsic parameter change score from the series of relative pose deviations and weights.
First compute the deviations of the series of relative poses. Taking the computation over the relative poses ΔT_C12 and ΔT'_L12 of the monocular camera and the lidar in the example above as an illustration, the relative pose deviation is computed as:

$$\Delta T_{CL12} = \Delta T_{C12}^{-1}\,\Delta T'_{L12} \qquad (8)$$

where ΔT_CL12 is the relative pose deviation of the lidar and the monocular camera. The relative pose deviation ΔT_CL12 includes the attitude deviation ΔR_CL and the translation deviation Δt_CL, and the attitude deviation angle corresponding to ΔR_CL may be denoted Δθ_CL.
When the output data types or dimensions of the two sensors are inconsistent — for example, a single-axis IMU outputs only one-dimensional attitude information while a lidar can output three-dimensional position and attitude information — the deviation is computed only over the commonly output information; that is, only the one-dimensional attitude deviation is computed, and the translation deviation is set to 0.
A relative pose includes a relative attitude angle θ ∈ (−π, π), and the relative attitude angle determines the weights of the attitude deviation and the translation deviation. Specifically, the attitude deviation weights and translation deviation weights of the two sensors at different times can be computed from the relative attitude angles: the sine of half the relative attitude angle may be taken as the attitude deviation weight, and the sum of the absolute value of the sine of half the relative attitude angle and the cosine of half the relative attitude angle may be taken as the translation deviation weight. Then, according to the attitude deviation weights and translation deviation weights of the two sensors at different times, a weighted summation of the attitude deviations and translation deviations of the two sensors at different times yields the extrinsic parameter change score of the two sensors. The extrinsic parameter change score characterizes the pose change difference of the two sensors; that is, the larger the score, the larger the pose change difference of the two sensors. Specifically, the score can be computed with formula (9):

$$\mathrm{score} = \sum_{i=1}^{n}\left[\sin\frac{\theta_i}{2}\,\Delta\theta_i + \left(\left|\sin\frac{\theta_i}{2}\right| + \cos\frac{\theta_i}{2}\right)\Delta t_i\right] \qquad (9)$$

where score is the extrinsic parameter change score, n is the number of relative poses, Δθ_i is the attitude deviation angle corresponding to the i-th relative pose, θ_i is the relative attitude angle corresponding to the i-th relative pose, Δt_i is the translation deviation corresponding to the i-th relative pose, sin(θ_i/2) is the attitude deviation weight, and |sin(θ_i/2)| + cos(θ_i/2) is the translation deviation weight.
The extrinsic parameter change score is compared with a preset threshold; if the score is greater than the preset threshold, the extrinsic parameters of the two sensors are considered to have changed. The preset threshold is set according to actual usage requirements: in scenarios with a high tolerance for extrinsic parameter changes the preset threshold can be set larger, and conversely it can be set smaller.
Based on the foregoing method embodiments, an embodiment of the present application provides an extrinsic parameter change detection apparatus; as shown in FIG. 12, the apparatus may include:
an acquisition module 1210, configured to acquire pose data of any two of multiple sensors at different times;
an analysis module 1220, configured to determine, from the pose data, the pose change difference of the two sensors;
a detection module 1230, configured to determine that the extrinsic parameters of the two sensors have changed if the pose change difference reaches a preset degree of difference.
Optionally, the acquisition module 1210 may be specifically configured to: acquire sensor data collected at different times by any two of multiple sensors; and convert the sensor data into pose data using the data conversion strategies corresponding to the types of the sensor data.
Optionally, the apparatus may further include an interpolation module;
the interpolation module is configured to obtain the timestamps of the pose data; and, taking the timestamps of the pose data of a first sensor of the two sensors as targets, perform an interpolation operation on the pose data of a second sensor of the two sensors using a preset interpolation algorithm, where the output frequency of the first sensor is lower than the output frequency of the second sensor;
the analysis module 1220 may be specifically configured to: determine the pose change difference of the two sensors from the pose data of the first sensor and the interpolated pose data of the second sensor.
Optionally, the apparatus may further include a coordinate transformation module;
the coordinate transformation module is configured to acquire the extrinsic parameters pre-calibrated for the two sensors, and transform the pose data into the same coordinate system according to the extrinsic parameters;
the analysis module 1220 may be specifically configured to: determine the pose change difference of the two sensors from the coordinate-transformed pose data.
Optionally, the apparatus may further include a scale unification module;
the scale unification module is configured to: compute, from the coordinate-transformed pose data, the relative poses of the two sensors at different times, where a relative pose includes the translation of the pose data at the current time relative to the pose data at the previous time; for each sensor, sum the translations of that sensor at different times to obtain the sensor's accumulated translation result; compute the ratio between the accumulated translation results of the two sensors to obtain a scale factor; and, according to the scale factor, unify the change scale of the pose data of the two sensors to obtain pose data with a unified change scale;
the analysis module 1220 may be specifically configured to: determine the pose change difference of the two sensors from the pose data with the unified change scale.
Optionally, the analysis module 1220 may be specifically configured to: compute, from the pose data, the relative poses of the two sensors at different times, where a relative pose includes the relative attitude angle of the pose data at the current time relative to the pose data at the previous time; compute, from the relative poses, the relative pose deviations of the two sensors at different times, where a relative pose deviation includes an attitude deviation and a translation deviation; compute, from the relative attitude angles, the attitude deviation weights and translation deviation weights of the two sensors at different times; and, according to the attitude deviation weights and translation deviation weights of the two sensors at different times, perform a weighted summation of the attitude deviations and translation deviations of the two sensors at different times to obtain the extrinsic parameter change score of the two sensors, where the extrinsic parameter change score characterizes the pose change difference of the two sensors;
the detection module 1230 may be specifically configured to: determine that the extrinsic parameters of the two sensors have changed if the extrinsic parameter change score is greater than a preset threshold.
Optionally, the apparatus may further include an output module;
the output module is configured to output the extrinsic parameter change score as a detection confidence.
Optionally, the apparatus may further include a calibration module;
the calibration module is configured to re-calibrate the extrinsic parameters of the two sensors according to the latest acquired pose data of the two sensors.
By applying the solution provided by this embodiment of the present application, pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
An embodiment of the present application provides an electronic device, as shown in FIG. 13, including a processor 1301 and a memory 1302, where the memory 1302 stores machine-executable instructions executable by the processor 1301, and the machine-executable instructions are loaded and executed by the processor 1301 to implement the extrinsic parameter change detection method provided by the embodiments of the present application.
The foregoing memory may include RAM (Random Access Memory) and may also include NVM (Non-volatile Memory), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the foregoing processor.
The foregoing processor may be a general-purpose processor, including a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The memory 1302 and the processor 1301 may transmit data through a wired or wireless connection, and the electronic device may communicate with other devices through a wired or wireless communication interface. FIG. 13 shows only an example of data transmission over a bus and does not limit the specific connection mode.
In this embodiment of the present application, by reading the machine-executable instructions stored in the memory and loading and executing them, the processor can achieve the following: pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
In addition, an embodiment of the present application provides a machine-readable storage medium storing machine-executable instructions which, when loaded and executed by a processor, implement the extrinsic parameter change detection method provided by the embodiments of the present application.
In this embodiment of the present application, the machine-readable storage medium stores machine-executable instructions that, at runtime, execute the extrinsic parameter change detection method provided by the embodiments of the present application, so the following can be achieved: pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
In yet another embodiment provided by the present application, a computer program product containing instructions is further provided which, when run on a computer, causes the computer to execute any of the extrinsic parameter change detection methods in the foregoing embodiments.
An embodiment of the present application provides a detection system, as shown in FIG. 14, including an electronic device 1401 and multiple sensors 1402;
the multiple sensors 1402 are configured to collect data and send the collected data to the electronic device 1401;
the electronic device 1401 is configured to acquire pose data of any two sensors at different times; determine, from the pose data, the pose change difference of the two sensors; and, if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the two sensors have changed.
By applying this embodiment of the present application, pose data of any two of multiple sensors at different times are acquired, and the pose change difference of the two sensors is determined from the pose data; if the pose change difference reaches a preset degree of difference, it is determined that the extrinsic parameters of the any two sensors have changed. Constrained by the extrinsic parameters, if one of two sensors whose extrinsic parameters have been calibrated changes its pose between different times, the pose of the other sensor should also change synchronously, and the pose changes of the two sensors should be consistent; this consistency is reflected in the fact that, after the poses of the two sensors are transformed into the same coordinate system through the calibrated extrinsic parameters, there is no difference between the pose changes of the two sensors in that coordinate system. However, if a pose change difference between the two sensors is determined and the difference reaches the preset degree, the pose changes of the two sensors are no longer consistent, so it can be determined that the extrinsic parameters of the two sensors have changed, achieving detection of changes in sensor extrinsic parameters.
The foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented by software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (for example, coaxial cable, optical fiber, or DSL (Digital Subscriber Line)) or wirelessly (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrated with one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD (Digital Versatile Disc)), or a semiconductor medium (for example, an SSD (Solid State Disk)), etc.
It should be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of additional identical elements in the process, method, article, or device including that element.
Each embodiment in this specification is described in a related manner; identical or similar parts of the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. In particular, as the apparatus, electronic device, machine-readable storage medium, computer program product, and detection system embodiments are substantially similar to the method embodiments, their description is relatively simple, and the relevant parts may refer to the description of the method embodiments.
The above descriptions are only preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the protection scope of the present application.

Claims (18)

  1. An extrinsic parameter change detection method, characterized in that the method comprises:
    acquiring pose data of any two sensors among multiple sensors at different times;
    determining, according to the pose data, the pose change difference of the any two sensors;
    if the pose change difference reaches a preset degree of difference, determining that the extrinsic parameters of the any two sensors have changed.
  2. The method according to claim 1, characterized in that the step of acquiring pose data of any two sensors among multiple sensors at different times comprises:
    acquiring sensor data collected at different times by any two sensors among multiple sensors;
    converting the sensor data into pose data by using the data conversion strategies corresponding to the types of the sensor data.
  3. The method according to claim 1, characterized in that, before the step of determining, according to the pose data, the pose change difference of the any two sensors, the method further comprises:
    acquiring the timestamps of the pose data;
    taking the timestamps of the pose data of a first sensor of the any two sensors as targets, performing an interpolation operation on the pose data of a second sensor of the any two sensors by using a preset interpolation algorithm, wherein the output frequency of the first sensor is lower than the output frequency of the second sensor;
    the step of determining, according to the pose data, the pose change difference of the any two sensors comprises:
    determining the pose change difference of the any two sensors according to the pose data of the first sensor and the interpolated pose data of the second sensor.
  4. The method according to claim 1, characterized in that, before the step of determining, according to the pose data, the pose change difference of the any two sensors, the method further comprises:
    acquiring extrinsic parameters pre-calibrated for the any two sensors;
    transforming the pose data into the same coordinate system according to the extrinsic parameters;
    the step of determining, according to the pose data, the pose change difference of the any two sensors comprises:
    determining the pose change difference of the any two sensors according to the coordinate-transformed pose data.
  5. The method according to claim 4, characterized in that, after the step of transforming the pose data into the same coordinate system according to the extrinsic parameters, the method further comprises:
    computing, according to the coordinate-transformed pose data, the relative poses of the any two sensors at different times, wherein a relative pose includes the translation of the pose data at the current time relative to the pose data at the previous time;
    for each sensor, summing the translations of the sensor at different times to obtain an accumulated translation result of the sensor;
    computing the ratio between the accumulated translation results of the any two sensors to obtain a scale factor;
    unifying, according to the scale factor, the change scale of the pose data of the any two sensors to obtain the pose data with a unified change scale;
    the step of determining, according to the pose data, the pose change difference of the any two sensors comprises:
    determining the pose change difference of the any two sensors according to the pose data with the unified change scale.
  6. The method according to claim 1, characterized in that the step of determining, according to the pose data, the pose change difference of the any two sensors comprises:
    computing, according to the pose data, the relative poses of the any two sensors at different times, wherein a relative pose includes the relative attitude angle of the pose data at the current time relative to the pose data at the previous time;
    computing, according to the relative poses, the relative pose deviations of the any two sensors at different times, wherein a relative pose deviation includes an attitude deviation and a translation deviation;
    computing, according to the relative attitude angles, the attitude deviation weights and the translation deviation weights of the any two sensors at different times;
    performing, according to the attitude deviation weights and the translation deviation weights of the any two sensors at different times, a weighted summation of the attitude deviations and the translation deviations of the any two sensors at different times to obtain the extrinsic parameter change score of the any two sensors, wherein the extrinsic parameter change score characterizes the pose change difference of the any two sensors;
    the step of determining that the extrinsic parameters of the any two sensors have changed if the pose change difference reaches a preset degree of difference comprises:
    if the extrinsic parameter change score is greater than a preset threshold, determining that the extrinsic parameters of the any two sensors have changed.
  7. The method according to claim 6, characterized in that, after the step of determining that the extrinsic parameters of the any two sensors have changed if the extrinsic parameter change score is greater than a preset threshold, the method further comprises:
    outputting the extrinsic parameter change score as a detection confidence.
  8. The method according to claim 1, characterized in that, after the step of determining that the extrinsic parameters of the any two sensors have changed if the pose change difference reaches a preset degree of difference, the method further comprises:
    re-calibrating the extrinsic parameters of the any two sensors according to the latest acquired pose data of the any two sensors.
  9. An extrinsic parameter change detection apparatus, characterized in that the apparatus comprises:
    an acquisition module, configured to acquire pose data of any two sensors among multiple sensors at different times;
    an analysis module, configured to determine, according to the pose data, the pose change difference of the any two sensors;
    a detection module, configured to determine that the extrinsic parameters of the any two sensors have changed if the pose change difference reaches a preset degree of difference.
  10. The apparatus according to claim 9, characterized in that the acquisition module is specifically configured to: acquire sensor data collected at different times by any two sensors among multiple sensors; and convert the sensor data into pose data by using the data conversion strategies corresponding to the types of the sensor data.
  11. The apparatus according to claim 9, characterized in that the apparatus further comprises an interpolation module;
    the interpolation module is configured to acquire the timestamps of the pose data; and, taking the timestamps of the pose data of a first sensor of the any two sensors as targets, perform an interpolation operation on the pose data of a second sensor of the any two sensors by using a preset interpolation algorithm, wherein the output frequency of the first sensor is lower than the output frequency of the second sensor;
    the analysis module is specifically configured to: determine the pose change difference of the any two sensors according to the pose data of the first sensor and the interpolated pose data of the second sensor.
  12. The apparatus according to claim 9, characterized in that the apparatus further comprises a coordinate transformation module;
    the coordinate transformation module is configured to acquire extrinsic parameters pre-calibrated for the any two sensors, and transform the pose data into the same coordinate system according to the extrinsic parameters;
    the analysis module is specifically configured to: determine the pose change difference of the any two sensors according to the coordinate-transformed pose data.
  13. The apparatus according to claim 12, characterized in that the apparatus further comprises a scale unification module;
    the scale unification module is configured to: compute, according to the coordinate-transformed pose data, the relative poses of the any two sensors at different times, wherein a relative pose includes the translation of the pose data at the current time relative to the pose data at the previous time; for each sensor, sum the translations of the sensor at different times to obtain an accumulated translation result of the sensor; compute the ratio between the accumulated translation results of the any two sensors to obtain a scale factor; and unify, according to the scale factor, the change scale of the pose data of the any two sensors to obtain the pose data with a unified change scale;
    the analysis module is specifically configured to: determine the pose change difference of the any two sensors according to the pose data with the unified change scale.
  14. The apparatus according to claim 9, characterized in that the analysis module is specifically configured to: compute, according to the pose data, the relative poses of the any two sensors at different times, wherein a relative pose includes the relative attitude angle of the pose data at the current time relative to the pose data at the previous time; compute, according to the relative poses, the relative pose deviations of the any two sensors at different times, wherein a relative pose deviation includes an attitude deviation and a translation deviation; compute, according to the relative attitude angles, the attitude deviation weights and the translation deviation weights of the any two sensors at different times; and perform, according to the attitude deviation weights and the translation deviation weights of the any two sensors at different times, a weighted summation of the attitude deviations and the translation deviations of the any two sensors at different times to obtain the extrinsic parameter change score of the any two sensors, wherein the extrinsic parameter change score characterizes the pose change difference of the any two sensors;
    the detection module is specifically configured to: determine that the extrinsic parameters of the any two sensors have changed if the extrinsic parameter change score is greater than a preset threshold.
  15. The apparatus according to claim 14, characterized in that the apparatus further comprises an output module;
    the output module is configured to output the extrinsic parameter change score as a detection confidence.
  16. The apparatus according to claim 9, characterized in that the apparatus further comprises a calibration module;
    the calibration module is configured to re-calibrate the extrinsic parameters of the any two sensors according to the latest acquired pose data of the any two sensors.
  17. An electronic device, characterized by comprising a processor and a memory, wherein the memory stores machine-executable instructions executable by the processor, and the machine-executable instructions are loaded and executed by the processor to implement the method according to any one of claims 1-8.
  18. A detection system, characterized by comprising an electronic device and multiple sensors;
    the multiple sensors are configured to collect data and send the collected data to the electronic device;
    the electronic device is configured to acquire pose data of any two sensors at different times; determine, according to the pose data, the pose change difference of the any two sensors; and, if the pose change difference reaches a preset degree of difference, determine that the extrinsic parameters of the any two sensors have changed.
PCT/CN2021/094433 2020-05-21 2021-05-18 Extrinsic parameter change detection method and apparatus, electronic device and detection system WO2021233309A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21807516.6A EP4155673A4 (en) 2020-05-21 2021-05-18 METHOD AND DEVICE FOR DETECTING EXTRINSIC PARAMETER CHANGES, ELECTRONIC DEVICE AND DETECTION SYSTEM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010436800.3 2020-05-21
CN202010436800.3A CN113701745B (zh) 2020-05-21 2020-05-21 Extrinsic parameter change detection method and apparatus, electronic device and detection system

Publications (1)

Publication Number Publication Date
WO2021233309A1 true WO2021233309A1 (zh) 2021-11-25

Family

ID=78646113

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/094433 WO2021233309A1 (zh) 2020-05-21 2021-05-18 Extrinsic parameter change detection method and apparatus, electronic device and detection system

Country Status (3)

Country Link
EP (1) EP4155673A4 (zh)
CN (1) CN113701745B (zh)
WO (1) WO2021233309A1 (zh)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114049404A (zh) * 2022-01-12 2022-02-15 Method and device for calibrating extrinsic parameters of an in-vehicle camera
CN114136316A (zh) * 2021-12-01 2022-03-04 Inertial navigation error elimination method based on point cloud feature points, chip, and robot
CN115267751A (zh) * 2022-08-19 2022-11-01 Sensor calibration method and device, vehicle, and storage medium
CN115993089A (zh) * 2022-11-10 2023-04-21 PL-ICP-based online calibration method for intrinsic and extrinsic parameters of a four-steering-wheel AGV
CN116168502A (zh) * 2023-02-28 2023-05-26 Self-optimizing energy-saving control system for fire-protection sensors in an industrial park
CN114448850B (zh) * 2021-12-21 2023-11-03 Dial-testing control method, electronic device, and dial-testing system
WO2023220972A1 (zh) * 2022-05-18 2023-11-23 Mobile device and pose estimation method and apparatus therefor, and storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115235527B (zh) * 2022-07-20 2023-05-12 Sensor extrinsic parameter calibration method and apparatus, and electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106403998A (zh) * 2016-08-30 2017-02-15 IMU-based device and method for resisting violent interference
CN107328411A (zh) * 2017-06-30 2017-11-07 Vehicle-mounted positioning system and autonomous-driving vehicle
CN109949372A (zh) * 2019-03-18 2019-06-28 Joint calibration method for a lidar and vision
US20190324483A1 (en) * 2017-04-27 2019-10-24 Pixart Imaging Inc. Sensor chip using having low power consumption
CN110782496A (zh) * 2019-09-06 2020-02-11 Calibration method and apparatus, aerial photography device, and storage medium
CN111060138A (zh) * 2019-12-31 2020-04-24 Calibration method and apparatus, processor, electronic device, and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107747941B (zh) * 2017-09-29 2020-05-15 Binocular vision positioning method, apparatus, and system
US10970425B2 (en) * 2017-12-26 2021-04-06 Seiko Epson Corporation Object detection and tracking
CN108489482B (zh) * 2018-02-13 2019-02-26 Implementation method and system of a visual-inertial odometer
CN109376785B (zh) * 2018-10-31 2021-09-24 Navigation method based on an iterated extended Kalman filter fusing inertial and monocular vision

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106403998A (zh) * 2016-08-30 2017-02-15 IMU-based device and method for resisting violent interference
US20190324483A1 (en) * 2017-04-27 2019-10-24 Pixart Imaging Inc. Sensor chip using having low power consumption
CN107328411A (zh) * 2017-06-30 2017-11-07 Vehicle-mounted positioning system and autonomous-driving vehicle
CN109949372A (zh) * 2019-03-18 2019-06-28 Joint calibration method for a lidar and vision
CN110782496A (zh) * 2019-09-06 2020-02-11 Calibration method and apparatus, aerial photography device, and storage medium
CN111060138A (zh) * 2019-12-31 2020-04-24 Calibration method and apparatus, processor, electronic device, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4155673A4

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114136316A (zh) * 2021-12-01 2022-03-04 Inertial navigation error elimination method based on point cloud feature points, chip, and robot
CN114448850B (zh) * 2021-12-21 2023-11-03 Dial-testing control method, electronic device, and dial-testing system
CN114049404A (zh) * 2022-01-12 2022-02-15 Method and device for calibrating extrinsic parameters of an in-vehicle camera
CN114049404B (zh) * 2022-01-12 2022-04-05 Method and device for calibrating extrinsic parameters of an in-vehicle camera
WO2023220972A1 (zh) * 2022-05-18 2023-11-23 Mobile device and pose estimation method and apparatus therefor, and storage medium
CN115267751A (zh) * 2022-08-19 2022-11-01 Sensor calibration method and device, vehicle, and storage medium
CN115993089A (zh) * 2022-11-10 2023-04-21 PL-ICP-based online calibration method for intrinsic and extrinsic parameters of a four-steering-wheel AGV
CN115993089B (zh) * 2022-11-10 2023-08-15 PL-ICP-based online calibration method for intrinsic and extrinsic parameters of a four-steering-wheel AGV
CN116168502A (zh) * 2023-02-28 2023-05-26 Self-optimizing energy-saving control system for fire-protection sensors in an industrial park
CN116168502B (zh) * 2023-02-28 2024-04-19 Self-optimizing energy-saving control system for fire-protection sensors in an industrial park

Also Published As

Publication number Publication date
EP4155673A1 (en) 2023-03-29
CN113701745B (zh) 2024-03-08
EP4155673A4 (en) 2023-12-06
CN113701745A (zh) 2021-11-26

Similar Documents

Publication Publication Date Title
WO2021233309A1 (zh) 2021-11-25 Extrinsic parameter change detection method and apparatus, electronic device and detection system
US10825198B2 (en) 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images
US11295472B2 (en) Positioning method, positioning apparatus, positioning system, storage medium, and method for constructing offline map database
EP3770636A1 (en) Radar data compensation method for mobile robot, device, and storage medium
EP2990828A1 (en) Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and program therefor
CN112465877B (zh) Kalman-filter visual tracking stabilization method based on motion state estimation
US20140285794A1 (en) Measuring device
CN112051591A (zh) Detection method for a lidar and an inertial measurement unit, and related device
CN113933818A (zh) Method, device, storage medium, and program product for calibrating lidar extrinsic parameters
CN113447923A (zh) Target detection method, apparatus, system, electronic device, and storage medium
CN113777600A (zh) Cooperative positioning and tracking method for multiple millimeter-wave radars
CN111080682A (zh) Point cloud data registration method and apparatus
WO2022205750A1 (zh) Point cloud data generation method and apparatus, electronic device, and storage medium
CN115728753A (zh) Extrinsic parameter calibration method and apparatus for a lidar and integrated navigation, and intelligent vehicle
WO2024001649A1 (zh) Robot positioning method and apparatus, and computer-readable storage medium
JP7136737B2 (ja) Three-dimensional position measurement device, three-dimensional position measurement method, and program
WO2023179032A1 (zh) Image processing method and apparatus, electronic device, storage medium, computer program, and computer program product
WO2023000540A1 (zh) Blast furnace burden surface shape measurement method, terminal device, and storage medium
CN115727871A (zh) Trajectory quality detection method and apparatus, electronic device, and storage medium
CN112873280B (zh) Calibration method and device for a sensor of a robot
CN111723826B (zh) Accuracy detection method and apparatus for a tracking algorithm, computer device, and storage medium
KR101212317B1 (ko) Sound source position measurement apparatus including a marker provided with a beacon, and method for improving sound source position error using the same
CN117671007B (zh) Displacement monitoring method and apparatus, electronic device, and storage medium
CN115439561B (zh) Sensor calibration method for a robot, robot, and storage medium
WO2022135062A2 (zh) Inertial navigation method, terminal device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21807516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021807516

Country of ref document: EP

Effective date: 20221221