WO2023283987A1 - Sensor security detection method and device for unmanned system, and storage medium - Google Patents

Sensor security detection method and device for unmanned system, and storage medium

Info

Publication number
WO2023283987A1
WO2023283987A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
distance
sensors
group
correlation
Application number
PCT/CN2021/107842
Other languages
French (fr)
Chinese (zh)
Inventor
邵翠萍
李慧云
陈贝章
Original Assignee
中国科学院深圳先进技术研究院
Application filed by 中国科学院深圳先进技术研究院 (Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences)
Publication of WO2023283987A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/23Testing, monitoring, correcting or calibrating of receiver elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Definitions

  • The invention relates to the technical field of automatic driving and sensing, and in particular to a sensor security detection method, device, and storage medium for an unmanned system.
  • Unmanned driving technology is a comprehensive technology involving many cutting-edge technologies such as artificial intelligence, sensing technology, map technology and computers.
  • On-board sensors are important measurement devices that self-driving cars rely on.
  • Sensors function like human eyes and ears; the sensing suite mainly comprises lidar, vision cameras, millimeter-wave radar, global positioning system (GPS) receivers, and other equipment.
  • Vehicle-mounted sensors provide unmanned vehicles with detailed environmental information around them, and provide rich data for planning and decision-making modules, including the location, shape, category, and speed information of obstacles, as well as semantic understanding of some special scenarios (such as construction areas, traffic lights and traffic signs, etc.).
  • Unmanned driving depends heavily on on-board sensors, and the sensors are extremely vulnerable to external attack or interference and may become unreliable.
  • the security of the planning and decision-making stages of an unmanned vehicle is premised on the security of its sensors.
  • attacking the sensors is a simple, direct, brute-force, and effective method; because it does not require entering the interior of the unmanned driving system, the threshold of this attack method is very low, and it poses a huge security threat to unmanned systems. Once a sensor of an unmanned vehicle is attacked or fails, the information it acquires is distorted and the recognition results are wrong, so an incorrect driving strategy is planned, which is very likely to cause a car accident and serious loss of life and property.
  • current research on unmanned driving technology focuses mainly on the functional level, such as environmental perception and sensor fusion; security research on close-range attacks against unmanned vehicles is still at a relatively early stage and proceeds mainly sensor by sensor.
  • some improvement and defense measures enhance the robustness of individual sensors and reduce the impact of attacks to a certain extent, but a real-time, systematic method for detecting abnormal sensors has not yet been formed.
  • the present invention provides a sensor security detection method, device, and storage medium for unmanned systems that can detect the security of the sensors in an unmanned system in real time and accurately locate abnormal sensors.
  • a sensor security detection method for an unmanned system comprises: dividing the sensors in the unmanned system into two groups, the first group being sensors for positioning and the second group being sensors for detecting the shape characteristics of objects;
  • the process of separately finding the cross-correlation between the sensors in each group includes: unifying the sensing data of the correlated sensors in each group into the same feature dimension;
  • the process of screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group includes: establishing, under the same feature dimension, distance models that measure those distances;
  • the distances between the corresponding sensing data are calculated in real time according to the distance model and compared with the respective first distance thresholds; a sensor whose sensing data exceeds its first distance threshold is a suspicious sensor.
  • the first group of sensors includes GPS (Global Positioning System), IMU (Inertial Measurement Unit), and Lidar (laser radar);
  • the process of unifying the sensing data of the correlated sensors in the first group into the same feature dimension includes: fusing the lidar and the IMU for positioning to form the fused feature vector $F_{Lidar,i} = [x_{l,i}, y_{l,i}, z_{l,i}]^T$, and transforming the original GPS feature vector by projection into the GPS feature vector $F_{GPS,i} = [x_{G,i}, y_{G,i}, z_{G,i}]^T$; in establishing the first distance model Distance(GPS, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the first group, a norm characterizes the distance: $\text{Distance}(GPS, Lidar) = L_n = \left( \sum_k |F_{GPS,k} - F_{Lidar,k}|^n \right)^{1/n}$, where n is a non-negative integer.
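To make the norm-based distance model concrete, here is a minimal Python sketch; it is an editorial illustration, not code from the patent, and the function name `ln_distance` and the sample coordinates are invented for the example:

```python
import numpy as np

def ln_distance(f_a, f_b, n=2):
    """Minkowski (Ln) distance between two feature vectors unified to the
    same dimension; n=1 is the Manhattan and n=2 the Euclidean distance."""
    diff = np.abs(np.asarray(f_a, float) - np.asarray(f_b, float))
    if n == 0:
        # Degenerate L0 case: count the coordinates that differ at all.
        return float(np.count_nonzero(diff))
    return float(np.sum(diff ** n) ** (1.0 / n))

# Carrier position from the two positioning pipelines (hypothetical values).
f_gps = np.array([12.30, -4.71, 0.95])        # GPS feature after Mercator projection
f_lidar_imu = np.array([12.41, -4.66, 0.97])  # lidar + IMU fused feature
print(f"Distance(GPS, Lidar) = {ln_distance(f_gps, f_lidar_imu):.3f} m")
```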
  • when the lidar and the IMU are fused for positioning, the process includes: pre-integrating the IMU measurement data $I_{(i,i+1)} = \{(\hat{a}_t, \hat{\omega}_t)\}$ acquired between two moments, and estimating the relative motion of the carrier using the continuous feature point cloud sequences $\{P_k\}$, $\{P_{k+1}\}$ and the IMU pre-integration result;
  • the use of the continuous feature point cloud sequences $\{P_k\}$, $\{P_{k+1}\}$ and the IMU pre-integration result includes: using the IMU pre-integration result to coordinate-transform the feature point cloud sequence $\{P_{k+1}\}$ so that it lies in the same coordinate system as the feature point cloud sequence $\{P_k\}$;
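The patent does not spell out its pre-integration equations, so the sketch below is a simplified, assumption-laden illustration of accumulating IMU samples between two lidar frames: it ignores bias and gravity compensation and uses a first-order rotation update, and all names are hypothetical.

```python
import numpy as np

def preintegrate_imu(accels, gyros, dts):
    """Naive IMU pre-integration between two lidar frames.

    accels, gyros: (N, 3) specific-force / angular-rate samples in the body frame
    dts: (N,) sample intervals in seconds
    Returns the accumulated rotation, velocity delta, and position delta.
    """
    R = np.eye(3)      # accumulated rotation
    dv = np.zeros(3)   # accumulated velocity change
    dp = np.zeros(3)   # accumulated position change
    for a, w, dt in zip(accels, gyros, dts):
        dp += dv * dt + 0.5 * (R @ a) * dt ** 2
        dv += (R @ a) * dt
        # First-order rotation update from the angular rate (skew matrix).
        wx = np.array([[0.0, -w[2], w[1]],
                       [w[2], 0.0, -w[0]],
                       [-w[1], w[0], 0.0]])
        R = R @ (np.eye(3) + wx * dt)
    return R, dv, dp
```

The resulting relative pose would seed the point-cloud alignment described above; a production pipeline would additionally track biases, gravity, and re-orthonormalize R.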
  • the second group of sensors includes a lidar and a camera;
  • the process of unifying the sensing data of the correlated sensors in the second group into the same feature dimension includes: projecting, after transformation of the coordinate system, any three-dimensional point $(X_L, Y_L, Z_L)$ of the three-dimensional point cloud of the sensing data onto a two-dimensional point (i, j) of the two-dimensional coordinate system, and extracting the lidar feature vector $F_{Lidar} = [X_L, Y_L, h_L, w_L, \theta_L]^T$, where $[X_L, Y_L]^T$ is the position of the detection target relative to the vehicle body coordinate origin $[0, 0]^T$, $[h_L, w_L]^T$ is the height and width of the detection frame, and $\theta_L$ is the confidence that the detection frame contains the detection target;
  • a norm is used to characterize the distance between the sensing data: $\text{Distance}(Camera, Lidar) = L_n = \left( \sum_k |F_{Camera,k} - F_{Lidar,k}|^n \right)^{1/n}$, where n is a non-negative integer.
  • the process of determining a suspicious sensor includes: establishing a normal distribution $D \sim N(\varepsilon, \sigma^2)$ of the distance D between the sensing data of the cross-correlated sensors A and B in at least one of the first group and the second group, with expectation $\varepsilon = \frac{1}{m}\sum_{i=1}^{m} \text{Distance}_i(A, B)$ and variance $\sigma^2 = \frac{1}{m}\sum_{i=1}^{m} (\text{Distance}_i(A, B) - \varepsilon)^2$, where $\text{Distance}_i(A, B)$ is the distance between the sensing data of the two sensors A and B in the i-th sample, m is the total number of normal samples, and m is an integer greater than 0; if the real-time distance falls outside the confidence interval $M = (\varepsilon - n\sigma, \varepsilon + n\sigma)$, the two sensors A and B are suspicious sensors.
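A small sketch of this screening step, assuming the distance samples were collected while both sensors were known to be normal; the function names and numbers are illustrative, and 2.58 is the standard two-sided coefficient for a ~99% confidence level:

```python
import numpy as np

def fit_distance_model(normal_distances):
    """Estimate expectation and standard deviation of the cross-sensor
    distance from m samples recorded under normal operation."""
    d = np.asarray(normal_distances, float)
    return d.mean(), d.std()  # population std, matching the 1/m formulas

def is_suspicious(distance, eps, sigma, n_sigma=2.58):
    """Flag a sensor pair when the live distance leaves the interval
    (eps - n*sigma, eps + n*sigma)."""
    return not (eps - n_sigma * sigma < distance < eps + n_sigma * sigma)

rng = np.random.default_rng(0)
samples = rng.normal(0.12, 0.03, 1000)   # synthetic "normal" distances
eps, sigma = fit_distance_model(samples)
print(is_suspicious(0.13, eps, sigma))   # False: inside the 99% interval
print(is_suspicious(0.95, eps, sigma))   # True: flag A and B as suspicious
```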
  • the process of judging whether each suspicious sensor is abnormal according to the correlation of its sensing data at two adjacent moments includes: determining the feature vector $F_{j,i}$ of each suspicious sensor, $F_{j,i}$ denoting the feature vector of the j-th sensor at moment i; establishing a time-series third distance model $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1})$, which expresses the distance between the sensing data of the j-th sensor at two adjacent moments, moment i and moment i+1: $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1}) = L_n(t_i, t_{i+1}) = \left( \sum_k |F_{j,i,k} - F_{j,i+1,k}|^n \right)^{1/n}$, where n is a non-negative integer and $F_{j,i+1}$ is the feature vector of the j-th sensor at moment i+1;
  • the distance between the feature vectors of each suspicious sensor at two adjacent moments is calculated and compared with the respective second distance threshold; a sensor whose sensing data exceeds its second distance threshold is an abnormal sensor.
  • the process of determining an abnormal sensor includes: establishing a normal distribution $D_3 \sim N(\tau, \sigma_3^2)$ of the distance between a suspicious sensor's feature vectors at adjacent moments, with expectation $\tau = \frac{1}{m_3}\sum_{i=1}^{m_3} \text{Difference}_i$ and variance $\sigma_3^2 = \frac{1}{m_3}\sum_{i=1}^{m_3} (\text{Difference}_i - \tau)^2$, where $m_3$ is the total number of adjacent time periods involved in the calculation and $m_3$ is an integer greater than 0; if the distance falls outside the confidence interval $M_3 = (\tau - n\sigma_3, \tau + n\sigma_3)$, the suspicious sensor is an abnormal sensor.
  • Another object of the present invention is to provide a storage medium storing a plurality of instructions suitable for being loaded by a processor to execute the steps of any of the above sensor security detection methods for an unmanned system.
  • Another object of the present invention is to provide a sensor security detection device for an unmanned system, comprising the above storage medium and a processor adapted to execute the instructions.
  • Addressing sensor security, the present invention divides the sensors into two groups according to the two major sensor functions of positioning and detection, and searches for the cross-correlation between the sensors within each group and the autocorrelation of each sensor in the time series. Suspicious sensors are screened out according to the differences between the sensing data of cross-correlated sensors, and whether a suspicious sensor's data is safe and reliable is judged from the correlation of its sensing data at two adjacent moments, so that the security of the sensors in the unmanned system is detected in real time and abnormal sensors are accurately located.
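Putting the two stages together, a hypothetical orchestration could look as follows; it reuses the `ln_distance` helper from the earlier sketch, and the group/model data structures are invented for illustration:

```python
def detect_abnormal_sensors(groups, cross_models, auto_models, feats, prev_feats,
                            n_sigma=2.58):
    """groups: {"positioning": [("GPS", "Lidar+IMU")], "detection": [("Camera", "Lidar")]}
    cross_models[(a, b)] / auto_models[s]: (expectation, std) fitted on normal data
    feats / prev_feats: feature vector per sensor at the current / previous moment."""
    abnormal = []
    for pairs in groups.values():
        for a, b in pairs:
            eps, sigma = cross_models[(a, b)]
            if abs(ln_distance(feats[a], feats[b]) - eps) <= n_sigma * sigma:
                continue  # cross-correlation holds; neither sensor is suspicious
            for s in (a, b):  # stage 2: autocorrelation of each suspicious sensor
                tau, sigma3 = auto_models[s]
                if abs(ln_distance(feats[s], prev_feats[s]) - tau) > n_sigma * sigma3:
                    abnormal.append(s)
    return abnormal
```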
  • Fig. 1 is a flow chart of a sensor security detection method for an unmanned system according to an embodiment of the present invention;
  • Fig. 2 is a functional block diagram of a sensor security detection method for an unmanned system according to an embodiment of the present invention;
  • Fig. 3 is a schematic diagram of an inertial measurement unit according to an embodiment of the present invention;
  • Fig. 4 is a functional block diagram of multi-sensor fusion positioning according to an embodiment of the present invention;
  • Fig. 5 is a frequency relationship diagram of the IMU and the Lidar according to an embodiment of the present invention;
  • Fig. 6 is a schematic diagram of the principle of spherical projection of the lidar point cloud according to an embodiment of the present invention;
  • Fig. 7 is a schematic diagram of the azimuth φ and vertex angle θ of the lidar according to an embodiment of the present invention;
  • Fig. 8 is a schematic diagram of the transformation between the spherical and rectangular coordinates of the lidar according to an embodiment of the present invention;
  • Fig. 9 is a GPS and lidar/IMU fusion positioning trajectory diagram before the simulated external attack on the sensors according to an embodiment of the present invention;
  • Fig. 10 is a GPS and lidar/IMU fusion positioning trajectory diagram after the simulated external attack on the sensors according to an embodiment of the present invention;
  • Fig. 11 is a positioning trajectory diagram after the lidar is subjected to a tampering attack according to an embodiment of the present invention;
  • Fig. 12 is a structural block diagram of a sensor security detection device for an unmanned system according to an embodiment of the present invention.
  • the terms "arranged”, “provided”, and “connected” should be interpreted broadly. For example, it may be a fixed connection, a detachable connection, or an integral structure; it may be a mechanical connection or an electrical connection; it may be a direct connection or an indirect connection through an intermediary; internal connectivity. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present invention according to specific situations.
  • an embodiment of the present invention provides a sensor safety detection method for an unmanned system, including:
  • the first group comprises the sensors used for positioning;
  • the second group comprises the sensors used for detecting the shape features of objects.
  • the on-board sensors in the unmanned system may include IMU, lidar, GPS, visual camera, millimeter-wave radar and other equipment.
  • the IMU is a device that measures the three-axis attitude angles (or angular rates) and the acceleration of a carrier.
  • an IMU is usually composed of three accelerometers, three gyroscopes, and three magnetometers, which measure the angular velocity and acceleration of the object in three-dimensional space, from which the attitude of the carrier is calculated.
  • Lidar is a radar system that emits a laser beam to detect characteristics of a target such as its position and velocity. It works by emitting a detection laser beam toward the target and comparing the received target echo with the transmitted signal to obtain information such as target distance, azimuth, height, speed, attitude, and even shape, so as to detect, track, and identify targets; it also uses simultaneous localization and mapping (SLAM) technology to estimate the carrier's position and attitude.
  • the visual perception part includes target detection and positioning.
  • the sensors can be divided into the following two groups according to the target detection and positioning functions: the first group of sensors can include GPS, IMU and lidar, and the second group of sensors can include lidar and cameras.
  • the first set of sensors is used for positioning. Since GPS works in the global coordinate system while the IMU and lidar work in the carrier coordinate system, in the positioning task the GPS positioning result and the lidar/IMU positioning result can be unified in the world coordinate system; in the fusion-positioning SLAM algorithm, these two types of sensors are tightly coupled and correlated.
  • Figure 4 shows the joint positioning framework of GPS, lidar and IMU.
  • a second set of sensors is used for detection.
  • the camera and lidar are used to detect the target object, and the position coordinates and height and width information of the object are given.
  • the point cloud of lidar in three-dimensional space can be projected into a spherical coordinate system, and the points in this spherical coordinate system can be transformed into a two-dimensional coordinate system.
  • the feature vector of the detected target can be obtained, which contains the coordinates of the target and the size information of the detection frame.
  • the camera can also obtain the coordinates of the detection target and the size of the detection frame through the RGB image and the visual detection network.
  • the process of finding the cross-correlation between the sensors in each group may specifically be unifying the sensing data of the correlated sensors in each group into the same feature dimension, as shown in Figure 2; in this process, feature unification for the positioning task and feature unification for the detection task are performed separately. The process specifically includes the following.
  • the IMU original feature vector is $F_{IMU} = [a_x, a_y, a_z, \omega_x, \omega_y, \omega_z, g_x, g_y, g_z]^T$. Because the positioning task pays more attention to the velocity and acceleration states, the IMU output feature can be defined accordingly: the acceleration and angular velocity are three-dimensional vectors in Euclidean space, the velocity is a three-dimensional vector in Euclidean space, and the zero-bias term is composed of the gyroscope bias and the accelerometer bias.
  • the lidar's original feature is a collection of point clouds, which can be expressed as a vector $[X, Y, Z, intensity]^T$ (for a 64-line lidar, each line scans 1800 points), where X is the abscissa of the point on the horizontal plane, Y the ordinate on the horizontal plane, Z the height of the point, and intensity the reflection intensity.
  • the pose of the lidar carrier is solved by the SLAM algorithm, and its feature vector can be expressed as $F_{Lidar,i} = [x_{l,i}, y_{l,i}, z_{l,i}]^T$;
  • the carrier coordinate system is defined as B, which is consistent with the IMU coordinate system;
  • the world coordinate system is defined as W, and the origin is the carrier body center when the system is initialized.
  • the IMU measurement data $I_{(i,i+1)}$ between two moments contains n groups of acceleration and angular velocity in the carrier coordinate system B.
  • the state of the system at time $t_i$ includes attitude, position, velocity, and the IMU biases; the pose transformation constitutes an element of the special Euclidean group, $[R_i, t_i] \in SE(3)$; the velocity is a three-dimensional vector in Euclidean space, and the zero-bias term is composed of the gyroscope bias and the accelerometer bias.
  • in the pre-integration measurement model, the accelerometer and gyroscope observation noises are each distributed as $N(0, \Sigma)$; $R_B^W$ is the rotation matrix from the carrier coordinate system B to the world coordinate system W, and $t_B^W$ is the translation vector from B to W.
  • the laser odometry module uses the continuous feature point cloud sequences $\{P_k\}$, $\{P_{k+1}\}$ and the IMU pre-integration result to estimate the relative motion of the carrier; its output frequency is consistent with the lidar sampling frequency.
  • in step S022, the process of estimating the relative motion of the carrier from the continuous feature point cloud sequences $\{P_k\}$, $\{P_{k+1}\}$ and the IMU pre-integration result may specifically include: transforming $\{P_{k+1}\}$ into the coordinate system of $\{P_k\}$ using the pre-integration result, recording the sum of distances d between corresponding points $p_j \in P_{k+1}$ and $p_i \in P_k$, and solving with the Levenberg-Marquardt algorithm for the rotation and translation from the carrier coordinate system B to the world coordinate system W that minimize d.
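As an illustration of this scan-matching step, the sketch below aligns two feature clouds with SciPy's Levenberg-Marquardt solver. It is a deliberately reduced stand-in for the patent's optimization: the motion is restricted to planar yaw plus translation, and nearest-neighbor association replaces proper feature correspondence — both are assumptions of this example, not statements about the patent.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial import cKDTree

def rot_z(yaw):
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def match_scans(p_k, p_k1, x0=(0.0, 0.0, 0.0, 0.0)):
    """Estimate (yaw, tx, ty, tz) aligning feature cloud {P_k+1} with {P_k}
    by minimizing point-to-nearest-point distances with Levenberg-Marquardt.
    The IMU pre-integration result would normally provide the seed x0;
    the clouds are (N, 3) arrays with N >= 4 points."""
    tree = cKDTree(p_k)

    def residuals(x):
        transformed = p_k1 @ rot_z(x[0]).T + x[1:]
        return tree.query(transformed)[0]  # distance of each point to {P_k}

    return least_squares(residuals, np.asarray(x0), method="lm").x
```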
  • the feature vector output by the GPS positioning algorithm can be defined as $F_{GPS,i+1} = [x_{G,i+1}, y_{G,i+1}, z_{G,i+1}]^T$.
  • GPS positioning constructs the relationship between the world coordinate system W in which the carrier is located and the earth coordinate system (WGS-84); the original feature vector output at moment i+1 can be expressed as the longitude, latitude, and altitude $[lon_{i+1}, lat_{i+1}, alt_{i+1}]^T$.
  • the original GPS output is based on the latitude and longitude of the WGS-84 coordinates, so the GPS coordinates must be projected onto a planar map before dead reckoning can be done. Commonly used projections include the Gauss-Krüger projection, the UTM projection, and the Mercator projection.
  • the GPS coordinates can be converted using the Mercator projection, for example as $x_{G,i+1} = \frac{EARTH\_RAD}{SCALE} \cdot \frac{\pi \, lon_{i+1}}{180}$ and $y_{G,i+1} = \frac{EARTH\_RAD}{SCALE} \cdot \ln\tan\left(\frac{\pi}{4} + \frac{\pi \, lat_{i+1}}{360}\right)$, where $x_{G,i+1}, y_{G,i+1}, z_{G,i+1}$ are the projected ground coordinates of the original GPS feature vector, EARTH_RAD is the earth radius of 6378137 meters, and SCALE is the map scale. The feature vector output by the GPS positioning algorithm at moment i+1 is therefore $F_{GPS,i+1} = [x_{G,i+1}, y_{G,i+1}, z_{G,i+1}]^T$.
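A runnable version of this conversion, using the standard spherical-Mercator formula; the patent gives only EARTH_RAD and SCALE, so the exact projection variant is a best-effort reconstruction:

```python
import math

EARTH_RAD = 6378137.0  # earth radius in meters, as in the text
SCALE = 1.0            # map scale; 1.0 keeps the output in meters

def mercator(lon_deg, lat_deg, alt_m):
    """Project a WGS-84 longitude/latitude/altitude triple to planar
    Mercator coordinates (x, y, z)."""
    x = EARTH_RAD / SCALE * math.radians(lon_deg)
    y = EARTH_RAD / SCALE * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y, alt_m

print(mercator(114.06, 22.54, 10.0))  # roughly Shenzhen, purely illustrative
```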
  • a traditional CNN (Convolutional Neural Network) is designed for dense, regular two-dimensional inputs and cannot directly process the sparse, irregular lidar point cloud. Before the point cloud data are input into the CNN, they are therefore first spherically projected to obtain a dense, two-dimensional representation.
  • the process of spherical projection to a two-dimensional image plane is shown in Figure 6.
  • Fig. 7 is a schematic diagram of the azimuth φ and the vertex angle θ of the lidar, where φ and θ in the figure respectively denote the azimuth and the vertex (elevation) angle of a point. Conventionally the azimuth is the angle relative to true north, but in the Lidar coordinate system of this embodiment the azimuth is the angle relative to the x direction (directly ahead of the vehicle). The angles can be computed as $\varphi = \arctan\left(\frac{Y}{X}\right)$ and $\theta = \arcsin\left(\frac{Z}{\sqrt{X^2 + Y^2 + Z^2}}\right)$, where (X, Y, Z) are the coordinates of each point in the 3D point cloud. Therefore, for each point in the point cloud, its (θ, φ) can be calculated from its coordinates (X, Y, Z), thereby projecting the points of the three-dimensional space coordinate system into a spherical coordinate system.
  • each point $(X_L, Y_L, Z_L)$ in the spherical coordinate system can then be represented by a point in a rectangular coordinate system: any 3D point $(X_L, Y_L, Z_L)$ of the perception point cloud is projected to a 2D point (i, j), yielding a tensor of size (W, H, C), where W is the number of grid cells into which the horizontal laser perception range is divided, H is the number of lidar lines, and C is the dimension of the per-point feature vector.
  • the five features of each point — X, Y, Z, intensity, and range — are placed at the corresponding two-dimensional coordinates (i, j), where intensity is the reflection intensity of the point and range is the distance from the viewpoint to the sampled point.
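A compact sketch of this projection, assuming a 64-line sensor scanning 1800 azimuth bins per line as stated above; the vertical field-of-view limits are typical values for such a sensor and are an assumption of this example:

```python
import numpy as np

def spherical_project(points, H=64, W=1800, fov_up_deg=2.0, fov_down_deg=-24.8):
    """Project an (N, 4) cloud [X, Y, Z, intensity] onto a (W, H, 5) tensor
    holding [X, Y, Z, intensity, range] per occupied cell."""
    X, Y, Z, intensity = points.T
    rng = np.linalg.norm(points[:, :3], axis=1)
    phi = np.arctan2(Y, X)                        # azimuth, from x (vehicle front)
    theta = np.arcsin(Z / np.maximum(rng, 1e-9))  # vertex (elevation) angle

    i = (((phi + np.pi) / (2 * np.pi)) * W).astype(int) % W
    fov = np.radians(fov_up_deg) - np.radians(fov_down_deg)
    j = (((np.radians(fov_up_deg) - theta) / fov) * H).clip(0, H - 1).astype(int)

    tensor = np.zeros((W, H, 5), dtype=np.float32)
    tensor[i, j] = np.column_stack([X, Y, Z, intensity, rng])
    return tensor
```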
  • feature extraction on the resulting tensor yields the lidar feature vector $F_{Lidar} = [X_L, Y_L, h_L, w_L, \theta_L]^T$, where $[X_L, Y_L]^T$ is the position of the detection target relative to the vehicle body coordinate origin $[0, 0]^T$, $[h_L, w_L]^T$ is the height and width of the detection frame, and $\theta_L$ is the confidence of the detected target.
  • the self-driving vehicle collects images around the vehicle through the on-board camera, and the collected pictures are input into the target detection system in RGB format.
  • the detection system calls the deep convolutional neural network algorithm to extract the features of the RGB image.
  • common structures in such neural networks are the convolution layer, pooling layer, activation layer, dropout layer, BN (batch normalization) layer, and fully connected layer; the features finally extracted from the picture effectively describe the target object.
  • specifically, the deep convolutional neural network algorithm is called to perform feature extraction on the collected RGB images, yielding the camera feature vector $F_{Camera} = [X_v, Y_v, h_v, w_v, p]^T$, where $[X_v, Y_v]^T$ is the coordinate of the target object in the image coordinate system, $[h_v, w_v]^T$ is the height and width of the detection frame, and p is the confidence of the target object.
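For illustration only, here is a toy PyTorch module using the layer types just listed (convolution, batch normalization, activation, pooling, dropout, fully connected) to map an RGB image to a single [X_v, Y_v, h_v, w_v, p] vector; a real detector is far deeper and predicts many candidate boxes, so this is a sketch, not the patent's network.

```python
import torch
import torch.nn as nn

class TinyDetector(nn.Module):
    """Minimal illustration of the layer stack described above."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Dropout(0.2),
            nn.Linear(32, 5),  # [X_v, Y_v, h_v, w_v, p]
        )

    def forward(self, rgb):
        out = self.head(self.features(rgb))
        # Sigmoid keeps the confidence p in (0, 1).
        return torch.cat([out[:, :4], torch.sigmoid(out[:, 4:])], dim=1)

feature = TinyDetector()(torch.rand(1, 3, 224, 224))  # -> tensor of shape (1, 5)
```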
  • the process of screening suspicious sensors may specifically include the following. In step S031, the distance model for the first (positioning) group and the distance model for the second (detection) group are established independently. The perception information of GPS, lidar, and IMU is unified under the same feature representation, and a distance model between them is established under that feature; this embodiment uses a norm to represent the distance between sensing data, so the output feature distance between GPS and the lidar/IMU fused localization can be expressed as $\text{Distance}(GPS, Lidar) = L_n = \left( \sum_k |F_{GPS,k} - F_{Lidar,k}|^n \right)^{1/n}$, where n is a non-negative integer.
  • the lidar feature vector $F_{Lidar}$ and the camera feature vector $F_{Camera}$ should give consistent detection results, which has two meanings: the sizes of the detection frames are consistent, and the coordinates are consistent, i.e., the detection frames completely overlap. In practice, because of noise and limited sensor measurement accuracy, there is a certain normal offset between these two feature vectors. A distance model built on the same features can be used to calculate the distance between the two detection frames and their coordinates; this embodiment again uses a norm: $\text{Distance}(Camera, Lidar) = L_n = \left( \sum_k |F_{Camera,k} - F_{Lidar,k}|^n \right)^{1/n}$, where n is a non-negative integer.
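A two-line numeric illustration of this detection-consistency distance, with hypothetical values; the confidence components are left out because, as noted later, they do not participate in the correlation analysis:

```python
import numpy as np

f_lidar = np.array([5.2, 1.1, 1.6, 0.8])   # [X_L, Y_L, h_L, w_L]
f_camera = np.array([5.0, 1.2, 1.5, 0.9])  # [X_v, Y_v, h_v, w_v]

# L2 instance of Distance(Camera, Lidar).
print(np.sum(np.abs(f_camera - f_lidar) ** 2) ** 0.5)
```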
  • in step S032, the process of determining suspicious sensors may specifically include: fitting the normal distribution $D \sim N(\varepsilon, \sigma^2)$, where $\text{Distance}_i(A, B)$ is the distance between the sensing data of the two sensors A and B in the i-th sample and m is the total number of normal samples (m an integer greater than 0); the corresponding confidence interval M should satisfy $\text{Distance}(A, B) \in (\varepsilon - n\sigma, \varepsilon + n\sigma)$. The confidence level can be set as required and is not necessarily limited to 99%: n = 1.65 corresponds to a 90% confidence level, n = 1.96 to 95%, and n = 2.58 to 99%.
  • abnormalities in each group of sensors are detected in the positioning task and in the detection task respectively, as introduced in detail below.
  • because the lidar and the inertial sensor in the first group have certain measurement errors, under normal circumstances the features of the two in the same dimension also deviate by a certain distance, affected by factors such as measurement error, measurement accuracy, and environmental noise. Since the distance between the fused lidar/IMU localization and the GPS perception data is affected by many factors, that distance can be considered to obey a normal distribution, with expectation $\varepsilon_1 = \frac{1}{m_1}\sum_{i=1}^{m_1} \text{Distance}_i(GPS, Lidar)$, where $m_1$ is the total number of normal samples; the corresponding confidence interval $M_1$ should satisfy $\text{Distance}(GPS, Lidar) \in (\varepsilon_1 - n\sigma_1, \varepsilon_1 + n\sigma_1)$.
  • ideally, the detection frames of the lidar and the camera in the second group of sensors should completely coincide; in practice there is an offset (of the coordinates and of the length and width of the detection frames), and the offset distance lies within a certain range. Since the distance between the lidar and camera perception data is affected by many factors, it can be considered to obey a normal distribution with distance expectation $\varepsilon_2$ and variance $\sigma_2^2$, which can be calculated from samples as $\varepsilon_2 = \frac{1}{m_2}\sum_{i=1}^{m_2} \text{Difference}_i(Camera, Lidar)$ and $\sigma_2^2 = \frac{1}{m_2}\sum_{i=1}^{m_2} (\text{Difference}_i(Camera, Lidar) - \varepsilon_2)^2$, where $\text{Difference}_i(Camera, Lidar)$ is the offset between the camera and lidar detection frames in the i-th sample and $m_2$ is the total number of normal samples.
  • the process of judging abnormal sensors may specifically include the following. After the suspicious sensors are screened out in step S03, the autocorrelation of each suspicious sensor within its group must be checked further, i.e., whether a single sensor is abnormal in the time series, to finally locate the abnormal sensor. Specifically, if the positioning function is abnormal, the time-series correlation of the first group of sensors (lidar and IMU) is checked to locate the abnormal sensor; if the detection function is abnormal, the time-series correlation of the second group of sensors (lidar and camera) is checked to locate the abnormal sensor.
  • the original feature vector output by GPS positioning at moment i can be expressed as $[lon_i, lat_i, alt_i]^T$, and the output feature vector after Mercator projection is $F_{GPS,i} = [x_{G,i}, y_{G,i}, z_{G,i}]^T$.
  • the initial feature vector output by the lidar at moment $t_i$ is $[X, Y, Z, intensity]^T$, where X is the abscissa of the point cloud on the horizontal plane, Y the ordinate on the horizontal plane, Z the height of the point cloud, and intensity the reflection intensity of the point cloud.
  • this is the lidar's original feature vector, regardless of whether it is used in the detection task or the positioning task. In the positioning task the lidar output is $F_{Lidar,i} = [x_{l,i}, y_{l,i}, z_{l,i}]^T$, where $x_{l,i}$ is the abscissa of the horizontal plane of the carrier equipped with the lidar at moment $t_i$, $y_{l,i}$ the ordinate of that horizontal plane, and $z_{l,i}$ the height of the carrier.
  • in the detection task the lidar output at moment $t_i$ is $F_{Lidar,i} = [X_{L,i}, Y_{L,i}, h_{L,i}, w_{L,i}, \theta_L]^T$, where $[X_{L,i}, Y_{L,i}]^T$ is the position of the detection target relative to the vehicle body coordinate origin $[0, 0]^T$, $[h_{L,i}, w_{L,i}]^T$ is the height and width of the detection frame, and $\theta_L$ is the confidence that the detection frame contains the target.
  • the initial feature vector output by the IMU at moment $t_i$ is $F_{IMU,i} = [a_x, a_y, a_z, \omega_x, \omega_y, \omega_z, g_x, g_y, g_z]^T$: the acceleration and the angular velocity are three-dimensional vectors in Euclidean space, the velocity is a three-dimensional vector in Euclidean space, and the zero-bias term is composed of the gyroscope bias and the accelerometer bias.
  • the camera output at moment $t_i$ is $F_{Camera,i} = [X_{v,i}, Y_{v,i}, h_i, w_i, p_i]^T$, where $[X_{v,i}, Y_{v,i}]^T$ is the coordinate of the target object in the image coordinate system, $[h_i, w_i]^T$ is the height and width of the detection frame, and $p_i$ is the confidence of the target object.
  • the confidence level does not participate in the correlation analysis.
  • the time-series third distance model $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1})$ expresses the distance between the sensing data of the j-th sensor at two adjacent moments, moment i and moment i+1. The information sensed by a sensor at two adjacent moments must overlap over a large area, i.e., show an obvious correlation in the time series, so the distance of the j-th sensor between the two adjacent moments can be expressed as $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1})$. This embodiment again uses a norm to calculate the feature distance of a sensor's perception information at two adjacent moments: $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1}) = L_n(t_i, t_{i+1}) = \left( \sum_k |F_{j,i,k} - F_{j,i+1,k}|^n \right)^{1/n}$, where $L_n(t_i, t_{i+1})$ is the distance between the sensor's feature vectors at moments i and i+1, $F_{j,i}$ is the feature vector of the j-th sensor at moment i, $F_{j,i+1}$ its feature vector at moment i+1, and n is a non-negative integer (0, 1, 2, 3, ...).
  • according to the third distance model, the distance between each suspicious sensor's feature vectors at two adjacent moments is calculated and compared with the respective second distance threshold; a sensor whose sensing data exceeds its second distance threshold is an abnormal sensor. The feature vectors of a suspicious sensor can be substituted into the third distance model to calculate the feature distance of its perception information.
  • the second distance threshold corresponding to each sensor's feature distance is not a fixed value. Under normal circumstances, the data sensed by a sensor at two adjacent moments must mostly overlap, i.e., show an obvious correlation in the time series, so between the two moments the distance of the sensor's feature vectors must lie within a certain threshold range. Following this idea, whether a sensor is abnormal is judged by computing whether the distance between its perception data at adjacent moments is within the threshold range.
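A sketch of this temporal self-check under the same normal-distribution assumption used for screening; all names are illustrative:

```python
import numpy as np

def temporal_distance(f_prev, f_curr, n=2):
    """Difference_sensor-j(t_i, t_i+1): Ln distance between one sensor's
    feature vectors at two adjacent moments."""
    d = np.abs(np.asarray(f_curr, float) - np.asarray(f_prev, float))
    return float(np.sum(d ** n) ** (1.0 / n))

def is_abnormal(f_prev, f_curr, tau, sigma3, n_sigma=2.58):
    """Confirm a suspicious sensor as abnormal when its adjacent-moment
    distance leaves the interval (tau - n*sigma3, tau + n*sigma3)."""
    return abs(temporal_distance(f_prev, f_curr) - tau) > n_sigma * sigma3
```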
  • the process of confirming abnormal sensors can be done as follows: establish the normal distribution $D_3 \sim N(\tau, \sigma_3^2)$ of the adjacent-moment feature distance, with expectation $\tau = \frac{1}{m_3}\sum_{i=1}^{m_3} \text{Difference}_i$ and variance $\sigma_3^2 = \frac{1}{m_3}\sum_{i=1}^{m_3} (\text{Difference}_i - \tau)^2$, where $m_3$ is the total number of adjacent time periods involved in the calculation ($m_3$ an integer greater than 0). The expectation of the adjacent-moment distance differs from sensor to sensor and is related to the time between the sensor's two adjacent moments. The confidence interval $M_3$ with a 99% confidence level is $(\tau - 2.58\sigma_3, \tau + 2.58\sigma_3)$; if the measured distance falls outside $M_3$, the suspicious sensor is an abnormal sensor. The confidence level can be set as required and is not necessarily limited to 99%: n = 1.65 corresponds to 90% and n = 1.96 to 95%.
  • the sensor security detection method of this embodiment has been simulated and verified through experiments, and can detect the security of the sensors in an unmanned system in real time and accurately locate abnormal sensors.
  • the specific verification process is as follows.
  • the multi-sensor cross-correlation positioning experiment is carried out on ROS (Robot Operating System), using the KITTI dataset as test data.
  • this dataset is used to evaluate the in-vehicle performance of algorithms such as lidar odometry, visual odometry, stereo, optical flow, 3D object detection, and 3D tracking.
  • KITTI contains real image data collected in scenes such as urban areas, rural areas, and highways.
  • the sensor suite includes a 64-line lidar located in the center of the roof.
  • a color camera and a black-and-white camera are placed on each side of the lidar, four cameras in total.
  • at the left rear of the radar there is an integrated navigation system (OXTS RT 3003), which outputs RTK/IMU integrated navigation results (longitude, latitude, and attitude) as well as raw IMU data.
  • the KITTI data set was played back offline; the vehicle pose estimated by the lidar and IMU and the pose estimated by GPS were unified into the world coordinate system, and the feature values of the lidar at different moments were tampered with (simulating sensors being attacked from outside). Using the cross-correlation of the feature vectors output by the lidar, IMU, and GPS, together with the autocorrelation of each sensor, the moments at which the data were tampered with were found and the specific attacked sensor was located.
  • substituting into the calculation of the IMU's abnormal-distance threshold indicates whether the feature data of the IMU sample at moment i+1 is normal at the 99% confidence level; otherwise it is abnormal.
  • Figure 10 is the positioning trajectory output in the space domain after the lidar attack data are injected;
  • Figure 11 shows the waveform of the position feature $[x, y]^T$ in the time domain, where the solid line is the feature waveform with lidar attack data injected at times 0, 1000, 2000, 3000, and 4000, and the dotted line is the GPS positioning reference waveform.
  • the circles mark the positioning waveform at the moments when the lidar is attacked.
  • Table 1 records the cross-correlation and autocorrelation distance statistics between the lidar/IMU fused positioning output and the GPS positioning output at the attacked time-series indices 0, 1000, 2000, 3000, and 4000 and at the normal indices 500, 1500, 2500, and 3500. At the attack-injection moments, the lidar/IMU fused positioning output deviates from the GPS reference waveform by more than 10 meters on average, far beyond the upper limit of the normal threshold.
  • after computing the distance between moment i and moment i+1, the adjacent-moment distance of the test samples with injected lidar attacks exceeds the upper limit of $\text{Difference}_{Lidar}(t_i, t_{i+1})$, while the measured $\text{Difference}_{IMU}(t_i, t_{i+1}) \in [1.01082, 1.03722]$ and $\text{Difference}_{GPS}(t_i, t_{i+1}) \in [0.29, 2.36]$ remain within their normal ranges. It can therefore be argued, at the 99% confidence level, that at times 0, 1000, 2000, 3000, and 4000 the GPS and IMU are normal but the lidar sensor is abnormal and may be under attack.
  • the sensor security detection method for the unmanned system of this embodiment can thus locate abnormal sensors very accurately.
  • this embodiment also provides a computer-readable storage medium 10 and a sensor security detection device for an unmanned system.
  • the processor 20 loads and executes the steps of the above sensor security detection method for an unmanned system, and the storage medium 10 is a part of the detection device.
  • the processor may be a central processing unit (Central Processing Unit, CPU), a controller, a microcontroller, a microprocessor, or other data processing chips.
  • the processor is typically used to control the overall operation of the computing device.
  • the processor is configured to run program codes stored in the storage medium or process data.
  • addressing the problem of sensor security, the embodiment of the present invention starts from the two major sensor functions of positioning and detection, divides the sensors into two groups, and unifies the sensing data into the same features according to the cross-correlation between the sensors' sensing signals.
  • distance models and distance thresholds between sensors are constructed to measure inter-sensor distances, and these serve as the basis for judging sensor abnormality.
  • distance functions and distance thresholds for adjacent moments are also established, and the normal distribution and confidence level are used to judge whether a sensor is abnormal, so that abnormal sensors are accurately identified and located.
  • the present invention is not merely an anomaly detection method for a single sensor but a complete sensor security detection method for unmanned systems; it is suitable not only for unmanned vehicles but also for other unmanned systems, such as unmanned aerial vehicles and unmanned ships.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Manufacturing & Machinery (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Disclosed in the present invention is a sensor security detection method for an unmanned system, comprising: dividing the sensors in the unmanned system into two groups according to the positioning and detection functions, and separately searching for the cross-correlation between the sensors in each group; screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group; and determining whether each suspicious sensor is an abnormal sensor according to the correlation of its sensing data at two adjacent moments. Also disclosed are a sensor security detection device for an unmanned system and a storage medium. According to the present invention, the sensors are divided into two groups according to the two sensor functions of positioning and detection, and the cross-correlation between the sensors within a group and the autocorrelation of each sensor in the time sequence are searched, so that suspicious sensors are screened out and abnormal sensors are determined from among them, realizing real-time detection of sensor security in the unmanned system and accurate localization of abnormal sensors.

Description

Sensor safety detection method, device and storage medium for unmanned system
Technical Field
The invention relates to the technical field of automatic driving and sensing, and in particular to a sensor security detection method, device, and storage medium for an unmanned system.
Background Art
Unmanned driving technology is a comprehensive technology involving many cutting-edge fields such as artificial intelligence, sensing technology, map technology, and computing. On-board sensors are important measurement devices on which self-driving cars rely. Sensors function like human eyes and ears, and mainly comprise lidar, vision cameras, millimeter-wave radar, global positioning system (GPS) receivers, and other equipment. Vehicle-mounted sensors provide unmanned vehicles with detailed information about the surrounding environment and rich data for the planning and decision-making modules, including the location, shape, category, and speed of obstacles, as well as semantic understanding of special scenarios (such as construction areas, traffic lights, and traffic signs).
Unmanned driving depends heavily on on-board sensors, and the sensors are extremely vulnerable to external attack or interference and may become unreliable. The security of the planning and decision-making stages of an unmanned vehicle is premised on the security of its sensors. Attacking the sensors is a simple, direct, brute-force, and effective method; because it does not require entering the interior of the unmanned driving system, the threshold of this attack method is very low, and it poses a huge security threat to unmanned systems. Once a sensor of an unmanned vehicle is attacked or fails, the information it acquires is distorted and the recognition results are wrong, so an incorrect driving strategy is planned, which is very likely to cause a car accident and serious loss of life and property.
Current research on unmanned driving technology focuses mainly on the functional level, such as environmental perception and sensor fusion; security research on close-range attacks against unmanned vehicles is still at a relatively early stage, consisting mainly of improvement and defense measures for individual sensors that enhance robustness and reduce the impact of attacks to a certain extent. A real-time, systematic method for detecting abnormal sensors has not yet been formed.
Summary of the Invention
In view of the deficiencies of the prior art, the present invention provides a sensor security detection method, device, and storage medium for unmanned systems that can detect the security of the sensors in an unmanned system in real time and accurately locate abnormal sensors.
To achieve the above object, the present invention adopts the following technical solution:
A sensor security detection method for an unmanned system comprises:
dividing the sensors in the unmanned system into two groups, the first group being sensors for positioning and the second group being sensors for detecting the shape characteristics of objects;
separately finding the cross-correlation between the sensors within each group;
screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group;
judging whether each suspicious sensor is an abnormal sensor according to the correlation of its sensing data at two adjacent moments.
As one embodiment, the process of separately finding the cross-correlation between the sensors in each group includes: unifying the sensing data of the correlated sensors in each group into the same feature dimension. The process of screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group includes: establishing, under the same feature dimension, distance models that measure the distance between the sensing data of the cross-correlated sensors in each group; calculating the distances between the corresponding sensing data in real time according to the distance models and comparing them with the respective first distance thresholds, a sensor whose sensing data exceeds its first distance threshold being a suspicious sensor.
As one embodiment, the first group of sensors includes GPS (Global Positioning System), IMU (Inertial Measurement Unit), and Lidar (laser radar). The process of unifying the sensing data of the correlated sensors in the first group into the same feature dimension includes: fusing the lidar and the IMU for positioning to form the fused feature vector $F_{Lidar,i} = [x_{l,i}, y_{l,i}, z_{l,i}]^T$, and transforming the original GPS feature vector by projection into the GPS feature vector $F_{GPS,i} = [x_{G,i}, y_{G,i}, z_{G,i}]^T$. In establishing the first distance model Distance(GPS, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the first group, a norm characterizes the distance: $\text{Distance}(GPS, Lidar) = L_n = \left( \sum_k |F_{GPS,k} - F_{Lidar,k}|^n \right)^{1/n}$, where n is a non-negative integer.
As one embodiment, fusing the lidar and the IMU for positioning includes: pre-integrating the IMU measurement data $I_{(i,i+1)} = \{(\hat{a}_t, \hat{\omega}_t)\}$ acquired between two moments; and estimating the relative motion of the carrier using the continuous feature point cloud sequences $\{P_k\}$, $\{P_{k+1}\}$ and the IMU pre-integration result.
As one embodiment, the process of estimating the relative motion of the carrier using the continuous feature point cloud sequences $\{P_k\}$, $\{P_{k+1}\}$ and the IMU pre-integration result includes: using the IMU pre-integration result to coordinate-transform the feature point cloud sequence $\{P_{k+1}\}$ so that it lies in the same coordinate system as the feature point cloud sequence $\{P_k\}$; with $p_j \in P_{k+1}$ and $p_i \in P_k$, recording the sum of the distances between $p_j$ and $p_i$ as d, and using the Levenberg-Marquardt algorithm to solve for the rotation $R_B^W$ and the translation $t_B^W$ from the carrier coordinate system (B) to the world coordinate system (W) when d is minimal; and obtaining, from the rotation $R_B^W$ and the translation $t_B^W$, the lidar/IMU fused positioning result (the position of the carrier in W being given by the translation $t_B^W$).
As one embodiment, the second group of sensors includes a lidar and a camera. The process of unifying the sensing data of the correlated sensors in the second group into the same feature dimension includes:
For the sensing data of the lidar: after transformation of the coordinate system, projecting any three-dimensional point $(X_L, Y_L, Z_L)$ of the three-dimensional point cloud of the sensing data onto a two-dimensional point (i, j) of the two-dimensional coordinate system; and performing feature extraction on the resulting tensor to obtain the lidar feature vector $F_{Lidar} = [X_L, Y_L, h_L, w_L, \theta_L]^T$, where $[X_L, Y_L]^T$ is the position of the detection target relative to the vehicle body coordinate origin $[0, 0]^T$, $[h_L, w_L]^T$ is the height and width of the detection frame, and $\theta_L$ is the confidence that the detection frame contains the detection target.
For the sensing data of the camera: calling a deep convolutional neural network algorithm to perform feature extraction on the collected RGB images to obtain the camera feature vector $F_{Camera} = [X_v, Y_v, h_v, w_v, p]^T$, where $[X_v, Y_v]^T$ is the coordinate of the target object in the image coordinate system, $[h_v, w_v]^T$ is the height and width of the detection frame, and p is the confidence of the target object.
As one embodiment, in establishing the second distance model Distance(Camera, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the second group, a norm characterizes the distance: $\text{Distance}(Camera, Lidar) = L_n = \left( \sum_k |F_{Camera,k} - F_{Lidar,k}|^n \right)^{1/n}$, where n is a non-negative integer.
As one embodiment, the process of determining a suspicious sensor includes: establishing a normal distribution of the distance D between the sensing data of the cross-correlated sensors A and B in at least one of the first group and the second group: $D \sim N(\varepsilon, \sigma^2)$, with expectation $\varepsilon = \frac{1}{m} \sum_{i=1}^{m} \text{Distance}_i(A, B)$ and variance $\sigma^2 = \frac{1}{m} \sum_{i=1}^{m} (\text{Distance}_i(A, B) - \varepsilon)^2$, where $\text{Distance}_i(A, B)$ is the distance between the sensing data of the two sensors A and B in the i-th sample, m is the total number of normal samples, and m is an integer greater than 0; and judging whether the confidence interval condition $\text{Distance}(A, B) \in M = (\varepsilon - n\sigma, \varepsilon + n\sigma)$ is satisfied. If it is not satisfied, the two sensors A and B are suspicious sensors.
As one embodiment, the process of judging, according to the correlation of its sensing data at two adjacent moments, whether a suspicious sensor is an abnormal sensor includes: determining the feature vector representation $F_{j,i}$ of each suspicious sensor, $F_{j,i}$ denoting the feature vector of the j-th sensor at moment i; establishing a time-series third distance model $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1})$ expressing the distance between the sensing data of the j-th sensor at two adjacent moments, moment i and moment i+1; and, according to the third distance model, calculating the distance between the feature vectors of each suspicious sensor at two adjacent moments and comparing it with the respective second distance threshold, a sensor whose sensing data exceeds its second distance threshold being an abnormal sensor.
As one embodiment, in establishing the time-series third distance model, a norm characterizes the distance between the sensing data: $\text{Difference}_{sensor\text{-}j\text{-}i}(t_i, t_{i+1}) = L_n(t_i, t_{i+1}) = \left( \sum_k |F_{j,i,k} - F_{j,i+1,k}|^n \right)^{1/n}$, where n is a non-negative integer and $F_{j,i+1}$ denotes the feature vector of the j-th sensor at moment i+1.
As one embodiment, the process of determining an abnormal sensor includes: establishing a normal distribution of the distance $D_3$ between a suspicious sensor's feature vectors at adjacent moments: $D_3 \sim N(\tau, \sigma_3^2)$, with expectation $\tau = \frac{1}{m_3} \sum_{i=1}^{m_3} \text{Difference}_i$ and variance $\sigma_3^2 = \frac{1}{m_3} \sum_{i=1}^{m_3} (\text{Difference}_i - \tau)^2$, where $m_3$ is the total number of adjacent time periods involved in the calculation and $m_3$ is an integer greater than 0; and judging whether the confidence interval condition $D_3 \in M_3 = (\tau - n\sigma_3, \tau + n\sigma_3)$ is satisfied. If it is not satisfied, the suspicious sensor is an abnormal sensor.
Another object of the present invention is to provide a storage medium storing a plurality of instructions suitable for being loaded by a processor to execute the steps of any of the above sensor security detection methods for an unmanned system.
Another object of the present invention is to provide a sensor security detection device for an unmanned system, comprising the above storage medium and a processor adapted to execute the instructions.
Addressing the problem of sensor security, the present invention divides the sensors into two groups according to the two major sensor functions of positioning and detection, and searches for the cross-correlation between the sensors within each group and the autocorrelation of each sensor in the time series, so that suspicious sensors are screened out according to the differences between the sensing data of cross-correlated sensors, and whether a suspicious sensor's data is safe and reliable is judged from the correlation of its sensing data at two adjacent moments, thereby detecting the security of the sensors in the unmanned system in real time and accurately locating abnormal sensors.
Description of Drawings
Fig. 1 is a flowchart of a sensor security detection method for an unmanned system according to an embodiment of the present invention;
Fig. 2 is a functional block diagram of a sensor security detection method for an unmanned system according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the inertial measurement unit according to an embodiment of the present invention;
Fig. 4 is a functional block diagram of multi-sensor fusion positioning according to an embodiment of the present invention;
Fig. 5 is a diagram of the frequency relationship between the IMU and the lidar according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the principle of spherical projection of the lidar point cloud according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the azimuth φ and vertex angle θ of the lidar according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the transformation between the spherical and rectangular coordinates of the lidar according to an embodiment of the present invention;
Fig. 9 shows the GPS and lidar/IMU fusion positioning trajectories before the simulated sensor is attacked, according to an embodiment of the present invention;
Fig. 10 shows the GPS and lidar/IMU fusion positioning trajectories after the simulated sensor is attacked, according to an embodiment of the present invention;
Fig. 11 is the positioning trajectory of the lidar after a tampering attack, according to an embodiment of the present invention;
Fig. 12 is a structural block diagram of a sensor security detection device for an unmanned system according to an embodiment of the present invention.
Detailed Description
In the present invention, the terms "arranged", "provided with", and "connected" should be interpreted broadly. For example, a connection may be fixed, detachable, or integral; mechanical or electrical; direct, indirect through an intermediary, or internal communication between two devices, elements, or components. Those of ordinary skill in the art can understand the specific meanings of these terms in the present invention according to the specific situation.

Moreover, some of the above terms may indicate meanings other than orientation or positional relationship; for example, the term "upper" may in some cases indicate an attachment or connection relationship. Those skilled in the art can understand the specific meanings of these terms in the present invention according to the specific situation.

To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
Referring to Fig. 1 and Fig. 2, an embodiment of the present invention provides a sensor security detection method for an unmanned system, comprising:
S01. Divide the sensors in the unmanned system into two groups with unified feature representation: the first group comprises the sensors used for positioning, and the second group comprises the sensors used for detecting the shape features of objects.

Exemplarily, the on-board sensors in the unmanned system may include an IMU, a lidar, GPS, vision cameras, millimeter-wave radar, and other devices.
As shown in Fig. 3, the IMU is a device that measures the three-axis attitude angles (or angular rates) and the acceleration of the carrier. An IMU is usually composed of three accelerometers, three gyroscopes, and three magnetometers, which measure the angular velocity and acceleration of the object in three-dimensional space, from which the attitude of the carrier is solved. The accelerometers, gyroscopes, and magnetometers are mounted on mutually perpendicular measurement axes, so the output can be regarded as acceleration, angular velocity, and magnetic field strength in three directions, nine degrees of freedom (DOF) in total, expressed as IMU = [a_x, a_y, a_z, ω_x, ω_y, ω_z, g_x, g_y, g_z]^T.
The lidar is a radar system that emits laser beams to probe characteristic quantities of a target such as its position and speed. Its working principle is to emit a probing laser beam toward the target and compare the received signal reflected back from the target (the target echo) with the transmitted signal, thereby obtaining information about the target, such as distance, azimuth, height, speed, attitude, and even shape. This enables target detection, tracking, and identification, and, using simultaneous localization and mapping (SLAM) techniques, estimation of the carrier's position and attitude.
The basic principle of the GPS navigation system is to measure the distances from satellites at known positions to the user's receiver, and then combine the data of multiple satellites to determine the receiver's specific position. Its output is usually longitude, latitude, and altitude, expressed as GPS = [lon, lat, alt]^T, where lon is longitude, lat is latitude, and alt is altitude.
The visual perception part comprises target detection and positioning. According to these two functions, the sensors can be divided into the following two groups: the first group may include GPS, IMU, and lidar; the second group may include lidar and cameras.
The first group of sensors is used for positioning. Since GPS is based on the global coordinate system while the IMU and lidar are based on the carrier coordinate system, in the positioning task the GPS positioning result can be unified with the lidar/IMU positioning result in the world coordinate system. In the SLAM algorithm for lidar-IMU fusion positioning, the two types of sensors are tightly coupled; Fig. 4 shows the framework for joint positioning by GPS, lidar, and IMU.
The second group of sensors is used for detection. In the detection task, the camera and lidar detect target objects and output each object's position coordinates, height, and width. The lidar's point cloud in three-dimensional space can be projected onto a spherical coordinate system, and the points in that spherical coordinate system can in turn be transformed into a two-dimensional coordinate system. In the two-dimensional coordinate system, the feature vector of the detected target, containing its coordinates and detection-box size, is obtained from the point distribution and the lidar target-detection network. Similarly, the camera obtains the coordinates and detection-box size of the detected target from the RGB picture and a visual detection network.
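By way of illustration, the grouping of step S01 can be captured in a simple configuration; a minimal sketch in Python, with the mapping mirroring this embodiment (names illustrative):

```python
# Grouping of on-board sensors by function (step S01); the exact sensor
# complement is vehicle-specific, this mapping mirrors the embodiment.
SENSOR_GROUPS = {
    "positioning": ["GPS", "IMU", "lidar"],  # group 1: localization
    "detection":   ["lidar", "camera"],      # group 2: object shape/position
}
```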
S02. Find the cross-correlation between the sensors within each group.
The process of finding the cross-correlation between the sensors within each group can specifically consist of unifying the perception data of the correlated sensors within each group into the same feature dimension. As shown in Fig. 2, the feature unification for the positioning task and for the detection task proceed separately.
[Unified feature representation in the positioning task]
Specifically, in the positioning task, the perception data of the correlated sensors in the first group must first be unified into the same feature dimension. This process specifically includes:
1) Lidar and IMU fusion positioning forms the fused feature vector F_Lidar,IMU.

The original IMU feature vector is IMU = [a_x, a_y, a_z, ω_x, ω_y, ω_z, g_x, g_y, g_z]^T. Since the positioning task is mainly concerned with the three-axis velocity and acceleration states of the carrier, the IMU output feature can be defined as:

F_IMU = [a^T, ω^T, v^T, b^T]^T;

where the acceleration is a three-dimensional vector in Euclidean space, a = [a_x, a_y, a_z]^T ∈ R³; the angular velocity is ω = [ω_x, ω_y, ω_z]^T; the velocity is a three-dimensional vector in Euclidean space, v = [v_x, v_y, v_z]^T ∈ R³; and the bias term consists of the gyroscope bias b_g and the accelerometer bias b_a, i.e., b = [b_g^T, b_a^T]^T.
The original lidar feature is the set of point clouds, expressible as the vector P = {[X, Y, Z, intensity]^T} (for a 64-line lidar, each line scans 1800 point clouds per revolution), where X is the abscissa of a point in the horizontal plane, Y its ordinate in the horizontal plane, Z its height, and intensity its reflection intensity. For the positioning task, the pose of the carrier equipped with the lidar is solved by the SLAM algorithm, and its feature vector can be expressed as F_Lidar = [x_l, y_l, z_l]^T.
Relationship between the lidar and IMU feature vectors: the carrier coordinate system is defined as B, coinciding with the IMU coordinate system; the world coordinate system is defined as W, with its origin at the center of the carrier system when the system is initialized. Let the starting moment of the i-th lidar scan be t_i and all point clouds obtained by that scan be P_i, any point of which is denoted p_n ∈ P_i; the data collected by the IMU within [t_i, t_{i+1}] is I_(i,i+1), with i an integer greater than 0. Since the IMU output rate is higher than that of the lidar, I_(i,i+1) contains n groups of accelerations and angular velocities in the carrier coordinate system B, {(â_k, ω̂_k)}, k = 1, ..., n. The state of the system at moment t_i comprises attitude, position, velocity, and IMU bias. The pose transformation constitutes the special Euclidean group, [R_i, t_i] ∈ SE(3); the velocity is a three-dimensional vector in Euclidean space, v ∈ R³; and the bias term consists of the gyroscope bias b_g and the accelerometer bias b_a, b = [b_g^T, b_a^T]^T.
According to the working principle of the IMU, its observation model is:

ω̂_B = ω_B + b_g + η_g;

â_B = (R_B^W)^T ( a^W - g^W ) + b_a + η_a;

where η ~ N(0, Σ) is the observation noise, R_B^W is the rotation matrix from the carrier coordinate system B to the world coordinate system W, t_B^W is the translation vector from B to W, and g^W is the gravitational acceleration. Integrating ω̂ and â over the IMU sampling interval δt according to the IMU dynamic model gives:

R_{B,i+1}^W = R_{B,i}^W · Exp( (ω̂_i - b_g) δt );

v_{i+1}^W = v_i^W + g^W δt + R_{B,i}^W ( â_i - b_a ) δt;

p_{i+1}^W = p_i^W + v_i^W δt + ½ g^W δt² + ½ R_{B,i}^W ( â_i - b_a ) δt².
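By way of illustration, a minimal sketch of one propagation step of these equations, assuming SciPy's Rotation for the Exp map and a z-up world frame (both assumptions, not specified by the embodiment):

```python
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, -9.81])  # g^W, assuming a z-up world frame

def imu_step(R, v, p, acc_meas, gyro_meas, b_a, b_g, dt):
    """One discrete propagation step of the IMU kinematic model, giving
    rotation, velocity, and position in the world frame W."""
    a = acc_meas - b_a                      # bias-corrected specific force (frame B)
    w = gyro_meas - b_g                     # bias-corrected angular rate (frame B)
    R_next = R * Rotation.from_rotvec(w * dt)       # R * Exp(w * dt)
    v_next = v + GRAVITY * dt + R.apply(a) * dt
    p_next = p + v * dt + 0.5 * GRAVITY * dt**2 + 0.5 * R.apply(a) * dt**2
    return R_next, v_next, p_next
```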
Fig. 5 shows the frequency relationship between the IMU and the lidar. The laser odometry module uses the consecutive feature point cloud sequences {P_k}, {P_{k+1}} and the IMU pre-integration result ΔT_{k,k+1} to estimate the relative motion of the carrier; its output rate matches the lidar sampling rate.
Therefore, positioning by lidar-IMU fusion can specifically include the following steps:

S021. Pre-integrate the IMU measurement data I_(k,k+1) between two moments to obtain the pose change of the moving body between those two moments;

S022. Use the consecutive feature point cloud sequences {P_k}, {P_{k+1}} and the IMU pre-integration result ΔT_{k,k+1} to estimate the relative motion of the carrier.
In step S022, estimating the relative motion of the carrier from {P_k}, {P_{k+1}} and the pre-integration result ΔT_{k,k+1} can specifically include:

S0221. Use the IMU pre-integration result ΔT_{k,k+1} to transform the coordinates of the feature point cloud sequence {P_{k+1}} so that it lies in the same coordinate system as {P_k}. With p_j ∈ P_{k+1} and p_i ∈ P_k, where j and k are integers greater than 0 and f(·) denotes the point-cloud distance expression, the sum of the distances between the points p_j and their correspondences p_i is denoted d:

d = Σ_{p_j ∈ P_{k+1}} f(p_j, p_i) = Σ_{p_j ∈ P_{k+1}} || R_B^W p_j + t_B^W - p_i ||;

S0222. Minimize d, using the Levenberg-Marquardt algorithm to solve for the rotation R_B^W and translation t_B^W from the carrier coordinate system B to the world coordinate system W at which d is minimal.

The rotation R_B^W and translation t_B^W then give the lidar-IMU fused positioning: the optimized pose [R_B^W | t_B^W] ∈ SE(3), whose translation component yields the fused feature vector F_Lidar,IMU = [x_l, y_l, z_l]^T.
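By way of illustration, a minimal sketch of steps S0221 and S0222, assuming nearest-neighbor correspondences via a k-d tree and SciPy's Levenberg-Marquardt solver; this simplifies the embodiment (no feature selection, point-to-point distance only), and all names are illustrative:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation

def register_scans(P_k, P_k1, x0=np.zeros(6)):
    """Estimate the rotation R and translation t aligning scan P_k1 to P_k
    by minimizing the summed point-to-nearest-point distance d.
    x = [rx, ry, rz, tx, ty, tz]; x0 would come from IMU pre-integration."""
    tree = cKDTree(P_k)

    def residuals(x):
        R = Rotation.from_rotvec(x[:3])
        transformed = R.apply(P_k1) + x[3:]   # move P_{k+1} into P_k's frame
        d, _ = tree.query(transformed)        # distance to nearest point in P_k
        return d

    sol = least_squares(residuals, x0, method="lm")  # Levenberg-Marquardt
    return Rotation.from_rotvec(sol.x[:3]), sol.x[3:]
```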
2) The original GPS feature vector is transformed by projection into the GPS feature vector F_GPS = [x_G, y_G, z_G]^T.

The original GPS feature vector, GPS = [lon, lat, alt]^T, is latitude/longitude information in WGS-84 coordinates, where lon is longitude, lat is latitude, and alt is altitude. For the positioning task in automatic driving, the GPS coordinates must be projected onto a planar map before dead reckoning can be done, so the feature vector output by the GPS positioning algorithm can be defined as F_GPS = [x_G, y_G, z_G]^T.

GPS positioning establishes the relationship between the world coordinate system W of the carrier and the earth coordinate system (WGS-84); its original feature vector output at moment i+1 can be expressed as GPS_{i+1} = [lon_{i+1}, lat_{i+1}, alt_{i+1}]^T. Commonly used projection methods include the Gauss-Kruger projection, the UTM projection, and the Mercator projection.

In this embodiment, the GPS coordinates can be converted with the Mercator projection as follows:

x_{G,i+1} = EARTH_RAD · SCALE · (π · lon_{i+1} / 180);

y_{G,i+1} = EARTH_RAD · SCALE · ln( tan( π/4 + π · lat_{i+1} / 360 ) );

z_{G,i+1} = alt_{i+1};

where x_{G,i+1}, y_{G,i+1}, z_{G,i+1} are the ground coordinates of the original GPS feature vector after Mercator projection, EARTH_RAD is the earth radius, taken as 6378137 meters, and SCALE is the map scale. The feature vector output by the GPS positioning algorithm at moment i+1 is therefore F_GPS,i+1 = [x_{G,i+1}, y_{G,i+1}, z_{G,i+1}]^T.
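By way of illustration, one plausible realization of this projection; the exact scaling convention of SCALE is not spelled out in the text, so it is treated here as a plain multiplicative factor:

```python
import math

EARTH_RAD = 6378137.0   # earth radius in metres, as in the embodiment
SCALE = 1.0             # map scale; application-specific

def gps_to_mercator(lon_deg, lat_deg, alt):
    """Project a WGS-84 fix to the planar Mercator feature
    F_GPS = [x_G, y_G, z_G]^T used for dead reckoning."""
    x = EARTH_RAD * SCALE * math.radians(lon_deg)
    y = EARTH_RAD * SCALE * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    z = alt
    return x, y, z

print(gps_to_mercator(114.06, 22.54, 10.0))  # e.g. a fix near Shenzhen
```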
[Unified feature representation in the detection task]
Specifically, in the detection task, conventional CNN (Convolutional Neural Network) detection designs are mostly intended for two-dimensional image pattern recognition (width × height × channels). Three-dimensional point-cloud data does not fit this pattern, and point clouds are sparse and irregular, both of which are unfavorable for feature extraction. Therefore, the perception data of the correlated sensors in the second group must first be unified into the same feature dimension; this process is described in detail below.

Before the point-cloud data is fed into the CNN, it is first spherically projected to obtain dense, two-dimensional data; the process of spherical projection onto the two-dimensional image plane is shown in Fig. 6.
(1) LiDAR
Fig. 7 is a schematic diagram of the azimuth φ and vertex angle θ of the lidar, where φ and θ denote the azimuth and altitude (vertex) angle of a point, respectively. Ordinarily the azimuth is measured from true north; in the lidar coordinate system of this embodiment, however, the azimuth is measured from the x direction (straight ahead of the vehicle). φ and θ are computed as:

φ = arctan( Y / X );

θ = arctan( Z / √(X² + Y²) );

where (X, Y, Z) are the coordinates of each point in the three-dimensional point cloud. Thus, by the formulas above, (θ, φ) can be computed for every point of the point cloud from its coordinates (X, Y, Z), projecting the points of the three-dimensional space coordinate system onto a spherical coordinate system.
As shown in Fig. 8, which illustrates the transformation between spherical and rectangular coordinates, discretizing the angles (θ, φ) yields a two-dimensional rectangular grid:

i = ⌊ φ / Δφ ⌋;

j = ⌊ θ / Δθ ⌋;

where Δφ and Δθ are the angular resolutions. By these formulas, every point (X_L, Y_L, Z_L) in the spherical coordinate system can be represented by a point in a rectangular coordinate system.
That is, unifying the lidar perception data into the same feature dimension requires the following process:
1) Through the coordinate transformation, each three-dimensional point (X_L, Y_L, Z_L) of the perception data's point cloud is projected to a two-dimensional point (i, j), yielding a tensor of size (W, H, C), where W is the number of grid cells into which the horizontal laser perception range is divided, H is the number of lidar lines, and C is the dimension of the laser-point feature vector.

Here, five features of each point in the point cloud, [X, Y, Z, intensity, range]^T, are placed into the corresponding two-dimensional cell (i, j), where intensity is the reflection intensity of the point and range is the distance from the viewpoint to the sampled point.
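By way of illustration, a minimal sketch of this projection to a (W, H, C) tensor; the vertical field-of-view bounds are typical values for a 64-line lidar, assumed here rather than taken from the embodiment:

```python
import numpy as np

def spherical_project(points, W=1800, H=64,
                      fov_up=np.radians(2.0), fov_down=np.radians(-24.8)):
    """Project an (N, 4) cloud [X, Y, Z, intensity] to a (W, H, 5) tensor
    holding [X, Y, Z, intensity, range] per cell."""
    X, Y, Z, inten = points.T
    rng = np.sqrt(X**2 + Y**2 + Z**2)            # range to each point
    phi = np.arctan2(Y, X)                       # azimuth, relative to x
    theta = np.arcsin(Z / rng)                   # vertex (elevation) angle
    i = ((phi + np.pi) / (2 * np.pi) * W).astype(int) % W
    j = ((fov_up - theta) / (fov_up - fov_down) * H).astype(int).clip(0, H - 1)
    grid = np.zeros((W, H, 5), dtype=np.float32)
    grid[i, j] = np.stack([X, Y, Z, inten, rng], axis=1)
    return grid
```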
2) Feature extraction on the resulting tensor by the lidar target-detection network yields the lidar feature vector:

F_Lidar^det = [X_L, Y_L, h_L, w_L, θ_L]^T;

where [X_L, Y_L]^T is the position of the detected target relative to the vehicle-body coordinate system [0, 0]^T, [h_L, w_L]^T are the height and width of the detection box, and θ_L is the confidence that the detection box contains a target.
(2) Camera
The autonomous vehicle collects images of its surroundings through the on-board camera; the captured pictures are fed to the target-detection system in RGB format, and the detection system invokes a deep convolutional neural network to extract features from the RGB image. Common structures in such networks include convolutional layers, pooling layers, activation layers, dropout layers, BN (batch normalization) layers, and fully connected layers; the features finally extracted from the picture effectively describe the target object.

Therefore, unifying the camera perception data into the same feature dimension specifically requires the following process: invoke the deep convolutional neural network to extract features from the captured RGB image, obtaining the camera feature vector:

F_Camera = [X_v, Y_v, h_v, w_v, p]^T;

where [X_v, Y_v]^T are the coordinates of the target object in the image coordinate system, [h_v, w_v]^T are the height and width of the detection box, and p is the confidence of the target object.
S03. Screen out suspicious sensors according to the differences between the perception data of the cross-correlated sensors in each group.

The process of screening suspicious sensors can specifically include:

S031. Under the same feature dimension, establish for each group a distance model that measures the distance between the perception data of the group's cross-correlated sensors.

S032. Compute the distance between the corresponding perception data in real time according to the distance model, and compare it with the respective first distance threshold; a sensor whose perception data exceeds its first distance threshold is a suspicious sensor.

In step S031, likewise, the distance model for the first group (positioning sensors) and the distance model for the second group (detection sensors) are established independently; the specific processes follow.
[Establishing the distance model in the positioning task]
From the conversion relationship between GPS positioning and lidar-IMU fusion positioning, the perception information of GPS, lidar, and IMU can be unified under the same feature representation, and a distance model between the two positioning outputs can be established under that representation.

In establishing the first distance model Distance(GPS, Lidar), which measures the distance between the perception data of the cross-correlated sensors in the first group, this embodiment uses a norm to characterize the distance, so the distance between the output features of GPS positioning and of lidar-IMU fusion positioning can be expressed as:

Distance(GPS, Lidar) = ||F_GPS - F_Lidar,IMU||_n = ( Σ_k |F_GPS,k - F_Lidar,IMU,k|^n )^(1/n);

where n is a non-negative integer.
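By way of illustration, a minimal sketch of this norm-based distance; the n = 2 (Euclidean) case is shown as the default, and the feature values are made up for the example:

```python
import numpy as np

def feature_distance(f_a, f_b, n=2):
    """L_n (Minkowski) distance between two unified feature vectors;
    n=2 gives the Euclidean distance (use n >= 1 here, although the
    embodiment allows any non-negative order n)."""
    diff = np.abs(np.asarray(f_a, float) - np.asarray(f_b, float))
    return float((diff ** n).sum() ** (1.0 / n))

f_gps = [120.5, 88.2, 10.1]            # F_GPS   = [x_G, y_G, z_G]
f_li  = [121.1, 87.9, 10.0]            # F_Lidar,IMU = [x_l, y_l, z_l]
print(feature_distance(f_gps, f_li))   # Distance(GPS, Lidar)
```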
[Establishing the distance model in the detection task]
In theory, the detection results of the lidar feature vector F_Lidar^det and the camera feature vector F_Camera should agree, in two senses: the detection boxes have the same size, and the same coordinates, i.e., the boxes overlap completely. In practice, noise and sensor measurement accuracy introduce a certain normal offset between the two feature vectors. A distance model between F_Lidar^det and F_Camera, established under the same features, can be used to compute the distance between their detection boxes and coordinates.

In establishing the second distance model Distance(Camera, Lidar), which measures the distance between the perception data of the cross-correlated sensors in the second group, this embodiment again uses a norm to characterize the distance:

Distance(Camera, Lidar) = ||F_Camera - F_Lidar^det||_n = ( Σ_k |F_Camera,k - F_Lidar,k^det|^n )^(1/n);

where n is a non-negative integer.
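The same norm applies directly to the detection features; a brief illustration with made-up box values, excluding the confidence term here (the text fixes this convention only for the time-series analysis, so its exclusion from the cross-correlation distance is an assumption):

```python
import numpy as np

# Same L_n norm as in the positioning sketch, applied to detection features;
# only box position and size are compared (confidence excluded here).
f_lidar_det  = np.array([3.1, 0.4, 1.6, 1.8])   # [X_L, Y_L, h_L, w_L]
f_camera_det = np.array([3.0, 0.5, 1.5, 1.9])   # [X_v, Y_v, h_v, w_v]
d2 = float(np.linalg.norm(f_camera_det - f_lidar_det))  # n = 2 case
print(d2)                                        # Distance(Camera, Lidar)
```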
In step S032, the process of determining suspicious sensors can specifically include:

S0321. Establish the normal distribution of the distance D between the perception data of cross-correlated sensors A and B in at least one of the first and second groups:

D ~ N(ε, σ²);

where the expectation is

ε = (1/m) Σ_{i=1}^{m} Distance_i(A, B);

and the variance is

σ² = (1/m) Σ_{i=1}^{m} ( Distance_i(A, B) - ε )²;

with Distance_i(A, B) the distance between the perception data of the two sensors A and B in the i-th sample, and m the total number of normal samples, an integer greater than 0.

S0322. Judge whether the distance falls within the confidence interval M = [ε - nσ, ε + nσ], where n is a constant that varies with the confidence level. If the condition is not satisfied, the two sensors A and B are abnormal, i.e., suspicious sensors; otherwise the two sensors are normal.
For example, if a 99% confidence level is required to judge that the distance between these two sensors is normal, the corresponding confidence interval M should satisfy:

M = [ε - 2.58σ, ε + 2.58σ];

that is, Distance(A, B) ∈ [ε - 2.58σ, ε + 2.58σ] means that sensors A and B are considered free of abnormality at the 99% confidence level; otherwise sensors A and B are abnormal.

It will be appreciated that in other implementations the confidence level can be set as required and is not limited to 99%. For example, for a 90% confidence interval, n = 1.65 and M = [ε - 1.65σ, ε + 1.65σ]; for a 95% confidence interval, n = 1.96 and M = [ε - 1.96σ, ε + 1.96σ].
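By way of illustration, a minimal sketch of the screening rule of step S032 with the confidence constants above (names illustrative; the calibration samples are assumed to be known-normal):

```python
import numpy as np

Z_BY_CONFIDENCE = {0.90: 1.65, 0.95: 1.96, 0.99: 2.58}

def screen_pair(distances_normal, distance_now, confidence=0.99):
    """Flag the sensor pair (A, B) as suspicious when the current
    cross-correlation distance leaves the confidence interval
    [eps - z*sigma, eps + z*sigma] fitted on known-normal samples."""
    d = np.asarray(distances_normal, dtype=float)
    eps, sigma = d.mean(), d.std()
    z = Z_BY_CONFIDENCE[confidence]
    lo, hi = eps - z * sigma, eps + z * sigma
    return not (lo <= distance_now <= hi), (lo, hi)

suspicious, interval = screen_pair([8.1, 8.5, 8.3, 8.4, 8.2], 25.0)
print(suspicious, interval)   # True: outside the fitted interval
```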
Similarly, as shown in Fig. 2, the abnormality of each group of sensors is checked separately in the positioning task and in the detection task, as described below.
[Detecting abnormal sensors in the positioning task]
In the positioning task, both the lidar and the inertial sensor in the first group carry measurement errors, so even under normal conditions the features of the two in the same dimension exhibit some deviation or distance, influenced by measurement error, measurement accuracy, environmental noise, and other factors. Since the distance between the lidar and inertial perception data is affected by many factors, it can be taken to obey a normal distribution. We define the mean distance under normal conditions (the distance expectation) as ε_1 and the variance as σ_1², both computable from samples:

ε_1 = (1/m_1) Σ_{i=1}^{m_1} Distance_i(GPS, Lidar);

σ_1² = (1/m_1) Σ_{i=1}^{m_1} ( Distance_i(GPS, Lidar) - ε_1 )²;

where Distance_i(GPS, Lidar) is the GPS-to-lidar distance in the i-th sample and m_1 is the total number of normal samples.

Establish the normal distribution of the lidar-GPS distance D_1:

D_1 ~ N(ε_1, σ_1²).

Supposing a 99% confidence level is required to judge that this group's distance is normal, the corresponding confidence interval M_1 should satisfy:

M_1 = [ε_1 - 2.58σ_1, ε_1 + 2.58σ_1];

that is, Distance(GPS, Lidar) ∈ M_1 means that, at the 99% confidence level, the GPS, lidar, and IMU among the positioning sensors are considered free of abnormality; otherwise the GPS, IMU, or lidar among the positioning sensors is abnormal.
[Detecting abnormal sensors in the detection task]
In the detection task, the detection boxes of the lidar and the camera in the second group would ideally coincide exactly; in practice, owing to sensor measurement errors and the differing recognition algorithms behind each detection box, the boxes show a certain offset (in the coordinates and in the length and width of the box), and the offset distance lies within a certain range. Since the distance between the lidar and camera perception data is affected by many factors, it can be taken to obey a normal distribution. We define ε_2 as the distance expectation and σ_2² as the variance, both computable from samples:

ε_2 = (1/m_2) Σ_{i=1}^{m_2} Difference_i(Camera, Lidar);

σ_2² = (1/m_2) Σ_{i=1}^{m_2} ( Difference_i(Camera, Lidar) - ε_2 )²;

where Difference_i(Camera, Lidar) is the camera-to-lidar detection-box offset in the i-th sample and m_2 is the total number of normal samples.

Establish the normal distribution of the lidar-camera distance D_2:

D_2 ~ N(ε_2, σ_2²).

Supposing a 99% confidence level is required to judge that this group's distance is normal, the corresponding confidence interval M_2 satisfies:

M_2 = [ε_2 - 2.58σ_2, ε_2 + 2.58σ_2];

that is, Difference(Camera, Lidar) ∈ M_2 means that, at the 99% confidence level, the camera and lidar of the detection task are considered free of abnormality; otherwise the camera or lidar of the detection task is abnormal.
S04. Judge whether each suspicious sensor is an abnormal sensor according to the correlation of its perception data at two adjacent moments.

The process of judging abnormal sensors can specifically include:
S041. Determine the feature-vector representation F_i^j of each suspicious sensor, where F_i^j denotes the feature vector of the j-th sensor at moment i.
Once step S03 has screened out suspicious sensors, the autocorrelation of the sensors within the abnormal group must be checked further, i.e., each individual sensor is tested on the time series for abnormality, finally locating the abnormal sensor. Specifically, if the positioning function is abnormal, the time-series correlation of the first group of sensors, the lidar and the IMU, is checked separately to locate the abnormal sensor. If the detection function is abnormal, the time-series correlation of the second group of sensors, the lidar and the camera, is checked separately to locate the abnormal sensor.

To compute the distance between a sensor's perception data at two adjacent moments, the sensor's feature representation must first be determined. Balancing computational efficiency and accuracy, the feature vectors used for the time-series correlation computation are expressed, per group, as follows:
The original feature vector output by GPS positioning at moment i can be expressed as GPS_i = [lon_i, lat_i, alt_i]^T; after the Mercator projection the output feature vector is F_GPS,i = [x_G,i, y_G,i, z_G,i]^T.

The initial feature vector output by the lidar at moment t_i is the point cloud P_i = {[X, Y, Z, intensity]^T}, where X is the abscissa of a point in the horizontal plane, Y its ordinate in the horizontal plane, Z its height, and intensity its reflection intensity. This is the lidar's original feature vector, irrespective of whether the task is detection or positioning.

The feature vector solved by the lidar positioning algorithm at moment i is F_Lidar,i = [x_l,i, y_l,i, z_l,i]^T, where x_l,i is the abscissa in the horizontal plane of the carrier equipped with the lidar at moment t_i, y_l,i the ordinate in that plane, and z_l,i the height of the carrier. In the detection task, the feature vector solved by the laser detection algorithm at moment i is F_Lidar,i^det = [X_L,i, Y_L,i, h_L,i, w_L,i, θ_L]^T, where [X_L,i, Y_L,i]^T is the position of the detected target relative to the vehicle-body coordinate system [0, 0]^T, [h_L,i, w_L,i]^T are the height and width of the detection box, and θ_L is the confidence that the box contains a target.
The initial feature vector output by the IMU at moment t_i is F_IMU,i = [a_i^T, ω_i^T, v_i^T, b_i^T]^T, where the acceleration is a three-dimensional vector in Euclidean space, a_i = [a_x, a_y, a_z]^T ∈ R³; the angular velocity is ω_i = [ω_x, ω_y, ω_z]^T; the velocity is a three-dimensional vector in Euclidean space, v_i ∈ R³; and the bias term consists of the gyroscope bias b_g and the accelerometer bias b_a, b_i = [b_g^T, b_a^T]^T.
The camera's target-detection result is expressed by the vector F_Camera,i = [X_v,i, Y_v,i, h_i, w_i, p_i]^T, where [X_v,i, Y_v,i]^T are the coordinates of the target object in the image coordinate system, [h_i, w_i]^T are the height and width of the detection box, and p_i is the confidence of the target object. In the time-series correlation analysis, the confidence does not participate in the correlation computation.
S042. Establish the time-series third distance model Difference_sensor-j-i(t_i, t_{i+1}), which represents the distance between the perception data of the j-th sensor at two adjacent moments, moment i and moment i+1.

Under normal conditions, the information perceived by a sensor at two adjacent moments necessarily overlaps over most of its extent, i.e., it is clearly correlated on the time series; the distance of the j-th sensor between the two adjacent moments (moment i and moment i+1) can be expressed as Difference_sensor-j-i(t_i, t_{i+1}).
In establishing the time-series third distance model Difference_sensor-j-i(t_i, t_{i+1}), this embodiment again uses a norm to compute the feature distance between the sensor's perception information at two adjacent moments:

Difference_sensor-j-i(t_i, t_{i+1}) = L_n(t_i, t_{i+1}) = ||F_i^j - F_{i+1}^j||_n = ( Σ_k |F_{i,k}^j - F_{i+1,k}^j|^n )^(1/n);

where L_n(t_i, t_{i+1}) is the distance between the sensor's feature vectors at moments i and i+1, F_i^j is the feature vector of the j-th sensor at moment i, F_{i+1}^j is the feature vector of the j-th sensor at moment i+1, and n is a non-negative integer, e.g., 0, 1, 2, 3, ....
S043. According to the third distance model, compute the distance between each suspicious sensor's feature vectors at two adjacent moments and compare it with the corresponding second distance threshold; a sensor whose perception data exceeds its second distance threshold is an abnormal sensor.

Once the third distance model Difference_sensor-j-i(t_i, t_{i+1}) is established, the feature vectors of the suspicious sensor to be further judged are substituted into it to compute the feature distance of that sensor's perception information at two adjacent moments.

Here, the second distance threshold corresponding to each sensor's feature distance is not a fixed value. Under normal conditions, the data perceived by a sensor at two adjacent moments necessarily overlaps over most of its extent, i.e., it is clearly correlated on the time series, so at consecutive moments the distance between the sensor's feature vectors necessarily lies within a certain threshold range. Following this idea, we judge whether a sensor is abnormal by checking whether the distance between its perception data at adjacent moments lies within the threshold range.

Since the adjacent-moment distance of a single sensor is affected by environmental noise, measurement error, the feature-extraction algorithm, and other factors, we take the distance to obey a normal distribution.

Whether the suspicious sensor is abnormal can be determined as follows:
S0431. Establish the normal distribution of the distance D_3 between the suspicious sensor's feature vectors at adjacent moments:

D_3 ~ N(τ, σ_3²);

where the expectation is

τ = (1/m_3) Σ_{i=1}^{m_3} Difference_sensor-j-i(t_i, t_{i+1});

and the variance is

σ_3² = (1/m_3) Σ_{i=1}^{m_3} ( Difference_sensor-j-i(t_i, t_{i+1}) - τ )²;

with m_3 the total number of adjacent time periods involved in the calculation, m_3 an integer greater than 0. The expected adjacent-moment distance differs from sensor to sensor and is related to the duration between the sensor's two adjacent moments.
Judge whether the confidence interval M_3 satisfies the condition; at a 99% confidence level:

M_3 = [τ - 2.58σ_3, τ + 2.58σ_3];

that is, Difference_sensor-j-i(t_i, t_{i+1}) ∈ M_3 means that, at the 99% confidence level, the feature data perceived by the j-th sensor at moment i+1 is considered normal; if the condition is not satisfied, the suspicious sensor is an abnormal sensor.
It will be appreciated that in other implementations the confidence level can be set as required and is not limited to 99%. For example, for a 90% confidence interval, n = 1.65 and M_3 = [τ - 1.65σ_3, τ + 1.65σ_3]; for a 95% confidence interval, n = 1.96 and M_3 = [τ - 1.96σ_3, τ + 1.96σ_3].
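Putting steps S042 and S043 together, a minimal sketch of locating the abnormal sensor among the suspicious ones, assuming the L_2 case of the norm and pre-fitted per-sensor statistics (all names illustrative):

```python
import numpy as np

def locate_abnormal(suspects, history, stats, z=2.58):
    """Among the suspicious sensors, keep those whose latest adjacent-moment
    feature distance leaves their own interval [tau - z*sigma3, tau + z*sigma3].
    history[name] -> list of feature vectors over time;
    stats[name]   -> (tau, sigma3) fitted on normal data."""
    abnormal = []
    for name in suspects:
        f_prev, f_curr = history[name][-2], history[name][-1]
        d = np.linalg.norm(np.asarray(f_curr) - np.asarray(f_prev))  # L_2 case
        tau, sigma3 = stats[name]
        if not (tau - z * sigma3 <= d <= tau + z * sigma3):
            abnormal.append(name)
    return abnormal

history = {"lidar": [[1.0, 2.0, 0.5], [9.0, 7.0, 0.5]],
           "IMU":   [[0.1, 0.2, 0.0], [0.1, 0.3, 0.0]]}
stats = {"lidar": (0.9, 0.6), "IMU": (0.1, 0.05)}
print(locate_abnormal(["lidar", "IMU"], history, stats))  # ['lidar']
```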
The sensor security detection method for an unmanned system of this embodiment has been simulated, verified, and used experimentally; it can detect the security of the sensors in the unmanned system in real time and accurately locate abnormal sensors. The specific verification process is as follows.
(1) Experiment preparation

The multi-sensor cross-correlation positioning experiment was carried out on the ROS (Robot Operating System), using the KITTI dataset as test data. This dataset is used to evaluate algorithms such as lidar odometry, visual odometry, stereo vision, optical flow, 3D object detection, and 3D tracking in the vehicle environment. KITTI contains real image data collected in urban, rural, and highway scenes.

On the data-acquisition platform, the sensors include one 64-line lidar mounted at the exact center of the roof, with one color camera and one grayscale camera on each side of the lidar, four cameras in total. At the left rear of the lidar is an integrated navigation system (OXTS RT 3003), which outputs RTK/IMU integrated navigation results (including longitude, latitude, and attitude) as well as raw IMU data.
(2) Overall idea of the experiment

The experiment plays back the KITTI dataset offline, unifies the vehicle poses estimated by the lidar and IMU and the pose estimated by GPS into the world coordinate system, and tampers with the lidar's feature values at different moments (simulating an external attack on the sensor). Using the cross-correlation of the feature vectors output by the lidar, IMU, and GPS, together with the autocorrelation of each sensor, the moment at which the data is tampered with is found and the specific attacked sensor is located.
Given the lidar-IMU positioning output feature F_Lidar,IMU = [x_l,i, y_l,i, z_l,i]^T and the GPS positioning output feature F_GPS = [x_G,i, y_G,i, z_G,i]^T, and since only motion of the vehicle in the plane is considered, z_l,i in the lidar-IMU positioning is a preset fixed height while z_G,i, measured by satellite, reflects the height of the true geographic location; the two are uncorrelated. An unmanned vehicle moving in the plane can therefore consider only the output features [x_l,i, y_l,i]^T and [x_G,i, y_G,i]^T. As shown in Fig. 9, the dashed line is the GPS positioning output x_G,i, y_G,i in the spatial domain, and the solid line is the lidar-IMU positioning output x_l,i, y_l,i in the spatial domain.
(3) Establishing the cross-correlation and autocorrelation models

The cross-correlation distance Distance(GPS, Lidar) is computed according to the sensors' distance model, and the mean over the two sets of positioning features is taken. Every 10 s, two sets of positioning features not tampered with by the attack are selected as test samples, and the mean distance of the positioning-feature sequence is computed, giving the mean ε_1 = 8.32 and the variance σ_1² = 42.65. The anomaly-detection threshold interval is then computed as [2.37, 14.28]: a test sample falling inside it indicates, at the 99% confidence level, that the GPS, lidar, and IMU among the positioning sensors are free of abnormality; otherwise the GPS, IMU, or lidar among the positioning sensors is abnormal.
Moments at which the cross-correlation distance falls outside the interval [2.37, 14.28] are marked as times t_i at which a sensor may have failed. The autocorrelation distance Difference_sensor(t_i, t_{i+1}) is computed according to the sensors' third distance model: every 10 s, the expectation τ and variance σ_3² of the distances of GPS, lidar, and IMU between moments i and i+1 are computed separately. For the GPS test samples, the mean adjacent-moment distance is ε_1 = 8.3239 with variance σ_1² = 42.6508; the distance threshold interval for a GPS anomaly is computed from these statistics, and Difference_GPS(t_i, t_{i+1}) falling inside it indicates, at the 99% confidence level, that the feature data of the GPS sample at moment i+1 is normal; otherwise it is abnormal.

For the lidar test samples, the mean adjacent-moment distance is ε_2 = 0.9 with variance σ_2² = 0.3769. Substituting these values gives the distance threshold interval for a lidar anomaly; Difference_Lidar(t_i, t_{i+1}) falling inside it indicates, at the 99% confidence level, that the feature data of the lidar sample at moment i+1 is normal; otherwise it is abnormal.

Likewise, for the IMU test samples the mean adjacent-moment distance is ε_3 = 1.02402 with variance σ_3² = 0.000201. Substituting these values gives the distance threshold interval for an IMU anomaly; Difference_IMU(t_i, t_{i+1}) falling inside it indicates, at the 99% confidence level, that the feature data of the IMU sample at moment i+1 is normal; otherwise it is abnormal.
(4) Fault-injection experiment

Five sets of attack points were selected to simulate sensor-attack tests: erroneous point-cloud data was injected into the lidar positioning features at time-series indices 0, 1000, 2000, 3000, and 4000, simulating an attack on the sensor at each of those moments.

Fig. 10 is the positioning trajectory output in the spatial domain after the lidar attack data is injected. Fig. 11 shows the waveform of the positioning feature [x, y]^T in the time domain, where the solid line is the feature waveform with lidar attack data injected at moments 0, 1000, 2000, 3000, and 4000, and the dashed line is the GPS positioning reference waveform. The circles mark the positioning waveform at the moments the lidar is attacked.
Table 1 below records the statistics of the cross-correlation and autocorrelation distances of the two positioning outputs, lidar-IMU fusion and GPS, at the attack moments (time-series indices 0, 1000, 2000, 3000, 4000) and at normal moments (indices 500, 1500, 2500, 3500). The statistics show that at the attack-injection moments the lidar-IMU fused positioning output deviates markedly from the GPS reference waveform, with a mean deviation above 10 meters, far beyond the upper bound of the normal threshold interval. The data whose distance exceeds that upper bound (the gray-shaded cells of Table 1) are marked as possible sensor failures, corresponding to the moments t_i = 0, 1000, 2000, 3000, 4000. The distances of the lidar, GPS, and IMU features between moments i and i+1 are then computed separately. For the attacked lidar test samples, the adjacent-moment distances all exceed the upper bound of the Difference_Lidar(t_i, t_{i+1}) interval, while at those moments the measured samples give Difference_IMU(t_i, t_{i+1}) ∈ [1.01082, 1.03722] and Difference_GPS(t_i, t_{i+1}) ∈ [0.29, 2.36]. It can therefore be argued, at the 99% confidence level, that at moments 0, 1000, 2000, 3000, and 4000 the GPS and IMU are free of abnormality while the lidar sensor is abnormal, possibly under attack.
Table 1. Detection results
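The two-stage decision applied in this experiment (flag a moment when the cross-correlation distance leaves its interval, then isolate the culprit with each sensor's autocorrelation distance) can be sketched as follows. The self-check intervals below are illustrative stand-ins, not values from Table 1.

```python
def locate_abnormal(cross_distance, self_distances, cross_interval, self_intervals):
    """Two-stage check at one flagged moment t_i.

    cross_distance: Distance(GPS, Lidar) at t_i.
    self_distances: dict sensor -> Difference_sensor(t_i, t_{i+1}).
    Returns the sensors judged abnormal at the 99% confidence level.
    """
    lo, hi = cross_interval
    if lo <= cross_distance <= hi:
        return []  # cross-correlation normal: nothing to isolate
    return [s for s, d in self_distances.items()
            if not (self_intervals[s][0] <= d <= self_intervals[s][1])]

# Illustrative values for an attack moment: only the lidar self-distance
# leaves its interval, so the lidar alone is reported as abnormal.
print(locate_abnormal(
    cross_distance=20.0,
    self_distances={"GPS": 1.2, "IMU": 1.02, "Lidar": 5.0},
    cross_interval=(2.37, 14.28),
    self_intervals={"GPS": (0.0, 2.4), "IMU": (1.011, 1.037), "Lidar": (0.0, 1.5)},
))  # -> ['Lidar']
```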
It can be seen that the sensor security detection method for unmanned systems of this embodiment locates the abnormal sensor very accurately.
In addition, as shown in Figure 12, this embodiment further provides a computer-readable storage medium 10 and a sensor security detection device for an unmanned system. The storage medium 10, which forms part of the detection device, stores a plurality of instructions adapted to be loaded by a processor 20 to execute the steps of the sensor security detection method for an unmanned system described above. In some embodiments the processor may be a central processing unit (CPU), a controller, a microcontroller, a microprocessor or another data-processing chip. The processor is generally used to control the overall operation of the computing device; in this embodiment it runs the program code stored in the storage medium and processes data.
It should be noted that, as used herein, the terms "comprise", "include" and any variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or apparatus comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or apparatus that comprises it.
From the description of the above embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, or by hardware, although in many cases the former is the better implementation. Based on this understanding, the essence of the technical solution of the present invention, or the part contributing over the prior art, can be embodied as a software product stored on a storage medium (such as ROM/RAM, a magnetic disk or an optical disc) and containing instructions that cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the various embodiments of the present invention.
In summary, addressing the problem of sensor security, the embodiments of the present invention start from the two major sensor functions of positioning and detection and divide the sensors into two groups. Based on the cross-correlation between the sensed signals, the sensing data are unified under the same feature representation, and distance models and distance thresholds between sensors are constructed to measure the distances between sensors and to serve as the basis for anomaly screening. Then, for each suspicious sensor, the correlation of the individual sensor with itself in the time domain is analyzed: a distance function and a distance threshold over adjacent moments are established, and a normal distribution together with a confidence level is used to decide whether the sensor is abnormal, so that the abnormal sensor is judged and located precisely. The present invention is not an anomaly detection method for a single sensor only, but a complete sensor security detection scheme for unmanned systems; it applies not only to unmanned vehicles but also to other unmanned systems such as unmanned aerial vehicles and unmanned ships.
The above are only specific embodiments of the present application. It should be noted that those of ordinary skill in the art may make several improvements and refinements without departing from the principles of the present application, and such improvements and refinements shall also fall within the scope of protection of the present application.

Claims (20)

1. A sensor security detection method for an unmanned system, comprising:
    dividing the sensors in the unmanned system into two groups, the first group being sensors used for positioning and the second group being sensors used for detecting shape features of objects;
    finding the cross-correlations between the sensors within each group;
    screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group; and
    judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor.
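Read as an algorithm, claim 1 is a four-step pipeline: group, correlate, screen, confirm. The sketch below is only a structural illustration under simplifying assumptions (scalar features, fixed thresholds); every name in it is hypothetical.

```python
from itertools import combinations

def detect_abnormal(sensors, pair_threshold=1.0, self_threshold=1.0):
    """Skeleton of the claimed four-step method (a sketch, not the patent's code).

    sensors: dict name -> {"group": "positioning" or "shape",
                           "feature": float, "prev_feature": float},
    where "feature" stands in for the unified feature vector of later claims.
    """
    groups = {}
    for name, s in sensors.items():                 # step 1: two functional groups
        groups.setdefault(s["group"], []).append(name)

    suspicious = set()
    for members in groups.values():                 # steps 2-3: pairwise screening
        for a, b in combinations(members, 2):
            if abs(sensors[a]["feature"] - sensors[b]["feature"]) > pair_threshold:
                suspicious.update((a, b))

    return [name for name in sorted(suspicious)     # step 4: temporal confirmation
            if abs(sensors[name]["feature"] - sensors[name]["prev_feature"])
            > self_threshold]
```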
2. The sensor security detection method for an unmanned system according to claim 1, wherein
    the process of finding the cross-correlations between the sensors within each group comprises:
    unifying the sensing data of the correlated sensors within each group under the same feature dimension;
    and the process of screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group comprises:
    establishing, under that same feature dimension, a distance model that measures the distance between the sensing data of the cross-correlated sensors within each group; and
    computing in real time, according to the distance model, the distances between the corresponding sensing data and comparing them with the respective first distance thresholds, a sensor whose sensing data exceed their first distance threshold being a suspicious sensor.
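A minimal sketch of the screening step of claim 2, assuming Euclidean distance between already-unified feature vectors and a hypothetical per-pair threshold layout:

```python
import numpy as np

def screen_suspicious(features, first_thresholds):
    """First-stage screening of claim 2.

    features: dict sensor-name -> unified feature vector (np.ndarray).
    first_thresholds: dict frozenset({a, b}) -> first distance threshold for
    each cross-correlated pair; pairs without an entry are not compared.
    """
    suspicious = set()
    names = list(features)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            pair = frozenset((a, b))
            if pair not in first_thresholds:
                continue  # the two sensors are not cross-correlated
            if np.linalg.norm(features[a] - features[b]) > first_thresholds[pair]:
                suspicious.update((a, b))
    return suspicious
```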
3. The sensor security detection method for an unmanned system according to claim 2, wherein the first group of sensors comprises a GPS, an IMU and a lidar;
    the process of unifying the sensing data of the correlated sensors in the first group under the same feature dimension comprises: fusing the lidar and the IMU for positioning to form a fused feature vector F_Lidar-IMU, and projecting the original GPS feature vector to form a GPS feature vector F_GPS under the same feature dimension; and
    in establishing the first distance model Distance(GPS, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the first group, a norm is used to characterize the distance between the sensing data, Distance(GPS, Lidar) = ‖F_GPS − F_Lidar-IMU‖_n, where n is a non-negative integer.
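As a sketch, the first distance model reduces to an n-norm of the difference between the two positioning feature vectors once both live in the same frame; the two-component [x, y] layout is an assumption.

```python
import numpy as np

def distance_gps_lidar(f_gps, f_lidar_imu, n=2):
    """n-norm Distance(GPS, Lidar) between the projected GPS feature vector
    and the lidar-IMU fused feature vector, both assumed to be [x, y]."""
    return float(np.linalg.norm(np.asarray(f_gps) - np.asarray(f_lidar_imu), ord=n))

# distance_gps_lidar([3.0, 4.0], [0.0, 0.0])      -> 5.0 (Euclidean, n = 2)
# distance_gps_lidar([3.0, 4.0], [0.0, 0.0], n=1) -> 7.0 (Manhattan)
```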
4. The sensor security detection method for an unmanned system according to claim 3, wherein fusing the lidar and the IMU for positioning comprises:
    pre-integrating the IMU measurement data between two moments; and
    estimating the relative motion of the carrier using the consecutive feature point cloud sequences {P_k}, {P_{k+1}} and the IMU pre-integration result.
5. The sensor security detection method for an unmanned system according to claim 4, wherein the process of estimating the relative motion of the carrier using the consecutive feature point cloud sequences {P_k}, {P_{k+1}} and the IMU pre-integration result comprises:
    transforming the coordinates of the feature point cloud sequence {P_{k+1}} using the IMU pre-integration result, so that it lies in the same coordinate system as the feature point cloud sequence {P_k};
    for p_j ∈ P_{k+1} and p_i ∈ P_k, denoting the sum of the distances between p_j and p_i as d, and solving, by the Levenberg-Marquardt algorithm, for the rotation and the translation from the carrier coordinate system (B) to the world coordinate system (W) that minimize d; and
    obtaining the lidar-IMU fused positioning formula from said rotation and said translation.
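A compact 2-D stand-in for the scan-alignment step of claims 4-5: Levenberg-Marquardt searches for the rotation and translation that minimize the point-to-point distances. Index-aligned correspondences and the planar model are simplifying assumptions; the patent instead seeds the alignment with the IMU pre-integration result.

```python
import numpy as np
from scipy.optimize import least_squares

def align_scans(P_k, P_k1, x0=(0.0, 0.0, 0.0)):
    """Estimate (theta, tx, ty) mapping scan P_k1 onto scan P_k.

    P_k, P_k1: arrays of shape (N, 2) with index-aligned feature points.
    Minimizes the residual distances d with the Levenberg-Marquardt method,
    mirroring the d-minimization step of claim 5 in two dimensions.
    """
    def residuals(x):
        theta, tx, ty = x
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        return ((P_k1 @ R.T + np.array([tx, ty])) - P_k).ravel()

    return least_squares(residuals, x0, method="lm").x
```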
6. The sensor security detection method for an unmanned system according to claim 2, wherein the second group of sensors comprises a lidar and a camera;
    the process of unifying the sensing data of the correlated sensors in the second group under the same feature dimension comprises:
    for the sensing data of the lidar:
    projecting, through a coordinate-system transformation, any three-dimensional point (X_L, Y_L, Z_L) of the three-dimensional point cloud of the sensing data onto a two-dimensional point (i, j) in a two-dimensional coordinate system; and
    performing feature extraction on the resulting tensor to obtain a lidar feature vector whose components comprise [X_L, Y_L]^T, [h_L, w_L]^T and θ_L, where [X_L, Y_L]^T denotes the position of the detected target relative to the vehicle-body coordinate system [0, 0]^T, [h_L, w_L]^T denotes the height and width of the detection box, and θ_L denotes the confidence that the detection box contains a detected target;
    for the sensing data of the camera:
    invoking a deep convolutional neural network algorithm to perform feature extraction on the captured RGB images to obtain a camera feature vector whose components comprise [X_v, Y_v]^T, [h_v, w_v]^T and p, where [X_v, Y_v]^T denotes the coordinates of the target object in the image coordinate system, [h_v, w_v]^T denotes the height and width of the detection box, and p denotes the confidence of the target object.
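One plausible realization of the claimed 3-D-to-2-D transform is the spherical (range-image) projection commonly used for lidar; the image size and vertical field of view below are assumptions for a 64-beam sensor, not values from the patent.

```python
import numpy as np

def project_point(X, Y, Z, H=64, W=1024,
                  fov_up=np.deg2rad(2.0), fov_down=np.deg2rad(-24.8)):
    """Project a lidar point (X_L, Y_L, Z_L) to a range-image pixel (i, j)."""
    r = np.sqrt(X * X + Y * Y + Z * Z)
    yaw = np.arctan2(Y, X)                      # azimuth angle
    pitch = np.arcsin(Z / r)                    # elevation angle
    j = int(W * 0.5 * (1.0 - yaw / np.pi)) % W  # column from azimuth
    i = int(H * (1.0 - (pitch - fov_down) / (fov_up - fov_down)))
    return min(max(i, 0), H - 1), j

# project_point(10.0, 0.0, -1.0) -> a pixel near the centre column of the image
```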
7. The sensor security detection method for an unmanned system according to claim 6, wherein, in establishing the second distance model Distance(Camera, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the second group, a norm is used to characterize the distance between the sensing data, Distance(Camera, Lidar) being the n-norm of the difference between the camera feature vector and the lidar feature vector, where n is a non-negative integer.
8. The sensor security detection method for an unmanned system according to claim 2, wherein the process of determining the suspicious sensors comprises:
    establishing a normal distribution of the distance D between the sensing data of cross-correlated sensors A and B in at least one of the first group and the second group:
    D ~ N(ε, σ²),
    where the expectation ε = (1/m) Σ_{i=1..m} Distance_i(A, B) and the variance σ² = (1/m) Σ_{i=1..m} (Distance_i(A, B) − ε)², Distance_i(A, B) being the distance between the sensing data of the two sensors A and B in the i-th sample and m being the total number of normal samples, an integer greater than 0; and
    judging whether the confidence interval M satisfies its condition; if the condition is not satisfied, the two sensors A and B are suspicious sensors.
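A sketch of the claim 8 test, assuming the confidence interval takes the usual two-sided form ε ± z·σ with z = 2.576 for 99% (the patent states the condition only as an image):

```python
import numpy as np

def pair_is_suspicious(normal_distances, current_distance, z=2.576):
    """Fit D ~ N(eps, sigma^2) on m normal samples of Distance(A, B), then
    flag the pair when the current distance leaves the confidence interval."""
    d = np.asarray(normal_distances, dtype=float)
    eps = d.mean()          # expectation over the m normal samples
    sigma = d.std()         # 1/m variance convention, matching the claim
    return not (eps - z * sigma <= current_distance <= eps + z * sigma)
```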
9. The sensor security detection method for an unmanned system according to claim 1, wherein the process of judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor comprises:
    determining the feature-vector representation of each suspicious sensor, where f_j(t_i) denotes the feature vector of the j-th sensor at the i-th moment;
    establishing a third distance model Difference_sensor-j-i(t_i, t_{i+1}) over the time series, the third distance model representing the distance between the sensing data of the j-th sensor at two adjacent moments, namely the i-th moment and the (i+1)-th moment; and
    according to the third distance model, computing the distance between the feature vectors of each suspicious sensor at the two adjacent moments and comparing it with the respective second distance threshold, a sensor whose sensing data exceed their second distance threshold being an abnormal sensor.
10. The sensor security detection method for an unmanned system according to claim 9, wherein,
    in establishing the third distance model over the time series, a norm is used to characterize the distance between the sensing data: Difference_sensor-j-i(t_i, t_{i+1}) = ‖f_j(t_{i+1}) − f_j(t_i)‖_n, where n is a non-negative integer and f_j(t_{i+1}) denotes the feature vector of the j-th sensor at the (i+1)-th moment; and
    the process of determining an abnormal sensor comprises:
    establishing a normal distribution of the distance D₃ between the feature vectors of the suspicious sensor at adjacent moments:
    D₃ ~ N(τ, σ₃²),
    where the expectation τ = (1/m₃) Σ_{i=1..m₃} Difference_sensor-j-i(t_i, t_{i+1}) and the variance σ₃² = (1/m₃) Σ_{i=1..m₃} (Difference_sensor-j-i(t_i, t_{i+1}) − τ)², m₃ being the total number of adjacent time periods participating in the calculation and an integer greater than 0; and
    judging whether the confidence interval M₃ satisfies its condition; if the condition is not satisfied, the suspicious sensor is an abnormal sensor.
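The third distance model of claims 9-10 can be sketched the same way; the L2 norm and the z = 2.576 interval are assumptions, with τ and σ₃ fitted over the m₃ adjacent periods:

```python
import numpy as np

def self_check(feature_series, z=2.576):
    """Flag moments whose successive-moment distance leaves the interval.

    feature_series: array of shape (T, d), the suspicious sensor's feature
    vector at each moment.  Returns indices i where
    Difference(t_i, t_{i+1}) falls outside tau +/- z * sigma_3.
    """
    d3 = np.linalg.norm(np.diff(feature_series, axis=0), axis=1)  # (T-1,)
    tau, sigma3 = d3.mean(), d3.std()
    lo, hi = tau - z * sigma3, tau + z * sigma3
    return [i for i, d in enumerate(d3) if not (lo <= d <= hi)]
```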
11. The sensor security detection method for an unmanned system according to claim 2, wherein the process of judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor comprises:
    determining the feature-vector representation of each suspicious sensor, where f_j(t_i) denotes the feature vector of the j-th sensor at the i-th moment;
    establishing a third distance model Difference_sensor-j-i(t_i, t_{i+1}) over the time series, the third distance model representing the distance between the sensing data of the j-th sensor at two adjacent moments, namely the i-th moment and the (i+1)-th moment; and
    according to the third distance model, computing the distance between the feature vectors of each suspicious sensor at the two adjacent moments and comparing it with the respective second distance threshold, a sensor whose sensing data exceed their second distance threshold being an abnormal sensor;
    wherein, in establishing the third distance model over the time series, a norm is used to characterize the distance between the sensing data: Difference_sensor-j-i(t_i, t_{i+1}) = ‖f_j(t_{i+1}) − f_j(t_i)‖_n, where n is a non-negative integer and f_j(t_{i+1}) denotes the feature vector of the j-th sensor at the (i+1)-th moment.
12. The sensor security detection method for an unmanned system according to claim 11, wherein the process of determining an abnormal sensor comprises:
    establishing a normal distribution of the distance D₃ between the feature vectors of the suspicious sensor at adjacent moments:
    D₃ ~ N(τ, σ₃²),
    where the expectation τ = (1/m₃) Σ_{i=1..m₃} Difference_sensor-j-i(t_i, t_{i+1}) and the variance σ₃² = (1/m₃) Σ_{i=1..m₃} (Difference_sensor-j-i(t_i, t_{i+1}) − τ)², m₃ being the total number of adjacent time periods participating in the calculation and an integer greater than 0; and
    judging whether the confidence interval M₃ satisfies its condition; if the condition is not satisfied, the suspicious sensor is an abnormal sensor.
13. The sensor security detection method for an unmanned system according to claim 4, wherein the process of judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor comprises:
    determining the feature-vector representation of each suspicious sensor, where f_j(t_i) denotes the feature vector of the j-th sensor at the i-th moment;
    establishing a third distance model Difference_sensor-j-i(t_i, t_{i+1}) over the time series, the third distance model representing the distance between the sensing data of the j-th sensor at two adjacent moments, namely the i-th moment and the (i+1)-th moment; and
    according to the third distance model, computing the distance between the feature vectors of each suspicious sensor at the two adjacent moments and comparing it with the respective second distance threshold, a sensor whose sensing data exceed their second distance threshold being an abnormal sensor;
    wherein, in establishing the third distance model over the time series, a norm is used to characterize the distance between the sensing data: Difference_sensor-j-i(t_i, t_{i+1}) = ‖f_j(t_{i+1}) − f_j(t_i)‖_n, where n is a non-negative integer and f_j(t_{i+1}) denotes the feature vector of the j-th sensor at the (i+1)-th moment; and
    wherein the process of determining an abnormal sensor comprises:
    establishing a normal distribution of the distance D₃ between the feature vectors of the suspicious sensor at adjacent moments:
    D₃ ~ N(τ, σ₃²),
    where the expectation τ = (1/m₃) Σ_{i=1..m₃} Difference_sensor-j-i(t_i, t_{i+1}) and the variance σ₃² = (1/m₃) Σ_{i=1..m₃} (Difference_sensor-j-i(t_i, t_{i+1}) − τ)², m₃ being the total number of adjacent time periods participating in the calculation and an integer greater than 0; and
    judging whether the confidence interval M₃ satisfies its condition; if the condition is not satisfied, the suspicious sensor is an abnormal sensor.
14. The sensor security detection method for an unmanned system according to claim 6, wherein the process of judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor comprises:
    determining the feature-vector representation of each suspicious sensor, where f_j(t_i) denotes the feature vector of the j-th sensor at the i-th moment;
    establishing a third distance model Difference_sensor-j-i(t_i, t_{i+1}) over the time series, the third distance model representing the distance between the sensing data of the j-th sensor at two adjacent moments, namely the i-th moment and the (i+1)-th moment; and
    according to the third distance model, computing the distance between the feature vectors of each suspicious sensor at the two adjacent moments and comparing it with the respective second distance threshold, a sensor whose sensing data exceed their second distance threshold being an abnormal sensor;
    wherein, in establishing the third distance model over the time series, a norm is used to characterize the distance between the sensing data: Difference_sensor-j-i(t_i, t_{i+1}) = ‖f_j(t_{i+1}) − f_j(t_i)‖_n, where n is a non-negative integer and f_j(t_{i+1}) denotes the feature vector of the j-th sensor at the (i+1)-th moment; and
    wherein the process of determining an abnormal sensor comprises:
    establishing a normal distribution of the distance D₃ between the feature vectors of the suspicious sensor at adjacent moments:
    D₃ ~ N(τ, σ₃²),
    where the expectation τ = (1/m₃) Σ_{i=1..m₃} Difference_sensor-j-i(t_i, t_{i+1}) and the variance σ₃² = (1/m₃) Σ_{i=1..m₃} (Difference_sensor-j-i(t_i, t_{i+1}) − τ)², m₃ being the total number of adjacent time periods participating in the calculation and an integer greater than 0; and
    judging whether the confidence interval M₃ satisfies its condition; if the condition is not satisfied, the suspicious sensor is an abnormal sensor.
15. The sensor security detection method for an unmanned system according to claim 8, wherein the process of judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor comprises:
    determining the feature-vector representation of each suspicious sensor, where f_j(t_i) denotes the feature vector of the j-th sensor at the i-th moment;
    establishing a third distance model Difference_sensor-j-i(t_i, t_{i+1}) over the time series, the third distance model representing the distance between the sensing data of the j-th sensor at two adjacent moments, namely the i-th moment and the (i+1)-th moment; and
    according to the third distance model, computing the distance between the feature vectors of each suspicious sensor at the two adjacent moments and comparing it with the respective second distance threshold, a sensor whose sensing data exceed their second distance threshold being an abnormal sensor;
    wherein, in establishing the third distance model over the time series, a norm is used to characterize the distance between the sensing data: Difference_sensor-j-i(t_i, t_{i+1}) = ‖f_j(t_{i+1}) − f_j(t_i)‖_n, where n is a non-negative integer and f_j(t_{i+1}) denotes the feature vector of the j-th sensor at the (i+1)-th moment; and
    wherein the process of determining an abnormal sensor comprises:
    establishing a normal distribution of the distance D₃ between the feature vectors of the suspicious sensor at adjacent moments:
    D₃ ~ N(τ, σ₃²),
    where the expectation τ = (1/m₃) Σ_{i=1..m₃} Difference_sensor-j-i(t_i, t_{i+1}) and the variance σ₃² = (1/m₃) Σ_{i=1..m₃} (Difference_sensor-j-i(t_i, t_{i+1}) − τ)², m₃ being the total number of adjacent time periods participating in the calculation and an integer greater than 0; and
    judging whether the confidence interval M₃ satisfies its condition; if the condition is not satisfied, the suspicious sensor is an abnormal sensor.
16. A storage medium having stored therein a plurality of instructions adapted to be loaded by a processor to execute the steps of a sensor security detection method for an unmanned system, the method comprising:
    dividing the sensors in the unmanned system into two groups, the first group being sensors used for positioning and the second group being sensors used for detecting shape features of objects;
    finding the cross-correlations between the sensors within each group;
    screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group; and
    judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor.
17. The storage medium according to claim 16, wherein the process of finding the cross-correlations between the sensors within each group comprises:
    unifying the sensing data of the correlated sensors within each group under the same feature dimension;
    and the process of screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group comprises:
    establishing, under that same feature dimension, a distance model that measures the distance between the sensing data of the cross-correlated sensors within each group; and
    computing in real time, according to the distance model, the distances between the corresponding sensing data and comparing them with the respective first distance thresholds, a sensor whose sensing data exceed their first distance threshold being a suspicious sensor.
18. The storage medium according to claim 17, wherein
    the first group of sensors comprises a GPS, an IMU and a lidar, and the process of unifying the sensing data of the correlated sensors in the first group under the same feature dimension comprises: fusing the lidar and the IMU for positioning to form a fused feature vector F_Lidar-IMU, and projecting the original GPS feature vector to form a GPS feature vector F_GPS under the same feature dimension;
    in establishing the first distance model Distance(GPS, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the first group, a norm is used to characterize the distance between the sensing data, Distance(GPS, Lidar) = ‖F_GPS − F_Lidar-IMU‖_n, where n is a non-negative integer;
    fusing the lidar and the IMU for positioning comprises: pre-integrating the IMU measurement data between two moments, and estimating the relative motion of the carrier using the consecutive feature point cloud sequences {P_k}, {P_{k+1}} and the IMU pre-integration result;
    the second group of sensors comprises a lidar and a camera, and the process of unifying the sensing data of the correlated sensors in the second group under the same feature dimension comprises:
    for the sensing data of the lidar: projecting, through a coordinate-system transformation, any three-dimensional point (X_L, Y_L, Z_L) of the three-dimensional point cloud of the sensing data onto a two-dimensional point (i, j) in a two-dimensional coordinate system, and performing feature extraction on the resulting tensor to obtain a lidar feature vector whose components comprise [X_L, Y_L]^T, [h_L, w_L]^T and θ_L, where [X_L, Y_L]^T denotes the position of the detected target relative to the vehicle-body coordinate system [0, 0]^T, [h_L, w_L]^T denotes the height and width of the detection box, and θ_L denotes the confidence that the detection box contains a detected target;
    for the sensing data of the camera: invoking a deep convolutional neural network algorithm to perform feature extraction on the captured RGB images to obtain a camera feature vector whose components comprise [X_v, Y_v]^T, [h_v, w_v]^T and p, where [X_v, Y_v]^T denotes the coordinates of the target object in the image coordinate system, [h_v, w_v]^T denotes the height and width of the detection box, and p denotes the confidence of the target object; and
    in establishing the second distance model Distance(Camera, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the second group, a norm is used to characterize the distance between the sensing data, Distance(Camera, Lidar) being the n-norm of the difference between the camera feature vector and the lidar feature vector, where n is a non-negative integer.
19. A sensor security detection device for an unmanned system, comprising a storage medium and a processor adapted to implement instructions, wherein the storage medium stores a plurality of instructions adapted to be loaded by the processor to execute the steps of a sensor security detection method for an unmanned system, the method comprising:
    dividing the sensors in the unmanned system into two groups, the first group being sensors used for positioning and the second group being sensors used for detecting shape features of objects;
    finding the cross-correlations between the sensors within each group;
    screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group; and
    judging, according to the correlation of its sensing data at two adjacent moments, whether each suspicious sensor is an abnormal sensor.
20. The sensor security detection device for an unmanned system according to claim 19, wherein the process of finding the cross-correlations between the sensors within each group comprises:
    unifying the sensing data of the correlated sensors within each group under the same feature dimension;
    and the process of screening out suspicious sensors according to the differences between the sensing data of the cross-correlated sensors in each group comprises:
    establishing, under that same feature dimension, a distance model that measures the distance between the sensing data of the cross-correlated sensors within each group; and
    computing in real time, according to the distance model, the distances between the corresponding sensing data and comparing them with the respective first distance thresholds, a sensor whose sensing data exceed their first distance threshold being a suspicious sensor;
    wherein the first group of sensors comprises a GPS, an IMU and a lidar, and the process of unifying the sensing data of the correlated sensors in the first group under the same feature dimension comprises: fusing the lidar and the IMU for positioning to form a fused feature vector F_Lidar-IMU, and projecting the original GPS feature vector to form a GPS feature vector F_GPS under the same feature dimension;
    in establishing the first distance model Distance(GPS, Lidar) that measures the distance between the sensing data of the cross-correlated sensors in the first group, a norm is used to characterize the distance between the sensing data, Distance(GPS, Lidar) = ‖F_GPS − F_Lidar-IMU‖_n, where n is a non-negative integer; and
    the second group of sensors comprises a lidar and a camera, and the process of unifying the sensing data of the correlated sensors in the second group under the same feature dimension comprises:
    for the sensing data of the lidar: projecting, through a coordinate-system transformation, any three-dimensional point (X_L, Y_L, Z_L) of the three-dimensional point cloud of the sensing data onto a two-dimensional point (i, j) in a two-dimensional coordinate system, and performing feature extraction on the resulting tensor to obtain a lidar feature vector whose components comprise [X_L, Y_L]^T, [h_L, w_L]^T and θ_L, where [X_L, Y_L]^T denotes the position of the detected target relative to the vehicle-body coordinate system [0, 0]^T, [h_L, w_L]^T denotes the height and width of the detection box, and θ_L denotes the confidence that the detection box contains a detected target;
    for the sensing data of the camera: invoking a deep convolutional neural network algorithm to perform feature extraction on the captured RGB images to obtain a camera feature vector whose components comprise [X_v, Y_v]^T, [h_v, w_v]^T and p, where [X_v, Y_v]^T denotes the coordinates of the target object in the image coordinate system, [h_v, w_v]^T denotes the height and width of the detection box, and p denotes the confidence of the target object.
PCT/CN2021/107842 2021-07-15 2021-07-22 Sensor security detection method and device for unmanned system, and storage medium WO2023283987A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110799228.1A CN113532499B (en) 2021-07-15 2021-07-15 Sensor security detection method and device for unmanned system and storage medium
CN202110799228.1 2021-07-15

Publications (1)

Publication Number Publication Date
WO2023283987A1 true WO2023283987A1 (en) 2023-01-19

Family

ID=78099326

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/107842 WO2023283987A1 (en) 2021-07-15 2021-07-22 Sensor security detection method and device for unmanned system, and storage medium

Country Status (2)

Country Link
CN (1) CN113532499B (en)
WO (1) WO2023283987A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114826751B (en) * 2022-05-05 2022-10-28 深圳市永达电子信息股份有限公司 Kalman filtering network prevention and control method for multi-target information fusion


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109358344A (en) * 2018-11-07 2019-02-19 西安电子科技大学 A kind of anti-GPS fraud system of unmanned plane based on Multi-source Information Fusion and method
CN112254741B (en) * 2020-09-09 2023-06-23 安克创新科技股份有限公司 Abnormality detection method for mileage sensor, self-moving robot, and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020087061A (en) * 2018-11-28 2020-06-04 パナソニックIpマネジメント株式会社 Unmanned mobile body and control method
CN110501036A (en) * 2019-08-16 2019-11-26 北京致行慕远科技有限公司 The calibration inspection method and device of sensor parameters
CN112702408A (en) * 2020-12-20 2021-04-23 国网山东省电力公司临沂供电公司 Internet of things system and method based on multi-sensing function

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI HUIYUN;SHAO CUIPING;CHEN BEIZHANG;HU YANBU;YANG ZHAONAN: "Attack Defense Technology of Unmanned Vehicle Perception System Based on Matrix Completion", JOURNAL OF INTEGRATION TECHNOLOGY, vol. 9, no. 5, 15 September 2020 (2020-09-15), pages 3 - 14, XP093024160, DOI: 10.12146/j.issn.2095-3135.20200509003 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115856931A (en) * 2023-03-01 2023-03-28 陕西欧卡电子智能科技有限公司 Unmanned ship berthing reservoir position repositioning method based on laser radar
CN117951455A (en) * 2024-03-22 2024-04-30 汶上义桥煤矿有限责任公司 On-line monitoring method for operation faults of scraper conveyor
CN117951455B (en) * 2024-03-22 2024-06-07 汶上义桥煤矿有限责任公司 On-line monitoring method for operation faults of scraper conveyor

Also Published As

Publication number Publication date
CN113532499A (en) 2021-10-22
CN113532499B (en) 2022-08-30

Similar Documents

Publication Publication Date Title
WO2023283987A1 (en) Sensor security detection method and device for unmanned system, and storage medium
Vu et al. Real-time computer vision/DGPS-aided inertial navigation system for lane-level vehicle navigation
CN108868268B (en) Unmanned parking space posture estimation method based on point-to-surface distance and cross-correlation entropy registration
EP4318397A2 (en) Method of computer vision based localisation and navigation and system for performing the same
JP2022106924A (en) Device and method for autonomous self-position estimation
CN104848867B (en) The pilotless automobile Combinated navigation method of view-based access control model screening
Maier et al. Improved GPS sensor model for mobile robots in urban terrain
Brenner Extraction of features from mobile laser scanning data for future driver assistance systems
CN112346463B (en) Unmanned vehicle path planning method based on speed sampling
WO2021021862A1 (en) Mapping and localization system for autonomous vehicles
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
Dill et al. Seamless indoor-outdoor navigation for unmanned multi-sensor aerial platforms
Li et al. Autonomous navigation and environment modeling for MAVs in 3-D enclosed industrial environments
CN112805766A (en) Apparatus and method for updating detailed map
Zhang et al. Online ground multitarget geolocation based on 3-D map construction using a UAV platform
Chen et al. Aerial robots on the way to underground: An experimental evaluation of VINS-mono on visual-inertial odometry camera
CN116957360A (en) Space observation and reconstruction method and system based on unmanned aerial vehicle
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform
Soleimani et al. A disaster invariant feature for localization
CN116380079A (en) Underwater SLAM method for fusing front-view sonar and ORB-SLAM3
Barbieri et al. Deep neural networks for cooperative lidar localization in vehicular networks
Kim et al. Vision-based map-referenced navigation using terrain classification of aerial images
Serrano et al. YOLO-Based Terrain Classification for UAV Safe Landing Zone Detection
Sanjukumar et al. Novel technique for Multi Sensor Calibration of a UAV
Wei Multi-sources fusion based vehicle localization in urban environments under a loosely coupled probabilistic framework

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21949767

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE