EP4078238A1 - Method and device for making sensor data more robust against adversarial disturbances - Google Patents
Method and device for making sensor data more robust against adversarial disturbances
- Publication number
- EP4078238A1 (application EP20829547.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor data
- piece
- sensors
- replaced
- quilting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/862—Combination of radar systems with sonar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/86—Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9324—Alternative operation using ultrasonic waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/023—Interference mitigation, e.g. reducing or avoiding non-intentional interference with other HF-transmitters, base station transmitters for mobile communication or other radar systems, e.g. using electro-magnetic interference [EMI] reduction techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/36—Means for anti-jamming, e.g. ECCM, i.e. electronic counter-counter measures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/495—Counter-measures or counter-counter-measures using electronic or electro-optical means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/537—Counter-measures or counter-counter-measures, e.g. jamming, anti-jamming
Definitions
- The invention relates to a method and a device for robustifying sensor data against adversarial disturbances.
- The invention also relates to a method for operating an assistance system for a vehicle, an assistance system for a vehicle, a computer program, and a data carrier signal.
- Machine learning, for example based on neural networks, has great potential for use in modern driver assistance systems and automated vehicles.
- Functions based on deep neural networks process sensor data (e.g. from cameras, radar sensors or lidar sensors) in order to derive relevant information from them.
- This information includes, for example, a type and a position of objects in the surroundings of the motor vehicle, a behavior of the objects or a road geometry or topology.
- One example is convolutional neural networks (CNN), which process input data, e.g. image data.
- The convolutional network independently develops feature maps based on filter channels that process the input data locally in order to derive local properties. These feature maps are then processed again by further filter channels, which derive higher-value feature maps from them.
- On the basis of this information compressed from the input data, the deep neural network finally derives its decision and makes it available as output data.
- The invention is based on the object of improving a method and a device for robustifying sensor data against adversarial interference, in particular with regard to the use of a plurality of sensors and sensor data fusion.
- A method for robustifying sensor data against adversarial interference is provided: sensor data are obtained from at least two sensors; the sensor data obtained from the at least two sensors are replaced piece by piece by means of quilting, the piece-by-piece replacement being carried out in such a way that the replaced sensor data from the different sensors are mutually plausible; and the sensor data replaced piece by piece are output.
- A device for robustifying sensor data against adversarial disturbances is provided, comprising a computing device that is set up to receive sensor data from at least two sensors, to replace the sensor data obtained from the at least two sensors piece by piece by means of quilting, to carry out the piece-by-piece replacement in such a way that the replaced sensor data from the different sensors are mutually plausible, and to output the sensor data replaced piece by piece.
- The method and the device make it possible, when using a plurality of sensors, that is to say at least two sensors, to robustify the sensor data provided by the plurality of sensors against adversarial disturbances.
- the sensor data of the at least two sensors are replaced piece by piece by means of quilting.
- the piece-by-piece replacement takes place in such a way that the piece-wise replaced sensor data (across sensors) are plausible to one another.
- sensor data patches used for piece-by-piece replacement are selected in such a way that replaced sensor data of the at least two sensors that correspond to one another in terms of time and location are plausible to one another.
- If the sensor data robustified in this way are then supplied, for example, as input data to a neural network, an adversarial disturbance originally contained in the received sensor data has lost its effect, without any semantic content of the sensor data having been changed. Because plausibility is maintained between the sensor data of the at least two sensors, the piece-by-piece replacement does not change a contextual relationship, in particular a spatial and temporal relationship or a correlation, between the sensor data of the at least two sensors. This is particularly advantageous if sensor data fusion takes place after the method has been carried out.
- Quilting includes in particular the piece-wise replacement of sensor data, which can also be referred to as piece-wise reconstruction of the sensor data (the term “image quilting” is also used in connection with image data).
- the sensor data can in particular be of any type, that is, the quilting is not limited to two-dimensional image data.
- a set of replaced sensor data forms a reconstruction data domain or is encompassed by a reconstruction data domain. If, for example, images from a camera are involved, the camera image is divided into several partial sections. Usually, small, rectangular sections of the image (also known as patches) are defined for this purpose.
- the individual partial or image sections are compared with partial sections, hereinafter referred to as sensor data patches, which are stored, for example, in a database.
- the sensor data patches can also be referred to as data blocks.
- The sensor data patches here in particular form subsymbolic subsets of previously recorded sensor data of the same type, the sensor data definitely being free from adversarial disturbances.
- the comparison takes place on the basis of a distance measure which is defined, for example, via a Euclidean distance on image element vectors or sensor data vectors.
- a partial or image section is linearized as a vector.
- a distance is then determined using a vector space norm, for example the L2 norm.
- the partial or image sections are each replaced by the closest or most similar sensor data patch from the database. It can be provided here that a minimum distance must be maintained or that at least no identity may exist between the partial section from the sensor data and the sensor data patch.
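The comparison and replacement steps above can be sketched in a few lines. This is a minimal illustration (function and variable names are assumptions, not taken from the patent), operating on patches that have already been linearized as vectors:

```python
import numpy as np

def quilt_patches(sensor_patches, database, min_dist=1e-6):
    """Replace each linearized sensor-data patch by its nearest neighbour
    from a database of trusted, disturbance-free patches (L2 norm).

    sensor_patches: (n, d) array of patches from the live sensor data.
    database:       (m, d) array of trusted sensor data patches.
    min_dist:       forbids (near-)identical replacements, as described above.
    """
    replaced = np.empty_like(sensor_patches)
    for i, patch in enumerate(sensor_patches):
        dists = np.linalg.norm(database - patch, axis=1)  # L2 distance to each entry
        dists[dists < min_dist] = np.inf                  # exclude identical patches
        replaced[i] = database[np.argmin(dists)]          # closest remaining patch wins
    return replaced
```

The `min_dist` guard corresponds to the option that at least no identity may exist between a partial section and the sensor data patch that replaces it.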
- the piece-by-piece replacement takes place in an analogous manner.
- the piece-by-piece replacement takes place in particular for all partial excerpts of the recorded sensor data, so that replaced or reconstructed sensor data are then available.
- After the piece-by-piece replacement, that is, after quilting, the effect of the adversarial disturbances in the replaced or reconstructed sensor data is eliminated or at least reduced.
- A “plausibility” of replaced sensor data should in particular mean that the replaced sensor data are physically plausible to one another. In particular, there should be a probability that the respectively replaced sensor data would also occur in the respectively selected combination under real conditions, i.e. in the real world.
- The replaced sensor data of the at least two sensors should be selected in such a way that the probability that these sensor data would actually occur in this combination is maximized. If, for example, the at least two sensors are a camera and a lidar sensor, then plausibility between the respectively replaced sensor data means that a viewed image section in the replaced camera data and a spatially and temporally corresponding partial section from the replaced lidar data are selected in such a way that the sensor data are consistent with one another, i.e. physically free of contradictions with one another.
- The partial sections of the sensor data are each replaced in such a way that each replaced image section corresponds to a replaced partial section of the lidar data, as would very likely also result when capturing sensor data with the camera and the lidar sensor simultaneously.
- the at least two sensors are in particular calibrated with respect to one another in terms of location and time, so that the sensor data of the at least two sensors correspond to one another in terms of location and time or have common reference points in time and location.
- the sensor data of the at least two sensors can in principle be one-dimensional or multidimensional, in particular two-dimensional.
- the sensor data can be two-dimensional camera images from a camera and two-dimensional or three-dimensional lidar data from a lidar sensor.
- the sensor data can also come from other sensors, such as radar sensors or ultrasonic sensors, etc.
- the sensor data obtained are, in particular, sensor data that are recorded and / or output for a function for the automated or partially automated driving of a vehicle and / or for perception of the surroundings.
- a vehicle is in particular a motor vehicle. In principle, however, the vehicle can also be another land, air, water, rail or space vehicle, for example a drone or an air taxi.
- An adversarial perturbation is, in particular, a deliberately made disruption of the input data of a neural network, for example provided in the form of sensor data, in which semantic content of the input data is not changed, but the disruption leads the neural network to infer an incorrect result, that is, for example, to classify the input data incorrectly or to segment them semantically incorrectly.
- a neural network is in particular a deep neural network, in particular a convolutional neural network (CNN).
- The neural network is, for example, trained for a specific function, for example a function of an assistance system of a vehicle, in particular for automated or partially automated driving and/or for perception of the surroundings, for example the perception of pedestrians or other objects in captured camera images.
- the method is in particular repeated cyclically, so that in particular continuously replaced sensor data can be provided for received sensor data of a sensor data stream.
- the method can be carried out as a computer-implemented method.
- the method can be carried out by means of a data processing device.
- the data processing device comprises in particular at least one computing device and at least one storage device.
- a computer program is also created, comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method steps of the disclosed method in accordance with any of the described embodiments.
- a data carrier signal is also created that transmits the aforementioned computer program.
- Parts of the device, in particular the computing device, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor. However, it can also be provided that parts are designed, individually or combined, as an application-specific integrated circuit (ASIC).
- the method includes the acquisition of the sensor data by means of the at least two sensors.
- the sensor data replaced piece by piece are fed to at least one function for automated or partially automated driving of a vehicle and / or for perception of the surroundings.
- The at least one function can be supplied with robustified sensor data, so that the functionality provided by the at least one function can also be provided in a more robust manner.
- the sensor data that have been replaced piece by piece are fed to the at least one function and, based on the sensor data that has been replaced piece by piece, the at least one function generates, in particular, at least one control signal and / or an evaluation signal and provides this.
- an output of the at least one function in the form of the at least one control signal and / or evaluation signal can thereby be generated and provided more reliably.
- the at least one control signal and / or evaluation signal can be used, for example, to control or regulate an actuator system of the vehicle and / or can be further processed in the context of automated or partially automated driving, for example for trajectory planning.
- the at least one function is in particular a function that is provided by means of a method of machine learning and / or artificial intelligence.
- the at least one function can be provided by means of a trained artificial neural network.
- a database with sensor data patches generated from sensor data of the at least two sensors is provided for quilting, the sensor data patches of the at least two sensors in the database being linked to one another in such a way that the respectively linked sensor data patches are plausible to one another.
- the sensor data patches that are used to replace the sensor data of the at least two sensors piece by piece can be stored in the database, for example in the form of common database entries.
- the sensor data patches for the at least two sensors can already be combined into vectors and stored in the database.
- If the sensors are, for example, a camera and a lidar sensor, the sensor data patches of the camera, that is to say the individual image excerpts used for replacement, are combined to form common vectors with the sensor data patches that correspond to them in a physically plausible manner, that is to say the partial excerpts from the lidar data.
- The sensor data of the at least two sensors are then combined analogously to the stored vectors, so that a distance to the vectors stored in the database can be determined with the aid of a distance measure, for example the L2 norm.
- the vector from the database that is closest to a vector to be replaced is used for replacement during quilting.
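A sketch of this joint lookup, assuming (as an illustration, not as the patent's implementation) that each database row concatenates one camera patch with its physically consistent lidar patch:

```python
import numpy as np

def quilt_joint(cam_patch, lidar_patch, joint_db):
    """Nearest-neighbour search over concatenated camera+lidar patch vectors.

    joint_db: (m, dc + dl) array; each row pairs a camera patch (first dc
    entries) with the lidar patch recorded at the same time and place.
    """
    query = np.concatenate([cam_patch, lidar_patch])  # combine analogously to the DB
    dists = np.linalg.norm(joint_db - query, axis=1)  # L2 norm over the joint vector
    best = joint_db[np.argmin(dists)]                 # closest stored combination
    dc = len(cam_patch)
    return best[:dc], best[dc:]  # replaced camera patch, replaced lidar patch
```

Because the replacement is chosen over the joint vector, the returned camera and lidar patches come from one recorded combination and are therefore mutually plausible by construction.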
- The database is created in particular on the basis of sensor data acquired beforehand, independently of the disclosed method, with the sensor data of the at least two sensors acquired at the same time and the sensors calibrated with respect to one another in terms of location and time.
- Trustworthy sensor data are used, that is to say sensor data that are definitely free from adversarial disturbances.
- For example, the training data of a (deep) neural network to which the (replaced) sensor data are to be supplied in an application phase can be used.
- the sensor data patches are generated from this trustworthy sensor data and stored in the database. If the at least two sensors are other types of sensors, the procedure is analogous.
- a selection of sensor data patches used in quilting for the at least two sensors takes place as a function of the sensor data of only a part of the at least two sensors.
- a selection of a sensor data patch for replacing sensor data of the at least two sensors takes place as a function of sensor data received from only one of the sensors.
- The computing power required for the search can be reduced since, for example, the comparison with the sensor data patches in the database takes into account only the sensor data of one sensor. Once the sensor data patch with the smallest distance to the sensor data of that one sensor has been found, the sensor data patches of the other of the at least two sensors can be taken from the found database entry without further searching, owing to the existing link.
- the comparison or the search for the closest sensor data patch can also take place on the basis of the lidar data, the associated image section being taken from the entries of the vector after the sensor data patch has been found. Overall, the comparison or the search in the database can be accelerated.
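The single-sensor search could look as follows, assuming the camera and lidar patch databases are linked row-for-row (hypothetical names; searching on the lidar data instead would be symmetric):

```python
import numpy as np

def quilt_by_camera_only(cam_patch, cam_db, lidar_db):
    """Search using only the camera patch; the linked lidar patch is read
    from the same row index, with no second search required.

    cam_db and lidar_db must be aligned row-for-row (common database entries).
    """
    idx = np.argmin(np.linalg.norm(cam_db - cam_patch, axis=1))
    return cam_db[idx], lidar_db[idx]
```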
- At least one item of identification information is obtained, the piece-wise replacement during quilting additionally taking into account the at least one item of identification information received.
- Identification information can also be referred to as a tag or label.
- the entries in the database that is to say the sensor data patches stored therein, can be marked with additional information so that they can be found more quickly later.
- the database is indexed with the aid of a hash function so that a search in the database can be accelerated, since a number of entries in the database can be reduced via a preselection even before a comparison with the sensor data of the at least two sensors.
- the identification information obtained is or is derived from context information of an environment in which the sensor data of the at least two sensors are or have been recorded.
- Context information can, for example, be a geographical coordinate (e.g. a GPS coordinate), a time of day and/or season, a month, a weekday, the weather (sun, rain, fog, snow, etc.) and/or a traffic context (city, rural area, motorway, pedestrian zone, country road, main road, secondary road, etc.).
- Sensor data patches can be marked (“tagged”) with at least one item of context information and stored in the database.
- Using these tags, a preselection can be made before the search in the database, so that only entries or sensor data patches that partially or completely match the at least one item of identification information, or that have at least one matching item of context information, are considered during the search. This means that the piece-by-piece replacement can be accelerated.
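A minimal illustration of such a tag-based preselection (a hash index over the tags would serve the same purpose; all names here are assumptions):

```python
import numpy as np

def preselect_by_tags(entries, tags, query_tags):
    """Keep only database entries sharing at least one tag with the query,
    shrinking the candidate set before any distance computation.

    entries:    (m, d) patch array.
    tags:       list of m tag sets (context labels such as {"rain", "city"}).
    query_tags: tag set describing the current scene.
    """
    mask = np.array([bool(entry_tags & query_tags) for entry_tags in tags])
    return entries[mask]  # partial match: any shared tag keeps the entry
```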
- the piece-wise replacement of the received sensor data is carried out taking into account temporally and / or spatially adjacent sensor data of the at least two sensors.
- a correlation between temporally and / or spatially adjacent sensor data can be taken into account in the case of piece-by-piece replacement.
- individual image sections of a camera image usually correlate strongly, in terms of their properties, with (locally) adjacent image sections of the same camera image. If a sequence of camera images is considered, an image section of a camera image usually also correlates strongly, in terms of its properties, with the same image section of a (temporally) adjacent camera image.
- entries or sensor data patches stored in the database are marked with respect to a temporal and / or spatial proximity to one another.
- the sensor data patches stored as entries in the database can be linked to other stored sensor data patches with regard to their temporal and/or spatial proximity to them. This can accelerate the comparison with the sensor data patches stored in the database.
- a preselection is made for further partial sections of the sensor data of the one of the at least two sensors. The preselection comprises those sensor data patches which are less than a predefined temporal and/or spatial distance from the already selected sensor data patch, that is to say which lie in a predefined temporal and/or spatial proximity to it.
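A minimal sketch of such a proximity-based preselection, assuming each stored patch carries a hypothetical grid position (all names and the Chebyshev distance choice are illustrative, not part of the disclosure):

```python
def preselect_by_proximity(patches, selected_pos, max_dist=1):
    # After a patch has been selected for one sub-section, restrict the
    # search for the neighbouring sub-section to patches whose stored
    # position lies within a predefined (Chebyshev) distance of it.
    px, py = selected_pos
    return [pid for pid, (x, y) in patches.items()
            if max(abs(x - px), abs(y - py)) <= max_dist]

# Hypothetical database: patch identifier -> stored grid position.
patches = {"a": (0, 0), "b": (1, 0), "c": (5, 5)}
candidates = preselect_by_proximity(patches, (0, 0))
```

Exploiting the correlation between adjacent sub-sections in this way shrinks the candidate set before the distance comparison.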
- a method for operating an assistance system for a vehicle is also provided, in which at least one function for automated or partially automated driving of a vehicle and/or for perception of the surroundings is provided by means of the assistance system, sensor data are recorded by at least two sensors, and the method according to one of the embodiments described above is carried out, the piece-wise replaced sensor data being fed to the at least one function, and the at least one function generating and providing at least one control signal and/or an evaluation signal based on the piece-wise replaced sensor data.
- An assistance system for a vehicle is also provided, comprising at least two sensors, set up to acquire sensor data, and a device according to one of the embodiments described above, the assistance system being set up to provide at least one function for automated or partially automated driving of the vehicle and/or for perception of the surroundings, the at least one function generating and providing at least one control signal and/or an evaluation signal based on the sensor data replaced piece by piece by means of the device.
- a vehicle comprising at least one device and / or at least one assistance system according to one of the described embodiments.
- a vehicle is in particular a motor vehicle.
- the vehicle can also be another land, air, water, rail or space vehicle, for example a drone or an air taxi.
- FIG. 1 shows a schematic representation of an embodiment of the device for robustifying sensor data against adversarial disturbances and an embodiment of an assistance system.
- FIG. 2 shows a schematic representation to illustrate quilting (prior art).
- FIG. 3 shows a schematic representation to illustrate the quilting according to an embodiment of the method described in this disclosure.
- FIG. 1 shows a schematic representation of an embodiment of the device 1 for robustifying sensor data 20, 21 against adversarial disturbances.
- the device 1 comprises a computing device 2 and a memory device 3.
- the device 1 can be used in particular in a vehicle, in particular a motor vehicle, in order to robustify input data of a neural network 50 used there against adversarial disturbances.
- the device 1 carries out the method described in this disclosure for robustifying sensor data 20, 21 against adversarial disturbances.
- Parts of the device 1, in particular the computing device 2, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
- the device 1 or the computing device 2 is supplied with sensor data 20, 21 from two sensors 10, 11.
- the sensors 10, 11 can be, for example, a camera and a lidar sensor.
- the computing device 2 receives the sensor data 20, 21 and replaces them piece by piece by means of quilting.
- the piece-by-piece replacement takes place in such a way that sensor data 30, 31 of the two sensors 10, 11 that are replaced piece-by-piece are plausible to one another.
- the sensor data 30, 31 replaced piece by piece are then output by the computing device 2.
- the sensor data 30, 31 replaced piece by piece are then fed to an artificial neural network 50.
- the neural network 50 is provided by means of a control device 51, for example in that a computing device of the control device 51 provides a functionality of the neural network 50 or carries out the arithmetic operations necessary to provide the neural network 50.
- the neural network 50 provides, in particular, a function for automated or partially automated driving of a vehicle and / or for perception of the surroundings.
- the function is provided in particular with the aid of an assistance system 200 comprising sensors 10, 11 and device 1.
- the neural network 50 is trained to provide the function.
- the function provided by the neural network 50 generates at least one control signal 52 and/or evaluation signal 53 on the basis of the piece-wise replaced sensor data 30, 31; these signals can be supplied, for example, to an actuator (not shown) of the vehicle and/or to at least one further control unit of the vehicle.
- the piece-wise replaced sensor data 30, 31 have the same format after quilting, or after the piece-wise replacement, as the sensor data 20, 21, so that the device 1 can be inserted into, and used with, already existing applications of sensors 10, 11 and neural networks 50.
- For the quilting, a database 40 with sensor data patches 60, 61 generated from sensor data of the sensors 10, 11 is provided, the sensor data patches 60, 61 of the sensors 10, 11 being linked to one another in the database 40 in such a way that the respectively linked sensor data patches 60, 61 are plausible with respect to one another.
- the database 40 is stored in the storage device 3, for example.
- the database 40 was created beforehand in particular with the aid of trustworthy recorded sensor data from the two sensors 10, 11, in that a large number of interlinked sensor data patches were generated from the recorded trustworthy sensor data.
- Trustworthy is intended to mean, in particular, that the recorded sensor data definitely do not contain any adversarial disturbances.
- If the trustworthy sensor data are camera images and lidar data, for example, it can be provided that each sensor data patch 60, 61 comprises a partial section of 8x8 picture elements of a camera image and a corresponding partial section of 8x8 measurement points from the lidar data.
- the sensors used or the trustworthy sensor data are in particular calibrated with respect to one another in terms of time and location.
- the computing device 2 proceeds in particular as follows.
- the sensor data 20, 21 are each subdivided into partial sections.
- the partial sections are each compared with the sensor data patches 60, 61 stored in the database 40.
- For each sub-section, that sensor data patch 60, 61 is sought which has the smallest distance to the sub-section under consideration.
- the sensor data 20, 21 comprised by the respective partial section and the sensor data comprised by the sensor data patches 60, 61 are for this purpose each expressed as vectors, for example.
- Using a distance measure, for example the L2 norm, a distance between these vectors can then be determined, and the determined distances can be compared with one another.
- Once the closest sensor data patch 60, 61 has been found, the sub-section in the sensor data 20, 21 is replaced by it, and the result is made available as replaced sensor data 30, 31. Since the sensor data patches 60, 61 for the two sensors 10, 11 are linked to one another in the database 40, replacing the sensor data 20, 21 of the two sensors 10, 11 by the linked sensor data patches 60, 61 ensures that the replaced sensor data 30, 31 of the two sensors 10, 11 are plausible with respect to one another.
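The linked replacement described above can be sketched, purely for illustration, as follows (Python; the toy database, the linearized patch vectors and all names are hypothetical and not part of the disclosure):

```python
import math

def l2(a, b):
    # L2 distance between two linearized sub-sections.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def quilt_linked(cam_sections, database):
    # Replace each camera sub-section by the closest camera patch in the
    # database; the corresponding lidar sub-section is replaced by the
    # lidar patch linked to it, so both modalities stay mutually plausible.
    replaced_cam, replaced_lidar = [], []
    for section in cam_sections:
        cam_patch, lidar_patch = min(
            database, key=lambda entry: l2(entry[0], section)
        )
        replaced_cam.append(cam_patch)
        replaced_lidar.append(lidar_patch)
    return replaced_cam, replaced_lidar

# Toy database of linked (camera patch, lidar patch) pairs.
database = [((0.0, 0.0), (1.0,)), ((10.0, 10.0), (5.0,))]
cam, lidar = quilt_linked([(0.5, 0.2), (9.0, 11.0)], database)
```

Note that the nearest-neighbour search runs only on the camera data; the lidar replacement falls out of the stored link, which is what makes the replaced data of the two sensors consistent by construction.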
- a selection of sensor data patches 60, 61 used during quilting for the two sensors 10, 11 takes place as a function of the sensor data 20, 21 of only some of the sensors 10, 11.
- the selection can only be made on the basis of the sensor data 20 of the sensor 10. Since the sensor data patches 60, 61 are linked to one another, the sensor data patch 61 linked therewith for the sensor data 21 can be identified immediately for a sensor data patch 60 found on the basis of the sensor data 20.
- the sensor data patches 60, 61 in the database 40 can be linked or marked with one another in relation to a temporal and / or spatial neighborhood.
- a preselection can be made in which a temporal and/or spatial correlation in the occurrence of the sensor data 20, 21 mapped by the sensor data patches 60, 61 is taken into account.
- the sensor data 20, 21 received are, in particular, sensor data that are recorded and/or output for a function for automated or partially automated driving of a vehicle and/or for driver assistance of the vehicle and/or for environment detection and/or environment perception.
- FIG. 2 shows a schematic representation to clarify quilting from the prior art using the example of a camera image 22.
- Sensor data 20, in the present case a camera image 22, are divided into partial sections 23.
- For each partial section 23, a search is made in a database 40 for the sensor data patch 60 which has the smallest distance to the partial section 23 in terms of a distance measure.
- a sensor data patch 60 is an image section which has the size of the partial sections 23, that is to say the same number of picture elements (pixels).
- the distance measure is, for example, the L2 norm, which is applied to vectors generated by linearizing the image sections.
- Each partial section 23 is then replaced by the respective sensor data patch 60 with the smallest distance from it. It can be provided here that a minimum distance must be maintained. In this way, all partial sections 23 are replaced by sensor data patches 60 from the database 40.
- Replaced partial sections 24 are created which, taken together, form the replaced sensor data 30 or the replaced camera image 25.
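The prior-art quilting of FIG. 2 can be sketched on a single channel, for illustration only (Python; a one-dimensional signal stands in for the 8x8 image sections, and all names and values are hypothetical):

```python
import math

def quilt(signal, patch_size, database):
    # Prior-art quilting on one sensor channel: cut the signal into
    # fixed-size sections and replace each one by the database patch
    # with the smallest L2 distance to it.
    replaced = []
    for i in range(0, len(signal), patch_size):
        section = signal[i:i + patch_size]
        best = min(database, key=lambda patch: math.dist(patch, section))
        replaced.extend(best)
    return replaced

# Toy database of trustworthy patches.
database = [[0, 0], [10, 10]]
out = quilt([1, 2, 9, 8], 2, database)
```

The replaced sections, taken together, form the replaced signal, just as the replaced partial sections 24 together form the replaced camera image 25.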
- FIG. 3 shows a schematic illustration to clarify the quilting according to an embodiment of the method described in this disclosure using the example of sensor data 20 in the form of a camera image 22 and of sensor data 21 in the form of lidar data 26.
- the quilting itself takes place in the same way as already described above in connection with FIG. 2.
- the piece-by-piece replacement is carried out in such a way that replaced sensor data 30, 31 of the sensors, that is to say of the camera and the lidar sensor, are plausible to one another.
- a plausibility check also takes place in the quilting step 100.
- sensor data patches 60, 61 are determined or selected from the database 40 in such a way that the sensor data patches 60, 61 are plausible to one another.
- the replaced partial sections 24, 28 or the replaced camera image 25 and the replaced lidar data 29 must be consistent with one another and must not contradict one another in terms of content or physics.
- a scene imaged in the replaced camera image 25 must fit together in a plausible manner with a depth profile of the replaced lidar data 29.
- the sensor data patches 60, 61 are already stored in the database 40 in a manner linked to one another.
- the database 40 can be created before the method described in this disclosure is executed, by generating sensor data patches 60, 61 for both (or even more) sensors at the same time: partial sections are generated from trustworthy sensor data recorded at the same time and are stored in the database 40 jointly, or linked to one another, as sensor data patches 60, 61.
- the individual partial sections for both sensors can be combined to form a common vector and stored as a common or linked sensor data patch 60, 61.
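A minimal sketch of storing such a common vector, for illustration only (Python; the concatenation layout and all names are hypothetical, and in practice the camera part of an entry would be far larger than the lidar part):

```python
def build_linked_database(cam_sections, lidar_sections):
    # Store simultaneously recorded camera and lidar sub-sections as one
    # linked entry: a single concatenated vector per patch pair.
    return [tuple(cam) + tuple(lidar)
            for cam, lidar in zip(cam_sections, lidar_sections)]

def split_entry(entry, cam_len):
    # Recover the camera part and the linked lidar part of one entry.
    return entry[:cam_len], entry[cam_len:]

db = build_linked_database([[1, 2], [3, 4]], [[9], [8]])
cam, lidar = split_entry(db[0], 2)
```

Because both modalities live in one entry, finding the closest camera part immediately yields the plausible lidar part, and vice versa.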
- At least one item of identification information 15 can additionally be received, the piece-wise replacement in quilting step 100 then additionally taking into account the at least one identification information item 15 received.
- sensor data patches 60, 61 can be preselected as a function of the identification information 15 received, so that the search for that sensor data patch 60, 61 with the smallest distance can be accelerated.
- the identification information 15 obtained is or is derived from context information 16 of an environment in which the sensor data 20, 21 of the sensors are or have been recorded.
- Context information 16 can, for example, be a geographical coordinate (e.g. a GPS coordinate), a time of day and/or season, a month, a day of the week, weather conditions (sun, rain, fog, snow, etc.) and/or a traffic context (city, country, motorway, pedestrian zone, country road, main road, secondary road, etc.).
- context information 16 can, for example, be recorded by means of at least one context sensor or be provided in some other way.
- context information can be requested from a vehicle controller, for example via a Controller Area Network (CAN) bus.
- a preselection from the sensor data patches 60, 61 can be made so that the search for the closest sensor data patch 60, 61 can be accelerated.
- the sensor data patches 60, 61 are each marked (“tagged”) in the database 40 with an associated expression of the context information.
Abstract
The invention relates to a method for robustifying sensor data (20, 21) against adversarial disturbances, the method comprising: receiving sensor data (20, 21) from at least two sensors (10, 11); replacing, by means of quilting, parts of the sensor data (20, 21) received from the at least two sensors (10, 11), the piece-wise replacement being carried out such that individual pieces of replaced sensor data (30, 31) from different sensors (10, 11) are plausible with respect to one another; and outputting the piece-wise replaced sensor data (30, 31). The invention further relates to a device (1) for robustifying sensor data (20, 21) against adversarial disturbances, a method for operating an assistance system for a vehicle, an assistance system for a vehicle, a computer program and a data carrier signal.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019219923.2A DE102019219923A1 (de) | 2019-12-17 | 2019-12-17 | Verfahren und Vorrichtung zum Robustifizieren von Sensordaten gegen adversariale Störungen |
PCT/EP2020/085650 WO2021122338A1 (fr) | 2019-12-17 | 2020-12-10 | Procédé et dispositif pour rendre des données de capteur plus robustes à l'égard de perturbations indésirables |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4078238A1 true EP4078238A1 (fr) | 2022-10-26 |
Family
ID=74068254
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20829547.7A Pending EP4078238A1 (fr) | 2019-12-17 | 2020-12-10 | Procédé et dispositif pour rendre des données de capteur plus robustes à l'égard de perturbations indésirables |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230052885A1 (fr) |
EP (1) | EP4078238A1 (fr) |
CN (1) | CN114829978A (fr) |
DE (1) | DE102019219923A1 (fr) |
WO (1) | WO2021122338A1 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113627597B (zh) * | 2021-08-12 | 2023-10-13 | 上海大学 | 一种基于通用扰动的对抗样本生成方法及系统 |
DE102022209129A1 (de) | 2022-09-02 | 2024-03-07 | Robert Bosch Gesellschaft mit beschränkter Haftung | Modellerstellvorrichtung und Modellerstellverfahren für zumindest zwei an und/oder in einem Fahrzeug montierte Sensorvorrichtungen |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017206123A1 (de) * | 2017-04-10 | 2018-10-11 | Robert Bosch Gmbh | Verfahren und Vorrichtung zur Fusion von Daten verschiedener Sensoren eines Fahrzeugs im Rahmen einer Objekterkennung |
-
2019
- 2019-12-17 DE DE102019219923.2A patent/DE102019219923A1/de active Pending
-
2020
- 2020-12-10 WO PCT/EP2020/085650 patent/WO2021122338A1/fr unknown
- 2020-12-10 US US17/786,302 patent/US20230052885A1/en active Pending
- 2020-12-10 EP EP20829547.7A patent/EP4078238A1/fr active Pending
- 2020-12-10 CN CN202080085755.XA patent/CN114829978A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
CN114829978A (zh) | 2022-07-29 |
US20230052885A1 (en) | 2023-02-16 |
DE102019219923A1 (de) | 2021-06-17 |
WO2021122338A1 (fr) | 2021-06-24 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
20220718 | 17P | Request for examination filed | |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |