WO2021122337A1 - Method and apparatus for recognizing an alienation of a sensor data domain from a reference data domain - Google Patents
- Publication number
- WO2021122337A1 (PCT application PCT/EP2020/085649)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor data
- sensor
- alienation
- data
- domain
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/28—Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
- G06V10/993—Evaluation of the quality of the acquired pattern
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/862—Combination of radar systems with sonar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/86—Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9324—Alternative operation using ultrasonic waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/023—Interference mitigation, e.g. reducing or avoiding non-intentional interference with other HF-transmitters, base station transmitters for mobile communication or other radar systems, e.g. using electro-magnetic interference [EMI] reduction techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/36—Means for anti-jamming, e.g. ECCM, i.e. electronic counter-counter measures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/495—Counter-measures or counter-counter-measures using electronic or electro-optical means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
- G06F18/24137—Distances to cluster centroïds
- G06F18/2414—Smoothing the distance, e.g. radial basis function networks [RBFN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- the invention relates to a method and a device for recognizing an alienation of a sensor data domain from a reference data domain.
- the invention also relates to a motor vehicle, a computer program and a data carrier signal.
- Machine learning, for example based on neural networks, has great potential for use in modern driver assistance systems and automated vehicles.
- Functions based on deep neural networks process sensor data (e.g. from cameras, radar or lidar sensors) in order to derive relevant information from it.
- This information includes, for example, a type and a position of objects in the surroundings of the motor vehicle, a behavior of the objects or a road geometry or topology.
- Convolutional neural networks (CNN) in particular are used for processing input data (e.g. image data).
- the convolution network independently develops feature maps based on filter channels that process the input data locally in order to derive local properties. These feature maps are then processed again by further filter channels, which derive higher-level feature maps from them.
- on the basis of this information compressed from the input data, the deep neural network finally derives its decision and makes it available as output data.
- An essential feature in the development of deep neural networks is the purely data-driven parameter fitting without expert intervention:
- a deviation of an output (for a given parameterization) of a neural network from a ground truth is determined (the so-called loss).
- the loss function used here is selected in such a way that the parameters of the neural network depend on it in a differentiable manner.
- the parameters of the neural network are adapted in each training step depending on the derivative of the deviation (determined on several examples). These training steps are repeated very often until the loss no longer diminishes.
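- the data-driven parameter fitting described above can be sketched in a minimal illustrative example: a linear model with a squared-error loss, whose parameters are adapted along the derivative of the loss until the loss no longer diminishes. This is an assumption-laden sketch (the names `loss`, `train_step`, the model, and the learning rate are illustrative), not the patent's implementation:

```python
import numpy as np

def loss(w, x, y):
    """Squared deviation of the model output from the ground truth (the 'loss')."""
    return ((x @ w - y) ** 2).mean()

def train_step(w, x, y, lr=0.1):
    """One training step: adapt the parameters along the derivative of the loss."""
    grad = 2 * x.T @ (x @ w - y) / len(y)
    return w - lr * grad

# synthetic training examples (ground truth generated from known parameters)
rng = np.random.default_rng(0)
x = rng.normal(size=(32, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = x @ true_w

w = np.zeros(3)
for _ in range(200):        # repeat training steps very often
    w = train_step(w, x, y)

assert loss(w, x, y) < 1e-3  # the loss no longer diminishes appreciably
```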
- the parameters of the neural network are determined without expert assessment or semantically motivated modeling.
- the parameters depend largely on the data used for training.
- Machine-learned models are generally very good at generalizing (successfully applying what they have learned to unknown data), but they can only do so within the data domains they were confronted with in the course of the training process. If, on the other hand, the trained model is applied to data from other data domains, this usually results in a greatly reduced accuracy of the output.
- the AI functions provided by the neural network can, for example, only be used in regions from which the data used for training originate or which are very similar to those regions.
- the invention is based on the object of creating a method and device for recognizing an alienation of a sensor data domain from a reference data domain.
- a method is proposed for recognizing an alienation of a sensor data domain from a reference data domain, in which detected sensor data are received from at least one sensor; in which, by means of a distance measure, distances are determined (or such distances are obtained) between partial excerpts of the received sensor data and the sensor data patches that are used for the piece-wise replacement of the sensor data by means of quilting and that belong to the reference data domain; and in which an alienation signal is generated and provided as a function of the determined distances.
- a device is proposed for recognizing an alienation of a sensor data domain from a reference data domain, comprising an input device for receiving acquired sensor data from at least one sensor, a computing device, and an output device; the computing device being set up to determine, by means of a distance measure, distances between partial sections of the received sensor data and the sensor data patches that are used for the piece-wise replacement of the sensor data by means of quilting and that belong to the reference data domain, or to obtain such distances, and to generate an alienation signal as a function of the determined distances; and the output device being set up to provide the generated alienation signal.
- the method and the device make it possible to determine an alienation of a sensor data domain from a reference data domain. In this way it can be determined in particular whether currently detected or received sensor data are (still) in the reference data domain or not.
- recorded sensor data from at least one sensor is received.
- the sensor data are, for example, environment data recorded from the environment by means of environment sensors of a motor vehicle. Distances between partial sections of the received sensor data and sensor data patches are determined by means of a distance measure.
- the sensor data patches are used in a quilting process to replace the sensor data piece by piece and belong to or define the reference data domain.
- the respective sections of the sensor data considered and the associated sensor data patches have the same format or the same size.
- the quilting itself can be part of the method; however, the quilting can also be carried out independently of the method, in which case only the distances between the partial excerpts of the sensor data and the associated sensor data patches, determined by means of the distance measure during quilting, are obtained.
- the partial sections of the sensor data can be linearized into vectors, for example, and compared with the respective sensor data patches, likewise linearized into vectors, using a vector norm.
- the L2 norm can be used here as a distance measure.
- An alienation signal is generated and provided as a function of the determined distances.
- Quilting particularly refers to the piece-by-piece replacement of sensor data. Quilting is used in particular to make sensor data that are fed to a neural network as input data robust against adversarial perturbations.
- Quilting can also be referred to as a piece-wise reconstruction of the sensor data.
- image quilting is also used in connection with image data. If, for example, images from a camera are involved, the camera image is divided into several partial sections. For this purpose, small, rectangular image sections (also referred to as patches, for example with a size of 8x8 picture elements / pixels) can be defined.
- the individual partial or image sections are compared with partial sections, referred to in this disclosure as sensor data patches, which are in particular stored in a database. The comparison takes place on the basis of a distance measure which is defined, for example, via a Euclidean distance on picture element vectors. For this purpose, a partial or image section is linearized as a vector.
- a distance is then determined using a vector norm, for example the L2 norm.
- the partial or image excerpts from the recorded sensor data are each replaced by the closest or most similar sensor data patch from the database. It can be provided here that a minimum distance must be observed, or that at least no identity may exist between the partial section from the sensor data and the sensor data patch. If the sensor data have a different form (e.g. lidar data) or a different format, the piece-by-piece replacement takes place in an analogous manner. The piece-by-piece replacement takes place for all partial excerpts of the sensor data, so that replaced or reconstructed sensor data are then available.
- the sensor data patches necessary for quilting are generated from sensor data which are trustworthy, that is to say which with certainty, or at least with a high degree of probability, contain no adversarial perturbations.
- the sensor data patches are provided, for example, in the form of a searchable database.
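- the quilting step described above can be sketched as follows, under assumed simplifications (a grayscale image, non-overlapping 8x8 sections, a small in-memory patch database instead of a searchable one, and the L2 norm on linearized vectors); all function names are illustrative, not taken from the patent:

```python
import numpy as np

PATCH = 8  # assumed section size: 8x8 picture elements

def linearize(section):
    """Linearize a PATCH x PATCH section into a vector for the distance measure."""
    return section.reshape(-1).astype(float)

def nearest_patch(section, database):
    """Return the database patch with the smallest L2 distance, and that distance."""
    v = linearize(section)
    dists = np.linalg.norm(database.reshape(len(database), -1) - v, axis=1)
    i = int(np.argmin(dists))
    return database[i], float(dists[i])

def quilt(image, database):
    """Replace each partial section piece by piece; also collect the distances."""
    out = image.copy()
    distances = []
    for r in range(0, image.shape[0] - PATCH + 1, PATCH):
        for c in range(0, image.shape[1] - PATCH + 1, PATCH):
            best, d = nearest_patch(image[r:r+PATCH, c:c+PATCH], database)
            out[r:r+PATCH, c:c+PATCH] = best
            distances.append(d)
    return out, distances
```

the list of distances collected here is exactly the by-product that the method of this disclosure evaluates in order to recognize an alienation.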
- the method and the device can be used to determine when a quilting quality of the recorded sensor data decreases because the sensor data domain and the reference data domain from which the sensor data patches used in quilting were generated have become alienated from one another. Furthermore, the method and the device are particularly advantageous when using trained neural networks, since they can also be used to determine whether currently acquired sensor data that are to be fed to a trained neural network for processing are still in a data domain on which the neural network was trained. Since the reference data domain is defined according to the method using the sensor data patches, it is particularly necessary for this purpose that the training data with which the neural network was trained also originate from the reference data domain.
- a typical application scenario for the method and the device is as follows.
- a neural network is used in a vehicle, for example for perception of the surroundings (object recognition, semantic segmentation, etc.).
- Recorded sensor data, in particular data on the surroundings of the vehicle, from at least one sensor (e.g. a camera, a lidar sensor, etc.) are fed to the neural network.
- the recorded sensor data are replaced piece by piece by means of quilting in order to make the recorded sensor data robust against adversarial perturbations.
- the quilting is done by determining distances from partial sections of the recorded sensor data to sensor data patches, which are stored in a database, for example, and replacing the individual partial sections with the most similar or closest sensor data patch in relation to the respective distance.
- the method described in this disclosure uses the distances determined during quilting in order to recognize and determine an alienation of the sensor data domain from the reference data domain defined by the sensor data patches.
- the sensor data can in principle be one-dimensional or multidimensional, in particular two-dimensional.
- the sensor data can be two-dimensional camera images from a camera and two-dimensional lidar data from a lidar sensor.
- it can also be sensor data from other sensors, such as, for example, radar sensors and / or ultrasonic sensors.
- the sensor data are or have been recorded in particular by means of at least one sensor of a vehicle.
- the sensor data include, in particular, data about the surroundings of the vehicle.
- a data domain is intended to denote, in particular, a set of data, in particular sensor data, which correspond to a specific context or which are similar in at least one property in terms of their origin.
- a context can be, for example, a geographical context, e.g. a data domain can include sensor data from one city, whereas a data domain different therefrom comprises sensor data from another city, etc.
- a sensor data domain in particular denotes a data domain in which current sensor data, recorded by means of the at least one sensor, are located.
- a reference data domain in particular denotes a data domain in which the sensor data are located, from which the sensor data patches used in quilting were generated.
- the sensor data recorded or mapped in the data domains are those sensor data that are required for a function for automated driving of a vehicle and / or for driver assistance of the vehicle and / or for environment detection or for a perception function.
- the method and the device are used in particular in at least one vehicle.
- a vehicle is in particular a motor vehicle.
- the vehicle can also be another land, air, water, rail or space vehicle.
- the method can also be used for other types of sensor data, for example in connection with robots, e.g. industrial robots or medical robots.
- An adversarial perturbation is, in particular, a deliberately introduced disturbance of the input data of a neural network, for example provided in the form of sensor data, in which the semantic content of the input data is not changed, but the disturbance causes the neural network to infer an incorrect result, that is, for example, to classify the input data incorrectly or to perform an incorrect semantic segmentation.
- a neural network is in particular a deep neural network, in particular a convolutional neural network (CNN).
- the neural network is or is, for example, trained for a specific perception function, for example for the perception of pedestrians or other objects in captured camera images.
- the method can be carried out as a computer-implemented method.
- the method can be carried out by means of a data processing device.
- the data processing device comprises in particular at least one computing device and at least one storage device.
- a computer program is also created, comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method steps of the disclosed method according to any of the described embodiments.
- a data carrier signal is also created that transmits the aforementioned computer program.
- the method includes acquiring the sensor data of the at least one sensor.
- Parts of the device, in particular the computing device, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
- the alienation signal is generated when the determined distances exceed at least one predetermined threshold value.
- the at least one predetermined threshold value can be used to set how sensitively the method reacts to an alienation or change in the sensor data domain from or with respect to the reference data domain. It can be provided here that the alienation signal is generated when a single specific distance exceeds the at least one predetermined threshold value. In this case, it is sufficient if a distance between a partial section of the received sensor data and an associated sensor data patch exceeds the at least one predetermined threshold value. However, it can also be provided that a minimum number of specific distances must exceed the at least one predetermined threshold value. Furthermore, it can be provided that an average value formed from all certain distances at least for a predetermined period of time or a predetermined number of certain distances must exceed the at least one predetermined threshold value so that the alienation signal is generated.
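- the three threshold variants listed above (a single distance, a minimum number of distances, or an average over a predetermined number of distances exceeding the threshold) can be sketched as a small helper; the function name, parameter names, and defaults are assumptions for illustration only:

```python
def alienation_signal(distances, threshold, min_count=1, window=None):
    """True if the determined quilting distances indicate an alienation of the
    sensor data domain from the reference data domain.

    - min_count=1: a single distance above the threshold suffices.
    - min_count>1: a minimum number of distances must exceed the threshold.
    - window=N:    the mean of the last N distances must exceed the threshold.
    """
    if window is not None:
        recent = distances[-window:]
        return sum(recent) / len(recent) > threshold
    return sum(1 for d in distances if d > threshold) >= min_count
```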
- the at least one predefined threshold value is predefined in a fixed manner, or is predefined in a position-dependent and/or context-dependent and/or sensor-dependent manner.
- the threshold value can be specified in particular as a function of a geographic position or region.
- the threshold value can be chosen differently in the area of a play street than in the area of a motorway, for example. In this way, a sensitivity when recognizing the alienation can be determined as a function of the position.
- with regard to context dependency, a situational context in particular can be considered.
- the threshold value can then be selected differently, for example, depending on whether a vehicle is on a motorway, a country road, a main road or a secondary road. Furthermore, weather conditions can also be taken into account, for example.
- the at least one threshold value can thus be selected as a function of whether one of the following weather conditions is present: rain, snow, fog, sunshine, etc.
- both a type of sensors or the physical measurement principle on which they are based as well as an arrangement and / or alignment and / or weather dependency of the sensors can be taken into account.
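- a position-, context- and weather-dependent threshold can be realized, for example, as a simple lookup table; the contexts and numerical values below are purely illustrative assumptions, not values from the patent:

```python
# illustrative threshold table keyed by (driving context, weather condition)
THRESHOLDS = {
    ("motorway", "sunshine"): 12.0,
    ("motorway", "rain"): 18.0,      # rain degrades quilting quality anyway
    ("play_street", "sunshine"): 6.0,  # more sensitive in a play street
}

def threshold_for(context, weather, default=10.0):
    """Select the predefined threshold value for the current driving context."""
    return THRESHOLDS.get((context, weather), default)
```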
- the generated alienation signal is used to initiate detection and / or collection of sensor data from the at least one sensor.
- updated sensor data can be recorded and collected for the alienated sensor data domain, which can then be used to (re) train a neural network.
- the neural network can thereby be brought up to date, corresponding to the changed sensor data domain.
- the generated alienation signal can be fed, for example, to a control device of a motor vehicle which, after receiving the alienation signal, starts detecting and / or collecting the sensor data of the at least one sensor.
- the collection takes place, for example, by means of a storage device set up for this purpose.
- the generated alienation signal is used to initiate a detection and / or collection of sensor data from the at least one sensor by a vehicle fleet.
- the initiation takes place, for example, via a communication interface set up for this purpose.
- the initiation can be mediated or coordinated via a central backend server, for example by the backend server causing other vehicles in the vehicle fleet to detect and / or collect sensor data from the at least one sensor after receiving the alienation signal generated by one of the vehicles in the vehicle fleet.
- the recorded and / or collected sensor data from all vehicles in the vehicle fleet are transmitted to the backend server, which collects them and then makes them available for (re) training a neural network.
- the detection and/or collection of the sensor data takes place in a position-dependent and/or context-dependent and/or sensor-dependent manner.
- sensor data from the alienated or changed sensor data domain can be captured and collected in a targeted manner for specific positions, in particular geographical positions, or regions, or for specific contexts (city, country, motorway, play street, rain, fog, etc.).
- a sensor data domain can be monitored in a targeted and detailed manner.
- the generated alienation signal is used to initiate the creation and / or updating of sensor data patches which are used in the quilting of the sensor data.
- the sensor data patches used during quilting can be updated to the changed or alienated sensor data domain.
- Sensor data patches updated in this way then form or define a new reference data domain.
- the creation and / or updating of the sensor data patches takes place on the basis of sensor data of the at least one sensor that are recorded and / or collected for the alienated or changed sensor data domain.
- the collected sensor data is divided into sections that form the new sensor data patches.
- the sensor data patches formed in this way are in particular stored in a database which forms the basis for a subsequent application of the quilting method.
- the creation and / or updating of the sensor data patches can also be carried out by means of a backend server, in particular in a coordinated manner for the entire vehicle fleet.
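- the creation and/or updating of the sensor data patches described above amounts to dividing the collected sensor data into sections; a minimal sketch under assumed simplifications (grayscale images, non-overlapping 8x8 sections, an array in place of a database), with illustrative names:

```python
import numpy as np

PATCH = 8  # assumed section size, e.g. 8x8 picture elements

def build_patch_database(images):
    """Divide collected sensor data into PATCH x PATCH sections that form
    the new sensor data patches, i.e. a new reference data domain."""
    patches = []
    for img in images:
        for r in range(0, img.shape[0] - PATCH + 1, PATCH):
            for c in range(0, img.shape[1] - PATCH + 1, PATCH):
                patches.append(img[r:r+PATCH, c:c+PATCH].copy())
    return np.stack(patches)
```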
- a motor vehicle comprising at least one device according to any of the described embodiments.
- FIG. 1 shows a schematic representation of an embodiment of the device for detecting an alienation of a sensor data domain from a reference data domain;
- FIG. 2 shows a schematic representation to illustrate the quilting method (prior art).
- FIG. 3 shows a schematic flow diagram of an embodiment of the method for recognizing an alienation of a sensor data domain from a reference data domain.
- FIG. 1 shows a schematic representation of an embodiment of the device 1 for recognizing an alienation of a sensor data domain from a reference data domain.
- the device 1 carries out the method described in this disclosure for recognizing an alienation of a sensor data domain from a reference data domain.
- the device 1 is embodied in a motor vehicle 50, for example.
- the device 1 comprises an input device 2, a computing device 3, a memory device 4 and an output device 5.
- the computing device 3 can access the memory device 4 and perform arithmetic operations on data stored in the memory device 4.
- Parts of the device 1, in particular the computing device 3, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
- the device 1 also carries out a quilting process.
- the computing device 3 has a quilting module 6.
- the input device 2 receives sensor data 10 which are detected by a sensor 51 of the motor vehicle 50.
- This sensor 51 can be, for example, a camera or a lidar sensor. In principle, more than just one sensor 51 can also be provided.
- the received sensor data 10 are fed to the quilting module 6, which replaces the sensor data 10 piece by piece by means of quilting.
- the quilting module 6 uses sensor data patches 60 which are stored in a database 40 in the storage device 4.
- the sensor data patches 60 stored in the database 40 define a reference data domain.
- the piecewise replaced or reconstructed sensor data 20 are fed via the output device 5 to a neural network 52 that performs an AI function, for example a perception of the surroundings (e.g. object recognition, semantic segmentation, etc.), on the sensor data 10 or on the piecewise replaced sensor data 20.
- the quilting module 6 determines the respective distances 11 between partial sections of the received sensor data 10 and the sensor data patches 60 stored in the database 40.
- the determined distances 11 are fed to a recognition module 7.
- the recognition module 7 generates an alienation signal 21 as a function of the determined distances 11.
- the recognition module 7 checks in particular whether the determined distances 11 exceed a predetermined threshold value 12 or not.
- the predefined threshold value 12 is predefined for the recognition module 7 from the outside.
- the alienation signal 21 is generated when a single determined distance 11 exceeds the at least one predetermined threshold value 12. In this case it is sufficient if a (single) determined distance 11 between a partial section of the received sensor data 10 and an associated sensor data patch 60 exceeds the at least one predetermined threshold value 12. However, it can also be provided that a minimum number (e.g. 10, 100, 1000, etc.) of determined distances 11 must exceed the at least one predetermined threshold value 12 for the alienation signal 21 to be generated. Furthermore, it can be provided that an average value, formed over all determined distances 11 at least for a predetermined period or for a predetermined number of determined distances 11, must exceed the at least one predetermined threshold value 12 for the alienation signal 21 to be generated.
- the output device 5 provides the generated alienation signal 21.
- the output device 5 outputs the alienation signal 21, for example in the form of a digital signal or digital data packet.
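The three generation criteria described above (a single distance above the threshold, a minimum number of such distances, or an average over a recent window) could be sketched as follows. The class, method, and parameter names are illustrative assumptions, not from the application:

```python
from collections import deque

class AlienationDetector:
    """Sketch of the recognition module: flags an alienation when quilting
    distances exceed a threshold, either for a minimum number of partial
    sections or on average over a sliding window of recent distances."""

    def __init__(self, threshold, min_count=1, window=100):
        self.threshold = threshold
        self.min_count = min_count
        self.recent = deque(maxlen=window)  # sliding window of distances

    def update(self, distances):
        """Feed the distances of one quilting pass; return True to signal."""
        self.recent.extend(distances)
        # Criterion 1/2: enough individual distances above the threshold.
        exceed = sum(1 for d in distances if d > self.threshold)
        if exceed >= self.min_count:
            return True
        # Criterion 3: mean over a full window above the threshold.
        if len(self.recent) == self.recent.maxlen:
            return sum(self.recent) / len(self.recent) > self.threshold
        return False

detector = AlienationDetector(threshold=5.0, min_count=3)
signal = detector.update([6.0, 6.5, 7.0])  # three sections exceed 5.0
```

The returned flag would then correspond to emitting the alienation signal 21 via the output device 5.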
- the distances 11 are determined independently of the quilting module 6.
- provision can be made for the at least one predefined threshold value 12 to be specified in a position-dependent and/or context-dependent and/or sensor-dependent manner.
- the computing device 3 receives the respective threshold value 12 for a current position and / or a current context and / or for each sensor from the outside.
- all threshold values 12 with the associated dependencies are specified externally and the computing device 3 itself determines which of the specified threshold values 12 must be used at a current point in time or for which sensor.
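An externally specified threshold table with position, context, and sensor dependencies, from which the computing device selects the applicable value, could be modeled as a simple lookup. All keys and values here are hypothetical placeholders:

```python
# Hypothetical externally supplied threshold table, keyed by
# (region, context, sensor); entries are illustrative only.
THRESHOLDS = {
    ("urban", "rain", "camera"): 4.0,
    ("urban", "clear", "camera"): 6.0,
    ("motorway", "clear", "lidar"): 3.5,
}
DEFAULT_THRESHOLD = 5.0

def select_threshold(region, context, sensor):
    """Pick the position-, context-, and sensor-dependent threshold,
    falling back to a default when no specific value is configured."""
    return THRESHOLDS.get((region, context, sensor), DEFAULT_THRESHOLD)
```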
- the generated alienation signal is used to initiate a detection and / or collection of sensor data 10 from the at least one sensor 51.
- the backend server 55 then initiates the acquisition and / or collection of the sensor data 10 in the motor vehicle 50 and in other motor vehicles in a vehicle fleet.
- the motor vehicle 50 collects sensor data 10 in the storage device 4.
- the collected sensor data are then transmitted to the backend server 55.
- the backend server 55 collects the collected sensor data from all motor vehicles in the vehicle fleet and uses them to compile a training data set for (re)training the neural network 52.
- the neural network 52 is then (re)trained on the backend server 55, and a structure and parameters of the (re)trained neural network 53 are transmitted to the motor vehicle 50 and the other motor vehicles, which replace the neural network 52 with the (re)trained neural network 53.
- the back-end server 55 can update the database 40 with the sensor data patches 60 on the basis of the transmitted, collected sensor data, in that the back-end server 55 generates updated sensor data patches 61 from the collected sensor data.
- the updated sensor data patches 61 are then transmitted to the motor vehicle 50 and the other motor vehicles in the vehicle fleet. In this way, both the neural network 52 and the reference data domain formed or defined by the sensor data patches 60 can be adapted to the alienated or changed sensor data domain.
- the detection and/or collection of the sensor data 10 takes place in a position-dependent and/or context-dependent and/or sensor-dependent manner.
- resources can be saved, for example if alienation has only occurred in certain regions and / or contexts and / or only for certain sensors, etc.
- a targeted and limited recording and / or collection of sensor data can be initiated so that existing resources can be used in a targeted manner.
- FIG. 2 shows a schematic representation to clarify quilting from the prior art using the example of a camera image 13.
- Sensor data 10, in the present case a camera image 13, are divided into partial sections 23.
- a database 40 is searched for a sensor data patch 60 that has the smallest distance to the respective sub-section 23 in terms of a distance measure.
- a sensor data patch 60 is an image section which has the size of the partial sections 23, i.e. the same number of picture elements (pixels), e.g. 8x8 picture elements each.
- the distance measure is, for example, the L2 norm, which is applied to vectors that have been generated by linearizing the partial sections 23 or image sections.
- each partial section 23 or image section is then replaced by the respective sensor data patch 60 with the smallest distance to it. It can be provided here that a minimum distance must be observed. In this way, all partial sections 23 or image sections are replaced by sensor data patches 60 from the database 40.
- Replaced partial sections 24 are created which, taken together, form the replaced sensor data 20 or a replaced camera image 25.
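The nearest-patch replacement described for FIG. 2 can be sketched as follows, assuming a grayscale image and the L2 norm applied to linearized sections. The function name is illustrative, and the sketch also returns the per-section distances, which the method described in this disclosure reuses for alienation detection:

```python
import numpy as np

def quilt(image, patch_db, patch_size=8):
    """Replace each non-overlapping image section by the database patch
    with the smallest L2 distance on the flattened (linearized) pixels,
    and return the replaced image together with the per-section distances."""
    db = np.stack([p.ravel() for p in patch_db]).astype(np.float32)
    out = image.copy()
    distances = []
    h, w = image.shape[:2]
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            section = image[y:y + patch_size, x:x + patch_size].ravel()
            d = np.linalg.norm(db - section, axis=1)  # L2 to every patch
            best = int(np.argmin(d))
            out[y:y + patch_size, x:x + patch_size] = patch_db[best]
            distances.append(float(d[best]))
    return out, distances

# Example: an all-zero 16x16 image is perfectly matched by a zero patch.
camera_image = np.zeros((16, 16), dtype=np.float32)
db = [np.zeros((8, 8), np.float32), np.ones((8, 8), np.float32)]
replaced, dists = quilt(camera_image, db)
```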
- the distances determined in each case in the quilting step 100 are used in the method described in this disclosure in order to recognize an alienation of the sensor data domain from a reference data domain formed or defined by the sensor data patches.
- FIG. 3 shows a schematic flow diagram of an embodiment of the method for recognizing an alienation of a sensor data domain from a reference data domain.
- In a method step 200, sensed sensor data from at least one sensor are received.
- the at least one sensor is in particular a sensor that detects the surroundings of a motor vehicle, for example a camera and / or a lidar sensor of the motor vehicle.
- In a method step 201, distances between partial sections of the received sensor data and the sensor data patches, which are used for the piecewise replacement of the sensor data by means of quilting and which belong to the reference data domain, are determined by means of a distance measure, or such distances are obtained.
- In a method step 202, it is checked whether the determined or obtained distances exceed at least one threshold value. If this is not the case, method steps 200-202 are repeated with subsequently acquired or updated sensor data.
- If, on the other hand, it is determined in method step 202 that the determined or obtained distances exceed the at least one threshold value, then in a method step 203 an alienation signal is generated and provided, in particular output.
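Method steps 200-203 can be sketched as a simple monitoring loop. Names are illustrative assumptions, and the distance computation (step 201) is passed in as a callable so the sketch stays independent of a concrete quilting implementation:

```python
def monitor(sensor_stream, quilt_and_measure, threshold):
    """Sketch of method steps 200-203: receive sensor data (200), obtain
    quilting distances (201), check them against the threshold (202),
    and yield an alienation signal when it is exceeded (203)."""
    for sensor_data in sensor_stream:                 # step 200
        distances = quilt_and_measure(sensor_data)    # step 201
        if any(d > threshold for d in distances):     # step 202
            yield {"alienation": True,
                   "max_distance": max(distances)}    # step 203

# Toy example: the second frame contains a distance above the threshold.
stream = [[1.0, 2.0], [1.5, 7.0]]
signals = list(monitor(stream, lambda frame: frame, threshold=5.0))
```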
- sensing and / or collecting of sensor data from the at least one sensor can be initiated.
- the recorded and / or collected sensor data can then be used for (re) training a neural network or for creating and / or updating sensor data patches used during quilting.
- both the neural network and the sensor data patches can be adapted to the alienated or changed sensor data domain.
- the customization also defines an updated reference data domain.
Abstract
The invention relates to a method for recognizing an alienation of a sensor data domain from a reference data domain, in which sensed sensor data (10) are received from at least one sensor (51), and distances (11) between partial sections (23) of the received sensor data (10) and sensor data patches (60), which are used for the piecewise replacement of the sensor data (10) by means of quilting and which belong to the reference data domain, are determined by means of a distance measure, or such distances (11) are obtained, and an alienation signal (21) is generated and provided as a function of the determined distances (11). The invention further relates to a device (1) for recognizing an alienation of a sensor data domain from a reference data domain, a motor vehicle (50) comprising at least one such device (1), a computer program, and a data carrier signal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20829546.9A EP4078237A1 (fr) | 2019-12-17 | 2020-12-10 | Procédé et appareil de reconnaissance de suppression d'un domaine de données de capteur à partir d'un domaine de données de référence |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019219927.5A DE102019219927A1 (de) | 2019-12-17 | 2019-12-17 | Verfahren und Vorrichtung zum Erkennen einer Entfremdung einer Sensordatendomäne von einer Referenzdatendomäne |
DE102019219927.5 | 2019-12-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021122337A1 (fr) | 2021-06-24 |
Family
ID=74068253
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2020/085649 WO2021122337A1 (fr) | 2019-12-17 | 2020-12-10 | Procédé et appareil de reconnaissance de suppression d'un domaine de données de capteur à partir d'un domaine de données de référence |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4078237A1 (fr) |
DE (1) | DE102019219927A1 (fr) |
WO (1) | WO2021122337A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022112655A1 (de) | 2022-05-19 | 2023-11-23 | Bayerische Motoren Werke Aktiengesellschaft | Fahrassistenzsystem und Fahrassistenzverfahren für ein Fahrzeug |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017206123A1 (de) * | 2017-04-10 | 2018-10-11 | Robert Bosch Gmbh | Verfahren und Vorrichtung zur Fusion von Daten verschiedener Sensoren eines Fahrzeugs im Rahmen einer Objekterkennung |
-
2019
- 2019-12-17 DE DE102019219927.5A patent/DE102019219927A1/de active Pending
-
2020
- 2020-12-10 WO PCT/EP2020/085649 patent/WO2021122337A1/fr unknown
- 2020-12-10 EP EP20829546.9A patent/EP4078237A1/fr active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017206123A1 (de) * | 2017-04-10 | 2018-10-11 | Robert Bosch Gmbh | Verfahren und Vorrichtung zur Fusion von Daten verschiedener Sensoren eines Fahrzeugs im Rahmen einer Objekterkennung |
Non-Patent Citations (4)
Title |
---|
ALEXEI A EFROS ET AL: "Image quilting for texture synthesis and transfer", COMPUTER GRAPHICS. SIGGRAPH 2001. CONFERENCE PROCEEDINGS. LOS ANGELES, CA, AUG. 12 - 17, 2001; [COMPUTER GRAPHICS PROCEEDINGS. SIGGRAPH], NEW YORK, NY : ACM, US, 1 August 2001 (2001-08-01), pages 341 - 346, XP058253454, ISBN: 978-1-58113-374-5, DOI: 10.1145/383259.383296 * |
ANONYMOUS: "Textursynthese - Wikipedia", 21 March 2021 (2021-03-21), XP055787992, Retrieved from the Internet <URL:https://de.wikipedia.org/wiki/Textursynthese> [retrieved on 20210321] * |
CHUAN GUO ET AL.: "Countering Adversarial Images Using Input Transformations", ARXIV:1711.00117V3, 25 January 2018 (2018-01-25), Retrieved from the Internet <URL:https://arxiv.org/pdf/1711.00117.pdf>
CHUAN GUO ET AL: "Countering Adversarial Images using Input Transformations", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 31 October 2017 (2017-10-31), XP081307843 * |
Also Published As
Publication number | Publication date |
---|---|
DE102019219927A1 (de) | 2021-06-17 |
EP4078237A1 (fr) | 2022-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AT521607B1 (de) | Verfahren und Vorrichtung zum Testen eines Fahrerassistenzsystem | |
EP3789926A1 (fr) | Procédé de détection d'une perturbation adversaire dans des données d'entrée d'un réseau de neurones | |
DE102016210534A1 (de) | Verfahren zum Klassifizieren einer Umgebung eines Fahrzeugs | |
WO2013152929A1 (fr) | Procédé d'apprentissage en vue de la reconnaissance automatique de panneaux de signalisation routière, procédé de détermination d'un jeu de paramètres actualisés pour la classification d'un panneau de signalisation routière et système de reconnaissance de panneaux de signalisation routière | |
DE102019208735B4 (de) | Verfahren zum Betreiben eines Fahrassistenzsystems eines Fahrzeugs und Fahrerassistenzsystem für ein Fahrzeug | |
EP3828758A1 (fr) | Procédé de classification des objets, circuit de classification des objets, véhicule automobile | |
EP3644239A1 (fr) | Procédé et dispositif d'abstraction d'un ensemble de données | |
DE102019106122A1 (de) | Automatisiertes Fahrsystem | |
EP4078238A1 (fr) | Procédé et dispositif pour rendre des données de capteur plus robustes à l'égard de perturbations indésirables | |
WO2021122337A1 (fr) | Procédé et appareil de reconnaissance de suppression d'un domaine de données de capteur à partir d'un domaine de données de référence | |
DE102018205248B4 (de) | Fusionssystem zur Fusion von Umfeldinformation für ein Kraftfahrzeug | |
EP3985565A1 (fr) | Procédé et dispositif de vérification d'un système de traitement d'informations basé sur l'ia utilisé lors de la commande partiellement automatisée ou entièrement automatisée d'un véhicule | |
DE102021200643B3 (de) | Verfahren zur Umfelderkennung für teilautonome oder autonome Fahrfunktionen eines Kraftfahrzeugs mittels eines neuronalen Netzes | |
EP4049186A1 (fr) | Procédé pour robustifier un réseau neuronal vis-à-vis de perturbations antagonistes | |
WO2021122339A1 (fr) | Procédé et dispositif permettant de fabriquer un réseau neuronal plus robuste par rapport à des perturbations antagonistes | |
EP3973458A1 (fr) | Procédé pour faire fonctionner un réseau neuronal profond | |
EP3973466A1 (fr) | Procédé pour rendre un réseau neuronal plus robuste de manière spécifique à son fonctionnement | |
DE102019119084A1 (de) | Bestimmen eines Signalstatus einer Lichtsignalanlage | |
DE102019219924B4 (de) | Verfahren und Vorrichtung zum Erzeugen und Bereitstellen einer Datenbank mit darin hinterlegten Sensordatenpatches zur Verwendung beim Quilting | |
DE102023000469B3 (de) | Verfahren zur Aufzeichnung von erfassen Umgebungsdaten | |
EP4179471A1 (fr) | Procédé et dispositif pour évaluer et certifier une robustesse d'un système de traitement d'information fondé sur l'ia | |
DE102020120934A1 (de) | Verfahren zum Bereitstellen eines komprimierten neuronalen Netzes zur Multi-Label Multi-Klassen Kategorisierung, Fahrzeugassistenzeinrichtung zur Umgebungskategorisierung und Kraftfahrzeug | |
DE102020203819A1 (de) | Verfahren zum Betreiben eines zumindest teilautomatisiert fahrenden Fahrzeugs und Fahrzeug | |
DE102020213058A1 (de) | Verfahren und Vorrichtung zum teilautomatisierten oder vollautomatisierten Steuern eines Fahrzeugs | |
DE102019219926A1 (de) | Verfahren und Vorrichtung zum Trainieren eines Neuronalen Netzes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20829546 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020829546 Country of ref document: EP Effective date: 20220718 |