EP4078237A1 - Method and apparatus for recognising removal of a sensor data domain from a reference data domain - Google Patents

Method and apparatus for recognising removal of a sensor data domain from a reference data domain

Info

Publication number
EP4078237A1
EP4078237A1 (application EP20829546.9A)
Authority
EP
European Patent Office
Prior art keywords
sensor data
sensor
alienation
data
domain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20829546.9A
Other languages
German (de)
French (fr)
Inventor
Peter Schlicht
Fabian HÜGER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP4078237A1 publication Critical patent/EP4078237A1/en
Pending legal-status Critical Current

Classifications

    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G06F18/28 Determining representative reference patterns, e.g. by averaging or distorting; Generating dictionaries
    • G06T7/001 Industrial image inspection using an image reference approach
    • G06V10/82 Image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G06V10/993 Evaluation of the quality of the acquired pattern
    • G01S13/862 Combination of radar systems with sonar systems
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/89 Radar or analogous systems specially adapted for mapping or imaging
    • G01S15/86 Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • G01S15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S2013/9323 Radar anti-collision for land vehicles; alternative operation using light waves
    • G01S2013/9324 Radar anti-collision for land vehicles; alternative operation using ultrasonic waves
    • G01S7/023 Interference mitigation, e.g. reducing or avoiding non-intentional interference, e.g. using electro-magnetic interference [EMI] reduction techniques
    • G01S7/36 Means for anti-jamming, e.g. ECCM, i.e. electronic counter-counter measures
    • G01S7/495 Counter-measures or counter-counter-measures using electronic or electro-optical means
    • G06F18/2414 Smoothing the distance, e.g. radial basis function networks [RBFN]
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30168 Image quality inspection
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • the invention relates to a method and a device for recognizing an alienation of a sensor data domain from a reference data domain.
  • the invention also relates to a motor vehicle, a computer program and a data carrier signal.
  • Machine learning, for example based on neural networks, has great potential for use in modern driver assistance systems and automated vehicles.
  • Functions based on deep neural networks process sensor data (e.g. from cameras, radar or lidar sensors) in order to derive relevant information from them.
  • This information includes, for example, a type and a position of objects in the surroundings of the motor vehicle, a behavior of the objects or a road geometry or topology.
  • The convolutional network independently develops feature maps based on filter channels that process the input data locally in order to derive local properties. These feature maps are then processed again by further filter channels, which derive higher-value feature maps from them.
  • On the basis of this information compressed from the input data, the deep neural network finally derives its decision and makes it available as output data.
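The local filtering that produces a feature map can be sketched in a few lines; the 3x3 vertical-edge kernel and the toy input are illustrative assumptions, not part of the patent.

```python
import numpy as np

def feature_map(image, kernel):
    """Valid 2-D cross-correlation of `image` with a small `kernel`:
    each output value is derived from a local window of the input."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

img = np.zeros((8, 8))
img[:, 4:] = 1.0                              # toy input with a vertical edge
edge_filter = np.array([[-1.0, 0.0, 1.0]] * 3)  # filter channel responding to vertical edges
fmap = feature_map(img, edge_filter)           # local properties: strong response at the edge
```

In a real CNN many such filter channels run in parallel and their kernels are learned rather than hand-set; this sketch only shows how one channel turns local input windows into one feature map.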
  • An essential feature in the development of deep neural networks is the purely data-driven parameter fitting without expert intervention:
  • A deviation of an output (for a given parameterization) of the neural network from a ground truth is determined (the so-called loss).
  • the loss function used here is selected in such a way that the parameters of the neural network depend on it in a differentiable manner.
  • The parameters of the neural network are adapted in each training step depending on the derivative of the deviation (determined on several examples). These training steps are repeated very often until the loss no longer decreases.
  • the parameters of the neural network are determined without expert assessment or semantically motivated modeling.
  • the parameters depend largely on the data used for training.
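The purely data-driven parameter fitting described above can be sketched as plain gradient descent; the linear model, learning rate and data here are hypothetical stand-ins for a deep network.

```python
import numpy as np

def train_step(w, x, y_true, lr=0.1):
    """One training step: compute the loss (deviation from ground truth),
    differentiate it with respect to the parameters, and adapt them."""
    y_pred = x @ w
    loss = np.mean((y_pred - y_true) ** 2)        # differentiable loss function
    grad = 2 * x.T @ (y_pred - y_true) / len(x)   # derivative of the loss w.r.t. w
    return w - lr * grad, loss

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 4))
w_true = np.array([1.0, -2.0, 0.5, 3.0])          # unknown "ground truth" parameters
y = x @ w_true

w = np.zeros(4)                                   # no expert initialization or modeling
losses = []
for _ in range(200):                              # repeat until the loss stops shrinking
    w, loss = train_step(w, x, y)
    losses.append(loss)
```

The recovered parameters depend only on the training data, mirroring the point that no expert assessment or semantic modeling enters the fit.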
  • Machine-learned models are generally very good at generalizing (successfully applying what they have learned to unknown data), but they can only do so within the data domains they were confronted with in the course of the training process. If, on the other hand, the trained model is applied to data from other data domains, this usually results in a greatly reduced accuracy of the output.
  • the AI functions provided by the neural network can, for example, only be used in regions from which the data used for training originate or are very similar.
  • the invention is based on the object of creating a method and device for recognizing an alienation of a sensor data domain from a reference data domain.
  • a method for recognizing an alienation of a sensor data domain from a reference data domain, with detected sensor data being received from at least one sensor, with distances between partial excerpts of the received sensor data and partially by means of a distance measure Replacing by means of quilting the sensor data used and belonging to the reference data domain sensor data patches are determined or such distances are obtained, and an alienation signal is generated and provided as a function of the determined distances.
  • a device for recognizing an alienation of a sensor data domain from a reference data domain comprising an input device for receiving acquired sensor data from at least one sensor, a computing device, and an output device, the computing device being set up to use a distance measure to determine distances between partial sections of the received sensor data and for the piece-wise replacement by means of quilting of the sensor data and belonging to the reference data domain sensor data patches to determine or to obtain such distances, and to generate an alienation signal as a function of the determined distances, and wherein the output device is set up to provide the generated alienation signal.
  • the method and the device make it possible to determine an alienation of a sensor data domain from a reference data domain. In this way it can be determined in particular whether currently detected or received sensor data are (still) in the reference data domain or not.
  • recorded sensor data from at least one sensor is received.
  • the sensor data are, for example, environment data recorded from the environment by means of environment sensors of a motor vehicle. Distances between partial sections of the received sensor data and sensor data patches are determined by means of a distance measure.
  • The sensor data patches are those used in a quilting process to replace the sensor data piece by piece; they belong to, or define, the reference data domain.
  • the respective sections of the sensor data considered and the associated sensor data patches have the same format or the same size.
  • The quilting itself can be part of the method; however, the quilting can also be carried out independently of the method, in which case only the distances determined for quilting between the partial excerpts of the sensor data and the associated sensor data patches, measured by means of the distance measure, are obtained.
  • The partial sections of the sensor data can, for example, be linearized into vectors and compared with the respective sensor data patches, likewise linearized into vectors, using a vector norm.
  • For example, the L2 norm can be used here as a distance measure.
  • An alienation signal is generated and provided as a function of the determined distances.
  • Quilting particularly refers to the piece-by-piece replacement of sensor data. Quilting is used in particular to harden sensor data that are fed to a neural network as input data against adversarial perturbations.
  • Quilting can also be referred to as a piece-wise reconstruction of the sensor data.
  • The term image quilting is also used in connection with image data. If, for example, images from a camera are involved, the camera image is divided into several partial sections. For this purpose, small rectangular image sections (also referred to as patches, for example with a size of 8x8 picture elements/pixels) can be defined.
  • The individual partial or image sections are compared with partial sections, referred to in this disclosure as sensor data patches, which are in particular stored in a database. The comparison takes place on the basis of a distance measure which is defined, for example, via a Euclidean distance on picture-element vectors. For this purpose, a partial or image section is linearized as a vector.
  • A distance is then determined using a vector norm, for example the L2 norm.
  • The partial or image excerpts from the recorded sensor data are each replaced by the closest or most similar sensor data patch from the database. It can be provided here that a minimum distance must be observed, or at least that the partial section from the sensor data and the sensor data patch must not be identical. If the sensor data have a different form (e.g. lidar data) or a different format, the piece-by-piece replacement takes place in an analogous manner. The piece-by-piece replacement takes place for all partial excerpts of the sensor data, so that replaced or reconstructed sensor data are then available.
  • The sensor data patches necessary for quilting are generated from sensor data which are trustworthy, that is to say which with certainty, or at least with a high degree of probability, contain no adversarial perturbations.
  • the sensor data patches are provided, for example, in the form of a searchable database.
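The quilting steps above can be sketched under simple assumptions: a grayscale image, non-overlapping 8x8 patches, and a random array standing in for the trusted patch database. Each patch is linearized into a vector and replaced by its nearest neighbour under the L2 norm; the per-patch distances are the quantity the alienation check later relies on.

```python
import numpy as np

def quilt(image, patch_db, patch=8):
    """Replace each non-overlapping patch of `image` with the closest
    sensor data patch from `patch_db` (rows of length patch*patch).
    Returns the reconstructed image and the nearest-neighbour distances."""
    h, w = image.shape
    out = np.empty_like(image)
    distances = []
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            v = image[i:i+patch, j:j+patch].reshape(-1)  # linearize to a vector
            d = np.linalg.norm(patch_db - v, axis=1)     # L2 distance to every patch
            k = np.argmin(d)                             # most similar patch
            out[i:i+patch, j:j+patch] = patch_db[k].reshape(patch, patch)
            distances.append(d[k])
    return out, distances

rng = np.random.default_rng(1)
db = rng.random((256, 64))            # hypothetical trusted patch database
img = rng.random((32, 32))            # stand-in for a captured camera image
reconstructed, dists = quilt(img, db)  # 16 patches, 16 distances
```

In practice the database lookup would be a proper nearest-neighbour search rather than a brute-force scan, and the minimum-distance / non-identity constraints mentioned above would be enforced when selecting `k`.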
  • The method and the device can be used to determine when the quilting quality of the recorded sensor data decreases because the sensor data domain and the reference data domain from which the sensor data patches used in quilting were generated have become alienated from one another. Furthermore, the method and the device are particularly advantageous when using trained neural networks, since they can also be used to determine whether currently acquired sensor data that are to be fed to a trained neural network for processing are still in the data domain on which the neural network was trained. Since the reference data domain is defined according to the method using the sensor data patches, it is necessary for this purpose that the training data with which the neural network was trained also originate from the reference data domain.
  • a typical application scenario for the method and the device is as follows.
  • a neural network is used in a vehicle, for example for perception of the surroundings (object recognition, semantic segmentation, etc.).
  • Recorded sensor data, in particular data on the surroundings of the vehicle from at least one sensor (e.g. a camera, a lidar sensor, etc.), are fed to the neural network.
  • The recorded sensor data are replaced piece by piece by means of quilting in order to harden the recorded sensor data against adversarial perturbations.
  • the quilting is done by determining distances from partial sections of the recorded sensor data to sensor data patches, which are stored in a database, for example, and replacing the individual partial sections with the most similar or closest sensor data patch in relation to the respective distance.
  • the method described in this disclosure uses the distances determined during quilting in order to recognize and determine an alienation of the sensor data domain from the reference data domain defined by the sensor data patches.
  • the sensor data can in principle be one-dimensional or multidimensional, in particular two-dimensional.
  • the sensor data can be two-dimensional camera images from a camera and two-dimensional lidar data from a lidar sensor.
  • it can also be sensor data from other sensors, such as, for example, radar sensors and / or ultrasonic sensors.
  • the sensor data are or have been recorded in particular by means of at least one sensor of a vehicle.
  • the sensor data include, in particular, data about the surroundings of the vehicle.
  • a data domain is intended to denote, in particular, a set of data, in particular sensor data, which correspond to a specific context or whose data are similar in terms of their origin in at least one property.
  • a context can be, for example, a geographical context, e.g. a data domain can include sensor data from one city, whereas a data domain different therefrom comprises sensor data from another city, etc.
  • a sensor data domain in particular denotes a data domain in which current sensor data, recorded by means of the at least one sensor, are located.
  • a reference data domain in particular denotes a data domain in which the sensor data are located, from which the sensor data patches used in quilting were generated.
  • the sensor data recorded or mapped in the data domains are those sensor data that are required for a function for automated driving of a vehicle and / or for driver assistance of the vehicle and / or for environment detection or for a perception function.
  • the method and the device are used in particular in at least one vehicle.
  • a vehicle is in particular a motor vehicle.
  • the vehicle can also be another land, air, water, rail or space vehicle.
  • the method can also be used for other types of sensor data, for example in connection with robots, e.g. industrial robots or medical robots.
  • An adversarial perturbation is, in particular, a deliberately introduced disruption of the input data of a neural network, for example provided in the form of sensor data, in which the semantic content of the input data is not changed, but the disruption causes the neural network to infer an incorrect result, that is, for example, to classify the input data incorrectly or to segment them semantically incorrectly.
  • a neural network is in particular a deep neural network, in particular a convolutional neural network (CNN).
  • the neural network is or is, for example, trained for a specific perception function, for example for the perception of pedestrians or other objects in captured camera images.
  • the method can be carried out as a computer-implemented method.
  • the method can be carried out by means of a data processing device.
  • the data processing device comprises in particular at least one computing device and at least one storage device.
  • A computer program is also provided, comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method steps of the disclosed method according to any of the described embodiments.
  • a data carrier signal is also created that transmits the aforementioned computer program.
  • the method includes acquiring the sensor data of the at least one sensor.
  • Parts of the device, in particular the computing device, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
  • the alienation signal is generated when the determined distances exceed at least one predetermined threshold value.
  • The at least one predetermined threshold value can be used to set how sensitively the method reacts to an alienation of the sensor data domain from the reference data domain. It can be provided that the alienation signal is generated as soon as a single determined distance exceeds the at least one predetermined threshold value; in this case, it is sufficient if the distance between one partial section of the received sensor data and its associated sensor data patch exceeds the threshold value. However, it can also be provided that a minimum number of determined distances must exceed the at least one predetermined threshold value. Furthermore, it can be provided that an average value, formed from all distances determined over a predetermined period of time or from a predetermined number of determined distances, must exceed the at least one predetermined threshold value for the alienation signal to be generated.
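The threshold variants (single distance, minimum count, running mean) can be sketched as follows; the threshold values and sample distances are illustrative assumptions.

```python
def alienation_signal(distances, threshold, min_count=1):
    """Raise the signal when at least `min_count` of the quilting
    distances exceed the predetermined threshold value."""
    exceeded = sum(1 for d in distances if d > threshold)
    return exceeded >= min_count

def alienation_signal_mean(distances, threshold, window=100):
    """Raise the signal when the mean of the last `window` distances
    exceeds the predetermined threshold value."""
    recent = distances[-window:]
    return sum(recent) / len(recent) > threshold

# Hypothetical quilting distances: in-domain data vs. alienated data.
d_in_domain = [0.5, 0.6, 0.4]
d_alienated = [0.5, 2.5, 2.8]

sig_single = alienation_signal(d_alienated, threshold=1.0)            # True
sig_count = alienation_signal(d_alienated, threshold=1.0, min_count=2)  # True
sig_mean = alienation_signal_mean(d_in_domain, threshold=1.0)         # False
```

The choice between the variants trades responsiveness against robustness: a single-distance trigger reacts to one unusual patch, while the windowed mean suppresses isolated outliers.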
  • The at least one predefined threshold value is fixed, or is predefined as a function of position and/or context and/or sensor.
  • the threshold value can be specified in particular as a function of a geographic position or region.
  • For example, the threshold value can be chosen differently in the area of a play street than in the area of a motorway. In this way, the sensitivity when recognizing the alienation can be set as a function of the position.
  • With regard to context dependency, a situational context in particular can be considered.
  • the threshold value can then be selected differently, for example, depending on whether a vehicle is on a motorway, a country road, a main road or a secondary road. Furthermore, weather conditions can also be taken into account, for example.
  • the at least one threshold value can thus be selected as a function of whether one of the following weather conditions is present: rain, snow, fog, sunshine, etc.
  • both a type of sensors or the physical measurement principle on which they are based as well as an arrangement and / or alignment and / or weather dependency of the sensors can be taken into account.
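A position-, context- and sensor-dependent threshold can be realized as a simple lookup; the sensor types, road classes, weather conditions and numeric values below are purely hypothetical.

```python
# Hypothetical threshold table keyed by (sensor type, road type, weather).
THRESHOLDS = {
    ("camera", "motorway", "clear"): 3.0,
    ("camera", "motorway", "rain"): 4.0,        # rain degrades images: less sensitive
    ("camera", "play_street", "clear"): 1.5,    # more sensitive in shared zones
    ("lidar", "motorway", "fog"): 5.0,          # fog strongly affects lidar returns
}

def threshold_for(sensor, road_type, weather, default=3.5):
    """Select the predefined threshold for the current sensor and context,
    falling back to a default when no specific entry exists."""
    return THRESHOLDS.get((sensor, road_type, weather), default)

t = threshold_for("camera", "play_street", "clear")
```

A production system might instead interpolate over a map of geographic regions, but the principle of selecting the sensitivity per position, context and sensor is the same.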
  • the generated alienation signal is used to initiate detection and / or collection of sensor data from the at least one sensor.
  • updated sensor data can be recorded and collected for the alienated sensor data domain, which can then be used to (re) train a neural network.
  • the neural network can thereby be brought up to date, corresponding to the changed sensor data domain.
  • the generated alienation signal can be fed, for example, to a control device of a motor vehicle which, after receiving the alienation signal, starts detecting and / or collecting the sensor data of the at least one sensor.
  • the collection takes place, for example, by means of a storage device set up for this purpose.
  • the generated alienation signal is used to initiate a detection and / or collection of sensor data from the at least one sensor by a vehicle fleet.
  • the initiation takes place, for example, via a communication interface set up for this purpose.
  • the initiation can be mediated or coordinated via a central backend server, for example by the backend server causing other vehicles in the vehicle fleet to detect and / or collect sensor data from the at least one sensor after receiving the alienation signal generated by one of the vehicles in the vehicle fleet.
  • the recorded and / or collected sensor data from all vehicles in the vehicle fleet are transmitted to the backend server, which collects them and then makes them available for (re) training a neural network.
  • the detection and / or collection of the sensor data takes place as a function of the position and / or context-dependent and / or as a function of the sensor.
  • In particular, sensor data from the alienated or changed sensor data domain can be captured and collected in a targeted manner for specific positions, in particular geographical positions, or regions, or for specific contexts (city, country, motorway, play street, rain, fog, etc.).
  • a sensor data domain can be monitored in a targeted and detailed manner.
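The position-, context- and sensor-dependent collection described above can be illustrated with a minimal Python sketch. This is not part of the patent; the function name `should_collect` and all parameter names are illustrative assumptions:

```python
def should_collect(alienation_active, position, context, sensor_id,
                   target_regions, target_contexts, target_sensors):
    """Sketch of targeted data collection: after an alienation signal has been
    generated, a frame is recorded only if it matches the targeted regions,
    contexts (e.g. 'city', 'motorway', 'rain', 'fog') and sensors."""
    if not alienation_active:
        return False
    # all three dependencies (position, context, sensor) must match
    return (position in target_regions
            and context in target_contexts
            and sensor_id in target_sensors)
```

Such a filter lets a fleet use its recording and transmission resources only where the alienation was actually observed.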
  • the generated alienation signal is used to initiate the creation and / or updating of sensor data patches which are used in the quilting of the sensor data.
  • the sensor data patches used during quilting can be updated to the changed or alienated sensor data domain.
  • Sensor data patches updated in this way then form or define a new reference data domain.
  • the creation and / or updating of the sensor data patches takes place on the basis of sensor data of the at least one sensor that are recorded and / or collected for the alienated or changed sensor data domain.
  • the collected sensor data is divided into sections that form the new sensor data patches.
  • the sensor data patches formed in this way are in particular stored in a database which forms the basis for a subsequent application of the quilting method.
  • the creation and / or updating of the sensor data patches can also be carried out by means of a backend server, in particular in a coordinated manner for the entire vehicle fleet.
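The creation of new sensor data patches from collected sensor data, by dividing the data into sections, can be sketched as follows. This is a minimal illustration only, assuming grayscale images and a fixed 8x8 patch size; the function name `update_patch_database` is a hypothetical choice:

```python
import numpy as np

PATCH = 8  # patch edge length in pixels, as in the 8x8 example in this disclosure

def update_patch_database(collected_images, patch_db=None):
    """Divide collected sensor data (list of H x W arrays) into PATCH x PATCH
    sections that form new sensor data patches, and append them to the
    (possibly empty) database, which then defines the updated reference
    data domain."""
    new_patches = []
    for img in collected_images:
        h, w = img.shape
        for y in range(0, h - PATCH + 1, PATCH):
            for x in range(0, w - PATCH + 1, PATCH):
                new_patches.append(img[y:y + PATCH, x:x + PATCH])
    new_patches = np.stack(new_patches)
    if patch_db is None:
        return new_patches
    return np.concatenate([patch_db, new_patches])
```

In the coordinated fleet setting, a backend server would run such an update over the sensor data transmitted by all vehicles and then distribute the updated database.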
  • a motor vehicle comprising at least one device according to any of the described embodiments.
  • FIG. 1 shows a schematic representation of an embodiment of the device for detecting an alienation of a sensor data domain from a reference data domain;
  • FIG. 2 shows a schematic representation to illustrate the quilting method (prior art).
  • FIG. 3 shows a schematic flow diagram of an embodiment of the method for recognizing an alienation of a sensor data domain from a reference data domain.
  • FIG. 1 shows a schematic representation of an embodiment of the device 1 for recognizing an alienation of a sensor data domain from a reference data domain.
  • the device 1 carries out the method described in this disclosure for recognizing an alienation of a sensor data domain from a reference data domain.
  • the device 1 is embodied in a motor vehicle 50, for example.
  • the device 1 comprises an input device 2, a computing device 3, a memory device 4 and an output device 5.
  • the computing device 3 can access the memory device 4 and perform arithmetic operations on data stored in the memory device 4.
  • Parts of the device 1, in particular the computing device 3, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
  • the device 1 also carries out a quilting process.
  • the computing device 3 has a quilting module 6.
  • the input device 2 receives sensor data 10 which are detected by a sensor 51 of the motor vehicle 50.
  • This sensor 51 can be, for example, a camera or a lidar sensor. In principle, more than just one sensor 51 can also be provided.
  • the received sensor data 10 are fed to the quilting module 6, which replaces the sensor data 10 piece by piece by means of quilting.
  • the quilting module 6 uses sensor data patches 60 which are stored in a database 40 in the storage device 4.
  • the sensor data patches 60 stored in the database 40 define a reference data domain.
  • the piecewise replaced or reconstructed sensor data 20 are fed via the output device 5 to a neural network 52 that performs an AI function, for example a perception of the surroundings (e.g. object recognition, semantic segmentation, etc.), in the sensor data 10 or in the piecewise replaced sensor data 20.
  • the quilting module 6 determines the respective distances 11 between partial sections of the received sensor data 10 and the sensor data patches 60 stored in the database 40.
  • the determined distances 11 are fed to a recognition module 7.
  • the recognition module 7 generates an alienation signal 21 as a function of the determined distances 11.
  • the recognition module 7 checks in particular whether the determined distances 11 exceed a predetermined threshold value 12 or not.
  • the predefined threshold value 12 is predefined for the recognition module 7 from the outside.
  • the alienation signal 21 is generated when a single determined distance 11 exceeds the at least one predetermined threshold value 12. In this case, it is sufficient if a (single) determined distance 11 between a partial section of the received sensor data 10 and an associated sensor data patch 60 exceeds the at least one predetermined threshold value 12. However, it can also be provided that a minimum number (e.g. 10, 100, 1000, etc.) of determined distances 11 must exceed the at least one predetermined threshold value 12 for the alienation signal 21 to be generated. Furthermore, it can be provided that a mean value formed from all determined distances 11, at least over a predetermined period or a predetermined number of determined distances 11, must exceed the at least one predetermined threshold value 12 for the alienation signal 21 to be generated.
  • the output device 5 provides the generated alienation signal 21.
  • the output device 5 outputs the alienation signal 21, for example in the form of a digital signal or digital data packet.
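The three trigger variants described above (a single exceedance, a minimum number of exceedances, or a window mean exceeding the threshold) can be sketched in Python. This is an illustrative sketch only; the class name `RecognitionModule` and its interface are assumptions, not part of the patent:

```python
from collections import deque

class RecognitionModule:
    """Sketch of recognition module 7: decides, per determined distance,
    whether the alienation signal 21 should be generated."""

    def __init__(self, threshold, min_count=1, window=None):
        self.threshold = threshold          # predetermined threshold value 12
        self.min_count = min_count          # variant 2: e.g. 10, 100, 1000, ...
        self.window = deque(maxlen=window) if window else None
        self.exceed_count = 0

    def feed(self, distance):
        """Returns True when the alienation signal should be generated."""
        if self.window is not None:
            # variant 3: mean over the last `window` distances must exceed threshold
            self.window.append(distance)
            full = len(self.window) == self.window.maxlen
            return full and sum(self.window) / len(self.window) > self.threshold
        if distance > self.threshold:
            self.exceed_count += 1
        # variants 1 and 2: a single / a minimum number of exceedances
        return self.exceed_count >= self.min_count
```

With `min_count=1` this reproduces the single-exceedance variant; larger values or a `window` give the more robust variants.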
  • the distances 11 are determined independently of the quilting module 6. Provision can be made for the at least one predefined threshold value 12 to be predefined in a position-dependent and/or context-dependent and/or sensor-dependent manner.
  • the computing device 3 receives the respective threshold value 12 for a current position and / or a current context and / or for each sensor from the outside.
  • all threshold values 12 with the associated dependencies are specified externally and the computing device 3 itself determines which of the specified threshold values 12 must be used at a current point in time or for which sensor.
  • the generated alienation signal is used to initiate a detection and / or collection of sensor data 10 from the at least one sensor 51.
  • the backend server 55 then initiates the acquisition and / or collection of the sensor data 10 in the motor vehicle 50 and in other motor vehicles in a vehicle fleet.
  • the motor vehicle 50 collects sensor data 10 in the storage device 4.
  • the collected sensor data are then transmitted to the backend server 55.
  • the backend server 55 collects the sensor data transmitted by all motor vehicles in the vehicle fleet and uses them to compile a training data set for (re)training the neural network 52.
  • the neural network 52 is then (re)trained on the backend server 55, and a structure and parameters of the (re)trained neural network 53 are transmitted to the motor vehicle 50 and the other motor vehicles, which replace the neural network 52 with the (re)trained neural network 53.
  • the back-end server 55 can update the database 40 with the sensor data patches 60 on the basis of the transmitted, collected sensor data, in that the back-end server 55 generates updated sensor data patches 61 from the collected sensor data.
  • the updated sensor data patches 61 are then transmitted to the motor vehicle 50 and the other motor vehicles in the vehicle fleet. In this way, both the neural network 52 and the reference data domain formed or defined by the sensor data patches 60 can be adapted to the alienated or changed sensor data domain.
  • the detection and/or collection of the sensor data 10 takes place in a position-dependent and/or context-dependent and/or sensor-dependent manner.
  • resources can be saved, for example if alienation has only occurred in certain regions and / or contexts and / or only for certain sensors, etc.
  • a targeted and limited recording and / or collection of sensor data can be initiated so that existing resources can be used in a targeted manner.
  • FIG. 2 shows a schematic representation to clarify quilting from the prior art using the example of a camera image 13.
  • Sensor data 10, in the present case a camera image 13, are divided into partial sections 23.
  • a database 40 is searched for a sensor data patch 60 that has the smallest distance to the respective sub-section 23 in terms of a distance measure.
  • a sensor data patch 60 is an image section which has the size of the partial sections 23, i.e. the same number of picture elements (pixels), e.g. 8x8 picture elements each.
  • the distance measure is, for example, the L2 norm, which is applied to vectors that have been generated by linearizing the partial sections 23 or image sections.
  • each partial section 23 or image section is then replaced by the respective sensor data patch 60 with the smallest distance to it. It can be provided here that a minimum distance must be observed. In this way, all partial sections 23 or image sections are replaced by sensor data patches 60 from the database 40.
  • Replaced partial sections 24 are created which, taken together, form the replaced sensor data 20 or a replaced camera image 25.
  • the distances determined in each case in the quilting step 100 are used in the method described in this disclosure in order to recognize an alienation of the sensor data domain from a reference data domain formed or defined by the sensor data patches.
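The quilting step, and the reuse of its per-patch distances for alienation detection, can be illustrated with a minimal Python/NumPy sketch. This is not part of the patent; the function name `quilt`, the grayscale input and the fixed 8x8 patch size are illustrative assumptions:

```python
import numpy as np

PATCH = 8  # patch edge length in pixels, as in the 8x8 example above

def quilt(image, patch_db):
    """Piecewise replace `image` (H x W grayscale, H and W multiples of PATCH)
    with the nearest patches from `patch_db` (N x PATCH x PATCH) under the
    L2 norm, and return the reconstructed image plus all per-patch distances."""
    h, w = image.shape
    out = np.empty_like(image, dtype=float)
    # linearize the database patches into vectors for the norm comparison
    flat_db = patch_db.reshape(len(patch_db), -1).astype(float)
    distances = []
    for y in range(0, h, PATCH):
        for x in range(0, w, PATCH):
            section = image[y:y + PATCH, x:x + PATCH].astype(float).ravel()
            d = np.linalg.norm(flat_db - section, axis=1)  # L2 distance to every patch
            best = int(np.argmin(d))
            out[y:y + PATCH, x:x + PATCH] = patch_db[best]
            distances.append(d[best])
    return out, distances
```

The returned `distances` are exactly the values that the method described in this disclosure feeds to the recognition module for the threshold check.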
  • FIG. 3 shows a schematic flow diagram of an embodiment of the method for recognizing an alienation of a sensor data domain from a reference data domain.
  • in a method step 200, sensor data captured by at least one sensor are received.
  • the at least one sensor is in particular a sensor that detects the surroundings of a motor vehicle, for example a camera and / or a lidar sensor of the motor vehicle.
  • in a method step 201, distances between partial sections of the received sensor data and sensor data patches, which are used for the piecewise replacement of the sensor data by means of quilting and which belong to the reference data domain, are determined by means of a distance measure, or such distances are obtained.
  • in a method step 202, it is checked whether the determined or obtained distances exceed at least one threshold value. If this is not the case, method steps 200-202 are repeated with subsequently or newly captured sensor data.
  • if, on the other hand, it is determined in method step 202 that the determined or obtained distances exceed the at least one threshold value, an alienation signal is generated and provided, in particular output, in a method step 203.
  • sensing and / or collecting of sensor data from the at least one sensor can be initiated.
  • the recorded and / or collected sensor data can then be used for (re) training a neural network or for creating and / or updating sensor data patches used during quilting.
  • both the neural network and the sensor data patches can be adapted to the alienated or changed sensor data domain.
  • the customization also defines an updated reference data domain.

Abstract

The invention relates to a method for recognising removal of a sensor data domain from a reference data domain, in which method detected sensor data (10) are received from at least one sensor (51), and by means of a distance measure in each case distances (11) between parts (23) of the received sensor data (10) and sensor data patches (60), which are used for item by item replacement by means of quilting of the sensor data (10) and belong to the reference data domain, are determined or such distances (11) are obtained, a removal signal (21) being generated and provided depending on the determined distances (11). The invention further relates to a device (1) for recognising removal of a sensor data domain from a reference data domain, to a motor vehicle (50) having at least one such device (1), to a computer program and to a data carrier signal.

Description

Method and device for recognizing an alienation of a sensor data domain from a reference data domain
The invention relates to a method and a device for recognizing an alienation of a sensor data domain from a reference data domain. The invention also relates to a motor vehicle, a computer program and a data carrier signal.
Machine learning, for example based on neural networks, has great potential for use in modern driver assistance systems and automated motor vehicles. Functions based on deep neural networks process sensor data (for example from cameras, radar or lidar sensors) in order to derive relevant information from them. This information includes, for example, the type and position of objects in the surroundings of the motor vehicle, the behavior of the objects, or a road geometry or topology.
Among neural networks, convolutional neural networks (CNNs) in particular have proven especially suitable for applications in image processing. Convolutional networks extract, step by step and in unsupervised form, various high-quality features from input data (e.g. image data). During a training phase, the convolutional network independently develops feature maps based on filter channels that process the input data locally in order to derive local properties. These feature maps are then processed again by further filter channels, which derive higher-value feature maps from them. On the basis of the information condensed from the input data in this way, the deep neural network finally derives its decision and provides it as output data.
While convolutional networks outperform classical approaches in terms of functional accuracy, they also have disadvantages. For example, attacks based on adversarial perturbations in the sensor data / input data can result in a misclassification or an incorrect semantic segmentation of the captured sensor data even though their content is semantically unchanged. From Chuan Guo et al., Countering Adversarial Images Using Input Transformations, arXiv:1711.00117v3 [cs.CV], Jan. 25, 2018, https://arxiv.org/pdf/1711.00117.pdf, a quilting method for eliminating adversarial perturbations in image data is known.
An essential feature in the development of deep neural networks (the training) is the purely data-driven parameter fitting without expert intervention: here, the deviation of an output (for a given parameterization) of a neural network from a ground truth is determined (the so-called loss). The loss function used is chosen in such a way that the parameters of the neural network depend on it in a differentiable manner. In the gradient descent method, the parameters of the neural network are adapted in each training step as a function of the derivative of the deviation (determined on several examples). These training steps are repeated very often until the loss no longer decreases.
In this procedure, the parameters of the neural network are determined without expert assessment or semantically motivated modeling. However, the parameters depend largely on the data used for training. Machine-learned models can in general generalize very well (successfully apply what they have learned to unknown data), but they can only do so within the data domains they were confronted with in the course of the training process. If, on the other hand, the trained model is applied to data from other data domains, this usually results in a greatly reduced accuracy of the output. As a result, the AI functions provided by the neural network can, for example, only be used in regions from which the data used for training originate, or which are very similar to them.
The invention is based on the object of creating a method and a device for recognizing an alienation of a sensor data domain from a reference data domain.
According to the invention, the object is achieved by a method with the features of claim 1 and a device with the features of claim 7. Advantageous refinements of the invention emerge from the subclaims.
In particular, a method for recognizing an alienation of a sensor data domain from a reference data domain is provided, in which captured sensor data of at least one sensor are received; distances between partial sections of the received sensor data and sensor data patches, which are used for the piecewise replacement of the sensor data by means of quilting and which belong to the reference data domain, are determined by means of a distance measure, or such distances are obtained; and an alienation signal is generated and provided as a function of the determined distances.
Furthermore, in particular a device for recognizing an alienation of a sensor data domain from a reference data domain is created, comprising an input device for receiving captured sensor data of at least one sensor, a computing device, and an output device, wherein the computing device is set up to determine, by means of a distance measure, distances between partial sections of the received sensor data and sensor data patches which are used for the piecewise replacement of the sensor data by means of quilting and which belong to the reference data domain, or to obtain such distances, and to generate an alienation signal as a function of the determined distances, and wherein the output device is set up to provide the generated alienation signal.
The method and the device make it possible to determine an alienation of a sensor data domain from a reference data domain. In particular, it can thereby be determined whether currently captured or received sensor data still lie in the reference data domain or not. For this purpose, captured sensor data of at least one sensor are received. The sensor data are, for example, surroundings data captured from the surroundings by means of surroundings sensors of a motor vehicle. Distances between partial sections of the received sensor data and sensor data patches are determined by means of a distance measure. The sensor data patches are used in a quilting method for the piecewise replacement of the sensor data and belong to, or define, the reference data domain. The partial sections of the sensor data considered in each case and the associated sensor data patches have the same format or the same size. The quilting itself can be part of the method; however, the quilting can also be carried out independently of the method, in which case only the distances between the partial sections of the sensor data and the associated sensor data patches, determined for quilting by means of the distance measure, are determined or obtained. To determine the distances, the partial sections of the sensor data can, for example, be linearized into vectors and compared with the respective sensor data patches, likewise linearized into vectors, using a vector norm. For example, the L2 norm can be used as the distance measure. An alienation signal is generated and provided as a function of the determined distances. Quilting refers in particular to the piecewise replacement of sensor data. Quilting is used in particular to robustify sensor data that are fed to a neural network as input data against adversarial perturbations.
Quilting can also be referred to as a piecewise reconstruction of the sensor data. The term "image quilting" is also used in connection with image data. If, for example, images from a camera are involved, the camera image is divided into several partial sections. For this purpose, small rectangular image sections (also referred to as patches, e.g. with a size of 8x8 picture elements/pixels each) can be defined. The individual partial or image sections are compared with partial sections, referred to in this disclosure as sensor data patches, which are in particular stored in a database. The comparison takes place on the basis of a distance measure which is defined, for example, via a Euclidean distance on picture element vectors. For this purpose, a partial or image section is linearized as a vector. A distance is then determined via a vector norm, for example the L2 norm. The partial or image sections from the captured sensor data are each replaced by the closest or most similar sensor data patch from the database. It can be provided here that a minimum distance must be observed, or that at least no identity may exist between the partial section from the sensor data and the sensor data patch. If the sensor data have a different form (e.g. lidar data) or a different format, the piecewise replacement takes place in an analogous manner. The piecewise replacement takes place for all partial sections of the sensor data, so that replaced or reconstructed sensor data are subsequently available. After the piecewise replacement, that is to say after the quilting, the effect of adversarial perturbations in the (replaced or reconstructed) sensor data is eliminated or at least reduced.
The sensor data patches required for quilting are generated from sensor data which are trustworthy, that is to say which with certainty, or at least with a high probability, contain no adversarial perturbations. The sensor data patches are provided, for example, in the form of a searchable database.
By means of the method and the device, it can be determined when the quality of the quilting of the captured sensor data decreases because the sensor data domain and the reference data domain, from which the sensor data patches used in quilting were generated, have become alienated from one another. Furthermore, the method and the device are particularly advantageous when trained neural networks are used, since they can also be used to determine whether currently captured sensor data, which are to be fed to a trained neural network for processing, still lie in a data domain on which the neural network was trained. Since, according to the method, the reference data domain is defined via the sensor data patches, this requires in particular that the training data with which the neural network was trained also originate from the reference data domain.
A typical application scenario for the method and the device is as follows. Independently of the disclosed method, a neural network is used in a vehicle, for example for perception of the surroundings (object recognition, semantic segmentation, etc.). Acquired sensor data from at least one sensor (e.g. a camera, a lidar sensor, etc.), in particular data on the surroundings of the vehicle, are fed to the neural network. Beforehand, the acquired sensor data are replaced piece by piece by means of quilting in order to make them robust against adversarial perturbations.
Quilting is performed by determining the distances between sub-sections of the acquired sensor data and sensor data patches stored, for example, in a database, and replacing each sub-section with the sensor data patch that is most similar, i.e. closest, with respect to the respective distance. This is done for all sub-sections of the acquired sensor data, so that the sensor data are subsequently completely replaced, or reconstructed, piece by piece. The method described in this disclosure uses the distances determined during quilting to recognize and establish an alienation of the sensor data domain from the reference data domain defined by the sensor data patches.
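To make the piecewise replacement concrete, the quilting step described above can be sketched as follows. This is a minimal illustration only, assuming two-dimensional single-channel sensor data, non-overlapping sub-sections, and the Euclidean distance as the distance measure; all names (`quilt`, `patch_db`, `patch_size`) are illustrative and do not come from the patent.

```python
import numpy as np

def quilt(image, patch_db, patch_size=8):
    """Replace each non-overlapping sub-section of `image` with its
    nearest neighbour from `patch_db` (one flattened reference patch per
    row) and return the reconstructed image plus the matching distances."""
    h, w = image.shape[:2]
    out = image.copy()
    distances = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            tile = image[y:y + patch_size, x:x + patch_size]
            # Euclidean distance of the sub-section to every reference patch
            dists = np.linalg.norm(patch_db - tile.ravel(), axis=1)
            best = int(np.argmin(dists))
            out[y:y + patch_size, x:x + patch_size] = patch_db[best].reshape(tile.shape)
            distances.append(float(dists[best]))
    return out, distances
```

The returned list of distances is exactly the quantity that the disclosed method evaluates in order to detect an alienation of the sensor data domain from the reference data domain.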
In principle, the sensor data can be one-dimensional or multidimensional, in particular two-dimensional. For example, the sensor data can be two-dimensional camera images from a camera or two-dimensional lidar data from a lidar sensor. However, they can also be sensor data from other sensors, such as radar sensors and/or ultrasonic sensors. The sensor data are or have been acquired in particular by means of at least one sensor of a vehicle and include, in particular, data on the surroundings of the vehicle.
A data domain denotes, in particular, a set of data, in particular sensor data, that correspond to a specific context or whose data are similar in at least one property with respect to their origin. Such a context can be, for example, a geographical context: one data domain can comprise sensor data from one city, while a different data domain comprises sensor data from another city, etc. A sensor data domain denotes, in particular, the data domain in which current sensor data acquired by means of the at least one sensor lie. A reference data domain denotes, in particular, the data domain in which the sensor data lie from which the sensor data patches used in quilting were generated.
In particular, it is provided that the sensor data acquired, or mapped in the data domains, are sensor data required for an automated driving function of a vehicle and/or for driver assistance of the vehicle and/or for environment detection or a perception function.
The method and the device are used in particular in at least one vehicle. A vehicle is in particular a motor vehicle. In principle, however, the vehicle can also be another land, air, water, rail or space vehicle. In principle, the method can also be used for other types of sensor data, for example in connection with robots, e.g. industrial robots or medical robots.
An adversarial perturbation is, in particular, a deliberately introduced disturbance of the input data of a neural network, for example input data provided in the form of sensor data, in which the semantic content of the input data is not changed, but the disturbance nevertheless causes the neural network to infer an incorrect result, for example a misclassification or an incorrect semantic segmentation of the input data.
A neural network is in particular a deep neural network, in particular a convolutional neural network (CNN). The neural network is or will be trained, for example, for a specific perception function, for example the perception of pedestrians or other objects in captured camera images.
The method can be carried out as a computer-implemented method. In particular, the method can be carried out by means of a data processing device. The data processing device comprises in particular at least one computing device and at least one storage device.
In particular, a computer program is also provided, comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method steps of the disclosed method according to any of the described embodiments.
In addition, a data carrier signal is also provided that transmits the aforementioned computer program.
It can be provided that the method comprises acquiring the sensor data of the at least one sensor.
Parts of the device, in particular the computing device, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
In one embodiment, it is provided that the alienation signal is generated when the determined distances exceed at least one predefined threshold value. The at least one predefined threshold value can be used to set how sensitively the method reacts to an alienation or change of the sensor data domain relative to the reference data domain. It can be provided that the alienation signal is generated as soon as a single determined distance exceeds the at least one predefined threshold value; in this case, it is sufficient if the distance between one sub-section of the received sensor data and its associated sensor data patch exceeds the threshold. However, it can also be provided that a minimum number of determined distances must exceed the at least one predefined threshold value. Furthermore, it can be provided that a mean value, formed from all determined distances over at least a predefined period of time or a predefined number of determined distances, must exceed the at least one predefined threshold value for the alienation signal to be generated.
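The three variants just described (a single exceeding distance, a minimum number of exceeding distances, and a windowed mean above the threshold) can be sketched, purely for illustration, as follows; the class name and parameters are assumptions and are not taken from the patent.

```python
from collections import deque

class AlienationDetector:
    """Generate the alienation signal when quilting distances exceed a
    predefined threshold: either a minimum number of individual distances
    lies above the threshold, or the mean over a sliding window does."""

    def __init__(self, threshold, min_count=1, window=100):
        self.threshold = threshold          # predefined threshold value
        self.min_count = min_count          # minimum number of exceeding distances
        self.recent = deque(maxlen=window)  # sliding window of recent distances

    def update(self, distances):
        """Feed the distances of one quilting pass; return True if the
        alienation signal should be generated."""
        self.recent.extend(distances)
        # Variants 1 and 2: a single distance (min_count=1) or a minimum
        # number of distances exceeds the threshold.
        if sum(1 for d in distances if d > self.threshold) >= self.min_count:
            return True
        # Variant 3: the mean over the sliding window exceeds the threshold.
        return sum(self.recent) / len(self.recent) > self.threshold
```

With `min_count=1` this reduces to the single-distance variant; larger values of `min_count` or a longer `window` make the detection correspondingly less sensitive.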
In a further embodiment, it is provided that the at least one predefined threshold value is or will be predefined in a position-dependent and/or context-dependent and/or sensor-dependent manner. In this way, different threshold values can be specified depending on the situation. In the case of position dependency, the threshold value can be specified in particular as a function of a geographical position or region. For example, the threshold value can be chosen differently in the area of a play street than in the area of a motorway. In this way, the sensitivity of the alienation detection can be set as a function of position. In the case of context dependency, a situational context in particular can be considered. The threshold value can then be selected differently, for example, depending on whether a vehicle is on a motorway, a country road, a main road or a secondary road. Furthermore, weather conditions can also be taken into account, for example: the at least one threshold value can be selected depending on whether one of the following weather conditions is present: rain, snow, fog, sunshine, etc. By means of the sensor dependency, properties of the sensors can be taken into account.
For example, both the type of the sensors, or the physical measurement principle on which they are based, and the arrangement and/or alignment and/or weather dependency of the sensors can be taken into account.
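A position-, context- and sensor-dependent threshold selection of the kind described above can be sketched, for example, as a simple lookup table; all keys and values here are hypothetical examples, not values from the patent.

```python
# Hypothetical lookup table: one threshold per (sensor type, driving context).
THRESHOLDS = {
    ("camera", "motorway"): 0.8,
    ("camera", "play_street"): 0.4,   # more sensitive detection in a play street
    ("lidar", "motorway"): 1.2,
}
DEFAULT_THRESHOLD = 1.0

def select_threshold(sensor, context):
    """Return the threshold for the given sensor and context, falling
    back to a default when no specific value is predefined."""
    return THRESHOLDS.get((sensor, context), DEFAULT_THRESHOLD)
```

In the terms of the description, the computing device could either receive such a table from outside or query a single value for the current position, context and sensor.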
In one embodiment, it is provided that the generated alienation signal is used to initiate acquisition and/or collection of sensor data from the at least one sensor. In this way, updated sensor data can be acquired and collected for the alienated sensor data domain, which can subsequently be used to (re)train a neural network. The neural network can thereby be brought up to a state corresponding to the changed sensor data domain. In particular, the generated alienation signal can be fed, for example, to a control device of a motor vehicle which, after receiving the alienation signal, starts acquiring and/or collecting the sensor data of the at least one sensor. The collection takes place, for example, by means of a storage device set up for this purpose.
In particular, it can be provided that the generated alienation signal is used to initiate acquisition and/or collection of sensor data from the at least one sensor by a vehicle fleet. This enables more extensive acquisition and/or collection of sensor data for the changed sensor data domain. The initiation takes place, for example, via a communication interface set up for this purpose. The initiation can be mediated or coordinated via a central backend server, for example in that the backend server, after receiving the alienation signal generated by one of the vehicles of the vehicle fleet, causes other vehicles of the fleet to acquire and/or collect sensor data from the at least one sensor. The sensor data acquired and/or collected by all vehicles of the fleet are transmitted to the backend server, which aggregates them and subsequently provides them for (re)training a neural network. In a further embodiment, it is provided that the acquisition and/or collection of the sensor data takes place in a position-dependent and/or context-dependent and/or sensor-dependent manner. In this way, sensor data of the alienated or changed sensor data domain can be acquired and collected specifically for certain positions, in particular geographical positions, or regions, or for certain contexts (city, country, motorway, play street, rain, fog, etc.). In particular in interaction with position-dependent and/or context-dependent and/or sensor-dependent threshold values, a sensor data domain can be monitored in a targeted and detailed manner.
In one embodiment, it is provided that the generated alienation signal is used to initiate the creation and/or updating of the sensor data patches that are used in quilting the sensor data. In this way, the sensor data patches used in quilting can be updated to the changed or alienated sensor data domain. Sensor data patches updated in this way then form or define a new reference data domain. In particular, the sensor data patches are created and/or updated on the basis of sensor data of the at least one sensor acquired and/or collected for the alienated or changed sensor data domain. The collected sensor data are divided into sub-sections, which form the new sensor data patches. The sensor data patches formed in this way are stored in particular in a database, which forms the basis for a subsequent application of the quilting method. The creation and/or updating of the sensor data patches can also be carried out by means of a backend server, in particular in a coordinated manner for the entire vehicle fleet.
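The creation of new sensor data patches from collected sensor data, as described above, can be sketched as follows. This is a minimal illustration assuming two-dimensional single-channel sensor data and non-overlapping sub-sections; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def build_patch_db(sensor_images, patch_size=8):
    """Divide trusted sensor data into non-overlapping sub-sections and
    stack them into a searchable patch database (one flattened patch per
    row); such a database defines a new reference data domain."""
    patches = []
    for img in sensor_images:
        h, w = img.shape[:2]
        for y in range(0, h - patch_size + 1, patch_size):
            for x in range(0, w - patch_size + 1, patch_size):
                patches.append(img[y:y + patch_size, x:x + patch_size].ravel())
    return np.stack(patches)
```

In practice, such a database would only be rebuilt from sensor data that are trustworthy in the sense described at the beginning of this section, i.e. free of adversarial perturbations.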
Further features for the configuration of the device emerge from the description of embodiments of the method. The advantages of the device are in each case the same as those of the embodiments of the method.
Furthermore, in particular, a motor vehicle is also provided, comprising at least one device according to any of the described embodiments.
The invention is explained in more detail below on the basis of preferred exemplary embodiments with reference to the figures, in which: Fig. 1 shows a schematic representation of an embodiment of the device for recognizing an alienation of a sensor data domain from a reference data domain;
Fig. 2 shows a schematic representation illustrating the quilting method (prior art);
Fig. 3 shows a schematic flow diagram of an embodiment of the method for recognizing an alienation of a sensor data domain from a reference data domain.
Fig. 1 shows a schematic representation of an embodiment of the device 1 for recognizing an alienation of a sensor data domain from a reference data domain. The device 1 carries out the method described in this disclosure for recognizing an alienation of a sensor data domain from a reference data domain. The device 1 is embodied, for example, in a motor vehicle 50.
The device 1 comprises an input device 2, a computing device 3, a storage device 4 and an output device 5. The computing device 3 can access the storage device 4 and carry out arithmetic operations on data stored in the storage device 4.
Parts of the device 1, in particular the computing device 3, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
In the example shown, the device 1 also carries out a quilting method. For this purpose, the computing device 3 has a quilting module 6. The input device 2 receives sensor data 10 that are acquired by a sensor 51 of the motor vehicle 50. This sensor 51 can be, for example, a camera or a lidar sensor. In principle, more than one sensor 51 can also be provided. The received sensor data 10 are fed to the quilting module 6, which replaces the sensor data 10 piece by piece by means of quilting. For this purpose, the quilting module 6 uses sensor data patches 60 that are stored in a database 40 in the storage device 4. The sensor data patches 60 stored in the database 40 define a reference data domain. After the piecewise replacement, the piecewise replaced or reconstructed sensor data 20 are fed via the output device 5 to a neural network 52 that carries out an AI function, for example a perception of the surroundings (e.g. object recognition, semantic segmentation, etc.) on the sensor data 10 or on the piecewise replaced sensor data 20.
For the piecewise replacement by means of quilting, the quilting module 6 determines the distances 11 between sub-sections of the received sensor data 10 and the sensor data patches 60 stored in the database 40. The determined distances 11 are fed to a recognition module 7. The recognition module 7 generates an alienation signal 21 as a function of the determined distances 11. For this purpose, the recognition module 7 checks in particular whether or not the determined distances 11 exceed a predefined threshold value 12. The predefined threshold value 12 is provided to the recognition module 7 from outside.
It can be provided here that the alienation signal 21 is generated when a single determined distance 11 exceeds the at least one predefined threshold value 12. In this case, it is sufficient if a (single) determined distance 11 between a sub-section of the received sensor data 10 and an associated sensor data patch 60 exceeds the at least one predefined threshold value 12. However, it can also be provided that a minimum number (e.g. 10, 100, 1000, etc.) of determined distances 11 must exceed the at least one predefined threshold value 12 for the alienation signal 21 to be generated. Furthermore, it can be provided that a mean value, formed from all determined distances 11 over at least a predefined period of time or a predefined number of determined distances 11, must exceed the at least one predefined threshold value 12 for the alienation signal 21 to be generated.
The output device 5 provides the generated alienation signal 21. In particular, the output device 5 outputs the alienation signal 21, for example in the form of a digital signal or a digital data packet.
As an alternative to obtaining the determined distances 11 from the quilting module 6, it can also be provided that the distances 11 are determined independently of the quilting module 6. Provision can be made for the at least one predefined threshold value 12 to be predefined in a position-dependent and/or context-dependent and/or sensor-dependent manner. The computing device 3 then receives the respective threshold value 12 for a current position and/or a current context and/or for each sensor from outside. Alternatively, all threshold values 12, together with their associated dependencies, are specified from outside, for example as a function or a list, and the computing device 3 itself determines which of the predefined threshold values 12 must be used at a current point in time or for which sensor.
Es kann vorgesehen sein, dass mittels des erzeugten Entfremdungssignals ein Erfassen und/oder Sammeln von Sensordaten 10 des mindestens einen Sensors 51 veranlasst wird. It can be provided that the generated alienation signal is used to initiate a detection and/or collection of sensor data 10 from the at least one sensor 51.
Dies erfolgt beispielsweise mittels eines Backendservers 55, dem das Entfremdungssignal 21 zugeführt wird. Der Backendserver 55 veranlasst dann das Erfassen und/oder Sammeln der Sensordaten 10 beim Kraftfahrzeug 50 und bei anderen Kraftfahrzeugen einer Fahrzeugflotte. This takes place, for example, by means of a backend server 55 to which the alienation signal 21 is fed. The backend server 55 then initiates the acquisition and / or collection of the sensor data 10 in the motor vehicle 50 and in other motor vehicles in a vehicle fleet.
Das Kraftfahrzeug 50 sammelt nach der Veranlassung Sensordaten 10 in der Speichereinrichtung 4. Die gesammelten Sensordaten werden anschließend an den Backendserver 55 übermittelt. Dieser sammelt die gesammelten Sensordaten aller Kraftfahrzeuge der Fahrzeugflotte und stellt hieraus einen Trainingsdatensatz zum (Nach-)Trainieren des Neuronalen Netzes 52 zusammen. Das Neuronale Netz 52 wird anschließend auf dem Backendserver 55 (nach-)trainiert und eine Struktur und Parameter des (nach-)trainierten Neuronalen Netzes 53 werden an das Kraftfahrzeug 50 und die anderen Kraftfahrzeuge übermittelt, welche das Neuronale Netz 52 durch das (nach-)trainierte Neuronale Netz 53 ersetzen. After being prompted, the motor vehicle 50 collects sensor data 10 in the storage device 4. The collected sensor data are then transmitted to the backend server 55, which gathers the collected sensor data from all motor vehicles in the vehicle fleet and compiles from them a training data set for (re)training the neural network 52. The neural network 52 is then (re)trained on the backend server 55, and the structure and parameters of the (re)trained neural network 53 are transmitted to the motor vehicle 50 and the other motor vehicles, which replace the neural network 52 with the (re)trained neural network 53.
Ferner kann der Backendserver 55 auf Grundlage der übermittelten gesammelten Sensordaten die Datenbank 40 mit den Sensordatenpatches 60 aktualisieren, indem der Backendserver 55 aus den gesammelten Sensordaten aktualisierte Sensordatenpatches 61 erzeugt. Die aktualisierten Sensordatenpatches 61 werden anschließend an das Kraftfahrzeug 50 und die anderen Kraftfahrzeuge der Fahrzeugflotte übermittelt. Hierdurch können sowohl das Neuronale Netz 52 als auch die durch die Sensordatenpatches 60 gebildete bzw. definierte Referenzdatendomäne der entfremdeten bzw. veränderten Sensordatendomäne angepasst werden. In addition, the backend server 55 can update the database 40 with the sensor data patches 60 on the basis of the transmitted collected sensor data, by generating updated sensor data patches 61 from the collected sensor data. The updated sensor data patches 61 are then transmitted to the motor vehicle 50 and the other motor vehicles in the vehicle fleet. In this way, both the neural network 52 and the reference data domain formed or defined by the sensor data patches 60 can be adapted to the alienated or changed sensor data domain.
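The backend-side update of the patch database, i.e. generating updated sensor data patches 61 from the collected fleet data, might be sketched as follows. The tiling, the random-sampling strategy, the 8x8 patch size and all names are illustrative assumptions; the disclosure does not specify how the updated patches are derived.

```python
import numpy as np

def update_patch_database(collected_images, patch=8, max_patches=1000, seed=0):
    """Cut collected fleet images into patch-sized tiles and sample a
    subset of them as updated sensor data patches (hypothetical sketch)."""
    rng = np.random.default_rng(seed)
    tiles = []
    for img in collected_images:
        h, w = img.shape[:2]
        # Tile each image into non-overlapping patch x patch sections.
        for y in range(0, h - h % patch, patch):
            for x in range(0, w - w % patch, patch):
                tiles.append(img[y:y + patch, x:x + patch])
    tiles = np.stack(tiles)
    # Sample without replacement to bound the database size.
    idx = rng.choice(len(tiles), size=min(max_patches, len(tiles)),
                     replace=False)
    return tiles[idx]  # updated patch database, shape (n, patch, patch)
```

The resulting array would play the role of the updated sensor data patches 61 that are distributed back to the vehicles.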
Weiterbildend kann vorgesehen sein, dass das Erfassen und/oder Sammeln der Sensordaten 10 positionsabhängig und/oder kontextabhängig und/oder sensorabhängig erfolgt. Hierdurch können Ressourcen eingespart werden, beispielsweise wenn eine Entfremdung lediglich in bestimmten Regionen und/oder Kontexten und/oder nur für bestimmte Sensoren etc. aufgetreten ist. In diesen Fällen kann ein gezieltes und dem Umfang nach beschränktes Erfassen und/oder Sammeln von Sensordaten veranlasst werden, sodass vorhandene Ressourcen gezielt eingesetzt werden können. In a further development, it can be provided that the detection and/or collection of the sensor data 10 takes place in a position-dependent and/or context-dependent and/or sensor-dependent manner. In this way, resources can be saved, for example if alienation has only occurred in certain regions and/or contexts and/or only for certain sensors, etc. In these cases, a targeted detection and/or collection of sensor data, limited in scope, can be initiated, so that existing resources can be used in a targeted manner.
In Fig. 2 ist eine schematische Darstellung zur Verdeutlichung des Quilting aus dem Stand der Technik am Beispiel eines Kamerabildes 13 gezeigt. Sensordaten 10, vorliegend ein Kamerabild 13, werden in Teilausschnitte 23 zerteilt. Für jeden der Teilausschnitte 23 (d.h. Bildausschnitt) des Kamerabildes 13 wird im Rahmen eines Quiltingschritts 100 in einer Datenbank 40 nach einem Sensordatenpatch 60 gesucht, das in Bezug auf ein Distanzmaß die geringste Distanz zu dem jeweiligen Teilausschnitt 23 aufweist. Ein Sensordatenpatch 60 ist vorliegend ein Bildausschnitt, der die Größe der Teilausschnitte 23, das heißt dieselbe Anzahl von Bildelementen (Pixeln), aufweist, z.B. jeweils 8x8 Bildelemente. Das Distanzmaß ist beispielsweise die L2-Norm, die auf Vektoren angewandt wird, die durch Linearisierung der Teilausschnitte 23 bzw. Bildausschnitte erzeugt wurden. Im Quiltingschritt 100 wird dann jeder Teilausschnitt 23 bzw. Bildausschnitt durch das jeweilige Sensordatenpatch 60 mit der jeweils hierzu kleinsten Distanz ersetzt. Es kann hierbei vorgesehen sein, dass eine Mindestdistanz eingehalten werden muss. Auf diese Weise werden sämtliche Teilausschnitte 23 bzw. Bildausschnitte durch Sensordatenpatches 60 aus der Datenbank 40 ersetzt. Es entstehen ersetzte Teilausschnitte 24, die zusammengenommen die ersetzten Sensordaten 20 bzw. ein ersetztes Kamerabild 25 ausbilden. FIG. 2 shows a schematic representation illustrating quilting from the prior art, using the example of a camera image 13. Sensor data 10, in the present case a camera image 13, are divided into partial sections 23. For each of the partial sections 23 (i.e. image sections) of the camera image 13, as part of a quilting step 100, a database 40 is searched for the sensor data patch 60 that has the smallest distance to the respective partial section 23 with respect to a distance measure. In the present case, a sensor data patch 60 is an image section that has the size of the partial sections 23, i.e. the same number of picture elements (pixels), e.g. 8x8 picture elements each. The distance measure is, for example, the L2 norm, applied to vectors generated by linearizing the partial sections 23 or image sections. In the quilting step 100, each partial section 23 or image section is then replaced by the sensor data patch 60 with the smallest distance to it. Provision can be made here that a minimum distance must be observed. In this way, all partial sections 23 or image sections are replaced by sensor data patches 60 from the database 40. The result is replaced partial sections 24 which, taken together, form the replaced sensor data 20 or a replaced camera image 25.
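The quilting step described above, nearest-patch replacement under an L2 distance, can be sketched as follows. The array shapes, the grayscale input, the 8x8 patch size and the function name are illustrative assumptions, not a definitive implementation of the disclosed method.

```python
import numpy as np

def quilt(image, patch_db, patch=8):
    """Replace each patch x patch tile of `image` by its nearest neighbour
    from `patch_db` under the L2 norm, and also return the per-tile
    distances that are later used for alienation detection (sketch)."""
    h, w = image.shape[:2]
    out = image.astype(np.float32).copy()
    # Linearize the database patches into vectors, as described for the
    # partial sections in the text.
    db = patch_db.reshape(len(patch_db), -1).astype(np.float32)
    distances = []
    for y in range(0, h - h % patch, patch):
        for x in range(0, w - w % patch, patch):
            tile = out[y:y + patch, x:x + patch].reshape(-1)
            d = np.linalg.norm(db - tile, axis=1)  # L2 distance to each patch
            best = int(np.argmin(d))
            out[y:y + patch, x:x + patch] = patch_db[best]  # replace tile
            distances.append(float(d[best]))  # kept for the alienation check
    return out, distances
```

Returning the minimal distances alongside the replaced image mirrors how the method of this disclosure reuses the distances determined during the quilting step.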
Die beim Quiltingschritt 100 jeweils bestimmten Distanzen werden in dem in dieser Offenbarung beschriebenen Verfahren verwendet, um eine Entfremdung der Sensordatendomäne von einer durch die Sensordatenpatches gebildeten bzw. definierten Referenzdatendomäne zu erkennen. The distances determined in each case in the quilting step 100 are used in the method described in this disclosure in order to recognize an alienation of the sensor data domain from a reference data domain formed or defined by the sensor data patches.
In Fig. 3 ist ein schematisches Ablaufdiagramm einer Ausführungsform des Verfahrens zum Erkennen einer Entfremdung einer Sensordatendomäne von einer Referenzdatendomäne gezeigt. FIG. 3 shows a schematic flow diagram of an embodiment of the method for recognizing an alienation of a sensor data domain from a reference data domain.
In einem Verfahrensschritt 200 werden erfasste Sensordaten mindestens eines Sensors empfangen. Der mindestens eine Sensor ist insbesondere ein Sensor, der ein Umfeld eines Kraftfahrzeugs erfasst, beispielsweise eine Kamera und/oder ein Lidarsensor des Kraftfahrzeugs. In einem Verfahrensschritt 201 werden mittels eines Distanzmaßes jeweils Distanzen zwischen Teilausschnitten der empfangenen Sensordaten und zum stückweisen Ersetzen mittels Quilting der Sensordaten verwendeten und der Referenzdatendomäne angehörenden Sensordatenpatches bestimmt oder solche Distanzen werden erhalten. In a method step 200, captured sensor data from at least one sensor are received. The at least one sensor is in particular a sensor that detects the surroundings of a motor vehicle, for example a camera and/or a lidar sensor of the motor vehicle. In a method step 201, distances between partial sections of the received sensor data and sensor data patches used for piecewise replacement of the sensor data by means of quilting and belonging to the reference data domain are determined by means of a distance measure, or such distances are obtained.
In einem Verfahrensschritt 202 wird überprüft, ob die bestimmten oder erhaltenen Distanzen mindestens einen Schwellenwert überschreiten. Ist dies nicht der Fall, so werden die Verfahrensschritte 200-202 mit nachfolgenden bzw. aktualisierten erfassten Sensordaten wiederholt. In a method step 202 it is checked whether the determined or obtained distances exceed at least one threshold value. If this is not the case, method steps 200-202 are repeated with subsequent or updated acquired sensor data.
Wird im Verfahrensschritt 202 hingegen festgestellt, dass die bestimmten oder erhaltenen Distanzen den mindestens einen Schwellenwert überschreiten, so wird in einem Verfahrensschritt 203 ein Entfremdungssignal erzeugt und bereitgestellt, insbesondere ausgegeben. If, on the other hand, it is determined in method step 202 that the determined or obtained distances exceed the at least one threshold value, then in method step 203 an alienation signal is generated and provided, in particular output.
Anschließend kann in einem Verfahrensschritt 204 ein Erfassen und/oder Sammeln von Sensordaten des mindestens einen Sensors veranlasst werden. Die erfassten und/oder gesammelten Sensordaten können dann zum (Nach-)Trainieren eines Neuronalen Netzes oder zum Erstellen und/oder Aktualisieren von beim Quilting verwendeten Sensordatenpatches verwendet werden. Hierdurch können sowohl das Neuronale Netz als auch die Sensordatenpatches an die entfremdete bzw. veränderte Sensordatendomäne angepasst werden. Durch die Anpassung wird auch eine aktualisierte Referenzdatendomäne definiert. Subsequently, in a method step 204, sensing and / or collecting of sensor data from the at least one sensor can be initiated. The recorded and / or collected sensor data can then be used for (re) training a neural network or for creating and / or updating sensor data patches used during quilting. In this way, both the neural network and the sensor data patches can be adapted to the alienated or changed sensor data domain. The customization also defines an updated reference data domain.
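Method steps 200 to 204 can be sketched as a simple monitoring loop. Here each frame is represented only by its quilting distances, the per-frame distances are aggregated into a mean, and a callback stands in for triggering data collection; these choices and all names are assumptions for illustration, not the disclosed implementation.

```python
def check_alienation(distances, threshold):
    """Steps 201-203: compare the quilting distances of one frame against
    a predefined threshold; True corresponds to emitting an alienation
    signal (sketch; mean aggregation is an assumption)."""
    mean_d = sum(distances) / len(distances)
    return mean_d > threshold

def monitoring_loop(sensor_frames, threshold, on_alienation):
    """Steps 200-204: receive frames (step 200), check their distances
    (steps 201/202) and, when the threshold is exceeded, signal the
    alienation and trigger data collection (steps 203/204)."""
    for frame_distances in sensor_frames:
        if check_alienation(frame_distances, threshold):
            on_alienation(frame_distances)
```

In a vehicle, `on_alienation` could notify a backend server, which in turn initiates the fleet-wide collection of sensor data described above.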
Bezugszeichenliste List of reference symbols
1 Vorrichtung Device
2 Eingangseinrichtung Input device
3 Recheneinrichtung Computing device
4 Speichereinrichtung Storage device
5 Ausgabeeinrichtung Output device
6 Quilting-Modul Quilting module
Erkennungsmodul Detection module
10 Sensordaten Sensor data
11 Distanz Distance
12 Schwellenwert Threshold value
13 Kamerabild Camera image
20 ersetzte Sensordaten Replaced sensor data
21 Entfremdungssignal Alienation signal
23 Teilausschnitt (Bildausschnitt) Partial section (image section)
24 ersetzter Teilausschnitt Replaced partial section
25 ersetztes Kamerabild Replaced camera image
40 Datenbank Database
50 Kraftfahrzeug Motor vehicle
51 Sensor Sensor
52 Neuronales Netz Neural network
53 (nach-)trainiertes Neuronales Netz (Re)trained neural network
55 Backendserver Backend server
60 Sensordatenpatches Sensor data patches
61 aktualisierte Sensordatenpatches Updated sensor data patches
100 Quilting-Schritt Quilting step
200-204 Verfahrensschritte Method steps

Claims

Patentansprüche Claims
1. Verfahren zum Erkennen einer Entfremdung einer Sensordatendomäne von einer Referenzdatendomäne, wobei erfasste Sensordaten (10) mindestens eines Sensors (51) empfangen werden, wobei mittels eines Distanzmaßes jeweils Distanzen (11) zwischen Teilausschnitten (23) der empfangenen Sensordaten (10) und zum stückweisen Ersetzen mittels Quilting der Sensordaten (10) verwendeten und der Referenzdatendomäne angehörenden Sensordatenpatches (60) bestimmt werden oder solche Distanzen (11) erhalten werden, und wobei ein Entfremdungssignal (21) in Abhängigkeit der bestimmten Distanzen (11) erzeugt und bereitgestellt wird. 1. A method for recognizing an alienation of a sensor data domain from a reference data domain, wherein captured sensor data (10) of at least one sensor (51) are received, wherein distances (11) between partial sections (23) of the received sensor data (10) and sensor data patches (60) used for piecewise replacement of the sensor data (10) by means of quilting and belonging to the reference data domain are determined by means of a distance measure, or such distances (11) are obtained, and wherein an alienation signal (21) is generated and provided as a function of the determined distances (11).
2. Verfahren nach Anspruch 1, dadurch gekennzeichnet, dass das Entfremdungssignal (21) erzeugt wird, wenn die bestimmten Distanzen (11) mindestens einen vorgegebenen Schwellenwert (12) überschreiten. 2. The method according to claim 1, characterized in that the alienation signal (21) is generated when the determined distances (11) exceed at least one predetermined threshold value (12).
3. Verfahren nach Anspruch 2, dadurch gekennzeichnet, dass der mindestens eine vorgegebene Schwellenwert (12) positionsabhängig und/oder kontextabhängig und/oder sensorabhängig vorgegeben ist oder vorgegeben wird. 3. The method according to claim 2, characterized in that the at least one predefined threshold value (12) is or becomes predefined in a position-dependent and/or context-dependent and/or sensor-dependent manner.
4. Verfahren nach einem der vorangegangenen Ansprüche, dadurch gekennzeichnet, dass mittels des erzeugten Entfremdungssignals (21) ein Erfassen und/oder Sammeln von Sensordaten (10) des mindestens einen Sensors (51) veranlasst wird. 4. The method according to any one of the preceding claims, characterized in that a detection and/or collection of sensor data (10) of the at least one sensor (51) is initiated by means of the generated alienation signal (21).
5. Verfahren nach Anspruch 4, dadurch gekennzeichnet, dass das Erfassen und/oder Sammeln der Sensordaten (10) positionsabhängig und/oder kontextabhängig und/oder sensorabhängig erfolgt. 5. The method according to claim 4, characterized in that the detection and/or collection of the sensor data (10) takes place in a position-dependent and/or context-dependent and/or sensor-dependent manner.
6. Verfahren nach einem der vorangegangenen Ansprüche, dadurch gekennzeichnet, dass mittels des erzeugten Entfremdungssignals (21) ein Erstellen und/oder Aktualisieren von Sensordatenpatches (60) veranlasst wird, die beim Quilting der Sensordaten (10) verwendet werden. 6. The method according to any one of the preceding claims, characterized in that the generated alienation signal (21) is used to create and / or update sensor data patches (60) which are used when the sensor data (10) is quilted.
7. Vorrichtung (1) zum Erkennen einer Entfremdung einer Sensordatendomäne von einer Referenzdatendomäne, umfassend: eine Eingangseinrichtung (2) zum Empfangen von erfassten Sensordaten (10) mindestens eines Sensors (51), eine Recheneinrichtung (3), und eine Ausgabeeinrichtung (5), wobei die Recheneinrichtung (3) dazu eingerichtet ist, mittels eines Distanzmaßes jeweils Distanzen (11) zwischen Teilausschnitten (23) der empfangenen Sensordaten (10) und zum stückweisen Ersetzen mittels Quilting der Sensordaten (10) verwendeten und der Referenzdatendomäne angehörenden Sensordatenpatches (60) zu bestimmen oder solche Distanzen (11) zu erhalten, und ein Entfremdungssignal (21) in Abhängigkeit der bestimmten Distanzen (11) zu erzeugen, und wobei die Ausgabeeinrichtung (5) dazu eingerichtet ist, das erzeugte Entfremdungssignal (21) bereitzustellen. 7. A device (1) for recognizing an alienation of a sensor data domain from a reference data domain, comprising: an input device (2) for receiving captured sensor data (10) of at least one sensor (51), a computing device (3), and an output device (5), wherein the computing device (3) is set up to determine, by means of a distance measure, distances (11) between partial sections (23) of the received sensor data (10) and sensor data patches (60) used for piecewise replacement of the sensor data (10) by means of quilting and belonging to the reference data domain, or to obtain such distances (11), and to generate an alienation signal (21) as a function of the determined distances (11), and wherein the output device (5) is set up to provide the generated alienation signal (21).
8. Kraftfahrzeug (50), umfassend mindestens eine Vorrichtung (1) nach Anspruch 7. 8. Motor vehicle (50) comprising at least one device (1) according to claim 7.
9. Computerprogramm, umfassend Befehle, die bei der Ausführung des Computerprogramms durch einen Computer diesen veranlassen, die Verfahrensschritte des Verfahrens nach einem beliebigen der Ansprüche 1 bis 6 auszuführen. 9. Computer program, comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method steps of the method according to any one of claims 1 to 6.
10. Datenträgersignal, das das Computerprogramm nach Anspruch 9 überträgt. 10. Data carrier signal which transmits the computer program according to claim 9.
EP20829546.9A 2019-12-17 2020-12-10 Method and apparatus for recognising removal of a sensor data domain from a reference data domain Pending EP4078237A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019219927.5A DE102019219927A1 (en) 2019-12-17 2019-12-17 Method and device for recognizing an alienation of a sensor data domain from a reference data domain
PCT/EP2020/085649 WO2021122337A1 (en) 2019-12-17 2020-12-10 Method and apparatus for recognising removal of a sensor data domain from a reference data domain

Publications (1)

Publication Number Publication Date
EP4078237A1 true EP4078237A1 (en) 2022-10-26

Family

ID=74068253

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20829546.9A Pending EP4078237A1 (en) 2019-12-17 2020-12-10 Method and apparatus for recognising removal of a sensor data domain from a reference data domain

Country Status (3)

Country Link
EP (1) EP4078237A1 (en)
DE (1) DE102019219927A1 (en)
WO (1) WO2021122337A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022112655A1 (en) 2022-05-19 2023-11-23 Bayerische Motoren Werke Aktiengesellschaft Driving assistance system and driving assistance method for a vehicle

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017206123A1 (en) * 2017-04-10 2018-10-11 Robert Bosch Gmbh Method and device for merging data from various sensors of a vehicle as part of an object recognition

Also Published As

Publication number Publication date
DE102019219927A1 (en) 2021-06-17
WO2021122337A1 (en) 2021-06-24


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220718

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)