EP3983936A1 - Verfahren und Generator zum Erzeugen von gestörten Eingangsdaten für ein neuronales Netz (Method and generator for generating disturbed input data for a neural network) - Google Patents
Method and generator for generating disturbed input data for a neural network (Verfahren und Generator zum Erzeugen von gestörten Eingangsdaten für ein neuronales Netz)
- Publication number
- EP3983936A1 (application EP20733710.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- metric
- sensor data
- neural network
- input data
- data
- Prior art date: 2019-06-14
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links (machine-extracted concepts; occurrence counts in parentheses)
- artificial neural network: title, claims, abstract, description (137)
- method: title, claims, abstract, description (95)
- optimization: claims, abstract, description (50)
- calculation algorithm: claims, abstract, description (38)
- change: claims, abstract, description (33)
- training: claims, description (24)
- processing: claims, description (18)
- disappearance: claims, description (2)
- testing method: description (6)
- function: description (5)
- segmentation: description (4)
- benefit: description (3)
- effect: description (3)
- Monte Carlo simulation: description (2)
- development: description (2)
- developmental process: description (2)
- genetic: description (2)
- neural: description (2)
- sampling: description (2)
- simulation: description (2)
- translation: description (2)
- Homo (species): description (1)
- acceleration: description (1)
- approach: description (1)
- beneficial effect: description (1)
- calculation method: description (1)
- colorant: description (1)
- construction: description (1)
- dependent: description (1)
- derivation: description (1)
- diminished effect: description (1)
- environmental: description (1)
- evaluation: description (1)
- locomotion: description (1)
- machine learning: description (1)
- malfunction: description (1)
- process: description (1)
- transformation: description (1)
- transformations: description (1)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
- G06F18/24137—Distances to cluster centroïds
- G06F18/2414—Smoothing the distance, e.g. radial basis function networks [RBFN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
Definitions
- the present invention relates to a method for generating disturbed input data for a neural network for analyzing sensor data, in particular for analyzing digital images, of a driver assistance system.
- the invention also relates to a method for checking the robustness of such a neural network and a method for improving a parameter set of such a neural network.
- the invention further relates to a generator for generating disturbed input data for a neural network for analyzing sensor data, in particular for analyzing digital images, of a driver assistance system.
- Modern vehicles include driver assistance systems that support the driver in controlling the vehicle or partially or completely take over the driving task.
- at a low degree of automation, only information and warnings are given to the driver. With a higher degree of automation, the driver assistance system actively intervenes in the control of the vehicle, for example in the steering of the vehicle or in the acceleration in a positive or negative direction. At an even higher degree of automation, the driver assistance system intervenes in devices of the vehicle to such an extent that certain types of locomotion of the vehicle, for example straight travel, can be carried out automatically. With the highest degree of automation, the vehicle can drive autonomously.
- in each case, it is important that the driver assistance system safely controls the vehicle.
- for this, machine learning has great potential.
- Raw sensor data which are generated, for example, by a camera, a radar sensor or a lidar sensor of a vehicle, are processed by means of a deep neural network.
- the neural network generates output data from which the driver assistance system derives information relevant for driving. For example, the type and position of objects in the vehicle environment and their behavior are determined. Furthermore, the road geometry and road topology can be determined by means of neural networks. For processing digital images, deep neural networks are used in particular.
- Such deep neural networks are trained for use in a driver assistance system.
- the parameters of the neural network can be suitably adapted by feeding in data, without a human expert having to intervene.
- the deviation of an output of a neural network from a ground truth is measured. This deviation is also called the "loss".
- the loss function is chosen in such a way that the parameters depend on it differentiably. As part of a gradient descent, the parameters of the neural network are then adapted in each training step as a function of the derivative of the deviation, which is determined on the basis of several examples. These training steps are repeated many times until the deviation, i.e. the loss, no longer decreases.
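- the training step described above can be illustrated with a short sketch. The following Python/PyTorch fragment is not part of the patent; the network architecture, loss choice and optimizer settings are illustrative assumptions that merely show how the parameters are adapted as a function of the derivative of the loss.

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the neural network (assumption: a small classifier;
# the patent does not fix an architecture).
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 3))
loss_fn = nn.CrossEntropyLoss()                       # deviation from the ground truth ("loss")
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def training_step(inputs: torch.Tensor, ground_truth: torch.Tensor) -> float:
    """One gradient-descent step: adapt the parameters as a function of the
    derivative of the deviation, determined on a batch of examples."""
    optimizer.zero_grad()
    deviation = loss_fn(model(inputs), ground_truth)
    deviation.backward()                              # derivative of the loss w.r.t. the parameters
    optimizer.step()                                  # gradient-descent update of the parameters
    return deviation.item()

# The steps are repeated until the loss no longer decreases (criterion simplified here).
for _ in range(100):
    batch = torch.randn(8, 16)                        # dummy batch standing in for sensor data
    labels = torch.randint(0, 3, (8,))
    training_step(batch, labels)
```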
- the parameters are determined without the assessment of a human expert or semantically motivated modeling.
- for use in driver assistance systems, it is important that the neural network also generates usable output data when the input data is disturbed.
- for this purpose, disturbed input data are required with which it can be tested how robust a neural network is against such disturbances.
- disturbed input data can only be generated to a limited extent by known disturbances.
- the present invention is therefore based on the object of providing a method and a generator with which disturbed input data for a neural network can be generated.
- this object is achieved by a method with the features of claim 1 and a method with the features of claim 12 as well as a generator with the features of claim 14 and a generator with the features of claim 20. Furthermore, a method for checking the robustness of a neural network for analyzing sensor data, in particular digital images, against disturbed input data and a method for improving a parameter set of such a neural network can be specified.
- a first metric is defined that indicates how the extent of a change in a digital image is measured.
- a second metric is defined that indicates what a disruption of the input data of a digital image is aimed at.
- An optimization problem is generated from a combination of the first metric and the second metric.
- the optimization problem is solved by means of at least one solution algorithm, the solution specifying a target disruption of the input data, and input data for the neural network that are disrupted by the target disruption are generated from sensor data.
- the sensor data are in particular digital images.
- the target disturbance in this case thus generates disturbed, i.e. altered, digital images that form the input data for the neural network that analyzes the digital image.
- possible harmful disturbances of a neural network which is used for analyzing sensor data, are considered on a structural level.
- the disturbance is seen as a composition of different elements for which different metrics are defined.
- This target disturbance can then be used to generate disturbed input data from sensor data for the neural network.
- the neural network can then be tested and trained on the basis of this disturbed input data. This advantageously enables the method according to the invention to generate new disturbances very quickly and in a simple manner.
- the first metric used in the method according to the invention indicates how the extent of a change in sensor data is measured. If the sensor data is a digital image from a camera, the disturbance for testing the neural network should usually be as small as possible.
- the first metric indicates how the extent of the change in the digital image can be quantified. A digital image can be modified, for example, by shifting, rotating or mirroring the pixels of the image. The first metric gives the degree of change caused by such modifications.
- a rotation or translation of a digital image can be defined by a fixed point and the rotation angle or the translation distance in the horizontal and vertical directions. Furthermore, the first metric can determine the image distances for each pixel of the image by determining the sum of the differences of all pixel values.
- the pixel value can be, for example, a grayscale value or a color value.
- For each pixel the difference between the pixel value for the original image and for the disturbed image is formed. This difference is determined for each pixel and the differences are then added. The result is an image spacing which indicates the difference between the two images according to the first metric.
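- a minimal sketch of such a first metric, assuming 8-bit images stored as NumPy arrays; the function name and the use of absolute differences are illustrative choices, not prescribed by the text:

```python
import numpy as np

def image_distance(original: np.ndarray, disturbed: np.ndarray) -> float:
    """First metric (sketch): for each pixel, form the difference between the
    pixel values of the original and the disturbed image, then add up all
    differences to obtain the image distance."""
    assert original.shape == disturbed.shape
    diff = original.astype(np.int64) - disturbed.astype(np.int64)
    return float(np.abs(diff).sum())

# Example: changing a single gray value by 3 yields an image distance of 3.
a = np.zeros((4, 4), dtype=np.uint8)
b = a.copy()
b[0, 0] = 3
print(image_distance(a, b))  # 3.0
```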
- changed image areas can also be considered according to the first metric.
- the image areas can be defined by a starting point and an extent in the horizontal and vertical directions, or by a list of pixels. Image distances can be determined for these image areas according to the first metric.
- the first metric can measure the extent of a change in a digital image with respect to image characteristics, such as luminance, contrast and/or structure values, or any combination thereof.
- the definition of the first metric can also contain restrictions
- the changes that are considered in the first metric only take into account those image areas in which, for example, certain image characteristics are present. For example, only those areas can be considered in which the contrast exceeds a certain threshold value.
- the first metric is selected from first metrics which measure potential naturally occurring disturbances, since disturbances determined by these metrics can actually occur during execution in the field.
- Such natural disturbances are, for example, changes in the sensor data that are generated due to the effects of the weather, such as fog or snow, sensor noise or camera pollution, or that are generated by textures.
- other naturally occurring disturbances are objects that occur naturally in the vicinity of a vehicle, such as printed posters or stickers on objects.
- if the disturbance of the second metric is aimed at making objects of a certain class disappear, it is possible, for example, to add a printed poster, a sticker on an object, fog or textures to a digital image.
- by means of such disturbances, which according to the second metric are directed towards a certain effect in the sensor data, it is advantageously possible to generate disturbed input data for a neural network which are particularly relevant for use in a driver assistance system.
- the second metric is aimed at changing the classification of objects.
- it measures the deviation of the true model output from the desired false model output, i.e. the target of the adversarial disturbance.
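- one possible way to write such a second metric is sketched below, assuming a segmentation network that outputs per-pixel class scores and a desired (false) target segmentation; the cross-entropy form is an assumption, not taken from the patent:

```python
import torch
import torch.nn.functional as F

def second_metric(model_logits: torch.Tensor, target_output: torch.Tensor) -> torch.Tensor:
    """Second metric (sketch): deviation of the current model output from the
    desired false model output, i.e. the target of the adversarial disturbance.

    model_logits:  (N, C, H, W) per-pixel class scores of the neural network
    target_output: (N, H, W) per-pixel class indices of the desired (wrong) output
    """
    return F.cross_entropy(model_logits, target_output)
```

- minimizing this value over the disturbance pushes the model output towards the adversarial target.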
- the disturbance can be aimed at ensuring that whenever an area is recognized as a street, this street is always recognized as an empty street without other road users.
- the second metric can be aimed at the disappearance of objects.
- the disturbance is aimed, for example, at the fact that recognized objects are changed in such a way that they disappear.
- the second metric can only relate to certain image areas. For example, the disturbance that is described by the second metric can be aimed at the fact that objects of a specific class cannot occur in a specific image area.
- the second metric is aimed at a change in an object of a specific class.
- an object can be recognized and classified in the digital image, and an image area of the image can be assigned to this object.
- the second metric is then directed, for example, at displaying this object larger or smaller or in a different position.
- magnification is defined, for example, by specifying the absolute number of pixels by which the object is enlarged or reduced by the disturbance on the left, right, top and bottom.
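- as an illustration of such a magnification, the following sketch builds a target segmentation in which all objects of one class are enlarged; the use of SciPy's binary dilation and the helper name are assumptions, not part of the patent:

```python
import numpy as np
from scipy.ndimage import binary_dilation

def enlarge_class(segmentation: np.ndarray, class_id: int, pixels: int) -> np.ndarray:
    """Sketch: enlarge all objects of a given class (e.g. pedestrians) by an
    absolute number of pixels on the left, right, top and bottom, and recombine
    the result with the otherwise unchanged segmentation."""
    assert pixels >= 1
    mask = segmentation == class_id
    grown = binary_dilation(mask, iterations=pixels)   # grow the mask by `pixels` in every direction
    target = segmentation.copy()
    target[grown] = class_id                           # enlarged objects overwrite their surroundings
    return target
```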
- any changes to the sensor data can be brought about in order to change the sensor data in such a way that the analysis of the sensor data in a driver assistance system is influenced in a targeted manner.
- a pattern or a grid can be applied to the sensor data so that objects of a certain class, for example pedestrians, disappear in a digital image, but other objects continue to be classified correctly.
- particularly relevant for driver assistance systems are those second metrics that measure naturally occurring disturbances: the model output appears plausible, but deviates from the truth in certain safety-relevant details.
- the disturbances that are described by the first and / or second metric are naturally occurring disturbances.
- in this way, a selection is made from the possible disturbances that are described by the first and/or second metric, as is necessary for checking and improving neural networks for use in a driver assistance system.
- the first and / or second metrics are stored in a database.
- the metrics for possible disturbances in the input data (first metrics) and for possible changes in the model outputs (second metrics) can be stored in the database, for example.
- a data set is then used for a naturally occurring disturbance (measured with a first metric) and for a possible goal (an adversarial attack).
- a third metric is defined, which indicates to what kind of sensor data a disturbance is applied. For example, the error occurs on all data, on only one data point or on data with certain characteristics.
- the optimization problem is then generated from a combination of at least two metrics of the first, the second and the third metric.
- the optimization problem is generated in particular from a combination of the first, the second and the third metric.
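- written out compactly, the combination of the metrics can be read as a constrained optimization problem of roughly the following form (a sketch; the symbols d1, d2, d3 for the three metrics, M for the neural network, δ for the disturbance, y^target for the desired output, ⊕ for applying the disturbance and the bounds ε, τ are notation introduced here, not in the patent text):

```latex
\[
\begin{aligned}
\min_{\delta}\quad & \sum_{x \,:\, d_3(x)\,\le\,\tau} d_2\bigl(M(x \oplus \delta),\, y^{\mathrm{target}}\bigr)\\[2pt]
\text{subject to}\quad & d_1(\delta) \le \varepsilon
\end{aligned}
\]
```

- here d3 selects the sensor data to which the disturbance is applied, d2 measures how close the model output is to the intended effect, and d1 bounds the extent of the change.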
- the sensor data are in particular digital images. These are analyzed in particular by a neural network in a driver assistance system.
- the third metric can in particular relate to all sensor data, for example all digital images.
- the disruption in all digital images can result in objects of a certain class disappearing.
- the third metric can only affect a subset of the sensor data, in particular the digital images.
- the disturbance can, for example, only describe those digital images which contain objects of a certain class, for example objects classified as pedestrians.
- the third metric can describe digital images that were recorded on days with snowfall or rain.
- the disturbed input data for the neural network when used in a driver assistance system can, for example, cause a different assessment of a special traffic situation or environmental situation.
- the third metric only describes sensor data that contain a specific object.
- the third metric can only select a specific digital image.
- the optimization problem that has been generated on the basis of the metrics can be represented as follows, for example: With a given maximum change in a digital image, for example by rotating a certain image area, the number of pixels classified as people should be minimized, for as many images as possible in which people appear.
- the number of pixels classified as a person is to be minimized, namely for as many images as possible in which people occur.
- the solution algorithm includes iterative methods using the gradients of the neural network to determine the directions of change. Furthermore, iterative methods using sampling, evaluation and combinations thereof can be used.
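- a sketch of one such iterative, gradient-based solution algorithm is given below (a projected-gradient-style loop in PyTorch); the step size, the number of iterations and the clamp used as the first-metric bound are illustrative assumptions:

```python
import torch

def solve_by_gradients(model, x, target, second_metric, eps=8 / 255, step=2 / 255, iters=20):
    """Iterative solution algorithm (sketch): use the gradients of the neural
    network to determine the directions of change that drive the model output
    towards the adversarial target, while the clamp keeps the disturbance small
    in the sense of the first metric (here: maximum pixel change <= eps)."""
    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(iters):
        deviation = second_metric(model(x + delta), target)
        deviation.backward()
        with torch.no_grad():
            delta -= step * delta.grad.sign()          # move towards the target output
            delta.clamp_(-eps, eps)                    # first-metric constraint on the disturbance
            delta.grad.zero_()
    return delta.detach()                              # the target disruption
```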
- for example, noise is added to a digital image and the result is verified.
- the solution to the optimization problem can be, for example, a disturbed digital image or a disturbance with which sensor data can be disturbed in order to generate disturbed input data for a neural network.
- the disrupted sensor data or the disrupted digital image then represent the input data for the neural network that is to be checked.
- a perturbation can also be applied to a set of input data by combining at the pixel level, for example by summation.
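- applying one and the same target disruption to a whole set of images by pixel-wise summation can be sketched as follows, assuming 8-bit image arrays; the clipping to the valid value range is an added assumption:

```python
import numpy as np

def apply_disturbance(images: np.ndarray, disturbance: np.ndarray) -> np.ndarray:
    """Combine a single disturbance with a set of images at the pixel level
    (here by summation) to obtain disturbed input data for the neural network."""
    disturbed = images.astype(np.int64) + disturbance.astype(np.int64)  # broadcasts over the set
    return np.clip(disturbed, 0, 255).astype(np.uint8)
```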
- another aspect of the invention relates to a method for generating disturbed input data for a neural network for analyzing sensor data, in particular digital images, of a driver assistance system, in which a first set is defined which contains first metrics that each indicate differently how the extent of a change in sensor data is measured, a second set is defined which contains second metrics that each indicate differently what a disturbance of sensor data is aimed at, any combination of a first metric of the first set and a second metric of the second set is selected, an optimization problem is generated from the selected combination of the first and second metric, the optimization problem is solved by means of at least one solution algorithm, the solution indicating a target disruption of the input data, and input data disrupted by the target disruption are generated from sensor data for the neural network.
- the advantage of this method is that any first metric of the first set and any second metric of the second set can be used in order to arrive at a target disruption by solving the optimization problem.
- the more metrics the first and second sets contain, the more different target disturbances can be generated by the method. A very large number of target disturbances can thus be generated.
- the first set comprises at least two, in particular at least five, different first metrics.
- the first set can also contain more than 10, more than 20 or more than 100 metrics.
- the second set comprises at least two, in particular at least five, different second metrics.
- the second set can also contain more than 10, more than 20 or more than 100 metrics.
- the first and/or the second metrics of the first or second set can in particular, individually or in combination, have the features as described above.
- a third metric is defined which indicates what kind of sensor data a disturbance applies to and any combination of a first metric of the first set, a second metric of the second set and the third metric is selected. An optimization problem is then generated from the selected combination of the first, second and third metrics.
- the third metric can in particular individually or in combination have the features as described above.
- the optimization problem is preferably solved by means of a solution algorithm set that contains several solution algorithms, so that, depending on the solution algorithm used, different target disturbances are arrived at.
- the solution algorithms of the solution algorithm set can include iterative methods using the gradients of the neural network to determine the directions of change, as well as sampling-based methods, gradient-based methods, gradient-based methods with momentum and / or surrogate-model-based methods.
- the present invention also relates to a method for checking the robustness of a neural network for analyzing sensor data, in particular digital images, in relation to disturbed input data, in which the following steps are carried out:
- Providing a neural network with an associated parameter set, generating training data using an example sensor data set, generating a first analysis of the example sensor data set based on the training data by means of the neural network, generating disrupted input data as training data for the example sensor data set using the above-described method for generating disrupted input data for a neural network, generating a second analysis of the example sensor data set based on the disrupted input data by means of the neural network, comparing the first and second analysis, and determining a robustness value depending on the result of the comparison of the first and second analysis.
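- the comparison in the last two steps can be sketched as follows, assuming both analyses are semantic segmentations given as per-pixel class maps; using the pixel-wise agreement as the robustness value is an illustrative choice, not prescribed by the text:

```python
import numpy as np

def robustness_value(first_analysis: np.ndarray, second_analysis: np.ndarray) -> float:
    """Compare the analysis of the undisturbed data with the analysis of the
    disturbed input data and derive a robustness value: close to 1.0 when the
    two analyses deviate only slightly, close to 0.0 when they diverge."""
    return float((first_analysis == second_analysis).mean())
```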
- the present invention also relates to a method for improving a parameter set of a neural network for analyzing sensor data, in particular digital images, with regard to disturbed input data, which involves the following steps: a. provision of a neural network with the associated parameter set,
- the example sensor data set is in particular a digital example image.
- the method described at the beginning for generating disrupted input data can be used to check how robust a neural network for analyzing sensor data is against the disturbed input data.
- if the neural network is used in a method for analyzing sensor data of a driver assistance system, it is important that the neural network is robust against disturbed input data.
- the neural network is robust against such disturbed input data if the deviation in the first and the second analysis is very small.
- the disturbed input data then have little influence on the output of the neural network. If, on the other hand, the disturbed input data lead to very large deviations in the analysis, even though the disturbances of the sensor data are only very small, the neural network is not robust against the disturbance of the input data.
- the first and second analysis can include semantic segmentation of the digital image, recognition of objects of the digital image, classification of objects of the digital image or recognition of the position of an object in the digital image.
- the analyses can be used to identify how an object changes in the digital image. These analyses are particularly relevant when the neural network is used in a driver assistance system; it is therefore important that the neural network is robust against disturbances in such analyses, so that only slight changes in the analysis result when disturbed input data are used.
- the invention also relates to a method for improving a parameter set of a neural network for analyzing sensor data, in particular digital images, with respect to disturbed input data.
- the method comprises the steps a. to f., as given above. Then, in a step h., an improved parameter set for the neural network is generated on the basis of the result of the comparison of the first and second analysis.
- the improved set of parameters is obtained by training the neural network.
- the training takes place on disturbed and undisturbed sensor data, i.e. in particular digital images.
- the improved parameter set then results, for example, from a gradient descent (adversarial training).
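- a sketch of one such training step on disturbed and undisturbed data; the equal weighting of the two loss terms and the PyTorch setup are assumptions, not taken from the patent:

```python
import torch

def adversarial_training_step(model, loss_fn, optimizer, clean_x, disturbed_x, ground_truth):
    """One gradient-descent step of adversarial training: the parameter set is
    improved by minimizing the loss on undisturbed and disturbed sensor data
    (here: digital images) at the same time."""
    optimizer.zero_grad()
    loss = loss_fn(model(clean_x), ground_truth) + loss_fn(model(disturbed_x), ground_truth)
    loss.backward()
    optimizer.step()
    return loss.item()
```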
- the invention also relates to a generator for generating disturbed input data for a neural network for analyzing sensor data, in particular digital images, of a driver assistance system, with a first metric unit with a first metric that indicates how the extent of a change in sensor data is measured, a second metric unit with a second metric that indicates what a disruption of the input data of sensor data is aimed at, a processing unit which is coupled to the first and second metric unit and which is designed to generate an optimization problem from the first and the second metric, a solution unit which is coupled to the processing unit and which is designed to solve the optimization problem by means of at least one solution algorithm, the solution indicating a target disruption of the input data, and a generation unit which is coupled to the solution unit and which is designed to generate input data for the neural network from sensor data that are disturbed by means of the target disruption.
- the generator according to the invention is designed in particular to carry out the method described above for generating disturbed input data. It therefore also has the same advantages as this method.
- the generator also includes a third metric unit with a third metric which specifies the sensor data to which the disturbance applies.
- the processing unit is also coupled to the third metric unit and designed to generate the optimization problem from at least two metrics of the first, second and third metrics.
- the invention also relates to a device for generating a parameter set for a neural network for analyzing sensor data of a driver assistance system, with a first analysis unit for generating a first analysis by means of the neural network based on training data for an example sensor data set, the generator described above for generating disturbed input data, which generates disturbed input data as training data for the example sensor data set, a second analysis unit for generating a second analysis of the example sensor data set on the basis of the disturbed input data by means of the neural network, a comparison unit which is coupled to the first and the second analysis unit and which is designed to compare the first and second analysis, and a generation unit which is coupled to the comparison unit and which is designed to generate an improved parameter set for the neural network on the basis of the result of the comparison of the first and second analysis.
- the device for generating a parameter set is designed in particular to carry out the method described above for improving a parameter set of a neural network. It therefore also has the same advantages as this method.
- FIG 1 shows schematically the structure of an exemplary embodiment of the generator according to the invention
- FIG. 2 shows schematically the sequence of an embodiment of the invention
- FIG. 3 shows schematically the structure of an embodiment of the invention
- FIG. 4 shows the sequence of an exemplary embodiment of the method according to the invention for checking the robustness of a neural network
- FIG. 5 shows the sequence of the method according to the invention for improving a parameter set of a neural network
- Figure 6 shows an example of a fault.
- sensor data are analyzed by a neural network or disturbed input data for a neural network are generated from such sensor data.
- the sensor data in the exemplary embodiments are generated by a sensor of a vehicle.
- the sensor can be a camera, a radar sensor, a lidar sensor or any other sensor that generates sensor data that is processed further in a driver assistance system.
- the sensor data are digital images that have been recorded by a camera of a vehicle.
- the invention can also be applied to other sensor data in the same way.
- an exemplary embodiment of the generator 10 for generating disturbed input data for a neural network for analyzing digital images of a driver assistance system is first described.
- the generator 10 comprises a first metric unit 1, a second metric unit 2 and a third metric unit 3.
- the first metric unit 1 comprises a first metric which indicates how the extent of a change in digital images is measured. It is defined by the first metric unit 1 how the extent of a change in digital images is measured.
- the definition of the first metric can be entered in the first metric unit 1.
- the first metric unit 1 can also access a database 16 via an interface, in which data is stored with a large number of possible definitions for metrics that measure the extent of a change in digital images.
- the first metric can compare the image distances between two digital images and output a value for this image distance.
- the image distance can be defined, for example, by the sum of the differences of all pixel values of the digital images to be compared.
- the first metric unit 1 selects a disturbance that is as natural as possible from the database 16.
- natural disturbances are understood to mean image changes in which objects are inserted into the image or disappear from the image, as can also occur in the vicinity of a vehicle.
- a poster or sticker can be inserted on an object in the vicinity of the vehicle.
- other, non-naturally occurring disturbances, as they can also be contained in the database 16, are not taken into account by the first metric unit 1, since they are of lesser relevance for testing a neural network that is used in a driver assistance system.
- the second metric unit 2 comprises a second metric which indicates what a disturbance of the input data of the digital images is aimed at, i.e. the second metric defines what a disruption of a digital image is directed to.
- the definition of the second metric can be transmitted to the second metric unit 2 by an input.
- the second metric unit 2 can also be coupled to the database 16, in which data on a large number of disturbances are stored, each of which is directed to a specific change in digital images. These can be collections of such disturbances.
- the second metric unit 2 selects a disturbance that is as plausible as possible from the database 16.
- a plausible disturbance is understood to be a disturbance which apparently results in a realistic model output, but which deviates from the truth in safety-relevant details.
- the second metric can be aimed, for example, at increasing the size of all objects that are assigned to a specific class, for example the class of pedestrians.
- the disruption thus generates a digital image in which an object of the initial image, which is classified as a pedestrian, is iteratively enlarged in all four directions, the resulting segmentation of the disrupted digital image being combined with one another again.
- the result is a digital image in which all objects that do not belong to the pedestrian class remain unchanged, but the objects that belong to the pedestrian class are shown enlarged.
- the other objects are only changed to the extent that they were changed by enlarging the objects of the pedestrian class.
- the third metric unit 3 comprises a third metric which specifies to which digital images the disturbance applies. For example, it can be defined by the metric that the disturbance is only applied to digital images which show other road users, i.e. for example pedestrians, cyclists and other vehicles.
- the three metric units 1 to 3 are connected to a processing unit 4.
- the processing unit 4 is designed to generate an optimization problem from the three metrics of the first to third metric units 1 to 3. For example, this optimization problem includes a loss function for the neural network which contains a disturbance parameter as a parameter and an image resulting from the disturbance (second metric).
- the aim in the optimization problem is to find the minimum with respect to the disturbance parameter, specifically for the digital images that are defined according to the third metric and under the constraint that the change relative to the original image is below a certain value according to the first metric.
- the processing unit 4 transmits the optimization problem as a data set to a solution unit 5.
- the solution unit 5 is coupled to a database 6 in which at least one solution algorithm, preferably a plurality of solution algorithms, for optimization problems is stored.
- solution algorithms are known per se.
- the solution unit 5 can generate a target disruption of the input data of digital images as the solution to the optimization problem.
- the target disturbance thus generates a disturbed digital image which can be used as input data for a neural network for analyzing digital images.
- the neural network is set up in particular to analyze digital images from a driver assistance system.
- the solution unit 5 transmits the target disturbance to a generation unit 7.
- the generation unit 7 is also coupled to a database 8 in which a multiplicity of digital images are stored.
- the generation unit 7 can disturb digital images in the database 8 in such a way that disturbed input data 9 of the digital images are generated for a neural network. The disturbed input data 9 are then output by the generation unit 7. With these disturbed input data 9, a neural network can then be tested or trained, or the parameter set of the neural network can be improved.
- a first metric is defined, which indicates how the extent of a change in digital images is measured.
- the first metric or a data record which describes the first metric is stored in the first metric unit 1.
- a second metric is defined which indicates what a disruption of the digital images is aimed at.
- This second metric or a data record that describes the second metric is also stored in the second metric unit 2.
- a third metric is defined which specifies the type of digital images to which a disturbance applies. This third metric or a data record that describes this third metric is stored in the third metric unit 3.
- in a step S4, the data records which describe the three metrics are transmitted to the processing unit 4.
- the processing unit 4 generates an optimization problem from a combination of the three metrics.
- the processing unit 4 transmits a data record, which describes the generated optimization problem, to the solution unit 5.
- the solution unit 5 solves the optimization problem by means of at least one solution algorithm, which the solution unit 5 obtains, for example, by accessing the database 6.
- the solution is a target disruption for digital images.
- a data record for this target disruption is transmitted to the generation unit 7.
- in a step S9, the generation unit 7 generates disrupted digital images as input data 9 for a neural network by accessing the database 8. These disturbed input data 9 are output in a step S10.
- the output y of the model M is shown in FIG. 6B.
- the digital image x has been segmented, i.e. the pixels of the digital image x have been assigned classes as shown in Fig. 6B. The following class assignments resulted:
- the target output y″, which is to be generated by the disturbance D, is shown in FIG. 6C.
- the aim of the disturbance D is that the pedestrian is shown enlarged.
- the target disturbance is defined as a shift of individual pixel values by the value 3.
- the target data consist of a concrete image x.
- the first metric is then defined as follows:
- the size of the disturbance is measured as the maximum pixel value between 0 and 255 in the disturbance D.
- the second metric is defined as follows:
- the third metric is defined as follows:
- the attack only relates to the input image x if one demands d3(x′) ≤ 1.
- the focus with regard to the data to be attacked changes dramatically if one demands d3(x′) ≤ 2: then the attack relates to all images.
- the first metric can only allow pixel changes in image areas that are classified as “tree”.
- a D is to be found in image areas “tree” in the digital image x, such that d2(D) is minimal, with d1(D) ≤ 3.
- the optimization problem can then be formulated as follows: a D is to be found such that d2(D) is minimal for all images, where d1(D) ≤ 3. In other words: a D with d1(D) ≤ 3 should be found so that the model output for all input images x looks like y″.
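- using the symbols of this example (disturbance D, model M, target output y″, metrics d1 and d3 as defined above, and d2 for the not fully reproduced second metric measuring the deviation from y″), this optimization problem can be sketched as follows; the threshold τ on d3 and the pixel-wise addition x + D are notational assumptions:

```latex
\[
\min_{D}\;\sum_{x \,:\, d_3(x)\,\le\,\tau} d_2\bigl(M(x + D),\, y''\bigr)
\qquad\text{subject to}\qquad d_1(D) \le 3
\]
```

- with τ = 1 the attack relates only to the single image x, with τ = 2 it relates to all images, as described above.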
- the generator 10 of the further exemplary embodiment comprises, as in the first exemplary embodiment, a first metric unit 1 and a second metric unit 2.
- the first metric unit 1 comprises a first set with a plurality of first metrics, each of which indicates differently how the extent of a change in sensor data is measured.
- the second metric unit 2 comprises a second set with a multiplicity of second metrics, which each differently indicate where a disturbance of the input data 9 of sensor data is directed.
- the processing unit 4 coupled to the first 1 and second 2 metric units is designed in this case to generate the optimization problem from any combination of a first metric of the first set and a second metric of the second set.
- the solution unit 5 coupled to the processing unit 4 is then designed to solve the optimization problem by means of at least one solution algorithm, the solution indicating a target disruption of the input data 9 of sensor data.
- the generation unit 7 is also designed to generate input data 9 from sensor data for a neural network 11 that are disturbed by the target disruption.
- the method of the further exemplary embodiment runs analogously to the method of the first exemplary embodiment.
- a first set is defined which contains the first metrics, which in each case indicate differently how the extent of a change in sensor data is measured.
- a second set is defined, which contains the second metrics, which in each case differently indicate where a disturbance of sensor data is directed. Any combination of a first metric of the first set and a second metric of the second set is then selected and the optimization problem is generated from the selected combination of the first and second metric.
- this is then solved by means of at least one solution algorithm, the solution indicating a target disruption of the input data 9.
- disturbed input data 9 are generated from sensor data for the neural network 11.
- the device comprises the database 8 with digital images.
- the generator 10 described with reference to FIG. 1 is connected to this database 8.
- a neural network 11 is coupled to the database 8 and the generator 10.
- the output of the neural network 11 is coupled to a first analysis unit 12 and a second analysis unit 13.
- the first analysis unit 12 generates a first analysis by means of the neural network 11 on the basis of digital images which are supplied as input data from the database 8 to the neural network 11.
- the second analysis unit 13 generates a second analysis on the basis of disturbed input data 9, which are fed from the generator 10 to the neural network 11. To generate the disturbed input data 9, the generator 10 accesses the database 8.
- the first analysis unit 12 and the second analysis unit 13 are coupled to a comparison unit 14. This is designed to compare the first and the second analysis with one another.
- the comparison unit 14 is coupled to a parameter set generation unit 15.
- the parameter set generation unit 15 is designed to generate an improved parameter set for the neural network 11 on the basis of the result of the comparison of the first and the second analysis, which was transmitted from the comparison unit 14.
- the parameter set generation unit 15 generates the parameter set for the neural network 11 in such a way that the disrupted input data 9 of the digital images generated by the generator 10 have little influence on the analysis of these input data by the neural network 11.
- the improved parameter set is generated in such a way that the effect of the disturbed input data 9 on the semantic segmentation of the digital image by means of the neural network 11 does not lead, for a driver assistance system, to safety-relevant objects being classified incorrectly, disappearing or being represented as changed.
- the neural network 11 can thus be trained using the disturbed input data 9 generated by the generator 10.
- a neural network with an associated parameter set is created
- This neural network is to be checked.
- training data are generated using a large number of digital images.
- in a step R3, the neural network is trained in a manner known per se with training data and a first analysis of the digital images is generated on the basis of the training data by means of the neural network.
- in a step R4, disturbed input data are generated as training data for the digital images by means of the method as explained with reference to FIG.
- a second analysis of the digital images on the basis of the disturbed input data is generated by means of the neural network.
- in a step R6, the first and the second analysis are compared with one another.
- a robustness value is finally determined as a function of the result of the comparison of the first and second analysis.
- the robustness value is high if the deviation of the second analysis from the first analysis is small, in particular with regard to deviations that are relevant for the operation of a driver assistance system.
- steps R1 to R6 are carried out, as was explained with reference to FIG. Subsequently, in a step R8, an improved parameter set for the neural network is generated on the basis of the result of the comparison of the first and second analysis.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Molecular Biology (AREA)
- Mathematical Physics (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019208733.7A DE102019208733A1 (de) | 2019-06-14 | 2019-06-14 | Verfahren und Generator zum Erzeugen von gestörten Eingangsdaten für ein neuronales Netz |
PCT/EP2020/066348 WO2020249758A1 (de) | 2019-06-14 | 2020-06-12 | Verfahren und generator zum erzeugen von gestörten eingangsdaten für ein neuronales netz |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3983936A1 (de) | 2022-04-20 |
Family
ID=71108567
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20733710.6A Pending EP3983936A1 (de) | 2019-06-14 | 2020-06-12 | Verfahren und generator zum erzeugen von gestörten eingangsdaten für ein neuronales netz |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220358747A1 (de) |
EP (1) | EP3983936A1 (de) |
CN (1) | CN114207674A (de) |
DE (1) | DE102019208733A1 (de) |
WO (1) | WO2020249758A1 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3920106A1 (de) * | 2020-06-02 | 2021-12-08 | Bull SAS | Verfahren, computerprogramm und vorrichtung zur evaluierung der widerstandsfähigkeit eines neuronalen netzes gegen bildstörungen |
DE102021108934A1 (de) | 2021-04-09 | 2022-10-13 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und System zur Bereitstellung einer Funktion anhand einer Maschinen-erlernten Verarbeitungseinheit |
DE102021127958A1 (de) | 2021-10-27 | 2023-04-27 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Computerimplementiertes Verfahren zum Training einer künstlichen Intelligenz |
DE102022207450A1 (de) | 2022-07-21 | 2024-02-01 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Validieren eines Algorithmus des maschinellen Lernens |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10528824B2 (en) * | 2017-12-11 | 2020-01-07 | GM Global Technology Operations LLC | Artificial neural network for lane feature classification and localization |
DE102019218613B4 (de) * | 2019-11-29 | 2021-11-11 | Volkswagen Aktiengesellschaft | Objektklassifizierungsverfahren, Objektklassifizierungsschaltung, Kraftfahrzeug |
DE102020213057A1 (de) * | 2020-10-15 | 2022-04-21 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zum Überprüfen eines beim teilautomatisierten oder vollautomatisierten Steuern eines Fahrzeugs verwendeten KI-basierten Informationsverarbeitungssystems |
US20220155096A1 (en) * | 2020-11-16 | 2022-05-19 | Waymo Llc | Processing sparse top-down input representations of an environment using neural networks |
DE102021201445A1 (de) * | 2021-02-16 | 2022-08-18 | Robert Bosch Gesellschaft mit beschränkter Haftung | Computerimplementiertes Verfahren zum Testen der Konformität zwischen realen und synthetischen Bildern für maschinelles Lernen |
WO2022241238A1 (en) * | 2021-05-13 | 2022-11-17 | Drilldocs Company | Object imaging and detection systems and methods |
- 2019
- 2019-06-14 DE DE102019208733.7A patent/DE102019208733A1/de active Pending
- 2020
- 2020-06-12 CN CN202080043495.XA patent/CN114207674A/zh active Pending
- 2020-06-12 WO PCT/EP2020/066348 patent/WO2020249758A1/de active Application Filing
- 2020-06-12 US US17/619,159 patent/US20220358747A1/en active Pending
- 2020-06-12 EP EP20733710.6A patent/EP3983936A1/de active Pending
Also Published As
Publication number | Publication date |
---|---|
US20220358747A1 (en) | 2022-11-10 |
CN114207674A (zh) | 2022-03-18 |
WO2020249758A1 (de) | 2020-12-17 |
DE102019208733A1 (de) | 2020-12-17 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20211117 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20231109 |