EP3983934A1 - Method for operating a driver assistance system of a vehicle and driver assistance system for a vehicle - Google Patents
- Publication number
- EP3983934A1 (application EP20733555A / EP20733555.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor data
- data
- neural network
- vehicle
- metric
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2458—Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/25—Integrating or interfacing systems involving database management systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
- G06F18/24137—Distances to cluster centroïds
- G06F18/2414—Smoothing the distance, e.g. radial basis function networks [RBFN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
- G06V10/993—Evaluation of the quality of the acquired pattern
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/0083—Setting, resetting, calibration
- B60W2050/0088—Adaptive recalibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/02—Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
- B60W50/0205—Diagnosing or detecting failures; Failure detection models
- B60W2050/0215—Sensor drifts or sensor failures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/20—Ambient conditions, e.g. wind or rain
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/10—Historical data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/20—Data confidence level
Definitions
- the present invention relates to a method for operating a driver assistance system of a vehicle in which sensor data from the surroundings of the vehicle are recorded in succession.
- the recorded sensor data are verified.
- the verified sensor data are then analyzed using a neural network. This generates analyzed sensor data.
- on the basis of the analyzed sensor data, control data for the partially or fully automated control of the vehicle are generated.
- the invention also relates to a driver assistance system for a vehicle with a sensor unit arranged in the vehicle for recording sensor data from the surroundings of the vehicle.
- Modern vehicles include driver assistance systems that support the driver in controlling the vehicle or partially or completely take over the driving task.
- at a low level of automation, only information and warnings are given to the driver. With a higher degree of automation, the driver assistance system actively intervenes in the control of the vehicle, for example in the steering of the vehicle or in the acceleration in a positive or negative direction. At an even higher degree of automation, devices of the vehicle are intervened in to such an extent that certain types of locomotion of the vehicle, for example straight-ahead travel, can be carried out automatically. With the highest degree of automation, the vehicle can drive autonomously.
- at all of these degrees of automation, the driver assistance system must safely control the vehicle.
- for such driver assistance systems, machine learning has great potential.
- Raw sensor data which are generated, for example, by a camera, a radar sensor or a lidar sensor of a vehicle, are processed by means of a deep neural network.
- the neural network generates output data from which the driver assistance system derives the information relevant for partially automated or fully automated driving. For example, the type and position of objects in the vehicle environment and their behavior are determined. Furthermore, the road geometry and road topology can be determined by means of neural networks. Deep neural networks in particular are used for processing digital images.
- Such deep neural networks are trained for use in a driver assistance system.
- during training, the parameters of the neural network can be suitably adapted by feeding in data, without a human expert having to intervene.
- the deviation of an output of a neural network from a ground truth is measured. This deviation is also called the "loss".
- the loss function is chosen in such a way that the deviation depends differentiably on the parameters. As part of a gradient descent, the parameters of the neural network are then adapted in each training step as a function of the derivative of the deviation, which is determined on the basis of several examples. These training steps are repeated very often until the deviation, i.e. the loss, no longer decreases.
- the parameters are determined without the assessment of a human expert or semantically motivated modeling.
- it is important, however, that the neural networks of driver assistance systems generate usable output data even when the input data are disturbed.
- the present invention is therefore based on the object of providing such a method and such a driver assistance system.
- this object is achieved by a method with the features of claim 1 and a driver assistance system with the features of claim 10.
- Advantageous refinements and developments result from the dependent claims.
- sensor data from the surroundings of the vehicle are recorded in succession.
- the recorded sensor data are verified.
- at least first sensor data that were recorded at a first, earlier point in time are compared with second sensor data that were recorded at a second, later point in time.
- the result of the comparison is matched against data from a database in which data on disturbances of input data of a neural network are stored; it is checked whether the second sensor data were at least partially generated by a disturbance of the first sensor data that is stored in the database.
- the verified sensor data are then analyzed using a neural network. This generates analyzed sensor data.
- control data for the partially or fully automated control of the vehicle are generated from the analyzed sensor data.
- the sensor data that are recorded from the surroundings of the vehicle are reliably verified on the basis of disturbances that are stored in advance in a database. In this way it is possible to find known harmful disturbances.
- the disturbances can be recognized in real time in particular. In this way it is also possible to identify changes in the sensor data over time that occur naturally.
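The verification step described above can be sketched as follows (illustrative Python; the frame representation, tolerance and disturbance labels are our own assumptions): the difference between two frames recorded at different times is matched against disturbance patterns stored in a local database.

```python
def frame_diff(first, second):
    """Per-pixel absolute difference between two equally sized frames."""
    return [abs(a - b) for a, b in zip(first, second)]

def matches_known_disturbance(diff, database, tol=1.0):
    """Return the label of a stored disturbance pattern the difference resembles, else None."""
    for label, pattern in database.items():
        if len(pattern) == len(diff) and all(abs(d - p) <= tol for d, p in zip(diff, pattern)):
            return label
    return None

db = {"sensor_noise": [1, 0, 1, 0], "fog": [5, 5, 5, 5]}
diff = frame_diff([10, 10, 10, 10], [15, 15, 15, 15])
print(matches_known_disturbance(diff, db))  # fog
```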
- the sensor data are in particular digital images from a camera in the vehicle.
- the digital images record the surroundings of the vehicle.
- the sensor data are in particular raw sensor data, in particular raw sensor data from the camera.
- the sensor data can also be the data generated by a radar sensor or a lidar sensor.
- the sensor data are in particular raw sensor data.
- the database in which the data on disturbances of input data of the neural network are stored is arranged in particular in the vehicle.
- comparing the sensor data recorded at different times with the database in the vehicle means that the database can be accessed directly, that is to say without a wireless interface. This ensures that the comparison can be carried out in the vehicle at any time, even if the vehicle has no data connection to external units.
- the data stored in the database particularly describe naturally occurring disturbances. These are, for example, weather influences such as fog or snow, sensor noise, camera contamination, or disturbances generated by textures.
- further naturally occurring disturbances are naturally occurring objects in the vicinity of a vehicle, such as printed posters or objects covered with stickers.
- artificial disturbances can also be stored in the database. In this case a classification of the disturbances is also stored, so that when the result of the comparison of the sensor data recorded at different times is matched against the database, it can be checked whether a change in the sensor data was generated by a naturally occurring disturbance or by another, for example artificial, in particular harmful, disturbance.
- if the change corresponds to a naturally occurring disturbance, the sensor data can be verified. If this is not the case, there is a high probability that the change in the sensor data was triggered by a harmful disturbance. In this case, the sensor data are not verified, and these sensor data are then not used to generate the analyzed sensor data.
- the verified sensor data are checked for plausibility in that supplementary sensor data are obtained from the surroundings of the vehicle, the supplementary sensor data are analyzed, deviations of the second sensor data from the first sensor data are determined, and it is checked whether the deviations are consistent with the analysis of the supplementary sensor data.
- the cause of deviations in the sensor data recorded at different points in time is, on the one hand, the movement of the vehicle relative to the environment. In addition to these deviations associated with the movement of the vehicle, however, further deviations are generated that result from a change in the environment.
- the supplementary sensor data can be used to check whether these further deviations are caused by such environmental changes.
- the plausibility check enables a harmful disturbance to be detected which in itself corresponds to, or is similar to, a naturally occurring disturbance. If, for example, the sensor data indicate that the weather conditions have changed such that fog has formed in the vicinity of the vehicle, this can be checked for plausibility by means of the supplementary sensor data. If, for example, an optical sensor indicates that there is no fog, that is to say that there is clear visibility, it can be recognized that the changes in the sensor data which could have been caused by fog were actually caused by a harmful disturbance.
- checking the obtained sensor data can thus be further improved by the plausibility check. This increases safety when the method is used in a driver assistance system.
- in particular, the visibility conditions in the area surrounding the vehicle are obtained as supplementary sensor data. It is then checked whether the deviations of the second sensor data from the first sensor data originate at least partially from the visibility conditions in the vicinity of the vehicle.
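The fog example above can be reduced to a simple rule (illustrative Python; the 200 m visibility threshold and the return labels are our own assumptions, not from the patent): a fog-like change in the camera data is only accepted if an independent visibility sensor confirms fog.

```python
def check_plausibility(camera_indicates_fog, visibility_m):
    """Accept a fog-like change only if an independent sensor confirms fog.

    The 200 m visibility threshold is an assumption for illustration.
    """
    sensor_indicates_fog = visibility_m < 200
    if camera_indicates_fog and not sensor_indicates_fog:
        return "harmful disturbance suspected"
    return "verified"

print(check_plausibility(True, 1000))  # harmful disturbance suspected
print(check_plausibility(True, 100))   # verified
```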
- the visibility conditions in the vicinity of the vehicle can be checked particularly easily by sensors which are usually already provided in the vehicle, independently of the sensors of the driver assistance system. In this way, a harmful disturbance of the driver assistance system can be detected by sensors which are not used directly by the driver assistance system to generate the control data. This also increases the safety when operating the driver assistance system.
- the parameter set of the neural network which is used in the method is generated by the following steps: a. provision of a neural network with the associated parameter set,
- g. generation of an improved parameter set for the neural network on the basis of the result of the comparison of the first and second analysis.
- the example sensor data set is in particular a digital example image.
- the sensor data that are used to improve the parameter set are sensor data obtained in advance that are used for training the neural network.
- these can be similar sensor data, for example similar digital images, as are generated in a driver assistance system in the field when the method is used.
- the improved set of parameters is thus obtained by training the neural network.
- the training is carried out for disturbed and undisturbed sensor data, i.e. in particular digital images.
- the improved parameter set then results, for example, from a gradient descent (adversarial training).
- the disturbed input data can in particular be generated in that a first metric is defined that indicates how the extent of a change in a digital image is measured, and a second metric is defined that indicates at what a disturbance of the input data of a digital image is aimed.
- an optimization problem is generated from a combination of the first metric and the second metric. The optimization problem is solved by means of at least one solution algorithm, the solution specifying a target disturbance of the input data; disturbed input data of sensor data for the neural network are then generated by means of the target disturbance.
- the sensor data are in particular digital images.
- the target disturbance in this case thus generates disturbed, i.e. altered, digital images that form the input data for the neural network that analyzes the digital images.
- possible harmful disturbances of a neural network which is used for analyzing sensor data are considered on a structural level.
- the disturbance is seen as a composition of different elements for which different metrics are defined.
- in this way, a target disturbance of the input data is generated.
- This target disturbance can then be used to generate disturbed input data from sensor data for the neural network.
- the neural network can then be tested and trained on the basis of this disturbed input data.
- the method according to the invention advantageously enables new disturbances to be generated very quickly and in a simple manner.
- the first metric used in the method according to the invention indicates how the extent of a change in sensor data is measured. If the sensor data is a digital image from a camera, the disturbance for testing the neural network should usually be as small as possible.
- the first metric indicates how the extent of the change in the digital image can be quantified. A digital image can be modified, for example, by shifting, rotating or mirroring the pixels of the image. The first metric gives the degree of change of such transformations.
- a rotation or translation of a digital image can be defined by a fixed point and the rotation angle or the translation distance in the horizontal and vertical directions. Furthermore, the first metric can determine image distances by determining, for each pixel of the image, the difference of the pixel values and summing these differences over all pixels.
- the pixel value can be, for example, a grayscale value or a color value.
- for each pixel, the difference between the pixel value of the original image and that of the disturbed image is formed. This difference is determined for each pixel, and the differences are then added. The result is an image distance which indicates the difference between the two images according to the first metric.
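The pixel-wise image distance just described can be sketched directly (illustrative Python; images are represented here as flat lists of grayscale values, which is our own simplification):

```python
def image_distance(original, disturbed):
    """First metric: sum of per-pixel absolute differences (e.g. grayscale values)."""
    assert len(original) == len(disturbed), "images must have the same size"
    return sum(abs(a - b) for a, b in zip(original, disturbed))

print(image_distance([10, 20, 30], [12, 20, 27]))  # 5
```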
- according to the first metric, changed image areas can also be considered.
- the image areas can be defined by a starting point and an extent in the horizontal and vertical directions, or by a list of pixels. Image distances can be determined for these image areas according to the first metric.
- the first metric can measure the extent of a change in a digital image with respect to image characteristics, such as luminance, contrast and/or structure values, or any combination thereof.
- the definition of the first metric can also contain restrictions.
- the changes which are considered in the first metric then only take into account those image areas in which, for example, certain image characteristics exist. For example, only those areas can be considered in which the contrast exceeds a certain threshold value.
- the second metric is aimed at changing the classification of objects.
- it measures the deviation of the true model output from the desired false model output, i.e. the target of the adversarial disturbance.
- the disturbance can be aimed at ensuring that whenever an area is recognized as a street, this street is always recognized as an empty street without other road users.
- the second metric can be aimed at the disappearance of objects.
- the disturbance is aimed, for example, at the fact that recognized objects are changed in such a way that they disappear.
- the second metric can also relate only to certain image areas. For example, the disturbance that is described by the second metric can be aimed at ensuring that objects of a specific class cannot occur in a specific image area.
- the second metric is aimed at a change in an object of a specific class.
- an object can be recognized and classified.
- an image area can be assigned to a recognized object. The second metric is then aimed, for example, at displaying this object larger or smaller or at a different position.
- magnification is defined, for example, by specifying the absolute number of pixels by which the object is enlarged or reduced by the disturbance on the left, right, top and bottom.
- any changes to the sensor data can be brought about in order to change the sensor data in such a way that the analysis of the sensor data in a driver assistance system is impaired.
- a pattern or a grid can be applied to the sensor data so that objects of a certain class, e.g. pedestrians, disappear, but other objects continue to be classified correctly.
- particularly relevant for driver assistance systems are those second metrics that measure naturally occurring disturbances: the model output appears plausible, but deviates from the truth in certain safety-relevant details.
- the disturbances that are described by the first and / or second metric are naturally occurring disturbances.
- a selection is made from the possible disturbances that are described by the first and/or second metric, which is necessary for checking and improving neural networks for use in a driver assistance system.
- the first and / or second metrics are stored in a database.
- the metrics for possible disturbances in the input data (first metrics) and for possible changes in the model outputs (second metrics) can be stored in the database, for example.
- a data set is then selected for a naturally occurring disturbance (measured with a first metric) and for a possible target (an adversarial target measured with a second metric).
- the disturbed input data can also be generated as follows: a first set is defined which contains first metrics, each of which indicates differently how the extent of a change in sensor data is measured; a second set is defined which contains second metrics, each of which indicates differently at what a disturbance of sensor data is aimed; any combination of a first metric from the first set and a second metric from the second set is selected; an optimization problem is generated from the selected combination of the first and second metric; the optimization problem is solved by means of at least one solution algorithm, the solution specifying a target disturbance of the input data; and input data of sensor data for the neural network disturbed by the target disturbance are generated.
- the advantage of this method is that any first metric of the first set and any second metric of the second set can be used in order to arrive at a target disturbance by solving the optimization problem.
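Combining a first metric (extent of the change) and a second metric (progress towards the adversarial target) into one objective can be sketched as follows (illustrative Python; the objective form, the weight and the brute-force search over integer candidates are our own assumptions, standing in for a real solution algorithm):

```python
def objective(change, first_metric, second_metric, weight=0.5):
    """Trade-off: large adversarial effect (second metric) for a small change (first metric)."""
    return second_metric(change) - weight * first_metric(change)

def solve(candidates, first_metric, second_metric):
    """Brute-force 'solution algorithm': pick the candidate change with the best objective."""
    return max(candidates, key=lambda c: objective(c, first_metric, second_metric))

first = abs                       # extent of the change (first metric)
second = lambda c: -(c - 3) ** 2  # peaks when the change reaches the target value 3
print(solve(range(-5, 6), first, second))  # 3
```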
- the first set comprises at least two, in particular at least five, different first metrics.
- the first set can also contain more than 10, 20 or more than 100 metrics.
- the second set comprises at least two, in particular at least five, different second metrics.
- the second set can also contain more than 10, 20 or more than 100 metrics.
- the first and/or the second metrics of the first or second set can, individually or in combination, have the features as described above.
- a third metric is defined which indicates to which kind of sensor data a disturbance is applied.
- the disturbance is applied to all data, to only one data point or to data with certain properties.
- the optimization problem is then generated from a combination of at least two of the first, the second and the third metric.
- the optimization problem is generated in particular from a combination of the first, the second and the third metric.
- the sensor data are in particular digital images. These are analyzed in particular by a neural network in a driver assistance system.
- the third metric can in particular relate to all sensor data, for example all digital images.
- the disturbance in all digital images can result in objects of a certain class disappearing.
- the third metric can also affect only a subset of the sensor data, in particular of the digital images.
- the disturbance can, for example, describe only those digital images which contain objects of a certain class, for example objects classified as pedestrians.
- the third metric can describe digital images that were recorded on days with snowfall or rain. The disturbed input data can then, for example, cause the neural network to evaluate a special traffic situation or environmental situation differently when used in a driver assistance system.
- the third metric only describes sensor data that contain a specific object.
- the third metric can only select a specific digital image.
- any combination of a first metric of the first set, a second metric of the second set and the third metric is selected.
- the optimization problem is then generated from the selected combination of the first, second and third metrics.
- the third metric can in particular individually or in combination have the features as described above.
- the optimization problem that has been generated on the basis of the metrics can be represented, for example, as follows: with a given maximum change in a digital image, for example by rotating a certain image area, the number of pixels classified as persons should be minimized, for as many images as possible in which persons appear.
- the solution algorithm includes iterative methods using the gradients of the neural network to determine the directions of change. Furthermore, iterative methods using sampling, evaluation and combinations thereof can be used.
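The iterative gradient-based procedure can be illustrated with a minimal sketch. The linear "network", its weights and all parameter values here are hypothetical stand-ins, not part of the patent; the point is only the pattern of stepping against the gradient of the network while keeping the change bounded (the role of the first metric):

```python
import numpy as np

# Toy stand-in for the neural network: a linear scorer for one class.
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # "network" weights (hypothetical)
x = rng.uniform(0, 1, size=64)   # flattened input image

def score(x):
    return float(w @ x)          # class score to be reduced by the disturbance

def grad(x):
    return w                     # gradient of the linear score w.r.t. the input

def iterative_attack(x, eps=0.05, step=0.01, iters=20):
    """Iterative sign-gradient descent on the class score, with the
    change to the image bounded by eps (the 'first metric' role)."""
    x_adv = x.copy()
    for _ in range(iters):
        x_adv = x_adv - step * np.sign(grad(x_adv))   # move against the score
        x_adv = np.clip(x_adv, x - eps, x + eps)      # enforce the change bound
        x_adv = np.clip(x_adv, 0.0, 1.0)              # keep a valid pixel range
    return x_adv

x_adv = iterative_attack(x)
```

In a real setting the analytic gradient would be replaced by backpropagation through the network under test.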
- Noise is added to a digital image and the result is verified.
- another embodiment can use a genetic algorithm to solve the optimization problem.
- the solution to the optimization problem can be, for example, a disturbed digital image or a disturbance with which sensor data can be disturbed in order to generate disturbed input data for a neural network.
- the disrupted sensor data or the disrupted digital image then represent the input data for the neural network that is to be checked.
- a perturbation can also be applied to a set of input data by combining at the pixel level, for example by summation.
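Applying a perturbation to a set of input data by pixel-level summation can be sketched as follows (toy image sizes and values are assumptions; only the summation-plus-clipping pattern matters):

```python
import numpy as np

# Toy batch of three 4x4 single-channel images (hypothetical data).
images = np.random.default_rng(1).integers(0, 256, size=(3, 4, 4))

# A perturbation that shifts a single pixel value by 3.
D = np.zeros((4, 4), dtype=np.int64)
D[1, 2] = 3

# Pixel-level combination by summation, kept inside the valid pixel range.
disturbed = np.clip(images + D, 0, 255)
```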
- the first and second analysis can include semantic segmentation of the digital image, recognition of objects of the digital image, classification of objects of the digital image or recognition of the position of an object in the digital image.
- the analyses can be used to identify how an object changes in the digital image. These analyses are particularly relevant when the neural network is used in a driver assistance system; it is therefore important that the neural network is robust against disturbances, so that the analysis changes only slightly when disturbed input data are used.
- a solution algorithm set can be used that contains several solution algorithms, each of which arrives at different target disturbances.
- the solution algorithms of the solution algorithm set can include iterative methods using the gradients of the neural network to determine the directions of change, as well as sampling-based methods, gradient-based methods, gradient-based methods with momentum and / or surrogate-model-based methods.
- the invention also relates to a driver assistance system for a vehicle.
- the driver assistance system has a sensor unit arranged in the vehicle for recording sensor data from the area around the vehicle. It also has a verification unit for verifying the recorded sensor data, the verification unit being designed, when verifying the sensor data, to compare at least first sensor data recorded at an earlier point in time with second sensor data recorded at a later point in time.
- the driver assistance system comprises an evaluation unit which is coupled to the sensor unit and in which a neural network is stored and which is designed to analyze the verified sensor data by means of the neural network and thereby to generate analyzed sensor data.
- the driver assistance system has a control unit which is coupled to the evaluation unit and which is designed to generate control data for the partially automated or pre-automated control of the vehicle on the basis of the analyzed sensor data.
- the driver assistance system according to the invention is designed in particular to carry out the method described above.
- the data on disturbances of input data that are stored in the database comprise in particular disturbed input data for a neural network of the same kind as the one stored in the evaluation unit.
- data on disturbances of input data for any neural networks can also be stored in the database, so that the database can be designed for the use of different neural networks in the evaluation unit.
- the sensor unit is, in particular, a camera for recording digital images.
- the digital images then record the surroundings of the vehicle.
- the camera is arranged in particular in the vehicle.
- the database is preferably also arranged in the vehicle and coupled to the verification unit via a wire connection.
- the verification unit is designed to determine deviations of the second sensor data from the first sensor data.
- the driver assistance system comprises a supplementary sensor unit, which is designed to obtain supplementary sensor data from the surroundings of the vehicle.
- the driver assistance system also has, in particular, a plausibility unit coupled to the supplementary sensor unit and the verification unit, which is designed to analyze the supplementary sensor data and to check whether the deviations are in agreement with the analysis of the supplementary sensor data. In particular, it can be checked whether the deviations can arise from disturbances which were determined by analyzing the supplementary sensor data.
- FIG. 1 shows schematically the structure of an embodiment of the driver assistance system according to the invention,
- FIG. 2 shows schematically the sequence of an embodiment of the method according to the invention,
- FIG. 3 shows schematically the structure of a generator for generating disturbed input data,
- FIG. 4 shows schematically the sequence of a method for generating disturbed input data,
- FIG. 5 shows an example of a disturbance,
- FIG. 6 shows schematically the structure of a device for generating an improved parameter set of a neural network, and
- FIG. 7 shows the sequence of a method for improving a parameter set of a neural network.
- the driver assistance system 20 comprises a sensor unit 21 which is arranged in the vehicle.
- in the exemplary embodiment, the sensor unit 21 is a camera that takes digital images of the surroundings of the vehicle. These digital images can be generated consecutively while the vehicle is in use.
- the sensor unit 21 is connected to a verification unit 22.
- the verification unit 22 can verify the sensor data recorded by the sensor unit 21, that is to say determine whether unwanted changes, in particular harmful disturbances, have occurred in the sensor data. In this way, the sensor data are also authenticated, that is, the unchanged origin of the sensor data from the sensor unit 21 is checked.
- the verification unit 22 compares sensor data that were recorded at different times, in particular successive sensor data sets. For example, two consecutive sensor data sets can be compared. However, it is also possible to compare a large number of successive sensor data sets in order to determine changes.
- the verification unit 22 determines deviations of later recorded sensor data from earlier recorded sensor data.
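A minimal sketch of determining such deviations between sensor data recorded at different times (the threshold value is an assumption for illustration):

```python
import numpy as np

def deviation(first, second):
    """Pixel-wise deviation of a later digital image from an earlier one."""
    return np.abs(second.astype(np.int64) - first.astype(np.int64))

def changed_pixels(first, second, threshold=10):
    """Mark pixels whose change exceeds a threshold as disturbance candidates."""
    return deviation(first, second) > threshold
```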
- the verification unit 22 is coupled to a database 16.
- in the database 16, data on disturbances of input data of a neural network are stored.
- a natural disturbance is understood to be a disturbance which influences sensor data, in the present case digital images, of the surroundings of the vehicle in the same way as they can be caused by naturally occurring phenomena in the surroundings of the vehicle.
- the change in a digital image due to a natural disturbance corresponds, for example, to the change in a digital image as it occurs when weather phenomena occur, such as when fog, snowfall or rain occur.
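As an illustration, a fog-like natural disturbance could be approximated by blending the image toward a uniform gray value; the blending formula and its parameters are assumptions for illustration, not the patent's model of fog:

```python
import numpy as np

def add_fog(image, density=0.5, fog_value=200):
    """Blend a digital image toward a uniform gray, a crude stand-in
    for the image change caused by fog (illustrative assumption)."""
    image = image.astype(float)
    fogged = (1.0 - density) * image + density * fog_value
    return np.clip(fogged, 0, 255).astype(np.uint8)
```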
- natural disturbances are understood to mean image changes in which objects are inserted into the image or disappear from the image, as can also occur in the vicinity of a vehicle.
- a poster or a sticker can appear on an object in the vicinity of the vehicle.
- Non-naturally occurring disturbances include, in particular, harmful disturbances in a neural network which are intended to disrupt the safe operation of the driver assistance system.
- Such faults are identified in the database 16.
- the verification unit 22 is designed to match the result of the comparison of the sensor data recorded at different times with the data in the database 16. It can be checked whether the sensor data recorded later were at least partially generated by a disturbance of the first sensor data stored in the database. If it turns out that the changes were not caused by harmful disturbances, the verification unit 22 verifies the recorded sensor data.
- the plausibility unit 23 is connected to a supplementary sensor unit 24.
- the plausibility unit 23 can also be connected to an interface 25 via which the supplementary sensor data of the plausibility unit 23 can be transmitted.
- the supplementary sensor unit 24 detects the visual range in the surroundings of the vehicle.
- alternatively, weather data can be transmitted to the plausibility unit 23 via the interface 25. From this, the plausibility unit 23 can determine the visibility in the vicinity of the vehicle.
- the plausibility unit 23 can now analyze the supplementary sensor data and check whether the deviations of the recorded sensor data, which have been determined by the verification unit 22, are in agreement with the analysis of the supplementary sensor data. If, for example, the verification unit 22 determines that the range of vision in the vicinity of the vehicle has been greatly reduced, it can be checked whether the supplementary sensor data confirm this.
- the plausibility unit 23 is coupled to an evaluation unit 26. In the evaluation unit 26, a neural network 11 is stored.
- the evaluation unit 26 is designed to analyze the verified and plausibility-checked sensor data by means of the neural network 11 and thereby to generate analyzed sensor data. This analysis is carried out in a manner known per se.
- the digital images are, for example, semantically segmented and the recognized objects on the digital images are assigned to different classes. In this way it can be determined whether, for example, a pedestrian is on a roadway and how this pedestrian is positioned relative to the vehicle and relative to the roadway.
- the evaluation unit 26 is coupled to a control unit 27.
- the control unit 27 is designed to generate, on the basis of the analyzed sensor data, control data for the partially automated or fully automated control of the vehicle.
- These control data are transmitted from the driver assistance system 20 to actuators 28, which, for example, control the steering and propulsion or braking of the vehicle.
- the actuators 28 can control signaling emanating from the vehicle.
- Such a partially automated or fully automated control of the vehicle by means of a driver assistance system 20 is known per se.
- in a step T1, sensor data from the surroundings of the vehicle are recorded successively by means of the sensor unit 21.
- in a step T2, first sensor data that were recorded at a first, earlier point in time are compared with second sensor data that were recorded at a second, later point in time. This yields a result of the comparison.
- in a step T3, the result of the comparison is matched with the data in the database 16. In doing so, it is taken into account whether the data in the database 16 belong to disturbances that can have a natural cause or to harmful disturbances.
- in a step T4, it is checked whether the second sensor data were at least partially generated by a disturbance of the first sensor data stored in the database. If this stored disturbance is a natural disturbance, the recorded sensor data are verified in a step T5.
- the verified sensor data are checked for plausibility in a step T6.
- supplementary sensor data are obtained from the surroundings of the vehicle, and the supplementary sensor data are analyzed.
- deviations of the second sensor data from the first sensor data are determined and it is checked whether the deviations are in agreement with the analysis of the supplementary sensor data.
- in a step T7, the verified and plausibility-checked sensor data are analyzed by means of a neural network 11. As a result, analyzed sensor data are generated in a step T8. Finally, in a step T9, control data for the partially automated or fully automated control of the vehicle are generated and output on the basis of the analyzed sensor data.
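The flow of steps T1 to T9 can be condensed into a sketch. All class and method names are hypothetical, scalar values stand in for sensor data, and the plausibility check (T6) is omitted for brevity:

```python
from dataclasses import dataclass

@dataclass
class Match:
    natural: bool   # does the matched database entry describe a natural disturbance?

class Pipeline:
    """Hypothetical sketch of the T1-T9 flow."""
    def __init__(self, known_natural_diffs):
        self.known = known_natural_diffs   # stands in for the database 16
        self.control_data = None

    def match(self, diff):
        # T3: match the comparison result against the database.
        return Match(natural=True) if diff in self.known else None

    def step(self, first, second, analyze):
        diff = abs(second - first)          # T2: compare first and second sensor data
        m = self.match(diff)                # T3
        if m is not None and m.natural:     # T4: explained by a natural disturbance?
            verified = second               # T5: sensor data are verified
            analyzed = analyze(verified)    # T7/T8: analysis by the neural network
            self.control_data = analyzed    # T9: control data are generated
        return self.control_data
```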
- a parameter set for the neural network 11 used is obtained in a special way.
- an exemplary embodiment of the generator 10 for generating disrupted input data for a neural network for analyzing digital images of the exemplary embodiment of the driver assistance system is described below with reference to FIG. 3.
- sensor data are analyzed by a neural network or disturbed input data for a neural network are generated from such sensor data.
- the sensor data are raw data from sensors of a vehicle, which were obtained in a driver assistance system before the neural network was actually used.
- the sensor can be a camera, a radar sensor, a lidar sensor or any other sensor that generates sensor data that is processed further in a driver assistance system.
- the sensor data are digital images that have been recorded by a camera of a vehicle.
- other sensor data can also be used in the same way.
- the generator 10 comprises a first metric unit 1, a second metric unit 2 and a third metric unit 3.
- the first metric unit 1 comprises a first metric which defines how the extent of a change in digital images is measured.
- the definition of the first metric can be entered in the first metric unit 1.
- the first metric unit 1 can also access a database 16 via an interface, in which data is stored with a large number of possible definitions for metrics that measure the extent of a change in digital images.
- the first metric can compare the image distance between two digital images and output a value for this image distance.
- the image distance can be defined, for example, by the sum of the differences of all pixel values of the digital images to be compared.
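This distance can be written down directly; a sketch of such a first metric (array shapes and dtypes are assumed):

```python
import numpy as np

def d1(img_a, img_b):
    """First metric: sum of absolute pixel-value differences of two images."""
    return int(np.sum(np.abs(img_a.astype(np.int64) - img_b.astype(np.int64))))
```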
- the first metric unit 1 selects a disturbance that is as natural as possible from the database 16. A natural disturbance is understood to be a disturbance which influences digital images in the same way as naturally occurring phenomena in the surroundings of the vehicle.
- Natural disturbances are understood to mean image changes in which objects are inserted into the image or disappear from the image, as can also occur in the vicinity of a vehicle.
- a poster or sticker can be inserted on an object in the vicinity of the vehicle.
- other, non-naturally occurring disturbances, as they can also be contained in the database 16, are not taken into account by the first metric unit 1, since they are of lesser relevance for testing a neural network that is used in a driver assistance system.
- the second metric unit 2 comprises a second metric which indicates what a disturbance of the input data of the digital images is aimed at, i.e. the second metric defines what a disruption of a digital image is directed to.
- the definition of the second metric can be transmitted to the second metric unit 2 by an input.
- the second metric unit 2 can also be coupled to the database 16, in which data on a large number of disturbances are stored, each of which is directed to a specific change in digital images. These can be collections of such disturbances.
- the second metric unit 2 selects a fault that is as plausible as possible from the database 16.
- a plausible disturbance is understood to be a disturbance which apparently results in a realistic model output, but which is relevant for the use of the neural network in a driver assistance system.
- the disruption thus generates a digital image in which an object of the initial image, which is classified as a pedestrian, is iteratively enlarged in all four directions, the resulting segmentation of the disrupted digital image being combined with one another again.
- the result is a digital image in which all objects that do not belong to the pedestrian class remain unchanged, but the objects that belong to the pedestrian class are shown enlarged. The other objects are only changed to the extent that they were changed by enlarging the objects of the pedestrian class.
- the third metric unit 3 comprises a third metric which specifies to which digital images the disturbance applies. For example, it can be defined by the metric that the disturbance is only applied to digital images which show other road users, i.e. for example pedestrians, cyclists and other vehicles.
- the three metric units 1 to 3 are connected to a processing unit 4.
- the processing unit 4 is designed to generate an optimization problem from the three metrics of the first to third metric units 1 to 3. For example, this optimization problem includes a loss function for a neural network which contains a disturbance parameter as a parameter and an image resulting from the disturbance (second metric).
- the aim of the optimization problem is to find the minimum of the disturbance parameter, specifically for the digital images that are defined according to the third metric, while the deviation from the output image remains below a certain value according to the first metric.
- the processing unit 4 transmits the optimization problem as a data set to a solution unit 5.
- the solution unit 5 is coupled to a database 6 in which at least one solution algorithm, preferably a plurality of solution algorithms, for optimization problems is stored.
- solution algorithms are known per se. For example, Monte Carlo methods, genetic algorithms and / or gradient-based methods, which the solution unit 5 can access, can be stored in the database 6.
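A Monte Carlo solver, as one of the stored solution algorithms, can be sketched as a bounded random search (the loss interface, sample count and bound are assumptions for illustration):

```python
import numpy as np

def monte_carlo_solve(loss, shape, bound, samples=200, seed=0):
    """Monte Carlo solver sketch: sample bounded disturbances, keep the best."""
    rng = np.random.default_rng(seed)
    best_d, best_loss = None, float("inf")
    for _ in range(samples):
        d = rng.uniform(-bound, bound, size=shape)   # candidate disturbance
        l = loss(d)
        if l < best_loss:
            best_d, best_loss = d, l
    return best_d, best_loss
```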
- the solution unit 5 can generate a target disruption of the input data of digital images as the solution to the optimization problem.
- the target disturbance thus generates a disturbed digital image which can be used as input data for a neural network for analyzing digital images.
- the neural network is set up in particular to analyze digital images from a driver assistance system.
- the solving unit 5 transmits the target disturbance to a generating unit 7.
- Generation unit 7 is also coupled to a database 8 in which a multiplicity of digital images are stored.
- the generating unit 7 can disturb digital images in the database 8 in such a way that disturbed input data 9 of the digital images are generated for a neural network. The disturbed input data 9 are then output by the generation unit 7. With these disturbed input data 9, a neural network can then be tested, trained or the parameter set of the neural network can be improved.
- in a step S1, a first metric is defined, which indicates how the extent of a change in digital images is measured.
- the first metric or a data record which describes the first metric is stored in the first metric unit 1.
- in a step S2, a second metric is defined which indicates what a disruption of the digital images is aimed at.
- This second metric or a data record that describes the second metric is also stored in the second metric unit 2.
- in a step S3, a third metric is defined which specifies the type of digital images to which a disturbance applies. This third metric or a data record that describes this third metric is stored in the third metric unit 3.
- in a step S4, the data records which describe the three metrics are transmitted to the processing unit 4.
- in a step S5, the processing unit 4 generates an optimization problem from a combination of the three metrics.
- the processing unit 4 transmits a data record, which describes the generated optimization problem, to the solution unit 5.
- the solution unit 5 solves the optimization problem by means of at least one solution algorithm, which the solution unit 5 obtains, for example, by accessing the database 6.
- the solution is a target disruption for digital images.
- a data record for this target disruption is transmitted to the generation unit 7.
- in a step S9, the generation unit 7 generates disrupted digital images as input data 9 for a neural network by accessing the database 8. These disturbed input data 9 are output in a step S10.
- the output y of the model M is shown in FIG. 5B.
- the digital image x has been segmented, i.e. the pixels of the digital image x have been assigned classes. The resulting class assignments are shown in FIG. 5B.
- the target output y″, which is to be generated by the disturbance D, is shown in FIG. 5C.
- the aim of the disturbance D is that the pedestrian is shown enlarged.
- the target disturbance is defined as a displacement of individual pixel values by the value 3.
- the target data consist of a concrete image x.
- the first metric is then defined as follows:
- the size of the disturbance is measured as the maximum pixel value between 0 and 255 in the disturbance D.
- the second metric is defined as follows:
- the third metric is defined as follows:
- the attack only relates to the input image x if one demands d3(x′) ≤ 1.
- the focus with regard to the data to be attacked changes dramatically if one demands d3(x′) ≤ 2: then the attack relates to all images.
- the first metric can only allow pixel changes in image areas that are classified as a “tree”.
- a D is to be found in the image areas “tree” in the digital image x, so that d2(D) is minimal, where d1(D) ≤ 3.
- the optimization problem can then be formulated as follows: a D is to be found such that d2(D) is minimal for all images, where d1(D) ≤ 3. In other words: a D with d1(D) ≤ 3 should be found so that the model output for all input images x looks like y″.
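The concrete metrics of this example can be sketched as follows. The thresholding "model" is a hypothetical stand-in for the segmentation model M, chosen only so the constraint d1(D) ≤ 3 and the objective d2 are visible in code:

```python
import numpy as np

def d1(D):
    """First metric: size of the disturbance as its maximum absolute pixel shift."""
    return int(np.max(np.abs(D)))

def d2(D, model, x, y_target):
    """Objective (assumed form): pixels where the output on x + D differs from y''."""
    return int(np.sum(model(x + D) != y_target))

# Hypothetical stand-in for the segmentation model M: threshold into classes 0/1.
model = lambda img: (img > 5).astype(np.int64)

x = np.array([[4, 4], [6, 6]])    # toy input image
y_target = np.ones_like(x)        # target output y'': every pixel in class 1
D = np.full_like(x, 3)            # shift all pixel values by the value 3
```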
- the device comprises the database 8 with digital images.
- the generator 10 described with reference to FIG. 3 is connected to this database 8.
- a neural network 11 is coupled to the database 8 and the generator 10.
- the output of the neural network 11 is coupled to a first analysis unit 12 and a second analysis unit 13.
- the first analysis unit 12 generates a first analysis by means of the neural network 11 on the basis of digital images which are supplied as input data from the database 8 to the neural network 11.
- the second analysis unit 13 generates a second analysis on the basis of disturbed input data 9, which are fed from the generator 10 to the neural network 11. To generate the disturbed input data 9, the generator 10 accesses the database 8.
- the first analysis unit 12 and the second analysis unit 13 are coupled to a comparison unit 14. This is designed to compare the first and the second analysis with one another.
- the comparison unit 14 is coupled to a parameter set generation unit 15.
- the parameter set generation unit 15 is designed to generate an improved parameter set for the neural network 11 on the basis of the result of the comparison of the first and the second analysis, which was transmitted from the comparison unit 14.
- the parameter set generation unit 15 generates the parameter set for the neural network 11 in such a way that the disrupted input data 9 of the digital images generated by the generator 10 have little influence on the analysis of these input data by the neural network 11.
- the improved parameter set is generated in such a way that the effects of the disturbed input data 9 on the semantic segmentation of the digital image by means of the neural network 11 do not lead to objects relevant to safety for a driver assistance system being classified incorrectly, disappearing or being represented as changed.
- the neural network 11 can thus be trained using the disturbed input data 9 generated by the generator 10.
- in a step R1, a neural network with an associated parameter set is created.
- This neural network is to be checked.
- in a step R2, training data are generated using a large number of digital images.
- in a step R3, the neural network is trained in a manner known per se with the training data, and a first analysis of the digital images is generated on the basis of the training data by means of the neural network.
- in a step R4, disturbed input data are generated as training data for the digital images using the method as explained with reference to FIG. 4.
- in a step R5, a second analysis of the digital images on the basis of the disturbed input data, that is to say on the basis of the digital images to which the target disturbance was applied, is generated by means of the neural network.
- in a step R6, the first and the second analysis are compared with one another.
- in a step R8, an improved set of parameters for the neural network is generated on the basis of the result of the comparison and the second analysis.
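Steps R3 to R8 amount to retraining on clean and disturbed inputs so that the disturbance loses influence on the analysis. A toy sketch with a linear logistic model in place of the neural network (the model, learning rate and data are assumptions for illustration):

```python
import numpy as np

def improve_parameters(w, xs, ys, disturb, lr=0.1, epochs=50):
    """R4-R8 sketch: retrain a toy linear model on clean and disturbed inputs
    so the disturbance has little influence on the analysis."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            for inp in (x, disturb(x)):          # clean and disturbed training data
                pred = 1.0 / (1.0 + np.exp(-w @ inp))
                w = w + lr * (y - pred) * inp    # gradient step on the logistic loss
    return w
```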
- the generator 10 of the further exemplary embodiment comprises a first metric unit 1 and a second metric unit 2, as in the first exemplary embodiment. The first metric unit 1 comprises a first set with a multiplicity of first metrics, which each differently indicate how the extent of a change in sensor data is measured.
- the second metric unit 2 comprises a second set with a multiplicity of second metrics, which each differently indicate where a disturbance of the input data 9 of sensor data is directed.
- the processing unit 4 coupled to the first 1 and second 2 metric units is designed in this case to generate the optimization problem from any combination of a first metric of the first set and a second metric of the second set.
- the solution unit 5 coupled to the processing unit 4 is then designed to solve the optimization problem by means of at least one solution algorithm, the solution indicating a target disruption of the input data 9 of sensor data.
- the generation unit 7 is also designed to generate input data 9 from sensor data for a neural network 11 that are disturbed by the target disturbance.
- the method of the further exemplary embodiment runs analogously to the method of the first exemplary embodiment.
- a first set is defined which contains the first metrics, which in each case indicate differently how the extent of a change in sensor data is measured.
- a second set is defined, which contains the second metrics, which each differently indicate where a disturbance of sensor data is directed. Any combination of a first metric of the first set and a second metric of the second set is then selected, and the optimization problem is generated from the selected combination of the first and second metrics.
- this is then solved by means of at least one solution algorithm, the solution indicating a target disruption of the input data 9.
- disturbed input data 9 are generated from sensor data for the neural network 11.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019208735.3A DE102019208735B4 (de) | 2019-06-14 | 2019-06-14 | Verfahren zum Betreiben eines Fahrassistenzsystems eines Fahrzeugs und Fahrerassistenzsystem für ein Fahrzeug |
PCT/EP2020/066341 WO2020249755A1 (de) | 2019-06-14 | 2020-06-12 | Verfahren zum betreiben eines fahrassistenzsystems eines fahrzeugs und fahrerassistenzsystem für ein fahrzeug |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3983934A1 true EP3983934A1 (de) | 2022-04-20 |
Family
ID=71105452
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20733555.5A Pending EP3983934A1 (de) | 2019-06-14 | 2020-06-12 | Verfahren zum betreiben eines fahrassistenzsystems eines fahrzeugs und fahrerassistenzsystem für ein fahrzeug |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220266854A1 (de) |
EP (1) | EP3983934A1 (de) |
CN (1) | CN114008683A (de) |
DE (1) | DE102019208735B4 (de) |
WO (1) | WO2020249755A1 (de) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102021107793A1 (de) | 2021-03-29 | 2022-09-29 | Ford Global Technologies, Llc | Verfahren zum Betreiben einer Fahrerassistenzfunktion |
DE102022202002A1 (de) | 2022-02-28 | 2023-08-31 | Volkswagen Aktiengesellschaft | Verfahren, Computerprogramm, und Vorrichtung zur Anpassung von Betriebsparametern eines Fortbewegungsmittels, sowie Fortbewegungsmittel |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL152310A (en) * | 2002-10-15 | 2010-05-17 | Magal Security Systems Ltd | System and method for detecting, locating and recognizing an approach toward an elongated installation |
DE102010012682B4 (de) | 2010-03-24 | 2011-10-27 | Deutsch-Französisches Forschungsinstitut Saint-Louis | Verfahren zur Regelung unter Verwendung eines neuronalen Netzes und Einrichtung hierzu |
DE102011082477A1 (de) * | 2011-09-12 | 2013-03-14 | Robert Bosch Gmbh | Verfahren und System zur Erstellung einer digitalen Abbildung eines Fahrzeugumfeldes |
US20180324194A1 (en) * | 2013-03-15 | 2018-11-08 | CyberSecure IPS, LLC | System and method for detecting a disturbance on a physical transmission line |
DE102016202805A1 (de) | 2016-02-24 | 2017-08-24 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Vorrichtung zum Betreiben eines Umfeldsensors eines Fahrzeugs |
DE102016212326A1 (de) * | 2016-07-06 | 2018-01-11 | Robert Bosch Gmbh | Verfahren zur Verarbeitung von Sensordaten für eine Position und/oder Orientierung eines Fahrzeugs |
EP3306590A1 (de) * | 2016-10-07 | 2018-04-11 | Autoliv Development AB | Fahrerassistenzsystem und -verfahren für ein motorfahrzeug |
DE102017204404B3 (de) | 2017-03-16 | 2018-06-28 | Audi Ag | Verfahren und Vorhersagevorrichtung zum Vorhersagen eines Verhaltens eines Objekts in einer Umgebung eines Kraftfahrzeugs und Kraftfahrzeug |
DE102017217256A1 (de) * | 2017-09-28 | 2019-03-28 | Zf Friedrichshafen Ag | Kommunikationsfluss von Verkehrsteilnehmer in Richtung eines automatisiert fahrenden Fahrzeug |
US10997491B2 (en) * | 2017-10-04 | 2021-05-04 | Huawei Technologies Co., Ltd. | Method of prediction of a state of an object in the environment using an action model of a neural network |
-
2019
- 2019-06-14 DE DE102019208735.3A patent/DE102019208735B4/de active Active
-
2020
- 2020-06-12 WO PCT/EP2020/066341 patent/WO2020249755A1/de active Application Filing
- 2020-06-12 CN CN202080043498.3A patent/CN114008683A/zh active Pending
- 2020-06-12 EP EP20733555.5A patent/EP3983934A1/de active Pending
- 2020-06-12 US US17/619,094 patent/US20220266854A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2020249755A1 (de) | 2020-12-17 |
CN114008683A (zh) | 2022-02-01 |
DE102019208735A1 (de) | 2020-12-17 |
US20220266854A1 (en) | 2022-08-25 |
DE102019208735B4 (de) | 2021-12-23 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: UNKNOWN
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
| PUAI | Public reference made under Article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
20220114 | 17P | Request for examination filed | Effective date: 20220114
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| DAV | Request for validation of the European patent (deleted) |
| DAX | Request for extension of the European patent (deleted) |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
20240221 | 17Q | First examination report despatched | Effective date: 20240221