WO2021122339A1 - Method and device for making a neural network more robust against adversarial perturbations - Google Patents

Method and device for making a neural network more robust against adversarial perturbations

Info

Publication number
WO2021122339A1
Authority
WO
WIPO (PCT)
Prior art keywords
neural network
data
piece
vehicle
sensor
Prior art date
Application number
PCT/EP2020/085651
Other languages
German (de)
English (en)
Inventor
Peter Schlicht
Fabian HÜGER
Original Assignee
Volkswagen Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen Aktiengesellschaft filed Critical Volkswagen Aktiengesellschaft
Priority to CN202080087609.0A priority Critical patent/CN114787650A/zh
Priority to EP20829548.5A priority patent/EP4078239A1/fr
Publication of WO2021122339A1 publication Critical patent/WO2021122339A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/36Means for anti-jamming, e.g. ECCM, i.e. electronic counter-counter measures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/495Counter-measures or counter-counter-measures using electronic or electro-optical means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/86Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9323Alternative operation using light waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9324Alternative operation using ultrasonic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/023Interference mitigation, e.g. reducing or avoiding non-intentional interference with other HF-transmitters, base station transmitters for mobile communication or other radar systems, e.g. using electro-magnetic interference [EMI] reduction techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks

Definitions

  • The invention relates to a method and a device for robustifying a neural network against adversarial perturbations.
  • The invention also relates to a method for operating an assistance system for a vehicle and an assistance system for a vehicle, as well as a vehicle, a computer program and a data carrier signal.
  • Machine learning, for example based on neural networks, has great potential for use in modern driver assistance systems and automated vehicles.
  • Functions based on deep neural networks process sensor data (e.g. from cameras, radar or lidar sensors) in order to derive relevant information from them.
  • This information includes, for example, a type and a position of objects in the surroundings of the motor vehicle, a behavior of the objects, or a road geometry or topology.
  • In particular, convolutional neural networks (CNN) are used, which process input data (e.g. image data).
  • The convolution network independently develops feature maps based on filter channels that process the input data locally in order to derive local properties. These feature maps are then processed again by further filter channels, which derive higher-value feature maps from them.
  • On the basis of this information compressed from the input data, the deep neural network finally derives its decision and makes it available as output data.
  • The invention is based on the object of creating a method and a device for robustifying a neural network against adversarial perturbations.
  • A method for robustifying a neural network against adversarial perturbations is proposed, wherein sensor data acquired by at least one sensor are fed to the neural network as input data, wherein output data of at least one layer of the neural network are replaced piece by piece by means of quilting, and wherein the piecewise-replaced output data are fed as input data to at least one subsequent layer of the neural network.
  • Furthermore, a device for robustifying a neural network against adversarial perturbations is proposed, comprising a computing device, wherein the computing device is set up to provide a neural network or to access a provided neural network, to receive acquired sensor data of at least one sensor and to feed them to the neural network as input data, to replace output data of at least one layer of the neural network piece by piece by means of quilting, and to feed the piecewise-replaced output data as input data to at least one subsequent layer of the neural network.
  • The method and the device make it possible to eliminate, or at least reduce, an effect of adversarial perturbations potentially contained in the acquired sensor data on a final output of the neural network.
  • The neural network, or a function provided by the neural network, is thereby robustified against adversarial perturbations.
  • In the method, the output data of at least one layer of the neural network are replaced piece by piece by means of quilting.
  • The layer is, in particular, an internal layer of the neural network.
  • The output data which are replaced piece by piece are fed to at least one subsequent layer of the neural network.
  • An effect of adversarial perturbations on a data flow within the neural network can hereby be eliminated or at least reduced, so that an effect on the final output of the neural network is eliminated or at least reduced.
  • An advantage of the method and the device is that the robustification is, in particular, independent of a specific embodiment of an adversarial perturbation contained in the sensor data, since the piecewise replacement by means of quilting is independent of the presence of an adversarial perturbation and, in particular, independent of a type of adversarial perturbation.
  • Another advantage is that the method and the device can easily be integrated into existing neural networks or AI functions provided with them.
  • A structure of the neural network does not have to be adapted; only the output data of the at least one layer of the neural network have to be replaced or exchanged piece by piece by means of quilting.
  • A training phase of the neural network, in particular, does not have to be adapted or repeated. In this way, effort and costs can be saved despite the robustification achieved.
  • The output data include, in particular, activations of the at least one layer of the neural network.
  • The activations are, in particular, in the form of an activation map of the layer.
  • The activation map can also be a feature map of a layer of the neural network designed as a convolution layer.
  • Quilting includes, in particular, the piecewise replacement of output data, which can also be referred to as piecewise reconstruction of the output data.
  • For this purpose, the output data are divided into several small, in particular rectangular, sections, also referred to as patches.
  • The individual subsections are compared with subsections, hereinafter referred to as output data patches, which are stored in a database, for example.
  • The comparison takes place on the basis of a distance measure which is defined, for example, via a Euclidean distance on picture element vectors.
  • For this purpose, a partial section is linearized as a vector.
  • A distance is then determined using a vector norm, for example the L2 norm.
  • The partial sections are replaced by the closest or most similar output data patch from the database. It can be provided here that a minimum distance must be maintained, or at least that the output data patch is not identical to the partial section of the output data. If the output data have a different form or a different format, the piecewise replacement takes place in an analogous manner.
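  • As a rough, non-authoritative illustration of this piecewise replacement, the following Python sketch linearizes each partial section as a vector, determines the L2 distance to every stored output data patch and substitutes the closest one, optionally enforcing a minimum distance; the function and variable names (e.g. quilt, min_distance) are chosen here purely for illustration and are not taken from the patent.

```python
import numpy as np

def quilt(sections: np.ndarray, patch_db: np.ndarray, min_distance: float = 0.0) -> np.ndarray:
    """Replace each partial section by the closest output data patch (L2 distance).

    sections : array of shape (n_sections, patch_dim), each row a linearized partial section
    patch_db : array of shape (n_patches, patch_dim), linearized output data patches
    min_distance : optional lower bound so that a patch identical to the section is skipped
    """
    replaced = np.empty_like(sections)
    for i, section in enumerate(sections):
        dists = np.linalg.norm(patch_db - section, axis=1)   # L2 norm to every stored patch
        order = np.argsort(dists)                            # closest patches first
        # pick the closest patch that still respects the minimum-distance rule
        chosen = next((j for j in order if dists[j] >= min_distance), order[0])
        replaced[i] = patch_db[chosen]
    return replaced

# toy usage: 4 sections of dimension 8*8, database of 100 patches
rng = np.random.default_rng(0)
sections = rng.normal(size=(4, 64))
patch_db = rng.normal(size=(100, 64))
print(quilt(sections, patch_db).shape)  # (4, 64)
```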
  • The at least one layer can in principle be any layer in the neural network. It can also be provided that output data from several layers of the neural network are replaced piece by piece by means of quilting. For the output data of each of the layers, in particular, output data patches generated and provided individually for each layer are used.
  • The at least one layer is in particular an internal layer of the neural network.
  • The sensor data of the at least one sensor can in principle be one-dimensional or multidimensional, in particular two-dimensional.
  • The sensor data can be two-dimensional camera images from a camera and/or two-dimensional or three-dimensional lidar data from a lidar or radar sensor.
  • A sensor can in particular be a camera, a lidar sensor, a radar sensor, an ultrasonic sensor or some other sensor suitable for detecting the surroundings.
  • An adversarial perturbation is, in particular, a deliberately made disruption of the input data of a neural network, for example provided in the form of sensor data, in which semantic content in the input data is not changed, but the disruption leads to the neural network inferring an incorrect result, that is, for example, a misclassification or an incorrect semantic segmentation of the input data or an incorrect detection or localization of objects in it.
  • A neural network is in particular a deep neural network, in particular a convolutional neural network (CNN).
  • The neural network is trained for a specific perception function, for example for the perception of pedestrians or other objects in captured camera images.
  • The method and the device are applied in particular to a (fully) trained neural network.
  • The method can be carried out as a computer-implemented method.
  • The method can be carried out by means of a data processing device.
  • The data processing device comprises in particular at least one computing device and at least one storage device.
  • A computer program is also created, comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method steps of the disclosed method in accordance with any of the described embodiments.
  • A data carrier signal is also created that transmits the aforementioned computer program.
  • The method includes acquiring the sensor data by means of the at least one sensor.
  • A final output of the neural network is fed to at least one control device, in particular of a vehicle.
  • The control device can, for example, provide a function for automated driving of a vehicle and/or for driver assistance of the vehicle and/or for environment detection and/or environment perception.
  • The control device can, for example, control or regulate longitudinal and lateral guidance of the vehicle.
  • The method and the device can also be used in other areas of application, for example in industrial production or in medical robots.
  • A vehicle is in particular a motor vehicle.
  • A vehicle can also be another land, rail, water, air or space vehicle, for example a drone or an air taxi.
  • Parts of the device, in particular the computing device, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor. However, it can also be provided that parts are designed individually or combined as an application-specific integrated circuit (ASIC).
  • The neural network provides at least one function for automated or partially automated driving of a vehicle and/or for perception of the surroundings, wherein a final output of the neural network is fed to at least one control device of the vehicle.
  • The control device of the vehicle is supplied with more reliable and safer input data, so that a function provided by the control device can also be provided more reliably and safely.
  • The control device can, for example, provide an actuator control for an actuator of the vehicle or carry out (higher-value) further processing of the final output of the neural network.
  • A database with output data patches is provided for quilting, the output data patches being or having been generated on the basis of output data of the at least one layer of the neural network which were obtained with the aid of undisturbed input data.
  • The undisturbed input data are selected and compiled in such a way that they contain no adversarial perturbations with certainty, or at least with an increased probability.
  • The output data inferred from the at least one layer of the neural network on the basis of the undisturbed input data are then broken down into subsections, each subsection forming an output data patch.
  • A size of the partial sections is selected depending on a specific application scenario of the method and the device.
  • The output data patches generated in this way are stored in the database.
  • The database is provided, for example, by means of a storage device that can be accessed by the computing device.
  • The output data patches can be stored in the database, for example, linearized as vectors.
  • Output data that are provided by the at least one layer of the neural network are then also broken down into partial sections and linearized to form vectors.
  • The respective vectors of the partial sections can then be compared with the vectors of the output data patches stored in the database by means of a vector norm, for example the L2 norm.
  • A partial section of the output data is then replaced by the most similar or closest output data patch, based on the determined distances.
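  • As an illustration of how such a database could be generated from undisturbed input data and then used for the piecewise replacement, the following sketch works on two-dimensional activation maps with square, non-overlapping patches; all names (split_into_patches, build_patch_db, replace_with_nearest) and the patch size are illustrative assumptions, not specifications from the patent.

```python
import numpy as np

def split_into_patches(activation: np.ndarray, size: int = 8) -> np.ndarray:
    """Break a 2D activation map into non-overlapping size x size sections, linearized as vectors."""
    h, w = activation.shape
    patches = [activation[r:r + size, c:c + size].ravel()
               for r in range(0, h - size + 1, size)
               for c in range(0, w - size + 1, size)]
    return np.stack(patches)

def build_patch_db(clean_activations, size: int = 8) -> np.ndarray:
    """Collect linearized patches from activations obtained with undisturbed input data."""
    return np.concatenate([split_into_patches(a, size) for a in clean_activations])

def replace_with_nearest(activation: np.ndarray, patch_db: np.ndarray, size: int = 8) -> np.ndarray:
    """Rebuild an activation map patch by patch from its nearest database entries (L2 norm)."""
    out = activation.copy()
    h, w = activation.shape
    for r in range(0, h - size + 1, size):
        for c in range(0, w - size + 1, size):
            vec = activation[r:r + size, c:c + size].ravel()
            nearest = patch_db[np.argmin(np.linalg.norm(patch_db - vec, axis=1))]
            out[r:r + size, c:c + size] = nearest.reshape(size, size)
    return out

# toy usage: database from two "clean" 32x32 maps, then reconstruct a third map
rng = np.random.default_rng(1)
db = build_patch_db([rng.normal(size=(32, 32)) for _ in range(2)])
print(replace_with_nearest(rng.normal(size=(32, 32)), db).shape)  # (32, 32)
```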
  • The output data are processed in the form of two-dimensional image data, with image sections of the two-dimensional image data being replaced piece by piece by means of the quilting.
  • For this purpose, activation maps, which the at least one layer of the neural network generates as output data, are treated as two-dimensional image data or images.
  • The piecewise replacement carried out for quilting then takes place over image sections, in particular rectangular ones, which are replaced.
  • The image sections can, for example, have a size of 8 × 8 picture elements (pixels).
  • The output data patches then also have a size of 8 × 8 picture elements each.
  • At least one item of identification information is obtained, with the piecewise replacement during quilting additionally taking into account the at least one item of identification information obtained.
  • Identification information can also be referred to as a tag or label.
  • The entries in the database, that is to say the output data patches stored therein, can be marked with additional information so that they can be found more quickly later.
  • The database is indexed with the help of a hash function so that a search in the database can be accelerated, since the number of entries in the database to be compared with the output data of the at least one layer of the neural network is already reduced via a preselection.
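  • One conceivable, purely illustrative realization of such a hash-based preselection is to bucket the stored patch vectors by a coarse quantization of a few of their components, so that only the matching bucket has to be searched exactly; the hash function below is deliberately simplistic and merely stands in for whatever indexing scheme is actually used.

```python
import numpy as np
from collections import defaultdict

def coarse_hash(vec: np.ndarray, step: float = 0.5) -> tuple:
    """Very coarse quantization of the first few vector components, used as a bucket key."""
    return tuple(np.floor(vec[:4] / step).astype(int))

def build_index(patch_db: np.ndarray) -> dict:
    """Group database entries by their hash bucket to narrow down later searches."""
    index = defaultdict(list)
    for i, patch in enumerate(patch_db):
        index[coarse_hash(patch)].append(i)
    return index

def preselect(index: dict, patch_db: np.ndarray, query: np.ndarray) -> np.ndarray:
    """Return only the candidates in the query's bucket; fall back to the full database if empty."""
    candidates = index.get(coarse_hash(query), [])
    return patch_db[candidates] if candidates else patch_db

rng = np.random.default_rng(2)
db = rng.normal(size=(1000, 64))
idx = build_index(db)
query = rng.normal(size=64)
print(len(preselect(idx, db, query)), "candidates instead of", len(db))
```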
  • The identification information obtained is or is derived from context information of an environment in which the sensor data of the at least one sensor are or were recorded.
  • Context information can, for example, be a geographical coordinate (e.g. a GPS coordinate), a time of day and/or season, a month, a weekday, weather (sun, rain, fog, snow, etc.) and/or a traffic context (city, country, motorway, pedestrian zone, country road, main road, secondary road, etc.).
  • In this way, the quality of the output data that are replaced piece by piece can be increased, since in the case of piecewise replacement a context in which the output data were generated by the at least one layer of the neural network can be taken into account.
  • For this purpose, output data patches stored in the database can be marked ("tagged") with at least one item of context information.
  • A preselection can then be made before the search in the database, so that only entries or output data patches that partially or completely match the at least one item of identification information, or that have the at least one item of context information, are considered during the search. This means that the piecewise replacement can be accelerated.
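  • A tag-based preselection of this kind could, purely as a sketch, look as follows: every stored patch carries a set of context tags (e.g. "rain", "motorway"), and only patches whose tags match the current identification information are handed to the nearest-neighbour search; the tag values and helper names are illustrative assumptions, not part of the patent.

```python
import numpy as np

# each entry: (linearized patch vector, set of context tags it was recorded under)
rng = np.random.default_rng(3)
tagged_db = [(rng.normal(size=64), {"rain", "motorway"}),
             (rng.normal(size=64), {"sun", "city"}),
             (rng.normal(size=64), {"fog", "motorway"})]

def preselect_by_tags(db, required_tags: set):
    """Keep only patches whose tags contain all required identification/context tags."""
    hits = [vec for vec, tags in db if required_tags <= tags]
    return hits if hits else [vec for vec, _ in db]   # fall back to the full database

def nearest(query: np.ndarray, candidates) -> np.ndarray:
    """Closest candidate patch under the L2 norm."""
    return min(candidates, key=lambda v: np.linalg.norm(v - query))

query = rng.normal(size=64)
best = nearest(query, preselect_by_tags(tagged_db, {"motorway"}))
print(best.shape)  # (64,)
```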
  • If the output data comprise several channels, the channels can be replaced piece by piece either dependently on one another or independently of one another. Either all channels can be treated together as one datum and replaced piece by piece, whereby this replacement can take into account the spatial correlation between the channels, or the channels can be replaced independently of one another, with each channel being broken down individually and reconstructed piece by piece using output data patches from its own database.
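  • Both variants can be sketched as follows: in the joint variant all channels of a patch location are concatenated into one vector before the nearest-neighbour lookup, so that the correlation between channels enters the distance, while in the independent variant each channel is reconstructed from its own database; the names and shapes used here are again illustrative only.

```python
import numpy as np

def nearest(vec: np.ndarray, db: np.ndarray) -> np.ndarray:
    """Closest database entry under the L2 norm."""
    return db[np.argmin(np.linalg.norm(db - vec, axis=1))]

def quilt_channels_jointly(patch: np.ndarray, joint_db: np.ndarray) -> np.ndarray:
    """Treat all channels of one patch location as a single datum (channels concatenated),
    so the replacement respects the correlation between channels."""
    c, n = patch.shape                                 # (channels, values per channel)
    return nearest(patch.ravel(), joint_db).reshape(c, n)

def quilt_channels_independently(patch: np.ndarray, per_channel_dbs) -> np.ndarray:
    """Replace each channel on its own, using a separate database per channel."""
    return np.stack([nearest(patch[i], per_channel_dbs[i]) for i in range(patch.shape[0])])

# toy usage: a patch with 4 channels of 64 values each
rng = np.random.default_rng(4)
patch = rng.normal(size=(4, 64))
joint_db = rng.normal(size=(50, 4 * 64))               # entries hold all channels at once
per_channel_dbs = [rng.normal(size=(50, 64)) for _ in range(4)]
print(quilt_channels_jointly(patch, joint_db).shape)              # (4, 64)
print(quilt_channels_independently(patch, per_channel_dbs).shape)  # (4, 64)
```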
  • The neural network provides a function for the automated driving of a vehicle and/or for driver assistance of the vehicle and/or for surroundings detection and/or surroundings perception.
  • A vehicle is in particular a motor vehicle.
  • The vehicle can also be another land, air, water, rail or space vehicle.
  • The method can also be used in other areas, for example in industrial production, e.g. in production robots that have to process sensor data, or in medical robots.
  • A method for operating an assistance system for a vehicle is also provided, wherein at least one function for automated or partially automated driving of the vehicle and/or for perception of the surroundings is provided by means of the assistance system, wherein sensor data are acquired by means of at least one sensor, wherein a method according to one of the described embodiments is carried out, and wherein a final output of the neural network is fed to at least one control device of the vehicle.
  • The control device can, for example, provide an actuator control for an actuator of the vehicle or carry out further processing of the final output of the neural network.
  • An assistance system for a vehicle is also provided, comprising at least one sensor set up to acquire sensor data, and a device according to one of the described embodiments, the assistance system being set up to provide at least one function for automated or partially automated driving of the vehicle and/or for perception of the surroundings, to feed the acquired sensor data to the device, and to feed a final output of the neural network provided by means of the device to at least one control device of the vehicle.
  • The assistance system can also include the control device.
  • A vehicle is also created, in particular comprising at least one device according to one of the described embodiments or at least one assistance system according to one of the described embodiments.
  • FIG. 1 shows a schematic representation of an embodiment of the device for robustifying a neural network against adversarial perturbations and an embodiment of an assistance system.
  • FIG. 2 shows a schematic representation to illustrate quilting (prior art).
  • FIG. 3 shows a schematic representation to illustrate an embodiment of the method.
  • FIG. 1 shows a schematic representation of an embodiment of the device 1 for robustifying a neural network 15 against adversarial perturbations.
  • The device 1 comprises a computing device 2 and a storage device 3.
  • The device 1 carries out the method described in this disclosure for robustifying the neural network 15 against adversarial perturbations.
  • The device 1 can in particular be used in a vehicle 50 in order to robustify a neural network 15, or a function provided by the neural network 15, against adversarial perturbations.
  • The vehicle 50 is in particular a motor vehicle.
  • The vehicle 50 can include an assistance system 200, the assistance system 200 including the device 1.
  • The assistance system 200 further comprises at least one sensor 51 and a control device 52.
  • The embodiment is shown by way of example in connection with a motor vehicle. In principle, however, the method and the device 1 as well as the assistance system 200 can also be used in other vehicles 50.
  • The neural network 15 in particular provides at least one function for automated or partially automated driving of the vehicle 50 and/or for perception of the surroundings.
  • The assistance system 200, in particular at least partially by means of the neural network 15, provides at least one function for automated or partially automated driving of the vehicle 50 and/or for perception of the surroundings.
  • Parts of the device 1, in particular the computing device 2, can be designed individually or collectively as a combination of hardware and software, for example as program code that is executed on a microcontroller or microprocessor.
  • The computing device 2 provides the neural network 15, that is to say it provides a functionality of the neural network 15 and in particular carries out the necessary arithmetic operations for this purpose.
  • A structure and parameters of the neural network 15 are stored in the storage device 3, for example.
  • It can also be provided that the computing device 2 merely accesses a neural network 15 provided in some other way.
  • Acquired sensor data 20 of sensor 51, for example a camera or a lidar sensor of vehicle 50, are fed to computing device 2.
  • The computing device 2 receives the acquired sensor data 20 and feeds them to the neural network 15 as input data. Furthermore, the computing device 2 replaces output data of at least one layer of the neural network 15 piece by piece by means of quilting. The output data replaced piece by piece are then fed as input data to at least one subsequent layer of the neural network 15.
  • A final output 30 of the neural network 15 is output by the computing device 2, for example in the form of a digital data packet.
  • The final output 30 is fed, for example, to the control device 52 of the assistance system 200 or of the vehicle 50, which, for example, starting from the final output 30, controls or regulates a longitudinal and/or lateral guidance of the vehicle 50.
  • The output data replaced piece by piece have the same format as the (original) output data, so that it is possible to insert the method or the device 1 into already existing applications of neural networks 15 and use them there.
  • A database 40 with output data patches 60 is provided for quilting, the output data patches 60 being or having been generated on the basis of output data of the at least one layer of the neural network 15, which were obtained with the aid of undisturbed input data.
  • The undisturbed input data are based, for example, on trustworthy training data with which the neural network 15 was trained.
  • The identification information 10 is fed to the computing device 2, for example, and can be used to preselect output data patches 60 of the database 40 used in quilting, so that the quilting can be accelerated.
  • The identification information 10 obtained is or is derived from context information 11 of an environment in which the sensor data 20 of the sensor 51 are or have been recorded.
  • Context information can, for example, be a geographical coordinate (e.g. a GPS coordinate), a time of day and/or season, a month, a weekday, weather (sun, rain, fog, snow, etc.) and/or a traffic context (city, country, motorway, pedestrian zone, country road, main road, secondary road, etc.).
  • The context information 11 can, for example, be acquired by means of a context sensor system (not shown) of the vehicle 50 set up for this purpose (e.g. a rain sensor, a temperature sensor, etc.), but can also be provided by a controller of the vehicle 50 and, for example, be called up and/or provided via a controller area network (CAN) bus (e.g. vehicle speed, vehicle orientation, etc.).
  • FIG. 2 shows a schematic representation to illustrate quilting from the prior art, using the example of a camera image 22.
  • Sensor data 20, in the present case a camera image 22, are divided into partial sections 23.
  • For each partial section 23, a sensor data patch 61 which, in terms of a distance measure, has the smallest distance to the respective partial section 23 is searched for in a database 40 as part of a quilting step 100.
  • A sensor data patch 61 is an image section which has the size of the partial sections 23, that is to say the same number of picture elements (pixels).
  • The distance measure is, for example, the L2 norm, which is applied to vectors that have been generated by linearizing the partial sections 23 or image sections.
  • Each partial section 23 or image section is then replaced by the respective sensor data patch 61 with the smallest distance therefrom. It can be provided here that a minimum distance must be maintained. In this way, all partial sections 23 or image sections are replaced by sensor data patches 61 from the database 40. Replaced partial sections 24 are created which, taken together, form the replaced sensor data 21 or a replaced camera image 25.
  • FIG. 3 shows a schematic illustration to clarify an embodiment of the method for robustifying a neural network 15 against adversarial perturbations.
  • The neural network 15 comprises several convolution layers 16-x and fully connected layers 17-x.
  • Acquired sensor data 20, for example a camera image 22, are fed to the neural network 15 as input data.
  • The last convolution layer 16-4 supplies an activation map 19 as output data 18, which can also be referred to as a feature map and can in particular be understood as an image with many image channels (i.e. filter channels).
  • The output data 18, or the activation map 19, are replaced piece by piece in a quilting step 100 by means of quilting.
  • The quilting takes place here in principle in the same way as described in connection with FIG. 2, with the difference that it is not the sensor data 20 that are replaced piece by piece by means of quilting, but rather the output data 18. There is therefore an intervention in a data flow within the neural network 15.
  • The quilting takes place in that partial sections 23 of the output data 18, that is to say image sections of the images of the activation map 19, are replaced by output data patches 60 which, with regard to a distance measure, have the smallest distance from the respective partial sections 23 or image sections.
  • The output data patches 60 are stored in a database 40 and are retrieved from it. Since, in particular, several filter channels are included in the activation map 19, the quilting step 100 can also be referred to as multi-channel quilting.
  • The replaced partial sections 24, or replaced image sections, together form the replaced output data 28 or the replaced activation map 29.
  • The replaced output data 28, or the replaced activation map 29, are fed to a subsequent layer 17-1 as input data.
  • In the replaced output data 28 or the replaced activation map 29, an influence or an effect of an adversarial perturbation is eliminated or at least reduced compared to the original output data 18 or the original activation map 19.
  • The neural network 15 provides a final output 30 which can contain, for example, an object recognition or a semantic segmentation of the acquired sensor data 20.
  • The neural network 15 provides a function for automated driving of a vehicle and/or for driver assistance of the vehicle and/or for environment detection and/or environment perception.
  • The final output 30 can, for example, be fed to a control device of the vehicle which, for example, controls or regulates a longitudinal and/or lateral guidance of the vehicle, plans a trajectory or carries out a higher-value evaluation of the final output 30.
  • The method is shown by way of example for the output data 18 of a single layer 16-4 of the neural network 15. However, it can be provided that the method is carried out for further layers 16-x, 17-x (see the sketch below). If the output data of several layers 16-x, 17-x of the neural network 15 are replaced piece by piece by means of quilting, output data patches 60 generated or provided individually for each of the layers 16-x, 17-x are used.
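  • How the piecewise replacement could be hooked into the data flow of an already trained network without changing its structure can be sketched with a forward hook; PyTorch is used here only as an assumed framework (the patent does not prescribe one), quilt_activation stands for any per-layer quilting routine such as the ones sketched above, and each hooked layer would use its own patch database.

```python
import torch
import torch.nn as nn

def quilt_activation(activation: torch.Tensor, patch_db: torch.Tensor, size: int = 8) -> torch.Tensor:
    """Illustrative quilting step: replace 8x8 spatial sections of each channel by the
    nearest database patch (L2 norm). patch_db has shape (n_patches, size*size)."""
    out = activation.clone()
    b, c, h, w = activation.shape
    for r in range(0, h - size + 1, size):
        for col in range(0, w - size + 1, size):
            sec = activation[:, :, r:r + size, col:col + size].reshape(b * c, -1)
            d = torch.cdist(sec, patch_db)                 # pairwise L2 distances to all patches
            nearest = patch_db[d.argmin(dim=1)]            # closest patch per section and channel
            out[:, :, r:r + size, col:col + size] = nearest.reshape(b, c, size, size)
    return out

# attach the quilting step after an internal layer of an existing, trained network
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10))
patch_db = torch.randn(100, 64)                            # per-layer database of output data patches

def hook(module, inputs, output):
    # returning a tensor from a forward hook replaces the layer's output downstream
    return quilt_activation(output, patch_db)

model[2].register_forward_hook(hook)                       # quilting after the second convolution
print(model(torch.randn(1, 3, 32, 32)).shape)              # torch.Size([1, 10])
```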
  • The database 40 includes, in particular, output data patches 60 for each individual layer.
  • The method can be embedded in a method for operating an assistance system.
  • LIST OF REFERENCE NUMERALS
    1 device
    2 computing device
    3 storage device
    10 identification information
    11 context information
    15 neural network
    16-x convolution layer
    17-x fully connected layer
    18 output data
    19 activation map
    20 sensor data
    21 replaced sensor data
    22 camera image
    23 partial section
    24 replaced partial section
    25 replaced camera image
    28 replaced output data
    29 replaced activation map
    30 final output
    40 database
    50 vehicle
    51 sensor
    52 control device
    60 output data patch
    61 sensor data patch
    100 quilting step
    200 assistance system

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for making a neural network (15) more robust against adversarial perturbations, wherein acquired sensor data (20) of at least one sensor (51) are fed to the neural network (15) as input data, output data (18) of at least one layer (16-x, 17-x) of the neural network (15) are replaced piece by piece by means of quilting, and the piecewise-replaced output data (28) are fed as input data to at least one subsequent layer (16-x, 17-x) of the neural network (15). The invention also relates to a device (1) for making a neural network (15) more robust against adversarial perturbations, to a method for operating an assistance system (200) for a vehicle (50) and to an assistance system (200) for a vehicle (50), as well as to a vehicle (50), a computer program and a data carrier signal.
PCT/EP2020/085651 2019-12-17 2020-12-10 Procédé et dispositif permettant de fabriquer un réseau neuronal plus robuste par rapport à des perturbations antagonistes WO2021122339A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080087609.0A CN114787650A (zh) 2019-12-17 2020-12-10 用于使神经网络针对对抗性干扰鲁棒化的方法和装置
EP20829548.5A EP4078239A1 (fr) 2019-12-17 2020-12-10 Procédé et dispositif permettant de fabriquer un réseau neuronal plus robuste par rapport à des perturbations antagonistes

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019219925.9A DE102019219925A1 (de) 2019-12-17 2019-12-17 Verfahren und Vorrichtung zum Robustifizieren eines Neuronalen Netzes gegen adversariale Störungen
DE102019219925.9 2019-12-17

Publications (1)

Publication Number Publication Date
WO2021122339A1 true WO2021122339A1 (fr) 2021-06-24

Family

ID=74068255

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/085651 WO2021122339A1 (fr) 2019-12-17 2020-12-10 Procédé et dispositif permettant de fabriquer un réseau neuronal plus robuste par rapport à des perturbations antagonistes

Country Status (4)

Country Link
EP (1) EP4078239A1 (fr)
CN (1) CN114787650A (fr)
DE (1) DE102019219925A1 (fr)
WO (1) WO2021122339A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022205084B3 (de) 2022-05-20 2023-10-12 Volkswagen Aktiengesellschaft Verfahren, Computerprogramm und Vorrichtung zur Umfeldwahrnehmung im Fahrzeug sowie entsprechendes Fahrzeug

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
ALEXEI A EFROS ET AL: "Image quilting for texture synthesis and transfer", COMPUTER GRAPHICS. SIGGRAPH 2001. CONFERENCE PROCEEDINGS. LOS ANGELES, CA, AUG. 12 - 17, 2001; [COMPUTER GRAPHICS PROCEEDINGS. SIGGRAPH], NEW YORK, NY : ACM, US, 1 August 2001 (2001-08-01), pages 341 - 346, XP058253454, ISBN: 978-1-58113-374-5, DOI: 10.1145/383259.383296 *
ANONYMOUS: "Textursynthese - Wikipedia", 21 March 2021 (2021-03-21), XP055787992, Retrieved from the Internet <URL:https://de.wikipedia.org/wiki/Textursynthese> [retrieved on 20210321] *
CHUAN GUO ET AL.: "Countering Adversarial Images Using Input Transformations", ARXIV:1711.00117V3 [CS.CV], 25 January 2018 (2018-01-25), Retrieved from the Internet <URL:https://arxiv.org/pdf/1711.00117.pdf>
CHUAN GUO ET AL: "Countering Adversarial Images using Input Transformations", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 31 October 2017 (2017-10-31), XP081307843 *
DUBEY ABHIMANYU ET AL: "Defense Against Adversarial Images Using Web-Scale Nearest-Neighbor Search", 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), IEEE, 15 June 2019 (2019-06-15), pages 8759 - 8768, XP033687485, DOI: 10.1109/CVPR.2019.00897 *
KENNETH T CO ET AL: "Procedural Noise Adversarial Examples for Black-Box Attacks on Deep Convolutional Networks", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLINE LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 30 September 2018 (2018-09-30), XP081561740, DOI: 10.1145/3319535.3345660 *

Also Published As

Publication number Publication date
CN114787650A (zh) 2022-07-22
DE102019219925A1 (de) 2021-06-17
EP4078239A1 (fr) 2022-10-26

Similar Documents

Publication Publication Date Title
EP3523168B1 (fr) Procédé et dispositif de régulation de la dynamique de conduite pour un véhicule automobile
EP3824247A1 (fr) Procédé et système destiné à déterminer une position d&#39;un véhicule
DE102016210534A1 (de) Verfahren zum Klassifizieren einer Umgebung eines Fahrzeugs
DE102012107886A1 (de) Verfahren zur elektronischen Erkennung von Verkehrszeichen
EP3789926A1 (fr) Procédé de détection d&#39;une perturbation adversaire dans des données d&#39;entrée d&#39;un réseau de neurones
DE102019008093A1 (de) Verfahren zum Fusionieren von Sensordaten einer Vielzahl von Erfassungseinrichtungen mittels eines spärlichen Belegungsgitters, sowie Fahrerassistenzsystem
WO2020061603A1 (fr) Procédé et dispositif d&#39;analyse d&#39;un flux de données de capteur et procédé de guidage d&#39;un véhicule
DE102016215249A1 (de) Verfahren und Vorrichtung zum Unterstützen eines Fahrerassistenzsystems in einem Kraftfahrzeug
EP4078238A1 (fr) Procédé et dispositif pour rendre des données de capteur plus robustes à l&#39;égard de perturbations indésirables
DE102018113344A1 (de) Verfahren zum Lokalisieren eines Kraftfahrzeugs in einer Umgebung nach einer Lernfahrt; Steuereinrichtung sowie Fahrassistenzsystem
DE102018133457B4 (de) Verfahren und System zum Bereitstellen von Umgebungsdaten
EP4078239A1 (fr) Procédé et dispositif permettant de fabriquer un réseau neuronal plus robuste par rapport à des perturbations antagonistes
DE102018222202A1 (de) Verfahren und Vorrichtung zum Betreiben eines Maschinenlernmodells
DE102017128082A1 (de) Meta-Architektur-Design für ein CNN-Netzwerk
DE102016225631A1 (de) Verfahren und Vorrichtung zum Entfernen von zumindest einer Landmarkenposition einer Landmarke in einer Radarkarte
DE102019201690A1 (de) Verfahren und Assistenzsystem zur Umfeldüberwachung eines Ego-Fahrzeugs
EP4049186A1 (fr) Procédé pour robustifier un réseau neuronal vis-à-vis de perturbations antagonistes
EP4078237A1 (fr) Procédé et appareil de reconnaissance de suppression d&#39;un domaine de données de capteur à partir d&#39;un domaine de données de référence
EP4053593A1 (fr) Traitement des données de capteur dans un moyen de transport
DE102020128952A1 (de) Verfahren und Assistenzeinrichtung zur zweistufigen bildbasierten Szenenerkennung und Kraftfahrzeug
DE112019004315T5 (de) Kartenerzeugungsvorrichtung und kartenerzeugungsverfahren
DE102020200875A1 (de) Verfahren zum Bereitstellen von Sensordaten durch eine Sensorik eines Fahrzeugs
DE102019207580A1 (de) Verfahren zum Betreiben eines tiefen Neuronalen Netzes
EP2696310A1 (fr) Procédé destiné à identifier un bord de route
DE102019219924B4 (de) Verfahren und Vorrichtung zum Erzeugen und Bereitstellen einer Datenbank mit darin hinterlegten Sensordatenpatches zur Verwendung beim Quilting

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20829548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020829548

Country of ref document: EP

Effective date: 20220718