DE102017207442A1 - Method and device for classifying objects in the environment of a motor vehicle - Google Patents

Method and device for classifying objects in the environment of a motor vehicle

Info

Publication number
DE102017207442A1
DE102017207442A1 (application DE102017207442.6A)
Authority
DE
Germany
Prior art keywords
radar
signal
class
environment
motor vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102017207442.6A
Other languages
German (de)
Inventor
Marc-Michael Meinecke
Michael Heuer
Mikael Johansson
Volker Schomerus
Tom Nyström
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Scania CV AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG, Scania CV AB filed Critical Volkswagen AG
Priority to DE102017207442.6A priority Critical patent/DE102017207442A1/en
Publication of DE102017207442A1 publication Critical patent/DE102017207442A1/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/28Details of pulse systems
    • G01S7/285Receivers
    • G01S7/292Extracting wanted echo-signals
    • G01S7/2923Extracting wanted echo-signals based on data belonging to a number of consecutive radar periods
    • G01S7/2927Extracting wanted echo-signals based on data belonging to a number of consecutive radar periods by deriving and controlling a threshold value
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • G01S7/4004Means for monitoring or calibrating of parts of a radar system
    • G01S7/4021Means for monitoring or calibrating of parts of a radar system of receivers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411Identification of targets based on measurements of radar reflectivity
    • G01S7/412Identification of targets based on measurements of radar reflectivity based on a comparison between measured values and known or stored values
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K9/00791Recognising scenes perceived from the perspective of a land vehicle, e.g. recognising lanes, obstacles or traffic signs on road scenes
    • G06K9/00805Detecting potential obstacles
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/62Methods or arrangements for recognition using electronic means
    • G06K9/6267Classification techniques
    • G06K9/6268Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
    • G06K9/627Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches based on distances between the pattern to be recognised and training or reference patterns
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87Combinations of radar systems, e.g. primary radar and secondary radar

Abstract

The invention relates to a method for classifying objects in the environment of a motor vehicle, comprising the following method steps: detecting an environment of the motor vehicle by means of at least one radar sensor (2), wherein radar echoes (6) of a radar pulse caused by objects in the environment are detected; evaluating a radar signal (7) provided by the radar sensor (2) on the basis of the detected radar echoes (6) by means of a radar evaluation device (3), wherein a signal section (8) corresponding to an object is identified in the radar signal (7), and wherein a distance (9), a speed (10) and/or an azimuth angle (11) corresponding to the object are determined; classifying the signal section (8) by means of a classification device (4), wherein the signal section (8) is classified by a deep neural network (12) that assigns at least one object class (13) to the signal section (8), and wherein the deep neural network (12) determines a class probability measure (14) for each object class (13) assigned to the signal section (8); outputting the at least one assigned object class (13) and the respectively associated class probability measures (14) by means of an output device (5). The invention further relates to an associated device (1).

Description

  • The invention relates to a method and a device for classifying objects in the environment of a motor vehicle.
  • Modern motor vehicles are equipped with a variety of sensor systems for environment detection. For example, radar sensors are capable of determining a distance, a speed and an azimuth angle of objects in the surroundings of the motor vehicle. Such radar sensors already form the basis for various driver assistance systems and can also be used in combination with other sensors.
  • Furthermore, methods for image analysis based on artificial neural networks are known. Such a method is described, for example, in EP 1 674 883 A2. A special class of artificial neural networks are deep neural networks. Owing to their extensive internal structure, deep neural networks exhibit more stable learning behavior.
  • The object of the invention is to provide a method and a device for classifying objects in the environment of a motor vehicle in which the classification of the objects is improved.
  • The object is achieved by a method having the features of patent claim 1 and a device having the features of patent claim 7. Advantageous embodiments will be apparent from the dependent claims.
  • In particular, a method is provided for classifying objects in the environment of a motor vehicle, comprising the following method steps: detecting an environment of the motor vehicle by means of at least one radar sensor, wherein radar echoes of a radar pulse caused by objects in the environment are detected; evaluating a radar signal provided by the radar sensor on the basis of the detected radar echoes by means of a radar evaluation device, wherein a signal section corresponding to an object is identified in the radar signal, and wherein a distance, a speed and/or an azimuth angle corresponding to the object are determined; classifying the signal section by means of a classification device, wherein the signal section is classified by a deep neural network that assigns at least one object class to the signal section, and wherein the deep neural network determines a class probability measure for each object class assigned to the signal section; outputting the at least one assigned object class and the respectively associated class probability measures by means of an output device.
  • Furthermore, a device for classifying objects in the environment of a motor vehicle is provided, comprising: at least one radar sensor for detecting an environment of the motor vehicle, wherein the radar sensor detects radar echoes of a radar pulse caused by objects in the environment; a radar evaluation device for evaluating a radar signal provided by the radar sensor on the basis of the detected radar echoes, wherein the radar evaluation device is designed to identify a signal section corresponding to an object in the radar signal and to determine a distance, a speed and/or an azimuth angle corresponding to the object; a classification device, wherein the classification device is designed to classify the signal section by means of a deep neural network by assigning at least one object class to the signal section, and to determine, by means of the deep neural network, a class probability measure for each object class assigned to the signal section; and an output device designed to output the at least one assigned object class and the respectively associated class probability measures.
  • The basic idea of the invention is to detect objects in the environment of a motor vehicle by means of a radar sensor and to evaluate in detail the radar signal provided by the radar sensor. For this purpose, a radar evaluation device determines from the radar signal a distance, a speed and/or an azimuth angle corresponding to the object, the distance, the speed and/or the azimuth angle each being determined relative to the radar sensor or to the motor vehicle. In addition, the radar evaluation device identifies a signal section corresponding to the object in the radar signal. This signal section is fed to a classification device, which uses a deep neural network to assign object classes. The deep neural network also provides a class probability measure for each object class assigned to a signal section and thus to the corresponding object. An object is therefore assigned at least one object class with an associated class probability measure.
  • The class probability measure here forms a confidence value or a confidence interval, which is a measure of a plausibility or reliability of the respective assignment of an object class to an object.
  • A signal section can be understood here as a radar signature of the associated object. Such a radar signature corresponds to the time profile of the radar echo of the radar pulse caused by the object; alternatively, the radar signature can be taken from the radar power spectrum. The radar signature is composed of all the radar echoes reflected by the object at its individual radar echo centers. Depending on the size, shape, surface texture and/or surface condition of the object, this results in characteristic radar signatures for different objects. For example, the radar signatures of pedestrians differ significantly from those of a motor vehicle or a truck. These differences are recognized by the deep neural network and exploited for classification.
  • In the simplest case, the class probability measure indicates the probability with which a detected object belongs to a certain object class. Since the assignment of an object class will generally not be unambiguous, the method and the device yield, for a detected object, a set of object classes with associated probability values. For example, the object class "motor vehicle" may be assigned to a detected object with 90% probability, and the object classes "tree" and "truck" each with 5% probability.
  • In other words, in this case the deep neural network provides, for each signal section, the probabilities p_i that this signal section was caused by a particular object class i:
    Object class 1: p_1
    Object class 2: p_2
    ...
    Object class n: p_n
    with p_1 + p_2 + ... + p_n = 1.
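Such a probability assignment with p_1 + ... + p_n = 1 is exactly what a softmax output layer of a neural network produces. The following Python sketch shows the mapping; the logit values and class names are illustrative assumptions, not taken from the patent:

```python
import math

def classify_signal_section(logits, class_names):
    """Turn raw network outputs (logits) into class probabilities.

    A softmax layer guarantees p_1 + p_2 + ... + p_n = 1, matching the
    scheme above. Logits and class names are illustrative only.
    """
    m = max(logits)  # subtract the maximum for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return dict(zip(class_names, (e / total for e in exps)))

# Illustrative call: one signal section, three candidate object classes.
result = classify_signal_section([4.1, 1.2, 1.2],
                                 ["motor vehicle", "tree", "truck"])
```

The resulting dictionary corresponds to the set of object classes with associated probability values described above.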
  • The advantage of the method and of the device is that a radar sensor yields not only information about the distance, the speed and/or the azimuth angle of an object in the environment of the motor vehicle, but also a statement about what kind of object it is (e.g. vehicle, cyclist, pedestrian, tree, etc.). This enables a variety of applications, for example by additionally exploiting this information in driver assistance systems. For example, a parking assistant can detect whether an object in the vicinity of the motor vehicle is another motor vehicle, a curb or a bollard, and make use of this information during automatic parking.
  • After the method has been carried out, the output device thus provides, for each object detected in the environment, an assignment to at least one object class as well as a distance, a speed and/or an azimuth angle.
  • In one embodiment, it is provided that an object in the environment is detected, evaluated and classified several times, and that the results of these classifications are combined on the basis of the associated class probability measures. An object can thus be captured multiple times, for example from different distances, perspectives and/or detection angles. In each individually detected radar echo of the object, the signal section corresponding to the object is identified, and the deep neural network assigns at least one object class to it. The class probability measures for the different object classes, determined after each individual acquisition, are then combined, that is, for example, summed up and renormalized. This has the advantage that objects can be captured more than once, yielding a more accurate picture of the environment.
  • In the device, it is correspondingly provided that the at least one radar sensor, the radar evaluation device and the classification device are designed to detect, evaluate and classify an object in the environment several times, and to combine results of these classifications on the basis of the associated class probability measures.
  • In some embodiments, it is further provided that, after the individual results have been combined, a decision is made as to which of the object classes from the combined result is assigned to an object. In other words, after combining the results it is decided, for example by maximum-value detection among the class probability measures of the various object classes, which object class is finally selected or uniquely assigned to an object. If, for example, the class probability measure is a simple probability with which the individual object classes are present, then the object class with the highest probability is assigned to the object as its single, final object class.
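For the simple case of plain probabilities, the fusion of several detections (sum and renormalize) followed by the maximum-value decision can be sketched as follows; class names and values are illustrative, not from the patent:

```python
def combine_classifications(per_detection_probs):
    """Fuse class probabilities from several detections of one object.

    Per-class probabilities are summed over all detections and
    renormalized; the final class is chosen by maximum value.
    """
    totals = {}
    for probs in per_detection_probs:
        for cls, p in probs.items():
            totals[cls] = totals.get(cls, 0.0) + p
    norm = sum(totals.values())
    fused = {cls: p / norm for cls, p in totals.items()}
    # Maximum-value decision: the class with the highest fused probability.
    return fused, max(fused, key=fused.get)
```

Called with, say, two detections of the same object, the function returns the fused probabilities and the finally assigned object class.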
  • In a further development, it is further provided that the class probability measure is an evidence value and the combination is carried out by means of the Dempster-Shafer method. An evidence value is a two-dimensional probability measure: it comprises the degree of belief, i.e. the confidence that the statement of a source holds, and the plausibility of the event; equivalently, it can be regarded as a probability interval with a lower and an upper bound. In this embodiment, the deep neural network then provides, for a signal section, an associated evidence value for each of the object classes. By means of the Dempster-Shafer method, objects that have been detected several times can be classified much better, since the individual measurements are combined into one overall statement.
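Dempster's rule of combination, which underlies the Dempster-Shafer method mentioned above, can be sketched in a few lines. Here each mass function maps sets of object classes to belief masses; the hypothesis sets and mass values are illustrative, not from the patent:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic belief assignments.

    Each mass function maps frozensets of object classes to masses that
    sum to 1. Mass falling on conflicting (disjoint) hypotheses is
    discarded and the remainder renormalized.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; masses cannot be combined")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}
```

Combining, for example, two measurements that both lean towards "car" but keep some mass on the coarser hypothesis {car, truck} sharpens the belief in "car", which is the effect exploited for repeatedly detected objects.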
  • In a further embodiment, it is provided that the deep neural network is trained by means of training data from a camera-based reference system. For this purpose, images of the environment captured by the camera-based reference system are classified by the reference system, and the object classes assigned to the objects in the environment are accepted as reference classes ("ground truth"). These reference classes are then used as training data when training the deep neural network to evaluate the radar signal provided by the radar sensor, or the signal section. It may be provided here that the camera data are first evaluated manually, so that the reference classes are assigned to the corresponding objects by hand.
  • It may be provided here to use camera data captured during a journey in an unknown environment. Alternatively, however, it can also be provided that camera data is used which has been acquired in an environment with known objects.
  • The corresponding embodiment of the device accordingly provides a camera-based reference system, wherein the deep neural network is adapted to be trained by means of training data provided by the camera-based reference system.
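The supervision flow — radar signal sections paired with camera-derived reference classes — can be sketched as follows. A single linear (perceptron-style) classifier stands in for the deep neural network of the patent, and the feature vectors and class names are synthetic; the sketch illustrates only how the camera labels supervise the radar classifier:

```python
def train_with_camera_labels(radar_sections, camera_labels, classes, epochs=20):
    """Toy stand-in for training on radar sections labelled by a
    camera-based reference system ("ground truth").

    Mistake-driven perceptron updates on one linear layer per class;
    all data here are synthetic.
    """
    n_feat = len(radar_sections[0])
    w = {c: [0.0] * n_feat for c in classes}
    for _ in range(epochs):
        for x, y in zip(radar_sections, camera_labels):
            scores = {c: sum(wi * xi for wi, xi in zip(w[c], x)) for c in classes}
            pred = max(scores, key=scores.get)
            if pred != y:  # update only on misclassification
                for i, xi in enumerate(x):
                    w[y][i] += xi
                    w[pred][i] -= xi
    return w

def predict(w, x):
    """Assign the class with the highest linear score."""
    return max(w, key=lambda c: sum(wi * xi for wi, xi in zip(w[c], x)))
```

In the patented system the labels would come from the reference classifier of the camera system and the features from the identified signal sections, with a deep neural network in place of the linear layer.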
  • In a further embodiment, it is provided that the signal section corresponding to the object is determined by means of a threshold value detection. For this purpose, signal components in the radar signal which lie below a certain threshold value are not taken into account. In this way, noise components in the radar signal can be completely suppressed or at least significantly reduced. As a consequence, the success rate of assigning an object class to an object by means of the deep neural network can be significantly improved.
  • In this embodiment, the device accordingly provides that the radar evaluation device is designed to determine the signal portion corresponding to the object by means of a threshold value detection.
  • In a further development, it is further provided that the threshold value detection is carried out by means of the Ordered-Statistics Constant-False-Alarm-Rate method (OS-CFAR).
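As an illustration of the OS-CFAR principle, each cell is compared against a threshold derived from an ordered statistic of its surrounding reference cells. The window sizes, rank k and scaling factor alpha below are illustrative assumptions, not values from the patent:

```python
def os_cfar(signal, num_ref=8, num_guard=2, k=6, alpha=3.0):
    """Ordered-Statistics CFAR threshold detection (sketch).

    For each cell, the k-th smallest of the surrounding reference cells
    (guard cells excluded), scaled by alpha, forms the local threshold.
    """
    detections = []
    half = num_ref // 2
    n = len(signal)
    for i in range(n):
        ref = []
        for j in range(i - half - num_guard, i + half + num_guard + 1):
            if j < 0 or j >= n or abs(j - i) <= num_guard:
                continue  # skip out-of-range cells, guard cells and the cell itself
            ref.append(signal[j])
        if len(ref) < k:
            continue  # too close to the edge for a reliable ordered statistic
        threshold = alpha * sorted(ref)[k - 1]
        if signal[i] > threshold:
            detections.append(i)
    return detections
```

Cells flagged by the detector mark the signal components fed on to the classifier, while noise cells stay below their locally adapted threshold.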
  • The classifier and the Deep Neural Network can be implemented, for example, by means of a Graphics Processing Unit (GPU or GPGPU).
  • The device may, for example, be integrated in a motor vehicle. In this way, the classification can be carried out directly in the motor vehicle itself and is available while driving through an environment.
  • The invention is explained in more detail below on the basis of preferred embodiments with reference to the figures. In the figures:
    • Fig. 1 shows a schematic representation of an embodiment of the device for classifying objects in the environment of a motor vehicle;
    • Fig. 2 shows a schematic representation of another embodiment of the device for classifying objects in the environment of a motor vehicle;
    • Fig. 3 shows a schematic representation of different radar echo centers at the rear of a motor vehicle, illustrating the formation of a radar echo;
    • Fig. 4a shows a schematic radar signature of a pedestrian;
    • Fig. 4b shows a schematic radar signature of a motor vehicle;
    • Fig. 4c shows a schematic radar signature of a truck;
    • Fig. 5 shows a schematic representation explaining the threshold value detection.
  • Fig. 1 shows a schematic representation of an embodiment of the device 1 for classifying objects in the environment of a motor vehicle. The device 1 comprises a radar sensor 2, a radar evaluation device 3, a classification device 4 and an output device 5. The radar sensor 2 captures the environment of the motor vehicle, detecting the radar echoes 6 of a radar pulse caused by objects in this environment. On the basis of the detected radar echoes 6, the radar sensor 2 provides a radar signal 7 and feeds it to the radar evaluation device 3. The radar evaluation device 3 is designed to identify a signal section 8 corresponding to an object in the radar signal 7 and to determine a distance 9, a speed 10 and/or an azimuth angle 11 corresponding to the object. The signal section 8 identified by the radar evaluation device 3 in the radar signal 7 is fed to the classification device 4. The classification device 4 is designed to classify the signal section 8 by means of a deep neural network 12 by assigning at least one object class 13 to the signal section 8, and to determine, by means of the deep neural network 12, a class probability measure 14 for each object class 13 assigned to the signal section 8. The output device 5 outputs the assigned object class 13, or object classes 13, and the associated class probability measures 14. Furthermore, the output device 5 also outputs the distance 9, the speed 10 and/or the azimuth angle 11. The output device 5 can output the aforementioned values, for example, as a data telegram in digital form or as analog voltage values.
  • In some embodiments, it may be provided that an object in the environment is detected, evaluated and classified several times, and that the results of these classifications are combined on the basis of the associated class probability measures 14. For this purpose, the device 1 may additionally comprise, for example, an averaging device (not shown) which combines the individual results. Alternatively, the combination can be carried out by the output device 5, which for this purpose has, for example, a suitably designed averaging module (not shown).
  • The device may, for example, be integrated in a motor vehicle. In this way, the classification can be carried out directly in the motor vehicle itself and is available while driving through an environment.
  • In further embodiments, it may further be provided that the class probability measure is an evidence value and the summarization is carried out by means of the Dempster-Shafer method.
  • Fig. 2 shows a schematic representation of another embodiment of the device 1 for classifying objects in the environment of a motor vehicle. The device 1 corresponds substantially to the embodiment shown in Fig. 1; the same reference numerals denote the same features. In addition, the embodiment shown in Fig. 2 has a camera-based reference system 15, the camera-based reference system 15 comprising a camera 16 and a reference classifier 17. The camera 16 captures the environment of the motor vehicle and feeds the recorded images 18 to the reference classifier 17. The reference classifier 17 recognizes objects in these images 18 and assigns reference classes 19 to these objects. These reference classes 19 then form training data 20 used to train the deep neural network 12 of the classification device 4. The reference classes 19 classified by the reference system 15 and assigned to the objects in the environment are hereby accepted as a reference ("ground truth"). It may be provided here that the images 18 are first evaluated manually, so that the reference classes 19 are assigned to the corresponding objects by hand.
  • Fig. 3 shows a schematic representation of the signal strengths produced by different radar echo centers 31 at the rear 30 of a motor vehicle, illustrating how a radar echo arises. The radar echo reflected from the rear 30 in response to a radar pulse impinging on the rear 30 is composed of a superposition of the echoes of the individual radar echo centers 31. The radar signal is subsequently formed from the superimposed radar echo detected by a radar sensor.
  • The radar signature of an object differs depending on the size, shape, surface texture and/or material of the object at which the radar echo is generated. Figs. 4a to 4c show three different radar signatures 32 of different objects, in each case plotting over time the waveform of the radar echo of these objects as detected by a radar sensor. Fig. 4a shows the radar signature 32 of a pedestrian, Fig. 4b that of a motor vehicle and Fig. 4c that of a truck. It can be seen that the different radar signatures 32 differ clearly from one another. The described method and device make use of this difference when assigning object classes to the objects by means of the deep neural network.
  • Fig. 5 shows a schematic diagram explaining the threshold value detection used in some embodiments. Here, a radar signal 7 is evaluated via a threshold detection for which a certain threshold value 33 is predetermined. Signal components below the threshold value 33 are neglected. Signal components above the threshold value 33 are cut out, and the cut-out period is supplied as signal section 8 to the deep neural network as an input vector. In this way, a noise component in the radar signal 7 can be suppressed, or at least minimized, prior to classification.
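The cutting-out of the above-threshold signal components can be sketched as follows; the sample values and the fixed threshold are illustrative (in the embodiments the threshold may instead be determined adaptively, e.g. by OS-CFAR):

```python
def extract_signal_section(radar_signal, threshold):
    """Cut out the span of samples exceeding the threshold.

    The span from the first to the last above-threshold sample is
    returned as the signal section fed to the network as input vector.
    """
    above = [i for i, v in enumerate(radar_signal) if v > threshold]
    if not above:
        return []  # pure noise: no signal section identified
    return radar_signal[above[0]:above[-1] + 1]
```

Samples outside the returned span, i.e. the noise floor, never reach the classifier.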
  • In one embodiment, threshold detection is performed using the Ordered-Statistics Constant-False-Alarm-Rate (OS-CFAR) method.
  • LIST OF REFERENCE NUMBERS
  • 1
    contraption
    2
    radar sensor
    3
    Radar evaluation device
    4
    classifier
    5
    output device
    6
    radar echo
    7
    radar signal
    8th
    signal section
    9
    distance
    10
    speed
    11
    azimuth angle
    12
    Deep Neural Network
    13
    object class
    14
    Klassenwahrscheinlichkeitsmaß
    15
    Camera-based reference system
    16
    camera
    17
    Reference classifier
    18
    Illustration
    19
    reference class
    20
    training data
    30
    Rear
    31
    Radar echo center
    32
    radar signature
    33
    threshold
  • CITATIONS INCLUDED IN THE DESCRIPTION
  • This list of documents cited by the applicant was generated automatically and is included solely for the reader's information. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • EP 1674883 A2 [0003]

Claims (10)

  1. Method for classifying objects in the environment of a motor vehicle, comprising the following method steps: detecting an environment of the motor vehicle by means of at least one radar sensor (2), wherein radar echoes (6) of a radar pulse caused by objects in the environment are detected; evaluating a radar signal (7) provided by the radar sensor (2) on the basis of the detected radar echoes (6) by means of a radar evaluation device (3), wherein a signal section (8) corresponding to an object is identified in the radar signal (7), and wherein a distance (9), a speed (10) and/or an azimuth angle (11) corresponding to the object are determined; classifying the signal section (8) by means of a classification device (4), wherein the signal section (8) is classified by a deep neural network (12) that assigns at least one object class (13) to the signal section (8), and wherein the deep neural network (12) determines a class probability measure (14) for each object class (13) assigned to the signal section (8); outputting the at least one assigned object class (13) and the respectively associated class probability measures (14) by means of an output device (5).
  2. Method according to Claim 1, characterized in that an object in the environment is detected, evaluated and classified several times, and the results of these classifications are combined on the basis of the associated class probability measures (14).
  3. Method according to Claim 2 , characterized in that the class probability measure (14) is an evidence value and the summarization is performed by means of the Dempster-Shafer method.
  4. Method according to one of the preceding claims, characterized in that the deep neural network (12) is trained by means of training data (20) of a camera-based reference system (15).
  5. Method according to one of the preceding claims, characterized in that the signal section (8) corresponding to the object is determined by means of a threshold value detection.
  6. Method according to Claim 5 , characterized in that the threshold detection is performed by means of the Ordered-Statistics Constant-False-Alarm-Rate-method.
  7. Device (1) for classifying objects in the environment of a motor vehicle, comprising: at least one radar sensor (2) for detecting an environment of the motor vehicle, wherein the radar sensor (2) detects radar echoes (6) of a radar pulse caused by objects in the environment; a radar evaluation device (3) for evaluating a radar signal (7) provided by the radar sensor (2) on the basis of the detected radar echoes (6), wherein the radar evaluation device (3) is designed to identify a signal section (8) corresponding to an object in the radar signal (7) and to determine a distance (9), a speed (10) and/or an azimuth angle (11) corresponding to the object; a classification device (4), wherein the classification device (4) is designed to classify the signal section (8) by means of a deep neural network (12) by assigning at least one object class (13) to the signal section (8), and to determine, by means of the deep neural network (12), a class probability measure (14) for each object class (13) assigned to the signal section (8); an output device (5) designed to output the at least one assigned object class (13) and the respectively associated class probability measures (14).
  8. Device (1) according to Claim 7 , characterized in that the at least one radar sensor (2), the radar evaluation device (3) and the classification device (4) are designed to detect, evaluate and classify an object in the environment several times, and results of these classifications on the basis of the associated class probability measures (14) to summarize.
  9. Device (1) according to one of Claims 7 to 8th characterized by a camera-based reference system (15), wherein the deep neural network (12) is adapted to be trained by means of training data (20) provided by the camera-based reference system (15).
  10. Device (1) according to one of Claims 7 to 9 , characterized in that the radar evaluation device (3) is designed in such a way to determine the signal section (8) corresponding to the object by means of a threshold value recognition.
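By way of illustration only (the claims specify no concrete implementation), the combination of repeated classifications via the Dempster-Shafer method referred to in claims 2 and 3 can be sketched as Dempster's rule of combination. This minimal version assumes mass functions over singleton object classes plus the full frame of discernment 'Theta' (unassigned evidence); the class names are hypothetical examples.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping a singleton object class (e.g. 'car') or the
    full frame 'Theta' (unassigned evidence) to a mass in [0, 1].
    Masses assigned to contradictory singletons form the conflict,
    which is normalized out (assumes conflict < 1).
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            if a == 'Theta':
                key = b                 # intersection with the frame
            elif b == 'Theta' or a == b:
                key = a
            else:
                conflict += ma * mb     # disjoint singletons: conflict
                continue
            combined[key] = combined.get(key, 0.0) + ma * mb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Two detections of the same object, each with an evidence value for
# the hypothetical class 'car' and the remainder left unassigned:
fused = dempster_combine({'car': 0.7, 'Theta': 0.3},
                         {'car': 0.6, 'Theta': 0.4})
```

Repeated application of the rule lets evidence for the same object class accumulate across detection cycles while residual uncertainty ('Theta') shrinks.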
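Similarly, the ordered-statistics CFAR threshold detection of claims 5 and 6 can be sketched for a one-dimensional signal. The window sizes, the order statistic k, and the scale factor below are illustrative assumptions, not values from the patent.

```python
def os_cfar(signal, guard=2, train=8, k=None, scale=1.5):
    """Ordered-statistics CFAR detector (1-D sketch).

    For each cell under test, the noise level is estimated as the k-th
    smallest value among the surrounding training cells (guard cells
    excluded); a detection is declared when the cell exceeds
    scale * noise_estimate.
    """
    if k is None:
        k = int(0.75 * 2 * train)   # common choice: ~3/4 of the window
    detections = []
    for i in range(len(signal)):
        left = signal[max(0, i - guard - train):max(0, i - guard)]
        right = signal[i + guard + 1:i + guard + 1 + train]
        window = sorted(left + right)
        if len(window) < k:
            continue                # too few training cells at the edges
        noise = window[k - 1]       # k-th order statistic
        if signal[i] > scale * noise:
            detections.append(i)
    return detections

# A flat noise floor with one strong return: only the peak is detected.
echo_profile = [1.0] * 100
echo_profile[50] = 20.0
peaks = os_cfar(echo_profile)
```

Because the threshold tracks an order statistic of the local window rather than its mean, a single strong neighboring target inside the training cells does not mask weaker detections, which is the usual motivation for the ordered-statistics variant.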
DE102017207442.6A 2017-05-03 2017-05-03 Method and device for classifying objects in the environment of a motor vehicle Pending DE102017207442A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102017207442.6A DE102017207442A1 (en) 2017-05-03 2017-05-03 Method and device for classifying objects in the environment of a motor vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017207442.6A DE102017207442A1 (en) 2017-05-03 2017-05-03 Method and device for classifying objects in the environment of a motor vehicle
PCT/EP2018/060788 WO2018202552A1 (en) 2017-05-03 2018-04-26 Method and device for classifying objects in the environment of a motor vehicle

Publications (1)

Publication Number Publication Date
DE102017207442A1 true DE102017207442A1 (en) 2018-11-08

Family

ID=62091875

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102017207442.6A Pending DE102017207442A1 (en) 2017-05-03 2017-05-03 Method and device for classifying objects in the environment of a motor vehicle

Country Status (2)

Country Link
DE (1) DE102017207442A1 (en)
WO (1) WO2018202552A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19518993A1 (en) * 1995-05-29 1996-12-05 Sel Alcatel Ag Device and method for automatic detection or classification of objects
DE19649618A1 (en) * 1996-11-29 1998-06-04 Alsthom Cge Alcatel Method and device for automatic classification of objects
EP1674883A2 (en) * 2004-12-27 2006-06-28 Hitachi, Ltd. Apparatus and method for detecting vehicle
US20160140438A1 (en) * 2014-11-13 2016-05-19 Nec Laboratories America, Inc. Hyper-class Augmented and Regularized Deep Learning for Fine-grained Image Classification

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5499030A (en) * 1994-03-18 1996-03-12 The United States Of America As Represented By The Secretary Of The Air Force Expert system constant false alarm rate (CFAR) processor
US5963653A (en) * 1997-06-19 1999-10-05 Raytheon Company Hierarchical information fusion object recognition system and method
US9594159B2 (en) * 2013-07-15 2017-03-14 Texas Instruments Incorporated 2-D object detection in radar applications
JP6548376B2 (en) * 2014-10-06 2019-07-24 日本電産株式会社 Radar system, radar signal processing device, vehicle travel control device and method, and computer program


Also Published As

Publication number Publication date
WO2018202552A1 (en) 2018-11-08

Similar Documents

Publication Publication Date Title
US9994218B2 (en) Method for smartphone-based accident detection
Mukhtar et al. Vehicle detection techniques for collision avoidance systems: A review
JP5425853B2 (en) Road use vulnerable person protection system
US9429650B2 (en) Fusion of obstacle detection using radar and camera
US10049284B2 (en) Vision-based rain detection using deep learning
US8379924B2 (en) Real time environment model generation system
US9436879B2 (en) Method for recognizing traffic signs
DE102006020192B4 (en) Apparatus and method for predicting collision
DE102013113619A1 (en) Probabilistic target selection and hazard assessment procedures and application to an intersection collision warning system
US20150049195A1 (en) Image processing unit, object detection method, object detection program, and vehicle control system
US8457408B2 (en) Method and system of identifying one or more features represented in a plurality of sensor acquired data sets
CN101303735B (en) Method for detecting moving objects in a blind spot region of a vehicle and blind spot detection device
CN103886757B (en) Method for the vehicle in movement of classifying automatically
WO2017175025A2 (en) Detecting visual information corresponding to an animal
CN107972662B (en) Vehicle forward collision early warning method based on deep learning
US6838980B2 (en) Camera-based precrash detection system
CN104573646B (en) Chinese herbaceous peony pedestrian detection method and system based on laser radar and binocular camera
CN105512623B (en) Based on multisensor travelling in fog day vision enhancement and visibility early warning system and method
DE102009006113B4 (en) Device and method for sensor fusion with dynamic objects
US7672514B2 (en) Method and apparatus for differentiating pedestrians, vehicles, and other objects
US7486802B2 (en) Adaptive template object classification system with a template generator
DE102006012914B4 (en) System and method for determining the distance to a preceding vehicle
US20090060273A1 (en) System for evaluating an image
ES2429572T3 (en) Integrated vehicle system for collision prevention at low speed
US7702425B2 (en) Object classification system for a vehicle

Legal Events

Date Code Title Description
R079 Amendment of ipc main class

Free format text: PREVIOUS MAIN CLASS: G01S0013880000

Ipc: G01S0007410000

R163 Identified publications notified
R081 Change of applicant/patentee

Owner name: VOLKSWAGEN AKTIENGESELLSCHAFT, DE

Free format text: FORMER OWNERS: SCANIA CV AB, SOEDERTAELJE, SE; VOLKSWAGEN AG, 38440 WOLFSBURG, DE

R082 Change of representative

Representative=s name: PATENTANWAELTE BRESSEL UND PARTNER MBB, DE