EP4314895A1 - Method for operating a driver assistance system, computer program product, driver assistance system and vehicle - Google Patents

Method for operating a driver assistance system, computer program product, driver assistance system and vehicle

Info

Publication number
EP4314895A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
time
sensor signals
sig1
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22718982.6A
Other languages
German (de)
English (en)
Inventor
Ludovic Mosnier-Thoumas
Markus Heimberger
Niko Moritz Scholz
Jean-Francois Bariant
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Schalter und Sensoren GmbH
Original Assignee
Valeo Schalter und Sensoren GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Schalter und Sensoren GmbH filed Critical Valeo Schalter und Sensoren GmbH
Publication of EP4314895A1 publication Critical patent/EP4314895A1/fr
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06 Systems determining the position data of a target
    • G01S15/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/525 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking automatically indicating risk of collision between vehicles in traffic or with pedestrians, e.g. after risk assessment using the vehicle sensor data
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00 Arrangement or adaptation of acoustic signal devices
    • B60Q5/005 Arrangement or adaptation of acoustic signal devices automatically actuated
    • B60Q5/006 Arrangement or adaptation of acoustic signal devices automatically actuated indicating risk of collision between vehicles or with pedestrians
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60Q9/008 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0956 Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/862 Combination of radar systems with sonar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/50 Systems of measurement, based on relative movement of the target
    • G01S15/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/50 Systems of measurement, based on relative movement of the target
    • G01S15/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S15/588 Velocity or trajectory determination systems; Sense-of-movement determination systems measuring the velocity vector
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/87 Combinations of sonar systems
    • G01S15/876 Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402 Type
    • B60W2554/4029 Pedestrians
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404 Characteristics
    • B60W2554/4044 Direction of movement, e.g. backwards
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/86 Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2015/937 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles sensor installation details

Definitions

  • the present invention relates to a method for operating a driver assistance system, a computer program product, a driver assistance system and a vehicle with such a driver assistance system.
  • Known vehicles have a number of sensor units, such as ultrasonic sensors, with which they can detect their surroundings. This can be particularly helpful when maneuvering and/or parking the vehicle, especially if the vehicle is large and/or difficult to see out of.
  • an acoustic, haptic and/or visual signal can be output to the driver to warn him of a collision.
  • One difficulty is detecting moving (dynamic) objects and determining whether a collision with them can occur.
  • a dynamic object is, in particular, another road user, such as a pedestrian or a cyclist.
  • DE 102006045418 A1 discloses a motor vehicle with a driver assistance system and with a sensor for measuring the distance from an obstacle. It is proposed to detect a movement direction of a moving object in order to increase road safety.
  • an object of the present invention is to improve the operation of a driver assistance system.
  • a method for operating a driver assistance system for a vehicle includes: a) receiving a driving condition sensor signal indicative of a driving condition of the vehicle at a number of different points in time, b) receiving a number of sensor signals indicative of an area surrounding the vehicle at a number of different points in time, c) detecting a number of objects in the surroundings of the vehicle as a function of a first number of sensor signals received at a first point in time, d) determining a position and a motion vector for a detected object as a function of the first number of sensor signals and a second number of sensor signals, which was received at a second point in time following the first point in time, using a plurality of different determination methods, wherein different determination methods of the plurality have a different computing effort, and e) outputting a warning signal when a potential collision of the vehicle with the detected object is determined on the basis of the driving condition sensor signal received at a specific point in time and the position determined for the detected object and its motion vector.
  • This method has the advantage that the movement of detected objects is determined using different determination methods.
  • a less complex determination method that can be carried out very quickly and a more complex determination method that can be carried out somewhat more slowly can be used here.
  • the different determination methods have, for example, different levels of accuracy and/or reliability with regard to their respective results. In particular, a less complex determination method can be less reliable but save time, especially in critical situations.
  • the driving condition sensor signal includes, for example, odometry data of the vehicle, such as a current speed, a current wheel speed, a current wheel angle and/or a current steering angle.
  • odometry data of the vehicle such as a current speed, a current wheel speed, a current wheel angle and/or a current steering angle.
  • a direction of the vehicle for example a future trajectory or a driving path, can be determined on the basis of the driving state sensor signal.
  • the fact that the driving condition sensor signal is received at a number of different points in time is to be understood in particular to mean that the driving condition sensor signal that is current at this point in time is received at a particular point in time.
  • the current driving condition sensor signal is indicative in particular of the current driving condition at the current point in time.
  • the driving condition sensor signal is received regularly, in particular periodically, for example with a frequency of more than 1 Hz, preferably at least 10 Hz, preferably up to 100 Hz. Based on at least two consecutive driving condition sensor signals, a change in the driving condition can be determined, for example.
  • the sensor signals indicative of the surroundings of the vehicle include, in particular, ultrasonic sensor signals.
  • the fact that a number of sensor signals are received is to be understood in particular as meaning, for example, that sensor signals are received from a number of different sensors, with different sensors comprising, on the one hand, sensors of the same type but with a different arrangement and/or orientation, and, on the other hand, also sensors of different types, such as an ultrasonic sensor and a camera.
  • the number comprises a quantity greater than or equal to one.
  • furthermore, the sensor signals can also be received from virtual sensors. Alternatively, one can also say that the sensor signals are retrieved.
  • a sensor provides an output signal that can be called up by external components, such as the driver assistance system.
  • the fact that the number of sensor signals are received at a number of different points in time is to be understood in particular to mean that the sensor signals that are current at this point in time are received at a particular point in time. All of the sensor signals of the number are preferably received at one point in time, for example as a data packet that includes all of the sensor signals of the number.
  • sensor signals of the number, for example sensor signals from two different ultrasonic sensors, can have different detection times, even if the number of sensor signals is received at one time.
  • an interval between two sensor signals from a respective sensor is different from an interval between two sensor signals from a further sensor.
  • Each sensor preferably provides its sensor signal regularly, in particular periodically, for example with a frequency of more than 1 Hz, preferably at least 10 Hz, preferably up to 100 Hz, with the current sensor signal always being provided at a particular point in time.
  • the times at which the number of sensor signals is received may differ from the times at which the driving condition sensor signal is received.
  • the fact that a number of objects in the area surrounding the vehicle are detected as a function of a first number of sensor signals that were received at a first point in time means in particular that the first number of received sensor signals is processed and/or analyzed, and as a result of the processing and/or analysis a number of objects are detected.
  • the processing and/or analysis includes, for example, a signal analysis of the individual sensor signals and/or a signal analysis of a number of sensor signals that are correlated with one another.
  • the number of objects is greater than or equal to one.
  • a position and a motion vector are determined for at least one of the detected objects.
  • a single sensor signal at a point in time can be sufficient to determine the position.
  • At least two sensor signals, which were received at different points in time, are necessary to determine the movement vector.
  • Two sensor signals of a specific sensor received at different times are preferably used to determine the movement vector, but sensor signals from different sensors that were received at different times can also be used.
  • the more sensor signals are used for a respective object, the more accurate the determination of the position and the motion vector can be.
  • the position of the object relates in particular to a position in a coordinate system of the vehicle and has at least two coordinates.
  • the position of the object can refer to a single point of the object.
  • the motion vector comprises a two-dimensional vector.
  • a magnitude of the motion vector corresponds, for example, to the speed of the object.
  • the plurality of different determination methods includes at least two determination methods that have different computational complexity. For example, in a first determination method, the position is determined and the motion vector is determined directly on the basis of raw sensor data, and in a second determination method, processed data is generated on the basis of the raw data and the position is determined and the motion vector is determined on the basis of the processed data.
  • Sensor raw data are in particular the unprocessed output signal of a respective sensor. Since there is no pre-processing, this determination method can be particularly fast. On the other hand, signal noise or the like may affect the determination result.
  • a processing step includes, for example, performing a particular mathematical operation on a respective sensor signal, such as forming a moving average to mask outliers in the sensor signal, or applying a noise filter, a Fourier transform, and the like.
  • a processing step can also include a number of sensor signals, for example when a correlation is determined and/or the sensor signals are mutually checked for plausibility.
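To make the difference in computing effort concrete, the following Python sketch contrasts a fast determination step that works directly on a raw echo distance with a slower one that first smooths a history of echoes with a moving average, as mentioned above. The helper names, the fixed bearing along the sensor's main axis, and the window length are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def position_from_raw(echo_m, sensor_xy, bearing_rad):
    """Fast variant: use the latest raw echo distance directly
    (no pre-processing, therefore quick, but sensitive to signal noise)."""
    direction = np.array([np.cos(bearing_rad), np.sin(bearing_rad)])
    return np.asarray(sensor_xy, float) + echo_m * direction

def position_from_smoothed(echo_history_m, sensor_xy, bearing_rad, window=5):
    """Slower variant: form a moving average over the last echoes first,
    masking outliers before the position is determined."""
    history = np.asarray(echo_history_m, dtype=float)  # needs >= `window` samples
    smoothed = np.convolve(history, np.ones(window) / window, mode="valid")[-1]
    return position_from_raw(smoothed, sensor_xy, bearing_rad)
```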
  • Different positions and/or movement vectors can be determined for an object on the basis of the various determination methods.
  • the most up-to-date position and the most up-to-date movement vector are preferably always used as the basis for determining a possible collision. If a position and a motion vector for an object were determined by two different determination methods based on the same current sensor signals, then in particular the position and the motion vector whose respective accuracy and/or reliability is higher are used as the basis for determining a possible collision.
  • the accuracy and/or reliability can be determined, for example, in the form of a determination error, which can originate on the one hand from measurement errors and on the other hand from the determination method itself.
  • the position and the motion vector are determined for several of the detected objects of the number, preferably for each detected object of the number.
  • the position and the motion vector are constantly updated. For example, a new determination is made whenever a current sensor signal or a number of current sensor signals is received.
  • the current position and the current movement vector can therefore also be used in the following.
  • based on the driving condition sensor signal received at a specific point in time and the position determined for the detected object and its movement vector, it can be determined whether the vehicle may collide in the future with the object moving according to its movement vector.
  • for this purpose, an extrapolation of the position of the vehicle based on the respective current driving condition sensor signal and an extrapolation of the position of the object based on the current position and the current movement vector of the object can be determined.
  • a warning signal is given when a collision is likely.
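A minimal sketch of this extrapolation-based check, assuming straight-line constant-velocity motion for both the vehicle and the object over a short horizon; the horizon, the time step, and the collision radius are illustrative assumptions.

```python
import numpy as np

def potential_collision(veh_pos, veh_vel, obj_pos, obj_vec,
                        horizon_s=3.0, dt=0.1, radius_m=1.0):
    """Extrapolate vehicle and object positions forward in time and report
    whether they come closer than a safety radius within the horizon."""
    veh_pos, veh_vel = np.asarray(veh_pos, float), np.asarray(veh_vel, float)
    obj_pos, obj_vec = np.asarray(obj_pos, float), np.asarray(obj_vec, float)
    for t in np.arange(0.0, horizon_s, dt):
        gap = np.linalg.norm((veh_pos + t * veh_vel) - (obj_pos + t * obj_vec))
        if gap <= radius_m:
            return True   # a potential collision: the warning signal would be output
    return False
```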
  • the warning signal can be output directly to the user of the vehicle, for example as an acoustic, haptic and/or visual signal.
  • the warning signal can also be output to the outside of the vehicle, for example to the object, in particular as an acoustic warning signal. If the warning signal is issued, further functions of the driver assistance system and/or other units of the vehicle can also be triggered.
  • the warning signal is preferably output independently of the determination method used to determine the position and the movement vector of the object with which a possible collision was determined.
  • the number of different determination methods includes at least one first determination method in which each detected object of the number is assigned a Kalman filter, which is initialized and used to determine the position and the motion vector of the respective object.
  • the Kalman filter (also Kalman-Bucy filter, Stratonovich-Kalman-Bucy filter or Kalman-Bucy-Stratonovich filter) is a mathematical method for the iterative estimation of parameters, in this case the position and the motion vector of the object, on the basis of noisy measurements, in this case the received sensor signals.
  • the Kalman filter is used to estimate system variables that cannot be measured directly, while the errors in the measurements are optimally reduced.
  • the Kalman filter describes an estimated value using multidimensional normal distributions. These represent a probability distribution of possible errors around each estimate, as well as correlations between estimate errors of different variables.
  • the previous estimated values are optimally combined with the new measurements in each time step, so that remaining errors are minimized as quickly as possible.
  • the Kalman filter has a filter state comprising the current estimated values as well as error estimates and correlations. After each new measurement, the Kalman filter improves the previous estimates and updates the associated error estimates and correlations. In dynamic systems in which, for example, a speed is also estimated, the Kalman filter also estimates correlations between the speed and, for example, the position, in particular on the basis of equations of motion, and takes these into account for the next time step.
  • the Kalman filter is preferably updated at least whenever a number of current sensor signals have been received.
  • different sensor signals of the number are assigned to different scanning areas in the environment, with a respective sensor signal of the number of sensor signals received at a particular point in time, which is assigned to a specific scanning area, being fed to that Kalman filter whose assigned object has a position which lies within the scanning area assigned to the sensor signal.
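As an illustration of the first determination method V1, the following sketch assigns each detected object a small constant-velocity Kalman filter whose state carries the position POS and the motion vector VEC and which is updated with position measurements derived from the ultrasonic sensor signals. The model matrices, noise levels, and time step are assumptions made for the sketch, not values from the patent.

```python
import numpy as np

class ObjectKalman:
    """One filter per detected object; state x = [px, py, vx, vy] holds the
    position POS and the motion vector VEC (constant-velocity model)."""

    def __init__(self, pos_xy, dt=0.1):
        self.x = np.array([pos_xy[0], pos_xy[1], 0.0, 0.0])
        self.P = np.diag([1.0, 1.0, 4.0, 4.0])      # initial uncertainty (assumed)
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt            # position integrates velocity
        self.H = np.array([[1., 0., 0., 0.],
                           [0., 1., 0., 0.]])       # only the position is measured
        self.Q = 0.05 * np.eye(4)                   # process noise (assumed)
        self.R = 0.20 * np.eye(2)                   # ultrasonic noise (assumed)

    def step(self, meas_xy):
        # prediction with the motion model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # correction with the new position measurement
        y = np.asarray(meas_xy, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2], self.x[2:]               # current POS, current VEC
```

The routing described above would then amount to feeding each new measurement only to the filter whose object position lies inside the scanning area of the sensor that produced the measurement.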
  • if a potential collision is determined on the basis of the position and the motion vector determined for the respective detected object using the first determination method, the warning signal is only output if the determined motion vector of the object is not equal to zero.
  • in other words, there is no warning of a possible collision with an object if the possible collision was determined on the basis of the position determined using the first determination method and, for example, a movement of the vehicle, but the object itself is not moving and is therefore static. It should be noted that this does not preclude a warning of such a collision being issued on another basis.
  • the possible collision can also be determined on the basis of the position of the object determined using a further determination method, and a warning can then be issued accordingly.
  • according to a further embodiment, the method includes determining a driving path of the vehicle on the basis of the received driving condition sensor signal.
  • the term “driving path” describes in particular the area that the vehicle would sweep over if it were moved forwards or backwards with the current wheel angle or steering angle. This means that a change in the steering angle or the wheel angle causes a change in the driving path.
  • the driving path can be represented, for example, by trajectories for each wheel of the vehicle.
  • the driving path can also be understood as a two-dimensional future trajectory of the vehicle.
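Purely as an illustration of how such a driving path could be derived from the steering angle contained in the driving condition sensor signal, the following sketch samples circular wheel arcs from a kinematic bicycle model; the wheelbase, track width, and arc length are assumed values, and the patent does not prescribe this model.

```python
import numpy as np

def wheel_arc(steer_rad, lateral_offset_m, wheelbase_m=2.9, arc_len_m=10.0, n=50):
    """Sample one wheel trajectory of the driving path TR for the current
    steering angle (kinematic bicycle model; vehicle frame: x ahead, y left)."""
    if abs(steer_rad) < 1e-4:                       # essentially straight ahead
        x = np.linspace(0.0, arc_len_m, n)
        return np.column_stack([x, np.full(n, lateral_offset_m)])
    r = wheelbase_m / np.tan(steer_rad)             # signed turn radius, rear axle
    phi = np.linspace(0.0, arc_len_m / abs(r), n) * np.sign(r)
    x = (r - lateral_offset_m) * np.sin(phi)
    y = r + (lateral_offset_m - r) * np.cos(phi)
    return np.column_stack([x, y])

# Left and right wheel arcs for a slight right turn, track width ~1.6 m:
left_arc, right_arc = wheel_arc(-0.05, 0.8), wheel_arc(-0.05, -0.8)
```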
  • a warning signal is only output when a distance of the respective object from the vehicle and/or from the determined driving path is less than or equal to a lower threshold value.
  • the threshold value can be variably determined in particular for different vehicles and/or situations. Furthermore, the threshold value can be determined as a function of the vehicle speed and/or the speed of the object. Furthermore, the threshold value can be determined as a function of a measurement accuracy, such as a standard deviation or a variance.
  • This embodiment has the advantage that a warning is only issued when the probability of a collision is relatively high.
  • the threshold value takes into account that it is unknown how the object will continue to move, i.e. whether it will change its speed and/or direction, for example.
  • the threshold can, for example, assume a value from an interval from zero to two meters.
  • a warning signal is only output when the determined movement vector of the respective object points in the direction of the vehicle and/or the determined driving path.
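The two optional criteria just described could be combined as in the following sketch, which checks the distance of the object to a sampled driving path and whether its non-zero motion vector points toward the path. Treating the path as a polyline of sample points, the 1 m threshold (one value from the zero-to-two-meter interval above), and the conjunctive combination are assumptions; the criteria can also be applied individually.

```python
import numpy as np

def should_warn(obj_pos, obj_vec, path_pts, threshold_m=1.0):
    """Check both optional criteria: the object is close to the driving path,
    and its (non-zero) motion vector points toward the path."""
    obj_pos, obj_vec = np.asarray(obj_pos, float), np.asarray(obj_vec, float)
    path_pts = np.asarray(path_pts, float)        # sampled driving path, shape (N, 2)
    to_path = path_pts - obj_pos
    dists = np.linalg.norm(to_path, axis=1)
    i = int(np.argmin(dists))                     # closest path point
    close_enough = dists[i] <= threshold_m
    moving_toward = np.linalg.norm(obj_vec) > 0.0 and float(obj_vec @ to_path[i]) > 0.0
    return close_enough and moving_toward         # criteria can also be used singly
```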
  • according to a further embodiment, step e) comprises: determining a future trajectory of the detected object on the basis of the determined position and the determined motion vector, and outputting the warning signal when a smallest distance between the determined driving path and the future trajectory is less than a predetermined minimum distance.
  • the future trajectory can be determined, for example, as an extrapolation of a previous trajectory of the object.
  • a curved future trajectory can also be determined here.
  • the predetermined minimum distance can be equal to the lower threshold value, but it can also be different from it.
  • the predetermined minimum distance can be variably determined, in particular for different vehicles and/or situations.
  • the predetermined minimum distance can be determined as a function of the vehicle speed and/or the speed of the object.
  • the predetermined minimum distance can be determined as a function of a measurement accuracy, such as a standard deviation or a variance.
  • that the predetermined minimum distance is determined as a function of another variable, such as the vehicle speed, means, for example, that the predetermined minimum distance is specified as a function of the other variable, so that a specific numerical value at a point in time is determined depending on the current value of the other variable.
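A sketch of this embodiment of step e): the object's previous trajectory is extrapolated linearly into a future trajectory TR1 and the smallest distance to the sampled driving path TR is compared against the predetermined minimum distance. The linear extrapolation and the sampling parameters are assumptions; a curved extrapolation would fit the same interface.

```python
import numpy as np

def future_trajectory(past_pts, horizon_s=3.0, dt=0.1):
    """Extrapolate the object's previous trajectory (last two samples)
    linearly into the future trajectory TR1."""
    past_pts = np.asarray(past_pts, float)
    step_vec = past_pts[-1] - past_pts[-2]        # motion per time step
    steps = np.arange(1, int(horizon_s / dt) + 1)[:, None]
    return past_pts[-1] + steps * step_vec

def smallest_distance(traj_a, traj_b):
    """Smallest distance between two sampled trajectories (pairwise minimum)."""
    traj_a, traj_b = np.asarray(traj_a, float), np.asarray(traj_b, float)
    diff = traj_a[:, None, :] - traj_b[None, :, :]
    return float(np.linalg.norm(diff, axis=2).min())

# Warning condition of this embodiment (minimum_distance_m is the predetermined value):
# smallest_distance(future_trajectory(object_history), driving_path_pts) < minimum_distance_m
```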
  • the received sensor signals exclusively include ultrasonic sensor signals.
  • This embodiment can be used advantageously for vehicles that only have ultrasonic sensors. Compared to other sensors, such as radar, lidar and/or cameras, ultrasonic sensors are inexpensive to manufacture and do not require high computing power to evaluate their sensor signals.
  • This embodiment can also be advantageous for vehicles that have additional sensors, since the computing power required for the proposed method is less than for alternative methods that, for example, additionally carry out an image evaluation of a camera image.
  • the number of different determination methods includes at least one second determination method in which a feature recognition is carried out on the basis of the number of sensor signals received at a particular point in time and a digital map of the surroundings is determined using recognized features.
  • the digital map of the surroundings can advantageously be used to also determine a collision with static objects and to warn of this.
  • carrying out a feature recognition ("feature extraction") is complex and requires more computing power. The second determination method can therefore take longer.
  • under certain circumstances, moving objects are not detected during feature recognition, for example if the sensor signal changes quickly and therefore looks like noise. This can be the case in particular with pedestrians and/or cyclists.
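The patent does not fix a representation for the digital map of the surroundings; purely for illustration, the map built by the second determination method V2 could be a simple occupancy grid into which recognized features are accumulated, as in this sketch (grid size, resolution, and the vehicle-centered frame are assumptions).

```python
import numpy as np

class EnvironmentMap:
    """Tiny occupancy-grid stand-in for the digital map of the surroundings."""

    def __init__(self, size_m=20.0, cell_m=0.2):
        n = int(size_m / cell_m)
        self.grid = np.zeros((n, n))                # accumulated occupancy evidence
        self.cell_m = cell_m
        self.origin = np.array([size_m / 2.0, size_m / 2.0])  # vehicle at the center

    def add_feature(self, feature_xy, weight=1.0):
        """Accumulate a recognized feature (vehicle-frame coordinates, meters)."""
        i, j = ((np.asarray(feature_xy, float) + self.origin) / self.cell_m).astype(int)
        if 0 <= i < self.grid.shape[0] and 0 <= j < self.grid.shape[1]:
            self.grid[i, j] += weight
```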
  • the method is only carried out when the vehicle has a speed of less than or equal to 15 km/h, preferably less than or equal to 10 km/h, preferably less than or equal to 7 km/h, more preferably less than or equal to 5 km/h.
  • the vehicle speed can be determined in particular on the basis of the driving condition sensor signal.
  • a computer program product is proposed which comprises instructions which, when the program is executed by a computer, cause the latter to execute the method according to the first aspect.
  • a computer program product, such as a computer program means, can be provided or supplied, for example, by a server in a network, as a storage medium such as a memory card, USB stick, CD-ROM or DVD, or in the form of a downloadable file. This can be done, for example, in a wireless communication network by transferring a corresponding file with the computer program product or the computer program means.
  • a driver assistance system for a vehicle comprises a receiving unit for receiving a driving condition sensor signal indicative of a driving condition of the vehicle at a number of different points in time and for receiving a number of sensor signals indicative of an area surrounding the vehicle at a number of different points in time, a detection unit for detecting a number of objects in the environment of the vehicle as a function of a first number of sensor signals received at a first point in time, a determination unit for determining a position and a movement vector for a detected object of the number as a function of the first number of sensor signals and a second number of sensor signals, which was received at a second point in time following the first point in time, using a plurality of different determination methods, wherein different determination methods of the plurality have a different computing effort, and an output unit for outputting a warning signal if a potential collision of the vehicle with the detected object is determined on the basis of the driving condition sensor signal received at a specific point in time and the position determined for the detected object and its motion vector.
  • the embodiments and features described for the proposed method apply accordingly to the proposed driver assistance system.
  • the advantages and/or definitions that were mentioned in relation to the method according to the first aspect also apply to the proposed driver assistance system.
  • the driver assistance system is operated in particular with the method according to the first aspect or one of the specific embodiments of the method.
  • the respective unit of the driver assistance system can be implemented in terms of hardware and/or software.
  • the respective unit can be embodied, for example, as a computer or as a microprocessor.
  • the respective unit can be designed as a computer program product, as a function, as a routine, as an algorithm, as part of a program code or as an executable object.
  • each of the units mentioned here can also be designed as part of a higher-level control system of the vehicle, such as a central electronic control device and/or an engine control unit (ECU: Engine Control Unit).
  • the driver assistance system can be set up in particular for semi-autonomous or fully autonomous driving of the vehicle.
  • Partially autonomous driving is understood to mean, for example, that the driver assistance system controls a steering device and/or an automatic gear system.
  • Fully autonomous driving means, for example, that the driver assistance system also controls a drive device and a braking device.
  • the vehicle is, for example, a passenger car or a truck.
  • the vehicle preferably includes a number of sensor units that are set up to detect the driving state of the vehicle and to detect an environment of the vehicle.
  • sensor units of the vehicle are, for example, imaging devices such as a camera, a radar (radio detection and ranging) or a lidar (light detection and ranging), as well as ultrasonic sensors, location sensors, wheel angle sensors and/or wheel speed sensors.
  • the sensor units are each set up to output a sensor signal, for example to the driver assistance system, which carries out the partially autonomous or fully autonomous driving as a function of the detected sensor signals.
  • the environmental sensor units exclusively include ultrasonic sensors.
  • the vehicle has a mass of more than 2.5 tons and/or a length of more than 5 meters.
  • the vehicle is designed as a transporter.
  • Vans, for example, offer poor all-round visibility and can have one or more "blind spots" that a driver of the van can see poorly or not at all.
  • a blind spot is, for example, on the passenger side behind the A-pillar or on the driver and/or passenger side close to the vehicle.
  • the method has the advantage that a warning can be given in the event of an imminent collision with an object that is in an area close to the vehicle that is difficult or impossible for the driver to see.
  • FIG. 1 shows a schematic view of an exemplary embodiment of a vehicle
  • Fig. 2 shows a schematic view of different scanning areas
  • FIG. 3 shows a schematic view of a first traffic situation
  • FIG. 4 shows a schematic view of a second traffic situation
  • FIG. 5 shows a schematic view of a third traffic situation at different points in time
  • FIG. 6 shows a schematic block diagram of an exemplary embodiment of a driver assistance system
  • FIG. 7 shows a schematic block diagram of an exemplary embodiment of a method for operating a driver assistance system.
  • FIG. 1 shows a schematic view of a vehicle 100 from a bird's eye view.
  • the vehicle 100 is, for example, a car that is arranged in an environment 200.
  • Car 100 has a driver assistance system 110, which is embodied as a control unit, for example.
  • a plurality of environment sensor devices 120, 130 are arranged on car 100, these being optical sensors 120 and ultrasonic sensors 130, for example.
  • the optical sensors 120 include, for example, visual cameras, a radar and/or a lidar.
  • the optical sensors 120 can each capture an image of a respective area from the environment 200 of the car 100 and output it as an optical sensor signal.
  • the ultrasonic sensors 130 are set up in particular for scanning a respective area 131 - 136 (see FIG. 2) of the environment 200. On the basis of the sensor signals output by the ultrasonic sensors 130, objects 210, 211 (see FIGS. 2 - 5) in the environment 200 can be detected.
  • a motion vector VEC (see FIGS. 3-5) of an object 210, 211 can be determined from sensor signals that follow one another in time.
  • driver assistance system 110 may be able to drive car 100 partially or fully autonomously.
  • the vehicle 100 has various other sensor devices. Examples of this are a microphone, an acceleration sensor, a wheel speed sensor, a wheel angle sensor, a steering angle sensor, an antenna with a coupled receiver for receiving electromagnetically transmittable data signals, and the like.
  • Driver assistance system 110 is designed, for example, as explained with reference to FIG. 6 and set up to carry out the method described with reference to FIG. 7 .
  • Driver assistance system 110 is preferably also set up to carry out processing procedures as described below with reference to FIGS. 2-5.
  • FIG. 2 shows a schematic view of different scanning areas 131-136 of different ultrasonic sensors 130.
  • six ultrasonic sensors 130 are arranged on a front sill of a vehicle 100.
  • Vehicle 100 is designed, for example, as described with reference to FIG. 1 .
  • Each sensor 130 has a specific scanning range 131-136.
  • the shape of a respective scanning area 131 - 136 depends on the arrangement and alignment of ultrasonic sensor 130 on vehicle 100, but also on the structure of ultrasonic sensor 130.
  • the scanning areas 131 - 136 can at least partially overlap, so that the immediate surroundings 200 in front of the sill of vehicle 100 can preferably be detected without any gaps.
  • the range of a respective ultrasonic sensor 130 depends on its structure and lies, for example, in a range between five meters and ten meters.
  • a virtual ultrasonic sensor is based, for example, on a first ultrasonic sensor 130 emitting an ultrasonic signal and a second ultrasonic sensor 130 receiving a reflection of the ultrasonic signal emitted by the first ultrasonic sensor.
  • a virtual ultrasonic sensor has, for example, a virtual position between two ultrasonic sensors 130 that are physically present.
  • a first object 210, for example a cyclist, is located in the scanning areas 135, 136 of two ultrasonic sensors 130. The cyclist 210 is therefore detected by two ultrasonic sensors in particular. In addition, the cyclist can be detected by a virtual ultrasonic sensor as described above.
  • a second object 211, for example a pedestrian, is located in the scanning area 132 of a single ultrasonic sensor 130. However, the pedestrian 211 can also be detected by a virtual ultrasonic sensor as described above.
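Where an object lies in the scanning areas of two sensors (two real ones, or a real and a virtual cross-echo sensor as described above), its position can be estimated by intersecting the two range circles. The following sketch performs this lateration; the vehicle frame (x pointing ahead of the bumper) and the choice of the forward intersection are assumptions.

```python
import numpy as np

def laterate(p1, r1, p2, r2):
    """Intersect the range circles of two sensors at p1/p2 with measured ranges
    r1/r2; return the intersection ahead of the bumper, or None if none exists."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = np.linalg.norm(p2 - p1)
    if d == 0.0 or d > r1 + r2 or d < abs(r1 - r2):
        return None                                 # circles do not intersect
    a = (r1**2 - r2**2 + d**2) / (2.0 * d)          # foot point along the baseline
    h = np.sqrt(max(r1**2 - a**2, 0.0))             # offset perpendicular to it
    base = p1 + a * (p2 - p1) / d
    perp = np.array([-(p2 - p1)[1], (p2 - p1)[0]]) / d
    candidates = [base + h * perp, base - h * perp]
    return max(candidates, key=lambda p: p[0])      # x points ahead of the vehicle

# Two sensors 0.4 m apart on the front sill, both seeing the cyclist at ~2 m:
print(laterate([0.0, 0.2], 2.0, [0.0, -0.2], 2.1))
```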
  • each detected object 210, 211 is assigned a Kalman filter, which is initialized accordingly.
  • a respective Kalman filter is set up to estimate the state of the respective object 210, 211 on the basis of the ultrasonic sensor signals SIG1(t) received one after the other.
  • the estimated state includes in particular the position POS and the movement vector VEC of the respective object 210, 211.
  • each Kalman filter is supplied with the ultrasonic sensor signals SIG1(t) received from those ultrasonic sensors 130 in whose scanning area 131 - 136 the respective object 210, 211 is currently located. A precise and consistent result and exact tracking of the objects 210, 211 are thus possible.
  • a second determination method V2 can provide that a feature recognition is carried out on the basis of the number of sensor signals SIG1(t) received at a particular point in time t0 - t5 (see FIG. 5) and a digital environment map is determined from recognized features.
  • FIG. 3 shows a schematic view of a first traffic situation, in which, for example, vehicle 100 from FIG. 1 or FIG. 2 is shown on a road.
  • An object 210, for example a pedestrian, is shown on the right in front of vehicle 100.
  • the driving path TR for the vehicle 100 is also shown.
  • Driving path TR is determined, for example, by driver assistance system 110 (see FIG. 1 or 6) on the basis of a driving condition sensor signal SIG0(t) (see FIG. 6 or 7), which includes a current steering angle or a current wheel angle.
  • the ultrasonic sensors 130 preferably constantly emit ultrasonic signals and detect the reflected signals, i.e. they constantly scan their respective scanning area 131 - 136 (see FIG. 2) with ultrasonic signals.
  • the sampling occurs 10 times per second, preferably at least 50 times per second, preferably at least 100 times per second.
  • Ultrasonic sensors 130 emit ultrasonic sensor signals SIG1(t) (see FIG. 6 or 7) with a corresponding frequency, for example to driver assistance system 110. Based on the ultrasonic sensor signals, a position POS (see FIG. 7) of pedestrian 210 can be inferred.
  • a movement vector VEC for the pedestrian 210 can also be determined on the basis of at least two ultrasonic sensor signals SIG1(t) recorded one after the other. This is done, for example, as described with reference to FIG. 2 using a first determination method V1.
  • pedestrian 210 is moving toward the driving path TR of vehicle 100.
  • the current distance D of the pedestrian 210 from the driving path TR is also shown.
  • Driver assistance system 110 is set up to output a warning signal as a function of predetermined criteria. For example, it is checked whether the distance D of the pedestrian 210 from the current driving path TR (alternatively from the vehicle 100) is less than or equal to a predetermined threshold value, or whether the determined movement vector VEC points in the direction of the driving path TR or the vehicle 100. If one or more of these criteria are met, then the warning signal is issued since a collision with the pedestrian 210 is then likely unless the vehicle 100 is stopped or changes direction.
  • FIG. 4 shows a schematic view of a second traffic situation, in which, for example, vehicle 100 from FIG. 1 or FIG. 2 is shown on a road. An object 210, for example a pedestrian, is shown on the right in front of vehicle 100.
  • the driving path TR for the vehicle 100 is also shown.
  • Driving path TR is determined, for example, by driver assistance system 110 (see FIG. 1 or 6) on the basis of a driving condition sensor signal SIG0(t) (see FIG. 6 or 7), which includes a current steering angle or a current wheel angle.
  • a position POS (see FIG. 7) and a movement vector VEC of the pedestrian 210 are determined on the basis of ultrasonic sensor signals SIG1(t) (see FIG. 6 or 7).
  • a future trajectory TR1 of pedestrian 210 is determined in this example.
  • the previous trajectory of the pedestrian 210 is extrapolated for this purpose.
  • the future trajectory TR1 can be determined on the basis of a specific embodiment of the first determination method V1, i.e. using Kalman filters. Additionally and/or alternatively, the future trajectory TR1 can be determined on the basis of a third determination method.
  • a smallest distance between the driving path TR and the future trajectory TR1 can be determined. If this distance D is less than a predetermined minimum distance, a warning signal is output, for example.
  • FIG. 5 shows a schematic view of a third traffic situation at different points in time t0 - t5, vehicle 100 of FIG. 1 or FIG. 2 being shown on a road, for example.
  • An object 210 is detected to the right of vehicle 100 at an initial time t0. This takes place in particular on the basis of a number of sensor signals SIG1(t) received at the start time t0 (see FIG. 6 or 7). At a subsequent first point in time t1, a second number of sensor signals SIG1(t) are received. A current position POS (see FIG. 7) of the object 210(t1) is determined on the basis of the second number of sensor signals.
  • likewise, a current motion vector VEC(t1) for time t1 is determined.
  • at a subsequent point in time t2, a third number of sensor signals SIG1(t) are received, and a current position POS of the object 210(t2) at time t2 and a current motion vector VEC(t2) at time t2 are determined.
  • at a subsequent point in time t3, a fourth number of sensor signals SIG1(t) are received, and a current position POS of the object 210(t3) at time t3 and a current motion vector VEC(t3) at time t3 are determined.
  • at a subsequent point in time t4, a fifth number of sensor signals SIG1(t) are received, and a current position POS of the object 210(t4) at time t4 and a current motion vector VEC(t4) at time t4 are determined.
  • at a subsequent point in time t5, a sixth number of sensor signals SIG1(t) are received, and a current position POS of the object 210(t5) at time t5 and a current motion vector VEC(t5) at time t5 are determined.
  • the movement of the object 210 can thus be tracked at any point in time t0 - t5.
  • the movement of the object 210 can additionally be predicted, for example using corresponding equations of motion.
  • the determination of the position POS and the motion vector VEC at a respective point in time t0 - t5 is preferably carried out on the basis of the first determination method V1 using a Kalman filter and on the basis of a further determination method V2 (see FIG. 6 or 7).
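Tying this together, tracking across the points in time t0 to t5 could look like the following loop, which reuses the ObjectKalman sketch from further above; the synthetic measurements are invented purely for illustration.

```python
import numpy as np

# Synthetic walk-through over the times t0 ... t5 (see FIG. 5), reusing the
# ObjectKalman sketch from further above; the measurements are made up here.
truth = np.array([[5.0, 2.0 - 0.3 * k] for k in range(6)])   # object drifting left
noisy = truth + np.random.default_rng(0).normal(0.0, 0.2, truth.shape)

tracker = ObjectKalman(pos_xy=noisy[0])                      # assigned at time t0
for k, measurement in enumerate(noisy):
    pos, vec = tracker.step(measurement)                     # POS and VEC at time tk
    print(f"t{k}: POS={pos.round(2)}, VEC={vec.round(2)}")
```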
  • Fig. 6 shows a schematic block diagram of an exemplary embodiment of a driver assistance system 110, for example driver assistance system 110 of vehicle 100 in Fig. 1.
  • the driver assistance system 110 includes a receiving unit 112 for receiving a driving condition sensor signal SIG0(t) indicative of a driving condition of the vehicle 100 at a number of different points in time t0 - t5 (see FIG. 5), and for receiving a number of sensor signals SIG1(t) indicative of an environment 200 (see FIG. 1 or 2) of the vehicle 100 at a number of different points in time t0 - t5.
  • the driver assistance system 110 further includes a detection unit 114 for detecting a number of objects 210, 211 (see FIG. 2) in the environment 200 of the vehicle 100 as a function of a first number of sensor signals SIG1(t), which were received at a first point in time, a determination unit 116 for determining a position POS (see FIG. 7) and a motion vector VEC (see FIGS. 3 - 5) for a detected object 210, 211 as a function of the first number of sensor signals SIG1(t) and a second number of sensor signals SIG1(t), which was received at a second point in time following the first point in time, using a plurality of different determination methods V1, V2, where different determination methods of the plurality have a different computing effort, and an output unit 118 for outputting a warning signal if a potential collision of vehicle 100 with the detected object 210, 211 is determined on the basis of the driving condition sensor signal SIG0(t) received at a specific point in time t0 - t5 and the position POS determined for the detected object 210, 211 and its motion vector VEC.
  • Fig. 7 shows a schematic block diagram of an exemplary embodiment of a method for operating a driver assistance system 110, for example the driver assistance system 110 of Fig. 6 or the driver assistance system 110 of the vehicle 100 of Fig. 1.
  • in a first step S1, a driving condition sensor signal SIG0(t) indicative of a driving state of the vehicle 100 is received at a number of different points in time t0 - t5 (see FIG. 5).
  • in a second step S2, a number of sensor signals SIG1(t) indicative of an environment 200 (see FIG. 1 or 2) of vehicle 100 is received at a number of different points in time t0 - t5.
  • in a third step S3, a number of objects 210, 211 (see FIG. 2) in the environment 200 of the vehicle 100 are detected as a function of a first number of sensor signals SIG1(t) which were received at a first point in time.
  • in a fourth step S4, a position POS and a movement vector VEC for a detected object 210, 211 are determined as a function of the first number of sensor signals SIG1(t) and a second number of sensor signals SIG1(t), which was received at a second point in time following the first point in time, using a plurality of different determination methods V1, V2, with different determination methods V1, V2 of the plurality having a different computing effort.
  • in a fifth step S5, a warning signal is output if a potential collision of the vehicle 100 with the detected object 210 is determined on the basis of the driving condition sensor signal SIG0(t) received at a specific point in time t0 - t5 and the position POS determined for the detected object 210 and its movement vector VEC.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Acoustics & Sound (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The invention relates to a method for operating a driver assistance system (110). The method comprises the steps of: a) receiving (S1) a driving condition sensor signal (SIG0(t)), indicative of the driving condition, at a number of different points in time (t0 - t5), b) receiving (S2) a number of sensor signals (SIG1(t)), indicative of the surroundings (200), at a number of different points in time (t0 - t5), c) detecting (S3) a number of objects (210, 211) in the surroundings (200) on the basis of a first number of sensor signals (SIG1(t)) which were detected at a first point in time, d) determining (S4) a position (POS) and a motion vector (VEC) for a detected object (210, 211) on the basis of the first number of sensor signals (SIG1(t)) and a second number of sensor signals (SIG1(t)) which were received at a second point in time following the first point in time, using a plurality of different determination methods (V1, V2), wherein different determination methods (V1, V2) of the plurality have a different computing effort, and e) outputting (S5) a warning signal if a potential collision with the detected object (210, 211) is determined on the basis of the driving condition sensor signal (SIG0(t)) received at a specified point in time and the position (POS) and motion vector (VEC) determined for the detected object (210, 211).
EP22718982.6A 2021-03-30 2022-03-30 Procédé pour le fonctionnement d'un système d'assistance au conducteur, produit programme informatique, système d'assistance au conducteur et véhicule Pending EP4314895A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021107972.1A DE102021107972A1 (de) 2021-03-30 2021-03-30 Verfahren zum betreiben eines fahrassistenzsystems, computerprogrammprodukt, fahrassistenzsystem und fahrzeug
PCT/EP2022/058356 WO2022207679A1 (fr) 2021-03-30 2022-03-30 Procédé pour le fonctionnement d'un système d'assistance au conducteur, produit programme informatique, système d'assistance au conducteur et véhicule

Publications (1)

Publication Number Publication Date
EP4314895A1 true EP4314895A1 (fr) 2024-02-07

Family

ID=81387319

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22718982.6A Pending EP4314895A1 (fr) 2021-03-30 2022-03-30 Procédé pour le fonctionnement d'un système d'assistance au conducteur, produit programme informatique, système d'assistance au conducteur et véhicule

Country Status (7)

Country Link
US (1) US20240194077A1 (fr)
EP (1) EP4314895A1 (fr)
JP (1) JP2024511664A (fr)
KR (1) KR20230164145A (fr)
CN (1) CN117157555A (fr)
DE (1) DE102021107972A1 (fr)
WO (1) WO2022207679A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102023201371A1 (de) 2023-02-17 2024-08-22 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Abstandswarnung für ein Fahrrad

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0952459B1 (fr) 1998-04-23 2011-05-25 Volkswagen Aktiengesellschaft Dispositif de détection d'objets pour véhicules
DE10257842A1 (de) 2002-05-07 2003-11-27 Bosch Gmbh Robert Verfahren zur Bestimmung einer Unfallgefahr eines ersten Objekts mit wenigstens einem zweiten Objekt
DE102006045418A1 (de) 2006-09-26 2008-04-10 GM Global Technology Operations, Inc., Detroit Kraftfahrzeug mit einem Parkassistenzsystem
DE102012214547A1 (de) * 2012-08-16 2014-02-20 Robert Bosch Gmbh Verfahren zur Überwachung eines toten Winkels und Fahrassistenzsystem
DE102017213353A1 (de) 2017-08-02 2019-02-07 Bayerische Motoren Werke Aktiengesellschaft Verifikation einer geplanten Trajektorie eines automatisierten Kraftfahrzeugs
US11789449B2 (en) * 2018-02-09 2023-10-17 Nvidia Corporation Controlling autonomous vehicles using safe arrival times

Also Published As

Publication number Publication date
JP2024511664A (ja) 2024-03-14
CN117157555A (zh) 2023-12-01
DE102021107972A1 (de) 2022-10-06
WO2022207679A1 (fr) 2022-10-06
US20240194077A1 (en) 2024-06-13
KR20230164145A (ko) 2023-12-01

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230906

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)