EP3283343A1 - Object tracking before and during an impact - Google Patents

Object tracking before and during an impact

Info

Publication number
EP3283343A1
Authority
EP
European Patent Office
Prior art keywords
motor vehicle
phase
collision
tracking
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP16710939.6A
Other languages
German (de)
French (fr)
Other versions
EP3283343B1 (en)
Inventor
Sybille Eisele
Ulf Wilhelm
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP3283343A1 publication Critical patent/EP3283343A1/en
Application granted granted Critical
Publication of EP3283343B1 publication Critical patent/EP3283343B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W2030/082Vehicle operation after collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205Diagnosing or detecting failures; Failure detection models
    • B60W2050/0215Sensor drifts or sensor failures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects

Definitions

  • the invention relates to an environment detection system and a crash detection system in a vehicle.
  • a motor vehicle is equipped with a driver assistance system or an automatic control that assists a driver in driving the motor vehicle.
  • a longitudinal and a lateral control of the motor vehicle are supported individually or jointly.
  • For control purposes, the environment of the motor vehicle is scanned, for example optically and/or by radar, and objects and the course of their relative movements in the environment of the motor vehicle, i.e. their trajectories, are determined from the scanned information. This process is also called tracking.
  • in order to be able to determine the trajectories reliably, the information is usually collected at fixed time intervals and checked for plausibility by means of a movement model.
  • the movement model of the motor vehicle is based on assumptions, for example a maximum acceleration or a maximum yaw rate, which are expected in normal operation of the motor vehicle.
  • the invention has for its object to provide an improved technique for detecting the environment of a motor vehicle in the event of a first collision.
  • the invention achieves this object by means of a method having the features of the independent claim. The dependent claims set out preferred embodiments.
  • a method comprises steps of detecting an object in an environment of a motor vehicle, tracking the object with respect to the motor vehicle in a first phase by means of a first movement model of the motor vehicle, detecting a collision of the motor vehicle with an obstacle, and tracking the object with respect to the motor vehicle in a second phase by means of a second movement model of the motor vehicle.
  • detection data of the object may be continuously used to track the object with respect to the motor vehicle.
  • the relative movement between the object and the motor vehicle can thus also be determined in the event of an accident.
  • the movement models usually differ by maximum speed and acceleration values and / or maximum yaw rates of the motor vehicle.
  • the second movement model may allow significantly higher acceleration values than the first movement model.
  • in the second phase, the object is preferably also tracked on the basis of detection data of the object from the first phase.
  • the tracking of the object or its trajectory with respect to the motor vehicle can be checked for plausibility in particular on the basis of detection data of both phases. The object or its trajectory can thus be tracked more reliably and with fewer gaps.
  • the end of the collision is further determined and, in a third phase, the object or its trajectory with respect to the motor vehicle is tracked by means of the first movement model of the motor vehicle.
  • the object in the third phase is also tracked on the basis of detection data of the object in the second phase.
  • in addition or as an alternative to the detection data from the second phase, detection data of the object from the first phase can be used.
  • the aim is to use temporally spaced detection data of the object from several phases as completely as possible. The plausibility check of the detection data, and thereby the tracking of the object or the course of its relative movement with respect to the motor vehicle, can thus be improved.
  • a movement of the motor vehicle in the third phase is controlled on the basis of the tracked object.
  • the movement of the motor vehicle can be controlled such that the motor vehicle leaves a danger zone in the region of the collision.
  • Other controls can also be performed. For example, it can be determined whether another collision is imminent and measures can be taken to avoid the second collision or to mitigate its consequences.
  • for example, an active or passive safety system for occupants of the motor vehicle can be reactivated.
  • it can be determined from which direction the obstacle collides with the motor vehicle, wherein in the second phase the object is tracked only on the basis of sensor data whose associated sensors face away from the direction of the collision. The second phase is usually too short to perform a plausibility check of sensor or detection data.
  • the crash is determined based on data from an acceleration sensor.
  • the acceleration sensor can be mounted centrally or decentrally on the motor vehicle. Several acceleration sensors can also be used.
  • an acceleration caused by the collision and an acceleration direction are included in the tracking of the object in the second phase.
  • the tracking of the object or its trajectory can thus be carried out even under the difficult measurement and processing conditions during the collision.
  • data from an upfront sensor are included in the tracking of the object in the second phase.
  • upfront sensors are usually mounted at the front of the vehicle in the direction of travel and can be used, for example, to determine the course and severity of a frontal impact at an early stage.
  • a frontal collision with partial overlap can be detected by means of several upfront sensors.
  • in the second and third phases, when a frontal collision with partial overlap is detected, one or more environment detection sensors in the area of the affected overlap may no longer be evaluated.
  • data of a peripheral sensor, a roll rate sensor or a yaw rate sensor may be included in the tracking of the object in the second phase.
  • characteristic data can thereby be used which allow improved further tracking of the movement of the object.
  • a computer program product comprises program code means for carrying out the described method when the computer program product runs on a processing device or is stored on a computer-readable data carrier.
  • FIG. 1 shows a device on board a motor vehicle
  • FIG. 2 shows phases of a collision of the motor vehicle of FIG. 1 with an object
  • FIG. 3 shows a flow diagram of a method for controlling the motor vehicle of FIG. 1.
  • FIG. 1 shows a motor vehicle 100 with a device 105.
  • the device 105 implements a driver assistance system or an automatic vehicle control, wherein the device 105 may comprise components which are also associated with another system on board the motor vehicle 100, for example a parking or navigation system.
  • the device 105 is configured to determine an object or the course of its relative movement in an environment with respect to the motor vehicle.
  • the device 105 comprises a processing device 115, which is connected to at least one sensor 120 for scanning an object 125 in the environment 110 of the motor vehicle 100.
  • the object 125 may be an object that is immovable relative to the environment, such as a street post, or a moving object, such as another motor vehicle or a pedestrian. Although typically multiple objects 125 are detected, the technique presented below is described primarily with reference to only one exemplary object 125.
  • a plurality of sensors 120 can be used which can scan different regions of the environment 110 and / or can be constructed differently.
  • one or more camera or radar sensors 130 may be provided for scanning the object 125.
  • These sensors 120 may be mounted at a central location of the motor vehicle 100 or in the region of an outline of the motor vehicle 100.
  • further sensors may be provided to further determine the movement behavior of the motor vehicle 100.
  • these may include an inertial sensor, which determines an acceleration in one or more spatial directions, or a rotation rate sensor for one or more spatial axes.
  • for the second phase, crash sensor data can be used to determine the crash movement model.
  • a central acceleration sensor 135 and / or one or more upfront sensors 140 and / or one or more peripheral sensors 141, 142 may be used to determine the time of the crash or the crash direction. If at least two upfront sensors 140, for example in the front or in the rear of the motor vehicle 100, are installed in a vehicle, a partial overlap of the motor vehicle 100 with the object 125 during the collision can also be detected. A lateral collision can be detected by means of one or more peripheral sensors and / or a central acceleration sensor.
  • as peripheral sensors, a pressure sensor 141, which may be installed, for example, in a vehicle door, and/or an acceleration sensor, which may be installed, for example, in a door sill or in a B-pillar and/or in a C-pillar, can be evaluated.
  • a vehicle rollover can be detected by additionally evaluating a roll rate sensor 143, which determines a rotation rate about the longitudinal axis of the motor vehicle 100 (roll rate).
  • a yaw rate sensor 144 may also be provided to determine a rotational speed of the motor vehicle 100 about the vertical axis (yaw rate).
  • the acceleration sensor 135, the roll rate sensor 143 and / or the yaw rate sensor 144 are preferably located on a longitudinal axis of the motor vehicle 100.
  • Several sensors 130-144 can also be designed to be integrated with one another, for example in the form of a multi-channel acceleration sensor.
  • the processing device 115 is configured to provide movement information about the object 125 at an interface 150.
  • a driver assistance system or an automatic vehicle control can use data to further control the motor vehicle 100, for example by activating a safety system or also by controlling the movement of the motor vehicle 100.
  • the motor vehicle 100 can be driven to leave a danger zone in the environment 110. This danger zone may include the object 125.
  • in another embodiment, a safety zone can be targeted, which may include, for example, a hard shoulder or breakdown lane of a road.
  • a longitudinal or lateral control of the motor vehicle 100 can be actively influenced.
  • a subsequent accident involving the motor vehicle 100 can thus be avoided.
  • the sensors 120 are typically scanned several times within a predetermined period of time, so that temporally coordinated measurements are available. These measurements are checked for plausibility with respect to a movement model of the motor vehicle 100.
  • this movement model may in particular include maximum speed or acceleration values for the motor vehicle 100. Samples indicating, for example, an acceleration of the motor vehicle 100 with respect to the object 125 that exceeds the limits of the movement model may be discarded. It is proposed to determine a collision between the motor vehicle 100 and the obstacle 145 and to use a modified movement model during the collision to evaluate the sensor data of the sensors 120. The relative movement of the motor vehicle 100 with respect to the object 125 can thus be determined correctly even during the collision, in particular with the aid of sensor values from before the collision.
  • the obstacle 145 may also be considered an object 125 prior to collision, and its movement relative to the motor vehicle 100 tracked. In one embodiment, the tracking of the obstacle 145 as an object 125 may be maintained even after the crash; however, in another embodiment, after the collision, the obstacle 145 is no longer considered and tracked as object 125.
  • FIG. 2 shows phases of a collision of the motor vehicle 100 of FIG. 1 with the obstacle 145.
  • An exemplary head-on collision is shown, although the technique presented here can also be used with any other type of collision, such as a rear collision, an offset collision or a side collision.
  • FIG. 2 a shows a first phase in which the motor vehicle 100 is in motion with respect to the obstacle 145 in normal operation.
  • the movement of the motor vehicle 100 can be determined by means of a first movement model.
  • FIG. 2b shows the transition from the first to a second phase, during which the motor vehicle 100 collides with the obstacle 145.
  • during the second phase, acceleration values, for example in the longitudinal, transverse or vertical direction, or rotation rates about a longitudinal or vertical axis of the motor vehicle 100, may exceed the limits of the first movement model.
  • Such acceleration values or rates of rotation may occur, for example, during a collision, a spin, a rollover or another accident sequence.
  • Figure 2c shows the motor vehicle 100 in the second phase during the collision with the obstacle 145.
  • the forward speed of the motor vehicle 100 is rapidly reduced, while the motor vehicle 100 is accelerated sharply around the vertical axis. It is preferred during the second phase to determine the movement of the motor vehicle 100 with respect to a second movement model, which in particular allows acceleration values of the motor vehicle 100, such as may occur during such maneuvers.
  • FIG. 2 d shows the motor vehicle 100 in a third phase, which can follow the second phase.
  • the collision with the obstacle 145 is finished and the movement determination can be performed again on the basis of the first movement model.
  • a control of the motor vehicle 100 can be carried out, in particular in order to move the motor vehicle into a safe orientation or to a safe position.
  • the control can be carried out by means of a driver assistance system or an automatic vehicle control.
  • FIG. 3 shows a flow diagram of a method 300 for controlling the motor vehicle 100 from FIG. 1.
  • in a first step 305, data of the sensors 120 are recorded at spaced time intervals.
  • the obstacle 145 is tracked in its movement with respect to the motor vehicle 100, wherein a first movement model of the motor vehicle 100 is used.
  • the detected movement may be provided to a driver assistance system or an automatic vehicle control, which in particular performs a longitudinal or lateral control of the motor vehicle 100.
  • the driver assistance system or the automatic vehicle control may, for example, be integrated into the processing device 115 of FIG. 1.
  • in a step 310, it is determined whether an accident or a collision of the motor vehicle 100 with the obstacle 145 has occurred. This determination can be made in particular on the basis of the sensors 135 and/or 140 and/or 141 and/or 142 and/or 143. In a refinement, it can be determined how severe the collision is, in order to deduce whether the motor vehicle 100 remains movable or controllable after the collision has ended. If no collision has been detected, method 300 may return to step 305 and run through again.
  • the movement model underlying the tracking of the obstacle 145 with respect to the motor vehicle 100 is changed.
  • the movement model or the movement between the motor vehicle 100 and the obstacle 145 can be determined in particular on the basis of a direction from which the obstacle 145 acts on the motor vehicle 100 and a deceleration of the motor vehicle 100.
  • yaw rate signals can also be evaluated.
  • the tracking of the object 125 can be restricted to sensor data from those sensors 120 which face away from the direction in which the obstacle 145 acts on the motor vehicle 100.
  • this avoids using signals from a sensor 140 for plausibility checking or tracking when that sensor has been affected by the collision.
  • it can also be avoided that the obstacle 145, which may perform a movement that is difficult to determine due to the collision, continues to be tracked as object 125.
  • a position and / or a direction of the motor vehicle 100 after the collision with the obstacle 145 can be determined.
  • the tracking of the object 125 on the basis of the data of the sensors 120 does not have to be checked for plausibility over several measuring cycles for this purpose.
  • the end of the collision may be determined, in particular on the basis of acceleration data, for example from the acceleration sensor 135 and/or the upfront sensors 140 and/or the peripheral sensors 141, 142 and/or the roll rate sensor 143. At the end of the collision, the first movement model can be activated again in order to continue determining the tracking between the motor vehicle 100 and the object 125.
  • in a step 335, it may be determined whether the motor vehicle 100 is in a danger zone or whether there is a risk of a subsequent collision.
  • the motor vehicle 100 may be controlled to assume a safer position in the environment 110.
  • functions of the above-mentioned driver assistance system and / or the automatic vehicle control can be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)

Abstract

The invention relates to a method comprising steps of detecting an object in an environment of a motor vehicle, tracking the object in relation to the motor vehicle by means of a first motion model of the motor vehicle in a first phase, detecting an impact of the motor vehicle with an obstacle, and tracking the object in relation to the motor vehicle by means of a second motion model of the motor vehicle in a second phase.

Description

Description
Object tracking before and during a collision
The invention relates to an environment detection system and a crash detection system in a vehicle.
State of the art
A motor vehicle is equipped with a driver assistance system or an automatic control that assists a driver in driving the motor vehicle. In particular, longitudinal and lateral control of the motor vehicle are supported individually or jointly. For this control, the environment of the motor vehicle is scanned, for example optically and/or by radar, and objects and the course of their relative movements in the environment of the motor vehicle, i.e. their trajectories, are determined from the scanned information. This process is also called tracking. In order to be able to determine the trajectories reliably, the information is usually collected at fixed time intervals and checked for plausibility by means of a movement model. The movement model of the motor vehicle is based on assumptions, for example a maximum acceleration or a maximum yaw rate, which are expected in normal operation of the motor vehicle.
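As an illustration of this kind of plausibility check, the following minimal Python sketch gates object measurements against assumed motion-model limits; the class name, the numeric limits and the measurement interface are illustrative assumptions and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class MotionModel:
    """Assumed dynamic limits of the ego vehicle (illustrative values, not from the patent)."""
    max_accel_mps2: float    # maximum acceleration magnitude expected under this model
    max_yaw_rate_rps: float  # maximum yaw rate expected under this model

# First movement model: limits expected in normal operation of the motor vehicle.
NORMAL_MODEL = MotionModel(max_accel_mps2=12.0, max_yaw_rate_rps=1.0)

def is_plausible(rel_accel_mps2: float, yaw_rate_rps: float, model: MotionModel) -> bool:
    """Accept a measurement only if the implied ego-vehicle dynamics stay within the model."""
    return (abs(rel_accel_mps2) <= model.max_accel_mps2
            and abs(yaw_rate_rps) <= model.max_yaw_rate_rps)

# A sample implying 80 m/s^2 of relative acceleration is discarded under the normal model:
print(is_plausible(rel_accel_mps2=80.0, yaw_rate_rps=0.2, model=NORMAL_MODEL))  # False
```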
If the motor vehicle suffers an accident, in particular a collision, the driver assistance system is usually switched off immediately. The accelerations occurring during a collision frequently exceed the accelerations provided for in the movement model by a multiple, so that the plausibility check of the trajectories fails during the accident. In order to be able to determine the tracking reliably again after the accident, the environment of the motor vehicle usually has to be scanned in several scanning steps, and a plausibility check usually requires data collected over a period of several seconds or even several minutes. The trajectories can therefore be determined only poorly or not at all even after the end of the accident. The object of the invention is to provide an improved technique for detecting the environment of a motor vehicle in the event of a first collision. The invention achieves this object by means of a method having the features of the independent claim. The dependent claims set out preferred embodiments.
Disclosure of the invention
A method comprises steps of detecting an object in an environment of a motor vehicle, tracking the object with respect to the motor vehicle in a first phase by means of a first movement model of the motor vehicle, detecting a collision of the motor vehicle with an obstacle, and tracking the object with respect to the motor vehicle in a second phase by means of a second movement model of the motor vehicle.
By applying a different movement model during the collision, detection data of the object can be used continuously to track the object with respect to the motor vehicle. The relative movement between the object and the motor vehicle can thus also be determined in the event of an accident. The movement models usually differ in maximum speed and acceleration values and/or maximum yaw rates of the motor vehicle. The second movement model can, for example, permit significantly higher acceleration values than the first movement model.
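A minimal sketch of how the two movement models could be represented and switched is given below; the parameter names and numeric values are assumptions chosen only to illustrate that the second (crash) model tolerates far larger accelerations and rotation rates than the first.

```python
# Illustrative parameter sets; the patent does not specify numeric limits.
FIRST_MODEL = {"max_accel_mps2": 12.0, "max_yaw_rate_rps": 1.0}     # normal operation
SECOND_MODEL = {"max_accel_mps2": 600.0, "max_yaw_rate_rps": 15.0}  # during the collision

def active_model(collision_in_progress: bool) -> dict:
    """Select the movement model used for the plausibility check in the current phase."""
    return SECOND_MODEL if collision_in_progress else FIRST_MODEL

print(active_model(True)["max_accel_mps2"])  # 600.0
```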
Preferably, in the second phase the object is also tracked on the basis of detection data of the object from the first phase. The tracking of the object or its trajectory with respect to the motor vehicle can in particular be checked for plausibility on the basis of detection data of both phases. The object or its trajectory can thus be tracked more reliably and with fewer gaps.
In a further embodiment, the end of the collision is also determined and, in a third phase, the object or its trajectory with respect to the motor vehicle is tracked further by means of the first movement model of the motor vehicle. By switching back to the usual processing after the end of the collision, the detection data of the object can be checked for plausibility in an improved manner. The object can thus be tracked better or more accurately.
It is particularly preferred that in the third phase the object is also tracked on the basis of detection data of the object from the second phase. In addition or as an alternative to the detection data of the object from the second phase, detection data of the object from the first phase can be used. The aim is to use temporally spaced detection data of the object from several phases as completely as possible. The plausibility check of the detection data, and thereby the tracking of the object or the course of its relative movement with respect to the motor vehicle, can thus be improved.
It is further preferred that a movement of the motor vehicle in the third phase is controlled on the basis of the tracked object. In particular, the movement of the motor vehicle can be controlled such that the motor vehicle leaves a danger zone in the region of the collision. Further control actions can also be performed. For example, it can be determined whether a further collision is imminent, and measures can be taken to avoid the second collision or to mitigate its consequences. For example, an active or passive safety system for occupants of the motor vehicle can be reactivated.
It can be determined from which direction the obstacle collides with the motor vehicle, wherein in the second phase the object is tracked only on the basis of sensor data whose associated sensors face away from the direction of the collision. The second phase is usually too short to perform a plausibility check of sensor or detection data. By dispensing with sensor data from those sensors whose functionality has most likely been impaired by the collision, improved tracking of the object can nevertheless be carried out in the second phase.
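One conceivable way to restrict tracking to sensors facing away from the impact is sketched here under the assumption of a known sensor mounting layout; the sensor names, mounting angles and separation threshold are hypothetical.

```python
# Hypothetical sensor layout: mounting azimuths in degrees, 0 = vehicle front.
SENSORS = {"front_radar": 0.0, "left_camera": 90.0, "rear_radar": 180.0, "right_camera": 270.0}

def sensors_facing_away(impact_azimuth_deg: float, min_separation_deg: float = 90.0):
    """Keep only sensors whose mounting direction lies far enough from the impact direction."""
    def angular_diff(a: float, b: float) -> float:
        return abs((a - b + 180.0) % 360.0 - 180.0)
    return [name for name, azimuth in SENSORS.items()
            if angular_diff(azimuth, impact_azimuth_deg) > min_separation_deg]

# Frontal impact (0 degrees): only the rear-facing sensor is kept for tracking.
print(sensors_facing_away(0.0))  # ['rear_radar']
```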
In one embodiment, the collision is determined on the basis of data from an acceleration sensor. The acceleration sensor can be mounted centrally or decentrally on the motor vehicle. Several acceleration sensors can also be used.
In one variant, an acceleration caused by the collision and an acceleration direction are included in the tracking of the object in the second phase. The tracking of the object or its trajectory can thus also be carried out under the difficult measurement and processing conditions during the collision.
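The following sketch illustrates how a crash-induced ego acceleration (magnitude and direction) could enter the prediction step of the relative-motion tracking during the second phase; it assumes a simple constant-acceleration model and an object that is itself unaccelerated, neither of which is prescribed by the patent.

```python
import numpy as np

def predict_relative_state(rel_pos, rel_vel, ego_accel, dt):
    """Constant-acceleration prediction of an object's state relative to the ego vehicle.

    rel_pos, rel_vel: object position [m] and velocity [m/s] relative to the ego vehicle
    ego_accel:        crash-induced acceleration measured on the ego vehicle [m/s^2];
                      the object itself is assumed unaccelerated, so it enters the
                      relative motion with opposite sign
    dt:               prediction interval [s]
    """
    rel_pos = np.asarray(rel_pos, dtype=float)
    rel_vel = np.asarray(rel_vel, dtype=float)
    rel_accel = -np.asarray(ego_accel, dtype=float)
    new_vel = rel_vel + rel_accel * dt
    new_pos = rel_pos + rel_vel * dt + 0.5 * rel_accel * dt ** 2
    return new_pos, new_vel

# Ego vehicle decelerating hard in a frontal crash (-300 m/s^2 along x):
pos, vel = predict_relative_state([20.0, 3.0], [-15.0, 0.0], [-300.0, 0.0], dt=0.01)
print(pos, vel)
```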
In a further embodiment, data from an upfront sensor are included in the tracking of the object in the second phase. Upfront sensors are usually mounted at the front of the vehicle in the direction of travel and can be used, for example, to determine the course and severity of a frontal impact at an early stage. In addition, a frontal collision with partial overlap can be detected by means of several upfront sensors. In the second and third phases, when a frontal collision with partial overlap is detected, one or more environment detection sensors in the area of the affected overlap may no longer be evaluated.
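A rough sketch of how two upfront sensors might be compared to distinguish a full-width frontal impact from a partial-overlap impact on the left or right side; the thresholds and the peak-acceleration interface are assumptions, not values from the patent.

```python
def classify_frontal_impact(left_upfront_g: float, right_upfront_g: float,
                            crash_threshold_g: float = 20.0,
                            asymmetry_ratio: float = 2.0) -> str:
    """Classify a frontal impact from the peak accelerations (in g) of two upfront sensors.

    Returns 'none', 'left', 'right' or 'full'; thresholds are illustrative assumptions.
    """
    if max(left_upfront_g, right_upfront_g) < crash_threshold_g:
        return "none"
    if left_upfront_g > asymmetry_ratio * right_upfront_g:
        return "left"    # partial overlap on the left-hand side
    if right_upfront_g > asymmetry_ratio * left_upfront_g:
        return "right"   # partial overlap on the right-hand side
    return "full"        # roughly symmetric, full-width frontal impact

print(classify_frontal_impact(55.0, 12.0))  # 'left'
```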
Furthermore, data from a peripheral sensor, a roll rate sensor or a yaw rate sensor can be included in the tracking of the object in the second phase. As a result, characteristic data can be used which allow improved further tracking of the movement of the object.
A computer program product comprises program code means for carrying out the described method when the computer program product runs on a processing device or is stored on a computer-readable data carrier.
Brief description of the figures
The invention will now be described in more detail with reference to the attached figures, in which:
Figure 1 shows a device on board a motor vehicle; Figure 2 shows phases of a collision of the motor vehicle of Figure 1 with an object; and
Figure 3 shows a flow diagram of a method for controlling the motor vehicle of Figure 1.
Detailed description of embodiments
Figure 1 shows a motor vehicle 100 with a device 105. The device 105 implements a driver assistance system or an automatic vehicle control, wherein the device 105 may comprise components which are also associated with another system on board the motor vehicle 100, for example a parking or navigation system. In particular, the device 105 is configured to determine an object, or the course of its relative movement in an environment, with respect to the motor vehicle.
The device 105 comprises a processing device 115, which is connected to at least one sensor 120 for scanning an object 125 in the environment 110 of the motor vehicle 100. The object 125 may be an object that is immovable relative to the environment, for example a street post, or a moving object, for example another motor vehicle or a pedestrian. Although several objects 125 are usually detected, the technique presented below is described mainly with reference to only one exemplary object 125.
A plurality of sensors 120 can be used, which can scan different regions of the environment 110 and/or can be constructed differently. For example, one or more camera or radar sensors 130 may be provided for scanning the object 125. These sensors 120 may be mounted at a central location of the motor vehicle 100 or in the region of an outline of the motor vehicle 100.
Further sensors may be provided to further determine the movement behavior of the motor vehicle 100. These may include an inertial sensor, which determines an acceleration in one or more spatial directions, or a rotation rate sensor for one or more spatial axes. For the second phase, crash sensor data can be used to determine the crash movement model. To determine the time of the collision or the crash direction, for example, a central acceleration sensor 135 and/or one or more upfront sensors 140 and/or one or more peripheral sensors 141, 142 can be used. If at least two upfront sensors 140 are installed in a vehicle, for example in the front or in the rear of the motor vehicle 100, a partial overlap of the motor vehicle 100 with the object 125 during the collision can also be detected. A lateral collision can be detected by means of one or more peripheral sensors and/or a central acceleration sensor. As peripheral sensors, a pressure sensor 141, which may be installed, for example, in a vehicle door, and/or an acceleration sensor, which may be installed, for example, in a door sill or in a B-pillar and/or in a C-pillar, can be evaluated. A vehicle rollover can be detected by additionally evaluating a roll rate sensor 143 for determining a rotation rate about the longitudinal axis of the motor vehicle 100 (roll rate). A yaw rate sensor 144 may also be provided to determine a rotational speed of the motor vehicle 100 about the vertical axis (yaw rate). The acceleration sensor 135, the roll rate sensor 143 and/or the yaw rate sensor 144 are preferably located on a longitudinal axis of the motor vehicle 100. Several sensors 130-144 can also be integrated with one another, for example in the form of a multi-channel acceleration sensor.
In one embodiment, the processing device 115 is configured to provide movement information about the object 125 at an interface 150. A driver assistance system or an automatic vehicle control can use these data to further control the motor vehicle 100, for example by activating a safety system or by controlling the movement of the motor vehicle 100. In particular, after the collision of the motor vehicle 100 with an obstacle 145, the motor vehicle 100 can be controlled to leave a danger zone in the environment 110. This danger zone may include the object 125. In another embodiment, a safety zone can be targeted, which may include, for example, a hard shoulder or breakdown lane of a road. For this purpose, longitudinal or lateral control of the motor vehicle 100 can be actively influenced. In this way, a subsequent accident involving the motor vehicle 100 can be avoided.
The sensors 120 are usually scanned several times within a predetermined period of time, so that temporally coordinated measurements are available. These measurements are checked for plausibility with respect to a movement model of the motor vehicle 100. This movement model may in particular include maximum speed or acceleration values for the motor vehicle 100. Samples indicating, for example, an acceleration of the motor vehicle 100 with respect to the object 125 that exceeds the limits of the movement model may be discarded. It is proposed to determine a collision between the motor vehicle 100 and the obstacle 145 and to use a modified movement model during the collision in order to evaluate the sensor data of the sensors 120. The relative movement of the motor vehicle 100 with respect to the object 125 can thus be determined correctly even during the collision, in particular with the aid of sensor values from before the collision.
The obstacle 145 can also be considered an object 125 before the collision, and its movement relative to the motor vehicle 100 can be tracked. In one embodiment, the tracking of the obstacle 145 as an object 125 can be maintained even after the collision; in another embodiment, however, the obstacle 145 is no longer considered and tracked as an object 125 after the collision.
Figure 2 shows phases of a collision of the motor vehicle 100 of Figure 1 with the obstacle 145. A head-on collision is shown by way of example, although the technique presented here can also be used with any other type of collision, for example a rear collision, an offset collision or a side collision.
Figure 2a shows a first phase in which the motor vehicle 100 is in motion with respect to the obstacle 145 in normal operation. During this first phase, the movement of the motor vehicle 100 can be determined by means of a first movement model, as described in more detail above. Figure 2b shows the transition from the first to a second phase, during which the motor vehicle 100 collides with the obstacle 145. During the second phase, acceleration values, for example in the longitudinal, transverse or vertical direction, or rotation rates about a longitudinal or vertical axis of the motor vehicle 100, may exceed the limits of the first movement model. Such acceleration values or rotation rates may occur, for example, during an impact, a spin, a rollover or another accident sequence.
Figure 2c shows the motor vehicle 100 in the second phase during the collision with the obstacle 145. In the example chosen, the forward speed of the motor vehicle 100 is rapidly reduced, while at the same time the motor vehicle 100 is strongly accelerated about the vertical axis. During the second phase, it is preferred to determine the movement of the motor vehicle 100 with respect to a second movement model, which in particular permits acceleration values of the motor vehicle 100 such as may occur during such maneuvers.
Figure 2d shows the motor vehicle 100 in a third phase, which can follow the second phase. The collision with the obstacle 145 has ended and the movement determination can again be performed on the basis of the first movement model. In this case, the motor vehicle 100 can be controlled, in particular in order to move the motor vehicle into a safe orientation or to a safe position. The control can be carried out by means of a driver assistance system or an automatic vehicle control.
Figure 3 shows a flow diagram of a method 300 for controlling the motor vehicle 100 of Figure 1. In a first step 305, data of the sensors 120 are recorded at spaced time intervals. The obstacle 145 is tracked in its movement with respect to the motor vehicle 100, a first movement model of the motor vehicle 100 being used as a basis. The determined movement can be provided to a driver assistance system or an automatic vehicle control, which in particular performs longitudinal or lateral control of the motor vehicle 100. The driver assistance system or the automatic vehicle control can, for example, be integrated into the processing device 115 of Figure 1.
In a step 310, it is determined whether an accident or a collision of the motor vehicle 100 with the obstacle 145 has occurred. This determination can be made in particular on the basis of the sensors 135 and/or 140 and/or 141 and/or 142 and/or 143. In a refinement, it can be determined how severe the collision is, in order to deduce whether the motor vehicle 100 remains movable or controllable after the collision has ended. If no collision has been detected, the method 300 can return to step 305 and run through again.
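For step 310, a simple collision trigger and a delta-v-based severity estimate could look like the sketch below; the trigger threshold and the use of delta-v as a severity measure are assumptions made for illustration only.

```python
def collision_detected(central_accel_g: float, upfront_peaks_g, threshold_g: float = 20.0) -> bool:
    """Step 310 (sketch): trigger when the central acceleration sensor or any upfront
    sensor exceeds a trigger threshold."""
    return (abs(central_accel_g) > threshold_g
            or any(abs(a) > threshold_g for a in upfront_peaks_g))

def collision_severity_dv_mps(accel_trace_mps2, dt_s: float) -> float:
    """Estimate severity as the velocity change (delta-v) integrated over the crash pulse."""
    return abs(sum(a * dt_s for a in accel_trace_mps2))

print(collision_detected(35.0, [10.0, 55.0]))            # True
print(collision_severity_dv_mps([-300.0] * 100, 0.001))  # 30.0 (m/s)
```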
Otherwise, in a step 315, the movement model underlying the tracking of the obstacle 145 with respect to the motor vehicle 100 is changed. The movement model, or the movement between the motor vehicle 100 and the obstacle 145, can be determined in particular on the basis of a direction from which the obstacle 145 acts on the motor vehicle 100 and a deceleration of the motor vehicle 100. In addition to the crash acceleration values, rotation rate signals can also be evaluated.
In a step 320, the tracking of the object 125 can be restricted to sensor data from those sensors 120 that face away from the direction in which the obstacle 145 acts on the motor vehicle 100. In practice there are usually several objects 125, of which only those facing away from the direction of impact can then still be tracked. In this way it can be avoided that signals whose associated sensor 140 has been damaged by the collision are used for plausibility checking or tracking. It can also be avoided that the obstacle 145, which may perform a movement that is difficult to determine as a result of the collision, continues to be tracked as an object 125.
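Restricting the tracking to sensors facing away from the impact could, for example, be expressed as an angular filter over the sensor mounting directions. The sensor names, angles and the 90-degree separation used below are hypothetical.

def sensors_facing_away(sensors, impact_direction_deg, min_separation_deg=90.0):
    """Keep only sensors whose mounting direction differs from the impact
    direction by more than min_separation_deg.

    sensors: iterable of (name, mounting_angle_deg) in the vehicle frame."""
    kept = []
    for name, angle in sensors:
        diff = abs((angle - impact_direction_deg + 180.0) % 360.0 - 180.0)
        if diff > min_separation_deg:
            kept.append(name)
    return kept


# Example: a front-left impact at 45 degrees keeps the rear and right sensors.
# sensors_facing_away([("front", 0), ("left", 90), ("rear", 180), ("right", 270)], 45)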
In a step 325, in particular a position and/or a direction of the motor vehicle 100 after the collision with the obstacle 145 can be determined. For this purpose, the tracking of the object 125 on the basis of the data of the sensors 120 does not have to be checked for plausibility over several measuring cycles. In a step 330, the end of the collision can be determined, in particular on the basis of acceleration data, for example from the acceleration sensor 135 and/or the upfront sensors 140 and/or the peripheral sensors 141, 142 and/or the roll rate sensor 143. Once the collision has ended, the first movement model can be activated again in order to continue determining the tracking between the motor vehicle 100 and the object 125.
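Determining the end of the collision from acceleration data might, as a sketch, amount to requiring the acceleration magnitude to stay below a threshold for a short quiet period. The threshold and hold time below are placeholders, not values from the patent.

def crash_ended(accel_magnitudes, dt, threshold=15.0, quiet_time=0.05):
    """Return True once the acceleration magnitude (m/s^2) has stayed below
    `threshold` for at least `quiet_time` seconds.

    accel_magnitudes: recent samples, newest last, sampled every dt seconds."""
    needed = max(1, int(round(quiet_time / dt)))
    recent = accel_magnitudes[-needed:]
    return len(recent) >= needed and all(a < threshold for a in recent)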
In a step 335, it can be determined whether the motor vehicle 100 is located in a danger zone or whether there is a risk of a subsequent collision. In this case, the motor vehicle 100 can be controlled so as to assume a safer position in the environment 110. Functions of the above-mentioned driver assistance system and/or the automatic vehicle control can be used for this purpose.
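A danger-zone check as in step 335 could, illustratively, combine a simple lane-blocking test with a time-to-collision estimate for the still-tracked objects. The geometry, thresholds and the function needs_relocation are invented for this sketch.

def needs_relocation(ego_xy, tracked_objects, lane_half_width=2.0, ttc_limit=2.0):
    """Decide whether the vehicle should move to a safer spot after the crash.

    Flags relocation if the vehicle still blocks the (assumed) lane around y=0
    or if any tracked object approaches with a time-to-collision below
    ttc_limit. tracked_objects: iterable of (x, y, vx, vy) relative states."""
    _, y = ego_xy
    if abs(y) < lane_half_width:              # still standing in the lane
        return True
    for ox, oy, ovx, ovy in tracked_objects:
        distance = (ox ** 2 + oy ** 2) ** 0.5
        closing_speed = -(ox * ovx + oy * ovy) / max(distance, 1e-6)
        if closing_speed > 0.0 and distance / closing_speed < ttc_limit:
            return True
    return False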

Claims

1. Method (100), comprising the following steps:
- detecting an object (125) in an environment (110) of a motor vehicle (100);
- tracking, in a first phase, the object (125) with respect to the motor vehicle (100) by means of a first movement model of the motor vehicle (100);
characterized by
- detecting a collision of the motor vehicle (100) with an obstacle (145);
- tracking, in a second phase, the object (125) with respect to the motor vehicle (100) by means of a second movement model of the motor vehicle (100).
2. Method (100) according to claim 1, wherein the object (125) is tracked in the second phase also on the basis of detection data of the object (125) from the first phase.
3. Method (100) according to claim 1 or 2, wherein furthermore the end of the collision is determined and, in a third phase, the object (125) is tracked further with respect to the motor vehicle (100) by means of the first movement model of the motor vehicle (100).
4. Method (100) according to claim 3, wherein the object (125) is tracked in the third phase also on the basis of detection data of the object (125) from the second phase.
5. Method (100) according to one of the preceding claims, wherein, in the third phase, a movement of the motor vehicle (100) is controlled on the basis of the tracked object (125).
6. Method (100) according to one of the preceding claims, wherein it is determined from which direction the obstacle (145) collides with the motor vehicle (100), the object (125) being tracked in the second phase only on the basis of sensor data whose associated sensors face away from the direction of the collision.
7. Method (100) according to one of the preceding claims, wherein the collision is determined on the basis of data from an acceleration sensor (135).
8. Method (100) according to claim 7, wherein an acceleration caused by the collision and an acceleration direction are included in the tracking of the object (125) in the second phase.
9. Method (100) according to one of the preceding claims, wherein data from an upfront sensor (140) are included in the tracking of the object (125) in the second phase.
10. Method (100) according to one of the preceding claims, wherein data from a peripheral sensor (141, 142) are included in the tracking of the object (125) in the second phase.
11. Method (100) according to one of the preceding claims, wherein data from a roll rate sensor (143) are included in the tracking of the object (125) in the second phase.
12. Method (100) according to one of the preceding claims, wherein data from a yaw rate sensor (144) are included in the tracking of the object (125) in the second phase.
13. Computer program product with program code means for carrying out the method (100) according to one of the preceding claims when the computer program product runs on a processing device (115) or is stored on a computer-readable data carrier.
EP16710939.6A 2015-04-17 2016-03-08 Object tracking before and during an impact Active EP3283343B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015207016.6A DE102015207016A1 (en) 2015-04-17 2015-04-17 Object tracking before and during a collision
PCT/EP2016/054882 WO2016165880A1 (en) 2015-04-17 2016-03-08 Object tracking before and during an impact

Publications (2)

Publication Number Publication Date
EP3283343A1 true EP3283343A1 (en) 2018-02-21
EP3283343B1 EP3283343B1 (en) 2022-02-23

Family

ID=55587251

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16710939.6A Active EP3283343B1 (en) 2015-04-17 2016-03-08 Object tracking before and during an impact

Country Status (6)

Country Link
US (1) US10427677B2 (en)
EP (1) EP3283343B1 (en)
JP (1) JP6526832B2 (en)
CN (1) CN107531238B (en)
DE (1) DE102015207016A1 (en)
WO (1) WO2016165880A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6597517B2 (en) * 2016-08-10 2019-10-30 株式会社デンソー Target detection device
JP2018065482A (en) * 2016-10-20 2018-04-26 本田技研工業株式会社 Occupant protection device
WO2018099500A1 (en) 2016-11-30 2018-06-07 Bruker Daltonik Gmbh Preparing live microbial samples and microorganisms for subsequent mass-specrometric measurement and evaluation
DE102017220910A1 (en) * 2017-11-23 2019-05-23 Robert Bosch Gmbh Method and device for detecting a collision of a vehicle
DE102018206956A1 (en) * 2018-05-04 2019-11-07 Continental Teves Ag & Co. Ohg Method for determining a vehicle position
CN110077400A (en) * 2019-04-28 2019-08-02 深圳市元征科技股份有限公司 A kind of reversing householder method, device and terminal device
US12083985B2 (en) 2020-03-11 2024-09-10 Zf Friedrichshafen Ag Vehicle safety system implementing integrated active-passive front impact control algorithm

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3736340B2 (en) * 2000-12-14 2006-01-18 トヨタ自動車株式会社 Vehicle control device
EP1687183B1 (en) * 2003-11-14 2008-06-25 Continental Teves AG & Co. oHG Method and device for reducing damage caused by an accident
JP4449518B2 (en) * 2004-03-22 2010-04-14 株式会社デンソー Vehicle occupant protection device
DE102005008715A1 (en) * 2005-02-25 2006-08-31 Robert Bosch Gmbh Radar system e.g. for motor vehicle, supplies probable collision time-point and collision speed to pre-crash-system
DE102005016009A1 (en) 2005-04-07 2006-10-12 Robert Bosch Gmbh Method and device for stabilizing a vehicle after a collision
DE102008001648A1 (en) * 2008-05-08 2009-11-12 Robert Bosch Gmbh Driver assistance method for moving a motor vehicle and driver assistance device
DE102011115223A1 (en) 2011-09-24 2013-03-28 Audi Ag Method for operating a safety system of a motor vehicle and motor vehicle
EP2591966B1 (en) * 2011-11-11 2019-02-27 Volvo Car Corporation Vehicle safety system comprising active and passive safety means
DE102011087781A1 (en) 2011-12-06 2013-06-06 Robert Bosch Gmbh Method and system for reducing accident damage in a collision between two vehicles
DE102012201902A1 (en) * 2012-02-09 2013-08-14 Robert Bosch Gmbh Driver assistance system for adapting the target position in transverse parking spaces by the driver
US9558667B2 (en) * 2012-07-09 2017-01-31 Elwha Llc Systems and methods for cooperative collision detection
DE102012107186B4 (en) 2012-08-06 2022-04-21 Continental Automotive Gmbh Method for detecting a dangerous situation of a vehicle based on at least one surroundings sensor and at least one inertial sensor
DE102013211651A1 (en) 2013-06-20 2014-12-24 Robert Bosch Gmbh Method and device for avoiding a possible subsequent collision or for reducing the accident consequences of a collision
DE102013215472B4 (en) 2013-08-06 2024-08-22 Volkswagen Aktiengesellschaft Planning an exit trajectory to reduce collision consequences
JP2015047980A (en) * 2013-09-02 2015-03-16 トヨタ自動車株式会社 Brake control device
DE102014008350A1 (en) 2014-06-05 2014-11-27 Daimler Ag Method for operating a vehicle dynamics control system of a vehicle and a motor vehicle

Also Published As

Publication number Publication date
CN107531238B (en) 2021-09-14
JP6526832B2 (en) 2019-06-05
US10427677B2 (en) 2019-10-01
JP2018513053A (en) 2018-05-24
EP3283343B1 (en) 2022-02-23
CN107531238A (en) 2018-01-02
WO2016165880A1 (en) 2016-10-20
US20180043889A1 (en) 2018-02-15
DE102015207016A1 (en) 2016-10-20

Similar Documents

Publication Publication Date Title
EP3283343B1 (en) Object tracking before and during an impact
DE112009005400B4 (en) Collision avoidance device
DE102007042481B4 (en) Vehicle control system for an automobile
EP2043896B1 (en) Method and device for avoiding and/or reducing the consequences of collisions
EP2242674B1 (en) Method and assistance system for detecting objects in the surrounding area of a vehicle
EP2077212B1 (en) Driver assist system
EP3356203B1 (en) Method for determining a parking surface for parking a motor vehicle, driver assistance system, and motor vehicle
WO2006106009A1 (en) Method and device for stabilising a vehicle after a collision
EP2163448A1 (en) Driver assistance device for supporting a driver of a vehicle when pulling out of a parking spot
EP1554604A1 (en) Method and device for preventing a collision of vehicles
DE102016223541A1 (en) Method and parameter module for detecting the type and / or severity of a collision of a vehicle with a collision object
DE102019202026A1 (en) Method and control device for vehicle collision avoidance
DE102015200926A1 (en) Protective function for an automated controlled motor vehicle
WO2016180665A1 (en) Method for controlling a functional device of a motor vehicle on the basis of merged sensor data, control device, driver assistance system and motor vehicle
WO2019137864A1 (en) Method for preventing a critical situation for a motor vehicle, wherein a distance between a motor vehicle contour and an object contour is determined, driver assistance system and motor vehicle
DE102013211427B4 (en) Method and device for determining a driving state of an external motor vehicle
DE102012107186A1 (en) Method for detecting hazardous situation of vehicle based on sensor, involves determining parameter for controllability of vehicle as result of lateral impact, where two acceleration sensors are arranged in plane parallel to track
DE102008059240A1 (en) Vehicle i.e. car, operating method, involves comparing object data with contour of vehicle, and determining and adjusting release time point and/or release characteristic of restraint system based on collision time
DE102013010729B4 (en) Method for canceling an automatic driving mode of a motor vehicle, driver assistance device and motor vehicle
EP3347739B1 (en) Method for determining the severity of a possible collision between a motor vehicle and a further vehicle, control device, driver assistance system and motor vehicle
DE102016213329B4 (en) Method for controlling a sensor device
DE102017103700A1 (en) Avoiding a collision by a steering intervention in response to a movement of an external vehicle object
DE102007006757B4 (en) Motor vehicle safety system for the support and / or protection of drivers in critical driving situations and motor vehicle
DE102016013601A1 (en) Method for operating a vehicle
DE102015005999B3 (en) Method for changing a position of an exterior mirror of a motor vehicle and motor vehicle

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20171117

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ROBERT BOSCH GMBH

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210428

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210914

RIN1 Information on inventor provided before grant (corrected)

Inventor name: WILHELM, ULF

Inventor name: EISELE, SYBILLE

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1470238

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220315

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502016014546

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20220223

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220623

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220523

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220523

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220524

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220623

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502016014546

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220331

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20220523

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220308

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220308

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220423

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

26N No opposition filed

Effective date: 20221124

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 1470238

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220308

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220523

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220308

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20160308

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240527

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220223