EP2869284A1 - Driver assistance system for vehicles, in particular commercial vehicles - Google Patents

Driver assistance system for vehicles, in particular commercial vehicles

Info

Publication number
EP2869284A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
collision
collision object
unit
assistance system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP20140190877
Other languages
German (de)
English (en)
Other versions
EP2869284B1 (fr)
Inventor
Dr. Werner Lang
Dr. Stefan Schinzer
Manuel Kunz
Michael Witzke
Johannes Nagel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mekra Lang GmbH and Co KG
Original Assignee
Mekra Lang GmbH and Co KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mekra Lang GmbH and Co KG filed Critical Mekra Lang GmbH and Co KG
Publication of EP2869284A1 publication Critical patent/EP2869284A1/fr
Application granted granted Critical
Publication of EP2869284B1 publication Critical patent/EP2869284B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/167: Driving aids for lane monitoring, lane changing, e.g. blind spot detection

Definitions

  • The present invention relates to a driver assistance system for vehicles, in particular commercial vehicles, for warning against collision objects located in the surroundings of the vehicle.
  • In such systems, an (image) recording unit continuously captures a recorded image.
  • The data are supplied, for example by means of a calculation unit and optionally after further processing, to a reproduction device located in the cab, which makes the recorded area, which may contain legally prescribed fields of view, permanently visible to the driver at any time.
  • Thereby, the objects located in the recorded area are displayed.
  • For example, DE 10 2011 010 624 A1 discloses a display device for legally prescribed fields of view of a commercial vehicle in a cab of the commercial vehicle, which has at least one display unit adapted to display at least two of the legally prescribed fields of view permanently and in real time on the display unit in the cab.
  • However, visibility in the immediate vehicle environment, especially on the vehicle sides and in particular on the passenger side, is critical. Obstacles such as other road users, e.g. other vehicles, pedestrians and/or cyclists, and/or stationary objects such as street posts, street lamps, street signs, etc., are poorly recognized, since the resolution of the representation is often insufficient, e.g. due to the relatively wide-angle imaging, and since a large amount of information is displayed. Orientation across the relatively many devices for indirect vision is also difficult for the driver, so that there is a risk, especially during turning or maneuvering, that collision objects are overlooked even though they are shown in the device for indirect vision.
  • The recording unit is mounted on the vehicle, in particular the commercial vehicle, in such a way that a viewing area is detected which contains the so-called blind spot.
  • A disadvantage here is that parts of the reproduction image are obscured by the graphic overlays, so that the exact location of the obstacle in the reproduction image is often unclear or orientation in the reproduction image is difficult for the driver, and that no distinction is made between collision-relevant objects and objects which have no or only a very small probability of collision.
  • Furthermore, current warning systems are unable to recognize obstacles detected from different perspectives as such, because the obstacles have different representations in the reproduction image depending on the perspective.
  • Warning systems such as emergency brake assistants, lane departure warnings, lane change assistants, parking aids, etc. should also be mentioned.
  • For example, it is possible to detect collision objects in the recorded image by means of the so-called "Mobileye Pedestrian Collision Warning" (pedestrian recognition) and by means of speed limit assistants (traffic sign recognition), and to warn the driver accordingly.
  • Driver assistance systems, for example so-called "crossing assistants", can determine the trajectories of the detected collision objects and indicate potential collisions.
  • From DE 10 2011 109 459 A1, a method for detecting objects to the side of a commercial vehicle is known.
  • There, at least one camera detects objects located in a sector on one side of the commercial vehicle.
  • The detected objects are evaluated in an evaluation unit, whereby a position of the detected objects relative to the commercial vehicle is determined and a risk of collision with the commercial vehicle is evaluated.
  • The evaluation unit transmits information to a reproduction unit, whereupon the reproduction unit issues a warning signal.
  • DE 103 36 638 A1 discloses an apparatus for classifying at least one object in a vehicle environment by means of environment sensors.
  • The object is classified at least on the basis of its three-dimensional shape and its dimensions.
  • The environment sensor is configured so that it can determine this shape and these dimensions.
  • From DE 10 2007 059 735 A1, a stereo vision system for vehicles for the detection of lateral obstacles is known.
  • The system and method employ cameras in a stereo arrangement with unobtrusive mounting to provide reliable detection with good range discrimination.
  • The image signal received by each camera can be conditioned before it is rectified and compared, in order to detect elevations above the ground and measure them against predetermined disparity criteria.
  • It is therefore desirable to provide a driver assistance system for a vehicle, in particular a commercial vehicle, which issues a warning to the driver and/or intervenes in the control of the vehicle if at least one collision object posing a risk of collision for the vehicle is detected in the vehicle environment.
  • In the following, directional indications refer to a vehicle, in particular a commercial vehicle, during normal forward travel.
  • "Lateral direction" thus means the direction perpendicular to the forward direction-of-travel vector of the motor vehicle and corresponds to the left-right direction.
  • "Viewing area" describes the maximum area that can be detected by a recording unit, for example a camera. This differs from the concept of a "field of view", which denotes an area that the driver is legally required to observe. A viewing area therefore usually describes an area larger than a legally prescribed field of view.
  • The invention is based on the idea of providing a driver assistance system for a vehicle, in particular a commercial vehicle, which issues a warning to the driver and/or intervenes in the control of the vehicle if an obstacle or collision object posing a risk of collision for the vehicle is located in the vehicle environment. This may be a stationary or relatively slowly moving object in the immediate vicinity of the vehicle, or a relatively fast-moving object with a risk of collision that is farther away from the vehicle.
  • The driver assistance system has a recording unit, a calculation unit and a reproduction unit.
  • The recording unit has a recording device with an optical axis which is at an angle to the roadway of the vehicle.
  • The calculation unit is adapted to evaluate data received from the recording unit with regard to collision objects and, upon detection of a collision object, to initiate the issuing of a warning to the driver and/or an intervention in the control of the vehicle.
  • For the purpose of detecting collision objects, the calculation unit determines the presence and/or the type of a collision object on the basis of collision object parameters that are selected as a function of the relative position of the collision object to the recording device.
  • In other words, when recognizing an object in the recorded image, the calculation unit determines the type of the object on the basis of parameters that vary depending on the relative position of the object to the recording device. From the determined object type and the relative position, the calculation unit can then conclude whether it is a collision object. That is to say, the detection of the at least one collision object takes place everywhere in the entire multi-perspective image, depending on the respective perspective of the photograph and thus on the appearance of the respective collision object, by selecting the collision object parameters corresponding to the relative position to the recording device.
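  • As a minimal sketch of this idea (the templates, distance thresholds and rules below are illustrative assumptions, not the patented implementation), the selection of collision object parameters as a function of the relative position to the recording device could look like this:

      # Illustrative sketch: recognition parameters are selected as a
      # function of the object's position relative to the recording device.

      # Hypothetical silhouette templates per perspective.
      SILHOUETTE_TEMPLATES = {
          "plan_view": {"pedestrian": "head_and_shoulders", "cyclist": "two_thin_strokes"},
          "oblique":   {"pedestrian": "foreshortened_body", "cyclist": "wheels_and_torso"},
          "frontal":   {"pedestrian": "full_outline", "motorcyclist": "thick_vertical_bar"},
      }

      def perspective_for(distance_m):
          """Map the object's distance from the camera to a viewing
          perspective; the thresholds are assumed values."""
          if distance_m < 3.0:
              return "plan_view"   # directly below the elevated camera
          if distance_m < 10.0:
              return "oblique"
          return "frontal"         # close to the horizon

      def classify(detected_silhouette, distance_m):
          """Return the object type whose template matches in this perspective."""
          templates = SILHOUETTE_TEMPLATES[perspective_for(distance_m)]
          for object_type, template in templates.items():
              if template == detected_silhouette:
                  return object_type
          return None              # unknown silhouette in this perspective

      def is_collision_object(object_type, distance_m, speed_mps):
          """Type- and position-dependent relevance decision (assumed rules):
          a nearby pedestrian is relevant, a fast motorcyclist even when distant."""
          if object_type == "pedestrian":
              return distance_m < 5.0
          if object_type == "motorcyclist":
              return speed_mps > 5.0
          return False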
  • For example, a pedestrian in plan view may have a completely different outline than in a horizontal or side view.
  • The mounting position of the recording unit can be very high, for example at a height of more than 2 m, so that in the immediate vicinity of the vehicle a plan view of the collision objects is captured.
  • Nevertheless, the collision object can be recognized as a pedestrian by selecting the recognition parameters depending on the recording position.
  • Once detected as such, a pedestrian is, for example, classified as a collision object when it is near the vehicle, but not when it is sufficiently far away from the vehicle.
  • Further features, such as movement trajectories, can be used for the classification as a collision object.
  • Another example is a motorcyclist who, for example, is located at a distance from the vehicle and approaches it at a relatively high speed.
  • The calculation unit can recognize the motorcyclist in the almost horizontal, frontal view from the image received from the recording unit; in this view the motorcyclist has a shape similar to a thick vertical bar. If the motorcyclist approaches the vehicle, the perspective can change, for example, from the front view into a nearly side view, in which the motorcyclist has the familiar shape of a rider with both wheels visible.
  • The calculation unit is able to determine from both perspectives the presence and the type of the collision object, in this example a motorcyclist, and to classify the object as a collision object depending on its distance from the vehicle. When detected as a motorcycle, for example, even a distant motorcycle is classified as a collision object due to its possibly higher speed.
  • The calculation unit is adapted to evaluate the images captured by the recording unit. It should be mentioned that the recording unit continuously captures individual snapshots of the vehicle surroundings at different times and sends these individual images to the calculation unit for image evaluation. For example, the recording unit is adapted to capture 25 frames per second. Preferably, the calculation unit evaluates each image received from the recording unit with respect to the above-mentioned collision object parameters.
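  • A sketch of this continuous per-frame evaluation might look as follows; capture_frame() and the stubbed evaluate() are hypothetical placeholders, and the fixed pacing is an assumption:

      import time

      FRAME_RATE_HZ = 25  # the example rate mentioned above

      def evaluate(image):
          """Placeholder for the per-image collision object evaluation."""
          return []  # detected collision objects would be returned here

      def run(capture_frame, warn):
          """Continuously capture individual snapshots and evaluate every image."""
          period = 1.0 / FRAME_RATE_HZ
          while True:
              image = capture_frame()      # one snapshot of the surroundings
              for detection in evaluate(image):
                  warn(detection)          # issue a warning / intervene
              time.sleep(period)           # simple pacing at ~25 frames per second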
  • Thereby, the driver of the vehicle is warned of those collision objects which pose a collision risk to the vehicle, independently of the perspective in which they are detected.
  • In other words, in the evaluation the calculation unit takes into account each image recorded by the recording unit as a plan view or as a view obliquely from above onto the vehicle environment, recognizes collision objects that are critical and/or dangerous for the vehicle, and warns the driver of them, for example acoustically, visually or haptically.
  • The driver is preferably warned of those collision objects which are located in an area behind and/or laterally next to the vehicle, in particular in the so-called blind spot, and from which a risk of collision arises.
  • Additionally or alternatively to the above-mentioned warning, the calculation unit causes an intervention in the control of the vehicle when a collision object has been detected.
  • For example, the calculation unit may provide the control of the vehicle with a signal indicating that the vehicle is to be braked or accelerated so as to prevent a collision with the collision object.
  • Likewise, the calculation unit can provide the control of the vehicle with a signal which indicates that the steering angle of the vehicle is to be changed in such a way that a collision with the collision object can be avoided.
  • Collision objects refer to any objects that are located in the immediate, but also in the more remote, vehicle environment during a journey with a vehicle, in particular a commercial vehicle.
  • Collision objects are those objects with which the vehicle can collide and thereby cause an accident.
  • As movable collision objects, other road users such as cyclists, motorcyclists, pedestrians, etc. should be mentioned; for the most part, an increased risk emanates from the vehicle toward them, and they in turn represent an increased risk to the vehicle.
  • Besides the movable collision objects, there are stationary objects such as street signs, street posts, street lamps, garbage cans, advertising columns, parked vehicles or other non-moving objects.
  • This is beneficial in a parking or maneuvering operation, so that the driver does not inadvertently overlook one of the stationary objects and cause a collision with it.
  • The calculation unit is further adapted to determine the presence and/or the type of a collision object on the basis of stored data on previously detected collision objects.
  • For this purpose, the calculation unit has a memory in which the parameters and features of previously determined and detected collision objects are stored, e.g. the respective outlines or silhouettes of pedestrians, motorcyclists, cyclists, etc. in plan view, side view and front view. Upon renewed detection of collision objects, the calculation unit may access this data and compare it with the parameters and features of the newly detected collision objects in order to determine the presence and/or type of the collision objects.
  • Preferably, the calculation unit has a learning algorithm which generalizes the detected individual examples and recognizes regularities in the acquired examples, which can then be used in the future detection of collision objects.
  • Alternatively or additionally, the memory can be provided with prestored information, for example already known collision object parameters and features, which the calculation unit can access when determining the collision object parameters.
  • Collision object parameters refer in particular to the silhouette of the collision object in the captured image.
  • For example, a head detected from above together with arms or shoulders are collision object parameters for a pedestrian.
  • Likewise, two thinner strokes with a human outline in between are, for example, the collision object parameters for a cyclist, and two thicker strokes with a human outline in between are the collision object parameters for a motorcyclist.
  • The calculation unit is further adapted to subdivide the images captured by the recording unit into at least two image areas and to determine the presence and/or the type of the collision object on the basis of different parameters, depending on which of the at least two image areas the collision object is located in. In a preferred embodiment, the calculation unit divides the data acquired by the recording unit into three image areas.
  • For example, the calculation unit divides the viewing area detected by the recording unit into at least two image areas, e.g. a vehicle-proximate area and a vehicle-remote area, so that the recording unit captures a multi-perspective image in which an object appears under different viewing angles depending on its relative position to the recording device. Consequently, the perspective or viewing angle of a collision object in the vehicle-proximate area differs from that of a collision object in the vehicle-remote area, so that the calculation unit determines the collision object in the at least two image areas on the basis of different parameters.
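  • The following sketch shows one plausible realization of this subdivision, under the assumption that with a downward-tilted camera the vehicle-proximate area appears at the bottom of the image; all row boundaries and the resolution are assumed values:

      IMAGE_HEIGHT = 960             # assumed sensor resolution (rows)
      REGION_BOUNDS = [              # (name, first row, last row + 1)
          ("vehicle_remote", 0, 320),       # near the horizon: ~frontal views
          ("intermediate", 320, 640),       # oblique views
          ("vehicle_proximate", 640, 960),  # below the camera: plan views
      ]

      # One hypothetical recognition parameter set per region.
      REGION_PARAMETERS = {
          "vehicle_proximate": {"pedestrian_template": "head_and_shoulders"},
          "intermediate":      {"pedestrian_template": "foreshortened_body"},
          "vehicle_remote":    {"pedestrian_template": "full_outline"},
      }

      def region_of(row):
          """Return the region name for the pixel row of a detection."""
          for name, top, bottom in REGION_BOUNDS:
              if top <= row < bottom:
                  return name
          raise ValueError(f"row {row} outside image of height {IMAGE_HEIGHT}")

      def parameters_for(row):
          """Pick the collision object parameters for the detection's region."""
          return REGION_PARAMETERS[region_of(row)]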
  • For example, a pedestrian in the vehicle-proximate area is captured from above and therefore has a different representation in the reproduction image than a pedestrian in the vehicle-remote area, who is captured obliquely from above.
  • In this embodiment, the calculation unit is thus adapted to infer the collision object type "pedestrian" from the two different images of the pedestrian.
  • A cyclist near the vehicle, for example, being captured from above, appears in the image as two strokes representing the wheels of the bicycle in plan view, together with the body of the cyclist seen from above. From this representation, the calculation unit can infer the collision object "cyclist".
  • The recording device of the recording unit, which is for example a camera, has a lens which determines the course of the optical axis.
  • The optical axis is adjusted as desired so as to enclose an angle with the roadway in the range of about 5° to 90°.
  • In any case, the optical axis intersects the road surface and does not run parallel to the roadway or to the longitudinal direction of the vehicle. Due to this course of the optical axis, the recording device is able to capture multi-perspective images of the vehicle environment.
  • The calculation unit is further adapted to recognize from the collision object parameters whether collision objects detected at two different times are one and the same collision object, for example a pedestrian or cyclist moving along the vehicle.
  • The calculation unit can preferably recognize such pedestrians or cyclists in the images acquired by the recording unit at different points in time and thus "track" them. As soon as the trajectory of the pedestrian changes such that the collision probability exceeds a predetermined threshold, the above-mentioned warning is issued to the driver or an intervention in the vehicle control is made in order to prevent a collision with the recognized pedestrian or cyclist.
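  • A simplified sketch of such tracking is given below; the nearest-neighbour association and the 1.5 m jump limit are assumptions for illustration, not the patented method:

      import math

      class Track:
          """One tracked collision object: its observed positions over time."""
          def __init__(self, position, timestamp):
              self.history = [(timestamp, position)]

          def update(self, position, timestamp):
              self.history.append((timestamp, position))

          def velocity(self):
              """Velocity estimated from the last two observations."""
              (t0, p0), (t1, p1) = self.history[-2], self.history[-1]
              dt = t1 - t0
              return ((p1[0] - p0[0]) / dt, (p1[1] - p0[1]) / dt)

      def associate(tracks, position, timestamp, max_jump_m=1.5):
          """Assign a new detection to the nearest existing track, or open a
          new track when no existing track is close enough."""
          nearest = min(tracks, default=None,
                        key=lambda tr: math.dist(tr.history[-1][1], position))
          if nearest and math.dist(nearest.history[-1][1], position) < max_jump_m:
              nearest.update(position, timestamp)
              return nearest
          new_track = Track(position, timestamp)
          tracks.append(new_track)
          return new_track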
  • The calculation unit is further configured to detect the at least one collision object on the basis of the above-mentioned collision object parameters.
  • As collision object parameters, for example, the movement speed of the collision object, the position of the collision object relative to the vehicle, the position of the collision object relative to the recording device and/or the size of the collision object are to be mentioned.
  • Preferably, the calculation unit is adapted to determine the trajectory of the vehicle and/or the trajectory of the collision object.
  • For this purpose, the calculation unit derives the trajectory of the collision object from the data acquired by the recording unit and can also estimate the future trajectory of the collision object. This estimate of the trajectory is updated during the continuous acquisition and evaluation of the data acquired by the recording unit.
  • The calculation unit preferably determines the actual and the estimated trajectory of the vehicle that is equipped with the driver assistance system, for example from the vehicle speed, the steering movements of the vehicle, the activation state of a turn signal of the vehicle and/or the global position data (GPS data) of the vehicle.
  • The calculation unit may then issue the above-mentioned warning and/or intervene in the vehicle control when it calculates an expected collision from the determined trajectories.
  • For example, the calculation unit can determine from the trajectories a probability of collision of the vehicle with the collision object and warn the driver of the vehicle if this probability exceeds a predetermined threshold value, or change the type of warning as a function of the collision probability.
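  • A minimal sketch of such a threshold with a graded type of warning (the threshold values 0.3, 0.6 and 0.85 are illustrative assumptions):

      def warning_level(collision_probability):
          """Escalate the type of warning with the collision probability."""
          if collision_probability < 0.3:
              return None                   # no warning issued
          if collision_probability < 0.6:
              return "visual"               # e.g. flash the reproduction image
          if collision_probability < 0.85:
              return "visual+acoustic"      # additionally a short tone
          return "haptic+intervention"      # vibrate the wheel, brake or steer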
  • Preferably, the driver assistance system has, in addition to the recording unit, at least one sensor, for example a radar and/or ultrasonic sensor, as a distance sensor mounted on the vehicle, in particular the commercial vehicle.
  • The sensor can detect a collision object in the immediate vehicle environment and send a corresponding signal to the calculation unit.
  • The calculation unit may use the data of the at least one sensor, in addition to the images acquired by the recording unit, in determining the collision object parameters. Furthermore, the calculation unit can take into account the data received from the at least one sensor in the calculation of the collision probability.
  • Other additional sensors may, for example, include a GPS sensor providing GPS data of the vehicle, whose data are also included in the determination of whether a detected object is a collision object.
  • The calculation unit preferably issues a warning to the driver of the vehicle and/or intervenes in the vehicle control if selected collision objects, such as, for example, moving objects or objects moving at a speed greater than a certain threshold speed, are determined. In this case, when a collision object is detected, it is additionally determined whether it is such a selected object.
  • The driver assistance system preferably determines how far away a detected collision object is from the vehicle. From the determined distance, and taking into account the collision object speed and the vehicle speed, the calculation unit can determine the collision probability and issue a warning to the driver or intervene in the vehicle control if the calculated collision probability exceeds a predetermined value.
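  • One simple way to combine the distance, the collision object speed and the vehicle speed is a time-to-collision heuristic, sketched below; the patent leaves the actual calculation open, so the closing-speed model and the 5 s scale are assumptions:

      def collision_probability(distance_m, object_speed_mps, vehicle_speed_mps):
          """The smaller the time to collision, the higher the probability."""
          closing_speed = vehicle_speed_mps + object_speed_mps
          if closing_speed <= 0.0:
              return 0.0                      # the paths are not converging
          time_to_collision = distance_m / closing_speed
          return max(0.0, min(1.0, 1.0 - time_to_collision / 5.0))

      # Example: 10 m away, closing at 8 m/s gives a time to collision of
      # 1.25 s and a probability of 0.75, which would exceed a predetermined
      # value such as 0.6 and trigger a warning.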
  • The determination of the distance of the object to the vehicle can be made either before determining whether it is a collision object, or only afterwards. Collision objects with an increased collision probability receive an increased priority and trigger the issuing of a warning sooner than collision objects with a low collision probability.
  • The viewing area detected by the recording unit contains at least part of a legally prescribed field of view.
  • The legally prescribed field of view may correspond to one of the fields of view defined in ECE Regulation R 46.
  • For example, the recording unit is adapted to capture at least part of a first legally prescribed field of view and/or part of a second legally prescribed field of view and/or part of a third legally prescribed field of view and/or part of a fourth legally prescribed field of view.
  • The first legally prescribed field of view corresponds to field of view II according to ECE Regulation R 46, the second legally prescribed field of view to field of view IV according to ECE Regulation R 46, the third legally prescribed field of view to field of view V according to ECE Regulation R 46, and the fourth legally prescribed field of view to field of view VI according to ECE Regulation R 46.
  • For example, a first recording unit detects at least a portion of the first and/or second field of view, and a second recording unit detects at least a portion of the third and/or fourth field of view.
  • The issuing of a warning to the driver of the vehicle preferably comprises a visual, acoustic and/or haptic indication.
  • For example, a visual indication of the collision object is given in the reproduction image displayed on the reproduction unit, in particular via flashing of the reproduction image, coloring of the reproduction image and/or changing the reproduction quality of the reproduction image.
  • An acoustic indication may be, for example, a short tone which indicates to the driver, via a loudspeaker in the vehicle, that there is a collision object in the vehicle environment.
  • A haptic indication may be, for example, a vibration of the steering wheel of the vehicle, which alerts the driver to the collision object located in the vehicle environment.
  • The recording unit, which creates an image of the vehicle environment and provides it to the calculation unit, is connected to the calculation unit via a first connection, for example a suitable data cable for the digital or analog transmission of the images captured by the recording unit.
  • The calculation unit is further connected to the reproduction unit via a second connection, for example a suitable data cable for the digital or analog transmission of images.
  • One or both of the above-mentioned connections can also be implemented wirelessly, for example via Bluetooth, WLAN or an infrared connection.
  • The reproduction unit is designed to display the received reproduction image permanently and in real time.
  • The recording unit is likewise preferably designed to capture images permanently and in real time, and the calculation unit is designed to process these images permanently and in real time as well.
  • "Permanently" means here that the representation of the reproduction image is not interrupted (temporally) by other information, so that the driver can look at the reproduction unit at any time to view the environment of the vehicle and be notified of relevant obstacles and collision objects.
  • "Permanently" also means that the representation of the fields of view is continuously available at least during driving operation of the commercial vehicle.
  • The state that is to be described as "permanent" and encompassed by that term may optionally also be extended to depend on the ignition state of the vehicle or, for example, on a state in which a driver is in the vehicle, e.g. depending on the detection of a key device located in the vicinity of the vehicle or in the vehicle.
  • The calculation unit may be implemented integrally with the recording unit, the reproduction unit or the control unit of the vehicle. Furthermore, the calculation unit can be mounted as a separate unit in or on the vehicle, e.g. integrated with an on-board computer.
  • The reproduction unit may be any unit that suitably displays the reproduction image provided by the calculation unit to the driver of the vehicle.
  • For example, the reproduction unit may be a separate device, such as an LCD display, an LED display, a projector, or the like.
  • Alternatively, the reproduction unit may be integral with the so-called Central Information Display, which may already be installed as standard in the vehicle.
  • The recording unit may preferably be a camera with a lens, which may be mounted on or in the vehicle.
  • The driver assistance system disclosed herein can thus prevent accidents before they occur.
  • The driver assistance system is particularly useful on commercial vehicles, since in commercial vehicles such as trucks a direct view of other road users, such as pedestrians and cyclists moving parallel to the vehicle, is hardly possible.
  • In particular, the driver can be warned of collision objects which the driver recognizes in the devices for indirect vision, such as mirrors, but incorrectly classifies as uncritical. For example, a pedestrian may initially be uncritical but in the next moment be on a collision course with the vehicle.
  • Fig. 1 schematically shows a driver assistance system 10.
  • The driver assistance system 10, which is mounted on a commercial vehicle 50 (see Fig. 2), such as a truck, includes at least one image recording unit 20, a calculation unit 30 and a reproduction unit 40.
  • The recording unit 20 shown in Fig. 1 has a first recording device, such as a first camera 22, and a second recording device, such as a second camera 24, which can be attached at different positions on the commercial vehicle.
  • The viewing area detected by the first camera 22 is provided to the calculation unit 30 via a first connection 21, for example a suitable data cable for the digital transmission of the captured images, and the viewing area detected by the second camera 24 is transferred to the calculation unit 30 via a second connection 23, likewise for example a suitable data cable for the digital transmission of the captured images.
  • The first and second cameras 22, 24 are, for example, highly dynamic, high-resolution cameras, each replacing a side mirror of the vehicle.
  • Alternatively, the driver assistance system 10 is an additional system to a camera system that replaces the side mirrors.
  • The calculation unit 30 is configured to modify the captured images into a reproduction image in a desired manner.
  • In particular, the calculation unit 30 can evaluate the images captured by the recording unit 20 and detect certain collision objects in them, so that a warning is issued to the driver of the utility vehicle 50.
  • The reproduction image modified by the calculation unit 30 is then provided to the reproduction unit 40 via a third connection 31, for example a suitable data cable for the digital transmission of images.
  • The reproduction unit 40 is adapted to present the reproduction image photorealistically and visibly to the driver of the utility vehicle 50. In this case, the reproduction unit 40 can reproduce both the image captured by the first camera 22 and the image captured by the second camera 24 in one representation, for example in the so-called split-screen method.
  • The optical axis 28 of the first recording device 22 of the recording unit 20 is shown, which forms an angle α with the roadway of the vehicle.
  • The angle α is chosen such that the recording unit captures an image obliquely from above.
  • The angle α is preferably in a range of about 5° to about 90°.
  • In Fig. 2, the driver assistance system 10 is shown attached to the utility vehicle 50. It should be noted that the calculation unit 30 and the reproduction unit 40 are not explicitly shown in Fig. 2.
  • The recording unit 20 is mounted, for example, at an elevated position on the driver side of the utility vehicle 50 in such a way that a viewing area 60 is essentially detected obliquely from above.
  • For example, the recording unit 20 is mounted approximately 2 m above the roadway on the driver's side of the vehicle 50.
  • The viewing area 60 has a three-dimensional shape, but in Fig. 2 only the detected area projected onto the roadway is shown hatched.
  • The viewing area 60 detected by the recording unit 20 comprises at least a portion of a first legally prescribed field of view 70 and at least a portion of a second legally prescribed field of view 72.
  • The first legally prescribed field of view 70 corresponds to field of view II according to ECE Regulation R 46, and the second legally prescribed field of view 72 to field of view IV according to ECE Regulation R 46.
  • Fig. 3 schematically shows the relative position of a collision object 100 to the recording unit 20.
  • In this example, the collision object 100 is a pedestrian 100.
  • The viewing area 60 detected by the recording unit 20 is subdivided into a first area 62, a second area 64 and a third area 66, which differ in the perspective in which collision objects are detected and thus in their representation.
  • The pedestrian 100 in the first area 62, which represents the vehicle-proximate area, is detected in plan view and consequently appears in the reproduction image according to representation 101, in which only the head and the arms of the pedestrian 100 are recognizable.
  • The pedestrian 102 in the second area 64 is detected in a perspective that foreshortens the pedestrian 102 in the reproduction image, as shown in representation 103.
  • The pedestrian 104 in the third area 66, which is the vehicle-remote area, is detected in a perspective that shows the pedestrian 104 in the reproduction image, as in representation 105, almost true to life, since the pedestrian 104 is detected almost frontally.
  • The driver assistance system 10, in particular the calculation unit 30, is able to determine from the representations 101, 103, 105 the presence and/or the type of a collision object 100 and then to cause a visual, acoustic and/or haptic warning to be issued to the driver of the vehicle 50 and/or an intervention in the control of the vehicle 50.
  • For this purpose, the calculation unit 30 can draw on different collision object parameters for the different areas 62, 64, 66 and thus perform different collision object determination methods in these areas 62, 64, 66.
  • The calculation unit 30 is adapted to distinguish, for example, a pedestrian from a vehicle, a cyclist, a motorcyclist and/or a stationary collision object on the basis of the representations 101, 103, 105 and their parameters.
  • The calculation unit 30 uses the features typical of the respective collision object, such as its typical speed or ability to change direction, for example to estimate the future trajectory of the collision object.
  • The different silhouettes are, for example, stored or learned as so-called collision object parameters, dependent on the relative position to the recording device, in a memory which the calculation unit 30 can access.
  • The calculation unit 30 can thus determine the type of the collision object 100 and then store the detected collision object parameters in the memory and learn from them.
  • Fig. 4 shows, by way of example, an image acquired by the recording unit 20, which comprises at least part of the vehicle 50, a pedestrian 100, 102, 104 and at least part of the horizon 80, and which is subdivided into the first area 62, the second area 64 and the third area 66.
  • Here, the pedestrian 100, 102, 104 is one and the same pedestrian captured at different times.
  • The pedestrian 100 is detected at a first time in plan view, whereas the pedestrian 102 and the pedestrian 104 are detected at a second and a third time, respectively, later than the first time.
  • Consequently, the pedestrian 100 has a different representation in the reproduction image than the pedestrian 102 or the pedestrian 104.
  • The calculation unit 30 can determine the trajectory of this pedestrian 100, 102, 104 from the detection times and the detection positions of the pedestrian 100, 102, 104 relative to the recording unit 20, and calculate a collision probability with the vehicle 50. If the collision probability exceeds a predetermined threshold value, the calculation unit 30 outputs a signal to the driver and possibly takes preventive measures in the vehicle control in order to prevent a collision with the collision object.
  • Fig. 5 is an exemplary flowchart of the steps performed by the driver assistance system 10.
  • The method begins at step 200, for example when the vehicle 50 is started or when the driver assistance system 10 is activated.
  • The recording unit 20 continuously acquires multi-perspective images of the vehicle environment at step 202 and sends these captured images to the calculation unit 30 at step 204.
  • At query 206, the calculation unit 30 evaluates the received images with regard to objects. If no object is detected in the captured image at step 206, the process returns to step 202. However, if an object, such as a pedestrian 100, is detected in the captured image at step 206, the process proceeds to step 208, where it is determined whether the object is a collision object. Further, at step 208, the collision object type and the collision probability are determined.
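  • Read as pseudocode, the loop of steps 200 to 208 could be sketched as follows; this is a sketch only, and the function names are placeholders rather than part of the patent:

      def driver_assistance_loop(capture_frame, detect_objects, handle_object):
          """Sketch of the Fig. 5 flow: capture (step 202), transfer and
          evaluate (steps 204/206), then assess any detected object (step 208)."""
          while True:                               # step 200: system is active
              image = capture_frame()               # step 202
              objects = detect_objects(image)       # steps 204/206
              if not objects:
                  continue                          # no object: back to step 202
              for obj in objects:
                  handle_object(obj, image)         # step 208 (see Fig. 6)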
  • Step 208 of Fig. 5 represents a subroutine which is shown in more detail in Fig. 6.
  • The subroutine 208 starts at step 300.
  • At step 302, the calculation unit 30 determines the relative position of the object 100 to the recording device from the captured image received from the recording unit 20.
  • In doing so, the calculation unit may additionally take into account data from, for example, a distance sensor mounted on the vehicle 50. Further, in this step, the calculation unit 30 may determine further object parameters, such as the size of the collision object.
  • At step 304, the calculation unit 30 determines the object type from the object parameters determined at step 302, i.e. whether the object is a pedestrian, a cyclist, a vehicle or a static object. For example, if it is determined at step 304 that the object is a pedestrian 100 far away from the vehicle 50, the subroutine 208 may return to the method of Fig. 5, because the object is not a collision object. However, if a critical object, such as a motorcyclist in the immediate vicinity of the vehicle, is determined, the method likewise returns to the method of Fig. 5, because it is determined that the object is a collision object.
  • The calculation unit 30 is capable of additionally performing the following steps.
  • The calculation unit 30 can determine the speed of the collision object from the determined object parameters and/or from the collision object parameters stored for the determined object type. In this case, the calculation unit 30 can also refer back to previously determined positions of the collision object and the associated collision object parameters.
  • The calculation unit 30 also detects the speed of the vehicle 50 at step 306.
  • The calculation unit 30 then estimates in step 310 the future trajectories of the collision object and of the vehicle 50, from which it calculates the collision probability in step 312, before the method returns in step 314 to the method 200 of Fig. 5.
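  • Purely as an illustration, the subroutine 208 could be sketched as follows; every helper and every numeric value below is an assumption standing in for the steps 302 to 314 described above, not the patented method:

      def relative_position(obj):
          """Step 302 (sketch): position of the object relative to the camera,
          here simply read from a detection dictionary."""
          return obj.get("position", (0.0, 0.0))

      def determine_type(obj, position):
          """Step 304 (sketch): object type from position-dependent parameters;
          None means the object is not a collision object."""
          return obj.get("type")

      def subroutine_208(obj, vehicle_speed_mps):
          """Runs through steps 302 to 314 and returns a collision probability,
          or None when the subroutine returns early to the method of Fig. 5."""
          position = relative_position(obj)                 # step 302
          object_type = determine_type(obj, position)       # step 304
          if object_type is None:
              return None                                   # back to Fig. 5
          object_speed = obj.get("speed", 0.0)              # stored parameters
          distance = (position[0] ** 2 + position[1] ** 2) ** 0.5
          closing_speed = vehicle_speed_mps + object_speed  # vehicle speed: step 306
          if closing_speed <= 0.0:                          # trajectories: step 310
              return 0.0
          time_to_collision = distance / closing_speed
          return max(0.0, min(1.0, 1.0 - time_to_collision / 5.0))  # step 312

      # Example: subroutine_208({"type": "pedestrian", "position": (4.0, 3.0),
      #                          "speed": 1.5}, vehicle_speed_mps=8.0)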
  • Alternatively, the calculation unit 30 may omit step 304 of Fig. 6 for determining the type of the object and proceed directly with the calculation of the collision probability.
  • Likewise, the individual steps may be performed in a suitably different order in order to reasonably calculate the collision probability.
EP14190877.2A 2013-11-05 2014-10-29 Système d'assistance à la conduite pour des véhicules, en particulier des véhicules utilitaires Active EP2869284B1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE201310018543 DE102013018543A1 (de) 2013-11-05 2013-11-05 Fahrerassistenzsystem für Fahrzeuge, insbesondere Nutzfahrzeuge

Publications (2)

Publication Number Publication Date
EP2869284A1 true EP2869284A1 (fr) 2015-05-06
EP2869284B1 EP2869284B1 (fr) 2016-10-12

Family

ID=51845303

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14190877.2A Active EP2869284B1 (fr) 2013-11-05 2014-10-29 Système d'assistance à la conduite pour des véhicules, en particulier des véhicules utilitaires

Country Status (3)

Country Link
EP (1) EP2869284B1 (fr)
DE (1) DE102013018543A1 (fr)
ES (1) ES2603002T3 (fr)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015011536A1 (de) 2015-09-02 2017-03-02 Man Truck & Bus Ag Spiegelersatzsystem als Kamera-Monitor-System (KMS) eines Kraftfahrzeugs, insbesondere eines Nutzfahrzeugs
DE102016008218A1 (de) 2016-07-06 2018-01-11 Audi Ag Verfahren zum verbesserten Erkennen von Objekten durch ein Fahrerassistenzsystem
US20180372875A1 (en) * 2017-06-27 2018-12-27 Uber Technologies, Inc. Sensor configuration for an autonomous semi-truck
DE102017215379A1 (de) 2017-09-01 2019-03-07 Robert Bosch Gmbh Verfahren zur Ermittlung einer Kollisionsgefahr
DE102018108751B4 (de) 2018-04-12 2023-05-04 Motherson Innovations Company Limited Verfahren, System und Vorrichtung zum Erhalten von 3D-Information von Objekten
DE102019204752A1 (de) * 2019-04-03 2020-03-26 Thyssenkrupp Ag Verfahren und Einrichtung zum Betrieb von insbesondere im Tagebau einsetzbaren Abraum- und Fördermaschinen


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080300755A1 (en) * 2007-05-30 2008-12-04 Dinu Petre Madau Side collision avoidance system
DE102011010524A1 (de) * 2011-02-08 2012-08-09 Hester Döring Bewegliche Schranktheke mit zwei Zugängen
DE102011016775A1 (de) * 2011-04-12 2011-12-15 Daimler Ag Assistenzsystem sowie Verfahren zum Betrieb eines Assistenzsystems

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10336638A1 (de) 2003-07-25 2005-02-10 Robert Bosch Gmbh Vorrichtung zur Klassifizierung wengistens eines Objekts in einem Fahrzeugumfeld
DE102006031895A1 (de) * 2006-07-07 2008-01-10 Siemens Ag Anzeigesystem zur Visualisierung von Fahrzeugabständen
DE102007059735A1 (de) 2006-12-12 2008-07-24 Cognex Corp., Natick Stereo-Sichtsystem für Fahrzeuge zur Erkennung seitlich liegender Hindernisse
DE102011010624A1 (de) 2011-02-08 2012-08-09 Mekra Lang Gmbh & Co. Kg Anzeigevorrichtung für Sichtfelder eines Nutzfahrzeugs
DE102011109459A1 (de) 2011-08-04 2013-02-07 Man Truck & Bus Ag Verfahren zum Erfassen von Objekten seitlich eines Nutzfahrzeugs und Nutzfahrzeug mit einem Erfassungssystem zum Ausführen des Verfahrens
DE102011116771A1 (de) * 2011-10-22 2013-04-25 Valeo Schalter Und Sensoren Gmbh Verfahren zum Anzeigen von Bildinformationen auf einer Anzeigeeinheit eines Fahrzeugs sowie Fahrerassistenzeinrichtung zum Durchführen eines derartigen Verfahrens

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109644251A (zh) * 2016-07-22 2019-04-16 康蒂-特米克微电子有限公司 用于拍摄本车的环境区域的摄像装置及用于提供驾驶员辅助功能的方法
CN109562757A (zh) * 2016-08-22 2019-04-02 索尼公司 驾驶辅助装置、驾驶辅助方法、移动体和程序
CN109562757B (zh) * 2016-08-22 2022-07-12 索尼公司 驾驶辅助装置、驾驶辅助方法、移动体和程序
CN109952518A (zh) * 2016-09-09 2019-06-28 克诺尔商用车制动系统有限公司 用于对车辆的车辆驾驶员进行物体的警示的设备及具有这种设备的车辆

Also Published As

Publication number Publication date
ES2603002T3 (es) 2017-02-23
DE102013018543A1 (de) 2015-05-07
EP2869284B1 (fr) 2016-10-12

Similar Documents

Publication Publication Date Title
EP2869284B1 (fr) Système d'assistance à la conduite pour des véhicules, en particulier des véhicules utilitaires
DE102019205223A1 (de) Einfädelverhaltenssysteme und Verfahren zum Einfädeln von Fahrzeugen
DE60122963T2 (de) Navigationsvorrichtung
EP2620929B1 (fr) Procédé et dispositif de reconnaissance d'une situation particulière dans le trafic routier
DE102019205228A1 (de) Einfädelverhaltenssysteme und Verfahren für Fahrzeuge auf einer Hauptspur
EP2998937B1 (fr) Dispositif d'affichage pour vehicules, notamment vehicules utilitaires
DE10336638A1 (de) Vorrichtung zur Klassifizierung wengistens eines Objekts in einem Fahrzeugumfeld
DE102011121948A1 (de) Vorausschau auf Aktionen eines autonomen Fahrsystems
DE102008011228A1 (de) Verfahren zur Unterstützung eines Nutzers eines Fahrzeugs, Steuereinrichtung für ein Fahrerassistenzsystem eines Fahrzeugs und Fahrzeug mit einer derartigen Steuereinrichtung
EP1554604A1 (fr) Procede et dispositif pour empecher la collision de vehicules
DE102011080928A1 (de) Verfahren zur Unterstützung eines Fahrers eines Kraftfahrzeugs
DE10336986A1 (de) Verfahren zum Vermeiden von Kollisionen eines Fahrzeugs
WO2007051835A1 (fr) Dispositif d’assistance pour conducteur lors de la conduite d’un véhicule
DE102006016807A1 (de) Verfahren und Vorrichtung zur Verbesserung der Sicht von Fahrern von Kraftfahrzeugen
DE102017214969A1 (de) Verfahren und Vorrichtung zur Falschfahrwarnung eines Fahrzeugs nach einem Unfall und/oder einer sicherheitskritischen Fahrsituation, insbesondere nach einem zwischenzeitlichen Stillstand des Fahrzeugs
DE102008020007A1 (de) Verfahren zum Unterstützen eines Fahrers beim Fahren mit einem Fahrzeug mit einer Fahrspurerkennung
EP1652161B1 (fr) Dispositif pour classifier au moins un objet dans un environnement de vehicule
WO2019121243A1 (fr) Dispositif d'alerte de situations de danger pour un véhicule automobile
EP1028387B1 (fr) Dispositif de reconnaissance de l'environnement, en particulier la reconnaissance de panneaux de signalisation routière
EP2555178B1 (fr) Procédé de détection d'objets placés sur le côté d'un véhicule utilitaire et véhicule utilitaire avec système de détection permettant de réaliser le procédé
DE102011016217A1 (de) Verfahren und Kamerasystem zum Warnen eines Fahrers eines Kraftfahrzeugs vor einer Kollisionsgefahr und Kraftfahrzeug mit einem Kamerasystem
DE102018220791A1 (de) Verfahren und Fahrzeug zum Reagieren auf ein Objekt im Fahrzeugumfeld
WO2018149768A1 (fr) Activation automatisée d'un système d'assistance visuelle
DE102011080720A1 (de) Visualisierung einer Rampenabfahrt
DE102015220312A1 (de) Verfahren zur Warnung eines Fahrers vor toten Winkeln und Vorrichtung zur Durchführung des Verfahrens

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141029

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

R17P Request for examination filed (corrected)

Effective date: 20150710

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17Q First examination report despatched

Effective date: 20150818

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20160520

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 837128

Country of ref document: AT

Kind code of ref document: T

Effective date: 20161015

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502014001694

Country of ref document: DE

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 3

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: ES

Ref legal event code: FG2A

Ref document number: 2603002

Country of ref document: ES

Kind code of ref document: T3

Effective date: 20170223

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170113

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170112

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170213

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170212

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502014001694

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20170112

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161029

26N No opposition filed

Effective date: 20170713

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 4

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20161029

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20161031

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20141029

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171031

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20171031

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161012

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 837128

Country of ref document: AT

Kind code of ref document: T

Effective date: 20191029

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191029

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230427

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 502014001694

Country of ref document: DE

Representative=s name: ALPSPITZ IP ALLGAYER UND PARTNER PATENTANWAELT, DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20231023

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231025

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: ES

Payment date: 20231117

Year of fee payment: 10

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20231025

Year of fee payment: 10

Ref country code: IT

Payment date: 20231031

Year of fee payment: 10

Ref country code: FR

Payment date: 20231023

Year of fee payment: 10

Ref country code: DE

Payment date: 20231018

Year of fee payment: 10