WO2016087900A1 - Driver assistance system for a vehicle and method for assisting the driver of a vehicle - Google Patents


Info

Publication number
WO2016087900A1
WO2016087900A1 (PCT/IB2014/066619)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
sensor
surroundings
camera
driver assistance
Prior art date
Application number
PCT/IB2014/066619
Other languages
English (en)
Inventor
Carsten KAUSCH
Original Assignee
Audi Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag filed Critical Audi Ag
Priority to CN201480083866.1A priority Critical patent/CN107003389A/zh
Priority to PCT/IB2014/066619 priority patent/WO2016087900A1/fr
Publication of WO2016087900A1 publication Critical patent/WO2016087900A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to group G01S13/00
    • G01S7/04 Display arrangements
    • G01S7/06 Cathode-ray tube displays or other two-dimensional or three-dimensional displays
    • G01S7/062 Displays in which different colours are used
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for anti-collision purposes
    • G01S13/931 Anti-collision systems for land vehicles
    • G01S2013/9323 Alternative operation using light waves
    • G01S2013/9324 Alternative operation using ultrasonic waves
    • G01S2013/9327 Sensor installation details
    • G01S2013/93273 Sensor installation details on the top of the vehicle

Definitions

  • the invention relates to a driver assistance system for a vehicle and to a method for assisting a driver in driving a vehicle.
  • US 2007/0106440 Al shows a vehicle parking assisting system comprising camera means for taking picture images outside a vehicle.
  • the vehicle parking assisting system further comprises image generating means for generating a bird's-eye view image by subjecting the picture images to a bird's-eye view conversion.
  • the vehicle parking assisting system further comprises motion detecting means for detecting a motion of the vehicle and combined image generating means for generating a combined bird's-eye view image by combining, based on the motion of the vehicle, a past bird's-eye view image generated previously by the image generating means or a past combined bird's-eye view image generated previously with a present bird's-eye view image generated at present.
  • the vehicle parking assisting system further comprises obstacle detecting means for detecting an obstacle in a fan-shaped detection area around the vehicle, and display means for displaying, on the combined bird's-eye view image, both a present detection point and a past detection point for the obstacle in an overlapping manner.
  • the display means displays the present detection point as a series of marks extending from a predetermined position in the fan-shaped detection area in an arcuate shape along the detection area of the obstacle detecting means, and the past detection point as a mark located at the predetermined position in the fan-shaped detection area.
  • US 2012/0154592 Al shows an image-processing system operative to process image data obtained by capturing images outside a periphery of a vehicle, the image-processing system comprising a plurality of image-capturing units that is fixed to the vehicle and that generates image-data items by capturing images outside the periphery of the vehicle.
  • the image-processing system further comprises a bird's-eye-view-image drawing unit configured to generate a bird's-eye-view image by determining a viewpoint above the vehicle for each of the image-data items generated by the image-capturing units based on the image-data item so that end portions of real spaces corresponding to two adjacent bird's-eye- view images overlap each other.
  • a first aspect of the invention relates to a driver assistance system for a vehicle, the driver assistance system comprising a sensor system for observing at least a portion of the vehicle's surroundings.
  • the sensor system comprises at least one camera configured to take images of at least the portion of the surroundings. Moreover, the camera is configured to provide image data characterizing the images.
  • the sensor system further comprises at least one sensor being different from the camera.
  • the sensor is configured to capture at least one parameter characterizing at least the portion of the surroundings. For example, objects arranged in the portion of the surroundings can be detected by the sensor and characterized by the parameter, wherein the sensor is configured to provide sensor data characterizing the parameter.
  • said sensor data represent the parameter and, thus, objects detected by the sensor, for example.
  • the sensor system comprises a data fusion unit configured to merge the image data and the sensor data thereby creating virtual three-dimensional images of at least the portion of the surroundings.
  • the image data and the sensor data are transmitted to the data fusion unit which receives the image data and the sensor data.
  • the data received by the data fusion unit are combined in such a way that three-dimensional images of at least the portion of the surroundings are created on the basis of the received data.
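The merging step described above can be sketched in code. The following is a minimal, hypothetical illustration; the class names, field names, and the bearing-matching heuristic are assumptions for the sketch and are not taken from the patent. Camera objects carry a label and a bearing from the image, sensor detections carry a bearing and a measured distance, and the fusion unit pairs them so that each displayed object gains the depth information a three-dimensional view needs.

```python
from dataclasses import dataclass


@dataclass
class CameraObject:
    label: str           # object class recognised in the camera image
    bearing_deg: float   # direction of the object relative to the vehicle


@dataclass
class SensorDetection:
    bearing_deg: float   # direction of the echo relative to the vehicle
    distance_m: float    # range measured by the non-camera sensor


def fuse(camera_objects, sensor_detections, max_bearing_gap_deg=5.0):
    """Pair each camera object with the sensor detection whose bearing is
    closest, yielding (label, distance) records -- the minimal ingredient
    of a virtual three-dimensional scene."""
    fused = []
    for obj in camera_objects:
        candidates = [d for d in sensor_detections
                      if abs(d.bearing_deg - obj.bearing_deg) <= max_bearing_gap_deg]
        if candidates:
            nearest = min(candidates,
                          key=lambda d: abs(d.bearing_deg - obj.bearing_deg))
            fused.append((obj.label, nearest.distance_m))
    return fused
```

A camera object with no sensor echo within the bearing tolerance is simply dropped in this sketch; a real fusion unit would handle that case more carefully.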
  • the driver can be assisted in parking the vehicle so that the driver can park the vehicle in particularly narrow parking places without colliding with objects bounding the parking places.
  • the driver assistance system according to the present invention can help the driver stay calm and remain in control in a chaotic traffic situation.
  • the driver assistance system comprises at least one display unit configured to present the three-dimensional images to the driver of the vehicle.
  • the driver can visually perceive at least the portion of the surroundings in a three-dimensional manner by means of the display unit showing the three-dimensional images so that current traffic situations can be visualized and the driver's view can be extended by means of the driver assistance system.
  • the risk of collision between the vehicle and other traffic participants can be kept particularly low.
  • the driver can drive the vehicle through very narrow gaps bounded by other traffic participants without colliding with the other traffic participants on the basis of three-dimensional images shown by the display.
  • the display unit is configured to present objects in different colors on the basis of at least one predetermined criterion.
  • Said objects are part of the three-dimensional images displayed by the display unit.
  • said objects are different from the vehicle, arranged in the vehicle's surroundings and detected by the sensor system.
  • said criterion can be a distance between the detected objects and the vehicle and/or a relative speed between the objects and the vehicle and/or respective directions of movement of the vehicle and the objects.
  • objects approaching the vehicle are shown in red since the vehicle can potentially collide with said objects.
  • objects moving away from the vehicle and/or objects with a constant distance from the vehicle are shown in grey since the risk of a collision between the vehicle and these objects is particularly low.
  • the display unit is configured to show objects in the surroundings of the vehicle in different colors on the basis of a determined hazardous situation.
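The color criterion described in the bullets above can be expressed as a small classification function. This is a hedged sketch: the function name, parameters, and the specific threshold are illustrative assumptions, not values specified in the patent.

```python
def object_color(distance_m, closing_speed_mps, red_distance_m=15.0):
    """Choose the display color for a detected object.

    closing_speed_mps > 0 means the object and the vehicle approach each
    other; such objects within red_distance_m are shown in red as
    potential collision objects, while receding or constant-distance
    objects are shown in grey.
    """
    if closing_speed_mps > 0 and distance_m <= red_distance_m:
        return "red"
    return "grey"
```

The same criterion could be extended with further colors or with the relative direction of movement, as the description suggests.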
  • the camera is configured as an ultra-high definition camera (UHD camera) or a light fidelity camera (LiFi camera).
  • a light fidelity camera, also referred to as a LiFi camera, uses an optical data transmission system which is also referred to as visible light communication.
  • the sensor is arranged above a roof of the vehicle so that a particularly large portion of the surroundings can be detected by the sensor.
  • the sensor and/or the camera is mounted on a roof of the vehicle so that the surroundings of the vehicle can be detected particularly effectively.
  • the sensor is configured as an electromagnetic sensor or an ultrasonic sensor.
  • the sensor system is configured as a near-range sensor system having a detection range of 15 meters at the most.
  • the near surroundings of the vehicle can be observed particularly precisely so that the risk of collisions between the vehicle and other objects can be kept particularly low.
  • a near-range surrounding overview assistant can be realized to present a three-dimensional virtual overview of the near-range surroundings of the vehicle to the driver.
  • overtaking actions and/or parking actions and/or low range steering can be assisted by the driver assistance system.
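A near-range sensor system with the bounded detection range described above could be modelled as follows. The 15-meter limit comes from the text; the function name and the (label, distance) tuple layout are assumptions for illustration only.

```python
NEAR_RANGE_LIMIT_M = 15.0  # maximum detection range stated in the description


def near_range_view(detections):
    """Keep only detections inside the near range and sort them by
    distance, nearest first -- the data a near-range surrounding
    overview assistant would present to the driver.

    Each detection is a (label, distance_m) tuple.
    """
    in_range = [d for d in detections if d[1] <= NEAR_RANGE_LIMIT_M]
    return sorted(in_range, key=lambda d: d[1])
```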
  • the sensor system comprises at least one sensor band mounted on an outside of a body of the vehicle, the sensor band extending across at least a major portion of the length of the vehicle, said length extending in the longitudinal direction of the vehicle.
  • the sensor system comprises at least one sensor box arranged above a roof of the vehicle. Moreover, the sensor system comprises a plurality of sensors arranged in the sensor box so that the surroundings of the vehicle can be observed particularly precisely.
  • the invention also relates to a method for assisting a driver in driving a vehicle, the method comprising observing at least a portion of the vehicle's surroundings by means of a sensor system comprising at least one camera, at least one sensor being different from the camera, and a data fusion unit.
  • the method further comprises taking images of at least the portion of the surroundings by means of the camera. Image data characterizing the images are provided by the camera. At least one parameter characterizing at least the portion of the surroundings is captured by means of the sensor. Moreover, sensor data characterizing the parameter are provided by the sensor. Furthermore, the image data and the sensor data are merged by means of the data fusion unit thereby creating virtual three-dimensional images of at least the portion of the surroundings.
  • Advantageous embodiments and advantages of the driver assistance system according to the present invention are to be regarded as advantageous embodiments and advantages of the method according to the present invention and vice versa.
  • Fig. 1 a schematic side view of a vehicle in the form of a passenger vehicle comprising a driver assistance system for assisting a driver of the vehicle in driving;
  • Fig. 2 a schematic front view of a display unit of the driver assistance system, the display unit showing a three-dimensional image of at least a portion of the surroundings of the vehicle.
  • Fig. 1 shows a vehicle 10 in the form of a passenger vehicle.
  • the vehicle 10 comprises a driver assistance system having a sensor system 12 for observing at least a portion of the surroundings 14 of the vehicle 10.
  • the sensor system comprises a plurality of cameras 16 and 18.
  • the camera 16 is configured as a LiFi camera or an ultra-high definition camera (UHD camera), wherein the cameras 18 are configured as conventional cameras.
  • the sensor system 12 comprises a plurality of sensors 20 which are configured as ultrasonic sensors.
  • the sensor system 12 comprises a plurality of sensors which are arranged in a sensor box 22 arranged above a roof 24 of the vehicle 10, the sensor box 22 being mounted on the roof 24.
  • the sensor system 12 comprises a sensor band 26 which extends around at least a major portion of the circumference of the vehicle 10.
  • the sensor band 26 extends completely around the body 28 of the vehicle 10 and, thus, the vehicle 10 itself, wherein the sensor band 26 is mounted on an outside 30 of the body 28.
  • the sensor band 26 comprises a plurality of sensors which are configured as, for example, electromagnetic sensors and/or cameras and/or ultrasonic sensors. Said sensors of the sensor system 12 are different from the cameras 16 and 18.
  • the cameras 16 and 18 are configured to take images of at least a portion of the surroundings 14, wherein the cameras 16 and 18 provide image data characterizing the taken images.
  • Said sensors being different from the cameras 16 and 18 are configured to capture at least one parameter characterizing at least the portion of the surroundings 14.
  • said sensors are configured to detect objects arranged in the surroundings 14 and respective distances between the detected objects and the vehicle 10.
  • said sensors are configured to provide sensor data characterizing said parameter which in turn characterizes the detected objects and distances.
  • the sensor system 12 comprises a data fusion unit which receives the image data and the sensor data.
  • the data fusion unit is configured to merge the received image data and the received sensor data thereby creating virtual three-dimensional images of at least the portion of the surroundings 14.
  • the sensor system 12 comprises at least one display unit 32 which is arranged in the interior of the vehicle 10.
  • the display unit is configured to present the created three-dimensional images to the driver of the vehicle 10, wherein one of said three-dimensional images can be seen in Fig. 2.
  • Said three-dimensional image presented by the display unit 32 comprises or shows the vehicle 10 itself as well as other traffic participants such as other vehicles 34 and 36, a cyclist 38 and pedestrians 40, each in a three-dimensional manner.
  • the driver of the vehicle 10 can visually perceive the orientation of the vehicle 10 in relation to said other traffic participants. Thereby, the driver can drive the vehicle 10 in a particularly safe manner without colliding with the other traffic participants.
  • the sensor system 12 has a detection range of 15 meters at the most so that a near-range overview of the surroundings 14 can be presented to the driver by the display unit 32.
  • the display unit 32 and, thus, the sensor system 12 are configured to present objects and, thus, said traffic participants detected by the sensor system 12 in different colors on the basis of at least one predetermined criterion.
  • the criterion comprises a distance between the vehicle 10 and the respective other traffic participants as well as a direction of movement of the respective other traffic participant in relation to the own vehicle 10.
  • the vehicle 36 and the pedestrians 40 are shown in grey color in the three-dimensional image since the vehicle 36 and the pedestrians 40 do not approach the own vehicle 10.
  • the vehicle 36 moves away from the vehicle 10 so that the distance between the vehicle 10 and the vehicle 36 increases.
  • the pedestrians 40 stand still so that respective distances between the pedestrians 40 and the vehicle 10 are constant or increase.
  • the risk of collisions between the vehicle 10 and the vehicle 36 and the pedestrians 40 is below a predetermined threshold value so that the vehicle 36 and the pedestrians 40 are shown in grey in the three-dimensional image.
  • the vehicle 34 and the cyclist 38 are, for example, shown in red color since a respective distance between the vehicle 10 and the vehicle 34 and between the vehicle 10 and the cyclist 38 is below a threshold value and the vehicle 34 and the cyclist 38 approach the vehicle 10.
  • the vehicle 34 and the cyclist 38 are potential collision objects which can potentially collide with the vehicle 10. This means the risk of a collision between the vehicle 10 and the vehicle 34 and a risk of a collision between the vehicle 10 and the cyclist 38 exceed the threshold value so that the vehicle 34 and the cyclist 38 are shown in red in the three-dimensional image.
  • the driver can be assisted in driving in heavy traffic so that the risk of collisions between the vehicle 10 and other traffic participants can be kept particularly low.
  • the driver can avoid collisions between the vehicle 10 and other traffic participants on the basis of the presented three-dimensional images.
  • the driver can be assisted in parking the vehicle in narrow parking places without colliding with objects bounding the parking places.


Abstract

The invention relates to a driver assistance system for a vehicle (10), the driver assistance system comprising a sensor system (12) for observing at least a portion of the surroundings (14) of the vehicle. The sensor system (12) comprises: at least one camera (16, 18) configured to take images of at least the portion of the surroundings (14) and to provide image data characterizing the images; at least one sensor (20), different from the camera (16, 18), configured to capture at least one parameter characterizing at least the portion of the surroundings (14), said sensor (20) being configured to provide sensor data characterizing the parameter; and a data fusion unit configured to merge the image data and the sensor data so as to create virtual three-dimensional images of at least the portion of the surroundings (14).
PCT/IB2014/066619 2014-12-05 2014-12-05 Driver assistance system for a vehicle and method for assisting the driver of a vehicle WO2016087900A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201480083866.1A CN107003389A (zh) 2014-12-05 2014-12-05 Driver assistance system for a vehicle and method for assisting the driver of a vehicle
PCT/IB2014/066619 WO2016087900A1 (fr) 2014-12-05 2014-12-05 Driver assistance system for a vehicle and method for assisting the driver of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2014/066619 WO2016087900A1 (fr) 2014-12-05 2014-12-05 Driver assistance system for a vehicle and method for assisting the driver of a vehicle

Publications (1)

Publication Number Publication Date
WO2016087900A1 true WO2016087900A1 (fr) 2016-06-09

Family

ID=52434876

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/066619 WO2016087900A1 (fr) 2014-12-05 2014-12-05 Driver assistance system for a vehicle and method for assisting the driver of a vehicle

Country Status (2)

Country Link
CN (1) CN107003389A (fr)
WO (1) WO2016087900A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070106440A1 (en) 2005-11-04 2007-05-10 Denso Corporation Vehicle parking assisting system and method
US20120154592A1 (en) 2007-10-15 2012-06-21 Alpine Electronics, Inc. Image-Processing System and Image-Processing Method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5407764B2 (ja) * 2009-10-30 2014-02-05 Toyota Motor Corporation Driving support device
DE102010034139A1 (de) * 2010-08-12 2012-02-16 Valeo Schalter und Sensoren GmbH Method for supporting a parking process of a motor vehicle, driver assistance system and motor vehicle
DE102010040803A1 (de) * 2010-09-15 2012-03-15 Continental Teves AG & Co. OHG Visual driver information and warning system for a driver of a motor vehicle
DE102011077143A1 (de) * 2011-06-07 2012-12-13 Robert Bosch GmbH Vehicle camera system and method for providing a gapless image of the vehicle surroundings


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AMIR ILIAIFAR: "LIDAR, lasers, and beefed up computers: the intricate anatomy of an autonomous vehicle", 6 February 2013 (2013-02-06), XP055197393, Retrieved from the Internet <URL:http://www.digitaltrends.com/cars/lidar-lasers-and-beefed-up-computers-the-intricate-anatomy-of-an-autonomous-vehicle/> [retrieved on 20150622] *
ANONYMOUS: "Fujitsu Develops World's First 3D Image Synthesis Technology to Display Vehicle Exterior without Distortion - Fujitsu Global", 9 October 2013 (2013-10-09), XP055197255, Retrieved from the Internet <URL:http://www.fujitsu.com/global/about/resources/news/press-releases/2013/1009-03.html> [retrieved on 20150622] *
SEBASTIAN THRUN ET AL: "Stanley: The robot that won the DARPA Grand Challenge", JOURNAL OF FIELD ROBOTICS, JOHN WILEY & SONS, INC, US, vol. 23, no. 9, 1 September 2006 (2006-09-01), pages 661 - 692, XP009156447, ISSN: 1556-4959, [retrieved on 20060925], DOI: 10.1002/ROB.20147C *

Also Published As

Publication number Publication date
CN107003389A (zh) 2017-08-01

Similar Documents

Publication Publication Date Title
CN104608692B (zh) Parking assistance system and method thereof
US10558221B2 Method for assisting in a parking operation for a motor vehicle, driver assistance system and a motor vehicle
US9505346B1 System and method for warning a driver of pedestrians and other obstacles
JP5523448B2 (ja) Driving support system, information display device, and information display program
US9499168B2 Vehicle periphery display device
US20140240502A1 Device for Assisting a Driver Driving a Vehicle or for Independently Driving a Vehicle
US20100329510A1 Method and device for displaying the surroundings of a vehicle
WO2016002163A1 (fr) Image display device and image display method
JP6379779B2 (ja) Display device for vehicle
KR101611194B1 (ko) Apparatus and method for generating a vehicle surroundings image
US10618467B2 Stereo image generating method using mono cameras in vehicle and providing method for omnidirectional image including distance information in vehicle
JP2012198207A (ja) Image processing device and method, and moving body collision prevention device
WO2007015446A1 (fr) Vehicle surroundings monitoring device and vehicle surroundings monitoring method
JP2005045602A (ja) Vehicle visibility monitor system
US20190135169A1 Vehicle communication system using projected light
JP2009040108A (ja) Image display control device and image display control system
JP6375816B2 (ja) Vehicle surroundings information display system and display device
JP5776995B2 (ja) Vehicle periphery monitoring device
WO2017154787A1 (fr) Parking area display system and automatic parking system using the same
JP2019120994A (ja) Display control device and display control method
JP2020065141A (ja) Vehicle bird's-eye view image generation system and method thereof
JP2009231937A (ja) Vehicle surroundings monitoring device
WO2019193715A1 (fr) Driving assistance device
JP2012001126A (ja) Vehicle surroundings monitoring device
JP5083142B2 (ja) Vehicle periphery monitoring device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14833191; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14833191; Country of ref document: EP; Kind code of ref document: A1)