EP3526725A1 - Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle - Google Patents

Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle

Info

Publication number
EP3526725A1
EP3526725A1 (application EP17757697.2A)
Authority
EP
European Patent Office
Prior art keywords
sensor data
motor vehicle
wavelength range
basis
classified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17757697.2A
Other languages
English (en)
French (fr)
Inventor
James Mcdonald
John Mcdonald
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Connaught Electronics Ltd
Original Assignee
Connaught Electronics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Connaught Electronics Ltd filed Critical Connaught Electronics Ltd
Publication of EP3526725A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Definitions

  • the present invention relates to a method for detecting objects in an environmental region of a motor vehicle, wherein by means of a detecting device of the motor vehicle first sensor data describing the environmental region in the visible wavelength range, and second sensor data describing the environmental region in the infrared wavelength range, are provided and the object is detected on the basis of the first sensor data and/or the second sensor data.
  • the present invention relates to an object detection apparatus.
  • the present invention relates to a driver assistance system.
  • the present invention relates to a motor vehicle.
  • the interest is presently directed to the detection of objects in an environmental region of a motor vehicle.
  • objects in the environment are detected with sensors, which are arranged on the motor vehicle.
  • the information about the objects can then be used by the driver assistance systems of the motor vehicle.
  • a warning to the driver of the motor vehicle can be outputted if a collision with an object or obstacle is imminent.
  • the objects can, for example, be detected with the aid of cameras that provide images of the environmental region. Using appropriate object detection methods, the objects can then be detected in the images.
  • the objects are classified or assigned to a group.
  • the objects can be detected as pedestrians, vehicles or road markings. Further, it may be provided that the objects are classified as static or dynamic objects.
  • WO 2015/161208 A1 discloses a vehicle with a vision system comprising a stereo camera for light in the visible wavelength range.
  • the vision system comprises an infrared camera.
  • the vision system comprises a camera for the near infrared wavelength range.
  • first sensor data describing in particular the environmental region in the visible wavelength range, and second sensor data describing in particular the environmental region in the infrared wavelength range are provided.
  • the object is preferably detected on the basis of the first sensor data and/or the second sensor data.
  • the object is preferably classified as a static object or as a dynamic object on the basis of the first sensor data and/or the second sensor data, and if the object is classified as a static or as a dynamic object, it is preferably verified on the basis of the second sensor data whether the object is a plant.
  • a method according to the invention serves for detecting objects in an environmental region of a motor vehicle.
  • By means of a detecting device of the motor vehicle, first sensor data describing the environmental region in the visible wavelength range and second sensor data describing the environmental region in the infrared wavelength range are provided.
  • the object is detected on the basis of the first sensor data and/or the second sensor data.
  • the object is preferably classified as a static object or as a dynamic object on the basis of the first sensor data and/or the second sensor data. If the object is classified as a static or as a dynamic object, it is verified on the basis of the second sensor data whether the object is a plant.
  • the first sensor data are provided with the detection device.
  • the first sensor data describe the environmental region in the visible wavelength range.
  • the first sensor data describe in particular light in the visible wavelength range, which is reflected from the object and/or emitted by the object.
  • the second sensor data are provided by means of the detection device, which describe the surrounding region and particularly the object in the infrared wavelength range.
  • the second sensor data describe in particular the radiation in the infrared wavelength range, which is reflected by the object and/or is emitted from it.
  • On the basis of the first sensor data and/or the second sensor data, the object can then be detected.
  • a computing device can be used by means of which the first sensor data and/or the second sensor data can be evaluated. If the first sensor data and the second sensor data are provided each as an image, an appropriate object recognition algorithm can for example be carried out using the computing device to recognize the object. In this case, for example, methods of segmentation and/or classification can be used to detect the object in the images. It can also be provided that the detected object is compared with known forms. In particular, it is provided that the object is classified based on the first sensor data and/or the second sensor data.
  • In the classification, it can in particular be differentiated between a static or non-moving object on the one hand and a dynamic or moving object on the other hand.
  • By dynamic objects, those objects are to be understood which move relative to the motor vehicle.
  • the dynamic objects can move in particular on a road or a floor.
  • By static objects, those objects are to be understood which do not move relative to the motor vehicle.
  • the static objects can be located in particular on the road or near the road.
  • If the object is classified as a static object or as a dynamic object, it is verified on the basis of the second sensor data whether the object is a plant.
  • This is based on the finding that plants can move as a result of environmental influences. For example, the plants and in particular their leaves can move in wind or precipitation. This can lead to the situation that these plants or parts thereof are classified as a moving object.
  • textures in static vegetation can mimic the patterns of objects or pedestrians used in classification methodologies. Thus, these textures of the static plants can be classified as static objects. If the object is classified as a static or as a moving object, the second sensor data describing the object can be used.
  • the infrared radiation, and particularly the near infrared radiation which is reflected and/or emitted from the object is examined.
  • plants typically reflect or scatter radiation in the near infrared wavelength region during photosynthesis.
  • the plants reflect or scatter the radiation in the near infrared wavelength range in order to prevent overheating and damage to the cells.
  • On the basis of the second sensor data, it is possible to distinguish the objects classified as static or as moving objects from plants.
  • the false positive rate can be significantly reduced.
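The verification logic described in the bullets above can be sketched as follows; the function names and the NIR-brightness threshold are illustrative assumptions for this sketch, not values from the patent:

```python
import numpy as np

# Illustrative threshold: mean normalized NIR reflectance above which an
# object region is treated as vegetation (an assumption, not from the patent).
NIR_REFLECTANCE_THRESHOLD = 0.5

def is_probably_plant(nir_patch, threshold=NIR_REFLECTANCE_THRESHOLD):
    """Check on the second (infrared) sensor data whether a detected
    object reflects strongly in the near infrared band, as plants do."""
    # nir_patch: the object's image patch with reflectance values in [0, 1].
    return float(np.asarray(nir_patch).mean()) > threshold

def verify_detection(classification, nir_patch):
    """Only objects classified as static or dynamic are re-checked;
    a high NIR response re-labels them as vegetation."""
    if classification in ("static", "dynamic") and is_probably_plant(nir_patch):
        return "plant"
    return classification

# A NIR-bright patch (foliage) is re-labelled; a NIR-dark one is kept.
foliage = np.full((8, 8), 0.8)
pedestrian = np.full((8, 8), 0.2)
```

This keeps the original classifier untouched and only adds a second-stage check, which is one way the false-positive reduction described above could be realized.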
  • the object can be classified as a static pedestrian. As already mentioned, this may be due to the texture of the plants.
  • By the verification based on the second sensor data, it can be determined in a reliable manner whether it is a pedestrian or a plant, for example a tree, a bush or the like. This information can then be used for example by a driver assistance system of the motor vehicle to avoid a collision between the motor vehicle and the pedestrian.
  • the object that is classified as static or as dynamic is detected as a plant on the basis of the radiation emitted from the object in the near infrared wavelength range, in particular in a wavelength range of between 0.7 μm and 1.1 μm.
  • the knowledge is taken into account that plants especially during photosynthesis significantly reflect or scatter radiation in the near infrared region to prevent overheating and thus damage to the cells. If it is now recognized that the object emits radiation in this wavelength range, it can be assumed with a high probability that it is a plant.
  • the first sensor data can be used to assess the object that is classified as a static object or as a dynamic object more closely. For example, it can be verified which colour the object has. If the object has a green colour, it can be assumed with a high probability that it is a plant.
  • the colour of the object or a part of the object can be compared with predetermined colours, which describe different plants. This takes into account that leaves of trees and shrubs can have different colours. Thus, the false positive rate can be further reduced.
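A colour comparison of this kind could be sketched as below; the reference colours and the hue tolerance are made-up illustrative values, not taken from the patent:

```python
import colorsys

# Illustrative reference colours describing different kinds of foliage
# (assumed RGB values, accounting for leaves of different colours).
PLANT_REFERENCE_COLOURS = {
    "green leaf": (60, 140, 60),
    "autumn leaf": (190, 120, 40),
}

def matches_plant_colour(rgb, max_hue_difference=0.08):
    """Compare the object's mean colour against predetermined plant
    colours by hue, which varies less with lighting than raw RGB."""
    hue = colorsys.rgb_to_hls(*(c / 255.0 for c in rgb))[0]
    for reference in PLANT_REFERENCE_COLOURS.values():
        ref_hue = colorsys.rgb_to_hls(*(c / 255.0 for c in reference))[0]
        difference = abs(hue - ref_hue)
        # Hue is circular, so take the shorter way around the colour wheel.
        if min(difference, 1.0 - difference) < max_hue_difference:
            return True
    return False
```

Comparing in hue rather than raw RGB is a design choice for this sketch; it makes the match less sensitive to overall brightness changes in the scene.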
  • the object that is classified as static or as dynamic is detected as a plant based on a radiation emitted from the object in a wavelength range of between 0.4 μm and 0.7 μm.
  • the chlorophyll in the plants and in particular in the leaves of plants reflects or scatters light in the visible wavelength range of between 0.4 μm and 0.7 μm.
  • a normalized difference vegetation index of the object that is classified as static or as dynamic is determined on the basis of the first sensor data and the second sensor data, and it is verified whether the object is a plant on the basis of the normalized difference vegetation index.
  • the normalized difference vegetation index (NDVI) is usually calculated on the basis of satellite data. This NDVI can now be determined based on the first sensor data and the second sensor data. In particular, the NDVI can be determined from reflection values in the near infrared range, which are included in the second sensor data, and reflection values in the red visible range, which are included in the first sensor data. Based on the NDVI, plants in the environmental region can be determined in a simple and reliable manner.
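The NDVI calculation referred to above is the standard per-pixel formula NDVI = (NIR - Red) / (NIR + Red); a minimal sketch, assuming both bands are given as normalized reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, in the range [-1, 1].
    `nir` comes from the second sensor data, `red` from the red channel
    of the first sensor data (both as reflectance values in [0, 1])."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    # Guard against division by zero where both bands are completely dark.
    denominator = np.where(nir + red == 0.0, 1.0, nir + red)
    return (nir - red) / denominator

# Healthy foliage reflects strongly in the near infrared and weakly in
# red, so its NDVI lies close to +1; most man-made surfaces stay near 0.
leaf_ndvi = ndvi([0.8], [0.1])
```

With per-pixel NDVI values available, regions classified as static or dynamic objects can be re-checked by thresholding their mean NDVI.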
  • an image is determined on the basis of the first sensor data and the second sensor data and it is verified whether the object is a plant on the basis of the image.
  • a composite image that contains information in the visible wavelength region and the infrared wavelength range can be provided based on the first sensor data and the second sensor data.
  • this image can describe the NDVI.
  • a so-called NRG image (NRG - Near-infrared/Red/Green) is provided, describing the environmental region in the near infrared wavelength range, in the red wavelength range and in the green wavelength range.
  • the proportion of plants or vegetation can be determined in the environmental region.
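Building such an NRG composite amounts to stacking the three bands into one false-colour image; a minimal sketch, assuming band values normalized to [0, 1]:

```python
import numpy as np

def nrg_composite(nir, red, green):
    """Stack the near-infrared, red and green bands into a false-colour
    NRG image: the NIR band is displayed in the red channel, so
    NIR-bright vegetation shows up as red in the composite."""
    return np.stack([nir, red, green], axis=-1)

# Toy 4x4 bands with a strong NIR response, as produced by vegetation.
nir = np.full((4, 4), 0.9)
red = np.full((4, 4), 0.1)
green = np.full((4, 4), 0.3)
nrg = nrg_composite(nir, red, green)
```

The fraction of vegetation pixels in the environmental region can then be estimated from such a composite, e.g. by counting pixels whose first (NIR) channel dominates.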
  • the object detection apparatus is adapted for performing a method according to the invention.
  • the computing device is in particular connected to the detecting device for data transmission and can receive the first sensor data and the second sensor data from the detection device.
  • the computing device may also identify and classify the object in the surrounding area based on the first sensor data and/or the second sensor data. Further, the computing device can verify based on the second sensor data, if the object that is classified as static or as dynamic is a plant.
  • the detection device comprises a camera for providing the first sensor data and a sensor for providing the second sensor data.
  • the camera may be one which is typically used for object detection in the automotive field. With the camera, for example, images of the environmental region can be provided as the first sensor data. The sensor then additionally provides the second sensor data in the infrared wavelength range, and in particular in the near infrared wavelength range.
  • the sensor may be, for example, a corresponding infrared camera.
  • the detection device can comprise a camera, which can provide both the first sensor data and the second sensor data.
  • a camera can be used which does not have an infrared filter.
  • this camera can provide, for example, images that provide in addition to the information in the visible wavelength range also information in the infrared wavelength range.
  • a driver assistance system comprises an object detection apparatus according to the invention.
  • objects and in particular pedestrians can be detected reliably.
  • static objects such as walls, curbs, guardrails or the like can be detected with the object detection apparatus based on the first sensor data and/or the second sensor data.
  • other motor vehicles, motorcycles or cyclists can be detected. This information can be used by the driver assistance system in order to assist the driver when driving the motor vehicle.
  • a motor vehicle according to the invention includes a driver assistance system according to the invention.
  • the motor vehicle is in particular formed as a passenger car.
  • Fig. 1 shows a motor vehicle according to an embodiment of the present invention, comprising a driver assistance system with an object detection apparatus;
  • Fig. 2 shows an image which is provided by the object detection apparatus.
  • Fig. 1 shows a motor vehicle 1 according to an embodiment of the present invention in a plan view.
  • the motor vehicle 1 is configured as a passenger car.
  • the motor vehicle 1 comprises a driver assistance system 2, which is used to assist a driver when driving the motor vehicle 1.
  • the driver assistance system 2 includes an object detection apparatus 3. With the aid of the object detection apparatus 3, objects 9 can be detected in an environmental region 8 of the motor vehicle 1. In the present case, as objects 9, a pedestrian 10 and a plant 11 or a tree are located in the environmental region 8 in front of the motor vehicle 1 in the direction of travel.
  • the driver assistance system 2 can intervene in a steering system, a drive motor and/or a braking system of the motor vehicle 1 in order to prevent a collision between the motor vehicle 1 and the object 9.
  • the object detection apparatus 3 includes a camera 5, by means of which the first sensor data can be provided.
  • the first sensor data describe the environmental region 8 in the visible wavelength range.
  • As the first sensor data, for example, image data or images can be provided by the camera 5.
  • the object detection apparatus 3 comprises a sensor 6 by means of which the second sensor data can be provided.
  • the second sensor data describe the environmental region 8 in the infrared wavelength range.
  • the sensor 6 is embodied as an infrared camera.
  • the object detection apparatus 3 includes a computing device 7.
  • the computing device 7 is connected with the camera 5 and the sensor 6 for data transmission. By means of the computing device 7, the first sensor data from the camera 5 and the second sensor data from the sensor 6 can be received.
  • By means of the computing device 7, the objects 9 can be detected in the environmental region 8.
  • the first sensor data and the second sensor data can be provided, for example, as an image. It can also be provided that a composite image 12 is determined by means of the computing device 7 based on the first sensor data and the second sensor data.
  • the objects 9 can be detected by means of a corresponding object detection algorithm. For example, a segmentation method can be used. By means of the computing device 7, the detected objects 9 are classified additionally. In particular, it is provided that the objects 9 are classified as static objects or as dynamic objects.
  • an image 12 showing the environmental region 8 and the objects 9 is provided by the computing device 7 based on the first sensor data and the second sensor data.
  • Such an image 12 is exemplarily shown in Fig. 2.
  • the image 12 shows the pedestrian 10 and a part of the plant 11 or the tree. It can be the case that both the pedestrian 10 and the plant 11 are classified as a static or moving object, and in particular as a pedestrian. This may be due to the plant 11, and in particular the leaves of the plant 11, moving as a result of environmental conditions such as wind.
  • If the object 9 is classified as static or as dynamic, it can be verified on the basis of the second sensor data whether the object 9 reflects or scatters radiation in the near infrared wavelength range, in particular in a wavelength range between 0.7 μm and 1.1 μm. This takes into account that plants 11 scatter radiation in this wavelength range in order to prevent overheating and thereby caused damage to the cells. Alternatively or additionally, it can be verified on the basis of the first sensor data whether the object scatters light in a wavelength range of between 0.4 μm and 0.7 μm. This takes into account that the chlorophyll in the leaves of plants strongly absorbs light in this wavelength range.
  • As the image 12, a composite image is provided that shows components both in the visible wavelength range and in the near infrared wavelength range.
  • the image 12 can show the normalized difference vegetation index (NDVI).
  • the pedestrian 10 and the plant 11 have different values of the NDVI.
  • the plant can have an NDVI of 1. This is schematically illustrated by the hatching 13.

EP17757697.2A 2016-10-14 2017-08-07 Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle Withdrawn EP3526725A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016119592.8A DE102016119592A1 (de) 2016-10-14 2016-10-14 Method for recognizing objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object recognition device, driver assistance system as well as motor vehicle
PCT/EP2017/069928 WO2018068919A1 (en) 2016-10-14 2017-08-07 Method for detecting objects in an environmental region of a motor vehicle considering sensor data in the infrared wavelength range, object detection apparatus, driver assistance system as well as motor vehicle

Publications (1)

Publication Number Publication Date
EP3526725A1 true EP3526725A1 (de) 2019-08-21

Family

ID=59702673

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17757697.2A Withdrawn EP3526725A1 (de) 2016-10-14 2017-08-07 Verfahren zur detektion von objekten in einem umgebungsbereich eines kraftfahrzeugs unter berücksichtigung von sensordaten im infrarotwellenlängenbereich, objektdetektionsvorrichtung, fahrerassistenzsystem sowie kraftfahrzeug

Country Status (3)

Country Link
EP (1) EP3526725A1 (de)
DE (1) DE102016119592A1 (de)
WO (1) WO2018068919A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020105821A1 (de) 2020-03-04 2021-09-09 Connaught Electronics Ltd. Verfahren und System zum Führen eines Fahrzeugs
DE102022127833A1 (de) 2022-10-21 2024-05-02 Bayerische Motoren Werke Aktiengesellschaft Fahrassistenzsystem und Fahrassistenzverfahren für ein Fahrzeug

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2454891C (en) * 2001-07-24 2009-07-21 The Board Of Regents For Oklahoma State University A process for in-season nutrient application based on predicted yield potential
DE102006059033A1 (de) * 2006-12-14 2008-06-19 Volkswagen Ag Verfahren und System zum Erkennen eines Verkehrsteilnehmers und zum Erzeugen einer Warnung
US8350724B2 (en) * 2009-04-02 2013-01-08 GM Global Technology Operations LLC Rear parking assist on full rear-window head-up display
AU2010259848A1 (en) * 2009-06-11 2012-02-02 Pa Llc Vegetation indices for measuring multilayer microcrop density and growth
BR112015030886B1 (pt) 2014-04-18 2022-09-27 Autonomous Solutions, Inc. Veículo, sistema de visão para uso por um veículo e método de direcionamento de um veículo com o uso de um sistema de visão
DE102014223741A1 (de) * 2014-11-20 2016-05-25 Conti Temic Microelectronic Gmbh Erfassen von Terahertz-Strahlung zum Unterstützen eines Fahrers eines Fahrzeugs
DE102014224857A1 (de) * 2014-12-04 2016-06-09 Conti Temic Microelectronic Gmbh Sensorsystem und Verfahren zur Klassifikation von Fahrbahnoberflächen

Also Published As

Publication number Publication date
DE102016119592A1 (de) 2018-05-03
WO2018068919A1 (en) 2018-04-19


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20190402

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20191202