WO2009070069A1 - A system for classifying objects in the vicinity of a vehicle - Google Patents

A system for classifying objects in the vicinity of a vehicle

Info

Publication number
WO2009070069A1
Authority
WO
WIPO (PCT)
Prior art keywords
classifier
data
camera
reflected radiation
vehicle
Prior art date
Application number
PCT/SE2007/050901
Other languages
English (en)
French (fr)
Inventor
Ognjan Hedberg
Jonas HAMMARSTRÖM
Original Assignee
Autoliv Development Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Autoliv Development Ab filed Critical Autoliv Development Ab
Priority to PCT/SE2007/050901 priority Critical patent/WO2009070069A1/en
Priority to EP07852173A priority patent/EP2212160A4/de
Publication of WO2009070069A1 publication Critical patent/WO2009070069A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/256 Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R2021/0002 Type of accident

Definitions

  • A system for classifying objects in the vicinity of a vehicle.
  • THIS INVENTION relates to a classification system, and in particular concerns a system for classifying objects in the vicinity of a vehicle.
  • Modern motor vehicles are typically equipped with several different safety systems, which are adapted to protect both occupants of the vehicle (in the case of internal air-bags or seat belt pretensioners) and pedestrians (for example, bonnet lifters and external air-bags).
  • one or more of these safety systems may be deployed, or a safety system may be deployed in one of a plurality of possible modes, depending in part upon the type of the other object that is involved. For instance, an impact with a pole or tree at a given speed may be more severe than an impact with another vehicle. If it appears that the vehicle is about to strike a pedestrian, then an external air-bag or bonnet lifter may be activated, but if the vehicle is about to strike an inanimate object such as a tree then there is no need for these protection systems to be deployed.
  • Existing object classifiers comprise computer programs which are operable to analyse data from a vehicle sensor, such as a camera or radar system.
  • the classifier is "trained" with exposure to many different types of object in different circumstances, so that the program is able to make an accurate determination as to the type of a new kind of object that is detected.
  • one aspect of the present invention provides a classification system for classifying objects in the vicinity of a vehicle, the system comprising: a video camera for gathering camera data; a reflected radiation system for gathering reflected radiation data; and a classifier, wherein raw data from the camera and the reflected radiation system are combined and analysed by the classifier, the classifier being configured to provide an output relating to the type of an object that appears in data gathered by both the camera and the reflected radiation system.
  • the determination comprises an indication that the object is a certain type of object.
  • the determination comprises a probability that the object is of a certain type.
  • the classifier comprises a neural network, a support vector machine or an adaptive boosting classifier.
  • the classifier is trained to discriminate between different types of object.
  • the classifier is trained to discriminate between pedestrians and other types of object.
  • the classifier is trained to discriminate between pedestrians, vehicles and pole-like objects.
  • the output of the classifier is used to control the deployment of a pedestrian protection device.
  • the output of the classifier is used to control the deployment of both a pedestrian protection device and an occupant protection device.
  • the data gathered by the camera and the reflected radiation system are input into a pre-selector, which is configured to analyse the data from the camera and the reflected radiation system and to identify regions of the data which contain an object of potential interest.
  • the data from the camera and/or the reflected radiation system is analysed by an impact evaluator, which evaluates the risk of an impact between the vehicle and the object of potential interest.
  • outputs from both the classifier and the impact evaluator are used to control the deployment of one or more protection devices.
  • the protection devices are pedestrian protection devices and/or occupant protection devices.
  • the reflected radiation system is a radar system.
  • Figures 1 and 4 show schematic views of the components of a classification system embodying the present invention.
  • Figures 2 and 3 show schematic views of data gathered by different on-board vehicle sensors.
  • In Figure 1, components of a system embodying the present invention are shown.
  • a vehicle camera 1 and a radar system 2 are both provided, being mounted on a vehicle so that the fields of view of the camera 1 and radar system 2 are overlapping.
  • the camera 1 may be a "mono" camera, may comprise a stereo camera system, may comprise an infrared (IR) camera or be any other suitable kind of camera that forms an image from received light, which need not fall within the visible portion of the spectrum.
  • the radar system 2 may be a conventional radar system, or may alternatively comprise a lidar system, an ultrasonic system or any other suitable system in which waves are emitted by an emitter and reflected waves which return to the vehicle are analysed as will be understood by those of skill in the art.
  • Data from both the camera 1 and the radar system 2 are output to a pre-selector 3, which identifies potential objects of interest in the data.
  • In Figure 2, simplified images 4,5 from the camera 1 and the radar system 2 are shown, in the case where a pedestrian 6 is positioned in front of the vehicle.
  • the image 4 from the camera 1 comprises an image of the pedestrian 6, whereas the image 5 from the radar system 2 indicates that, in the same region of space, there is an object which is closer to the vehicle than other detected "background" objects.
  • the pre-selector 3 is operable to determine that the data from the camera 1 and radar system 2 contain an object of potential interest, and corresponding regions 7,8 of the images 4,5 taken by the camera 1 and radar system 2 are identified by the pre-selector for further analysis. These regions 7,8 of the images 4,5 are the regions containing the data relating to the object of potential interest (a hypothetical region-selection sketch is given after this list).
  • data from the selected regions 7,8 of the images 4,5 is combined, and the combined camera and radar data is then passed to a trained classifier 9. As discussed above, an object classifier is trained through exposure to different types of objects in different circumstances, so that the classifier is able to provide a high degree of accuracy in classifying objects in new data that is presented. In this case, the classifier 9 is trained by repeated exposure to combined camera and radar data relating to different types of object.
  • the classifier 9 analyses the combined camera and radar data to provide a determination as to the type of object that appears in the combined data.
  • various types of classifiers might be used. Examples are neural network classifiers, support vector machine classifiers or Adaptive Boosting ("AdaBoost") classifiers.
  • Examples of features in the combined raw camera/radar data that could be analysed to classify objects in the data include grey scale values, gradients, patterns, amplitudes and (for the radar data only) phase information.
  • the classifier 9 may analyse the data to determine whether it is one of several potential types of object in a single step. Alternatively, as shown in figure 4, the classifier 9 may analyse the raw combined camera/radar data separately for each potential type of object, in parallel. Referring to figure 4, the raw data is input to a pedestrian classifier 10, a pole classifier 11 and a vehicle classifier 12, and each classifier analyses the data and provides a determination as to whether the data contains an object of that particular classification, or alternatively outputs a measure of probability that the data contains that type of object (a minimal sketch of this parallel arrangement is given after this list).
  • each classifier 10,11,12 is operable to output a "negative" signal if it is determined that the data does not contain an object of that particular type.
  • if all of the classifiers 10,11,12 output negative signals, this may be passed to an error generator 13, which outputs a signal indicating that the classifier 9 has not been able to identify the object.
  • the system may default to a "safest" mode in which both occupant and pedestrian safety systems are activated if activation would be triggered by the object being of the most hazardous type.
  • the combined camera/radar data is also input to an impact evaluator 14, which analyses the data in parallel with the classifier 9.
  • the impact evaluator 14 analyses the data to calculate the likelihood of the vehicle being involved in an impact with the object in question, and will also make a determination as to the likely time of the impact, and the relative speed and/or orientation of the vehicle and the object at the predicted impact (a simple time-to-collision sketch is given after this list).
  • the impact evaluator 14 may consider only the camera data, or only the radar data. Indeed, the impact evaluator 14 may take data from a further sensor 17, such as an accelerometer or impact sensor, alone or in combination with the camera and/or radar data.
  • only data from the camera 1 and radar system 2 in one or more regions identified by the pre-classifier 3 as containing an object of potential interest are analysed by the classifier 9.
  • the pre-classifier 3 may discard other data and only pass on data relating to these regions, or the data relating to these regions may be "marked" as being for analysis.
  • Outputs from the classifier 9 and the impact evaluator 14 are then passed to an algorithm 15 for pedestrian protection, and also to an algorithm 16 for occupant protection.
  • These algorithms 15,16 will coordinate the deployment, if appropriate, of one or more safety systems to protect pedestrians and/or vehicle occupants, in dependence upon the outputs from the classifier 9, which provides an indication of the type of object, and from the impact evaluator 14.
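
As an illustration of the region-selection step described above, here is a minimal sketch (in Python) of a hypothetical pre-selector: it flags the part of a radar range map where an object is significantly closer than the background, and the same region of the registered camera image would then be passed on for classification. The function name, the threshold values and the assumption that the radar map is registered to the camera image are all illustrative and are not taken from the patent.

```python
# Hypothetical pre-selector sketch: flag regions where the radar map reports
# a return significantly closer than the background; the corresponding region
# of the (registered) camera image is then used for further analysis.
import numpy as np

def preselect_regions(radar_range_map, closer_by_m=5.0, min_pixels=20):
    """Return a boolean mask of regions of potential interest.

    radar_range_map: 2D array of ranges (metres), registered to the camera image.
    """
    background = np.median(radar_range_map)          # crude background range estimate
    mask = radar_range_map < (background - closer_by_m)
    # Ignore isolated returns that are too small to be an object of interest.
    return mask if mask.sum() >= min_pixels else np.zeros_like(mask)
```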
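
The next sketch, again hypothetical, illustrates the combined-data classification arrangement: fused camera/radar features from the selected regions are fed to parallel per-type classifiers (pedestrian, pole, vehicle), and an "unknown" result stands in for the error generator 13 so that the system can fall back to its safest deployment mode. The class name, the feature choices, the 0.5 threshold and the use of scikit-learn's AdaBoost are assumptions for demonstration, not the patent's implementation; a neural network or support vector machine could equally be substituted.

```python
# Illustrative sketch only: names, feature choices and thresholds are assumptions.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

class CombinedClassifier:
    """Parallel per-type classifiers (cf. pedestrian/pole/vehicle classifiers
    10, 11, 12 of figure 4) operating on fused camera/radar features."""

    def __init__(self, object_types=("pedestrian", "pole", "vehicle")):
        # One binary classifier per object type.
        self.classifiers = {t: AdaBoostClassifier(n_estimators=50)
                            for t in object_types}

    @staticmethod
    def fuse_roi_features(camera_roi, radar_roi):
        """Combine raw data from the selected regions into one vector:
        grey-scale values and gradients from the camera, amplitude samples
        from the radar (regions assumed resampled to a fixed size)."""
        grads = np.gradient(camera_roi.astype(float))
        return np.concatenate([camera_roi.ravel()] +
                              [g.ravel() for g in grads] +
                              [radar_roi.ravel()])

    def fit(self, features, labels):
        # labels: array of strings such as "pedestrian", "pole", "vehicle", "other".
        for obj_type, clf in self.classifiers.items():
            clf.fit(features, labels == obj_type)  # one-vs-rest training

    def classify(self, feature_vector):
        """Run every per-type classifier on the same fused data; if all report
        'negative', return 'unknown' (the role of the error generator 13),
        prompting the safest deployment mode."""
        x = feature_vector.reshape(1, -1)
        probs = {t: clf.predict_proba(x)[0, 1]
                 for t, clf in self.classifiers.items()}
        if all(p < 0.5 for p in probs.values()):
            return "unknown", probs
        return max(probs, key=probs.get), probs

# Toy usage with random stand-in data (real training would use many labelled
# examples of combined camera/radar regions):
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))
y = rng.choice(np.array(["pedestrian", "pole", "vehicle", "other"]), size=300)
model = CombinedClassifier()
model.fit(X, y)
print(model.classify(X[0]))
```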
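
Finally, a similarly hedged sketch of the impact evaluator 14 and the downstream deployment logic (algorithms 15 and 16): a constant-relative-speed time-to-collision estimate is combined with the classifier output to decide which protection devices to arm. The `Track` fields, the threshold values and the deployment mapping are assumptions made for illustration only.

```python
# Hypothetical sketch of the impact evaluator (14) and deployment logic (15, 16).
from dataclasses import dataclass

@dataclass
class Track:
    range_m: float            # distance to the object, e.g. from the radar
    closing_speed_ms: float   # positive when the object and vehicle are closing
    lateral_offset_m: float   # lateral offset of the object from the vehicle path

def time_to_collision(track: Track) -> float:
    """Predicted time to impact, assuming constant relative speed."""
    if track.closing_speed_ms <= 0.0:
        return float("inf")   # not approaching, no predicted impact
    return track.range_m / track.closing_speed_ms

def deployment_decision(object_type: str, track: Track,
                        ttc_threshold_s: float = 0.5,
                        half_path_width_m: float = 1.0) -> dict:
    """Combine the classifier output with the impact evaluation:
    pedestrian protection (bonnet lifter, external air-bag) for pedestrians,
    occupant protection for vehicles and pole-like objects, and both in the
    'safest' default mode when the object could not be classified."""
    impact_likely = (time_to_collision(track) < ttc_threshold_s and
                     abs(track.lateral_offset_m) < half_path_width_m)
    if not impact_likely:
        return {"pedestrian_protection": False, "occupant_protection": False}
    if object_type == "pedestrian":
        return {"pedestrian_protection": True, "occupant_protection": False}
    if object_type in ("pole", "vehicle"):
        return {"pedestrian_protection": False, "occupant_protection": True}
    # "unknown": default to the safest mode and arm both systems.
    return {"pedestrian_protection": True, "occupant_protection": True}

# Example: an unclassified object 4 m ahead, closing at 10 m/s.
print(deployment_decision("unknown", Track(4.0, 10.0, 0.2)))
```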

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/SE2007/050901 WO2009070069A1 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle
EP07852173A EP2212160A4 (de) 2007-11-26 2007-11-26 System zum klassifizieren von objekten in der nähe eines fahrzeugs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SE2007/050901 WO2009070069A1 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle

Publications (1)

Publication Number Publication Date
WO2009070069A1 (en) 2009-06-04

Family

ID=40678803

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2007/050901 WO2009070069A1 (en) 2007-11-26 2007-11-26 A system for classifying objects in the vicinity of a vehicle

Country Status (2)

Country Link
EP (1) EP2212160A4 (de)
WO (1) WO2009070069A1 (de)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6269308B1 (en) * 1998-08-20 2001-07-31 Honda Giken Kogyo Kabushiki Kaisha Safety running system for vehicle
JP2007148835A (ja) * 2005-11-28 2007-06-14 Fujitsu Ten Ltd Object discrimination device, notification control device, object discrimination method and object discrimination program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010045981A1 (en) * 2000-05-24 2001-11-29 Joachim Gloger Camera-based precrash detection system
US20030060956A1 (en) * 2001-09-21 2003-03-27 Ford Motor Company Method for operating a pre-crash sensing system with object classifier in a vehicle having a countermeasure system
US20030114964A1 (en) * 2001-12-19 2003-06-19 Ford Global Technologies, Inc. Simple classification scheme for vehicle/pole/pedestrian detection
US20050131646A1 (en) * 2003-12-15 2005-06-16 Camus Theodore A. Method and apparatus for object tracking prior to imminent collision detection
EP1760632A2 (de) * 2005-08-30 2007-03-07 Fuji Jukogyo Kabushiki Kaisha Image processing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2212160A4 *

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9731417B2 (en) 2011-08-30 2017-08-15 5D Robotics, Inc. Vehicle management system
US9195911B2 (en) 2011-08-30 2015-11-24 5D Robotics, Inc. Modular robotic manipulation
WO2013033351A3 (en) * 2011-08-30 2013-06-27 5D Robotics, Inc. Graphical rendition of multi-modal data
US10948924B2 (en) 2015-02-06 2021-03-16 Aptiv Technologies Limited Method and apparatus for controlling an autonomous vehicle
WO2016126315A1 (en) * 2015-02-06 2016-08-11 Delphi Technologies, Inc. Autonomous guidance system
US10209717B2 (en) 2015-02-06 2019-02-19 Aptiv Technologies Limited Autonomous guidance system
US11763670B2 (en) 2015-02-06 2023-09-19 Aptiv Technologies Limited Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles
US11543832B2 (en) 2015-02-06 2023-01-03 Aptiv Technologies Limited Method and apparatus for controlling an autonomous vehicle
US10991247B2 (en) 2015-02-06 2021-04-27 Aptiv Technologies Limited Method of automatically controlling an autonomous vehicle based on electronic messages from roadside infrastructure or other vehicles
US10150414B2 (en) 2016-07-08 2018-12-11 Ford Global Technologies, Llc Pedestrian detection when a vehicle is reversing
RU2708469C2 (ru) * 2016-07-08 2019-12-09 ФОРД ГЛОУБАЛ ТЕКНОЛОДЖИЗ, ЭлЭлСи Обнаружение пешеходов, когда транспортное средство движется задним ходом
US10520904B2 (en) 2016-09-08 2019-12-31 Mentor Graphics Corporation Event classification and object tracking
US10558185B2 (en) 2016-09-08 2020-02-11 Mentor Graphics Corporation Map building with sensor measurements
US10678240B2 (en) 2016-09-08 2020-06-09 Mentor Graphics Corporation Sensor modification based on an annotated environmental model
US10317901B2 (en) 2016-09-08 2019-06-11 Mentor Graphics Development (Deutschland) Gmbh Low-level sensor fusion
US10740658B2 (en) 2016-09-08 2020-08-11 Mentor Graphics Corporation Object recognition and classification using multiple sensor modalities
US10802450B2 (en) 2016-09-08 2020-10-13 Mentor Graphics Corporation Sensor event detection and fusion
WO2018047115A1 (en) * 2016-09-08 2018-03-15 Mentor Graphics Development (Deutschland) Gmbh Object recognition and classification using multiple sensor modalities
US10585409B2 (en) 2016-09-08 2020-03-10 Mentor Graphics Corporation Vehicle localization with map-matched sensor measurements
US11067996B2 (en) 2016-09-08 2021-07-20 Siemens Industry Software Inc. Event-driven region of interest management
US10884409B2 (en) 2017-05-01 2021-01-05 Mentor Graphics (Deutschland) Gmbh Training of machine learning sensor data classification system
US11170201B2 (en) 2017-09-22 2021-11-09 Samsung Electronics Co., Ltd. Method and apparatus for recognizing object
EP3648006A4 (de) * 2017-09-22 2020-07-29 Samsung Electronics Co., Ltd. Verfahren und vorrichtung zur erkennung eines objekts
US10553044B2 (en) 2018-01-31 2020-02-04 Mentor Graphics Development (Deutschland) Gmbh Self-diagnosis of faults with a secondary system in an autonomous driving system
US11145146B2 (en) 2018-01-31 2021-10-12 Mentor Graphics (Deutschland) Gmbh Self-diagnosis of faults in an autonomous driving system

Also Published As

Publication number Publication date
EP2212160A1 (de) 2010-08-04
EP2212160A4 (de) 2012-07-04

Similar Documents

Publication Publication Date Title
WO2009070069A1 (en) A system for classifying objects in the vicinity of a vehicle
US8876157B2 (en) System for protection of a vulnerable road user and method for operating the system
US10007854B2 (en) Computer vision based driver assistance devices, systems, methods and associated computer executable code
US8379924B2 (en) Real time environment model generation system
JP4598653B2 (ja) Collision prediction device
US9158978B2 (en) Vehicle environment classifying safety system for a motor vehicle
US7486802B2 (en) Adaptive template object classification system with a template generator
US7480570B2 (en) Feature target selection for countermeasure performance within a vehicle
US7616101B2 (en) Device for monitoring the surroundings of a vehicle
EP2484567B1 (de) On-board detection system
WO2016185653A1 (ja) Protection control device
JP5178276B2 (ja) Image recognition device
CN104709215B (zh) Safety system and method for operating a safety system of a vehicle
EP2562053B1 (de) Method, computer program product and system for determining whether it is necessary to use vehicle safety equipment, and vehicle comprising the same
US7636625B2 (en) Device for classifying at least one object with the aid of an environmental sensor system
KR20120117753A (ko) Method for recognizing the width of the collision area of an object in the front region of a vehicle, and controller therefor
WO2019088028A1 (ja) Protection control device and control method for a protection control device
US20050125126A1 (en) Pre-crash sensing system and method for detecting and classifying objects
WO2014171863A1 (en) System for controlling the deployment of an external safety device
EP1274608A1 (de) Method for controlling a restraint system
US20050004719A1 (en) Device and method for determining the position of objects in the surroundings of a motor vehicle
US10908259B2 (en) Method for detecting a screening of a sensor device of a motor vehicle by an object, computing device, driver-assistance system and motor vehicle
EP2851840B1 (de) Vision system and method for a motor vehicle
US20220009439A1 (en) Enhanced occupant collision safety system
KR20220152590A (ko) Apparatus and method for preventing entrapment by a vehicle door

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07852173

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2007852173

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE