JP4377439B1 - Vehicle periphery monitoring device - Google Patents

Vehicle periphery monitoring device

Info

Publication number
JP4377439B1
JP4377439B1 (application JP2008154408A)
Authority
JP
Japan
Prior art keywords
contact determination
vehicle
determination area
area
contact
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2008154408A
Other languages
Japanese (ja)
Other versions
JP2009301283A (en)
Inventor
誠 相村 (Makoto Aimura)
伸治 長岡 (Nobuharu Nagaoka)
英樹 橋本 (Hideki Hashimoto)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Priority to JP2008154408A priority Critical patent/JP4377439B1/en
Priority to EP09762203A priority patent/EP2270765B1/en
Priority to US12/994,922 priority patent/US8189868B2/en
Priority to AT09762203T priority patent/ATE556910T1/en
Priority to CN2009801218769A priority patent/CN102057414B/en
Priority to PCT/JP2009/000977 priority patent/WO2009150771A1/en
Application granted granted Critical
Publication of JP4377439B1 publication Critical patent/JP4377439B1/en
Publication of JP2009301283A publication Critical patent/JP2009301283A/en

Classifications

    • B60R 1/24 — Real-time viewing arrangements for drivers or passengers using optical image capturing systems (e.g. cameras or video systems), with a predetermined field of view in front of the vehicle
    • B60R 1/30 — Real-time viewing arrangements providing vision in the non-visible spectrum, e.g. night or infrared vision
    • B60W 30/095 — Active safety systems: predicting travel path or likelihood of collision
    • G06T 7/248 — Analysis of motion using feature-based methods involving reference images or patches
    • G06T 7/593 — Depth or shape recovery from multiple images: from stereo images
    • G08G 1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60R 2300/105 — Vehicle camera/display arrangements using multiple cameras
    • B60R 2300/205 — Vehicle camera/display arrangements using a head-up display
    • B60R 2300/302 — Image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • B60R 2300/307 — Image processing virtually distinguishing relevant parts of a scene from the background of the scene
    • B60R 2300/8093 — Viewing arrangements intended for obstacle warning
    • G06T 2207/10021 — Stereoscopic video; stereoscopic image sequence
    • G06T 2207/10048 — Infrared image
    • G06T 2207/30261 — Obstacle (vehicle exterior; vicinity of vehicle)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Emergency Alarm Devices (AREA)
  • Steering-Linkage Mechanisms And Four-Wheel Steering (AREA)
  • Alarm Systems (AREA)

Abstract

[Problem] To provide a vehicle periphery monitoring device that can report a high possibility of contact between the vehicle and an object at a timing or frequency appropriate to the type of the object.
[Solution] According to the vehicle periphery monitoring device of the present invention, when an object is determined to be a human and its real-space position lies within a first contact determination area A1, the driver is notified that the possibility of contact between the host vehicle 10 and the object is high. When an object is determined to be a quadruped and its real-space position lies within a second contact determination area A2, the same notification is issued. The second contact determination area A2 consists of a region overlapping the first contact determination area A1 and a region at least partly extending beyond it.
[Selected figure] Figure 4

Description

The present invention relates to a vehicle periphery monitoring device that monitors the periphery of a vehicle using captured images obtained by an imaging device mounted on the vehicle.

A vehicle periphery monitoring device has been proposed that calculates the real-space movement vector of an object, such as an animal present around the vehicle, from time-series position data of the object, judges from this movement vector whether the possibility of contact between the vehicle and the object is high, and, if so, notifies the driver to that effect (see Patent Document 1).
Patent Document 1: JP 2001-006096 A

However, when the object is a quadruped such as a deer, its behavior is generally difficult to predict, so judging the possibility of contact from the movement vector alone can delay the notification to the driver. Conversely, for an object such as a human whose behavior is comparatively easy to predict, simply advancing the notification timing can make notifications to the driver needlessly frequent.

Accordingly, an object of the present invention is to provide a vehicle periphery monitoring device that can report a high possibility of contact between the vehicle and an object at a timing or frequency appropriate to the type of the object.

A vehicle periphery monitoring device according to a first aspect of the invention monitors the periphery of a vehicle using captured images obtained by an imaging device mounted on the vehicle, and comprises: object extraction means for extracting an object from the captured image; position measurement means for measuring the real-space position of the object extracted by the object extraction means; object determination means for determining whether the extracted object is a human or a quadruped; contact determination area setting means for setting, when the object determination means determines that the object is a human, a first contact determination area for judging whether the possibility of contact between the vehicle and the human is high, and for setting, when the object determination means determines that the object is a quadruped, a second contact determination area for judging whether the possibility of contact between the vehicle and the quadruped is high, the second contact determination area having a region overlapping the first contact determination area and a region at least partly extending beyond it; and object notification means for notifying the driver of the presence of the object when the real-space position of the human measured by the position measurement means is included in the first contact determination area, or when the real-space position of the quadruped measured by the position measurement means is included in the second contact determination area.

According to the vehicle periphery monitoring device of the first aspect, when the object is determined to be a human and its real-space position lies within the first contact determination area, the driver is notified that the possibility of contact between the vehicle and the object is high; when the object is determined to be a quadruped and its real-space position lies within the second contact determination area, the same notification is issued. The second contact determination area has a region overlapping the first contact determination area and a region at least partly extending beyond it. Consequently, when the object is a quadruped, the notification is issued even if its real-space position is outside the first contact determination area, as long as it lies in the part of the second contact determination area extending beyond the first; the notification timing is thus earlier than for a human. When the object is a human, on the other hand, no notification is issued unconditionally merely because its position lies in the part of the second contact determination area outside the first, which prevents needlessly frequent warnings that contact with a human is likely. The device can therefore report a high possibility of contact between the vehicle and the object at a timing or frequency appropriate to the type of the object.
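The notification rule of the first aspect can be sketched as follows. This is a minimal illustration, not Honda's implementation: the rectangular shape of the areas, the Python names, and all dimensions are assumptions; the patent only requires that A2 overlap A1 and extend at least partly beyond it.

```python
from dataclasses import dataclass
from enum import Enum

class ObjectType(Enum):
    HUMAN = 1
    QUADRUPED = 2

@dataclass
class Rect:
    """Axis-aligned region on the road plane: X is lateral, Z is ahead of the vehicle."""
    x_min: float
    x_max: float
    z_min: float
    z_max: float

    def contains(self, x: float, z: float) -> bool:
        return self.x_min <= x <= self.x_max and self.z_min <= z <= self.z_max

def should_notify(obj_type: ObjectType, x: float, z: float,
                  area1: Rect, area2: Rect) -> bool:
    """First aspect: notify for a human inside A1, or a quadruped inside A2.

    A2 overlaps A1 and also extends beyond it, so a quadruped triggers the
    warning over a wider region (i.e. earlier) than a human would.
    """
    if obj_type is ObjectType.HUMAN:
        return area1.contains(x, z)
    return area2.contains(x, z)
```

For example, an animal 3 m to the right of the vehicle's path would trigger the warning while a pedestrian at the same spot would not, which is exactly the asymmetry the paragraph above describes.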

A vehicle periphery monitoring device according to a second aspect is the device of the first aspect, further comprising movement vector calculation means for calculating the movement vector of the object from a plurality of real-space positions of the object measured at different times by the position measurement means. The contact determination area setting means sets a third contact determination area outside the first contact determination area when the object determination means determines that the object is a human, and the object notification means notifies the driver of the presence of the human when the human's real-space position measured by the position measurement means is included in the third contact determination area and the human's movement vector calculated by the movement vector calculation means is directed toward the first contact determination area.

According to the vehicle periphery monitoring device of the second aspect, when the object is determined to be a human, its real-space position lies within the third contact determination area, and its movement vector is directed toward the first contact determination area, the driver is notified that the possibility of contact between the vehicle and the object is high. No notification is issued for a human in the third contact determination area whose movement vector is not directed toward the first, which prevents needlessly frequent warnings that contact with a human is likely. For a quadruped, in contrast, the notification is issued whenever its real-space position lies in the third contact determination area, as part of the second contact determination area, regardless of the direction of its movement vector, so the notification timing is earlier than for a human. The device can therefore report a high possibility of contact at a timing or frequency appropriate to the type of the object.
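The human branch of the second aspect can be sketched as follows: a person in the third area triggers the warning only when the movement vector points at the first area. The tuple-based region encoding and the simple lateral direction test are illustrative assumptions; the patent does not specify how "directed toward" is computed.

```python
def in_rect(pos, rect):
    """pos = (x, z); rect = (x_min, x_max, z_min, z_max) on the road plane."""
    x, z = pos
    x_min, x_max, z_min, z_max = rect
    return x_min <= x <= x_max and z_min <= z <= z_max

def heading_toward_band(pos, vel, x_min, x_max):
    """Crude test of whether motion vector vel = (vx, vz) from pos points at
    the lateral band x_min..x_max that area A1 occupies ahead of the vehicle."""
    x, _ = pos
    vx, _ = vel
    if x < x_min:
        return vx > 0.0   # left of A1, moving right
    if x > x_max:
        return vx < 0.0   # right of A1, moving left
    return True           # already laterally inside the band

def notify_human(pos, vel, a1, a3):
    """Second aspect, human branch: notify when the person is inside A1, or is
    inside A3 with a movement vector directed at A1. For simplicity a3 here is
    a single wide band; the A1 test runs first, so only the part of a3 outside
    a1 is subject to the direction condition."""
    if in_rect(pos, a1):
        return True
    return in_rect(pos, a3) and heading_toward_band(pos, vel, a1[0], a1[1])
```

A quadruped branch would skip the direction test entirely, as in the first-aspect sketch.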

A vehicle periphery monitoring device according to a third aspect is the device of the first or second aspect, wherein the contact determination area setting means sets, as the first contact determination area, a region in front of the vehicle that extends parallel to the vehicle's direction of travel and has a width equal to the vehicle's width plus a predetermined margin on each side.

According to the vehicle periphery monitoring device of the third aspect, when the object is determined to be a human, the first contact determination area is set as a region in front of the vehicle extending parallel to its direction of travel, with the vehicle's width plus a predetermined margin on each side. Contact with the object is likely inside this region but not necessarily likely outside it; accordingly, notifying the driver when the human's real-space position lies within the first contact determination area yields an appropriate notification frequency, while issuing no such notification outside it prevents needlessly frequent warnings. Because the second contact determination area extends outside the first, the notification timing for a quadruped is nevertheless earlier than for a human. The device can therefore report a high possibility of contact at a timing or frequency appropriate to the type of the object.
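The geometry of the first contact determination area in the third aspect can be illustrated as follows. The function name and the particular width, margin, and range values are assumptions; the patent gives only the construction rule (vehicle width plus a margin on each side, extending forward).

```python
def first_contact_area(vehicle_width_m: float, margin_m: float,
                       max_range_m: float):
    """Third aspect: A1 runs ahead of the vehicle, parallel to its direction
    of travel, with the vehicle's width plus a margin on each side.

    Returns (x_min, x_max, z_min, z_max) in the real-space frame of Fig. 2
    (origin at the front-centre of the vehicle, +X to the right, +Z forward).
    The forward extent a real system would use is not given in this passage;
    callers supply their own.
    """
    half = vehicle_width_m / 2.0 + margin_m
    return (-half, half, 0.0, max_range_m)
```

For a 2.0 m wide vehicle with a 0.5 m margin, A1 is the band from 1.5 m left of centre to 1.5 m right of centre.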

A vehicle periphery monitoring device according to a fourth aspect is the device of any of the first to third aspects, wherein the contact determination area setting means sets the imaging area captured by the imaging device as the second contact determination area.

According to the vehicle periphery monitoring device of the fourth aspect, when the object is determined to be a quadruped, the second contact determination area is set to the imaging area of the imaging device, the maximum range the vehicle can recognize, so the notification that contact with the object is likely can be issued early. When the object is determined to be a human, the first contact determination area is a region limited relative to that imaging area, and no notification is issued unconditionally merely because the human's position lies in the part of the imaging area outside it, which prevents needlessly frequent warnings. The device can therefore report a high possibility of contact at a timing or frequency appropriate to the type of the object.

An embodiment of the present invention is described below with reference to Figs. 1 to 6, beginning with the configuration of the vehicle periphery monitoring device. Referring to Figs. 1 and 2, the device comprises an image processing unit 1. Connected to the image processing unit 1 are two infrared cameras 2R and 2L, imaging devices that capture images ahead of the host vehicle 10, and, as sensors that detect the running state of the host vehicle 10, a yaw rate sensor 3 that detects the yaw rate of the host vehicle 10, a vehicle speed sensor 4 that detects its running speed (vehicle speed), and a brake sensor 5 that detects whether the brake is being operated. Also connected to the image processing unit 1 are a speaker 6 for outputting audible notification information such as voice, and a display device 7 for displaying the images captured by the infrared cameras 2R and 2L and visual notification information. The infrared cameras 2R and 2L correspond to the imaging device of the present invention.

Although not shown in detail, the image processing unit 1 is an electronic circuit comprising an A/D conversion circuit, a microcomputer (with a CPU, RAM, ROM, and the like), an image memory, and so on. The analog signals output by the infrared cameras 2R and 2L, the yaw rate sensor 3, the vehicle speed sensor 4, and the brake sensor 5 are digitized by the A/D conversion circuit and input to the microcomputer. Based on the input data, the microcomputer detects objects such as people (pedestrians or cyclists) and, when a detected object satisfies predetermined notification requirements, issues a notification to the driver through the speaker 6 and the display device 7. The image processing unit 1 provides the functions of the object extraction means, the object determination means, and the contact determination area setting means of the present invention.

As shown in Fig. 2, the infrared cameras 2R and 2L are mounted on the front part of the host vehicle 10 (the front grille in the figure) so as to image the area ahead of the vehicle. The cameras 2R and 2L are placed to the right and left, respectively, of the center of the vehicle in the vehicle width direction, at positions symmetric about that center. They are fixed so that their optical axes extend parallel to each other in the longitudinal direction of the vehicle 10 and are at equal heights above the road surface. The infrared cameras 2R and 2L are sensitive in the far-infrared region: the higher the temperature of the imaged object, the higher the level (luminance) of the output video signal. As shown in Fig. 2, the real-space coordinate system is defined with its origin O at the center of the front end of the vehicle 10, the +X direction to the right of the vehicle, the +Y direction vertically downward, and the +Z direction ahead of the vehicle.

In this embodiment, the display device 7 includes a head-up display 7a (hereinafter, HUD 7a) that presents image information on the front window of the host vehicle 10. Instead of, or in addition to, the HUD 7a, the display device 7 may be a display integrated into the meter that shows running states such as the vehicle speed of the host vehicle 10, or a display provided in an on-board navigation device.

Next, the basic functions of the vehicle periphery monitoring device configured as above are described with reference to the flowchart of Fig. 3. The basic processing of this flowchart is the same as that described, for example, in Fig. 3 of the applicant's JP 2001-6096 A and Fig. 3 of JP 2007-310705 A.

Specifically, the image processing unit 1 first receives infrared images from the output signals of the infrared cameras 2R and 2L (FIG. 3 / STEP 11). The unit then A/D-converts the camera output signals (FIG. 3 / STEP 12) and acquires grayscale images from the converted signals (FIG. 3 / STEP 13). It then binarizes the reference image (the right image) (FIG. 3 / STEP 14). Next, the unit performs STEPs 15-17 on the binarized image to extract objects (more precisely, image regions corresponding to objects). Specifically, the pixel groups constituting the high-luminance regions of the binarized image are converted into run-length data (FIG. 3 / STEP 15); a label (identifier) is attached to each group of lines that overlap in the vertical direction of the reference image (FIG. 3 / STEP 16); and each such line group is extracted as an object (FIG. 3 / STEP 17). The unit then calculates, for each extracted object, the position of its centroid on the reference image, its area, and the aspect ratio of its circumscribed rectangle (FIG. 3 / STEP 18). Next, the unit tracks the objects extracted in STEP 18 over time, that is, it recognizes the same object in each successive computation cycle of the image processing unit 1 (FIG. 3 / STEP 19). The unit then reads the outputs of the vehicle speed sensor 4 and the yaw rate sensor 5 (the detected vehicle speed and yaw rate) (FIG. 3 / STEP 20).
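The object extraction of STEPs 15-17 can be sketched as follows. This is a minimal illustration of the run-length and labeling scheme described above, not the actual implementation of the image processing unit 1; the function name and the union-find grouping are assumptions made for the sketch.

```python
def extract_objects(binary):
    """Group the high-luminance pixels of a binary image into objects.

    binary: list of rows of 0/1 values. Returns a list of objects,
    each a list of (row, col_start, col_end) runs (STEPs 15-17).
    """
    # STEP 15: convert each row's 1-pixels into run-length data
    runs = []  # (row, col_start, col_end_inclusive)
    for y, row in enumerate(binary):
        x = 0
        while x < len(row):
            if row[x]:
                start = x
                while x < len(row) and row[x]:
                    x += 1
                runs.append((y, start, x - 1))
            else:
                x += 1

    # STEP 16: label runs that overlap in the vertical direction
    labels = list(range(len(runs)))  # union-find parent array

    def find(i):
        while labels[i] != i:
            labels[i] = labels[labels[i]]  # path halving
            i = labels[i]
        return i

    for i, (y1, s1, e1) in enumerate(runs):
        for j, (y2, s2, e2) in enumerate(runs):
            # adjacent rows whose column ranges overlap belong together
            if y2 == y1 + 1 and s2 <= e1 and s1 <= e2:
                labels[find(i)] = find(j)

    # STEP 17: collect each label group as one object
    objects = {}
    for i, run in enumerate(runs):
        objects.setdefault(find(i), []).append(run)
    return list(objects.values())
```

From each returned group, the centroid, area, and circumscribed rectangle of STEP 18 follow directly from the run coordinates.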

In parallel with the processing of STEPs 19 and 20, the image processing unit 1 executes the processing of STEPs 31-33. First, a region of the reference image corresponding to each object (for example, the object's circumscribed-rectangle region) is extracted as a search image (FIG. 3 / STEP 31). Next, a search region is set in the left image, and a correlation operation is performed to extract the image corresponding to the search image (the corresponding image) (FIG. 3 / STEP 32). The distance from the host vehicle 10 to the object (the distance in the longitudinal direction of the vehicle) is then calculated (FIG. 3 / STEP 33).
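STEP 33 is a standard stereo-parallax distance calculation: the horizontal offset (parallax) between the search image in the right image and the corresponding image found in the left image is inversely proportional to the distance. The sketch below uses hypothetical camera parameters (base length, focal length, pixel pitch); the patent does not state concrete values.

```python
def stereo_distance(dx_pixels, base_m=0.4, focal_m=0.008, pitch_m=1.2e-5):
    """Distance Z (in meters) from the parallax dx_pixels between the
    right and left images: Z = B * f / (dx * p).

    base_m, focal_m, pitch_m are illustrative values, not those of the
    actual cameras 2R and 2L.
    """
    if dx_pixels <= 0:
        raise ValueError("parallax must be positive")
    return base_m * focal_m / (dx_pixels * pitch_m)
```

For example, with these assumed parameters a parallax of 10 pixels corresponds to an object roughly 27 m ahead; halving the parallax doubles the distance.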

Next, the image processing unit 1 calculates the real-space position of each object, that is, its position relative to the host vehicle 10 (FIG. 3 / STEP 21). The unit then corrects the X-direction component of the object's real-space position (X, Y, Z) according to the time-series turning-angle data obtained from the readings of STEP 20 (FIG. 3 / STEP 22). After that, the unit estimates the movement vector of the object relative to the host vehicle 10 (FIG. 3 / STEP 23). Once the relative movement vector has been obtained in STEP 23, the unit determines the possibility of contact with each detected object and executes a notification determination process that issues a warning when that possibility is high (FIG. 3 / STEP 24). The above is the overall operation of the periphery monitoring device of this embodiment. The configuration in which the image processing unit 1 executes the processing of STEPs 11-18 corresponds to the object extraction means of the present invention.
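STEPs 21 and 23 can be sketched as follows: an image point with a known stereo distance is projected into the real-space frame of FIG. 2, and a relative movement vector is fitted to the tracked time series of positions. The camera intrinsics below are hypothetical, and the least-squares line fit stands in for whatever vector estimation the unit actually uses.

```python
def image_to_real(u, v, Z, cx=160.0, cy=120.0, f_pixels=660.0):
    """Project image point (u, v) with known distance Z into the
    real-space frame of FIG. 2 (+X right, +Y down, +Z forward).
    cx, cy, f_pixels are illustrative camera parameters."""
    X = (u - cx) * Z / f_pixels
    Y = (v - cy) * Z / f_pixels
    return (X, Y, Z)

def movement_vector(positions, dt):
    """Estimate the relative movement vector (per-axis speed) by a
    least-squares line through equally spaced real-space positions."""
    n = len(positions)
    ts = [i * dt for i in range(n)]
    t_mean = sum(ts) / n
    denom = sum((t - t_mean) ** 2 for t in ts)
    vec = []
    for axis in range(3):
        xs = [p[axis] for p in positions]
        x_mean = sum(xs) / n
        slope = sum((t - t_mean) * (x - x_mean)
                    for t, x in zip(ts, xs)) / denom
        vec.append(slope)
    return tuple(vec)
```

An object drifting rightward while closing on the vehicle yields a vector with positive X and negative Z components, which is what the contact judgment of STEP 24 examines.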

Next, the notification process, a main function of the vehicle periphery monitoring device of the present invention, is described with reference to the flowchart of FIG. 4.

First, an object determination process is executed to determine whether an object extracted by the object extraction means is a human or a quadruped animal (FIG. 4 / STEP 100).

Specifically, the type of the object is determined from features such as the shape, size, and luminance variance of the object's image region in the grayscale image. For example, an object is determined to be a human if it has a head (corresponding to a first high-luminance image region) and shoulders, a torso, arms, or legs (a second high-luminance image region located below the first and having the same arrangement relative to it as the standard arrangement of, for example, shoulders relative to a head). An object is determined to be a quadruped animal such as a deer, sheep, dog, or horse if it has a torso (a first high-luminance image region) and a head or legs (a second high-luminance image region that is located beside or below the first region and is smaller than it). The type of the object may also be determined by pattern matching between the object's outline and outlines stored in advance in memory.
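A schematic of the STEP 100 decision, using only the geometric cues named above. The thresholds, ratios, and region representation are illustrative assumptions for the sketch, not values from the patent.

```python
def classify_object(regions):
    """Classify an object from its high-luminance regions.

    regions: list of dicts with 'cx', 'cy' (centroid in image
    coordinates, y grows downward) and 'area'; the first entry is
    the primary region. Returns 'human', 'quadruped', or 'unknown'.
    All thresholds are illustrative.
    """
    if len(regions) < 2:
        return "unknown"
    first, second = regions[0], regions[1]
    dx = abs(second["cx"] - first["cx"])
    dy = second["cy"] - first["cy"]  # positive: second region is lower

    # Human: head with shoulders/torso below it, roughly on the
    # same vertical line (standard head-over-shoulders arrangement).
    if dy > 0 and dx < 0.5 * dy:
        return "human"
    # Quadruped: torso with a smaller head or leg region beside
    # or below it.
    if second["area"] < first["area"] and (dy >= 0 or dx > 0):
        return "quadruped"
    return "unknown"
```

A template-matching stage against stored outlines, as the text also allows, could be layered on top of this coarse geometric test.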

If the object is determined to be a human (FIG. 4 / STEP 100: A), a first contact determination area A1 and third contact determination areas A3 are set (FIG. 4 / STEP 102). Specifically, as shown in FIG. 5(a), inside the triangular area A0 that can be monitored by the infrared cameras 2R and 2L, a region that extends parallel to the Z direction and whose width in the X direction is the vehicle width α of the host vehicle 10 plus a first margin β (for example, 50-100 cm) on each of the left and right sides is set as the first contact determination area A1 (see the hatched portion). The depth Z1 of the first contact determination area A1 in the Z direction, as seen from the host vehicle 10, is set to the distance obtained by multiplying the relative speed Vs between the host vehicle 10 and the object by a margin time T. Also, as shown in FIG. 5(a), the regions of the triangular area A0 adjacent to the left and right of the first contact determination area are set as the third contact determination areas A3L and A3R.
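The geometry of STEP 102 can be written down directly: A1 spans the width α + 2β ahead of the vehicle out to depth Z1 = Vs · T, and A3L/A3R fill the remainder of the cameras' triangular field A0. The sketch below uses an assumed half field-of-view angle for A0; the other symbols follow the text.

```python
import math

def make_area1(alpha, beta, vs, t_margin):
    """First contact determination area A1: |X| <= alpha/2 + beta,
    0 <= Z <= Z1 with Z1 = Vs * T (FIG. 5(a))."""
    return {"half_width": alpha / 2.0 + beta, "depth": vs * t_margin}

def in_area1(x, z, area1):
    """Is the real-space point (X, Z) inside A1?"""
    return 0.0 <= z <= area1["depth"] and abs(x) <= area1["half_width"]

def in_area3(x, z, area1, half_fov_rad=math.radians(28)):
    """Third areas A3L/A3R: inside the triangular field A0 but
    outside A1. The half field-of-view angle is an assumed value."""
    in_a0 = 0.0 <= z <= area1["depth"] and abs(x) <= z * math.tan(half_fov_rad)
    return in_a0 and not in_area1(x, z, area1)
```

With α = 1.7 m and β = 0.8 m, a pedestrian 2 m to the side at 20 m range falls in A3, not A1, so notification further depends on the movement-vector test of STEP 105.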

It is then determined whether the real-space position of the object is within the first contact determination area A1 (FIG. 4 / STEP 104). The real-space position of the object is measured in STEP 21 of FIG. 3; the configuration in which the image processing unit 1 executes the processing of STEP 21 corresponds to the position measuring means of the present invention. If the object's real-space position is determined to lie within the first contact determination area A1 (FIG. 4 / STEP 104: YES), a notification output determination process is executed (FIG. 4 / STEP 106). If, on the other hand, the object is determined not to be within the first contact determination area A1 (FIG. 4 / STEP 104: NO), it is determined whether the object is within the third contact determination area A3R or A3L and whether the movement vector of the object obtained in STEP 23 is directed toward the first contact determination area A1 (FIG. 4 / STEP 105).

If that determination is affirmative (FIG. 4 / STEP 105: YES), it is determined from the output of the brake sensor 5 whether the driver of the host vehicle 10 is performing an appropriate braking operation (FIG. 4 / STEP 106). If the driver is determined not to be performing an appropriate braking operation (FIG. 4 / STEP 106: NO), the driver is notified that there is a high possibility of contact between the vehicle and the object (FIG. 4 / STEP 108). Specifically, the situation is reported by outputting a sound through the speaker 6, and also by highlighting the object on the HUD 7a. The determination process of STEP 106 may be omitted.

If, on the other hand, the object is determined to be a quadruped animal (FIG. 4 / STEP 100: B), a second contact determination area A2 is set (FIG. 4 / STEP 202). Specifically, as shown in FIG. 5(b), inside the triangular area A0 that can be monitored by the infrared cameras 2R and 2L, a region that extends parallel to the Z direction and whose width in the X direction is the vehicle width α plus, on each of the left and right sides, a second margin γ larger than the first margin β is set as the second contact determination area A2 (see the hatched portion). As with the first contact determination area A1, the depth Z1 of the second contact determination area A2 in the Z direction, as seen from the host vehicle 10, is set to the distance obtained by multiplying the relative speed Vs between the host vehicle 10 and the object by the margin time T. The second contact determination area A2 thus consists of a region overlapping the whole of the first contact determination area A1 and regions protruding beyond it on the left and right by the difference (γ − β) between the second and first margins.
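The second contact determination area A2 simply widens A1 by replacing β with γ > β, leaving protruding strips of width γ − β on each side. A sketch, mirroring the A1 helper above (the numeric values in the usage note are illustrative):

```python
def make_area2(alpha, gamma, vs, t_margin):
    """Second contact determination area A2: |X| <= alpha/2 + gamma,
    0 <= Z <= Z1 = Vs * T (FIG. 5(b)), with gamma > beta."""
    return {"half_width": alpha / 2.0 + gamma, "depth": vs * t_margin}

def in_area2(x, z, area2):
    """Is the real-space point (X, Z) inside A2?"""
    return 0.0 <= z <= area2["depth"] and abs(x) <= area2["half_width"]

def in_protrusion(x, z, alpha, beta, gamma, vs, t_margin):
    """Inside A2 but outside A1: the side strips of width gamma - beta
    where a quadruped, but not a human, triggers notification."""
    a1_half_width = alpha / 2.0 + beta
    area2 = make_area2(alpha, gamma, vs, t_margin)
    return in_area2(x, z, area2) and abs(x) > a1_half_width
```

With α = 1.7 m, β = 0.8 m, and γ = 1.5 m, an animal 2 m to the side at 20 m range lies in the protruding strip, so the quadruped branch warns where the human branch would not (absent the STEP 105 movement-vector condition).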

It is then determined whether the real-space position of the object is within the second contact determination area A2 (FIG. 4 / STEP 204); again, the real-space position of the object is measured in STEP 21 of FIG. 3. If the object's real-space position is determined to lie within the second contact determination area A2 (FIG. 4 / STEP 204: YES), it is determined from the output of the brake sensor 5 whether the driver of the host vehicle 10 is performing an appropriate braking operation (FIG. 4 / STEP 206). If the driver is determined not to be performing an appropriate braking operation (FIG. 4 / STEP 206: NO), the driver is notified that there is a high possibility of contact between the vehicle 10 and the object (FIG. 4 / STEP 208).
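Putting the two branches together, the flow of FIG. 4 reduces to a small decision function. This sketch takes area-membership and heading tests as supplied predicates and omits the brake-operation checks of STEPs 106/206, which the text notes may be omitted in any case.

```python
def should_notify(kind, pos, in_a1, in_a2, in_a3, heading_to_a1):
    """Decide whether to warn the driver (FIG. 4, brake check omitted).

    kind: 'human' or 'quadruped'; pos: real-space (X, Z) position;
    in_a1 / in_a2 / in_a3: area-membership predicates taking (x, z);
    heading_to_a1: predicate on pos for the movement-vector test.
    """
    x, z = pos
    if kind == "human":
        if in_a1(x, z):                            # STEP 104
            return True
        return in_a3(x, z) and heading_to_a1(pos)  # STEP 105
    if kind == "quadruped":
        return in_a2(x, z)                         # STEP 204
    return False
```

The asymmetry is visible at a glance: a quadruped anywhere in the wider A2 triggers a warning, while a human outside A1 additionally needs a movement vector directed toward A1.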

The configuration in which the image processing unit 1 executes the processing of STEP 100 corresponds to the object determination means of the present invention, the configuration executing the processing of STEPs 102 and 202 corresponds to the contact determination area setting means, and the configuration executing the processing of STEPs 108 and 208 corresponds to the object notification means.

According to the vehicle periphery monitoring device having the functions described above, when the object is determined to be a human and its real-space position is within the first contact determination area A1, the driver is notified that there is a high possibility of contact between the host vehicle 10 and the object (FIG. 4 / S100: A, S102-108; see FIG. 5(a)). When the object is determined to be a quadruped animal and its real-space position is within the second contact determination area A2, the same notification is issued (FIG. 4 / S100: B, S202-208; see FIG. 5(b)). Here, the second contact determination area A2 has a region overlapping the first contact determination area A1 and a region at least partly protruding beyond it (see FIGS. 5(a) and 5(b)). Therefore, when the object is a quadruped, a notification is issued even if its real-space position is not within the first contact determination area A1, as long as it lies in the portion of the second contact determination area A2 that protrudes beyond A1; the notification timing can thus be advanced compared with the case where the object is a human. Conversely, when the object is a human, a notification is not issued unconditionally even if its real-space position lies in that protruding portion, which prevents the driver from being notified needlessly often that contact with a human is likely. The device can therefore notify the driver that contact with an object is likely at a timing and frequency appropriate to the type of the object.

In the above embodiment, the second contact determination area A2 shown in FIG. 5(b) was set so as to overlap the whole of the first contact determination area A1 shown in FIG. 5(a); however, as shown in FIG. 6(a), the second contact determination area A2 (hatched portion) may instead be set so as to overlap only part of the first contact determination area A1. Also, in the above embodiment the overlapping region and the protruding region of the second contact determination area A2 were adjacent, but as shown in FIG. 6(b) they may be separated from each other. Further, the protruding region may be provided on only one side, left or right, of the overlapping region, and may have an asymmetric shape or placement. In addition, although in the above embodiment the second contact determination area A2 was set to the range obtained by adding the second margin γ (γ > β) to both sides of the vehicle width α, the combined region of the first contact determination area A1 and the third contact determination areas A3R and A3L may instead be used as the second contact determination area A2.

Further, the second contact determination area A2 may be set to the entire imaging area captured by the infrared cameras 2R and 2L. Also, although this embodiment is configured to issue a predetermined notification based on the processing results of the image processing unit 1, the vehicle behavior may instead be controlled based on those results. Furthermore, although two infrared cameras 2R and 2L are provided in the embodiment, a single infrared camera may be mounted on the host vehicle 10; in that case, the distance to the object is detected by a radar or the like.

FIG. 1 shows the overall configuration of an embodiment of the vehicle periphery monitoring device of the present invention. FIG. 2 is a perspective view of a vehicle equipped with the periphery monitoring device of FIG. 1. FIG. 3 is a flowchart showing the processing of the image processing unit provided in the periphery monitoring device of FIG. 1. FIG. 4 is a flowchart showing the notification determination process in this embodiment. FIG. 5 shows the contact determination areas imaged by the imaging device in this embodiment. FIG. 6 shows a modification of the contact determination areas imaged by the imaging device in this embodiment.

Explanation of Symbols

1: image processing unit (object extraction means, object determination means, contact determination area setting means); 2R, 2L: infrared cameras (imaging device); 6: speaker (object notification means); 7a: HUD (object notification means)

Claims (4)

1. A vehicle periphery monitoring device that monitors the periphery of a vehicle using captured images obtained by an imaging device mounted on the vehicle, comprising:
object extraction means for extracting an object from the captured images;
position measuring means for measuring the real-space position of the object extracted by the object extraction means;
object determination means for determining whether the object extracted by the object extraction means is a human or a quadruped animal;
contact determination area setting means which, when the object determination means determines that the object is a human, sets a first contact determination area for judging the likelihood of contact between the vehicle and the human, and which, when the object determination means determines that the object is a quadruped animal, sets a second contact determination area for judging the likelihood of contact between the vehicle and the quadruped, the second contact determination area having a region that overlaps the first contact determination area and a region at least part of which protrudes beyond the first contact determination area; and
object notification means for notifying the driver of the presence of the object when the real-space position of the human measured by the position measuring means is within the first contact determination area set by the contact determination area setting means, or when the real-space position of the quadruped measured by the position measuring means is within the second contact determination area set by the contact determination area setting means.
2. The vehicle periphery monitoring device according to claim 1, further comprising:
movement vector calculating means for calculating a movement vector of the object from a plurality of real-space positions of the object measured by the position measuring means at different times,
wherein the contact determination area setting means sets a third contact determination area outside the first contact determination area when the object determination means determines that the object is a human, and
the object notification means notifies the driver of the presence of the human when the real-space position of the human measured by the position measuring means is within the third contact determination area set by the contact determination area setting means and the movement vector of the human calculated by the movement vector calculating means is directed toward the first contact determination area.
3. The vehicle periphery monitoring device according to claim 1 or 2, wherein the contact determination area setting means sets, as the first contact determination area, a region ahead of the vehicle that extends parallel to the traveling direction of the vehicle and has a width equal to the vehicle width plus a predetermined margin on each of the left and right sides.
4. The vehicle periphery monitoring device according to any one of claims 1 to 3, wherein the contact determination area setting means sets the imaging area captured by the imaging device as the second contact determination area.
JP2008154408A 2008-06-12 2008-06-12 Vehicle periphery monitoring device Active JP4377439B1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2008154408A JP4377439B1 (en) 2008-06-12 2008-06-12 Vehicle periphery monitoring device
EP09762203A EP2270765B1 (en) 2008-06-12 2009-03-04 Vehicle periphery monitoring device
US12/994,922 US8189868B2 (en) 2008-06-12 2009-03-04 Vehicle periphery monitoring device
AT09762203T ATE556910T1 (en) 2008-06-12 2009-03-04 DEVICE FOR MONITORING THE VEHICLE SURROUNDINGS
CN2009801218769A CN102057414B (en) 2008-06-12 2009-03-04 Vehicle periphery monitoring device
PCT/JP2009/000977 WO2009150771A1 (en) 2008-06-12 2009-03-04 Vehicle periphery monitoring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008154408A JP4377439B1 (en) 2008-06-12 2008-06-12 Vehicle periphery monitoring device

Publications (2)

Publication Number Publication Date
JP4377439B1 true JP4377439B1 (en) 2009-12-02
JP2009301283A JP2009301283A (en) 2009-12-24

Family

ID=41416487

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008154408A Active JP4377439B1 (en) 2008-06-12 2008-06-12 Vehicle periphery monitoring device

Country Status (6)

Country Link
US (1) US8189868B2 (en)
EP (1) EP2270765B1 (en)
JP (1) JP4377439B1 (en)
CN (1) CN102057414B (en)
AT (1) ATE556910T1 (en)
WO (1) WO2009150771A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5761910B2 (en) * 2009-12-17 2015-08-12 キヤノン株式会社 Speed detection device
JP5751842B2 (en) 2010-09-16 2015-07-22 キヤノン株式会社 Speed detection apparatus and image forming apparatus
EP2711912A4 (en) * 2011-09-28 2015-04-22 Honda Motor Co Ltd Biometric device
EP2578464B1 (en) * 2011-10-06 2014-03-19 Honda Research Institute Europe GmbH Video-based warning system for a vehicle
US9437000B2 (en) * 2014-02-20 2016-09-06 Google Inc. Odometry feature matching
JP6190758B2 (en) * 2014-05-21 2017-08-30 本田技研工業株式会社 Object recognition device and vehicle
CN104192065A (en) * 2014-08-20 2014-12-10 刘健萍 Automobile periphery monitoring device capable of distinguishing monitored objects
JP6481520B2 (en) * 2015-06-05 2019-03-13 トヨタ自動車株式会社 Vehicle collision avoidance support device
JP6775285B2 (en) * 2015-09-24 2020-10-28 アルパイン株式会社 Rear side vehicle detection alarm device
CN109416887B (en) * 2016-08-24 2021-09-21 日立安斯泰莫株式会社 Vehicle notification device for identifying control object
US10366310B2 (en) * 2016-09-12 2019-07-30 Aptiv Technologies Limited Enhanced camera object detection for automated vehicles
JP7290119B2 (en) * 2020-01-24 2023-06-13 トヨタ自動車株式会社 vehicle alarm device

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3263699B2 (en) * 1992-12-22 2002-03-04 三菱電機株式会社 Driving environment monitoring device
JP3522317B2 (en) * 1993-12-27 2004-04-26 富士重工業株式会社 Travel guide device for vehicles
JP3866328B2 (en) * 1996-06-06 2007-01-10 富士重工業株式会社 Vehicle peripheral three-dimensional object recognition device
US6445409B1 (en) * 1997-05-14 2002-09-03 Hitachi Denshi Kabushiki Kaisha Method of distinguishing a moving object and apparatus of tracking and monitoring a moving object
WO2000007373A1 (en) * 1998-07-31 2000-02-10 Matsushita Electric Industrial Co., Ltd. Method and apparatus for displaying image
WO2000073996A1 (en) * 1999-05-28 2000-12-07 Glebe Systems Pty Ltd Method and apparatus for tracking a moving object
JP3515926B2 (en) * 1999-06-23 2004-04-05 本田技研工業株式会社 Vehicle periphery monitoring device
EP1202214A3 (en) * 2000-10-31 2005-02-23 Matsushita Electric Industrial Co., Ltd. Method and apparatus for object recognition
US20020130953A1 (en) * 2001-03-13 2002-09-19 John Riconda Enhanced display of environmental navigation features to vehicle operator
JP2003016429A (en) * 2001-06-28 2003-01-17 Honda Motor Co Ltd Vehicle periphery monitor device
US7167576B2 (en) * 2001-07-02 2007-01-23 Point Grey Research Method and apparatus for measuring dwell time of objects in an environment
CN1407509A (en) * 2001-09-04 2003-04-02 松下电器产业株式会社 Image processor, method and programme
KR100866450B1 (en) * 2001-10-15 2008-10-31 파나소닉 주식회사 Automobile surrounding observation device and method for adjusting the same
JP4647648B2 (en) * 2002-01-18 2011-03-09 本田技研工業株式会社 Vehicle periphery monitoring device
DE10258287A1 (en) * 2002-12-13 2004-06-24 Robert Bosch Gmbh Motor vehicle object detection system, e.g. for use in an adaptive or automatic cruise control system, comprises an array of three object detection sensors with different sensitivity areas and angles
US6859144B2 (en) * 2003-02-05 2005-02-22 Delphi Technologies, Inc. Vehicle situation alert system with eye gaze controlled alert signal generation
DE10336638A1 (en) 2003-07-25 2005-02-10 Robert Bosch Gmbh Apparatus for classifying at least one object in a vehicle environment
DE10348109A1 (en) * 2003-10-16 2005-05-19 Bayerische Motoren Werke Ag Method and device for visualizing a vehicle environment
JP4425642B2 (en) * 2004-01-08 2010-03-03 富士重工業株式会社 Pedestrian extraction device
JP2005316607A (en) * 2004-04-27 2005-11-10 Toyota Motor Corp Image processor and image processing method
JP4140567B2 (en) * 2004-07-14 2008-08-27 松下電器産業株式会社 Object tracking device and object tracking method
US20060050929A1 (en) * 2004-09-09 2006-03-09 Rast Rodger H Visual vector display generation of very fast moving elements
US20060170769A1 (en) * 2005-01-31 2006-08-03 Jianpeng Zhou Human and object recognition in digital video
US7447334B1 (en) * 2005-03-30 2008-11-04 Hrl Laboratories, Llc Motion recognition system
JP2007257148A (en) * 2006-03-22 2007-10-04 Nissan Motor Co Ltd Image processing apparatus and method
JP4173901B2 (en) * 2006-05-19 2008-10-29 本田技研工業株式会社 Vehicle periphery monitoring device
JP4528283B2 (en) * 2006-06-05 2010-08-18 本田技研工業株式会社 Vehicle periphery monitoring device
JP2007334511A (en) * 2006-06-13 2007-12-27 Honda Motor Co Ltd Object detection device, vehicle, object detection method and program for object detection
JP4203512B2 (en) * 2006-06-16 2009-01-07 本田技研工業株式会社 Vehicle periphery monitoring device
JP2008026997A (en) * 2006-07-18 2008-02-07 Denso Corp Pedestrian recognition device and pedestrian recognition method
JP4940933B2 (en) 2006-11-21 2012-05-30 株式会社明電舎 PWM inverter output voltage control device
DE102007015032A1 (en) * 2007-03-29 2008-01-10 Daimlerchrysler Ag Method for evaluating how critical driving situation is comprises calculating possible accelerations and decelerations for vehicle and object whose paths are set to cross and deducing time periods in which their paths would overlap
JP4567072B2 (en) * 2008-05-14 2010-10-20 本田技研工業株式会社 Vehicle periphery monitoring device

Also Published As

Publication number Publication date
EP2270765A1 (en) 2011-01-05
EP2270765B1 (en) 2012-05-09
EP2270765A4 (en) 2011-04-27
ATE556910T1 (en) 2012-05-15
WO2009150771A1 (en) 2009-12-17
JP2009301283A (en) 2009-12-24
CN102057414A (en) 2011-05-11
US8189868B2 (en) 2012-05-29
US20110096956A1 (en) 2011-04-28
CN102057414B (en) 2013-12-25

Similar Documents

Publication Publication Date Title
JP4377439B1 (en) Vehicle periphery monitoring device
JP4173901B2 (en) Vehicle periphery monitoring device
WO2010050090A1 (en) System for monitoring the area around a vehicle
JP5198346B2 (en) Vehicle periphery monitoring device
US9235990B2 (en) Vehicle periphery monitoring device
JP4173902B2 (en) Vehicle periphery monitoring device
JP5480925B2 (en) Vehicle periphery monitoring device
JP4644273B2 (en) Vehicle periphery monitoring device
JP2007128460A (en) Collision determination system, collision determination method, computer program, and determination device
JP5172482B2 (en) Vehicle periphery monitoring device
JP5004903B2 (en) Vehicle periphery monitoring device
JP5004919B2 (en) Vehicle periphery monitoring device
JP4615061B2 (en) Vehicle periphery monitoring device
JP4995778B2 (en) Vehicle periphery monitoring device
JP5026398B2 (en) Vehicle periphery monitoring device
JP5907849B2 (en) Vehicle periphery monitoring device
JP4972073B2 (en) Environment recognition device

Legal Events

Date Code Title Description
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20090901

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20090910

R150 Certificate of patent or registration of utility model

Ref document number: 4377439

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120918

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130918

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140918

Year of fee payment: 5

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250