JP2009086788A - Vehicle surrounding monitoring device - Google Patents

Vehicle surrounding monitoring device

Info

Publication number
JP2009086788A
Authority
JP
Japan
Prior art keywords
vehicle
blind spot
unit
obstacle
warning
Prior art date
Legal status
Pending
Application number
JP2007252957A
Other languages
Japanese (ja)
Inventor
Hisaya Nakataku
拓久哉 中
Masaru Yamazaki
勝 山崎
Tatsuya Yoshida
龍也 吉田
Current Assignee
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd
Priority to JP2007252957A
Publication of JP2009086788A

Landscapes

  • Traffic Control Systems (AREA)

Abstract

PROBLEM TO BE SOLVED: Conventional techniques have difficulty alerting the driver to an obstacle that suddenly appears from a blind spot where the monitored area is blocked by roadside installations and the like.

SOLUTION: This vehicle surrounding monitoring device includes: a warning unit; an obstacle detecting unit for detecting obstacles around the host vehicle; a host vehicle motion detecting unit for detecting the motion of the host vehicle; a host vehicle course trajectory estimating unit for estimating the course trajectory of the host vehicle based on its direction and speed of travel; a blind spot estimating unit for estimating blind spots created by obstacles existing around the host vehicle; a course obstruction appearance determining unit for estimating the position where the host vehicle would collide with a course obstruction, on the assumption that a course obstruction blocking the course trajectory of the host vehicle will appear from the blind spot; and a warning determining unit for computing the danger level of the blind spot based on the distance between the collision position and the host vehicle and the vehicle's speed of travel, and determining whether or not the warning unit should be operated according to the danger level. The warning unit provides a warning to the driver based on the determination by the warning determining unit.

COPYRIGHT: (C)2009, JPO&INPIT

Description

The present invention relates to a vehicle periphery monitoring device that prevents collision accidents before they occur.

Conventionally, various techniques for avoiding collisions between vehicles have been developed.

One known vehicle periphery monitoring device irradiates the driver's blind spot area behind the vehicle with a plurality of laser beams, measures the relative distance to a vehicle traveling in the blind spot area, and issues a warning when the relative distance falls below a reference value (see Patent Document 1). There is also a rear-end collision prevention device that detects obstacles around the vehicle using a plurality of radars mounted on the vehicle and avoids vehicles approaching from behind (see Patent Document 2).

Patent Document 1: JP-A-8-24149
Patent Document 2: JP-A-2005-182198

According to the above prior art, the driver can be warned of obstacles detected by a monitoring device such as a radar mounted on the host vehicle. However, for obstacles that suddenly appear from a blind spot area where the monitored area is blocked by other vehicles, walls, roadside installations, and the like, it is difficult to give a warning early enough for the driver to avoid or brake. For example, at a crossroads with poor visibility blocked by walls, the monitored area is obstructed by the walls, so the traffic on the crossing road cannot be detected until the vehicle approaches the center of the intersection, which can lead to crossing-collision accidents. A pedestrian may also dart out from the blind spot area created by a stopped vehicle.

To prevent accidents caused by such blind spots, road-to-vehicle communication technologies have been developed that install monitoring devices such as cameras and radars on the road and use wireless technologies such as optical beacons and DSRC (Dedicated Short Range Communications) to continuously inform vehicles of the presence of obstacles, as well as vehicle-to-vehicle communication technologies that share obstacle detection information between vehicles. However, deploying such monitoring equipment on roads nationwide and equipping vehicles with the corresponding devices requires enormous cost and time.

Accordingly, an object of the present invention is to provide a vehicle periphery monitoring device that notifies the driver of the danger of blind spot areas caused by obstacles present around the host vehicle.

One preferred embodiment of the present invention is as follows.

The vehicle periphery monitoring device includes: a warning unit that gives a warning to the driver; an obstacle detection unit that detects obstacles around the host vehicle as a sequence of detection points and detects the relative distance, azimuth angle, and relative speed between the host vehicle and each detection point; a host vehicle motion detection unit that detects the traveling direction and traveling speed of the host vehicle; a host vehicle course trajectory estimation unit that estimates the course trajectory of the host vehicle based on the traveling direction and traveling speed detected by the host vehicle motion detection unit; a blind spot area estimation unit that estimates blind spot areas caused by obstacles present around the host vehicle; a course obstruction appearance determination unit that estimates the position where the host vehicle would collide with a course obstruction, assuming that a course obstruction blocking the estimated course trajectory appears from the estimated blind spot area; and a warning determination unit that calculates the degree of danger of the blind spot area based on the distance between the estimated collision position and the host vehicle and the traveling speed of the host vehicle, and decides whether to operate the warning unit according to the degree of danger. The warning unit gives a warning to the driver based on the decision of the warning determination unit.

According to the present invention, it is possible to provide a vehicle periphery monitoring device that notifies the driver of the danger of blind spot areas caused by obstacles present around the host vehicle.

Hereinafter, an embodiment of the vehicle periphery monitoring device will be described with reference to FIGS. 1 to 5.

FIG. 1 is a diagram showing the vehicle periphery monitoring device 300.

The host vehicle 20 is equipped with obstacle detection units 100a, 100b, 100c, and 100d for detecting surrounding obstacles, and with a host vehicle motion detection unit for detecting the position and attitude of the host vehicle (including its traveling direction) and its motion (including its traveling speed), consisting of a GPS 110, an acceleration/yaw-rate sensor 120, a direction indicator 130, wheel speed sensors 140a, 140b, 140c, and 140d, and a steering angle sensor 150, as well as a gaze detection sensor 160 serving as a gaze detection unit that detects the driver's line of sight. The vehicle is further equipped with a map information storage device 200, a road traffic information receiving device 210, a traffic accident information storage device 240, an arithmetic device 220 that performs the computations required for vehicle periphery monitoring, and a warning device 230. This configuration provides the following effects:
● "Danger judgment for blind spot areas": The degree of danger of blind spot areas caused by obstacles around the vehicle is judged autonomously from the various on-board sensors, and a warning is issued.
● "Danger judgment according to the driving scene": The degree of danger of a blind spot area is calculated according to the driving scene, such as the driving state of the host vehicle, past and current traffic conditions, and road surface conditions.
● "Affinity with the driver": By detecting the driver's line of sight, the warning operation takes the driver's attention to the blind spot area into account, and the location of the blind spot is also indicated by the warning.

Next, the host vehicle motion detection unit that detects the position, attitude, and motion of the host vehicle will be described.

The GPS (Global Positioning System) 110 is a sensor that detects the current position of the host vehicle on the earth and is commonly used in vehicles as part of a navigation system.

The acceleration/yaw-rate sensor 120 is a sensor that detects the longitudinal acceleration, lateral acceleration, and yaw rate of the host vehicle in the traveling direction and the vehicle width direction.

The direction indicator 130 is a device that outputs a signal corresponding to the course change direction indicated by the driver.

The wheel speed sensors 140a, 140b, 140c, and 140d are sensors that detect the rotational speeds of the front, rear, left, and right wheels and are commonly installed in four-wheeled vehicles such as ordinary passenger cars.

The steering angle sensor 150 is a sensor that detects the steering angle of the steering wheel.

Each of these pieces of information is periodically transmitted to the arithmetic device 220 via signal lines.

Next, the obstacle detection units that detect obstacles around the host vehicle will be described.

A plurality of obstacle detection units 100a, 100b, 100c, and 100d are provided at the front, rear, and sides of the vehicle, and output the relative distance, relative speed, and azimuth information of a plurality of detection points obtained by scanning the detection area. Typical obstacle detection units include a millimeter-wave radar 103, a laser range finder 102, and a camera 101.

The millimeter-wave radar 103 scans electromagnetic waves in the 24 GHz band or the 30 to 300 GHz band in one or two dimensions within a predetermined area and measures the round-trip time and phase difference of the electromagnetic waves reflected from obstacles, thereby obtaining two- or three-dimensional measurements of relative distance, azimuth, and relative speed for a plurality of detection points. Since the reflectance of electromagnetic waves differs depending on the material of the obstacle, the material can be identified by detecting the reflection intensity of the reflected wave.

The laser range finder 102 is a radar that uses highly directional laser light. Like the millimeter-wave radar 103, it scans a predetermined area in one or two dimensions and measures the round-trip time of the light reflected from obstacles, thereby obtaining two- or three-dimensional relative distance and azimuth information for a plurality of detection points. In general, the laser range finder 102 cannot detect relative speed directly, so the relative speed is obtained by taking the difference between the data scanned one step before and the current scan data. As with the millimeter-wave radar 103, the reflectance measured by the laser range finder 102 differs depending on the material of the obstacle, so the material can be identified by detecting the reflection intensity.
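
A minimal Python sketch of this scan-differencing idea, assuming a fixed scan period and a simple per-index pairing of points from consecutive scans; the function name, the scan period, and the pairing scheme are illustrative choices and are not specified above.

```python
# Minimal sketch (assumed, not from the patent text): radial relative speed
# from two consecutive laser range finder scans, paired per scan index.
from typing import List

def relative_speeds(prev_ranges: List[float],
                    curr_ranges: List[float],
                    scan_period_s: float = 0.1) -> List[float]:
    """Approximate radial relative speed [m/s] per detection point.

    Negative values mean the detected surface is getting closer.
    Assumes both scans cover the same angular grid in the same order.
    """
    speeds = []
    for prev_r, curr_r in zip(prev_ranges, curr_ranges):
        speeds.append((curr_r - prev_r) / scan_period_s)
    return speeds

if __name__ == "__main__":
    prev_scan = [10.0, 10.2, 10.4, 30.0]   # ranges [m] one scan step earlier
    curr_scan = [9.6, 9.8, 10.0, 30.0]     # current ranges [m]
    print(relative_speeds(prev_scan, curr_scan))  # first three points close in at ~4 m/s
```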

Obstacle detection with the camera 101 differs depending on how many cameras are used. In typical binocular stereo vision using two cameras 101, three-dimensional relative distance and azimuth information within the imaged area can be detected by triangulation based on parallax. Since the camera 101 can acquire the luminance information of obstacles, an obstacle can be recognized by preparing a template representing features such as the shape and color of the target to be recognized and performing matching between the template and the captured image. As with the laser range finder 102, the relative speed cannot be detected directly when the camera 101 is used, so it is obtained by taking the difference between the data scanned one step before and the current scan data. The detection results of these obstacle detection units are transmitted to the arithmetic device 220 via signal lines.

In this embodiment, for simplicity of explanation, a case in which an obstacle detection unit capable of acquiring two-dimensional detection data is used will be described. Even when a three-dimensional obstacle detection unit is used, generality is not lost because the same processing procedure as in the two-dimensional case is followed. The sensors in this embodiment do not limit the contents of this description; other sensors and measurement methods may be used, and the mounting locations on the vehicle are not limited either.

The gaze detection sensor 160, mounted as a gaze detection unit that detects the driver's line of sight, is used to determine in which direction the driver is paying attention. The configuration and method of the gaze detection sensor are not particularly limited, but a commonly used method is to photograph the driver's eyes with a camera fixed inside the vehicle and detect the line of sight based on the image data.

The map information storage device 200 is a device, commonly used in navigation systems, that stores various geographic information about part or all of the earth's surface, such as land elevation, terrain, addresses, building shapes, road shapes, and road classifications; the necessary geographic information is read out in response to information requests from the arithmetic device 220. All of the stored information may be carried on the vehicle, or only part of it may be carried on the vehicle and updated by sequentially transmitting information stored at a base station to the vehicle using wireless technology; new geographic information may also be added.

The road traffic information receiving device 210 is a device that receives road traffic information about the road ahead, such as congestion, accident and traffic regulation information, and weather information. For example, the Road Traffic Information Communication System Center provides a congestion information service. Obtaining congestion, accident, traffic regulation, and weather information makes it possible to take measures to avoid danger in advance. From congestion, accident, and traffic regulation information, if an accident vehicle is present on the road ahead or congestion is forecast, there is a high probability that slow-moving or stopped vehicles exist, and therefore a high probability that obstacles exist ahead on the traveling road. Conversely, if no congestion is forecast, the probability that obstacles exist ahead is low. From weather information, road surface conditions such as dry, wet, or frozen can be inferred. If the road surface condition can be inferred, the friction coefficient between the tires and the road surface can be estimated, and therefore the distance required for braking can be estimated according to the weather.

The traffic accident information storage device 240 is a storage device that stores statistics on traffic accidents that have occurred on roads in the past; it is assumed to store statistics on the number of accidents according to season, time of day, location, and road shape. From this information, it is possible to know the seasons and times at which accidents occur frequently at a given location, and the nature of those accidents, such as whether they are mainly vehicle-to-vehicle or vehicle-to-pedestrian accidents.

FIG. 2 is a block diagram showing the processing flow of the vehicle periphery monitoring device 300.

The obstacle detection unit 1 is a block that collectively represents 100a, 100b, 100c, and 100d in FIG. 1. The host vehicle motion detection unit 2 is a block that collectively represents the GPS 110, the acceleration/yaw-rate sensor 120, the direction indicator 130, the wheel speed sensors 140a, 140b, 140c, and 140d, and the steering angle sensor 150 in FIG. 1. The gaze detection unit 3 is the gaze detection sensor 160.

The arithmetic device 220 is a device that performs the computations for vehicle periphery monitoring, and carries out the processing of the obstacle recognition unit 4, the map matching unit 8, the blind spot area estimation unit 6, the course estimation unit 7, the host vehicle position estimation unit 5, the course obstruction appearance determination unit 9, and the warning determination unit 10. The arithmetic device 220 processes the information transmitted from the obstacle detection unit 1, the host vehicle motion detection unit 2, the map information storage device 200, the traffic accident information storage device 240, and the road traffic information receiving device 210, and operates the warning device 230 according to the processing results.

The host vehicle position estimation unit 5 estimates the absolute position and attitude of the host vehicle based on the output of the host vehicle motion detection unit 2 and outputs these estimation results; this element makes it possible to determine where around the host vehicle the blind spot areas arising in the detection area of the obstacle detection unit 1 are located.

To estimate the absolute position and attitude of the host vehicle, position and attitude estimation is performed using a vehicle motion model based on vehicle dynamics, drawing not only on the GPS 110 but also on the multiple pieces of information obtained from the acceleration/yaw-rate sensor 120, the wheel speed sensors 140a, 140b, 140c, and 140d, and the steering angle sensor 150. Because position and attitude estimation using a vehicle motion model involves time integration, the measurement errors of the various sensors may accumulate and the estimation error may grow. This can be mitigated by adding the position information obtained from the GPS 110 to reduce the estimation error. Estimation accuracy can also be improved by fusing in methods that directly detect the position and attitude relationship between the traveling road and the host vehicle without integration, such as detecting the position and attitude of the host vehicle relative to the lane by detecting lanes with a camera or radar, or detecting the position and attitude of the host vehicle based on information obtained from sensors or beacons installed on the road.
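
A minimal sketch of the dead-reckoning idea described above, assuming a planar kinematic model driven by speed and yaw rate and a simple complementary-filter blend toward a GPS fix to limit integration drift; the blend weight, update rate, and function names are illustrative assumptions rather than the estimator used in the embodiment.

```python
# Assumed illustrative dead reckoning: integrate speed and yaw rate, then
# nudge the estimate toward a GPS fix so that integration errors stay bounded.
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """One planar kinematic update (x, y in metres, heading in radians)."""
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

def blend_with_gps(x, y, gps_x, gps_y, weight=0.05):
    """Complementary-filter style correction toward the GPS position."""
    return x + weight * (gps_x - x), y + weight * (gps_y - y)

if __name__ == "__main__":
    x, y, heading = 0.0, 0.0, 0.0
    for _ in range(100):                 # 1 s of driving at 10 m/s, 0.1 rad/s yaw rate
        x, y, heading = dead_reckon(x, y, heading, 10.0, 0.1, 0.01)
        x, y = blend_with_gps(x, y, gps_x=x, gps_y=y)   # GPS assumed to agree here
    print(round(x, 2), round(y, 2), round(heading, 2))
```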

The absolute coordinate system for the position and attitude of the host vehicle may be a latitude-longitude coordinate system defined based on the Japanese geodetic system or one defined based on the International Terrestrial Reference System, and a plane rectangular coordinate system may be used instead of a latitude-longitude coordinate system. Since the reference point differs depending on the reference coordinate system, the reference point of the chosen system may be used. The above position and attitude estimation method does not limit the contents of this description; other position and attitude estimation methods or other coordinate systems may be used.

The host vehicle course estimation unit 7 estimates, based on the output of the host vehicle motion detection unit 2, the absolute positions of the course trajectory along which the host vehicle is expected to travel in the future, starting from the absolute position estimated by the host vehicle position estimation unit 5, and outputs the absolute positions of the estimated course trajectory. Specifically, the course trajectory is estimated by calculating a plurality of points through which the host vehicle will pass in the future, based on the vehicle speed calculated from the wheel speeds output by the wheel speed sensors 140a, 140b, 140c, and 140d and the wheel radius, and on the yaw rate output by the acceleration/yaw-rate sensor 120.
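
The sketch below illustrates one way to generate such future pass-through points, assuming constant speed and yaw rate over the prediction horizon (a constant-turn-rate model); the horizon length, time step, and function names are assumptions made for this example.

```python
# Assumed sketch: predict future course points from current speed and yaw rate
# under a constant-speed, constant-yaw-rate (circular arc) assumption.
import math
from typing import List, Tuple

def vehicle_speed(wheel_speeds_rad_s: List[float], wheel_radius_m: float) -> float:
    """Vehicle speed from the mean wheel angular speed and the wheel radius."""
    return sum(wheel_speeds_rad_s) / len(wheel_speeds_rad_s) * wheel_radius_m

def predict_course(speed: float, yaw_rate: float,
                   horizon_s: float = 3.0, dt: float = 0.2
                   ) -> List[Tuple[float, float]]:
    """Return (x, y) waypoints in the vehicle frame (x forward, y left)."""
    points = []
    x = y = heading = 0.0
    for _ in range(int(horizon_s / dt)):
        heading += yaw_rate * dt
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        points.append((x, y))
    return points

if __name__ == "__main__":
    v = vehicle_speed([31.0, 31.2, 30.8, 31.0], wheel_radius_m=0.32)  # ~10 m/s
    for px, py in predict_course(v, 0.2)[::5]:   # gentle left turn
        print(f"x={px:6.2f} m  y={py:6.2f} m")
```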

The course trajectory may also be estimated using the longitudinal and lateral accelerations output by the acceleration/yaw-rate sensor 120, or using the steering angle output by the steering angle sensor 150. The course trajectory may also be estimated using the vehicle motion model expressing vehicle dynamics that is used in the host vehicle position estimation unit 5. The course of the host vehicle may also be determined based on the direction indicator 130, which is one of the host vehicle motion detection units 2. However, since the course trajectory of the host vehicle cannot be estimated from the direction indicator 130 alone, it should be used in combination with the above trajectory estimation methods. The above course trajectory estimation methods do not limit the contents of this description, and other estimation methods may be used.

The obstacle recognition unit 4 groups, out of the plurality of detection point sequences output by the obstacle detection unit 1, those detection points judged to be detecting the same obstacle, and outputs the output data of the obstacle detection unit 1 with group numbers added. Specifically, point sequences in which the distance between adjacent detection points is equal to or less than a threshold are grouped sequentially along the scan direction and recognized as the same obstacle. When the distance between detection points is equal to or greater than a predetermined value, the adjacent detection points are judged to belong to different obstacles, a group number is added to the grouped detection point data, and grouping of the next detection points begins. This processing is performed on the data of all detection points to group the plurality of detection points. Although the above grouping method uses only the spatial information of distance, a grouping method may also be used that judges detection points to belong to the same obstacle when their differences, including the speed difference and the reflection intensity difference between detection points, are equal to or less than predetermined values. In addition, if the number of detection points belonging to a group is smaller than a predetermined number as a result of the grouping process, those detection points may be judged to be noise and removed from the detection point sequence.
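
A minimal sketch of this distance-threshold grouping, assuming scan-ordered 2-D points; the threshold values, the noise-rejection size, and the function name are illustrative assumptions.

```python
# Assumed sketch of the grouping described above: walk the scan in order,
# start a new group whenever the gap to the previous point exceeds a
# threshold, and drop groups that are too small to be trusted.
import math
from typing import List, Tuple

def group_detection_points(points: List[Tuple[float, float]],
                           gap_threshold_m: float = 0.8,
                           min_points_per_group: int = 3) -> List[List[int]]:
    """Group scan-ordered (x, y) points; return lists of point indices."""
    groups: List[List[int]] = []
    current = [0] if points else []
    for i in range(1, len(points)):
        dx = points[i][0] - points[i - 1][0]
        dy = points[i][1] - points[i - 1][1]
        if math.hypot(dx, dy) <= gap_threshold_m:
            current.append(i)
        else:
            groups.append(current)
            current = [i]
    if current:
        groups.append(current)
    # Treat very small groups as noise, as suggested in the description.
    return [g for g in groups if len(g) >= min_points_per_group]

if __name__ == "__main__":
    wall = [(x * 0.3, 5.0) for x in range(10)]        # points along a wall
    car = [(8.0 + x * 0.3, 2.0) for x in range(6)]    # points on a parked car
    noise = [(20.0, 20.0)]
    for n, g in enumerate(group_detection_points(wall + car + noise), start=1):
        print(f"group A{n}: indices {g[0]}..{g[-1]}")
```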

Since this description does not limit the grouping method, means other than the above may be used. The obstacle recognition unit 4 also does not have to be processed in the arithmetic device 220; the processing may be performed by an arithmetic device built into the obstacle detection unit 1. However, when the processing is performed by an arithmetic device built into the obstacle detection unit 1, the detection data of the detection points including the identification results shall be transmitted to the arithmetic device 220.

FIG. 3 is a diagram showing a processing example of the obstacle recognition unit 4. Here, an example is shown in which a laser range finder, serving as the obstacle detection unit 1 mounted at the front of the host vehicle 20, scans the detection range and detects obstacles around the host vehicle.

The host vehicle 20 is traveling on a road with a wall surface 22 on its right side, and another vehicle 21 is stopped ahead. Suppose that, in this situation, obstacles around the host vehicle are detected by the laser range finder, and data on the relative positions, azimuth angles, and relative speeds between the host vehicle 20 and the plurality of detection points, shown as white circles 30, corresponding to the other vehicle 21 and the wall surface 22 are obtained. The obstacle recognition unit 4 receives the data of these detection points and performs the grouping process described above, forming the groups 31 and 32 enclosed by dotted lines, and identifiers A1 and A2 are added to the detection point sequence data in groups 31 and 32, respectively.

The blind spot area estimation unit 6 estimates the absolute location of the blind spot areas of the obstacle detection unit 1 that arise in the detection area due to the presence of obstacles, based on the information of the plurality of detection points with group numbers output by the obstacle recognition unit 4 and the absolute position and attitude information of the host vehicle estimated by the host vehicle position estimation unit 5, and outputs the absolute location information of the estimated blind spot areas. A blind spot area of the obstacle detection unit 1 caused by an obstacle is an area in which the transmitted waves or reflected sunlight are blocked and data such as relative position and relative speed cannot be measured. If the measurement points on the obstacle, the boundary points between those measurement points and the blind spot area (hereinafter, blind spot generation points), and the installation position of the obstacle detection unit 1 are known, the blind spot area can be estimated. To determine the blind spot generation points from the measurement points obtained by the obstacle detection unit 1, points are sought among the plurality of detection points at which the distance between adjacent detection points is equal to or greater than a predetermined value. Since this search for blind spot generation points involves the same processing as the grouping performed by the obstacle recognition unit 4, the detection points at both ends of a detection point sequence to which a group number has already been assigned by the obstacle recognition unit 4 are taken as the two blind spot generation points. The blind spot area can thereby be estimated.
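
The following sketch illustrates this construction in the sensor frame, assuming the two end points of each group are the blind spot generation points and that each blind spot boundary is the ray from the sensor origin through a generation point, extended into the occluded region; the extension length and function name are illustrative assumptions.

```python
# Assumed sketch: take the two end points of a grouped detection point
# sequence as blind spot generation points and describe each blind spot
# boundary as the part of the sensor ray lying beyond that point (the region
# behind the obstacle cannot be measured).
import math
from typing import List, Tuple

Point = Tuple[float, float]

def blind_spot_boundaries(group: List[Point],
                          sensor_origin: Point = (0.0, 0.0),
                          extend_to_m: float = 30.0) -> List[Tuple[Point, Point]]:
    """Return two boundary segments, one per blind spot generation point."""
    boundaries = []
    for gen_point in (group[0], group[-1]):
        dx = gen_point[0] - sensor_origin[0]
        dy = gen_point[1] - sensor_origin[1]
        r = math.hypot(dx, dy)
        far = (sensor_origin[0] + dx / r * extend_to_m,
               sensor_origin[1] + dy / r * extend_to_m)
        boundaries.append((gen_point, far))
    return boundaries

if __name__ == "__main__":
    parked_car = [(8.0 + 0.3 * i, 2.0) for i in range(6)]   # e.g. group A2 in FIG. 3
    for start, end in blind_spot_boundaries(parked_car):
        print(f"boundary from {start} towards {tuple(round(v, 1) for v in end)}")
```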

When a plurality of adjacent blind spot areas exist, they may be combined into one blind spot area. Specifically, in the triangle whose vertices are the neighboring blind spot generation points of adjacent blind spot areas and the installation point of the obstacle detection unit 1, when the angle at the obstacle detection unit 1 is equal to or less than a predetermined value, the adjacent blind spot areas may be regarded as the same blind spot area.
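
A small sketch of this merge test, assuming the angle is measured at the sensor origin between the two neighboring generation points; the angular threshold is an illustrative assumption.

```python
# Assumed sketch of the merge test: if the angle subtended at the sensor by
# the neighbouring blind spot generation points of two adjacent blind spot
# areas is small, treat them as one blind spot area.
import math

def should_merge(gen_point_a, gen_point_b, sensor_origin=(0.0, 0.0),
                 max_angle_rad=math.radians(3.0)) -> bool:
    ang_a = math.atan2(gen_point_a[1] - sensor_origin[1],
                       gen_point_a[0] - sensor_origin[0])
    ang_b = math.atan2(gen_point_b[1] - sensor_origin[1],
                       gen_point_b[0] - sensor_origin[0])
    diff = abs(ang_a - ang_b)
    diff = min(diff, 2.0 * math.pi - diff)     # wrap to [0, pi]
    return diff <= max_angle_rad

if __name__ == "__main__":
    print(should_merge((10.0, 1.0), (10.0, 1.4)))   # about 2.3 deg apart -> True
    print(should_merge((10.0, 1.0), (10.0, 4.0)))   # about 16 deg apart  -> False
```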

FIG. 4 is a diagram showing the processing performed by the blind spot area estimation unit 6. Here, the thick lines 33 and 34 represent the boundaries of the blind spot areas 40 and 41, respectively, and the black circles 60, 61, 62, and 63 represent blind spot generation points.

FIG. 4 shows the processing result when the blind spot area estimation unit 6 receives the data processed by the obstacle recognition unit 4 in FIG. 3. In the order of the group numbers, the two blind spot generation points located at both ends of each detection point sequence (60, 61, 62, 63) are detected; the hatched blind spot area 40 is estimated from the detection point sequence and the two straight lines passing through the origin of the coordinate system of the obstacle detection unit 1 and the blind spot generation points 60 and 61, and the blind spot area 41 is estimated in the same manner for the blind spot generation points 62 and 63.

The map matching unit 8 acquires geographic information for specified locations from the map information storage device 200 based on absolute position information. From the absolute position of the host vehicle obtained from the host vehicle position estimation unit 5, the absolute location of the blind spot areas obtained from the blind spot area estimation unit 6, and the absolute positions of the course trajectory obtained from the host vehicle course estimation unit 7, it retrieves from the map information storage device 200 the geographic information around the host vehicle, around the blind spot areas, and along the course trajectory. Based on the types and shapes of buildings and roads obtained from the geographic information, it is possible to determine what classification of road exists and whether buildings exist. The signals output from the map matching unit 8 are the output values of the host vehicle position estimation unit 5, the host vehicle course estimation unit 7, and the blind spot area estimation unit 6, with geographic information included.

The course obstruction appearance determination unit 9 determines, based on the grouped detection point data of obstacles around the vehicle output by the obstacle recognition unit 4 and on the blind spot area boundary positions and host vehicle course trajectory information including geographic information output by the map matching unit 8, whether a course obstruction that blocks the course of the host vehicle will appear from a blind spot area; if it determines that one will appear, it predicts the collision position with the course obstruction and outputs the predicted position. The type and size of the course obstruction can be set arbitrarily.

The course obstruction appearance determination unit 9 performs the course obstruction appearance determination process on blind spot area boundaries (hereinafter referred to as search boundaries) that satisfy the following conditions:
(Condition 1-1) The blind spot area lies within a predetermined range from the host vehicle.
(Condition 1-2) The boundary is close to the host vehicle's course trajectory, excluding boundaries lying on obstacle detection points.

The purpose of limiting the search boundaries of (Condition 1-1) to within a predetermined distance is to reduce the processing load of the course obstruction appearance determination unit 9 by processing only nearby blind spot areas that pose a high risk to the host vehicle.

The purpose of limiting the search in (Condition 1-2) to blind spot area boundaries close to the course trajectory is to reflect the fact that a course obstruction emerging from a blind spot area is first detected by the obstacle detection unit 1 only when it crosses that boundary.

When the following two conditions are satisfied at a search boundary selected as described above, it is determined that a course obstruction will appear from the blind spot area. It is assumed that a course obstruction basically enters perpendicular to the host vehicle's course trajectory unless its course is constrained by the road shape.
(Condition 2-1) The geographic conditions of the blind spot area near the search boundary are such that a course obstruction could exist there.
(Condition 2-2) No obstacle is entering the search boundary or the path of the course obstruction.

The geographic conditions described in (Condition 2-1) are conditions set based on geographic information obtained from the map information: (Condition A) whether the blind spot area has enough area for a course obstruction to exist; (Condition B) whether the blind spot area is land on which a course obstruction could exist; and (Condition C) whether there is a three-dimensional object in the blind spot area that blocks the obstruction's path. For (Condition A), for example, if a vehicle is classified as one candidate course obstruction and a certain size has been assigned to it, and it is determined that the blind spot area does not have enough area for a vehicle to exist, vehicles are removed from the candidate course obstructions. For (Condition B), for example, if the blind spot area is on a water surface or inside a building, no course obstruction exists. For (Condition C), for example, if there is installation information for guardrails or curbs, it is determined that no vehicle can enter the host vehicle's course trajectory even if (Condition A) and (Condition B) indicate that a course obstruction could exist. However, since the geographic information involved in judging (Condition B) and (Condition C) is extremely varied, it is desirable to store in a storage device, in advance, a database of patterns indicating whether a course obstruction can exist based on the geographic information.

The obstacles described in (Condition 2-2) that enter the search boundary or block the path of a course obstruction are obstacles detected by the obstacle detection unit 1 that are expected to enter the search boundary or the path of the course obstruction. If such an obstacle exists, a collision between that obstacle and the course obstruction judged to exist under (Condition 2-1) is expected, so it is determined that the course obstruction will not appear on the host vehicle's course trajectory.

The determination processing of (Condition 2-1) and (Condition 2-2) described above is performed along the search boundary in order starting from the location closest to the host vehicle position; it is determined that a course obstruction will appear from the first position that satisfies (Condition 2-1) and (Condition 2-2), and the determination processing is then terminated. When a plurality of search boundaries exist, the process moves to the next search boundary and the same processing is repeated, so that the course obstruction appearance determination is performed for all search boundaries.
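
A minimal sketch of this decision loop, assuming the geographic test of (Condition 2-1) and the moving-obstacle test of (Condition 2-2) are supplied as caller-provided callables standing in for the map database and the tracked obstacle data; all names here are illustrative.

```python
# Assumed sketch: walk candidate points along one search boundary from nearest
# to farthest and report the first point where the geographic conditions hold
# (2-1) and no detected obstacle is about to enter the boundary or the
# obstruction's assumed path (2-2).
from typing import Callable, List, Optional, Tuple

Point = Tuple[float, float]

def find_appearance_point(
        boundary_points: List[Point],                      # ordered nearest-first
        geo_allows_obstruction: Callable[[Point], bool],   # condition 2-1
        obstacle_enters_path: Callable[[Point], bool],     # condition 2-2
) -> Optional[Point]:
    for p in boundary_points:
        if geo_allows_obstruction(p) and not obstacle_enters_path(p):
            return p          # obstruction assumed to emerge here
    return None               # no appearance predicted on this boundary

if __name__ == "__main__":
    boundary = [(6.0, 2.0), (6.0, 3.0), (6.0, 4.0)]
    on_road = lambda p: p[1] >= 3.0          # toy stand-in for the map check
    blocked = lambda p: False                # no other obstacle moving in
    print(find_appearance_point(boundary, on_road, blocked))   # (6.0, 3.0)
```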

FIG. 5 is a diagram showing an example of the determination made by the course obstruction appearance determination unit 9. In particular, it shows the result when the course obstruction appearance determination unit 9 receives the data processed by the blind spot area estimation unit 6 for the situation of FIG. 3 and determines that a course obstruction 23 will appear from the blind spot area 41.

In FIG. 5, there are two blind spot areas 40 and 41 caused by the obstacles of group numbers A1 and A2, and the host vehicle 20 is assumed to proceed in the direction of the arrow of the host vehicle course trajectory 50. When, based on (Condition 1-1) and (Condition 1-2), search boundaries that lie within a predetermined range from the host vehicle 20 and are close to the host vehicle course trajectory 50 are selected from the blind spot area boundaries indicated by the thick lines 33 and 34, the two blind spot area boundaries drawn with extra-thick lines are selected as the search boundaries 42 and 43.

Next, the appearance of an obstruction is judged for these two search boundaries 42 and 43 based on (Condition 2-1) and (Condition 2-2). The search boundary 42, belonging to the blind spot area 40 caused by the wall surface 22 of group number A1, lies on the wall surface and therefore does not satisfy (Condition 2-1), so it is determined that no course obstruction exists there. The search boundary 43, belonging to the blind spot area 41 caused by the stopped other vehicle 21 of group number A2, satisfies (Condition 2-1) because there is enough area for a course obstruction to exist, the area is on a road, and there is no three-dimensional object blocking the path; it also satisfies (Condition 2-2) because there is no obstacle nearby that would block the path of the course obstruction. It can therefore be determined that a course obstruction 23 exists at the location shown in the figure.

The predicted collision position 44 between the course obstruction 23 judged to appear and the course trajectory 50 of the host vehicle is the position marked with an x in the figure, based on the assumption that the obstruction proceeds perpendicular to the course trajectory 50 because there is no road shape constraining the roads around the course obstruction 23. If the other vehicle 21 of group number A2 is traveling forward, (Condition 2-1) is satisfied, but because the moving vehicle 21 enters the search boundary 43, (Condition 2-2) is not satisfied, and it is determined that no course obstruction will appear.

The warning determination unit 10 decides whether to operate the warning device 230 based on the predicted collision position output by the course obstruction appearance determination unit 9, the output values of the host vehicle motion detection unit 2, the output values of the map matching unit 8, and the gaze detection unit 3. For this decision it uses a warning determination parameter A composed of two variables: the deceleration D required to stop at the predicted collision position, and an accident occurrence parameter P calculated from the past traffic accident information around the predicted collision position obtained from the traffic accident information storage device 240 and the real-time traffic information obtained from the road traffic information receiving device 210. It also uses the gaze information obtained from the gaze detection unit. The deceleration D is calculated by the following equation.

D = V² / (2 μ g L)    (Equation 1)

Here, V [m/s] is the speed of the host vehicle, μ is the friction coefficient between the tires and the road surface, L [m] is the distance between the current position of the host vehicle and the predicted collision position, and g [m/s²] is the gravitational acceleration. The unit of D is [G]. From (Equation 1), the deceleration D is a variable whose value becomes large when the vehicle speed V is large, the distance L is small, and the friction coefficient μ is small. That is, as the required deceleration increases, it becomes more difficult to stop at the predicted collision position. It is generally said that the deceleration people find uncomfortable is 0.4 to 0.5 [G], and values above that correspond to sudden braking. In view of the above, the deceleration D is used as one index for the warning decision.

The speed V and the distance L can be calculated using the information obtained from the host vehicle motion detection unit 2 and the map matching unit 8. The friction coefficient μ is a coefficient whose value changes depending on tire tread wear, road surface conditions, and how wet the road surface is, and it is difficult to calculate its value with high accuracy. However, if information on whether the road surface is asphalt or gravel and weather information are available, the value of the friction coefficient μ can be roughly estimated. For example, values of μ = 0.7 for dry asphalt or concrete, μ = 0.5 for wet concrete, μ = 0.45 to 0.6 for wet asphalt, and μ = 0.55 for gravel roads are commonly used. Therefore, the value of the friction coefficient μ can be estimated by determining the type of road surface from the geographic information around the predicted collision position and determining how wet the road surface is from the weather information obtained from the road traffic information receiving device 210. With the values of the variables determined in this way, the deceleration D can be calculated by (Equation 1).
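
The sketch below computes this deceleration index using the reconstruction of (Equation 1) given above and the representative friction coefficients quoted in the text; the lookup-table structure, the default value, and the midpoint chosen for wet asphalt are illustrative assumptions.

```python
# Sketch of the required-deceleration index D [G]: it grows with speed and
# shrinks with distance and friction, using friction values quoted in the text.
G = 9.81  # gravitational acceleration [m/s^2]

FRICTION = {
    ("asphalt", "dry"): 0.7,
    ("concrete", "dry"): 0.7,
    ("concrete", "wet"): 0.5,
    ("asphalt", "wet"): 0.5,       # text gives 0.45 to 0.6; midpoint-ish assumed
    ("gravel", "dry"): 0.55,
}

def required_deceleration(speed_mps: float, distance_m: float,
                          surface: str = "asphalt", wetness: str = "dry") -> float:
    """Deceleration index D [G] needed to stop before the predicted collision."""
    mu = FRICTION.get((surface, wetness), 0.5)
    return speed_mps ** 2 / (2.0 * mu * G * distance_m)

if __name__ == "__main__":
    # 40 km/h (about 11.1 m/s), collision point 15 m ahead, wet asphalt
    print(round(required_deceleration(11.1, 15.0, "asphalt", "wet"), 2), "G")
```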

The accident occurrence parameter P is a parameter determined according to the past and current traffic conditions around the predicted collision position; it is set to a small value when the number of past accidents around the predicted collision position is small and there is no accident or congestion in the current traffic conditions, and to a large value when the number of past accidents is large and there is an accident or congestion in the current traffic conditions. The accident occurrence parameter is normalized to a value between 0 and 1. The accident occurrence parameter P can be obtained from the following equation by quantifying the past and current degrees of danger on the road as P1 and P2, respectively, for example on a ten-level scale according to the nature and number of accidents and the degree of congestion, using the past accident counts and the current road traffic information obtained from the traffic accident information storage device 240 and the road traffic information receiving device 210, and normalizing them to numbers between 0 and 1.

P = (α P1 + β P2) / (α + β)    (Equation 2)

Here, α and β are positive real numbers and are weight parameters that determine the proportions in which the past and current accident information contribute to the accident occurrence parameter P used as the criterion for the warning decision. For example, if the current information is judged to be more important for the warning decision than the past information, setting β larger than α yields an accident occurrence parameter P in which the current information is dominant. From the two indices, the calculated deceleration D and the accident occurrence parameter P, the warning determination parameter A used to decide the warning operation is calculated as follows.

(Equation 3: the warning determination parameter A calculated from the deceleration D and the accident occurrence parameter P; the formula is given as an image in the original publication)

The operation of the warning device 230 can be determined according to the value of the warning determination parameter A calculated by (Equation 3). Various operation patterns of the warning device 230 are possible, and this description does not limit them; for example, it is desirable to give warnings suited to the driver's perception using lamps, a display, speakers, or a buzzer, such as showing the blind spot area judged to be dangerous on the display only when the warning determination parameter A is equal to or greater than a predetermined threshold, or changing the buzzer sound according to the value of the warning determination parameter A.
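
The following sketch ties the pieces together. P follows the weighted combination suggested for (Equation 2); because the exact form of (Equation 3) is not reproduced above, the way D and P are combined into A below is a placeholder assumption (a simple product), and the threshold value is also illustrative.

```python
# Assumed sketch of the warning decision. The combination of D and P stands in
# for (Equation 3), whose exact form is not reproduced here, and the 0.4
# threshold is only an illustrative choice.
def accident_parameter(p_past: float, p_now: float,
                       alpha: float = 1.0, beta: float = 2.0) -> float:
    """Blend past (p_past) and current (p_now) danger levels, each in [0, 1]."""
    return (alpha * p_past + beta * p_now) / (alpha + beta)

def warning_parameter(deceleration_g: float, accident_p: float) -> float:
    """Placeholder combination of D and P standing in for (Equation 3)."""
    return deceleration_g * accident_p

def should_warn(deceleration_g: float, p_past: float, p_now: float,
                threshold: float = 0.4) -> bool:
    a = warning_parameter(deceleration_g, accident_parameter(p_past, p_now))
    return a >= threshold

if __name__ == "__main__":
    # High required deceleration near a spot with some accident history
    print(should_warn(deceleration_g=0.84, p_past=0.6, p_now=0.7))   # True
```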

As a method for deciding to issue a warning and for canceling a warning, the driver's gaze information obtained from the gaze detection unit 3 may be used. Using gaze information makes it possible to reduce the discomfort the driver feels when the warning device operates even though the driver is already looking at the blind spot area. In addition, if the warning can be canceled using gaze information, the influence on steering operation can be reduced. To determine that the driver has looked at the vicinity of the boundary of the blind spot area, it suffices to detect that the line of sight has been directed toward the vicinity of that boundary for a predetermined time. When it is determined that the driver has looked at it, the value of the warning determination parameter A is set to 0 for a predetermined time. After the predetermined time has elapsed, the value of the warning determination parameter A is calculated again, and thereafter this processing is repeated so that monitoring of the blind spot areas continues.
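
A small sketch of this gaze-based suppression, assuming the dwell time on the boundary is already available from the gaze detection unit; the dwell threshold and function name are illustrative assumptions.

```python
# Assumed sketch of the gaze-based suppression: if the driver's gaze has
# stayed near the blind spot boundary long enough, force the warning
# parameter to zero for a hold period before normal evaluation resumes.
def suppress_warning(gaze_on_boundary_s: float,
                     warning_parameter: float,
                     dwell_needed_s: float = 0.5) -> float:
    """Return the warning parameter to use after considering driver gaze."""
    if gaze_on_boundary_s >= dwell_needed_s:
        return 0.0            # driver has seen the blind spot; hold the alarm
    return warning_parameter

if __name__ == "__main__":
    print(suppress_warning(0.7, 0.56))   # 0.0  -> alarm held
    print(suppress_warning(0.1, 0.56))   # 0.56 -> alarm logic proceeds
```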

In addition, by installing a plurality of warning devices in the vehicle interior and operating them according to the location of the blind spot area from which the course obstruction appearance determination unit 9 has determined a course obstruction will appear, the driver can be informed of the location of the blind spot area. For example, if speakers and buzzers are installed at the front, rear, left, and right of the interior and it is determined that a course obstruction will appear from a blind spot area on the front left of the host vehicle, issuing the warning from the speaker and buzzer installed at the front left of the vehicle interior can direct the driver's attention to the front left of the host vehicle.
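
A minimal sketch of this directional selection, assuming the dangerous blind spot area's position is available in the vehicle frame; the quadrant mapping and naming are illustrative assumptions.

```python
# Assumed sketch: pick which in-cabin speaker/buzzer to drive from the bearing
# of the dangerous blind spot area in the vehicle frame (x forward, y left).
import math

def pick_speaker(blind_spot_x: float, blind_spot_y: float) -> str:
    bearing = math.degrees(math.atan2(blind_spot_y, blind_spot_x))
    side = "left" if bearing >= 0 else "right"
    front_back = "front" if abs(bearing) <= 90 else "rear"
    return f"{front_back}-{side}"

if __name__ == "__main__":
    print(pick_speaker(8.0, 3.0))    # blind spot ahead-left  -> "front-left"
    print(pick_speaker(-2.0, -1.0))  # blind spot behind-right -> "rear-right"
```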

As described above, by estimating the blind spot areas that arise around the host vehicle and computing the degree of danger of each blind spot area from geographic information, past and current traffic information, and weather information, collisions caused by blind spots can be prevented autonomously, without road-to-vehicle or vehicle-to-vehicle communication. Computing the degree of danger from multiple sources also allows the blind-spot warning to reflect the driving environment and the time of travel. Detecting the driver's gaze and warning according to how much attention the driver is paying to the blind spot area prevents the driver from overlooking it while reducing the sense of intrusion caused by the warnings. Furthermore, by providing a plurality of warning units in the vehicle and operating the unit corresponding to the location of the blind spot area judged dangerous, the driver can be informed of where that area is, which in turn encourages the driver's own hazard-avoidance decisions.

A diagram showing the vehicle surroundings monitoring device.
A block diagram showing the processing flow of the vehicle surroundings monitoring device.
A diagram showing a processing example of the obstacle recognition unit.
A diagram explaining the processing performed by the blind spot area estimation unit.
A diagram showing an example of determination by the path obstruction appearance determination unit 9.

Explanation of Symbols

100 Obstacle detection unit
110 GPS
120 Acceleration/yaw rate sensor
130 Direction indicator
140 Wheel speed sensor
150 Steering angle sensor
200 Map information storage device
210 Road traffic information receiving device
220 Arithmetic unit (computing device)
230 Warning device
240 Traffic accident information storage device
300 Vehicle surroundings monitoring device

Claims (8)

1. A vehicle surroundings monitoring device comprising:
a warning unit that gives a warning to a driver;
an obstacle detection unit that detects obstacles around a host vehicle as a sequence of detection points and detects the relative distance, azimuth angle, and relative speed between the host vehicle and each detection point;
a host vehicle motion detection unit that detects the traveling direction and traveling speed of the host vehicle;
a host vehicle course trajectory estimation unit that estimates the course trajectory of the host vehicle based on the traveling direction and traveling speed detected by the host vehicle motion detection unit;
a blind spot area estimation unit that estimates a blind spot area caused by an obstacle existing around the host vehicle;
a path obstruction appearance determination unit that estimates the position at which the host vehicle would collide with a path obstruction, on the assumption that a path obstruction blocking the course trajectory estimated by the host vehicle course trajectory estimation unit appears from the blind spot area estimated by the blind spot area estimation unit; and
a warning determination unit that computes the degree of danger of the blind spot area based on the distance between the estimated collision position and the host vehicle and on the traveling speed detected by the host vehicle motion detection unit, and determines whether to operate the warning unit according to the degree of danger,
wherein the warning unit gives the warning to the driver based on the determination of the warning determination unit.
2. The vehicle surroundings monitoring device according to claim 1, further comprising an obstacle recognition unit that, when the plurality of detection points acquired by the obstacle detection unit includes detection points whose distance from an adjacent detection point is within a predetermined threshold, judges those adjacent detection points to be a detection point sequence obtained from the same obstacle,
wherein, within the region that contains the detection point sequence and is bounded by the two half-lines originating at the viewpoint of the obstacle detection unit and passing through the detection points at the two ends where the detection point sequence becomes discontinuous, the blind spot area estimation unit judges the region excluding the area enclosed by the detection point sequence to be a blind spot area.
3. The vehicle surroundings monitoring device according to claim 1 or 2, further comprising a map information storage device that stores geographic information about the earth's surface and a host vehicle position estimation unit that estimates the travel position of the host vehicle,
wherein the path obstruction appearance determination unit takes as its processing target the boundary, near the host vehicle course trajectory, of a blind spot area existing within a predetermined range of the host vehicle, and judges that a path obstruction will appear when the geographic information near that boundary, obtained from the map information storage device based on the travel position of the host vehicle, satisfies a geographic condition under which a path obstruction can exist and no obstacle that would block the obstruction's path toward the host vehicle course trajectory exists near the boundary of the blind spot area.
4. The vehicle surroundings monitoring device according to any one of claims 1 to 3, further comprising a deceleration calculation unit that calculates the distance between the position of the host vehicle and the collision position and calculates the deceleration required for the host vehicle to come to rest at the collision position based on the calculated distance, the vehicle speed, and the friction coefficient between the road surface and the tires,
wherein the warning determination unit gives the warning to the driver by using the calculated deceleration as the degree of danger of the blind spot area.
5. The vehicle surroundings monitoring device according to claim 4, further comprising a friction coefficient estimation unit that estimates the value of the friction coefficient between the road surface and the tires based on road surface information of the host vehicle's travel route obtained from the map information storage device and on weather information around the host vehicle obtained from a road traffic information receiving device that receives road traffic information, such as accidents, congestion, traffic regulations, and weather, for the roads around the host vehicle,
wherein the deceleration calculation unit calculates the deceleration based on the friction coefficient estimated by the friction coefficient estimation unit.
6. The vehicle surroundings monitoring device according to claim 5, further comprising an accident occurrence parameter calculation unit that, based on past and current traffic accident information for the area around the host vehicle obtained from a traffic accident information storage device storing past traffic accident statistics and from the road traffic information receiving device, calculates an accident occurrence parameter that takes a small value when few accidents have occurred in the past and no accident or congestion is present in the current traffic conditions, and a large value when many accidents have occurred in the past and an accident or congestion is present in the current traffic conditions,
wherein the warning determination unit gives the warning to the driver by calculating a degree of danger that takes the calculated accident occurrence parameter and the deceleration as independent variables.
7. The vehicle surroundings monitoring device according to any one of claims 1 to 6, further comprising a gaze detection unit that detects the driver's gaze,
wherein, when the warning determination unit detects that the gaze is directed toward a blind spot area from which a path obstruction blocking the host vehicle course has been judged to appear, the warning determination unit gives the warning to the driver with the degree of danger of that blind spot area reduced.
8. The vehicle surroundings monitoring device according to any one of claims 1 to 7, wherein the host vehicle is provided with a plurality of warning units, and the warning determination unit informs the driver of the location of a blind spot area by selecting, from among the plurality of warning units, the warning unit that issues the warning according to the location of the blind spot area from which the path obstruction blocking the host vehicle course has been judged to appear.
JP2007252957A 2007-09-28 2007-09-28 Vehicle surrounding monitoring device Pending JP2009086788A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007252957A JP2009086788A (en) 2007-09-28 2007-09-28 Vehicle surrounding monitoring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007252957A JP2009086788A (en) 2007-09-28 2007-09-28 Vehicle surrounding monitoring device

Publications (1)

Publication Number Publication Date
JP2009086788A true JP2009086788A (en) 2009-04-23

Family

ID=40660175

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007252957A Pending JP2009086788A (en) 2007-09-28 2007-09-28 Vehicle surrounding monitoring device

Country Status (1)

Country Link
JP (1) JP2009086788A (en)
