JP2004144671A - Car-mounted radar device - Google Patents

Car-mounted radar device

Info

Publication number
JP2004144671A
Authority
JP
Japan
Prior art keywords
vehicle
image
line segment
area
direction vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2002311680A
Other languages
Japanese (ja)
Other versions
JP3862015B2 (en)
Inventor
Koji Horibe
堀部 剛治
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omron Corp
Original Assignee
Omron Corp
Omron Tateisi Electronics Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omron Corp, Omron Tateisi Electronics Co filed Critical Omron Corp
Priority to JP2002311680A (granted as JP3862015B2)
Priority to US10/681,840 (granted as US6831591B2)
Publication of JP2004144671A
Application granted
Publication of JP3862015B2
Status: Expired - Fee Related

Classifications

    • G01S 13/867: Combination of radar systems with cameras
    • G01S 13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/4026: Means for monitoring or calibrating of parts of a radar system; antenna boresight
    • G01S 7/4034: Antenna boresight in elevation, i.e. in the vertical plane
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 2013/93271: Sensor installation details in the front of the vehicles
    • G01S 3/7864: T.V. type tracking systems
    • G01S 7/403: Antenna boresight in azimuth, i.e. in the horizontal plane
    • G01S 7/4091: Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder, during normal radar operation
    • G01S 7/4972: Alignment of sensor

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a car-mounted radar device that rarely misses a preceding vehicle, by correcting the optical axis of a laser radar 2 to the optimum direction even when the road slopes or follows an S-curve.
SOLUTION: The device is provided with a camera 1 for acquiring an image including the road surface ahead of or behind the host vehicle, a line segment detection part 23 for detecting, from the image, line segments along the lane in which the host vehicle runs, and a direction vector detection part 24 for finding the direction vector, in actual coordinates, of the line segment obtained by the detection part 23. The direction of the optical axis is corrected vertically and horizontally so as to coincide with the direction vector obtained by the detection part 24.
COPYRIGHT: (C)2004,JPO

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a radar device mounted on a vehicle.
[0002]
[Prior art]
In recent years, vehicles are often provided with functions such as monitoring a preceding vehicle and following a preceding vehicle in order to improve the ease and safety of driving. To realize such functions, a radar device having a sensor such as a laser radar or a millimeter-wave radar, which acquires at least the position information of a detection target (a preceding vehicle, or an obstacle such as a guardrail) existing in front of or behind the host vehicle, is mounted on the vehicle.
In such an on-vehicle radar device, it is preferable to appropriately correct the central axis of the scanning range or of the detection range (hereinafter sometimes called the optical axis) so that the preceding vehicle is not missed even when traveling on a curve. Patent Document 1 discloses a conventional technique for this purpose.
That technique addresses the risk of missing a preceding vehicle that is traveling on a curve or is not running in the center of the lane: the optical axis is corrected, based on the curvature of the curve and the lateral positions of the host vehicle and the preceding vehicle, so that the preceding vehicle is not overlooked.
[0003]
[Patent Document 1]
JP-A-9-218265
[0004]
[Problems to be solved by the invention]
However, the above prior art assumes that the curve curvature is substantially constant, as on a flat road without a slope or on a highway. Where the road is flat and the curvature is substantially constant, the preceding vehicle is rarely missed; but where there is a slope, as in an urban area, or where the curvature changes finely, the preceding vehicle can still be missed even after the optical axis is corrected as in the prior art. This is because the prior art determines the direction of the optical axis to be corrected from the curve curvature and the lateral positions of the host vehicle and the preceding vehicle, and therefore cannot correct the optical axis in the vertical (up-down) direction, since it cannot determine the ideal vertical orientation of the optical axis; the vertical position of the optical axis remains fixed, so on a slope the preceding vehicle may leave the detection area vertically and become undetectable. Moreover, the prior art computes the optimal optical axis by treating even an S-shaped curve as having a constant curvature, so when the curvature changes finely a large lateral (horizontal) error arises, namely the difference between the computed optical axis direction and the actually optimal one, and the preceding vehicle easily leaves the corrected detection area laterally.
Accordingly, it is an object of the present invention to provide an in-vehicle radar device in which the center axis of a sensor is corrected in an optimal direction even when there is a gradient or an S-shaped curve, and it is difficult to miss a preceding vehicle.
[0005]
[Means for Solving the Problems]
An on-vehicle radar device according to the present invention is an on-vehicle radar device having a sensor for acquiring at least positional information of a detection target existing in front of or behind a host vehicle,
An imaging unit for acquiring an image including a road surface in front of or behind the host vehicle, and a control unit for correcting the direction of the center axis of the sensor based on the image obtained by the imaging unit,
The control means includes:
a line segment detection unit that detects, from the image, a line segment along the lane in which the host vehicle is traveling; a direction vector detection unit that determines the direction vector, in real coordinates, of the line segment obtained by the line segment detection unit; and a correction control unit that executes control to correct the direction of the central axis up, down, left, and right so as to match the direction vector obtained by the direction vector detection unit.
[0006]
Here, the "detection target" basically means a preceding or following vehicle (including motorcycles and other vehicles besides four-wheeled ones), but obstacles other than vehicles (guardrails and the like) may also be included.
The "sensor" is, for example, one that irradiates a detection area with electromagnetic waves or sound waves and detects the detection target from the reflected waves (specifically, a laser radar, a millimeter-wave radar, or the like). The sensor need not be of the scanning type; it may be of a non-scanning type, provided it has directivity and a specific detection area.
The “center axis of the sensor” is the center axis of the detection area, and in the case of the scanning type, means the center axis of the scanning range.
The "line segment along the lane in which the host vehicle is traveling" is a line (a so-called edge) forming the image of a mark on the road surface on one or both sides of that lane (a white line, yellow line, double line, broken line, or the like), of a guardrail, a median strip, a protective wall, or a boundary with a sidewalk, or an optical flow obtained from the temporal change of such images, taken in the tangential direction of the lane in which the host vehicle is traveling.
[0007]
The detection of the line segment (its extraction from the image) can be realized with considerable reliability by so-called edge extraction, provided the contrast is clear, as with a white line. However, to improve reliability by excluding image components such as dirt on the road surface or buildings around the road, or to detect low-contrast features (for example, the edge of a median strip) with considerable reliability by lowering the detection threshold, it is preferable to limit in advance the positions and regions in the image where the line segment is detected, or the angle range of the segment, to plausible conditions. For example, as shown in FIG. 5(c), the detection region and angles may be limited to the range of 30 to 60 degrees counterclockwise from the lower edge of the frame with the lower-left corner of the image frame as origin, and to the range of 120 to 150 degrees counterclockwise from the extension of the lower edge with the lower-right corner as origin. It is easy and natural to set the imaging direction and angle of view of the imaging means (camera) so that the positions and directions of the edges on both sides of the lane in which the vehicle travels (the white lines, median strip, and so on delimiting the lane) fall within such conditions; then, by excluding from the detection range anything that falls outside these conditions, only the predetermined line segments described above can be detected more reliably and easily.
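As a rough illustration of this gating, the sketch below (not from the patent; the helper names are hypothetical) accepts a candidate segment only if its angle falls in the 30-60 degree band expected for a left lane edge or the 120-150 degree band expected for a right one, assuming image coordinates with the origin at the bottom-left and y increasing upward:

```python
import math

def segment_angle_deg(x1, y1, x2, y2):
    """Angle of a segment in degrees, measured counterclockwise from the
    image's bottom edge (the x axis), folded into the range [0, 180)."""
    ang = math.degrees(math.atan2(y2 - y1, x2 - x1))
    return ang % 180.0

def is_plausible_lane_edge(x1, y1, x2, y2, frame_width):
    """Keep a segment only if it lies in the angle band expected for the
    left lane edge (30-60 deg) or the right lane edge (120-150 deg),
    using the segment's horizontal midpoint to decide which side it is on."""
    ang = segment_angle_deg(x1, y1, x2, y2)
    mid_x = 0.5 * (x1 + x2)
    if mid_x < frame_width / 2:      # left half of the frame: left edge
        return 30.0 <= ang <= 60.0
    else:                            # right half of the frame: right edge
        return 120.0 <= ang <= 150.0
```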
[0008]
According to the on-vehicle radar device of the present invention, the central axis of the sensor is corrected toward the direction, in real coordinates, of the line segment along the lane in which the host vehicle is traveling in the image ahead of or behind the host vehicle; therefore the detection target to be detected (particularly a preceding or following vehicle) is hard to miss even on roads of non-constant curvature or with slopes. For example, if a line segment formed by a white line or the like at a predetermined distance from the host vehicle (a predetermined vertical position in the image) is detected and the central axis of the sensor is corrected toward the real-coordinate direction of that segment, the central axis can be corrected up, down, left, and right to match the three-dimensional direction of the road ahead of or behind the host vehicle, regardless of the curvature or slope of the road the vehicle is on at that moment. In particular, in the present invention the central axis direction of the sensor moves up and down with the gradient of the road ahead or behind, so the vertical correction that was not performed conventionally becomes possible, and a preceding vehicle or the like is hard to miss even on a sloping road.
To cope well with S-shaped curves and undulating roads, it is preferable, as in the aspects described later, to detect a plurality of the line segments and, from their directions, to identify the optimal direction vector that overlooks the entire road ahead of or behind the host vehicle.
[0009]
Next, in a preferred aspect of the present invention, the line segment detection unit has a function of obtaining, as the line segment, one or more pairs on both sides of the lane in which the own vehicle is traveling,
The direction vector detection unit obtains a vanishing point on the image as an intersection of the extension lines of the pair of line segments, and obtains the direction vector from the vanishing point.
In such a mode, even in a simple configuration in which one camera is provided as an imaging unit, the direction vector can be obtained by a simple process as shown in an embodiment described later.
[0010]
When the direction vector is obtained from the vanishing point as described above, it is preferable to adopt the following mode.
That is, the first aspect includes an area dividing unit that divides an image obtained by the imaging unit into a plurality of small areas,
The line segment detection unit executes the detection of the line segment for each of the small areas, detects a plurality of pairs of the line segment,
The direction vector detection unit obtains a plurality of vanishing points on the image based on these pairs of line segments and, from among them, selects the vanishing point that maximizes the area of the region common to (a) the triangular region on the image whose vertices are the vanishing point and points on the pair of line segments located nearest the camera and (b) the region on the image of the lane in which the host vehicle is traveling (hereinafter sometimes called the road region); the direction vector is then obtained from the selected vanishing point. The road region can be obtained, for example, as the region surrounded by the plurality of line segments.
[0011]
Next, in a second aspect, the line segment detection unit detects the pair of line segments in a lowermost band-like region along a lower side of the image,
The direction vector detection unit obtains a vanishing point on the image from the pair of line segments in the lowermost band-shaped region, discretizes that vanishing point into a plurality of coordinates on the image arranged over a rectangular region centered on it, selects from these coordinates the one that maximizes the area of the region common to the triangular region on the image (whose vertices are the coordinate and points on the pair of line segments in the lowermost band-shaped region) and the region on the image of the lane in which the host vehicle is traveling, and obtains the direction vector from the selected coordinate.
[0012]
With the first or second aspect, the central axis of the sensor is corrected toward the point, among a plurality of vanishing points (or among a plurality of coordinates obtained by discretizing one vanishing point), from which the entire road can best be overlooked; hence the detection target to be detected (particularly a preceding or following vehicle) is even less likely to be missed on roads whose curvature changes in a complicated way, such as S-shaped curves, or on sharply undulating roads.
In addition, in the case of the second mode, since only one vanishing point is calculated, there is an advantage that the calculation load is reduced.
[0013]
In the first aspect, it is preferable that the direction vector detection unit, based on the position information of the detection target (at least a preceding or following vehicle) obtained by the sensor, identifies small regions where the detection target overlaps or may overlap the line segment on the image, and excludes those small regions from the computation of the vanishing points. This prevents the situation where, for example, a preceding vehicle at the edge of a lane (including the boundary between lanes) hides the line segment at that position so that it cannot be detected properly and the correction fails or degrades.
[0014]
The correction of the present invention may be performed constantly (for example, periodically) while the vehicle is running or while the ignition switch is on, but it may also be performed only when necessary, as follows, to make processing more efficient. That is, the control processing means may start a new control for correcting the direction of the central axis when any one or more of the following conditions is detected: the horizontal position of the detection target has changed, the speed of the host vehicle has changed, the host vehicle has tilted with respect to the horizontal, or the host vehicle has changed lanes.
[0015]
BEST MODE FOR CARRYING OUT THE INVENTION
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
(First Embodiment)
FIG. 1 illustrates the configuration of the on-vehicle radar device of this embodiment: FIG. 1(a) is a block diagram showing the overall configuration, and FIG. 1(b) is a functional block diagram of the image processing unit.
[0016]
First, the overall configuration will be described.
As shown in FIG. 1A, the present apparatus includes a camera 1 (imaging means), a sensor 2, an image processing unit 3, and a controller 4.
The camera 1 is imaging means composed of a well-known device such as a CCD or CMOS sensor; in this case only one camera is provided. The camera 1 is mounted facing obliquely downward toward the front of the vehicle (or obliquely downward toward the rear) so as to obtain an image including the road surface ahead of (or behind) the vehicle (see FIG. 6).
The sensor 2 is, for example, a laser radar.
The image processing unit 3 includes a circuit including a microcomputer (hereinafter, referred to as a microcomputer), and obtains and outputs an optimal direction of the optical axis of the sensor 2 from an image obtained by the camera 1.
The controller 4 controls an actuator (not shown) to change the optical axis of the sensor 2 in the optimal direction output by the image processing unit 3, and is also composed of a circuit including a microcomputer. Note that the image processing unit 3 and the controller 4 may be configured as an integrated unit.
[0017]
Next, details of the image processing unit 3 will be described. The image processing unit 3 functionally includes the elements shown in FIG. 1B (a distortion correction unit 11, a band region division unit 12, a white line detection unit 13, a vanishing point detection unit 14, and a scanning region calculation unit 15).
Here, the distortion correction unit 11 corresponds to step S1 in FIG. 2 described below, and corrects image data obtained by the camera 1 so as to eliminate the influence of distortion of the lens of the camera 1.
The band area dividing unit 12 corresponds to step S2 in FIG. 2 described below; to cope accurately with road conditions such as S-shaped curves whose curvature is not constant, it divides the image into a plurality of small regions containing the line segments (in this case, horizontally long band-shaped regions).
The white line detection unit 13 corresponds to steps S3 to S5 in FIG. 2 described below, and detects the white lines (yellow or broken lines may be included) on both sides of the lane on the road on which the vehicle is traveling as straight line segments in each region.
The vanishing point detector 14 corresponds to step S6 in FIG. 2 described later, and obtains a vanishing point on an image from a line segment detected for each region.
The scanning area calculation unit 15 corresponds to steps S8 to S13 in FIG. 2 described later, and obtains an optimal optical axis direction from vanishing points calculated for each area.
[0018]
Next, the operation of the present apparatus (mainly the processing contents of the image processing unit 3) will be described with reference to FIG.
The image processing unit 3 repeats the next series of processing (steps S1 to S14), for example, periodically.
That is, first, in step S1, the image data of the camera 1 is read and distortion correction is performed. Specifically, the pixel coordinates are converted from (x, y) to (x', y') by the relational expression shown in Equation 1, in which κ is a negative constant.
[0019]
(Equation 1)
Figure 2004144671
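Equation 1 itself is not reproduced in this text; a minimal sketch consistent with the description (a one-parameter radial model with a negative constant κ) might look like the following, where the distortion centre (cx, cy) is an assumption:

```python
import numpy as np

def undistort(points, kappa, cx, cy):
    """Map pixel coordinates (x, y) to corrected coordinates (x', y') with a
    one-parameter radial model: x' = cx + (x - cx) * (1 + kappa * r^2), and
    likewise for y. A negative kappa corrects barrel distortion.
    `points` is an (N, 2) array; (cx, cy) is the assumed distortion centre."""
    p = np.asarray(points, dtype=float) - (cx, cy)
    r2 = (p ** 2).sum(axis=1, keepdims=True)   # squared radius per point
    return p * (1.0 + kappa * r2) + (cx, cy)
```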
[0020]
Next, in step S2, as shown for example in FIG. 3(a), the image data is set up so that the image is divided into N horizontally long band-shaped regions covering the road. "Covering the road" means including at least the lower part of the image frame, where the image of the road lies, as shown in FIG. 3(a). In this case the width of each band-shaped region equals the width of the whole image, and its height is the height of the lower part of the image divided into N equal parts. In this example N = 7, that is, the image is divided into seven band-shaped regions 1 to 7. The band-shaped regions here correspond to the small regions of the present invention.
[0021]
Next, the processing of steps S3 to S6 is performed for each band-shaped area.
First, in step S3, edge extraction is performed using an edge extraction filter, for example a rightward Sobel filter as shown in Equation 2. In Equation 2, f(x, y) is the pixel intensity value at coordinates (x, y) on the image, and g(x, y) is the filtered value (the difference between the intensity values of neighboring pixels).
[0022]
(Equation 2)
Figure 2004144671
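Equation 2 is likewise not reproduced here; assuming it denotes the standard rightward (vertical-edge) Sobel kernel, the filtering step could be sketched as:

```python
import numpy as np
from scipy.ndimage import convolve

# Rightward (vertical-edge) Sobel kernel: responds to the horizontal
# difference of neighbouring pixel intensities, as Equation 2 describes.
SOBEL_RIGHT = np.array([[-1, 0, 1],
                        [-2, 0, 2],
                        [-1, 0, 1]])

def edge_image(f):
    """f: 2-D array of pixel intensities; returns the edge image g."""
    return convolve(f.astype(float), SOBEL_RIGHT, mode="nearest")
```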
[0023]
Next, in step S4, white line candidates are extracted. In this extraction, the edge image consisting of the g(x, y) values computed by Equation 2 is binarized with a predetermined threshold (for example, 128 in the case of 256 gray levels), and the coordinates of the inner sides of the white lines on both sides of the road (white line candidate points) are obtained. Specifically, scanning the edge image from left to right, if a coordinate exceeds the threshold and another coordinate falls back below it within a specified width B, the first coordinate (where the threshold was exceeded) is presumed to be the left end of the left white line, and the later coordinate (where the value fell below the threshold) its right end (the inner side of the left white line). The left end of the right white line (its inner side) is obtained by the same processing, and the coordinates of these inner sides are taken as white line candidate points. Usually, as many candidate points are obtained as there are pixel rows in the band-shaped region. The restriction by the specified width B excludes stripes much wider than an actual white line, so that only white-line edges are detected with high reliability.
In this white line candidate extraction, it is preferable to limit in advance the positions, regions, or angle ranges in the image where white lines are detected to plausible conditions, so that image components such as dirt on the road surface or the contours of buildings around the road are not erroneously detected as white lines. For example, the limiting conditions shown in FIG. 5(c), described above, may be set before performing the extraction.
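A simplified sketch of the candidate scan described above (one image row, stripe detection only; the function name and the exact skip logic are illustrative, not the patent's):

```python
def white_line_candidates(edge_row, threshold, width_b):
    """Scan one row of the edge image from left to right. When a pixel
    exceeds `threshold` and the response drops back below it within
    `width_b` pixels, the pair is taken as the left and right edges of a
    white-line stripe, and the inner (right-hand) coordinate is kept as a
    candidate point. Returns the list of candidate x positions."""
    candidates = []
    x, n = 0, len(edge_row)
    while x < n:
        if edge_row[x] > threshold:              # stripe left edge found
            for w in range(1, width_b + 1):
                if x + w >= n:
                    break
                if edge_row[x + w] < threshold:  # stripe right edge found
                    candidates.append(x + w)     # inner side of the stripe
                    break
            x += width_b                         # skip past this stripe
        x += 1
    return candidates
```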
[0024]
Next, in step S5, a Hough transform is applied to the white line candidate points, and the two best-fitting straight lines are obtained as the tangents to the inner sides of the white lines in each small region (the line segments of the present invention, or their extensions).
The Hough transform finds, for the coordinates (x, y) of each candidate point on the image (the x-y plane), the Hough curve on the ρ-θ plane given by Equation 3, and extracts straight lines on the image from the coordinates of the intersections of these Hough curves. In this case, the two best-fitting straight lines are identified by finding the two intersections (ρa, θa) and (ρb, θb) through which the largest numbers of the Hough curves obtained from the white line candidate points pass; once the values of ρ and θ in Equation 3 are fixed, the straight line on the image is determined. FIG. 5(b) shows the positional relationship between (x, y) and (ρ, θ).
[0025]
[Equation 3]
Figure 2004144671
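A minimal Hough-voting sketch based on ρ = x·cosθ + y·sinθ; a real implementation would suppress neighbouring accumulator cells before taking the two strongest peaks, which this sketch omits:

```python
import numpy as np

def hough_peaks(points, img_diag, n_theta=180, n_rho=200, n_peaks=2):
    """Vote candidate points (x, y) into a (rho, theta) accumulator using
    rho = x*cos(theta) + y*sin(theta) (the form of Equation 3) and return
    the `n_peaks` strongest cells, i.e. the best-fitting lines."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-img_diag, img_diag, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)   # one rho per theta
        idx = np.clip(np.searchsorted(rhos, r), 0, n_rho - 1)
        acc[idx, np.arange(n_theta)] += 1
    flat = np.argsort(acc, axis=None)[::-1][:n_peaks]  # strongest cells
    ri, ti = np.unravel_index(flat, acc.shape)
    return [(rhos[i], thetas[j]) for i, j in zip(ri, ti)]
```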
[0026]
Next, in step S6, the coordinates of the vanishing point on the image are calculated from the data of the two straight lines obtained in step S5. Since the vanishing point is the intersection of the two straight lines, it is obtained from the data (ρa, θa) and (ρb, θb) by Equation 4.
[0027]
(Equation 4)
Figure 2004144671
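Since each line satisfies x·cosθ + y·sinθ = ρ, the vanishing point of Equation 4 is the solution of a 2×2 linear system; a sketch:

```python
import numpy as np

def vanishing_point(rho_a, theta_a, rho_b, theta_b):
    """Intersection (X, Y) of the two Hough lines
    x*cos(theta) + y*sin(theta) = rho, solved as a 2x2 linear system."""
    A = np.array([[np.cos(theta_a), np.sin(theta_a)],
                  [np.cos(theta_b), np.sin(theta_b)]])
    b = np.array([rho_a, rho_b])
    return np.linalg.solve(A, b)   # image coordinates of the vanishing point
```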
[0028]
Next, in step S7, it is determined whether or not the number of executions of steps S3 to S6 is less than N (that is, whether or not the processing of steps S3 to S6 has not been completed for all band-shaped areas). If it is less, the process returns to step S3 and the process is repeated for the next band-like region. If it is equal to or more than N (if processing is completed for all band-shaped areas), the process proceeds to step S8.
FIG. 3(b) shows an example of the vanishing points obtained by the above processing. In the figure, the thick lines are the white lines of the road, the broken lines are the straight lines extending the inner sides (line segments) of the white lines obtained for each small region, and the numerals (1) to (6) indicate the vanishing points (in this case the seventh vanishing point (7) could not be obtained).
[0029]
Next, in step S8, road region detection is performed: based on the coordinates of the inner sides of the white lines obtained in steps S3 and S4, the road region surrounded by the two white lines (for example, the hatched region in FIG. 4(a)) is detected.
Next, the processing of steps S9 to S10 is performed for each band-shaped area.
First, in step S9, triangular regions with a fixed base are detected, each having one of the (at most N) vanishing points as its upper vertex and points on the white lines on both near sides as its lower vertices. That is, for band-shaped region 1, as shown hatched in FIG. 4(b), a triangular region is detected whose upper vertex is the vanishing point obtained for band-shaped region 1 in step S6 and whose lower vertices (the two ends of the base) are the lower ends of the two straight lines (the inner-edge lines of the two white lines) obtained for band-shaped region 1 in step S5. For band-shaped region 2 onward, triangular regions are detected in the same way, each with its own vanishing point as the upper vertex and the lower ends of the two straight lines obtained for band-shaped region 1 as the lower vertices. In the case of FIG. 4(b), the inner lower ends of the two white lines lie at the two lower corners of the image frame, so the base of the triangle coincides with the lower side of the frame, but this is not always so: if the lower ends of the white lines lie inside the frame width, the base is a segment shorter than the lower side of the frame. Nor do the lower vertices have to be the lower ends of the inner edges of the white lines in band-shaped region 1; they may be, for example, the upper ends of the white lines in band-shaped region 1 (points on the boundary between band-shaped regions 1 and 2), or points on the outer edges of the white lines. If no white line exists in band-shaped region 1, points on the white lines in band-shaped region 2, one level up, may be used as the lower vertices of the triangle.
[0030]
Next, in step S10, an AND area (area of an overlapping portion) between the road area obtained in step S8 and the triangular area obtained in step S9 is obtained.
Next, in step S11, similarly to step S7, it is determined whether or not the processing in steps S9 to S10 has been completed for all the band-shaped regions. If the processing has been completed for all the band-shaped regions, the process proceeds to step S12. If not, the process returns to step S9.
Then, in step S12, a band-shaped region Nmax in which the AND area obtained in step S10 is maximized is obtained. In the following, description will be made on the assumption that the band-shaped region 6 has been obtained as the band-shaped region Nmax.
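One way to compute the AND area of steps S9-S12 is to rasterize each triangle and intersect it with a boolean road mask; the sketch below (mask-based, with hypothetical helper names) assumes the road region of step S8 is already available as such a mask:

```python
import numpy as np

def triangle_mask(shape, apex, base_left, base_right):
    """Boolean mask of the triangle with the given apex (vanishing point)
    and base corners (points on the near white lines), all as (x, y)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    def side(p, q):
        # Signed cross product of edge p->q with vector p->pixel
        return (q[0] - p[0]) * (ys - p[1]) - (q[1] - p[1]) * (xs - p[0])
    s1 = side(base_left, base_right)
    s2 = side(base_right, apex)
    s3 = side(apex, base_left)
    # Inside if on the same side of all three edges (either winding)
    return ((s1 >= 0) & (s2 >= 0) & (s3 >= 0)) | \
           ((s1 <= 0) & (s2 <= 0) & (s3 <= 0))

def best_strip(vanishing_points, road_mask, base_left, base_right):
    """Index of the strip whose triangle overlaps the road region the most
    (the AND area of steps S10-S12); entries of None mean no vanishing
    point was found for that strip."""
    areas = []
    for vp in vanishing_points:
        if vp is None:
            areas.append(-1)
            continue
        tri = triangle_mask(road_mask.shape, vp, base_left, base_right)
        areas.append(int(np.logical_and(tri, road_mask).sum()))
    return int(np.argmax(areas))
```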
[0031]
Next, in step S13, the coordinates (X6, Y6) of the vanishing point of the band-shaped region Nmax (band-shaped region 6) are converted into the road coordinate system (xr, yr, zr), and the direction vector V6 of the vanishing point in road coordinates is obtained. Since the sensor 2 (a laser radar or the like) is usually attached to the radiator of the vehicle (above the bumper), the coordinate system of the sensor 2 strictly differs slightly from the road coordinate system; however, when correcting the direction of the sensor 2, there is no particular problem in treating the two coordinate systems as identical and correcting the direction with road coordinate data.
[0032]
With the signs and directions of the coordinate systems defined as shown in FIG. 6, the relational expressions of Equations 5 and 6 hold between the road coordinate system (xr, yr, zr), the camera coordinate system (xc, yc, zc), and the image coordinate system (X, Y). Here, R is a 3 × 3 matrix and T is a 3 × 1 matrix, both set in advance by camera calibration, and F in Equation 6 is the focal length of the camera 1.
[0033]
(Equation 5)
Figure 2004144671
[0034]
(Equation 6)
Figure 2004144671
[0035]
Therefore, the coordinates of the vanishing point (X6, Y6) can be converted into the road coordinate system (xr, yr, zr) as follows.
That is, (X6, Y6) is substituted for (X, Y) in equation (6), and is first converted to the camera coordinate system (xc, yc, zc) as shown in equation (7). In Equation 7, zc = k.
[0036]
(Equation 7)
Figure 2004144671
[0037]
Next, by substituting Equation 7 into Equation 5, the coordinates can be converted into the road coordinate system (xr, yr, zr) as shown in Equation 8. The value of k in Equation 8 can be computed by applying a suitable constraint; for example, the measurable distance of the laser radar (for example, 150 m) may be substituted for zr in Equation 8, k solved for, and fixed at that value.
[0038]
(Equation 8)
Figure 2004144671
[0039]
When a unit vector is obtained from the coordinates of the vanishing point in the road coordinate system obtained as described above, the unit vector is the direction vector V6.
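Equations 5-8 are not reproduced in this text; assuming the usual pinhole projection (X = F·xc/zc, Y = F·yc/zc) and a calibrated rigid transform road = R·cam + T, the conversion with the free scale k pinned by zr = 150 m could be sketched as:

```python
import numpy as np

def vanishing_point_to_direction(X, Y, F, R, T, z_range=150.0):
    """Convert an image vanishing point (X, Y) into a unit direction
    vector in road coordinates. Assumes a pinhole model X = F*xc/zc,
    Y = F*yc/zc and a rigid transform road = R @ cam + T fixed by camera
    calibration; the free scale k (= zc) is pinned by requiring zr to
    equal z_range, e.g. the radar's maximum measurable distance."""
    cam_dir = np.array([X / F, Y / F, 1.0])   # camera-frame ray for zc = 1
    rz = R[2] @ cam_dir                       # z row of R applied to the ray
    k = (z_range - T[2]) / rz                 # scale so that zr == z_range
    road = R @ (k * cam_dir) + T              # vanishing point in road coords
    return road / np.linalg.norm(road)        # unit vector V6
```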
Finally, in step S14, the optimum position of the optical axis is obtained from the direction vector V6 and output to the controller 4. In response to this, the controller 4 moves the optical axis to the commanded optimal position and corrects the direction of the optical axis so that it matches the direction vector V6.
FIG. 5(a) shows the scanning range (scanning region) of the laser radar 2. Here, the coordinate origin O is the center coordinate of scanning in the normal state (without correction). (-Sx, Sx) and (-Sy, Sy) define the scanning range in the normal state, and (-Sxmax, Sxmax) and (-Symax, Symax) the maximum scanning range. That is, the maximum correction amount by which the center (optical axis) of the scanning range of the laser radar 2 can be moved in the xl direction (lateral direction) is Sxmax - Sx, and the maximum correction amount in the yl direction (vertical direction) is Symax - Sy.
[0040]
Therefore, the optical axis position (xL, yL) in the xl and yl directions, obtained from the components (xr, yr, zr) of the unit vector V6 by the relational expression of Equation 9, is the optimal position of the optical axis.
[0041]
(Equation 9)
Figure 2004144671
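Equation 9 itself is not reproduced, so the mapping below is only one plausible reading: take the azimuth and elevation angles of V6 and clamp them to the available margins Sxmax - Sx and Symax - Sy (the angular units are an assumption):

```python
import numpy as np

def optical_axis_command(v, sx, sx_max, sy, sy_max):
    """Turn the unit direction vector v = (xr, yr, zr) into scan-centre
    offsets (xL, yL), clamped to the mechanically available margins.
    This is an illustrative reading of Equation 9, not its reproduction."""
    xr, yr, zr = v
    xL = np.clip(np.degrees(np.arctan2(xr, zr)), -(sx_max - sx), sx_max - sx)
    yL = np.clip(np.degrees(np.arctan2(yr, zr)), -(sy_max - sy), sy_max - sy)
    return xL, yL
```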
[0042]
According to the radar device described above, the central axis of the sensor is corrected toward the direction, in real coordinates, of the line segment along the lane in which the host vehicle is traveling (here, the inner edge of the white line) in the image ahead of or behind the host vehicle; therefore the detection target to be detected (particularly a preceding or following vehicle) is hard to miss even on roads of non-constant curvature or with slopes. In particular, in this embodiment the central axis of the sensor is corrected toward the vanishing point, among the plurality of vanishing points obtained by dividing the image into regions, from which the entire road can best be overlooked, so the detection target is even less likely to be missed on roads whose curvature changes in a complicated way, such as S-shaped curves, or on sharply undulating roads.
Further, in the present invention, since the direction vector is determined by obtaining the vanishing point, the direction vector can be obtained by the above-described simple processing even with a simple configuration in which one camera is provided as the imaging means.
[0043]
(Second embodiment)
Next, a second embodiment will be described. This embodiment differs from the first embodiment only in part of the processing performed by the image processing unit 3; everything else is the same. FIG. 8 is a flowchart of the processing of this example. The same steps as in the first embodiment (FIG. 2) are denoted by the same reference numerals, and redundant description is omitted (the same applies to the third embodiment described later).
In this case, a vanishing point is obtained only for the lowermost band-shaped area 1 (N = 1) (steps S3 to S6).
Then, in step S21, which follows step S8 (road-area detection), the vanishing point obtained for N = 1 is, as shown in FIG. 7A, discretized into a rectangular grid of candidate points above, below, left of, and right of it, yielding a plurality (D) of coordinates on the image. In the example of FIG. 7A, candidate points are generated at prescribed intervals in five columns horizontally and three rows vertically around the N = 1 vanishing point, so that D = 15.
[0044]
Next, in steps S22 to S26, as in steps S9 to S13 of FIG. 2, a triangular area is formed on the image for each of the D coordinates, with that coordinate as the upper vertex and points on the white lines on both sides in the foreground as the lower vertices; the coordinate whose triangle has the largest common area with the road region determined in step S8 is selected, and the direction vector is determined from the selected coordinate.
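Steps S21 to S26 can be sketched as follows; the grid spacings, the mask representation, and the rasterize_triangle helper are illustrative assumptions, not details given in the patent.

import numpy as np

# Step S21: discretize around the N = 1 vanishing point into a rectangular
# grid of D = n_cols * n_rows candidate coordinates (5 x 3 = 15 in FIG. 7A).
def candidate_grid(vp, dx, dy, n_cols=5, n_rows=3):
    X0, Y0 = vp
    return [(X0 + i * dx, Y0 + j * dy)
            for j in range(-(n_rows // 2), n_rows // 2 + 1)
            for i in range(-(n_cols // 2), n_cols // 2 + 1)]

# Steps S22-S26: for each candidate apex, form the triangle whose base is
# the pair of white-line points, and keep the apex whose triangle shares
# the largest area with the detected road region.
def select_candidate(cands, base_l, base_r, road_mask, rasterize_triangle):
    def overlap(c):
        tri = rasterize_triangle(c, base_l, base_r)  # boolean image (assumed helper)
        return np.logical_and(tri, road_mask).sum()
    return max(cands, key=overlap)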
According to the second embodiment, the following effect is obtained in addition to those of the first embodiment: since only one vanishing point has to be calculated, the calculation load is reduced.
[0045]
(Third embodiment)
Next, a third embodiment will be described. This embodiment, too, differs in part of the processing performed by the image processing unit 3. In this example, however, as indicated by the broken line in FIG. 1A, the detection results of the laser radar 2 (at least the position information of other vehicles) must be input from the controller 4 to the image processing unit 3.
FIG. 9 is a flowchart showing the processing content of this example.
In this case, step S31 is provided after step S2. In step S31, it is judged, based on the detection results of the laser radar 2, whether another vehicle exists in the band-shaped area for which a vanishing point is to be obtained and whether that vehicle may overlap the line segment. If another vehicle exists, steps S3 to S6 are skipped and the process proceeds to step S7, so that no vanishing point is calculated for that band-shaped area.
[0046]
Whether another vehicle exists in a band-shaped area is determined, for example, as follows. Based on the detection results of the laser radar 2 (the position information of the other vehicle), the position on the image where the other vehicle appears is obtained; a histogram is formed in the vicinity of that position to estimate the approximate width and height of the other vehicle on the image, thereby specifying the image area occupied by the other vehicle. Then, for example, the overlap ratio between this area and each band-shaped area may be calculated, and any band-shaped area overlapping by a predetermined ratio or more may be judged to contain another vehicle.
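A minimal sketch of this test follows; representing the vehicle area and the band-shaped areas by their (top, bottom) row ranges, and the 0.2 threshold, are assumptions (the patent says only "a predetermined ratio").

def bands_with_vehicle(vehicle_rows, bands, min_ratio=0.2):
    # vehicle_rows: (top, bottom) rows of the image area occupied by the
    # other vehicle, estimated from the radar position and the histogram.
    # bands: list of (top, bottom) row ranges, one per band-shaped area.
    vt, vb = vehicle_rows
    flagged = []
    for n, (bt, bb) in enumerate(bands, start=1):
        inter = max(0, min(vb, bb) - max(vt, bt))   # overlapping rows
        if inter / max(1, bb - bt) >= min_ratio:    # overlap-ratio test
            flagged.append(n)                       # exclude this band
    return flagged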
[0047]
According to the present embodiment, band-shaped areas in which a preceding vehicle or the like exists are excluded from the processing for obtaining the direction vector (vanishing point); for example, when an image such as that of FIG. 7B is obtained, areas 4 and 5 are excluded. Therefore, as shown in FIG. 7B, when a preceding vehicle or the like is at the edge of the lane (including a boundary between lanes) and hides the line segment at that position (here, the edge of the white line), the situation in which the line segment cannot be detected properly, so that the correction either cannot be performed or is performed incorrectly, is avoided.
[0048]
Note that the present invention is not limited to the above-described embodiments and can take various other forms and modifications.
For example, the image processing unit 3 of the above embodiments may be configured as shown in FIG. 1C, a high-level conceptual functional block diagram of the image processing unit 3. In this case, the image processing unit 3 comprises a preprocessing unit 21, a region division unit 22, a line segment detection unit 23, a direction vector detection unit 24, and a calculation unit 25.
The preprocessing unit 21 corresponds to the distortion correction unit 11 in FIG. 1B. Here, in addition to the lens distortion correction (geometric correction) described above, luminance correction such as gamma correction and contrast enhancement, and noise removal such as median filtering, may be performed as necessary.
[0049]
Further, the region division unit 22 is not limited to dividing the image into horizontally long band-shaped regions; the image may instead be divided, for example, into vertically long band-shaped regions or into a lattice. With such divisions, the question is how to associate the line segments that form a pair (segments lying on both sides of the lane and converging to the same vanishing point); for example, line segments detected in regions at left-right symmetric positions may be selected as a pair. That is, when the image is divided into vertically long bands, the first vanishing point is specified as the intersection of the extensions of the line segments detected in the leftmost and rightmost bands, the second vanishing point as the intersection of the extensions of the line segments detected in the second band from the left and the second band from the right, and so on.
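For the vertical-band variant, the symmetric pairing and the per-pair vanishing points could be sketched as follows, with each detected line represented by coefficients (a, b, c) of a*X + b*Y + c = 0 (a representation assumed here, not specified in the patent).

def pairwise_vanishing_points(lines_by_band):
    # lines_by_band[i]: the line detected in the i-th vertical band,
    # ordered left to right; band i from the left pairs with band i
    # from the right, per the left-right symmetry rule above.
    n = len(lines_by_band)
    vps = []
    for i in range(n // 2):
        a1, b1, c1 = lines_by_band[i]
        a2, b2, c2 = lines_by_band[n - 1 - i]
        d = a1 * b2 - a2 * b1            # zero only if the lines are parallel
        vps.append(((b1 * c2 - b2 * c1) / d, (a2 * c1 - a1 * c2) / d))
    return vps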
Further, the division widths used when dividing the image need not be equal or constant, and the number of divisions N may be other than the N = 7 of the above embodiment, or may be variable.
[0050]
Further, the line segment detection unit 23 corresponds to the white line detection unit 13 in FIG. 1B but is not limited to that aspect: as described above, the image of a guardrail or the like, or an optical flow obtained from the temporal change of such images, may also be detected as the line segment of the present invention.
The direction vector detection unit 24 corresponds to the vanishing point detection unit 14 and part of the scanning area calculation unit 15 in FIG. 1B, but a mode in which the direction vector of a line segment is specified without finding a vanishing point is also possible. For example, a plurality of cameras may be installed, the three-dimensional positions of points on a line segment obtained by the principle of triangulation, and the direction of the line segment in real coordinates calculated. In such an embodiment the direction vector is determined from a single line segment (in other words, for each line segment), so there is no need to detect a pair of line segments as in the above embodiments.
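In that multi-camera variant the per-segment direction vector reduces to normalizing the difference of two triangulated points, as in this sketch:

import numpy as np

def segment_direction(p_near, p_far):
    # p_near, p_far: 3-D positions of two points on one line segment,
    # obtained by triangulation between the cameras.
    d = np.asarray(p_far, dtype=float) - np.asarray(p_near, dtype=float)
    return d / np.linalg.norm(d)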
The calculation unit 25 calculates the optimum optical-axis direction from the calculated direction vector and corresponds to the scanning area calculation unit 15 in FIG. 1B.
[0051]
Further, in the above embodiments, a plurality of direction vectors (vanishing points) are obtained from the plurality of line segments found by dividing the image, and the direction vector whose triangular region overlaps the road region the most is selected; however, the manner of selecting a final direction vector from a plurality of line segments and correcting the optical axis is not limited to this. For example, the line segments include a white line relatively far from the host vehicle (upper region of the image) and a white line closer to the host vehicle (lower region of the image); a mode may therefore be adopted in which a plurality of detections along the longitudinal direction of the road are made and the central axis of the sensor is corrected toward their average direction (including a mode in which no vanishing point is determined). Although the average direction is not always the best direction, the entire road can still be surveyed and a preceding vehicle or the like is far less likely to be missed than in the conventional approach, which corrects the optical axis on the assumption of a constant curvature.
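The averaging variant is equally compact; renormalizing the mean of the per-segment unit vectors, as below, is one assumed way of forming the "average direction".

import numpy as np

def average_direction(unit_vectors):
    # unit_vectors: per-segment direction vectors for the near and far
    # white-line segments detected along the road.
    v = np.sum(np.asarray(unit_vectors, dtype=float), axis=0)
    return v / np.linalg.norm(v)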
[0052]
Further, in the above embodiments, the image processing unit 3 and the controller 4 constitute the control means of the present invention, and processing such as that of FIG. 2 may be executed repeatedly; however, as described above, the optical axis correction may instead be performed only when necessary (for example, when a change in the horizontal position of the preceding vehicle is detected).
In the third embodiment, every small area containing another vehicle is excluded from the processing for obtaining the direction vector (vanishing point), on the assumption that the other vehicle may overlap the line segment. However, when it can be determined that the other vehicle, although present in a small area, does not overlap the line segment (for example, when the other vehicle is near the lateral center at the bottom of the image and can be judged not to be at the edge of the lane), that small area need not be excluded; in other words, only small areas in which another vehicle overlaps, or may overlap, the line segment may be excluded.
[0053]
[Effects of the invention]
According to the on-vehicle radar device of the present invention, the central axis of the sensor is corrected toward the real-coordinate direction of the line segment along the lane in which the host vehicle is traveling in the image ahead of or behind the host vehicle; as a result, a detection target (in particular, a preceding or following vehicle) is unlikely to be missed even on a road whose curvature is not constant or on a road with a gradient.
[Brief description of the drawings]
FIG. 1 is a diagram illustrating a configuration of a vehicle-mounted radar device.
FIG. 2 is a flowchart of an optical axis correction process (first embodiment).
FIG. 3 is a diagram illustrating region division and vanishing points of an image.
FIG. 4 is a diagram illustrating a road area and a triangle area in an image.
FIG. 5 is a diagram illustrating a scanning range and the like of a laser radar.
FIG. 6 is a diagram illustrating a relationship between a sign and a direction of each coordinate system.
FIG. 7 is a diagram for explaining another embodiment.
FIG. 8 is a flowchart of an optical axis correction process (second embodiment).
FIG. 9 is a flowchart of an optical axis correction process (third embodiment).
[Explanation of symbols]
1 camera (imaging means)
2 Laser radar (sensor)
3 Image processing unit (control means)
4 Controller (control means, correction control unit)
23 Line segment detector
24 Direction vector detector

Claims (6)

1. An on-vehicle radar device having a sensor for acquiring at least position information of a detection target existing ahead of or behind a host vehicle, the device comprising:
imaging means for acquiring an image including the road surface ahead of or behind the host vehicle; and
control means for correcting the direction of the central axis of the sensor based on the image obtained by the imaging means,
wherein the control means comprises:
a line segment detection unit that detects, from the image, a line segment along the lane in which the host vehicle is traveling;
a direction vector detection unit that obtains a direction vector, in real coordinates, of the line segment obtained by the line segment detection unit; and
a correction control unit that executes control to correct the direction of the central axis vertically and horizontally so as to match the direction vector obtained by the direction vector detection unit.
2. The on-vehicle radar device according to claim 1, wherein the line segment detection unit has a function of obtaining, as the line segments, one or more pairs lying on both sides of the lane in which the host vehicle is traveling, and
the direction vector detection unit obtains a vanishing point on the image as the intersection of the extensions of a pair of the line segments and obtains the direction vector from the vanishing point.
3. The on-vehicle radar device according to claim 2, further comprising a region division unit that divides the image obtained by the imaging means into a plurality of small areas, wherein
the line segment detection unit performs the line segment detection for each small area, thereby detecting a plurality of pairs of line segments, and
the direction vector detection unit obtains a plurality of vanishing points on the image based on the plurality of pairs of line segments, selects from among them the vanishing point that maximizes the area of the common region between the triangular area on the image, whose vertices are the vanishing point and points on the nearest pair of line segments, and the area on the image of the lane in which the host vehicle is traveling, and obtains the direction vector from the selected vanishing point.
4. The on-vehicle radar device according to claim 2, wherein
the line segment detection unit detects the pair of line segments in the lowermost band-shaped area along the lower edge of the image, and
the direction vector detection unit obtains a vanishing point on the image based on the pair of line segments in the lowermost band-shaped area, determines a plurality of coordinates on the image by discretizing the vanishing point into rectangular areas above, below, left of, and right of it, selects from among these coordinates the coordinate that maximizes the area of the common region between the triangular area on the image, whose vertices are the coordinate and points on the pair of line segments in the lowermost band-shaped area, and the area on the image of the lane in which the host vehicle is traveling, and obtains the direction vector from the selected coordinate.
5. The on-vehicle radar device according to claim 3, wherein the direction vector detection unit specifies, based on the position information of the detection target obtained by the sensor, any small area in which the detection target overlaps or may overlap the line segment on the image, and excludes that small area from the areas for which a vanishing point is obtained.
6. The on-vehicle radar device according to any one of claims 1 to 5, wherein the control means starts new control for correcting the direction of the central axis when any one or more of the following is detected: a change in the horizontal position of the detection target, a change in the speed of the host vehicle, a tilt of the host vehicle with respect to the horizontal, or a lane change by the host vehicle.
JP2002311680A 2002-10-25 2002-10-25 Automotive radar equipment Expired - Fee Related JP3862015B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2002311680A JP3862015B2 (en) 2002-10-25 2002-10-25 Automotive radar equipment
US10/681,840 US6831591B2 (en) 2002-10-25 2003-10-08 Radar device for a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002311680A JP3862015B2 (en) 2002-10-25 2002-10-25 Automotive radar equipment

Publications (2)

Publication Number Publication Date
JP2004144671A (en) 2004-05-20
JP3862015B2 JP3862015B2 (en) 2006-12-27

Family

ID=32105318

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002311680A Expired - Fee Related JP3862015B2 (en) 2002-10-25 2002-10-25 Automotive radar equipment

Country Status (2)

Country Link
US (1) US6831591B2 (en)
JP (1) JP3862015B2 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005345251A (en) * 2004-06-02 2005-12-15 Toyota Motor Corp Obstacle recognition device
JP2007163258A (en) * 2005-12-13 2007-06-28 Alpine Electronics Inc Apparatus and method for compensating onboard sensor
JP2007233440A (en) * 2006-02-27 2007-09-13 Omron Corp On-vehicle image processor
JP2007240277A (en) * 2006-03-07 2007-09-20 Olympus Corp Distance measuring device/imaging device, distance measuring method/imaging method, distance measuring program/imaging program, and storage medium
JP2007240276A (en) * 2006-03-07 2007-09-20 Olympus Corp Distance measuring device/imaging device, distance measuring method/imaging method, distance measuring program/imaging program, and storage medium
JP2010515183A (en) * 2007-01-04 2010-05-06 Continental Automotive GmbH Lidar sensor vertical alignment
JP2012018531A (en) * 2010-07-07 2012-01-26 Suzuki Motor Corp White line detection device
JP2012046084A (en) * 2010-08-27 2012-03-08 Koito Mfg Co Ltd Light distribution control apparatus
WO2015119298A1 (en) * 2014-02-10 2015-08-13 株式会社デンソー Axis deviation detection device for beam sensor
JP2016053563A (en) * 2014-02-10 2016-04-14 株式会社デンソー Axis deviation detector
CN111361509A (en) * 2018-12-26 2020-07-03 财团法人工业技术研究院 Automatic adjusting method and system for vehicle sensor
JP2021060371A (en) * 2019-10-09 2021-04-15 株式会社Soken Axial misalignment estimation device
WO2021133892A1 (en) * 2019-12-27 2021-07-01 Lyft, Inc. Adaptive tilting radars for effective vehicle controls
JP2023510735A (en) * 2020-01-06 2023-03-15 ルミナー,エルエルシー Adaptive scan pattern with virtual horizon estimation
WO2023188793A1 (en) * 2022-03-30 2023-10-05 パナソニックIpマネジメント株式会社 Display system and display method

Families Citing this family (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3998601B2 (en) * 2002-10-09 2007-10-31 富士通株式会社 Pulse radar equipment
JP3862015B2 (en) * 2002-10-25 2006-12-27 オムロン株式会社 Automotive radar equipment
JP4313089B2 (en) * 2003-05-23 2009-08-12 富士通テン株式会社 Radar apparatus for automobile and its mounting direction adjusting method
US7046190B2 (en) * 2003-07-25 2006-05-16 Raytheon Company Process for phase-derived range measurements
US6906663B2 (en) * 2003-07-30 2005-06-14 The Boeing Company E-field monitor for pulsed signals
US20050024260A1 (en) * 2003-07-30 2005-02-03 Johnston Gary P. E-field monitor for broadband pulsed
US6882302B1 (en) * 2003-09-22 2005-04-19 Rockwell Collins Enhanced adaptive weather thresholds for identification of hazards system and method
US6977610B2 (en) * 2003-10-10 2005-12-20 Raytheon Company Multiple radar combining for increased range, radar sensitivity and angle accuracy
US7038615B2 (en) * 2003-10-10 2006-05-02 Raytheon Company Efficient technique for estimating elevation angle when using a broad beam for search in a radar
JP4259357B2 (en) * 2004-03-12 2009-04-30 三菱ふそうトラック・バス株式会社 Vehicle running state determination device
JP4296973B2 (en) * 2004-03-19 2009-07-15 三菱ふそうトラック・バス株式会社 Vehicle running state determination device
JP4258413B2 (en) * 2004-03-26 2009-04-30 三菱ふそうトラック・バス株式会社 Vehicle running state determination device
JP4225229B2 (en) * 2004-03-30 2009-02-18 三菱ふそうトラック・バス株式会社 Arousal level judgment device
JP4763250B2 (en) * 2004-04-09 2011-08-31 株式会社デンソー Object detection device
JP4895484B2 (en) * 2004-06-28 2012-03-14 富士通テン株式会社 Axis deviation calculation method for on-vehicle radar device and on-vehicle radar axis deviation determination method
DE102004036580A1 (en) * 2004-07-28 2006-03-16 Robert Bosch Gmbh Method and device for object detection in a vehicle
US20060091654A1 (en) * 2004-11-04 2006-05-04 Autoliv Asp, Inc. Sensor system with radar sensor and vision sensor
US20060091653A1 (en) * 2004-11-04 2006-05-04 Autoliv Asp, Inc. System for sensing impending collision and adjusting deployment of safety device
JP2006151125A (en) * 2004-11-26 2006-06-15 Omron Corp On-vehicle image processing device
DE102005001429A1 (en) * 2005-01-12 2006-07-20 Robert Bosch Gmbh Method for image-position correction of a monitor image
US7627170B2 (en) * 2005-10-11 2009-12-01 Northrop Grumman Corporation Process for the identification of objects
JP4304517B2 (en) * 2005-11-09 2009-07-29 トヨタ自動車株式会社 Object detection device
US7544945B2 (en) 2006-02-06 2009-06-09 Avago Technologies General Ip (Singapore) Pte. Ltd. Vertical cavity surface emitting laser (VCSEL) array laser scanner
JP4248558B2 (en) * 2006-03-24 2009-04-02 トヨタ自動車株式会社 Road marking line detection device
US7633431B1 (en) * 2006-05-18 2009-12-15 Rockwell Collins, Inc. Alignment correction engine
JP4367475B2 (en) * 2006-10-06 2009-11-18 アイシン精機株式会社 Moving object recognition apparatus, moving object recognition method, and computer program
EP2122599B1 (en) * 2007-01-25 2019-11-13 Magna Electronics Inc. Radar sensing system for vehicle
JP5160114B2 (en) * 2007-03-26 2013-03-13 本田技研工業株式会社 Vehicle passage judgment device
US8017898B2 (en) 2007-08-17 2011-09-13 Magna Electronics Inc. Vehicular imaging system in an automatic headlamp control system
JP2009053818A (en) * 2007-08-24 2009-03-12 Toshiba Corp Image processor and method thereof
EP2535883B1 (en) * 2008-07-10 2014-03-19 Mitsubishi Electric Corporation Train-of-vehicle travel support device
US8095276B2 (en) * 2008-10-15 2012-01-10 Autoliv Asp, Inc. Sensor system including a confirmation sensor for detecting an impending collision
US20100225522A1 (en) * 2009-03-06 2010-09-09 Demersseman Bernard Guy Sensor system for detecting an impending collision of a vehicle
US8284997B2 (en) * 2009-03-11 2012-10-09 Honeywell International Inc. Vision-based vehicle navigation system and method
US8949069B2 (en) * 2009-12-16 2015-02-03 Intel Corporation Position determination based on propagation delay differences of multiple signals received at multiple sensors
JP2013002927A (en) * 2011-06-15 2013-01-07 Honda Elesys Co Ltd Obstacle detection apparatus and computer program
US10162070B2 (en) * 2012-04-05 2018-12-25 Westerngeco L.L.C. Converting a first acquired data subset to a second acquired data subset
JP2013217799A (en) * 2012-04-10 2013-10-24 Honda Elesys Co Ltd Object detection device, object detection method, object detection program, and operation control system
WO2013162559A1 (en) * 2012-04-26 2013-10-31 Intel Corporation Determining relative positioning information
DE102013113054B4 (en) * 2012-12-03 2022-01-27 Denso Corporation Target detection device for avoiding a collision between a vehicle and a target detected by a sensor mounted on the vehicle
WO2014193334A1 (en) 2013-05-26 2014-12-04 Intel Corporation Apparatus, system and method of communicating positioning information
WO2015005912A1 (en) 2013-07-10 2015-01-15 Intel Corporation Apparatus, system and method of communicating positioning transmissions
JP5812061B2 (en) * 2013-08-22 2015-11-11 株式会社デンソー Target detection apparatus and program
JP6087858B2 (en) * 2014-03-24 2017-03-01 株式会社日本自動車部品総合研究所 Traveling lane marking recognition device and traveling lane marking recognition program
US10032249B2 (en) * 2014-09-05 2018-07-24 Sakai Display Products Corporation Image generating apparatus, image generating method, and computer program
DE102014013432B4 (en) * 2014-09-10 2016-11-10 Audi Ag Method for processing environment data in a vehicle
JP6265095B2 (en) * 2014-09-24 2018-01-24 株式会社デンソー Object detection device
US10962638B2 (en) 2017-09-07 2021-03-30 Magna Electronics Inc. Vehicle radar sensing system with surface modeling
US11150342B2 (en) 2017-09-07 2021-10-19 Magna Electronics Inc. Vehicle radar sensing system with surface segmentation using interferometric statistical analysis
US10877148B2 (en) 2017-09-07 2020-12-29 Magna Electronics Inc. Vehicle radar sensing system with enhanced angle resolution using synthesized aperture
US10962641B2 (en) 2017-09-07 2021-03-30 Magna Electronics Inc. Vehicle radar sensing system with enhanced accuracy using interferometry techniques
TWI734932B (en) * 2018-09-17 2021-08-01 為昇科科技股份有限公司 Radar detection angle caliberation system and method thereof
CN109901183A (en) * 2019-03-13 2019-06-18 电子科技大学中山学院 Method for improving all-weather distance measurement precision and reliability of laser radar
KR20210054944A (en) * 2019-11-06 2021-05-14 현대자동차주식회사 Apparatus for compensating error of radar in vehicle and method thereof
CN113829994B (en) * 2020-06-08 2023-11-21 广州汽车集团股份有限公司 Early warning method and device based on car external whistling, car and medium
CN114076946A (en) * 2020-08-18 2022-02-22 华为技术有限公司 Motion estimation method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11142520A (en) * 1997-11-06 1999-05-28 Omron Corp Axis adjusting method for distance measuring apparatus and detecting method for axis deviation as well as distance measuring apparatus
JP2003121547A (en) * 2001-10-18 2003-04-23 Fuji Heavy Ind Ltd Outside-of-vehicle monitoring apparatus
JP3880837B2 (en) * 2001-11-02 2007-02-14 富士重工業株式会社 Outside monitoring device
JP3880841B2 (en) * 2001-11-15 2007-02-14 富士重工業株式会社 Outside monitoring device
JP3861781B2 (en) * 2002-09-17 2006-12-20 日産自動車株式会社 Forward vehicle tracking system and forward vehicle tracking method
JP3862015B2 (en) * 2002-10-25 2006-12-27 オムロン株式会社 Automotive radar equipment

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005345251A (en) * 2004-06-02 2005-12-15 Toyota Motor Corp Obstacle recognition device
US7570198B2 (en) 2004-06-02 2009-08-04 Toyota Jidosha Kabushiki Kaisha Obstacle recognition system and obstacle recognition method
JP2007163258A (en) * 2005-12-13 2007-06-28 Alpine Electronics Inc Apparatus and method for compensating onboard sensor
JP2007233440A (en) * 2006-02-27 2007-09-13 Omron Corp On-vehicle image processor
JP2007240277A (en) * 2006-03-07 2007-09-20 Olympus Corp Distance measuring device/imaging device, distance measuring method/imaging method, distance measuring program/imaging program, and storage medium
JP2007240276A (en) * 2006-03-07 2007-09-20 Olympus Corp Distance measuring device/imaging device, distance measuring method/imaging method, distance measuring program/imaging program, and storage medium
JP2010515183A (en) * 2007-01-04 2010-05-06 Continental Automotive GmbH Lidar sensor vertical alignment
JP2012018531A (en) * 2010-07-07 2012-01-26 Suzuki Motor Corp White line detection device
JP2012046084A (en) * 2010-08-27 2012-03-08 Koito Mfg Co Ltd Light distribution control apparatus
JP2016053563A (en) * 2014-02-10 2016-04-14 株式会社デンソー Axis deviation detector
WO2015119298A1 (en) * 2014-02-10 2015-08-13 株式会社デンソー Axis deviation detection device for beam sensor
CN111361509A (en) * 2018-12-26 2020-07-03 财团法人工业技术研究院 Automatic adjusting method and system for vehicle sensor
JP2020106511A (en) * 2018-12-26 2020-07-09 Industrial Technology Research Institute Method and system for automatic adjustment of sensor for automobile
JP2021060371A (en) * 2019-10-09 2021-04-15 株式会社Soken Axial misalignment estimation device
CN114556143A (en) * 2019-10-09 2022-05-27 株式会社电装 Shaft offset estimation device
JP7339114B2 (en) 2019-10-09 2023-09-05 株式会社Soken Axial misalignment estimator
WO2021133892A1 (en) * 2019-12-27 2021-07-01 Lyft, Inc. Adaptive tilting radars for effective vehicle controls
US11360191B2 (en) 2019-12-27 2022-06-14 Woven Planet North America, Inc. Adaptive tilting radars for effective vehicle controls
JP2023510735A (en) * 2020-01-06 2023-03-15 ルミナー,エルエルシー Adaptive scan pattern with virtual horizon estimation
JP7395755B2 (en) 2020-01-06 2023-12-11 ルミナー,エルエルシー Adaptive scanning pattern with virtual horizon estimation
WO2023188793A1 (en) * 2022-03-30 2023-10-05 パナソニックIpマネジメント株式会社 Display system and display method

Also Published As

Publication number Publication date
JP3862015B2 (en) 2006-12-27
US20040080449A1 (en) 2004-04-29
US6831591B2 (en) 2004-12-14

Similar Documents

Publication Publication Date Title
JP3862015B2 (en) Automotive radar equipment
JP5829980B2 (en) Roadside detection device
JP5637302B2 (en) Driving support apparatus and adjacent vehicle detection method
EP2605185B1 (en) Detection of obstacles at night by analysis of shadows
JP3759429B2 (en) Obstacle detection apparatus and method
EP2993654B1 (en) Method and system for forward collision warning
JP3711405B2 (en) Method and system for extracting vehicle road information using a camera
JP4930046B2 (en) Road surface discrimination method and road surface discrimination device
JP2003296736A (en) Device for detecting obstacle and method thereof
JP2007300181A (en) Periphery monitoring apparatus and periphery monitoring method and program thereof
JP2005301603A (en) Traveling lane detection device
US10318824B2 (en) Algorithm to extend detecting range for AVM stop line detection
JP2001092970A (en) Lane recognizing device
JP2006331193A (en) Vehicle, image processing system, image processing method, and image processing program
WO2014033936A1 (en) Image processing device, image processing method, and image processing program
JP2008262333A (en) Road surface discrimination device and road surface discrimination method
KR101268282B1 (en) Lane departure warning system in navigation for vehicle and method thereof
JPH11351862A (en) Foregoing vehicle detecting method and equipment
JP4721278B2 (en) Lane departure determination device, lane departure prevention device, and lane tracking support device
JP2013142972A (en) Travel path recognition device
JP2007018451A (en) Road boundary line detecting device
JP6608664B2 (en) Own vehicle position recognition device
JP2007310591A (en) Image processor and parking space determination method
JP3729005B2 (en) Vehicle rear monitoring device
JP6949090B2 (en) Obstacle detection device and obstacle detection method

Legal Events

Date Code Title Description
2005-01-26 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2006-03-27 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2006-06-14 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2006-08-02 A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
2006-09-06 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2006-09-19 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
FPAY Renewal fee payment (payment until 2010-10-06; year of fee payment: 4)
LAPS Cancellation because of no payment of annual fees