JP2005030968A - In-vehicle distance calculation device


Info

Publication number
JP2005030968A
Authority
JP
Japan
Prior art keywords
feature point
vehicle
real space
reference feature
road surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2003271984A
Other languages
Japanese (ja)
Other versions
JP4144464B2 (en)
Inventor
Hidekazu Nishiuchi (西内 秀和)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Priority to JP2003271984A priority Critical patent/JP4144464B2/en
Publication of JP2005030968A publication Critical patent/JP2005030968A/en
Application granted granted Critical
Publication of JP4144464B2 publication Critical patent/JP4144464B2/en
Current legal status: Expired - Fee Related


Landscapes

  • Measurement Of Optical Distance (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a vehicle-mounted distance calculation device capable of calculating the distance between the host vehicle and an object even when there is no brightness difference at the contact point between the object and the road surface.

SOLUTION: The device includes a camera 1, mounted at a prescribed position on the host vehicle to image its vicinity, and a microcomputer 4 that extracts, from the image captured by the camera 1, feature points whose brightness differs markedly from that of their neighboring regions. From the extracted feature points, the microcomputer 4 extracts a reference feature point lying on the road surface in real space, and calculates the real-space positions of the reference feature point and the feature points from the temporal change of their image positions. The distance between the host vehicle and the object is then calculated from the real-space position of the reference feature point or a feature point.

COPYRIGHT: (C)2005, JPO&NCIPI

Description

The present invention relates to an in-vehicle distance calculation device that calculates the distance between a host vehicle and an object.

Conventionally, techniques that mount a sensor on the host vehicle and calculate the distance from the host vehicle to an object ahead of it have been studied. Compared with methods that calculate distance using a stereo camera, calculating the distance to an object from the image captured by a single camera has the advantages of being inexpensive and requiring a simple system configuration. With a single camera, the distance is calculated as follows. A camera is installed on the vehicle at height H and at angle θ with respect to the road surface. Letting yt be the vertical image coordinate of the object in the image captured by this camera and f the focal length of the camera, the distance L to the object is calculated by equation (1):
L = H(yt・tanθ + f)/(f・tanθ − yt) …(1)
JP 2001-116512 A; JP 05-157557 A
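
As a concrete illustration of equation (1), the following sketch computes L for a given image row. The parameter values are hypothetical, and the sign convention for yt (measured from the image center, per the patent's figure geometry, which is not reproduced here) is an assumption.

```python
import math

def distance_from_image_row(yt: float, H: float, theta: float, f: float) -> float:
    """Equation (1): distance L to the object, given the vertical image
    coordinate yt of its road contact point (same units as the focal length f),
    camera height H, and downward camera angle theta (radians)."""
    t = math.tan(theta)
    return H * (yt * t + f) / (f * t - yt)

# Hypothetical values: camera 1.2 m above the road, pitched 6 degrees,
# focal length expressed in pixels; yt = -50 pixels from the image center.
print(distance_from_image_row(yt=-50.0, H=1.2, theta=math.radians(6.0), f=800.0))
```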

However, in the conventional technique described above, the distance between the host vehicle and the object is calculated by triangulation, using the camera installation position, the road surface treated as a plane, and the contact point between the object and the road surface. When there is no brightness difference between the object and the road surface at that contact point, for example because of the object's shadow, the contact point cannot be detected and the distance between the host vehicle and the object cannot be calculated.
An object of the present invention is to provide an in-vehicle distance calculation device that can calculate the distance between the host vehicle and an object at low cost.

To solve the above problem, the in-vehicle distance calculation device of the present invention comprises: imaging means for imaging the vicinity of the host vehicle; feature point extraction means for extracting feature points from the image captured by the imaging means; reference feature point extraction means for extracting, from among the feature points, a reference feature point that lies on the road surface in real space; real space position calculation means for calculating the real-space positions of the reference feature point and the feature points from the temporal change of their image positions; and object distance calculation means for calculating the distance between the host vehicle and the object from the real-space position of the reference feature point or a feature point.

According to the present invention, an in-vehicle distance calculation device that can calculate the distance between the host vehicle and an object at low cost can be provided.

Embodiments of the present invention will now be described in detail with reference to the drawings. In the drawings described below, components having the same function are denoted by the same reference numerals, and repeated description is omitted.
FIG. 1 is a block diagram of an embodiment of the in-vehicle distance calculation device of the present invention. As shown in FIG. 1, a camera 1, a vehicle sensor 2, an image memory 3, a microcomputer 4, and an alarm device 5 are connected.
The camera 1 is mounted at a predetermined position on the host vehicle, for example at the front of the vehicle, and images the area ahead of the host vehicle. The vehicle sensor 2 measures the behavior of the host vehicle, here its speed and steering angle. The image memory 3 converts the signal input from the camera 1 into digital values and holds them. The microcomputer 4 reads the digital image held in the image memory 3 and calculates the distance from the host vehicle to an object (obstacle) ahead. The microcomputer 4 also predicts the position of the host vehicle after a predetermined time from the vehicle speed and steering angle input from the vehicle sensor 2, and judges whether the vehicle will collide with the object ahead. When the microcomputer 4 judges that the vehicle will collide with the obstacle ahead, the alarm device 5 issues a warning, for example by sounding a buzzer.

FIGS. 2 and 3 illustrate the basic principle by which the microcomputer 4 extracts, from the extracted feature points, reference feature points that lie on the road surface in real space. FIG. 2 shows the object 12 and its shadow 15 in the image captured by the camera 1. FIG. 3(a) shows a driving scene with the host vehicle 10 and a preceding vehicle 11 on the road surface, and FIG. 3(b) shows the image captured by the camera 1 at that time. As shown in FIG. 2, reference feature point determination windows 16 of a predetermined size are placed in the vicinity of a feature point 13 (here, in sets of four), and the mean and variance of the image brightness values within each window are calculated. Because window 16b lies on a shadow cast on the road surface, its mean brightness is offset lower by the shadow than that of window 16a, which lies on the bare road surface, while their variances are equal because both windows cover the same road surface (the same texture). By extracting feature points whose determination windows have equal variance and whose mean brightness values are polarized into two levels, the road surface brightness and the shadow-offset brightness, only shadow feature points can be extracted from the feature points, excluding those on the object itself. Further, as shown in FIG. 3, the area immediately ahead of the host vehicle 10 is taken as a reference road surface 17, and pattern data of the reference road surface 17 is stored from the captured image. By then keeping only windows 16 that correlate strongly with this pattern, only the shadow feature points that lie on the road surface are retained from the extracted shadow feature points.
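
A minimal sketch of this window test follows, assuming the image is a grayscale NumPy array; the window size, variance tolerance, and mean-gap threshold are hypothetical, since the patent does not specify them.

```python
import numpy as np

def is_shadow_feature_point(img, x, y, win=8, var_tol=0.15, mean_gap=30.0):
    """Test whether the feature point at (x, y) looks like a shadow boundary
    on the road surface: the four determination windows around it should have
    similar variance (same road texture) while their mean brightness values
    split into two levels (lit road vs. shadowed road).
    Assumes (x, y) is at least `win` pixels from the image border."""
    offs = [(-win, -win), (-win, 0), (0, -win), (0, 0)]  # one set of four windows
    patches = [img[y + dy:y + dy + win, x + dx:x + dx + win].astype(float)
               for dy, dx in offs]
    means = np.array([p.mean() for p in patches])
    variances = np.array([p.var() for p in patches])
    same_texture = (variances.max() - variances.min()) <= var_tol * variances.mean()
    bipolar = (means.max() - means.min()) >= mean_gap
    return same_texture and bipolar
```

In the same spirit, the reference road surface check would keep only those candidates whose windows correlate strongly with the stored pattern of region 17 ahead of the host vehicle.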

FIGS. 4 to 8 illustrate the basic principle for calculating the positional relationship in real space between the reference feature point 14, a feature point of the object 12 lying on the road surface, and the feature point 13 of the object 12, from their coordinates in the image.
FIGS. 4(a) and 4(b) show the real-space positions and captured-image coordinates of the reference feature point 14 and the feature point 13 at times T0 and T1, respectively. FIGS. 5(a) and 5(b) show the relationship between the real-space position and the captured-image coordinates as seen from above, and FIGS. 6(a) and 6(b) show the same relationship as seen from the side. FIG. 7 shows the temporal change of the reference feature point 14 and the feature point 13 in real space as seen from the side, and FIG. 8 shows their real-space positions as seen from the side.
As shown in FIG. 4(a), at time T0 the image coordinates of the reference feature point 14 of the object 12 are (x_b0, y_b0) and its real-space position is (X_b0, Y_b0, 0), while the image coordinates of the feature point 13 of the object 12 are (x_f0, y_f0) and its real-space position is (X_f0, Y_f0, Z_f0). Likewise, as shown in FIG. 4(b), at time T1 the image coordinates of the reference feature point 14 are (x_b1, y_b1) with real-space position (X_b1, Y_b1, 0), and the image coordinates of the feature point 13 are (x_f1, y_f1) with real-space position (X_f1, Y_f1, Z_f1). From FIGS. 5 and 6, the deviation angles from the visual axis in real space (hereinafter, viewing angles) (θx, θy) are expressed geometrically from the image position (x, y) as
θx = f(x)
θy = g(y) …(2)
That is, the relationship between the pixel offset from the visual axis and the angular deviation from the visual axis is
θx = 2・tan⁻¹(Cx/(2F))/Pix・Px
θy = 2・tan⁻¹(Cy/(2F))/Piy・Py
(the field of view 2・tan⁻¹(C/(2F)) divided by the pixel count Pi gives the angle per pixel, which is multiplied by the pixel offset P from the visual axis).
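
A sketch of this pixel-to-angle conversion follows. Reading Cx and Cy as the sensor dimensions, F as the focal length, Pix and Piy as the pixel counts, and Px and Py as pixel offsets is an interpretation of the patent's symbols, and the numeric values are hypothetical.

```python
import math

def viewing_angles(px, py, cx=4.8, cy=3.6, f_mm=6.0, pix=640, piy=480):
    """Equation (2) as a pixel-offset-to-angle map: the full field of view
    2*atan(C/(2F)) spread uniformly over the pixel count gives the angle per
    pixel, multiplied by the offset (px, py) from the visual axis."""
    theta_x = 2.0 * math.atan(cx / (2.0 * f_mm)) / pix * px
    theta_y = 2.0 * math.atan(cy / (2.0 * f_mm)) / piy * py
    return theta_x, theta_y

# Assumed 1/3-inch sensor (4.8 mm x 3.6 mm), 6 mm lens, VGA image.
print(viewing_angles(px=120, py=-40))
```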
FIG. 7 shows the real-space positions of the reference feature point 14 and the feature point 13 at times T0 and T1. When the camera 1, mounted on the vehicle at height H, images the reference feature point 14 and the feature point 13 separated by dY in real space, the viewing angles of the two points at time T0 are
θ_bT0 = H/L …(3)
θ_fT0 = (H − Z)/(L + dY) …(4)
When these two points have moved by the relative distance dL by time T1, the viewing angles become
θ_bT1 = H/(L + dL) …(5)
θ_fT1 = (H − Z)/(L + dY + dL) …(6)
From equation (2), all of the viewing angles θ_bT0, θ_fT0, θ_bT1, and θ_fT1 can be calculated from the captured-image coordinates. Furthermore, since the camera installation height H is known, the real-space separation dY and the height Z of the feature point are obtained from equations (3) to (6) as
dY = (θ_fT1/θ_bT1 − θ_fT0/θ_bT0)/(θ_fT0 − θ_fT1)・H …(7)
Z = (1 − (θ_fT0・θ_fT1・(θ_bT0 − θ_bT1))/(θ_bT0・θ_bT1・(θ_fT0 − θ_fT1)))・H …(8)
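
A direct transcription of equations (7) and (8), with a synthetic consistency check built from equations (3) to (6):

```python
def separation_and_height(th_b0, th_f0, th_b1, th_f1, H):
    """Equations (7) and (8): real-space separation dY between the reference
    feature point and the feature point, and the feature point's height Z,
    from the four viewing angles and the known camera height H."""
    dY = (th_f1 / th_b1 - th_f0 / th_b0) / (th_f0 - th_f1) * H
    Z = (1.0 - (th_f0 * th_f1 * (th_b0 - th_b1))
              / (th_b0 * th_b1 * (th_f0 - th_f1))) * H
    return dY, Z

# Synthetic scene: H = 1.2 m, L = 10 m, dY = 2 m, Z = 0.5 m, dL = 1 m.
H, L, dY, Z, dL = 1.2, 10.0, 2.0, 0.5, 1.0
th_b0, th_f0 = H / L, (H - Z) / (L + dY)              # equations (3), (4)
th_b1, th_f1 = H / (L + dL), (H - Z) / (L + dY + dL)  # equations (5), (6)
print(separation_and_height(th_b0, th_f0, th_b1, th_f1, H))  # ~(2.0, 0.5)
```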
FIG. 8 shows the lateral real-space positions of the reference feature point 14 and the feature point 13 as seen from the visual axis of the camera 1. The lateral positions X_b and X_f of the reference feature point 14 and the feature point 13, and the positional difference dX between them, are
X_b = L・tanθ_xb
X_f = (L + dY)・tanθ_xf
dX = X_b − X_f
and can be calculated by substituting equations (3) and (7).
That is, by measuring the temporal change of the reference feature point 14 of the object 12 on the road surface and of the feature point 13 of the object 12, the real-space positions (arrangement) of these two points can be calculated.
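
Continuing the sketch, the lateral placement follows once L is recovered from equation (3) as L = H/θ_bT0; here θ_xb and θ_xf denote the horizontal viewing angles of the two points.

```python
import math

def lateral_positions(th_b0, th_xb, th_xf, dY, H):
    """FIG. 8: lateral positions X_b, X_f and their difference dX, with the
    longitudinal distance L recovered from equation (3) as L = H / th_b0."""
    L = H / th_b0
    X_b = L * math.tan(th_xb)
    X_f = (L + dY) * math.tan(th_xf)
    return X_b, X_f, X_b - X_f
```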

FIG. 9 shows the processing flow for calculating the distance to an object in accordance with the basic principle described above.
S100 reads the image brightness information from the image memory 3, in which the signal output from the camera 1 has been converted into digital values and stored.
S101 applies vertical and horizontal Sobel filters to the image brightness signal input in S100 to compute a vertical edge image and a horizontal edge image.
S102 extracts the intersections of the vertical edge image and the horizontal edge image computed in S101 as feature points.
S103 is a clustering step that labels each object based on the frame-to-frame motion of the feature points extracted in S102.
S104 extracts, from the feature points of each object clustered in S103, the reference feature points, which are the intersections of shadow edges projected on the road surface.
S105 calculates, for each object clustered in S103, the real-space positions of the reference feature points extracted in S104 and of the other feature points extracted in S102, from the temporal change of their image coordinates.
S106 calculates the distance from the host vehicle to the object by triangulation, using the known camera installation position, the reference road surface immediately ahead of the host vehicle (17 in FIG. 3), and the reference feature point, the contact point between the object and the road surface, calculated in S105.
S107 inputs the speed and steering angle of the host vehicle from the vehicle sensor 2.
S108 calculates the position of the host vehicle after a predetermined time from the vehicle behavior input in S107.
S109 judges whether a collision will occur, from the distance to the object calculated in S106 and the position of the host vehicle after the predetermined time calculated in S108.
If S109 judges that a collision will occur, S110 proceeds to S111, in which the driver is warned of the collision with the object by the buzzer of the alarm device 5.
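
For concreteness, here is a minimal sketch of steps S101 and S102, assuming a grayscale image as a NumPy array and using OpenCV's Sobel filter; the edge threshold is hypothetical.

```python
import cv2
import numpy as np

def extract_feature_points(gray, thresh=100.0):
    """S101/S102 sketch: compute vertical and horizontal Sobel edge images and
    take pixels that are strong in both (edge intersections) as feature points."""
    edge_v = np.abs(cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3))  # vertical edges
    edge_h = np.abs(cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3))  # horizontal edges
    mask = (edge_v > thresh) & (edge_h > thresh)
    ys, xs = np.nonzero(mask)
    return list(zip(xs.tolist(), ys.tolist()))  # (x, y) image coordinates
```

The remaining steps would cluster these points by frame-to-frame motion (S103), apply the shadow test above (S104), and feed the surviving points into the geometry of equations (3) to (8) (S105, S106).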

As described above, the in-vehicle distance calculation device of the present embodiment comprises: the camera 1, mounted at a predetermined position on the host vehicle, which images the vicinity of the host vehicle; and the microcomputer 4, which extracts from the image captured by the camera 1 feature points whose brightness differs markedly from that of their neighboring regions, extracts from those feature points the reference feature points that lie on the road surface in real space, calculates the real-space positions of the reference feature points and the feature points from the temporal change of their image positions, and calculates the distance between the host vehicle and the object from the real-space position of a reference feature point or feature point (normally a single microcomputer 4 performs all of these functions). The microcomputer 4 executing S102 corresponds to the feature point extraction means, executing S104 to the reference feature point extraction means, executing S105 to the real space position calculation means, and executing S106 to the object distance calculation means.
By taking the feature points of the object that lie on the road surface as reference feature points and focusing on the temporal change of the reference feature points and feature points in this way, the spatial arrangement of the feature points can be calculated with a single camera, so the distance from the host vehicle to the object can be calculated at low cost, and the system configuration is simple.

Further, in the present in-vehicle distance calculation device, the microcomputer 4 that extracts the reference feature points extracts, as the reference feature point, the shadow of the object on the road surface. Because the object's shadow on the road surface serves as the reference feature point, the distance to the object can be calculated even when there is no brightness difference at the contact point between the object and the road surface and the contact point therefore cannot be detected, for example when the tires cannot be recognized because of the vehicle's shadow.

The embodiment described above has been presented to facilitate understanding of the present invention, not to limit it. Accordingly, the elements disclosed in the above embodiment are intended to encompass all design changes and equivalents falling within the technical scope of the present invention.

FIG. 1 is a block diagram showing the configuration of the in-vehicle distance calculation device according to an embodiment of the present invention. FIGS. 2 to 8 are diagrams explaining the basic principle of the in-vehicle distance calculation device of the embodiment. FIG. 9 is a flowchart explaining the flow of operation of the in-vehicle distance calculation device of the embodiment.

Explanation of symbols

1 … Camera
2 … Vehicle sensor
3 … Image memory
4 … Microcomputer
5 … Alarm device
10 … Host vehicle
11 … Preceding vehicle
12 … Object
13 … Feature point
14 … Reference feature point
15 … Shadow of the object
16, 16a, 16b … Reference feature point determination windows
17 … Reference road surface

Claims (2)

1. An in-vehicle distance calculation device comprising:
imaging means, arranged at a predetermined position on a host vehicle, for imaging the vicinity of the host vehicle;
feature point extraction means for extracting feature points from the image captured by the imaging means;
reference feature point extraction means for extracting, from the feature points extracted by the feature point extraction means, a reference feature point existing on a road surface in real space;
real space position calculation means for calculating the real-space positions of the reference feature point and the feature points from the temporal change of the image positions of the reference feature point and the feature points; and
object distance calculation means for calculating the distance between the host vehicle and an object from the real-space position of the reference feature point or a feature point calculated by the real space position calculation means.
2. The in-vehicle distance calculation device according to claim 1, wherein the reference feature point extraction means extracts, as the reference feature point, a shadow of the object on the road surface.
JP2003271984A 2003-07-08 2003-07-08 In-vehicle distance calculation device Expired - Fee Related JP4144464B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003271984A 2003-07-08 2003-07-08 In-vehicle distance calculation device (granted as JP4144464B2)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003271984A 2003-07-08 2003-07-08 In-vehicle distance calculation device (granted as JP4144464B2)

Publications (2)

Publication Number Publication Date
JP2005030968A (en) 2005-02-03
JP4144464B2 JP4144464B2 (en) 2008-09-03

Family

ID=34209679

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003271984A Expired - Fee Related JP4144464B2 (en) 2003-07-08 2003-07-08 In-vehicle distance calculation device

Country Status (1)

Country Link
JP (1) JP4144464B2 (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011016257A1 (en) 2009-08-06 2011-02-10 パナソニック株式会社 Distance calculation device for vehicle
WO2011043030A1 (en) 2009-10-09 2011-04-14 パナソニック株式会社 Vehicle peripheral monitoring device
US9001204B2 (en) 2009-10-09 2015-04-07 Panasonic Intellectual Property Management Co., Ltd. Vehicle peripheral monitoring device
KR101276073B1 (en) * 2011-01-27 2013-06-18 팅크웨어(주) System and method for detecting distance between forward vehicle using image in navigation for vehicle
JP2013002883A (en) * 2011-06-14 2013-01-07 Honda Motor Co Ltd Distance measuring device
JP2013002884A (en) * 2011-06-14 2013-01-07 Honda Motor Co Ltd Distance measuring device
JP2020508501A (en) * 2017-02-20 2020-03-19 コンチネンタル オートモーティヴ ゲゼルシャフト ミット ベシュレンクテル ハフツングContinental Automotive GmbH Method and apparatus for estimating the range of a moving object
JP7107931B2 (en) 2017-02-20 2022-07-27 コンチネンタル オートモーティヴ ゲゼルシャフト ミット ベシュレンクテル ハフツング Method and apparatus for estimating range of moving objects

Also Published As

Publication number Publication date
JP4144464B2 (en) 2008-09-03

Similar Documents

Publication Publication Date Title
EP2463843B1 (en) Method and system for forward collision warning
JP6795027B2 (en) Information processing equipment, object recognition equipment, device control systems, moving objects, image processing methods and programs
JP3739693B2 (en) Image recognition device
CN101910781B (en) Moving state estimation device
WO2016117200A1 (en) Outside environment recognition device for vehicles and vehicle behavior control device using same
US20050276450A1 (en) Vehicle surroundings monitoring apparatus
JP2021510227A (en) Multispectral system for providing pre-collision alerts
JP6816401B2 (en) Image processing device, imaging device, mobile device control system, image processing method, and program
JP2002366937A (en) Monitor outside vehicle
JP6291866B2 (en) Driving support device and driving support method
JP2006318271A (en) On-vehicle image processor and image processing method
CN107423675B (en) Advanced warning system for forward collision warning of traps and pedestrians
JP2018091652A (en) Information processor, imaging apparatus, equipment control system, information processing method and program
EP3115933B1 (en) Image processing device, image capturing device, mobile body control system, image processing method, and computer-readable recording medium
JP6032034B2 (en) Object detection device
EP3545464A1 (en) Information processing device, imaging device, equipment control system, mobile object, information processing method, and computer-readable recording medium
JP5539250B2 (en) Approaching object detection device and approaching object detection method
JP4144464B2 (en) In-vehicle distance calculation device
JP6992356B2 (en) Information processing equipment, image pickup equipment, equipment control system, mobile body, information processing method and program
JP7404173B2 (en) Image processing device
Yang Estimation of vehicle's lateral position via the Lucas-Kanade optical flow method
JP4321410B2 (en) Object detection apparatus and method
JP3586938B2 (en) In-vehicle distance measuring device
JP2005170290A (en) Obstacle detecting device
JP4176558B2 (en) Vehicle periphery display device

Legal Events

A621 Written request for application examination (effective date: 20060127)
A977 Report on retrieval (effective date: 20070525)
A131 Notification of reasons for refusal (effective date: 20070703)
A521 Written amendment (effective date: 20070719)
A131 Notification of reasons for refusal (effective date: 20080219)
A521 Written amendment (effective date: 20080228)
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (effective date: 20080527)
A61 First payment of annual fees during grant procedure (effective date: 20080609)
R150 Certificate of patent or registration of utility model (ref document number: 4144464)
FPAY Renewal fee payment (payment until: 20110627; year of fee payment: 3)
FPAY Renewal fee payment (payment until: 20120627; year of fee payment: 4)
FPAY Renewal fee payment (payment until: 20130627; year of fee payment: 5)
LAPS Cancellation because of no payment of annual fees