JP6521796B2 - Stereo image processing device - Google Patents

Stereo image processing device

Info

Publication number
JP6521796B2
Authority
JP
Japan
Prior art keywords
road surface
parallax
distance
far
pair
Prior art date
Legal status
Active
Application number
JP2015167088A
Other languages
Japanese (ja)
Other versions
JP2017044573A (en)
Inventor
Atsushi Takano (高野 敦史)
Current Assignee
Subaru Corp
Original Assignee
Subaru Corp
Priority date
Filing date
Publication date
Application filed by Subaru Corp
Priority to JP2015167088A
Publication of JP2017044573A
Application granted
Publication of JP6521796B2
Legal status: Active
Anticipated expiration


Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Description

The present invention relates to a stereo image processing device that determines the parallax error of a pair of left and right cameras mounted on a moving body and calibrates a parallax offset value.

Conventionally, three-dimensional measurement devices that acquire three-dimensional distance information and recognize the surrounding environment and the device's own position from that information are known, and have been put to practical use on moving bodies such as vehicles and helicopters. A device of this type computes the correlation between a pair of images of the same object captured by a pair of left and right cameras (a stereo camera), and calculates the distance to the object from the parallax for that object by the principle of triangulation, using camera parameters such as the stereo camera's mounting position and focal length.

Accordingly, to obtain good stereo matching accuracy (reliable distance information), it is desirable that no positional deviation other than parallax exists in the stereo images. In practice, however, the optical axes of the cameras can shift over time, for example due to physical stress applied when the cameras are screwed into place during mounting, or distortion caused by vibration or heat in the device on which the cameras are mounted, so the parallax drifts. When a parallax error arises in the stereo camera, the parallax between the reference image and the comparison image no longer corresponds accurately to the actual distance of an object, the accuracy of the three-dimensional position determined by stereo matching degrades, and the reliability of the distance information to the object is impaired.

To address this, Patent Document 1 (JP 2001-169310 A), for example, discloses a technique in which, when a fixed stationary object such as a traffic light or utility pole is detected ahead of the host vehicle, the distance and parallax from the host vehicle to the stationary object are detected at a first time t0 and at a second time t1 after a predetermined interval has elapsed from t0, and the travel distance of the vehicle from t0 to t1 is calculated. From these parallaxes and the travel distance, the parallax error of the stereo camera is obtained, and this parallax error is used to calibrate the parallax offset value, that is, the error caused by a deviation in parallelism between the optical axes of the stereo camera.

Patent Document 1: JP 2001-169310 A

With the technique disclosed in the above document, however, the parallax offset value cannot be calibrated when no stationary object is detected, so sufficient opportunities to calibrate the parallax offset value cannot be ensured.

In particular, recent vehicle control systems that generate a target travel path and perform driving assistance such as lane keeping control (lane keep assist) or automated driving along that path use white-line recognition by a stereo camera to recognize the road shape ahead of the host vehicle and control the vehicle to travel in the center of the lane. If opportunities to calibrate the parallax offset value are scarce, stereo matching accuracy degrades, and it becomes difficult for the vehicle control to keep the host vehicle traveling continuously along the lane center.

In view of the above, an object of the present invention is to provide a stereo image processing device that can detect the parallax error of a stereo camera without detecting a fixed stationary object, thereby increasing the opportunities for calibrating the parallax offset value and improving stereo matching accuracy.

A stereo image processing device according to the present invention comprises: a pair of left and right imaging means mounted on a moving body, which image the scene ahead with their optical axes set parallel to the road surface; vertical coordinate setting means for setting, in the pair of images captured by the pair of imaging means, an infinity vertical coordinate coinciding with the optical axis, and a near vertical coordinate and a far vertical coordinate located at a preset near road-surface distance and far road-surface distance, respectively; road-surface parallax computing means for extracting feature portions on the road surface at the near road-surface distance in the pair of images to obtain a near road-surface parallax, and extracting feature portions on the road surface at the far road-surface distance to obtain a far road-surface parallax; near vertical height computing means for obtaining the near vertical height from the road surface at the near road-surface distance to the optical axis, by multiplying the baseline length between the pair of imaging means by the distance from the infinity vertical coordinate to the near vertical coordinate and dividing the result by the near road-surface parallax; corrected far road-surface parallax computing means for obtaining a corrected far road-surface parallax by multiplying the baseline length between the pair of imaging means by the distance from the infinity vertical coordinate to the far vertical coordinate and dividing the result by the near vertical height; and parallax error computing means for obtaining, from the difference between the corrected far road-surface parallax and the far road-surface parallax, a parallax error for calibrating the parallax offset value that is set based on the deviation in parallelism between the optical axes of the pair of imaging means.

According to the present invention, the near road-surface parallax is obtained from feature portions on the road surface at the near road-surface distance ahead of the moving body, and the far road-surface parallax from feature portions on the road surface at the far road-surface distance; the corrected far road-surface parallax is calculated using the distance from the infinity vertical coordinate to the far vertical coordinate and the near vertical height; and the parallax error for calibrating the parallax offset value is calculated from the difference between the corrected far road-surface parallax and the far road-surface parallax. The parallax error of the stereo camera can therefore be detected without detecting a fixed stationary object, which increases the opportunities for calibrating the parallax offset value and improves stereo matching accuracy.

Brief description of the drawings:
Fig. 1 is a functional block diagram of the stereo image processing device according to the first embodiment.
Fig. 2 is a functional block diagram of the parallax correction processing unit.
Fig. 3 is a flowchart showing the parallax offset calibration processing routine (part 1).
Fig. 4 is a flowchart showing the parallax offset calibration processing routine (part 2).
Fig. 5 is an explanatory diagram showing, in real coordinates, the camera height at the near road-surface distance and the camera height at the far road-surface distance.
Fig. 6 is a characteristic diagram showing the relationship between parallax and distance.
Fig. 7 shows the relationship between the parallax and the road-surface distance to each near and far feature point; (a) is a plan view and (b) is a side view.
Fig. 8 shows (a) an explanatory diagram of the image captured by the main camera, (b) an explanatory diagram of the image captured by the sub camera, and (c) an explanatory diagram of the parallaxes at the near and far road-surface distances captured by both cameras.
Fig. 9 shows explanatory diagrams corresponding to Fig. 8 according to another aspect.
Fig. 10 is an explanatory diagram showing, in real coordinates according to the second embodiment, the camera height above the road surface on which the rear wheels of a near preceding vehicle rest and above the road surface on which the rear wheels of a far preceding vehicle rest.
Fig. 11 shows (a) an explanatory diagram of an image in which the rear wheels of the preceding vehicle contact the road surface at the near road-surface distance, and (b) an explanatory diagram of an image in which they contact the road surface at the far road-surface distance.

An embodiment of the present invention will now be described with reference to the drawings.

[First Embodiment]
The stereo image processing device 1 shown in Fig. 1 is mounted on a moving body that travels on a road surface, such as a vehicle, of which the automobile is representative. In the following, the case where this stereo image processing device 1 is mounted on a vehicle 20 as the moving body is described as an example.

The stereo image processing device 1 has an image processing unit 2 built around a computer, and an on-vehicle camera 5 is connected to the input side of the image processing unit 2 via A/D converters 3 and 4.

The on-vehicle camera 5 is a stereo camera consisting of a main camera 5a and a sub camera 5b, each incorporating an image sensor 6a, 6b (see Fig. 7) such as a CCD or CMOS sensor as the imaging means. The pair of cameras 5a, 5b are mounted at the front of the ceiling inside the cabin, at positions equally spaced to the right and left of the center in the vehicle-width direction, and capture stereo images of the environment outside the vehicle from different viewpoints.

In this embodiment the main camera 5a is arranged on the right side and captures the reference image (right image) required for stereo image processing, while the sub camera 5b, arranged on the left side, captures the comparison image (left image). The captured pair of left and right analog images, synchronized with each other, are converted to digital images by the A/D converters 3 and 4 and sent to the image processing unit 2.

The image processing unit 2 comprises an image correction unit 11, a stereo image processing unit 12 as the stereo image processing means, a parallax correction processing unit 13, a distance data storage unit 14, an image data storage unit 15, and an image recognition unit 16.

The image correction unit 11 performs luminance correction, geometric transformation, and the like on the image data. Normally there is some error in the mounting positions of the pair of left and right cameras 5a, 5b, which produces a corresponding deviation between the left and right images. To correct this deviation, geometric transformations such as rotation and translation of the images are applied using affine transformation or the like. Through this image correction processing, reference image data is generated from the image data of the main camera 5a, and comparison image data is generated from the image data of the sub camera 5b.

Both sets of image data are sent to the stereo image processing unit 12. Based on the reference image data and the comparison image data, for the captured image corresponding to one frame, the stereo image processing unit 12 calculates the distance (distance data) D to an object from the parallax dp for the same object by the principle of triangulation:

D = f·L/dp …(1)

where f is the focal length and L is the distance (baseline length) between the mutually parallel optical axes of the left and right cameras 5a, 5b. Note that D is the distance from the position of the focal length f to the object.

The parallax dp is the horizontal displacement between the reference image data and the comparison image data capturing the same object; on the image it is obtained by multiplying the number of parallax pixels x by the pixel pitch ωi [mm/pix] (dp = x·ωi). The distance data is then sent to the parallax correction processing unit 13.
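The conversion dp = x·ωi and equation (1) above can be sketched as follows. This is a minimal illustration in Python; the focal length, baseline, and pixel-pitch values are hypothetical examples, not values taken from this patent:

```python
def parallax_mm(x_pixels: float, omega_i: float) -> float:
    """dp = x * omega_i: horizontal pixel offset -> parallax on the sensor [mm]."""
    return x_pixels * omega_i

def distance_from_parallax(f_mm: float, baseline_mm: float, dp_mm: float) -> float:
    """Equation (1): D = f * L / dp (triangulation with parallel optical axes)."""
    if dp_mm <= 0.0:
        raise ValueError("parallax must be positive")
    return f_mm * baseline_mm / dp_mm

# Hypothetical values: f = 5 mm, baseline L = 350 mm, 50-pixel offset, 0.005 mm/pix.
dp = parallax_mm(50, 0.005)                  # 0.25 mm
D = distance_from_parallax(5.0, 350.0, dp)   # 7000 mm, i.e. 7 m ahead
```

As the inverse relationship shows, halving the parallax doubles the computed distance, which is why far-range measurements are the more sensitive to a parallax offset.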

As shown in Fig. 2, the parallax correction processing unit 13 comprises a parallax offset calibration unit 13a and a position correction unit 13b. Taking the focal length f as the reference, the parallax offset calibration unit 13a extracts two common feature portions from the road-surface images at a preset near-side road-surface distance (hereinafter "near road-surface distance") Da and far-side road-surface distance (hereinafter "far road-surface distance") Db ahead of it, and obtains from each feature portion, by stereo matching, the road-surface parallaxes at the near road-surface distance Da (hereinafter "near road-surface parallaxes") md1, md2 and the road-surface parallaxes at the far road-surface distance Db (hereinafter "far road-surface parallaxes") nd1, nd2. It then calculates the parallax error gd of the average of the far road-surface parallaxes nd1, nd2 relative to the average of the near road-surface parallaxes md1, md2, and with this parallax error gd calibrates the parallax offset value DP, which is preset to correct the error caused by the deviation in parallelism between the optical axes of the two cameras 5a, 5b.

The position correction unit 13b corrects the parallax dp set by the stereo image processing unit 12 with the parallax offset value DP, and obtains the distance D of the object from the corrected parallax dp. The data of the distance D to the object and the image data corresponding to that distance data are then stored in the distance data storage unit 14 and the image data storage unit 15, respectively.

The image recognition unit 16 recognizes objects using the image data stored in the image data storage unit 15, and recognizes each object's three-dimensional position from the distance data stored in the distance data storage unit 14. The three-dimensional position information of objects recognized by the image recognition unit 16 is read out as appropriate for the purpose of use. For example, when the stereo image processing device 1 is mounted on the vehicle 20, the objects to be recognized include the lane lines delimiting the left and right of the lane in which the vehicle 20 travels, preceding vehicles, and three-dimensional objects such as traffic lights.

The calibration of the parallax offset value DP computed by the parallax offset calibration unit 13a described above will now be explained in more detail.

As shown in Fig. 5, when the optical axes of the cameras 5a and 5b are parallel to the road surface, the vertical height from the road surface to the optical axis (hereinafter "camera height") H is the same at whatever road-surface distance it is measured. Also, as shown in Fig. 6, the parallax is larger at near range and smaller at far range, in an inversely proportional relationship approaching a parallax of 0 at infinity.

Therefore, if the near camera height Ha, the vertical height from the road surface at the near road-surface distance Da to the optical axis, differs from the far camera height Hb, the vertical height from the road surface at the far road-surface distance Db to the optical axis, it can be inferred that a parallax error has occurred. Since the far camera height Hb is affected by the parallax error more strongly than the near camera height Ha, the near camera height Ha is taken as the reference (correct value): the parallax used in calculating the far camera height Hb is corrected, and the parallax error gd for calibrating the parallax offset value DP is obtained from the difference between the corrected parallax and the far-side parallax.

The calibration of the parallax offset value DP executed by this parallax offset calibration unit 13a is performed according to the parallax offset calibration processing routine shown in Figs. 3 and 4. This routine is executed on the images captured by the pair of left and right cameras 5a, 5b every frame, or every predetermined number of frames.

When this routine starts, first, in step S1, the parallax-offset-calibration start-condition determination process is executed. Since this embodiment presupposes that the camera height H is unchanged, the start condition is not satisfied if any pitching, yawing, or rolling is detected in the behavior of the vehicle 20. In addition, because the road surface and the optical axis must remain parallel, the road must be flat and straight at least from the vehicle 20 out to the far road-surface distance Db. Accordingly, the start condition is not satisfied if a break line indicating the start or end of a grade is detected ahead, if the start point of a curve is detected, or if, while traveling uphill, the road ends short of the far road-surface distance Db (the uphill section ends).

The process then proceeds to step S2, which checks whether all of the start conditions evaluated in step S1 are satisfied. If even one is not satisfied, the start condition is judged unsatisfied and the routine exits. If all the start conditions are satisfied, the start condition is judged satisfied and the process proceeds to step S3.

In step S3, a one-frame image captured by the pair of left and right cameras 5a, 5b is read in, and the vertical coordinate of the point at infinity is set. Fig. 8(a) shows the image captured by the main camera 5a (right image), and Fig. 8(b) the image captured by the sub camera 5b (left image). In both images the origin is at the lower left, the vertical direction is the J coordinate, the horizontal direction is the I coordinate, and coordinates are expressed in pixels.

The same figure shows, as an example image, a straight road with one lane in each direction. The vehicle drives on the left, and the image shows, after binarization by luminance difference, the center line L1, the roadside lane line L2 of the driving lane, the roadside lane line L3 of the oncoming lane, and the edges Pa, Pb, Qa, Qb of a plurality of (two in this embodiment) feature portions on the road surface. These feature portions are not particularly limited, as long as a luminance difference arises between the road surface and an edge such as a tire mark, dent, puddle, or patch of remaining snow formed on the road surface.

The infinity vertical coordinate set as the point at infinity on the J (vertical) coordinate (hereinafter the "vertical coordinate JV") coincides with the horizon, because the optical axes of the cameras 5a, 5b are parallel to the road surface. In this case, the point at infinity is set, for example, by detecting the point at which the center line L1 and the left and right roadside lane lines L2, L3 vanish on the optical axis.

Next, in step S4, it is checked whether the vertical coordinate JV has been set; if it could not be set, the routine exits. If the vertical coordinate JV has been set, the process proceeds to step S5.

In step S5, the near J coordinate ja corresponding to the near road-surface distance Da is set on the J coordinate, and in step S6 the far J coordinate jb corresponding to the far road-surface distance Db is set on the J coordinate. In this embodiment the near road-surface distance Da is set to about 10–14 m and the far road-surface distance Db to about 40 m, but the values are not limited to these; optimal values are set according to the type of on-vehicle camera 5, its mounting state, and so on.

However, since the camera height (optical-axis height) H is the same wherever it is measured, and the near camera height Ha and the far camera height Hb are the same fixed height, the set road-surface distances Da, Db do not necessarily have to be determined precisely. The near J coordinate ja and the far J coordinate jb are set, for example, from the depression angles θja, θjb set from the road-surface distances Da, Db, as shown in Fig. 7(b). The processing of steps S3 to S6 corresponds to the vertical coordinate setting means of the present invention.
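The step of deriving a row coordinate from a depression angle can be sketched under a simple pinhole model. The model, the function names, and the numeric values below are assumptions for illustration, not details given in the patent:

```python
import math

def depression_angle(height_mm: float, distance_mm: float) -> float:
    """Depression angle from the optical axis down to the road point at the given distance."""
    return math.atan2(height_mm, distance_mm)

def rows_below_horizon(f_mm: float, omega_j: float, theta: float) -> float:
    """Vertical pixel offset below the horizon row JV for depression angle theta."""
    return f_mm * math.tan(theta) / omega_j

# Hypothetical values: camera height 1.2 m, near distance ~12 m, f = 5 mm, 0.005 mm/pix rows.
delta_ja = rows_below_horizon(5.0, 0.005, depression_angle(1200.0, 12000.0))  # 100.0 rows
```

The near J coordinate would then be offset from JV by delta_ja rows; the direction of the offset depends on the image origin convention.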

The process then proceeds to step S7, where the near road-surface parallaxes md1, md2 of the edges Pa, Qa of the two feature portions on the near J coordinate ja of the images captured by the two cameras 5a, 5b are calculated from equation (2); in the following step S8, the far road-surface parallaxes nd1, nd2 of the edges Pb, Qb of the two feature portions on the far J coordinate jb are calculated from equation (3). The processing of steps S7 and S8 corresponds to the road-surface parallax computing means of the present invention.

md1, md2 = f·L/Da …(2)
nd1, nd2 = f·L/Db …(3)

In this embodiment the edges Pa, Qa and Pb, Qb at two different positions are obtained on the J coordinates ja, jb; however, since averaging over the edges of several feature portions yields a more accurate parallax, the parallax may instead be obtained by averaging the edges of three or more feature portions on each of the J coordinates ja, jb.

The process then proceeds to step S9, and the near camera height Ha [mm] is calculated from

Ha = (L·(ja−JV)·ωj)/(ωi·(md1+md2)/2) …(4)

where ωj is the pixel pitch in the J coordinate direction [mm/pix] and ωi is the pixel pitch in the I coordinate direction [mm/pix]. The processing of this step corresponds to the near vertical height computing means of the present invention.

Next, in step S10, the far camera height Hb [mm] is calculated from

Hb = (L·(jb−JV)·ωj)/(ωi·(nd1+nd2)/2) …(5)
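The camera-height relation above, together with the corrected far parallax and the error gd described earlier, can be sketched as follows. This is a minimal sketch: parallaxes are treated in pixels, delta_j stands for the row offset |j − JV|, and all numeric values and the sign convention of gd are hypothetical illustrations:

```python
def camera_height(baseline_mm, delta_j, omega_j, omega_i, parallax_px):
    """H = (L * delta_j * omega_j) / (omega_i * parallax): road-to-axis height [mm]."""
    return (baseline_mm * delta_j * omega_j) / (omega_i * parallax_px)

def corrected_far_parallax(baseline_mm, delta_jb, omega_j, omega_i, ha_mm):
    """Far-row parallax implied by the near camera height Ha [pixels]."""
    return (baseline_mm * delta_jb * omega_j) / (omega_i * ha_mm)

# Hypothetical setup: baseline 350 mm, square 0.005 mm pixels, near row 60 rows
# and far row 20 rows below the horizon, averaged near parallax 17.5 px.
Ha = camera_height(350.0, 60, 0.005, 0.005, 17.5)                  # 1200.0 mm
nd_expected = corrected_far_parallax(350.0, 20, 0.005, 0.005, Ha)
gd = 5.9 - nd_expected  # measured far parallax (hypothetical) minus the implied value
```

With no optical-axis shift the measured far parallax would equal nd_expected and gd would be zero; a persistent nonzero gd is what calibrates the parallax offset value DP.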

The process then proceeds to step S11, where the data obtained this time (JV, ja, jb, md1, md2, nd1, nd2, Ha, Hb) are saved to memory, and then to step S12, which checks whether the number of data samples is at least a predetermined value. If so, the process proceeds to step S13; if the predetermined value has not been reached, the routine exits and data is sampled again.

In step S13, a near-side first-order approximation line is created by the least squares method from the near road-surface parallaxes md1, md2 and the near camera height Ha, a far-side first-order approximation line is created by the least squares method from the far road-surface parallaxes nd1, nd2 and the far camera height Hb, and the process proceeds to step S14. Step S14 checks whether both approximation lines, that is, their intercepts and slopes, were obtained. If an approximation line could not be created, the data is judged invalid, the process branches to step S15, the current data is discarded, and the routine exits. If both the near-side and the far-side approximation lines were created, the data is judged valid and the process proceeds to step S16. The processing of steps S11 to S15 corresponds to the data determination means of the present invention.
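The least-squares fitting and validity check of steps S13 to S15 could look like the sketch below. The patent does not give the fitting procedure, so returning None to model the "no slope and intercept obtainable" case is an assumption, and the sample values are hypothetical:

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b; returns (a, b), or None when no line exists."""
    n = len(xs)
    if n < 2:
        return None
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    if sxx == 0.0:  # all x identical: slope undefined -> discard the data (step S15)
        return None
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return a, my - a * mx

# Hypothetical near-side samples of (parallax [px], camera height [mm]).
near_samples = [(17.4, 1198.0), (17.5, 1200.0), (17.6, 1202.0)]
near_fit = fit_line([p for p, _ in near_samples], [h for _, h in near_samples])
```

The same fit would be run on the far-side samples; only when both fits yield a slope and an intercept is the data kept and the process allowed to continue to step S16.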

Then, in step S16, the data are divided into near road surface data (ja, md1, md2, Ha) and far road surface data (JV, jb, nd1, nd2, Hb) and stored in memory.

Thereafter, the process proceeds to step S17, where it is checked whether the number of entries stored as near road surface data and far road surface data has reached a predetermined number. If it has not, the routine is exited and data accumulation continues. If the predetermined number has been reached, the process proceeds to step S18, where the parallax offset value calibration process is executed, and the routine is exited.

In step S18, a frequency distribution (histogram) of the variation is first created for the data ja, md1, md2, Ha stored as near road surface data and the data JV, jb, nd1, nd2, Hb stored as far road surface data, and the most frequent values of ja, md1, md2, Ha, JV, jb, nd1, nd2, Hb are extracted.
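The mode extraction in step S18 could be sketched as follows (a minimal illustration; the bin width and function name are assumptions, not specified by the patent):

```python
import numpy as np

def most_frequent_value(samples, bin_width):
    """Step S18 sketch: build a frequency distribution (histogram) of the
    stored samples and return the centre of the most frequent bin."""
    samples = np.asarray(samples, dtype=float)
    lo, hi = samples.min(), samples.max() + bin_width
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, edges = np.histogram(samples, bins=edges)
    k = int(np.argmax(counts))
    return 0.5 * (edges[k] + edges[k + 1])  # centre of the modal bin
```

Applying this separately to each stored quantity (ja, md1, md2, Ha, JV, jb, nd1, nd2, Hb) yields the most frequent value of each, suppressing outlier samples.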

Incidentally, as shown in FIG. 7, the near road surface parallax md1 (md2) on the image sensors 6a, 6b is larger than the far road surface parallax nd1 (nd2), so the influence of the parallax error gd on the far road surface parallax nd1 (nd2) is greater than on the near road surface parallax md1 (md2). Furthermore, when the camera heights Ha and Hb are calculated, the near camera height Ha is derived from a closer point with a larger parallax than the far camera height Hb, so its accuracy is higher and it lies closer to the true value (camera height H).

Therefore, since the near camera height Ha can be estimated to be a more reliable value than the far camera height Hb, the near camera height Ha is substituted for the far camera height Hb in equation (4) above, and the far road surface parallax at the far road surface distance Db (the corrected far road surface parallax) Cd [pix] is calculated from
Cd=(L・(jb−JV)・ωj)/(ωi・Ha) …(5)
This process corresponds to the corrected far road surface parallax calculating means of the present invention.
Next, based on the corrected far road surface parallax Cd and the average value of the two far road surface parallaxes nd1, nd2, the parallax error gd [pix] is calculated from
gd=Cd−(nd1+nd2)/2 …(6)

Then, the latest parallax offset value DP(n) is calibrated with the parallax error gd to set a new parallax offset value DP (DP←DP(n)−gd), and this parallax offset value DP is updated successively. This processing corresponds to the parallax error calculating means of the present invention. When gd=0, the current parallax offset value DP(n) already indicates the correct value.
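Equations (5), (6) and the subsequent offset update DP←DP(n)−gd can be sketched together as follows (a minimal illustration; the function name and the numeric test values are assumptions, not from the patent):

```python
def calibrate_parallax_offset(DP_n, L, jb, JV, omega_i, omega_j, Ha, nd1, nd2):
    """Sketch of equations (5), (6) and the offset update.

    Ha: near camera height [mm], substituted for Hb as the more
    reliable value. Returns (Cd, gd, DP_new).
    """
    Cd = (L * (jb - JV) * omega_j) / (omega_i * Ha)  # eq. (5)
    gd = Cd - (nd1 + nd2) / 2.0                      # eq. (6)
    DP_new = DP_n - gd                               # DP <- DP(n) - gd
    return Cd, gd, DP_new
```

When the measured far parallaxes already agree with the corrected value, gd comes out as 0 and the offset is left unchanged, matching the gd=0 remark above.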

In this embodiment, since the parallax offset value DP is successively calibrated with the parallax error gd, the parallax offset value DP, which changes constantly under the influence of weather, temperature, barometric pressure, aging, and the like, can always be calibrated correctly. Moreover, in this embodiment, characteristic portions are extracted from images of the road surface at the near road surface distance Da and the far road surface distance Db, and the parallax error gd is obtained from their parallaxes. There is therefore no need to repeatedly detect a fixed stationary object as in conventional methods; the opportunities for calibrating the parallax offset value DP increase, and stereo matching accuracy can be improved.

In the embodiment described above, the characteristic portions are non-artificial or naturally occurring features formed on the road surface, such as tire marks, dents, puddles, and residual snow. However, as in the right image shown in FIG. 9(a) and the left image shown in FIG. 9(b), the center line L1 and the roadside lane line L2 that demarcate the traveling lane may instead be extracted as characteristic portions. In that case, the inner edge Qr of the center line L1 and the inner edge Pr of the roadside lane line L2 extracted from the right image, and the inner edge Ql of the center line L1 and the inner edge Pl of the roadside lane line L2 extracted from the left image, are detected from the luminance difference, and, as shown in FIG. 9(c), the near road surface parallaxes md1, md2 on the near J coordinate ja and the far road surface parallaxes nd1, nd2 on the far J coordinate jb are obtained.
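The luminance-difference edge detection on one image row could look like the following sketch (assuming a bright marking on darker asphalt; the function name and threshold are illustrative, not from the patent):

```python
import numpy as np

def rising_edge_columns(row, threshold):
    """Locate lane-marking edges on one image row (one J coordinate):
    return the column indices where luminance rises by more than
    `threshold` between adjacent pixels (road -> bright marking)."""
    row = np.asarray(row, dtype=float)
    diff = np.diff(row)                 # luminance difference between neighbours
    return np.flatnonzero(diff > threshold) + 1
```

Running this on the same row (ja or jb) of the left and right images and pairing the detected columns gives the pixel disparity of each edge.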

[Second Embodiment]
FIG. 11 shows a second embodiment of the present invention. In the first embodiment described above, the parallax error gd is obtained by extracting characteristic portions formed on the road surface located at the near road surface distance Da and on the road surface located at the far road surface distance Db. In contrast, in this embodiment, when the left and right rear wheels of a near preceding vehicle 31 traveling directly ahead are at the near road surface distance Da, the road surface on which these wheels are in contact is extracted as the characteristic portion, and the near road surface parallaxes md1, md2 shown in FIG. 8(c) are obtained from the inner edges (or outer edges) Pa, Qa indicating the boundary between the left and right rear wheels and the road surface.

Similarly, when the left and right rear wheels of a far preceding vehicle 32 traveling directly ahead are at the far road surface distance Db, the road surface on which these wheels are in contact is extracted as the characteristic portion, and the far road surface parallaxes nd1, nd2 shown in FIG. 8(c) are obtained from the inner edges (or outer edges) Pb, Qb. The far preceding vehicle 32 need not be the same vehicle as the near preceding vehicle 31 described above; the far road surface parallaxes nd1, nd2 can be obtained even from a different vehicle.

In FIG. 10, the far preceding vehicle 32 is shown ahead of the near preceding vehicle 31 for convenience. However, when the on-vehicle camera 5 recognizes the far preceding vehicle 32, no other vehicle such as the near preceding vehicle 31 is present between the vehicle 20 and the far preceding vehicle 32; accordingly, the near road surface parallaxes md1, md2 and the far road surface parallaxes nd1, nd2 are detected at different times.

As described above, in this embodiment the road surface on which the left and right rear wheels of the near preceding vehicle 31 traveling at the near road surface distance Da and of the far preceding vehicle 32 traveling at the far road surface distance Db are in contact is extracted as the characteristic portion, so the parallax error gd can be obtained even when no characteristic portion is formed on the road surface itself. Accordingly, even on a road surface from which it is difficult to extract characteristic portions, such as a newly laid paved road, the parallax error gd can easily be obtained by detecting the preceding vehicles 31, 32.

1 … stereo image processing device,
2 … image processing unit,
5 … on-vehicle camera,
5a … main camera,
5b … sub camera,
6a, 6b … image sensors,
11 … image correction unit,
12 … stereo image processing unit,
13 … parallax correction processing unit,
13a … parallax offset calibration unit,
13b … position correction unit,
14 … distance data storage unit,
15 … image data storage unit,
16 … image recognition unit,
20 … vehicle,
31 … near preceding vehicle,
32 … far preceding vehicle,
Cd … corrected far road surface parallax,
Da … near road surface distance,
Db … far road surface distance,
DP … parallax offset value,
f … focal length,
gd … parallax error,
Ha … near camera height,
Hb … far camera height,
ja … near J coordinate,
jb … far J coordinate,
JV … infinity-point vertical coordinate,
L … baseline length,
L1 … center line,
L2 … roadside lane line,
L3 … roadside lane line,
md1, md2 … near road surface parallax,
nd1, nd2 … far road surface parallax,
Pa, Pb, Qa, Qb … edges,
Pl, Pr, Ql, Qr … inner edges,
x … number of parallax pixels,
θja, θjb … depression angles,
ωi, ωj … pixel pitch

Claims (4)

1. A stereo image processing apparatus comprising:
a pair of left and right imaging means mounted on a moving body and imaging the scene ahead with their optical axes set parallel to the road surface;
vertical coordinate setting means for setting, in a pair of images captured by the pair of imaging means, an infinity-point vertical coordinate coinciding with the optical axis, and a near vertical coordinate and a far vertical coordinate located at a preset near road surface distance and a preset far road surface distance, respectively;
road surface parallax calculating means for extracting a characteristic portion on the road surface located at the near road surface distance in the pair of images to obtain a near road surface parallax, and extracting a characteristic portion on the road surface located at the far road surface distance to obtain a far road surface parallax;
near vertical height calculating means for dividing, by the near road surface parallax, a value obtained by multiplying the baseline length between the pair of imaging means by the distance from the infinity-point vertical coordinate to the near vertical coordinate, to obtain the near vertical height from the road surface at the near road surface distance to the optical axis;
corrected far road surface parallax calculating means for dividing, by the near vertical height, a value obtained by multiplying the baseline length between the pair of imaging means by the distance from the infinity-point vertical coordinate to the far vertical coordinate, to obtain a corrected far road surface parallax; and
parallax error calculating means for obtaining, based on the difference between the corrected far road surface parallax and the far road surface parallax, a parallax error for calibrating a parallax offset value set based on a parallelism deviation between the optical axes of the pair of imaging means.

2. The stereo image processing apparatus according to claim 1, further comprising data determination means for sampling a predetermined number of at least the near road surface parallaxes and the far road surface parallaxes obtained by the road surface parallax calculating means, determining the validity of the sampled data, and, when the data are determined to be invalid, discarding the data and not executing the corrected far road surface parallax calculating means and the parallax error calculating means.

3. The stereo image processing apparatus according to claim 1 or 2, wherein the road surface parallax calculating means takes, as the near road surface parallax, the average of near road surface parallaxes obtained from at least two characteristic portions at the near road surface distance, and, as the far road surface parallax, the average of far road surface parallaxes obtained from at least two characteristic portions at the far road surface distance.

4. The stereo image processing apparatus according to any one of claims 1 to 3, wherein the road surface parallax calculating means obtains the near road surface parallax and the far road surface parallax based on common portions of the pair of images where a luminance difference occurs between the edge of the characteristic portion and the road surface.
JP2015167088A 2015-08-26 2015-08-26 Stereo image processing device Active JP6521796B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015167088A JP6521796B2 (en) 2015-08-26 2015-08-26 Stereo image processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015167088A JP6521796B2 (en) 2015-08-26 2015-08-26 Stereo image processing device

Publications (2)

Publication Number Publication Date
JP2017044573A JP2017044573A (en) 2017-03-02
JP6521796B2 true JP6521796B2 (en) 2019-05-29

Family

ID=58211797

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015167088A Active JP6521796B2 (en) 2015-08-26 2015-08-26 Stereo image processing device

Country Status (1)

Country Link
JP (1) JP6521796B2 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023010136A (en) 2021-07-09 2023-01-20 株式会社Subaru Stereo camera device and control device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4172554B2 (en) * 1998-03-12 2008-10-29 富士重工業株式会社 Stereo camera adjustment device
JP4573977B2 (en) * 1999-09-22 2010-11-04 富士重工業株式会社 Distance correction device for monitoring system and vanishing point correction device for monitoring system
JP2001169310A (en) * 1999-12-06 2001-06-22 Honda Motor Co Ltd Distance detector
JP5280768B2 (en) * 2008-08-18 2013-09-04 本田技研工業株式会社 Vehicle periphery monitoring device
JP6141734B2 (en) * 2013-09-26 2017-06-07 株式会社Subaru Stereo image processing device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111272070A (en) * 2020-03-05 2020-06-12 南京华捷艾米软件科技有限公司 Structured light reference image acquisition device and method
CN111272070B (en) * 2020-03-05 2021-10-19 南京华捷艾米软件科技有限公司 Structured light reference image acquisition device and method

Also Published As

Publication number Publication date
JP2017044573A (en) 2017-03-02

Similar Documents

Publication Publication Date Title
US10834324B2 (en) Image distortion correction of a camera with a rolling shutter
US8417022B2 (en) Stereo-image processing apparatus
US9378553B2 (en) Stereo image processing device for vehicle
US9697421B2 (en) Stereoscopic camera apparatus
US8824741B2 (en) Method for estimating the roll angle in a travelling vehicle
US9912933B2 (en) Road surface detection device and road surface detection system
JP6779365B2 (en) Object detection device and vehicle
JP6302519B2 (en) Vehicle driving support device
US10235579B2 (en) Vanishing point correction apparatus and method
JP6622664B2 (en) Self-vehicle position specifying device and self-vehicle position specifying method
JPH09133525A (en) Distance measuring device
JP5421819B2 (en) Lane recognition device
WO2017022080A1 (en) Step detection device and step detection method
JP6521796B2 (en) Stereo image processing device
US20200193184A1 (en) Image processing device and image processing method
JP2013092820A (en) Distance estimation apparatus
JP6044084B2 (en) Moving object position and orientation estimation apparatus and method
JP6141734B2 (en) Stereo image processing device
US10339394B2 (en) Step detection device and step detection method
JP2018136739A (en) Calibration device
CN111316322B (en) Road surface area detection device
JP2018036225A (en) State estimation device
JP6072508B2 (en) Road surface step detection method, road surface step detection device, and vehicle equipped with road surface step detection device
WO2020036039A1 (en) Stereo camera device
JP6451544B2 (en) Road boundary detection device, self-position estimation device, and road boundary detection method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20180510

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20190228

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20190402

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190423

R150 Certificate of patent or registration of utility model

Ref document number: 6521796

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
