JP2008040965A - Road surface estimation device - Google Patents

Road surface estimation device

Info

Publication number: JP2008040965A
Application number: JP2006217040A
Authority: JP (Japan)
Prior art keywords: road surface, three-dimensional object, camera, vehicle, image
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Japanese (ja)
Other versions: JP4754434B2 (en)
Inventors: Akihito Kimata (木俣 亮人), Akio Takahashi (高橋 昭夫)
Current and original assignee: Honda Motor Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Honda Motor Co Ltd; priority to JP2006217040A; application granted; published as JP2008040965A and, on grant, as JP4754434B2
Current legal status: Expired - Fee Related

Abstract

PROBLEM TO BE SOLVED: To provide a road surface estimation device 3 capable of quickly estimating a road surface with high accuracy.

SOLUTION: The road surface estimation device is provided with a three-dimensional object detection means 12 for detecting a three-dimensional object in front of the host vehicle 1, a lower end position detection means 14 for detecting the lower end position of the detected three-dimensional object, a temporary road surface calculation means 16 for calculating a temporary road surface from the lower end position of the three-dimensional object and a reference position of the host vehicle 1, and an actual road surface estimation means 18 for estimating the actual road surface on the basis of the temporary road surface.

COPYRIGHT: (C) 2008, JPO&INPIT

Description

The present invention relates to a road surface estimation device.

A technique has been developed for capturing images around a vehicle with a camera and recognizing objects in those images. In particular, as a prerequisite for recognizing pedestrians and other objects on the road surface, a method for recognizing the position of the road surface is needed.
Patent Document 1 proposes a method of extracting the road vanishing point where two parallel road lines meet and, from the coordinates of that vanishing point, calculating posture parameters of the camera with respect to the road, such as its yaw angle and pitch angle. Non-Patent Document 1 proposes a method that uses stereo video to estimate the projective transformation between images while simultaneously extracting the region corresponding to the planar road surface in the scene.
Japanese Patent No. 2544898; Seki, Okutomi, "Obstacle Detection by Extracting Planar Regions Using Stereo Video", Transactions of Information Processing Society of Japan: Computer Vision and Image Media, December 2004, Vol. 45, No. SIG13 (CVIM10)

However, although the method of Patent Document 1 is fast, it requires two parallel road lines in the image in order to extract the road vanishing point, so the scenes to which it can be applied are limited. The method of Non-Patent Document 1 can detect the planar road region accurately, but it has many parameters to optimize and is therefore computationally heavy.
Accordingly, an object of the present invention is to provide a road surface estimation device that can estimate the road surface quickly and accurately.

In order to solve the above problem, the invention according to claim 1 comprises: means (for example, the three-dimensional object detection means 12 in the embodiment) for detecting a three-dimensional object (for example, the support 50 in the embodiment) in front of the host vehicle (for example, the host vehicle 1 in the embodiment); means (for example, the lower end position detection means 14 in the embodiment) for detecting the lower end position (for example, the lower end position 54 in the embodiment) of the detected three-dimensional object; means (for example, the temporary road surface calculation means 16 in the embodiment) for obtaining a temporary road surface (for example, the plane at the temporary pitch angle θn in the embodiment) from the lower end position of the three-dimensional object and a reference position of the host vehicle (for example, the camera position in the embodiment); and means (for example, the actual road surface estimation means 18 in the embodiment) for estimating the actual road surface (for example, the plane at the actual pitch angle θ in the embodiment) based on the temporary road surface.

According to the invention of claim 1, a temporary road surface can be obtained quickly from the lower end position of a three-dimensional object and the reference position of the host vehicle, and the actual road surface can be estimated accurately even when the host vehicle is inclined.

Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
FIG. 1 is a schematic configuration diagram of a road surface estimation device according to this embodiment. The road surface estimation device 3 includes stereo cameras 5L and 5R for capturing images ahead of the vehicle. The stereo cameras 5L and 5R are CCD cameras or the like and are attached to the ceiling inside the windshield. The left camera 5L is disposed near the rear-view mirror at the center in the vehicle width direction, and the right camera 5R is disposed at the right end in the vehicle width direction. Alternatively, the right camera 5R may be disposed near the rear-view mirror at the center in the vehicle width direction, with the left camera 5L at the left end. The distance between the left camera 5L and the right camera 5R is set to about 30 to 40 cm.

Images captured by the stereo cameras 5L and 5R are input to a computer 10 mounted on the vehicle. The computer 10 implements a three-dimensional object detection means 12 that detects a three-dimensional object in front of the host vehicle 1, a lower end position detection means 14 that detects the lower end position of the detected three-dimensional object, a temporary road surface calculation means 16 that obtains a temporary road surface from the lower end position of the three-dimensional object and the position of the camera 5, and an actual road surface estimation means 18 that estimates the actual road surface based on the temporary road surface.

The operation of the road surface estimation device 3 according to this embodiment will be described with reference to FIGS. 1 to 7.
FIG. 2 is a flowchart of the road surface estimation method. First, images in front of the vehicle are captured by the stereo cameras 5L and 5R (S20).
FIG. 3 shows an example of an image captured by the stereo cameras 5L and 5R. The captured images are output to the three-dimensional object detection means 12 of the computer 10.

Next, the three-dimensional object detection means 12 extracts vertically long edges in the image (S22). Specifically, the entire image is first scanned, the luminances of horizontally adjacent pixels are compared, and pairs of pixels with a large luminance difference are extracted. Next, the brighter pixel of each pair is compared with the pixel immediately below it. If the luminance difference is small, the high-luminance pixel column is judged to continue downward, and the comparison proceeds to the next pixel below. If the luminance difference is large, that pixel is judged to be the lower end of the high-luminance pixel column. When the length of a detected high-luminance pixel column exceeds a predetermined value, that column is extracted as a vertically long edge.
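The scan in S22 can be sketched as follows. This is a minimal illustration, not the patented implementation: the threshold values, the minimum length, the edge tuple format, and the function name are assumptions for the example.

```python
import numpy as np

def extract_vertical_edges(gray, diff_thresh=40, min_length=20):
    """Sketch of step S22: find vertically long high-luminance pixel columns.

    gray: 2-D uint8 image. Returns a list of (col, top_row, bottom_row)
    tuples; bottom_row is the lower-end pixel of the column.
    """
    h, w = gray.shape
    img = gray.astype(np.int32)  # avoid uint8 wrap-around when differencing
    edges = []
    for x in range(w - 1):
        y = 0
        while y < h:
            # a horizontally adjacent pair with a large luminance difference
            if abs(img[y, x] - img[y, x + 1]) > diff_thresh:
                bright_x = x if img[y, x] > img[y, x + 1] else x + 1
                top = y
                # follow the bright column downward while luminance stays similar;
                # a large jump marks the lower end of the column
                while y + 1 < h and abs(img[y + 1, bright_x] - img[y, bright_x]) <= diff_thresh:
                    y += 1
                if y - top + 1 > min_length:
                    edges.append((bright_x, top, y))
            y += 1
    return edges
```

Note that a one-pixel-wide bright stripe is reported from both of its sides; a real implementation would deduplicate, which is omitted here for brevity.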

FIG. 4 shows the vertically long edges extracted from the image of FIG. 3. The contour of the guardrail support 50 in FIG. 3 is extracted as the vertically long edge 52 in FIG. 4. The contour of the building 60 is likewise extracted as the vertically long edge 62, and the contour of the building window 70 as the vertically long edge 72. In FIG. 4, the lower ends of vertically long edges located on the road surface are marked with black circles 54 and 64, and those not located on the road surface with the white circle 74.

Next, the three-dimensional object detection means 12 detects the vertically long edges that belong to three-dimensional objects (S23).
FIG. 5 illustrates the method of detecting the vertically long edges of a three-dimensional object. As shown in FIG. 5(a), the case where a planar pattern 90 of a given shape exists in addition to a three-dimensional object 80 is taken as an example. FIG. 5(b) is an image of the three-dimensional object 80 and the planar pattern 90 captured by the left camera 5L, and FIG. 5(c) is the corresponding image captured by the right camera 5R.

The contours 82 and 84 of the three-dimensional object 80 are extracted as vertically long edges in both the left camera 5L image and the right camera 5R image. The contours 92 and 94 of the planar pattern 90, by contrast, may be extracted as vertically long edges in the left camera 5L image but appear as obliquely extending edges in the right camera 5R image. Therefore, an edge that is extracted as a vertically long edge of substantially the same shape in both the left and right camera images is detected as a vertically long edge of a three-dimensional object 80.
Specifically, when a vertically long edge is extracted from the left camera 5L image, it is checked whether a vertically long edge of substantially the same shape exists at the corresponding position in the right camera 5R image. Since the positions of the left camera 5L and the right camera 5R are fixed, the corresponding positions in the two images are easy to determine. If a vertically long edge of substantially the same shape exists there, it is detected as a vertically long edge of the three-dimensional object 80.
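A minimal sketch of this consistency check: a left-image edge counts as belonging to a three-dimensional object only if an edge with nearly the same top and bottom rows appears within the horizontal (epipolar) search range in the right image. The edge tuple format, the disparity limit, and the row tolerance are illustrative assumptions, not values from the patent.

```python
def match_stereo_edges(left_edges, right_edges, max_disparity=64, row_tol=3):
    """Sketch of step S23: keep left-image vertical edges that reappear
    with substantially the same shape in the right image.

    Each edge is (col, top_row, bottom_row). Returns matched
    (left_edge, right_edge) pairs.
    """
    matches = []
    for lx, ltop, lbot in left_edges:
        for rx, rtop, rbot in right_edges:
            disparity = lx - rx  # right-image edge shifts left for nearer objects
            same_shape = abs(ltop - rtop) <= row_tol and abs(lbot - rbot) <= row_tol
            if 0 <= disparity <= max_disparity and same_shape:
                matches.append(((lx, ltop, lbot), (rx, rtop, rbot)))
                break  # take the first plausible correspondence
    return matches
```

An edge of a flat pattern that appears oblique in the right image yields no same-shape counterpart, so it is filtered out, which is the effect the text describes.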

Next, the lower end position detection means 14 detects the lower end position of each vertically long edge of a three-dimensional object (S24). Here, the position of the bottom pixel of the vertically long edge is detected as the lower end position. Alternatively, the position of the bottom pixel of the high-luminance pixel column detected in S22 may be recorded and read out as the lower end position of the vertically long edge.

FIG. 6 illustrates the method of calculating the temporary road surface (temporary pitch angle θn). When the vehicle 1 is at rest, the optical axis of the camera 5 is parallel to the actual road surface, and the camera 5 is fixed at a height h above it. In this case, the plane that is parallel to the optical axis of the camera 5 and located a distance h below the camera 5 is the actual road surface.

However, the vehicle may tilt at a pitch angle θ due to acceleration or deceleration of the vehicle 1, unevenness 2 of the actual road surface, and so on. The optical axis of the camera 5 then also tilts at the angle θ with respect to the actual road surface. In this case, the plane that forms the angle θ with the optical axis of the camera 5 and is located a distance h below the camera 5 is the actual road surface. In other words, finding the pitch angle θ of the vehicle is equivalent to finding the position of the actual road surface. The pitch angle θ is therefore computed for each image captured by the camera 5.
In this embodiment, a temporary pitch angle θn of the vehicle 1 is calculated for each vertically long edge in the image under the assumption that the lower end of the three-dimensional object corresponding to that edge lies on the actual road surface. Then, on the premise that the lower ends of three-dimensional objects most probably lie on the actual road surface, the actual pitch angle θ of the vehicle 1 is estimated from the multiple temporary pitch angles θn.

First, the temporary road surface calculation means 16 calculates a temporary road surface (temporary pitch angle θn) (S26). The specific procedure is as follows. First, the distance L from the camera 5 to the three-dimensional object 53 is calculated from the stereo image pair. Next, the distance H from the optical axis of the camera 5 to the lower end position 55 of the three-dimensional object 53 is calculated. If y is the distance in the image from the optical axis of the camera 5 to the lower end position 54 of the vertically long edge 52, and f is the distance from the camera 5 to its focal position Fo, then H can be calculated by Equation 1.

H = y · L / f    (Equation 1)

Next, the difference ΔH between the distance H from the optical axis of the camera 5 to the lower end position 55 of the three-dimensional object 53 and the camera height h is calculated by Equation 2.

ΔH = H − h    (Equation 2)

Using ΔH and L, the temporary pitch angle θn is calculated by Equation 3.

θn = tan⁻¹(ΔH / L)    (Equation 3)
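Equations 1 to 3 combine into a few lines of code. The function below is a sketch under the geometry stated in the text (edge lower end assumed on the road, L from stereo, camera height h); the function name and any numeric values used with it are illustrative.

```python
import math

def temporary_pitch_angle(y, f, L, h):
    """Temporary pitch angle θn from one edge lower end (Equations 1-3).

    y: image distance from the optical axis down to the edge lower end
    f: distance from the camera to the focal position Fo
    L: stereo-derived distance from the camera to the object
    h: camera height above the road when the vehicle is level
    """
    H = y * L / f                   # Eq. 1: height below the optical axis at range L
    delta_H = H - h                 # Eq. 2: deviation from the level-road geometry
    return math.atan2(delta_H, L)   # Eq. 3: θn = tan⁻¹(ΔH / L)
```

When the edge lower end lies exactly where a level road would put it (H = h), θn is zero; a non-zero ΔH tilts the temporary road surface accordingly.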

Next, the actual road surface estimation means 18 estimates the actual road surface (actual pitch angle θ) (S28). As shown in FIGS. 3 and 4, the lower end positions of three-dimensional objects are most likely to lie on the actual road surface. Possible methods of estimating the actual pitch angle θ from the multiple temporary pitch angles θn therefore include taking their mode, mean, or median. In this embodiment, the actual pitch angle θ of the vehicle is estimated by taking the mode of the temporary pitch angles θn (voting).

FIG. 7 illustrates the voting. First, ranges (bins) of the temporary pitch angle θn are set. In FIG. 7, bins 0.03 rad wide are set around θn = 0. Next, each temporary pitch angle θn calculated in S26 is assigned (voted) to a bin. In FIG. 7, the bin θn = −0.06 to −0.03 rad receives the most votes. The center value of that bin, −0.045 rad, is therefore adopted as the mode of the temporary pitch angles θn, i.e. as the actual pitch angle θ.
The plane that forms the actual pitch angle θ with the optical axis of the camera 5 and is located a distance h below the camera 5 is then estimated to be the actual road surface.
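The voting step can be sketched as a simple histogram. The 0.03 rad bin width matches FIG. 7; the tie-breaking behaviour and the list of θn values in the example are assumptions made for illustration.

```python
import math
from collections import Counter

def estimate_pitch_by_voting(theta_n_values, bin_width=0.03):
    """Sketch of step S28: the actual pitch angle θ is the centre of the
    most-voted θn bin. Bins are aligned so that θn = 0 is a bin edge,
    matching the 0.03 rad ranks of FIG. 7.
    """
    votes = Counter(math.floor(t / bin_width) for t in theta_n_values)
    best_bin, _ = votes.most_common(1)[0]
    return (best_bin + 0.5) * bin_width  # centre value of the winning bin
```

With votes concentrated in the −0.06 to −0.03 rad bin, as in FIG. 7, the function returns its centre, −0.045 rad.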

As described in detail above, the road surface estimation device 3 according to this embodiment shown in FIG. 1 comprises the three-dimensional object detection means 12 that detects a three-dimensional object in front of the host vehicle 1, the lower end position detection means 14 that detects the lower end position of the detected three-dimensional object, the temporary road surface calculation means 16 that obtains a temporary road surface from the lower end position of the three-dimensional object and a reference position of the host vehicle 1, and the actual road surface estimation means 18 that estimates the actual road surface based on the temporary road surface. With this configuration, a temporary road surface can be obtained easily from the lower end position of a three-dimensional object (for example, the lower end position 54 of the vertically long edge 52) and the camera position of the host vehicle 1 (for example, the stereo camera spacing, the height h, the optical axis direction, and the focal position Fo), and the actual road surface can be estimated accurately even when the host vehicle 1 is inclined.

It is also desirable to combine the road surface estimation device according to this embodiment with a device for recognizing pedestrians, obstacles, and the like. When a pedestrian or the like is recognized on the actual road surface estimated in this embodiment, a warning can be issued that the pedestrian may collide with the host vehicle.

The present invention is not limited to the embodiment described above.
For example, although the embodiment captures images in front of the vehicle with a stereo camera, images to the side of or behind the vehicle may instead be captured to estimate the actual road surface. Also, although the embodiment sets the optical axis of the camera parallel to the road surface when the vehicle is at rest, it may be set at a predetermined angle to the road surface.

FIG. 1 is a schematic configuration diagram of the road surface estimation device according to the embodiment.
FIG. 2 is a flowchart of the road surface estimation method.
FIG. 3 is an example of a captured image.
FIG. 4 shows the vertically long edges extracted from the image of FIG. 3.
FIG. 5 illustrates the method of detecting the vertically long edges of a three-dimensional object.
FIG. 6 illustrates the method of calculating the temporary road surface.
FIG. 7 illustrates the voting.

Explanation of symbols

1 … host vehicle; 3 … road surface estimation device; 5 … camera; 10 … computer; 12 … three-dimensional object detection means; 14 … lower end position detection means; 16 … temporary road surface calculation means; 18 … actual road surface estimation means

Claims (1)

A road surface estimation device comprising:
means for detecting a three-dimensional object in front of a host vehicle;
means for detecting a lower end position of the detected three-dimensional object;
means for obtaining a temporary road surface from the lower end position of the three-dimensional object and a reference position of the host vehicle; and
means for estimating an actual road surface based on the temporary road surface.
JP2006217040A 2006-08-09 2006-08-09 Road surface estimation device Expired - Fee Related JP4754434B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006217040A JP4754434B2 (en) 2006-08-09 2006-08-09 Road surface estimation device


Publications (2)

Publication Number Publication Date
JP2008040965A true JP2008040965A (en) 2008-02-21
JP4754434B2 JP4754434B2 (en) 2011-08-24

Family

ID=39175864

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010237810A (en) * 2009-03-30 2010-10-21 Mazda Motor Corp System and method for detecting moving object
JP2014139766A (en) * 2012-12-20 2014-07-31 Denso Corp Road surface shape estimation device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018092596A (en) 2016-11-30 2018-06-14 株式会社リコー Information processing device, imaging device, apparatus control system, mobile body, information processing method, and program
WO2018100971A1 (en) 2016-11-30 2018-06-07 Ricoh Company, Ltd. Information processing device, imaging device, apparatus control system, movable body, information processing method, and computer program product

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1062162A (en) * 1996-08-13 1998-03-06 Nissan Motor Co Ltd Detector for obstacle
JP2001134771A (en) * 1999-11-04 2001-05-18 Honda Motor Co Ltd Object recognizing device
JP2001134769A (en) * 1999-11-04 2001-05-18 Honda Motor Co Ltd Object recognizing device





Legal Events

Date | Code | Description
2008-11-27 | A621 | Written request for application examination
2011-01-21 | A977 | Report on retrieval
2011-01-25 | A131 | Notification of reasons for refusal
2011-03-01 | A521 | Written amendment
2011-05-17 | TRDD / A01 | Decision to grant a patent or to grant a registration (utility model)
2011-05-25 | A61 | First payment of annual fees (during grant procedure)
— | FPAY | Renewal fee payment (payment until 2014-06-03; year of fee payment: 3)
— | R150 | Certificate of patent or registration of utility model
— | LAPS | Cancellation because of no payment of annual fees