JP2021140723A - Vehicle exterior environment recognition device - Google Patents

Vehicle exterior environment recognition device

Info

Publication number
JP2021140723A
Authority
JP
Japan
Prior art keywords
width
overlap
specific object
threshold value
three-dimensional object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2020135392A
Other languages
Japanese (ja)
Other versions
JP7514139B2 (en)
Inventor
Minoru Kuraoka (稔 倉岡)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Subaru Corp
Original Assignee
Subaru Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Subaru Corp filed Critical Subaru Corp
Priority to US 17/143,748 (US11842552B2)
Priority to CN 202110022897.8A (CN113361310A)
Publication of JP2021140723A
Application granted
Publication of JP7514139B2
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

To improve the determination accuracy of a specific object.
SOLUTION: A vehicle exterior environment recognition device includes: a first width derivation unit that derives a first width of a first three-dimensional object based on a luminance image from which the luminance of an imaging target can be identified; a second width derivation unit that derives a second width of a second three-dimensional object based on a distance image from which the distance of the imaging target can be identified; an overlap degree derivation unit that obtains an overlap width over which the first width and the second width overlap in the horizontal direction and takes the larger of overlap width/first width and overlap width/second width as an overlap degree; and a specific object determination unit that determines the first three-dimensional object and the second three-dimensional object to be the same specific object if the overlap degree is equal to or greater than a predetermined threshold value (196).
SELECTED DRAWING: Figure 6

Description

The present invention relates to a vehicle exterior environment recognition device that identifies a specific object present in the traveling direction of the own vehicle.

Conventionally, as in Patent Document 1, there is a known technique of detecting a preceding vehicle located ahead of the own vehicle, reducing damage from a collision with the preceding vehicle, and performing follow-up control so as to keep the inter-vehicle distance to the preceding vehicle at a safe distance.

Patent Document 1: Japanese Patent No. 3349060

In order to reduce collision damage between the own vehicle and a preceding vehicle, or to realize follow-up control that causes the own vehicle to follow the preceding vehicle, it must first be determined whether a three-dimensional object located around the own vehicle is a specific object such as a vehicle. For example, the shape of a three-dimensional object in a luminance image captured by an imaging unit has been recognized to determine that it is a preceding vehicle, or the relative distance and speed of a three-dimensional object in a distance image have been recognized to determine that it is a preceding vehicle.

However, when the three-dimensional object is located far from the own vehicle, the area that the object itself occupies in the image becomes small, so the shape in the luminance image and the relative distance in the distance image fluctuate, which lowers the determination accuracy of the specific object.

In view of such a problem, an object of the present invention is to provide a vehicle exterior environment recognition device capable of improving the determination accuracy of a specific object.

In order to solve the above problem, a vehicle exterior environment recognition device according to the present invention includes: a first width derivation unit that derives a first width of a first three-dimensional object based on a luminance image from which the luminance of an imaging target can be identified; a second width derivation unit that derives a second width of a second three-dimensional object based on a distance image from which the distance of the imaging target can be identified; an overlap degree derivation unit that obtains an overlap width over which the first width and the second width overlap in the horizontal direction and takes the larger of overlap width/first width and overlap width/second width as an overlap degree; and a specific object determination unit that determines the first three-dimensional object and the second three-dimensional object to be the same specific object if the overlap degree is equal to or greater than a predetermined threshold value.

The specific object determination unit may lower the threshold value for the overlap degree once the objects have been determined to be the same specific object.

After lowering the threshold value for the overlap degree, the specific object determination unit may raise the threshold value for the overlap degree once the objects have been determined not to be the same specific object.

The specific object determination unit may change the threshold value for the overlap degree stepwise according to the number of times the objects have been determined to be the same specific object.

The specific object determination unit may change the threshold value depending on whether a wiper is operating.

The specific object determination unit may change the threshold value according to the speed of the own vehicle.

According to the present invention, it is possible to improve the determination accuracy of a specific object.

FIG. 1 is a block diagram showing the connection relationship of a vehicle exterior environment recognition system.
FIG. 2 is a functional block diagram showing schematic functions of a vehicle exterior environment recognition device.
FIG. 3 is a flowchart showing the flow of a vehicle exterior environment recognition method.
FIG. 4 is an explanatory diagram for explaining a luminance image and a distance image.
FIG. 5 is an explanatory diagram for explaining the overlap degree.
FIG. 6 is a graph for explaining how the threshold value referred to by the specific object determination unit is obtained.
FIG. 7 is an explanatory diagram for explaining how the threshold value is changed.
FIG. 8 is an explanatory diagram for explaining how the threshold value is changed.

Hereinafter, a preferred embodiment of the present invention will be described in detail with reference to the accompanying drawings. The dimensions, materials, and other specific numerical values shown in the embodiment are merely examples for facilitating understanding of the invention and do not limit the present invention unless otherwise specified. In the present specification and drawings, elements having substantially the same function and configuration are denoted by the same reference numerals to omit redundant description, and elements not directly related to the present invention are not illustrated.

(Vehicle exterior environment recognition system 100)
FIG. 1 is a block diagram showing the connection relationship of the vehicle exterior environment recognition system 100. The vehicle exterior environment recognition system 100 includes imaging devices 110, a vehicle exterior environment recognition device 120, and a vehicle control device 130.

Each imaging device 110 includes an imaging element such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide Semiconductor) sensor, captures the environment ahead of the own vehicle 1, and can generate a luminance image (a color image or a monochrome image) containing at least luminance information. The two imaging devices 110 are arranged apart from each other in a substantially horizontal direction so that their optical axes are substantially parallel on the traveling-direction side of the own vehicle 1. Each imaging device 110 continuously generates luminance images of three-dimensional objects present in a detection region ahead of the own vehicle 1, for example every 1/60-second frame (60 fps).

The vehicle exterior environment recognition device 120 recognizes the environment outside the vehicle through the luminance images acquired from the imaging devices 110 and a distance image based on the two luminance images. Based on the recognized exterior environment and the traveling state of the own vehicle 1, the vehicle exterior environment recognition device 120 then performs speed control and steering angle control for the travel of the own vehicle 1. The vehicle exterior environment recognition device 120 will be described in detail later.

The vehicle control device 130 is constituted by an ECU (Electronic Control Unit), receives the driver's operation input through a steering wheel 132, an accelerator pedal 134, and a brake pedal 136, refers to a speed sensor 138 provided on an axle, and controls the own vehicle 1 by transmitting the input to a steering mechanism 142, a drive mechanism 144, and a braking mechanism 146. The vehicle control device 130 also controls the steering mechanism 142, the drive mechanism 144, and the braking mechanism 146 in accordance with instructions from the vehicle exterior environment recognition device 120. When traveling in the rain, the vehicle control device 130 operates, in response to the driver's operation, a wiper 148 that wipes dirt and foreign matter off the windshield and rear glass of the own vehicle 1.

(Vehicle exterior environment recognition device 120)
FIG. 2 is a functional block diagram showing schematic functions of the vehicle exterior environment recognition device 120. As shown in FIG. 2, the vehicle exterior environment recognition device 120 includes an I/F unit 150, a data holding unit 152, and a central control unit 154.

The I/F unit 150 is an interface for bidirectional information exchange with the imaging devices 110 and the vehicle control device 130. The data holding unit 152 is composed of a RAM, a flash memory, an HDD, or the like, and holds the various kinds of information required for the processing of the functional units described below.

The central control unit 154 is composed of a semiconductor integrated circuit including a central processing unit (CPU), a ROM storing programs and the like, and a RAM serving as a work area, and controls the I/F unit 150, the data holding unit 152, and so on through a system bus 156. In the present embodiment, the central control unit 154 also functions as an image acquisition unit 160, a distance image generation unit 162, a three-dimensional object identification unit 163, a first width derivation unit 164, a second width derivation unit 166, an overlap degree derivation unit 168, and a specific object determination unit 170. Hereinafter, the vehicle exterior environment recognition method characteristic of the present embodiment, which extracts three-dimensional objects ahead of the own vehicle 1 and determines a specific object such as a preceding vehicle, is described in detail together with the operation of each functional unit of the central control unit 154.

(Vehicle exterior environment recognition method)
FIG. 3 is a flowchart showing the flow of the vehicle exterior environment recognition method. The vehicle exterior environment recognition device 120 executes the method at every predetermined interrupt interval. In the method, the image acquisition unit 160 first acquires a plurality of luminance images (S200). The distance image generation unit 162 generates a distance image (S202). The three-dimensional object identification unit 163 identifies three-dimensional objects (S203). The first width derivation unit 164 derives a first width of a first three-dimensional object based on a luminance image from which the luminance of the imaging target can be identified (S204). The second width derivation unit 166 derives a second width of a second three-dimensional object based on a distance image from which the distance of the imaging target can be identified (S206). The overlap degree derivation unit 168 obtains an overlap width, which is the width of the region where the first width and the second width overlap in the horizontal direction of the image, and takes the larger of overlap width/first width and overlap width/second width as the overlap degree (S208). If the overlap degree is equal to or greater than a predetermined threshold value, the specific object determination unit 170 determines that the first three-dimensional object and the second three-dimensional object are the same specific object (S210). Each process of the vehicle exterior environment recognition method is described in detail below; processes unrelated to the features of the present embodiment are omitted. In the present embodiment, "/" means division; for example, overlap width/first width indicates dividing the overlap width by the first width.

(Image acquisition process S200)
FIG. 4 (FIGS. 4A to 4C) is an explanatory diagram for explaining the luminance images and the distance image. The image acquisition unit 160 acquires a plurality of (here, two) luminance images captured by the imaging devices 110 with different optical axes. Here, assume that the image acquisition unit 160 acquires, as the luminance images 180, a first luminance image 180a shown in FIG. 4A captured by the imaging device 110 located on the relatively right side of the own vehicle 1, and a second luminance image 180b shown in FIG. 4B captured by the imaging device 110 located on the relatively left side of the own vehicle 1.

Referring to FIG. 4, it can be seen that, owing to the difference in the imaging positions of the imaging devices 110, the image positions of the three-dimensional objects contained in the images differ horizontally between the first luminance image 180a and the second luminance image 180b. Here, horizontal refers to the screen lateral direction of the captured image, and vertical refers to the screen vertical direction of the captured image.

(Distance image generation process S202)
Based on the first luminance image 180a shown in FIG. 4A and the second luminance image 180b shown in FIG. 4B acquired by the image acquisition unit 160, the distance image generation unit 162 generates a distance image 182, as shown in FIG. 4C, from which the distance of the imaging target can be identified.

Specifically, the distance image generation unit 162 uses so-called pattern matching to derive parallax information including the parallax of an arbitrary block and an image position indicating the position of that block in the image. Pattern matching is a method of searching the other luminance image (here, the second luminance image 180b) for a block corresponding to a block arbitrarily extracted from one luminance image (here, the first luminance image 180a). Here, a block is represented by, for example, an array of 4 horizontal pixels × 4 vertical pixels.

For example, functions for evaluating the degree of matching between blocks in pattern matching include SAD (Sum of Absolute Differences), which takes luminance differences, SSD (Sum of Squared intensity Differences), which squares the differences, and NCC (Normalized Cross-Correlation), which takes the similarity of values obtained by subtracting the mean from the luminance of each pixel.
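
As a rough illustration of these matching functions, the following Python/NumPy sketch evaluates SAD, SSD, and NCC for a pair of blocks and runs a simple SAD-based search along one image row. Only the 4×4 block size comes from the text; the function names, the 64-pixel search range, and the same-row (rectified) search are assumptions made for illustration.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of Absolute Differences: smaller means a better match."""
    return int(np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum())

def ssd(block_a, block_b):
    """Sum of Squared Differences: smaller means a better match."""
    diff = block_a.astype(np.int32) - block_b.astype(np.int32)
    return int((diff * diff).sum())

def ncc(block_a, block_b):
    """Normalized Cross-Correlation: closer to 1.0 means a better match."""
    a = block_a.astype(np.float64) - block_a.mean()
    b = block_b.astype(np.float64) - block_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_disparity(right_img, left_img, x, y, block=4, max_disparity=64):
    """Search the second (left-camera) image along the same row for the 4x4
    block taken from the first (right-camera) image at (x, y), and return
    the disparity with the lowest SAD cost."""
    template = right_img[y:y + block, x:x + block]
    best_d, best_cost = 0, float("inf")
    for d in range(max_disparity):
        if x + d + block > left_img.shape[1]:
            break
        cost = sad(template, left_img[y:y + block, x + d:x + d + block])
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```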

The distance image generation unit 162 performs such block-by-block parallax derivation for all blocks displayed in the detection region of, for example, 600 pixels × 200 pixels. Here, a block is 4 pixels × 4 pixels, but the number of pixels in a block can be set arbitrarily.

However, while the distance image generation unit 162 can derive the parallax for each block, which is the unit of detection resolution, it cannot recognize what kind of three-dimensional object the block is part of. Therefore, the parallax information is derived independently not per three-dimensional object but per unit of detection resolution in the detection region, for example per block. Here, for convenience of explanation, blocks for which parallax has been derived are represented by black dots.

The distance image generation unit 162 converts the parallax information of each block in the distance image 182 into three-dimensional position information using the so-called stereo method and derives the relative distance. The stereo method derives the relative distance of a block to the imaging devices 110 from the parallax of the block by triangulation.
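
A minimal sketch of this triangulation step, assuming a rectified stereo pair with a known focal length (in pixels) and baseline; neither camera parameter is given in the patent.

```python
def disparity_to_distance(disparity_px, focal_length_px, baseline_m):
    """Triangulation: relative distance Z = f * B / d, where f is the focal
    length in pixels, B the baseline between the two imaging devices in
    meters, and d the block's disparity in pixels."""
    if disparity_px <= 0:
        return float("inf")          # no usable parallax
    return focal_length_px * baseline_m / disparity_px
```

For instance, with an assumed focal length of 1,400 px and a 0.35 m baseline, a 10-pixel disparity corresponds to roughly 49 m.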

The vehicle exterior environment recognition device 120 recognizes the exterior environment through the luminance image 180 and the distance image 182 derived in this way and, for example, determines a three-dimensional object ahead of the own vehicle 1 to be a specific object such as a preceding vehicle. However, when the three-dimensional object 184 in FIG. 4A and the three-dimensional object 186 in FIG. 4C are located far from the own vehicle 1, the area that the object itself occupies in the image is small, and the shape in the luminance image 180 and the relative distance in the distance image 182 fluctuate, lowering the determination accuracy of the specific object. Therefore, the overlap degree between the three-dimensional objects identified in the luminance image 180 and in the distance image 182 is derived, and whether the two objects are the same specific object is determined according to that overlap degree.

(Three-dimensional object identification process S203)
The three-dimensional object identification unit 163 first identifies the road surface ahead of the own vehicle 1, and then identifies three-dimensional objects that have height vertically above the identified road surface. Specifically, the three-dimensional object identification unit 163 regards blocks located at a height of a predetermined distance (for example, 0.3 m) or more above the road surface as candidates for a three-dimensional object protruding from the road surface in the height direction. Of the plurality of candidate blocks, the three-dimensional object identification unit 163 groups blocks whose relative distances to the own vehicle 1 are equal and identifies each group as a three-dimensional object.
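
One possible reading of this grouping step is sketched below. The 0.3 m height comes from the text, while the distance bin width, the allowed column gap, and the block dictionary layout are assumptions; a production implementation would more likely use two-dimensional connected-component labeling over the block grid.

```python
from collections import defaultdict

def group_solid_objects(blocks, height_min=0.3, dist_bin_m=1.0, gap_tol=2):
    """Group candidate blocks that protrude at least height_min (0.3 m per
    the text) above the road surface and share (nearly) the same relative
    distance. blocks: iterable of dicts with 'col', 'row' (block grid
    coordinates), 'height' (m above the road) and 'distance' (m)."""
    bins = defaultdict(list)
    for b in blocks:
        if b["height"] >= height_min:                 # candidate blocks only
            bins[round(b["distance"] / dist_bin_m)].append(b)

    objects = []
    for same_distance in bins.values():               # equal relative distance
        same_distance.sort(key=lambda b: b["col"])
        current = [same_distance[0]]
        for b in same_distance[1:]:
            if b["col"] - current[-1]["col"] > gap_tol:   # horizontal gap: new object
                objects.append(current)
                current = []
            current.append(b)
        objects.append(current)
    return objects
```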

(First width derivation process S204)
The first width derivation unit 164 derives a first width, which is the horizontal width of a predetermined three-dimensional object 184 (first three-dimensional object) in one of the two luminance images 180, for example the first luminance image 180a. The first width is expressed, for example, as a number of pixels or as a converted distance.

A machine learning technique is used when extracting three-dimensional objects from the luminance image 180. For example, the edge pattern and temporal change of the luminance image 180 are input, and a three-dimensional object that can be regarded as integrally formed is output. Since various existing techniques can be applied as such machine learning techniques, a detailed description is omitted here.

(Second width derivation process S206)
The second width derivation unit 166 derives a second width, which is the horizontal width of a predetermined three-dimensional object 186 (second three-dimensional object) in the distance image 182. The second width is expressed, for example, as a number of pixels or as a converted distance.

Here, both the first width derivation unit 164 and the second width derivation unit 166 derive the horizontal widths of the three-dimensional objects 184 and 186, for the following reason. The own vehicle 1 may sway in the pitch and roll directions due to road unevenness, whereas it does not sway in the yaw direction as much as in the pitch and roll directions. Therefore, by targeting the horizontal width, which corresponds to the stable yaw direction, the determination accuracy of the specific object can be improved. Needless to say, on a flat road, the vertical width corresponding to the pitch direction may be targeted instead of, or in addition to, the horizontal width corresponding to the yaw direction.
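
Either representation of the width mentioned in S204 and S206 (a pixel count or a converted distance) can be sketched as follows; the pinhole-model conversion and its focal-length parameter are assumptions, since the patent does not specify how the conversion is performed.

```python
def horizontal_width_px(blocks, block_px=4):
    """Horizontal width of a grouped three-dimensional object, in pixels
    (assuming 4x4-pixel blocks and block-grid 'col' coordinates)."""
    cols = [b["col"] for b in blocks]
    return (max(cols) - min(cols) + 1) * block_px

def width_to_meters(width_px, distance_m, focal_length_px):
    """Pinhole-model conversion of a pixel width into a real-world width."""
    return width_px * distance_m / focal_length_px
```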

(Overlap degree derivation process S208)
FIG. 5 (FIGS. 5A to 5F) is an explanatory diagram for explaining the overlap degree. The overlap degree derivation unit 168 first acquires the first width 190 derived by the first width derivation unit 164 and the second width 192 derived by the second width derivation unit 166, and derives the overlap width 194, which is the width of the region where the first width and the second width overlap in the horizontal direction of the image.

In FIGS. 5A to 5C, the first width 190 of the three-dimensional object 184 is longer than the second width 192 of the three-dimensional object 186. In FIG. 5A, the three-dimensional object 186 is contained within the three-dimensional object 184 in the horizontal direction, so the overlap width 194 is equal to the second width 192. In FIG. 5B, the three-dimensional objects 184 and 186 partially overlap in the horizontal direction, so the overlap width 194 is shorter than both the first width 190 and the second width 192. In FIG. 5C, the three-dimensional objects 184 and 186 do not overlap in the horizontal direction, so the overlap width 194 is 0.

In FIGS. 5D to 5F, the first width 190 of the three-dimensional object 184 is shorter than the second width 192 of the three-dimensional object 186. In FIG. 5D, the three-dimensional object 184 is contained within the three-dimensional object 186 in the horizontal direction, so the overlap width 194 is equal to the first width 190. In FIG. 5E, the three-dimensional objects 184 and 186 partially overlap in the horizontal direction, so the overlap width 194 is shorter than both the first width 190 and the second width 192. In FIG. 5F, the three-dimensional objects 184 and 186 do not overlap in the horizontal direction, so the overlap width 194 is 0.

Next, when the overlap width shows a value other than 0, that is, when it is determined that the objects at least overlap, the overlap degree derivation unit 168 derives overlap width 194/first width 190 and overlap width 194/second width 192, and takes the larger of the two as the overlap degree.

In the example of FIG. 5A, overlap width 194/first width 190 < overlap width 194/second width 192, so the overlap degree is overlap width 194/second width 192 (= 1). In the example of FIG. 5B, overlap width 194/first width 190 < overlap width 194/second width 192, so the overlap degree is overlap width 194/second width 192. In the example of FIG. 5C, the overlap width 194 is 0, that is, the objects do not overlap, so the overlap degree is also 0.

In the example of FIG. 5D, overlap width 194/first width 190 > overlap width 194/second width 192, so the overlap degree is overlap width 194/first width 190 (= 1). In the example of FIG. 5E, overlap width 194/first width 190 > overlap width 194/second width 192, so the overlap degree is overlap width 194/first width 190. In the example of FIG. 5F, the overlap width 194 is 0, that is, the objects do not overlap, so the overlap degree is also 0.
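
The derivation of S208 condenses into a single function. The spans are assumed to be given as image-column ranges (left and right edges), which is a representational choice rather than something prescribed by the patent.

```python
def overlap_degree(first_left, first_right, second_left, second_right):
    """Overlap degree of two horizontal spans given as image-column ranges.
    Returns max(overlap/first width, overlap/second width), or 0.0 when the
    spans do not overlap, following S208."""
    first_width = first_right - first_left
    second_width = second_right - second_left
    overlap = min(first_right, second_right) - max(first_left, second_left)
    if overlap <= 0 or first_width <= 0 or second_width <= 0:
        return 0.0                      # no overlap (Figs. 5C and 5F)
    return max(overlap / first_width, overlap / second_width)
```

For a span fully contained in the other (FIGS. 5A and 5D) the function returns 1.0, e.g. overlap_degree(100, 140, 110, 130); a partial overlap such as overlap_degree(100, 140, 130, 160) gives 10/30 ≈ 0.33.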

(Specific object determination process S210)
The specific object determination unit 170 refers to a threshold value to determine whether the three-dimensional object 184 and the three-dimensional object 186 are the same specific object.

FIG. 6 is a graph for explaining how the threshold value 196 referred to by the specific object determination unit 170 is obtained. In the graph of FIG. 6, the horizontal axis represents the relative distance and the vertical axis represents the overlap degree, expressed in the range of 0 to 1. Such an overlap degree graph is prepared in advance.

For example, the own vehicle 1 travels in a predetermined area in advance, identifies only the specific objects corresponding to vehicles among the three-dimensional objects extracted during that period, and plots the measured values of their relative distances and overlap degrees. The point cloud of FIG. 6 is thus obtained. It can be seen here that smaller overlap degrees occur as the relative distance becomes longer. In FIG. 6, a straight line that is at or below the lower limit of the point cloud at all relative distances is generated, and this straight line becomes the threshold value 196. The threshold value 196 can therefore be expressed as a linear function of the relative distance. As long as it stays at or below the lower limit of the point cloud at all relative distances, the threshold value 196 is not limited to a straight line and may be expressed by a curve, that is, a higher-order function. Although an example in which the own vehicle 1 itself plots the measured values of relative distance and overlap degree has been described here, the present invention is not limited to this case, and measured values of relative distance and overlap degree plotted by another vehicle may be reflected in the own vehicle 1.

Referring to the threshold value 196 prepared in this way, the specific object determination unit 170 determines that the three-dimensional object 184 and the three-dimensional object 186 are the same specific object if their overlap degree derived by the overlap degree derivation unit 168, at the relative distance where they are located, is equal to or greater than the threshold value 196.
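
A minimal sketch of the distance-dependent threshold and the determination of S210; the linear coefficients are placeholders, since the actual line is fitted from the measured point cloud as described above.

```python
def overlap_threshold(distance_m, intercept=0.95, slope=-0.004):
    """Threshold value 196 modeled as a linear function of the relative
    distance (Fig. 6). The intercept and slope are placeholders: in practice
    they are chosen so the line stays at or below the lower envelope of
    overlap degrees measured for known vehicles, and a higher-order curve
    may be used instead of a straight line."""
    return min(1.0, max(0.0, intercept + slope * distance_m))

def is_same_specific_object(overlap_deg, distance_m):
    """S210: the two objects are judged to be the same specific object when
    the overlap degree reaches the threshold at that distance."""
    return overlap_deg >= overlap_threshold(distance_m)
```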

Here, a high value is adopted as the threshold value 196 in order to strictly determine that the three-dimensional object 184 and the three-dimensional object 186 are the same specific object. However, it is not necessary to keep the threshold value 196 high after it has once been determined that they are the same specific object. Therefore, a hysteresis function is provided so that, once the objects have been determined to be the same specific object, they continue to be determined as the same specific object even if the overlap degree drops somewhat.

FIGS. 7 and 8 are explanatory diagrams for explaining how the threshold value 196 is changed. Specifically, when the specific object determination unit 170 determines that the three-dimensional object 184 and the three-dimensional object 186 are the same specific object, it lowers the threshold value 196 against which their overlap degree is compared, for example from the threshold value 196a indicated by the broken line in FIG. 7 to the threshold value 196b indicated by the dash-dot line.

In this way, even if the overlap degree between the three-dimensional objects 184 and 186 determined to be the same specific object fluctuates around the threshold value 196a, it remains at least equal to or greater than the threshold value 196b, so they are more easily determined to be the same specific object thereafter. The specific object can therefore be determined stably.

The specific object determination unit 170 also counts the number of consecutive times the objects are determined to be the same specific object. According to this count, the specific object determination unit 170 may lower the threshold value 196 for the overlap degree stepwise through a plurality of levels prepared in advance, for example from the threshold value 196a indicated by the broken line in FIG. 7 to the threshold value 196b indicated by the dash-dot line and then to the threshold value 196c indicated by the solid line.

Specifically, when the objects have been determined to be the same specific object a predetermined number of times (for example, 10 times) under the threshold value 196a, the specific object determination unit 170 lowers the threshold value 196 for the overlap degree between the three-dimensional objects 184 and 186 determined to be the same specific object, for example from the threshold value 196a to the threshold value 196b in FIG. 7. Further, when the objects have been determined to be the same specific object a predetermined number of times (for example, 10 times) under the threshold value 196b, the specific object determination unit 170 lowers the threshold value 196, for example from the threshold value 196b to the threshold value 196c in FIG. 7.

In this way, once the three-dimensional objects 184 and 186 are determined to be the same specific object, the threshold value 196 becomes lower step by step, and they are more easily determined to be the same specific object thereafter. The specific object can therefore be determined even more stably.

On the other hand, if the overlap degree falls below the threshold value 196 despite the threshold value 196 having been lowered, and it is determined that the three-dimensional objects 184 and 186 are not the same specific object, that threshold value 196 should no longer be maintained. Therefore, a hysteresis function is also provided for the case where the objects are no longer determined to be the same specific object: once it is determined that they are not the same specific object, they are not determined to be the same specific object again unless the overlap degree increases.

Specifically, when the threshold value 196 is the threshold value 196c indicated by the solid line in FIG. 8 and the specific object determination unit 170 determines that the three-dimensional objects 184 and 186 are not the same specific object, it raises the threshold value 196 against which their overlap degree is compared, for example from the threshold value 196c to the threshold value 196b indicated by the dash-dot line in FIG. 8.

In this way, even if the overlap degree between the three-dimensional objects 184 and 186 determined not to be the same specific object becomes equal to or greater than the threshold value 196c, they are less easily determined to be the same specific object until the overlap degree becomes at least equal to or greater than the threshold value 196b. Three-dimensional objects that are not the specific object can therefore be excluded stably.

The specific object determination unit 170 may also raise the threshold value 196 for the overlap degree stepwise according to the number of times the objects are determined not to be the same specific object, for example from the threshold value 196c indicated by the solid line in FIG. 8 to the threshold value 196b indicated by the dash-dot line and then to the threshold value 196a indicated by the broken line.

When the three-dimensional objects 184 and 186 once determined to be the same specific object are determined not to be the same specific object a predetermined number of times (for example, 10 times) under the threshold value 196c, the specific object determination unit 170 raises the threshold value 196 for their overlap degree, for example from the threshold value 196c to the threshold value 196b in FIG. 8. Further, when they are determined not to be the same specific object a predetermined number of times (for example, 10 times) under the threshold value 196b, the specific object determination unit 170 raises the threshold value 196, for example from the threshold value 196b to the threshold value 196a in FIG. 8.

In this way, when the three-dimensional objects 184 and 186 once determined to be the same specific object are determined not to be the same specific object, the threshold value 196 becomes higher step by step, and they are less easily determined to be the same specific object thereafter. Three-dimensional objects that are not the specific object can therefore be excluded even more stably.
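
The step-wise hysteresis of FIGS. 7 and 8 can be sketched as a small state holder that is updated once per frame with the latest determination result; the three threshold offsets are assumed values, while the requirement of ten consecutive determinations follows the example in the text.

```python
class ThresholdHysteresis:
    """Step the threshold down after repeated 'same specific object'
    determinations and back up after repeated 'not the same' determinations,
    mirroring the 196a -> 196b -> 196c levels of FIGS. 7 and 8."""

    def __init__(self, offsets=(0.0, -0.05, -0.10), confirmations=10):
        self.offsets = offsets      # offset added to the base threshold per level
        self.level = 0              # 0 = strictest level (196a)
        self.confirmations = confirmations
        self.same_count = 0
        self.not_same_count = 0

    def threshold(self, base_threshold):
        return base_threshold + self.offsets[self.level]

    def update(self, judged_same):
        if judged_same:
            self.same_count += 1
            self.not_same_count = 0
            if (self.same_count >= self.confirmations
                    and self.level < len(self.offsets) - 1):
                self.level += 1      # lower the threshold one step (Fig. 7)
                self.same_count = 0
        else:
            self.not_same_count += 1
            self.same_count = 0
            if self.not_same_count >= self.confirmations and self.level > 0:
                self.level -= 1      # raise the threshold one step (Fig. 8)
                self.not_same_count = 0
```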

When traveling in the rain, raindrops may adhere to, for example, the windshield or lenses in front of the imaging devices 110, and the luminance image 180 or the distance image 182 may become blurred. In that case, the overlap degree between the three-dimensional objects 184 and 186 may become smaller than its true value, and the specific object determination unit 170 may then determine that the three-dimensional objects 184 and 186, which should originally be determined to be the same specific object, are not the same specific object. Therefore, when adhesion of raindrops is expected, for example while the wiper 148 is operating, the specific object determination unit 170 may lower the threshold value 196 compared with when the wiper 148 is not operating. With this configuration, the specific object can be determined appropriately even when traveling in the rain.

When the speed of the own vehicle 1 is high, a preceding vehicle located far away must be determined more reliably as a specific object, whereas when the speed of the own vehicle 1 is low, the determination does not need to be as certain as when the speed is high. Therefore, the specific object determination unit 170 changes the threshold value 196 according to the speed of the own vehicle 1. For example, when the speed of the own vehicle 1 detected by the speed sensor 138 is high, the threshold value 196 is raised compared with when the speed is low, and when the speed of the own vehicle 1 is low, the threshold value 196 is lowered compared with when the speed is high. Here, the threshold value 196 may be changed linearly in proportion to the speed of the own vehicle 1, or in steps. In this way, the specific object can be determined appropriately regardless of the speed of the own vehicle 1.
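
Both condition-dependent adjustments (wiper operation and own-vehicle speed) can be sketched as one correction applied to the base threshold; every numeric constant below is an illustrative assumption, and the speed dependence could equally be implemented in discrete steps as noted above.

```python
def adjusted_threshold(base_threshold, wiper_active, speed_kmh,
                       wiper_offset=-0.05, speed_gain=0.001, ref_speed_kmh=60.0):
    """Shift the base overlap threshold by driving conditions: lower it while
    the wiper 148 operates (blurred images reduce the overlap degree) and
    scale it with the speed from the speed sensor 138 (higher when fast,
    lower when slow)."""
    threshold = base_threshold
    if wiper_active:
        threshold += wiper_offset
    threshold += speed_gain * (speed_kmh - ref_speed_kmh)
    return min(1.0, max(0.0, threshold))
```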

The specific object determination unit 170 then identifies which kind of specific object the objects determined to be the same specific object (those whose overlap degree is equal to or greater than the predetermined threshold value) correspond to. For example, the specific object determination unit 170 first groups, in the distance image 182, blocks that are located at the predetermined height or more above the road surface, have equal relative distances to the own vehicle 1, and are close to each other in the vertical and horizontal directions, and treats the group as, for example, a preceding vehicle candidate. Next, the specific object determination unit 170 identifies, as a three-dimensional object region, a rectangular region in the first luminance image 180a that contains the entire three-dimensional object identified in this way. Here, the rectangle is formed by two vertical straight lines touching the left and right edges of the three-dimensional object and two horizontal straight lines touching its upper and lower edges.
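
A sketch of how the rectangular three-dimensional object region might be computed from the grouped blocks; carrying over the 4×4-pixel block size from the distance image is an assumption.

```python
def solid_object_region(blocks, block_px=4):
    """Axis-aligned rectangle in the first luminance image that just touches
    the left/right and top/bottom edges of the grouped blocks; block grid
    coordinates are assumed to be in 4x4-pixel units.
    Returns (left, top, right, bottom) in pixels."""
    cols = [b["col"] for b in blocks]
    rows = [b["row"] for b in blocks]
    return (min(cols) * block_px, min(rows) * block_px,
            (max(cols) + 1) * block_px, (max(rows) + 1) * block_px)
```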

The specific object determination unit 170 identifies the specific object contained in the three-dimensional object region formed in this way as a preceding vehicle based on various conditions indicating vehicle-likeness. When the three-dimensional object region is identified as a preceding vehicle in this way, the vehicle exterior environment recognition device 120 performs control so as to reduce damage from a collision with the preceding vehicle and to keep the inter-vehicle distance to the preceding vehicle at a safe distance.

The specific object determination unit 170 also stores the positions and speeds of the three-dimensional objects 184 and 186 determined to be the same specific object in the data holding unit 152, and tracks the three-dimensional objects 184 and 186 based on their positions and speeds. The specific object determination unit 170 can thus identify the three-dimensional objects 184 and 186 that were determined to be the same specific object in the previous frame.

In the present embodiment, the configuration of using the luminance image and the distance image and determining whether the objects are the same specific object according to the overlap degree of the three-dimensional objects in each image makes it possible to improve the determination accuracy of the specific object.

A program that causes a computer to function as the vehicle exterior environment recognition device 120, and a computer-readable storage medium such as a flexible disk, magneto-optical disk, ROM, CD, DVD, or BD on which the program is recorded, are also provided. Here, a program refers to a data processing means described in an arbitrary language or description method.

Although a preferred embodiment of the present invention has been described above with reference to the accompanying drawings, it goes without saying that the present invention is not limited to this embodiment. It is clear that a person skilled in the art can conceive of various changes or modifications within the scope described in the claims, and it is understood that these also naturally belong to the technical scope of the present invention.

For example, in the above-described embodiment, an example of determining a preceding vehicle as a specific object located particularly far ahead of the own vehicle 1 has been described, but it goes without saying that the invention is not limited to this case and can also be applied to a specific object located nearby.

In the above-described embodiment, an example has been described in which the two luminance images 180 captured by the two imaging devices 110 are used, with one of them, the first luminance image 180a, and the distance image 182 based on both of them being utilized. However, the luminance image 180 and the distance image 182 may each be acquired independently; for example, the luminance image 180 may be derived from one of three imaging devices 110 and the distance image 182 from the other two.

Each step of the vehicle exterior environment recognition method in this specification does not necessarily have to be processed in time series in the order described in the flowchart, and may include parallel processing or processing by subroutines.

The present invention can be used for a vehicle exterior environment recognition device that identifies a specific object present in the traveling direction of the own vehicle.

110 Imaging device
120 Vehicle exterior environment recognition device
130 Vehicle control device
164 First width derivation unit
166 Second width derivation unit
168 Overlap degree derivation unit
170 Specific object determination unit
180 Luminance image
182 Distance image
190 First width
192 Second width
194 Overlap width
196 Threshold value

Claims (6)

1. A vehicle exterior environment recognition device comprising:
a first width derivation unit that derives a first width of a first three-dimensional object based on a luminance image from which the luminance of an imaging target can be identified;
a second width derivation unit that derives a second width of a second three-dimensional object based on a distance image from which the distance of the imaging target can be identified;
an overlap degree derivation unit that obtains an overlap width over which the first width and the second width overlap in the horizontal direction and takes the larger of the overlap width/the first width and the overlap width/the second width as an overlap degree; and
a specific object determination unit that determines the first three-dimensional object and the second three-dimensional object to be the same specific object if the overlap degree is equal to or greater than a predetermined threshold value.
2. The vehicle exterior environment recognition device according to claim 1, wherein the specific object determination unit lowers the threshold value for the overlap degree of objects determined to be the same specific object.
3. The vehicle exterior environment recognition device according to claim 2, wherein, after lowering the threshold value for the overlap degree, the specific object determination unit raises the threshold value for the overlap degree of objects determined not to be the same specific object.
4. The vehicle exterior environment recognition device according to claim 2 or 3, wherein the specific object determination unit changes the threshold value for the overlap degree stepwise according to the number of times the objects are determined to be the same specific object.
5. The vehicle exterior environment recognition device according to any one of claims 1 to 4, wherein the specific object determination unit changes the threshold value depending on whether a wiper is operating.
6. The vehicle exterior environment recognition device according to any one of claims 1 to 5, wherein the specific object determination unit changes the threshold value according to the speed of the own vehicle.
JP2020135392A 2020-03-06 2020-08-07 Outside environment recognition device Active JP7514139B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US 17/143,748 (US11842552B2)
CN 202110022897.8A (CN113361310A)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020039047 2020-03-06

Publications (2)

Publication Number Publication Date
JP2021140723A (en) 2021-09-16
JP7514139B2 (en) 2024-07-10

Family

ID=77668824

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2020135392A Active JP7514139B2 (en) 2020-03-06 2020-08-07 Outside environment recognition device

Country Status (1)

Country Link
JP (1) JP7514139B2 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007073058A (en) 2006-09-28 2007-03-22 Sumitomo Electric Ind Ltd Vehicle sensing device
WO2017130639A1 (en) 2016-01-28 2017-08-03 株式会社リコー Image processing device, imaging device, mobile entity apparatus control system, image processing method, and program
JP6885721B2 (en) 2016-12-27 2021-06-16 株式会社デンソー Object detection device, object detection method
JP6613332B2 (en) 2018-03-05 2019-11-27 株式会社Subaru Vehicle driving support device
JP6927132B2 (en) 2018-04-17 2021-08-25 株式会社デンソー Driver assistance systems and methods

Also Published As

Publication number Publication date
JP7514139B2 (en) 2024-07-10

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20230703

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20240425

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20240604

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20240628

R150 Certificate of patent or registration of utility model

Ref document number: 7514139

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150