JP2007188354A - Forward solid object recognition system for vehicle - Google Patents


Info

Publication number: JP2007188354A
Application number: JP2006006700A
Authority: JP (Japan)
Prior art keywords: dimensional object, coordinate, solid object, fusion, dimensional
Legal status: Granted
Other languages: Japanese (ja)
Other versions: JP4712562B2 (en)
Inventor: Hiroyuki Sekiguchi (弘幸 関口)
Current assignee: Subaru Corp
Original assignee: Fuji Heavy Industries Ltd
Application JP2006006700A filed by Fuji Heavy Industries Ltd
Priority: JP2006006700A
Publication of JP2007188354A
Application granted; publication of JP4712562B2

Landscapes

  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Image Processing (AREA)

Abstract

PROBLEM TO BE SOLVED: To recognize a forward three-dimensional object accurately using widely available forward recognition devices, without using high-precision forward recognition means.

SOLUTION: A three-dimensional object detection processing unit 9 obtains feature quantities, consisting of a position coordinate in the vehicle-width direction, a distance coordinate in the vehicle's longitudinal direction, and a relative velocity in the longitudinal direction, for the forward three-dimensional objects Mi, Ii, and Li detected by the respective forward recognition means: a millimeter-wave radar device 3, a stereo camera device 2, and a laser radar device 4. For every combination of the objects Mi, Ii, and Li, it obtains the probability that each feature quantity is identical, based on a Gaussian distribution of the error in that feature quantity, and judges from these probabilities whether the objects Mi, Ii, and Li include the same object. For forward objects judged identical, a fusion object is generated from their feature quantities; for a forward object not judged identical, the fusion object is generated from the feature quantities identifying that object alone.

COPYRIGHT: (C)2007, JPO&INPIT

Description

The present invention relates to a forward three-dimensional object recognition device for a vehicle that determines, based on Gaussian distributions of the errors in the feature quantities identifying each object, whether the forward three-dimensional objects detected by a plurality of forward recognition means are the same object.

In recent years, techniques have been proposed for vehicles such as automobiles in which a forward three-dimensional object is recognized by forward recognition means mounted on the vehicle, such as a camera or a millimeter-wave radar, and the recognition result is used for various vehicle controls.

The detection accuracy of the various forward recognition means, typified by camera devices and radar devices, varies with the driving environment. Recently, therefore, a technique called sensor fusion has been proposed in which a plurality of forward recognition means, such as a camera device and a radar device, are mounted together and their detection results are combined to recognize a forward three-dimensional object.

For example, Patent Document 1 (JP 2005-165421 A) discloses a technique in which, from the data detected by an image recognition device and a millimeter-wave radar, probability distributions expressing the certainty of each datum are obtained, the data are fused by taking the product of the probability distributions for the same data value, and the type of the forward three-dimensional object is determined from the data value showing the highest probability.

According to the technique disclosed in this document, even when the probability of the distribution obtained from the data detected by one forward recognition means falls, the type of the forward three-dimensional object can still be recognized using the high-probability data value, provided the probability distribution obtained from the data detected by another forward recognition means shows a high probability.

JP 2005-165421 A

The detection accuracy of an image recognition device, for example, falls at night and during rain or snow, so the probability distributions of its data values must be corrected to maintain detection accuracy. A millimeter-wave radar, by contrast, is little affected by lighting conditions and can detect with high accuracy, but its accuracy in detecting pedestrians is low.

Therefore, even when a forward three-dimensional object is recognized using two forward recognition means, such as a camera and a millimeter-wave radar, detection accuracy still degrades in driving environments such as nighttime, rain, or snowfall.

Patent Document 1 therefore improves detection accuracy by correcting the probability distributions of the data values detected by the image recognition device, based on a darkness value detected by an illuminance sensor.

With the technique disclosed in Patent Document 1, however, auxiliary sensors such as an illuminance sensor must be added to maintain the detection accuracy of forward recognition means such as the image recognition device under varying driving environments, which increases the part count and the manufacturing cost.

Alternatively, the number of auxiliary sensors could be reduced by using forward recognition means of higher detection accuracy, but such means are expensive and raise the product cost.

In view of the above circumstances, an object of the present invention is to provide a forward three-dimensional object recognition device for a vehicle that can detect a forward three-dimensional object with high accuracy using widely available forward recognition devices rather than high-precision forward recognition means, while suppressing a large increase in part count and product cost and thereby achieving high reliability.

To achieve this object, the present invention provides a forward three-dimensional object recognition device for a vehicle comprising at least three types of forward recognition means with different detection characteristics mounted on the vehicle, and a three-dimensional object detection processing unit that detects the same forward three-dimensional object from the objects detected by the respective forward recognition means. The three-dimensional object detection processing unit has a fusion object generation unit that obtains, for each detected forward object, the feature quantities identifying that object; computes, for every combination of the forward objects detected by the respective means, the probability that their feature quantities are identical, based on Gaussian distributions of the errors in those feature quantities; judges from these probabilities whether the detected forward objects include the same object; and, when they are judged identical, generates the feature quantities of a fusion object from the feature quantities of those objects, or, when they are judged non-identical, generates the fusion object's feature quantities from the feature quantities of each non-identical forward object alone.

According to the present invention, a forward three-dimensional object is detected by at least three types of forward recognition means, and the probability that the objects detected by the respective means are identical is obtained from Gaussian distributions of the errors in the feature quantities identifying each object. A forward three-dimensional object can therefore be detected with high accuracy using widely available forward recognition devices, without resorting to high-precision forward recognition means.

Moreover, because at least three types of forward recognition means are used to detect the forward three-dimensional object, even if the detection accuracy of one means degrades in a particular driving environment, the other means compensate for it. No auxiliary sensors need to be added, a large increase in part count and a rise in product cost are avoided, and high reliability is obtained.

An embodiment of the present invention will now be described with reference to the drawings. FIG. 1 is a plan view of the vehicle, and FIG. 2 is a schematic configuration diagram of the forward three-dimensional object recognition device mounted on the vehicle.

As shown in FIG. 1, a vehicle (own vehicle) 1 such as an automobile is equipped, as forward recognition means, with three devices having different detection characteristics: a stereo camera device 2, a millimeter-wave radar device 3, and a laser radar device 4.

The stereo camera device 2 has, as its stereo optical system, a pair of left and right cameras with solid-state image sensors (CCD or CMOS cameras). The two cameras are mounted above the rear-view mirror at a fixed separation and capture stereo images of the area ahead of the vehicle 1 from different viewpoints. The millimeter-wave radar device 3 is mounted at approximately the center of the front end of the vehicle 1 in the vehicle-width direction. Because the millimeter waves it emits are difficult to focus narrowly, unlike the laser described later, it transmits a plurality of millimeter-wave beams radially ahead of the vehicle 1 and receives the waves reflected from forward three-dimensional objects.

The laser radar device 4 is mounted at approximately the same position as the millimeter-wave radar device 3. Because its laser beam can be focused narrowly, it emits pulses of a narrow laser beam ahead of the vehicle 1 while scanning in the vehicle-width direction (or in the vehicle-width and height directions) over a predetermined viewing angle, as shown in FIG. 1, and receives the light reflected from forward three-dimensional objects.

The stereo images captured by the stereo camera device 2, the millimeter-wave signals detected by the millimeter-wave radar device 3, and the laser signals detected by the laser radar device 4 are input to the forward three-dimensional object recognition device 5 shown in FIG. 2. The device 5 is built around a computer such as a microcomputer and includes the usual CPU, ROM, RAM, and so on.

As functions for detecting forward three-dimensional objects, the forward three-dimensional object recognition device 5 comprises a stereo image processing unit 6, a millimeter-wave signal processing unit 7, a laser signal processing unit 8, and a three-dimensional object detection processing unit 9. A vehicle control unit 21 is also connected to the device 5.

From the stereo images captured by the stereo camera device 2, the stereo image processing unit 6 generates distance data by the principle of triangulation from the positional displacement of detected forward three-dimensional objects (hereinafter "image objects") Ii (i is an index; i = 1, 2, 3 in FIG. 1). Based on this distance data, it extracts individual image objects such as vehicles and pedestrians through well-known grouping processing and comparison with pre-stored three-dimensional object data. It further obtains the feature quantities (image feature quantities) identifying each image object Ii. In this embodiment, the image feature quantities are: the distance coordinate (hereinafter "z coordinate") Iz in the vehicle's longitudinal direction (hereinafter "z direction") with the vehicle 1 as origin; the position coordinate (hereinafter "x coordinate") Ix in the vehicle-width direction (hereinafter "x direction"); and the relative velocity in the z direction (hereinafter "Vz velocity") Iv, obtained from the temporal change of the z coordinate Iz.

From the detection signals of the millimeter-wave radar device 3, the millimeter-wave signal processing unit 7 obtains the feature quantities (millimeter-wave feature quantities) identifying each forward three-dimensional object (hereinafter "millimeter-wave object") Mi (i = 1, 2, 3 in FIG. 1). In this embodiment, these are: the z coordinate Mz, obtained from the time between transmission and reception of the millimeter wave; the x coordinate Mx, obtained from the transmission direction of the beam, among the plurality of beams, whose reflection was received; and the Vz velocity Mv, obtained by the Doppler principle.

From the detection signals of the laser radar device 4, the laser signal processing unit 8 obtains the feature quantities (laser feature quantities) identifying each forward three-dimensional object (hereinafter "laser object") Li (i = 1, 2 in FIG. 1). In this embodiment, these are: the z coordinate Lz, obtained from the time between emission and reception of the laser light; the x coordinate Lx, obtained from the emission direction of the laser beam when the reflection was received; and the Vz velocity Lv, obtained from the temporal change of the z coordinate Lz.

The coordinate systems representing the feature quantities of the objects Ii, Mi, and Li detected by the processing units 6 to 8 described above are aligned to a common reference coordinate system (for example, the coordinate system in which the image objects Ii are recognized).
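As an illustration of the aligned feature quantities described above, each sensor's detections can be represented as identically shaped (x, z, Vz) records in the common reference frame. This is a minimal sketch; the type name and the numeric values are not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Features:
    """Feature quantities identifying one forward object in the common
    reference frame: lateral position x [m], longitudinal distance z [m],
    and longitudinal relative velocity vz [m/s]."""
    x: float
    z: float
    vz: float

# One detection from each forward recognition means (illustrative values)
image_obj = Features(x=0.7, z=41.0, vz=-2.2)    # Ii from stereo camera device 2
mmwave_obj = Features(x=0.5, z=40.0, vz=-2.0)   # Mi from millimeter-wave radar device 3
laser_obj = Features(x=0.6, z=40.5, vz=-2.1)    # Li from laser radar device 4
```

Because all three records share one coordinate frame and one layout, the later fusion stages can compare any pair of them feature by feature.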

Table 1 shows the detection characteristics of the millimeter-wave radar device 3, the stereo camera device 2, and the laser radar device 4 for forward three-dimensional objects in relation to the driving environment. Here, ◎, ○, △, and × indicate detection accuracy: ◎ good, ○ fairly good, △ somewhat poor, × poor.

(Table 1 appears as an image, Figure 2007188354, in the original publication.)

As noted above, the millimeter waves output by the millimeter-wave radar device 3 are difficult to focus narrowly compared with laser light, so its resolution in the x direction (lateral position) is lower than that of the laser radar device 4 and its accuracy in detecting the width of a forward object is low. However, it can detect forward objects without being affected by the weather.

The laser beam emitted by the laser radar device 4, on the other hand, can be focused narrowly and scanned continuously in the x direction, giving excellent lateral-position resolution. It can therefore detect lateral position with high accuracy, but the beam is refracted or reflected by raindrops, making it susceptible to the weather. When the forward object is a preceding vehicle, the laser light is strongly reflected by the left and right reflectors on the rear of the vehicle body, so the lateral position and width of the forward object are detected accurately. However, the left and right reflectors of a single four-wheeled vehicle are easily misjudged as two motorcycles running side by side, and two motorcycles running side by side are easily misjudged as a single four-wheeled vehicle, so its ability to recognize the shape of a forward object is low.

The stereo camera device 2, for its part, recognizes forward objects from light contrast, so its lateral-position and shape resolution are excellent, but its resolution is low in driving environments where contrast is hard to discern (at night, during rain, and so on).

As shown in FIG. 3, the three-dimensional object detection processing unit 9 comprises a primary fusion object generation unit 11 and a secondary fusion object generation unit 12.

The primary fusion object generation unit 11 reads the image feature quantities (x coordinate Ix, z coordinate Iz, Vz velocity Iv) of the image object I of interest, obtained by the stereo image processing unit 6, and the millimeter-wave feature quantities (x coordinate Mx, z coordinate Mz, Vz velocity Mv) of the millimeter-wave object M of interest, obtained by the millimeter-wave signal processing unit 7. It computes the probabilities (identity probabilities) Px, Pz, Pv that the corresponding feature quantities of the two forward objects I and M (x coordinates Mx, Ix; z coordinates Mz, Iz; Vz velocities Mv, Iv) are identical, computes the object identity probability P as the product of Px, Pz, and Pv, and judges from P whether the two forward objects I and M are the same.

When the two forward objects I and M are judged identical, the feature quantities of a primary fusion object F (x coordinate Fx, z coordinate Fz, Vz velocity Fv) are generated from the combination of their feature quantities (x coordinates Mx, Ix; z coordinates Mz, Iz; Vz velocities Mv, Iv).

The secondary fusion object generation unit 12 reads the feature quantities of the primary fusion object F (x coordinate Fx, z coordinate Fz, Vz velocity Fv) and the laser feature quantities (x coordinate Lx, z coordinate Lz, Vz velocity Lv) of the laser object L of interest, obtained by the laser signal processing unit 8. It computes the probabilities (identity probabilities) Px, Pz, Pv that the corresponding feature quantities of the two forward objects F and L (x coordinates Fx, Lx; z coordinates Fz, Lz; Vz velocities Fv, Lv) are identical, computes the object identity probability P as their product, and judges from P whether F and L are the same. When they are judged identical, the feature quantities of a secondary fusion object S (x coordinate Sx, z coordinate Sz, Vz velocity Sv) are generated from the combination of the x coordinates Fx, Lx, z coordinates Fz, Lz, and Vz velocities Fv, Lv.
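The identity test used by both fusion stages, per-feature probabilities from Gaussian error distributions, then their product P, can be sketched as follows. This is an assumption-laden illustration: the patent shows its exact probability form only as an image, so a Gaussian similarity normalized to 1.0 for identical measurements is used here, and all names and values are placeholders.

```python
import math

def gaussian_same_prob(a, b, sigma):
    """Similarity of two measurements a and b, assuming their difference
    follows a Gaussian with standard deviation sigma. Normalized so that
    identical measurements give 1.0 (a sketch, not the patent's exact form)."""
    return math.exp(-((a - b) ** 2) / (2.0 * sigma ** 2))

def identity_probability(obj1, obj2, sigmas):
    """Object identity probability P = Px * Pz * Pv, as described for the
    primary and secondary fusion stages. obj1 and obj2 are (x, z, vz)
    feature tuples; sigmas holds (sigma_x, sigma_z, sigma_v)."""
    px = gaussian_same_prob(obj1[0], obj2[0], sigmas[0])
    pz = gaussian_same_prob(obj1[1], obj2[1], sigmas[1])
    pv = gaussian_same_prob(obj1[2], obj2[2], sigmas[2])
    return px * pz * pv

# Example: millimeter-wave object vs. image object, illustrative sigmas
M = (0.5, 40.0, -2.0)   # (x [m], z [m], Vz [m/s])
I = (0.7, 41.0, -2.2)
P = identity_probability(M, I, sigmas=(1.0, 2.0, 1.0))
```

In use, P would be compared against a threshold to decide whether the two detections are the same physical object, and judged-identical pairs would then be merged into a fusion object.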

Based on the feature quantities of the secondary fusion object S (x coordinate Sx, z coordinate Sz, Vz velocity Sv) generated by the forward three-dimensional object recognition device 5, the vehicle control unit 21 recognizes and monitors forward three-dimensional objects and, as needed, controls the vehicle as a whole, including an alarm device. For example, when a forward object is recognized as a pedestrian and the distance to that pedestrian shortens into a driving situation where the driver must be warned, the alarm device is activated, sounding a cautionary tone from a speaker or displaying warning text on a monitor to alert the driver. When necessary, collision avoidance control is further performed, such as decelerating the vehicle 1 by applying the automatic brake.

The three-dimensional object recognition processing executed by the forward three-dimensional object recognition device 5 follows the routine shown in FIG. 4, which is executed at a set interval after the ignition switch is turned on. First, in step S1, the stereo image captured by the stereo camera device 2 is read, and in step S2 the x coordinate Ix, z coordinate Iz, and Vz velocity Iv of each image object Ii are computed from it and temporarily stored in memory (stereo image processing unit 6).

Next, in step S3, the millimeter-wave signal detected by the millimeter-wave radar device 3 is read, and in step S4 the x coordinate Mx, z coordinate Mz, and Vz velocity Mv of each millimeter-wave object Mi are computed from it and temporarily stored in memory (millimeter-wave signal processing unit 7).

Then, in step S5, the laser signal recognized by the laser radar device 4 is read, and in step S6 the x coordinate Lx, z coordinate Lz, and Vz velocity Lv of each laser object Li are computed from it and temporarily stored in memory (laser signal processing unit 8).

Finally, in step S7, fusion objects are generated from the identity probabilities based on the x coordinates Ix, Mx, Lx, z coordinates Iz, Mz, Lz, and Vz velocities Iv, Mv, Lv of the forward objects Ii, Mi, Li (three-dimensional object detection processing unit 9), and the routine exits.

The fusion object generation of step S7 follows the subroutine shown in FIGS. 5 and 6. Steps S11 to S21 of this subroutine perform the processing of the primary fusion object generation unit 11, and the subsequent steps S22 to S32 perform the processing of the secondary fusion object generation unit 12.

First, in steps S11 to S21, the objects detected by the millimeter-wave radar device 3 are compared with those in the stereo image captured by the stereo camera device 2 to detect identical objects. When millimeter wave and image are compared, the millimeter wave has the higher detection accuracy for the Vz velocity and the z-direction distance (z coordinate), so the millimeter-wave object Mi is set as the reference object.

That is, in step S11, the n-th millimeter-wave object serving as the reference (hereinafter "reference millimeter-wave object") Mn is selected from the millimeter-wave objects Mi detected by the millimeter-wave signal processing unit 7. Here, n is an integer with an initial value of 1.

Next, in step S12, one image object Ii is selected. Then, in step S13, the standard deviation σx in the x direction, the standard deviation σz in the z direction, and the standard deviation σv of the Vz velocity between the millimeter-wave radar device 3 and the stereo camera device 2 are computed. As noted above, when the detection accuracies of millimeter wave and image are compared, the millimeter wave is superior for the z-direction distance and velocity, while the image is superior for the x-direction distance. However, the image's detection accuracy falls as the distance to the object increases.

Therefore, the standard deviations σx, σz, and σv are calculated from the following equations, using as the reference the z coordinate Mz of the reference millimeter-wave solid object Mn, which has comparatively high detection accuracy.

σx=ηx[m]+k1・Mz …(1)
σz=ηz[m]+k2・Mz …(2)
σv=(ηv[m]/t[sec])+k3・Mz …(3)
Here, ηx, ηz, and ηv are values that correct for the variation in detection accuracy between the millimeter wave and the image in the x direction, the z direction, and the Vz velocity, respectively, and are obtained as
ηx=(α・kmx)+(β・kix)
ηz=(α・kmz)+(β・kiz)
ηv=(α・kmv)+(β・kiv)
where α and β are coefficients determined in advance by experiment or the like. Further, kmx, kmz, and kmv are the x-direction, z-direction, and Vz-velocity errors of the millimeter-wave radar device 3, and kix, kiz, and kiv are the x-direction, z-direction, and Vz-velocity errors of the stereo camera device 2; these are likewise determined in advance by experiment or the like.

In this way, in the present embodiment the standard deviations σx, σz, and σv are variably set on the basis of the z coordinate (the distance coordinate in the z direction) Mz of the reference millimeter-wave solid object Mn detected by the millimeter-wave radar device 3, whose detection accuracy is high, so a more accurate Gaussian distribution can be obtained.
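Under the assumptions stated in the text, the calculation of equations (1) to (3) can be sketched as follows; the coefficient values ηx, ηz, ηv, t, and k1 to k3 used here are placeholders, since the embodiment determines them in advance by experiment:

```python
# Sketch of the standard-deviation calculation in step S13 (equations (1)-(3)).
# All coefficient values are illustrative placeholders, not values from the patent.
def detection_std_devs(mz, eta_x=0.5, eta_z=0.5, eta_v=0.5, t=0.1,
                       k1=0.01, k2=0.005, k3=0.002):
    """Return (sigma_x, sigma_z, sigma_v) scaled by the reference z coordinate mz."""
    sigma_x = eta_x + k1 * mz          # (1) sigma_x = eta_x[m] + k1*Mz
    sigma_z = eta_z + k2 * mz          # (2) sigma_z = eta_z[m] + k2*Mz
    sigma_v = (eta_v / t) + k3 * mz    # (3) velocity term uses eta_v[m] / t[sec]
    return sigma_x, sigma_z, sigma_v
```

Because each σ grows with Mz, the Gaussian used for the identity judgment widens for distant objects, tolerating the larger sensor errors there.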

Next, the process proceeds to step S14, where the identity probabilities Px, Pz, and Pv of the x coordinates Mx, Ix, the z coordinates Mz, Iz, and the Vz velocities Mv, Iv of the reference millimeter-wave solid object Mn and the image solid object Ii are each set on the basis of the probability density functions shown in the following equations. In these equations, the x coordinate, z coordinate, and Vz velocity of the reference millimeter-wave solid object Mn and the image solid object Ii are written as Mn=(x1,z1,v1) and Ii=(x2,z2,v2), respectively.

Px=exp(−(x2−x1)²/(2σx²)) …(4)
Pz=exp(−(z2−z1)²/(2σz²)) …(5)
Pv=exp(−(v2−v1)²/(2σv²)) …(6)

Equations (4) to (6) mean that Gaussian distributions for the errors of the x coordinates x1, x2, the z coordinates z1, z2, and the Vz velocities v1, v2 of the reference millimeter-wave solid object Mn and the image solid object Ii are obtained with the x coordinate x1, the z coordinate z1, and the Vz velocity v1 as references, and that the identity probability of Mn and Ii is calculated from the differences (x2−x1), (z2−z1), and (v2−v1). The maximum value of each identity probability Px, Pz, Pv is 1 (100 [%]); accordingly, as shown in FIG. 7, when the x coordinates x1, x2 differ, for example x2&gt;x1, then Px&lt;1. The same applies to the z coordinates z1, z2 and the Vz velocities v1, v2.

In the present embodiment, these identity probabilities Px, Pz, and Pv are set by map lookup. Each map (the x-coordinate identity probability map, the z-coordinate identity probability map, and the Vz-velocity identity probability map) stores in advance a Gaussian distribution table defined by the x coordinate Mx, z coordinate Mz, or Vz velocity Mv of the reference millimeter-wave solid object Mn and the corresponding standard deviation σx, σz, or σv; on the basis of these tables, the identity probabilities Px, Pz, and Pv are set with the x coordinate Ix, the z coordinate Iz, and the Vz velocity Iv as variables.
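Assuming the Gaussian form described above (each probability normalized so that identical values give the maximum of 1), a per-feature identity probability can be sketched as:

```python
import math

# Sketch of step S14: the identity probability of one feature (x, z, or Vz)
# of two objects, as a Gaussian of the difference, normalized to a maximum of 1.
def identity_probability(ref_value, other_value, sigma):
    """Return a probability in (0, 1]; 1.0 when the two values coincide."""
    diff = other_value - ref_value
    return math.exp(-(diff * diff) / (2.0 * sigma * sigma))
```

The embodiment obtains the same values by table (map) lookup rather than by evaluating the exponential online, trading memory for computation.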

Next, the process proceeds to step S15, and the solid-object identity probability Pi is calculated from the following equation:
Pi←Px・Pz・Pv …(7)
As described above, since the maximum value of each identity probability Px, Pz, Pv is 1, Pi=1 (100 [%]) when the objects are completely identical, and Pi&lt;1 when there is a difference in any of the coordinate systems.

Thereafter, the process proceeds to step S16, where the solid-object identity probability Pi is compared with a preset threshold value SL1. This threshold value SL1 defines the range within which objects can be regarded as identical, is determined in advance by experiment or the like, and is set to about 40 to 70 [%] in the present embodiment.

When the solid-object identity probability Pi is less than the threshold value SL1 (Pi&lt;SL1), Pi is judged invalid; the process proceeds to step S17, where Pi is cleared (Pi←0), and then to step S18. When Pi is greater than or equal to the threshold value SL1 (Pi≧SL1), Pi is judged valid and the process proceeds directly to step S18. Note that the number of identity probabilities Pi judged valid for one reference millimeter-wave solid object Mn is not necessarily one; several may exist if several probabilities are at or above the threshold value SL1.
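A minimal sketch of steps S15 to S17, combining the per-feature probabilities by equation (7) and applying the threshold SL1 (the value 0.5 here is illustrative; the embodiment uses roughly 40 to 70 [%]):

```python
# Sketch of steps S15-S17: form the solid-object identity probability Pi
# (equation (7)) and clear it to 0 when it falls below the threshold SL1.
def solid_object_identity(px, pz, pv, sl1=0.5):
    """Return Pi = Px*Pz*Pv, or 0.0 when judged invalid (below SL1)."""
    pi = px * pz * pv          # (7) Pi <- Px * Pz * Pv
    return pi if pi >= sl1 else 0.0
```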

When the process proceeds to step S18, the solid-object identity probability Pi set this time is stored. As shown in FIG. 1, suppose, for example, that the millimeter-wave solid object M1 is selected as the reference millimeter-wave solid object and the identity probabilities P with all the image solid objects I1, I2, I3 are calculated with M1 as the reference. If it is determined that M1 and the image solid object I1 are identical but that M1 and the other image solid objects I2, I3 are not, the identity probability P of M1 and I1 is judged valid and stored as it is, while the identity probabilities P of M1 and I2 and of M1 and I3 are each stored as 0. Similarly, for each of the other millimeter-wave solid objects M2, M3, the identity probabilities P with all the image solid objects I1, I2, I3 are calculated.

As a result, the identity probability P is calculated for every combination of the millimeter-wave solid objects M1, M2, M3 and the image solid objects I1, I2, I3. Accordingly, even when no identical millimeter-wave solid object is detected, as with the image solid object I3, the identity probability P is still calculated; in this case it is set to 0.

Of course, the identity probability P is also calculated when no image solid object is detected, as with the millimeter-wave solid object M3; in this case, too, it is set to 0. Note that even a forward solid object not detected as any of the millimeter-wave solid objects M1, M2, M3, such as the image solid object I3 in FIG. 1, has identity probabilities P set with each of M1, M2, M3, and it is integrated as one forward solid object in the generation of the primary fusion solid objects Fi described later.

Thereafter, the process proceeds to step S19, where it is checked whether all the image solid objects Ii have been selected. If an unselected image solid object Ii remains, the process returns to step S12 and the next image solid object Ii is selected. When all the image solid objects Ii have been selected, the process proceeds to step S20.

In step S20, it is checked whether all the millimeter-wave solid objects Mi have been selected. If an unselected millimeter-wave solid object Mi remains, the process returns to step S11, the next millimeter-wave solid object Mi is set as the reference millimeter-wave solid object Mn, and the solid-object identity probabilities Pi with the image solid objects Ii are obtained with this new reference. When all the millimeter-wave solid objects Mi have been selected, the process proceeds to step S21.

In step S21, the primary fusion solid objects Fi are generated. In generating the primary fusion solid objects Fi, first, on the basis of all the calculated solid-object identity probabilities Pi, the pairs of a millimeter-wave solid object Mi and an image solid object Ii determined to be the same solid object are extracted. The determination conditions for extracting the same solid object are as follows.

Condition 1: Pi&gt;0.
Condition 2: the image solid object Ii has the maximum identity probability with respect to the millimeter-wave solid object Mi.
Condition 3: the millimeter-wave solid object Mi has the maximum identity probability with respect to the image solid object Ii.
When all of conditions 1 to 3 are satisfied, the millimeter-wave solid object Mi and the image solid object Ii are determined to be identical.

Then, on the basis of the combinations of a millimeter-wave solid object Mi and an image solid object Ii determined to be identical, and of the millimeter-wave solid objects Mi and image solid objects Ii whose solid-object identity probability Pi is set to 0, the primary fusion solid objects Fi are generated. When there is only an image solid object Ii (no identical millimeter-wave solid object Mi), or only a millimeter-wave solid object Mi (no identical image solid object Ii), the primary fusion solid object Fi is generated from the feature values of the image solid object Ii (x coordinate Ix, z coordinate Iz, Vz velocity Iv) or of the millimeter-wave solid object Mi (x coordinate Mx, z coordinate Mz, Vz velocity Mv) alone. When a millimeter-wave solid object Mi and an image solid object Ii are determined to be identical, the primary fusion solid object Fi is generated from the feature values of whichever of the millimeter-wave radar device 3 and the stereo camera device 2 has the higher detection accuracy.

Specifically, the feature values of the primary fusion solid object Fi are generated according to the following criteria.

・Pi=0 (no identical millimeter-wave solid object Mi; image solid object Ii only)
x coordinate ← x coordinate Ix of the image solid object Ii
z coordinate ← z coordinate Iz of the image solid object Ii
Vz velocity ← Vz velocity Iv of the image solid object Ii
Accordingly, the feature values of a primary fusion solid object Fi generated from an image solid object Ii alone are the x coordinate Ix, the z coordinate Iz, and the Vz velocity Iv.

・Pi=0 (no identical image solid object Ii; millimeter-wave solid object Mi only)
x coordinate ← x coordinate Mx of the millimeter-wave solid object Mi
z coordinate ← z coordinate Mz of the millimeter-wave solid object Mi
Vz velocity ← Vz velocity Mv of the millimeter-wave solid object Mi
Accordingly, the feature values of a primary fusion solid object Fi generated from a millimeter-wave solid object Mi alone are the x coordinate Mx, the z coordinate Mz, and the Vz velocity Mv.

・Pi&gt;0 (the millimeter-wave solid object Mi and the image solid object Ii are identical)
x coordinate ← weighted average of the x coordinates Ix, Mx of the image solid object Ii and the millimeter-wave solid object Mi
z coordinate ← z coordinate Mz of the millimeter-wave solid object Mi
Vz velocity ← Vz velocity Mv of the millimeter-wave solid object Mi
Accordingly, a primary fusion solid object Fi generated when the millimeter-wave solid object Mi and the image solid object Ii are determined to be identical is represented by the x coordinate (the weighted average of Ix and Mx), the z coordinate Mz, and the Vz velocity Mv. As described above, when the detection accuracies of the millimeter wave and the image are compared, the millimeter wave is more accurate for the distance and velocity in the z direction and the image is more accurate for the position in the x direction; the image detection accuracy for the x direction, however, decreases as the object's distance increases.

For this reason, for the feature values representing the primary fusion solid object Fi when a millimeter-wave solid object Mi and an image solid object Ii are determined to be identical, the z coordinate Mz and the Vz velocity Mv of the millimeter-wave solid object Mi are adopted preferentially for the z coordinate and the Vz velocity, while the x coordinate is calculated from the weighted average of the two x coordinates Ix and Mx, thereby improving the detection accuracy.
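The primary-fusion feature selection above can be sketched as follows; the weight w_image is an assumption, since the text specifies only that a weighted average of the two x coordinates is used:

```python
# Sketch of the primary-fusion feature selection in step S21.
# mw and img are (x, z, vz) tuples; either may be None for an unmatched object.
# w_image = 0.5 is an illustrative weight, not a value given in the patent.
def primary_fusion(mw=None, img=None, w_image=0.5):
    """Return the (x, z, vz) feature values of the primary fusion object Fi."""
    if mw is None:                       # image solid object only (Pi = 0)
        return img
    if img is None:                      # millimeter-wave solid object only (Pi = 0)
        return mw
    x = w_image * img[0] + (1.0 - w_image) * mw[0]   # weighted-average x
    return (x, mw[1], mw[2])             # z and Vz from the millimeter wave
```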

For example, in FIG. 1, when the millimeter-wave solid object M1 and the image solid object I1, and the millimeter-wave solid object M2 and the image solid object I2, are each determined to be identical, these pairs are fused to generate the primary fusion solid objects F1, F2 shown by the medium-weight frames in the figure. The image solid object I3 has no identical millimeter-wave solid object, and the millimeter-wave solid object M3 has no identical image solid object; therefore, the primary fusion solid objects F3, F4 are generated from the feature values of the image solid object I3 and the millimeter-wave solid object M3 themselves.

When all the primary fusion solid objects Fi have been generated in step S21 described above, the program proceeds to step S22; from step S22 onward, the secondary fusion solid objects Si are generated from the combinations of the primary fusion solid objects Fi and the laser solid objects Li.

First, in step S22, the n-th primary fusion solid object serving as the reference (hereinafter, the "reference primary fusion solid object") Fn is selected from the primary fusion solid objects Fi. Here, n is an integer with an initial value of 1.

Next, the process proceeds to step S23, and one laser solid object Li is selected. Then, in step S24, the standard deviation σx in the x direction, the standard deviation σz in the z direction, and the standard deviation σv of the Vz velocity for the millimeter-wave radar device 3, the stereo camera device 2, and the laser radar device 4 are calculated. These standard deviations σx, σz, and σv are calculated from the following equations on the basis of the z coordinate Fz of the reference primary fusion solid object Fn.

σx=ηx[m]+k1・Fz …(1’)
σz=ηz[m]+k2・Fz …(2’)
σv=(ηv[m]/t[sec])+k3・Fz …(3’)
Here, ηx, ηz, and ηv are values that correct for the variation in detection accuracy among the millimeter wave, the image, and the laser light in the x direction, the z direction, and the Vz velocity, respectively, and are obtained as
ηx=(α・kmx)+(β・kix)+(γ・klx)
ηz=(α・kmz)+(β・kiz)+(γ・klz)
ηv=(α・kmv)+(β・kiv)+(γ・klv)
where α, β, and γ are coefficients determined in advance by experiment or the like. Further, kmx, kmz, and kmv are the x-direction, z-direction, and Vz-velocity errors of the millimeter-wave radar device 3; kix, kiz, and kiv are those of the stereo camera device 2; and klx, klz, and klv are those of the laser radar device 4, all determined in advance by experiment or the like.

In this way, in the present embodiment the standard deviations σx, σz, and σv are variably set on the basis of the z coordinate (the distance coordinate in the z direction) Fz of the reference primary fusion solid object Fn, which is generated from the forward solid objects Mi, Ii detected by the millimeter-wave radar device 3 or the stereo camera device 2, whose detection accuracy is higher than that of the laser radar device 4; a more accurate Gaussian distribution can therefore be obtained.

Next, the process proceeds to step S25, where the identity probabilities Px, Pz, and Pv of the x coordinates Fx, Lx, the z coordinates Fz, Lz, and the Vz velocities Fv, Lv of the reference primary fusion solid object Fn and the laser solid object Li are each set on the basis of the probability density functions shown in equations (4) to (6) above. In this case, the x coordinate, z coordinate, and Vz velocity of the reference primary fusion solid object Fn and the laser solid object Li are written as Fn=(x1,z1,v1) and Li=(x2,z2,v2), respectively. In the present embodiment, these identity probabilities Px, Pz, and Pv are likewise set by map lookup: each map (the x-coordinate identity probability map, the z-coordinate identity probability map, and the Vz-velocity identity probability map) stores in advance a Gaussian distribution table defined by the x coordinate Fx, z coordinate Fz, or Vz velocity Fv of the reference primary fusion solid object Fn and the corresponding standard deviation σx, σz, or σv, and on the basis of these tables the identity probabilities Px, Pz, and Pv are set with the x coordinate Lx, the z coordinate Lz, and the Vz velocity Lv as variables.

Next, the process proceeds to step S26, and the solid-object identity probability Pi is calculated from equation (7) above.

Thereafter, the process proceeds to step S27, where the solid-object identity probability Pi is compared with a preset threshold value SL2. This threshold value SL2 defines the range within which objects can be regarded as identical, is determined in advance by experiment or the like, and is set to about 40 to 70 [%] in the present embodiment.

When the solid-object identity probability Pi is less than the threshold value SL2 (Pi&lt;SL2), the process proceeds to step S28, Pi is cleared (Pi←0), and the process proceeds to step S29. When Pi is greater than or equal to the threshold value SL2 (Pi≧SL2), the process proceeds directly to step S29. In step S29, the solid-object identity probability Pi set this time is stored.

For example, in FIG. 1, suppose the primary fusion solid object F1 is selected as the reference primary fusion solid object and the identity probabilities P with all the laser solid objects L1, L2 are calculated with F1 as the reference. If it is determined that F1 and the laser solid object L1 are identical but that F1 and the other laser solid object L2 are not, the identity probability P of F1 and L1 is judged valid and its value is stored as it is, while the identity probability P of F1 and L2 is stored as 0. Similarly, for each of the other primary fusion solid objects F2 to F4, the identity probabilities P with all the laser solid objects L1, L2 are calculated. As a result, if, for example, the primary fusion solid object F3 and the laser solid object L2 are determined to be identical, the value of their identity probability P is stored as it is; on the other hand, since the primary fusion solid object F4 has no identical laser solid object, its identity probability P is set to 0.

Thereafter, the process proceeds to step S30, where it is checked whether all the laser solid objects Li have been selected. If an unselected laser solid object Li remains, the process returns to step S23 and the next laser solid object Li is selected. When all the laser solid objects Li have been selected, the process proceeds to step S31.

In step S31, it is checked whether all the primary fusion solid objects Fi have been selected. If an unselected primary fusion solid object Fi remains, the process returns to step S22, the next primary fusion solid object Fi is set as the reference primary fusion solid object Fn, and the solid-object identity probabilities Pi with the laser solid objects Li are obtained with this new reference. When all the primary fusion solid objects Fi have been selected, the process proceeds to step S32.

In step S32, the secondary fusion solid objects Si are generated. In generating the secondary fusion solid objects Si, first, on the basis of all the calculated solid-object identity probabilities Pi, the pairs of a primary fusion solid object Fi and a laser solid object Li determined to be the same solid object are extracted. The determination conditions for extracting the same solid object are as follows.

Condition 1: Pi&gt;0.
Condition 2: the laser solid object Li has the maximum identity probability with respect to the primary fusion solid object Fi.
Condition 3: the primary fusion solid object Fi has the maximum identity probability with respect to the laser solid object Li.
When all of conditions 1 to 3 are satisfied, the primary fusion solid object Fi and the laser solid object Li are determined to be identical.

Then, on the basis of the combinations of a primary fusion solid object Fi and a laser solid object Li determined to be identical, and of the primary fusion solid objects Fi and laser solid objects Li whose solid-object identity probability Pi is set to 0, the secondary fusion solid objects Si are generated.

In this case, when there is only a laser solid object Li (no identical primary fusion solid object Fi), or only a primary fusion solid object Fi (no identical laser solid object Li), the secondary fusion solid object Si is generated from the feature values of the laser solid object Li (x coordinate Lx, z coordinate Lz, Vz velocity Lv) or of the primary fusion solid object Fi (x coordinate Fx, z coordinate Fz, Vz velocity Fv) alone. When a primary fusion solid object Fi and a laser solid object Li are determined to be identical, the secondary fusion solid object Si is generated from the feature values with the higher detection accuracy.

Specifically, the feature values of the secondary fusion solid object Si are generated according to the following criteria.

・Pi=0 (no identical primary fusion solid object Fi; laser solid object Li only)
x coordinate ← x coordinate Lx of the laser solid object Li
z coordinate ← z coordinate Lz of the laser solid object Li
Vz velocity ← Vz velocity Lv of the laser solid object Li
Accordingly, the feature values of a secondary fusion solid object Si generated from a laser solid object Li alone are the x coordinate Lx, the z coordinate Lz, and the Vz velocity Lv.

・Pi=0 (no identical laser solid object Li; primary fusion solid object Fi only)
x coordinate ← x coordinate Fx of the primary fusion solid object Fi
z coordinate ← z coordinate Fz of the primary fusion solid object Fi
Vz velocity ← Vz velocity Fv of the primary fusion solid object Fi
Accordingly, the feature values of a secondary fusion solid object Si generated from a primary fusion solid object Fi alone are the x coordinate Fx, the z coordinate Fz, and the Vz velocity Fv.

・Pi&gt;0 (the primary fusion solid object Fi and the laser solid object Li are identical), with the primary fusion solid object Fi generated from an image solid object Ii alone
x coordinate ← weighted average of the x coordinates Fx, Lx of the primary fusion solid object Fi and the laser solid object Li
z coordinate ← z coordinate Lz of the laser solid object Li
Vz velocity ← Vz velocity Lv of the laser solid object Li
Accordingly, the feature values of a secondary fusion solid object Si generated from a laser solid object Li and a primary fusion solid object Fi consisting of an image solid object Ii alone are the x coordinate (the weighted average of Fx and Lx), the z coordinate Lz, and the Vz velocity Lv.

Comparing the detection accuracy of the stereo camera device 2 and the laser radar device 4, the two are comparable in the x direction, so the x coordinate is obtained as a weighted average of the x coordinates Fx and Lx. In the z direction, however, the laser radar device 4 is the more accurate, so the z coordinate Lz and the Vz velocity Lv detected by the laser radar device 4 are adopted.
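As a concrete illustration of the weighted averaging described above, an inverse-variance weighting is one common choice; the patent does not specify the weights, so the function name and the standard deviations below are assumptions:

```python
# Hypothetical sketch: fusing the x coordinates Fx (from the primary fusion
# solid object) and Lx (from the laser solid object) by an inverse-variance
# weighted average. The sigma values are illustrative assumptions only.

def fuse_x(fx: float, lx: float, sigma_f: float = 0.4, sigma_l: float = 0.4) -> float:
    """Weighted average of two x-coordinate measurements (weights = 1/sigma^2)."""
    wf = 1.0 / sigma_f ** 2
    wl = 1.0 / sigma_l ** 2
    return (wf * fx + wl * lx) / (wf + wl)

# With comparable accuracy in the x direction (as stated in the text),
# equal sigmas make the fused x coordinate the simple midpoint.
print(round(fuse_x(1.2, 1.4), 3))
```

When one sensor is modeled as more accurate (smaller sigma), its measurement automatically dominates the fused coordinate.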

・Pi > 0 (the primary fusion solid object Fi and the laser solid object Li are the same), and the primary fusion solid object Fi was generated from the millimeter wave solid object Mi alone
x coordinate ← weighted average of the x coordinates Fx, Lx of the primary fusion solid object Fi and the laser solid object Li
z coordinate ← z coordinate Mz of the millimeter wave solid object Mi
Vz velocity ← Vz velocity Mv of the millimeter wave solid object Mi
Accordingly, the feature quantities of a secondary fusion solid object Si generated from a primary fusion solid object Fi based on the millimeter wave solid object Mi alone and the laser solid object Li are the x coordinate (weighted average of Fx and Lx), the z coordinate Mz, and the Vz velocity Mv.

Comparing the detection accuracy of the millimeter wave radar device 3 and the laser radar device 4, the two are comparable in the x direction, so the x coordinate is obtained as a weighted average of the x coordinates Fx and Lx. In the z direction, however, the millimeter wave radar device 3 is the more accurate, so the z coordinate Mz and the Vz velocity Mv detected by the millimeter wave radar device 3 are adopted.

・Pi > 0 (the primary fusion solid object Fi and the laser solid object Li are the same), and the primary fusion solid object Fi was generated from the millimeter wave solid object Mi and the image solid object Ii
x coordinate ← weighted average of the x coordinates Fx, Lx of the primary fusion solid object Fi and the laser solid object Li
z coordinate ← z coordinate Mz of the millimeter wave solid object Mi
Vz velocity ← Vz velocity Mv of the millimeter wave solid object Mi
Accordingly, the feature quantities of a secondary fusion solid object Si generated from a primary fusion solid object Fi based on the millimeter wave solid object Mi and the image solid object Ii, together with the laser solid object Li, are the x coordinate (weighted average of Fx and Lx), the z coordinate Mz, and the Vz velocity Mv. The reasons for these selections are as described above.
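The case analysis above can be gathered into a single rule. The sketch below is an illustration only: the function and field names are assumptions, and the weighted average is shown as a simple mean because the text states the x-direction accuracies are comparable. Note that when the millimeter wave radar contributed to Fi, Fi already carries Mz and Mv as its z coordinate and Vz velocity.

```python
# Hypothetical sketch of the secondary-fusion rules. Each solid object is a
# dict with keys "x", "z", "vz"; fi additionally carries "source", the set of
# sensors that produced the primary fusion solid object Fi.

def make_secondary_fusion(pi, fi=None, li=None):
    """Build the feature quantities (x, z, vz) of a secondary fusion solid object Si."""
    if pi == 0:
        # Not the same object: take over the features of whichever object exists.
        src = li if fi is None else fi
        return {"x": src["x"], "z": src["z"], "vz": src["vz"]}
    # pi > 0: Fi and Li are the same object.
    x = 0.5 * (fi["x"] + li["x"])  # weighted average of Fx and Lx (equal weights here)
    if fi["source"] == {"image"}:
        # Fi came from the stereo camera only: the laser radar is better in z.
        return {"x": x, "z": li["z"], "vz": li["vz"]}
    # Fi involves the millimeter wave radar, so Fi's z and vz are Mz and Mv,
    # and the radar is the better z-direction sensor: keep them.
    return {"x": x, "z": fi["z"], "vz": fi["vz"]}
```

For example, an image-only Fi merged with a matching Li yields Si with the averaged x coordinate and the laser's z coordinate and Vz velocity.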

Then, after all the secondary fusion solid objects Si have been generated in step S32 described above, the routine is exited.

Next, the secondary fusion solid objects Si generated by the solid object detection processing unit 9 shown in FIG. 2 are output to the vehicle control unit 21. The vehicle control unit 21 recognizes and monitors forward solid objects based on the feature quantities of each secondary fusion solid object Si (x coordinate Sx, z coordinate Sz, Vz velocity Sv). When a collision with a secondary fusion solid object Si is determined to be unavoidable, collision avoidance control is performed, such as actuating the automatic brake to decelerate the vehicle 1.
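The patent does not specify how "collision unavoidable" is judged. One common criterion in such systems is a time-to-collision (TTC) threshold on the fused feature quantities; the sketch below is purely illustrative, and the function name, sign convention (negative Vz means closing), and all thresholds are assumptions:

```python
# Hypothetical collision-unavoidability check on a secondary fusion solid
# object Si, using its z coordinate Sz (distance ahead), Vz velocity Sv, and
# x coordinate Sx. All thresholds are illustrative assumptions.

def collision_unavoidable(sz_m, sv_mps, sx_m=0.0,
                          ttc_limit_s=1.0, lane_half_width_m=0.9):
    """True if Si lies in the own lane and is closing faster than the TTC limit allows."""
    if abs(sx_m) > lane_half_width_m:
        return False          # laterally outside the own lane
    if sv_mps >= 0.0:
        return False          # not closing (negative Vz = approaching, by assumption)
    ttc = sz_m / -sv_mps      # seconds until the longitudinal gap closes
    return ttc < ttc_limit_s
```

For example, an object 8 m ahead closing at 10 m/s has a TTC of 0.8 s, which would trip a 1.0 s limit and trigger the automatic brake.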

As described above, in the present embodiment a single forward solid object (secondary fusion solid object) Si is generated for an object ahead of the vehicle 1 by combining the stereo camera device 2, the millimeter wave radar device 3, and the laser radar device 4, each of which has different detection characteristics. Even when the detection accuracy of one or two of these devices drops in a particular driving environment, the loss is compensated by the remaining devices, so there is no need to add auxiliary sensors to make up for the drop in accuracy as in conventional systems. A large increase in the number of parts and a rise in product cost are thus avoided, and high reliability is obtained. For example, in driving environments such as nighttime, rain, or snowfall, a forward solid object missed by the stereo camera device 2 or the millimeter wave radar device 3 can still be detected by the laser radar device 4, giving higher overall detection accuracy for forward solid objects.

In addition, since only three kinds of devices are used, namely the stereo camera device 2, the millimeter wave radar device 3, and the laser radar device 4, a large increase in the number of parts is suppressed. Furthermore, because these are widely available devices, a rise in component cost is also suppressed.

Incidentally, the forward solid object recognition technique of the present embodiment can also be applied to pedestrian recognition. That is, whether a detected secondary fusion solid object Si is a pedestrian is first roughly determined based on its feature quantities (x coordinate Sx, z coordinate Sz, Vz velocity Sv), the width information of the secondary fusion solid object Si detected by the stereo camera device 2, and its moving speed in the x direction. When the object is judged to be a pedestrian, the image of that secondary fusion solid object captured by the stereo camera device 2 is then image-processed to examine in more detail whether it really is a pedestrian.

By roughly screening for pedestrians on the basis of the feature quantities of the secondary fusion solid objects Si, the computational load required for pedestrian recognition can be greatly reduced compared with image-processing every solid object.
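This two-stage screening can be sketched as follows. The patent gives no numeric criteria, so the gate thresholds, function names, and object fields below are all assumptions; the point is only that the expensive image classifier runs on the few coarse candidates rather than on every solid object:

```python
# Hypothetical two-stage pedestrian check: a cheap gate on the fused feature
# quantities first, image processing only for the survivors. All thresholds
# are illustrative assumptions.

def is_pedestrian_candidate(width_m, speed_z_mps, speed_x_mps):
    """Coarse gate: pedestrians are narrow and slow in both directions."""
    return width_m < 1.2 and abs(speed_z_mps) < 4.0 and abs(speed_x_mps) < 3.0

def classify_pedestrians(objects, image_classifier):
    """Run the expensive image classifier only on the coarse candidates."""
    return [o for o in objects
            if is_pedestrian_candidate(o["width"], o["vz"], o["vx"])
            and image_classifier(o)]
```

A wide, fast-moving object (a vehicle) is rejected by the gate alone and never reaches the image-processing stage, which is where the computational saving comes from.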

Note that the present embodiment is not limited to the form described above. For example, in addition to the stereo camera device 2, the millimeter wave radar device 3, and the laser radar device 4, other forward recognition means having detection characteristics different from these, such as a near-infrared laser radar device, may be added so that four or more types are used.

Plan view of the vehicle
Schematic configuration diagram of the forward solid object recognition device mounted on the vehicle
Functional block diagram of the solid object detection processing unit
Flowchart showing the solid object recognition processing routine
Flowchart showing the fusion solid object generation routine (part 1)
Flowchart showing the fusion solid object generation routine (part 2)
Chart showing the same-probabilities of the lateral positions of forward solid objects

Explanation of symbols

1 ... vehicle
2 ... stereo camera device
3 ... millimeter wave radar device
4 ... laser radar device
5 ... forward solid object recognition device
6 ... stereo image processing unit
7 ... millimeter wave signal processing unit
8 ... laser signal processing unit
9 ... solid object detection processing unit
11 ... primary fusion solid object generation unit
12 ... secondary fusion solid object generation unit
Fi ... primary fusion solid object
Fn ... reference primary fusion solid object
Ii, Mi, Li ... solid objects
Iv, Mv, Lv, Sv ... Vz velocities
Ix, Mx, Lx, Sx ... x coordinates
Iz, Mz, Lz, Sz ... z coordinates
Li ... laser solid object
Mi ... millimeter wave solid object
Mn ... reference millimeter wave solid object
Pi ... solid object same-probability
Px, Pz, Pv ... same-probabilities
Si ... secondary fusion solid object
σx, σz, σv ... standard deviations

Claims (5)

A forward solid object recognition device for a vehicle, comprising at least three types of forward recognition means mounted on the vehicle and having mutually different detection characteristics, and a solid object detection processing unit that detects the same forward solid object among the forward solid objects detected by the respective forward recognition means, wherein
the solid object detection processing unit has a fusion solid object generation unit that: obtains, from each forward solid object, feature quantities specifying that forward solid object; obtains the same-probability of the feature quantities for all combinations of the forward solid objects detected by the respective forward recognition means, based on a Gaussian distribution of the error of each feature quantity; determines, based on the obtained same-probability, whether the forward solid objects include the same solid object; generates, when they are determined to be the same, the feature quantities of a fusion solid object based on the feature quantities of those forward solid objects; and generates, when they are determined not to be the same, the feature quantities of the fusion solid object from only the feature quantities of the forward solid objects determined not to be the same.

The forward solid object recognition device for a vehicle according to claim 1, wherein the fusion solid object generation unit variably sets the standard deviation of the Gaussian distribution according to the detection accuracy of the forward recognition means that detected the forward solid object for which the Gaussian distribution is set.

The forward solid object recognition device for a vehicle according to claim 1 or 2, wherein the fusion solid object generation unit sets the feature quantities of the fusion solid object according to the detection characteristics of the forward recognition means that detected each forward solid object on which the fusion solid object is based.

The forward solid object recognition device for a vehicle according to any one of claims 1 to 3, wherein the feature quantities comprise a distance coordinate in the vehicle longitudinal direction with the vehicle as the origin, a position coordinate in the vehicle width direction, and a relative velocity obtained based on the temporal change of the distance coordinate in the vehicle longitudinal direction.

The forward solid object recognition device for a vehicle according to any one of claims 1 to 4, wherein the forward recognition means are at least a millimeter wave radar device, a stereo camera device, and a laser radar device.
JP2006006700A 2006-01-13 2006-01-13 Vehicle front three-dimensional object recognition device Active JP4712562B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006006700A JP4712562B2 (en) 2006-01-13 2006-01-13 Vehicle front three-dimensional object recognition device


Publications (2)

Publication Number Publication Date
JP2007188354A true JP2007188354A (en) 2007-07-26
JP4712562B2 JP4712562B2 (en) 2011-06-29

Family

ID=38343473

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006006700A Active JP4712562B2 (en) 2006-01-13 2006-01-13 Vehicle front three-dimensional object recognition device

Country Status (1)

Country Link
JP (1) JP4712562B2 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7359048B2 (en) 2020-03-16 2023-10-11 株式会社デンソー Driving support devices and driving support programs

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004037239A (en) * 2002-07-03 2004-02-05 Fuji Heavy Ind Ltd Identical object judging method and system, and misregistration correcting method and system
JP2004117071A (en) * 2002-09-24 2004-04-15 Fuji Heavy Ind Ltd Vehicle surroundings monitoring apparatus and traveling control system incorporating the same


Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015200622A (en) * 2014-04-10 2015-11-12 本田技研工業株式会社 object recognition device
JP2017220157A (en) * 2016-06-10 2017-12-14 三菱電機株式会社 Object recognition processor, object recognition processing method and automatic driving system
JP2017223680A (en) * 2016-12-30 2017-12-21 東軟集団股▲分▼有限公司 Method and device for generating target detection information, and equipment
US10217005B2 (en) 2016-12-30 2019-02-26 Neusoft Corporation Method, apparatus and device for generating target detection information
CN110325423A (en) * 2017-02-23 2019-10-11 本田技研工业株式会社 Vehicle control system and control method
CN107590433A (en) * 2017-08-04 2018-01-16 湖南星云智能科技有限公司 A kind of pedestrian detection method based on millimetre-wave radar and vehicle-mounted camera
JP2019046251A (en) * 2017-09-04 2019-03-22 株式会社デンソーテン Target detection equipment, driving assist system, and target detection method
CN108226883A (en) * 2017-11-28 2018-06-29 深圳市易成自动驾驶技术有限公司 Test the method, apparatus and computer readable storage medium of millimetre-wave radar performance
JP2020052897A (en) * 2018-09-28 2020-04-02 株式会社デンソーテン Target detection device and target detection method
JP2020122772A (en) * 2019-01-31 2020-08-13 株式会社デンソー Object determination device
JP7069061B2 (en) 2019-01-31 2022-05-17 株式会社デンソー Object judgment device
WO2021033591A1 (en) * 2019-08-22 2021-02-25 ソニー株式会社 Information processing device, information processing method, and program

Also Published As

Publication number Publication date
JP4712562B2 (en) 2011-06-29

Similar Documents

Publication Publication Date Title
JP4712562B2 (en) Vehicle front three-dimensional object recognition device
CN106909152B (en) Automobile-used environmental perception system and car
US9809223B2 (en) Driving assistant for vehicles
JP6369390B2 (en) Lane junction determination device
JP4809019B2 (en) Obstacle detection device for vehicle
US10956757B2 (en) Image processing device, outside recognition device
JP6202367B2 (en) Image processing device, distance measurement device, mobile device control system, mobile device, and image processing program
US8204678B2 (en) Vehicle drive assist system
JP6787157B2 (en) Vehicle control device
US10074021B2 (en) Object detection apparatus, object detection method, and program
JP5145585B2 (en) Target detection device
US10752223B2 (en) Autonomous emergency braking system and method for vehicle at crossroad
JP2009086787A (en) Vehicle detection device
JP2007310741A (en) Solid object recognition device
JP3727400B2 (en) Crossing detection device
JP2020016541A (en) Display controller for vehicles, display control method for vehicles, and control program
US11760275B2 (en) Image pickup system and image pickup device
JP2006011570A (en) Camera calibration method and camera calibration device
JP4848644B2 (en) Obstacle recognition system
JP6564127B2 (en) VISUAL SYSTEM FOR AUTOMOBILE AND METHOD FOR CONTROLLING VISUAL SYSTEM
JP2024518934A (en) Optical interference detection during vehicle navigation
JP2019197418A (en) Obstacle detection apparatus
JP4376147B2 (en) Obstacle recognition method and obstacle recognition device
JP6604052B2 (en) Runway boundary estimation device and runway boundary estimation method
US20230286548A1 (en) Electronic instrument, movable apparatus, distance calculation method, and storage medium

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20081212

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20101215

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20101221

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110216

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110308

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110323

R150 Certificate of patent or registration of utility model

Ref document number: 4712562

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350
