JPH03200067A - Method for separating vehicles in picture processing system vehicle sensor - Google Patents

Method for separating vehicles in picture processing system vehicle sensor

Info

Publication number
JPH03200067A
JPH03200067A (application JP33855789A)
Authority
JP
Japan
Prior art keywords
vehicle
time
parallel
feature point
vehicles
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP33855789A
Other languages
Japanese (ja)
Other versions
JP2797573B2 (en)
Inventor
Kazuto Nishiyama
和人 西山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sumitomo Electric Industries Ltd
Original Assignee
Sumitomo Electric Industries Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sumitomo Electric Industries Ltd filed Critical Sumitomo Electric Industries Ltd
Priority to JP33855789A priority Critical patent/JP2797573B2/en
Publication of JPH03200067A publication Critical patent/JPH03200067A/en
Application granted granted Critical
Publication of JP2797573B2 publication Critical patent/JP2797573B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Landscapes

  • Traffic Control Systems (AREA)

Abstract

PURPOSE: To measure traffic volume accurately by reliably recognizing the presence of vehicles travelling side by side and detecting them separately whenever the captured image contains a picture indicating the presence of parallel vehicles.

CONSTITUTION: Suppose that part of a small vehicle 12, travelling in the lane adjacent to a large vehicle 11 under discrimination, is hidden. The position on the x axis of the measurement-area plane of a feature point C of the vehicle 12, observed overlapping at a time t, is the point (a), and the actual feature points C and G of the two vehicles both lie on the optical axis 1 that connects the point (a) with the camera center. In the state at a later time t', on the other hand, the feature points C and G have advanced to the positions indicated by C' and G' respectively. The optical axes 2 and 3, which connect the camera center with the points C' and G', are then different line segments, and their positions on the measurement-area plane are likewise the distinct points (c) and (d). By thus discriminating whether a feature point is the head portion of a parallel vehicle, vehicles observed overlapping each other can be detected separately.

Description

Detailed Description of the Invention

[Industrial Field of Application] The present invention relates to a vehicle separation method in an image processing type vehicle sensor that images vehicles and, on the basis of the imaging results, recognizes their presence or absence, shape, moving speed, and the like.

[Prior Art] Conventionally, as a method of separating the images of vehicles travelling side by side in an adjacent lane in an image processing type vehicle sensor, a method has been proposed in which the width of the vehicle under discrimination is measured on the captured screen and the vehicle images are separated accordingly (see, for example, "An Attempt at Vehicle Dynamics Measurement," Proc. 17th Image Engineering Conference, 16-13, pp. 295-298).

[Problems to be Solved by the Invention] However, the conventional method judged that parallel vehicles overlapped when the vehicle width measured on the captured screen exceeded a preset value, or when the width measured continuously along the direction of travel of a single vehicle changed. It therefore had the following problems:

(1) For the wide variety of vehicle types, the vehicle width cannot be fixed uniquely in advance.

(2) When a moving vehicle is photographed obliquely, the apparent vehicle width differs with the viewing position even for the same vehicle, as shown at (a), (b), and (c) in Fig. 5.

Because of these problems, the vehicle width could not be extracted (calculated) accurately.

As a result, when vehicles were travelling side by side, the conventional method often misrecognized two vehicles as one.

The object of the present invention is therefore to eliminate these problems and to propose a vehicle separation method that, when the captured screen contains an image indicating the presence of a parallel vehicle, reliably recognizes that presence and separates the vehicles.

[Means for Solving the Problems] To achieve this object, the present invention images a moving vehicle to be sensed with an image processing type vehicle sensor at a first time. When a feature point indicating the presence of a parallel vehicle exists in the two-dimensional image obtained by that imaging, the vehicle to be sensed is subsequently imaged at a second time. From the result of the imaging at the first time, first position coordinates in three-dimensional space at the second time are calculated by a monocular stereoscopic method, assuming the feature point to be part of the vehicle to be sensed. Likewise, from the result of the imaging at the first time, second position coordinates of the feature point in three-dimensional space at the second time are calculated by the monocular stereoscopic method, assuming the feature point to be part of the parallel vehicle. From the result of the imaging at the second time, third position coordinates of the feature point in three-dimensional space are calculated by the monocular stereoscopic method. When the third position coordinates coincide with the first position coordinates, the feature point is determined to be part of the vehicle to be sensed; when the third position coordinates coincide with the second position coordinates, the feature point is determined to be part of the parallel vehicle, and the parallel vehicle is thereby separated.

[Operation] The present invention exploits the fact that the position of the sensed vehicle in three-dimensional space can be calculated from the two-dimensional image obtained by imaging. From the imaging result at the first time, the position in three-dimensional space at the second time is predicted under two separate assumptions about the feature point that represents the parallel vehicle in the first image: that it is part of the vehicle to be sensed, or that it is part of the parallel vehicle. Comparing the predicted position under each assumption with the position actually obtained from the imaging result at the second time identifies whether the feature point belongs to the sensed vehicle or to the parallel vehicle. As a result, even when the sensed vehicle and the parallel vehicle are close together and hard to separate, vehicles of unspecified types can be separated reliably.

[Embodiment] An embodiment of the present invention will now be described in detail with reference to the drawings.

Before describing the invention itself, a vehicle sensor to which the invention is applied will be described with reference to Fig. 2.

The vehicle sensor consists of a camera unit 1 and a control unit 2. The camera unit 1 can be set to cover a measurement area of up to four road lanes and is mounted on a support 3 so as to photograph moving vehicles obliquely from above. The camera unit 1 photoelectrically converts the subject image and outputs the imaging result in the form of an image signal.

The control unit 2 applies the image analysis processing described below to each screen of image data (the signal) received from the camera unit 1 and judges whether a parallel vehicle is present. The processing procedure for vehicle separation applied to the image data obtained from the camera unit 1 is as follows.

This example uses a screen, shown in Fig. 3, in which the vehicle 11 under discrimination and a vehicle 12 are travelling side by side.

In the screen of Fig. 3, a large bus travelling in the center lane from the top of the screen (upstream on the road) toward the bottom (downstream) is the vehicle 11 under discrimination; part of its body, such as the roof, protrudes into the adjacent center-side lane. The parallel vehicle 12 travelling in the adjacent lane is a small vehicle, part of whose body is hidden by the bus.

(a) The vehicle portions of the image data sent from the camera unit 1 are extracted by a well-known subject separation method.

Specifically, the image data are binarized (black/white) by comparing the brightness level of each pixel with a threshold, producing a screen, shown in Fig. 4, in which the vehicle portion 21 is represented by dots of "1" and the background portion 22 by dots of "0".
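As a rough sketch, the thresholding described above might look as follows in Python (the threshold value and the list-of-rows image representation are illustrative assumptions; the patent only states that each pixel's brightness is compared with a preset value):

```python
def binarize(frame, threshold=128):
    """Binarize a grayscale frame given as rows of pixel brightness values:
    vehicle pixels (brightness >= threshold) map to 1, background to 0.

    The threshold of 128 is an illustrative assumption, not from the patent.
    """
    return [[1 if px >= threshold else 0 for px in row] for row in frame]
```

For example, `binarize([[10, 200]])` yields `[[0, 1]]`.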

As a result, an image 21 representing the outlines of the vehicle 11 under discrimination and of the parallel vehicle 12 is extracted.

(b) The positions on the screen of the feature points a to g of the image 21 are detected by a conventional method.

(c) The head portion of the vehicle under discrimination is detected by detecting the sides a-b and c-g of the vehicle image that are parallel to the width direction of the road and located at its lower edge.

This vehicle head detection method exists as a conventional technique.

(d) Whether the side c-g is part of the vehicle 11 under discrimination or part (the head) of the parallel vehicle 12 is judged according to the processing procedure described below, and the presence or absence of a parallel vehicle is thereby determined.

The specific processing procedure according to the present invention for discriminating parallel vehicles in step (d) above will now be described.

Fig. 1 shows the positional relationship of the vehicles 11 and 12 of Fig. 3 as seen from the side of the road.

The figure shows two positional relationships of the vehicles 11 and 12: that at a first time t, and that at a second time t' after a certain interval has elapsed (the state of Fig. 3 is taken to be that at time t). Even when a three-dimensional object is measured with a single camera, the position of the subject in three-dimensional space can be determined from the camera's installation conditions and the positional conditions of the measurement target plane (here, the same as the measurement area).

As shown in Fig. 1, the position on the x axis of the measurement-area plane of the feature points c and g, which appear overlapped on the screen of Fig. 4, is the point (a), and the actual vehicle feature points C and G corresponding to these screen feature points both lie on the optical axis 1 that connects the point (a) with the camera center.

At time t', however, the feature points C and G have advanced to the positions indicated by C' and G' respectively.

At this time, the line segment connecting the camera center with point C' (optical axis 2) and the line segment connecting the camera center with point G' (optical axis 3) are different line segments, and their positions on the measurement-area plane are likewise the distinct points (c) and (d).
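The geometric fact used here can be checked numerically: two feature points at different heights that lie on one optical axis at time t share a single ground point, but project to distinct ground points once both have advanced. In the side-view sketch below, the camera height, feature heights, and displacement are illustrative assumptions, not values from the patent:

```python
def ground_x(camera, point):
    """x coordinate where the ray from the camera through `point`
    meets the measurement-area plane y = 0 (side view as in Fig. 1)."""
    cx, cy = camera
    px, py = point
    t = cy / (cy - py)          # ray parameter at which y reaches 0
    return cx + t * (px - cx)

cam = (0.0, 12.0)               # camera center (illustrative height)
C = (8.0, 4.0)                  # bumper of the hidden vehicle 12, height 4
G = (6.0, 6.0)                  # roof edge of vehicle 11, height 6

# Time t: both points lie on one optical axis, so they share ground point (a).
assert ground_x(cam, C) == ground_x(cam, G) == 12.0

# Time t': each has advanced 4 units along the road; the rays now differ,
# so the ground points -- (c) and (d) in Fig. 1 -- are distinct.
C2, G2 = (12.0, 4.0), (10.0, 6.0)
assert ground_x(cam, C2) == 18.0
assert ground_x(cam, G2) == 20.0
```

The higher point G "moves" farther along the ground axis than the lower point C for the same physical displacement, which is what makes the two hypotheses distinguishable.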

The present invention focuses on this point and determines, by the following procedure, whether the feature point c or g of the vehicle image 21 of Fig. 3 is the head portion (part of the bumper) of a parallel vehicle. That is:

Process (1): From the image data obtained at time t, find point c (or point g) of Fig. 4. Also find the distance between the points (a) and (b) in Fig. 1.

Process (2): Find the line segment (optical axis 1 in Fig. 1) connecting the camera center with the point (a) on the x axis found in process (1), and find the line segment passing through point A, detected as the head of the vehicle 11 under discrimination, that is the y axis or parallel to the y axis.

Find the points P and Q at which these two line segments intersect the aforementioned optical axis 1.

Process (3): Next, on the binarized screen at time t' (a fixed interval after time t), calculate the value on the x axis of the previously detected head point A of the vehicle 11 under discrimination (point (e)), thereby obtaining the distance the object has moved, and predict the values on the x axis at time t' of the aforementioned points P and Q (the regions (f) and (g)).

Process (4): Check which of the regions (f) and (g) found in process (3) contains the value on the x axis of point c (or point g) at time t' (the points (c) and (d) in Fig. 1).

(i) If the above x-axis value is contained in region (f), the feature point is judged to be part of the head of the parallel vehicle; (ii) if it is contained in region (g), it is judged to be part of the vehicle under discrimination.

In process (3), to absorb errors caused by vehicle-type variation in bumper height and front-body shape, the points (f) and (g) are assumed to be regions of points having a certain extent.
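Under stated assumptions, processes (1) to (4) amount to predicting where the feature point should reappear on the measurement-area x axis under each height hypothesis, then matching the observed position against a tolerance band. The helper names, the numeric values, and the use of a single displacement `d` for both hypotheses are illustrative sketches, not the patent's exact formulation; the patent absorbs speed and shape variation through the region-based tolerance:

```python
def ground_x(camera, point):
    """x coordinate where the ray camera -> point meets the plane y = 0."""
    cx, cy = camera
    px, py = point
    return cx + (cy / (cy - py)) * (px - cx)

def predict(cam, x_obs, h, d):
    """Predicted ground-axis position at time t' of a feature observed at
    ground position x_obs at time t, assuming it lies at height h and
    advances a distance d along the road between the two frames."""
    cx, cy = cam
    x3d = cx + (x_obs - cx) * (cy - h) / cy   # 3-D point at height h on the ray
    return ground_x(cam, (x3d + d, h))

def classify(cam, x_t, x_t2, h_parallel, h_target, d, tol=0.5):
    """Processes (3)-(4): match the observation at t' against the
    tolerance bands around the two hypothesis predictions."""
    if abs(x_t2 - predict(cam, x_t, h_parallel, d)) <= tol:
        return "head of parallel vehicle"
    if abs(x_t2 - predict(cam, x_t, h_target, d)) <= tol:
        return "part of vehicle under discrimination"
    return "undetermined"
```

With the camera at height 12, a feature observed at ground point 12 at time t, and a displacement of 4, `classify(cam, 12.0, 18.0, 4.0, 6.0, 4.0)` reports the feature as the head of the parallel vehicle, while an observation at 20.0 reports it as part of the vehicle under discrimination.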

[Effects of the Invention] As described above, according to the present invention, even when, owing to the passage of a large vehicle such as a bus, a vehicle appears on the screen to overlap an adjacent parallel vehicle, the vehicles can be detected separately, so traffic volume can be measured more accurately. Moreover, since the overlap of adjacent vehicles no longer needs special consideration when setting the camera's angle of view, the arm for mounting the camera unit need not project far over the lane, which is also beneficial for appearance.

[Brief Description of the Drawings]

Fig. 1 is a side view showing the procedure of the embodiment of the present invention for judging the presence or absence of a parallel vehicle; Fig. 2 is a perspective view showing the arrangement of the vehicle sensor of the embodiment; Fig. 3 is a perspective view showing the running state of the vehicles in the embodiment; Fig. 4 is an explanatory diagram showing the imaging result for the vehicles of Fig. 3; and Fig. 5 is an explanatory diagram showing the image extraction regions for vehicle discrimination in the conventional example.

1: camera unit; 2: control unit; 3: support; 4: measurement area.

Claims (1)

[Claims] A vehicle separation method in an image processing type vehicle sensor, comprising: imaging a moving vehicle to be sensed with an image processing type vehicle sensor at a first time; when a feature point indicating the presence of a parallel vehicle exists in the two-dimensional image obtained as a result of the imaging, subsequently imaging the vehicle to be sensed at a second time; calculating, by a monocular stereoscopic method and based on the result of the imaging at the first time, first position coordinates of the feature point in three-dimensional space at the second time on the assumption that the feature point is part of the vehicle to be sensed; calculating, by the monocular stereoscopic method and based on the result of the imaging at the first time, second position coordinates of the feature point in three-dimensional space at the second time on the assumption that the feature point is part of the parallel vehicle; calculating, by the monocular stereoscopic method and based on the result of the imaging at the second time, third position coordinates of the feature point in three-dimensional space; determining the feature point to be part of the vehicle to be sensed when the third position coordinates coincide with the first position coordinates; and determining the feature point to be part of the parallel vehicle when the third position coordinates coincide with the second position coordinates, thereby separating the parallel vehicle.
JP33855789A 1989-12-28 1989-12-28 Vehicle separation method in image processing type vehicle sensor Expired - Lifetime JP2797573B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP33855789A JP2797573B2 (en) 1989-12-28 1989-12-28 Vehicle separation method in image processing type vehicle sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP33855789A JP2797573B2 (en) 1989-12-28 1989-12-28 Vehicle separation method in image processing type vehicle sensor

Publications (2)

Publication Number Publication Date
JPH03200067A true JPH03200067A (en) 1991-09-02
JP2797573B2 JP2797573B2 (en) 1998-09-17

Family

ID=18319300

Family Applications (1)

Application Number Title Priority Date Filing Date
JP33855789A Expired - Lifetime JP2797573B2 (en) 1989-12-28 1989-12-28 Vehicle separation method in image processing type vehicle sensor

Country Status (1)

Country Link
JP (1) JP2797573B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017162281A (en) * 2016-03-10 2017-09-14 スズキ株式会社 Moving body detection system and data consistency determination method


Also Published As

Publication number Publication date
JP2797573B2 (en) 1998-09-17


Legal Events

Date Code Title Description
FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080703

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090703

Year of fee payment: 11

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100703

Year of fee payment: 12

EXPY Cancellation because of completion of term