JPH03265100A - Image processing type traffic flow measuring instrument - Google Patents

Image processing type traffic flow measuring instrument

Info

Publication number
JPH03265100A
JPH03265100A (application JP2062655A)
Authority
JP
Japan
Prior art keywords
vehicle
shadow
image
car
road surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2062655A
Other languages
Japanese (ja)
Other versions
JP2841652B2 (en)
Inventor
Kazuto Nishiyama
和人 西山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sumitomo Electric Industries Ltd
Original Assignee
Sumitomo Electric Industries Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sumitomo Electric Industries Ltd filed Critical Sumitomo Electric Industries Ltd
Priority to JP2062655A priority Critical patent/JP2841652B2/en
Publication of JPH03265100A publication Critical patent/JPH03265100A/en
Application granted granted Critical
Publication of JP2841652B2 publication Critical patent/JP2841652B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Landscapes

  • Traffic Control Systems (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

PURPOSE: To eliminate the influence of shadows and to improve vehicle-type discrimination and vehicle-count measurement accuracy by detecting a feature point as part of a shadow when, in two-dimensional image data captured from above at a prescribed angle, its distance on the road surface from the vehicle head does not change.

CONSTITUTION: An image captured from above by a fixed camera 1 is digitized and ternarized, and vehicle-head candidates are detected. When the extracted two-dimensional data satisfies the reference values, a vehicle head detecting unit 24 judges it to be a vehicle, and a shadow discriminating unit 25 tracks the vehicle's movement over each measurement cycle. The discriminating unit 25 checks how the distances between feature points of the vehicle head and the vehicle side change, and when a distance on the road surface remains equal regardless of the passage of time, that feature point is detected as part of the vehicle shadow. A vehicle measuring unit 26 treats the shadow as noise, removes the shadow portion from the data of an image storage unit 22, and recognizes only the vehicle body. The resulting traffic measurement data are sent from an output unit 27 to a central device. The image measurement unit 2 thus provides accurate vehicle-type discrimination data and passing-vehicle-count data that are unaffected by shadows.

Description

[Detailed Description of the Invention]

[Industrial Field of Application]
The present invention relates to an image-processing traffic flow measurement device that detects vehicles by image processing using an industrial television (ITV) camera or the like, and more particularly to a traffic flow measurement device capable of detecting the shadows cast around a vehicle with high accuracy so that their influence can be removed.

[Conventional Technology]

In this type of traffic flow measurement device, the shadows cast around a vehicle cause the size of the vehicle to be measured incorrectly, so that the wrong vehicle type is detected, or cause a shadow to be mistaken for another vehicle, so that the vehicle count is wrong. It is therefore necessary to detect the shadows around a vehicle accurately and to remove their influence.

A generally known method of detecting vehicle shadows in this type of device exploits the fact that a vehicle's shadow is darker than the road surface and shows almost no variation in brightness, i.e. is nearly uniform (see Hashimoto, Kumagai et al., "Development of an Image Processing Traffic Flow Measurement System", Sumitomo Electric No. 127, p. 59, September 1986).

[Problems to Be Solved by the Invention]
In the prior art, however, the only criteria used to judge a shadow are, as noted above:
(1) the brightness of the shadow in the image is darker than that of the road surface; and
(2) the shadow shows almost no brightness variation, i.e. is nearly uniform.
Consequently, for a vehicle with a black body and a uniform body shape (for example a bus or a van), the vehicle and its shadow cannot be distinguished reliably, and the shadow may be detected as a black vehicle.

An object of the present invention is to solve these problems and to provide an image-processing traffic flow measurement device that can reliably remove the influence of shadows and thereby improve the accuracy of vehicle-type discrimination and of vehicle-count measurement.

[Means for Solving the Problems]
To achieve the above object, the present invention provides, in a device that measures traffic flow by image-processing the image signal of traveling vehicles: a single image pickup means installed at a predetermined height on the roadside, for imaging vehicles traveling on the road from an elevated position in front of or behind them, at a predetermined fixed angle of view and at every predetermined measurement cycle; calculation means for calculating, from a plurality of temporally separated two-dimensional images captured by the image pickup means, the positions on the road surface of feature points of the vehicle head and of the vehicle side; and shadow detection means which, on the basis of the results of the calculation means, judges whether a feature point exists only on the road surface, i.e. whether the distance on the road surface between the vehicle head and the feature point remains unchanged regardless of the passage of time, and detects the feature point as part of the vehicle shadow when that distance does not change.

[Operation]
In the present invention, a plurality of temporally separated two-dimensional images, captured from above at a fixed installation angle of view by an image pickup means (for example an ITV camera) installed at a predetermined height on the roadside, are used to judge how the positional relationship between the vehicle head and the feature points on the vehicle side changes; when the positional relationship does not change regardless of the passage of time, the feature point is detected as part of the shadow. The invention thus exploits the facts that a vehicle's shadow is two-dimensional data existing only on the road surface and that the shadow moves together with the vehicle, and detects the shadow by a monocular stereo technique. Shadows can therefore be detected stably irrespective of the body color or shape of the traveling vehicle, and even a shadow extending into an adjacent lane is not mistaken for a black vehicle. The influence of shadows can accordingly be removed effectively, improving both vehicle-type discrimination accuracy and vehicle-count measurement accuracy.

[Embodiment]
An embodiment of the present invention is described in detail below with reference to the drawings.
の要部回路構成を示す。本図において、1は道路上を走
行する車両を撮像する撮像手段としてのITVカメラの
ようなカメラ部(固定カメラ)、2はカメラ部lの出力
画像信号を画像処置して車両の周囲に発生する影を検出
する画像計測部である。画像計測部2はA/D  (ア
ナログ・デジタル)変換部21.画像記憶部221画像
3値化処理部23.車頭検出部24.影判別部25.車
両計測部26および出力部27とから構成される。 A/D変換部21はカメラ部1の出力画像信号をデジタ
ル化する0画像記憶部22はRAM  (ランダムアク
セスメモリ)等から成り、デジタル化された画像信号を
輝度信号として撮像画面単位で記憶する。画像3値化処
理23は記憶された輝度信号を用いて計測サンプル点毎
の3値化(1,0,−1)を行う。車頭検出部24は計
測サンプル点の輝度の偏移から車両の先頭部分(以下、
車頭と称する)を検出する。影判別部25は検出した車
両を追跡して複数の2次元画像データから後述のように
して車両の影を判別する。車両計測部26は影判別部2
5から得られる車両の影データを用いて車両を識別し、
影の影響のない各種の交通流計測データを算出し、出力
部27からこの計測データを遠隔の交通管制用中実装置
へ送出する。 第2図は第1図のカメラ部1と画像計測部2の実際の取
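For orientation, the block diagram above can be read as a simple per-frame processing loop. The sketch below is only an illustration of that data flow under assumed, hypothetical class and function names (the patent defines the blocks, not code); the individual steps are left as stubs that later sections flesh out.

```python
from collections import deque

class ImageMeasurementUnit:
    """Minimal sketch of image measurement unit 2 (blocks 21-27 of Fig. 1)."""

    def __init__(self, history=8):
        self.frames = deque(maxlen=history)          # image storage unit 22

    def process(self, analog_frame):
        gray = [[int(px) for px in row] for row in analog_frame]   # A/D converter 21
        self.frames.append(gray)                                   # image storage unit 22
        ternary = self.ternarize(gray)                             # ternarization unit 23
        heads = self.detect_heads(ternary)                         # head detection unit 24
        shadows = self.discriminate_shadows(heads, self.frames)    # shadow discrimination 25
        return self.measure(heads, shadows)                        # measurement 26 -> output 27

    # Stubs only; the corresponding steps are sketched in the sections below.
    def ternarize(self, gray):                      return gray
    def detect_heads(self, ternary):                return []
    def discriminate_shadows(self, heads, frames):  return []
    def measure(self, heads, shadows):              return {"count": 0, "speeds_kmh": []}
```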
FIG. 2 shows one example of how the camera unit 1 and the image measurement unit 2 of FIG. 1 are actually installed. The camera unit 1 is fixed to the upper arm of a pole 3 erected on the roadside 7 of a multi-lane road 5, and the image measurement unit 2 is fixed at a predetermined position on the trunk of the pole 3. The camera unit 1 images a predetermined range of the multi-lane road 5 from a fixed height and at a fixed angle as a measurement area 4, and the image measurement unit 2 performs vehicle detection processing on the image data captured by the camera unit 1. In this embodiment, a road with three lanes in each direction is considered as an example. Reference numeral 6 denotes the center line (median strip).
Next, the operation of this embodiment is described with reference to FIGS. 3 to 7.

FIG. 3 shows one frame captured by the camera unit 1 described above. In this frame, a single vehicle 10 (in this example a large bus) traveling in the center lane from upstream to downstream has been imaged, and the shadow 11 of its body extends into the adjacent lane. The vehicle 10 is detected within the measurement area 4; this measurement area 4 is a plane parallel to the road surface, and its size (in particular its length) is set so that a vehicle can be detected a plurality of times, one detection per measurement cycle.

FIG. 4 schematically shows the vehicle-candidate extraction screen obtained by ternarizing (1, 0, -1) the image data of FIG. 3 in the image ternarization processing unit 23. Points a to g in FIG. 4 are the feature points set by the vehicle head detection unit 24.

FIGS. 5 and 6 show the situation of FIG. 3 viewed from above and from the side of the road, respectively, and show the relative positions of the vehicle and its shadow at the current time t and at an earlier time t' (that is, a time a certain number of measurement cycles before time t).

Using the monocular stereo technique, which is one of the techniques of image processing, the position of a measured object in three-dimensional space can be determined even from the frames captured by a single camera, as in this example, from the installation angle of view of the camera and the position of the road surface, which is the measurement plane.
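The patent relies on this monocular back-projection but gives no formulas. The sketch below shows one standard way to do it under an assumed pinhole model: the camera sits at a known height, tilted down by a known angle, and a pixel is mapped to road coordinates by intersecting its viewing ray with the road plane. The parameter names and frame conventions are assumptions, not taken from the patent.

```python
import math

def pixel_to_road(u, v, cam_height, tilt_deg, f_px, cx, cy):
    """Back-project pixel (u, v) onto the road plane z = 0.

    World frame: x along the road, y lateral, z up; camera at (0, 0, cam_height),
    tilted down by tilt_deg from the horizontal.  Returns (x, y) on the road,
    or None if the ray does not descend toward the road.
    """
    t = math.radians(tilt_deg)
    # Camera basis vectors expressed in world coordinates
    x_cam = (0.0, -1.0, 0.0)                     # image "right"
    y_cam = (-math.sin(t), 0.0, -math.cos(t))    # image "down"
    z_cam = (math.cos(t), 0.0, -math.sin(t))     # optical axis
    d = [(u - cx) * x_cam[i] + (v - cy) * y_cam[i] + f_px * z_cam[i] for i in range(3)]
    if d[2] >= 0:
        return None
    s = -cam_height / d[2]                       # scale at which the ray hits z = 0
    return (s * d[0], s * d[1])
```

In this scheme, the head points a, b and the feature points c to g of FIG. 4 would each be mapped to road coordinates this way before any distances are compared.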
(1) The feature point g in FIG. 4 is, at time t', a point on the optical axis 12 in FIG. 6. (If the feature point g is part of the shadow, it coincides with point g'1.)

(2) Now consider, as candidates for the feature point g, the three points g'1, g'2 and g'3 in FIG. 6. These three points are at different heights above the road surface, yet from the camera unit 1 they all appear to lie at the same point (カ).

(3) Next, consider the situation at time t. If the feature point g is part of the shadow (point g'1 in FIG. 6), then, because sunlight is parallel light, the positional relationship on the road surface between a'-g' and a-g in FIG. 5 does not change. Hence the head-to-point distance between points (オ) and (カ) in FIG. 6 equals the distance between points (ア) and (エ).

(4) If, on the other hand, the feature point g belongs to a solid body, i.e. has some height above the road surface (for example g'2 or g'3 in FIG. 6), then, when the distance to the vehicle head is calculated at the two times t' and t, neither the distance between points (ア) and (イ) nor the distance between points (ア) and (ウ) coincides with the distance between points (オ) and (カ), as can be seen from FIG. 6.

From the above, the positions on the road surface of the vehicle head (points a, b) and of the feature point g are calculated at each of the times t and t' (points (ア), (オ) and points (エ), (カ) in FIG. 6). The head-to-feature-point distances at the two times are then compared, and if the distances are equal regardless of the passage of time, the feature point g is detected as a point on the road surface, that is, as part of the shadow.

In this way, from the relative positional relationship in each cycle between the head portion a, b and each of the feature points c to g, it is recognized that several of the feature points c to g lie on the road surface (points whose height above the road surface is "0"), and the plane formed by these feature points on the road surface is detected as the shadow.
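The decision just described reduces to a small numeric test once the head and the feature point have been back-projected onto the road plane at the two times. The helper below is a hedged sketch of that test; the tolerance value is an assumption, since the patent only says the distances are "equal".

```python
import math

def is_shadow_point(head_t_prev, feat_t_prev, head_t, feat_t, tol_m=0.3):
    """Return True if a feature point behaves like a road-level (shadow) point.

    All arguments are (x, y) road-plane coordinates obtained by back-projection
    at times t' and t.  A point that really lies on the road keeps a constant
    distance to the vehicle head; a point with height above the road is
    back-projected to a drifting spot, so its apparent distance changes.
    """
    d_prev = math.hypot(feat_t_prev[0] - head_t_prev[0], feat_t_prev[1] - head_t_prev[1])
    d_now = math.hypot(feat_t[0] - head_t[0], feat_t[1] - head_t[1])
    return abs(d_now - d_prev) <= tol_m
```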
The operation of this embodiment is now described in more detail with reference to the flowchart of FIG. 7.

The output image signal of the camera unit 1 is digitized by the A/D converter 21 and stored in the image storage unit 22 periodically, one frame at a time, at a predetermined time interval (step S1). The image ternarization processing unit 23 computes a road surface reference brightness (step S2) and, from that reference brightness, computes the threshold values for ternarization (step S3). Here, the road surface reference brightness is a value made to follow the external environment by exponential smoothing whose coefficient is varied according to the magnitude of the difference between the input image data and the current reference brightness. The image ternarization processing unit 23 then processes the luminance data of each measurement sample point read out from the image storage unit 22 with the above thresholds (called the ternarization thresholds) and ternarizes (1, 0, -1) each measurement sample point (step S4).
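A hedged sketch of steps S2 to S4: the reference brightness is smoothed exponentially with a difference-dependent coefficient, and each sample is then classified against thresholds placed around that reference. The specific coefficients, the switch point and the threshold margin are illustrative assumptions; the patent states only the general scheme.

```python
def update_road_reference(ref, sample, alpha_small=0.02, alpha_large=0.2, switch=30):
    """Step S2: exponentially smoothed road surface reference brightness.

    A larger coefficient is used when the sample differs strongly from the
    current reference, so that the reference tracks the external environment.
    """
    alpha = alpha_large if abs(sample - ref) > switch else alpha_small
    return ref + alpha * (sample - ref)

def ternarize_frame(luma, road_ref, margin=20):
    """Steps S3-S4: ternarize every measurement sample point to +1 / 0 / -1.

    +1: clearly brighter than the road, -1: clearly darker (dark body or
    shadow), 0: road-like brightness.
    """
    return [[1 if v > road_ref + margin else -1 if v < road_ref - margin else 0
             for v in row]
            for row in luma]
```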
Based on the ternarized data sent from the image ternarization processing unit 23, the vehicle head detection unit 24 finds the luminance deviations of the measurement sample points on each measurement line (a line across the road) and detects vehicle head candidates. Taking FIG. 4 as an example, the measurement lines are scanned from the downstream side at the bottom of the figure toward the upstream side at the top, and a deviation whose length along the measurement line exceeds a predetermined length is identified as the head portion of a traveling vehicle (edge a-b) (step S5).

The vehicle head detection unit 24 then sets the feature points a to g on the extracted two-dimensional data (object) as shown in FIG. 4, calculates the position of each feature point on the measurement-area plane, and checks whether the widths of the luminance deviations on the measurement lines satisfy the judgment values (reference values) for vehicle length and vehicle width. If the judgment values are satisfied, the object is judged to be a vehicle; if not, it is judged not to be a vehicle and the process returns to step S1 described above (step S6).
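A hedged sketch of the scan in step S5, on the assumption that a "measurement line" is one row of the ternarized frame and that a head candidate is a sufficiently long run of non-zero (non-road) samples; the minimum run length stands in for the patent's "predetermined length".

```python
def find_head_candidates(ternary, min_run=12):
    """Scan measurement lines from downstream (last row) to upstream (first row)
    and report runs of deviating samples long enough to be a vehicle front.

    Returns a list of (row, col_start, col_end) candidates.
    """
    candidates = []
    for row in range(len(ternary) - 1, -1, -1):       # downstream -> upstream
        run_start = None
        for col, v in enumerate(ternary[row] + [0]):  # sentinel closes a trailing run
            if v != 0 and run_start is None:
                run_start = col
            elif v == 0 and run_start is not None:
                if col - run_start >= min_run:
                    candidates.append((row, run_start, col - 1))
                run_start = None
    return candidates
```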
When an object is judged to be a vehicle, the detection data of the vehicle head detection unit 24 are handed over to the shadow discrimination unit 25, which tracks the temporal movement of the vehicle at the preset time interval, that is, once per measurement cycle (step S7).

The shadow discrimination unit 25 then examines the change between time t and time t' in the distances between the feature points of the vehicle head and of the vehicle side (the side face of the vehicle) (step S8). If a distance on the road surface is equal regardless of the passage of time, the feature point is detected as part of the vehicle shadow (step S10); if the distance has shortened at time t, the feature point is judged to belong to the same single vehicle (step S9); and if the distance has lengthened, it is judged to belong to a vehicle running alongside (step S11).

The result of the shadow judgment is sent to the vehicle measurement unit 26 of FIG. 1, and the process returns to step S1 and repeats the above processing until the target vehicle leaves the measurement area 4 (step S12).
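The per-cycle loop of steps S7 to S12 can be summarized as below. This is a hedged sketch: the tolerance and the aggregation of per-cycle verdicts into one label are assumptions, since the patent describes only the pairwise comparison between the two times.

```python
import math

def discriminate_feature_point(head_track, feat_track, tol_m=0.3):
    """Classify one tracked feature point as 'shadow', 'same_vehicle' or
    'parallel_vehicle' from its road-plane positions over the measurement cycles.

    head_track / feat_track: lists of (x, y) road coordinates, one per cycle,
    for the vehicle head and for the feature point.
    """
    dists = [math.hypot(f[0] - h[0], f[1] - h[1])
             for h, f in zip(head_track, feat_track)]
    verdicts = []
    for d_prev, d_now in zip(dists, dists[1:]):
        if abs(d_now - d_prev) <= tol_m:
            verdicts.append("shadow")            # distance unchanged -> road-level point (S10)
        elif d_now < d_prev:
            verdicts.append("same_vehicle")      # distance shortened (S9)
        else:
            verdicts.append("parallel_vehicle")  # distance lengthened (S11)
    return max(set(verdicts), key=verdicts.count) if verdicts else "unknown"
```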
The vehicle measurement unit 26 treats the vehicle shadow detected by the shadow discrimination unit 25 as noise, removes the shadow portion from the data in the image storage unit 22, and recognizes only the vehicle body. This body recognition is performed, for example, by detecting the rise and fall of the vehicle data within the measurement area: the measurement area is divided into an upstream zone and a downstream zone; a rise of the vehicle data in the downstream zone is counted as a passing vehicle for the traffic volume; the vehicle length is measured from the distance between the rise and the fall of the vehicle data in the downstream zone to judge whether the vehicle is of a large or small type; and the vehicle speed is calculated from the distance the vehicle has moved and the time taken. These traffic flow measurement data are sent from the output unit 27 to the remote central device. The image measurement unit 2 therefore provides accurate vehicle-type discrimination data, passing-vehicle-count data and the like that are unaffected by shadows.
上にのみ存在する2次元データであり、車両の影が車両
の移動に伴って移動するという性質を利用して、一定の
設置画角で上方から撮像した時間的隔りのある複数の2
次元画面データから車両の車頭と車両側部の特徴点の路
面上での位置関係(距離)が時刻の経過に関係なく変化
しない場合は影の一部として検出するようにしたので、
走行車両の車体色、形状に関係なく安定して影を検出で
きるので車種の判定(大型車/小型車の車種判別)の精
度を向上させるとともに、影が隣接車線上にのびたよう
な場合でも影を黒い車両と誤検知することがないので交
通量計測の精度を向上させることができる効果が得られ
る。

[Brief Description of the Drawings]

FIG. 1 is a block diagram showing the circuit configuration of an embodiment of the present invention;
FIG. 2 is a perspective view showing an installation example of the device of FIG. 1;
FIG. 3 is an explanatory view showing an example of one frame captured by the camera unit of FIG. 2;
FIG. 4 is an explanatory view showing the screen state obtained by ternarizing the image data of the frame of FIG. 3;
FIG. 5 is a plan view of the situation of FIG. 3 viewed from above the road;
FIG. 6 is a side view of the situation of FIG. 3 viewed from the side of the road; and
FIG. 7 is a flowchart showing the operation of the embodiment of the present invention.

1: camera unit, 2: image measurement unit, 3: pole, 4: measurement area, 5: multi-lane road, 6: center line, 7: roadside, 10: vehicle, 11: shadow, 12: optical axis, 21: A/D converter, 22: image storage unit, 23: image ternarization processing unit, 24: vehicle head detection unit, 25: shadow discrimination unit, 26: vehicle measurement unit, 27: output unit.

Claims (1)

1) An image-processing traffic flow measurement device which measures traffic flow by image-processing the image signal of traveling vehicles, characterized by comprising:
a single image pickup means installed at a predetermined height on the roadside of a road, for imaging a vehicle traveling on the road from an elevated position in front of or behind it, at a predetermined fixed angle of view and at every predetermined measurement cycle;
calculation means for calculating, from a plurality of temporally separated two-dimensional image data captured by said image pickup means, the positions on the road surface of feature points of the vehicle head and of the side of the vehicle; and
shadow detection means for judging, on the basis of the calculation results of said calculation means, whether a feature point exists only on the road surface, i.e. whether the distance on the road surface between the vehicle head and the feature point remains unchanged regardless of the passage of time, and for detecting the feature point as part of the shadow of the vehicle when said distance does not change.
JP2062655A 1990-03-15 1990-03-15 Image processing type traffic flow measurement device Expired - Fee Related JP2841652B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2062655A JP2841652B2 (en) 1990-03-15 1990-03-15 Image processing type traffic flow measurement device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2062655A JP2841652B2 (en) 1990-03-15 1990-03-15 Image processing type traffic flow measurement device

Publications (2)

Publication Number Publication Date
JPH03265100A true JPH03265100A (en) 1991-11-26
JP2841652B2 JP2841652B2 (en) 1998-12-24

Family

ID=13206552

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2062655A Expired - Fee Related JP2841652B2 (en) 1990-03-15 1990-03-15 Image processing type traffic flow measurement device

Country Status (1)

Country Link
JP (1) JP2841652B2 (en)

Also Published As

Publication number Publication date
JP2841652B2 (en) 1998-12-24

Similar Documents

Publication Publication Date Title
KR100201739B1 (en) Method for observing an object, apparatus for observing an object using said method, apparatus for measuring traffic flow and apparatus for observing a parking lot
US9904861B2 (en) Method for detecting target objects in a surveillance region
CN110794405B (en) Target detection method and system based on camera and radar fusion
CN105930787B (en) Opening door of vehicle method for early warning
US5995900A (en) Infrared traffic sensor with feature curve generation
EP0700017B1 (en) Method and apparatus for directional counting of moving objects
KR20030080285A (en) Apparatus and method for queue length of vehicle to measure
JP3183320B2 (en) Counting method and apparatus for each moving object direction
KR20160035121A (en) Method and Apparatus for Counting Entity by Using Location Information Extracted from Depth Image
JP2016162354A (en) Axle number detection device, vehicle type distinguishing system, axle number detection method, and program
KR20150029551A (en) Determining source lane of moving item merging into destination lane
KR100532058B1 (en) Traffic information acquisition method and apparatus using camera calibration
CN112037536A (en) Vehicle speed measuring method and device based on video feature recognition
JP3465531B2 (en) Object recognition method and apparatus
JP2924063B2 (en) Image processing type traffic flow measurement device
JPH03265100A (en) Image processing type traffic flow measuring instrument
JPH0991439A (en) Object monitor
JPH11175883A (en) Traffic volume measuring instrument and signal control device
JP2000163691A (en) Traffic flow measuring instrument
JPH0883392A (en) Method and device for detecting vehicle
JP2855770B2 (en) Image processing type traffic flow measurement device
JP2001297397A (en) Method and device for counting vehicle
JPH08210848A (en) Distance measuring instrument
JP2946620B2 (en) Automatic number reading device with speed measurement function
JP2001033238A (en) Object recognizing apparatus

Legal Events

Date Code Title Description
LAPS Cancellation because of no payment of annual fees