JP2010191888A - Image processing apparatus and traffic monitoring device - Google Patents


Info

Publication number
JP2010191888A
Authority
JP
Japan
Prior art keywords
vehicle
traffic
road surface
traffic event
area
Prior art date
Legal status
Granted
Application number
JP2009038209A
Other languages
Japanese (ja)
Other versions
JP5175765B2 (en)
Inventor
Tetsuo Aikawa
徹郎 相川
Kazuchika Nagao
一親 永尾
Current Assignee
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date
Filing date
Publication date
Application filed by Toshiba Corp
Priority to JP2009038209A
Publication of JP2010191888A
Application granted
Publication of JP5175765B2
Legal status: Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To detect traffic events such as a stopped vehicle, congestion, or a fallen object without being affected by a road surface that changes with the local environment or weather.

SOLUTION: A traffic monitoring device includes: a monitoring camera 1 installed on a road; a vehicle candidate area extraction unit 11 that extracts vehicle candidate areas from the camera image using a background difference method; a road surface area extraction unit 12 that extracts the road surface area from the camera image while taking into account road surface states that change with the environment or weather; a vehicle area detection unit 13 that operates on the vehicle candidate areas and the road surface area, removing the noise areas and road surface areas that appear in the candidate areas due to changes in the local environment or weather, to detect the vehicle areas; a traffic event detection unit 3 that detects traffic events from the positions of the vehicle areas output by the vehicle area detection unit 13 and from the ratio of the vehicle areas to the road surface; and a monitor display device 4 that displays the detected traffic events on the camera image.

COPYRIGHT: (C) 2010, JPO & INPIT

Description

The present invention relates to an image processing apparatus and a traffic monitoring apparatus that detect and monitor the presence or absence of traffic events and fallen objects at locations with heavy traffic, such as intersections, and at locations with poor forward visibility, such as curves.

Conventionally, traffic monitoring systems have been introduced in which a surveillance camera is installed at the roadside of such locations with heavy traffic, such as intersections, or poor forward visibility, such as curves; the camera photographs the road surface there, and video of traffic events such as stopped vehicles, congestion, and the presence of fallen objects is displayed on a monitor display device so that the situation on the road surface at the location can be confirmed.

Image processing apparatuses have also been developed that automatically detect traffic events from the video captured by a surveillance camera. Such apparatuses widely use the background difference method, which compares an image captured a certain time earlier with the image captured at the time of monitoring and detects the presence or absence of a vehicle from the luminance difference between the two images (Patent Document 1).

In the technique of Patent Document 1, an image captured by the camera is divided into images of a predetermined size by an image dividing unit, and image features are extracted from each divided region. The external environment of the scene is then determined from these image features. After that, a background difference image is generated from an already captured background image and the image captured at the time of monitoring, and a moving vehicle is detected from the generated background difference image.

Patent Document 1: JP 2002-83301 A

However, the image processing technique of Patent Document 1 is aimed at eliminating misjudgments, in determining the day/night external environment, caused by reflections of vehicle headlights, street lights, and the like at night being mistaken for daytime. When, for example, the road surface itself changes uniformly due to the local environment or weather, it becomes difficult to determine vehicles using the background difference method alone.

For example, when comparing an image captured a certain time earlier with the image captured at the time of monitoring, if snow falls between those two times, there is no snow on the road surface in the earlier image, whereas snow covers the road surface uniformly at the time of monitoring. As a result, a large luminance difference arises between the two images even though the road surface is the same, and the snow-covered road surface may be erroneously determined to be a vehicle region.

In general vehicle detection by image processing, typified by the background difference method, the vehicle region is detected using the characteristic that the vehicle and the road surface have different luminance distributions; that is, the vehicle region is detected from the surveillance camera image by exploiting features of the vehicle. Although vehicles differ among passenger cars, large vehicles, small vehicles, and so on, they have characteristics that are not affected by disturbances, so those characteristics can be defined and the vehicle region detected from the camera image.

The road surface, on the other hand, changes uniformly not only with snowfall but also with the local environment and weather, so it is difficult to define fixed characteristics for it. Because the road surface changes uniformly in this way, its luminance differs between the image captured a certain time earlier and the image at the time of monitoring, and it may, for example, be erroneously determined to be a vehicle region in which several vehicles overlap.

The present invention has been made in view of the above circumstances, and its object is to provide an image processing apparatus and a traffic monitoring apparatus that grasp the state of a road surface that changes uniformly with the local environment and weather, detect vehicle areas by excluding the road surface area and noise component areas from the captured video, and thereby accurately detect and monitor traffic events.

To solve the above problem, an image processing apparatus according to the present invention comprises: vehicle candidate area extraction means for extracting vehicle candidate areas, using a background difference method, from the video of a surveillance camera installed at a required position on a road; road surface area extraction means for extracting the road surface area from the video of the surveillance camera while taking into account road surface states that change with the local environment and weather; and vehicle area detection means for operating on the vehicle candidate areas extracted by the vehicle candidate area extraction means and the road surface area extracted by the road surface area extraction means, excluding from the vehicle candidate areas the noise areas that appear due to changes in the local environment and weather as well as the road surface area, and detecting the true vehicle areas.

A traffic monitoring apparatus according to the present invention comprises: a surveillance camera installed at a required position on a road; vehicle candidate area extraction means for extracting vehicle candidate areas from the video of the camera using a background difference method; road surface area extraction means for extracting the road surface area from the video of the camera while taking into account road surface states that change with the local environment and weather; vehicle area detection means for operating on the vehicle candidate areas and the road surface area, excluding the noise areas that appear in the candidate areas due to changes in the local environment and weather as well as the road surface area, and detecting the true vehicle areas; traffic event detection means for detecting traffic events such as a stopped vehicle or congestion from the positions of the vehicle areas output by the vehicle area detection means and from the ratio of the vehicle areas to the road surface; and a monitor display device that displays the traffic events detected by the traffic event detection means on the video of the surveillance camera.

In addition to the above configuration, traffic event determination means may be provided that, when the traffic events detected by the traffic event detection means at fixed intervals give the same result over a predetermined number of times, determines that the traffic event has occurred and displays it on the video shown on the monitor display device.

Furthermore, the apparatus according to the present invention may comprise camera installation means for installing a plurality of surveillance cameras along a road, and switching means provided between these cameras and the image processing apparatus that receives a switching instruction signal from the traffic event determination means at predetermined intervals and sequentially switches among and selects the plurality of cameras; the traffic event determination means then determines the occurrence of traffic events from the video of each camera in turn and displays them on the monitor display device either individually and selectively or simultaneously in predetermined divided areas.

According to the present invention, it is possible to provide an image processing apparatus and a traffic monitoring apparatus that grasp the state of a road surface that changes uniformly with the local environment and weather, detect vehicle areas by excluding the road surface area and noise component areas from the captured video, and thereby accurately detect and monitor traffic events.

FIG. 1 is a block diagram showing a first embodiment of a traffic monitoring apparatus including an image processing apparatus according to the present invention.
FIG. 2 is a process flow diagram illustrating the processing flow of the vehicle candidate area extraction unit shown in FIG. 1.
FIG. 3 is a diagram explaining the state of a road surface when it changes uniformly with the local environment or weather.
FIG. 4 is a process flow diagram illustrating the processing flow of the road surface area extraction unit shown in FIG. 1.
FIG. 5 is a diagram explaining the local region comparison processing of FIG. 4.
FIG. 6 is a diagram schematically showing an example of the vehicle candidate areas and the road surface area extracted by the vehicle candidate area extraction unit and the road surface area extraction unit of FIG. 1, respectively.
FIG. 7 is a block diagram showing a second embodiment of a traffic monitoring apparatus including an image processing apparatus according to the present invention.
FIG. 8 is a block diagram showing a third embodiment of a traffic monitoring apparatus including an image processing apparatus according to the present invention.
FIG. 9 is a block diagram showing another modification of the third embodiment of a traffic monitoring apparatus including an image processing apparatus according to the present invention.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.

(First Embodiment)

FIG. 1 is a block diagram showing a first embodiment of a traffic monitoring apparatus using an image processing apparatus according to the present invention.

The traffic monitoring apparatus comprises a monitoring camera 1 that photographs a road, an image processing apparatus 2 that processes the video (images) captured by the monitoring camera 1, a traffic event detection unit 3 connected to the output side of the image processing apparatus 2, and a monitor display device 4 that displays the video captured by the monitoring camera 1 and the traffic events detected by the traffic event detection unit 3.

The monitoring camera 1 is installed at the roadside of a location with heavy traffic, such as an intersection, or a location with poor forward visibility, such as a curve, and photographs the road surface there and the vehicles on it.

The image processing apparatus 2 consists of a vehicle candidate area extraction unit 11 that extracts candidate areas of vehicles in the video captured by the monitoring camera 1, a road surface area extraction unit 12 that extracts the road surface area in that video, and a vehicle area detection unit 13 that compares the vehicle candidate areas extracted by the vehicle candidate area extraction unit 11 with the road surface area extracted by the road surface area extraction unit 12 and detects the vehicle areas.

The traffic event detection unit 3 has the function of judging traffic situations such as a stopped vehicle, congestion, or the presence of a fallen object from the positions of the vehicle areas sent from the vehicle area detection unit 13, the vehicle speed, and the ratio of the vehicle areas to the road surface area, and of outputting the traffic event data associated with the traffic situation contents defined in advance in a traffic event management table or the like.

Next, the operation of the traffic monitoring apparatus configured as described above will be described.

The monitoring camera 1 is installed at an appropriate location on a road that needs to be monitored. It normally photographs the road surface of the monitored location and the vehicles on it, and the captured video is sent to the vehicle candidate area extraction unit 11 and the road surface area extraction unit 12, which constitute the image processing apparatus 2, and to the monitor display device 4.

The vehicle candidate area extraction unit 11 extracts candidate areas of vehicles in the video from the video of the monitoring camera 1 according to the processing procedure shown in FIG. 2.

Specifically, the vehicle candidate area extraction unit 11 compares the video of the monitoring camera 1 from a certain time earlier (the background image) with the video captured by the monitoring camera 1 at the time of monitoring, and executes an inter-image comparison process 21 based on the background difference method to obtain the luminance difference between the two images. That is, a background image in which no vehicle appears is compared with the image captured at the time of monitoring, and a difference image representing the luminance difference between the images is obtained. The background image is created by a median process: for each pixel, the luminances at the same coordinates in a large number of earlier frames are collected and replaced by their median value.

Subsequently, a binarization process 22 is executed in which the luminance difference obtained by the inter-image comparison process 21 is compared with a predetermined threshold, and pixels whose difference exceeds the threshold are set to "1". That is, a predetermined threshold is set, and the area occupied by the vehicle is extracted by successively setting to "1" the regions where the difference image exceeds the threshold.

Then, a region shaping process 23 is executed on the extracted regions, performing dilation and erosion to remove fine noise while tidying the shape of the vehicle regions to be determined, and finally the candidate areas of vehicles in the video of the monitoring camera 1 are extracted.
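A minimal sketch of steps 21 to 23 (inter-image comparison, binarization, and region shaping) is given below; OpenCV is assumed here only for the morphological operations used as the shaping step, and the threshold value is an arbitrary placeholder, not a value from the patent:

```python
import cv2
import numpy as np

def extract_vehicle_candidates(frame, background, diff_threshold=30):
    """Candidate extraction from a grayscale frame and background image."""
    # 21: luminance difference between the monitoring frame and the background
    diff = cv2.absdiff(frame, background)
    # 22: set pixels whose difference exceeds the threshold to 1
    candidate = (diff > diff_threshold).astype(np.uint8)
    # 23: open/close (erosion and dilation) to remove fine noise and tidy shapes
    kernel = np.ones((3, 3), np.uint8)
    candidate = cv2.morphologyEx(candidate, cv2.MORPH_OPEN, kernel)
    candidate = cv2.morphologyEx(candidate, cv2.MORPH_CLOSE, kernel)
    return candidate  # binary mask: 1 = vehicle candidate area
```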

Meanwhile, the road surface area extraction unit 12 extracts the road surface area in the video from the video of the monitoring camera 1; the condition of the road surface, however, changes uniformly under the influence of the local environment and weather, as shown in FIG. 3.

FIG. 3(a) shows a normal road surface state, in which the road surface (a) is, for example, gray. The arrow (b) indicates the traveling direction of travelling vehicles.

When snow falls on the road surface (a), the road surface takes on, for example, a white state representing the snow cover (c) (see FIGS. 3(b) and 3(c)). That is, when snow falls on the road surface (a), snow cover (c) forms and the road surface (a) changes from gray to white as shown in FIG. 3(b).

Further, when vehicles travel on the snow-covered road surface (c), wheel tracks (d) of the travelling vehicles appear on the road surface as shown in FIG. 3(c).

That is, if the influence of the weather and the like is assumed to appear uniformly on the road surface (a), the travelling vehicles can be considered to have the greatest influence on the state of the road surface (a). In that case, the influence of the travelling vehicles can be expected to appear uniformly along the traveling direction (b) of the vehicles. This is considered to hold not only for snowfall but also for fine weather, cloudy weather, and rain.

The road surface is also thought to change with the local environment. That is, as vehicles travel on the road surface (a), wear is considered to occur uniformly on it. Accordingly, discoloration of the road surface (a) due to wear, and puddles that form in the wheel tracks (d) after rain, can also be expected to appear uniformly along the traveling direction (b) of the travelling vehicles.

Therefore, the image processing apparatus and traffic monitoring apparatus according to the present invention are provided with the road surface area extraction unit 12 so as to accurately extract the road surface conditions that change with the local environment and weather. FIG. 4 is a flowchart showing the series of processing steps performed by the road surface area extraction unit 12.

That is, the road surface area extraction unit 12 executes a local region comparison process 31 on the video captured by the monitoring camera 1, then executes a binarization process 32 and a region shaping process 33, extracts the road surface area in the video of the monitoring camera 1, and sends it to the vehicle area detection unit 13.

In the local region comparison process 31, as shown in FIG. 5, a local region 34a enclosing one pixel, or enclosing a predetermined number of pixels by a predetermined number of pixels (for example 3 x 3 = 9 pixels), is set. While local regions 34b, 34c, ... are shifted, without overlapping, along the traveling direction (b) of the travelling vehicles, image comparisons 35-1, 35-2 are performed between the preceding local region 34a and the shifted local regions 34b, 34c, ..., and the similarity between the local region pairs 34a-34b, 34a-34c, ... is evaluated. Even when snow has accumulated because of the weather and wheel tracks (d) of travelling vehicles have formed on the road surface (a), the similarity evaluation remains high as long as the local regions 34a-34b, 34a-34c, ... are compared along the traveling direction (b) of the vehicles. The image comparisons 35-1, 35-2 between the local regions 34a-34b, 34a-34c, ... may, for example, evaluate the luminance difference or the luminance correlation.
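The following is a simplified sketch of this local region comparison, under the assumptions (made here, not stated in the patent) that the travel direction coincides with the image row axis and that the mean absolute luminance difference is used as the comparison measure:

```python
import numpy as np

def road_similarity_map(frame, block=3):
    """Compare each block with the next block along the travel direction
    (assumed here to be the row axis) and return a per-block similarity score.
    High similarity suggests road surface; low similarity suggests a vehicle."""
    h, w = frame.shape
    rows, cols = h // block, w // block
    sim = np.zeros((rows, cols), dtype=np.float32)
    for r in range(rows - 1):
        for c in range(cols):
            a = frame[r * block:(r + 1) * block, c * block:(c + 1) * block].astype(np.float32)
            b = frame[(r + 1) * block:(r + 2) * block, c * block:(c + 1) * block].astype(np.float32)
            # similarity as the inverse of the mean absolute luminance difference
            sim[r, c] = 1.0 / (1.0 + np.abs(a - b).mean())
    if rows > 1:
        sim[rows - 1, :] = sim[rows - 2, :]  # reuse the last computed row of scores
    return sim
```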

In any case, when the similarity between successive local regions is evaluated, the state of the road surface (a) is uniform along the traveling direction (b) of the travelling vehicles with respect to changes in the local environment and weather, so the similarity is high; when a vehicle is present in a local region, however, a local image comparison between the vehicle and the road surface (a) gives a low similarity.

In this way, the image comparisons 35-1, 35-2, ... are executed for all pixels in the video of the monitoring camera 1 along the traveling direction (b) of the travelling vehicles.

Subsequently, the binarization process 32 is executed. In this process, a predetermined threshold is set in advance, and in accordance with the similarity evaluated along the traveling direction (b) of the travelling vehicles in the local region comparison process 31, the regions whose similarity exceeds the threshold are extracted as "1".

Then, the region shaping process 33 is executed on the extracted regions, performing dilation and erosion to remove fine noise components while tidying the shape of the regions to be determined; the road surface area in the video of the monitoring camera 1 is thus extracted and sent to the vehicle area detection unit 13.

As a result, when a vehicle (e) is travelling on the road surface (a) as shown in FIG. 6(a), the vehicle candidate area extraction unit 11 can extract, from the video of the monitoring camera 1, a vehicle candidate area (e') as shown in FIG. 6(b), while the road surface area extraction unit 12 can extract a road surface area (a') as shown in FIG. 6(c). Note that (f) and (g) are noise areas such as wheel tracks, snow cover, or puddles that appear under the influence of the local environment and weather, and they are extracted as a kind of vehicle candidate area.

The vehicle area detection unit 13 therefore uses the vehicle candidate areas (e'), (f), (g) extracted by the vehicle candidate area extraction unit 11 and the road surface area (a') extracted by the road surface area extraction unit 12 to exclude the candidate areas (f), (g) and the road surface area (a'), and takes out the true vehicle area.

That is, as one concrete example, the vehicle area detection unit 13 performs a logical AND operation between the vehicle candidate areas (e'), (f), (g) extracted by the vehicle candidate area extraction unit 11 and the area obtained by inverting the road surface area (a') extracted by the road surface area extraction unit 12, thereby excluding the vehicle candidate areas (f), (g) and the road surface area (a') described above, detects only the true vehicle area, and sends it to the traffic event detection unit 3.
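This detection step is a straightforward mask operation. Assuming (for illustration) that both masks are binary arrays of the same size, it can be sketched as:

```python
import numpy as np

def detect_vehicle_region(candidate_mask, road_mask):
    """AND the candidate mask with the inverted road-surface mask, so that
    road-surface areas and weather-induced noise areas are removed and only
    the true vehicle area remains."""
    return np.logical_and(candidate_mask.astype(bool),
                          np.logical_not(road_mask.astype(bool)))
```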

In the traffic event detection unit 3, traffic event data corresponding to traffic event contents (for example congestion, stopped vehicle, fallen object, and other events) is stored in advance in a traffic event management table. The unit judges, for example, congestion from the positions of the vehicle areas sent from the vehicle area detection unit 13, the vehicle speed, and the ratio of the vehicle areas to the road surface area, retrieves the corresponding congestion traffic event data from the traffic event management table, and sends it to the monitor display device 4.

As for measuring the position of a vehicle area: since the monitoring camera 1 is fixed in advance, the relationship between each image coordinate appearing in the video of the monitoring camera 1 and the absolute distance from the monitoring camera 1 is defined in a conversion table, and the absolute distance data is retrieved from the conversion table on the basis of the image coordinates of the vehicle area detected by the vehicle area detection unit 13 to obtain the position of the vehicle area. In terms of positional accuracy, it is desirable to take the absolute distance from the image coordinates of the ground-contact side where the vehicle touches the road surface; the position is therefore obtained from the coordinates of the lower part of the vehicle area.
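The lookup can be sketched as below; the table values are hypothetical placeholders, since the real conversion table depends on the calibration of the installed camera:

```python
import numpy as np

# Hypothetical conversion table: image row index -> absolute distance [m] from
# the camera. In practice this table is calibrated for the fixed camera setup.
ROW_TO_DISTANCE = {400: 20.0, 350: 40.0, 300: 70.0, 250: 110.0, 200: 160.0}

def vehicle_position(vehicle_mask):
    """Take the lowest image row of the vehicle area (the ground-contact side)
    and look up the nearest entry in the conversion table."""
    rows = np.where(vehicle_mask.any(axis=1))[0]
    if rows.size == 0:
        return None
    bottom_row = rows.max()
    nearest = min(ROW_TO_DISTANCE, key=lambda r: abs(r - bottom_row))
    return ROW_TO_DISTANCE[nearest]
```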

Furthermore, the traffic event detection unit 3 measures the vehicle speed from the time series of vehicle positions and their time intervals, and stores in advance the road surface area data within the camera's field of view; from the ratio of the area of the vehicle areas detected by the vehicle area detection unit 13 to this road surface area data, it detects traffic events such as a stopped vehicle, congestion, or a fallen object, and outputs traffic event data corresponding to the detected event.
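A schematic of this decision, based on speed and the vehicle-to-road area ratio, is shown below; the thresholds and event labels are illustrative assumptions standing in for the traffic event management table:

```python
def detect_traffic_event(positions, timestamps, vehicle_area, road_area,
                         speed_stop=1.0, ratio_congested=0.5):
    """positions/timestamps: two consecutive position measurements [m] and
    their times [s]; areas are pixel counts of the detected masks."""
    dt = timestamps[1] - timestamps[0]
    speed = abs(positions[1] - positions[0]) / dt if dt > 0 else 0.0
    ratio = vehicle_area / road_area if road_area > 0 else 0.0
    if speed < speed_stop and ratio >= ratio_congested:
        return "congestion"            # many slow/stationary vehicle pixels
    if speed < speed_stop and ratio > 0:
        return "stopped vehicle or fallen object"
    return "normal flow"
```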

The monitor display device 4 displays, in the video captured by the monitoring camera 1, the traffic event data detected by the traffic event detection unit 3, such as congestion, a stopped vehicle, or the presence of a fallen object. An operator can accurately confirm the current traffic event from the video and the traffic event data displayed on the monitor display device 4.

Therefore, according to the embodiment described above, a characteristic of the road surface is defined whereby its state is uniform along the traveling direction of travelling vehicles; vehicle candidate areas are extracted from the video of the monitoring camera 1 using the background difference method, the road surface area is extracted by local region comparison, and the extracted road surface area is used to exclude from the vehicle candidate areas the road surface areas and the image noise on the road surface that change with the local environment and weather. Erroneous detection of vehicle areas caused by the background difference method and the like can thereby be reduced, and traffic events can be confirmed more accurately.

(Second Embodiment)

FIG. 7 is a block diagram showing a second embodiment of a traffic monitoring apparatus using an image processing apparatus according to the present invention. In the figure, parts identical or equivalent to those in FIG. 1 are given the same reference numerals, and their detailed description is omitted.

The second embodiment differs from the first embodiment in particular in that a traffic event determination unit 5 is newly provided on the output side of the traffic event detection unit 3.

When the traffic events output from the traffic event detection unit 3 at fixed intervals give the same detection result a predetermined number of times in succession, the traffic event determination unit 5 determines that a traffic event such as congestion, a stopped vehicle, or a fallen object has occurred, and displays it on the monitor display device 4.

Next, of the apparatus configured as described above, the operation of the traffic event determination unit 5 in particular will be described.

First, as described above, the traffic event detection unit 3 detects traffic events from the video of the monitoring camera 1 at fixed intervals. Detected traffic events such as a stopped vehicle or congestion do not arise and disappear instantaneously; only when they continue for a certain length of time can the occurrence of a definite traffic event be concluded. That is, if the traffic events detected by the traffic event detection unit 3 at fixed intervals give the same result over multiple detections, the detected traffic event can be said to be correct.

Therefore, the traffic event determination unit 5 sets a predetermined number of times at which a result can be considered definitely correct, and compares the traffic event detection result output from the traffic event detection unit 3 at each fixed interval with the previous detection result. Only when the same detection result as the first one has been received continuously the predetermined number of times does it determine that a traffic event such as a stopped vehicle or congestion has occurred, and display it on the monitor display device 4.
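A minimal sketch of this confirmation logic follows; reporting an event only after N identical consecutive detections. The count of 3 is an arbitrary assumption, not a value taken from the patent:

```python
class TrafficEventJudge:
    """Report a traffic event only after N identical consecutive detections."""

    def __init__(self, required_count=3):
        self.required_count = required_count
        self.last_event = None
        self.count = 0

    def update(self, detected_event):
        # count how many times in a row the same detection result arrives
        if detected_event == self.last_event:
            self.count += 1
        else:
            self.last_event = detected_event
            self.count = 1
        if self.count >= self.required_count:
            return detected_event   # confirmed event, to be displayed
        return None                 # not yet confirmed
```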

Therefore, according to the second embodiment described above, when the detection results output from the traffic event detection unit 3 at fixed intervals are the same over multiple consecutive detections, the detection result is determined to be correct and sent to the monitor display device 4, so a more accurate traffic event than the detection result of the first embodiment can be obtained, and erroneous detection of traffic events can be further reduced as a result.

(Third Embodiment)

FIG. 8 is a block diagram showing a third embodiment of a traffic monitoring apparatus using an image processing apparatus according to the present invention. In the figure, parts identical or equivalent to those in FIG. 1 are given the same reference numerals, and their detailed description is omitted.

In the third embodiment, at a location with heavy traffic such as a long intersection, or a location with poor forward visibility such as a long curve, a plurality of monitoring cameras 1a, 1b, 1c are installed along the road and the occurrence of a stopped vehicle, congestion, or a fallen object is determined. This embodiment is thus an example adapted to handle such traffic events. Since the rest of the configuration is the same as in the first and second embodiments, identical or equivalent parts are given the same reference numerals and their detailed description is omitted.

What is particularly different in this third embodiment is that it is provided with switching means 6 for switching among the outputs of the monitoring cameras 1a, 1b, 1c, and with a traffic event determination unit 5a having a switching signal generation function that sends a switching instruction signal to the switching means 6 each time a traffic event detection result is output from the video of one camera, for example 1a, or at predetermined intervals.

Next, the operation of the traffic monitoring apparatus of the third embodiment will be described.

First, based on the switching instruction signal from the traffic event determination unit 5a, the switching means 6 selects the monitoring camera 1a and takes in the video captured by that camera; as described above, the video is sent to the vehicle candidate area extraction unit 11, the road surface area extraction unit 12, and the monitor display device 4.

The vehicle candidate area extraction unit 11 and the road surface area extraction unit 12 extract the vehicle candidate areas and the road surface area in the video from the video of the monitoring camera 1a by the same image processing as in the first embodiment; the traffic event detection unit 3 then detects the traffic event, and if the traffic event determination unit 5 obtains the same detected traffic event multiple times, it determines that a traffic event such as a stopped vehicle, congestion, or a fallen object has occurred and displays it on the monitor display device 4.

Subsequently, the traffic event determination unit 5a sends a switching instruction signal to the switching means 6, sequentially selects the video of the monitoring camera 1b and then the monitoring camera 1c, determines the occurrence of traffic events in the same manner as described above, and displays them on the monitor display device 4.
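The switching behaviour can be sketched as a round-robin selector advanced by the judgment stage; the camera objects themselves are hypothetical placeholders and not part of the patent text:

```python
class CameraSwitcher:
    """Cycle through the installed cameras in a fixed order each time the
    judgment stage issues a switching instruction signal."""

    def __init__(self, cameras):
        self.cameras = cameras   # e.g. [camera_1a, camera_1b, camera_1c]
        self.index = 0

    def current(self):
        return self.cameras[self.index]

    def switch(self):
        # advance to the next camera, wrapping around to the first one
        self.index = (self.index + 1) % len(self.cameras)
        return self.current()
```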

At this time, the monitor display device 4 displays the camera video and traffic events by switching among the monitoring cameras 1a, 1b, 1c; alternatively, the display area of the monitor display device 4 may be divided in advance into a plurality of areas (for example four) and the video and traffic events of the monitoring cameras 1a, 1b, 1c displayed simultaneously and in parallel over a fixed period.

This allows the operator to grasp, for example, the traffic events of the entire intersection accurately and globally.

As another example, instead of switching the monitoring cameras 1a, 1b, 1c sequentially at every predetermined processing time, the traffic event determination unit 5 may switch to the other monitoring cameras 1b, 1c, ... in order only when it determines that a traffic event has occurred from the video of a particular monitoring camera that monitors an important point, for example 1a, and display the road conditions on the monitor display device 4.

When a plurality of monitoring cameras 1a, 1b, 1c, ... are installed, a configuration may also be adopted in which a vehicle candidate area extraction unit 11, a road surface area extraction unit 12, and a traffic event detection unit 3 are connected for each monitoring camera 1a, 1b, 1c, ..., and a traffic event determination unit 5b determines the occurrence of traffic events simultaneously and in parallel under application software for individual processing and displays them in the corresponding divided display areas of the monitor display device 4.

FIG. 10 is a diagram showing another configuration example in which a plurality of monitoring cameras 1a, 1b, 1c, ... are installed at required distances along the road.

In general, the range that one monitoring camera, for example 1a, can monitor is about several tens of meters to 200 m, so for a traffic event in which many vehicles queue up in congestion, a large number of monitoring cameras 1a, 1b, 1c, ... must be installed, and traffic events must be determined by dividing the monitoring range (monitoring section) of each monitoring camera 1a, 1b, 1c, ...

In this embodiment, therefore, the cameras are divided into groups, with some cameras shared between groups: a first camera group consisting of monitoring cameras (1a-1b-1c), and a second camera group consisting of monitoring cameras (1b-1c-1d).

Then, when each traffic event determination unit 5a determines, from the videos of the plural monitoring cameras 1a, 1b, 1c of a given group processed through the image processing apparatus 2 described above, that all of the videos indicate congestion, it determines that a traffic event such as congestion has occurred in the monitoring section covered by the monitoring cameras 1a, 1b, 1c, and displays it on the monitor display device 4.

That is, on the condition that the videos of the plurality of monitoring cameras in a camera group show the same traffic event, it is determined that the specific traffic event has occurred in the corresponding monitoring section, and this is displayed on the monitor display device 4.
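A sketch of this per-group decision is given below: overlapping groups of neighbouring cameras are formed, and an event is declared for a monitoring section only when every camera in that group reports the same event. The group size and the way the per-camera results are represented are assumptions made for illustration:

```python
def make_groups(cameras, group_size=3):
    """Form overlapping groups of neighbouring cameras, e.g. (1a, 1b, 1c),
    (1b, 1c, 1d), ... for cameras installed along the road."""
    return [cameras[i:i + group_size]
            for i in range(len(cameras) - group_size + 1)]

def group_event(per_camera_events):
    """Declare an event for a monitoring section only when all cameras in
    the group report the same traffic event; otherwise report nothing."""
    first = per_camera_events[0]
    if all(event == first for event in per_camera_events):
        return first
    return None
```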

As for the display format: the display may be divided into display areas 4a, 4b, ... for each group, with the video of the important monitoring camera 1b of the first camera group and the important monitoring camera 1c of the second camera group shown in display areas 4a and 4b respectively, together with congestion or no-congestion text data inside the areas 4a, 4b, ...; or an overall image of the long intersection or long curved road may be displayed on the monitor display device 4, with indications such as "first group monitoring section = congested" and "second group monitoring section = congested" shown in predetermined areas of the overall image; or the monitoring section including the relevant location may be shown directly in the image in a different color (for example red) or by blinking.

Therefore, according to this third embodiment, traffic events are determined and displayed from all the videos of the group of monitoring cameras 1a, 1b, 1c installed along the road, and likewise from all the videos of the adjacent next group of monitoring cameras 1b, 1c, 1d. Traffic events can thus be determined accurately from the vehicle areas at locations with heavy traffic, such as long intersections, or with poor forward visibility, such as long curves, and can be confirmed over a wide section with fewer erroneous detections.

The present invention is not limited to the above embodiments, and can be carried out with various modifications without departing from its gist.

Reference numerals: 1a, 1b, 1c, ... monitoring cameras; 2 image processing apparatus; 3 traffic event detection unit; 4 monitor display device; 5, 5a traffic event determination units; 6 switching means; 11 vehicle candidate area extraction unit; 12 road surface area extraction unit; 13 vehicle area detection unit; 21 inter-image comparison process; 31 local region comparison process; 22, 32 binarization processes; 23, 33 region shaping processes; 34a to 34c local regions; (a) road surface; (a') road surface area; (b) traveling direction of travelling vehicles; (c) snow cover; (d) wheel tracks; (e) vehicle; (e') vehicle candidate area.

Claims (6)

1. An image processing apparatus comprising:
vehicle candidate area extraction means for extracting vehicle candidate areas, using a background difference method, from the video of a surveillance camera installed at a required position on a road;
road surface area extraction means for extracting the road surface area from the video of the surveillance camera while taking into account road surface states that change with the local environment and weather; and
vehicle area detection means for operating on the vehicle candidate areas extracted by the vehicle candidate area extraction means and the road surface area extracted by the road surface area extraction means, excluding from the vehicle candidate areas the noise areas that appear due to changes in the local environment and weather as well as the road surface area, and detecting the true vehicle areas.
2. The image processing apparatus according to claim 1, wherein the road surface area extraction means comprises: local region comparison processing means that sequentially sets, on the video of the surveillance camera, local regions of a desired number of pixels without overlap along the traveling direction of travelling vehicles and evaluates the similarity between successive local regions from a comparison of their images; and area extraction means in which a threshold is set in advance and which extracts, as the road surface area, the regions whose similarity evaluation value obtained by the local region comparison processing means exceeds the threshold.
3. A traffic monitoring apparatus comprising:
a surveillance camera installed at a required position on a road;
an image processing apparatus configured as set forth in claim 1 or claim 2, which detects the true vehicle areas from the video of the surveillance camera;
traffic event detection means for detecting traffic events such as a stopped vehicle or congestion from the positions of the vehicle areas output from the image processing apparatus and the ratio of the vehicle areas to the road surface; and
a monitor display device for displaying the traffic events detected by the traffic event detection means on the video of the surveillance camera.
4. The traffic monitoring apparatus according to claim 3, further comprising traffic event determination means that, when the traffic events detected by the traffic event detection means at fixed intervals give the same result over a predetermined number of times, determines that the traffic event has occurred and displays it on the video shown on the monitor display device.
5. The traffic monitoring apparatus according to claim 4, further comprising:
camera installation means for installing a plurality of surveillance cameras along a road; and
switching means, provided between these surveillance cameras and the image processing apparatus, that receives a switching instruction signal from the traffic event determination means at predetermined intervals and sequentially switches among and selects the plurality of surveillance cameras,
wherein the traffic event determination means determines the occurrence of traffic events from the video of each surveillance camera in turn and displays them on the monitor display device either individually and selectively or simultaneously in predetermined divided areas.
6. A traffic monitoring apparatus comprising:
n (n being an integer) surveillance cameras installed at required distances along a road;
camera grouping means for dividing the n surveillance cameras into groups each consisting of a plurality of adjacent surveillance cameras fewer than n;
a group-corresponding image processing apparatus, provided for each camera group and configured as set forth in claim 1 or claim 2, which detects the true vehicle areas from the video of each surveillance camera belonging to that group;
group-corresponding traffic event detection means, provided for each group-corresponding image processing apparatus, for detecting traffic events such as a stopped vehicle or congestion from the positions of the vehicle areas and the ratio of the vehicle areas to the road surface obtained from the video of each surveillance camera belonging to the group, output by the group-corresponding image processing apparatus; and
traffic event determination means that, when the traffic events obtained at fixed intervals from the videos of all the surveillance cameras belonging to the group and detected by each group-corresponding traffic event detection means give the same result over a predetermined number of times, determines that a traffic event has occurred over the section of that group and displays it on the monitor display device.
JP2009038209A 2009-02-20 2009-02-20 Image processing apparatus and traffic monitoring apparatus Active JP5175765B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009038209A JP5175765B2 (en) 2009-02-20 2009-02-20 Image processing apparatus and traffic monitoring apparatus

Publications (2)

Publication Number Publication Date
JP2010191888A true JP2010191888A (en) 2010-09-02
JP5175765B2 JP5175765B2 (en) 2013-04-03

Family

ID=42817836

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009038209A Active JP5175765B2 (en) 2009-02-20 2009-02-20 Image processing apparatus and traffic monitoring apparatus

Country Status (1)

Country Link
JP (1) JP5175765B2 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003324727A (en) * 2002-05-08 2003-11-14 Av Planning Center:Kk Method and apparatus for estimating background image
JP2006059183A (en) * 2004-08-20 2006-03-02 Matsushita Electric Ind Co Ltd Image processor

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013073620A (en) * 2011-09-28 2013-04-22 Honda Research Inst Europe Gmbh Method and system for detecting road landform for driver support system
WO2013121911A1 (en) * 2012-02-16 2013-08-22 日産自動車株式会社 Solid-object detection device and solid-object detection method
JPWO2013121911A1 (en) * 2012-02-16 2015-05-11 日産自動車株式会社 Three-dimensional object detection apparatus and three-dimensional object detection method
KR20200098788A (en) * 2019-02-12 2020-08-21 한국도로공사 Expressway traffic monitoring system of transport information center CCTV using artificial intelligence
KR102207393B1 (en) 2019-02-12 2021-01-28 한국도로공사 Expressway traffic monitoring system of transport information center CCTV using artificial intelligence
CN111798677A (en) * 2020-07-15 2020-10-20 安徽达尔智能控制系统股份有限公司 Traffic incident monitoring and commanding system based on road video
WO2023190081A1 (en) * 2022-04-01 2023-10-05 京セラ株式会社 Information processing device, roadside unit, and information processing method

Also Published As

Publication number Publication date
JP5175765B2 (en) 2013-04-03

Legal Events

Date        Code  Title / Description
2011-03-04  A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2012-07-26  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2012-08-07  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2012-10-03  A521  Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523)
            TRDD  Decision of grant or rejection written
2012-12-11  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2013-01-07  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
            R151  Written notification of patent or utility model registration (Ref document number: 5175765; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R151)
            FPAY  Renewal fee payment (event date is renewal date of database) (Payment until: 2016-01-11; Year of fee payment: 3)