JP2007293507A - Vehicle tracking device - Google Patents

Vehicle tracking device

Info

Publication number
JP2007293507A
Authority
JP
Japan
Prior art keywords
image
visible light
camera
vehicle
infrared camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2006119336A
Other languages
Japanese (ja)
Other versions
JP4722760B2 (en)
Inventor
Yoshinori Senda
宜紀 千田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp
Priority to JP2006119336A
Publication of JP2007293507A
Application granted
Publication of JP4722760B2
Expired - Fee Related
Anticipated expiration


Landscapes

  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a vehicle tracking device that clearly photographs a tracked vehicle.
SOLUTION: The vehicle tracking device comprises a turntable driver 12 that drives the turning operation of a visible light camera 1 arranged side by side with an infrared camera 2, a turntable driver 22 that drives the turning operation of the infrared camera 2, a lens driver 11 that adjusts the zoom magnification of a telephoto lens 13 of the visible light camera 1, a lens driver 21 that adjusts the zoom magnification of a telephoto lens 23 of the infrared camera 2, an image processing unit 3 that extracts a portion having a feature amount of the tracked vehicle from image data output by the infrared camera 2 and calculates data for adjusting the angle of view of the captured image according to that portion, and a CPU 5 that controls the turntable drivers 12 and 22 and the lens drivers 11 and 21 based on the data obtained by the image processing unit 3 and outputs the data of the image photographed by the visible light camera 1.
COPYRIGHT: (C)2008, JPO&INPIT

Description

The present invention relates to a vehicle tracking device that is mounted on a vehicle such as a police car or police motorcycle and used to photograph a tracked vehicle.

A video camera is installed on a police car, police motorcycle, or the like, and when pursuing, for example, a runaway vehicle, the tracked vehicle is photographed; the captured images are later used as evidence for identifying the vehicle and proving violations.
Until now, the tracked vehicle has been photographed either with a general-purpose video camera mounted at the front or rear of the police car or motorcycle, or by a passenger in, for example, the front passenger seat of the police car holding a video camera and operating it by hand.
There is also an apparatus in which a surveillance camera and an infrared camera are installed at the rear of a vehicle with their shooting direction facing the road surface behind and below it. From the video captured by the infrared camera, that is, from the video signal output by the infrared camera, this apparatus uses a position detection unit to detect the position of heat-emitting living beings and moving objects, and uses a display control unit to highlight the corresponding portion of the video captured by the surveillance camera on a monitor, so that obstacles in the direction of travel can be confirmed easily (see, for example, Patent Document 1).

Japanese Patent Laid-Open No. 2005-184523 (pages 5 and 6, FIGS. 1 and 2)

Since conventional apparatuses for photographing a tracked vehicle are configured as described above, and the tracked vehicle weaves violently and constantly changes direction in order to escape, it has been difficult to photograph the tracked vehicle clearly with a video camera fixed to the pursuing police car or motorcycle. Likewise, when a passenger in the front seat of a police car holds a video camera and shoots, the vehicle interior shakes violently, making it difficult to capture the tracked vehicle clearly. Furthermore, the apparatus in which an imaging camera and an infrared camera are mounted together on a vehicle merely highlights the portion of the displayed video in which living beings and the like appear, so it is difficult for it to follow and clearly photograph a tracked vehicle that moves violently.

The present invention has been made to solve the above problems, and an object of the present invention is to provide a vehicle tracking device that clearly photographs a tracked vehicle.

A vehicle tracking device according to the present invention comprises: a visible light camera and an infrared camera arranged side by side; rotating means for rotating the visible light camera and the infrared camera; lens drive means for adjusting the zoom magnification of the lenses of the visible light camera and the infrared camera; image processing means for extracting, from the image data output by the infrared camera, a portion having a feature amount of the tracked vehicle and obtaining data for adjusting the angle of view of the captured image in accordance with that portion; and control means for controlling the rotating means and the lens drive means based on the data obtained by the image processing means and causing the visible light camera to output data of the image it has captured.

According to the present invention, the visible light camera and the infrared camera arranged side by side are rotated by the rotating means, the zoom magnification of the lenses of the visible light camera and the infrared camera is adjusted by the lens drive means, the image processing means extracts the portion of the image data from the infrared camera that has the feature amount of the tracked vehicle and obtains data for adjusting the angle of view of the captured image in accordance with that portion, and the control means controls the rotating means and the lens drive means based on that data and causes the visible light camera to output the data of the image it has captured. As a result, the tracked vehicle can be photographed clearly and in an easily viewable manner.
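Expressed procedurally, the invention amounts to a closed control loop: analyse the infrared frame, derive an angle-of-view correction, drive the turntables and zoom lenses of both cameras, and output the visible-light image once the correction is negligible. The Python sketch below illustrates this loop under stated assumptions; all identifiers (ViewAngleDiff, tracking_step, and so on) and the settling threshold are illustrative and do not appear in the patent.

```python
# Illustrative sketch of the control loop summarized above.
# All names and the 2% settling threshold are assumptions, not patent text.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class ViewAngleDiff:
    d_pan: float       # horizontal offset of the target from the image centre
    d_tilt: float      # vertical offset of the target from the image centre
    zoom_ratio: float  # target-frame size / current image size

    def is_negligible(self, eps: float = 0.02) -> bool:
        return (abs(self.d_pan) < eps and abs(self.d_tilt) < eps
                and abs(self.zoom_ratio - 1.0) < eps)


def tracking_step(ir_frame,
                  visible_frame,
                  analyse: Callable[[object], ViewAngleDiff],
                  drivers: Iterable[Callable[[ViewAngleDiff], None]],
                  record: Callable[[object], None]) -> None:
    """One loop iteration: analyse the infrared frame, then either record the
    visible-light frame (view angle settled) or correct pan/tilt/zoom."""
    diff = analyse(ir_frame)       # role of image processing unit 3
    if diff.is_negligible():
        record(visible_frame)      # role of recording unit 7 / display unit 8
    else:
        for drive in drivers:      # lens drivers 11, 21; turntable drivers 12, 22
            drive(diff)
```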

An embodiment of the present invention is described below.
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of a vehicle tracking device according to Embodiment 1 of the present invention. The illustrated vehicle tracking device comprises a visible light camera 1 and an infrared camera 2 mounted side by side at the front or rear of the pursuing vehicle; an image processing unit 3 that processes the signals output from the visible light camera 1 and the infrared camera 2; an image memory 4 that stores data input from the image processing unit 3 and outputs the stored data back to the image processing unit 3 as required; a CPU 5, such as a processor, that controls the visible light camera 1, the infrared camera 2, and the image processing unit 3; a timing generator 6 that generates timing signals for synchronizing the operation of the visible light camera 1, the infrared camera 2, the image processing unit 3, and the CPU 5; a recording unit 7 that records the data processed by the image processing unit 3; and a display unit 8 that displays images based on the data output from the image processing unit 3.
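Purely for illustration, the block diagram of FIG. 1 could be modelled in software roughly as below; the class layout and attribute names are assumptions made for this sketch and are not defined by the patent.

```python
# Hypothetical software model of the FIG. 1 block diagram; all names are assumed.
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class Camera:
    kind: str          # "visible" (camera 1) or "infrared" (camera 2)
    zoom: float = 1.0  # telephoto lens 13 / 23 magnification
    pan: float = 0.0   # horizontal turntable angle, degrees
    tilt: float = 0.0  # vertical turntable angle, degrees


@dataclass
class VehicleTracker:
    visible: Camera = field(default_factory=lambda: Camera("visible"))
    infrared: Camera = field(default_factory=lambda: Camera("infrared"))
    image_memory: Dict[int, Dict[str, Any]] = field(default_factory=dict)  # image memory 4
    recordings: List[Any] = field(default_factory=list)                    # recording unit 7
```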

The visible light camera 1 is mounted on a horizontal/vertical turntable 10 and the infrared camera 2 on a horizontal/vertical turntable 20, and these horizontal/vertical turntables 10 and 20 are fixed to the vehicle. The visible light camera 1 and the infrared camera 2 are thus mounted on the vehicle so that each can rotate in both the horizontal and vertical directions. The visible light camera 1 is provided with a lens driver 11 and a turntable driver 12: the lens driver 11 is connected and configured to perform zoom adjustment of the telephoto lens 13 attached to the visible light camera 1, and the turntable driver 12 is connected and configured to control the drive of the horizontal/vertical turntable 10. The infrared camera 2 is provided with a lens driver 21 and a turntable driver 22: the lens driver 21 is connected and configured to perform zoom adjustment of the telephoto lens 23 attached to the infrared camera 2, and the turntable driver 22 is connected and configured to control the drive of the horizontal/vertical turntable 20.

FIG. 2 is an explanatory diagram showing the configuration of the vehicle tracking device according to Embodiment 1. This figure shows one example of how the visible light camera 1 and the infrared camera 2 shown in FIG. 1 are mounted on a vehicle 30 such as a police car, viewed from the side of the vehicle 30. The same reference numerals are used for the same parts as in FIG. 1, and their description is omitted. The horizontal/vertical turntables 10 and 20 are mounted and fixed on the front end of the roof of the vehicle 30 and, as described above, carry the visible light camera 1 and the infrared camera 2, respectively. The visible light camera 1 and the infrared camera 2 are placed so that the telephoto lenses 13 and 23 face forward from the vehicle 30.

FIG. 3 is an explanatory diagram showing the configuration of the vehicle tracking device according to Embodiment 1. This figure shows the vehicle 30 carrying the visible light camera 1 and the infrared camera 2 of FIG. 2, viewed from the front. The same reference numerals are used for the same parts as in FIGS. 1 and 2, and their description is omitted. In the example of FIG. 3, the visible light camera 1 on the horizontal/vertical turntable 10 is arranged on the left side of the figure and the infrared camera 2 on the horizontal/vertical turntable 20 on the right side. The visible light camera 1 and the infrared camera 2 are placed so that, for example, their lens heights above the roof surface of the vehicle 30 are equal and the two cameras are aligned horizontally, with a spacing of several tens of centimetres between the telephoto lens 13 and the telephoto lens 23. Since the distance between the tracked vehicle and the pursuing vehicle 30 is at least several metres, the spacing between the cameras is small compared with the distance to the subject, so each camera can photograph the tracked vehicle with substantially the same size and shape and the processing described later is not impaired.

Next, the operation will be described.
In the vehicle tracking device, the CPU 5 controls the turntable drivers 12 and 22 to drive the horizontal/vertical turntables 10 and 20 so that the visible light camera 1 and the infrared camera 2 photograph the same subject. When the vehicle tracking device is started, the CPU 5 controls the turntable driver 12 and the turntable driver 22 to drive the horizontal/vertical turntables 10 and 20 so that both cameras photograph the same range in front of the vehicle 30, and thereafter holds the orientation of the visible light camera 1 and the infrared camera 2 constant. More specifically, at start-up the lens drivers 11 and 21 and the turntable drivers 12 and 22 are driven to detect their origins, so that the zoom magnification of each telephoto lens and the orientation of each camera are matched and a single tracked vehicle can be photographed in the same way by both cameras. In this operation the CPU 5, for example, obtains the difference between the zoom magnification preset as the origin and the current zoom magnification, and adjusts and controls the zoom magnification of each telephoto lens accordingly. Likewise, for the orientation of the visible light camera 1 and the infrared camera 2, it obtains the difference between the direction preset as the origin and the direction in which each camera currently points, and controls the turning operation of the horizontal/vertical turntables 10 and 20 so as to point each camera in the preset direction. In this way the orientations of the visible light camera 1 and the infrared camera 2 and the zoom magnifications of their telephoto lenses 13 and 23 are made identical, and in all subsequent control by the CPU 5 the two cameras are kept pointing in the same direction and at the same zoom magnification.
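The start-up alignment described above reduces to driving each axis by the difference between a stored origin value and the current reading. The sketch below shows that idea; the origin values, tolerance, and driver interface are assumptions made for illustration.

```python
# Sketch of start-up origin alignment for one camera. Values and the
# send_command interface are assumed, not taken from the patent.
PRESET_ORIGIN = {"zoom": 1.0, "pan": 0.0, "tilt": 0.0}   # assumed origin settings


def align_to_origin(camera_state: dict, send_command) -> None:
    """camera_state holds the current zoom/pan/tilt readings; send_command(axis,
    delta) stands in for the lens driver (zoom) and turntable driver (pan/tilt)."""
    for axis, origin in PRESET_ORIGIN.items():
        delta = origin - camera_state[axis]   # preset origin minus current value
        if abs(delta) > 1e-3:                 # assumed dead band
            send_command(axis, delta)

# Running the same routine for the visible light camera 1 and the infrared
# camera 2 leaves both cameras pointing in the same direction at the same zoom.
```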

When the tracked vehicle is the subject, photographing is performed by the visible light camera 1 and the infrared camera 2, whose telephoto lenses 13 and 23 are pointed in the same direction as described above. The visible light camera 1 and the infrared camera 2 photograph the tracked vehicle in synchronization with timing signals input from the timing generator 6, and output the captured images to the image processing unit 3 as image data.
The image processing unit 3 stores the image data sequentially input from the visible light camera 1 and the infrared camera 2 in the image memory 4. From the image data from the infrared camera 2 stored in the image memory 4, it extracts the region of the image in which the tracked vehicle appears, using a feature amount that identifies that vehicle. It then obtains the difference between the angle of view at which this extracted region would appear at a predetermined position on the screen when the image is displayed and the angle of view of the image data currently being processed, and outputs angle-of-view difference data indicating this difference to the CPU 5.

The CPU 5 controls the lens driver 11 and the turntable driver 12, and the lens driver 21 and the turntable driver 22, so that the value of the input angle-of-view difference data becomes smaller, thereby changing the orientation of the visible light camera 1 and the infrared camera 2 and the zoom magnification of the telephoto lenses 13 and 23. When the value of the angle-of-view difference data input from the image processing unit 3 has become sufficiently small that the difference can be regarded as essentially absent, and preferably when the difference indicated by the data has disappeared entirely, the CPU 5 controls the image processing unit 3 so that the image data from the visible light camera 1 stored in the image memory 4 at that point, in other words the image data that the visible light camera 1 captured in synchronization with the infrared camera 2 image data for which the small angle-of-view difference was obtained, is output from the image memory 4 and recorded in, for example, the recording unit 7. The CPU 5 also controls the image processing unit 3 according to settings made externally, causing the image data from the visible light camera 1 stored in the image memory 4 to be output to the display unit 8 and an image based on that data to be displayed.
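Because both cameras expose on the same timing pulses, the visible-light frame captured together with a given infrared frame can be looked up by its capture index once that infrared frame yields a near-zero angle-of-view difference. The sketch below illustrates such a pairing; the data layout and threshold are assumptions.

```python
# Sketch of committing the synchronously captured visible-light frame once the
# angle-of-view difference is small. Structure and threshold are assumptions.
EPS = 0.02  # assumed "sufficiently small" threshold


def commit_if_settled(frame_index: int,
                      image_memory: dict,
                      diff,            # object with d_pan, d_tilt, zoom_ratio
                      recorder: list) -> bool:
    """image_memory maps frame_index -> {"visible": ..., "infrared": ...};
    frames sharing an index were exposed on the same timing generator pulse."""
    settled = (abs(diff.d_pan) < EPS and abs(diff.d_tilt) < EPS
               and abs(diff.zoom_ratio - 1.0) < EPS)
    if settled:
        recorder.append(image_memory[frame_index]["visible"])  # recording unit 7
    return settled
```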

Of the processing operations described above, the processing by the image processing unit 3 and the control of the lens driver 11, the turntable driver 12, the lens driver 21, and the turntable driver 22 by the CPU 5 are now described in detail.
FIG. 4 is an explanatory diagram showing the processing operation of the vehicle tracking device according to Embodiment 1, and represents the processing of image data from the infrared camera 2 by the image processing unit 3. The illustrated image 40 is obtained when the vehicle 30 shown in FIG. 2 and elsewhere follows and photographs the tracked vehicle 50 from behind. The tracked vehicle 50 has an exhaust pipe directed rearward, to which a muffler 51 is attached. The vehicle tracking device described here uses the presence of the muffler 51, which becomes hot while the vehicle is running, as the feature amount that identifies the tracked vehicle 50 in the captured image.

When the vehicle tracking device is started, the zoom magnifications of the telephoto lenses 13 and 23 and the operation of the horizontal/vertical turntables 10 and 20 are controlled by the CPU 5, as described above, so that the visible light camera 1 and the infrared camera 2 cover the same photographing range. When the rear of the tracked vehicle 50 travelling ahead of the vehicle 30 is photographed, the shooting operations of the visible light camera 1 and the infrared camera 2 are synchronized. Specifically, the image sensor readout signal, the image horizontal synchronizing signal, and the image vertical synchronizing signal generated as timing signals by the timing generator 6 shown in FIG. 1 are input to the visible light camera 1 and the infrared camera 2, and the light-receiving operation and image data output operation of the image sensors of each camera are synchronized to the timing indicated by these signals. Under the control of the CPU 5 as described above, the visible light camera 1 and the infrared camera 2 photograph the tracked vehicle 50 at the same angle of view, and the image processing unit 3 stores the image data output from these cameras in the image memory 4 as described above.

From the image data of the infrared camera 2 stored in the image memory 4 as described above, the image processing unit 3 extracts, for example when the infrared image represented by that data is the image 40 shown in FIG. 4, the position of the muffler 51, which is the hottest part of the rear of the tracked vehicle 50, and calculates the position and size of a feature amount frame 41, indicated by a broken line in FIG. 4, setting it in the image so that this part falls inside the frame.
The monitoring frame 42, also indicated by a broken line in FIG. 4, contains the feature amount frame 41 in its lower part as illustrated, and the region inside the monitoring frame 42 contains the image of the tracked vehicle 50 being followed and photographed.
The ratio of the projected area of the tracked vehicle 50 to that of the muffler 51, and the position of the muffler 51 in the rear view of the tracked vehicle 50, are roughly the same for any vehicle. The area ratio and relative position of the feature amount frame 41 and the monitoring frame 42 can therefore be set in advance. Accordingly, the feature amount frame 41 is set in the image as described above, the monitoring frame 42 is then set in the image in relation to the feature amount frame 41, and the tracked vehicle 50 in the image is thereby enclosed inside the monitoring frame 42.
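One plausible way to derive the feature amount frame 41 and the monitoring frame 42 from a thermal image is sketched below. The hottest-pixel search, frame dimensions, scale factor, and placement offsets are assumptions; the patent only states that the frame relations can be fixed in advance.

```python
# Sketch: place a feature frame around the hottest region (the muffler) and
# derive the monitoring frame from preset size/position relations.
# All numeric relations here are illustrative assumptions.
import numpy as np


def feature_and_monitor_frames(thermal: np.ndarray,
                               feature_size: tuple = (20, 40),
                               scale: float = 8.0):
    """thermal: 2-D array of pixel temperatures. Returns two (x, y, w, h)
    boxes: the feature frame around the hottest point, and the monitoring
    frame that encloses the vehicle with the feature frame near its lower edge."""
    y, x = np.unravel_index(np.argmax(thermal), thermal.shape)  # hottest pixel
    fh, fw = feature_size
    feature = (int(x) - fw // 2, int(y) - fh // 2, fw, fh)      # frame 41

    mw, mh = int(fw * scale), int(fh * scale)                   # preset area relation
    # place the monitoring frame so the feature frame sits in its lower part
    monitor = (int(x) - mw // 2, int(y) + fh // 2 - mh, mw, mh)  # frame 42
    return feature, monitor
```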

FIG. 5 is an explanatory diagram showing the processing operation of the vehicle tracking device according to Embodiment 1. The same reference numerals are used for parts identical or equivalent to those in FIG. 4, and their description is omitted. This figure shows the processing by which the image processing unit 3 obtains the angle-of-view difference data described above for the image 40 shown in FIG. 4.
The deviation ΔL between the centre point 43 of the image 40 shown in FIG. 5 and the centre point 44 of the monitoring frame 42 is obtained. From ΔL, the horizontal deviation ΔT and the vertical deviation ΔP between the image 40 and the monitoring frame 42 are obtained.

Further, when, for example, the horizontal size of the image 40 is taken as the side length Z1 and the horizontal size of the monitoring frame 42 as the side length Z2, the zoom ratio between the image 40 and the monitoring frame 42 is obtained by calculating Z2/Z1. Here the zoom ratio does not mean the commonly used ratio of maximum to minimum focal length; it is equal to the ratio of the overall size of the monitoring frame 42 to the overall size of the image 40 and represents the amount by which the zoom magnification is to be corrected. The extraction of the feature amount and the setting of each frame by the image processing unit 3 described above are performed using the image data output from the infrared camera 2.
The image processing unit 3 outputs the horizontal deviation ΔT, the vertical deviation ΔP, and the zoom ratio Z2/Z1 obtained in this way to the CPU 5 as the angle-of-view difference data.
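Taken together, the angle-of-view difference data is just the centre offset and the size ratio between the whole image and the monitoring frame. A short sketch of the computation, with hypothetical names, follows.

```python
# Sketch of assembling the angle-of-view difference data (ΔT, ΔP, Z2/Z1)
# from the image size and the monitoring frame box. Names are assumptions.
def view_angle_difference(image_size: tuple, monitor_box: tuple):
    """image_size = (width Z1, height); monitor_box = (x, y, width Z2, height)."""
    img_w, img_h = image_size
    mx, my, mw, mh = monitor_box

    img_cx, img_cy = img_w / 2.0, img_h / 2.0        # centre point 43 of image 40
    mon_cx, mon_cy = mx + mw / 2.0, my + mh / 2.0    # centre point 44 of frame 42

    d_t = mon_cx - img_cx       # horizontal deviation ΔT
    d_p = mon_cy - img_cy       # vertical deviation ΔP
    zoom_ratio = mw / img_w     # Z2 / Z1, the zoom magnification correction
    return d_t, d_p, zoom_ratio
```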

Using the values of the input angle-of-view difference data, namely the horizontal deviation ΔT, the vertical deviation ΔP, and the zoom ratio Z2/Z1, as parameters, the CPU 5 controls the lens driver 11 and the turntable driver 12 as well as the lens driver 21 and the turntable driver 22. Through this control, the orientation of the visible light camera 1 and the infrared camera 2 and the zoom magnification of each camera are adjusted in the same way. The image of the tracked vehicle 50 photographed by the visible light camera 1 after this adjustment shows the vehicle large within the image 40, in other words at an optimal angle of view, so a clear captured image of the tracked vehicle 50 is obtained. The image data of the tracked vehicle 50 photographed by the visible light camera 1 in this way is recorded in the recording unit 7 as described above, or the image is displayed on the display unit 8. At this point the angle-of-view difference data obtained from the image data of the infrared camera 2 has small parameter values, and the angle-of-view adjustment processing described above is no longer applied to the image currently being captured by the visible light camera 1. That is, when the value of the angle-of-view difference data obtained by the image processing unit 3 has become sufficiently small, or when the difference has disappeared entirely, the data of the image captured by the visible light camera 1 is recorded in the recording unit 7, or is output to the display unit 8 and displayed.
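How the CPU 5 converts these parameters into concrete driver commands is not spelled out in the patent. The sketch below assumes a simple proportional mapping from pixel offsets to turntable angles and treats Z2/Z1 directly as a zoom multiplier; the gain value is illustrative.

```python
# Sketch of converting (ΔT, ΔP, Z2/Z1) into pan/tilt/zoom corrections applied
# identically to both cameras. The proportional scheme and gain are assumptions.
DEG_PER_PIXEL = 0.05  # assumed conversion from pixel offset to turntable angle


def drive_cameras(d_t: float, d_p: float, zoom_ratio: float, cameras: list) -> None:
    """cameras: list of dicts with 'pan', 'tilt', 'zoom'. Both cameras receive
    the same correction so their fields of view stay aligned."""
    for cam in cameras:
        cam["pan"] += d_t * DEG_PER_PIXEL        # turntable drivers 12 and 22
        cam["tilt"] -= d_p * DEG_PER_PIXEL       # image y axis points downward
        if zoom_ratio > 0:
            cam["zoom"] *= 1.0 / zoom_ratio      # lens drivers 11 and 21: zoom in
                                                 # when the frame is smaller (Z2/Z1 < 1)
```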

As described above, according to Embodiment 1, the device comprises the visible light camera 1 and the infrared camera 2 arranged so as to photograph the same tracked vehicle 50, the image processing unit 3 that extracts the muffler 51 from the image data output by the infrared camera 2 and obtains the parameters for changing the angle of view, and the CPU 5 that uses the parameters obtained by the image processing unit 3 to control the orientation and zoom magnification of the visible light camera 1 and the infrared camera 2. As a result, the tracked vehicle 50 can be photographed so that it fills the angle of view, and identification of the vehicle, confirmation of its licence plate, and the like can be performed easily.
In addition, since the device is controlled so as to photograph the tracked vehicle 50 at an optimal angle of view, the pursuer is not burdened with operating the camera during the pursuit and can concentrate on the pursuit itself.

FIG. 1 is a block diagram showing the configuration of a vehicle tracking device according to Embodiment 1 of the present invention.
FIG. 2 is an explanatory diagram showing the configuration of the vehicle tracking device according to Embodiment 1.
FIG. 3 is an explanatory diagram showing the configuration of the vehicle tracking device according to Embodiment 1.
FIG. 4 is an explanatory diagram showing the processing operation of the vehicle tracking device according to Embodiment 1.
FIG. 5 is an explanatory diagram showing the processing operation of the vehicle tracking device according to Embodiment 1.

Explanation of Symbols

1 visible light camera; 2 infrared camera; 3 image processing unit; 4 image memory; 5 CPU; 6 timing generator; 7 recording unit; 8 display unit; 10, 20 horizontal/vertical turntable; 11, 21 lens driver; 12, 22 turntable driver; 13, 23 telephoto lens; 30 vehicle; 40 image; 41 feature amount frame; 42 monitoring frame; 50 tracked vehicle; 51 muffler.

Claims (4)

1. A vehicle tracking device comprising:
a visible light camera and an infrared camera arranged side by side;
rotating means for rotating the visible light camera and the infrared camera;
lens drive means for adjusting the zoom magnification of the lenses of the visible light camera and the infrared camera;
image processing means for extracting, from image data output by the infrared camera, a portion having a feature amount of a tracked vehicle and obtaining data for adjusting the angle of view of the captured image in accordance with the portion having the feature amount; and
control means for controlling the rotating means and the lens drive means based on the data obtained by the image processing means and causing the visible light camera to output data of an image it has captured.
2. The vehicle tracking device according to claim 1, wherein the image processing means extracts the portion of the muffler that has become hot, sets a monitoring frame in relation to the muffler portion, and obtains data indicating the difference between the angle of view of the monitoring frame and the angle of view of the whole image; and
the control means controls the rotating means and the lens drive means, based on the data indicating the angle-of-view difference obtained by the image processing means, so that the difference between the angle of view of the monitoring frame and the angle of view of the whole image becomes smaller.
3. The vehicle tracking device according to claim 2, wherein the control means causes image data from the visible light camera to be stored in storage means when the data input from the image processing means indicates a small difference or indicates that there is no difference.
4. The vehicle tracking device according to claim 2, wherein the control means causes display means to display an image using the image data from the visible light camera when the data input from the image processing means indicates a small difference or indicates that there is no difference.
JP2006119336A 2006-04-24 2006-04-24 Vehicle tracking device Expired - Fee Related JP4722760B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006119336A JP4722760B2 (en) 2006-04-24 2006-04-24 Vehicle tracking device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006119336A JP4722760B2 (en) 2006-04-24 2006-04-24 Vehicle tracking device

Publications (2)

Publication Number Publication Date
JP2007293507A true JP2007293507A (en) 2007-11-08
JP4722760B2 JP4722760B2 (en) 2011-07-13

Family

ID=38764091

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006119336A Expired - Fee Related JP4722760B2 (en) 2006-04-24 2006-04-24 Vehicle tracking device

Country Status (1)

Country Link
JP (1) JP4722760B2 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011193061A (en) * 2010-03-11 2011-09-29 Hitachi Computer Peripherals Co Ltd Imaging apparatus, and monitoring system
CN104778842A (en) * 2015-04-29 2015-07-15 深圳市保千里电子有限公司 Cloud vehicle running track tracing method and system based on vehicle license plate recognition
JP5856700B1 (en) * 2015-01-29 2016-02-10 パナソニックIpマネジメント株式会社 Wearable camera system and recording control method
JP5856701B1 (en) * 2015-01-29 2016-02-10 パナソニックIpマネジメント株式会社 Wearable camera system and recording control method
JP5856702B1 (en) * 2015-01-29 2016-02-10 パナソニックIpマネジメント株式会社 Wearable camera system and attribute information assigning method
WO2016120932A1 (en) * 2015-01-29 2016-08-04 パナソニックIpマネジメント株式会社 Wearable camera system and image recording control method in wearable camera system
WO2017090928A1 (en) * 2015-11-27 2017-06-01 엘지이노텍 주식회사 Camera module for both normal photography and infrared photography
WO2018092388A1 (en) * 2016-11-21 2018-05-24 パナソニックIpマネジメント株式会社 Speed enforcement system and speed enforcement method
CN113853515A (en) * 2019-05-30 2021-12-28 松下知识产权经营株式会社 Stress analysis device for moving object
CN115060665A (en) * 2022-08-16 2022-09-16 君华高科集团有限公司 Automatic inspection system for food safety

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102235031B1 (en) * 2019-01-24 2021-04-01 주식회사 아이에이 Real-time processable camera based driving environment detection vehicle system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02300741A (en) * 1989-05-15 1990-12-12 Mazda Motor Corp Preceding vehicle recognizing device for moving vehicle
JPH11296785A (en) * 1998-04-14 1999-10-29 Matsushita Electric Ind Co Ltd Vehicle number recognition system
JP2002099997A (en) * 2000-09-26 2002-04-05 Mitsubishi Motors Corp Detection device for moving object
JP2004326327A (en) * 2003-04-23 2004-11-18 Fujitsu Ltd Vehicular passage route prediction system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02300741A (en) * 1989-05-15 1990-12-12 Mazda Motor Corp Preceding vehicle recognizing device for moving vehicle
JPH11296785A (en) * 1998-04-14 1999-10-29 Matsushita Electric Ind Co Ltd Vehicle number recognition system
JP2002099997A (en) * 2000-09-26 2002-04-05 Mitsubishi Motors Corp Detection device for moving object
JP2004326327A (en) * 2003-04-23 2004-11-18 Fujitsu Ltd Vehicular passage route prediction system

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011193061A (en) * 2010-03-11 2011-09-29 Hitachi Computer Peripherals Co Ltd Imaging apparatus, and monitoring system
US10356371B2 (en) 2015-01-29 2019-07-16 Panasonic Intellectual Property Management Co., Ltd. Wearable camera system, and video recording control method for wearable camera system
JP5856700B1 (en) * 2015-01-29 2016-02-10 パナソニックIpマネジメント株式会社 Wearable camera system and recording control method
JP5856701B1 (en) * 2015-01-29 2016-02-10 パナソニックIpマネジメント株式会社 Wearable camera system and recording control method
JP5856702B1 (en) * 2015-01-29 2016-02-10 パナソニックIpマネジメント株式会社 Wearable camera system and attribute information assigning method
WO2016120932A1 (en) * 2015-01-29 2016-08-04 パナソニックIpマネジメント株式会社 Wearable camera system and image recording control method in wearable camera system
US9854211B2 (en) 2015-01-29 2017-12-26 Panasonic Intellectual Property Management Co., Ltd. Wearable camera system, and video recording control method for wearable camera system
US10554935B2 (en) 2015-01-29 2020-02-04 Panasonic I-Pro Sensing Solutions Co., Ltd. Wearable camera system, and video recording control method for wearable camera system
CN104778842A (en) * 2015-04-29 2015-07-15 深圳市保千里电子有限公司 Cloud vehicle running track tracing method and system based on vehicle license plate recognition
US11212450B2 (en) 2015-11-27 2021-12-28 Lg Innotek Co., Ltd. Camera module for both normal photography and infrared photography
WO2017090928A1 (en) * 2015-11-27 2017-06-01 엘지이노텍 주식회사 Camera module for both normal photography and infrared photography
JPWO2018092388A1 (en) * 2016-11-21 2018-11-15 パナソニックIpマネジメント株式会社 Speed control system and speed control method
WO2018092388A1 (en) * 2016-11-21 2018-05-24 パナソニックIpマネジメント株式会社 Speed enforcement system and speed enforcement method
CN113853515A (en) * 2019-05-30 2021-12-28 松下知识产权经营株式会社 Stress analysis device for moving object
CN113853515B (en) * 2019-05-30 2024-03-19 松下知识产权经营株式会社 Stress analysis device for moving object
CN115060665A (en) * 2022-08-16 2022-09-16 君华高科集团有限公司 Automatic inspection system for food safety

Also Published As

Publication number Publication date
JP4722760B2 (en) 2011-07-13

Similar Documents

Publication Publication Date Title
JP4722760B2 (en) Vehicle tracking device
CN101404122B (en) Driving support device, driving support method, and computer program
JP4699040B2 (en) Automatic tracking control device, automatic tracking control method, program, and automatic tracking system
US8390686B2 (en) Surveillance camera apparatus and surveillance camera system
KR101776702B1 (en) Monitoring camera for generating 3 dimensional scene and method thereof
JP5302766B2 (en) Surveillance image display device
KR20120126152A (en) Divice and method for photographing image and divice for extracting image information
KR101096157B1 (en) watching apparatus using dual camera
JP2004194071A (en) Drive support image generator
JP2011109630A (en) Universal head for camera apparatus
US7936385B2 (en) Image pickup apparatus and imaging method for automatic monitoring of an image
JP2004096488A (en) Object detection apparatus, object detection method and object detection program
JP2002101408A (en) Supervisory camera system
JP5597382B2 (en) Wide-angle image display control method and apparatus, and wide-angle image pickup apparatus
TW201215146A (en) Image capturing device and method for tracking a moving object using the image capturing device
JP5768172B2 (en) Image display control method and apparatus, and image pickup apparatus
JP3838881B2 (en) Surveillance camera device
JP2013106175A (en) Camera system
JP2005175852A (en) Photographing apparatus and method of controlling photographing apparatus
JP4512690B2 (en) Monitoring system and method by image processing
JP2006013832A (en) Video photographing apparatus and video photographing program
JP6844055B1 (en) Surveillance camera
KR101255143B1 (en) On-vehicles type camera system comprising monitoring function and monitoring method thereof
JP2019027824A (en) Display control device, display control system, display control method, and display control program
KR20180134114A (en) Real Time Video Surveillance System and Method

Legal Events

  • RD04 Notification of resignation of power of attorney (JAPANESE INTERMEDIATE CODE: A7424), effective 20070921
  • RD04 Notification of resignation of power of attorney (JAPANESE INTERMEDIATE CODE: A7424), effective 20080630
  • A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621), effective 20090323
  • A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007), effective 20110125
  • A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131), effective 20110201
  • A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523), effective 20110307
  • TRDD Decision of grant or rejection written
  • A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01), effective 20110329
  • A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
  • A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61), effective 20110406
  • FPAY Renewal fee payment (event date is renewal date of database), payment until 20140415, year of fee payment: 3
  • R150 Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)
  • LAPS Cancellation because of no payment of annual fees