JPH06266840A - Status detector for moving object - Google Patents

Status detector for moving object

Info

Publication number
JPH06266840A
JPH06266840A JP5050374A JP5037493A
Authority
JP
Japan
Prior art keywords
moving
moving object
state
image
distribution pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP5050374A
Other languages
Japanese (ja)
Inventor
Masao Takato (政雄 高藤)
Yasuo Morooka (泰男 諸岡)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd
Priority to JP5050374A
Publication of JPH06266840A
Legal status: Pending

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

PURPOSE: To accurately recognize the overall motion state of moving objects (regions), or objects (regions) with anomalous motion, by recognizing the motion state of moving objects based on the distribution patterns of their feature values. CONSTITUTION: An image processing unit 100 takes in an image captured by a camera 101 and stores it in an image memory 104. An inter-image arithmetic circuit 105, a binarization circuit 106, a labeling circuit 107, and a feature extraction circuit 108 perform inter-image arithmetic, binarization, labeling, and feature extraction, respectively, using the data stored in the memory 104. From the results of the processing unit 100, a CPU 112 generates distribution patterns of feature values concerning each object's position, moving distance, moving speed, moving direction, change in moving direction, direction difference between the movement vector and the initial vector, and so on. The dynamic state of the moving objects is detected from these distribution patterns, and the dynamic state of the moving objects as a whole, or of an object or region moving differently from the others, is displayed on a monitor 114.

Description

[Detailed Description of the Invention]

[0001]

[Field of Industrial Application] The present invention relates to an apparatus for recognizing the state of moving objects, that is, the overall state of their motion, or objects or regions exhibiting anomalous motion.

[0002]

[Prior Art] Conventionally, to identify regions with anomalous motion, a known method takes the average moving direction of linearly moving objects as a reference direction and detects anomalous regions by their deviation from that reference direction.

[0003]

[Problems to Be Solved by the Invention] Because the above prior art derives the reference direction from the average over all objects, disturbances can prevent the reference direction from being determined correctly.

[0004] An object of the present invention is to provide a moving-object state detection apparatus that can accurately recognize the overall motion of moving objects (regions), and objects (regions) with anomalous motion, without being affected by disturbances.

[0005]

[Means for Solving the Problems] To achieve the above object, the present invention generates, from input image information, distribution patterns of motion features such as each object's position, moving distance, moving speed, moving direction, change in moving direction, and direction difference between the movement vector and the initial vector, and recognizes the dynamic state of the moving objects from these distribution patterns.

[0006]

[Operation] With the moving-object state detection apparatus of the present invention, the dynamic state of the moving objects as a whole, and objects or regions that move differently from the others, are obtained as feature-value distribution patterns, so they can be recognized automatically and accurately without being affected by disturbances.

[0007]

[Embodiments] An embodiment of the present invention will now be described with reference to FIG. 1.

[0008] The moving-object state detection apparatus 90 of this embodiment comprises: a camera 101 that captures images of the target objects; an image processing unit 100 that extracts object features from the captured images; a CPU 112 that controls the whole apparatus and detects the state of moving objects from the results of the image processing unit 100; a memory 113 that stores measurement results and the like; a monitor 114 that displays images and various information; and a communication device 115 that communicates with a monitoring center 120.

[0009] The image processing unit 100 includes an A/D converter 102, an image memory 104, an inter-image arithmetic circuit 105, a binarization circuit 106, a labeling circuit 107, a feature extraction circuit 108, and a D/A converter 110.

[0010] The image memory 104 comprises, for example, k gray-scale memories G1 to Gk of 256 x 256 pixels and, as needed, j binary image memories B1 to Bj for storing binary images.

[0011] The operation will now be described.

[0012] On a command from the CPU 112, the image processing unit 100 takes in the image signal captured by the camera 101, converts it with the A/D converter 102 into, for example, 128-level gray-scale data, and stores it in the image memory 104.

[0013] Further, on commands from the CPU 112, the image processing unit 100 applies inter-image arithmetic, binarization, labeling, and feature extraction to the data in the image memory 104, using the inter-image arithmetic circuit 105, the binarization circuit 106, the labeling circuit 107, and the feature extraction circuit 108, respectively; as needed, the results are converted to a video signal by the D/A converter 110 and displayed on the monitor 114. The CPU 112 then detects anomalous objects (regions), described later, and determines whether any are present; if so, it notifies the on-site staff or the staff of the monitoring center 120 via the monitor 114 and the communication device 115, and also by alarm sounds, voice messages, and the like.

[0014] FIG. 15 shows another example system configuration. A moving-object state detection apparatus 90' takes in images from a plurality of cameras, such as cameras 101a, 101b, 101c, and 101d, into an image processing unit 100', where a camera switcher 116 selects and processes the input images one at a time. The processing results for the multiple images are sent to the CPU 112, which sequentially recognizes the state of the moving objects and judges the relationships among the motion states of the multiple images or multiple locations as a whole; alternatively, the sequential processing results are transmitted to the monitoring center 120, which makes that overall judgment. Although this example shows four cameras, the number of cameras is not limited in any way.

[0015] FIG. 16 shows yet another example system configuration. A moving-object state detection apparatus 90'' takes in images from a plurality of cameras, such as cameras 101a, 101b, 101c, and 101d, into an image processing unit 100'', where a camera switcher 116 selects the input images one at a time; after A/D conversion, a screen compositor 103 combines the screens and stores the result in a large image memory 104', which is then processed. The processing results are sent to the CPU 112, which recognizes the state of the moving objects and judges the relationships among the motion states at multiple locations as a whole, or the results are transmitted to the monitoring center 120, which makes that judgment. Although this example shows four cameras, the number of cameras is not limited in any way.

[0016] FIG. 17 shows still another example system configuration. Moving-object state detection apparatuses 90a, 90b, 90c, and 90d each capture an image with a camera 101 and process it in an image processing unit 100. The processing results are sent to the CPU 112, which recognizes the state of the moving objects and transmits the results to the monitoring center 120, where the relationships among the motion states of the multiple images or multiple locations are judged as a whole. Although this example shows four state detection apparatuses, the number is not limited in any way.

[0017] Next, the processing that detects the overall state of motion and objects (regions) with anomalous motion, which is the core of the present invention, will be described with reference to FIG. 2.

[0018] First, input image capture 11 stores the image taken in from the camera 101 by the image processing unit 100 into the image memory 104. Difference image extraction 12 computes the difference image s(t) between the input image and a background image stored beforehand in the image memory 104. Binarization 13 binarizes the difference image s(t) with a predetermined threshold to create a binary image b(t). Labeling 14 then labels each object, and centroid calculation 15 finds the centroid coordinates of each object. Next, matching 16 associates objects with those in the previous input image. Various association methods have been reported, such as template matching, correlation methods, and feature tracking; here, as a simple form of feature tracking, the objects whose centroids are closest to each other are matched. Once the objects are matched, feature calculation 17 computes the features of each object: the start-point coordinates, end-point coordinates, moving distance, change in moving distance, moving direction, change in moving direction, and so on. These are explained with reference to FIG. 3, where the positions of an object A at times t-1, t, and t+1 are denoted At-1, At, and At+1. Let MV be the movement vector of object A, and dx and dy its moving distances in the x and y directions; then the distance L traveled by object A from time t-1 to time t is

[0019]

[Equation 1] L = √(dx² + dy²)

[0020] the moving direction is

[0021]

[Equation 2] θ = tan⁻¹(dy / dx)

[0022] the change in moving direction is

[0023]

[Equation 3] Δθ = θ(t) − θ(t−1)

[0024] and the moving speed is

[0025]

[Equation 4] V = L / Δt

[0026] respectively, where Δt is the interval between the two frames. (The original equation images are not reproduced in this text; the formulas above are reconstructed from the surrounding definitions.) Here the moving direction is measured as a clockwise angle.
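
The association and feature steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the formulas follow the reconstructed Equations 1 to 4 (the original equation images are not reproduced), and all function names are ours.

```python
import math

def match_objects(prev_centroids, curr_centroids):
    """Step 16 (illustrative): associate each current object with the
    previous object whose centroid is nearest."""
    matches = {}
    for i, (cx, cy) in enumerate(curr_centroids):
        matches[i] = min(range(len(prev_centroids)),
                         key=lambda k: math.hypot(cx - prev_centroids[k][0],
                                                  cy - prev_centroids[k][1]))
    return matches

def motion_features(p_prev, p_curr, dt=1.0):
    """Step 17 (illustrative): distance L (Eq. 1), direction theta (Eq. 2),
    and speed V (Eq. 4) between two matched positions; dt is the frame interval."""
    dx, dy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
    L = math.hypot(dx, dy)
    # In image coordinates (y pointing down), atan2 advances clockwise.
    theta = math.degrees(math.atan2(dy, dx)) % 360.0
    return L, theta, L / dt

def direction_change(theta_prev, theta_curr):
    """Eq. 3 (illustrative): change in moving direction, wrapped to (-180, 180]."""
    d = (theta_curr - theta_prev) % 360.0
    return d - 360.0 if d > 180.0 else d
```

For example, an object moving from (0, 0) to (3, 4) in one frame yields L = 5 and V = 5 per frame.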

[0027] A region that moves differently from the rest of the whole area, or, when multiple objects are present, an object that moves differently from the others, is defined as an anomalous region. For example, if a person cuts across a stream of people all moving in one direction, that person exhibits anomalous motion. In FIG. 4, objects 25c and 25j among objects 25a to 25j are anomalous regions.

[0028] FIG. 5 shows an example of anomalous regions in a straight, uniform flow. Arrows 26b, 26e, and 26f indicate the directions of motion that constitute anomalous regions. Here the direction of the overall flow is called the reference direction. The reference direction is found using the frequency distribution shown in FIG. 6: a frequency distribution is formed with the moving direction θ, computed for each object by Equation 2, on the horizontal axis. This is distribution pattern generation 18 in FIG. 2. Here θ is quantized in steps of α degrees (for example, every 10 degrees). In distribution pattern recognition 19, the direction with the maximum frequency is taken as the reference direction, and any object whose direction differs from it by ±β degrees (for example, ±10 degrees) or more is extracted as an anomalous region. Alternatively, the reference direction can be determined as any direction θ whose frequency is at least a threshold γ, with each such direction serving as a reference direction, or as the average of the directions whose frequency is at least γ.
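
As a sketch of distribution pattern generation 18 and recognition 19 for the straight-flow case, the following illustrative code (function names and data are ours) quantizes the directions into α-degree bins, takes the center of the most frequent bin as the reference direction, and flags objects deviating by β degrees or more:

```python
from collections import Counter

def reference_direction(thetas, alpha=10):
    """Generation 18 (illustrative): quantize directions into alpha-degree
    bins and return the centre of the most frequent bin."""
    bins = Counter(int(t // alpha) for t in thetas)
    mode = max(bins, key=bins.get)
    return mode * alpha + alpha / 2.0

def anomalous_objects(thetas, beta=10.0):
    """Recognition 19 (illustrative): indices of objects whose moving
    direction deviates from the reference direction by at least beta degrees."""
    ref = reference_direction(thetas)
    flagged = []
    for i, t in enumerate(thetas):
        dev = abs((t - ref + 180.0) % 360.0 - 180.0)  # wrapped angular deviation
        if dev >= beta:
            flagged.append(i)
    return flagged
```

With directions [91, 92, 95, 90, 93, 170] the reference direction is the 90-100 degree bin, and only the object moving at 170 degrees is flagged.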

[0029] The following describes distribution pattern generation 18 and distribution pattern recognition 19 for circular motion, radial diffusion, and radial contraction.

[0030] FIG. 7 shows an example of anomalous regions for circular motion. Arrows 27c and 27d indicate the directions of motion that constitute anomalous regions: objects or regions that do not join the circular flow. In this case, the change in moving direction of each object is computed by Equation 3, and a frequency distribution with the change Δθ on the horizontal axis is formed in the same way as for the moving direction. The reference direction change can then be taken as the change with the maximum frequency, a change whose frequency is at least γ, or the average of the changes whose frequency is at least γ. Any object whose direction change differs from this reference by ±β degrees (for example, ±10 degrees) or more can be extracted as an anomalous region.
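
The averaging variant above (taking the mean of the direction-change bins whose frequency is at least γ as the reference direction change) can be sketched as follows; names and parameter values are illustrative:

```python
from collections import Counter

def reference_change(dthetas, alpha=10, gamma=2):
    """Average of the direction-change bin centres whose frequency is at
    least gamma, used as the reference direction change (illustrative)."""
    bins = Counter(int(d // alpha) for d in dthetas)
    frequent = [b * alpha + alpha / 2.0 for b, n in bins.items() if n >= gamma]
    return sum(frequent) / len(frequent)
```

For direction changes [5, 6, 7, 15, 16, 90] with α = 10 and γ = 2, the 0-10 and 10-20 degree bins qualify and the reference change is 10 degrees; the isolated 90-degree change is ignored.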

[0031] FIG. 8 shows an example of anomalous regions for radial diffusion. This corresponds, for example, to the flow of people when the area near the center becomes dangerous, or to the pedestrian flows at the corners of a scramble crossing. Arrows 28a and 28b indicate the directions of motion that constitute anomalous regions. In this situation, frequency distributions are formed with the x and y coordinates of the start point (xis, yis) of each object's movement vector MVi on the horizontal axes, quantizing the coordinate values in steps of α (for example, every 20). The coordinate values with the maximum frequency are taken as the reference position; as with the moving direction, the average of the coordinate values whose frequency is at least a threshold can also be used. The distance from the reference position to each object's start point is then computed, and any object whose distance is at least β can be extracted as an anomalous region.

[0032] Alternatively, the initial vector from the reference position to each object's start point is computed, along with the direction difference Δθ0i = θi − θ0i between the direction θi of the movement vector and the direction θ0i of the initial vector; any object whose direction difference Δθ0i is ±β degrees (for example, ±5 degrees) or more can be extracted as an anomalous region. Or a frequency distribution of Δθ0i is formed and, as with the moving direction, the reference direction difference is taken as the difference with the maximum frequency, a difference whose frequency is at least γ, or the average of the differences whose frequency is at least γ; any object whose direction difference differs from this reference by ±β degrees (for example, ±5 degrees) or more can be extracted as an anomalous region.
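
A sketch of the radial-diffusion check: the reference position is taken as the mode of the quantized start coordinates, and an object is flagged when its movement direction deviates from the direction of its initial vector (reference position to start point) by β degrees or more. All names and data are illustrative:

```python
import math
from collections import Counter

def reference_position(starts, alpha=20):
    """Mode of the quantized x and y start coordinates: the assumed
    centre of diffusion (illustrative)."""
    xs = Counter(int(x // alpha) for x, _ in starts)
    ys = Counter(int(y // alpha) for _, y in starts)
    return (max(xs, key=xs.get) * alpha + alpha / 2.0,
            max(ys, key=ys.get) * alpha + alpha / 2.0)

def radial_anomalies(starts, ends, beta=5.0):
    """Objects whose movement direction deviates from the outward radial
    direction (the initial vector) by at least beta degrees (illustrative)."""
    cx, cy = reference_position(starts)
    flagged = []
    for i, ((sx, sy), (ex, ey)) in enumerate(zip(starts, ends)):
        theta0 = math.degrees(math.atan2(sy - cy, sx - cx))  # initial-vector direction
        theta = math.degrees(math.atan2(ey - sy, ex - sx))   # movement direction
        dev = abs((theta - theta0 + 180.0) % 360.0 - 180.0)
        if dev >= beta:
            flagged.append(i)
    return flagged
```

In a small example where three objects move outward from around (50, 50) and one moves inward, only the inward-moving object is flagged.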

[0033] FIG. 9 shows an example of anomalous regions for radial contraction. Arrows 29a and 29b indicate the directions of motion that constitute anomalous regions. This corresponds to the case where, when some accident or incident occurs, most people gather at that point while some, indifferent or with other purposes, move away from it. In this situation, conversely to FIG. 8, frequency distributions are formed with the x and y coordinates of the end point (xid, yid) of each object's movement vector MVi on the horizontal axes, and the reference position is found as in the radial diffusion case. The distance from the reference position to each object's end point is computed, and any object whose distance is at least β can be extracted as an anomalous region.

[0034] Alternatively, the direction difference Δθ0i is computed as in the radial diffusion case, and from it the difference Δθ′0i = Δθ0i − 180; any object whose direction difference Δθ′0i is ±β degrees (for example, ±5 degrees) or more can be extracted as an anomalous region. Or a frequency distribution of Δθ′0i is formed and, as before, the reference direction difference is taken as the difference with the maximum frequency, a difference whose frequency is at least γ, or the average of the differences whose frequency is at least γ; any object whose direction difference differs from this reference by ±β degrees (for example, ±5 degrees) or more can be extracted as an anomalous region.

[0035] Likewise, for the moving distance L or the moving speed V, a frequency distribution of L or V is formed as above, and a reference moving distance or reference moving speed is derived from it; any object whose moving distance or speed differs from the reference by ±β (for example, ±10) or more can be extracted as an anomalous region. The moving direction, direction change, and direction difference between the initial and movement vectors described above can use values from a single time step, but for higher accuracy they can instead be computed from the average of each object's measurements over multiple time steps, with the anomalous regions extracted from those values.

[0036] Next, methods using pattern matching or a neural network as the distribution pattern recognition 19 are described.

[0037] FIG. 10(a) shows the motion of objects moving along a road 50, and FIG. 10(b) shows the distribution pattern of their movement vectors. In FIG. 10(b), the horizontal axis represents position, and the vertical axis represents the moving distance of the objects, the number of objects (frequency of appearance), the cumulative moving distance of each object, or the like. When FIG. 10(b) represents the distribution pattern of measurements taken at multiple monitoring points, the horizontal axis represents the points. By judging which of several reference distribution patterns, prepared in advance in a system such as that of FIG. 11, this distribution pattern most resembles, the current state of the road, or the relationships among the motion states at the points, can be determined. For example, if the normal distribution pattern for a road is that of FIG. 11(a) and the current pattern is that of FIG. 11(b), it can be seen that there is some obstruction near the center of the road (for example, an obstacle, or a fight in progress). The input distribution pattern can be judged by pattern matching against the reference distribution patterns, or by the neural network described below. Applied to FIG. 10, FIG. 11(a) indicates obstruction on both sides of the road: the speed is lower and the volume of flow smaller there than at the center. FIG. 11(b), conversely, indicates obstruction at the center of the road, with lower speed and smaller flow than at the sides. FIG. 11(c) indicates obstruction on the right side of the road, with lower speed and smaller flow than on the left; FIG. 11(d), conversely, indicates obstruction on the left side, with lower speed and smaller flow than on the right. FIG. 11(e) indicates no particular obstruction: the speed and the volume of moving objects are the same across the road. These examples concern one-way flow, but two-way flow can be judged in the same way using the reference distribution patterns shown in FIG. 12. For more complex distribution patterns, a neural network can be used to judge in what proportions the input distribution pattern combines the reference distribution patterns. That is, pattern classification is performed with as many input-layer neurons as there are inputs in the distribution pattern, and as many output-layer neurons as there are reference distribution patterns. By feeding each input-layer neuron one component of the distribution pattern, such as an object's moving distance, the number of objects, or a cumulative moving distance, the output layer yields the proportion of each reference distribution pattern. In this way the state of the overall motion (crowd flow), or an object or region with anomalous motion within the overall motion (crowd flow), can be detected.
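
The pattern-matching form of recognition 19 reduces to finding the prepared reference distribution pattern nearest to the input pattern. A minimal sketch follows; the pattern names, sample values, and the squared-difference metric are our illustrative choices, not taken from the patent:

```python
def classify_pattern(input_pattern, reference_patterns):
    """Return the name of the reference distribution pattern closest to the
    input pattern, by sum of squared differences (illustrative)."""
    def ssd(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reference_patterns,
               key=lambda name: ssd(input_pattern, reference_patterns[name]))
```

For instance, with a "normal" pattern peaking at the road center and a "center_obstructed" pattern dipping there, an input that dips at the center is classified as center_obstructed.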

[0038] The above covers linear motion. FIG. 13 shows an example distribution pattern for radial diffusion, and FIG. 14 shows one for radial contraction. In FIGS. 13(b) and 14(b), the horizontal axis represents the moving direction θ of each object, measured clockwise with the x-axis direction as 0 degrees.

[0039] The above description deals with two-dimensional distribution patterns, but the same approach applies to three-dimensional distribution patterns in which position is the x-axis parameter; one of the moving distance, moving speed, moving direction, change in moving direction, or direction difference between the movement vector and the initial vector is the y-axis parameter; and frequency is the z-axis parameter.

[0040] After the distribution pattern has been recognized by the methods described above, determination 20 checks for abnormal states or anomalous objects (regions). If any exist, notification 21 reports this to the monitoring staff and displays the current state on the monitor, leaving the judgment to the staff. When the appropriate action has been completed, the staff instructs processing to restart, and the same processing is repeated. If there is no abnormal state or anomalous object (region), the process returns to the beginning, a new image is input, and the same processing is repeated.

【0041】[0041]

【発明の効果】本発明によれば、全体の動き(群流)の
状態,他のものと異なった状態あるいは動きをする物体
(領域)を認識できるので、駅の構内や地下街,商店街
等における全体の動き(群流)の状態や特異な動きをす
る物体(人間等)を自動的に監視し、認識できるという
効果がある。
According to the present invention, it is possible to recognize the state of the whole movement (group current), the state different from other things, or the moving object (region), so that the inside of a station, an underground mall, a shopping district, etc. The effect of automatically monitoring and recognizing the state of the whole movement (group flow) in and the object (human being, etc.) having a peculiar movement.

【0042】また、複数のカメラから入力される画像を
処理することにより広域的に広がる状況を把握すること
ができ、一つ一つ判断していた監視を全体との関連で把
握することができる。
Further, by processing the images input from a plurality of cameras, it is possible to grasp the situation spreading over a wide area, and it is possible to grasp the monitoring that was judged one by one in relation to the whole. .

【図面の簡単な説明】[Brief description of drawings]

【図1】移動物体の状態検出装置の構成を示すブロック
図である。
FIG. 1 is a block diagram showing a configuration of a moving object state detection device.

【図2】本発明の一実施例である特異物体検出処理の流
れを示すフロー図である。
FIG. 2 is a flowchart showing a flow of a unique object detection process which is an embodiment of the present invention.

【図3】特徴量の説明図である。FIG. 3 is an explanatory diagram of feature quantities.

【図4】特異領域の説明図である。FIG. 4 is an explanatory diagram of a unique region.

【図5】直線運動に対する特異領域の説明図である。FIG. 5 is an explanatory diagram of a peculiar region for linear motion.

【図6】移動方向の頻度分布の説明図である。FIG. 6 is an explanatory diagram of a frequency distribution in a moving direction.

【図7】円運動に対する特異領域の説明図である。FIG. 7 is an explanatory diagram of a singular region for circular motion.

【図8】放射状に拡散する場合の特異領域の説明図であ
る。
FIG. 8 is an explanatory diagram of a peculiar region in the case of radial diffusion.

【図9】放射状に収縮する場合の特異領域の説明図であ
る。
FIG. 9 is an explanatory diagram of a peculiar region when radially contracting.

【図10】直線運動の分布パターンの説明図である。FIG. 10 is an explanatory diagram of a distribution pattern of linear motion.

【図11】一方向の基準分布パターンの説明図である。FIG. 11 is an explanatory diagram of a reference distribution pattern in one direction.

【図12】双方向の基準分布パターンの説明図である。FIG. 12 is an explanatory diagram of a bidirectional reference distribution pattern.

【図13】放射状に拡散する場合の分布パターンの説明
図である。
FIG. 13 is an explanatory diagram of a distribution pattern in the case of radially spreading.

【図14】放射状に収縮する場合の分布パターンの説明
図である。
FIG. 14 is an explanatory diagram of a distribution pattern when radially contracting.

【図15】移動物体の状態検出装置の別の構成を示すブ
ロック図である。
FIG. 15 is a block diagram showing another configuration of a moving object state detection device.

【図16】移動物体の状態検出装置の別の構成を示すブ
ロック図である。
FIG. 16 is a block diagram showing another configuration of a moving object state detection device.

【図17】移動物体の状態検出装置の別の構成を示すブ
ロック図である。
FIG. 17 is a block diagram showing another configuration of a moving object state detection device.

【符号の説明】[Explanation of symbols]

100…画像処理部、101…カメラ、112…CP
U、114…モニタ、115…通信装置。
100 ... Image processing unit, 101 ... Camera, 112 ... CP
U, 114 ... Monitor, 115 ... Communication device.

Claims (12)

【特許請求の範囲】[Claims] 【請求項1】対象の画像を取り込む画像入力手段と、前
記画像入力手段により取り込まれた画像に対して各種の
画像処理を行う画像処理手段と、各種データ処理を行う
データ処理手段を用いて、移動物体を追跡するものにお
いて、前記画像入力手段で入力した画像から移動物体の
動き情報を抽出し、抽出された情報から各物体の位置,
移動距離,移動速度,移動方向,移動方向の変化量、及
び移動ベクトルと初期ベクトルとの方向差のうちいずれ
か一つを横軸のパラメータとし、且つ、少なくとも物体
頻度,累積移動距離、及び累積移動速度のうちいずれか
一つを縦軸のパラメータとする2次元分布パターンを生
成する分布パターン生成手段と、前記分布パターン生成
手段からの2次元分布パターンに基づいて前記移動物体
の移動状態を認識する手段と、認識結果あるいは画像を
表示する表示手段と、前記認識結果あるいは画像を通信
する通信手段を具備することを特徴とする移動物体の状
態検出装置。
1. An image input means for capturing an image of a target, an image processing means for performing various image processing on the image captured by the image input means, and a data processing means for performing various data processing, In tracking a moving object, motion information of the moving object is extracted from the image input by the image input means, and the position of each object is extracted from the extracted information.
One of the moving distance, the moving speed, the moving direction, the change amount of the moving direction, and the direction difference between the moving vector and the initial vector is used as a parameter on the horizontal axis, and at least the object frequency, the cumulative moving distance, and the cumulative A distribution pattern generation means for generating a two-dimensional distribution pattern having one of the movement speeds as a vertical axis parameter, and a movement state of the moving object is recognized based on the two-dimensional distribution pattern from the distribution pattern generation means. A device for detecting a state of a moving object, which comprises: a means for displaying the recognition result or the image; and a communication means for communicating the recognition result or the image.
【請求項2】対象の画像を取り込む画像入力手段と、前
記画像入力手段により取り込まれた画像に対して各種の
画像処理を行う画像処理手段と、各種データ処理を行う
データ処理手段を用いて、移動物体を追跡するものにお
いて、前記画像入力手段で入力した画像から移動物体の
動き情報を抽出し、抽出された情報から各物体の位置を
一軸のパラメータとし、移動距離,移動速度,移動方
向,移動方向の変化量及び移動ベクトルと初期ベクトル
との方向差のうちいずれか一つを他の一軸のパラメータ
とし、且つ、頻度をさらに他の一軸のパラメータとする
3次元分布パターンを生成する分布パターン生成手段
と、前記分布パターン生成手段からの3次元分布パター
ンに基づいて前記移動物体の移動状態を認識する手段
と、認識結果あるいは画像を表示する表示手段と、前記
認識結果あるいは画像を通信する通信手段を具備するこ
とを特徴とする移動物体の状態検出装置。
2. An image input means for capturing an image of an object, an image processing means for performing various image processing on the image captured by the image input means, and a data processing means for performing various data processing, In tracking a moving object, motion information of the moving object is extracted from the image input by the image input means, the position of each object is used as a uniaxial parameter from the extracted information, and the moving distance, moving speed, moving direction, A distribution pattern that generates a three-dimensional distribution pattern in which one of the amount of change in the movement direction and the direction difference between the movement vector and the initial vector is used as another uniaxial parameter, and the frequency is used as another uniaxial parameter. Generating means, means for recognizing the moving state of the moving object based on the three-dimensional distribution pattern from the distribution pattern generating means, recognition result or image. Display means for displaying the recognition result or state detecting apparatus of a moving object, characterized by comprising a communication means for communicating the image.
【請求項3】請求項1または2の移動物体の状態検出装
置において、複数の画像入力手段による入力画像を処理
し、該処理結果の全体関係から広域に広がる移動物体の
状態を把握することを特徴とする移動物体の状態検出装
置。
3. The moving object state detecting apparatus according to claim 1 or 2, wherein the input images from a plurality of image input means are processed, and the state of the moving object spread over a wide area is grasped from the overall relation of the processing results. A characteristic detecting device for a moving object.
【請求項4】請求項1または2の移動物体の状態検出装
置において、複数の画像入力手段による入力画像を合成
し、該合成画像を処理することにより広域に広がる移動
物体の状態を把握することを特徴とする移動物体の状態
検出装置。
4. The moving object state detecting device according to claim 1, wherein the input images from a plurality of image input means are combined, and the combined images are processed to grasp the state of the moving object spread over a wide area. A state detecting device for a moving object characterized by.
【請求項5】請求項1または2の移動物体の状態検出装
置において、各物体の特徴量の基準量として、各物体の
特徴量の頻度分布から最大頻度をとる特徴量を求め、該
基準量との差異から、全体の動きである群流の中で特異
な動きをする物体あるいは領域を認識することを特徴と
する移動物体の状態検出装置。
5. The moving object state detecting apparatus according to claim 1, wherein the feature quantity having the maximum frequency is obtained from the frequency distribution of the feature quantity of each object, and the reference quantity is obtained as the reference quantity of the feature quantity of each object. A device for detecting the state of a moving object, which is characterized by recognizing an object or a region having a peculiar motion in a group flow, which is the whole motion.
【請求項6】請求項1または2の移動物体の状態検出装
置において、各物体の特徴量の頻度分布から頻度がある
一定値以上の特徴量を求め、求めた複数の該特徴量をそ
れぞれ基準量とし、該基準量との差異から、全体の動き
である群流の中で特異な動きをする物体あるいは領域を
認識することを特徴とする移動物体の状態検出装置。
6. The moving object state detecting apparatus according to claim 1, wherein a characteristic amount having a frequency of a certain value or more is obtained from the frequency distribution of the characteristic amounts of each object, and the obtained plurality of characteristic amounts are used as references. A state detecting device for a moving object, characterized by recognizing an object or a region having a unique motion in a group flow, which is an overall motion, based on a difference from the reference amount.
【請求項7】請求項1または2の移動物体の状態検出装
置において、各物体の特徴量の頻度分布から頻度がある
一定値以上の特徴量を求め、求めた複数の該特徴量の平
均値を基準量とし、該基準量との差異から、全体の動き
である群流の中で特異な動きをする物体あるいは領域を
認識することを特徴とする移動物体の状態検出装置。
7. The moving object state detecting apparatus according to claim 1 or 2, wherein a characteristic value having a frequency equal to or higher than a certain value is obtained from a frequency distribution of the characteristic amounts of the respective objects, and an average value of the obtained plurality of characteristic values is obtained. Is used as a reference amount, and a state detection device for a moving object is characterized by recognizing an object or a region having a peculiar movement in a group flow, which is an overall movement, from the difference from the reference amount.
【請求項8】請求項1または2の移動物体の状態検出装
置において、各物体の特徴量の分布パターンを求め、あ
らかじめ用意した基準分布パターンとのマッチングによ
り、全体の動きの状態を認識することを特徴とする移動
物体の状態検出装置。
8. The moving object state detecting apparatus according to claim 1 or 2, wherein the distribution pattern of the feature amount of each object is obtained, and the entire motion state is recognized by matching with a reference distribution pattern prepared in advance. A state detecting device for a moving object characterized by.
【請求項9】請求項1または2の移動物体の状態検出装
置において、各物体の特徴量の分布パターンを求め、こ
れをあらかじめ用意したニューラルネットワークに入力
し、そのニューラルネットワークの出力値を用いて、全
体の動きの状態を認識することを特徴とする移動物体の
状態検出装置。
9. The moving object state detecting apparatus according to claim 1 or 2, wherein a distribution pattern of feature quantities of each object is obtained, the distribution pattern is input to a neural network prepared in advance, and the output value of the neural network is used. , A moving object state detection device characterized by recognizing the state of the entire movement.
【請求項10】請求項1または2の移動物体の状態検出
装置において、各物体の特徴量の移動距離,移動方向か
ら移動ベクトルを求めるとともに、各物体の始点座標ま
たは終点座標から基準位置を求める手段と、該基準位置
から各物体の始点座標への初期ベクトルを求め、該初期
ベクトルの方向と該移動ベクトルの方向との差を求める
手段と、該手段により抽出された方向差を用いて、物体
の移動状態を認識することを特徴とする移動物体の状態
検出装置。
10. The moving object state detecting apparatus according to claim 1, wherein a moving vector is obtained from a moving distance and a moving direction of a feature amount of each object, and a reference position is obtained from a starting point coordinate or an ending point coordinate of each object. Means, means for obtaining an initial vector from the reference position to the starting point coordinates of each object, means for obtaining the difference between the direction of the initial vector and the direction of the movement vector, and using the direction difference extracted by the means, A moving object state detecting device characterized by recognizing a moving state of an object.
【請求項11】請求項10の状態検出装置において、複
数の監視点を設け、該監視点に少なくとも一台の画像入
力手段を設置し、該入力手段の入力情報を処理し、処理
結果を監視センタヘ送信し、監視センタで統合処理する
ことにより、移動物体の状況を監視,把握することを特
徴とする移動物体の状態検出装置。
11. The state detection device according to claim 10, wherein a plurality of monitoring points are provided, at least one image input means is installed at the monitoring points, the input information of the input means is processed, and the processing result is monitored. A state detecting device for a moving object, which monitors and grasps the state of the moving object by transmitting the information to a center and performing integrated processing at the monitoring center.
【請求項12】請求項10の状態検出装置において、複
数の監視点を設け、該監視点に少なくとも一台の画像入
力手段を設置し、該入力手段の入力情報を合成し、処理
することにより、移動物体の状況を監視,把握すること
を特徴とする移動物体の状態検出装置。
12. The state detecting device according to claim 10, wherein a plurality of monitoring points are provided, at least one image input means is installed at the monitoring points, and input information of the input means is combined and processed. A state detecting device for a moving object, characterized by monitoring and grasping the state of the moving object.
JP5050374A 1993-03-11 1993-03-11 Status detector for moving object Pending JPH06266840A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP5050374A JPH06266840A (en) 1993-03-11 1993-03-11 Status detector for moving object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP5050374A JPH06266840A (en) 1993-03-11 1993-03-11 Status detector for moving object

Publications (1)

Publication Number Publication Date
JPH06266840A true JPH06266840A (en) 1994-09-22

Family

ID=12857115

Family Applications (1)

Application Number Title Priority Date Filing Date
JP5050374A Pending JPH06266840A (en) 1993-03-11 1993-03-11 Status detector for moving object

Country Status (1)

Country Link
JP (1) JPH06266840A (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08292871A (en) * 1995-04-21 1996-11-05 Nec Corp Image processing system
JPH0991595A (en) * 1995-09-25 1997-04-04 Mitsubishi Motors Corp Obstacle recognition device
JPH09130781A (en) * 1995-10-31 1997-05-16 Matsushita Electric Ind Co Ltd Broad area supervisory equipment
JPH10105712A (en) * 1996-09-30 1998-04-24 Mitsubishi Heavy Ind Ltd Moving body tracking device
JPH10124684A (en) * 1996-10-16 1998-05-15 Ricoh Co Ltd Image processing method and storage medium and image processor
JPH1144700A (en) * 1997-07-25 1999-02-16 Sanyo Electric Co Ltd Speed-measuring apparatus, automatic tracking system using the apparatus and display system for estimated arrival position
JP2000038121A (en) * 1998-07-22 2000-02-08 Mitsubishi Motors Corp Travel control method for vehicle
JP2000209570A (en) * 1999-01-20 2000-07-28 Toshiba Corp Moving object monitor
JP2001067483A (en) * 1999-08-30 2001-03-16 Katsuyoshi Kawasaki Method of objectivety evaluating severity of movement disorder
JP2001216519A (en) * 2000-02-04 2001-08-10 Fujitsu Ltd Traffic monitor device
JP2001243447A (en) * 2000-02-28 2001-09-07 Kddi Corp Road area detecting device
JP2001309263A (en) * 2000-04-26 2001-11-02 Nippon Telegr & Teleph Corp <Ntt> Providing method for related action responding to satatus of object in video image, providing apparatus for the same and record media recording program of the method
JP2002329207A (en) * 2001-05-02 2002-11-15 Riosu Corp:Kk Moving body tracking display system
JP2003141551A (en) * 2001-10-31 2003-05-16 Toshiba Corp Method and system for calculating face direction
JP2005214914A (en) * 2004-02-02 2005-08-11 Fuji Heavy Ind Ltd Traveling speed detecting device and traveling speed detection method
JP2006098119A (en) * 2004-09-28 2006-04-13 Ntt Data Corp Object detector, object detection method, and object detection program
WO2006080367A1 (en) * 2005-01-28 2006-08-03 Olympus Corporation Particle group movement analysis system, and particle group movement analysis method and program
JPWO2005069222A1 (en) * 2004-01-15 2008-04-24 旭化成株式会社 Information recognition device, information recognition method, information recognition program, and alarm system
JP2008271329A (en) * 2007-04-23 2008-11-06 Nippon Telegr & Teleph Corp <Ntt> Institution monitoring device and institution monitoring method
JP2011039752A (en) * 2009-08-10 2011-02-24 Canon Inc Apparatus and processing image method
JP2012212407A (en) * 2011-03-31 2012-11-01 Sogo Keibi Hosho Co Ltd State determining device, state determining method and program
USRE44225E1 (en) 1995-01-03 2013-05-21 Prophet Productions, Llc Abnormality detection and surveillance system
USRE44527E1 (en) 1995-01-03 2013-10-08 Prophet Productions, Llc Abnormality detection and surveillance system
JP2016062207A (en) * 2014-09-17 2016-04-25 富士フイルム株式会社 Emergency detector, emergency detection system, emergency detection program, and emergency detecting method
JP2020098606A (en) * 2014-06-27 2020-06-25 日本電気株式会社 Abnormality detection device and abnormality detection method
CN112005247A (en) * 2020-06-12 2020-11-27 深圳盈天下视觉科技有限公司 People flow data monitoring system and people flow data display method and device thereof
JP2021034983A (en) * 2019-08-28 2021-03-01 Kddi株式会社 Program, server, system, terminal and method for estimating external factor information affecting video stream

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE44225E1 (en) 1995-01-03 2013-05-21 Prophet Productions, Llc Abnormality detection and surveillance system
USRE44527E1 (en) 1995-01-03 2013-10-08 Prophet Productions, Llc Abnormality detection and surveillance system
JPH08292871A (en) * 1995-04-21 1996-11-05 Nec Corp Image processing system
JPH0991595A (en) * 1995-09-25 1997-04-04 Mitsubishi Motors Corp Obstacle recognition device
JPH09130781A (en) * 1995-10-31 1997-05-16 Matsushita Electric Ind Co Ltd Broad area supervisory equipment
JPH10105712A (en) * 1996-09-30 1998-04-24 Mitsubishi Heavy Ind Ltd Moving body tracking device
JPH10124684A (en) * 1996-10-16 1998-05-15 Ricoh Co Ltd Image processing method and storage medium and image processor
JPH1144700A (en) * 1997-07-25 1999-02-16 Sanyo Electric Co Ltd Speed-measuring apparatus, automatic tracking system using the apparatus and display system for estimated arrival position
JP2000038121A (en) * 1998-07-22 2000-02-08 Mitsubishi Motors Corp Travel control method for vehicle
JP2000209570A (en) * 1999-01-20 2000-07-28 Toshiba Corp Moving object monitor
JP2001067483A (en) * 1999-08-30 2001-03-16 Katsuyoshi Kawasaki Method of objectivety evaluating severity of movement disorder
JP4595104B2 (en) * 1999-08-30 2010-12-08 勝義 川崎 Objective severity assessment method for movement disorders
US6810132B1 (en) 2000-02-04 2004-10-26 Fujitsu Limited Traffic monitoring apparatus
JP2001216519A (en) * 2000-02-04 2001-08-10 Fujitsu Ltd Traffic monitor device
JP2001243447A (en) * 2000-02-28 2001-09-07 Kddi Corp Road area detecting device
JP2001309263A (en) * 2000-04-26 2001-11-02 Nippon Telegr & Teleph Corp <Ntt> Providing method for related action responding to satatus of object in video image, providing apparatus for the same and record media recording program of the method
JP2002329207A (en) * 2001-05-02 2002-11-15 Riosu Corp:Kk Moving body tracking display system
JP2003141551A (en) * 2001-10-31 2003-05-16 Toshiba Corp Method and system for calculating face direction
JPWO2005069222A1 (en) * 2004-01-15 2008-04-24 旭化成株式会社 Information recognition device, information recognition method, information recognition program, and alarm system
JP2005214914A (en) * 2004-02-02 2005-08-11 Fuji Heavy Ind Ltd Traveling speed detecting device and traveling speed detection method
JP2006098119A (en) * 2004-09-28 2006-04-13 Ntt Data Corp Object detector, object detection method, and object detection program
WO2006080367A1 (en) * 2005-01-28 2006-08-03 Olympus Corporation Particle group movement analysis system, and particle group movement analysis method and program
JP4934018B2 (en) * 2005-01-28 2012-05-16 オリンパス株式会社 Particle swarm motion analysis system, particle swarm motion analysis method and program
JP2008271329A (en) * 2007-04-23 2008-11-06 Nippon Telegr & Teleph Corp <Ntt> Institution monitoring device and institution monitoring method
JP2011039752A (en) * 2009-08-10 2011-02-24 Canon Inc Apparatus and processing image method
JP2012212407A (en) * 2011-03-31 2012-11-01 Sogo Keibi Hosho Co Ltd State determining device, state determining method and program
JP2020098606A (en) * 2014-06-27 2020-06-25 日本電気株式会社 Abnormality detection device and abnormality detection method
US11106918B2 (en) 2014-06-27 2021-08-31 Nec Corporation Abnormality detection device and abnormality detection method
US11250268B2 (en) 2014-06-27 2022-02-15 Nec Corporation Abnormality detection device and abnormality detection method
JP2016062207A (en) * 2014-09-17 2016-04-25 富士フイルム株式会社 Emergency detector, emergency detection system, emergency detection program, and emergency detecting method
US9576467B2 (en) 2014-09-17 2017-02-21 Fujifilm Corporation Emergency detection device, emergency detection system, recording medium, and method therefor
JP2021034983A (en) * 2019-08-28 2021-03-01 Kddi株式会社 Program, server, system, terminal and method for estimating external factor information affecting video stream
CN112005247A (en) * 2020-06-12 2020-11-27 深圳盈天下视觉科技有限公司 People flow data monitoring system and people flow data display method and device thereof
WO2021248479A1 (en) * 2020-06-12 2021-12-16 深圳盈天下视觉科技有限公司 People counting data monitoring system, method and device for displaying people counting data thereof

Similar Documents

Publication Publication Date Title
JPH06266840A (en) Status detector for moving object
US10402985B2 (en) Collision prediction
JP4148281B2 (en) Motion capture device, motion capture method, and motion capture program
CN101406390B (en) Method and apparatus for detecting part of human body and human, and method and apparatus for detecting objects
CN110142785A (en) A kind of crusing robot visual servo method based on target detection
GB2431717A (en) Scene analysis
CN105447529A (en) Costume detection and attribute value identification method and system
CN112560741A (en) Safety wearing detection method based on human body key points
JP2017191501A (en) Information processing apparatus, information processing method, and program
JP6590609B2 (en) Image analysis apparatus and image analysis method
Iwasawa et al. Real-time, 3D estimation of human body postures from trinocular images
Wang et al. Real-time 3D human tracking for mobile robots with multisensors
Shalnov et al. Convolutional neural network for camera pose estimation from object detections
CN106970709A (en) A kind of 3D exchange methods and device based on holographic imaging
Shinmura et al. Estimation of Human Orientation using Coaxial RGB-Depth Images.
Weinrich et al. Appearance-based 3D upper-body pose estimation and person re-identification on mobile robots
Iwasawa et al. Human body postures from trinocular camera images
CN111144260A (en) Detection method, device and system of crossing gate
Amat et al. Stereoscopic system for human body tracking in natural scenes
CN114120368A (en) Target detection method and detection equipment
Du et al. A color projection for fast generic target tracking
Cielniak et al. Appearance-based tracking of persons with an omnidirectional vision sensor
Ko et al. Stereo camera-based intelligence surveillance system
Chen et al. An integrated sensor network method for safety management of construction workers
Yun et al. Development of action-recognition technology using LSTM based on skeleton data