JPH1125278A - Behavior analysis estimating device for traveling object and its waviness association processing method - Google Patents

Behavior analysis estimating device for traveling object and its waviness association processing method

Info

Publication number
JPH1125278A
JPH1125278A
Authority
JP
Japan
Prior art keywords
image
cross
images
correlation
behavior analysis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP17718197A
Other languages
Japanese (ja)
Other versions
JP3351460B2 (en)
Inventor
Hidetomo Sakaino
英朋 境野
Satoshi Suzuki
智 鈴木
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP17718197A priority Critical patent/JP3351460B2/en
Publication of JPH1125278A publication Critical patent/JPH1125278A/en
Application granted granted Critical
Publication of JP3351460B2 publication Critical patent/JP3351460B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Image Processing (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To improve the reliability of the cross-correlation coefficient and the estimation accuracy for rigid and non-rigid objects accompanied by gray-level change, by analyzing similarity based on the cross-correlation method while varying the mean gray level of the two images around the average gray level within each small block, finding the maximum similarity, and estimating a motion vector. SOLUTION: An image input part 11 inputs time-series images using a camera or the like, and an image storage part 12 stores them as a time series. A comparison part 13 divides each of two or more consecutive two-dimensional images into small blocks, extracts image feature quantities, and compares the feature quantities at different times. A corrected cross-correlation part 14 finds the maximum similarity by analyzing similarity based on the cross-correlation method while varying the mean gray level of the two images around the average gray level within each small block, estimates a motion vector, performs undulation (waviness) association processing when undulation is present in the gray levels, and analyzes and predicts the behavior of the traveling object.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to the field of detecting the motion of a moving object with a camera or radar and analyzing and predicting its behavior, and more particularly to the prediction of precipitation changes from weather radar echo images and the analysis and prediction of fluid behavior in fluid engineering.

[0002]

2. Description of the Related Art

Conventionally, when estimating the motion vector of a rigid object in an image, a model assuming almost no illumination change on the object is usually adopted. The detection methods center on optical flow and the cross-correlation method, as described, for example, in Dana H. Ballard, "Computer Vision" (Japan Computer Vision Association).

[0003] On the other hand, it can be said that there is no suitable definition of a motion vector for non-rigid objects. This is because attributes such as the contour and the gray values of an object change simultaneously even between consecutive images, so that an unambiguous correspondence cannot be established. For such complex objects, the cross-correlation method, which tracks statistical similarity, is often applied. For example, with weather radar echo images the cross-correlation method is applied even though the precipitation pattern changes non-rigidly.

[0004] In either case, when the motion vector of an object between images is estimated from the result of the cross-correlation method, it is assumed that the gray-value distribution of the object hardly changes between the images. For rigid objects, a model with almost no illumination change is assumed; for non-rigid objects, and for weather radar images in particular, a model without the generation and dissipation that accompany convection is assumed. In addition, the gray values of the object are assumed to be free of noise.

[0005]

Problems to Be Solved by the Invention

However, the gray-value distributions of both rigid and non-rigid objects do in fact change, so a prediction based on a behavior analysis that assumes an unchanging gray-value distribution naturally involves an error with respect to the actual change.

[0006] An object of the present invention is to provide a behavior analysis and prediction device, and an undulation association processing method therefor, that improve the reliability of the cross-correlation coefficient and the prediction accuracy for rigid and non-rigid objects accompanied by gray-value change.

[0007]

Means for Solving the Problems

The behavior analysis and prediction device for a moving object according to the present invention uses images of the moving object and comprises: image input means for inputting images; image storage means for storing the input images as a time series; comparison means for dividing each of two or more consecutive two-dimensional images stored in the image storage means into small blocks, extracting image feature quantities by image processing, and comparing the image feature quantities at different times; corrected cross-correlation means for analyzing similarity based on the cross-correlation method while varying the average gray value of the two images around the mean gray value within each small block obtained by the comparison means, finding the point of maximum similarity, and estimating a motion vector, thereby enabling estimation of a motion vector corrected by associating undulations when undulation is present in the gray values; and output means for outputting the analyzed results.

[0008] The undulation association processing method of the behavior analysis and prediction device of the present invention comprises: a step of inputting images of the moving object in time series through the image input means; a step of storing the input time-series images in the image storage means; a step in which the comparison means extracts two or more consecutive two-dimensional images from the plurality of images stored in the image storage means, divides each extracted image into small blocks, extracts image feature quantities by image processing, and compares the image feature quantities at different times; a step in which the corrected cross-correlation means analyzes similarity based on the cross-correlation method while varying the average gray value of the two images around the mean gray value within each small block obtained by the comparison means, and finds the point of maximum similarity to estimate a motion vector, applying a correction by undulation association processing when undulation is present in the gray values; and a step of outputting the results analyzed by the corrected cross-correlation means through the output means.

[0009] In the step of obtaining the image feature quantities, it is preferable to take the difference between the images and extract it as the spatial frequency components of the temporal variation of the gray values; in the step of estimating the motion vector, it is preferable to associate those periodic components with the gray-value variation along the time axis and to perform the cross-correlation analysis while correcting the average gray values of the correlated images accordingly.

[0010] It is preferable that the step of estimating the motion vector in the corrected cross-correlation means includes a step of linearly predicting the motion vector by linear extrapolation, using as the initial vector the motion vector obtained by analyzing the similarity between the two images, and a step of predicting fluid variation by substituting it into the advection term of a fluid equation.

[0011] In the step of comparing the image feature quantities, it is preferable to apply a smoothing process and an isolated-point removal process to each image as preprocessing.

[0012] The gray-value distribution of a fluid object observed by radar or camera is subject to various kinds of noise, and it is quite conceivable that the true gray-value distribution is disturbed by it. In particular, rainfall phenomena depend on the strength of convective change, and the gray values (true values) themselves fluctuate greatly as image feature quantities. Therefore, when a motion vector is estimated through statistical similarity analysis by the cross-correlation method, it is appropriate to use the result of the cross-correlation method while correcting the gray values. When an undulation of the gray values due to convection is found, its spatial frequency components are analyzed, and the gray values are corrected between the correlated images while the periodic components are associated with the gray-value variation along the time axis.

[0013]

DESCRIPTION OF THE EMBODIMENTS

Next, an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is an explanatory diagram of a moving-object behavior analysis and prediction device provided with the undulation association processing method of the embodiment, and of the processing method itself: (a) is a block diagram of the device, and (b) is a flowchart of the undulation association processing method. In the figure, reference numeral 11 denotes an image input unit, 12 an image storage unit, 13 a comparison unit, 14 a corrected cross-correlation unit, 15 an output unit, and S101 to S111 the processing steps.

[0014] The behavior analysis and prediction device of this embodiment comprises: an image input unit 11 that inputs time-series images using a camera or radar; an image storage unit 12 that stores the input images as a time series; a comparison unit 13 that, between two or more consecutive two-dimensional images stored in the image storage unit 12, divides each image into small blocks, extracts image feature quantities by image processing, and compares the image feature quantities at different times; a corrected cross-correlation unit 14 that analyzes similarity based on the cross-correlation method while varying the average gray value of the two images around the mean gray value within each small block obtained by the comparison unit 13, finds the point of maximum similarity, and estimates a motion vector, thereby performing undulation association processing when undulation is present in the gray values and carrying out the analysis and prediction of the behavior of the moving object; and an output unit 15 that outputs the results analyzed by the corrected cross-correlation unit 14.

[0015] In the undulation association processing method of this embodiment, when processing starts (S101), images of the moving object are input in time series by the image input unit 11 using a camera or radar (S102), and the input time-series images are stored in the image storage unit 12 (S103).

[0016] The comparison unit 13 extracts two or more consecutive two-dimensional images from the images stored in the image storage unit 12 (S104), divides each extracted image into small blocks (S105), extracts image feature quantities by image processing (S106), and compares the image feature quantities at different times (S107).

[0017] The corrected cross-correlation unit 14 applies an undulation correction by varying the average gray value of the two images around the mean gray value within each small block obtained by the comparison unit 13, analyzes the similarity based on the cross-correlation method (S108), and finds the point of maximum similarity to estimate the motion vector (S109). By analyzing the similarity while varying the average gray values of the two images, undulation association processing is performed and the result is corrected when undulation is present in the gray values.
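As an illustration only, and not the patent's actual implementation, the block-matching steps S108 and S109 with mean-gray-level compensation might be sketched as follows; all function names, the block size, and the search range are hypothetical choices:

```python
import numpy as np

def corrected_cross_correlation(block_a, block_b):
    """Cross-correlation coefficient after equalizing the mean gray values of
    the two blocks (one reading of the 'undulation' correction in the text)."""
    a = block_a - block_a.mean()  # remove each block's average gray level so
    b = block_b - block_b.mean()  # a uniform offset cannot depress the score
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def estimate_motion_vector(img_t0, img_t1, top, left, size=16, search=4):
    """Find the displacement of the block at (top, left) in img_t0 that
    maximizes the corrected correlation within a +/-search window in img_t1."""
    block = img_t0[top:top + size, left:left + size]
    best, best_vec = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > img_t1.shape[0] or x + size > img_t1.shape[1]:
                continue
            c = corrected_cross_correlation(block, img_t1[y:y + size, x:x + size])
            if c > best:
                best, best_vec = c, (dy, dx)
    return best_vec, best
```

Because each block is centered on its own mean before correlating, a pattern that merely brightens between frames still matches its displaced copy with a coefficient near 1.0.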

[0018] The analyzed results are output by the output unit 15 (S110), and the processing ends (S111).

[0019] A supplementary explanation of the extraction of two or more two-dimensional images in S104: in principle, image processing such as displacement detection can be performed using only two images at different times. However, because actual precipitation patterns often change in a complex and non-stationary way, it is desirable to include averaging. As one such method, for example, the six most recent consecutive frames are extracted, the first three and the last three of the six are each averaged once, and image processing such as displacement detection is performed on the first-half average and the second-half average.
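The six-frame averaging described above can be sketched as follows; this is a minimal illustration, and the function name is an assumption:

```python
import numpy as np

def half_averaged_pair(frames):
    """Given the six most recent frames (oldest first), return the average of
    the first three and of the last three, as described in the text."""
    frames = np.asarray(frames, dtype=float)
    assert frames.shape[0] == 6, "expects exactly six consecutive frames"
    first_half = frames[:3].mean(axis=0)   # average of frames t-5 .. t-3
    second_half = frames[3:].mean(axis=0)  # average of frames t-2 .. t
    return first_half, second_half
```

The two averaged images would then take the place of the two raw frames in the block matching, damping non-stationary fluctuations before displacement detection.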

[0020] When obtaining the image feature quantities in S106, the difference between the images to be compared is taken and extracted as the spatial frequency components of the temporal variation of the gray values; the periodic components are associated with the gray-value variation along the time axis, and the cross-correlation analysis in S108 is performed while the average gray values of the correlated images are corrected accordingly.
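The patent does not specify how the spatial frequency components are computed; a 2-D FFT of the frame difference is one plausible reading of S106. The sketch below, with hypothetical names throughout, finds the dominant non-DC spatial frequency of the difference image:

```python
import numpy as np

def difference_spectrum(img_t0, img_t1):
    """Spatial-frequency magnitude spectrum of the frame difference; a
    plausible reading of S106, not the patent's exact formulation."""
    diff = img_t1.astype(float) - img_t0.astype(float)
    return np.abs(np.fft.fft2(diff))

def dominant_frequency(img_t0, img_t1):
    """Return the (ky, kx) indices of the strongest non-DC component of the
    difference image, folded to signed frequencies."""
    spec = difference_spectrum(img_t0, img_t1)
    spec[0, 0] = 0.0  # ignore the DC component (the mean gray-level offset)
    ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
    ny, nx = spec.shape
    # fold frequencies above Nyquist back to their negative counterparts
    ky = ky if ky <= ny // 2 else ky - ny
    kx = kx if kx <= nx // 2 else kx - nx
    return ky, kx
```

Tracking this dominant periodic component from frame to frame is one way the undulation of the gray values could be associated along the time axis.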

[0021] In the comparison of image feature quantities at different times in S107, a smoothing process and an isolated-point removal process are applied to each image as preprocessing.
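The patent does not name the particular filters; a 3x3 box filter for smoothing and a 3x3 median filter for isolated-point removal are common choices and are used in this illustrative sketch:

```python
import numpy as np

def smooth(image):
    """3x3 box-filter smoothing (edges handled by edge padding)."""
    h, w = image.shape
    padded = np.pad(image.astype(float), 1, mode="edge")
    out = np.zeros((h, w), dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + h, dx:dx + w]
    return out / 9.0

def remove_isolated_points(image):
    """3x3 median filter, a common choice for isolated-point removal."""
    h, w = image.shape
    padded = np.pad(image.astype(float), 1, mode="edge")
    stacked = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    return np.median(stacked, axis=0)
```

A single noisy pixel surrounded by background is replaced by the neighborhood median and therefore vanishes, while the box filter suppresses high-frequency speckle before the block comparison.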

[0022] The processing in the corrected cross-correlation unit 14 includes a step of linearly predicting the motion vector by linear extrapolation, using as the initial vector the motion vector obtained in S109 by analyzing the similarity between the two images, and a step of predicting fluid variation by substituting it into the advection term of a fluid equation.

[0023] FIG. 2 shows examples of changes in time-series patterns with gray-value variation: (a) is a first example and (b) is a second example. Visually, a clear pattern can be seen in both examples, yet when the cross-correlation coefficient of each pair is computed it comes out considerably small. This is because the correlation function takes the gray values themselves as its variables: when the average gray levels differ greatly, the correlation coefficient becomes low regardless of the similarity of the contour shapes.

[0024] FIG. 3 is a graph of the average gray-value distributions of two consecutive images at different times and of the change in the cross-correlation coefficient before and after correction. Even when the contour shapes are similar, the similarity of the gray values can be low. Over time, the change in the average gray value differs even between consecutive images, and the cross-correlation coefficient falls to 0.5 or below.

[0025] When comparing the images at the two times, normalizing them so that their average gray values become close before taking the cross-correlation brings the cross-correlation coefficient close to 1.0, giving a result close to what is obtained by visual correspondence. That is, the estimation accuracy of the motion vector improves, and with it the prediction accuracy.
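A small numeric illustration of this effect, under the assumption that the uncorrected measure correlates the raw gray values (both functions and the test data are hypothetical):

```python
import numpy as np

def raw_similarity(a, b):
    """Non-centered correlation (cosine similarity of the raw gray values),
    which drops when the two mean gray levels differ."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def normalized_similarity(a, b):
    """Correlation after equalizing the mean gray values (subtracting each
    image's own mean), as the text proposes."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

For the same zero-mean pattern with and without a constant brightness offset of 5, the raw measure falls to roughly 0.2 while the mean-equalized measure stays at 1.0, which matches the qualitative behavior described for FIG. 3.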

[0026] FIG. 4 is a graph of the evaluation results obtained when the motion vector estimation method of the present invention is applied to weather radar echo images to predict changes in the precipitation pattern. Comparing the cases with and without the present invention, it can be seen that the prediction accuracy is improved when the present invention is applied.

[0027] In this case, both methods predict the change in the precipitation pattern by a method based on equation (1).

[0028] In equation (1), I(x, y, t) is the gray value of the image at position (x, y) and time t, λ is the diffusion coefficient, ε is the dissipation constant, and

[0029]

[External character 1] is the advection vector, into which the motion vector estimated by the present invention is substituted.
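The equation images themselves ([External character 1] and the numbered equations) are not reproduced in this text. From the surrounding description alone (a time term on the left; diffusion, advection, source, and sink terms on the right; λ a diffusion coefficient, ε a dissipation constant, and the estimated motion field supplying the advection vector u), one plausible, purely illustrative form of equation (1) is:

```latex
\frac{\partial I}{\partial t}
  = \lambda \nabla^{2} I
  - (\boldsymbol{u} \cdot \nabla) I
  + q^{+}(x, y, t)
  - q^{-}(x, y, t)
```

Here q+ and q- stand for the source and sink terms obtained from the positive and negative regions of the binarized difference image; the dissipation constant ε may scale the sink or enter as a separate decay term such as -εI, but its exact placement is not recoverable from this text.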

[0030] In the comparative evaluation experiment, prediction accuracy was compared between the case where the corrected cross-correlation method was applied and the case where the motion vectors estimated by the uncorrected cross-correlation method were interpolated to all pixels by linear interpolation and then supplied. The left-hand side of equation (1) is the time term; the terms on the right-hand side are, from the first, the diffusion term, the advection term, the source term, and the sink term. Rewriting equation (1) as

[0031]

[Equation 1]

the pixels of the image are discretized, the time term is replaced by a forward difference, and the time integration is continued for the required number of prediction steps.

[0032]

[Equation 2]

That is, as a result of the time integration, I(x, y, t) is updated, and this updated field is itself the predicted precipitation pattern. For the source and sink terms, consecutive images are binarized and the positive and negative regions of their difference image are assigned to the source and sink, respectively. For the diffusion coefficient, a linear table relating area change to the diffusion coefficient is compiled statistically in advance.
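Since the discretized equations are not reproduced here, the forward-difference time integration can only be sketched under an assumed advection-diffusion form (diffusion, advection, source, and sink terms, as the text describes); all names and the central-difference discretization are illustrative choices:

```python
import numpy as np

def laplacian(field):
    """Five-point discrete Laplacian with edge padding."""
    p = np.pad(field, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            - 4.0 * field)

def advect_diffuse_step(I, u, v, lam, src, snk, dt=1.0):
    """One forward-difference (forward-Euler) update of the gray-value field.
    The exact equation (1) is not reproduced in the text, so this form is an
    assumption; u and v are the estimated motion-vector components per pixel."""
    p = np.pad(I, 1, mode="edge")
    dIdx = (p[1:-1, 2:] - p[1:-1, :-2]) / 2.0  # central spatial differences
    dIdy = (p[2:, 1:-1] - p[:-2, 1:-1]) / 2.0
    advection = u * dIdx + v * dIdy
    return I + dt * (lam * laplacian(I) - advection + src - snk)

def predict(I, u, v, lam, src, snk, steps):
    """Continue the time integration for the required number of prediction steps."""
    for _ in range(steps):
        I = advect_diffuse_step(I, u, v, lam, src, snk)
    return I
```

A uniform field with zero sources and sinks is a fixed point of this update, and an isolated peak spreads to its neighbors under the diffusion term, which is the qualitative behavior the time integration is meant to produce.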

[0033] The numerical simulation showed that when the precipitation pattern is predicted using equation (1) with the advection vector (motion vector) estimated from the corrected cross-correlation coefficients, the prediction error of the gray values, which grows as the prediction time elapses, is kept lower than when the uncorrected cross-correlation coefficients are used.

[0034]

EFFECTS OF THE INVENTION

As described above, according to the present invention, when associating moving objects accompanied by continuous gray-value change, computing the cross-correlation coefficient after normalizing the images at different times so that their average gray values become equal improves the reliability of the association and thus the estimation accuracy of the motion vector. Consequently, when prediction is performed on the basis of the motion estimated in this way, the prediction accuracy is also improved.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an explanatory diagram of a moving-object behavior analysis and prediction device provided with the undulation association processing method of the embodiment of the present invention, and of the processing method itself: (a) is a block diagram of the device; (b) is a flowchart of the undulation association processing method.

FIG. 2 shows examples of changes in time-series patterns with gray-value variation: (a) is a first example; (b) is a second example.

FIG. 3 is a graph of the average gray-value distributions of two consecutive images at different times and of the change in the cross-correlation coefficient before and after correction.

FIG. 4 is a graph of the evaluation results obtained when the motion vector estimation method of the present invention is applied to weather radar echo images to predict changes in the precipitation pattern.

EXPLANATION OF SYMBOLS

11: image input unit; 12: image storage unit; 13: comparison unit; 14: corrected cross-correlation unit; 15: output unit; S101 to S111: processing steps.

Continued from the front page: (51) Int.Cl.6 identification symbols / FI: G06T 1/00; G06F 15/62 385; // G01S 13/95

Claims (5)

[Claims]

Claim 1. A behavior analysis and prediction device using images of a moving object, comprising: image input means for inputting images; image storage means for storing the input images as a time series; comparison means for dividing each of two or more consecutive two-dimensional images stored in the image storage means into small blocks, extracting image feature quantities by image processing, and comparing the image feature quantities at different times; corrected cross-correlation means for analyzing similarity based on the cross-correlation method while varying the average gray value of the two images around the mean gray value within each small block obtained by the comparison means, finding the point of maximum similarity, and estimating a motion vector, thereby enabling estimation of a motion vector corrected by associating undulations when undulation is present in the gray values; and output means for outputting the analyzed results.
Claim 2. An undulation association processing method for a behavior analysis and prediction device using images of a moving object, comprising: a step of inputting the images of the moving object in time series through image input means; a step of storing the input time-series images in image storage means; a step in which comparison means extracts two or more consecutive two-dimensional images from the plurality of images stored in the image storage means, divides each extracted image into small blocks, extracts image feature quantities by image processing, and compares the image feature quantities at different times; a step in which corrected cross-correlation means analyzes similarity based on the cross-correlation method while varying the average gray value of the two images around the mean gray value within each small block obtained by the comparison means, and finds the point of maximum similarity to estimate a motion vector, applying a correction by undulation association processing when undulation is present in the gray values; and a step of outputting the results analyzed by the corrected cross-correlation means through output means.
Claim 3. The undulation association processing method of the behavior analysis and prediction device according to claim 2, wherein the step of obtaining the image feature quantities takes the difference between the images and extracts it as spatial frequency components of the temporal variation of the gray values, and the step of estimating the motion vector associates those periodic components with the gray-value variation along the time axis and performs the cross-correlation analysis while correcting the average gray values of the correlated images accordingly.
4. The undulation association processing method of the behavior analysis prediction device according to claim 2, wherein the step of estimating the movement vector in the corrected cross-correlation means includes a step of linearly predicting the movement vector by linear extrapolation, using as an initial vector the movement vector obtained by analyzing the similarity between two images, and a step of predicting fluid variation by applying the vector to the advection term of a fluid equation.
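The prediction step of claim 4 can be sketched in two parts: a linear extrapolation of the motion vector from its two most recent estimates, and a crude advection step that transports the gray-value field by that vector. Both functions are assumed names for illustration; the integer-shift advection is a stand-in for the advection term of a fluid equation, not the claimed fluid model.

```python
import numpy as np

def extrapolate_vectors(v_prev, v_curr):
    """Linearly extrapolate the next motion vector from the two most recent
    estimates: v_next = v_curr + (v_curr - v_prev)."""
    return 2.0 * np.asarray(v_curr) - np.asarray(v_prev)

def advect(field, vec):
    """One explicit advection step: shift the field by an integer-rounded
    displacement (a crude stand-in for the advection term u . grad(f))."""
    dy, dx = (int(round(v)) for v in vec)
    return np.roll(np.roll(field, dy, axis=0), dx, axis=1)
```

With estimates (1, 2) and (3, 5) from two image pairs, the extrapolated vector is (5, 8), which would then transport the predicted field.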
5. The undulation association processing method of the behavior analysis prediction device according to claim 2, wherein in the step of comparing the image feature values, a smoothing process and an isolated point removal process are applied to each image as preprocessing.
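The two preprocessing steps of claim 5 are commonly realized as a mean filter (smoothing) and a median filter (isolated point removal). The sketch below implements both over a 3x3 neighborhood in plain numpy; the helper and function names are assumptions for illustration.

```python
import numpy as np

def _neighborhood_stack(img):
    """Stack the 3x3 neighborhood of every pixel (edges padded by replication)."""
    p = np.pad(img, 1, mode='edge')
    return np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                     for i in range(3) for j in range(3)])

def smooth(img):
    """3x3 mean filter: the smoothing preprocessing step."""
    return _neighborhood_stack(img).mean(axis=0)

def remove_isolated_points(img):
    """3x3 median filter: a single outlier pixel is replaced by the median of
    its neighborhood, so isolated points vanish."""
    return np.median(_neighborhood_stack(img), axis=0)
```

A lone bright pixel in an otherwise flat image is spread out by the mean filter but eliminated entirely by the median filter, which is why the two are applied together before block comparison.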
JP17718197A 1997-07-02 1997-07-02 Apparatus for predicting and analyzing behavior of a moving object and a method for associating the undulation with the apparatus Expired - Fee Related JP3351460B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP17718197A JP3351460B2 (en) 1997-07-02 1997-07-02 Apparatus for predicting and analyzing behavior of a moving object and a method for associating the undulation with the apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP17718197A JP3351460B2 (en) 1997-07-02 1997-07-02 Apparatus for predicting and analyzing behavior of a moving object and a method for associating the undulation with the apparatus

Publications (2)

Publication Number Publication Date
JPH1125278A true JPH1125278A (en) 1999-01-29
JP3351460B2 JP3351460B2 (en) 2002-11-25

Family

ID=16026605

Family Applications (1)

Application Number Title Priority Date Filing Date
JP17718197A Expired - Fee Related JP3351460B2 (en) 1997-07-02 1997-07-02 Apparatus for predicting and analyzing behavior of a moving object and a method for associating the undulation with the apparatus

Country Status (1)

Country Link
JP (1) JP3351460B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009162611A (en) * 2008-01-07 2009-07-23 Mitsubishi Electric Corp Radar system
JP2012154731A (en) * 2011-01-25 2012-08-16 Panasonic Corp Positioning information formation device, detector, and positioning information formation method
US8983130B2 (en) 2011-01-25 2015-03-17 Panasonic Intellectual Property Management Co., Ltd. Positioning information forming device, detection device, and positioning information forming method
CN111415450A (en) * 2019-01-08 2020-07-14 深圳怡化电脑股份有限公司 Bill altering detection method and detection device and computer storage medium
KR20210032210A (en) * 2019-09-16 2021-03-24 한국에너지기술연구원 Apparatus for calculating cloud motion vector and method thereof
KR102411506B1 (en) * 2020-12-15 2022-06-21 부산대학교 산학협력단 Apparatus and method for deriving deriving tropospheric ozone motion vector

Also Published As

Publication number Publication date
JP3351460B2 (en) 2002-11-25

Similar Documents

Publication Publication Date Title
US8045783B2 (en) Method for moving cell detection from temporal image sequence model estimation
US8019124B2 (en) Robust camera pan vector estimation using iterative center of mass
Badenas et al. Motion-based segmentation and region tracking in image sequences
US8396285B2 (en) Estimating vanishing points in images
US9785825B2 (en) Method for estimating disparity search range to which multi-level disparity image division is applied, and stereo image matching device using the same
CN109472820B (en) Monocular RGB-D camera real-time face reconstruction method and device
CN106373128B (en) Method and system for accurately positioning lips
WO2006081018A1 (en) Object-of-interest image capture
CN106296729A (en) The REAL TIME INFRARED THERMAL IMAGE imaging ground moving object tracking of a kind of robust and system
Little et al. Structural lines, tins, and dems
JP3351460B2 (en) Apparatus for predicting and analyzing behavior of a moving object and a method for associating the undulation with the apparatus
JP4836065B2 (en) Edge tracking method and computer program therefor
CN112084855B (en) Outlier elimination method for video stream based on improved RANSAC method
KR20120094102A (en) Similarity degree calculation device, similarity degree calculation method, and program
CN100574371C (en) A kind of smoothing method of digital image limit
JP2005165969A (en) Image processor and method
JP3351461B2 (en) Rainfall maximum likelihood estimator
KR100994366B1 (en) Method for tracking a movement of a moving target of image tracking apparatus
Han et al. An efficient estimation method for intensity factor of illumination changes
JP3377079B2 (en) Flow velocity estimation device
Chang et al. Real time hand tracking based on active contour model
Koljonen et al. Dynamic template size control in digital image correlation based strain measurements
JP5152144B2 (en) Image processing device
Amintoosi et al. Precise image registration with structural similarity error measurement applied to superresolution
JP2003256849A (en) Object extracting system and method and its program

Legal Events

Date Code Title Description
FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080920

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090920

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100920

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110920

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120920

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130920

Year of fee payment: 11

LAPS Cancellation because of no payment of annual fees