JPS59208983A - Automatic tracing camera - Google Patents

Automatic tracing camera

Info

Publication number
JPS59208983A
JPS59208983A JP58082757A JP8275783A
Authority
JP
Japan
Prior art keywords
signal
image
picture
subject
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP58082757A
Other languages
Japanese (ja)
Other versions
JPH0614698B2 (en)
Inventor
Hiroyasu Otsubo
宏安 大坪
Toshio Murakami
敏夫 村上
Toshiro Kinugasa
敏郎 衣笠
Yoshiaki Mochimaru
持丸 芳明
Noritaka Narita
成田 徳孝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Image Information Systems Inc
Hitachi Ltd
Original Assignee
Hitachi Ltd
Hitachi Video Engineering Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd, Hitachi Video Engineering Co Ltd filed Critical Hitachi Ltd
Priority to JP58082757A priority Critical patent/JPH0614698B2/en
Publication of JPS59208983A publication Critical patent/JPS59208983A/en
Publication of JPH0614698B2 publication Critical patent/JPH0614698B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/785Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
    • G01S3/7864T.V. type tracking systems

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

PURPOSE: To realize an automatic tracking camera that is greatly simplified in constitution, small-sized and inexpensive, by detecting the difference between pictures taken at a prescribed time interval, calculating the center of gravity of the difference between the pictures, and detecting the amount and direction of movement of the center of gravity relative to the image pickup screen of the subject picture so as to control the image pickup device. CONSTITUTION: A picture signal from a camera circuit 3 is applied from an input terminal 6 to a separating circuit 7, where it is separated into a luminance signal and a chrominance signal. The picture signal from the separating circuit 7 is applied to an analog-to-digital converting circuit 8, which digitizes it by sampling at M1×M2 sampling points per frame. A subtraction circuit 10 subtracts the stored picture signal of one frame from the picture signal of another frame a prescribed time interval later, generates a picture difference signal, and stores the result in a frame memory 11. An arithmetic unit 12 uses the supplied picture difference signal to calculate the center of gravity of the picture represented by it, taking a reference point of the picked-up screen as the origin, and generates a control signal that is applied to a control section 5.

Description

DETAILED DESCRIPTION OF THE INVENTION

[Field of the Invention]

The present invention relates to an automatic tracking camera capable of automatically tracking and imaging a target subject.

[Background of the Invention]

Conventionally, in order to observe a moving object (hereinafter referred to as the subject) continuously, automatic tracking cameras are known which automatically change the orientation of the imaging device, that is, the camera direction, as the subject moves, so as to track and image the subject. In such an automatic tracking camera the subject must first be recognized. For this purpose, image-processing techniques have conventionally been used to perform pattern recognition on the image obtained by imaging the subject (hereinafter referred to as the subject image): patterns representing features of the subject, such as its shape and color, are extracted, and the subject is recognized by comparing these patterns with preset reference patterns.

To perform such pattern recognition, however, a large computer must be used and the processing must be carried out by enormous programs, so that conventional automatic tracking cameras have inevitably been large-scale and extremely expensive.

In recent years, meanwhile, video cameras have spread remarkably together with video tape recorders, and have come to be widely used for both industrial and home applications.

Moreover, such video cameras are used not only by users who operate them directly to image a desired subject, but also as sensors in monitoring equipment. When a video camera is used as a sensor in this way, fixing the camera to monitor a fixed field of view has its own significance; in certain unattended monitoring applications, however, where the subject to be monitored moves and the angle of view of the video camera is limited, the video camera must be given an automatic tracking function for tracking and imaging the moving subject.

As noted above, however, conventional automatic tracking cameras are large-scale and extremely expensive, so that it has been practically impossible to provide industrial or home video cameras with an automatic tracking function.

[Object of the Invention]

The object of the present invention is to eliminate the above drawbacks of the prior art and to provide a small-sized, inexpensive automatic tracking camera of greatly simplified configuration.

[Constitution of the Invention]

To achieve this object, the present invention is characterized in that the difference between images taken at a fixed time interval is detected, the centroid of the inter-image difference is calculated, the amount and direction of movement of the subject image relative to the imaging screen are detected from the amount and direction of movement of this centroid, and the imaging device is controlled accordingly, so that the subject image can be set within a reference region of the imaging screen.

Embodiments of the present invention will now be described with reference to the drawings.

FIG. 1 is a block diagram showing one embodiment of the automatic tracking camera according to the present invention, in which 1 is a subject, 2 is an imaging device, 3 is a camera circuit, 4 is a signal processing section, and 5 is a control section.

In FIG. 1, the imaging device 2 images the moving subject 1 and generates an image signal. This image signal is supplied to the camera circuit 3, where it undergoes predetermined processing such as amplification, and is then supplied to the signal processing section 4.

The signal processing section 4 processes the supplied image signal and generates a control signal representing the amount and direction of movement of the image of the subject 1, that is, the subject image, on the imaging screen of the imaging device 2 (the field-of-view screen projected onto the imaging surface). This control signal is supplied to the control section 5, which controls the imaging device 2 so as to change its camera direction and track the subject so that the subject image stays within a predetermined region (for example, the central portion) of the imaging screen.

FIG. 2 is a block diagram showing a specific example of the signal processing section of FIG. 1, in which 6 is an input terminal, 7 is a separation circuit, 8 is an analog-to-digital conversion circuit, 9 is a frame memory, 10 is a subtraction circuit, 11 is a frame memory, 12 is an arithmetic unit, and 13 is an output terminal.

In FIG. 2, the image signal from the camera circuit 3 (FIG. 1) is supplied through the input terminal 6 to the separation circuit 7, where it is separated into a luminance signal and a chrominance signal. The separation circuit 7 is provided because the image signal obtained from the imaging device 2 (FIG. 1) is generally a color video signal; however, since the luminance signal and the chrominance signal undergo the same processing, and their separate treatment has no particular significance for the present invention, they are referred to below simply as the image signal.

The image signal from the separation circuit 7 is supplied to the analog-to-digital conversion circuit 8, which digitizes the image signal by sampling it at M1 × M2 sampling points per frame.

That is, as shown in FIG. 3(A), M1 sampling points are set on each of M2 scanning lines 14 of the frame, and the image signal is digitized at each sampling point. This can be done by sampling and digitizing the image signal on scanning lines spaced 1/M2 of the total number of scanning lines of one frame apart, at intervals of 1/M1 of the scanning-line period.

Now, let the position of a sampling point 15 be (x, y), where x and y are positive integers with 1 ≤ x ≤ M1 and 1 ≤ y ≤ M2, the sampling point in the upper-left corner being at (1, 1) and that in the lower-right corner at (M1, M2). The digitized image signal can then be expressed as Y(x, y).

The image signal from the analog-to-digital conversion circuit 8 is supplied to the subtraction circuit 10 and to the frame memory 9. The frame memory 9 stores one frame of the image signal, and the subtraction circuit 10 performs a subtraction between the frame stored in the frame memory 9 and the image signal of another frame a fixed time interval later, generating an image difference signal that is stored in the frame memory 11.
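As an illustrative sketch of this stage, the roles of the frame memory 9 and the subtraction circuit 10 can be modeled on arrays of sampled values; the array shape and data types here are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def image_difference(stored_frame, later_frame):
    """Model of subtraction circuit 10: subtract the frame held in
    frame memory 9 from a frame captured a fixed time interval later.
    Frames are M2 x M1 arrays of sampled values Y(x, y)."""
    # Cast to a signed type so the subtraction cannot wrap around.
    return later_frame.astype(np.int16) - stored_frame.astype(np.int16)
```

For a stationary background the difference is zero everywhere, so only the parts of the image changed by the subject's movement survive in the difference signal.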

Since the subject 1 (FIG. 1) moves, the position of the subject image on the imaging screen changes from moment to moment. The image signal of the frame stored in the frame memory 9 and the image signal of a frame a fixed time interval later therefore have different waveforms, and the image difference signal obtained from the subtraction circuit 10 corresponds to the difference between the images of these two frames caused by the movement of the subject 1.

That is, in FIG. 4, successive frames are obtained by the imaging scan of the imaging device 2. Consider N frames F1, ..., FN−2, FN−1, FN, and let A be the subject image and B the image of a stationary object. When the image of frame F1 is subtracted from the image of frame FN, the image of the stationary object B cancels out, and an image ΔF is obtained consisting only of the difference image A′ between the subject image A in frame FN and the subject image A in frame F1. The image difference signal stored in the frame memory 11 corresponds to this image ΔF. Writing the image signal of frame F1 as Y1(x, y) and that of frame FN as Yn(x, y), the image difference signal is ΔY(x, y) = Yn(x, y) − Y1(x, y), where x and y are as defined in the description of FIG. 3(A).

In this case, the image difference signal from the subtraction circuit 10 is stored in the frame memory 11 as a signal representing the magnitude of the difference.

The image difference signal stored in the frame memory 11 is read out and supplied to the arithmetic unit 12, which consists of a microcomputer or the like.

The arithmetic unit 12 first uses the supplied image difference signal to calculate the centroid of the image represented by the image difference signal, taking a reference point of the imaging screen (for example, its center point) as the origin.

To this end, the image difference signal ΔY(x, y) is first binarized, being set to 1 where it is larger than a predetermined threshold and to 0 where it is smaller. Denoting the binarized image difference signal by j(x, y), the centroid position (xG, yG) is calculated by

xG = Σ x · j(x, y) / Σ j(x, y),  yG = Σ y · j(x, y) / Σ j(x, y),

where the sums are taken over all sampling points x = 1, ..., M1 and y = 1, ..., M2.

The relationship between the amount and direction of movement of the subject image on the imaging screen and the amount and direction of movement of the centroid of the image represented by the image difference signal accompanying that movement is as follows.

1) The amount of movement of the centroid is one half of the amount of movement of the subject image.

2) The direction of movement of the centroid coincides with the direction of movement of the subject image.

Accordingly, the amount and direction of movement of the subject image can be detected from the amount and direction of movement of the centroid.

Now, let the position of the center point of the imaging screen be (x0, y0), and consider coordinates whose origin is this center point (x0, y0). Suppose that the image signal of frame F1 (FIG. 4), obtained when the subject image was set in the central region of the imaging screen containing this origin (hereinafter referred to as the reference region), is stored in the frame memory 9 (FIG. 2). If the position of the centroid of the difference between the image of frame FN (FIG. 4) a fixed time later and the image of frame F1 is (xG, yG), then the centroid has moved by (xG − x0) horizontally and (yG − y0) vertically. The subject image has accordingly moved by 2 × (xG − x0) horizontally and 2 × (yG − y0) vertically. Here the horizontal direction is the direction of the scanning lines, and the vertical direction is the direction perpendicular to the scanning lines.
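Since the centroid of the difference image lies midway between the subject's old and new positions, the subject displacement is simply twice the centroid's offset from the screen origin; as a sketch:

```python
def subject_displacement(x_g, y_g, x0, y0):
    """From the centroid (x_G, y_G) of the difference image and the
    screen origin (x0, y0), the subject image has moved
    2*(x_G - x0) horizontally and 2*(y_G - y0) vertically."""
    return 2 * (x_g - x0), 2 * (y_g - y0)
```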

On the basis of the data obtained in this way, the arithmetic unit 12 generates a control signal, which is supplied from the output terminal 13 to the control section 5 (FIG. 1).

As a result, in FIG. 1, the imaging device 2 is controlled so as to change its camera direction until the subject image is set within the reference region of the imaging screen.

During this operation the signal processing section 4 suspends its processing. When the camera direction has been turned toward the subject and the subject image has been set in the reference region of the imaging screen, the signal processing section 4 resumes the processing operation described above.

In this way, measurement of the movement of the subject and control of the imaging device 2 are repeated alternately, and automatic tracking is thereby performed.

According to this embodiment, automatic tracking can be performed without using pattern-recognition techniques, so the apparatus can be built from a circuit for processing the image difference signal and a comparatively small arithmetic unit such as a microcomputer. Moreover, since only the change in the image accompanying the movement of the subject is processed, the amount of information to be processed is very small and the arithmetic processing can be carried out rapidly.

The frame memories used can also be of comparatively small capacity, so the response speed is improved and the apparatus is small, lightweight, and inexpensive.

The origin of the imaging screen need not be its center point; it can be set at an arbitrary point, so the subject image can be placed at any desired position by the automatic tracking.

FIG. 5 is an explanatory diagram of another embodiment of the automatic tracking camera according to the present invention, in which 16 is the imaging screen and 17 and 18 are angles of view.

This embodiment detects the movement of the subject by using two angles of view; its basic configuration is the same as that of FIGS. 1 and 2.

In FIG. 5, two angles of view 17 and 18 are assumed within the imaging screen 16 of the imaging device 2 (FIG. 1).

The center points of the angles of view 17 and 18 are made to coincide with the center point of the imaging screen 16, the smaller angle of view 17 is set as the reference region described above, and the angle of view 18 is set to a desired size. Automatic tracking is performed so that the subject image always stays within the angle of view 17. The angle of view 18 is used to determine whether or not the subject has moved: when an image difference signal arises in the region between the angle of view 17 and the angle of view 18 (hereinafter referred to as the movement determination region), it is determined that the subject has moved.
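The movement determination can be sketched as a mask test over the annular region between the two windows; the rectangular-window geometry and the half-width parameterization here are illustrative assumptions:

```python
import numpy as np

def subject_moved(j, inner, outer):
    """Return True when the binarized difference image j(x, y) has any
    '1' in the movement determination region: inside window 18 (outer)
    but outside window 17 (inner). Windows are half-widths (rx, ry)
    around the common center of the screen."""
    m2, m1 = j.shape
    cy, cx = (m2 - 1) / 2, (m1 - 1) / 2
    ys, xs = np.indices(j.shape)
    in_outer = (np.abs(xs - cx) <= outer[0]) & (np.abs(ys - cy) <= outer[1])
    in_inner = (np.abs(xs - cx) <= inner[0]) & (np.abs(ys - cy) <= inner[1])
    return bool(j[in_outer & ~in_inner].any())
```

Activity inside window 17 is ignored (the subject is allowed to move there), and activity outside window 18 is also ignored, which limits the processed region as described below.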

FIG. 6 is a flowchart showing the operation of this embodiment, which will now be described with reference to FIGS. 2, 5, and 6.

As described for the previous embodiment, first one frame of the image signal is stored in the frame memory 9 (step 19), and the image difference signal with respect to one frame a fixed time later is obtained by the subtraction circuit 10 and stored in the frame memory 11 (step 20). Next, the arithmetic unit 12 reads out the addresses of the frame memory 11 corresponding to the movement determination region between the angles of view 17 and 18, and sets a flag when an image difference signal has been written into any of these addresses, that is, when data of '1' is present (step 21).

Next, it is determined whether or not the flag is set (decision 22). If the flag is not set, it is assumed that the subject has not moved, and the process returns to step 20. In this case the frame memory 9 retains the previously stored frame of the image signal, and the image difference signal with respect to one frame a further time interval later is obtained.

If the flag is set, so that decision 22 yields "YES", it is determined whether or not there is a command to end the automatic tracking operation (decision 23). If there is no such command, then, for the region within the angle of view 18, the calculation of the centroid, the detection of the amount and direction of movement of the subject image, and the setting of the camera direction by control of the imaging device 2 (FIG. 1) are performed as in the previous embodiment (step 24). When the setting of the camera direction is completed, the process returns to step 19 and the same operations are performed again. This series of operations is repeated, and when a command to end the automatic tracking operation is given, decision 23 yields "YES" and the automatic tracking operation stops.

In this embodiment, since only the image difference signal within the angle of view 18 is processed, the amount of information to be processed is far smaller than when the image difference signal over the entire imaging screen is processed. The automatic tracking operation is therefore performed more quickly, and the frame memories 9 and 11 (FIG. 2) can be of smaller capacity. Furthermore, since the processing range of the image difference signal is restricted to the small region within the angle of view 18, the influence of noise on the arithmetic processing for detecting the subject image is greatly reduced, and the accuracy of the automatic tracking is greatly improved.

Furthermore, in this embodiment the movement determination region between the angles of view 17 and 18 is provided to detect movement of the subject image, and the arithmetic processing for setting the camera direction is performed only after movement of the subject has been detected. While the subject is judged not to have moved, the image difference between each succeeding frame and the frame stored in the frame memory 9 (FIG. 2) is detected immediately; therefore, as soon as the subject image moves (that is, protrudes beyond the angle of view 17), this is detected at once, and the response characteristic of the automatic tracking is improved.

The angles of view 17 and 18 can be set at an arbitrary position on the imaging screen 16 as long as their center points coincide; the range in which the subject is captured can therefore be set as desired.

FIG. 7 is a block diagram showing the main part of still another embodiment of the automatic tracking camera according to the present invention, in which 25 is an automatic focus adjustment device; parts corresponding to FIGS. 1 and 2 are given the same reference numerals, and part of their description is omitted.

In this embodiment, when the moving direction of the subject 1 (FIG. 1) is toward the imaging device 2 (FIG. 1), the zoom magnification is changed, for example to widen the shot or enlarge the angle of view, so that the subject image can be kept within a predetermined size. As in the embodiment described above with reference to FIG. 5, the angles of view 17 and 18 are assumed within the imaging screen 16.

FIG. 8 is a flowchart showing the main part of the operation of this embodiment. The operation of this embodiment is also represented by the flowchart of FIG. 6, except that step 24 performs the processing shown in the flowchart of FIG. 8.

In FIGS. 5, 7, and 8, when movement of the subject image is detected by means of the movement determination region between the angles of view 17 and 18 as described for the previous embodiment, the arithmetic unit 12 performs a detection process to determine in which part of the movement determination region the image difference signal exists (step 26). That is, the movement determination region is divided into several partial movement determination regions, and in step 26 the presence or absence of the image difference signal is detected for each partial movement determination region.

Next, for the angle of view 17, the calculation of the centroid of the image difference signal and the detection of the amount and direction of movement of the subject image are performed as described for the previous embodiment, and a control signal based on the calculation results is supplied to the control section 5 (step 27).

Next, on the basis of the detection result of step 26, the arithmetic unit 12 determines whether or not image difference signals exist in partial movement determination regions lying in mutually opposite directions with respect to the angle of view 17 (decision 28). This decision 28 detects whether or not the moving direction of the subject 1 (FIG. 1) is toward the imaging device 2 (FIG. 1); if it yields "NO", the subject 1 is judged not to be approaching the imaging device 2, and the process returns to step 19 of FIG. 6.
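Decision 28 can be sketched by dividing the movement determination region into sectors and testing for activity in mutually opposite ones; the eight-sector split is an illustrative assumption (the patent says only that there are "several" partial regions):

```python
def approaching(active_sectors, n_sectors=8):
    """Decision 28: the subject is judged to be moving toward the camera
    when image difference signals appear in partial movement
    determination regions in mutually opposite directions, i.e. in at
    least one pair of sectors n_sectors/2 apart."""
    s = set(active_sectors)
    return any((k + n_sectors // 2) % n_sectors in s for k in s)
```

A subject that merely translates across the screen lights up sectors on one side only, while a subject approaching the camera grows outward in all directions and so lights up opposite sectors.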

If, on the other hand, decision 28 yields "YES", calculation processing for determining the zoom magnification is performed (step 29). This calculation finds the distance from the center point of the angle of view 17, after the camera direction has been changed in step 27, to the sampling point (FIG. 3) of the subject image farthest from that center point.

Now, let the movement vector of the subject image be (Δx, Δy), the coordinates of the center point of the angle of view 17 be (x₀, y₀), and the coordinates of each sample point of the subject image before process 27 be (x, y). Since process 27 shifts the subject image by (−Δx, −Δy), the distance d of each sample point of the subject image after process 27 from the center point of the angle of view 17 is

    d = √((x − Δx − x₀)² + (y − Δy − y₀)²)

Once the distance d of each sample point has been calculated from this equation, the largest of these distances, d_max, is selected. The sample point corresponding to d_max is the sample point of the subject image farthest from the center point of the angle of view 17 after the camera direction has been set.
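The distance calculation and the selection of d_max can be written down directly (a sketch; the variable names are ours):

```python
import math

def farthest_sample_distance(samples, motion, center):
    """Distance from the re-centered frame center to the farthest sample.

    samples: (x, y) coordinates of sample points before process 27
    motion:  detected movement vector (dx, dy) of the subject image
    center:  (x0, y0), center point of the angle of view 17

    After process 27 shifts the image by (-dx, -dy), each sample lies
    at distance d = sqrt((x - dx - x0)**2 + (y - dy - y0)**2) from the
    frame center; the maximum of these distances is d_max.
    """
    dx, dy = motion
    x0, y0 = center
    return max(math.hypot(x - dx - x0, y - dy - y0) for x, y in samples)
```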

Note that process 29 is performed only after an image difference signal has been detected in the movement-determination region between the angles of view 17 and 18, so the image difference signal in this movement-determination region can be regarded as the image signal of the subject image before process 27; accordingly, x and y are the coordinates of sample points within the movement-determination region. Further, Δx and Δy are obtained from the amount and direction of movement of the subject image detected in process 27.

From the d_max obtained in this way and the zoom magnification reported by the automatic focus adjustment device 25, a zoom magnification is calculated such that the sample point corresponding to d_max falls within the angle of view 17. As a result, a control signal is supplied to the automatic focus adjustment device 25, and the zoom lens is controlled so that the entire subject image fits within the angle of view 17. When control of the zoom lens is completed, the procedure returns to process 19 in Fig. 6.
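One way to turn d_max into a zoom command is sketched below, under the simplifying assumption that zooming scales image-plane distances linearly; the margin parameter and all names are ours, not the patent's:

```python
def zoom_to_fit(current_zoom, d_max, half_extent, margin=1.0):
    """Zoom magnification that keeps the farthest sample point in frame.

    current_zoom: zoom magnification currently in effect
    d_max:        distance from the frame center to the farthest sample
    half_extent:  usable half-width of the angle of view, in the same units

    Zooming out by a factor k scales image-plane distances by k, so the
    largest zoom that still contains the subject is proportional to
    half_extent / d_max.
    """
    if d_max <= margin * half_extent:
        return current_zoom  # subject already fits; leave the zoom alone
    return current_zoom * margin * half_extent / d_max
```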

In this embodiment, the frame written into the frame memory 9 is renewed on every pass of the arithmetic processing, including the processing in the arithmetic unit 12 for detecting whether the subject image has moved (that is, when determination 22 in Fig. 6 results in "NO", the procedure returns to process 19). This is done so that no error arises in the zoom magnification.

With this embodiment, the subject image can be kept within the angle of view 17 regardless of the distance of the subject from the imaging device; the presence or absence of movement of the subject image can be detected accurately, and the amount and direction of that movement can also be detected accurately, so that automatic tracking is performed correctly regardless of the direction in which the subject moves.

In the embodiments above, frame memories are used as the memories 9 and 11, but field memories may be used instead.

[Effect of the Invention]

As explained above, according to the present invention it is possible to provide an automatic tracking camera that requires no complicated processing, operates without error with a simple circuit configuration, performs automatic tracking quickly, and is small, lightweight, and inexpensive, offering excellent capabilities not found in the prior art described above.

[Brief Description of the Drawings]

Fig. 1 is a block diagram showing an embodiment of the automatic tracking camera according to the present invention; Fig. 2 is a block diagram showing a specific example of the signal processing unit of Fig. 1; Fig. 3(A) is an explanatory diagram showing the sample points of a one-frame image; Fig. 3(B) is an explanatory diagram showing the sample points of the image signal within the rectangular frame of Fig. 3(A); Fig. 4 is an explanatory diagram showing the processing operation for generating the image difference signal of Fig. 2; Fig. 5 is an explanatory diagram of another embodiment of the automatic tracking camera according to the present invention; Fig. 6 is a flowchart showing the operation of the embodiment of Fig. 5; Fig. 7 is a block diagram showing the main part of still another embodiment of the automatic tracking camera according to the present invention; and Fig. 8 is a flowchart showing the main part of the operation of the embodiment of Fig. 7.

2: imaging device; 3: camera circuit; 4: signal processing unit; 5: control unit; 9: frame memory; 10: subtraction circuit; 11: frame memory; 12: arithmetic unit; 16: imaging screen; 17, 18: angles of view; 25: automatic focus adjustment device.

Continued from page 1 — Applicant: Hitachi Video Engineering Co., Ltd., 292 Yoshida-cho, Totsuka-ku, Yokohama

Claims (1)

[Claims]
(1) An automatic tracking camera which measures the amount and direction of movement of a subject to be tracked and is controlled so as to be able to image the subject according to the measurement results, comprising: first means, supplied with an image signal obtained from an imaging device, for obtaining an image difference signal representing the difference between images taken at a fixed time interval; second means for calculating, from the image difference signal, the position of the center of gravity of the difference between said images on the imaging screen and for detecting the amount and direction of movement of the subject image on the imaging screen; and third means for controlling the imaging device on the basis of a detection signal from the second means, whereby the subject image can be set within a reference region of the imaging screen.
(2) An automatic tracking camera according to claim (1), wherein said reference region is a first angle of view, and said second means detects that the difference between said images exists outside the first angle of view and within a second angle of view containing the first angle of view, and generates said detection signal to said third means.
(3) An automatic tracking camera according to claim (1) or (2), wherein said second means generates said detection signal for setting the camera direction of said imaging device.
(4) An automatic tracking camera according to claim (2), wherein said second means generates said detection signal for setting the camera direction of said imaging device and for making said subject image a size that fits within said first angle of view.
JP58082757A 1983-05-13 1983-05-13 Automatic tracking camera Expired - Lifetime JPH0614698B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP58082757A JPH0614698B2 (en) 1983-05-13 1983-05-13 Automatic tracking camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP58082757A JPH0614698B2 (en) 1983-05-13 1983-05-13 Automatic tracking camera

Publications (2)

Publication Number Publication Date
JPS59208983A true JPS59208983A (en) 1984-11-27
JPH0614698B2 JPH0614698B2 (en) 1994-02-23

Family

ID=13783311

Family Applications (1)

Application Number Title Priority Date Filing Date
JP58082757A Expired - Lifetime JPH0614698B2 (en) 1983-05-13 1983-05-13 Automatic tracking camera

Country Status (1)

Country Link
JP (1) JPH0614698B2 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62208783A (en) * 1986-03-10 1987-09-14 Hitachi Ltd Automatic tracking video camera
JPH0576007A (en) * 1991-09-13 1993-03-26 Nkk Corp Monitoring method by television camera
JPH05176239A (en) * 1991-12-25 1993-07-13 Mitsumi Electric Co Ltd Image pickup device
GB2368741A (en) * 2000-10-26 2002-05-08 Fuji Photo Optical Co Ltd A following device for a camera pan head
US6661450B2 (en) 1999-12-03 2003-12-09 Fuji Photo Optical Co., Ltd. Automatic following device
EP2051505A2 (en) 2007-10-17 2009-04-22 Sony Corporation Composition determining apparatus, composition determining method, and program
EP2059029A2 (en) 2007-10-17 2009-05-13 Sony Corporation Composition determining apparatus, composition determining method, and program
EP2104336A2 (en) 2008-03-19 2009-09-23 Sony Corporation Composition determination device, composition determination method, and program
EP2120210A2 (en) 2008-04-25 2009-11-18 Sony Corporation Composition determination device, composition determination method, and program
CN101964872A (en) * 2009-07-23 2011-02-02 索尼公司 Composition determination device, imaging system, composition determination method, and program
US7916172B2 (en) 2005-09-13 2011-03-29 Canon Kabushiki Kaisha Image pickup apparatus with object tracking capability

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006059252A (en) 2004-08-23 2006-03-02 Denso Corp Method, device and program for detecting movement, and monitoring system for vehicle
JP2007251429A (en) * 2006-03-14 2007-09-27 Omron Corp Moving image imaging unit, and zoom adjustment method
JP5096258B2 (en) * 2008-08-05 2012-12-12 株式会社藤商事 Game machine

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5489417A (en) * 1977-12-27 1979-07-16 Nippon Aviotronics Kk Video tracing system
JPS54161821A (en) * 1978-06-13 1979-12-21 Mitsubishi Electric Corp Picture tracking device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5489417A (en) * 1977-12-27 1979-07-16 Nippon Aviotronics Kk Video tracing system
JPS54161821A (en) * 1978-06-13 1979-12-21 Mitsubishi Electric Corp Picture tracking device

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62208783A (en) * 1986-03-10 1987-09-14 Hitachi Ltd Automatic tracking video camera
JPH0576007A (en) * 1991-09-13 1993-03-26 Nkk Corp Monitoring method by television camera
JPH05176239A (en) * 1991-12-25 1993-07-13 Mitsumi Electric Co Ltd Image pickup device
US6661450B2 (en) 1999-12-03 2003-12-09 Fuji Photo Optical Co., Ltd. Automatic following device
GB2368741A (en) * 2000-10-26 2002-05-08 Fuji Photo Optical Co Ltd A following device for a camera pan head
GB2368741B (en) * 2000-10-26 2004-12-15 Fuji Photo Optical Co Ltd Following device
US7256817B2 (en) 2000-10-26 2007-08-14 Fujinon Corporation Following device
US7916172B2 (en) 2005-09-13 2011-03-29 Canon Kabushiki Kaisha Image pickup apparatus with object tracking capability
EP2059029A2 (en) 2007-10-17 2009-05-13 Sony Corporation Composition determining apparatus, composition determining method, and program
EP2051505A2 (en) 2007-10-17 2009-04-22 Sony Corporation Composition determining apparatus, composition determining method, and program
US8164643B2 (en) 2007-10-17 2012-04-24 Sony Corporation Composition determining apparatus, composition determining method, and program
US8275171B2 (en) 2007-10-17 2012-09-25 Sony Corporation Composition determining apparatus, composition determining method, and program
EP2104336A2 (en) 2008-03-19 2009-09-23 Sony Corporation Composition determination device, composition determination method, and program
US8810673B2 (en) 2008-03-19 2014-08-19 Sony Corporation Composition determination device, composition determination method, and program
EP2120210A2 (en) 2008-04-25 2009-11-18 Sony Corporation Composition determination device, composition determination method, and program
US8594390B2 (en) 2008-04-25 2013-11-26 Sony Corporation Composition determination device, composition determination method, and program
US9679394B2 (en) 2008-04-25 2017-06-13 Sony Corporation Composition determination device, composition determination method, and program
CN101964872A (en) * 2009-07-23 2011-02-02 索尼公司 Composition determination device, imaging system, composition determination method, and program
US8908057B2 (en) 2009-07-23 2014-12-09 Sony Corporation Composition determination device, imaging system, composition determination method, and program

Also Published As

Publication number Publication date
JPH0614698B2 (en) 1994-02-23

Similar Documents

Publication Publication Date Title
KR102213328B1 (en) Video processing apparatus, video processing method, and program
JPS59208983A (en) Automatic tracing camera
CN106488115B (en) For tracking and sensing the control device and method of the image sensing apparatus of tracking target
US20110096143A1 (en) Apparatus for generating a panoramic image, method for generating a panoramic image, and computer-readable medium
JPH08202879A (en) Method for change of continuous video images belonging to sequence of mutually interrelated images as well as apparatus and method for replacement of expression of targetdiscriminated by set of object points by matched expression of predetermined and stored pattern of same geometrical shape in continuous tv frames of same sequence
US20040141633A1 (en) Intruding object detection device using background difference method
JPS63166370A (en) Picture movement correcting device
JP2007188294A (en) Method for detecting moving object candidate by image processing, moving object detection method for detecting moving object from moving object candidate, moving object detection apparatus, and moving object detection program
JPH0580248A (en) Automatic focusing device
JPH0644292B2 (en) Two-dimensional visual recognition device
JPH0591492A (en) Moving vector detector
JPH0275284A (en) Image pickup device
JP3465264B2 (en) Apparatus and method for detecting motion of video data
JPH04345382A (en) Scene change detecting method
JP3534551B2 (en) Motion detection device
JP2006215655A (en) Method, apparatus, program and program storage medium for detecting motion vector
JPH04213973A (en) Image shake corrector
JP3473064B2 (en) Image motion vector detecting device and video camera
JPH0568666B2 (en)
TWI671684B (en) System and method for monitoring an image
JP2889410B2 (en) Image recognition device
JP2001025021A (en) Motion detection method and motion detector
JPH0728406B2 (en) Motion vector detector
JP7346021B2 (en) Image processing device, image processing method, imaging device, program and recording medium
JPH057327A (en) Motion vector detecting device