JPH1010418A - Automatic focusing device - Google Patents
Automatic focusing device
- Publication number
- JPH1010418A (JP H1010418 A); application numbers JP8166066A, JP16606696A
- Authority
- JP
- Japan
- Prior art keywords
- image plane
- state
- lens
- data
- image surface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Focusing (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
Description
[0001]
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an automatic focusing device that drives the taking lens of a camera or the like into focus.
[0002]
2. Description of the Related Art
In automatic focus adjustment devices for cameras, tracking methods are known in which a moving subject is tracked and the exposure is made at a certain time so that an in-focus picture is obtained. The first tracking method is the simplest, and many cameras still use it. Differences are taken from pairs of subject image plane positions obtained in the past and their detection times to obtain the amount of change of the image plane position, that is, the image plane speed, and the position is extrapolated along that slope to predict the image plane position at a predetermined time. This can be regarded as a Newton-style first-order approximation of the position, with the first derivative replaced by a finite difference.
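As an illustrative sketch (not part of the patent text; function and variable names are assumptions), the first method amounts to a two-point finite-difference extrapolation:

```python
def predict_linear(samples, t_pred):
    """First tracking method (sketch): two-point finite-difference extrapolation.

    samples: list of (detection_time, image_plane_position) pairs, oldest first.
    t_pred:  time at which the image plane position is to be predicted.
    """
    (t0, y0), (t1, y1) = samples[-2], samples[-1]
    image_plane_speed = (y1 - y0) / (t1 - t0)  # slope between the last two samples
    return y1 + image_plane_speed * (t_pred - t1)
```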
[0003] The second tracking method deepens the first method by one more order and applies a second-order correction to the image plane position. Specifically, three generations of (image plane position, time) data are taken, the parabola passing through these three points in the image-plane-position versus time space is determined, and that parabola is extrapolated to predict the image plane position at a predetermined time. The motivation is that even for a subject moving at constant speed, the image plane position changes according to the nonlinear function of Equation 1 (in Equation 1, f is the focal length of the lens, v is the subject speed, h is the distance at the moment of closest approach, and t is time, with t = 0 at closest approach), so the image plane speed itself varies; the method therefore tries to reach the correct answer by evaluating the image plane acceleration with a quadratic expression. In practice, however, the measured defocus amount contains a considerable amount of fluctuation, and the influence of that fluctuation grows the higher the order of the derivative being estimated. This is unavoidable as long as a discrete data set is used, and it is therefore highly questionable whether this method can capture the speed variation correctly.
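The body of Equation 1 is not reproduced in this text (it appears as an image in the original publication). As a hedged reconstruction from the stated definitions, assuming thin-lens (Newtonian) imaging and a subject passing on a straight line at constant speed v with closest-approach distance h, the image plane position measured from the infinity-focus plane would take a form such as:

\[
d(t) = \sqrt{h^{2} + v^{2}t^{2}}, \qquad
x(t) = \frac{f^{2}}{d(t) - f} \;\approx\; \frac{f^{2}}{\sqrt{h^{2} + v^{2}t^{2}}}
\]

which is maximal at t = 0 and nonlinear in t, consistent with the description above; this is an assumed form, not the patent's verbatim Equation 1.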
[0004] As a further improved tracking method, there is a method that performs a statistical calculation, the aim being to keep the influence of measurement fluctuation to a minimum. The third tracking method predicts the image plane position by a statistical technique: pairs of image plane position and measurement time are stored over the past several generations, and a linear regression line is derived from this data group. Since the slope of the regression line is the rate of change of the image plane position with time, it represents the image plane speed of the data group. This method is robust against fluctuation in the measured data and yields highly reliable values, but because it represents the amount of change of a data group spanning a certain time interval, it has difficulty coping with rapidly changing subjects.
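A sketch of the statistical slope used by the third method (assumed names; not code from the patent); the returned mean point is the position and time that represent the data group:

```python
def regression_image_plane_speed(samples):
    """Third tracking method (sketch): least-squares slope of position vs. time.

    samples: list of (time, image_plane_position) pairs from the last few cycles.
    Returns (speed, t_mean, y_mean); the mean point represents the data group.
    """
    n = len(samples)
    t_mean = sum(t for t, _ in samples) / n
    y_mean = sum(y for _, y in samples) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in samples)
    den = sum((t - t_mean) ** 2 for t, _ in samples)
    speed = num / den  # slope of the regression line = representative image plane speed
    return speed, t_mean, y_mean
```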
[0005] As a fourth method that improves on the third, there is a method in which a regression line is obtained in the space of the image plane position versus the square root of the image plane speed, and the image plane speed and image plane acceleration are calculated back from that regression line. Because this method captures the nonlinear movement of the image plane essentially in closed form, it can obtain an image plane acceleration that is little affected by fluctuation.
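One possible rationale for regressing against the square root of the image plane speed, stated here only as an assumption (the patent does not derive it): for a subject approaching the camera head-on at constant speed, with image plane position x approximately f^2/d and subject distance d(t) = d_0 - vt,

\[
\frac{dx}{dt} = \frac{f^{2}v}{d^{2}} = \frac{v}{f^{2}}\,x^{2}
\;\;\Longrightarrow\;\;
\sqrt{\frac{dx}{dt}} = \frac{\sqrt{v}}{f}\,x ,
\]

so the square root of the image plane speed is linear in the image plane position, and a regression line in that space encodes the subject speed and, by differentiation, the image plane acceleration.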
[0006]
Problems to Be Solved by the Invention
In the conventional tracking methods described above, the better the method of predicting the image plane position of a moving subject becomes, the larger the number of data items involved in the prediction decision. FIG. 1 shows the relationship between the image plane position (solid line) and the image plane speed (broken line) of the taking lens when a subject moving at a constant speed v approaches the camera from far away and then moves away again. The image plane position reaches its maximum when the moving subject arrives at the point of closest approach, at distance h, directly in front of the camera. If the moving speed while the subject approaches the camera is taken as positive, the image plane speed increases as the subject approaches, its polarity reverses at the position directly in front of the camera, and the image plane speed then decreases as the subject moves away from the camera.
[0007] In this specification, the point at which the moving subject comes closest to the camera and the polarity of the image plane speed reverses is called the turning point; the range in which the subject approaches the camera and the image plane position and image plane speed increase is called the uphill; and the range in which the subject moves away from the camera and the image plane position and image plane speed decrease is called the downhill.
[0008] When such a moving subject is tracked and the downhill is reached, the image plane position and image plane speed on the downhill must be predicted from data such as the image plane position and image plane speed obtained on the preceding uphill, so the predicted point can no longer follow the actual movement of the image plane. In other words, a function that locally approximates the movement of the image plane on the assumption of a monotonic increase becomes unable to predict anything at the maximum, that is, the turning point, of the function that actually represents the image plane movement. Moreover, at the turning point not only does the speed (the first derivative) change sharply, but the drive direction also reverses, so backlash occurs in mechanical members such as the gears that transmit the driving force, making the control nonlinear. Because the movement immediately after the turning point is hard to predict, and because the tracking drive itself is difficult even when a prediction can be made, tracking in the vicinity of the turning point has so far been abandoned.
[0009] An object of the present invention is to provide an automatic focusing device that accurately tracks a moving subject that approaches the camera and then moves away from it.
[0010]
(1) The invention of claim 1 comprises: focus detection means for detecting the focus adjustment state of a taking lens; lens position detection means for detecting the position of the focusing lens of the taking lens; image plane position calculation means for calculating and storing the image plane position of the taking lens from the focus adjustment state detected by the focus detection means and the focusing lens position detected by the lens position detection means; image plane state determination means for determining, based on the image plane position calculated by the image plane position calculation means, a first state in which the image plane of the taking lens moves from far to near and a second state in which it moves from near to far; image plane data prediction means for predicting, when the image plane state determination means determines that the first state has switched to the second state, the image plane data of the taking lens after the switch to the second state on the basis of the image plane data of the taking lens obtained up to that switch; and lens drive control means for driving and controlling the focusing lens on the basis of the image plane data predicted by the image plane data prediction means.
(2) In the automatic focusing device of claim 2, the image plane data prediction means interpolates the image plane speed of the taking lens after the switch to the second state from the image plane position data of the taking lens obtained up to that switch.
(3) In the automatic focusing device of claim 3, the image plane position calculation means stores the image plane position of the current calculation only when it has changed from the image plane position of the previous calculation by a predetermined amount or more.
While the image plane of the taking lens is in the first state, moving from far to near, the focus adjustment state of the taking lens is detected together with the position of the focusing lens at that time, and the image plane position of the taking lens is calculated from the focus adjustment state and the focusing lens position and stored. When the image plane of the taking lens switches from the first state to the second state, the image plane speed of the taking lens after the switch is obtained by interpolation from the detected focus adjustment state and focusing lens position and from the stored image plane position data acquired before the switch, and the focusing lens is driven and controlled on the basis of the calculated image plane speed.
[0011]
DESCRIPTION OF THE PREFERRED EMBODIMENT
FIG. 2 shows the configuration of one embodiment. The lens unit 1 incorporates a taking lens 2, a lens position detection device 3, and a drive gear 4. The lens position detection device 3 detects the position of a focusing lens (not shown) included in the taking lens 2. The drive gear 4 receives driving force from the camera body 5 through a transmission mechanism (not shown) and drives the focusing lens.
[0012] The camera body 5, on the other hand, incorporates a defocus amount detection device 6, a clock 7, an arithmetic unit 8, a storage device 9, an image plane state determination device 10, and a drive control device 11. The defocus amount detection device 6 guides a pair of light beams from the subject, which have passed through the taking lens 2 and the main mirror 12 and been reflected by the sub-mirror 13, onto an image sensor (not shown) through a focus detection optical system (not shown), forms a pair of subject images, and generates signals corresponding to the light intensity distributions of the subject images. A well-known focus detection calculation is then performed on these signals to detect the amount by which the focus of the taking lens 2 deviates from the intended image forming plane. The clock 7 records the time at which the defocus amount detection device 6 detects a defocus amount. The arithmetic unit 8 performs the various calculations of the camera, and the storage device 9 stores the data calculated by the arithmetic unit 8. The image plane state determination device 10 monitors the movement of the image plane on the basis of various data such as the defocus amount and detects its turning point.
[0013] The drive control device 11 drives the focusing lens via the drive gear 4 in accordance with the drive data calculated by the arithmetic unit 8, that is, a commanded drive line defined by the image plane speed at a predetermined time and an offset amount, as shown in FIG. 3.
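Although the patent does not spell out the commanded drive line beyond FIG. 3, one hedged reading is a target image plane position that advances linearly from an offset at a reference time (the symbols below are assumptions):

\[
y_{\mathrm{cmd}}(t) = y_{\mathrm{offset}} + v_{\mathrm{im}}\,(t - t_{\mathrm{ref}})
\]

where v_im is the commanded image plane speed.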
[0014] Assume now, as shown in FIG. 1, a moving subject that passes the point of closest approach, at distance h, at a constant speed v, and consider the case where the camera tracks this subject; the operation of the embodiment is explained with this example. Tracking in this embodiment follows the process below. First, the moving subject approaches the camera from far away. At this stage the device operates as usual: the defocus amount bf0, the time t0 at which it was detected, and the focusing lens position lp0 at t0 are read from the defocus amount detection device 6, the image plane position (df0 = bf0 + lp0) and its detection time are stored in the storage device 9 as a pair, and the focusing lens is driven on the basis of these data. When this automatic focus adjustment cycle has been repeated several times and several generations of (image plane position, detection time) pairs have been stored, the movement of the image plane per unit time, that is, the image plane speed, is calculated from these data and the automatic focusing device enters the tracking state.
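A minimal sketch of this bookkeeping, with assumed names and an assumed threshold for the number of stored generations; the image plane position is the sum of the detected defocus amount and the focusing lens position:

```python
history = []  # (detection_time, image_plane_position) pairs, oldest first

def on_focus_detection(bf, lp, t, min_generations=3):
    """bf: defocus amount, lp: focusing lens position, t: detection time.
    min_generations is an assumed threshold for entering the tracking state."""
    df = bf + lp                                # image plane position: df0 = bf0 + lp0
    history.append((t, df))
    tracking = len(history) >= min_generations  # several generations stored?
    if tracking:
        (t_prev, y_prev), (t_cur, y_cur) = history[-2], history[-1]
        image_plane_speed = (y_cur - y_prev) / (t_cur - t_prev)
        return tracking, image_plane_speed
    return tracking, None
```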
[0015] As for the method of calculating the image plane speed, various methods exist, ranging from taking the difference between two (image plane position, detection time) pairs to using the slope of a regression line computed statistically from a larger number of data; since they are not directly related to the present invention, a detailed description is omitted. What matters is that the image plane position yi and the time ti corresponding to the calculated image plane speed vimi must be computed at the same time. When the image plane speed is obtained simply from a difference, the midpoint position and time between the two data represent it; for a statistical calculation, the mean position and time of the adopted data play this role. This data set becomes the first numerical set of the drive table. In the next cycle, after the defocus amount has been detected and the various parameters calculated, it is checked whether the image plane position calculated this time has changed by a predetermined amount d or more from the image plane position last stored in the table, and only if it has changed is it stored in the table as the next data set. This is done because the automatic focus adjustment cycle is considerably faster than the image plane movement of the subject, so storing every data point could make the table enormous, and because it prevents the monotonic increase of the table from being destroyed by fluctuation in the data. Since the table is searched by magnitude comparison, loss of monotonicity would be a problem for the software. This operation is repeated and the table is built up. FIG. 4 illustrates this: the image plane position and image plane speed are stored at the times of storage 1, storage 2 and storage 3. In this way the table is formed while the image plane is on the uphill.
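As an illustrative sketch (assumed names and an assumed value for the predetermined amount d), the table update described above could look like this:

```python
def update_table(table, y_new, v_new, t_new, d=0.1):
    """Append a (position, speed, time) set only if the image plane position has
    changed by at least d since the last stored entry; this keeps the table small
    and preserves its monotonic increase on the uphill.  d = 0.1 is an assumed
    value for the 'predetermined amount' of the text.

    table: list of dicts {"y": image plane position, "v": image plane speed, "t": time}.
    """
    if not table or abs(y_new - table[-1]["y"]) >= d:
        table.append({"y": y_new, "v": v_new, "t": t_new})
    return table
```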
[0016] Meanwhile, while these operations are repeated, the image plane state determination device 10 keeps judging whether the turning point has been passed. As described above, the table continues to be built until the turning point is reached. The image plane state determination device 10 judges the turning point on the basis of various parameters and the characteristics of the predictive drive being performed. For example, for a system using statistical calculations in which acceleration-correction overlap is applied once a certain image plane speed is exceeded, that is, the third and fourth tracking methods described in the related-art section, the following conditions are checked (a sketch of this check is given after the list):
(1) The current image plane position has moved toward the near side by at least a certain amount from the first image plane position in the table.
(2) The acceleration-correction overlap has started.
(3) The maximum image plane speed so far exceeds a certain magnitude.
(4) The focus has deviated toward the near side.
(5) The current image plane speed is close to zero or negative, or the previous and current measurement data stand in an equivalent relationship.
(6) It has not yet been judged that the turning point has been reached.
When all of these conditions are satisfied, the image plane state determination device 10 judges that the turning point has been reached.
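The following sketch restates the six conditions as a boolean check; the threshold values are illustrative assumptions (not values from the patent) and the data layout follows the table sketch above:

```python
def turning_point_reached(table, y_now, v_now, overlap_started, v_max_so_far,
                          defocus_near_side, already_turned,
                          y_margin=0.5, v_peak=1.0, v_eps=0.05):
    """Sketch of the six conditions of paragraph [0016]; thresholds are assumed."""
    if already_turned:                                           # condition (6)
        return False
    cond1 = bool(table) and (y_now - table[0]["y"]) >= y_margin  # (1) well past first entry
    cond2 = overlap_started                                      # (2) overlap has started
    cond3 = v_max_so_far >= v_peak                               # (3) peak speed large enough
    cond4 = defocus_near_side                                    # (4) focus deviated to near side
    cond5 = v_now < v_eps                                        # (5) speed near zero or negative
    return all((cond1, cond2, cond3, cond4, cond5))
```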
[0017] Among these conditions, (1) to (3) serve partly to judge the state, but they also reflect the fact that when they are not satisfied the closest-approach distance is large and the change in image plane speed is small, so that continuing the drive without ever judging a turning point causes no particular problem. Condition (4) is a judgment element derived from the property of the third and fourth tracking methods described above that the acceleration correction tends to become excessive near the top of the curve. Condition (6) should be understood as being based on the rule that the turning point is detected only once per pass.
[0018] The table should then look, for example, like Table 1.
[Table 1: stored sets of image plane position yi, image plane speed vimi, and time ti]
[0019] Once the turning point has been passed, tracking is driven using this table. The tracking drive used after the turning point, when the image plane is on the downhill, is as follows. When data are input from the defocus amount detection device 6 after the turning point, the table is looked up with the image plane position contained in those data, and the corresponding time is calculated by interpolating the table data. The simplest interpolation obtains the answer by proportional allocation: if, for example, the image plane position yt is 3.0, the first data y1 and the second data y2 in Table 1 are selected and the answer is obtained by proportional allocation between them, as in Equations 2 and 3.
[Equation 2]  ratio = (y2 - yt) / (y2 - y1)
[Equation 3]  time = (t2 - t1) * ratio + t1
From these equations the virtual time "time" is easily calculated; in this example it comes out to -3.9.
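A sketch of this table lookup, reusing the table layout assumed in the earlier sketches; Equations 2 and 3 appear directly in the body:

```python
def lookup_time_for_position(table, yt):
    """Interpolate the uphill table to find the (virtual) time at which the image
    plane was at position yt, using the proportional allocation of Equations 2
    and 3.  table entries (sketch): {"y": position, "v": speed, "t": time}."""
    for lo, hi in zip(table, table[1:]):
        if min(lo["y"], hi["y"]) <= yt <= max(lo["y"], hi["y"]):
            y1, y2, t1, t2 = lo["y"], hi["y"], lo["t"], hi["t"]
            ratio = (y2 - yt) / (y2 - y1)   # Equation 2
            return (t2 - t1) * ratio + t1   # Equation 3
    raise ValueError("yt lies outside the stored table")
```

The value returned corresponds to the time at which the matching measurement was made on the uphill.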
[0020] This time is the time at which the measurement was made, and there should be a slight difference td between it and the current time. Since the speed is now the reverse of what it was before the turning point, subtracting this value from "time" gives the virtual time time0 corresponding to the present. If, for instance, this difference is 0.1, the virtual time becomes -4.0. Looking up the image plane speed and image plane position corresponding to this virtual time in reverse yields the image plane speed and image plane position at the present moment. Repeating calculations similar to Equations 2 and 3 gives an image plane position of 2.92 and an image plane speed of 0.75. Since this is a phase with acceleration, half of the measurement interval may additionally be added, in consideration of the measurement interval, when calculating the image plane speed only. Needless to say, when outputting to the drive control device 11, the current lens position must be subtracted from the image plane position and the result output as the remaining drive amount.
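Continuing the sketch, the mirrored-time lookup described in this paragraph could be written as follows (assumed names; the lag td and the table layout come from the earlier sketches):

```python
def predict_after_turning_point(table, yt, td):
    """Mirror the uphill history across the turning point: find the virtual time
    for the measured position yt (Equations 2 and 3), step it back by the
    measurement lag td, and read position and speed back out of the table."""
    time0 = lookup_time_for_position(table, yt) - td   # virtual time for "now"
    for lo, hi in zip(table, table[1:]):
        if min(lo["t"], hi["t"]) <= time0 <= max(lo["t"], hi["t"]):
            ratio = (hi["t"] - time0) / (hi["t"] - lo["t"])
            y_now = hi["y"] - ratio * (hi["y"] - lo["y"])
            v_now = hi["v"] - ratio * (hi["v"] - lo["v"])
            return y_now, v_now   # remaining drive = y_now minus current lens position
    raise ValueError("virtual time lies outside the stored table")
```

In the patent's worked example, a virtual time of -4.0 corresponds to an image plane position of 2.92 and an image plane speed of 0.75.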
[0021] In this way, as long as the table remains applicable, repeatedly applying overlap correction using the learned data allows the correct image plane speed to be output from the table, and since the behavior of the image plane is already known, highly accurate acceleration correction and overlap correction are possible immediately.
[0022] In the configuration of the embodiment described above, the defocus amount detection device 6 constitutes the focus detection means, the lens position detection device 3 the lens position detection means, the arithmetic unit 8 the image plane position calculation means and the image plane data prediction means, the image plane state determination device 10 the image plane state determination means, and the drive control device 11 the lens drive control means.
[0023]
Effects of the Invention
As described above, according to the present invention, on the downhill following the turning point of the image plane movement curve, where tracking used to be difficult, high-precision tracking drive becomes possible immediately after the turn.
[FIG. 1] A diagram showing the image plane position and image plane speed of the taking lens when tracking a moving subject that approaches and then moves away from the camera.
[FIG. 2] A diagram showing the configuration of one embodiment.
[FIG. 3] A diagram showing the drive data of the taking lens and the actual movement of the taking lens.
[FIG. 4] A diagram showing an example of how the table is built on the uphill.
REFERENCE SIGNS LIST
1 lens unit
2 taking lens
3 lens position detection device
4 drive gear
5 camera body
6 defocus amount detection device
7 clock
8 arithmetic unit
9 storage device
10 image plane state determination device
11 drive control device
Claims (3)
1. An automatic focusing device comprising: focus detection means for detecting a focus adjustment state of a taking lens; lens position detection means for detecting a position of a focusing lens of the taking lens; image plane position calculation means for calculating and storing an image plane position of the taking lens from the focus adjustment state detected by the focus detection means and the focusing lens position detected by the lens position detection means; image plane state determination means for determining, based on the image plane position calculated by the image plane position calculation means, a first state in which the image plane of the taking lens moves from far to near and a second state in which the image plane moves from near to far; image plane data prediction means for predicting, when the image plane state determination means determines that the first state has switched to the second state, image plane data of the taking lens after the switch to the second state on the basis of the image plane data of the taking lens obtained up to the switch; and lens drive control means for driving and controlling the focusing lens on the basis of the image plane data predicted by the image plane data prediction means.
2. The automatic focusing device according to claim 1, wherein the image plane data prediction means interpolates an image plane speed of the taking lens after the switch to the second state on the basis of the image plane position data of the taking lens obtained up to the switch.
3. The automatic focusing device according to claim 1, wherein the image plane position calculation means stores the image plane position obtained in the current calculation only when it has changed by a predetermined amount or more from the image plane position obtained in the previous calculation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP16606696A JP3757470B2 (en) | 1996-06-26 | 1996-06-26 | Automatic focusing device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP16606696A JP3757470B2 (en) | 1996-06-26 | 1996-06-26 | Automatic focusing device |
Publications (2)
Publication Number | Publication Date |
---|---|
JPH1010418A true JPH1010418A (en) | 1998-01-16 |
JP3757470B2 JP3757470B2 (en) | 2006-03-22 |
Family
ID=15824352
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP16606696A Expired - Fee Related JP3757470B2 (en) | 1996-06-26 | 1996-06-26 | Automatic focusing device |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP3757470B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002116373A (en) * | 2000-10-11 | 2002-04-19 | Canon Inc | Automatic focusing device and camera |
JP2017040879A (en) * | 2015-08-21 | 2017-02-23 | キヤノン株式会社 | Imaging apparatus, control method, program, and storage medium |
- 1996-06-26: JP application JP16606696A, granted as patent JP3757470B2 (status: not active, Expired - Fee Related)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002116373A (en) * | 2000-10-11 | 2002-04-19 | Canon Inc | Automatic focusing device and camera |
JP4659197B2 (en) * | 2000-10-11 | 2011-03-30 | キヤノン株式会社 | Auto focus device and camera |
JP2017040879A (en) * | 2015-08-21 | 2017-02-23 | キヤノン株式会社 | Imaging apparatus, control method, program, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP3757470B2 (en) | 2006-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5502537A (en) | Focus adjustment device having predictive function | |
US9354487B2 (en) | Image-pickup apparatus | |
JPS6398615A (en) | Automatic focus adjusting method | |
US9288378B2 (en) | Autofocus apparatus | |
JP2001021794A (en) | Auto-focusing adjusting device, and optical instrument | |
JPH0862484A (en) | Focusing device | |
EP1939681A1 (en) | Optical apparatus and image-pickup system | |
US5239332A (en) | Automatic focusing device of camera for moving object photography | |
JP2008268815A (en) | Automatic focusing device | |
JP2006251777A (en) | Focus detection apparatus and optical apparatus | |
JPH1010418A (en) | Automatic focusing device | |
JPH0730801A (en) | Automatic focus adjustment device | |
US6085041A (en) | Autofocus system and photographic lens | |
JP2554051B2 (en) | Autofocus device | |
JP4900134B2 (en) | Focus adjustment device, camera | |
JP2531182B2 (en) | Camera auto focus device | |
JPH04161912A (en) | Automatic focusing device for electronic camera | |
JP3028563B2 (en) | Camera autofocus device | |
JP5930979B2 (en) | Imaging device | |
JPH05219418A (en) | Focusing detector | |
JP2709486B2 (en) | Image plane moving speed prediction device and automatic focus adjustment device | |
JP5355239B2 (en) | Imaging apparatus and control method thereof | |
JP3379531B2 (en) | Auto focus camera | |
JP3191885B2 (en) | Automatic focusing device | |
JP3834874B2 (en) | Camera autofocus device |
Legal Events

Code | Title | Description
---|---|---
A977 | Report on retrieval | Free format text: JAPANESE INTERMEDIATE CODE: A971007; Effective date: 20050729
A131 | Notification of reasons for refusal | Free format text: JAPANESE INTERMEDIATE CODE: A131; Effective date: 20050809
A521 | Written amendment | Free format text: JAPANESE INTERMEDIATE CODE: A523; Effective date: 20051006
TRDD | Decision of grant or rejection written |
A01 | Written decision to grant a patent or to grant a registration (utility model) | Free format text: JAPANESE INTERMEDIATE CODE: A01; Effective date: 20051206
A61 | First payment of annual fees (during grant procedure) | Free format text: JAPANESE INTERMEDIATE CODE: A61; Effective date: 20051219
R150 | Certificate of patent or registration of utility model | Free format text: JAPANESE INTERMEDIATE CODE: R150
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20120113; Year of fee payment: 6
FPAY | Renewal fee payment (event date is renewal date of database) | Free format text: PAYMENT UNTIL: 20150113; Year of fee payment: 9
S531 | Written request for registration of change of domicile | Free format text: JAPANESE INTERMEDIATE CODE: R313531
R350 | Written notification of registration of transfer | Free format text: JAPANESE INTERMEDIATE CODE: R350
R250 | Receipt of annual fees | Free format text: JAPANESE INTERMEDIATE CODE: R250
LAPS | Cancellation because of no payment of annual fees |