JPH025A - Device for detecting eye direction of camera - Google Patents

Device for detecting eye direction of camera

Info

Publication number
JPH025A
Authority
JP
Japan
Prior art keywords
light
lens
line
eye
detection device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP63143259A
Other languages
Japanese (ja)
Other versions
JP2859270B2 (en)
Inventor
Osamu Shindo
修 進藤
Shigeo Toushi
重男 藤司
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pentax Corp
Original Assignee
Asahi Kogaku Kogyo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asahi Kogaku Kogyo Co Ltd filed Critical Asahi Kogaku Kogyo Co Ltd
Priority to DE19883841575 priority Critical patent/DE3841575C2/en
Priority to DE3844912A priority patent/DE3844912C2/en
Priority to DE3844907A priority patent/DE3844907C2/en
Publication of JPH025A publication Critical patent/JPH025A/en
Priority to US07/982,427 priority patent/US5327191A/en
Priority to US08/370,367 priority patent/US5583606A/en
Priority to US08/462,688 priority patent/US5557364A/en
Application granted granted Critical
Publication of JP2859270B2 publication Critical patent/JP2859270B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2213/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B2213/02Viewfinders
    • G03B2213/025Sightline detection

Abstract

PURPOSE: To accomplish detection of a photographer's eye direction by projecting an infrared ray onto the eye, catching the reflected light from the first Purkinje image based on the mirror reflection of the cornea together with the reflected light from the eyeground, and arithmetically processing the photoreceiving output. CONSTITUTION: The infrared ray is projected from an infrared light source 48 onto the photographer's eye, which is positioned on the right side of the pentaprism 40 of the eye direction detecting device 46. Thereby, the first Purkinje image PI based on the mirror reflection of the cornea is formed. The reflected light from the Purkinje image and the reflected light from the eyeground are caught by means of a one-dimensional line sensor 53. The light receiving output of the sensor 53 is amplified by an amplifier 56 and converted into a digital signal by an analog-to-digital converter 57. Then, the process of detecting the photographer's eye direction is performed by means of a microcomputer 58. Consequently, the eye direction can be detected, and the optical systems of plural focusing zones can be automatically selected and driven.

Description

DETAILED DESCRIPTION OF THE INVENTION (Field of Industrial Application) The present invention relates to a line-of-sight direction detection device for a camera, and more particularly to a line-of-sight direction detection device suitable for a camera having an automatic focusing device in which focusing zones of an autofocus optical system, corresponding to the respective focusing zones of the finder, are provided at positions optically substantially conjugate with a plurality of focusing zones provided within the field of view of the finder, one of the focusing zones of the finder is selected, and the autofocus optical system corresponding to the selected focusing zone is used to focus on the subject that appears to overlap that focusing zone.

(Background of the Invention) Conventionally, some cameras are equipped with an autofocus optical system. For example, Fig. 39 shows the schematic configuration of the optical system of a single-lens reflex camera equipped with such an autofocus optical system. In Fig. 39, 1 is a photographic lens, 2 is a subject, 3 is a field mask, 4 is a condenser lens, 5 is an aperture mask, 6 and 7 are separator lenses, and 8 is a CCD serving as a light receiving section. The field mask 3, the condenser lens 4, the aperture mask 5, the separator lenses 6 and 7 and the CCD 8 are integrated into a single module and constitute an autofocus optical system 9.

In this autofocus optical system 9, the field mask 3 is provided near a film equivalent surface 10.

The film equivalent surface 10 is at a position optically conjugate with the subject 2 through the photographic lens 1. When the photographic lens 1 is in focus, an image 11 of the subject 2 is formed in focus on this film equivalent surface 10. The condenser lens 4 and the aperture mask 5 have the function of splitting the photographing light passing through the left and right sides of the photographic lens 1 into two luminous fluxes, and the separator lenses 6 and 7 are at positions optically conjugate with the photographic lens 1 via the condenser lens 4.

As schematically shown in Fig. 40, the separator lenses 6 and 7 are arranged in the horizontal direction. They look at virtual aperture regions 14 and 15 of the exit pupil 13 of the photographic lens 1 through a focusing zone 12 located at a position optically conjugate with the central focusing zone of the finder, which will be described later. The light fluxes that have passed through the aperture regions 14 and 15 enter the separator lenses 6 and 7, and the image 11 formed on the film equivalent surface 10 is re-imaged by the separator lenses 6 and 7 as images 11 on two regions of the CCD 8.

Let ℓ0 be the interval, shown in Fig. 42, of the signals S corresponding to the image interval when this re-imaged image 11 is in focus (see Fig. 41(a)). As shown in Fig. 41(b), when the photographic lens 1 is focused in front of the in-focus position, the image interval becomes narrower as shown in Fig. 42, and the corresponding interval of the signals S becomes smaller than ℓ0. Conversely, as shown in Fig. 41(c), when the photographic lens 1 is focused behind the in-focus position, the image interval becomes wider as shown in Fig. 42, and the corresponding interval of the signals S becomes larger than ℓ0. Since this change in the image interval is proportional to the defocus amount of the photographic lens 1, a conventional single-lens reflex camera detects the image interval on the CCD 8, processes it to obtain the defocus direction and defocus amount of the photographic lens 1, and drives the photographic lens 1 to the in-focus position.
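
As a rough numerical illustration of this relationship (not part of the original text), the sketch below compares the measured separation of the two signals S with the in-focus reference interval ℓ0; the sign of the difference gives the defocus direction, and a hypothetical proportionality constant converts it to a defocus amount.

```python
# Minimal sketch, assuming the change in image separation is proportional
# to the defocus of the photographic lens (the constant k is hypothetical).
def estimate_defocus(signal_interval_mm, l0_mm, k_mm_per_mm=1.0):
    """Return (direction, defocus_mm) from the measured signal interval."""
    delta = signal_interval_mm - l0_mm
    if delta < 0:
        direction = "front focus"   # interval narrower than at best focus
    elif delta > 0:
        direction = "back focus"    # interval wider than at best focus
    else:
        direction = "in focus"
    return direction, k_mm_per_mm * delta

print(estimate_defocus(4.7, 5.0))   # -> ('front focus', -0.3...)
```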

Then, for example, as shown in Fig. 43, the photographer composes the picture so that the desired subject 2 falls within a central focusing zone 17 provided at the center of the finder 16 and operates a button; the defocus direction and the defocus amount are calculated automatically, and a photograph with the subject 2 in focus can be obtained.

In this type of single-lens reflex camera, however, the focusing zone is provided at the center of the finder 16, so that, as it stands, the subject 2 is located at the center of the photograph.

There are, however, cases where it is desired to obtain a photograph in which the subject 2 is placed in the peripheral part of the frame.

In consideration of this, conventional single-lens reflex cameras are provided with a focus lock mechanism. Using this focus lock mechanism, the photographer positions the subject 2 at the center of the finder, focuses the photographic lens on the subject 2, locks the focus in this state, and then reframes and shoots as shown in Fig. 44; a photograph with the desired subject 2 placed in the peripheral part can thus be obtained.

With this single-lens reflex camera, however, the procedure of recomposing the picture before shooting has to be followed, so the photographing operation tends to take too much time and effort.

The present applicant therefore previously filed an application for an automatic distance measuring device for a single-lens reflex camera that allows the photographing operation for obtaining a photograph with the desired subject placed in the peripheral part to be performed quickly (Japanese Patent Application No. 62-22561).

The device disclosed in that earlier application will now be outlined with reference to Figs. 27 to 30.

In Fig. 27, reference numeral 13, drawn with a solid line, denotes the exit pupil seen from the focusing zone 12 of the autofocus optical system 9.

This exit pupil 13 is approximately circular, as shown in Fig. 28.

The aperture regions 14 and 15 seen from the separator lenses 6 and 7, on the other hand, are approximately elliptical.

Autofocus optical systems 18 and 19 for peripheral focusing are provided on the left and right sides of the autofocus optical system 9. The autofocus optical system 18 has a pair of separator lenses 20 and 21 and a CCD 22, and the autofocus optical system 19 has a pair of separator lenses 23 and 24 and a CCD 25.

As shown in Fig. 29, within the field of view of the finder 16, peripheral focusing zones 26 and 27 are arranged side by side on the left and right of the central focusing zone 17, corresponding to the autofocus optical systems 18 and 19 for peripheral focusing.

These peripheral focusing zones 26 and 27 are in an optically substantially conjugate positional relationship with autofocus focusing zones 28 and 29. The separator lenses 20 and 21 and the separator lenses 23 and 24 are arranged in the vertical direction; via the condenser lens 4 (not shown) they are optically substantially conjugate with the exit pupil 13 of the photographic lens 1 drawn with a broken line, and through the focusing zones 28 and 29 they look at the vertically separated aperture regions 30 and 31 of that exit pupil 13.

The separator lenses 20 and 21 and the separator lenses 23 and 24 are arranged in the vertical direction for the following reason. As shown in Fig. 30, the light flux entering the focusing zones 28 and 29 through the photographic lens 1 becomes an oblique flux under the influence of vignetting, and the exit pupil 13 of the photographic lens 1 (broken line) seen from the focusing zones 28 and 29 is vignetted into a flattened shape. If the aperture regions 30 and 31 were provided in the horizontal direction, a sufficient base length between the separator lenses 20 and 21 (and between the separator lenses 23 and 24) could not be secured, the performance of the lenses would be degraded, and the detection accuracy of the image interval would deteriorate.

In Fig. 27, ℓ is the optical axis of the photographic lens 1, ℓ1 is the central optical axis of the autofocus optical system 18, and ℓ2 is the central optical axis of the autofocus optical system 19; the central optical axes ℓ1 and ℓ2 intersect at the center O1 of the exit pupil 13 drawn with a solid line. Further, ℓ11 is the optical axis of the separator lens 20, ℓ12 is the optical axis of the separator lens 21, ℓ21 is the optical axis of the separator lens 23, and ℓ22 is the optical axis of the separator lens 24; the optical axes ℓ11 and ℓ12 intersect at the center O2 of one aperture region, and the optical axes ℓ21 and ℓ22 intersect at the center O3 of the other aperture region.

In this way, a plurality of focusing zones are provided within the field of view of the finder 16, and focusing zones of the autofocus optical systems corresponding to the respective focusing zones of the finder 16 are provided at positions optically substantially conjugate with them. If the photographer selects the intended focusing zone (see Fig. 29) by a button operation so as to drive the CCD corresponding to that zone, the photographic lens can be focused automatically, using the autofocus optical system corresponding to the selected focusing zone, on the subject 2 seen through that selected focusing zone.

Therefore, with this single-lens reflex camera, the troublesome focus-lock operation performed in order to determine the composition can be eliminated.

(Problem to be Solved by the Invention) Now, since a plurality of focusing zones 17, 26 and 27 are provided within the field of view of the finder 16 and focusing zones of the autofocus optical systems 9, 18 and 19 corresponding to the focusing zones 17, 26 and 27 of the finder 16 are provided at positions optically substantially conjugate with them, if it could be detected automatically which of the plurality of focusing zones within the field of view of the finder 16 has been selected, the trouble of manually selecting one of the focusing zones 17, 26 and 27 provided within the field of view of the finder 16 would also be eliminated, and the camera would become even more convenient.

The present invention has been made in view of the above circumstances.

A first object of the present invention is to provide a line-of-sight direction detection device for a camera that detects the direction of the line of sight of the photographer's eye.

A second object of the present invention is to provide a camera line-of-sight direction detection device suitable for a camera having an automatic focusing device in which focusing zones of an autofocus optical system, corresponding to the respective focusing zones of the finder, are provided at positions optically substantially conjugate with a plurality of focusing zones provided within the field of view of the finder, one of the focusing zones of the finder is selected, and the autofocus optical system corresponding to the selected focusing zone is used to focus on the subject that appears to overlap that focusing zone.

A third object of the present invention is to provide a camera line-of-sight direction detection device that detects the direction of the line of sight of the photographer's eye using a one-dimensional line sensor.

(Means for Solving the Problems) The camera line-of-sight detection device according to the present invention is characterized in that a light transmitting system that guides a parallel light beam to the photographer's eye, a light receiving system that has a light receiving section and receives the reflected light forming a first Purkinje image based on specular reflection at the cornea of the eye together with the light reflected from the fundus of the eye, and a processing circuit for detecting the direction of the line of sight of the photographer's eye on the basis of the output of the light receiving section are provided in the camera body.

Another feature of the camera line-of-sight detection device according to the present invention is that a plurality of focusing zones are provided within the field of view of the finder of the camera body, focusing zones of an autofocus optical system corresponding to these focusing zones are provided at positions substantially optically conjugate with them, and the processing circuit is configured to sense automatically that one of the focusing zones of the finder has been selected.

A further feature of the camera line-of-sight detection device according to the present invention is that the light receiving section is constituted by a one-dimensional line sensor, and the processing circuit includes separation means for separating the output of the one-dimensional line sensor into a fundus-reflection output component corresponding to the light reflected from the fundus and a first-Purkinje-image output component corresponding to the reflected light forming the first Purkinje image, obtains the centroid position of the separated fundus-reflection output component and the centroid position of the first-Purkinje-image output component, and thereby detects the direction of the line of sight of the eye.

Other features will become apparent from the specification of the present invention.

(Principle of the Invention) First, before the embodiments are described, the principle of the present invention will be explained.

Detection methods for the line-of-sight direction are described, for example, in 'Psychophysics of Vision' by Mitsuo Ikeda. When such a method is applied to a camera, however, it must not detect translational movement of the photographer's eye. If translation of the eye were detected together with the line-of-sight direction of the eye, the line-of-sight information caused by translation of the eye would overlap the angular information, and it would be impossible to distinguish which focusing zone the photographer is gazing at.

If a line-of-sight direction detection optical system capable of also detecting translation were deliberately adopted, the relative distance between the optical axis of the camera finder and the center of rotation of the photographer's eyeball would have to be kept constant. Considering that hand-held cameras are the norm, so that the eye moves left and right relative to the finder 16, this is virtually impossible.

A line-of-sight direction detection optical system that detects the line of sight only in the angular direction is introduced, for example, in 'Fixation Point Measurement by the Oculometer Technique', Optical Engineering, Vol. 13, No. 4, July/August 1974, pp. 339-342.

The principle of the line-of-sight direction detection optical system introduced there is as follows. As shown in Fig. 22, when a convex mirror 30 is irradiated with a parallel light beam P parallel to the optical axis, the image of a light source at optical infinity is formed as a light spot at the midpoint Q between the center of curvature of the convex mirror 30 and the point K at which the optical axis intersects the mirror surface.

Similarly, as shown in Fig. 23, when the cornea 32 of a human eye 31 is irradiated with a parallel light beam P parallel to the optical axis, the image of a light source at optical infinity is formed as a light spot at the midpoint Q between the center of curvature R of the cornea 32 and the corneal vertex K' (this light spot is called the first Purkinje image PI). Reference numeral 33 denotes the iris, 34 the center of the pupil, and SA' the center of rotation of the eyeball.

When the optical axis of the light beam P illuminating the cornea 32 coincides with the visual axis that indicates the line-of-sight direction of the human eye, the center 34 of the pupil, the first Purkinje image PI, the center of curvature R of the cornea 32 and the center of rotation SA' of the eyeball all lie on the optical axis. Considering a camera, assume that the center of rotation SA' of the eyeball lies on the optical axis of the finder.

Suppose that the eyeball is rotated in the left-right direction about the center of rotation SA'. Then, as shown in Fig. 24, a relative shift arises between the center 34 of the pupil and the first Purkinje image PI.

Suppose further that the eye is rotated by an angle θ with respect to the optical axis, and let d be the length of the perpendicular dropped from the center 34 of the pupil onto the ray P' that enters the cornea 32 perpendicularly. Then

d = k1 · sin θ   ... (1)

where k1 is the distance from the center 34 of the pupil to the center of curvature R of the cornea 32. Although it varies between individuals, it is about 4.5 mm according to MIL-HDBK-141 'Optical Design' compiled by the United States Department of Defense. The symbol H denotes the intersection of the ray P' with the perpendicular dropped from the center 34 of the pupil onto the ray P' entering the cornea 32 perpendicularly.

As is clear from equation (1), since the distance k1 is known, the rotation angle θ can be obtained by finding the length d.

Considering that the intersection point H and the first Purkinje image PI both lie on the ray P', if a parallel light beam P is directed at the cornea 32, the component of the specular reflection from the cornea 32 that returns in the direction parallel to the incident beam is detected, and the relationship between the center 34 of the pupil and the first Purkinje image PI is obtained, the rotation angle θ of the eye can be determined.

Therefore, the parallel light beam P is projected onto the eye and, as shown in Figs. 25 and 26, the periphery 34' of the pupil, which stands out as a silhouette against the light reflected from the fundus, and the first Purkinje image PI are imaged onto a light receiving element (for example, a one-dimensional line sensor). The output of the light receiving element then has a peak at the position corresponding to the first Purkinje image PI, while the portion corresponding to the light reflected from the fundus is trapezoidal. Accordingly, a slice level L1 is used to obtain the pupil-edge coordinates i1 and i2 corresponding to the pupil periphery 34', and a slice level L2 is used to obtain the coordinates PI1 and PI2 corresponding to the first Purkinje image PI. The center coordinate i' corresponding to the center 34 of the pupil and the center coordinate PI' are obtained from equations (2) and (3) below, and the difference d' = PI' − i' is calculated. If the magnification of the detection optical system is m, the distance d is obtained from equation (4):

i' = (i1 + i2) / 2   ... (2)

PI' = (PI1 + PI2) / 2   ... (3)

d = d' / m   ... (4)

Therefore, with a line-of-sight direction detection device equipped with such a processing circuit, which of the plurality of focusing zones provided in the finder 16 is being gazed at can be selected automatically.

(Embodiments) Embodiments of the line-of-sight direction detection device for a camera according to the present invention will now be described with reference to the drawings.

In Fig. 1, 40 is a pentaprism built into the camera and 41 is a quick-return mirror.

42 is a focusing screen, 43 is a condenser lens, 44 is a finder loupe (eyepiece), 45 is the photographer's eye, and the optical axis of the finder optical system mentioned above is also shown. The finder loupe 44 is composed of lenses 44a and 44b. In the camera body, on the opposite side of the pentaprism 40 from the finder loupe, a line-of-sight direction detection device 46 is incorporated which detects the line-of-sight direction of the eye 45 of the photographer looking into the finder 16. Fig. 1 shows the frame 47 of the line-of-sight direction detection device 46. The line-of-sight direction detection device 46 has a light transmitting system 46A and a light receiving system 46B. As shown in Figs. 2 and 3, the light transmitting system 46A has an infrared light source 48 (for example, an infrared light emitting diode) that generates infrared light. This infrared light is directed to the photographer's eye 45 as a parallel light beam via a half mirror 49, a reduction lens 50, a compensator prism 51, the pentaprism 40 and the finder loupe 44. A first Purkinje image PI based on specular reflection at the cornea 32 is thereby formed.

Infrared light is used here so that the photographer is not dazzled by the illumination of the optical system of the line-of-sight direction detection device 46. The reduction lens 50 is used for the following reasons.

First, the optical path length of the optical system of the line-of-sight direction detection device 46 is made as short as possible so that it can be incorporated compactly into the camera. Second, since only the infrared reflected light parallel to the optical axis is used, the amount of light reflected from the eye 45 is expected to be small; consideration is therefore also given to focusing the reflected light onto as small an area as possible of the light receiving surface of the one-dimensional line sensor, described later, serving as the light receiving section, so as to increase the sensitivity at the light receiving surface of the light receiving element.

Of the light reflected from the cornea 32 of the eye 45, the flux parallel to the incident beam is guided through the finder loupe 44, the pentaprism 40, the compensator prism 51 and the reduction lens 50 to the half mirror 49, and is directed by the half mirror 49 to a re-imaging lens 52.

The re-imaging lens 52 forms an image on a one-dimensional line sensor 53 (for example, a CCD) serving as the light receiving element. As shown in Fig. 4, a mask 54 is provided on the re-imaging lens 52; an aperture 55 is provided in the mask 54, and the center of the aperture 55 is located at the center of curvature Y of the re-imaging lens 52. The diameter of the aperture 55 is about 1.2 mm.

Assuming that the photographer's eye 45 is normally placed at the eye point, the one-dimensional line sensor 53 and the pupil of the photographer's eye 45 are in an optically conjugate positional relationship via the finder loupe 44, the reduction lens 50 and the re-imaging lens 52, as schematically shown in Fig. 5. On the one-dimensional line sensor 53, together with the first Purkinje image PI, the periphery 34' of the pupil is formed as a silhouette by the light reflected from the fundus. As shown in Fig. 3, the output of the one-dimensional line sensor 53 is amplified by an amplifier 56, converted into a digital signal by an analog-to-digital converter 57, and temporarily stored in a memory 59 of a microcomputer 58.

The distance k1 is recorded as information in the memory 59. The information on the distance k1 and the information on the light receiving output are read into an arithmetic circuit 60 and processed on the basis of equations (1) to (4) to obtain the rotation angle θ, and from this rotation angle θ a selection signal indicating which focusing zone has been selected is output to a drive circuit 61.

When the drive circuit 61 then drives the CCD of the autofocus optical system corresponding to the selected focusing zone, the photographic lens can be focused automatically on the subject seen through the focusing zone intended by the photographer.
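
Purely as an illustration of this selection step (the angular boundaries below are invented, not taken from the patent), the detected rotation angle could be mapped to one of the three focusing zones as follows:

```python
def select_zone(theta_deg, boundary_deg=2.0):
    """Map the detected eye rotation angle to one of the three focusing zones.

    boundary_deg is a hypothetical threshold; in practice the boundaries
    follow from y = f*tan(theta) and the layout of the zones 17, 26, 27.
    """
    if theta_deg < -boundary_deg:
        return "zone 26 (left)"
    if theta_deg > boundary_deg:
        return "zone 27 (right)"
    return "zone 17 (center)"

print(select_zone(4.1))   # -> 'zone 27 (right)'
```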

As shown in Fig. 29, let y be the distance (image height) from the center O1 of the field of view of the finder 16 (the center of the focusing screen) to the left and right focusing zones O2 and O3, and let f be the focal length of the finder loupe 44. Then

y = f · tan θ   ... (5)

Substituting equation (1) into equation (5) gives

y = f · d / (k1 · cos θ)   ... (6)

that is, y is proportional to d / (k1 · cos θ).

This means that even if the distortion of the image formed on the one-dimensional line sensor 53 were eliminated, the value of y could not be obtained linearly from the value of d; in other words, a nonlinearity exists.

In the case of a 35 mm camera, because of vignetting and other factors, the image height y of the plurality of focusing zones is considered to be at most about 6 mm to 9 mm.

If the optical system of the line-of-sight direction detection device 46 transmits the image of the pupil, with this nonlinearity left as it is, to the one-dimensional line sensor 53 behind it, and the length d detected by the one-dimensional line sensor 53 is assumed to be proportional to the image height y, the detected length is merely about 0.1% to 1.6% longer than the actual length d, which does not hinder the selection of the focusing zone; from the viewpoint of improving the accuracy of the optical system of the line-of-sight direction detection device 46, however, it is preferable that there be no nonlinearity.
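
The size of this nonlinearity can be checked numerically: assuming the sensor reading is proportional to the image height y, the detected length exceeds the true d = k1·sin θ by the factor 1/cos θ. The eyepiece focal length used below is an assumed value for illustration only.

```python
import math

f_mm = 60.0   # assumed finder loupe focal length (illustrative)
for y_mm in (6.0, 9.0):
    theta = math.atan(y_mm / f_mm)        # from y = f*tan(theta), equation (5)
    error = 1.0 / math.cos(theta) - 1.0   # relative over-estimate of d
    print(f"y = {y_mm} mm -> theta = {math.degrees(theta):.1f} deg, "
          f"error = {100 * error:.2f} %")
```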

In such a case, correction is possible with a microcomputer. However, if distortion exists in the optical system itself, the measurement becomes inaccurate, so at least the distortion of the optical system must be eliminated.

Therefore, in order to reduce the spherical aberration of the reduction lens 50, the surface 50a on the side nearer the finder loupe 44 is made aspherical, and the focal point of the reduction lens 50 is placed at the center of curvature Y of the re-imaging lens 52. When the reduction lens 50 is made aspherical in this way and its focal point is placed at the center of curvature Y of the re-imaging lens 52, then, combined with the fact that the aperture 55 is located at the center of curvature Y of the re-imaging lens 52, an optical system with little distortion can be realized, which is even more desirable as the optical system of the line-of-sight direction detection device 46.

Next, an example of the design of the optical system of such a line-of-sight direction detection device 46 is described below.

First, the distance from the lens 44a to the eye point is 14.71 mm, the center thickness of the lens 44a is 4.98 mm, the radius of curvature of the eye-point-side surface of the lens 44a is convex …, the radius of curvature of the surface of the lens 44a facing the lens 44b is convex −25.500 mm, and the refractive index of the lens 44a is 1.69105. The distance between the lens 44a and the lens 44b on the optical axis is 3.01 mm. The center thickness of the lens 44b is 4.10 mm, the radius of curvature of the surface of the lens 44b facing the lens 44a is concave −23.860 mm, the radius of curvature of the surface of the lens 44b facing the pentaprism 40 is convex −48.140 mm, and the refractive index of the lens 44b is 1.79175. The distance between the surface 40a of the pentaprism 40 and the lens 44b is 3.21 mm, the length along the optical axis from the surface 40a to the surface 40b of the pentaprism 40 is 28.00 mm, the radii of curvature of the surfaces 40a and 40b are infinite, and the refractive index of the pentaprism 40 is 1.51260.

Next, the distance between the surface 51a of the compensator prism 51 and the surface 40b of the pentaprism 40 is set to 0.10 mm, and the distance between the surface 51b of the compensator prism 51 and the surface 50a of the reduction lens 50 is also set to 0.10 mm. The length along the optical axis from the surface 51a to the surface 51b of the compensator prism 51 is 2.00 mm, the radii of curvature of the surfaces 51a and 51b are infinite, and the refractive index of the compensator prism 51 is 1.51260.

The surface 50a of the reduction lens 50 has a convex radius of curvature of 12.690 mm (with k3 = −3.00); its center thickness is designed to be 2.00 mm and its refractive index is 1.48716. The radius of curvature of the other surface 50b of the reduction lens 50 is convex −200.000 mm, and the distance between the surface 50b and the re-imaging lens 52 is set to 11.48 mm.

The radius of curvature of the surface 52a of the re-imaging lens 52 is convex 1.520 mm, the radius of curvature of the surface 52b is infinite, the center thickness of the re-imaging lens 52 is 1.52 mm, and its refractive index is 1.48716, the same as that of the reduction lens 50. Since the mask 54 having the aperture 55 of 0.2 mm diameter is attached to the surface 52b, the distance between the mask 54 and the surface 52b is 0 mm; the thickness of the mask 54 is 0.04 mm, and the distance from the mask 54 to the light receiving surface of the light receiving element 53 is 1.46 mm. The radii of curvature of the mask 54 and of the light receiving surface of the light receiving element 53 are infinite, and air is present between the optical elements.

Here k3 denotes the aspheric coefficient, and it is related to the sag amount X by the following expression:

X = c·h² / (1 + √(1 − (k3 + 1)·c²·h²))

where h is the height from the optical axis and c is the reciprocal of the radius of curvature of the reduction lens 50.
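
A small sketch of this sag expression as reconstructed above, using the values from the design example (r = 12.690 mm, k3 = −3.00); the ray heights are arbitrary.

```python
import math

def aspheric_sag(h_mm, r_mm=12.690, k3=-3.00):
    """Sag X = c*h^2 / (1 + sqrt(1 - (k3+1)*c^2*h^2)) of the surface 50a."""
    c = 1.0 / r_mm
    return c * h_mm**2 / (1.0 + math.sqrt(1.0 - (k3 + 1.0) * c**2 * h_mm**2))

for h in (0.5, 1.0, 2.0):
    print(f"h = {h} mm -> X = {aspheric_sag(h):.4f} mm")
```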

If the reduction lens 50 is not made aspherical, spherical aberration arises as shown in Fig. 6 and distortion appears as shown in Fig. 7; with the line-of-sight direction detection optical system designed as described above, the spherical aberration is improved as shown in Fig. 8, and the distortion is improved accordingly as shown in Fig. 9.

In this embodiment, LEDs corresponding to the focusing zones 17, 26 and 27 may also be provided within the field of view of the finder 16, and the LED corresponding to the selected focusing zone may be made to blink so that the photographer can confirm whether it is the intended focusing zone. Further, although this embodiment has been described for the case where there are three focusing zones within the field of view of the finder 16, it will readily be understood that the present invention holds as long as there are two or more.

Furthermore, in this embodiment the light transmitting system 46A and the light receiving system 46B are incorporated on the opposite side of the pentaprism 40 from the finder loupe 44; however, either the light transmitting system 46A or the light receiving system 46B may instead be provided on the same side of the pentaprism 40 as the finder loupe 44. This will be described later.

Next, another embodiment of the line-of-sight direction detection device 46 according to the present invention will be described with reference to Figs. 10 to 13.

A two-dimensional solid-state image sensor could also be used for the light receiving section. In that case, however, since the array of the solid-state image sensor is two-dimensional, the scanning processing time for scanning the solid-state image sensor is expected to be long, and the cost is also high. Where the centers O1, O2 and O3 of the plurality of focusing zones 17, 26 and 27 are aligned in a straight line as shown in Fig. 29, it is therefore conceivable to use a one-dimensional line sensor whose photoelectric elements are arranged in the direction corresponding to the direction in which the centers of the focusing zones 17, 26 and 27 are lined up. The use of such a one-dimensional line sensor, however, involves the following problem.

Figs. 12 and 13 illustrate this problem. In Fig. 12, 100 is a finder loupe, 101 is a re-imaging lens, and 102 is a one-dimensional line sensor. As shown in this figure, when the optical axis of the optical system of the line-of-sight direction detection device 46, that is, the optical axis of the finder loupe 100, coincides with the visual axis of the human eye 31, the pupil image 34a, which is the silhouette (periphery) of the pupil, and the first Purkinje image PI are formed on the one-dimensional line sensor 102, so the line-of-sight direction can be detected normally.

However, when the human eye 31 moves vertically with respect to the camera body, the pupil image 34a as a silhouette and the first Purkinje image PI come off the one-dimensional line sensor 102, as shown in Fig. 13, and the inconvenience arises that the line-of-sight direction cannot be detected normally.

Therefore, as shown in Fig. 11, a cylindrical lens, for example, is used as the re-imaging lens 52.

A mask 54 of the same construction as shown in Fig. 4 is provided on the flat surface side of this cylindrical lens. The mask 54 is provided with an aperture 55, and the center of the aperture 55 is located at the center of curvature Y of the re-imaging lens 52. Here, the aperture 55 is a rectangular slit hole, and the direction in which the slit extends is orthogonal to the direction in which the photoelectric elements 53a of the one-dimensional line sensor 53 are arranged. The re-imaging lens 52 is arranged with its curved side facing the finder loupe 44.

In this way, where the photoelectric elements 53a of the one-dimensional line sensor 53 are arranged so as to correspond to the focusing zones of the plurality of autofocus optical systems, a cylindrical lens is used as the re-imaging lens 52 and is arranged so as to form, on the plane containing the one-dimensional line sensor 53, a first Purkinje image PI and a silhouette pupil image 34a that are elongated in the direction orthogonal to the arrangement direction of the one-dimensional line sensor 53. Consequently, as shown in Fig. 11, even if the eye 45 moves vertically with respect to the camera body, at least part of each of these images PI and 34a is still formed on the one-dimensional line sensor. Moreover, since the aperture 55 of the mask 54 is also a slit hole extending in the direction orthogonal to the arrangement direction of the photoelectric elements 53a of the one-dimensional line sensor 53, the pupil image 34a and the first Purkinje image PI formed on the plane containing the one-dimensional line sensor 53 become even more elongated in the direction orthogonal to the arrangement direction, and the line-of-sight direction can be detected reliably.

In this embodiment a cylindrical lens is used as the re-imaging lens 52, but a toric lens may also be used.

Next, another example of the processing circuit of the line-of-sight direction detection device 46 according to the present invention will be described.

Since the optical system of the line-of-sight direction detection device 46 is to be incorporated in the camera body and cost increases are to be avoided as much as possible, it is desirable that the optical system be as simple as possible; as far as the re-imaging lens 52 is concerned, a single lens is preferable.

When such a re-imaging lens 52 is used, however, and light with a uniform intensity distribution is made incident on it, the amount of light imaged on the light receiving surface of the one-dimensional line sensor 53 is attenuated at the periphery, as schematically shown in Fig. 14. In Fig. 14, the two-dot chain line G1 shows the light quantity distribution assuming no attenuation, the broken line G2 shows the light quantity distribution when attenuation is present, and, as before, the optical axis of the optical system of the line-of-sight direction detection device 46 is also shown.

If, in the presence of such light quantity attenuation, the centroid position of the light quantity distribution is determined from the output of the one-dimensional line sensor 53, the determined centroid position may deviate from the actual centroid position; if the line-of-sight direction is then determined by calculation using that centroid position, an error arises with respect to the actual line-of-sight direction.

When the angles of the line-of-sight directions to be distinguished are far apart, the error due to this light quantity attenuation can be tolerated, but as the angle between the line-of-sight directions to be distinguished becomes smaller, the error due to the attenuation can no longer be ignored. More generally, if an error due to light quantity attenuation can be removed, it is preferable to remove it as far as possible when detecting the line-of-sight direction by arithmetic processing.

In this processing circuit, therefore, means are provided for determining the light quantity attenuation in advance and storing light quantity correction values in a ROM described later.

That is, the output distribution of the one-dimensional line sensor 53 corresponding to a light quantity distribution with attenuation is as indicated by G3 in Fig. 14. Here, i denotes the i-th photoelectric element 53a and j the j-th photoelectric element 53a, Xi is the output of the i-th photoelectric element 53a and Xj is the output of the j-th photoelectric element 53a. Assume now that the j-th photoelectric element 53a lies on the optical axis, that is, that it is at the address midway between address a and address b. In this case, the output of the j-th photoelectric element 53a can be expected to be the maximum.

The outputs of the photoelectric elements 53a from address a to address b are therefore obtained, and a correction coefficient Hi is determined. The correction coefficient Hi, the output Xi and the output Xj satisfy the following relationship:

Hi · Xi = Xj   ... (7)

Then, in order to normalize this correction coefficient Hi, it is divided by Xj to obtain a correction value Hi', which is stored in the ROM of the processing circuit shown in Fig. 15.

Hi' = Hi / Xj   ... (8)

If the correction values Hi' normalized in this way are multiplied by the outputs actually obtained from the photoelectric elements 53a at the respective addresses (from address a to address b), the output distribution corresponding to the attenuated light quantity distribution is corrected; that is, for uniform light, a uniform output distribution is obtained in which the light quantity attenuation caused by the peripheral portion of the re-imaging lens 52 has been corrected.
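
A minimal sketch of this correction, under the reading of equations (7) and (8) given above: calibration outputs X from uniform illumination yield normalized correction values H' (so that the calibration frame itself corrects to a flat distribution), which are then multiplied into later sensor readings. The array contents are illustrative.

```python
import numpy as np

def correction_values(calibration):
    """Normalized correction values Hi' from a uniform-light calibration frame.

    Hi satisfies Hi * Xi = Xj (Xj being the peak, on-axis output), and
    Hi' = Hi / Xj, so Hi' * Xi = 1 for every element of the calibration frame.
    Assumes all calibration outputs are nonzero.
    """
    calibration = np.asarray(calibration, dtype=float)
    xj = calibration.max()        # output Xj of the on-axis element j
    h = xj / calibration          # Hi
    return h / xj                 # Hi'

def apply_correction(reading, h_prime):
    """Correct a sensor reading for the peripheral light fall-off."""
    return np.asarray(reading, dtype=float) * h_prime

# Calibration with uniform light: the output falls off toward the ends (curve G3).
calib = np.array([60, 80, 95, 100, 95, 80, 60])
h_prime = correction_values(calib)
print(apply_correction(calib, h_prime))   # -> flat distribution (all 1.0)
```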

Furthermore, if correction values based on the light quantity distribution obtained when uniform parallel light is made incident through the finder loupe 44 are used and stored in a writable and rewritable EEPROM, the correction can also cover errors in the light quantity distribution that include the optical elements of the optical system other than the re-imaging lens 52, as well as variations in sensitivity among the photoelectric elements 53a of the one-dimensional line sensor 53 itself. If such a correction is performed, the specifications for the optical characteristics of the one-dimensional line sensor 53 itself can be relaxed, and a cost reduction based on an improved yield can be achieved.

Now, in order to obtain the centroid position of the light quantity distribution forming the first Purkinje image PI based on corneal specular reflection and the centroid position of the light quantity distribution of the light reflected from the fundus, the output of the one-dimensional line sensor 53 must be separated into a fundus-reflection output component corresponding to the fundus-reflected light and a first-Purkinje-image output component corresponding to the first Purkinje image PI.

というのは、実際の光量分布は、第16図に実線GSで
示すようなものとなり、眼底反射光対応出力成分G6と
第1プルキンエ像形成反射光対応出力成分G7とに分離
せず処理するものとすると。
This is because the actual light amount distribution is as shown by the solid line GS in FIG. 16, and is processed without being separated into the output component G6 corresponding to the fundus reflected light and the output component G7 corresponding to the first Purkinje image forming reflected light. If so.

この両者を含んだ重心位II(座標又は番地)が求めら
れることになり、瞳孔の中心34と第1プルキンエ像P
Iの中心とが求められないからである。
The center of gravity position II (coordinates or address) including both of these is determined, and the center 34 of the pupil and the first Purkinje image P
This is because the center of I cannot be found.

この場合に、眼底反射光対応出力成分G6と第1プルキ
ンエ像形成反射光対応出力成分G1とを極力正確に分離
するようにするためには、スライスレベルSLをその境
目付近に設定する必要がある。
In this case, in order to separate the output component G6 corresponding to the fundus reflected light and the output component G1 corresponding to the first Purkinje image forming reflected light as accurately as possible, it is necessary to set the slice level SL near the boundary between them. .

For this purpose, a plurality of zone levels ZN are provided and the output frequency of the photoelectric elements 53a is examined.

Here, eight zone levels ZN are used, as shown in FIG. 17; these eight zone levels are denoted ZN1 to ZN8.

Then, in order to examine the output frequency of the photoelectric elements 53a, eight output frequency registers R1 to R8 are prepared in correspondence with the eight zone levels ZN1 to ZN8. Each of the output frequency registers R1 to R8 has eight bits. The outputs of the photoelectric elements 53a from address a to address b are then input in turn to the output frequency registers R1 to R8. For example, since the output at address a is "0", the contents of all of the output frequency registers are "0". When the output of the photoelectric element 53a at address i falls in a given zone level, the content of the corresponding output frequency register becomes "00000010" while the contents of the other output frequency registers remain "0". Further, when, for example, the output of the photoelectric element 53a at address i+1 is larger than the output of the photoelectric element 53a at address i by an amount corresponding to one bit, the content of the output frequency register R3 becomes "10000010".

Attention is then paid to the upper three bits of the output frequency registers R1 to R8; when the data held in the upper three bits contains at least one "1", that output frequency register is made to output "+1".

The output of the photoelectric element 53a at each address (i = a to b) is input, and each time the content of the upper three bits contains a "1", the corresponding output frequency register R1 to R8 is incremented; when the upper three bits contain no "1", no increment is made. When the output frequency registers R1 to R8 are incremented in this way for the output at each address, then, in the case of the output distribution shown schematically, the number of photoelectric elements 53a whose output level lies between zone level ZN2 and zone level ZN3 is the largest, so the increment count of the output frequency register R3 can be expected to become the largest.

Then, after the increment counting has been carried out over the output distribution of the photoelectric elements 53a at all addresses, it is determined which of the output frequency registers R1 to R8 has the largest increment count. The zone level ZN corresponding to the output frequency register whose increment count is the largest is determined as the slice level SL. By using this slice level SL, the output component G6 corresponding to the fundus reflected light and the output component G7 corresponding to the first-Purkinje-image-forming reflected light can be separated.

Here, the widths of the zone levels ZN1 to ZN8 are determined according to the noise level caused by the reflection from the fundus. This noise component can be removed with a low-pass filter, but it can also be dealt with by software processing in which the zone levels ZN1 to ZN8 are made to overlap one another.

For example, as shown in FIG. 18, the sums of the increment counts of adjacent output frequency registers R1 to R8 are taken, and the output frequency register for which that sum is the largest is determined. In the example shown in FIG. 18, the sum involving the output frequency register R4 is the largest, so the increment count of the output frequency register R4 is judged to be the maximum.
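A possible software rendering of this overlapped variant, assuming the per-zone counts and zone boundaries from the previous sketch are available; the pairing of neighbouring registers is one plausible reading of the summation described for FIG. 18.

```python
import numpy as np

def determine_slice_level_overlapped(counts, zone_edges):
    """Overlap adjacent zone levels in software: sum the increment counts
    of neighbouring frequency registers and pick the register whose
    pairwise sum is largest (cf. FIG. 18)."""
    counts = np.asarray(counts)
    pair_sums = counts[:-1] + counts[1:]     # R1+R2, R2+R3, ...
    k = int(pair_sums.argmax())              # index of the winning pair
    return zone_edges[k + 1]                 # zone boundary used as the slice level SL
```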

Note that, since the most frequently occurring output within the output component G6 corresponding to the fundus reflected light is at an intermediate level, the output frequency registers corresponding to certain of the zone levels are excluded from consideration from the outset when the slice level SL is determined.

Suppose that the zone level ZN corresponding to a given output frequency register has been obtained in this way. It is decided in advance that, when the content of that output frequency register is "00000001" or more, the output is treated as the output component G7 corresponding to the first-Purkinje-image-forming reflected light, and that, when it is "00000110" or less, the output is treated as the output component G6 corresponding to the fundus reflected light.

In this way, based on the content of that output frequency register, the slice levels SL1 and SL2 can be set, as shown in FIG. 16, in the vicinity of the boundary between the output component G6 corresponding to the fundus reflected light and the output component G7 corresponding to the first-Purkinje-image-forming reflected light.

When the slice levels SL1 and SL2 have been determined in this way and image separation is performed by slicing the output components corresponding to the light amount distribution characteristic shown in FIG. 16, the separated outputs shown in FIG. 19 are obtained. In FIG. 19, the solid line G8 indicates the separated output corresponding to the fundus reflected light.

The solid line G9 indicates the separated output corresponding to the first-Purkinje-image-forming reflected light. The separated output G8 corresponding to the fundus reflected light has a trapezoidal shape; this is because the correction described above was applied to the output of the one-dimensional line sensor 53 before it was separated into the separated output G8 corresponding to the fundus reflected light and the separated output G9 corresponding to the first-Purkinje-image-forming reflected light. Accordingly, if the centroid position of the separated output G8 corresponding to the fundus reflected light is X1 and the centroid position of the separated output G9 corresponding to the first-Purkinje-image-forming reflected light is X2, the distance d' from the center 34 of the pupil to the first Purkinje image is obtained as d' = X2 - X1.

As the arithmetic algorithm for obtaining a centroid position, a software implementation of the output of a PSD (position sensor diode) can be used. That is, as shown in FIGS. 20(a) and 20(b), weighting functions WA and WB are used: the convolution of the image separation output with the weighting functions WA and WB is taken, and the result is then integrated. For example, taking the convolution of the image separation output shown in FIGS. 20(c) and 20(d) with the weighting functions WA and WB gives the multiplication outputs CA and CB, and integrating these multiplication outputs CA and CB gives the integrated values SA and SB.

Then, with Sf denoting the distance from the origin O, the centroid position X is obtained as

X = Sf × ((SA - SB)/(SA + SB) + 1) × 1/2.

Because the convolution must be taken, this method requires a multiplication for every bit. Recently, microcomputers having a multiplication function have become common, so the centroid position can be obtained by this method.
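A minimal sketch of this PSD-style centroid calculation is given below. The shapes of the weighting functions WA and WB appear only in FIG. 20, so they are assumed here to be complementary linear ramps, and Sf is taken as the full sensor length; under those assumptions the multiply-and-sum below reproduces the formula above.

```python
import numpy as np

def psd_style_centroid(separated_output, sensor_length):
    """Software analogue of a position sensor diode (PSD), sketching the
    weighting-function method of FIG. 20 for one separated component
    (e.g. G8 or G9).  sensor_length plays the role of Sf."""
    s = np.asarray(separated_output, dtype=float)
    n = np.arange(s.size)
    w_a = n / (s.size - 1)            # WA: assumed ramp rising from 0 to 1
    w_b = 1.0 - w_a                   # WB: assumed ramp falling from 1 to 0
    c_a = s * w_a                     # multiplication outputs CA
    c_b = s * w_b                     # multiplication outputs CB
    s_a = c_a.sum()                   # integrated value SA
    s_b = c_b.sum()                   # integrated value SB
    # X = Sf * ((SA - SB) / (SA + SB) + 1) / 2
    return sensor_length * ((s_a - s_b) / (s_a + s_b) + 1.0) / 2.0
```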

However, obtaining this centroid position X in software has the disadvantage that the calculation takes too much time.

Therefore, a processing method that can calculate the centroid position X with a shorter calculation time is adopted here.

First, the obtained separated outputs G8 and G9 are reversed with respect to the position coordinates to generate the reversed separated outputs G8' and G9' shown in FIG. 19.

According to this method, by calculating the phase difference between the separated outputs G8, G9 before reversal and the separated outputs G8', G9' after reversal, the centroid positions can be obtained with approximately the same accuracy as the method described above. This phase difference can be obtained by the same kind of correlation calculation as the phase-difference detection method used in known single-lens reflex cameras having an autofocus optical system. It has long been known that, with an interpolation calculation, this kind of calculation yields the result with an accuracy of one part in several tens to several hundreds of the pixel resolution of the sensor.

Unlike the photographing of a completely unpredictable subject, in the case of this line-of-sight direction detection device 46 the pattern of the image that will be obtained is predictable: when the reflected light from the fundus and the reflected light forming the first Purkinje image PI are imaged as spots on the one-dimensional line sensor 53, left-right symmetric separated outputs G8 and G9 are obtained. Therefore, when a separated output has a simple pattern, as shown in FIG. 21, for example, the midpoint between the rising position coordinate and the falling position coordinate can be expected to lie approximately at the centroid position. Accordingly, when the phase difference is detected, the calculation time can be shortened by performing the calculation only in the vicinity of that midpoint.

Specifically, let the output of the one-dimensional line sensor 53 be S(n), where n denotes the address of a photoelectric element 53a of the one-dimensional line sensor. Focusing on address n and address n+1, a difference output E(n) of the separated output is generated. The difference E(n) is obtained from

E(n) = S(n+1) - S(n).

In this way, a differential output B1 such as that shown in FIG. 21 is obtained.

Next, if the coordinates at which E(n) becomes maximum and minimum are t1 and t2, respectively, the centroid position can be expected to lie approximately at (t1 + t2)/2.

Then, let the reversed separated output obtained when the position coordinates are reversed be G'', and generate the corresponding output R(n). The differential output B1' corresponding to this output R(n) is as shown by the solid line. Here, with m denoting the total number of bits (elements), the centroid position can be obtained by performing the correlation calculation for obtaining the phase difference of R(n) with respect to S(n) only in the vicinity of m - (t1 + t2).

The phase difference between B1 and B1' can be obtained in the same way.

That is, if the phase difference of R(n) with respect to S(n), or the phase difference between B1 and B1', is t, then the centroid position of S(n) measured from the center coordinate of the sensor is obtained as t/2.
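The shortened procedure can be sketched as follows in Python, for one separated component held in an array. The reversed sequence R(n) is obtained simply by reading S(n) backwards, the coarse position comes from the extrema of E(n), and the phase difference t is searched only in a small window; the window size, the sum-of-absolute-differences correlation test and the omission of the sub-element interpolation mentioned above are simplifying assumptions.

```python
import numpy as np

def fast_centroid_offset(s):
    """Estimate the centroid of a separated output S(n): R(n) is S read in
    reverse address order, t1/t2 are the extrema of E(n) = S(n+1) - S(n),
    and the returned value is the centroid offset t/2 from the sensor
    centre, in element pitches (no sub-element interpolation)."""
    s = np.asarray(s, dtype=float)
    m = s.size
    r = s[::-1]                                   # R(n): no separate memory area is required
    e = np.diff(s)                                # E(n) = S(n+1) - S(n)
    t1, t2 = int(np.argmax(e)), int(np.argmin(e))
    coarse = (t1 + t2) // 2                       # rough centroid address
    centre_t = 2 * coarse - (m - 1)               # expected phase difference near the coarse estimate
    window = max(4, abs(t2 - t1))                 # search only around the expected value
    best_t, best_err = centre_t, np.inf
    for t in range(centre_t - window, centre_t + window + 1):
        err = np.abs(s - np.roll(r, t)).sum()     # correlation test (sum of absolute differences)
        if err < best_err:
            best_t, best_err = t, err
    return best_t / 2.0                           # centroid offset from the sensor centre
```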

By using such an arithmetic algorithm, a high-accuracy line-of-sight direction detection device can be realized.

Incidentally, unless the method of obtaining the phase difference between B1 and B1' is adopted, R(n) corresponds to the addresses of the memory in which S(n) is stored; therefore, if the data are read out from those addresses in reverse order, there is no need to set aside a memory area for generating R(n), and memory can be saved.

Likewise, the purpose of generating E(n) is to find the addresses of its maximum and minimum, not to obtain E(n) itself, so no storage area is needed for it either.

Now, in the optical system of the line-of-sight direction detection device 46 of the preceding example, the light transmitting system 46A and the light receiving system 46B were incorporated in the camera body on the side of the pentaprism 40 opposite to the finder loupe 44. As a result, light reflected at the refracting surfaces of the optical elements constituting the light transmitting system 46A and the light receiving system 46B is guided into the light receiving system 46B as ghost light, and a ghost is formed on the one-dimensional line sensor 53 of the light receiving system 46B together with the first Purkinje image PI, leaving the problem that it is difficult to distinguish the ghost from the first Purkinje image PI.

Next, therefore, an optical system of a camera line-of-sight direction detection device in which ghost light is kept out of the light receiving system 46B as far as possible will be described.

FIGS. 31 to 35 are explanatory views of the optical system of that camera line-of-sight direction detection device in which ghost light is kept out of the light receiving system 46B as far as possible; components identical to those of the optical system shown in FIG. 2 are given substantially the same reference numerals.

Here, the light transmitting system 46A includes a light source 48 that generates infrared light, a total reflection mirror 149 and a collimator lens 150. Surface A of the collimator lens 150 is an aspherical surface. The infrared light emitted from the light source 48 is reflected by the total reflection mirror 149 and guided to the collimator lens 150. A stop (mask) 151 is provided on the exit-side surface of the collimator lens 150. The collimator lens 150 has the function of converting the infrared light emitted from the light source 48 into a parallel light beam.

On the side of the finder loupe 44 facing the eye 45, a coaxial-forming optical member 152 is provided for making the optical axis of the light transmitting system 46A coaxial with the optical axis of the light receiving system 46B. Here, this coaxial-forming optical member 152 is constituted as a rectangular parallelepiped made up of prisms 154 and 155 having a reflecting surface 153. The coaxial-forming optical member 152 has a transmitting surface 156 facing the eye 45, a transmitting surface 157 facing the transmitting surface 156 across the reflecting surface 153, and a transmitting surface 157' facing the collimator lens 150.

A mask 158 is provided on the transmitting surface 156.

Here, in order to avoid ghosts caused by reflection at the transmitting surfaces of the coaxial-forming optical member 152, the transmitting surfaces 156 and 157 are tilted very slightly with respect to the optical axis of the light receiving system, and the transmitting surface 157' is tilted very slightly with respect to the optical axis of the light transmitting system. In this embodiment the tilt angle of each of the transmitting surfaces 156, 157 and 157' with respect to the respective optical axis is 1 degree; since the transmitting surfaces 156, 157 and 157' all have the same tilt angle, the situation is the same as if a plane-parallel plate were inserted, and the tilt causes almost no change in the aberrations.

The reflecting surface 153 is here of a type that partially transmits infrared light and transmits visible light. Since the reflecting surface 153 transmits visible light, the photographer can see the subject image formed on the focusing screen 42. The parallel light beam that has passed through the stop 151 is reflected by the reflecting surface 153 toward the eye 45 and is projected onto the photographer's eye 45 placed at the eyepoint. In this embodiment the prism block described above is used as the coaxial-forming optical member 152, but a mirror that partially transmits infrared light and transmits visible light may be used instead.

The corneal specular reflection light beam that forms the first Purkinje image PI and the light beam reflected from the fundus are guided back to the coaxial-forming optical member 152, pass through its reflecting surface 153 and are guided to the finder loupe 44. As before, the finder loupe 44 is composed of the lenses 44a and 44b. The light receiving system 46B is composed here of a compensator prism 159, the reduction lens 50, a total reflection mirror 161, the re-imaging lens 52 and the one-dimensional line sensor 53. As shown on an enlarged scale in FIG. 33, a mask 54 of the same construction as described above is provided on the re-imaging lens 52, on the surface facing the one-dimensional line sensor 53.

In this example too, it is preferable that the light receiving system 46B be free of distortion, and it is desirable that, with respect to object height, the light amount distribution on the one-dimensional line sensor 53 be substantially uniform. When the optical system is configured as described below, the light amount distribution on the one-dimensional line sensor 53 can be kept substantially uniform within the required range of object heights, as shown in FIG. 34, and the distortion can be kept to 1 μm or less, as shown in FIG. 35.

(1) Design values of the light transmitting system 46A
Radius of curvature of the exit surface of the light source 48: infinity
Optical-axis distance from the exit surface of the light source 48 to the total reflection mirror 149: 7.7 mm
Distance from the total reflection mirror 149 to surface A of the collimator lens 150: 7.3 mm
Collimator lens 150: surface A radius of curvature 10.00 mm; surface B radius of curvature -28.00 mm; refractive index 1.48304; center thickness 4.00 mm
Optical-axis distance from surface B of the collimator lens 150 to the mask 151: 4.00 mm
Mask 151: thickness 0.04 mm; radius of curvature infinity
Optical-axis distance from the mask 151 to the transmitting surface 157': 0.66 mm
Transmitting surface 157': radius of curvature infinity; tilt with respect to the optical axis 1 degree
Refractive index of the coaxial-forming optical member 152: 1.50871
Optical-axis distance from the transmitting surface 157' to the transmitting surface 156: 12 mm
Transmitting surface 156: radius of curvature infinity; tilt with respect to the optical axis 1 degree
Optical-axis distance from the transmitting surface 156 to the cornea 32: 13 mm
Radius of curvature of the cornea 32: 7.980 mm

Surface A of the collimator lens 150 is an aspherical surface, designed using the aspherical-lens imaging formula given below.

The sag amount X was obtained and the surface designed with k = -3.165, a4 = -2.95 × 10⁻…, and a6 = 0 in the following formula:

X = c·h^2 / (1 + sqrt(1 - (k + 1)·c^2·h^2)) + a4·h^4 + a6·h^6

where c is the reciprocal of the radius of curvature of surface A of the collimator lens 150, h is the height of the point from the optical axis, and k is the aspherical coefficient.
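For reference, the sag formula above in its standard even-asphere form can be evaluated as in the following sketch; the exponent of a4 is not legible in the source, so the value used in the example call is only an assumed magnitude.

```python
from math import sqrt

def aspheric_sag(h, r, k, a4=0.0, a6=0.0):
    """Sag X of a rotationally symmetric aspherical surface:
        X = c*h**2 / (1 + sqrt(1 - (k + 1)*c**2*h**2)) + a4*h**4 + a6*h**6
    where c = 1/r is the vertex curvature, h the height from the optical
    axis and k the aspherical coefficient."""
    c = 1.0 / r
    return c * h**2 / (1.0 + sqrt(1.0 - (k + 1.0) * c**2 * h**2)) + a4 * h**4 + a6 * h**6

# Illustrative use with the values quoted for surface A of the collimator
# lens 150 (r = 10.00 mm, k = -3.165); a4 = -2.95e-5 is an assumed magnitude.
sag_at_2mm = aspheric_sag(h=2.0, r=10.00, k=-3.165, a4=-2.95e-5)
```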

(2) Design values of the light receiving system 46B
Radius of curvature of the cornea 32: -7.980 mm
Optical-axis distance from the cornea 32 to the transmitting surface 156: 13 mm
Transmitting surface 156: tilt with respect to the optical axis -1 degree; radius of curvature infinity
Refractive index of the coaxial-forming optical member 152: 1.50871
Optical-axis distance from the transmitting surface 156 to the transmitting surface 157: 10 mm
Transmitting surface 157: tilt with respect to the optical axis -1 degree; radius of curvature infinity
Optical-axis distance from the transmitting surface 157 to surface A of the lens 44a: 0.60 mm
Lens 44a: surface A radius of curvature 115.895 mm; center thickness 1.2 mm; refractive index 1.69747; surface B radius of curvature -29.210 mm
Lens 44b: surface B radius of curvature -29.210 mm; center thickness 4.92 mm; refractive index 1.61187; surface C radius of curvature -47.880 mm
Optical-axis distance from surface C to surface A of the pentaprism 40: 1.00 mm
Pentaprism 40: surface A radius of curvature infinity; refractive index 1.50871; surface B radius of curvature infinity; tilt of surface B with respect to the optical axis -24 degrees; optical-axis distance from surface A to surface B 28.80 mm
Optical-axis distance from surface B to surface A of the compensator prism 159: 0.14 mm
Compensator prism 159: surface A radius of curvature infinity; tilt of surface A with respect to the optical axis -24 degrees; surface B radius of curvature infinity; optical-axis distance from surface A to surface B 3 mm; refractive index 1.50871
Distance from surface A to the mask 159': 0 mm
Mask 159': thickness 0.04 mm; radius of curvature infinity
Optical-axis distance from the mask 159' to surface A of the reduction lens 50: 0.10 mm
Reduction lens 50: surface A radius of curvature 11.71 mm; thickness 2.50 mm; surface B radius of curvature -60.140 mm; refractive index 1.48304
Optical-axis distance from surface B to the total reflection mirror 161: 3.00 mm
Radius of curvature of the total reflection mirror 161: infinity
Optical-axis distance from the total reflection mirror 161 to the re-imaging lens 52: 1.60 mm
Re-imaging lens 52: surface A radius of curvature 1.520 mm; refractive index 1.48304; center thickness 1.520 mm; surface B radius of curvature infinity
Distance from surface B to the mask 54: 0.00 mm
Mask 54: radius of curvature infinity; thickness 0.04 mm

Surface A of the reduction lens 50 is an aspherical surface and was designed with K = -1.25, a4 = -8 × 10⁻… and a6 = -10⁻… in the above formula.

FIGS. 36 to 38 are views for explaining a second embodiment of the line-of-sight direction detection optical system of a camera according to the present invention. In this embodiment, the light transmitting system 46A is provided on the side of the pentaprism 40 opposite to the finder loupe 44, and the light receiving system 46B is provided on the side of the transmitting surface 157' of the coaxial-forming optical member 152. The infrared light emitted from the light source 48 is guided through the compensator prism 159 and the pentaprism 40 to the finder loupe 44, which converts the infrared light into a parallel light beam and projects it onto the eye 45; the light beam forming the first Purkinje image PI based on corneal specular reflection of the eye 45 and the light reflected from the fundus are reflected by the reflecting surface 153 of the coaxial-forming optical member 152 and guided to the light receiving system 46B. The other optical components are substantially the same as in the first embodiment, and their optical characteristics are also substantially the same as in the first embodiment, as shown in FIGS. 37 and 38, so only the design values are given below.

(1) Design values of the light transmitting system 46A
Radius of curvature of the exit surface of the light source 48: infinity
Optical-axis distance from the exit surface of the light source 48 to the total reflection mirror 149: 17 mm
Radius of curvature of the total reflection mirror 149: infinity
Optical-axis distance from the total reflection mirror 149 to the mask 159': 31 mm
Mask 159': thickness 0.04 mm; radius of curvature infinity
Distance from the mask 159' to surface B of the compensator prism 159: …
Compensator prism 159: surface B radius of curvature infinity; distance from surface A to surface B 3 mm; surface A radius of curvature infinity; tilt of surface A with respect to the optical axis -24 degrees
Optical-axis distance from surface A to surface B of the pentaprism 40: 0.14 mm
Pentaprism 40: surface B radius of curvature infinity; tilt of surface B with respect to the optical axis -24 degrees; refractive index 1.50871; surface A radius of curvature infinity; optical-axis distance from surface B to surface A 28.80 mm
Optical-axis distance from surface A to surface C of the lens 44b: 1.00 mm
Lens 44b: surface C radius of curvature -47.880 mm; surface B radius of curvature -29.210 mm; center thickness 4.92 mm; refractive index 1.61187
Lens 44a: surface B radius of curvature -29.210 mm; surface A radius of curvature -115.895 mm; center thickness 1.2 mm; refractive index 1.69747
Optical-axis distance from surface A to the transmitting surface 157: 0.60 mm
Transmitting surface 157: radius of curvature infinity; tilt with respect to the optical axis 2 degrees
Refractive index of the coaxial-forming optical member 152: 1.50871
Optical-axis distance from the transmitting surface 157 to the transmitting surface 156: 10 mm
Transmitting surface 156: radius of curvature infinity; tilt with respect to the optical axis 2 degrees
Optical-axis distance from the transmitting surface 156 to the cornea 32: 13 mm
Radius of curvature of the cornea 32: 7.980 mm

(2) Design values of the light receiving system 46B
Radius of curvature of the cornea 32: -7.980 mm
Optical-axis distance from the cornea 32 to the transmitting surface 156: 13 mm
Transmitting surface 156: radius of curvature infinity; tilt with respect to the optical axis -2 degrees
Optical-axis distance from the transmitting surface 156 to the transmitting surface 157': 12 mm
Refractive index of the coaxial-forming optical member 152: 1.50871
Transmitting surface 157': tilt with respect to the optical axis -2 degrees; radius of curvature infinity
Optical-axis distance from the transmitting surface 157' to the mask 151: 0.66 mm
Distance between the mask 151 and the reduction lens 50: 0.00 mm
Mask 151: radius of curvature infinity; thickness 0.04 mm
Reduction lens 50: surface A radius of curvature -28.00 mm; thickness 4.00 mm; surface B radius of curvature -10.00 mm; refractive index 1.48304
Optical-axis distance from surface B to the total reflection mirror 161: 7.30 mm
Radius of curvature of the total reflection mirror 161: infinity
Optical-axis distance from the total reflection mirror 161 to surface A of the re-imaging lens 52: 5.7 mm
Re-imaging lens 52: surface A radius of curvature 2.00 mm; refractive index 1.48304; center thickness 2.00 mm; surface B radius of curvature infinity
Distance from surface B to the mask 54: 0.00 mm
Mask 54: radius of curvature infinity; thickness 0.04 mm

Surface B of the reduction lens 50 is an aspherical surface and was designed with K = -3.165, a4 = 2.95 × 10⁻… and a6 = 0 in the above formula.

According to this line-of-sight direction detection device, the occurrence of ghosts at the light receiving section can be avoided as far as possible.

【発明の効果】[Effects of the Invention]

As described above, in the line-of-sight direction detection device for a camera according to the present invention, a light transmitting system that guides a parallel light beam to the photographer's eye, a light receiving system that has a light receiving section and receives the reflected light forming a first Purkinje image based on corneal specular reflection of that eye and the reflected light from the fundus of the eye, and a processing circuit for detecting the line-of-sight direction of the photographer's eye based on the light-receiving output of the light receiving section are provided in the camera body. This has the effect that the line-of-sight direction of the eye of the photographer looking into the camera can be detected.

Furthermore, in a camera whose finder is provided with a plurality of focusing zones, the autofocus optical system corresponding to the selected focusing zone can be selected and driven automatically.

【図面の簡単な説明】[Brief explanation of the drawing]

FIGS. 1 to 5 relate to an example in which the line-of-sight direction detection device according to the present invention is applied to a single-lens reflex camera.
FIG. 1 is an explanatory view showing how the line-of-sight direction detection device according to the present invention is arranged in the camera.
FIGS. 2 and 3 are detailed views of the line-of-sight direction detection device.
FIG. 4 is an enlarged view of the re-imaging lens shown in FIGS. 2 and 3.
FIG. 5 is a schematic view of the line-of-sight direction detection device.
FIG. 6 is a graph of the spherical aberration when the reduction lens shown in FIGS. 2 and 3 is not made aspherical, and FIG. 7 is a graph of the distortion when the spherical aberration shown in FIG. 6 is present.
FIG. 8 is a graph of the spherical aberration when the reduction lens shown in FIGS. 2 and 3 is made aspherical, and FIG. 9 is a graph of the distortion when the spherical aberration shown in FIG. 8 is absent.
FIGS. 10 and 11 are schematic views showing the relationship among the line-of-sight direction detection device of the camera according to the present invention, the re-imaging lens, the finder loupe, the photographer's eye and the one-dimensional line sensor.
FIGS. 12 and 13 are schematic views for explaining the problems that arise when a one-dimensional line sensor is used as the light-receiving element of the line-of-sight direction detection optical system.
FIG. 14 is an explanatory view of the correction processing means for correcting the light amount attenuation at the peripheral portion of the re-imaging lens, and FIG. 15 is a block diagram of a processing circuit having that correction processing means.
FIG. 16 is a schematic view showing the relationship between the actually obtained light amount distribution and the one-dimensional line sensor.
FIGS. 17 and 18 are explanatory views of the image separation processing means, and FIGS. 19 to 21 are explanatory graphs for obtaining the centroid position of the image separation output distribution.
FIGS. 22 to 24 explain the detection principle of the line-of-sight direction detection device according to the present invention: FIG. 22 shows how a light spot is formed when a convex mirror is irradiated with a parallel light beam, FIG. 23 shows how a first Purkinje image is formed when the cornea of the eye is irradiated with a parallel light beam, and FIG. 24 is an enlarged view of the eye for explaining the relationship between the first Purkinje image and the center of the pupil.
FIGS. 25 and 26 are explanatory views for obtaining the line-of-sight direction of the eye by calculation from the first Purkinje image and the center of the pupil.
FIG. 27 is a perspective view schematically showing the arrangement of an improved autofocus optical system of a single-lens reflex camera, FIG. 28 is an explanatory view of the relationship between the exit pupil and the aperture region as seen from the focusing zone of the autofocus optical system that is optically substantially conjugate with the central focusing zone of the finder of that camera, FIG. 29 is a plan view of the finder of that single-lens reflex camera, and FIG. 30 is an explanatory view of the relationship between the exit pupil shown in FIG. 27 and the aperture region when the exit pupil is vignetted.
FIGS. 31 to 35 explain a further example of the optical system of the line-of-sight direction detection device according to the present invention: FIG. 31 shows the configuration of that optical system, FIG. 32 is an enlarged view of its main part, FIG. 33 is an enlarged view of the re-imaging lens shown in FIG. 31, and FIGS. 34 and 35 are explanatory views of the optical characteristics of the optical system shown in FIG. 31.
FIGS. 36 to 38 explain another example of the optical system shown in FIG. 31: FIG. 36 is an optical diagram showing the main part of that optical system, and FIGS. 37 and 38 are explanatory views of its optical characteristics.
FIG. 39 shows the schematic configuration of the autofocus optical system of a conventional single-lens reflex camera, FIG. 40 is a perspective view schematically showing the arrangement of the autofocus optical system shown in FIG. 39, FIG. 41 is an explanatory view of focusing by that autofocus optical system, FIG. 42 is an explanatory view of the detection output of the CCD of that autofocus optical system, FIG. 43 is an explanatory view of the conventional arrangement of the focusing zones in the finder, and FIG. 44 is an explanatory view of the photographing procedure when a photograph in which the desired subject is shifted from the center to the left or right is to be obtained with that conventional single-lens reflex camera.

9: autofocus optical system; 16: finder; 17: central focusing zone; 18, 19: autofocus optical systems for peripheral focusing; 26, 27: peripheral focusing zones; 28, 29: focusing zones; 32: cornea; 34: center of the pupil; 40: pentaprism; 44: finder loupe; 45: photographer's eye; 46: line-of-sight direction detection device; 46A: light transmitting system; 46B: light receiving system; 48: infrared light source; 50: reduction lens; 52: re-imaging lens; 53: one-dimensional line sensor; 53a: photoelectric element; 55: aperture; 58: microcomputer; 152: coaxial-forming optical member; 156, 157: transmitting surfaces; PI: first Purkinje image; θ: rotation angle; Hi: correction coefficient; Xi, Xj: outputs; G6: output component corresponding to the fundus reflected light; G7: output component corresponding to the first-Purkinje-image-forming reflected light; G8: separated output corresponding to the fundus reflected light; G9: separated output corresponding to the first-Purkinje-image-forming reflected light.

Claims (18)

【特許請求の範囲】[Claims] (1)撮影者の眼に平行光束を導く送光系と、受光部を
有しかつ前記眼の角膜鏡面反射に基づき第1プルキンエ
像を形成する反射光と前記眼の眼底からの反射光とを受
光する受光系と、前記受光部の受光出力に基づき前記撮
影者の眼の視線方向を検出するための処理回路と、 がカメラ本体に設けられていることを特徴とするカメラ
の視線方向検出装置。
(1) A light transmitting system that guides a parallel light beam to the photographer's eye, and a light receiving unit that includes reflected light that forms a first Purkinje image based on corneal specular reflection of the eye and reflected light from the fundus of the eye. and a processing circuit for detecting the line-of-sight direction of the photographer's eye based on the light-receiving output of the light-receiving section, the camera body being provided with: Device.
(2)前記送光系と前記受光系とは、ペンタプリズムを
境に少なくともその一方がファインダールーペと反対側
で前記カメラ本体に組み込まれていることを特徴とする
請求項1に記載のカメラの視線方向検出装置。
(2) The camera according to claim 1, wherein at least one of the light transmitting system and the light receiving system is incorporated into the camera body on a side opposite to the finder magnifying glass with a pentaprism as a boundary. Gaze direction detection device.
(3)前記平行光束が赤外光であることを特徴とする請
求項1に記載のカメラの視線方向検出装置。
(3) The line-of-sight direction detection device for a camera according to claim 1, wherein the parallel light beam is infrared light.
(4)前記送光系は、ファインダールーペを介して前記
撮影者の眼に向けて平行光束として出射される赤外光を
発生する赤外光源を有し、 前記受光系は、前記角膜鏡面反射に基づき第1プルキン
エ像を形成する反射光と前記眼の眼底からの反射光とを
縮小して結像させる縮小レンズを有することを特徴とす
る請求項1に記載のカメラの視線方向検出装置。
(4) The light transmitting system includes an infrared light source that generates infrared light that is emitted as a parallel beam of light toward the photographer's eye via a finder loupe, and the light receiving system includes the corneal specular reflection. 2. The line-of-sight direction detection device for a camera according to claim 1, further comprising a reduction lens that reduces the reflected light forming the first Purkinje image and the reflected light from the fundus of the eye to form an image.
(5)前記縮小レンズは、少なくとも一方が非球面であ
り、前記受光系には、前記第1プルキンエ像を形成する
反射光を再結像させる再結像レンズが設けられ、該再結
像レンズの曲率中心に位置させて開口が設けられると共
に、前記縮小レンズの焦点が前記再結像レンズの曲率中
心に位置されている請求項4に記載のカメラの視線方向
検出装置。
(5) At least one of the reduction lenses has an aspherical surface, and the light receiving system is provided with a re-imaging lens that re-images the reflected light forming the first Purkinje image, and the re-imaging lens 5. The line-of-sight direction detection device for a camera according to claim 4, wherein an aperture is provided at a center of curvature of the reduction lens, and a focal point of the reduction lens is located at a center of curvature of the re-imaging lens.
(6)前記カメラ本体には、ファインダーの視野内に複
数個の合焦用ゾーンが設けられ、該合焦用ゾーンと略光
学的に共役な位置に該合焦用ゾーンに対応するオートフ
ォーカス光学系の合焦ゾーンが設けられ、前記処理回路
は、前記ファインダーの各合焦用ゾーンのいずれか一つ
が選択されたことを自動的に感知することを特徴とする
請求項1に記載のカメラの視線方向検出装置。
(6) The camera body is provided with a plurality of focusing zones within the field of view of the finder, and an autofocus optical system corresponding to the focusing zone is provided at a position substantially optically conjugate with the focusing zone. 2. The camera according to claim 1, wherein the camera is provided with focusing zones of the viewfinder, and the processing circuit automatically senses that any one of the focusing zones of the finder is selected. Gaze direction detection device.
(7)前記処理回路は前記ファインダーの各合焦用ゾー
ンのうち、選択された合焦用ゾーンに対応するオートフ
ォーカス光学系を駆動させる駆動回路に接続されている
ことを特徴とする請求項6に記載のカメラの視線方向検
出装置。
(7) The processing circuit is connected to a drive circuit that drives an autofocus optical system corresponding to a selected focusing zone among the focusing zones of the finder. A camera line-of-sight direction detection device described in .
(8)前記受光系は、角膜鏡面反射に基づき第1プルキ
ンエ像を形成する反射光を前記受光部に再結像させる再
結像レンズを備え、前記受光部は前記複数個のオートフ
ォーカス光学系の合焦ゾーンに対応させて配列された光
電素子を有する一次元ラインセンサから構成され、前記
再結像レンズと前記一次元ラインセンサとの間に開口を
有するマスクが設けられ、前記再結像レンズが、前記一
次元ラインセンサの光電素子の配列方向と直交する方向
に長く延びる像を形成するシリンドリカルレンズである
ことを特徴とする請求項1に記載のカメラの視線方向検
出装置。
(8) The light-receiving system includes a re-imaging lens that re-images the reflected light forming the first Purkinje image based on corneal specular reflection on the light-receiving section, and the light-receiving section is connected to the plurality of autofocus optical systems. A mask having an opening is provided between the re-imaging lens and the one-dimensional line sensor, and a mask having an opening is provided between the re-imaging lens and the one-dimensional line sensor. 2. The line-of-sight direction detection device for a camera according to claim 1, wherein the lens is a cylindrical lens that forms an elongated image in a direction perpendicular to the arrangement direction of the photoelectric elements of the one-dimensional line sensor.
(9)前記受光系は角膜鏡面反射に基づき第1プルキン
エ像を形成する反射光を前記受光部に再結像させる再結
像レンズを備え、前記受光部は前記複数個のオートフォ
ーカス光学系の合焦ゾーンに対応させて配列された光電
変換素子を有する一次元ラインセンサから構成され、前
記再結像レンズと前記一次元ラインセンサとの間に開口
を有するマスクが設けられ、前記再結像レンズが、前記
一次元ラインセンサの光電素子の配列方向と直交する方
向に長く延びる像を形成するトーリックレンズであるこ
とを特徴とする請求項1に記載のカメラの視線方向検出
装置。
(9) The light-receiving system includes a re-imaging lens that re-images the reflected light forming the first Purkinje image on the light-receiving section based on corneal specular reflection; It is composed of a one-dimensional line sensor having photoelectric conversion elements arranged corresponding to the focusing zone, and a mask having an opening is provided between the re-imaging lens and the one-dimensional line sensor, and the re-imaging 2. The line-of-sight direction detection device for a camera according to claim 1, wherein the lens is a toric lens that forms an elongated image in a direction perpendicular to the arrangement direction of the photoelectric elements of the one-dimensional line sensor.
(10)前記受光部は一次元ラインセンサからなり、前
記処理回路は前記一次元ラインセンサからの出力を、一
のスライスレベルで処理することによって瞳孔の周縁に
対応する瞳孔周縁対応座標を求めると共に、他のスライ
スレベルで処理することにより第1プルキンエ像に対応
するプルキンエ像対応座標を求め、第1プルキンエ像の
中心座標と前記瞳孔の中心座標とを演算して、前記眼の
視線方向を検出することを特徴とする請求項1に記載の
カメラの視線方向検出装置。
(10) The light receiving section is composed of a one-dimensional line sensor, and the processing circuit processes the output from the one-dimensional line sensor at one slice level to obtain coordinates corresponding to the pupil periphery, and , obtain Purkinje image corresponding coordinates corresponding to the first Purkinje image by processing at another slice level, calculate the center coordinates of the first Purkinje image and the center coordinates of the pupil, and detect the line of sight direction of the eye. The line-of-sight direction detection device for a camera according to claim 1.
(11)前記受光部は一次元ラインセンサから構成され
、前記処理回路は前記一次元ラインセンサからの出力を
、眼底からの反射光に対応する眼底反射光対応出力成分
と第1プルキンエ像を形成する反射光に対応する第1プ
ルキンエ像形成反射光対応出力成分とに分離する分離手
段を備え、分離された眼底反射光対応出力成分の重心位
置と第1プルキンエ像形成反射光対応出力成分の重心位
置とをそれぞれ求め、眼の視線方向を検出することを特
徴とする請求項1に記載のカメラの視線方向検出装置。
(11) The light receiving section is composed of a one-dimensional line sensor, and the processing circuit converts the output from the one-dimensional line sensor into a fundus reflected light corresponding output component corresponding to the reflected light from the fundus and forms a first Purkinje image. the center of gravity of the separated output component corresponding to the fundus reflected light and the center of gravity of the output component corresponding to the first Purkinje image forming reflected light; 2. The camera's line-of-sight direction detection device according to claim 1, wherein the camera's line-of-sight direction is detected by determining the position and position of the eye, respectively.
(12)前記受光系は角膜鏡面反射に基づき第1プルキ
ンエ像を形成する反射光を前記一次元ラインセンサに再
結像させる再結像レンズを備え、前記処理回路は、該再
結像レンズの光量分布特性に基づく周辺部入射光量の減
少を補正する補正手段を備えていることを特徴とする請
求項11に記載のカメラの視線方向検出装置。
(12) The light receiving system includes a re-imaging lens that re-images the reflected light forming the first Purkinje image based on corneal specular reflection on the one-dimensional line sensor, and the processing circuit includes a re-imaging lens that re-images the reflected light forming the first Purkinje image based on corneal specular reflection, 12. The line-of-sight direction detection device for a camera according to claim 11, further comprising a correction means for correcting a decrease in the amount of light incident on the peripheral portion based on the light amount distribution characteristics.
(13)前記分離された眼底反射光対応出力成分と第1
プルキンエ像形成反射光対応出力成分とを、ビット反転
させて、第1プルキンエ像の位置と瞳孔の位置とを求め
ることを特徴とする請求項12に記載のカメラの視線方
向検出装置。
(13) The separated output component corresponding to the fundus reflected light and the first
13. The line-of-sight direction detection device for a camera according to claim 12, wherein the position of the first Purkinje image and the position of the pupil are determined by bit-inverting the output component corresponding to the Purkinje image forming reflected light.
(14)ファインダールーペを覗く眼に向かって検出光
を平行光束として出射する送光系と前記眼の角膜鏡面反
射に基づき虚像を形成する検出光を受光部に再結像させ
る受光系とを備え、前記ファインダールーペの前記眼に
臨まされる側に、前記送光系の光軸と前記受光系の光軸
とを共軸とするための共軸形成用光学部材が設けられて
いることを特徴とするカメラの視線方向検出装置。
(14) A light transmitting system that emits detection light as a parallel light beam toward the eye looking through the finder magnifying glass, and a light receiving system that reimages the detection light that forms a virtual image on a light receiving unit based on specular reflection of the cornea of the eye. , characterized in that a coaxial forming optical member for making the optical axis of the light transmitting system and the optical axis of the light receiving system coaxial is provided on the side of the finder magnifying glass facing the eye. A camera line-of-sight direction detection device.
(15)前記受光系は、前記共軸形成用光学部材と前記
受光部との間に、縮小レンズと再結像レンズとを備え、
前記縮小レンズは少なくとも一面が非球面であることを
特徴とする請求項14に記載のカメラの視線方向検出装
置。
(15) The light receiving system includes a reduction lens and a reimaging lens between the coaxial forming optical member and the light receiving section,
The line-of-sight direction detection device for a camera according to claim 14, wherein at least one surface of the reduction lens is an aspherical surface.
(16) The line-of-sight direction detection device for a camera according to claim 14, wherein the coaxial-forming optical member is a mirror that transmits light in the visible region and both reflects and transmits light in the infrared region.
(17) The line-of-sight direction detection device for a camera according to claim 16, wherein a prism having a reflecting surface is used in place of the mirror.
(18) The line-of-sight direction detection device for a camera according to claim 17, wherein the prism has a transmitting surface facing the eye and a transmitting surface facing the finder loupe, the two transmitting surfaces being opposed to each other across the reflecting surface, and at least the transmitting surface facing the eye is slightly inclined with respect to the common optical axis.
JP63143259A 1987-06-11 1988-06-10 Camera gaze direction detection device Expired - Fee Related JP2859270B2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
DE19883841575 DE3841575C2 (en) 1987-12-17 1988-12-09 Camera with a viewfinder system with a device for determining the viewing direction of the user
DE3844912A DE3844912C2 (en) 1987-12-17 1988-12-09 Method for compensating evaluation errors of a device for determining the viewing direction of the user of a camera
DE3844907A DE3844907C2 (en) 1987-12-17 1988-12-09 Camera with a viewfinder with a device for determining the viewing direction of the user
US07/982,427 US5327191A (en) 1987-06-11 1992-11-27 Eye direction detecting apparatus
US08/370,367 US5583606A (en) 1987-06-11 1995-01-09 Eye direction detecting apparatus
US08/462,688 US5557364A (en) 1987-12-17 1995-06-05 Eye direction detecting apparatus

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP14606787 1987-06-11
JP62-146067 1987-06-11
JP31933787 1987-12-17
JP62-319337 1987-12-17
JP12356288 1988-05-20
JP63-123562 1988-05-20

Publications (2)

Publication Number Publication Date
JPH025A true JPH025A (en) 1990-01-05
JP2859270B2 JP2859270B2 (en) 1999-02-17

Family

ID=27314743

Family Applications (1)

Application Number Title Priority Date Filing Date
JP63143259A Expired - Fee Related JP2859270B2 (en) 1987-06-11 1988-06-10 Camera gaze direction detection device

Country Status (1)

Country Link
JP (1) JP2859270B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180007255A1 (en) * 2016-06-30 2018-01-04 Thalmic Labs Inc. Image capture systems, devices, and methods that autofocus based on eye-tracking

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS55143511A (en) * 1979-04-27 1980-11-08 Hoya Corp Binoculars that can be automatically focused and their focusing method
JPS6161135A (en) * 1984-09-03 1986-03-28 Omron Tateisi Electronics Co Automatic focusing camera

Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5245381A (en) * 1990-08-20 1993-09-14 Nikon Corporation Apparatus for ordering to phototake with eye-detection
US5333029A (en) * 1990-10-12 1994-07-26 Nikon Corporation Camera capable of detecting eye-gaze
EP0680726A2 (en) 1990-10-12 1995-11-08 Nikon Corporation Camera capable of detecting eye-gaze
EP0680723A2 (en) 1990-10-12 1995-11-08 Nikon Corporation Camera capable of detecting eye-gaze
EP0680725A2 (en) 1990-10-12 1995-11-08 Nikon Corporation Camera capable of detecting eye-gaze
EP0680724A2 (en) 1990-10-12 1995-11-08 Nikon Corporation Camera capable of detecting eye-gaze
US5848175A (en) * 1991-05-27 1998-12-08 Canon Kabushiki Kaisha View point detecting device
US5758201A (en) * 1991-06-28 1998-05-26 Nikon Corporation Camera having line of sight detecting device
JPH0534574A (en) * 1991-07-31 1993-02-12 Victor Co Of Japan Ltd Image pickup device
US5335035A (en) * 1992-02-24 1994-08-02 Olympus Optical Co., Ltd. Visual line direction detecting device for the camera
US5627586A (en) * 1992-04-09 1997-05-06 Olympus Optical Co., Ltd. Moving body detection device of camera
US5408292A (en) * 1992-09-07 1995-04-18 Canon Kabushiki Kaisha Visual axis detecting device
US5526089A (en) * 1992-09-14 1996-06-11 Nikon Corporation Camera with sight line detecting device
US5491532A (en) * 1992-09-14 1996-02-13 Nikon Corporation Camera with device for detecting line of sight
US5426483A (en) * 1992-09-14 1995-06-20 Nikon Corporation Camera with a line of sight detecting device
US6035054A (en) * 1992-10-29 2000-03-07 Canon Kabushiki Kaisha Visual axis detection apparatus and optical apparatus provided therewith
US5416317A (en) * 1993-02-06 1995-05-16 Nikon Corporation Visual line detecting device
US5703637A (en) * 1993-10-27 1997-12-30 Kinseki Limited Retina direct display device and television receiver using the same
US6299409B1 (en) 1998-04-10 2001-10-09 Denso Corporation Centrifugal type blower unit
US6715540B2 (en) 2000-04-28 2004-04-06 Denso Corporation Air-conditioning apparatus for vehicle
US6638371B1 (en) 2002-03-29 2003-10-28 Kawasaki Steel Corporation Cold-rolled steel sheet having ultrafine grain structure and method for manufacturing the same
KR100916312B1 (en) * 2002-09-27 2009-09-11 주식회사 케이티 An apparatus for transmitting video using adaptive weighted error correction coding and multiple description coding and method thereof
US7575619B2 (en) 2005-03-29 2009-08-18 Hitachi Powdered Metals Co., Ltd. Wear resistant sintered member
JP2006315145A (en) * 2005-05-13 2006-11-24 Muramoto Kogu Kk Eccentric sleeve
WO2007042931A1 (en) 2005-10-13 2007-04-19 Nissan Motor Co., Ltd. Vehicle driving assist system
EP2017486A1 (en) 2005-10-13 2009-01-21 Schaeffler AG Radial bearing
KR100781852B1 (en) * 2005-10-27 2007-12-03 김명진 Production of material to remove red tide using dredged sediment and its removal method
KR100838715B1 (en) * 2005-12-16 2008-06-16 국립암센터 Peptides for inhibiting Transglutaminase
KR100785932B1 (en) * 2006-01-09 2007-12-14 에스케이 텔레콤주식회사 Management scheme in network management system
KR100821311B1 (en) * 2006-03-24 2008-04-10 대한민국(관리부서:농촌진흥청) Method to make functional brewed vinegar of colored barley and brewed vinegar beverage made by the method
KR100762962B1 (en) * 2006-05-04 2007-10-04 한국과학기술원 Method for preparing culture media using genome information and in silico analysis
KR100769681B1 (en) * 2006-05-15 2007-10-23 고려대학교 산학협력단 Circular anastomosis stapling instrument
KR100833901B1 (en) * 2006-06-14 2008-06-03 김성국 Method of carrying out underground pile with expanded bulbs and pile with expanded bulbs thereof
KR100832522B1 (en) * 2006-06-14 2008-05-27 대한민국(관리부서:농촌진흥청) Detecting method for carbamate based and organic phosphate based agricultural chemicals
KR100809866B1 (en) * 2006-06-16 2008-03-06 연세대학교 산학협력단 Method and apparatus for detecting or sorting apoptotic cells using microfulidic channel and magnetic field
KR100819729B1 (en) * 2006-06-22 2008-04-07 한국과학기술연구원 Preparation method of clay/biodegradable polyester nanocomposite using supercritical fluid and nanocomposite obtained thereby
KR100792630B1 (en) * 2006-07-05 2008-01-09 고려대학교 산학협력단 Bio-marker for diagnosing diabetic nephropathy
KR100801929B1 (en) * 2006-07-05 2008-02-12 건국대학교 산학협력단 Novel endonuclease derived from Thermus thermophilus its amino acid sequence glycosylase gene its nucleotide sequence and processes for preparing the same
KR100803532B1 (en) * 2006-08-01 2008-02-14 민재윤 - DF20KCTC10942BP Lactobacillus salivarius sp. salivarius DF20 having been Acid-tolerant Bile-tolerant Antibacterial activity and possesed Alpha-galactosidase
KR100794384B1 (en) * 2006-08-02 2008-01-15 한국에너지기술연구원 Production and purification method for hydrogenase of thiocapsa roseopersicina
KR100746705B1 (en) * 2006-08-17 2007-08-06 삼성전자주식회사 Apparatus and method for transmitting and receiving digital broadcasting signal
KR100846479B1 (en) * 2006-08-21 2008-07-17 삼성에스디아이 주식회사 Organic electrolytic solution comprising electro-grafting monomer, and lithium battery employing the same
KR100836569B1 (en) * 2006-08-25 2008-06-10 경남대학교 산학협력단 A pharmaceutical composition and food additive containing a extract of Styela clava
KR100788789B1 (en) * 2006-08-29 2007-12-27 고려대학교 산학협력단 Bio-marker proteins for diagnosing exposure to formaldehyde
KR100752683B1 (en) * 2006-09-14 2007-08-29 명지대학교 산학협력단 A valiolone synthase its gene and primer for valienamine biosynthesis and method for producing the same
KR100818979B1 (en) * 2006-09-14 2008-04-04 학교법인 포항공과대학교 Dialog management apparatus and method for chatting agent
WO2008035740A1 (en) 2006-09-20 2008-03-27 Bridgestone Corporation Information display panel drive method
KR100809804B1 (en) * 2006-09-28 2008-03-04 한국전력공사 Light foam concrete composition using bottom ash, used for sound absorbtion materials, light foam concrete product employing the same and the manufacturing method thereof
KR100773050B1 (en) * 2006-09-29 2007-11-02 주식회사농심 Grape seed oil which improves oxidative stability and method thereof
KR100796534B1 (en) * 2006-10-12 2008-01-21 대림산업 주식회사 Low heat building concrete composition using 3type combine and 3type latent heat storage composition
KR100790407B1 (en) * 2006-10-12 2008-01-02 한국전기연구원 Composition of lead-free piezoelectric ceramics and method for manufacturing the same
KR100796773B1 (en) * 2006-10-18 2008-01-22 전북대학교산학협력단 Preparing method of the phyngang for browning prevention and moisture sorption
KR100806201B1 (en) * 2006-10-30 2008-02-22 광주과학기술원 Generating method for three-dimensional video formation using hierarchical decomposition of depth image, and device for the same, and system and storage medium therefor
KR100813271B1 (en) * 2006-11-09 2008-03-13 삼성전자주식회사 Method and apparatus for disrupting cell or virus and amplifying nucleic acids using gold nanorod
KR100855772B1 (en) * 2006-11-21 2008-09-01 주식회사 삼천리 Adsorbent to adsorb the sulfurour gas contained in fuel gas, and desulfurization equipement in fuel cell system using such adsorbent
KR100834811B1 (en) * 2006-11-28 2008-06-09 고려대학교 산학협력단 CoFeSiB/Pt multilayers exhibiting perpendicular magnetic anisotropy
KR100877600B1 (en) * 2006-11-30 2009-01-08 재단법인서울대학교산학협력재단 Pharmaceutical composition comprising metadoxine and garlic oil for preventing and treating alcohol-induced fatty liver and steatohepatitis
KR100764613B1 (en) * 2006-12-15 2007-10-08 재단법인서울대학교산학협력재단 Fabrication of mesoporous carbon/conducting polymer nanocomposite and application to electrode of electro-double layer capacitor
KR100808675B1 (en) * 2006-12-21 2008-02-29 한국항공우주연구원 Manufacturing method for regenerative cooling thrust chamber nozzle
KR100839055B1 (en) * 2007-01-19 2008-06-19 고려대학교 산학협력단 Alumina-ceria catalyst comprising copper oxide
KR100854594B1 (en) * 2007-02-09 2008-08-27 전남대학교산학협력단 - -5 - PAS-flagellin fusion protein with improved Toll-like receptor 5 stimulating activity
KR100842420B1 (en) * 2007-02-14 2008-07-01 주식회사 씨비엔바이오텍 Method of bioreactor culture of echinacea purpurea adventitious roots
KR100855299B1 (en) * 2007-02-16 2008-08-29 건국대학교 산학협력단 Monoclonal antibodies specific against il-32 antigens, hybridoma producing the monoclonal antibodies and diagnostic systems using the monoclonal antibodies
KR100842376B1 (en) * 2007-03-07 2008-07-01 한빔 주식회사 Preparation method of zns:mn nanoparticle
KR100886650B1 (en) * 2007-04-13 2009-03-06 주식회사 진켐 Novel 2,3-Sialyltransferase and Method for Producing Compound Having Galatose in Terminal Using the Same
KR100830719B1 (en) * 2007-05-29 2008-05-20 한국화학연구원 Synthetic methods for liquid hydrocarbons from syngas over alumina-silica based catalysts and preparation methods thereof
KR100837377B1 (en) * 2007-05-29 2008-06-12 한국화학연구원 Preparation methods for liquid hydrocarbons from syngas by using the zirconia-aluminum oxide-based fischer-tropsch catalysts
WO2008156091A1 (en) 2007-06-18 2008-12-24 Nippon Sheet Glass Company, Limited Glass composition
KR100906560B1 (en) * 2007-06-27 2009-07-07 대한주택공사 Treatment method of greywater using constructed wetland and apparatus thereof
WO2009014092A1 (en) 2007-07-23 2009-01-29 Tdk Corporation Ceramic substrate, process for producing the same, and dielectric-porcelain composition
KR100903952B1 (en) * 2007-08-13 2009-06-25 충남대학교산학협력단 A method for preparing of hydrophilic zeolite membrane
KR100902368B1 (en) * 2007-09-06 2009-06-11 삼성생약주식회사 Pharmaceutical composition for the prevention and treatment of impotency containing extract of gastrodia elata
KR100901319B1 (en) * 2007-09-27 2009-06-09 한국전력공사 System and method for intelligent distribution automation
KR100906993B1 (en) * 2007-11-09 2009-07-08 한국에너지기술연구원 Power control system for fuel cell hybrid power system and Power control method
KR100904871B1 (en) * 2007-12-28 2009-06-26 한국과학기술원 Tip testing method
DE112009003597B4 (en) 2008-11-26 2021-12-16 Toyota Jidosha Kabushiki Kaisha Energy transmission device for a vehicle
DE112010000429B4 (en) 2009-03-31 2021-12-23 Aisin Aw Co., Ltd. Information management system for a drive device and method of manufacturing the drive device
EP2361754A2 (en) 2010-02-26 2011-08-31 FUJIFILM Corporation Lens array
EP2371910A1 (en) 2010-03-30 2011-10-05 Fujifilm Corporation Ink composition, inkjet recording method and process for producing molded printed material
US20160114329A1 (en) * 2010-04-23 2016-04-28 Metso Minerals, Inc. Wear part, processing apparatus and processing plant for mineral material
WO2012014509A1 (en) 2010-07-30 2012-02-02 株式会社サイバー・ソリューションズ Unauthorized access blocking control method
WO2012128225A1 (en) 2011-03-18 2012-09-27 新日本製鐵株式会社 Steel sheet for hot-stamped member and process for producing same
WO2012144600A1 (en) 2011-04-22 2012-10-26 日立金属株式会社 Steel for solid oxide fuel cells having excellent oxidation resistance, and member for solid oxide fuel cells using same
DE112011105766B4 (en) 2011-10-27 2021-11-11 Mitsubishi Electric Corporation Program logic controller
DE112013002238B4 (en) 2012-04-25 2021-11-11 Hitachi, Ltd. Railroad car body structure with shock absorbing structure
WO2015155366A1 (en) 2014-04-11 2015-10-15 Bayer Materialscience Ag Composition for producing transparent polythiourethane bodies
DE102015116478B4 (en) 2014-09-30 2021-11-04 Ngk Insulators, Ltd. Heat / acoustic wave conversion component and heat / acoustic wave conversion unit
KR20160113999A (en) 2015-03-23 2016-10-04 재영솔루텍 주식회사 Wide Angle Lens System for Camera of Vehicle
DE102016202729B4 (en) 2015-03-30 2021-11-11 Honda Motor Co., Ltd. Handle switch for vehicle
DE112016002749B4 (en) 2015-06-18 2021-12-23 Denso Corporation ELECTRIC ACTUATOR
EP3141942A1 (en) 2015-09-02 2017-03-15 Olympus Corporation Laser microscope and microscopy method
WO2017067805A1 (en) 2015-10-19 2017-04-27 Basf Se Sandwich structure including a vip and method for producing the same
DE112018000271B4 (en) 2017-01-20 2021-12-09 Panasonic Intellectual Property Management Co., Ltd. Imaging device
DE102017130901B4 (en) 2017-01-25 2021-12-23 Denso Corporation Fuel injection controller
DE102017223269B4 (en) 2017-03-10 2021-12-16 Mitsubishi Electric Corporation Semiconductor module and power converter arrangement
DE102019120604B4 (en) 2018-08-20 2021-12-16 Intelligrated Headquarters, Llc Sorting conveyor system
DE102020205124B4 (en) 2019-05-28 2021-11-04 Yazaki Corporation Heat dissipation structure
DE102020128451B3 (en) 2020-10-29 2021-11-04 Alan E. Baklayan Fractal antenna, in particular for a therapy device for treating patients, a belt and a therapy device for treating patients with the aid of such a fractal antenna

Also Published As

Publication number Publication date
JP2859270B2 (en) 1999-02-17

Similar Documents

Publication Publication Date Title
JPH025A (en) Device for detecting eye direction of camera
US5280312A (en) Visual axis detection apparatus
US5327191A (en) Eye direction detecting apparatus
JP3108697B2 (en) Focus detection device
US6112029A (en) Camera, exchangeable lens, and camera system
US4841326A (en) Apparatus for detecting the focus adjusted state of an objective optical system
US7075735B2 (en) Stereo imaging system
JP2886865B2 (en) Focus state detection device
JP2893768B2 (en) Focus detection device
US4370551A (en) Focus detecting device
US4812023A (en) Zoom finder
US5557364A (en) Eye direction detecting apparatus
JPH02264632A (en) Sight line detector
US20070002161A1 (en) Focus detecting apparatus and image pickup apparatus
JPS60233610A (en) Range finding device
US5583606A (en) Eye direction detecting apparatus
US6369854B2 (en) Distance detecting device for an optical system
JP2787489B2 (en) Eye gaze detection device
US20020186971A1 (en) AF auxiliary light projector for AF camera
US6686577B2 (en) Device detecting focus of a taking lens without decrease in focus detection accuracy when the auto focus area is disposed at an inclination relative to the photographic field
JP2004198701A (en) Focus detecting optical system and camera provided with the same
JP2003207813A (en) Image pickup device
JP2004354435A (en) Stereoscopic imaging apparatus
JPS60247631A (en) Single lens reflex camera with focus detector
JPH09269527A (en) Finder system provided with sight line detecting means

Legal Events

Date Code Title Description
LAPS Cancellation because of no payment of annual fees