JP2005242428A - Driver face imaging device - Google Patents

Driver face imaging device

Info

Publication number
JP2005242428A
Authority
JP
Japan
Prior art keywords
driver
face image
eye
face
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2004047642A
Other languages
Japanese (ja)
Inventor
Masayuki Kaneda
雅之 金田
Shinobu Nagaya
忍 長屋
Kinya Iwamoto
欣也 岩本
Haruo Matsuo
治夫 松尾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd
Priority to JP2004047642A
Publication of JP2005242428A
Legal status: Pending

Landscapes

  • Blocking Light For Cameras (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Image Input (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a driver face imaging device that captures face images suited to image recognition by effectively suppressing or preventing the influence of light reflected from the front and back surfaces of eyeglass lenses.
SOLUTION: The driver face imaging device has image capturing means for capturing images of a vehicle driver's face, eye detection means for detecting the position of the driver's eyes from each face image, and near-infrared illumination means that reflects near-infrared light off the vehicle roof to illuminate the driver's face indirectly.
COPYRIGHT: (C)2005,JPO&NCIPI

Description

The present invention relates to a driver face imaging device, and more particularly to a face imaging device suited to recognizing the face of a driver who is wearing eyeglasses.

Many devices have been proposed that capture an image of the driver's face and process the resulting image to extract feature points of the face (such as the eyes), either to detect driving states such as inattention or drowsiness, or to determine the driver's point of gaze while driving.

For example, a face imaging device is known in which an optical filter is placed on the optical axis of the camera that captures the face image, and a near-infrared illuminator that lights the driver's face with near-infrared light in the wavelength band transmitted by that filter is placed at a position forming at least a predetermined angle with the camera's optical axis. This suppresses, as far as possible, the influence of light reflected from the front surfaces of the eyeglass lenses, so that the driver's face can be imaged even when the driver is wearing eyeglasses (see, for example, Patent Document 1).
Patent Document 1: JP-A-9-21611

However, because the near-infrared illuminator is a point light source and the freedom of in-cabin layout is limited, even if reflections of the forward scenery on the front surfaces of the eyeglass lenses can be suppressed, specular reflection of the illuminator's point source still occurs on the back surfaces of the lenses. This specular reflection appears as a high-luminance point in the captured image, so the reflected image of the illuminator's emission point on the lens back surface becomes an obstacle to image recognition. Moreover, when this high-luminance point overlaps the eye, it becomes difficult to capture the shape of the eye correctly even in the original image.

A feature of the present invention is a driver face imaging device comprising image capturing means for capturing a face image of a vehicle driver, eye detection means for detecting the position of the driver's eyes from the face image, and near-infrared illumination means that reflects near-infrared light off the roof of the vehicle to illuminate the driver's face indirectly.

According to the present invention, it is possible to provide a driver face imaging device that captures face images suited to image recognition by effectively suppressing or preventing the influence of light reflected from the front and back surfaces of the eyeglass lenses.

Embodiments of the present invention are described below with reference to the drawings. In the drawings, identical or similar parts are given identical or similar reference numerals.

As shown in FIG. 1, the driver face imaging device according to an embodiment of the present invention comprises a camera 1, as an example of image capturing means, that captures a face image of a vehicle driver 6; a processing device 2 that detects the driving state of the driver 6 from the face images captured by the camera 1; near-infrared illumination means (near-infrared illuminator) 3 that reflects near-infrared light off the roof portion 5 of the vehicle to illuminate the face of the driver 6 indirectly; and an alarm device 4 that receives the detection result from the processing device 2 and alerts the driver 6. The processing device 2 includes eye detection means 10 that detects the position of the eyes of the driver 6 from the face image, eye tracking means 11, and open/closed-eye and drowsiness determination means 12.
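As an illustration only (this code is not part of the patent text; every class and method name is hypothetical, and the control flow is a simplified version of the FIG. 3 flowchart described later), the component structure of FIG. 1 can be sketched as a small pipeline in which the processing device 2 wires together the eye detection means 10, eye tracking means 11, and determination means 12:

```python
from dataclasses import dataclass
from typing import Any, Optional, Tuple

Point = Tuple[int, int]  # (x, y) eye-center coordinates in the face image


@dataclass
class ProcessingDevice:
    """Hypothetical sketch of processing device 2 in FIG. 1."""
    eye_detector: Any      # eye detection means 10 (full-image search)
    eye_tracker: Any       # eye tracking means 11 (small-region tracking)
    drowsiness_judge: Any  # open/closed-eye and drowsiness determination means 12


class DriverFaceImagingDevice:
    """Camera 1 feeds frames to processing device 2; alarm device 4 warns the driver."""

    def __init__(self, camera: Any, processing: ProcessingDevice, alarm: Any) -> None:
        self.camera = camera
        self.proc = processing
        self.alarm = alarm

    def run_once(self) -> None:
        frame = self.camera.capture()                      # S01: capture a face image
        eye: Optional[Point] = self.proc.eye_tracker.track(frame)
        if eye is None:                                    # no valid tracking region yet
            eye = self.proc.eye_detector.detect(frame)     # S03: search the whole image
            self.proc.eye_tracker.reset(eye)               # S04: set the tracking region
        if eye is not None and self.proc.drowsiness_judge.update_and_check(eye):
            self.alarm.warn()                              # S11: alert the driver
```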

The near-infrared illuminator 3 does not illuminate the face of the driver 6 with near-infrared light directly; instead, it illuminates the vehicle roof portion 5 directly, and the near-infrared light reflected by the roof portion 5 reaches the face of the driver 6. In other words, by using reflection off the vehicle roof portion 5, the near-infrared illuminator 3 avoids obstacles to direct near-infrared illumination, such as the steering wheel 8, and illuminates the face of the driver 6 indirectly.

Next, with reference to FIGS. 2(a) to 2(d), the case of performing face image recognition of a driver wearing eyeglasses with the face imaging device of FIG. 1 is described. When the forward scenery is reflected uniformly over the entire eye in the eyeglass lens, as in FIG. 2(a), the eye features can be captured accurately by adjusting the image preprocessing parameters, as in FIG. 2(b). However, when the forward scenery is reflected only over the lower part of the eye, as in FIG. 2(c), it is difficult to capture the eye features accurately merely by adjusting the preprocessing parameters, as in FIG. 2(d). The phenomenon in which the forward scenery is reflected only over the lower part of the eye, as in FIG. 2(c), is influenced by the radius of curvature of the lens front surface and by the fore-aft and vertical position of the driver's face. The boundary line of the reflection overlapping the eye corresponds to the upper edge of the vehicle windshield: the lower part of the lens, where the reflection occurs, shows the forward scenery seen through the windshield, while the upper part of the lens, where no reflection occurs, shows the roof surface of the vehicle.

Therefore, even when the reflection of the forward scenery shown in FIG. 2(c) occurs, the face imaging device of FIG. 1 can illuminate the face indirectly by directing near-infrared illumination at the vehicle roof portion 5; because the roof surface is thereby brightened, the contrast difference that splits the lens at its center can also be evened out. This puts the image in a state favorable for capturing the eye features in the preprocessing stage of image recognition. In this way, the face imaging device of FIG. 1 effectively suppresses or prevents the influence of light reflected from the eyeglass lens surfaces even for a driver wearing eyeglasses, so that images suited to image recognition can be captured and the robustness of driver-state detection, such as drowsiness detection, is improved.

Next, the flow of the processing operation of the face imaging device of FIG. 1 is described with reference to the flowchart of FIG. 3.

(a) First, in step S01, the camera 1 captures the face of the driver 6, and the face image is input to the processing device 2. In step S02, it is determined whether an eye tracking region has been set. Immediately after the process starts, no eye tracking region is set (No in step S02), so the process proceeds to step S03, where the eye detection means 10 of FIG. 1 detects the eye positions from the entire face image.

Various methods of detecting the eye positions from the entire face image in step S03 have been devised; one example is described here with reference to FIGS. 4 to 7. On each vertical line (Xa, Xb, ...) of the image shown in FIG. 4, points are extracted at which the change in density value along the vertical direction satisfies a predetermined condition. The results are plotted on the same two-dimensional plane, as shown in FIG. 5. As shown in FIG. 6, extracted points on adjacent vertical lines that lie close to each other in the vertical direction are grouped together. Groups G1 to G6 in FIG. 6 capture features that appear as horizontally elongated dark regions, and with a relatively simple method they can be identified as right eyebrow G1, left eyebrow G2, right eye G3, left eye G4, nose G5, and mouth G6. Then, as shown in FIG. 7, the locations where the facial features captured as continuous data appear are divided into several zones (ZONE:L, ZONE:C, ZONE:R), and the eye positions are detected by examining the relative positional relationships of the features.
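Purely as an illustrative sketch (the patent describes the method only at the level above; the scanning step, threshold, and grouping tolerances below are assumptions), the vertical-line extraction and grouping could look like this:

```python
import numpy as np


def extract_candidate_points(gray: np.ndarray, step: int = 4, thresh: float = 20.0):
    """Scan vertical lines (Xa, Xb, ...) and keep points with a sharp vertical density drop.

    gray   : 2-D grayscale face image (darker features have lower values)
    step   : horizontal spacing between scanned vertical lines
    thresh : assumed minimum density change that counts as a candidate point
    """
    points = []  # (x, y) candidate points, as in FIG. 5
    for x in range(0, gray.shape[1], step):
        column = gray[:, x].astype(np.float32)
        diff = np.diff(column)                  # density change along the vertical direction
        for y in np.where(diff < -thresh)[0]:   # bright-to-dark transitions
            points.append((x, int(y)))
    return points


def group_points(points, step: int = 4, y_tol: int = 3):
    """Group points on adjacent vertical lines that lie close together vertically.

    Horizontally elongated dark groups correspond to the eyebrows, eyes, nose and
    mouth (G1 to G6 in FIG. 6). The tolerances used here are assumptions.
    """
    groups = []
    for x, y in sorted(points):
        for g in groups:
            gx, gy = g[-1]
            if x - gx <= step and abs(y - gy) <= y_tol:
                g.append((x, y))
                break
        else:
            groups.append([(x, y)])
    return [g for g in groups if len(g) >= 3]   # keep only horizontally elongated groups
```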

(b) In step S04, the eye tracking means 11 stores the eye position coordinates detected by selecting the continuous data, and sets a small region containing the eye as the eye tracking region, using the eye position coordinates as a reference.

(c) In step S05, the eye tracking means 11 detects the eye position within the eye tracking region. Specifically, continuous data are extracted in the same way as when the eye positions were detected in step S03, except that the extraction is limited to the tracking region 20a of the eye 21a shown in FIG. 8(a). In addition, because this process must detect the eye position accurately within a smaller region than in the full-image detection, the scanning density of the vertical lines (Xa, Xb, ...) shown in FIG. 4 is also increased.

(d) In step S06, the eye tracking means 11 checks whether the eye is being tracked correctly. If tracking has failed, that is, if the eye is not being tracked correctly (No in step S06), including cases where no object corresponding to an eye could be detected in step S05 or where the shape features of the detected object's continuous data do not match those of an eye, the process proceeds to step S07, the eye tracking region 20a is cleared, and the process returns to the image input of step S01. In that case it is determined in step S02 that no eye tracking region exists, so the process moves to step S03 and the eye positions are detected again from the entire image. If the eye is being tracked correctly (Yes in step S06), the process proceeds to step S08.

(e) In step S08, the eye tracking means 11 updates the eye tracking region. Specifically, the eye center position identified in step S05 is stored as a reference, and the eye tracking region is updated on the basis of that position. The tracking region 20a of the eye 21a shown in FIG. 8(a) is the eye tracking region set in step S05, and the eye center position 21a coincides with the center of the tracking region 20a. When a new face image is then acquired in step S01 and the eye position has changed in that image, the eye 21b drawn darkly in FIG. 8(b) is the eye position in the newly acquired image, and the lightly drawn eye 21a is the eye position in the previous frame. Updating the eye tracking region in step S08 means moving the tracking region so that it is centered on the eye position 21b in the newly acquired image, as shown in FIG. 8(c): the eye position 21b in the new image and the center of the new tracking region 20b coincide. By continuing to update the tracking region positions 20a, 20b on the basis of the eye center coordinates detected within them, the eye can be tracked continuously. Furthermore, if the size of the tracking regions 20a, 20b is set to the minimum necessary, taking into account how far the eye can move between image acquisitions, the processing becomes faster and follows the eye better.
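The region-update behavior of steps S02 to S08 can be sketched as below; this is a hedged illustration, not code from the patent, and the half-window sizes are assumptions (the patent gives no numbers, only the guideline that the region should just cover the largest eye movement expected between frames):

```python
from typing import Callable, Optional, Tuple

Point = Tuple[int, int]              # (x, y) eye center
Region = Tuple[int, int, int, int]   # (x0, y0, x1, y1) tracking region


def make_tracking_region(center: Point, half_w: int, half_h: int,
                         img_w: int, img_h: int) -> Region:
    """Build a small region (20a/20b in FIG. 8) centered on the detected eye (21a/21b)."""
    cx, cy = center
    return (max(0, cx - half_w), max(0, cy - half_h),
            min(img_w, cx + half_w), min(img_h, cy + half_h))


def track_step(region: Optional[Region], frame, img_w: int, img_h: int,
               detect_in_region: Callable, detect_full: Callable,
               half_w: int = 24, half_h: int = 16):
    """One S02-S08 pass: track inside the region, fall back to full-image detection."""
    if region is not None:
        eye = detect_in_region(frame, region)                 # S05
        if eye is not None:                                   # S06: tracking succeeded
            return eye, make_tracking_region(eye, half_w, half_h, img_w, img_h)  # S08
        region = None                                         # S07: clear the region
    eye = detect_full(frame)                                  # S03
    if eye is None:
        return None, None
    return eye, make_tracking_region(eye, half_w, half_h, img_w, img_h)          # S04
```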

(f) Once the eyes of the driver 6 can be tracked as described above, then in step S09 the open/closed-eye and drowsiness determination means 12 determines the open or closed state of the eye within the tracking regions 20a, 20b, and in step S10 it determines whether the driver 6 is dozing by judging the pattern of eye closures occurring within a predetermined time. If it is determined in step S10 that the driver 6 is dozing, the alarm device 4 outputs a warning to alert the driver 6 in step S11.
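The patent does not specify the closed-eye pattern criterion used in step S10. As one hedged illustration only, a sliding-window closed-eye ratio (a PERCLOS-style measure) could serve; the window length and threshold below are assumptions, not values from the patent:

```python
from collections import deque


class DrowsinessJudge:
    """Illustrative S09/S10 logic: judge dozing from the recent pattern of eye closures."""

    def __init__(self, window_frames: int = 150, closed_ratio_threshold: float = 0.7):
        self.history = deque(maxlen=window_frames)   # recent open/closed flags
        self.threshold = closed_ratio_threshold

    def update_and_check(self, eye_is_closed: bool) -> bool:
        """Record one frame's eye state and return True if a dozing pattern is seen."""
        self.history.append(eye_is_closed)
        if len(self.history) < self.history.maxlen:
            return False                              # not enough history yet
        closed_ratio = sum(self.history) / len(self.history)
        return closed_ratio >= self.threshold         # sustained closure -> warn (S11)
```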

(Modification)
As shown in FIG. 9, the driver face imaging device according to a modification further has, in addition to the configuration of the driver face imaging device shown in FIG. 1, an optical filter 9 that is placed on the optical axis of the camera 1 and transmits light in a predetermined wavelength band including near-infrared light. The rest of the configuration is the same as that of the face imaging device shown in FIG. 1, and the processing operation is also the same as that shown in FIG. 3, so their description is omitted.

To suppress reflections on the front surfaces of the lenses of the eyeglasses 7, it is desirable to use near-infrared light at a level that the driver 6 does not perceive as glare. However, as shown in FIG. 10, the wavelength band of near-infrared illumination is narrow compared with the visible band, so a large output from the near-infrared illuminator 3 would be needed to compete with sunlight, which spans the entire spectrum. For this reason, a band-pass or low-pass filter (in the sense of passing the longer-wavelength band of the illuminator) that transmits only the wavelength band of the near-infrared illuminator 3 is generally placed on the optical axis of the camera 1, which allows a lower-output near-infrared illuminator 3 to reduce reflections on the eyeglass lenses. However, when eyeglasses 7 with an anti-reflection coating are involved, the coating is effective mainly in the visible range, so reflections on the lenses tend to become stronger in images captured under near-infrared light. These reflections can be suppressed to some extent by irradiating with strong near-infrared light, but that also weakens the contrast of the eye itself, so it is not a desirable countermeasure when the eye is the target of image recognition. Instead, by changing the transmission characteristics of the optical filter 9, the ability to handle eyeglasses 7 with anti-reflection coatings can be improved; that is, reflections of the forward scenery can be reduced efficiently for eyeglass lenses with anti-reflection coatings (multi-coating, AR coating).
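As a toy illustration only (the patent describes the two filter variants of FIGS. 11(c) and 11(f) qualitatively; the 630 nm and 780 nm cut-on wavelengths and the ideal step shape below are assumptions), the difference between the two filter choices can be modeled as follows:

```python
def filter_transmission(wavelength_nm: float, passes_red: bool) -> float:
    """Toy step-function model of the two optical-filter variants discussed in the text.

    passes_red=False : roughly the FIG. 11(c) filter (blocks visible light, passes near-IR)
    passes_red=True  : roughly the FIG. 11(f) filter (also passes the red band)
    """
    cut_on_nm = 630.0 if passes_red else 780.0
    return 1.0 if wavelength_nm >= cut_on_nm else 0.0


# Red light near 650 nm is blocked by one variant and passed by the other.
print(filter_transmission(650.0, passes_red=False))  # 0.0
print(filter_transmission(650.0, passes_red=True))   # 1.0
```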

Consider the case where a low-pass filter (optical filter) 9 whose transmission characteristic, as shown in FIG. 11(c), passes near-infrared light while transmitting almost no visible light (wavelength: 380 nm to 780 nm) is placed on the optical axis of the camera 1. In this case, when the camera 1 photographs a driver 6 wearing eyeglasses 7 with an anti-reflection coating, the reflection of a white reflective object placed in front of the vehicle on the lens surfaces becomes extremely strong, as shown in FIG. 11(a), and the shape of the eye becomes hard to make out even in the image; naturally, detecting the eye position by image recognition also becomes difficult. Even in this state, the shape of the eye can be made to appear by irradiating with near-infrared light of sufficiently high intensity, as shown in FIG. 11(b).

Next, an example in which the shape of the eye is made even easier to capture by image recognition by changing the characteristics of the optical filter 9 under the same imaging conditions as FIG. 11(a) is described with reference to FIGS. 11(d) to 11(f). Consider the case where a low-pass filter (optical filter) 9 whose transmission characteristic, as shown in FIG. 11(f), passes the red band of visible light as well as near-infrared light is placed on the optical axis of the camera 1. In this case, when the camera 1 photographs a driver 6 wearing eyeglasses 7 with an anti-reflection coating, transmitting red light in addition to near-infrared light reduces the reflection of the white reflective object placed in front of the vehicle on the lens surfaces, as shown in FIG. 11(d), compared with FIG. 11(a).

This is due to the wavelength-dependent reflection characteristics of the anti-reflection coating on the eyeglasses shown in FIG. 12. The reflectance of this coating is about 0.3% or less over the entire visible band, which is a considerable anti-reflection effect; by comparison, the reflectance of uncoated glass and plastic is about 4%. The reason the light reflected from the lens surface of eyeglasses with a typical anti-reflection treatment looks green is that the wavelength band where the reflectance rises is around 540 nm. Conversely, as shown in FIG. 12, red light around 650 nm, where the reflectance is close to 0%, is hardly reflected by the lens and instead passes through it to the eye, so the influence of reflections can be suppressed. Therefore, when near-infrared light of the same output as in FIG. 11(b) is used, the shape of the eye can be captured even more easily, as shown in FIG. 11(e).
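For reference (standard optics, not a statement from the patent), the roughly 4% reflectance of an uncoated lens follows from the Fresnel formula at normal incidence for a refractive index of about 1.5, typical of ophthalmic glass or CR-39 plastic:

```latex
R \;=\; \left(\frac{n_2 - n_1}{n_2 + n_1}\right)^{2}
  \;=\; \left(\frac{1.5 - 1.0}{1.5 + 1.0}\right)^{2}
  \;=\; 0.04 \;\approx\; 4\%
```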

Note that the reflection characteristics shown in FIG. 12 are for an angle of incidence of 0° on the lens surface, and the wavelength band of the reflected light shifts with the angle of incidence. Because that band tends to shift toward shorter wavelengths as the angle of incidence increases, the strongly reflected green band, for example, shifts toward the violet region; consequently, even when the angle of incidence on the lenses of the driver 6 is about 20°, no problem arises from using an optical filter 9 that transmits light in the red region. In this way, for the widely used eyeglasses with anti-reflection coatings, reflections can be reduced still more effectively by using an optical filter 9 that transmits light in the red band, where reflection from the eyeglass lenses is weak.
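As background (standard single-layer thin-film optics, not text from the patent), the shift of the coating's interference features toward shorter wavelengths at oblique incidence can be seen from the destructive-interference condition for an anti-reflection film of refractive index n_f and thickness d:

```latex
2\, n_f\, d \cos\theta_t \;=\; \left(m + \tfrac{1}{2}\right)\lambda
\qquad\Longrightarrow\qquad
\lambda_{\min}(\theta) \;=\; \frac{2\, n_f\, d \cos\theta_t}{\,m + \tfrac{1}{2}\,}
```

Here θ_t is the refraction angle inside the film and m a non-negative integer; as the angle of incidence grows, cos θ_t decreases, so the wavelengths at which the coating acts move toward the blue. Real multilayer coatings behave more complexly, but the direction of the shift is the same as described above.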

(Other embodiments)
As described above, the present invention has been described by way of one embodiment and its modification, but the statements and drawings forming part of this disclosure should not be understood as limiting the invention. Various alternative embodiments, examples, and operational techniques will become apparent to those skilled in the art from this disclosure.

The near-infrared illuminator 3 shown in FIGS. 1 and 9 can be placed at various locations in the vehicle interior. Examples of illuminator positions that can illuminate the driver indirectly with near-infrared light via the vehicle roof are described with reference to FIGS. 13 to 17. FIG. 13 shows an example in which a near-infrared illuminator 31 is mounted on the top of the instrument panel, around the meter hood. Because the illumination is indirect, via the roof portion 5, the illuminator can be installed with great freedom and without risk of being shadowed by the steering wheel. FIG. 14 shows an example in which a near-infrared illuminator 32 is mounted on the top of the center console. FIG. 15 shows an example in which a near-infrared illuminator 33 is mounted on a front pillar (A-pillar). FIG. 16 shows an example in which a near-infrared illuminator 34 is mounted on the top of a door trim. As the examples of FIGS. 14 to 16 show, the range of positions from which indirect illumination via the roof is possible is wide, so various vehicle layouts can be accommodated. FIG. 17 shows an example in which a near-infrared illuminator 35 is mounted on the top of the seat back of the driver's seat. In this example, the same illumination conditions can always be maintained even when the seat position is moved forward or backward to suit drivers of different builds.

Thus, it should be understood that the present invention encompasses various embodiments and the like not described here. Accordingly, the present invention is limited only by the matters specifying the invention in the claims that are reasonable in light of this disclosure.

As described above, according to the invention of claim 1, it is possible both to suppress the reflection of the forward scenery on the front surfaces of the eyeglass lenses and to prevent the emission point of the near-infrared illuminator from being reflected on the back surfaces of the lenses.

According to the invention of claim 2, reflections of the forward scenery can be reduced efficiently for eyeglass lenses with anti-reflection coatings (multi-coating, AR coating).

According to the inventions of claims 3 to 7, because the near-infrared illumination means is installed on the top of the instrument panel, the top of the center console, a front pillar, the top of a door trim, or the top of the driver's seat back, the reflection of the forward scenery on the front surfaces of the eyeglass lenses can be suppressed, the vehicle-interior layout constraints can be satisfied, and the reflection of the emission point of the near-infrared illuminator on the back surfaces of the lenses can also be prevented.

(Reflection of the emission point of the near-infrared illuminator on the eyeglass lens surfaces)
With reference to FIGS. 18 to 20, the reflection of the emission point of the near-infrared illuminator on the front and back surfaces of the eyeglass lenses, which occurs when the conventional method is used, is described. Reflections of the near-infrared illuminator 3 on the eyeglass lenses occur both on the front surface and on the back surface of the lens. The radius of curvature of the lens front surface is roughly between 80 and 400 mm, and even at the smallest radius of 80 mm, the camera 1 and the illuminator 3 can be arranged in the cabin relatively easily so that the emission point is not reflected on the front surface. As shown in FIG. 18, unless the driver 6 looks below the camera 1, no reflection of the emission point of the illuminator 3 on the front surface of the lens (front-surface reflection point) occurs.

On the other hand, the reflection of the emission point of the illuminator 3 on the lens back surface (back-surface reflection point) occurs near the center of the lens, as shown in FIG. 19, owing to the refraction that occurs when near-infrared light passes from the air into the glass or plastic and to the radius of curvature of the lens back surface. When the eyeglasses are myopia-correcting eyeglasses of the kind also worn while driving, the position of this back-surface reflection also depends on the power of the lenses. The power of myopia-correcting eyeglasses is set by the ratio between the radius of curvature of the lens front surface and that of the lens back surface; because this ratio is larger for stronger prescriptions, the radius of curvature of the back surface tends to be smaller. Consequently, as shown in FIGS. 20(a) to 20(c), the stronger the prescription, the closer to the center of the lens the back-surface reflection point of the illuminator 3 appears. Eliminating the reflection of the near-infrared illuminator's emission point on the lens back surface, or moving it to the periphery of the lens, including for such strong prescriptions, is difficult to achieve merely by arranging the camera and illuminator within the limited freedom of the vehicle interior.
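As background (the thin-lens lensmaker's equation from standard optics, not text from the patent), the dependence of lens power on the two surface radii can be written as:

```latex
P \;=\; \frac{1}{f} \;=\; (n - 1)\left(\frac{1}{R_1} - \frac{1}{R_2}\right)
```

Here n is the refractive index of the lens material, R_1 the radius of curvature of the front surface, and R_2 that of the back surface. For a meniscus lens with both radii positive under the usual sign convention, reducing R_2 while holding R_1 fixed makes the power more negative, i.e. a stronger myopia correction, which is one way to see the tendency described above.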

(Brief description of the drawings)
FIG. 1 is a schematic diagram showing a driver face imaging device according to an embodiment of the present invention.
FIG. 2(a) is a face image, captured by the camera of FIG. 1, of a driver wearing eyeglasses in which the forward scenery is reflected uniformly over the entire eye; FIG. 2(b) shows the preprocessing parameter adjustment for the face image of FIG. 2(a); FIG. 2(c) is a face image, captured by the camera of FIG. 1, of a driver wearing eyeglasses in which the forward scenery is reflected only over the lower half of the eye; and FIG. 2(d) shows the preprocessing parameter adjustment for the face image of FIG. 2(c).
FIG. 3 is a flowchart showing the flow of the processing operation of the face imaging device of FIG. 1.
FIG. 4 is an explanatory diagram (part 1) of the eye position detection method, showing the face image divided into a plurality of vertical lines.
FIG. 5 is an explanatory diagram (part 2) of the eye position detection method, showing the points extracted on the vertical lines of FIG. 4.
FIG. 6 is an explanatory diagram (part 3) of the eye position detection method, showing continuous extracted points on adjacent vertical lines grouped together.
FIG. 7 is an explanatory diagram (part 4) of the eye position detection method, showing the locations where the groups appear divided into several zones.
FIGS. 8(a) to 8(c) are explanatory diagrams of the eye tracking method.
FIG. 9 is a schematic diagram showing a face imaging device according to a modification.
FIG. 10 is a graph comparing the widths of the visible and near-infrared wavelength bands.
FIG. 11(a) shows a face image of a driver wearing eyeglasses with an anti-reflection coating, captured with a low-pass filter that transmits almost no visible light but transmits near-infrared light placed on the optical axis of the camera; FIG. 11(b) shows a face image captured under the same conditions as FIG. 11(a) but with the intensity of the near-infrared illuminator increased; FIG. 11(c) is a graph showing the transmission characteristic of the low-pass filter used for FIGS. 11(a) and 11(b); FIG. 11(d) shows a face image of a driver wearing eyeglasses with an anti-reflection coating, captured with a low-pass filter that transmits the red band of visible light and near-infrared light placed on the optical axis of the camera; FIG. 11(e) shows a face image captured under the same conditions as FIG. 11(d) but with the intensity of the near-infrared illuminator increased; and FIG. 11(f) is a graph showing the transmission characteristic of the low-pass filter used for FIGS. 11(d) and 11(e).
FIG. 12 is a graph showing an example of the reflection characteristics of an anti-reflection coating applied to an eyeglass lens surface.
FIG. 13 is a schematic diagram showing an example in which the near-infrared illuminator is installed on the top of the instrument panel.
FIG. 14 is a schematic diagram showing an example in which the near-infrared illuminator is installed on the top of the center console.
FIG. 15 is a schematic diagram showing an example in which the near-infrared illuminator is installed on a front pillar.
FIG. 16 is a schematic diagram showing an example in which the near-infrared illuminator is installed on the top of a door trim.
FIG. 17 is a schematic diagram showing an example in which the near-infrared illuminator is installed on the top of the seat back of the driver's seat.
FIG. 18 is a diagram for explaining the reflection of the emission point of the near-infrared illuminator on the front surface of an eyeglass lens that occurs when the conventional method is used.
FIG. 19 is a diagram for explaining the reflection of the emission point of the near-infrared illuminator on the back surface of an eyeglass lens that occurs when the conventional method is used.
FIGS. 20(a) to 20(c) are diagrams showing how the position of the reflection on the back surface of the eyeglass lens shown in FIG. 19 changes with the power of the eyeglasses.

Explanation of symbols

1: imaging device (camera)
2: processing device
3, 31-35: near-infrared illuminator
4: alarm device
5: roof portion
6: driver
7: eyeglasses
8: steering wheel
9: optical filter
10: eye detection means
11: eye tracking means
12: open/closed-eye and drowsiness determination means
20a, 20b: tracking regions
21a, 21b: eyes

Claims (7)

1. A driver face imaging device comprising:
image capturing means for capturing a face image of a driver of a vehicle;
eye detection means for detecting the position of the driver's eyes from the face image; and
near-infrared illumination means that reflects near-infrared light off a roof portion of the vehicle to illuminate the driver's face indirectly.

2. The driver face imaging device according to claim 1, further comprising an optical filter placed on the optical axis of the image capturing means, the optical filter transmitting light in a predetermined wavelength band including near-infrared light.

3. The driver face imaging device according to claim 1 or 2, wherein the near-infrared illumination means is installed on an upper surface of an instrument panel of the vehicle.

4. The driver face imaging device according to claim 1 or 2, wherein the near-infrared illumination means is installed on an upper surface of a center console of the vehicle.

5. The driver face imaging device according to claim 1 or 2, wherein the near-infrared illumination means is installed on a front pillar of the vehicle.

6. The driver face imaging device according to claim 1 or 2, wherein the near-infrared illumination means is installed on an upper surface of a door trim of the vehicle.

7. The driver face imaging device according to claim 1 or 2, wherein the near-infrared illumination means is installed on an upper surface of a seat back of a driver's seat.
JP2004047642A 2004-02-24 2004-02-24 Driver face imaging device Pending JP2005242428A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004047642A JP2005242428A (en) 2004-02-24 2004-02-24 Driver face imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004047642A JP2005242428A (en) 2004-02-24 2004-02-24 Driver face imaging device

Publications (1)

Publication Number Publication Date
JP2005242428A true JP2005242428A (en) 2005-09-08

Family

ID=35024142

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004047642A Pending JP2005242428A (en) 2004-02-24 2004-02-24 Driver face imaging device

Country Status (1)

Country Link
JP (1) JP2005242428A (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8165347B2 (en) 2005-11-30 2012-04-24 Seeing Machines Pty Ltd Visual tracking eye glasses in visual head and eye tracking systems
WO2007062478A1 (en) * 2005-11-30 2007-06-07 Seeing Machines Pty Ltd Visual tracking of eye glasses in visual head and eye tracking systems
JP2008005017A (en) * 2006-06-20 2008-01-10 Honda Motor Co Ltd Observation apparatus in vehicle interior for measuring illuminance on the basis of imaged image
JP2008074146A (en) * 2006-09-19 2008-04-03 Denso Corp Mounting structure of camera for taking picture of vehicle operator
US8587656B2 (en) 2006-09-19 2013-11-19 Denso Corporation Face camera mount structure
JP2008094221A (en) * 2006-10-11 2008-04-24 Denso Corp Eye state detector, and eye state detector mounting method
US8144992B2 (en) 2006-10-11 2012-03-27 Denso Corporation Eye condition detection apparatus and method for installing same
WO2008056679A1 (en) * 2006-11-10 2008-05-15 Aisin Seiki Kabushiki Kaisha Illumination device linked with face direction, and illumination method linked with face direction
JP2008120230A (en) * 2006-11-10 2008-05-29 Aisin Seiki Co Ltd Illumination device linked with face orientation, and illumination method linked with face orientation
US8295559B2 (en) 2007-11-09 2012-10-23 Aisin Seiki Kabushiki Kaisha Face image pickup device and method
DE112008002645T5 (en) 2007-11-09 2010-07-22 Aisin Seiki K.K. Face picture taking procedure and program for this
JP2010013090A (en) * 2008-07-04 2010-01-21 Hyundai Motor Co Ltd Driver's condition monitoring system
US8797394B2 (en) 2011-08-25 2014-08-05 Denso Corporation Face image capturing apparatus
JP2015118287A (en) * 2013-12-18 2015-06-25 株式会社デンソー Face image capturing device and driver state determination device
US10019638B2 (en) 2013-12-18 2018-07-10 Denso Corporation Face image capturing device and driver condition determination device
KR101619661B1 (en) 2014-12-08 2016-05-10 현대자동차주식회사 Detection method of face direction of driver
US9704037B2 (en) 2014-12-08 2017-07-11 Hyundai Motor Company Method for detecting face direction of a person
US10798281B2 (en) 2015-04-09 2020-10-06 Bendix Commercial Vehicle System Llc Apparatus and method for disabling a driver facing camera in a driver monitoring system
WO2018150554A1 (en) * 2017-02-20 2018-08-23 マクセル株式会社 Pulse wave measuring device, mobile terminal device, and pulse wave measuring method
DE112020000918T5 (en) 2019-02-25 2021-11-18 Isuzu Motors Limited CONTROL DEVICE AND PHOTOGRAPHY SYSTEM
US11956525B2 (en) 2019-02-25 2024-04-09 Isuzu Motors Limited Control device and photographing system

Similar Documents

Publication Publication Date Title
JP6166225B2 (en) Vehicle headlamp control device
EP1683668B1 (en) Variable transmissivity window system
US8102417B2 (en) Eye closure recognition system and method
CN103582906B (en) Vehicular field of view assistance
JP2005242428A (en) Driver face imaging device
US7199767B2 (en) Enhanced vision for driving
US20020181774A1 (en) Face portion detecting apparatus
EP2288287B1 (en) Driver imaging apparatus and driver imaging method
US8218832B2 (en) Apparatus for detecting feature of driver&#39;s face
JP5941111B2 (en) Vehicle headlamp device
JP5761074B2 (en) Imaging control apparatus and program
JP5761046B2 (en) Anti-glare control device
JP2002331835A (en) Direct sunshine anti-glare device
JP2007004448A (en) Line-of-sight detecting apparatus
CN101161200A (en) Eye condition detection apparatus and method for installing same
CN106029416A (en) Sun shield
JP2020145724A (en) Apparatus and method for detecting eye position of operator, imaging apparatus with image sensor of rolling shutter driving method, and illumination control method of same
US20100060169A1 (en) Vehicle headlamp apparatus and control method thereof
JP2007025758A (en) Face image extracting method for person, and device therefor
CN111619324A (en) Intelligent anti-dazzling method and system for sight tracking automobile
CN107826116A (en) The operating method of automobile active safety drive assist system based on mobile terminal
CN112334361A (en) Information display device and information display method
KR101709402B1 (en) Driver Assistance System And Method Thereof
JP4313717B2 (en) Gaze detection device
JP2010179817A (en) Anti-dazzling device for vehicle