JPH03533A - Detecting device for driver's operating condition - Google Patents

Detecting device for driver's operating condition

Info

Publication number
JPH03533A
JPH03533A (application JP1134177A)
Authority
JP
Japan
Prior art keywords
driver
image
eyes
reference point
reference points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP1134177A
Other languages
Japanese (ja)
Inventor
Yasushi Ueno
裕史 上野
Takatoshi Seko
恭俊 世古
Tomoko Saito
斉藤 友子
Hiroshi Saito
浩 斉藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Priority to JP1134177A priority Critical patent/JPH03533A/en
Publication of JPH03533A publication Critical patent/JPH03533A/en
Pending legal-status Critical Current

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Eye Examination Apparatus (AREA)
  • Auxiliary Drives, Propulsion Controls, And Safety Devices (AREA)

Abstract

PURPOSE: To judge the operating condition of a driver by extracting the existence region of the driver's eyes from an input image processed on the basis of the reference points of a reference point indicating means, and thereby detecting the sections corresponding to the driver's irises. CONSTITUTION: An infrared image pickup means 2 captures a state in which the 4 central reference points among the reference points 5_1 through 5_n are hidden by the driver's head while 3 reference points on each of the left and right sides, 5_1 through 5_3 and 5_8 through 5_n, remain lit. Triggered by a synchronizing signal Q3 from a synchronizing signal output means 3, infrared rays from an infrared irradiation means 1 are projected onto the driver's face, and the reflection pattern together with the light from the reference points 5_1 through 5_n is imaged by the infrared image pickup means 2. An eyeball existence range specifying means 6 extracts the bright-spot images formed in the input image I(x, y) by the 3 reference points on each side, 5_1 through 5_3 and 5_8 through 5_n, and thereby specifies the range in which the driver's eyes exist. An iris detecting means 7 detects the sections corresponding to the driver's irises, and an operation judgment means 8 judges whether the driver's eyes are open or closed, so that a warning command is output to a warning output means 9.

Description

DETAILED DESCRIPTION OF THE INVENTION

Field of the Invention
The present invention relates to a device for detecting a driver's operating condition, such as whether the driver's eyes are facing forward, open, or closed.

Prior Art
Conventional devices for detecting a driver's operating condition are disclosed in, for example, JP-A-60-158303, JP-A-60-158304, JP-A-61-77705 and JP-A-61-77706. In these devices, an infrared irradiation means installed in the vehicle cabin irradiates infrared rays onto the driver's face, including both eyes, and an infrared imaging means installed in the cabin images the reflection pattern of the infrared rays, which is then image-processed into bright and dark regions.

Problems to be Solved by the Invention
However, although the above devices can recognize the position of the driver's eyes from the shape of the bright and dark regions in the image, they cannot detect whether the driver's eyes are open or closed, and in some cases they mistake the positions of the eyebrows for those of the eyes.

Means for Solving the Problems
In a device that detects the driver's operating condition by image-processing the reflection pattern of infrared rays irradiated onto the driver's face including both eyes, the invention comprises: a reference point indicating means for indicating reference points used to detect the position of the driver's eyes; an eyeball existence region defining means for extracting the region in which the driver's eyes exist from the image-processed input image on the basis of the reference points of the reference point indicating means; an iris detecting means for detecting the portions corresponding to the driver's irises within the eyeball existence region extracted by the eyeball existence region defining means; and an operating state judging means for judging the driver's operating condition from the detection result of the iris detecting means.

Operation
Since the region in which the eyeballs exist can be extracted from the image-processed input image on the basis of the reference points, the eyeball positions can be measured accurately. The portions corresponding to the irises are then detected, and from this detection result the device judges the driver's operating condition, such as whether the eyes are facing forward, open, or closed.
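The operation described above is a three-stage chain: define the eye region from the reference points (means 6), detect the iris inside it (means 7), and judge the state (means 8). A minimal sketch of that flow, with the stages injected as callables because the patent defines them only as functional means; all names here are illustrative, not from the patent:

```python
def detect_driving_state(frame, locate_eye_region, detect_iris, judge_state):
    """High-level flow of the claimed device (hypothetical decomposition)."""
    # means 6: define the eyeball existence region from the reference points
    eye_region = locate_eye_region(frame)
    if eye_region is None:          # e.g. head outside the frame (step 104)
        return None
    # means 7: find the iris-corresponding portion inside that region
    iris_value = detect_iris(frame, eye_region)
    # means 8: judge the operating condition; warning means 9 would consume it
    return judge_state(iris_value)
```

Passing the stages in as parameters keeps the sketch honest about what the patent leaves open: each means can be any implementation with the right interface.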

Embodiment
As shown in Figs. 1 to 10, this embodiment broadly comprises an infrared irradiation means 1, an infrared imaging means 2, a synchronizing signal output means 3, an image memory 4, a reference point indicating means 5, an eyeball existence region defining means 6, an iris detecting means 7, an operating state judging means 8 and a warning output means 9.

The infrared irradiation means 1 irradiates infrared rays onto the driver's face including both eyes. It is configured as a so-called infrared strobe and is attached, for example, to a part of the front of the vehicle cabin that does not obstruct the driver's forward and sideward visibility.

The infrared imaging means 2 images the reflection pattern of the infrared rays irradiated onto the driver and outputs an image signal. It is configured as a so-called infrared camera and is attached, for example, to a part of the front of the vehicle cabin where the infrared reflection pattern can be imaged without obstructing the driver's forward and sideward visibility.

The synchronizing signal output means 3 outputs to the infrared irradiation means 1 and the infrared imaging means 2 a synchronizing signal Q3 that matches the timing of the infrared irradiation by the irradiation means 1 with the reflection-pattern imaging by the imaging means 2.

The image memory 4 temporarily stores the image I(x, y) output as an image signal from the infrared imaging means 2. This image I(x, y) consists, for example, of M pixels in the horizontal X direction and N pixels in the vertical Y direction.

The reference point indicating means 5 indicates the reference points 5_1, 5_2, ..., 5_n used to detect the position of the driver's eyes. It consists of a plurality of light emitters such as LEDs, or reflectors such as mirrors. These emitters or reflectors, for example ten of them, are mounted in a horizontal row at equal intervals on the head-receiving surface of the headrest 10, which lies within the irradiation field of the infrared irradiation means 1 and is one of the parts of the cabin that do not obstruct the driver's forward and sideward visibility.

The eyeball existence region defining means 6 receives the image I(x, y) temporarily stored in the image memory 4 and extracts the region in which the driver's eyes exist on the basis of the bright-spot images formed in this input image by the reference points 5_1, 5_2, ..., 5_n, i.e., by the emitters or reflectors of the reference point indicating means 5.

The iris detecting means 7 detects the portions corresponding to the driver's irises within the eyeball existence region extracted by the eyeball existence region defining means 6.

The operating state judging means 8 judges the driver's operating condition from the detection result of the iris detecting means 7, for example a normal driving state, a looking-aside state or a dozing state.

The warning output means 9 is installed in the vehicle cabin and issues a warning by buzzer, chime, voice or the like when the operating state judging means 8 judges a looking-aside or dozing state.

The synchronizing signal output means 3, image memory 4, eyeball existence region defining means 6, iris detecting means 7, operating state judging means 8 and so on are integrated into a single control unit built around a microcomputer and attached to the vehicle body.

According to the embodiment described above, as shown in Fig. 1, the driver sits in the driver's seat with the back of the head positioned roughly at the lateral center of the headrest 10, so that the four central reference points among the ten reference points 5_1, 5_2, ..., 5_n arranged on the headrest 10 are hidden by the driver's head while the three reference points on each side, 5_1 to 5_3 and 5_8 to 5_n, remain visible to the infrared imaging means 2. In this state, in response to the synchronizing signal Q3 from the synchronizing signal output means 3, infrared rays are irradiated from the infrared irradiation means 1 onto the driver's face including both eyes; the reflection pattern and the light from the reference points 5_1 to 5_3 and 5_8 to 5_n are imaged by the infrared imaging means 2, and the image I(x, y) is temporarily stored in the image memory 4. The eyeball existence region defining means 6 then reads the image I(x, y) from the image memory 4, extracts the bright-spot images formed in the input image by the three reference points on each side, and defines the region in which the driver's eyes exist from these extracted reference points. Here the reference points 5_1 to 5_n always emit light of a predetermined wavelength at predetermined positions, so they can serve as fiducials during image processing.

The operation of the eyeball existence region defining means 6 will now be described in detail with reference to the flowchart of Fig. 3 and the explanatory diagrams of Figs. 4 to 9. First, in step 101, the image I(x, y) shown in Fig. 4 is read from the image memory 4.

Next, in step 102, the input image I(x, y) from step 101 is binarized with a threshold set at a level that extracts the reference points 5_1 to 5_n, so that only the bright-spot images remain.

The image thus generated is denoted J(x, y), as shown in Fig. 5.

J(x, y) = 1 for a reference point (bright-spot image), and 0 otherwise.

In step 103, the image J(x, y) is labeled: as shown in Fig. 6, the connected bright-spot regions are numbered 1, 2, 3, 4, 5, 6 in order from the bright-spot image with the smallest X coordinate to the one with the largest, producing the image K(x, y), and the pixel values corresponding to region numbers 1 to 6 are temporarily stored.

Specifically, if the region number is i, the pixel value i corresponding to the i-th region is temporarily stored.
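Steps 102 and 103 together are a threshold-then-label pass over the frame. A self-contained sketch using plain Python lists; the function name and the 4-connectivity flood fill are implementation choices, not specified by the patent:

```python
from collections import deque

def label_bright_points(image, threshold):
    """Steps 102-103: binarize I(x, y) so only the reference-point bright
    spots survive (J(x, y)), then number each connected bright region
    1, 2, ... in order of increasing x coordinate, giving K(x, y)."""
    h, w = len(image), len(image[0])
    binary = [[1 if image[y][x] > threshold else 0 for x in range(w)]
              for y in range(h)]                      # J(x, y)
    labels = [[0] * w for _ in range(h)]              # K(x, y)
    leftmost = []                                     # leftmost x per region
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not labels[y][x]:
                # flood-fill one connected bright region (4-connectivity)
                rid = len(leftmost) + 1
                queue, min_x = deque([(y, x)]), x
                labels[y][x] = rid
                while queue:
                    cy, cx = queue.popleft()
                    min_x = min(min_x, cx)
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           binary[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = rid
                            queue.append((ny, nx))
                leftmost.append(min_x)
    # renumber regions left to right, as the patent numbers them by x
    order = sorted(range(len(leftmost)), key=lambda i: leftmost[i])
    remap = {old + 1: new + 1 for new, old in enumerate(order)}
    labels = [[remap.get(v, 0) for v in row] for row in labels]
    return labels, len(leftmost)
```

The returned region count is what step 104 then compares against the reference number n_0 (7 in the embodiment) to decide whether the head is in the normal position.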

In step 104, it is judged whether the maximum region number i_max is equal to or less than a reference number n_0, for example 7, i.e., whether the driver's head is blocking part of the reference points 5_1 to 5_n of the headrest 10. When i_max is equal to or less than n_0 = 7, the driver's head is fully within the frame of the image K(x, y) in the normal driving position, and the process proceeds to step 105. When i_max exceeds n_0, i.e., is 8 or more, the driver's head is not fully within the frame of K(x, y), as when entering or leaving the vehicle, and the process returns to step 101.

In step 105, the initial values i and h are set to 1 and 0, respectively.

In step 106, it is judged whether the region number i has reached the maximum region number i_max. If i is less than i_max, the process proceeds to step 107; if i has reached i_max, the process proceeds to step 111.

In step 107, the interval between the labeled adjacent bright-spot images i and i+1 is examined. Since the reference points 5_1 to 5_n on the headrest 10 are arranged at equal intervals, the interval between bright-spot images i and i+1 takes a prescribed value wherever the reference points are not blocked by the driver's head. If the interval between bright-spot images i and i+1 equals the prescribed value, the process proceeds to step 108; otherwise it proceeds to step 109.

In step 108, 1 is added to i (i = i + 1), and the process returns to step 106.

In step 109, the bright-spot images i and i+1 from step 107 are temporarily stored, and the process proceeds to step 110.

In step 110, 1 is added to h from step 105, and the process proceeds to step 108.

In step 111, on the other hand, it is judged whether there is exactly one location where the interval between adjacent bright-spot images i and i+1 is equal to or greater than the prescribed value; specifically, whether h = 1 or h >= 2. If h >= 2, i.e., there are two or more such locations, some noise is assumed to have entered and the process returns to step 101. If h = 1, i.e., there is exactly one such location, the process proceeds to step 112.

In step 112, using predetermined values r, p and -q, two pixels A and B are determined at the position r in the X direction and the positions p and -q in the Y direction from reference point i, as shown in Fig. 6. The process then proceeds to step 113.

In step 113, using predetermined values -r, p and -q, two pixels C and D are determined at the position -r in the X direction and the positions p and -q in the Y direction from reference point i+1, as shown in Fig. 6. The process then proceeds to step 114.

In step 114, the pixels A, B, C and D obtained in steps 112 and 113 are connected to generate the image L(x, y) shown in Fig. 7, and the region enclosed by A, B, C and D is filled in. This filling process produces the image M(x, y) shown in Fig. 8.
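Steps 112 to 114 can be sketched as building a filled mask between the two bright spots flanking the head gap. The assumption here, consistent with Fig. 6 but not stated explicitly, is that both spots lie on the same row, so the A-B-C-D quadrilateral degenerates to an upright rectangle; the function name and signature are illustrative:

```python
def eye_region_mask(width, height, ref_i, ref_i1, r, p, q):
    """Steps 112-114: A and B are offset (+r, +p) and (+r, -q) from bright
    spot i; C and D are offset (-r, +p) and (-r, -q) from bright spot i+1.
    Connecting and filling A, B, C, D yields the mask M(x, y) of the
    eyeball existence region (y grows downward, as in image coordinates)."""
    (xi, yi), (xj, yj) = ref_i, ref_i1
    x_left, x_right = xi + r, xj - r
    y_top, y_bot = min(yi, yj) - q, max(yi, yj) + p
    mask = [[0] * width for _ in range(height)]
    for y in range(max(0, y_top), min(height, y_bot + 1)):
        for x in range(max(0, x_left), min(width, x_right + 1)):
            mask[y][x] = 1          # inside the filled region of M(x, y)
    return mask
```

The clipping against the image bounds is a practical addition; the patent does not discuss spots near the frame edge.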

In the input image I(x, y) temporarily stored in the image memory 4, the portion corresponding to the iris is generally observed as a dark circular region. Assume, as shown in Fig. 9, that a dark circular region of radius R pixels is to be detected, and set a rectangular region crossing this circle in each direction. Computing

δ = (sum of the brightness values of the hatched portion of the rectangle in Fig. 9) - (sum of the brightness values of the white portion of the rectangle),

δ yields a sharply peaked value at the center of the true circular region.
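This δ test can be sketched numerically. The version below is a simplification under stated assumptions: one thin strip per direction (four directions, as in the embodiment), with the bright part taken just outside the circle of radius R and the dark part just inside; the exact strip geometry of Fig. 9 is not reproduced, and `half_w` is an illustrative parameter:

```python
def iris_response(image, xc, yc, R, half_w=1):
    """Simplified delta of the patent's iris detector: the iris appears as a
    dark disc of radius ~R, so for each of four directions we compare a
    short strip just outside the disc boundary (bright if (xc, yc) is the
    true centre) against a strip just inside it (dark at the true centre).
    delta = (outer brightness sum) - (inner brightness sum) peaks at the
    true centre of the dark disc."""
    h, w = len(image), len(image[0])

    def strip_sum(points):
        # sum brightness over the strip, ignoring out-of-frame samples
        return sum(image[y][x] for x, y in points if 0 <= x < w and 0 <= y < h)

    delta = 0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):       # four directions
        inner = [(xc + dx * r, yc + dy * r) for r in range(R - half_w, R)]
        outer = [(xc + dx * r, yc + dy * r) for r in range(R + 1, R + 1 + half_w)]
        delta += strip_sum(outer) - strip_sum(inner)
    return delta
```

Scanning (xc, yc) over the masked region M(x, y) and keeping the maximum response reproduces the Δ = δ_max output described next.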

Using this principle, the iris detecting means 7 detects the portion corresponding to the driver's iris from the image M(x, y) derived from the input image I(x, y), following the flowchart shown in Fig. 10. Since the radius of the iris to be detected varies with the individual and with the distance between the camera and the occupant, this flowchart sweeps the detection radius over a zone (R_min to R_max) and outputs, as the final result, Δ = δ_max, the maximum δ within the eyeball existence region.

Comparing the open-eye and closed-eye states, the maximum value δ_max within the previously defined region M(x, y) becomes larger when the eye is open, so whether the eye is open or closed can be judged by thresholding this maximum value δ_max.

Accordingly, the operating state judging means 8 compares Δ = δ_max obtained by the iris detecting means 7 with a threshold Th: if Δ >= Th the eye is judged open, and if Δ < Th it is judged closed. When the eyes are judged closed, the driver is regarded as dozing, a warning command is output to the warning output means 9, and the warning output means 9 issues a warning to alert the driver. In this case, if the driver were judged to be dozing after a single closed-eye judgment by the operating state judging means 8, the probability of a false warning would be high; therefore the same processing is repeated, and only when the closed-eye judgment is observed a prescribed number of consecutive times, for example three or more, is the driver judged to be dozing and the warning issued. When only one eye is judged closed, it is considered that one eye has left the input image I(x, y) because the driver is looking aside. Therefore, as in the dozing judgment, a looking-aside state is judged when one eye is judged closed three consecutive times.
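The judgment logic above is a small per-frame state machine over the two per-eye Δ values. A sketch, with the class name, threshold handling and return strings chosen for illustration (the patent specifies only the thresholding and the three-consecutive-frame rule):

```python
class DrivingStateJudge:
    """Sketch of operating state judging means 8: one closed-eye frame alone
    is not trusted; the alarm condition holds only after the same judgment
    is observed on n_required consecutive frames (three in the embodiment).
    delta >= th means the eye is judged open."""

    def __init__(self, th, n_required=3):
        self.th, self.n_required = th, n_required
        self.closed_both = 0    # consecutive frames with both eyes closed
        self.closed_one = 0     # consecutive frames with exactly one closed

    def update(self, delta_left, delta_right):
        left_open = delta_left >= self.th
        right_open = delta_right >= self.th
        if not left_open and not right_open:
            self.closed_both += 1
            self.closed_one = 0
        elif left_open != right_open:     # one eye out of the input image
            self.closed_one += 1
            self.closed_both = 0
        else:                             # both open: reset the counters
            self.closed_both = self.closed_one = 0
        if self.closed_both >= self.n_required:
            return "dozing"               # -> warning output means 9
        if self.closed_one >= self.n_required:
            return "looking aside"        # -> warning output means 9
        return "normal"
```

Resetting both counters on any open-eye frame is what makes the rule "three consecutive" rather than "three total", which is the point of the false-warning argument above.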

The present invention is not limited to the embodiment described above; for example, as shown in Figs. 11 and 12, a reference point indicating means 5A may be provided on the seat belt 11. Specifically, the reference points 5_1, 5_2, ..., 5_n are provided on the front surface of the seat belt 11, and pressure sensors 12_1, 12_2, ..., 12_n are provided on the back surface of the seat belt 11 at the positions corresponding to the reference points 5_1, 5_2, ..., 5_n. A light emission control unit 13 lights the reference points corresponding to the pressure sensors among 12_1, 12_2, ..., 12_n showing the highest pressure, and the command from the light emission control unit 13 is input to the eyeball existence region defining means as the bright-spot image of the foregoing embodiment.

In the foregoing embodiment the iris detecting means 7 detected the iris-corresponding portion using rectangular regions in only four directions, but the rectangular regions may be set in more directions. In that case the threshold Th for judging open or closed eyes differs from that of the embodiment.

Effects of the Invention
As described above, according to the present invention, the portion corresponding to the driver's iris is detected merely by searching a narrow region guided by the reference points, so that the driver's operating condition, such as open or closed eyes, can be monitored reliably and quickly.

Further, according to this embodiment, dangerous driving conditions such as dozing and looking aside can be grasped accurately, detected and warned against.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a block diagram showing one embodiment of the present invention; Fig. 2 is a perspective view of a headrest provided with the reference point indicating means of the embodiment; Fig. 3 is a flowchart of the eyeball existence region defining means of the embodiment; Figs. 4 to 8 are explanatory diagrams of the main parts of that flowchart; Fig. 9 is an explanatory diagram showing the principle of iris detection in the embodiment; Fig. 10 is a flowchart of the iris detecting means of the embodiment; Fig. 11 is a perspective view showing a different example of the reference point indicating means of the present invention; and Fig. 12 is a sectional view taken along line XII-XII of Fig. 11.

1 ... infrared irradiation means, 2 ... infrared imaging means, 4 ... image memory, 5 ... reference point indicating means, 6 ... eyeball existence region defining means, 7 ... iris detecting means, 8 ... operating state judging means.

Claims (1)

[Claims]
(1) A device for detecting a driver's operating condition by image-processing a reflection pattern of infrared rays irradiated onto the driver's face including both eyes, the device comprising: a reference point indicating means for indicating reference points used to detect the position of the driver's eyes; an eyeball existence region defining means for extracting the region in which the driver's eyes exist from the image-processed input image on the basis of the reference points of the reference point indicating means; an iris detecting means for detecting the portions corresponding to the driver's irises within the eyeball existence region extracted by the eyeball existence region defining means; and an operating state judging means for judging the driver's operating condition from the detection result of the iris detecting means.
JP1134177A 1989-05-26 1989-05-26 Detecting device for driver's operating condition Pending JPH03533A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP1134177A JPH03533A (en) 1989-05-26 1989-05-26 Detecting device for driver's operating condition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP1134177A JPH03533A (en) 1989-05-26 1989-05-26 Detecting device for driver's operating condition

Publications (1)

Publication Number Publication Date
JPH03533A true JPH03533A (en) 1991-01-07

Family

ID=15122246

Family Applications (1)

Application Number Title Priority Date Filing Date
JP1134177A Pending JPH03533A (en) 1989-05-26 1989-05-26 Detecting device for driver's operating condition

Country Status (1)

Country Link
JP (1) JPH03533A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07296299A (en) * 1994-04-20 1995-11-10 Nissan Motor Co Ltd Image processor and warning device against doze at the wheel using the same
JPH1028676A (en) * 1996-07-17 1998-02-03 Tochigi Nippon Denki Kk Gaze interpretation device
US5829782A (en) * 1993-03-31 1998-11-03 Automotive Technologies International, Inc. Vehicle interior identification and monitoring system
US6116639A (en) * 1994-05-09 2000-09-12 Automotive Technologies International, Inc. Vehicle interior identification and monitoring system
US6412813B1 (en) 1992-05-05 2002-07-02 Automotive Technologies International Inc. Method and system for detecting a child seat
US7134687B2 (en) 1992-05-05 2006-11-14 Automotive Technologies International, Inc. Rear view mirror monitor


Similar Documents

Publication Publication Date Title
JPH0342337A (en) Detector for driving condition of vehicle driver
RU2453884C1 (en) Driver imaging device and method of driver imaging
JP4593942B2 (en) Pupil detection apparatus and method
JP4853389B2 (en) Face image capturing device
US6952498B2 (en) Face portion detecting apparatus
US10722113B2 (en) Gaze detection apparatus and gaze detection method
JP2522859B2 (en) Eye position detection device
JP5753509B2 (en) Device information acquisition device
US20090016574A1 (en) Biometric discrimination device, authentication device, and biometric discrimination method
EP1701319A1 (en) Illuminating apparatus, image capturing apparatus, and monitoring apparatus, for vehicle driver
JPH06230132A (en) Obstacle detector for vehicle
US7091867B2 (en) Wavelength selectivity enabling subject monitoring outside the subject's field of view
JP3116638B2 (en) Awake state detection device
US11904869B2 (en) Monitoring system and non-transitory storage medium
JP6567842B2 (en) Liquor detection device and room mirror device
JPH03533A (en) Detecting device for driver's operating condition
JP2016028669A (en) Pupil detection device and pupil detection method
JP6593133B2 (en) Diagnosis support apparatus and diagnosis support method
EP1800964B1 (en) Method of depth estimation from a single camera
JP2017126151A (en) Sight line detection device and sight line detection method
JP2936943B2 (en) Driver status detection device
KR102127169B1 (en) Notification system and method for lost article in vehicle
JP2836238B2 (en) Driving car eye position detection device and state detection device
JP2802635B2 (en) Energization control device for electrical equipment
JPH0761256A (en) Forward carelessness detecting device for vehicle