JP2021007717A - Occupant observation device, occupant observation method and program - Google Patents


Info

Publication number: JP2021007717A
Authority: JP (Japan)
Prior art keywords: occupant, eye, unit, degree, index value
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: JP2019124391A
Other languages: Japanese (ja)
Inventors: 顕至 大熊 (Akito Okuma), 航太 齊藤 (Kota Saito), 勝鎬 崔 (Seung Ho Choi)
Current Assignee: Honda Motor Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Honda Motor Co Ltd
Application filed by Honda Motor Co Ltd
Priority to JP2019124391A
Priority to CN202010594597.2A
Publication of JP2021007717A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Abstract

PROBLEM TO BE SOLVED: To provide an occupant observation device, an occupant observation method, and a program that can accurately determine whether or not an occupant is dozing.

SOLUTION: An occupant observation device comprises: an imaging unit that images the head of a vehicle occupant to generate an image; an index value derivation unit that derives, on the basis of the image generated by the imaging unit, an index value indicating the degree of opening/closing of the occupant's eyes; an inclination estimation unit that estimates the degree of change in the inclination of the occupant's head relative to when the occupant is awake; and a determination unit that determines whether or not the occupant is dozing on the basis of the results of determinations based on both the index value derived by the index value derivation unit and the degree of change detected by the inclination estimation unit.

SELECTED DRAWING: Figure 1

Description

The present invention relates to an occupant observation device, an occupant observation method, and a program.

Conventionally, research has been conducted on mechanisms by which a device determines that a vehicle occupant is dozing. The condition of the eyes is an important factor in determining that an occupant is dozing. Devices that capture images of an occupant with a camera and analyze the images to observe the state of the eyes are therefore being put into practical use (see, for example, Patent Document 1).

Japanese Unexamined Patent Publication No. H6-266981

Here, an occupant's drowsiness may appear in places other than the eyes. With conventional techniques, however, it has not been possible to determine whether an occupant is dozing based on factors other than the condition of the eyes.

The present invention has been made in consideration of such circumstances, and one of its objects is to provide an occupant observation device, an occupant observation method, and a program capable of accurately determining whether or not an occupant is dozing.

The occupant observation device, occupant observation method, and program according to the present invention adopt the following configurations.
(1) An occupant observation device according to one aspect of the present invention comprises: an imaging unit that images the head of a vehicle occupant and generates an image; an index value derivation unit that derives, based on the image generated by the imaging unit, an index value indicating the degree of opening/closing of the occupant's eyes; an inclination estimation unit that estimates the degree of change in the inclination of the occupant's head relative to when the occupant is awake; and a determination unit that determines whether or not the occupant is dozing based on the results of determinations based on both the index value derived by the index value derivation unit and the degree of change detected by the inclination estimation unit.

(2) In the aspect of (1) above, the occupant observation device further comprises an eye detection unit that detects at least a part of the contour of the occupant's eye based on the image generated by the imaging unit, and the index value derivation unit derives the index value based on the positional relationship of a plurality of feature points on the contour detected by the eye detection unit.

(3) In the occupant observation device according to (1) or (2) above, the determination unit determines that the occupant is dozing when a state in which the index value indicates that the degree of eye opening is less than a first threshold, or that the degree of eye closure is equal to or greater than a second threshold, continues for a first predetermined time or longer.

(4) In the occupant observation device according to any one of (1) to (3) above, the determination unit determines that the occupant is dozing when a state in which the degree of change indicates a tilt equal to or greater than a third threshold continues for a second predetermined time or longer.

(5) In an occupant observation method according to another aspect of the present invention, a computer images the head of a vehicle occupant to generate an image, derives an index value indicating the degree of opening/closing of the occupant's eyes based on the generated image, estimates the degree of change in the inclination of the occupant's head relative to when the occupant is awake, and determines whether or not the occupant is dozing based on the results of determinations based on both the derived index value and the estimated degree of change.

(6) A program according to another aspect of the present invention causes a computer to image the head of a vehicle occupant to generate an image, derive an index value indicating the degree of opening/closing of the occupant's eyes based on the generated image, estimate the degree of change in the inclination of the occupant's head relative to when the occupant is awake, and determine whether or not the occupant is dozing based on the results of determinations based on both the derived index value and the estimated degree of change.

According to (1) to (6), it is possible to accurately determine whether or not the occupant is dozing.

According to (3) and (4), it is possible to determine whether or not the occupant is dozing with even higher accuracy.

FIG. 1 is a diagram showing an example of the configuration and usage environment of the occupant observation device 1.
FIG. 2 is a diagram illustrating the position where the imaging unit 10 is installed.
FIG. 3 is a diagram schematically showing the content of processing by the eye detection unit 22.
FIG. 4 is a diagram (part 1) for explaining the processing of the eye opening rate derivation unit 24.
FIG. 5 is a diagram (part 2) for explaining the processing of the eye opening rate derivation unit 24.
FIG. 6 is a diagram schematically showing the content of processing by the inclination estimation unit 26.
FIG. 7 is a flowchart showing an example of the flow of processing executed by the image processing device 20.

Hereinafter, embodiments of the occupant observation device, occupant observation method, and program of the present invention will be described with reference to the drawings.

<Embodiment>
FIG. 1 is a diagram showing an example of the configuration and usage environment of the occupant observation device 1. The occupant observation device 1 includes, for example, an imaging unit 10 and an image processing device 20. The image processing device 20 includes, for example, an eye detection unit 22, an eye opening rate derivation unit 24, an inclination estimation unit 26, and a determination unit 28. The occupant observation device 1 determines, for example, whether a vehicle occupant is dozing, and outputs the determination result to various in-vehicle devices 100. The occupants include at least the driver and may include a front passenger. The various in-vehicle devices 100 are driving support devices, automated driving control devices, agent devices, and other devices; the occupant observation device 1 estimates and outputs the occupant state according to the type and purpose of each in-vehicle device 100.

FIG. 2 is a diagram illustrating the position where the imaging unit 10 is installed. The imaging unit 10 is installed, for example, in the central portion of the instrument panel of the vehicle, and images at least the head of a vehicle occupant to generate an image. The imaging unit 10 is offset laterally from the positions directly facing both the driver's seat DS (an example of a seating position), where the steering wheel SW is provided, and the passenger seat AS (another example of a seating position). The image generated by the imaging unit 10 is therefore one in which the occupant's head is imaged diagonally with respect to the lateral direction. In other words, at least the head of the occupant is not on the optical axis of the imaging unit 10.

Returning to FIG. 1, each part of the image processing device 20 will be described. The components of the image processing device 20 are realized by, for example, a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including circuitry) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or by the cooperation of software and hardware. The program may be stored in advance in a storage device such as an HDD (Hard Disk Drive) or flash memory (a storage device with a non-transitory storage medium), or may be stored in a removable storage medium (a non-transitory storage medium) such as a DVD or CD-ROM and installed by mounting the storage medium in a drive device.

As an example of the function of the eye detection unit 22, a method of detecting the eye after extracting edges will be described. The eye detection unit 22 first extracts edges from the image generated by the imaging unit 10 (hereinafter referred to as the captured image). An edge is a pixel (or pixel group) whose pixel value differs from those of its surrounding pixels by more than a reference, that is, a characteristic pixel. The eye detection unit 22 extracts edges using, for example, an edge extraction filter such as a SOBEL filter. Using a SOBEL filter is merely one example; the eye detection unit 22 may extract edges based on another filter or algorithm.
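As a concrete illustration, the edge-extraction step could look like the following minimal sketch, assuming OpenCV is available; the grayscale conversion and the threshold value are illustrative assumptions, not values given in this disclosure.

```python
import cv2
import numpy as np

def extract_edges(captured_image: np.ndarray, threshold: float = 80.0) -> np.ndarray:
    """Return a boolean mask marking characteristic (edge) pixels."""
    gray = cv2.cvtColor(captured_image, cv2.COLOR_BGR2GRAY)
    # Horizontal and vertical SOBEL responses.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    # A pixel counts as an edge when its gradient magnitude exceeds the reference.
    return np.hypot(gx, gy) > threshold
```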

The eye detection unit 22 detects at least a part of the occupant's eye in the captured image based on, for example, the distribution of the edges extracted from the captured image. At this time, the eye detection unit 22 targets the occupant's eye closer to the imaging unit 10. The eye closer to the imaging unit 10 is the left eye for an occupant seated in the driver's seat DS and the right eye for an occupant seated in the passenger seat AS. Instead of extracting edges, the eye detection unit 22 may directly detect a part of the eye (or the feature points described later) by inputting the captured image into a trained model generated by a machine learning method such as deep learning.

FIG. 3 is a diagram schematically showing the content of processing by the eye detection unit 22. In the figure, IM represents an image in which the edges EG are superimposed on the captured image. This figure focuses exclusively on the occupant seated in the driver's seat DS. First, as shown in the upper part of FIG. 3, the eye detection unit 22 extracts the face contour CT by fitting a model such as an ellipse or egg shape to the edges EG. Next, as shown in the middle part of FIG. 3, the eye detection unit 22 sets a nose detection window NM with reference to the face contour CT, and detects within it the position of the nasal bridge BN, a part whose edges are easily extracted with clarity. Then, as shown in the lower part of FIG. 3, the eye detection unit 22 sets an eye detection window EW of a predetermined size to the right of the nasal bridge BN, where the occupant's left eye should be, with the position of the nasal bridge BN as a reference, and detects at least a part of the eye within the eye detection window EW. Since the eye detection window EW is thereby set at a position overlapping the left eye of the occupant seated in the driver's seat DS, the left eye is detected within that window. The process of "detecting at least a part of the eye" can be defined concretely in various ways; in the following description it is taken to mean "detecting at least a part of the contour of the eye." When detecting the contour, the eye detection unit 22 does so by, for example, fitting a curve model to the distribution of the edges EG.
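The coarse-to-fine cascade (face contour CT, nose detection window NM, nasal bridge BN, eye detection window EW) can be pictured as simple window geometry. The sketch below is hypothetical: the text specifies only that EW has a predetermined size and is placed to the right of BN, so the proportions and the 64-pixel size are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Window:
    x: int  # left
    y: int  # top
    w: int  # width
    h: int  # height

def nose_window(face: Window) -> Window:
    # Nose detection window NM, set with reference to the face contour CT;
    # here simply the central band of the face bounding box (assumed heuristic).
    return Window(face.x + face.w // 3, face.y + face.h // 3,
                  face.w // 3, face.h // 2)

def eye_window(bridge_x: int, bridge_y: int, size: int = 64) -> Window:
    # Eye detection window EW of a predetermined size, placed to the right of
    # the nasal bridge BN (where the driver-seat occupant's left eye appears).
    return Window(bridge_x, bridge_y - size // 2, size, size)
```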

The eye opening rate derivation unit 24 derives the eye opening rate α of the occupant's eye based on the positional relationship of a plurality of feature points on the eye contour detected by the eye detection unit 22. The plurality of feature points include, for example, a first feature point, which is the end of the eye contour closer to the imaging unit 10 in the lateral direction (corresponding to the outer corner of the eye), a second feature point, which is the upper end, and a third feature point, which is the lower end. FIG. 4 is a diagram (part 1) for explaining the processing of the eye opening rate derivation unit 24. In the figure, P1 is the first feature point, P2 is the second feature point, and P3 is the third feature point. The eye opening rate derivation unit 24, for example, virtually moves a vertical line leftward from the right edge of the eye detection window EW and takes the intersection at which the line first crosses the eye contour ECT as the first feature point P1. Similarly, it virtually moves a horizontal line downward from the upper edge of the eye detection window EW and takes the first intersection with the eye contour ECT as the second feature point P2, and virtually moves a horizontal line upward from the lower edge of the eye detection window EW and takes the first intersection with the eye contour ECT as the third feature point P3.
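Since each sweep line stops at the first contour pixel it meets, the three feature points are simply the extreme points of the contour in the corresponding directions. A minimal sketch, assuming the contour ECT is given as a boolean mask over the eye detection window EW (breaking ties by averaging is an assumption):

```python
import numpy as np

def find_feature_points(ect: np.ndarray):
    """Return (P1, P2, P3) as (x, y) pixel coordinates, or None if no contour."""
    ys, xs = np.nonzero(ect)
    if xs.size == 0:
        return None  # no contour detected in this window
    # P1: first hit of a vertical line swept leftward from the right edge.
    p1 = (int(xs.max()), int(ys[xs == xs.max()].mean()))
    # P2: first hit of a horizontal line swept downward from the top edge.
    p2 = (int(xs[ys == ys.min()].mean()), int(ys.min()))
    # P3: first hit of a horizontal line swept upward from the bottom edge.
    p3 = (int(xs[ys == ys.max()].mean()), int(ys.max()))
    return p1, p2, p3
```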

The eye opening rate derivation unit 24 then derives the occupant's eye opening rate α based on the angle formed by a first straight line connecting the first feature point P1 and the second feature point P2, and a second straight line connecting the first feature point P1 and the third feature point P3. FIG. 5 is a diagram (part 2) for explaining the processing of the eye opening rate derivation unit 24. In the figure, L1 is the first straight line, L2 is the second straight line, and θ1 is the angle between them. The eye opening rate derivation unit 24, for example, defines as 100[%] of the eye opening rate α a reference angle θini obtained by averaging the angles derived from the captured images during roughly the first few minutes after the occupant enters the vehicle, and derives the eye opening rate α by dividing a subsequently derived angle θ1 by the reference angle θini (see equation (1)). This is not restrictive: when occupant identification is performed, a reference angle corresponding to 100[%] for each occupant may be stored in memory, read out per occupant, and used in the calculation. A prescribed value may also be set as the reference angle θini, or the prescribed value may be used at first and gradually adjusted toward the occupant's average angle.

α = MIN{θ1/θini, 100[%]} … (1)

The description so far derives the eye opening rate α based only on the angle θ1 on the image plane. However, for example, a three-dimensional model of the eye may be prepared, rotated according to the face orientation angle estimated from the relationship between the face contour CT and the nasal bridge BN, and projected into two dimensions before performing the processing described above; this can improve the estimation accuracy of the eye opening rate α. The eye opening rate derivation unit 24 may also derive the occupant's eye closing rate as the index value indicating the degree of eye opening/closing. In this case, the eye opening rate derivation unit 24 derives the value obtained by subtracting the eye opening rate α from 100[%] as the eye closing rate β. The eye opening rate α and the eye closing rate β are examples of an "index value indicating the degree of eye opening/closing," the eye opening rate derivation unit 24 is an example of an "index value derivation unit," the eye opening rate α is an example of the "degree of eye opening," and the eye closing rate β is an example of the "degree of eye closure."
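Equation (1) and the derivation of β reduce to a few lines. The following sketch assumes the feature points are given in image coordinates and that θini has already been established during the calibration window described above:

```python
import math

def angle_theta1(p1, p2, p3) -> float:
    """Angle θ1 between line L1 (P1 to P2) and line L2 (P1 to P3), in degrees."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(p3[1] - p1[1], p3[0] - p1[0])
    diff = abs(a1 - a2)
    if diff > math.pi:  # keep the smaller of the two arc angles
        diff = 2.0 * math.pi - diff
    return math.degrees(diff)

def eye_opening_rate(theta1: float, theta_ini: float) -> float:
    """Equation (1): alpha = MIN{theta1/theta_ini, 100 [%]}."""
    return min(theta1 / theta_ini * 100.0, 100.0)

def eye_closing_rate(alpha: float) -> float:
    """beta = 100 [%] minus alpha."""
    return 100.0 - alpha
```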

FIG. 6 is a diagram schematically showing the content of processing by the inclination estimation unit 26. This figure, too, focuses exclusively on the occupant seated in the driver's seat DS. First, the inclination estimation unit 26 uses part of the processing of the eye detection unit 22 to extract the face contour CT by fitting a model such as an ellipse or egg shape to the edges EG, as shown in the upper part of FIG. 6. Next, as shown in the middle part of FIG. 6, the inclination estimation unit 26 detects the position of a second line segment CL2 connecting the midpoint of a virtual first line segment CL1, which joins the left eye and right eye detected by the eye detection unit 22, to the midpoint of the arc of the extracted face contour CT (that is, the position of the chin). Alternatively, the inclination estimation unit 26 may directly detect the second line segment CL2 by inputting the captured image into a trained model generated by a machine learning method such as deep learning.

Then, as shown in the lower part of FIG. 6, the inclination estimation unit 26 estimates the degree of change of the second line segment CL2 with respect to a predetermined line segment CL2d. The predetermined line segment CL2d is one detected while the occupant is awake, and is used as a reference for evaluating how far the second line segment CL2 is tilted compared with the awake state. The predetermined line segment CL2d is detected, for example, at the timing when the occupant boards the vehicle, and information indicating the detected segment is stored (updated) in, for example, a storage unit (not shown) of the occupant observation device 1. The inclination estimation unit 26 estimates, for example, the angle θ2 formed between the second line segment CL2 and the predetermined line segment CL2d when their lower end points are superimposed, as the degree of change of the second line segment CL2 with respect to the predetermined line segment CL2d.

The description so far derives the degree of change based only on the angle θ2 on the image plane. However, for example, a three-dimensional model of the second line segment CL2 may be prepared, rotated according to the face orientation angle estimated from the relationship between the face contour CT and the nasal bridge BN, and projected into two dimensions before performing the processing described above; this can improve the estimation accuracy of the degree of change.
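The angle θ2 can likewise be computed directly from the two segments. A minimal sketch, assuming each segment is given by its upper (mid-eye) and lower (chin) end points in image coordinates:

```python
import math

def tilt_change_theta2(cl2, cl2d) -> float:
    """Angle θ2 between the current segment CL2 and the reference segment CL2d
    recorded while awake, with their lower end points superimposed. Each
    segment is ((x_upper, y_upper), (x_lower, y_lower))."""
    def direction(segment):
        (xu, yu), (xl, yl) = segment
        return math.atan2(yu - yl, xu - xl)  # direction from the chin upward
    diff = abs(direction(cl2) - direction(cl2d))
    if diff > math.pi:  # keep the smaller of the two arc angles
        diff = 2.0 * math.pi - diff
    return math.degrees(diff)
```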

The determination unit 28 determines whether the occupant is dozing based on, for example, the eye opening rate α derived by the eye opening rate derivation unit 24 and the angle θ2 estimated by the inclination estimation unit 26, and outputs the result to the various in-vehicle devices 100. For example, the determination unit 28 judges the occupant's drowsiness to be stronger the smaller the eye opening rate α is, and determines that the occupant is strongly drowsy, or dozing, when the eye opening rate α is less than a first threshold Th1. Likewise, it judges the occupant's drowsiness to be stronger the larger the eye closing rate β is, and determines that the occupant is strongly drowsy, or dozing, when the eye closing rate β is equal to or greater than a second threshold Th2. For example, the first threshold Th1 is a value of the eye opening rate α that can distinguish a state in which the occupant is dozing from one in which the occupant is not, and the second threshold Th2 is a value of the eye closing rate β that can make the same distinction.

The determination unit 28 also judges the occupant's drowsiness to be stronger the larger the angle θ2 is, and determines that the occupant is strongly drowsy, or dozing, when the angle θ2 is equal to or greater than a third threshold Th3. The third threshold Th3 is a value of the angle θ2 that can distinguish a state in which the occupant is dozing from one in which the occupant is not.

The first threshold Th1, the second threshold Th2, and the third threshold Th3 may be predetermined values, may be determined for each occupant who boards the vehicle, or may be derived using a learning model trained by deep learning on images generated by the imaging unit 10.

Further, the determination unit 28 determines that the occupant is dozing when the state in which the eye opening rate α is less than the first threshold Th1, or the state in which the eye closing rate β is equal to or greater than the second threshold Th2, continues for a first predetermined time or longer. The determination unit 28 also determines that the occupant is dozing when the state in which the angle θ2 is equal to or greater than the third threshold Th3 continues for a second predetermined time or longer.
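Both duration requirements amount to the same small building block: a threshold condition only counts once it has held continuously for its predetermined time. A hedged sketch of such a latch follows (the name SustainedCondition is an illustration, not terminology from the disclosure):

```python
class SustainedCondition:
    """Reports True only after `condition` has held continuously for
    `hold_seconds` (the "predetermined time" in the text)."""

    def __init__(self, hold_seconds: float):
        self.hold_seconds = hold_seconds
        self._since = None  # time at which the condition last became true

    def update(self, condition: bool, now: float) -> bool:
        if not condition:
            self._since = None
            return False
        if self._since is None:
            self._since = now
        return (now - self._since) >= self.hold_seconds
```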

[Operation flow]
FIG. 7 is a flowchart showing an example of the flow of processing executed by the image processing device 20. First, the image processing device 20 acquires the image generated by the imaging unit 10 (step S100). Next, the eye detection unit 22 extracts edges from the acquired image (step S102).

The eye detection unit 22 detects at least a part of the occupant's eye in the captured image based on the distribution of the edges extracted from the captured image (step S104). The eye opening rate derivation unit 24 derives the eye opening rate α of the occupant's eye based on the positional relationship of the plurality of feature points on the eye contour detected by the eye detection unit 22 (step S106). The eye opening rate derivation unit 24 may instead derive the eye closing rate β of the occupant's eye based on the positional relationship of the plurality of feature points.

The determination unit 28 determines whether the eye opening rate α derived by the eye opening rate derivation unit 24 is less than the first threshold Th1, or whether the eye closing rate β is equal to or greater than the second threshold Th2 (step S108). When the determination unit 28 determines that the eye opening rate α is equal to or greater than the first threshold Th1, or that the eye closing rate β is less than the second threshold Th2, it ends the processing on the assumption that the occupant does not feel drowsy. When the determination unit 28 determines that the eye opening rate α is less than the first threshold Th1, or that the eye closing rate β is equal to or greater than the second threshold Th2, it assumes that the occupant feels drowsy and further determines whether that state has continued for the first predetermined time or longer (step S110). When the determination unit 28 determines that the state has continued for the first predetermined time or longer, it proceeds to step S120 and determines that the occupant is dozing (step S120).

When the determination unit 28 has judged that the occupant feels drowsy but the state in which the eye opening rate α is less than the first threshold Th1, or the eye closing rate β is equal to or greater than the second threshold Th2, has not continued for the first predetermined time, the inclination estimation unit 26 detects the position of the second line segment CL2 based on the edges EG extracted by the eye detection unit 22 (step S112). The inclination estimation unit 26 then estimates the degree of change of the second line segment CL2 with respect to the predetermined line segment CL2d (that is, the angle θ2) (step S114).

The determination unit 28 determines whether the angle θ2 estimated by the inclination estimation unit 26 is equal to or greater than the third threshold Th3 (step S116). When the determination unit 28 determines that the angle θ2 is less than the third threshold Th3, it ends the processing on the assumption that the occupant does not feel drowsy. When the determination unit 28 determines that the angle θ2 is equal to or greater than the third threshold Th3, it assumes that the occupant feels drowsy and further determines whether the state in which the angle θ2 is equal to or greater than the third threshold Th3 has continued for the second predetermined time or longer (step S118).

When the determination unit 28 has judged that the occupant feels drowsy, but neither the eye-based state (eye opening rate α less than the first threshold Th1, or eye closing rate β equal to or greater than the second threshold Th2) has continued for the first predetermined time, nor the state in which the angle θ2 is equal to or greater than the third threshold Th3 has continued for the second predetermined time, it ends the processing on the assumption that the occupant is not dozing. When the determination unit 28 determines that the state in which the angle θ2 is equal to or greater than the third threshold Th3 has continued for the second predetermined time or longer, it determines that the occupant is dozing (step S120).
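Putting the branching of steps S108 to S120 together, one pass of the flow might look as follows. This reuses the SustainedCondition sketch above; the thresholds Th1 to Th3 and the two predetermined times are illustrative assumptions, since the text leaves them to be fixed per occupant or learned:

```python
TH1, TH2, TH3 = 30.0, 70.0, 20.0  # Th1 [%], Th2 [%], Th3 [deg] (assumed values)
eye_latch = SustainedCondition(hold_seconds=2.0)    # first predetermined time (assumed)
tilt_latch = SustainedCondition(hold_seconds=3.0)   # second predetermined time (assumed)

def judge_dozing(alpha: float, beta: float, theta2: float, now: float) -> bool:
    eye_cond = alpha < TH1 or beta >= TH2         # step S108
    if eye_latch.update(eye_cond, now):           # step S110
        return True                               # dozing (step S120)
    if not eye_cond:
        return False                              # occupant not judged drowsy
    # Drowsiness suspected but not yet sustained: fall back to the head tilt
    # (steps S112 to S118).
    return tilt_latch.update(theta2 >= TH3, now)  # step S120 when sustained
```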

The above description assumed that the image processing device 20 first performs the determination processing relating to the eye opening rate α or the eye closing rate β (steps S104 to S110) and then performs the determination processing relating to the angle θ2 (steps S112 to S118), but this is not restrictive. The image processing device 20 may, for example, perform the two determination processes in parallel, or may perform the determination processing relating to the angle θ2 first, followed by the determination processing relating to the eye opening rate α or the eye closing rate β.

As described above, according to the occupant observation device 1 of the present embodiment, the determination is based on both the index value indicating the degree of eye opening/closing, derived by the processing of the eye detection unit 22 and the eye opening rate derivation unit 24, and the degree of change, relative to the awake state, of the inclination of the occupant's head, estimated by the inclination estimation unit 26. It is therefore possible to determine whether the occupant is dozing more accurately than by using either index alone.

While embodiments for carrying out the present invention have been described above, the present invention is in no way limited to these embodiments, and various modifications and substitutions can be made without departing from the gist of the present invention.

1: occupant observation device, 10: imaging unit, 20: image processing device, 22: eye detection unit, 24: eye opening rate derivation unit, 26: inclination estimation unit, 28: determination unit, 100: various in-vehicle devices, BN: nasal bridge, CL1: first line segment, CL2: second line segment, CL2d: predetermined line segment, CT: face contour, ECT: eye contour, EG: edge, EW: eye detection window, NM: nose detection window, Th1: first threshold, Th2: second threshold, Th3: third threshold, α: eye opening rate, β: eye closing rate, θ1: angle, θ2: angle, θini: reference angle

Claims (6)

1. An occupant observation device comprising:
an imaging unit that images the head of a vehicle occupant and generates an image;
an index value derivation unit that derives, based on the image generated by the imaging unit, an index value indicating the degree of opening/closing of the occupant's eyes;
an inclination estimation unit that estimates the degree of change in the inclination of the occupant's head relative to when the occupant is awake; and
a determination unit that determines whether or not the occupant is dozing based on the results of determinations based on both the index value derived by the index value derivation unit and the degree of change detected by the inclination estimation unit.

2. The occupant observation device according to claim 1, further comprising an eye detection unit that detects at least a part of the contour of the occupant's eye based on the image generated by the imaging unit,
wherein the index value derivation unit derives the index value based on the positional relationship of a plurality of feature points on the contour detected by the eye detection unit.

3. The occupant observation device according to claim 1 or 2, wherein the determination unit determines that the occupant is dozing when a state in which the index value indicates that the degree of eye opening is less than a first threshold, or that the degree of eye closure is equal to or greater than a second threshold, continues for a first predetermined time or longer.

4. The occupant observation device according to any one of claims 1 to 3, wherein the determination unit determines that the occupant is dozing when a state in which the degree of change indicates a tilt equal to or greater than a third threshold continues for a second predetermined time or longer.

5. An occupant observation method in which a computer:
images the head of a vehicle occupant to generate an image;
derives, based on the generated image, an index value indicating the degree of opening/closing of the occupant's eyes;
estimates the degree of change in the inclination of the occupant's head relative to when the occupant is awake; and
determines whether or not the occupant is dozing based on the results of determinations based on both the derived index value and the estimated degree of change.

6. A program that causes a computer to:
image the head of a vehicle occupant to generate an image;
derive, based on the generated image, an index value indicating the degree of opening/closing of the occupant's eyes;
estimate the degree of change in the inclination of the occupant's head relative to when the occupant is awake; and
determine whether or not the occupant is dozing based on the results of determinations based on both the derived index value and the estimated degree of change.
JP2019124391A 2019-07-03 2019-07-03 Occupant observation device, occupant observation method and program Pending JP2021007717A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019124391A JP2021007717A (en) 2019-07-03 2019-07-03 Occupant observation device, occupant observation method and program
CN202010594597.2A CN112183176A (en) 2019-07-03 2020-06-24 Occupant observation device, occupant observation method, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2019124391A JP2021007717A (en) 2019-07-03 2019-07-03 Occupant observation device, occupant observation method and program

Publications (1)

Publication Number Publication Date
JP2021007717A true JP2021007717A (en) 2021-01-28

Family

ID=73918816

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2019124391A Pending JP2021007717A (en) 2019-07-03 2019-07-03 Occupant observation device, occupant observation method and program

Country Status (2)

Country Link
JP (1) JP2021007717A (en)
CN (1) CN112183176A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004314750A (en) * 2003-04-15 2004-11-11 Denso Corp Vehicle instrument operation control device
JP2009045418A (en) * 2007-02-16 2009-03-05 Denso Corp Device, program, and method for determining sleepiness
WO2017056401A1 * 2015-09-30 2017-04-06 Sony Corporation Control device, control method, and program
US20180204078A1 (en) * 2015-07-10 2018-07-19 Innov Plus System for monitoring the state of vigilance of an operator

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE112014000934T5 (en) * 2013-02-21 2016-01-07 Iee International Electronics & Engineering S.A. Imaging-based occupant monitoring system with broad functional support
CN204303129U (en) * 2014-11-27 2015-04-29 程长明 Anti-fatigue warning system and anti-fatigue eyeglasses
CN105701445A (en) * 2014-12-15 2016-06-22 爱信精机株式会社 determination apparatus and determination method
CN205881127U (en) * 2016-07-27 2017-01-11 尤明洲 Vehicle driver fatigue monitors alarm system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004314750A (en) * 2003-04-15 2004-11-11 Denso Corp Vehicle instrument operation control device
JP2009045418A (en) * 2007-02-16 2009-03-05 Denso Corp Device, program, and method for determining sleepiness
US20180204078A1 (en) * 2015-07-10 2018-07-19 Innov Plus System for monitoring the state of vigilance of an operator
WO2017056401A1 * 2015-09-30 2017-04-06 Sony Corporation Control device, control method, and program

Also Published As

Publication number Publication date
CN112183176A (en) 2021-01-05

Similar Documents

Publication Publication Date Title
US9928404B2 (en) Determination device, determination method, and non-transitory storage medium
JP5790762B2 (en) Eyelid detection device
JP4915413B2 (en) Detection apparatus and method, and program
US9822576B2 (en) Method for operating an activatable locking device for a door and/or a window, securing device for a vehicle, vehicle
JP6584717B2 (en) Face orientation estimation apparatus and face orientation estimation method
JP2020052827A (en) Occupant modeling device, occupant modeling method, and occupant modeling program
JP2016115117A (en) Determination device and determination method
WO2017173480A1 (en) Method and system of distinguishing between a glance event and an eye closure event
US11161470B2 (en) Occupant observation device
JP5349350B2 (en) Eye opening degree determination device and eye opening degree determination method
JP2016115120A (en) Opened/closed eye determination device and opened/closed eye determination method
JP2018151930A (en) Driver state estimation device and driver state estimation method
JP7267467B2 (en) ATTENTION DIRECTION DETERMINATION DEVICE AND ATTENTION DIRECTION DETERMINATION METHOD
JP2011125620A (en) Biological state detector
JP2021129700A (en) Reference value determination device and reference value determination method
JP2021007717A (en) Occupant observation device, occupant observation method and program
JP6982767B2 (en) Detection device, learning device, detection method, learning method, and program
KR20170028631A (en) Method and Apparatus for Detecting Carelessness of Driver Using Restoration of Front Face Image
WO2022113275A1 (en) Sleep detection device and sleep detection system
CN111696312B (en) Passenger observation device
TWI579173B (en) A driver fatigue monitoring and detection method based on an ear-angle
JP4825737B2 (en) Eye opening degree determination device
JPWO2020255238A1 (en) Information processing equipment, programs and information processing methods
JP7127661B2 (en) Eye opening degree calculator
WO2022118475A1 (en) Passenger temperature estimating device, passenger state detection device, passenger temperature estimating method, and passenger temperature estimating system

Legal Events

Date / Code / Title
2021-11-26 / A621 / Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2022-08-23 / A131 / Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2022-08-31 / A977 / Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2023-02-28 / A02 / Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)