JP4864541B2 - Blink data classification device and wakefulness determination device - Google Patents

Blink data classification device and wakefulness determination device

Info

Publication number
JP4864541B2
Authority
JP
Japan
Prior art keywords
blink
data
waveform
electrooculogram
means
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2006142582A
Other languages
Japanese (ja)
Other versions
JP2007312824A (en)
Inventor
ルーングロジュ ノプスワンチャイ
裕美子 井上
美恵子 大須賀
祥宏 野口
快之 鎌倉
Original Assignee
学校法人常翔学園
旭化成株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 学校法人常翔学園 and 旭化成株式会社
Priority to JP2006142582A
Publication of JP2007312824A
Application granted
Publication of JP4864541B2
Application status: Active
Anticipated expiration


Description

The present invention relates to: a blink data classification device that classifies data related to electrooculogram (EOG) waveforms of blinks; a pattern model generation device that generates a pattern model based on such blink data; a blink waveform appearance frequency information generating device that uses the generated pattern model to generate appearance frequency information, that is, information on the temporal change in the appearance frequency of each blink type for a subject over a predetermined period; wakefulness state determination devices that determine the subject's wakefulness state based on the generated appearance frequency information, or on the temporal change in the appearance frequency of blink waveform types; and a warning device that gives a warning to the subject based on the determination result of the wakefulness state.

In recent years, research and development toward the practical application of ITS (Intelligent Transport Systems) has advanced rapidly along with the progress and sophistication of information processing technology, and attention from industry, which sees ITS as a new market, has also grown. Driving support and navigation systems play a central role in ITS, and what is called for is not only information on road and traffic conditions and vehicle behavior, but also support and information presentation suited to the driver's characteristics and current state. Above all, evaluating the driver's arousal level has long been a challenge, and many studies using physiological responses such as the electroencephalogram, electrodermal activity, heartbeat, and blinking have been conducted. In particular, the parameters of the waveforms representing the blink pattern and eye movement are known to change with the subject's arousal level, and research is under way on, for example, detecting driver drowsiness using the waveform parameters representing the blink pattern and the eye movement during blinking.

As a technology for determining the wakefulness state using the parameters of a waveform representing eye movement during blinking, there is the operation content determination device described in Patent Document 1. This device has an eye-state determination HMM that, in response to the input of feature quantities extracted from multiple frames of video data of the eye region, outputs the likelihood for each type of electrooculogram (EOG) waveform of the subject's blink; the subject's arousal state is determined based on the likelihood that the eye-state determination HMM outputs in response to the input of the feature quantities.

In the prior art of Patent Document 1, when the eye-state determination HMM is generated, the type of the electrooculogram (EOG) waveform is determined based on parameters such as the amplitude, duration, and speed of a single blink in the EOG waveform. Based on that determination result, identification information identifying the EOG waveform type is assigned to the feature quantities extracted from the multiple frames of eye-region video data corresponding to that EOG waveform, and the HMM is trained using the feature quantities carrying this identification information as learning data.

In addition, the wakefulness detection device disclosed in Patent Document 2 sequentially accumulates, per unit time, blinks whose eye-closure time is equal to or greater than a reference time, and judges that wakefulness is decreasing when the integrated eye-closure time reaches or exceeds a set value.
[Patent Document 1] WO-A1-2005/114576
[Patent Document 2] JP-A-7-156682

However, in the prior art of Patent Document 1, the operation of assigning identification information (an identification label) identifying the type of the blink electrooculogram (EOG) waveform to the feature quantities extracted from the video data requires a human to visually judge the blink EOG waveform corresponding to the video data, so there is the problem that considerable labor and cost are required as the amount of data to be labeled increases.

In addition, because the interpretation (labeling) of the identification information differs depending on the person performing the work, it is desirable that the same person (or a small number of people) do as much of it as possible, which requires even more working time.
Moreover, whether the conventionally assumed types (models) of blink waveforms are grasped accurately has not been sufficiently verified, and a more accurate model has been called for.

Furthermore, the prior art of Patent Documents 1 and 2 relies on a single index, such as the blink duration becoming longer as the arousal level decreases, or the blink waveform under drowsiness having a small wave height, gentle rising and falling slopes, and a short duration. It is therefore difficult to estimate the wide variety of wakefulness levels that occur before the transition to a dozing state.

Therefore, the present invention has been made in view of these unsolved problems of the conventional technology, and its objects are to provide: a blink data classification device that is suitable for classifying the types of blink electrooculogram (EOG) waveforms and can efficiently assign type-identifying information to blink data related to EOG waveforms; a pattern model generation device that generates a pattern model based on the classified blink data; a blink waveform appearance frequency information generation device that uses the generated pattern model to generate appearance frequency information, that is, information on the temporal change in the appearance frequency of the subject's blink types over a predetermined period; wakefulness state determination devices that determine the subject's wakefulness state based on the appearance frequency information; and a warning device that gives a warning to the subject based on the determination result of the wakefulness state.

To achieve the above object, the blink data classification device according to claim 1 of the present invention is a blink data classification device that classifies blink data, that is, data related to an electrooculogram (EOG) waveform of a blink, comprising:
normalization means for normalizing a plurality of types of parameters with different units extracted from the EOG waveform data, which is data of the blink EOG waveform;
classification means for classifying, using a predetermined clustering method, the parameters corresponding to the EOG waveform data of a plurality of blinks normalized by the normalization means; and
identification information assignment means for assigning, based on the classification result of the classification means, the identification information of the type to which each parameter belongs to the blink data corresponding to each classified parameter.

With such a configuration, the normalization means can normalize the plurality of types of parameters with different units extracted from the EOG waveform data, which is data of the blink EOG waveform, and the classification means can classify, using a predetermined clustering method, the parameters corresponding to the EOG waveform data of the plurality of blinks normalized by the normalization means.
Furthermore, based on the classification result of the classification means, the identification information assignment means can assign the identification information of the type to which each parameter belongs to the blink data corresponding to each classified parameter.

Therefore, characteristic parameters with different units (dimensions), such as distance and time, are extracted from the blink EOG waveform data and normalized, so that plural types of parameters with different units (dimensions) can be clustered (classified) together, and identification information can be assigned to the blink data automatically based on the classification result. Blink data can thus be classified stably and accurately, and the labor and cost required for the operation of assigning identification information can be reduced.

In addition, since all the blink data can be classified into accurate types (types (models) of blink waveforms), identifying the type of the subject's blink waveform using this correctly classified blink data makes it possible, for example, to determine the subject's wakefulness state accurately.
Here, the "blink data" includes the blink EOG waveform data itself, feature quantity data extracted from the EOG waveform data, moving image data of the subject's blinks captured while measuring the blink EOG waveform (blink video data), feature quantity data extracted from the blink video data, and the like.

The above "plurality of types of parameters with different units" are parameters (data) that can be extracted from the blink EOG waveform data; for example, data indicating the duration of the blink waveform, data indicating the wave height (a distance), and data indicating the blink speed.
"Classifying" above means that the normalized parameters are clustered into one of a plurality of classes by the classification means; the clustering result itself may be used as the classification result, or the result of further processing the clustering result may be used as the classification result.
The "identification information" is information that can identify the class (type) to which the blink data corresponding to each clustered parameter belongs.

Furthermore, the invention according to claim 2 is the blink data classification device according to claim 1, wherein
the plurality of types of parameters with different units, extracted from the blink EOG waveform data, are: distance data of the peak height of the EOG waveform, time data from the start of the blink to the peak height, and time data from the peak height to the end of the blink.
With such a configuration, diverse types of blink EOG waveforms can be distinguished, so the blink data can be classified more accurately.

For example, when clustering is performed using only two parameters, the peak height of the blink EOG waveform and the blink speed, all EOG waveforms with the same peak height (distance) and blink speed belong to the same class (type), so accurate classification cannot be performed. In contrast, with the configuration of the present invention, even when there are a plurality of EOG waveforms with the same peak height (distance) and blink speed, waveforms that differ in the time from the start of the blink to the peak height, or in the time from the peak height to the end of the blink, can be classified accurately.
Here, the "peak height" is expressed, for example, as the difference between the highest and lowest level positions of the waveform in the EOG waveform within the time interval in which the blink occurs.
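As an illustration only, the following Python sketch shows one way the three parameters could be computed from a sampled EOG signal; the function name, the assumption that the blink interval has already been detected, and the use of the waveform minimum as the baseline are hypothetical choices, not taken from the patent.

```python
import numpy as np

def extract_blink_parameters(eog, fs, t_start, t_end):
    """Peak height and rise/fall times for one blink (a minimal sketch).

    eog: 1-D array of EOG samples; fs: sampling rate in Hz;
    t_start, t_end: detected start and end of the blink in seconds.
    """
    seg = eog[int(t_start * fs):int(t_end * fs)]
    peak_idx = int(np.argmax(seg))
    # Peak height: difference between the highest and lowest waveform
    # levels within the blink interval, per the definition above.
    peak_height = float(seg[peak_idx] - seg.min())
    t_rise = peak_idx / fs                   # blink start -> peak (seconds)
    t_fall = (len(seg) - 1 - peak_idx) / fs  # peak -> blink end (seconds)
    return peak_height, t_rise, t_fall
```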

Furthermore, the invention according to claim 3 is the blink data classification device according to claim 1 or 2, wherein
the normalization means normalizes the plurality of types of parameters with different units using the Z-score method.
With such a configuration, normalization is performed using the well-known Z-score method, so each of the plurality of types of parameters with different units (dimensions) can be normalized by transforming its distribution so that the mean value is 0 and the standard deviation is 1. As a result, clustering can be performed on a mixture of parameters with different units (dimensions), such as distance and time.
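The Z-score method itself is standard; a minimal sketch in Python (the array layout is an assumption) is:

```python
import numpy as np

def z_score(params):
    """Column-wise Z-score normalization of an (n_blinks, n_params) array.

    Each parameter (e.g. peak height, rise time, fall time) is shifted and
    scaled to mean 0 and standard deviation 1, so that distance and time
    values can be clustered together.
    """
    params = np.asarray(params, dtype=float)
    return (params - params.mean(axis=0)) / params.std(axis=0)
```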

Furthermore, the invention according to claim 4 is the blink data classification device according to any one of claims 1 to 3, wherein
the classification means uses a division-optimization method or a hierarchical method as the predetermined clustering method.
With such a configuration, clustering can be performed using a division-optimization method or a hierarchical method, so the parameters can be classified easily.

Here, the "division-optimization method" is a non-hierarchical method in which an evaluation function indicating the quality of a division is defined and a division that optimizes this evaluation function is sought. A typical example is the k-means method, in which the number of divisions k is given and the centroids (the centers of gravity of the k clusters) serve as the representative points of the clusters; the objects are divided into k clusters so as to minimize, for example, the sum of the squared distances from each object to the centroid of its cluster (the evaluation function). Since this is a hill-climbing method that yields only a locally optimal solution, the initial values are varied randomly and the result that minimizes the evaluation function is selected.

The "hierarchical method", given data containing N objects, starts from an initial state of N clusters each containing a single object. From this state, the distances between clusters are computed and the two closest clusters are merged, one pair at a time. A hierarchical structure is obtained by repeating this merging until all objects belong to one cluster. The hierarchical structure is displayed as a dendrogram: a binary tree in which each terminal node represents an object and each cluster formed by merging is represented by a non-terminal node, whose position along the distance axis represents the inter-cluster distance at the time of merging. Typical hierarchical methods include the Ward method, which merges the pair of clusters that minimizes the increase in the sum of the squared distances to the cluster centroid, the shortest-distance method, the longest-distance method, and the group-average method.
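For illustration, both clustering options could be realized with standard libraries as follows; the use of scikit-learn and SciPy, and the cluster count k, are assumptions rather than part of the patent.

```python
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.cluster import KMeans

def cluster_kmeans(z_params, k):
    # n_init=10 restarts k-means from random initial values and keeps the
    # run minimizing the evaluation function (the within-cluster sum of
    # squared distances to the centroids), as described above.
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(z_params)
    return km.labels_

def cluster_ward(z_params, k):
    # Ward linkage merges, at each step, the pair of clusters giving the
    # smallest increase in the within-cluster sum of squares; cutting the
    # dendrogram at k clusters yields the class labels.
    tree = linkage(z_params, method="ward")
    return fcluster(tree, t=k, criterion="maxclust")
```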

Furthermore, the invention according to claim 5 is the blink data classification device according to any one of claims 1 to 4, wherein
the blink data is feature quantity data extracted from blink video data, which is moving image data including at least one whole eye at the time of each blink of the subject, corresponding to the blink EOG waveform.
With such a configuration, identification information can be assigned to the feature quantity data extracted from the blink video captured at the time of measuring the blink EOG waveform.

For example, a statistical model such as an HMM can be trained using the feature quantity data carrying identification information as learning data, which yields a pattern model that can identify the type of the blink EOG waveform from the feature quantity data of blink video data. In other words, once feature quantity data of blink video data is available, the type of the subject's blink can be identified, so the identification process can be performed easily from blink video data alone, without measuring the subject's EOG waveform.

On the other hand, to achieve the above object, the pattern model generation device according to claim 6 comprises:
the blink data classification device according to any one of claims 1 to 4; and
pattern model generation means for generating a pattern model that receives the blink data as input and outputs identification data of the blink data, by training a statistical model using, as learning data, the blink data to which the identification information has been assigned in the blink data classification device.

With such a configuration, the pattern model generation means can train the statistical model using, as learning data, the blink data to which identification information was assigned in the blink data classification device, and generate a pattern model that receives blink data as input and outputs its identification data.
Therefore, a pattern model trained on blink data that has been classified stably and accurately can be generated, so a pattern model with a high ability to discriminate EOG waveform types can be obtained.

Here, the pattern model is, for example, a model obtained by training a statistical model of the signal patterns against which blink data is matched; examples of the statistical model include an HMM and a neural network. The same applies hereinafter to the pattern model generation device according to claim 8.
The identification data is data indicating the identification result itself, that is, the blink waveform type for the input blink data, or data for identifying the blink waveform type, such as the likelihood of each blink waveform type for the input blink data; its form corresponds to the kind of statistical model constituting the pattern model and to the pattern model's specification. The same applies hereinafter to the pattern model generation device according to claim 8.

Furthermore, the invention according to claim 7 is the pattern model generation device according to claim 6, wherein
the statistical model is an HMM (Hidden Markov Model).
With such a configuration, a well-known HMM is used as the statistical model, so a pattern model can be obtained that identifies EOG waveform types with high accuracy for actions with a temporal structure, such as blinking.

Here, an "HMM" is a statistical model of time-series signals; a non-stationary time series can be modeled as transitions among a plurality of stationary signal sources. For example, the duration of speech varies with speaking speed, and the utterance content shows a characteristic shape in the frequency domain (called the spectral envelope), but that shape fluctuates depending on the speaker, the environment, the content, and so on. An HMM is a statistical model that can absorb such fluctuations. The same applies hereinafter to the pattern model generation device according to claim 9.
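As a minimal sketch of such a model, the following uses the third-party hmmlearn package (an assumed choice; the patent names no library) to fit a Gaussian HMM to variable-length blink feature sequences and to score a new sequence by its log-likelihood:

```python
import numpy as np
from hmmlearn import hmm

def train_hmm(sequences, n_states=3):
    """Fit one GaussianHMM to a list of (n_frames, n_features) sequences.

    Variable sequence lengths are handled via the `lengths` argument,
    which is how an HMM absorbs duration fluctuations between blinks.
    """
    X = np.concatenate(sequences)
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states,
                            covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

# model.score(seq) then returns the log-likelihood of a new sequence,
# i.e. the kind of likelihood output described for the pattern model.
```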

To achieve the above object, the pattern model generation device according to claim 8 comprises:
the blink data classification device according to claim 5; and
pattern model generation means for generating a pattern model that receives the blink data as input and outputs identification data of the blink data, by training a statistical model using, as learning data, the blink data to which the identification information has been assigned in the blink data classification device.

With such a configuration, the pattern model generation means can train a statistical model using, as learning data, the blink data (feature quantity data of blink videos) to which identification information was assigned in the blink data classification device, and generate a pattern model that receives the blink data (feature quantity data) as input and outputs identification data for it.

Accordingly, a pattern model trained on feature quantity data of blink video data that has been classified stably and accurately can be generated, so a pattern model with a high ability to identify EOG waveform types (one that identifies them accurately) can be obtained, and a pattern model that can identify the type of the subject's blink from the feature quantity data of blink video data can be generated.

Furthermore, the invention according to claim 9 is the pattern model generation device according to claim 8, wherein
the statistical model is an HMM (Hidden Markov Model).
With such a configuration, a well-known HMM is used as the statistical model, so a pattern model can be obtained that identifies EOG waveform types with high accuracy for actions with a temporal structure, such as blinking.

On the other hand, to achieve the above object, the blink waveform appearance frequency information generating device according to claim 10 comprises:
a pattern model generated by the pattern model generation device according to claim 6 or 7;
electrooculogram waveform measuring means for measuring an EOG waveform when the subject blinks;
blink data extraction means for extracting blink data corresponding to the pattern model from the EOG waveform data, which is data of the EOG waveform measured by the electrooculogram waveform measuring means;
blink waveform identification means for identifying the type of the EOG waveform corresponding to the blink data, based on the blink data extracted by the blink data extraction means and the pattern model; and
appearance frequency information generating means for generating appearance frequency information indicating the temporal change in the appearance frequency of each EOG waveform type within a predetermined period, based on the identification results of the blink waveform identification means for the subject's blink EOG waveforms measured during the predetermined period.

With such a configuration, the electrooculogram waveform measuring means can measure the EOG waveform when the subject blinks; the blink data extraction means can extract blink data corresponding to the pattern model from the measured EOG waveform data; and the blink waveform identification means can identify the type of the EOG waveform corresponding to the blink data, based on the extracted blink data and the pattern model.
Further, based on the identification results of the blink waveform identification means for the subject's blink EOG waveforms measured during a predetermined period, the appearance frequency information generating means can generate appearance frequency information indicating the temporal change in the appearance frequency of each EOG waveform type within that period.

Therefore, since the appearance frequency information captures changes in the occurrence frequency of specific blink types within a predetermined time, such as the occurrence frequency of a specific blink type of the subject or bursts of a specific blink type, the subject's various arousal levels (states) can be determined from the appearance frequency information by establishing in advance, through experiments or the like, the relationship between the temporal change in appearance frequency and the various arousal levels (states) from high wakefulness through the transition into dozing.
In other words, the wakefulness state can be determined with high accuracy based on changes in the occurrence frequency of specific blink types within a predetermined time, such as the occurrence frequency of specific blink types and bursts of specific blink types, which are physiologically effective indicators of the wakefulness state.
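To make "appearance frequency information" concrete, here is a sketch under assumed parameters: blink timestamps and identified waveform types are counted per type inside a sliding window (the window and step lengths are illustrative, not specified by the patent).

```python
import numpy as np

def appearance_frequency(times, labels, classes, window_s=60.0, step_s=10.0):
    """Count each blink type inside a sliding window over time.

    times: blink timestamps in seconds; labels: identified type of each
    blink; classes: the list of blink waveform types. Returns the window
    start times and an array of per-type counts for each window.
    """
    times = np.asarray(times, dtype=float)
    labels = np.asarray(labels)
    starts = np.arange(times.min(), times.max() - window_s, step_s)
    freq = np.zeros((len(starts), len(classes)), dtype=int)
    for i, t0 in enumerate(starts):
        in_win = (times >= t0) & (times < t0 + window_s)
        for j, c in enumerate(classes):
            freq[i, j] = int(np.sum(in_win & (labels == c)))
    return starts, freq
```

A downstream wakefulness determination could, for example, compare the per-type counts in `freq` against thresholds established experimentally, in line with the description above.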

To achieve the above object, the blink waveform appearance frequency information generating device according to claim 11 comprises:
a pattern model generated by the pattern model generation device according to claim 8 or 9;
photographing means for photographing a blink video including a moving image of at least one whole eye when the subject blinks;
feature quantity data extraction means for extracting feature quantity data corresponding to the pattern model from the blink video data, which is data of the blink video captured by the photographing means;
blink waveform identification means for identifying the type of the EOG waveform corresponding to the blink video from which the feature quantity data was extracted, based on the feature quantity data extracted by the feature quantity data extraction means and the pattern model; and
appearance frequency information generating means for generating appearance frequency information indicating the temporal change in the appearance frequency of each EOG waveform type corresponding to the blink videos within a predetermined period, based on the identification results of the blink waveform identification means for the subject's blink videos captured during the predetermined period.

With such a configuration, the photographing means can photograph a blink video including a moving image of at least one whole eye when the subject blinks; the feature quantity data extraction means can extract feature quantity data corresponding to the pattern model from the captured blink video data; and the blink waveform identification means can identify the type of the EOG waveform corresponding to the blink video, based on the extracted feature quantity data and the pattern model.
Further, based on the identification results of the blink waveform identification means for the subject's blink videos captured during a predetermined period, the appearance frequency information generating means can generate appearance frequency information indicating the temporal change in the appearance frequency of each corresponding EOG waveform type within that period.

Therefore, since the appearance frequency information captures changes in the occurrence frequency of specific blink types within a predetermined time, such as the occurrence frequency of a specific blink type of the subject or bursts of a specific blink type, the subject's various arousal levels (states) can be determined from the appearance frequency information by establishing in advance, through experiments or the like, the relationship between the temporal change in appearance frequency and the various arousal levels (states) from high wakefulness through the transition into dozing. In other words, the wakefulness state can be determined with high accuracy based on changes in the occurrence frequency of specific blink types within a predetermined time, which are physiologically effective indicators of the wakefulness state.
In addition, the type of the subject's blink (the electrooculogram (EOG) waveform) can be identified simply by capturing blink video data; compared with directly measuring the blink EOG waveform, the blink waveform can be identified easily, without attaching measurement members such as electrodes to the subject.

On the other hand, to achieve the above object, the wakefulness state determination device according to claim 12 comprises:
the blink waveform appearance frequency information generating device according to claim 10 or 11; and
wakefulness state determination means for determining the wakefulness state of the subject based on the appearance frequency information generated by the blink waveform appearance frequency information generating device.
With such a configuration, the wakefulness state determination means can determine the subject's wakefulness state based on the appearance frequency information generated by the blink waveform appearance frequency information generating device.
Therefore, for example, the various wakefulness states that the subject passes through between high wakefulness and dozing can be determined.

Moreover, to achieve the above object, the wakefulness state determination device according to claim 13 comprises:
blinking electrooculogram waveform measuring means for measuring an EOG waveform when the subject blinks;
blink data extraction means for extracting blink data from the EOG waveform data, which is data of the EOG waveform measured by the blinking electrooculogram waveform measuring means;
blink waveform identification means for identifying the type of the blink waveform corresponding to the blink data, based on the blink data extracted by the blink data extraction means and identification data of blink waveform types stored in advance;
appearance frequency information generating means for generating appearance frequency information indicating the temporal change in the appearance frequency of each blink waveform type within a predetermined period, based on the identification results of the blink waveform identification means for the EOG waveforms measured during the predetermined period; and
wakefulness state determination means for determining the wakefulness state of the subject based on the appearance frequency of a predetermined type of blink waveform obtained from the appearance frequency information generated by the appearance frequency information generating means.

With such a configuration, the blinking electrooculogram waveform measuring means can measure the EOG waveform when the subject blinks; the blink data extraction means can extract blink data from the measured EOG waveform data; and the blink waveform identification means can identify the type of the blink waveform corresponding to the blink data, based on the extracted blink data and the identification data of blink waveform types stored in advance.

Further, based on the identification results of the blink waveform identification means for the EOG waveforms measured during a predetermined period, the appearance frequency information generating means can generate appearance frequency information indicating the temporal change in the appearance frequency of each blink waveform type within that period, and the wakefulness state determination means can determine the subject's wakefulness state from the appearance frequency of a predetermined type of blink waveform obtained from that appearance frequency information.

Therefore, since the appearance frequency information captures changes in the occurrence frequency of specific blink types within a predetermined time, such as the occurrence frequency of a specific blink type of the subject or bursts of a specific blink type, the subject's various arousal levels (states) can be determined from the appearance frequency information by establishing in advance, through experiments or the like, the relationship between the temporal change in appearance frequency and the various arousal levels (states) from high wakefulness through the transition into dozing. In other words, the wakefulness state can be determined with high accuracy based on changes in the occurrence frequency of specific blink types within a predetermined time, which are physiologically effective indicators of the wakefulness state.
Here, the identification data is data from which the blink waveform type corresponding to the feature quantity data can be identified; examples include a pattern model that receives the feature quantity data as input and outputs identification data for it, and a data table (database) in which the relationships between various feature quantity data and blink waveform types are registered.

Furthermore, the invention according to claim 14 is the wakefulness state determination device according to claim 13, wherein
the appearance frequency of the predetermined type of blink waveform is the appearance frequency of a plurality of types of blink waveform.
With such a configuration, changes in the occurrence frequency of specific plural blink types of the subject within a predetermined time, such as bursts of specific blink types, can be captured; by establishing in advance, through experiments or the like, the relationship between the temporal change in appearance frequency and the various arousal levels (states) from high wakefulness through the transition into dozing, the subject's various arousal levels (states) can be determined even more accurately from the appearance frequency information. In other words, various types of blink occur before the transition from high wakefulness to dozing, and by focusing the determination on the appearance frequencies of the plural blink types particularly related to the arousal level (state), the accuracy of the arousal level determination can be improved.

To achieve the above object, the wakefulness state determination device according to claim 15 comprises:
blinking electrooculogram waveform measuring means for measuring an EOG waveform when the subject blinks;
a pattern model that receives feature quantity data extracted from the EOG waveform data, which is data of the EOG waveform, and outputs identification data of the blink waveform type corresponding to the feature quantity data;
feature quantity extraction means for extracting feature quantity data from the EOG waveform data, which is data of the EOG waveform measured by the blinking electrooculogram waveform measuring means;
blink type identification means for identifying the blink waveform type for the feature quantity data, based on the feature quantity data extracted by the feature quantity extraction means and the pattern model;
blink waveform type appearance frequency information generating means for generating appearance frequency information indicating the temporal change in the appearance frequency of each blink waveform type within a predetermined period, based on the identification results of the blink type identification means for the subject's EOG waveforms measured during the predetermined period; and
wakefulness state determination means for determining the wakefulness state of the subject based on the appearance frequency information generated by the blink waveform type appearance frequency information generating means.

With such a configuration, the blinking electrooculogram waveform measuring means can measure the EOG waveform when the subject blinks; the feature quantity extraction means can extract feature quantity data from the measured EOG waveform data; and the blink type identification means can identify the blink waveform type for the feature quantity data, based on the extracted feature quantity data and the pattern model.

Further, based on the identification results of the blink type identification means for the subject's EOG waveforms measured during a predetermined period, the blink waveform type appearance frequency information generating means can generate appearance frequency information indicating the temporal change in the appearance frequency of each blink waveform type within that period, and the wakefulness state determination means can determine the subject's wakefulness state based on that appearance frequency information.

Therefore, since the appearance frequency information captures changes in the occurrence frequency of specific blink types within a predetermined time, such as the occurrence frequency of a specific blink type of the subject or bursts of a specific blink type, the subject's various arousal levels (states) can be determined from the appearance frequency information by establishing in advance, through experiments or the like, the relationship between the temporal change in appearance frequency and the various arousal levels (states) from high wakefulness through the transition into dozing.
In other words, the wakefulness state can be determined with high accuracy based on changes in the occurrence frequency of specific blink types within a predetermined time, which are physiologically effective indicators of the wakefulness state.

To achieve the above object, the wakefulness state determination device according to claim 16 comprises:
blink video photographing means for photographing a blink video including a moving image of at least one whole eye when the subject blinks;
a pattern model that receives feature quantity data extracted from the blink video data, which is data of the blink video, and outputs identification data of the blink waveform type corresponding to the feature quantity data;
feature quantity extraction means for extracting feature quantity data from the blink video data, which is data of the blink video captured by the blink video photographing means;
blink type identification means for identifying the blink waveform type for the feature quantity data, based on the feature quantity data extracted by the feature quantity extraction means and the pattern model;
blink waveform type appearance frequency information generating means for generating appearance frequency information indicating the temporal change in the appearance frequency of each EOG waveform type corresponding to the blink videos within a predetermined period, based on the identification results of the blink type identification means for the subject's blink videos captured during the predetermined period; and
wakefulness state determination means for determining the wakefulness state of the subject based on the appearance frequency information generated by the blink waveform type appearance frequency information generating means.

With such a configuration, the blink video photographing means can photograph a blink video including a moving image of at least one whole eye when the subject blinks; the feature quantity extraction means can extract feature quantity data from the captured blink video data; and the blink type identification means can identify the blink waveform type for the feature quantity data, based on the extracted feature quantity data and the pattern model.

Further, based on the identification results of the blink type identification means for the subject's blink videos captured during a predetermined period, the blink waveform type appearance frequency information generating means can generate appearance frequency information indicating the temporal change in the appearance frequency of each corresponding EOG waveform type within that period, and the wakefulness state determination means can determine the subject's wakefulness state based on that appearance frequency information.

Therefore, since the appearance frequency information captures changes in the occurrence frequency of specific blink types within a predetermined time, such as the occurrence frequency of a specific blink type of the subject or bursts of a specific blink type, the subject's various arousal levels (states) can be determined from the appearance frequency information by establishing in advance, through experiments or the like, the relationship between the temporal change in appearance frequency and the various arousal levels (states) from high wakefulness through the transition into dozing. In other words, the wakefulness state can be determined with high accuracy based on changes in the occurrence frequency of specific blink types within a predetermined time, which are physiologically effective indicators of the wakefulness state.

In addition, the type of the subject's blink (the electrooculogram (EOG) waveform) can be identified simply by capturing blink video data; compared with directly measuring the blink EOG waveform, the blink waveform can be identified easily, without attaching measurement members such as electrodes to the subject.

To achieve the above object, the warning device according to claim 17 comprises:
the wakefulness state determination device according to claim 12 or 13; and
warning means for giving a warning to the subject based on the determination result of the wakefulness state by the wakefulness state determination device.
With such a configuration, the warning means can give a warning to the subject based on the determination result of the wakefulness state by the wakefulness state determination device.

Therefore, for example, by generating in real time the appearance frequency information of a driver in a vehicle over a predetermined period, the arousal level (state) of a driver who is transitioning toward dozing can be determined from the appearance frequency information before the driver falls completely asleep, so accidents can be prevented by giving warnings such as warning sounds or flashing lights to a subject who is likely to fall asleep.
Here, the warning includes audible warnings such as a buzzer sound or a warning voice message, visual warnings such as lighting or flashing of lights, and warnings that apply vibration or impact to the subject; these may also be combined arbitrarily to warn the subject.

[First Embodiment]
Hereinafter, a first embodiment of the present invention will be described with reference to the drawings. FIGS. 1 to 10 show an embodiment of the blink data classification device, blink data classification program, and blink data classification method, and of the pattern model generation device, pattern model generation program, and pattern model generation method according to the present invention.

First, the configuration of the pattern model generation device according to the present invention will be described with reference to FIG. 1, a block diagram showing the configuration of a pattern model generation device 100 according to the present invention.
As shown in FIG. 1, the pattern model generation device 100 includes: a blink waveform data storage unit 10 that stores data of blink electrooculogram (EOG) waveforms (hereinafter, EOG waveform data); a blink video data storage unit 11 that stores video data of blinks captured simultaneously with the measurement of the blink EOG waveforms (hereinafter, blink video data); a parameter extraction unit 12 that extracts parameters from the blink waveform data; a feature quantity data extraction unit 13 that extracts feature quantity data from the blink video data; and a classification target data storage unit 14 that stores the parameters extracted by the parameter extraction unit 12 and the feature quantity data extracted by the feature quantity data extraction unit 13 in association with each other.

The blink waveform data storage unit 10 stores, in a predetermined area of a storage device 70 described later, plural types of blink EOG waveform data measured for a plurality of subjects; specifically, for example, EOG waveform data of blinks of at least one of the subject's left or right eye.
The blink video data storage unit 11 stores, in a predetermined area of the storage device 70 described later, the blink video data corresponding to the EOG waveform data of at least one of the left or right eye stored in the blink waveform data storage unit 10. Here, the blink video data is video captured at the same time as the EOG waveform measurement and consists of moving image data including at least one of the left and right eyes when the subject blinks. The EOG waveform data and the blink video data are associated with each other by identification information, so either one can be found from the other.

The parameter extraction unit 12 extracts a plurality of types of parameters with different units from the EOG waveform data stored in the blink waveform data storage unit 10.
In the present embodiment, three parameters are extracted: distance data of the peak height of the EOG waveform, time data from the start of the blink to the peak height, and time data from the peak height to the end of the blink.
The feature quantity data extraction unit 13 extracts feature quantity data from the blink video data stored in the blink video data storage unit 11.
In the present embodiment, for each frame of the blink video data, the total luminance of each line within a predetermined region is calculated, and this luminance-sum data of each item of blink video data is used as the feature quantity data.
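For illustration, the per-line luminance summation described above could look like the following sketch; the grayscale input and the (top, bottom, left, right) region format are assumptions, not taken from the patent.

```python
import numpy as np

def frame_features(frames, region):
    """Per-frame feature vectors: the total luminance of each line.

    frames: an iterable of grayscale images as 2-D arrays; region: a
    (top, bottom, left, right) tuple bounding the predetermined eye area.
    Returns an (n_frames, n_lines) array used as feature quantity data.
    """
    top, bottom, left, right = region
    feats = []
    for f in frames:
        roi = np.asarray(f, dtype=float)[top:bottom, left:right]
        feats.append(roi.sum(axis=1))  # one luminance total per line (row)
    return np.stack(feats)
```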

The classification target data storage unit 14 stores, in a predetermined area of the storage device 70 described later, the three types of parameters extracted from the EOG waveform data by the parameter extraction unit 12 and the feature quantity data of each item of blink video data extracted by the feature quantity data extraction unit 13, associated with each other based on the identification information assigned to the EOG waveform data and the blink video data.

The pattern model generation device 100 further includes: a parameter normalization unit 15 that normalizes the parameters stored in the classification target data storage unit 14; a parameter classification unit 16 that classifies the parameters normalized by the parameter normalization unit 15; an identification information assignment unit 17 that, based on the classification result of the parameter classification unit 16, automatically assigns identification information identifying the blink type to the parameters and feature quantity data stored in the classification target data storage unit 14; and a learning data storage unit 18 that stores the parameters and feature quantity data to which the identification information has been assigned by the identification information assignment unit 17 as learning data.

The parameter normalization unit 15 normalizes the plurality of types of parameters with different units using a predetermined normalization method; in the present embodiment, the well-known Z-score method is used.
The parameter classification unit 16 classifies the normalized parameters (hereinafter, normalization parameters) using a predetermined clustering method. In the present embodiment, the normalization parameters are clustered using whichever of two methods is designated: the well-known Ward method, a hierarchical clustering method, or the well-known k-means method, a division-optimization clustering method. In the present embodiment, the clustering result is used as the classification result: information on the class to which each normalization parameter belongs is assigned to that parameter.

Based on the classification result of the parameter classification unit 16, the identification information assignment unit 17 takes the class to which each classified normalization parameter belongs as the identification information of the blink type, and automatically assigns this identification information to the pre-normalization parameter corresponding to each classified normalization parameter and to the feature quantity data corresponding to that parameter.
The learning data storage unit 18 stores the parameters and feature quantity data to which identification information has been assigned by the identification information assignment unit 17 as learning data in a predetermined area of the storage device 70 described later.
The pattern model generation device 100 further includes a statistical model learning unit 19 that trains a statistical model using the learning data stored in the learning data storage unit 18, and a pattern model storage unit 20 that stores the pattern model comprising the trained statistical model.

In response to an instruction to generate a blink waveform pattern model, the statistical model learning unit 19 trains a statistical model using the parameters stored in the learning data storage unit 18 as learning data and generates a blink waveform pattern model. In response to an instruction to generate a blink video pattern model, it trains the statistical model using the feature quantity data stored in the learning data storage unit 18 as learning data and generates a blink video pattern model. In this embodiment, an HMM (Hidden Markov Model) is used as the statistical model.
The pattern model storage unit 20 stores the blink waveform pattern model and the blink video pattern model generated by the statistical model learning unit 19 in a predetermined area of the storage device 70 described later.
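A hypothetical sketch of how such pattern models could be built and used: one HMM is trained per blink class from the labeled learning data (again assuming the hmmlearn package), and new blink data is identified by the class whose model yields the highest log-likelihood.

```python
import numpy as np
from hmmlearn import hmm

def build_pattern_models(sequences, labels, n_states=3):
    # Train one GaussianHMM per blink class from labeled sequences.
    models = {}
    for c in sorted(set(labels)):
        seqs = [s for s, l in zip(sequences, labels) if l == c]
        m = hmm.GaussianHMM(n_components=n_states,
                            covariance_type="diag", n_iter=50)
        m.fit(np.concatenate(seqs), [len(s) for s in seqs])
        models[c] = m
    return models

def identify(models, sequence):
    # Identification data: the blink class whose model assigns the
    # highest log-likelihood to the new sequence.
    return max(models, key=lambda c: models[c].score(sequence))
```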

Furthermore, the pattern model generation device 100 includes a computer system for realizing control of the above units in software. As shown in FIG. 2, its hardware configuration connects a CPU (Central Processing Unit) 60 that performs various control and arithmetic processing, a RAM (Random Access Memory) 62 constituting the main storage, and a ROM (Read Only Memory) 64, a read-only storage device, via various internal and external buses 68 such as a PCI (Peripheral Component Interconnect) bus and an ISA (Industry Standard Architecture) bus. Connected to these via an input/output interface (I/F) 66 are an internal or external secondary storage device 70 such as an HDD (Hard Disk Drive), an output device 72 such as a printer or a CRT or LCD monitor, an input device 74 such as an operation panel, mouse, keyboard, or scanner, and a network L for communicating with external devices (not shown).

When the power is turned on, a system program such as the BIOS stored in the ROM 64 loads into the RAM 62 the various dedicated computer programs stored in advance in the ROM 64, or installed in the storage device 70 via a medium such as a CD-ROM, DVD-ROM, or flexible disk (FD) or via a communication network L such as the Internet. The CPU 60 then performs predetermined control and arithmetic processing according to the instructions described in the programs loaded in the RAM 62, making full use of various resources, whereby each function of the means described above is realized in software.

Next, the flow of the parameter extraction process from the blink waveform data and the feature quantity data extraction process from the blink video data in the pattern model generation device 100 configured as described above will be described with reference to FIG. 3, a flowchart of these processes.
As shown in the flowchart of FIG. 3, the process first proceeds to step S100, where it is determined whether there is an extraction instruction from the user. If it is determined that there is an extraction instruction (Yes), the process proceeds to step S102; if not (No), it waits until an extraction instruction is given.

In step S102, the parameter extraction unit 12 acquires the blink waveform data from the storage device 70 via the blink waveform data storage unit 10, and the process proceeds to step S104.
In step S104, the parameter extraction unit 12 extracts a plurality of types of parameters with different units from the blink waveform data acquired in step S102, outputs the extracted parameters to the classification target data storage unit 14, and proceeds to step S106.

In step S106, the parameter extraction unit 12 determines whether there is blink video data corresponding to the blink waveform data acquired in step S102. If so (Yes), it issues an extraction instruction to the feature quantity data extraction unit 13 and the process proceeds to step S108; otherwise (No), the process proceeds to step S116.
In step S108, the feature quantity data extraction unit 13 acquires the blink video data from the storage device 70 via the blink video data storage unit 11, and the process proceeds to step S110.

In step S110, the feature amount data extraction unit 13 extracts feature amount data from the blink video data acquired in step S108, outputs the extracted feature amount data to the classification target data storage unit 14, and proceeds to step S112.
In step S112, in the classification target data storage unit 14, the parameter input from the parameter extraction unit 12 and the feature amount data input from the feature amount data extraction unit 13 are associated with each other and stored in a predetermined area of the storage device 70. Then, the process proceeds to step S114.

In step S114, the parameter extraction unit 12 determines whether the parameter extraction processing and the feature amount data extraction processing have been completed for all of the blink waveform data acquired in step S102. If it is determined that they have been completed (Yes), the extraction processing ends; otherwise (No), the process proceeds to step S104.
On the other hand, if there is no blink video data corresponding to the blink waveform data in step S106 and the process proceeds to step S116, the parameter input from the parameter extraction unit 12 is stored by the classification target data storage unit 14 in a predetermined area of the storage device 70, and the process proceeds to step S114.

Next, the flow of the learning data generation processing in the pattern model generation apparatus 100 will be described with reference to FIG. 4. Here, FIG. 4 is a flowchart showing the learning data generation processing in the pattern model generation apparatus 100.
In the learning data generation processing, as shown in the flowchart of FIG. 4, the process first proceeds to step S200, where it is determined whether there is an instruction to generate learning data from the user via the input device 74 or the like. If it is determined that there is (Yes), the process proceeds to step S202; if not (No), the process waits until a generation instruction is given.

When the process proceeds to step S202, the parameter normalization unit 15 acquires the parameter from the classification target data storage unit 14, and the process proceeds to step S204.
In step S204, the parameter normalization unit 15 normalizes the parameter acquired in step S202, and the process proceeds to step S206.
In step S206, the parameter classification unit 16 classifies the parameter normalized in step S204 using a predetermined clustering method, and the process proceeds to step S208.

In step S208, the identification information adding unit 17 acquires from the classification target data storage unit 14 the parameter and feature amount data corresponding to each normalized parameter classified in step S206, and proceeds to step S210.
In step S210, the identification information adding unit 17 adds identification information to the parameter and feature amount data acquired in step S208 based on the classification result of step S206, and the process proceeds to step S212.

In step S212, the identification information adding unit 17 stores the parameter and feature amount data to which the identification information was added in step S210 as learning data in a predetermined area of the storage device 70 via the learning data storage unit 18, and the processing ends. Here, the identification information is information identifying each class (blink type) obtained by the clustering.
Next, the flow of the pattern model generation processing in the pattern model generation apparatus 100 will be described with reference to FIG. 5. Here, FIG. 5 is a flowchart showing the pattern model generation processing in the pattern model generation apparatus 100.

  As shown in the flowchart of FIG. 5, the pattern model generation processing first proceeds to step S300, where the statistical model learning unit 19 determines whether an instruction to generate a blink waveform pattern model has been received from the user via the input device 74 or the like. If it is determined that there has been a generation instruction (Yes), the process proceeds to step S302; if not (No), the process proceeds to step S308.

When the process proceeds to step S302, the statistical model learning unit 19 acquires the parameter to which the identification information is added from the learning data storage unit 18, and the process proceeds to step S304.
In step S304, the statistical model learning unit 19 generates a blink waveform pattern model by learning the statistical model using the parameters acquired in step S302 as learning data, and proceeds to step S306.

In step S306, the statistical model learning unit 19 stores the pattern model generated in step S304 or S312 in a predetermined area of the storage device 70 via the pattern model storage unit 20, and ends the process.
On the other hand, if no blink waveform pattern model generation instruction was received in step S300 and the process proceeds to step S308, the statistical model learning unit 19 determines whether a blink video pattern model generation instruction has been received. If it is determined that there has been a generation instruction (Yes), the process proceeds to step S310; if not (No), the process proceeds to step S300.

When the process proceeds to step S310, the statistical model learning unit 19 acquires the feature amount data provided with the identification information from the learning data storage unit 18, and the process proceeds to step S312.
In step S312, the statistical model learning unit 19 generates a blink video pattern model by learning the statistical model using the feature data acquired in step S310 as learning data, and proceeds to step S306.

Next, the operation of the present embodiment will be described with reference to FIGS. 6 to 10 and FIG. 18.
Here, FIG. 6 is an explanatory diagram of the parameters extracted from the blink waveform data, and FIGS. 7(a) and 7(b) are explanatory diagrams of the feature amount data extracted from the blink video data. FIG. 8 is a diagram showing the blink waveforms of one subject classified by the method of the present invention using three types of parameters, and FIG. 9 is a diagram showing the blink waveforms of a subject classified by the method of the present invention using two types of parameters. FIGS. 10(a) to 10(c) are radar charts of Z scores for one subject, and FIG. 18 is a diagram illustrating an example of the HMM.

  In the pattern model generation apparatus 100, before performing the above-described parameter and feature amount data extraction processing, learning data generation processing, and pattern model generation processing, it is necessary to prepare in advance electrooculogram (EOG) waveform data (blink waveform data) of multiple types of blinks from a plurality of subjects, together with the blink video data captured when the blink waveform data was measured.

  In this embodiment, blinking electrooculogram (EOG) waveforms of subjects including a plurality of healthy men and women are measured and blink video data is captured in advance, and blink waveform data and blink video data are thereby collected. Specifically, depending on the degree of sleepiness of the subject, the electrooculogram (EOG) waveform measurement is performed intermittently several times, with each session lasting 30 minutes to 1 hour; blink video data is captured at the same time, so that the blink waveform data and blink video data for the blinks performed within each measurement session are acquired.

  For the electrooculogram (EOG) waveform, electrodes were attached above and below the subject's right eye, and a vertical EOG waveform was measured using an AC amplifier for biometric measurement (BIOPAC; time constant 3.2 [sec], gain 5000×, 35 [Hz] cutoff low-pass filter). A blinking portion is then detected from the vertical EOG waveform. The blink detection method is the same as the known method (Hiroaki Yuse, Hideko Tada: Automatic blink detection and blink waveform analysis, Ergonomics, Vol. 30, No. 5, pp. 331-337 (1994)): using the differential value (first difference) of the waveform, the time at which this differential value exceeds a preset start threshold is taken as the blink start point, the time at which the differential value crosses a preset end threshold is taken as the blink end point, and the waveform from the start point to the end point is detected as a blink waveform.
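As an illustration of this detection scheme, the following is a minimal Python sketch. It assumes a sampled vertical EOG trace with an upward blink deflection, a positive start threshold, and an end threshold applied to the flattening of the falling slope; the actual thresholds and sign conventions are not specified in this document and are chosen here purely for illustration.

```python
import numpy as np

def detect_blinks(eog, start_th, end_th):
    """Detect blink segments in a vertical EOG trace using the first
    difference of the waveform, as in the detection method above.

    eog      : 1-D array of EOG samples (upward blink deflection assumed)
    start_th : positive slope threshold marking the blink start point
    end_th   : small positive value; the blink ends when the falling
               slope flattens back above -end_th (one plausible reading
               of the "end threshold")
    Returns a list of (start_index, end_index) sample pairs.
    """
    d = np.diff(eog)                     # first difference (differential value)
    blinks, start, falling = [], None, False
    for i, v in enumerate(d):
        if start is None:
            if v > start_th:             # steep rise: blink start point
                start = i
        elif not falling:
            if v < -end_th:              # steep fall after the peak
                falling = True
        elif v > -end_th:                # slope flattens: blink end point
            blinks.append((start, i))
            start, falling = None, False
    return blinks
```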

  In the present embodiment, assuming shooting with a camera built into the inner mirror of an automobile, the blink video data was shot with a camera combining an infrared LED irradiation device and a CCD (charge coupled device) camera. A face image of the subject was photographed with this camera, and an image of the right-eye portion was detected from the face image using a known SVM (Support Vector Machine) to obtain the blink video data. For more information on SVM, refer to "Introduction to Support Vector Machine, Takio Kurita" on the web page at URL "http://www.neurosci.aist.go.jp/~kuita/lecture/svm/svm.html" (as of May 12, 2006).

  The blink waveform data and blink video data collected as described above are stored in the blink waveform data storage unit 10 and the blink video data storage unit 11, respectively, with combination information attached so that the correspondence between each blink waveform and the blink video data for the same blink can be identified. The presence or absence of blink video data corresponding to given blink waveform data can therefore be determined from the combination information attached to the blink waveform data.

First, the operation of the parameter and feature amount data extraction processing in the pattern model generation apparatus 100 will be described with reference to FIGS. 6 and 7.
When the pattern model generation apparatus 100 is given an extraction instruction from the user via the input device 74 or the like in a state where the blink waveform data and blink video data are prepared (step S100), the parameter extraction unit 12 acquires the blink waveform data from the storage device 70 via the blink waveform data storage unit 10 (step S102). In the present embodiment, the blink waveform data can be acquired by selecting whether to automatically acquire all stored data or only new data, or to acquire data designated by the user.

  When the blink waveform data is acquired, three parameters are extracted from it: the height (distance) of the peak point of the waveform, the time from the blink start point to the peak point, and the time from the peak point to the blink end point (step S104). For example, if the blink waveform data shows a waveform as in FIG. 6, the peak point height x1 is obtained as the difference between the level (voltage or current) at the blink start point and the level at the waveform peak point. In the example of FIG. 6, the peak height of the waveform is x1 [mm], the time x2 from the blink start point to the peak point (hereinafter, rise time x2) is 28 [ms], and the time x3 from the peak point to the blink end point (fall time x3) is 52 [ms]. In the present embodiment, the peak point height (distance) data x1 is used as measured, while the rise and fall time data x2 and x3 are logarithmically transformed. These three parameters x1, x2, and x3 are extracted sequentially from the acquired blink waveform data. Each time parameters are extracted from blink waveform data, the presence or absence of blink video data corresponding to that blink waveform data is determined based on the combination information described above (step S106), and if corresponding blink video data exists ("Yes" branch of step S106), an extraction instruction is given to the feature amount data extraction unit 13.
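A sketch of this parameter extraction might look as follows; the sampling rate input, the use of base-10 logarithms, and the function name are assumptions, since the document does not specify them.

```python
import numpy as np

def extract_parameters(blink, fs):
    """Extract the three parameters (x1, log x2, log x3) from one
    detected blink waveform.

    blink : 1-D array of EOG samples from the blink start point to
            the blink end point
    fs    : sampling rate [Hz] (an assumed input; not given above)
    """
    peak = int(np.argmax(blink))
    x1 = blink[peak] - blink[0]                       # peak height: peak level minus start level
    x2 = 1000.0 * max(peak, 1) / fs                   # rise time [ms], start point to peak point
    x3 = 1000.0 * max(len(blink) - 1 - peak, 1) / fs  # fall time [ms], peak point to end point
    # x1 is used as measured; the two time parameters are log-transformed
    # (base 10 chosen here; the document does not state the base).
    return np.array([x1, np.log10(x2), np.log10(x3)])
```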

In response to the extraction instruction from the parameter extraction unit 12, the feature amount data extraction unit 13 acquires, from the storage device 70 via the blink video data storage unit 11, the blink video data corresponding to the blink waveform data from which the parameters were extracted (step S108).
When the blink video data is acquired, feature amount data is extracted from it (step S110). Specifically, as shown in FIG. 7(a), an extraction area image of 11 pixels wide × 30 pixels high centered on the eyeball portion is cut out of each blink image constituting the blink video data, and the total luminance of each line (11 pixels) constituting the extracted image is calculated, generating, for example, luminance total data for each of the 30 lines of an extraction area image exhibiting the characteristics shown in FIG. 7(b). In the present embodiment, the luminance total data generated for all the blink images constituting the blink video data for one blink constitute the feature amount data of that blink video data. Note that the number of blink images constituting the blink video data varies depending on the performance of the photographing means, the kind of blink, and the like (for example, with an ordinary CCD camera, about 8 to 11 images).
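The per-line luminance computation could be sketched as follows; the array layout (30 rows × 11 columns) and the function names are illustrative assumptions.

```python
import numpy as np

def frame_features(eye_frame):
    """Per-line luminance sums for one 30x11 (rows x cols) extraction
    area image, cut out around the eyeball centre beforehand.
    Returns a 30-element vector, one luminance total per pixel line."""
    assert eye_frame.shape == (30, 11)
    return eye_frame.astype(np.float64).sum(axis=1)

def blink_features(frames):
    """Stack the per-frame vectors for all frames of one blink video
    (typically about 8-11 frames with an ordinary CCD camera)."""
    return np.stack([frame_features(f) for f in frames])
```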

The extracted parameter and the extracted feature amount data corresponding to the extracted parameter are stored in a predetermined area of the storage device 70 in the classification target data storage unit 14 in association with each other (step S112).
When the above parameter extraction processing and feature amount data extraction processing have been completed for all of the acquired blink waveform data and the corresponding blink video data ("Yes" branch of step S114), the pattern model generation apparatus 100 ends the extraction processing.

Next, the operation of the learning data generation processing in the pattern model generation apparatus 100 will be described with reference to FIGS. 8 to 10.
When the pattern model generation apparatus 100 is given a learning data generation instruction from the user via the input device 74 or the like, in a state where parameters and feature amount data are stored in a predetermined area of the storage device 70 by the classification target data storage unit 14 (step S200), the parameter normalization unit 15 acquires the parameters (hereinafter referred to as feature parameters) from the storage device 70 via the classification target data storage unit 14 (step S202).

When the feature parameter is acquired, the parameter normalization unit 15 normalizes the feature parameter composed of the three types of parameters x1 to x3 using the Z score method (step S204).
Normalization by the Z score method processes the distribution of each of the three types of feature parameters x1 to x3 so that its average value is 0 and its standard deviation is 1. Specifically, the average value μ and the standard deviation σ are obtained for each of the feature parameters x1 to x3.
Here, denoting each of the feature parameters x1 to x3 by X, the feature parameter X is converted to z according to the following equation (1):

z = (X − μ) / σ   …(1)

  As a specific numerical example, if the average values of x1 to x3 over all blink waveforms of a subject are (2.2239, 1.3542, 1.5693) and the standard deviations of x1 to x3 are (0.7396, 0.1000, 0.0709), z is calculated as in the following equation (2):

z = ((x1 − 2.2239) / 0.7396, (x2 − 1.3542) / 0.1000, (x3 − 1.5693) / 0.0709)   …(2)
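Using the quoted means and standard deviations, the Z score conversion of equations (1) and (2) can be reproduced as in the following sketch; the sample input vector is hypothetical.

```python
import numpy as np

# Means and standard deviations quoted above for one subject's
# (x1, x2, x3) feature parameters over all of that subject's blinks.
mu    = np.array([2.2239, 1.3542, 1.5693])
sigma = np.array([0.7396, 0.1000, 0.0709])

def z_score(X):
    """Equation (1): z = (X - mu) / sigma, applied per parameter."""
    return (X - mu) / sigma

# Example: one blink's feature parameters (hypothetical values)
print(z_score(np.array([2.9635, 1.4542, 1.4984])))  # -> [ 1.  1. -1.]
```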

  When the parameter normalization by the Z score method is completed, the parameter classification unit 16 classifies the normalized parameters (normalization parameters) using a predetermined clustering method (step S206). Here, a method of classifying the normalization parameters using the Ward method, one of the hierarchical clustering methods, and a method of classifying them using the k-means method, one of the divisional optimization clustering methods, will each be described.

  First, the normalization parameter classification method by clustering using the Ward method, one of the hierarchical methods, will be described. Here, clustering is a technique that defines a distance expressing similarity between data and groups similar items together; a hierarchical method is a method that, given data consisting of N objects, starts from an initial state of N clusters each containing a single object, calculates the distances between clusters, and sequentially merges the two clusters that are closest.

  The specific procedure of the Ward method, which merges clusters so as to minimize the increase in the sum of squared distances from each element to the center of gravity of its cluster, is as follows: (1) as an initial setting, each element forms its own cluster; (2) the squared distances between cluster barycenters are calculated for all cluster pairs, and the pair with the smallest distance is found and merged; (3) the distances between the merged cluster and all other clusters are recalculated. In this embodiment, (2) and (3) are executed repeatedly, and the processing ends when the total number of clusters reaches 12.
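A minimal sketch of this procedure using SciPy's agglomerative clustering (one possible implementation, not necessarily the one used here) is:

```python
from scipy.cluster.hierarchy import linkage, fcluster

def ward_classes(z_scores, n_clusters=12):
    """Hierarchical clustering of the normalized parameters with Ward's
    method, cut when the total number of clusters reaches 12
    (Class 1 to Class 12).

    z_scores : (n_blinks, 3) array of normalized (x1, x2, x3) parameters
    Returns a label in 1..n_clusters for every blink.
    """
    tree = linkage(z_scores, method="ward")   # agglomerative merge tree
    return fcluster(tree, t=n_clusters, criterion="maxclust")
```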

  Next, the normalization parameter classification method by clustering using the k-means method, one of the divisional optimization methods, will be described. Here, a divisional optimization method is a method in which the number of clusters is designated in advance and the elements are divided into that number of clusters based on similarity. The specific procedure of the k-means method is as follows: (1) twelve elements are selected at random from all the elements and used as the representatives of the respective clusters; (2) every other element is assigned to the nearest cluster center; (3) step (2) is repeated with the center of gravity of each cluster as the new center; (4) the procedure ends when the assignment of the elements no longer changes from the previous step.
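Similarly, the k-means procedure can be sketched with scikit-learn (again only one possible implementation; the random seed and n_init value are arbitrary choices):

```python
from sklearn.cluster import KMeans

def kmeans_classes(z_scores, n_clusters=12, seed=0):
    """Divisional-optimization clustering of the same normalized
    parameters: 12 random initial representatives, then alternating
    assignment to the nearest center and center recomputation until
    the assignments stop changing."""
    km = KMeans(n_clusters=n_clusters, init="random", n_init=10,
                random_state=seed)
    return km.fit_predict(z_scores) + 1   # shift labels to 1..12 (Class 1 to Class 12)
```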

As described above, by setting the number of clusters to 12 (Class 1 to Class 12), waveforms that can hardly be regarded as blinks, such as eye movements mixed in during the automatic detection of blink waveform data, can be separated into their own clusters and excluded from the blinks.
FIG. 8 shows how the electrooculogram (EOG) waveforms corresponding to the normalization parameters (corresponding to x1 to x3 above) of one subject, clustered into Class 1 to Class 12 by the above classification processing, are categorized.

  The radar charts of Z scores (normalization parameters) shown in FIGS. 10(a) to 10(c) plot, for each awakening state, the average value of the normalization parameters for each of Class 1 to Class 9 for one subject (the same subject as in FIG. 8) classified into Class 1 to Class 12 by the above clustering. FIG. 10(a) shows the plots for standard blinks and conscious, clear blinks; FIG. 10(b) shows the plots for blinks that increase as arousal falls; and FIG. 10(c) shows the plots for fast, small blinks and burst blinks. FIGS. 10(a) to 10(c) are radar charts generated from blink waveforms actually measured for a certain subject.

In the present embodiment, specifically, the blink waveform corresponding to the normalization parameters belonging to Class 1 is the standard blink waveform at the time of high arousal, and the normalization parameters classified into Class 1 to Class 12 are further characterized by awakening state.
In the example of FIGS. 10(a) to 10(c), comparing the waveform of Class 1 with the waveforms of the other classes: Class 2 contains blinks whose duration (the sum of the rise time and the fall time) is almost equal to that of Class 1 but whose peak height is smaller, while Class 3 contains blinks with a long duration and a large peak height (see FIG. 10(a)). Class 4 contains blinks whose duration is roughly the same but whose peak height is smaller than that of Class 2, and Class 5 contains blinks with a long rise time and a small peak height (see FIG. 10(b)). Class 6 and Class 7 contain blinks with long rise and fall times and small peak heights; of these, Class 6 contains blinks with particularly long fall times (see FIG. 10(b)). Class 8 and Class 9 contain blinks whose duration and peak height are both smaller than those of Class 2; among them, blinks with a long rise time and a short fall time are classified into Class 9 (see FIG. 10(c)).

As described above, based on physiological knowledge of the correspondence, clarified by experiment, between the subject's arousal level and the type of blink generated in each arousal state, Class 1 to Class 9 are characterized as follows.
Class 1: Standard blink during high arousal
Class 2: Blink with a slightly smaller wave height, occurring when the eyelids droop due to sleepiness
Class 3: Intentionally large blink made to combat drowsiness
Class 4: Blink with a very small wave height, generated in a low arousal state
Class 5-7: Blink with a long duration, generated in a low arousal state
Class 8: Blink occurring on temporary awakening from a low arousal state
Class 9: Burst (group) blinking
In Class 10 to Class 12, which do not appear in any of FIGS. 10(a) to 10(c), waveforms that were erroneously detected during blink waveform detection, waveforms affected by eye movements other than blinking, and waveforms for which it is difficult to judge whether they represent eye movement or blinking are classified.

  FIG. 9 shows the blink waveform classification result when two parameters, peak height x1 and blink duration (x2 + x3), are clustered into Class 1 to Class 12 under the same conditions and in the same manner as the three parameters x1 to x3 above. Compared with the classification result obtained by clustering with three parameters shown in FIG. 8, with two parameters the proportion of rise and fall within the total blink time (the times corresponding to x2 and x3 above) is not taken into account; in particular, in the clustering results for Class 5 and Class 9, electrooculogram (EOG) waveforms that should have been classified into Class 10 to Class 12 rather than as blink waveforms are mixed in, and the grouping of blink types is unsatisfactory. In other words, classifying the blink waveforms (parameters) using the three parameters x1 to x3 enables more accurate classification and yields a blink waveform classification better adapted to the arousal level (state). For example, an intentionally clear, large blink made by the subject to combat sleepiness can be classified as its own blink type, as in Class 3 shown in FIG. 8.

  When the normalization parameters have been classified as described above and the classification result is input from the parameter classification unit 16 to the identification information adding unit 17, the identification information adding unit 17 acquires, via the classification target data storage unit 14, the feature parameters and feature amount data corresponding to the normalization parameters stored in the predetermined area of the storage device 70 (step S208).

Based on the input classification result, learning data is generated by adding identification information identifying the blink type (Class 1 to Class 12) to the acquired feature parameters and feature amount data (step S210). In the present embodiment, for example, a feature parameter corresponding to a normalization parameter belonging to Class 1, and the feature amount data corresponding to that feature parameter, are given information (an identification label) indicating that these data belong to Class 1.
Finally, the feature parameter and feature amount data to which the identification information is given are stored as learning data in a predetermined area of the storage device 70 via the learning data storage unit 18 (step S212).

Next, the operation of the pattern model generation process in the pattern model generation apparatus 100 will be described.
When the pattern model generation apparatus 100 is given an instruction from the user via the input device 74 or the like to generate a blink waveform pattern model or a blink video pattern model, in a state where the learning data is stored in a predetermined area of the storage device 70 by the learning data storage unit 18, the statistical model learning unit 19 starts the pattern model generation processing.

  When an instruction to generate a blink waveform pattern model is given ("Yes" in step S300), the statistical model learning unit 19 acquires the feature parameters with identification information from the storage device 70 via the learning data storage unit 18 (step S302). The statistical model is then trained using the acquired feature parameters as learning data, generating a blink waveform pattern model that takes feature parameters as input and outputs identification result data of the blink type (Class 1 to Class 12) for the input feature parameters (step S304). In this embodiment, an HMM (Hidden Markov Model) is used as the statistical model.

  Here, the HMM is a probabilistic model of time series signals: a non-stationary time series signal is modeled as transitions among a plurality of stationary signal sources. For example, since the duration of one blink is not constant but varies with the awakening state, the number of feature parameters in the time direction changes accordingly. Concretely, learning the HMM means, for example, fixing an HMM with 3 states as shown in FIG. 18 and performing maximum likelihood estimation, from the acquired feature parameters, of the transition probability from one state to the next and the output probability of the feature parameters in each state. The HMM with the transition and output probabilities obtained by this learning becomes the blink waveform pattern model.
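As one possible realization of this training step, the following sketch uses the hmmlearn library's GaussianHMM (an assumption; the document does not name an implementation) to estimate the transition and output probabilities of a 3-state model for one blink class by maximum likelihood. One such model would be trained per blink class (Class 1 to Class 12).

```python
import numpy as np
from hmmlearn import hmm   # one possible HMM implementation

def train_class_model(sequences, n_states=3, seed=0):
    """Train one 3-state Gaussian HMM on all feature sequences that
    carry the same identification label (one model per blink class).

    sequences : list of (T_i, n_features) arrays; T_i may differ per
                blink, since blink duration varies with arousal state.
    """
    X = np.concatenate(sequences)            # hmmlearn expects stacked samples
    lengths = [len(s) for s in sequences]    # plus the per-sequence lengths
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=30, random_state=seed)
    # Baum-Welch: maximum likelihood estimation of the transition
    # probabilities and the output probabilities of each state.
    model.fit(X, lengths)
    return model
```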

  On the other hand, when an instruction to generate a blink video pattern model is given ("Yes" in step S308), the statistical model learning unit 19 acquires the feature amount data with identification information from the storage device 70 via the learning data storage unit 18 (step S310). An HMM is then trained using the acquired feature amount data as learning data, generating a blink video pattern model that takes feature amount data as input and outputs identification result (likelihood) data of the blink type (Class 1 to Class 12) for the input feature amount data (step S312). The HMM learning method is the same as for the blink waveform pattern model, so its description is omitted.

The blink waveform pattern model and the blink video pattern model generated as described above are stored in a predetermined area of the storage device 70 via the pattern model storage unit 20 (step S306).
As described above, the pattern model generation apparatus 100 according to the present embodiment normalizes the three parameters of the blink electrooculogram (EOG) waveform, namely the peak height (distance) x1, the rise time x2 from the start of blinking to the peak, and the fall time x3 from the peak to the end of blinking, and can thereby perform clustering in which parameters having different units (dimensions) are mixed.

In addition, since the blink classification is performed using the three parameters x1 to x3, the electrooculogram (EOG) waveforms can be classified more accurately and in more detail.
In addition, since identification information is automatically assigned to the feature parameters and feature amount data based on the classification result to generate learning data, learning data can be generated without human intervention in adding the identification information.
In addition, since the identification information is added to the feature parameters and feature amount data based on the accurate and detailed classification result to generate learning data, and the pattern model is generated using that learning data, it is possible to generate a pattern model that outputs blink type identification result data with high accuracy.

In the first embodiment, the parameter normalization unit 15 corresponds to the normalization means according to claim 1 or 3, the parameter classification unit 16 corresponds to the classification means according to any one of claims 1, 4, and 5, and the identification information adding unit 17 corresponds to the identification information adding means according to claim 1.
Also in the first embodiment, the statistical model learning unit 19 corresponds to the pattern model generation means according to claim 7 or 9.

[Second Embodiment]
Next, a second embodiment of the present invention will be described with reference to the drawings. FIGS. 11 to 16 are diagrams showing embodiments of the blink waveform appearance frequency information generation device, blink waveform appearance frequency information generation program, and blink waveform appearance frequency information generation method, the wakefulness determination device, wakefulness determination program, and wakefulness determination method, and the warning device, warning device control program, and warning device control method according to the present invention.

In the present embodiment, a case will be described in which the blink waveform appearance frequency information generation device, blink waveform appearance frequency information generation program, and blink waveform appearance frequency information generation method, the wakefulness determination device, wakefulness determination program, and wakefulness determination method, and the warning device, warning device control program, and warning device control method according to the present invention are applied to a warning device that determines the awakening state of a driver driving a vehicle and gives a warning to the driver based on the determination result.
The warning device of the present embodiment includes the blink waveform pattern model and the blink video pattern model generated by the pattern model generation apparatus 100 of the first embodiment, and uses these pattern models to identify the type of the driver's blinks over each predetermined period.

First, the configuration of a warning device according to the present invention will be described with reference to the drawings. FIG. 11 is a block diagram showing the configuration of the warning device 200 according to the present invention.
As shown in FIG. 11, the warning device 200 includes an image capturing unit 21 that captures a face image including an image of the driver's eyes; a blink waveform measurement unit 22 that measures the electrooculogram (EOG) waveform of the driver's blinks; a feature amount data extraction unit 23 that extracts feature amount data from the image data captured by the image capturing unit 21 and from the blink waveform data measured by the blink waveform measurement unit 22; a pattern model storage unit 24 that stores the pattern models generated by the pattern model generation apparatus 100 of the first embodiment; a blink type identification unit 25 that identifies the blink type based on the pattern models stored by the pattern model storage unit 24 and the feature amount data extracted by the feature amount data extraction unit 23; and an appearance frequency information generation unit 26 that generates blink type appearance frequency information based on the blink type identification results for a predetermined period.

The image capturing unit 21 captures a face image of the driver sitting in the driver's seat in real time, frame by frame, with a CCD camera installed on the inner mirror in the automobile, and outputs the photographed face image as digital face image data. The installation position of the CCD camera is not limited to the inner mirror; any location from which an image including the entire face of the person to be photographed can be captured, such as the steering column, the center panel, or a front pillar, is acceptable.
The blink waveform measurement unit 22 has an AC amplifier for biometric measurement and measures the vertical EOG (blink electrooculogram (EOG) waveform) in real time via electrodes attached above and below the driver's right eye.

  The feature amount data extraction unit 23 extracts video data of the right-eye portion from the driver's face video data captured by the image capturing unit 21 using the SVM, and extracts feature amount data from the extracted right-eye video data. Specifically, as in the first embodiment, an extraction area image of 11 pixels wide × 30 pixels high centered on the eyeball is cut out of each blink image constituting the right-eye video data, the total luminance of each pixel line (11 horizontal pixels) of the extraction area image is calculated, and the calculated luminance total data (for one blink video) is used as the feature amount data.

Further, as in the first embodiment, the feature amount data extraction unit 23 extracts from the blink electrooculogram (EOG) waveform data (blink waveform data) measured by the blink waveform measurement unit 22 three feature parameters as feature amount data: the peak height (distance) x1 of the blink waveform, the rise time x2 from the start of blinking to the peak, and the fall time x3 from the peak to the end of blinking.
The pattern model storage unit 24 stores the blink waveform pattern model and the blink video pattern model generated by the pattern model generation apparatus 100 of the first embodiment in a predetermined area of the storage device 90 described later.

  The blink type identification unit 25 identifies the blink type for feature amount data input in real time, using the pattern model corresponding to the set identification mode. In the present embodiment, the user can set either of two modes: a blink video mode, in which identification is performed using the feature amount data of the blink video extracted from the face image data captured by the image capturing unit 21, and a blink waveform mode, in which identification is performed using the feature amount data extracted from the blink waveform data measured by the blink waveform measurement unit 22.

  That is, when the blink video mode is set, the blink type is identified for the feature amount data based on the blink video pattern model stored in the pattern model storage unit 24 and the feature amount data of the blink video data extracted by the feature amount data extraction unit 23. On the other hand, when the blink waveform mode is set, the blink type is identified for the feature amount data based on the blink waveform pattern model stored in the pattern model storage unit 24 and the feature amount data of the blink waveform data extracted by the feature amount data extraction unit 23.

  The appearance frequency information generation unit 26 generates appearance frequency information indicating temporal changes in the appearance frequency of each blink type (Class 1 to Class 12) based on the identification results of the blink type identification unit 25 over a predetermined period. In the present embodiment, based on the identification results for each predetermined time width (for example, 60 seconds) within the predetermined period, the appearance frequency of each blink type within that time width is calculated sequentially to generate sub-appearance frequency information indicating the appearance frequencies for each time width. The appearance frequency information is then generated from the sub-appearance frequency information for the predetermined period and output to the awake state determination unit 27.
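A simplified sketch of this windowing logic follows; the timestamp representation and the function names are assumptions.

```python
from collections import Counter

def sub_appearance_frequency(results, window_sec=60.0):
    """Group (time, blink_class) identification results into fixed
    windows and count how often each class appears in each window.

    results : iterable of (timestamp_sec, class_label) pairs in time order
    Returns a list of (window_start_time, Counter) pairs; a run of these
    covering the predetermined period (e.g. 300 s) forms the appearance
    frequency information passed to the awake state determination.
    """
    windows, start, counts = [], None, Counter()
    for t, cls in results:
        if start is None:
            start = t                    # flag the window start position
        if t - start > window_sec:       # time difference exceeded the window
            windows.append((start, counts))
            start, counts = t, Counter()
        counts[cls] += 1
    if counts:
        windows.append((start, counts))  # flush the last partial window
    return windows
```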

The warning device 200 further includes an awake state determination unit 27 that determines the driver's awake state based on the appearance frequency information generated by the appearance frequency information generation unit 26, and a warning unit 28 that gives a warning to the driver based on the determination result of the awake state determination unit 27.
The awake state determination unit 27 determines the driver's awake state based on the appearance frequency of each class of blink type indicated by the sub-appearance frequency information, the temporal change of each appearance frequency obtained from the appearance frequency information, and the like. The awake states to be determined include states that change stepwise from clear-consciousness wakefulness to dozing: for example, a normal state, a state of feeling weak sleepiness, a state of feeling sleepiness, a state of feeling strong sleepiness, and a dozing state.

Further, since Class 1 to Class 12 can be characterized by arousal state as described in the first embodiment, the awake state is determined from the appearance frequencies of Class 1 to Class 9, which are the blink waveform classes, and the temporal changes of those appearance frequencies.
The warning unit 28 is configured to give a warning to the driver according to the content of the awake state based on the determination result of the awake state determination unit 27.
Specifically, for example, when it is determined that the driver feels weak sleepiness, an audio message prompting the driver to take a break is output; when it is determined that the driver feels sleepiness, a warning sound is output at a slightly high volume; and when it is determined that the driver feels strong sleepiness or is dozing, a warning sound is output at an extremely high volume.
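The three-tier warning logic could be sketched as follows; the `speaker` interface and the state labels are hypothetical stand-ins for the audio output and determination results described above.

```python
def warn(state, speaker):
    """Map the determined awake state to the three warning processes
    described above. `speaker` is a hypothetical audio output interface."""
    if state == "weak_sleepiness":
        speaker.say("Please consider taking a break.")  # warning process (1)
    elif state == "sleepiness":
        speaker.alarm(volume=0.5)   # warning process (2): slightly high volume
    elif state in ("strong_sleepiness", "dozing"):
        speaker.alarm(volume=0.7)   # warning process (3): extremely high volume
```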

  Further, the warning device 200 includes a computer system for realizing the control of the above-described units in software. As shown in the drawing, its hardware configuration is such that a CPU (Central Processing Unit) 80, which is the central processing unit responsible for various control and arithmetic processing, a RAM (Random Access Memory) 82 constituting the main storage, and a ROM (Read Only Memory) 84, which is a read-only storage device, are connected to one another by various internal and external buses 88 such as a PCI (Peripheral Component Interconnect) bus and an ISA (Industrial Standard Architecture) bus. In addition, an internal or external storage device (secondary storage) 90 such as an HDD (Hard Disk Drive), an output device 92 such as an LCD monitor, an input device 94 such as an operation panel or a remote control, and a network L for communicating with an external device (not shown) are connected to these buses via an input/output interface (I/F) 86.

  When the power is turned on, a system program such as a BIOS stored in the ROM 84 or the like loads into the RAM 82 the various dedicated computer programs stored in the ROM 84 in advance, or the various dedicated computer programs installed in the storage device 90 via a medium such as a CD-ROM, DVD-ROM, or flexible disk (FD), or via the communication network L such as the Internet. The CPU 80 then performs predetermined control and arithmetic processing in accordance with the instructions described in the programs loaded into the RAM 82, making full use of various resources, whereby each function of each means described above is realized in software.

Next, the flow of the feature amount data extraction processing in the warning device 200 configured as described above will be described with reference to FIG. 13. Here, FIG. 13 is a flowchart showing the feature amount data extraction processing in the warning device 200.
As shown in FIG. 13, the feature amount data extraction processing first proceeds to step S400, where the feature amount data extraction unit 23 determines whether the blink video mode is set as the identification processing mode. If it is determined that it is set (Yes), the process proceeds to step S402; if not (No), the process proceeds to step S410.

When the process proceeds to step S402, the feature amount data extracting unit 23 acquires the driver's face image data captured by the CCD camera from the image capturing unit 21, and the process proceeds to step S404.
In step S404, the feature amount data extraction unit 23 uses the SVM to detect blink video data from the face video data acquired in step S402, extracts feature amount data from the detected blink video data, and the process proceeds to step S406.

In step S406, the blink type identification unit 25 acquires the blink video pattern model from the pattern model storage unit 24, identifies the blink type corresponding to the feature amount data based on the acquired blink video pattern model and the feature amount data extracted in step S404, and the process proceeds to step S408.
In step S408, the blink type identifying unit 25 outputs the identification result in step S406 or step S416 to the appearance frequency information generating unit 26, and the process proceeds to step S400.

On the other hand, if the blink video mode is not set in step S400 and the process proceeds to step S410, the feature amount data extraction unit 23 determines whether the blink waveform mode is set as the identification processing mode. If it is determined that it is set (Yes), the process proceeds to step S412; if not (No), the process proceeds to step S400.
When the process proceeds to step S412, the feature amount data extraction unit 23 acquires the blink waveform data from the blink waveform measurement unit 22, and the process proceeds to step S414.

In step S414, the feature amount data extraction unit 23 extracts a plurality of feature amount data (feature parameters) having different units from the blink waveform data acquired in step S412 and proceeds to step S416.
In step S416, the blink type identification unit 25 acquires the blink waveform pattern model from the pattern model storage unit 24, identifies the blink type corresponding to the feature amount data based on the acquired blink waveform pattern model and the feature amount data (feature parameters) extracted in step S414, and the process proceeds to step S408.

Next, the flow of appearance frequency information generation processing in the warning device 200 will be described based on FIG. Here, FIG. 14 is a flowchart showing appearance frequency information generation processing in the warning device 200.
As shown in the flowchart of FIG. 14, the appearance frequency information generation processing first proceeds to step S500, where the appearance frequency information generation unit 26 determines whether an identification result has been acquired from the blink type identification unit 25. If it is determined that one has been acquired (Yes), the process proceeds to step S502; if not (No), the process waits until one is acquired. In the present embodiment, the identification result includes information on the identified blink type and information on the acquisition time of the blink video data or blink waveform data corresponding to the feature amount data.

When the process proceeds to step S502, the appearance frequency information generation unit 26 stores the identification result in a predetermined area of the RAM 82 or the storage device 90, and the process proceeds to step S504. In the present embodiment, the RAM 82 is used preferentially, and the storage destination is changed as appropriate according to the free memory capacity of the RAM 82.
In step S504, the appearance frequency information generation unit 26 determines whether the time difference between the identification results exceeds a predetermined time. If it is determined that it has been exceeded (Yes), the process proceeds to step S506; otherwise (No), the process proceeds to step S500.

In the present embodiment, as the reference for calculating the time difference, a flag indicating the start position is set on a particular identification result, and it is determined whether the time difference between the time of the identification result carrying the start-position flag and the time of the currently acquired identification result exceeds the predetermined time. If the predetermined time is exceeded, the start-position flag is moved to the currently acquired identification result.
When the process proceeds to step S506, the appearance frequency information generation unit 26 calculates, for each blink type, the appearance frequency within the predetermined time based on the identification results for that time, and the process proceeds to step S508.

In step S508, the appearance frequency information generating unit 26 generates sub-appearance frequency information indicating the appearance frequency of the blink type in a predetermined time based on the appearance frequency calculated in step S506, and the process proceeds to step S510.
In step S510, the appearance frequency information generation unit 26 stores the sub-appearance frequency information generated in step S508 in a predetermined area of the RAM 82 or the storage device 90, and the process proceeds to step S512.

In step S512, the appearance frequency information generation unit 26 determines whether the sub-appearance frequency information for the predetermined period has been accumulated. If it is determined that it has (Yes), the process proceeds to step S514; otherwise (No), the process proceeds to step S500.
When the process proceeds to step S514, the appearance frequency information generation unit 26 generates appearance frequency information based on the accumulated sub-appearance frequency information for the predetermined period, and the process proceeds to step S516.
In step S516, the appearance frequency information generation unit 26 outputs the appearance frequency information generated in step S514 to the awake state determination unit 27, and the process proceeds to step S500.

Next, based on FIG. 15, the flow of the arousal state determination process and the warning process in the warning device 200 will be described. Here, FIG. 15 is a flowchart showing the arousal state determination process and the warning process in the warning device 200.
As shown in the flowchart of FIG. 15, the awakening state determination process and the warning process first proceed to step S600, where the awakening state determination unit 27 determines whether or not the appearance frequency information is acquired from the appearance frequency information generation unit 26. If it is determined that it has been acquired (Yes), the process proceeds to step S602. If not (No), the process waits until acquisition.

When the process proceeds to step S602, the awake state determination unit 27 determines the driver's awake state based on the appearance frequency information acquired in step S600, and the process proceeds to step S604; here, the awake state is determined based on the acquired appearance frequency information. In step S604, the warning unit 28 determines, based on the determination result of step S602, whether the driver is in a state of feeling weak sleepiness. If so (Yes), the process proceeds to step S606; if not (No), the process proceeds to step S608.

When the process proceeds to step S606, the warning unit 28 executes warning process (1), and the process proceeds to step S600. When warning process (1) is executed, a voice message prompting the driver to take a break is output.
On the other hand, when the process proceeds to step S608, the warning unit 28 determines whether the driver is in a state of feeling sleepiness. If so (Yes), the process proceeds to step S610; if not (No), the process proceeds to step S612.

When the process proceeds to step S610, the warning unit 28 executes the warning process (2), and the process proceeds to step S600. Here, when the warning process (2) is executed, a warning sound and a warning message are output at a slightly louder volume (for example, 50% volume) from a speaker disposed in the vehicle.
On the other hand, when the process proceeds to step S612, the warning unit 28 determines whether the driver feels strong sleepiness or is dozing. If the driver is in a state of feeling strong sleepiness or in a dozing state (Yes), the process proceeds to step S614; if not (No), the process proceeds to step S600.
When the process proceeds to step S614, the warning unit 28 executes a warning process (3), and the process proceeds to step S600. Here, when the warning process (3) is executed, a warning sound and a warning message are output at a very high volume (for example, a volume of 70% or more) from a speaker disposed in the vehicle.

Next, the operation of the present embodiment will be described with reference to FIGS. 16 and 17.
Here, FIG. 16 is a diagram illustrating an example of appearance frequency information generated using the identification result of the blink waveform pattern model. FIG. 17 is a diagram illustrating an example of appearance frequency information generated using the identification result of the blink video pattern model.
When an identification mode chosen by the user is set via the input device 94 and the identification processing is started, the warning device 200 first determines whether the set identification mode is the blink video mode (step S400). When the set identification mode is the blink video mode ("Yes" branch of step S400), the feature amount data extraction unit 23 acquires face video data from the image capturing unit 21 (step S402). The feature amount data extraction unit 23 then detects right-eye blink video data from the acquired face video data using the SVM and extracts feature amount data from the detected blink video data (step S404). This feature amount data extraction uses the same method as the feature amount data extraction unit 13 in the pattern model generation apparatus 100 of the first embodiment: the luminance total data for each of the 30 lines of the extraction area image in each blink image constituting the blink video data for one blink are extracted as the feature amount data.

  When the feature amount data is extracted, the blink type identification unit 25 acquires the blink video pattern model stored in the storage device 90 via the pattern model storage unit 24, inputs the feature amount data to the acquired blink video pattern model, and identifies the blink type based on the likelihood for each blink type (Class 1 to Class 12) output by the blink video pattern model. Specifically, the blink type with the highest likelihood is taken as the identification result for the input feature amount data. This identification result is output to the appearance frequency information generation unit 26 (step S408).
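Assuming one trained model per class as in the earlier HMM sketch, this maximum-likelihood identification can be written as:

```python
def identify_blink_type(models, features):
    """Score one blink's feature data against the per-class pattern
    models and return the class whose model gives the highest likelihood.

    models   : dict mapping class label -> trained HMM (see earlier sketch)
    features : (T, n_features) array for one blink
    """
    scores = {cls: m.score(features) for cls, m in models.items()}  # log-likelihoods
    return max(scores, key=scores.get)   # class with the highest likelihood
```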

  On the other hand, when the blink waveform mode is set as the identification mode ("Yes" branch of step S410), the feature amount data extraction unit 23 acquires the blink waveform data from the blink waveform measurement unit 22 (step S412). Then, as with the parameter extraction unit 12 in the pattern model generation apparatus 100 of the first embodiment, three feature parameters are extracted from the acquired blink waveform data as feature amount data: the peak height (distance) x1 of the blink waveform, the rise time x2 from the start of blinking to the peak, and the fall time x3 from the peak to the end of blinking (step S414).

  When the feature amount data is extracted, the blink type identification unit 25 acquires the blink waveform pattern model stored in the storage device 90 via the pattern model storage unit 24, inputs the feature amount data to the acquired blink waveform pattern model, and identifies the blink type based on the likelihood for each blink type (Class 1 to Class 12) output by the blink waveform pattern model (step S416). The identification result is then output to the appearance frequency information generation unit 26 (step S408).

  On the other hand, each time the appearance frequency information generation unit 26 acquires an identification result in either of the above identification modes ("Yes" branch of step S500), it stores the identification result in a predetermined area of the RAM 82 or the storage device 90 (step S502). As described above, the identification result is information including the acquisition time of the blink video data or blink waveform data corresponding to the feature amount data used for identification, and the identified blink type. A flag indicating the start position of the predetermined time width is set on the first identification result acquired.

  The time difference between the time indicated by the identification result carrying the start-position flag and the time indicated by the currently acquired identification result is then calculated, and it is determined whether this time difference exceeds the predetermined time (here, 60 seconds) (step S504). If it exceeds 60 seconds ("Yes" branch of step S504), the appearance frequency (number of appearances) of each blink type over those 60 seconds is calculated based on the group of identification results from the start-position result up to the result immediately before the currently acquired one (step S506), and sub-appearance frequency information is generated by associating these appearance frequencies with time information (for example, the identification result acquisition time range "12:01:20 to 12:02:20") (step S508). The generated sub-appearance frequency information is stored in a predetermined area of the RAM 82 or the storage device 90 (step S510).

Here, the time information is attached to the group of identification results from the previous start-position identification result up to the identification result immediately before the currently acquired one, and these results are grouped together. In addition, a flag indicating the start of the predetermined period is set on the identification result that is the start position of the predetermined period. Here, the predetermined period is "5 minutes (300 seconds)".
Then, when the sub-appearance frequency information for the predetermined period has accumulated in the predetermined area of the RAM 82 or the storage device 90 ("Yes" branch of step S512), appearance frequency information is generated by grouping the sub-appearance frequency information for the predetermined period in time order (step S514), and the generated appearance frequency information is output to the awake state determination unit 27.

When the appearance frequency information is acquired from the appearance frequency information generation unit 26 ("Yes" branch of step S600), the awake state determination unit 27 determines the awake state based on the appearance frequency information (step S602). When appearance frequency information is acquired continuously in time order from the appearance frequency information generation unit 26 and 40 minutes' worth of it is plotted on a graph in time order, the result for information generated in the blink waveform mode is as shown in FIG. 16. Note that the awake state determination unit 27 determines the awake state based not only on the currently acquired appearance frequency information but also on appearance frequency information acquired in the past.
The appearance frequency information shown in FIG. 16 was generated in the blink waveform mode based on the identification results of the blink waveform pattern model for blink waveform data actually measured for a certain subject. The blink waveform pattern model is the same as the one generated in the first embodiment.

  FIG. 17 shows appearance frequency information generated in the above-described blink video mode from blink video data actually captured, simultaneously with the blink waveform measurement, for the same subject as in FIG. 16. That is, the appearance frequency information shown in FIG. 17 is generated from identification results obtained using the blink video pattern model. As described in the first embodiment, the blink video pattern model is trained using, as learning data, the feature amount data of blink video data whose classification is derived from the feature parameters extracted from the electrooculogram (EOG) waveforms, clustered into the 12 blink types Class 1 to Class 12. However, whereas the blink video pattern model of the first embodiment was generated using learning data for all 12 blink types, the blink video pattern model used to generate the appearance frequency information of FIG. 17 was generated using only learning data for the nine blink types Class 1 to Class 9. Consequently, as can be seen from FIG. 17, the appearance frequency information shows only the appearance frequencies of blink waveforms classified into Class 1 to Class 9.

The reason is that the blink waveforms (feature parameters) classified into Class 10 to Class 12 are electrooculogram (EOG) waveforms affected by eye movements other than blinking, and their relevance to the arousal level is extremely low.
Comparing the appearance frequency information of FIG. 17 with that of FIG. 16, the temporal change in blink-type appearance frequency (appearance frequency information) generated by identifying the blink type solely from the blink video input (FIG. 17) is substantially the same as that generated by identifying the blink type from the electrooculogram (EOG) waveform input, which serves as the reference for the blink types (FIG. 16). In other words, the effect of the Class 10 to Class 12 blink types on the appearance frequency information is extremely small.

As shown in FIGS. 16 and 17, this subject felt weak sleepiness from the early stage, sleepiness gradually increased from the middle stage (around 20 minutes) onward, the subject remained awake through the second half while fighting the sleepiness, and after 40 minutes fell asleep (dozed off).
More specifically, as shown in FIGS. 16 and 17, from the beginning of driving, when the awake state is high, blinks of Class 2, a blink type in which the eyelid droops due to sleepiness, begin to occur in addition to Class 1, the standard blink type. As the waking state declines toward the middle of driving, Class 3 blinks, intentional large and clear blinks made to fight sleepiness, increase together with the Class 2 blinks. From the middle to the end of driving, when wakefulness has dropped significantly, Class 4 blinks, a type with a small wave height occurring in the low-arousal state, appear, and at the same time blinks of Class 5 to Class 7, types with a long duration in the low-arousal state, also occur. In addition, when the subject is temporarily aroused from the low-arousal state by an external factor, Class 8 blinks occur.

  Accordingly, in the examples of FIGS. 16 and 17, it is determined that the driver feels weak sleepiness for a while from the beginning ("Yes" branch of step S604), and the warning unit 28 executes warning process (1), outputting a voice message prompting a break (step S606). It is further determined that the driver feels drowsy from the early-middle to the middle stage ("Yes" branch of step S608), and the warning unit 28 executes warning process (2), outputting a warning sound at 50% of the maximum volume (step S610). From the middle stage to the second half, it is determined that the driver feels strong sleepiness or is dozing ("Yes" branch of step S612), and the warning unit 28 executes warning process (3), outputting a warning sound at 70% or more of the maximum volume (step S614).
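The three-stage warning selection of steps S604 through S614 can be summarized by the following sketch, which assumes the awake state has already been judged and encoded as an integer level; the level encoding and the print placeholders for the audio output are assumptions for illustration.

    def execute_warning(level, max_volume=1.0):
        """level: judged drowsiness (1 = weak, 2 = moderate, 3 = strong/dozing)."""
        if level == 1:
            print("warning (1): voice message prompting a break")      # step S606
        elif level == 2:
            print("warning (2): sound at volume", 0.5 * max_volume)    # step S610
        elif level >= 3:
            print("warning (3): sound at volume >=", 0.7 * max_volume) # step S614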

Here, as more specific methods of determining (judging) the arousal state, a method that focuses on one specific blink type will be described first with reference to FIGS. 16 and 17.
As can be seen from the temporal change in blink-type appearance frequency shown in FIGS. 16 and 17, the occurrence of Class 1 blinks, the standard blinks of the highly awake state, decreases after about 15 minutes and ceases after 20 minutes. During the same period, Class 3 blinks, intentional large and clear blinks made to fight sleepiness, occur. In other words, by paying attention to the occurrence of Class 1 and Class 3 blinks, it can be determined that the driver feels drowsy from around 8 minutes to 20 minutes (from Class 1) and is trying to fight that drowsiness (from Class 3).

  Further, as shown in FIGS. 16 and 17, after 20 minutes, Class 4 blinks, whose wave height becomes very small in the low-arousal state and which had not occurred until then, begin to occur. In other words, by paying attention to the occurrence of Class 4 blinks, it can be determined (judged) that the driver feels very strong sleepiness after about 20 minutes.
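The single-type determination described here might be sketched as follows, using the per-window counts produced earlier; looking back over the last three windows and the class-name strings are assumptions for illustration, not values from the embodiment.

    def judge_by_single_type(sub_info):
        """sub_info: list of (window_start, counts) pairs for recent windows."""
        recent = [counts for _, counts in sub_info[-3:]]
        if any(c.get("Class4", 0) > 0 for c in recent):
            return "very strong sleepiness"   # Class 4 has begun to occur
        if all(c.get("Class1", 0) == 0 for c in recent):
            return "drowsy"                   # standard blinks have ceased
        return "awake"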

  Next, a method of determining (judging) the arousal state from the ratio of appearance frequencies of a plurality of blink types will be described with reference to FIGS. 16 and 17. As shown in FIGS. 16 and 17, the appearance rate of Class 1 blinks, the standard blinks of the highly awake state, decreases after about 15 minutes, while the appearance rate of Class 3 blinks, the intentional large and clear blinks made to fight sleepiness, increases. Moreover, from around 15 minutes to 20 minutes, Class 3 blinks and Class 2 blinks, in which the eyelid droops due to sleepiness and the wave height decreases, occupy almost all of the blinks occurring in each predetermined time (60 seconds in the figures). That is, by paying attention to the ratio of the appearance frequencies of Class 1 to Class 3 blinks, it can be determined (judged) that the driver feels sleepiness from around 15 minutes to 20 minutes and is trying to fight that sleepiness.

  Further, as shown in FIGS. 16 and 17, after 20 minutes, Class 4 blinks, whose wave height becomes very small in the low-arousal state, and Class 5 to Class 7 blinks, which have a long duration in the low-arousal state, together occupy almost all of the blinks occurring in each predetermined time. From this, it can be determined (judged) that the driver feels very strong sleepiness after about 20 minutes.
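The ratio-based determination could be sketched as follows for one 60-second window of counts; the 0.8 cut-off standing in for "almost all blinks" is an assumption, not a value given in the embodiment.

    def judge_by_ratio(counts):
        """counts: mapping from blink class to count for one 60-second window."""
        total = sum(counts.values()) or 1
        low = sum(counts.get(f"Class{i}", 0) for i in range(4, 8)) / total  # Class 4-7
        fight = sum(counts.get(f"Class{i}", 0) for i in (2, 3)) / total     # Class 2-3
        if low >= 0.8:
            return "very strong sleepiness"
        if fight >= 0.8:
            return "fighting sleepiness"
        return "awake"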

In FIGS. 16 and 17, the appearance frequency information of the blink types appears (starts) after 7 minutes have elapsed on the time axis (horizontal axis); in actual practical use, however, appearance frequency information can be generated immediately after data acquisition.
In the first and second embodiments, electromyogram (EMG: Electromyography) waveform data of the eye may be used instead of, or together with, the electrooculogram (EOG) waveform data.

As described above, the warning device 200 according to the present embodiment can identify the type of blink using a pattern model generated by the pattern model generation device 100 according to the first embodiment, and can therefore identify more blink types than conventional devices (the above Class 1 to Class 12).
In addition, it can generate appearance frequency information, that is, information on the temporal change in the appearance frequency of each blink type at a predetermined time width over a predetermined period, and can determine the driver's arousal state based on this appearance frequency information. Since the various arousal states that arise in stages between wakefulness and sleep can be distinguished, the strength of sleepiness can be determined more accurately, and a more appropriate warning can be given.

  In the second embodiment, the video shooting unit 21 corresponds to the photographing means according to claim 11 or the blink video photographing means according to claim 16; the blink waveform measurement unit 22 corresponds to the electrooculogram waveform measuring means according to claim 10 or the blinking electrooculogram waveform measuring means according to claim 13 or claim 15; the processing for extracting feature amount data from blink video data in the feature amount data extraction unit 23 corresponds to the feature amount data extracting means according to claim 11 or the feature quantity extraction means according to claim 16; the processing for extracting feature amount data (feature parameters) from blink waveform data in the feature amount data extraction unit 23 corresponds to the blink data extracting means according to claim 10 or claim 13 or the feature quantity extraction means according to claim 15; and the blink type identification unit 25 corresponds to the blink waveform identifying means according to claim 10, 11, or 13 or the blink type identifying means according to claim 15 or 16.

  Also in the second embodiment, the appearance frequency information generation unit 26 corresponds to the appearance frequency information generating means according to any one of claims 10, 11, and 13 or the blink waveform type appearance frequency information generating means according to claim 15 or 16; the awakening state determination unit 27 corresponds to the awakening state determining means according to claim 12 or claim 13 or the wakefulness state judging means according to claim 15 or 16; and the warning unit 28 corresponds to the warning means according to claim 17.

In the first embodiment, an example in which the pattern model is configured from an HMM has been described. However, the present invention is not limited to this, and the pattern model may be configured from another statistical model, such as an SVM (Support Vector Machine) or a neural network.
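For instance, substituting an SVM for the HMM could look like the sketch below, which assumes scikit-learn is available and that each blink has been reduced to a fixed-length feature vector (for example, the three normalized waveform parameters); unlike an HMM, an SVM cannot consume variable-length sequences directly.

    from sklearn.svm import SVC

    def train_blink_pattern_model(features, labels):
        """features: (n_samples, n_dims) array of fixed-length feature vectors;
        labels: the Class 1 to Class 12 identification information."""
        return SVC(kernel="rbf").fit(features, labels)

    # Identification then mirrors the blink type identification unit 25:
    # predicted = model.predict(feature_vector.reshape(1, -1))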
In the second embodiment, an example in which the right-eye region of the person being imaged is detected to determine the awake state has been described. However, depending on the shooting environment, the type of system to which the invention is applied, and so on, the determination may instead be made by detecting the left-eye region or the region of both eyes.

In the first and second embodiments described above, the total luminance of each line of the partial image cut out from the blink video data is extracted as the feature amount data. However, the present invention is not limited to this; other feature amounts may be extracted, for example by applying a Fourier transform to the blink video data and extracting the frequency spectrum components as feature amounts.
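The two feature extraction variants mentioned here can be contrasted in a short sketch; the function names are assumptions, and the input is taken to be a 2-D grayscale array for the eye-region partial image.

    import numpy as np

    def line_luminance_features(frame):
        """Total luminance of each line (row) of the partial image,
        as used in the first and second embodiments."""
        return frame.sum(axis=1)

    def spectrum_features(frame):
        """Alternative mentioned above: frequency spectrum components
        obtained by a Fourier transform of the same partial image."""
        return np.abs(np.fft.fft2(frame)).ravel()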
In the second embodiment, the awake state of the target person (driver) is determined based on the appearance frequency information. However, the present invention is not limited to this; another state of the target person, such as a tension state, may be determined based on the appearance frequency information.

Brief Description of the Drawings
FIG. 1 is a block diagram showing the configuration of the pattern model generation device 100 according to the present invention.
FIG. 2 is a block diagram showing the hardware configuration of the pattern model generation device 100.
FIG. 3 is a flowchart showing the process of extracting parameters from blink waveform data and the process of extracting feature amount data from blink video data.
FIG. 4 is a flowchart showing the learning data generation process.
FIG. 5 is a flowchart showing the pattern model generation process in the pattern model generation device 100.
FIG. 6 is an explanatory diagram of the parameters extracted from blink waveform data.
FIGS. 7(a) and 7(b) are explanatory diagrams of the feature amount data extracted from blink video data.
FIG. 8 is a diagram showing blink waveforms for a certain subject classified by the method of the present invention using three types of parameters.
FIG. 9 is a diagram showing blink waveforms for a certain subject classified by the method of the present invention using two types of parameters.
FIGS. 10(a) to 10(c) are radar charts of Z scores for a certain subject.
FIG. 11 is a block diagram showing the configuration of the warning device 200 according to the present invention.
FIG. 12 is a block diagram showing the hardware configuration of the warning device 200.
FIG. 13 is a flowchart showing the feature amount data extraction process in the warning device 200.
FIG. 14 is a flowchart showing the appearance frequency information generation process in the warning device.
FIG. 15 is a flowchart showing the arousal state determination process and warning process in the warning device.
FIG. 16 is a diagram showing an example of appearance frequency information generated using the identification results of the blink waveform pattern model.
FIG. 17 is a diagram showing an example of appearance frequency information generated using the identification results of the blink video pattern model.
FIG. 18 is a diagram showing an example of an HMM.

Explanation of symbols

100 Pattern model generation device
200 Warning device
10 Blink waveform data storage unit
11 Blink video data storage unit
12 Parameter extraction unit
13 Feature amount data extraction unit
14 Classification target data storage unit
15 Parameter normalization unit
16 Parameter classification unit
17 Identification information addition unit
18 Learning data storage unit
19 Statistical model learning unit
20 Pattern model storage unit
21 Video shooting unit
22 Blink waveform measurement unit
23 Feature amount data extraction unit
24 Pattern model storage unit
25 Blink type identification unit
26 Appearance frequency information generation unit
27 Arousal state determination unit
28 Warning unit

Claims (17)

  1. A blink data classification device that classifies blink data, which is data related to an electrooculogram (EOG) waveform of a blink, comprising:
    normalization means for normalizing a plurality of types of parameters with different units extracted from electrooculogram (EOG) waveform data, which is data of the blinking electrooculogram (EOG) waveform;
    classification means for classifying the parameters corresponding to the electrooculogram (EOG) waveform data of a plurality of blinks normalized by the normalization means, using a predetermined clustering method; and
    identification information providing means for providing, based on the classification result of the classification means, the blink data corresponding to each classified parameter with identification information of the type to which that parameter belongs.
  2. The blink data classification device according to claim 1, wherein the plurality of types of parameters with different units, extracted from the blinking electrooculogram (EOG) waveform data, are distance data of the peak height of the electrooculogram (EOG) waveform, time data from the start of the blink to the peak height, and time data from the peak height to the end of the blink.
  3.   The blink data classification device according to claim 1 or 2, wherein the normalizing means normalizes a plurality of types of parameters having different units using a Z score method.
  4.   The blink data classification device according to any one of claims 1 to 3, wherein the classification unit uses a division optimization method or a hierarchical method as the predetermined clustering method.
  5. The blink data classification device according to any one of claims 1 to 4, wherein the blink data is feature amount data extracted from blink video data including moving image data of at least one eye of each subject at the time of each blink corresponding to the blinking electrooculogram (EOG) waveform.
  6. A pattern model generation apparatus comprising:
    the blink data classification device according to any one of claims 1 to 4; and
    pattern model generation means for generating a pattern model by training a statistical model using, as learning data, the blink data to which the identification information has been given by the blink data classification device, the pattern model receiving blink data as input and outputting identification information of the blink data.
  7.   The pattern model generation apparatus according to claim 6, wherein the statistical model is an HMM (Hidden Markov Model).
  8. A pattern model generation apparatus comprising:
    the blink data classification device according to claim 5; and
    pattern model generation means for generating a pattern model by training a statistical model using, as learning data, the blink data to which the identification information has been given by the blink data classification device, the pattern model receiving blink data as input and outputting identification information of the blink data.
  9.   9. The pattern model generation apparatus according to claim 8, wherein the statistical model is an HMM (Hidden Markov Model).
  10. A blink waveform appearance frequency information generation device comprising:
    a pattern model generated by the pattern model generation apparatus according to claim 6 or 7;
    electrooculogram waveform measuring means for measuring an electrooculogram (EOG) waveform when a subject blinks;
    blink data extracting means for extracting blink data corresponding to the pattern model from electrooculogram (EOG) waveform data, which is data of the electrooculogram (EOG) waveform measured by the electrooculogram waveform measuring means;
    blink waveform identifying means for identifying the type of the electrooculogram (EOG) waveform corresponding to the blink data, based on the blink data extracted by the blink data extracting means and the pattern model; and
    appearance frequency information generating means for generating, based on the identification results of the blink waveform identifying means for the electrooculogram (EOG) waveforms at the times of blinking of the subject measured over a predetermined period, appearance frequency information indicating the temporal change in the appearance frequency of each type of electrooculogram (EOG) waveform in the predetermined period.
  11. A blink waveform appearance frequency information generation device comprising:
    a pattern model generated by the pattern model generation apparatus according to claim 8 or 9;
    photographing means for photographing a blink video including a moving image of at least one whole eye when a subject blinks;
    feature amount data extracting means for extracting feature amount data corresponding to the pattern model from blink video data, which is data of the blink video photographed by the photographing means;
    blink waveform identifying means for identifying the type of the electrooculogram (EOG) waveform corresponding to the blink video from the feature amount data, based on the feature amount data extracted by the feature amount data extracting means and the pattern model; and
    appearance frequency information generating means for generating, based on the identification results of the blink waveform identifying means for the blink videos of the subject photographed over a predetermined period, appearance frequency information indicating the temporal change in the appearance frequency of each type of electrooculogram (EOG) waveform corresponding to the blink videos in the predetermined period.
  12. An arousal state determination device comprising:
    the blink waveform appearance frequency information generation device according to claim 10 or 11; and
    awakening state determining means for determining the awake state of the subject based on the appearance frequency information generated by the blink waveform appearance frequency information generation device.
  13. A wakefulness state determination device comprising:
    blinking electrooculogram waveform measuring means for measuring an electrooculogram (EOG) waveform when a subject blinks;
    blink data extracting means for extracting blink data from electrooculogram (EOG) waveform data, which is data of the electrooculogram (EOG) waveform measured by the blinking electrooculogram waveform measuring means;
    blink waveform identifying means for identifying the type of the blink waveform corresponding to the blink data, based on the blink data extracted by the blink data extracting means and identification data of blink waveform types stored in advance;
    appearance frequency information generating means for generating, based on the identification results of the blink waveform identifying means corresponding to the electrooculogram (EOG) waveforms measured over a predetermined period, appearance frequency information indicating the temporal change in the appearance frequency of each type of the blink waveform in the predetermined period; and
    awakening state determining means for determining the wakefulness state of the subject from the appearance frequency of a predetermined type of the blink waveform, based on the appearance frequency information generated by the appearance frequency information generating means.
  14.   14. The wakefulness state determination apparatus according to claim 13, wherein the appearance frequency of the predetermined type of the blink waveform is an appearance frequency of a plurality of types of the blink waveform.
  15. A wakefulness state determination device comprising:
    blinking electrooculogram waveform measuring means for measuring an electrooculogram (EOG) waveform when a subject blinks;
    a pattern model that receives feature amount data extracted from electrooculogram (EOG) waveform data, which is data of the electrooculogram (EOG) waveform, and outputs identification data of the blink waveform type corresponding to the feature amount data;
    feature quantity extraction means for extracting feature amount data from the electrooculogram (EOG) waveform data measured by the blinking electrooculogram waveform measuring means;
    blink type identifying means for identifying the blink waveform type for the feature amount data, based on the feature amount data extracted by the feature quantity extraction means and the pattern model;
    blink waveform type appearance frequency information generating means for generating, based on the identification results of the blink type identifying means for the electrooculogram (EOG) waveforms of the subject measured over a predetermined period, appearance frequency information indicating the temporal change in the appearance frequency of each blink waveform type in the predetermined period; and
    wakefulness state judging means for judging the wakefulness state of the subject based on the appearance frequency information generated by the blink waveform type appearance frequency information generating means.
  16. A wakefulness state determination device comprising:
    blink video photographing means for photographing a blink video including a moving image of at least one whole eye when a subject blinks;
    a pattern model that receives feature amount data extracted from blink video data, which is data of the blink video, and outputs identification data of the blink waveform type corresponding to the feature amount data;
    feature quantity extraction means for extracting feature amount data from blink video data, which is data of the blink video photographed by the blink video photographing means;
    blink type identifying means for identifying the blink waveform type for the feature amount data, based on the feature amount data extracted by the feature quantity extraction means and the pattern model;
    blink waveform type appearance frequency information generating means for generating, based on the identification results of the blink type identifying means for the blink videos of the subject photographed over a predetermined period, appearance frequency information indicating the temporal change in the appearance frequency of each type of electrooculogram (EOG) waveform corresponding to the blink videos in the predetermined period; and
    wakefulness state judging means for judging the wakefulness state of the subject based on the appearance frequency information generated by the blink waveform type appearance frequency information generating means.
  17. A warning device comprising:
    the awake state determination device according to claim 12 or 13; and
    warning means for giving a warning to the subject based on the determination result of the awake state by the awake state determination device.
JP2006142582A 2006-05-23 2006-05-23 Blink data classification device, wakefulness determination device, and wakefulness determination device Active JP4864541B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006142582A JP4864541B2 (en) 2006-05-23 2006-05-23 Blink data classification device, wakefulness determination device, and wakefulness determination device

Publications (2)

Publication Number Publication Date
JP2007312824A JP2007312824A (en) 2007-12-06
JP4864541B2 true JP4864541B2 (en) 2012-02-01



Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20090304

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111003

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20111018

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20111109

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20141118

Year of fee payment: 3

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250