CN108712879B - Fatigue state determination device and fatigue state determination method - Google Patents


Info

Publication number
CN108712879B
CN108712879B (application CN201780013890.1A)
Authority
CN
China
Prior art keywords
information
fatigue state
face
component
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780013890.1A
Other languages
Chinese (zh)
Other versions
CN108712879A (en)
Inventor
新井润一郎
小谷泰则
户松太郎
大上淑美
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Daikin Industries Ltd
Tokyo Institute of Technology NUC
Original Assignee
Daikin Industries Ltd
Tokyo Institute of Technology NUC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daikin Industries Ltd, Tokyo Institute of Technology NUC filed Critical Daikin Industries Ltd
Publication of CN108712879A (application publication)
Application granted
Publication of CN108712879B (granted publication)

Classifications

    All classifications fall under A61B (Diagnosis; Surgery; Identification), A61B5/00 (Measuring for diagnostic purposes; Identification of persons):

    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/0006 ECG or EEG signals (remote monitoring of patients using telemetry)
    • A61B5/0008 Temperature signals (remote monitoring)
    • A61B5/0013 Medical image data (remote monitoring)
    • A61B5/0075 Diagnosis using light, by spectroscopy, e.g. Raman spectroscopy, infrared absorption spectroscopy
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/01 Measuring temperature of body parts; Diagnostic temperature sensing
    • A61B5/015 By temperature mapping of body part
    • A61B5/0261 Measuring blood flow using optical means, e.g. infrared light
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A61B5/162 Testing reaction times
    • A61B5/168 Evaluating attention deficit, hyperactivity
    • A61B5/4809 Sleep detection, i.e. determining whether a subject is asleep or not
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Psychiatry (AREA)
  • Physiology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Hematology (AREA)
  • Cardiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The purpose of the present invention is to provide a fatigue state determination device and a fatigue state determination method that easily determine the fatigue state of a subject person. The fatigue state determination device (400) includes a brain function activation information providing unit (441) and a fatigue state determination unit (445). The brain function activation information providing unit (441) provides the subject person (300) with brain function activation information that activates human brain function. The fatigue state determination unit (445) determines the fatigue state of the subject person (300) on the basis of face change information indicating a time-series change in the face data of the subject person (300) obtained while the brain function activation information is provided.

Description

Fatigue state determination device and fatigue state determination method
Technical Field
The present invention relates to a fatigue state determination device and a fatigue state determination method.
Background
In recent years, attempts have been made to estimate human brain activity using data detected by electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and near-infrared spectroscopy (NIRS), as disclosed in patent document 1 (Japanese Patent Laid-Open No. 2013-176406). Applications that determine the physical condition, mental state, and the like of a person from the estimated brain activity are also being studied.
Disclosure of Invention
Technical problem to be solved by the invention
However, electroencephalography and near-infrared spectroscopy require preprocessing such as attaching electrodes or probes to the subject, and magnetic resonance imaging must be performed in a dedicated MRI room. In short, these methods involve complicated preparation and restricted measurement conditions, and they are also costly. As a result, it is difficult to use them to determine the physical condition, mental state, and the like of a subject.
The technical problem of the present invention is to provide a device and a method capable of easily determining the physical condition and mental state of a subject. In particular, an object of the present invention is to provide a fatigue state determination device and a fatigue state determination method for easily determining a fatigue state of a subject person.
Technical scheme for solving technical problem
A fatigue state determination device according to a first aspect of the present invention includes a brain function activation information providing unit and a fatigue state determination unit. The brain function activation information providing unit provides the subject person with brain function activation information for activating a brain function of the human. The fatigue state determination unit determines the fatigue state of the subject person based on face change information indicating a time-series change of face data of the subject person when the brain function activation information is supplied.
In the fatigue state determination device according to the first aspect, since the state of the subject person is determined based on the face change information of the subject person when the brain function activation information is provided, the fatigue state of the subject person can be determined with a simple configuration.
In the fatigue state determination device according to the second aspect of the present invention, the brain function activation information providing unit provides working memory-related information as the brain function activation information.
In the fatigue state determination device according to the second aspect, since information related to working memory is provided as the brain function activation information, a determination component related to brain activity can be extracted. As a result, the fatigue state of the subject person can be easily determined.
In the present invention, "working memory-related information" is information related to a task that requires memory and judgment. Examples include: memory tasks such as mental arithmetic problems, calculation problems, and N-back tasks; tasks requiring the most appropriate selection from multiple pieces of information, such as pictogram tasks; and attention-switching tasks such as mid-task switching problems.
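As a concrete illustration of one working-memory task named above, the following is a minimal Python sketch that generates an N-back sequence. The function name, letter set, and parameters are invented for this example and do not come from the patent.

```python
import random

def n_back_trials(n=2, length=10, seed=0):
    """Generate a toy N-back sequence: a list of letters and, for each
    position, whether it matches the letter shown n steps earlier
    (the judgment the subject's working memory must produce)."""
    rng = random.Random(seed)
    letters = [rng.choice("ABC") for _ in range(length)]
    answers = [i >= n and letters[i] == letters[i - n]
               for i in range(length)]
    return letters, answers

letters, answers = n_back_trials()
```

Presenting such a sequence to the subject person is one hypothetical way the brain function activation information providing unit could supply working memory-related information.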
In the fatigue state determination device according to the third aspect of the present invention, the fatigue state determination device according to the first or second aspect further includes a determination information generation unit that generates determination information based on the face change information.
In the fatigue state determination device according to the third aspect, since determination information for determining fatigue is generated from the face change information to determine the fatigue state of the subject person, the accuracy of determining the fatigue state of the subject person can be improved.
The fatigue state determination device according to a fourth aspect of the present invention is the fatigue state determination device according to the third aspect, further comprising a face change information decomposition unit. The face change information decomposition unit decomposes the face change information into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis. The judgment information generating unit extracts a component related to the brain function activation information from the plurality of components as a judgment component, and generates the judgment information from the judgment component.
In the fatigue state determination device according to the fourth aspect, a determination component related to the brain function activation information is extracted from the plurality of components obtained by applying singular value decomposition, principal component analysis, or independent component analysis to the face change information. This makes it possible to estimate the presence or absence of the subject person's brain activity without using sensors, such as electrodes, that require preprocessing before measurement. Thus, the fatigue state of the subject person can be easily determined based on the determination component corresponding to the brain function of the subject person.
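The decompose-and-extract idea of the fourth aspect can be sketched roughly as below, using singular value decomposition with NumPy. This is a hypothetical illustration, not the patented implementation; the function name and all data are made up.

```python
import numpy as np

def extract_determination_component(face_change, activation_signal):
    """Decompose face change data (frames x pixels) by singular value
    decomposition and return the temporal component that correlates most
    strongly with the activation signal, plus that correlation."""
    centered = face_change - face_change.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    courses = u * s  # temporal course of each component
    corrs = [abs(np.corrcoef(courses[:, i], activation_signal)[0, 1])
             for i in range(courses.shape[1])]
    best = int(np.argmax(corrs))
    return courses[:, best], corrs[best]

# Toy data: 60 frames x 50 "pixels"; the first 10 pixels follow an
# on/off activation pattern while the rest are noise.
rng = np.random.default_rng(0)
stimulus = np.tile([1.0] * 5 + [0.0] * 5, 6)   # 60-frame on/off pattern
data = rng.normal(0.0, 0.1, (60, 50))
data[:, :10] += stimulus[:, None]
comp, r = extract_determination_component(data, stimulus)
```

The strongest component tracks the stimulated region, so its temporal course correlates highly with the activation timing; that correlated component plays the role of the determination component.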
In the fatigue state determination device according to the fifth aspect of the present invention, in addition to the device according to the fourth aspect, the determination information generation unit extracts the determination component based on the value of the risk rate (significance level).
In the fatigue state determination device according to the fifth aspect, since the component related to the brain function activation information is extracted based on the value of the risk rate, the reliability of the determination can be improved.
The fatigue state determination device according to a sixth aspect of the present invention is the fatigue state determination device according to any one of the third to fifth aspects, further comprising a reference information storage unit. The reference information storage unit stores, as the reference information, a change amount in a predetermined range, which is a change amount in a predetermined range of a correlation value of the information for determination calculated with respect to the brain function activation information, compared with a reference correlation value, and a fatigue state determination unit calculates a correlation value of the information for determination with respect to the brain function activation information, and determines the fatigue state level of the subject person based on the calculated correlation value and the reference information.
In the fatigue state determination device according to the sixth aspect, the fatigue state level can be easily determined using the reference information obtained before the predetermined action.
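The level lookup of the sixth aspect can be illustrated roughly as follows. The thresholds, the `REFERENCE_LEVELS` table, and the function name are invented for this sketch and are not taken from the patent.

```python
import numpy as np

# Hypothetical reference information: the drop of the correlation value
# relative to the pre-action reference correlation value, mapped to a
# fatigue state level (ranges and labels invented for this sketch).
REFERENCE_LEVELS = [
    (0.0, 0.1, "normal"),
    (0.1, 0.3, "mildly fatigued"),
    (0.3, 1.0, "fatigued"),
]

def fatigue_level(determination, activation, reference_corr):
    """Correlate the determination component with the activation pattern
    and look up the fatigue level from the drop relative to the reference."""
    corr = float(np.corrcoef(determination, activation)[0, 1])
    delta = reference_corr - corr   # a larger drop suggests more fatigue
    for low, high, level in REFERENCE_LEVELS:
        if low <= delta < high:
            return level, corr
    return "out of range", corr

activation = np.tile([1.0, 1.0, 0.0, 0.0], 10)           # 40-sample pattern
weakened = 0.5 * activation + np.linspace(0.0, 1.0, 40)  # damped, drifting response
level, corr = fatigue_level(weakened, activation, reference_corr=0.95)
```

The weakened, drifting response correlates less strongly with the activation pattern than the reference value obtained before the predetermined action, so the drop falls into the higher fatigue band.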
In the present invention, the "predetermined action" is used as meaning at least either one or both of a physical activity and an intellectual activity. As the physical activity, the following activities can be exemplified: various physical works such as line work and civil work in factories; and various exercises such as body building, running, ball games, mountain climbing, and muscle training. The intellectual activities include learning, discussion, decision-making, situation determination, management and supervision, and the like.
The fatigue state determination device according to a seventh aspect of the present invention is the device according to any one of the first to sixth aspects, wherein a face change information acquisition unit acquires data of the periphery of the paranasal sinuses and/or the forehead as the face data.
In the fatigue state determination device according to the seventh aspect, since the face data is data of the periphery of the paranasal sinuses and/or the forehead, the determination component related to brain activity can be extracted with high accuracy.
In the fatigue state determination device according to an eighth aspect of the present invention, in addition to the fatigue state determination device according to any one of the first through seventh aspects, the face change information acquisition unit acquires, as the face data, face skin temperature data indicating a skin temperature of the face of the subject person.
In the fatigue state determination device according to the eighth aspect, since the face data is face skin temperature data indicating a skin temperature of the face of the subject person, the fatigue state can be determined by an infrared camera or the like.
In the fatigue state determination device according to the ninth aspect of the present invention, the face change information acquisition unit acquires, as the face data, face blood circulation amount data based on RGB data of the face of the subject person.
In the fatigue state determination device according to the ninth aspect, since the face data is the face blood circulation amount data based on the RGB data of the face of the subject person, the fatigue state can be determined by using a solid-state imaging device or the like.
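As a rough illustration of deriving a blood circulation signal from facial RGB data, one common proxy (not necessarily the computation used in the patent) exploits hemoglobin's strong absorption of green light:

```python
import numpy as np

def blood_circulation_from_rgb(frames):
    """Rough per-frame blood-volume proxy from facial RGB frames
    (frames x height x width x 3). Hemoglobin absorbs green light
    strongly, so a fall in the green channel relative to red suggests
    an increase in blood volume under the skin."""
    frames = frames.astype(float)
    red = frames[..., 0].mean(axis=(1, 2))
    green = frames[..., 1].mean(axis=(1, 2))
    # erythema-like index: a higher value suggests more blood
    return np.log(red + 1.0) - np.log(green + 1.0)

# Toy clip: 8 frames of a 4x4 face patch whose green channel dims over time.
frames = np.full((8, 4, 4, 3), 120.0)
frames[..., 1] -= np.arange(8)[:, None, None] * 5.0
signal = blood_circulation_from_rgb(frames)
```

Such a time series, computed per frame from an ordinary solid-state image sensor, is one plausible form of the face change information described above.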
The fatigue state determination device according to a tenth aspect of the present invention is the device according to any one of the third to ninth aspects, wherein the fatigue state determination unit calculates a correlation value of the determination information with respect to the brain function activation information and determines the fatigue state level of the subject person based on the calculated correlation value and reference information. Here, a determination information providing device on a network includes the reference information storage unit, which stores, in association with the fatigue state level, reference information that is the amount of change, within a predetermined range, of the correlation value of the determination information calculated for the brain function activation information relative to a reference correlation value calculated for a reference determination component.
In the fatigue state determination device according to the tenth aspect, the determination information providing device on the network can determine the fatigue state level of the target person.
A fatigue state determination method according to an eleventh aspect of the present invention includes a brain function activation information providing step, a face change information acquiring step, and a fatigue state determining step. In the brain function activation information providing step, brain function activation information that activates human brain function is provided to the subject person after a predetermined action is performed. In the face change information acquiring step, face change information indicating a time-series change in the face data of the subject person is acquired. In the fatigue state determining step, the fatigue state of the subject person is determined based on the face change information.
In the fatigue state determination method according to the eleventh aspect, since the fatigue state of the subject person is determined based on the face change information of the subject person to which the brain function activation information is supplied after the predetermined action, the fatigue state of the subject person can be determined with a simple configuration.
A fatigue state determination method according to a twelfth aspect of the present invention is the fatigue state determination method according to the eleventh aspect, further comprising a determination information generation step of generating determination information based on the face change information.
In the fatigue state determination method according to the twelfth aspect, since determination information useful for determining fatigue is generated from the face change information to determine the fatigue state of the subject person, the accuracy of determining the fatigue state of the subject person can be improved.
The fatigue state determination method according to a thirteenth aspect of the present invention is the fatigue state determination method according to the twelfth aspect, further comprising a face change information decomposition step. In the face change information decomposition step, the face change information is decomposed into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis. In the determination information generating step, a component related to the brain function activation information is extracted from the plurality of components as a determination component, and the determination information is generated from the determination component.
In the fatigue state determination method according to the thirteenth aspect, after the predetermined action, the fatigue state is determined by extracting a determination component related to the brain function activation information from a plurality of components obtained by performing singular value decomposition, principal component analysis, or independent component analysis on the face change information, and thereby the influence of the predetermined action on fatigue of the subject person can be easily determined.
In the fatigue state determination method according to a fourteenth aspect of the present invention, in addition to the method according to the twelfth or thirteenth aspect, the fatigue state determination step calculates a correlation value of the determination information with respect to the brain function activation information and determines the fatigue state level of the subject person based on the calculated correlation value and reference information. Here, a reference information storage unit stores, in association with a fatigue state level, reference information that is the amount of change, within a predetermined range, of the correlation value of the determination information calculated for the brain function activation information relative to a reference correlation value.
In the fatigue state determination method according to the fourteenth aspect, the fatigue state level can be easily determined using the reference information stored in the reference information storage unit.
In the fatigue state determination method according to a fifteenth aspect of the present invention, in addition to the method according to the fourteenth aspect, the reference information is generated by executing the brain function activation information providing step, the face change information acquiring step, the face change information decomposing step, and the determination information generating step before the predetermined action.
In the fatigue state determination method according to the fifteenth aspect, the reference information is extracted from the face change information of the target person before the predetermined action, and therefore, the influence of the predetermined action on fatigue of the target person can be determined with high accuracy.
In the fatigue state determination method according to the sixteenth aspect of the present invention, in addition to the fatigue state determination method according to the fourteenth or fifteenth aspect, the fatigue state determination step accesses the determination information providing device when determining the fatigue state level. Here, the reference information storage unit is stored in the judgment information providing apparatus on the network.
In the fatigue state determination method according to the sixteenth aspect, since the fatigue state is determined using the reference information stored in the determination information providing device on the external network, the preparation before the predetermined action can be simplified. Furthermore, this method makes it possible to determine the fatigue state using big data, for example.
In the fatigue state determination method according to a seventeenth aspect of the present invention, the reference correlation value is calculated by providing brain function activation information to a person other than the subject person.
In the fatigue state determination method according to the seventeenth aspect, the determination of the fatigue state can be realized by using big data or the like obtained from a person other than the subject person.
In the fatigue state determination method according to an eighteenth aspect of the present invention, in addition to the method according to any one of the eleventh to seventeenth aspects, the predetermined action is a predetermined intellectual activity or physical activity, and the subject person is an intellectual worker or a physical worker.
In the fatigue state determination method according to the eighteenth aspect, the fatigue state before and after the intellectual or physical activities can be easily determined.
Effects of the invention
According to the fatigue state determination device described in the first aspect, the fatigue state of the subject person can be determined with a simple configuration.
According to the fatigue state determination device of the second aspect, the fatigue state of the subject person can be easily determined.
According to the fatigue state determination device described in the third aspect, the accuracy of determining the fatigue state of the target person can be improved.
According to the fatigue state determination device of the fourth aspect, the fatigue state of the subject can be easily determined based on the determination component corresponding to the brain function of the subject.
According to the fatigue state determination device of the fifth aspect, the reliability of determination can be improved.
According to the fatigue state determination device described in the sixth aspect, the fatigue state level can be easily determined.
According to the fatigue state determination device described in the seventh aspect, the determination component relating to the brain activity can be extracted with high accuracy.
According to the fatigue state determination device of the eighth aspect, the fatigue state can be determined by an infrared camera or the like.
According to the fatigue state determination device of the ninth aspect, the fatigue state can be determined by a solid-state imaging element or the like.
According to the fatigue state determination device described in the tenth aspect, the determination information providing device on the network can determine the fatigue state level of the target person.
According to the fatigue state determination method described in the eleventh aspect, the influence of fatigue on the subject caused by the predetermined action can be easily determined.
According to the fatigue state determination method described in the twelfth aspect, the accuracy of determining the fatigue state of the target person can be improved.
According to the fatigue state determination method of the thirteenth aspect, the influence of fatigue on the subject caused by the predetermined action can be easily determined.
According to the fatigue state determination method described in the fourteenth aspect, the fatigue state level can be easily determined.
According to the fatigue state determination method described in the fifteenth aspect, the influence of fatigue on the subject caused by the predetermined action can be determined with high accuracy.
According to the fatigue state determination method described in the sixteenth aspect, the work before the predetermined action can be simplified. Further, according to the above method, it is possible to determine the fatigue state using big data, for example.
According to the fatigue state determination method of the seventeenth aspect, the determination of the fatigue state can be realized by using big data or the like obtained from a person other than the subject person.
According to the fatigue state determination method of the eighteenth aspect, the fatigue state before and after the intellectual or physical activities can be easily determined.
Drawings
Fig. 1 is a diagram showing an example of captured image data and a result of analyzing the captured image data.
Fig. 2 is a diagram showing a partial result of analyzing facial skin temperature data.
Fig. 3 is a diagram showing a partial result of analyzing facial skin temperature data.
Fig. 4 is a diagram showing the amplitude of the component waveform of component 2 and the amplitude of the β wave among the measured brain waves.
Fig. 5 is a diagram showing the amplitude of the component waveform of component 3 and the amplitude of the β wave among the measured brain waves.
Fig. 6 is a graph showing a partial result of analysis of facial skin temperature data obtained in a control experiment.
Fig. 7 is a diagram showing a component waveform based on the captured image data of the face and the amplitude of the β wave in the measured brain wave.
Fig. 8 is a diagram showing a component waveform based on facial skin temperature data and the amplitude of a β wave in a measured brain wave.
Fig. 9 is a diagram showing a component waveform of captured image data based on a face and an amplitude of a β wave in a measured brain wave.
Fig. 10 is a diagram showing a component waveform based on facial skin temperature data and the amplitude of a β wave in a measured brain wave.
Fig. 11 is a diagram showing a component waveform of captured image data based on a face and an amplitude of a β wave in a measured brain wave.
Fig. 12 is a diagram showing a component waveform based on facial skin temperature data and the amplitude of a β wave in a measured brain wave.
Fig. 13 is a diagram showing a component waveform of captured image data based on a face and an amplitude of a β wave in a measured brain wave.
Fig. 14 is a diagram showing a component waveform based on facial skin temperature data and the amplitude of a β wave in a measured brain wave.
Fig. 15 is a diagram showing a component waveform of captured image data based on a face and an amplitude of a β wave in a measured brain wave.
Fig. 16 is a diagram showing a component waveform based on facial skin temperature data and the amplitude of a β wave in a measured brain wave.
Fig. 17 is a diagram showing a component waveform of the captured image data based on the face and the amplitude of the measured beta wave in the brain wave.
Fig. 18 is a diagram showing a component waveform based on facial skin temperature data and the amplitude of a β wave in a measured brain wave.
Fig. 19 is a schematic view of a brain activity visualization device according to an embodiment of the present invention.
Fig. 20 is a flowchart showing an example of a process flow when a component indicating a change in skin temperature reflecting brain function is specified in the brain activity visualization device.
Fig. 21 is a schematic view of a brain activity visualization device according to an embodiment of the present invention.
Fig. 22 is a flowchart showing an example of a process flow when a component indicating RGB changes of a face reflecting brain functions is specified in the brain activity visualization device.
Fig. 23 is a schematic diagram showing the configuration of a fatigue state determination device according to an embodiment of the present invention.
Fig. 24 is a schematic diagram showing the configuration of the reference information database of the fatigue state determination device.
Fig. 25A is a flowchart showing the operation of the fatigue state determination device.
Fig. 25B is a flowchart showing the operation of the fatigue state determination device.
Fig. 26 is a schematic diagram of an example in which an infrared camera is used in the fatigue state determination device.
Fig. 27 is a schematic diagram showing an example of use of the fatigue state determination device.
Fig. 28 is a schematic diagram showing a configuration of a modification of the fatigue state determination device.
Fig. 29 is a flowchart showing an operation of a modification of the fatigue state determination device.
Detailed Description
Before the embodiments of the present invention are described, the findings of the present inventors, which form an important basis for the completion of the present invention, will first be described.
(1) The gist of the inventors' knowledge
It is known that human brain activity reflects human mental activities (cognitive activities and the like) and emotional activities (feelings of pleasantness/unpleasantness and the like). Attempts have been made to estimate human brain activity, and in such cases the data used are usually detected by one of electroencephalography, magnetic resonance imaging, and near-infrared spectroscopy.
Here, when electroencephalography, for example, is used as the detection method, electroencephalogram electrodes must be attached to the test subject. In addition, since the electrical resistance between the skin and the electrodes must be reduced when the electrodes are attached, preprocessing such as abrading the skin and applying paste to the electrodes is required. When magnetic resonance imaging is used, measurement cannot be performed outside the MRI room, and there are restrictions on the measurement conditions, such as the prohibition on bringing metal into the measurement room. When near-infrared spectroscopy is used, a probe must be attached to the test subject; however, wearing the probe for a long time is painful for the test subject, and accurate detection may become impossible if the test subject's hair comes into contact with the probe. As described above, when a conventional detection method is used to measure human brain activity, preprocessing such as attaching electroencephalogram electrodes or a probe is required, or the measurement conditions are restricted, which imposes a heavy burden on the test subject.
Therefore, there is a need for a device that can estimate human brain activity easily while reducing the burden on the test subject.
The present inventors therefore considered whether human brain activity could be estimated from the skin temperature of the human face, or from the blood circulation state of the face, which is considered to be proportional to the facial skin temperature. The facial skin temperature can be acquired with a measurement device such as a thermography device, and the blood circulation state of the face, that is, the facial blood circulation volume, can be estimated from the RGB data of a captured image of the face obtained with an imaging device. In this way, the facial skin temperature and captured images of the face can be acquired without attaching sensors that require preprocessing before attachment, such as electroencephalogram electrodes or probes.
On the other hand, it is known that the skin temperature of the human face changes under the influence of various factors such as the outside air temperature and/or the activity of autonomic nerves. Therefore, it is considered that if the brain activity is estimated from the skin temperature of the face or from the blood circulation amount of the face which is considered to be proportional to the skin temperature of the face, it is difficult to determine whether or not the acquired data reflects only the brain activity.
Through intensive studies, the present inventors found the following: time-series facial skin temperature data, which includes detected temperature data and position data (coordinate data) of the detected portions, or time-series facial blood circulation volume data calculated from RGB data obtained from time-series captured image data of the face, can be decomposed into a plurality of components by a singular value decomposition method, a principal component analysis method, or an independent component analysis method, and by analyzing the decomposed components, a component representing a change in facial skin temperature or facial blood circulation volume that reflects brain activity can be identified. The present inventors further found that, by thus estimating and analyzing the brain activity of a subject person, the physiological state of the subject person can be visualized on the basis of the estimated brain activity.
(2) Method for acquiring various data of face and method for analyzing acquired various data
(2-1) method for acquiring facial skin temperature data and method for analyzing facial skin temperature data
Next, a method for acquiring facial skin temperature data and a method for analyzing facial skin temperature data, which are used by the present inventors when obtaining the above-described findings, will be described.
In this test, facial skin temperature data were acquired from six test subjects. Specifically, each test subject was seated on a chair placed in an artificial climate room maintained at a room temperature of 25 °C, and facial skin temperature data were acquired from the entire face of the test subject using an infrared thermography device. The infrared thermography device detects the infrared radiant energy emitted from the object with an infrared camera, converts the detected infrared radiant energy into the temperature (here, in degrees Celsius) of the object's surface, and displays and stores the temperature distribution as facial skin temperature data (for example, image data representing the temperature distribution). In this test, R300, manufactured by NEC Avio Infrared Technologies Co., Ltd., was used as the infrared thermography device. The infrared camera was placed in front of the test subject at a distance of 1.5 m. Facial skin temperature data were acquired for thirty minutes.
In addition, in this test, a brain function activation task was given to the test subject while the facial skin temperature data were being acquired. In this way, facial skin temperature data during brain inactivity and facial skin temperature data during brain activation were acquired. Brain function activation tasks include mental tasks such as calculations performed by the test subject on a screen shown on a display device or the like, recognition of numbers, shapes, and colors, and memorization of symbols, characters, or words. In this test, "mental multiplication" was used as the brain function activation task: the test subject calculated numbers displayed in written form on the display device and entered the answers with a keyboard. In this test, the brain function activation task was given continuously for ten minutes, starting five minutes after the start of facial skin temperature data acquisition.
As analysis of the facial skin temperature data, the acquired facial skin temperature data is targeted, and Singular value decomposition is performed using SVD (Singular value decomposition) of MATLAB (registered trademark) as an analysis tool. In the singular value decomposition, all facial skin temperature data (thirty-minute data) acquired in time series are set as objects, a factor is set as time data every thirty seconds (sixty time points for thirty minutes), and a measure is set as facial skin temperature data (240 × 320 pixels) of the above-described period (thirty seconds). Next, the facial skin temperature data X is decomposed into a plurality of components by singular value decomposition, and the temporal distribution V, the spatial distribution U, and the singular value S indicating the size of each component are calculated for each component. The above relationship is expressed by the following equation. V' is a matrix obtained by exchanging rows and columns of V.
(math formula 1)
X=(U*S)*V'
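The decomposition in (math formula 1) can be sketched with NumPy's SVD. This is an illustrative stand-in, not the patent's MATLAB implementation: the dimensions (240 × 320 pixels as the measure, sixty 30-second time points as the factor) follow the test description, but the matrix here is filled with random numbers in place of real skin temperature data.

```python
import numpy as np

# Dimensions from the test: 240x320 pixels (measure) and sixty
# 30-second time points (factor); random data stands in for the
# actual facial skin temperature measurements.
n_pixels, n_times = 240 * 320, 60
rng = np.random.default_rng(0)
X = rng.standard_normal((n_pixels, n_times))

# Economy-size SVD: X = U @ diag(S) @ Vt, where Vt is V with its
# rows and columns exchanged (V' in the text).
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# U[:, i]  -> spatial distribution of component i (temperature map)
# Vt[i, :] -> time distribution of component i (component waveform)
# S[i]     -> singular value (size) of component i
assert np.allclose(X, (U * S) @ Vt)  # reconstruction check
```

Plotting `Vt[i, :]` against time gives the component waveform diagram, and reshaping `U[:, i]` back to 240 × 320 gives the temperature distribution diagram of component i.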
Then, the time distribution V and the spatial distribution U of each component obtained by the singular value decomposition are plotted to generate a component waveform diagram and a temperature distribution diagram of each component.
Furthermore, analysis for identifying a component representing a change in skin temperature reflecting brain activity was performed based on the generated component waveform map and temperature distribution map of each component.
From the component waveform diagram of each component, the correlation between the amplitude of the component waveform and brain inactivity/brain activation was analyzed. Specifically, whether there is a correlation between the amplitude shown in the component waveform diagram of each component and the brain-inactive period/brain-active period was evaluated. In this test, within the period during which the facial skin temperature data were acquired, the five-minute period from the start of data acquisition, during which no brain function activation task was given, and the fifteen-minute period from the end of the task until the end of data acquisition were defined as brain-inactive periods, while the ten-minute period from five minutes after the start of data acquisition until fifteen minutes after the start, during which the brain function activation task was given, was defined as the brain-active period. Then, the correlation between the amplitude shown in the component waveform diagram of each component and brain activation/brain inactivity was evaluated. The presence or absence of a correlation was determined by statistical analysis, and a correlation was judged to exist when the significance level (α) was 0.05 or less.
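This amplitude-versus-period evaluation can be sketched as a correlation between a component waveform and a 0/1 label vector marking the brain-active intervals. The waveform below is synthetic (a label signal plus noise), and the critical value of the correlation coefficient for sixty samples at the 0.05 level (about 0.25) is an approximation assumed for this sketch.

```python
import numpy as np

# Sixty 30-second intervals over thirty minutes; the brain-active
# period (task given) covers minutes 5-15, i.e. intervals 10..29.
labels = np.zeros(60)
labels[10:30] = 1.0

# Synthetic component-waveform amplitude rising during the active
# period (a stand-in for a real component waveform).
rng = np.random.default_rng(0)
amplitude = labels + 0.3 * rng.standard_normal(60)

# Pearson correlation between amplitude and active/inactive labels.
r = np.corrcoef(amplitude, labels)[0, 1]

# Approximate two-tailed critical |r| at significance level 0.05 for
# n = 60 samples (about 0.25; an assumption for this sketch).
significant = abs(r) > 0.25
```

A real analysis would compute an exact p-value from r and the sample size rather than use a fixed critical value.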
From the temperature distribution diagram of each component, whether there is a temperature change at a predetermined part of the face was analyzed. Here, the brain has a mechanism for cooling itself independently of body temperature, known as the selective brain cooling system. The selective brain cooling system is known to dissipate the heat generated by brain activity through the forehead and the area around the paranasal sinuses (including between the eyebrows and around the nose). Therefore, in this test, whether the temperature around the paranasal sinuses and at the forehead changed was evaluated in the temperature distribution diagram of each component. Whether such a temperature change occurred was judged either by visual inspection or by whether the temperature around the paranasal sinuses and at the forehead deviated from the average temperature of all the measurement data by one standard deviation (SD) or more.
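The one-standard-deviation criterion above can be sketched as follows. The temperature map is synthetic, and the region indices used for the "forehead" are arbitrary assumptions for illustration.

```python
import numpy as np

# Hypothetical 240x320 temperature map with an artificially warmed
# forehead region (values and region indices are assumptions).
rng = np.random.default_rng(0)
temp_map = 34.0 + 0.1 * rng.standard_normal((240, 320))
temp_map[0:40, 100:220] += 0.5          # simulate a warmer forehead

roi = temp_map[0:40, 100:220]           # assumed forehead region
# Criterion: the region mean deviates from the overall mean of all
# measurement data by at least one standard deviation.
changed = abs(roi.mean() - temp_map.mean()) >= temp_map.std()
```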
In addition, since the polarity (positive or negative) of the facial skin temperature data X is determined by the relationship among the values of the spatial distribution U, the singular value S, and the temporal distribution V, the polarity may be reversed in the component waveform diagram and the temperature distribution diagram of each component. Therefore, the polarity is not an object to be evaluated in the evaluation of the component waveform map and the temperature distribution map.
Here, as described above, the infrared thermography device converts the infrared radiant energy detected from the object into temperature and uses the temperature distribution as the facial skin temperature data. When the facial skin temperature of a human is acquired with an infrared thermography device, various temperature changes unrelated to brain activity, such as facial movements and/or autonomic nervous activity, are also captured in the facial skin temperature data (i.e., disturbances; see fig. 1 (a)). Therefore, in order to exclude such temperature changes unrelated to brain activity, relative facial skin temperature data were generated in which the overall average of the temperature data contained in the facial skin temperature data for each thirty-second period was set to "0". The generated facial skin temperature data were likewise subjected to singular value decomposition using the SVD of MATLAB (registered trademark) as the analysis tool, a component waveform diagram and a temperature distribution diagram of each component were generated on the basis of the singular values S, and analysis was performed to identify a component representing a change in skin temperature reflecting brain activity.
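The relative conversion described above (setting the overall average of each thirty-second window to "0") amounts to subtracting each window's mean over all pixels. A minimal sketch with random stand-in data:

```python
import numpy as np

# Rows = pixels, columns = thirty-second windows (stand-in values).
rng = np.random.default_rng(0)
X = 34.0 + rng.standard_normal((240 * 320, 60))

# "Relative" facial skin temperature data: subtract each window's
# overall average so that it becomes exactly 0.
X_rel = X - X.mean(axis=0, keepdims=True)
```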
For convenience of explanation, the facial skin temperature data acquired by the infrared thermography device are hereinafter referred to as "facial skin temperature data based on temperature conversion data", and the relative facial skin temperature data, in which the overall average of the temperature data contained in the facial skin temperature data based on temperature conversion data for each predetermined period (every thirty seconds in this test) is set to "0", are referred to as "facial skin temperature data based on relative temperature conversion data".
In addition to the detection of the facial skin temperature by the infrared thermography device, one of the six test subjects had electrodes attached to the scalp to measure brain waves, and the correlation between the amplitude of the β wave (a brain wave with a frequency of 14 to 30 Hz), known as a waveform that appears when a person is awake or under tension, and the amplitude of the component waveform diagram was evaluated. In the electroencephalogram measurement, electrodes were placed at six sites (F3, F4, C3, C4, Cz, Pz) in accordance with the International 10-20 system.
It is conceivable that the test subject's head moves up and down while the brain function activation task is given, which changes the position of the test subject's face relative to the infrared camera. To verify whether such changes in face position affect the change in skin temperature, a control test was performed on one test subject. In the control test for verifying the influence of the test subject's movement during acquisition of the facial skin temperature data, facial skin temperature data were acquired using the infrared thermography device as in the test described above, but the test subject was also asked to operate the keyboard at random timing during the periods in which no brain function activation task was given (i.e., during brain inactivity). The facial skin temperature data based on temperature conversion data and the facial skin temperature data based on relative temperature conversion data obtained in this control test were likewise subjected to singular value decomposition using the SVD of MATLAB (registered trademark) as the analysis tool, a component waveform diagram and a temperature distribution diagram of each component were generated on the basis of the singular values S, and analysis was performed to identify a component representing a change in skin temperature.
(2-2) method for acquiring face photographic image data and method for analyzing face photographic image data
Fig. 1 (a) is a diagram showing an example of captured image data of the vicinity of the paranasal sinus of the face of a test subject captured by an imaging device. Fig. 1 (b) is a diagram showing an example of a blood circulation volume distribution map (image map).
Next, a method for acquiring face photographic image data and a method for analyzing face photographic image data, which are used by the present inventors when obtaining the above-described findings, will be described.
In this test, captured image data of the face were acquired from six test subjects. Specifically, each test subject was seated on a chair placed in an artificial climate room maintained at a room temperature of 25 °C, and captured image data of the area around the paranasal sinuses of the test subject's entire face were acquired in time series using an imaging device capable of acquiring images in time series.
Further, given the selective brain cooling system described above, it is considered that changes in the facial blood circulation volume that accompany brain activity, and that are proportional to the facial skin temperature, appear at the forehead and/or around the paranasal sinuses. On this basis, the present inventors considered that brain activity could be estimated with good accuracy if changes in the facial blood circulation volume at least at the forehead and/or around the paranasal sinuses could be captured. In this test, therefore, captured image data of the area around the paranasal sinuses of each test subject's face were acquired in time series.
In this test, color moving image data were acquired as the time-series captured image data, using the imaging device on the liquid crystal screen side of an iPad Air (registered trademark) manufactured by Apple Inc. The imaging device was placed in front of the test subject at a distance of 1.0 m. Captured image data were then acquired continuously for thirty minutes along the time axis at an imaging period of 30 frames/second, yielding moving image data of the face.
In addition, in this test, a brain function activation task was given to the test subject while the moving image data of the face were being acquired. In this way, moving image data of the face during brain inactivity and moving image data of the face during brain activation were acquired. As in the test described above, "mental multiplication" was used as the brain function activation task: the test subject calculated numbers displayed in written form on the display device and entered the answers with a keyboard. In this test, the brain function activation task was given continuously for ten minutes, starting five minutes after the start of acquisition of the moving image data of the face.
For the analysis of the moving image data of the face, blood circulation volume data were calculated from the RGB data obtained from the captured moving image data of the face, and singular value decomposition was performed on the calculated time-series blood circulation volume data using the SVD of MATLAB (registered trademark) as the analysis tool. Here, based on the CIE-L*a*b* color system, the erythema index "a*", which correlates with skin redness and the amount of hemoglobin, was computed from the RGB data of the images, and this erythema index was used as the blood circulation volume data. In the singular value decomposition, the blood circulation volume data (here, the erythema index) based on the RGB data obtained from all of the moving image data acquired in time series (thirty minutes of data) were taken as the target, the factor was the time data for every thirty seconds (sixty time points over thirty minutes), and the measure was the erythema index computed from the RGB data for each period (every thirty seconds) (the erythema index was obtained by extracting one second of frame data every thirty seconds and computing the average of the RGB values obtained from that frame data; 240 × 320 pixels). The time-series blood circulation volume data based on the RGB data obtained from the moving image data of the face were then decomposed into a plurality of components by singular value decomposition, and the time distribution V, the spatial distribution U, and the singular value S indicating the size of each component were calculated for each component. The relationship among them is expressed by the same equation as (math formula 1) above.
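The per-interval averaging pipeline described above (one second of frames extracted from every thirty-second interval, RGB averaged per pixel) can be sketched as follows. The actual erythema index is derived from the CIE-L*a*b* color system; since its exact formula is not reproduced in the text, the simple red-minus-green/blue proxy below is an assumption standing in for it, and the frame size is downscaled from 240 × 320 to keep the sketch fast.

```python
import numpy as np

# 30 fps video; one second of frames per 30-second interval, sixty
# intervals over thirty minutes.  Frame size downscaled to 24x32.
fps, n_intervals, h, w = 30, 60, 24, 32
rng = np.random.default_rng(0)

columns = []
for _ in range(n_intervals):
    frames = rng.random((fps, h, w, 3))   # one second of RGB frames (stand-in)
    mean_rgb = frames.mean(axis=0)        # per-pixel average over that second
    # Crude redness proxy (assumption) in place of the erythema index:
    redness = mean_rgb[..., 0] - mean_rgb[..., 1:].mean(axis=-1)
    columns.append(redness.ravel())

# Measure matrix for the singular value decomposition:
# rows = pixels, columns = thirty-second intervals.
X = np.column_stack(columns)
```

The matrix `X` then plays the same role as the facial skin temperature matrix in (math formula 1).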
Then, the time distribution V and the spatial distribution U of each component obtained by the singular value decomposition are plotted to generate a component waveform diagram and a blood circulation volume distribution diagram of each component.
Then, analysis was performed to identify components representing changes in the blood circulation volume of the face reflecting brain activity, that is, RGB changes of the face, from the generated component waveform map and blood circulation volume distribution map of each component.
From the component waveform diagram of each component, the correlation between the amplitude of the component waveform and brain inactivity/brain activation was analyzed. Specifically, whether there is a correlation between the amplitude shown in the component waveform diagram of each component and the brain-inactive period/brain-active period was evaluated. In this test, within the period during which the captured image data of the face were acquired, the five-minute period from the start of data acquisition, during which no brain function activation task was given, and the fifteen-minute period from the end of the task until the end of data acquisition were defined as brain-inactive periods, while the ten-minute period from five minutes after the start of data acquisition until fifteen minutes after the start, during which the brain function activation task was given, was defined as the brain-active period. Then, the correlation between the amplitude shown in the component waveform diagram of each component and brain activation/brain inactivity was evaluated. The presence or absence of a correlation was determined by statistical analysis, and a correlation was judged to exist when the significance level (α) was 0.01 or less.
From the blood circulation volume distribution diagram of each component, whether there is a change in the blood circulation volume at a predetermined part of the face was analyzed. The blood circulation volume distribution diagram is generated by arranging the spatial distribution U calculated for each pixel at the position of that pixel. In the blood circulation volume distribution diagram of each component generated in this way, whether the blood circulation volume around the paranasal sinuses and at the forehead changed was evaluated. Whether such a change occurred was judged either by visual inspection or by whether the blood circulation volume value around the paranasal sinuses and at the forehead was not "0.000", as shown in fig. 1 (b).
In addition, since the polarity (positive or negative) of the blood circulation volume data X is determined by the relationship among the values of the spatial distribution U, the singular value S, and the temporal distribution V, the polarity may be reversed in the component waveform diagram and the blood circulation volume distribution diagram of each component. Therefore, the polarity is not an object to be evaluated in the evaluation of the component waveform map and the blood circulation volume distribution map.
Then, in order to verify the correlation between the facial skin temperature and the facial blood circulation volume, facial skin temperature data were also acquired in time series with an infrared thermography device while the time-series captured image data of the face were being acquired from the six test subjects. The acquired facial skin temperature data were likewise subjected to singular value decomposition using the SVD of MATLAB (registered trademark) as the analysis tool, a component waveform diagram of each component was generated on the basis of the singular values S, and whether the amplitude of the component waveform correlated with brain inactivity and brain activation was analyzed. In this test, the same infrared thermography device as in the test described above was used, and the infrared camera was placed in front of the test subject at a distance of 1.5 m.
In addition, when captured image data of the face are acquired with an imaging device, sunlight or other light striking the face during imaging may be reflected by the face, and the reflected light may enter the lens of the imaging device. The reflected light is then recorded in the captured image data of the face. In the RGB data obtained from the captured image data, brightness changes based on the facial blood circulation volume are smaller than brightness changes based on such reflected light, so if the blood circulation volume calculated from RGB data obtained from captured image data in which reflected light is recorded is analyzed, RGB changes of the face unrelated to brain activity (i.e., disturbances) may be mixed in. Therefore, in order to prevent such RGB changes of the face unrelated to brain activity from being mixed in, relative blood circulation volume data were generated from relative RGB data in which the overall average of the RGB data for every thirty seconds was set to "0". The generated blood circulation volume data were likewise subjected to singular value decomposition using the SVD of MATLAB (registered trademark) as the analysis tool, a component waveform diagram and a blood circulation volume distribution diagram of each component were generated on the basis of the singular values S, and analysis was performed to identify a component representing an RGB change of the face reflecting brain activity.
For convenience of explanation, the relative blood circulation volume data based on relative RGB data in which the overall average of the RGB data for each predetermined period (every thirty seconds in this test) is set to "0" are hereinafter referred to as "relative converted blood circulation volume data", and the blood circulation volume data based on the RGB data before conversion into relative RGB data are referred to simply as "blood circulation volume data".
Further, while the time-series captured image data of the face were being acquired from the six test subjects by the imaging device, each test subject also had electrodes attached to the scalp to measure brain waves, and the correlation between the amplitude of the β wave (a brain wave with a frequency of 13 to 30 Hz), known as a waveform that appears when brain cells are active, such as during wakefulness, and the amplitude of the component waveform diagram was evaluated. In the electroencephalogram measurement, electrodes were placed at nineteen sites on the scalp (Fp1, Fp2, F3, F4, C3, C4, P3, P4, O1, O2, F7, F8, T3, T4, T5, T6, Fz, Cz, and Pz) in accordance with the International 10-20 system.
Further, it is conceivable that the test subject's head moves up and down while the brain function activation task is given, which changes the position of the test subject's face relative to the imaging device. To verify whether such changes in face position affect the RGB changes of the face, a control test was performed on one test subject. In the control test, time-series captured image data of the test subject's face were acquired using the imaging device as in the test described above, but the test subject was also asked to operate the keyboard at random timing during the periods in which no brain function activation task was given (i.e., during brain inactivity). The time-series blood circulation volume data based on the RGB data obtained from the time-series captured image data of the face captured in this control test were likewise subjected to singular value decomposition using the SVD of MATLAB (registered trademark) as the analysis tool, a component waveform diagram of each component was generated on the basis of the singular values S, and whether the amplitude of the component waveform correlated with brain inactivity and brain activation was analyzed. Whether the amplitude of each component waveform correlated with the actual movement of the face was also analyzed. The actual movement of the face was evaluated as follows: two-dimensional coordinates of the same part of the face were obtained from the captured image data, and the movement distance of the face for every thirty seconds of imaging was calculated with the captured image data at the start of the control test as the reference. Furthermore, whether the amplitude of each component waveform correlated with the number of keyboard inputs during imaging was also analyzed. The number of keyboard inputs during imaging was evaluated by calculating a simple moving average for every thirty seconds.
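The two control-test measures described above, movement distance relative to the start of the test and a smoothed keyboard-input count, can be sketched as follows. All data here are random stand-ins, and the 3-interval smoothing window is an assumption (the text only says the average is taken every thirty seconds).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D pixel coordinates of the same facial point, one
# per 30-second interval; a random walk stands in for tracking data.
coords = np.cumsum(rng.standard_normal((60, 2)), axis=0)
# Movement distance measured against the data at the start of the test.
distance = np.linalg.norm(coords - coords[0], axis=1)

# Keyboard inputs per interval, smoothed with a simple moving
# average (window of 3 intervals; the window length is assumed).
inputs = rng.integers(0, 20, size=60).astype(float)
smoothed = np.convolve(inputs, np.ones(3) / 3, mode="valid")
```

Each of these series can then be correlated against the component-waveform amplitudes in the same way as the active/inactive labels.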
(3) Analysis result
(3-1) analysis result of facial skin temperature data
Fig. 2 is a diagram showing part of the results of analyzing the facial skin temperature data based on the temperature conversion data. Fig. 2(a) shows the component waveform diagram of component two of test subject one. Fig. 2(b) shows the temperature distribution diagram of component two of test subject one. Fig. 3(a) shows the component waveform diagram of component three of test subject one. Fig. 3(b) shows the temperature distribution diagram of component three of test subject one. Figs. 4 and 5 are diagrams showing the relationship between the amplitude of a component waveform and brain waves. Fig. 4 is a diagram showing the amplitude of the component waveform of component two of test subject one and the measured amplitude of the β wave in the brain waves. Fig. 5 is a diagram showing the amplitude of the component waveform of component three of test subject one and the measured amplitude of the β wave in the brain waves. Fig. 6 is a diagram showing part of the results of analyzing the facial skin temperature data obtained in the control experiment. Fig. 6(a) shows the component waveform diagram of component three. Fig. 6(b) shows the temperature distribution diagram of component three.
Table 1 shows the analysis results of the facial skin temperature data of each test subject.
As a result of analyzing the facial skin temperature data, it is found that there is a significant correlation between the second component and/or the third component among the plurality of components obtained by decomposing the time-series facial skin temperature data by singular value decomposition and the brain activity of the human being.
(Table 1)
Figure BDA0001780267060000191
As shown in figs. 4 and 5, the results of the brain wave analysis show that the amplitude of each of the component waveforms of component two and component three has a significant correlation with the amplitude of the β wave of the brain waves.
In addition, in the control experiment, even though the test subject moved while the facial skin temperature data was being acquired, there was a significant correlation between component three and human brain activity (see fig. 6). From this, it is considered that the movement of the test subject during acquisition of the facial skin temperature data does not affect component three among the plurality of components.
Based on the above results, the present inventors have obtained the following findings.
The time-series facial skin temperature data acquired from the test subject was decomposed into a plurality of components by singular value decomposition, and from the results of analyzing each of the decomposed components it was found that component three among the plurality of components is a component related to brain activity. That is, it was found that, by decomposing the time-series facial skin temperature data into a plurality of components by singular value decomposition, extracting from the decomposed components the components related to brain activation/inactivation, and analyzing the extracted components with reference to the selective brain cooling mechanism, a component representing a change in skin temperature that reflects brain activity can be identified from among the plurality of components. Thus, the present inventors obtained the following finding: brain activity can be estimated from the skin temperature of a human face.
(3-2) analysis result of captured image data of face
Figs. 7 to 18 are diagrams showing part of the results of the comparative analysis between the component waveform diagrams based on the captured image data of the face (blood circulation volume data) or the facial skin temperature data and the measured waveform diagrams of the β wave in the brain waves. Fig. 7 is a diagram showing the amplitude of the component waveform of component two based on the captured image data of test subject one and the measured amplitude of the β wave in the brain waves of test subject one. Fig. 8 is a diagram showing the amplitude of the component waveform of component two based on the facial skin temperature data of test subject one and the measured amplitude of the β wave in the brain waves of test subject one. Fig. 9 is a diagram showing the amplitude of the component waveform of component two based on the captured image data of test subject two and the measured amplitude of the β wave in the brain waves of test subject two. Fig. 10 is a diagram showing the amplitude of the component waveform of component two based on the facial skin temperature data of test subject two and the measured amplitude of the β wave in the brain waves of test subject two. Fig. 11 is a diagram showing the amplitude of the component waveform of component four based on the captured image data of test subject three and the measured amplitude of the β wave in the brain waves of test subject three. Fig. 12 is a diagram showing the amplitude of the component waveform of component three based on the facial skin temperature data of test subject three and the measured amplitude of the β wave in the brain waves of test subject three. Fig. 13 is a diagram showing the amplitude of the component waveform of component three based on the captured image data of test subject four and the measured amplitude of the β wave in the brain waves of test subject four. Fig. 14 is a diagram showing the amplitude of the component waveform of component two based on the facial skin temperature data of test subject four and the measured amplitude of the β wave in the brain waves of test subject four. Fig. 15 is a diagram showing the amplitude of the component waveform of component two based on the captured image data of test subject five and the measured amplitude of the β wave in the brain waves of test subject five. Fig. 16 is a diagram showing the amplitude of the component waveform of component two based on the facial skin temperature data of test subject five and the measured amplitude of the β wave in the brain waves of test subject five. Fig. 17 is a diagram showing the amplitude of the component waveform of component four based on the captured image data of test subject six and the measured amplitude of the β wave in the brain waves of test subject six. Fig. 18 is a diagram showing the amplitude of the component waveform of component three based on the facial skin temperature data of test subject six and the measured amplitude of the β wave in the brain waves of test subject six.
As shown in figs. 7 to 18, the results of the component waveform and brain wave analyses show that the facial skin temperature and the facial blood circulation volume are correlated. Further, based on the analysis of either the facial skin temperature data or the facial blood circulation volume data, a significant correlation was found between the amplitude of each component waveform and the amplitude of the β wave of the brain waves measured by the electrodes attached to the parietal or occipital region.
Table 2 shown below shows the analysis results of the captured image data of the face of each test subject.
(Table 2)
Figure BDA0001780267060000211
As shown in table 2, the analysis results of the captured image data of the face show that, among the plurality of components obtained by performing singular value decomposition on the time-series blood circulation volume data based on the captured image data of the face, component one, component two, component three, component four, and component five have a significant correlation with human brain activity. Here, not only the components that showed a significant correlation in both the analysis based on the blood circulation volume data and the analysis based on the relative converted blood circulation volume data, but also the components that showed no significant correlation in the analysis based on the blood circulation volume data yet showed a significant correlation in the analysis based on the relative converted blood circulation volume data, are regarded as having a significant correlation with human brain activity.
In addition, table 3 shown below shows the results of the control experiment.
(Table 3)
Components having a correlation with brain resting/brain activation: component one and component two
Components having a correlation with the face movement distance: component one, component three, and component four
Component having a correlation with the number of keyboard inputs: component eight
As shown in table 3, in the control experiment, in which the test subject moved while the captured image data of the face was being acquired, component one and component two were the components whose component waveform amplitudes have a significant correlation with the brain inactivation time and the brain activation time; of these, component two showed no significant correlation with either the face movement distance or the number of keyboard inputs. From this it is understood that, among the plurality of components obtained by performing singular value decomposition on the blood circulation volume data based on the RGB data acquired from the captured image data of the face, a component having a significant correlation with brain activity may be affected by the movement of the test subject at the time of acquiring the time-series captured image data of the face, but that effect is much smaller than the effect of brain activity itself (the effect of activation or inactivation of the brain).
Based on the above results, the present inventors have obtained the following findings.
As a result of decomposing the blood circulation volume data, obtained from the RGB data of the face based on the time-series captured image data of the face acquired from the test subject, into a plurality of components by singular value decomposition and analyzing each of the decomposed components, it was found that component one, component two, component three, component four, and component five among the plurality of components are components related to brain activity. That is, it was found that, by decomposing the blood circulation volume data into a plurality of components by singular value decomposition, extracting from the decomposed components the components related to brain activation/inactivation, and analyzing the extracted components, a component representing an RGB change of the face that reflects brain activity can be identified from among the plurality of components. Thus, the present inventors obtained the following finding: brain activity can be estimated from time-series captured image data of a human face.
(4) Brain activity visualization device
Next, based on the above-described findings, the brain activity visualization devices 10 and 110 according to an embodiment of the present invention completed by the present inventors will be described. The brain activity visualization device according to the present invention is not limited to the following embodiments, and can be modified as appropriate without departing from the scope of the invention.
The brain activity visualization device 10, 110 according to an embodiment of the present invention includes: a brain activity estimation unit 30 that estimates brain activity from facial skin temperature data; and/or a brain activity estimation unit 130 that estimates brain activity from captured image data of the face. Hereinafter, before describing the brain activity visualization devices 10 and 110 according to the embodiments of the present invention, the brain activity estimation units 30 and 130 will each be described.
(4-1) brain activity estimating unit 30 for estimating brain activity based on facial skin temperature
Fig. 19 is a schematic view of a brain activity visualization device 10 according to an embodiment of the present invention. Fig. 20 is a flowchart showing a flow of processing when a component representing a change in skin temperature reflecting brain function is specified in the brain activity visualization device 10.
The brain activity estimation unit 30 included in the brain activity visualization apparatus 10 estimates the brain activity of an individual (tester) from the skin temperature of the face of the individual. As shown in fig. 19, the brain activity visualization device 10 includes a facial skin temperature acquisition unit 20, a brain activity estimation unit 30, and a state visualization unit 200.
The facial skin temperature acquisition unit 20 detects the skin temperature of at least a part of the face of the individual, and acquires facial skin temperature data including the detected temperature data and the position data of the detection site in time series (step S1). Here, the facial skin temperature acquisition unit 20 is an infrared thermal imaging device and, as shown in fig. 19, has an infrared camera 21 and a processing unit 22. The infrared camera 21 detects the infrared radiant energy radiated from the face of the individual; here, it detects the infrared radiant energy from the entire face of the individual. The processing unit 22 converts the detected infrared radiant energy into temperatures to obtain temperature data, generates a temperature distribution diagram of the facial skin temperature of the entire face with the sites where the infrared radiant energy was detected as position data (coordinate data), and processes the generated temperature distribution diagram as facial skin temperature data based on temperature conversion data. The facial skin temperature data based on the temperature conversion data is stored in a storage unit (not shown) included in the processing unit 22.
Here, the processing unit 22 generates a temperature distribution map of the facial skin temperature of the entire face, but the present invention is not limited to this, and may generate a temperature distribution map including at least the facial skin temperature around the sinus and/or the forehead portion, and use the temperature distribution map as facial skin temperature data based on temperature conversion data.
Here, while the facial skin temperature data based on the temperature conversion data is acquired by the facial skin temperature acquisition unit 20, the brain function activation problem is provided to the individual for a certain period. That is, the face skin temperature data based on the temperature conversion data acquired with the face skin temperature acquisition unit 20 includes data during which a brain function activation question is provided to the individual. Further, as the problem of activation of a brain function provided to an individual, there is no particular limitation as long as it is assumed that the human brain can be activated, and for example, the content of the problem of activation of a brain function can be appropriately determined according to the purpose of use of the brain activity visualization device 10.
The brain activity estimation unit 30 estimates the brain activity of the human from the face skin temperature data based on the temperature conversion data acquired by the face skin temperature acquisition unit 20. Specifically, as shown in fig. 19, the brain activity estimation unit 30 includes a conversion unit 31, an analysis unit 32, and an estimation unit 33.
The conversion unit 31 converts the temperature data included in the facial skin temperature data based on the temperature conversion data into relative temperature data, and generates facial skin temperature data based on the converted relative temperature data, that is, facial skin temperature data based on relative temperature conversion data (step S2). Specifically, the conversion unit 31 converts the temperature data into relative temperature data using, as a reference value, the average value of the temperature data included in the facial skin temperature data based on the temperature conversion data for each predetermined period (for example, 30 seconds). The conversion unit 31 then generates the facial skin temperature data based on the relative temperature conversion data using the converted relative temperature data and the position data.
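The conversion in step S2 can be illustrated with a short sketch in Python/NumPy (the function name, 1 Hz sampling, and block handling are assumptions; the patent only specifies that the per-period average serves as the reference value):

```python
import numpy as np

def to_relative(temps, block=30):
    """Convert absolute skin-temperature samples into relative temperature
    data: for each block of `block` samples (30 seconds at 1 Hz), the block
    mean is taken as the reference value and subtracted from every sample
    in that block."""
    temps = np.asarray(temps, dtype=float)
    rel = np.empty_like(temps)
    for start in range(0, len(temps), block):
        seg = temps[start:start + block]
        rel[start:start + block] = seg - seg.mean()
    return rel
```

Applying this per measurement point, together with the position data, yields the facial skin temperature data based on the relative temperature conversion data.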
The analysis unit 32 decomposes the time-series facial skin temperature data based on the temperature conversion data and the facial skin temperature data based on the relative temperature conversion data each into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis (step S3). Here, the analysis unit 32 performs singular value decomposition on both the acquired facial skin temperature data based on the temperature conversion data and the facial skin temperature data based on the relative temperature conversion data, using the SVD of MATLAB (registered trademark) as an analysis tool. The singular value decomposition takes as the factor the time data for each predetermined period (for example, 30 seconds), and takes as the measure the facial skin temperature data based on the temperature conversion data and the facial skin temperature data based on the relative temperature conversion data for each period, acquired in time series. By the singular value decomposition, the facial skin temperature data based on the temperature conversion data and the facial skin temperature data based on the relative temperature conversion data are each decomposed into a plurality of components, and the time distribution, the spatial distribution, and the singular value indicating the size of each component are calculated.
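The document performs this decomposition with MATLAB's SVD; an equivalent sketch with NumPy is shown below (the matrix layout, with measurement points as rows and 30-second time blocks as columns, and the random stand-in data are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows: facial measurement points (spatial/position data);
# columns: 30-second time blocks (the factor of the decomposition).
X = rng.normal(size=(100, 40))        # stand-in for skin-temperature data

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Component i has singular value s[i] (indicating its size), spatial
# distribution U[:, i], and time distribution (component waveform) Vt[i].
# Summing all components reconstructs the original data exactly.
reconstruction = (U * s) @ Vt
assert np.allclose(reconstruction, X)
```

The component waveform diagrams discussed in the analysis results correspond to the rows of `Vt` scaled by their singular values, and the temperature distribution diagrams correspond to the columns of `U`.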
In addition, in order to identify a component representing a change in skin temperature reflecting brain activity from a plurality of components decomposed by singular value decomposition, the analysis unit 32 determines whether or not each component satisfies a first condition and a second condition (step S4a, step S4b, step S5a, step S5 b). Here, the analysis unit 32 first determines whether or not each component based on the face skin temperature data corresponding to the temperature conversion data satisfies a first condition (step S4a), and determines whether or not the component based on the face skin temperature data corresponding to the temperature conversion data, which is determined to satisfy the first condition in step S4a, satisfies a second condition (step S4 b). Next, it is determined whether only the component that matches the component determined to satisfy the first condition and the second condition in steps S4a and S4b among the components based on the face skin temperature data corresponding to the relative temperature conversion data satisfies the first condition (step S5a), and it is then determined whether the component based on the face skin temperature data corresponding to the relative temperature conversion data determined to satisfy the first condition in step S5a satisfies the second condition (step S5 b). However, the order of the determination by the analysis unit 32 is not limited to this, and for example, it may be determined whether each component based on the face skin temperature data corresponding to the temperature conversion data and each component based on the face skin temperature data corresponding to the relative temperature conversion data satisfy the first condition and the second condition, and finally, components whose determination results match each other may be extracted.
The first condition is that the amplitude of the component waveform of a component obtained by the singular value decomposition has a correlation with the changes at the time of brain inactivation and at the time of brain activation. The analysis unit 32 extracts, from the plurality of components, a component satisfying the first condition as a determination component. Here, while the facial skin temperature data based on the temperature conversion data is being acquired, the brain function activation problem is provided to the individual for a certain period. The analysis unit 32 regards the period during which the brain function activation problem is not provided to the individual as the brain inactivation time and the period during which the brain function activation problem is provided to the individual as the brain activation time, and comparatively analyzes the component waveform of each component against the brain inactivation time and the brain activation time. The analysis unit 32 evaluates, using the result of this comparative analysis based on the component waveform data, whether or not the component waveform of each component has a correlation with the brain inactivation time and the brain activation time, and extracts, from the plurality of components, a component evaluated as having a correlation as a determination component satisfying the first condition. On the other hand, the analysis unit 32 determines that a component evaluated as having no correlation does not satisfy the first condition and is not a component representing a change in skin temperature that reflects human brain activity (step S6).
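The patent leaves the concrete criterion for this evaluation to simulation, experiment, or theoretical calculation; one plausible sketch, using a correlation coefficient against the activation periods (the function name, the use of Pearson correlation, and the threshold are all assumptions), is:

```python
import numpy as np

def satisfies_first_condition(waveform, task_on, r_threshold=0.5):
    """First-condition check (illustrative only; the patent fixes no criterion).

    waveform: amplitude of one component's waveform, one value per
              30-second block.
    task_on:  1 while the brain function activation problem is provided
              (brain activation time), 0 otherwise (brain inactivation time).
    Returns True when the component waveform correlates with the
    activation/inactivation periods beyond the chosen threshold."""
    r = np.corrcoef(np.asarray(waveform, float),
                    np.asarray(task_on, float))[0, 1]
    return bool(abs(r) >= r_threshold)
```

Components passing this check would be carried forward as determination components; the rest would be discarded in step S6.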
Here, the determination component is extracted by providing the brain function activation problem to the individual for a certain period while the facial skin temperature data based on the temperature conversion data is being acquired; however, the content of the first condition, that is, the means by which the analysis unit 32 extracts the determination component, is not limited to this. For example, when a component whose component waveform has a correlation with the brain inactivation time and the brain activation time has been identified in advance among the plurality of components by an experiment or the like, the analysis unit 32 extracts that identified component from the plurality of components as the determination component. Further, when the brain activity visualization device detects a human motion known to be related to activation/inactivation of the brain, such as eye movement or blinking, the analysis unit 32 may extract the determination component from the plurality of components by comparatively analyzing and evaluating the detection result against the component waveform of each component. The criterion by which the analysis unit 32 determines whether or not the first condition is satisfied is appropriately decided by simulation, experiment, theoretical calculation, or the like, according to the purpose of use of the brain activity visualization device 10 and the like.
The second condition is a condition that the temperature of a predetermined portion of the human face changes in the extracted determination component. The analysis unit 32 determines a component satisfying the second condition among the determination components as a component having a high possibility of being related to the brain activity of the human being, and extracts the component as a candidate component. That is, the analysis unit 32 determines whether the determination component is related to the brain activity of the human based on whether the temperature of the predetermined portion of the face of the human changes. Specifically, the analysis unit 32 determines whether or not the temperature of the surrounding area of the sinus and/or the forehead area changes based on the temperature distribution data of the extracted determination component, and if the temperature changes, determines that the determination component is a component that satisfies the second condition and has a high possibility of being related to the brain activity of the human, and extracts the component as a candidate component. On the other hand, when the temperature around the sinus and/or the forehead does not change, the analysis unit 32 determines that the determination component does not satisfy the second condition and is not a component indicating a change in the skin temperature reflecting the brain activity (step S6). The criterion for determining whether or not the second condition is satisfied by the analysis unit 32 is appropriately determined by simulation, experiment, theoretical calculation, or the like according to the purpose of use of the brain activity visualization device 10 or the like.
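Like the first condition, the concrete criterion for the second condition is left to simulation or experiment; a minimal sketch of such a check on a component's temperature distribution data (the function name, the threshold `eps`, and the boolean mask representation of the sinus/forehead region are assumptions) could look like:

```python
import numpy as np

def satisfies_second_condition(spatial_dist, roi_mask, eps=1e-6):
    """Second-condition check (illustrative only).

    spatial_dist: the component's spatial (temperature) distribution,
                  one weight per facial measurement point.
    roi_mask:     True for the points around the paranasal sinuses
                  and/or the forehead.
    The component passes when it exhibits a non-negligible temperature
    change inside the region of interest."""
    spatial_dist = np.asarray(spatial_dist, dtype=float)
    roi = spatial_dist[np.asarray(roi_mask, dtype=bool)]
    return bool(np.max(np.abs(roi)) > eps)
```

A determination component passing this check would be extracted as a candidate component; one failing it would be rejected in step S6.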
Next, the analysis unit 32 identifies the component determined to satisfy the second condition in step S5b as a component indicating a change in skin temperature reflecting brain activity (step S7). That is, the component determined in step S7 to indicate the change in skin temperature reflecting the brain activity is a component in which the candidate component extracted by performing singular value decomposition and analysis on the face skin temperature data based on the temperature conversion data and the candidate component extracted by performing singular value decomposition and analysis on the face skin temperature data based on the relative temperature conversion data match each other. In step S6, it is determined that the candidate component that does not match the two analyses is not a component indicating a change in skin temperature reflecting brain activity.
The estimation unit 33 estimates the human brain activity from the components identified by the analysis unit 32 as components representing the change in skin temperature reflecting the human brain activity. Specifically, the estimation unit 33 estimates the brain activity amount at the time of acquiring the facial skin temperature data, based on the component waveform data of the component specified by the analysis unit 32.
(4-1-1) modification 1A
In the above embodiment, the brain activity estimation unit 30 includes the conversion unit 31, and the conversion unit 31 generates the facial skin temperature data based on the relative temperature conversion data. The analysis unit 32 then decomposes, by singular value decomposition, not only the facial skin temperature data based on the temperature conversion data acquired by the facial skin temperature acquisition unit 20 but also the facial skin temperature data based on the relative temperature conversion data into a plurality of components, and analyzes each component.
Alternatively, the brain activity estimation unit 30 may not have the conversion section 31. In this case, it is possible to omit the processes of generating face skin temperature data based on the relative temperature conversion data and analyzing data based on the face skin temperature data corresponding to the relative temperature conversion data.
However, in order to identify the components related to human brain activity with high accuracy, as described in the above embodiment, it is preferable that the brain activity estimation unit 30 has the conversion unit 31, and that the analysis unit 32 decomposes, by singular value decomposition, not only the facial skin temperature data based on the temperature conversion data acquired by the facial skin temperature acquisition unit 20 but also the facial skin temperature data based on the relative temperature conversion data into a plurality of components and analyzes each component.
(4-1-2) modification 1B
The facial skin temperature acquisition unit 20 is an infrared thermal imaging device capable of acquiring temperature data without contacting an object.
However, the face skin temperature acquisition means is not limited to the infrared thermal imaging apparatus as long as it can detect the skin temperature of at least a part of the face of the individual and acquire the face skin temperature data including the detected temperature data and the position data of the detection portion thereof in time series.
For example, the facial skin temperature acquisition unit may be a device including a temperature sensor. Specifically, a temperature sensor may be attached to a predetermined site of the face of the individual, and time-series facial skin temperature data may be acquired based on the temperature data detected by the temperature sensor and the position data of the site to which the temperature sensor is attached. In this way, even when facial skin temperature data is acquired with a temperature sensor in contact with the subject, the temperature sensor, unlike brain wave electrodes and the like, requires no preparation before attachment, so data can be acquired more easily than with conventional detection methods such as brain wave measurement, magnetic resonance imaging, and near infrared spectroscopy. Thus, human brain activity can be estimated easily.
(4-2) brain activity estimation unit 130 that estimates brain activity from captured image data of the face
Fig. 21 is a schematic view of a brain activity visualization device 110 according to an embodiment of the present invention. Fig. 22 is a flowchart showing an example of a process flow when the brain activity visualization device 110 specifies a component representing RGB changes of a face reflecting brain functions.
The brain activity estimation unit 130 included in the brain activity visualization device 110 estimates the brain activity of an individual (examinee) from captured image data of the face of the individual. As shown in fig. 21, the brain activity visualization device 110 includes an image data acquisition unit 120, a brain activity estimation unit 130, and a state visualization unit 200.
The image data acquisition unit 120 acquires captured image data of at least a part of the face of a person in time series (step S101). The image data acquisition unit 120 is not particularly limited as long as it is a device having at least an imaging device, and examples thereof include a mobile terminal with an internal imaging device such as a smartphone or a tablet computer (for example, iPad: registered trademark). Here, as shown in fig. 21, the image data acquisition unit 120 has a camera 121 as a photographing device and a storage section 122. The camera 121 is used to acquire captured image data of a face of a person in time series. Here, the camera 121 captures a moving image of the entire face of the person, thereby acquiring captured moving image data. The storage unit 122 stores time-series captured image data captured by the imaging device. Here, the storage unit 122 stores the moving image data acquired by the camera 121.
Note that, although the entire face is imaged by the camera 121, the present invention is not limited to this, and it is sufficient to capture an image of the face including at least an image of the forehead and/or the periphery of the paranasal sinuses.
Here, the problem of activation of brain function is given to the individual for a certain period of time while the image data acquisition unit 120 is acquiring the time-series captured image data of the face. That is, the captured image data acquired with the image data acquisition unit 120 includes data during which a brain function activation question is provided to an individual. Further, as the problem of activation of a brain function provided to an individual, there is no particular limitation as long as it is assumed that the human brain can be activated, and for example, the content of the problem of activation of a brain function can be appropriately determined according to the purpose of use of the brain activity visualization device 110.
The brain activity estimation unit 130 estimates the brain activity of a human from the time-series captured image data of the face acquired by the image data acquisition unit 120. Specifically, as shown in fig. 21, brain activity estimation section 130 includes RGB processing section 131, conversion section 132, blood circulation volume calculation section 133, analysis section 134, and estimation section 135. In fig. 21, the brain activity estimation means 130 is shown as a single device having the RGB processing unit 131, the conversion unit 132, the blood circulation amount calculation unit 133, the analysis unit 134, and the estimation unit 135, but the present invention is not limited to this, and the brain activity estimation means 130 may be a device in which the RGB processing unit 131, the conversion unit 132, the blood circulation amount calculation unit 133, the analysis unit 134, and the estimation unit 135 are independent or partially independent from each other. Here, the facial blood circulation amount acquisition means is constituted by the image data acquisition means 120, the RGB processing unit 131, the conversion unit 132, and the blood circulation amount calculation unit 133.
The RGB processing unit 131 performs RGB processing that decomposes the captured image data acquired by the image data acquisition unit 120 into three color components: an R component, a G component, and a B component (step S102). Although RGB processing may be performed on the captured image data of the entire face, here, in order to reduce the amount of arithmetic processing and the noise, data around the forehead and/or the sinus is extracted from the captured image data, and only the extracted data is subjected to RGB processing.
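As an illustration, the RGB processing of step S102 can be sketched as extracting a forehead/sinus region of interest from each frame and splitting it into its three color components. This is a minimal sketch: the ROI coordinates, array layout, and function name are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def rgb_process(frame, roi):
    """Decompose one captured frame into R, G and B components for a
    facial region of interest (forehead and/or around the sinus).

    frame: H x W x 3 array (one frame of the captured moving image).
    roi:   (top, bottom, left, right) bounds of the extracted region;
           the bounds are illustrative, the patent gives no coordinates.
    """
    top, bottom, left, right = roi
    patch = frame[top:bottom, left:right].astype(np.float64)
    # Split into the three color components handled by the RGB processing unit 131.
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    return r, g, b
```

Restricting the RGB split to the extracted region, rather than the whole frame, is what reduces the arithmetic load as described above.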
The conversion unit 132 converts the RGB data of the captured image data obtained by the RGB process into relative RGB data (step S103). Specifically, the conversion unit 132 converts the RGB data into the relative RGB data while taking the average value of the RGB data obtained from the captured image data acquired every predetermined time (for example, 30 seconds) as a reference value.
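The conversion performed by the conversion unit 132 can be sketched as follows, assuming the relative RGB data is the deviation of each frame's RGB values from the mean over its predetermined period; the patent specifies the per-period average as the reference value but not the exact arithmetic, so the subtraction form is an assumption.

```python
import numpy as np

def to_relative_rgb(rgb_series, frames_per_window):
    """Convert a time series of mean RGB values to relative RGB data.

    rgb_series: T x 3 array of per-frame RGB values for the region.
    frames_per_window: frames per averaging period (e.g. 30 s of video).
    Each window's mean RGB is taken as the reference value for that
    window, mirroring the conversion unit 132.
    """
    rel = np.empty_like(rgb_series, dtype=np.float64)
    for start in range(0, len(rgb_series), frames_per_window):
        window = rgb_series[start:start + frames_per_window]
        ref = window.mean(axis=0)          # reference value for this period
        rel[start:start + len(window)] = window - ref
    return rel
```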
The blood circulation amount calculation unit 133 calculates time-series blood circulation amount data of the face from RGB data of the captured image data obtained by the RGB process (step S104).
The analysis unit 134 decomposes the time-series relative converted blood circulation volume data into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis (step S105). Here, the analysis unit 134 performs singular value decomposition on the relative converted blood circulation volume data using SVD of MATLAB (registered trademark) as an analysis tool. Specifically, the singular value decomposition is performed by setting the time-series relative converted blood circulation volume data as the target, setting the factor as the time data for each predetermined period (for example, 30 seconds), and setting the metric as the relative converted blood circulation volume data for each pixel calculated from the relative RGB data for each period. Next, the time-series relative converted blood circulation volume data is decomposed into a plurality of components by singular value decomposition, and then the time distribution, the spatial distribution, and the singular value indicating the size of each component are calculated.
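The singular value decomposition described above (performed with MATLAB's SVD in the patent) can be mirrored in NumPy. The matrix below is random stand-in data: in the device, the rows would be the time data for each predetermined period (the factor) and the columns the per-pixel relative converted blood circulation volume data (the measure).

```python
import numpy as np

# Stand-in for time-series relative converted blood circulation volume data:
# 20 time periods x 50 pixels.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# For each decomposed component:
time_dist = U * s            # time distribution (component waveform, per column)
space_dist = Vt              # spatial distribution (per-pixel weights, per row)
singular_values = s          # singular value indicating the size of each component
```

The components with the largest singular values capture the largest sources of variation in the data, which is why later sections filter them against conditions tied to the brain function activation task.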
In addition, in order to specify a component representing RGB changes of the face reflecting the brain activity from among the plurality of components decomposed by the singular value decomposition, the analysis unit 134 determines whether or not each component satisfies a predetermined condition (step S106). Here, the predetermined condition includes, for example, a condition (hereinafter, referred to as a first condition) in which the amplitude of the component waveform of the component decomposed by singular value decomposition has a correlation with a change in the brain activation and the brain deactivation, a condition (hereinafter, referred to as a second condition) in which the amount of blood circulation in a predetermined portion of the human face among the components decomposed by singular value decomposition changes, and the like. As the predetermined condition to be determined by the analysis unit 134, one or more conditions may be set, and here, the first condition is set as the predetermined condition.
Next, the analysis unit 134 extracts, from among the plurality of components, those satisfying the predetermined condition as components for determination. The analysis unit 134 then identifies the extracted components for determination as components indicating RGB changes of the face that reflect brain activity (step S107). On the other hand, the analysis unit 134 determines that any component failing to satisfy at least one of the predetermined conditions is not a component indicating RGB changes of the face reflecting brain activity (step S108).
Here, as described above, only one condition (the first condition) is set as the predetermined condition, and the brain function activation task is provided to the individual for a fixed period within the period during which the time-series captured image data of the face is acquired. Accordingly, the analysis unit 134 treats the period during which the task is not provided as the brain deactivation time and the period during which the task is provided as the brain activation time, and compares and analyzes the component waveform of each component against these periods. Next, using the result of this comparison analysis based on the component waveform data, the analysis unit 134 evaluates whether the component waveform of each component has a correlation with the brain deactivation time and the brain activation time, extracts the components evaluated as having a correlation as determination components satisfying the predetermined condition, and identifies them as components indicating RGB changes of the face that reflect brain activity. On the other hand, the analysis unit 134 determines that components evaluated as having no correlation do not satisfy the predetermined condition and are not components indicating RGB changes of the face reflecting human brain activity.
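As one sketch of this comparison analysis, a component waveform can be compared between the brain activation time (task provided) and the brain deactivation time (no task). How the resulting difference is judged to constitute a correlation is left to experiment in the patent, so this sketch only reports the quantities to be judged.

```python
import numpy as np

def compare_activation_periods(component_waveform, task_on):
    """Compare a component waveform between brain activation periods
    (task provided) and brain deactivation periods (no task).

    component_waveform: 1-D array, the component's time distribution.
    task_on: 1-D 0/1 array marking when the brain function activation
             task was provided.
    Returns the mean amplitude in each period and their difference;
    the criterion applied to the difference is device-specific.
    """
    wave = np.asarray(component_waveform, dtype=float)
    on = np.asarray(task_on, dtype=bool)
    mean_on = wave[on].mean()
    mean_off = wave[~on].mean()
    return mean_on, mean_off, mean_on - mean_off
```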
Here, the analysis unit 134 extracts the determination component based on the brain function activation task having been provided to the individual for a certain period while the time-series captured image data of the face was acquired; however, the content of the first condition, that is, the means by which the analysis unit 134 extracts the determination component, is not limited to this. For example, when a component whose waveform correlates with the brain deactivation time and the brain activation time has been identified in advance among the plurality of components by experiment or the like, the analysis unit 134 extracts that identified component from the plurality of components as the determination component. In addition, when the brain activity visualization device 110 detects a human motion known to relate to brain activation/deactivation, such as eye movement or blinking, the analysis unit 134 may extract the determination component from the plurality of components by comparing, analyzing, and evaluating the detection result against the component waveform of each component. The criterion by which the analysis unit 134 determines whether the first condition is satisfied is determined appropriately by simulation, experiment, theoretical calculation, or the like, according to the purpose of use of the brain activity visualization device 110.
When the second condition is set as the predetermined condition, the analysis unit 134 extracts the determination component based on whether or not there is a change in the blood circulation amount of the face at the predetermined portion of the face of the human being. Specifically, the analysis unit 134 determines whether or not the blood circulation volume around the sinus and/or the forehead portion has changed based on the blood circulation volume distribution map corresponding to the plurality of components decomposed by the singular value decomposition, and determines that the components satisfy the second condition when the blood circulation volume has changed. On the other hand, when the amount of blood circulation around the sinus and/or the forehead portion does not change, the analysis unit 134 determines that the component does not satisfy the second condition. The criterion for determining whether or not the second condition is satisfied by the analysis unit 134 is appropriately determined by simulation, experiment, theoretical calculation, or the like according to the purpose of use of the brain activity visualization device 110 or the like.
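A sketch of a second-condition check, under the assumption that the blood circulation volume distribution map is available as a per-pixel vector and that "change" is judged by the mean absolute weight within the forehead/sinus region; the actual criterion is, per the patent, set by simulation, experiment, or theoretical calculation, and the threshold here is purely illustrative.

```python
import numpy as np

def second_condition(space_dist, roi_mask, threshold=0.1):
    """Evaluate the second condition: does a component's blood circulation
    volume change around the sinus and/or the forehead?

    space_dist: per-pixel spatial distribution of one decomposed component
                (its blood circulation volume distribution map, flattened).
    roi_mask:   boolean mask marking the forehead/sinus pixels.
    threshold:  illustrative cutoff for "the blood circulation volume changed".
    """
    roi_change = np.abs(space_dist[roi_mask]).mean()
    return roi_change >= threshold
```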
In addition, when the blood circulation amount calculation unit 133 calculates blood circulation amount data based on the time series of RGB data converted to the relative RGB data, the analysis unit 134 may determine whether or not a plurality of components obtained by performing singular value decomposition or the like on the blood circulation amount data satisfy the first condition and/or the second condition, and extract the components for determination.
The estimation unit 135 estimates the human brain activity from the component identified by the analysis unit 134 as a component representing the RGB change of the face reflecting the human brain activity. Specifically, the estimation unit 135 estimates the amount of brain activity at the time of acquiring the captured image data of the face, based on the component waveform data of the component specified by the analysis unit 134.
(4-2-1) modification 2A
As described above, the camera 121 may be a mobile terminal with a built-in imaging device, such as a smartphone or a tablet computer (for example, iPad: registered trademark). That is, the captured image data may be data obtained by capturing an image in the visible light region.
In addition, the blood circulation amount calculation unit 133 may calculate blood circulation amount data of the face by mainly using the R component included in each pixel of the RGB data. The blood circulation volume data is not necessarily limited to the erythema index as long as the blood circulation volume data can be calculated from the RGB data.
(4-2-2) modification 2B
The blood circulation volume calculation unit 133 calculates the relative converted blood circulation volume data from the relative RGB data converted by the conversion unit 132; alternatively or additionally, blood circulation volume data may be calculated from the RGB data before conversion into relative RGB data. Components related to brain activity are likely to appear (detection sensitivity is high) in blood circulation volume data calculated from the RGB data before conversion, so, for example, such data can be analyzed ahead of the relative converted blood circulation volume data calculated from the relative RGB data. Further, the amount of arithmetic processing can be reduced by, for example, first analyzing the blood circulation volume data to extract components having a significant correlation, and then analyzing, for the relative converted blood circulation volume data, only the components corresponding to those extracted.
(4-2-3) modification 2C
The camera 121 is a normal camera that assumes a visible light region, but an infrared camera may be used. In this case, infrared light is irradiated, and reflected waves of the infrared light are imaged by an infrared camera. Thereby, captured image data of a change in the face of the subject person can be obtained. The inventors have confirmed the following: the blood circulation volume data calculated from the captured image data obtained by the reflection of infrared rays has a correlation with the blood circulation volume data calculated mainly using the R component included in each pixel of the RGB data captured in the visible light region. Therefore, even if the captured image data obtained by the reflection of the infrared rays is used, the brain activity of the human can be estimated.
(4-2-4) modification 2D
In the above description, the brain activity visualization device 110 includes the image data acquisition unit 120 and the brain activity estimation unit 130, but the brain activity visualization device according to the present embodiment is not limited to this form. That is, as long as it includes the blood circulation amount calculation unit 133, the analysis unit 134, and the estimation unit 135, the rest of its configuration may take any form. Specifically, the brain activity visualization device according to the present embodiment includes not only a mode in which the device itself captures image data but also a mode in which captured image data is received from an external device and analyzed.
(4-3) State visualization unit 200
The state visualization unit 200 visualizes the physiological state of the subject person based on the brain activity of the subject person estimated by the brain activity estimation unit 30 and/or the brain activity estimation unit 130. For example, the state visualization unit 200 may include an analysis unit 201 that analyzes the physiological state of the subject by analyzing a change in the brain activity amount of the subject. Specifically, the analysis unit 201 analyzes a change in the brain activity amount corresponding to a stimulus (visual stimulus, auditory stimulus, tactile stimulus, olfactory stimulus, gustatory stimulus, or the like) given to the subject person to determine the physiological state of the subject person. Further, as to the kind or level of the physiological state, appropriate setting may be made based on the degree and/or duration of the rise in the brain activity amount and according to the use of the brain activity visualization device 10, 110. Next, the physiological state of the subject person analyzed by the analysis unit 201 is output from the display unit 202 of the state visualization unit 200 to the administrator, so that the administrator can understand the physiological state of the subject person. The display unit 202 may be any device that can provide the administrator with information about the analyzed physiological state of the subject person visualized, such as a display device that displays an image or a message.
When the analysis units 32 and 134 specify the components reflecting the brain activities and then the facial skin temperature acquisition unit 20 and/or the image data acquisition unit 120 acquire time-series data of various types, the brain activity visualization device 10 or 110 further decomposes the acquired data of various types into a plurality of components by singular value decomposition and analyzes only the specified components, thereby making it possible to understand the physiological state of the subject person in real time.
Further, although techniques exist for acquiring heartbeat information and biological information of a subject from the facial skin temperature or from captured images of the face, such conventional techniques can acquire heartbeat information and biological information with higher accuracy when applied to the components obtained by performing singular value decomposition or the like on the various data acquired by the facial skin temperature acquisition unit 20 and/or the image data acquisition unit 120. Therefore, the analysis unit 32 and/or the analysis unit 134 may have a function of analyzing the plurality of components obtained by singular value decomposition to acquire heartbeat information and biological information, and the estimation units 33 and 135 of the above-described embodiments may have a function of estimating the activity of the sympathetic/parasympathetic nerves from the acquired heartbeat information and biological information.
(5) Features
(5-1)
In the present embodiment, the brain activity of a human being is estimated from the time-series facial skin temperature data and/or facial blood circulation volume data acquired by the facial skin temperature acquisition unit 20 and/or the image data acquisition unit 120. Therefore, it is possible to estimate the brain activity of a human without installing a sensor such as a brain wave electrode that needs to be processed before installation. Therefore, it is possible to easily estimate the brain activity of the human and visualize the physiological state of the subject person from the estimated brain activity.
(5-2)
Here, suppose that while time-series skin temperature data and/or image data of the face are being acquired, a state in which the human brain is activated or deactivated is created by actually providing, or not providing, a brain function activation task to the person. Then a component whose waveform correlates with the brain activation time and the brain deactivation time can be considered a component highly likely to represent a change in skin temperature and/or blood circulation volume that reflects brain activity.
In the present embodiment, while the time-series skin temperature data and/or image data of the face are acquired by the facial skin temperature acquisition unit 20 and/or the image data acquisition unit 120, a brain function activation task is provided to the individual for a certain period. That is, in the present embodiment, a state in which the human brain is activated or deactivated is created by actually providing, or not providing, the brain function activation task to the individual. The various time-series data thus acquired are then decomposed into a plurality of components by singular value decomposition, the correlation between the component waveform of each component and the brain activation time and brain deactivation time is evaluated, and components having such a correlation are extracted from the plurality of components as determination components. Therefore, compared with, for example, extracting from the plurality of components a predetermined component identified in advance by experiment or the like, the possibility of extracting a component having little relation to human brain activity is reduced.
(5-3)
Here, the brain has a mechanism, the selective brain cooling mechanism, for cooling the brain independently of body temperature. The selective brain cooling mechanism is known to dissipate the heat generated by brain activity through the forehead and the area around the sinuses. Thus, changes in facial skin temperature associated with brain activity, and in the facial blood circulation volume correlated with that skin temperature, appear around the forehead and/or the sinuses.
In the present embodiment, various data around the forehead and/or the sinus are analyzed to extract a component for determination. Therefore, the components related to the human brain activity can be extracted with high accuracy.
(6) Application example of brain Activity visualization device (fatigue State determination device)
A fatigue state determination device to which the brain activity visualization device of the present invention is applied will be described.
(6-1) Structure of fatigue State determination device
Fig. 23 is a schematic diagram showing an example of the fatigue state determination device according to the present embodiment.
The fatigue state determination device 400 includes an input unit 410, an imaging unit 415, an output unit 420, a storage unit 430, and a processing unit 440.
The input unit 410 inputs various information to the fatigue state determination device 400. For example, the input unit 410 is constituted by a keyboard, a mouse, a touch panel, or the like. Various commands are input to the fatigue state determination device 400 via the input unit 410, and processing corresponding to the commands is executed in the processing unit 440.
The imaging unit 415 captures a "face image" of the subject person 300 that includes the face. For example, the imaging unit 415 is configured by a solid-state imaging device such as a CCD or CMOS sensor for acquiring RGB images, an infrared camera for acquiring thermograms, or the like. Using an infrared camera for the imaging unit 415 allows the fatigue state to be determined without being affected by the surrounding brightness. In particular, accidents due to fatigue are likely to occur at night; equipping the fatigue state determination device 400 according to the first embodiment with an infrared camera therefore enables monitoring of the fatigue state at night. The infrared camera is preferably one that can detect temperatures in the range of 29.0°C to 37.0°C with high sensitivity under normal room temperature conditions. The imaging unit 415 can perform continuous imaging at predetermined intervals. A face image is preferably captured from the front under constant illumination. When a frontal image cannot be obtained because of a change in posture, the three-dimensional shape of the face is estimated from the posture-varied image by the perturbation space method, and a frontal face image is then obtained by rendering. For images with illumination variation, a face image under a given illumination condition is obtained using an illumination basis model built by assuming a diffuse reflection model for the human face. The face images obtained by continuous imaging are then sent from the imaging unit 415 to the processing unit 440.
The output unit 420 outputs various information from the fatigue state determination device 400. For example, the output unit 420 is configured by a display, a speaker, and the like. Here, the brain function activation information described later is provided to the subject person 300 through the output unit 420.
The storage unit 430 stores information input to the fatigue state determination device 400, information calculated by the fatigue state determination device 400, and the like. For example, the storage unit 430 is configured by a memory, a hard disk device, and the like. The storage unit 430 stores programs for realizing the functions of the processing unit 440, which will be described later. Here, the storage unit 430 includes a brain function activation information database 431 and a reference information database 432.
The brain function activation information database 431 stores brain function activation information that activates human brain function. Here, working-memory-related information, i.e., information related to the working memory of the human brain, can be used as the "brain function activation information". The "working-memory-related information" is information posing problems that require memory and judgment, and includes: memory problems such as mental arithmetic problems, calculation problems, and N-back tasks; problems of making the most appropriate selection from a plurality of pieces of information, such as pictogram problems; and attention-switching problems such as mid-task switching problems. Here, the subject person 300 solves a mental arithmetic problem displayed on a display device or the like, which activates the brain of the subject person 300 so as to use working memory.
As shown in fig. 24, the reference information database 432 stores in advance, as "reference information", predetermined ranges of the change amount Δr (= r1 − r2) in association with "fatigue state levels", where Δr is the change of the correlation value r2 of a determination component for the brain function activation information relative to the "reference correlation value" r1 of a reference determination component for the brain function activation information, both extracted by the determination information generation unit 444 described later. The "reference determination component" is set from data of a determination component extracted before a predetermined action is performed, data of the determination component extracted the previous time, data of a determination component supplied from the outside, or the like. In the example shown in fig. 24, the reference information database 432 stores, according to the range of the value of the change amount Δr, the values from Δra to Δrb as the "normal state", the values from Δrb to Δrc as the "light fatigue state", and the values from Δrc to Δrd as the "fatigue state". Here, the values increase in the order Δra, Δrb, Δrc, Δrd. The data of the reference determination component is also stored in the reference information database 432.
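The lookup in the reference information database 432 can be sketched as a range test on Δr. The boundary values Δra < Δrb < Δrc < Δrd are device-specific settings, not values given in the patent, and the handling of out-of-range values is an assumption.

```python
def fatigue_level(delta_r, ra, rb, rc, rd):
    """Map the change amount delta_r = r1 - r2 to a fatigue state level
    using the ranges stored in the reference information database 432.
    ra < rb < rc < rd are the stored boundary values.
    """
    if ra <= delta_r < rb:
        return "normal state"
    if rb <= delta_r < rc:
        return "light fatigue state"
    if rc <= delta_r <= rd:
        return "fatigue state"
    return "out of range"   # behavior outside [ra, rd] is not specified
```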
The processing unit 440 executes information processing in the fatigue state determination device 400. Specifically, the processing unit 440 includes a CPU, a cache memory, and the like. The processing unit 440 executes the programs stored in the storage unit 430, thereby functioning as a brain function activation information providing unit 441, a face change information acquiring unit 442, a face change information decomposing unit 443, a determination information generating unit 444, and a fatigue state determining unit 445.
The brain function activation information providing section 441 provides brain function activation information. For example, the brain function activation information providing unit 441 reads the brain function activation information from the brain function activation information database 431 in accordance with the operation of the input unit 410, and outputs the brain function activation information to the output unit 420.
The face change information acquisition unit 442 acquires "face data" and "face change information" indicating the time-series change of the face data from the face images captured by the imaging unit 415. Specifically, the face change information acquisition unit 442 acquires the face data via the imaging unit 415 in synchronization with the timing at which the brain function activation information is provided by the brain function activation information providing unit 441. The face change information acquisition unit 442 then acquires, from the continuously acquired face data, face change information indicating the time-series change in the face data of the subject person 300. For example, when 60 frames of face data of 240 × 320 pixels are acquired at predetermined intervals, the face change information is a set of 4,608,000 data points. The acquired face change information is sent to the face change information decomposition unit 443. When the imaging unit 415 is an infrared camera, the face change information acquisition unit 442 acquires, as the face data, facial skin temperature data indicating the skin temperature of the face of the subject person 300. When the imaging unit 415 is a solid-state imaging device such as a CCD or CMOS sensor, the face change information acquisition unit 442 acquires, as the face data, facial blood circulation volume data based on RGB data of the face of the subject person 300. The face change information acquisition unit 442 may also acquire, as the face data, only data of the area around the sinus and/or the forehead of the subject person 300.
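The 4,608,000 figure follows directly from the example dimensions, 60 frames of 240 × 320-pixel face data:

```python
# Face change information is the set of all pixel values over the captured frames.
frames, height, width = 60, 240, 320
data_points = frames * height * width   # 4,608,000, as stated above
```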
The face change information decomposition unit 443 decomposes the face change information, which is a set of a large number of data points, into a plurality of components 1, 2, 3, … by singular value decomposition, principal component analysis, or independent component analysis. The information on each decomposed component is sent to the determination information generation unit 444. Here, when the face change information is subjected to singular value decomposition or the like, the components are numbered 1, 2, 3, … in descending order of singular value. The higher a component's singular value, the more strongly the component tends to reflect large sources of variation. Therefore, component 1 often reflects the influence of disturbances in the external environment rather than the influence of providing the brain function activation information.
The determination information generation unit 444 generates determination information from the face change information. Specifically, the determination information generation unit 444 extracts, from the plurality of components 1, 2, 3, …, the components related to the brain function activation information as "determination components", and generates the determination information from them. To do so, the determination information generation unit 444 calculates the correlation value r between each of the plurality of components 1, 2, 3, … obtained by the face change information decomposition unit 443 and the brain function activation information. When the calculated correlation value r is equal to or greater than a predetermined value, the determination information generation unit 444 sets the component corresponding to that correlation value r as a component related to the brain function activation information. The determination information generation unit 444 then extracts the determination component based on the value of the risk rate: that is, it extracts a component with a low risk rate as the determination component. The extracted determination component and the calculated correlation value r are sent to the storage unit 430 and/or the fatigue state determination unit 445.
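A sketch of this extraction, assuming Pearson correlation between each component waveform and a 0/1 timing signal for the brain function activation information. As a simplification, the candidate with the largest |r| is returned here, whereas the patent selects among candidates by the risk rate (statistical significance); the 0.5 threshold is illustrative.

```python
import numpy as np

def extract_determination_component(components, task_on, r_threshold=0.5):
    """Pick the determination component from the decomposed components.

    components: K x T array, one time waveform per component 1, 2, 3, ...
    task_on: length-T 0/1 array marking when the brain function
             activation information was provided.
    Components whose correlation value r reaches r_threshold are
    candidates; the candidate with the largest |r| is returned.
    """
    best = None
    for k, wave in enumerate(components):
        r = np.corrcoef(wave, task_on)[0, 1]
        if not np.isfinite(r):
            continue  # e.g. a constant waveform has no defined correlation
        if abs(r) >= r_threshold and (best is None or abs(r) > abs(best[1])):
            best = (k, r)
    return best  # (component index, correlation value r), or None
```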
The fatigue state determination unit 445 determines the fatigue state of the subject person 300 based on the determination information including the determination component. Specifically, the fatigue state determination unit 445 calculates the difference Δr between the reference correlation value r1 and the correlation value r2, where r1 is the correlation value between the brain function activation information and the reference determination component extracted before the predetermined action, and r2 is the correlation value between the brain function activation information and the determination component extracted after the predetermined action. The fatigue state determination unit 445 then specifies, based on the reference information stored in the reference information database 432, the fatigue state level corresponding to the difference Δr between the reference correlation value r1 and the current correlation value r2. The determined fatigue state level is output to a display device or the like via the output unit 420.
In the present embodiment, the "predetermined action" means a physical activity, an intellectual activity, or both. Examples of the physical activity include various kinds of physical work, such as line work in a factory and civil engineering work, and various kinds of exercise, such as bodybuilding, running, ball games, mountain climbing, and muscle training. Examples of the intellectual activity include learning, discussion, decision-making, situation judgment, and management and supervision.
(6-2) operation of fatigue State determination device
Fig. 25 is a flowchart showing the operation of the fatigue state determination device 400.
First, before the predetermined action, the "reference setting mode" is selected, and the reference determination component is extracted (S1). Specifically, an instruction to output brain function activation information is input to the fatigue state determination device 400 via the input unit 410. Next, the brain function activation information is read from the brain function activation information database 431 and output to the output unit 420 (S2). Here, a "mental arithmetic problem" is output as the brain function activation information.
Next, simultaneously with the output of the brain function activation information or at a predetermined timing, the image capturing unit 415 captures the face image of the subject person 300 positioned in front of the output unit 420 at predetermined intervals (S3). The captured face images are sent to the face change information acquisition unit 442.
Next, the face change information acquisition unit 442 acquires, from the captured face images, face change information indicating a time-series change in the face data of the subject person 300. The face change information is then decomposed into a plurality of components 1, 2, 3, … by the face change information decomposition unit 443 using singular value decomposition, principal component analysis, or independent component analysis (S4).
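As a rough illustration of this decomposition step (S4), the face change information can be treated as a frames-by-pixels matrix and decomposed by singular value decomposition, one of the three methods named above; the array shapes and function names below are illustrative assumptions, not part of the patent.

```python
import numpy as np

def decompose_face_change_info(face_change, n_components=3):
    """Decompose time-series face change data (n_frames x n_pixels)
    into components by singular value decomposition, one of the three
    methods named in the text (SVD / PCA / ICA)."""
    # Center each pixel's time series so components capture change only.
    centered = face_change - face_change.mean(axis=0)
    # Columns of U are component time-series waveforms; rows of Vt are
    # the corresponding spatial patterns; s gives component strengths.
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    return U[:, :n_components], s[:n_components], Vt[:n_components]

rng = np.random.default_rng(0)
face = rng.normal(size=(100, 50))          # 100 frames, 50 face pixels
waveforms, strengths, patterns = decompose_face_change_info(face)
print(waveforms.shape, patterns.shape)     # (100, 3) (3, 50)
```

The component time-series waveforms returned here are what the determination information generation unit would then correlate with the brain function activation information.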
Next, the determination information generation unit 444 calculates the correlation value between each of the components 1, 2, 3, … decomposed by the face change information decomposition unit 443 and the brain function activation information, and determines whether or not the correlation value is equal to or greater than a predetermined value (S5). When the correlation value is equal to or greater than the predetermined value, the determination information generation unit 444 determines that the brain function activation information and that component have a correlation (S5 — yes). Next, the determination information generation unit 444 extracts, from among the correlated components, a component having a low risk rate as the "reference determination component" (S6), and sets the correlation value between the reference determination component and the brain function activation information as the reference correlation value r1. The information on the reference determination component is stored in the storage unit 430 (S7). On the other hand, when the correlation values between the brain function activation information and each of the components 1, 2, 3, … are smaller than the predetermined value, the determination information generation unit 444 determines that there is no correlation between them and stores that information in the storage unit 430 (S5 — no, S7).
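Steps S5 and S6 can be sketched as follows, with the "predetermined value" taken as an illustrative threshold of 0.5; the patent's additional filtering by risk rate (statistical significance) is noted in a comment rather than implemented.

```python
import numpy as np

def extract_correlated_components(components, activation_waveform, r_min=0.5):
    """Return (index, r) for each component whose correlation value r
    with the brain function activation waveform is at least r_min (S5).
    The patent further selects, among these, a component with a low
    risk rate (significance level) as the determination component (S6);
    that statistical test is omitted from this sketch."""
    hits = []
    for idx in range(components.shape[1]):
        r = np.corrcoef(components[:, idx], activation_waveform)[0, 1]
        if abs(r) >= r_min:
            hits.append((idx, r))
    return hits

t = np.linspace(0.0, 1.0, 100)
activation = np.sin(2 * np.pi * 3 * t)        # stand-in activation pattern
rng = np.random.default_rng(1)
comps = np.column_stack([
    activation + 0.1 * rng.normal(size=100),  # component tracking activation
    rng.normal(size=100),                     # unrelated noise component
])
hits = extract_correlated_components(comps, activation)
print(hits)
```

Only the first component, which tracks the activation waveform, should clear the threshold; the pure-noise component does not.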
Subsequently, the subject person 300 performs the predetermined action (S8). Then, the "fatigue state determination mode" is selected in the fatigue state determination device 400, and the fatigue state after the predetermined action is determined (S9). First, the same processing as in S2 to S6 is performed, and the correlation value r2 between the determination component extracted from the face change information and the brain function activation information is calculated (S10 to S14).
Next, the fatigue state determination unit 445 calculates the change amount Δr, which is the difference between the reference correlation value r1 and the correlation value r2, where r1 is the correlation value between the brain function activation information and the reference determination component extracted before the predetermined action, and r2 is the correlation value between the brain function activation information and the determination component extracted after the predetermined action (S15). Then, the fatigue state determination unit 445 determines whether or not the change amount Δr of the correlation value r2 from the reference correlation value r1 before and after the predetermined action is within a predetermined range, based on the reference information stored in the reference information database 432. When the change amount Δr is within the predetermined range, the fatigue state determination unit 445 determines that the subject person 300 is in the "normal state" (S15 — yes, S16). On the other hand, when the change amount Δr is not within the predetermined range, the fatigue state determination unit 445 determines that the subject person 300 is in the "fatigue state" (S15 — no, S17). For example, the subject is determined to be normal when the change amount Δr is within the range from Δra to Δrb, and to be in a fatigue state when it exceeds Δrb. These results are output as the determination result to a display device or the like via the output unit 420 (S18).
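The decision in S15 to S17 reduces to a range check on the change amount Δr. A minimal sketch, assuming an illustrative allowed range [Δra, Δrb] = [0.0, 0.10] (the actual range comes from the reference information database 432):

```python
def judge_fatigue_state(r1, r2, delta_ra=0.0, delta_rb=0.10):
    """Compare the change amount delta_r = r1 - r2 of the correlation
    value across the predetermined action with the allowed range
    [delta_ra, delta_rb]; inside the range -> normal state (S16),
    outside -> fatigue state (S17). The range bounds are illustrative."""
    delta_r = r1 - r2
    return "normal state" if delta_ra <= delta_r <= delta_rb else "fatigue state"

# Using the average correlation values reported in the verification
# experiment of section (6-5): 0.84 before and 0.69 after the action.
print(judge_fatigue_state(0.84, 0.69))  # delta_r = 0.15 exceeds 0.10 -> fatigue state
```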
(6-3) characteristics of fatigue State determination device
(6-3-1)
As described above, the fatigue state determination device 400 according to the present embodiment includes the brain function activation information providing unit 441, the face change information acquisition unit 442, the face change information decomposition unit 443, the determination information generation unit 444, and the fatigue state determination unit 445. The brain function activation information providing unit 441 provides the subject person 300 with "brain function activation information" for activating a human brain function. The face change information acquisition unit 442 acquires "face change information" indicating a time-series change in the face data of the subject person 300. The face change information decomposition unit 443 decomposes the face change information into a plurality of components 1, 2, 3, … by singular value decomposition, principal component analysis, or independent component analysis. The determination information generation unit 444 extracts components related to the brain function activation information from the plurality of components 1, 2, 3, … as "determination components". The fatigue state determination unit 445 determines the fatigue state of the subject person 300 based on the determination components.
Therefore, in the fatigue state determination device 400 according to the present embodiment, the determination component related to the brain function activation information is extracted from the plurality of components 1, 2, 3, … obtained by applying singular value decomposition, principal component analysis, independent component analysis, or the like to the face change information. It is thus possible to easily estimate the presence or absence of brain activity of the subject person 300 without using electrodes or the like that require preprocessing before attachment. As a result, the fatigue state of the subject person 300 can be easily determined based on the determination component corresponding to the brain function of the subject person 300.
The fatigue state determination device 400 according to the present embodiment may be incorporated in a smart device. Thereby, the fatigue state determination can be easily performed at an arbitrary place.
(6-3-2)
In addition, in the fatigue state determination device 400 according to the present embodiment, the face change information acquisition unit 442 acquires data of the area around the paranasal sinuses and/or the forehead of the subject person 300 as the face data, so the determination component related to brain activity can be extracted with high accuracy. The brain has a mechanism, known as the selective brain cooling system, for cooling itself independently of body temperature, and this mechanism is known to dissipate heat generated by brain activity through the area around the paranasal sinuses and the forehead. Therefore, by analyzing the data of these parts, components related to brain activity can be extracted with high accuracy. As a result, the fatigue state determination device 400 according to the present embodiment can perform the fatigue state determination with high accuracy.
(6-3-3)
In the fatigue state determination device 400 according to the present embodiment, the face change information acquisition unit 442 acquires, as the face data, facial skin temperature data indicating the skin temperature of the face of the subject person 300. That is, the fatigue state determination device 400 can determine the fatigue state using an infrared camera or the like. For example, as shown in Fig. 26, by using an infrared camera as the imaging unit 415, the fatigue state can be determined without being affected by the surrounding brightness.
(6-3-4)
In the fatigue state determination device 400 according to the present embodiment, the face change information acquisition unit 442 acquires, as the face data, the face blood circulation amount data based on the RGB data of the face of the subject person 300. That is, the fatigue state determination device 400 can determine the fatigue state using a solid-state imaging device (CCD, CMOS). Thus, the fatigue state can be determined with a simple configuration.
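The patent does not fix a formula for deriving blood circulation amount data from RGB data; one common proxy in remote photoplethysmography is the mean green-channel intensity over the face region, which this sketch assumes.

```python
import numpy as np

def blood_circulation_signal(rgb_frames):
    """Produce a per-frame blood circulation proxy from RGB face frames
    of shape (n_frames, height, width, 3). The green channel is used as
    a stand-in for blood volume; this choice is an assumption, not the
    patent's specified computation."""
    green = rgb_frames[..., 1].astype(float)
    # Average the green channel over all face pixels in each frame.
    return green.reshape(green.shape[0], -1).mean(axis=1)

frames = np.zeros((4, 8, 8, 3))
frames[..., 1] = np.arange(4).reshape(4, 1, 1)   # green rises frame by frame
print(blood_circulation_signal(frames))          # [0. 1. 2. 3.]
```

The resulting time series plays the role of the face data whose change is decomposed into components in step S4.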
(6-3-5)
In the fatigue state determination device 400 according to the present embodiment, the determination information generation unit 444 extracts the determination component based on the value of the risk rate. Since the determination component related to the brain function activation information is extracted based on the value of the risk rate, the reliability of the fatigue state determination can be improved.
(6-3-6)
In addition, in the fatigue state determination device 400 according to the present embodiment, the brain function activation information providing unit 441 provides information related to working memory as the brain function activation information. For example, since mental arithmetic problems and the like are provided as the brain function activation information, components related to the brain function can be extracted. As a result, the fatigue state of the subject person 300 can be easily determined.
As the information related to working memory, the work content itself may also be used as the brain function activation information. For example, as shown in Fig. 27, in desk work using a PC or the like, face data is acquired at predetermined timings from the start to the end of the work. Then, by calculating the correlation value r between the work displayed on the PC screen and the determination component acquired from the face data as needed, the fatigue state corresponding to the progress of the work can be determined.
(6-3-7)
The fatigue state determination device 400 according to the present embodiment includes a reference information database 432. The reference information database 432 stores, as "reference information", predetermined ranges of the change amount Δr in association with fatigue state levels, where Δr is the amount by which the correlation value r2 of the determination component calculated with respect to the brain function activation information changes from the reference correlation value r1 of the reference determination component calculated with respect to the brain function activation information. The fatigue state determination unit 445 calculates the correlation value r2 between the determination component and the brain function activation information, and determines the fatigue state level of the subject person 300 based on the calculated correlation value r2 and the reference information.
With the above configuration, the fatigue state determination device 400 can easily determine the fatigue state level using the reference determination component obtained before the predetermined action. That is, the fatigue state determination device 400 can not only determine whether or not the subject is in a fatigue state but also easily determine and output the fatigue state level.
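The pairing of Δr ranges with fatigue state levels in the reference information database can be sketched as a simple range table; the ranges and level labels below are illustrative assumptions, not values from the patent.

```python
# Illustrative reference information: (lower bound, upper bound, level).
REFERENCE_INFO = [
    (0.00, 0.05, "normal"),
    (0.05, 0.15, "mild fatigue"),
    (0.15, 1.00, "fatigue"),
]

def fatigue_state_level(r1, r2, reference_info=REFERENCE_INFO):
    """Map the change amount delta_r = r1 - r2 to a fatigue state level
    by finding the reference-information range that contains it."""
    delta_r = r1 - r2
    for lower, upper, level in reference_info:
        if lower <= delta_r < upper:
            return level
    return "out of range"

print(fatigue_state_level(0.84, 0.69))
```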
(6-3-8)
The fatigue state determination method according to the present embodiment does not necessarily require the fatigue state determination device 400. That is, regardless of whether or not the fatigue state determination device 400 is provided, the fatigue state determination method of the present embodiment may include the steps of: a brain function activation information providing step of providing "brain function activation information" for activating a brain function of a human being to the subject 300 after the predetermined action; a face change information acquisition step of acquiring "face change information" indicating a time-series change in the face data of the subject person 300 after a predetermined action; a face change information decomposition step of decomposing the face change information into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis; a judgment component extraction step of extracting a component relating to brain function activation information from the plurality of components as a "judgment component"; and a fatigue state determination step of determining the fatigue state of the subject person 300 based on the determination component.
According to this fatigue state determination method, the determination component related to the brain function activation information is extracted from the plurality of components obtained by applying singular value decomposition, principal component analysis, or independent component analysis to the face change information acquired after the predetermined action. The influence of the predetermined action on the fatigue of the subject person 300 can thereby be determined easily.
(6-3-9)
In the above description, the determination component related to the brain function activation information is extracted from the face change information by singular value decomposition or the like, but the fatigue state determination device 400 according to the present embodiment is not limited to this form. For example, the fatigue state determination device 400 may determine the state of the subject person using arbitrary determination information other than a determination component generated from the face change information. To generate such determination information, an arbitrary method other than singular value decomposition or the like may be applied to the face change information.
(6-4) modified example of fatigue State determination device
As shown in fig. 28, the fatigue state determination device 400 according to the present embodiment may also use a determination information providing device 500 or the like provided on a network.
Here, the determination information providing apparatus 500 includes a storage unit 530 and a processing unit 540.
The storage unit 530 has a reference information database 532, which has the same configuration as the reference information database 432. That is, the reference information database 532 stores, as the reference information, predetermined ranges of the change amount Δr of the correlation value r2 of the determination component calculated with respect to the brain function activation information from the reference correlation value r1 of the reference determination component calculated with respect to the brain function activation information, in association with fatigue state levels.
The processing unit 540 transmits the reference information stored in the reference information database 532 in response to a request from the fatigue state determination device 400. The processing unit 540 may also have a function of generating, as big data, reference information from predetermined information, independently of the determination components extracted by the fatigue state determination device 400. When the fatigue state determination device 400 calculates the reference correlation value r1, the processing unit 540 updates the reference correlation value r1 stored in the reference information database 532 as needed.
In the present modification, the fatigue state determination unit 445 requests the determination information providing apparatus 500 to provide the reference information. Specifically, in the fatigue state determination device 400 according to the present modification, the reference information database 532 is stored in the determination information providing device 500 on the network, and the fatigue state determination unit 445 accesses the determination information providing device 500 when determining the fatigue state level. Next, the fatigue state determination unit 445 determines the fatigue state level of the target person 300 based on the calculated correlation value r2 and the reference information.
Therefore, in the fatigue state determination device 400 according to the present modification, the fatigue state determination unit 445 can determine the fatigue state level of the subject person 300 using the external network. Furthermore, since the fatigue state determination unit 445 performs the fatigue state determination using the reference determination component stored in the determination information providing device 500 on the external network, the operation before the predetermined action can be simplified. That is, the "reference setting mode" can be omitted and only the "determination mode" executed, as shown in Fig. 29. Here, steps V1 to V6 and V8 to V11 perform the same processing as steps S9 to S18. In step V7, the fatigue state determination device 400 requests the determination information providing device 500 to transmit the reference information. A part of the above steps may also be executed without using the fatigue state determination device 400.
Further, according to the method of the present modification, the fatigue state can be determined using big data. That is, the reference correlation value r1 and the predetermined change amount Δr are obtained from big data: a reference correlation value r1 calculated from reference determination components obtained by providing the brain function activation information to persons other than the subject person 300 is used. The reference information can thereby be optimized at any time.
(6-5) verification of fatigue State determination method
The verification experiment of the fatigue state determination method of the present embodiment was performed under the following conditions.
In the experiment, mental arithmetic problems were provided before and after an action involving predetermined physical activity by a physical worker, and face images were captured at those times. The mental arithmetic task was to subtract 7 successively, starting from 100. The imaging time was about 250 seconds before the action and about 180 seconds after the action. As a result, the correlation value Rpre with the brain function activation information before the action and the correlation value Rpost with the brain function activation information after the action were obtained for 11 subjects, as shown in Table 4 below. Since data were missing for one subject, the number of valid subjects is ten.
(Table 4) [Per-subject correlation values Rpre (before action) and Rpost (after action); the table is reproduced as an image in the original publication.]
Comparing the averages of the above results, the average of the correlation values Rpre before the action is 0.84, and the average of the correlation values Rpost after the action is 0.69. That is, the correlation value after the action is reduced relative to the correlation value before the action, indicating that the correlation value r between the determination component and the brain function activation information decreases across the action. In other words, it was confirmed that the fatigue state can be determined by comparing the correlation values before and after the action. For example, according to the fatigue state determination method of the present embodiment, the fatigue state before and after predetermined physical activity by a physical worker can be easily determined.
(attached note)
The present invention is not limited to the above embodiments. At the implementation stage, the constituent elements can be modified and embodied without departing from the gist of the invention. Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments; for example, some constituent elements may be removed from all the constituent elements described in the embodiments, and constituent elements of different embodiments may be combined as appropriate.
Industrial applicability of the invention
According to the present invention, since brain activity can be estimated easily, it is effective to apply the present invention to a brain activity visualization apparatus that visualizes a physiological state of a subject person based on brain activity.
Description of the symbols
300 subjects;
400 fatigue state determination means;
410 an input part;
415 an imaging unit;
420 an output unit;
430 a storage section;
431 a brain function activation information database;
432 a reference information database (reference information storage unit);
440 a processing section;
441 brain function activation information providing unit;
442 face change information acquisition section;
443 a face change information decomposition unit;
444 judgment information generating section (judgment component extracting section);
445 a fatigue state determination unit;
500 determining an information providing apparatus;
530 a storage section;
532 a reference information database (reference information storage unit);
540 a processing section.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open No. 2013-176406.

Claims (12)

1. A fatigue state determination device (400) is characterized by comprising:
a brain function activation information providing unit (441), wherein the brain function activation information providing unit (441) provides, to a subject person (300), brain function activation information that activates a brain function of a human; and
a fatigue state determination unit (445), the fatigue state determination unit (445) determining the fatigue state of the subject person based on face change information indicating a time-series change of the face data of the subject person when the brain function activation information is supplied,
further comprising a determination information generation unit (444), the determination information generation unit (444) generating determination information from the face change information,
further comprising a face change information decomposition section (443) that decomposes the face change information into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis,
the determination information generation unit extracts a component related to the brain function activation information from the plurality of components as a determination component, and generates the determination information from the determination component,
the determination information generation unit extracts the determination component based on a value of a risk rate,
further comprising a reference information storage unit (432), wherein the reference information storage unit (432) stores, as reference information, a predetermined range of the amount of change of the correlation value of the determination information calculated with respect to the brain function activation information relative to a reference correlation value, in association with a fatigue state level,
the fatigue state determination unit calculates a correlation value of the determination information with respect to the brain function activation information, and determines a fatigue state level of the subject person based on the calculated correlation value and the reference information.
2. The fatigue state determination device according to claim 1,
the brain function activation information providing section provides working memory-related information as the brain function activation information.
3. The fatigue state determination device according to claim 1 or 2,
further comprising a face change information acquisition unit (442), wherein the face change information acquisition unit (442) acquires data of the peripheral region of the sinus and/or the forehead region of the subject person as the face data.
4. The fatigue state determination device according to claim 1 or 2,
further included is a face change information acquisition section (442), the face change information acquisition section (442) acquiring, as the face data, face skin temperature data representing a skin temperature of the face of the subject person.
5. The fatigue state determination device according to claim 1 or 2,
further comprising a face change information acquisition section (442), the face change information acquisition section (442) acquiring, as the face data, face blood circulation amount data based on RGB data of the face of the subject person.
6. The fatigue state determination device according to claim 1 or 2,
wherein the reference information storage unit is stored in a determination information providing device (500) provided on a network, the reference information storage unit storing, as reference information, a predetermined range of the amount of change of the correlation value of the determination information calculated with respect to the brain function activation information relative to a reference correlation value, in association with a fatigue state level,
the fatigue state determination unit calculates a correlation value of the determination information for the brain function activation information, and determines the fatigue state level of the subject person based on the calculated correlation value and the reference information.
7. A fatigue state determination method, comprising:
a brain function activation information providing step of providing a subject person (300) with brain function activation information for activating a brain function of a human being after a predetermined action is performed on the subject person;
a face change information acquisition step of acquiring face change information indicating a time-series change in the face data of the subject person after the predetermined action; and
a fatigue state determination step of determining a fatigue state of the subject person based on the face change information,
further comprising a determination information generation step of generating determination information based on the face change information,
further comprising a face change information decomposition step of decomposing the face change information into a plurality of components by singular value decomposition, principal component analysis, or independent component analysis,
in the determination information generation step, a component related to the brain function activation information is extracted from the plurality of components as a determination component, and the determination information is generated from the determination component,
a reference information storage unit (432, 532) stores, as reference information, a predetermined range of the amount of change of the correlation value of the determination information calculated with respect to the brain function activation information relative to a reference correlation value, in association with a fatigue state level,
in the fatigue state determination step, a correlation value of the determination information with respect to the brain function activation information is calculated, and a fatigue state level of the subject person is determined based on the calculated correlation value and the reference information.
8. A fatigue state determination method according to claim 7,
before the predetermined action, the brain function activation information providing step, the face change information acquiring step, the face change information decomposing step, and the judgment information generating step are executed to generate the reference information.
9. A fatigue state determination method according to claim 7 or 8,
wherein the reference information storage unit is stored in a determination information providing device (500) on a network,
the fatigue state determination step accesses the determination information providing device when determining the fatigue state level.
10. A fatigue state determination method according to claim 9,
the reference correlation value is calculated by providing the brain function activation information to a person other than the subject person.
11. A fatigue state determination method according to any one of claims 7, 8, and 10,
the prescribed action is a mental activity or a physical activity.
12. A fatigue state determination method according to claim 9,
the prescribed action is a mental activity or a physical activity.
CN201780013890.1A 2016-02-29 2017-02-28 Fatigue state determination device and fatigue state determination method Active CN108712879B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016-038479 2016-02-29
JP2016038479 2016-02-29
PCT/JP2017/007989 WO2017150575A1 (en) 2016-02-29 2017-02-28 Fatigue state determination device and fatigue state determination method

Publications (2)

Publication Number Publication Date
CN108712879A CN108712879A (en) 2018-10-26
CN108712879B true CN108712879B (en) 2021-11-23

Family

ID=59744099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780013890.1A Active CN108712879B (en) 2016-02-29 2017-02-28 Fatigue state determination device and fatigue state determination method

Country Status (5)

Country Link
US (1) US11844613B2 (en)
EP (1) EP3424408B1 (en)
JP (1) JP6463392B2 (en)
CN (1) CN108712879B (en)
WO (1) WO2017150575A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6749278B2 (en) * 2017-04-14 2020-09-02 ダイキン工業株式会社 Physiological condition determination device
US11151726B2 (en) * 2018-01-10 2021-10-19 Canon Medical Systems Corporation Medical image processing apparatus, X-ray diagnostic apparatus, and medical image processing method
JP7386438B2 (en) * 2018-12-20 2023-11-27 パナソニックIpマネジメント株式会社 Biometric device, biometric method, computer readable recording medium, and program

Citations (5)

Publication number Priority date Publication date Assignee Title
US6092058A (en) * 1998-01-08 2000-07-18 The United States Of America As Represented By The Secretary Of The Army Automatic aiding of human cognitive functions with computerized displays
CN102271584A (en) * 2009-10-29 2011-12-07 松下电器产业株式会社 Organism fatigue evaluation device and organism fatigue evaluation method
WO2015175435A1 (en) * 2014-05-12 2015-11-19 Automotive Technologies International, Inc. Driver health and fatigue monitoring system and method
CN106793958A (en) * 2014-09-01 2017-05-31 大金工业株式会社 Brain activity estimation device
CN108135498A (en) * 2015-10-15 2018-06-08 大金工业株式会社 Useful information presentation device

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2702683B2 (en) 1995-07-20 1998-01-21 工業技術院長 Fatigue property evaluation device and fatigue measurement device
US20040210159A1 (en) * 2003-04-15 2004-10-21 Osman Kibar Determining a psychological state of a subject
JP4882420B2 (en) 2006-02-27 2012-02-22 トヨタ自動車株式会社 Awakening degree estimation apparatus and method
US8401261B2 (en) * 2007-09-25 2013-03-19 University Of Houston System Imaging facial signs of neuro-physiological responses
JP2011067284A (en) * 2009-09-24 2011-04-07 Aisin Seiki Co Ltd Fatigue examination apparatus
JP2011123653A (en) * 2009-12-10 2011-06-23 Sumitomo Rubber Ind Ltd Test device for driver's arousal level
US20110251493A1 (en) * 2010-03-22 2011-10-13 Massachusetts Institute Of Technology Method and system for measurement of physiological parameters
JP2013176406A (en) 2010-05-27 2013-09-09 Hitachi Ltd Brain function measuring device
US9642536B2 (en) * 2010-06-07 2017-05-09 Affectiva, Inc. Mental state analysis using heart rate collection based on video imagery
JP5668138B2 (en) * 2011-06-17 2015-02-12 株式会社日立製作所 Biological light measurement device
JP2012139562A (en) * 2012-04-26 2012-07-26 Hamamatsu Photonics Kk Blinking measuring instrument
JP6090653B2 (en) * 2012-12-20 2017-03-08 国立研究開発法人産業技術総合研究所 Fatigue determination device, fatigue determination method, and program thereof
US20140375785A1 (en) 2013-06-19 2014-12-25 Raytheon Company Imaging-based monitoring of stress and fatigue
JP6187122B2 (en) * 2013-10-10 2017-08-30 株式会社デンソー Fatigue measuring device
US20150164418A1 (en) * 2013-12-17 2015-06-18 Mayo Foundation For Medical Education And Research Cognitive performance assessment test
US10028669B2 (en) * 2014-04-02 2018-07-24 Massachusetts Institute Of Technology Methods and apparatus for physiological measurement using color band photoplethysmographic sensor
WO2016049757A1 (en) * 2014-10-01 2016-04-07 Nuralogix Corporation System and method for detecting invisible human emotion
GB201421785D0 (en) * 2014-12-08 2015-01-21 Oxehealth Ltd Method and apparatus for physiological monitoring
CN108135497B (en) 2015-10-15 2021-12-10 大金工业株式会社 Driver state determination device and driver state determination method


Also Published As

Publication number Publication date
JP2017153964A (en) 2017-09-07
JP6463392B2 (en) 2019-01-30
EP3424408B1 (en) 2022-05-11
US11844613B2 (en) 2023-12-19
CN108712879A (en) 2018-10-26
EP3424408A1 (en) 2019-01-09
EP3424408A4 (en) 2019-10-16
WO2017150575A1 (en) 2017-09-08
US20190059799A1 (en) 2019-02-28

Similar Documents

Publication Publication Date Title
CN108135491B (en) Physiological state determination device and physiological state determination method
US10842431B2 (en) Mental illness determination device
CN108697392B (en) Determination result output device, determination result providing device, and determination result output system
CN110520047B (en) Physiological state determination device
CN108135498B (en) Useful information presentation device
CN108135497B (en) Driver state determination device and driver state determination method
JP7111450B2 (en) Brain activity estimation device
CN108712879B (en) Fatigue state determination device and fatigue state determination method
JP6829363B2 (en) Evaluation device, market research device, and learning evaluation device
JP6158887B2 (en) Useful information presentation device
JP6093422B1 (en) Brain age presentation device
JP6096857B1 (en) Emotion judgment device
JP2017086992A (en) Feeling determination device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant