WO2023204218A1 - Cognitive ability estimation apparatus and program - Google Patents

Cognitive ability estimation apparatus and program

Info

Publication number
WO2023204218A1
WO2023204218A1 (PCT/JP2023/015517, JP2023015517W)
Authority
WO
WIPO (PCT)
Prior art keywords
cognitive ability
state
information
subject
driver
Application number
PCT/JP2023/015517
Other languages
French (fr)
Japanese (ja)
Inventor
幸男 篠崎
Original Assignee
幸男 篠崎
Application filed by 幸男 篠崎
Publication of WO2023204218A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02 Safety devices for propulsion-unit control responsive to conditions relating to the driver
    • B60K28/06 Safety devices for propulsion-unit control responsive to incapacity of driver
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/12 Limiting control by the driver depending on vehicle state, e.g. interlocking means for the control input for preventing unsafe operation
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to a cognitive ability estimation device and a program.
  • There are various types of moving objects that are driven or operated by people. Such moving objects are heavy, and if an accident occurs due to inappropriate driving or maneuvering, the damage is likely to be severe. In a car, for example, the driver's cognitive ability has been estimated and, based on the estimation results, stimulation has been applied to the driver as necessary to support safe driving (see, for example, Patent Document 1).
  • hereinafter, the moving object will be referred to as a car
  • the subject will be referred to as a driver
  • the operation for moving the moving object will be referred to as driving
  • the driver's cognitive ability can be estimated using video information from a camera that captures an image of the driver sitting in the driver's seat (for example, see Patent Document 1).
  • changes in the driver's visually observable feature values, such as the frequency of blinking, the range of movement of the line of sight, and its speed of movement, appear as a result of drowsiness. There is therefore a time lag between when sleepiness actually occurs and when a change in the feature values is confirmed. Because of this time lag, sleepiness has also been estimated from the state of the autonomic nerves (see, for example, Patent Document 2).
  • a driver who is estimated to be drowsy is given some kind of stimulation by an on-vehicle device to reduce his or her drowsiness (that is, to raise the level of alertness).
  • Stimuli include, for example, blowing air from an air conditioner and sound warnings.
  • drivers who notice their own drowsiness usually make a conscious effort to overcome it. Because of this awareness, sympathetic nerve activity may increase even though the person is sleepy. That is, an abnormal state may occur in which both the sympathetic and parasympathetic nerves are activated (see, for example, Patent Document 3).
  • An object of the present invention is to provide a cognitive ability estimation device that can more accurately estimate the cognitive ability of a subject who drives or controls a mobile object.
  • a cognitive ability estimation device includes: biological information acquisition means for acquiring biological information from which at least the heartbeat of a subject who drives or steers a mobile object can be identified; video information acquisition means for acquiring video information of the subject; state estimation means for estimating the state of the subject's autonomic nerves based on the heartbeat identified from the biological information; and cognitive ability estimation means for estimating the subject's cognitive ability based on the estimation result of the state estimation means and the video information.
  • the cognitive ability of a subject who drives or controls a mobile object can be estimated with higher accuracy.
  • FIG. 1 is a diagram illustrating an example of a mechanism by which a cognitive ability estimating device according to an embodiment of the present invention estimates a subject's cognitive ability, and an example of control performed according to the estimated cognitive ability.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of a cognitive ability estimation device according to an embodiment of the present invention.
  • FIG. 3 is a functional block diagram showing an example of the functional configuration implemented on a cognitive ability estimation device according to an embodiment of the present invention.
  • FIG. 4 is a flowchart showing an example of safe driving support processing.
  • FIG. 5 is a flowchart showing an example of safe driving support processing (continued).
  • FIG. 1 is a diagram illustrating an example of a mechanism in which a cognitive ability estimating device according to an embodiment of the present invention estimates a subject's cognitive ability, and an example of control performed according to the estimated cognitive ability.
  • the target person whose cognitive ability is estimated by the cognitive ability estimating device 1 is a person who drives or operates a mobile object. The mobile object is accordingly equipped with a power source, such as an engine, that enables movement.
  • the moving object is a car
  • the subject is a driver 3 who is sitting in the driver's seat 2 of the car. Therefore, unless otherwise specified, the description will be made assuming that the moving object is a car and the subject is the driver 3.
  • the moving object may be a train, a ship, an airplane, or the like.
  • the target person may be a pilot or the like.
  • biological information is information representing the heartbeat of the driver 3, or information that allows the heartbeat to be specified.
  • FIG. 1 shows a steering wheel 5 equipped with a pulse sensor and a smart watch 6 worn on the arm of a driver 3 as examples of devices that generate biological information.
  • the pulse sensor is, for example, of a capacitance, optical, or radio wave type, and processes the signal output from its sensing element to identify the heartbeat from the pulsation of blood flow.
  • the identification result is transmitted from the pulse sensor as biological information.
  • This biological information is input to the cognitive ability estimation device 1 via, for example, an ECU (Electronic Control Unit).
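As an illustration of the kind of processing such a pulse sensor performs, the following sketch identifies heartbeats from a pulsatile blood-flow waveform. All names and thresholds here are illustrative assumptions, not the sensor's actual algorithm; a real pipeline would add band-pass filtering and artifact rejection.

```python
import numpy as np

def detect_heartbeats(ppg, fs):
    """Locate heartbeat peaks in a pulse (PPG) waveform sampled at fs Hz
    and return the peak indices plus the mean heart rate in bpm.

    A peak is a local maximum above the signal mean; peaks closer than
    0.33 s (about 180 bpm) are assumed spurious and merged.
    """
    mean = ppg.mean()
    min_gap = int(0.33 * fs)
    peaks = []
    for i in range(1, len(ppg) - 1):
        if ppg[i] > mean and ppg[i] >= ppg[i - 1] and ppg[i] > ppg[i + 1]:
            if peaks and i - peaks[-1] < min_gap:
                # two close candidates: keep the taller one
                if ppg[i] > ppg[peaks[-1]]:
                    peaks[-1] = i
            else:
                peaks.append(i)
    if len(peaks) < 2:
        return peaks, None
    rr = np.diff(peaks) / fs          # inter-beat intervals in seconds
    return peaks, 60.0 / rr.mean()    # mean heart rate in bpm

# Synthetic 10 s pulse wave at 72 bpm (1.2 Hz), sampled at 100 Hz
fs = 100
t = np.arange(0, 10, 1 / fs)
peaks, bpm = detect_heartbeats(np.sin(2 * np.pi * 1.2 * t), fs)
```

The identified beat times (and the RR intervals derived from them) are what the device transmits downstream as biological information.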
  • the pulse sensor may be provided not on the steering wheel 5 but on the driver's seat 2.
  • the type of sensor, installation location, etc. are not particularly limited as long as the heartbeat can be identified.
  • the smart watch 6 is equipped with, for example, an optical pulse sensor.
  • the smart watch 6 can detect the pulsation of the blood flow of the driver 3 confirmed by its pulse sensor as a heartbeat, and can transmit the detection result as biological information.
  • the biological information wirelessly transmitted from the smart watch 6 is directly received by the cognitive ability estimation device 1, for example.
  • the camera 4 is installed inside the car to take an image of the face of the driver 3 sitting in the driver's seat 2.
  • the video information obtained by this imaging is transmitted to the cognitive ability estimating device 1 via the ECU, in the same way as for the pulse sensor provided on the steering wheel 5. The heart rate of the driver 3 can also be identified from the video information, as can breathing. Video information may therefore also be positioned as biological information.
  • the camera 7 and the radar 8 are devices that generate status information or generate information used to generate status information.
  • This status information is information representing the status of the automobile, which is a moving object.
  • the camera 7 is used, for example, to capture an image in the direction in which the automobile is traveling.
  • the video information obtained by the imaging is used, for example, to confirm the state of the road in the direction of travel of the vehicle.
  • the road conditions that are checked include the way the road curves, the presence or absence of lines drawn on the road, the presence or absence of obstacles, and the types of obstacles. All of these are identified by video analysis using video information. Therefore, the camera 7 is a device for generating state information.
  • the radar 8 is for measuring the distance to an object existing in the direction of travel, and for example, generates and outputs distance information representing the measured distance as state information. Therefore, the radar 8 is a device that can generate state information.
  • the status information is not limited to what can be generated by the camera 7 or the radar 8. Measured position information, vehicle speed information, steering angle of the steering wheel 5, and the like can also be used as the status information.
  • the state information to be used may be determined depending on the type of mobile object, its purpose, etc.
  • the distance information output from the radar 8 and the video information output from the camera 7 are processed by the corresponding ECU 9.
  • the ECU 9 recognizes from the video information, for example, the presence or absence of lines drawn on the road, the type of line, the presence or absence of objects in the direction of travel, and the types of those objects. Based on such recognition results, it determines whether the vehicle body is swaying, whether the vehicle body is protruding from the line, and so on. In combination with the distance information, it also determines whether the distance to another vehicle traveling ahead is appropriate, whether an obstacle exists, and whether the obstacle can be avoided. The ECU 9 then outputs, for example, the video information, each recognition result, and each determination result to the cognitive ability estimation device 1 as state information.
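The lane-related determinations described above can be sketched as follows, assuming the video analysis has already produced the vehicle's lateral offset from the lane center; the function name and thresholds are illustrative, not the patent's actual method.

```python
def assess_lane_keeping(lateral_offsets_m, lane_half_width_m=1.75,
                        sway_threshold_m=0.3):
    """Judge swaying and lane protrusion from the vehicle's lateral
    offset from the lane center (metres) over a short time window.

    Thresholds are illustrative; a production ECU would fuse many cues.
    """
    # Protrusion: any sample beyond the (assumed) lane half-width
    protruding = any(abs(x) > lane_half_width_m for x in lateral_offsets_m)
    # Swaying: standard deviation of the offset above a threshold
    mean = sum(lateral_offsets_m) / len(lateral_offsets_m)
    var = sum((x - mean) ** 2 for x in lateral_offsets_m) / len(lateral_offsets_m)
    swaying = var ** 0.5 > sway_threshold_m
    return {"protruding": protruding, "swaying": swaying}
```

Such boolean determinations, alongside the raw offsets, would be examples of the state information passed to the cognitive ability estimation device 1.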
  • in this way, the state information (distance information) generated by the radar 8 is output in the form of processing results that make use of that information.
  • the video information and status information received by the cognitive ability estimation device 1 are used for the video analysis process in step S1.
  • This video analysis process is a process for evaluating multiple indicators that can be used to estimate cognitive ability.
  • the types of multiple indicators and their combinations are not particularly limited, but it is necessary to use at least drowsiness. This is because sleepiness has a very large effect on cognitive ability.
  • fatigue and attention may be used, for example, as in Patent Document 1.
  • it is assumed that the plurality of indicators are sleepiness, fatigue, and attentiveness.
  • as the evaluation method for each index, well-known methods can be adopted; specifically, for example, the method described in Patent Document 1 may be used. A detailed explanation is therefore omitted here.
  • the drowsiness of the driver 3 is generally estimated by focusing on eye movements, specifically the blink frequency, eyeball movement, and the like. From the eye movements, the speed and range of line-of-sight movement, among other things, can be checked.
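As a minimal illustration of the blink-frequency part of this evaluation, the sketch below assumes the video analysis stage already yields a per-frame eye-open flag (how that flag is derived, e.g. from facial landmarks, is outside this sketch).

```python
def blink_rate_per_minute(eye_open_flags, fps):
    """Count blinks (open->closed transitions) in a sequence of
    per-frame eye-open booleans and convert to blinks per minute.
    """
    blinks = sum(1 for prev, cur in zip(eye_open_flags, eye_open_flags[1:])
                 if prev and not cur)
    duration_min = len(eye_open_flags) / fps / 60.0
    return blinks / duration_min if duration_min > 0 else 0.0

# 30 frames at 30 fps (1 s) containing two closed-eye episodes
flags = [True] * 10 + [False] * 2 + [True] * 10 + [False] * 2 + [True] * 6
rate = blink_rate_per_minute(flags, fps=30)
```

A rising or falling blink rate over time would be one of the feature-value changes the drowsiness index is built from.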
  • the biological information received by the cognitive ability estimation device 1 is used in the autonomic nerve state analysis process in step S2.
  • This state analysis process evaluates the state of the autonomic nerves, that is, the activity levels of the sympathetic and parasympathetic nerves, and uses the evaluation results to estimate the levels of drowsiness and stress.
  • a spectrum obtained by frequency analysis of heartbeat intervals has characteristic peak structures. It is said that the relative activity levels can be evaluated by comparing the power in the 0.05-0.15 Hz range for the sympathetic nerves with the power in the 0.15-0.45 Hz range for the parasympathetic nerves, and that sleepiness and stress can be evaluated from these relative activity levels. Biological information that represents the heartbeat, or from which the heartbeat can be identified, can therefore be used to estimate the state of the autonomic nerves, that is, drowsiness and stress. In this embodiment, biological information is thus the information needed to estimate the state of the autonomic nerves. Since a well-known method may be adopted for estimating the state of the autonomic nervous system from the heartbeat, a more detailed explanation is omitted.
  • the structure that exists in the 0.05-0.15 Hz range is called the blood pressure fluctuation component (MWSA: Mayer Wave Sinus Arrhythmia), and the structure that exists in the 0.15-0.45 Hz range is called the respiratory fluctuation component (RSA: Respiratory Sinus Arrhythmia).
  • MWSA Mayer Wave Sinus Arrhythmia
  • RSA Respiratory Sinus Arrhythmia
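The frequency analysis described above can be sketched as follows. This is a generic LF/HF band-power computation over RR intervals, not the specific method of the embodiment, and the 4 Hz resampling rate is an assumption.

```python
import numpy as np

def lf_hf_power(rr_intervals, fs_resample=4.0):
    """Estimate sympathetic-band (LF, 0.05-0.15 Hz) and
    parasympathetic-band (HF, 0.15-0.45 Hz) power from a series of
    RR intervals given in seconds.

    The irregularly spaced RR series is resampled onto a uniform time
    grid first, since the FFT requires evenly spaced samples.
    """
    beat_times = np.cumsum(rr_intervals)
    t_uniform = np.arange(beat_times[0], beat_times[-1], 1.0 / fs_resample)
    rr_uniform = np.interp(t_uniform, beat_times, rr_intervals)
    rr_uniform = rr_uniform - rr_uniform.mean()   # remove the DC component

    spectrum = np.abs(np.fft.rfft(rr_uniform)) ** 2
    freqs = np.fft.rfftfreq(len(rr_uniform), d=1.0 / fs_resample)

    lf = spectrum[(freqs >= 0.05) & (freqs < 0.15)].sum()
    hf = spectrum[(freqs >= 0.15) & (freqs < 0.45)].sum()
    return lf, hf

# Synthetic RR series: mean 0.8 s with a 0.3 Hz respiratory (RSA) modulation
beat_idx = np.arange(300)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.3 * 0.8 * beat_idx)
lf, hf = lf_hf_power(rr)
```

For this synthetic input the HF power dominates, consistent with the parasympathetic (RSA) modulation; comparing lf and hf is the relative-activity evaluation the text refers to.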
  • the states of the driver 3 estimated from the activity levels of the autonomic nerves, that is, the sympathetic and parasympathetic nerves, are broadly classified into the following four types (see, for example, Patent Document 3):
    (1) a state in which the sympathetic nerves are dominant
    (2) a state in which the parasympathetic nerves are dominant
    (3) a state in which the activity levels of both the sympathetic and parasympathetic nerves are increased
    (4) a state in which the activity levels of both the sympathetic and parasympathetic nerves are decreased
  • (1) is a state in which it is estimated that the driver 3 is not feeling sleepy. (1) also includes a state where the stress level is high, that is, a state where the driver 3 is excited. Hereinafter, this will be referred to as the first state.
  • (2) is a state in which the driver 3 is estimated to be drowsy or fatigued. If the driver 3 is very relaxed, this state may be assumed. Hereinafter, this will be referred to as the second state.
  • (3) is a state that appears when the driver 3 tries to overcome sleepiness. It is an abnormal state that does not normally arise; however, a driver 3 who is aware of his or her sleepiness consciously tries to at least suppress it, and because of this awareness the state is considered relatively likely to occur in the driver 3. Hereinafter, this will be referred to as the third state.
  • (4) is a state that is likely to arise when the driver 3 is in a depressed state, and it can be inferred that the state of the autonomic nervous system is unstable. Hereinafter, this will be referred to as the fourth state.
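The four-way classification can be sketched as a simple mapping, assuming the band powers have already been thresholded into high/low activity (the thresholding itself is an assumed simplification; the embodiment only requires relative activity levels).

```python
def autonomic_state(sympathetic_high, parasympathetic_high):
    """Map sympathetic/parasympathetic activity to the four broad states."""
    if sympathetic_high and parasympathetic_high:
        return 3   # both elevated: fighting off drowsiness (abnormal)
    if sympathetic_high:
        return 1   # sympathetic dominant: alert (or stressed/excited)
    if parasympathetic_high:
        return 2   # parasympathetic dominant: drowsy, fatigued, or relaxed
    return 4       # both depressed: unstable autonomic state
```

The returned state number is what the later analysis steps branch on (e.g. the actual drowsiness level analysis targets state 2, the stress influence analysis state 1).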
  • in the first state, the driver 3 is considered to have high cognitive ability. If the driver 3 is feeling strong stress, however, his or her attentiveness may be reduced by that stress. Reduced attentiveness refers to a state in which actual cognitive ability is low because the driver is not paying attention to driving or is looking elsewhere. On the other hand, it is also possible that a driver under stress controls him or herself and keeps cognitive ability high. For these reasons, the level of cognitive ability cannot always be estimated appropriately.
  • the third state is a state seen when the driver 3 is trying to overcome sleepiness, as described above. Since the driver 3 is aware of his own drowsiness and is conscious of trying to overcome the drowsiness, it is possible that his cognitive ability itself has not deteriorated much. Even in the fourth state, it is possible that the driver 3 appropriately recognizes the situation. Therefore, it is not always possible to estimate the level of cognitive ability with high accuracy.
  • in this embodiment, therefore, each index evaluated from the video information and the results obtained by the autonomic nerve state analysis are used in a complementary manner so as to estimate the cognitive ability of the driver 3 with higher accuracy.
  • the actual drowsiness level analysis process in step S3 and the stress influence analysis process in step S4 are executed by the cognitive ability estimation device 1.
  • the drowsiness that occurs in the driver 3 in the second state has a particularly serious effect on the decline in cognitive ability. This is because not only is sleepiness a major cause of cognitive decline, but sleepiness typically continues for a long period of time. In other words, sleepiness is likely to cause a state of extremely low cognitive ability to continue for a long period of time. For this reason, in this embodiment, the actual level of sleepiness estimated from the state of the autonomic nerves is evaluated as the actual sleepiness level by the actual sleepiness level analysis process in step S3.
  • only a driver 3 whose autonomic nervous system is estimated to be in the second state is subject to this evaluation. The actual drowsiness level analysis process uses video information and state information.
  • a driver 3 who is trying to overcome sleepiness intentionally moves his or her body, within a range that does not interfere with safe driving, so as to stimulate it. Examples of such movements include intentionally moving the head or shoulders, intentionally moving the eyes (blinking, etc.), and operating the air conditioning equipment; the driver may also expose the body to sound stimulation or open the windows to let outside air into the car. Even if a driver 3 who moves in this manner feels sleepy, it can be estimated that the decline in cognitive ability is relatively small. In other words, the actual sleepiness level is relatively low, and it can be estimated that a sufficient cognitive level is maintained.
  • the movements of the driver 3 are affected by the state of the vehicle. Furthermore, the movements of the driver 3 affect the state of the vehicle. For example, when the vehicle is traveling on a road with lines drawn on it, if the driver 3 does not operate the steering wheel 5 appropriately, there is a risk of the vehicle body protruding from the lines. There is also a risk that the vehicle will be driven at a speed that significantly exceeds the speed limit. Such information can also be used as state information to estimate the actual drowsiness level of the driver 3. The presence or absence of a line drawn on the road, its type, and whether the vehicle body protrudes from the line can all be confirmed using video information from the camera 7. When a sign or the like indicating a speed limit is imaged by the camera 7, the speed limit can be confirmed from the video information.
  • the actual drowsiness level is divided into two levels: high and low.
  • the actual sleepiness levels of each of these stages can be further divided into two or more stages.
  • at a high actual drowsiness level, that is, in a situation where the driver 3 is estimated to be feeling strongly drowsy, it is not necessary to divide the level into multiple stages, in consideration of safety. That is, as numerical values representing the actual sleepiness level, one value may indicate that safe driving cannot be expected, and the levels at which safe driving can be expected may be represented by two or more values.
  • here, for convenience of explanation, it is assumed that the numerical value indicating that safe driving cannot be expected is 3, and that the estimation result of the actual drowsiness level is expressed as an integer from 0 to 3.
  • in the stress influence analysis process in step S4, the influence on the driver 3 of the stress he or she feels is evaluated. Video information and status information are used for this evaluation, and the results of the autonomic nerve state analysis process are used to determine whether the evaluation is necessary.
  • the only drivers 3 whose influence is evaluated are those in the first state; more specifically, only drivers whose stress level evaluated in the autonomic nerve state analysis process in step S2 is equal to or higher than a set value, that is, drivers estimated to be feeling stress at or above the level considered to have a negative impact on safe driving.
  • the degree of influence is also evaluated in multiple stages, for example. In evaluating the degree of influence, as described above, it is also determined whether safe driving is possible. Accordingly, the degree of influence may likewise include a numerical value indicating that safe driving cannot be expected, in which case the levels at which safe driving can be expected may be expressed by two or more numerical values. Here, for convenience of explanation, it is assumed that the numerical value indicating that safe driving cannot be expected is 3 and that the evaluation result of the degree of influence is expressed as an integer from 0 to 3. The smaller the value, the lower the degree of influence, that is, the less the stress actually affects the driver 3.
  • the movements of the driver 3 are affected by the state of the vehicle. For example, while the car is stopped, safe driving imposes no restrictions on the actions of the driver 3.
  • the driver 3 can make arbitrary movements. Even when the automobile is running, the permissible actions of the driver 3 vary depending on the driving speed and the like. For this reason, video information and status information are also used in stress effect analysis processing.
  • the results of the stress influence analysis process and the actual drowsiness level analysis process are passed to the cognitive ability estimation process in step S5 and the device control process in step S6, respectively.
  • the cognitive ability estimation process estimates the cognitive ability of the driver 3 as well as changes in that ability. Cognitive ability is estimated using the actual sleepiness level, the evaluation results of each index other than sleepiness, and the degree of influence of stress. Again for convenience, it is assumed that cognitive ability is evaluated on a four-level scale from 0 to 3, with 3 representing the worst condition; the smaller the number, the higher the cognitive ability. Since cognitive ability is estimated numerically, the target of estimation will hereinafter also be referred to as the "cognitive ability level".
  • if the actual sleepiness level or the degree of influence is 3, the cognitive ability level may also be set to 3. If neither is 3, the cognitive ability level may be calculated, for example, using a formula that takes the actual sleepiness level and the evaluation results of each index other than sleepiness as variables; each variable may be multiplied by a coefficient determined according to its degree of influence on safe driving.
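Under the 0-3 conventions assumed above, the calculation might be sketched as follows; the combination formula and weights are illustrative, since the patent leaves the exact formula open.

```python
def cognitive_ability_level(actual_drowsiness, influence,
                            index_scores, weights):
    """Combine the analysis results into a 0-3 cognitive ability level.

    If either the actual drowsiness level or the stress influence is 3
    (safe driving cannot be expected), the level is forced to 3.
    index_scores are the per-index evaluations excluding drowsiness;
    weights are illustrative safe-driving-impact coefficients.
    """
    if actual_drowsiness == 3 or influence == 3:
        return 3
    # Weighted combination of the remaining indicators, clipped to 0-3
    score = actual_drowsiness + sum(w * s for w, s in zip(weights, index_scores))
    return max(0, min(3, round(score)))
```

For example, with an actual drowsiness level of 0 and two mid-range indices weighted at 0.5 each, the level comes out as 1, i.e. cognitive ability is still fairly high.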
  • changes in cognitive ability are evaluated using, for example, the actual sleepiness level, the degree of influence, the stress level, and each index other than sleepiness. Because these are taken into consideration in controlling one or more devices constituting the device group 20, they will hereinafter be collectively referred to as "control indicators" to distinguish them from the above-mentioned indices.
  • the air conditioner control device 21 is a device that controls air conditioners capable of blowing hot air and cold air, and is also capable of controlling the temperature of the air and the direction of air blowing. Through the control of the control device 21, the driver 3 can be stimulated by the wind.
  • the control device 21 is shown in the device group 20 because it is a device necessary for controlling the air conditioning equipment.
  • the steering control device 22 is a device that performs control to generate a driving force that allows the steering wheel 5 to be turned more easily. It is also part of the equipment that makes autonomous driving possible.
  • using this driving force, the control device 22 can generate vibrations or the like in the hands holding the steering wheel 5.
  • the driver 3 can be stimulated by controlling the control device 22 to generate vibrations or the like in the steering wheel 5.
  • the control device 22 is also shown in the equipment group 20 because it is a device necessary for stimulation via the steering wheel 5.
  • the devices to be controlled are not limited to those shown in FIG.
  • a navigation device that performs navigation by display and voice, interior lighting, power windows, etc. may also be controlled objects.
  • the navigation device can be used for audio output or message display.
  • the cognitive ability estimating device 1 is shown as giving instructions directly to the equipment group 20; in reality, however, control is requested of the corresponding ECU. The ECU 9 is shown in FIG. 1 because it generates status information.
  • desirable stimuli are given to the driver 3 in a timely manner so that the driver 3 can drive safely.
  • Stimuli that are considered desirable for the driver 3 may differ depending on the driver's 3 condition.
  • the content of the information to be provided to the driver 3 also differs depending on the condition of the driver 3.
  • for example, a driver 3 whose attentiveness is low may be made aware of that fact; it is therefore conceivable to convey the low attention level by voice or the like.
  • for sleepiness or fatigue, however, providing such information by voice or the like alone is considered insufficient; it is also considered necessary to provide stimulation that alleviates the sleepiness or fatigue the driver 3 feels.
  • it is desirable that the content of the information provided to the driver 3 and the stimulation to be provided be selected in consideration of the condition of the driver 3.
  • even when stimulation or the like is provided, the condition of the driver 3 does not necessarily improve. Therefore, in this embodiment, changes in each control index are evaluated and the evaluation results are reflected in the control of the equipment group 20, so that the condition of the driver 3 is improved more reliably.
  • in the cognitive ability estimation process, it is determined whether the device group 20 needs to be controlled; if it is determined that control is needed, a device to be controlled is selected from the device group 20, and the content of the control for the selected device is also determined.
  • the device control process in step S6 is a process for controlling the device to be controlled among the device group 20.
  • the processing results of the actual drowsiness level analysis processing, the stress influence analysis processing, and the cognitive ability estimation processing are passed to the device control processing.
  • the results of the actual drowsiness level analysis process and the stress influence analysis process are passed so that the case where the actual drowsiness level or the degree of influence becomes 3 can be handled.
  • the device control process controls the device according to the instruction content passed as a processing result of the cognitive ability estimation process. Control of the device also includes termination of control of the device.
  • in the cognitive ability estimation process, if no improvement in the driver's condition is confirmed even after stimulation is applied to the driver 3, stronger stimulation is applied in stages. Accordingly, once control of a device has been started because the actual drowsiness level or the degree of influence has become 3, the device is subsequently controlled in accordance with the instruction content passed from the cognitive ability estimation process.
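The staged strengthening of stimulation can be sketched as a simple policy; this is purely illustrative, and the actual decision would involve the full set of control indicators.

```python
def next_stimulation_level(current_level, improved, max_level=3):
    """Decide the next stimulation strength for the device control process.

    If no improvement in the driver's condition is confirmed, stimulation
    is strengthened in stages up to a maximum; once improvement is seen
    the level is held (ending control is a separate decision).
    """
    if improved:
        return current_level
    return min(current_level + 1, max_level)
```

Each pass through the safe driving support processing would re-evaluate the control indicators and call such a policy to escalate or hold the stimulation.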
  • as described above, the state of the driver 3 estimated from the autonomic nervous system is checked against video information to estimate the driver's actual state. The actual condition of the driver 3 can therefore be estimated with higher accuracy, and individual differences can be handled more appropriately. Status information obtained from the vehicle further improves the estimation accuracy, making it possible to take more appropriate measures. As a result, information can be provided to the driver 3 according to his or her condition, and bodily stimulation can be provided in a timely and more appropriate manner. By providing such stimulation, improvement in the condition of the driver 3 can be expected, so support enabling the driver 3 to drive the car safely can be provided more appropriately.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of a cognitive ability estimation device according to an embodiment of the present invention.
  • various sensors related to the cognitive ability estimation device 1 and each device constituting the device group 20 are also shown. Note that this configuration is just one example; the hardware configuration of the cognitive ability estimation device 1, or of an information processing device usable as the estimation device 1, is not limited to this.
  • the cognitive ability estimation device 1 includes, for example, a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, an SSD (Solid State Drive) 14, an IFC (InterFace Controller) group 15, and a communication unit 16, all of which are connected to a bus.
  • the CPU 11 executes, for example, a program recorded in the ROM 12 and/or a program recorded in the SSD 14 to realize various processes. Both programs are loaded into the RAM 13 and executed.
  • the programs loaded from the SSD 14 into the RAM 13 include, for example, an OS (Operating System) and various application programs that run on the OS.
  • the various application programs include one or more programs developed to cause the information processing device to function as the cognitive ability estimation device 1.
  • this developed application will be hereinafter referred to as a "developed application".
  • the developed application may be recorded on removable media and distributed, or it may be distributed via a network such as the Internet. The recording medium on which the developed application is recorded may therefore be installed in, or attached to, an information processing device that is directly or indirectly connected to a network, or to an externally accessible device.
  • the IFC group 15 enables connection with various external devices.
  • the IFC group 15 may include IFCs for audio output and image output.
  • An IFC for directly connecting in-vehicle equipment may also be included.
  • An IFC that enables reception of biometric information or video information may also be included.
  • the RAM 13 also appropriately stores data necessary for the CPU 11 to execute various processes.
  • the data includes data used by various programs executed by the CPU 11. Video information, biological information, and status information are also included in the data.
  • when each ECU, including the ECU 9, is connected to an in-vehicle LAN (Local Area Network), the communication unit 16 enables communication with each ECU via the in-vehicle LAN. To support reception of biometric information from the smart watch 6, the communication unit 16 may also be capable of wireless communication.
  • a camera 4 and a pulse sensor 31 are shown as a sensor group 30 capable of acquiring biological information.
  • the pulse sensor 31 is provided, for example, on the steering wheel 5. The pulse sensor 31 may instead be provided on the driver's seat 2 rather than on the steering wheel 5.
  • two sensor groups 40 and 50 are shown in FIG. 2 for acquiring status information.
  • the sensor group 40 is particularly used to check the state of the vehicle, and the sensor group 50 is used to check the environment in which the vehicle is placed.
  • there is also an ECU 9 that processes information output from one or more sensors included in the sensor groups 40 and 50 and generates and outputs status information; this ECU 9 is omitted from FIG. 2.
  • the sensor group 40 includes a steering angle sensor 41, a brake/accelerator sensor 42, a G sensor 43, and a speedometer 44.
  • the steering angle sensor 41 detects the angle at which the steering wheel 5 is rotated as a steering angle.
  • the brake/accelerator sensor 42 detects the amounts of operation of the accelerator pedal and the brake pedal (not shown).
  • the G sensor 43 is an acceleration sensor that detects the acceleration generated in the vehicle.
  • the speedometer 44 measures the running speed of the vehicle. All of these detection results are treated as vehicle body status information.
  • the vehicle body status information will be referred to as "vehicle information" hereinafter.
  • as the sensor group 50, a locator 51 is shown in addition to the camera 7 and the radar 8.
  • the locator 51 outputs position information representing the position of the vehicle specified by positioning. With this position information, it is possible to confirm the road on which the vehicle is traveling, the presence or absence of a curve in the road in the traveling direction, the degree of the curve (radius), etc.
  • Video information from the camera 7, status information generated by the ECU 9 from the video information and distance information from the radar 8, and position information from the locator 51 will be referred to as "environmental information" hereinafter.
  • the type, number, and combination of sensors constituting each sensor group 30 to 50 are not particularly limited. What is shown in FIG. 2 is one example.
  • the equipment group 20 in addition to an air conditioning equipment control device 21 and a steering control device 22, a message output control device 23 and an automatic driving device 24 are shown.
  • the message output control device 23 makes it possible to provide information to the driver 3 through audio output.
  • This control device 23 may be mounted on a navigation device.
  • the automatic driving device 24 enables automatic driving of the vehicle. If it cannot be confirmed that the driver's condition has improved even after stimulation is applied, specifically, if it is considered that the driver 3 cannot be expected to drive safely due to drowsiness or stress, the automatic driving device 24 is requested to switch to automatic driving.
  • FIG. 3 is a functional block diagram showing an example of a functional configuration implemented on a cognitive ability estimation device according to an embodiment of the present invention. Next, an example of the functional configuration implemented on the cognitive ability estimation device 1 will be described in detail with reference to FIG. 3.
  • the CPU 11 can transmit and receive information (data) via the communication unit 16 to and from each ECU of the ECU group 60, which includes the ECU 9.
  • Each sensor included in the sensor groups 30 to 50 is connected to one of the ECUs forming the ECU group 60. The same applies to each device constituting the device group 20. Therefore, reception of information obtained by each sensor and control of each device are performed via one of the ECUs.
  • the SSD 14 secures areas for use as the following information storage units: a biological information storage section 141, a state information storage section 142, a state analysis result storage section 143, an index evaluation result storage section 144, a level evaluation result storage section 145, an impact evaluation result storage section 146, a cognitive ability estimation result storage section 147, a state evaluation information storage section 148, an index evaluation information storage section 149, a level evaluation information storage section 150, an impact evaluation information storage section 151, an estimation information storage section 152, and a device control information storage section 153.
  • information that should be temporarily stored may be stored in the RAM 13 instead of the SSD 14.
  • Various information to be stored in the SSD 14 is first placed in the RAM 13 and then transferred to and stored in the SSD 14.
  • Hereinafter, this intermediate step is ignored, and only the SSD 14 is assumed as the information storage destination.
  • the biological information acquisition unit 111 is responsible for acquiring biological information necessary for autonomic nerve state analysis.
  • the biological information here includes not only information obtained by the pulse sensor 31 but also video information obtained by the camera 4.
  • the biometric information acquired by the biometric information acquisition unit 111 is stored in the biometric information storage unit 141 secured in the SSD 14.
  • the vehicle information acquisition unit 112 and the environmental information acquisition unit 113 are responsible for acquiring vehicle information and environmental information, respectively. Both the acquired vehicle information and environmental information are stored in the status information storage section 142 secured in the SSD 14. Note that information acquisition by the biological information acquisition unit 111, the vehicle information acquisition unit 112, and the environmental information acquisition unit 113 is performed at predetermined timings, for example, at predetermined time intervals. This is because the pulse rate and the like are unlikely to change rapidly and repeatedly within a short period of time.
  • the autonomic nerve state analysis unit 114 estimates the state of the autonomic nerves through analysis using the biological information stored in the biological information storage unit 141. Through the analysis, the activity levels of sympathetic and parasympathetic nerves are evaluated, and the evaluation results are used to estimate the levels of sleepiness and stress. This estimation result is stored as a state analysis result in the state analysis result storage section 143 secured in the SSD 14.
  • the autonomic nerve state analysis process shown as step S2 in FIG. 1 is executed by the autonomic nerve state analysis unit 114.
  • the state evaluation information storage unit 148 secured in the SSD 14 stores, as state evaluation information, information for estimating the state of the autonomic nerves. For example, as described above, this evaluation information is used to evaluate the activity levels of the sympathetic and parasympathetic nerves from the spectrum obtained by frequency analysis of the heartbeat intervals, and to further evaluate the levels of sleepiness and stress from those evaluation results.
  • the autonomic nerve state analysis unit 114 refers to this evaluation information and estimates the state of the autonomic nerves.
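The frequency analysis of heartbeat intervals mentioned above can be illustrated with a minimal sketch. This is not the patented implementation; it only shows the commonly used approach of comparing low-frequency (LF, roughly 0.04–0.15 Hz) and high-frequency (HF, roughly 0.15–0.4 Hz) power of the R-R interval signal, where a higher LF/HF ratio is generally associated with sympathetic dominance. The function name, resampling rate, and band limits are assumptions for illustration.

```python
import numpy as np

def lf_hf_ratio(rr_intervals_s, fs=4.0):
    """Estimate sympathetic/parasympathetic balance from R-R intervals.

    rr_intervals_s: sequence of heartbeat intervals in seconds.
    Returns (lf_power, hf_power, lf/hf ratio). Illustrative sketch only.
    """
    rr = np.asarray(rr_intervals_s, dtype=float)
    t = np.cumsum(rr)                          # beat occurrence times
    # Resample the irregularly spaced R-R series onto a uniform time grid.
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    rr_uniform = np.interp(t_uniform, t, rr)
    rr_uniform = rr_uniform - rr_uniform.mean()  # remove the DC component
    # Power spectrum via the real FFT.
    spectrum = np.abs(np.fft.rfft(rr_uniform)) ** 2
    freqs = np.fft.rfftfreq(len(rr_uniform), d=1.0 / fs)
    lf = spectrum[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = spectrum[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf, hf, (lf / hf if hf > 0 else float("inf"))
```

For example, an R-R series modulated at 0.25 Hz (typical respiratory sinus arrhythmia) yields HF-dominant power, i.e. a ratio below 1, which such an analysis would read as parasympathetic (relaxed or drowsy) dominance.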
  • the video information analysis unit 115 generates motion feature information representing the characteristics of the movement of the driver 3 by analysis using video information stored in the biometric information storage unit 141 as biometric information, for example.
  • the index evaluation unit 116 evaluates each index through analysis using the generated motion characteristic information and the state information stored in the state information storage unit 142. The evaluation results of each index are stored in the index evaluation result storage section 144 secured in the SSD 14.
  • the index evaluation information storage unit 149 secured in the SSD 14 stores, for each index, information for evaluating that index as index evaluation information.
  • This evaluation information is prepared for each category of status represented by status information, for example.
  • the classification is a classification of vehicle conditions, taking into consideration, for example, the type of road (including whether it is an expressway or not), the driving area, and the driving speed.
  • the index evaluation unit 116 evaluates each index by specifying evaluation information to be referred to from the state information and referring to the specified evaluation information using the operation characteristic information. For this reason, the video analysis process shown as step S1 in FIG. 1 is realized by the video information analysis section 115 and the index evaluation section 116.
  • the actual drowsiness level evaluation unit 117 evaluates the actual drowsiness level. This evaluation is performed using, for example, the state analysis results stored in the state analysis result storage section 143, the state information stored in the state information storage section 142, the evaluation results of each index except drowsiness stored in the index evaluation result storage section 144, and the actual drowsiness level evaluation information stored in the level evaluation information storage section 150.
  • the actual drowsiness level evaluation information is information for evaluating the actual drowsiness level. This evaluation information is prepared for each category of state represented by the state information, for example, similar to the index evaluation information. Thereby, the state information is used to specify which of the actual drowsiness level evaluation information stored in the level evaluation information storage section 150 should be referred to.
  • Each piece of actual drowsiness level evaluation information is information representing the correspondence between each index other than drowsiness, that is, each evaluation result of fatigue and attentiveness, and the actual drowsiness level.
  • the actual drowsiness level evaluated with reference to such actual drowsiness level evaluation information is stored in the level evaluation result storage section 145 secured in the SSD 14.
  • the actual drowsiness level analysis process shown as step S3 in FIG. 1 is realized by the actual drowsiness level evaluation unit 117.
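The table-lookup style of evaluation described above, where state information selects a table and the index evaluations select an entry, might be sketched as follows. The state categories, index values, and resulting levels are purely hypothetical; the patent only states that evaluation information is prepared per state category and maps the fatigue and attentiveness evaluations to an actual drowsiness level.

```python
# Hypothetical actual-drowsiness evaluation tables, one per state category
# (e.g. road type / speed range). All values are illustrative assumptions.
ACTUAL_DROWSINESS_TABLES = {
    "expressway_high_speed": {
        # (fatigue level, attentiveness level) -> actual drowsiness level
        (1, 1): 1, (1, 2): 1, (2, 2): 2, (2, 3): 3, (3, 3): 3,
    },
    "urban_low_speed": {
        (1, 1): 1, (1, 2): 2, (2, 2): 2, (2, 3): 3, (3, 3): 3,
    },
}

def evaluate_actual_drowsiness(state_category, fatigue, attentiveness):
    """Select the table by state category, then look up the index pair."""
    table = ACTUAL_DROWSINESS_TABLES[state_category]
    # Fall back to the highest defined level if the exact pair is absent.
    return table.get((fatigue, attentiveness), max(table.values()))
```

The design point this sketches is that the same fatigue/attentiveness pair can map to different levels depending on the driving situation, which is why one table per state category is prepared.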
  • the stress impact evaluation unit 118 evaluates the actual degree of impact of stress. This evaluation is performed using the stress level stored as an analysis result in the state analysis result storage section 143, the state information stored in the state information storage section 142, and the stress impact evaluation information stored in the impact evaluation information storage section 151.
  • Stress impact evaluation information is information for evaluating the degree of impact of stress. For people with low stress tolerance, even if the stress level is low, the effects of stress may be strongly reflected in their actual behavior. For this reason, it is thought that there are relatively large individual differences in the effects of stress. Therefore, in this embodiment, the degree of influence is positioned as the stress level that actually appears in the behavior of the driver 3, and the degree of influence is evaluated by referring to stress influence evaluation information.
  • the target of this evaluation may be limited, for example, to a driver 3 whose estimated stress level is 2 or higher. The same applies to the actual drowsiness level evaluation section 117.
  • the stress impact evaluation information is also prepared for each state category represented by the state information.
  • the status information is similarly used to specify which stress impact evaluation information stored in the impact evaluation information storage section 151 should be referred to.
  • the effects of stress may appear in behaviors that do not, or hardly, affect attentiveness. Examples of such behaviors include relatively small continuous movements of the upper body, arms, or head, as well as expressions of anger or sadness. For this reason, the degree of influence is evaluated from a perspective different from attentiveness and the like.
  • Each stress impact evaluation information enables such an evaluation.
  • the degree of impact evaluated with reference to such stress impact evaluation information is stored in the impact evaluation result storage section 146 secured in the SSD 14.
  • the stress impact analysis process shown as step S4 in FIG. 1 is realized by the stress impact evaluation unit 118.
  • the cognitive ability estimating unit 119 estimates the cognitive ability level of the driver 3 from the control indices, that is, the actual drowsiness level, the evaluation results of each index excluding drowsiness, and the degree of influence of stress, and determines whether or not there is a need to control the device group 20.
  • the cognitive ability estimating unit 119 checks changes in each control index, selects a device to be controlled from among the devices constituting the device group 20, and determines the content of the control. For this purpose, the cognitive ability estimation information stored in the estimation information storage section 152 secured in the SSD 14 is referred to.
  • the cognitive ability estimation information is information in which the type of equipment to be controlled and the content of the control are defined, for example, per control index, per level represented by the control index, and per change in that level. Therefore, the cognitive ability estimating unit 119 refers to the cognitive ability estimation information using each control index, and outputs the type of equipment to be controlled and the content of the control as the cognitive ability estimation result. This estimation result is stored in the cognitive ability estimation result storage section 147 secured in the SSD 14.
  • the cognitive ability estimation process shown as step S5 in FIG. 1 is realized by the cognitive ability estimation unit 119.
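The lookup described above, from control indices to a target device and control content, might look like the following sketch. Every index name, level, and device action in it is an illustrative assumption, not taken from the patent; only the mapping structure (control index and level in, device and control content out) reflects the description.

```python
# Hypothetical cognitive ability estimation information: a mapping from
# (control index, its level) to (device, control content).
COGNITIVE_ESTIMATION_INFO = {
    ("actual_drowsiness", 2): ("message_output", "voice_warning"),
    ("actual_drowsiness", 3): ("air_conditioning", "cold_airflow"),
    ("stress_influence", 2): ("message_output", "calming_guidance"),
    ("stress_influence", 3): ("steering", "vibration"),
}

def estimate_control(control_indices):
    """Return (device, control content) pairs for indices that need control.

    control_indices: dict mapping index name -> current level.
    Indices whose level has no entry require no control.
    """
    actions = []
    for index_name, level in control_indices.items():
        action = COGNITIVE_ESTIMATION_INFO.get((index_name, level))
        if action is not None:
            actions.append(action)
    return actions
```

An empty result corresponds to the case where the estimation concludes that no device control is necessary.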
  • the device control processing unit 120 performs processing for controlling the device group 20 according to the type of device to be controlled and the content of the control stored as the estimation result in the cognitive ability estimation result storage unit 147.
  • the device control processing shown as step S6 in FIG. 1 is realized by the device control processing section 120.
  • the control of each device constituting the device group 20 is actually performed by the corresponding ECU. Therefore, depending on the type of device to be controlled and its control content, the device control processing unit 120 transmits to the corresponding ECU a control request including control information that represents the specified device, the actual control content, and so on.
  • This control information is determined by referring to the device control information stored in the device control information storage section 153 secured in the SSD 14. To that end, the device control information defines, for example, the control information to be transmitted for each device and each control content.
  • a control request including control information is transmitted by the device control processing section 120 via the communication section 16. Note that the device group 20 is controlled as needed, and the device group 20 is not always controlled.
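The assembly of a control request described above can be sketched roughly as follows. The table contents, ECU identifiers, and field names are all hypothetical; the patent only specifies that device control information maps each device and control content to the control information to be transmitted to the corresponding ECU.

```python
# Hypothetical device control information: (device, control content) ->
# the control information to send. ECU ids and commands are assumptions.
DEVICE_CONTROL_INFO = {
    ("air_conditioning", "cold_airflow"): {"ecu_id": 0x21, "command": "AC_COLD_MAX"},
    ("message_output", "voice_warning"): {"ecu_id": 0x23, "command": "PLAY_WARNING"},
}

def build_control_request(device, control_content):
    """Look up the device control information and build a control request."""
    info = DEVICE_CONTROL_INFO[(device, control_content)]
    # In the described configuration this request would be transmitted to
    # the target ECU via the communication unit (e.g. over the in-vehicle LAN).
    return {"target_ecu": info["ecu_id"], "payload": info["command"]}
```

The actual transmission path (in-vehicle LAN framing, acknowledgement, and so on) is handled by the communication unit and the ECUs, and is outside this sketch.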
  • Each of the above-mentioned units 111 to 120 operates while the vehicle is in a drivable state, for example, every time a predetermined time interval elapses. Thereby, timely support can be provided so that the driver 3 can drive safely. This time interval may be changed depending on the situation, for example, depending on the actual drowsiness level estimated for the driver 3.
  • FIGS. 4 and 5 are flowcharts illustrating an example of safe driving support processing.
  • This safe driving support process is executed to support safe driving by the driver 3: it estimates cognitive ability based on the analysis of video information and the analysis of the autonomic nerve state using biological information, and controls the necessary equipment accordingly. It is executed, for example, every time a predetermined time interval elapses.
  • Each unit 114 to 120 is realized by executing this process.
  • the safe driving support process will be described in detail with reference to FIGS. 4 and 5. It is assumed that the main body that executes the processing is the CPU 11.
  • step S11 the CPU 11 analyzes the state of the autonomic nerve using biological information.
  • step S12 the CPU 11 evaluates each index by video analysis using video information.
  • step S13 the CPU 11 determines whether or not any of the autonomic nerve state analysis results or the evaluated indices contains an abnormal result that may impair safe driving. For example, if drowsiness or stress is estimated by the autonomic nerve state analysis, or if any of drowsiness, fatigue, or decreased attentiveness is found by the video analysis, it is determined that there is an abnormality, the determination in step S13 is YES, and the process moves to step S16. Otherwise, it is assumed that there is no abnormality, the determination in step S13 is NO, and the process moves to step S14.
  • step S14 the CPU 11 determines whether or not the device is currently being controlled. If the device is under control, the determination in step S14 is YES and the process moves to step S15. When the device is not under control, that is, when the driver 3 maintains a state in which he can drive safely, the determination in step S14 becomes NO, and the safe driving support process ends here.
  • step S15 the CPU 11 ends the control of the device being controlled. Termination of the control is realized by transmitting a request to the corresponding ECU. After transmitting the request, the safe driving support process ends. Note that the automatic driving device 24 is not included in the target equipment here.
  • step S16 the CPU 11 determines whether automatic driving is currently in progress. If automatic driving is in progress, the determination in step S16 is YES, and the safe driving support process ends here. If automatic operation is not in progress, that is, if manual operation is in progress, the determination in step S16 is NO and the process moves to step S17.
  • step S17 the CPU 11 determines whether drowsiness is detected by analyzing the state of the autonomic nerves. If drowsiness is detected, that is, estimated, the determination in step S17 is YES and the process moves to step S18. If drowsiness is not detected, the determination in step S17 is NO, and the process moves to step S31 in FIG. 5.
  • step S18 the CPU 11 evaluates the actual drowsiness level using the video analysis results.
  • step S19 the CPU 11 determines whether the evaluation result of the actual drowsiness level is equal to or higher than a setting. If it is, for example, if the actual drowsiness level is 2 or higher, the determination in step S19 is YES and the process moves to step S20. If the evaluation result is less than the setting, the determination in step S19 is NO and the process moves to step S31 in FIG. 5.
  • step S20 the CPU 11 determines whether or not devices other than the automatic driving device 24 are being controlled. If any device is being controlled, the determination in step S20 is YES and the process moves to step S22. If no device is being controlled, the determination in step S20 is NO and the process moves to step S21.
  • step S21 the CPU 11 controls the device to be selected according to the actual drowsiness level.
  • the safe driving support process then ends. Note that the device selected here provides a relatively small stimulus to the driver 3; if the actual drowsiness level does not improve, a stronger stimulus will be given to the driver 3.
  • step S22 the CPU 11 determines whether or not the effect of improving the actual sleepiness level has been observed. If the effect is recognized, the determination in step S22 becomes YES and the process moves to step S21. Thereby, the driver 3 continues to be stimulated as before. On the other hand, if the effect is not recognized, the determination in step S22 is NO and the process moves to step S23. Note that since it takes a certain amount of time for the effect to appear on the driver 3, that time is taken into consideration when determining whether or not the effect is observed. This also applies to step S38, which will be described later.
  • step S23 the CPU 11 determines whether there are other options for controlling the device. Other options are options for devices that provide stronger stimulation to the driver 3 or options for control content. If there are other options, the determination in step S23 is YES and the process moves to step S24. If there are no other options, the determination in step S23 is NO and the process moves to step S25.
  • step S24 the CPU 11 selects one of the other options. What is selected here is, for example, the one that provides the least stimulus among the other options. After the selection, the process moves to step S21. As a result, the device will be controlled based on the selection result.
  • step S25 the CPU 11 requests the automatic driving device 24 to switch to automatic driving. Manual driving is thereby transitioned to automatic driving, after which the safe driving support process ends. For this reason, automatic driving is treated separately from the other options, as the option to be selected last.
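The stepwise escalation of steps S20 to S25 can be summarized as: start with the weakest stimulus, keep the current stimulus while improvement is observed, otherwise escalate to the next-weakest remaining option, and fall back to a request for automatic driving when no options remain. The following sketch assumes a hypothetical ordered option list and function names; it is a restatement of the flowchart logic, not the patented implementation.

```python
# Hypothetical stimulus options, ordered weakest -> strongest.
STIMULUS_OPTIONS = [
    ("message_output", "voice_warning"),
    ("air_conditioning", "cold_airflow"),
    ("steering", "vibration"),
]

def next_action(current_option_index, improvement_observed):
    """Decide the next control step per the S20-S25 flow.

    current_option_index: index into STIMULUS_OPTIONS of the active stimulus,
                          or None if no device is being controlled yet.
    Returns ("control", option) or ("automatic_driving", None).
    """
    if current_option_index is None:            # S20 NO -> S21: weakest first
        return "control", STIMULUS_OPTIONS[0]
    if improvement_observed:                    # S22 YES -> S21: keep as-is
        return "control", STIMULUS_OPTIONS[current_option_index]
    nxt = current_option_index + 1              # S23: other options remain?
    if nxt < len(STIMULUS_OPTIONS):             # S24: least-stimulus remaining
        return "control", STIMULUS_OPTIONS[nxt]
    return "automatic_driving", None            # S25: final fallback
```

Calling this each cycle with the latest improvement judgment reproduces the gradual strengthening of stimulation described in the text.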
  • step S31 of FIG. 5 the CPU 11 determines whether stress has been detected by analyzing the state of the autonomic nerves. If stress is detected, the determination in step S31 is YES and the process moves to step S32. If stress is not detected, the determination in step S31 is NO and the process moves to step S34.
  • step S32 the CPU 11 evaluates the degree of influence of stress.
  • step S33 the CPU 11 determines whether the evaluated degree of influence is greater than or equal to a setting. If the evaluated degree of influence is equal to or higher than the setting, for example, if the degree of influence is 2 or more, the determination in step S33 is YES and the process moves to step S36. If the degree of influence is less than the setting, the determination in step S33 is NO and the process moves to step S34.
  • step S34 the CPU 11 estimates cognitive ability and determines whether or not it is necessary to control a device according to the estimation result; if control is determined to be necessary, the CPU 11 determines the target device and its control content.
  • step S35 the CPU 11 performs processing according to the determination result of necessity, the equipment to be controlled, and the details of the control. After that, the safe driving support process ends.
  • cognitive ability may be estimated using only the evaluation results of each index.
  • step S36 the CPU 11 determines whether or not devices other than the automatic driving device 24 are being controlled. If any device is being controlled, the determination in step S36 is YES and the process moves to step S38. If no device is being controlled, the determination in step S36 is NO and the process moves to step S37.
  • step S37 the CPU 11 controls the device to be selected according to the degree of influence.
  • the safe driving support process ends.
  • the device selected here is intended to provide a relatively small stimulus to the driver 3 to suppress the influence of stress. If the degree of influence is not improved, a stronger stimulus will be given to the driver 3.
  • step S38 the CPU 11 determines whether or not the effect of improving the degree of influence has been observed. If the effect is recognized, the determination in step S38 becomes YES and the process moves to step S37. Thereby, the driver 3 continues to be stimulated as before. On the other hand, if the effect is not recognized, the determination in step S38 is NO and the process moves to step S39.
  • step S39 the CPU 11 determines whether there are any other options for controlling the device. Other options are options for devices or control contents that provide stronger stimulation to the driver 3 and suppress the effects of stress. If there are other options, the determination in step S39 is YES and the process moves to step S40. If there are no other options, the determination in step S39 is NO, and the safe driving support process ends here. As a result, the same stimulation as before continues to be given to the driver 3.
  • step S40 the CPU 11 selects one of the other options. What is selected here is, for example, the one that provides the least stimulus among the other options. After the selection, the process moves to step S37. As a result, the device will be controlled based on the selection result.
  • stimulation is applied to a driver 3 who is estimated to be in a state undesirable for safe driving, and if the state does not improve even after the stimulation is applied, a stronger stimulus is applied in stages. If the driver 3 is so sleepy that no improvement is seen even after stimulation, the driver 3 is forced to shift to automatic driving.
  • the stimulation includes providing information through audio output. If no improvement is observed from providing information through audio output, physical stimulation other than audio output is applied at the same time.
  • the process of applying stronger stimulation step by step is based on the idea that the control index judged inappropriate is unlikely to change to a different one while a device is being controlled. For example, if the driver 3 is feeling very sleepy, it is unlikely that he or she will come to feel strong stress within a relatively short period of time.
  • the biometric information acquisition unit 111 corresponds to biometric information acquisition means and video information acquisition means.
  • the autonomic nerve state analysis unit 114 corresponds to state estimation means.
  • the video information analysis section 115, the index evaluation section 116, the actual drowsiness level evaluation section 117, the stress influence evaluation section 118, and the cognitive ability estimation section 119 correspond to cognitive ability estimation means.
  • the vehicle information acquisition section 112 and the environmental information acquisition section 113 correspond to state information acquisition means.
  • the device control processing section 120 corresponds to control processing means.
  • a car is assumed as the mobile device, but the mobile device is not limited to a car.
  • the mobile device may be a train, a flying vehicle such as an airplane or a helicopter, or a ship.
  • Such moving objects differ not only in their characteristics but also in the constraints under which a subject drives or operates them. For example, a train can only travel on rails, and there is usually no need to consider the presence of obstacles. For this reason, the evaluation method for each index and the estimation method for cognitive ability need to be determined according to the type of moving object.
  • this embodiment can also be applied to moving objects other than automobiles.
  • each index evaluated from video information and the results obtained by analyzing the state of autonomic nerves are used in a complementary manner to each other.
  • the control may be divided into control using only the estimation result from autonomic nerve state analysis and control using both that estimation result and the evaluation result from video information.
  • For example, when drowsiness is estimated by autonomic nerve state analysis, a warning may be issued by audio output; if drowsiness is also estimated by the evaluation using video information, physical stimulation may additionally be given by controlling the device.
  • Various modifications including this are possible.
  • 1 Cognitive ability estimation device, 2 Driver's seat, 3 Driver (subject), 4, 7 Camera, 6 Smart watch, 8 Radar, 9 ECU, 11 CPU, 14 SSD, 16 Communication unit, 20 Equipment group, 21 Air conditioning equipment control device, 22 Steering control device, 23 Message output control device, 24 Automatic driving device, 31 Pulse sensor, 41 Steering angle sensor, 42 Brake/accelerator sensor, 43 G sensor, 44 Speedometer, 111 Biological information acquisition unit, 112 Vehicle information acquisition unit, 113 Environmental information acquisition unit, 114 Autonomic nerve state analysis unit, 115 Video information analysis unit, 116 Index evaluation unit, 117 Actual drowsiness level evaluation unit, 118 Stress impact evaluation unit, 119 Cognitive ability estimation unit, 120 Device control processing unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Social Psychology (AREA)
  • Public Health (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychology (AREA)
  • Veterinary Medicine (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)

Abstract

A cognitive ability estimation apparatus including: a biological information acquisition means that acquires biological information from which at least the heartbeat of a subject who drives or operates a moving body can be identified; a video information acquisition means that acquires video information of the subject; a state estimation means that estimates the state of the autonomic nerves of the subject on the basis of the heartbeat identified from the biological information; and a cognitive ability estimation means that estimates a cognitive ability of the subject on the basis of the estimation result of the autonomic nerve state by the state estimation means and the video information.

Description

Cognitive ability estimation device and program
The present invention relates to a cognitive ability estimation device and a program.
There are various types of moving bodies that are driven or operated by people. Such moving bodies are heavy, and if inappropriate driving or operation causes an accident, the resulting damage tends to be severe. For this reason, in the case of an automobile, for example, safe driving is supported by estimating the cognitive ability of the driver as the subject and, based on the estimation result, stimulating the driver as necessary (see, for example, Patent Document 1). Here, for convenience of explanation, the moving body is referred to as an automobile, the subject as a driver, and the operation for moving the moving body as driving.
The driver's cognitive ability can be estimated using video information from a camera that images the driver sitting in the driver's seat (see, for example, Patent Document 1). However, changes in the driver's visually observable features, such as the frequency of blinking, the range of movement of the line of sight, and the speed of that movement, appear as a result of drowsiness. There is therefore a time lag between the actual onset of drowsiness and the confirmation of a change in these features. Because of this time lag, drowsiness has also been estimated from the state of the autonomic nerves (see, for example, Patent Document 2).
Patent Document 1: JP 2019-200544 A
Patent Document 2: JP 2009-039167 A
Patent Document 3: JP 2008-125801 A
Conventionally, a driver estimated to be drowsy is also given some kind of stimulation by an on-vehicle device to reduce the drowsiness (raise the level of alertness). Such stimuli include, for example, air blown from an air conditioner and audible warnings.
A driver who notices his or her own drowsiness usually tries consciously to overcome it. Because of this conscious effort, sympathetic nerve activity may increase even while the driver is drowsy. That is, an abnormal state may occur in which both the sympathetic and parasympathetic nerves are activated (see, for example, Patent Document 3).
Because drivers must drive safely, their autonomic nerves are thought to be relatively prone to entering such an abnormal state. For this reason, a driver's drowsiness, and hence cognitive ability, cannot always be estimated with high accuracy from the state of the autonomic nerves alone. It therefore appears important to make it possible to estimate cognitive ability with higher accuracy.
An object of the present invention is to provide a cognitive ability estimation device capable of estimating with higher accuracy the cognitive ability of a subject who drives or operates a moving body.
A cognitive ability estimation device according to one aspect of the present disclosure includes: biological information acquisition means for acquiring biological information from which at least the heartbeat of a subject who drives or operates a moving body can be identified; video information acquisition means for acquiring video information of the subject; state estimation means for estimating the state of the autonomic nerves of the subject on the basis of the heartbeat identified from the biological information; and cognitive ability estimation means for estimating the cognitive ability of the subject on the basis of the estimation result of the autonomic nerve state by the state estimation means and the video information.
According to the present invention, the cognitive ability of a subject who drives or operates a moving body can be estimated with higher accuracy.
FIG. 1 is a diagram illustrating an example of the mechanism by which a cognitive ability estimation device according to an embodiment of the present invention estimates a subject's cognitive ability, and an example of control performed according to the estimated cognitive ability.
FIG. 2 is a block diagram showing an example of the hardware configuration of the cognitive ability estimation device according to the embodiment.
FIG. 3 is a functional block diagram showing an example of the functional configuration implemented on the cognitive ability estimation device according to the embodiment.
FIG. 4 is a flowchart showing an example of safe driving support processing.
FIG. 5 is a flowchart showing an example of safe driving support processing (continued).
Hereinafter, embodiments for carrying out the present invention will be described with reference to the drawings. Note that the embodiment described below is merely an example, and the technical scope of the present invention is not limited thereto. The technical scope of the present invention also includes various modifications.
FIG. 1 is a diagram illustrating an example of the mechanism by which the cognitive ability estimation device according to an embodiment of the present invention estimates a subject's cognitive ability, and an example of control performed according to the estimated cognitive ability.
The subject whose cognitive ability is estimated by the cognitive ability estimation device 1 is a person who drives or operates a moving body. A moving body here is therefore one equipped with a power source, such as an engine, that enables it to move.
In FIG. 1, the moving body is assumed to be an automobile, and the subject is assumed to be a driver 3 sitting in the driver's seat 2 of that automobile. Accordingly, unless otherwise specified, the following description assumes that the moving body is an automobile and the subject is the driver 3. The moving body may instead be a train, a ship, an airplane, or the like, and the subject may be a pilot or other operator.
In this embodiment, biological information, video information, and state information are used to estimate the cognitive ability of the driver 3. Of these, the state information can be omitted.
The biological information is information representing the heartbeat of the driver 3, or information from which that heartbeat can be identified. FIG. 1 shows, as examples of devices that generate biological information, a steering wheel 5 provided with a pulse sensor and a smart watch 6 worn on the arm of the driver 3.
The pulse sensor is, for example, of a capacitive, optical, or radio wave type; it processes the signal output from its sensing element and identifies the heartbeat from the pulsation of the blood flow. The identification result is transmitted from the pulse sensor as biological information. This biological information is input to the cognitive ability estimation device 1 via, for example, an ECU (Electronic Control Unit). Note that the pulse sensor may be provided on the driver's seat 2 instead of the steering wheel 5. As long as the heartbeat can be identified, the type of sensor, its installation location, and so on are not particularly limited.
The smart watch 6 is equipped with, for example, an optical pulse sensor. The smart watch 6 can detect the pulsation of the blood flow of the driver 3 observed by that pulse sensor as a heartbeat, and transmit the detection result as biological information. The biological information transmitted wirelessly from the smart watch 6 is received, for example, directly by the cognitive ability estimation device 1.
The camera 4 is installed inside the vehicle to image mainly the face of the driver 3 sitting in the driver's seat 2. The video information obtained by this imaging is transmitted to the cognitive ability estimation device 1 via the ECU, in the same way as the output of the pulse sensor provided on the steering wheel 5.
It is also possible to identify the heart rate of the driver 3, and even the breathing, from the video information. For this reason, the video information may itself be regarded as biological information.
The camera 7 and the radar 8 are devices that generate state information, or that generate information used to generate state information. This state information is information representing the state of the automobile as a moving body.
The camera 7 is used, for example, to image the direction in which the automobile is traveling. The video information obtained by this imaging is used, for example, to confirm the state of the road ahead of the vehicle. The road conditions to be confirmed specifically include the curvature of the road, the presence or absence of lines drawn on the road, the presence or absence of obstacles, and the types of obstacles. All of these are identified by video analysis using the video information. The camera 7 is therefore a device for generating state information.
The radar 8 is for measuring the distance to an object present in the direction of travel, and generates and outputs, for example, distance information representing the measured distance as state information. The radar 8 is therefore a device capable of generating state information.
Note that the state information is not limited to what can be generated by the camera 7 or the radar 8. Measured position information, the traveling speed of the automobile, the steering angle of the steering wheel 5, and the like can also be used as state information. The state information to be used may be determined according to the type of moving body, its intended use, and so on.
The distance information output from the radar 8 and the video information output from the camera 7 are processed by the corresponding ECU 9. From the video information, the ECU 9 recognizes, for example, the presence or absence of lines drawn on the road, the types of such lines, the presence or absence of objects in the direction of travel, and the types of such objects. From these recognition results, it determines, for example, whether the vehicle body is swaying and whether it is straying over a line. In combination with the distance information, it also determines whether the distance to another vehicle traveling ahead is appropriate, whether an obstacle is present, and whether there is enough distance to avoid the obstacle. The ECU 9 thus outputs, for example, the video information, the recognition results, and the determination results to the cognitive ability estimation device 1 as state information. The state information (distance information) generated by the radar 8 is output in the form of the processing results obtained using it.
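The following-distance judgment made by the ECU 9 can be sketched as follows. The patent does not specify the criterion used, so this minimal Python sketch assumes a simple time-headway rule; the function name, the 2-second default, and the km/h units are illustrative assumptions, not taken from the source.

```python
def headway_adequate(distance_m: float, speed_kmh: float,
                     min_headway_s: float = 2.0) -> bool:
    """Judge whether the radar-measured distance to the vehicle ahead
    is adequate for the current speed, using a time-headway criterion:
    the gap, expressed in seconds of travel, must be at least
    min_headway_s."""
    if speed_kmh <= 0:
        return True  # stationary: any gap is acceptable
    speed_ms = speed_kmh / 3.6           # convert km/h to m/s
    return distance_m / speed_ms >= min_headway_s
```

For example, a 50 m gap at 60 km/h corresponds to a 3-second headway and would be judged adequate, while the same gap at 120 km/h (1.5 s) would not.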
The video information and state information received by the cognitive ability estimation device 1 are used in the video analysis processing of step S1. This video analysis processing evaluates a plurality of indices that can be used to estimate cognitive ability. The types of indices and their combination are not particularly limited, but drowsiness at least must be included, because drowsiness has a very large influence on cognitive ability. As other indices, fatigue and attentiveness may be adopted, for example, as in Patent Document 1. Here, the plurality of indices are assumed to be three: drowsiness, fatigue, and attentiveness.
Well-known methods can be adopted for evaluating each index; for example, the method described in Patent Document 1 may be used. A detailed description is therefore omitted here. As for the drowsiness of the driver 3, a common approach is to estimate it by focusing in particular on eye behavior, specifically the frequency of blinking, eyeball movements, and the like. From the eyeball movements, the speed of line-of-sight movement, the range over which the line of sight moves, and so on can be confirmed.
The faster the vehicle travels, the narrower the range of movement of the driver 3's line of sight tends to be. In addition, the driver 3 may move the line of sight widely when trying to avoid an obstacle ahead, in order to confirm the direction in which the vehicle should proceed, and usually also moves the line of sight relatively widely when turning left or right. For these reasons, the state information is useful for estimating the cognitive ability of the driver 3 with higher accuracy.
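The speed and maneuver dependencies just described could be folded into the gaze-range evaluation along the following lines. This is purely an illustrative sketch: the baseline angle, the scaling with speed, and the function name are assumptions, not values from the source.

```python
def gaze_range_narrow(gaze_range_deg: float, speed_kmh: float,
                      turning_or_avoiding: bool) -> bool:
    """Flag an abnormally narrow horizontal gaze range.

    The reference range shrinks with speed (gaze naturally narrows at
    higher speeds), and samples taken while turning or avoiding an
    obstacle are skipped, since large deliberate gaze shifts are
    expected during those maneuvers."""
    if turning_or_avoiding:
        return False                     # large gaze shifts are normal here
    base_deg = 30.0                      # illustrative low-speed baseline
    expected = base_deg * max(0.4, 1.0 - speed_kmh / 200.0)
    return gaze_range_deg < expected
```

The point of the sketch is that the same measured gaze range is judged against a speed-adjusted reference rather than a fixed threshold.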
The biological information received by the cognitive ability estimation device 1 is used in the autonomic nerve state analysis processing of step S2. This state analysis processing evaluates the state of the autonomic nerves, that is, the activity levels of the sympathetic and parasympathetic nerves, and uses the evaluation results to estimate the levels of drowsiness and stress.
It is known that the spectrum obtained by frequency analysis of heartbeat intervals has a characteristic peak structure. The activity levels of the sympathetic and parasympathetic nerves can be evaluated relative to each other by comparing the power in the 0.05-0.15 Hz and 0.15-0.45 Hz ranges, respectively, and drowsiness and stress can be evaluated from these relative activity levels. Biological information that represents the heartbeat, or from which the heartbeat can be identified, can therefore be used to estimate the state of the autonomic nerves, that is, drowsiness and stress. For this reason, in this embodiment, the biological information is treated as the information necessary for estimating the state of the autonomic nerves. Since a well-known method may be adopted for estimating the autonomic nerve state from the heartbeat, a more detailed description is omitted.
Note that the structure present in the 0.05-0.15 Hz range is called the blood pressure fluctuation component (MWSA: Mayer Wave Sinus Arrhythmia), and the structure present in the 0.15-0.45 Hz range is called the respiratory fluctuation component (RSA: Respiratory Sinus Arrhythmia). Information on respiration and on blood pressure changes can also be obtained from the spectrum of heart rate fluctuations.
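The band-power comparison described above can be sketched in a few lines of numpy. This is a minimal sketch, not the patent's implementation: the unevenly sampled R-R interval series is interpolated onto a uniform grid, detrended, and analyzed with a raw periodogram, after which the power in the 0.05-0.15 Hz (sympathetic-related) and 0.15-0.45 Hz (parasympathetic-related) bands is summed.

```python
import numpy as np

def band_power(freqs, psd, lo, hi):
    """Sum the spectral power falling in [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].sum())

def autonomic_band_powers(rr_ms, fs=4.0):
    """Return (LF, HF) band power from a series of R-R intervals in ms.

    The RR series is interpolated onto a uniform fs-Hz time grid,
    the mean is removed, and a raw periodogram is computed."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                 # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)    # uniform time grid
    sig = np.interp(grid, t, rr)               # evenly resampled RR signal
    sig = sig - sig.mean()                     # remove DC component
    psd = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    lf = band_power(freqs, psd, 0.05, 0.15)    # sympathetic-related
    hf = band_power(freqs, psd, 0.15, 0.45)    # parasympathetic-related
    return lf, hf
```

A series whose intervals oscillate at about 0.3 Hz (a respiratory rhythm) yields HF power exceeding LF power, while a roughly 0.1 Hz oscillation yields the opposite.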
The states of the driver 3 estimated from the activity levels of the autonomic nerves, that is, the sympathetic and parasympathetic nerves, are broadly classified into the following four types (see, for example, Patent Document 3).
(1) A state in which the sympathetic nerves are dominant
(2) A state in which the parasympathetic nerves are dominant
(3) A state in which the activity levels of both the sympathetic and parasympathetic nerves are elevated
(4) A state in which the activity levels of both the sympathetic and parasympathetic nerves are reduced
(1) is a state in which the driver 3 is estimated not to be feeling drowsy. A state in which the stress level is high, that is, a state in which the driver 3 is agitated, is also included in (1). Hereinafter, this is referred to as the first state.
(2) is a state in which the driver 3 is estimated to be feeling drowsy or fatigued. This state may also be estimated when the driver 3 is simply very relaxed. Hereinafter, this is referred to as the second state.
(3) is a state that appears when the driver 3 is trying to overcome drowsiness. It is an abnormal state that does not normally appear. A driver 3 who is aware of drowsiness consciously tries at least to suppress it, and because of this conscious effort, this state is considered relatively likely to appear in the driver 3. Hereinafter, this is referred to as the third state.
(4) is a state likely to be estimated for a driver 3 in a depressed condition. It can be inferred that the state of the autonomic nerves has become unstable. Hereinafter, this is referred to as the fourth state.
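The fourfold classification above reduces to a simple mapping. A minimal sketch, assuming each branch's activity has already been judged "elevated" against its own baseline (the boolean inputs and the function name are illustrative, not taken from the source):

```python
def classify_autonomic_state(sympathetic_elevated: bool,
                             parasympathetic_elevated: bool) -> int:
    """Map the two activity judgments to the four broad states:
    1 sympathetic dominant, 2 parasympathetic dominant,
    3 both elevated (the abnormal state), 4 both reduced."""
    if sympathetic_elevated and parasympathetic_elevated:
        return 3
    if sympathetic_elevated:
        return 1
    if parasympathetic_elevated:
        return 2
    return 4
```

As the text goes on to stress, this coarse label alone does not fix the level of cognitive ability; the video-based indices are needed to refine it.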
Even if the autonomic nerve state is broadly classified into these four states, this does not mean that the level within each state has been estimated, nor that the level of cognitive ability has necessarily been estimated with high accuracy.
In the first state, the cognitive ability of the driver 3 is considered to be high. However, if the driver 3 is feeling strong stress, that stress may be lowering his or her attentiveness. Low attentiveness here means a state in which actual cognitive ability is low, for example because the driver's mind is not on driving or the driver is looking elsewhere. On the other hand, even a driver who feels stress may be controlling himself or herself and maintaining high cognitive ability. For these reasons, the level of cognitive ability cannot always be estimated appropriately.
In the second state, it is considered highly likely that the cognitive ability of the driver 3 is lowered by drowsiness, fatigue, or the like. However, depending on the road being traveled or the circumstances at the time, the driver may simply be relaxed. Even in the second state, therefore, the driver 3 may still be able to drive safely.
The third state, as described above, is seen when the driver 3 is trying to overcome drowsiness. Since the driver 3 recognizes his or her own drowsiness and is consciously trying to overcome it, the cognitive ability itself may not have declined very much.
Even in the fourth state, the driver 3 may be recognizing the situation appropriately. In every case, therefore, the level of cognitive ability cannot always be estimated with high accuracy.
When attention is focused on the state of the autonomic nerves, a change in the level of cognitive ability can be estimated at an earlier timing than by video analysis (see, for example, Patent Document 3). However, the level of cognitive ability may not be estimable with high accuracy from the autonomic nerve state alone. For this reason, in this embodiment, the indices evaluated from the video information and the results obtained by the autonomic nerve state analysis are used in a manner complementary to each other, so that the cognitive ability of the driver 3 is estimated with higher accuracy. For this complementary estimation of cognitive ability, the actual drowsiness level analysis processing of step S3 and the stress impact analysis processing of step S4 are executed by the cognitive ability estimation device 1.
The drowsiness that arises in the driver 3 in the second state has the most serious influence on the decline of cognitive ability. This is because drowsiness is not only the most significant cause of reduced cognitive ability, but also usually persists for a long time; that is, drowsiness is likely to keep cognitive ability at a very low level for a long period. For this reason, in this embodiment, the actual drowsiness level analysis processing of step S3 evaluates, as the actual drowsiness level, the actual level of the drowsiness estimated from the autonomic nerve state. The drivers 3 subject to this evaluation are those whose autonomic nerves are estimated to be in the second state. Video information and state information are used in this actual drowsiness level analysis processing.
A driver 3 trying to overcome drowsiness intentionally moves his or her body to stimulate it, for example within a range that does not interfere with safe driving. Examples of such body movements include intentionally moving the head or shoulders to stimulate the body, intentionally moving the eyes (blinking, etc.), operating the air conditioner or similar equipment so that the body receives more stimulation from airflow or sound, and opening a window to let outside air into the vehicle. For a driver 3 who makes such movements, even if drowsiness is felt, the decline in the level of cognitive ability can be estimated to be relatively small; that is, it can be estimated that the actual drowsiness level is relatively low and a sufficient level of cognitive ability is maintained.
However, if such movements cannot be confirmed, or can hardly be confirmed, then even if the driver is conscious of trying to overcome drowsiness, that is, even if elevated sympathetic nerve activity can be confirmed, no action to dispel the drowsiness is actually being taken. It can therefore be estimated that the actual drowsiness level of the driver 3 is high and the level of cognitive ability is not sufficient.
As described above, the movements of the driver 3 are affected by the state of the automobile, and conversely the movements of the driver 3 affect the state of the automobile. For example, when traveling on a road with marked lines, if the driver 3 does not operate the steering wheel 5 appropriately, the vehicle body may stray over a line. The vehicle may also be driven at a speed greatly exceeding the speed limit. Such facts can also be used, as state information, to estimate the actual drowsiness level of the driver 3. The presence or absence of lines drawn on the road, their types, and whether the vehicle body strays over a line can all be confirmed from the video information from the camera 7. When a sign indicating the speed limit is imaged by the camera 7, the speed limit can be confirmed from the video information.
In addition, attentiveness, which is evaluated as an index, is related to drowsiness. For example, a driver 3 estimated to have low attentiveness because the line of sight frequently moves over a relatively large range is unlikely to be feeling drowsy, since such movements are considered not to appear without active brain function. For these reasons, video information and state information are used in the actual drowsiness level analysis processing.
In the above description, for convenience, the actual drowsiness level is divided into two grades, high and low. In practice, however, each of these grades can itself be further divided into two or more grades.
At a high actual drowsiness level, that is, in a situation where the driver 3 is estimated to be feeling strong drowsiness, it is not necessary, in consideration of safety, to subdivide the level further. In other words, as the numerical value representing the actual drowsiness level, one value may be reserved for indicating that safe driving cannot be expected, while the levels at which safe driving can be expected are represented by two or more values. Here, for convenience of explanation, the value indicating that safe driving cannot be expected is assumed to be 3, and the estimation result of the actual drowsiness level is assumed to be expressed as an integer from 0 to 3. A smaller value represents a lower actual drowsiness level, that is, less drowsiness felt by the driver 3.
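The 0-3 encoding above might be realized along the following lines. This is an illustrative sketch only: the inputs (an autonomic drowsiness judgment, a rate of observed wake-up actions from the video, and lane departures from the state information) and all thresholds are assumptions, not taken from the source.

```python
def actual_drowsiness_level(autonomic_drowsy: bool,
                            wakeup_actions_per_min: float,
                            lane_departures: int) -> int:
    """Return the actual drowsiness level as an integer 0-3.

    3 means safe driving cannot be expected; 0-2 are graded levels at
    which safe driving can still be expected, smaller meaning less
    drowsy."""
    if not autonomic_drowsy:
        return 0          # second state not indicated: not drowsy
    if lane_departures > 0 or wakeup_actions_per_min == 0:
        return 3          # drowsy, and no effective countermeasure behaviour
    # Drowsy but actively fighting it: grade by how much
    # countermeasure behaviour is observed in the video.
    return 1 if wakeup_actions_per_min >= 2.0 else 2
```

The key property mirrored from the text is that a drowsy driver who visibly fights the drowsiness is graded lower (safer) than one who shows no such behaviour.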
In the stress influence analysis process of step S4, the influence on the driver 3 of the stress that the driver 3 is feeling is evaluated. The video information and the state information are used for this evaluation. The result of the autonomic nerve state analysis process is used to determine whether this evaluation needs to be performed at all; accordingly, only a driver 3 placed in the first state is evaluated. More specifically, the evaluation is performed only for a driver 3 whose stress level, as evaluated in the autonomic nerve state analysis process of step S2, is at or above a set value, that is, a driver estimated to be feeling stress at or above the level considered to adversely affect safe driving.
The stress influence analysis process mainly evaluates the influence of stress on attentiveness. Accordingly, movements of the driver 3's arms, upper body, head, and so on are not given weight unless they reach a level that interferes with looking in the directions that should be checked visually or with the driving operations themselves. Below that level, stress is considered either to have effectively no influence on attentiveness or to have a comparatively small one, and as long as the degree of influence stays within that small range, the driver 3 can be presumed to be in a state in which safe driving is possible.
On the other hand, when the driver 3 moves his arms, upper body, head, or the like violently or over a large range, it becomes difficult for the driver 3 to direct his line of sight appropriately in the directions that should be checked, to recognize properly what enters his field of view, or to perform the required driving operations promptly and correctly. Such movements may be the influence of stress surfacing as bodily motion. If such movements are observed repeatedly, the driver 3's attentiveness can be evaluated as low, and it is very likely that the driver 3 is failing to control his stress. It can therefore be estimated that stress is affecting the driver 3 to the point where safe driving is no longer possible.
The degree of influence is likewise evaluated in multiple stages. As described above, this evaluation also determines whether safe driving is possible, so the degree of influence may similarly include a value indicating that safe driving cannot be expected, with the levels at which safe driving can be expected represented by two or more values. Here, for convenience of explanation, it is assumed that the value indicating that safe driving cannot be expected is 3, and that the evaluated degree of influence is expressed as an integer from 0 to 3. The smaller the value, the lower the degree of influence, that is, the less stress is actually affecting the driver 3.
As mentioned above, the movements of the driver 3 are affected by the state of the vehicle. For example, while the vehicle is stopped, safe driving imposes no restrictions on the driver 3's movements; the driver 3 may move freely. Even while the vehicle is moving, the permissible movements of the driver 3 vary with the travel speed and other factors. For this reason, the stress influence analysis process also uses the video information and the state information.
The results of the stress influence analysis process and the actual drowsiness level analysis process are each passed to the cognitive ability estimation process of step S5 and to the device control process of step S6.
The cognitive ability estimation process estimates the cognitive ability of the driver 3 as well as changes in that ability. The estimation uses the actual drowsiness level, the evaluation results of the indices other than sleepiness, and the degree of influence of stress. Here too, for convenience, cognitive ability is assumed to be evaluated on a four-level scale from 0 to 3, with 3 representing the worst state; the smaller the value, the higher the cognitive ability. Because cognitive ability is estimated as a numerical value, the quantity being estimated is hereinafter also referred to as the "cognitive ability level".
As described above, when the actual drowsiness level and the degree of influence are each evaluated on a four-level scale with 3 representing the worst state for safe driving, the cognitive ability level may for example also be set to 3 whenever either the actual drowsiness level or the degree of influence is 3. When neither is 3, the cognitive ability level can be calculated, for example, from a formula whose variables are the actual drowsiness level and the evaluation results of the indices other than sleepiness, with each variable multiplied by a coefficient determined according to the degree to which it affects safe driving.
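A minimal sketch of this calculation follows. The weights, and the choice of fatigue and attentiveness as the indices other than sleepiness, are illustrative assumptions; the specification only states that each variable is multiplied by a coefficient reflecting its effect on safe driving:

```python
def estimate_cognitive_level(drowsiness, influence, fatigue, attention):
    """Sketch of the cognitive ability level calculation described above.

    All inputs are integers 0-3 (3 = worst for safe driving).
    The weights are placeholders, not values from the specification.
    """
    if drowsiness == 3 or influence == 3:
        return 3  # safe driving cannot be expected
    # Weighted sum of the actual drowsiness level and the indices
    # other than sleepiness (here assumed: fatigue and attentiveness).
    weights = {"drowsiness": 0.5, "fatigue": 0.25, "attention": 0.25}
    score = (weights["drowsiness"] * drowsiness
             + weights["fatigue"] * fatigue
             + weights["attention"] * attention)
    # Levels 0-2 denote states in which safe driving can be expected;
    # 3 is reserved for the branch above.
    return min(2, round(score))
```

With equal inputs the weighted sum reproduces the input level, which keeps the sketch consistent with the shared 0-3 scale.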
Changes in cognitive ability are evaluated, for example, for each of the actual drowsiness level, the degree of influence, the stress level, and the indices other than sleepiness. These quantities are taken into account in controlling one or more of the devices constituting the device group 20, and are therefore hereinafter collectively referred to as "control indices" to distinguish them from the indices mentioned above.
FIG. 1 shows only an air conditioner control device 21 and a steering control device 22 as the devices constituting the device group 20.
The air conditioner control device 21 controls an air conditioner capable of blowing warm and cool air, and can also control the air temperature and the blowing direction. Through this control device 21, the driver 3 can be stimulated by the airflow. The control device 21 is shown as part of the device group 20 because it is required for controlling the air conditioner.
The steering control device 22 generates a driving force that allows the steering wheel 5 to be turned more lightly, and is also part of the equipment that enables automatic driving.
Using that driving force, the control device 22 can generate vibration or the like in the hands holding the steering wheel 5. By controlling the control device 22 to produce such vibration in the steering wheel 5, the driver 3 can be stimulated. Like the control device 21, the control device 22 is shown as part of the device group 20 because it is required for stimulation via the steering wheel 5.
The devices to be controlled are not limited to those shown in FIG. 1. A navigation device that provides guidance by display and voice, the interior lighting, the power windows, and so on may also be controlled; the navigation device can be used for audio output or message display. In FIG. 1, the cognitive ability estimation device 1 is depicted as instructing the device group 20 directly, but in practice control is requested from the corresponding ECU. The ECU 9 is shown in FIG. 1 because it is the ECU that generates the state information.
In this embodiment, stimuli considered desirable are given to the driver 3 in a timely manner so that safe driving can be maintained. Which stimuli are desirable may differ depending on the driver 3's state, as does the content of the information to be provided. For example, a driver 3 with low attentiveness only needs to be made more aware of that fact, so it is conceivable to announce the low attentiveness by voice or the like. For a driver 3 who is feeling drowsy or fatigued, however, providing information by voice alone is considered insufficient; it is also considered necessary to give the driver 3 stimulation that alleviates the drowsiness or fatigue being felt.
When information is provided by voice or the like, it also reaches any passengers in the same vehicle, so the passengers can be expected to prompt the driver 3. For this reason, it is desirable to provide the information in a form that persons other than the driver 3 can easily recognize.
As described above, the content of the information provided to the driver 3 and the stimuli applied are preferably selected in consideration of the driver 3's state. However, even when information and stimuli considered desirable are given, the driver 3's state does not necessarily improve. In this embodiment, therefore, the change in each control index is evaluated and the evaluation results are reflected in the control of the device group 20 so that the driver 3's state is improved more reliably. The cognitive ability estimation process determines whether the device group 20 needs to be controlled and, when it does, selects the device to be controlled from the device group 20 and also determines the control content and the like for the selected device.
The device control process of step S6 controls the device to be controlled within the device group 20. It receives the results of the actual drowsiness level analysis process, the stress influence analysis process, and the cognitive ability estimation process.
Because the results of the actual drowsiness level analysis process and the stress influence analysis process are passed to it, the device control process responds when the actual drowsiness level or the degree of influence reaches 3 by starting control of devices intended to dispel drowsiness or to bring stress under control. After control has started, and whenever neither the actual drowsiness level nor the degree of influence is 3, the device control process controls the devices according to the instructions passed to it as the result of the cognitive ability estimation process. Controlling a device includes terminating its control.
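The step S6 dispatch described above can be sketched as follows. The `instruction` argument and the `devices` interface are hypothetical stand-ins for the cognitive ability estimation result and the device group 20; neither name appears in the specification:

```python
def device_control_step(drowsiness, influence, instruction, devices):
    """Minimal sketch of the step S6 dispatch logic.

    `drowsiness` and `influence` are the 0-3 results of the actual
    drowsiness level analysis and the stress influence analysis;
    `devices` is a hypothetical interface to the device group 20.
    """
    if drowsiness == 3:
        devices.start("wake_stimulus")   # dispel drowsiness
    elif influence == 3:
        devices.start("stress_relief")   # bring stress under control
    elif instruction is not None:
        # Otherwise follow the cognitive ability estimation process;
        # the instruction may also terminate an ongoing control.
        devices.apply(instruction)
```

If both values are 3, the sketch arbitrarily handles drowsiness first; the specification does not state a priority between the two.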
In the cognitive ability estimation process, if no improvement in the driver 3's state can be confirmed even after a stimulus has been applied, progressively stronger stimuli are applied in stages. Therefore, once the actual drowsiness level or the degree of influence has reached 3 and device control has started, the devices are controlled according to the instructions passed from the cognitive ability estimation process.
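The staged escalation described above can be sketched as an ordered list of stimuli. The specific stimuli and their ordering are illustrative assumptions only; the specification states merely that stronger stimuli are applied in stages when no improvement is confirmed:

```python
# Ordered from mildest to strongest; entries are (device, setting)
# pairs and are hypothetical examples, not values from the source.
ESCALATION = [
    ("message", "voice_warning"),
    ("air_conditioner", "cool_air_toward_driver"),
    ("steering", "vibration"),
]

def next_stimulus(stage):
    """Return the stimulus for the given escalation stage, holding at
    the strongest one once the list is exhausted."""
    return ESCALATION[min(stage, len(ESCALATION) - 1)]
```

Each time the control indices show no improvement, the caller would advance `stage` by one before the next control cycle.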
In this embodiment, the state of the driver 3 estimated from the autonomic nervous system is checked against the video information to estimate the driver 3's actual state. The actual state of the driver 3 can therefore be estimated with higher accuracy, and individual differences can be handled more appropriately. The state information obtained from the vehicle improves the estimation accuracy and thus enables still more appropriate responses. As a result, information suited to the driver 3's state, and furthermore physical stimuli and the like, can be provided in a timely and more appropriate manner, and applying such stimuli can be expected to improve the driver 3's state. In this way, the driver 3 can be supported more appropriately so as to drive the vehicle safely.
FIG. 2 is a block diagram showing an example of the hardware configuration of a cognitive ability estimation device according to an embodiment of the present invention. FIG. 2 also shows the various sensors related to the cognitive ability estimation device 1 and the devices constituting the device group 20. This configuration is only an example; the hardware configuration of the cognitive ability estimation device 1, and of an information processing device usable as the estimation device 1, is not limited to it.
As shown in FIG. 2, the cognitive ability estimation device 1 includes, for example, a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, an SSD (Solid State Drive) 14, an IFC (InterFace Controller) group 15, and a communication unit 16, all connected to a bus.
The CPU 11 executes, for example, programs recorded in the ROM 12 and/or programs recorded in the SSD 14 to realize various kinds of processing. Every such program is loaded into the RAM 13 and executed. The programs loaded from the SSD 14 into the RAM 13 include, for example, an OS (Operating System) and various application programs that run on it. The application programs include one or more developed to make the information processing device function as the cognitive ability estimation device 1; this developed application program is hereinafter referred to as the "developed application".
The developed application may be distributed recorded on removable media, or made distributable via a network such as the Internet. Accordingly, the recording medium on which the developed application is recorded may be one mounted in or attached to an information processing device connected directly or indirectly to a network, or one mounted in or attached to an externally accessible device.
The IFC group 15 enables connection with various external devices. It may include IFCs for audio output and image output, an IFC for connecting in-vehicle equipment directly, and an IFC that enables reception of biological information or video information.
The RAM 13 also stores, as appropriate, the data the CPU 11 needs to execute its various processes, including data used by the various programs the CPU 11 executes. The video information, biological information, and state information are among that data.
When the ECUs, including the ECU 9, are connected to an in-vehicle LAN (Local Area Network), the communication unit 16 enables communication with each ECU via that LAN. To support reception of biological information from a smartwatch 6, the communication unit 16 may also be capable of wireless communication.
FIG. 2 shows a camera 4 and a pulse sensor 31 as a sensor group 30 capable of acquiring biological information. The pulse sensor 31 is provided, for example, on the steering wheel 5; it may instead be provided on the driver's seat 2.
FIG. 2 also shows two further sensor groups 40 and 50 for acquiring state information. The sensor group 40 is used in particular to check the state of the vehicle, and the sensor group 50 to check the environment in which the vehicle is placed. As described above, the ECUs include the ECU 9, which processes the information output from one or more sensors in each of the sensor groups 40 and 50 and generates and outputs the state information; the ECU 9 is omitted from FIG. 2.
The sensor group 40 includes a steering angle sensor 41, a brake/accelerator sensor 42, a G sensor 43, and a speedometer 44.
The steering angle sensor 41 detects the angle through which the steering wheel 5 has been turned as the steering angle. The brake/accelerator sensor 42 detects the operation amounts of an accelerator pedal and a brake pedal (not shown). The G sensor 43 is an acceleration sensor and detects the acceleration generated in the vehicle. The speedometer 44 measures the travel speed of the vehicle. All of these detection results are treated as state information on the vehicle body, hereinafter referred to as "vehicle information".
The sensor group 50 includes a locator 51 in addition to the camera 7 and the radar 8.
The locator 51 outputs position information representing the position of the vehicle determined by positioning. From this position information, the road on which the vehicle is traveling, the presence or absence of a curve in the direction of travel, the severity (radius) of any such curve, and so on can be confirmed.
The video information from the camera 7, the state information generated by the ECU 9 from that video information and the distance information from the radar 8, and the position information from the locator 51 are hereinafter referred to as "environmental information".
The types, numbers, and combinations of the sensors constituting the sensor groups 30 to 50 are not particularly limited; FIG. 2 shows one example.
The device group 20 includes a message output control device 23 and an automatic driving device 24 in addition to the air conditioner control device 21 and the steering control device 22.
The message output control device 23 makes it possible to provide information to the driver 3 by audio output. This control device 23 may be one mounted in a navigation device.
The automatic driving device 24 enables automatic driving of the vehicle. When no improvement in the driver's state can be confirmed even after stimuli have been applied, specifically, when it is judged that safe driving cannot be expected because of drowsiness or stress, the automatic driving device 24 is requested to switch to automatic driving.
FIG. 3 is a functional block diagram showing an example of the functional configuration realized on the cognitive ability estimation device according to an embodiment of the present invention. An example of the functional configuration realized on the cognitive ability estimation device 1 will now be described in detail with reference to FIG. 3.
As shown in FIG. 3, the functional configuration realized on the CPU 11 of the cognitive ability estimation device 1 comprises a biological information acquisition unit 111, a vehicle information acquisition unit 112, an environmental information acquisition unit 113, an autonomic nerve state analysis unit 114, a video information analysis unit 115, an index evaluation unit 116, an actual drowsiness level evaluation unit 117, a stress influence evaluation unit 118, a cognitive ability estimation unit 119, and a device control processing unit 120. The CPU 11 can transmit and receive information (data) to and from the ECU group 60, which includes the ECU 9, via the communication unit 16.
Each sensor in the sensor groups 30 to 50 is connected to one of the ECUs constituting the ECU group 60, as is each device in the device group 20. Reception of the information obtained by each sensor, and control of each device, are therefore performed via one of the ECUs.
The functional components on the CPU 11 are realized by the CPU 11 executing various programs including the developed application. As a result, the following areas are secured in the SSD 14 for storing information: a biological information storage unit 141, a state information storage unit 142, a state analysis result storage unit 143, an index evaluation result storage unit 144, a level evaluation result storage unit 145, an influence evaluation result storage unit 146, a cognitive ability estimation result storage unit 147, a state evaluation information storage unit 148, an index evaluation information storage unit 149, a level evaluation information storage unit 150, an influence evaluation information storage unit 151, an estimation information storage unit 152, and a device control information storage unit 153.
Information that only needs to be kept temporarily may be stored in the RAM 13 rather than the SSD 14. The various kinds of information stored in the SSD 14 are first stored in the RAM 13 and then transferred to and stored in the SSD 14. Here, for convenience, this storage process is ignored and the SSD 14 alone is assumed to be the storage destination.
The biological information acquisition unit 111 handles acquisition of the biological information needed for the autonomic nerve state analysis. The biological information here includes not only the output of the pulse sensor 31 but also the video information obtained by the camera 4. The biological information acquired by the biological information acquisition unit 111 is stored in the biological information storage unit 141 secured in the SSD 14.
The vehicle information acquisition unit 112 and the environmental information acquisition unit 113 handle acquisition of the vehicle information and the environmental information, respectively. Both kinds of acquired information are stored in the state information storage unit 142 secured in the SSD 14.
Information acquisition by the biological information acquisition unit 111, the vehicle information acquisition unit 112, and the environmental information acquisition unit 113 is performed at predetermined timings, for example at predetermined time intervals, because quantities such as the pulse are unlikely to undergo repeated rapid changes within a short period.
The autonomic nerve state analysis unit 114 estimates the state of the autonomic nervous system by analysis using the biological information stored in the biological information storage unit 141. The analysis evaluates the activity levels of the sympathetic and parasympathetic nerves, and the evaluation results are used to estimate the levels of sleepiness and stress. The estimation result is stored, as the state analysis result, in the state analysis result storage unit 143 secured in the SSD 14. The autonomic nerve state analysis process shown as step S2 in FIG. 1 is executed by the autonomic nerve state analysis unit 114.
The state evaluation information storage unit 148 secured in the SSD 14 stores, as state evaluation information, the information for estimating the state of the autonomic nervous system. This evaluation information is, for example as described above, information for evaluating the activity levels of the sympathetic and parasympathetic nerves from the spectrum obtained by frequency analysis of the heartbeat intervals, and for further evaluating the levels of sleepiness and stress from those evaluation results. The autonomic nerve state analysis unit 114 refers to this evaluation information to estimate the state of the autonomic nerves.
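A conventional way to realize the frequency analysis of heartbeat intervals mentioned above is heart-rate-variability analysis, in which the power of the RR-interval spectrum in a low-frequency band (LF, roughly 0.04-0.15 Hz) and a high-frequency band (HF, roughly 0.15-0.4 Hz) is taken as a proxy for sympathetic and parasympathetic activity, respectively. The band limits, the 4 Hz resampling, and the use of NumPy below are standard-practice assumptions, not details given in the specification:

```python
import numpy as np

def lf_hf_ratio(rr_intervals_s):
    """Estimate the LF/HF power ratio from RR intervals (seconds).

    A high ratio is conventionally read as sympathetic dominance
    (stress); a low ratio as parasympathetic dominance (relaxation
    or drowsiness onset).
    """
    # Resample the irregularly spaced RR series onto a uniform
    # 4 Hz grid so an FFT can be applied.
    t = np.cumsum(rr_intervals_s)
    grid = np.arange(t[0], t[-1], 0.25)
    rr_uniform = np.interp(grid, t, rr_intervals_s)
    rr_uniform = rr_uniform - rr_uniform.mean()
    # Power spectrum via the real FFT.
    spec = np.abs(np.fft.rfft(rr_uniform)) ** 2
    freq = np.fft.rfftfreq(len(rr_uniform), d=0.25)
    lf = spec[(freq >= 0.04) & (freq < 0.15)].sum()
    hf = spec[(freq >= 0.15) & (freq < 0.40)].sum()
    return lf / hf if hf > 0 else float("inf")
```

The state evaluation information would then map ratio ranges (and absolute band powers) to the sleepiness and stress levels used elsewhere in this description.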
The video information analysis unit 115 generates motion feature information representing the characteristics of the driver 3's movements by analysis using, for example, the video information stored as biological information in the biological information storage unit 141.
The index evaluation unit 116 evaluates each index by analysis using the generated motion feature information and the state information stored in the state information storage unit 142. The evaluation result of each index is stored in the index evaluation result storage unit 144 secured in the SSD 14.
 The index evaluation information storage unit 149 secured in the SSD 14 stores, for each index, information for evaluating that index as index evaluation information. This evaluation information is prepared, for example, for each category of the state represented by the state information. The categories divide the vehicle's state by taking into account, for example, the type of road (including whether or not it is an expressway), the driving area, and the driving speed. The index evaluation unit 116 identifies from the state information the evaluation information to be referred to, and evaluates each index by referring to the identified evaluation information using the motion feature information.
 Accordingly, the video analysis process shown as step S1 in FIG. 1 is realized by the video information analysis unit 115 and the index evaluation unit 116.
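The category-keyed lookup described above can be sketched as follows. The category keys, the gaze-based motion feature, and all threshold values are invented for illustration; the text does not specify them.

```python
def state_category(state):
    """Map raw vehicle-state information to a category key (road type, speed band)."""
    road = "expressway" if state["is_expressway"] else "ordinary"
    speed = "high" if state["speed_kmh"] >= 80 else "low"
    return (road, speed)

# Hypothetical per-category thresholds for one index ("attentiveness"),
# here based on the fraction of time the gaze is off the road.
ATTENTION_THRESHOLDS = {
    ("expressway", "high"): 0.10,
    ("expressway", "low"): 0.15,
    ("ordinary", "high"): 0.12,
    ("ordinary", "low"): 0.20,
}

def evaluate_attention(motion_features, state):
    """Return an attentiveness level: 0 (normal), 1 (lowered), 2 (clearly lowered)."""
    threshold = ATTENTION_THRESHOLDS[state_category(state)]
    off_road = motion_features["gaze_off_road_ratio"]
    if off_road < threshold:
        return 0
    return 1 if off_road < 2 * threshold else 2
```

The point of the sketch is that the same motion feature is judged against a different threshold depending on the vehicle's state category, which is how the state information selects the evaluation information to be referred to.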
 The actual drowsiness level evaluation unit 117 evaluates the actual drowsiness level. This evaluation is performed using, for example, the state analysis results stored in the state analysis result storage unit 143, the state information stored in the state information storage unit 142, the evaluation results of each index other than drowsiness stored in the index evaluation result storage unit 144, and the actual drowsiness level evaluation information stored in the level evaluation information storage unit 150.
 The actual drowsiness level evaluation information is information for evaluating the actual drowsiness level. Like the index evaluation information, this evaluation information is prepared, for example, for each category of the state represented by the state information. The state information is thereby used to identify which of the actual drowsiness level evaluation information stored in the level evaluation information storage unit 150 should be referred to. Each piece of actual drowsiness level evaluation information represents the correspondence between the actual drowsiness level and the evaluation results of each index other than drowsiness, that is, fatigue and attentiveness. The actual drowsiness level evaluated by referring to such information is stored in the level evaluation result storage unit 145 secured in the SSD 14. The actual drowsiness level analysis process shown as step S3 in FIG. 1 is realized by the actual drowsiness level evaluation unit 117.
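The correspondence between the fatigue and attentiveness evaluations and the actual drowsiness level could, for example, take the form of a small lookup table. The table values and level ranges below are illustrative assumptions, not values from the text:

```python
# Hypothetical correspondence table:
# (fatigue level, attentiveness level) -> actual drowsiness level.
ACTUAL_DROWSINESS_TABLE = {
    (0, 0): 1, (0, 1): 1, (0, 2): 2,
    (1, 0): 1, (1, 1): 2, (1, 2): 3,
    (2, 0): 2, (2, 1): 3, (2, 2): 4,
}

def actual_drowsiness_level(estimated_drowsiness, fatigue, attention):
    """Refine the HRV-estimated drowsiness using the video-based indices.

    The table is consulted only when the autonomic analysis already
    suggests drowsiness; the video indices then set the actual level.
    """
    if estimated_drowsiness == 0:
        return 0
    return ACTUAL_DROWSINESS_TABLE[(fatigue, attention)]
```

In line with the text, the evaluation could be restricted to drivers whose estimated level is at or above a set value (for example, 2 or higher).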
 The stress influence evaluation unit 118 evaluates the actual degree of influence of stress. This evaluation is performed using the stress level stored as an analysis result in the state analysis result storage unit 143, the state information stored in the state information storage unit 142, and the stress influence evaluation information stored in the influence evaluation information storage unit 151.
 The stress influence evaluation information is information for evaluating the degree of influence of stress. For a person with low stress tolerance, the influence of stress may appear strongly in actual behavior even at a low stress level. Partly for this reason, individual differences in the influence of stress are considered to be comparatively large. In this embodiment, therefore, the degree of influence is positioned as the stress level that actually appears in the behavior of the driver 3, and is evaluated by referring to the stress influence evaluation information.
 Even allowing for individual differences, if the estimated stress level is low, the influence of that stress on the behavior of the driver 3 is also considered to be small. For this reason, the evaluation targets may be limited, for example, to drivers 3 whose estimated stress level is 2 or higher. The same applies to the actual drowsiness level evaluation unit 117.
 Like the index evaluation information, the stress influence evaluation information is also prepared, for example, for each category of the state represented by the state information. The state information is thus likewise used to identify which of the stress influence evaluation information stored in the influence evaluation information storage unit 151 should be referred to.
 The influence of stress may appear in behaviors that do not, or hardly, affect attentiveness. Examples of such behaviors include comparatively small continuous movements of the upper body, arms, or head; expressions of anger or sadness may also appear. Partly for this reason, the degree of influence is also evaluated from a viewpoint separate from attentiveness and the like, and each piece of stress influence evaluation information makes such an evaluation possible. The degree of influence evaluated by referring to such information is stored in the influence evaluation result storage unit 146 secured in the SSD 14. The stress influence analysis process shown as step S4 in FIG. 1 is realized by the stress influence evaluation unit 118.
 The cognitive ability estimation unit 119 estimates the cognitive ability level of the driver 3 from the control indices, that is, the actual drowsiness level, the evaluation results of each index other than drowsiness, and the degree of influence of stress, and determines, according to the estimation result, whether or not the device group 20 needs to be controlled.
 The cognitive ability estimation unit 119 checks changes in each control index, selects the device to be controlled from among the devices constituting the device group 20, and determines the content of that control. For this purpose, the cognitive ability estimation information stored in the estimation information storage unit 152 secured in the SSD 14 is referred to.
 The cognitive ability estimation information is information in which the type of device to be controlled and the content of its control are defined, for example, for each control index, for each level represented by that control index, and for each kind of change in that level. The cognitive ability estimation unit 119 therefore refers to the cognitive ability estimation information using each control index and outputs, as the cognitive ability estimation result, the type of device to be controlled and the content of its control. This estimation result is stored in the cognitive ability estimation result storage unit 147 secured in the SSD 14. The cognitive ability estimation process shown as step S5 in FIG. 1 is realized by the cognitive ability estimation unit 119.
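One way to picture the cognitive ability estimation information is as a mapping from a control index and its level to a device and control content. The sketch below omits the change-in-level dimension mentioned in the text for brevity, and all device and action names are invented:

```python
# Hypothetical definition: (control index, level) -> (device, control content).
COGNITION_CONTROL_TABLE = {
    ("drowsiness", 2): ("message_output", "voice_warning"),
    ("drowsiness", 3): ("air_conditioner", "cold_airflow"),
    ("drowsiness", 4): ("autonomous_driving", "take_over"),
    ("stress", 2): ("message_output", "calming_message"),
    ("stress", 3): ("air_conditioner", "ventilate"),
}

def estimate_and_select(control_indices):
    """Return the (device, control content) pairs implied by the control indices.

    Indices whose level has no entry (i.e. no control is needed) are skipped.
    """
    actions = []
    for name, level in control_indices.items():
        entry = COGNITION_CONTROL_TABLE.get((name, level))
        if entry:
            actions.append(entry)
    return actions
```

An empty result corresponds to the determination that the device group 20 does not need to be controlled.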
 The device control processing unit 120 performs processing for controlling the device group 20 according to the type of device to be controlled and the content of its control stored as the estimation result in the cognitive ability estimation result storage unit 147. The device control process shown as step S6 in FIG. 1 is realized by the device control processing unit 120.
 The control of each device constituting the device group 20 is actually performed by the corresponding ECU. The device control processing unit 120 therefore transmits, to the corresponding ECU, a control request containing control information that specifies the target device and represents the actual control content, according to the type of device to be controlled and its control content. This control information is determined by referring to the device control information stored in the device control information storage unit 153 secured in the SSD 14. For this purpose, the device control information defines, for example, for each device and each control content, the control information to be transmitted. The control request containing the control information is transmitted by the device control processing unit 120 via the communication unit 16. Note that the device group 20 is controlled only as needed; it is not controlled at all times.
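A control request built from such device control information might look like the following sketch. The ECU names, command strings, and message layout are assumptions made for illustration; the text only specifies that control information is defined per device and per control content.

```python
# Hypothetical device control information: per (device, control content),
# the ECU to address and the command to send.
DEVICE_CONTROL_INFO = {
    ("air_conditioner", "cold_airflow"): {"ecu": "AC_ECU", "cmd": "SET_AIRFLOW", "arg": "cold_max"},
    ("message_output", "voice_warning"): {"ecu": "MSG_ECU", "cmd": "PLAY", "arg": "warning_01"},
}

def build_control_request(device, content):
    """Build the control request that would be sent via the communication unit."""
    info = DEVICE_CONTROL_INFO[(device, content)]
    return {
        "target_ecu": info["ecu"],
        "control": {"command": info["cmd"], "argument": info["arg"]},
    }
```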
 Each of the above units 111 to 120 operates while the vehicle is in a drivable state, for example every time a predetermined time interval elapses. Timely support can thereby be provided so that the driver 3 can drive safely. This time interval may be varied depending on the situation, for example according to the actual drowsiness level estimated for the driver 3.
 FIGS. 4 and 5 are flowcharts showing an example of the safe driving support process. This process supports safe driving by the driver 3 by controlling the necessary devices based on cognitive ability estimation premised on the analysis of video information and the analysis of the autonomic state using biometric information. It is executed, for example, every time a predetermined time interval elapses, and the units 114 to 120 are realized by executing it. The safe driving support process will now be described in detail with reference to FIGS. 4 and 5, with the CPU 11 taken as the agent executing the process.
 First, in step S11, the CPU 11 analyzes the state of the autonomic nervous system using the biometric information. In the following step S12, the CPU 11 evaluates each index by video analysis using the video information. The process then proceeds to step S13.
 In step S13, the CPU 11 determines whether the result of the autonomic state analysis, or any of the evaluated indices, contains an abnormal result that may impair safe driving. For example, if drowsiness or stress is estimated by the autonomic state analysis, or if any of drowsiness, fatigue, or lowered attentiveness is found by the video analysis, an abnormality is deemed present, the determination in step S13 is YES, and the process proceeds to step S16. Otherwise, no abnormality is deemed present, the determination in step S13 is NO, and the process proceeds to step S14.
 In step S14, the CPU 11 determines whether a device is currently being controlled. If so, the determination in step S14 is YES and the process proceeds to step S15. If no device is being controlled, that is, if the driver 3 remains in a state allowing safe driving, the determination in step S14 is NO and the safe driving support process ends here.
 In step S15, the CPU 11 ends the control of the device being controlled. The control is ended by transmitting a request to the corresponding ECU; after that request is transmitted, the safe driving support process ends. Note that the devices targeted here do not include the autonomous driving device 24.
 In step S16, the CPU 11 determines whether autonomous driving is currently in progress. If so, the determination in step S16 is YES and the safe driving support process ends here. If not, that is, during manual driving, the determination in step S16 is NO and the process proceeds to step S17.
 In step S17, the CPU 11 determines whether drowsiness has been detected by the autonomic state analysis. If drowsiness has been detected, that is, estimated, the determination in step S17 is YES and the process proceeds to step S18. If not, the determination in step S17 is NO and the process proceeds to step S31 in FIG. 5.
 In step S18, the CPU 11 evaluates the actual drowsiness level using the video analysis results. In the following step S19, the CPU 11 determines whether the evaluated actual drowsiness level is at or above a set value. If it is, for example an actual drowsiness level of 2 or higher, the determination in step S19 is YES and the process proceeds to step S20. If it is below the set value, the determination in step S19 is NO and the process proceeds to step S31 in FIG. 5.
 In step S20, the CPU 11 determines whether any device other than the autonomous driving device 24 is being controlled. If so, the determination in step S20 is YES and the process proceeds to step S22. If no such device is being controlled, the determination in step S20 is NO and the process proceeds to step S21.
 In step S21, the CPU 11 controls a device selected according to the actual drowsiness level, after which the safe driving support process ends. The device selected here is one that gives a comparatively small stimulus to the driver 3; if the actual drowsiness level does not improve, a stronger stimulus will be given.
 In step S22, the CPU 11 determines whether an effect of improving the actual drowsiness level has been observed. If so, the determination in step S22 is YES and the process proceeds to step S21, so that the driver 3 continues to be stimulated as before. If the effect is not observed, the determination in step S22 is NO and the process proceeds to step S23. Note that since it takes a certain amount of time for an effect to appear in the driver 3, that time is taken into account when determining whether an effect has been observed. The same applies to step S38, described later.
 In step S23, the CPU 11 determines whether there are other options for controlling the devices. The other options are devices, or control contents, that give a stronger stimulus to the driver 3. If other options exist, the determination in step S23 is YES and the process proceeds to step S24. If not, the determination in step S23 is NO and the process proceeds to step S25.
 In step S24, the CPU 11 selects one of the other options, for example the one giving the smallest stimulus among them. After the selection, the process proceeds to step S21, so that a device is controlled according to the selection result.
 In step S25, on the other hand, the CPU 11 requests the autonomous driving device 24 to switch to autonomous driving, thereby shifting from manual driving to autonomous driving. The safe driving support process then ends. Autonomous driving is thus treated separately from the other options, as the option to be selected last.
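The escalation logic of steps S17 to S25 can be sketched as a small state update. The particular stimulus list and its ordering are illustrative assumptions; only the weakest-first escalation and the final switch to autonomous driving come from the text.

```python
# Hypothetical stimuli, ordered from weakest to strongest.
STIMULI = ["voice_message", "alert_tone", "cold_airflow", "seat_vibration"]

def escalate(current_stimulus, improved):
    """Decide the next action given the active stimulus and its observed effect.

    current_stimulus: index into STIMULI, or None if nothing is active yet.
    improved: True if the actual drowsiness level has gone down.
    """
    if current_stimulus is None:
        return ("apply", 0)                       # S21: start with the weakest stimulus
    if improved:
        return ("apply", current_stimulus)        # S22 YES: keep stimulating as before
    if current_stimulus + 1 < len(STIMULI):
        return ("apply", current_stimulus + 1)    # S24: next stronger option
    return ("switch_to_autonomous", None)         # S25: no options left
```

The stress branch of steps S36 to S40 follows the same pattern, except that when no options remain the current stimulation simply continues instead of forcing autonomous driving.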
 In step S31 of FIG. 5, the CPU 11 determines whether stress has been detected by the autonomic state analysis. If stress has been detected, the determination in step S31 is YES and the process proceeds to step S32. If not, the determination in step S31 is NO and the process proceeds to step S34.
 In step S32, the CPU 11 evaluates the degree of influence of the stress. In the following step S33, the CPU 11 determines whether the evaluated degree of influence is at or above a set value. If it is, for example a degree of influence of 2 or higher, the determination in step S33 is YES and the process proceeds to step S36. If it is below the set value, the determination in step S33 is NO and the process proceeds to step S34.
 In step S34, the CPU 11 estimates cognitive ability, determines according to the estimation result whether the devices need to be controlled, and, if it determines that control is needed, decides the device to be controlled and its control content. In the following step S35, the CPU 11 performs processing according to the determination of necessity and the decided device and control content. The safe driving support process then ends.
 When the process moves to step S34, the influence of the autonomic state on the driving of the driver 3 is considered to be comparatively small. In step S34, therefore, cognitive ability may be estimated using only the evaluation results of each index.
 In step S36, the CPU 11 determines whether any device other than the autonomous driving device 24 is being controlled. If so, the determination in step S36 is YES and the process proceeds to step S38. If no such device is being controlled, the determination in step S36 is NO and the process proceeds to step S37.
 In step S37, the CPU 11 controls a device selected according to the degree of influence, after which the safe driving support process ends. The device selected here is one that gives a comparatively small stimulus to the driver 3 to suppress the influence of stress; if the degree of influence does not improve, a stronger stimulus will be given.
 In step S38, the CPU 11 determines whether an effect of improving the degree of influence has been observed. If so, the determination in step S38 is YES and the process proceeds to step S37, so that the driver 3 continues to be stimulated as before. If the effect is not observed, the determination in step S38 is NO and the process proceeds to step S39.
 In step S39, the CPU 11 determines whether there are other options for controlling the devices. The other options are devices, or control contents, that give a stronger stimulus to the driver 3 to suppress the influence of stress. If other options exist, the determination in step S39 is YES and the process proceeds to step S40. If not, the determination in step S39 is NO and the safe driving support process ends here, so that the same stimulation as before continues to be given to the driver 3.
 In step S40, the CPU 11 selects one of the other options, for example the one giving the smallest stimulus among them. After the selection, the process proceeds to step S37, so that a device is controlled according to the selection result.
 In this way, in this embodiment, a stimulus is given to a driver 3 estimated to be in a state undesirable for safe driving, and if no improvement in that state is observed even after the stimulus is given, progressively stronger stimuli are given. If the drowsiness of the driver 3 is so strong that no improvement is seen despite the stimuli, a forced transition to autonomous driving is made. Note that the stimuli include the provision of information by voice output; if no improvement is observed from voice output, physical stimulation of the body is additionally performed. The process flow of giving progressively stronger stimuli is based on the idea that, while a device is being controlled, the control index that has become inappropriate is unlikely to change to a different one. For example, if the driver 3 is feeling strong drowsiness, it is considered unlikely that he or she will come to feel strong stress within a comparatively short period.
 In the functional configuration exemplified in FIG. 4, the biometric information acquisition unit 111 corresponds to biometric information acquisition means and video information acquisition means. The autonomic nerve state analysis unit 114 corresponds to state estimation means. The video information analysis unit 115, the index evaluation unit 116, the actual drowsiness level evaluation unit 117, the stress influence evaluation unit 118, and the cognitive ability estimation unit 119 correspond to cognitive ability estimation means.
 The vehicle information acquisition unit 112 and the environment information acquisition unit 113 correspond to state information acquisition means, and the device control processing unit 120 corresponds to control processing means.
 Although an automobile is assumed as the mobile body in this embodiment, the mobile body is not limited to an automobile. It may be a train, a flyable vehicle such as an airplane or a helicopter, a ship, or the like.
 Such mobile bodies differ not only in their characteristics but also in the constraints under which a subject drives or steers them. A train, for example, can travel only on rails, and the presence of obstacles usually need not be considered. For this reason as well, the evaluation method for each index and the estimation method for cognitive ability need to be determined according to the type of mobile body. Even so, this embodiment can be applied to mobile bodies other than automobiles.
 In this embodiment, as described above, the indices evaluated from the video information and the results obtained by the autonomic state analysis are used in a mutually complementary manner. However, drowsiness and the like can be estimated earlier by the autonomic state analysis. The control may therefore be divided into control using the estimation result of the autonomic state analysis alone, and control using both that estimation result and the evaluation results based on the video information. For example, when drowsiness is estimated by the autonomic state analysis, a warning may be issued by voice output, and when drowsiness is also estimated by the evaluation using the video information, physical stimulation by device control may additionally be given. Various modifications, including the above, are possible.
 1 cognitive ability estimation apparatus, 2 driver's seat, 3 driver (subject), 4, 7 camera, 6 smartwatch, 8 radar, 9 ECU, 11 CPU, 14 SSD, 16 communication unit, 20 device group, 21 air-conditioning device control apparatus, 22 steering control apparatus, 23 message output control apparatus, 24 autonomous driving device, 31 pulse sensor, 41 steering angle sensor, 42 brake/accelerator sensor, 43 G sensor, 44 speedometer, 111 biometric information acquisition unit, 112 vehicle information acquisition unit, 113 environment information acquisition unit, 114 autonomic nerve state analysis unit, 115 video information analysis unit, 116 index evaluation unit, 117 actual drowsiness level evaluation unit, 118 stress influence evaluation unit, 119 cognitive ability estimation unit, 120 device control processing unit.

Claims (9)

  1.  A cognitive ability estimation apparatus comprising:
     biometric information acquisition means for acquiring biometric information from which at least the heartbeat of a subject who drives or steers a mobile body can be identified;
     video information acquisition means for acquiring video information of the subject;
     state estimation means for estimating the state of the autonomic nervous system of the subject based on the heartbeat identified from the biometric information; and
     cognitive ability estimation means for estimating the cognitive ability of the subject based on the estimation result of the state of the autonomic nervous system by the state estimation means and the video information.
  2.  The cognitive ability estimation apparatus according to claim 1, wherein the cognitive ability estimation means evaluates the state of the subject with a plurality of indices including drowsiness based on the video information, and estimates the cognitive ability using the evaluation result of each index and the drowsiness level of the subject represented by the estimation result of the state of the autonomic nervous system.
  3.  The cognitive ability estimation apparatus according to claim 2, wherein the cognitive ability estimation means evaluates an actual drowsiness level, which is the actual drowsiness level of the subject, using the drowsiness level of the subject represented by the estimation result of the state of the autonomic nervous system and the evaluation results of the plurality of indices.
  4.  The cognitive ability estimation apparatus according to claim 1, wherein the cognitive ability estimation means evaluates the state of the subject with one or more indices based on the video information, and uses the evaluation result of each index and the stress level of the subject represented by the estimation result of the state of the autonomic nervous system to estimate the degree to which stress at that stress level affects each evaluation result of the indices.
  5.  The cognitive ability estimation apparatus according to claim 1, further comprising state information acquisition means capable of acquiring state information representing the state of the mobile body,
     wherein the cognitive ability estimation means estimates the cognitive ability of the subject based on the estimation result of the state of the autonomic nervous system by the state estimation means, the video information, and the state information.
  6.  The cognitive ability estimation device according to claim 1, further comprising control processing means for performing processing for controlling a first device capable of giving the subject a stimulus for improving the cognitive ability, based on the estimation result of the cognitive ability by the cognitive ability estimating means.
  7.  The cognitive ability estimation device according to claim 6, wherein the control processing means performs processing for controlling a second device capable of notifying a person other than the subject.
  8.  The cognitive ability estimation device according to claim 6, wherein, when the mobile object is equipped with an automatic driving function, the control processing means performs processing for automatic driving of the mobile object by the automatic driving function, based on the estimation result of the cognitive ability by the cognitive ability estimating means after the device has been controlled.
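Claims 6 to 8 together describe an escalating response: stimulate the subject (first device), notify others (second device), and, if a re-estimation still shows low cognitive ability, hand control to the automatic driving function. A hedged sketch of that control flow follows; the device interfaces, method names, and threshold are assumptions for illustration, not specified by the claims.

```python
def respond_to_low_cognition(estimate, stimulus_device, alert_device,
                             vehicle, threshold=0.5):
    """Claim 6: stimulate the subject; claim 7: notify a person other than
    the subject; claim 8: re-estimate after controlling the device and fall
    back to automatic driving if cognitive ability remains low.

    `estimate` is a callable returning the current cognitive-ability estimate
    in 0..1; all device/vehicle objects and the threshold are illustrative.
    """
    if estimate() >= threshold:
        return "ok"
    stimulus_device.activate()       # first device: e.g. sound or vibration
    alert_device.notify_others()     # second device: warn passengers/others
    # Claim 8: decide on automatic driving from a fresh estimate.
    if estimate() < threshold and vehicle.has_autonomous_mode:
        vehicle.engage_autonomous_driving()
        return "autonomous"
    return "stimulated"
```

The single re-estimation here is the simplest reading of claim 8; an implementation might instead monitor continuously after the stimulus.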
  9.  A program causing an information processing device to execute processing to:
     acquire biological information from which at least the heartbeat of a subject who drives or steers a mobile object can be identified;
     acquire video information of the subject;
     estimate the state of the subject's autonomic nervous system based on the heartbeat identified from the biological information; and
     estimate the cognitive ability of the subject based on the estimation result of the state of the autonomic nervous system and the video information.
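The four processing steps recited in claim 9 can be sketched end to end as follows. The three callables are placeholders for whatever concrete heartbeat-extraction, autonomic-estimation, and video-scoring methods an implementation would use; the averaging fusion in the last step is purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class CognitionEstimate:
    autonomic_level: float    # level derived from heartbeat analysis
    video_level: float        # level derived from video-based indicators
    cognitive_ability: float  # fused estimate, 0 (impaired) .. 1 (alert)

def estimate_cognitive_ability(biosignal, video_frames,
                               extract_heartbeat, estimate_autonomic,
                               score_video):
    """The four claimed steps: acquire biological information, acquire
    video information, estimate the autonomic nervous system state from
    the identified heartbeat, then estimate cognitive ability from both.
    The claims fix no particular algorithm for any step."""
    heartbeat = extract_heartbeat(biosignal)         # identify the heartbeat
    autonomic_level = estimate_autonomic(heartbeat)  # autonomic state estimate
    video_level = score_video(video_frames)          # video-based evaluation
    fused = (autonomic_level + video_level) / 2.0    # illustrative fusion
    return CognitionEstimate(autonomic_level, video_level, fused)
```

Any real implementation would replace the placeholders with, e.g., an ECG or camera-based pulse extractor, an HRV analysis, and a drowsiness-indicator model.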

PCT/JP2023/015517 2022-04-18 2023-04-18 Cognitive ability estimation apparatus and program WO2023204218A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022068549 2022-04-18
JP2022-068549 2022-04-18

Publications (1)

Publication Number Publication Date
WO2023204218A1 true WO2023204218A1 (en) 2023-10-26

Family

ID=88419798

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/015517 WO2023204218A1 (en) 2022-04-18 2023-04-18 Cognitive ability estimation apparatus and program

Country Status (2)

Country Link
TW (1) TW202404532A (en)
WO (1) WO2023204218A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018127112A * 2017-02-08 2018-08-16 Panasonic IP Management Co., Ltd. Arousal level estimation device and arousal level estimation method
JP2019004924A * 2017-06-20 2019-01-17 Toshiba Corporation System and method
JP2019111092A * 2017-12-22 2019-07-11 Omron Corporation Biological state estimation device, method, and program
WO2021053780A1 * 2019-09-19 2021-03-25 Mitsubishi Electric Corporation Cognitive function estimation device, learning device, and cognitive function estimation method
JP2021077134A * 2019-11-11 2021-05-20 Mazda Motor Corporation Vehicle control device and driver state determination method

Also Published As

Publication number Publication date
TW202404532A (en) 2024-02-01

Similar Documents

Publication Publication Date Title
US20210009149A1 (en) Distractedness sensing system
US10210409B1 (en) Seating system with occupant stimulation and sensing
US10379535B2 (en) Drowsiness sensing system
CN112041910B (en) Information processing apparatus, mobile device, method, and program
US10875536B2 (en) Coordinated vehicle response system and method for driver behavior
JP7324716B2 (en) Information processing device, mobile device, method, and program
Aghaei et al. Smart driver monitoring: when signal processing meets human factors: in the driver's seat
EP2247484B1 (en) Vehicle control method for adapting dynamic vehicle performance to the psychophysical condition of the driver
WO2016047063A1 (en) Onboard system, vehicle control device, and program product for vehicle control device
CA2649731C (en) An unobtrusive driver drowsiness detection method
CN109263645A (en) For adjusting the method and system and motor vehicle of the operating parameter of motor vehicle
EP3387995A1 (en) Apparatus and method for controlling vehicle based on degree of fatigue
WO2014149657A1 (en) Coordinated vehicle response system and method for driver behavior
MX2013009434A (en) System and method for responding to driver behavior.
JP7357006B2 (en) Information processing device, mobile device, method, and program
US10845802B2 (en) Method for operating a motor vehicle
CN105015445A (en) Method and system for personalized assistance driver of motor vehicle
US11751784B2 (en) Systems and methods for detecting drowsiness in a driver of a vehicle
US20210197838A1 (en) Method for adapting the comfort of a vehicle, regulating device and vehicle
CN113891823A (en) Method and device for monitoring the state of health of an occupant, in particular of an autonomous vehicle, regulated by a driving maneuver
JP2016088497A (en) Work capability control system
WO2016067594A1 (en) Work capability control system
JP2010063682A (en) Driver monitoring apparatus
WO2023204218A1 (en) Cognitive ability estimation apparatus and program
Hirose et al. Driving characteristics of drivers in a state of low alertness when an autonomous system changes from autonomous driving to manual driving

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23791866

Country of ref document: EP

Kind code of ref document: A1