WO2021005975A1 - State determination device, in-vehicle device, driving evaluation system, state determination method, and program - Google Patents


Info

Publication number
WO2021005975A1
WO2021005975A1 (PCT application PCT/JP2020/023555)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
eye
feature amount
closure rate
state determination
Application number
PCT/JP2020/023555
Other languages
English (en)
Japanese (ja)
Inventor
絵里子 閑
博 田﨑
Original Assignee
OMRON Corporation (オムロン株式会社)
Application filed by OMRON Corporation
Publication of WO2021005975A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08: Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a state determination device, an in-vehicle device, a driving evaluation system, a state determination method, and a program.
  • Patent Document 1 discloses a wakefulness maintenance device that maintains the wakefulness of a vehicle driver.
  • The wakefulness maintenance device includes a stimulus execution unit that, using a display device for displaying a visual stimulus, executes a stimulus process that shifts the display position of the visual stimulus left and right within the driver's forward field of view; a control unit that causes the stimulus execution unit to execute the stimulus process at each interval time; an inattentive state estimation unit that estimates the degree of the driver's inattentive state; and a time setting unit that sets the interval time shorter as the degree of the inattentive state estimated by the inattentive state estimation unit becomes higher.
  • The inattentive state estimation unit photographs the driver's face with a driver camera and acquires a moving image.
  • Using the acquired moving image, the inattentive state estimation unit obtains the driver's eye closure rate (the ratio of time during which the eyes are closed).
  • The inattentive state estimation unit then determines whether the obtained eye closure rate is at or below an upper limit (8%). If it is at or below the upper limit, the time setting unit sets, based on the eye closure rate, the interval time at which the stimulus execution unit executes the stimulus process; if the eye closure rate exceeds the upper limit, the stimulus execution unit executes a direct awakening process instead.
  • In Patent Document 1, the degree of the inattentive state is thus determined from the value of the eye closure rate itself, and the process by the time setting unit or the stimulus execution unit is executed accordingly.
  • However, owing to individual differences, the eye closure rate may measure high even while the driver is awake. Likewise, depending on where the driver camera is installed, the eye closure rate may measure high even in the awake state. In such cases, a determination based on the value of the eye closure rate may wrongly conclude that an awake driver is in the inattentive state, so the inattentive state cannot be determined accurately.
  • The present invention has been made in view of the above problem, and aims to provide a state determination device capable of accurately determining a person's inattentive state, an in-vehicle device equipped with the device, a driving evaluation system including the in-vehicle device, a state determination method, and a program for realizing the method.
  • To achieve the above object, the state determination device (1) according to the present disclosure is a state determination device that determines the state of a person, and includes:
  • an eye opening/closing degree detection unit that detects the person's eye opening/closing degree from an image of the person's face;
  • an eye closure rate calculation unit that calculates the person's eye closure rate using the eye opening/closing degree detected by the eye opening/closing degree detection unit over a first predetermined period;
  • a variation feature amount calculation unit that calculates a variation feature amount of the eye closure rate using the eye closure rate calculated over a second predetermined period; and an inattentive state determination unit that determines the person's inattentive state based on the variation feature amount calculated by the variation feature amount calculation unit.
  • According to the state determination device (1), the person's eye opening/closing degree is detected from the image, and the person's eye closure rate is calculated using the eye opening/closing degree detected over the first predetermined period.
  • The variation feature amount of the eye closure rate is then calculated using the eye closure rate over the second predetermined period, and the person's inattentive state is determined based on the calculated variation feature amount. Because the determination rests on the variation feature amount rather than on the eye closure rate value itself, the inattentive state can be determined accurately even when factors such as individual differences or differences in the camera's imaging position cause the eye closure rate to measure high while the person is awake.
  • The person may be, for example, a driver driving a vehicle or a worker performing a predetermined task.
  • The state determination device (2) is the state determination device (1) further including a preprocessing unit that preprocesses the eye closure rate calculated by the eye closure rate calculation unit.
  • The variation feature amount calculation unit calculates the variation feature amount using the eye closure rate after preprocessing by the preprocessing unit.
  • According to the state determination device (2), the preprocessing unit preprocesses the eye closure rate, and the variation feature amount is calculated from the preprocessed eye closure rate. Using the preprocessed eye closure rate makes it possible to calculate a feature amount that more accurately represents the transition into the inattentive state, so that the transition can be determined accurately.
  • The state determination device (3) is the state determination device (2) in which the preprocessing unit smooths the eye closure rate. According to the state determination device (3), because the preprocessing unit smooths the eye closure rate, the variation feature amount is calculated from the smoothed eye closure rate, which makes it easier to calculate a feature amount that indicates the tendency to shift into the inattentive state.
  • The smoothing process may be, for example, a process of taking a moving average of the eye closure rate over each predetermined period.
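A minimal sketch of such a smoothing preprocess, assuming a trailing (causal) moving average; the function name and window size are illustrative, not specified by the disclosure.

```python
def moving_average(values, window):
    """Trailing moving average: each output sample averages the
    current value and up to window - 1 preceding values."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

A trailing window is chosen here because the device must smooth the closure rate in real time, without access to future samples.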
  • The state determination device (4) is the state determination device (2) further including an event detection unit that detects a predetermined event.
  • The preprocessing unit performs a removal process that excludes, from the data used to calculate the variation feature amount, any eye closure rate within the second predetermined period that was calculated at the time, or during the period, in which the event detection unit detected the predetermined event.
  • According to the state determination device (4), the eye closure rates calculated while the predetermined event was detected are removed from the calculation data,
  • and the variation feature amount is calculated from the eye closure rate remaining after the removal process. Using the post-removal eye closure rate makes it possible to calculate a feature amount that more accurately represents the transition into the inattentive state, so that the transition can be determined accurately.
  • The state determination device (5) is the state determination device (4) in which the preprocessing unit further smooths the eye closure rate after the removal process. According to the state determination device (5), because the smoothed, post-removal eye closure rate is used, the variation feature amount can be calculated as a feature amount from which the tendency to shift into the inattentive state is easy to grasp, and the shift can be determined accurately.
  • The smoothing process may be, for example, a process of taking a moving average of the eye closure rate over each predetermined period.
  • The state determination device (6) is the state determination device (4) or (5) further including an interpolation processing unit that interpolates the eye closure rate for the time or period removed by the removal process from the eye closure rates of the second predetermined period. According to the state determination device (6), the interpolation processing unit fills in the removed portions of the eye closure rate, so the variation feature amount can be calculated appropriately.
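One way to sketch the removal process of device (4) combined with the interpolation of device (6): samples flagged with a detected event are dropped, and the gaps are then filled by linear interpolation between the surviving neighbours. The names and the choice of linear interpolation are assumptions; the disclosure does not fix the interpolation method.

```python
def remove_and_interpolate(rates, event_flags):
    """Drop closure-rate samples measured while a predetermined event
    was detected, then fill each gap by linear interpolation between
    the nearest kept samples (or by extending the nearest edge value)."""
    cleaned = [None if flag else r for r, flag in zip(rates, event_flags)]
    kept = [i for i, v in enumerate(cleaned) if v is not None]
    result = list(cleaned)
    for i, v in enumerate(cleaned):
        if v is not None:
            continue
        prevs = [j for j in kept if j < i]
        nexts = [j for j in kept if j > i]
        if prevs and nexts:
            p, n = prevs[-1], nexts[0]
            frac = (i - p) / (n - p)
            result[i] = cleaned[p] + frac * (cleaned[n] - cleaned[p])
        elif prevs:
            result[i] = cleaned[prevs[-1]]
        elif nexts:
            result[i] = cleaned[nexts[0]]
    return result
```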
  • In the state determination device (7), the person is the driver of a vehicle.
  • The event detection unit includes a vehicle dynamics detection unit that detects, as the predetermined event, at least one of: a switch in the road type on which the vehicle travels, the vehicle stopping, the vehicle starting to travel, sudden steering of the vehicle, sudden braking of the vehicle, and an impact on the vehicle.
  • According to the state determination device (7), because the event detection unit includes the vehicle dynamics detection unit, eye closure rates within the second predetermined period calculated at the time, or during the period, in which at least one of these events was detected
  • can be excluded from the data used to calculate the variation feature amount. Therefore, even in a real vehicle environment in which the vehicle's dynamics change from moment to moment, the variation feature amount can be calculated as a feature amount that more accurately indicates the tendency to shift into the inattentive state, and the driver's transition to the inattentive state can be determined accurately.
  • In the state determination device (8), the person is the driver of a vehicle.
  • The event detection unit includes a face-direction event detection unit that detects, as the predetermined event, an event in which the direction of the person's face changes while the vehicle is being driven.
  • According to the state determination device (8), because the event detection unit includes the face-direction event detection unit, eye closure rates within the second predetermined period calculated at the time, or during the period, in which a change in the person's face direction was detected during driving can be excluded from the data used to calculate the variation feature amount.
  • The variation feature amount can therefore be calculated as a feature amount that more accurately indicates the tendency to shift into the inattentive state, and the driver's transition to the inattentive state in a real vehicle environment can be determined accurately.
  • In the state determination device (9), the variation feature amount calculation unit calculates, as the variation feature amount, an index showing the degree of variation in the eye closure rate over the second predetermined period.
  • According to the state determination device (9), the variation feature amount calculation unit calculates as the variation feature amount an index of the degree of variation in the eye closure rate over the second predetermined period, such as the standard deviation or the variance, and the person's inattentive state is determined based on that index.
  • Because the determination takes into account the degree of variation in the eye closure rate (in other words, changes in how much it varies), the timing of the transition to the inattentive state can be determined accurately without being affected by the eye closure rate value itself, even when, for example, the eye closure rate measures high in the awake state.
  • In the state determination device (10), the variation feature amount calculation unit calculates, as the variation feature amount, the amount of change or the rate of change in the eye closure rate over the second predetermined period. According to the state determination device (10), the person's inattentive state is determined based on that amount or rate of change.
  • Because the determination takes into account the amount or rate of change in the eye closure rate (in other words, the magnitude of its change), the timing of the transition to the inattentive state can be determined accurately without being affected by the eye closure rate value itself, even when, for example, the eye closure rate measures high in the awake state.
  • In the state determination device (11), the variation feature amount calculation unit calculates, as the variation feature amount, the increase time ratio, that is, the proportion of the second predetermined period during which the eye closure rate is increasing. According to the state determination device (11), the person's inattentive state is determined based on the calculated increase time ratio. Because the determination takes into account the tendency of the eye closure rate to increase over time rather than its value, the timing of the transition to the inattentive state can be determined accurately even when, for example, the eye closure rate measures high in the awake state.
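A sketch of one plausible reading of the increase time ratio: the fraction of sample-to-sample steps within the second predetermined period in which the eye closure rate rose. The exact definition in the disclosure may differ; this is an illustrative assumption.

```python
def increase_time_ratio(closure_rates):
    """Fraction of consecutive-sample steps in which the eye closure
    rate increased, a proxy for how persistently it is rising."""
    steps = list(zip(closure_rates, closure_rates[1:]))
    if not steps:
        return 0.0
    rising = sum(1 for a, b in steps if b > a)
    return rising / len(steps)
```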
  • In the state determination device (12), the inattentive state determination unit determines the person's inattentive state in stages using two or more determination thresholds. According to the state determination device (12), the inattentive state can be determined step by step, and an appropriate response can be made according to its degree.
  • In the state determination device (13), the inattentive state determination unit determines that the person is in the inattentive state when the variation feature amount continuously exceeds the determination threshold for a predetermined period.
  • According to the state determination device (13), because the determination requires the variation feature amount to exceed the threshold continuously for the predetermined period, the person's transition into the inattentive state can be judged appropriately.
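The continuous-exceedance rule of device (13) can be sketched as a run-length check over successive variation feature samples; the threshold and run length below are illustrative assumptions.

```python
def sustained_exceedance(features, threshold, min_run):
    """Return True once the variation feature amount has exceeded the
    determination threshold for min_run consecutive samples, so that a
    single noisy spike does not trigger the inattentive determination."""
    run = 0
    for f in features:
        run = run + 1 if f > threshold else 0
        if run >= min_run:
            return True
    return False
```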
  • In the state determination device (14), the inattentive state determination unit determines the person's inattentive state based on two or more types of the variation feature amount, or on one or more types of the variation feature amount together with the eye closure rate. According to the state determination device (14), considering two or more such quantities makes it possible to determine more accurately the timing at which the person shifts into the inattentive state.
  • In the state determination device (15), the inattentive state determination unit determines the person's inattentive state using a trained learner that has been trained to take the variation feature amount, or the variation feature amount and the eye closure rate, as input and to output a value indicating whether the person is in the inattentive state.
  • The learner may be configured to include, for example, a neural network or a support vector machine.
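As a stand-in for such a trained learner, the sketch below runs inference with a logistic model whose weights are entirely hypothetical; a real system would train a neural network or support vector machine on labelled driving data, as the disclosure suggests.

```python
import math

# Hypothetical pre-trained parameters, for illustration only.
WEIGHTS = [8.0, 4.0]   # [variation feature amount, eye closure rate]
BIAS = -1.5

def learner_predict(variation_feature, closure_rate):
    """Minimal stand-in for a trained learner: a logistic model
    mapping (variation feature, closure rate) to a True/False
    inattentiveness decision via a 0.5 score cut-off."""
    z = WEIGHTS[0] * variation_feature + WEIGHTS[1] * closure_rate + BIAS
    score = 1.0 / (1.0 + math.exp(-z))
    return score > 0.5
```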
  • The state determination device (16) is any of the above state determination devices (1) to (15) further including an output unit that outputs the determination result of the inattentive state determination unit. According to the state determination device (16), the output unit can appropriately output the determination result.
  • The state determination device (17) is the state determination device (16) further including a notification unit that issues a notification according to the determination result output by the output unit. According to the state determination device (17), the notification unit can notify appropriately according to the determination result of the inattentive state determination unit.
  • The state determination device (18) is the state determination device (16) or (17) further including a determination result storage unit that stores the determination result output by the output unit, and a communication unit that transmits data including the stored determination result to a predetermined destination. According to the state determination device (18), the determination result can be transmitted appropriately to the predetermined destination, where it can then be put to appropriate use.
  • The in-vehicle device (1) according to the present disclosure includes any of the above state determination devices (1) to (18) and an imaging unit that captures the image. According to the in-vehicle device (1), an in-vehicle device exhibiting the effect of any of the state determination devices (1) to (18) can be realized.
  • The driving evaluation system (1) according to the present disclosure includes one or more of the in-vehicle devices (1) and a driving evaluation device. The driving evaluation device includes a driving evaluation unit that performs a driving evaluation, including an evaluation of the person's inattentive state, based on data including the determination result of the inattentive state determined by the state determination device of the in-vehicle device, and an evaluation result output unit that outputs the driving evaluation result including the evaluation of the inattentive state.
  • According to the driving evaluation system (1), the driving evaluation is performed based on data including the determination result of the person's inattentive state, so a driving evaluation result including an appropriately evaluated inattentive state can be output, and safe driving education for the person can be carried out appropriately.
  • The state determination method according to the present disclosure is a state determination method for determining the state of a person, and includes:
  • an eye opening/closing degree detection step of detecting the person's eye opening/closing degree from an image of the person's face;
  • an eye closure rate calculation step of calculating the person's eye closure rate using the eye opening/closing degree detected in the eye opening/closing degree detection step over a first predetermined period;
  • a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rate calculated in the eye closure rate calculation step over a second predetermined period; and an inattentive state determination step of determining the person's inattentive state based on the variation feature amount calculated in the variation feature amount calculation step.
  • According to the state determination method, the person's eye opening/closing degree is detected from the image, and the person's eye closure rate is calculated using the eye opening/closing degree detected over the first predetermined period.
  • The variation feature amount of the eye closure rate is calculated using the eye closure rate over the second predetermined period, and the person's inattentive state is determined based on the calculated variation feature amount. Because the determination rests on the variation feature amount, the inattentive state can be determined accurately even when factors such as individual differences or differences in the camera's imaging position cause the eye closure rate to measure high while the person is awake.
  • The program according to the present disclosure is a program for causing at least one computer to execute a process of determining the state of a person, the process including:
  • an eye opening/closing degree detection step of detecting the person's eye opening/closing degree from an image of the person's face;
  • an eye closure rate calculation step of calculating the person's eye closure rate using the eye opening/closing degree detected in the eye opening/closing degree detection step over a first predetermined period;
  • a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rate calculated in the eye closure rate calculation step over a second predetermined period;
  • and an inattentive state determination step of determining the person's inattentive state based on the variation feature amount calculated in the variation feature amount calculation step.
  • According to the program, at least one computer detects the person's eye opening/closing degree from the image, calculates the person's eye closure rate using the eye opening/closing degree detected over the first predetermined period,
  • calculates the variation feature amount of the eye closure rate, and determines the person's inattentive state based on the calculated variation feature amount. Therefore, a device or system can be realized that determines the inattentive state accurately without being affected by factors such as individual differences in the eye closure rate or differences in the image's imaging position, even when the eye closure rate measures high during wakefulness.
  • The above program may be stored in a storage medium, or may be transferred or executed via a communication network.
  • FIG. 1 is a schematic diagram showing an example of an application scene of the state determination device according to embodiment (1). FIG. 2 is a block diagram showing a hardware configuration example of the in-vehicle device according to embodiment (1). FIG. 3 is a block diagram showing a functional configuration example of the state determination device according to embodiment (1).
  • FIG. 4(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally low,
  • and FIG. 4(b) is a graph showing the time-series change in the standard deviation of the eye closure rate calculated using the eye closure rate shown in FIG. 4(a).
  • FIG. 5(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally high,
  • and FIG. 5(b) is a graph showing the time-series change in the standard deviation of the eye closure rate calculated using the eye closure rate shown in FIG. 5(a).
  • FIG. 6 is a block diagram showing a functional configuration example of the driving evaluation device according to embodiment (1).
  • FIG. 7 is a flowchart showing an example of the processing operation performed by the control unit of the state determination device according to embodiment (1).
  • FIG. 8 is a block diagram showing a functional configuration example of the state determination device according to embodiment (2).
  • FIG. 9 shows (a) a graph in which the moving average line of the eye closure rate is superimposed on the time-series change of the eye closure rate shown in FIG. 4(a), and (b) a graph in which the moving average line is superimposed on the time-series change of the eye closure rate shown in FIG. 5(a).
  • Embodiments of a state determination device, an in-vehicle device, a driving evaluation system, a state determination method, and a program according to the present invention will be described below with reference to the drawings.
  • The state determination device, state determination method, and program according to the present invention can be applied widely to various uses in which the state of a person is determined.
  • The in-vehicle device provided with the state determination device according to the present invention, and the driving evaluation system including one or more such in-vehicle devices, evaluate, for example, the driving state of drivers of vehicles managed by a business operator, and can therefore be applied widely to devices or systems for improving drivers' safety awareness from the standpoint of preventive safety, that is, preventing accidents before they occur. [Application example]
  • FIG. 1 is a schematic view showing an example of an application scene of the state determination device according to the embodiment (1).
  • The state determination device 20 is mounted in the in-vehicle device 10. A driving evaluation system 1 that evaluates the driving of each driver 3 is constructed from the in-vehicle devices 10 mounted on one or more vehicles 2
  • and at least one driving evaluation device 4 that processes the data acquired from each in-vehicle device 10.
  • The in-vehicle device 10 includes the state determination device 20 and a camera 11 that captures images including the face of the driver 3.
  • The state determination device 20 is composed of a computer device that acquires the images captured by the camera 11 and executes various processes for determining the state of the driver 3.
  • The vehicle 2 on which the in-vehicle device 10 is mounted is not particularly limited. For example, it may be a vehicle managed by a business operator, such as a truck managed by a transportation company, a bus managed by a bus company, a taxi managed by a taxi company, a car-sharing vehicle managed by a car-sharing company, a rental car managed by a car rental company, a company-owned vehicle, or a vehicle leased from a car leasing company. The vehicle 2 may also be a general (privately owned) vehicle.
  • The driving evaluation device 4 may also be managed or operated by a safety evaluation or driver training institution, such as an insurance company or a driving school, and the system can likewise be applied to evaluating the driving of the driver 3 of each vehicle 2 at such institutions.
  • The driving evaluation device 4 acquires, for example, the driving behavior data of the driver 3 and the traveling behavior data of the vehicle 2 transmitted from the in-vehicle device 10, executes a driving evaluation process for each driver 3 based on the acquired data and predetermined evaluation conditions, and outputs the driving evaluation result to an external device, for example the operator terminal 6.
  • The driving evaluation device 4 is composed of one or more server computers including, for example, a communication unit 41, a control unit 42, and a storage unit 43.
  • The driving behavior data of the driver 3 includes, for example, data on at least one of the driver 3's face direction, gaze direction, eye opening/closing degree, eye closure rate, variation feature amount of the eye closure rate,
  • predetermined face-direction events, and the determination result of the inattentive state, all detected by processing the images captured by the camera 11.
  • The predetermined face-direction events include, for example, at least one of checking left and right when turning at an intersection, checking the direction of travel at an intersection, no face being detected, and looking aside.
  • the traveling behavior data of the vehicle 2 includes, for example, data of at least one of the acceleration, angular velocity, position, speed, and predetermined vehicle dynamic events of the vehicle 2 detected by the vehicle-mounted device 10.
  • Predetermined vehicle dynamic events include, for example, at least one of crossing an intersection, a switch in road type, stopping, sudden steering, sudden braking, and an impact on the vehicle 2.
  • the vehicle-mounted device 10 and the operation evaluation device 4 are configured to be able to communicate with each other via the communication network 5.
  • the communication network 5 may include a wireless communication network such as a mobile phone network (including base stations) or a wireless LAN (Local Area Network), a wired communication network such as a public telephone network, the Internet, or a telecommunication line such as a dedicated network.
  • the operator terminal 6 that manages the vehicle 2 is configured to be able to communicate with the driving evaluation device 4 via the communication network 5.
  • the operator terminal 6 may be a personal computer having a communication function, or a mobile information terminal such as a mobile phone, a smartphone, or a tablet device. Further, the operator terminal 6 may be configured to be able to communicate with the vehicle-mounted device 10 via the communication network 5.
  • the state determination device 20 mounted on the vehicle-mounted device 10 acquires an image captured at a predetermined frame rate from a camera 11 arranged so that the face of the driver 3 can be photographed.
  • the state determination device 20 processes the images acquired from the camera 11 in chronological order, detects the eye opening/closing degree of the driver 3 from the images (for example, every frame), and calculates the eye closure rate of the driver 3 using the eye opening/closing degrees detected during a first predetermined period (for example, a predetermined time of about 1 minute).
  • the state determination device 20 then calculates a variation feature amount of the eye closure rate (hereinafter also simply referred to as the variation feature amount) using the eye closure rates calculated over a second predetermined period (longer than the first predetermined period, for example, a predetermined time of about 10 to 15 minutes), and determines the inattentive state of the driver 3 (in other words, an inattentive driving state) based on the calculated variation feature amount.
  • the eye opening/closing degree is an index indicating how wide the eyes of the driver 3 are open. For example, the ratio of the vertical extent to the horizontal extent of the driver 3's eye extracted from the image (for example, the number of pixels in the vertical extent / the number of pixels in the horizontal extent) may be calculated as the eye opening/closing degree.
  • the eye closure rate is the proportion of time during which the eyes are closed. For example, the proportion of the eye opening/closing degrees detected in the first predetermined period that are at or below a predetermined threshold value may be calculated as the eye closure rate.
  • the predetermined threshold value is a value for determining whether or not the eyes are closed.
  • the variation feature amount of the eye closure rate is data indicating how the eye closure rate varies, calculated using the eye closure rates in the second predetermined period. It may be, for example, an index indicating the degree of variation of the eye closure rate in the second predetermined period (for example, the standard deviation), the amount of change or the rate of change (slope) of the eye closure rate in the second predetermined period, or the proportion of time during which the eye closure rate is rising in the second predetermined period.
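As an illustration only (the disclosure does not fix a formula), the kinds of variation feature amount named above could be computed from one second-predetermined-period window of eye closure rates as sketched below; the function name and the use of the sample index as the time axis are assumptions, not part of the disclosure.

```python
from statistics import pstdev

def variation_features(closure_rates):
    """Variation feature amounts of a window of eye closure rates (%):
    standard deviation, overall amount of change, least-squares slope
    (rate of change per sample), and the rising time ratio (fraction of
    steps in which the rate rose from the previous sample)."""
    n = len(closure_rates)
    mean = sum(closure_rates) / n
    xm = (n - 1) / 2  # mean of the sample indices 0..n-1
    slope = sum((i - xm) * (y - mean) for i, y in enumerate(closure_rates))
    slope /= sum((i - xm) ** 2 for i in range(n))
    rising = sum(1 for a, b in zip(closure_rates, closure_rates[1:]) if b > a)
    return {
        "std": pstdev(closure_rates),
        "change": closure_rates[-1] - closure_rates[0],
        "slope": slope,
        "rise_ratio": rising / (n - 1),
    }
```

A steadily rising window yields a large slope and a rising time ratio of 1.0, while a flat window yields zeros, matching the intent of detecting a drift toward the inattentive state rather than the absolute level.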
  • the inattentive state means a state in which concentration or attention is reduced due to psychological or physiological factors (in the case of the driver 3 of the vehicle 2, in particular, a state of carelessness toward the road ahead). In addition to states in which concentration or attention is reduced by drowsiness or fatigue, it may include a state of absent-mindedness caused by being lost in thought.
  • the inattentive state of the driver 3 is determined based not on the value of the eye closure rate itself but on the above-described variation feature amount of the eye closure rate.
  • therefore, even when the eye closure rate measured in the awake state is high due to factors such as individual differences in the eye closure rate of each driver 3 or differences in the installation position of the camera 11 (in other words, the image capturing position) in each vehicle 2, the inattentive state of each driver 3 can be determined accurately without being affected by these factors.
  • in the driving evaluation system 1 including one or more vehicle-mounted devices 10 and the driving evaluation device 4, by using the determination result of the inattentive state determined by the state determination device 20 of each vehicle-mounted device 10, each driver 3 can be evaluated fairly with respect to the inattentive state without being affected by individual differences in eye closure rate, so that a more appropriate driving evaluation can be performed.
  • FIG. 2 is a block diagram showing a hardware configuration example of the vehicle-mounted device 10 according to the embodiment (1).
  • the vehicle-mounted device 10 includes the state determination device 20 and the camera 11, and further includes an acceleration sensor 12, an angular velocity sensor 13, a GPS (Global Positioning System) receiving unit 14, a communication unit 15, and a notification unit 16.
  • the state determination device 20 includes a control unit 21, a storage unit 22, and an input / output interface (I / F) 23.
  • the control unit 21 is composed of a microcomputer including a CPU (Central Processing Unit), a RAM (Random Access Memory), and a ROM (Read Only Memory).
  • the control unit 21 performs a process of storing data acquired from the camera 11, the acceleration sensor 12, the angular velocity sensor 13, the GPS receiving unit 14, and the like in the storage unit 22. Further, the control unit 21 expands the program 221 stored in the storage unit 22 into the RAM, reads various detection data stored in the storage unit 22, and interprets and executes the program 221 expanded in the RAM with the CPU, thereby executing various processes, described later, for determining the state of the driver 3.
  • the storage unit 22 is composed of one or more storage devices such as a semiconductor memory.
  • the storage unit 22 may store image data acquired from the camera 11, data detected by the acceleration sensor 12, the angular velocity sensor 13, the GPS receiving unit 14, and the like in time series.
  • the program 221 may be stored in the ROM of the control unit 21.
  • the input / output I / F 23 includes an interface circuit, a connector, and the like for exchanging data and signals with devices such as the camera 11.
  • the camera 11 operates as an image pickup unit that captures an image including the driver 3's face, and includes, for example, a lens unit (not shown), an image sensor unit, a light irradiation unit, and a camera control unit that controls each of these units.
  • the image sensor unit includes, for example, an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor having sensitivity in the visible or near-infrared wavelength region, together with a filter, a microlens, and the like.
  • the light irradiation unit includes a light emitting element such as an LED (Light Emitting Diode), and may use a light emitting element that irradiates infrared rays so that the state of the driver can be imaged day and night.
  • the camera 11 may be a monocular camera or a stereo camera.
  • the camera control unit includes, for example, a processor and the like.
  • the camera control unit controls the operations of the image sensor unit and the light irradiation unit, for example, causing the light irradiation unit to emit light (for example, near-infrared light) and the image sensor unit to capture its reflected light.
  • the camera 11 captures an image at a predetermined frame rate (for example, 15 frames per second or more), and the data of the captured image is output to the state determination device 20.
  • the acceleration sensor 12 is a sensor that measures the acceleration of the vehicle 2, and is composed of, for example, a three-axis acceleration sensor that measures acceleration in three directions of the XYZ axes.
  • a two-axis or single-axis acceleration sensor may also be used as the acceleration sensor 12.
  • the acceleration data measured by the acceleration sensor 12 is stored in the storage unit 22 in association with, for example, the detection time (that is, in time series).
  • the angular velocity sensor 13 is a sensor that detects the rotational angular velocity of the vehicle 2, and detects at least the angular velocity about the vertical axis (yaw direction), that is, angular velocity data corresponding to the turning of the vehicle 2 in the left-right direction.
  • as the angular velocity sensor 13, a single-axis gyro sensor about the vertical axis, a two-axis gyro sensor that also detects the angular velocity about the left-right horizontal axis (pitch direction), or a three-axis gyro sensor that additionally detects the angular velocity about the front-rear horizontal axis (roll direction) may be used.
  • as these gyro sensors, an optical or mechanical gyro sensor may be used in addition to a vibration-type gyro sensor.
  • as for the detected rotation direction, for example, clockwise may be set as the negative direction and counterclockwise as the positive direction.
  • the angular velocity is detected in a predetermined cycle (for example, a cycle of several tens of ms), and the detected angular velocity data is stored in the storage unit 22 in association with the detection time, for example.
  • an inertial sensor in which the acceleration sensor and the gyro sensor are mounted in one package may also be used.
  • the GPS receiving unit 14 receives GPS signals (including time information) from satellites via the antenna 14a at a predetermined cycle (for example, every second), and detects position data (including latitude and longitude) of the current location of the vehicle 2.
  • the position data detected by the GPS receiving unit 14 is stored in the storage unit 22 in association with the detection time, for example.
  • instead of the GPS receiving unit 14, a receiving device compatible with another satellite positioning system may be used, or another position detecting device may be used.
  • the communication unit 15 includes a communication module that performs processing such as emitting radio waves from the antenna 15a, transmitting data to the driving evaluation device 4 via the communication network 5, and receiving radio waves from the outside via the antenna 15a. Further, the communication unit 15 may include a communication module that performs vehicle-to-vehicle communication or road-to-vehicle communication.
  • the notification unit 16 is configured to include a speaker that outputs a predetermined notification sound, voice, or the like based on a command from the state determination device 20.
  • the on-board unit 10 can have a compact configuration in which the state determination device 20, the camera 11, and the like are housed in one housing.
  • the location of the vehicle-mounted device 10 installed in the vehicle is not particularly limited as long as the camera 11 can capture at least the field of view including the driver's face.
  • the on-board unit 10 may be installed, for example, near the center of the dashboard of the vehicle 2, near the steering wheel column, near the instrument panel, near the rearview mirror, or at the A-pillar portion.
  • the camera 11 may be configured as a separate body in addition to the form that is integrally configured with the state determination device 20.
  • FIG. 3 is a block diagram showing a functional configuration example of the state determination device 20 according to the embodiment (1).
  • the control unit 21 of the state determination device 20 expands the program 221 stored in the storage unit 22 into the RAM, and interprets and executes the expanded program with the CPU, thereby operating as the image acquisition unit 30, the eye opening/closing degree detection unit 31, the eye closure rate calculation unit 32, the variation feature amount calculation unit 33, the inattentive state determination unit 34, and the output unit 35 shown in FIG. 3.
  • the image acquisition unit 30 performs a process of acquiring an image in which the face of the driver 3 is captured from a camera 11 arranged so as to be able to capture the face of the driver 3 in the vehicle 2.
  • the image acquisition unit 30 acquires, for example, an image of n frames per second (n is, for example, 15 or more).
  • the eye opening/closing degree detection unit 31 analyzes the images acquired by the image acquisition unit 30 and performs a process of detecting the eye opening/closing degree of the driver 3 from each image.
  • the eye opening/closing degree is an index indicating how wide the eyes are open; for example, the ratio of the vertical extent to the horizontal extent of the eye (for example, the number of pixels in the vertical extent / the number of pixels in the horizontal extent) may be used.
  • the eye opening/closing degree detection unit 31 may obtain the eye opening/closing degree of each of the left and right eyes, may use the average of the left and right eye opening/closing degrees, or may use the eye opening/closing degree of either the left or right eye. Further, only the vertical extent of the eye (for example, the number of pixels in the vertical extent) may be used as the eye opening/closing degree.
  • the eye closure rate calculation unit 32 calculates the eye closure rate of the driver 3 using the eye opening/closing degrees detected by the eye opening/closing degree detection unit 31 during the first predetermined period (for example, a predetermined time of about 1 minute). For example, the eye closure rate calculation unit 32 calculates the proportion of the eye opening/closing degrees detected in the first predetermined period that are at or below a predetermined threshold value indicating the closed-eye state. The eye closure rate calculation unit 32 performs this calculation every first predetermined period.
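The per-window calculation of the eye closure rate calculation unit 32 can be sketched as follows; the threshold value of 0.2 is a placeholder, since the disclosure leaves the closed-eye threshold to be set (for example, per driver).

```python
def eye_closure_rate(open_close_degrees, closed_threshold=0.2):
    """Eye closure rate (%) over one first predetermined period: the
    proportion of eye opening/closing degree samples at or below the
    closed-eye threshold."""
    closed = sum(1 for d in open_close_degrees if d <= closed_threshold)
    return 100.0 * closed / len(open_close_degrees)
```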
  • the variation feature amount calculation unit 33 calculates the variation feature amount of the eye closure rate using the eye closure rates for the second predetermined period (for example, a predetermined time of about 10 to 15 minutes) calculated by the eye closure rate calculation unit 32. For example, when the first predetermined period is 1 minute and the second predetermined period is 15 minutes, the variation feature amount calculation unit 33 calculates the variation feature amount of the eye closure rate using 15 minutes' worth of eye closure rates (15 data points).
  • after the second predetermined period has elapsed from the start of image acquisition, the variation feature amount calculation unit 33 calculates the variation feature amount using the eye closure rates of the most recent second predetermined period each time the first predetermined period elapses. That is, after the second predetermined period has elapsed, the variation feature amount calculation unit 33 calculates the variation feature amount every first predetermined period.
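The timing just described (one variation feature amount per first predetermined period, computed over the most recent second predetermined period) amounts to a sliding window; a minimal sketch, with the window length of 15 one-minute closure rates taken from the example values:

```python
from collections import deque

def most_recent_windows(closure_rate_stream, window=15):
    """Yield the most recent `window` eye closure rates each time a new
    first-period value arrives, starting once a full second predetermined
    period has accumulated."""
    buf = deque(maxlen=window)
    for rate in closure_rate_stream:
        buf.append(rate)
        if len(buf) == window:
            yield list(buf)
```

Each yielded window would then be passed to the variation feature amount calculation.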
  • the variation feature amount calculation unit 33 may calculate, for example, an index indicating the degree of variation of the eye closure rate during the second predetermined period as the variation feature amount.
  • the variation feature amount calculation unit 33 may calculate the standard deviation or the variance as the index indicating the degree of variation. Further, the variation feature amount calculation unit 33 may calculate the amount of change or the rate of change (slope) of the eye closure rate in the second predetermined period as the variation feature amount.
  • the variation feature amount calculation unit 33 may also calculate, as the variation feature amount, the rising time ratio, which is the proportion of time during which the eye closure rate is rising in the second predetermined period.
  • the variation feature amount calculation unit 33 may calculate two or more of the above types of variation feature amounts.
  • the inattentive state determination unit 34 performs a process of determining the inattentive state of the driver based on the variation feature amount calculated by the variation feature amount calculation unit 33. For example, when the standard deviation of the eye closure rate in the second predetermined period is calculated as the variation feature amount, it may be determined whether or not the standard deviation of the eye closure rate is equal to or greater than a predetermined threshold value indicating a transition to the inattentive state, or whether or not it has continuously exceeded the predetermined threshold value for a predetermined time.
  • when the amount of change or the rate of change of the eye closure rate in the second predetermined period is calculated as the variation feature amount, it may be determined whether or not it is equal to or greater than a predetermined threshold value (amount or rate of change) indicating a transition to the inattentive state.
  • when the rising time ratio of the eye closure rate in the second predetermined period is calculated as the variation feature amount, it may be determined whether or not the rising time ratio is equal to or greater than a predetermined threshold value (ratio) indicating a transition to the inattentive state.
  • the inattentive state determination unit 34 may combine determination processes based on two or more of the above types of variation feature amounts, or may combine a determination process based on the variation feature amount with a determination process based on the eye closure rate.
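One way to realize the "continuously exceeded for a predetermined time" condition above is to require the threshold to hold over several consecutive variation-feature windows; the threshold of 5.0 percentage points and the three-window persistence below are illustrative assumptions, not values from the disclosure.

```python
def is_inattentive(std_history, std_threshold=5.0, sustain=3):
    """Judge the inattentive state when the standard deviation of the eye
    closure rate has stayed at or above the threshold for `sustain`
    consecutive variation-feature windows."""
    if len(std_history) < sustain:
        return False
    return all(s >= std_threshold for s in std_history[-sustain:])
```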
  • the inattentive state determination unit 34 may determine the inattentive state of the driver 3 using a learner that has been trained to output a value indicating whether or not the driver 3 is in the inattentive state when the variation feature amount calculated by the variation feature amount calculation unit 33 is input.
  • alternatively, the learner may be trained to output a value indicating whether or not the driver 3 is in the inattentive state when both the variation feature amount calculated by the variation feature amount calculation unit 33 and the eye closure rate calculated by the eye closure rate calculation unit 32 are input.
  • the learner may be composed of, for example, a neural network having an input layer, one or more intermediate layers, and an output layer, a support vector machine, or the like. By using the learner, whether or not the driver 3 is in the inattentive state can be determined easily and appropriately.
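As a minimal stand-in for such a learner (far simpler than the neural network or support vector machine named above), a single-layer logistic model over the two inputs mentioned, the variation feature amount and the eye closure rate, could look like this; the training data, learning rate, and feature scaling are all assumptions for illustration.

```python
import math

def _sigmoid(z):
    # numerically safe logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def train(samples, labels, lr=0.1, epochs=1000):
    """Fit weights for inputs (closure-rate std, closure rate as a
    fraction) against inattentive/alert labels by stochastic gradient
    descent on the logistic loss."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = _sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of the log loss w.r.t. the pre-activation
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """1 = inattentive, 0 = alert."""
    return 1 if _sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0
```

In practice the learner of the disclosure would be trained on labeled driving data; the point of the sketch is only the input/output contract (feature vector in, inattentiveness decision out).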
  • the output unit 35 performs a process of outputting the determination result of the inattentive state determination unit 34.
  • the output unit 35 may perform a notification process according to the determination result via the notification unit 16, and may store the determination result in the storage unit 22 and perform a process of transmitting data including the stored determination result to the driving evaluation device 4 via the communication unit 15 at a predetermined timing.
  • FIG. 4(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally low, and FIG. 4(b) is an example of a graph showing the time-series change in the standard deviation of the eye closure rate calculated using the eye closure rate data shown in FIG. 4(a).
  • FIG. 5(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally high, and FIG. 5(b) is a graph showing the time-series change in the standard deviation, an example of the variation feature amount of the eye closure rate, calculated using the eye closure rate data shown in FIG. 5(a).
  • the threshold value indicated by the alternate long and short dash line in each figure indicates an example of a criterion for determining whether or not the driver is in the inattentive state. Comparing FIG. 4(a) and FIG. 5(a), it can be seen that there are individual differences in the eye closure rate in the normal (awake) state.
  • as shown in FIG. 4(a), the normal eye closure rate is detected to be low in some people, while as shown in FIG. 5(a), the normal eye closure rate is high in others.
  • these individual differences are influenced by, for example, differences in the spacing between the upper and lower eyelids among individuals, and differences in the positional relationship between the installed camera 11 and the face (distance, orientation, and the like).
  • as shown in FIG. 4(a), when the normal eye closure rate is low, whether or not the driver is in the inattentive state can be determined relatively accurately by threshold determination on the eye closure rate.
  • however, as shown in FIG. 5(a), when the normal eye closure rate is high, if whether or not the driver is in the inattentive state is determined by threshold determination on the eye closure rate (using the same threshold value as in FIG. 4(a)), the driver is erroneously determined to be in the inattentive state at all times.
  • therefore, the state determination device 20 calculates the variation feature amount of the eye closure rate (for example, the standard deviation of the eye closure rate) from the eye closure rate data for a predetermined period, and determines the transition to the inattentive state based on the calculated variation feature amount.
  • FIG. 6 is a block diagram showing a functional configuration example of the driving evaluation device 4 according to the embodiment (1).
  • the driving evaluation device 4 includes a communication unit 41, a control unit 42, and a storage unit 43, and these are connected via a communication bus 44.
  • the communication unit 41 is configured to include a communication device for realizing transmission / reception of various data and signals to / from the vehicle-mounted device 10 and the business terminal 6 via the communication network 5.
  • the control unit 42 is composed of a computer device including one or more processors that execute various arithmetic processes and a main memory storing a predetermined driving evaluation program and the like. As a functional configuration, the control unit 42 includes a driving evaluation unit 421 and an evaluation result output unit 422.
  • the driving evaluation unit 421 performs various driving evaluation processes, including evaluation of the inattentive state of the driver 3, based on data including the determination result of the inattentive state of the driver 3 determined by the state determination device 20 of the vehicle-mounted device 10.
  • the evaluation result output unit 422 outputs a driving evaluation result including an evaluation of the inattentive state of the driver 3 evaluated by the driving evaluation unit 421.
  • the evaluation result output unit 422 performs a process of transmitting the driving evaluation result to the operator terminal 6 via the communication unit 41, for example, in response to a request from the operator terminal 6.
  • the storage unit 43 is composed of one or more large-capacity storage devices such as hard disk drives and solid state drives, and includes a detection data storage unit 431, an evaluation condition storage unit 432, an evaluation result storage unit 433, and the like.
  • the detection data storage unit 431 stores the detection data acquired from the vehicle-mounted device 10 of each vehicle 2.
  • in the detection data storage unit 431, the driving behavior data of the driver 3 detected at the time when the inattentive state of the driver 3 was detected, and during predetermined times before and after that time, is accumulated in time series in association with, for example, the identification information of the vehicle-mounted device 10 or the identification information of the driver 3.
  • the traveling behavior data of the vehicle 2 detected during predetermined times before and after the inattentive state detection time may also be stored in the detection data storage unit 431 in time series.
  • the driving behavior data of the driver 3 includes, for example, data of at least one of the face orientation, line-of-sight direction, eye opening/closing degree, eye closure rate, and variation feature amount of the eye closure rate of the driver 3.
  • the traveling behavior data of the vehicle 2 includes data of at least one of the acceleration, the angular velocity, the position, and the speed of the vehicle 2.
  • in the detection data storage unit 431, the driving behavior data of the driver 3 and the traveling behavior data of the vehicle 2 detected at the time of passing through an intersection, and during predetermined times before and after that time, may be accumulated in time series in association with the identification information of the vehicle-mounted device 10 or the identification information of the driver 3.
  • the evaluation condition storage unit 432 stores at least one evaluation condition for the safety confirmation operation to be performed by the driver 3 at an intersection or the like.
  • the evaluation result storage unit 433 stores the driving evaluation result including the evaluation of the inattentive state of the driver 3 evaluated by the driving evaluation unit 421.
  • FIG. 7 is a flowchart showing an example of processing operation performed by the control unit 21 of the state determination device 20 according to the embodiment (1).
  • FIG. 7 shows an example in which the control unit 21 operates as an image acquisition unit 30, an eye opening / closing degree detection unit 31, and an eye closure rate calculation unit 32.
  • in step S1, the control unit 21 operates as the image acquisition unit 30, performs a process of acquiring an image captured by the camera 11, and proceeds to step S2.
  • the control unit 21 acquires, for example, an image of n frames per second (for example, n is 15 or more).
  • the acquired image is stored in, for example, the storage unit 22.
  • in step S2, the control unit 21 operates as the eye opening/closing degree detection unit 31, performs a process of detecting the eye opening/closing degree of the driver 3 from the image acquired in step S1, and proceeds to step S3.
  • for example, the control unit 21 detects the face region of the driver 3 from the acquired image, detects facial organs such as the eyes, nose, and mouth from the detected face region, detects the vertical and horizontal extents of the eye from the detected eye region, and detects the ratio of the detected vertical extent to the horizontal extent (for example, the number of vertical pixels / the number of horizontal pixels) as the eye opening/closing degree.
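As a sketch of the ratio computation above, given eye-contour points from a hypothetical facial-organ detector (the disclosure does not fix a landmark format, so a list of (x, y) pixel coordinates around one eye is assumed):

```python
def eye_open_close_degree(eye_points):
    """Eye opening/closing degree: vertical extent / horizontal extent
    (in pixels) of the eye region delimited by its contour points."""
    xs = [x for x, _ in eye_points]
    ys = [y for _, y in eye_points]
    return (max(ys) - min(ys)) / (max(xs) - min(xs))
```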
  • the degree of opening and closing of the eyes detected from each image may be stored in the storage unit 22 in association with data such as the image acquisition time.
  • the control unit 21 may detect the face orientation, the line-of-sight direction, and the like together with the eye opening/closing degree using known methods. Further, based on the face orientation of the driver 3 detected from the image, the eye opening/closing degree detection unit 31 may calculate an eye opening/closing degree in which the ratio of the detected vertical extent to the horizontal extent of the eye is corrected to the value that would be obtained when the face of the driver 3 is viewed from the front. For example, the method described in Japanese Patent No. 4957711 can be applied as the method for correcting the eye opening/closing degree.
  • in step S3, the control unit 21 determines whether or not the eye opening/closing degrees for the first predetermined period have been detected. If it determines that they have not been detected, the process returns to step S1, whereas if it determines that they have been detected, the process proceeds to step S4.
  • the first predetermined period is a period for acquiring the required number of data for the degree of eye opening / closing used for calculating the eye closure rate, and for example, a predetermined time of about 1 minute can be set.
  • in step S4, the control unit 21 operates as the eye closure rate calculation unit 32, performs a process of calculating the eye closure rate of the driver 3 using the eye opening/closing degrees detected in the first predetermined period, and proceeds to step S5.
  • for example, the control unit 21 calculates the proportion of data at or below the predetermined threshold value indicating the closed-eye state among the eye opening/closing degrees detected in the first predetermined period ([number of eye opening/closing degree data at or below the threshold value / number of eye opening/closing degree data in the first predetermined period] × 100 (%)).
  • the eye closure rate calculated for each first predetermined period may be stored in the storage unit 22 in association with data such as the elapsed time of the first predetermined period, for example.
  • the eye closure rate calculation unit 32 may set the predetermined threshold value indicating the closed-eye state based on the line-of-sight direction of the driver 3 detected from the image. For example, the method described in Japanese Patent No. 4915413 can be applied as the method for setting the predetermined threshold value.
  • in step S5, the control unit 21 determines whether or not the eye closure rates for the second predetermined period have been calculated. If it determines that they have not been calculated, the process returns to step S1, whereas if it determines that they have been calculated, the process proceeds to step S6.
  • the second predetermined period is a period for acquiring the required number of data for the eye closure rate used for calculating the variable feature amount, and for example, a predetermined time of about 15 minutes can be set.
  • since the eye closure rate is calculated every time the first predetermined period elapses, the control unit 21 determines whether or not the eye closure rates for the most recent second predetermined period have been calculated.
  • in step S6, the control unit 21 reads the eye closure rate data calculated in the second predetermined period from the storage unit 22, and then returns to step S1 to repeat the eye opening/closing degree detection process and the eye closure rate calculation process.
  • FIG. 8 is a flowchart showing an example of processing operation performed by the control unit 21 of the state determination device 20 according to the embodiment (1). The processing operation shown in FIG. 8 is executed following the processing in step S6 shown in FIG. 7.
  • FIG. 8 shows an example in which the control unit 21 operates as the fluctuation feature amount calculation unit 33, the inattentive state determination unit 34, and the output unit 35.
  • in step S7, the control unit 21 operates as the fluctuation feature amount calculation unit 33, performs a process of calculating the fluctuation feature amount of the eye closure rate using the eye closure rate data for the second predetermined period read out in step S6, and proceeds to step S8. For example, when the first predetermined period is set to 1 minute and the second predetermined period is set to 15 minutes, the control unit 21 uses 15 minutes' worth of eye closure rate data calculated every minute (15 data points) to calculate the fluctuation feature amount of the eye closure rate.
  • the fluctuation feature amount to be calculated may be the standard deviation of the eye closure rate in the second predetermined period, the change amount or the rate of change (slope) of the eye closure rate in the second predetermined period, or the rate of increase of the eye closure rate over the second predetermined period, and at least one of these fluctuation feature amounts may be calculated.
  • the calculated fluctuation feature amount may be stored in the storage unit 22 in association with data such as the elapsed time of the second predetermined period, for example.
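As a rough sketch of step S7, the candidate fluctuation feature amounts named above (standard deviation, change amount, rate of change, rate of increase) could be computed from the 15 one-minute eye closure rates as follows; the dictionary keys and the definition of the increase rate as the share of rising one-minute steps are illustrative assumptions, not definitions from the specification.

```python
import statistics

def fluctuation_features(closure_rates, minutes_per_sample=1):
    """Fluctuation feature amounts over one second predetermined period,
    e.g. 15 eye closure rates (%) calculated every minute."""
    n = len(closure_rates)
    std_dev = statistics.pstdev(closure_rates)       # degree of variation
    change = closure_rates[-1] - closure_rates[0]    # change amount (%)
    slope = change / ((n - 1) * minutes_per_sample)  # rate of change (%/min)
    # share of one-minute steps in which the eye closure rate increased
    rises = sum(1 for a, b in zip(closure_rates, closure_rates[1:]) if b > a)
    return {"std": std_dev, "change": change,
            "slope": slope, "increase_rate": rises / (n - 1)}

features = fluctuation_features([10, 12, 11, 13, 15])  # hypothetical rates in %
```

Each returned value corresponds to one of the candidate feature amounts; any subset of them can be stored and used in the subsequent determination step.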
  • in step S8, the control unit 21 operates as the inattentive state determination unit 34, performs a process of determining the inattentive state of the driver 3 (in other words, an inattentive driving state) based on the fluctuation feature amount of the eye closure rate calculated in step S7, and proceeds to step S9.
  • when the standard deviation of the eye closure rate is calculated as the fluctuation feature amount, it may be determined whether or not the standard deviation of the eye closure rate is equal to or greater than a predetermined threshold value for determining the transition to the inattentive state.
  • when the change amount or the rate of change of the eye closure rate is calculated as the fluctuation feature amount, it may be determined whether or not the change amount or rate of change is equal to or greater than a predetermined threshold value (change amount or rate of change) for determining the transition to the inattentive state, or whether or not the change amount or rate of change of the eye closure rate continuously exceeds the predetermined threshold value for a predetermined period (for example, a certain period of time).
  • when the rate of increase of the eye closure rate in the second predetermined period is calculated as the fluctuation feature amount, it may be determined whether or not the rate of increase is equal to or greater than a predetermined threshold value (ratio) indicating a transition to the inattentive state, or whether or not the rate of increase continuously exceeds the predetermined threshold value for a predetermined period (for example, a certain period of time). In this case, if the rate of increase of the eye closure rate is equal to or greater than the predetermined threshold value, or exceeds it continuously for the predetermined period, it is determined that the driver is in the inattentive state.
  • the control unit 21 may determine the inattentive state of the driver 3 based on two or more of the above fluctuation feature amounts, or may determine that the driver 3 is in the inattentive state by combining the determination processing based on the above fluctuation feature amounts with determination processing based on the eye closure rate itself.
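The threshold comparisons described above might look like the following minimal sketch; both threshold values are illustrative placeholders, since the specification does not give concrete numbers.

```python
def is_inattentive(features, std_threshold=8.0, slope_threshold=1.0):
    """Determine the inattentive state when any fluctuation feature amount
    reaches its threshold (values here are placeholders, not from the spec)."""
    return (features.get("std", 0.0) >= std_threshold
            or features.get("slope", 0.0) >= slope_threshold)
```

As the text notes, two or more feature amounts (or an additional eye-closure-rate check) can be combined simply by adding further comparisons to the condition.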
  • in step S9, the control unit 21 determines whether or not the driver 3 is in the inattentive state as a result of the determination process in step S8, and if it is determined that the driver 3 is in the inattentive state, the process proceeds to step S10.
  • in step S10, the control unit 21 operates as the output unit 35, performs a notification process for shifting the driver 3 from the inattentive state to the awake state, for example, operating the notification unit 16 to output a warning sound or a predetermined announcement, and proceeds to step S11.
  • in step S11, the control unit 21 operates as the output unit 35, performs a process of storing the result of the inattentive state determination (inattentive state detection result) in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the processing.
  • the control unit 21 may transmit the data including the inattentive state detection result stored in the storage unit 22 to the driving evaluation device 4 at a predetermined timing.
  • the inattentive state of the driver 3 is thus determined based on the above-described fluctuation feature amount of the eye closure rate, not on the value of the eye closure rate itself.
  • therefore, even when the value of the eye closure rate is measured higher in the awake state due to factors such as individual differences in the eye closure rate of each driver 3 or differences in the installation position of the camera 11 (in other words, the image capturing position) in each vehicle 2, the inattentive state of each driver 3, that is, the inattentive driving state, can be determined accurately without being affected by these factors.
  • if the inattentive state were determined from the value of the eye closure rate itself, the user would have to set a threshold value for the eye closure rate for each driver, or each time the vehicle starts running. In the above embodiment, such complicated operations as threshold setting are unnecessary and take no time and effort, so that the convenience for the user can be improved.
  • when the standard deviation of the eye closure rate in the second predetermined period is used as the fluctuation feature amount, changes in the degree of variation of the eye closure rate (as measured by an index such as the standard deviation or variance) are taken into account in determining the inattentive state of the driver 3. Further, when the change amount or the rate of change of the eye closure rate during the second predetermined period is used as the fluctuation feature amount, the magnitude of the change in the eye closure rate is taken into consideration. When the rate of increase of the eye closure rate in the second predetermined period is used as the fluctuation feature amount, the tendency of the eye closure rate to increase with time is taken into consideration. Therefore, by determining the inattentive state of the driver 3 based on these fluctuation feature amounts, the timing of the transition to the inattentive state can be determined accurately without being affected by the value of the eye closure rate itself, even for a driver whose eye closure rate is measured higher in the awake state.
  • this enables the driving evaluation device 4 to evaluate the inattentive state of each driver 3 fairly and uniformly, without being affected by individual differences in the eye closure rate of each driver 3, and to output a more appropriate driving evaluation result to the business operator terminal 6.
  • the business operator who manages the vehicle 2 can appropriately provide safe driving education to the driver 3 by using the driving evaluation result displayed on the business operator terminal 6.
  • FIG. 9 is a block diagram showing a functional configuration example of the state determination device 20A according to the embodiment (2).
  • the functional configuration of the state determination device 20A according to the embodiment (2) is substantially the same as that of the state determination device 20 shown in FIG. 3, except that the preprocessing unit 36 is provided in front of the fluctuation feature amount calculation unit 33A; configurations having the same functions are therefore given the same reference numerals, and their description is omitted.
  • the preprocessing unit 36 performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32.
  • the preprocessing unit 36 may perform, as the preprocessing, a smoothing process on the eye closure rate calculated by the eye closure rate calculation unit 32.
  • as the smoothing process, the preprocessing unit 36 may perform, for example, a process of calculating the moving average of the eye closure rate at predetermined intervals.
  • the predetermined interval may be, for example, an interval of the first predetermined period × m (where m is an integer of 2 or more).
  • the fluctuation feature amount calculation unit 33A performs a process of calculating the fluctuation feature amount using the eye closure rate after preprocessing by the preprocessing unit 36.
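The moving-average smoothing performed by the preprocessing unit 36 can be sketched as follows, with the window corresponding to the first predetermined period × m; this is a generic sliding-window mean written for illustration, not code from the specification.

```python
def moving_average(values, window):
    """Moving average of eye closure rates; window = m consecutive
    first-predetermined-period samples (m >= 2)."""
    if window < 2 or window > len(values):
        raise ValueError("window must be between 2 and len(values)")
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]
```

The smoothed sequence returned here is what the fluctuation feature amount calculation unit 33A would consume in place of the raw eye closure rates.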
  • FIG. 10 (a) is a graph in which the moving average line of the eye closure rate is superimposed on the graph showing the time-series change of the eye closure rate of the driver having a low eye closure rate in normal times shown in FIG. 4 (a).
  • FIG. 10B is a graph in which the moving average line of the eye closure rate is superimposed on the graph showing the time-series change of the eye closure rate of the driver having a high eye closure rate in normal times shown in FIG. 5A.
  • the thick line shows the moving average of the eye closure rate.
  • FIG. 11 is a flowchart showing an example of processing operation performed by the control unit 21A of the state determination device 20A according to the embodiment (2). The processing operation shown in FIG. 11 is executed following the processing in step S6 of FIG. Since the processing operations up to step S6 in FIG. 7 are the same, the description thereof will be omitted.
  • the control unit 21A operates as the preprocessing unit 36, the fluctuation feature amount calculation unit 33A, the inattentive state determination unit 34, and the output unit 35. Processing contents that are the same as those shown in FIG. 8 are given the same step numbers, and their description is omitted.
  • in step S21, the control unit 21A determines whether or not to execute the smoothing process of the eye closure rate based on the type of fluctuation feature amount used for determining the inattentive state. For example, when the type of fluctuation feature amount used is the change amount or rate of change of the eye closure rate, or the rate of increase of the eye closure rate, it is determined that the smoothing process is to be executed, and the process proceeds to step S22.
  • in step S22, the control unit 21A operates as the preprocessing unit 36, performs smoothing processing on the eye closure rate data for the second predetermined period read out in step S6, and proceeds to step S23.
  • the control unit 21A performs, for example, a process of calculating the average value of the eye closure rate for each fixed period in the second predetermined period while shifting the interval, that is, a process of obtaining a moving average.
  • in step S23, the control unit 21A operates as the fluctuation feature amount calculation unit 33A, performs a process of calculating the fluctuation feature amount of the eye closure rate using the eye closure rate smoothed in step S22, and proceeds to step S24.
  • the fluctuation feature amount to be calculated may be, for example, the change amount or rate of change (slope) of the smoothed eye closure rate, or the rate of increase of the smoothed eye closure rate, and at least one of these fluctuation feature amounts may be calculated.
  • the calculated fluctuation feature amount may be stored in the storage unit 22 in association with data such as the elapsed time of the second predetermined period, for example.
  • in step S24, the control unit 21A operates as the inattentive state determination unit 34, performs a process of determining the inattentive state of the driver based on the fluctuation feature amount (change amount, rate of change, or rate of increase) of the eye closure rate calculated in step S23, and proceeds to step S9.
  • when the change amount or rate of change of the smoothed eye closure rate is calculated as the fluctuation feature amount in step S23, it may be determined in step S24 whether or not the change amount or rate of change of the smoothed eye closure rate is equal to or greater than a predetermined threshold value (change amount or rate of change) for determining the transition to the inattentive state. In this case, if the change amount or rate of change of the smoothed eye closure rate is equal to or greater than the predetermined threshold value, it is determined that the driver is in the inattentive state.
  • when the rate of increase of the smoothed eye closure rate is calculated as the fluctuation feature amount in step S23, it may be determined whether or not the rate of increase of the smoothed eye closure rate exceeds a predetermined threshold value (ratio) indicating a transition to the inattentive state. In this case, if the rate of increase of the smoothed eye closure rate is equal to or greater than the predetermined threshold value, it is determined that the driver is in the inattentive state.
  • when the type of fluctuation feature amount used for determining the inattentive state is, for example, the standard deviation of the eye closure rate, the control unit 21A determines in step S21 that the smoothing process of the eye closure rate is not to be executed, and the process proceeds to step S25.
  • in step S25, the control unit 21A operates as the fluctuation feature amount calculation unit 33A, performs a process of calculating the fluctuation feature amount of the eye closure rate using the eye closure rate that has not been smoothed, and proceeds to step S26.
  • the fluctuation feature amount to be calculated is, for example, the standard deviation of the eye closure rate, and the calculated fluctuation feature amount may be stored in the storage unit 22 in association with data such as the elapsed time of the second predetermined period.
  • in step S26, the control unit 21A performs a process of determining the inattentive state of the driver based on the fluctuation feature amount of the eye closure rate (standard deviation of the eye closure rate) calculated in step S25, and proceeds to step S9. Since the processes of steps S9 to S11 are the same as those of steps S9 to S11 in FIG. 8, their description is omitted here. According to the in-vehicle device 10A provided with the state determination device 20A according to the embodiment (2), whether or not to execute the smoothing process of the eye closure rate is determined based on the type of fluctuation feature amount used for determining the inattentive state.
  • when it is determined that the fluctuation feature amount is the change amount, the rate of change, or the rate of increase of the eye closure rate, the preprocessing unit 36 performs a smoothing process as preprocessing on the eye closure rate, and the change amount, rate of change, or rate of increase is calculated using the smoothed eye closure rate. On the other hand, the standard deviation of the eye closure rate is calculated using the eye closure rate that has not been smoothed, so that the standard deviation can be calculated as a feature amount from which the tendency to shift to the inattentive state is easy to grasp.
  • FIG. 12 is a block diagram showing a functional configuration example of the state determination device 20B according to the embodiment (3).
  • the functional configuration of the state determination device 20B according to the embodiment (3) is substantially the same as that of the state determination device 20 shown in FIG. 3, except that it further includes the vehicle dynamic data acquisition unit 37, the event detection unit 38, and the preprocessing unit 36A; configurations having the same function are designated by the same reference numerals, and their description is omitted.
  • the vehicle dynamic data acquisition unit 37 performs a process of acquiring at least one of the acceleration data of the vehicle 2 detected by the acceleration sensor 12 of the in-vehicle device 10, the angular velocity data of the vehicle 2 detected by the angular velocity sensor 13, and the position data of the vehicle 2 detected by the GPS receiving unit 14, and sends the acquired dynamic data to the event detection unit 38.
  • the event detection unit 38 includes a vehicle dynamics detection unit 381 and a face-facing event detection unit 382.
  • the vehicle dynamics detection unit 381 detects a predetermined event based on the dynamic data of the vehicle 2 acquired from the vehicle dynamic data acquisition unit 37.
  • the face-facing event detection unit 382 detects a face orientation event of the driver 3 based on the image data acquired from the image acquisition unit 30. The face-facing event detection unit 382 may also detect the face orientation event of the driver 3 based on both the image data acquired from the image acquisition unit 30 and the dynamic data of the vehicle 2 acquired from the vehicle dynamic data acquisition unit 37.
  • the vehicle dynamics detection unit 381 detects, as the predetermined events, an event in which the type of road on which the vehicle 2 travels switches, an event in which the vehicle 2 stops, an event in which the vehicle 2 starts traveling, an event in which sudden steering occurs in the vehicle 2, an event in which sudden braking occurs in the vehicle 2, and an event in which an impact occurs in the vehicle 2.
  • the face-facing event detection unit 382 detects an event in which the face orientation of the driver 3 changes while the vehicle 2 is being driven. Detected face orientation events include, for example, at least one of right/left turn confirmation at an intersection, traveling direction confirmation at an intersection, a face-undetected state, and looking aside.
  • the preprocessing unit 36A performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32.
  • as the preprocessing, the preprocessing unit 36A performs, for example, a process of removing, from the data used for calculating the fluctuation feature amount, the eye closure rate calculated at the time of, or during (for example, in the first predetermined period including), a predetermined event detected by the event detection unit 38, among the eye closure rates of the second predetermined period calculated by the eye closure rate calculation unit 32.
  • FIG. 13 is a flowchart showing a processing operation performed by the control unit 21B of the state determination device 20B according to the embodiment (3).
  • FIG. 13 shows an example in which the control unit 21B operates as the image acquisition unit 30, the eye opening/closing degree detection unit 31, the eye closure rate calculation unit 32, the vehicle dynamic data acquisition unit 37, the event detection unit 38, and the preprocessing unit 36A. Processing contents that are the same as those shown in FIG. 7 are given the same step numbers, and their description is omitted here.
  • in step S31, the control unit 21B operates as the vehicle dynamic data acquisition unit 37, performs a process of acquiring the dynamic data of the vehicle 2, and proceeds to step S32.
  • the control unit 21B acquires, for example, dynamic data of at least one of the acceleration data detected by the acceleration sensor 12, the angular velocity data detected by the angular velocity sensor 13, and the position data detected by the GPS receiving unit 14, at predetermined intervals (every several tens of milliseconds, or every several seconds).
  • in step S32, the control unit 21B operates as the vehicle dynamics detection unit 381, performs a process of detecting whether or not the vehicle 2 is passing through an intersection (for example, making a right or left turn) using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S33.
  • to detect whether or not the vehicle 2 is passing through an intersection, for example, it may be detected whether or not the absolute value of the angular velocity detected by the angular velocity sensor 13 exceeds a predetermined angular velocity threshold value, and the time at which the angular velocity threshold value is exceeded may be detected as the time of passing through the intersection.
  • a condition that the vehicle speed is equal to or less than a predetermined speed suitable for passing through the intersection may be added.
  • the data detected while passing through the intersection may be stored in the storage unit 22 in association with data such as the passing time and the position of the intersection, for example.
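A minimal version of the intersection-passage check in step S32, combining the angular velocity threshold with the optional vehicle-speed condition, might look as follows; both threshold values are illustrative placeholders, not values from the specification.

```python
def passing_intersection(yaw_rate_dps, speed_kmh,
                         yaw_threshold=10.0, max_speed_kmh=30.0):
    """Treat strong turning (|yaw rate| above a threshold) at low speed as
    passing through an intersection. Thresholds are placeholders."""
    return abs(yaw_rate_dps) > yaw_threshold and speed_kmh <= max_speed_kmh
```

The speed condition corresponds to the optional check that the vehicle speed is at or below a speed suitable for passing through an intersection.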
  • in step S33, the control unit 21B operates as the vehicle dynamics detection unit 381, performs a process of detecting switching of the type of road on which the vehicle 2 is traveling using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S34.
  • the control unit 21B may calculate the vehicle speed from the moving distance per unit time using the position data acquired in step S31, and detect the switching of the road type based on the calculated vehicle speed.
  • a switch from a general road to an expressway may be detected.
  • the control unit 21B may also detect, based on the vehicle speed, a switch from an expressway to a general road, a switch from a general road to a residential road (for example, a road having a speed limit of 30 km/h or less), or a switch from a residential road to a general road.
  • the control unit 21B may detect the switching of the road type by collating the road map data (including the road type data) with the position data. Further, the control unit 21B may detect the switching of road types based on a communication signal (for example, an approach signal to an expressway or an exit signal from an expressway) acquired by road-to-vehicle communication or the like.
  • the data in which the switching of the road type is detected may be stored in the storage unit 22 in association with data such as the position data and the detection time.
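The speed-based road-type switching of step S33 could be approximated as below; the speed bands (expressway at 80 km/h and above, residential at 30 km/h and below) are illustrative assumptions, with only the 30 km/h residential limit suggested by the text.

```python
def classify_road(speeds_kmh):
    """Rough road-type guess from a window of recent vehicle speeds
    (the speed bands are illustrative, not from the specification)."""
    avg = sum(speeds_kmh) / len(speeds_kmh)
    if avg >= 80:
        return "expressway"
    if avg <= 30:
        return "residential"
    return "general"

def road_switch(prev_speeds, curr_speeds):
    """Return (before, after) when the road classification changes, else None."""
    before, after = classify_road(prev_speeds), classify_road(curr_speeds)
    return (before, after) if before != after else None
```

In practice, as the text notes, map matching or road-to-vehicle communication signals can replace or refine this purely speed-based heuristic.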
  • in step S34, the control unit 21B operates as the vehicle dynamics detection unit 381, performs a process of detecting a stop of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S35.
  • the stop of the vehicle 2 may be detected, for example, when a state in which the position data acquired in step S31 does not change is detected continuously for a predetermined time (about several seconds to several tens of seconds), or when a state in which the vehicle speed obtained from the position data is equal to or lower than a predetermined value (for example, slow speed) is detected continuously for the predetermined time.
  • the data obtained by detecting the stop of the vehicle 2 may be stored in the storage unit 22 in association with data such as the acquisition time of the position data.
  • in step S35, the control unit 21B operates as the vehicle dynamics detection unit 381, performs a process of detecting the start of traveling of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S36.
  • the start of traveling of the vehicle 2 may be detected, for example, when a change in the position data is detected continuously for a predetermined time (about several seconds to several tens of seconds) from a state in which the position data had not been changing, or when it is detected that the vehicle speed obtained from the position data is equal to or higher than a predetermined value.
  • the data in which the start of traveling of the vehicle 2 is detected may be stored in the storage unit 22 in association with data such as the time at which the start of traveling is detected, for example.
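Steps S34 and S35 derive stop/start events from position data; the following sketch uses planar (x, y) positions in metres, with the hold time and speed epsilon as illustrative values rather than ones defined in the specification.

```python
import math

def speeds_from_positions(positions, dt_s):
    """Approximate speeds (m/s) from successive (x, y) positions in metres
    sampled every dt_s seconds."""
    return [math.hypot(x2 - x1, y2 - y1) / dt_s
            for (x1, y1), (x2, y2) in zip(positions, positions[1:])]

def vehicle_stopped(positions, dt_s, hold_s=5.0, speed_eps=0.3):
    """Stopped when the derived speed stays below a small threshold for at
    least hold_s seconds (both values are illustrative placeholders)."""
    needed = max(1, int(hold_s / dt_s))
    run = 0
    for v in speeds_from_positions(positions, dt_s):
        run = run + 1 if v <= speed_eps else 0
        if run >= needed:
            return True
    return False
```

A start-of-traveling check is the mirror image: the derived speed rises above the threshold continuously after a stopped interval.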
  • in step S36, the control unit 21B operates as the vehicle dynamics detection unit 381, performs a process of detecting sudden steering or sudden braking of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S37.
  • sudden steering of the vehicle 2 may be detected, for example, when the angular velocity (including the yaw angular velocity) or the acceleration (including the left-right acceleration) acquired in step S31 is equal to or higher than a predetermined threshold value for determining the occurrence of sudden steering, on the assumption that sudden steering has occurred in the vehicle 2.
  • sudden braking (sudden deceleration) of the vehicle 2 may be detected, for example, when an acceleration equal to or higher than a predetermined threshold value for determining the occurrence of sudden braking is detected using the acceleration data (including the front-rear acceleration) acquired in step S31, on the assumption that sudden braking has occurred in the vehicle 2.
  • the data in which sudden steering or sudden braking of the vehicle 2 is detected may be stored in the storage unit 22 in association with data such as the time at which the sudden steering or sudden braking is detected, for example.
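The sudden-steering / sudden-braking checks of step S36 reduce to threshold comparisons on one sensor sample; the g and deg/s thresholds below are illustrative placeholders, not values from the specification.

```python
def detect_sudden_events(accel_long_g, accel_lat_g, yaw_rate_dps,
                         brake_g=-0.4, steer_lat_g=0.4, steer_yaw_dps=25.0):
    """Flag sudden braking / sudden steering from one sample of front-rear
    acceleration, left-right acceleration and yaw rate (placeholder limits)."""
    events = []
    if accel_long_g <= brake_g:  # strong deceleration
        events.append("sudden_brake")
    if abs(accel_lat_g) >= steer_lat_g or abs(yaw_rate_dps) >= steer_yaw_dps:
        events.append("sudden_steering")
    return events
```

An impact check (step S37) follows the same pattern with a much higher acceleration threshold.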
  • step S37 the control unit 21B operates as the vehicle dynamics detection unit 381, performs a process of detecting the impact of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to the process in step S38.
  • an impact on the vehicle 2 may be detected, for example, when an acceleration equal to or higher than a predetermined threshold value for determining the occurrence of an impact due to a collision or the like is detected using the acceleration data (including the front-rear or left-right acceleration) acquired in step S31, on the assumption that an impact has occurred on the vehicle 2.
  • the data obtained by detecting the impact of the vehicle 2 may be stored in the storage unit 22 in association with data such as the time when the impact is detected.
  • in step S38, the control unit 21B determines whether or not the eye closure rate has been calculated in step S4, which is performed in parallel; if it is determined that the eye closure rate has not been calculated, the process returns to step S31, whereas if it is determined that the eye closure rate has been calculated, the process proceeds to step S50. The processes of steps S41 to S43 are executed in parallel with the processes of steps S31 to S38 described above.
  • in step S41, the control unit 21B processes the image acquired from the camera 11 in step S1, performs a process of detecting the orientation of the face of the driver 3 from the image, and proceeds to step S42.
  • the method of detecting the orientation of the driver 3's face from the image is not particularly limited.
  • for example, the control unit 21B detects the position or shape of each facial organ such as the eyes, nose, mouth, and eyebrows from the face region in the image, and may detect the orientation of the face based on the detected position or shape of each facial organ.
  • the orientation of the face of the driver 3 detected by the control unit 21B may be indicated by, for example, a pitch angle, which is the angle around the X axis (left-right axis) of the face (up-down orientation), a yaw angle, which is the angle around the Y axis (up-down axis) of the face (left-right orientation), and a roll angle, which is the angle around the Z axis (front-back axis) of the face (left-right tilt), and includes at least the yaw angle indicating the left-right orientation. These angles can be indicated with respect to a predetermined reference direction; for example, the reference direction may be set to the front direction of the driver.
  • the detected face orientation data may be stored in the storage unit 22 in association with data such as an image acquisition time or a frame number, for example.
  • in step S42, the control unit 21B operates as the face-facing event detection unit 382, performs a process of detecting a face orientation event of the driver 3, and proceeds to step S43.
  • in detecting the face orientation event of the driver 3, a face orientation event such as a face-undetected state or a looking-aside state may be detected based on the result of the face orientation detection in step S41, for example.
  • the control unit 21B may also detect an event such as a safety confirmation operation when passing through an intersection (an operation such as left-right confirmation or traveling direction confirmation), based on the result of the face orientation detection in step S41 and the detection result in step S32 of whether or not the vehicle is passing through an intersection.
  • the detected face orientation event data may be stored in the storage unit 22 in association with data such as the image acquisition time, the intersection passage time, or the position.
  • in step S43, the control unit 21B determines whether or not the eye closure rate has been calculated in step S4, which is performed in parallel; if it is determined that the eye closure rate has not been calculated, the process returns to step S41, whereas if it is determined that the eye closure rate has been calculated, the process proceeds to step S50.
  • step S50 the control unit 21B performs preprocessing on the eye closure rate calculated in step S4, and proceeds to step S5. The content of the preprocessing in step S50 will be described later.
  • in step S5, the control unit 21B determines whether or not the eye closure rate for the second predetermined period has been calculated; if it determines that the eye closure rate for the second predetermined period has not been calculated, the process returns to step S1 and is repeated, whereas if it is determined that the eye closure rate for the second predetermined period has been calculated, the process proceeds to step S6.
  • step S6 the control unit 21B reads the preprocessed eye closure rate data for the second predetermined period from the storage unit 22, and then returns to step S1 to repeat the eye opening / closing degree detection process and the eye closure rate calculation process.
  • FIG. 14 is a flowchart showing an example of the processing operation performed by the control unit 21B of the state determination device 20B according to the embodiment (3).
  • FIG. 14 shows an operation example of the preprocessing of the eye closure rate in step S50 shown in FIG. 13.
  • The processing procedure described below is only an example and may be changed as appropriate; steps may be omitted, replaced, or added depending on the embodiment.
  • In step S51, the control unit 21B determines whether or not a switch of the road type was detected in step S33 of FIG. 13 during the first predetermined period. If it is determined that a road-type switch was detected in the first predetermined period, the process proceeds to step S52.
  • In step S52, the control unit 21B removes the eye closure rates calculated before the road-type switch was detected from the target data used for calculating the variation feature amount, and then ends the preprocessing. On the other hand, if it is determined in step S51 that no road-type switch was detected in the first predetermined period, the process proceeds to step S53.
  • In step S53, the control unit 21B determines whether or not a stop of the vehicle 2 was detected in step S34 of FIG. 13 during the first predetermined period. If it is determined that a stop of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S54.
  • In step S54, the control unit 21B removes the eye closure rate of the first predetermined period that includes the time when the vehicle 2 stopped from the calculation target data, and then ends the preprocessing.
  • On the other hand, if it is determined in step S53 that no stop of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S55.
  • In step S55, the control unit 21B determines whether or not the start of traveling of the vehicle 2 was detected in step S35 of FIG. 13 during the first predetermined period. If it is determined that the start of traveling of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S56.
  • In step S56, the control unit 21B removes the eye closure rate of the first predetermined period that includes the detection of the start of traveling of the vehicle 2 from the calculation target data, and then ends the preprocessing. On the other hand, if it is determined in step S55 that the start of traveling was not detected in the first predetermined period, the process proceeds to step S57.
  • In step S57, the control unit 21B determines whether or not sudden steering or sudden braking of the vehicle 2 was detected in step S36 of FIG. 13 during the first predetermined period. If it is determined that sudden steering or sudden braking of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S58.
  • In step S58, the control unit 21B removes the eye closure rate of the first predetermined period that includes the detection of the sudden steering or sudden braking of the vehicle 2 from the calculation target data, and then ends the preprocessing.
  • On the other hand, if it is determined in step S57 that neither sudden steering nor sudden braking of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S59.
  • In step S59, the control unit 21B determines whether or not an impact on the vehicle 2 was detected in step S37 of FIG. 13 during the first predetermined period. If it is determined that an impact on the vehicle 2 was detected in the first predetermined period, the process proceeds to step S60.
  • In step S60, the control unit 21B removes the eye closure rate of the first predetermined period that includes the time of the impact detection of the vehicle 2 from the calculation target data, and then ends the preprocessing.
  • On the other hand, if it is determined in step S59 that no impact on the vehicle 2 was detected in the first predetermined period, the process proceeds to step S61.
  • In step S61, the control unit 21B determines whether or not a face-orientation event of the driver 3 was detected in step S42 of FIG. 13 during the first predetermined period. If it is determined that a face-orientation event of the driver 3 was detected in the first predetermined period, the process proceeds to step S62.
  • In step S62, the control unit 21B removes the eye closure rate of the first predetermined period that includes the detection of the face-orientation event of the driver 3 from the calculation target data, and then ends the preprocessing.
  • The preprocessing is then completed, and the process proceeds to step S5.
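A minimal sketch of the event-based removal performed in steps S54 through S62 might look like the following: every eye-closure-rate sample whose first predetermined period overlaps a detected event is dropped from the data used to compute the variation feature amount. The data layout, function name, and timestamps are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the preprocessing removal in steps S54-S62: samples
# whose measurement window overlaps a detected event (vehicle stop, start of
# traveling, sudden steering/braking, impact, face-orientation event) are
# excluded from the variation-feature calculation data.

def remove_event_windows(samples, events):
    """samples: list of (t_start, t_end, closure_rate), one per first
    predetermined period. events: list of detected event timestamps.
    Returns only the samples whose window contains no event."""
    kept = []
    for t_start, t_end, rate in samples:
        if not any(t_start <= t <= t_end for t in events):
            kept.append((t_start, t_end, rate))
    return kept

samples = [(0, 10, 0.12), (10, 20, 0.55), (20, 30, 0.15)]
events = [14.0]  # e.g., a sudden-braking event detected at t = 14
print(remove_event_windows(samples, events))  # the 10-20 window is dropped
```

In this simplified form the removal is purely overlap-based; step S52 (road-type switch), which discards everything calculated before the switch, would need a slightly different rule.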
  • According to the state determination device 20B of the embodiment (3), the preprocessing unit 36A removes, from the eye closure rates of the second predetermined period, those calculated at a time or during a period in which a predetermined event was detected by the event detection unit 38, from the data used for calculating the variation feature amount. The variation feature amount is then calculated using the eye closure rates remaining after this removal. Therefore, a feature amount that more accurately represents the transition to the inattentive state can be calculated, and the transition to the inattentive state can be determined accurately.
  • Since the event detection unit 38 includes the vehicle dynamics detection unit 381, eye closure rates calculated at a time or during a period in which at least one of an event in which the road type on which the vehicle 2 travels switches, an event in which the vehicle 2 stops, or an event in which the vehicle 2 starts traveling was detected can be excluded from the data used for calculating the variation feature amount within the second predetermined period. Further, since the event detection unit 38 includes the face-orientation event detection unit 382, eye closure rates calculated at a time or during a period in which an event changing the face orientation of the driver 3 was detected while the vehicle 2 was being driven can likewise be excluded.
  • As a result, a variation feature amount that more accurately captures the tendency to shift to the inattentive state can be calculated, and the transition of the driver 3 to the inattentive state in an actual vehicle environment can be determined accurately.
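The patent leaves the concrete form of the variation feature amount open. As one hedged illustration, the sketch below uses the population standard deviation of the eye closure rates collected over the second predetermined period, so that larger swings in the eye closure rate yield a larger feature amount; the function name and sample values are assumptions.

```python
import statistics

def variation_feature(closure_rates):
    # One possible variation feature amount: dispersion of the per-period
    # eye closure rates over the second predetermined period.
    return statistics.pstdev(closure_rates)

steady = [0.10, 0.11, 0.10, 0.12]    # attentive: eye closure rate barely varies
drifting = [0.05, 0.30, 0.10, 0.45]  # drifting toward inattention: large swings
print(variation_feature(steady) < variation_feature(drifting))  # True
```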
  • FIG. 15 is a block diagram showing a functional configuration example of the state determination device 20C according to the embodiment (4).
  • Since the functional configuration of the state determination device 20C is substantially the same as that of the state determination device 20B shown in FIG. 12, configurations having the same functions are given the same reference numerals, and their description is omitted.
  • The preprocessing unit 36B performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32. Like the preprocessing unit 36A in FIG. 12, the preprocessing unit 36B removes, from the data used for calculating the variation feature amount, the eye closure rates of the second predetermined period calculated by the eye closure rate calculation unit 32 at a time or during a period (for example, the first predetermined period) in which a predetermined event was detected by the event detection unit 38.
  • The interpolation processing unit 39 interpolates the eye closure rates of the second predetermined period that were removed, at a time or during a period (for example, the first predetermined period), by the removal processing of the preprocessing unit 36B. By interpolating the removed eye closure rates before calculating the variation feature amount, the accuracy of the variation feature amount calculated by the variation feature amount calculation unit 33B can be improved.
  • The preprocessing unit 36B performs the removal processing and, for example according to the type of variation feature amount used for determining the inattentive state, a smoothing processing of the eye closure rates after the interpolation processing. As the smoothing processing, the preprocessing unit 36B may, for example, calculate a moving average of the eye closure rate at predetermined intervals.
  • The variation feature amount calculation unit 33B calculates the variation feature amount using the eye closure rates after the preprocessing and interpolation processing (the removal processing, the interpolation processing, and the smoothing processing) by the preprocessing unit 36B and the interpolation processing unit 39.
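The moving-average smoothing mentioned above can be sketched as follows; the window length and function name are arbitrary assumptions for illustration.

```python
def moving_average(rates, window=3):
    # Smooth consecutive eye-closure-rate samples with a simple moving
    # average; returns len(rates) - window + 1 smoothed values.
    return [sum(rates[i:i + window]) / window
            for i in range(len(rates) - window + 1)]

print(moving_average([1.0, 2.0, 3.0, 4.0], window=2))  # [1.5, 2.5, 3.5]
```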
  • FIG. 16 is a flowchart showing a processing operation performed by the control unit 21C of the state determination device 20C according to the embodiment (4). The processing operation shown in FIG. 16 is executed following the processing in step S6 shown in FIG. 13. FIG. 16 shows an example in which the control unit 21C operates as the preprocessing unit 36B, the interpolation processing unit 39, the variation feature amount calculation unit 33B, the inattentive state determination unit 34A, and the output unit 35. Note that processing contents identical to those shown in FIG. 11 are given the same step numbers, and their description is omitted.
  • In step S71, the control unit 21C operates as the interpolation processing unit 39 and interpolates, among the eye closure rates calculated for each first predetermined period within the second predetermined period, those removed by the preprocessing in step S50 of FIG. 13, and then proceeds to step S21.
  • For example, the control unit 21C may interpolate an eye closure rate removed by the preprocessing of step S50 with the eye closure rate calculated immediately before or immediately after it. Alternatively, the control unit 21C may interpolate with the average of the eye closure rates calculated immediately before and after the removed value, or may interpolate based on the amount of change or the rate of change (slope) between the eye closure rates calculated immediately before and after the removed value.
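One of the interpolation options above (filling a removed value with the average of its calculated neighbors) might be sketched as follows; representing removed samples as `None` and assuming the first and last samples are present are illustrative assumptions, not the patent's representation.

```python
def interpolate_removed(rates):
    # rates: per-period eye closure rates, with removed samples marked None.
    # Each removed sample is filled with the average of the nearest calculated
    # (or already filled) values before and after it; assumes the first and
    # last samples are present.
    out = list(rates)
    for i, r in enumerate(out):
        if r is None:
            prev = next(out[j] for j in range(i - 1, -1, -1) if out[j] is not None)
            nxt = next(out[j] for j in range(i + 1, len(out)) if out[j] is not None)
            out[i] = (prev + nxt) / 2
    return out

print(interpolate_removed([0.1, None, 0.3]))  # [0.1, 0.2, 0.3]
```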
  • Since the processing content of steps S21 to S26 is basically the same as that of steps S21 to S26 shown in FIG. 11, except that the interpolated eye closure rate data is used, its description is omitted here.
  • Step S22 corresponds to the processing operation of the preprocessing unit 36B, steps S23 and S25 correspond to the processing operation of the variation feature amount calculation unit 33B, and steps S24 and S26 correspond to the processing operation of the inattentive state determination unit 34A.
  • After performing the inattentive state determination in step S24 or step S26, the control unit 21C proceeds to the process of step S72.
  • In step S72, the control unit 21C operates as the inattentive state determination unit 34A and determines whether or not the variation feature amount of the eye closure rate is equal to or greater than the threshold value A set as the reference for performing warning processing. If it is determined that the variation feature amount is not equal to or greater than the threshold value A, the process ends; if it is determined that it is, the process proceeds to step S73.
  • The threshold value A may be used, for example, as a criterion for determining that the driver is in an inattentive state at the medium level or lower.
  • In step S73, the control unit 21C operates as the output unit 35, performs warning processing for the driver 3, and proceeds to step S74.
  • For example, the control unit 21C may operate the notification unit 16 to output an alarm sound or a warning announcement to awaken the driver 3.
  • In step S74, the control unit 21C operates as the inattentive state determination unit 34A and determines whether or not the variation feature amount of the eye closure rate is equal to or greater than the threshold value B set as the reference for making an external report. If it is determined that the variation feature amount is not equal to or greater than the threshold value B, the process proceeds to step S76; if it is determined that it is, the process proceeds to step S75.
  • The threshold value B is larger than the threshold value A, and may be used, for example when the level of the inattentive state is classified into low, medium, and high, as a reference for determining that the driver is in an inattentive state above the medium level.
  • In step S75, the control unit 21C operates as the output unit 35 and performs a process of notifying the outside that the driver 3 is in an inattentive state, and then proceeds to the process of step S76.
  • For example, the control unit 21C operates the communication unit 15 to notify the operator terminal 6 that the driver 3 is in an inattentive state above the medium level.
  • In step S76, when the control unit 21C has determined in step S74 that the variation feature amount is not equal to or greater than the threshold value B, it stores the determination result of the inattentive state at or above the threshold value A in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the process.
  • Also in step S76, when the control unit 21C has determined in step S74 that the variation feature amount is equal to or greater than the threshold value B and the external notification process has been executed in step S75, it stores the determination result of the inattentive state at or above the threshold value B in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the process.
  • The control unit 21C may transmit the data including the inattentive state determination results stored in the storage unit 22 to the driving evaluation device 4 at a predetermined timing.
  • According to the state determination device 20C of the embodiment (4), the preprocessing unit 36B removes, from the eye closure rates of the second predetermined period, those calculated at a time or during a period in which a predetermined event was detected by the event detection unit 38, from the data used for calculating the variation feature amount, and the variation feature amount is calculated using the eye closure rates after the removal processing.
  • Since the preprocessing unit 36B further smooths the eye closure rates after the removal processing, the smoothed eye closure rates can be used for calculating the variation feature amount, which improves the accuracy of the variation feature amount.
  • Since the inattentive state determination unit 34A determines the inattentive state of the driver 3 stepwise using a plurality of determination threshold values (threshold value A and threshold value B), an appropriate determination according to the degree of the inattentive state can be made, and an appropriate output according to that degree can be performed.
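The two-stage decision of steps S72 through S75 can be sketched as follows; the numeric threshold values and action labels are illustrative assumptions (the patent only requires that threshold B be larger than threshold A).

```python
def decide_outputs(feature, threshold_a=0.5, threshold_b=0.8):
    # threshold_b > threshold_a; at or above A -> warn the driver (step S73),
    # at or above B -> additionally notify the operator terminal (step S75).
    actions = []
    if feature >= threshold_a:
        actions.append("warn_driver")
    if feature >= threshold_b:
        actions.append("notify_operator")
    return actions

print(decide_outputs(0.6))  # ['warn_driver']
print(decide_outputs(0.9))  # ['warn_driver', 'notify_operator']
```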
  • In the embodiments described above, the state determination device 20 acquires an image from the camera 11 and performs a process of detecting data on the face orientation, the gaze direction, and the degree of eye opening and closing of the driver 3.
  • In another embodiment, the camera 11 may include an image analysis unit including, for example, an image processing processor, and the image analysis unit may use the captured image to perform a process of detecting data on the face orientation, the gaze direction, and the degree of eye opening and closing of the driver 3.
  • In that case, the state determination device 20 may acquire the driving behavior data, the image data, and the imaging date/time data of the driver 3 from the camera 11, and execute the eye closure rate calculation and subsequent processing using the acquired eye opening/closing degree data.
  • The state determination device is not limited to application to an in-vehicle device. For example, the state determination device can also be incorporated into an industrial equipment system and applied to determining the inattentive state of a person performing predetermined work.
  • Appendix 1: A state determination device (20) that determines the state of a person, comprising:
  • an eye opening/closing degree detection unit (31) that detects the degree of eye opening/closing of the person from an image of the person's face;
  • an eye closure rate calculation unit (32) that calculates the eye closure rate of the person using the degree of eye opening/closing detected by the eye opening/closing degree detection unit (31) during a first predetermined period;
  • a variation feature amount calculation unit (33) that calculates a variation feature amount of the eye closure rate; and
  • an inattentive state determination unit (34) that determines the inattentive state of the person based on the variation feature amount calculated by the variation feature amount calculation unit (33).
  • A driving evaluation system (1) characterized by being configured to include the state determination device (20).
  • an eye opening/closing degree detection step (S2) of detecting the degree of eye opening/closing of the person from an image of the person's face;
  • an eye closure rate calculation step (S4) of calculating the eye closure rate of the person using the degree of eye opening/closing detected in the eye opening/closing degree detection step (S2) during a first predetermined period;
  • a variation feature amount calculation step (S7) of calculating a variation feature amount of the eye closure rate; and
  • a state determination method including an inattentive state determination step (S8) of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step (S7).
  • a variation feature amount calculation step (S7) of calculating a variation feature amount of the eye closure rate; and
  • a program for executing an inattentive state determination step (S8) of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step (S7).

Abstract

The present invention relates to a state determination device for determining the state of a person, equipped with: an eye opening/closing degree detection unit for detecting the degree of eye opening/closing of the person from an image obtained by capturing the person's face; an eye closure rate calculation unit for calculating the eye closure rate of the person using the degree of eye opening/closing detected during a first predetermined period; a variation feature amount calculation unit for calculating a variation feature amount of the eye closure rate using the eye closure rates calculated during a second predetermined period; and an inattentive state determination unit for determining the inattentive state of the person on the basis of the calculated variation feature amount.
PCT/JP2020/023555 2019-07-10 2020-06-16 State determination device, in-vehicle device, driving evaluation system, state determination method, and program WO2021005975A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-128048 2019-07-10
JP2019128048A JP7298351B2 (ja) 2019-07-10 2019-07-10 State determination device, in-vehicle device, driving evaluation system, state determination method, and program

Publications (1)

Publication Number Publication Date
WO2021005975A1 (fr) 2021-01-14

Family

ID=74115222

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/023555 WO2021005975A1 (fr) 2019-07-10 2020-06-16 Dispositif de détermination d'état, instrument embarqué sur un véhicule, système d'évaluation de conduite, méthode et programme de détermination d'état

Country Status (2)

Country Link
JP (1) JP7298351B2 (fr)
WO (1) WO2021005975A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2022176136A1 (fr) * 2021-02-18 2022-08-25

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010128649A (ja) * 2008-11-26 2010-06-10 Nissan Motor Co., Ltd. Wakefulness state determination device and wakefulness state determination method
WO2010092860A1 (fr) * 2009-02-13 2010-08-19 Toyota Motor Corporation Physiological condition estimation device and vehicle control device
JP2017194772A (ja) * 2016-04-19 2017-10-26 Toyota Motor Corporation Wakefulness level determination device

Also Published As

Publication number Publication date
JP2021015320A (ja) 2021-02-12
JP7298351B2 (ja) 2023-06-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20836246

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20836246

Country of ref document: EP

Kind code of ref document: A1