WO2021005975A1 - State determination device, on-vehicle instrument, driving assessment system, state determination method, and program - Google Patents

State determination device, on-vehicle instrument, driving assessment system, state determination method, and program Download PDF

Info

Publication number
WO2021005975A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
eye
feature amount
closure rate
state determination
Prior art date
Application number
PCT/JP2020/023555
Other languages
French (fr)
Japanese (ja)
Inventor
絵里子 閑
博 田崎
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2021005975A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to a state determination device, an in-vehicle device, a driving evaluation system, a state determination method, and a program.
  • Patent Document 1 discloses a wakefulness maintenance device that maintains the wakefulness of a vehicle driver.
  • The wakefulness maintenance device includes a stimulus execution unit that uses a display device displaying a visual stimulus to execute a stimulus process that shifts the display position of the visual stimulus left and right within the driver's forward field of view, a control unit that causes the stimulus execution unit to execute the stimulus process at each interval time, an absent-minded state estimation unit that estimates the degree of the driver's absent-minded state, and a time setting unit that sets the interval time shorter as the estimated degree of the absent-minded state becomes higher.
  • The absent-minded state estimation unit photographs the driver's face with a driver camera and acquires a moving image.
  • Using the acquired moving image, the absent-minded state estimation unit obtains the driver's eye closure rate (the proportion of time during which the eyes are closed).
  • The absent-minded state estimation unit determines whether or not the obtained eye closure rate is equal to or less than an upper limit value (8%). If the eye closure rate is equal to or less than the upper limit value, the time setting unit sets, based on the eye closure rate, the interval time at which the stimulus execution unit executes the stimulus process; if the eye closure rate exceeds the upper limit value, the stimulus execution unit executes a direct awakening process.
  • In this way, the degree of the absent-minded state is determined based on the value of the eye closure rate itself, and the processing by the time setting unit or the stimulus execution unit is executed accordingly.
  • However, the eye closure rate may be measured to be high even in the awake state because of factors such as individual differences, and it may also be measured to be high depending on the installation location of the driver camera. In such cases, a determination based on the value of the eye closure rate alone may erroneously judge an awake driver to be in the absent-minded state, so the absent-minded state cannot be determined accurately.
  • The present invention has been made in view of the above problems, and an object thereof is to provide a state determination device capable of accurately determining a person's absent-minded state, an in-vehicle device equipped with the device, a driving evaluation system including the in-vehicle device, a state determination method, and a program for realizing the method.
  • To achieve the above object, the state determination device (1) according to the present disclosure is a state determination device that determines the state of a person, and includes:
  • an eye opening/closing degree detection unit that detects the person's eye opening/closing degree from an image of the person's face;
  • an eye closure rate calculation unit that calculates the person's eye closure rate using the eye opening/closing degree detected by the eye opening/closing degree detection unit over a first predetermined period;
  • a variation feature amount calculation unit that calculates a variation feature amount of the eye closure rate using the eye closure rate calculated over a second predetermined period; and an absent-minded state determination unit that determines the person's absent-minded state based on the variation feature amount calculated by the variation feature amount calculation unit.
  • According to the state determination device (1), the person's eye opening/closing degree is detected from the image, and the person's eye closure rate is calculated using the eye opening/closing degree detected over the first predetermined period.
  • The variation feature amount of the eye closure rate is then calculated using the eye closure rate calculated over the second predetermined period, and the person's absent-minded state is determined based on the calculated variation feature amount. Since the absent-minded state is determined from the variation feature amount rather than from the eye closure rate itself, the absent-minded state can be determined accurately even when the eye closure rate is measured to be high during wakefulness due to factors such as individual differences in the eye closure rate or differences in the imaging position of the image, without being affected by those factors.
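As a concrete illustration only, the following Python sketch strings the claimed units together; all names, the 0.2 closure threshold, and the 0.08 feature threshold are assumptions for illustration, not values fixed by the disclosure.

```python
# Hypothetical sketch of the claimed pipeline; thresholds are assumptions.
import numpy as np

def eye_closure_rate(openness, closed_thresh=0.2):
    """Eye closure rate over the first predetermined period: the share of
    frames whose eye opening/closing degree indicates a closed eye."""
    return float(np.mean(np.asarray(openness, dtype=float) <= closed_thresh))

def variation_feature(closure_rates):
    """Variation feature amount over the second predetermined period;
    here, the standard deviation of the per-period closure rates."""
    return float(np.std(np.asarray(closure_rates, dtype=float)))

def is_absent_minded(closure_rates, feature_thresh=0.08):
    """Judge the absent-minded state from the variation feature,
    not from the closure-rate value itself."""
    return variation_feature(closure_rates) >= feature_thresh
```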
  • the person may be, for example, a driver who drives a vehicle, a worker who is performing a predetermined work, or the like.
  • The state determination device (2) according to the present disclosure further includes, in the state determination device (1), a preprocessing unit that performs preprocessing on the eye closure rate calculated by the eye closure rate calculation unit.
  • The variation feature amount calculation unit calculates the variation feature amount using the eye closure rate after preprocessing by the preprocessing unit.
  • According to the state determination device (2), preprocessing is applied to the eye closure rate by the preprocessing unit, and the variation feature amount is calculated using the preprocessed eye closure rate. By using the preprocessed eye closure rate for this calculation, a feature amount that more accurately represents the transition to the absent-minded state can be obtained, and the transition to the absent-minded state can be determined accurately.
  • The state determination device (3) is characterized in that, in the state determination device (2), the preprocessing unit performs a smoothing process on the eye closure rate. According to the state determination device (3), the variation feature amount is calculated using the eye closure rate after the smoothing process, which makes it easier to compute a feature amount that indicates the tendency to shift to the absent-minded state.
  • The smoothing process may be, for example, a process of obtaining a moving average of the eye closure rate for each predetermined period.
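A minimal sketch of such smoothing, assuming a simple moving average over a hypothetical window of five closure-rate samples:

```python
# Moving-average smoothing of the eye closure rate; the window length is an
# illustrative assumption, not a value fixed by the disclosure.
import numpy as np

def smooth_closure_rates(closure_rates, window=5):
    x = np.asarray(closure_rates, dtype=float)
    kernel = np.ones(window) / window
    # mode="valid" keeps only positions fully covered by the window
    return np.convolve(x, kernel, mode="valid")
```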
  • The state determination device (4) further includes, in the state determination device (2), an event detection unit that detects a predetermined event.
  • The preprocessing unit performs a removal process that excludes, from the data used to calculate the variation feature amount, any eye closure rate in the second predetermined period that was calculated at the time, or during the period, in which the event detection unit detected the predetermined event.
  • According to the state determination device (4), the eye closure rates calculated at the time, or during the period, of a detected predetermined event are removed from the data used to calculate the variation feature amount,
  • and the variation feature amount is calculated using the eye closure rate after the removal process. By using the post-removal eye closure rate, a feature amount that more accurately represents the transition to the absent-minded state can be calculated, and the transition to the absent-minded state can be determined accurately.
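A hedged sketch of this removal process; the boolean event mask is an assumed input produced by the event detection unit:

```python
# Drop closure-rate samples that coincide with a detected predetermined event
# before the variation feature amount is computed.
import numpy as np

def remove_event_samples(closure_rates, event_mask):
    x = np.asarray(closure_rates, dtype=float)
    mask = np.asarray(event_mask, dtype=bool)
    return x[~mask]  # keep only samples outside detected events
```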
  • The state determination device (5) is characterized in that, in the state determination device (4), the preprocessing unit further performs the smoothing process on the eye closure rate after the removal process. According to the state determination device (5), the smoothed, post-removal eye closure rate can be used for the variation feature amount.
  • The variation feature amount can therefore be calculated as a feature amount from which the tendency to shift to the absent-minded state is easily grasped, and the shift to the absent-minded state can be determined accurately.
  • the smoothing process may be, for example, a process of obtaining a moving average of the eye closure rate for each predetermined period.
  • The state determination device (6) is characterized in that, in the state determination device (4) or (5), it further includes an interpolation processing unit that interpolates the eye closure rate for the time or period removed by the removal process from the eye closure rates of the second predetermined period. According to the state determination device (6), because the removed eye closure rates can be interpolated by the interpolation processing unit, the variation feature amount can be calculated appropriately, for example as sketched below.
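A minimal sketch of the interpolation processing, assuming linear interpolation across the removed sample positions (the method of interpolation is not fixed by the disclosure):

```python
# Fill the closure rates removed during events by linear interpolation over
# the sample index; linear interpolation is an assumed choice.
import numpy as np

def interpolate_removed(closure_rates, event_mask):
    x = np.asarray(closure_rates, dtype=float).copy()
    mask = np.asarray(event_mask, dtype=bool)
    t = np.arange(len(x))
    x[mask] = np.interp(t[mask], t[~mask], x[~mask])
    return x
```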
  • In the state determination device (7), the person is the driver of the vehicle,
  • and the event detection unit includes a vehicle dynamics detection unit that detects, as the predetermined event, at least one of an event in which the road type on which the vehicle travels switches, an event in which the vehicle stops, an event in which the vehicle starts traveling, an event in which the vehicle is steered suddenly, an event in which the vehicle brakes suddenly, and an event in which an impact occurs on the vehicle.
  • According to the state determination device (7), since the event detection unit includes the vehicle dynamics detection unit, any eye closure rate in the second predetermined period that was calculated at the time, or during the period, in which at least one of the above events was detected can be excluded from the data used to calculate the variation feature amount.
  • Therefore, even in an actual vehicle environment in which the vehicle dynamics change from moment to moment, the variation feature amount can be calculated as a feature amount that more accurately indicates the tendency to shift to the absent-minded state, and the driver's transition to the absent-minded state can be determined accurately in the actual vehicle environment.
  • In the state determination device (8), the person is the driver of the vehicle,
  • and the event detection unit includes a face orientation event detection unit that detects, as the predetermined event, an event in which the orientation of the person's face changes while the vehicle is being driven.
  • According to the state determination device (8), since the event detection unit includes the face orientation event detection unit, any eye closure rate in the second predetermined period that was calculated at the time, or during the period, in which an event of the person's face orientation changing during driving was detected can be excluded from the data used to calculate the variation feature amount.
  • The variation feature amount can therefore be calculated as a feature amount that more accurately indicates the tendency to shift to the absent-minded state, and the driver's transition to the absent-minded state can be determined accurately in the actual vehicle environment.
  • In the state determination device (9), the variation feature amount calculation unit calculates, as the variation feature amount, an index indicating the degree of variation in the eye closure rate during the second predetermined period.
  • According to the state determination device (9), the variation feature amount calculation unit calculates, as the variation feature amount, the value of an index indicating the degree of variation in the eye closure rate during the second predetermined period, such as the standard deviation or the variance, and the person's absent-minded state is determined based on that index.
  • Because the degree of variation in the eye closure rate, in other words the change in its scatter, is taken into account in determining the person's absent-minded state, the timing of the transition to the absent-minded state can be determined accurately without being affected by the value of the eye closure rate itself, even when, for example, the eye closure rate is measured to be high in the awake state.
  • In the state determination device (10), the variation feature amount calculation unit calculates, as the variation feature amount, the change amount or change rate of the eye closure rate during the second predetermined period. According to the state determination device (10), the variation feature amount calculation unit calculates the change amount or change rate of the eye closure rate during the second predetermined period as the variation feature amount, and the person's absent-minded state is determined based on that change amount or change rate.
  • Because the change amount or change rate of the eye closure rate, in other words the magnitude of its change, is taken into account in determining the person's absent-minded state, the timing of the transition to the absent-minded state can be determined accurately without being affected by the value of the eye closure rate itself, even when, for example, the eye closure rate is measured to be high in the awake state.
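As one assumed realization, the change rate could be estimated as the slope of a least-squares line fitted to the closure rates of the second predetermined period:

```python
# Change rate (slope) of the eye closure rate over the second period,
# estimated by a first-degree least-squares fit; an illustrative choice.
import numpy as np

def closure_rate_slope(closure_rates):
    x = np.asarray(closure_rates, dtype=float)
    t = np.arange(len(x))
    slope, _intercept = np.polyfit(t, x, 1)
    return float(slope)
```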
  • In the state determination device (11), the variation feature amount calculation unit calculates, as the variation feature amount, the increase time rate, that is, the proportion of time during which the eye closure rate is increasing within the second predetermined period. According to the state determination device (11), the variation feature amount calculation unit calculates the increase time rate of the eye closure rate in the second predetermined period as the variation feature amount, and the person's absent-minded state is determined based on the increase time rate. Because the increase time rate, in other words the tendency of the eye closure rate to keep rising over time, is taken into account in determining the person's absent-minded state, the timing of the transition to the absent-minded state can be determined accurately without being affected by the value of the eye closure rate itself, even when, for example, the eye closure rate is measured to be high in the awake state.
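A sketch of the increase time rate under the assumption that it is computed as the fraction of sample-to-sample steps in which the closure rate rose:

```python
# Increase time rate: share of steps in the second predetermined period
# during which the eye closure rate was rising.
import numpy as np

def increase_time_rate(closure_rates):
    diffs = np.diff(np.asarray(closure_rates, dtype=float))
    return float(np.mean(diffs > 0))
```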
  • In the state determination device (12), the absent-minded state determination unit determines the person's absent-minded state in stages using two or more determination threshold values. According to the state determination device (12), the person's absent-minded state can be determined step by step using two or more determination thresholds, and an appropriate determination can be made according to the degree of the absent-minded state.
  • In the state determination device (13), the absent-minded state determination unit determines that
  • the person is in the absent-minded state when the variation feature amount continuously exceeds the determination threshold value for a predetermined period.
  • According to the state determination device (13), because the person is judged to be in the absent-minded state only when the variation feature amount continuously exceeds the determination threshold value for the predetermined period, the state in which the person has shifted to the absent-minded state can be judged appropriately, as the combined sketch below illustrates.
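A combined sketch of the stepwise judgment of device (12) and the continuous-exceedance condition of device (13); both thresholds and the run length are assumed values:

```python
# Two assumed thresholds give a staged judgment; a level is reported only
# when the variation feature has exceeded it for `run` consecutive samples.
def stepwise_state(features, caution=0.05, warning=0.10, run=3):
    recent = features[-run:]
    if len(recent) == run and all(f >= warning for f in recent):
        return "warning"   # stronger degree of the absent-minded state
    if len(recent) == run and all(f >= caution for f in recent):
        return "caution"   # weaker degree
    return "normal"
```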
  • In the state determination device (14), the absent-minded state determination unit determines the person's absent-minded state based on two or more types of the variation feature amount, or on one or more types of the variation feature amount together with the eye closure rate. According to the state determination device (14), by considering two or more types of the variation feature amount, the timing at which the person shifts to the absent-minded state can be determined more accurately.
  • In the state determination device (15), the absent-minded state determination unit determines the person's absent-minded state using a trained learner
  • that has been trained to output a value indicating whether or not the person is in the absent-minded state when the variation feature amount, or the variation feature amount and the eye closure rate, is input.
  • The learner may be configured to include, for example, a neural network, a support vector machine, or the like.
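As one hypothetical realization, a support vector machine could be trained on [variation feature, eye closure rate] pairs; the data and feature layout below are invented solely to show the interface, not taken from the disclosure:

```python
# Toy example of the trained-learner variant using scikit-learn's SVC;
# training data are fabricated for illustration only.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.02, 0.05], [0.03, 0.07],   # awake samples
              [0.12, 0.30], [0.15, 0.35]])  # absent-minded samples
y = np.array([0, 0, 1, 1])                  # 1 = absent-minded

learner = SVC(kernel="rbf").fit(X, y)
print(learner.predict([[0.13, 0.28]]))      # -> [1]
```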
  • The state determination device (16) is characterized in that any of the above state determination devices (1) to (15) further includes an output unit that outputs the determination result of the absent-minded state determination unit. According to the state determination device (16), the output unit can appropriately output the determination result of the absent-minded state determination unit.
  • The state determination device (17) is characterized in that the state determination device (16) further includes a notification unit that performs notification according to the determination result output by the output unit. According to the state determination device (17), the notification unit can appropriately perform notification according to the determination result of the absent-minded state determination unit.
  • The state determination device (18) is characterized in that the state determination device (16) or (17) further includes a determination result storage unit that stores the determination result output by the output unit, and a communication unit that transmits data including the determination result stored in the determination result storage unit to a predetermined destination. According to the state determination device (18), the determination result is stored in the determination result storage unit, and the communication unit can appropriately transmit data including the determination result to the predetermined destination, where the determination result can then be put to appropriate use.
  • The in-vehicle device (1) according to the present disclosure includes any of the above state determination devices (1) to (18) and an imaging unit that captures the image. According to the in-vehicle device (1), an in-vehicle device exhibiting the effects of any one of the state determination devices (1) to (18) can be realized.
  • The driving evaluation system (1) according to the present disclosure is configured to include one or more of the above in-vehicle devices (1) and a driving evaluation device that includes a driving evaluation unit that performs a driving evaluation, including an evaluation of the person's absent-minded state, based on data including the determination result of the person's absent-minded state determined by the state determination device of the in-vehicle device, and an evaluation result output unit that outputs a driving evaluation result including the evaluation of the person's absent-minded state evaluated by the driving evaluation unit.
  • According to the driving evaluation system (1), the driving evaluation including the evaluation of the person's absent-minded state is performed based on the data including the determination result from the state determination device of the in-vehicle device, a driving evaluation result including an appropriately evaluated absent-minded state can be output, and safe-driving education for the person can be carried out appropriately.
  • The state determination method according to the present disclosure is a state determination method for determining the state of a person, and includes:
  • an eye opening/closing degree detection step of detecting the person's eye opening/closing degree from an image of the person's face;
  • an eye closure rate calculation step of calculating the person's eye closure rate using the eye opening/closing degree detected in the eye opening/closing degree detection step over a first predetermined period;
  • a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rate calculated in the eye closure rate calculation step over a second predetermined period; and an absent-minded state determination step of determining the person's absent-minded state based on the variation feature amount calculated in the variation feature amount calculation step.
  • According to the state determination method, the person's eye opening/closing degree is detected from the image, and the person's eye closure rate is calculated using the eye opening/closing degree detected over the first predetermined period.
  • The variation feature amount of the eye closure rate is then calculated using the eye closure rate of the second predetermined period, and the person's absent-minded state is determined based on the calculated variation feature amount. Since the determination is based on the variation feature amount, the absent-minded state can be determined accurately even when the eye closure rate is measured to be high during wakefulness due to factors such as individual differences in the eye closure rate or differences in the imaging position of the image, without being affected by those factors.
  • The program according to the present disclosure is a program for causing at least one computer to execute processing for determining the state of a person, the program causing the computer to execute:
  • an eye opening/closing degree detection step of detecting the person's eye opening/closing degree from an image of the person's face;
  • an eye closure rate calculation step of calculating the person's eye closure rate using the eye opening/closing degree detected in the eye opening/closing degree detection step over a first predetermined period;
  • a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rate calculated in the eye closure rate calculation step over a second predetermined period;
  • and an absent-minded state determination step of determining the person's absent-minded state based on the variation feature amount calculated in the variation feature amount calculation step.
  • According to the program, at least one computer detects the person's eye opening/closing degree from the image, calculates the person's eye closure rate using the eye opening/closing degree detected over the first predetermined period,
  • calculates the variation feature amount of the eye closure rate, and determines the person's absent-minded state based on the calculated variation feature amount. Therefore, a device or system can be realized that accurately determines the absent-minded state without being affected by factors such as individual differences in the eye closure rate or differences in the imaging position of the image, even when the eye closure rate is measured to be high during wakefulness.
  • the above program may be a program stored in a storage medium, a program that can be transferred via a communication network, or a program that is executed via a communication network.
  • FIG. 1 is a schematic diagram showing an example of an application scene of the state determination device according to embodiment (1). FIG. 2 is a block diagram showing a hardware configuration example of the in-vehicle device according to embodiment (1). FIG. 3 is a block diagram showing a functional configuration example of the state determination device according to embodiment (1).
  • FIG. 4(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally low,
  • and FIG. 4(b) is an example of a graph showing the time-series change of the standard deviation of the eye closure rate calculated using the eye closure rate shown in (a).
  • FIG. 5(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally high,
  • and FIG. 5(b) is an example of a graph showing the time-series change of the standard deviation of the eye closure rate calculated using the eye closure rate shown in (a).
  • FIG. 6 is a block diagram showing a functional configuration example of the driving evaluation device according to embodiment (1).
  • FIG. 7 is a flowchart showing an example of the processing operation performed by the control unit of the state determination device according to embodiment (1).
  • A further figure is a block diagram showing a functional configuration example of the state determination device according to embodiment (2).
  • (a) is a graph in which a moving average line of the eye closure rate is superimposed on the time-series change of the eye closure rate shown in FIG. 4(a), and (b) is a graph in which a moving average line of the eye closure rate is superimposed on the time-series change of the eye closure rate shown in FIG. 5(a).
  • Embodiments of a state determination device, an in-vehicle device, a driving evaluation system, a state determination method, and a program according to the present invention will be described below with reference to the drawings.
  • The state determination device, the state determination method, and the program according to the present invention can be applied widely to various uses in which the state of a person is determined.
  • The in-vehicle device provided with the state determination device according to the present invention, and the driving evaluation system including one or more of the in-vehicle devices, evaluate, for example, the driving state of the driver of a vehicle managed by a business operator or the like, and can therefore be applied widely to devices or systems for supporting the improvement of drivers' safety awareness from the viewpoint of preventive safety, that is, preventing accidents before they occur. [Application example]
  • FIG. 1 is a schematic view showing an example of an application scene of the state determination device according to the embodiment (1).
  • The state determination device 20 is mounted in the in-vehicle device 10. A driving evaluation system 1 for evaluating the driving of each driver 3 is constructed from the in-vehicle devices 10 mounted on one or more vehicles 2
  • and at least one driving evaluation device 4 that processes the data acquired from each in-vehicle device 10.
  • The in-vehicle device 10 includes the state determination device 20 and a camera 11 that captures an image including the face of the driver 3.
  • The state determination device 20 is composed of a computer device that acquires the images captured by the camera 11 and executes various processes for determining the state of the driver 3.
  • The vehicle 2 on which the in-vehicle device 10 is mounted is not particularly limited. For example, vehicles managed by operators of various businesses can be targeted, such as trucks managed by transportation companies, buses managed by bus companies, taxis managed by taxi companies, car-sharing vehicles managed by car-sharing companies, rental cars managed by car-rental companies, company-owned vehicles, and company vehicles leased from car-leasing companies. The vehicle 2 on which the in-vehicle device 10 is mounted may also be a general (privately owned) vehicle.
  • The driving evaluation device 4 may also be configured to be managed or operated by a safety evaluation or driver training institution such as an insurance company or a driving school, and the system can likewise be applied to performing driving evaluation of the driver 3 of each vehicle 2 at such institutions.
  • The driving evaluation device 4 acquires, for example, the driving behavior data of the driver 3 and the traveling behavior data of the vehicle 2 transmitted from the in-vehicle device 10, executes driving evaluation processing for each driver 3 based on the acquired data group and predetermined evaluation conditions, and outputs the driving evaluation result to an external device, for example, the operator terminal 6.
  • The driving evaluation device 4 is composed of one or more server computers including, for example, a communication unit 41, a control unit 42, and a storage unit 43.
  • The driving behavior data of the driver 3 includes, for example, data on at least one of the driver 3's face orientation, line-of-sight direction, eye opening/closing degree, eye closure rate, variation feature amount of the eye closure rate,
  • predetermined face orientation events, and the determination result of the absent-minded state, detected by processing the images captured by the camera 11.
  • Predetermined face orientation events include, for example, at least one of left/right-turn confirmation at an intersection, confirmation of the traveling direction at an intersection, no face detected, and looking aside.
  • The traveling behavior data of the vehicle 2 includes, for example, data on at least one of the acceleration, angular velocity, position, speed, and predetermined vehicle dynamics events of the vehicle 2 detected by the in-vehicle device 10.
  • Predetermined vehicle dynamics events include, for example, at least one of crossing an intersection, switching of the road type, stopping, sudden steering, sudden braking, and an impact on the vehicle 2.
  • The in-vehicle device 10 and the driving evaluation device 4 are configured to be able to communicate with each other via the communication network 5.
  • The communication network 5 may include a wireless communication network such as a mobile phone network including base stations or a wireless LAN (Local Area Network), a wired communication network such as the public telephone network, the Internet, or a dedicated telecommunication line.
  • The operator terminal 6 that manages the vehicle 2 is configured to be able to communicate with the driving evaluation device 4 via the communication network 5.
  • The operator terminal 6 may be a personal computer having a communication function, or a mobile information terminal such as a mobile phone, smartphone, or tablet device, and may also be configured to be able to communicate with the in-vehicle device 10 via the communication network 5.
  • The state determination device 20 mounted in the in-vehicle device 10 acquires images captured at a predetermined frame rate from the camera 11, which is arranged so that it can photograph the face of the driver 3.
  • The state determination device 20 processes the images acquired from the camera 11 in chronological order, detects the eye opening/closing degree of the driver 3 from the images (for example, every frame),
  • and calculates the eye closure rate of the driver 3 using the eye opening/closing degree of a first predetermined period (for example, a predetermined time of about 1 minute).
  • The state determination device 20 then calculates the variation feature amount of the eye closure rate (hereinafter also simply called the variation feature amount) using the calculated eye closure rate of a second predetermined period (longer than the first predetermined period, for example, a predetermined time of about 10 to 15 minutes), and determines the absent-minded state of the driver 3 (in other words, an absent-minded driving state) based on the calculated variation feature amount.
  • The eye opening/closing degree is an index indicating how far the eyes of the driver 3 are open. For example, the ratio of the vertical width to the horizontal width of the driver 3's eyes extracted from the image (for example, the number of pixels of the vertical width / the number of pixels of the horizontal width) may be calculated as the eye opening/closing degree.
  • The eye closure rate is the proportion of time during which the eyes are closed; it may be calculated as the proportion of eye opening/closing degrees, among those detected in the first predetermined period, that are equal to or less than a predetermined threshold value.
  • The predetermined threshold value is a value for determining whether or not the eyes are closed, as in the sketch below.
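Illustratively (the 0.2 threshold is an assumed calibration value, not one given by the disclosure):

```python
def eye_openness(v_px, h_px):
    """Eye opening/closing degree: vertical width / horizontal width."""
    return v_px / h_px

def is_closed(openness, thresh=0.2):
    """Closed-eye decision used when accumulating the eye closure rate."""
    return openness <= thresh
```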
  • The variation feature amount of the eye closure rate is data indicating how the eye closure rate fluctuates, calculated using the eye closure rate of the second predetermined period. It may be, for example, an index indicating the degree of variation in the eye closure rate during the second predetermined period (for example, the standard deviation), the change amount or change rate (slope) of the eye closure rate during the second predetermined period, or the increase time rate of the eye closure rate during the second predetermined period.
  • The absent-minded state means a state in which concentration or attention has decreased due to psychological or physiological factors (in the particular case of the driver 3 of the vehicle 2, a state of inattention to the road ahead). In addition to states in which concentration or attention has decreased due to drowsiness or fatigue, it may include a state of being lost in thought.
  • The absent-minded state of the driver 3 is determined based not on the value of the eye closure rate itself but on the variation feature amount of the eye closure rate described above.
  • Therefore, even if the value of the eye closure rate is measured to be high in the awake state due to factors such as individual differences in the eye closure rate of each driver 3 or differences in the installation position of the camera 11 of each vehicle 2 (in other words, the imaging position of the image), the absent-minded state of each driver 3 can be determined accurately without being affected by these factors.
  • In the driving evaluation system 1 including one or more in-vehicle devices 10 and the driving evaluation device 4, by using the determination results of the absent-minded state determined by the state determination device 20 of each in-vehicle device 10, each driver 3 can be evaluated fairly with respect to the absent-minded state without being affected by individual differences in eye closure rate, and a more appropriate driving evaluation can be performed.
  • FIG. 2 is a block diagram showing a hardware configuration example of the vehicle-mounted device 10 according to the embodiment (1).
  • The in-vehicle device 10 includes the state determination device 20 and the camera 11, and further includes an acceleration sensor 12, an angular velocity sensor 13, a GPS (Global Positioning System) receiving unit 14, a communication unit 15, and a notification unit 16.
  • the state determination device 20 includes a control unit 21, a storage unit 22, and an input / output interface (I / F) 23.
  • the control unit 21 is composed of a microcomputer including a CPU (Central Processing Unit), a RAM (Random Access Memory), and a ROM (Read Only Memory).
  • the control unit 21 performs a process of storing data acquired from the camera 11, the acceleration sensor 12, the angular velocity sensor 13, the GPS receiving unit 14, and the like in the storage unit 22. Further, the control unit 21 expands the program 221 stored in the storage unit 22 into the RAM, reads various detection data stored in the storage unit 22, and interprets and executes the program 221 expanded in the RAM by the CPU. As a result, various processes for determining the state of the driver 3, which will be described later, are executed.
  • the storage unit 22 is composed of one or more storage devices such as a semiconductor memory.
  • the storage unit 22 may store image data acquired from the camera 11, data detected by the acceleration sensor 12, the angular velocity sensor 13, the GPS receiving unit 14, and the like in time series.
  • the program 221 may be stored in the ROM of the control unit 21.
  • The input/output I/F 23 includes an interface circuit, connectors, and the like for exchanging data and signals with devices such as the camera 11.
  • The camera 11 operates as an imaging unit that captures images including the face of the driver 3, and includes, for example, a lens unit (not shown), an image sensor unit, a light irradiation unit, and a camera control unit that controls each of these units.
  • The image sensor unit is composed of, for example, an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor having sensitivity in the visible-light or near-infrared wavelength region, together with filters, microlenses, and the like.
  • the light irradiation unit includes a light emitting element such as an LED (Light Emitting Diode), and may use a light emitting element that irradiates infrared rays so that the state of the driver can be imaged day and night.
  • the camera 11 may be a monocular camera or a stereo camera.
  • the camera control unit includes, for example, a processor and the like.
  • The camera control unit controls the operation of the image sensor unit and the light irradiation unit, performing control such as irradiating light (for example, near-infrared light) from the light irradiation unit and capturing its reflected light with the image sensor unit.
  • the camera 11 captures an image at a predetermined frame rate (for example, 15 frames per second or more), and the data of the captured image is output to the state determination device 20.
  • the acceleration sensor 12 is a sensor that measures the acceleration of the vehicle 2, and is composed of, for example, a three-axis acceleration sensor that measures acceleration in three directions of the XYZ axes.
  • A two-axis or one-axis acceleration sensor may also be used as the acceleration sensor 12.
  • the acceleration data measured by the acceleration sensor 12 is stored in the storage unit 22 in association with, for example, the detection time (that is, in time series).
  • The angular velocity sensor 13 is a sensor that detects the rotational angular velocity of the vehicle 2, and detects at least the angular velocity about the vertical axis (yaw direction), that is, angular velocity data corresponding to the left-right rotation (turning) of the vehicle 2.
  • As the angular velocity sensor 13, a one-axis gyro sensor about the vertical axis, a two-axis gyro sensor that also detects the angular velocity about the horizontal left-right axis (pitch direction),
  • or a three-axis gyro sensor that further detects the angular velocity about the horizontal front-rear axis (roll direction) may be used.
  • For these gyro sensors, an optical or mechanical gyro sensor may be used in addition to a vibration gyro sensor.
  • Clockwise rotation may be defined as the negative direction and counterclockwise rotation as the positive direction.
  • the angular velocity is detected in a predetermined cycle (for example, a cycle of several tens of ms), and the detected angular velocity data is stored in the storage unit 22 in association with the detection time, for example.
  • An inertial sensor in which the acceleration sensor 12 and the angular velocity sensor 13 are mounted in one package may also be used.
  • The GPS receiving unit 14 receives GPS signals (including time information) from satellites via the antenna 14a at a predetermined cycle (for example, every second) and detects position data (including the latitude and longitude) of the current location of the vehicle 2.
  • the position data detected by the GPS receiving unit 14 is stored in the storage unit 22 in association with the detection time, for example.
  • Instead of the GPS receiving unit 14, a receiving device compatible with another satellite positioning system may be used, or another position detecting device may be used.
  • The communication unit 15 includes a communication module that performs processing such as emitting radio waves from the antenna 15a to transmit data to the driving evaluation device 4 via the communication network 5, and receiving radio waves from the outside via the antenna 15a. The communication unit 15 may also include a communication module that performs vehicle-to-vehicle or road-to-vehicle communication.
  • the notification unit 16 is configured to include a speaker that outputs a predetermined notification sound, voice, or the like based on a command from the state determination device 20.
  • The in-vehicle device 10 can have a compact configuration in which the state determination device 20, the camera 11, and the like are housed in a single housing.
  • The installation location of the in-vehicle device 10 in the vehicle is not particularly limited as long as the camera 11 can capture at least a field of view including the driver's face.
  • The in-vehicle device 10 may be installed, for example, near the center of the dashboard of the vehicle 2, near the steering column, near the instrument panel, near the rearview mirror, or on an A-pillar.
  • The camera 11 may be configured integrally with the state determination device 20 or as a separate body.
  • FIG. 3 is a block diagram showing a functional configuration example of the state determination device 20 according to the embodiment (1).
  • The control unit 21 of the state determination device 20 expands the program 221 stored in the storage unit 22 into the RAM, and by interpreting and executing the expanded program with the CPU, operates as the image acquisition unit 30, eye opening/closing degree detection unit 31, eye closure rate calculation unit 32, variation feature amount calculation unit 33, absent-minded state determination unit 34, and output unit 35 shown in FIG. 3.
  • the image acquisition unit 30 performs a process of acquiring an image in which the face of the driver 3 is captured from a camera 11 arranged so as to be able to capture the face of the driver 3 in the vehicle 2.
  • the image acquisition unit 30 acquires, for example, an image of n frames per second (n is, for example, 15 or more).
  • the eye opening / closing degree detection unit 31 analyzes the image acquired by the image acquisition unit 30 and performs a process of detecting the eye opening / closing degree of the driver 3 from each image.
  • The eye opening/closing degree is an index indicating how far the eyes are open.
  • As the eye opening/closing degree, the ratio of the vertical width to the horizontal width of the eye (for example, the number of pixels of the vertical width / the number of pixels of the horizontal width) may be used.
  • The eye opening/closing degree detection unit 31 may obtain the eye opening/closing degree of each of the left and right eyes, may use the average of the left and right values, or may use only one of them. Alternatively, only the vertical width of the eye (for example, the number of pixels of the vertical width) may be used as the eye opening/closing degree.
  • The eye closure rate calculation unit 32 calculates the eye closure rate of the driver 3 using the eye opening/closing degree of the first predetermined period (for example, a predetermined time of about 1 minute) detected by the eye opening/closing degree detection unit 31. For example, the eye closure rate calculation unit 32 calculates the proportion of data, among the eye opening/closing degrees detected in the first predetermined period, that is equal to or less than a predetermined threshold value indicating the closed-eye state. The eye closure rate calculation unit 32 performs this calculation once every first predetermined period.
  • The variation feature amount calculation unit 33 calculates the variation feature amount of the eye closure rate using the eye closure rate of the second predetermined period (for example, a predetermined time of about 10 to 15 minutes) calculated by the eye closure rate calculation unit 32. For example, when the first predetermined period is 1 minute and the second predetermined period is 15 minutes, the variation feature amount calculation unit 33 calculates the variation feature amount of the eye closure rate using the 15 minutes' worth of calculated eye closure rates (15 data points).
  • After the second predetermined period has elapsed since image acquisition started, the variation feature amount calculation unit 33 calculates the variation feature amount using the eye closure rate data of the most recent second predetermined period each time the first predetermined period elapses. That is, the variation feature amount calculation unit 33 calculates the variation feature amount once every first predetermined period after the second predetermined period has elapsed, as in the sliding-window sketch below.
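A sketch of this sliding-window computation, assuming the 1-minute first period and 15-minute second period of the example above:

```python
# Each minute, the newest closure rate enters a 15-slot window; once the
# window is full, a variation feature is emitted from the most recent data.
from collections import deque
import numpy as np

window = deque(maxlen=15)  # second predetermined period, in first-period units

def on_new_closure_rate(rate):
    window.append(rate)
    if len(window) == window.maxlen:
        return float(np.std(window))  # variation feature amount
    return None                       # history still too short
```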
  • The variation feature amount calculation unit 33 may calculate, for example, an index indicating the degree of variation in the eye closure rate during the second predetermined period as the variation feature amount.
  • As the index indicating the degree of variation, the standard deviation or the variance may be calculated. The variation feature amount calculation unit 33 may also calculate the change amount or change rate (slope) of the eye closure rate in the second predetermined period as the variation feature amount.
  • The variation feature amount calculation unit 33 may also calculate, as the variation feature amount, the increase time rate, that is, the proportion of time during which the eye closure rate is rising within the second predetermined period.
  • The variation feature amount calculation unit 33 may calculate two or more of the above types of variation feature amounts.
  • The absent-minded state determination unit 34 determines the driver's absent-minded state based on the variation feature amount calculated by the variation feature amount calculation unit 33. For example, when the standard deviation of the eye closure rate in the second predetermined period is calculated as the variation feature amount, it may determine whether or not the standard deviation is equal to or greater than a predetermined threshold value indicating a change to the absent-minded state, or whether or not it has continuously exceeded the threshold for a predetermined period.
  • When the change amount or change rate of the eye closure rate in the second predetermined period is calculated as the variation feature amount, it may likewise determine whether or not the change amount or change rate is equal to or greater than
  • a predetermined threshold value (change amount or change rate) indicating a change to the absent-minded state.
  • When the increase time rate of the eye closure rate in the second predetermined period is calculated as the variation feature amount, it may determine whether or not the increase time rate is equal to or greater than a predetermined threshold value (ratio) indicating a change to the absent-minded state.
  • The absent-minded state determination unit 34 may also combine determinations based on two or more types of variation feature amounts, or combine a determination based on the variation feature amount with a determination based on the eye closure rate.
  • The absent-minded state determination unit 34 may also determine the absent-minded state of the driver 3 using a trained learner that has been trained to output a value indicating whether or not the driver 3 is in the absent-minded state when the variation feature amount calculated by the variation feature amount calculation unit 33 is input.
  • The learner may instead be trained to output a value indicating whether or not the driver 3 is in the absent-minded state when both the variation feature amount calculated by the variation feature amount calculation unit 33 and the eye closure rate calculated by the eye closure rate calculation unit 32 are input.
  • The learner may be composed of, for example, a neural network having an input layer, one or more intermediate layers, and an output layer, a support vector machine, or the like. Using such a learner makes it possible to determine easily and appropriately whether or not the driver 3 is in the absent-minded state.
  • The output unit 35 performs processing for outputting the determination result of the absent-minded state determination unit 34.
  • The output unit 35 may perform notification processing according to the determination result via the notification unit 16, and may store the determination result in the storage unit 22
  • and perform processing for transmitting data including the stored determination result to the driving evaluation device 4 via the communication unit 15 at a predetermined timing.
  • FIG. 4(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally low,
  • and FIG. 4(b) is an example of a graph showing the time-series change of the standard deviation of the eye closure rate calculated using the eye closure rate data shown in FIG. 4(a).
  • FIG. 5(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is normally high,
  • and FIG. 5(b) is a graph showing the time-series change of the standard deviation, an example of the variation feature amount of the eye closure rate,
  • calculated using the eye closure rate data shown in FIG. 5(a).
  • The threshold value indicated by the dash-dot line in each figure shows an example of the criterion for determining whether or not the driver is in the absent-minded state. Comparing FIG. 4(a) and FIG. 5(a), it can be seen that there are individual differences in the normal eye closure rate.
  • In some people the normal eye closure rate is detected to be low, as shown in FIG. 4(a), and in others it is detected to be high, as shown in FIG. 5(a).
  • These individual differences are influenced by, for example, differences in the size of the eye opening between the upper and lower eyelids, and differences in the positional relationship (distance, orientation, and so on, determined by where the camera 11 is installed) between the camera 11 and the face.
  • As in FIG. 4(a), when the normal eye closure rate is low, a threshold determination on the eye closure rate can judge relatively accurately whether or not the driver is in the absent-minded state.
  • As in FIG. 5(a), however, when the normal eye closure rate is high, a threshold determination on the eye closure rate (using the same threshold as in FIG. 4(a)) results in the erroneous judgment that the driver is always in the absent-minded state.
  • Therefore, the state determination device 20 calculates the variation feature amount of the eye closure rate (for example, its standard deviation) from the eye closure rate data of a predetermined period, and determines the shift to the absent-minded state based on the calculated variation feature amount.
  • FIG. 6 is a block diagram showing a functional configuration example of the driving evaluation device 4 according to embodiment (1).
  • The driving evaluation device 4 includes a communication unit 41, a control unit 42, and a storage unit 43, which are connected via a communication bus 44.
  • The communication unit 41 includes a communication device for transmitting and receiving various data and signals to and from the in-vehicle devices 10 and the operator terminal 6 via the communication network 5.
  • The control unit 42 is composed of a computer device including one or more processors that execute various arithmetic processes and a main memory in which a predetermined driving evaluation program and the like are stored. As its functional configuration, the control unit 42 includes a driving evaluation unit 421 and an evaluation result output unit 422.
  • The driving evaluation unit 421 performs various driving evaluation processes, including an evaluation of the absent-minded state of the driver 3, based on data including the determination result of the absent-minded state of the driver 3 determined by the state determination device 20 of the in-vehicle device 10.
  • The evaluation result output unit 422 outputs a driving evaluation result including the evaluation of the absent-minded state of the driver 3 evaluated by the driving evaluation unit 421.
  • The evaluation result output unit 422 performs processing for transmitting the driving evaluation result to the operator terminal 6 via the communication unit 41, for example in response to a request from the operator terminal 6.
  • The storage unit 43 is composed of one or more large-capacity storage devices such as hard disk drives and solid state drives, and includes a detection data storage unit 431, an evaluation condition storage unit 432, an evaluation result storage unit 433, and the like.
  • the detection data storage unit 431 stores the detection data acquired from the vehicle-mounted device 10 of each vehicle 2.
  • In the detection data storage unit 431, the driving behavior data of the driver 3 detected at the time the absent-minded state was detected, and for a predetermined time before and after that time, is accumulated in time series
  • in association with, for example, the identification information of the in-vehicle device 10 or the identification information of the driver 3.
  • The traveling behavior data of the vehicle 2 detected for a predetermined time before and after the absent-minded state detection time may also be stored in the detection data storage unit 431 in time series.
  • The driving behavior data of the driver 3 includes, for example, data on at least one of the orientation of the face of the driver 3, the direction of the line of sight, the degree of eye opening and closing, the eye closure rate, and the fluctuation feature amount of the eye closure rate.
  • The traveling behavior data of the vehicle 2 includes data on at least one of the acceleration, angular velocity, position, and speed of the vehicle 2.
  • In addition, the detection data storage unit 431 may accumulate, in time series and in association with the identification information of the in-vehicle device 10 or the identification information of the driver 3, the driving behavior data of the driver 3 and the traveling behavior data of the vehicle 2 detected at the time of passing through an intersection and during predetermined periods before and after that time.
  • the evaluation condition storage unit 432 stores at least one evaluation condition for the safety confirmation operation to be performed by the driver 3 at an intersection or the like.
  • The evaluation result storage unit 433 stores the driving evaluation result including the evaluation of the inattentive state of the driver 3 made by the driving evaluation unit 421.
  • FIG. 7 is a flowchart showing an example of processing operation performed by the control unit 21 of the state determination device 20 according to the embodiment (1).
  • FIG. 7 shows an example in which the control unit 21 operates as an image acquisition unit 30, an eye opening / closing degree detection unit 31, and an eye closure rate calculation unit 32.
  • In step S1, the control unit 21 operates as the image acquisition unit 30, performs processing for acquiring an image captured by the camera 11, and proceeds to step S2.
  • the control unit 21 acquires, for example, an image of n frames per second (for example, n is 15 or more).
  • the acquired image is stored in, for example, the storage unit 22.
  • In step S2, the control unit 21 operates as the eye opening/closing degree detection unit 31, performs processing for detecting the degree of eye opening/closing of the driver 3 from the image acquired in step S1, and proceeds to step S3.
  • For example, the control unit 21 detects the face region of the driver 3 from the acquired image, detects facial organs such as the eyes, nose, and mouth from the detected face region, detects the vertical width and the horizontal width of the eye from the detected eye region, and detects the ratio of the detected vertical width to the horizontal width (for example, the number of vertical pixels / the number of horizontal pixels) as the degree of eye opening/closing.
  • the degree of opening and closing of the eyes detected from each image may be stored in the storage unit 22 in association with data such as the image acquisition time.
  • The control unit 21 may also detect the orientation of the face, the direction of the line of sight, and the like by known methods together with the degree of eye opening/closing. Further, based on the orientation of the face of the driver 3 detected from the image, the eye opening/closing degree detection unit 31 may calculate, from the detected ratio of the vertical width to the horizontal width of the eye, a degree of eye opening/closing corrected to the value that would be obtained if the face of the driver 3 were viewed from the front. For example, the method described in Japanese Patent No. 4957711 can be applied to this correction.
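  • For illustration, the following Python sketch computes the vertical-to-horizontal ratio described above; the landmark coordinates and the crude yaw-based correction are hypothetical stand-ins and are not the correction method of Japanese Patent No. 4957711.

```python
import math

def eye_open_close_degree(eye_top, eye_bottom, eye_left, eye_right,
                          face_yaw_deg=0.0):
    """Degree of eye opening/closing as vertical width / horizontal width.

    eye_top .. eye_right are (x, y) pixel coordinates of the detected
    eye-contour extremes; face_yaw_deg roughly undoes the horizontal
    foreshortening of a turned face (a stand-in for the correction the
    patent attributes to Japanese Patent No. 4957711).
    """
    vertical = abs(eye_bottom[1] - eye_top[1])    # vertical width in pixels
    horizontal = abs(eye_right[0] - eye_left[0])  # horizontal width in pixels
    if horizontal == 0:
        return 0.0
    # A face turned by yaw shrinks the apparent horizontal width by cos(yaw);
    # dividing by cos(yaw) restores an approximate frontal-view width.
    cos_yaw = max(math.cos(math.radians(face_yaw_deg)), 0.1)
    return vertical / (horizontal / cos_yaw)
```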
  • In step S3, the control unit 21 determines whether or not the degree of eye opening/closing has been detected for the first predetermined period; if it determines that it has not, the process returns to step S1, and if it determines that it has, the process proceeds to step S4.
  • The first predetermined period is a period for acquiring the number of eye opening/closing degree samples required for calculating the eye closure rate; for example, a predetermined time of about 1 minute can be set.
  • In step S4, the control unit 21 operates as the eye closure rate calculation unit 32, performs processing for calculating the eye closure rate of the driver 3 using the degrees of eye opening/closing detected in the first predetermined period, and proceeds to step S5.
  • For example, the control unit 21 calculates, as the eye closure rate, the proportion of samples at or below a predetermined threshold indicating the closed-eye state among the degrees of eye opening/closing detected in the first predetermined period ([number of eye opening/closing degree samples at or below the threshold / number of eye opening/closing degree samples in the first predetermined period] × 100 (%)).
  • the eye closure rate calculated for each first predetermined period may be stored in the storage unit 22 in association with data such as the elapsed time of the first predetermined period, for example.
  • The eye closure rate calculation unit 32 may set the predetermined threshold indicating the closed-eye state based on the direction of the line of sight of the driver 3 detected from the image.
  • For example, the method described in Japanese Patent No. 4915413 can be applied to setting this predetermined threshold.
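  • A minimal sketch of the eye closure rate calculation in step S4, assuming per-frame opening/closing degrees have already been collected; the threshold value 0.1 is an arbitrary placeholder for the patent's predetermined closed-eye threshold.

```python
def eye_closure_rate(degrees, closed_threshold=0.1):
    """Percentage of samples in the first predetermined period whose
    eye opening/closing degree is at or below the closed-eye threshold.

    degrees: per-frame opening/closing degrees collected over the
    first predetermined period (e.g. about 1 minute at n >= 15 fps).
    closed_threshold: hypothetical value; the patent leaves it as a
    predetermined threshold (optionally set from the gaze direction).
    """
    if not degrees:
        return 0.0
    closed = sum(1 for d in degrees if d <= closed_threshold)
    return 100.0 * closed / len(degrees)
```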
  • In step S5, the control unit 21 determines whether or not the eye closure rates for the second predetermined period have been calculated; if it determines that they have not, the process returns to step S1, and if it determines that they have, the process proceeds to step S6.
  • The second predetermined period is a period for acquiring the number of eye closure rate samples required for calculating the fluctuation feature amount; for example, a predetermined time of about 15 minutes can be set.
  • Since the eye closure rate is calculated each time the first predetermined period elapses, the control unit 21 determines whether or not the eye closure rates for the latest second predetermined period have been calculated.
  • In step S6, the control unit 21 reads the eye closure rate data calculated in the second predetermined period from the storage unit 22, and then returns to step S1 to repeat the eye opening/closing degree detection processing and the eye closure rate calculation processing.
  • FIG. 8 is a flowchart showing an example of processing operation performed by the control unit 21 of the state determination device 20 according to the embodiment (1). The processing operation shown in FIG. 8 is executed following the processing in step S6 shown in FIG. 7.
  • FIG. 8 shows an example in which the control unit 21 operates as the fluctuation feature amount calculation unit 33, the inattentive state determination unit 34, and the output unit 35.
  • In step S7, the control unit 21 operates as the fluctuation feature amount calculation unit 33, performs processing for calculating the fluctuation feature amount of the eye closure rate using the eye closure rate data of the second predetermined period read in step S6, and proceeds to step S8. For example, when the first predetermined period is set to 1 minute and the second predetermined period is set to 15 minutes, the control unit 21 calculates the fluctuation feature amount of the eye closure rate using 15 minutes' worth of eye closure rates calculated every minute (15 data points).
  • The fluctuation feature amount to be calculated may be the standard deviation of the eye closure rate in the second predetermined period, the change amount or change rate (slope) of the eye closure rate in the second predetermined period, or the rate of increase of the eye closure rate in the second predetermined period, and at least one of these fluctuation feature amounts may be calculated.
  • The calculated fluctuation feature amount may be stored in the storage unit 22 in association with data such as the elapsed time of the second predetermined period.
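  • The following sketch illustrates, under assumed definitions, the fluctuation feature amounts named above; in particular, the "increase time rate" is read here as the fraction of sampling steps in which the eye closure rate rose, which is one plausible interpretation of the patent's wording.

```python
import statistics

def fluctuation_features(closure_rates):
    """Fluctuation feature amounts over the second predetermined period.

    closure_rates: eye closure rates (%) computed once per first
    predetermined period, e.g. 15 one-minute values for a 15-minute
    second predetermined period.
    """
    n = len(closure_rates)
    std_dev = statistics.pstdev(closure_rates) if n > 1 else 0.0
    change_amount = closure_rates[-1] - closure_rates[0] if n > 1 else 0.0
    change_rate = change_amount / (n - 1) if n > 1 else 0.0   # slope per sample
    rises = sum(1 for a, b in zip(closure_rates, closure_rates[1:]) if b > a)
    increase_time_rate = rises / (n - 1) if n > 1 else 0.0
    return {
        "std_dev": std_dev,
        "change_amount": change_amount,
        "change_rate": change_rate,
        "increase_time_rate": increase_time_rate,
    }
```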
  • In step S8, the control unit 21 operates as the inattentive state determination unit 34, performs processing for determining the inattentive state of the driver 3 (in other words, an inattentive driving state) based on the fluctuation feature amount of the eye closure rate calculated in step S7, and proceeds to step S9.
  • For example, when the standard deviation of the eye closure rate is calculated as the fluctuation feature amount, it may be determined whether or not the standard deviation of the eye closure rate is equal to or greater than a predetermined threshold for determining the transition to the inattentive state.
  • When the change amount or change rate of the eye closure rate is calculated as the fluctuation feature amount, it may be determined whether or not the change amount or change rate of the eye closure rate is equal to or greater than a predetermined threshold (a change amount or a change rate) for determining the transition to the inattentive state, or whether or not the change amount or change rate continuously exceeds the predetermined threshold for a predetermined period (for example, for a certain time).
  • When the rate of increase of the eye closure rate in the second predetermined period is calculated as the fluctuation feature amount, it may be determined whether or not the rate of increase of the eye closure rate is equal to or greater than a predetermined threshold (a ratio) indicating a change toward the inattentive state, or whether or not it continuously exceeds the predetermined threshold for a predetermined period (for example, a certain time). In this case, if the rate of increase of the eye closure rate is equal to or greater than the predetermined threshold, or continuously exceeds it for the predetermined period, the driver is determined to be in the inattentive state.
  • The control unit 21 may also determine the inattentive state of the driver 3 based on two or more of the above-described types of fluctuation feature amount, or may determine the inattentive state of the driver 3 by combining the determination processing based on the fluctuation feature amount with determination processing based on the eye closure rate itself.
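  • A minimal determination sketch combining the features from the previous example; the threshold values and the OR-combination are illustrative assumptions, not values given by the patent.

```python
def is_inattentive(features, std_threshold=5.0, slope_threshold=0.5,
                   increase_threshold=0.7):
    """Threshold the fluctuation feature amounts (hypothetical values).

    Returns True when any configured feature meets its transition
    threshold; combining two or more features, or adding a check on
    the eye closure rate itself, is equally possible per the patent.
    """
    return (features["std_dev"] >= std_threshold
            or features["change_rate"] >= slope_threshold
            or features["increase_time_rate"] >= increase_threshold)
```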
  • In step S9, the control unit 21 determines, from the result of the inattentive state determination processing in step S8, whether or not the driver 3 is in the inattentive state; if it determines that the driver 3 is in the inattentive state, the process proceeds to step S10.
  • In step S10, the control unit 21 operates as the output unit 35, performs notification processing for shifting the driver 3 from the inattentive state to the awake state, for example, operating the notification unit 16 to output a warning sound or a predetermined announcement, and proceeds to step S11.
  • In step S11, the control unit 21 operates as the output unit 35, performs processing for storing the result determined to be the inattentive state (the inattentive state detection result) in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the processing.
  • The control unit 21 may transmit the data including the inattentive state detection result stored in the storage unit 22 to the driving evaluation device 4 at a predetermined timing.
  • According to the in-vehicle device 10 provided with the state determination device 20 according to embodiment (1), the inattentive state of the driver 3 is determined based on the above-described fluctuation feature amount of the eye closure rate, not on the value of the eye closure rate itself.
  • Therefore, even when the value of the eye closure rate is measured high in the awake state due to factors such as individual differences in the eye closure rate of each driver 3 or differences in the installation position of the camera 11 (in other words, the imaging position) in each vehicle 2, the inattentive state of each driver 3, that is, inattentive driving, can be determined accurately without being affected by these factors.
  • In a configuration in which the user sets a threshold for the eye closure rate for each driver or each time the vehicle starts traveling, and the inattentive state is determined based on the value of the eye closure rate, complicated operations such as threshold setting are required. In the above embodiment, such operations are unnecessary and take no time or effort, so the convenience for the user can be improved.
  • When the standard deviation of the eye closure rate in the second predetermined period is used as the fluctuation feature amount, changes in the degree of variation of the eye closure rate, expressed by an index of variation (for example, a value such as the standard deviation or the variance), are taken into account in determining the inattentive state of the driver 3. When the change amount or change rate of the eye closure rate in the second predetermined period is used as the fluctuation feature amount, the magnitude of the change in the eye closure rate is taken into account in determining the inattentive state of the driver 3.
  • When the rate of increase of the eye closure rate in the second predetermined period is used as the fluctuation feature amount, the tendency of the eye closure rate to increase with time is taken into account in determining the inattentive state of the driver 3. Therefore, by determining the inattentive state of the driver 3 based on these fluctuation feature amounts, the timing of the transition to the inattentive state can be determined accurately without being affected by the value of the eye closure rate itself, even when the eye closure rate is measured high in the awake state.
  • In addition, the driving evaluation device 4 can evaluate the inattentive state of each driver 3 fairly, without being affected by individual differences in the eye closure rate of each driver 3, and can output a more appropriate driving evaluation result to the operator terminal 6.
  • The business operator managing the vehicles 2 can use the driving evaluation result displayed on the operator terminal 6 to provide appropriate safe-driving education to the driver 3.
  • FIG. 9 is a block diagram showing a functional configuration example of the state determination device 20A according to the embodiment (2).
  • The functional configuration of the state determination device 20A is substantially the same as that of the state determination device 20 shown in FIG. 3, except that a preprocessing unit 36 is provided upstream of the fluctuation feature amount calculation unit 33A; configurations having the same functions are therefore given the same reference numerals, and descriptions thereof are omitted.
  • The preprocessing unit 36 performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32.
  • As the preprocessing, the preprocessing unit 36 may perform smoothing of the eye closure rate calculated by the eye closure rate calculation unit 32.
  • As the smoothing, the preprocessing unit 36 may perform, for example, processing for calculating a moving average of the eye closure rate at predetermined intervals.
  • The predetermined interval may be, for example, an interval of the first predetermined period × m (where m is an integer of 2 or more).
  • The fluctuation feature amount calculation unit 33A performs processing for calculating the fluctuation feature amount using the eye closure rate after preprocessing by the preprocessing unit 36.
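  • A minimal smoothing sketch, assuming a trailing moving-average window of m first-predetermined-period samples; the choice of a trailing (rather than centered) window is this sketch's, not the patent's.

```python
def smooth_closure_rates(closure_rates, m=3):
    """Moving average of the eye closure rate at intervals of the
    first predetermined period x m (m >= 2 per the patent; m=3 here
    is an arbitrary choice).

    Uses a trailing window so each smoothed value depends only on
    already-observed rates.
    """
    smoothed = []
    for i in range(len(closure_rates)):
        window = closure_rates[max(0, i - m + 1): i + 1]
        smoothed.append(sum(window) / len(window))
    return smoothed
```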
  • FIG. 10 (a) is a graph in which the moving average line of the eye closure rate is superimposed on the graph showing the time-series change of the eye closure rate of the driver having a low eye closure rate in normal times shown in FIG. 4 (a).
  • FIG. 10B is a graph in which the moving average line of the eye closure rate is superimposed on the graph showing the time-series change of the eye closure rate of the driver having a high eye closure rate in normal times shown in FIG. 5A.
  • the thick line shows the moving average of the eye closure rate.
  • FIG. 11 is a flowchart showing an example of the processing operation performed by the control unit 21A of the state determination device 20A according to embodiment (2). The processing operation shown in FIG. 11 is executed following the processing of step S6 in FIG. 7; since the processing operations up to step S6 in FIG. 7 are the same, descriptions thereof are omitted.
  • In FIG. 11, the control unit 21A operates as the preprocessing unit 36, the fluctuation feature amount calculation unit 33A, the inattentive state determination unit 34, and the output unit 35. The same processing contents as those shown in FIG. 8 are given the same step numbers, and descriptions thereof are omitted.
  • In step S21, the control unit 21A determines whether or not to execute smoothing of the eye closure rate based on the type of fluctuation feature amount used for the inattentive state determination. For example, when the fluctuation feature amount used for the inattentive state determination is the change amount or change rate of the eye closure rate, or the rate of increase of the eye closure rate, the control unit 21A determines that smoothing of the eye closure rate is to be executed, and the process proceeds to step S22.
  • In step S22, the control unit 21A operates as the preprocessing unit 36, performs smoothing on the eye closure rate data of the second predetermined period read in step S6, and proceeds to step S23.
  • For example, the control unit 21A performs processing for calculating the average value of the eye closure rate for each fixed interval within the second predetermined period while shifting the interval, that is, processing for obtaining a moving average.
  • In step S23, the control unit 21A operates as the fluctuation feature amount calculation unit 33A, performs processing for calculating the fluctuation feature amount of the eye closure rate using the eye closure rate smoothed in step S22, and proceeds to step S24.
  • The fluctuation feature amount to be calculated may be, for example, the change amount or change rate (slope) of the smoothed eye closure rate, or the rate of increase of the smoothed eye closure rate, and at least one of these fluctuation feature amounts may be calculated.
  • The calculated fluctuation feature amount may be stored in the storage unit 22 in association with data such as the elapsed time of the second predetermined period.
  • In step S24, the control unit 21A operates as the inattentive state determination unit 34, performs processing for determining the inattentive state of the driver based on the fluctuation feature amount (the change amount, change rate, or rate of increase) of the eye closure rate calculated in step S23, and proceeds to step S9.
  • When the change amount or change rate of the smoothed eye closure rate has been calculated as the fluctuation feature amount in step S23, it may be determined in step S24 whether or not the change amount or change rate of the smoothed eye closure rate is equal to or greater than a predetermined threshold (a change amount or a change rate) for determining the transition to the inattentive state. In this case, if the change amount or change rate of the smoothed eye closure rate is equal to or greater than the predetermined threshold, the driver is determined to be in the inattentive state.
  • When the rate of increase of the smoothed eye closure rate has been calculated as the fluctuation feature amount in step S23, it may be determined whether or not the rate of increase of the smoothed eye closure rate exceeds a predetermined threshold (a ratio) indicating a change toward the inattentive state. In this case, if the rate of increase of the smoothed eye closure rate is equal to or greater than the predetermined threshold, the driver is determined to be in the inattentive state.
  • On the other hand, when the type of fluctuation feature amount used for the inattentive state determination is, for example, the standard deviation of the eye closure rate, the control unit 21A determines in step S21 that smoothing of the eye closure rate is not to be executed, and the process proceeds to step S25.
  • In step S25, the control unit 21A operates as the fluctuation feature amount calculation unit 33A, performs processing for calculating the fluctuation feature amount of the eye closure rate using the unsmoothed eye closure rate, and proceeds to step S26.
  • The fluctuation feature amount to be calculated is, for example, the standard deviation of the eye closure rate, and the calculated fluctuation feature amount may be stored in the storage unit 22 in association with data such as the elapsed time of the second predetermined period.
  • In step S26, the control unit 21A performs processing for determining the inattentive state of the driver based on the fluctuation feature amount of the eye closure rate (the standard deviation of the eye closure rate) calculated in step S25, and proceeds to step S9. Since the processes of steps S9 to S11 are the same as those of steps S9 to S11 in FIG. 8, descriptions thereof are omitted here.
  • According to the in-vehicle device 10A provided with the state determination device 20A according to embodiment (2), whether or not to execute smoothing of the eye closure rate is determined based on the type of fluctuation feature amount used for the inattentive state determination. When the fluctuation feature amount is the change amount, change rate, or rate of increase of the eye closure rate, the preprocessing unit 36 performs smoothing as preprocessing of the eye closure rate, and the change amount, change rate, or rate of increase is calculated using the smoothed eye closure rate.
  • On the other hand, when the fluctuation feature amount is the standard deviation of the eye closure rate, the standard deviation is calculated using the unsmoothed eye closure rate, so the standard deviation of the eye closure rate can be calculated as a feature amount that makes it easy to grasp the tendency to shift to the inattentive state.
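  • The branch of steps S21, S22, and S25 can be sketched as follows, reusing the helper functions from the earlier examples; the feature-type names are this sketch's own, not the patent's.

```python
def compute_feature(closure_rates, feature_type):
    """Smooth only for change-based features, as in embodiment (2).

    feature_type: "std_dev", "change_amount", "change_rate", or
    "increase_time_rate" (names defined by this sketch).
    """
    if feature_type == "std_dev":
        data = closure_rates                          # no smoothing (step S25)
    else:
        data = smooth_closure_rates(closure_rates)    # smoothing (step S22)
    return fluctuation_features(data)[feature_type]
```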
  • FIG. 12 is a block diagram showing a functional configuration example of the state determination device 20B according to the embodiment (3).
  • The functional configuration of the state determination device 20B according to embodiment (3) is substantially the same as that of the state determination device 20 shown in FIG. 3, except that a vehicle dynamic data acquisition unit 37, an event detection unit 38, and a preprocessing unit 36A are further provided; configurations having the same functions are given the same reference numerals, and descriptions thereof are omitted.
  • The vehicle dynamic data acquisition unit 37 performs processing for acquiring at least one of the following dynamic data and sending the acquired data to the event detection unit 38: the acceleration data of the vehicle 2 detected by the acceleration sensor 12 of the in-vehicle device 10, the angular velocity data of the vehicle 2 detected by the angular velocity sensor 13, and the position data of the vehicle 2 detected by the GPS receiving unit 14.
  • the event detection unit 38 includes a vehicle dynamics detection unit 381 and a face-facing event detection unit 382.
  • the vehicle dynamics detection unit 381 detects a predetermined event based on the dynamic data of the vehicle 2 acquired from the vehicle dynamics data acquisition unit 37.
  • The face-facing event detection unit 382 detects a face-facing event of the driver 3 based on the image data acquired from the image acquisition unit 30. Alternatively, the face-facing event detection unit 382 may detect a face-facing event of the driver 3 based on both the image data acquired from the image acquisition unit 30 and the dynamic data of the vehicle 2 acquired from the vehicle dynamic data acquisition unit 37.
  • The vehicle dynamics detection unit 381 detects, as the predetermined events, an event in which the type of road on which the vehicle 2 travels switches, an event in which the vehicle 2 stops, an event in which the vehicle 2 starts traveling, an event in which sudden steering occurs in the vehicle 2, an event in which sudden braking occurs in the vehicle 2, and an event in which an impact occurs in the vehicle 2.
  • The face-facing event detection unit 382 detects events in which the orientation of the face of the driver 3 changes while the vehicle 2 is being driven. The detected face-facing events include, for example, at least one of right/left confirmation at an intersection, traveling-direction confirmation at an intersection, a face-undetected state, and looking aside.
  • The preprocessing unit 36A performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32.
  • For example, the preprocessing unit 36A removes, from the data used for calculating the fluctuation feature amount, the eye closure rate calculated at the time when, or during the period (for example, the first predetermined period) in which, a predetermined event was detected by the event detection unit 38, among the eye closure rates of the second predetermined period calculated by the eye closure rate calculation unit 32.
  • FIG. 13 is a flowchart showing a processing operation performed by the control unit 21B of the state determination device 20B according to the embodiment (3).
  • FIG. 13 shows an example in which the control unit 21B operates as the image acquisition unit 30, the eye opening/closing degree detection unit 31, the eye closure rate calculation unit 32, the vehicle dynamic data acquisition unit 37, the event detection unit 38, and the preprocessing unit 36A. The same processing contents as those shown in FIG. 7 are given the same step numbers, and descriptions thereof are omitted here.
  • In step S31, the control unit 21B operates as the vehicle dynamic data acquisition unit 37, performs processing for acquiring the dynamic data of the vehicle 2, and proceeds to step S32.
  • For example, the control unit 21B acquires at least one of the acceleration data detected by the acceleration sensor 12, the angular velocity data detected by the angular velocity sensor 13, and the position data detected by the GPS receiving unit 14 at predetermined intervals (intervals of several tens of milliseconds, or of several seconds).
  • In step S32, the control unit 21B operates as the vehicle dynamics detection unit 381, performs processing for detecting, using the dynamic data of the vehicle 2 acquired in step S31, whether or not the vehicle 2 is passing through an intersection (for example, making a right or left turn), and proceeds to step S33.
  • To detect whether or not the vehicle 2 is passing through an intersection, for example, it may be detected whether or not the absolute value of the angular velocity detected by the angular velocity sensor 13 exceeds a predetermined angular velocity threshold, and the time at which the angular velocity threshold is exceeded may be detected as the intersection passage time.
  • A condition that the vehicle speed is at or below a predetermined speed suitable for passing through an intersection may be added, as in the sketch below.
  • The data detected during passage through the intersection may be stored in the storage unit 22 in association with data such as the passage time and the position of the intersection.
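  • A minimal sketch of the intersection-passage test, assuming yaw rate in degrees per second and speed in km/h; both threshold values are illustrative.

```python
def passing_intersection(yaw_rate_dps, speed_kmh,
                         yaw_threshold=10.0, speed_limit=30.0):
    """Treat a sample as intersection passage (right or left turn) when
    the absolute angular velocity exceeds a threshold while the speed
    is low enough for turning. Threshold values are assumptions.
    """
    return abs(yaw_rate_dps) > yaw_threshold and speed_kmh <= speed_limit
```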
  • In step S33, the control unit 21B operates as the vehicle dynamics detection unit 381, performs processing for detecting switching of the type of road on which the vehicle 2 is traveling using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S34.
  • For example, the control unit 21B may calculate the vehicle speed from the moving distance per unit time using the position data acquired in step S31 and detect switching of the road type based on the calculated vehicle speed; for example, a switch from a general road to an expressway may be detected.
  • Likewise, the control unit 21B may detect a switch from an expressway to a general road, a switch from a general road to a residential road (for example, a road with a speed limit of 30 km/h or less), or a switch from a residential road to a general road, based on the vehicle speed (a speed-based sketch follows below).
  • Alternatively, the control unit 21B may detect switching of the road type by collating road map data (including road type data) with the position data, or based on a communication signal acquired by road-to-vehicle communication or the like (for example, a signal indicating entry to or exit from an expressway).
  • The data in which switching of the road type was detected may be stored in the storage unit 22 in association with data such as the position data and the detection time.
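  • A speed-band sketch of the road-type switching detection; the 30 km/h and 80 km/h bands are assumptions for illustration, since the patent leaves the concrete criteria open (and equally allows map matching or road-to-vehicle signals).

```python
def classify_road_type(speed_kmh_history):
    """Crude speed-based road classification used to spot switches.

    Bands (residential <= 30 km/h, expressway >= 80 km/h, general road
    otherwise) are illustrative assumptions, not the patent's values.
    """
    if not speed_kmh_history:
        return "general"
    avg = sum(speed_kmh_history) / len(speed_kmh_history)
    if avg >= 80.0:
        return "expressway"
    if avg <= 30.0:
        return "residential"
    return "general"

def road_type_switched(prev_history, curr_history):
    # A switch is flagged when consecutive observation windows classify
    # to different road types.
    return classify_road_type(prev_history) != classify_road_type(curr_history)
```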
  • In step S34, the control unit 21B operates as the vehicle dynamics detection unit 381, performs processing for detecting a stop of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S35.
  • A stop of the vehicle 2 may be detected, for example, when a state in which the position data acquired in step S31 does not change is detected continuously for a predetermined time (about several seconds to several tens of seconds), or when a state in which the vehicle speed obtained from the position data is at or below a predetermined value (for example, creeping speed) continues for a predetermined time.
  • The data in which the stop of the vehicle 2 was detected may be stored in the storage unit 22 in association with data such as the acquisition time of the position data.
  • In step S35, the control unit 21B operates as the vehicle dynamics detection unit 381, performs processing for detecting the start of traveling of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S36.
  • The start of traveling of the vehicle 2 may be detected, for example, when a change in the position data is detected continuously for a predetermined time (about several seconds to several tens of seconds) after a state in which the position data did not change, or when it is detected that the vehicle speed obtained from the position data has become equal to or higher than a predetermined value; in these cases, it may be assumed that the vehicle 2 has started traveling. A combined stop/start sketch follows below.
  • The data in which the start of traveling of the vehicle 2 was detected may be stored in the storage unit 22 in association with data such as the time at which the start of traveling was detected.
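  • A combined stop/start detection sketch over a speed series derived from the position data; the slow-speed value and the hold length stand in for the patent's "several seconds to several tens of seconds".

```python
def detect_stop_and_start(speeds_kmh, stop_speed=5.0, hold_samples=10):
    """Detect vehicle stops (speed at or below a slow-speed value for a
    sustained run of samples) and the subsequent starts of traveling.

    Returns (stop_indices, start_indices); stop_speed and hold_samples
    are illustrative placeholders.
    """
    stops, starts = [], []
    run = 0
    stopped = False
    for i, v in enumerate(speeds_kmh):
        if v <= stop_speed:
            run += 1
            if not stopped and run >= hold_samples:
                stopped = True
                stops.append(i)          # sustained stop detected here
        else:
            if stopped:
                starts.append(i)         # first moving sample after a stop
            stopped = False
            run = 0
    return stops, starts
```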
  • In step S36, the control unit 21B operates as the vehicle dynamics detection unit 381, performs processing for detecting sudden steering or sudden braking of the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S37.
  • Sudden steering of the vehicle 2 may be detected, for example, when an angular velocity (including the yaw angular velocity) or an acceleration (including the lateral acceleration) acquired in step S31 that is equal to or greater than a predetermined threshold for determining the occurrence of sudden steering is detected, on the assumption that sudden steering has occurred in the vehicle 2.
  • Sudden braking (sudden deceleration) of the vehicle 2 may be detected, for example, when an acceleration (including the longitudinal acceleration) acquired in step S31 that is equal to or greater than a predetermined threshold for determining the occurrence of sudden braking is detected, on the assumption that sudden braking has occurred in the vehicle 2.
  • The data in which sudden steering or sudden braking of the vehicle 2 was detected may be stored in the storage unit 22 in association with data such as the time at which the sudden steering or sudden braking was detected.
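  • A minimal sketch of the sudden-steering and sudden-braking tests; the yaw-rate and g-force thresholds are placeholders, not values from the patent.

```python
def sudden_steering(yaw_rate_dps, lateral_g, yaw_th=30.0, lat_th=0.4):
    """Sudden steering when the yaw rate or the lateral acceleration
    reaches a threshold (illustrative values)."""
    return abs(yaw_rate_dps) >= yaw_th or abs(lateral_g) >= lat_th

def sudden_braking(longitudinal_g, brake_th=-0.4):
    """Sudden braking when the longitudinal deceleration (negative g
    along the travel direction) reaches a threshold."""
    return longitudinal_g <= brake_th
```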
  • In step S37, the control unit 21B operates as the vehicle dynamics detection unit 381, performs processing for detecting an impact on the vehicle 2 using the dynamic data of the vehicle 2 acquired in step S31, and proceeds to step S38.
  • An impact on the vehicle 2 may be detected, for example, when an acceleration (including the longitudinal or lateral acceleration) acquired in step S31 that is equal to or greater than a predetermined threshold for determining the occurrence of an impact due to a collision or the like is detected, on the assumption that an impact has occurred on the vehicle 2.
  • The data in which the impact on the vehicle 2 was detected may be stored in the storage unit 22 in association with data such as the time at which the impact was detected.
  • In step S38, the control unit 21B determines whether or not the eye closure rate has been calculated in step S4, which is performed in parallel; if it determines that the eye closure rate has not been calculated, the process returns to step S31, and if it determines that it has, the process proceeds to step S50. The processes of steps S41 to S43 are executed in parallel with the processes of steps S31 to S38 described above.
  • In step S41, the control unit 21B processes the image acquired from the camera 11 in step S1, performs processing for detecting the orientation of the face of the driver 3 from the image, and proceeds to step S42.
  • the method of detecting the orientation of the driver 3's face from the image is not particularly limited.
  • For example, the control unit 21B may detect the position or shape of each facial organ, such as the eyes, nose, mouth, and eyebrows, from the face region in the image, and detect the orientation of the face based on the detected positions or shapes of the facial organs.
  • The orientation of the face of the driver 3 detected by the control unit 21B may be indicated by, for example, a pitch angle, which is the angle around the X axis (left-right axis) of the face (up-down orientation), a yaw angle, which is the angle around the Y axis (up-down axis) of the face (left-right orientation), and a roll angle, which is the angle around the Z axis (front-back axis) of the face (left-right tilt), and includes at least the yaw angle indicating the left-right orientation. These angles can be expressed with respect to a predetermined reference direction; for example, the reference direction may be set to the front direction of the driver.
  • the detected face orientation data may be stored in the storage unit 22 in association with data such as an image acquisition time or a frame number, for example.
  • In step S42, the control unit 21B operates as the face-facing event detection unit 382, performs processing for detecting a face-facing event of the driver 3, and proceeds to step S43.
  • For example, a face-facing event of the driver 3 such as a face-undetected state or a looking-aside state may be detected based on the result of the face orientation detection in step S41 (a minimal sketch follows after the next paragraph).
  • In addition, the control unit 21B may detect an event such as a safety confirmation operation when passing through an intersection (an operation such as left-right confirmation or traveling-direction confirmation) based on the result of the face orientation detection in step S41 and the detection result in step S32 of whether or not the vehicle is passing through an intersection.
  • the detected face-facing event detection data may be stored in the storage unit 22 in association with data such as an image acquisition time, an intersection passage time, or a position.
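  • A minimal face-facing event sketch over a per-frame yaw series, assuming that a looking-aside event requires the yaw to stay beyond a fixed angle for a run of frames; all constants are illustrative.

```python
def face_events(yaw_deg_series, face_found_series, aside_yaw=30.0, hold=15):
    """Flag a looking-aside event when the yaw angle stays beyond
    aside_yaw degrees from the front for `hold` consecutive frames,
    and a face-undetected event when detection fails for that long.

    Returns a list of (frame_index, event_label) pairs.
    """
    events = []
    aside_run = lost_run = 0
    for i, (yaw, found) in enumerate(zip(yaw_deg_series, face_found_series)):
        lost_run = lost_run + 1 if not found else 0
        aside_run = aside_run + 1 if (found and abs(yaw) >= aside_yaw) else 0
        if lost_run == hold:                 # fires once per episode
            events.append((i, "face_undetected"))
        if aside_run == hold:
            events.append((i, "looking_aside"))
    return events
```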
  • In step S43, the control unit 21B determines whether or not the eye closure rate has been calculated in step S4, which is performed in parallel; if it determines that the eye closure rate has not been calculated, the process returns to step S41, and if it determines that it has, the process proceeds to step S50.
  • In step S50, the control unit 21B performs preprocessing on the eye closure rate calculated in step S4, and proceeds to step S5. The content of the preprocessing in step S50 will be described later.
  • In step S5, the control unit 21B determines whether or not the eye closure rates for the second predetermined period have been calculated; if it determines that they have not, the process returns to step S1 and the processing is repeated, and if it determines that they have, the process proceeds to step S6.
  • In step S6, the control unit 21B reads the preprocessed eye closure rate data of the second predetermined period from the storage unit 22, and then returns to step S1 to repeat the eye opening/closing degree detection processing and the eye closure rate calculation processing.
  • FIG. 14 is a flowchart showing an example of processing operation performed by the control unit 21B of the state determination device 20B according to the embodiment (3).
  • FIG. 14 shows an operation example of the preprocessing of the eye closure rate in step S50 shown in FIG.
  • the processing procedure described below is only an example and may be changed as appropriate, and steps may be omitted, replaced, or added depending on the embodiment.
  • In step S51, the control unit 21B determines whether or not switching of the road type was detected in step S33 of FIG. 13 during the first predetermined period; if it determines that switching of the road type was detected in the first predetermined period, the process proceeds to step S52.
  • In step S52, the control unit 21B performs processing for removing the eye closure rate calculated before the switching of the road type was detected from the target data used for calculating the fluctuation feature amount, and then ends the preprocessing. On the other hand, if it is determined in step S51 that switching of the road type was not detected in the first predetermined period, the process proceeds to step S53.
  • In step S53, the control unit 21B determines whether or not a stop of the vehicle 2 was detected in step S34 of FIG. 13 during the first predetermined period; if it determines that a stop of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S54.
  • In step S54, the control unit 21B performs processing for removing the eye closure rate of the first predetermined period including the time at which the vehicle 2 was stopped from the calculation target data, and then ends the preprocessing.
  • On the other hand, if it is determined in step S53 that a stop of the vehicle 2 was not detected in the first predetermined period, the process proceeds to step S55.
  • In step S55, the control unit 21B determines whether or not the start of traveling of the vehicle 2 was detected in step S35 of FIG. 13 during the first predetermined period; if it determines that the start of traveling of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S56.
  • In step S56, the control unit 21B performs processing for removing the eye closure rate of the first predetermined period including the time at which the start of traveling of the vehicle 2 was detected from the calculation target data, and then ends the preprocessing. On the other hand, if it is determined in step S55 that the start of traveling of the vehicle 2 was not detected in the first predetermined period, the process proceeds to step S57.
  • In step S57, the control unit 21B determines whether or not sudden steering or sudden braking of the vehicle 2 was detected in step S36 of FIG. 13 during the first predetermined period; if it determines that sudden steering or sudden braking of the vehicle 2 was detected in the first predetermined period, the process proceeds to step S58.
  • In step S58, the control unit 21B performs processing for removing the eye closure rate of the first predetermined period including the detection of the sudden steering or sudden braking of the vehicle 2 from the calculation target data, and then ends the preprocessing.
  • On the other hand, if it is determined in step S57 that sudden steering or sudden braking of the vehicle 2 was not detected in the first predetermined period, the process proceeds to step S59.
  • In step S59, the control unit 21B determines whether or not an impact on the vehicle 2 was detected in step S37 of FIG. 13 during the first predetermined period; if it determines that an impact on the vehicle 2 was detected in the first predetermined period, the process proceeds to step S60.
  • In step S60, the control unit 21B performs processing for removing the eye closure rate of the first predetermined period including the time at which the impact on the vehicle 2 was detected from the calculation target data, and then ends the preprocessing.
  • On the other hand, if it is determined in step S59 that an impact on the vehicle 2 was not detected in the first predetermined period, the process proceeds to step S61.
  • In step S61, the control unit 21B determines whether or not a face-facing event of the driver 3 was detected in step S42 of FIG. 13 during the first predetermined period; if it determines that a face-facing event of the driver 3 was detected in the first predetermined period, the process proceeds to step S62.
  • In step S62, the control unit 21B performs processing for removing the eye closure rate of the first predetermined period including the time at which the face-facing event of the driver 3 was detected from the calculation target data, and then ends the preprocessing.
  • When the preprocessing ends, the process proceeds to step S5. A sketch of the overall removal cascade follows below.
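  • The removal cascade of steps S51 to S62 can be sketched as one filter over the closure-rate records, assuming all detected event times have been merged into a single list; marking removed samples as None anticipates the interpolation of embodiment (4).

```python
def remove_event_windows(rate_records, event_times, period_s=60.0):
    """Steps S51 to S62 as one filter: drop any closure-rate sample
    whose first-predetermined-period window contains an event time.

    rate_records: list of (window_end_time_s, closure_rate_percent).
    event_times: merged detection times of road-type switches, stops,
    travel starts, sudden steering/braking, impacts, and face events.
    Removed samples are kept as (time, None) so they can be
    interpolated later.
    """
    out = []
    for end_t, rate in rate_records:
        start_t = end_t - period_s
        if any(start_t <= t <= end_t for t in event_times):
            out.append((end_t, None))   # removed from feature calculation
        else:
            out.append((end_t, rate))
    return out
```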
  • According to the in-vehicle device 10B provided with the state determination device 20B according to embodiment (3), the preprocessing unit 36A removes, from the data used for calculating the fluctuation feature amount, the eye closure rate calculated at the time when, or during the period in which, a predetermined event was detected by the event detection unit 38, among the eye closure rates of the second predetermined period, and the fluctuation feature amount is calculated using the eye closure rate after the removal processing. Therefore, by using the eye closure rate after the removal processing for the calculation of the fluctuation feature amount, a feature amount that more accurately represents the characteristics of the transition to the inattentive state can be calculated, and the transition to the inattentive state can be determined accurately.
  • Since the event detection unit 38 includes the vehicle dynamics detection unit 381, the eye closure rate calculated at the time when, or during the period in which, at least one of an event in which the type of road on which the vehicle 2 travels switches, an event in which the vehicle 2 stops, and an event in which the vehicle 2 starts traveling was detected can be excluded from the data used for calculating the fluctuation feature amount. Further, since the event detection unit 38 includes the face-facing event detection unit 382, the eye closure rate calculated at the time when, or during the period in which, an event in which the orientation of the face of the driver 3 changes during driving was detected can likewise be excluded from the data used for calculating the fluctuation feature amount.
  • As a result, a fluctuation feature amount that more accurately captures the tendency to shift to the inattentive state can be calculated, and the transition of the driver 3 to the inattentive state in an actual vehicle environment can be determined accurately.
  • FIG. 15 is a block diagram showing a functional configuration example of the state determination device 20C according to the embodiment (4).
  • The functional configuration of the state determination device 20C is substantially the same as that of the state determination device 20B shown in FIG. 12, except that a preprocessing unit 36B and an interpolation processing unit 39 are provided; configurations having the same functions are given the same reference numerals, and descriptions thereof are omitted.
  • The preprocessing unit 36B performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32. Like the preprocessing unit 36A in FIG. 12, the preprocessing unit 36B removes, from the data used for calculating the fluctuation feature amount, the eye closure rate calculated at the time when, or during the period (for example, the first predetermined period) in which, a predetermined event was detected by the event detection unit 38, among the eye closure rates of the second predetermined period calculated by the eye closure rate calculation unit 32.
  • The interpolation processing unit 39 performs processing for interpolating the eye closure rate of the second predetermined period at the time when, or during the period (for example, the first predetermined period) in which, it was removed by the removal processing of the preprocessing unit 36B. By interpolating the removed eye closure rate before calculating the fluctuation feature amount, the accuracy of the fluctuation feature amount calculated by the fluctuation feature amount calculation unit 33B can be improved.
  • The preprocessing unit 36B performs the removal processing and, for example, depending on the type of fluctuation feature amount used for the inattentive state determination, smoothing of the eye closure rate after the interpolation processing. As the smoothing, the preprocessing unit 36B may perform, for example, processing for calculating a moving average of the eye closure rate at predetermined intervals.
  • The fluctuation feature amount calculation unit 33B performs processing for calculating the fluctuation feature amount using the eye closure rate after the preprocessing and the interpolation processing (the removal processing, the interpolation processing, and the smoothing processing) by the preprocessing unit 36B and the interpolation processing unit 39.
  • FIG. 16 is a flowchart showing the processing operation performed by the control unit 21C of the state determination device 20C according to embodiment (4). The processing operation shown in FIG. 16 is executed following the processing of step S6 shown in FIG. 13. FIG. 16 shows an example in which the control unit 21C operates as the preprocessing unit 36B, the interpolation processing unit 39, the fluctuation feature amount calculation unit 33B, the inattentive state determination unit 34A, and the output unit 35. The same processing contents as those shown in FIG. 11 are given the same step numbers, and descriptions thereof are omitted.
  • In step S71, the control unit 21C operates as the interpolation processing unit 39, performs processing for interpolating, among the eye closure rates calculated for each first predetermined period within the second predetermined period, the eye closure rate removed by the preprocessing in step S50 of FIG. 13, and then proceeds to step S21.
  • For example, the control unit 21C may perform processing for interpolating the eye closure rate removed by the preprocessing of step S50 with the eye closure rate calculated immediately before or immediately after the removed eye closure rate.
  • Alternatively, the control unit 21C may perform processing for interpolating with the average value of the eye closure rates calculated immediately before and after the removed eye closure rate, or may perform interpolation processing based on the change amount or change rate (slope) of the eye closure rates calculated immediately before and after the removed eye closure rate; a minimal sketch follows below.
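  • A sketch of the neighbour-average interpolation described above, operating on the None-marked records from the earlier removal sketch; slope-based (linear) interpolation would be an equally valid variant per the patent.

```python
def interpolate_removed(rate_records):
    """Fill samples removed by preprocessing (None) with the mean of
    the nearest surviving neighbours; fall back to the single
    available neighbour at the edges.
    """
    rates = [r for _, r in rate_records]
    filled = list(rates)
    for i, r in enumerate(rates):
        if r is not None:
            continue
        prev_vals = [v for v in rates[:i] if v is not None]
        next_vals = [v for v in rates[i + 1:] if v is not None]
        prev = prev_vals[-1] if prev_vals else None
        nxt = next_vals[0] if next_vals else None
        if prev is not None and nxt is not None:
            filled[i] = (prev + nxt) / 2.0      # mean of the two neighbours
        else:
            filled[i] = prev if prev is not None else nxt
    return [(t, v) for (t, _), v in zip(rate_records, filled)]
```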
  • Since steps S21 to S26 are basically the same as the processing contents of steps S21 to S26 shown in FIG. 11, except that the interpolated eye closure rate data are used, descriptions thereof are omitted here.
  • Step S22 corresponds to the processing operation of the preprocessing unit 36B, steps S23 and S25 correspond to the processing operation of the fluctuation feature amount calculation unit 33B, and steps S24 and S26 correspond to the processing operation of the inattentive state determination unit 34A.
  • After performing the inattentive state determination processing in step S24 or step S26, the control unit 21C proceeds to step S72.
  • In step S72, the control unit 21C operates as the inattentive state determination unit 34A and determines whether or not the fluctuation feature amount of the eye closure rate is equal to or greater than a threshold A set as the criterion for performing warning processing; if it determines that the fluctuation feature amount is not equal to or greater than the threshold A, the processing ends, and if it determines that it is, the process proceeds to step S73.
  • When the level of the inattentive state is classified into, for example, low, medium, and high, the threshold A may be used as a criterion for determining an inattentive state of medium level or lower.
  • In step S73, the control unit 21C operates as the output unit 35, performs warning processing for the driver 3, and proceeds to step S74.
  • For example, the control unit 21C may operate the notification unit 16 to output an alarm sound or a warning announcement for awakening the driver 3.
  • In step S74, the control unit 21C operates as the inattentive state determination unit 34A and determines whether or not the fluctuation feature amount of the eye closure rate is equal to or greater than a threshold B set as the criterion for making an external report; if it determines that the fluctuation feature amount is not equal to or greater than the threshold B, the process proceeds to step S76, and if it determines that it is, the process proceeds to step S75.
  • The threshold B is larger than the threshold A and, when the level of the inattentive state is classified into, for example, low, medium, and high, may be used as a criterion for determining an inattentive state higher than the medium level.
  • In step S75, the control unit 21C operates as the output unit 35, performs processing for reporting to the outside that the driver 3 is in the inattentive state, and then proceeds to step S76.
  • For example, the control unit 21C operates the communication unit 15 to notify the operator terminal 6 that the driver 3 is in an inattentive state higher than the medium level.
  • In step S76, when it has been determined in step S74 that the fluctuation feature amount is not equal to or greater than the threshold B, the control unit 21C performs processing for storing the determination result of the inattentive state at or above the threshold A in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the processing.
  • On the other hand, when it has been determined in step S74 that the fluctuation feature amount is equal to or greater than the threshold B and the external reporting processing has been executed in step S75, the control unit 21C performs, in step S76, processing for storing the determination result of the inattentive state at or above the threshold B in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the processing. A sketch of this two-stage response follows below.
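  • The two-stage response of steps S72 to S76 can be sketched as follows; the threshold values are placeholders for the patent's thresholds A and B.

```python
def staged_response(feature_value, threshold_a=5.0, threshold_b=10.0):
    """Warn at threshold A; additionally report externally at the
    larger threshold B, as in embodiment (4). Values are illustrative.
    """
    actions = []
    if feature_value >= threshold_a:
        actions.append("warn_driver")        # step S73
    if feature_value >= threshold_b:
        actions.append("notify_operator")    # step S75
    return actions
```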
  • The control unit 21C may transmit the data including the inattentive state detection result stored in the storage unit 22 to the driving evaluation device 4 at a predetermined timing.
  • According to the in-vehicle device 10C provided with the state determination device 20C according to embodiment (4), the preprocessing unit 36B removes, from the data used for calculating the fluctuation feature amount, the eye closure rate calculated at the time when, or during the period in which, a predetermined event was detected by the event detection unit 38, among the eye closure rates of the second predetermined period, and the fluctuation feature amount is calculated using the eye closure rate after the removal processing.
  • In addition, since the interpolation processing unit 39 interpolates the removed eye closure rate and the preprocessing unit 36B further smooths the eye closure rate after the removal and interpolation processing, the smoothed eye closure rate can be used for calculating the fluctuation feature amount, and the accuracy of the fluctuation feature amount can be improved.
  • Further, since the inattentive state determination unit 34A determines the inattentive state of the driver 3 stepwise using a plurality of determination thresholds (the threshold A and the threshold B), an appropriate determination according to the degree of the inattentive state can be made, and an appropriate output according to the degree of the inattentive state can be performed.
  • In the above embodiments, the state determination device 20 acquires an image from the camera 11 and performs processing for detecting data such as the orientation of the face, the direction of the line of sight, and the degree of eye opening/closing.
  • In another embodiment, the camera 11 may include an image analysis unit including, for example, an image processing processor, and the image analysis unit may perform processing for detecting data on the orientation of the face of the driver 3, the direction of the line of sight, and the degree of eye opening/closing using the captured image.
  • In that case, the state determination device 20 may acquire the driving behavior data of the driver 3, the image data, and the imaging date/time data from the camera 11, and execute the eye closure rate calculation and the subsequent processing using the acquired eye opening/closing degree data.
  • The state determination device is not limited to application to an in-vehicle device.
  • For example, the state determination device can also be incorporated into an industrial equipment system and applied to determining the inattentive state of a person performing predetermined work.
  • Appendix 1: A state determination device (20) that determines the state of a person, comprising:
  • an eye opening/closing degree detection unit (31) that detects the degree of eye opening/closing of the person from an image of the person's face;
  • an eye closure rate calculation unit (32) that calculates the eye closure rate of the person using the degrees of eye opening/closing of a first predetermined period detected by the eye opening/closing degree detection unit (31);
  • a fluctuation feature amount calculation unit (33) that calculates a fluctuation feature amount of the eye closure rate using the eye closure rates of a second predetermined period calculated by the eye closure rate calculation unit (32); and
  • an inattentive state determination unit (34) that determines the inattentive state of the person based on the fluctuation feature amount calculated by the fluctuation feature amount calculation unit (33).
  • A driving evaluation system (1) characterized in that it is configured to include these.
  • an eye opening/closing degree detection step (S2) of detecting the eye opening/closing degree of the person from an image of the person's face;
  • an eye closure rate calculation step (S4) of calculating the eye closure rate of the person using the eye opening/closing degree of the first predetermined period detected in the eye opening/closing degree detection step (S2);
  • a variation feature amount calculation step (S7) of calculating a variation feature amount of the eye closure rate; and
  • a state determination method including an inattentive state determination step (S8) of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step (S7).
  • a variation feature amount calculation step (S7) of calculating a variation feature amount of the eye closure rate; and
  • a program for executing an inattentive state determination step (S8) of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step (S7).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

This state determination device determines the state of a person and is equipped with: an eye opening/closing degree detection unit that detects the degree of eye opening/closing of the person from an image capturing the person's face; an eye closure rate calculation unit that calculates the person's eye closure rate using the eye opening/closing degree detected during a first predetermined period; a variation feature amount calculation unit that calculates a variation feature amount of the eye closure rate using the eye closure rate calculated during a second predetermined period; and an inattentive state determination unit that determines the person's inattentive state on the basis of the calculated variation feature amount.

Description

State determination device, in-vehicle device, driving evaluation system, state determination method, and program

The present invention relates to a state determination device, an in-vehicle device, a driving evaluation system, a state determination method, and a program.
Patent Document 1 discloses an arousal maintenance device that maintains the arousal of a vehicle driver.
The arousal maintenance device includes: a stimulus execution unit that, using a display device that displays a visual stimulus, executes a stimulus process that shifts the display position of the visual stimulus left and right within the driver's forward field of view; a control unit that causes the stimulus execution unit to execute the stimulus process at each interval time; an inattentive state estimation unit that estimates the degree of the driver's inattentive state; and a time setting unit that sets the interval time shorter as the degree of the inattentive state estimated by the inattentive state estimation unit becomes higher.
The inattentive state estimation unit photographs the driver's face with a driver camera and acquires a moving image. The inattentive state estimation unit then uses the acquired moving image to obtain the driver's eye closure rate (the proportion of time during which the eyes are closed).
Next, the inattentive state estimation unit determines whether the acquired eye closure rate is at or below an upper limit (8%). If the eye closure rate is at or below the upper limit, the time setting unit sets, based on the eye closure rate, the interval time at which the stimulus execution unit executes the stimulus process. If the eye closure rate exceeds the upper limit, the stimulus execution unit executes a direct arousal process.
As described above, in the arousal maintenance device of Patent Document 1, the degree of the inattentive state is determined based on the value of the eye closure rate, and processing by the time setting unit or the stimulus execution unit is executed accordingly.
Japanese Unexamined Patent Publication No. 2018-136849
However, because the eye closure rate varies from person to person to some extent even in an awake state, it may be measured relatively high even while the person is awake. Likewise, depending on where the driver camera is installed, the eye closure rate may be measured relatively high even in an awake state. In such cases, a determination based on the eye closure rate value described above risks erroneously judging an awake driver to be inattentive, so the inattentive state cannot be determined accurately.
The present invention has been made in view of the above problems, and its object is to provide a state determination device capable of accurately determining a person's inattentive state, an in-vehicle device equipped with the device, a driving evaluation system configured to include the in-vehicle device, a state determination method, and a program for realizing the method.
To achieve the above object, the state determination device (1) according to the present disclosure is a state determination device that determines the state of a person, comprising:
an eye opening/closing degree detection unit that detects the eye opening/closing degree of the person from an image of the person's face;
an eye closure rate calculation unit that calculates the eye closure rate of the person using the eye opening/closing degree of a first predetermined period detected by the eye opening/closing degree detection unit;
a variation feature amount calculation unit that calculates a variation feature amount of the eye closure rate using the eye closure rate of a second predetermined period calculated by the eye closure rate calculation unit; and
an inattentive state determination unit that determines the inattentive state of the person based on the variation feature amount calculated by the variation feature amount calculation unit.
According to the state determination device (1), the eye opening/closing degree of the person is detected from the image, the eye closure rate of the person is calculated using the eye opening/closing degree detected during the first predetermined period, the variation feature amount of the eye closure rate is calculated using the eye closure rate calculated during the second predetermined period, and the inattentive state of the person is determined based on the calculated variation feature amount. Because the inattentive state is determined based on the variation feature amount, even when the eye closure rate is measured relatively high during wakefulness due to factors such as individual differences in the eye closure rate or differences in the imaging position of the image, the inattentive state can be determined accurately without being affected by these factors. The person may be, for example, a driver driving a vehicle or a worker performing a predetermined task.
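To make the flow above concrete, the following is a minimal sketch, in Python, of the claimed pipeline (per-frame eye opening/closing degree → eye closure rate over a first predetermined period → variation feature amount over a second predetermined period → determination). The window lengths, the closed-eye threshold, and the determination threshold are illustrative assumptions, not values fixed by the disclosure.

```python
from collections import deque
import statistics

CLOSED_EYE_THRESHOLD = 0.20     # assumed: opening degree at or below this counts as closed
FIRST_PERIOD_FRAMES = 900       # assumed: ~1 minute at 15 fps -> one closure-rate sample
SECOND_PERIOD_SAMPLES = 10      # assumed: closure-rate samples per variation feature
DETERMINATION_THRESHOLD = 0.05  # assumed: threshold on the variation feature amount

openness_window = deque(maxlen=FIRST_PERIOD_FRAMES)
closure_rates = deque(maxlen=SECOND_PERIOD_SAMPLES)

def on_frame(eye_openness: float):
    """Feed one per-frame eye opening/closing degree; return a state when decidable."""
    openness_window.append(eye_openness)
    if len(openness_window) < FIRST_PERIOD_FRAMES:
        return None
    # Eye closure rate: fraction of the first period spent with the eyes closed.
    rate = sum(o <= CLOSED_EYE_THRESHOLD for o in openness_window) / len(openness_window)
    openness_window.clear()
    closure_rates.append(rate)
    if len(closure_rates) < SECOND_PERIOD_SAMPLES:
        return None
    # Variation feature amount: here, the standard deviation over the second period.
    feature = statistics.pstdev(closure_rates)
    return "inattentive" if feature > DETERMINATION_THRESHOLD else "awake"
```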
The state determination device (2) according to the present disclosure is the state determination device (1) further comprising a preprocessing unit that performs preprocessing on the eye closure rate calculated by the eye closure rate calculation unit,
wherein the variation feature amount calculation unit calculates the variation feature amount using the eye closure rate after preprocessing by the preprocessing unit.
According to the state determination device (2), the preprocessing unit performs the preprocessing on the eye closure rate, and the variation feature amount is calculated using the preprocessed eye closure rate. By using the preprocessed eye closure rate for calculating the variation feature amount, a feature amount that more accurately represents the transition to the inattentive state can be calculated, and the transition to the inattentive state can be determined accurately.
The state determination device (3) according to the present disclosure is the state determination device (2) wherein the preprocessing unit performs smoothing processing of the eye closure rate.
According to the state determination device (3), since the preprocessing unit smooths the eye closure rate, the variation feature amount is calculated using the smoothed eye closure rate, which makes it easier to calculate a feature amount that represents the tendency to transition to the inattentive state. The smoothing processing may be, for example, processing that obtains a moving average of the eye closure rate for each predetermined period.
The state determination device (4) according to the present disclosure is the state determination device (2) further comprising an event detection unit that detects a predetermined event,
wherein the preprocessing unit performs removal processing that removes, from the data used to calculate the variation feature amount, the eye closure rates of the second predetermined period calculated at the time or during the period in which the predetermined event was detected by the event detection unit.
According to the state determination device (4), the preprocessing unit removes from the data used to calculate the variation feature amount those eye closure rates of the second predetermined period that were calculated at the time or during the period in which the event detection unit detected the predetermined event, and the variation feature amount is then calculated using the eye closure rates remaining after the removal processing. By using the post-removal eye closure rates for the calculation, a feature amount that more accurately represents the transition to the inattentive state can be calculated, and the transition to the inattentive state can be determined accurately.
The state determination device (5) according to the present disclosure is the state determination device (4) wherein the preprocessing unit further performs smoothing processing of the eye closure rate after the removal processing.
According to the state determination device (5), since the preprocessing unit further smooths the eye closure rate after the removal processing, the smoothed eye closure rate can be used to calculate the variation feature amount; the variation feature amount can thus be calculated as a feature amount from which the tendency to transition to the inattentive state is easy to grasp, and the transition to the inattentive state can be determined accurately. The smoothing processing may be, for example, processing that obtains a moving average of the eye closure rate for each predetermined period.
The state determination device (6) according to the present disclosure is the state determination device (4) or (5) further comprising an interpolation processing unit that interpolates the eye closure rate at the time or during the period removed by the removal processing from among the eye closure rates of the second predetermined period.
According to the state determination device (6), the interpolation processing unit interpolates the eye closure rate at the time or during the period removed by the removal processing from among the eye closure rates of the second predetermined period. Since the eye closure rate can thus be interpolated, the variation feature amount can be calculated appropriately.
The state determination device (7) according to the present disclosure is any of the state determination devices (4) to (6) wherein the person is the driver of a vehicle,
and the event detection unit includes a vehicle dynamics detection unit that detects, as the predetermined event, at least one of: an event in which the road type on which the vehicle travels switches, an event in which the vehicle stops, an event in which the vehicle starts traveling, an event in which sudden steering occurs in the vehicle, an event in which sudden braking occurs in the vehicle, and an event in which an impact occurs to the vehicle.
According to the state determination device (7), since the event detection unit includes the vehicle dynamics detection unit, the eye closure rates of the second predetermined period calculated at the time or during the period in which at least one of these events was detected (a switch of road type, a stop, a start of travel, sudden steering, sudden braking, or an impact to the vehicle) can be excluded from the data used to calculate the variation feature amount. Therefore, even in a real-vehicle environment in which the dynamics of the vehicle change from moment to moment, the variation feature amount can be calculated as a feature amount that more accurately indicates the tendency to transition to the inattentive state, and the driver's transition to the inattentive state in the real-vehicle environment can be determined accurately.
The state determination device (8) according to the present disclosure is any of the state determination devices (4) to (6) wherein the person is the driver of a vehicle,
and the event detection unit includes a face orientation event detection unit that detects, as the predetermined event, an event in which the orientation of the person's face changes while the vehicle is being driven.
According to the state determination device (8), since the event detection unit includes the face orientation event detection unit, the eye closure rates of the second predetermined period calculated at the time or during the period in which an event of the person's face orientation changing during driving was detected can be excluded from the data used to calculate the variation feature amount. Therefore, even in a real-vehicle environment in which the driver's face orientation changes in various ways, the variation feature amount can be calculated as a feature amount that more accurately indicates the tendency to transition to the inattentive state, and the driver's transition to the inattentive state in the real-vehicle environment can be determined accurately.
The state determination device (9) according to the present disclosure is any of the state determination devices (1) to (8) wherein the variation feature amount calculation unit calculates, as the variation feature amount, an index indicating the degree of dispersion of the eye closure rate over the second predetermined period.
According to the state determination device (9), the variation feature amount calculation unit calculates, as the variation feature amount, an index indicating the degree of dispersion of the eye closure rate over the second predetermined period, such as the standard deviation or the variance, and the inattentive state of the person is determined based on this index. Since the degree of dispersion of the eye closure rate, in other words the change in its variability, is taken into account in determining the inattentive state, even in cases where the eye closure rate is measured relatively high in the awake state, the timing of the transition to the inattentive state can be determined accurately without being affected by the value of the eye closure rate itself.
The state determination device (10) according to the present disclosure is any of the state determination devices (1) to (8) wherein the variation feature amount calculation unit calculates, as the variation feature amount, the amount of change or the rate of change of the eye closure rate over the second predetermined period.
According to the state determination device (10), the variation feature amount calculation unit calculates, as the variation feature amount, the amount of change or the rate of change (slope) of the eye closure rate over the second predetermined period, and the inattentive state of the person is determined based on it. Since the amount or rate of change of the eye closure rate, in other words the magnitude of the change, is taken into account, even in cases where the eye closure rate is measured relatively high in the awake state, the timing of the transition to the inattentive state can be determined accurately without being affected by the value of the eye closure rate itself.
The state determination device (11) according to the present disclosure is any of the state determination devices (1) to (8) wherein the variation feature amount calculation unit calculates, as the variation feature amount, a rise time rate, that is, the proportion of time during the second predetermined period in which the eye closure rate is rising.
According to the state determination device (11), the variation feature amount calculation unit calculates, as the variation feature amount, the rise time rate of the eye closure rate during the second predetermined period, and the inattentive state of the person is determined based on this rise time rate. Since the rise time rate of the eye closure rate, in other words the tendency of the eye closure rate to increase over time, is taken into account, even in cases where the eye closure rate is measured relatively high in the awake state, the timing of the transition to the inattentive state can be determined accurately without being affected by the value of the eye closure rate itself.
The state determination device (12) according to the present disclosure is any of the state determination devices (1) to (11) wherein the inattentive state determination unit determines the inattentive state of the person stepwise using two or more determination thresholds.
According to the state determination device (12), using two or more determination thresholds makes it possible to determine the inattentive state of the person stepwise, so that an appropriate determination can be made according to the degree of inattentiveness.
The state determination device (13) according to the present disclosure is any of the state determination devices (1) to (11) wherein the inattentive state determination unit determines that the person is in an inattentive state when the variation feature amount continuously exceeds a determination threshold for a predetermined period.
According to the state determination device (13), the person is determined to be in an inattentive state when the variation feature amount continuously exceeds the determination threshold for the predetermined period, so the transition of the person into the inattentive state can be determined appropriately.
The state determination device (14) according to the present disclosure is any of the state determination devices (1) to (11) wherein the inattentive state determination unit determines the inattentive state of the person based on two or more kinds of the variation feature amount, or on one or more kinds of the variation feature amount together with the eye closure rate.
According to the state determination device (14), the inattentive state of the person can be determined based on two or more kinds of the variation feature amount, or on one or more kinds of the variation feature amount and the eye closure rate, and by taking two or more kinds of the variation feature amount into account, the timing at which the person transitions to the inattentive state can be determined more accurately.
The state determination device (15) according to the present disclosure is any of the state determination devices (1) to (11) wherein the inattentive state determination unit determines the inattentive state of the person using a trained learner that has been trained to output, when the calculated variation feature amount, or the variation feature amount and the eye closure rate, is input, a value indicating whether or not the person is in an inattentive state.
According to the state determination device (15), using the learner makes it possible to determine easily and appropriately whether or not the person is in an inattentive state. The learner may be configured to include, for example, a neural network or a support vector machine.
The state determination device (16) according to the present disclosure is any of the state determination devices (1) to (15) comprising an output unit that outputs the determination result of the inattentive state determination unit.
According to the state determination device (16), the output unit can appropriately output the determination result of the inattentive state determination unit.
The state determination device (17) according to the present disclosure is the state determination device (16) comprising a notification unit that performs notification according to the determination result output by the output unit.
According to the state determination device (17), the notification unit can appropriately perform notification according to the determination result of the inattentive state determination unit.
The state determination device (18) according to the present disclosure is the state determination device (16) or (17) comprising a determination result storage unit that stores the determination result output by the output unit, and a communication unit that transmits data including the determination result stored in the determination result storage unit to a predetermined destination.
According to the state determination device (18), the determination result is stored in the determination result storage unit, and the communication unit can appropriately transmit data including the determination result to the predetermined destination. The determination result can therefore be used appropriately at the predetermined destination.
The in-vehicle device (1) according to the present disclosure comprises any of the state determination devices (1) to (18) and an imaging unit that captures the image.
According to the in-vehicle device (1), an in-vehicle device that achieves the effect of any of the state determination devices (1) to (18) can be realized.
The driving evaluation system (1) according to the present disclosure is configured to include one or more of the in-vehicle devices (1) and a driving evaluation device comprising: a driving evaluation unit that performs driving evaluation, including evaluation of the person's inattentive state, based on data including the determination result of the person's inattentive state determined by the state determination device of the in-vehicle device; and an evaluation result output unit that outputs a driving evaluation result including the evaluation of the person's inattentive state produced by the driving evaluation unit.
According to the driving evaluation system (1), driving evaluation including evaluation of the person's inattentive state is performed based on data including the determination result of the person's inattentive state determined by the state determination device of the in-vehicle device; a driving evaluation result including an appropriately evaluated assessment of the person's inattentive state can be output, and safe driving education for the person can be carried out appropriately.
The state determination method according to the present disclosure is a state determination method for determining the state of a person, including:
an eye opening/closing degree detection step of detecting the eye opening/closing degree of the person from an image of the person's face;
an eye closure rate calculation step of calculating the eye closure rate of the person using the eye opening/closing degree of a first predetermined period detected in the eye opening/closing degree detection step;
a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rate of a second predetermined period calculated in the eye closure rate calculation step; and
an inattentive state determination step of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step.
According to the above state determination method, the eye opening/closing degree of the person is detected from the image, the eye closure rate of the person is calculated using the eye opening/closing degree detected during the first predetermined period, the variation feature amount of the eye closure rate is calculated using the eye closure rate calculated during the second predetermined period, and the inattentive state of the person is determined based on the calculated variation feature amount. Because the inattentive state is determined based on the variation feature amount, even when the eye closure rate is measured relatively high during wakefulness due to factors such as individual differences in the eye closure rate or differences in the imaging position of the image, the inattentive state can be determined accurately without being affected by these factors.
The program according to the present disclosure is a program for causing at least one or more computers to execute processing for determining the state of a person, the program causing the one or more computers to execute:
an eye opening/closing degree detection step of detecting the eye opening/closing degree of the person from an image of the person's face;
an eye closure rate calculation step of calculating the eye closure rate of the person using the eye opening/closing degree of a first predetermined period detected in the eye opening/closing degree detection step;
a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rate of a second predetermined period calculated in the eye closure rate calculation step; and
an inattentive state determination step of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step.
According to the above program, the at least one or more computers can be caused to detect the eye opening/closing degree of the person from the image, calculate the eye closure rate of the person using the eye opening/closing degree detected during the first predetermined period, calculate the variation feature amount of the eye closure rate using the eye closure rate calculated during the second predetermined period, and determine the inattentive state of the person based on the calculated variation feature amount. Therefore, even when the eye closure rate is measured relatively high during wakefulness due to factors such as individual differences in the eye closure rate or differences in the imaging position of the image, a device or system that can accurately determine the inattentive state without being affected by these factors can be realized. The program may be a program stored in a storage medium, a program transferable via a communication network, or a program executed via a communication network.
FIG. 1 is a schematic diagram showing an example of an application scene of the state determination device according to embodiment (1).
FIG. 2 is a block diagram showing a hardware configuration example of the in-vehicle device according to embodiment (1).
FIG. 3 is a block diagram showing a functional configuration example of the state determination device according to embodiment (1).
FIG. 4(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is low in normal times, and FIG. 4(b) is a graph showing the time-series change in the standard deviation of the eye closure rate calculated using the eye closure rate shown in (a).
FIG. 5(a) is an example of a graph showing the time-series change in the eye closure rate of a driver whose eye closure rate is high in normal times, and FIG. 5(b) is a graph showing the time-series change in the standard deviation of the eye closure rate calculated using the eye closure rate shown in (a).
FIG. 6 is a block diagram showing a functional configuration example of the driving evaluation device according to embodiment (1).
FIG. 7 is a flowchart showing an example of processing operations performed by the control unit of the state determination device according to embodiment (1).
FIG. 8 is a flowchart showing an example of processing operations performed by the control unit of the state determination device according to embodiment (1).
FIG. 9 is a block diagram showing a functional configuration example of the state determination device according to embodiment (2).
FIG. 10(a) is a graph in which a moving average line of the eye closure rate is superimposed on the time-series change in the eye closure rate shown in FIG. 4(a), and FIG. 10(b) is a graph in which a moving average line of the eye closure rate is superimposed on the time-series change in the eye closure rate shown in FIG. 5(a).
FIG. 11 is a flowchart showing an example of processing operations performed by the control unit of the state determination device according to embodiment (2).
FIG. 12 is a block diagram showing a functional configuration example of the state determination device according to embodiment (3).
FIG. 13 is a flowchart showing an example of processing operations performed by the control unit of the state determination device according to embodiment (3).
FIG. 14 is a flowchart showing an example of processing operations performed by the control unit of the state determination device according to embodiment (3).
FIG. 15 is a block diagram showing a functional configuration example of the state determination device according to embodiment (4).
FIG. 16 is a flowchart showing an example of processing operations performed by the control unit of the state determination device according to embodiment (4).
Hereinafter, embodiments of the state determination device, in-vehicle device, driving evaluation system, state determination method, and program according to the present invention will be described with reference to the drawings.
The state determination device, state determination method, and program according to the present invention are widely applicable to various uses for determining the state of a person. The in-vehicle device provided with the state determination device according to the present invention, and the driving evaluation system configured to include one or more of the in-vehicle devices, are widely applicable to devices or systems that evaluate the driving state of drivers of vehicles managed by a business operator or the like and that support improvement of driver safety awareness from the viewpoint of preventive safety, that is, preventing accidents before they occur.
[Application example]
FIG. 1 is a schematic diagram showing an example of an application scene of the state determination device according to embodiment (1). In the application example shown in FIG. 1, the state determination device 20 is mounted on the in-vehicle device 10, and a driving evaluation system 1 that evaluates the driving of each driver 3 is constructed to include the in-vehicle devices 10 mounted on one or more vehicles 2 and at least one driving evaluation device 4 that processes data acquired from each in-vehicle device 10.
The in-vehicle device 10 includes the state determination device 20 and a camera 11 that captures images including the face of the driver 3. The state determination device 20 is a computer device that executes various processes for determining the state of the driver 3 while acquiring images captured by the camera 11.
The vehicle 2 on which the in-vehicle device 10 is mounted is not particularly limited. For example, vehicles managed by operators of various businesses may be targeted, such as trucks managed by a transportation company, buses managed by a bus company, taxis managed by a taxi company, car-sharing vehicles managed by a car-sharing company, rental cars managed by a car rental company, company-owned vehicles, or company vehicles leased from a car leasing company. The vehicle 2 on which the in-vehicle device 10 is mounted may also be a general vehicle. For example, the driving evaluation device 4 may be configured to be managed or operated by a safety evaluation or driver training organization such as an insurance company or a driving school, and the system may be applied so that these organizations carry out driving evaluation of the driver 3 of each vehicle 2.
The driving evaluation device 4, for example, acquires the driving behavior data of the driver 3 and the traveling behavior data of the vehicle 2 transmitted from the in-vehicle device 10, executes driving evaluation processing for each driver 3 based on the acquired data group and predetermined evaluation conditions, and outputs the driving evaluation result to an external device, for example, the operator terminal 6. The driving evaluation device 4 is composed of one or more server computers including, for example, a communication unit 41, a control unit 42, and a storage unit 43.
The driving behavior data of the driver 3 includes, for example, at least one of the following, detected by processing images captured by the camera 11: the driver 3's face orientation, gaze direction, eye opening/closing degree, eye closure rate, variation feature amount of the eye closure rate, predetermined face orientation events, and the determination result of the inattentive state. The predetermined face orientation events include, for example, at least one of: checking when turning right or left at an intersection, checking the direction of travel at an intersection, the face not being detected, and looking aside.
The traveling behavior data of the vehicle 2 includes, for example, at least one of the following, detected by the in-vehicle device 10: the acceleration, angular velocity, position, and speed of the vehicle 2, and predetermined vehicle dynamics events. The predetermined vehicle dynamics events include, for example, at least one of: passing through an intersection, a switch of road type, stopping, starting to travel, occurrence of sudden steering, occurrence of sudden braking, and occurrence of an impact to the vehicle 2.
The in-vehicle device 10 and the driving evaluation device 4 are configured to communicate with each other via a communication network 5. The communication network 5 may include a wireless communication network such as a mobile phone network including base stations or a wireless LAN (Local Area Network), and may also include a wired communication network such as a public telephone network, or a telecommunication line such as the Internet or a dedicated network.
An operator terminal 6 that manages the vehicles 2 is configured to communicate with the driving evaluation device 4 via the communication network 5. The operator terminal 6 may be a personal computer having a communication function, or a mobile information terminal such as a mobile phone, a smartphone, or a tablet device. The operator terminal 6 may also be configured to communicate with the in-vehicle device 10 via the communication network 5.
The state determination device 20 mounted on the in-vehicle device 10 acquires images captured at a predetermined frame rate from the camera 11, which is arranged so as to be able to photograph the face of the driver 3.
The state determination device 20 processes the images acquired from the camera 11 in time series, detects the eye opening/closing degree of the driver 3 from the images (for example, every frame), and calculates the eye closure rate of the driver 3 using the eye opening/closing degree of a first predetermined period (for example, a predetermined time of about one minute).
The state determination device 20 calculates the variation feature amount of the eye closure rate (hereinafter also referred to simply as the variation feature amount) using the calculated eye closure rate of a second predetermined period (longer than the first predetermined period, for example, a predetermined time of about 10 to 15 minutes), and determines the inattentive state of the driver 3 (in other words, the state of inattentive driving) based on the calculated variation feature amount.
The eye opening/closing degree is an index indicating how far the eyes of the driver 3 are open; for example, the ratio of the vertical width to the horizontal width of the driver 3's eyes extracted from the image (for example, the number of pixels of the vertical width divided by the number of pixels of the horizontal width) may be calculated as the eye opening/closing degree.
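As a concrete illustration of the vertical-to-horizontal ratio just described, the sketch below computes an eye opening/closing degree from four eye landmark points. The landmark layout is an assumption; any face landmark detector that yields these points would serve.

```python
Point = tuple[float, float]  # (x, y) in image pixels

def eye_opening_degree(top: Point, bottom: Point, left: Point, right: Point) -> float:
    """Vertical width / horizontal width of one eye, from landmark coordinates."""
    vertical = abs(top[1] - bottom[1])
    horizontal = abs(left[0] - right[0])
    return vertical / horizontal if horizontal else 0.0
```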
The eye closure rate is the proportion of time during which the eyes are closed; for example, among the eye opening/closing degrees detected in the first predetermined period, the proportion of eye opening/closing degrees at or below a predetermined threshold may be calculated as the eye closure rate. The predetermined threshold is a value for determining whether or not the eyes are closed.
The variation feature amount of the eye closure rate is data indicating the variation characteristics of the eye closure rate, calculated using the eye closure rate of the second predetermined period; for example, it may be an index indicating the degree of dispersion of the eye closure rate over the second predetermined period (such as the standard deviation), the amount of change or the rate of change (slope) of the eye closure rate over the second predetermined period, or the rise time rate of the eye closure rate over the second predetermined period.
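A minimal sketch of the eye closure rate as defined above, over the per-frame opening/closing degrees of one first predetermined period (the closed-eye threshold is an assumed value; the disclosure only requires some threshold for judging the eyes closed):

```python
def eye_closure_rate(opening_degrees: list[float], closed_threshold: float = 0.2) -> float:
    """Fraction of frames in the first predetermined period judged eyes-closed."""
    closed = sum(d <= closed_threshold for d in opening_degrees)
    return closed / len(opening_degrees)
```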
The inattentive state means, in other words, a state in which concentration or attention has declined due to psychological or physiological factors (for the driver 3 of the vehicle 2, particularly a state of inattention to the road ahead); it may include not only states in which concentration or attention has declined due to drowsiness or fatigue, but also absent-minded states such as being lost in thought.
According to the in-vehicle device 10 equipped with the state determination device 20, the inattentive state of the driver 3 (in other words, the state of inattentive driving) is determined based not on the eye closure rate value itself but on the variation feature amount of the eye closure rate described above. Therefore, even when the eye closure rate is measured relatively high in the awake state due to factors such as individual differences in the eye closure rate of each driver 3 or differences in the installation position of the camera 11 of each vehicle 2 (in other words, the imaging position of the image), the inattentive state of each driver 3 can be determined accurately without being affected by these factors.
According to the driving evaluation system 1, configured to include one or more in-vehicle devices 10 and the driving evaluation device 4, using the inattentive state determination results produced by the state determination device 20 of each in-vehicle device 10 makes it possible to evaluate the inattentive state of each driver 3 fairly, unaffected by individual differences in eye closure rate, and thus to perform a more appropriate driving evaluation.
[Hardware configuration example]
FIG. 2 is a block diagram showing a hardware configuration example of the in-vehicle device 10 according to embodiment (1).
The in-vehicle device 10 includes the state determination device 20 and the camera 11, and further includes an acceleration sensor 12, an angular velocity sensor 13, a GPS (Global Positioning System) receiving unit 14, a communication unit 15, and a notification unit 16.
The state determination device 20 includes a control unit 21, a storage unit 22, and an input/output interface (I/F) 23.
The control unit 21 is composed of a microcomputer including a CPU (Central Processing Unit), RAM (Random Access Memory), and ROM (Read Only Memory). The control unit 21 performs processing that stores data acquired from the camera 11, the acceleration sensor 12, the angular velocity sensor 13, the GPS receiving unit 14, and the like in the storage unit 22. The control unit 21 also loads the program 221 stored in the storage unit 22 into the RAM, reads out the various detection data stored in the storage unit 22, and has the CPU interpret and execute the program 221 loaded into the RAM, whereby the various processes for determining the state of the driver 3 described later are executed.
The storage unit 22 is composed of one or more storage devices such as semiconductor memory. In addition to the program 221, the storage unit 22 may store, in time series, image data acquired from the camera 11 and data detected by the acceleration sensor 12, the angular velocity sensor 13, the GPS receiving unit 14, and the like. The program 221 may instead be stored in the ROM of the control unit 21.
The input/output I/F 23 includes an interface circuit, connectors, and the like for exchanging data and signals with devices such as the camera 11.
The camera 11 operates as an imaging unit that captures images including the face of the driver 3, and includes, for example, a lens unit, an image sensor unit, a light irradiation unit, and a camera control unit that controls these units (none of which are shown).
The image sensor unit includes, for example, an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor having sensitivity in the visible or near-infrared wavelength region, as well as filters and microlenses. The light irradiation unit includes a light emitting element such as an LED (Light Emitting Diode); a light emitting element that irradiates infrared light may be used so that the driver's state can be imaged day and night. The camera 11 may be a monocular camera or a stereo camera.
The camera control unit includes, for example, a processor. The camera control unit controls the operation of the imaging element unit and the light irradiation unit, for example causing the light irradiation unit to emit light (for example, near-infrared light) and the imaging element unit to capture the reflected light. The camera 11 captures images at a predetermined frame rate (for example, 15 frames per second or more), and the data of the captured images is output to the state determination device 20.
The acceleration sensor 12 is a sensor that measures the acceleration of the vehicle 2 and is composed of, for example, a three-axis acceleration sensor that measures acceleration in the three directions of the XYZ axes; a two-axis or one-axis acceleration sensor may also be used. The acceleration data measured by the acceleration sensor 12 is stored in the storage unit 22 in association with, for example, the detection time (that is, in time series).
The angular velocity sensor 13 is a sensor that detects the rotational angular velocity of the vehicle 2, and is composed of a sensor capable of detecting at least the angular velocity about the vertical axis (yaw direction), that is, angular velocity data corresponding to the left-right rotation (turning) of the vehicle 2, for example a gyro sensor (also called a yaw rate sensor). Besides a one-axis gyro sensor about the vertical axis, the angular velocity sensor 13 may be a two-axis gyro sensor that also detects the angular velocity about the lateral horizontal axis (pitch direction), or a three-axis gyro sensor that further detects the angular velocity about the longitudinal horizontal axis (roll direction). These gyro sensors may be of the vibration type, or optical or mechanical gyro sensors may be used.
As for the detection direction of the angular velocity about the vertical axis of the angular velocity sensor 13, for example, clockwise may be set as the negative direction and counterclockwise as the positive direction, in which case negative angular velocity data is detected when the vehicle 2 turns right and positive angular velocity data is detected when it turns left. The angular velocity sensor 13 detects the angular velocity at a predetermined cycle (for example, every several tens of milliseconds), and the detected angular velocity data is stored in the storage unit 22 in association with, for example, the detection time. An inertial sensor in which the acceleration sensor 12 and the angular velocity sensor 13 are mounted in a single package may also be used.
The GPS receiving unit 14 receives GPS signals (including time information) from satellites via the antenna 14a at a predetermined cycle (for example, every second) and detects position data (including latitude and longitude) of the current location of the vehicle 2. The position data detected by the GPS receiving unit 14 is stored in the storage unit 22 in association with, for example, the detection time. Instead of the GPS receiving unit 14, a receiver compatible with another satellite positioning system, or another position detection device, may be used.
The communication unit 15 includes a communication module that emits radio waves from the antenna 15a to transmit data to the driving evaluation device 4 via the communication network 5, and that receives radio waves from outside via the antenna 15a. The communication unit 15 may also include a communication module for vehicle-to-vehicle or road-to-vehicle communication.
The notification unit 16 includes, for example, a speaker that outputs a predetermined notification sound or voice based on a command from the state determination device 20.
The on-vehicle device 10 can have a compact configuration in which the state determination device 20, the camera 11, and the like are housed in a single housing. In that case, the installation location of the on-vehicle device 10 in the vehicle is not particularly limited as long as the camera 11 can capture a field of view including at least the driver's face. The on-vehicle device 10 may be installed, for example, near the center of the dashboard of the vehicle 2, on the steering column, near the instrument panel, near the rearview mirror, or on the A-pillar. The camera 11 may be integrated with the state determination device 20 or configured as a separate body.
[Functional configuration example]
FIG. 3 is a block diagram showing a functional configuration example of the state determination device 20 according to the embodiment (1).
As shown in FIG. 2, the control unit 21 of the state determination device 20 loads the program 221 stored in the storage unit 22 into the RAM. The control unit 21 then has the CPU interpret and execute the loaded program, thereby operating as the image acquisition unit 30, the eye open/close degree detection unit 31, the eye closure rate calculation unit 32, the variation feature amount calculation unit 33, the inattentive state determination unit 34, and the output unit 35 shown in FIG. 3.
The image acquisition unit 30 acquires images in which the face of the driver 3 is captured from the camera 11, which is arranged so that it can image the face of the driver 3 of the vehicle 2. The image acquisition unit 30 acquires, for example, n frames per second (n being, for example, 15 or more).
The eye open/close degree detection unit 31 analyzes the images acquired by the image acquisition unit 30 and detects the eye open/close degree of the driver 3 from each image. The eye open/close degree is an index indicating how far the eye is open; for example, the ratio of the vertical width of the driver 3's eye extracted from the image (the distance between the upper and lower eyelids) to its horizontal width (the distance between the inner and outer corners of the eye), such as the number of vertical pixels divided by the number of horizontal pixels, may be used. The eye open/close degree detection unit 31 may obtain the degree for each of the left and right eyes, may obtain the average of the two, or may use only one of them. Alternatively, only the vertical width of the eye (for example, in pixels) may be used as the eye open/close degree.
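As a rough illustration of this ratio, here is a minimal Python sketch. It assumes the four eye landmarks have already been extracted from the image by some face/landmark detector, which is outside the scope of the sketch; the coordinates below are invented for the example.

```python
import numpy as np

def eye_open_close_degree(upper_lid, lower_lid, inner_corner, outer_corner):
    """Eye open/close degree as vertical width / horizontal width.

    Each argument is an (x, y) pixel coordinate of an eye landmark; how
    these landmarks are detected is not covered here.
    """
    vertical = np.linalg.norm(np.subtract(upper_lid, lower_lid))
    horizontal = np.linalg.norm(np.subtract(inner_corner, outer_corner))
    return vertical / horizontal  # larger value = eye open wider

# One of the options mentioned above: average the left and right eyes.
left = eye_open_close_degree((120, 80), (120, 92), (100, 86), (140, 86))
right = eye_open_close_degree((200, 80), (200, 91), (180, 86), (220, 86))
openness = (left + right) / 2.0
```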
The eye closure rate calculation unit 32 calculates the eye closure rate of the driver 3 using the eye open/close degrees detected by the eye open/close degree detection unit 31 over a first predetermined period (for example, a predetermined time of about one minute). For example, the eye closure rate calculation unit 32 calculates the proportion of the eye open/close degrees detected in the first predetermined period that fall at or below a predetermined threshold indicating a closed-eye state. The eye closure rate calculation unit 32 performs this calculation every first predetermined period.
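A minimal sketch of this calculation, assuming one openness sample per frame; the closed-eye threshold value is an assumption for illustration, as the text leaves it unspecified:

```python
def eye_closure_rate(openness_samples, closed_threshold=0.2):
    """Percentage of samples in the first predetermined period whose eye
    open/close degree is at or below the closed-eye threshold.

    closed_threshold=0.2 is an assumed illustrative value.
    """
    if not openness_samples:
        return 0.0
    closed = sum(1 for v in openness_samples if v <= closed_threshold)
    return 100.0 * closed / len(openness_samples)
```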
The variation feature amount calculation unit 33 calculates a variation feature amount of the eye closure rate using the eye closure rates calculated by the eye closure rate calculation unit 32 over a second predetermined period (for example, a predetermined time of about 10 to 15 minutes). For example, when the first predetermined period is one minute and the second predetermined period is 15 minutes, the variation feature amount calculation unit 33 calculates the variation feature amount of the eye closure rate using 15 minutes' worth of the eye closure rates calculated every minute (15 data points). After image acquisition has started and the second predetermined period has elapsed, the variation feature amount calculation unit 33 uses the eye closure rate data for the most recent second predetermined period each time a further first predetermined period elapses. That is, after the second predetermined period has elapsed, the variation feature amount calculation unit 33 calculates the variation feature amount every first predetermined period.
The variation feature amount calculation unit 33 may calculate, as the variation feature amount, an index indicating the degree of variation of the eye closure rate over the second predetermined period, for example the standard deviation or the variance.
The variation feature amount calculation unit 33 may also calculate, as the variation feature amount, the amount of change or the rate of change (slope) of the eye closure rate over the second predetermined period.
The variation feature amount calculation unit 33 may further calculate, as the variation feature amount, the rise-time ratio, that is, the proportion of the second predetermined period during which the eye closure rate is rising. The variation feature amount calculation unit 33 may calculate two or more of the above types of variation feature amount.
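A minimal sketch of the three kinds of variation feature amount named above, computed over one second-predetermined-period window (for example, 15 one-minute closure rates); the linear fit for the slope and the definition of "rising" as a positive one-step difference are assumptions for illustration:

```python
import numpy as np

def variation_features(closure_rates):
    """Variation feature amounts over one window of eye closure rates."""
    x = np.asarray(closure_rates, dtype=float)
    diffs = np.diff(x)
    return {
        "std": float(x.std()),          # degree of variation (or x.var())
        "change": float(x[-1] - x[0]),  # amount of change over the window
        "slope": float(np.polyfit(np.arange(len(x)), x, 1)[0]),  # rate of change
        "rise_time_ratio": float((diffs > 0).mean()),  # share of rising steps
    }
```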
The inattentive state determination unit 34 determines the inattentive state of the driver based on the variation feature amount calculated by the variation feature amount calculation unit 33. For example, when the standard deviation of the eye closure rate over the second predetermined period is calculated as the variation feature amount, the unit may determine whether the standard deviation has reached or exceeded a predetermined threshold indicating a change to the inattentive state, or whether it has continued to exceed the threshold for a predetermined period.
Likewise, when the amount or rate of change of the eye closure rate over the second predetermined period is calculated as the variation feature amount, the unit may determine whether the amount or rate of change has reached or exceeded a predetermined threshold (amount or rate of change) indicating a change to the inattentive state, or whether it has continued to exceed the threshold for a predetermined period.
When the rise-time ratio of the eye closure rate over the second predetermined period is calculated as the variation feature amount, the unit may determine whether the rise-time ratio has reached or exceeded a predetermined threshold (proportion) indicating a change to the inattentive state, or whether it has continued to exceed the threshold for a predetermined period. The inattentive state determination unit 34 may also combine determination processes based on two or more of the above variation feature amounts, or combine a determination process based on a variation feature amount with one based on the eye closure rate itself.
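A minimal sketch of the threshold judgment, here applied to the standard deviation; the threshold value and the number of consecutive windows that stand in for "a predetermined period" are assumed, not taken from the text:

```python
from collections import deque

class InattentiveStateJudge:
    def __init__(self, threshold=5.0, sustain_count=3):
        self.threshold = threshold          # assumed threshold on the std
        self.recent = deque(maxlen=sustain_count)

    def update(self, std_value):
        """Call once per newly calculated variation feature amount."""
        self.recent.append(std_value >= self.threshold)
        # This sketch uses the sustained variant: the threshold must be
        # met in every one of the last `sustain_count` windows.
        return len(self.recent) == self.recent.maxlen and all(self.recent)
```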
Alternatively, the inattentive state determination unit 34 may determine the inattentive state of the driver 3 using a trained learner that has been trained to output, when given the variation feature amount calculated by the variation feature amount calculation unit 33, a value indicating whether the driver 3 is in the inattentive state. The learner may also be trained to output such a value when given both the variation feature amount calculated by the variation feature amount calculation unit 33 and the eye closure rate calculated by the eye closure rate calculation unit 32. The learner may be composed of, for example, a neural network having an input layer, one or more intermediate layers, and an output layer, or a support vector machine. Using such a learner makes it possible to determine easily and appropriately whether the driver 3 is in the inattentive state.
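A sketch only: the text does not name a library, network size, or training data, so everything below (scikit-learn, one hidden layer of 8 units, toy samples) is an assumption. In practice the labels would come from annotated drives.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Rows: [std, change, slope, rise_time_ratio, eye_closure_rate]
X_train = np.array([[2.0, 1.0, 0.1, 0.40, 10.0],    # toy "alert" sample
                    [8.0, 12.0, 0.9, 0.80, 35.0]])  # toy "inattentive" sample
y_train = np.array([0, 1])                          # 1 = inattentive

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
is_inattentive = clf.predict([[7.5, 10.0, 0.8, 0.75, 30.0]])[0] == 1
```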
The output unit 35 outputs the determination result of the inattentive state determination unit 34. For example, the output unit 35 may perform notification processing according to the determination result via the notification unit 16, or may store the determination result in the storage unit 22 and transmit data including the stored determination result to the driving evaluation device 4 via the communication unit 15 at a predetermined timing.
FIG. 4(a) is an example of a graph showing the time-series change of the eye closure rate of a driver whose eye closure rate is low in normal times, and FIG. 4(b) is an example of a graph showing the time-series change of the standard deviation of the eye closure rate calculated using the data shown in FIG. 4(a).
FIG. 5(a) is an example of a graph showing the time-series change of the eye closure rate of a driver whose eye closure rate is high in normal times, and FIG. 5(b) is a graph showing the time-series change of the standard deviation, one example of a variation feature amount of the eye closure rate, calculated using the data shown in FIG. 5(a). The threshold indicated by the dash-dot line in each figure is an example of a criterion for determining whether the driver is in the inattentive state.
Comparing FIG. 4(a) with FIG. 5(a) shows that there are individual differences in the eye closure rate in normal times: for some people the eye closure rate is detected on the low side, as in FIG. 4(a), while for others it is detected on the high side, as in FIG. 5(a). These individual differences are influenced by, for example, differences in the size of the opening between the upper and lower eyelids, and differences in the positional relationship between the camera 11 and the face (the installation position of the camera 11, such as its distance and orientation).
As shown in FIG. 4(a), when the eye closure rate is low in normal times, a threshold judgment on the eye closure rate can determine relatively accurately whether the driver is in the inattentive state.
On the other hand, as shown in FIG. 5(a), when the eye closure rate is high in normal times, making the judgment by a threshold on the eye closure rate (using the same threshold as in FIG. 4(a)) produces a state in which the driver is erroneously judged to be inattentive at all times.
In the present embodiment, in determining the inattentive state of the driver 3, the value of the eye closure rate itself is not used; instead, the embodiment exploits the characteristic that the eye closure rate fluctuates when the driver shifts from the awake state to the inattentive state. That is, the state determination device 20 calculates a variation feature amount of the eye closure rate (for example, its standard deviation) from the eye closure rate data of a predetermined period, and determines the transition to the inattentive state based on the calculated variation feature amount.
By using this variation feature amount, whether the eye closure rate in normal times is low, as in FIG. 4(b), or high, as in FIG. 5(b), a threshold judgment on the standard deviation of the eye closure rate (the variation feature amount) can accurately determine the timing of the transition to the inattentive state, making it possible to estimate that transition with high precision.
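A rough numeric illustration of why the standard deviation is insensitive to the baseline (the numbers are invented for the example, not read off the figures): adding a constant offset to a driver's closure rates shifts the mean but leaves the standard deviation unchanged.

```python
import numpy as np

# Invented per-minute closure rates (%) over 15 minutes.
low_baseline = np.array([2, 3, 2, 3, 2, 3, 2, 8, 3, 12, 4, 15, 6, 18, 9],
                        dtype=float)
high_baseline = low_baseline + 25.0   # same fluctuation, higher baseline

print(low_baseline.mean(), high_baseline.mean())  # means differ by the offset
print(low_baseline.std(), high_baseline.std())    # standard deviations match
```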
FIG. 6 is a block diagram showing a functional configuration example of the driving evaluation device 4 according to the embodiment (1).
The driving evaluation device 4 includes a communication unit 41, a control unit 42, and a storage unit 43, which are connected via a communication bus 44.
The communication unit 41 includes a communication device for transmitting and receiving various data and signals to and from the on-vehicle devices 10, the operator terminal 6, and the like via the communication network 5.
The control unit 42 is composed of a computer device including one or more processors that execute various arithmetic processes and a main memory in which a predetermined driving evaluation program and the like are stored; as its functional configuration it includes a driving evaluation unit 421 and an evaluation result output unit 422.
The driving evaluation unit 421 performs various driving evaluation processes, including evaluating the inattentive state of the driver 3, based on data including the inattentive state determination results produced by the state determination device 20 of the on-vehicle device 10. The evaluation result output unit 422 outputs driving evaluation results including the evaluation of the inattentive state of the driver 3 by the driving evaluation unit 421; for example, in response to a request from the operator terminal 6, it transmits the driving evaluation results to the operator terminal 6 via the communication unit 41.
The storage unit 43 is composed of one or more large-capacity storage devices, such as hard disk drives or solid state drives, and includes a detection data accumulation unit 431, an evaluation condition storage unit 432, an evaluation result storage unit 433, and the like.
The detection data accumulation unit 431 accumulates the detection data acquired from the on-vehicle device 10 of each vehicle 2. For example, in association with the identification information of the on-vehicle device 10 or of the driver 3, it accumulates in time series each inattentive state detection time and the driving behavior data of the driver 3 detected within a predetermined time before and after that detection time. The travel behavior data of the vehicle 2 detected within a predetermined time before and after the inattentive state detection time may also be accumulated in time series. The driving behavior data of the driver 3 includes, for example, at least one of the face orientation, gaze direction, eye open/close degree, eye closure rate, and eye closure rate variation feature amount of the driver 3. The travel behavior data of the vehicle 2 includes at least one of the acceleration, angular velocity, position, and speed of the vehicle 2.
The detection data accumulation unit 431 may also accumulate in time series, in association with the identification information of the on-vehicle device 10 or of the driver 3, each intersection passage time and the driving behavior data of the driver 3 and travel behavior data of the vehicle 2 detected within a predetermined time before and after that passage time.
The evaluation condition storage unit 432 stores at least one evaluation condition for the safety confirmation actions that the driver 3 should perform at intersections and the like. The evaluation result storage unit 433 stores the driving evaluation results, including the evaluation of the inattentive state of the driver 3 by the driving evaluation unit 421.
[Processing operation example]
FIG. 7 is a flowchart showing an example of the processing operations performed by the control unit 21 of the state determination device 20 according to the embodiment (1). FIG. 7 shows an example in which the control unit 21 operates as the image acquisition unit 30, the eye open/close degree detection unit 31, and the eye closure rate calculation unit 32.
First, in step S1, the control unit 21 operates as the image acquisition unit 30, acquires an image captured by the camera 11, and proceeds to step S2. The control unit 21 acquires, for example, n frames per second (n being, for example, 15 or more). The acquired images are stored, for example, in the storage unit 22.
In step S2, the control unit 21 operates as the eye open/close degree detection unit 31, detects the eye open/close degree of the driver 3 from the image acquired in step S1, and proceeds to step S3. For example, the control unit 21 detects the face region of the driver 3 in the acquired image, detects facial features such as the eyes, nose, and mouth in the detected face region, detects the vertical and horizontal widths of the eye in the detected eye region, and detects the ratio of the detected vertical width to the horizontal width (for example, the number of vertical pixels divided by the number of horizontal pixels) as the eye open/close degree. The eye open/close degree detected from each image may be stored in the storage unit 22 in association with data such as the image acquisition time. The control unit 21 may also detect the face orientation, gaze direction, and the like, together with the eye open/close degree, using known methods. The eye open/close degree detection unit 31 may further correct the detected ratio of vertical to horizontal eye width to the value that would be seen with the face of the driver 3 viewed from the front, based on the face orientation detected from the image; for example, the method described in Japanese Patent No. 4957711 can be applied to this correction.
In step S3, the control unit 21 determines whether the eye open/close degrees for the first predetermined period have been detected; if not, it returns to step S1, and if so, it proceeds to step S4. The first predetermined period is the period needed to acquire the number of eye open/close degree data points used to calculate the eye closure rate; for example, a predetermined time of about one minute may be set.
In step S4, the control unit 21 operates as the eye closure rate calculation unit 32, calculates the eye closure rate of the driver 3 using the eye open/close degrees detected in the first predetermined period, and proceeds to step S5. For example, the control unit 21 calculates the proportion of the eye open/close degrees detected in the first predetermined period that fall at or below the predetermined threshold indicating the closed-eye state ([number of eye open/close degree data points at or below the threshold / number of eye open/close degree data points in the first predetermined period] × 100 (%)). The eye closure rate calculated every first predetermined period may be stored in the storage unit 22 in association with data such as the end time of the period. The eye closure rate calculation unit 32 may set the predetermined threshold indicating the closed-eye state based on the gaze direction of the driver 3 detected from the image; for example, the method described in Japanese Patent No. 4915413 can be applied to setting this threshold.
In step S5, the control unit 21 determines whether the eye closure rates for the second predetermined period have been calculated; if not, it returns to step S1, and if so, it proceeds to step S6. The second predetermined period is the period needed to acquire the number of eye closure rate data points used to calculate the variation feature amount; for example, a predetermined time of about 15 minutes may be set. After image acquisition starts in step S1 and the first second predetermined period has elapsed, the eye closure rate is calculated each time a further first predetermined period elapses, and it is determined whether the eye closure rates for the most recent second predetermined period have been calculated.
In step S6, the control unit 21 reads the eye closure rate data calculated over the second predetermined period from the storage unit 22, then returns to step S1 and repeats the eye open/close degree detection and the eye closure rate calculation.
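A minimal sketch of this S1-S6 loop under assumed settings (15 frames per second and a one-minute first period, i.e. 900 frames, with a 15-value second period); it reuses eye_closure_rate from the earlier sketch:

```python
from collections import deque

FRAMES_PER_PERIOD1 = 15 * 60        # assumed: 15 fps x 1 minute (S3)
PERIOD2_LEN = 15                    # assumed: 15 closure rates (S5)

openness_buf = []
closure_rates = deque(maxlen=PERIOD2_LEN)   # most recent second period

def on_new_frame(openness):
    """Call once per frame with the detected eye open/close degree (S2)."""
    openness_buf.append(openness)
    if len(openness_buf) < FRAMES_PER_PERIOD1:            # S3: keep collecting
        return None
    closure_rates.append(eye_closure_rate(openness_buf))  # S4
    openness_buf.clear()
    if len(closure_rates) < PERIOD2_LEN:                  # S5: window not full
        return None
    return list(closure_rates)                            # S6: read the window
```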
FIG. 8 is a flowchart showing an example of the processing operations performed by the control unit 21 of the state determination device 20 according to the embodiment (1). The processing shown in FIG. 8 is executed following step S6 shown in FIG. 7, and illustrates the control unit 21 operating as the variation feature amount calculation unit 33, the inattentive state determination unit 34, and the output unit 35.
First, in step S7, the control unit 21 operates as the variation feature amount calculation unit 33, calculates the variation feature amount of the eye closure rate using the eye closure rate data for the second predetermined period read in step S6, and proceeds to step S8. For example, when the first predetermined period is set to one minute and the second to 15 minutes, the control unit 21 calculates the variation feature amount of the eye closure rate using 15 minutes' worth of the eye closure rates calculated every minute (15 data points).
The variation feature amount to be calculated may be the standard deviation of the eye closure rate over the second predetermined period, its amount of change or rate of change (slope), or its rise-time ratio, and at least one of these may be calculated. The calculated variation feature amount may be stored in the storage unit 22 in association with data such as the end time of the second predetermined period.
In step S8, the control unit 21 operates as the inattentive state determination unit 34, determines the inattentive state of the driver 3 (in other words, the state of inattentive driving) based on the variation feature amount of the eye closure rate calculated in step S7, and proceeds to step S9. For example, when the standard deviation of the eye closure rate is calculated as the variation feature amount, it may be determined whether the standard deviation has reached or exceeded a predetermined threshold for judging the transition to the inattentive state, or whether it has continued to exceed the threshold for a predetermined period (for example, a fixed time). In this case, the driver is judged to be in the inattentive state when the standard deviation of the eye closure rate is at or above the threshold, or has continued to exceed it for the predetermined period.
Similarly, when the amount or rate of change of the eye closure rate is calculated as the variation feature amount, it may be determined whether the amount or rate of change has reached or exceeded a predetermined threshold (amount or rate of change) for judging the transition to the inattentive state, or whether it has continued to exceed the threshold for a predetermined period. In this case, the driver is judged to be in the inattentive state when the amount or rate of change is at or above the threshold, or has continued to exceed it for the predetermined period.
Likewise, when the rise-time ratio of the eye closure rate over the second predetermined period is calculated as the variation feature amount, it may be determined whether the rise-time ratio has reached or exceeded a predetermined threshold (proportion) indicating a change to the inattentive state, or whether it has continued to exceed the threshold for a predetermined period; the driver is judged to be in the inattentive state when it has. The control unit 21 may also determine the inattentive state of the driver 3 based on two or more of the above variation feature amounts, or by combining a determination based on a variation feature amount with one based on the eye closure rate.
In step S9, the control unit 21 determines, from the result of the determination in step S8, whether the driver 3 is in the inattentive state; if so, it proceeds to step S10.
In step S10, the control unit 21 operates as the output unit 35, performs notification processing for returning the driver 3 from the inattentive state to the awake state, for example operating the notification unit 16 to output a warning sound or a predetermined announcement, and proceeds to step S11.
In step S11, the control unit 21 operates as the output unit 35, stores the result of the inattentive state determination (the inattentive state detection result) in the storage unit 22 in association with data such as the end time of the second predetermined period and the position, and then ends the processing. After step S11, the control unit 21 may transmit data including the inattentive state detection results stored in the storage unit 22 to the driving evaluation device 4 at a predetermined timing.
[Operation and effects]
According to the on-vehicle device 10 provided with the state determination device 20 of the embodiment (1), the inattentive state of the driver 3 is determined based on the variation feature amount of the eye closure rate described above, not on the value of the eye closure rate itself. As a result, even when the eye closure rate is measured on the high side in the awake state because of factors such as individual differences in the eye closure rate of each driver 3 or differences in the installation position of the camera 11 of each vehicle 2 (in other words, the image capture position), the inattentive state of each driver 3, that is, the state of inattentive driving, can be determined accurately without being affected by these factors. Moreover, compared with a method in which the driver (user) sets a judgment threshold for the eye closure rate for each driver, or each time the vehicle starts running, and the inattentive state is judged from the eye closure rate value, the present embodiment requires no such cumbersome threshold-setting operation, which also improves convenience for the user.
When an index indicating the degree of variation of the eye closure rate over the second predetermined period (for example, a value such as the standard deviation or variance) is used as the variation feature amount, changes in the degree of dispersion of the eye closure rate are taken into account in determining the inattentive state of the driver 3.
When the amount or rate of change of the eye closure rate over the second predetermined period is used as the variation feature amount, the magnitude of the change in the eye closure rate is taken into account in determining the inattentive state of the driver 3.
When the rise-time ratio of the eye closure rate over the second predetermined period is used as the variation feature amount, the tendency of the eye closure rate to rise over time is taken into account in determining the inattentive state of the driver 3.
Therefore, by determining the inattentive state of the driver 3 based on these variation feature amounts, the timing of the transition to the inattentive state can be determined accurately without being affected by the value of the eye closure rate itself, even when the eye closure rate is measured on the high side in the awake state.
Further, according to the driving evaluation system 1, which includes one or more on-vehicle devices 10 and the driving evaluation device 4, by using the inattentive state determination results produced by the state determination device 20 of each on-vehicle device 10, the driving evaluation device 4 can evaluate the inattentive state of each driver 3 fairly, unaffected by individual differences in eye closure rate, and can output more appropriate driving evaluation results to the operator terminal 6. The business operator managing the vehicles 2 can use the driving evaluation results displayed on the operator terminal 6 to provide appropriate safe-driving education to the drivers 3.
[Embodiment (2)]
Next, an on-vehicle device equipped with the state determination device according to the embodiment (2) will be described. Since the hardware configuration example of the on-vehicle device 10A according to the embodiment (2) is substantially the same as the configuration example shown in FIG. 2, different reference numerals are given to the state determination device 20A, which has different functions, and to its control unit 21A, and the description of the other components is omitted.
FIG. 9 is a block diagram showing a functional configuration example of the state determination device 20A according to the embodiment (2). The functional configuration of the state determination device 20A is substantially the same as that of the state determination device 20 shown in FIG. 3, except that a preprocessing unit 36 is provided before the variation feature amount calculation unit 33A; components having the same functions are therefore given the same reference numerals and their description is omitted.
The preprocessing unit 36 performs predetermined preprocessing on the eye closure rates calculated by the eye closure rate calculation unit 32. For example, the preprocessing unit 36 may perform, as preprocessing, a smoothing process on the calculated eye closure rates, such as computing a moving average of the eye closure rate over a predetermined interval. The predetermined interval may be, for example, the first predetermined period × m (m being an integer of 2 or more).
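A minimal sketch of the moving-average preprocessing, with an assumed window of m = 3 first predetermined periods:

```python
import numpy as np

def smooth_closure_rates(closure_rates, m=3):
    """Moving average of per-period eye closure rates over a window of
    m x (first predetermined period); m=3 is an assumed example value."""
    x = np.asarray(closure_rates, dtype=float)
    kernel = np.ones(m) / m
    return np.convolve(x, kernel, mode="valid")  # only fully covered windows
```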
The variation feature amount calculation unit 33A calculates the variation feature amount using the eye closure rates after preprocessing by the preprocessing unit 36.
FIG. 10(a) is the graph of FIG. 4(a), showing the time-series change of the eye closure rate of a driver whose eye closure rate is low in normal times, with the moving average line of the eye closure rate superimposed; FIG. 10(b) is the graph of FIG. 5(a), showing the time-series change of the eye closure rate of a driver whose eye closure rate is high in normal times, with the moving average line superimposed. The thick line indicates the moving average of the eye closure rate.
As is clear from FIGS. 10(a) and 10(b), applying a moving average to the time-series data of the eye closure rate smooths out data containing noise, and in particular makes it possible to extract the variation feature amounts expressed as the amount or rate of change of the eye closure rate with higher accuracy.
FIG. 11 is a flowchart showing an example of the processing operations performed by the control unit 21A of the state determination device 20A according to the embodiment (2). The processing shown in FIG. 11 is executed following step S6 of FIG. 7; since the processing up to step S6 is identical, its description is omitted. FIG. 11 shows an example in which the control unit 21A operates as the preprocessing unit 36, the variation feature amount calculation unit 33A, the inattentive state determination unit 34, and the output unit 35. Processing identical to that shown in FIG. 8 is given the same step numbers and its description is omitted.
First, in step S21, the control unit 21A determines whether to execute the eye closure rate smoothing process, based on the type of variation feature amount used for the inattentive state determination. For example, when the type of variation feature amount used is the amount or rate of change of the eye closure rate, or its rise-time ratio, the control unit 21A determines that the smoothing process is to be executed, and proceeds to step S22.
In step S22, the control unit 21A operates as the preprocessing unit 36, smooths the eye closure rate data for the second predetermined period read in step S6, and proceeds to step S23. As the smoothing process, the control unit 21A computes, for example, the average eye closure rate over a fixed sub-interval of the second predetermined period while sliding that interval, that is, a moving average.
In step S23, the control unit 21A operates as the variation feature amount calculation unit 33A, calculates the variation feature amount of the eye closure rate using the eye closure rates smoothed in step S22, and proceeds to step S24.
The variation feature amount to be calculated may be, for example, the amount of change or rate of change (slope) of the smoothed eye closure rate, or the rise-time ratio of the smoothed eye closure rate; at least one of these may be calculated. The calculated variation feature amount may be stored in the storage unit 22 in association with data such as the end time of the second predetermined period.
In step S24, the control unit 21A operates as the inattentive state determination unit 34, determines the inattentive state of the driver based on the variation feature amount of the eye closure rate calculated in step S23 (its amount of change, rate of change, or rise-time ratio), and proceeds to step S9.
When the amount or rate of change of the smoothed eye closure rate is calculated in step S23 as the variation feature amount, it may be determined in step S24 whether the smoothed amount or rate of change has reached or exceeded a predetermined threshold (amount or rate of change) for judging the transition to the inattentive state; if so, the driver is judged to be in the inattentive state.
Likewise, when the rise-time ratio of the smoothed eye closure rate is calculated in step S23 as the variation feature amount, it may be determined in step S24 whether the smoothed rise-time ratio has reached or exceeded a predetermined threshold (proportion) indicating a change to the inattentive state; if so, the driver is judged to be in the inattentive state.
On the other hand, when the type of variation feature amount used for the inattentive state determination is, for example, the standard deviation of the eye closure rate, the control unit 21A determines in step S21 that the smoothing process is not to be executed, and proceeds to step S25.
In step S25, the control unit 21A operates as the variation feature amount calculation unit 33A, calculates the variation feature amount of the eye closure rate using the unsmoothed eye closure rates, and proceeds to step S26. The calculated variation feature amount is, for example, the standard deviation of the eye closure rate, and may be stored in the storage unit 22 in association with data such as the end time of the second predetermined period.
In step S26, the control unit 21A determines the inattentive state of the driver based on the variation feature amount of the eye closure rate (the standard deviation of the eye closure rate) calculated in step S25, and proceeds to step S9. Since the processing of steps S9 to S11 is the same as that of steps S9 to S11 in FIG. 8, its description is omitted here.
According to the in-vehicle device 10A provided with the state determination device 20A according to the embodiment (2), whether or not to execute the smoothing process of the eye closure rate based on the type of the variable feature amount used for determining the involuntary state. When it is determined that the variable feature amount is the amount of change, the rate of change, or the rate of increase in the eye closure rate, the pretreatment unit 36 performs a smoothing process as a pretreatment for the eye closure rate, and after the smoothing process. The amount of change in the rate of eye closure, the rate of change, or the variable feature amount of the rate of increase time is calculated using the rate of eye closure. By performing the smoothing process of the eye closure rate in the pretreatment unit 36, it is easy to grasp the tendency of the fluctuation feature amount represented by the change amount, the change rate, or the increase time rate of the eye closure rate to shift to the vague state. It can be calculated as a quantity, and the transition to a vague state can be accurately determined.
When the type of variation feature amount used for determining the absent-minded state is the standard deviation of the eye closure rate, the standard deviation is calculated from the unsmoothed eye closure rate, so the standard deviation of the eye closure rate can likewise be calculated as a feature amount from which the tendency to shift to the absent-minded state is easy to grasp.
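The standard-deviation branch of steps S25 and S26 can be illustrated in the same way; the threshold below is a hypothetical placeholder, not a value from this specification.

```python
from statistics import pstdev

# Hypothetical threshold: a larger spread of the raw closure rate over
# the second predetermined period is taken as a sign of the absent-minded state.
STD_THRESHOLD = 0.08

def judge_by_std(closure_rates):
    """Sketch of steps S25/S26: standard deviation of the unsmoothed
    eye closure rates over the second predetermined period."""
    return pstdev(closure_rates) >= STD_THRESHOLD
```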
[Embodiment (3)]

Next, an in-vehicle device equipped with the state determination device according to embodiment (3) will be described. Since the hardware configuration of the in-vehicle device 10B according to embodiment (3) is substantially the same as the configuration example shown in FIG. 2, different reference numerals are given only to the state determination device 20B, which has different functions, and to its control unit 21B; the description of the other components is omitted.

FIG. 12 is a block diagram showing a functional configuration example of the state determination device 20B according to embodiment (3). Except that a vehicle dynamics data acquisition unit 37, an event detection unit 38, and a preprocessing unit 36A are additionally provided, the functional configuration is substantially the same as that of the state determination device 20 shown in FIG. 3; configurations having the same functions are given the same reference numerals, and their description is omitted.
The vehicle dynamics data acquisition unit 37 acquires at least one of the acceleration data of the vehicle 2 detected by the acceleration sensor 12 of the in-vehicle device 10, the angular velocity data of the vehicle 2 detected by the angular velocity sensor 13, and the position data of the vehicle 2 detected by the GPS receiving unit 14, and sends the acquired data to the event detection unit 38.

The event detection unit 38 includes a vehicle dynamics detection unit 381 and a face orientation event detection unit 382. The vehicle dynamics detection unit 381 detects a predetermined event based on the dynamics data of the vehicle 2 acquired from the vehicle dynamics data acquisition unit 37. The face orientation event detection unit 382 detects a face orientation event of the driver 3 based on the image data acquired from the image acquisition unit 30; it may also detect such an event based on both the image data and the dynamics data of the vehicle 2 acquired from the vehicle dynamics data acquisition unit 37.
The vehicle dynamics detection unit 381 detects, as the predetermined event, at least one of the following: an event in which the type of road on which the vehicle 2 is traveling switches, an event in which the vehicle 2 stops, an event in which the vehicle 2 starts traveling, an event in which sudden steering occurs in the vehicle 2, an event in which sudden braking occurs in the vehicle 2, and an event in which an impact occurs to the vehicle 2.

The face orientation event detection unit 382 detects events in which the orientation of the driver 3's face changes while the vehicle 2 is being driven. The detected face orientation events include, for example, at least one of a safety check when turning right or left at an intersection, a check of the traveling direction at an intersection, a face-not-detected state, and looking aside.
The preprocessing unit 36A performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32. For example, out of the eye closure rates calculated by the eye closure rate calculation unit 32 for the second predetermined period, the preprocessing unit 36A performs a removal process that excludes from the data used to calculate the variation feature amount any eye closure rate calculated at the time, or during the period (for example, the first predetermined period), in which the event detection unit 38 detected a predetermined event.
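One way to realize this removal process is sketched below. The representation of closure-rate samples and event windows as time intervals is an assumption made for illustration.

```python
def remove_event_windows(closure_samples, event_windows):
    """Exclude closure-rate samples whose first-predetermined-period window
    overlaps any detected event window (the removal process of unit 36A).

    closure_samples: list of (start_s, end_s, closure_rate)
    event_windows:   list of (start_s, end_s) for detected events
    """
    def overlaps(a0, a1, b0, b1):
        return a0 < b1 and b0 < a1

    return [s for s in closure_samples
            if not any(overlaps(s[0], s[1], e0, e1)
                       for e0, e1 in event_windows)]
```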
The variation feature amount calculation unit 33 calculates the variation feature amount using the eye closure rate after preprocessing by the preprocessing unit 36A.

FIG. 13 is a flowchart showing the processing operations performed by the control unit 21B of the state determination device 20B according to embodiment (3). FIG. 13 shows an example in which the control unit 21B operates as the image acquisition unit 30, the eye opening/closing degree detection unit 31, the eye closure rate calculation unit 32, the vehicle dynamics data acquisition unit 37, the event detection unit 38, and the preprocessing unit 36A. Processing contents identical to those shown in FIG. 7 are given the same step numbers, and their description is omitted here.
In the flowchart shown in FIG. 13, the processing up to the calculation of the eye closure rate in steps S1 to S4, the processing for detecting dynamics events of the vehicle 2 in steps S31 to S38, and the processing for detecting face orientation events of the driver 3 in steps S41 to S43 are executed in parallel. The processing procedure described below is only an example and may be changed as appropriate; steps may be omitted, replaced, or added depending on the embodiment.
In step S31, the control unit 21B operates as the vehicle dynamics data acquisition unit 37, performs a process of acquiring the dynamics data of the vehicle 2, and proceeds to step S32. The control unit 21B acquires, for example, at least one of the acceleration data detected by the acceleration sensor 12, the angular velocity data detected by the angular velocity sensor 13, and the position data detected by the GPS receiving unit 14, at predetermined intervals (several tens of milliseconds to several seconds).
In step S32, the control unit 21B operates as the vehicle dynamics detection unit 381, uses the dynamics data of the vehicle 2 acquired in step S31 to detect whether the vehicle 2 is passing through an intersection (for example, turning right or left), and proceeds to step S33.

Whether the vehicle 2 is passing through an intersection may be detected, for example, by checking whether the absolute value of the angular velocity detected by the angular velocity sensor 13 exceeds a predetermined angular velocity threshold, and treating the time at which the threshold is exceeded as the intersection passage time. A condition that the vehicle speed is at or below a predetermined speed suitable for passing through an intersection may also be added. The data detected while passing through the intersection may be stored in the storage unit 22 in association with data such as the passage time and position of the intersection.
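A minimal sketch of this check, combining the yaw-rate threshold with the optional speed condition, might look as follows; both threshold values are illustrative assumptions.

```python
def detect_intersection_pass(samples, yaw_threshold=0.3, speed_limit=8.0):
    """Flag samples where |yaw rate| exceeds a threshold while the vehicle
    is slow enough to plausibly be turning at an intersection.

    samples: list of (time_s, yaw_rate_rad_s, speed_m_s)
    Returns the times treated as intersection passage times.
    """
    return [t for t, yaw, speed in samples
            if abs(yaw) > yaw_threshold and speed <= speed_limit]
```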
 ステップS33では、制御部21Bは、車両動態検出部381として動作し、ステップS31で取得した車両2の動態データを用いて、車両2が走行している道路種別の切り替わりを検出する処理を行い、ステップS34に処理を進める。制御部21Bは、例えば、ステップS31で取得した位置データを用いて、単位時間当たりの移動距離から車速を算出し、算出した車速に基づいて、道路種別の切り替わりを検出してもよい。 In step S33, the control unit 21B operates as the vehicle dynamics detection unit 381, and uses the dynamic data of the vehicle 2 acquired in step S31 to perform a process of detecting the switching of the road type on which the vehicle 2 is traveling. The process proceeds to step S34. For example, the control unit 21B may calculate the vehicle speed from the moving distance per unit time using the position data acquired in step S31, and detect the switching of the road type based on the calculated vehicle speed.
 例えば、時速80km以上の車速が、所定時間(例えば、数十秒以上)継続して検出された場合、一般道路から高速道路への切り替わりを検出してもよい。また、制御部21Bは、車速に基づいて、高速道路から一般道路への切り替わりを検出してもよいし、一般道路から生活道路(例えば、制限速度が時速30km以下の道路)への切り替わりを検出してもよいし、生活道路から一般道路への切り替わりを検出してもよい。 For example, when a vehicle speed of 80 km / h or more is continuously detected for a predetermined time (for example, several tens of seconds or more), a switch from a general road to an expressway may be detected. Further, the control unit 21B may detect a switch from an expressway to a general road based on the vehicle speed, or detect a switch from a general road to a residential road (for example, a road having a speed limit of 30 km / h or less). Alternatively, the switch from the residential road to the general road may be detected.
 また、車載機10が道路地図データを利用できる場合には、制御部21Bが、道路地図データ(道路種別データを含む)と位置データと照合して、道路種別の切り替わりを検出してもよい。また、制御部21Bは、路車間通信などにより取得した通信信号(例えば、高速道路への進入信号、又は高速道路からの退出信号など)に基づいて、道路種別の切り替わりを検出してもよい。道路種別の切り替わりを検出したデータは、例えば、位置データ、検出時刻などのデータと紐付けて記憶部22に記憶されてもよい。 Further, when the in-vehicle device 10 can use the road map data, the control unit 21B may detect the switching of the road type by collating the road map data (including the road type data) with the position data. Further, the control unit 21B may detect the switching of road types based on a communication signal (for example, an approach signal to an expressway or an exit signal from an expressway) acquired by road-to-vehicle communication or the like. The data for detecting the change of the road type may be stored in the storage unit 22 in association with data such as position data and detection time.
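The speed-based variant of this detection can be sketched as below, following the 80 km/h example above; the hold time and function name are assumptions for illustration.

```python
def detect_expressway_entry(speed_log, speed_kmh=80.0, hold_s=30.0):
    """Report the time at which speed has stayed at or above `speed_kmh`
    for `hold_s` seconds, taken as a general-road-to-expressway switch.

    speed_log: list of (time_s, speed_kmh) sorted by time
    """
    run_start = None
    for t, v in speed_log:
        if v >= speed_kmh:
            run_start = t if run_start is None else run_start
            if t - run_start >= hold_s:
                return t  # switch detected at this time
        else:
            run_start = None  # the fast run was broken; start over
    return None
```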
In step S34, the control unit 21B operates as the vehicle dynamics detection unit 381, uses the dynamics data of the vehicle 2 acquired in step S31 to detect that the vehicle 2 has stopped, and proceeds to step S35.

A stop of the vehicle 2 may be detected, for example, using the position data acquired in step S31: when a state in which the position data does not change continues for a predetermined time (several seconds to several tens of seconds), or when the vehicle speed derived from the position data is at or below a predetermined value (for example, creeping speed), the vehicle 2 is regarded as stopped. The data recording the detected stop may be stored in the storage unit 22 in association with data such as the acquisition time of the position data.
In step S35, the control unit 21B operates as the vehicle dynamics detection unit 381, uses the dynamics data of the vehicle 2 acquired in step S31 to detect that the vehicle 2 has started traveling, and proceeds to step S36.

The start of traveling may be detected, for example, using the position data acquired in step S31: when changes in the position data continue for a predetermined time (several seconds to several tens of seconds) after a state in which the position data had not been changing, or when it is detected that the vehicle speed derived from the position data has reached or exceeded a predetermined value, the vehicle 2 is regarded as having started traveling. The data recording the detected start of traveling may be stored in the storage unit 22 in association with data such as the time at which the start was detected.
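The position-based stop check of step S34 (whose negation is the basis of the start-of-traveling check) can be sketched as follows; the window length and position tolerance are hypothetical values.

```python
def detect_stop(positions, window_s=10.0, eps_m=1.0):
    """Treat the vehicle as stopped when its position has moved less than
    `eps_m` metres over the last `window_s` seconds.

    positions: list of (time_s, x_m, y_m) sorted by time
    """
    t_last, x_last, y_last = positions[-1]
    for t, x, y in reversed(positions):
        if t_last - t > window_s:
            return True  # every sample in the window stayed within eps_m
        if ((x - x_last) ** 2 + (y - y_last) ** 2) ** 0.5 > eps_m:
            return False
    return False  # log does not yet cover a full window
```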
In step S36, the control unit 21B operates as the vehicle dynamics detection unit 381, uses the dynamics data of the vehicle 2 acquired in step S31 to detect sudden steering or sudden braking of the vehicle 2, and proceeds to step S37.

Sudden steering of the vehicle 2 may be detected, for example, using the angular velocity (including the yaw rate) and acceleration (including lateral acceleration) data acquired in step S31: when an angular velocity and an acceleration at or above predetermined thresholds for judging the occurrence of sudden steering are detected, sudden steering is regarded as having occurred in the vehicle 2.
Sudden braking (sudden deceleration) of the vehicle 2 may be detected, for example, using the acceleration data (including longitudinal acceleration) acquired in step S31: when an acceleration at or above a predetermined threshold for judging the occurrence of sudden braking is detected, sudden braking is regarded as having occurred in the vehicle 2. The data recording detected sudden steering or sudden braking may be stored in the storage unit 22 in association with data such as the time of detection.
In step S37, the control unit 21B operates as the vehicle dynamics detection unit 381, uses the dynamics data of the vehicle 2 acquired in step S31 to detect an impact to the vehicle 2, and proceeds to step S38.

An impact to the vehicle 2 may be detected, for example, using the acceleration data (including longitudinal or lateral acceleration) acquired in step S31: when an acceleration at or above a predetermined threshold for judging the occurrence of an impact such as a collision is detected, an impact is regarded as having occurred to the vehicle 2. The data recording the detected impact may be stored in the storage unit 22 in association with data such as the time of detection.
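The threshold tests of steps S36 and S37 can be gathered into one classifier per dynamics sample, as sketched below. All threshold values are illustrative placeholders, not values from this specification.

```python
def classify_dynamics_event(yaw_rate, lat_acc, lon_acc,
                            steer_yaw=0.5, steer_lat=4.0,
                            brake_lon=-5.0, impact_acc=15.0):
    """Classify one dynamics sample by simple threshold tests.

    yaw_rate in rad/s; accelerations in m/s^2 (lon_acc is negative
    when braking). Returns an event label or None.
    """
    if max(abs(lat_acc), abs(lon_acc)) >= impact_acc:
        return "impact"           # step S37: collision-level acceleration
    if abs(yaw_rate) >= steer_yaw and abs(lat_acc) >= steer_lat:
        return "sudden_steering"  # step S36: high yaw rate plus lateral g
    if lon_acc <= brake_lon:
        return "sudden_braking"   # step S36: strong deceleration
    return None
```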
In step S38, the control unit 21B determines whether the eye closure rate has been calculated in step S4, which is being processed in parallel. If the eye closure rate has not been calculated, the process returns to step S31 and repeats; if it has been calculated, the process proceeds to step S50.

The processing of steps S41 to S43 is executed in parallel with the processing of steps S31 to S38 described above.
In step S41, the control unit 21B processes the image acquired from the camera 11 in step S1 to detect the orientation of the driver 3's face from the image, and proceeds to step S42. The method of detecting the face orientation from the image is not particularly limited. For example, the control unit 21B may detect the position or shape of each facial organ, such as the eyes, nose, mouth, and eyebrows, from the face region in the image, and detect the face orientation based on the detected positions or shapes.

The face orientation of the driver 3 detected by the control unit 21B may be expressed, for example, by a pitch angle, the angle about the X axis (left-right axis) of the driver's face indicating the up-down orientation; a yaw angle, the angle about the Y axis (vertical axis) indicating the left-right orientation; and a roll angle, the angle about the Z axis (front-rear axis) indicating the left-right tilt. At minimum, the yaw angle indicating the left-right orientation is included. These angles can be expressed relative to a predetermined reference direction; for example, the reference direction may be set to the driver's frontal direction. The detected face orientation data may be stored in the storage unit 22 in association with data such as the image acquisition time or frame number.
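Given such a yaw-angle time series, a looking-aside event of the kind detected in step S42 could be flagged as in the sketch below; the angle limit and hold time are assumptions for illustration.

```python
def detect_looking_aside(yaw_log, yaw_limit_deg=30.0, hold_s=2.0):
    """Flag a looking-aside event when |yaw| stays beyond `yaw_limit_deg`
    (relative to the driver's frontal direction) for `hold_s` seconds.

    yaw_log: list of (time_s, yaw_deg) sorted by time
    """
    start = None
    for t, yaw in yaw_log:
        if abs(yaw) > yaw_limit_deg:
            start = t if start is None else start
            if t - start >= hold_s:
                return True
        else:
            start = None  # face returned to the frontal direction
    return False
```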
In step S42, the control unit 21B operates as the face orientation event detection unit 382, performs a process of detecting a face orientation event of the driver 3, and proceeds to step S43.

A face orientation event of the driver 3, such as a face-not-detected state or a looking-aside state, may be detected based on the result of the face orientation detection in step S41. The control unit 21B may also detect events such as a safety check operation when passing through an intersection (checking left and right, checking the traveling direction, and so on), based on the result of the face orientation detection in step S41 and the result of the intersection passage detection in step S32. The detection data of a detected face orientation event may be stored in the storage unit 22 in association with data such as the image acquisition time, the intersection passage time, or the position.
In step S43, the control unit 21B determines whether the eye closure rate has been calculated in step S4, which is being processed in parallel. If the eye closure rate has not been calculated, the process returns to step S41 and repeats; if it has been calculated, the process proceeds to step S50.

In step S50, the control unit 21B performs preprocessing on the eye closure rate calculated in step S4 and proceeds to step S5. The content of the preprocessing in step S50 will be described later.
In step S5, the control unit 21B determines whether the eye closure rate for the second predetermined period has been calculated. If it has not, the process returns to step S1 and repeats; if it has, the process proceeds to step S6.

In step S6, the control unit 21B reads the preprocessed eye closure rate data for the second predetermined period from the storage unit 22, and then returns to step S1 to repeat the eye opening/closing degree detection process and the eye closure rate calculation process.
FIG. 14 is a flowchart showing an example of the processing operations performed by the control unit 21B of the state determination device 20B according to embodiment (3); it shows an operation example of the eye closure rate preprocessing in step S50 of FIG. 13. The processing procedure described below is only an example and may be changed as appropriate; steps may be omitted, replaced, or added depending on the embodiment.

In step S51, the control unit 21B determines whether a switch of road type was detected in step S33 of FIG. 13 during the first predetermined period. If a switch of road type was detected during the first predetermined period, the process proceeds to step S52.
In step S52, the control unit 21B performs a process of removing the eye closure rate calculated before the switch of road type was detected from the target data used to calculate the variation feature amount, and then ends the preprocessing.

On the other hand, if it is determined in step S51 that no switch of road type was detected during the first predetermined period, the process proceeds to step S53.
In step S53, the control unit 21B determines whether a stop of the vehicle 2 was detected in step S34 of FIG. 13 during the first predetermined period. If a stop was detected, the process proceeds to step S54.

In step S54, the control unit 21B performs a process of removing the eye closure rate in the first predetermined period that includes the time the stop of the vehicle 2 was detected from the calculation target data, and then ends the preprocessing.
On the other hand, if it is determined in step S53 that no stop of the vehicle 2 was detected during the first predetermined period, the process proceeds to step S55.

In step S55, the control unit 21B determines whether the start of traveling of the vehicle 2 was detected in step S35 of FIG. 13 during the first predetermined period. If the start of traveling was detected, the process proceeds to step S56.
In step S56, the control unit 21B performs a process of removing the eye closure rate in the first predetermined period that includes the time the start of traveling of the vehicle 2 was detected from the calculation target data, and then ends the preprocessing.

On the other hand, if it is determined in step S55 that no start of traveling of the vehicle 2 was detected during the first predetermined period, the process proceeds to step S57.

In step S57, the control unit 21B determines whether sudden steering or sudden braking of the vehicle 2 was detected in step S36 of FIG. 13 during the first predetermined period. If sudden steering or sudden braking was detected, the process proceeds to step S58.
In step S58, the control unit 21B performs a process of removing the eye closure rate in the first predetermined period that includes the time the sudden steering or sudden braking of the vehicle 2 was detected from the calculation target data, and then ends the preprocessing.

On the other hand, if it is determined in step S57 that no sudden steering or sudden braking of the vehicle 2 was detected during the first predetermined period, the process proceeds to step S59.
In step S59, the control unit 21B determines whether an impact to the vehicle 2 was detected in step S37 of FIG. 13 during the first predetermined period. If an impact was detected, the process proceeds to step S60.

In step S60, the control unit 21B performs a process of removing the eye closure rate in the first predetermined period that includes the time the impact to the vehicle 2 was detected from the calculation target data, and then ends the preprocessing.
On the other hand, if it is determined in step S59 that no impact to the vehicle 2 was detected during the first predetermined period, the process proceeds to step S61.

In step S61, the control unit 21B determines whether a face orientation event of the driver 3 was detected in step S42 of FIG. 13 during the first predetermined period. If a face orientation event was detected, the process proceeds to step S62.
In step S62, the control unit 21B performs a process of removing the eye closure rate in the first predetermined period that includes the time the face orientation event of the driver 3 was detected from the calculation target data, and then ends the preprocessing.

On the other hand, if it is determined in step S61 that no face orientation event of the driver 3 was detected during the first predetermined period, the preprocessing ends and the process proceeds to step S5.
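The whole of the step S50 flow (steps S51 to S62) amounts to checking each closure-rate sample against every event category in turn, which might be condensed as in the following sketch; the event names and data shapes are assumptions made for illustration.

```python
def preprocess_closure_rate(sample, events):
    """Sketch of step S50: drop the closure-rate sample for a first
    predetermined period when any event fell inside that period.

    sample: (start_s, end_s, closure_rate)
    events: dict mapping event name -> list of detection times
    """
    checks = ["road_type_switch", "stop", "start_of_traveling",
              "sudden_steering_or_braking", "impact", "face_orientation_event"]
    for name in checks:
        if any(sample[0] <= t < sample[1] for t in events.get(name, [])):
            return None  # removed from the variation-feature target data
    return sample
```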
According to the in-vehicle device 10B provided with the state determination device 20B according to embodiment (3), the preprocessing unit 36A removes, from the eye closure rates of the second predetermined period, those calculated at the time or during the period in which the event detection unit 38 detected a predetermined event, excluding them from the data used to calculate the variation feature amount. The variation feature amount is then calculated from the eye closure rate remaining after this removal process. By using the eye closure rate after the removal process, a feature amount that more accurately represents the transition to the absent-minded state can be calculated, so the transition to the absent-minded state can be determined with high accuracy.
Since the event detection unit 38 includes the vehicle dynamics detection unit 381, the eye closure rate calculated at the time, or during the period, in which at least one of the following was detected can be excluded from the data used to calculate the variation feature amount: an event in which the type of road on which the vehicle 2 travels switches, an event in which the vehicle 2 stops, and an event in which the vehicle 2 starts traveling.

Since the event detection unit 38 also includes the face orientation event detection unit 382, the eye closure rate calculated at the time, or during the period, in which an event in which the orientation of the driver 3's face changes during driving was detected can likewise be excluded from the data used to calculate the variation feature amount.
Therefore, even in a real-vehicle environment in which the dynamics of the vehicle 2 change from moment to moment and the orientation of the driver 3's face changes in various ways, the variation feature amount can be calculated as a feature amount that more accurately indicates the tendency to shift to the absent-minded state, and the driver 3's transition to the absent-minded state in the real-vehicle environment can be determined with high accuracy.
[Embodiment (4)]

Next, an in-vehicle device equipped with the state determination device according to embodiment (4) will be described. Since the hardware configuration of the in-vehicle device 10C according to embodiment (4) is substantially the same as the configuration example shown in FIG. 2, different reference numerals are given only to the state determination device 20C, which has different functions, and to its control unit 21C; the description of the other components is omitted.

FIG. 15 is a block diagram showing a functional configuration example of the state determination device 20C according to embodiment (4). Except that an interpolation processing unit 39 is provided between the preprocessing unit 36B and the variation feature amount calculation unit 33B, and for the processing functions of the preprocessing unit 36B, the variation feature amount calculation unit 33B, and the absent-minded state determination unit 34A, the functional configuration is substantially the same as that of the state determination device 20B shown in FIG. 12; configurations having the same functions are given the same reference numerals, and their description is omitted.
The preprocessing unit 36B performs predetermined preprocessing on the eye closure rate calculated by the eye closure rate calculation unit 32. Like the preprocessing unit 36A in FIG. 12, the preprocessing unit 36B performs a removal process that excludes, from the data used to calculate the variation feature amount, the eye closure rates of the second predetermined period that were calculated at the time, or during the period (for example, the first predetermined period), in which the event detection unit 38 detected a predetermined event.
The interpolation processing unit 39 performs a process of interpolating, among the eye closure rates of the second predetermined period, the eye closure rate for the time or period (for example, the first predetermined period) removed by the removal process of the preprocessing unit 36B. By interpolating the removed eye closure rates before the variation feature amount is calculated, the accuracy of the variation feature amount calculated by the variation feature amount calculation unit 33B can be improved.

The preprocessing unit 36B also performs, for example, a smoothing process on the eye closure rate after the removal and interpolation processes, according to the type of variation feature amount used for determining the absent-minded state. As the smoothing process, the preprocessing unit 36B may, for example, calculate a moving average of the eye closure rate at predetermined intervals.
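A moving-average smoothing pass of the kind mentioned above might look as follows; the window length is a hypothetical choice, not a value from this specification.

```python
def moving_average(values, window=5):
    """One possible smoothing pre-process: a trailing moving average
    computed at each sample over the last `window` values."""
    out = []
    acc = 0.0
    for i, v in enumerate(values):
        acc += v
        if i >= window:
            acc -= values[i - window]  # drop the value that left the window
        out.append(acc / min(i + 1, window))
    return out
```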
The variation feature amount calculation unit 33B calculates the variation feature amount using the eye closure rate after the preprocessing and interpolation (the removal process, the interpolation process, and the smoothing process) by the preprocessing unit 36B and the interpolation processing unit 39.

FIG. 16 is a flowchart showing the processing operations performed by the control unit 21C of the state determination device 20C according to embodiment (4). The processing operations shown in FIG. 16 are executed following the processing of step S6 shown in FIG. 13. FIG. 16 shows an example in which the control unit 21C operates as the preprocessing unit 36B, the interpolation processing unit 39, the variation feature amount calculation unit 33B, the absent-minded state determination unit 34A, and the output unit 35. Processing contents identical to those shown in FIG. 11 are given the same step numbers, and their description is omitted.
In step S71, the control unit 21C operates as the interpolation processing unit 39 and, among the eye closure rates calculated for each first predetermined period within the second predetermined period, performs a process of interpolating the eye closure rates removed by the preprocessing of step S50 in FIG. 13, and then proceeds to step S21.

In the interpolation process of step S71, the control unit 21C may, for example, interpolate a removed eye closure rate with the eye closure rate calculated immediately before or immediately after it. Alternatively, the control unit 21C may interpolate with the average of the eye closure rates calculated immediately before and after the removed one, or interpolate based on the change amount or change rate (slope) of the eye closure rates calculated immediately before and after the removed one.
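The neighbour-average variant of this interpolation can be sketched as below; representing removed samples as None is an assumption made for illustration.

```python
def interpolate_removed(closure_rates):
    """Fill removed samples (None) with the mean of the nearest valid
    neighbours; fall back to the single available neighbour at the edges.

    closure_rates: list of float or None (None = removed by preprocessing)
    """
    filled = list(closure_rates)
    for i, v in enumerate(filled):
        if v is None:
            prev = next((filled[j] for j in range(i - 1, -1, -1)
                         if filled[j] is not None), None)
            nxt = next((x for x in closure_rates[i + 1:] if x is not None),
                       None)
            neighbours = [x for x in (prev, nxt) if x is not None]
            filled[i] = sum(neighbours) / len(neighbours) if neighbours else 0.0
        # a slope-based variant could instead extrapolate from the trend
        # of the samples immediately before and after the removed one
    return filled
```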
The processing of steps S21 to S26 is basically the same as that of steps S21 to S26 shown in FIG. 11, except that the interpolated eye closure rate data is used, so its description is omitted here. Step S22 corresponds to the processing of the preprocessing unit 36B, steps S23 and S25 correspond to the processing of the variation feature amount calculation unit 33B, and steps S24 and S26 correspond to the processing of the absent-minded state determination unit 34A.
Then, in step S24 or step S26, the control unit 21C performs the absent-minded state determination process and proceeds to step S72.

In step S72, the control unit 21C operates as the absent-minded state determination unit 34A and determines whether the variation feature amount of the eye closure rate is equal to or greater than a threshold A set as the criterion for performing warning processing. If it is not, the process ends; if it is, the process proceeds to step S73. For example, when the level of the absent-minded state is classified into low, medium, and high, the threshold A may be used as the criterion for determining that the driver is in an absent-minded state of medium level or lower.
In step S73, the control unit 21C operates as the output unit 35, performs warning processing for the driver 3, and proceeds to step S74. As the warning processing, the control unit 21C may, for example, operate the notification unit 16 to output an alarm sound or a warning announcement for awakening the driver 3.

In step S74, the control unit 21C operates as the absent-minded state determination unit 34A and determines whether the variation feature amount of the eye closure rate is equal to or greater than a threshold B set as the criterion for making an external report. If it is not, the process proceeds to step S76; if it is, the process proceeds to step S75. The threshold B is larger than the threshold A and may be used, for example, as the criterion for determining that the driver is in an absent-minded state higher than the medium level when the level of the absent-minded state is classified into low, medium, and high.
In step S75, the control unit 21C operates as the output unit 35, performs a process of reporting to the outside that the driver 3 is in an absent-minded state, and then proceeds to step S76. As the external reporting process, for example, the communication unit 15 is operated to report to the business operator terminal 6 that the driver 3 is in an absent-minded state higher than the medium level.

In step S76, when it was determined in step S74 that the variation feature amount was not equal to or greater than the threshold B, the control unit 21C performs a process of storing the determination result of an absent-minded state at or above the threshold A in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the process.
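The stepwise logic of steps S72 to S76 can be summarized in the following sketch. The threshold values are hypothetical placeholders, and `warn`, `report_external`, and `store` stand in for the output and storage operations described above.

```python
THRESHOLD_A = 0.05  # warning level (hypothetical value)
THRESHOLD_B = 0.10  # external-report level (hypothetical, B > A)

def handle_determination(feature, warn, report_external, store):
    """Sketch of steps S72-S76: stepwise output based on two thresholds."""
    if feature < THRESHOLD_A:
        return                  # step S72: no absent-minded state detected
    warn()                      # step S73: alert the driver
    if feature >= THRESHOLD_B:
        report_external()       # step S75: notify the operator terminal
        store(level="high")     # step S76: record the >= B result
    else:
        store(level="medium")   # step S76: record the >= A result
```

Keeping the two thresholds separate lets the device escalate from an in-cabin warning to an external report only when the feature amount indicates a more severe state.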
Also in step S76, when it was determined in step S74 that the variation feature amount was equal to or greater than the threshold B and the external reporting process was executed in step S75, the control unit 21C performs a process of storing the determination result of an absent-minded state at or above the threshold B in the storage unit 22 in association with data such as the elapsed time and position of the second predetermined period, and then ends the process.

After step S76, the control unit 21C may transmit the data including the absent-minded state detection results stored in the storage unit 22 to the driving evaluation device 4 at a predetermined timing.
According to the in-vehicle device 10C provided with the state determination device 20C according to embodiment (4), the preprocessing unit 36B removes, from the eye closure rates of the second predetermined period, those calculated at the time or during the period in which the event detection unit 38 detected a predetermined event, excluding them from the data used to calculate the variation feature amount, and the variation feature amount is calculated from the eye closure rate remaining after this removal process.

Furthermore, since the preprocessing unit 36B also smooths the eye closure rate after the removal process, the smoothed eye closure rate can be used to calculate the variation feature amount. The variation feature amount can therefore be calculated as a feature amount from which the tendency to shift to the absent-minded state is easy to grasp, and the transition to the absent-minded state can be determined with high accuracy.
In addition, since the absent-minded state determination unit 34A determines the absent-minded state of the driver 3 stepwise using a plurality of determination thresholds (threshold A and threshold B), an appropriate determination can be made according to the degree of the absent-minded state, and an appropriate output can be produced according to that degree.

[Modifications]

Although the embodiments of the present invention have been described in detail above, the foregoing description is in all respects merely an illustration of the present invention. Needless to say, various improvements and modifications can be made without departing from the scope of the present invention.
In the above embodiment (1), the state determination device 20 acquires images from the camera 11 and performs the processing of detecting the face orientation, gaze direction, and eye opening/closing degree data. In another embodiment, the camera 11 may include an image analysis unit including, for example, an image processor, and the image analysis unit may perform the processing of detecting the driver 3's face orientation, gaze direction, and eye opening/closing degree data (driving behavior data) from the captured images. The state determination device 20 may then acquire the driving behavior data, image data, and imaging date/time data of the driver 3 from the camera 11, and execute the processing from the eye closure rate calculation onward using the acquired eye opening/closing degree data.
In the above embodiments, the case where the state determination device is applied to an in-vehicle device has been described, but the state determination device is not limited to such an application. For example, the state determination device may be incorporated into an industrial equipment system or the like and applied to determining the absent-minded state of a person performing a predetermined task.
[Additional Notes]
Embodiments of the present invention may also be described as in the following appendices, but are not limited to them.
(Appendix 1)
A state determination device (20) for determining the state of a person, comprising:
an eye opening/closing degree detection unit (31) that detects the eye opening/closing degree of the person from an image of the person's face;
an eye closure rate calculation unit (32) that calculates the eye closure rate of the person using the eye opening/closing degree detected by the eye opening/closing degree detection unit (31) over a first predetermined period;
a variation feature amount calculation unit (33) that calculates a variation feature amount of the eye closure rate using the eye closure rate calculated by the eye closure rate calculation unit (32) over a second predetermined period; and
an absent-minded state determination unit (34) that determines the absent-minded state of the person based on the variation feature amount calculated by the variation feature amount calculation unit (33).
(Appendix 2)
An in-vehicle device (10) comprising:
the state determination device (20); and
an imaging unit (11) that captures the image.
(Appendix 3)
A driving evaluation system (1) comprising:
one or more in-vehicle devices (10); and
a driving evaluation device (4) including a driving evaluation unit (421) that performs a driving evaluation, including an evaluation of the absent-minded state of the person, based on data including the determination result of the person's absent-minded state determined by the state determination device (20) of the in-vehicle device (10), and an evaluation result output unit (422) that outputs a driving evaluation result including the evaluation of the person's absent-minded state evaluated by the driving evaluation unit (421).
(Appendix 4)
A state determination method for determining the state of a person, comprising:
an eye opening/closing degree detection step (S2) of detecting the eye opening/closing degree of the person from an image of the person's face;
an eye closure rate calculation step (S4) of calculating the eye closure rate of the person using the eye opening/closing degree detected in the eye opening/closing degree detection step (S2) over a first predetermined period;
a variation feature amount calculation step (S7) of calculating a variation feature amount of the eye closure rate using the eye closure rate calculated in the eye closure rate calculation step (S4) over a second predetermined period; and
an absent-minded state determination step (S8) of determining the absent-minded state of the person based on the variation feature amount calculated in the variation feature amount calculation step (S7).
(Appendix 5)
A program (221) for causing at least one computer (20) to execute a process of determining the state of a person, the program causing the one or more computers (20) to execute:
an eye opening/closing degree detection step (S2) of detecting the eye opening/closing degree of the person from an image of the person's face;
an eye closure rate calculation step (S4) of calculating the eye closure rate of the person using the eye opening/closing degree detected in the eye opening/closing degree detection step (S2) over a first predetermined period;
a variation feature amount calculation step (S7) of calculating a variation feature amount of the eye closure rate using the eye closure rate calculated in the eye closure rate calculation step (S4) over a second predetermined period; and
an absent-minded state determination step (S8) of determining the absent-minded state of the person based on the variation feature amount calculated in the variation feature amount calculation step (S7).
1 Driving evaluation system
2 Vehicle
3 Driver
4 Driving evaluation device
41 Communication unit
42 Control unit
421 Driving evaluation unit
422 Evaluation result output unit
43 Storage unit
431 Detection data storage unit
432 Evaluation condition storage unit
433 Evaluation result storage unit
44 Communication bus
5 Communication network
6 Business operator terminal
10, 10A, 10B, 10C In-vehicle device
11 Camera
12 Acceleration sensor
13 Angular velocity sensor
14 GPS receiving unit
14a Antenna
15 Communication unit
15a Antenna
16 Notification unit
20, 20A, 20B, 20C State determination device
21, 21A, 21B, 21C Control unit
22 Storage unit
221 Program
23 Input/output interface (I/F)
30 Image acquisition unit
31 Eye opening/closing degree detection unit
32 Eye closure rate calculation unit
33 Variation feature amount calculation unit
34 Absent-minded state determination unit
35 Output unit
36, 36A Preprocessing unit
37 Vehicle dynamics data acquisition unit
38 Event detection unit
381 Vehicle dynamics detection unit
382 Face orientation event detection unit
39 Interpolation processing unit

Claims (22)

  1.  A state determination device that determines the state of a person, comprising:
     an eye opening/closing degree detection unit that detects the eye opening/closing degree of the person from an image of the person's face;
     an eye closure rate calculation unit that calculates the eye closure rate of the person using the eye opening/closing degrees of a first predetermined period detected by the eye opening/closing degree detection unit;
     a variation feature amount calculation unit that calculates a variation feature amount of the eye closure rate using the eye closure rates of a second predetermined period calculated by the eye closure rate calculation unit; and
     an inattentive state determination unit that determines the inattentive state of the person based on the variation feature amount calculated by the variation feature amount calculation unit.
  2.  The state determination device according to claim 1, further comprising a preprocessing unit that performs preprocessing on the eye closure rate calculated by the eye closure rate calculation unit,
     wherein the variation feature amount calculation unit calculates the variation feature amount using the eye closure rate after preprocessing by the preprocessing unit.
  3.  The state determination device according to claim 2, wherein the preprocessing unit performs smoothing processing on the eye closure rate.
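As a concrete reading of the smoothing processing in claim 3, a simple moving average over the closure-rate series might look as follows; the window length of 5 samples is an assumption of this sketch.

```python
import numpy as np

def smooth_closure_rates(rates: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average smoothing of an eye closure rate series.
    mode="same" keeps the output length equal to the input; note that
    np.convolve zero-pads at the edges, so a production system might
    prefer edge replication or a median filter instead."""
    kernel = np.ones(window) / window
    return np.convolve(rates, kernel, mode="same")
```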
  4.  The state determination device according to claim 2, further comprising an event detection unit that detects a predetermined event,
     wherein the preprocessing unit performs removal processing that excludes, from the data used to calculate the variation feature amount, those eye closure rates of the second predetermined period that were calculated at a time or during a period in which the event detection unit detected the predetermined event.
  5.  The state determination device according to claim 4, wherein the preprocessing unit further performs smoothing processing on the eye closure rate after the removal processing.
  6.  The state determination device according to claim 4 or 5, further comprising an interpolation processing unit that interpolates, among the eye closure rates of the second predetermined period, the eye closure rates of the time or period removed by the removal processing.
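Claims 4 to 6 pair removal of event-affected samples with interpolation of the removed spans. Below is a minimal sketch, assuming the event periods arrive as a boolean mask aligned with the closure-rate series:

```python
import numpy as np

def remove_and_interpolate(rates: np.ndarray, event_mask: np.ndarray) -> np.ndarray:
    """Exclude closure rates recorded while a predetermined event was
    detected (removal processing), then fill the gaps by linear
    interpolation over the remaining samples (interpolation processing)."""
    rates = rates.astype(float).copy()
    kept = ~event_mask
    if not kept.any():
        raise ValueError("every sample fell inside an event period")
    idx = np.arange(len(rates))
    # np.interp fills removed positions from neighbouring kept samples.
    rates[event_mask] = np.interp(idx[event_mask], idx[kept], rates[kept])
    return rates
```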
  7.  The state determination device according to any one of claims 4 to 6, wherein the person is a driver of a vehicle, and
     the event detection unit includes a vehicle dynamics detection unit that detects, as the predetermined event, at least one of an event in which the type of road on which the vehicle is traveling changes, an event in which the vehicle stops, an event in which the vehicle starts traveling, an event in which sudden steering occurs in the vehicle, an event in which sudden braking occurs in the vehicle, and an event in which an impact occurs to the vehicle.
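The vehicle dynamics events listed in claim 7 would typically be derived from the acceleration sensor (12), angular velocity sensor (13), and GPS data; the sketch below uses illustrative thresholds that are assumptions of this example, not values from the specification.

```python
def detect_dynamics_events(accel_long: float, accel_lat: float,
                           yaw_rate: float, speed: float) -> list[str]:
    """Flag vehicle dynamics events from one sensor sample.
    Units assumed: m/s^2 for accelerations, rad/s for yaw rate, m/s
    for speed. Road-type changes are omitted (they need map matching).
    All thresholds are illustrative."""
    events = []
    if accel_long < -4.0:                        # hard deceleration
        events.append("sudden_braking")
    if abs(yaw_rate) > 0.5 and speed > 5.0:      # fast yaw while moving
        events.append("sudden_steering")
    if speed < 0.1:
        events.append("vehicle_stopped")
    if max(abs(accel_long), abs(accel_lat)) > 10.0:
        events.append("impact")
    return events
```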
  8.  The state determination device according to any one of claims 4 to 6, wherein the person is a driver of a vehicle, and
     the event detection unit includes a face orientation event detection unit that detects, as the predetermined event, an event in which the orientation of the person's face changes while the vehicle is being driven.
  9.  The state determination device according to any one of claims 1 to 8, wherein the variation feature amount calculation unit calculates, as the variation feature amount, an index indicating the degree of variation of the eye closure rate in the second predetermined period.
  10.  The state determination device according to any one of claims 1 to 8, wherein the variation feature amount calculation unit calculates, as the variation feature amount, an amount of change or a rate of change of the eye closure rate in the second predetermined period.
  11.  The state determination device according to any one of claims 1 to 8, wherein the variation feature amount calculation unit calculates, as the variation feature amount, a rise time rate, which is the proportion of time in the second predetermined period during which the eye closure rate is rising.
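Claims 9 to 11 name three candidate variation feature amounts. Under the assumption that the closure rates of the second predetermined period arrive as a one-dimensional array, they could be computed as follows:

```python
import numpy as np

def dispersion_index(rates: np.ndarray) -> float:
    """Claim 9: degree of variation; here, the standard deviation."""
    return float(np.std(rates))

def change_amount_and_ratio(rates: np.ndarray) -> tuple[float, float]:
    """Claim 10: amount of change over the period, and the rate of
    change relative to the initial value (guarding against zero)."""
    delta = float(rates[-1] - rates[0])
    ratio = delta / rates[0] if rates[0] != 0 else float("inf")
    return delta, ratio

def rise_time_rate(rates: np.ndarray) -> float:
    """Claim 11: proportion of sample-to-sample steps in which the
    closure rate is increasing."""
    diffs = np.diff(rates)
    return float(np.mean(diffs > 0))
```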
  12.  The state determination device according to any one of claims 1 to 11, wherein the inattentive state determination unit determines the inattentive state of the person in stages using two or more determination thresholds.
  13.  The state determination device according to any one of claims 1 to 11, wherein the inattentive state determination unit determines that the person is in an inattentive state when the variation feature amount exceeds a determination threshold continuously for a predetermined period.
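One way to realize the stepwise determination of claim 12 together with the sustained-exceedance condition of claim 13 is sketched below; the two thresholds and the required hold duration are assumptions of this example.

```python
from collections import deque

class InattentiveStateJudge:
    """Stepwise judgement (claim 12) with a sustained-exceedance
    condition (claim 13): a level is reported only if the feature has
    stayed above its threshold for `hold` consecutive samples."""

    def __init__(self, thresholds=(0.05, 0.10), hold: int = 10):
        self.thresholds = thresholds   # mild, severe (illustrative)
        self.history = deque(maxlen=hold)

    def update(self, feature: float) -> int:
        """Return 0 (normal), 1 (mildly inattentive), 2 (severely)."""
        self.history.append(feature)
        if len(self.history) < self.history.maxlen:
            return 0   # not enough evidence yet
        level = 0
        for i, th in enumerate(self.thresholds, start=1):
            if all(f > th for f in self.history):
                level = i
        return level
```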
  14.  The state determination device according to any one of claims 1 to 11, wherein the inattentive state determination unit determines the inattentive state of the person based on two or more types of the variation feature amount, or on one or more types of the variation feature amount together with the eye closure rate.
  15.  The state determination device according to any one of claims 1 to 11, wherein the inattentive state determination unit determines the inattentive state of the person using a trained learner that has been trained to output, upon input of the calculated variation feature amount, or of the variation feature amount and the eye closure rate, a value indicating whether or not the person is in an inattentive state.
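Claim 15 leaves the learner unspecified. As one sketch, a scikit-learn logistic regression trained offline on (variation feature amount, eye closure rate) pairs could fill that role; the library choice, the feature pairing, and the synthetic training data are all assumptions of this example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Offline training on labelled driving sessions (synthetic here):
# each row is [variation feature amount, mean eye closure rate].
rng = np.random.default_rng(1)
X_alert = rng.normal([0.02, 0.05], 0.01, size=(200, 2))
X_tired = rng.normal([0.08, 0.15], 0.02, size=(200, 2))
X = np.vstack([X_alert, X_tired])
y = np.array([0] * 200 + [1] * 200)   # 1 = inattentive

model = LogisticRegression().fit(X, y)

# Online use: the determination unit feeds in the latest features.
print(model.predict([[0.07, 0.12]]))        # predicted state
print(model.predict_proba([[0.07, 0.12]]))  # confidence
```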
  16.  The state determination device according to any one of claims 1 to 15, comprising an output unit that outputs a determination result of the inattentive state determination unit.
  17.  The state determination device according to claim 16, comprising a notification unit that performs notification according to the determination result output by the output unit.
  18.  The state determination device according to claim 16 or 17, comprising:
     a determination result storage unit that stores the determination result output by the output unit; and
     a communication unit that transmits data including the determination result stored in the determination result storage unit to a predetermined destination.
  19.  An in-vehicle device comprising:
     the state determination device according to any one of claims 1 to 18; and
     an imaging unit that captures the image.
  20.  A driving evaluation system including:
     one or more in-vehicle devices according to claim 19; and
     a driving evaluation device comprising a driving evaluation unit that performs a driving evaluation, including an evaluation of the person's inattentive state, based on data including the determination result of the person's inattentive state determined by the state determination device of the in-vehicle device, and an evaluation result output unit that outputs a driving evaluation result including the evaluation of the person's inattentive state made by the driving evaluation unit.
  21.  A state determination method for determining the state of a person, the method including:
     an eye opening/closing degree detection step of detecting the eye opening/closing degree of the person from an image of the person's face;
     an eye closure rate calculation step of calculating the eye closure rate of the person using the eye opening/closing degrees of a first predetermined period detected in the eye opening/closing degree detection step;
     a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rates of a second predetermined period calculated in the eye closure rate calculation step; and
     an inattentive state determination step of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step.
  22.  A program for causing at least one computer to execute processing for determining the state of a person, the program causing the one or more computers to execute:
     an eye opening/closing degree detection step of detecting the eye opening/closing degree of the person from an image of the person's face;
     an eye closure rate calculation step of calculating the eye closure rate of the person using the eye opening/closing degrees of a first predetermined period detected in the eye opening/closing degree detection step;
     a variation feature amount calculation step of calculating a variation feature amount of the eye closure rate using the eye closure rates of a second predetermined period calculated in the eye closure rate calculation step; and
     an inattentive state determination step of determining the inattentive state of the person based on the variation feature amount calculated in the variation feature amount calculation step.
PCT/JP2020/023555 2019-07-10 2020-06-16 State determination device, on-vehicle instrument, driving assessment system, state determination method, and program WO2021005975A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-128048 2019-07-10
JP2019128048A JP7298351B2 (en) 2019-07-10 2019-07-10 State determination device, in-vehicle device, driving evaluation system, state determination method, and program

Publications (1)

Publication Number Publication Date
WO2021005975A1 true WO2021005975A1 (en) 2021-01-14

Family

ID=74115222

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/023555 WO2021005975A1 (en) 2019-07-10 2020-06-16 State determination device, on-vehicle instrument, driving assessment system, state determination method, and program

Country Status (2)

Country Link
JP (1) JP7298351B2 (en)
WO (1) WO2021005975A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022176136A1 (en) * 2021-02-18 2022-08-25 パイオニア株式会社 Information processing device, output control method, and output control program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010128649A (en) * 2008-11-26 2010-06-10 Nissan Motor Co Ltd Awakening state determining device and awakening state determining method
WO2010092860A1 (en) * 2009-02-13 2010-08-19 トヨタ自動車株式会社 Physiological condition estimation device and vehicle control device
JP2017194772A (en) * 2016-04-19 2017-10-26 トヨタ自動車株式会社 Arousal determination apparatus


Also Published As

Publication number Publication date
JP7298351B2 (en) 2023-06-27
JP2021015320A (en) 2021-02-12

Similar Documents

Publication Publication Date Title
US10217343B2 (en) Alert generation correlating between head mounted imaging data and external device
CN107415938B (en) Controlling autonomous vehicle functions and outputs based on occupant position and attention
JP5179686B2 (en) Driving behavior risk calculation device
US10524716B2 (en) System for monitoring vehicle operator compliance with safe operating conditions
KR102669020B1 (en) Information processing devices, mobile devices, and methods, and programs
JP7099037B2 (en) Data processing equipment, monitoring system, awakening system, data processing method, and data processing program
JP6967042B2 (en) Driving evaluation device, driving evaluation system, driving evaluation method, program, and intersection attribute discrimination method
CN110503802A (en) Driving accident judgment method and system based on automobile data recorder
US20200209850A1 (en) Methods and systems to facilitate monitoring center for ride share and safe testing method based for selfdriving cars to reduce the false call by deuddaction systems based on deep learning machine
Smirnov et al. Smartphone-based identification of dangerous driving situations: Algorithms and implementation
JP2021128349A (en) Information processing device, information processing system, information processing method, and program
JP7114953B2 (en) In-vehicle device, driving evaluation device, driving evaluation system provided with these, data transmission method, and data transmission program
WO2021005975A1 (en) State determination device, on-vehicle instrument, driving assessment system, state determination method, and program
JP7135913B2 (en) Driving evaluation screen display method, program, driving evaluation system, and driving evaluation screen
JP7068606B2 (en) Driving evaluation device, on-board unit, driving evaluation method, and computer program
JP2019195376A (en) Data processing device, monitoring system, awakening system, data processing method, and data processing program
Kashevnik et al. Context-based driver support system development: Methodology and case study
RU2703341C1 (en) Method for determining hazardous conditions on public roads based on monitoring the situation in the cabin of a vehicle
JP7060841B2 (en) Operation evaluation device, operation evaluation method, and operation evaluation program
US11912307B2 (en) Monitoring head movements of drivers tasked with monitoring a vehicle operating in an autonomous driving mode
JP7070827B2 (en) Driving evaluation device, in-vehicle device, driving evaluation system equipped with these, driving evaluation method, and driving evaluation program
JP7075048B2 (en) Safety confirmation evaluation device, in-vehicle device, safety confirmation evaluation system equipped with these, safety confirmation evaluation method, and safety confirmation evaluation program
JP6950597B2 (en) On-board unit, driving evaluation system, information processing method, and program
JP7235438B2 (en) Driving evaluation device, driving evaluation method, and computer program
JP7130994B2 (en) In-vehicle device, backward determination method, and backward determination program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20836246

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20836246

Country of ref document: EP

Kind code of ref document: A1