EP3424030A1 - Personalized device and method for monitoring a motor vehicle driver - Google Patents
Personalized device and method for monitoring a motor vehicle driver
- Publication number
- EP3424030A1 (application EP17707079.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- driver
- predetermined
- data
- monitoring
- personalized
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/06—Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W40/09—Driving style or behaviour
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/229—Attention level, e.g. attentive to driving, reading or sleeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/20—Ambient conditions, e.g. wind or rain
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/40—High definition maps
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/55—External transmission of data to or from the vehicle using telemetry
Definitions
- the invention relates generally to the field of monitoring the driver of a motor vehicle.
- It relates more particularly to a device and a method of monitoring this driver.
- a driver monitoring parameter is measured and interpreted by a control unit to derive information on the state of vigilance of the driver.
- the present invention proposes a device and a method for monitoring the driver that make it possible to adapt the interpretation of the measured monitoring parameter to the driver of the vehicle, in a way that places little or no constraint on the driver.
- a personalized monitoring device for a driver of a motor vehicle comprising:
- a control unit programmed to estimate a state of vigilance of the driver according to an estimation rule and according to the measured value of said monitoring parameter,
- the control unit being further programmed to determine said estimation rule in a manner personalized for said driver, as a function of a set of data relating to the driver comprising at least said measured value of the monitoring parameter, and of several predetermined data sets each corresponding to a predetermined group of drivers.
- the estimation rule used by the control unit to interpret the measured value of the monitoring parameter and to deduce the state of vigilance of the driver is thus adapted to the particular driver of the vehicle, thanks to learning that is supervised in a manner that is minimally constraining for the driver.
- the personalized estimation rule is determined by comparing a set of data relating to the driver with several predetermined data sets corresponding to groups of drivers, for each of which an optimized group estimation rule has been previously determined.
- each predetermined data set is associated with a predetermined estimation rule, and the control unit is programmed to identify the personalized estimation rule for the driver with one of said predetermined estimation rules;
- the control unit is programmed to identify the estimation rule personalized for the driver with the predetermined estimation rule associated with the predetermined data set closest to the driver data set;
- the control unit is programmed to determine the predetermined data set nearest to the driver data set by comparing said driver data set and said predetermined data sets;
- the control unit comprises a learning algorithm adapted to determine the predetermined data set nearest to said driver data set;
- said data set relating to the driver further comprises at least one of the following data:
- said monitoring parameter measured by the measuring device is relative to the direction of gaze of the driver, and/or to the posture of the driver's head, and/or to the closing of the driver's eyelids, and/or to the movements of the steering wheel of the vehicle;
- the measuring device comprises at least one image capture device of the driver's head and said measured driver monitoring parameter is determined as a function of at least one captured image of the driver's head;
- the control unit is programmed to store the personalized estimation rule in association with a driver identifier;
- the control unit is programmed to trigger the generation of an alert signal according to the estimated state of alertness of the driver.
- the invention also proposes a method of personalized monitoring of the driver of a motor vehicle, comprising the following steps:
- a set of data relating to the driver is recorded, comprising at least one measured monitoring parameter of the driver
- a personalized estimation rule is determined for estimating a state of alertness of the driver as a function of said measured driver monitoring parameter, as a function of said set of data relating to the driver and of several predetermined data sets, each corresponding to a predetermined group of drivers,
- the recording and determination steps are carried out at predetermined time intervals
- each predetermined data set being associated with a predetermined estimation rule
- the personalized estimation rule for the driver is identified with one of said predetermined estimation rules
- the personalized estimation rule is stored in relation to a driver identifier
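The personalized-determination step of the method above can be sketched in code. This is an illustrative sketch only: the function names, the reduction of each data set to a single mean PERCLOS statistic, and all example values are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch: identify the driver's personalized estimation rule
# with the rule of the predetermined group data set nearest to the driver's data.
def determine_personalized_rule(driver_data, predetermined_sets, group_rules):
    nearest_group = min(
        predetermined_sets,
        key=lambda g: abs(predetermined_sets[g] - driver_data["perclos_mean"]),
    )
    return group_rules[nearest_group]

# Each group's data set is reduced here to one statistic (mean PERCLOS);
# the patent compares richer data sets.
predetermined_sets = {"group1": 0.015, "group2": 0.035}
group_rules = {"group1": {"threshold": 0.02}, "group2": {"threshold": 0.04}}

driver_data = {"perclos_mean": 0.012}  # recorded set of data relating to the driver
rule = determine_personalized_rule(driver_data, predetermined_sets, group_rules)
```

Here the driver's mean PERCLOS is nearest to the first group's, so the first group's rule (threshold 0.02) is selected.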
- FIG. 1 schematically represents the various elements of the monitoring device according to the invention
- FIG. 2 diagrammatically represents the different steps of the monitoring method according to the invention
- FIG. 3 diagrammatically represents the overall distributions (densities D as a function of the PERCLOS value) of the PERCLOS parameter values measured for six drivers,
- FIG. 4 schematically represents the distributions (densities D as a function of the PERCLOS value) of the PERCLOS parameter values measured for a first group of three drivers among the six drivers of FIG. 3, in a state of alert awakening (solid line) and in a state of drowsiness (dotted line),
- FIG. 5 schematically represents the distributions (densities D as a function of the PERCLOS value) of the PERCLOS parameter values measured for a second group of three other drivers among the six drivers of FIG. 3, in a state of alert awakening (solid line) and in a state of drowsiness (dotted line).
- FIG. 1 diagrammatically shows various elements of a monitoring device 500 according to the invention.
- This monitoring device 500 comprises a measuring device 200 of a driver monitoring parameter and a control unit 100 which receives information from this measuring device 200.
- the measuring device 200 records information on the driver without interacting with him, and therefore without disturbing him.
- the measuring device 200 emits no signal towards the driver. This has the advantage of allowing the measuring device 200 to record the natural behavior of the driver, in the absence of any disturbance or constraint by the measuring device 200.
- the measuring device 200 of the driver monitoring parameter is here without interaction with the driver.
- in a variant, the measuring device interacts with the driver. This interaction is passive for the driver and minimally intrusive, that is to say it requires no voluntary action to be performed by the driver.
- the measuring device 200 comprises at least one image capture device of the driver's head.
- This image capture device comprises for example a camera disposed near the central rearview mirror of the vehicle, or behind the steering wheel.
- it may include any image capture device positioned so that the driver's face in his driving position enters the field of this image capture device.
- said monitoring parameter measured by the measuring device 200 is relative to the direction of gaze of the driver and / or to the posture of the driver's head, and / or to the closure of the driver's eyelids.
- this monitoring parameter is determined according to at least one image of the driver's head captured by the measuring device 200.
- This parameter can also be determined from two or more images of the driver's head.
- the image capture device may include any device for monitoring the direction of gaze known to those skilled in the art.
- the monitoring parameter relating to the direction of the gaze of the driver may in particular be the direction of gaze in a reference frame related to the image capture device or in a reference frame related to the driver's head, or the position of the driver's pupils in such a reference frame. It may also include a duration of fixation on the same point, that is to say the duration during which the direction of gaze in the reference frame linked to the driver's head is approximately constant, or the frequency of eye fixations.
- the monitoring parameter may also comprise statistical information on a set of viewing directions measured from a plurality of image captures made during a predetermined duration, for example an average direction of view, a standard deviation about this mean direction, or a statistical distribution of viewing directions.
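As a minimal illustration of such statistical information, a mean gaze direction and its standard deviation over a window of image captures could be computed as follows; the one-dimensional angle representation and the sample values are assumptions made for illustration:

```python
import statistics

def gaze_statistics(angles_deg):
    # Mean gaze direction and standard deviation over a window of image
    # captures (angles here are assumed relative to the camera axis, in degrees).
    return statistics.mean(angles_deg), statistics.pstdev(angles_deg)

# Hypothetical gaze angles measured over a predetermined duration
mean_dir, spread = gaze_statistics([-2.0, 0.0, 1.0, 1.0])
```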
- the driver's head posture monitoring parameter may include a position and orientation of the driver's head in the reference frame related to the image capture device.
- It may also comprise, for example, statistical information on a set of postures measured from a plurality of image captures made during a predetermined duration, such as the average speed of movements of the head and the standard deviation about this average, or a frequency of extreme positions (large angles) of the head.
- the driver monitoring parameter relating to the closing of the driver's eyelids may include an eyelid closing frequency, an eyelid closing duration, an eyelid closing speed, or statistical information such as an average or a statistical distribution of the frequency or duration of eyelid closure.
- the monitoring parameter is the PERCLOS parameter, that is to say the proportion of time during which the closure of the driver's eyelids is between 80% and 100% of total eyelid closure, within a predetermined fixed time interval, set to 60 seconds in the example below. It can also be any statistical information determined from a plurality of measured values of this PERCLOS parameter.
- the values of the parameter PERCLOS are normalized, 0 corresponding to a proportion of 0% and 1 corresponding to a proportion of 100%. Only the range of values of this parameter between 0 and 0.2 is represented in these figures
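The PERCLOS definition above can be sketched directly. This is an illustrative sketch under stated assumptions: eyelid closure is assumed to be sampled as a normalized fraction (0 fully open, 1 fully closed) at regular intervals over the fixed window, and the sample values are hypothetical.

```python
def perclos(closure_samples):
    # Proportion of samples in which eyelid closure is between 80% and
    # 100% of total closure, over a fixed observation window (e.g. 60 s).
    closed = sum(1 for c in closure_samples if c >= 0.8)
    return closed / len(closure_samples)

# 10 hypothetical closure samples over the window; one sample is >= 80% closed
samples = [0.1, 0.2, 0.1, 0.9, 0.3, 0.1, 0.2, 0.1, 0.3, 0.2]
value = perclos(samples)  # normalized value between 0 and 1
```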
- the measuring device 200 may also include two image capture devices allowing the reconstruction of an image of the driver's head in three dimensions, by stereoscopy.
- the measuring device may comprise other types of sensors making it possible to measure other types of monitoring parameter, in addition to at least one of the monitoring parameters already mentioned, or in replacement of this monitoring parameter.
- the measuring device may comprise a sensor adapted to measure the hand pressure on the steering wheel of the vehicle, or the movements imposed on the steering wheel by the driver.
- the monitoring parameter can then be relative to the intensity of the hand pressure on the steering wheel or the movement of the steering wheel, for example to the angular position of the steering wheel.
- the measuring device may for example comprise one or more biometric sensors making it possible to measure the cardiac rhythm of the driver, and/or the frequency and/or amplitude of a variation in the size of the driver's pupil, and/or the skin conductance, and/or the blood pressure, and/or the body temperature, and/or the respiratory rate, and/or the blood glucose of the driver, and/or the brain activity.
- the control unit 100 of the monitoring device 500 is programmed to estimate a state of vigilance of the driver according to a personalized estimation rule and according to the value
- the personalized estimation rule is implemented by a first estimation module 101 of the control unit 100 (FIG. 1).
- This estimation rule includes at the output several possible levels of driver vigilance. At least two levels are provided, corresponding to satisfactory or insufficient vigilance. Preferably, at least three levels of vigilance are provided. For example, the following levels are designed to estimate the possible drowsiness of the driver: alert awakening, hypovigilance, drowsiness, and possibly falling asleep. The following levels can be provided to estimate driver distraction: concentration, low distraction, high distraction.
- Visual distraction can for example be estimated according to the direction of gaze of the driver.
- cognitive distraction is estimated, for example, from additional information transmitted by other devices of the vehicle, such as the activation of the wireless connection of a mobile phone to the vehicle or the start of a device broadcasting music.
- the estimation rule may include a comparison of the measured value of frequency, duration or speed with a threshold value of frequency, duration or speed. When the measured value of frequency or duration is greater than the threshold value of frequency or duration, the estimation rule returns a state of vigilance of the driver indicating that he is drowsy or insufficiently vigilant.
- the estimation rule used when the monitoring parameter is the parameter PERCLOS defined above comprises the comparison with a threshold value of this parameter.
- when the measured value of the PERCLOS parameter is greater than this threshold value, the estimation rule indicates that the driver is in a state of somnolence, while when it is lower, the estimation rule indicates that the driver is in a state of alert awakening.
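The threshold-based estimation rule for the PERCLOS parameter reduces to a single comparison. A minimal sketch, with hypothetical values:

```python
def estimate_vigilance(perclos_value, threshold):
    # Threshold-based estimation rule: above the threshold -> somnolence,
    # otherwise -> alert awakening.
    return "somnolence" if perclos_value > threshold else "alert awakening"

state = estimate_vigilance(0.05, threshold=0.02)  # hypothetical measured value
```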
- said estimation rule is determined by a learning algorithm adapted to establish a causal structure between the monitoring parameter and the state of vigilance of the driver.
- This learning algorithm is initially trained with the aid of a predetermined initial database comprising predetermined pairs of input monitoring-parameter values and associated vigilance states.
- It may be, for example, a Bayesian learning algorithm, a perceptron-type neural network, a deep learning algorithm, a support vector machine or a forest of decision trees. It can also be any other type of adapted learning algorithm known to those skilled in the art.
- This initial database includes data relating to different drivers, so that the performance of the estimation rule is optimized for these drivers.
- the learning algorithm determines for example, on the basis of this database, the probability of observing a given state of alertness in the driver knowing that the monitoring parameter has said measured value.
- the estimation rule outputs, on the basis of the learning algorithm, the most likely level of vigilance according to the measured value of the driver's monitoring parameter.
- control unit 100 can also receive additional information transmitted by other devices of the vehicle, such as GPS information, acceleration / braking, activation of certain functions of the vehicle, such as the activation of the wireless connection of a mobile phone to the vehicle.
- this additional information, as well as information indicating that the vehicle is following an unfavorable trajectory on the road, such as information concerning the movements of the steering wheel or the lateral acceleration of the vehicle, or the reception of a telephone call by the driver, can be taken into account in the estimation rule to determine the state of alertness of the driver.
- the control unit is further programmed to determine said estimation rule in a manner personalized for said driver as a function of a set of data relating to the driver comprising at least said measured value of the monitoring parameter, and of a plurality of predetermined data sets each corresponding to a predetermined group of drivers.
- the control unit 100 is programmed to record a set of data relating to the driver, comprising at least said measured monitoring parameter of the driver.
- This data set is for example recorded in a recording module 102 of the control unit 100 (FIG. 1).
- Said set of data relating to the driver may for example comprise a plurality of values of said monitoring parameter measured during one or more paths of the driver.
- Said set of data relating to the driver may also comprise, for example, in addition to the values of the measured monitoring parameter, at least one of the following data:
- These other data recovery devices may for example include an acquisition interface through which the driver can manually enter information about himself, for example by completing a questionnaire stating his age, his driving experience, his state of general fatigue, an estimate of his stress level, or the length of the previous night's sleep.
- These other data recovery devices may also comprise different sensors arranged in the vehicle and transmitting to the control unit 100 measured information on the operation of the vehicle, for example exhaust gas temperature/pressure, lateral and frontal acceleration of the vehicle, vehicle speed, or turning angle of the steering wheel.
- They may also include the control unit itself, which determines many vehicle operating parameters, such as vehicle speed or operating point, the driver's acceleration/braking demands, and the activation of certain functions of the vehicle, such as for example the activation of the wireless connection of a mobile phone to the vehicle.
- They may also comprise a system for determining the position of the vehicle, of the GPS type, and/or a device for determining the external climatic conditions, for example the temperature of the outside air and/or inside the vehicle and/or of the road, the humidity of the air and/or of the road, or the level of ambient noise.
- They may also comprise biometric sensors for recording data relating to the physiological behavior of the driver, for example the cardiac rhythm of the driver, and/or the skin conductance, and/or the blood pressure, and/or the body temperature, and/or the respiratory rate, and/or the blood glucose of the driver, and/or the brain activity of this driver.
- the predetermined data sets, each corresponding to a predetermined group of drivers, are pooled and accessible to all vehicles. They are thus universal and not personalized.
- the monitoring device 500 then comprises at least one server 10 outside the vehicle, on which the predetermined data sets are stored and which the control unit 100 accesses remotely via a wireless network connection, of the 4G type for example. The control unit can then access it at any time.
- in a variant, the control unit accesses the external server only during certain stops of the vehicle, for example at the driver's home or during technical inspections of the vehicle.
- a wireless connection to a Wi-Fi network is then possible, as is a wired connection.
- in another variant, the control unit includes a memory that contains the predetermined data sets.
- each predetermined data set is associated with a predetermined estimation rule, and the control unit 100 is programmed to identify the personalized estimation rule for the driver with one of said predetermined estimation rules.
- the control unit 100 is programmed to identify the estimation rule personalized for the driver with the predetermined estimation rule associated with the predetermined data set closest to the set of data relating to the driver.
- the control unit 100 is programmed to compare said set of data relating to the driver recorded in the recording module 102 and said predetermined data sets each corresponding to a predetermined group of drivers.
- control unit 100 further comprises a comparator module 103 which performs said comparison.
- This comparator module 103 receives all of the driver data from the measuring device 200 and the other data recovery devices 300, and the predetermined data sets from the external server 10.
- the comparison determines which set of predetermined data is closest to the driver data set.
- control unit 100 is for example programmed to compare the corresponding data with each other and to quantify the difference between these data, then to determine an overall difference factor between the data set relating to the driver and each set of predetermined data by summing in a weighted manner the differences between corresponding data.
- the control unit 100 is then programmed to determine the nearest predetermined data set of the driver data set to be the one whose overall difference with the driver data set is the smallest.
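The weighted overall-difference computation described above can be sketched as follows. This is an illustrative sketch: the choice of weights, the data reduced to two scalar items per set, and all values are assumptions for illustration.

```python
def overall_difference(driver_data, group_data, weights):
    # Weighted sum of the differences between corresponding data of the
    # driver data set and one predetermined data set.
    return sum(w * abs(driver_data[k] - group_data[k]) for k, w in weights.items())

def nearest_set(driver_data, groups, weights):
    # The nearest predetermined data set is the one whose overall
    # difference with the driver data set is the smallest.
    return min(groups, key=lambda g: overall_difference(driver_data, groups[g], weights))

weights = {"perclos_mean": 1.0, "age": 0.01}  # hypothetical weighting
groups = {
    "group1": {"perclos_mean": 0.015, "age": 30},
    "group2": {"perclos_mean": 0.035, "age": 50},
}
driver = {"perclos_mean": 0.03, "age": 45}
closest = nearest_set(driver, groups, weights)
```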
- control unit 100 is programmed to compare at least one characteristic of a statistical distribution of the driver data set with the statistical distribution of the predetermined values of that parameter included in each predetermined data set.
- It may be a question, for example, as in the example described in more detail later, of comparing the characteristics of the statistical distribution of the measured values of the monitoring parameter with the statistical distribution of the predetermined values of this parameter included in each predetermined data set. The control unit is then programmed to determine which predetermined data set comprises the values of the monitoring parameter whose statistical distribution is closest to that of the values measured for the driver.
- the control unit 100 comprises a learning algorithm adapted to determine the predetermined data set closest to said set of data relating to the driver.
- This learning algorithm is for example driven on a database comprising different sets of data with their closest associated predetermined data set.
- the predetermined data sets are determined in a prior step, based on data collected from different drivers and the statistical processing of these data.
- This learning algorithm identifies the correlations between the data of certain drivers and automatically groups the data of these drivers to form said predetermined data sets.
- In the example below, these data were collected with six different drivers. They comprise, for each driver, a plurality of measured values of the PERCLOS parameter and the corresponding alert or drowsy state of the driver.
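The automatic grouping of drivers into predetermined data sets could be sketched with a small k-means-style clustering. This is a toy stand-in for the learning algorithm the patent leaves open; reducing each driver to the mean of their measured PERCLOS values, and all names and values, are simplifying assumptions.

```python
def group_drivers(driver_means, iterations=20):
    # Tiny k-means-style grouping of drivers by mean PERCLOS value,
    # with two groups seeded at the extreme values.
    centers = [min(driver_means.values()), max(driver_means.values())]
    for _ in range(iterations):
        clusters = [[] for _ in centers]
        for name, m in driver_means.items():
            i = min(range(len(centers)), key=lambda j: abs(m - centers[j]))
            clusters[i].append(name)
        centers = [
            sum(driver_means[n] for n in cl) / len(cl) if cl else c
            for cl, c in zip(clusters, centers)
        ]
    return clusters

# Six hypothetical drivers, reduced to their mean PERCLOS values
means = {"d1": 0.014, "d2": 0.016, "d3": 0.015, "d4": 0.034, "d5": 0.036, "d6": 0.035}
clusters = group_drivers(means)
```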
- the density is represented here as a function of the measured value of the PERCLOS parameter (between 0 and 1).
- a density curve represents the relative proportion of the different PERCLOS values within the set of PERCLOS measured values.
- for the drivers of the first group, the first threshold value VS1 of the PERCLOS parameter allowing an effective distinction between the state of alert awakening and the state of drowsiness is approximately 0.02.
- for the drivers of the second group, the second threshold value VS2 of the PERCLOS parameter allowing an effective distinction between the state of alert awakening and the state of somnolence is approximately 0.04.
- Two sets of predetermined data corresponding to the data of the drivers of group 1 or group 2 are thus defined.
- For each group, an associated estimation rule is optimized so as to estimate the state of vigilance of the driver as accurately as possible, reliably detecting situations in which the vigilance of the driver is insufficient while avoiding the emission of unjustified alert signals, as explained below.
- The estimation rule associated with the first driver group will use the first threshold value VS1, equal to 0.02, of the PERCLOS parameter, while the estimation rule associated with the second group of drivers will use the second threshold value VS2, equal to 0.04.
- the threshold values of the estimation rule are thus adjusted according to the predetermined data of the group of drivers considered.
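The resulting group estimation rules can be sketched as simple threshold tests, using the two threshold values given above (the function and group names are illustrative):

```python
# VS1 and VS2 from the example in the description.
GROUP_THRESHOLDS = {"group1": 0.02, "group2": 0.04}

def estimate_vigilance(perclos_value, group):
    """Group estimation rule: the driver is considered drowsy when the
    measured PERCLOS value exceeds the threshold of their group."""
    return "drowsy" if perclos_value > GROUP_THRESHOLDS[group] else "alert"

# The same measured value is classified differently depending on the group.
print(estimate_vigilance(0.03, "group1"))  # → drowsy
print(estimate_vigilance(0.03, "group2"))  # → alert
```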
- The initial database used to train this learning algorithm comprises the pairs of input variables (measured values of the monitoring parameter) and output variables (reference vigilance states) associated with the drivers of each group.
- The control unit 100 is programmed to determine, on the basis of the comparison made, the personalized estimation rule for the driver, as the group estimation rule associated with the predetermined data set closest to the data set relating to the driver.
- a plurality of measured values of this parameter is acquired by the monitoring device according to the invention, for example 10,000 values measured over several hours of driving of the vehicle.
- The distribution of these measured values is compared with the distribution of the predetermined values of the drivers of the first and second groups described above, for example by comparing the moments of these distributions, such as their means, standard deviations, skewness and/or kurtosis coefficients, or other statistical characteristics of these distributions, such as medians or percentiles.
- If the distribution of the measured values is closest to that of the first group, the control unit 100 associates the driver of the vehicle with the estimation rule of the first group; it is then the first threshold value VS1 that is used to estimate the state of vigilance of the driver.
- If, on the contrary, the mean of the distribution of measured values of the PERCLOS parameter is closer to the mean of the distribution of the predetermined values of the PERCLOS parameter of the second driver group, it is the estimation rule associated with the second group that is used.
- control unit 100 is programmed to estimate the state of vigilance of the driver using said custom estimation rule for this driver.
- the motor vehicle concerned further comprises for this purpose a driver recognition system.
- This recognition system automatically identifies the driver of the vehicle from a list of stored drivers.
- This recognition system is based, for example, on facial recognition of the driver from an image of the driver's head, on the processing of biometric information (such as a fingerprint), or on the identification of an ignition key used by the driver.
- This recognition system then transmits a driver identifier to the control unit 100, which is programmed to store the personalized estimation rule in relation to this driver identifier.
- the control unit 100 is then programmed to use the personalized estimation rule associated with the identifier of the driver recognized when starting the vehicle.
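The storage and retrieval of the personalized rule against the driver identifier can be sketched as a small registry keyed by that identifier (class and method names are illustrative; the description stores these data in register 120):

```python
class RuleRegistry:
    """Stores a personalized estimation rule (here reduced to a PERCLOS
    threshold) per driver identifier, as the control unit does with
    register 120."""

    def __init__(self, default_threshold=0.03):
        self._rules = {}
        self._default = default_threshold  # generic fallback rule

    def store(self, driver_id, threshold):
        self._rules[driver_id] = threshold

    def threshold_for(self, driver_id):
        # At vehicle start-up, load the rule for the recognized driver,
        # falling back to the generic rule for unknown drivers.
        return self._rules.get(driver_id, self._default)

registry = RuleRegistry()
registry.store("driver_42", 0.02)
print(registry.threshold_for("driver_42"))  # → 0.02
print(registry.threshold_for("unknown"))    # → 0.03
```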
- Control unit 100 uses the personalized estimation rule for this driver, which makes it possible to personalize the monitoring of the driver.
- the estimation rule is determined according to the driver, the number of unjustified warning signals is reduced, which improves driver comfort, and leads to a better acceptance of the monitoring device.
- the monitoring device makes it possible to control the issuance of warning signals in a greater number of situations of drowsiness or inattention and the safety of the vehicle and its passengers is improved.
- the accuracy of the estimation rule associated with the first group of drivers is 74%, while the accuracy of the estimation rule associated with the second group of drivers is 87%.
- By comparison, a generic estimation rule, the threshold value of which would be determined on the basis of all the data of the six drivers, would have a threshold value VS of the PERCLOS parameter equal to 0.03 and an accuracy of only 68%.
- The performance of the personalized estimation rule is therefore improved compared to the generic rule.
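Accuracy figures of this kind can be obtained in principle by scoring a threshold rule against labeled reference states. A hedged sketch (the labeled samples are invented for illustration; only the threshold values 0.02 and 0.03 come from the example in the description):

```python
def accuracy(threshold, samples):
    """Fraction of (perclos, reference_state) pairs for which the
    threshold rule predicts the reference vigilance state."""
    correct = sum(
        1 for perclos, state in samples
        if ("drowsy" if perclos > threshold else "alert") == state
    )
    return correct / len(samples)

# Invented labeled samples for a driver of the first group.
samples = [(0.010, "alert"), (0.015, "alert"), (0.025, "drowsy"),
           (0.035, "drowsy"), (0.019, "alert"), (0.045, "drowsy")]
print(accuracy(0.02, samples))  # personalized threshold VS1
print(accuracy(0.03, samples))  # generic threshold, misses the 0.025 sample
```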
- the monitoring device 500 further comprises a device for transmitting an alert signal 400 (FIG. 1).
- Depending on the level of vigilance of the driver estimated by the control unit 100, the latter is programmed to control, or not, the emission of an alert signal to the driver.
- The warning signal emitted by the emission device can be visual, audible or haptic: for example a flashing indicator or warning light, a warning sound, or a vibration of the steering wheel or of the driver's seat.
- The measured value of the monitoring parameter and, where appropriate, the associated reference vigilance state are stored in a register 120 in correspondence with the result of the comparison, so as to allow tracking of the performance of the personalized estimation rule.
- Each data item stored in the register 120 is also preferably associated with the corresponding driver identifier.
- FIG. 2 diagrammatically shows the steps of the personalized monitoring method according to the invention.
- When the driver is installed in the vehicle and starts it, the driver is automatically identified by the recognition device as described above.
- the driver drives the vehicle as usual, without any particular constraints.
- the personalized monitoring method according to the invention is implemented during the driving of the vehicle by the driver. According to this method:
- A personalized estimation rule is determined (blocks 30 and 40 of FIG. 2) for estimating a state of vigilance of the driver as a function of said measured driver monitoring parameter, on the basis of said set of data relating to the driver and of several predetermined data sets, each corresponding to a predetermined group of drivers.
- the estimation rule is determined by the following steps:
- the said monitoring parameter is measured beforehand by means of the measuring device 200.
- the modified estimation rule is stored in relation to the identifier of the driver (block 70 of FIG. 2) previously determined by the recognition device.
- an alert signal is issued to improve the driver's vigilance (block 60 of Figure 2).
- a plurality of measured values of the monitoring parameter are measured and the state of vigilance of the driver is evaluated according to the personalized estimation rule and this plurality of measured values.
- the measuring device 200 determines at least one measured value of the monitoring parameter, preferably a plurality of measured values of this monitoring parameter.
- the control unit 100 receives these measured values of the monitoring parameter.
- the measuring device 200 captures a plurality of images of the driver's head.
- The control unit 100 receives these images and processes them, for example by identifying, on each image, the eyes of the driver and, more particularly, the position of the eyelids and/or pupils of the driver. It is also possible to identify the position of light-source reflections on the cornea. The control unit 100 derives from these positions the value of the driver monitoring parameter, for example the gaze direction, the position of the driver's eyelids and/or the PERCLOS parameter.
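PERCLOS is conventionally defined as the proportion of time the eyes are closed beyond a given fraction over an analysis window. A sketch of its computation from per-frame eyelid-closure estimates (the 0.8 closure criterion is a common convention, not fixed by the patent):

```python
def perclos(closure_per_frame, closed_fraction=0.8):
    """Proportion of frames in which the eyelids cover at least
    `closed_fraction` of the eye, over the analysis window."""
    closed = sum(1 for c in closure_per_frame if c >= closed_fraction)
    return closed / len(closure_per_frame)

# Illustrative eyelid-closure values (0 = fully open, 1 = fully closed).
frames = [0.1, 0.2, 0.9, 0.1, 0.95, 0.1, 0.2, 0.1, 0.85, 0.1]
print(perclos(frames))  # 3 of 10 frames closed → 0.3
```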
- the control unit 100 also preferably receives the other data determined, measured or calculated by the other data recovery devices 300.
- The set of data relating to the driver thus constituted is stored in the recording module 102 of the control unit 100.
- The control unit 100 connects to the external server 110, for example via the 4G wireless communication network, and loads the predetermined data sets stored on this external server.
- The comparator module 103 of the control unit 100 compares the set of data relating to the driver and the predetermined data sets, as previously described.
- the estimation module 101 also receives from the measuring device 200, or from the recording module 102, the measured values of the monitoring parameter.
- the estimation module 101 transmits the personalized estimation rule for its storage in relation to the identifier of the driver in the register 120.
- For example, the threshold values of the monitoring parameter, or the database of the learning algorithm of the estimation rule, are stored in correspondence with the identifier of the driver.
- control unit 100 loads the personalized estimation rule associated with that specific driver.
- Since the personalized estimation rule is adapted specifically to the driver, its performance is improved. The number of unjustified warning signals is reduced and the driver is not tempted to disable the monitoring device.
- the control unit stores the measured values of the monitoring parameter, the vigilance state estimated by the personalized estimation rule and the reference vigilance state.
- the steps of recording, comparing and determining are performed at predetermined time intervals.
- The personalized estimation rule can thus be updated, depending on the possible evolution of the behavior of the driver or on the evolution of the predetermined data sets, which can be refined in order to make the associated group estimation rule more accurate.
- Each predetermined data set may be subdivided into several data subsets, and the comparison step may also identify the subset closest to the driver data set.
- the data set relating to the driver and each predetermined set of data comprise various subsets comprising data of different types, for example from different measuring or data recovery devices.
- The corresponding subsets are then compared to determine the predetermined subsets closest to the driver's subsets, and the predetermined data set closest to the driver data set is deduced therefrom.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Mathematical Physics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Traffic Control Systems (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1651687A FR3048542A1 (en) | 2016-03-01 | 2016-03-01 | DEVICE AND METHOD FOR PERSONALIZED MONITORING OF A DRIVER OF A MOTOR VEHICLE |
PCT/EP2017/054831 WO2017149045A1 (en) | 2016-03-01 | 2017-03-01 | Personalized device and method for monitoring a motor vehicle driver |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3424030A1 true EP3424030A1 (en) | 2019-01-09 |
Family
ID=55808692
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17707079.4A Withdrawn EP3424030A1 (en) | 2016-03-01 | 2017-03-01 | Personalized device and method for monitoring a motor vehicle driver |
Country Status (6)
Country | Link |
---|---|
US (1) | US11312384B2 (en) |
EP (1) | EP3424030A1 (en) |
JP (1) | JP2019507443A (en) |
CN (1) | CN109690640A (en) |
FR (1) | FR3048542A1 (en) |
WO (1) | WO2017149045A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108021875A (en) * | 2017-11-27 | 2018-05-11 | 上海灵至科技有限公司 | A kind of vehicle driver's personalization fatigue monitoring and method for early warning |
WO2019198179A1 (en) * | 2018-04-11 | 2019-10-17 | 三菱電機株式会社 | Passenger state determination device, alarm output control device, and passenger state determination method |
FR3090171B1 (en) * | 2018-12-13 | 2021-01-29 | Continental Automotive France | Method for determining a drowsiness level of a vehicle driver |
JP2021026596A (en) * | 2019-08-07 | 2021-02-22 | トヨタ自動車株式会社 | Driving behavior evaluation device, driving behavior evaluation method, and driving behavior evaluation program |
EP3795441A1 (en) * | 2019-09-17 | 2021-03-24 | Aptiv Technologies Limited | Method and device for determining an estimate of the capability of a vehicle driver to take over control of a vehicle |
US11587461B2 (en) * | 2019-10-23 | 2023-02-21 | GM Global Technology Operations LLC | Context-sensitive adjustment of off-road glance time |
US11810198B2 (en) | 2020-05-26 | 2023-11-07 | BlueOwl, LLC | Systems and methods for identifying distracted driving events using common features |
US11518391B1 (en) * | 2020-05-26 | 2022-12-06 | BlueOwl, LLC | Systems and methods for identifying distracted driving events using semi-supervised clustering |
US11518392B1 (en) | 2020-06-26 | 2022-12-06 | BlueOwl, LLC | Systems and methods for identifying distracted driving events using unsupervised clustering |
CN112489369B (en) * | 2020-11-06 | 2022-07-05 | 安徽盛瑞科技有限公司 | Anti-doze alarm device suitable for vehicle-mounted navigation system and use method |
JP2022142941A (en) * | 2021-03-17 | 2022-10-03 | 本田技研工業株式会社 | Driving support device, driving support method, and program |
CN114287939A (en) * | 2021-12-13 | 2022-04-08 | 上海航盛实业有限公司 | Fatigue driving detection method and system |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7027621B1 (en) * | 2001-03-15 | 2006-04-11 | Mikos, Ltd. | Method and apparatus for operator condition monitoring and assessment |
US7269504B2 (en) * | 2004-05-12 | 2007-09-11 | Motorola, Inc. | System and method for assigning a level of urgency to navigation cues |
US7394393B2 (en) * | 2005-08-02 | 2008-07-01 | Gm Global Technology Operations, Inc. | Adaptive driver workload estimator |
JP4791874B2 (en) * | 2006-03-31 | 2011-10-12 | 株式会社エクォス・リサーチ | Driving support device and driving action determination device |
KR100753839B1 (en) * | 2006-08-11 | 2007-08-31 | 한국전자통신연구원 | Method and apparatus for adaptive selection of interface |
JP2009015548A (en) * | 2007-07-04 | 2009-01-22 | Omron Corp | Drive assisting device and method, and program |
US20110022298A1 (en) * | 2008-04-11 | 2011-01-27 | Volvo Technology Corporation | Method and system for modifying a drive plan of a vehicle towards a destination |
JP5161643B2 (en) * | 2008-04-23 | 2013-03-13 | 富士重工業株式会社 | Safe driving support system |
US8548660B2 (en) * | 2009-09-11 | 2013-10-01 | Alte Powertrain Technologies, Inc. | Integrated hybrid vehicle control strategy |
CN101667323A (en) * | 2009-09-25 | 2010-03-10 | 武汉理工大学 | Real-time monitoring and alarming system for distraction of driver |
US8384534B2 (en) * | 2010-01-14 | 2013-02-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Combining driver and environment sensing for vehicular safety systems |
CN102469948B (en) * | 2010-03-12 | 2015-06-03 | 塔塔咨询服务有限公司 | A system for vehicle security, personalization and cardiac activity monitoring of a driver |
US8554513B2 (en) * | 2010-10-28 | 2013-10-08 | Ashland Licensing And Intellectual Property, Llc | Method of testing and proving fuel efficiency improvements |
US20140019167A1 (en) * | 2012-07-16 | 2014-01-16 | Shuli Cheng | Method and Apparatus for Determining Insurance Risk Based on Monitoring Driver's Eyes and Head |
DE102012214464A1 (en) * | 2012-08-14 | 2014-02-20 | Ford Global Technologies, Llc | System for monitoring and analyzing the driving behavior of a driver in a motor vehicle |
NZ709258A (en) * | 2012-12-11 | 2020-05-29 | Ami Klin | Systems and methods for detecting blink inhibition as a marker of engagement and perceived stimulus salience |
US8952819B2 (en) * | 2013-01-31 | 2015-02-10 | Lytx, Inc. | Direct observation event triggering of drowsiness |
US9751534B2 (en) * | 2013-03-15 | 2017-09-05 | Honda Motor Co., Ltd. | System and method for responding to driver state |
WO2014172316A1 (en) * | 2013-04-15 | 2014-10-23 | Flextronics Ap, Llc | Building profiles associated with vehicle users |
FI124068B (en) * | 2013-05-03 | 2014-02-28 | Jyvaeskylaen Yliopisto | A method to improve driving safety |
FR3012029B1 (en) * | 2013-10-23 | 2015-12-04 | Peugeot Citroen Automobiles Sa | METHOD FOR DETECTING THE VIGILANCE DROP OF THE DRIVER OF A MOTOR VEHICLE |
EP3103095A1 (en) * | 2014-02-04 | 2016-12-14 | Sudak, Menachem | Monitoring system and method |
US9189692B2 (en) * | 2014-02-14 | 2015-11-17 | GM Global Technology Operations LLC | Methods and systems for detecting driver attention to objects |
US9135803B1 (en) * | 2014-04-17 | 2015-09-15 | State Farm Mutual Automobile Insurance Company | Advanced vehicle operator intelligence system |
US10343682B2 (en) * | 2014-10-17 | 2019-07-09 | Ford Global Technologies, Llc | Vehicle operation based on activity tracking |
US20210169417A1 (en) * | 2016-01-06 | 2021-06-10 | David Burton | Mobile wearable monitoring systems |
-
2016
- 2016-03-01 FR FR1651687A patent/FR3048542A1/en not_active Withdrawn
-
2017
- 2017-03-01 WO PCT/EP2017/054831 patent/WO2017149045A1/en active Application Filing
- 2017-03-01 EP EP17707079.4A patent/EP3424030A1/en not_active Withdrawn
- 2017-03-01 CN CN201780025522.9A patent/CN109690640A/en active Pending
- 2017-03-01 US US16/082,192 patent/US11312384B2/en active Active
- 2017-03-01 JP JP2018546448A patent/JP2019507443A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20200290628A1 (en) | 2020-09-17 |
WO2017149045A1 (en) | 2017-09-08 |
US11312384B2 (en) | 2022-04-26 |
CN109690640A (en) | 2019-04-26 |
JP2019507443A (en) | 2019-03-14 |
FR3048542A1 (en) | 2017-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017149045A1 (en) | Personalized device and method for monitoring a motor vehicle driver | |
US9908530B1 (en) | Advanced vehicle operator intelligence system | |
US10593182B1 (en) | Vehicle operator emotion management system and method | |
US10343693B1 (en) | System and method for monitoring and reducing vehicle operator impairment | |
WO2017149047A1 (en) | Device and method for monitoring a driver of an automotive vehicle | |
EP3463051A1 (en) | Connected device for behavioural monitoring of an individual and for detecting and/or preventing an anomaly | |
WO2017149046A1 (en) | Device and method for monitoring a driver of a transport vehicle | |
EP3265334B1 (en) | Device and method for predicting a vigilance level of a driver of a motor vehicle | |
WO2009071598A1 (en) | Method and apparatus for detecting a critical situation of a subject | |
WO2017137612A1 (en) | Anticollision device and method for a motor vehicle | |
FR3083346A1 (en) | METHOD AND DEVICE FOR MONITORING THE CAPACITY OF A CREW MEMBER OF AN AIRCRAFT | |
FR3057517A1 (en) | DEVICE FOR PREVENTING DANGEROUS SITUATIONS FOR A CONDUCTOR OF A TRANSPORT VEHICLE AND ASSOCIATED METHOD | |
AU2021105935A4 (en) | System for determining physiological condition of driver in autonomous driving and alarming the driver using machine learning model | |
FR3057516A1 (en) | DEVICE FOR PREVENTING DANGEROUS SITUATIONS FOR A CONDUCTOR OF A TRANSPORT VEHICLE AND ASSOCIATED METHOD | |
FR3107495A1 (en) | Monitoring device for monitoring a person driving a vehicle | |
EP4041606B1 (en) | Determination of a state of a user actively driving a motor vehicle or not | |
FR2978719A1 (en) | METHOD AND SYSTEM FOR PREDICTIVE STATE OF SOMNOLENCE OF VEHICLE DRIVER | |
EP4245221A1 (en) | Electronic device for monitoring a neurophysiological state of an operator in a control station of an aircraft, associated monitoring method and computer program | |
FR3133691A1 (en) | Data labeling process for building a database to configure, validate and/or test an application for monitoring the fatigue level of an individual | |
FR3119479A1 (en) | Determination of driver fatigue status in a controlled environment | |
WO2017097940A1 (en) | Device and method for managing the wakeup of a user of an automotive vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20181001 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20201209 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20210420 |