WO2021014228A1 - Driver evaluation method and device for implementing the method - Google Patents


Info

Publication number
WO2021014228A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
driver
sensor
simulator
input measurement
Prior art date
Application number
PCT/IB2020/054670
Other languages
French (fr)
Inventor
Jaka SODNIK
Boštjan KALUŽA
Mojca Komavec
Kristina STOJMENOVA
Original Assignee
Nervtech, Raziskave In Razvoj, D.O.O.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nervtech, Raziskave In Razvoj, D.O.O.
Publication of WO2021014228A1

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 19/00: Teaching not covered by other main groups of this subclass
    • G09B 19/14: Traffic procedures, e.g. traffic regulations
    • G09B 19/16: Control of vehicles or other craft
    • G09B 9/00: Simulators for teaching or training purposes
    • G09B 9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/04: Simulators for teaching or training purposes for teaching control of land vehicles
    • G09B 9/052: Simulators for teaching or training purposes for teaching control of land vehicles, characterised by provision for recording or measuring trainee's performance

Definitions

  • - stopping at a red light: based on data about the distance to the traffic light, the light colour, and the current speed, the processor unit 4 calculates whether or not the driver stopped at a red light. This calculation represents the percentage of cases when the driver stopped correctly at a red traffic light;
  • - stopping at a stop sign: based on data about the distance to a stop sign and the current vehicle speed, the processor unit 4 calculates whether or not the driver stopped at a stop sign. This calculation represents the percentage of cases when the driver stopped correctly at a stop sign;
  • the processor unit 4 calculates the number of collisions caused by the driver, excluding the collisions for which the driver was not responsible;
  • the processor unit 4 and the memory 5 help combine the data on mirror checking and the recognition of actions such as changing lanes, turning, and overtaking. Before each action, the driver must check the mirrors to make sure that he or she can safely perform the action. The aforementioned calculation represents the percentage of cases when the driver correctly checked the mirrors when changing lanes, turning, or overtaking.
  • rating = w1 × speed limit violations + w2 × red light violations + w3 × stop sign violations + w4 × non-priority road sign violations + w5 × accident rating + w6 × penalty point and penalty rating,
  • where each wi represents a weight that indicates how much a given category affects the final rating and how much a particular subcategory affects a category.
  • Each wi is obtained empirically with the help of experts, such as safe driving instructors, findings in professional and research articles, and commonly known road traffic regulations.
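As a minimal sketch, the violations rating above is a plain weighted sum over the six subcategory scores. The weights and scores below are illustrative placeholders, since the expert-derived values for this formula are not given in the excerpt:

```python
def weighted_rating(scores, weights):
    """Weighted sum used for the traffic-rule violations rating:
    rating = w1*speed limit + w2*red light + w3*stop sign
           + w4*non-priority road sign + w5*accidents + w6*penalties.
    Scores are assumed normalised to [0, 1]."""
    assert len(scores) == len(weights)
    return sum(w * s for w, s in zip(weights, scores))

# Illustrative weights and per-subcategory scores (not from the patent)
violation_weights = [0.25, 0.20, 0.15, 0.10, 0.20, 0.10]
scores = [0.9, 1.0, 1.0, 0.8, 1.0, 0.7]
print(round(weighted_rating(scores, violation_weights), 3))  # prints 0.925
```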
  • the driving style of the driver 2 is, for example, obtained based on a personality questionnaire where the processor unit 4 calculates the extent to which the driver 2 belongs to each of the four driving styles (patient, apprehensive, reckless, or angry).
  • the driving style is calculated from the personality categories mentioned above, i.e. extraversion, neuroticism, conscientiousness, agreeableness, openness, seeking of stimulation, aggressiveness, so that each of these categories contributes to the individual driving style with its weight.
  • an additional rating may be added to a particular driving style in critical situations depending on the driver's response.
  • the evaluation of the driving style is obtained based on a questionnaire that is not the subject of the invention, on the basis of which personality traits are assessed. Each personality trait contributes to the driving style in different ways as follows.
  • Angry driving style = ((1 − agreeableness) + (1 − conscientiousness) + seeking of stimulation + aggressiveness)/4;
  • Reckless driving style = ((1 − agreeableness) + (1 − conscientiousness) + seeking of stimulation)/3;
  • Patient driving style = (conscientiousness + agreeableness + openness + (1 − neuroticism) + (1 − seeking of stimulation))/5.
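The style formulas above can be computed directly from normalised trait scores. The Python sketch below mirrors them; the sample trait values are illustrative, and no formula is given for the apprehensive style in this excerpt, so it is omitted:

```python
def driving_styles(traits):
    """Compute the angry, reckless, and patient style scores from
    personality trait scores assumed normalised to [0, 1]."""
    t = traits
    angry = ((1 - t["agreeableness"]) + (1 - t["conscientiousness"])
             + t["seeking_of_stimulation"] + t["aggressiveness"]) / 4
    reckless = ((1 - t["agreeableness"]) + (1 - t["conscientiousness"])
                + t["seeking_of_stimulation"]) / 3
    patient = (t["conscientiousness"] + t["agreeableness"] + t["openness"]
               + (1 - t["neuroticism"]) + (1 - t["seeking_of_stimulation"])) / 5
    return {"angry": angry, "reckless": reckless, "patient": patient}

# Illustrative questionnaire results for one driver
traits = {"agreeableness": 0.8, "conscientiousness": 0.9, "openness": 0.6,
          "neuroticism": 0.3, "seeking_of_stimulation": 0.2, "aggressiveness": 0.1}
styles = driving_styles(traits)
print(round(styles["patient"], 2))  # prints 0.76
```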
  • Fuel consumption, for example, is derived from data from the simulator 1 throughout the drive. Based on the data on current fuel consumption and distance travelled, the processor unit 4 calculates the average consumption over the entire drive. The calculated consumption is classified as low or high, with the thresholds determined experimentally, and the processor unit 4 calculates the driver's rating based on this.
  • An assessment of the reactions of the driver 2 to critical situations is obtained based on the driver's attention and reaction, whereby it is determined in advance what the driver should pay attention to and how he or she should respond in each critical situation. Based on the predefined reactions, the driver receives a reaction rating. The rating incorporates data about the driver's gaze and marked positions of the objects, together with data on the current speed, brake application, steering wheel position, etc.
  • For each of the above categories that have subcategories, the processor unit 4 first calculates a rating for each category using weights. Afterwards, the processor unit 4 calculates the final driver rating from all categories, again using weights. The weights are intended to assign a different significance to different subcategories and categories. The weights can be changed depending on the purpose of the final driver rating and the group of rated drivers, such as young drivers, professional drivers, etc.
  • the weights can be determined based on a set of drivers with known final ratings that serve as a benchmark. For example, an insurance premium class, claims history, etc. can be used as the aforementioned benchmark.
  • the aforementioned weights are determined by first providing a set of drivers to drive on the driving simulator 1 and capturing the input measurement data. Thereafter, a final rating is determined for each of the drivers in one of the following ways:
  • the final rating is either a categorical rating, where, for example, a driving instructor ranks the drivers as safe, average, or aggressive, or a numeric rating, where, for example, a driving instructor gives the driver a rating between 0 and 100, with 100 being the best rating and 0 the worst. This defines the final classification rating.
  • the next step in determining the aforementioned weights is to introduce the input measurement data into the optimisation model, with said model being selected as gradient descent, backpropagation, a genetic algorithm, etc. This is followed by a determination of the input values of the connection strengths, or input weights, between categories and subcategories, as well as between categories and the final rating, as presented in the aforementioned rule-based model. The input weights can be selected freely or determined based on traffic rules, insurance statistics, or expertise, and they represent the parameters of the selected model.
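A minimal sketch of this weight-fitting step, assuming plain gradient descent on squared error against a benchmark rating (one of the optimisation methods named above). All function and variable names, the toy drivers, and the learning-rate choice are illustrative, not from the patent:

```python
def fit_weights(category_scores, benchmark_ratings, init_weights,
                lr=0.05, epochs=2000):
    """Fit linear category weights by gradient descent so that the
    weighted sum of category scores matches known benchmark ratings.

    category_scores: list of per-driver category score vectors.
    benchmark_ratings: known final rating per driver (the benchmark).
    init_weights: starting weights, e.g. chosen from expertise.
    """
    w = list(init_weights)
    n = len(category_scores)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for x, y in zip(category_scores, benchmark_ratings):
            err = sum(wi * xi for wi, xi in zip(w, x)) - y  # prediction error
            for j, xj in enumerate(x):
                grad[j] += 2 * err * xj / n  # d(mean squared error)/dw_j
        w = [wi - lr * g for wi, g in zip(w, grad)]
    return w

# Toy benchmark generated from the "true" weights [0.7, 0.3]
drivers = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.8, 0.2]]
ratings = [0.7, 0.3, 0.5, 0.62]
w = fit_weights(drivers, ratings, init_weights=[0.5, 0.5])
print([round(wi, 2) for wi in w])  # prints [0.7, 0.3]
```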
  • the driver evaluation method comprises providing the driver evaluation system described above, placing the driver in the driving simulator 1, and optionally calibrating the gaze tracker, which is followed by driving in the driving simulator 1. At the same time, the input measurement data of the driver are captured, and the rating of individual subcategories and the final rating of the driver are calculated with the help of the processor unit 4.
  • the method described above is optionally followed by a readout or printout of the rating on the display 6, on paper, etc.
  • the readout or printout of the rating is prepared according to the purpose of the evaluation, for example, as a training certificate, a report of a diagnostic test for a medical condition, a certificate of the driver's risk level, a recommendation, etc.

Abstract

The present invention relates to a method for evaluating a driver's ability using an interactive evaluation device, for example by means of a simulator. The invention also relates to a device for implementing this method. The method of the present invention provides a comprehensive multi-purpose driver evaluation, such as insurance risk assessment, professional driver suitability assessment, driver deficiency assessment, training, disease diagnosis, comparison of driving profile with celebrities, etc.

Description

DRIVER EVALUATION METHOD AND DEVICE FOR IMPLEMENTING THE METHOD
[0001] The present invention relates to a method for evaluating a driver's ability using an interactive evaluation device, for example by means of a simulator. The invention also relates to a device for implementing the method.
[0002] The publication of patent application US2008082372 A1 discloses a method for evaluating a driver's ability by using a realistic driving simulator. The driver is exposed to a variety of traffic situations in the driving simulator, whereby the driver's ability is assessed and the insurance risk is evaluated based on the driver's responses. The known method only discloses instructions on how to evaluate the driving itself, while the psychophysical condition of the driver is not factored in.
[0003] The object of the present invention is to create a new method for evaluating a driver's ability by using an evaluation device.
[0004] The object of the present invention is accomplished by the characteristics set forth in the characterizing part of claim 1. Details of the invention are disclosed in the corresponding subclaims. A method of the present invention allows for a comprehensive multi-purpose driver evaluation, such as insurance risk assessment, professional driver suitability assessment, driver deficiency assessment, training, disease diagnosis, comparison of driving profile with celebrities, etc. The term "driver" is used throughout the description of the invention in the broadest sense possible, covering, for example, the driver of a road vehicle and/or rolling stock, the helmsman of a watercraft, the pilot of an aircraft, etc.
[0005] Fig. 1 shows a schematic representation of a method for evaluating a driver's ability according to the invention using an interactive evaluation device, for example using a driving simulator that comprises data capture from the simulator 1 and data capture from the driver 2, at least one module 3 that synchronises this data, a processor unit 4, and the memory 5. The above data is fed into the processor unit 4, where they are processed and subsequently evaluated, and the driver's ability is displayed on the display 6. Furthermore, there is an option to store processed data in the memory 5, either temporarily or permanently.
[0006] The above driving simulator 1 has at least four degrees of freedom of movement, being able to simulate different driving environments, traffic situations, different densities and types of traffic, different weather conditions, and traffic signals, and provides a field of view of at least 120°. The above driving simulator 1 generates or simulates input measurement data based on data from the simulation environment and the physical components of the simulator 1, such as the pedals, steering wheel, gearbox, movable platform of the simulator 1, etc. These input measurement data, which can be generated at any time during the simulation by the driving simulator 1, are selected non-restrictively as data about the gradient, curvature, type, and width of the road, weather data (visibility, precipitation intensity, snowfall data), driving time or duration data, vehicle speed data, speed limit data, speed data of other simulated vehicles, acceleration and deceleration data (depression of the accelerator, brake, and clutch pedal), gear-shift lever data, data on the status of the lights and horn, fuel consumption data, collision data (i.e. collision position, collision angle, collision force, and collision participants), vehicle lane positioning data, vehicle location data relative to other simulated vehicles, location data of other simulated vehicles and other objects, data on the observance of traffic signals (observance of speed limits, stop signs, traffic lights, etc.), data on the speed of the steering wheel rotation, angle of steering wheel rotation, and steering wheel torque, data on annotated critical situations, data on annotated objects (bounding boxes that constrain objects in a 2D projection), etc.
[0007] The driving simulator 1 provides the option of annotating or segmenting images to detect and locate objects in the simulation. Furthermore, timestamps and annotation or image segmentation enable the automatic calculation of the view in the mirrors, monitoring of traffic signals, duration of a look away from the road, etc.
[0008] Driver 2 input measurement data is captured by means of a set of sensors mounted on the driver 2, which are capable of measuring and/or collecting measured driver 2 data, including, but not limited to, an electro cardiovascular sensor, electrodermal sensor or sensor for measuring galvanic skin response, thermoelectric sensor, infrared sensor that can provide a photoplethysmograph, accelerometer for measuring head and arm movements of the driver 2, real-time gaze tracker, camera or sensor for detecting eye activity. According to the proposed invention, at least the thermoelectric sensor, the infrared sensor, the accelerometer, the gaze tracker, and the camera or sensor for detecting eye activity are equipped with timestamps so that the data or events acquired by the simulator 1 can be aligned with the measurements of the respective sensor.
[0010] The input measurement data for the driver 2 that can be provided by the electrodermal sensor are the skin conductivity data of the driver 2.
[0011] The input measurement data that can be generated by the thermoelectric sensor are skin temperature data.
[0012] The input measurement data that can be generated by the electro cardiovascular sensor are the cardiac electrical activity data of the driver 2 obtained from the electrocardiograph.
[0013] The input measurement data that can be generated by the infrared sensor, which is capable of providing a photoplethysmograph, are, for example, the heart rate and blood flow of the driver 2.
[0014] The input measurement data that can be generated by the accelerometer are, for example, the speed and direction of the movement of the head and hands of the driver 2.
[0015] The input measurement data that can be generated by the real-time gaze tracker are pupil diameter, focus, and eye movement speed data, as well as the gaze data, such as the position of the gaze focus point on one of the simulator 1 screens.
[0016] The input measurement data that can be generated by the camera or gaze tracker are the position and size of the pupils, as well as data about other ocular activities such as blinking, fixing of the gaze, and the eye openness level for cases of sleepiness.
[0017] Further input data about the driver 2 can be obtained on the basis of a personality questionnaire. These further input data are: extraversion, neuroticism, conscientiousness, agreeableness, openness, seeking of stimulation, aggressiveness, etc.
[0018] The input measurement data from the simulator 1 and the input measurement data about the driver 2 are used to calculate individual categories, such as aggression, driver reaction time, stress, driver attention, driving errors and violations of traffic rules, driving style, fuel consumption, and an assessment of driver reactions in critical situations.
[0019] Each of the above categories is calculated from the subcategories listed below for each category. The better the categories and subcategories are rated, the safer the driver's driving is.
[0020] Driving aggressiveness is evaluated according to the following parameters.
[0021] Observance of the safe distance between vehicles is based on the data obtained from the simulator 1, whereby, either on predefined sections or along the entire driving route, excluding sections where the speed is below a certain threshold, the safe distance is calculated with respect to the vehicle speed, taking into account a reaction time of 2 seconds: optimal safe distance = 2 s × speed [m/s]. Afterwards, the percentage of time during which the distance to the next vehicle met the calculated safe distance is computed in the processor unit 4. This result shows the percentage of time the driver 2 complied with the safe distance.
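The 2-second rule above can be checked per sample, for example as in the following Python sketch. The 5 m/s low-speed cutoff is an illustrative threshold standing in for the unspecified one in the description:

```python
def safe_distance_compliance(samples, min_speed=5.0, reaction_time=2.0):
    """Fraction of samples where the headway to the vehicle ahead meets
    the 2-second rule (safe distance = reaction_time * speed).

    samples: iterable of (speed_mps, headway_m) pairs from the simulator.
    Samples below min_speed (m/s) are ignored, as in the description.
    """
    considered = [(v, d) for v, d in samples if v >= min_speed]
    if not considered:
        return None  # nothing to rate on this route
    ok = sum(1 for v, d in considered if d >= reaction_time * v)
    return ok / len(considered)

# 2 of the 4 considered samples comply; the (2, 1) sample is too slow to count
print(safe_distance_compliance([(10, 25), (10, 15), (20, 45), (20, 39), (2, 1)]))  # prints 0.5
```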
[0022] Observance of sticking to the road based on the data obtained from the simulator 1, whereby the percentage of time the driver 2 actually drove on the road is calculated in the processor unit 4 using the data regarding the road type, such as paved road, pavement, motorway acceleration lane, shoulder, etc.
[0023] Observance of the acceleration/deceleration rate is based on the data obtained from the simulator 1, whereby the derivative of the acceleration, which represents the jerk, is calculated in the processor unit 4. Jerks are indicators of ride comfort: if the absolute value of the jerk is below a certain threshold, the ride is considered comfortable. Jerks can also be evaluated by segment.
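As an illustration, the jerk can be approximated as a finite difference of sampled acceleration, and comfort measured as the share of steps under a threshold. The 2.0 m/s³ limit below is an assumed value, not one given in the patent:

```python
def jerk_comfort_ratio(acceleration, dt, jerk_limit=2.0):
    """Share of time steps whose jerk magnitude stays under jerk_limit.

    acceleration: acceleration samples (m/s^2) taken at fixed step dt (s).
    Jerk is approximated as the finite difference of acceleration.
    """
    jerks = [(a2 - a1) / dt for a1, a2 in zip(acceleration, acceleration[1:])]
    if not jerks:
        return None
    comfortable = sum(1 for j in jerks if abs(j) < jerk_limit)
    return comfortable / len(jerks)

# Jerks are [0.2, 0.8, 3.0] m/s^3; the last step exceeds the limit
print(round(jerk_comfort_ratio([0.0, 0.1, 0.5, 2.0], dt=0.5), 3))
```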
[0024] Observance of the use of the direction indicator when turning, overtaking, and changing lanes is based on the data obtained from the simulator 1, whereby the processor unit 4 determines whether it is a turn, overtake, or change of lane based on data about the lane in which the vehicle is located, the position of the steering wheel, and other vehicles involved in the simulation. In the case of one of the listed actions, it is verified whether the driver 2 correctly switched on the direction indicator. The processor unit 4 calculates the percentage of correct direction indicator usage.
[0025] Observance of lane changing based on the data obtained from the simulator 1, whereby it is first determined whether it is a lane change, considering only the segments where this is possible, for example on a motorway. Based on data about the other vehicles involved in the simulation, the distance to vehicles in the other lane is calculated in the processor unit 4. If the distances mentioned, observing the safe distance, are appropriate in relation to the speed of the vehicle and the speed of the vehicles in the other or adjacent lane, the relevant subcategory is considered as a suitable lane change. The percentage of lane changes where the lane change was acceptable is calculated in the processor unit 4.
[0026] Aggressiveness is, for example, calculated on the basis of the formula below:
[0027] aggressiveness = w1 × breach of safe distance + w2 × off-road driving + w3 × uncomfortable driving + w4 × stop sign violations + w5 × red light violations + w6 × non-priority road sign violations + w7 × speed limit violations + w8 × speed limit violations based on weather conditions,
[0028] whereby each wi represents a weight that indicates how much a given category affects the final rating and how much a particular subcategory affects each category. Each wi is obtained empirically with the help of experts, such as safe driving instructors, findings in professional and research articles, and commonly known road traffic regulations. In a particular implementation case, the weights taken into account were, for example, w1 = 0.11, w2 = 0.074, w3 = 0.18, w4 = 0.09, w5 = 0.09, w6 = 0.09, w7 = 0.09, w8 = 0.17.
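The aggressiveness formula reduces to one weighted sum; the sketch below uses the example weights w1 to w8 from the implementation case above and assumes the eight subcategory scores are normalised to [0, 1]:

```python
# Example weights w1..w8 from the description's implementation case
WEIGHTS = [0.11, 0.074, 0.18, 0.09, 0.09, 0.09, 0.09, 0.17]

def aggressiveness(subcategory_scores, weights=WEIGHTS):
    """Weighted sum of the eight aggressiveness subcategory scores, in
    the order of the formula: breach of safe distance, off-road driving,
    uncomfortable driving, stop sign violations, red light violations,
    non-priority road sign violations, speed limit violations, and
    weather-adjusted speed limit violations."""
    assert len(subcategory_scores) == len(weights)
    return sum(w * s for w, s in zip(weights, subcategory_scores))

# Only the safe-distance term contributes here
print(round(aggressiveness([1, 0, 0, 0, 0, 0, 0, 0]), 3))  # prints 0.11
```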
[0029] The driver 2 reaction time is, for example, calculated based on predetermined events that force the driver 2 to respond. An example of one possible event is a ball falling on the road, where the reaction time of the driver 2 to activate the brake is measured. The time from moment t1, when the event occurs, to moment t2, when the driver 2 responds, represents the reaction time tr of the driver 2, i.e. tr = t2 − t1. At the end of the simulation, the average reaction time for all events is calculated in the processor unit 4. A driver with a shorter reaction time gets a better rating.
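The averaging of tr = t2 − t1 over all scripted events can be sketched as follows (the event timestamps are illustrative):

```python
def average_reaction_time(events):
    """Mean reaction time t_r = t2 - t1 over all scripted events.

    events: list of (t1, t2) pairs, where t1 is the moment the event
    occurs and t2 the moment the driver responds (e.g. brake onset).
    """
    times = [t2 - t1 for t1, t2 in events]
    return sum(times) / len(times)

# Two events with reaction times of 0.8 s and 1.2 s
print(round(average_reaction_time([(10.0, 10.8), (42.0, 43.2)]), 3))  # prints 1.0
```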
[0030] Driver 2 stress is, for example, evaluated based on biometric data measured on the driver, such as skin conductivity, heart rate, and skin temperature. The higher the heart rate and the greater the skin conductivity, the more stressed the driver 2 is; skin temperature decreases as stress levels increase. Unforeseen and unusual situations, such as reduced road visibility due to fog or heavy precipitation, can be more stressful for young, inexperienced drivers than for experienced drivers. When evaluating a driver, the average and maximum heart rate and the difference between them, as well as the maximum and average skin conductivity and the difference between them, are compared. The driver stress assessment is calculated in the processor unit 4 based on parameters that show which category the driver falls into relative to the average population.
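The comparison described can be sketched as extracting the average/maximum features and then placing the driver relative to population statistics. The one-standard-deviation bands and the population figures are illustrative assumptions:

```python
def stress_features(heart_rate, skin_conductivity):
    """Summary features the description compares: average, maximum,
    and their difference, for heart rate and skin conductivity."""
    def summary(xs):
        avg, mx = sum(xs) / len(xs), max(xs)
        return {"avg": avg, "max": mx, "range": mx - avg}
    return {"hr": summary(heart_rate), "eda": summary(skin_conductivity)}

def stress_category(value, population_avg, population_sd):
    """Place the driver relative to the average population using an
    assumed one-standard-deviation band."""
    if value > population_avg + population_sd:
        return "above average stress"
    if value < population_avg - population_sd:
        return "below average stress"
    return "average stress"

features = stress_features([70, 80, 90], [2.0, 3.0])
print(features["hr"]["avg"], stress_category(features["hr"]["avg"], 75, 10))
```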
[0031] The attention of the driver 2 is evaluated, for example, based on data about the frequency of use of all mirrors in the vehicle or simulator 1. From the collected data on the direction of the gaze and the position of the mirrors, the processor unit 4 calculates whether the driver 2 looked in the mirror. Afterwards, based on the expected frequency of looking at each mirror over a specified time, such as every 8 seconds, the percentage of the driver successfully checking the mirror is calculated. Alternatively, mirror observation is calculated based on a frequency adapted to the current speed and the number of significant objects on the route. Another possibility for mirror observation is cumulative, whereby the percentage of times that the driver looked in each mirror is calculated at the end of the simulation.
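The expected-frequency variant in paragraph [0031] (for example, one mirror check every 8 seconds) can be sketched as follows; the data representation is a hypothetical choice, not the patent's.

```python
def mirror_check_percentage(glance_times, drive_duration, interval=8.0):
    """Divide the drive into `interval`-second windows and return the
    percentage of windows that contain at least one mirror glance.
    `glance_times` are the moments (s) the gaze fell on a mirror."""
    windows = int(drive_duration // interval)
    if windows == 0:
        return 0.0
    checked = set(int(t // interval) for t in glance_times if t < windows * interval)
    return 100.0 * len(checked) / windows

print(mirror_check_percentage([3.0, 10.0, 30.0], 32.0))  # windows 0, 1, 3 of 4 -> 75.0
```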
[0032] The attention rating of the driver 2 during the simulation is also influenced by the driver's observation of significant objects and disregard for objects that distract from driving. Throughout the entire drive, the positions of objects that are significant for driving and those that distract from driving are collected and recorded in the aforementioned memory 5. These data are then compared with the gaze direction of driver 2 to determine whether or not the driver noticed a particular object. At the end of the simulation, the processor unit 4 calculates how many significant and how many distracting objects the driver 2 observed, and then rates the driver.
[0033] All of the mentioned objects are marked as distracting or significant over a specific period of time. If the driver 2 pays attention to a distracting object, this lowers his or her final attention rating, but if the driver notices a significant object, this raises his or her final attention rating. The final rating is calculated by the processor unit 4 by adding together the individual weights of the objects according to their significance or insignificance.
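The weighted summation over noticed objects in paragraph [0033] can be sketched as below; the object weights and the tuple representation are illustrative assumptions.

```python
def attention_rating(objects):
    """Each object is (weight, is_significant, noticed). Noticing a
    significant object adds its weight to the score; attending to a
    distracting object subtracts it. Unnoticed objects contribute nothing."""
    score = 0.0
    for weight, significant, noticed in objects:
        if noticed:
            score += weight if significant else -weight
    return score
```

For example, noticing a significant object of weight 0.2 while also glancing at a distracting object of weight 0.1 yields a net score of 0.1.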
[0034] Driving errors and violations of traffic rules are, for example, evaluated based on data obtained from the simulator 1. In doing so, the following non-restrictive examples are considered:
[0035] - speed adjustment: based on data about the current speed of the vehicle and the speed limit, the percentage of time the driver complied with the speed limit is calculated in the processor unit 4. This calculation can be done for the whole drive or for segments;
[0036] - stopping at a red light: based on data about the distance to the traffic light, light colour, and current speed, the processor unit 4 calculates whether or not the driver stopped at a red light. This calculation represents the percentage of cases when the driver stopped correctly at a red traffic light;
[0037] - stopping at a stop sign: based on data about the distance to a stop sign and the current vehicle speed, the processor unit 4 calculates whether or not the driver stopped at a stop sign. This calculation represents the percentage of cases when the driver stopped correctly at a stop sign;
[0038] - reduction of speed ahead of a non-priority road sign: based on data about the distance to the non-priority road sign and the current speed, the processor unit 4 calculates whether the driver decreased the speed below the limit value. This calculation represents the percentage of cases when the driver correctly reduced the speed ahead of a non-priority road sign;
[0039] - number of accidents caused by the driver: based on the collision data, the processor unit 4 calculates the number of collisions caused by the driver, excluding the collisions for which the driver was not responsible;
[0040] - number of penalty points: based on the data on the current speed and the speed limit, the processor unit 4 calculates instances of speeding at individual points. The speeding instances and the speed limit are linked to data about traffic offence penalties, which provides the number of penalty points a driver would receive;
[0041] - checking mirrors when the driver changes lanes, turns, or overtakes: the processor unit 4 and the memory 5 help combine the data on mirror checking and the recognition of actions such as changing lanes, turning, and overtaking. Before each action, the driver must check the mirrors to make sure that he or she can safely perform the action. The aforementioned calculation represents the percentage of cases when the driver correctly checked the mirrors when changing lanes, turning, or overtaking.
[0042] For example, the calculation of the evaluation of driving errors and violations of traffic rules is provided by the following formula:
[0043] rating = w1 × speed limit violations + w2 × red light violations + w3 × stop sign violations + w4 × non-priority road sign violations + w5 × accident rating + w6 × penalty point and penalty rating
[0044] whereby each wi represents a weight that indicates how much a given category affects the final rating and how much a particular subcategory affects a category. Each wi is obtained empirically with the help of experts, such as safe driving instructors, findings in professional and research articles, and commonly known road traffic regulations. In a particular implementation case, the weights that were taken into account included for example w1 = 0.0625, w2 = 0.0625, w3 = 0.0625, w4 = 0.0625, w5 = 0.625, w6 = 0.125.
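The weighted sum of paragraph [0043] with the example weights of paragraph [0044] can be sketched as follows; the subrating values passed in are illustrative.

```python
def violation_rating(subratings, weights):
    """Weighted sum over the six violation subcategories: speed limit,
    red light, stop sign, non-priority road sign, accidents, penalty points."""
    assert len(subratings) == len(weights)
    return sum(w * r for w, r in zip(weights, subratings))

# Example weights from paragraph [0044]; they sum to 1.0, with accidents
# (w5 = 0.625) dominating the rating.
weights = [0.0625, 0.0625, 0.0625, 0.0625, 0.625, 0.125]
```

A driver scoring 1.0 in every subcategory therefore receives the maximum rating of 1.0.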
[0045] The driving style of the driver 2 is, for example, obtained based on a personality questionnaire where the processor unit 4 calculates the extent to which the driver 2 belongs to each of the four driving styles (patient, apprehensive, reckless, or angry). The driving style is calculated from the personality categories mentioned above, i.e. extraversion, neuroticism, conscientiousness, agreeableness, openness, seeking of stimulation, aggressiveness, so that each of these categories contributes to the individual driving style with its weight. In addition, an additional rating may be added to a particular driving style in critical situations depending on the driver's response.
[0046] The evaluation of the driving style is obtained based on a questionnaire that is not the subject of the invention, on the basis of which personality traits are assessed. Each personality trait contributes to the driving style in different ways as follows.
[0047] Apprehensive driving style = ((1 - conscientiousness) + (1 - extraversion) + neuroticism + seeking of stimulation)/4;
[0048] Angry driving style = ((1 - agreeableness) + (1 - conscientiousness) + seeking of stimulation + aggressiveness)/4;
[0049] Reckless driving style = ((1 - agreeableness) + (1 - conscientiousness) + seeking of stimulation)/3;
[0050] Patient driving style = (conscientiousness + agreeableness + openness + (1 - neuroticism) + (1 - seeking of stimulation))/5.
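The four formulas of paragraphs [0047]-[0050] can be collected into one sketch; it assumes each personality trait is scored on a normalised scale from 0 to 1, which the patent does not specify.

```python
def driving_styles(traits):
    """Compute the four driving-style scores from personality-trait scores
    in [0, 1], following the formulas of paragraphs [0047]-[0050]."""
    t = traits
    return {
        "apprehensive": ((1 - t["conscientiousness"]) + (1 - t["extraversion"])
                         + t["neuroticism"] + t["stimulation"]) / 4,
        "angry": ((1 - t["agreeableness"]) + (1 - t["conscientiousness"])
                  + t["stimulation"] + t["aggressiveness"]) / 4,
        "reckless": ((1 - t["agreeableness"]) + (1 - t["conscientiousness"])
                     + t["stimulation"]) / 3,
        "patient": (t["conscientiousness"] + t["agreeableness"] + t["openness"]
                    + (1 - t["neuroticism"]) + (1 - t["stimulation"])) / 5,
    }
```

A driver with every trait at the midpoint 0.5 scores 0.5 on all four styles, as expected from the symmetry of the formulas.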
[0051] Fuel consumption, for example, is derived from data provided by the simulator 1 throughout the drive. Based on the data on current fuel consumption and distance travelled, the processor unit 4 calculates the average consumption over the entire drive. The calculated consumption is classified as low or high consumption, which is determined experimentally, and the processor unit 4 calculates the driver's rating based on this.

[0052] An assessment of the reactions of the driver 2 to critical situations is obtained based on the driver's attention and reaction, whereby it is determined in advance what the driver should pay attention to and how he or she should respond in each critical situation. Based on the predefined reactions, the driver receives a reaction rating. The rating incorporates data about the driver's gaze and marked positions of the objects, together with data on the current speed, brake application, steering wheel position, etc.
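The average-consumption calculation of paragraph [0051] reduces to a simple ratio; the figures and the low/high threshold below are illustrative, since the patent states the threshold is determined experimentally.

```python
def average_consumption(fuel_used_litres, distance_km):
    """Average fuel consumption in litres per 100 km over the whole drive."""
    if distance_km <= 0:
        raise ValueError("no distance travelled")
    return 100.0 * fuel_used_litres / distance_km

# 3.2 l over 40 km -> 8.0 l/100 km; the 9.0 cut-off is a hypothetical
# experimentally determined threshold.
rating = "low" if average_consumption(3.2, 40.0) < 9.0 else "high"
```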
[0053] For each of the above categories that have subcategories the processor unit 4 first calculates a rating for each category using weights. Afterwards, the processor unit 4 again calculates the final driver rating from all categories using weights. The weights are intended to assign a different weight or significance to different subcategories and categories. The weights can be changed depending on the purpose of the final driver rating and the group of rated drivers, such as young drivers, professional drivers, etc.
[0054] The weights can be determined based on a set of drivers with a known final rating that serves as a benchmark. For example, an insurance premium class, claims history, etc. can be used as the aforementioned benchmark.
[0055] The aforementioned weights are determined by first providing a set of drivers to drive on the driving simulator 1 and capturing the input measurement data. Thereafter, a final rating is determined for each of the drivers, which can be determined in one of the following ways:
a) assessing the driver's driving using a driving instructor based on the driving in the simulator 1 or
b) determining the driver's premium class and/or claim history.
[0056] The final rating is a categorical rating, for example, a driving instructor ranks the drivers as safe, average, or aggressive drivers, or a numeric final rating, for example, a driving instructor gives a final rating to a driver between 0 and 100, 100 being the best rating and 0 the worst rating. This defines the final classification rating.
[0057] The next step in determining the aforementioned weights is to introduce the input measurement data into the optimisation model, with said model being selected as gradient descent, the backpropagation method, a genetic algorithm, etc. This is followed by a determination of the input values of the connection strengths or input weights between categories and subcategories as well as between categories and the final rating as presented in the aforementioned rule-based model, whereby the input weights can be selected freely or determined based on traffic rules, insurance statistics, or expertise, and they represent the parameters of the selected model.
[0058] Then comes the learning and testing of the selected model in order to obtain as accurate a prediction as possible of the final driver rating relative to an already known final driver rating, whereby the testing can be carried out by means of cross-validation. This is followed by a determination of the accuracy of the prediction based on stored performance measures in order to determine how well the model has learned from the data. If the results are unsatisfactory, other model parameters are used; for example, in the case of a neural network, a different number of layers with a different number of neurons in these layers can be used, and the aforementioned learning and testing of the selected model are repeated with a modified model configuration until a satisfactory result is obtained. When a satisfactory result is obtained, it is selected as a more accurate prediction of the driver's rating based on the known, expertly determined final driver's rating.
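Of the optimisation models named in paragraph [0057], the simplest to sketch is gradient descent on a linear rating model. The sketch below is a minimal stand-in, not the patent's implementation: it fits the weights so that the weighted sum of subcategory ratings matches the known benchmark ratings.

```python
def fit_weights(samples, targets, lr=0.01, epochs=2000):
    """Plain gradient descent on mean squared error for a linear model
    rating = sum(w_i * x_i), where x is a vector of subcategory ratings
    and the target is the known final driver rating."""
    n = len(samples[0])
    w = [0.0] * n
    for _ in range(epochs):
        grad = [0.0] * n
        for x, y in zip(samples, targets):
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            for i in range(n):
                grad[i] += 2 * err * x[i]
        for i in range(n):
            w[i] -= lr * grad[i] / len(samples)
    return w
```

In practice the learned weights would then replace or refine the expert-chosen input weights, and prediction accuracy would be checked by cross-validation as described above.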
[0059] The driver evaluation method comprises providing the driver evaluation system described above, placing the driver in the driving simulator 1, and optionally calibrating the gaze tracker, which is followed by driving in the driving simulator 1. At the same time, the input measurement data of the driver are captured, and the rating of individual subcategories and the final rating of the driver are calculated with the help of the processor unit 4.
[0060] The method described above is optionally followed by a readout or printout of the rating on the display 6, on paper, etc. The readout or printout of the rating is prepared according to the purpose of the evaluation, for example, as a training certificate, a report of a diagnostic test for a medical condition, a certificate of the driver's risk level, a recommendation, etc.

Claims
1. A method for evaluating a driver's ability using an interactive evaluation device, for example, using a simulator, comprising
a) providing a simulator (1),
b) generating and capturing input measurement data in the simulator (1),
c) providing a driver (2),
d) placing sensors on the driver (2) and acquiring input measurement data from the driver (2) by means of these sensors,
e) synchronising the above sensors by means of at least one module (3),
f) processing the data obtained in step c) in the processing unit (4),
g) evaluating the data that has been processed in this way,
h) displaying the evaluated data.
2. A method according to claim 1, comprising the temporary or permanent storing of the processed data in the memory (5).
3. A method according to claim 1, comprising the generation of the aforementioned input measurement data at any time of the simulation, whereby these input measurement data are selected as data about the gradient, curvature, type, and width of the road, weather data, driving time or duration data, vehicle speed data, speed limit data, speed data of other simulated vehicles, acceleration and deceleration data, gear shift lever data, data on the status of the lights and horn, fuel consumption data, collision data, vehicle lane positioning data, vehicle location data relative to other simulated vehicles, location data of other simulated vehicles and other objects, data on the observance of traffic signals, data on the speed of the steering wheel rotation, angle of steering wheel rotation, and steering wheel torque, data on annotated critical situations, data on annotated objects, etc.
4. A method according to any one of the claims from 1 to 3, comprising capturing the driver input measurement data by means of a set of sensors mounted on the driver which are capable of measuring and/or collecting the measured driver data, whereby the mentioned sensor set comprises an electro cardiovascular sensor, electrodermal sensor or sensor for measuring galvanic skin response, thermoelectric sensor, infrared sensor that can provide a photoplethysmograph, accelerometer for measuring head and arm movements of the driver, real-time gaze tracker, and camera or sensor for detecting eye activity.
5. A method according to claim 4, comprising capturing the driver input measurement data by means of a set of sensors, with at least the thermoelectric sensor, the infrared sensor, the accelerometer, the gaze tracker, and the camera or sensor for detecting eye activity being equipped with timestamps so that the data or events acquired by the simulator can be aligned with the measurements of the respective sensor.
6. A device for evaluating a driver's ability for performing the method according to any one of the preceding claims.
PCT/IB2020/054670 2019-07-09 2020-05-18 Driver evaluation method and device for implementing the method WO2021014228A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SIP-201900130 2019-07-09
SI201900130A SI25874A (en) 2019-07-09 2019-07-09 The driver evaluation procedure and the device for carrying out the procedure

Publications (1)

Publication Number Publication Date
WO2021014228A1 true WO2021014228A1 (en) 2021-01-28

Family

ID=70978299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/054670 WO2021014228A1 (en) 2019-07-09 2020-05-18 Driver evaluation method and device for implementing the method

Country Status (2)

Country Link
SI (1) SI25874A (en)
WO (1) WO2021014228A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116453289A (en) * 2022-01-06 2023-07-18 中国科学院心理研究所 Bus driving safety early warning method and system based on electrocardiosignal

Citations (3)

Publication number Priority date Publication date Assignee Title
US20130302760A1 (en) * 2009-09-29 2013-11-14 Advanced Training System Llc System, Method and Apparatus for Driver Training System with Stress Management
KR20140073669A (en) * 2012-12-06 2014-06-17 대구가톨릭대학교산학협력단 Driving ability evaluating system
US20160027336A1 (en) * 2012-04-23 2016-01-28 The Boeing Company Methods for Evaluating Human Performance in Aviation


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN116453289A (en) * 2022-01-06 2023-07-18 中国科学院心理研究所 Bus driving safety early warning method and system based on electrocardiosignal
CN116453289B (en) * 2022-01-06 2024-02-20 中国科学院心理研究所 Bus driving safety early warning method and system based on electrocardiosignal

Also Published As

Publication number Publication date
SI25874A (en) 2021-01-29

Similar Documents

Publication Publication Date Title
Eskandarian et al. Evaluation of a smart algorithm for commercial vehicle driver drowsiness detection
Braunagel et al. Ready for take-over? A new driver assistance system for an automated classification of driver take-over readiness
CN109774471B (en) Vehicle-mounted equipment suitable for safe driving
US9478150B1 (en) Real-time driver observation and scoring for driver's education
EP1997705A1 (en) Drive behavior estimating device, drive supporting device, vehicle evaluating system, driver model making device, and drive behavior judging device
EP2296124B1 (en) System for automatic evaluation of driving behavior
Feng et al. An on-board system for detecting driver drowsiness based on multi-sensor data fusion using Dempster-Shafer theory
Bieliauskas Neuropsychological assessment of geriatric driving competence
CN108876165B (en) Driver safety monitoring learning system
Mortazavi et al. Effect of drowsiness on driving performance variables of commercial vehicle drivers
Tran et al. Predicting driver’s work performance in driving simulator based on physiological indices
Hoogendoorn Empirical research and modeling of longitudinal driving behavior under adverse conditions
WO2021014228A1 (en) Driver evaluation method and device for implementing the method
JP3782025B2 (en) Driving training system using driving simulator
EP4230493A2 (en) Coachable driver risk groups
Cavallo et al. Elderly pedestrians' visual timing strategies in a simulated street-crossing situation
Wong et al. A microscopic driver attention allocation model
EA041273B1 (en) A METHOD FOR ASSESSING DRIVER INVOLVEMENT IN DRIVING A VEHICLE
WO2024062769A1 (en) Driver assistance device, driver assistance system, and driver assist method
TWM552127U (en) System for evaluating driving risk
Beňuš et al. Comparison of driver awareness in real traffic and driving on a simulator
Horáková et al. MEASURING PERFORMANCE DURING A FALLBACK PROCEDURE IN AUTONOMOUS VEHICLES
Cai et al. Evaluation of driver visual behavior and road signs in virtual environment
Poliak et al. Road freight transport driver fatique test: A pilot study
Gregoriades et al. Evaluating a custom-made agent-based driving simulator.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20730733

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20730733

Country of ref document: EP

Kind code of ref document: A1