SE2030301A1 - Method and system for driving skill feedback - Google Patents

Method and system for driving skill feedback

Info

Publication number
SE2030301A1
SE2030301A1
Authority
SE
Sweden
Prior art keywords
driving
driver
calculating
observation
performance
Prior art date
Application number
SE2030301A
Inventor
Achim J Lilienthal
Henrik Andreasson
Maike Schindler
Chadalavada Ravi Teja
Pathi Sai Krishna
Original Assignee
Chadalavada Ravi Teja
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chadalavada Ravi Teja
Priority to SE2030301A
Priority to PCT/SE2021/050948
Publication of SE2030301A1

Classifications

    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A61B5/163 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • B60W40/09 Driving style or behaviour
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G06F3/013 Eye tracking input arrangements
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/582 Recognition of traffic signs
    • G06V20/584 Recognition of vehicle lights or traffic lights
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G09B19/167 Control of land vehicles
    • G09B9/042 Simulators for teaching or training purposes for teaching control of land vehicles providing simulation in a real vehicle
    • A61B2503/22 Motor vehicle operators, e.g. drivers, pilots, captains
    • B60W2540/225 Direction of gaze

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Educational Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Hospice & Palliative Care (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Automation & Control Theory (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Medical Informatics (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Business, Economics & Management (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Human Computer Interaction (AREA)
  • Educational Administration (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Ophthalmology & Optometry (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for calculating a driving performance observation for a driver driving a vehicle, comprising the following method steps: i.) identify and store information about an object in registered visual information, ii.) identify and store information about the position of the driver's eye gaze, iii.) identify and store the position of the driven vehicle, iv.) calculate observations based upon comparing the visual information and the position of the driver's eyes for a specific position of the vehicle. The invention further relates to a system for calculating a driving performance observation for a driver driving a vehicle.

Description

METHOD AND SYSTEM FOR DRIVING SKILL FEEDBACK

TECHNICAL FIELD

The invention relates to a method of driving skill feedback based on measured performance of the driver. The invention further relates to a system for driving skill feedback.
BACKGROUND OF THE INVENTION, PROBLEM AND PRIOR ART

Determining the driving skill of a driving school student is important to guarantee a certain level of skill and to ensure safety for the student and for other road users such as pedestrians, car drivers, bicyclists and motorcyclists.
Today, the level of experience is commonly determined solely by a driving school teacher and/or driving examiner through visual inspection of the driving school student's performance when actually driving a car or, less commonly, in a car simulator environment.
Since the cost and/or time needed for a student to qualify for a driving licence is continuously increasing, it is important to improve the methods of assessment and learning. If the driving test fails, repeated tests are needed, resulting in increased cost and time for the student. Repeated driving also results in environmental pollution.
It is also of interest to efficiently determine the driving skills of elderly drivers and/or drivers affected by a disease.
It is thus important to identify a method and/or system that improves the possibility to determine and/or analyse the skill of a driving school student such that the outcome of the analysis is valid, objective, and reliable, i.e. identical or comparable for each trial. Furthermore, it is important to identify a method and/or system that improves the driving school students' capability to drive safely, and thus the safety of all fellow road users. Furthermore, it is important to identify a method and/or system that reduces the cost and time spent by the driving student in driving school by improving the learning experience when driving.
Patent document US 9,256,995 B2 discloses an apparatus for diagnosing driving behaviour including a storage unit that stores ideal running information defining a relationship between a vehicle speed and a vehicle position corresponding to a road situation, a generation unit that generates actual running information expressing a relationship between an actual vehicle speed and an actual position when a vehicle passes through a road, a condition identification unit that identifies a matching condition where the degree of correlation of the actual running information with the ideal running information exceeds a predetermined value, and a diagnosis unit that diagnoses the driving behaviour of a driver of the vehicle based on a degree of similarity between the ideal running information and the actual running information under the matching condition identified by the condition identification unit.
Patent document US 9,256,995 B2 does not disclose a system that can be used without first arranging ideal running information in a storage unit before the system is utilized.
Further problems addressed by the invention will become clear from the following detailed description of the various embodiments.
OBJECT AND FEATURES OF THE INVENTION

The present invention relates to a method for calculating a driving performance observation for a driver driving a vehicle, comprising the following method steps: i.) identify and store information about an object in registered visual information, ii.) identify and store information about the position of the driver's eye gaze, iii.) identify and store the position of the driven vehicle, iv.) calculate observations based upon comparing the visual information and the position of the driver's eyes for a specific position of the vehicle.
According to further aspects of the improved method for calculating a driving performance observation for a driver driving a vehicle, provision is made as follows:
- the objects are at least one of: at least one road sign, at least one traffic signal, at least one pedestrian, at least one vehicle;
- the observation is calculated in that, for a specific instance in time or position, the position of the driver's eyes is compared to the position of an identified object;
- a driving performance score is calculated based upon at least two observations;
- the driving performance score is calculated as a relation of observations where the driver observes an object in relation to the total number of observations.
As an alternative, the driving performance score could be calculated using a weight matrix based on the driver's attention and reaction towards different signals and objects, where different traffic violations are weighted to have different scores; i.e. missing a red light or stop sign is a severe traffic violation compared to missing a road bump sign or other less severe traffic information.
As an alternative, the driving performance score could be calculated based on the observed driving behavior (including the driver's gaze positions with respect to the given situation and the driver's reaction) relative to the rules and regulations stated by the traffic authorities.
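As an illustration of the two scoring alternatives above, the following minimal Python sketch computes both the plain observation ratio and a severity-weighted variant. The object categories and weight values are assumptions chosen for the example, not values given in this document.

```python
# Illustrative sketch of weighted driving-performance scoring.
# All weights and object categories are assumptions, not values from the text.
SEVERITY_WEIGHTS = {
    "red_light": 10.0,      # severe: missing a red light
    "stop_sign": 10.0,      # severe: missing a stop sign
    "pedestrian": 8.0,
    "speed_limit_sign": 5.0,
    "road_bump_sign": 1.0,  # less severe traffic information
}

def driving_performance_score(observations):
    """observations: list of (object_category, was_observed) tuples.

    Returns the plain ratio of observed objects (sixth method step) and a
    severity-weighted variant of the same ratio.
    """
    if not observations:
        return 1.0, 1.0
    observed = sum(1 for _, seen in observations if seen)
    plain_ratio = observed / len(observations)

    total = sum(SEVERITY_WEIGHTS.get(cat, 1.0) for cat, _ in observations)
    earned = sum(SEVERITY_WEIGHTS.get(cat, 1.0) for cat, seen in observations if seen)
    return plain_ratio, earned / total

# Example: the driver missed the stop sign but saw everything else.
obs = [("stop_sign", False), ("speed_limit_sign", True), ("pedestrian", True)]
print(driving_performance_score(obs))  # -> (0.666..., 0.565...)
```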
The invention further relates to a system for calculating a driving performance observation for a driver driving a vehicle, comprising means for identifying objects, means for registering the position of the eye gaze of the driver, and means for registering the position of the vehicle, arranged to at least one computer, where the computer is arranged with at least one microprocessor and an arrangement for visualizing the driving behavior and presenting a driving performance score and/or, on a display, visualizing at least one driving performance observation.
According to further aspects of the improved system for calculating a driving performance observation for a driver driving a vehicle, provision is made as follows:
- the means for identifying the objects is a digital camera arranged with software for pattern recognition;
- the means for registering the positions of the eyes of the driver is arranged in glasses or goggles arranged on the driver;
- the means for registering the position of the vehicle is a GPS receiver and IMU arranged in the vehicle;
- a 360-degree camera is arranged on the vehicle.
ADVANTAGES AND EFFECTS OF THE INVENTION

The disclosed method and system describe how to automatically calculate the driving performance of a person under test, such as a driving school student or another individual driving a vehicle.
By making a better estimation of the ability of the person under test to adhere to speed limits, it is also possible to evaluate how environmentally friendly the driving performance of the person under test is.
BRIEF DESCRIPTION OF FIGURES

Different embodiments of the invention are described in more detail below with reference to the attached figures, in which:

Fig. 1 shows method steps for calculating a driving performance score for a driver according to an embodiment of the invention.
Fig. 2 shows a user environment for calculating a driving performance score for a driver according to an embodiment of the invention.
Fig. 3 shows a realization of the use of the system for calculating a driving performance score for a driver according to an embodiment of the invention.
Fig. 4 shows a presentation view for calculating a driving performance score for a driver according to an embodiment of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS

Fig. 1 shows the method steps of the method for calculating a driving performance score for a driver driving a vehicle 1. The driving performance score is determined based upon driving performance observations. The method comprises four initial, preferably parallel, information-acquiring steps. For readability, these steps are disclosed sequentially, but they could also be performed in another sequence or as a parallel process.
Vehicles could include, but are not limited to, cars, motorcycles, bicycles, scooters, trucks, and different kinds of work machines/heavy machinery such as excavators, forklifts, heavy vehicles, wheel loaders, etc.
The first method step, Register an image 10, registers an image, preferably in the form of a digital image. After an image is registered, for example by a digital camera, the information in the image, preferably in a digital format, is provided to be processed by the method. The method is preferably implemented in software, and the software is arranged to be processed in a computer including a microprocessor or other central processing unit (CPU), such as a digital signal processor (DSP) or graphics processing unit (GPU).

In the second method step, Identify visual information in the image 20, information such as objects is identified. By identifying an object in registered visual information, i.e. the registered image, information about different events occurring during driving can be characterized. A method of image recognition or pattern recognition is used to identify objects in the image.
Examples of objects that could be identified include road signs, pedestrians, cyclists, and vehicles. Both moving vehicles and parked, or otherwise not moving, vehicles could be identified. Examples of vehicles that could be identified include motorcycles, bicycles, scooters, cars, and different kinds of work machines/heavy machinery such as excavators, wheel loaders, etc. Object identities are stored in a calculating unit or storage unit and are time stamped and position stamped, i.e. stored with a specific time and position.
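A minimal sketch of such time- and position-stamped storage follows; the record layout and field names are illustrative assumptions, not part of the disclosed system.

```python
# Hypothetical record structure for storing identified objects together
# with their time stamp and the vehicle position at detection time.
from dataclasses import dataclass, field

@dataclass
class ObjectRecord:
    label: str        # e.g. "road_sign", "pedestrian", "vehicle"
    timestamp: float  # seconds since the start of the drive
    latitude: float   # vehicle position when the object was identified
    longitude: float
    bbox: tuple       # (x, y, w, h) of the object in the registered image

@dataclass
class ObservationStore:
    records: list = field(default_factory=list)

    def store(self, label, timestamp, latitude, longitude, bbox):
        """Store an object identity with its time and position stamp."""
        self.records.append(ObjectRecord(label, timestamp, latitude, longitude, bbox))

store = ObservationStore()
store.store("stop_sign", 12.4, 59.27, 15.21, (410, 220, 60, 60))
```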
Image recognition

Pattern recognition is the automated recognition of patterns and regularities in data. As an example, an image sensor, such as a digital camera, could register images. The images could be registered as individual images or as a stream of images registered with a predetermined frequency. The images are preferably represented by digital data, and the digital data could be utilized to identify information in the digital data, for example by utilizing machine learning.
Machine learning algorithms build a mathematical model based on sample data, known as training data, in order to make predictions and/or decisions based upon the information provided in the training data. One specific example of machine learning is supervised learning. Supervised learning algorithms build a mathematical model of a set of data that contains both the inputs and the desired outputs. The data is known as training data and consists of a set of training examples. Each training example has one or more inputs and the desired output, also known as a supervisory signal. In the mathematical model, each training example is represented by an array or vector, sometimes called a feature vector, and the training data is represented by a matrix. Through iterative optimization of an objective function, supervised learning algorithms learn a function that can be used to predict the output associated with new inputs. An optimal function will allow the algorithm to correctly determine the output for inputs that were not a part of the training data. An algorithm that improves the accuracy of its outputs or predictions over time is said to have learned to perform that task.
Types of supervised learning algorithms include neural networks, support vector machines, classification and regression. Classification algorithms are used when the outputs are restricted to a limited set of values, and regression algorithms are used when the outputs may have any numerical value within a range.
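As a toy illustration of supervised learning, the sketch below fits a support vector machine classifier on labelled feature vectors and predicts the label of an input that was not part of the training data; the feature values and labels are invented for the example.

```python
# Minimal supervised-learning sketch: a classifier is fitted on training
# examples (feature vector + desired output) and then predicts unseen inputs.
from sklearn.svm import SVC

X_train = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.2]]  # feature vectors
y_train = ["road_sign", "road_sign", "vehicle", "vehicle"]  # desired outputs

clf = SVC()                # a support vector machine classifier
clf.fit(X_train, y_train)  # learn a function from the training data

# Predict the output for an input not seen during training.
print(clf.predict([[0.15, 0.85]]))  # -> ['road_sign']
```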
In the third method step, Register the position of the driver's eye gaze 30, the position of the eye gaze of the driver is registered. Preferably, means for eye tracking are utilized, but a conventional image of the eyes of the driver could also be used to identify the position of the eyes. By identifying and storing information about the position of the driver's eyes, the attention of the driver can be determined, i.e. does the driver look at specific relevant objects as they occur? The position of the eyes is stored in a calculating unit or storage unit and is time stamped and/or position stamped, i.e. stored with a specific time and/or position.
Eye tracking

The position of the eyes of the driver is preferably measured by eye tracking. Eye tracking is the process of measuring the point of gaze, i.e. where an individual is looking, and/or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Methods to track eye gaze include video images, from which the eye position can be extracted. Other methods use search coils and/or are based on the electrooculogram (EOG). The methods can thus be divided into three categories: (i) measurement of the movement of an object (normally a special contact lens) attached to the eye; (ii) optical tracking without direct contact with the eye; and (iii) measurement of electric potentials using electrodes placed around the eyes.
The preferred method is the non-contact, optical method for measuring eye motion. Light, typically in the infrared wavelength range, is reflected from the eye and sensed by a video camera or some other specially designed optical sensor. The information detected/recorded by the sensor is then analyzed to extract eye rotation from changes in reflections. Video-based eye trackers typically use the corneal reflection, also known as the first Purkinje image, and the center of the pupil as features to track over time. A more sensitive type of eye tracker, known as the dual-Purkinje eye tracker, uses reflections from the front of the cornea, also known as the first Purkinje image, and/or the back of the lens, also known as the fourth Purkinje image, as features to track. An alternative, and more sensitive, method of tracking is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. Optical methods, particularly those based on video recording, are commonly used for gaze tracking and have the benefit of being non-invasive and inexpensive.
One type of eye tracking device utilizes video-based eye tracking. At least one camera or other sensor focuses on one or both eyes and records eye movement as the viewer, the driver, looks at some kind of stimulus such as an object or, e.g., a road sign. One implementation of an eye-tracker device uses the center of the pupil and infrared/near-infrared non-collimated light to create corneal reflections (CR). The vector between the pupil center and the corneal reflections can be used to compute the point of regard on a surface or the gaze direction.
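The following simplified sketch illustrates the pupil-centre/corneal-reflection idea: the vector between the two features is mapped to a point of regard through a calibration. The affine least-squares calibration used here is a deliberate simplification, assuming at least three calibration points; real trackers typically fit higher-order polynomials per user, and all function names are illustrative.

```python
# Sketch of pupil-centre / corneal-reflection (PCR) gaze estimation.
# The affine calibration is an assumption for illustration only.
import numpy as np

def pcr_vector(pupil_center, corneal_reflection):
    """Vector from the corneal reflection to the pupil centre, in pixels."""
    return np.asarray(pupil_center, float) - np.asarray(corneal_reflection, float)

def calibrate(pcr_vectors, known_gaze_points):
    """Least-squares affine map from PCR vectors to points of regard.

    pcr_vectors: (n, 2) array, known_gaze_points: (n, 2) array, n >= 3.
    """
    V = np.hstack([np.asarray(pcr_vectors, float),
                   np.ones((len(pcr_vectors), 1))])
    coeffs, *_ = np.linalg.lstsq(V, np.asarray(known_gaze_points, float), rcond=None)
    return coeffs  # shape (3, 2)

def point_of_regard(pcr, coeffs):
    """Map one PCR vector to an estimated point of regard on the surface."""
    return np.append(pcr, 1.0) @ coeffs
```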
Two general types of infrared/near-infrared, also known as active light, eye-tracking techniques are used: bright-pupil and dark-pupil. The difference between bright-pupil and dark-pupil eye tracking is based on the location of the illumination source with respect to the optics. If the illumination is coaxial with the optical path, i.e. bright-pupil eye tracking, the eye acts as a retroreflector as the light reflects off the retina. If the illumination source is offset from the optical path, i.e. dark-pupil eye tracking, the pupil appears dark because the retroreflection from the retina is directed away from the camera.
Bright-pupil tracking creates greater iris/pupil contrast, allowing more robust eye tracking with all iris pigmentation, and greatly reduces interference caused by eyelashes and other obscuring features. It also allows tracking in lighting conditions ranging from total darkness to very bright.
An alternative method is known as passive light. This method uses visible light for illumination. A drawback of this method is that the contrast of the pupil is lower than in the active light methods; therefore, the center of the iris is used for calculating the vector instead. This calculation needs to detect the boundary of the iris and the white sclera (limbus tracking).
The sampling rate is preferably above 30 Hz; a range of 50 Hz to 60 Hz is more common, and systems are available with sampling rates up to 1250 Hz. A high sampling rate can be used to capture fixational eye movements or to correctly measure saccade dynamics.
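One common way to use such sampled gaze data is a velocity-threshold (I-VT) classifier that separates saccades from fixations; the sketch below is a generic example, and the 30 deg/s threshold is a conventional default rather than a value taken from this text.

```python
# Velocity-threshold (I-VT) classification of gaze samples.
import numpy as np

def classify_saccades(gaze_deg, sample_rate_hz, threshold_deg_per_s=30.0):
    """gaze_deg: (n, 2) array of gaze angles in degrees.

    Returns a boolean array, True where a sample belongs to a saccade.
    At 50-60 Hz a saccade spans only a few samples, which is why higher
    sampling rates help when measuring saccade dynamics.
    """
    gaze_deg = np.asarray(gaze_deg, float)
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * sample_rate_hz
    return np.concatenate([[False], velocity > threshold_deg_per_s])
```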
Scanpaths, i.e. the movements of the eye during pattern perception, are useful for analyzing cognitive intent, interest, and salience. Biological factors, such as gender and age, may affect the scanpath.
In the fourth method step, Register the position (orientation and velocity) of the vehicle 40, the position, velocity and orientation of the vehicle are determined and registered. As an example, a GPS receiver could be used to determine the position and velocity, and these could be stored in a calculating unit, such as a computer or other device. For determining the orientation, an inertial measurement unit, IMU, is used to track the vehicle dynamics. With an IMU, a lane change or a sharp turn can be identified, which is not possible using GPS alone. The fourth method step thus identifies and stores the position, orientation and velocity of the driven vehicle.
Position

The position of the vehicle is preferably measured by a positioning system such as GPS or GLONASS or another satellite positioning system. The position could also be measured and/or determined by an inertial navigation system, INS, or another system incorporating an accelerometer, gyro or other motion and/or rotation sensors.
The position could also be determined based upon geographical information in a map.
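For illustration, vehicle speed can be derived from two consecutive GPS fixes using the haversine great-circle distance; the sketch below is a simplified example and the helper names are assumptions.

```python
# Deriving vehicle speed from consecutive GPS fixes (simplified sketch).
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mps(fix1, fix2):
    """Each fix is (timestamp_s, lat, lon); returns speed in m/s."""
    (t1, lat1, lon1), (t2, lat2, lon2) = fix1, fix2
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)

print(speed_mps((0.0, 59.2700, 15.2100), (1.0, 59.2701, 15.2100)))  # ~11 m/s
```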
The four method steps above serve to acquire information for use in the process; there is no need to perform the steps sequentially, and preferably the method steps are performed simultaneously.
In the fifth method step, Calculate observations based upon comparing visual information and the position of the driver's eyes for a specific position 50, an observation is calculated in a calculation unit, such as a computer. An observation, also known as a driving performance observation, is an instance, for a certain position or for a series of consecutive positions, where a specific object is observed or not observed by the driver.
The method, or algorithm, to determine whether the driver observed an object could, for example, require that the eyes of the driver be directed at the object for a certain time. For a specific instance in time or position, the position of the driver's eyes is compared to the position of an identified object. If the eyes of the driver are determined to be directed at the object, the specific observation is classified as an observation where the driver observes the object. If the eyes of the driver are determined not to be directed at the object, the specific observation is classified as an observation where the driver did not observe the object. Measured velocity and orientation of the vehicle could also be used to calculate the observation.
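A minimal sketch of such an observation rule follows; the 100 ms dwell threshold, the image-space bounding-box test, and the data layout are assumptions for illustration.

```python
# Sketch of classifying a driving performance observation: an object counts
# as observed if the driver's gaze rests on it for a minimum dwell time.
MIN_DWELL_S = 0.10  # assumed threshold, not a value from the text

def gaze_inside(gaze_xy, bbox):
    """True if a gaze point (image coordinates) lies inside an object's box."""
    x, y, w, h = bbox
    return x <= gaze_xy[0] <= x + w and y <= gaze_xy[1] <= y + h

def classify_observation(gaze_samples, bbox, sample_rate_hz):
    """gaze_samples: (x, y) gaze points recorded while the object was visible.

    Returns True (object observed) if the accumulated gaze time on the
    object reaches MIN_DWELL_S, otherwise False (object not observed).
    """
    hits = sum(1 for g in gaze_samples if gaze_inside(g, bbox))
    return hits / sample_rate_hz >= MIN_DWELL_S
```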
In the sixth method step, Calculate a score based upon at least two observations 60, a score is calculated based upon the registered observations. In one example, the score could be the percentage expressed as the number of observations where the driver observes the object in relation to the total number of observations. For example, if the score is 1, or 100%, the driver observed the object in all observations.
In the seventh method step, Visualization of driving behavior 70, the results from the score and/or other information are visualized for the user of the method for calculating a driving performance score for a driver driving a vehicle 1. An example of the visualization of the driving behavior is shown in fig. 4.
Fig. 2 shows a user environment for calculating a driving performance score for a driver 100. An eye tracking device 105 is arranged on the driver 104; as an alternative, the eye tracking device could be located in the car but not arranged on the driver 104. A number of traffic signs 102 and/or signals 102 are detected and classified by the method for calculating a driving performance score for a driver driving a vehicle 1. The sensor for detecting the traffic signs 102 and/or signals 102 is shown as a camera 106 in the shown embodiment but could be another image-registering sensor. An additional sensor, shown as a 360-degree camera 110 in the shown embodiment of the invention, is used to record the complete traffic situation for the entire surrounding traffic environment in the plane on which the vehicle is located. A further sensor, shown as a GPS receiver 112 in the shown embodiment of the invention, is used to track the position of the vehicle and also to calculate the speed and/or acceleration. In addition to the GPS receiver 112, an IMU could be arranged in the vehicle to improve the accuracy of position, velocity, and orientation information. While driving, the driver 104 focuses and/or fixates the eyes on different objects in the driving environment, and the eye tracking device identifies and records the eye gaze fixations. In the figure, the driver fixates on the sign 108. The information on the actual time and/or position of the recorded fixation is preferably stored and visualized to describe the driver's driving behavior and/or driving performance.
Fig. 3 shows a realization of the use of the system for calculating a driving performance score for a driver. In the realization 200, a driving student and/or learner 202 and an instructor and/or examiner 204 are shown in a vehicle. While driving, the student 202 is wearing an arrangement for eye tracking 206; the arrangement for eye tracking could also be located in the car to continuously register the position of the driver's eyes. The instructor 204 instructs 208 the driver 202 while driving, and it is possible to provide direct feedback, where the information for the feedback is presented on the visualization device 210, illustrated in this specific realization as a tablet. The visualization device is preferably a tablet or a smartphone, but the information could also be visualized directly on the in-vehicle infotainment system or another system and/or device suitable for presenting information.
Fig. 4 shows a presentation view for calculating a driving performance score for a driver. The shown embodiment of a presentation view 300 is presented on a visualization device 210. The visualization device 210 is arranged with a display for showing and/or displaying information. The presentation view is not limited to the disclosed visualization and could include alternative information. For visualization devices with smaller screens, such as a smartphone, a reduced set of information could be presented. In the presented view, according to fig. 4, identified traffic signs and/or signals 302 are visualized by a frame or other means to highlight the information.
Furthermore, the eye gaze fixations 304 are also visualized. The traffic signs and/or signals 302 and the eye gaze fixations 304 are preferably displayed with different colors and/or symbols. The sensor 306 for capturing the visual information, such as video, is preferably located in the front of the car to record the same view as the driver sees while driving. In the presented view, the actual speed and the speed limit 308 could be presented. The speed limit is determined based upon image recognition of speed limit signs and/or stored information for the specific position.
Specific filters could be utilized to focus on or highlight specific characteristics of the driving. A filter select box 314 could be implemented, as shown in the presentation view 300, that determines which segments of the recorded video of the driving are highlighted. A video select box 310 could be implemented, as shown in the presentation view 300, where the user could select which part of the video, as highlighted by the filter selection, to view in more detail. A driving score box 312 could be implemented in the presentation view, as shown in fig. 4, where the driving score and other information relating to the driving score, such as an error report, could be presented.
The presentation view could also include information from a driving learning book and/or other information for learning driving skills; specific parts and/or segments from the driving school book could be visualized when a specific event occurs. An elevated view 316 could also be implemented in the presentation view 300 to show the exact traffic situation in a holistic view. Information for the elevated view 316 is preferably provided by the 360-degree camera 110.
At least one of the eye tracking device 105, the GPS receiver 112, the IMU, the camera 106, and the 360-degree camera 110 is in communicative contact with the visualization device 210, for example by a wireless connection, such as Bluetooth, Wi-Fi or another wireless radio protocol, and/or by a physical cable connection utilizing the communication bus system of the vehicle. Storage of information and/or necessary computation functionality could be arranged in the eye tracking device 105, the GPS receiver 112, the IMU, the camera 106, the 360-degree camera 110 and/or the computational device. The software for arranging the presentation view is preferably arranged in the visualization device 210. Other software could be arranged in the visualization device 210 and/or the eye tracking device 105, the GPS receiver 112, the IMU, the camera 106, and/or the 360-degree camera 110.
ALTERNATIVE EMBODIMENTS

The invention is not limited to the particular embodiments shown but can be varied in different ways within the scope of the patent claims.

Claims (11)

1. Method for calculating a driving performance observation for a driver driving a vehicle comprising the following method steps: i.) identify and store information about an object in registered visual information, ii.) identify and store information about the position of the driver's eye gaze, iii.) identify and store the position of the driven vehicle, iv.) calculate observations based upon comparing the visual information and the position of the driver's eyes for a specific position of the vehicle.

2. Method for calculating a driving performance observation for a driver driving a vehicle according to claim 1, wherein the objects are at least one of: at least one road sign, at least one traffic signal, at least one pedestrian, at least one vehicle.

3. Method for calculating a driving performance observation for a driver driving a vehicle according to any of the preceding claims, wherein the observations are visually presented on a display.

4. Method for calculating a driving performance observation for a driver driving a vehicle according to any of the preceding claims, wherein the observation is calculated in that, for a specific instance in time or position, the position of the driver's eyes is compared to the position of an identified object.

5. Method for calculating a driving performance observation for a driver driving a vehicle according to any of the preceding claims, wherein a driving performance score is calculated based upon at least two observations.

6. Method for calculating a driving performance observation for a driver driving a vehicle according to claim 5, wherein the driving performance score is calculated as a relation of observations where the driver observes an object in relation to the total number of observations.

7. System for calculating a driving performance observation for a driver driving a vehicle comprising means for identifying objects, means for registering the position of the eye gaze of the driver, and means for registering the position of the vehicle, arranged to at least one computer, where the computer is arranged with at least one microprocessor and an arrangement for visualizing the driving behavior and presenting a driving performance score and/or, on a display, visualizing at least one driving performance observation.

8. System for calculating a driving performance observation for a driver driving a vehicle according to claim 7, wherein the means for identifying the objects is a digital camera arranged with software for pattern recognition.

9. System for calculating a driving performance observation for a driver driving a vehicle according to any of claims 7-8, wherein the means for registering the positions of the eyes of the driver is arranged in glasses or goggles arranged on the driver.

10. System for calculating a driving performance observation for a driver driving a vehicle according to any of claims 7-9, wherein the means for registering the position of the vehicle is a GPS receiver and/or an IMU arranged in the vehicle.

11. System for calculating a driving performance observation for a driver driving a vehicle according to any of claims 7-10, wherein a 360-degree camera is arranged on the vehicle.
SE2030301A 2020-09-30 2020-09-30 Method and system for driving skill feedback SE2030301A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2030301A SE2030301A1 (en) 2020-09-30 2020-09-30 Method and system for driving skill feedback
PCT/SE2021/050948 WO2022071851A1 (en) 2020-09-30 2021-09-29 Method and system for driving skill feedback

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2030301A SE2030301A1 (en) 2020-09-30 2020-09-30 Method and system for driving skill feedback

Publications (1)

Publication Number Publication Date
SE2030301A1 (en) 2022-03-31

Family

ID=80950758

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2030301A SE2030301A1 (en) 2020-09-30 2020-09-30 Method and system for driving skill feedback

Country Status (2)

Country Link
SE (1) SE2030301A1 (en)
WO (1) WO2022071851A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024046066A (en) * 2022-09-22 2024-04-03 マツダ株式会社 Driver state determination device
CN115691267A (en) * 2022-11-18 2023-02-03 天津五八驾考信息技术有限公司 Driving simulation control method, system, driving simulator and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000051097A2 (en) * 1999-02-26 2000-08-31 Intel Corporation Operator training system
EP2564766A1 (en) * 2011-09-02 2013-03-06 Volvo Car Corporation Visual input of vehicle operator
US20140350777A1 (en) * 2013-05-27 2014-11-27 Fujitsu Limited Apparatus for diagnosing driving behavior, method for diagnosing driving behavior, and program thereof
EP2893867A1 (en) * 2014-01-09 2015-07-15 Harman International Industries, Incorporated Detecting visual inattention based on eye convergence
DE102014105374A1 (en) * 2014-04-15 2015-10-15 Deutsches Zentrum für Luft- und Raumfahrt e.V. Driver assistance system
US20160297449A1 (en) * 2015-04-08 2016-10-13 Robert Bosch Gmbh Method and device for detecting the alertness of a vehicle driver
US9586591B1 (en) * 2015-05-04 2017-03-07 State Farm Mutual Automobile Insurance Company Real-time driver observation and progress monitoring
WO2017105333A1 (en) * 2015-12-15 2017-06-22 Greater Than Ab Method and system for assessing the trip performance of a driver
US20190278268A1 (en) * 2018-03-08 2019-09-12 Steering Solutions Ip Holding Corporation Driver readiness assessment system and method for vehicle
US20190318180A1 (en) * 2018-04-17 2019-10-17 GM Global Technology Operations LLC Methods and systems for processing driver attention data
US20190374151A1 (en) * 2018-06-08 2019-12-12 Ford Global Technologies, Llc Focus-Based Tagging Of Sensor Data
US10625745B1 (en) * 2019-01-07 2020-04-21 Sean Tremblay Automated driver's exam system
WO2020122986A1 (en) * 2019-06-10 2020-06-18 Huawei Technologies Co.Ltd. Driver attention detection using heat maps

Also Published As

Publication number Publication date
WO2022071851A1 (en) 2022-04-07

Similar Documents

Publication Publication Date Title
US11583221B2 (en) Cognitive impairment diagnostic apparatus and cognitive impairment diagnostic program
Wang et al. Driver fatigue detection: a survey
Kapitaniak et al. Application of eye-tracking in drivers testing: A review of research
Shinoda et al. What controls attention in natural environments?
US20220061724A1 (en) Systems and methods for assessing user physiology based on eye tracking data
KR100416152B1 (en) Operator training system
Biswas et al. Detecting drivers’ cognitive load from saccadic intrusion
JP2017505733A (en) Method and apparatus for detecting safe driving state of driver
SE2030301A1 (en) Method and system for driving skill feedback
US20020188219A1 (en) Method and apparatus for inferring physical/mental fitness through eye response analysis
Jansen et al. Does agreement mean accuracy? Evaluating glance annotation in naturalistic driving data
Bergasa et al. Visual monitoring of driver inattention
Guasconi et al. A low-cost implementation of an eye tracking system for driver's gaze analysis
Reimer et al. Detecting eye movements in dynamic environments
Wang et al. Driver fatigue detection technology in active safety systems
Bergasa et al. Real-time system for monitoring driver vigilance
Biswas et al. Characterizing drivers’ peripheral vision via the functional field of view for intelligent driving assistance
Cheng et al. Active heads-up display based speed compliance aid for driver assistance: A novel interface and comparative experimental studies
Li et al. A driving attention detection method based on head pose
Oberlin et al. Designing Real-time Observation System to Evaluate Driving Pattern through Eye Tracker
Dixit et al. Face detection for drivers’ drowsiness using computer vision
US20240350051A1 (en) Systems and methods for using eye imaging on a wearable device to assess human health
US12133567B2 (en) Systems and methods for using eye imaging on face protection equipment to assess human health
US20240156189A1 (en) Systems and methods for using eye imaging on face protection equipment to assess human health
Gondi et al. Voice Assistant for Driver Drowsiness Detection

Legal Events

Date Code Title Description
NAV Patent application has lapsed