GB2563871A - Control system - Google Patents


Info

Publication number: GB2563871A
Authority: GB (United Kingdom)
Prior art keywords: user, vehicle, control system, predetermined, gaze
Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Application number: GB1710302.9A
Other versions: GB201710302D0 (en), GB2563871B (en)
Inventors: Singh Harpreet, Skrypchuk Lee, Thomas Philip, Rosa Crundall Elizabeth
Assignee (current and original): Jaguar Land Rover Ltd (the listed assignees may be inaccurate)
Application filed by Jaguar Land Rover Ltd
Priority to GB1710302.9A (GB2563871B)
Priority to DE102018209838.7A (DE102018209838A1)
Priority to US16/019,097 (US20190001883A1)
Publication of GB201710302D0, GB2563871A and GB2563871B
Application granted
Current legal status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/10
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F02COMBUSTION ENGINES; HOT-GAS OR COMBUSTION-PRODUCT ENGINE PLANTS
    • F02NSTARTING OF COMBUSTION ENGINES; STARTING AIDS FOR SUCH ENGINES, NOT OTHERWISE PROVIDED FOR
    • F02N11/00Starting of engines by means of electric motors
    • F02N11/08Circuits or control means specially adapted for starting of engines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Ophthalmology & Optometry (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • General Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Mathematical Physics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A control system (Fig. 2, 200) for enabling operation of a vehicle comprises a gaze sensing means (Fig. 2, 120) configured to detect a gaze characteristic 420 of a user of the vehicle. A primary indication of intent or desire of the user to operate the vehicle is identified 440 when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period. The operation of the vehicle is enabled in dependence on the identified primary indication of intent. The user gaze characteristic may include the eye movement of the user or an iris feature. The gaze sensing means may detect the user's gaze on a target location in the interior of the vehicle. The system may detect a secondary indication of intent of the user to operate the vehicle if a user input complies with a predetermined condition for a second predetermined time period, and may enable the operation of the vehicle based on the first and second indications of intent. In particular, the invention relates to enabling the vehicle to be started.

Description

CONTROL SYSTEM
TECHNICAL FIELD
The present disclosure relates to a control system. Particularly, but not exclusively, the disclosure relates to a control system for enabling operation of a vehicle. Aspects of the invention relate to a controller, to a control system, to a method, to a computer program product and to a vehicle.
BACKGROUND
There is a need to improve how operation of a vehicle is enabled by a user.
Known systems for enabling operation of a vehicle include a key fob and an actuation button and, optionally, require the engagement of either a brake pedal or a clutch pedal.
The above-described known systems can inconvenience the user and can be unreliable, thus decreasing the quality of user interaction with the vehicle.
At least in certain embodiments, the present invention seeks to overcome or ameliorate at least some of the shortcomings of prior art arrangements.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system for enabling operation of a vehicle, a vehicle comprising a control system for enabling operation of a vehicle, a controller for a control system enabling operation of a vehicle, a method for enabling operation of a vehicle, and a computer program product as claimed in the appended claims.
According to an aspect of the invention there is provided a control system for enabling operation of a vehicle, the control system comprising: a camera; a processor communicatively coupled to the camera; and a controller communicatively coupled to the processor; wherein: the camera is configured to detect a gaze characteristic of a user of the vehicle and send data representative of the gaze characteristic to the processor; the processor is configured to identify a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period; and the controller is configured to generate an activation signal to enable operation of the vehicle in dependence on the identified primary indication of intent.
According to another aspect of the invention, there is provided a control system for enabling operation of a vehicle. The control system comprises a gaze sensing means. The control system comprises a processor communicatively coupled to the gaze sensing means. The control system comprises a controller communicatively coupled to the processor.
The gaze sensing means is configured to detect a gaze characteristic of a user of the vehicle. The gaze sensing means is configured to send data representative of the gaze characteristic to the processor.
The processor is configured to identify a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period.
The controller is configured to generate an activation signal to enable operation of the vehicle in dependence on the identified primary indication of intent.
Identifying a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period reduces the likelihood of identifying a false positive, and thus provides a more reliable system for enabling operation of a vehicle. In particular, having a first predetermined time period during which a user gaze characteristic has to comply with a predetermined gaze characteristic takes into account the actual intent of a user to operate the vehicle, and avoids the unnecessary sending of the activation signal if a user gaze characteristic matches a predetermined gaze characteristic by mistake or unintentionally.
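The dwell-time check described above can be sketched in code. This is an illustrative Python sketch only, not the claimed implementation; the class name, the sample-based interface and the 3-second default dwell time are assumptions for illustration.

```python
class PrimaryIntentDetector:
    """Identifies a primary indication of intent when the user's gaze
    complies with a predetermined gaze characteristic continuously for
    the first predetermined time period (the dwell time)."""

    def __init__(self, dwell_s: float = 3.0):
        self.dwell_s = dwell_s          # first predetermined time period
        self._compliant_since = None    # time at which compliance began

    def update(self, gaze_compliant: bool, now_s: float) -> bool:
        """Feed one gaze sample; returns True once intent is identified."""
        if not gaze_compliant:
            self._compliant_since = None  # any lapse resets the timer
            return False
        if self._compliant_since is None:
            self._compliant_since = now_s
        return (now_s - self._compliant_since) >= self.dwell_s
```

Because any lapse in compliance clears the timer, an accidental glance at the target never accumulates towards the first predetermined time period, which is the false-positive protection the paragraph above describes.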
Optionally, the activation signal enables the vehicle to transition from a lower power mode to a higher power mode. Optionally, the activation signal enables activation of an electric machine of the vehicle. Optionally, the activation signal enables the vehicle to transition into a mode that allows a user to drive the vehicle. Optionally, the activation signal allows the vehicle to transition from a stationary state to a moving state.
Optionally, the controller may be configured to generate a feedback signal to enable feedback to be provided to the user based on the user gaze characteristic. Optionally, the feedback is at least one of the following: visual feedback, audio feedback, and haptic feedback.
Providing feedback to a user increases the level of user interaction with the system and thus, improves user engagement with the system so as to more accurately identify the user's intent to operate the vehicle. The feedback may helpfully instruct the user to comply with the predetermined gaze characteristic for the first predetermined time period.
Optionally, upon identification by the processor that the user gaze characteristic complies with the predetermined gaze characteristic, the controller is configured to generate a feedback signal to enable feedback representative of approaching a threshold associated with the first predetermined time period.
Optionally, the threshold may be equal to the first predetermined time period.
Optionally, the feedback representative of approaching a threshold associated with the first predetermined time period may be one or more of: a visual timer on a display module of the vehicle; an audio count down or count up; an increasing or decreasing amplitude or frequency of haptic feedback.
Optionally, upon identification by the processor that the user gaze characteristic does not comply with the predetermined gaze characteristic within the first predetermined time period, then the controller is configured to reset the feedback signal.
Optionally, the user gaze characteristic comprises eye movement of the user and the predetermined gaze characteristic comprises predetermined eye movement of the user.
Optionally, the gaze characteristic of a user comprises a gaze direction of the user and the predetermined gaze characteristic comprises a predetermined gaze direction to a predetermined target location, T.
Optionally, the predetermined target location, T, may be chosen by the user.
Optionally, the first predetermined time period is in one or more of the following ranges: 1 second to 10 seconds; 1 second to 5 seconds; 2 seconds to 5 seconds; and 2 seconds to 4 seconds.
The first predetermined time period can be set long enough to identify a user's intent to operate the vehicle more accurately and to avoid recording a false positive, which would lead to unnecessarily enabling operation of the vehicle by generating an activation signal. It can also be set short enough to allow ease of use and a quick process for enabling operation of the vehicle.
Optionally, the predetermined target location, T, is an interior location of the vehicle.
Optionally, the interior location is associated with one of the following: an instrument cluster, a steering wheel, a mirror, a dashboard, a glove box, a screen, a portion of a windshield, or an air vent.
Optionally, the user gaze characteristic comprises an iris feature of the user and the predetermined gaze characteristic comprises a predetermined iris feature.
Optionally, the gaze sensing means comprises at least one camera.
Optionally, the control system comprises an input sensing means communicatively coupled to the processor. The input sensing means may be configured to detect an input from the user and send data representative of the input to the processor. The processor may be configured to identify a secondary indication of intent of the user to operate the vehicle when the input from the user complies with a predetermined input condition for a second predetermined time period. The controller may be configured to generate the activation signal in dependence on the primary indication of intent and the secondary indication of intent.
Identifying a secondary indication of intent beneficially more accurately identifies a user's intent to operate the vehicle and, in doing so, reduces the likelihood of incorrect interpretation that a user wishes to operate the vehicle and unnecessary generation of an activation signal. A more reliable system is therefore provided.
Optionally, the length of the second predetermined time period may be dependent on the proportion of the first predetermined time period that has already passed at the time that the input sensing means detects an input from the user. Optionally, the second predetermined time period may be reduced if the proportion of the first predetermined time period that has already passed falls into at least one of the following ranges: greater than or equal to 75%; greater than or equal to 65%; and greater than or equal to 50%. Optionally, the degree of reduction of the second predetermined time period may fall into at least one of the following ranges: greater than or equal to 20%; greater than or equal to 40%; greater than or equal to 60%.
In this way, the second predetermined time period may be set so that a user is not required to wait as long before operation of the vehicle is enabled if a higher proportion of the first predetermined time period has already passed. This creates a sophisticated system that more accurately pre-empts a user's intent to operate the vehicle, whilst also being easy to use and providing a quick process to enable operation of the vehicle.
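One way to realise the graduated reduction described above is a tiered mapping from elapsed fraction to degree of reduction. The pairing of thresholds with reduction degrees below is one illustrative choice from the claimed ranges, not a mapping the patent itself fixes.

```python
def reduced_second_period(base_second_s: float,
                          first_elapsed_fraction: float) -> float:
    """Shorten the second predetermined time period according to the
    proportion of the first predetermined time period that has already
    passed when the user input is detected.

    Illustrative tiers: >=75% elapsed -> 60% reduction,
    >=65% -> 40%, >=50% -> 20%, otherwise no reduction.
    """
    if first_elapsed_fraction >= 0.75:
        reduction = 0.60
    elif first_elapsed_fraction >= 0.65:
        reduction = 0.40
    elif first_elapsed_fraction >= 0.50:
        reduction = 0.20
    else:
        reduction = 0.0
    return base_second_s * (1.0 - reduction)
```

So a user whose gaze has already been held for most of the first period waits only a fraction of the nominal second period before the vehicle is enabled.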
Optionally, the input sensing means is one of the following: a capacitive sensor; a pressure sensor; and an audio sensor.
Optionally, the input sensing means is associated with one of the following: a clutch pedal; a brake pedal, and a steering wheel.
Optionally, the controller is configured to generate the activation signal in dependence on the first predetermined time period and the second predetermined time period satisfying an overlap time period. Using an overlap time period more accurately identifies a user's intent to operate the vehicle, and, in this way, provides a third indication of intent. The predetermined overlap time period requires the user gaze characteristic to comply with the predetermined user gaze characteristic at the same time, and for a certain time period, as the user input complies with a predetermined input condition, and, thus, beneficially increases the reliability of the control system.
Optionally, the length of the overlap time period may be dependent on the proportion of the first predetermined time period that has passed. Optionally, the overlap time period may be reduced if the proportion of the first predetermined time period that has already passed falls into at least one of the following ranges: greater than or equal to 75%; greater than or equal to 65%; and greater than or equal to 50%.
Optionally, the length of the overlap time period may be dependent on the proportion of the second time period that has passed. Optionally, the overlap time period may be reduced if the proportion of the second predetermined time period that has already passed falls into at least one of the following ranges: greater than or equal to 75%; greater than or equal to 65%; and greater than or equal to 50%.
The length of the overlap time period in which (1) the user gaze characteristic must comply with a predetermined gaze characteristic; and (2) the user input must comply with a predetermined input condition, can change dynamically dependent on real-time actions of the user and can thus, beneficially result in quicker and more reliable identification of a user's intent to operate the vehicle.
Optionally, the overlap time period may be equal to the shorter of the first predetermined time period and the second predetermined time period.
The overlap time period being equal to the shorter of the first and second predetermined time periods can reduce the amount of time a user is required to wait to operate the vehicle, whilst also increasing the certainty of the intent of the user to operate the vehicle.
Optionally, the overlap time period is in one or more of the following ranges: 1 second to 10 seconds; 1 second to 5 seconds; 2 seconds to 5 seconds; and 2 seconds to 4 seconds.
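The overlap requirement can be expressed as a simple interval intersection. The sketch below assumes that gaze compliance and input compliance are each tracked as a (start, end) time interval; per the optional embodiment above, the required overlap may be taken as the shorter of the two predetermined periods.

```python
def overlap_satisfied(gaze_start: float, gaze_end: float,
                      input_start: float, input_end: float,
                      required_overlap_s: float) -> bool:
    """True if the interval in which the user gaze characteristic
    complies and the interval in which the user input complies overlap
    for at least the required overlap time period."""
    overlap = min(gaze_end, input_end) - max(gaze_start, input_start)
    return overlap >= required_overlap_s


def required_overlap(first_period_s: float, second_period_s: float) -> float:
    """Optional embodiment: the overlap time period equals the shorter
    of the first and second predetermined time periods."""
    return min(first_period_s, second_period_s)
```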
Optionally, the controller is configured to generate a feedback signal to enable feedback representative of approaching a threshold associated with the overlap time period.
Optionally, the control system comprises a pre-controller in which the pre-controller is configured to identify an authorised user within the vehicle. The controller may be configured to generate the activation signal after the pre-controller identifies an authorised user.
The identification of an authorised user before generation of the activation signal increases the security of the control system by ensuring the vehicle is not being operated by an unauthorised user.
Optionally, the controller may be configured to not generate the activation signal before the pre-controller identifies an authorised user.
Optionally, the controller may be configured to wait to receive an identification signal from the pre-controller before generating the activation signal.
Optionally, the pre-controller is configured to detect an identification module within the vehicle.
Optionally, the identification module is a key fob or smart key.
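The authorisation gate described in the preceding paragraphs can be sketched as a controller that withholds the activation signal until it has received the pre-controller's identification signal (e.g. after a key fob is detected in the cabin). The class and method names are illustrative assumptions.

```python
class GatedController:
    """Generates the activation signal only after the pre-controller
    has identified an authorised user within the vehicle."""

    def __init__(self):
        self._authorised = False

    def on_identification_signal(self):
        """Called when the pre-controller identifies an authorised user."""
        self._authorised = True

    def request_activation(self, primary_intent: bool) -> bool:
        # No activation signal is generated before authorisation,
        # even if a primary indication of intent has been identified.
        return primary_intent and self._authorised
```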
According to another aspect of the invention, there is provided a vehicle comprising a control system according to an abovementioned aspect of the invention.
According to a further aspect of the invention, there is provided a method for enabling operation of a vehicle. The method comprises detecting a user gaze characteristic of a user of the vehicle. The method comprises identifying a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period. The method comprises generating an activation signal to enable operation of the vehicle in dependence on the primary indication of intent.
Optionally, the method comprises detecting an input from the user of the vehicle. Optionally, the method comprises identifying a secondary indication of intent of the user to operate the vehicle when the input from the user complies with a predetermined input condition for a second predetermined time period. Optionally, the method comprises generating the activation signal to enable operation of the vehicle in dependence on the primary indication of intent and the secondary indication of intent.
According to a further aspect of the invention, there is provided a controller for a control system for enabling operation of a vehicle. The controller is configured to receive an input signal indicative of a primary indication of intent of a user to operate a vehicle. The primary indication of intent may be determined when a user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period. The controller may be configured to generate an activation signal to enable operation of the vehicle in dependence on the received input signal.
In an embodiment, the abovementioned controller comprises a processor having an electronic processor including an electrical input for receiving data representative of the gaze characteristic; an electronic memory device electrically coupled to the electronic processor and having instructions stored therein, wherein the processor is configured to access the memory device and execute the instructions stored therein such that it is operable to identify the primary indication of intent of the user to operate the vehicle in dependence on the user gaze characteristic complying with a predetermined gaze characteristic for a first predetermined time period, and to generate an activation signal to enable operation of the vehicle in dependence on said identification; and an electrical output configured to output the activation signal.
According to a further aspect of the present invention there is provided a computer program product comprising instructions which, when a program of the program product is executed by a computer, cause the computer to carry out the method of an above aspect of the present invention. The computer program product may be downloadable from a communication network and/or stored on a computer-readable and/or microprocessor-executable medium.
According to a further aspect of the present invention there is provided a non-transitory computer-readable medium having stored thereon the computer program product of a foregoing aspect of the invention.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a controller in accordance with an embodiment of the invention.
Figure 2 is a schematic diagram of a control system including the controller of Figure 1, in accordance with an embodiment of the invention.
Figure 3 is a flow chart of a method in accordance with an embodiment of the invention.
Figure 4 is a flow chart of a method of a further embodiment of the invention.
Figure 5 is a schematic diagram of the control system of Figure 2, in accordance with an embodiment of the invention.
Figure 6 is a schematic illustration of the functions occurring within the set-up described in relation to Figure 5, in accordance with an embodiment of the invention.
Figure 7 is a schematic diagram showing the method of Figure 4 in more detail, in accordance with an embodiment of the invention.
Figure 8 is a flow chart illustrating a further method that can be used in conjunction with the method of Figure 4 and Figure 7, in accordance with an embodiment of the invention.
Figure 9A is an illustration of a human machine interface in accordance with an embodiment of the invention.
Figure 9B is an illustration of a human machine interface in accordance with an embodiment of the invention.
Figure 9C is an illustration of a human machine interface in accordance with an embodiment of the invention.
Figure 9D is an illustration of a human machine interface in accordance with an embodiment of the invention.
Figure 9E is an illustration of a human machine interface in accordance with an embodiment of the invention.
Figure 9F is an illustration of a human machine interface in accordance with an embodiment of the invention.
Figure 10 is a side view of a vehicle in accordance with an embodiment of the invention.
DETAILED DESCRIPTION
Figure 1 is a schematic diagram of a controller 160. The controller 160 receives an input signal 130 indicative of a primary indication of intent of a user to operate a vehicle 300 (Figure 10). The primary indication of intent is determined when a user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period. The controller 160 generates an activation signal 150 to enable operation of the vehicle 300 in dependence on the received input signal 130.
Figure 2 is a schematic diagram of a control system 200 including the controller 160 of Figure 1. The control system 200 is provided for enabling operation of a vehicle 300 (Figure 10). The control system 200 comprises a gaze sensing means 120, a processor 140 communicatively coupled to the gaze sensing means 120, and the controller 160 of Figure 1 communicatively coupled to the processor 140.
The gaze sensing means 120 detects a gaze characteristic of a user of the vehicle 300 and sends data representative of the gaze characteristic to the processor 140.
The processor 140 identifies a primary indication of intent of the user to operate the vehicle 300 when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period. The processor 140 sends a signal 130, indicative of a primary indication of intent of the user to operate the vehicle 300, to the controller 160.
As described in relation to Figure 1, the controller 160 generates an activation signal 150 to enable operation of the vehicle 300 in dependence on the identified primary indication of intent.
Identifying a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period reduces the likelihood of identifying a false positive, and thus provides a more reliable system for enabling operation of a vehicle. In particular, having a first predetermined time period during which a user gaze characteristic has to comply with a predetermined gaze characteristic takes into account the actual intent of a user to operate the vehicle, and avoids the unnecessary sending of the activation signal if a user gaze characteristic matches a predetermined gaze characteristic by mistake or unintentionally.
The controller 160 generates a feedback signal 170 (Figure 5) to enable feedback to be provided to the user based on the user gaze characteristic. The feedback is at least one of the following: visual feedback, audio feedback, and haptic feedback (discussed later in more detail). The controller 160 may generate the feedback signal 170 in response to identification of a primary indication of intent or in response to the user gaze characteristic. Alternatively, the controller 160 may generate the feedback signal 170 when a primary indication of intent is not identified.
Providing feedback to a user increases user interaction with the control system 200 and thus, improves user engagement with the control system 200. The feedback may helpfully instruct the user to perform actions to enable operation of the vehicle 300. This saves time when operating the vehicle 300.
When the feedback signal 170 is generated in response to the user gaze characteristic the feedback signal 170 may enable feedback representative of approaching a threshold associated with the first predetermined time period. The threshold is equal to the first predetermined time period. In this way a user is informed of how much longer the user is required to maintain the user gaze characteristic in order to enable operation of the vehicle 300.
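A visual-timer style of this threshold feedback might look as follows. The message strings and the one-decimal formatting are illustrative assumptions, not part of the claimed system.

```python
def countdown_feedback(elapsed_s: float, dwell_s: float) -> str:
    """Feedback representative of approaching the threshold associated
    with the first predetermined time period: tells the user how much
    longer the gaze must be held before operation is enabled."""
    remaining = max(dwell_s - elapsed_s, 0.0)
    if remaining == 0.0:
        return "vehicle enabled"
    return f"hold gaze for {remaining:.1f}s"
```

An audio count-down or haptic feedback of increasing amplitude could be driven from the same `remaining` value.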
Upon identification by the processor 140 that the user gaze characteristic does not comply with the predetermined gaze characteristic within the first predetermined time period, the controller 160 resets the feedback signal 170.
The gaze characteristic of a user comprises a gaze direction of the user and the predetermined gaze characteristic comprises a predetermined gaze direction to a predetermined target location, T. Additionally or alternatively, the user gaze characteristic is eye movement of the user and the predetermined gaze characteristic is predetermined eye movement of the user. Additionally or alternatively, the gaze characteristic is an iris feature of the user and the predetermined gaze characteristic is a predetermined iris feature.
The predetermined target location, T, may be chosen by the user.
The first predetermined time period is from 2 seconds up to and including 4 seconds. However, the first predetermined time period may be in one or more of the following ranges: 1 second to 10 seconds; 1 second to 5 seconds; and 2 seconds to 5 seconds.
The duration (e.g. the number of seconds) of the first predetermined time period can be set long enough to avoid recording a false positive and unnecessarily initiating operation of the vehicle 300 through generating an activation signal 150. In addition, the duration of the first predetermined time period is set short enough to allow for ease of use by a user and a short activation process of the vehicle 300.
Figure 3 is a flow chart of a method 400. Method 400 is for enabling operation of a vehicle 300. Method 400 comprises detecting 420 a user gaze characteristic of a user of the vehicle 300, identifying 440 a primary indication of intent of the user to operate the vehicle 300 when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period, and generating 460 an activation signal 150 to enable operation of the vehicle 300 in dependence on the primary indication of intent.
The control system 200 of Figure 2 may be modified slightly to include an input sensing means 180 (Figure 5) referred to as a control system 200'. The input sensing means 180 of the control system 200' is communicatively coupled to the processor 140. The input sensing means 180 detects an input from the user and sends data representative of the input to the processor 140. The processor 140 identifies a secondary indication of intent of the user to operate the vehicle 300 when the input from the user complies with a predetermined input condition for a second pre-determined time period. The controller 160 generates the activation signal 150 in dependence on the primary indication of intent and the secondary indication of intent.
Identifying a secondary indication of intent beneficially reduces incorrect interpretation that a user wishes to operate the vehicle 300, and, thus, decreases unnecessary generation of an activation signal 150. A more reliable system is therefore provided.
The second predetermined time period may be dependent on the proportion of the first predetermined time period that has already passed at the time that the input sensing means 180 detects an input from the user. In this way, a user is not required to wait the full second predetermined time period before operation of the vehicle 300 is enabled.
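One plausible reading of this proportional shortening can be sketched as follows. The linear scaling below is an assumption for illustration; the text does not fix the exact formula, and the function name is invented.

```python
def remaining_second_period(second_period_s: float,
                            first_elapsed_s: float,
                            first_period_s: float) -> float:
    """Hypothetical sketch: shorten the second predetermined time period
    by the fraction of the first predetermined time period that had
    already passed when the input was detected."""
    fraction_done = min(first_elapsed_s / first_period_s, 1.0)
    return second_period_s * (1.0 - fraction_done)
```

Under this reading, a user who is already halfway through the gaze dwell when the pedal input arrives would only need to hold the input for half of the second predetermined time period.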
The input sensing means 180 is one of the following: a capacitive sensor; a pressure sensor; and an audio sensor. The input sensing means 180 is associated with one of the following: a clutch pedal; a brake pedal, and a steering wheel.
Figure 4 is a flow chart of a method 400' of a further embodiment of the invention. Method 400' of Figure 4 is a method using the aforementioned modified control system 200'. The method 400' comprises detecting 420 a user gaze characteristic of a user of the vehicle 300 and detecting 480 an input from the user of the vehicle 300. The method 400' optionally comprises providing 530 user feedback (see dashed line) dependent on the detected user gaze characteristic and the input from the user. The circumstances under which feedback is provided to the user were discussed briefly in relation to Figure 2 and are discussed in more detail in relation to Figure 7 and Figures 9A to 9E.
The method 400' further comprises identifying 440 a primary indication of intent (as described in relation to Figure 3) and identifying 490 a secondary indication of intent of the user to operate the vehicle 300 when the input from the user complies with a predetermined input condition for a second predetermined time period. Generating 460 the activation signal 150 to enable operation of the vehicle 300 is in dependence on the primary indication of intent and the secondary indication of intent.
Figure 5 is a schematic diagram of the control system 200' within a vehicle 300.
The control system 200' is coupled to components of vehicle 300 (Figure 10).
In more detail, the gaze sensing means 120 is a user monitoring camera system. The user monitoring camera system is coupled to the Interior Sensing Platform Electronic Control Unit (ISP ECU) 360 of vehicle 300. The ISP ECU 360 is coupled to the processor 140.
The input sensing means 180 is coupled to the Body Control Module (BCM) 320 of vehicle 300. The BCM 320 is coupled to the processor 140. In one embodiment, the input sensing means 180 is a capacitive sensor and is associated with the clutch pedal of vehicle 300.
The user monitoring camera system 120 performs detecting 420 a user gaze characteristic. Data representative of the user gaze characteristic is sent by the ISP ECU 360 to the processor 140.
The input sensing means 180 performs the detecting 480 of a user input. Data representative of the user input is then sent by the BCM 320 to the processor 140.
The processor 140 compares the user gaze characteristic data to a predetermined user gaze characteristic. The processor 140 compares the user input data to a predetermined user input condition.
The processor 140 identifies 440 a primary indication of intent when the user gaze characteristic complies with the predetermined user gaze characteristic for a first predetermined time period. The processor 140 identifies 490 a secondary indication of intent when the user input complies with a predetermined input condition for a second predetermined time period.
The processor 140 sends a signal 130 to the controller 160 in dependence on the identification of the primary and secondary indications of intent.
In dependence on receipt of signal 130, the controller 160 generates an activation signal 150 to enable operation of the vehicle 300. In this embodiment, the activation signal 150 enables an engine start request which is sent to the BCM 320 of vehicle 300.
The controller 160 generates 510 a feedback signal 170 when either the primary indication of intent or the secondary indication of intent is not identified. The feedback signal 170 enables feedback to be provided 530 (Figure 6) to the user. In this case, the feedback is visual feedback provided as a human machine interface. The feedback is displayed on an IPC display module 380 of the vehicle 300. The type of feedback provided to the user is discussed in more detail in relation to Figure 7 and Figures 9A-9E.
Figure 6 is a schematic illustration of the functions occurring within the set up described in relation to Figure 5. That is, each box of the schematic illustration of Figure 6 represents a function occurring.
At 502, user gaze characteristic data is sent from the ISP ECU 360. At 504, foot-on-pedal data is sent from the BCM 320. In this embodiment, the foot-on-pedal data is received by the input sensing means 180 of Figure 5.
The user gaze characteristic data and the foot-on-pedal data are used to establish the primary and secondary indications of intent of a user to operate the vehicle 300. In this case, the user gaze characteristic data and the foot-on-pedal data are used to establish a user's intent to start an engine of vehicle 300, at 440, 490.
As discussed in relation to Figure 5, once the primary and secondary indications of intent of the user to operate the vehicle 300 are identified, the controller 160 generates an activation signal 150, at 460 of Figure 6. The activation signal 150 enables a request to start the engine of vehicle 300 to be generated 550.
If either the primary indication of intent or the secondary indication of intent is not identified, the controller 160 generates a feedback signal 170 in order to provide 530 feedback to the user, at 510 and 530 of Figure 6. The feedback is provided to the user as a human machine interface (HMI) on an IPC display module 380.
The type of feedback provided to the user is discussed in more detail in relation to Figure 7 and Figures 9A-9E.
Figure 7 is a flow chart illustrating in more detail the method 400' of Figure 4 and also the feedback referred to in relation to Figure 6, 510, 530.
In common with Figure 6, user gaze characteristic data and foot-on-pedal data are retrieved 502, 504. Feedback is then provided 530A to the user. In line with the embodiment of Figures 5 and 6, the feedback is provided as a human machine interface. In particular, the feedback provided 530A is human machine interface 1 (Figure 9B).
Next, the user gaze characteristic data is compared to a predetermined gaze characteristic. If the user gaze characteristic complies with the predetermined gaze characteristic the "Yes" branch is followed and the user is provided with further feedback 530B. The feedback provided 530B is visual feedback as a human machine interface, that is, human machine interface 2 (Figure 9C).
Next, a comparison of the user input to a predetermined input condition is carried out and if the user input complies with the predetermined input condition the "Yes" branch is followed and further feedback is provided 530C to the user. Feedback of 530C is a human machine interface, human machine interface 3 (Figure 9D).
Next, a determination 520 is made of whether the user gaze characteristic complies with the predetermined gaze characteristic for a first predetermined time period and the user input complies with a predetermined input condition for a second predetermined time period. Following the "Yes" branch, an activation signal is generated 460 to enable the user to operate the vehicle. In this case, primary and secondary indications of intent of the user to operate the vehicle are identified and a request 550 to start the engine of vehicle 300 is made.
Following the "No" branch, further feedback is provided 530E to the user. The further feedback may instruct the user so that the first and second predetermined time periods may be satisfied. For example, the further feedback may be human machine interface 2 (Figure 9C) or human machine interface 3 (Figure 9D).
Looking back to the earlier "No" branches of the flow chart, if it is determined that the user gaze characteristic does not comply with the predetermined gaze characteristic or that the foot is not on the pedal (the user input does not comply with the predetermined input condition), further feedback is provided 530D to the user. That is, human machine interface 3' is provided to the user. Human machine interface 3' has a counter. Human machine interface 3' may also include instructions to the user so that the predetermined gaze characteristic and the predetermined input condition can be complied with. As an example, the counter of human machine interface 3' counts up to make the user aware of the increasing time the user has to wait to enable operation of the vehicle. If the counter equals a predetermined threshold, that is, if enough time passes, further feedback is provided to the user, 530E. The further feedback may reset the counter of human machine interface 3' or may be human machine interface 1 or 2.
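The count-up behaviour behind human machine interface 3' might be sketched like this. The class name and the threshold value are illustrative assumptions; the text specifies only that the counter rises until a predetermined threshold, then resets.

```python
class WaitCounter:
    """Sketch of the count-up counter behind human machine interface 3':
    it rises while the gaze or pedal condition remains unmet; on reaching
    the threshold, feedback escalates and the counter resets."""

    def __init__(self, threshold_s: float = 10.0):  # threshold is illustrative
        self.threshold_s = threshold_s
        self.count_s = 0.0

    def tick(self, dt_s: float) -> bool:
        """Advance the counter; return True when the threshold is reached,
        signalling that further feedback (530E) should be provided and
        the counter reset."""
        self.count_s += dt_s
        if self.count_s >= self.threshold_s:
            self.count_s = 0.0
            return True
        return False
```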
However, if the user gaze characteristic complies with the predetermined gaze characteristic and the foot-on-pedal complies with the predetermined input condition before the counter equals the predetermined threshold, the feedback provided 530C to the user is human machine interface 3 (discussed in relation to Figure 9D), which flows onto the assessment of whether the first and second predetermined time periods are complied with (that is, identification of primary and secondary indications of intent).
Figure 8 is a schematic illustration of a power check method 450 that can occur within either method 400 of Figure 3 or method 400' of Figures 4 and 7.
At 540, a power check is made to determine whether the power mode of the vehicle 300 is above a predetermined threshold. In this case, the predetermined threshold is a power level below the power level of the vehicle when the vehicle is activated, for example power level 6. If the "Yes" branch is followed, vehicle 300 is already activated and, consequently, a request to start the engine is redundant. If the "No" branch is followed, the power mode of the vehicle 300 is low enough to allow an engine start 550. After the engine start 550, feedback provided 530 to the user is updated to the human machine interface of Figure 9E.
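The power check might be expressed as follows, using the example power level 6 as the threshold. A simple integer power-mode scale and the function name are assumptions for illustration.

```python
def engine_start_allowed(power_mode: int, threshold: int = 6) -> bool:
    """Sketch of power check 540: if the power mode is above the threshold,
    the vehicle is already activated and a start request would be
    redundant; otherwise the engine start 550 may proceed."""
    return power_mode <= threshold
```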
Figures 9A-9E are examples of visual feedback provided to the user throughout the start-up process of the vehicle 300.
Figure 9A is an example display shown on a human machine interface before initiation of the start-up process has occurred.
Figure 9B is an example of feedback provided 530 to the user. Figure 9B is human machine interface 1 (referred to in relation to Figure 7) and provides instruction to the user.
Figure 9C is human machine interface 2 (Figure 7) and illustrates visual feedback representative of approaching the threshold associated with the first predetermined time period. The visual feedback is a counter that, once started, counts down to the end of the first predetermined time period. Alternatively, the counter may count down to a threshold that is equal to the sum of the first and second predetermined time periods.
Figure 9D is an example of feedback provided 530 to the user. Figure 9D is human machine interface 3. The circular shape in Figure 9D is the counter of Figure 9C, which has started counting down to inform the user that enabling of vehicle operation is approaching.
Human machine interface 3 may be modified slightly to include a counter that increases, giving human machine interface 3'.
Figure 9E is an example of the feedback provided 530 to the user after a request to start the engine of vehicle 300 has been generated.
Figure 9F is an example of a human machine interface provided to the user when the identification module of the user, i.e. the smart key, is not recognised by the vehicle 300.
Figure 10 is a side view of a vehicle in accordance with an embodiment of the invention. Vehicle 300 includes the control system 200 of Figure 2.
The predetermined target location, T, is an interior location of the vehicle 300 and is illustrated by the dashed line.
Specifically, the interior target location, T, is set to overlay the instrument cluster. However, alternatively, the interior target location T may be set to overlay one of the following: a steering wheel, a mirror, a dashboard, a glove box, a screen, a portion of a windshield, or an air vent.
The gaze sensing means 120 is a camera. The camera 120 is located centrally within the target location, T. In a slight variation, the gaze sensing means 120 may be two or more cameras. Each camera of the gaze sensing means may be offset from the target location, T, or at least one camera may be located within the target location T.
Optionally, the control system 200 comprises a pre-controller 165 in which the pre-controller 165 is configured to identify an authorised user within the vehicle 300 and where the controller 160 is configured to generate the activation signal 150 after the pre-controller 165 identifies an authorised user.
The identification of an authorised user before generation of the activation signal 150 increases the security of the control system 200 by ensuring the vehicle 300 is not being operated by an unauthorised user.
In an embodiment, the pre-controller 165 is configured to detect an identification module within the vehicle 300. The identification module is a key fob or smart key. If a smart key is not detected the user is presented with feedback. For example, the human machine interface of Figure 9F.
Notwithstanding that some of the foregoing embodiments describe sending an engine start request in dependence on issuance of the activation signal 150, the skilled person will readily appreciate that this is not intended to limit the present invention to vehicles comprising internal combustion (IC) engines. In this respect the term ‘engine’ is intended to cover all vehicle prime-movers, for example, but not limited to, electrical machines. According to certain embodiments, issuance of the activation signal enables activation of an electric machine of the vehicle; alternatively the activation signal enables the vehicle to transition from a lower power mode to a higher power mode; further alternatively, the activation signal enables the vehicle to transition into a mode that allows a user to drive the vehicle. Optionally, the activation signal allows the vehicle to transition from a stationary state to a moving state.
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.

Claims (24)

1. A control system for enabling operation of a vehicle, the control system comprising: a gaze sensing means; a processor communicatively coupled to the gaze sensing means; and a controller communicatively coupled to the processor; wherein: the gaze sensing means is configured to detect a gaze characteristic of a user of the vehicle and send data representative of the gaze characteristic to the processor; the processor is configured to identify a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period; and the controller is configured to generate an activation signal to enable operation of the vehicle in dependence on the identified primary indication of intent.
2. The control system of claim 1, wherein the controller is configured to generate a feedback signal to enable feedback to be provided to the user based on the user gaze characteristic.
3. The control system of claim 2, wherein the feedback is at least one of the following: visual feedback, audio feedback, and haptic feedback.
4. The control system of claim 3, wherein upon identification by the processor that the user gaze characteristic complies with the predetermined gaze characteristic, the controller is configured to generate a feedback signal to enable feedback representative of approaching a threshold associated with the first predetermined time period.
5. The control system of claim 4, wherein upon identification by the processor that the user gaze characteristic does not comply with the predetermined gaze characteristic within the first predetermined time period, the controller is configured to reset the feedback signal.
6. The control system of any preceding claim, wherein the user gaze characteristic comprises eye movement of the user and the predetermined gaze characteristic comprises predetermined eye movement of the user.
7. The control system of any preceding claim, wherein the gaze characteristic of a user comprises a gaze direction of the user and the predetermined gaze characteristic comprises a predetermined gaze direction to a predetermined target location, T.
8. The control system of any preceding claim, wherein the first predetermined time period is in one or more of the following ranges: 1 second to 10 seconds; 1 second to 5 seconds; 2 seconds to 5 seconds; and 2 seconds to 4 seconds.
9. The control system of claim 7 or 8, wherein the predetermined target location, T, is an interior location of the vehicle.
10. The control system of claim 9, wherein the interior location is associated with one of the following: an instrument cluster, a steering wheel, a mirror, a dashboard, a glove box, a screen, a portion of a windshield, or an air vent.
11. The control system of any preceding claim, wherein the user gaze characteristic comprises an iris feature of the user and the predetermined gaze characteristic comprises a predetermined iris feature.
12. The control system of any preceding claim, wherein the gaze sensing means comprises at least one camera.
13. The control system of any preceding claim comprising an input sensing means communicatively coupled to the processor; wherein: the input sensing means is configured to detect an input from the user and send data representative of the input to the processor; the processor is configured to identify a secondary indication of intent of the user to operate the vehicle when the input from the user complies with a predetermined input condition for a second pre-determined time period; and the controller is configured to generate the activation signal in dependence on the primary indication of intent and the secondary indication of intent.
14. The control system of claim 13, wherein the input sensing means is one of the following: a capacitive sensor; a pressure sensor; and an audio sensor.
15. The control system of claim 13 or claim 14, wherein the input sensing means is associated with one of the following: a clutch pedal; a brake pedal, and a steering wheel.
16. The control system of any preceding claim, comprising a pre-controller, wherein the pre-controller is configured to identify an authorised user within the vehicle; and wherein the controller is configured to generate the activation signal after the pre-controller identifies an authorised user.
17. The control system of claim 16, wherein the pre-controller is configured to detect an identification module within the vehicle.
18. The control system of claim 17, wherein the identification module is a key fob.
19. A vehicle comprising the control system of any of claims 1-18.
20. A method for enabling operation of a vehicle, the method comprising: detecting a user gaze characteristic of a user of the vehicle; identifying a primary indication of intent of the user to operate the vehicle when the user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period; and generating an activation signal to enable operation of the vehicle in dependence on the primary indication of intent.
21. The control method of claim 20, comprising: detecting an input from the user of the vehicle; identifying a secondary indication of intent of the user to operate the vehicle when the input from the user complies with a predetermined input condition for a second predetermined time period; and generating the activation signal to enable operation of the vehicle in dependence on the primary indication of intent and the secondary indication of intent.
22. A controller for a control system enabling operation of a vehicle, the controller configured to: receive an input signal indicative of a primary indication of intent of a user to operate the vehicle, wherein the primary indication of intent is determined when a user gaze characteristic complies with a predetermined gaze characteristic for a first predetermined time period; and generate an activation signal to enable operation of the vehicle in dependence on the received input signal.
23. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any one of claims 20 or 21.
24. A non-transitory computer-readable medium having stored thereon the computer program product of claim 23.
GB1710302.9A 2017-06-28 2017-06-28 Control system for enabling operation of a vehicle Active GB2563871B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
GB1710302.9A GB2563871B (en) 2017-06-28 2017-06-28 Control system for enabling operation of a vehicle
DE102018209838.7A DE102018209838A1 (en) 2017-06-28 2018-06-19 control system
US16/019,097 US20190001883A1 (en) 2017-06-28 2018-06-26 Control system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1710302.9A GB2563871B (en) 2017-06-28 2017-06-28 Control system for enabling operation of a vehicle

Publications (3)

Publication Number Publication Date
GB201710302D0 GB201710302D0 (en) 2017-08-09
GB2563871A true GB2563871A (en) 2019-01-02
GB2563871B GB2563871B (en) 2019-12-11

Family

ID=59523613

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1710302.9A Active GB2563871B (en) 2017-06-28 2017-06-28 Control system for enabling operation of a vehicle

Country Status (3)

Country Link
US (1) US20190001883A1 (en)
DE (1) DE102018209838A1 (en)
GB (1) GB2563871B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2582548A (en) * 2019-03-19 2020-09-30 Jaguar Land Rover Ltd Vehicle control system and method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110217197B (en) * 2019-06-12 2020-06-19 湖北东方星海科技实业有限公司 Automobile starting method based on iris recognition technology

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150084854A1 (en) * 2012-03-23 2015-03-26 Audi Ag Method for operating an operating device of a motor vehicle
US20150185834A1 (en) * 2013-12-26 2015-07-02 Theodore Charles Wingrove System and method for gaze tracking
US20150234459A1 (en) * 2014-01-24 2015-08-20 Tobii Technology Ab Gaze driven interaction for a vehicle
US20170235361A1 (en) * 2016-01-20 2017-08-17 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Interaction based on capturing user intent via eye gaze

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8493354B1 (en) * 2012-08-23 2013-07-23 Immersion Corporation Interactivity model for shared feedback on mobile devices
US10209791B2 (en) * 2012-05-22 2019-02-19 Kyocera Corporation Electronic device and panel device
US9626072B2 (en) * 2012-11-07 2017-04-18 Honda Motor Co., Ltd. Eye gaze control system
US9395816B2 (en) * 2013-02-28 2016-07-19 Lg Electronics Inc. Display device for selectively outputting tactile feedback and visual feedback and method for controlling the same
US20160180677A1 (en) * 2014-12-18 2016-06-23 Ford Global Technologies, Llc Apparatus for reducing driver distraction via short range vehicle communication
US9505413B2 (en) * 2015-03-20 2016-11-29 Harman International Industries, Incorporated Systems and methods for prioritized driver alerts
EP3415394B1 (en) * 2016-02-12 2023-03-01 LG Electronics Inc. User interface apparatus for vehicle, and vehicle
DK179823B1 (en) * 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
KR20170141484A (en) * 2016-06-15 2017-12-26 엘지전자 주식회사 Control device for a vehhicle and control metohd thereof
EP3264391A1 (en) * 2016-06-30 2018-01-03 Honda Research Institute Europe GmbH Method and system for assisting a driver in driving a vehicle and vehicle on which such system is mounted
US20180070388A1 (en) * 2016-09-02 2018-03-08 Hyundai America Technical Center, Inc System and method for vehicular and mobile communication device connectivity


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2582548A (en) * 2019-03-19 2020-09-30 Jaguar Land Rover Ltd Vehicle control system and method
GB2582548B (en) * 2019-03-19 2021-10-13 Jaguar Land Rover Ltd Vehicle control system and method

Also Published As

Publication number Publication date
US20190001883A1 (en) 2019-01-03
GB201710302D0 (en) 2017-08-09
DE102018209838A1 (en) 2019-01-03
GB2563871B (en) 2019-12-11
