GB2578910A - A controller for a vehicle - Google Patents


Info

Publication number
GB2578910A
GB2578910A (application GB1818515.7)
Authority
GB
United Kingdom
Prior art keywords
torque sensor
vehicle
sensor data
steering
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1818515.7A
Other versions
GB2578910B (en)
GB201818515D0 (en)
Inventor
Thaibu Amici-Langi Georges
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1818515.7A priority Critical patent/GB2578910B/en
Priority to GB2104421.9A priority patent/GB2590877B/en
Publication of GB201818515D0 publication Critical patent/GB201818515D0/en
Publication of GB2578910A publication Critical patent/GB2578910A/en
Application granted granted Critical
Publication of GB2578910B publication Critical patent/GB2578910B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D15/00 Steering not otherwise provided for
    • B62D15/02 Steering position indicators; Steering position determination; Steering aids
    • B62D15/025 Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B62 LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D MOTOR VEHICLES; TRAILERS
    • B62D6/00 Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • B62D6/08 Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits responsive only to driver input torque
    • B62D6/10 Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits responsive only to driver input torque characterised by means for sensing or determining torque
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0818 Inactivity or incapacity of driver

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

A controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device, the controller comprising: an input configured to receive torque sensor data from a torque sensor of the steering system, the sensor data being indicative of a torque being applied to the steering system; a processor including a neural network algorithm having an input layer and an output layer, the processor being configured to determine a plurality of parameters in dependence on the received torque sensor data, provide the plurality of parameters to the input layer, and execute the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters; and an output configured to transmit the output classification from the controller, the output classification being indicative of whether a hand of an occupant of the vehicle is in contact with the steering device. The processor may be configured to apply a frequency domain transform such as a Fast Fourier Transform. Another controller and a method are provided. A second invention is also claimed.

Description

A CONTROLLER FOR A VEHICLE
TECHNICAL FIELD
The present disclosure relates to a controller for a vehicle operable in an autonomous mode. Aspects of the invention relate to a controller, to a method, and to a vehicle.
BACKGROUND
It is desirable for a vehicle to be able to detect whether the vehicle driver has at least one hand on the vehicle steering wheel, i.e. to determine that the driver is in control of the vehicle.
Modern vehicles have increasing numbers of driver aids in the form of Advanced Driver Assistance Systems (ADAS). These systems are intended to assist the driver in controlling the vehicle, but not to take overall control of the vehicle. It is important that the driver does not become over-reliant on such systems and so it is useful to ensure that the driver's hands are on the steering wheel to determine that the driver is still in control of the vehicle when these systems are active.
Current trends in the industry have the aim of moving towards vehicles with autonomous modes in which the vehicle may take overall control in certain situations. For example, a vehicle may operate in an autonomous mode in which the vehicle is in overall control when the vehicle is travelling along a motorway or highway. When the vehicle approaches the end of the motorway or exits the motorway the driver may need to assume overall control of the vehicle. It is important in such a case to ensure that the driver has their hands on the steering wheel when the vehicle comes out of an autonomous mode.
It can be difficult to determine whether the driver's hands are in contact with the steering wheel. One current method includes providing a driver-facing camera on the vehicle dashboard, for example. This suffers the drawback that the camera position is rarely optimal for performing this function as it cannot obstruct the driver's view or any other vehicle function. As a result, the camera view to the steering wheel may be obstructed or even blocked such that a 'hands-on' determination cannot be made.
Another common method is to insert a proximity sensor around the fabric of the vehicle steering wheel which acts like a switch when the driver's hands contact the steering wheel; however, the underlying technology to implement this method can be unreliable. Another method is to sense a torque being applied to a torsion bar of the vehicle steering system, possibly by inducing a test force that is not perceptible by the driver; however, it is difficult to determine whether the sensed applied torque is from a driver's hands actuating the steering wheel or from another source. This determination can be particularly difficult when travelling off-road or on a rough or uneven surface as this can lead to torque fluctuations at the torsion bar sensor through the vehicle wheels.
It is an aim of the present invention to address one or more of the disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a controller for a vehicle operable in an autonomous mode, a method for a vehicle operable in an autonomous mode, and a vehicle comprising the controller, as claimed in the appended claims.
According to an aspect of the present invention there is provided a controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device. The controller may comprise an input configured to receive torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering system. The controller may comprise a processor including a neural network algorithm having an input layer and an output layer. The processor may be configured to determine a plurality of parameters in dependence on the received torque sensor data, provide the plurality of parameters to the input layer, and execute the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters. The controller may comprise an output configured to transmit the output classification from the controller, the output classification being indicative of whether a hand of an occupant of the vehicle is in contact with the steering device.
The present invention is advantageous in that an artificial intelligence machine learning algorithm is used to teach the controller the difference between a 'hands-on' situation and a 'hands-off' situation. In particular, the controller may be taught the differences between these two situations for a wide variety of different road and/or driving conditions by being taught pattern recognition for a hands-on situation versus a hands-off situation. For example, torque fluctuations in the steering system when driving off-road or on an uneven surface make it difficult to determine whether a vehicle occupant's hands are on the steering device; however, in the present invention the controller may be taught to distinguish between torque fluctuations caused by an uneven surface and torque fluctuations caused by the vehicle occupant's hands being in contact with the steering device. Expressed differently, the present invention is particularly good at dealing with the nonlinearity of the steering device coupled with interference from the road which could easily be mistaken for a hands-on situation.
The processor may be configured to apply a frequency domain transform to the received torque sensor data to determine one or more of the plurality of parameters.
The frequency domain transform may be a Fast Fourier Transform.
As the received torque sensor output signal comprises several sinusoidal waves embedded within each other, it is useful to consider the decomposition of the signal.
This highlights the difference between hands-on and hands-off situations in more depth. For example, differences between the magnitudes of the decomposed signals in the hands-on and hands-off cases mean that this can be a useful parameter to distinguish between the two situations.
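By way of illustration only, the decomposition described above can be sketched in Python with NumPy; the component frequencies, amplitudes and sampling rate below are arbitrary assumptions, not values taken from the invention:

```python
import numpy as np

# Synthesise a torque-like signal made of two sinusoids embedded within each
# other, then decompose it with an FFT. All signal parameters are arbitrary.
fs = 50.0                                   # assumed sampling rate (0.02 s period)
t = np.arange(0, 1.0, 1.0 / fs)             # one second of samples
torque = 0.8 * np.sin(2 * np.pi * 2.0 * t) + 0.3 * np.sin(2 * np.pi * 9.0 * t)

magnitudes = np.abs(np.fft.rfft(torque))
freqs = np.fft.rfftfreq(len(torque), d=1.0 / fs)

# The two strongest non-DC bins recover the embedded component frequencies,
# and their differing magnitudes show why magnitude is a useful parameter.
top_bins = np.argsort(magnitudes[1:])[-2:] + 1
print(sorted(freqs[top_bins].tolist()))     # → [2.0, 9.0]
```

Here the two components are cleanly separated; in practice the hands-on and hands-off cases differ in the pattern and size of such magnitudes rather than in two clean peaks.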
The plurality of parameters may include one or more frequency domain transform magnitudes of the torque sensor data.
The one or more frequency domain transform magnitudes may include a maximum frequency domain transform magnitude and/or a minimum frequency domain transform magnitude.
The plurality of parameters may include one or more frequency domain transform magnitude patterns of the torque sensor data.
The plurality of parameters may be based on a plurality of samples of the torque sensor data.
The plurality of parameters may include a plurality of torsion bar torque values indicative of a level of torque being applied to a torsion bar of the steering system.
The plurality of parameters may include at least one of: a maximum value of the torque sensor data; a root mean squared value of the torque sensor data; a ratio between a largest absolute value of the torque sensor data and the root mean squared value; a power of the torque sensor data; a variance of the torque sensor data; an average of the torque sensor data; a standard deviation of the torque sensor data; a first component of a principal component analysis of the torque sensor data; and, an upper and lower level of a waveform of the torque sensor data.
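A minimal sketch of computing several of the listed statistics from a window of torque samples follows (Python with NumPy; the function name and sample values are illustrative assumptions, not part of the claimed controller):

```python
import numpy as np

def torque_features(samples):
    """Compute a few of the candidate parameters from a window of
    torsion bar torque samples (illustrative only)."""
    rms = float(np.sqrt(np.mean(samples ** 2)))
    return {
        "max": float(np.max(samples)),
        "rms": rms,
        "crest_ratio": float(np.max(np.abs(samples)) / rms),  # largest |value| / RMS
        "power": float(np.mean(samples ** 2)),
        "variance": float(np.var(samples)),
        "mean": float(np.mean(samples)),
        "std": float(np.std(samples)),
    }

window = np.array([0.1, -0.2, 0.15, 0.3, -0.05])  # toy torque window, in Nm
features = torque_features(window)
```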
The input may be configured to receive at least one of steering angle data and steering wheel angle rate of change data. The processor may be configured to determine the plurality of parameters in dependence on the received steering angle data or steering wheel angle rate of change data.
It may be advantageous to provide the controller with other data in addition to torque sensor data in order to distinguish between a hands-on and hands-off situation.
The neural network algorithm may comprise at least one hidden layer comprising a plurality of neurons each having an associated activation function.
The activation function may be a sigmoid function.
Each connection between two of the neurons may have a first associated predetermined weight corresponding to a lateral control function of the vehicle not being activated and a second associated predetermined weight corresponding to the lateral control function being activated. The input may be configured to receive lateral control data from a lateral control system of the vehicle, the lateral control data being indicative of whether the lateral control function is activated. The processor may be configured to select the first or second set of predetermined weights for executing the neural network algorithm in dependence on the received lateral control data.
The difference in the received torque sensor output data in a hands-on situation between the lateral control function being activated or not may be significant and can make it difficult for the controller to accurately distinguish between a hands-on or hands-off situation. Therefore, it may be advantageous for the controller to be trained separately in dependence on whether the lateral control function is activated so as to improve the accuracy of the hands-on or hands-off determination.
Further different sets of weights for various different situations may be tuned and used by the processor. For example, different sets of weights for on-road versus off-road driving may be used or even different weights for various different types of identified terrain. Another possibility is to have different weights for automated versus manual steering.
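The selection between such tuned weight sets could, purely as an illustrative sketch, be a lookup on the relevant vehicle state flags (all names here are assumptions):

```python
# Illustrative sketch only: pick one of several pre-trained weight sets for
# the neural network based on vehicle state. The keys, names and the idea of
# a simple lookup are assumptions for illustration, not the claimed design.
WEIGHT_SETS = {
    ("lateral_control_on", "on_road"): "weights_lc_road",
    ("lateral_control_off", "on_road"): "weights_road",
    ("lateral_control_on", "off_road"): "weights_lc_offroad",
    ("lateral_control_off", "off_road"): "weights_offroad",
}

def select_weights(lateral_control_active, off_road):
    key = (
        "lateral_control_on" if lateral_control_active else "lateral_control_off",
        "off_road" if off_road else "on_road",
    )
    return WEIGHT_SETS[key]

print(select_weights(True, False))   # → weights_lc_road
```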
The neural network algorithm may include only one hidden layer. This may reduce the chance of overfitting.
The output classification may be sent to a hands-on determination module that is configured to make a determination of whether the vehicle occupant's hand is in contact with the steering device in dependence on the output classification and at least one other signal indicative of whether the vehicle occupant's hand is in contact with the steering device.
As it is becoming increasingly important from a legislative point of view for manufacturers to be able to accurately distinguish between hands-on and hands-off situations, it may be advantageous to corroborate the determination of the current controller with a similar determination made via a different method, for example a steering device-facing camera or a steering device proximity sensor.
The output may be configured to transmit the output classification to an advanced driver-assistance system of the vehicle.
According to another aspect of the present invention there is provided a controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device. The controller may comprise an input configured to receive steering sensor data from a steering sensor of the steering system, the steering sensor data being indicative of a level of actuation of the steering system. The controller may comprise a processor including a neural network algorithm having an input layer and an output layer. The processor may be configured to determine a plurality of parameters in dependence on the received steering sensor data, provide the plurality of parameters to the input layer, and execute the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters. The controller may comprise an output configured to transmit the output classification from the controller, the output classification being indicative of whether a hand of an occupant of the vehicle is in contact with the steering device.
The steering sensor data may include at least one of: torque sensor data; steering angle data; and, steering wheel angle rate of change data.
According to another aspect of the present invention there is provided a controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device. The controller may comprise an input configured to receive torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering device. The controller may comprise a processor configured to determine a plurality of parameters in dependence on the received torque sensor data, convert the plurality of parameters to a received multi-dimensional data point for a cluster model comprising a first cluster of pre-determined multi-dimensional data points indicative that a vehicle occupant's hand is not in contact with the steering device and a second cluster of predetermined multi-dimensional data points indicative that the vehicle occupant's hand is in contact with the steering device, calculate a distance metric for the first and second clusters indicative of a distance between the received multi-dimensional data point and the respective cluster, execute a clustering algorithm to the cluster model in dependence on the calculated distance metrics to assign the received multidimensional data point to the first cluster or to the second cluster to determine an indication of whether the vehicle occupant's hand is in contact with the steering device.
The controller may comprise an output configured to transmit the output classification from the controller.
This controller is advantageous in that there is a reduced training or calibration burden for such a classification algorithm, and so more efficient classification may be achieved.
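As an illustrative sketch of such a cluster model, a nearest-centroid assignment is shown below; the two-dimensional points and the choice of Euclidean distance are assumptions, since the summary does not fix the feature space or the distance metric:

```python
import numpy as np

# Illustrative nearest-centroid sketch of the described cluster model: assign
# a new parameter vector to whichever pre-determined cluster has the nearer
# centre. The 2-D points and Euclidean distance are arbitrary assumptions.
hands_off_cluster = np.array([[0.1, 0.2], [0.0, 0.3], [0.2, 0.1]])
hands_on_cluster = np.array([[1.0, 1.2], [0.9, 1.1], [1.1, 0.9]])

def assign_cluster(point):
    d_off = np.linalg.norm(point - hands_off_cluster.mean(axis=0))
    d_on = np.linalg.norm(point - hands_on_cluster.mean(axis=0))
    return "hands_on" if d_on < d_off else "hands_off"

print(assign_cluster(np.array([0.95, 1.0])))   # → hands_on
```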
Embodiments of the invention may provide a controller that combines the use of a neural network algorithm and of a cluster model as described above to determine whether the vehicle occupant's hand is in contact with the steering device. For example, the output classification of the neural network algorithm may be based on steering wheel torque sensor data, and the cluster model output classification may be based on steering wheel angle data and/or steering wheel angle rate of change data. An overall classification may then be determined based on these two output classifications.
According to another aspect of the present invention there is provided a vehicle comprising a controller as described above.
The vehicle may be operable in an off-road mode.
The steering system may be an electronic power assisted steering system.
The steering system torque sensor may be a torsion bar torque sensor. The controller may receive the sensor data from a CAN bus signal of the vehicle.
According to another aspect of the present invention there is provided a method for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device. The method may comprise receiving torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering device. The method may comprise providing a neural network algorithm having an input layer and an output layer. The method may comprise determining a plurality of parameters in dependence on the received torque sensor data. The method may comprise providing the plurality of parameters to the input layer. The method may comprise executing the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters. The method may comprise transmitting the output classification, the output classification being indicative of whether a vehicle occupant's hand is in contact with the steering device.
According to another aspect of the present invention there is provided a method for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device. The method may comprise receiving steering sensor data from a steering sensor of the steering system, the steering sensor data being indicative of a level of actuation of the steering system. The method may comprise providing a neural network algorithm having an input layer and an output layer. The method may comprise determining a plurality of parameters in dependence on the received steering sensor data. The method may comprise providing the plurality of parameters to the input layer. The method may comprise executing the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters. The method may comprise transmitting the output classification, the output classification being indicative of whether a hand of an occupant of the vehicle is in contact with the steering device.
According to another aspect of the present invention there is provided a method for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device. The method may comprise receiving torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering device. The method may comprise determining a plurality of parameters in dependence on the received torque sensor data. The method may comprise converting the plurality of parameters to a received multi-dimensional data point for a cluster model comprising a first cluster of pre-determined multi-dimensional data points indicative that a vehicle occupant's hand is not in contact with the steering device and a second cluster of pre-determined multidimensional data points indicative that the vehicle occupant's hand is in contact with the steering device. The method may comprise calculating a distance metric for the first and second clusters indicative of a distance between the received multi-dimensional data point and the respective cluster. The method may comprise executing a clustering algorithm to the cluster model in dependence on the calculated distance metrics to assign the received multi-dimensional data point to the first cluster or to the second cluster to determine an indication of whether the vehicle occupant's hand is in contact with the steering device. The method may comprise transmitting the output classification.
According to another aspect of the present invention there is provided a non-transitory, computer-readable storage medium storing instructions thereon that when executed by one or more electronic processors cause the one or more electronic processors to carry out the method described above.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a schematic top view of a vehicle having a steering system including a steering wheel, and a controller according to an embodiment of an aspect of the invention, the controller being configured to execute a neural network algorithm in order to determine whether the vehicle driver's hands are in contact with the steering wheel ('hands on') or not ('hands off');
Figure 2 shows sets of training data for the neural network algorithm of Figure 1 for different types of surface in a hands-on situation and where a lateral control function of the vehicle of Figure 1 is not activated, in particular: Figures 2(a), 2(c) and 2(e) show torsion bar torque values when the vehicle travels over a relatively smooth surface, road rumbles, and cat's eyes, respectively; and, Figures 2(b), 2(d) and 2(f) show the Fast Fourier Transform magnitudes for the torsion bar torque values of Figures 2(a), 2(c) and 2(e), respectively;
Figure 3 shows sets of training data similar to Figure 2, but for a hands-off situation where the lateral control function of the vehicle of Figure 1 is not activated;
Figure 4 shows sets of training data similar to Figure 2, but for a hands-on situation where the lateral control function of the vehicle of Figure 1 is activated;
Figure 5 shows sets of training data similar to Figure 2, but for a hands-off situation where the lateral control function of the vehicle of Figure 1 is activated;
Figure 6 schematically illustrates a perceptron of a neuron of the neural network algorithm of Figure 1;
Figure 7 shows the steps of a method undertaken by the controller of Figure 1 in accordance with an embodiment of an aspect of the invention;
Figure 8 shows a schematic top view of a vehicle having a steering system including a steering wheel, and a controller according to another embodiment of the invention, the controller being configured to execute a clustering algorithm in order to determine whether the vehicle driver's hands are in contact with the steering wheel; and,
Figure 9 shows the steps of a method undertaken by the controller of Figure 8 in accordance with an embodiment of an aspect of the invention.
DETAILED DESCRIPTION
The present invention provides a controller that gives an indication as to whether a vehicle driver's hands are on the vehicle steering wheel based on output data from a torsion bar torque sensor of a steering system of the vehicle. In particular, the controller uses parameters based on the sensor output data as inputs to a neural network algorithm, which then outputs either a 'hands-on' or 'hands-off' determination in dependence on the input parameters.
Figure 1 shows a vehicle 10 having two steered wheels 12 at the front of the vehicle 10, and a steering system 14, in particular an electric power assisted steering (EPAS) system, for controlling the steering angle of the wheels 12. The steering system 14 includes a driver steering wheel or steering device 16 in a cabin of the vehicle 10.
The steering system 14 includes a rotary valve 18 including a torsion bar 20. The torsion bar 20 is a thin rod of metal that twists when torque is applied to it. One end of the torsion bar 20 is connected to the steering wheel 16 and the other end is connected to a worm gear or pinion for steering the steered wheels 12. Rotation of the steering wheel 16 causes twisting of the torsion bar 20. Similarly, twisting of the torsion bar 20 may be caused by the wheels 12 being turned when travelling along uneven surfaces such as when travelling off-road, for example. The steering system 14 also includes a torsion bar torque sensor 22 configured to measure the level of torque that is being applied to the torsion bar 20.
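Conceptually, the torque carried by the torsion bar is proportional to the relative twist between its two ends, which is what the sensor exploits. A minimal sketch follows (the stiffness figure is an arbitrary illustrative value, not a real EPAS parameter):

```python
import math

# The torque carried by a torsion bar is proportional to the relative twist
# between its two ends: torque = k * delta_theta. The stiffness figure below
# is an arbitrary illustrative value, not a real EPAS parameter.
TORSION_BAR_STIFFNESS_NM_PER_RAD = 100.0

def torsion_bar_torque(twist_deg):
    """Return torque in Nm for a given twist angle in degrees."""
    return TORSION_BAR_STIFFNESS_NM_PER_RAD * math.radians(twist_deg)

print(round(torsion_bar_torque(1.0), 3))   # → 1.745
```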
Figure 1 also shows that the vehicle 10 includes a controller 30 having an input 32, a processor 34, a data memory 36, and an output 38. The data memory 36 stores data to be used by the processor 34. The input 32 is configured to receive output data from the torsion bar sensor 22, the output data being indicative of a torque being applied to the torsion bar 20. In particular, the input 32 receives data from a CAN bus of the vehicle 10 from the EPAS system 14. The processor 34 is configured to determine whether the vehicle driver's hands are in contact with the steering wheel 16, i.e. a hands-on' or 'hands-off' situation, based on the received sensor output data. The output 38 is configured to send a control signal indicating whether it is a hands-on or hands-off situation to one or more subsystems 40 of the vehicle 10.
The operation of the processor 34 in determining a hands-on or hands-off situation using the sensor output data from the torsion bar sensor 22 is now described in greater detail. In particular, the processor 34 uses a neural network (NN) algorithm to determine an indication of whether the vehicle driver's hands are on the steering wheel 16. For example, the NN algorithm may be a Multi-Layer Perceptron (MLP) NN model.
Alternatively, any other suitable artificial NN model may be used. The MLP model comprises multiple layers of nodes (or perceptrons) in a multidimensional matrix, with each layer connected to nodes in the next layer. A weighting factor (or simply a 'weight') is applied to each node in order to maximise the probability of a correct hands-on or hands-off determination by the NN algorithm, as is discussed below.
The inputs to the NN algorithm are parameter values calculated using the sensor output data from the torsion bar sensor 22. The output from the NN algorithm is in the form of a binary classification, e.g. a 1 is a hands-off determination and a 2 is a hands-on determination. When executed, the NN algorithm determines a value relating to the probability that the driver's hands are in contact with the steering wheel 16 in dependence on a given set of input parameter values. If the probability is above a threshold value then a hands-on classification is output, else a hands-off classification is output. The threshold value may be between 70% and 90%, more particularly between 75% and 85%, or more particularly approximately 79%. In other words, the NN algorithm maps sets of input parameters (that are based on collected sensor output data) onto a binary output classification.
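The classification step described above can be sketched as a forward pass through a single-hidden-layer MLP with sigmoid activations and a probability threshold; the layer sizes, random placeholder weights and the 79% threshold used here are illustrative assumptions, since a real network's weights would come from training:

```python
import numpy as np

# Minimal MLP forward pass with sigmoid activations, thresholding the output
# probability into a hands-on/hands-off classification. Random placeholder
# weights; layer sizes (27 inputs, 8 hidden neurons) are assumptions.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(27, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(params, threshold=0.79):
    hidden = sigmoid(params @ W1 + b1)            # hidden layer activations
    p_hands_on = float(sigmoid(hidden @ W2 + b2)[0])
    return "hands_on" if p_hands_on >= threshold else "hands_off"

result = classify(rng.normal(size=27))
```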
The MLP model uses a supervised learning technique referred to as 'back propagation' for calibrating or training the network. At the point of vehicle manufacture, the controller 30 undergoes a calibration or training phase. In particular, the weights applied to each node of the NN algorithm are unknown a priori. The purpose of the calibration phase is to determine the values of the weights that should be applied to each node. Therefore, in the calibration phase, pre-determined sets of sensor output data for which both the inputs and the outputs of the NN algorithm are known are used.
Expressed differently, pre-determined sensor output data (that is, sensor output data relating to either a known hands-on or known hands-off situation) obtained from offline measurements is used to calibrate the NN algorithm, i.e. to determine the values of the weights at each node of the NN algorithm. The pre-determined sensor output data may be part of a standardised data set and/or may include empirically-gathered data. The calibrated weights are then stored in the data memory 36.
In the described embodiment, the received sensor output data is split into chunks or groups of 25 measurements of the torsion bar torque. The sampling rate is 0.02 seconds, and so each chunk is representative of 0.5 seconds of data. Each of these measurements is used as an input parameter for the NN algorithm. This means that the NN determination is representative of driver behaviour over a 0.5 second period. It may be that the driver needs to have his hands in contact with the steering wheel 16 for a pre-determined amount of time before an overall hands-on determination is made, for example 2 seconds. This time threshold validates the determination by the controller, i.e. a 'hands-on' determination should be a determination that the driver intends to take back control of the vehicle, and not simply that the driver's hands happen to have come into contact with the steering wheel.
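The chunking and the 2-second persistence check might be sketched as below; the handling of incomplete trailing chunks and the exact way the time threshold is applied are assumptions, not stated in the text:

```python
def chunk_samples(samples, chunk_size=25):
    """Split the torque measurement stream into fixed-size chunks; at a
    0.02 s sampling period each chunk of 25 covers 0.5 s. Incomplete
    trailing chunks are discarded (an assumption about edge handling)."""
    return [samples[i:i + chunk_size]
            for i in range(0, len(samples) - chunk_size + 1, chunk_size)]

def confirmed_hands_on(chunk_decisions, hold_chunks=4):
    """Require hands-on contact for 2 s before an overall determination:
    2 s / 0.5 s per chunk = 4 consecutive hands-on chunks."""
    run = 0
    for hands_on in chunk_decisions:
        run = run + 1 if hands_on else 0
        if run >= hold_chunks:
            return True
    return False

stream = [0.01 * t for t in range(60)]   # 1.2 s of hypothetical torque data
chunks = chunk_samples(stream)           # two full chunks; 10 samples dropped
```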
In the described embodiment, a decomposition of the torque signal is also considered.
In particular, a fast Fourier Transform (FFT) of the received sensor output data is calculated by the processor 34. A 128 point FFT is used and one or more magnitudes of the FFT are used as input parameters to the NN, specifically a maximum and minimum magnitude of the FFT in the described embodiment.
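To illustrate the feature extraction, the sketch below zero-pads a 25-sample chunk to 128 points and computes DFT magnitudes, from which the maximum and minimum are taken. A plain O(n²) DFT is used for clarity; a production implementation would use an optimised FFT routine, and any windowing or normalisation choices are assumptions:

```python
import cmath
import math

def fft_magnitudes(chunk, n_fft=128):
    """Zero-pad the chunk to n_fft points and return the DFT magnitudes."""
    x = list(chunk) + [0.0] * (n_fft - len(chunk))
    mags = []
    for k in range(n_fft):
        s = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n_fft)
                for t in range(n_fft))
        mags.append(abs(s))
    return mags

# Hypothetical 25-sample torque chunk (the text uses real sensor data).
chunk = [math.sin(0.5 * t) for t in range(25)]
mags = fft_magnitudes(chunk)
max_mag, min_mag = max(mags), min(mags)  # the two NN input features
```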
Figure 2 shows sets of training data of the torsion bar torque signal and associated FFT for a 'hands on' situation for different types of surface to be used to train the NN, i.e. to adjust the NN weights. In particular, Figures 2(a), 2(c) and 2(e) show 25 samples of the torsion bar torque signal in a hands on situation when the vehicle 10 is travelling over different types of road surface. Specifically, Figures 2(a), 2(c) and 2(e) show the torsion bar torque signal for a relatively smooth surface, when the vehicle 10 is travelling over road rumbles, and when the vehicle 10 travels over cat's eyes, respectively. Figures 2(b), 2(d) and 2(f) show a 128 point FFT magnitude of the chunk of 25 torsion bar torque samples in Figures 2(a), 2(c) and 2(e), respectively.
Figure 3 shows sets of training data of the torsion bar torque signal and associated FFT for a 'hands off' situation for different types of surface to be used to train the NN in a similar manner to the Figure 2 data. In particular, Figures 3(a), 3(c) and 3(e) show 25 samples of the torsion bar torque signal in a hands off situation when the vehicle 10 is travelling over the same types of road surface as in Figure 2. Specifically, Figures 3(a), 3(c) and 3(e) show the torsion bar torque signal for a relatively smooth surface, when the vehicle 10 is travelling over road rumbles, and when the vehicle 10 travels over cat's eyes, respectively. Again similarly to Figure 2, Figures 3(b), 3(d) and 3(f) show a 128 point FFT magnitude of the torsion bar torque signal in Figures 3(a), 3(c) and 3(e), respectively.
Inspection of Figures 2 and 3 shows that there are differences in the FFT magnitudes between the hands on and hands off situations, and so this is a useful parameter to be used by the NN to differentiate between the two situations.
One factor which can greatly affect the torsion bar torque signal is whether a lateral control function or subsystem of the vehicle 10 is activated. Such lateral control systems may include a lane departure warning system, a lane keeping system, and a yaw stability control system. Therefore, in the present embodiment a different set of training data is used to train the NN in the case where a lateral control function of the vehicle 10 is activated.
Figure 4 shows sets of training data of the torsion bar torque signal and associated FFT magnitudes for a 'hands on' situation for the same types of surface in Figure 2, but when a lateral control function of the vehicle 10 is activated. Similarly, Figure 5 shows sets of training data of the torsion bar torque signal and associated FFT magnitudes for a 'hands off' situation for the same types of surface in Figure 3, but when a lateral control function of the vehicle 10 is activated.
Inspection of Figures 2-5 shows that a hands-on situation has lower magnitudes at higher frequencies compared with a hands-off situation. This may be explained by the driver damping the engine vibration frequencies by applying a counter force to the steering wheel 16, thereby reducing high frequency oscillation.
Regarding the NN algorithm, in the described embodiment the input layer has 25 nodes, corresponding to the 25 samples of torsion bar torque. In addition, a single hidden layer with a plurality of nodes is used. Each of the connections between nodes in the input, hidden and output layers has an associated weight representative of the strength with which it carries the input signal for activation. Each weight value may be positive, negative or zero, and the weight values are adjusted during a training phase of the NN algorithm based on the sets of training data.
Figure 6 schematically illustrates a mathematical model or perceptron 50 of a neuron or node of the NN. In particular, Figure 6 illustrates a case where the perceptron 50 receives three inputs $x_0, x_1, x_2$. The inputs are adjusted by associated weights $w_0, w_1, w_2$ and the neuron output will then be calculated to be either 1 ('OFF') or 2 ('ON') in dependence on the adjusted inputs $w_0 x_0, w_1 x_1, w_2 x_2$. Specifically, a sigmoid or logistic function is utilised as the activation function to determine the output from each neuron. That is, the output $y$ of the neuron may be expressed as

$$y = f\left(\sum_{i=1}^{n} w_i x_i + b\right)$$

where $f$ is the sigmoid function, $n$ is the number of inputs to the neuron, and $b$ is a bias of the neural network. The activation function determines whether a prescribed threshold value is reached.
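The perceptron model of Figure 6 can be expressed directly in code; the input and weight values below are arbitrary illustrative numbers, not values from the described embodiment:

```python
import math

def sigmoid(z):
    """Logistic activation function f."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(inputs, weights, bias):
    """y = f(sum_i w_i * x_i + b): weighted sum through the sigmoid."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# Three illustrative inputs x0..x2 and weights w0..w2.
y = neuron_output([1.0, 0.5, -0.5], [0.2, -0.4, 0.1], 0.0)
```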
When training the NN algorithm with the training data illustrated in Figures 2-5, the NN is executed with a set of 'hypothesis' weights and a cost function, for example the Mean Square Error (MSE), is calculated to determine the difference between the actual and expected outputs of the NN. In the described embodiment, a cross-entropy cost function is used, and the resulting value is used to adjust the weights to try to reduce the error.
Specifically, back-propagation and gradient descent methods are used to tune the weights in the training phase.
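For a single sigmoid neuron with a cross-entropy cost, one back-propagation/gradient-descent step reduces to the simple update below. This is a minimal sketch of the training mechanics for one neuron, not the full multi-layer MLP training; the example data and learning rate are arbitrary:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(x, target, w, b, lr=0.1):
    """One gradient-descent step under the cross-entropy cost; for the
    sigmoid + cross-entropy pairing, the gradient of the cost with
    respect to the pre-activation is simply (y - target)."""
    y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    delta = y - target
    w = [wi - lr * delta * xi for wi, xi in zip(w, x)]
    b -= lr * delta
    return w, b, y

# Hypothetical two-input example with target output 1.
w, b = [0.0, 0.0], 0.0
x, target = [1.0, -1.0], 1.0
for _ in range(200):
    w, b, y = train_step(x, target, w, b)
```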
Figure 7 shows a method 60 undertaken by the controller 30 to determine an indication of whether the hand of an occupant of the vehicle 10 is in contact with the steering device 16. At step 62 the controller 30 receives data from a lateral control subsystem of the vehicle 10, the data being indicative of whether a lateral control function of the vehicle 10 is activated or not. Two separate artificial NN algorithms are stored in the data memory 36: one for use when the lateral control function is activated and one for use when the lateral control function is inactive. In particular, two different sets of NN weights are stored in the data memory 36. At step 64 the processor 34 retrieves the appropriate set of NN weights from the data memory 36 depending on whether the lateral control function is activated, and loads the retrieved NN weights ready for executing the NN algorithm.
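Step 64 amounts to a lookup keyed on the lateral-control state; the dictionary layout and weight values below are illustrative assumptions:

```python
def select_weights(weight_store, lateral_control_active):
    """Choose the NN weight set trained for the current lateral-control
    state, mirroring the retrieval at step 64."""
    key = "lateral_on" if lateral_control_active else "lateral_off"
    return weight_store[key]

# Hypothetical weight sets; real ones come from the training phase.
store = {"lateral_on": [0.3, -0.1, 0.5], "lateral_off": [0.7, 0.2, -0.4]}
weights = select_weights(store, lateral_control_active=True)
```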
At step 66 the controller 30 receives sensor output data from the torsion bar torque sensor 22 via the input 32. As described above, the sensor output data is split into chunks of 25 measurements of the torque by the torsion bar torque sensor 22 at a sampling rate of 0.02 seconds.
At step 68 the processor 34 determines the parameters to be used as inputs in the input layer of the neural network. The parameters are determined based on the received sensor output data. The processor 34 applies a frequency domain transform to the sensor output data, in this case a Fast Fourier Transform (FFT). The processor 34 selects the maximum and minimum magnitudes of the FFT and both of these are used as parameters to be input to the NN. In this case, each of the measured torque values in the chunk is also used as an input parameter to the NN.
At step 70 the processor 34 executes the neural network algorithm with the determined input parameters and the appropriate weights. The NN algorithm outputs a binary decision corresponding to either the vehicle occupant's hand being determined to be in contact with the steering wheel 16 or not.
At step 72 the controller 30 transmits the neural network determination via the output 38 to one or more vehicle subsystems 40. In the described embodiment, the determination of the controller 30 is used together with a determination from at least one other controller/subsystem of the vehicle 10 to make an overall determination of whether the vehicle occupant's hand is in contact with the steering wheel 16. The subsystems 40 include a 'hands-on' determination module which is responsible for making this overall determination. Together with the output classification from the controller 30, the hands-on determination module receives an indication of whether the occupant's hand is in contact with the steering wheel 16 from at least one other vehicle subsystem. This may be, for example, a dashboard-mounted camera providing a view of the steering wheel 16 and/or a proximity sensor on the steering wheel 16. It may be that all of the various methods that are being used need to indicate that the occupant's hand is in contact before the hands-on determination module makes an overall determination that the hand is in contact with the steering wheel 16. Alternatively, in other embodiments only one or some of the methods may need to make a positive determination.
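The two combination strategies described for the hands-on determination module (all methods must agree, or only some) can be sketched as:

```python
def overall_hands_on(indications, require_all=True):
    """Combine hands-on indications from the NN classifier, a camera,
    a proximity sensor, etc. require_all=True mirrors the 'all methods
    must agree' variant; False mirrors the 'any method' variant."""
    return all(indications) if require_all else any(indications)

print(overall_hands_on([True, True, False]))          # strict: False
print(overall_hands_on([True, True, False], False))   # permissive: True
```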
The subsystems 40 may also include a human-machine interface which is configured to provide a warning signal if the controller 30 (or hands-on determination module) determines that the occupant's hands are not in contact with the steering wheel 16. In particular, this warning may be in the form of a visual and/or audio signal. Haptic feedback may be provided to the vehicle occupant as a warning. Alternatively, or in addition, one of the subsystems 40 may control the vehicle 10, for example to bring the vehicle 10 to a stop, if it is determined that the occupant's hands are not on the steering wheel 16.
Figure 8 shows a schematic top view of a vehicle 100 having a controller 130 according to another embodiment of the invention. Those components in Figure 8 which are the same as those in Figure 1 have the same reference numerals. In particular, the steering system 14 is the same as in Figure 1. The controller 130 has an input 132, a processor 134, a data memory 136 and an output 138, and its operation is described below.
Like the controller 30, the controller 130 determines an indication of whether a vehicle occupant's hand or hands is in contact with the steering wheel 16 based on sensor output data from the torsion bar torque sensor 22. Unlike the controller 30, the controller 130 makes this determination using a mathematical cluster model rather than a neural network algorithm, as is described below.
At the point of vehicle manufacture, the controller 130 goes through a calibration process or training phase whereby pre-determined data obtained from offline measurements is pre-stored on the data memory 136. The pre-determined data is for a plurality of parameters and it is known whether each set of parameters relates to a hands-on or hands-off situation. The pre-determined data stored in the data memory 136 may be part of a standardised data set and/or may include empirically-gathered data. A multi-dimensional vector consisting of the parameters of the pre-determined data is stored in the data memory 136. Each set of collected data may be regarded as a data point in multi-dimensional vector space and the data points can be separated into two distinct clusters: one representative of a hands-on situation, and one representative of a hands-off situation.
Once calibrated, the controller 130 is used in real-time to determine an indication of whether the vehicle driver's hands are on the steering wheel 16. To do this, the processor 134 analyses real-time sensor output data and compares this with the pre-determined clusters of parameters from the data memory 136. Similarly to the previously-described embodiment, the data memory 136 stores a first set of pre-determined data to be used when a lateral control function of the vehicle 100 is not activated and a second set to be used when the lateral control function is activated.
Figure 9 shows the steps of a method 80 undertaken by the controller 130. As in the method 60 of the previous embodiment, at step 82 the controller 130 receives data from a lateral control subsystem of the vehicle 100, the data being indicative of whether a lateral control function of the vehicle 100 is activated or not. At step 84 the processor 134 loads the relevant set of pre-determined data based on the lateral control data.
At step 86 the controller 130 receives sensor output data from the torsion bar torque sensor 22 via the input 132. As described above, the sensor output data is split into chunks of a pre-determined number of measurements of the torque by the torsion bar torque sensor 22. At step 88 the processor 134 determines a number of parameters based on the received sensor output data. This may include some or all of the torque values themselves in a particular chunk, together with the maximum and minimum Fast Fourier Transform values of the chunk of torque measurements in a similar manner to the previous embodiment. The calculated parameters are the same as those in the pre-determined sets stored in memory 136. At step 90 the processor 134 converts the calculated parameters into a single, multi-dimensional data point or vector.
A metric is used at step 92 to calculate the distance from the sensor output data point to each pre-determined data point in the multi-dimensional vector loaded from memory 136. In particular, the Euclidean distance is used in the described embodiment to calculate the normalised distance between the data point calculated based on the sensor output data and each of the other data points in the cluster model. In an alternative embodiment, some other metric may be used at the distance calculation step 92, for example, Minkowski distance, Hamming distance, or Chebyshev distance.
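The alternative metrics mentioned can be illustrated as follows; note that the Euclidean distance of the described embodiment is the Minkowski distance with p = 2:

```python
def minkowski(a, b, p=2):
    """Minkowski distance between two N-dimensional data points; p=2
    gives the Euclidean distance, p=1 the Manhattan distance."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1.0 / p)

def chebyshev(a, b):
    """Chebyshev distance: the largest per-parameter difference."""
    return max(abs(x - y) for x, y in zip(a, b))

d_euclid = minkowski([0.0, 0.0], [3.0, 4.0])   # 5.0
d_cheby = chebyshev([0.0, 0.0], [3.0, 4.0])    # 4.0
```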
A clustering algorithm is used at step 94 to decide to which pre-stored cluster the sensor output data point belongs. As mentioned above, each pre-determined data point is associated with either a hands-on or a hands-off situation, and data points that are associated with the same situation may be regarded as a cluster or group.
At step 94 the sensor output data point is assigned to the cluster whose characteristics most closely match those of the sensor output data point. There are several different ways in which the similarity between the characteristics of a particular cluster and of the sensor output data point may be measured. One strategy with which to determine to which particular cluster the characteristics of the sensor data point are most similar is to minimise some measure of the normalised distance between the sensor output data point and a particular cluster.
In the present embodiment, a K-nearest neighbour algorithm is used to minimise the normalised distance between the sensor output data point and a particular cluster. The K-nearest neighbour algorithm is based on minimising the sum of the normalised distances from the sensor output data point to the K nearest pre-determined data points in each cluster. In particular, this may be written as

$$\underset{i}{\arg\min} \; \frac{\sum_{j=1}^{K} \sum_{t=1}^{N} (x_{ijt} - y_t)^2}{\sum_{i=1}^{2} \sum_{j=1}^{K} \sum_{t=1}^{N} (x_{ijt} - y_t)^2}$$

for i = 1, 2, i.e. the number of clusters, where $x_{ij} = (x_{ij1}, x_{ij2}, \dots, x_{ijN})$ is the j-th closest pre-determined data point of the i-th cluster to the sensor output data point, $y = (y_1, y_2, \dots, y_N)$ is the sensor output data point, and N is the number of parameters in each data point. The above equation returns the number of the cluster, i.e. 1 or 2, that is 'nearest' to the sensor output data point.
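A minimal sketch of the K-nearest neighbour assignment follows. Since the normalising denominator in the equation is the same for every cluster i, it does not affect the argmin and is omitted; the cluster labels and sample points are illustrative:

```python
def knn_cluster(point, clusters, k=3):
    """Assign `point` to the cluster minimising the sum of squared
    distances to its K nearest members."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    scores = {}
    for label, members in clusters.items():
        nearest = sorted(sq_dist(point, m) for m in members)[:k]
        scores[label] = sum(nearest)
    return min(scores, key=scores.get)

# Hypothetical two-parameter pre-determined data points per cluster.
clusters = {
    "hands_off": [[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [0.0, 0.3]],
    "hands_on": [[2.0, 2.0], [2.1, 1.9], [1.9, 2.2], [2.2, 2.1]],
}
label = knn_cluster([1.8, 2.0], clusters)   # closest to the hands_on group
```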
In an alternative embodiment, some other clustering algorithm may be used at step 94, for example, a K-means algorithm, a classification tree algorithm, a naive Bayes algorithm, or a support vector machine algorithm.
Finally, at step 96 the determined classification, i.e. hands-on or hands-off, is transmitted from the output 138 to the vehicle subsystems 40 as described previously.
It will be appreciated that various changes and modifications can be made to the present invention without departing from the scope of the present application.
In the above-described embodiment, there are 25 torsion bar torque samples per chunk; however, in different embodiments there may be different numbers of samples per chunk, for example 50 samples. The sampling period may also differ from 0.02 seconds.
In the above-described embodiment, the controller 30 receives sensor output data from the torsion bar torque sensor 22; however, in different embodiments, the controller may receive sensor output data from other sensors in addition to, or instead of, the torsion bar torque sensor 22. For example, the controller may receive data indicative of a steering angle of the steering wheel 16 and/or data indicative of a steering angle rate of change of the steering wheel 16.
In the described embodiment, there are 25 input parameters at the input layer of the NN, namely, the 25 torsion bar torque values in a particular chunk. In different embodiments, different parameters based on the received sensor output data may be used as inputs to the NN. For example, the maximum and minimum Fast Fourier Transform magnitudes for the chunk may be used. One or more of the following parameters may also be calculated by the processor and provided as an input to the NN: a maximum value of the torque sensor data; a root mean squared value of the torque sensor data; a ratio between a largest absolute value of the torque sensor data and the root mean squared value; a power of the torque sensor data; a variance of the torque sensor data; an average of the torque sensor data; a standard deviation of the torque sensor data; a first component of a principal component analysis of the torque sensor data; and, an upper and lower level of a waveform of the torque sensor data.
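Several of the candidate signal features listed above could be computed per chunk as follows; the PCA and waveform-level features are omitted for brevity, and the function and key names are illustrative:

```python
import math

def torque_features(chunk):
    """Compute candidate per-chunk signal features for the NN input."""
    n = len(chunk)
    mean = sum(chunk) / n
    var = sum((x - mean) ** 2 for x in chunk) / n
    rms = math.sqrt(sum(x * x for x in chunk) / n)
    peak = max(abs(x) for x in chunk)
    return {
        "max": max(chunk),
        "mean": mean,
        "variance": var,
        "std": math.sqrt(var),
        "rms": rms,
        "power": rms ** 2,
        "crest_ratio": peak / rms if rms else float("inf"),
    }

feats = torque_features([1.0, -1.0, 1.0, -1.0])  # hypothetical chunk
```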
That is, in the described embodiment the 25 samples of torque sensor data are used as the input parameters to the NN; however, in different embodiments a plurality of signal features (such as those outlined above) are calculated based on the received torque sensor data, and then these signal features are used as the input parameters to the NN.
Further sets of training data for different situations may be used in different embodiments to train the NN algorithm. For example, training data for off-road driving and/or automated steering may be used.
Any controller or controllers described herein may suitably comprise a control unit or computational device having one or more electronic processors. Thus the system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term "controller" or "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memories associated with said controller to be executed on said computational device. A first controller may be implemented in software run on one or more processors. One or more other controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.

Claims (25)

  1. A controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device, the controller comprising: an input configured to receive torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering system; a processor including a neural network algorithm having an input layer and an output layer, the processor being configured to: determine a plurality of parameters in dependence on the received torque sensor data; provide the plurality of parameters to the input layer; and, execute the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters; and, an output configured to transmit the output classification from the controller, the output classification being indicative of whether a hand of an occupant of the vehicle is in contact with the steering device.
  2. A controller according to Claim 1, wherein the processor is configured to apply a frequency domain transform to the received torque sensor data to determine one or more of the plurality of parameters.
  3. A controller according to Claim 2, wherein the frequency domain transform is a Fast Fourier Transform.
  4. A controller according to Claim 2 or Claim 3, wherein the plurality of parameters includes one or more frequency domain transform magnitudes of the torque sensor data.
  5. A controller according to Claim 4, wherein the one or more frequency domain transform magnitudes includes a maximum frequency domain transform magnitude and/or a minimum frequency domain transform magnitude.
  6. A controller according to Claim 4 or Claim 5, wherein the plurality of parameters includes one or more frequency domain transform magnitude patterns of the torque sensor data.
  7. A controller according to any previous claim, wherein the plurality of parameters is based on a plurality of samples of the torque sensor data.
  8. A controller according to any previous claim, wherein the plurality of parameters includes a plurality of torsion bar torque values indicative of a level of torque being applied to a torsion bar of the steering system.
  9. A controller according to any previous claim, wherein the plurality of parameters includes at least one of: a maximum value of the torque sensor data; a root mean squared value of the torque sensor data; a ratio between a largest absolute value of the torque sensor data and the root mean squared value; a power of the torque sensor data; a variance of the torque sensor data; an average of the torque sensor data; a standard deviation of the torque sensor data; a first component of a principal component analysis of the torque sensor data; and, an upper and lower level of a waveform of the torque sensor data.
  10. A controller according to any previous claim, wherein the input is configured to receive at least one of steering angle data and steering wheel angle rate of change data, and wherein the processor is configured to determine the plurality of parameters in dependence on the received steering angle data or steering wheel angle rate of change data.
  11. A controller according to any previous claim, wherein the neural network algorithm comprises at least one hidden layer comprising a plurality of neurons each having an associated activation function.
  12. A controller according to Claim 11, wherein the activation function is a sigmoid function.
  13. A controller according to Claim 11 or Claim 12, each connection between two of the neurons having a first associated predetermined weight corresponding to a lateral control function of the vehicle not being activated and a second associated predetermined weight corresponding to the lateral control function being activated, wherein the input is configured to receive lateral control data from a lateral control system of the vehicle, the lateral control data being indicative of whether the lateral control function is activated, and wherein the processor is configured to select the first or second set of predetermined weights for executing the neural network algorithm in dependence on the received lateral control data.
  14. A controller according to any of Claims 11 to 13, wherein the neural network algorithm includes only one hidden layer.
  15. A controller according to any previous claim, wherein the output classification is sent to a hands-on determination module that is configured to make a determination of whether the vehicle occupant's hand is in contact with the steering device in dependence on the output classification and at least one other signal indicative of whether the vehicle occupant's hand is in contact with the steering device.
  16. A controller according to any previous claim, wherein the output is configured to transmit the output classification to an advanced driver-assistance system of the vehicle.
  17. A controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device, the controller comprising: an input configured to receive steering sensor data from a steering sensor of the steering system, the steering sensor data being indicative of a level of actuation of the steering system; a processor including a neural network algorithm having an input layer and an output layer, the processor being configured to: determine a plurality of parameters in dependence on the received steering sensor data; provide the plurality of parameters to the input layer; and, execute the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters; and, an output configured to transmit the output classification from the controller, the output classification being indicative of whether a hand of an occupant of the vehicle is in contact with the steering device.
  18. A controller according to Claim 17, wherein the steering sensor data includes at least one of: torque sensor data; steering angle data; and, steering wheel angle rate of change data.
  19. A controller for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device, the controller comprising: an input configured to receive torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering device; a processor configured to: determine a plurality of parameters in dependence on the received torque sensor data; convert the plurality of parameters to a received multi-dimensional data point for a cluster model comprising a first cluster of pre-determined multi-dimensional data points indicative that a vehicle occupant's hand is not in contact with the steering device and a second cluster of pre-determined multi-dimensional data points indicative that the vehicle occupant's hand is in contact with the steering device; calculate a distance metric for the first and second clusters indicative of a distance between the received multi-dimensional data point and the respective cluster; and, execute a clustering algorithm to the cluster model in dependence on the calculated distance metrics to assign the received multi-dimensional data point to the first cluster or to the second cluster to determine an indication of whether the vehicle occupant's hand is in contact with the steering device; an output configured to transmit the output classification from the controller.
  20. A vehicle comprising a controller according to any of Claims 1 to 19.
  21. A vehicle according to Claim 20, the vehicle being operable in an off-road mode.
  22. A vehicle according to Claim 17 or Claim 18, wherein the steering system is an electronic power assisted steering system.
  23. A vehicle according to any of Claims 20-22, wherein the steering system torque sensor is a torsion bar torque sensor.
  24. A method for a vehicle operable in an autonomous mode and a non-autonomous mode and having a steering system including a steering device, the method comprising: receiving torque sensor data from a torque sensor of the steering system, the torque sensor data being indicative of a torque being applied to the steering device; providing a neural network algorithm having an input layer and an output layer; determining a plurality of parameters in dependence on the received torque sensor data; providing the plurality of parameters to the input layer; executing the neural network algorithm to generate an output classification in the output layer in dependence on the plurality of parameters; and, transmitting the output classification, the output classification being indicative of whether a vehicle occupant's hand is in contact with the steering device.
  25. A non-transitory, computer-readable storage medium storing instructions thereon that when executed by one or more electronic processors causes the one or more electronic processors to carry out the method of Claim 24.
GB1818515.7A 2018-11-13 2018-11-13 A controller for a vehicle Active GB2578910B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1818515.7A GB2578910B (en) 2018-11-13 2018-11-13 A controller for a vehicle
GB2104421.9A GB2590877B (en) 2018-11-13 2018-11-13 A controller for a vehicle


Publications (3)

Publication Number Publication Date
GB201818515D0 GB201818515D0 (en) 2018-12-26
GB2578910A true GB2578910A (en) 2020-06-03
GB2578910B GB2578910B (en) 2021-05-12

Family

ID=64739612



Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019211016A1 (en) * 2019-07-25 2021-01-28 Volkswagen Aktiengesellschaft Detection of hands-off situations through machine learning
US11577616B2 (en) * 2020-10-27 2023-02-14 GM Global Technology Operations LLC Methods, systems, and apparatuses for torque control utilizing roots of pseudo neural network

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007018991A2 (en) * 2005-08-02 2007-02-15 Gm Global Technology Operations, Inc. Adaptive driver workload estimator
US20140371989A1 (en) * 2013-06-13 2014-12-18 Ford Global Technologies, Llc Method and system for detecting steering wheel contact
US20160187880A1 (en) * 2014-12-25 2016-06-30 Automotive Research & Testing Center Driving control system and dynamic decision control method thereof
CN109367541A (en) * 2018-10-15 2019-02-22 吉林大学 Human-like lane-change decision-making method for intelligent vehicles based on driver behaviour patterns

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024002777A1 (en) * 2022-06-29 2024-01-04 Volkswagen Aktiengesellschaft Method for hand detection, computer program, and device
DE102022206603A1 (en) 2022-06-29 2024-01-04 Volkswagen Aktiengesellschaft Hand detection method, computer program, and device

Also Published As

Publication number Publication date
GB2578910B (en) 2021-05-12
GB201818515D0 (en) 2018-12-26

Similar Documents

Publication Publication Date Title
CN109866752B (en) Method for tracking running system of dual-mode parallel vehicle track based on predictive control
GB2578910A (en) A controller for a vehicle
Butakov et al. Personalized driver/vehicle lane change models for ADAS
EP3359439B1 (en) Humanized steering model for automated vehicles
CN102883912B (en) Vehicle with identification system
Duchanoy et al. A novel recurrent neural network soft sensor via a differential evolution training algorithm for the tire contact patch
JPH07107421B2 (en) Vehicle shift control device
US9457815B2 (en) Multi-level vehicle integrity and quality control mechanism
US8489253B2 (en) Driver state assessment device
CN108657188B (en) Driver driving technology online evaluation system
CN109747645A (en) The driving supplementary controlled system of vehicle
CN108027427A (en) The ultrasonic transducer system for being used for landform identification in vehicle
CN109191788B (en) Driver fatigue driving judgment method, storage medium, and electronic device
US12084073B2 (en) Method and device for optimum parameterization of a driving dynamics control system for vehicles
CN113602284A (en) Man-machine common driving mode decision method, device, equipment and storage medium
KR102088428B1 (en) Automobile, server, method and system for estimating driving state
Deng et al. Shared control for intelligent vehicle based on handling inverse dynamics and driving intention
CN111830962A (en) Interpretation data for reinforcement learning agent controller
JP7415471B2 (en) Driving evaluation device, driving evaluation system, in-vehicle device, external evaluation device, and driving evaluation program
KR20230101849A (en) How to perform control procedures in the vehicle
EP3825191A1 (en) Vehicle sideslip angle estimation system and method
JP2019104486A (en) Method and system for determining rack force, operation assisting method for work device, operation assisting device and work device
GB2590877A (en) A controller for a vehicle
US12017675B2 (en) Vehicle and control method thereof
Dahl et al. Performance and efficiency analysis of a linear learning-based prediction model used for unintended lane-departure detection