GB2555397A - Control of autonomous vehicles - Google Patents

Control of autonomous vehicles

Info

Publication number
GB2555397A
Authority
GB
United Kingdom
Prior art keywords
vehicle
operator
sensors
route
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB1617915.2A
Other versions
GB201617915D0 (en)
Inventor
Gary Martin Cross
Mark Robert Goodall
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Priority to GB1617915.2A priority Critical patent/GB2555397A/en
Publication of GB201617915D0 publication Critical patent/GB201617915D0/en
Priority claimed from PCT/GB2017/053148 external-priority patent/WO2018078335A1/en
Publication of GB2555397A publication Critical patent/GB2555397A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0088 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0055 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot with safety arrangements
    • G05D1/0077 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot with safety arrangements using redundant signals or controls
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles

Abstract

Control of a vehicle (100) having an autonomous mode of operation uses a system comprising sensors (102) to measure one or more parameters associated with an operator (110) controlling the vehicle (100) in a manual mode to perform a first operation. Processors (106) determine whether measurements taken by the sensors (102) fulfil one or more predetermined criteria and in response determine a second operation for performance by the vehicle (100). The vehicle (100) is then controlled in an autonomous mode to perform the second operation. The system thus enables autonomous control of a vehicle to teach a user of the vehicle.

Description

(54) Title of the Invention: Control of autonomous vehicles Abstract Title: Control of autonomous vehicles (57) Control of a vehicle (100) having an autonomous mode of operation uses a system comprising sensors (102) to measure one or more parameters associated with an operator (110) controlling the vehicle (100) in a manual mode to perform a first operation. Processors (106) determine whether measurements taken by the sensors (102) fulfil one or more predetermined criteria and in response determine a second operation for performance by the vehicle (100). The vehicle (100) is then controlled in an autonomous mode to perform the second operation. The system thus enables autonomous control of a vehicle to teach a user of the vehicle.
[FIG. 1: schematic illustration (not to scale) of the vehicle 100]
[FIG. 2: example scenario, environment 200 with first and second points A, B]
[FIG. 3: process flow chart; box text as far as legible: operator controls vehicle to move along first route (s2); sensors capture measurements (s4); processor determines vehicle path and vehicle profile(s) along first route (s6); processor retrieves map (s8); processor determines second route (s10); processor determines whether or not vehicle was controlled in acceptable manner (s12); operator controls vehicle to move to point A (s14); processor sends request to operator (s16); operator inputs response (s18); processor sends specification of second route to controller (s20); controller controls vehicle to move along second route (s22)]

CONTROL OF AUTONOMOUS VEHICLES
FIELD OF THE INVENTION
The present invention relates to the control of vehicles, in particular those which have an autonomous mode.
BACKGROUND
Some vehicles are configured to operate in a manual mode or in an autonomous mode. The vehicle may be switched, e.g. by an operator, from operating in manual mode to operating in autonomous mode, and vice versa. In its manual mode, an operator of the vehicle exercises a relatively high degree of control over the movement of the vehicle. In its autonomous mode, the vehicle is capable of sensing its environment and navigating without human input.
Many vehicles include systems that enable the tracking and reporting of that vehicle’s location and/or speed.
SUMMARY OF THE INVENTION
The present inventor has realised that current systems that enable the tracking of vehicle location and/or speed provide little or no real-time feedback to the vehicle’s operator based upon the actual operation of the vehicle. The present inventor has realised that, if information relating to the vehicle’s location and/or speed is relayed to the vehicle’s operator, it can be used to reduce or eliminate undesirable vehicle operation. For example, operation that is dangerous or that is damaging to the vehicle may be reduced or eliminated.
The present inventor has realised that it is desirable for a vehicle to monitor operator behaviour and vehicle driving conditions to provide feedback (e.g. real-time feedback) to the operator. This may, for example, be used to provide training and mentoring to the operator in operation of the vehicle, and may be used to delineate desirable vehicle operation from undesirable vehicle
operation. Also, aggressive operator behaviour and operator inattention may be detected and corrected, thereby improving operator and vehicle safety.
In a first aspect, the present invention provides a system for controlling a vehicle, the vehicle having an autonomous mode of operation. The system comprises: one or more sensors configured to measure one or more parameters associated with an operator controlling the vehicle to perform a first operation; and one or more processors configured to: determine that measurements taken by the one or more sensors fulfil one or more predetermined criteria; responsive to determining that the measurements taken by the one or more sensors fulfil the one or more predetermined criteria, determine a second operation for performance by the vehicle; and control the vehicle in the autonomous mode such that the vehicle performs the second operation.
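The claimed behaviour can be summarised as a small decision loop. The sketch below is illustrative only: the patent defines the system functionally, so every name here (`Measurement`, `check_criteria`, `control_cycle`) is a hypothetical stand-in, and the predetermined criteria are modelled as simple per-parameter thresholds.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """One sensor reading taken while the operator performs the first operation."""
    name: str
    value: float

def check_criteria(measurements, thresholds):
    """True if any measurement exceeds its predetermined threshold (one form
    of the claimed 'one or more predetermined criteria')."""
    return any(m.value > thresholds.get(m.name, float("inf")) for m in measurements)

def control_cycle(measurements, thresholds, plan_second_operation, perform_autonomously):
    """If the operator-driven first operation produced measurements fulfilling
    the criteria, determine a second operation and perform it autonomously."""
    if not check_criteria(measurements, thresholds):
        return None  # criteria not fulfilled: remain under manual control
    second_op = plan_second_operation(measurements)
    perform_autonomously(second_op)
    return second_op
```

In use, `plan_second_operation` would be the route planner and `perform_autonomously` the hand-off to the vehicle controller; both are assumed callbacks, not APIs from the patent.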
The first operation may comprise the vehicle travelling along a first route between two points. The second operation may comprise the vehicle travelling along a second route between the two points. The second route may be different to the first route.
The second operation may comprise the vehicle travelling along the first route between the two points differently (e.g. at lower speed) to how the vehicle travelled along the first route during the first operation.
The one or more processors may be further configured to: using the measurements taken by the one or more sensors, determine a first route travelled by the vehicle during the first operation, the first route being between two points; determine a second route between the two points; compare the first route and the second route; and, based on the comparison between the first route and the second route, determine that the one or more sensors fulfil the one or more predetermined criteria.
The one or more processors may be configured to determine the second route between the two points using a map of an environment between the two points. The one or more sensors may be further configured to measure parameters associated with the environment between the two points. The one or more processors may be configured to generate the map using
measurements taken by the one or more sensors of the parameters associated with the environment between the two points.
The one or more predetermined criteria may comprise one or more criteria that one or more sensor measurements exceed respective thresholds.
The one or more sensors may include one or more sensors selected from the group of sensors consisting of: a vehicle position sensor, an image sensor, a sensor configured to measure a state of a subsystem of the vehicle, a vehicle speed sensor, a vehicle acceleration sensor, an attitude sensor, and a force sensor.
The operator may be located on or in the vehicle.
The one or more processors may be configured to provide, to the operator, a notification that the one or more predetermined criteria have been fulfilled.
The one or more processors may be configured to, responsive to determining that the measurements taken by the one or more sensors fulfil the one or more predetermined criteria, provide, to the operator, a request that the vehicle be operated in the autonomous mode. The one or more processors may be configured to control the vehicle in the autonomous mode such that the vehicle performs the second operation only in response to receiving an acceptance of the request from the operator.
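The optional request/acceptance step might be gated as follows. This is a minimal sketch: `ask_operator` stands in for a prompt presented via the operator interface and is an assumed callback, not an API defined by the patent.

```python
def maybe_engage_autonomy(criteria_fulfilled, ask_operator, perform_second_operation):
    """Perform the second operation autonomously only if the predetermined
    criteria were fulfilled AND the operator accepts the request.
    ask_operator(question) -> bool is a hypothetical operator-interface hook."""
    if not criteria_fulfilled:
        return False
    if not ask_operator("Switch to autonomous mode and follow the suggested route?"):
        return False  # operator declined: stay in manual mode
    perform_second_operation()
    return True
```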
The vehicle may be a land-based vehicle.
In a further aspect, the present invention provides a vehicle having an autonomous mode of operation, the vehicle comprising a system according to any preceding aspect.
In a further aspect, the present invention provides a method for controlling a vehicle, the vehicle having an autonomous mode of operation. The method comprises: measuring, by one or more sensors, one or more parameters associated with an operator controlling the vehicle to perform a first operation; determining that measurements taken by the one or more sensors fulfil one or more predetermined criteria; responsive to determining that the measurements taken by the one or more sensors fulfil the one or more
-4predetermined criteria, determining a second operation for performance by the vehicle; and controlling the vehicle in the autonomous mode such that the vehicle performs the second operation.
In a further aspect, the present invention provides a program or plurality of programs arranged such that when executed by a computer system or one or more processors it/they cause the computer system or the one or more processors to: receive sensor data comprising measurements of one or more parameters associated with an operator controlling a vehicle to perform a first operation; determine that measurements taken by the one or more sensors fulfil one or more predetermined criteria; responsive to determining that the measurements taken by the one or more sensors fulfil the one or more predetermined criteria, determine a second operation for performance by the vehicle; and control the vehicle in an autonomous mode such that the vehicle performs the second operation.
In a further aspect, the present invention provides a machine readable storage medium storing a program or at least one of the plurality of programs according to the preceding aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a schematic illustration (not to scale) of a vehicle;
Figure 2 is a schematic illustration (not to scale) showing an example scenario in which the vehicle operates; and
Figure 3 is a process flow chart showing certain steps of a process of operating the vehicle.
DETAILED DESCRIPTION
Figure 1 is a schematic illustration (not to scale) of an embodiment of a vehicle 100. In this embodiment, the vehicle 100 is a land-based vehicle configured to be controlled to move over land.
The vehicle 100 comprises a sensor system 102, a memory 104, a processor 106, an operator interface 108, an operator 110, operator controls 112, a controller 114, and vehicle subsystems 116.
The sensor system 102 comprises a plurality of sensors located on or in the vehicle 100. In this embodiment, the sensor system 102 includes, but is not limited to, a GPS receiver 118, a camera 120, a lidar sensor 122, and a plurality of vehicle subsystem sensors 123. The sensor system 102 is coupled to the processor 106 such that measurements taken by the sensors 118, 120, 122, 123 of the sensor system 102 may be sent from the sensor system 102 to the processor 106.
The GPS receiver 118 is configured to receive GPS satellite signals from GPS satellites. The GPS receiver 118 is configured to, using the received GPS satellite signals, calculate the geographical location of the vehicle 100. The GPS receiver 118 is configured to send the determined location of the vehicle 100 to the processor 106.
The camera 120 is a visible light detecting camera configured to capture visible light images of the vehicle’s environment proximate to the vehicle 100,
i.e. the vehicle’s surroundings. The camera 120 may have a fixed orientation on the vehicle 100 (for example, the camera 120 may be a forward facing camera), or the camera 120 may be controlled to vary its facing with respect to the vehicle 100. In some embodiments, the vehicle 100 comprises multiple cameras, each having a different respective facing with respect to the vehicle 100. The camera 120 is configured to send the captured images to the processor 106.
The lidar sensor 122 is a range sensor configured to measure a distance between the vehicle 100 and objects within the vehicle’s environment that are proximate to the vehicle 100. For example, the lidar sensor 122 may be arranged on the vehicle 100 to measure a separation between the vehicle 100 and objects in the path of the vehicle 100. Preferably, the lidar sensor 122 is a forward facing sensor on the vehicle 100. In some embodiments, the vehicle 100 comprises multiple lidar sensors, each having a different respective facing with respect to the vehicle 100. The vehicle 100 may include a rear facing lidar
sensor. The lidar sensor 122 is configured to send the distance measurements to the processor 106.
The vehicle subsystem sensors 123 are configured to measure the state of the vehicle subsystems 116 during vehicle operation. For example, the vehicle subsystem sensors 123 may measure a position of one or more actuators that control a steering unit 126, a throttle 128, and/or a brake unit 130 of the vehicle subsystems 116. In some embodiments, the vehicle subsystem sensors 123 include one or more sensors for determining a current driving lane of the vehicle 100, a fuel level, a brake fluid level, or other appropriate parameter value. The vehicle subsystem sensors 123 are configured to send the measurements of the vehicle subsystems 116 to the processor 106.
The memory 104 is a computer memory which may be any read/write storage device such as a random access memory (RAM) or a non-volatile storage device. An example of a non-volatile storage device includes a disk or tape storage device. The memory 104 is coupled to the processor 106 such that information may be sent from the processor 106 to the memory 104, and stored by the memory 104. Also, the memory 104 is coupled to the processor 106 such that information stored in the memory 104 may be accessed or retrieved for use by the processor 106. In this embodiment, the memory 104 stores a map 124.
In this embodiment, the map 124 comprises a digital model of an environment in which the vehicle 100 operates. The map 124 may include a three-dimensional representation of a terrain’s surface in a region in which the vehicle 100 is operating, i.e. a digital elevation model. The map 124 may be editable by the processor 106, for example, such that the processor 106 may modify or update the map 124.
The processor 106 is coupled to the sensor system 102 such that sensor measurements may be sent from the sensor system 102 to the processor 106. The processor 106 is configured to receive and process the sensor measurements, for example, as described in more detail later below with reference to Figure 3. The processor 106 is further coupled to the memory 104 such that information may be stored in and retrieved from the memory 104 by the processor 106. The processor 106 is further coupled to the operator
interface 108 such that information may be sent between the processor 106 and the operator interface 108. As described in more detail later below with reference to Figure 3, in this embodiment, the processor 106 is configured to output information to the operator interface 108 for presentation to the operator
110. Also, the processor 106 is configured to receive an operator input from the operator interface 108, and to process the received operator input. The processor 106 is further coupled to the controller 114 such that the processor 106 may send control instructions to the controller 114, for example as described in more detail later below with reference to Figure 3.
The operator interface 108 is an interface to devices for both input and output of data. The operator interface 108 is configured to present to the operator 110 information received from the processor 106. The operator interface 108 is further configured to receive an operator input from the operator 110, for example in response to information being presented to the operator 110, and to send the received operator input to the processor 106. Preferably, the operator interface 108 is an interface that can easily be used by the operator 110 while the operator 110 simultaneously controls the vehicle 100. Examples of appropriate operator interfaces include, but are not limited to, a touchscreen display and a voice-user interface.
The operator 110 is a human operator for the vehicle 100. The operator 110 is located on or in the vehicle 100. The operator 110 may control the operator controls 112 to control the vehicle 100.
The operator controls 112 are coupled to the vehicle subsystems 116 in such a way that the operator controls 112 control operation of the vehicle subsystems 116, thereby controlling the vehicle 100. Thus, by operating the operator controls 112, the operator 110 may control operation of the vehicle 100. Example operator controls 112 include, but are not limited to: a steering wheel; a joystick; various pedals including, for example, a throttle or accelerator pedal, a brake pedal, and a clutch pedal; a gear stick; signalling apparatus; and lighting apparatus.
The controller 114 is configured to cooperate with the processor 106 to provide for autonomous control of the vehicle 100. The controller 114 is configured to receive, from the processor 106, a processor output which may include instructions for controlling the vehicle 100. In addition to being coupled to the processor 106, the controller 114 is coupled to the vehicle subsystems 116. The controller 114 is further configured to, based on information received from the processor 106, generate control signals for controlling the vehicle subsystems 116. The controller 114 is configured to send the generated control signals to the vehicle subsystems 116, thus controlling operation of the vehicle subsystems 116, and thereby controlling the vehicle 100 in accordance with the instructions received from the processor 106.
The vehicle subsystems 116 include various subsystems that allow the vehicle 100 to operate. Accordingly, in this embodiment, the vehicle subsystems 116 include various subsystems including, but not limited to, a steering unit 126, a throttle 128, a brake unit 130, an engine 132, a transmission 134, and a plurality of wheels 136.
The engine 132 is configured to convert an energy source into mechanical energy. The engine 132 may be, for example, any combination of an internal combustion engine, an electric motor, a steam engine, a Stirling engine, or other types of engines and/or motors.
The transmission 134 comprises elements that are operable to transmit mechanical power from the engine 132 to the wheels 136. Accordingly, the transmission 134 may include a gearbox, clutch, differential, and drive shafts including axles coupled to the wheels 136.
The wheels 136 are configured to enable the vehicle 100 to move over land. The vehicle 100 may include any number of wheels, for example, four or six wheels.
The steering unit 126 comprises any combination of mechanisms that are operable to adjust the heading of the vehicle 100. In this embodiment, the steering unit 126 is coupled to the wheels 136 and is configured to adjust the facing of the one or more wheels 136.
The throttle 128 is coupled to the engine 132 and is configured to control an operating speed of the engine 132, thereby to control the speed of the vehicle 100.
The brake unit 130 includes any combination of mechanisms configured to decelerate the vehicle 100. In this embodiment, the brake unit 130 is coupled to the wheels 136 and is configured to use friction to slow rotation of the wheels 136.
In this embodiment, the vehicle 100 is switchable between two different modes of operation, namely an autonomous mode and a manual mode.
In its autonomous mode, the vehicle 100 may control itself without human interaction. In this embodiment, in autonomous mode, the vehicle 100 is fully autonomous. The processor 106 may determine a current state of the vehicle 100 and its environment from measurements taken by the sensor system 102. Also, in autonomous mode, the processor 106 and controller 114 cooperate to control the vehicle 100 based on the vehicle and environment states. In autonomous mode, the processor 106 and controller 114 may control the vehicle 100 without human input, for example without a need for an input from the operator 110.
In its manual mode, the vehicle 100 is driven entirely by the operator 110, i.e. the operator 110 is entirely responsible for vehicle navigation.
Nevertheless, in its manual mode, some functions of the vehicle 100 other than navigation may be performed by the processor 106 and/or the controller 114.
In some embodiments, the vehicle 100 may be operable in a semi-autonomous mode instead of or in addition to one or both of the autonomous mode and the manual mode. In semi-autonomous mode, the operator 110 and the controller 114 in combination navigate the vehicle 100.
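The three modes and the navigation authority in each could be represented as follows. This is purely an illustrative sketch; the patent prescribes no data model, and all names are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    SEMI_AUTONOMOUS = auto()
    AUTONOMOUS = auto()

def navigation_authority(mode):
    """Who navigates the vehicle in each mode, per the description: the
    operator alone in manual mode, the processor/controller alone in
    autonomous mode, and both in combination in semi-autonomous mode."""
    return {
        Mode.MANUAL: {"operator"},
        Mode.SEMI_AUTONOMOUS: {"operator", "controller"},
        Mode.AUTONOMOUS: {"controller"},
    }[mode]
```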
Apparatus, including the processor 106, for implementing the above arrangement, and performing the method steps to be described later below, may be provided by configuring or adapting any suitable apparatus, for example one or more computers or other processing apparatus or processors, and/or providing additional modules. The apparatus may comprise a computer, a
network of computers, or one or more processors, for implementing instructions and using data, including instructions and data in the form of a computer program or plurality of computer programs stored in or on a machine readable storage medium such as computer memory, a computer disk, ROM, PROM etc., or any combination of these or other storage media.
Figure 2 is a schematic illustration (not to scale) showing an example scenario in which the vehicle 100 operates. A process of controlling the vehicle 100 in this scenario is described in more detail later below with reference to Figure 3.
In this scenario, the vehicle 100 operates within an environment 200, and in particular travels over uneven ground (indicated by contour lines 202) between a first point A on the ground and a second point B on the ground. In this scenario, two ruts (each bounded by a respective pair of dotted lines and indicated in Figure 2 by the reference numerals 204) have been formed in the ground between the first and second points A, B. The ruts 204 are depressions or grooves worn into the ground by the previous travel of wheeled vehicles, such as the vehicle 100. Routes travelled by the vehicle 100 between the first point A and the second point B are indicated in Figure 2 by solid arrows and the reference numerals 206 and 208. The reference numeral 206 indicates a first route. The reference numeral 208 indicates a second route.
Figure 3 is a process flow chart showing certain steps of a process of operating the vehicle 100, in the scenario of Figure 2.
It should be noted that certain of the process steps depicted in the flowchart of Figure 3 and described below may be omitted or such process steps may be performed in differing order to that presented above and shown in Figure 3. Furthermore, although all the process steps have, for convenience and ease of understanding, been depicted as discrete temporally-sequential steps, nevertheless some of the process steps may in fact be performed simultaneously or at least overlapping to some extent temporally.
At step s2, using the operator controls 112, the operator 110 controls the vehicle 100 to travel within the environment 200, from the first point A to the
second point B, along the first route 206. At step s2, the vehicle 100 operates in manual mode.
In this embodiment, the first route 206 is a route over the uneven ground that avoids the wheels 136 of the vehicle 100 travelling along the ruts 204. The first route 206 avoiding the ruts 204 means that the first route 206 is relatively bumpy. Thus, following the first route 206 may potentially damage the vehicle 100 (for example, a suspension system of the vehicle 100), may be uncomfortable for any passengers in the vehicle 100 (including the operator 110), and/or may risk the vehicle 100 overturning. In other words, in this embodiment, the first route 206 is an undesirable route through the environment 200 for the vehicle 100.
At step s4, while the vehicle 100 travels along the first route 206, the sensor system 102 takes sensor measurements.
In particular, the GPS receiver 118 continuously or periodically measures the vehicle’s global position as it travels along the first route 206. Also, the camera 120 captures images of the environment 200 as the vehicle 100 travels along the first route 206. The camera 120 may capture two-dimensional images from a forward-looking view with respect to the vehicle 100. Also, the lidar sensor 122 captures range measurements of the vehicle’s surroundings as the vehicle 100 travels along the first route 206.
Also, the vehicle subsystem sensors 123 continuously or periodically measure the state of the vehicle subsystems 116 as the vehicle 100 travels along the first route 206.
The measurement data captured by the sensors 118, 120, 122, 123 are sent to the processor 106, for example in real-time.
At step s6, using the measurements taken by the GPS receiver 118, the processor 106 determines the path taken by the vehicle at step s2, i.e. the processor 106 determines a specification for the first route 206.
Also, using the measurements taken by the camera 120 and the lidar sensor 122, the processor 106 detects the presence and location of obstacles (such as rocks, trees, other vehicles, etc.) within the environment 200 between the first and second points A, B.
Also, using the measurements taken by the vehicle subsystem sensors 123 and/or the GPS receiver 118, the processor 106 determines one or more performance profiles for the vehicle 100 describing how the vehicle 100 traverses the terrain between the first and second points A, B. For example, in some embodiments, the processor 106 determines a speed profile for the vehicle 100 which maps the vehicle’s speed along the first route 206. In some embodiments, the processor 106 determines an acceleration profile for the vehicle 100 which maps the vehicle’s acceleration along the first route 206. Other profiles recording other vehicle parameters may be calculated including, but not limited to, a braking profile and a throttle profile.
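As a hedged illustration of how such profiles could be derived, the sketch below treats GPS fixes as timestamped planar positions and differentiates them numerically. A real implementation would project geodetic coordinates and filter sensor noise; the function names are assumptions, not part of the patent.

```python
import math

def speed_profile(track):
    """Derive speed samples (t_mid, metres/second) from timestamped planar
    positions (t, x, y), roughly as the processor 106 might do from GPS fixes
    taken along the first route 206."""
    profile = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        profile.append(((t0 + t1) / 2, speed))
    return profile

def acceleration_profile(speeds):
    """Derive acceleration samples (t_mid, m/s^2) by differencing a
    speed profile of (t, v) pairs."""
    return [((t0 + t1) / 2, (v1 - v0) / (t1 - t0))
            for (t0, v0), (t1, v1) in zip(speeds, speeds[1:])]
```

Braking and throttle profiles could be built the same way from the vehicle subsystem sensors 123, by pairing actuator positions with timestamps instead of positions.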
At step s8, the processor 106 retrieves the map 124 from the memory 104.
At step s10, using the map 124, the processor 106 determines a second route 208 within the environment 200, from the first point A to the second point B.
In some embodiments, the processor 106 determines the second route 208 only in response to one or more events occurring. Examples of such events include, but are not limited to: a passenger of the vehicle 100 (e.g. the operator 110) or another entity providing an indication to the processor 106 that the first route 206 was uncomfortable or otherwise undesirable; a sensor on the vehicle (e.g. a tilt sensor, which may be referred to as an attitude sensor) measuring that the vehicle 100 experienced a tilt angle above a threshold tilt angle; and a sensor on the vehicle (e.g. a force sensor, a suspension monitoring sensor, etc.) measuring that the vehicle suspension or other subsystem experienced undesirable conditions.
In some embodiments, the processor 106 determines the second route 208 as an optimum route between the first and second points A, B relative to one or more predefined conditions. For example, the processor 106 may determine, as the second route 208: the route between the first and second points A, B that minimises the tilt angle experienced by the vehicle 100; the least bumpy route between the first and second points A, B; the shortest route between the first and second points A, B; or the most fuel-efficient route between the first and second points A, B. Any appropriate optimisation algorithm may be implemented by the processor 106 to determine the second route 208.
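For instance, if the map 124 is discretised into a grid of per-cell traversal costs (high for bumpy or steep cells, low along smooth ground such as the ruts), a shortest-path search yields a minimum-cost route. Dijkstra's algorithm, sketched below, is one possible choice and not a method prescribed by the patent; the grid representation is an assumption for illustration.

```python
import heapq

def plan_route(cost, start, goal):
    """Dijkstra search over a grid of per-cell traversal costs.
    cost: 2D list of non-negative costs; start, goal: (row, col) cells.
    Returns the minimum-cost path as a list of (row, col) cells."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    seen = set()
    while heap:
        d, cell = heapq.heappop(heap)
        if cell in seen:
            continue
        seen.add(cell)
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]  # pay the cost of entering the neighbour
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk predecessors back from the goal to recover the path.
    path, cell = [], goal
    while cell != start:
        path.append(cell)
        cell = prev[cell]
    path.append(start)
    return path[::-1]
```

Minimising tilt, bumpiness, distance, or fuel then reduces to choosing how the per-cell costs are derived from the terrain model.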
In some embodiments, the map 124 specifies the surface of the terrain between the first and second points A, B. The processor 106 may determine the second route 208 based on the terrain surface specified by the map 124.
In this embodiment, the determined second route 208 follows the ruts 204 between the first and second points A, B. In other words, were the vehicle 100 to follow the determined second route 208, the wheels 136 of the vehicle 100 would travel along the ruts 204.
In some embodiments, the map 124 specifies the location of the ruts 204 on the ground between the first and second points A, B. In such embodiments, the processor 106 may determine the second route 208 directly from the map
124, for example, without using sensor measurements captured by the sensor system 102.
In some embodiments, the processor 106 detects the presence and the locations of the ruts 204 on the ground based on measurements taken by the camera 120 and/or the lidar sensor 122. For example, an object detection process may be performed on the images captured by the camera 120 to detect the ruts 204 between the first and second points A, B.
In some embodiments, the processor 106 detects the presence and location of obstacles (such as rocks, trees, other vehicles, etc.) within the environment based on measurements taken by the camera 120 and/or the lidar sensor 122. The processor 106 may determine the second route 208 so as to avoid collision of the vehicle 100 with the detected obstacles. In some embodiments, the processor 106 updates the map 124 to include the detected obstacles.
At step s12, the processor 106 determines whether or not the vehicle 100 moving along the first route 206 (at step s2) constitutes acceptable vehicle operation.
In this embodiment, step s12 comprises the processor 106 comparing the specification of the first route 206 (calculated at step s6) to the determined second route 208. The processor 106 compares the first and second routes 206, 208 to determine whether or not they differ significantly. Any appropriate criteria may be used to assess whether or not the routes 206, 208 differ significantly. For example, in some embodiments, the first and second routes 206, 208 are determined to differ significantly only if a distance between corresponding points along those routes 206, 208 exceeds a threshold distance value.
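The corresponding-points criterion above can be sketched as follows. This assumes both routes have already been resampled to the same number of points (the resampling step is omitted):

```python
import math

def routes_differ_significantly(route_a, route_b, threshold):
    """Return True if any pair of corresponding points on the two
    routes is further apart than the threshold distance. Routes are
    lists of (x, y) points sampled at corresponding positions."""
    return any(
        math.dist(p, q) > threshold
        for p, q in zip(route_a, route_b)
    )

first = [(0, 0), (5, 3.0), (10, 0)]    # route driven by the operator
second = [(0, 0), (5, 0.5), (10, 0)]   # preferred (autonomously planned) route
print(routes_differ_significantly(first, second, threshold=2.0))  # -> True
```

With a threshold of 2.0 the mid-route deviation of 2.5 triggers the "differ significantly" determination; a looser threshold of 3.0 would not.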
Also, in this embodiment, step s12 comprises the processor 106 assessing the one or more performance profiles (calculated at step s6), for example, by comparing them against one or more criteria, thresholds, and/or desired vehicle profiles. For example, the processor 106 may compare a determined speed profile for the vehicle 100 against a speed threshold to determine whether or not the vehicle 100 exceeded a speed limit.
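The speed-limit check mentioned above reduces to scanning a recorded speed profile for samples above the threshold. The profile format (a list of time/speed samples) is an illustrative assumption:

```python
def speed_violations(speed_profile, speed_limit):
    """Return the (time, speed) samples at which the recorded speed
    profile exceeds the speed limit; an empty list means the profile
    satisfies the limit throughout."""
    return [(t, v) for t, v in speed_profile if v > speed_limit]

profile = [(0.0, 8.0), (1.0, 14.5), (2.0, 12.0)]  # (seconds, m/s)
print(speed_violations(profile, speed_limit=13.0))  # -> [(1.0, 14.5)]
```

The same pattern applies to other performance profiles, e.g. comparing an acceleration profile against a comfort threshold.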
In this embodiment, for the purposes of illustration, it is assumed that the processor 106 determines that the first and second routes 206, 208 differ significantly. The first and second routes 206, 208 differing significantly corresponds to the vehicle 100 being navigated between the first and second points A, B in an undesirable manner. In other words, by determining that the first and second routes 206, 208 differ significantly, the processor 106 determines the vehicle was operated in an unacceptable manner.
In some embodiments, the processor 106 may determine that one or more of the performance profiles for the vehicle 100 indicates that the vehicle 100 has been controlled between the first and second points A, B in an unacceptable manner, for example, too fast, too slow, inefficiently, etc.
As explained in more detail below at step s16, in response to determining that the vehicle was operated in an unacceptable manner, the processor 106 will provide, to the operator 110, a request to take control of the vehicle operation.
In some embodiments, the processor 106 may provide feedback (e.g. in real-time) to the operator 110 regarding their unsatisfactory control of the vehicle 100. This feedback may be provided using the operator interface 108. This feedback may, for example, be in the form of a visual display and/or audio signal.
However, in other cases, the processor 106 may determine that the vehicle 100 was operated in an acceptable manner, for example by determining that the first and second routes 206, 208 do not differ significantly. In some embodiments, in response to determining that the vehicle 100 was operated in an acceptable manner, the processor 106 takes no action (i.e. it does not request to take control of the vehicle operation), and/or provides feedback to the operator 110 regarding their acceptable performance.
At step s14, using the operator controls 112, the operator 110 controls the vehicle 100 to return to the first point A.
At step s16, the processor 106 detects that the vehicle 100 has been returned to the first point A (for example, based on GPS measurements received from the GPS receiver 118), and, in response, presents a request to the operator 110. The request is presented to the operator 110 using the operator interface 108.
In this embodiment, the request presented to the operator 110 is a request that the vehicle 100 be switched from manual mode into autonomous mode.
In some embodiments, the request is only presented to the operator 110 if, at step s12, the processor 106 determines that control of the vehicle 100 by the operator 110 along the first route 206 was unacceptable. In some embodiments, if, at step s12, the processor 106 determined that the operator
110 controlled the vehicle 100 in an acceptable manner, no request is presented to the operator 110 and the operator 110 may continue to control the vehicle 100 uninterrupted.
At step s18, the operator 110 inputs a response to the presented request into the operator interface 108. In this embodiment, the operator 110 may input one of two different responses. A first possible response that may be input by the operator 110 specifies that the vehicle 100 should be switched into
autonomous mode. A second possible response that may be input by the operator 110 specifies that the vehicle 100 should not be switched into autonomous mode.
For illustration purposes, in this embodiment, the response provided by the operator 110 at step s18 switches the vehicle 100 into autonomous mode. However, in some situations, the operator 110 may decline the request to switch the vehicle 100 into autonomous mode, and may instead maintain manual control of the vehicle 100.
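The request/response logic of steps s16 to s20 amounts to a small mode state machine: the vehicle switches into autonomous mode only when the operator accepts the request, and otherwise remains in manual mode. The sketch below is a minimal illustration; the `Mode` type and function names are assumptions, not from the patent:

```python
from enum import Enum

class Mode(Enum):
    MANUAL = "manual"
    AUTONOMOUS = "autonomous"

def handle_response(mode, response_accepts_switch):
    """Switch into autonomous mode only on an accepted request made
    while in manual mode; a declined request leaves the operator in
    uninterrupted manual control."""
    if mode is Mode.MANUAL and response_accepts_switch:
        return Mode.AUTONOMOUS
    return mode

print(handle_response(Mode.MANUAL, True).value)   # -> autonomous
print(handle_response(Mode.MANUAL, False).value)  # -> manual
```

A real implementation would additionally disable the operator controls 112 on entering autonomous mode, as described at step s20.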
At step s20, responsive to the vehicle 100 being switched to autonomous mode, the processor 106 sends a specification of the determined second route 208 to the controller 114. In some embodiments, the processor 106 may disable the operator controls 112 in response to the vehicle 100 being switched into autonomous mode.
At step s22, the controller 114 controls the vehicle subsystems 116 in accordance with the information received from the processor 106, thereby controlling the vehicle 100 to travel within the environment 200, from the first point A to the second point B, along the second route 208.
In particular, in this embodiment, the controller 114 generates control signals for the vehicle subsystems 116 based on the received specification of the second route 208. The control signals are then sent from the controller 114 to the vehicle subsystems 116 to control the vehicle subsystems 116 so as to cause the vehicle 100 to follow the second route 208.
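The control loop at step s22 can be sketched as a point vehicle stepping towards each waypoint of the second route 208 in turn. This is a stand-in for the control signals sent to the vehicle subsystems 116; real steering and throttle dynamics, and the vehicle's kinematic constraints, are omitted:

```python
import math

def follow_route(start, waypoints, step=1.0, tol=0.5):
    """Move a point vehicle through a list of (x, y) waypoints,
    stepping towards the current waypoint on each control tick.
    Returns the full path travelled, ending near the last waypoint."""
    x, y = start
    path = [(x, y)]
    for wx, wy in waypoints:
        dist = math.hypot(wx - x, wy - y)
        while dist > tol:
            move = min(step, dist)               # never overshoot the waypoint
            heading = math.atan2(wy - y, wx - x)
            x += move * math.cos(heading)
            y += move * math.sin(heading)
            path.append((x, y))
            dist = math.hypot(wx - x, wy - y)
    return path

path = follow_route((0.0, 0.0), [(3.0, 0.0), (3.0, 4.0)])
print(path[-1])  # final position is (approximately) the last waypoint
```

In practice the controller would regenerate control signals at each tick from the vehicle's measured position, so that disturbances (slopes, wheel slip) are corrected rather than accumulated.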
During autonomous navigation along the second route 208, the processor 106 may detect the presence and location of obstacles (such as rocks, trees, other vehicles, etc.) within the environment 200 based on measurements taken by the camera 120 and/or the lidar sensor 122. The processor 106 and controller 114 may operate to control the vehicle 100 to avoid collision with the detected obstacles. In some embodiments, the processor 106 updates the map 124 to include the detected obstacles.
The operator 110 is located within the vehicle 100 as the vehicle 100 navigates autonomously along the second route 208. Thus, the vehicle 100
demonstrates to the operator 110 preferred operation of the vehicle 100 between the first and second points A, B. In other words, the operator 110 receives training on how best to navigate the vehicle 100 in the environment 200.
After autonomous navigation along the second route 208, the vehicle 100 may be switched back into manual mode, and the process of Figure 3 may be iteratively repeated one or more times. At each iteration, the operator 110 may navigate the vehicle 100 taking into account the demonstration(s) provided by the vehicle 100 during the previous iteration(s).
Thus, a process of operating the vehicle 100 is provided.
Advantageously, the vehicle monitors operator driving behaviour and vehicle driving conditions to provide feedback (e.g. real-time feedback) to the operator. The vehicle tends to inform the operator when they are performing a manoeuvre in an unsafe or non-optimal way. Providing real-time feedback to the operator advantageously tends to facilitate the operator associating the feedback with particular manoeuvres, thus facilitating the operator in correcting their behaviour.
The above described system and method may be used to train and mentor the operator in the preferred operation of the vehicle, thereby reducing the likelihood of damage to the vehicle and improving safety for the operator, vehicle passengers, and users of other vehicles. Training of a human operator using an autonomous vehicle tends to be contrary to the conventional purpose of autonomous vehicles, which usually aim to eliminate a need for a human operator.
The above described system and method tend to be particularly useful at providing training to operators that are inexperienced at operating the vehicle.
Advantageously, the above described system and method tend to be implementable on existing autonomous or semi-autonomous vehicles.
The above described vehicle tends to perform the role of a human instructor. The vehicle may suggest to the operator a correct way to perform a particular manoeuvre and/or teach the operator how to drive, more smoothly or more quickly, a path that has been previously mapped. A need for a human instructor tends to be reduced or eliminated.
In the above embodiments, the vehicle is a land-based vehicle configured to be controlled to move over land. For example, the vehicle may be a car, a truck, a motorcycle, a bus, etc. However, in other embodiments, the vehicle is a different type of vehicle. For example, in some embodiments, the vehicle is a different type of land-based vehicle such as a tracked vehicle, e.g., a tank. In some embodiments, the vehicle is an aircraft, such as an aeroplane or a helicopter. In some embodiments, the vehicle is a water-based vehicle such as a boat.
In the above embodiments, the sensor system is located on the vehicle. However, in other embodiments, one or more of the sensors of the sensor system is remote from the vehicle.
In the above embodiments, the processor is located on the vehicle. However, in other embodiments, some or all of the processor is remote from the vehicle. For example, the processor may be distributed across multiple different entities.
In the above embodiments, the memory is located on the vehicle. However, in other embodiments, the memory is remote from the vehicle.
In the above embodiments, the operator, the operator interface, and the operator controls are located on board the vehicle. However, in other embodiments, the operator, the operator interface, and the operator controls are located remotely from the vehicle. The vehicle may be an unmanned vehicle, such as an unmanned air vehicle (UAV).
In the above embodiments, the camera and the lidar sensor measure a state of the vehicle’s surroundings. In some embodiments, one or more other sensors instead of or in addition to the camera and the lidar sensor measure the environment in which the vehicle operates. For example, a temperature sensor may measure a temperature of the environment, e.g. to detect icy conditions, and/or a light sensor may be used to assess visibility. Appropriate routes may
be suggested and/or demonstrated to the operator depending upon the measured environmental conditions.
In the above embodiments, a map is stored on the vehicle. However, in other embodiments, a map is not stored on the vehicle. In some embodiments, the vehicle constructs a map of its environment based on sensor data collected as it moves around its environment. In some embodiments, the vehicle acquires the map from an entity remote from the vehicle. For example, in some embodiments, responsive to detecting undesirable vehicle operation (e.g. exceeding a speed limit), the vehicle may download a map of the area in which the undesirable operation took place, and determine a preferred operation using the downloaded map. This preferred operation may subsequently be presented or demonstrated to the operator.
In the above embodiments, initially the operator controls the vehicle along the first route. Subsequently, and in response to detecting undesirable vehicle operation, the vehicle autonomously navigates along a different route to demonstrate to the operator more preferable vehicle operation. However, in other embodiments, a different process is performed. For example, in some embodiments, no autonomous navigation takes place, and instead only feedback on the operator’s performance is presented to the operator. In some embodiments, the vehicle may firstly autonomously navigate through the environment to demonstrate to the operator preferred operation. On subsequent passes through the environment, control of the vehicle may be handed over to the operator, for example, gradually by starting with shorter, easier segments before allowing the operator to perform more complex tasks.
In some embodiments, the vehicle additionally includes a transmitter. The processor may, using the transmitter, transmit details of undesirable vehicle operation to an entity remote from the vehicle, for example to a shared service configured to collect this information and to group together operators with similar driving behaviour. This could be used to enable the vehicle processing system to request control of the vehicle before it reaches a situation in which similar operators tend to perform sub-optimally.

Claims (14)

1. A system for controlling a vehicle (100), the vehicle (100) having an autonomous mode of operation, the system comprising:
one or more sensors (102) configured to measure one or more
parameters associated with an operator (110) controlling the vehicle (100) to perform a first operation; and
one or more processors (106) configured to:
determine that measurements taken by the one or more sensors (102) fulfil one or more predetermined criteria;
responsive to determining that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria, determine a second operation for performance by the vehicle (100); and
control the vehicle (100) in the autonomous mode such that the vehicle (100) performs the second operation.
2. A system according to claim 1, wherein:
the first operation comprises the vehicle (100) travelling along a first route (206) between two points (A, B); and
the second operation comprises the vehicle (100) travelling along a second route (208) between the two points (A, B), the second route (208) being different to the first route (206).
3. A system according to claim 1 or 2, wherein the one or more processors (106) are further configured to:
using the measurements taken by the one or more sensors (102), determine a first route (206) travelled by the vehicle (100) during the first operation, the first route (206) being between two points (A, B);
determine a second route (208) between the two points (A, B); compare the first route (206) and the second route (208); and,
based on the comparison between the first route (206) and the second route (208), determine that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria.
4. A system according to claim 3, wherein the one or more processors (106) are configured to determine the second route (208) between the two points (A, B) using a map (124) of an environment (200) between the two points (A, B).
5. A system according to claim 4, wherein:
the one or more sensors (102) are further configured to measure parameters associated with the environment (200) between the two points (A, B); and
the one or more processors (106) are configured to generate the map (124) using measurements taken by the one or more sensors (102) of the parameters associated with the environment (200) between the two points (A, B).
6. A system according to any of claims 1 to 5, wherein the one or more predetermined criteria comprise one or more criteria that one or more sensor measurements exceed respective thresholds.
7. A system according to any of claims 1 to 6, wherein the one or more sensors (102) include one or more sensors selected from the group of sensors
consisting of: a vehicle position sensor (118), an image sensor (120, 122), a sensor configured to measure a state of a subsystem of the vehicle (123), a vehicle speed sensor, a vehicle acceleration sensor, a vehicle attitude sensor, and a force sensor.
8. A system according to any of claims 1 to 7, wherein the operator (110) is located on or in the vehicle (100).
9. A system according to any of claims 1 to 8, wherein the one or more processors (106) are configured to provide, to the operator (110), a notification that the one or more predetermined criteria have been fulfilled.
10. A system according to any of claims 1 to 9, wherein the one or more processors (106) are configured to:
responsive to determining that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria, provide, to the operator (110), a request that the vehicle (100) be operated in the autonomous mode; and,
only in response to receiving an acceptance of the request from the operator (110), control the vehicle (100) in the autonomous mode such that the vehicle (100) performs the second operation.
11. A system according to any of claims 1 to 10, wherein the vehicle (100) is a land-based vehicle (100).
12. A vehicle (100) having an autonomous mode of operation, the vehicle (100) comprising a system according to any of claims 1 to 11.
13. A method for controlling a vehicle (100), the vehicle (100) having an
autonomous mode of operation, the method comprising:
measuring, by one or more sensors (102), one or more parameters associated with an operator (110) controlling the vehicle (100) to perform a first operation;
determining that measurements taken by the one or more sensors (102) fulfil one or more predetermined criteria;
responsive to determining that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria, determining a
second operation for performance by the vehicle (100); and controlling the vehicle (100) in the autonomous mode such that the vehicle (100) performs the second operation.
14. A program or plurality of programs arranged such that when executed by a computer system or one or more processors (106) it/they cause the computer system or the one or more processors (106) to:
receive sensor data comprising measurements of one or more parameters associated with an operator (110) controlling a vehicle (100) to perform a first operation;
determine that measurements taken by the one or more sensors (102) fulfil one or more predetermined criteria;
responsive to determining that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria, determine a second operation for performance by the vehicle (100); and
control the vehicle (100) in an autonomous mode such that the vehicle (100) performs the second operation.
15. A machine readable storage medium storing the program or at least one of the plurality of programs according to claim 14.
Intellectual Property Office
Application No: GB1617915.2 Examiner: Mr Joseph Mitchell
GB1617915.2A 2016-10-24 2016-10-24 Control of automonous vehicles Pending GB2555397A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1617915.2A GB2555397A (en) 2016-10-24 2016-10-24 Control of automonous vehicles

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1617915.2A GB2555397A (en) 2016-10-24 2016-10-24 Control of automonous vehicles
PCT/GB2017/053148 WO2018078335A1 (en) 2016-10-24 2017-10-18 Control of autonomous vehicles
EP17784673.0A EP3529680A1 (en) 2016-10-24 2017-10-18 Control of autonomous vehicles
US16/344,223 US20190243359A1 (en) 2016-10-24 2017-10-18 Control of autonomous vehicles

Publications (2)

Publication Number Publication Date
GB201617915D0 GB201617915D0 (en) 2016-12-07
GB2555397A true GB2555397A (en) 2018-05-02

Family

ID=57738025

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1617915.2A Pending GB2555397A (en) 2016-10-24 2016-10-24 Control of automonous vehicles

Country Status (1)

Country Link
GB (1) GB2555397A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2571154A (en) * 2018-02-15 2019-08-21 Jaguar Land Rover Ltd Vehicle Control System And Control Method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160355190A1 (en) * 2014-02-12 2016-12-08 Denso Corporation Driving assist device
EP3115272A1 (en) * 2015-07-06 2017-01-11 Toyota Jidosha Kabushiki Kaisha Control system of automated driving vehicle

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160355190A1 (en) * 2014-02-12 2016-12-08 Denso Corporation Driving assist device
EP3115272A1 (en) * 2015-07-06 2017-01-11 Toyota Jidosha Kabushiki Kaisha Control system of automated driving vehicle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2571154A (en) * 2018-02-15 2019-08-21 Jaguar Land Rover Ltd Vehicle Control System And Control Method
GB2571154B (en) * 2018-02-15 2020-04-22 Jaguar Land Rover Ltd Vehicle Control System And Control Method

Also Published As

Publication number Publication date
GB201617915D0 (en) 2016-12-07

Similar Documents

Publication Publication Date Title
US10591919B1 (en) Avoiding blind spots of other vehicles
US10000216B2 (en) Engaging and disengaging for autonomous driving
US9766333B1 (en) Use of motion data in the processing of automotive radar image processing
US9255805B1 (en) Pose estimation using long range features
US9381918B1 (en) Modifying speed of an autonomous vehicle based on traffic conditions
EP2771751B1 (en) Sensor field selection
US20130197736A1 (en) Vehicle control based on perception uncertainty
US20190243359A1 (en) Control of autonomous vehicles
GB2555397A (en) Control of automonous vehicles
US10871777B2 (en) Autonomous vehicle sensor compensation by monitoring acceleration
US10546499B2 (en) Systems and methods for notifying an occupant of a cause for a deviation in a vehicle
EP3312697A1 (en) Control of autonomous vehicles
US20200142405A1 (en) Systems and methods for dynamic predictive control of autonomous vehicles
US20200018612A1 (en) Mapping of temporal roadway conditions
CN112987053A (en) Method and apparatus for monitoring yaw sensor