GB2577676A - Control system for a vehicle - Google Patents


Info

Publication number
GB2577676A
GB2577676A
Authority
GB
United Kingdom
Prior art keywords
vehicle
cost
path
terrain
cost data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1815337.9A
Other versions
GB2577676B (en)
GB201815337D0 (en)
Inventor
Fairgrieve Andrew
John King Paul
Ravi Bineesh
Jayaraj Jithin
Jayaprakash Krishna
Kotteri Jithesh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1815337.9A
Publication of GB201815337D0
Publication of GB2577676A
Application granted
Publication of GB2577676B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0011 Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A control system for a vehicle (100, Fig. 1) has at least one controller 10 and obtains first cost data 800 and second cost data 880, the second cost data being different from the first, with both sets of cost data relating to a portion of terrain over which the vehicle intends to travel. In use, the first cost data and the second cost data are used to determine the route of the vehicle over the terrain. The first cost data may be used to generate a cost map for the wheels of the vehicle, and the second cost data may be used to generate a cost map for the body of the vehicle. The cost data may be determined by a range of sensors, such as image sensors (185C, Fig. 3); such sensors may determine both an obstacle and a non-obstacle cost data map, and an optimum trajectory can be selected from a range of possible vehicle trajectories based upon the trajectory cost data. The cost data relates to a penalty or a reward associated with a potential vehicle route across the terrain.

Description

CONTROL SYSTEM FOR A VEHICLE
TECHNICAL FIELD
The present disclosure relates to a control system and particularly, but not exclusively, to a control system for a vehicle. Aspects of the invention relate to a control system, a method, a vehicle, a computer program product and a non-transitory computer readable medium.
BACKGROUND
Occupancy grid maps are used in robotics and autonomous vehicle systems to provide a binary indication of whether each cell in a grid is occupied, and therefore whether that cell is traversable. In cost maps, by contrast, non-binary values can be allocated to the cells, so that cells can be weighted according to the cost of traversing them.
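By way of illustration, the following sketch (Python with NumPy; all cell values are hypothetical) contrasts the two representations: a binary occupancy grid that only records whether a cell is traversable, and a cost map whose cells carry weighted, non-binary traversal costs.

```python
import numpy as np

# Illustrative only: a binary occupancy grid marks each cell as free (0) or
# occupied (1), so a cell is either traversable or not.
occupancy_grid = np.zeros((5, 5), dtype=np.uint8)
occupancy_grid[2, 3] = 1  # an occupied, non-traversable cell

# A cost map instead stores a weighted, non-binary value per cell, e.g. 0.0
# for easy ground rising towards 1.0 for an impassable obstacle.
cost_map = np.full((5, 5), 0.1)   # low baseline cost for smooth ground
cost_map[2, 3] = 1.0              # impassable obstacle
cost_map[1, 1:4] = 0.6            # e.g. a steep side slope: passable but costly

# A planner can then trade distance against accumulated cell cost, rather
# than simply avoiding occupied cells.
```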
However, existing cost map and occupancy grid based vehicle control systems can often cause vehicles to travel along sub-optimal paths across a terrain.
It is an object of embodiments of the invention to at least mitigate one or more of the problems of the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system, a method, a vehicle and a non-transitory computer readable medium as claimed in the appended claims.
According to a first aspect of the invention, there is provided a control system for a vehicle configured to determine a (typically future) path for the vehicle.
According to another aspect of the invention, there is provided a control system for a vehicle, the control system comprising at least one controller. It may be that the control system is configured to: obtain first cost data relating to at least a portion of terrain to be traversed by the vehicle from a first cost data structure; and obtain second cost data relating to at least a portion of the terrain to be traversed by the vehicle from a second cost data structure comprising different cost data to the first cost data structure. It may be that the control system is configured to determine a (typically future) vehicle path in dependence on the first and second cost data.
By obtaining cost data relating to the terrain from different cost data structures, a more optimal vehicle path can advantageously be determined.
It may be that the control system is configured to operate in an autonomous driving mode, such as a driving mode having level 1, 2, 3, 4 or 5 autonomy (e.g. level 2 autonomy). It may be that the control system is configured to operate in an autonomous off-road driving mode. It may be that the control system is configured to operate in an autonomous low-speed cruise control driving mode, or in both an autonomous low-speed cruise control driving mode and an off-road driving mode.
It may be that the vehicle path is a future path for the vehicle. It may be that the terrain is off-road terrain.
It may be that the first and second cost data structures are discrete from each other.
It may be that the first cost data comprises cost data relating to a portion of the terrain, and the second cost data comprises cost data relating to the same said portion of the terrain.
It will be understood that the cost data typically relates to the cost for the vehicle to traverse the said at least a portion of the terrain.
It may be that the at least one controller collectively comprises: at least one electronic processor; and at least one electronic memory device electrically coupled to the at least one electronic processor having instructions stored therein, wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon so as to obtain the first and second cost data and to determine the vehicle path therefrom.
It may be that the control system is configured to control the vehicle in dependence on the determined vehicle path. It may be that the control system is configured to control a steering angle of one or more wheels of the vehicle in dependence on the determined vehicle path. It may be that the control system is configured to control a speed of the vehicle in dependence on the determined vehicle path.
It may be that the first and second cost data structures relate to the cost(s) for different portions of the vehicle to traverse at least a portion of the terrain. As it may be that there are different costs for different portions of the vehicle to traverse a given portion of the terrain, providing different cost data structures for different portions of the vehicle to traverse at least a portion of the terrain advantageously allows the control system to determine more accurate costs for the vehicle to traverse candidate trajectories across the terrain, thereby enabling a more optimal vehicle path to be determined.
It may be that at least one of the first and second cost data structures is a direction dependent cost data structure indicative of cost(s) for the vehicle to traverse one or more portions of the terrain dependent on a direction of travel of the vehicle with respect to the said one or more portions of the terrain. By providing a direction dependent cost data structure, direction dependent cost data can be taken into account by the control system to determine more accurate costs for the vehicle to traverse respective candidate trajectories across the terrain, thereby enabling a more optimal vehicle path to be determined. For example, a vehicle wheel travelling in a first direction across a feature of the terrain (such as a rut) may incur a greater cost to the vehicle than the vehicle wheel travelling in a second direction over the said feature perpendicular to the first direction. By the direction dependent cost data structure comprising direction dependent cost data, a greater cost can be allocated to a portion of a path extending over the said feature in the first direction than to a portion of a path extending over the feature in the second direction, thus providing more accurate cost data, allowing a more optimal vehicle path to be determined in dependence thereon. It may be that the direction dependent cost data structure is (for example) a line features cost data structure indicative of one or more line features of the terrain or a wheel cost map which takes into account directions of one or more candidate trajectories of the vehicle across the terrain.
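A minimal sketch of how such direction dependent cost data might be evaluated for the rut example above. The function name, the cost values and the cosine-based weighting are illustrative assumptions, not details taken from this disclosure.

```python
import numpy as np

def direction_dependent_cost(travel_dir, feature_dir,
                             along_cost=0.1, across_cost=0.9):
    """Hypothetical cost for traversing a line-like terrain feature (e.g. a
    rut): low when travelling along the feature, high when crossing it.

    travel_dir, feature_dir: 2D unit vectors in the ground plane.
    """
    # |cos(angle)| is 1 when aligned with the feature, 0 when perpendicular.
    alignment = abs(float(np.dot(travel_dir, feature_dir)))
    return across_cost + (along_cost - across_cost) * alignment

rut_direction = np.array([1.0, 0.0])
print(direction_dependent_cost(np.array([1.0, 0.0]), rut_direction))  # ~0.1
print(direction_dependent_cost(np.array([0.0, 1.0]), rut_direction))  # ~0.9
```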
It may be that the first and second cost data structures comprise or consist of any two of: a wheel cost map indicative of respective cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle; a body cost map indicative of respective cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle; a line features cost data structure indicative of one or more line features of the terrain.
By providing a wheel cost map, the control system can advantageously determine costs for the vehicle to traverse respective candidate trajectories across the terrain based on projected locations of the wheels of the vehicle (rather than a projected location of the entire vehicle) on the terrain independently of the body of the vehicle. As different portions of the vehicle may be affected differently by a particular portion of the terrain, this helps to provide more accurate costs for the respective candidate trajectories of the vehicle, thereby enabling a more optimal vehicle path to be determined.
It may be that the wheel cost map comprises cost data indicative of respective cost(s) for the wheels of the vehicle to traverse one or more portions of the terrain independently of the rest of the vehicle. It may be that the wheel cost map includes cost data which depends on any one or more of: a gradient of the terrain in a projected direction of travel of the vehicle; a side slope of the terrain transverse to a projected direction of travel of the vehicle; whether the terrain relates to a path region or a non-path region of the terrain.
By providing a body cost map, the control system can advantageously determine a cost for the vehicle body (rather than the vehicle wheels or the vehicle as a whole) to traverse respective candidate trajectories across the terrain (typically including whether there are obstacles on one or more portions of the terrain which the vehicle should avoid) independently of the rest of the vehicle. As the vehicle body may be affected differently from other parts of the vehicle by one or more portions of the terrain, this helps to provide more accurate costs for the vehicle to traverse the terrain, thereby enabling a more optimal vehicle path to be determined. It may be that the control system is configured to determine whether one or more candidate trajectories of the vehicle across the terrain are passable by the vehicle based on the body cost map.
It may be that the body cost map comprises cost data indicative of the cost for a volume of the body of the vehicle to traverse one or more portions of the terrain independently of the rest of the vehicle. It may be that the body cost map includes cost data relating to one or more obstacles of the terrain, such as one or more three dimensional, 3D, obstacles of the terrain.
It may be that the body cost map includes cost data relating to one or more objects (e.g. branches, bushes) overhanging a ground level (e.g. a path region on a ground level) of the terrain.
It may be that the body cost map comprises cost data which is dependent on the minimum or maximum elevation(s) of one or more obstacles or overhanging objects relative to a minimum and/or maximum elevation of the body of the vehicle. For example, it may be that the body cost map comprises cost data which takes into account whether an obstacle has a maximum elevation which is greater than the minimum elevation of the body of the vehicle. In another example, it may be that the body cost map comprises cost data which takes into account whether an overhanging object has a minimum elevation which is less than the maximum elevation of the vehicle. This advantageously allows the body cost map to account for whether or not the body of the vehicle would engage or clear the obstacle or overhanging object if it traversed the terrain across a particular trajectory. Again, this enables a more accurate cost, and thus a more optimal vehicle path, to be determined.
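The clearance test described above can be pictured as follows; this is a hedged sketch in which the function, its arguments and the example elevations are all hypothetical.

```python
def body_engages_terrain(obstacle_max_elev, overhang_min_elev,
                         body_min_elev, body_max_elev):
    """Hypothetical clearance test for a body cost map cell.

    Returns True if the vehicle body would strike either an obstacle rising
    above the body's minimum elevation (underbody clearance) or an
    overhanging object dipping below the body's maximum elevation (roof
    clearance). Elevations are relative to a common ground reference; pass
    None where no obstacle or overhang is present.
    """
    hits_obstacle = (obstacle_max_elev is not None
                     and obstacle_max_elev > body_min_elev)
    hits_overhang = (overhang_min_elev is not None
                     and overhang_min_elev < body_max_elev)
    return hits_obstacle or hits_overhang

# A 0.4 m rock clears a 0.5 m underbody, but a branch at 1.7 m is below a
# 1.9 m roof line, so this cell would receive a high (impassable) body cost.
print(body_engages_terrain(0.4, 1.7, 0.5, 1.9))  # True
```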
It may be that the line features cost data structure comprises cost data relating to one or more line features of the terrain to be traversed by the vehicle. It may be that the line features relate to one or more boundaries, the crossing of which has a high cost for the vehicle. For example, one or more line features of the cost data structure may relate to a boundary between a path region of the terrain and a non-path region of the terrain. It may be that the control system is configured to allocate a higher cost to a trajectory crossing a (or each) line feature of the line features cost data structure than to a trajectory parallel to the said line feature(s). Thus, the control system can take into account boundaries across which the determined vehicle path should not cross. This helps the control system to determine a more optimal vehicle path.
It may be that each of the line feature(s) of the line features data structure comprises line feature location data defining the location of the line feature. It may be that the line feature location data comprises a plurality of location points, or data representing a best fit line through a plurality of location points (for example). The line features data structure may also comprise direction data indicative of a crossing direction of the line features, although this may be implicit in the shape of the line feature in which case it may not be necessary to store direction data in the line features data structure.
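As a sketch of how a line feature defined by location points might be used, the following hypothetical code penalises a candidate trajectory for each crossing of a boundary line, while a trajectory running parallel to the boundary incurs no such penalty.

```python
def segments_cross(p1, p2, q1, q2):
    """Return True if segment p1-p2 strictly crosses segment q1-q2 (2D)."""
    def orient(a, b, c):
        return (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])
    return (orient(p1, p2, q1) * orient(p1, p2, q2) < 0 and
            orient(q1, q2, p1) * orient(q1, q2, p2) < 0)

def line_feature_cost(trajectory, line_feature, crossing_penalty=100.0):
    """Hypothetical: add a large penalty for each boundary crossing.

    trajectory and line_feature are lists of (x, y) location points; the
    line feature could equally be a best-fit line through such points.
    """
    cost = 0.0
    for i in range(len(trajectory) - 1):
        for j in range(len(line_feature) - 1):
            if segments_cross(trajectory[i], trajectory[i+1],
                              line_feature[j], line_feature[j+1]):
                cost += crossing_penalty
    return cost

boundary = [(0.0, -5.0), (0.0, 5.0)]                    # path/non-path boundary
print(line_feature_cost([(-1, 0), (1, 0)], boundary))   # crosses: 100.0
print(line_feature_cost([(-1, 1), (-1, 3)], boundary))  # parallel: 0.0
```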
It will be understood that the term "cost" may relate to a penalty or a reward associated with a portion of the terrain to be traversed by the vehicle. An increased or relatively high cost may relate to an increased or relatively high penalty or a reduced or relatively low reward. Similarly a reduced or relatively low cost may relate to a reduced or relatively low penalty or an increased or relatively high reward. An intermediate cost may relate to a cost or reward between a relatively low penalty and a relatively high penalty, between a relatively low reward and a relatively high reward, or between a penalty and a reward.
It may be that the first cost data structure comprises a wheel cost map indicative of cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle; and the second cost data structure comprises a body cost map indicative of cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle. By providing separate wheel and body cost maps, the different effects of the terrain on the wheels and body of the vehicle can be accounted for, enabling more accurate costs to be determined for the vehicle to traverse respective candidate trajectories across the terrain, allowing a more optimal vehicle path to be determined (e.g. as compared to providing a single cost map which does not separate wheel and body cost data).
It may be that the control system is configured to: obtain third cost data relating to one or more portions of the terrain from a third cost data structure comprising different cost data to the first and second cost data structures; and determine a (future) vehicle path in dependence on the first, second and third cost data. Typically the third cost data structure is discrete from the first and second cost data structures.
It may be that: the first cost data structure comprises or consists of a wheel cost map indicative of cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle; the second cost data structure comprises or consists of a body cost map indicative of cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle; and the third cost data structure comprises or consists of a line features cost data structure indicative of one or more line features of the terrain. By determining the vehicle path in dependence on cost data from a wheel cost map, a body cost map and a line features cost data structure, the synergistic benefits of all three cost maps are obtained in order to determine a more accurate cost associated with a candidate vehicle path, enabling a more optimal path to be determined.
It may be that the control system is configured to: obtain environment data relating to the terrain; and determine (e.g. generate or update) the respective cost data structures in dependence on the environment data. It may be that the environment data comprises environment sensor data from one or more environment sensors of the vehicle. It may be that the environment sensor data comprises image data relating to the terrain from one or more image sensors of the vehicle.
It may be that the control system is configured to determine one or more of the respective cost data structures based on image data received from one or more image sensors of the vehicle.
It may be that the control system is configured to determine one or more of the respective cost data structures based on three dimensional, 3D, environment data from one or more 3D environment sensors of the vehicle, such as a stereo vision imaging system, a radar-based terrain ranging system, a laser-based terrain ranging system or an acoustic ranging system.
It may be that the one or more image sensors and the one or more 3D environment sensors of the vehicle are provided by a stereoscopic camera system of the vehicle.
It may be that the control system is configured to determine the cost data structures based on both image data received from one or more image sensors of the vehicle and 3D environment data from one or more 3D environment sensors of the vehicle.
It may be that one of the first and second cost data structures comprises a wheel cost map indicative of respective cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle, and that the control system is configured to determine the wheel cost map in dependence on image data from one or more image sensors of the vehicle. For example, it may be that the control system is configured to: determine from the image data whether each of one or more portions of the terrain relates to a path region or a non-path region of the terrain; and determine the wheel cost map in dependence thereon. It may be that the wheel cost map allocates a higher cost to a non-path region of the terrain than to a path region of the terrain. It may be that the control system is configured to determine whether each of the said portions of the terrain relates to a path region or a non-path region of the terrain by determining image content data based on a respective sub-region of the image data and comparing the image content data to a path model relating to the path region of the terrain and/or a non-path model relating to the non-path region of the terrain.
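One plausible reading of the path/non-path classification described above is sketched below: a colour descriptor is computed for a sub-region of the image data and compared to path and non-path appearance models. The Gaussian model form, the LAB-like descriptor values and the cost values are assumptions for illustration only.

```python
import numpy as np

class GaussianRegionModel:
    """Hypothetical appearance model (e.g. mean LAB colour) for path or
    non-path terrain; the disclosure leaves the model form open."""
    def __init__(self, mean, var):
        self.mean = np.asarray(mean, dtype=float)
        self.var = np.asarray(var, dtype=float)

    def log_likelihood(self, descriptor):
        d = np.asarray(descriptor, dtype=float) - self.mean
        return float(-0.5 * np.sum(d * d / self.var
                                   + np.log(2 * np.pi * self.var)))

def wheel_cost_for_subregion(subregion_pixels, path_model, non_path_model,
                             path_cost=0.1, non_path_cost=0.8):
    """Classify a sub-region of image data and allocate a wheel cost:
    higher for non-path terrain than for path terrain."""
    descriptor = np.mean(subregion_pixels.reshape(-1, 3), axis=0)
    is_path = (path_model.log_likelihood(descriptor) >
               non_path_model.log_likelihood(descriptor))
    return path_cost if is_path else non_path_cost

path_model = GaussianRegionModel(mean=[55, 5, 10], var=[60, 20, 20])
non_path_model = GaussianRegionModel(mean=[40, -20, 30], var=[80, 30, 30])
grass_like = np.full((8, 8, 3), [42, -18, 28], dtype=float)
print(wheel_cost_for_subregion(grass_like, path_model, non_path_model))  # 0.8
```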
It may be that the control system is configured to determine the wheel cost map (where provided) in dependence on three dimensional, 3D, environment data from one or more 3D environment sensors of the vehicle. For example, it may be that the control system is configured to determine (e.g. generate or update) the wheel cost map (where provided) depending on data relating to a gradient in a projected direction of travel of the vehicle or a side slope of a path region of the terrain determined from said 3D environment data. It may be that the control system is configured to determine 3D environment data from stereoscopic image data obtained by a stereoscopic camera of the vehicle, and to determine the said gradient or side slope based on the 3D environment data. It may be that the 3D environment data comprises 3D point cloud data or a 3D grid (e.g. multi-level surface) map (e.g. a 3D grid map derived from 3D point cloud data).
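The gradient and side slope terms might be derived from per-cell surface normals of the 3D grid map along the lines sketched below; the decomposition is standard geometry, but the cost weighting is a hypothetical choice.

```python
import numpy as np

def wheel_slope_cost(normal, heading, k_gradient=1.0, k_side=2.0):
    """Hypothetical wheel cost term from terrain slope. The 3D grid (e.g.
    multi-level surface) map is assumed to provide a unit surface normal
    per cell; the weights and the form of the cost are illustrative only.

    normal:  (nx, ny, nz) unit normal of the cell plane, nz > 0.
    heading: (hx, hy) unit vector, projected direction of travel.
    """
    nx, ny, nz = normal
    gx, gy = -nx / nz, -ny / nz           # height gradient of the cell plane
    hx, hy = heading
    gradient = abs(gx * hx + gy * hy)     # slope along the direction of travel
    side_slope = abs(-gx * hy + gy * hx)  # slope transverse to travel
    # Side slopes are penalised more heavily than fore-aft gradients here,
    # purely as an illustrative weighting.
    return k_gradient * np.arctan(gradient) + k_side * np.arctan(side_slope)

# A cell tilted 10 degrees about the x-axis, with the vehicle heading along x:
theta = np.radians(10.0)
cell_normal = (0.0, np.sin(theta), np.cos(theta))
print(wheel_slope_cost(cell_normal, (1.0, 0.0)))  # pure side-slope term
```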
It may be that the control system is configured to determine a plurality of wheel cost maps, each relating to a different candidate trajectory of the vehicle across the terrain. In this case, it may be that the cost data in each of the said wheel cost maps is dependent on the direction(s) followed by the respective candidate trajectory to which the wheel cost map relates across the terrain. It may be that the control system is configured to determine a (typically future) vehicle path in dependence on cost data from the said plurality of wheel cost maps (and typically on the associated candidate trajectories).
Alternatively, it may be that the cost data of the wheel cost map comprises more generalised (e.g. non-directional) cost data which can be used to determine the cost for the wheels of the vehicle to traverse the terrain by way of each of the candidate trajectories. Advantageously, determining more generalised (e.g. non-directional) cost data is computationally simpler than determining a plurality of wheel cost maps comprising directional cost data.
It may be that one of the first and second cost data structures comprises a body cost map indicative of respective cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle, and that the control system is configured to determine the body cost map in dependence on three dimensional, 3D, environment data from one or more 3D environment sensors of the vehicle. It may be that the control system is configured to determine from the said 3D environment data the presence of an obstacle on, or overhanging a ground level (e.g. path region) of, the terrain and to determine the body cost map (where provided) in dependence on an elevation of the obstacle relative to the elevation of the body of the vehicle.
It may be that one of the first and second cost data structures comprises a line features cost data structure, and wherein the control system is configured to determine one or more line features of the line features cost data structure in dependence on image data from one or more image sensors of the vehicle. For example, the control system may be configured to determine one or more line features of the line features cost data structure by determining based on image data captured by one or more image sensors of the vehicle one or more boundaries between path and non-path regions of the terrain. It may be that the control system is configured to determine one or more line features of the line features cost data structure by obtaining image data relating to the terrain from one or more image sensors of the vehicle, and for each of a plurality of sub-regions of the image data: determining path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determining non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determining one or more boundaries between path and non-path regions of the terrain based on the path and non-path probability data. By determining boundaries between path and non-path regions of the terrain in this way, an accurate determination of the one or more boundaries can advantageously be determined.
It may be that the control system is configured to determine the said one or more boundaries between path and non-path regions of the terrain by: determining a first pair of path boundaries based on the path probability data; determining a second pair of path boundaries based on the non-path probability data; determining a first weight in dependence on a consistency of the first path boundaries; determining a second weight in dependence on a consistency of the second path boundaries; and determining a pair of boundaries between path and non-path regions of the terrain based on the first and second pairs of path boundaries and the first and second weights. It may be that the first weight is dependent on any one or more of: lengths of the path boundaries of the first pair; a standard deviation of the path width between the boundaries of the first pair; an average (e.g. mean) path width between the boundaries of the first pair. Similarly, the second weight may be dependent on any one or more of: lengths of the path boundaries of the second pair; a standard deviation of the path width between the boundaries of the second pair; an average (e.g. mean) path width between the boundaries of the second pair.
It may be that the control system is configured to determine the path probability data by comparing the respective sub-regions to a path model relating to a path region of the terrain.
It may be that the control system is configured to determine the non-path probability data by comparing the respective sub-regions to a non-path model relating to a non-path region of the terrain.
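The weighted combination of boundary pairs described above might look like the following sketch, in which each pair's weight grows with the consistency (here, the steadiness of the path width between its boundaries); the weight formula and data layout are illustrative assumptions.

```python
import numpy as np

def consistency_weight(left, right):
    """Hypothetical consistency weight for a pair of path boundaries, given
    as lateral offsets (metres) sampled at common ranges ahead of the
    vehicle: pairs with a steadier path width get a larger weight."""
    width = np.asarray(right) - np.asarray(left)
    return len(width) / (1.0 + np.std(width))

def fuse_boundary_pairs(pair_from_path, pair_from_non_path):
    """Weighted combination of the boundary pair derived from the path
    probability data and the pair derived from the non-path probability
    data (an illustrative reading of the scheme described above)."""
    w1 = consistency_weight(*pair_from_path)
    w2 = consistency_weight(*pair_from_non_path)
    fused = []
    for b1, b2 in zip(pair_from_path, pair_from_non_path):
        fused.append((w1 * np.asarray(b1) + w2 * np.asarray(b2)) / (w1 + w2))
    return fused  # [left boundary, right boundary]

# The path-model boundaries keep a steady width; the non-path-model ones do
# not, so the fused result leans towards the former.
steady = ([-2.0, -2.0, -2.0], [2.0, 2.0, 2.0])
wobbly = ([-2.5, -1.0, -2.8], [1.5, 1.0, 2.5])
print(fuse_boundary_pairs(steady, wobbly))
```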
It may be that the control system comprises: a first controller configured to determine the first and second cost data structures; and a second controller configured to determine the vehicle path in dependence on the cost data from the first and second cost data structures. For example, it may be that the first controller is a controller of a stereoscopic camera system of the vehicle. It may be that the second controller is a vehicle control unit of the vehicle.
It may be that the first controller is configured to provide the first and second cost data structures to the second controller in a transmitted data structure or in a respective plurality of transmitted data structures.
It may be that the first and second cost data structures together comprise obstacle cost data and non-obstacle cost data relating to a portion of the terrain, wherein the first controller is configured to provide the obstacle cost data at a predefined portion of the respective transmitted data structure.
For example, it may be that the obstacle cost data is provided at a predefined portion substantially at the beginning or substantially at the end of the respective transmitted data structure.
Typically the predefined portion of the respective transmitted data structure is known to the second controller.
It may be that the or each transmitted data structure relates to a particular portion of the terrain.
It may be that the second controller is configured to selectively process the obstacle cost data with a higher priority than non-obstacle cost data. It may be that the second controller is configured to selectively discard the non-obstacle cost data. Thus, it may be that the second controller is configured to determine the vehicle path in dependence on the obstacle cost data but not in dependence on the non-obstacle cost data relating to the same portion of the terrain as the obstacle cost data.
It will be understood that, due to the presence of an obstacle at a portion of the terrain, it may be that the vehicle cannot traverse that respective portion of the terrain. This means that the non-obstacle cost data relating to that portion of the terrain may be of less importance. By selectively processing the obstacle cost data, processing and battery power of the second controller are saved.
It may be that the first and second cost data structures together comprise obstacle cost data and non-obstacle cost data relating to a portion of the terrain, and wherein the first controller is configured to provide the obstacle cost data to the second controller but not the non-obstacle cost data. By not providing the non-obstacle cost data to the second controller, bandwidth of the communication medium (e.g. vehicle communications bus or wireless network) by which data is transmitted between the first controller and the second controller is saved, together with processing power of the first and second controllers.
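A hypothetical wire format illustrating both ideas above: the obstacle cost data occupies a predefined portion at the beginning of the transmitted data structure, so the receiving controller can process it with priority, or read it alone and discard (or never receive) the non-obstacle cost data. The field layout is invented for illustration.

```python
import struct

# Hypothetical wire format for one terrain portion: a count of obstacle
# cells, then the obstacle cells, then the non-obstacle cells.
CELL = struct.Struct("<hhf")   # grid x, grid y, cost
HEADER = struct.Struct("<I")   # number of obstacle cells

def pack_portion(obstacle_cells, non_obstacle_cells):
    body = b"".join(CELL.pack(*c) for c in obstacle_cells + non_obstacle_cells)
    return HEADER.pack(len(obstacle_cells)) + body

def unpack_obstacles_only(payload):
    """Receiver side: read only the high-priority obstacle cost data from
    the predefined portion and discard the trailing non-obstacle data."""
    (n_obstacles,) = HEADER.unpack_from(payload, 0)
    offset = HEADER.size
    return [CELL.unpack_from(payload, offset + i * CELL.size)
            for i in range(n_obstacles)]

msg = pack_portion(obstacle_cells=[(4, 7, 1.0)],
                   non_obstacle_cells=[(4, 8, 0.2), (5, 8, 0.3)])
print(unpack_obstacles_only(msg))  # [(4, 7, 1.0)]
```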
It may be that the control system is configured to, for each of a plurality of candidate trajectories of the vehicle across the terrain: determine candidate trajectory cost data in dependence on the first and second cost data (and optionally, where provided, on the third cost data), the candidate trajectory cost data relating to a cost for the vehicle to traverse at least a portion of the respective candidate trajectory; and determine the vehicle path by selecting a candidate trajectory from the said plurality of candidate trajectories in dependence on the candidate trajectory cost data.
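Pulling the pieces together, candidate trajectory selection might be sketched as follows; the grid indexing, the cost maps and the convention that a body cost of 1.0 marks an impassable cell are all assumptions for illustration.

```python
import numpy as np

def select_trajectory(candidates, wheel_cost, body_cost, cell_size=0.5):
    """Hypothetical selection of a vehicle path: candidate trajectory cost
    data is accumulated from the wheel and body cost maps along each
    trajectory, impassable candidates are rejected, and the cheapest
    remaining candidate is selected."""
    best, best_cost = None, float("inf")
    for traj in candidates:
        total, passable = 0.0, True
        for x, y in traj:
            i, j = int(x / cell_size), int(y / cell_size)
            if body_cost[i, j] >= 1.0:   # an obstacle the body cannot clear
                passable = False
                break
            total += wheel_cost[i, j] + body_cost[i, j]
        if passable and total < best_cost:
            best, best_cost = traj, total
    return best, best_cost

wheel = np.full((20, 20), 0.1)        # uniform, easily traversed terrain
body = np.zeros((20, 20))
body[3, 2] = 1.0                      # an obstacle on the left-hand line
left = [(0.5 + 0.25 * k, 1.0) for k in range(8)]
right = [(0.5 + 0.25 * k, 3.0) for k in range(8)]
path, cost = select_trajectory([left, right], wheel, body)
print(path is right, round(cost, 2))  # True 0.8
```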
It may be that the functionality of the control system is performed by the at least one controller. It may be that the at least one controller is implemented in hardware, software, firmware or any combination thereof. It may be that the at least one controller comprises one or more electronic processors. It may be that one or more or each of the electronic processors are hardware processors. It may be that the at least one controller comprises or consists of an electronic control unit.
According to a further aspect of the invention, there is provided a vehicle comprising a control system as described herein.
According to a further aspect of the invention, there is provided a method of determining a path for a vehicle, the method comprising: obtaining first cost data relating to at least a portion of terrain to be traversed by the vehicle from a first cost data structure; obtaining second cost data relating to at least a portion of the terrain to be traversed by the vehicle from a second cost data structure comprising different cost data to the first cost data structure; and determining a (future) vehicle path in dependence on the first and second cost data.
It may be that the method comprises any of the functionality of the control system discussed herein.
It may be that the method comprises controlling the vehicle in dependence on the determined vehicle path. For example, it may be that the method comprises controlling a steering angle of one or more wheels of the vehicle in dependence on the determined vehicle path. It may be that the method comprises controlling a speed of the vehicle in dependence on the determined vehicle path.
It may be that one of the first and second cost data structures comprises a wheel cost map indicative of respective cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle. It may be that the method comprises determining the wheel cost map in dependence on image data from one or more image sensors of the vehicle.
It may be that one of the first and second cost data structures comprises a body cost map indicative of respective cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle. It may be that the method comprises determining the body cost map in dependence on three dimensional, 3D, environment data from one or more 3D environment sensors of the vehicle.
It may be that one of the first and second cost data structures comprises a line features cost data structure. It may be that the method comprises determining one or more line features of the line features cost data structure in dependence on image data from one or more image sensors of the vehicle.
It may be that the method comprises determining one or more line features of the line features cost data structure by obtaining image data relating to the terrain from one or more image sensors of the vehicle, and for each of a plurality of sub-regions of the image data: determining path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determining non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determining one or more boundaries between path and non-path regions of the terrain based on the path and non-path probability data.
It may be that the first and second cost data structures together comprise obstacle cost data and non-obstacle cost data relating to a portion of the terrain.
It may be that the method comprises selectively processing the obstacle cost data with a higher priority than non-obstacle cost data. It may be that the method comprises selectively discarding the non-obstacle cost data.
It may be that the method comprises selectively transmitting the obstacle cost data from a first controller to a second controller and selectively not transmitting the non-obstacle cost data from the said first controller to the said second controller.
It may be that the method comprises, for each of a plurality of candidate trajectories of the vehicle across the terrain: determining candidate trajectory cost data in dependence on the first and second cost data, the candidate trajectory cost data relating to a cost for the vehicle to traverse at least a portion of the respective candidate trajectory; and determining the vehicle path by selecting a candidate trajectory from the said plurality of candidate trajectories in dependence on the candidate trajectory cost data.
According to another aspect of the invention, there is provided a computer program product comprising computer readable instructions that, when executed by a computer, cause performance of a method described herein.
According to another aspect of the invention, there is provided a non-transitory computer readable medium comprising computer readable instructions that, when executed by a computer, cause performance of a method described herein.
Any controller or controllers described herein may suitably comprise a control unit or computational device having one or more electronic processors. Thus the system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term "controller" or "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memory devices associated with said controller to be executed on said computational device. A first controller may be implemented in software run on one or more processors. One or more other controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment or aspect can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described by way of example only, with reference to the accompanying drawings, in which: Fig. 1 shows a schematic illustration of a vehicle in plan view; Fig. 2 shows the vehicle of Fig. 1 in side view; Fig. 3 is a high level schematic diagram of the vehicle speed control system of the vehicle of Figs. 1 and 2, including a cruise control system and a low-speed progress control system; Fig. 4 illustrates a steering wheel of the vehicle of Figs. 1 and 2; Fig. 5 is a flow chart illustrating operation of a control system of the vehicle of Figs. 1 and 2; Fig. 6 illustrates the manner in which a colour and/or texture descriptor may be generated; Figs. 7a and 7b show sub-regions of image data having a first portion relating to a path region of the terrain and a second portion relating to a non-path region of the terrain; Fig. 8a illustrates a 2D image captured by one of the cameras of the stereoscopic camera system of the vehicle of Fig. 1; Fig. 8b illustrates a disparity image indicative of the differences between images obtained from the first and second cameras of the stereoscopic camera system of the vehicle of Fig. 1; Fig. 8c shows in plan view the pixels of the image of Fig. 8a overlaid on a 3D grid obtained based on the disparity image of Fig. 8b; Fig. 9a shows, in plan view, the pixels of the RGB image of Fig. 8a overlaid on a 3D grid obtained based on the disparity image of Fig. 8b (i.e. Fig. 9a is identical to Fig. 8c); Fig. 9b is a path probability map derived from the image of Fig. 9a with each pixel classified by reference to a path model; Fig. 9c shows, in plan view, a non-path probability map derived from the image of Fig. 9a with each pixel classified using a non-path model; Fig. 9d illustrates the inverse of the non-path probability map; Figs. 9e and 9f show the path and non-path boundaries derived from the path and non-path probability maps; Fig. 9g shows the final path probability map obtained from a weighted combination of the path probability map and the inverse of the non-path probability map; Fig. 9h shows the final path probability map of Fig. 9g merged with a global path probability map stored in a memory; Fig. 9i shows the final path boundary determined from the merged path probability map of Fig. 9h; Fig. 10 is a close-up schematic view of the terrain of Fig. 9a together with a cost map overlaid thereon; Fig. 11 shows the view and cost map of Fig. 10 with a plurality of candidate trajectories overlaid thereon; Fig. 12 is a similar view to Fig. 10 but showing shadow regions on the terrain in place of the puddle regions of Fig. 10; Fig. 13 shows the volume swept by the vehicle of Figs. 1 and 2 increasing during tight turns; Fig. 14 shows the vehicle of Figs. 1 and 2 following mud ruts; Fig. 15 shows a vehicle control unit of the vehicle of Figs. 1 and 2 receiving cost data from three different cost data structures; Figs. 16a and 16b show obstacle data being provided at predefined portions at the beginning and end of respective transmitted data structures comprising obstacle and non-obstacle data; Fig. 17 shows co-ordinate systems for a frame of reference of the vehicle and a global frame of reference; Fig. 18 shows the VCU of the vehicle of Figs. 1 and 2 receiving data from a pair of electronic control units referenced to different globally referenced locations; Fig. 19 shows the VCU of the vehicle of Figs. 1 and 2 receiving cost data referenced to different globally referenced locations from a stereoscopic camera system controller; and Fig. 20 shows a globally referenced cost map at three different vehicle locations.
DETAILED DESCRIPTION
Figs. 1 and 2 show a vehicle 100 having wheels 111, 112, 114, 115, each of which is fitted with a respective tyre, and a body 116 carried by the wheels 111, 112, 114, 115. The vehicle 100 has a powertrain 129 that includes an engine 121 that is connected to a driveline 130 having an automatic transmission 124. A control system for the vehicle engine 121 includes a central controller, referred to as a vehicle control unit (VCU) 10, a powertrain controller 11, a brake controller 13 (an anti-lock braking system (ABS) controller) and a steering controller 170C. The ABS controller 13 forms part of a braking system 22 (Fig. 3). Each of the controllers 10, 11, 13, 170C comprises one or more electronic processors and a memory device storing computer program instructions, the one or more electronic processors being configured to access the respective memory device and to execute the computer program instructions stored therein to thereby perform the functionality attributed to that controller 10, 11, 13, 170C.
The VCU 10 may receive a plurality of signals from, and output a plurality of signals to, various sensors and subsystems (not shown) provided on the vehicle. Referring to Fig. 3, the VCU 10 may include a low-speed progress (LSP) control system 12, a stability control system (SCS) 14, a cruise control system 16 and a hill descent control (HDC) system 12HD. The SCS 14 improves the safety of the vehicle 100 by detecting and managing loss of traction or steering control. When a reduction in traction or steering control is detected, the SCS 14 may be operable automatically to command the ABS controller 13 to apply one or more brakes of the vehicle to help to steer the vehicle 100 in the direction the user wishes to travel. Although the SCS 14 is implemented by the VCU 10 in this case, the SCS 14 may alternatively be implemented by the ABS controller 13.
The cruise control system 16 may be operable to automatically maintain vehicle speed at a selected speed when the vehicle is travelling at speeds in excess of 25 kph. The cruise control system 16 may be provided with a cruise control HMI (human machine interface) 18 by which means the user can input a target vehicle speed to the cruise control system 16. In one embodiment of the invention, cruise control system input controls are mounted to a steering wheel 171, as illustrated in Fig. 4. The cruise control system 16 may monitor vehicle speed and automatically correct any deviation from the target vehicle speed so that the vehicle speed is maintained at a substantially constant value, typically in excess of 25 kph. It may be that the cruise control system 16 is not effective at speeds lower than 25 kph. The cruise control HMI 18 may be configured to provide an alert to the user about the status of the cruise control system 16 via a visual display of the HMI 18.
The LSP control system 12 may also provide a speed-based control system for the user which enables the user to select a relatively low target speed at which the vehicle can progress without any pedal inputs being required by the user to maintain vehicle speed. It may be that low-speed speed control (or progress control) functionality is not provided by the on-highway cruise control system 16 which operates only at speeds above 25 kph. The LSP control system 12 may be activated by pressing LSP control system selector button 178 mounted on steering wheel 171. The LSP system 12 may be operable to apply selective powertrain, traction control and braking actions to one or more wheels of the vehicle 100, collectively or individually.
The LSP control system 12 may be configured to allow a user to input a desired value of vehicle target speed in the form of a set-speed parameter, user_set-speed, via a low-speed progress control HMI (LSP HMI) 20 (Fig. 1, Fig. 3) which shares certain input buttons 173-175 with the cruise control system 16 and HDC control system 12HD (Fig. 4). Provided the vehicle speed is within the allowable range of operation of the LSP control system 12 (which may be the range from 2 to 30 kph, although other ranges may be provided) and no other constraint on vehicle speed exists whilst under the control of the LSP control system 12, the LSP control system 12 may control vehicle speed in accordance with an LSP control system set-speed value, LSP_set-speed, which is set substantially equal to user_set-speed. The LSP HMI 20 may also include a visual display by means of which information and guidance can be provided to the user about the status of the LSP control system 12.
The LSP control system 12 may receive an input from the ABS controller 13 of the braking system 22 of the vehicle indicative of the extent to which the user has applied braking by means of the brake pedal 163. The LSP control system 12 may also receive an input from an accelerator pedal 161 indicative of the extent to which the user has depressed the accelerator pedal 161, and an input from the transmission or gearbox 124. Other inputs to the LSP control system 12 may include an input from the cruise control HMI 18 which is representative of the status (ON/OFF) of the cruise control system 16, an input from the LSP control HMI 20, and an input from a gradient sensor 45 indicative of the gradient of the driving surface over which the vehicle 100 is driving. In the present embodiment the gradient sensor 45 may be a gyroscopic sensor. In some alternative embodiments the LSP control system 12 may receive a signal indicative of driving surface gradient from another controller such as the ABS controller 13. The ABS controller 13 may determine gradient based on a plurality of inputs, optionally based at least in part on signals indicative of vehicle longitudinal and lateral acceleration and a signal indicative of vehicle reference speed (v_actual), being a signal indicative of actual vehicle speed over ground. The vehicle reference speed may be determined to be the speed of the second slowest turning wheel, or the average speed of all the wheels. Other ways of calculating vehicle reference speed may be used, including by means of a camera device or radar sensor.
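The two reference-speed calculations mentioned are simple to state in code; the function below is a sketch of the described logic, with hypothetical wheel speeds.

```python
def vehicle_reference_speed(wheel_speeds, use_average=False):
    """Vehicle reference speed as described above: the speed of the second
    slowest turning wheel (robust to one locked or spinning wheel), or the
    average speed of all the wheels."""
    if use_average:
        return sum(wheel_speeds) / len(wheel_speeds)
    return sorted(wheel_speeds)[1]

# One wheel locked under braking (0 kph) barely affects the estimate:
print(vehicle_reference_speed([0.0, 18.9, 19.1, 19.3]))  # 18.9
```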
The VCU 10 may be configured to implement a Terrain Response (TR) (RTM) system in which the VCU 10 controls settings of one or more vehicle systems or sub-systems, such as the powertrain controller 11, in dependence on a selected driving mode. The driving mode may be selected by a user by means of a driving mode selector 141S (Fig. 1) or it may be determined automatically by the VCU 10. The driving modes may also be referred to as terrain modes, terrain response (TR) modes, or control modes. In the embodiment of Fig. 1, five driving modes may be provided: an 'on-highway' driving mode suitable for driving on a relatively hard, smooth driving surface where a relatively high surface coefficient of friction exists between the driving surface and wheels of the vehicle; a 'sand' driving mode suitable for driving over sandy terrain, being terrain characterised at least in part by relatively high drag, relatively high deformability or compliance and relatively low surface coefficient of friction; a 'grass, gravel or snow' (GGS) driving mode suitable for driving over grass, gravel or snow, being relatively slippery surfaces (i.e. having a relatively low coefficient of friction between surface and wheel and, typically, lower drag than sand); a 'rock crawl' (RC) driving mode suitable for driving slowly over a rocky surface; and a 'mud and ruts' (MR) driving mode suitable for driving in muddy, rutted terrain. The latter four driving modes may be considered to be off-road driving modes.
In order to cause application of the necessary positive or negative torque to the wheels, the VCU 10 may command that positive or negative torque is applied to the vehicle wheels by the powertrain 129 and/or that a braking force is applied to the vehicle wheels by the braking system 22, either or both of which may be used to implement the change in torque that is necessary to attain and maintain a required vehicle speed.
The vehicle 100 may be provided with additional sensors (not shown) which are representative of a variety of different parameters associated with vehicle motion and status. These may be inertial systems unique to the LSP or HDC control systems 12, 12HD or part of an occupant restraint system or any other sub-system which may provide data from sensors such as gyros and/or accelerometers that may be indicative of vehicle body movement and may provide a useful input to the LSP and/or HDC control systems 12, 12HD. The signals from the sensors provide, or are used to calculate, a plurality of driving condition indicators (also referred to as terrain indicators) which are indicative of the nature of the terrain conditions over which the vehicle 100 is travelling. The sensors (not shown) of the vehicle 100 may include, but are not limited to, sensors which provide continuous sensor outputs to the VCU 10, including any one or more of: wheel speed sensors; an ambient temperature sensor; an atmospheric pressure sensor; tyre pressure sensors; wheel articulation sensors; gyroscopic sensors to detect vehicular yaw, roll and pitch angle and rate; a vehicle speed sensor; a longitudinal acceleration sensor; an engine torque sensor (or engine torque estimator); a steering angle sensor; a steering wheel speed sensor; a gradient sensor (or gradient estimator); a lateral acceleration sensor which may be part of the SCS 14; a brake pedal position sensor; a brake pressure sensor; an accelerator pedal position sensor; longitudinal, lateral and vertical motion sensors; water detection sensors forming part of a vehicle wading assistance system (not shown). The vehicle 100 may further comprise a location sensor, such as a satellite positioning system (e.g. Global Positioning System (GPS), Galileo or GLONASS) receiver configured to receive signals from a plurality of satellites to determine the location of the vehicle.
The vehicle 100 may be provided with a stereoscopic camera system 185C configured to generate stereo colour image pairs by means of a pair of forward-facing colour video cameras comprised by the system 185C. The system 185C may further comprise one or more electronic processors and a memory device storing computer program instructions, the one or more electronic processors being configured to access the respective memory device and to execute the computer program instructions stored therein. A stream of dual video image data is fed from the cameras to the one or more processors of the system 185C, which may access and execute instructions stored in the memory of the said system 185C to process the image data and repeatedly generate a 3D point cloud data set based on the images received.
Alternatively the images may be obtained and processed by any processing system of the vehicle 100, such as the VCU 10. Each point in the 3D point cloud data set may correspond to a 3D coordinate of a point on a surface of terrain ahead of the vehicle 100 viewed by each of the forward-facing video cameras of the stereoscopic camera system 185C.
The LSP control system 12 may have an autonomous driving mode in which the VCU 10 controls the steering and speed of the vehicle autonomously. In this case, the LSP HMI 20 may allow the driver to select the autonomous driving mode. The autonomous driving mode may have a level of automation of level 1 or above according to the SAE International standard. The autonomous mode may have level 2 autonomy, that is: the automated system takes full control of the vehicle (accelerating, braking, and steering); the driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. Thus, the speed of the vehicle 100 set by the user in the LSP mode may be overridden (typically reduced) by the VCU 10, for example, if it is inappropriate for driving conditions (e.g. if there are obstacles in the path, or if the set speed is inappropriate for the curvature of the vehicle path).
It may be that the autonomous LSP driving mode is particularly suitable for controlling the vehicle off-road, where road markings and road signs are either absent or sparsely provided.
Accordingly, as well as operating in the autonomous LSP driving mode, it may be that the vehicle is also operating in one of the off-road TA driving modes. However, the autonomous LSP driving mode may also be suitable for controlling the vehicle on road.
As will be explained below with reference to Fig. 5, when the vehicle is operating in the autonomous LSP driving mode, the stereoscopic camera system 185C may be configured to use the colour image pairs to determine cost data relating to the terrain and to provide the cost data to the VCU 10. The VCU 10 may then determine a future path of the vehicle 100 in dependence on the cost data and control the vehicle 100 in accordance with the determined path. For example, the steering angle of one or more wheels of the vehicle 100 may be controlled by the VCU 10 outputting a steering angle control signal to the steering controller 170C dependent on the curvature of the determined path. The VCU 10 may store in an electronic memory thereof a look-up table of predetermined maximum allowable vehicle speeds for different path curvatures, and the VCU 10 may select an appropriate vehicle speed from the look-up table in dependence on the curvature of the determined path, and output this speed to the LSP control system 12 to override the user-set speed.
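By way of illustration, the following minimal Python sketch shows one possible realisation of such a curvature-to-speed look-up; the curvature breakpoints and speed values are assumptions for the sketch, not values taken from this description:

import bisect

# Illustrative look-up table: path curvature (1/m) -> maximum allowable
# speed (km/h). Breakpoints and speeds are assumed values.
CURVATURES = [0.00, 0.02, 0.05, 0.10, 0.20]   # ascending curvature
MAX_SPEEDS = [30.0, 25.0, 18.0, 12.0, 8.0]    # corresponding speed caps

def allowable_speed(path_curvature, user_set_speed):
    """Return the speed to command: the user-set speed, capped by the
    maximum allowable speed for the curvature of the determined path."""
    # Index of the last breakpoint not exceeding the path curvature.
    i = bisect.bisect_right(CURVATURES, path_curvature) - 1
    i = max(0, min(i, len(MAX_SPEEDS) - 1))
    return min(user_set_speed, MAX_SPEEDS[i])

print(allowable_speed(0.07, 20.0))  # curvature 0.07 1/m caps the speed at 18 km/h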
A method for determining a future path of the vehicle to traverse (e.g. off-road) terrain based on respective frames of image data from the stereoscopic camera system 185C will now be explained with reference to Fig. 5. At 500, a stereo colour image pair may be captured by the stereoscopic camera system 185C and RGB image data from the stereo colour image pair may be stored in a memory accessible to the one or more electronic processors of the stereoscopic camera system 185C. At 502a, the stereoscopic camera system 185C may select the image data of a first image of the image pair (which is a 2D colour image) and at 504a may convert the selected image from the RGB colour space to the LAB colour space (although 504a is not essential, and other colour spaces such as RGB or HSV may alternatively be used). At 502b, the camera system 185C may compare the first and second images of the image pair to thereby determine a disparity image.
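A minimal Python/OpenCV sketch of 502a-504a and 502b is given below; the file names stand in for frames from the live camera feed, and the block-matcher parameters are assumptions (any stereo matcher could be substituted):

import cv2

# Left and right frames from the forward-facing cameras (placeholders for
# the live video feed described above).
left_bgr = cv2.imread("left_frame.png")
right_bgr = cv2.imread("right_frame.png")

# 504a: convert the first (left) image to the LAB colour space.
left_lab = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2LAB)

# 502b: compare the two images to determine a disparity image. Semi-global
# block matching is one common choice; parameters here are assumptions.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
left_gray = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
right_gray = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)
# OpenCV returns fixed-point disparities scaled by 16.
disparity = matcher.compute(left_gray, right_gray).astype("float32") / 16.0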
At 504b, the stereoscopic camera system 185C may calculate a real-world 3D point cloud based on the disparity image. The 3D point cloud may initially be related to a frame of reference of the camera system 185C, but may then be translated to a frame of reference of the vehicle 100 before being translated to a frame of reference which is fixed with respect to the earth (rather than with respect to the vehicle 100), for example by reference to vehicle orientation information provided by the vehicle's inertial measurement unit (IMU) 23. The 3D point cloud typically has a high number of points. The number of points may be reduced by the camera system 185C determining a 3D grid map (by a method such as multi-level surface (MLS) mapping) from the 3D point cloud mapped relative to a horizontal ground plane. It may be that the surface of the terrain is inclined or shifted with respect to the horizontal ground plane. The 3D grid map may comprise one or more metrics in respect of each of a plurality of 3D blocks of the 3D point cloud, the metrics typically including information relating to the slope of the terrain and the elevation of the features of the terrain within that block.
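Continuing the sketch above, 504b might be realised as follows; the reprojection matrix Q (an identity placeholder here, in practice the calibrated matrix), the 0.25 m cell size, and the per-cell metrics (mean elevation, and elevation range as a crude slope proxy) are assumptions standing in for the MLS mapping described:

import numpy as np
import cv2

Q = np.eye(4, dtype=np.float32)  # placeholder for the calibrated 4x4 matrix
points = cv2.reprojectImageTo3D(disparity, Q)     # HxWx3, camera frame
points = points[np.isfinite(points).all(axis=2)]  # Nx3 valid points
# In practice the points would first be rotated into the earth-fixed frame
# using the vehicle orientation from the IMU 23, as described above.

CELL = 0.25  # assumed grid cell size in metres
ix = np.floor(points[:, 0] / CELL).astype(int)
iy = np.floor(points[:, 1] / CELL).astype(int)
cells = {}
for cx, cy, z in zip(ix, iy, points[:, 2]):
    cells.setdefault((int(cx), int(cy)), []).append(float(z))
# Per-cell metrics: mean elevation and elevation range (a crude slope proxy).
metrics = {cell: (np.mean(zs), np.ptp(zs)) for cell, zs in cells.items()}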
At 506, the stereoscopic camera system 185C may overlay the LAB (or RGB or HSV, for example) pixels derived from the first image of the stereo image pair onto the 3D grid map.
It may be that the (e.g. off-road) terrain to be traversed by the vehicle comprises a path region (e.g. a paved portion or mud ruts provided through a grass field) and a non-path region (e.g. grass on either side of the ruts, or on either side of the paved portion). At 508a-516a, the one or more processors of the stereoscopic camera system 185C may execute computer program instructions on the image data determined at 506 to determine probabilities that respective portions of the terrain relate to the path region thereof.
The image data may be divided by the camera system 185C into a plurality of sub-regions. It may be that each of the sub-regions relates to a 25cm x 25cm region of the terrain. Thus, it may be that each of the sub-regions comprises a plurality of pixels of image data. At 508a, an assumption may be made that the tyres of the vehicle are located on a path region of the terrain. Because the vehicle 100 is moving, the current location of the vehicle will differ from the location of the vehicle when the image data was captured. Accordingly, the image data may include sub-regions corresponding to the portions of the terrain currently beneath the tyres of the vehicle 100. The camera system 185C may determine the current location of the vehicle 100 relative to the location of the vehicle 100 when the image data was captured, e.g. by performing visual odometry or inertial odometry on the image data and/or inertial data from the IMU 23, or using satellite positioning data (such as Global Positioning System (GPS) data) indicative of the location of the vehicle 100, and identify one or more sub-regions of the image data corresponding to the locations of the terrain currently contacted by the tyres of the vehicle 100 based on the current location of the vehicle. In the following description it will be assumed that one sub-region of the image data is identified for each tyre, but it will be understood that more than one sub-region may be identified for each tyre (depending on the relative sizes of the portion of the tyre in contact with the terrain and the sub-regions).
At 510a, the camera system 185C may be configured to process the sub-regions of the image data corresponding to the locations currently occupied by the tyres of the vehicle 100 to determine tyre region image content data relating to each of those tyre regions. The image content data may comprise colour image content data relating to the colour content of the respective sub-regions. Additionally or alternatively, the image content data may comprise texture data relating to the texture content of the respective sub-regions. Texture is a measure of the local spatial variation in the intensity of the image and is generally measured by subtracting the intensity of a given pixel from the intensity of each of the eight surrounding pixels to provide eight texture descriptors per pixel. It may be that the image content data comprises a colour and texture descriptor, p_i, which contains eleven components for each pixel, consisting of three L, a, b colour components and eight texture descriptors. An example of how a colour and texture descriptor, p_i, may be calculated is shown in Fig. 6, where subject pixel S of intensity Lc is shown surrounded by pixels S1 to S8 of respective intensities L1 to L8. Lc, ac and bc are the "LAB" colour components of pixel S. The set of weights W1, W2 and W3 is used to balance how much colour, texture and brightness are used for image clustering.
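A sketch of one way the eleven-component descriptor might be computed for every interior pixel follows; the assignment of the weights W1, W2 and W3 to colour, texture and brightness respectively is an assumption about Fig. 6, which is not reproduced here:

import numpy as np

def colour_texture_descriptors(lab, w1=1.0, w2=1.0, w3=1.0):
    """Eleven components per interior pixel: three weighted L, a, b colour
    components plus eight texture descriptors formed by subtracting the
    centre intensity from each of its eight neighbours."""
    L = lab[..., 0].astype(np.float32)
    a = lab[..., 1].astype(np.float32)
    b = lab[..., 2].astype(np.float32)
    Lc = L[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    texture = [w2 * (L[1 + dy:L.shape[0] - 1 + dy,
                       1 + dx:L.shape[1] - 1 + dx] - Lc)
               for dy, dx in offsets]
    colour = [w3 * Lc, w1 * a[1:-1, 1:-1], w1 * b[1:-1, 1:-1]]
    return np.stack(colour + texture, axis=-1)  # (H-2) x (W-2) x 11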
By making the assumption that the tyre regions of the image data relate to a path region of the terrain, portions of the terrain which relate to path regions can be automatically identified. In addition, "path regions" comprising tyre tracks or mud ruts rather than paved paths can automatically be accounted for.
At 512a, the tyre region image content data may be merged with a global path model, such as a Gaussian mixture model (GMM), stored in a memory of the VCU 10. In some cases, it may be that more than one path model is provided (e.g. one for colour and one for texture), in which case 512a may be performed for each path model, but it will be assumed in the following description that a single global path GMM is provided. The global path GMM may be based on historical image data captured by the stereoscopic camera system 185C relating to historical locations of the terrain of the tyres of the vehicle 100.
Before the tyre region image content data is merged with the global path GMM, checks may be performed on the tyre region image content data to determine whether it is suitable for merger with the global path GMM. For example, the tyre region image content data relating to the location of each tyre may be compared to the tyre region image content data relating to the locations of each of the other tyres. If the tyre region image content data relating to a tyre does not meet one or more similarity criteria with respect to tyre region image content data relating to one or more of the other tyres, it may be that this is indicative that the tyre to which it relates does not in fact occupy a path region of the terrain, and it may be that the stereoscopic camera system 185C decides not to merge it with the global path GMM. If tyre region image content data relating to the tyres (or a sub-set of the tyres) does meet the similarity criteria with respect to each other, it may be that the camera system 185C updates the global path GMM in dependence on the tyre region image content data relating to those tyres.
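The following sketch illustrates such a check under simple assumptions: each tyre region is summarised by the mean of its descriptors, the similarity criterion is an assumed Euclidean distance threshold, and the global path GMM is simply refitted on the accumulated samples (a practical system would more likely update the model incrementally):

import numpy as np
from sklearn.mixture import GaussianMixture

def mutually_similar(tyre_descriptors, max_dist=25.0):
    """Keep only tyre regions whose mean descriptor lies within an assumed
    distance of at least one other tyre's mean descriptor."""
    means = [d.reshape(-1, d.shape[-1]).mean(axis=0) for d in tyre_descriptors]
    keep = []
    for i, mi in enumerate(means):
        if any(np.linalg.norm(mi - mj) < max_dist
               for j, mj in enumerate(means) if j != i):
            keep.append(tyre_descriptors[i])
    return keep

def update_global_path_gmm(history, new_regions, n_components=3):
    """Refit the global path model on all accumulated descriptor samples."""
    history.extend(d.reshape(-1, d.shape[-1]) for d in new_regions)
    return GaussianMixture(n_components=n_components).fit(np.vstack(history))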
It may be that the similarity criteria comprise one or more conditions relating to the tyre region image content data. For example, it may be that the similarity criteria comprise one or more conditions that the tyre region image content data relating to a tyre matches the tyre region image content data relating to one or more other tyres of the vehicle to at least a given degree.
For example, it may be that the similarity criteria comprise one or more colour and/or texture conditions that colour and/or texture distributions of the tyre region image content data relating to a tyre match the colour and/or texture distributions of the tyre region image content data relating to one or more other tyres. It will be understood that the image content data may be represented in any suitable way. For example, the image content data may comprise colour and/or texture components for each pixel of the sub-region, or the image content data may comprise a local GMM for that sub-region.
One or more tyres of the vehicle may occasionally enter a non-path region of the terrain while one or more other tyres of the vehicle remain on the path region of the terrain. By correlating tyre region image content data relating to different tyres of the vehicle and updating the global path GMM in dependence on there being a similarity between the tyre region image content data, cross-contamination of the global path GMM by tyre region image content data relating to a non-path region of the terrain can be reduced.
Additionally or alternatively, it may be that the tyre region image content data is compared to the global path GMM. If tyre region image content data meets one or more similarity criteria with respect to the global path GMM, the stereoscopic camera system 185C may update the global path GMM in accordance with the tyre region image content data. If the tyre region image content data does not meet the said similarity criteria with respect to the global path GMM, it may be an indication that the tyre region image content data does not in fact relate to the path region of the terrain, and that tyre region image content data is not merged with the global path GMM. This again helps to avoid cross-contamination of the path model by image data relating to a non-path region of the terrain.
In this case, it may be that the similarity criteria comprise one or more conditions relating to the tyre region image content data and the global path GMM. For example, it may be that the similarity criteria comprise one or more colour and/or texture conditions that the colour and/or texture distribution of the tyre region image content data matches the colour and/or texture distribution of the global path GMM to at least a given degree.
When the relevant tyre region image content data is merged with the global path GMM, an updated global path GMM may be provided. It will be understood that, the first time the method of Fig. 5 is performed, 512a may be omitted. Instead, it may be that the tyre region image content data relating to the locations of each of the tyres are compared to each other and the matching tyre region image content data is used to create a global path GMM.
If the tyre region image content data does not meet the said similarity criteria with respect to the global path GMM (or meets one or more dissimilarity criteria with respect to the global path GMM), the stereoscopic camera system 185C may exclude the tyre region image content data from the global path GMM. The stereoscopic camera system 185C may generate or update a second global path GMM (or any other suitable model) distinct from the said global path GMM based on the tyre region image content data. This helps the control system to accommodate changes in the terrain. For example, the control system may be configured to replace the global path GMM with the second global path GMM, for example in dependence on a determination that the path region of the terrain better matches the second path global path GMM.
At 514a, the updated global path GMM may be used to determine probabilities that the respective sub-regions of the image data (not only the tyre regions) relate to the path region of the terrain. The image content of each of the sub-regions of the image data may be compared to the distribution of the updated global path GMM in order to determine a probability that the respective sub-region relates to the path region. Thus, a single path probability value may be determined for each sub-region. The closer the image content to the peak of the distribution of the updated path GMM, the higher the probability that the sub-region relates to a path region of the terrain, and vice versa. It may be that the path probability determined for each sub-region is stored in a memory of the stereoscopic camera system 185C in association with the sub-region of the image data to which it relates.
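A minimal sketch of 514a under the same assumptions follows; because a GMM yields a density rather than a probability, squashing its log-likelihood through a logistic function (with an assumed scale) is one simple way of obtaining a value in [0, 1] per sub-region:

import numpy as np

def path_probabilities(path_gmm, subregion_descriptors, scale=1.0):
    """One pseudo-probability per sub-region: the GMM log-likelihood of the
    sub-region's mean descriptor mapped through a logistic squashing."""
    means = np.stack([d.reshape(-1, d.shape[-1]).mean(axis=0)
                      for d in subregion_descriptors])
    log_lik = path_gmm.score_samples(means)  # one log-likelihood per sub-region
    return 1.0 / (1.0 + np.exp(-scale * log_lik))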
At 516a, the camera system 185C may determine a path probability map in dependence on the determined path probabilities. It may be that the path probability map comprises the image data determined at 506 with the path probabilities for each of the sub-regions overlaid thereon.
At 508b-516b, the stereoscopic camera system 185C may execute computer program instructions on the image data determined at 506 to determine for each of the sub-regions a non-path probability that the respective sub-region relates to a non-path region of the terrain.
At 508b, two non-path regions laterally offset from the vehicle 100 are selected by the stereoscopic camera system 185C. It may be that the camera system 185C is configured to identify the non-path regions by determining image content data relating to each of a plurality of sub-regions of image data relating to a first portion of the terrain laterally offset from the vehicle 100 on a first (e.g. left) side of the vehicle 100 and to a second portion of the terrain laterally offset from the vehicle 100 on a second (e.g. right) side of the vehicle opposite the first side. For example, the sub-regions may relate to portions of the terrain between 3m and 8m laterally offset from the centre of the wheelbase line of the vehicle 100 on both sides of the vehicle 100 at its current location (as before, the current location of the vehicle may be obtained by visual odometry or inertial odometry or from satellite positioning data (such as Global Positioning System (GPS) data) indicative of the location of the vehicle). As before, the image content data may comprise, for example, colour and/or texture data derived from the sub-region of image data. It may be that the camera system 185C is configured to compare the image content data relating to each of the selected laterally offset sub-regions to the global path GMM to thereby identify one or more of the sub-regions having an image content which meets one or more dissimilarity criteria with respect to the global path GMM. The camera system 185C may be configured to determine a lateral offset between the centre of the wheelbase line of the vehicle 100 at its current location and respective portions of the terrain to which the said one or more dissimilar sub-regions relate. For subsequent iterations of the method of Fig. 5 (for at least a limited time), the camera system 185C may be configured to determine the non-path sub-regions simply by determining sub-regions relating to portions of the terrain laterally offset from the vehicle by the lateral offset. Different lateral offsets may be determined for each side of the vehicle.
It may be that the dissimilarity criteria comprise one or more conditions relating to the image content data and the global path GMM. For example, it may be that the dissimilarity criteria comprise one or more colour and/or texture conditions that a colour and/or texture distribution of the image content data relating to a respective sub-region does not match the corresponding distribution of the global path GMM to a given degree.
Thus, sub-regions relating to the non-path region of the terrain may be identified. By identifying the non-path regions with reference to the global path GMM, no user input needs to be requested by the VCU 10 in order to identify the non-path regions.
At 510b, image content data (e.g. the colour and/or texture data) relating to the non-path sub-regions may be determined. At 512b, the image content data relating to the non-path sub-regions may be merged with a global non-path model, such as a global non-path GMM. In some cases, it may be that more than one non-path model is provided (e.g. one for colour and one for texture), in which case 512b may be performed for each non-path model, but it will be assumed in the following description that a single global non-path GMM is provided. The global non-path GMM may be based on historical non-path image content data relating to portions of the terrain laterally offset from the vehicle on either side.
Before the image content data is merged with the global non-path GMM, checks may be performed on the non-path region image content data to determine whether it would contaminate or complement the global non-path GMM. For example, it may be that the camera system 185C compares the non-path region image content data to the global path GMM. If it is determined that the non-path region image content data meets one or more dissimilarity criteria with respect to the global path GMM, it may be that the camera system 185C merges the non-path image content data with the global non-path GMM. If it is determined that the non-path region image content data does not meet the one or more dissimilarity criteria with respect to the global path GMM, it may be that the camera system 185C decides not to merge the non-path region image content data with the global non-path GMM. Thus, an updated global non-path GMM may be obtained.
It will be understood that the first time 508b-512b are performed, step 512b may be omitted. In this case, the camera system 185C may be configured to merge the image content data obtained from the sub-regions laterally offset from the vehicle to form the global non-path GMM.
At 514b, the updated global non-path GMM may be used by the stereoscopic camera system to determine probabilities that the respective sub-regions of the image data (not only the sub-regions laterally offset from the vehicle) relate to the non-path region of the terrain. In this case, the image content of each of the sub-regions of the image data may be compared to the updated global non-path GMM to determine a probability that the respective sub-region relates to the non-path region of the terrain. The closer the image content data to the peak of the distribution of the updated non-path GMM, the higher the probability that the sub-region relates to the non-path region of the terrain, and vice versa. It may be that the probability is stored in a memory of the VCU 10 in association with the sub-region of the image data to which it relates.
At 516b, a non-path probability map is determined in dependence on the non-path probabilities. It may be that the non-path probability map comprises the image data determined at 506 with the non-path probabilities for each of the sub-regions overlaid thereon.
The global non-path GMM may be used to provide a further check on the tyre region image content data to determine whether it is suitable for merger with the global path GMM. The stereoscopic camera system may compare the tyre region image content data relating to the location of each tyre to the global non-path GMM and, in dependence on the tyre region image content data and the global non-path GMM meeting one or more dissimilarity criteria with respect to each other, the camera system 185C may update the path model in dependence on the tyre region image content data. If the tyre region image content data and the global non-path GMM do not meet one or more dissimilarity criteria with respect to each other, it may be that the camera system 185C does not use the tyre region image content data to update the global path GMM. This helps to reduce cross-contamination of the global path GMM with image data relating to the non-path region of the terrain.
It may be that the dissimilarity criteria comprise one or more conditions relating to the tyre region image content data and the global non-path GMM. For example, it may be that the dissimilarity criteria comprise one or more colour and/or texture conditions that a colour and/or texture distribution of the tyre region image content data relating to a respective sub-region does not match the distribution of the global non-path GMM to a given degree.
At 518, path and non-path probability data determined during 508a-516a and 508b-516b are combined to provide a final path probability map. The final path probability map may be determined as a weighted combination of the path probability map and a secondary path probability map inferred from the non-path probability data. The secondary path probability map may be an inverse of the non-path probability map determined by inferring that the sub-regions which have low non-path probabilities have high path probabilities. For example, it may be that the camera system 185C is configured to infer the secondary path probability, P_secondary_path_i, for a respective i-th sub-region from the non-path probability, P_non_path_i, based on: P_secondary_path_i = 1 - P_non_path_i. Alternatively, the path and non-path probability data may be combined in any other suitable way to determine the final probability map. For example, the probabilities determined at 514a, 514b may be combined, for example by subtracting the non-path probability from the path probability to provide a final path probability for each sub-region. It may be that the probabilities from the path probability map are allocated a more significant weighting than the inferred probabilities from the secondary path probability map to reflect a greater confidence in those values. Alternatively it may be that the same weights are applied to each, or indeed greater weight may be allocated to the inferred probabilities from the secondary path probability map. It may be that respective weights to be applied to the path and non-path probability data are determined in dependence on the respective consistencies of one or more path/non-path boundaries determined from the path and non-path probability data respectively. Ways in which the respective consistencies of path boundaries can be measured are explained in more detail below. By inferring secondary path probability data from the non-path probability data and combining the secondary path probability data with the path probability data, a more confident determination can be made as to whether a portion of the terrain is a path or non-path region of the terrain.
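A sketch of the weighted combination at 518 follows; the 0.7/0.3 split, reflecting greater confidence in the directly determined path probabilities, is an assumed value:

import numpy as np

def final_path_probability(p_path, p_non_path, w_path=0.7):
    """Combine the path probability map with the secondary path probability
    map inferred as the inverse of the non-path map:
    P_secondary_path_i = 1 - P_non_path_i."""
    p_secondary = 1.0 - np.asarray(p_non_path, dtype=float)
    return w_path * np.asarray(p_path, dtype=float) + (1.0 - w_path) * p_secondary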
When the final path probability map is determined, it may be merged with a global final path probability map determined from previous frames of image data captured by the stereoscopic camera system 185C.
The method of Fig. 5 will now be illustrated with reference to the example of Figs. 8 and 9.
Fig. 8a shows a first 2D image obtained by a first image sensor of the stereoscopic camera system 185C of terrain ahead of the vehicle 100. The terrain comprises a paved path region 700 and a pair of grass non-path regions 702, 704 on either side of the paved path region 700. Fig. 8b shows the disparity image between the first 2D image shown in Fig. 8a and a second 2D image obtained by a second image sensor of the stereoscopic camera system 185C. Fig. 8c shows the pixels of the image of Fig. 8a overlaid on a 3D grid obtained based on the disparity image of Fig. 8b and mapped relative to a horizontal plane. Also shown in Fig. 8c is a white line 708 showing the location of the longitudinal axis of the vehicle 100 on the path 700, the locations 706a-706d of the tyres of the vehicle 100 and the locations 707a, 707b of the non-path regions of the terrain laterally offset from the vehicle used to determine the non-path model.
Fig. 9a repeats the view of Fig. 8c for reference. Fig. 9b shows the path probability map determined at 516a of the method of Fig. 5 with respect to the terrain of Fig. 9a. The shading of the path probability map varies from black to grey, with black indicating a higher probability that the sub-region relates to the path region of the terrain and grey indicating a lower probability thereof. The majority of the path region 700 is correctly identified as being a path region of the terrain, and the majority of the non-path regions 702, 704 are correctly identified as not being a path region of the terrain. However, at the edges of the path region, and some portions near the centre of the path region, there are sub-regions which have not been correctly identified as being a path region. This may be at least partly because, as can be seen from Fig. 8a for example, some portions of the path region 700, such as the puddle regions 705 covered with water, have image contents (e.g. colour and/or textures) which will not accurately match the global path GMM.
Fig. 9c shows the non-path probability map determined at 516b of the method of Fig. 5. The colour varies from black to grey, with black indicating a higher probability that the sub-region relates to the non-path region of the terrain and grey indicating a lower probability thereof. The majority of the path region 700 is correctly identified as not being a non-path portion of the terrain, and the majority of the non-path regions are correctly identified as being non-path portions of the terrain. Indeed, there is no distinction made in the non-path probability map between the dry and puddle regions of the path 700. However, at the edges of the non-path region, and some portions further to the left and right of the non-path region, there are sub-regions which have not been identified as being non-path with a high probability. This may be because they do not match the global non-path GMM.
Fig. 9d shows a secondary path probability map inferred from the non-path probability map of Fig. 9c by the stereoscopic camera system 185C determining the inverse of the non-path probability map of Fig. 9c at 518. Again the colour varies from black to grey, with black indicating a higher probability that the sub-region relates to the path region of the terrain and grey indicating a lower probability thereof. It can be seen that the centre of the path region is determined to relate to the path region of the terrain with a greater probability in the secondary path probability map than in the path probability map of Fig. 9b. This is at least in part because there is no distinction made between the dry and puddle regions of the path 700 of the terrain in the secondary path probability map. In addition, some of the sub-regions at the boundaries between the path region and the non-path region are determined to relate to path or non-path regions of the terrain with higher probability than in the path probability map of Fig. 9b.
Figs. 9e and 9f show the path and non-path boundaries respectively determined from the path and non-path probability maps of Figs. 9b and 9c respectively. It can be seen, particularly in the right hand path boundary of Fig. 9e near the location of the vehicle 100, that a consistent boundary is not determined from the path probability data of Fig. 9b alone.
Fig. 9g shows the final path probability map determined by a weighted combination of the path probability map and the secondary path probability map inferred from the non-path probability map. It can be seen that the sub-regions of the image data are determined to be path or non-path with a higher degree of confidence in the final path probability map than from any of the path probability map determined at 516a (Fig. 9b), the non-path probability map determined at 516b (Fig. 9c) or the secondary path probability map inferred from the non-path probability map at 518 (Fig. 9d). Fig. 9h shows the final path probability map merged with the global path probability map. Fig. 9i shows the path boundary determined from the updated global path probability map of Fig. 9h. It can be seen that the path boundary of Fig. 9i is more consistent than the path boundary of Fig. 9e or the non-path boundary of Fig. 9f.
Thus, the method of Fig. 5, in which both path and non-path probabilities are determined, provides more confident determination of path and non-path regions of the terrain than is achievable with either the path or non-path probabilities alone.
It will be understood that occasionally part of one or more tyres of the vehicle 100 may leave the path region of the terrain and enter a non-path region of the terrain. In this case, it may be that a first part of the tyre region image data relates to the path region, while a second part of the tyre region image data relates to the non-path region. This is illustrated in Figs. 7a and 7b, which show sub-regions of the image data comprising path portions 600 and non-path portions 602. Accordingly, in the event that tyre region image content data relating to a tyre of the vehicle is determined not to match the tyre region image content data relating to the other tyres of the vehicle or the global path GMM, it may be that the stereoscopic camera system 185C is configured to split the sub-region of the image data from which the tyre region image content data is derived into two or more portions. In this case, image content data derived from a (or each) selected portion of the sub-region (rather than from the entire sub-region) may be compared to any one or more of: the tyre region image content data relating to the other tyres; the global path GMM; the global non-path GMM. If the image content data derived from the selected portion of the sub-region meets the one or more similarity criteria with respect to one or more of the other tyres and/or the global path GMM, it may be that the camera system 185C selectively merges the image content data derived from that selected portion of the sub-region with the global path GMM. For example, the similarity criteria may require that the selected portion of the sub-region is more strongly correlated with the global path GMM than the global non-path GMM and/or is more strongly correlated with the global path GMM than another portion of the sub-region and/or is sufficiently strongly correlated with the global path GMM. Otherwise, it may be that the camera system 185C decides not to merge the image content data derived from that selected portion of the sub-region with the global path GMM. It may be that this helps the global path GMM to become more generalised more quickly, thereby helping to improve the accuracy of the path probability data.
As will be explained below, the VCU 10 may determine cost data in dependence on the path and non-path probabilities, and determine a future path for the vehicle in dependence on the cost data, typically by determining a cost map based on the cost data. The cost map may comprise a grid of cells. The way in which the image data relating to a sub-region is split may depend on the direction of the determined path relative to the cost map grid. For example, if the path is parallel to an axis of the grid, it may be that the image data is split into left and right portions. For example, Fig. 7a shows the path region 600 on the right hand side with the non-path region 602 on the left hand side for a path travelling north/south (parallel to the vertical axis). In another example (see Fig. 7b), the path is diagonal to the grid and the image data may be split diagonally in the cell.
The stereoscopic camera system 185C may determine a cost map relating to the terrain based on the path and non-path probabilities, for example in dependence on the final path probability map. In order to determine the cost map, the stereoscopic camera system 185C may determine, for each of the sub-regions of the image data, a cost for the vehicle 100 to traverse the respective portion of the terrain to which the sub-region relates in dependence on the path and non-path probabilities determined from 508-518, for example in dependence on the final path probability relating to that sub-region determined at 518. The cost may relate to a penalty or a reward associated with the respective portion of the terrain. An increased cost may relate to an increased penalty or a reduced reward. Similarly a reduced cost may relate to a reduced penalty or an increased reward. However, it will be assumed in the following description that the cost is allocated on a penalty basis.
In an example, for each sub-region, the greater the final path probability, the lower the cost allocated to that sub-region and the lower the final path probability, the greater the cost allocated to that sub-region. It may be that the costs are allocated to sub-regions on a binary basis, for example a low cost for sub-regions having final path probabilities greater than a threshold and a high cost for sub-regions having final path probabilities lower than a threshold.
However, it may be that costs are allocated on a more granular basis. For example, it may be that the cost for the vehicle to traverse a portion of the terrain to which the sub-region relates is determined depending on the final path probability meeting one or more path probability criteria indicating that the sub-region relates to the path region, one or more non-path probability criteria indicating that the sub-region relates to the non-path region or neither the path probability criteria nor the non-path probability criteria.
Sub-regions having low path probabilities determined at 514a and low non-path probabilities determined at 514b may have an intermediate final path probability between a relatively low final path probability and a relatively high final path probability. For example, it may be that puddle regions 705 of the path region 700 of the example of Fig. 8a are provided with intermediate final path probabilities because they do not meet one or more similarity criteria with respect to either the global path or global non-path GMM. In this case, a first, relatively high cost may be allocated to sub-regions having a relatively low final path probability, a second, intermediate cost may be allocated to sub-regions having an intermediate final path probability and a third, relatively low cost may be allocated to sub-regions having relatively high final path probabilities. In this way, an at least three-tiered cost allocation scheme may be implemented in which the final path probabilities are probability parameters relating to whether the respective sub-regions relate to the path region or the non-path region. If the final path probability associated with a sub-region meets path probability criteria, in this case that the final path probability is greater than a respective path threshold, a low cost may be allocated to that sub-region. If the final path probability associated with a sub-region meets non-path probability criteria, in this case that the final path probability is less than a respective non-path threshold (which may be different from the path threshold), a high cost may be allocated to that sub-region. If the final path probability associated with a sub-region meets neither the path nor the non-path probability criteria (e.g. the final path probability is between the path and non-path thresholds), it may be that an intermediate cost is allocated to that sub-region.
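A sketch of such a three-tiered allocation follows; the path and non-path thresholds and the low, intermediate and high cost values are assumptions:

import numpy as np

def three_tier_costs(p_final, path_thresh=0.7, non_path_thresh=0.3,
                     low=1.0, mid=5.0, high=50.0):
    """Low cost above the path threshold, high cost below the non-path
    threshold, and an intermediate cost for everything in between."""
    p = np.asarray(p_final, dtype=float)
    return np.select([p > path_thresh, p < non_path_thresh],
                     [np.full_like(p, low), np.full_like(p, high)],
                     default=mid)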
This is illustrated in Fig. 10 which schematically shows an example terrain based on the terrain of Fig. 8a with a cost map 800 determined using a three-tiered cost allocation scheme overlaid thereon, the cost map 800 comprising a plurality of cells 802 each of which relates to a sub-region of the image data. The letter L indicates that a relatively low cost has been allocated to the sub-region, the letter H indicates that a relatively high cost has been allocated to the sub-region, while the letter I indicates that an intermediate cost has been allocated to the sub-region. For ease of illustration, the cells 802 of the cost map 800 of Fig. 10 are larger than would normally be used in practice.
In other examples, it may be that costs are allocated to each sub-region from a (e.g. continuous) scale having more than three possible costs in dependence on the final path probability determined for that sub-region. In this case, because there are more than three possible costs which can be allocated to a sub-region, it may still be considered that the cost for the vehicle to traverse a portion of the terrain to which the sub-region relates is determined depending on the final path probability for that sub-region meeting one or more path probability criteria indicating that the sub-region relates to the path region, one or more non-path probability criteria indicating that the sub-region relates to the non-path region or neither the path probability criteria nor the non-path probability criteria. For example, in this case, there may be at least three sub-regions, at least a first of which has a relatively high final path probability such that it is allocated a relatively low cost, at least a second of which has a relatively low final path probability such that it is allocated a relatively high cost, and at least a third of which has an intermediate path probability such that it is allocated an intermediate cost intermediate the relatively high and relatively low costs. In this case, it may be said that the final probability of the first sub-region implicitly meets one or more path probability criteria, the final probability of the second sub-region implicitly meets one or more non-path probability criteria and the final probability of the third sub-region implicitly meets neither the path nor non-path probability criteria. It will be understood that the scale of costs may be infinitely variable. It may be that the relationship between the costs allocated and the final path probabilities of the respective sub-regions is not linear. For example, exponentially greater costs may be allocated to respective cells relating to sub-regions for which there is a high probability that they relate to a non-path region.
Cost may (for example) alternatively be allocated directly in dependence on the path and non-path probabilities. Any suitable alternative cost allocation strategy may be employed.
The cost map may be transmitted from the stereoscopic camera system 185C to the VCU 10 which may determine a future path for the vehicle in dependence on the cost map. The cost data of the cost map may be provided by the stereoscopic camera system 185C to the VCU 10 on a cell by cell basis. Optionally, the VCU 10 merges the cost map with an existing global cost map which may be based (at least in part) on cost maps obtained previously from the stereoscopic camera system 185C. In order to determine a future path for the vehicle 100, costs for the vehicle to traverse the terrain by each of a plurality of candidate trajectories may be calculated from the cost map 800 (or from a global cost map into which cost map 800 is merged) and a preferred path may be selected from the candidate trajectories in dependence on the calculated costs. This is illustrated in Fig. 11, which shows the same terrain and cost map 800 as Fig. 10 with a plurality of candidate trajectories 810-830 overlaid thereon. If it is assumed that the VCU 10 will want to avoid high cost regions of the terrain, then the VCU 10 may select trajectory 810 as its preferred path as it will have the lowest cost. Following trajectory 810 would cause the vehicle 100 to traverse regions of the terrain which have been allocated intermediate (I) costs by the cost map 800. This is made possible by using an at least three-tiered cost allocation scheme to determine the cost map 800. It will be understood that, if a binary cost allocation scheme was employed, it may be that the puddle regions 705 of the terrain would have been allocated a high cost, in which case it may be that the VCU 10 would attempt to guide the vehicle around the puddle regions rather than through them. Thus, a more optimal (direct) vehicle path may be achieved by the cost map 800 having at least three different costs allocated to different sub-regions of the image data.
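A minimal sketch of selecting a preferred path from candidate trajectories follows; each candidate is assumed to be pre-rasterised into the (row, column) cells of the cost map that it crosses:

def trajectory_cost(cost_map, cells):
    """Accumulated cost of the cells crossed by one candidate trajectory."""
    return sum(cost_map[r][c] for r, c in cells)

def select_preferred_path(cost_map, candidates):
    """Pick the candidate trajectory with the lowest accumulated cost."""
    return min(candidates, key=lambda cells: trajectory_cost(cost_map, cells))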
Although the intermediate cost cells have been illustrated as portions of the path 700 covered by puddles of water, it will be understood that intermediate cost cells may additionally or alternatively be portions of the path 700 covered by shadows, e.g. of trees or bushes on either side of the path 700 or any other cause of variable lighting on the path 700. This is illustrated by the cost map 850 shown in Fig. 12 relating to a similar terrain as the cost map of Fig. 10 but with shadow portions 852 rather than puddle portions 705. Again, by allocating an intermediate cost to the shadow portions 852, it may be that the vehicle path determined by the VCU 10 passes through one or more of the shadow portions 852, which may allow a more optimal path to be determined.
It will be understood that, instead of calculating the cost for each of a plurality of candidate trajectories and selecting a preferred path from the candidate trajectories in dependence on the costs, it may be that the future path is determined by analysing the cost map in order to determine the lowest cost route. While this may allow a more optimal route to be determined, it is more processing intensive.
In the case that the cost map 800 is merged with an existing global cost map based on previously obtained cost data, it may not be necessary to determine costs based on image data common to previous frames of image data captured at different times and/or locations of the vehicle. Rather, it may be that cost data is determined only for image data relating to portions of the terrain for which cost data was not determined based on previous frames of image data. This helps to reduce the quantity of processing required.
It will be understood that it is not essential to obtain both path and non-path probabilities in order to be able to allocate an at least three-tiered cost allocation scheme to a cost map. For example, a three-tiered cost allocation scheme may be based on path probabilities determined with reference to a texture-based global path GMM (e.g. which uses texture or colour and texture as the modelled parameter). In this case, it may be that regions of the path 700 under variable lighting conditions (e.g. having reflective puddles, shadow regions etc) may be identified as having a texture which is more similar to the other portions of the path 700 than to the non-path region on either side of the path (which may be grass and have a more distinctive texture). Additionally or alternatively, it may be determined that a shadow region of the path 700 has a more similar colour content to other portions of the path 700 than the (e.g. grassy) non-path regions 702, 704 of the terrain. In either case, intermediate costs can be allocated to portions of the path 700 under variable lighting conditions, and low and high costs to portions of the terrain confidently identified as path and non-path regions respectively.
It may be that the cost map 800 or 850 (and/or the global cost map with which the cost map 800 or 850 is merged) is a wheel cost map indicative of costs for the wheels of the vehicle 100 to traverse the terrain independently of the body of the vehicle 100. Although the candidate trajectories in Fig. 11 are each represented by single lines 810-830, it will be understood that different wheels of the vehicle 100 may follow different paths from each other for a given vehicle trajectory 810-830. Accordingly, it may be that the cost associated with each of the candidate trajectories 810-830 takes into account the different paths which would be followed by each of the wheels if the vehicle 100 were to follow that trajectory 810-830. For example, the cost of the vehicle 100 traversing the terrain by way of each candidate trajectory 810-830 may be determined by determining the costs for each of the wheels of the vehicle 100 to follow their respective paths along that trajectory. Alternatively, it may be that the cost of the vehicle 100 traversing the terrain by way of each candidate trajectory 810-830 may be determined by selectively determining the costs for diagonally opposite wheels of the vehicle 100, such as for the front right and rear left or front left and rear right wheels of the vehicle, to follow their respective paths along that trajectory. In this case, it may be that the cost associated with each candidate trajectory is the average or the sum of the costs for the respective wheels to follow their respective paths along that trajectory.
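By way of illustration, the diagonally opposite wheel variant might be sketched as follows, with each wheel's path pre-rasterised into cells of the wheel cost map; summing rather than averaging the two wheel costs is an arbitrary choice between the two options described:

def diagonal_wheel_trajectory_cost(wheel_cost_map, front_right_cells, rear_left_cells):
    """Trajectory cost from one diagonally opposite wheel pair (front-right
    and rear-left here); averaging would be the alternative mentioned above."""
    fr = sum(wheel_cost_map[r][c] for r, c in front_right_cells)
    rl = sum(wheel_cost_map[r][c] for r, c in rear_left_cells)
    return fr + rl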
Although the costs associated with the cells 802 of the wheel cost map 800 are described above as relating to whether they relate to path or non-path regions of the terrain, the costs associated with each of the cells 802 of the wheel cost map may additionally or alternatively relate to any one or more of: a gradient of the terrain in a projected direction of travel of the vehicle; a side slope of the terrain transverse to a projected direction of travel of the vehicle. The gradient of the terrain in the projected direction of travel and/or the side slope of the terrain may be determined from topography information relating to the terrain determined from the 3D grid map onto which the image data is overlaid. Alternatively, the gradient of the terrain in the projected direction of travel and/or the side slope of the terrain may be determined from a ranging system of the vehicle such as a radar-based terrain ranging system, a laser-based terrain ranging (e.g. LIDAR) system or an acoustic ranging system of the vehicle 100 or from a gradient sensor of the vehicle (if provided). The projected direction of travel may be determined from the candidate trajectory of the vehicle 100.
It may be that the costs associated with the cells 802 of the wheel cost map 800 are generalised costs substantially independent of the direction of travel of the vehicle 100 across the respective cells 802. Alternatively, a plurality of wheel cost maps 800 may be determined, each being associated with a respective candidate trajectory across the terrain. In the latter case, the costs associated with the cells 802 of the wheel cost map 800 may be dependent on the direction of travel of the candidate trajectory with which it is associated across the terrain. This provides the cost data with increased accuracy, but involves increased processing as compared to the former case.
The preferred path selected from the wheel cost map(s) 800 may be based on non-obstacle cost data, and/or it may be that the preferred path selected from the wheel cost map(s) does not take into account some obstacles of the terrain. Accordingly, it may be that the VCU 10 is configured to obtain further cost data relating to the cost of traversing the terrain to check, for example, whether the selected preferred path contains any obstacles which would render it unsuitable for the vehicle 100. For example, it may be that the stereoscopic camera system 185C is configured to determine a second, body cost map indicative of respective cost(s) for a swept volume of the body of the vehicle 100 to traverse one or more portions of the terrain independently of the wheels of the vehicle 100. The stereoscopic camera system 185C may then be configured to transmit the second, body cost map to the VCU 10 which may take it into account to determine the future path for the vehicle 100. The cells of the body cost map may correspond to (e.g. are aligned with in relation to the terrain) the respective cells of the wheel cost map(s) 800.
The second, body cost map may be based on the 3D grid data generated by the stereoscopic camera system 185C in respect of the terrain. The body cost map may include cost data relating to one or more obstacles of the terrain, such as one or more three-dimensional, 3D, obstacles of the terrain. The body cost map may also include cost data relating to one or more objects (e.g. branches, bushes) overhanging a ground level (e.g. a path region on a ground level) of the terrain. It may be that the body of the vehicle 100 has a predetermined minimum elevation with respect to the ground level of the terrain. It may be that the body cost map is selectively based on 3D grid data relating to objects or obstacles having elevations which exceed the predetermined minimum elevation. Similarly, it may be that the body of the vehicle 100 has a predetermined maximum elevation with respect to the ground level of the terrain. It may be that the body cost map is selectively based on 3D grid data relating to objects or obstacles having elevations which are below the predetermined maximum elevation. This advantageously allows the body cost map to account for whether or not the body of the vehicle 100 would engage or clear the obstacle or overhanging object if the vehicle 100 were to traverse the terrain across a particular candidate trajectory.
It may be that the stereoscopic camera system 185C determines from the 3D grid data whether the portion of the terrain to which each respective sub-region relates comprises any features having an elevation greater than the minimum elevation and less than the maximum elevation, which may be an indication that the body of the vehicle 100 would be impeded if it was to try to traverse that portion of the terrain. If so, it may be that the stereoscopic camera system 185C determines that there are obstacles present in the portion of the terrain to which that sub-region relates. If not, it may be that the stereoscopic camera system 185C determines that there are no obstacles present in the portion of the terrain to which that sub-region relates. If there are one or more obstacles, it may be that the stereoscopic camera system 185C allocates a relatively high cost to that sub-region. If there are no obstacles, it may be that the stereoscopic camera system 185C allocates a relatively low cost to that sub-region. The cost data in the body cost map may be binary (e.g. relating to whether the cell to which it relates is passable or impassable by the vehicle) such that the body cost map is in effect an occupancy grid. Alternatively, it may be that the cost data in the body cost map is more granular. For example, costs may be allocated to cells of the body cost map using an at least three-tiered cost allocation scheme.
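A sketch of the per-cell obstacle test follows; the minimum and maximum body elevations and the binary cost values are assumptions:

import numpy as np

def body_cost_cell(feature_elevations, min_elev=0.25, max_elev=1.9,
                   low=0.0, high=1.0):
    """High cost if any feature in the cell lies between the body's minimum
    and maximum elevation (the body would strike it), low cost otherwise."""
    e = np.asarray(feature_elevations, dtype=float)
    blocked = bool(np.any((e > min_elev) & (e < max_elev)))
    return high if blocked else low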
While the cost for the vehicle 100 to traverse the terrain based on the wheel cost map may involve the determination of the costs for different wheels of the vehicle 100 to traverse different paths along the various candidate trajectories 810-830 (and optionally summing or averaging those costs), it may be that determining the cost for the vehicle 100 to traverse the terrain based on the body cost map involves determining which cells of the body cost map would be occupied by a volume swept by the body of the vehicle 100 if it were to traverse the terrain by a particular trajectory. In this case, it may be that the stereoscopic camera system 185C determines the swept volume of the vehicle 100 with respect to each candidate trajectory, and determines the cost for the body of the vehicle 100 to traverse the terrain by that trajectory by for example summing the costs associated with the cells which would be occupied by the swept volume of the vehicle 100 following that trajectory. As illustrated in Fig. 13, the volume swept by the body of the vehicle 100 along a trajectory depends on the curvature of the path followed by the vehicle 100, increasing for tighter turns and decreasing for gentler turns. Thus, it may be that the stereoscopic camera system 185C is configured to determine the volume swept by the body of the vehicle 100 along a trajectory in dependence on the curvature of that trajectory, and to determine the cost associated with that trajectory in dependence on the swept volume and the body cost map.
It may be that the body cost map is transmitted by the stereoscopic camera system 185C together with the wheel cost map to the VCU 10, and it may be that the VCU 10 merges the body cost map with a global body cost map based on previous frames of image data. It may be that the VCU 10 selects a preferred path based on the wheel cost map 800 as described above, before determining the cost for a swept volume of the body of the vehicle 100 to traverse the selected preferred path based on the body cost map. If the cost derived from the body cost map is too high (e.g. above a threshold indicative that the path contains one or more impassable obstacles), it may be that the VCU 10 selects an alternative preferred path (e.g. from the candidate trajectories shown in Fig. 11 based on the wheel cost map 800) and determines whether that path comprises obstacles based on the body cost map. This process may be repeated until a path is found which does not contain obstacles which are impassable by the body of the vehicle 100.
By providing separate wheel and body cost maps, the different effects of the terrain on the wheels and body of the vehicle 100 can be accounted for, enabling more accurate costs to be determined for the vehicle 100 to traverse respective candidate trajectories across the terrain, allowing a more optimal vehicle path to be determined (e.g. as compared to providing a single cost map which does not separate wheel and body cost data). For example, it may be that one or more portions of the terrain, such as a strip of grass extending between a pair of substantially parallel ruts or tracks, would incur a relatively high cost for the wheels of the vehicle to traverse but a relatively low cost for the body to traverse (e.g. because a minimum elevation of the body is greater than a maximum elevation of the said portion of the terrain such that the vehicle body would clear the said portion of the terrain). By providing separate wheel and body cost maps, a (potentially optimal) vehicle path which places the wheels of the vehicle in the ruts/tracks and the body of the vehicle over the grass strip may be determined to have a relatively low overall cost. Conversely, a cost map which does not separate body and wheel effects may determine that such a path would be of a relatively high overall cost. Thus, providing separate wheel and body cost maps is particularly advantageous.
In some terrains, there are features which are of low cost for a vehicle to traverse in one direction, but which are of high cost for the vehicle to traverse in other directions. For example, mud ruts typically comprise tracks for wheels of the vehicle which are of low cost for the vehicle to follow, but which are of high cost for the vehicle to cross. This is illustrated in Fig. 14 which shows the wheels of vehicle 100 following mud ruts 860, 862. Stars 864, 866, 868 represent locations of the mud ruts 860, 862 which would be crossed by the vehicle 100 if it were to follow the path defined by lines 870, 872. This directional dependency of cost cannot be accommodated by traditional cost maps or occupancy grids, which typically allocate a cost to a particular portion of the terrain which is applied independently of the direction of travel of the vehicle.
Accordingly, it may be that the stereoscopic camera system 185C (or any other processing system of the vehicle 100 in data communication with the stereoscopic camera system 185C such as the VCU 10) is configured to determine a third, line features cost data structure in dependence on which the future path for the vehicle 100 may be determined. The line features cost data structure typically comprises a plurality of line features, which may each be represented by a plurality of location points defining the line feature or a best fit line (for example), the line features indicating lines of the terrain which should not be crossed by the vehicle. The line features cost data structure may also comprise direction data indicative of a crossing direction of the line features, although this may be implicit in the shape of the line feature in which case it may not be necessary to store direction data in the line features cost data structure. In one example, the stereoscopic camera system 185C may be configured to determine line features based on the path boundaries derived from the path and non-path probability data described above (e.g. from the final path probability data).
It may be that the camera system 185C (or other processing system of the vehicle such as the VCU 10) is configured to determine boundaries between path and non-path regions of the terrain based on the path probability data and the non-path probability data. For example, it may be that first and second (e.g. left and right) boundary lines between the path and non-path regions of the terrain are identified independently from each of the path probability data and the non-path probability data. It may be that the boundary lines determined from the non-path probability data are determined from the secondary path probability data. For each boundary pair, the path width (i.e. the shortest distance between the boundaries of the said pair) may be determined for each of a plurality of locations along the path. A consistency measure may be determined for each said boundary pair in dependence on any one or more of: the average (e.g. mean) of the said path widths of the boundary pair; the standard deviation of the path widths between the boundaries of the boundary pair; the lengths of the boundaries. Respective first and second weights may then be determined for the path and non-path boundary pairs respectively in dependence on the consistency measures of the boundaries determined from the path and non-path probability data respectively. The boundaries obtained from the path and non-path probability data may be combined in dependence on the first and second weights. For example, the camera system 185C (or other processing system of the vehicle such as the VCU 10) may put more emphasis on whichever of the path and non-path boundaries has been allocated the more significant weight by virtue of being the more consistent. The boundaries resulting from this combination may provide the line features.
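A minimal sketch of such a weighting might look like the following (the disclosure names the ingredients of the consistency measure but not a formula, so the particular combination below, and the assumption that both boundary polylines share the same sampling, are illustrative):

```python
# Illustrative consistency measure and weighted combination of boundary
# pairs; the exact formula is an assumption.
import numpy as np

def consistency(path_widths, boundary_lengths):
    """Higher for stable widths and long boundaries; lower for erratic ones."""
    widths = np.asarray(path_widths, dtype=float)
    spread = np.std(widths) / (np.mean(widths) + 1e-9)  # normalised variation
    return sum(boundary_lengths) / (1.0 + spread)

def combine_boundaries(path_boundary, non_path_boundary, w_path, w_non_path):
    """Weighted average of two boundary polylines sampled at the same points."""
    p = np.asarray(path_boundary, dtype=float)
    n = np.asarray(non_path_boundary, dtype=float)
    return (w_path * p + w_non_path * n) / (w_path + w_non_path)
```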
The crossing direction for each line feature may be determined based on the direction in which the terrain changes from a path region to a non-path region in the path and non-path probability data (or in the final path probability data). The line features act as boundaries which the vehicle path should not cross. It may be that the line features cost data structure is merged with a global line features cost data structure based on line features cost data structures derived from previous frames of image data.
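The crossing test itself reduces to segment intersection; a minimal sketch (with line features represented as polylines of location points as described above, and illustrative names) might be:

```python
# Illustrative crossing test for a candidate trajectory against a line
# feature (both given as polylines of (x, y) points).
def _cross(a, b, c):
    """Signed area of triangle a-b-c; its sign also indicates from which
    side a point lies, which can serve as the crossing direction."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def _segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2."""
    d1, d2 = _cross(q1, q2, p1), _cross(q1, q2, p2)
    d3, d4 = _cross(p1, p2, q1), _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def count_crossings(trajectory, line_feature):
    """Number of times the trajectory crosses the line feature
    (e.g. stars 864, 866, 868 in Fig. 14 would each count once)."""
    return sum(
        _segments_intersect(trajectory[i], trajectory[i + 1],
                            line_feature[j], line_feature[j + 1])
        for i in range(len(trajectory) - 1)
        for j in range(len(line_feature) - 1))
```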
As before, the stereoscopic camera system 185C may be configured to transmit the line features cost data structure to the VCU 10 (e.g. together with the wheel cost map and/or the body cost map) which may determine the future path of the vehicle 100 in dependence thereon. This is illustrated in Fig. 15, where the VCU 10 receives cost data from the wheel cost map 800, body cost map 880 and line features cost data structure 882. It will be understood that, in some embodiments, cost data from any two or more of the cost data structures 800, 880, 882 may be used by the VCU 10 to determine the vehicle path.
The cost data from the wheel cost map 800, the body cost map 880 and the line features cost data structure 882 may be transmitted from the stereoscopic camera system 185C to the VCU 10, grouped in dependence on the respective portions of the terrain to which it relates. For example, it may be that the stereoscopic camera system 185C is configured to transmit cost data relating to corresponding cells of the wheel and body cost maps together with any line features cost data relating to the same portion of the terrain as part of the same transmitted data structure.
In order to determine the transmitted data structure, it may be that the stereoscopic camera system 185C determines whether any of the cells of the body cost map 880 contain obstacles. If any of the cells of the body cost map 880 contain obstacles, the stereoscopic camera system 185C may divide the cost data from the wheel cost map 800, the body cost map 880 and the line features cost data structure 882 relating to a particular portion of the terrain into obstacle cost data 884 and non-obstacle cost data 886. As shown in Figs. 16a and 16b, it may be that the obstacle cost data is provided at a predefined portion of the transmitted data structure, such as the least significant (e.g. four) bits (Fig. 16a) or the most significant (e.g. four) bits (Fig. 16b). The predefined portion of the transmitted data structure is typically known to the VCU 10. In this case, the VCU 10 may be configured to selectively process the obstacle cost data with a higher priority than the non-obstacle cost data when determining the vehicle path. For example, the VCU 10 may discard the non-obstacle data 886. This may be possible because, due to the presence of an obstacle, it may be that the vehicle cannot traverse the portion of the terrain to which the data structure relates. This means that the non-obstacle cost data relating to that portion of the terrain is of less (or indeed of no) importance. By selectively processing the obstacle cost data, processing and battery power of the VCU 10 are saved.
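For instance, with the example field widths of four bits each, the packing of Figs. 16a and 16b might be sketched as follows (illustrative only; the disclosure fixes neither the word size nor the encoding of the cost values):

```python
# Illustrative bit packing of obstacle and non-obstacle cost data into one
# transmitted byte, per the two layouts of Figs. 16a and 16b.
def pack_fig_16a(obstacle_cost: int, non_obstacle_cost: int) -> int:
    """Obstacle cost in the four least significant bits (Fig. 16a)."""
    return ((non_obstacle_cost & 0xF) << 4) | (obstacle_cost & 0xF)

def pack_fig_16b(obstacle_cost: int, non_obstacle_cost: int) -> int:
    """Obstacle cost in the four most significant bits (Fig. 16b)."""
    return ((obstacle_cost & 0xF) << 4) | (non_obstacle_cost & 0xF)

def read_obstacle_fig_16a(word: int) -> int:
    """Receiver (e.g. the VCU) extracts only the predefined obstacle field,
    allowing the non-obstacle bits to be ignored or discarded."""
    return word & 0xF
```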
In an alternative example, it may be that the stereoscopic camera system 185C is configured to provide the obstacle cost data 884 to the VCU 10 but not the non-obstacle cost data 886.
This may be possible for the same reason discussed above. By not providing the non-obstacle cost data 886 to the VCU 10, bandwidth of the communication medium (e.g. vehicle communications bus or wireless network) by which data is transmitted between the stereoscopic camera system 185C and the VCU 10 is saved, together with processing power of the stereoscopic camera system 185C and the VCU 10.
Thus, it may be that the VCU 10 is configured to determine the vehicle path in dependence on the obstacle cost data 884 relating to one or more portions of the terrain, but not in dependence on the non-obstacle cost data 886 relating to the same portion of the terrain.
In other examples, it may be that the 3D cost data is obtained from another electronic control unit of the vehicle 100, for example from a ranging system of the vehicle such as a radar-based terrain ranging system, a laser-based terrain ranging (e.g. LIDAR) system or an acoustic ranging system of the vehicle. In this case, the camera system of the vehicle need not be stereoscopic and a single 2D camera may be employed. The preferred path may then be determined based on cost data derived from a mapping of image data obtained by the 2D camera relative to a horizontal plane representing the surface of the terrain, in a similar way to that described above. The body cost map in this case may be determined and provided to the VCU 10 by the separate ranging system controller of the ranging system.
The VCU 10 may be configured to perform a feasibility assessment on the selected path to determine whether it is a feasible path for the vehicle to follow. For example, it may be that the VCU 10 is configured to determine whether it is a feasible path for the vehicle to follow in dependence on any one or more of: the width of the path region (e.g. the distance between a pair of typically substantially parallel path boundaries, typically substantially perpendicular to the longitudinal axis of the vehicle); whether the path has parallel boundaries; the continuity of the path boundaries; whether the path emanates from the vehicle 100. If the VCU 10 determines that the path is infeasible, it may be that an alternative preferred path is selected (e.g. from the candidate trajectories shown in Fig. 11) and it is determined whether that path is feasible. This process may be repeated until a feasible path is found.
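A minimal sketch of such an assessment, with placeholder thresholds (the disclosure lists the criteria but not their limits), might be:

```python
# Illustrative feasibility check over the criteria listed above; all
# threshold values are placeholders, not taken from the disclosure.
import numpy as np

MIN_PATH_WIDTH_M = 2.0       # assumed minimum width the vehicle requires
MAX_WIDTH_VARIATION_M = 0.5  # assumed tolerance for "substantially parallel"
MAX_BOUNDARY_GAP_M = 1.0     # assumed tolerance for boundary continuity

def is_feasible(path_widths, left_boundary, right_boundary, emanates_from_vehicle):
    widths = np.asarray(path_widths, dtype=float)
    if widths.min() < MIN_PATH_WIDTH_M:
        return False  # path region too narrow
    if widths.max() - widths.min() > MAX_WIDTH_VARIATION_M:
        return False  # boundaries not substantially parallel
    for boundary in (left_boundary, right_boundary):
        steps = np.diff(np.asarray(boundary, dtype=float), axis=0)
        if np.linalg.norm(steps, axis=1).max() > MAX_BOUNDARY_GAP_M:
            return False  # boundary discontinuous
    return emanates_from_vehicle  # path must start at the vehicle 100
```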
It may be that the VCU 10 is configured to provide an output representative of the determined path. It may be that the output is a visual output. It may be that the output is an audio-visual output. It may be that the output is provided by way of a display and/or speaker system of the vehicle, such as a display and/or speaker system of an infotainment system of the vehicle.
The VCU 10 may then control the vehicle 100 in accordance with the determined path. The VCU 10 may determine a required steering angle for one or more wheels of the vehicle 100 in dependence on the curvature of the determined path, and in dependence thereon transmit a steering angle command signal to the steering controller 170C. The steering controller 170C, in turn, may set the angle of the steerable wheels of the vehicle accordingly. The VCU 10 may also determine a recommended speed of the vehicle in dependence on the curvature of the determined path from the relevant look-up table. The VCU 10 may be configured to output the recommended speed to the LSP control system 12 which controls the speed of the vehicle 100 accordingly by changing the user set speed in accordance with the received recommended speed.
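By way of illustration, the mapping from path curvature to a steering command and a recommended speed might be sketched as below; the kinematic bicycle-model relation and the table entries are assumptions, since the disclosure states only that a look-up table is used:

```python
# Illustrative curvature-to-steering and curvature-to-speed mappings.
import math

WHEELBASE_M = 3.0  # assumed wheelbase of vehicle 100

def steering_angle(curvature):
    """Kinematic bicycle model: steer angle = atan(wheelbase * curvature)."""
    return math.atan(WHEELBASE_M * curvature)

# Assumed look-up table: (maximum curvature in 1/m, recommended speed in km/h).
SPEED_LUT = [(0.02, 20.0), (0.05, 12.0), (0.10, 8.0), (float("inf"), 5.0)]

def recommended_speed(curvature):
    """Recommended speed output to the LSP control system 12."""
    for max_curvature, speed in SPEED_LUT:
        if abs(curvature) <= max_curvature:
            return speed
```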
It will be understood that, in alternative examples, the VCU 10 may obtain the image data captured by the stereoscopic camera system 185C, and the VCU 10 (rather than the stereoscopic camera system 185C) may be configured to derive cost data therefrom in the way described above.
It will be understood that the VCU 10 may be configured to add and remove cost data from the global cost map depending on its location, and typically in dependence on a direction of movement of the vehicle 100. This helps to limit the quantity of cost data it needs to store in the global cost map at any given time.
As shown in Fig. 17, the location and orientation of the vehicle 100 may be defined with respect to its own frame of reference 900 or with respect to a global frame of reference 902. It may be that sensors of the vehicle determine sensor data with respect to the vehicle frame of reference 900. The wheel and body cost maps may be defined with respect to either frame of reference 900, 902. As discussed above, the VCU 10 may determine merged wheel and body cost maps in dependence on sensor data or cost data (e.g. cost maps) obtained from the same electronic control unit (e.g. the stereoscopic camera system 185C) or from different electronic control units (e.g. an imaging system and a 3D ranging system) of the vehicle 100. The sensor data on which the cost data is based may be captured at different locations of the vehicle 100, and so may be referenced to those different locations. In this case, updating a cost map which is referenced to the vehicle's frame of reference with cost data which is referenced to a different (e.g. earlier) instance of the vehicle's frame of reference is computationally difficult and therefore highly processing intensive. Conversely, it has been discovered that merging cost maps which are defined with respect to a global reference (e.g. a globally referenced location and/or a global orientation reference) in dependence on cost or sensor data which is referenced to different globally referenced locations and a global orientation reference is less computationally difficult, and can be achieved more quickly (typically in real time) and/or with less processing power.
For example, as shown in Fig. 18, the VCU 10 may be required to merge a first cost map determined by a first electronic control unit 910 (such as an electronic control unit of a camera system of the vehicle) with a second cost map determined by a second electronic control unit 912 (such as an electronic control unit of a 3D ranging system of the vehicle), for example in order to determine a wheel cost map in dependence on whether portions of the terrain relate to path or non-path regions of the terrain, and in dependence on the gradient of the terrain in the propagation direction of the vehicle and the side slope of the terrain transverse to the propagation direction of the vehicle. For example, cost data relating to whether portions of the terrain relate to path or non-path regions of the terrain may be derived from (e.g. 2D) image data by control unit 910, while cost data relating to the gradient of the terrain in the propagation direction of the vehicle and the side slope of the terrain transverse to the propagation direction of the vehicle may be determined from ranging data by control unit 912. It may be that the first and second cost maps are oriented with respect to the global orientation reference, such as a magnetic pole of the earth. It may be that the first and second electronic control units 910, 912 are asynchronous such that they capture sensor data at different times. Accordingly, it may be that the first and second cost maps are referenced to different locations. The first and second cost maps may be accompanied by location data indicative of the location of the vehicle, which may for example be determined by visual odometry, inertial odometry or from satellite positioning data (such as Global Positioning System (GPS) positioning data).
For example, the location data may comprise a globally referenced location of the vehicle 100 when the respective sensor data was captured and the globally referenced location of the origin of the respective cost map. The globally referenced location of the vehicle 100 may comprise the globally referenced location of a portion of the vehicle 100 providing a reference point, such as an origin, for a co-ordinate system relating to the vehicle frame of reference 900. This allows motion of the vehicle reference frames to be estimated, and sensor data to be translated to the vehicle frame of reference before being translated to a globally referenced orientation by reference to vehicle orientation information provided by the vehicle's inertial measurement unit (IMU) 23, rather than estimating the global position of the sensor reference frame directly. The global wheel cost map stored by the VCU 10 may be oriented with respect to the global orientation reference, and may comprise an origin referenced to a globally referenced location. The VCU 10 may merge the first and second cost maps received from the first and second control units with the global wheel cost map in dependence on the location data obtained from the respective electronic control units 910, 912.
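A minimal sketch of merging a received, globally referenced cost map into the stored global map might be as follows (the reduction of globally referenced origins to integer grid indices, and the merge-by-maximum rule, are assumptions):

```python
# Illustrative merge of a received cost map into the stored global cost
# map, both aligned to the same global orientation reference; origins are
# given as (row, col) indices in a shared global grid.
import numpy as np

def merge_into_global(global_map, global_origin, local_map, local_origin):
    dr = local_origin[0] - global_origin[0]
    dc = local_origin[1] - global_origin[1]
    rows, cols = local_map.shape
    # Clip the received map to the extent of the stored global window.
    r0, c0 = max(dr, 0), max(dc, 0)
    r1 = min(dr + rows, global_map.shape[0])
    c1 = min(dc + cols, global_map.shape[1])
    if r0 >= r1 or c0 >= c1:
        return global_map  # no overlap with the stored window
    patch = local_map[r0 - dr:r1 - dr, c0 - dc:c1 - dc]
    # Merge rule (assumed): keep the worse, i.e. higher, cost where both exist.
    np.maximum(global_map[r0:r1, c0:c1], patch,
               out=global_map[r0:r1, c0:c1])
    return global_map
```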
In another example, as shown in Fig. 19, the VCU 10 may obtain cost data, for example in the form of wheel and body cost maps, from the same electronic control unit, such as an electronic control unit of the stereoscopic camera system 185C, derived from a first, earlier frame 920 of image data referenced to a first location 922 of the vehicle and from a second, later frame 924 of image data referenced to a second location 926 of the vehicle. Again, in each case the cost data may be accompanied by location data comprising a globally referenced location of the vehicle 100 (which again may be the globally referenced location of the co-ordinate system describing the vehicle frame of reference 900) when the respective sensor data was captured and a globally referenced location of the origin of the respective cost map. Again the cost maps may be oriented with respect to a global orientation reference such as a magnetic pole of the earth. The global wheel and body cost maps stored by the VCU 10 may also be oriented with respect to the global orientation reference, and may comprise an origin referenced to a globally referenced location. For the cost maps derived from the respective frames of image data at locations 922, 924, the VCU 10 may merge the wheel and body cost maps received from the control unit with the global wheel and body cost maps respectively in dependence on the respective location data obtained from the control unit. It will be understood that the image data from the first and second frames and the location data may alternatively be provided to the VCU 10 which may itself determine the cost data before updating the global cost maps in dependence on the cost data and the location data as before.
In each case, because the location data is globally referenced, and the global cost maps are defined with respect to a global reference, cost data referenced to different reference locations can be located and spatially combined more easily, thus enabling the global cost maps to be determined in a way that is less processing intensive. The VCU 10 may be configured to determine the future vehicle path in dependence on the updated global cost maps using any of the techniques described herein.
Example global cost maps A, B and C for three locations 930, 932, 934 of the vehicle 100 are shown in Fig. 20. In this case, the cost maps A, B and C are each oriented with respect to a global orientation reference, such as the earth's magnetic north. The cost maps A, B and C also comprise an origin (which may be defined as the bottom left-hand square of the respective cost map) which is associated with a global location reference. For example, the global location reference may comprise longitude and latitude co-ordinates. As the vehicle 100 travels one grid square of the map north, south, east or west, the global location reference changes accordingly, data is removed from the cost map and new data is added to the cost map, the locations to which the deleted and added cost data relate being dependent on the globally referenced vehicle location. For example, with reference to Fig. 20, when the vehicle 100 moves from location 930 to location 932, the global location reference of the origin changes accordingly, a row 940 and a column 941 of cells of the cost map are removed and a new row 942 and a new column 943 of cells are added (based on the cost or sensor data obtained from the electronic control unit(s)). This helps to keep the overall amount of electronic memory required to store the cost map substantially the same for different locations of the vehicle 100. Similarly, as the vehicle 100 travels from location 932 to location 934, the global location reference of the origin again changes accordingly, a row 944 and a column 945 of cells of the cost map are removed and a new row 946 and a new column 947 of cells are added (based on the cost or sensor data obtained from the electronic control unit(s)). When the vehicle 100 turns, the cost map remains oriented with respect to the global orientation reference (rather than following changes in the orientation of the vehicle 100).
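The rolling-window update of Fig. 20 might be sketched as follows (assuming row 0 is the map's northern edge; the fill value for cells not yet observed is an assumption):

```python
# Illustrative rolling-window shift: one grid square of northward travel
# discards the southernmost row and opens a fresh northern row, keeping
# the memory footprint of the cost map constant.
import numpy as np

UNKNOWN_COST = -1.0  # assumed marker for cells awaiting sensor/cost data

def shift_north(cost_map):
    shifted = np.empty_like(cost_map)
    shifted[1:, :] = cost_map[:-1, :]  # retained cells slide one row south
    shifted[0, :] = UNKNOWN_COST       # new northern row, to be filled later
    return shifted

def shift_east(cost_map):
    shifted = np.empty_like(cost_map)
    shifted[:, :-1] = cost_map[:, 1:]  # retained cells slide one column west
    shifted[:, -1] = UNKNOWN_COST      # new eastern column
    return shifted
```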
It will be understood that any of the processing performed in respect of the present disclosure may additionally or alternatively be performed by any processors of the vehicle 100 (or even processors external to the vehicle, for example in communication with the vehicle by way of a wireless network). For example, processing operations performed by the camera system 185C may be performed by the VCU 10 (and/or vice versa).
It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, devices or integrated circuits, or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a program comprising code for implementing a system or method as claimed in any preceding claim and a machine-readable storage storing such a program. Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed. The claims should not be construed to cover merely the foregoing embodiments, but also any embodiments which fall within the scope of the claims.

Claims (25)

  1. A control system for a vehicle, the control system comprising at least one controller and being configured to: obtain first cost data relating to at least a portion of terrain to be traversed by the vehicle from a first cost data structure; obtain second cost data relating to at least a portion of the terrain to be traversed by the vehicle from a second cost data structure comprising different cost data to the first cost data structure; and determine a vehicle path in dependence on the first and second cost data.
  2. A control system according to claim 1 wherein the at least one controller collectively comprises: at least one electronic processor; and at least one electronic memory device electrically coupled to the at least one electronic processor having instructions stored therein, wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon so as to obtain the first and second cost data and to determine the vehicle path therefrom.
  3. A control system according to claim 1 or claim 2 wherein the control system is configured to control the vehicle in dependence on the determined vehicle path.
  4. A control system according to any preceding claim wherein the first and second cost data structures relate to the cost(s) for different portions of the vehicle to traverse at least a portion of the terrain.
  5. A control system according to claim 4 wherein the first cost data structure comprises a wheel cost map indicative of cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle; and the second cost data structure comprises a body cost map indicative of cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle.
  6. A control system according to any of claims 1 to 3 wherein at least one of the first and second cost data structures is a direction dependent cost data structure indicative of cost(s) for the vehicle to traverse one or more portions of the terrain dependent on a direction of travel of the vehicle with respect to the said one or more portions of the terrain.
  7. A control system according to claim 6 wherein the direction dependent cost data structure is a line features cost data structure indicative of one or more line features of the terrain.
  8. A control system according to claim 7 wherein one or more or each of the said line features relates to a respective boundary between path and non-path regions of the terrain.
  9. A control system for a vehicle according to any preceding claim wherein the first and second cost data structures comprise or consist of any two of: a wheel cost map indicative of respective cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle; a body cost map indicative of respective cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle; a line features cost data structure indicative of one or more line features of the terrain.
  10. A control system according to any preceding claim wherein the control system is configured to: obtain third cost data relating to one or more portions of the terrain from a third cost data structure comprising different cost data to the first and second cost data structures; and determine a vehicle path in dependence on the first, second and third cost data.
  11. A control system according to claim 10 wherein: the first cost data structure comprises or consists of a wheel cost map indicative of cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle; the second cost data structure comprises or consists of a body cost map indicative of cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle; and the third cost data structure comprises or consists of a line features cost data structure indicative of one or more line features of the terrain.
  12. A control system according to any preceding claim wherein one of the first and second cost data structures comprises a wheel cost map indicative of respective cost(s) for wheels of the vehicle to traverse one or more portions of the terrain independently of the body of the vehicle, and wherein the control system is configured to determine the wheel cost map in dependence on image data from one or more image sensors of the vehicle.
  13. A control system according to any preceding claim wherein one of the first and second cost data structures comprises a body cost map indicative of respective cost(s) for a body of the vehicle to traverse one or more portions of the terrain independently of the wheels of the vehicle, and wherein the control system is configured to determine the body cost map in dependence on three dimensional, 3D, environment data from one or more 3D environment sensors of the vehicle.
  14. A control system according to any preceding claim wherein one of the first and second cost data structures comprises a line features cost data structure, and wherein the control system is configured to determine one or more line features of the line features cost data structure in dependence on image data from one or more image sensors of the vehicle.
  15. A control system according to claim 14, wherein the control system is configured to determine one or more line features of the line features cost data structure by obtaining image data relating to the terrain from one or more image sensors of the vehicle, and for each of a plurality of sub-regions of the image data: determining path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determining non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determining one or more boundaries between path and non-path regions of the terrain based on the path and non-path probability data.
  16. A control system according to any preceding claim comprising: a first controller configured to determine the first and second cost data structures; and a second controller configured to determine the vehicle path in dependence on the cost data from the first and second cost data structures.
  17. A control system according to claim 16 wherein the first controller is configured to provide the first and second cost data structures to the second controller in a transmitted data structure.
  18. A control system according to claim 17 wherein the first and second cost data structures together comprise obstacle cost data and non-obstacle cost data relating to a portion of the terrain, and wherein the first controller is configured to provide the obstacle cost data at a predefined portion of the transmitted data structure.
  19. A control system according to claim 18, wherein the second controller is configured to selectively process the obstacle cost data with a higher priority than non-obstacle cost data.
  20. A control system according to claim 18 or claim 19 wherein the second controller is configured to selectively discard the non-obstacle cost data.
  21. A control system according to claim 16 wherein the first and second cost data structures together comprise obstacle cost data and non-obstacle cost data relating to a portion of the terrain, and wherein the first controller is configured to provide the obstacle cost data to the second controller but not the non-obstacle cost data.
  22. A control system according to any preceding claim wherein the control system is configured to, for each of a plurality of candidate trajectories of the vehicle across the terrain: determine candidate trajectory cost data in dependence on the first and second cost data, the candidate trajectory cost data relating to a cost for the vehicle to traverse at least a portion of the respective candidate trajectory; and determine the vehicle path by selecting a candidate trajectory from the said plurality of candidate trajectories in dependence on the candidate trajectory cost data.
  23. A vehicle comprising a control system according to any of claims 1 to 22.
  24. A method of determining a path for a vehicle, the method comprising: obtaining first cost data relating to at least a portion of terrain to be traversed by the vehicle from a first cost data structure; obtaining second cost data relating to at least a portion of the terrain to be traversed by the vehicle from a second cost data structure comprising different cost data to the first cost data structure; and determining a vehicle path in dependence on the first and second cost data.
  25. A non-transitory computer readable medium comprising computer readable instructions that, when executed by a computer, cause performance of a method according to claim 24.

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1815337.9A GB2577676B (en) 2018-09-20 2018-09-20 Control system configured to obtain cost data structures for a vehicle

Publications (3)

Publication Number Publication Date
GB201815337D0 GB201815337D0 (en) 2018-11-07
GB2577676A (en) 2020-04-08
GB2577676B GB2577676B (en) 2022-12-21

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100191455A1 (en) * 2007-07-11 2010-07-29 Honda Motor Co., Ltd. Navigation server, navigation apparatus, and navigation system
US20140019041A1 (en) * 2012-07-13 2014-01-16 Harman Becker Automotive Systems Gmbh Method of estimating an ability of a vehicle to reach a target road segment, method of generating a database, and navigation system
US20140257621A1 (en) * 2013-03-08 2014-09-11 Oshkosh Corporation Terrain classification system for a vehicle
US20150345959A1 (en) * 2014-05-30 2015-12-03 Nissan North America, Inc. Vehicle trajectory optimization for autonomous vehicles
US20180203455A1 (en) * 2015-07-30 2018-07-19 Samsung Electronics Co., Ltd. Autonomous vehicle and method of controlling the same

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112649012A (en) * 2020-12-15 2021-04-13 北京三快在线科技有限公司 Trajectory planning method, equipment, medium and unmanned equipment

Similar Documents

Publication Publication Date Title
GB2577485A (en) Control system for a vehicle
US11999378B2 (en) Control system for a vehicle
US11772647B2 (en) Control system for a vehicle
JP6380422B2 (en) Automated driving system
US11554778B2 (en) Vehicle speed control
CN109426261B (en) Automatic driving device
US11021160B2 (en) Slope detection system for a vehicle
WO2018166747A1 (en) Improvements in vehicle control
US11603103B2 (en) Vehicle speed control
US10611375B2 (en) Vehicle speed control
WO2018007079A1 (en) Improvements in vehicle speed control
CN112824997B (en) Method and system for localized lane of travel awareness
US11975725B2 (en) Systems and methods for updating the parameters of a model predictive controller with learned external parameters generated using simulations and machine learning
CN110316197A (en) Tilt evaluation method, inclination estimation device and the non-transitory computer-readable storage media for storing program
WO2019166142A1 (en) Methods and apparatus for acquisition and tracking, object classification and terrain inference
GB2576265A (en) Improvements in vehicle speed control
GB2551711A (en) Improvements in vehicle speed control
US11763694B2 (en) Systems and methods for training a driver about automated driving operation using a reliability model
GB2584587A (en) Control system for a vehicle
GB2577676A (en) Control system for a vehicle
GB2571587A (en) Vehicle control method and apparatus
CN112987053A (en) Method and apparatus for monitoring yaw sensor
WO2020160927A1 (en) Vehicle control system and method
GB2576450A (en) Improvements in vehicle speed control
GB2577486A (en) Control system for a vehicle