GB2577486A - Control system for a vehicle - Google Patents

Control system for a vehicle

Info

Publication number
GB2577486A
GB2577486A
Authority
GB
United Kingdom
Prior art keywords
path
vehicle
region
terrain
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1815339.5A
Other versions
GB2577486B (en)
GB201815339D0 (en)
Inventor
John King Paul
Fairgrieve Andrew
Ravi Bineesh
Jayaraj Jithin
Jayaprakash Krishna
Kotteri Jithesh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1815339.5A (GB2577486B)
Publication of GB201815339D0
Priority to PCT/EP2019/068943 (WO2020057801A1)
Priority to US17/278,265 (US11999378B2)
Priority to DE112019004698.5T (DE112019004698T5)
Publication of GB2577486A
Application granted
Publication of GB2577486B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/3453 Special cost functions, i.e. other than distance or default speed limit of road segments
    • G01C21/3461 Preferred or disfavoured areas, e.g. dangerous zones, toll or emission zones, intersections, manoeuvre types, segments such as motorways, toll roads, ferries
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0217 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

A control system for a vehicle (100, Fig. 1) has at least one controller that is configured to obtain image data relating to terrain to be traversed by the vehicle, for example by use of a stereoscopic camera system (185C, Fig. 3). The controller evaluates a plurality of sub-regions of the image data, determining path probability data 516a indicative of a probability that the respective sub-region relates to a path region of the terrain (700, Fig. 8a) and non-path probability data 516b indicative of a probability that the respective sub-region relates to a non-path region of the terrain (702, 704, Fig. 8a). In selecting the future path 518, the control system determines a penalty or benefit for the vehicle to traverse the respective portion of the terrain to which each sub-region relates, in dependence on the determined path and non-path probability data, and determines the future vehicle path in dependence on the determined penalties and benefits. The assessment may include inferring primary and secondary paths for both the path probability data and the non-path probability data, and weighting and combining the data for improved accuracy (Figs. 9a-i). The image data may be captured from one or more tyre locations of the vehicle.

Description

CONTROL SYSTEM FOR A VEHICLE
TECHNICAL FIELD
The present disclosure relates to a control system and particularly, but not exclusively, to a control system for a vehicle. Aspects of the invention relate to a control system, a method, a vehicle, a computer program product and a non-transitory computer readable medium.
BACKGROUND
Vehicles with increasing levels of autonomy require detailed information with respect to their driving environment. In structured on-road environments there are known markers for the vehicle to identify (e.g. lane markings, road signs, road edges). In off-road environments, this becomes more complex. Some vehicle control systems use cameras to detect images of the driving environment and, based on the images, categorise portions of the terrain to be traversed by the vehicle into different categories in an attempt to apply some structure to the off-road environment. However, existing systems struggle to categorise the terrain correctly, particularly at the boundaries where different types of terrain overlap.
It is an object of embodiments of the invention to at least mitigate one or more of the problems of the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system, a method, a vehicle and a non-transitory computer readable medium as claimed in the appended claims.
According to an aspect of the invention, there is provided a control system configured to determine a (typically future) path for a vehicle. It may be that the control system is configured to determine a (typically future) path for the vehicle based on image data captured by one or more image sensors of the vehicle.
According to an aspect of the invention, there is provided a control system for a vehicle, the control system comprising at least one controller and being configured to: obtain image data relating to terrain to be traversed by the vehicle, and for each of a plurality of sub-regions of the image data: determine path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; and determine non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determine a (typically future) vehicle path in dependence on the determined path and non-path probability data.
According to another aspect of the invention, there is provided a control system for a vehicle, the control system comprising at least one controller and being configured to: obtain image data relating to terrain to be traversed by the vehicle, and for each of a plurality of sub-regions of the image data: determine path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determine non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determine a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data; and determine a (typically future) vehicle path in dependence on the determined costs.
By determining a future path for the vehicle to traverse the terrain in dependence on both determined path and non-path probability data, a more optimal vehicle path can be determined.
It will be understood that the vehicle path is a path for the vehicle to traverse the terrain.
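The cost-based selection described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the linear cost form and the greedy per-row choice of the cheapest sub-region are assumptions made for clarity.

```python
# Illustrative sketch: derive a per-sub-region cost from path and non-path
# probabilities, then pick the cheapest sub-region in each image row.
import numpy as np

def cost_map(p_path, p_non_path, w_path=1.0, w_non_path=1.0):
    """Cost per sub-region: low where path-like, high where non-path-like."""
    return w_non_path * p_non_path - w_path * p_path

def greedy_path(cost):
    """For each image row (near to far), choose the lowest-cost column."""
    return np.argmin(cost, axis=1)

# Toy 3x4 grid of sub-regions: the centre columns look like path.
p_path = np.array([[0.1, 0.9, 0.8, 0.1],
                   [0.2, 0.8, 0.9, 0.2],
                   [0.1, 0.9, 0.9, 0.1]])
p_non = 1.0 - p_path
cost = cost_map(p_path, p_non)
print(greedy_path(cost))  # column index of the cheapest cell in each row
```

A real planner would of course trade smoothness and vehicle kinematics against raw cell cost rather than choosing each row independently.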
It may be that each of the sub-regions comprises more than one pixel of the said image data.
It may be that the image data comprises two dimensional (2D) image data. It may be that the image data comprises colour image data. It may be that the image data is non-obstacle image data. It may be that the image data contains obstacle and non-obstacle image data. It may be that the control system is configured to obtain the image data from one or more image sensors of the vehicle.
It will be understood that the term "cost" may relate to a penalty or a reward associated with a portion of the terrain to be traversed by the vehicle. An increased or relatively high cost may relate to an increased or relatively high penalty or a reduced or relatively low reward. Similarly, a reduced or relatively low cost may relate to a reduced or relatively low penalty or an increased or relatively high reward. An intermediate cost may relate to a penalty or reward between a relatively low penalty and a relatively high penalty, between a relatively low reward and a relatively high reward, or between a penalty and a reward.
It may be that the control system is configured to operate in an autonomous driving mode, such as a driving mode having level 1, 2, 3, 4 or 5 autonomy (e.g. level 2 autonomy). It may be that the control system is configured to operate in an autonomous off-road driving mode. It may be that the control system is configured to operate in an autonomous low-speed cruise control driving mode, or in both an autonomous low-speed cruise control driving mode and an off-road driving mode.
It may be that the terrain is off-road terrain.
It may be that the at least one controller collectively comprises: at least one electronic processor having an input for receiving the image data; and at least one electronic memory device electrically coupled to the at least one electronic processor and having instructions stored therein, wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon to determine the path probability data, the non-path probability data, the costs and the vehicle path.
It may be that the control system is configured to control the vehicle in dependence on the determined path.
It may be that the control system is configured to autonomously control a steering angle of one or more wheels associated with the vehicle in dependence on the determined vehicle path. By autonomously controlling the steering angle in dependence on the determined vehicle path, the vehicle can be directed autonomously along the vehicle path.
It may be that the control system is configured to autonomously control a speed of the vehicle in dependence on the determined vehicle path. By autonomously controlling the speed of the vehicle in dependence on the determined vehicle path, an optimal vehicle speed can be autonomously selected for the vehicle to safely and comfortably traverse the vehicle path.
It may be that the control system is configured to, for each of the said sub-regions: infer secondary path probability data from the non-path probability data, the secondary path probability data being indicative of a probability that the respective sub-region relates to a path region of the terrain; and determine a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates based on a combination of the path probability data and the secondary path probability data. By inferring secondary probability data from the non-path probability data and combining the secondary path probability data with the path probability data, a more accurate determination can be made as to whether a portion of the terrain is a path region of the terrain.
It may be that the control system is configured to combine the path probability data with the secondary path probability data by applying different weights to the path probability data and the secondary path probability data and combining the weighted path probability data and the weighted secondary path probability data. It may be that the path probability data is given a more significant weight than the secondary path probability data. Alternatively, it may be that the control system is configured to combine the path probability data with the secondary path probability data by applying the same weights to the path probability data and the secondary path probability data, or greater weight may be allocated to the secondary path probability data. It may be that the control system is configured to determine the weight to be applied to the path probability data based on confidence data associated with one or more path boundaries determined from the path probability data (and optionally on confidence data associated with one or more path boundaries determined from the secondary path probability data). Similarly, it may be that the control system is configured to determine the weight to be applied to the secondary path probability data based on confidence data associated with one or more path boundaries determined from the secondary path probability data (and optionally on confidence data associated with one or more path boundaries determined from the path probability data). It may be that the path boundaries are boundaries between path and non-path regions of the terrain.
Alternatively, it may be that the control system is configured to combine the path probability data with the secondary path probability data by offsetting the path probability data in dependence on the non-path probability data.
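The weighted combination of the primary and inferred secondary path probabilities can be sketched as below. The inference of the secondary probability as the complement of the non-path probability, and the confidence values used for the weights, are assumptions for illustration.

```python
# Sketch: combine primary path probability with secondary path probability
# inferred from the non-path data, weighting each by a confidence value.
import numpy as np

def combine(p_path, p_non_path, conf_primary=0.7, conf_secondary=0.3):
    # Assumed inference rule: secondary path probability is the complement
    # of the non-path probability.
    p_secondary = 1.0 - p_non_path
    total = conf_primary + conf_secondary
    w1, w2 = conf_primary / total, conf_secondary / total
    return w1 * p_path + w2 * p_secondary

# One sub-region: primary model says 0.8 path, non-path model says 0.4.
p = combine(np.array([0.8]), np.array([0.4]))
print(round(float(p[0]), 3))  # 0.74
```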
It may be that the control system is configured to determine, for each of the said sub-regions, path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain by: determining image content data from the respective sub-region; and comparing the image content data to a path model relating to the path region of the terrain. It will be understood that, the greater the correlation between the image content data and the path model, the higher the determined probability that the respective sub-region relates to a path region of the terrain (and vice versa).
It may be that the image content data is indicative of a colour content of the sub-region. It may be that the image content data is indicative of a texture content of the sub-region. It may be that the image content data is indicative of a colour and texture content of the sub-region.
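A minimal sketch of such image content data follows; the specific features (per-channel mean colour and grey-level standard deviation as a texture proxy) are assumptions, since the text only requires colour and/or texture content.

```python
# Illustrative feature extraction for one sub-region (patch) of image data:
# mean colour plus a simple texture measure (grey-level standard deviation).
import numpy as np

def content_features(patch_rgb):
    """patch_rgb: (H, W, 3) array. Returns [mean_R, mean_G, mean_B, texture]."""
    mean_colour = patch_rgb.reshape(-1, 3).mean(axis=0)
    grey = patch_rgb.mean(axis=2)   # crude greyscale for the texture measure
    texture = grey.std()
    return np.concatenate([mean_colour, [texture]])

# A uniform red patch: mean colour (100, 0, 0) and zero texture.
patch = np.zeros((8, 8, 3))
patch[..., 0] = 100.0
f = content_features(patch)
print(f)
```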
It may be that the path model is dependent on historical image data relating to the terrain.
Advantageously by the path model being dependent on historical image data relating to the terrain, an accurate, stable, customised path model can be generated for the specific terrain to be traversed by the vehicle. This helps to more accurately determine whether a sub-region of the image data is likely to relate to a path region of the terrain.
It may be that the vehicle has a plurality of wheels. It may be that each of the said wheels is fitted with a respective tyre. It may be that the path model is based on tyre region image data relating to locations on the terrain of one or more tyres of the vehicle. Advantageously, locations on the terrain of one or more tyres of the vehicle typically provide an accurate reference on which the path model can be based as it can generally be assumed that the vehicle tyres will be provided on a path region of the terrain.
It may be that the control system is configured to determine the path model in dependence on one or more of the said sub-regions of the said image data relating to location(s) on the terrain of one or more tyres of the vehicle.
It may be that the control system is configured to determine the one or more sub-regions of the said image data relating to location(s) on the terrain of one or more tyres of the vehicle in dependence on location data indicative of a location of the vehicle at a time after the image data was captured. Advantageously, this allows tyre region image data to be determined from previously captured frames of image data.
It may be that the control system is configured to determine the said location data by performing visual odometry or inertial odometry in respect of the vehicle or from satellite positioning data (such as Global Positioning System (GPS) data) indicative of the location of the vehicle. Advantageously, this generally allows existing sensors of the vehicle to be used to determine the said location data.
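The use of odometry here can be sketched as a planar rigid transform: given the vehicle motion accumulated since a past frame was captured, the current tyre ground position can be expressed in that frame's coordinates so the matching sub-regions can be looked up. The flat-ground, 2D-pose geometry is an assumption for illustration.

```python
# Sketch (assumed planar geometry): map a tyre's current ground position into
# the coordinate frame of a previously captured image, using the vehicle
# displacement (dx, dy) and heading change dyaw reported by odometry.
import math

def tyre_in_past_frame(tyre_xy, dx, dy, dyaw):
    """tyre_xy: tyre position in the current vehicle frame (metres).
    (dx, dy, dyaw): motion since the past frame, expressed in that frame."""
    x, y = tyre_xy
    c, s = math.cos(dyaw), math.sin(dyaw)
    # rotate by the accumulated yaw, then translate by the displacement
    return (c * x - s * y + dx, s * x + c * y + dy)

# Vehicle drove 2 m straight ahead: a tyre now at (1, 0.8) sat at (3, 0.8)
# in the frame captured 2 m ago.
print(tyre_in_past_frame((1.0, 0.8), 2.0, 0.0, 0.0))
```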
It may be that the control system is configured to determine (e.g. update) the path model in dependence on tyre region image data relating to respective current locations on the terrain of one or more tyres of the vehicle.
It may be that the control system is configured to: determine first tyre region image content data from one or more sub-regions of the image data relating to a location of a first tyre of the vehicle; determine second tyre region image content data from one or more sub-regions of the image data relating to a location of a second tyre of the vehicle; compare the first tyre region image content data to the second tyre region image content data; and, in dependence on the first and second tyre region image content data meeting one or more similarity criteria with respect to each other, determine the path model in dependence on the first and second tyre region image content data. By correlating tyre region image data relating to different tyres of the vehicle and determining the path model based on the first and second tyre region image data in dependence on there being a similarity between them, cross-contamination of the path model by image data relating to a non-path region of the terrain can be reduced.
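The similarity gate on the two tyre regions might look like the following sketch; the Euclidean feature distance and the threshold value are assumptions, standing in for whatever similarity criteria an implementation uses.

```python
# Hedged sketch: only admit tyre-region features into the path model when the
# two tyre patches agree, to avoid contaminating the model with non-path data.
import numpy as np

def similar(feat_a, feat_b, threshold=10.0):
    """Assumed similarity criterion: Euclidean feature distance below a bound."""
    return float(np.linalg.norm(feat_a - feat_b)) < threshold

path_model_samples = []  # stand-in for the path model's training data

def maybe_update_model(feat_left, feat_right):
    if similar(feat_left, feat_right):
        path_model_samples.extend([feat_left, feat_right])
        return True
    return False  # one tyre may be off the path; skip the update

left = np.array([120.0, 110.0, 90.0])
right = np.array([118.0, 112.0, 91.0])
print(maybe_update_model(left, right))  # True: both patches look alike
```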
It may be that the control system is configured to: determine tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; compare the tyre region image content data to the path model; and, in dependence on the tyre region image content data and the path model meeting one or more similarity criteria with respect to each other, update the path model in dependence on the tyre region image content data. Again, this helps to avoid cross-contamination of the path model by image data relating to a non-path region of the terrain. By updating the path model over time, a more generalised path model can be determined which is more representative of the path region of the terrain.
It may be that the control system is configured to: determine tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; compare the tyre region image content data to the path model; and, in dependence on the tyre region image content data and the path model meeting one or more dissimilarity criteria, generate or update a second path model (which may be distinct from the said path model) based on the tyre region image content data. This helps to accommodate changes in the terrain. It will be understood that the control system may be configured to replace the path model with the second path model, for example in dependence on a determination that the path region of the terrain better matches the second path model.
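A two-model scheme of this kind can be sketched as a small model bank; the running-mean model representation, the distance threshold, and the promotion rule are all illustrative assumptions.

```python
# Sketch of a two-model scheme: patches consistent with the current path model
# refine it; dissimilar patches seed a second model, which replaces the first
# once the terrain evidently matches it instead.
import numpy as np

class PathModelBank:
    def __init__(self, sim_threshold=10.0):
        self.primary = None      # running mean feature of the path model
        self.secondary = None    # candidate model for changed terrain
        self.t = sim_threshold

    def observe(self, feat):
        if self.primary is None:
            self.primary = feat.copy()
        elif np.linalg.norm(feat - self.primary) < self.t:
            self.primary = 0.9 * self.primary + 0.1 * feat   # refine
        elif self.secondary is None:
            self.secondary = feat.copy()                     # seed new model
        elif np.linalg.norm(feat - self.secondary) < self.t:
            # terrain has changed: promote the second model
            self.primary, self.secondary = self.secondary, None

bank = PathModelBank()
bank.observe(np.array([100.0]))   # gravel track
bank.observe(np.array([40.0]))    # grass: seeds the second model
bank.observe(np.array([42.0]))    # grass again: second model promoted
print(float(bank.primary[0]))     # 40.0
```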
It may be that the control system is configured to, for each of the said sub-regions, determine the non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain by: determining image content data relating to the respective sub-region from the said image data; and comparing the image content data relating to the respective sub-region to a non-path model relating to the non-path region of the terrain. It will be understood that, the greater the correlation between the image content data and the non-path model, the higher the determined probability that the respective sub-region relates to a non-path region of the terrain (and vice versa).
It may be that the path model is a mixture model. It may be that the path model is a Gaussian mixture model (GMM) or any other suitable statistical model. It may be that the non-path model is a mixture model. It may be that the non-path model is a Gaussian mixture model (GMM) or any other suitable statistical model.
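Scoring a feature against a Gaussian mixture model reduces to a weighted sum of component densities, as the following hand-rolled 1-D sketch shows. The component parameters are invented for illustration; a real system would fit them (e.g. by expectation-maximisation) to the tyre-region or lateral-offset image data.

```python
# Minimal stand-in for a mixture-model score: the likelihood of a feature
# under a GMM is the weighted sum of its Gaussian component densities.
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gmm_score(x, components):
    """components: list of (weight, mean, std) tuples, weights summing to 1."""
    return sum(w * gaussian(x, mu, s) for w, mu, s in components)

# A toy path model whose brightness clusters around 100 and 140.
path_gmm = [(0.6, 100.0, 10.0), (0.4, 140.0, 10.0)]

# A patch near a cluster scores far higher than one far from both clusters.
print(gmm_score(100.0, path_gmm) > gmm_score(60.0, path_gmm))  # True
```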
It may be that the non-path model is dependent on historical image data relating to the terrain.
Advantageously by the non-path model being dependent on historical image data relating to the terrain, an accurate, stable, customised non-path model can be generated for the specific terrain to be traversed by the vehicle. This helps to more accurately determine whether a sub-region of the image data is likely to relate to a non-path region of the terrain.
It may be that the non-path model is dependent on image data relating to one or more non-path regions of the terrain laterally offset from the vehicle. Advantageously, image data relating to one or more non-path regions of the terrain laterally offset from the vehicle typically provides an accurate reference on which the non-path model can be based as it can generally be assumed that the terrain on either side of the vehicle relates to a non-path region, particularly when off-road.
It may be that the non-path model is based on image data relating to one or more non-path regions of the terrain laterally offset from the vehicle by a set distance, the control system being configured to determine the set distance by: determining image content data from a plurality of sub-regions of image data relating to respective portions of the terrain laterally offset from the vehicle; comparing the image content data to the path model to thereby determine one or more sub-regions having image contents which meet one or more dissimilarity criteria with respect to the path model; and determining the set distance in dependence on lateral distance(s) between the vehicle and respective portions of the terrain to which the said one or more dissimilar sub-regions relate. By determining the set distance in this way, the lateral distance between the vehicle and the non-path region(s) can advantageously be determined for a particular terrain, allowing the non-path model to be more accurately determined for different terrains. It may be that the control system is configured to update the set distance over time. It may be that the control system is configured to determine (e.g. update) the non-path model based on one or more sub-regions of the image data relating to one or more non-path regions of the terrain laterally offset from the vehicle by the set distance. By updating the non-path model over time, a more generalised non-path model can be generated which is more representative of the non-path region of the terrain.
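The set-distance search described above can be sketched as a walk outward from the vehicle, stopping at the first lateral offset whose sub-region no longer matches the path model; the feature distance and threshold used here are assumptions.

```python
# Sketch of the set-distance determination: take the first lateral offset
# at which the sub-region becomes dissimilar to the path model.
import numpy as np

def set_distance(lateral_feats, path_feat, offsets, threshold=10.0):
    """lateral_feats[i]: feature at lateral offset offsets[i] (metres),
    ordered from nearest the vehicle to farthest."""
    for feat, off in zip(lateral_feats, offsets):
        if np.linalg.norm(feat - path_feat) >= threshold:  # dissimilar
            return off
    return offsets[-1]  # fall back to the widest sampled offset

path_feat = np.array([100.0])                      # e.g. track brightness
feats = [np.array([98.0]), np.array([97.0]),       # still track-like
         np.array([55.0])]                         # grass starts at 3 m
print(set_distance(feats, path_feat, offsets=[1.0, 2.0, 3.0]))  # 3.0
```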
It may be that the control system is configured to: determine image content data from one or more sub-regions of the image data relating to respective portions of the terrain laterally offset from the vehicle by the set distance; compare the image content data to the path model; and, in dependence on the image content data relating to one or more of the said sub-regions meeting one or more dissimilarity criteria with respect to the path model, determine the non-path model in dependence on the dissimilar image content data. Advantageously, this helps to reduce cross contamination of the non-path model with image data relating to the path region of the terrain.
It may be that the control system is configured to: determine tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; compare the tyre region image content data with the non-path model; and, in dependence on the tyre region image content data and the non-path model meeting one or more dissimilarity criteria with respect to each other, determine the path model in dependence on the tyre region image content data. Advantageously, this helps to reduce cross contamination of the path model with image data relating to the non-path region of the terrain.
It may be that the control system is configured to: determine that a selected portion of a sub-region meets one or more similarity criteria with respect to the path model; and in dependence thereon selectively update the path model in dependence on the selected portion of the sub-region.
It may be that the control system is configured to: obtain 3D data in respect of the terrain; and, for respective portions of the terrain relating to each of a plurality of the said sub-regions, determine the cost for the vehicle to traverse the respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data and on the 3D data. Advantageously, this allows the presence of obstacles to be determined for the sub-regions and, correspondingly, a more optimal vehicle path can be determined.
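Folding the 3D data into the cost might look like the following sketch, in which a large height step marks an obstacle regardless of how path-like the cell appears; the obstacle penalty and the height-step threshold are illustrative assumptions.

```python
# Hedged sketch: combine path/non-path probabilities with 3D terrain data so
# that cells containing a large height step are penalised as obstacles.
import numpy as np

def fused_cost(p_path, p_non_path, height_step,
               step_limit=0.25, obstacle_penalty=100.0):
    """height_step: per-cell max height discontinuity (metres) from 3D data."""
    base = p_non_path - p_path
    return np.where(height_step > step_limit, base + obstacle_penalty, base)

p_path = np.array([0.9, 0.9])
p_non = np.array([0.1, 0.1])
steps = np.array([0.05, 0.60])   # second cell has a 60 cm step: an obstacle
costs = fused_cost(p_path, p_non, steps)
print(costs)  # the second cell is heavily penalised despite looking path-like
```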
It may be that the functionality of the control system is performed by the at least one controller. It may be that the at least one controller is implemented in hardware, software, firmware or any combination thereof. It may be that the at least one controller comprises one or more electronic processors. It may be that one or more or each of the electronic processors are hardware processors. It may be that the at least one controller comprises or consists of an electronic control unit.
According to another aspect of the invention, there is provided a method of determining a vehicle path, the method comprising: obtaining image data relating to terrain to be traversed by the vehicle, and for each of a plurality of sub-regions of the image data: determining path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determining non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determining a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data; and determining a (future) vehicle path in dependence on the determined costs or outputting determined costs in dependence on which a vehicle path can be determined.
It may be that the method comprises controlling the (e.g. steering and/or speed of the) vehicle in dependence on the determined vehicle path.
It may be that the method comprises, for each of the said sub-regions: inferring secondary path probability data from the non-path probability data, the secondary path probability data being indicative of a probability that the respective sub-region relates to a path region of the terrain; and determining a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates based on a combination of the path probability data and the secondary path probability data.
It may be that the method comprises combining the path probability data with the secondary path probability data by applying (e.g. different) weights to the path probability data and the secondary path probability data and combining the weighted path probability data and the weighted secondary path probability data.
It may be that the method comprises, for each of the said sub-regions, determining path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain by: determining image content data from the respective sub-region; and comparing the image content data to a path model relating to the path region of the terrain.
It may be that the method comprises determining the path model in dependence on one or more of the said sub-regions of the said image data relating to location(s) on the terrain of one or more tyres of the vehicle.
It may be that the method comprises determining the one or more sub-regions of the said image data relating to location(s) on the terrain of one or more tyres of the vehicle in dependence on location data indicative of a location of the vehicle at a time after the image data was captured.
It may be that the method comprises: determining first tyre region image content data from one or more sub-regions of the image data relating to a location of a first tyre of the vehicle; determining second tyre region image content data from one or more sub-regions of the image data relating to a location of a second tyre of the vehicle; comparing the first tyre region image content data to the second tyre region image content data; and, in dependence on the first and second tyre region image content data meeting one or more similarity criteria with respect to each other, determining the path model in dependence on the first and second tyre region image content data.
It may be that the method comprises: determining tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; comparing the tyre region image content data to the path model; and, in dependence on the tyre region image content data and the path model meeting one or more similarity criteria with respect to each other, updating the path model in dependence on the tyre region image content data.
It may be that the method comprises, for each of the said sub-regions, determining the non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain by: determining image content data relating to the respective sub-region from the said image data; and comparing the image content data relating to the respective sub-region to a non-path model relating to the non-path region of the terrain.
It may be that the non-path model is dependent on image data relating to one or more non-path regions of the terrain laterally offset from the vehicle by a set distance. It may be that the method comprises determining the set distance by: determining image content data from a plurality of sub-regions of image data relating to respective portions of the terrain laterally offset from the vehicle; comparing the image content data to the path model to thereby determine one or more sub-regions having image contents which meet one or more dissimilarity criteria with respect to the path model; and determining the set distance in dependence on lateral distance(s) between the vehicle and respective portions of the terrain to which the said one or more dissimilar sub-regions relate.
It may be that the method comprises determining the non-path model based on one or more sub-regions of the image data relating to one or more non-path regions of the terrain laterally offset from the vehicle by the set distance.
It may be that the method comprises: determining image content data from one or more sub-regions of the image data relating to respective portions of the terrain laterally offset from the vehicle by the set distance; comparing the image content data to the path model; and, in dependence on the image content data relating to one or more of the said sub-regions meeting one or more dissimilarity criteria with respect to the path model, determining the non-path model in dependence on the dissimilar image content data.
It may be that the method comprises: determining tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; comparing the tyre region image content data with the non-path model; and, in dependence on the tyre region image content data and the non-path model meeting one or more dissimilarity criteria with respect to each other, determining the path model in dependence on the tyre region image content data.
It may be that the method comprises: determining that a selected portion of a sub-region meets one or more similarity criteria with respect to the path model; and in dependence thereon selectively updating the path model in dependence on the selected portion of the sub-region.
It may be that the method comprises: obtaining 3D data in respect of the terrain; and, for respective portions of the terrain relating to each of a plurality of the said sub-regions, determining the cost for the vehicle to traverse the respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data and on the 3D data.
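By way of illustration, the cost determination described above, which combines path probability data, non-path probability data and 3D data for each sub-region, could be sketched as follows. The blending formula, weights and slope cap are assumptions for the sketch only, not values taken from the embodiment:

```python
import numpy as np

def traversal_cost(p_path, p_non_path, slope_rad,
                   w_path=1.0, w_slope=2.0, max_slope=np.radians(30.0)):
    """Blend an appearance term and a 3D term into a per-sub-region cost.

    The appearance term grows when the sub-region looks unlike the path
    and like the non-path; the 3D term penalises terrain slope derived
    from the 3D data, capped at max_slope. All weights are illustrative.
    """
    cost = w_path * (1.0 - p_path) * p_non_path
    cost += w_slope * float(np.clip(slope_rad / max_slope, 0.0, 1.0))
    return cost

# Flat terrain matching the path model is cheap to traverse;
# steep terrain matching the non-path model is expensive.
cheap = traversal_cost(p_path=0.9, p_non_path=0.1, slope_rad=0.0)
dear = traversal_cost(p_path=0.1, p_non_path=0.9, slope_rad=np.radians(25.0))
```

A trajectory planner could then sum such costs over the sub-regions swept by each candidate path and select the cheapest candidate.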
According to another aspect of the invention, there is provided a vehicle comprising a control system as described herein.
According to another aspect of the invention, there is provided a computer program product comprising computer readable instructions that, when executed by a computer, cause performance of a method described herein.
According to another aspect of the invention, there is provided a non-transitory computer readable medium comprising computer readable instructions that, when executed by a processor, cause performance of a method described herein.
Any controller or controllers described herein may suitably comprise a control unit or computational device having one or more electronic processors. Thus the system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term "controller" or "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memories associated with said controller to be executed on said computational device. A first controller may be implemented in software run on one or more processors. One or more other controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment or aspect can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described by way of example only, with reference to the accompanying drawings, in which: Fig. 1 shows a schematic illustration of a vehicle in plan view; Fig. 2 shows the vehicle of Fig. 1 in side view; Fig. 3 is a high level schematic diagram of the vehicle speed control system of the vehicle of Figs. 1 and 2, including a cruise control system and a low-speed progress control system; Fig. 4 illustrates a steering wheel of the vehicle of Figs. 1 and 2; Fig. 5 is a flow chart illustrating operation of a control system of the vehicle of Figs. 1 and 2; Fig. 6 illustrates the manner in which a colour and/or texture descriptor p_i may be generated; Figs. 7a and 7b show sub-regions of image data having a first portion relating to a path region of the terrain and a second portion relating to a non-path region of the terrain; Fig. 8a illustrates a 2D image captured by one of the cameras of the stereoscopic camera system of the vehicle of Fig. 1; Fig. 8b illustrates a disparity image indicative of the differences between images obtained from the first and second cameras of the stereoscopic camera system of the vehicle of Fig. 1; Fig. 8c shows in plan view the pixels of the image of Fig. 8a overlaid on a 3D grid obtained based on the disparity image of Fig. 8b; Fig. 9a shows, in plan view, the pixels of the RGB image of Fig. 8a overlaid on a 3D grid obtained based on the disparity image of Fig. 8b (i.e. Fig. 9a is identical to Fig. 8c); Fig. 9b is a path probability map derived from the image of Fig. 9a with each pixel classified by reference to a path model; Fig. 9c shows, in plan view, a non-path probability map derived from the image of Fig. 9a with each pixel classified using a non-path model; Fig. 9d illustrates the inverse of the non-path probability map; Figs. 9e and 9f show the path and non-path boundaries derived from the path and non-path probability maps; Fig. 9g shows the final path probability map obtained from a weighted combination of the path probability map and the inverse of the non-path probability map; Fig. 9h shows the final path probability map of Fig. 9g merged with a global path probability map stored in a memory; Fig. 9i shows the final path boundary determined from the merged path probability map of Fig. 9h; Fig. 10 is a close-up schematic view of the terrain of Fig. 9a together with a cost map overlaid thereon; Fig. 11 shows the view and cost map of Fig. 10 with a plurality of candidate trajectories overlaid thereon; Fig. 12 is a similar view to Fig. 10 but showing shadow regions on the terrain in place of the puddle regions of Fig. 10; Fig. 13 shows the volume swept by the vehicle of Figs. 1 and 2 increasing during tight turns; Fig. 14 shows the vehicle of Figs. 1 and 2 following mud ruts; Fig. 15 shows a vehicle control unit of the vehicle of Figs. 1 and 2 receiving cost data from three different cost data structures; and Figs. 16a and 16b show obstacle data being provided at predefined portions at the beginning and end of respective transmitted data structures comprising obstacle and non-obstacle data.
DETAILED DESCRIPTION
FIGs. 1 and 2 show a vehicle 100 having wheels 111, 112, 114, 115, each of which is fitted with a respective tyre, and a body 116 carried by the wheels 111, 112, 114, 115. The vehicle has a powertrain 129 that includes an engine 121 that is connected to a driveline 130 having an automatic transmission 124. A control system for the vehicle 100 includes a central controller, referred to as a vehicle control unit (VCU) 10, a powertrain controller 11, a brake controller 13 (an anti-lock braking system (ABS) controller) and a steering controller 170C. The ABS controller 13 forms part of a braking system 22 (Fig. 3). Each of the controllers 10, 11, 13, 170C comprises one or more electronic processors and a memory device storing computer program instructions, the one or more electronic processors being configured to access the respective memory device and to execute the computer program instructions stored therein to thereby perform the functionality attributed to that controller.
The VCU 10 may receive and output a plurality of signals to and from various sensors and subsystems (not shown) provided on the vehicle. Referring to Fig. 3, the VCU 10 may include a low-speed progress (LSP) control system 12, a stability control system (SCS) 14, a cruise control system 16 and a hill descent control (HDC) system 12HD. The SCS 14 improves the safety of the vehicle 100 by detecting and managing loss of traction or steering control. When a reduction in traction or steering control is detected, the SCS 14 may be operable automatically to command the ABS controller 13 to apply one or more brakes of the vehicle to help to steer the vehicle 100 in the direction the user wishes to travel. Although the SCS 14 is implemented by the VCU 10 in this case, the SCS 14 may alternatively be implemented by the ABS controller 13.
The cruise control system 16 may be operable to automatically maintain vehicle speed at a selected speed when the vehicle is travelling at speeds in excess of 25 kph. The cruise control system 16 may be provided with a cruise control HMI (human machine interface) 18 by which means the user can input a target vehicle speed to the cruise control system 16. In one embodiment of the invention, cruise control system input controls are mounted to a steering wheel 171. This is illustrated in Fig. 4. The cruise control system 16 may monitor vehicle speed and automatically correct any deviation from the target vehicle speed so that the vehicle speed is maintained at a substantially constant value, typically in excess of 25 kph. It may be that the cruise control system 16 is not effective at speeds lower than 25 kph. The cruise control HMI 18 may be configured to provide an alert to the user about the status of the cruise control system 16 via a visual display of the HMI 18.
The LSP control system 12 may also provide a speed-based control system for the user which enables the user to select a relatively low target speed at which the vehicle can progress without any pedal inputs being required by the user to maintain vehicle speed. It may be that low-speed speed control (or progress control) functionality is not provided by the on-highway cruise control system 16 which operates only at speeds above 25 kph. The LSP control system 12 may be activated by pressing LSP control system selector button 178 mounted on steering wheel 171. The LSP system 12 may be operable to apply selective powertrain, traction control and braking actions to one or more wheels of the vehicle 100, collectively or individually.
The LSP control system 12 may be configured to allow a user to input a desired value of vehicle target speed in the form of a set-speed parameter, user_set-speed, via a low-speed progress control HMI (LSP HMI) 20 (Fig. 1, Fig. 3) which shares certain input buttons 173-175 with the cruise control system 16 and HDC control system 12HD (Fig. 4). Provided the vehicle speed is within the allowable range of operation of the LSP control system 12 (which may be the range from 2 to 30 kph although other ranges may be provided) and no other constraint on vehicle speed exists whilst under the control of the LSP control system 12, the LSP control system 12 may control vehicle speed in accordance with an LSP control system set-speed value, LSP_set-speed, which is set substantially equal to user_set-speed. The LSP HMI 20 may also include a visual display by means of which information and guidance can be provided to the user about the status of the LSP control system 12.
The LSP control system 12 may receive an input from the ABS controller 13 of the braking system 22 of the vehicle indicative of the extent to which the user has applied braking by means of the brake pedal 163. The LSP control system 12 may also receive an input from an accelerator pedal 161 indicative of the extent to which the user has depressed the accelerator pedal 161, and an input from the transmission or gearbox 124. Other inputs to the LSP control system 12 may include an input from the cruise control HMI 18 which is representative of the status (ON/OFF) of the cruise control system 16, an input from the LSP control HMI 20, and an input from a gradient sensor 45 indicative of the gradient of the driving surface over which the vehicle 100 is driving. In the present embodiment the gradient sensor 45 may be a gyroscopic sensor. In some alternative embodiments the LSP control system 12 may receive a signal indicative of driving surface gradient from another controller such as the ABS controller 13. The ABS controller 13 may determine gradient based on a plurality of inputs, optionally based at least in part on signals indicative of vehicle longitudinal and lateral acceleration and a signal indicative of vehicle reference speed (v_actual) being a signal indicative of actual vehicle speed over ground. The vehicle reference speed may be determined to be the speed of the second slowest turning wheel, or the average speed of all the wheels. Other ways of calculating vehicle reference speed may be used including by means of a camera device or radar sensor.
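The reference-speed rule described above (the second slowest turning wheel, or the average of all wheels) can be sketched as follows; the wheel speeds shown are hypothetical example values:

```python
def vehicle_reference_speed(wheel_speeds, use_average=False):
    """Estimate vehicle speed over ground (v_actual) from wheel speeds.

    Either averages all wheels, or takes the second slowest wheel so
    that a single locked or spinning wheel does not skew the estimate.
    """
    if use_average:
        return sum(wheel_speeds) / len(wheel_speeds)
    return sorted(wheel_speeds)[1]  # second slowest wheel

# Hypothetical wheel speeds in kph; one wheel is spinning faster
# than the others and is therefore ignored by the default rule.
v_actual = vehicle_reference_speed([9.8, 10.1, 10.2, 14.5])
```

Taking the second slowest wheel rather than the minimum also tolerates one wheel being briefly locked under braking.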
The VCU 10 may be configured to implement a Terrain Response (TR) (RTM) System in which the VCU 10 controls settings of one or more vehicle systems or sub-systems such as the powertrain controller 11 in dependence on a selected driving mode. The driving mode may be selected by a user by means of a driving mode selector 141S (Fig. 1) or it may be determined automatically by the VCU 10. The driving modes may also be referred to as terrain modes, terrain response (TR) modes, or control modes. In the embodiment of Fig. 1 five driving modes may be provided: an 'on-highway' driving mode suitable for driving on a relatively hard, smooth driving surface where a relatively high surface coefficient of friction exists between the driving surface and wheels of the vehicle; a 'sand' driving mode suitable for driving over sandy terrain, being terrain characterised at least in part by relatively high drag, relatively high deformability or compliance and relatively low surface coefficient of friction; a 'grass, gravel or snow' (GGS) driving mode suitable for driving over grass, gravel or snow, being relatively slippery surfaces (i.e. having a relatively low coefficient of friction between surface and wheel and, typically, lower drag than sand); a 'rock crawl' (RC) driving mode suitable for driving slowly over a rocky surface; and a 'mud and ruts' (MR) driving mode suitable for driving in muddy, rutted terrain. The latter four driving modes may be considered to be off-road driving modes.
In order to cause application of the necessary positive or negative torque to the wheels, the VCU 10 may command that positive or negative torque is applied to the vehicle wheels by the powertrain 129 and/or that a braking force is applied to the vehicle wheels by the braking system 22, either or both of which may be used to implement the change in torque that is necessary to attain and maintain a required vehicle speed.
The vehicle 100 may be provided with additional sensors (not shown) which are representative of a variety of different parameters associated with vehicle motion and status. These may be inertial systems unique to the LSP or HDC control systems 12, 12HD or part of an occupant restraint system or any other sub-system which may provide data from sensors such as gyros and/or accelerometers that may be indicative of vehicle body movement and may provide a useful input to the LSP and/or HDC control systems 12, 12HD. The signals from the sensors provide, or are used to calculate, a plurality of driving condition indicators (also referred to as terrain indicators) which are indicative of the nature of the terrain conditions over which the vehicle 100 is travelling. The sensors (not shown) of the vehicle 100 may include, but are not limited to, sensors which provide continuous sensor outputs to the VCU 10, including any one or more of: wheel speed sensors; an ambient temperature sensor; an atmospheric pressure sensor; tyre pressure sensors; wheel articulation sensors; gyroscopic sensors to detect vehicular yaw, roll and pitch angle and rate; a vehicle speed sensor; a longitudinal acceleration sensor; an engine torque sensor (or engine torque estimator); a steering angle sensor; a steering wheel speed sensor; a gradient sensor (or gradient estimator); a lateral acceleration sensor which may be part of the SCS 14; a brake pedal position sensor; a brake pressure sensor; an accelerator pedal position sensor; longitudinal, lateral and vertical motion sensors; water detection sensors forming part of a vehicle wading assistance system (not shown). The vehicle 100 may further comprise a location sensor, such as a satellite positioning system (e.g. Global Positioning System (GPS), Galileo or GLONASS) receiver configured to receive signals from a plurality of satellites to determine the location of the vehicle.
The vehicle 100 may be provided with a stereoscopic camera system 185C configured to generate stereo colour image pairs by means of a pair of forward-facing colour video cameras comprised by the system 185C. The system 185C may further comprise one or more electronic processors and a memory device storing computer program instructions, the one or more electronic processors being configured to access the respective memory device and to execute the computer program instructions stored therein. A stream of dual video image data is fed from the cameras to the one or more processors of the system 185C which may access and execute instructions stored in the memory of the said system 185C to process the image data and repeatedly generate a 3D point cloud data set based on the images received.
Alternatively the images may be obtained and processed by any processing system of the vehicle 100, such as the VCU 10. Each point in the 3D point cloud data set may correspond to a 3D coordinate of a point on a surface of terrain ahead of the vehicle 100 viewed by each of the forward-facing video cameras of the stereoscopic camera system 185C.
The LSP control system 12 may have an autonomous driving mode in which the VCU 10 controls the steering and speed of the vehicle autonomously. In this case, the LSP HMI 20 may allow the driver to select the autonomous driving mode. The autonomous driving mode may have a level of automation of level 1 or above according to the SAE International standard. The autonomous mode may have level 2 autonomy, that is: the automated system takes full control of the vehicle (accelerating, braking, and steering); the driver must monitor the driving and be prepared to intervene immediately at any time if the automated system fails to respond properly. Thus, the speed of the vehicle 100 set by the user in the LSP mode may be overridden (typically reduced) by the VCU 10, for example, if it is inappropriate for driving conditions (e.g. if there are obstacles in the path, or if the set speed is inappropriate for the curvature of the vehicle path).
It may be that the autonomous LSP driving mode is particularly suitable for controlling the vehicle off-road, where road markings and road signs are either absent or sparsely provided.
Accordingly, as well as operating in the autonomous LSP driving mode, it may be that the vehicle is also operating in one of the off-road TR driving modes. However, the autonomous LSP driving mode may also be suitable for controlling the vehicle on road.
As will be explained below with reference to Fig. 5, when the vehicle is operating in the autonomous LSP driving mode, the stereoscopic camera system 185C may be configured to use the colour image pairs to determine cost data relating to the terrain and to provide the cost data to the VCU 10. The VCU 10 may then determine a future path of the vehicle 100 in dependence on the cost data and control the vehicle 100 in accordance with the determined path. For example, the steering angle of one or more wheels of the vehicle 100 may be controlled by the VCU 10 outputting a steering angle control signal to the steering controller 170C dependent on the curvature of the determined path. The VCU 10 may store in an electronic memory thereof a look-up table of predetermined maximum allowable vehicle speeds for different path curvatures, and the VCU 10 may select an appropriate vehicle speed from the look-up table in dependence on the curvature of the determined path, and output this speed to the LSP control system 12 to override the user_set-speed.
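Such a curvature-based speed override could, for example, take the form of a band look-up capped at the user's set-speed. The table entries below are illustrative values only, not figures stored by the VCU 10:

```python
import bisect

# Hypothetical look-up table mapping path curvature (1/m) to a
# maximum allowable vehicle speed (kph); values are illustrative.
CURVATURES = [0.00, 0.02, 0.05, 0.10, 0.20]  # ascending band edges
MAX_SPEEDS = [30.0, 25.0, 15.0, 8.0, 4.0]

def max_speed_for_curvature(curvature, user_set_speed):
    """Select the speed for the curvature band the determined path
    falls in, never exceeding the speed set by the user."""
    i = bisect.bisect_right(CURVATURES, curvature) - 1
    i = max(0, min(i, len(MAX_SPEEDS) - 1))
    return min(MAX_SPEEDS[i], user_set_speed)
```

For a gentle bend the user's set-speed is left unchanged; for a tight bend the table value takes precedence.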
A method for determining a future path of the vehicle to traverse (e.g. off-road) terrain based on respective frames of image data from the stereoscopic camera system 185C will now be explained with reference to Fig. 5. At 500, a stereo colour image pair may be captured by the stereoscopic camera system 185C and RGB image data from the stereo colour image pair may be stored in a memory accessible to the one or more electronic processors of the stereoscopic camera system 185C. At 502a, the stereoscopic camera system 185C may select the image data of a first image of the image pair (which is a 2D colour image) and at 504a may convert the selected image from the RGB colour space to the LAB colour space (although 504a is not essential, and other colour spaces such as RGB or HSV may alternatively be used). At 502b, the camera system 185C may compare the first and second images of the image pair to thereby determine a disparity image.
At 504b, the stereoscopic camera system 185C may calculate a real-world 3D point cloud based on the disparity image. The 3D point cloud may initially be related to a frame of reference of the camera system 185C, but may then be translated to a frame of reference of the vehicle 100 before being translated to a frame of reference which is fixed with respect to the earth (rather than with respect to the vehicle 100), for example by reference to vehicle orientation information provided by the vehicle's inertial measurement unit (IMU) 23. The 3D point cloud typically has a large number of points. The number of points may be reduced by the camera system 185C determining a 3D grid map (by a method such as multi-level surface (MLS) mapping) from the 3D point cloud, mapped relative to a horizontal ground plane. It may be that the surface of the terrain is inclined or shifted with respect to the horizontal ground plane. The 3D grid map may comprise one or more metrics in respect of each of a plurality of 3D blocks of the 3D point cloud, the metrics typically including information relating to the slope of the terrain and the elevation of the features of the terrain within that block.
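The back-projection from a disparity image to a camera-frame 3D point cloud follows standard rectified-stereo geometry, with depth Z = f·B/d per pixel. A minimal sketch is given below; the intrinsic parameters (focal length in pixels, baseline in metres, principal point) are placeholders, not values from the camera system 185C:

```python
import numpy as np

def disparity_to_points(disparity, f_px, baseline_m, cx, cy):
    """Back-project a disparity image to a camera-frame 3D point cloud.

    For each pixel with positive disparity d, depth is Z = f * B / d;
    X and Y then follow from the pinhole model. Returns one (X, Y, Z)
    row per valid pixel, in metres, in the camera frame.
    """
    v, u = np.indices(disparity.shape)      # pixel rows and columns
    valid = disparity > 0                   # ignore unmatched pixels
    z = f_px * baseline_m / disparity[valid]
    x = (u[valid] - cx) * z / f_px
    y = (v[valid] - cy) * z / f_px
    return np.column_stack((x, y, z))
```

A subsequent rigid transform (from the IMU orientation) would take these points into the vehicle frame and then the earth-fixed frame, as described above.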
At 506, the stereoscopic camera system 185C may overlay the LAB (or RGB or HSV, for example) pixels derived from the first image of the stereo image pair onto the 3D grid map. It may be that the (e.g. off-road) terrain to be traversed by the vehicle comprises a path region (e.g. a paved portion or mud ruts provided through a grass field) and a non-path region (e.g. grass on either side of the ruts, or on either side of the paved portion). At 508a-516a, the one or more processors of the stereoscopic camera system 185C may execute computer program instructions on the image data determined at 506 to determine probabilities that respective portions of the terrain relate to the path region thereof.
The image data may be divided by the camera system 185C into a plurality of sub-regions. It may be that each of the sub-regions relates to a 25cm x 25cm region of the terrain. Thus, it may be that each of the sub-regions comprises a plurality of pixels of image data. At 508a, an assumption may be made that the tyres of the vehicle are located on a path region of the terrain. Because the vehicle 100 is moving, the current location of the vehicle will differ from the location of the vehicle when the image data was captured. Accordingly, the image data may comprise image data corresponding to the current locations of the tyres of the vehicle 100. The camera system 185C may determine the current location of the vehicle 100 relative to the location of the vehicle 100 when the image data was captured, e.g. by performing visual odometry or inertial odometry on the image data and/or inertia data from the IMU 23, or using satellite positioning data (such as Global Positioning System (GPS) data) indicative of the location of the vehicle 100, and identify one or more sub-regions of the image data corresponding to the locations of the terrain currently contacted by the tyres of the vehicle 100 based on the current location of the vehicle. In the following description it will be assumed that one sub-region of the image data is identified for each tyre, but it will be understood that more than one sub-region may be identified for each tyre (depending on the relative sizes of the portion of the tyre in contact with the terrain and the sub-regions).
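The mapping from a tyre's current ground-contact position to the sub-region covering that 25cm x 25cm patch of terrain could, for example, be performed as below. The frame conventions, grid origin and function names are assumptions for illustration:

```python
import numpy as np

CELL_SIZE = 0.25  # each sub-region spans 25 cm x 25 cm of terrain

def tyre_sub_region(tyre_xy_m, grid_origin_xy_m):
    """Map a tyre's current ground-contact position (metres, in the
    earth-fixed frame in which the image data was gridded) to the
    row/column index of the sub-region covering that terrain patch."""
    offset = np.asarray(tyre_xy_m) - np.asarray(grid_origin_xy_m)
    return tuple((offset // CELL_SIZE).astype(int))

# A tyre 1.30 m forward and 0.60 m left of the grid origin falls in
# sub-region (5, 2) of the 0.25 m grid.
cell = tyre_sub_region((1.30, 0.60), (0.0, 0.0))
```

In practice the tyre position itself would come from the odometry or satellite positioning described above.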
At 510a, the camera system 185C may be configured to process the sub-regions of the image data corresponding to the locations currently occupied by the tyres of the vehicle 100 to determine tyre region image content data relating to each of those tyre regions. The image content data may comprise colour image content data relating to the colour content of the respective sub-regions. Additionally or alternatively, the image content data may comprise texture data relating to the texture content of the respective sub-regions. Texture is a measure of the local spatial variation in the intensity of the image and is generally measured by subtracting the intensity of a given pixel from the intensity of each of the eight surrounding pixels to provide eight texture descriptors per pixel. It may be that the image content data comprises a colour and texture descriptor, p_i, which contains eleven components for each pixel consisting of three L, a, b colour components and eight texture descriptors. An example of how a colour and texture descriptor, p_i, may be calculated is shown in Fig. 6, where subject pixel S of intensity Lc is shown surrounded by pixels S1 to S8 of respective intensities L1 to L8. Lc, ac and bc are the "LAB" colour components of pixel S. The set of weights W1, W2 and W3 is used to balance how much to use colour, texture and brightness for image clustering.
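A minimal sketch of such an eleven-component per-pixel descriptor (three LAB colour components plus eight texture terms) is given below. The weight values and the exact way W1, W2 and W3 multiply the components are assumed arrangements for illustration, not taken from the embodiment:

```python
import numpy as np

# Illustrative weights balancing colour (W1), texture (W2) and
# brightness (W3); real values would be tuned for the clustering step.
W1, W2, W3 = 1.0, 0.5, 0.8

def pixel_descriptor(lab, row, col):
    """Build the 11-component descriptor p_i for one interior pixel of
    a LAB image: its L, a, b components plus eight texture terms, each
    a neighbouring pixel's intensity minus the centre intensity L_c."""
    l_c, a_c, b_c = lab[row, col]
    texture = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue  # skip the subject pixel itself
            texture.append(lab[row + dr, col + dc, 0] - l_c)
    colour = [W3 * l_c, W1 * a_c, W1 * b_c]
    return np.array(colour + [W2 * t for t in texture])  # 11 values
```

Stacking these descriptors over a sub-region yields the image content data compared against the path model in the later steps.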
By making the assumption that the tyre regions of the image data relate to a path region of the terrain, portions of the terrain which relate to path regions can be automatically identified.
In addition, "path regions" comprising tyre tracks or mud ruts rather than paved paths can automatically be accounted for.
At 512a, the tyre region image content data may be merged with a global path model, such as a Gaussian mixture model (GMM), stored in a memory of the VCU 10. In some cases, it may be that more than one path model is provided (e.g. one for colour and one for texture), in which case 512a may be performed for each path model, but it will be assumed in the following description that a single global path GMM is provided. The global path GMM may be based on historical image data captured by the stereoscopic camera system 185C relating to historical locations of the terrain of the tyres of the vehicle 100.
Before the tyre region image content data is merged with the global path GMM, checks may be performed on the tyre region image content data to determine whether it is suitable for merger with the global path GMM. For example, the tyre region image content data relating to the location of each tyre may be compared to the tyre region image content data relating to the locations of each of the other tyres. If the tyre region image content data relating to a tyre does not meet one or more similarity criteria with respect to tyre region image content data relating to one or more of the other tyres, it may be that this is indicative that the tyre to which it relates does not in fact occupy a path region of the terrain, and it may be that the stereoscopic camera system 185C decides not to merge it with the global path GMM. If tyre region image content data relating to the tyres (or a sub-set of the tyres) do meet the similarity criteria with respect to each other, it may be that the camera system 185C updates the global path GMM in dependence on the tyre region image content data relating to those tyres.
It may be that the similarity criteria comprise one or more conditions relating to the tyre region image content data. For example, it may be that the similarity criteria comprise one or more conditions that the tyre region image content data relating to a tyre matches the tyre region image content data relating to one or more other tyres of the vehicle to at least a given degree. For example, it may be that the similarity criteria comprise one or more colour and/or texture conditions that colour and/or texture distributions of the tyre region image content data relating to a tyre matches the colour and/or texture distribution of the tyre region image content data relating to one or more other tyres. It will be understood that the image content data may be represented in any suitable way. For example, the image content data may comprise colour and/or texture components for each pixel of the sub-region, or the image content data may comprise a local GMM for that sub-region.
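One simple way to realise such a similarity criterion, comparing the colour distributions of two tyre regions by histogram intersection, is sketched below. The bin count, value range and 0.6 threshold are assumptions, not values from the embodiment:

```python
import numpy as np

def regions_similar(values_a, values_b, bins=16, rng=(0.0, 255.0),
                    threshold=0.6):
    """Illustrative similarity test between two tyre-region samples.

    Builds normalised histograms of a per-pixel quantity (e.g. one LAB
    channel) for each region and scores their overlap by histogram
    intersection, which is 1.0 for identical distributions and 0.0 for
    disjoint ones; regions pass if the overlap reaches the threshold.
    """
    h_a, _ = np.histogram(values_a, bins=bins, range=rng)
    h_b, _ = np.histogram(values_b, bins=bins, range=rng)
    h_a = h_a / max(h_a.sum(), 1)
    h_b = h_b / max(h_b.sum(), 1)
    return float(np.minimum(h_a, h_b).sum()) >= threshold
```

Only tyre regions passing such a test against each other (or against the global path GMM) would then be merged into the path model, limiting cross-contamination as described above.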
One or more tyres of the vehicle may occasionally enter a non-path region of the terrain while one or more other tyres of the vehicle remain on the path region of the terrain. By correlating tyre region image content data relating to different tyres of the vehicle and updating the global path GMM in dependence on there being a similarity between the tyre region image content data, cross-contamination of the global path GMM by tyre region image content data relating to a non-path region of the terrain can be reduced.
Additionally or alternatively, it may be that the tyre region image content data is compared to the global path GMM. If tyre region image content data meets one or more similarity criteria with respect to the global path GMM, the stereoscopic camera system 185C may update the global path GMM in accordance with the tyre region image content data. If the tyre region image content data does not meet the said similarity criteria with respect to the global path GMM, it may be an indication that the tyre region image content data does not in fact relate to the path region of the terrain, and that tyre region image content data is not merged with the global path GMM. This again helps to avoid cross-contamination of the path model by image data relating to a non-path region of the terrain.
In this case, it may be that the similarity criteria comprise one or more conditions relating to the tyre region image content data and global path GMM. For example, it may be that the similarity criteria comprise one or more colour and/or texture conditions that colour and/or texture distribution of the tyre region image content data matches the colour and/or texture distribution of the global path GMM to at least a given degree.
When the relevant tyre region image content data is merged with the global path GMM, an updated global path GMM may be provided. It will be understood that, the first time the method of Fig. 5 is performed, 512a may be omitted. Instead, it may be that the tyre region image content data relating to the locations of each of the tyres are compared to each other and the matching tyre region image content data is used to create a global path GMM.
If the tyre region image content data does not meet the said similarity criteria with respect to the global path GMM (or meets one or more dissimilarity criteria with respect to the global path GMM), the stereoscopic camera system 185C may exclude the tyre region image content data from the global path GMM. The stereoscopic camera system 185C may generate or update a second global path GMM (or any other suitable model) distinct from the said global path GMM based on the tyre region image content data. This helps the control system to accommodate changes in the terrain. For example, the control system may be configured to replace the global path GMM with the second global path GMM, for example in dependence on a determination that the path region of the terrain better matches the second global path GMM.
At 514a, the updated global path GMM may be used to determine probabilities that the respective sub-regions of the image data (not only the tyre regions) relate to the path region of the terrain. The image content of each of the sub-regions of the image data may be compared to the distribution of the updated global path GMM in order to determine a probability that the respective sub-region relates to the path region. Thus, a single path probability value may be determined for each sub-region. The closer the image content is to the peak of the distribution of the updated path GMM, the higher the probability that the sub-region relates to a path region of the terrain, and vice versa. It may be that the path probability determined for each sub-region is stored in a memory of the stereoscopic camera system 185C in association with the sub-region of the image data to which it relates.
At 516a, the camera system 185C may determine a path probability map in dependence on the determined path probabilities. It may be that the path probability map comprises the image data determined at 506 with the path probabilities for each of the sub-regions overlaid thereon.
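One way to realise the scoring at 514a-516a is to normalise the likelihood of each sub-region's feature under the path model so that the peak of the distribution scores 1.0, matching the text's "closer to the peak, higher the probability" behaviour. The single-Gaussian (diagonal-covariance) form below is an assumption for brevity; the text describes a GMM, and the function names are hypothetical.

```python
import numpy as np

def path_probability(feature, mean, var):
    """Probability-like score in [0, 1]: likelihood of the sub-region's
    colour/texture feature under a single-Gaussian path model, normalised
    so the peak of the distribution scores exactly 1.0."""
    d2 = np.sum((feature - mean) ** 2 / (var + 1e-9))  # squared Mahalanobis distance
    return float(np.exp(-0.5 * d2))

def path_probability_map(features, mean, var):
    """Score every sub-region. features: (H, W, C) grid of per-sub-region
    feature vectors; returns an (H, W) map of path probabilities."""
    h, w, _ = features.shape
    return np.array([[path_probability(features[i, j], mean, var)
                      for j in range(w)] for i in range(h)])
```

The resulting map can then be overlaid on the image data as described at 516a.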
At 508b-516b, the stereoscopic camera system 185C may execute computer program instructions on the image data determined at 506 to determine for each of the sub-regions a non-path probability that the respective sub-region relates to a non-path region of the terrain.
At 508b, two non-path regions laterally offset from the vehicle 100 are selected by the stereoscopic camera system 185C. It may be that the camera system 185C is configured to identify the non-path regions by determining image content data relating to each of a plurality of sub-regions of image data relating to a first portion of the terrain laterally offset from the vehicle 100 on a first (e.g. left) side of the vehicle 100 and to a second portion of the terrain laterally offset from the vehicle 100 on a second (e.g. right) side of the vehicle opposite the first side. For example, the sub-regions may relate to portions of the terrain between 3m and 8m laterally offset from the centre of the wheelbase line of the vehicle 100 on both sides of the vehicle 100 at its current location (as before, the current location of the vehicle may be obtained by visual odometry or inertial odometry or from satellite positioning data (such as Global Positioning System (GPS) data) indicative of the location of the vehicle). As before, the image content data may comprise, for example, colour and/or texture data derived from the sub-region of image data. It may be that the camera system 185C is configured to compare the image content data relating to each of the selected laterally offset sub-regions to the global path GMM to thereby identify one or more of the sub-regions having an image content which meets one or more dissimilarity criteria with respect to the global path GMM. The camera system 185C may be configured to determine a lateral offset between the centre of the wheelbase line of the vehicle 100 at its current location and respective portions of the terrain to which the said one or more dissimilar sub-regions relate. For subsequent iterations of the method of Fig. 5 (for at least a limited time), the camera system 185C may be configured to determine the non-path sub-regions simply by determining sub-regions relating to portions of the terrain laterally offset from the vehicle by the lateral offset. Different lateral offsets may be determined for each side of the vehicle.
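The geometric selection of laterally offset sample points (3m to 8m either side of the wheelbase centre, per the text) can be sketched as below. The function name, the flat-ground 2D assumption and the fixed offset set are all illustrative choices, not taken from the patent.

```python
import numpy as np

def lateral_sample_points(centre, heading_rad, offsets=(3, 4, 5, 6, 7, 8)):
    """World-frame sample points laterally offset from the centre of the
    wheelbase line on both sides of the vehicle (2D, flat-ground sketch).
    centre: (x, y) of the wheelbase centre; heading_rad: vehicle heading."""
    # unit vector perpendicular to the heading, pointing to the vehicle's left
    left = np.array([-np.sin(heading_rad), np.cos(heading_rad)])
    pts = []
    for d in offsets:
        pts.append(centre + d * left)  # left-side sample at offset d
        pts.append(centre - d * left)  # right-side sample at offset d
    return np.array(pts)
```

Each returned point would then be projected into the image to pick out the corresponding non-path sub-regions.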
It may be that the dissimilarity criteria comprise one or more conditions relating to the image content data and the global path GMM. For example, it may be that the dissimilarity criteria comprise one or more colour and/or texture conditions that a colour and/or texture distribution of the image content data relating to a respective sub-region does not match the corresponding distribution of the global path GMM to a given degree.
Thus, sub-regions relating to the non-path region of the terrain may be identified. By identifying the non-path regions with reference to the global path GMM, no user input needs to be requested by the VCU 10 in order to identify the non-path regions.
At 510b, image content data (e.g. the colour and/or texture data) relating to the non-path sub-regions may be determined. At 512b, the image content data relating to the non-path sub-regions may be merged with a global non-path model, such as a global non-path GMM. In some cases, it may be that more than one non-path model is provided (e.g. one for colour and one for texture), in which case 512b may be performed for each non-path model, but it will be assumed in the following description that a single global non-path GMM is provided. The global non-path GMM may be based on historical non-path image content data relating to portions of the terrain laterally offset from the vehicle on either side.
Before the image content data is merged with the global non-path GMM, checks may be performed on the non-path region image content data to determine whether it would contaminate or complement the global non-path GMM. For example, it may be that the camera system 185C compares the non-path region image content data to the global path GMM. If it is determined that the non-path region image content data meets one or more dissimilarity criteria with respect to the global path GMM, it may be that the camera system 185C merges the non-path image content data with the global non-path GMM. If it is determined that the non-path region image content data does not meet the one or more dissimilarity criteria with respect to the global path GMM, it may be that the camera system 185C decides not to merge the non-path region image content data with the global non-path GMM. Thus, an updated global non-path GMM may be obtained.
It will be understood that the first time 508b-512b are performed, step 512b may be omitted. In this case, the camera system 185C may be configured to merge the image content data obtained from the sub-regions laterally offset from the vehicle to form the global non-path GMM.
At 514b, the updated global non-path GMM may be used by the stereoscopic camera system to determine probabilities that the respective sub-regions of the image data (not only the sub-regions laterally offset from the vehicle) relate to the non-path region of the terrain. In this case, the image content of each of the sub-regions of the image data may be compared to the updated global non-path GMM to determine a probability that the respective sub-region relates to the non-path region of the terrain. The closer the image content data to the peak of the distribution of the updated non-path GMM, the higher the probability that the sub-region relates to the non-path region of the terrain, and vice versa. It may be that the probability is stored in a memory of the VCU 10 in association with the sub-region of the image data to which it relates.
At 516b, a non-path probability map is determined in dependence on the non-path probabilities. It may be that the non-path probability map comprises the image data determined at 506 with the non-path probabilities for each of the sub-regions overlaid thereon.
The global non-path GMM may be used to provide a further check on the tyre region image content data to determine whether it is suitable for merger with the global path GMM. The stereoscopic camera system may compare the tyre region image content data relating to the location of each tyre to the global non-path GMM and, in dependence on the tyre region image content data and the global non-path GMM meeting one or more dissimilarity criteria with respect to each other, the camera system 185C may update the path model in dependence on the tyre region image content data. If the tyre region image content data and the global non-path GMM do not meet one or more dissimilarity criteria with respect to each other, it may be that the camera system 185C does not use the tyre region image content data to update the global path GMM. This helps to reduce cross contamination of the global path GMM with image data relating to the non-path region of the terrain.
It may be that the dissimilarity criteria comprise one or more conditions relating to the tyre region image content data and the global non-path GMM. For example, it may be that the dissimilarity criteria comprise one or more colour and/or texture conditions that a colour and/or texture distribution of the tyre region image content data relating to a respective sub-region does not match the distribution of the global non-path GMM to a given degree.
At 518, path and non-path probability data determined during 508a-516a and 508b-516b are combined to provide a final path probability map. The final path probability map may be determined as a weighted combination of the path probability map and a secondary path probability map inferred from the non-path probability data. The secondary path probability map may be an inverse of the non-path probability map determined by inferring that the sub-regions which have low non-path probabilities have high path probabilities. For example, it may be that the camera system 185C is configured to infer the secondary path probability, Psecondary_path_i, for a respective ith sub-region from the non-path probability, Pnon-path_i, based on: Psecondary_path_i = 1 - Pnon-path_i. Alternatively, the path and non-path probability data may be combined in any other suitable way to determine the final probability map. For example, the probabilities determined at 514a, 514b may be combined, for example by subtracting the non-path probability from the path probability to provide a final path probability for each sub-region.
It may be that the probabilities from the path probability map are allocated a more significant weighting than the inferred probabilities from the secondary path probability map to reflect a greater confidence in those values. Alternatively it may be that the same weights are applied to each, or indeed greater weight may be allocated to the inferred probabilities from the secondary path probability map. It may be that respective weights to be applied to the path and non-path probability data are determined in dependence on the respective consistencies of one or more path/non-path boundaries determined from the path and non-path probability data respectively. Ways in which the respective consistencies of path boundaries can be measured are explained in more detail below. By inferring secondary probability data from the non-path probability data and combining the secondary path probability data with the path probability data, a more confident determination can be made as to whether a portion of the terrain is a path or non-path region of the terrain.
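The inversion and weighted combination described above reduce to a couple of array operations. The function name and the example weight of 0.7 (favouring the direct path probabilities, as the text suggests when confidence in them is greater) are illustrative assumptions.

```python
import numpy as np

def final_path_probability_map(p_path, p_non_path, w_path=0.7):
    """Weighted combination of the path probability map with a secondary
    path probability map inferred as the inverse of the non-path map:
    P_secondary_path_i = 1 - P_non-path_i (per the text).
    w_path weights the direct path probabilities; 1 - w_path weights the
    inferred secondary probabilities."""
    p_secondary = 1.0 - p_non_path
    return w_path * p_path + (1.0 - w_path) * p_secondary
```

With equal weights (`w_path=0.5`) this reduces to the simple average of the two maps; the text also notes the weights could instead be tuned from the consistency of the resulting path boundaries.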
When the final path probability map is determined, it may be merged with a global final path probability map determined from previous frames of image data captured by the stereoscopic camera system 185C.
The method of Fig. 5 will now be illustrated with reference to the example of Figs 8-9.
Fig. 8a shows a first 2D image obtained by a first image sensor of the stereoscopic camera system 185C of terrain ahead of the vehicle 100. The terrain comprises a paved path region 700 and a pair of grass non-path regions 702, 704 on either side of the paved path region 700.
Fig. 8b shows the disparity image between the first 2D image shown in Fig. 8a and a second 2D image obtained by a second image sensor of the stereoscopic camera system 185C. Fig. 8c shows the pixels of the image of Fig. 8a overlaid on a 3D grid obtained based on the disparity image of Fig. 8b and mapped relative to a horizontal plane. Also shown in Fig. 8c is a white line 708 showing the location of the longitudinal axis of the vehicle 100 on the path 700, the locations 706a-706d of the tyres of the vehicle 100 and the locations 707a, 707b of the non-path regions of the terrain laterally offset from the vehicle used to determine the non-path model.
Fig. 9a repeats the view of Fig. 8c for reference. Fig. 9b shows the path probability map determined at 516a of the method of Fig. 5 with respect to the terrain of Fig. 9a. The shading of the path probability map varies from black to grey, with black indicating a higher probability that the sub-region relates to the path region of the terrain and grey indicating a lower probability thereof. The majority of the path region 700 is correctly identified as being a path region of the terrain, and the majority of the non-path regions 702, 704 are correctly identified as not being a path region of the terrain. However, at the edges of the path region, and some portions near the centre of the path region, there are sub-regions which have not been correctly identified as being a path region. This may be at least partly because, as can be seen from Fig. 8a for example, some portions of the path region 700, such as the puddle regions 705 covered with water, have image contents (e.g. colour and/or textures) which will not accurately match the global path GMM.
Fig. 9c shows the non-path probability map determined at 516b of the method of Fig. 5. The colour varies from black to grey, with black indicating a higher probability that the sub-region relates to the non-path region of the terrain and grey indicating a lower probability thereof.
The majority of the path region 700 is correctly identified as not being a non-path portion of the terrain, and the majority of the non-path regions are correctly identified as being non-path portions of the terrain. Indeed, there is no distinction made in the non-path probability map between the dry and puddle regions of the path 700. However, at the edges of the non-path region, and some portions further to the left and right of the non-path region, there are sub-regions which have not been identified as being non-path with a high probability. This may be because they do not match the global non-path GMM.
Fig. 9d shows a secondary path probability map inferred from the non-path probability map of Fig. 9c by the stereoscopic camera system 185C determining the inverse of the non-path probability map of Fig. 9c at 518. Again the colour varies from black to grey, with black indicating a higher probability that the sub-region relates to the path region of the terrain and grey indicating a lower probability thereof. It can be seen that the centre of the path region is determined to relate to the path region of the terrain with a greater probability in the secondary path probability map than in the path probability map of Fig. 9b. This is at least in part because there is no distinction made between the dry and puddle regions of the path 700 of the terrain in the secondary path probability map. In addition, some of the sub-regions at the boundaries between the path region and the non-path region are determined to relate to path or non-path regions of the terrain with higher probability than in the path probability map of Fig. 9b.
Figs. 9e and 9f show the path and non-path boundaries respectively determined from the path and non-path probability maps of Figs. 9b and 9c respectively. It can be seen, particularly in the right hand path boundary of Fig. 9e near the location of the vehicle 100, that a consistent boundary is not determined from the path probability data of Fig. 9b alone.
Fig. 9g shows the final path probability map determined by a weighted combination of the path probability map and the secondary path probability map inferred from the non-path probability map. It can be seen that the sub-regions of the image data are determined to be path or non-path with a higher degree of confidence in the final path probability map than from any of the path probability map determined at 516a (Fig. 9b), the non-path probability map determined at 516b (Fig. 9c) or the secondary path probability map inferred from the non-path probability map at 518 (Fig. 9d). Fig. 9h shows the final path probability map merged with the global path probability map. Fig. 9i shows the path boundary determined from the updated global path probability map of Fig. 9h. It can be seen that the path boundary of Fig. 9i is more consistent than the path boundary of Fig. 9e or the non-path boundary of Fig. 9f.
Thus, the method of Fig. 5, in which both path and non-path probabilities are determined, provides more confident determination of path and non-path regions of the terrain than is achievable with either the path or non-path probabilities alone.
It will be understood that occasionally part of one or more tyres of the vehicle 100 may leave the path region of the terrain and enter a non-path region of the terrain. In this case, it may be that a first part of the tyre region image data relates to the path region, while a second part of the tyre region image data relates to the non-path region. This is illustrated in Figs. 7a and 7b, which show sub-regions of the image data comprising path portions 600 and non-path portions 602. Accordingly, in the event that tyre region image content data relating to a tyre of the vehicle is determined not to match the tyre region image content data relating to the other tyres of the vehicle or the global path GMM, it may be that the stereoscopic camera system 185C is configured to split the sub-region of the image data from which the tyre region image content data is derived into two or more portions. In this case, image content data derived from a (or each) selected portion of the sub-region (rather than from the entire sub-region) may be compared to any one or more of: the tyre region image content data relating to the other tyres; the global path GMM; the global non-path GMM. If the image content data derived from the selected portion of the sub-region meets the one or more similarity criteria with respect to one or more of the other tyres and/or the global path GMM, it may be that the camera system 185C selectively merges the image content data derived from that selected portion of the sub-region with the global path GMM. For example, the similarity criteria may require that the selected portion of the sub-region is more strongly correlated with the global path GMM than the global non-path GMM and/or is more strongly correlated with the global path GMM than another portion of the sub-region and/or is sufficiently strongly correlated with the global path GMM. 
Otherwise, it may be that the camera system 185C decides not to merge the image content data derived from that selected portion of the sub-region with the global path GMM. It may be that this helps the global path GMM to become more generalised more quickly, thereby helping to improve the accuracy of the path probability data.
As will be explained below, the VCU 10 may determine cost data in dependence on the path and non-path probabilities, and determine a future path for the vehicle in dependence on the cost data, typically by determining a cost map based on the cost data. The cost map may comprise a grid of cells. The way in which the image data relating to a sub-region is split may depend on the direction of the determined path relative to the cost map grid. For example, if the path is parallel to an axis of the grid, it may be that the image data is split into left and right portions. For example, Fig. 7a shows the path region 600 on the right hand side with the non-path region 602 on the left hand side for a path travelling north/south (parallel to the vertical axis). In another example (see Fig. 7b), the path is diagonal to the grid and the image data may be split diagonally in the cell.
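The two split geometries just described (left/right for an axis-parallel path, diagonal for an oblique path) can be expressed as boolean masks over a cell's pixels. The function name, the half-and-half split point and the two direction labels are illustrative assumptions for the sketch.

```python
import numpy as np

def split_cell_mask(n, direction):
    """Boolean mask selecting the 'path' half of an n x n cell of image data.
    direction: 'vertical' -> left/right split, path on the right (cf. Fig. 7a);
               'diagonal' -> split along the cell diagonal (cf. Fig. 7b)."""
    if direction == 'vertical':
        # columns in the right half of the cell belong to the path portion
        return np.fromfunction(lambda i, j: j >= n // 2, (n, n))
    if direction == 'diagonal':
        # pixels strictly above the main diagonal belong to the path portion
        return np.fromfunction(lambda i, j: j > i, (n, n))
    raise ValueError(f"unknown split direction: {direction}")
```

Image content data would then be derived separately from the masked and unmasked portions before each portion is compared to the path and non-path models.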
The stereoscopic camera system 185C may determine a cost map relating to the terrain based on the path and non-path probabilities, for example in dependence on the final path probability map. In order to determine the cost map, the stereoscopic camera system 185C may determine, for each of the sub-regions of the image data, a cost for the vehicle 100 to traverse the respective portion of the terrain to which the sub-region relates in dependence on the path and non-path probabilities determined from 508-518, for example in dependence on the final path probability relating to that sub-region determined at 518. The cost may relate to a penalty or a reward associated with the respective portion of the terrain. An increased cost may relate to an increased penalty or a reduced reward. Similarly a reduced cost may relate to a reduced penalty or an increased reward. However, it will be assumed in the following description that the cost is allocated on a penalty basis.
In an example, for each sub-region, the greater the final path probability, the lower the cost allocated to that sub-region and the lower the final path probability, the greater the cost allocated to that sub-region. It may be that the costs are allocated to sub-regions on a binary basis, for example a low cost for sub-regions having final path probabilities greater than a threshold and a high cost for sub-regions having final path probabilities lower than a threshold. However, it may be that costs are allocated on a more granular basis. For example, it may be that the cost for the vehicle to traverse a portion of the terrain to which the sub-region relates is determined depending on the final path probability meeting one or more path probability criteria indicating that the sub-region relates to the path region, one or more non-path probability criteria indicating that the sub-region relates to the non-path region or neither the path probability criteria nor the non-path probability criteria.
Sub-regions having low path probabilities determined at 514a and low non-path probabilities determined at 514b may have an intermediate final path probability between a relatively low final path probability and a relatively high final path probability. For example, it may be that puddle regions 705 of the path region 700 of the example of Fig. 8a are provided with intermediate final path probabilities because they do not meet one or more similarity criteria with respect to either the global path or global non-path GMM. In this case, a first, relatively high cost may be allocated to sub-regions having a relatively low final path probability, a second, intermediate cost may be allocated to sub-regions having an intermediate final path probability and a third, relatively low cost may be allocated to sub-regions having relatively high final path probabilities. In this way, an at least three-tiered cost allocation scheme may be implemented in which the final path probabilities are probability parameters relating to whether the respective sub-regions relate to the path region or the non-path region. If the final path probability associated with a sub-region meets path probability criteria, in this case that the final path probability is greater than a respective path threshold, a low cost may be allocated to that sub-region. If the final path probability associated with a sub-region meets non-path probability criteria, in this case that the final path probability is less than a respective non-path threshold (which may be different from the path threshold), a high cost may be allocated to that sub-region. If the final path probability associated with a sub-region meets neither the path nor the non-path probability criteria (e.g. the final path probability is between the path and non-path thresholds), it may be that an intermediate cost is allocated to that sub-region.
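The three-tiered allocation just described is a pair of threshold tests on the final path probability. The threshold values and cost magnitudes below are illustrative assumptions; the patent leaves them unspecified.

```python
def allocate_cost(p_final, path_thresh=0.7, non_path_thresh=0.3,
                  low=1.0, intermediate=5.0, high=50.0):
    """Three-tiered cost allocation (all numeric values illustrative):
    - final path probability above path_thresh meets the path probability
      criteria -> low cost;
    - below non_path_thresh meets the non-path probability criteria -> high cost;
    - otherwise (e.g. puddle or shadow cells that match neither model well)
      -> intermediate cost."""
    if p_final > path_thresh:
        return low
    if p_final < non_path_thresh:
        return high
    return intermediate
```

Applying this per sub-region yields a cost map like the L/I/H grid of Fig. 10, in which puddle cells receive the intermediate tier rather than being lumped in with non-path cells.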
This is illustrated in Fig. 10 which schematically shows an example terrain based on the terrain of Fig. 8a with a cost map 800 determined using a three-tiered cost allocation scheme overlaid thereon, the cost map 800 comprising a plurality of cells 802 each of which relates to a sub-region of the image data. The letter L indicates that a relatively low cost has been allocated to the sub-region, the letter H indicates that a relatively high cost has been allocated to the sub-region, while the letter I indicates that an intermediate cost has been allocated to the sub-region. For ease of illustration, the cells 802 of the cost map 800 of Fig. 10 are larger than would normally be used in practice.
In other examples, it may be that costs are allocated to each sub-region from a (e.g. continuous) scale having more than three possible costs in dependence on the final path probability determined for that sub-region. In this case, because there are more than three possible costs which can be allocated to a sub-region, it may still be considered that the cost for the vehicle to traverse a portion of the terrain to which the sub-region relates is determined depending on the final path probability for that sub-region meeting one or more path probability criteria indicating that the sub-region relates to the path region, one or more non-path probability criteria indicating that the sub-region relates to the non-path region or neither the path probability criteria nor the non-path probability criteria. For example, in this case, there may be at least three sub-regions, at least a first of which has a relatively high final path probability such that it is allocated a relatively low cost, at least a second of which has a relatively low final path probability such that it is allocated a relatively high cost, and at least a third of which has an intermediate path probability such that it is allocated an intermediate cost intermediate the relatively high and relatively low costs. In this case, it may be said that the final probability of the first sub-region implicitly meets one or more path probability criteria, the final probability of the second sub-region implicitly meets one or more non-path probability criteria and the final probability of the third sub-region implicitly meets neither the path nor non-path probability criteria. It will be understood that the scale of costs may be infinitely variable. It may be that the relationship between the costs allocated and the final path probabilities of the respective sub-regions is not linear. 
For example, exponentially greater costs may be allocated to respective cells relating to sub-regions for which there is a high probability that it relates to a non-path region.
Cost may (for example) alternatively be allocated directly in dependence on the path and non-path probabilities. Any suitable alternative cost allocation strategy may be employed.
The cost map may be transmitted from the stereoscopic camera system 185C to the VCU 10 which may determine a future path for the vehicle in dependence on the cost map. The cost data of the cost map may be provided by the stereoscopic camera system 185C to the VCU 10 on a cell by cell basis. Optionally, the VCU 10 merges the cost map with an existing global cost map which may be based (at least in part) on cost maps obtained previously from the stereoscopic camera system 185C. In order to determine a future path for the vehicle 100, costs for the vehicle to traverse the terrain by each of a plurality of candidate trajectories may be calculated from the cost map 800 (or from a global cost map into which cost map 800 is merged) and a preferred path may be selected from the candidate trajectories in dependence on the calculated costs. This is illustrated in Fig. 11, which shows the same terrain and cost map 800 as Fig. 10 with a plurality of candidate trajectories 810-830 overlaid thereon. If it is assumed that the VCU 10 will want to avoid high cost regions of the terrain, then the VCU 10 may select trajectory 810 as its preferred path as it will have the lowest cost. Following trajectory 810 would cause the vehicle 100 to traverse regions of the terrain which have been allocated intermediate (I) costs by the cost map 800. This is made possible by using an at least three-tiered cost allocation scheme to determine the cost map 800. It will be understood that, if a binary cost allocation scheme was employed, it may be that the puddle regions 705 of the terrain would have been allocated a high cost, in which case it may be that the VCU 10 would attempt to guide the vehicle around the puddle regions rather than through them. Thus, a more optimal (direct) vehicle path may be achieved by the cost map 800 having at least three different costs allocated to different sub-regions of the image data.
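Selecting a preferred path from candidate trajectories then amounts to accumulating cell costs along each candidate and taking the minimum, as sketched below. Representing a trajectory as a list of the (row, col) cost-map cells it passes through is an assumption made for the example; the function names are hypothetical.

```python
import numpy as np

def trajectory_cost(cost_map, cells):
    """Total cost for the vehicle to traverse a candidate trajectory,
    given as the list of (row, col) cost-map cells it passes through."""
    return float(sum(cost_map[r, c] for r, c in cells))

def select_preferred_path(cost_map, candidates):
    """Pick the candidate trajectory with the lowest total traversal cost
    (cf. trajectory 810 being preferred in Fig. 11)."""
    return min(candidates, key=lambda cells: trajectory_cost(cost_map, cells))
```

Under a three-tiered map, a straight candidate crossing intermediate-cost puddle cells can still beat a detour through only low-cost cells if the detour is long enough, which is the behaviour the text describes.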
Although the intermediate cost cells have been illustrated as portions of the path 700 covered by puddles of water, it will be understood that intermediate cost cells may additionally or alternatively be portions of the path 700 covered by shadows, e.g. of trees or bushes on either side of the path 700 or any other cause of variable lighting on the path 700. This is illustrated by the cost map 850 shown in Fig. 12 relating to a similar terrain as the cost map of Fig. 10 but with shadow portions 852 rather than puddle portions 705. Again, by allocating an intermediate cost to the shadow portions 852, it may be that the vehicle path determined by the VCU 10 passes through one or more of the shadow portions 852, which may allow a more optimal path to be determined.
It will be understood that, instead of calculating the cost for each of a plurality of candidate trajectories and selecting a preferred path from the candidate trajectories in dependence on the costs, it may be that the future path is determined by analysing the cost map in order to determine the lowest cost route. While this may allow a more optimal route to be determined, it is more processing intensive.
In the case that the cost map 800 is merged with an existing global cost map based on previously obtained cost data, it may not be necessary to determine costs based on image data common to previous frames of image data captured at different times and/or locations of the vehicle. Rather, it may be that cost data is determined only for image data relating to portions of the terrain for which cost data was not determined based on previous frames of image data. This helps to reduce the quantity of processing required.
It will be understood that it is not essential to obtain both path and non-path probabilities in order to be able to allocate an at least three-tiered cost allocation scheme to a cost map. For example, a three-tiered cost allocation scheme may be based on path probabilities determined with reference to a texture-based global path GMM (e.g. which uses texture or colour and texture as the modelled parameter). In this case, it may be that regions of the path 700 under variable lighting conditions (e.g. having reflective puddles, shadow regions etc) may be identified as having a texture which is more similar to the other portions of the path 700 than to the non-path region on either side of the path (which may be grass and have a more distinctive texture). Additionally or alternatively, it may be determined that a shadow region of the path 700 has a more similar colour content to other portions of the path 700 than the (e.g. grassy) non-path regions 702, 704 of the terrain. In either case, intermediate costs can be allocated to portions of the path 700 under variable lighting conditions, and low and high costs to portions of the terrain confidently identified as path and non-path regions respectively.
It may be that the cost map 800 or 850 (and/or the global cost map with which the cost map 800 or 850 is merged) is a wheel cost map indicative of costs for the wheels of the vehicle 100 to traverse the terrain independently of the body of the vehicle 100. Although the candidate trajectories in Fig. 11 are each represented by single lines 810-830, it will be understood that different wheels of the vehicle 100 may follow different paths from each other for a given vehicle trajectory 810-830. Accordingly, it may be that the cost associated with each of the candidate trajectories 810-830 takes into account the different paths which would be followed by each of the wheels if the vehicle 100 were to follow that trajectory 810-830. For example, the cost of the vehicle 100 traversing the terrain by way of each candidate trajectory 810-830 may be determined by determining the costs for each of the wheels of the vehicle 100 to follow their respective paths along that trajectory. Alternatively, it may be that the cost of the vehicle 100 traversing the terrain by way of each candidate trajectory 810-830 may be determined by selectively determining the costs for diagonally opposite wheels of the vehicle 100, such as for the front right and rear left or front left and rear right wheels of the vehicle, to follow their respective paths along that trajectory. In this case, it may be that the cost associated with each candidate trajectory is the average or the sum of the costs for the respective wheels to follow their respective paths along that trajectory.
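The per-wheel costing could look like the following sketch, where each wheel's path along a candidate trajectory is supplied as its own list of cells; the diagonal-wheel approximation is obtained simply by passing only two wheel paths. The names and the combining rule are assumptions.

```python
def wheel_trajectory_cost(cost_map, wheel_paths, combine="sum"):
    """Cost for a candidate trajectory from the paths its wheels would follow.
    `wheel_paths` maps a wheel label (e.g. 'front_right') to the list of
    cost-map cells that wheel would cross; pass just the two diagonally
    opposite wheels to use the cheaper approximation described above."""
    per_wheel = [sum(cost_map[cell] for cell in path)
                 for path in wheel_paths.values()]
    if combine == "mean":
        return sum(per_wheel) / len(per_wheel)
    return sum(per_wheel)
```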
Although the costs associated with the cells 802 of the wheel cost map 800 are described above as relating to whether they relate to path or non-path regions of the terrain, the costs associated with each of the cells 802 of the wheel cost map may additionally or alternatively relate to any one or more of: a gradient of the terrain in a projected direction of travel of the vehicle; a side slope of the terrain transverse to a projected direction of travel of the vehicle. The gradient of the terrain in the projected direction of travel and/or the side slope of the terrain may be determined from topography information relating to the terrain determined from the 3D grid map onto which the image data is overlaid. Alternatively, the gradient of the terrain in the projected direction of travel and/or the side slope of the terrain may be determined from a ranging system of the vehicle such as a radar-based terrain ranging system, a laser-based terrain ranging (e.g. LIDAR) system or an acoustic ranging system of the vehicle 100 or from a gradient sensor of the vehicle (if provided). The projected direction of travel may be determined from the candidate trajectory of the vehicle 100.
It may be that the costs associated with the cells 802 of the wheel cost map 800 are generalised costs substantially independent of the direction of travel of the vehicle 100 across the respective cells 802. Alternatively, a plurality of wheel cost maps 800 may be determined, each being associated with a respective candidate trajectory across the terrain. In the latter case, the costs associated with the cells 802 of the wheel cost map 800 may be dependent on the direction of travel of the candidate trajectory with which it is associated across the terrain. This provides the cost data with increased accuracy, but involves increased processing as compared to the former case.
The preferred path selected from the wheel cost map(s) 800 may be based on non-obstacle cost data, and/or it may be that the preferred path selected from the wheel cost map(s) does not take into account some obstacles of the terrain. Accordingly, it may be that the VCU 10 is configured to obtain further cost data relating to the cost of traversing the terrain to check, for example, whether the selected preferred path contains any obstacles which would render it unsuitable for the vehicle 100. For example, it may be that the stereoscopic camera system 185C is configured to determine a second, body cost map indicative of respective cost(s) for a swept volume of the body of the vehicle 100 to traverse one or more portions of the terrain independently of the wheels of the vehicle 100. The stereoscopic camera system 185C may then be configured to transmit the second, body cost map to the VCU 10 which may take it into account to determine the future path for the vehicle 100. The cells of the body cost map may correspond to (e.g. be aligned with, in relation to the terrain) the respective cells of the wheel cost map(s) 800.
The second, body cost map may be based on the 3D grid data generated by the stereoscopic camera system 185C in respect of the terrain. The body cost map may include cost data relating to one or more obstacles of the terrain, such as one or more three-dimensional, 3D, obstacles of the terrain. The body cost map may also include cost data relating to one or more objects (e.g. branches, bushes) overhanging a ground level (e.g. a path region on a ground level) of the terrain. It may be that the body of the vehicle 100 has a predetermined minimum elevation with respect to the ground level of the terrain. It may be that the body cost map is selectively based on 3D grid data relating to objects or obstacles having elevations which exceed the predetermined minimum elevation. Similarly, it may be that the body of the vehicle 100 has a predetermined maximum elevation with respect to the ground level of the terrain. It may be that the body cost map is selectively based on 3D grid data relating to objects or obstacles having elevations which are below the predetermined maximum elevation. This advantageously allows the body cost map to account for whether or not the body of the vehicle 100 would engage or clear the obstacle or overhanging object if the vehicle 100 were to traverse the terrain across a particular candidate trajectory.
It may be that the stereoscopic camera system 185C determines from the 3D grid data whether the portion of the terrain to which each respective sub-region relates comprises any features having an elevation greater than the minimum elevation and less than the maximum elevation, which may be an indication that the body of the vehicle 100 would be impeded if it was to try to traverse that portion of the terrain. If so, it may be that the stereoscopic camera system 185C determines that there are obstacles present in the portion of the terrain to which that sub-region relates. If not, it may be that the stereoscopic camera system 185C determines that there are no obstacles present in the portion of the terrain to which that subregion relates. If there are one or more obstacles, it may be that the stereoscopic camera system 185C allocates a relatively high cost to that sub-region. If there are no obstacles, it may be that the stereoscopic camera system 185C allocates a relatively low cost to that sub-region. The cost data in the body cost map may be binary (e.g. relating to whether the cell to which it relates is passable or impassible by the vehicle) such that the body cost map is in effect an occupancy grid. Alternatively, it may be that the cost data in the body cost map is more granular. For example, costs may be allocated to cells of the body cost map using an at least three-tiered cost allocation scheme.
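The binary (occupancy-grid) form of this check can be sketched as follows, with each sub-region supplying the elevations of the features it contains; the elevation band values are an invented example of the body's minimum and maximum elevations.

```python
def body_occupancy(cell_elevations, min_elevation=0.3, max_elevation=1.8):
    """Mark a cell as containing an obstacle if any feature in it lies in the
    band the vehicle body sweeps through: above the body's minimum elevation
    (so it cannot be driven over clear) and below its maximum elevation (so
    it cannot be passed under). Band limits are illustrative."""
    return {cell: any(min_elevation < e < max_elevation for e in elevations)
            for cell, elevations in cell_elevations.items()}
```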
While the cost for the vehicle 100 to traverse the terrain based on the wheel cost map may involve the determination of the costs for different wheels of the vehicle 100 to traverse different paths along the various candidate trajectories 810-830 (and optionally summing or averaging those costs), it may be that determining the cost for the vehicle 100 to traverse the terrain based on the body cost map involves determining which cells of the body cost map would be occupied by a volume swept by the body of the vehicle 100 if it were to traverse the terrain by a particular trajectory. In this case, it may be that the stereoscopic camera system 185C determines the swept volume of the vehicle 100 with respect to each candidate trajectory, and determines the cost for the body of the vehicle 100 to traverse the terrain by that trajectory by for example summing the costs associated with the cells which would be occupied by the swept volume of the vehicle 100 following that trajectory. As illustrated in Fig. 13, the volume swept by the body of the vehicle 100 along a trajectory depends on the curvature of the path followed by the vehicle 100, increasing for tighter turns and decreasing for gentler turns. Thus, it may be that the stereoscopic camera system 185C is configured to determine the volume swept by the body of the vehicle 100 along a trajectory in dependence on the curvature of that trajectory, and to determine the cost associated with that trajectory in dependence on the swept volume and the body cost map.
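The curvature dependence of the swept volume can be illustrated with a simple turning-geometry approximation (an assumption, not the patent's method): for a turn of radius 1/curvature, the strip swept by a rectangular body is bounded by its outer front corner and its inner side, so the strip widens as the turn tightens.

```python
import math

def swept_width(body_width, wheelbase, curvature):
    """Approximate width of the ground strip swept by a rectangular vehicle
    body following an arc of the given curvature (1/radius). A straight path
    sweeps exactly the body width; tighter turns sweep wider strips."""
    if curvature == 0:
        return body_width
    radius = 1.0 / abs(curvature)
    outer = math.hypot(radius + body_width / 2.0, wheelbase)  # outer front corner
    inner = radius - body_width / 2.0                          # inner body side
    return outer - inner

def body_trajectory_cost(body_cost_map, swept_cells):
    """Sum body-map costs over the cells covered by the swept volume."""
    return sum(body_cost_map[cell] for cell in swept_cells)
```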
It may be that the body cost map is transmitted by the stereoscopic camera system 185C together with the wheel cost map to the VCU 10, and it may be that the VCU 10 merges the body cost map with a global body cost map based on previous frames of image data. It may be that the VCU 10 selects a preferred path based on the wheel cost map 800 as described above, before determining the cost for a swept volume of the body of the vehicle 100 to traverse the selected preferred path based on the body cost map. If the cost derived from the body cost map is too high (e.g. above a threshold indicative that the path contains one or more impassible obstacles), it may be that the VCU 10 selects an alternative preferred path (e.g. from the candidate trajectories shown in Fig. 11 based on the wheel cost map 800) and determines whether that path comprises obstacles based on the body cost map. This process may be repeated until a path is found which does not contain obstacles which are impassible by the body of the vehicle 100.
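That selection loop might be sketched as below, with precomputed wheel and body costs per candidate trajectory; the threshold and data shapes are illustrative.

```python
def choose_path(wheel_costs, body_costs, obstacle_threshold):
    """Try candidate trajectories in order of increasing wheel cost and
    return the first whose body cost stays below the obstacle threshold,
    i.e. the cheapest path not blocked for the vehicle body. Returns None
    if every candidate is blocked."""
    for candidate in sorted(wheel_costs, key=wheel_costs.get):
        if body_costs[candidate] < obstacle_threshold:
            return candidate
    return None
```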
By providing separate wheel and body cost maps, the different effects of the terrain on the wheels and body of the vehicle 100 can be accounted for, enabling more accurate costs to be determined for the vehicle 100 to traverse respective candidate trajectories across the terrain, allowing a more optimal vehicle path to be determined (e.g. as compared to providing a single cost map which does not separate wheel and body cost data). For example, it may be that one or more portions of the terrain, such as a strip of grass extending between a pair of substantially parallel ruts or tracks, would incur a relatively high cost for the wheels of the vehicle to traverse but a relatively low cost for the body to traverse (e.g. because a minimum elevation of the body is greater than a maximum elevation of the said portion of the terrain such that the vehicle body would clear the said portion of the terrain). By providing separate wheel and body cost maps, a (potentially optimal) vehicle path which places the wheels of the vehicle in the ruts/tracks and the body of the vehicle over the grass strip may be determined to have a relatively low overall cost. Conversely, a cost map which does not separate body and wheel effects may determine that such a path would be of a relatively high overall cost. Thus, providing separate wheel and body cost maps is particularly advantageous.
In some terrains, there are features which are of low cost for a vehicle to traverse in one direction, but which are of high cost for the vehicle to traverse in other directions. For example, mud ruts typically comprise tracks for wheels of the vehicle which are of low cost for the vehicle to follow, but which are of high cost for the vehicle to cross. This is illustrated in Fig. 14 which shows the wheels of vehicle 100 following mud ruts 860, 862. Stars 864, 866, 868 represent locations of the mud ruts 860, 862 which would be crossed by the vehicle 100 if it were to follow the path defined by lines 870, 872. This directional dependency on cost cannot be accommodated by traditional cost maps or occupancy grids, which typically allocate a cost to a particular portion of the terrain which is applied independently of the direction of travel of the vehicle.
Accordingly, it may be that the stereoscopic camera system 185C (or any other processing system of the vehicle 100 in data communication with the stereoscopic camera system 185C such as the VCU 10) is configured to determine a third, line features cost data structure in dependence on which the future path for the vehicle 100 may be determined. The line features cost data structure typically comprises a plurality of line features, which may each be represented by a plurality of location points defining the line feature or a best fit line (for example), the line features indicating lines of the terrain which should not be crossed by the vehicle. The line features cost data structure may also comprise direction data indicative of a crossing direction of the line features, although this may be implicit in the shape of the line feature in which case it may not be necessary to store direction data in the line features cost data structure. In one example, the stereoscopic camera system 185C may be configured to determine line features based on the path boundaries derived from the path and non-path probability data described above (e.g. from the final path probability data).
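A do-not-cross line feature represented as a polyline of location points can be tested against a straight trajectory segment with a standard segment-intersection check, as in this sketch (the representation is an assumption consistent with the description above).

```python
def _ccw(a, b, c):
    """True if points a, b, c are in counter-clockwise order."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def crosses_line_feature(segment, line_feature):
    """True if the straight trajectory segment (a pair of (x, y) points)
    crosses any edge of the polyline line feature, i.e. the trajectory
    would cross a boundary the vehicle should not cross (degenerate
    collinear touches are ignored)."""
    p1, p2 = segment
    for q1, q2 in zip(line_feature, line_feature[1:]):
        if (_ccw(p1, q1, q2) != _ccw(p2, q1, q2)
                and _ccw(p1, p2, q1) != _ccw(p1, p2, q2)):
            return True
    return False
```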
It may be that the camera system 185C (or other processing system of the vehicle such as VCU 10) is configured to determine boundaries between path and non-path regions of the terrain based on the path probability data and the non-path probability data. For example, it may be that first and second (e.g. left and right) boundary lines between the path and non-path regions of the terrain are identified independently from each of the path probability data and the non-path probability data. It may be that the boundary lines determined from the non-path probability data are determined from the secondary path probability data. For each boundary pair, the path width (i.e. the shortest distance between the boundaries of the said pair) may be determined for each of a plurality of locations along the path. A consistency measure may be determined for each said boundary pair in dependence on any one or more of: the average (e.g. mean) of the said path widths of the boundary pair; the standard deviation of the path widths between the boundaries of the boundary pair; the lengths of the boundaries. Respective first and second weights may then be determined for the path and non-path boundary pairs respectively in dependence on the consistency measures of the boundaries determined from the path and non-path probability data respectively. The boundaries obtained from the path and non-path probability data may be combined in dependence on the first and second weights. For example, the camera system 185C (or other processing system of the vehicle such as VCU 10) may put more emphasis on one of the path and non-path boundaries if it has been allocated a more significant weight than the other by virtue of being more consistent than the other. The boundaries determined from the combination of the boundaries determined from the path and non-path probability data may provide line features.
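One possible consistency measure and weighting, assuming each boundary pair supplies path widths sampled along its length; the exact formula (mean width damped by the width standard deviation) is an invented example of the options listed above.

```python
import statistics

def consistency(path_widths):
    """Score a boundary pair by how consistent its sampled path widths are:
    a steady width scores higher than a wildly varying one."""
    return statistics.mean(path_widths) / (1.0 + statistics.pstdev(path_widths))

def boundary_weights(path_pair_widths, non_path_pair_widths):
    """Relative weights for the boundaries derived from the path and
    non-path probability data, normalised so they sum to one."""
    c_path = consistency(path_pair_widths)
    c_non_path = consistency(non_path_pair_widths)
    total = c_path + c_non_path
    return c_path / total, c_non_path / total
```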
The crossing direction for each line feature may be determined based on the direction in which the terrain changes from a path region to a non-path region in the path and non-path probability data (or in the final path probability data). The line features act as boundaries across which the vehicle path should not cross. It may be that the line features cost data structure is merged with a global line features cost data structure based on line features cost data structures derived from previous frames of image data.
As before, the stereoscopic camera system 185C may be configured to transmit the line features cost data structure to the VCU 10 (e.g. together with the wheel cost map and/or the body cost map) which may determine the future path of the vehicle 100 in dependence thereon. This is illustrated in Fig. 15, where the VCU 10 receives cost data from the wheel cost map 800, body cost map 880 and line features cost data structure 882. It will be understood that, in some embodiments, cost data from any two or more of the cost data structures 800, 880, 882 may be used by the VCU 10 to determine the vehicle path.
The cost data from the wheel cost map 800, the body cost map 880 and the line features cost data structure 882 may be transmitted from the stereoscopic camera system 185C to the VCU 10 grouped in dependence on the respective portions of the terrain to which it relates. For example, it may be that the stereoscopic camera system 185C is configured to transmit cost data relating to corresponding cells of the wheel and body cost maps together with any line features cost data relating to the same portion of the terrain as part of the same transmitted data structure.
In order to determine the transmitted data structure, it may be that the stereoscopic camera system 185C determines whether any of the cells of the body cost map 880 contain obstacles. If any of the cells of the body cost map 880 contain obstacles, the stereoscopic camera system 185C may divide the cost data from the wheel cost map 800, the body cost map 880 and the line features cost data structure 882 relating to a particular portion of the terrain into obstacle cost data 884 and non-obstacle cost data 886. As shown in Figs. 16a and 16b, it may be that the obstacle cost data is provided at a predetermined portion of the transmitted data structure, such as the least significant (e.g. four) bits (Fig. 16a) or the most significant (e.g. four) bits (Fig. 16b). The predetermined portion of the transmitted data structure is typically known to the VCU 10. In this case, the VCU 10 may be configured to selectively process the obstacle cost data with a higher priority than non-obstacle cost data when determining the vehicle path. For example, the VCU 10 may discard the non-obstacle cost data 886. This may be possible because, due to the presence of an obstacle, it may be that the vehicle cannot traverse the portion of the terrain to which the data structure relates. This means that the non-obstacle cost data relating to that portion of the terrain is of less (or indeed of no) importance. By selectively processing the obstacle cost data, processing and battery power of the VCU 10 is saved.
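The split of a cell's data into obstacle and non-obstacle fields at agreed bit positions might look like this sketch, using one byte per cell with 4-bit fields (field sizes and layout are illustrative, loosely following the arrangement of Figs. 16a and 16b).

```python
def pack_cell(obstacle_cost, non_obstacle_cost, obstacle_in_low_bits=True):
    """Pack a cell's obstacle and non-obstacle costs (each 0-15) into one
    byte, with the obstacle field at the least or most significant four
    bits as agreed between sender and receiver."""
    assert 0 <= obstacle_cost < 16 and 0 <= non_obstacle_cost < 16
    if obstacle_in_low_bits:
        return (non_obstacle_cost << 4) | obstacle_cost
    return (obstacle_cost << 4) | non_obstacle_cost

def unpack_obstacle_cost(packed, obstacle_in_low_bits=True):
    """Receiver extracts only the obstacle field; the rest may be discarded
    when an obstacle makes the cell impassable anyway."""
    return packed & 0x0F if obstacle_in_low_bits else (packed >> 4) & 0x0F
```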
In an alternative example, it may be that the stereoscopic camera system 185C is configured to provide the obstacle cost data 884 to the VCU 10 but not the non-obstacle cost data 886. This may be possible for the same reason discussed above. By not providing the non-obstacle cost data 886 to the VCU 10, bandwidth of the communication medium (e.g. vehicle communications bus or wireless network) by which data is transmitted between the stereoscopic camera system 185C and the VCU 10 is saved, together with processing power of the stereoscopic camera system 185C and the VCU 10.
Thus, it may be that the VCU 10 is configured to determine the vehicle path in dependence on the obstacle cost data 884 relating to one or more portions of the terrain, but not in dependence on the non-obstacle cost data 886 relating to the same portion of the terrain.
In other examples, it may be that the 3D cost data is obtained from another electronic control unit of the vehicle 100, for example from a ranging system of the vehicle such as a radar-based terrain ranging system, a laser-based terrain ranging (e.g. LIDAR) system or an acoustic ranging system of the vehicle. In this case, the camera system of the vehicle need not be stereoscopic and a single 2D camera may be employed. The preferred path may then be determined based on cost data derived from a mapping of image data obtained by the 2D camera relative to a horizontal plane representing the surface of the terrain, in a similar way to that described above. The body cost map in this case may be determined and provided to the VCU 10 by the separate ranging system controller of the ranging system.
The VCU 10 may be configured to perform a feasibility assessment on the selected path to determine whether it is a feasible path for the vehicle to follow. For example, it may be that the VCU 10 is configured to determine whether it is a feasible path for the vehicle to follow in dependence on any one or more of: the width of the path region (e.g. the distance between a pair of typically substantially parallel path boundaries, typically substantially perpendicular to the longitudinal axis of the vehicle); whether the path has parallel boundaries; the continuity of the path boundaries; whether the path emanates from the vehicle 100. If the VCU 10 determines that the path is infeasible, it may be that an alternative preferred path is selected (e.g. from the candidate trajectories shown in Fig. 11) and it is determined whether that path is feasible. This process may be repeated until a feasible path is found.
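Two of the listed feasibility checks (sufficient width everywhere, and roughly parallel boundaries via low width variation) can be sketched as follows; the criteria and thresholds are assumptions, not the patent's.

```python
def is_feasible(path_widths, vehicle_width, max_width_variation=0.2):
    """Feasibility test on a candidate path from the widths sampled between
    its boundary pair: reject if the path is anywhere narrower than the
    vehicle, or if the boundaries are far from parallel (coefficient of
    variation of the widths above the limit)."""
    if min(path_widths) < vehicle_width:
        return False
    mean = sum(path_widths) / len(path_widths)
    spread = (sum((w - mean) ** 2 for w in path_widths) / len(path_widths)) ** 0.5
    return spread / mean <= max_width_variation
```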
It may be that the VCU 10 is configured to provide an output representative of the determined path. It may be that the output is a visual output. It may be that the output is an audio visual output. It may be that the output is provided by way of a display and/or speaker system of the vehicle, such as a display and/or speaker system of an infotainment system of the vehicle.
The VCU 10 may then control the vehicle 100 in accordance with the determined path. The VCU 10 may determine a required steering angle for one or more wheels of the vehicle 100 in dependence on the curvature of the determined path, and in dependence thereon transmit a steering angle command signal to the steering controller 170C. The steering controller 170C, in turn, may set the angle of the steerable wheels of the vehicle accordingly. The VCU 10 may also determine a recommended speed of the vehicle in dependence on the curvature of the determined path from the relevant look-up table. The VCU 10 may be configured to output the recommended speed to the LSP control system 12 which controls the speed of the vehicle 100 accordingly by changing the user set speed in accordance with the received recommended speed.
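The curvature-to-steering-angle step is commonly modelled with the kinematic bicycle relation delta = atan(L * kappa), and the speed recommendation as a curvature-banded look-up; the wheelbase and table values below are invented for illustration, as the text only says both depend on the curvature of the determined path.

```python
import math

def steering_angle(curvature, wheelbase=2.9):
    """Front-wheel steering angle (radians) needed to follow a path of the
    given curvature (1/metres), using the kinematic bicycle model."""
    return math.atan(wheelbase * curvature)

def recommended_speed(curvature,
                      table=((0.05, 10.0), (0.10, 6.0), (0.20, 3.0))):
    """Look up a recommended speed (m/s) from (max curvature, speed) bands;
    anything tighter than the last band gets the slowest speed."""
    for max_curvature, speed in table:
        if abs(curvature) <= max_curvature:
            return speed
    return table[-1][1]
```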
It will be understood that, in alternative examples, the VCU 10 may obtain the image data captured by the stereoscopic camera system 185C, and the VCU 10 (rather than the stereoscopic camera system 185C) may be configured to derive cost data therefrom in the way described above.
It will be understood that the VCU 10 may be configured to add and remove cost data from the global cost map depending on its location, and typically in dependence on a direction of movement of the vehicle 100. This helps to limit the quantity of cost data it needs to store in the global cost map at any given time.
It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, device or integrated circuits or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a program comprising code for implementing a system or method as claimed in any preceding claim and a machine readable storage storing such a program. Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed. The claims should not be construed to cover merely the foregoing embodiments, but also any embodiments which fall within the scope of the claims.

Claims (25)

  1. A control system for a vehicle, the control system comprising at least one controller and being configured to: obtain image data relating to terrain to be traversed by the vehicle, and for each of a plurality of sub-regions of the image data: determine path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determine non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determine a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data; and determine a vehicle path in dependence on the determined costs.
  2. A control system according to claim 1 wherein the at least one controller collectively comprises: at least one electronic processor having an input for receiving the image data; and at least one electronic memory device electrically coupled to the at least one electronic processor and having instructions stored therein, wherein the at least one electronic processor is configured to access the at least one memory device and execute the instructions thereon to determine the path probability data, the non-path probability data, the costs and the vehicle path.
  3. A control system according to claim 1 or claim 2 wherein the control system is configured to control the vehicle in dependence on the determined path.
  4. A control system according to any one preceding claim wherein the control system is configured to, for each of the said sub-regions: infer secondary path probability data from the non-path probability data, the secondary path probability data being indicative of a probability that the respective sub-region relates to a path region of the terrain; and determine a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates based on a combination of the path probability data and the secondary path probability data.
  5. A control system according to claim 4 wherein the control system is configured to combine the path probability data with the secondary path probability data by applying different weights to the path probability data and the secondary path probability data and combining the weighted path probability data and the weighted secondary path probability data.
  6. A control system according to any preceding claim wherein the control system is configured to, for each of the said sub-regions, determine path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain by: determining image content data from the respective sub-region; and comparing the image content data to a path model relating to the path region of the terrain.
  7. A control system according to claim 6 wherein the path model is dependent on historical image data relating to the terrain.
  8. A control system according to claim 7 wherein the path model is based on tyre region image data relating to locations on the terrain of one or more tyres of the vehicle.
  9. A control system according to any of claims 6 to 8 wherein the control system is configured to determine the path model in dependence on one or more of the said sub-regions of the said image data relating to location(s) on the terrain of one or more tyres of the vehicle.
  10. A control system according to claim 9 wherein the control system is configured to determine the one or more sub-regions of the said image data relating to location(s) on the terrain of one or more tyres of the vehicle in dependence on location data indicative of a location of the vehicle at a time after the image data was captured.
  11. A control system according to any one of claims 6 to 10 wherein the control system is configured to: determine first tyre region image content data from one or more sub-regions of the image data relating to a location of a first tyre of the vehicle; determine second tyre region image content data from one or more sub-regions of the image data relating to a location of a second tyre of the vehicle; compare the first tyre region image content data to the second tyre region image content data; and, in dependence on the first and second tyre region image content data meeting one or more similarity criteria with respect to each other, determine the path model in dependence on the first and second tyre region image content data.
  12. A control system according to any one of claims 6 to 11 wherein the control system is configured to: determine tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; compare the tyre region image content data to the path model; and, in dependence on the tyre region image content data and the path model meeting one or more similarity criteria with respect to each other, update the path model in dependence on the tyre region image content data.
  13. 13. A control system according to any preceding claim wherein the control system is configured to, for each of the said sub-regions, determine the non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain by: determining image content data relating to the respective sub-region from the said image data; and comparing the image content data relating to the respective sub-region to a non-path model relating to the non-path region of the terrain.
  14. 14. A control system according to claim 13 wherein the non-path model is dependent on historical image data relating to the terrain.
  15. 15. A control system according to claim 12 wherein the non-path model is dependent on image data relating to one or more non-path regions of the terrain laterally offset from the vehicle.
  16. 16. A control system according to claim 15 wherein the non-path model is based on image data relating to one or more non-path regions of the terrain laterally offset from the vehicle by a set distance, the control system being configured to determine the set distance by: determining image content data from a plurality of sub-regions of image data relating to respective portions of the terrain laterally offset from the vehicle; comparing the image content data to the path model to thereby determine one or more sub-regions having image contents which meet one or more dissimilarity criteria with respect to the path model; and determining the set distance in dependence on lateral distance(s) between the vehicle and respective portions of the terrain to which the said one or more dissimilar sub-regions relate.
  17. 17. A control system according to claim 16 wherein the control system is configured to determine the non-path model based on one or more sub-regions of the image data relating to one or more non-path regions of the terrain laterally offset from the vehicle by the set distance.
  18. 18. A control system according to claim 16 or claim 17 wherein the control system is configured to: determine image content data from one or more sub-regions of the image data relating to respective portions of the terrain laterally offset from the vehicle by the set distance; compare the image content data to the path model; and, in dependence on the image content data relating to one or more of the said sub-regions meeting one or more dissimilarity criteria with respect to the path model, determine the non-path model in dependence on the dissimilar image content data.
  19. A control system according to any one of claims 13 to 17 as dependent on any one of claims 6 to 12 wherein the control system is configured to: determine tyre region image content data from one or more sub-regions of the image data relating to a location of a tyre of the vehicle; compare the tyre region image content data to the non-path model; and, in dependence on the tyre region image content data and the non-path model meeting one or more dissimilarity criteria with respect to each other, determine the path model in dependence on the tyre region image content data.
  20. A control system according to any one of claims 6 to 12 or any one of claims 13 to 19 as dependent on any one of claims 6 to 12 wherein the control system is configured to: determine that a selected portion of a sub-region meets one or more similarity criteria with respect to the path model; and in dependence thereon selectively update the path model in dependence on the selected portion of the sub-region.
  21. A control system according to any preceding claim wherein the control system is configured to: obtain 3D data in respect of the terrain; and, for respective portions of the terrain relating to each of a plurality of the said sub-regions, determine the cost for the vehicle to traverse the respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data and on the 3D data.
  22. A vehicle comprising a control system according to any one of claims 1 to 20.
  23. A method of determining a vehicle path, the method comprising: obtaining image data relating to terrain to be traversed by the vehicle, and for each of a plurality of sub-regions of the image data: determining path probability data indicative of a probability that the respective sub-region relates to a path region of the terrain; determining non-path probability data indicative of a probability that the respective sub-region relates to a non-path region of the terrain; and determining a cost for the vehicle to traverse a respective portion of the terrain to which the respective sub-region relates in dependence on the determined path and non-path probability data; and determining a vehicle path in dependence on the determined costs or outputting determined costs in dependence on which a vehicle path can be determined.
  24. A computer program product comprising computer readable instructions that, when executed by a computer, cause performance of the method of claim 23.
  25. A non-transitory computer readable medium comprising computer readable instructions that, when executed by a processor, cause performance of the method of claim 23.
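The similarity-gated model update recited in claim 12 can be sketched as follows. The feature representation (a short numeric vector), the Euclidean distance metric, the threshold and the blending factor are all illustrative assumptions, not taken from the specification:

```python
def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def update_path_model(path_model, tyre_features, sim_threshold=0.5, alpha=0.2):
    """Blend tyre-region features into the path model only when they meet
    the similarity criterion; otherwise leave the model unchanged."""
    if euclidean(path_model, tyre_features) <= sim_threshold:
        return [(1 - alpha) * m + alpha * t
                for m, t in zip(path_model, tyre_features)]
    return path_model

model = [0.5, 0.5, 0.5]
similar = update_path_model(model, [0.6, 0.5, 0.4])      # passes the gate
dissimilar = update_path_model(model, [5.0, 5.0, 5.0])   # rejected, model kept
```

The gate prevents the path model drifting when a tyre momentarily leaves the path, while still letting it adapt to gradual changes in path appearance.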
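The per-sub-region comparison of claim 13 might be realised with normalised colour histograms as the image content data and histogram intersection as the similarity measure; both choices are assumptions for illustration only:

```python
def histogram_intersection(h1, h2):
    """Similarity in [0, 1] between two normalised histograms."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def non_path_probability(sub_region_hist, non_path_model_hist):
    """Probability that a sub-region relates to non-path terrain, taken
    directly as its similarity to the non-path model."""
    return histogram_intersection(sub_region_hist, non_path_model_hist)

non_path_model = [0.1, 0.2, 0.7]   # assumed model: mostly 'vegetation' bins
grass_region = [0.1, 0.3, 0.6]     # resembles the non-path model
track_region = [0.8, 0.1, 0.1]     # resembles a bare track instead
p_grass = non_path_probability(grass_region, non_path_model)
p_track = non_path_probability(track_region, non_path_model)
```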
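The set-distance determination of claim 16 can be sketched as a lateral sweep outwards from the vehicle, stopping at the first sub-region whose content meets a dissimilarity criterion with respect to the path model. The sample distances, feature vectors and threshold are assumed values:

```python
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def determine_set_distance(path_model, lateral_samples, dissim_threshold=1.0):
    """lateral_samples: (lateral_distance_m, feature_vector) pairs, ordered
    outwards from the vehicle. Returns the distance of the first dissimilar
    sample, or None if every sample still resembles the path."""
    for distance, features in lateral_samples:
        if euclidean(path_model, features) > dissim_threshold:
            return distance
    return None

path_model = [0.5, 0.5]
samples = [(1.0, [0.5, 0.6]),    # still path-like (track edge)
           (2.0, [0.4, 0.5]),    # still path-like
           (3.0, [3.0, 3.0])]    # verge: dissimilar to the path model
set_distance = determine_set_distance(path_model, samples)
```

Per claim 17, the non-path model could then be seeded from sub-regions at this set distance.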
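One way claim 21's cost determination could combine the two probabilities with the 3D data is a weighted sum, with the 3D term reduced here to a gradient penalty; the cost form and weights are illustrative assumptions:

```python
def traversal_cost(p_path, p_non_path, gradient_deg,
                   w_non_path=10.0, w_gradient=0.5):
    """Lower cost for likely-path, flat terrain; higher cost for likely
    non-path or steep terrain."""
    return (1.0 - p_path) + w_non_path * p_non_path + w_gradient * gradient_deg

flat_track = traversal_cost(p_path=0.9, p_non_path=0.05, gradient_deg=2.0)
steep_verge = traversal_cost(p_path=0.1, p_non_path=0.8, gradient_deg=15.0)
```

Keeping the probability and 3D terms separate lets a steep but clearly path-like slope remain cheaper than flat vegetation.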
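End to end, the method of claim 23 yields a cost per sub-region from which a path can be determined. A minimal sketch, assuming the costs are laid out as a grid (rows = distance ahead, columns = lateral position) and using a greedy per-row minimum; a real planner would also enforce lateral continuity between rows:

```python
def plan_path(cost_grid):
    """Return, for each row of the cost grid, the index of the cheapest
    lateral cell, i.e. a simple column-per-row vehicle path."""
    return [min(range(len(row)), key=row.__getitem__) for row in cost_grid]

cost_grid = [
    [5.0, 1.0, 4.0],   # nearest row: path lies in the middle column
    [6.0, 1.2, 3.5],
    [4.0, 2.0, 1.1],   # further ahead the cheap cells drift right
]
path = plan_path(cost_grid)
```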
GB1815339.5A 2018-09-20 2018-09-20 Control system for determining vehicle path Active GB2577486B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
GB1815339.5A GB2577486B (en) 2018-09-20 2018-09-20 Control system for determining vehicle path
PCT/EP2019/068943 WO2020057801A1 (en) 2018-09-20 2019-07-15 Control system for a vehicle
US17/278,265 US11999378B2 (en) 2018-09-20 2019-07-15 Control system for a vehicle
DE112019004698.5T DE112019004698T5 (en) 2018-09-20 2019-07-15 CONTROL SYSTEM FOR A VEHICLE

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1815339.5A GB2577486B (en) 2018-09-20 2018-09-20 Control system for determining vehicle path

Publications (3)

Publication Number Publication Date
GB201815339D0 GB201815339D0 (en) 2018-11-07
GB2577486A true GB2577486A (en) 2020-04-01
GB2577486B GB2577486B (en) 2023-05-10

Family

ID=64024203

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1815339.5A Active GB2577486B (en) 2018-09-20 2018-09-20 Control system for determining vehicle path

Country Status (1)

Country Link
GB (1) GB2577486B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140257621A1 (en) * 2013-03-08 2014-09-11 Oshkosh Corporation Terrain classification system for a vehicle
US20170285648A1 (en) * 2016-04-01 2017-10-05 Locus Robotics Corporation Navigation using planned robot travel paths
WO2018158020A1 (en) * 2017-03-01 2018-09-07 Volkswagen Aktiengesellschaft Method and device for determining a trajectory in off-road scenarios

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2654157T3 (en) * 2010-07-06 2018-02-12 Bae Systems Plc Method to support the guidance of a vehicle on the ground

Also Published As

Publication number Publication date
GB2577486B (en) 2023-05-10
GB201815339D0 (en) 2018-11-07

Similar Documents

Publication Publication Date Title
GB2577485A (en) Control system for a vehicle
US11999378B2 (en) Control system for a vehicle
US11772647B2 (en) Control system for a vehicle
US11554778B2 (en) Vehicle speed control
JP6380422B2 (en) Automated driving system
WO2018166747A1 (en) Improvements in vehicle control
US11603103B2 (en) Vehicle speed control
US11021160B2 (en) Slope detection system for a vehicle
CN109426261B (en) Automatic driving device
US10611375B2 (en) Vehicle speed control
US20140188350A1 (en) Vehicle control system and method
WO2018007079A1 (en) Improvements in vehicle speed control
US11975725B2 (en) Systems and methods for updating the parameters of a model predictive controller with learned external parameters generated using simulations and machine learning
CN108466621A (en) effective rolling radius
US20220242401A1 (en) Systems and methods for updating the parameters of a model predictive controller with learned controls parameters generated using simulations and machine learning
GB2576265A (en) Improvements in vehicle speed control
GB2584587A (en) Control system for a vehicle
GB2551711A (en) Improvements in vehicle speed control
GB2577676A (en) Control system for a vehicle
GB2576450A (en) Improvements in vehicle speed control
GB2577486A (en) Control system for a vehicle
US20220242441A1 (en) Systems and methods for updating the parameters of a model predictive controller with learned operational and vehicle parameters generated using simulations and machine learning
CN116588078B (en) Vehicle control method, device, electronic equipment and computer readable storage medium
GB2552940A (en) Improvements in vehicle control