GB2571589A - Terrain inference method and apparatus

Info

Publication number
GB2571589A
Authority
GB
United Kingdom
Prior art keywords: target vehicle, terrain, vehicle, movement, inference system
Legal status: Granted
Application number
GB1806629.0A
Other versions
GB201806629D0
GB2571589B
Inventor
Shamshiri Navid
Boyd Robin
Raveendran Arun
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Application filed by Jaguar Land Rover Ltd
Publication of GB201806629D0
Priority to PCT/EP2019/050389 (WO2019166142A1)
Priority to DE112019001080.8T (DE112019001080T5)
Priority to US16/977,065 (US20210012119A1)
Publication of GB2571589A
Application granted
Publication of GB2571589B
Status: Active


Classifications

    • B60W Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
    • B60W10/04 Conjoint control including control of propulsion units
    • B60W10/10 Conjoint control including control of change-speed gearings
    • B60W10/20 Conjoint control including control of steering systems
    • B60W10/22 Conjoint control including control of suspension systems
    • B60W30/14 Adaptive cruise control
    • B60W40/06 Road conditions
    • B60W40/076 Slope angle of the road
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2552/05 Type of road
    • B60W2552/15 Road slope
    • B60W2552/20 Road profile
    • B60W2552/25 Road altitude
    • B60W2552/30 Road curve radius
    • B60W2552/35 Road bumpiness, e.g. pavement or potholes
    • B60W2552/40 Coefficient of friction
    • B60W2554/802 Spatial relation or speed relative to objects: longitudinal distance
    • B60W2556/50 External transmission of data to or from the vehicle for navigation systems

Abstract

A terrain inference system, in a host vehicle 2, comprises a controller (6, fig 3) which monitors, e.g. via optical sensor and/or camera 13, a target vehicle 3 to identify an attitude and/or a movement of the target vehicle 3. At least one terrain characteristic, such as incline angle, surface roughness and terrain composition, is inferred in dependence on the identified attitude of the target vehicle 3 and/or the identified movement of the target vehicle 3. The at least one terrain characteristic relates to a region of terrain proximal to the target vehicle 3. The controller (6) is configured to generate a vehicle control parameter, such as a driveline and/or transmission and/or chassis control parameter, or output an alert in dependence on the inferred terrain characteristic. The controller (6) determines a geographical position of the target vehicle 3 and maps the terrain characteristic in dependence on the determined geographic position. Reference is also made to a method of inferring at least one terrain characteristic and a non-transitory computer-readable medium.

Description

TERRAIN INFERENCE METHOD AND APPARATUS
TECHNICAL FIELD
The present disclosure relates to a terrain inference method and apparatus. In particular, but not exclusively, the present disclosure relates to a method and apparatus for inferring at least one terrain characteristic. The present disclosure has particular application in a vehicle, such as an automobile.
BACKGROUND
When driving a vehicle off-road, it can be advantageous to have advance knowledge of the terrain ahead, for example to assess the composition of the track ahead. Useful information includes track obstacles (holes, ruts, rough surfaces, side slopes, wades) and track direction (bends, slopes). Detecting these features is usually very difficult until the vehicle is traversing them, so existing systems typically react to such features only after the event.
The present invention seeks to implement a terrain inference apparatus and method for inferring at least one terrain characteristic.
SUMMARY OF THE INVENTION
Aspects of the present invention relate to a terrain inference system, a vehicle, a method and a non-transitory computer-readable medium as claimed in the appended claims.
According to a further aspect of the present invention there is provided a terrain inference system comprising a controller configured to:
monitor a target vehicle;
identify an attitude of the target vehicle and/or a movement of the target vehicle; and infer at least one terrain characteristic relating to a region of terrain proximal to the target vehicle in dependence on the identified attitude of the target vehicle and/or the identified movement of the target vehicle. The at least one terrain characteristic may be inferred with reference to the attitude and/or movement of the target vehicle. Thus, the at least one terrain characteristic may be determined indirectly with reference to the behaviour of the target vehicle. At least in certain embodiments, the terrain inference system may apply an inverse dynamics model to infer the at least one terrain characteristic in dependence on the determined behaviour of the target vehicle.
The target vehicle may be in front of the host vehicle. The target vehicle may, for example, be the vehicle in front of the host vehicle in a convoy or may be a lead vehicle in a convoy. The host vehicle may be a following vehicle (i.e. a vehicle which is following the target vehicle). At least in certain embodiments, the host vehicle and the target vehicle are both land vehicles. The host vehicle and the target vehicle may be wheeled vehicles.
In a vehicle follow situation, data can be obtained relating to the target vehicle. It is possible to detect, for example, target vehicle roll, target vehicle inclination relative to the host vehicle, or small deviations of the target vehicle resulting from surface conditions. Computation of these parameters can be used to provide a prediction of approaching surface conditions, or to determine a direction or course of the track taken by the target vehicle. The terrain inference system could, for example, be used to implement a pro-active adaptive terrain system that prepares one or more systems in the host vehicle for a rough surface based on the observations made of the target vehicle. Another example may be a warning system to output an alert of a dangerous side slope ahead, for example based on the relative body angle of the target vehicle.
The inferred terrain characteristic may comprise at least one of the following set: an incline angle, an incline direction, a surface roughness, and a terrain composition. The incline angle may correspond to a gradient of the terrain which the target vehicle is traversing. The surface roughness may provide an indication of the prevailing surface conditions, for example the magnitude and frequency of surface irregularities. The terrain composition may provide an indication of whether the terrain comprises a solid/stable surface or an amorphous/unstable surface. The terrain composition may be determined, for example, by detecting a vertical displacement between an underside of the vehicle body and the surface of the terrain.
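By way of illustration only, such characteristics lend themselves to a simple structured record. The following sketch is not part of the disclosure; all type and field names are assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class TerrainComposition(Enum):
    SOLID_STABLE = "solid/stable"              # e.g. rock, compacted earth
    AMORPHOUS_UNSTABLE = "amorphous/unstable"  # e.g. sand, mud, deep snow

@dataclass
class TerrainCharacteristic:
    """One inferred record for a region of terrain proximal to the target vehicle."""
    incline_angle_deg: float      # gradient of the terrain being traversed
    incline_direction_deg: float  # heading of the upslope direction
    surface_roughness: float      # 0.0 (smooth) to 1.0 (impassable), cf. SRC below
    composition: TerrainComposition
```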
The terrain characteristic may be inferred in dependence on a roll angle and/or a pitch angle and/or a yaw angle of the target vehicle. For example, the incline angle and/or incline direction may be determined in dependence on one or more of the following: the roll angle, the pitch angle, and the yaw angle of the target vehicle. Alternatively, or in addition, the terrain characteristic may be inferred in dependence on a rate of change of the roll angle, the pitch angle and/or the yaw angle of the target vehicle.
The controller may be configured to generate a vehicle control parameter in dependence on the at least one inferred terrain characteristic. The vehicle control parameter may comprise at least one of the following set: a drivetrain control parameter, a transmission control parameter, a chassis control parameter, and a steering control parameter. The terrain inference system described herein may be installed in a host vehicle. The vehicle control parameter may be generated to control one or more vehicle systems in said host vehicle.
The controller may be configured to output an alert in dependence on the inferred terrain characteristic. The alert may, for example, notify a driver that the terrain is impassable or potentially hazardous. The controller may, for example, determine that an incline angle of the terrain exceeds a predefined incline threshold.
The identification of the attitude of the target vehicle may comprise identifying one or more of the following set: a target vehicle pitch angle, a target vehicle roll angle, and a target vehicle yaw angle.
The identification of the movement of the target vehicle may comprise identifying at least one of the following set: a change in the target vehicle pitch angle, a change in the target vehicle roll angle, and a change in the target vehicle yaw angle.
The identification of the movement of said target vehicle may comprise identifying at least one of the following set: a vertical movement, a transverse movement, and a longitudinal movement.
The identification of the movement of said target vehicle may comprise identifying an extension or a compression of a vehicle suspension.
The controller may be configured to receive image data from at least one image sensor, the controller being configured to process said image data to identify the attitude of the target vehicle and/or the movement of the target vehicle.
The controller may be configured to determine a geographic position of a target vehicle and to map said at least one terrain characteristic in dependence on the determined geographic position.
According to a further aspect of the present invention there is provided a vehicle comprising a terrain inference system as described herein.
According to a further aspect of the present invention there is provided a method of inferring at least one characteristic of the terrain proximal to a target vehicle, the method comprising: monitoring a target vehicle;
identifying an attitude of the target vehicle and/or a movement of the target vehicle; and inferring said at least one characteristic of the terrain proximal to the target vehicle in dependence on the identified attitude and/or the identified movement.
The inferred terrain characteristic may comprise at least one of the following set: an incline angle, an incline direction, a surface roughness, and a terrain composition. The incline angle and/or the incline direction may be determined in dependence on a roll angle and/or a pitch angle and/or a yaw angle of the target vehicle.
The method may comprise generating a vehicle control parameter in dependence on the at least one inferred terrain characteristic. The vehicle control parameter may comprise at least one of the following set: a drivetrain control parameter, a transmission control parameter, a chassis control parameter, and a steering control parameter. The chassis control parameter may, for example, adjust suspension controls and/or Electronic Stability Program (ESP) functions.
The method may comprise outputting an alert in dependence on the inferred terrain characteristic.
The identification of the attitude of said target vehicle may comprise identifying at least one of the following set: a target vehicle pitch angle, a target vehicle roll angle, and a target vehicle yaw angle.
The identification of the movement of said target vehicle may comprise identifying at least one of the following set: a change in the target vehicle pitch angle, a change in the target vehicle roll angle, and a change in the target vehicle yaw angle.
The identification of the movement of said target vehicle may comprise identifying at least one of the following set: a vertical movement, a transverse movement, and a longitudinal movement.
The identification of the movement of said target vehicle may comprise identifying an extension or a compression of a vehicle suspension.
The method may comprise receiving image data from at least one image sensor, the method comprising processing said image data to identify the attitude of the target vehicle and/or the movement of the target vehicle.
The method may comprise determining a geographic position of the target vehicle. The at least one terrain characteristic may be mapped in dependence on the determined geographic position.
According to a further aspect of the present invention there is provided a non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method(s) described herein.
The host vehicle may be a land vehicle. The target vehicle may be a land vehicle. The term “land vehicle” is used herein to refer to a vehicle configured to apply steering and drive (traction) forces against the ground. The vehicle may, for example, be a wheeled vehicle or a tracked vehicle.
The term “location” is used herein to refer to the relative position of an object on the surface of the earth. Unless indicated to the contrary, either explicitly or implied by the context, references herein to the location of an object refer to the geospatial location of that object.
It is to be understood that by the term 'type of terrain' is meant the material comprised by the terrain over which the vehicle is driving, such as asphalt, grass, gravel, snow, mud, rock and/or sand. By 'off-road' is meant a surface traditionally classified as off-road, being a surface other than asphalt, concrete or the like. For example, off-road surfaces may be relatively compliant surfaces such as mud, sand, grass, earth, gravel or the like. Alternatively, or in addition, off-road surfaces may be relatively rough, for example stony, rocky, rutted or the like. Accordingly, in some arrangements an off-road surface may be classified as a surface that has a relatively high roughness and/or compliance compared with a substantially flat, smooth asphalt or concrete road surface.
Any control unit or controller described herein may suitably comprise a computational device having one or more electronic processors. The system may comprise a single control unit or electronic controller, or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein, the term “controller” or “control unit” will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller or control unit, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memories associated with said controller to be executed on said computational device. The control unit or controller may be implemented in software run on one or more processors. One or more other control units or controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the present invention will now be described, by way of example only, with reference to the accompanying Figures, in which:
Figure 1 shows a plan view of a host vehicle incorporating a terrain inference system in accordance with an embodiment of the present invention;
Figure 2 shows a side elevation of the host vehicle shown in Figure 1 incorporating the terrain inference system in accordance with an embodiment of the present invention;
Figure 3 shows a schematic representation of the terrain inference system incorporated into the host vehicle shown in Figures 1 and 2;
Figure 4 shows a schematic representation of the combination of the data sets from the inertial measurement unit and the image processing module;
Figure 5 shows an exemplary image captured by the optical sensor and analysed to detect a discrete image component corresponding to the target vehicle;
Figure 6A illustrates the determination of a minimum inclination angle of a track on which the target vehicle is travelling;
Figure 6B illustrates the determination of a roll angle of the target vehicle;
Figure 6C illustrates the determination of a surface roughness by tracking the movement and/or attitude of the target vehicle;
Figure 7A illustrates an image acquired by a camera showing a target vehicle and a bounding box generated by an image processing module;
Figure 7B illustrates changes to the image shown in Figure 7A resulting from the target vehicle traversing a pothole; and
Figure 7C illustrates changes to the image shown in Figure 7A resulting from the target vehicle driving around a pothole.
DETAILED DESCRIPTION
A terrain inference system 1 in accordance with an embodiment of the present invention will now be described with reference to the accompanying Figures.
As illustrated in Figures 1 and 2, the terrain inference system 1 is installed in a host vehicle 2. The host vehicle 2 is a wheeled vehicle, such as an automobile or an off-road vehicle. The terrain inference system 1 is operable to detect a target vehicle 3. The target vehicle 3 is a wheeled vehicle, such as an automobile or an off-road vehicle. The host vehicle 2 and the target vehicle 3 are both land vehicles (i.e. vehicles configured to apply steering and drive (traction) forces against the ground). The target vehicle 3 may, for example, be travelling in front of the host vehicle 2. For example, the target vehicle 3 may be a lead vehicle or a vehicle in front of the host vehicle 2 in a convoy. In this scenario, the host vehicle 2 may be a following vehicle which is travelling along the same route as the target vehicle 3.
The host vehicle 2 described herein comprises a first reference frame comprising a longitudinal axis X1, a transverse axis Y1 and a vertical axis Z1. The target vehicle 3 described herein comprises a second reference frame comprising a longitudinal axis X2, a transverse axis Y2 and a vertical axis Z2. The orientation of the first and second reference frames is described herein with reference to a horizontal axis X and a vertical axis Z.
The host vehicle 2 comprises four wheels W1-4. A torque is transmitted to the wheels W1-4 to apply a tractive force to propel the host vehicle 2. The torque is generated by one or more torque generating machines, such as an internal combustion engine or an electric traction machine, and transmitted to the driven wheels W1-4 via a vehicle powertrain. The host vehicle 2 in the present embodiment has four-wheel drive and, in use, torque is transmitted selectively to each of said wheels W1-4. It will be understood that the terrain inference system 1 could also be installed in a host vehicle 2 having two-wheel drive. The host vehicle 2 in the present embodiment is an automobile having off-road driving capabilities. For example, the host vehicle 2 may be capable of driving on an un-metalled road, such as a dirt road or track. The host vehicle 2 may, for example, be a sports utility vehicle (SUV) or a utility vehicle, but it will be understood that the terrain inference system 1 may be installed in other types of vehicle. The terrain inference system 1 may be installed in other types of wheeled vehicles, such as light, medium or heavy trucks. The target vehicle 3 may have the same configuration as the host vehicle 2 or may have a different configuration.
A schematic representation of the terrain inference system 1 installed in the host vehicle 2 is shown in Figure 3. The terrain inference system 1 comprises a controller 6 having at least one electronic processor 7 and a memory 8. The processor 7 is operable to receive an image data signal S1 from a sensing means 9. As described herein, the processor 7 is operable to process the image data signal S1. In the present embodiment, the processor 7 is configured to implement an image processing module 10 to analyse the image data signal S1. The image processing module 10 in accordance with the present invention is configured to detect the target vehicle 3 and to determine an attitude (orientation) and/or movement of the target vehicle 3. The processor 7 may optionally also control operation of the host vehicle 2 in dependence on the relative location of the target vehicle 3. For example, the processor 7 may be operable to control a target follow distance D1 between the host vehicle 2 and the target vehicle 3. The processor 7 may control selection of one or more driving modes of the host vehicle 2 in dependence on the monitoring of the target vehicle 3. For example, the processor 7 may be configured to control one or more of the following systems: All-Terrain Progress Control (ATPC), Hill Descent Control, Electronic Traction Control (ETC), Adaptive Dynamics, Dynamic Stability Control (DSC), and variable ratio Electric Power-Assisted Steering (EPAS). The processor 7 may, for example, control one or more of the following set: suspension settings; throttle response; brake response; and transmission settings. Alternatively, or in addition, the processor 7 may output a target follow distance signal SD1 to a cruise control module 11. The cruise control module 11 may be selectively operable in a follow mode suitable for controlling a target speed of the host vehicle 2 to maintain the target follow distance D1 between the host vehicle 2 and the target vehicle 3. The cruise control module 11 may output a target speed signal SV1 to an engine control module 12 which controls the output torque transmitted to the wheels W1-4. The cruise control module 11 may also generate a brake control signal for controlling a braking torque applied to said wheels W1-4. The processor 7 may optionally also output a steering control signal (not represented) to control an electronic power assisted steering module (not shown) to control a steering angle of the host vehicle 2. The steering control signal may be output to control the host vehicle 2 to follow the path taken by the target vehicle 3.
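The control law used by the cruise control module 11 in follow mode is not specified here; purely as an illustrative sketch, a proportional adjustment of the target speed against the follow-distance error might look as follows (the gain and limits are hypothetical):

```python
def follow_mode_target_speed(current_speed_mps: float,
                             measured_distance_m: float,
                             target_distance_m: float,
                             k_p: float = 0.5,
                             max_speed_mps: float = 20.0) -> float:
    """Compute a target speed (cf. signal SV1) aimed at maintaining the
    target follow distance D1 between host vehicle 2 and target vehicle 3."""
    error_m = measured_distance_m - target_distance_m  # positive: gap too large
    target = current_speed_mps + k_p * error_m         # speed up to close the gap
    return max(0.0, min(target, max_speed_mps))        # clamp to a safe range
```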
As illustrated in Figures 1 and 2, the sensing means 9 is mounted in a forward-facing orientation to establish a detection region in front of the host vehicle 2. The sensing means 9 in the present embodiment comprises at least one optical sensor 13 mounted to the host vehicle 2. The sensing means 9 may comprise a single camera. Alternatively, the sensing means 9 may comprise a stereoscopic camera. The at least one optical sensor 13 may be mounted at the front of the host vehicle 2, for example incorporated into a front bumper or engine bay grille; or may be mounted within the vehicle cabin, for example in front of a rearview mirror. The at least one optical sensor 13 has a field of view FOV having a central optical axis VX extending substantially parallel to the longitudinal axis X1 of the host vehicle 2. The field of view FOV is generally conical in shape and extends in horizontal and vertical directions. The at least one optical sensor 13 comprises a digital imaging sensor for capturing image data. The image data comprises an image IMG1 corresponding to a scene within the field of view FOV of the at least one optical sensor 13. The image data is captured substantially in real-time, for example at 30 frames per second. The at least one optical sensor 13 in the present embodiment is operable to detect light in the visible spectrum of light. The sensing means 9 comprises optics (not shown) for directing the incident light onto an imaging sensor, such as a charge-coupled device (CCD), operable to generate image data for transmission in the image data signal S1. Alternatively, or in addition, the sensing means 9 may be operable to detect light outside of the visible light spectrum, for example in the infra-red range to generate a thermographic image. Alternatively, or in addition, the sensing means 9 may comprise a Lidar sensor for projecting a laser light in front of the host vehicle 2. Other types of sensor are also contemplated.
The sensing means 9 is connected to the controller 6 over a communication bus 14 provided in the host vehicle 2, as shown in Figure 3. The image data signal S1 is published to the communication bus 14 by the sensing means 9. In the present embodiment, the connection between the sensing means 9 and the controller 6 comprises a wired connection. In alternative embodiments, the connection between the sensing means 9 and the controller 6 may comprise a wireless connection, for example to enable remote positioning of the sensing means 9. By way of example, the sensing means 9 may be provided in a remote targeting system, such as a drone vehicle. The processor 7 is operable to read the image data signal S1 from the communication bus 14. The processor 7 extracts image data from the image data signal S1. In accordance with an aspect of the present invention, the image processing module 10 is configured to infer one or more characteristics of the terrain over which the target vehicle 3 is travelling in dependence on a determined attitude (orientation) and/or a determined movement of the target vehicle 3. The image processing module 10 cross-references the inferred terrain characteristic(s) with a determined geospatial location of the target vehicle 3. The image processing module 10 may thereby compile terrain data remote from the host vehicle 2. The resulting terrain data is particularly useful if the host vehicle 2 is following the target vehicle 3 along a particular route, as the host vehicle 2 will in due course traverse the same terrain. Accordingly, the terrain data may be used proactively to coordinate vehicle systems prior to encountering the terrain. The operation of the image processing module 10 will now be described.
The image processing module 10 parses the image data from the at least one optical sensor 13 to identify one or more image components IMC(n) within an image IMG1. The image components IMC(n) are preferably persistent features within the image IMG1 detectable within the image data for at least a predetermined time period or over a predetermined number of frames, for example two or more successive frames. In certain embodiments, the image components IMC(n) may comprise an identifiable feature or element contained within the image IMG1, for example comprising a plurality of pixels which are present in successive frames. The image processing module 10 implements an edge detection algorithm to detect edges within the image data. The image processing algorithm may, for example, be configured to identify points where the image brightness comprises discontinuities, particularly those points arranged into linear or curved line segments which may correspond to an edge. The image processing module 10 may apply a brightness threshold (which may be a predetermined threshold or a dynamic threshold) to identify the edges of the image components IMC(n) within the image IMG1. The identified edge(s) may be incomplete, for example in regions where image discontinuities are less pronounced. The image processing module 10 may complete the edges, for example utilising a morphological closing technique, to form a closed region. The or each closed region is identified as a discrete image component IMC(n). By repeating this process, the image processing algorithm may identify each image component IMC(n) contained within the image data.
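A minimal sketch of such an edge-detection and closing pipeline, assuming OpenCV (4.x) as the implementation; the thresholds, kernel size and minimum area below are illustrative, not taken from the disclosure:

```python
import cv2
import numpy as np

def extract_image_components(frame_gray: np.ndarray,
                             min_area: float = 500.0) -> list:
    """Return contours of closed regions as candidate image components IMC(n)."""
    # Detect points where the image brightness is discontinuous (edges).
    edges = cv2.Canny(frame_gray, 50, 150)
    # Morphological closing completes incomplete edges into closed regions.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
    # Each sufficiently large closed contour is treated as a discrete IMC(n).
    # (OpenCV 4.x findContours returns (contours, hierarchy).)
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) >= min_area]
```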
The image processing module 10 implements a pattern matching algorithm to compare each of the image components IMC(n) identified in the image IMG1 to predefined patterns stored in memory 8. The image processing module 10 classifies each of the image components IMC(n) in dependence on the correlation between each image component IMC(n) and the predefined patterns. The image processing module 10 may, for example, classify each image component IMC(n) as one of the following set: an obstacle 4; a target vehicle 3; a cyclist; a person (not shown); an animal, etc. In the present embodiment, the image processing module 10 is configured to identify the target vehicle 3 within the image IMG1. The pattern matching algorithm is implemented to determine if any of the image components IMC(n) identified in the image data (partially or completely) match one or more predefined patterns. The predefined patterns may, for example, comprise an object model defined in two dimensions (2-D) or three dimensions (3-D). The predefined patterns may be stored in the memory 8 and accessed by the image processing module 10. In the present embodiment, the predefined patterns correspond to a shape and/or profile of one or more target vehicles 3. Optionally, the predefined patterns may define a colour of the target vehicle 3, for example specified by a user or identified during an initial calibration procedure. Alternatively, or in addition, the predefined patterns may comprise a registration (number) plate mounted to an exterior of the target vehicle 3. The registration (number) plate comprises one or more alphanumeric characters and the attitude of the target vehicle 3 may be determined by analysing the image IMG1 to determine the perspective of said alphanumeric characters. The pattern corresponding to the registration (number) plate may be defined during a calibration phase. Known pattern matching techniques may be used to determine a correlation between the predefined patterns and the or each image component IMC(n). The image component IMC(n) corresponding to the target vehicle 3 may thereby be identified within the image IMG1.
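As one illustration of the correlation step, normalised template matching could score each image component against stored 2-D patterns (the disclosure equally contemplates 3-D object models and number-plate patterns). The labels and threshold below are assumptions:

```python
from typing import Optional
import cv2
import numpy as np

def classify_component(component_roi: np.ndarray,
                       patterns: dict,
                       threshold: float = 0.7) -> Optional[str]:
    """Return the label of the best-matching predefined pattern, or None.
    patterns maps labels (e.g. 'target_vehicle', 'cyclist') to grayscale templates."""
    best_label, best_score = None, threshold
    for label, template in patterns.items():
        th, tw = template.shape[:2]
        if th > component_roi.shape[0] or tw > component_roi.shape[1]:
            continue  # the template must fit inside the candidate region
        scores = cv2.matchTemplate(component_roi, template,
                                   cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(scores)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```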
The image processing module 10 is configured to analyse the image component IMC(n) corresponding to the target vehicle 3 to estimate the attitude of the target vehicle 3. For example, the image processing module 10 may analyse the image component IMC(n) to estimate one or more of the following set: a target vehicle pitch angle (Θ2), a target vehicle roll angle (β2), and a target vehicle yaw angle (γ2). The target vehicle pitch angle (Θ2) is the included angle between the longitudinal axis X2 and the horizontal axis X. The target vehicle roll angle (β2) is the included angle between the vertical axis Z2 and the vertical axis Z. The target vehicle yaw angle (γ2) is the included angle between the longitudinal axis X1 of the host vehicle 2 and the longitudinal axis X2 of the target vehicle 3. The image processing module 10 may optionally also monitor movement of the target vehicle 3. The image processing module 10 may analyse changes in the image component IMC(n) with respect to time to estimate one or more of the following set: longitudinal movement (speed and/or acceleration) of the target vehicle 3; lateral movement (speed and/or acceleration) of the target vehicle 3, for example caused by side-slipping; and/or vertical movement (speed and/or acceleration) of the target vehicle 3. Alternatively, or in addition, the image processing module 10 may analyse changes in the image component IMC(n) with respect to time to estimate one or more of the following set: a change or rate of change of the target vehicle pitch angle (Θ2), a change or rate of change of the target vehicle roll angle (β2), and a change or rate of change of the target vehicle yaw angle (γ2). It will be understood that the image processing module 10 may operate in conjunction with other sensors provided on the host vehicle 2 to monitor the target vehicle 3. The host vehicle 2 may comprise additional sensors suitable for tracking the movement of the target vehicle 3. By way of example, the host vehicle 2 may comprise one or more of the following set: an ultrasound sensor, a radar sensor and a lidar sensor.
The image processing module 10 may optionally also estimate the position of the target vehicle 3 relative to the host vehicle 2. For example, the image processing module 10 may determine the relative position of the target vehicle 3 in dependence on the size of the image component IMC(n) within the image IMG1; and/or the position of the image component IMC(n) within the image IMG1. By combining a known location of the host vehicle 2, for example derived from a global positioning system (GPS), with the relative position determined by the image processing module 10, a geospatial location of the target vehicle 3 may be determined. Alternatively, or in addition, the host vehicle 2 may receive geospatial location data transmitted from the target vehicle 3, for example using a suitable vehicle-to-vehicle communication protocol. The image processing module 10 outputs a target vehicle data signal ST1 to the terrain inference system 1.
It will be understood that the scene captured by the sensing means 9 is dependent on the attitude of the host vehicle 2 and/or movements of the host vehicle 2. In order to compensate for changes in the attitude and/or movements of the host vehicle 2, the terrain inference system 1 in the present embodiment is configured to receive an inertial measurement signal S2 from an inertial measurement unit (IMU) 15 provided in the host vehicle 2. The IMU 15 comprises one or more sensors 16 for measuring inertial movement of the host vehicle 2. The one or more sensors 16 measure a host vehicle pitch angle (Θ1) and a host vehicle roll angle (β1). The host vehicle pitch angle (Θ1) is the included angle between the longitudinal axis X1 and the horizontal axis X. The host vehicle roll angle (β1) is the included angle between the vertical axis Z1 and the vertical axis Z. The IMU 15 may determine a change (or a rate of change) of the host vehicle pitch angle (Θ1) and a change (or rate of change) of the host vehicle roll angle (β1). The one or more sensors 16 may comprise one or more accelerometers (not shown) and/or one or more gyroscopes (not shown). The terrain inference system 1 analyses said inertial measurement signal S2 to determine movements of the host vehicle 2. Optionally, one or more movements of the host vehicle 2 may be estimated, for example in dependence on the inertial measurement signal S2. The estimation of one or more movements of the host vehicle 2 may, for example, be appropriate if the IMU 15 does not include a sensor for one or more degrees of movement.
As shown in Figure 4, the terrain inference system 1 is configured to correct the measured attitude and/or movements of the target vehicle 3 in dependence on the determined attitude and/or movements of the host vehicle 2. The orientation and the movement of the host vehicle 2 are derived from the IMU 15 (BLOCK 10); and the measured orientation and movement of the target vehicle 3 are derived from the image processing module 10 (BLOCK 20). A comparison algorithm is applied (BLOCK 30) to compare both data sets. The comparison algorithm may, for example, subtract the orientation and the movement of the host vehicle 2 from the measured orientation and movement of the target vehicle 3 to determine a corrected orientation and movement of the target vehicle 3. The corrected orientation of the target vehicle 3 may, for example, be defined relative to a horizontal axis and a vertical axis. The terrain inference system 1 uses the corrected orientation and movement of the target vehicle 3 to estimate the one or more terrain characteristics (BLOCK 40). The terrain inference system 1 may, for example, apply an inverse dynamics model to infer the at least one terrain characteristic. By monitoring the dynamic behaviour of the target vehicle 3, the terrain inference system 1 may infer one or more characteristics of the terrain over which the target vehicle 3 is travelling. The terrain inference system 1 may, for example, determine a surface gradient in dependence on the corrected orientation of the target vehicle 3. The surface gradient may be inferred with reference to the long period behaviour of the target vehicle 3. The terrain inference system 1 may infer characteristics of the surface roughness by virtue of the magnitude and/or range and/or frequency of changes in the orientation of the target vehicle 3. For example, if the orientation of the target vehicle 3 is changing with a high frequency, the terrain inference system 1 may infer that the target vehicle 3 is travelling over a rough or irregular surface. The magnitude of the changes in the orientation of the target vehicle 3 may provide an indication of the size of any surface irregularities. The frequency of the changes in the orientation of the target vehicle 3 may provide an indication of the number of surface irregularities. The surface roughness may be inferred with reference to the short period behaviour of the target vehicle 3. The surface composition may be inferred with reference to the position and/or the attitude of the target vehicle 3 relative to the surface.
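A minimal sketch of the comparison step (BLOCK 30), following the subtraction described above; the per-axis subtraction is a simplification, and a production system might instead compose full rotation matrices:

```python
def correct_target_attitude(measured_target: dict, host_imu: dict) -> dict:
    """Subtract the host vehicle attitude (Theta1, beta1) from the measured
    target vehicle attitude (Theta2, beta2), giving a corrected attitude
    relative to the horizontal axis X and vertical axis Z.
    Both dicts carry 'pitch_deg' and 'roll_deg' entries."""
    return {
        "pitch_deg": measured_target["pitch_deg"] - host_imu["pitch_deg"],
        "roll_deg": measured_target["roll_deg"] - host_imu["roll_deg"],
    }
```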
The terrain inference system 1 may grade the terrain, for example by determining a surface roughness coefficient SRC. The surface roughness coefficient SRC provides an indication of the roughness of a surface SF over which the target vehicle 3 is travelling. The surface roughness coefficient SRC may, for example, provide an indication of the size and/or prevalence of surface irregularities. The surface roughness coefficient SRC may be determined in dependence on the magnitude and/or frequency of target vehicle movements, for example vertical movements. Alternatively, or in addition, the surface roughness coefficient SRC may be determined in dependence on changes in the target vehicle pitch angle (Θ2) and/or the target vehicle roll angle (β2). The surface roughness coefficient SRC may be determined in dependence on the period of any such movements, for example differentiating between short-period oscillations and long-period oscillations of the target vehicle 3. In the present embodiment, the surface roughness coefficient SRC is in the range zero (0) to one (1), inclusive. The surface roughness coefficient SRC is set equal to one (1) if the surface is deemed to be very rough, for example corresponding to terrain that cannot be traversed by the host vehicle 2. The surface roughness coefficient SRC is set equal to zero (0) if the surface is deemed to be smooth, for example corresponding to a metalled road surface. The surface roughness coefficient SRC may grade the surface roughness between these endpoints. For example, a surface which is slightly rough may have a surface roughness coefficient SRC of 0.6.
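Purely as an illustration of such a grading, the surface roughness coefficient SRC could be formed from the magnitude and dominant frequency of short-period vertical movements of the target vehicle 3; the scaling constants below are hypothetical:

```python
import numpy as np

def surface_roughness_coefficient(vertical_pos_m: np.ndarray,
                                  sample_rate_hz: float,
                                  mag_scale_m: float = 0.15,
                                  freq_scale_hz: float = 5.0) -> float:
    """Grade terrain from 0.0 (smooth, metalled road) to 1.0 (impassable)
    over a short window of corrected target vehicle vertical positions."""
    detrended = vertical_pos_m - np.mean(vertical_pos_m)  # remove slow slope/offset
    magnitude = float(np.std(detrended))                  # size of irregularities
    if detrended.size > 2:
        # Dominant oscillation frequency indicates how often irregularities occur.
        spectrum = np.abs(np.fft.rfft(detrended))
        freqs = np.fft.rfftfreq(detrended.size, d=1.0 / sample_rate_hz)
        dominant_hz = float(freqs[1 + np.argmax(spectrum[1:])])  # skip the DC bin
    else:
        dominant_hz = 0.0
    src = 0.5 * (magnitude / mag_scale_m) + 0.5 * (dominant_hz / freq_scale_hz)
    return float(np.clip(src, 0.0, 1.0))
```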
The operation of the terrain inference system 1 will now be described. An exemplary image IMG1 captured by the sensing means 9 disposed on the host vehicle 2 is shown in Figure 5. The image processing module 10 identifies a plurality of image components IMC(n) within the image IMG1. Using appropriate pattern matching techniques, the image processing module 10 classifies a first of said image components IMC(1) as corresponding to the target vehicle 3. The image processing module 10 analyses the first image component IMC(1) to determine the target vehicle pitch angle (Θ2), target vehicle roll angle (β2), and the target vehicle yaw angle (γ2). The terrain inference system 1 determines the host vehicle pitch angle (Θ1) and host vehicle roll angle (β1) in dependence on the inertial measurement signal S2 received from the IMU 15. By combining the datasets relating to the host vehicle 2 and the target vehicle 3, the terrain inference system 1 determines the corrected orientation and/or corrected movement of the target vehicle 3. The image processing module 10 in the present embodiment is configured to track the first image component IMC(1), for example over successive frames of the image data or at predetermined time intervals. The image processing module 10 may thereby monitor the target vehicle 3.
As shown in Figure 6A, the height H (elevation) of the target vehicle 3 relative to the host vehicle 2 may be determined in dependence on the vertical position of the first image component IMC(1) within the first image IMG1. By determining a longitudinal distance between the host vehicle 2 and the target vehicle 3, the terrain inference system 1 may estimate a minimum inclination angle (α) of the surface between the host vehicle 2 and the target vehicle 3. As shown in Figure 6B, the target vehicle roll angle (β2) is calculated by comparing a vertical axis of the first image component IMC(1) to a reference vertical axis. The terrain inference system 1 may thereby determine that the target vehicle 3 is disposed on an inclined surface having a side slope angle substantially equal to the calculated target vehicle roll angle (β2). As shown in Figure 6C, the terrain inference system 1 determines the surface roughness coefficient SRC in dependence on the magnitude and/or frequency of changes in the vertical position of the target vehicle 3. The terrain inference system 1 may optionally also consider the magnitude and/or frequency of changes in the target vehicle pitch angle (Θ2). As outlined above, the terrain characteristics are cross-referenced with the determined geospatial location of the target vehicle 3, for example to generate a terrain map.
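For the geometry of Figure 6A, the minimum inclination angle follows directly from the relative height H and the longitudinal distance, i.e. α = atan(H/D). A one-line sketch (inputs assumed already corrected for host vehicle attitude):

```python
import math

def min_inclination_angle_deg(relative_height_m: float,
                              longitudinal_distance_m: float) -> float:
    """Minimum inclination angle (alpha) of the surface between the host
    vehicle 2 and the target vehicle 3 (Figure 6A): alpha = atan(H / D)."""
    return math.degrees(math.atan2(relative_height_m, longitudinal_distance_m))
```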
The terrain inference system 1 in accordance with the present embodiment has particular application in an off-road environment. When the host vehicle 2 and the target vehicle 3 are travelling off-road, the determination of the terrain characteristics is usually more important than in an on-road environment. The terrain inference system 1 may be selectively activated when the host vehicle 2 is travelling off-road, for example in response to a user input or automatically when an off-road driving mode is selected. The terrain inference system 1 may track the target vehicle 3, for example to determine the route taken by the target vehicle 3. The terrain inference system 1 may generate a corresponding target route for the host vehicle 2. At least in certain embodiments, the image processing module 10 may calculate the speed and/or the trajectory of the target vehicle 3. It will be understood, however, that the terrain inference system may be utilised in an on-road setting (i.e. a metalled surface), for example to facilitate identification of a traffic calming measure, such as a speed hump or a speed table, or a pothole.
A variant of the terrain inference system 1 will now be described with reference to Figures 7A, 7B and 7C. Like reference numerals are used for like components. The terrain inference system 1 is suitable for inferring the presence of an obstacle 16, such as a pothole or other terrain feature, in the path of the target vehicle 3. The obstacle 16 may be present in a metalled surface or un-metalled surface.
The terrain inference system 1 comprises at least one optical sensor 13 configured to capture an image IMG2. The optical sensor 13 in the present embodiment comprises a forward-facing camera disposed on the host vehicle 2 and operable to capture a video image, for example comprising twenty (20) images per second. The camera may comprise a mono camera or a stereoscopic camera. As described herein, the image processing module 10 is configured to process the images captured by the optical sensor 13 to identify and track the target vehicle 3. An exemplary image IMG2 captured by the optical sensor 13 is shown in Figure 7A. The image processing module 10 analyses the image IMG2 to identify and classify an image component IMC(1) corresponding to the target vehicle 3. The image processing module 10 adds a bounding box 17 around the image component IMC(1) in the image IMG2. A suitable method of generating the bounding box 17 comprises identifying corners 18A-D of the image component IMC(1). Horizontal and vertical lines are drawn between the corners 18A-D to complete the bounding box 17. The image processing module 10 is configured to perform this operation at least substantially in real-time. The bounding box 17 moves with the target vehicle 3, thereby enabling the image processing module 10 to track movement of the target vehicle 3 in a sequence of images. Over a period of time the image processing module 10 will track the bounding box 17 and determine its normal range of movement in a vertical direction and/or a transverse direction. Alternatively, or in addition, the terrain inference system 1 may comprise a radar sensor or other type of sensor.
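A sketch of the bounding-box step, together with a running estimate of the box's normal range of vertical and transverse movement; the class and method names are illustrative only:

```python
import numpy as np

def bounding_box(corners: list) -> tuple:
    """Axis-aligned bounding box 17 from detected corners 18A-D, given as
    (x, y) pixel coordinates. Returns (x_min, y_min, x_max, y_max)."""
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    return (min(xs), min(ys), max(xs), max(ys))

class BoxMotionBaseline:
    """Track the bounding box centre over time to learn its normal range
    of movement in the vertical and transverse directions."""
    def __init__(self):
        self.centres = []

    def update(self, box: tuple) -> None:
        x_min, y_min, x_max, y_max = box
        self.centres.append(((x_min + x_max) / 2.0, (y_min + y_max) / 2.0))

    def normal_range(self) -> tuple:
        """Return (transverse_std, vertical_std) in pixels over the history."""
        if len(self.centres) < 2:
            return (0.0, 0.0)
        arr = np.asarray(self.centres)
        return float(np.std(arr[:, 0])), float(np.std(arr[:, 1]))
```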
Upon identifying an obstacle 16, a driver of a vehicle may elect to drive over the obstacle 16 or to drive around the obstacle 16. If the vehicle drives over the obstacle 16, there is typically a corresponding vertical movement of the vehicle. If the vehicle drives around the obstacle 16, there is a corresponding lateral movement of the vehicle. The terrain inference system 1 in the present embodiment is configured to identify short period perturbations which may correspond to a target vehicle 3 driving over or around an obstacle 16. Any such perturbations may indicate that the target vehicle 3 is reacting to an obstacle 16 in its path. The terrain inference system 1 may infer terrain characteristics in dependence on the perturbations in the movement of the target vehicle 3. By analysing the movement of the target vehicle 3, the terrain inference system 1 may categorise the type or nature of the obstacle 16. For example, if the obstacle 16 is a pothole, the movement may comprise a downwards movement followed by an upwards movement. If the obstacle 16 is a ridge or a speed hump, the movement may comprise an upwards movement followed by a downwards movement. The terrain inference system 1 may identify such movements in the target vehicle 3 and infer characteristics of the obstacle 16. The host vehicle 2 may act upon this information and take appropriate preemptive action to mitigate the effect of the obstacle 16. In dependence on the terrain characteristics inferred by the terrain inference system 1, the host vehicle 2 could, for example, implement a steering change, or may re-configure a vehicle suspension, for example by changing damper settings.
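The over-the-obstacle logic above reduces to a signature test on the short-period vertical perturbation: a downwards-then-upwards movement suggests a pothole, an upwards-then-downwards movement suggests a ridge or speed hump. A minimal sketch (the sign convention and noise floor are assumptions):

```python
def classify_obstacle(vertical_deltas: list, noise_floor: float = 0.02) -> str:
    """Classify a short-period perturbation of the target vehicle body.
    vertical_deltas: frame-to-frame vertical displacement in metres,
    positive upwards, already corrected for host vehicle motion."""
    significant = [d for d in vertical_deltas if abs(d) > noise_floor]
    if not significant:
        return "no_obstacle"
    # A pothole produces a downwards movement followed by an upwards one;
    # a ridge or speed hump produces the opposite sequence.
    return "pothole" if significant[0] < 0 else "ridge_or_speed_hump"
```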
The operation of the terrain inference system 1 to infer terrain characteristics is illustrated with reference to the images IMG3 and IMG4 shown in Figures 7B and 7C respectively. The obstacle 16 in the illustrated examples comprises a pothole. If the target vehicle 3 drives over the pothole with one wheel, there is a sudden movement of the target vehicle 3 which causes a rolling motion. This rolling motion of the target vehicle 3 can be detected by analysing the image IMG3. In particular, the image processing module 10 may estimate a target vehicle roll angle (β2) by calculating an angle between the top and bottom sides of the bounding box 17 and a horizontal reference plane Y.
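A minimal sketch of the roll-angle estimate, assuming the bottom side of the bounding box 17 is fitted to the vehicle outline so that it tilts with the vehicle body; the corner ordering and the sign convention (image y-axis pointing downwards) are assumptions of this sketch.

    import math

    def roll_angle(bottom_left, bottom_right):
        # Angle of the bottom side of the bounding box relative to the
        # horizontal reference plane Y, in degrees.
        (x0, y0), (x1, y1) = bottom_left, bottom_right
        return math.degrees(math.atan2(y1 - y0, x1 - x0))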
Alternatively, or in addition, the image processing module 10 may be configured to detect vertical movement of the target vehicle 3 by monitoring the position of the bounding box 17. The vertical movement of the target vehicle 3 may be detected by monitoring the vertical position of one or more sides of the bounding box 17 in the image IMG3. If the target vehicle 3 traverses a pothole or a speed restriction hump with both wheels, the resulting movement of the target vehicle 3 would comprise a vertical movement with or without a change in the roll angle. The image processing module 10 may be configured to detect a corresponding change in the vertical position of the bounding box 17 in the image IMG3.
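For illustration, a per-frame test for such a vertical movement of the bounding box 17 might look as follows; the threshold value and the requirement that both horizontal sides move together are assumptions of this sketch.

    def vertical_jump(prev_box, box, threshold=4.0):
        # Boxes are (x_min, y_min, x_max, y_max) in pixels. Returns True
        # if both the top and bottom sides moved vertically by more than
        # the threshold between consecutive frames.
        _, py0, _, py1 = prev_box
        _, y0, _, y1 = box
        return abs(y0 - py0) > threshold and abs(y1 - py1) > threshold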
Alternatively, or in addition, at least one threshold may be predefined for relative movement of diametrically opposed corners 18A-D of the bounding box 17. If the movement of the diametrically opposed corners 18A-D of the bounding box 17 exceeds the predefined threshold(s), the image processing module may determine that the target vehicle 3 has traversed a pothole. The at least one threshold may be generated from one or more previous observations of the target vehicle 3. The at least one threshold may be calibrated by comparing detected movements of the target vehicle 3 with measured behaviour of the host vehicle 2 traversing the same obstacle 16. The thresholds may be adjusted dynamically, for example adjusted in dependence on an estimated speed of the target vehicle 3.
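The opposed-corner test could be sketched as follows; the corner ordering (18A-D stored as A, B, C, D, with the pairs (A, C) and (B, D) diametrically opposed) and the linear scaling of the threshold with estimated target speed are assumptions made for this illustration.

    def opposed_corner_exceeds(prev_corners, corners, base_threshold, speed_mps):
        # Dynamic threshold: assumed here to grow with estimated target speed.
        threshold = base_threshold * (1.0 + 0.05 * speed_mps)
        for i, j in ((0, 2), (1, 3)):  # diametrically opposed corner pairs
            d_i = corners[i][1] - prev_corners[i][1]
            d_j = corners[j][1] - prev_corners[j][1]
            if abs(d_i - d_j) > threshold:  # relative vertical movement
                return True
        return False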
If the target vehicle 3 drives around the obstacle 16, there is a change in the trajectory of the target vehicle 3. This change in trajectory may occur rapidly as the driver of the vehicle may have a relatively short period of time in which to drive around the obstacle 16. As illustrated in Figure 7C, if the target vehicle 3 drives around a pothole, there is a first lateral movement to avoid the pothole which may optionally be followed by a second lateral movement to return the target vehicle 3 to the original trajectory. In this example, it will be appreciated that the first and second lateral movements are in opposite directions. The image processing module 10 may be configured to detect the first lateral movement and optionally also the second lateral movement of the target vehicle 3 which are indicative of an avoidance manoeuvre. The image processing module 10 may detect the lateral movement(s) of the target vehicle 3 by identifying a movement of the bounding box 17. The image processing module 10 may be configured to identify a lateral movement ΔY which exceeds a predetermined threshold, for example within a set time period. The lateral movement ΔY is illustrated in Figure 7C by a first bounding box 17' shown as a dashed line representing the position of the target vehicle 3 at a first time; and a second bounding box 17 shown as a continuous line representing the position of the target vehicle 3 at a second time. The threshold may be set by a calibration process or derived from observation of movement of the target vehicle 3 over a period of time. The thresholds may be adjusted dynamically, for example in dependence on an estimated speed of the target vehicle 3.
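An illustrative sketch of the lateral-movement test, assuming the lateral position of the target vehicle 3 is taken as the horizontal centre of the bounding box 17 in each frame; the window length and threshold are placeholders to be set by the calibration described above.

    def avoidance_manoeuvre(cx_history, dt, threshold_px, window_s=1.0):
        # cx_history: lateral box-centre position per frame (pixels);
        # dt: frame interval in seconds. Detects a lateral movement
        # exceeding threshold_px within any window of window_s seconds.
        n = max(1, int(window_s / dt))
        for k in range(len(cx_history) - n):
            if abs(cx_history[k + n] - cx_history[k]) > threshold_px:
                return True
        return False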
The terrain inference system 1 may determine a geospatial position of the obstacle 16. For example, the image processing module 10 may estimate a position of the obstacle 16 with reference to a known location of the host vehicle 2. The image processing module 10 may be configured to track a wheel path of the target vehicle 3. The wheel path could be used to estimate a location of the obstacle 16 that prompted a change in the trajectory of the target vehicle 3.
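For illustration, projecting the obstacle position from a known host vehicle location, given a range and bearing to the point at which the target vehicle 3 reacted, might be sketched as follows; the flat-earth approximation and the availability of a range/bearing estimate are assumptions of this sketch.

    import math

    def obstacle_position(host_lat, host_lon, bearing_deg, range_m):
        # Small-distance flat-earth projection of a (range, bearing)
        # observation from the host vehicle's geographic position.
        dlat = (range_m * math.cos(math.radians(bearing_deg))) / 111320.0
        dlon = (range_m * math.sin(math.radians(bearing_deg))) / (
            111320.0 * math.cos(math.radians(host_lat)))
        return host_lat + dlat, host_lon + dlon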
The terrain inference system 1 described herein infers terrain characteristics in dependence on the movement or behaviour of another vehicle (the target vehicle 3), typically the vehicle in front of the host vehicle 2. The terrain inference system 1 may thereby infer characteristics of terrain which is obscured from on-board sensors by the target vehicle 3. This has particular advantages when the distance between the host vehicle 2 and the target vehicle 3 is relatively small, for example when operating in traffic, since the operation of a conventional scanning system which directly scans the terrain, for example utilising a radar system, may be impaired in this scenario.
It will be appreciated that various modifications may be made to the embodiment(s) described herein without departing from the scope of the appended claims.
The image processing module may be configured to detect and track the rear (tail) lights on a rear surface of the target vehicle 3. This technique may be used instead of, or in addition to, the techniques described herein to identify an outline of the target vehicle 3. This approach may be advantageous at night or in restricted visibility conditions. The host vehicle 2 could optionally emit light, for example from the headlamps, which is reflected off the rear (tail) lights of the target vehicle 3.
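By way of illustration, candidate tail-light pixels could be located as bright, predominantly red regions of the image; the colour thresholds below are placeholder values chosen for this sketch, not values taken from the embodiment.

    import numpy as np

    def tail_light_candidates(rgb):
        # rgb: H x W x 3 image array. Returns (x, y) pixel coordinates
        # of bright, strongly red pixels as tail-light candidates.
        r = rgb[..., 0].astype(float)
        g = rgb[..., 1].astype(float)
        b = rgb[..., 2].astype(float)
        mask = (r > 180) & (r > 1.5 * g) & (r > 1.5 * b)
        ys, xs = np.nonzero(mask)
        return list(zip(xs.tolist(), ys.tolist()))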
The present invention has been described with particular reference to sensing means 9 which is forward-facing to enable detection and classification of the target vehicle 3 in front of the host vehicle 2 when it is travelling in a forward direction. It will be understood that the invention may be implemented in other configurations, for example comprising sensing means 9 which is side-facing and/or rear-facing. The image processing module 10 could optionally be configured to track movements of the wheels of the target vehicle 3. Any such movements of the wheels of the target vehicle 3 may provide an indication of the operation of the suspension of the target vehicle 3. The terrain inference system 1 may, for example, determine the surface roughness coefficient SRC in dependence on analysis of the behaviour of the vehicle suspension (not shown). For example, the extent and/or frequency of changes in the suspension height may be used to determine the surface roughness coefficient SRC.
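A minimal sketch of deriving a surface roughness coefficient SRC from observed suspension-height changes; the combination of an amplitude term with a zero-crossing frequency proxy, and their product as the coefficient, are assumptions made for this illustration.

    import statistics

    def surface_roughness(suspension_heights, dt):
        # suspension_heights: per-frame estimates of suspension height;
        # dt: frame interval in seconds.
        amplitude = statistics.pstdev(suspension_heights)
        mean = statistics.fmean(suspension_heights)
        signs = [h > mean for h in suspension_heights]
        crossings = sum(1 for a, b in zip(signs, signs[1:]) if a != b)
        frequency = crossings / (2 * len(suspension_heights) * dt)
        return amplitude * frequency  # assumed SRC proxy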
The host vehicle 2 may be configured to transmit the determined terrain characteristics, for example to relay them to another vehicle (discrete from said host vehicle 2 and the target vehicle 3).
CLAIMS:

Claims (24)

1. A terrain inference system comprising a controller configured to:
monitor a target vehicle;
identify an attitude of the target vehicle and/or a movement of the target vehicle; and infer at least one terrain characteristic relating to a region of terrain proximal to the target vehicle in dependence on the identified attitude of the target vehicle and/or the identified movement of the target vehicle.
2. A terrain inference system as claimed in claim 1, wherein the inferred terrain characteristic comprises at least one of the following set: an incline angle, an incline direction, a surface roughness, and a terrain composition.
3. A terrain inference system as claimed in claim 1 or claim 2, wherein the controller is configured to generate a vehicle control parameter in dependence on the at least one inferred terrain characteristic.
4. A terrain inference system as claimed in claim 3, wherein the vehicle control parameter comprises at least one of the following set: a drivetrain control parameter, a transmission control parameter, a chassis control parameter, and a steering control parameter.
5. A terrain inference system as claimed in any one of claims 1 to 4, wherein the controller is configured to output an alert in dependence on the inferred terrain characteristic.
6. A terrain inference system as claimed in any one of the preceding claims, wherein identifying the attitude of said target vehicle comprises identifying at least one of the following set: a target vehicle pitch angle, a target vehicle roll angle, and a target vehicle yaw angle.
7. A terrain inference system as claimed in claim 6, wherein identifying the movement of said target vehicle comprises identifying at least one of the following set: a change in the target vehicle pitch angle, a change in the target vehicle roll angle, and a change in the target vehicle yaw angle.
8. A terrain inference system as claimed in any one of the preceding claims, wherein identifying the movement of said target vehicle comprises identifying at least one of the following set: a vertical movement, a transverse movement, and a longitudinal movement.
9. A terrain inference system as claimed in any one of the preceding claims, wherein identifying the movement of said target vehicle comprises identifying an extension or a compression of a vehicle suspension.
10. A terrain inference system as claimed in any one of the preceding claims, wherein the controller is configured to receive image data from at least one image sensor, the controller being configured to process said image data to identify the attitude of the target vehicle and/or the movement of the target vehicle.
11. A terrain inference system as claimed in any one of the preceding claims, wherein the controller is configured to determine a geographic position of a target vehicle and to map said at least one terrain characteristic in dependence on the determined geographic position.
12. A vehicle comprising a terrain inference system as claimed in any one of the preceding claims.
13. A method of inferring at least one characteristic of the terrain proximal to a target vehicle, the method comprising:
monitoring a target vehicle;
identifying an attitude of the target vehicle and/or a movement of the target vehicle; and inferring said at least one terrain characteristic proximal to the target vehicle in dependence on the identified attitude and/or the identified movement.
14. A method as claimed in claim 13, wherein the inferred terrain characteristic comprises at least one of the following set: an incline angle, an incline direction, a surface roughness, and a terrain composition.
15. A method as claimed in claim 13 or claim 14 comprising generating a vehicle control parameter in dependence on the at least one inferred terrain characteristic.
16. A method as claimed in claim 15, wherein the vehicle control parameter comprises at least one of the following set: a drivetrain control parameter, a transmission control parameter, a chassis control parameter, and a steering control parameter.
17. A method as claimed in any one of claims 13 to 16 comprising outputting an alert in dependence on the inferred terrain characteristic.
18. A method as claimed in any one of claims 13 to 17, wherein identifying the attitude of said target vehicle comprises identifying at least one of the following set: a target vehicle pitch angle, a target vehicle roll angle, and a target vehicle yaw angle.
19. A method as claimed in claim 18, wherein identifying the movement of said target vehicle comprises identifying at least one of the following set: a change in the target vehicle pitch angle, a change in the target vehicle roll angle, and a change in the target vehicle yaw angle.
20. A method as claimed in any one of claims 13 to 19, wherein identifying the movement of said target vehicle comprises identifying at least one of the following set: a vertical movement, a transverse movement, and a longitudinal movement.
21. A method as claimed in any one of claims 13 to 20, wherein identifying the movement of said target vehicle comprises identifying an extension or a compression of a vehicle suspension.
22. A method as claimed in any one of claims 13 to 21 comprising receiving image data from at least one image sensor, and processing said image data to identify the attitude of the target vehicle and/or the movement of the target vehicle.
23. A method as claimed in any one of claims 13 to 22 comprising determining a geographic position of a target vehicle and mapping said at least one terrain characteristic in dependence on the determined geographic position.
24. A non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method claimed in any one of claims 13 to 23.
GB1806629.0A 2018-03-01 2018-04-24 Terrain inference method and apparatus Active GB2571589B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/EP2019/050389 WO2019166142A1 (en) 2018-03-01 2019-01-09 Methods and apparatus for acquisition and tracking, object classification and terrain inference
DE112019001080.8T DE112019001080T5 (en) 2018-03-01 2019-01-09 METHOD AND DEVICE FOR DETECTION AND TRACKING, OBJECT CLASSIFICATION AND TERRAIN INFERENCE
US16/977,065 US20210012119A1 (en) 2018-03-01 2019-01-09 Methods and apparatus for acquisition and tracking, object classification and terrain inference

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IN201811007660 2018-03-01

Publications (3)

Publication Number Publication Date
GB201806629D0 GB201806629D0 (en) 2018-06-06
GB2571589A true GB2571589A (en) 2019-09-04
GB2571589B GB2571589B (en) 2020-09-16

Family

ID=62236139

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1806629.0A Active GB2571589B (en) 2018-03-01 2018-04-24 Terrain inference method and apparatus

Country Status (1)

Country Link
GB (1) GB2571589B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020087345A1 (en) * 2018-10-31 2020-05-07 深圳市大疆创新科技有限公司 Method for controlling ground-based telerobot and ground-based telerobot
CN114643824B (en) * 2022-04-15 2023-10-13 安徽博泰微电子有限公司 Electronic control suspension system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005178530A (en) * 2003-12-18 2005-07-07 Nissan Motor Co Ltd Road surface shape detecting device and method
DE102005051141A1 (en) * 2005-10-26 2007-05-03 GM Global Technology Operations, Inc., Detroit Control method for electronically adjustable damping system in motor vehicle, involves detection of vertically directed motion of preceding vehicle by means of motion detection sensor and determination and transmission of control variable
DE102011007608A1 (en) * 2011-04-18 2012-10-18 Robert Bosch Gmbh Method for active chassis controlling of vehicle i.e. car, involves determining gear control information for regulating wheel suspension system, and performing active controlling of suspension system based on regulated control information
DE102012022367A1 (en) * 2012-11-15 2014-05-15 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Method for regulating and controlling damping- and stabilizing device of motor vehicle, involves determining vertical movement of section of preceding vehicle and determiningfuture surrounding situation of vehicle based on vertical movement
US20170197630A1 (en) * 2016-01-11 2017-07-13 Trw Automotive Gmbh Control system and method for determining an irregularity of a road surface
US20180057003A1 (en) * 2016-09-01 2018-03-01 Samsung Electronics Co., Ltd. Autonomous driving method and apparatus
EP3312029A1 (en) * 2016-10-13 2018-04-25 MAN Truck & Bus AG Method for operating a vehicle and vehicle
WO2018141340A1 (en) * 2017-02-06 2018-08-09 Continental Automotive Gmbh Detection of road unevenness based on a situational analysis
RO132761A2 (en) * 2017-02-06 2018-08-30 Continental Automotive Gmbh System for identification of road bumps based on a case analysis

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210001850A1 (en) * 2018-03-01 2021-01-07 Jaguar Land Rover Limited Vehicle control method and apparatus
US11958485B2 (en) * 2018-03-01 2024-04-16 Jaguar Land Rover Limited Vehicle control method and apparatus
US11390287B2 (en) * 2020-02-21 2022-07-19 Hyundai Motor Company Device for classifying road surface and system for controlling terrain mode of vehicle using the same
GB2611121A (en) * 2021-09-23 2023-03-29 Motional Ad Llc Spatially and temporally consistent ground modelling with information fusion
GB2611121B (en) * 2021-09-23 2024-02-28 Motional Ad Llc Spatially and temporally consistent ground modelling with information fusion

Also Published As

Publication number Publication date
GB201806629D0 (en) 2018-06-06
GB2571589B (en) 2020-09-16

Similar Documents

Publication Publication Date Title
US11772647B2 (en) Control system for a vehicle
GB2571589A (en) Terrain inference method and apparatus
US10106167B2 (en) Control system and method for determining an irregularity of a road surface
US11554778B2 (en) Vehicle speed control
EP2758260B1 (en) Suspension control device
US7366602B2 (en) Roll stability control system for an automotive vehicle using an external environmental sensing system
US20210012119A1 (en) Methods and apparatus for acquisition and tracking, object classification and terrain inference
US11958485B2 (en) Vehicle control method and apparatus
US11603103B2 (en) Vehicle speed control
US11021160B2 (en) Slope detection system for a vehicle
US10611375B2 (en) Vehicle speed control
EP3741638A1 (en) Vehicle control device
WO2018007079A1 (en) Improvements in vehicle speed control
EP3738849A1 (en) Vehicle control device
EP1017036A1 (en) Method and apparatus for detecting deviation of automobile from lane
GB2571590A (en) Vehicle control method and apparatus
JP7152906B2 (en) Steering control device, steering control method, and steering control system
CN111731282A (en) Emergency collision avoidance system considering vehicle stability and control method thereof
CN107972672B (en) Driving assistance system and driving assistance method
GB2571587A (en) Vehicle control method and apparatus
GB2571588A (en) Object classification method and apparatus
JP7216695B2 (en) Surrounding vehicle monitoring device and surrounding vehicle monitoring method
GB2571585A (en) Vehicle control method and apparatus
GB2571586A (en) Acquisition and tracking method and apparatus
GB2584383A (en) Vehicle control system and method