GB2584383A - Vehicle control system and method


Info

Publication number
GB2584383A
GB2584383A (application GB1901749.0A)
Authority
GB
United Kingdom
Prior art keywords
control system
identifying
rut
elongate
processor
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1901749.0A
Other versions
GB2584383B (en)
GB201901749D0 (en)
Inventor
Kotteri Jithesh
Lisa Issac Neenu
Kelly James
Dharmajan Sheela Vishnu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Jaguar Land Rover Ltd
Priority to GB1901749.0A (GB2584383B)
Publication of GB201901749D0
Priority to DE112020000735.9T (DE112020000735T5)
Priority to PCT/EP2020/051683 (WO2020160927A1)
Publication of GB2584383A
Application granted
Publication of GB2584383B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261 Obstacle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 2013/0074 Stereoscopic image analysis
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals

Abstract

The present disclosure relates to a control system 1, preferably for a vehicle, for identifying one or more rut (figure 8: R1, R2) in a driving surface, such as a road, trail, or track (figure 7: SRF). The control system comprises a controller 12 configured to receive image data representing an imaging region. The controller analyses the image data to generate three-dimensional data relating to the imaging region RIMG. The three-dimensional data is analysed to identify one or more elongate section (rut) 23A, 23B having a vertical offset relative to an adjacent section. A rut identification signal is output to mark each identified elongate section as corresponding to a rut. The present disclosure also relates to a vehicle 2 comprising the control system and to a method of identifying one or more rut in a surface. The scanning of the 3D terrain data may be performed by searching cells within the data for a cell that demonstrates a vertical offset to an adjacent cell. The system may determine the depth of the rut and may output a warning if the rut depth is greater than a threshold depth.

Description

VEHICLE CONTROL SYSTEM AND METHOD
TECHNICAL FIELD
The present disclosure relates to a vehicle control system and method. Aspects of the invention relate to a control system for identifying one or more rut in a surface.
BACKGROUND
A rut may be formed in a ground surface by the wheels of a vehicle, particularly if the ground is composed of a deformable medium, such as mud. The rut is usually in the form of an elongated open channel. Depending on the local conditions, the wheels of the vehicle may form left and right ruts which extend substantially parallel to each other. The rut(s) may present an obstacle to a following vehicle and it may be appropriate to configure the powertrain and/or the suspension of the following vehicle to aid progress along the rut(s) or traversal of the rut(s). However, the detection of ruts may prove problematic due to limitations in sensor perception.
For example, optical sensors operating in very bright or very dark conditions may generate false positives.
At least in certain embodiments, the present invention seeks to overcome or address at least some of the limitations associated with known systems.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a control system, a vehicle, and a non-transitory computer-readable medium as claimed in the appended claims.
According to an aspect of the present invention there is provided a control system for identifying one or more rut in a surface, the control system comprising one or more controllers, the control system being configured to: receive image data representing an imaging region; and analyse the image data to generate three dimensional data relating to the imaging region. The control system may be configured to analyse the three dimensional data to identify one or more elongate section having a vertical offset relative to an adjacent section. The control system may be configured to identify the elongate section as having a vertical height which is below that of the adjacent section. The three dimensional data generated by the control system may represent a ground surface (i.e. a surface of the ground within the imaging region). The control system may be installed in a host vehicle.
The control system may output a rut identification signal for identifying each identified elongate section as corresponding to a rut. The rut identification signal may be output to one or more vehicle system, for example via a communication network. The one or more vehicle system may be controlled in dependence on the rut identification signal. At least in certain embodiments, the control system may enable advance detection of the one or more rut (i.e. before the vehicle encounters the rut). The one or more vehicle system may be pre-configured to facilitate progress, for example to enable progress of the vehicle within the identified rut(s), or traversal of the identified rut(s). By way of example, the vehicle powertrain and/or the vehicle suspension may be pre-configured in dependence on the rut identification signal.
Alternatively, or in addition, the rut identification signal may comprise rut data defining one or more characteristic of each identified rut. The rut data may comprise one or more of the following: the location of the rut; a profile of the rut in plan elevation; a depth profile of the rut; and a width profile of the rut. The rut data may be used to generate a graphical representation of the rut, for example to display the rut in relation to the vehicle.
The controller may comprise a processor having an input for receiving the image data; and a memory coupled to the processor and having instructions stored thereon for controlling operation of the processor. The processor may be configured to analyse the image data to generate the three dimensional data. The processor may identify the one or more elongate section having a vertical offset relative to an adjacent section.
The control system may be configured to identify an elongate section having a vertical offset relative to a first adjacent section disposed on a first side thereof; and/or having a vertical offset relative to a second adjacent section disposed on a second side thereof. The identified elongate section may be located at a lower height than the first adjacent section and/or the second adjacent section.
The control system may be configured to analyse the three dimensional data to identify said one or more elongate section by identifying a step change in vertical height relative to the adjacent section.
The control system may be configured to analyse the three dimensional data to identify said one or more elongate section by identifying a vertical offset greater than or equal to a predetermined threshold value.
The control system may be configured to analyse the three dimensional data to identify said one or more elongate section having a width less than a predefined threshold width; and/or a length greater than or equal to a predefined threshold length.
The control system may be configured to analyse the three dimensional data to identify said one or more elongate section having a substantially continuous profile in plan elevation. The elongate section may comprise a curved section and/or a rectilinear section.
The three dimensional data may comprise a plurality of cells. The control system may be configured to analyse the three dimensional data to identify said one or more elongate section by identifying a sequence composed of a plurality of cells. Each cell in the sequence may be vertically offset from at least one adjacent cell.
The control system may be configured to identify first and second said elongate sections as corresponding to first and second ruts. The first and second ruts may form a vehicle track, for example on an unmetalled surface.
The identification of said first and second elongate sections may comprise identifying elongate sections which are substantially parallel to each other. Alternatively, or in addition, the identification of said first and second elongate sections may comprise identifying elongate sections having at least substantially the same depth and/or at least substantially the same width.
The identification of said first and second elongate sections may comprise identifying elongate sections having a predetermined spacing therebetween; or having a spacing therebetween which is within a predetermined range.
The identification of the elongate section may comprise identifying each cell having first and second adjacent cells (disposed on opposing sides thereof) which are at a greater height. The identification of a plurality of said cells forming a continuous or substantially continuous line may represent a rut. This configuration may be indicative of the profile of a rut in a transverse direction.
The control system may be configured to identify a sequence of cells representing a substantially planar surface extending in a horizontal plane. This functionality may be used in conjunction with the other techniques described herein, for example to identify first and second sequences representing respective planar surfaces which extend substantially parallel to each other. The processor could optionally assess whether the first and second sequences represent surfaces at the same vertical height (which may be indicative of first and second ruts in fluid communication with each other).
The control system may be configured to analyse the three dimensional data to determine the vertical offset between the elongate section and the adjacent section to determine a depth of the corresponding rut.
The control system may be configured to output an alert if the determined vertical offset is determined to be greater than or equal to a predetermined threshold.
The image data may be received from first and second imaging sensors. The first and second imaging sensors may, for example, each comprise an optical camera, for example a video camera. The image data may comprise video image data. The imaging sensors may capture the image data at least substantially in real time. Alternatively, or in addition, the three dimensional data may comprise data received from a lidar sensor or a radar sensor. The image data may be received from a suitable sensor array.
According to a further aspect of the present invention there is provided a control system for identifying first and second ruts in a surface, the control system comprising one or more controllers, the control system being configured to: receive image data representing an imaging region; analyse the image data to generate three dimensional data relating to the imaging region; analyse the three dimensional data to identify first and second elongate sections which are substantially parallel to each other; and output a rut identification signal for identifying each identified elongate section as corresponding to a rut.
According to a further aspect of the present invention there is provided a vehicle comprising a control system as described herein.
According to a further aspect of the present invention there is provided a method of identifying one or more rut in a surface, the method comprising: receiving image data representing an imaging region; analysing the image data to generate three dimensional data relating to the imaging region; analysing the three dimensional data to identify one or more elongate section having a vertical offset relative to an adjacent section; and outputting a rut identification signal for identifying each identified elongate section as corresponding to a rut.
The method may comprise identifying said one or more elongate section by identifying a step change in vertical height.
The one or more elongate section may each have a substantially continuous profile in plan elevation.
The three dimensional data may comprise a plurality of cells. The identification of said one or more elongate section may comprise identifying a sequence composed of a plurality of said cells. The cells in the sequence may each be vertically offset from at least one adjacent cell.
The method may comprise identifying first and second said elongate sections corresponding to first and second ruts.
The method may comprise identifying first and second elongate sections which are substantially parallel to each other.
The method may comprise identifying elongate sections having a predetermined spacing therebetween.
The method may comprise determining a vertical offset between the elongate section and the adjacent section to determine a depth of the corresponding rut.
The method may comprise generating an alert if the determined vertical offset is greater than or equal to a predetermined threshold.
The method may comprise receiving the image data from first and second imaging sensors.
According to a further aspect of the present invention there is provided a non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method described herein.
Any control unit or controller described herein may suitably comprise a computational device having one or more electronic processors. The system may comprise a single control unit or electronic controller or alternatively different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term "controller" or "control unit" will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller or control unit, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software saved on one or more memory associated with said controller to be executed on said computational device. The control unit or controller may be implemented in software run on one or more processors. One or more other control unit or controller may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the present invention will now be described, by way of example only, with reference to the accompanying figures, in which: Figure 1 shows a schematic representation of a vehicle comprising a control system in accordance with an embodiment of the present invention; Figure 2 shows a schematic representation of a scanning region of an imaging device provided on the vehicle shown in Figure 1; Figure 3 shows a first image captured by the imaging device shown schematically in Figure 2; Figure 4 shows an elevation map generated by identifying disparities in the images captured by the imaging device; Figure 5 shows a schematic representation of the elevation map shown in Figure 4 differentiating between traversable and un-traversable terrain features; Figure 6 shows a schematic representation of the elevation map shown in Figure 5 incorporating a route of the vehicle; Figure 7 shows a second image captured by the imaging device having a first graphical overlay representing paths of opposing wheels of the vehicle; Figure 8 shows a third image captured by the imaging device having a graphical overlay representing the predicted paths of the left and right wheels of the vehicle; Figure 9A shows a multi-level surface map generated by analysing the third image shown in Figure 8; Figure 9B shows first and second elongate sequences extracted from the multi-level surface map shown in Figure 9A; Figure 10 shows a graphical overlay representing the topographical relief of the ground surface in the third image shown in Figure 8; and Figure 11 is a block diagram representing the implementation of the method described herein.
DETAILED DESCRIPTION
A control system 1 for a vehicle 2 in accordance with an embodiment of the present invention will now be described with reference to the accompanying figures. The vehicle 2 in the present embodiment is an automobile, but it will be understood that the control system 1 may be used in other types of land vehicle. The vehicle 2 is described herein with reference to a reference frame comprising a longitudinal axis X, a transverse axis Y and a vertical axis Z. As illustrated in Figure 1, the vehicle 2 comprises four (4) wheels W1-4, four suspension assemblies S1-4 (each associated with a respective wheel W1-4) and a vehicle body 4. The wheels W1-4 are provided on front and rear axles 5, 6. The first wheel W1 is a front left wheel; the second wheel W2 is a front right wheel; the third wheel W3 is a rear left wheel; and the fourth wheel W4 is a rear right wheel. The vehicle 2 comprises a drivetrain comprising an internal combustion engine 7 drivingly connected to the front axle 5 for transmitting a traction torque to the first and second wheels W1, W2. It will be understood that the internal combustion engine 7 could instead be drivingly connected to the rear axle 6 for transmitting a traction torque to the third and fourth wheels W3, W4. In alternative implementations, the drivetrain may comprise an electric propulsion unit instead of, or in addition to, the internal combustion engine 7.
As described herein, the control system 1 is operable to identify localised relief features formed in a ground surface SRF. The ground surface SRF comprises or consists of the surface of a section of ground over which the vehicle 2 is travelling, such as the surface of an unmetalled road or an off-road track. The control system 1 in the present embodiment is operable to identify relief features comprising a first rut R1 and/or a second rut R2. The first and second ruts R1, R2 each comprise an elongated relief feature, typically in the form of a channel, formed in the ground surface SRF. The first and second ruts R1, R2 may be formed by one or more land vehicle travelling over the ground surface SRF. The ground surface SRF may be particularly susceptible to the formation of first and second ruts R1, R2 if the underlying ground is composed of a deformable medium, such as mud or sand. The first and second ruts R1, R2 in the present embodiment are formed by the left and right wheels of a vehicle traversing the ground surface SRF. Since the transverse distance between the left and right wheels is fixed, the first and second ruts R1, R2 are at least substantially parallel to each other. A spacing between the first and second ruts R1, R2 (in a transverse direction) at least substantially corresponds to an axle (wheel) track (i.e. the transverse distance between the wheels) of the vehicle which formed them. The depth and/or the width of the first and second ruts R1, R2 may increase as a result of the passage of more than one vehicle.
The vehicle 2 comprises an inertial measurement unit (IMU) 8 for determining an orientation of the vehicle body 4. The IMU 8 comprises one or more accelerometer and/or one or more gyroscope. The IMU 8 in the present embodiment determines a pitch angle of the vehicle body 4 about the transverse axis Y and outputs a pitch angle signal S1 to a communication network (not shown) provided in the vehicle 2. The IMU 8 may optionally also determine a roll angle of the vehicle 2 about the longitudinal axis X and output a roll angle signal. A steering wheel sensor 9 is provided for determining a steering angle of the steering wheel (not shown) in the vehicle 2. The steering wheel sensor 9 outputs a steering angle signal S2 to the communication network.
As described herein, the control system 1 is configured to determine a topographical relief of the ground surface SRF. As illustrated in Figure 2, the vehicle 2 comprises an imaging device 10 for capturing image data DIMG representing an imaging region RIMG external to the vehicle 2. The imaging device 10 may be operable to capture the image data DIMG at least substantially in real time. The imaging device 10 may capture a predefined number of frames of image data DIMG per second, for example twenty-four (24) frames per second. The captured image data DIMG is composed of data relating to real-world features within the imaging region RIMG. The imaging region RIMG in the present embodiment extends from 5m to 25m in front of the vehicle 2 in the direction of vehicle travel. A first image IMG1 captured by the imaging device 10 is shown in Figure 3 by way of example. The imaging device 10 is configured such that the imaging region RIMG comprises a region of the ground surface SRF over which the vehicle 2 is travelling. Thus, the captured image data DIMG comprises the ground surface SRF proximal to the vehicle 2. The captured image data DIMG may include one or more obstacle which may impede or prevent vehicle progress. The imaging device 10 in the present embodiment is forward-facing and the imaging region RIMG is located in front of the vehicle 2. The imaging device 10 may be mounted proximal an upper edge of a front windshield, for example behind a rear-view mirror (not shown).
The imaging device 10 in the present embodiment comprises a stereo camera 11 comprising first and second imaging sensors 11-1, 11-2, as shown in Figure 1. The first and second imaging sensors 11-1, 11-2 are respective first and second optical cameras in the present embodiment. The image data DIMG comprises a first set of image data DIMG-1 captured by the first camera 11-1, and a second set of image data DIMG-2 captured by the second camera 11-2. The first and second cameras 11-1, 11-2 are spatially separated from each other but have overlapping fields of view FOV. In the present embodiment, the first and second cameras 11-1, 11-2 operate in the visible spectrum. Alternatively, or in addition, the first and second cameras 11-1, 11-2 may operate in the non-visible spectrum, for example comprising infrared light. Alternatively, or in addition, the imaging device 10 may comprise or consist of a radar imaging device.
The control system 1 comprises a controller 12 for receiving the captured image data DIMG. As shown schematically in Figure 1, the controller 12 includes a processor 13 and a memory 14. A set of computational instructions is stored on the memory 14. When executed, the computational instructions cause the processor 13 to perform the method(s) described herein.
The processor 13 is configured to implement an image processing algorithm to analyse the first and second sets of image data DIMG-1, DIMG-2 to determine characteristics of the ground surface SRF within the imaging region RIMG. The processor 13 identifies disparities between the first and second sets of image data DIMG-1, DIMG-2 and performs range imaging to determine the distance to features within the imaging region RIMG. With reference to known parameters of the stereo camera 11, such as the spatial separation of the first and second cameras 11-1, 11-2, the processor 13 generates three dimensional (3D) data in the form of a point cloud 15 in dependence on the first and second sets of image data DIMG-1, DIMG-2. The point cloud 15 is composed of a plurality of discrete points located on the external surfaces of objects and features within the imaging region RIMG. A transformation is applied to move an origin of the point cloud 15 to a predefined reference point. The transformation moves the point cloud origin from a centre position CP1 of the stereo camera 11 to a reference point defining an origin of a vehicle co-ordinate system. In the present embodiment the reference point is a centre position CP2 of the rear axle (i.e. the position on the vehicle centreline which is coincident with the centre of the rear wheels). The centre position CP2 defines a common centre point of turning of the vehicle 2. The transformation is predefined in dependence on the relative location of the centre positions CP1, CP2. The modified point cloud 15 thereby defines the vertical height of the points relative to the centre of the vehicle rear wheels.
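By way of illustration only, the disparity computation and reprojection steps described above might be sketched as follows. This is a minimal sketch assuming OpenCV and a calibrated stereo rig; the reprojection matrix Q (from stereo calibration) and the 4x4 camera-to-rear-axle transform are assumed to be available, and all function and parameter names are illustrative rather than taken from the patent.

    import cv2
    import numpy as np

    def generate_point_cloud(img_left, img_right, Q, T_cam_to_axle):
        """Sketch: dense disparity from a stereo pair, reprojected to 3D and
        shifted from the camera centre (CP1) to the rear-axle centre (CP2)."""
        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                        blockSize=5)
        gray_l = cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY)
        gray_r = cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY)
        # StereoSGBM returns fixed-point disparities scaled by 16.
        disparity = matcher.compute(gray_l, gray_r).astype(np.float32) / 16.0
        # Reproject valid disparities to 3D camera co-ordinates.
        points = cv2.reprojectImageTo3D(disparity, Q)
        cloud = points[disparity > 0].reshape(-1, 3)
        # Rigid transform (4x4 homogeneous) to the vehicle co-ordinate system.
        homogeneous = np.hstack([cloud, np.ones((cloud.shape[0], 1))])
        return (homogeneous @ T_cam_to_axle.T)[:, :3]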
The processor 13 determines the pitch angle of the vehicle 2 in dependence on the pitch angle signal S1 output by the IMU 8. The processor 13 utilises the vehicle pitch angle and the modified point cloud 15 to form an elevation map corresponding to the imaging region RIMG.
The elevation map is referred to herein as a Multi-Level Surface (MLS) map 17. An example of an MLS map 17 generated from the image data DIMG is shown in Figure 4. The MLS map 17 provides terrain geometry within the imaging region RIMG. The MLS map 17 is composed of a grid comprising a plurality of two-dimensional (2D) cells 18 arranged in a horizontal plane.
The processor 13 generates the MLS map 17 in dependence on the three-dimensional spatial distribution of the points of the modified point cloud 15 within each cell 18. The processor 13 may, for example, generate the MLS map 17 in dependence on a mean vertical height of the points of the modified point cloud 15 within each cell 18, or in dependence on a maximum or minimum vertical height of the points within the modified point cloud 15. A distribution of the modified point cloud 15 within each cell 18 may provide an indication of a localised change in a vertical height of the ground surface SRF. The MLS map 17 may comprise data representing the distribution of the modified point cloud 15 within each cell 18, for example representing a statistical analysis of the vertical distribution of points of the modified point cloud 15 within each cell 18. In the present embodiment, the cells 18 each measure 25cm x 25cm. The resolution of the MLS map 17 may be increased or decreased by changing the dimensions of the cells 18. In a variant, the processor 13 may be configured to determine a gradient (positive or negative) of the terrain in each cell 18. In a variant, the MLS map 17 may comprise a low-poly model of the terrain in the imaging region.
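As a rough illustration of the gridding step, the sketch below bins the point cloud into 25 cm cells and records a per-cell mean height. The cell size and the use of the mean follow the description above; the function name and the NaN convention for empty cells are assumptions.

    import numpy as np

    CELL_SIZE = 0.25  # 25 cm x 25 cm cells, as in the present embodiment

    def build_mls_map(cloud):
        """Sketch: grid (x, y, z) points into 2D cells and store the mean
        vertical height per cell (NaN where a cell receives no points)."""
        ij = np.floor(cloud[:, :2] / CELL_SIZE).astype(int)
        ij -= ij.min(axis=0)                 # shift indices to start at zero
        shape = tuple(ij.max(axis=0) + 1)
        height_sum = np.zeros(shape)
        count = np.zeros(shape)
        np.add.at(height_sum, (ij[:, 0], ij[:, 1]), cloud[:, 2])
        np.add.at(count, (ij[:, 0], ij[:, 1]), 1)
        with np.errstate(invalid="ignore", divide="ignore"):
            return height_sum / count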
The processor 13 in the present embodiment is configured to refine the MLS map 17 by identifying overhang features, such as a branch of a tree or a space under another vehicle, present within the imaging region RIMG. The processor 13 may identify an overhang by identifying two or more points within the modified point cloud 15 having different vertical heights but at least substantially the same horizontal position. If an overhang feature is identified, the processor 13 refines the MLS map 17 in dependence on a vehicle traversability analysis using the difference between the vertical heights. If traversability is positive (i.e. the processor 13 determines that the feature is traversable), the points corresponding to the overhang feature are omitted. If traversability is negative (i.e. the processor 13 determines that the feature cannot be traversed), the points in the two vertical patches are combined and the cell 18 is characterised as representing an obstacle.
The control system 1 is configured to analyse the image data DIMG to identify obstacles within the imaging region RIMG. In the context of the present application, an obstacle may be classified as a physical feature or object which will impede progress of the vehicle 2 or which is deemed to be un-traversable by the vehicle 2. The processor 13 is configured to identify any such obstacles within the MLS map 17. In the present embodiment, the processor 13 identifies an obstacle as a feature which results in a change in terrain height between adjacent cells 18 within the MLS map 17. If the processor 13 identifies a change in terrain height between two or more adjacent cells 18 exceeding a predefined vertical threshold, the processor 13 characterises the identified cell 18 as representing an obstacle. The predefined vertical threshold may, for example, be 25cm or 50cm. The processor 13 could optionally be configured to implement a route planning algorithm for planning a vehicle route in dependence on the determined position and/or size of any identified obstacle(s). It will be understood that the grading of the cells 18 may be refined, for example by defining a plurality of vertical thresholds or classifying the cells 18 in direct proportion to a detected change in terrain height between two or more adjacent cells 18.
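The obstacle grading described above reduces to a neighbour-wise height comparison against the predefined vertical threshold. A minimal sketch, assuming the 25 cm threshold and ignoring map borders (np.roll wraps around; a production version would pad instead):

    import numpy as np

    OBSTACLE_THRESHOLD = 0.25  # predefined vertical threshold (25 cm)

    def classify_obstacles(mls):
        """Sketch: mark a cell as an obstacle if terrain height changes by
        more than the threshold relative to any direct neighbour."""
        obstacle = np.zeros(mls.shape, dtype=bool)
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            step = np.abs(mls - np.roll(mls, (di, dj), axis=(0, 1)))
            obstacle |= np.nan_to_num(step, nan=0.0) > OBSTACLE_THRESHOLD
        return obstacle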
By way of example, an image representing the image data DIMG is shown in Figure 3. The image data DIMG shows an unsurfaced track 19 along which the vehicle 2 is travelling and a tree 20 adjacent to the track 19. The track 19 comprises a dip in which water has collected to form a pool 21. The processor 13 analyses the image data DIMG captured by the imaging device 10 and generates a point cloud 15 which is used to generate the MLS map 17 shown in Figure 4. The features identified through analysis of the image data DIMG are labelled in the MLS map 17 shown in Figure 4. The pool 21 is identified as a region which is at least substantially empty in the image data DIMG. The region behind the tree 20 is obscured from view and is identified in the MLS map 17 as a contiguous extension thereof.
The processor 13 analyses the MLS map 17 to identify obstacles. By way of example, an MLS map 17 is shown in Figure 5 with the cells 18 marked to represent the determination of the processor 13. The cells 18 outside of a field of view FOV of the imaging device 10 are shown unshaded. The cells 18 inside the field of view FOV which are identified as corresponding to traversable terrain (terrain cells) are shown having an intermediate shading. The cells 18 inside the field of view FOV which are identified as corresponding to an obstacle (such as the tree 20 shown in Figure 3) are shown having a dark shading (obstacle cells).
The processor 13 is configured to model a route R for the vehicle 2. The vehicle route R may, for example, be modelled in dependence on the current (i.e. instantaneous) steering angle of the first and second wheels W1, W2. Other implementations of the control system 1 may model the vehicle route R in dependence on a user-specified route and/or a route planning algorithm. The processor 13 determines left and right wheel paths P1, P2 along which the left and right wheels W1-4 will travel respectively. The left and right wheel paths P1, P2 are overlaid onto the MLS map 17 in Figure 6. The processor 13 may take account of changes in the vertical height of the terrain when determining the left and right wheel paths P1, P2. In a variant, the processor 13 may be configured only to analyse the image data DIMG captured by the imaging device 10 in a region along or proximal to the route R to generate the MLS map 17, optionally discarding image data DIMG distal from the route R. A second image IMG2 captured by the imaging device 10 is shown in Figure 7 by way of example. As shown in the second image IMG2, the change in relative height of the left and right wheel paths P1, P2 may be determined as the vehicle 2 progresses along the vehicle route R. A third image IMG3 captured by the imaging device 10 is shown in Figure 8 by way of example. The third image IMG3 comprises an unmetalled track having first and second ruts R1, R2. The left and right wheel paths P1, P2 are overlaid onto the third image IMG3 to show the predicted positions of the left and right wheels W1-4 of the vehicle 2 in relation to the first and second ruts R1, R2.
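The wheel-path prediction can be illustrated with a toy kinematic sketch. A simple bicycle model is assumed here (the patent does not prescribe one), and the wheelbase, wheel track and look-ahead values are illustrative:

    import numpy as np

    def predict_wheel_paths(steer_rad, wheelbase=2.9, track=1.6,
                            lookahead=25.0, n=50):
        """Sketch: left/right wheel paths for the current steering angle,
        using a kinematic bicycle model with a straight-line fallback."""
        s = np.linspace(0.0, lookahead, n)           # distance travelled
        if abs(steer_rad) < 1e-4:
            heading = np.zeros_like(s)
            centre = np.stack([s, np.zeros_like(s)], axis=1)
        else:
            R = wheelbase / np.tan(steer_rad)        # turn radius (rear axle)
            heading = s / R
            centre = np.stack([R * np.sin(heading),
                               R * (1.0 - np.cos(heading))], axis=1)
        # Offset the centreline by half the wheel track, normal to heading.
        normal = np.stack([-np.sin(heading), np.cos(heading)], axis=1)
        return centre + 0.5 * track * normal, centre - 0.5 * track * normal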
The MLS map 17 generated through analysis of the third image IMG3 is shown in Figure 9A.
The MLS map 17 represents the topographical relief of the ground surface SRF identified within the third image IMG3. The processor 13 applies a transform to project the MLS map 17 in a plan elevation, as shown in Figure 9A. The processor 13 analyses the MLS map 17 by performing a height differential analysis. The height differential analysis comprises comparing the height of each cell 18 with the height of each adjacent cell 18 within the MLS map 17. The processor 13 identifies each cell 18 having a height which is offset vertically relative to one or more adjacent cell 18 by a vertical distance greater than or equal to a predefined vertical offset threshold. In the present embodiment, the processor 13 is configured to identify each cell 18 having a height below that of one or more adjacent cell 18 by at least the vertical offset threshold.
The cells 18 identified by the processor 13 as a result of the height differential analysis are referred to herein as step-change cells 18'. In the present embodiment, the vertical offset threshold is defined as 5cm, but larger or smaller vertical offset thresholds may be defined.
The step-change cells 18' each represent a step change (i.e. an abrupt height change over a relatively small distance) in the vertical height of the ground surface SRF, as approximated by the MLS map 17. The processor 13 generates a step-change map 22 comprising each of the step-change cells 18'. By way of example, a step-change map 22 is shown in Figure 9B representing the results of a height differential analysis of the MLS map 17 shown in Figure 9A. In the present embodiment, the step-change map 22 also represents the height differential between adjacent cells 18 and characterises each cell 18 as having a LOW, MEDIUM or HIGH height differential. The processor 13 flags each cell 18 identified in the MLS map 17 as having a HIGH height differential (i.e. a vertical offset greater than or equal to 5cm) and the step-change cells 18' are represented in the map shown in Figure 9B.
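In code, the height differential analysis amounts to comparing each cell with its eight neighbours and flagging cells that sit below a neighbour by at least the 5 cm threshold. A minimal sketch under the same border caveat as above; the LOW/MEDIUM/HIGH grading is omitted for brevity:

    import numpy as np

    STEP_THRESHOLD = 0.05  # vertical offset threshold of 5 cm

    def flag_step_change_cells(mls):
        """Sketch: flag each cell lying below at least one of its eight
        neighbours by the vertical offset threshold or more."""
        max_drop = np.zeros(mls.shape)
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if (di, dj) == (0, 0):
                    continue
                neighbour = np.roll(mls, (di, dj), axis=(0, 1))
                drop = neighbour - mls   # positive where neighbour is higher
                max_drop = np.fmax(max_drop, drop)  # fmax ignores NaN cells
        return max_drop >= STEP_THRESHOLD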
The ruts R1, R2 typically comprise left and right channels (which are formed by the left and right wheels of one or more vehicles). The control system 1 is configured to analyse the step-change map 22 to identify elongate sequences having a profile which at least substantially matches the expected features and characteristics of the ruts R1, R2. The processor 13 analyses the step-change map 22 to identify first and second elongate sections 23A, 23B corresponding to the first and second ruts R1, R2 respectively. The first and second elongate sections 23A, 23B are shown in Figure 10 which shows a graphical overlay 24 on the third image IMG3. The processor 13 analyses the step-change map 22 to identify a plurality of the step-change cells 18' arranged in one or more of the following: a continuous sequence; a substantially continuous sequence; or an interrupted sequence. The continuous sequence may comprise a plurality of the step-change cells 18' arranged in an uninterrupted sequence (i.e. composed of contiguous step-change cells 18'). The substantially continuous sequence may comprise a plurality of step-change cells 18' which are offset from each other in a diagonal direction and/or which are separated from each other by a distance less than or equal to a predefined distance threshold (for example a separation of less than or equal to n cells 18, where n is a whole number less than or equal to one, two or three). The interrupted sequence may comprise one or more continuous sequences and/or one or more substantially continuous sequences which are separated from each other by a distance greater than or equal to a predefined distance threshold (for example a separation of greater than or equal to n cells 18, where n is a whole number greater than or equal to three, four or five).
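One way to group flagged cells into the continuous and substantially continuous sequences described above is 8-connected component labelling, so that diagonal offsets count as continuous. The patent does not prescribe this method; scipy and the minimum-length filter are assumptions:

    import numpy as np
    from scipy import ndimage

    def extract_sequences(step_cells, min_length=8):
        """Sketch: group step-change cells into 8-connected sequences and
        keep only those long enough to be rut-like."""
        labels, n = ndimage.label(step_cells,
                                  structure=np.ones((3, 3), dtype=bool))
        return [np.argwhere(labels == k) for k in range(1, n + 1)
                if (labels == k).sum() >= min_length]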
The processor 13 in the present embodiment is configured to apply a pattern detection algorithm to identify each elongate section forming a continuous line in plan elevation. In particular, the processor 13 applies a curve detection algorithm to detect each sequence (continuous, substantially continuous, or interrupted) of the step-change cells 18' which forms a curve within the MLS map 17. The processor 13 could be configured to identify a curved sequence of the step-change cells 18' as corresponding to one of the first and second ruts R1, R2. In the present embodiment, however, the processor 13 is configured to analyse the MLS map 17 to identify pairs of curved sequences corresponding to the respective first and second ruts R1, R2. In particular, the processor 13 identifies first and second elongate sections forming first and second curves which are at least substantially parallel to each other. The first and second elongate sections identified within the MLS map 17 as being at least substantially parallel to each other are identified as the first and second ruts R1, R2.
The first and second ruts R1, R2 are typically spaced apart from each other by a distance corresponding to a wheel track of a vehicle. To facilitate identification of the first and second ruts R1, R2, an upper wheel track threshold and/or a lower wheel track threshold may be defined. The processor 13 may optionally determine a distance between the first and second elongate sections identified within the MLS map 17. The processor 13 may identify the first and second elongate sections as corresponding to the first and second ruts R1, R2 only if the distance between the first and second elongate sections is less than the upper wheel track threshold and/or greater than the lower wheel track threshold.
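A crude parallelism-and-spacing test for a candidate pair of sequences might look as follows. Near-constant lateral spacing stands in for parallelism, and the wheel-track window and tolerance are illustrative values rather than figures from the patent:

    import numpy as np

    def pair_is_rut_like(seq_a, seq_b, cell=0.25,
                         track_min=1.2, track_max=2.0, tol=0.15):
        """Sketch: accept two cell sequences as a rut pair if their lateral
        spacing is roughly constant and within the wheel-track window."""
        rows = np.intersect1d(seq_a[:, 0], seq_b[:, 0])
        if len(rows) < 5:                     # too little overlap to judge
            return False
        gaps = np.array([abs(seq_a[seq_a[:, 0] == r, 1].mean()
                             - seq_b[seq_b[:, 0] == r, 1].mean()) * cell
                         for r in rows])
        return gaps.std() < tol and track_min < gaps.mean() < track_max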
The processor 13 is configured to output a rut identification signal RSIG in dependence on identification of the first and second ruts R1, R2. The rut identification signal RSIG may, for example, be output to a vehicle communication network. One or more vehicle systems may be controlled in dependence on the output of the rut identification signal RSIG. By way of example, one or more of the following vehicle systems may be controlled: a throttle response; a drivetrain; a vehicle transmission (for example to select a particular gear ratio); a transfer case (for example to select a high or low ratio); an electrical power steering unit (for example to modify a steering ratio and/or to change feedback from the steering wheel); and a suspension system (for example to adjust suspension travel and/or to adjust a damping setting). The processor 13 may output a steering control signal to control a steering angle of the vehicle 2 so as to match or follow the profile of the first and second ruts R1, R2. For example, the steering angle of the vehicle 2 may be controlled such that the wheels W1-4 remain within the first and second ruts R1, R2.
The operation of the control system 1 is illustrated in a flow diagram 100 shown in Figure 11. The imaging device 10 is provided to capture image data DIMG (BLOCK 105) corresponding to an imaging region RIMG in front of the vehicle 2. The first and second cameras 11-1, 11-2 of the imaging device 10 capture respective first and second sets of image data DIMG-1, DIMG-2 (BLOCK 110). The processor 13 generates a disparity image in dependence on the first and second sets of image data DIMG-1, DIMG-2 (BLOCK 115). The processor 13 retrieves known parameters of the imaging device 10 (BLOCK 120) and generates a point cloud in dependence on the disparity image (BLOCK 125). The processor 13 reads the pitch angle signal S1 output by the IMU 8 and determines the pitch angle of the vehicle body 4 (BLOCK 130). The MLS map 17 is generated in dependence on the point cloud 15 and the determined pitch of the vehicle 2 (BLOCK 135). The processor 13 analyses the MLS map 17 to classify the constituent cells 18 as corresponding to either an obstacle (i.e. cannot be traversed by the vehicle 2) or a traversable section of terrain (BLOCK 140). The cells 18 corresponding to an obstacle may optionally be discarded from subsequent analysis. The processor 13 determines the current steering angle of the vehicle 2 (BLOCK 145), for example by reading the steering angle signal S2 published by the steering wheel sensor 9. A vehicle route R is determined in dependence on the current steering angle, and the left and right wheel paths P1, P2 are determined (BLOCK 150).
A height comparison is performed to compare the height of each cell 18 along the left and right wheel paths P1, P2 with the eight adjacent cells 18. The processor 13 determines whether the height differential is greater than or less than the predefined vertical offset threshold (BLOCK 155). If the height differential of a cell 18 is less than the predefined vertical offset threshold, the cell 18 is discarded (BLOCK 160). If the height differential of a cell 18 is greater than the predefined vertical offset threshold, the cell 18 is flagged as a step-change cell 18' (BLOCK 165). A step-change map 22 is created by projecting each step-change cell 18' identified by the processor 13 (BLOCK 170). The step-change map 22 provides a two-dimensional representation of the topographical relief of the ground surface SRF. The processor 13 utilises a curve detection algorithm to detect sequences of the step-change cells 18' which form a curve (BLOCK 175). The processor 13 then analyses the detected curved sequences of step-change cells 18' to identify sequence pairs which are at least substantially parallel to each other (BLOCK 180). If the processor 13 does not identify a pair of sequences which are at least substantially parallel to each other, a determination is made that first and second ruts R1, R2 are not present in the captured image data (BLOCK 185). If the processor 13 identifies a pair of sequences which are at least substantially parallel to each other, a determination is made that first and second ruts R1, R2 are present in the captured image data (BLOCK 190).
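Tying the sketched helpers together in the order of the flow diagram gives the outline below. The pitch correction (BLOCK 130) and the restriction of the search to the wheel paths P1, P2 are omitted for brevity, so this is an approximation of the described pipeline rather than a faithful reproduction:

    def detect_ruts(img_left, img_right, Q, T_cam_to_axle):
        """Sketch: stereo pair in, rut-present decision out, mirroring the
        flow diagram blocks (helper functions as sketched above)."""
        cloud = generate_point_cloud(img_left, img_right, Q, T_cam_to_axle)
        mls = build_mls_map(cloud)                              # BLOCK 135
        obstacles = classify_obstacles(mls)                     # BLOCK 140
        step_cells = flag_step_change_cells(mls) & ~obstacles   # BLOCKS 155-170
        sequences = extract_sequences(step_cells)               # BLOCK 175
        for i, a in enumerate(sequences):                       # BLOCK 180
            for b in sequences[i + 1:]:
                if pair_is_rut_like(a, b):
                    return True                                 # BLOCK 190
        return False                                            # BLOCK 185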
The processor 13 may output a rut detection signal RSIG in dependence on this determination.
The processor 13 may be configured to determine further features of the identified first and second ruts R1, R2. For example, the processor 13 may analyse the MLS map 17 to determine the depth and/or the width of the first and second ruts R1, R2. If the depth of one or both of the first and second ruts R1, R2 exceeds a predefined threshold, the processor 13 may output a notification, for example to warn a driver of a potential risk that the vehicle 2 will become stranded. Alternatively, or in addition, the processor 13 may be configured to identify where the depth of one or both of the first and second ruts R1, R2 is less than a predefined threshold, for example to identify a location for entry into, or exit from, the first and second ruts R1, R2. Alternatively, or in addition, the processor 13 may be configured to determine the height of a (central) ridge between the first and second ruts R1, R2 relative to the first and second ruts R1, R2. If the relative height of the ridge exceeds a predefined threshold, the processor 13 may output a notification, for example to warn a driver of a potential scenario in which the vehicle 2 may be high-centred. The processor 13 may optionally supplement this functionality by detecting one or more obstacle, such as a rock, on the ridge between the first and second ruts R1, R2.
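The depth and ridge checks described in this paragraph might be approximated as follows. The sketch takes the median rut-floor height of each sequence and samples the ridge midway between them; both warning thresholds are illustrative and do not come from the patent:

    import numpy as np

    MAX_RUT_DEPTH = 0.30     # illustrative stranding-risk threshold (m)
    MAX_RIDGE_HEIGHT = 0.20  # illustrative high-centring threshold (m)

    def assess_rut_pair(mls, seq_a, seq_b):
        """Sketch: estimate the offset between the central ridge and the rut
        floor and return warning flags for deep ruts or a tall ridge."""
        floor = min(np.median([mls[i, j] for i, j in seq_a]),
                    np.median([mls[i, j] for i, j in seq_b]))
        rows = np.intersect1d(seq_a[:, 0], seq_b[:, 0])
        ridge_samples = []
        for r in rows:
            # Sample the ridge midway between the two sequences in this row.
            mid = (seq_a[seq_a[:, 0] == r, 1].mean()
                   + seq_b[seq_b[:, 0] == r, 1].mean()) / 2.0
            ridge_samples.append(mls[r, int(np.round(mid))])
        offset = np.median(ridge_samples) - floor  # ridge height above floor
        return {"deep_rut": offset > MAX_RUT_DEPTH,
                "high_centre_risk": offset > MAX_RIDGE_HEIGHT}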
The processor 13 has been described herein as identifying first and second curves which are substantially parallel to each other. Alternatively, or in addition, the processor 13 may identify first and second curved sequences which are spaced apart from each other by a distance within a predefined wheel track range. The wheel track range may, for example, define upper and lower wheel track thresholds. The upper and lower wheel track thresholds may be defined in dependence on the wheel track of the vehicle 2.
It will be appreciated that various modifications may be made to the embodiment(s) described herein without departing from the scope of the appended claims.
The control system 1 has been described herein with reference to the identification of first and second ruts R1, R2. It will be understood that the control system 1 can be modified to identify a single rut R1. Alternatively, or in addition, the control system 1 may be configured to identify larger channels, such as a ditch or a gulley, formed in the ground surface SRF. The techniques described herein in relation to the analysis of the MLS map 17 to identify a rut R1, R2 may be modified to identify the ditch or gulley. For example, the processor 13 may identify a series of step-change cells 18' representing a V-shaped or U-shaped channel. Alternatively, or in addition, the processor 13 may be configured to identify the sides and/or the bottom of the channel within the MLS map 17. Conversely, the control system 1 may be configured to analyse the MLS map 17 to identify a ridge or raised region in the ground surface SRF.
The imaging device 10 has been described herein as comprising first and second imaging sensors 11-1, 11-2. The first and second imaging sensors 11-1, 11-2 have been described as comprising first and second optical cameras. It will be understood that different types of sensors may be used to generate the three dimensional data used in the identification of the one or more rut R1, R2. The imaging system may, for example, comprise or consist of a lidar (Light Detection and Ranging) system for generating the three dimensional data. The lidar system may comprise a laser transmitter and sensor array. Alternatively, or in addition, the imaging device 10 may comprise a radar system operable to generate the three dimensional data.

Claims (23)

CLAIMS:
1. A control system for identifying one or more rut in a surface, the control system comprising one or more controllers, the control system being configured to: receive image data representing an imaging region; analyse the image data to generate three dimensional data relating to the imaging region; analyse the three dimensional data to identify one or more elongate section having a vertical offset relative to an adjacent section; and output a rut identification signal for identifying each identified elongate section as corresponding to a rut.
2. A control system as claimed in claim 1, the controller comprising: a processor having an input for receiving the image data; and a memory coupled to the processor and having instructions stored thereon for controlling operation of the processor; wherein the processor is configured to: analyse the image data to generate the three dimensional data; and identify the one or more elongate section having a vertical offset relative to an adjacent section.
3. A control system as claimed in claim 1 or claim 2, wherein the control system is configured to analyse the three dimensional data to identify said one or more elongate section by identifying a step change in vertical height relative to the adjacent section.
4. A control system as claimed in any one of claims 1, 2 or 3, wherein the control system is configured to analyse the three dimensional data to identify said one or more elongate section having a substantially continuous profile in plan elevation.
5. A control system as claimed in any one of claims 1 to 4, wherein the three dimensional data comprises a plurality of cells, the control system being configured to analyse the three dimensional data to identify said one or more elongate section by identifying a sequence composed of a plurality of said cells, each of the cells in the sequence being vertically offset from at least one adjacent cell.
6. A control system as claimed in any one of the preceding claims, wherein the control system is configured to identify first and second said elongate sections as corresponding to first and second ruts.
7. A control system as claimed in claim 6, wherein identifying said first and second elongate sections comprises identifying elongate sections which are substantially parallel to each other.
8. A control system as claimed in claim 6 or claim 7, wherein identifying said first and second elongate sections comprises identifying elongate sections having a predetermined spacing therebetween.
9. A control system as claimed in any one of the preceding claims, wherein the control system is configured to analyse the three dimensional data to determine a vertical offset between the elongate section and the adjacent section to determine a depth of the corresponding rut.
10. A control system as claimed in claim 9, wherein the control system is configured to output an alert if the determined vertical offset is determined to be greater than or equal to a predetermined threshold.
11. A control system as claimed in any one of the preceding claims, wherein the image data is received from first and second imaging sensors.
12. A vehicle comprising a control system as claimed in any one of the preceding claims.
13. A method of identifying one or more rut in a surface, the method comprising: receiving image data representing an imaging region; analysing the image data to generate three dimensional data relating to the imaging region; analysing the three dimensional data to identify one or more elongate section having a vertical offset relative to an adjacent section; and outputting a rut identification signal for identifying each identified elongate section as corresponding to a rut.
14. A method as claimed in claim 13 comprising identifying said one or more elongate section by identifying a step change in vertical height.
15. A method as claimed in claim 13 or claim 14, wherein said one or more elongate section has a substantially continuous profile in plan elevation.
16. A method as claimed in any one of claims 13, 14 or 15, wherein the three dimensional data comprises a plurality of cells; the identification of said one or more elongate section comprises identifying a sequence composed of a plurality of said cells, each of the cells in the sequence being vertically offset from at least one adjacent cell.
17. A method as claimed in any one of claims 13 to 16 comprising identifying first and second said elongate sections corresponding to first and second ruts.
18. A method as claimed in claim 17, comprising identifying first and second elongate sections which are substantially parallel to each other.
19. A method as claimed in claim 17 or claim 18 comprising identifying elongate sections having a predetermined spacing therebetween.
20. A method as claimed in any one of claims 13 to 19 comprising determining a vertical offset between the elongate section and the adjacent section to determine a depth of the corresponding rut.
21. A method as claimed in claim 20 comprising generating an alert if the determined vertical offset is greater than or equal to a predetermined threshold.
22. A method as claimed in any one of claims 13 to 21 comprising receiving the image data from first and second imaging sensors.
23. A non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method claimed in any one of claims 15 to 22.
GB1901749.0A, filed 2019-02-08 (priority 2019-02-08): Vehicle control system and method. Status: Active. Granted as GB2584383B (en).

Priority Applications (3)

Application Number | Priority Date | Filing Date | Title
GB1901749.0A (GB2584383B) | 2019-02-08 | 2019-02-08 | Vehicle control system and method
DE112020000735.9T (DE112020000735T5) | 2019-02-08 | 2020-01-23 | Vehicle control system and method
PCT/EP2020/051683 (WO2020160927A1) | 2019-02-08 | 2020-01-23 | Vehicle control system and method

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
GB1901749.0A (GB2584383B) | 2019-02-08 | 2019-02-08 | Vehicle control system and method

Publications (3)

Publication Number | Publication Date
GB201901749D0 (en) | 2019-03-27
GB2584383A (en) | 2020-12-09
GB2584383B (en) | 2022-06-15

Family

Family ID: 65997013

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
GB1901749.0A (GB2584383B) | Vehicle control system and method | 2019-02-08 | 2019-02-08 | Active

Country Status (1)

Country Link
GB (1) GB2584383B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111452783A (en) * | 2020-04-29 | 2020-07-28 | 汉腾新能源汽车科技有限公司 (Hanteng New Energy Automobile Technology Co., Ltd.) | Optimization system and method for vehicle running track

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
DE102014205127A1 * | 2013-04-17 | 2014-10-23 | Ford Global Technologies, Llc | Control of the driving dynamics of a vehicle with ruts compensation
US20140347448A1 * | 2012-02-10 | 2014-11-27 | Conti Temic Microelectronic Gmbh | Determining the Characteristics of a Road Surface by Means of a 3D Camera
CN106978774A * | 2017-03-22 | 2017-07-25 | 中公高科养护科技股份有限公司 | Automatic detection method for road surface potholes
GB2563198A * | 2017-03-15 | 2018-12-12 | Jaguar Land Rover Ltd | Improvements in vehicle control


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Shen et al., "Stereo Vision Based Road Surface Preview", IEEE 17th International Conference on Intelligent Transportation Systems (ITSC), IEEE, 2014, pp. 1843-1849. *

Also Published As

Publication number | Publication date
GB2584383B (en) | 2022-06-15
GB201901749D0 (en) | 2019-03-27
