GB2571588A - Object classification method and apparatus - Google Patents
- Publication number
- GB2571588A (application GB1806628.2A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- vehicle
- surface roughness
- object classification
- roughness coefficient
- sensor data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/40—Coefficient of friction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
An object classification system 1 for a vehicle 2 comprises a controller configured to receive data from at least one sensor 13 and process the data to classify one or more objects 4,5 external to the vehicle. The processing of the sensor data is controlled in dependence on a surface roughness coefficient indicative of a roughness of the surface on which the vehicle is travelling. An object classification algorithm may be applied to the sensor data. The object classification algorithm may apply a dynamic classification filter, which may be modified in dependence on the surface roughness coefficient or on the vehicle speed. The controller may either receive or determine the surface roughness coefficient. The sensor data may comprise image data received from an image sensor, and the processing may comprise image processing. The system can be used in vehicles travelling in off-road environments and allows objects and vehicles to be detected more reliably.
Description
OBJECT CLASSIFICATION METHOD AND APPARATUS
TECHNICAL FIELD
The present disclosure relates to an object classification method and apparatus. In particular, but not exclusively, the present disclosure discloses a method and apparatus for classifying an object using dynamic image processing. The present disclosure has particular application in a vehicle, such as an automobile.
BACKGROUND
It is known to provide a host vehicle with an object classification system for detecting and classifying an object, such as a target vehicle, proximal to the host vehicle. Known object classification systems are often used to provide a feature that assists in cruise or traffic-jam situations by maintaining a distance to the target vehicle, typically the vehicle in front. Known object classification systems are usually optimised for on-road conditions, where it is possible to make a number of assumptions with relative certainty. For example, it may be assumed that the host vehicle and the target vehicle are both travelling on a predominantly continuous surface and, accordingly, that the position of the target vehicle will change in a progressive manner between frames of the image data. Furthermore, when the host vehicle is travelling along a road, it is reasonable to assume a predominantly flat surface in which target vehicles will appear within a tight horizon envelope, relative to the host vehicle. These assumptions can be used to weight object classification, with the aim of reducing false positives and incorrect object classification. However, when traversing terrain in an off-road scenario, both the host vehicle and the target vehicle may change elevation, relative angle or other parameters relatively quickly given the low speed of operation. Accordingly, the assumptions applicable for on-road systems cannot reliably be made. This results in many valid targets not being detected or being incorrectly classified.
At least in certain embodiments, the present invention seeks to provide an improved tracking and acquisition apparatus and method.
SUMMARY OF THE INVENTION
Aspects of the present invention relate to an object classification system, a vehicle, a method of classifying an object, and a non-transitory computer-readable medium as claimed in the appended claims.
According to a further aspect of the present invention there is provided an object classification system for a vehicle, the object classification system comprising a controller configured to: receive sensor data from at least one sensor; and process the sensor data to classify one or more objects external to the vehicle;
wherein the processing of the sensor data is controlled in dependence on a surface roughness coefficient, the surface roughness coefficient being indicative of a roughness of a surface on which the vehicle is travelling. The object classification system is operative to differentiate between different objects. For example, the object classification system may classify an object as an obstacle or a target vehicle.
The processing of the sensor data may comprise applying an object classification algorithm to the sensor data. The object classification algorithm may comprise a pattern matching algorithm. The pattern matching algorithm may determine a correlation between one or more features of the sensor data and one or more predefined patterns.
The object classification algorithm may apply a dynamic classification filter. The dynamic classification filter may be modified in dependence on said surface roughness coefficient. Alternatively, or in addition, the dynamic classification filter may be modified in dependence on a vehicle speed.
The controller may be configured to receive said surface roughness coefficient.
Alternatively, the controller may be configured to determine said surface roughness coefficient. The surface roughness coefficient may be generated by analysing image data. The surface roughness coefficient may be generated by analysing a signal from at least one inertial sensor, such as an accelerometer or a gyroscope.
The sensor data may comprise image data received from at least one image sensor. The processing of the sensor data may comprise image processing.
According to a further aspect of the present invention there is provided a vehicle comprising an object classification system as described herein.
According to a further aspect of the present invention there is provided a method of classifying objects external to a vehicle, the method comprising:
determining a roughness of a surface on which the vehicle is travelling; receiving sensor data from at least one sensor provided on the vehicle; and analysing the sensor data to classify one or more objects;
wherein the analysis of the sensor data is modified in dependence on the determined surface roughness.
The method may comprise applying an object classification algorithm to the sensor data. The object classification algorithm applies at least one classification filter.
The object classification algorithm may apply a dynamic classification filter. The method may comprise modifying the at least one classification filter in dependence on one or more operating parameters.
The dynamic classification filter may be modified in dependence on said surface roughness coefficient.
Alternatively, or in addition, the dynamic classification filter may be modified in dependence on a vehicle speed.
The method may comprise receiving the surface roughness coefficient.
Alternatively, the method may comprise determining said surface roughness coefficient. The surface roughness coefficient may be determined by analysing image data. The surface roughness coefficient may be generated by analysing a signal from at least one inertial sensor, such as an accelerometer or a gyroscope.
The method may comprise processing image data received from at least one image sensor.
According to a further aspect of the present invention there is provided a non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method described herein.
The host vehicle may be a land vehicle. The target vehicle may be a land vehicle. The term “land vehicle” is used herein to refer to a vehicle configured to apply steering and drive (traction) forces against the ground. The vehicle may, for example, be a wheeled vehicle or a tracked vehicle.
The term “location” is used herein to refer to the relative position of an object on the surface of the earth. Unless indicated to the contrary, either explicitly or implied by the context, references herein to the location of an object refer to the geospatial location of that object.
It is to be understood that by the term 'type of terrain' is meant the material comprised by the terrain over which the vehicle is driving, such as asphalt, grass, gravel, snow, mud, rock and/or sand. By 'off-road' is meant a surface traditionally classified as off-road, being a surface other than asphalt, concrete or the like. For example, off-road surfaces may be relatively compliant surfaces such as mud, sand, grass, earth, gravel or the like. Alternatively, or in addition, off-road surfaces may be relatively rough, for example stony, rocky, rutted or the like. Accordingly, in some arrangements an off-road surface may be classified as a surface that has a relatively high roughness and/or compliance compared with a substantially flat, smooth asphalt or concrete road surface.
Any control unit or controller described herein may suitably comprise a computational device having one or more electronic processors. The system may comprise a single control unit or electronic controller or, alternatively, different functions of the controller may be embodied in, or hosted in, different control units or controllers. As used herein the term “controller” or “control unit” will be understood to include both a single control unit or controller and a plurality of control units or controllers collectively operating to provide any stated control functionality. To configure a controller or control unit, a suitable set of instructions may be provided which, when executed, cause said control unit or computational device to implement the control techniques specified herein. The set of instructions may suitably be embedded in said one or more electronic processors. Alternatively, the set of instructions may be provided as software stored in one or more memories associated with said controller to be executed on said computational device. The control unit or controller may be implemented in software run on one or more processors. One or more other control units or controllers may be implemented in software run on one or more processors, optionally the same one or more processors as the first controller. Other suitable arrangements may also be used.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the present invention will now be described, by way of example only, with reference to the accompanying figures, in which:
Figure 1 shows a plan view of a host vehicle incorporating an object classification system in accordance with an embodiment of the present invention;
Figure 2 shows a side elevation of the host vehicle shown in Figure 1 incorporating the object classification system in accordance with an embodiment of the present invention;
Figure 3 shows a schematic representation of the object classification system incorporated into the host vehicle shown in Figures 1 and 2;
Figure 4 shows a schematic representation of the determination of the surface roughness coefficient; and
Figure 5 shows an exemplary image captured by the optical sensor and analysed to classify discrete image components in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION
An object classification system 1 in accordance with an embodiment of the present invention will now be described with reference to the accompanying figures.
As illustrated in Figures 1 and 2, the object classification system 1 is installed in a host vehicle 2. The host vehicle 2 is a wheeled vehicle, such as an automobile or an off-road vehicle. The object classification system 1 is operable to detect and classify an object 3 external to the host vehicle 2. The object 3 may, for example, be an obstacle 4 or a target vehicle 5. The target vehicle 5 may, for example, be another wheeled vehicle, such as an automobile or an off-road vehicle. The target vehicle 5 may, for example, be a vehicle travelling in front of the host vehicle 2. For example, the target vehicle 5 may be a lead vehicle or a vehicle in front of the host vehicle 2 in a convoy. In this scenario, the host vehicle 2 may be a following vehicle which is travelling along the same route as the target vehicle 5. The object classification system 1 is described herein with reference to a host vehicle reference frame comprising a longitudinal axis X, a transverse axis Y and a vertical axis Z.
The host vehicle 2 comprises four wheels W1-4. A torque is transmitted to the wheels W1-4 to apply a tractive force to propel the host vehicle 2. The torque is generated by one or more torque generating machines, such as an internal combustion engine or an electric traction machine, and transmitted to the driven wheels W1-4 via a vehicle powertrain. The host vehicle 2 in the present embodiment has four-wheel drive and, in use, torque is transmitted selectively to each of said wheels W1-4. It will be understood that the object classification system 1 could also be installed in a host vehicle 2 having two-wheel drive. The host vehicle 2 in the present embodiment is an automobile having off-road driving capabilities. For example, the host vehicle 2 may be capable of driving on an un-metalled road, such as a dirt road or track. The host vehicle 2 may, for example, be a sports utility vehicle (SUV) or a utility vehicle, but it will be understood that the object classification system 1 may be installed in other types of vehicle. The object classification system 1 may be installed in other types of wheeled vehicles, such as light, medium or heavy trucks. The target vehicle 5 may have the same configuration as the host vehicle 2 or may have a different configuration.
A schematic representation of the object classification system 1 installed in the host vehicle 2 is shown in Figure 3. The object classification system 1 comprises a controller 6 having at least one electronic processor 7 and a memory 8. The processor 7 is operable to receive a data signal S1 from a sensing means 9. As described herein, the processor 7 is operable to process the image data signal S1. In the present embodiment, the processor 7 is configured to implement an image processing module 10 to analyse the image data signal S1. The image processing module 10 in accordance with the present invention is configured to detect and classify objects 3 external to the host vehicle 2. The processor 7 may optionally also control operation of the host vehicle 2 in dependence on the relative location of the target vehicle 5. For example, the processor 7 may be operable to control a target follow distance D1 between the host vehicle 2 and the target vehicle 5. The processor 7 may, for example, output a target follow distance signal SD1 to a cruise control module 11. The cruise control module 11 may be selectively operable in a follow mode suitable for controlling a target speed of the host vehicle 2 to maintain the target follow distance D1 between the host vehicle 2 and the target vehicle 5. The cruise control module 11 may output a target speed signal SV1 to an engine control module 12 which controls the output torque transmitted to the wheels W1-4. The cruise control module 11 may also generate a brake control signal for controlling a braking torque applied to said wheels W1-4. The processor 7 may optionally also output a steering control signal SD1 to control an electronic power assisted steering module (not shown) to control a steering angle of the host vehicle 2.
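The follow-mode behaviour of the cruise control module 11 can be sketched as a simple proportional speed controller that closes the gap between the measured follow distance and the target follow distance D1. The function name, the gain and the speed limit below are illustrative assumptions for explanation only; they do not form part of the disclosed control strategy:

```python
def target_speed(current_speed_mps, follow_distance_m, target_follow_m,
                 gain=0.5, max_speed_mps=20.0):
    """Proportional follow-mode speed demand (illustrative sketch).

    If the measured follow distance exceeds the target follow distance D1,
    the speed demand rises to close the gap; if the host vehicle is too
    close, the demand falls. Gains and limits are assumed values.
    """
    error_m = follow_distance_m - target_follow_m  # positive => too far back
    demand = current_speed_mps + gain * error_m
    # Clamp the demand to a safe, physically meaningful range.
    return max(0.0, min(max_speed_mps, demand))
```

In practice the demand would feed the target speed signal SV1 to the engine control module 12, with a corresponding brake demand when the clamped value falls below the current speed.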
As illustrated in Figure 2, the sensing means 9 is mounted in a forward-facing orientation to establish a detection region in front of the host vehicle 2. The sensing means 9 in the present embodiment comprises at least one optical sensor 13 mounted to the host vehicle 2. The sensing means 9 may comprise a single camera. Alternatively, the sensing means 9 may comprise a stereoscopic camera. The at least one optical sensor 13 may be mounted at the front of the vehicle, for example incorporated into a front bumper or engine bay grille; or may be mounted within the vehicle cabin, for example in front of a rear-view mirror. The at least one optical sensor 13 has a field of view FOV having a central optical axis VX extending substantially parallel to the longitudinal axis X of the host vehicle 2. The field of view FOV is generally conical in shape and extends in horizontal and vertical directions. The at least one optical sensor 13 comprises a digital imaging sensor for capturing image data. The image data comprises an image IMG1 corresponding to a scene within the field of view FOV of the at least one optical sensor 13. The image data is captured substantially in real-time, for example at 30 frames per second. The at least one optical sensor 13 in the present embodiment is operable to detect light in the visible spectrum of light. The sensing means 9 comprises optics (not shown) for directing the incident light onto an imaging sensor, such as a charge-coupled device (CCD), operable to generate image data for transmission in the image data signal S1. Alternatively, or in addition, the sensing means 9 may be operable to detect light outside of the visible light spectrum, for example in the infra-red range to generate a thermographic image. Alternatively, or in addition, the sensing means 9 may comprise a Lidar sensor for projecting a laser light in front of the host vehicle 2. Other types of sensor are also contemplated.
The sensing means 9 is connected to the controller 6 over a communication bus 14 provided in the host vehicle 2. The image data signal S1 is published to the communication bus 14 by the sensing means 9. In the present embodiment, the connection between the sensing means 9 and the controller 6 comprises a wired connection. In alternative embodiments, the connection between the sensing means 9 and the controller 6 may comprise a wireless connection, for example to enable remote positioning of the sensing means 9. By way of example, the sensing means 9 may be provided in a remote targeting system, such as a drone vehicle. The processor 7 is operable to read the image data signal S1 from the communication bus 14. The processor 7 extracts image data from the image data signal S1. The image processing module 10 parses the image data to detect and classify image components IMC(n). The operation of the image processing module 10 is described in more detail below.
As shown schematically in Figure 3, the object classification system 1 in the present embodiment is configured to receive an inertial measurement signal S2 from an inertial measurement unit (IMU) 15 provided in the host vehicle 2. The IMU 15 comprises one or more sensors 16 for measuring inertial movement of the host vehicle 2. The one or more sensors 16 in the present embodiment are configured to measure acceleration of the host vehicle 2 in the longitudinal axis X and the vertical axis Z. The one or more sensors 16 may optionally also measure acceleration of the host vehicle 2 in the transverse axis Y. The one or more sensors 16 may optionally also measure acceleration about one or more of said longitudinal, transverse and vertical axes X, Y, Z. The IMU 15 may determine the orientation of the host vehicle 2, for example one or more of the following set: vehicle pitch angle, vehicle roll angle and vehicle yaw angle. The one or more sensors 16 may comprise one or more accelerometers (not shown) and/or one or more gyroscopes (not shown). The object classification system 1 analyses said inertial measurement signal S2 to determine a surface roughness coefficient SRC. The surface roughness coefficient SRC provides an indication of the roughness of a surface SF over which the host vehicle 2 is currently travelling. The surface roughness coefficient SRC thereby provides an indication of the magnitude or extent of any irregularities or projections in the surface SF over which the host vehicle 2 is travelling. The surface roughness coefficient SRC may, for example, be determined in dependence on the magnitude of oscillations and/or the frequency with which oscillations are detected. The surface roughness coefficient SRC may be determined in dependence on the period of the oscillations, for example differentiating between short-period oscillations and long-period oscillations. 
When the host vehicle 2 is travelling over a rough or irregular surface, the orientation of the host vehicle 2 may change due to localised variations in the height of the vehicle under each of the wheels W1-4. The surface roughness coefficient SRC may be determined in dependence on changes in the orientation of the host vehicle 2, or the rate of change of the orientation of the host vehicle 2.
Conventionally, a terrain surface roughness coefficient is calculated by taking the average of three surface roughness index values. The three surface roughness index values are: a first roughness index (RI) value RI_1, providing a measure of roll acceleration, i.e. changes in the angular velocity of the vehicle about the longitudinal X-axis; a second RI value RI_2, providing a measure of pitch acceleration, i.e. changes in the angular velocity of the vehicle about the transverse Y-axis; and a third RI value RI_3, providing a measure of heave acceleration, i.e. changes in the vertical velocity of the vehicle. In the present embodiment, the object classification system 1 analyses the inertial measurement signal S2 to determine the first, second and third RI values RI_1, RI_2, RI_3. In the present and alternative embodiments the surface roughness coefficient may also be determined, as well or instead, via the use of a sensor package on the vehicle, such as RADAR or SONAR, to provide surface information for a region of the terrain ahead of the vehicle.
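The averaging of the three roughness index values can be sketched as below. The clamp to the [0, 1] range matches the SRC scale used by the system; the scale factor that maps raw index values onto that range is an assumption:

```python
def surface_roughness_coefficient(ri_roll, ri_pitch, ri_heave, scale=1.0):
    """SRC as the mean of the three roughness index values.

    The result is clamped to [0, 1]: 0 corresponds to a smooth metalled
    road, 1 to terrain too rough to traverse. The scale factor mapping
    raw RI values onto this range is an illustrative assumption.
    """
    raw = (ri_roll + ri_pitch + ri_heave) / 3.0
    return max(0.0, min(1.0, raw * scale))
```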
Alternatively, or in addition, the surface roughness coefficient SRC may be determined in dependence on processing of the data received from the sensing means 9. The data may be processed to identify and categorise features contained therein, for example to grade the size and/or frequency of surface irregularities. The data may be processed to detect movements in the at least one optical sensor 13 resulting from movements of the host vehicle 2. Alternatively, or in addition, the surface roughness coefficient SRC may be determined in dependence on analysis of the behaviour of the vehicle suspension (not shown). For example, the extent and/or frequency of changes in the suspension height may be used to determine the surface roughness coefficient SRC. Alternatively, or in addition, the surface roughness coefficient SRC could be predefined, for example by correlating a current geospatial location of the host vehicle 2 to map data defining the surface roughness coefficient SRC.
In the present embodiment, the surface roughness coefficient SRC is in the range zero (0) to one (1), inclusive. The surface roughness coefficient SRC is set equal to one (1) if the surface SF is deemed to be very rough, for example corresponding to terrain that cannot be traversed by the host vehicle 2. The surface roughness coefficient SRC is set equal to zero (0) if the surface SF is deemed to be smooth, for example corresponding to a metalled road surface. The surface roughness coefficient SRC may grade the surface roughness between these endpoints. For example, a moderately rough surface may have a surface roughness coefficient SRC of 0.8. The surface roughness coefficient SRC in accordance with an aspect of the present invention is illustrated in Figure 4 for different surface roughnesses.
The operation of the image processing module 10 will now be described. The image processing module 10 parses the image data from the optical sensor 13 to identify one or more image components IMC(n) within an image IMG1. The image components IMC(n) are preferably persistent features within the image IMG1 detectable within the image data for at least a predetermined time period or over a predetermined number of frames, for example two or more successive frames. In certain embodiments, the image components IMC(n) may comprise an identifiable feature or element contained within the image IMG1, for example comprising a plurality of pixels which are present in successive frames. The image processing module 10 implements an edge detection algorithm to detect edges within the image data. The image processing algorithm may, for example, be configured to identify points where the image brightness comprises discontinuities, particularly those points arranged into linear or curved line segments which may correspond to an edge. The image processing module 10 may apply a brightness threshold (which may be a predetermined threshold or a dynamic threshold) to identify the edges of the image components IMC(n) within the image IMG1. The identified edge(s) may be incomplete, for example in regions where image discontinuities are less pronounced. The image processing module 10 may complete the edges, for example utilising a morphological closing technique, to form a closed region. The or each closed region is identified as a discrete image component IMC(n). By repeating this process, the image processing algorithm may identify each image component IMC(n) contained within the image data.
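The component-identification step above can be illustrated with a deliberately simplified sketch: a brightness threshold followed by grouping of adjacent above-threshold pixels into discrete regions, each region standing in for an image component IMC(n). A production system would use edge detection and morphological closing as described; this flood-fill simplification is our assumption, not the disclosed algorithm:

```python
def find_image_components(image, brightness_threshold=128):
    """Group adjacent bright pixels into discrete image components.

    `image` is a 2-D list of brightness values. Pixels at or above the
    threshold are grouped by 4-connectivity flood fill; each group is
    returned as a list of (row, col) pixels. Simplified stand-in for the
    edge-detection/closing pipeline described in the text.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    components = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= brightness_threshold and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and image[ny][nx] >= brightness_threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                components.append(pixels)
    return components
```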
The image processing module 10 implements a pattern matching algorithm to compare each of the image components IMC(n) identified in the image IMG1 to predefined patterns stored in memory 8. The image processing module 10 classifies each of the image components IMC(n) in dependence on the correlation between each image component IMC(n) and the predefined patterns. The image processing module 10 may, for example, classify each image component IMC(n) as one of the following set: an obstacle 4; a target vehicle 5; a cyclist; a person (not shown); an animal, etc. The image processing module 10 may seek to further classify the objects 3, for example to differentiate between different types of obstacles 4. With reference to the exemplary image IMG1 shown in Figure 5, the image processing module 10 classifies the image components IMC(n) as a first obstacle 4-1, a second obstacle 4-2 and a target vehicle 5. The image processing module 10 could be configured to use the techniques described herein to classify the first obstacle 4-1 as a tree; and the second obstacle 4-2 as a boulder. The image processing module 10 may be unable to classify some of the image components IMC(n) and these may remain unclassified.
The pattern matching algorithm is implemented to determine if any of the image components IMC(n) identified in the image data (partially or completely) match one or more predefined patterns. The predefined patterns may, for example, comprise an object model defined in two dimensions (2-D) or three dimensions (3-D). The predefined patterns may be stored in the memory 8 and accessed by the image processing module 10. The predefined patterns may, for example, correspond to a shape and/or profile of one or more predefined obstacles 4 and/or target vehicles 5. Optionally, the predefined patterns may define a colour of the target vehicle 5, for example specified by a user or identified during an initial calibration procedure. Known pattern matching techniques may be used to determine a correlation between the predefined patterns and the or each image component IMC(n). The determined correlation may be expressed as a correlation value, for example expressed as a percentage value, an integer or a real number. A correlation filter is applied to the determined correlation value. The correlation filter compares the determined correlation value to a correlation threshold to determine if there is a match between the image component IMC(n) and the predefined pattern. The correlation threshold is predefined. If the determined correlation value is lower than the correlation threshold, the image processing module 10 determines that the image component IMC(n) does not correspond to the predefined pattern and outputs an invalid classification signal. If the determined correlation value is greater than the correlation threshold, the image processing module 10 classifies the image component IMC(n) as corresponding to the predefined pattern and outputs a valid classification signal. If the image processing module 10 determines that the image component IMC(n) may correspond to more than one of the predefined patterns, the match providing the highest correlation value is selected.
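The correlation-filter logic above, including best-match selection, can be sketched as follows. A normalised dot product stands in for whatever pattern matching technique is actually used, and the feature-vector representation of components and patterns is an illustrative assumption:

```python
import math

def classify_component(component_vec, patterns, correlation_threshold=0.9):
    """Score a component against each predefined pattern, keep the best.

    `patterns` maps a label to a feature vector. Returns (label, score)
    when the best correlation clears the threshold (valid classification),
    or (None, score) otherwise (invalid classification). The cosine
    similarity used here is a stand-in, not the disclosed matcher.
    """
    def correlation(a, b):
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        if na == 0.0 or nb == 0.0:
            return 0.0
        return sum(x * y for x, y in zip(a, b)) / (na * nb)

    best_label, best_score = None, 0.0
    for label, pattern in patterns.items():
        score = correlation(component_vec, pattern)
        if score > best_score:
            best_label, best_score = label, score
    if best_score >= correlation_threshold:
        return best_label, best_score  # valid classification signal
    return None, best_score            # invalid: component stays unclassified
```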
The correlation filter in the present embodiment is a dynamic filter. As such, the correlation threshold may be varied in dependence on vehicle operating conditions. By raising or lowering the correlation threshold, the level of correlation required between an image component IMC(n) and one of the predefined patterns to trigger a valid classification may be adjusted dynamically. In accordance with an aspect of the present invention, the correlation filter is controlled in dependence on the surface roughness coefficient SRC. In particular, the correlation threshold is inversely proportional to the surface roughness coefficient SRC. If the surface roughness coefficient SRC is high (signifying a rough or irregular surface), the correlation threshold is reduced in order to broaden the classification envelope. Conversely, if the surface roughness coefficient SRC is low (signifying a smooth surface), the correlation threshold is increased in order to narrow the classification envelope. The dynamic control of the correlation threshold can help maintain a low false positive count when the host vehicle 2 is traversing a smooth surface (i.e. having a low surface roughness coefficient SRC), while helping to classify more objects by expanding the detection and weighting criteria when the host vehicle 2 is traversing a rougher surface (i.e. having a higher surface roughness coefficient SRC).
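A minimal sketch of this inverse relationship, assuming the surface roughness coefficient SRC is normalised to the range 0 to 1 (as in the Figure 4 example) and assuming illustrative threshold constants that the patent does not specify:

```python
def dynamic_threshold(src: float, t_smooth: float = 0.9,
                      t_rough: float = 0.6) -> float:
    """Interpolate the correlation threshold between a strict value on a
    smooth surface (SRC = 0) and a relaxed value on a very rough surface
    (SRC = 1). The constants t_smooth and t_rough are assumptions chosen
    for illustration."""
    src = min(max(src, 0.0), 1.0)  # clamp SRC to [0, 1]
    return t_smooth - (t_smooth - t_rough) * src
```

Raising the SRC lowers the threshold and so broadens the classification envelope; lowering the SRC raises the threshold and narrows it, in line with the behaviour described above.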
The object classification system 1 has particular application in an off-road environment. When the host vehicle 2 is travelling off-road, the host vehicle 2 may be subject to sudden changes in direction and/or orientation that make the acquisition and tracking of the target vehicle 5 more challenging. The object classification system 1 may be selectively activated when the host vehicle 2 is travelling off-road, for example in response to a user input or automatically when an off-road driving mode is selected. It will be understood that the surface roughness coefficient SRC may vary as the host vehicle 2 travels along a route. These variations in the surface roughness coefficient SRC are illustrated in Figure 4, which illustrates the surface roughness coefficient SRC in five (5) sections dt1-dt5 of a single route. The image processing module 10 grades the surface SF as being smooth in the first, third and fifth sections dt1, dt3, dt5 of the route and sets the surface roughness coefficient SRC as equal to zero in each of these sections. However, the image processing module 10 determines that the surface SF is very rough in the second section dt2 and sets the surface roughness coefficient SRC as equal to one in this section. In the fourth section dt4, the image processing module 10 determines that the surface SF has an intermediate roughness and sets the surface roughness coefficient SRC as equal to 0.8.
The ability to classify objects 3 over a larger range of operating conditions may facilitate tracking of a target vehicle 5, for example when the host vehicle 2 is following the target vehicle 5 in a convoy. The target vehicle 5 may, for example, be a lead vehicle. The object classification system 1 may track the target vehicle 5 with respect to time. The object classification system 1 may thereby determine a route or path along which the target vehicle 5 is travelling.
The object classification system 1 may track the target vehicle 5, for example to determine the route taken by the target vehicle 5. The object classification system 1 may generate a corresponding target route for the host vehicle 2. At least in certain embodiments, the image processing module 10 may calculate the speed and/or the trajectory of the target vehicle 5.
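The speed and trajectory calculation mentioned above could, for example, take the form sketched below. The timestamped `(t, x, y)` track format is an assumption made for illustration; the patent does not specify how the target vehicle's positions are represented.

```python
import math
from typing import List, Tuple


def speed_and_heading(track: List[Tuple[float, float, float]]
                      ) -> Tuple[float, float]:
    """Estimate the target vehicle's speed (m/s) and heading (radians,
    anticlockwise from the +x axis) from its two most recent timestamped
    positions, given a track of (t, x, y) samples."""
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    speed = math.hypot(x1 - x0, y1 - y0) / dt   # distance / time
    heading = math.atan2(y1 - y0, x1 - x0)      # direction of travel
    return speed, heading
```

Accumulating such samples over time yields the route taken by the target vehicle 5, from which a corresponding target route for the host vehicle 2 could be generated.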
It will be understood that the object classification system 1 may be operable to identify and/or classify one or more objects 3, for example to identify a plurality of target vehicles 5. This may be applicable when the host vehicle 2 is travelling in a convoy, for example to identify and classify one or more target vehicles 5 travelling in front of the host vehicle 2.
It will be appreciated that various modifications may be made to the embodiment(s) described herein without departing from the scope of the appended claims.
The present invention has been described with particular reference to sensing means 9 which is forward-facing to enable detection and classification of objects 3 in front of the host vehicle 2 when it is travelling in a forward direction. It will be understood that the invention may be implemented in other configurations, for example comprising sensing means 9 which is side-facing and/or rear-facing.
The surface roughness coefficient SRC could be determined by a downward-facing camera, for example a camera mounted in a wing mirror of the vehicle.
The object classification system 1 has been described as determining the surface roughness coefficient SRC. It will be understood that the IMU 15 may be configured to determine the surface roughness coefficient SRC. For example, the IMU 15 may grade or otherwise classify the surface SF over which the host vehicle 2 is travelling.
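One way the IMU 15 might grade the surface SF, sketched here purely as an assumption (the patent does not specify the grading method or any constants), is to normalise the spread of its vertical acceleration samples:

```python
from typing import Sequence


def surface_roughness_coefficient(accel_z: Sequence[float],
                                  a_ref: float = 5.0) -> float:
    """Grade the surface from vertical acceleration samples (m/s^2):
    compute the population standard deviation of the samples, normalise
    by an assumed reference spread a_ref, and clamp so the SRC lies in
    [0, 1] (0 = smooth surface, 1 = very rough surface)."""
    n = len(accel_z)
    mean = sum(accel_z) / n
    std = (sum((a - mean) ** 2 for a in accel_z) / n) ** 0.5
    return min(std / a_ref, 1.0)
```

A steady reading (e.g. a smooth road) yields SRC = 0, while large oscillations in vertical acceleration saturate the coefficient at 1, matching the 0-to-1 grading used in the Figure 4 example.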
Claims (19)
1. An object classification system for a vehicle, the object classification system comprising a controller configured to:
receive sensor data from at least one sensor; and process the sensor data to classify one or more objects external to the vehicle;
wherein the processing of the sensor data is controlled in dependence on a surface roughness coefficient, the surface roughness coefficient being indicative of a roughness of a surface on which the vehicle is travelling.
2. An object classification system as claimed in claim 1, wherein processing the sensor data comprises applying an object classification algorithm to the sensor data.
3. An object classification system as claimed in claim 2, wherein the object classification algorithm applies a dynamic classification filter.
4. An object classification system as claimed in claim 3, wherein the dynamic classification filter is modified in dependence on said surface roughness coefficient.
5. An object classification system as claimed in claim 3 or claim 4, wherein the dynamic classification filter is modified in dependence on a vehicle speed.
6. An object classification system as claimed in any one of the preceding claims, wherein the controller is configured to receive said surface roughness coefficient.
7. An object classification system as claimed in any one of claims 1 to 5, wherein the controller is configured to determine said surface roughness coefficient.
8. An object classification system as claimed in any one of the preceding claims, wherein the sensor data comprises image data received from at least one image sensor; and the processing of the sensor data comprises image processing.
9. A vehicle comprising an object classification system as claimed in any one of the preceding claims.
10. A method of classifying objects external to a vehicle, the method comprising:
determining a roughness of a surface on which the vehicle is travelling;
receiving sensor data from at least one sensor provided on the vehicle; and analysing the sensor data to classify one or more objects;
wherein the analysis of the sensor data is modified in dependence on the determined surface roughness.
11. A method as claimed in claim 10 comprising applying an object classification algorithm to the sensor data.
12. A method as claimed in claim 11, wherein the object classification algorithm applies at least one classification filter.
13. A method as claimed in claim 12, wherein the object classification algorithm applies a dynamic classification filter.
14. A method as claimed in claim 13, wherein the dynamic classification filter is modified in dependence on said surface roughness coefficient.
15. A method as claimed in claim 13 or claim 14, wherein the dynamic classification filter is modified in dependence on a vehicle speed.
16. A method as claimed in any one of claims 10 to 15 comprising receiving the surface roughness coefficient.
17. A method as claimed in any one of claims 10 to 15 comprising determining said surface roughness coefficient.
18. A method as claimed in any one of claims 10 to 17, wherein the method comprises processing image data received from at least one image sensor.
19. A non-transitory computer-readable medium having a set of instructions stored therein which, when executed, cause a processor to perform the method claimed in any one of claims 10 to 18.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2019/050389 WO2019166142A1 (en) | 2018-03-01 | 2019-01-09 | Methods and apparatus for acquisition and tracking, object classification and terrain inference |
DE112019001080.8T DE112019001080T5 (en) | 2018-03-01 | 2019-01-09 | METHOD AND DEVICE FOR DETECTION AND TRACKING, OBJECT CLASSIFICATION AND TERRAIN INFERENCE |
US16/977,065 US20210012119A1 (en) | 2018-03-01 | 2019-01-09 | Methods and apparatus for acquisition and tracking, object classification and terrain inference |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN201811007659 | 2018-03-01 |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201806628D0 (en) | 2018-06-06 |
GB2571588A (en) | 2019-09-04 |
GB2571588B (en) | 2020-08-19 |
Family
ID=62236154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1806628.2A (Active) | Object classification method and apparatus | 2018-03-01 | 2018-04-24 |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2571588B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150353095A1 (en) * | 2013-01-14 | 2015-12-10 | Robert Bosch Gmbh | Method and device for assisting a driver of a vehicle when driving on uneven terrain |
US20170151850A1 (en) * | 2015-12-01 | 2017-06-01 | Honda Research Institute Europe Gmbh | Predictive suspension control for a vehicle using a stereo camera sensor |
GB2552024A (en) * | 2016-07-08 | 2018-01-10 | Jaguar Land Rover Ltd | Improvements in vehicle speed control |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210001850A1 (en) * | 2018-03-01 | 2021-01-07 | Jaguar Land Rover Limited | Vehicle control method and apparatus |
US11958485B2 (en) * | 2018-03-01 | 2024-04-16 | Jaguar Land Rover Limited | Vehicle control method and apparatus |
Similar Documents
Publication | Title |
---|---|
US20210012119A1 | Methods and apparatus for acquisition and tracking, object classification and terrain inference |
US7366602B2 | Roll stability control system for an automotive vehicle using an external environmental sensing system |
US8825306B2 | Analyzing road surfaces |
US9643617B2 | Friction coefficient estimation from camera and wheel speed data |
US11958485B2 | Vehicle control method and apparatus |
US10444346B2 | Method for migrating radar sensor limitations with video camera input for active braking for pedestrians |
GB2571589A | Terrain inference method and apparatus |
US11603103B2 | Vehicle speed control |
CN110588623B | Large automobile safe driving method and system based on neural network |
JP7049129B2 | Vehicle control device and vehicle control method |
JP6020729B2 | Vehicle position / posture angle estimation apparatus and vehicle position / posture angle estimation method |
US10611375B2 | Vehicle speed control |
EP3741638A1 | Vehicle control device |
EP1017036A1 | Method and apparatus for detecting deviation of automobile from lane |
GB2571587A | Vehicle control method and apparatus |
CN111731282A | Emergency collision avoidance system considering vehicle stability and control method thereof |
CN112793507A | Blind area early warning braking system based on inertial device perception vehicle right turn motion characteristic |
GB2571590A | Vehicle control method and apparatus |
CN107972672B | Driving assistance system and driving assistance method |
US11059480B2 | Collision avoidance system with elevation compensation |
US11220255B2 | Systems and methods for mitigating trailer instability due to pressure differentials |
GB2571588A | Object classification method and apparatus |
WO2020160927A1 | Vehicle control system and method |
GB2571586A | Acquisition and tracking method and apparatus |
JP4670449B2 | Object detection device |