GB2406948A - Target detection apparatus for a vehicle - Google Patents

Target detection apparatus for a vehicle

Info

Publication number
GB2406948A
GB2406948A
Authority
GB
United Kingdom
Prior art keywords
data
location
sensing means
estimate
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0420487A
Other versions
GB0420487D0 (en)
Inventor
Adam John Heenan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TRW Ltd
Original Assignee
TRW Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TRW Ltd filed Critical TRW Ltd
Publication of GB0420487D0
Publication of GB2406948A

Classifications

    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/08 Systems determining position data of a target, for measuring distance only
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G01S2013/93271 Sensor installation details in the front of the vehicles

Abstract

A target detection apparatus for a vehicle comprises two sensors for detecting an object relative to the vehicle and a processing means. The processing means receives signals from each sensor and applies a weighting to each, combining the weighted data to estimate the location of the object relative to the vehicle. The sensors may each have different characteristics, such as video, RADAR or LIDAR. The processing means may produce an estimate of the location of an object from each sensor, and the weighting may be applied to each estimate. The weighting applied to the first and second sensor estimates may vary as a function, preferably a linear function, of the distance to the object as determined by each sensor. A memory attached to the processor may contain sets of weighting values to be applied.

Description

TARGET DETECTION APPARATUS FOR VEHICLES
This invention relates to improvements in target detection apparatus for vehicles. In particular, but not exclusively, it relates to target detection apparatus for a host vehicle that is adapted to estimate the location of other vehicles relative to the host vehicle.
In recent years the introduction of improved sensors and increases in processing power have led to considerable improvements in automotive control systems. Improvements in vehicle safety have driven these developments, which are approaching commercial acceptance. One example of the latest advances is the provision of a vehicle detection apparatus structured around position sensors, which detect the distance to, or position of, other vehicles on a highway. For convenience, vehicles such as lorries, cars or motorcycles are hereinafter referred to simply as "targets" or "objects".
The detection of a target is typically performed using a video, LIDAR or RADAR based sensor mounted on the host vehicle. The sensor identifies the location of detected objects relative to the host vehicle and feeds this information to a processor.
In accordance with a first aspect, the invention provides an object detection apparatus for a host vehicle, the apparatus comprising: a first sensing means, which provides data indicative of the location of an object relative to the host vehicle; a second sensing means, which provides data indicative of the location of the object relative to the host vehicle; and a processing means arranged to estimate the location of the object relative to the host vehicle by applying a weighting to the data from each of the first and second sensing means and combining the weighted data.
The processing means may be adapted to produce a first estimate of location from the data produced by the first sensing means. This may be produced independently of the data from the second sensing means.
Similarly, it may be adapted to produce a second estimate of location from the data produced by the second sensing means. This may be produced independently of the data from the first sensing means.
The second sensing means may have different performance characteristics to the first sensing means. Typically, the two sensing means may have different accuracy levels which vary differently with distance and/or angle (azimuth) to an object. One may, for example, be optimised for short ranges, and the other for relatively longer ranges.
The processing means may estimate the location of the object by applying a weighting to the estimate of location produced from the data from the first sensor, the weighting depending upon the estimated location from the first sensor data, applying a second weighting to the data from the second sensor, and combining the two weighted estimates to produce an estimate of location. They may simply be combined by taking the average of the two values.
The weighting applied to the first sensor may increase in value as the estimated distance determined by the first sensor increases. The weighting applied to the second sensor may decrease in value as the estimated distance determined by the second sensor increases. This gives preference to data from the first sensor at large distances and to data from the second sensor at small distances.
Use of the weightings ensures that the most reliable raw data from each of the two sensors is given preference. The weightings may vary linearly with distance between a minimum and maximum distance. The minimum and maximum may depend upon the characteristics of the sensing means used.
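By way of illustration only, the linearly varying weighting described above might be sketched as follows. The function name, parameters and clamping behaviour are assumptions made for this sketch, not taken from the specification.

```python
def sensor_weight(distance, d_min, d_max, rising=True):
    """Hypothetical linear weighting ramp, clamped to [0, 1].

    rising=True suits a long-range sensor (weight grows with distance);
    rising=False suits a short-range sensor (weight shrinks with distance).
    d_min and d_max are the sensor-dependent minimum and maximum distances
    between which the weighting varies linearly.
    """
    frac = (distance - d_min) / (d_max - d_min)
    frac = max(0.0, min(1.0, frac))  # clamp to zero/one outside the band
    return frac if rising else 1.0 - frac
```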
The apparatus may include a memory, which can be accessed by the processor and which stores the information needed to allocate the weightings to the estimates from the individual sensing means. This may comprise one or more sets of weighting values. They may be stored in a look-up table, with the correct weighting for an estimate accessed according to its range and the sensing means which produced it. For example, the memory may store a set of weightings corresponding to a plurality of ranges, e.g. 10m, 20m, 30m and 50m. In an alternative, an equation may be held in the memory, which requires as its input a range and the identity of the sensing means, and produces as its output a weighting.
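A minimal sketch of the look-up-table alternative, assuming the nearest tabulated range is used when no exact match is stored; the table values below are invented for illustration.

```python
# Hypothetical table: range in metres -> (long-range weight, short-range weight).
WEIGHT_TABLE = {
    10: (0.0, 1.0),
    20: (0.3, 0.7),
    30: (0.7, 0.3),
    50: (1.0, 0.0),
}

def lookup_weights(distance):
    """Return the stored weighting pair for the nearest tabulated range."""
    nearest = min(WEIGHT_TABLE, key=lambda r: abs(r - distance))
    return WEIGHT_TABLE[nearest]
```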
Thus, in at least one embodiment the invention provides for the combination, or fusion, of information from two different sensing means of differing range-dependent characteristics to enable the location of an object to be determined. The invention enables each sensing means to be dominant over the range and angular position of objects that it is best suited to by weighting the data from the sensing means.
The selection of the weightings to be used for each sensing means can be made using prior knowledge of the characteristics of each sensor, a higher weighting being given to one sensor than to the other at a particular distance to the object if it is the more accurate of the two at that distance.
The weightings may be chosen such that over a particular range of locations one sensing means is given a zero weighting whilst the other has a non-zero weighting. In this case, the estimate of the non-zero-weighted sensing means will be used over that range and the other sensor effectively ignored. This allows different minimum and maximum ranges to be applied to each sensor, with estimates produced from the sensors that lie outside the range being ignored when estimating location.
In a most preferred arrangement, separate estimates are obtained from each of the sensing means for the longitudinal and lateral distance to the object. By longitudinal we mean how far an object is ahead of the host vehicle, and by lateral we mean how far it is displaced to the side. A different set of weightings may be used for lateral distance than for longitudinal distance. This would allow differences in sensor characteristics with range and viewing angle to be taken into consideration.
More advanced techniques for combining the data from the two sensors could be used. For example, the weighting applied to each estimate from a respective sensor could be varied according to one or more variable parameters such as the velocity of the host vehicle, or the relative velocity of the host vehicle and the object.
The first and second sensing means may produce a stream of data over time by capturing a sequence of data frames. The frames may be captured at a frequency of 10Hz or more, i.e. one set of data forming an image is produced every 1/10th of a second or less. Newly produced data may be combined with old data to update an estimate of the position of the objects in the captured data sets.
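The specification does not say how new frames are combined with old data; one plausible sketch, assuming a simple first-order smoothing filter with an invented smoothing factor, is:

```python
def update_position(previous, measurement, alpha=0.3):
    """Blend the latest per-frame measurement into the running estimate.

    alpha is a hypothetical smoothing factor; at a 10 Hz frame rate a new
    measurement arrives every 0.1 s and pulls the estimate toward it.
    """
    if previous is None:  # first frame: no history to blend with
        return measurement
    return previous + alpha * (measurement - previous)
```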
The first and/or second sensing means may comprise a sensor such as a laser range finder, often referred to as a LIDAR type device. This may have a relatively wide field of view - up to, say, 270 degrees. Such a device produces accurate data over a relatively short range of up to, say, 30 metres depending on the application.
The sensing means may alternatively or additionally comprise a video camera, which has a relatively narrow field of view - less than, say, 30 degrees - and a relatively long range of more than 50 metres or so depending on the application. A still further additional or alternative arrangement may comprise RADAR type devices.
The sensing means may be fitted to part of the vehicle although it is envisaged that one sensing means could be remote from the vehicle, for example a satellite image system or a GPS driven map of the road.
Whilst video sensing means and RADAR and LIDAR have been mentioned, the skilled man will appreciate that a wide range of sensing means may be used. A sensing means may comprise an emitter which emits a signal outward in front of the vehicle and a receiver which is adapted to receive a portion of the emitted signal reflected from objects in front of the vehicle, and a target processing means which is adapted to determine the distance between the host vehicle and the object.
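For such an emitter/receiver pair the distance typically follows from the round-trip time of the reflected signal; the specification does not give this formula, but a standard sketch is:

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def time_of_flight_distance(round_trip_seconds):
    """Distance to the reflecting object: half the round-trip path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```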
According to a second aspect, the invention provides a method of estimating the position of an object on a road ahead of a host vehicle comprising: capturing data from a first sensing means and data from a second sensing means; applying a weighting to the data; and fusing the weighted data - or data derived therefrom - captured by both sensing means to produce an estimate of the location of the object relative to the vehicle.
By location it will, of course, be understood that we may simply determine the distance of the object from the host vehicle, and optionally the actual position of the object relative to the vehicle.
The first sensing means may have different performance characteristics to the second sensing means.
The fusion step of the method may include determining an estimate of location from each of the sensors independently, applying a weighting to each estimate of location of the object from each sensor, and combining the weighted estimates to determine an improved estimate of the location.
According to a third aspect the invention provides a computer program which when running on a processor causes the processor to perform the method of the second aspect of the invention.
The program may be distributed across a number of different processors.
For example, the method steps of capturing raw data may be performed on one processor, generating higher level data on another, reconstructing the data on a third, and fusing on a still further processor. These processors may be located in different areas.
According to a fourth aspect the invention provides a processing means which is adapted to receive data from at least two different sensing means, the data being dependent upon the location of an object on a highway on which a vehicle including the processing means is located and which fuses the data from the two sensing means to produce an estimate of the location of the object relative to the vehicle.
The processing means may be distributed across a number of different locations on the vehicle.
There will now be described, by way of example only, one embodiment of the present invention with reference to the accompanying drawings, of which:
Figure 1 illustrates an object detection apparatus fitted to a host vehicle and shows the relationship between the vehicle and an object on a highway as well as the detection regions of its two sensors;
Figure 2 illustrates the fusion of data from the two sensors for a single object; and
Figure 3 is an example of the weightings applied to estimates obtained from the two sensors at a range of distances.
The system of the present invention improves on the prior art by providing for an object detection apparatus that detects the location of objects, such as other vehicles, relative to the host vehicle, by fusing data from two different sensors. This can be used to determine information relating to the longitudinal and lateral position of the host vehicle relative to the object.
The apparatus required to implement the system is illustrated in Figure 1 of the accompanying drawings, fitted to a host vehicle 10. The vehicle is shown as viewed from above on a highway, and is in the centre of a lane having left and right boundaries. A target/object 40 is also shown in the centre of the lane ahead of the host vehicle. In its simplest form, the apparatus comprises two sensing or image acquisition means - one a long range RADAR detector 13 mounted to the front of the host vehicle 10 and the other a short range RADAR detector 14. The long range detector 13 produces a stream of output data, which are fed to a processing board 15.
The processing board 15 captures data from the detector in real time.
The short range detector 14, which is also mounted to the front of the vehicle 10, also provides object identification. Both detectors allow the distance of the detected objects from the host vehicle 10 to be determined, together with the bearing of the object relative to the host vehicle. The output of the short range sensor 14 is also passed to a processing board 16, and the data produced by the two processing boards 15,16 is passed to a data processor 17 located within the vehicle which combines or fuses the image and object detection data.
The fusion ensures that the data from one sensor can take preference over data from the other, or be given more significance than the other, according to the performance characteristics of the sensors and the range at which the data is collected.
As also illustrated in Figure 1 and Figure 2 of the accompanying drawings, the two sensors have different performance characteristics.
The field of view and range of the short range sensor is indicated by the hatched cone 20 projected in front of the host vehicle, viewed from above. The sensor can detect objects within the hatched cone area. The detection area of the long range sensor is similarly illustrated by the unhatched cone shaped area 21. The line A-A' represents the maximum range of the short range sensor, and the line C-C' the minimum range of the long range sensor. It can be seen that these ranges overlap.
For the detection of objects close to the vehicle, the short range sensor is more accurate as it has a very wide field of view, whereas the narrow field of view of the long range sensor makes it less accurate.
On the other hand, when detecting objects at long ranges (>20m), the long range sensor is more accurate than the short range sensor. Of course, the skilled man will understand that the sensors described herein are mere examples, and other types of sensor with different fields of view and ranges could be provided. Indeed, two video sensors could be provided with different fields of view and focal lengths, or perhaps two different LIDAR sensors, or a combination of two different sensor types. The invention can be applied with any two sensors provided they have different performance characteristics.
The processor first analyses the data from the two sensors 13,14 to determine the location of an object 40 relative to the host vehicle 10.
Techniques for producing such estimates are well known to the man skilled in the art. In this embodiment, each sensor produces a pair of estimate values X1,Y1 and X2,Y2 to describe the location of the object relative to the host vehicle. One value of a pair is an estimate Y of the longitudinal distance to the object and the other an estimate X of its lateral distance.
Figure 2 also shows sample estimates from long range radar (represented by a clear square) and the short range radar sensor (represented by a clear triangle) for two targets 40,50 at different ranges. It is clear that the estimates from each sensor, whilst similar in value, are not the same.
In order to fuse the data from the two sensors, each of the values of a pair of estimates is given a weighting according to how reliable the value is believed to be. This weighting is dependent upon the performance characteristics of each sensor and may be a function of distance and/or bearing. Hence, in the example given, estimates from the short-range data are weighted more heavily at near range than the estimates from the long-range data, whilst the long-range data is weighted more heavily in the distance. Typical plots 30,32 of weighting value against range are illustrated in Figure 3 of the accompanying drawings for the two sensors.
As can be seen, the plot 30 of weightings for the short range sensor falls to zero at large distances, indicating that it is to be ignored in the estimate at such ranges due to its low accuracy. Conversely, the long-range sensor is given a zero weighting for any distance below a minimum value. This again reflects its relatively low accuracy with near objects.
Having generated pairs of weighted estimate values X1,Y1 and X2,Y2, where numeral 1 corresponds to the long range sensor and numeral 2 to the short range sensor, the processor may then proceed to fuse or combine the weighted values to estimate the location of the object relative to the host vehicle. The following equations may be used:

Yfused = Y1 + (Wy2 / (Wy1 + Wy2)) x (Y2 - Y1)

and

Xfused = X1 + (Wx2 / (Wx1 + Wx2)) x (X2 - X1)

where: Y1 = longitudinal distance according to the estimate from sensor 1; Y2 = longitudinal distance according to the estimate from sensor 2; Wy1 = longitudinal weighting applied to the estimate from sensor 1; Wy2 = longitudinal weighting applied to the estimate from sensor 2; Yfused = longitudinal distance of the object after fusion.
X1 = lateral distance according to the estimate from sensor 1; X2 = lateral distance according to the estimate from sensor 2; Wx1 = lateral weighting applied to the estimate from sensor 1; Wx2 = lateral weighting applied to the estimate from sensor 2; Xfused = lateral distance of the object after fusion.
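A direct transcription of these equations into code might read as follows; the normalisation by the weight sum reflects the reconstructed equations above and is an interpretation rather than the specification's literal wording.

```python
def fuse(e1, e2, w1, w2):
    """Fuse two scalar estimates (1 = long-range sensor, 2 = short-range).

    With the weights normalised by their sum this is a convex combination,
    so the fused value always lies between the two input estimates.
    """
    if w1 + w2 == 0.0:
        raise ValueError("at least one weighting must be non-zero")
    return e1 + (w2 / (w1 + w2)) * (e2 - e1)

def fuse_location(x1, y1, x2, y2, wx1, wy1, wx2, wy2):
    """Fuse lateral (X) and longitudinal (Y) estimates with separate weights."""
    return fuse(x1, x2, wx1, wx2), fuse(y1, y2, wy1, wy2)
```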
The fused estimates for distance will, generally, be more accurate over a wider range of distances than could be achieved using one sensor alone.
For distances below the minimum range set for the long-range sensor data, the estimate is based on the short-range sensor data only. As distance increases, the estimate will be formed by combining estimates from both sensors and will lie somewhere between the two. Eventually, for objects that are still further away, the range limit of the short-range sensor is exceeded and estimates will match those produced by the long-range sensor alone. As can be seen in the example illustrated in Figure 2, where the fused estimate for the objects 40 and 50 is represented by a solid circle, the fused estimate will be closer to the short-range estimate than to the long-range estimate for an object 40 at close range. For the object 50 at a greater range the opposite is true.
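Tying the hypothetical sketches above together, the three regimes described here (short-range only, blended, long-range only) could be exercised like so, with all distances and readings invented for illustration:

```python
for distance, y_long, y_short in [(5.0, 5.4, 5.1), (25.0, 24.8, 25.3), (60.0, 59.7, 61.0)]:
    w_long = sensor_weight(distance, d_min=10.0, d_max=40.0, rising=True)
    w_short = sensor_weight(distance, d_min=10.0, d_max=40.0, rising=False)
    print(distance, fuse(y_long, y_short, w_long, w_short))
# At 5 m only the short-range sensor contributes; at 25 m the fused value
# lies between the two readings; at 60 m it matches the long-range sensor.
```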

Claims (15)

1. An object detection apparatus for a host vehicle, the apparatus comprising: a first sensing means, which provides data indicative of the location of an object relative to the host vehicle; a second sensing means, which provides data indicative of the location of the object relative to the host vehicle; and a processing means arranged to estimate the location of the object relative to the host vehicle by applying a weighting to the data from each of the first and second sensing means and combining the weighted data.
2. Apparatus according to claim 1 in which the processing means is adapted to produce a first estimate of location from the data produced by the first sensing means and a second estimate of location from the data produced by the second sensing means.
3. Apparatus according to claim 1 or claim 2 in which the second sensing means has different performance characteristics to the first sensing means.
4. Apparatus according to claim 3 in which the processing means estimates the location of the object by applying a weighting to the estimate of location produced from the data from the first sensor, the weighting depending upon the estimated location from the first sensor data, applying a second weighting to the data from the second sensor, and combining the two weighted estimates to produce an estimate of location.
5. Apparatus according to any preceding claim in which the weighting applied to the first sensor increases in value as the estimated distance determined by the first sensor increases.
6. Apparatus according to any preceding claim in which the weighting applied to the second sensor decreases as the estimated distance determined by the second sensor increases.
7. Apparatus according to any preceding claim in which the weightings vary linearly with distance between a minimum and maximum distance.
8. Apparatus according to any preceding claim which includes a memory, which can be accessed by the processor and which stores information needed to allocate the weightings to the estimates from the individual sensing means.
9. Apparatus according to claim 8 in which the memory stores one or more sets of weighting values.
10. Apparatus according to any preceding claim in which separate estimates are obtained from each of the sensing means for the longitudinal and lateral distance to the object.
11. A method of estimating the position of an object on a road ahead of a host vehicle comprising: capturing data from a first sensing means and data from a second sensing means; applying a weighting to the data; and fusing the weighted data - or data derived therefrom - captured by both sensing means to produce an estimate of the location of the object relative to the vehicle.
12. The method of claim 11 in which the fusion step of the method includes determining an estimate of location from each of the sensors independently, applying a weighting to each estimate of location of the object from each sensor, and combining the weighted estimates to determine an improved estimate of the location.
13. A computer program which when running on a processor causes the processor to perform the method of claim 11 or claim 12.
14. A processing means which is adapted to receive data from at least two different sensing means, the data being dependent upon the location of an object on a highway on which a vehicle including the processing means is located and which fuses the data from the two sensing means to produce an estimate of the location of the object relative to the vehicle.
15. An object detection apparatus for a host vehicle substantially as described herein with reference to and as illustrated in the accompanying drawings.
GB0420487A 2003-09-15 2004-09-15 Target detection apparatus for a vehicle Withdrawn GB2406948A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB0321560.5A GB0321560D0 (en) 2003-09-15 2003-09-15 Target detection apparatus for vehicles

Publications (2)

Publication Number Publication Date
GB0420487D0 GB0420487D0 (en) 2004-10-20
GB2406948A 2005-04-13

Family

ID=29227096

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB0321560.5A Ceased GB0321560D0 (en) 2003-09-15 2003-09-15 Target detection apparatus for vehicles
GB0420487A Withdrawn GB2406948A (en) 2003-09-15 2004-09-15 Target detection apparatus for a vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB0321560.5A Ceased GB0321560D0 (en) 2003-09-15 2003-09-15 Target detection apparatus for vehicles

Country Status (1)

Country Link
GB (2) GB0321560D0 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5177462A (en) * 1990-03-28 1993-01-05 Mitsubishi Denki K.K. Car interval control apparatus
DE19501612A1 (en) * 1995-01-20 1996-07-25 Bayerische Motoren Werke Ag Method for measuring distance between motor vehicle and other objects
EP0892281A2 (en) * 1997-07-17 1999-01-20 Volkswagen Aktiengesellschaft Method and system for determining the driving situation of a vehicle
DE19948254A1 (en) * 1999-10-07 2001-11-15 Bayerische Motoren Werke Ag Detecting state of system for automatic motor vehicle control involves not displaying sensor fouling or obscuring derived from probability indication if object stability gradient positive
WO2001061377A2 (en) * 2000-02-16 2001-08-23 Altra Technologies Incorporated Scalable sensor systems based on sensor modules
JP2002274301A (en) * 2001-03-19 2002-09-25 Nissan Motor Co Ltd Obstacle detector
WO2004027451A2 (en) * 2002-09-18 2004-04-01 Bendix Commercial Vehicle Systems, Llc Vehicular situational awareness system
WO2004093028A2 (en) * 2003-04-07 2004-10-28 Robert Bosch Gmbh Method and arrangement for controlling a driving aid

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011003487A1 (en) * 2009-07-09 2011-01-13 Wabco Gmbh Method for correctly carrying out autonomous emergency braking in a road vehicle
CN102427976A (en) * 2009-07-09 2012-04-25 威伯科有限公司 Method for correctly carrying out autonomous emergency braking in a road vehicle
CN102427976B (en) * 2009-07-09 2016-03-23 威伯科有限公司 For the method implementing automatic emergency brake correct in on-road vehicle
US9428163B2 (en) 2009-07-09 2016-08-30 Wabco Gmbh Autonomous vehicle emergency braking method
DE102011085544A1 (en) * 2011-11-02 2013-05-02 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Vehicle e.g. car has controller that is provided to activate warning unit, if vehicle velocity is in specific speed range and when obstruction behind vehicle portion is detected by sensor system
EP2787496A4 (en) * 2011-11-30 2015-06-17 Hitachi Automotive Systems Ltd Object detection device
CN102837658A (en) * 2012-08-27 2012-12-26 北京工业大学 Intelligent vehicle multi-laser-radar data integration system and method thereof
CN102837658B (en) * 2012-08-27 2015-04-08 北京工业大学 Intelligent vehicle multi-laser-radar data integration system and method thereof
US20220113405A1 (en) * 2020-10-14 2022-04-14 Argo AI, LLC Multi-Detector Lidar Systems and Methods

Also Published As

Publication number Publication date
GB0420487D0 (en) 2004-10-20
GB0321560D0 (en) 2003-10-15

Similar Documents

Publication Publication Date Title
Stiller et al. Multisensor obstacle detection and tracking
US9151626B1 (en) Vehicle position estimation system
US9863775B2 (en) Vehicle localization system
US20060220912A1 (en) Sensing apparatus for vehicles
US9610961B2 (en) Method and device for measuring speed in a vehicle independently of the wheels
CN105393135B (en) The determination of the pitching error angle of the radar sensor of motor vehicle
US10836388B2 (en) Vehicle control method and apparatus
US10317522B2 (en) Detecting long objects by sensor fusion
US20150378015A1 (en) Apparatus and method for self-localization of vehicle
JP2020091281A (en) Method and apparatus for processing radar data
US20090122136A1 (en) Object detection device
US20080106462A1 (en) Object detection system and object detection method
US11257369B2 (en) Off road route selection and presentation in a drive assistance system equipped vehicle
US9123251B2 (en) Image system for automotive safety applications
JP2004534947A (en) Object location system for road vehicles
CN109839636B (en) Object recognition device
US11158192B2 (en) Method and system for detecting parking spaces which are suitable for a vehicle
JP7366695B2 (en) Object recognition method and object recognition device
US20230008630A1 (en) Radar device
WO2018212287A1 (en) Measurement device, measurement method, and program
WO2019084130A1 (en) Method and system of digital light processing and light detection and ranging for guided autonomous vehicles
US6947841B2 (en) Method for identifying obstacles for a motor vehicle, using at least three distance sensors for identifying the lateral extension of an object
GB2406948A (en) Target detection apparatus for a vehicle
US11914028B2 (en) Object detection device for vehicle
US8035548B2 (en) Evaluation method, particularly for a driver assistance system of a motor vehicle, for object detection using a radar sensor

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)