US5343206A - Method and means for avoiding collision between a motor vehicle and obstacles - Google Patents

Method and means for avoiding collision between a motor vehicle and obstacles Download PDF

Info

Publication number
US5343206A
US5343206A (application US08/001,395)
Authority
US
United States
Prior art keywords
road
motor vehicle
vehicle
obstacle
path
Prior art date
Legal status
Expired - Lifetime
Application number
US08/001,395
Inventor
Ermanno Ansaldi
Stefano Re Fiorentin
Andrea Saroldi
Current Assignee
Fiat Auto SpA
Original Assignee
Fiat Auto SpA
Priority date
Filing date
Publication date
Application filed by Fiat Auto SpA
Priority to US08/001,395
Application granted
Publication of US5343206A
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 — Radar-tracking systems; analogous systems
    • G01S13/72 — Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 — Two-dimensional radar tracking by using numerical data
    • G01S13/88 — Radar or analogous systems specially adapted for specific applications
    • G01S13/89 — Radar or analogous systems specially adapted for mapping or imaging
    • G01S13/93 — Radar or analogous systems specially adapted for anti-collision purposes
    • G01S13/931 — Anti-collision radar for land vehicles
    • G01S2013/932 — Anti-collision radar for land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
    • G01S2013/9322 — Anti-collision radar for land vehicles using additional data, e.g. driver condition, road state or weather data
    • G01S2013/9329 — Anti-collision radar for land vehicles cooperating with reflectors or transponders

Definitions

  • This invention relates to a method and means for avoiding collision between a driven motor vehicle and obstacles lying in its path.
  • The object of this invention is to provide a method and means which overcome the disadvantages of known arrangements, reducing false alarms to a minimum and allowing effective display of the road situation.
  • According to the invention there is provided a method for avoiding collision between a driven motor vehicle and at least one obstacle or object. The method includes the stages of: detecting possible obstacles in an area in front of the motor vehicle on the basis of the presumed path of the vehicle, calculating the distance and the relative speed of the obstacles found, and indicating the obstacles found with an indication of their hazard level, the said stages being repeated cyclically; it is characterised in that the said stage of detecting possible obstacles comprises the stages of generating a map of the environment in front of the vehicle by means of scanning the said environment, reconstructing the geometry of the road on the basis of this map, reconstructing the presumed path of the vehicle on the basis of the road geometry and detecting any obstacles within the presumed path.
  • This invention also relates to a system for avoiding collision between a driven motor vehicle and at least one obstacle or object, comprising a sensor capable of monitoring an area in front of the vehicle in order to detect the presence of objects, means for calculating the distance and relative speed of the objects found, and means for signalling the presence of objects found, indicating their hazard level, characterised in that the said sensor is a scanning sensor capable of generating a map of the environment in front of the motor vehicle, and in that it comprises means capable of reconstructing the geometry of the road on the basis of the said map, means for reconstructing the presumed path of the vehicle on the basis of the road geometry and means for detecting any obstacles within the presumed path.
  • FIG. 1 is a plan view of a section of road being travelled by a vehicle
  • FIG. 2 is a block diagram of the means according to this invention.
  • FIG. 3 is a flow chart of the method used by the means in FIG. 2,
  • FIGS. 4 and 5 are two plan views of curved sections of road being travelled by a motor vehicle
  • FIG. 6 shows a radar map which can be obtained with the anticollision means and method in FIGS. 2 and 3, and
  • FIG. 7 shows a representation of the road environment in front of a vehicle obtained using the means and method in FIGS. 2 and 3.
  • the method according to the invention provides for the reconstruction, for a vehicle 1 on a stretch of road 3, of a path 10 which on the one hand can be physically followed by the vehicle itself and on the other hand is plausible in the specific situation, i.e. it takes into account the geometry of the road and the manoeuvre imposed by the vehicle itself.
  • This method provides for the acquisition of a map of the road situation so as to detect objects placed at the side of the road, from which the edges of the road can be determined; any obstacles which may be present and the position of the vehicle with respect to the road are also detected, and data on the speed and angle of turn of the vehicle are obtained.
  • This method is based on the assumption that, in the absence of evidence to the contrary, the vehicle will tend to follow the flow of traffic, i.e. will tend to keep to a path parallel to the line of the road.
  • Three different levels of approximation to the path are provided in order to take account of different possible situations and the data available from time to time.
  • The lowest level of approximation consists of determining the path on the basis of the position of the steering wheel; in this case C_0 is equal to the curvature set via the steering wheel and C_1 is equal to 0.
  • A greater level of approximation consists of assuming that the path of the vehicle will be maintained parallel to the direction of the road; it therefore requires knowledge of the geometry of the road itself, so that the curvature of the road and its spatial derivative within a predetermined distance can be calculated.
  • A third level of approximation is obtained by attempting to understand the manoeuvre performed by the vehicle: in particular, when the axis of the vehicle is found to form a non-zero angle with respect to the direction of the road, to understand whether this deviation represents a non-significant fluctuation or is due to a deliberate manoeuvre by the driver.
  • The third level is therefore designed to eliminate false alarms due to small fluctuations in the direction of the vehicle as it travels, while providing correct indications in the case of manoeuvres which do not follow the line of the road, such as overtaking or entering a side road.
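The three levels above amount to a selection rule for the presumed-path parameters C_0 (curvature) and C_1 (its spatial derivative). A minimal sketch in Python, with illustrative function and parameter names not taken from the patent:

```python
def presumed_path(level, steering_curvature, road=None, c1_manoeuvre=0.0):
    """Return (C0, C1) of the presumed path for a given approximation level.

    level 1: steering wheel only -> C0 from the steering wheel, C1 = 0
    level 2: follow the road     -> C0, C1 taken from the road geometry
    level 3: manoeuvre detected  -> C1 chosen to realign with the road axis
    """
    if level == 1 or road is None:
        return steering_curvature, 0.0       # lowest level of approximation
    if level == 2:
        return road["C0s"], road["C1s"]      # path parallel to the road
    # level 3: keep the curvature set by the driver, apply the curvature
    # derivative computed to bring the vehicle back in line with the road
    return steering_curvature, c1_manoeuvre
```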
  • The determination of a presumed path appropriate to the specific situation, as indicated, makes it possible to define a travel corridor whose width is that of the vehicle; the radar map found is then "reshaped" on the basis of this travel corridor in order to check whether any objects have been found within it.
  • These objects represent true obstacles, unlike the other objects on the radar map which lie outside the travel corridor and which, if the vehicle continues on its path, will not be encountered.
  • Continuous updating of the situation, with determination of the map at a predetermined frequency and corresponding reconstruction of the presumed path, makes it possible to follow changes in the path of the vehicle itself.
  • the "hazard level" of these obstacles is evaluated, taking into account the safe distance (which depends on the speed of the vehicle, the maximum deceleration which can be achieved by braking and the reaction time of the driver/vehicle system) and the relative speed of the obstacles.
  • a forecast representation of the road situation in front of the vehicle is then provided showing potentially hazardous obstacles, and a signal which may also be of an acoustic type is provided. Real time display of the situation, with continuous updating, therefore enables the driver to keep the road situation continuously in view and to adjust his behaviour correspondingly.
  • Means 13 includes a central unit 15 for data processing, reconstruction of the path and graphic computation of the road situation, a plurality of sensors for detecting the parameters required for the computation, and display and acoustic devices to represent the road situation and signal alarm situations.
  • central unit 15 includes: an input/output unit 16, a data processing unit 17 connected to input/output unit 16, a data memory 18 connected to data processing unit 17, a graphics computing unit 19 connected to data processing unit 17, and a graphics data memory unit 20 connected to the corresponding graphics computing unit 19.
  • Input/output unit 16 of central unit 15 is connected by a bidirectional line 21 to a scanning radar 22 which is equipped with an antenna 23 and is capable of generating a radar map of the portion of space lying in front of the vehicle; it is also connected by a one-way line 24 to a sensor 25 detecting the angle of steering wheel 26 of the vehicle on which means 13 is mounted, by a one-way line 27 to the vehicle's speedometer 28, and by a one-way line 29 to a sensor 30 which detects whether the windscreen wipers are in operation.
  • Radar 22 has a horizontal scanning angle of approximately 40° centred along the axis of the vehicle and divided into a plurality of sectors, and an elevation of 5°-6°; this offers a good view of the scene, so that the required behaviour can be evaluated even on inclines or bumps, without excessively loading the system with data which are useless (or downright misleading) for the recognition of obstacles, in particular objects which are off the road or above it (bridges and suspended signals).
  • Radar 22 calculates the distance of the obstacles and generates a radar map, i.e. a representation, in terms of angle and distance, of the objects detected in the area in front of the vehicle.
  • This map is provided, together with the steering angle detected by sensor 25, the speed of the vehicle provided by speedometer 28 and the state (wet or dry) of the road surface (detected in the embodiment illustrated by sensor 30), to central unit 15, which processes the data received (as will be seen later on the basis of FIGS. 3 to 7), generates a perspective representation of the scene in front of the vehicle for the purposes of displaying and detecting obstacles, and also controls various acoustic alarm signals.
  • Central unit 15 is connected via a one-way line 31 to an acoustic warning device 32, and graphics computing unit 19 is connected via a one-way line 33 to a graphics display unit 34, for example a liquid crystal display.
  • a preferred embodiment of the means for avoiding collisions will now be described with reference to FIG. 3, showing the sequence of stages repeated at a predetermined frequency in order to provide a continual update of the detection and display of obstacles on the basis of the actual situation.
  • radar 22 constructs the radar map, as previously described, and passes it by line 21 to central unit 15 (map input block 35).
  • Data processing unit 17 filters the detection data obtained, in particular by means of an alpha-beta filter of a known type (see for example: Kalata, "The Tracking Index: a generalised parameter for alpha-beta and alpha-beta-gamma target trackers", IEEE Transactions on Aerospace and Electronic Systems, vol. 20, no. 2, March 1984).
  • Data processing unit 17 then calculates (block 38) the safe distance using the equation: d_s = v·T_r + v²/(2·A_max), where v is the speed of the vehicle.
  • T_r represents the reaction time of the vehicle/driver system and is a fixed value, for example 1 second, corresponding to a mean value determined on the basis of specific investigations.
  • A_max represents the maximum deceleration produced by hard braking and depends on speed and road conditions.
  • The parameter A_max (which is also governed by regulations) is generally read from a suitable table stored in data memory 18, on the basis of the vehicle speed v and the road surface conditions found.
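Under these definitions the safe-distance computation of block 38 can be sketched as follows; the table values below are placeholders, not those stored in data memory 18.

```python
# Placeholder for the A_max table stored in data memory 18:
# maximum deceleration (m/s^2) as a function of road surface state.
A_MAX_TABLE = {"dry": 7.0, "wet": 4.0}

def safe_distance(v, road_state, t_r=1.0):
    """d_s = v*T_r + v^2/(2*A_max): distance covered during the driver's
    reaction time plus the braking distance under hard braking."""
    a_max = A_MAX_TABLE[road_state]
    return v * t_r + v * v / (2.0 * a_max)
```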
  • The central unit then attempts to reconstruct the geometry of the road in a reference system centred on the vehicle, calculating its five characteristic parameters (block 39).
  • The road can be described by the parameters ψ, L_s, L_d, C_0s and C_1s, where ψ represents the "heading angle", i.e. the angle between the axis of the vehicle and the direction of the road.
  • This method in practice searches for the road parameters best approximating to the reflectors present at the edges of the road.
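That search can be pictured as scoring candidate parameter sets against the reflector positions. A sketch under the assumption of a small-angle clothoid road model, with L_s and L_d read here as the lateral offsets of the two road edges (an illustrative reading, not the patent's exact formulation):

```python
def road_axis_offset(x, psi, c0, c1):
    # lateral displacement of the road axis at longitudinal distance x,
    # small-angle approximation of a clothoid (curvature C0s + C1s*x)
    return psi * x + c0 * x**2 / 2.0 + c1 * x**3 / 6.0

def fit_cost(reflectors, psi, c0, c1, l_s, l_d):
    """Sum of squared lateral distances of each reflector (x, y) to the
    nearer road edge; the parameter set minimising this cost is the one
    best approximating the reflectors at the edges of the road."""
    cost = 0.0
    for x, y in reflectors:
        axis = road_axis_offset(x, psi, c0, c1)
        cost += min((y - (axis + l_s)) ** 2, (y - (axis - l_d)) ** 2)
    return cost
```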
  • The safe area within which potentially hazardous obstacles have to be sought extends not only in depth (the safe distance) but also in width, the latter being determined by manoeuvrability, i.e. the possibility of performing "minor" manoeuvres corresponding to small changes in the angle of the steering wheel. Changes in the steering wheel angle make it possible to compute different paths within a band called the manoeuvrability zone; this zone is wide at low speeds and narrow at high speeds.
  • a vehicle 100 is travelling on a section of road 102 divided by an intermediate dashed line 103 into two carriageways 104 and 105.
  • Vehicle 100, which lies on carriageway 104, may find the crash barrier (detected in this area as thick line 106) to be an obstacle if its path is not suitably corrected.
  • The severity of the hazard is linked with the speed of the vehicle: at high speed the vehicle is forced to make a sharp manoeuvre in order to avoid the obstacle, whereas at low speed it can change its direction of movement less brusquely without a risk of collision.
  • The path must therefore be reconstructed making a distinction between the two possibilities: in the first case the estimated travel corridor within which possible obstacles have to be sought will be that shown by dashed portion 107 in the figure, while in the second case reconstruction of the path will result in the path indicated by dotted and dashed line 108.
  • The spatial derivative of curvature C_1a which is necessary to return the vehicle to a direction axial with the road at a predetermined distance D associated with the speed of the vehicle is calculated in block 42 (this distance is read from a suitable table stored in data memory 18).
  • C_1a is obtained in the following way: the "heading angle" ψ(l) varies along the path in accordance with a relationship resulting from the difference between the angle of the axis of the road and the angle of the axis of the vehicle; imposing that this angle become zero at distance D yields C_1a, i.e. the manoeuvre required to bring the axis of the vehicle into line with the axis of the road at distance D.
  • This value is then compared (block 43) with a threshold value C_1amax (determined experimentally and also stored in memory 18), which can be used to distinguish whether the manoeuvre which has to be performed is slight or sharp.
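Under the assumption (not spelled out in the text reproduced here) that both curvatures vary linearly with arc length, the condition that the heading angle vanish at distance D gives C_1a in closed form. The derivation below is a reconstruction under that assumption, not the patent's own equation:

```python
def c1a_required(psi, c0s, c1s, c0a, d):
    """C_1a such that the heading angle becomes zero at distance D.

    Assumed model: road curvature Cs(l) = C0s + C1s*l, vehicle path
    curvature Ca(l) = C0a + C1a*l, and d(psi)/dl = Cs(l) - Ca(l), so
        psi(D) = psi + (C0s - C0a)*D + (C1s - C1a)*D**2/2 = 0,
    which solved for C1a gives the value returned below.
    """
    return c1s + 2.0 * (psi + (c0s - c0a) * d) / d**2

def manoeuvre_is_sharp(c1a, c1a_max):
    # block 43: the manoeuvre is slight if |C_1a| stays below the threshold
    return abs(c1a) > c1a_max
```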
  • The radar map is reshaped on the basis of the shape of the travel corridor centred on the foreseen path which has just been constructed, which has a width equal to that of the vehicle, and a check is made to see whether there are any obstacles within this corridor (block 46).
  • For example, if radar 22 provides a radar map such as that illustrated in FIG. 6, the calculated travel corridor 119 follows the road profile and intercepts another vehicle 120 which is travelling in the same direction as vehicle 100 after bend 118, while vehicle 121, travelling on the opposite carriageway, and tree 122, which is off the road, lie outside corridor 119 and are therefore not identified as obstacles; the same applies to crash barriers 117, which are used to reconstruct the geometry of the road and hence travel corridor 119.
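The "reshaping" of the radar map then reduces to a lateral-offset test of each detected object against the presumed path. A minimal sketch, again assuming the small-angle clothoid path (names are illustrative):

```python
def path_lateral(x, c0, c1):
    # lateral position of the presumed path at longitudinal distance x
    # (small-angle approximation, heading initially along the x axis)
    return c0 * x**2 / 2.0 + c1 * x**3 / 6.0

def obstacles_in_corridor(objects, c0, c1, vehicle_width):
    """Keep only the objects (x, y) whose lateral distance from the
    presumed path is at most half the vehicle width (block 46)."""
    half = vehicle_width / 2.0
    return [(x, y) for x, y in objects
            if abs(y - path_lateral(x, c0, c1)) <= half]
```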
  • If no obstacles are identified, the method provides for a representation of the road situation as reconstructed on the basis of the data available, as will be seen below (the NO output from block 46 therefore passes to block 53); if on the other hand obstacles are identified, as in the case illustrated in FIG. 6 (YES output from block 46), the method provides for an evaluation of the "hazard level" of the obstacle.
  • For each obstacle identified, its y coordinate in the reference system with respect to the vehicle, i.e. its distance from the vehicle itself, is stored as d_obst(t) in a suitable matrix (block 47).
  • A check is then made to see whether obstacles were also identified within the travel corridor in the previous map (block 48).
  • If v_obst > 0, i.e. when the obstacle is moving away, no special alarm is generated and the system proceeds directly to process the perspective graphical representation of the road situation (block 53). If instead the relative velocity of the obstacle is zero or negative, i.e. when the obstacle keeps the same distance from the vehicle provided with means 13 or approaches it (NO output from block 52), the acoustic warning device is activated so that it generates an acoustic signal characteristic of a prealarm condition (block 54). The graphic process for representing the situation (block 53) is then initiated.
  • Data processing unit 17 instructs graphics processing unit 19 to represent the object in red (block 55) and then checks (block 56) whether the relative speed of the obstacle v_obst is positive or not.
  • In the first case (obstacle moving away) acoustic warning 32 is activated to produce an acoustic signal indicating an alarm condition (block 57), while in the second case (obstacle approaching) it is activated to produce an acoustic signal indicating a serious alarm condition (block 58).
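The decision chain of blocks 46-58 can be summarised as follows. Associating yellow with obstacles beyond the safe distance and red with obstacles within it is an inference from the description, and the colour/alarm labels are descriptive placeholders:

```python
def classify_obstacle(d_obst, v_obst, d_safe):
    """Return (colour, alarm) for an obstacle inside the travel corridor.

    Beyond the safe distance: yellow, with a prealarm (block 54) only if
    the obstacle is not moving away.  Within the safe distance: red
    (block 55), with an alarm (block 57) if moving away and a serious
    alarm (block 58) otherwise.
    """
    if d_obst > d_safe:
        if v_obst > 0:                  # moving away: display only
            return "yellow", None
        return "yellow", "prealarm"     # block 54
    if v_obst > 0:
        return "red", "alarm"           # block 57
    return "red", "serious alarm"       # block 58
```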
  • graphics processing unit 19 processes the data previously obtained to generate a perspective image of the situation.
  • a transformation is effected so as to convert the radar-generated representation based on angle and distance (and corresponding to a plan view of the area in front of the vehicle) into a perspective view.
  • The objects can then be associated with a height which decreases with distance, so as to give an impression of depth.
  • The background is provided in the form of a rectangular grid to give an idea of depth, the edges of the road are indicated by means of short lines, objects on the road which do not lie within the travel corridor are shown as outlines, and objects lying within the travel corridor are coloured yellow or red in the manner described above.
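The plan-to-perspective conversion can be sketched as below; the focal constant and the inverse-distance height law are assumptions standing in for the patent's transformation relationship, which is not reproduced in the text above:

```python
import math

def to_perspective(angle_deg, dist, k=200.0, h0=50.0):
    """Map a radar return (bearing from the vehicle axis, distance) to
    screen coordinates, with vertical position approaching the horizon
    and apparent height shrinking as distance grows."""
    lateral = dist * math.sin(math.radians(angle_deg))
    screen_x = k * lateral / dist       # bearing fixes the horizontal spot
    screen_y = k / dist                 # tends to the horizon as dist grows
    height = h0 / dist                  # objects look smaller when farther
    return screen_x, screen_y, height
```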
  • Graphics processing unit 19 therefore controls display 34 (block 59) so that this displays the image produced; a representation of the type shown in FIG. 7 of the radar map in FIG. 6 is therefore obtained.
  • The rectangular grid background is indicated by 123; tree 122 is not shown, being off the road.
  • Short lines 124 represent the edges of road 116 and in general do not coincide with crash barriers 117 in FIG. 6.
  • Rectangular block 125 represents vehicle 120, which can be seen in FIG. 6 and is located within travel corridor 119.
  • Rectangular block 126 represents vehicle 121, which can be seen in FIG. 6 and is travelling on the opposite carriageway, and is therefore outside travel corridor 119.
  • rectangular block 125 which represents a possible obstacle will be coloured red or yellow depending on its distance from the vehicle carrying the anticollision means, while rectangular block 126 will only be shown as its external profile without special colouring.
  • the method provides for the acquisition of a new map and its processing as described above in order to update the display on the basis of changes in the road situation, thus always keeping the driver in a state such that he can evaluate as precisely as possible what manoeuvre is most desirable.
  • Investigation of the manoeuvre required to get back into line with the road profile, in the stages described in blocks 42 and 43, makes it possible to eliminate false alarms even when moving in a straight line, when small correcting manoeuvres of the kind normally performed during driving take place; if only the angle of the steering wheel were considered, such manoeuvres might cause vehicle paths which intercept the edge of the road or adjacent carriageways to be taken into account.
  • the basic information provided by the angle of the steering wheel can be corrected in order to hold the vehicle's path substantially along the road profile.
  • the objects alongside the road represent an effective hazard and are therefore indicated as obstacles (in fact the driver may not have seen the bend because of the fog).
  • the method avoids generating false alarms due to the presence of the crash barrier while the manoeuvre is in progress.
  • Reconstruction of the road geometry makes it possible to provide a correct forecast, and therefore to avoid a false alarm, even in the situation of a double bend shown in FIG. 1, with two vehicles 1 and 5 facing each other without this being represented as a hazard.
  • Another advantage lies in the fact that, when evaluating the hazard of an obstacle, the safe distance used corresponds to the normal definition, which considers the possibility that the obstacle may stop without warning as a result of an accident. This is preferred to the "relative" safe distance proposed in some known arrangements, which also takes into account the braking distance of the vehicle in front and can therefore provide an incorrect evaluation in the abovementioned case of a sudden stop.
  • It is also emphasised that the means and methods described have the widest application; in particular, their effectiveness does not depend on the existence of a number of equipped vehicles or on the provision of special structures on a road. Also, the representation of the road situation in a substantially realistic manner, rather than by means of symbols which can only indicate "hazard" situations due to the presence of obstacles closer to or further from the vehicle, enables the driver to make a full evaluation of the situation. He can then choose the most appropriate behaviour on the basis of the position of the obstacle in the context of the environment, in particular with respect to the edges of the road and other vehicles, in order to evaluate, for example, whether a slight manoeuvre putting the vehicle on a path which does not intercept the obstacle (i.e. steering clear of it) will be sufficient, or whether more decisive manoeuvres are required, such as sharper braking and a rapid change of course.
  • Reconstruction of the vehicle's path on the basis of increasing orders of approximation provides acceptable forecasts even where the starting data are in themselves insufficient for a reconstruction of the road, or where the manoeuvre being performed by the driver is not understood.
  • The higher speed requires that obstacles be indicated while they are still at a distance.
  • the environment is generally structured, in that the existence of the crash barrier makes it possible to recognise the road and therefore to extrapolate the vehicle's path correctly for some distance.
  • Information on the edge of the road may be absent; in this case, however, the speeds in question are normally lower, and obstacles can therefore be indicated at shorter distances. In this case extrapolation on the basis of the angle of the steering wheel alone has been shown to be sufficient.
  • The type of representation used, which presupposes an interpretation of the situation so as to provide only, and all, the information required and to display significant objects on the radar map on the basis of their evaluated hazard level, is particularly useful in helping the driver to make a correct evaluation of a situation.
  • Detection of the radius of curvature set by the steering wheel may be effected in various ways, either by direct determination of the angle of the steering wheel, as provided for in the example embodiment described above, or by means of an accelerometer or by other means.
  • The condition (wet or dry) of the road may be detected by means of suitable adhesion sensors, or may be keyed in directly by the driver by means of an appropriate key.
  • As regards the type of display used for the perspective representation of the road situation, various arrangements may be provided; for example a "head-up display", similar to that used in aircraft, may be provided instead of a liquid crystal screen.
  • the type of perspective display may be partly modified in respect of for example the transformation relationships, the outer profile of objects, the representation of the edges of the road and the height of objects.
  • the method according to the invention permits a number of variants with respect to the embodiment described with reference to FIG. 3.
  • all the data received by the sensors may be subjected to different and/or further types of filtering and processing in order to eliminate undesirable disturbances.
  • The various distance measurements made on successive maps may be filtered by means of an alpha-beta filter which requires two observations in order to determine the relative speed of the obstacle (the filter being initialised to the distance of the obstacle by the first measurement, with the relative velocity of the obstacle estimated from the second measurement).
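Following that description, a minimal alpha-beta tracker initialised by the first two observations; the gains and the update period are illustrative, not values from the patent:

```python
class AlphaBetaTracker:
    """Track an obstacle's distance and relative speed from the distances
    measured on successive radar maps, taken at intervals of dt seconds."""

    def __init__(self, alpha=0.85, beta=0.3, dt=1.0 / 3.0):
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.d = None          # filtered distance (init: first measurement)
        self.v = None          # relative speed (init: second measurement)

    def update(self, d_meas):
        if self.d is None:                       # first observation
            self.d = d_meas
            return self.d, 0.0
        if self.v is None:                       # second observation
            self.v = (d_meas - self.d) / self.dt
            self.d = d_meas
            return self.d, self.v
        d_pred = self.d + self.v * self.dt       # predict
        r = d_meas - d_pred                      # innovation
        self.d = d_pred + self.alpha * r         # correct distance
        self.v = self.v + self.beta * r / self.dt
        return self.d, self.v
```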
  • a 3 Hz radar scanner is sufficient for the purpose of describing the road geometry and understanding the manoeuvre in progress, although other higher frequencies may also be used.
  • Although in the embodiment described the value of C_1 (the spatial derivative of the presumed path curvature) is determined only on the basis of the conditions specified in blocks 41, 44 or 45, in accordance with a preferred variant the value of C_1 may be selected so as to take into account the level of approximation with which the road geometry is reconstructed. In essence, the value F of the function calculated in block 40 may be used to modify the value of C_1 in such a way that this also depends on F.


Abstract

The method for avoiding collision between a motor vehicle and obstacles placed in the path of the vehicle comprises the stages of forming a radar map of the area in front of the vehicle, reconstructing the geometry of the road, identifying its edges, detecting the position and speed of the vehicle with respect to the road, determining the presumed path of the vehicle on the basis of the road geometry and the manoeuvre being carried out by the vehicle at that instant, detecting objects on the radar map which lie in the presumed path of the motor vehicle, and displaying the map in a perspective representation, ignoring objects which are off the road, indicating objects in the path of the vehicle in a different manner according to their hazard level and, if appropriate, generating alarms of an acoustic type.

Description

This is a continuation of copending application Ser. No. 07/725,014 filed on Jul. 3, 1991 now abandoned.
BACKGROUND OF THE INVENTION
This invention relates to a method and means for avoiding collision between a driven motor vehicle and obstacles lying in its path.
It is known that a high percentage of accidents could be avoided by an earlier reaction from the driver. As a consequence, various anti-collision systems designed to monitor the environment surrounding the motor vehicle and to indicate the presence of obstacles to the driver have been investigated and proposed for a number of years, in particular for conditions of poor or reduced visibility, typically as a result of fog, but also against lack of attention by the driver.
In particular, various systems have been proposed for the detection of obstacles: systems of the passive type, such as television cameras, which however have the disadvantage that they require complex computing for the determination of distance; systems of the active optical type, such as laser devices, which are however subject to atmospheric absorption, in particular in fog; and above all microwave radar, which has better properties than the other systems in that it provides the distance to the objects found directly, with barely any atmospheric absorption.
The simplest known arrangements are of the fixed beam type with straight-line observation of the area of the road in front of the vehicle. These arrangements have proved to be of little reliability and are subject to numerous errors, particularly on bends, because of the inability of the system to distinguish obstacles which actually lie in the vehicle's path from objects which lie in a straight line in front of the vehicle but not in its path, such as trees growing by the edge of the road on a bend or vehicles travelling in the opposite direction (as is for example shown in FIG. 1, in which a vehicle 1 travelling in direction 2 on a section of road 3 mounting a fixed beam system would see vehicle 5 travelling on section of road 6 on the opposite carriageway and in direction 7 as an obstacle).
In order to overcome the problems mentioned above, other known arrangements provide for the possibility of orientating the beam on the basis of the angle of the steering wheel and others also provide for the scanning of circular sectors ahead of the vehicle on the basis of the steering wheel angle. However even these arrangements yield a high number of false alarms, as again made clear in the example in FIG. 1. In fact, as is known, when vehicle 1 explores the area of road which extends along a line of circumference indicated by dashed line 8 corresponding to the manoeuvre initially brought about by means of the steering wheel, objects which are placed at the edge of the road at the point indicated by 9, or right off the road, are identified as obstacles, when in fact there is no obstacle in the real path of the vehicle indicated by dashed line 10.
In an attempt to eliminate these disadvantages, special arrangements have been investigated, such as the use of complex filters based on the relative speed of an obstacle so as to delay the alarm and permit further observation; these solve only part of the problem of false alarms and do not offer a sufficient level of reliability.
Known systems also provide various arrangements for attracting the driver's attention to the alarm situation detected, using displays which in general provide for the illumination of predetermined luminous icons of different colour and size depending on the level of the alarm, accompanied by signals of an acoustic type. However even these known types of display have proved to be insufficiently effective and also restricted, in that they do not provide information on the transverse dimensions and position of the obstacle with respect to the road, and therefore on the scope for avoiding the obstacle by means of a suitable manoeuvre.
SUMMARY OF THE INVENTION
The object of this invention is to provide a method and means which overcome the disadvantages of known arrangements, reducing false alarms to a minimum and allowing effective display of the road situation.
According to this invention a method is provided for avoiding collision between a driven motor vehicle and at least one obstacle or object. The method includes the stages of: detecting possible obstacles in an area in front of the motor vehicle on the basis of the presumed path of the vehicle, calculating the distance and the relative speed of the obstacles found, and indicating the obstacles found with an indication of their hazard level, the said stages being repeated cyclicly, characterised in that the said stage of detecting possible obstacles comprises the stages of generating a map of the environment in front of the vehicle by means of scanning the said environment, reconstructing the geometry of the road on the basis of this map, reconstructing the presumed path of the vehicle on the basis of the road geometry and detecting any obstacles within the presumed path.
This invention also relates to a system for avoiding collision between a driven motor vehicle and at least one obstacle or object, comprising a sensor capable of monitoring an area in front of the vehicle in order to detect the presence of objects, means for calculating the distance and relative speed of the objects found, and means for signalling the presence of objects found, indicating their hazard level, characterised in that the said sensor is a scanning sensor capable of generating a map of the environment in front of the motor vehicle, and in that it comprises means capable of reconstructing the geometry of the road on the basis of the said map, means for reconstructing the presumed path of the vehicle on the basis of the road geometry and means for detecting any obstacles within the presumed path.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to provide a better understanding of this invention a preferred embodiment will now be described purely by way of a non restrictive example with reference to the appended drawing, in which:
FIG. 1 is a plan view of a section of road being travelled by a vehicle,
FIG. 2 is a block diagram of the means according to this invention,
FIG. 3 is a flow chart of the method used by the means in FIG. 2,
FIGS. 4 and 5 are two plan views of curved sections of road being travelled by a motor vehicle,
FIG. 6 shows a radar map which can be obtained with the anticollision means and method in FIGS. 2 and 3, and
FIG. 7 shows a representation of the road environment in front of a vehicle obtained using the means and method in FIGS. 2 and 3.
DETAILED DESCRIPTION OF THE INVENTION
With reference to FIG. 1, in order to avoid as many false alarms as possible the method according to the invention provides for the reconstruction, for a vehicle 1 on a stretch of road 3, of a path 10 which on the one hand can be physically followed by the vehicle itself and on the other hand is plausible in the specific situation, i.e. it takes into account the geometry of the road and the manoeuvre imposed by the vehicle itself. In order to reconstruct this path this method provides for acquisition of a map of the road situation so as to detect objects placed at the side of the road whereby the edges of the road can be determined; also any obstacles which may be present and the position of the vehicle with respect to the road are detected and data on the speed and angle of turn of the vehicle are obtained. In order to reconstruct the presumed path this method is based on the assumption that with the lack of evidence to the contrary the vehicle will tend to follow the flow of traffic, i.e. will tend to keep to a path parallel to the line of the road. Three different levels of approximation to the path are provided in order to take account of different possible situations and the data available from time to time.
With this object the path is reconstructed on the assumption that every road consists of a series of sections which may be either straight, or curved or forming a junction, and each section is described using the geometrical model of the Cornu spiral, according to which every path is represented by three parameters corresponding to the width, the instantaneous curvature C0 and its spatial derivative C1 = dC0/dl. As a consequence straight sections are described by the values C0 = 0 and C1 = 0, curves by the values C0 = 1/R (where R = the radius of curvature of the curve) and C1 = 0, and junctions are distinguished by a value of C1 which is not 0. Given these assumptions the lowest level of approximation consists of determining the path on the basis of the position of the steering wheel, and in this case C0 is equal to the curvature set via the steering wheel and C1 is equal to 0. A greater level of approximation consists of assuming that the path of the vehicle will be maintained parallel to the direction of the road and therefore requires an understanding of the geometry of the road itself, in order that the curvature of the road and its spatial derivative within a predetermined distance can be calculated. A third level of approximation is obtained by attempting to understand the manoeuvre performed by the vehicle, in particular when it is found that the axis of the vehicle forms an angle which is not zero with respect to the direction of the road, to understand whether this deviation represents a non-significant fluctuation or is due to a special manoeuvre by the driver. The third level is therefore designed to eliminate false alarms due to small fluctuations in the direction of the vehicle as it travels, instead providing correct indications in the case of manoeuvres which do not follow the line of the road, such as when overtaking or when entering a side road.
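The Cornu spiral (clothoid) model above, in which curvature varies linearly with arc length as C(l) = C0 + C1*l, can be illustrated with the following sketch. This is not the patent's own code: it simply integrates heading (the integral of curvature) and position numerically, starting at the vehicle origin heading along +y.

```python
import math

# Illustrative sketch of the clothoid road/path model: curvature varies
# linearly with arc length, C(l) = C0 + C1*l. Heading is the integral of
# curvature; position integrates the heading (forward Euler).

def clothoid_points(c0, c1, length, step=0.5):
    """Return (x, y) points of a path with curvature C(l) = c0 + c1*l,
    starting at the origin and heading along +y (the vehicle axis)."""
    pts = [(0.0, 0.0)]
    x = y = l = 0.0
    while l < length:
        theta = c0 * l + c1 * l * l / 2.0   # heading = integral of curvature
        x += step * math.sin(theta)
        y += step * math.cos(theta)
        l += step
        pts.append((x, y))
    return pts
```

A straight section (C0 = C1 = 0) stays on the y axis; a constant curve (C0 = 1/R, C1 = 0) approximates a circle of radius R; a junction (C1 ≠ 0) traces a spiral of smoothly varying curvature.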
The use of different levels of approximation when reconstructing the presumed path thus makes it possible to describe the situation with greater accuracy, providing a more correct forecast when the data available permit it, and nevertheless ensuring a description of the situation which can be used for the detection of obstacles, with a lesser level of likelihood, when it is not possible to reconstruct an effective "plausible" path. This means for example that when it is not possible to reconstruct the geometry of the road owing to the lack of reference points (such as a guard rail or crash barrier) the path is reconstructed with the lesser degree of approximation.
According to this method the determination of a presumed path on the basis of the specific situation, with a greater or lesser likelihood, as indicated, makes it possible to determine a travel corridor which has as its width the width of the vehicle and the radar map found is therefore "reshaped" on the basis of the travel corridor obtained in order to check whether any objects have been found within it. These objects represent true obstacles, unlike other objects found by the radar map which lie outside the travel corridor and if the vehicle continues on its path will not lie in its path. Continuous updating of the situation, with determination of the map at a predetermined frequency and corresponding reconstruction of the presumed path makes it possible to follow changes in the path of the vehicle itself.
Once the path has been reconstructed and the obstacles lying in it have been identified the "hazard level" of these obstacles is evaluated, taking into account the safe distance (which depends on the speed of the vehicle, the maximum deceleration which can be achieved by braking and the reaction time of the driver/vehicle system) and the relative speed of the obstacles. A forecast representation of the road situation in front of the vehicle is then provided showing potentially hazardous obstacles, and a signal which may also be of an acoustic type is provided. Real time display of the situation, with continuous updating, therefore enables the driver to keep the road situation continuously in view and to adjust his behaviour correspondingly.
With reference to FIG. 2, the means for implementing the method for detecting obstacles and their display in the manner described above is indicated as a whole by 13 and is intended to be mounted on a motor vehicle (indicated by 100 in FIG. 6). Means 13 includes a central unit 15 for data processing, reconstruction of the path and graphic computation of the road situation, a plurality of sensors for detecting the parameters required for the computation and display, and acoustic devices to represent the road situation and signal alarm situations.
In detail, central unit 15 includes: an input/output unit 16, a data processing unit 17 connected to input/output unit 16, a data memory 18 connected to data processing unit 17, a graphics computing unit 19 connected to data processing unit 17, and a graphics data memory unit 20 connected to the corresponding graphics computing unit 19. Input/output unit 16 of central unit 15 is connected by a two-directional line 21 to a scanning radar 22 which is equipped with an antenna 23 and is capable of generating a radar map of the portion of space lying in front of the vehicle; it is also connected by a one-way line 24 to a sensor 25 detecting the angle of steering wheel 26 of the vehicle on which means 13 is mounted, by a one-way line 27 to the vehicle's speedometer 28 and by a one-way line 29 to a sensor 30 which detects whether the windscreen wipers are in operation. In an embodiment of means 13 which has provided satisfactory results radar 22 has a horizontal scanning angle of approximately 40° centred along the axis of the vehicle and divided into a plurality of sectors, and an elevation of 5°-6°, which offers a good view of the scene, with the object of evaluating required behaviour even on inclines or bumps, while not excessively loading the system with useless (or downright misleading) data for the purpose of the recognition of obstacles, in particular objects which are off the road or above it (bridges and suspended signals). In a known way, on the basis of the time for the receipt of return echoes due to reflection of the radar pulses by obstacles, radar 22 calculates the distance of the obstacles and generates a radar map, i.e. a matrix within which the echoes detected are stored on the basis of the map sector (scanning angle) and the distance found.
This map is provided together with the steering angle detected by sensor 25, and the speed of the vehicle provided by speedometer 28, and the state (wet or dry) of the road surface (detected in the case of the embodiments illustrated by sensor 30), to central unit 15 which processes the data received, as will be seen later on the basis of FIGS. 3 to 7, and generates a perspective representation of the scene in front of the vehicle for the purposes of displaying and detecting obstacles, and also controls various acoustic alarm signals.
With this object data processing unit 17 of central unit 15 is connected via one way line 31 to an acoustic warning device 32 and graphics computing unit 19 is connected via a one way line 33 to a graphics display unit 34, for example a liquid crystal display.
A preferred embodiment of the means for avoiding collisions will now be described with reference to FIG. 3, showing the sequence of stages repeated at a predetermined frequency in order to provide a continual update of the detection and display of obstacles on the basis of the actual situation. To begin with radar 22 constructs the radar map, as previously described, and passes it by line 21 to central unit 15 (map input block 35). At this stage data processing unit 17 filters the detection data obtained, in particular by means of an alpha-beta filter of a known type (see for example: Kalata, The Tracking Index: A generalised parameter for alpha-beta and alpha-beta-gamma target trackers, IEEE Transactions on Aerospace and Electronic Systems, vol. 20, No. 2, March 1984). Then data processing unit 17 obtains data relating to the steering-wheel angle θ, the velocity V of the vehicle and the condition of the road surface as provided by lines 24, 27 and 29 (block 36). Then the radius of curvature RV of the path travelled by the vehicle at that instant is read from the angle of the steering wheel, using a suitable table stored in data memory 18, and the curvature C0V of this path is then calculated (from the equation C0V = 1/RV, block 37). Preferably the speed of the vehicle V and its curvature C0V are filtered using an alpha-beta filter so as to satisfy the physical assumption of continuity. The data obtained in this way, which can be gathered under any conditions because they are directly measurable, provide a first approximation to the presumed path on the basis of the current situation; this is a minimum hypothesis which in most cases will not correspond to the actual path travelled by the vehicle, but is nevertheless generally an acceptable working hypothesis when it is not possible to reconstruct the geometry of the road, as will be explained below.
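The steering-wheel lookup of block 37 can be sketched as follows. The table values below are invented for illustration (the patent only says a suitable table is stored in data memory 18); interpolation is done in curvature space so the straight-ahead entry (R = infinity, C = 0) poses no numerical problem.

```python
# Hypothetical illustration of the first approximation level: path
# curvature C0V = 1/RV taken from the steering-wheel angle via a stored
# table of turning radii (values below are invented, not from the patent).

STEERING_TABLE = [  # (steering-wheel angle in degrees, turning radius in m)
    (0.0, float("inf")), (30.0, 250.0), (60.0, 120.0), (90.0, 60.0),
]

def curvature_from_steering(angle_deg):
    """Linearly interpolate the curvature 1/R over the table."""
    a = abs(angle_deg)
    sign = 1.0 if angle_deg >= 0 else -1.0
    xs = [ang for ang, _ in STEERING_TABLE]
    cs = [0.0 if r == float("inf") else 1.0 / r for _, r in STEERING_TABLE]
    if a >= xs[-1]:
        return sign * cs[-1]        # clamp beyond the last table entry
    for i in range(1, len(xs)):
        if a <= xs[i]:
            t = (a - xs[i - 1]) / (xs[i] - xs[i - 1])
            return sign * (cs[i - 1] + t * (cs[i] - cs[i - 1]))
```

The sign convention (positive angle, positive curvature for one turning direction) is likewise an assumption of the sketch.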
Data processing unit 17 then calculates (block 38) the safe distance using the equation:
dds = v*Tr + v^2/(2*Amax)
in which Tr represents the reaction time of the vehicle/driver system and is a fixed value, for example 1 second, corresponding to a mean value determined on the basis of specific investigations, and Amax represents the maximum deceleration produced by hard braking and depends on speed and road conditions. This latter parameter Amax (which corresponds to the maximum deceleration and is also governed by regulations) is generally read from a suitable table stored in data memory 18, on the basis of the vehicle speed v and the road surface conditions found.
The central unit then attempts to reconstruct the geometry of the road in a reference system centred on the vehicle, calculating its five characteristic parameters (block 39). In fact, with reference to FIG. 4, with respect to a vehicle 100 travelling on a section of road 101 representing the origin of a system of cartesian axes x, y, the road can be described by the parameters Φ, Ls, Ld, C0s and C1s, where Φ represents the "heading angle", i.e. the angle formed between the axis of vehicle 100 and the direction of the road, Ls and Ld are the distances from the left hand side and the right hand side of the road respectively to the centre of the vehicle, and C0s and C1s define the shape of the road on the basis of the Cornu spiral model, according to which the curvature C (which is the inverse of the radius Rs, i.e. C = 1/Rs) is given by
C = C0s + C1s*l
where l=the length of the section of road in question.
On the assumption that the edge of the road can be recognised by the presence of reflecting objects which can be identified by the radar, such as crash barriers, guard rails or trees, and that xi, yi are the coordinates of the ith point detected by radar scanning, the algorithm used by this method finds the values of the parameters Φ, Ls, Ld, C0s and C1s which maximise the function ##EQU1## where i = 1 . . . n, n being the number of points detected by the radar scanning and d the geometrical distance between the point having coordinates (xi, yi) and the edge of the road, identified by means of the above five parameters.
This method in practice searches for the road parameters best approximating to the reflectors present at the edges of the road.
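The edge-fitting search can be sketched as follows. The patent's function F is not reproduced in this text (only the ##EQU1## placeholder survives), so the score below, which weights each echo by exp(-(d_i/sigma)^2), is an assumed stand-in; the sketch also restricts itself to a straight-road model (C0s = C1s = 0), grid-searching only the heading angle and the two edge offsets. Everything here is illustrative, not the patent's algorithm.

```python
import math

# Assumed stand-in for the patent's function F: each radar echo is scored
# by exp(-(d_i/sigma)^2), where d_i is its distance from the nearest of
# the two candidate road edges. A brute-force grid search maximises F.

def edge_score(points, phi, ls, ld, sigma=0.5):
    """F = sum_i exp(-(d_i/sigma)^2) for a straight road at heading phi
    with edges at lateral offsets -ls (left) and +ld (right)."""
    f = 0.0
    s, c = math.sin(phi), math.cos(phi)
    for x, y in points:
        lat = x * c - y * s                     # lateral offset from road axis
        d = min(abs(lat + ls), abs(lat - ld))   # distance to nearest edge
        f += math.exp(-(d / sigma) ** 2)
    return f

def fit_straight_road(points):
    """Grid search over heading (±0.1 rad) and edge offsets (2-6 m)."""
    best = (-1.0, None)
    for phi in [i * 0.01 - 0.1 for i in range(21)]:
        for ls in [2.0 + 0.25 * i for i in range(17)]:
            for ld in [2.0 + 0.25 * i for i in range(17)]:
                fval = edge_score(points, phi, ls, ld)
                if fval > best[0]:
                    best = (fval, (phi, ls, ld))
    return best
```

Comparing the resulting maximum F against the threshold F0 (block 40) would then decide whether the road has been recognised.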
On the basis of this method it is assumed that the values of the five parameters effectively define the edge of the road if the function F is greater than a predetermined minimum threshold value F0, and as a consequence the next step (block 40) consists of verifying this condition. In particular, if function F has a value lower than predetermined threshold F0 it is considered that the road is not recognised (lack of a sufficient number of objects on the edge of the road), and the road is approximated on the basis of the simplest level of interpolation, which is obtained, as has been said, from the position of the steering wheel. In this case therefore (block 41) C0s = C0V and C1 = 0.
If this is not the case it is assumed that the values of the parameters found in fact describe the geometry of the road. These parameters can be used to reconstruct a "plausible" path on the assumption that the vehicle remains parallel to the edges of the road, and therefore provide a better approximation to the path itself, even though they do not take into account the fact that there are generally fluctuations in the direction of travel while in motion or that the driver may perform a special manoeuvre. In this case therefore a path reconstructed on these assumptions will not be a true path and cannot be used to detect potential obstacles.
In accordance with this method therefore an attempt is made to establish whether a manoeuvre aligning the vehicle and the road is in progress, evaluating whether this alignment would involve a greater or a lesser manoeuvre and as a consequence reconstructing the path within which possible obstacles which have to be indicated to the driver should be sought. In fact the safe area within which potentially hazardous obstacles have to be sought extends not only in depth (safe distance), but also in width, the latter being determined by manoeuvrability, i.e. the possibility of performing "minor" manoeuvres corresponding to small changes in the angle of the steering wheel. Changes in the steering wheel angle affect the possibility of computing different paths within a band called the manoeuvrability zone; this zone is wide at low speeds and narrow at high speeds.
For a better understanding of this problem a special case will now be illustrated with reference to FIG. 5. Here a vehicle 100 is travelling on a section of road 102 divided by an intermediate dashed line 103 into two carriageways 104 and 105. Vehicle 100, which lies on carriageway 104, may find the crash barrier (detected in this area as thick line 106) to be an obstacle if its path is not suitably corrected. In particular the magnitude of the hazard situation is linked with the speed of the vehicle, which in the case of high speed is constrained to make a sharp manoeuvre in order to avoid the obstacle, whereas at low speed it can change the direction of movement less brusquely without a risk of collision. The path must therefore be reconstructed making a distinction between the two possibilities (in the first case in fact the estimated travel corridor within which possible obstacles have to be sought will be that shown by dashed portion 107 in the figure, while in the second case reconstruction of the path will result in the path indicated by dotted and dashed line 108).
In order to understand the manoeuvre being performed by the vehicle, according to this method the spatial derivative of the curvature C1a which is necessary to return the vehicle to a direction axial with the road at a predetermined distance D associated with the speed of the vehicle is calculated in block 42 (this distance is read from a suitable table stored in data memory 18). In particular C1a is obtained in the following way.
In the system of reference with respect to the vehicle the angle of the road axis α(l) changes in accordance with the relationship:
α(l) = Φ + C0s*l + C1s*l^2/2
while the angle of the axis of the vehicle β(l) varies according to the relationship:
β(l) = C0V*l + C1V*l^2/2
The "heading angle" Φ(l) varies in accordance with a relationship resulting from the difference between the angle of the axis of the road and the angle of the axis of the vehicle, i.e.:
Φ(l) = Φ + (C0s - C0V)*l + (C1s - C1V)*l^2/2
Calculation of the aligning manoeuvre returns the "heading angle" Φ to zero at distance D. Substituting Φ(D) = 0 we obtain:
C1a = C1V = C1s + 2*(C0s - C0V)/D - 2*Φ/D^2
Having obtained C1a, i.e. the manoeuvre required to bring the axis of the vehicle in line with the axis of the road at distance D, this value is then compared (block 43) with a threshold value C1amax (determined experimentally and also stored in memory 18) which can be used to distinguish whether the manoeuvre which has to be performed is slight or sharp. In particular, if it is found that |C1a| < C1amax this means that the manoeuvre required is of the "slight" type, and it is then very likely that the inclination of the vehicle with respect to the road is due to a fluctuation; in this case C1 = C1a, i.e. the change in the curvature of the estimated path will be equal to the "slight" manoeuvre which has to be performed (block 44). Vice versa, if the condition established by block 43 is not satisfied (NO output therefrom) the value of C1 resulting from the manoeuvre actually being performed by the vehicle (block 45) is calculated using the equation: ##EQU2## where C0V^(t) represents the curvature of the vehicle path which has just been calculated (by means of block 37), C0V^(t-1) represents the same quantity calculated in the previous iteration and t represents the time interval between this iteration and the previous one.
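Blocks 42-45 can be condensed into the following sketch. It transcribes the aligning-manoeuvre formula as printed above; the threshold value C1amax and the fallback C1 for a manoeuvre in progress are illustrative placeholders (the patent stores the threshold, determined experimentally, in memory 18, and computes the fallback from successive curvature measurements per equation ##EQU2##, which is not reproduced in this text).

```python
# Sketch of blocks 42-45: compute the aligning-manoeuvre value C1a from
# the formula printed above and classify the manoeuvre as "slight" or
# "sharp" against the threshold C1amax (threshold value invented here).

def aligning_manoeuvre(phi, c0s, c1s, c0v, d):
    """C1a = C1s + 2*(C0s - C0V)/D - 2*phi/D^2, formula as printed."""
    return c1s + 2.0 * (c0s - c0v) / d - 2.0 * phi / (d * d)

def select_c1(phi, c0s, c1s, c0v, d, c1a_max=1e-3, c1_manoeuvre=0.0):
    """Return (C1, slight): the C1 used for the presumed path, and whether
    the aligning manoeuvre was judged slight (|C1a| < C1amax).
    c1_manoeuvre stands in for the block-45 value from equation ##EQU2##."""
    c1a = aligning_manoeuvre(phi, c0s, c1s, c0v, d)
    if abs(c1a) < c1a_max:
        return c1a, True            # block 44: fluctuation, align with the road
    return c1_manoeuvre, False      # block 45: keep the manoeuvre in progress
```

A small heading error far ahead (large D) yields a slight manoeuvre and the path is bent back toward the road axis; the same error with a short alignment distance exceeds the threshold and the manoeuvre actually being performed is kept instead.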
Once the parameters of the assumed path, which according to the data available may approximate to a greater or lesser extent to the actual path of the vehicle, have been found, the radar map, as acquired via block 35, is reshaped on the basis of the shape of the travel corridor centred on the foreseen path which has just been constructed, which has a width equal to that of the vehicle, and a check is made to see whether there are any obstacles within this corridor (block 46). For example if radar 22 provides a radar map such as that illustrated in FIG. 6 and indicated by reference 115 (in this particular instance the road, indicated by 116, is bounded by a set of crash barriers 117 and has a bend 118 a little in front of vehicle 100 in its direction of travel), the calculated travel corridor 119, represented by dashed lines, follows the road profile and intercepts another vehicle 120 which is travelling in the same direction as vehicle 100 after bend 118, while vehicle 121 travelling on the opposite carriageway and tree 122 which is off the road lie outside corridor 119 and are therefore not identified as obstacles, as also crash barriers 117, which are used to reconstruct the geometry of the road and therefore travel corridor 119.
If no obstacles are identified within the travel corridor the method provides for a representation of the road situation as reconstructed on the basis of the data available, as will be seen below (and the NO output from block 46 therefore passes to block 53), while on the other hand if obstacles are identified, as in the case illustrated in FIG. 6 (YES output from block 46), this method provides for an evaluation of the "hazard level" of the obstacle. For this purpose, and for every obstacle found, its y coordinate in the reference system with respect to the vehicle (i.e. its distance from the vehicle itself) is stored in memory as dobst(t) in a suitable matrix (block 47). Then, for every obstacle, a check is made to see whether obstacles were identified within the travel corridor in the previous map (block 48). For this purpose a search is made among the obstacles stored in memory in the previous iteration to see if there is one whose distance is only slightly different from the distance of the obstacle being considered at that instant (taking into account if appropriate other parameters such as the width dimension of the obstacle, which provides a more reliable association between identifications of the same obstacle in successive maps).
If the obstacle in question was not present on the previous map (NO output from block 48) the system proceeds directly to processing the representation of the road situation (block 53), otherwise (YES output) the relative speed of the obstacle is calculated using the formula: ##EQU3## (block 49). Then (block 50) a check is made to establish whether the distance of the obstacle found is greater or less than the safe distance dds calculated in block 38. If it is greater (YES output from block 50) data processing unit 17 instructs graphics processing unit 19 to prepare a representation of the object in yellow (block 51) and then checks (block 52) whether the relative speed of the obstacle vobst is positive or not, in order to determine whether the object is moving away from the vehicle mounting anticollision means 13 or approaching it. In the first case (vobst > 0), i.e. when the obstacle is moving away, no special alarm is generated and the system proceeds directly to process the perspective graphical representation of the road situation (block 53), while if the relative velocity of the obstacle is zero or negative, i.e. when the obstacle keeps the same distance from the vehicle provided with means 13 or approaches it (NO output from block 52), the acoustic warning device is activated so that it generates an acoustic signal characteristic of a prealarm condition (block 54). The graphic processing process for representing the situation (block 53) is then initiated.
If instead the output from block 50 is negative, i.e. if the obstacle is within the safe distance, data processing unit 17 instructs graphics processing unit 19 to represent the object in red (block 55) and then checks (block 56) whether the relative speed of the obstacle vobst is positive or not. In the first case, which also corresponds here to the situation where the obstacle is moving away from the vehicle, acoustic warning 32 is activated to produce an acoustic signal indicating an alarm condition (block 57), while in the second case (obstacle approaching) acoustic warning 32 is activated to produce an acoustic signal indicating a serious alarm condition (block 58).
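The decision logic of blocks 50-58 reduces to a small table, sketched below; the colour and alarm labels are illustrative strings standing in for the display colour and the acoustic signal selected.

```python
# Condensed sketch of the hazard evaluation in blocks 50-58: the colour
# and the acoustic signal are chosen from the safe distance and from the
# sign of the obstacle's relative speed.

def classify_obstacle(d_obst, v_obst, d_safe):
    """Return (colour, alarm) for one obstacle inside the travel corridor."""
    if d_obst > d_safe:                                   # beyond the safe distance
        colour = "yellow"                                 # block 51
        alarm = None if v_obst > 0 else "prealarm"        # blocks 52, 54
    else:                                                 # within the safe distance
        colour = "red"                                    # block 55
        alarm = "alarm" if v_obst > 0 else "serious alarm"  # blocks 57, 58
    return colour, alarm
```

Note that, per the text, a receding obstacle within the safe distance still produces an (ordinary) alarm, while a receding obstacle beyond it produces no acoustic signal at all.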
In both cases, graphics processing unit 19 at this point processes the data previously obtained to generate a perspective image of the situation. In detail (block 53) a transformation is effected so as to convert the radar-generated representation based on angle and distance (and corresponding to a plan view of the area in front of the vehicle) into a perspective view.
In particular, the transformation which links the position (x, y) of objects on the radar map with that (xp, yp) in the perspective view is given by the laws of perspective and can be expressed, ignoring scale factors, by the following equations:
xp = x/y
yp = 1/y
The objects can then be associated with a height which decreases with distance so as to give depth in accordance with the relationship:
h = 1/y
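The perspective mapping above (scale factors omitted, as in the text) can be sketched as a single function converting a plan-view radar point into screen coordinates plus a drawn object height:

```python
# Sketch of the perspective transformation xp = x/y, yp = 1/y, h = 1/y
# (scale factors ignored, as stated in the text): a plan-view radar point
# (x lateral, y ahead) becomes a screen position and a drawn height.

def to_perspective(x, y):
    """Map a plan-view point with y > 0 (ahead of the vehicle) to
    perspective screen coordinates and the height used to draw it."""
    if y <= 0:
        raise ValueError("object must lie ahead of the vehicle")
    xp = x / y      # lateral position converges toward the centre with distance
    yp = 1.0 / y    # screen ordinate approaches the horizon as y grows
    h = 1.0 / y     # drawn height shrinks with distance, giving depth
    return xp, yp, h
```

Distant objects are thus drawn smaller and closer to the horizon line, which is what produces the depth effect of FIG. 7.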
In accordance with this method the background is provided in the form of a rectangular grid to provide an idea of depth, the edges of the road are indicated by means of short lines, objects on the road which do not lie within the travel corridor are shown, and objects lying within the travel corridor are coloured yellow or red in the manner described above.
Graphics processing unit 19 therefore controls display 34 (block 59) so that this displays the image produced; a representation of the type shown in FIG. 7 of the radar map in FIG. 6 is therefore obtained. In FIG. 7 therefore the rectangular grid background is indicated by 123, tree 122 is not shown, being off the road, short lines 124 represent the edge of road 116 and in general do not coincide with crash barriers 117 in FIG. 6, vehicle 120, which can be seen in FIG. 6 and is located within travel corridor 119, is represented by a rectangular block 125 and rectangular block 126 represents vehicle 121 which can be seen in FIG. 6 and is travelling on the opposite carriageway, and is therefore outside travel corridor 119. As explained above rectangular block 125 which represents a possible obstacle will be coloured red or yellow depending on its distance from the vehicle carrying the anticollision means, while rectangular block 126 will only be shown as its external profile without special colouring.
Having completed these stages the method provides for the acquisition of a new map and its processing as described above in order to update the display on the basis of changes in the road situation, thus keeping the driver always in a position to evaluate as precisely as possible which manoeuvre is most desirable.
The advantages which can be obtained with the means and methods described above are as follows. Reconstruction of the estimated path of the vehicle, at a level of approximation which depends on the data available, makes it possible to search for obstacles within the area which will probably (or might) be occupied by the vehicle in the course of its travel, thus eliminating virtually all the false alarms which in known arrangements are due to insufficient understanding of the situation. The reduction in false alarms is also due to the fact that the assumptions introduced are based on physical characteristics of the world under investigation; in particular, the laws of the physical world act as constraints on permitted configurations (for example, only manoeuvres which can physically be performed by the vehicle are considered when calculating its path).
In detail, investigation of the manoeuvre required to get back into line with the road profile, in the stages described in blocks 42 and 43, makes it possible to eliminate false alarms even when moving in a straight line and performing the small corrections which are normal during driving, and which, if only the angle of the steering wheel were considered, might produce predicted paths which intercept the edge of the road or adjacent carriageways. Similarly, when travelling along the central part of a bend with a constant radius of curvature, investigation of the manoeuvre allows the basic information provided by the angle of the steering wheel to be corrected in order to hold the vehicle's path substantially along the road profile. Again, when approaching or leaving a bend, information from the angle of the steering wheel alone could give rise to false alarms due to objects at the side of the road, given that the curve set by the steering wheel is not a valid prediction of the path; by foreseeing a manoeuvre which would maintain the vehicle's path along the axis of the road it is possible to search for obstacles in a much more realistic way. If, however, the speed of the vehicle is too high, the path which would cause the vehicle to follow the curve moves outside the limits of manoeuvrability and, as a result of the discrimination which can be made by block 43, is not considered. In this case, as already explained with reference to FIG. 5, the objects alongside the road represent an effective hazard and are therefore indicated as obstacles (the driver may, for example, not have seen the bend because of fog). Likewise, during normal overtaking, the method avoids generating false alarms due to the presence of the crash barrier while the manoeuvre is in progress.
Reconstruction of the road geometry (stages 39 and 44) makes it possible to provide a correct forecast, and therefore to avoid a false alarm, even in the situation of a double bend shown in FIG. 1, with two vehicles 1 and 5 facing each other without this being represented as a hazard.
Another advantage lies in the fact that, when evaluating the hazard of an obstacle, the safe distance used corresponds to the normal definition, which considers the possibility that the obstacle may stop without warning as a result of an accident. This is in contrast to the "relative" safe distance proposed in some known arrangements, which also takes into account the braking distance of the vehicle in front and can therefore give an incorrect evaluation in the abovementioned case of a sudden stop.
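The "absolute" safe distance discussed here follows the formula given later in claim 6, dds = v*Tr + v^2/(2*Amax). A minimal sketch under stated assumptions: the reaction time Tr and the deceleration lookup are illustrative values only (the patent stores Amax as a function of vehicle speed and road condition).

```python
REACTION_TIME = 1.0  # Tr, in seconds (assumed value, not from the patent)

def max_deceleration(road_wet: bool) -> float:
    """Illustrative Amax lookup in m/s^2; the patent stores Amax as a
    function of both speed and road condition."""
    return 4.0 if road_wet else 7.0

def safe_distance(v: float, road_wet: bool) -> float:
    """dds = v*Tr + v^2/(2*Amax): distance needed to stop even if the
    obstacle ahead halts instantly (the 'absolute' definition)."""
    a_max = max_deceleration(road_wet)
    return v * REACTION_TIME + v ** 2 / (2 * a_max)
```

Note that, by design, the formula contains no term for the braking distance of the vehicle in front, which is exactly what distinguishes it from the "relative" safe distance criticised above.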
The fact that the means and methods described have the widest application is also emphasised. In particular, their effectiveness does not depend on the existence of a number of equipped vehicles or on the provision of special structures on a road. Also, the representation of the road situation in a substantially realistic manner, and not by means of symbols which can only indicate "hazard" situations due to the presence of obstacles closer to or further away from the vehicle, enables the driver to make a full evaluation of the situation. He can then choose the most appropriate behaviour on the basis of the position of the obstacle in the context of the environment, in particular with respect to the edges of the road and other possible vehicles, in order to evaluate, for example, whether a slight manoeuvre to put the vehicle on a path which does not intercept the obstacle (i.e. to steer clear of it) will be sufficient, or whether more decisive manoeuvres are required, such as sharper braking and a rapid change of course.
Representation of the road situation in a more natural manner also makes it possible, in general, to obtain shorter reaction times from the driver, even if the latter has a reduced level of concentration and sources of distraction are present.
Reconstruction of the vehicle's path on the basis of increasing orders of approximation provides acceptable forecasts even where the starting data are in themselves insufficient for a reconstruction of the road, or where the manoeuvre being performed by the driver is not understood. In general, if the vehicle is travelling on a motorway, the higher speed requires that obstacles be indicated while they are still at a distance. In this case the environment is generally structured, in that the existence of the crash barrier makes it possible to recognise the road and therefore to extrapolate the vehicle's path correctly for some distance. If instead the vehicle is travelling on an unstructured road, information on the edge of the road may be absent; in this case, however, the speeds in question are normally lower, and obstacles can therefore be indicated at shorter distances. Here extrapolation on the basis of the angle of the steering wheel alone has been shown to be sufficient.
Also, the type of representation used, which presupposes an interpretation of the situation so as to provide only, and all, the information required, and to display significant objects on the radar map on the basis of their evaluated hazard level, is particularly useful in helping the driver to make a correct evaluation of a situation. In this respect it is advantageous to have a representation, as described with reference to FIG. 7, in which objects off the road, which would confuse the scene and make it less comprehensible at a glance, are not shown; in which objects on the road are illustrated in a compact way, so as to represent possible obstacles which lie in the vehicle's path; in which the dimensions of the obstacles increase with their closeness to the vehicle (on the basis of the laws of perspective); in which emphasis increases with hazard; and in which different alarm levels, with different colours for the obstacles and acoustic warnings, further aid the driver's understanding of the situation, thus also providing a degree of redundancy, which is desirable in the application considered.
Finally it is clear that modifications and variants may be made to the anticollision means and method described and illustrated here without thereby going beyond the scope of the protection of this invention.
In particular, although the use of radar equipment for preparing the map of the environment in front of the vehicle is the most advantageous for the reasons explained above, it is also conceivable that sensors of a different type, such as for example laser beams, may be used for such an application. Detection of the radius of curvature set by the steering wheel may be effected in various ways, either by direct determination of the angle of the steering wheel, as provided for in the example embodiment described above, or by means of an accelerometer or by other means. Similarly the condition (wet or dry) of the road may be detected by means of suitable adhesion sensors, or may be keyed in directly by the driver by means of an appropriate key. As regards the type of display used for the perspective representation of the road situation, various arrangements may be provided; for example a "head-up display", similar to that used in aircraft, may be provided instead of a liquid crystal screen. Also the type of perspective display may be partly modified in respect of, for example, the transformation relationships, the outer profile of objects, the representation of the edges of the road and the height of objects.
In addition to this the method according to the invention permits a number of variants with respect to the embodiment described with reference to FIG. 3. In particular all the data received by the sensors may be subjected to different and/or further types of filtering and processing in order to eliminate undesirable disturbances. For example the various distance measurements made on successive maps may be filtered by means of an alpha-beta filter which requires two observations in order to determine the relative speed of the vehicle (in which the filter is initialized for the distance of the obstacle by the first measurement and the relative velocity of the obstacle is estimated from the second measurement). In this case a 3 Hz radar scanner is sufficient for the purpose of describing the road geometry and understanding the manoeuvre in progress, although other higher frequencies may also be used.
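The alpha-beta filtering of successive distance measurements mentioned here can be sketched as follows. The gains alpha and beta and the 3 Hz scan period are illustrative values; the seeding (distance from the first observation, relative speed estimated from the second) follows the text.

```python
class AlphaBetaFilter:
    """Illustrative alpha-beta filter for the distance of a tracked
    obstacle, seeded from the first two radar observations as
    described in the text. Gain values are assumptions."""

    def __init__(self, alpha: float = 0.85, beta: float = 0.005,
                 dt: float = 1.0 / 3.0):  # dt ~ one 3 Hz scan period
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.d = None        # filtered distance
        self.v = None        # estimated relative speed
        self._first = None   # first raw observation, held until the second

    def update(self, measured: float):
        if self.d is None:
            if self._first is None:
                self._first = measured  # first observation: distance only
                return None
            # second observation: initialise the relative-speed estimate
            self.v = (measured - self._first) / self.dt
            self.d = measured
            return self.d, self.v
        predicted = self.d + self.v * self.dt       # predict one scan ahead
        residual = measured - predicted             # innovation
        self.d = predicted + self.alpha * residual  # correct distance
        self.v = self.v + self.beta * residual / self.dt  # correct speed
        return self.d, self.v
```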
In addition to this, although in the embodiment described it is provided that the value of C1 (the spatial derivative of the presumed path curvature) is determined only on the basis of the conditions specified in blocks 41, 44 or 45, in accordance with a preferred variant, the value of C1 may be selected so as to take into account the level of approximation with which the road geometry is reconstructed. In essence, the value F of the function (calculated in block 40) may be used to modify the value of C1 in such a way that this also depends on the value of F.

Claims (22)

We claim:
1. In a method for avoiding collision between a human driven motor vehicle and at least one obstacle comprising the stages of: searching for at least one obstacle (120) in an environment in front of the motor vehicle (100) along a presumed path (119) for the motor vehicle, calculating the distance (dobst) and the relative speed (Vobst) of the obstacle, and indicating the obstacle with an indication of its hazard level, these stages being repeated cyclically, the improvement in that the stage of searching for the obstacle comprises:
generating a map (115) of the environment in front of the motor vehicle (100) by scanning the environment;
reconstructing the geometry of a road (116) on the basis of the map (115);
reconstructing the presumed path (119) of the motor vehicle on the basis of the reconstructed geometry of the road; and
searching for the obstacle (120) within the reconstructed presumed path.
2. A method according to claim 1, characterised in that it also includes the stages of detecting the manoeuvre being performed by the motor vehicle (100) with respect to the road (116), and of amending the presumed path (119) of the motor vehicle on the basis of the manoeuvre detected.
3. A method according to claim 1, characterised in that the said scanning is performed by means of a microwave radar.
4. A method according to claim 1, characterised in that the said stage of generating a map (115) comprises scanning of the environment in front of the motor vehicle (100) within a predetermined scanning angle centred on the longitudinal axis of the vehicle (100), with subdivision of the environment in front of the vehicle into a plurality of sectors, and storing a matrix which includes the findings calculated on the basis of the corresponding sector and the distance found.
5. A method according to claim 1, characterised in that it includes the stages of measuring the angle (θ) of the steering wheel of the motor vehicle (100), measuring the speed (v) of the vehicle and determining the condition, wet or dry, of the road surface.
6. A method according to claim 5, characterised in that the radius (RV) and the curvature (C0V = 1/RV) of the path travelled by the vehicle (100) at that instant are calculated on the basis of the said steering wheel angle (θ), and the safe distance (dds) of the motor vehicle is calculated on the basis of the formula:
dds = v*Tr + v^2/(2*Amax)
where Tr is a fixed value and Amax represents the maximum deceleration by sharp braking and is stored in the memory as a function of the speed of the vehicle and the road condition.
7. A method according to claim 1, characterised in that the said stage of reconstructing the road geometry includes calculation of the following parameters: the heading angle (Φ) of the motor vehicle (100) and its distance from the edges of the road (Ls, Ld), and the curvature (C0s) and spatial derivative (C1s) of the road curvature (116).
8. A method according to claim 7, characterised in that values of the said parameters (Φ, Ls, Ld, C0s and C1s) are sought which maximise the value F of the function ##EQU4## where i = 1 . . . n, n is equal to the number of obstacles found in the map, d is the geometrical distance between an obstacle having the coordinates (xi, yi) and the edge of the road identified from the abovementioned five parameters, and α is a multiplying coefficient of a predetermined value.
9. A method according to claim 8, characterised in that the value F of the said function is compared with a reference threshold (F0) and when it is less than the said threshold the curvature (C1) of the presumed path (119) is determined on the basis of the angle (θ) of the steering wheel, and if this is not the case the manoeuvre being performed by the motor vehicle (100) is determined.
10. A method according to claim 8, characterised in that the said stage of detecting the manoeuvre being performed includes calculation of the manoeuvre necessary to return the motor vehicle (100) to axial alignment with the road (116) at a predetermined distance (D).
11. A method according to claim 10, characterised in that calculation of the necessary manoeuvre includes calculation of the alignment manoeuvre (C1a) in accordance with the relationship:
C1a = C1V = C1s + 2*(C0s - C0V)/D - 2*Φ/D^2
where C1V represents the derivative of the spatial curvature of the path (119) of the motor vehicle (100), C1s is the derivative of the spatial curvature of the road, C0s is the spatial curvature of the road, C0V is the spatial curvature of the vehicle's path, Φ is the heading angle and D represents the said predetermined distance.
12. A method according to claim 11, characterised in that the calculated value of the aligning manoeuvre (C1a) is compared with a limit value (C1amax) to determine whether the motor vehicle (100) is performing a manoeuvre aligning it with the road profile (119) or is performing a special manoeuvre.
13. A method according to claim 12, characterised in that when the value of the aligning manoeuvre (C1a) lies below the said limit value (C1amax) the spatial derivative (C1) of the curvature of the presumed path (119) is determined on the basis of the said value of the aligning manoeuvre (C1a), otherwise the spatial derivative (C1) of the curvature of the presumed path (119) is calculated on the basis of the special manoeuvre being performed on the basis of the equation: ##EQU5## where C0V (t) and C0V (t-1) are the curvature of the vehicle's path (100) at the instants when the present map (115) and the previous map respectively was determined, v is the speed of the motor vehicle (100) and t is the time between two successive detection cycles.
14. A method according to claim 1, characterised in that the said stage of searching for possible objects within the presumed path includes searching for obstacles (120) within a travel corridor (119) having a width equal to the width of the motor vehicle, a curvature equal to C0s and a derivative of the curvature equal to C1.
15. A method according to claim 14, characterised in that when the said stage of searching for obstacles (120) has a negative outcome a representation of the map (115) of the road situation is displayed.
16. A method according to claim 14, characterised in that when the said stage of searching for obstacles (120) has a positive outcome the identification of an obstacle (120) in the previous map (115) is verified.
17. A method according to claim 16, characterised in that when no obstacle is identified in the previous map (115) a representation of the map (115) of the road situation is displayed; if this is not the case the relative speed (Vobst) of the obstacle found is calculated and the distance (dobst) of the obstacle is compared with the safe distance (dds): when the relative speed (Vobst) of the obstacle is negative and the distance to the obstacle (dobst) is less than the safe distance, a serious alarm condition is generated; when the relative speed of the obstacle is positive and the distance to the obstacle is less than the safe distance, an alarm condition is generated; and when the relative speed is negative and the distance to the obstacle is greater than the safe distance, a prealarm condition is generated.
18. A method according to claim 1, characterised by the generation of a perspective representation of the map (115) of the environment in front of the motor vehicle showing the edges (124) of the road (116), the presumed path (119) of the motor vehicle (100) and objects (125, 126) located on the road, with a different identification according to hazard level.
19. In a system for avoiding collision between a human driven motor vehicle and at least one object, the system including a sensor (22) for monitoring an environment in front of the motor vehicle (100) to detect the presence of the object, means (47, 49) for calculating the distance (dobst) and the relative speed (Vobst) of the object, and means (32, 34) for indicating the presence of the object with an indication of its hazard level, the improvement in that the sensor (22) is a scanning sensor for generating a map (115) of the environment in front of the motor vehicle (100) comprising means (39) for reconstructing the geometry of a road (116) on the basis of the map, means (15) for reconstructing a presumed path (119) for the motor vehicle on the basis of the reconstructed geometry of the road, and means (46) for searching for the object within the presumed path.
20. A system according to claim 19, characterised in that the said sensor (22) is a microwave radar.
21. A system according to claim 19, characterised in that it includes means (43) for detecting the manoeuvre being performed by the motor vehicle (100) with respect to the road (116) and means (45) for altering the presumed path (119) of the motor vehicle (100) on the basis of the manoeuvre found.
22. A system according to claim 19, characterised in that it includes means (17) for counting the positive findings obtained for a given distance within a plurality of sectors of predetermined angular size constituting the map (115) and means (18) for storing a matrix recording the number of findings calculated in relation to the corresponding sector and the distance found.
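As an illustration of claims 11 and 12 above, the alignment-manoeuvre computation and its comparison with the manoeuvrability limit can be sketched as follows. Variable names follow the claims; the limit value C1amax is purely illustrative, and treating the comparison as one on magnitude is our assumption, not the claims' wording.

```python
def alignment_manoeuvre(c0s: float, c1s: float, c0v: float,
                        phi: float, d: float) -> float:
    """Claim 11: C1a = C1s + 2*(C0s - C0V)/D - 2*Phi/D^2, the curvature
    derivative needed to realign the vehicle with the road axis within
    the predetermined distance D."""
    return c1s + 2 * (c0s - c0v) / d - 2 * phi / d ** 2

def is_aligning(c1a: float, c1a_max: float = 1e-4) -> bool:
    """Claim 12: if the required manoeuvre lies below the limit value,
    the vehicle is assumed to be realigning with the road profile;
    otherwise a special manoeuvre is assumed (limit value illustrative,
    magnitude comparison our assumption)."""
    return abs(c1a) < c1a_max
```

A vehicle already on the road axis with matching curvature (C0V = C0s, Φ = 0) needs no corrective manoeuvre, so C1a reduces to C1s; any mismatch in curvature or heading adds the two corrective terms of the formula.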
US08/001,395 1990-07-05 1993-01-06 Method and means for avoiding collision between a motor vehicle and obstacles Expired - Lifetime US5343206A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/001,395 US5343206A (en) 1990-07-05 1993-01-06 Method and means for avoiding collision between a motor vehicle and obstacles

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IT67499A IT1240974B (en) 1990-07-05 1990-07-05 METHOD AND EQUIPMENT TO AVOID THE COLLISION OF A VEHICLE AGAINST OBSTACLES.
IT67499-A/90 1990-07-05
US72501491A 1991-07-03 1991-07-03
US08/001,395 US5343206A (en) 1990-07-05 1993-01-06 Method and means for avoiding collision between a motor vehicle and obstacles

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US72501491A Continuation 1990-07-05 1991-07-03

Publications (1)

Publication Number Publication Date
US5343206A true US5343206A (en) 1994-08-30

Family

ID=11302936

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/001,395 Expired - Lifetime US5343206A (en) 1990-07-05 1993-01-06 Method and means for avoiding collision between a motor vehicle and obstacles

Country Status (5)

Country Link
US (1) US5343206A (en)
EP (1) EP0464821B1 (en)
JP (1) JP3374193B2 (en)
DE (1) DE69113881T2 (en)
IT (1) IT1240974B (en)

Cited By (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5479173A (en) * 1993-03-08 1995-12-26 Mazda Motor Corporation Obstacle sensing apparatus for vehicles
US5585798A (en) * 1993-07-07 1996-12-17 Mazda Motor Corporation Obstacle detection system for automotive vehicle
US5587929A (en) * 1994-09-02 1996-12-24 Caterpillar Inc. System and method for tracking objects using a detection system
US5680097A (en) * 1992-12-10 1997-10-21 Mazda Motor Corporation Vehicle run safety apparatus
US5680313A (en) * 1990-02-05 1997-10-21 Caterpillar Inc. System and method for detecting obstacles in a road
WO1997042521A1 (en) * 1996-05-08 1997-11-13 Daimler-Benz Aktiengesellschaft Process for detecting the road conditions ahead for motor vehicles
US5689264A (en) * 1994-10-05 1997-11-18 Mazda Motor Corporation Obstacle detecting system for vehicles
US5751211A (en) * 1995-12-28 1998-05-12 Denso Corporation Obstacle warning system for a vehicle
US5786787A (en) * 1994-06-07 1998-07-28 Celsiustech Electronics Ab Method for determining the course of another vehicle
US5801667A (en) * 1994-06-02 1998-09-01 Nissan Motor Co., Ltd. Vehicle display which reduces driver's recognition time of alarm display
US5926126A (en) * 1997-09-08 1999-07-20 Ford Global Technologies, Inc. Method and system for detecting an in-path target obstacle in front of a vehicle
US5930739A (en) * 1995-04-07 1999-07-27 Regie Nationale Des Usines Renault Method for measuring the yaw velocity of a vehicle
US5995037A (en) * 1997-06-30 1999-11-30 Honda Giken Kogyo Kabushiki Kaisha Obstacle detection system for a vehicle
US6070120A (en) * 1996-12-20 2000-05-30 Mannesmann Vdo Ag Method and system for the determination in advance of a travel corridor of a motor vehicle
US6256584B1 (en) * 1998-08-25 2001-07-03 Honda Giken Kogyo Kabushiki Kaisha Travel safety system for vehicle
US6285778B1 (en) * 1991-09-19 2001-09-04 Yazaki Corporation Vehicle surroundings monitor with obstacle avoidance lighting
US6369700B1 (en) 1998-08-27 2002-04-09 Toyota Jidosha Kabushiki Kaisha On-vehicle DBF radar apparatus
US6377191B1 (en) * 1999-05-25 2002-04-23 Fujitsu Limited System for assisting traffic safety of vehicles
US20020049539A1 (en) * 2000-09-08 2002-04-25 Russell Mark E. Path prediction system and method
US6388565B1 (en) * 1999-05-08 2002-05-14 Daimlerchrysler Ag Guidance system for assisting lane change of a motor vehicle
US20020067287A1 (en) * 2000-08-16 2002-06-06 Delcheccolo Michael Joseph Near object detection system
US6420997B1 (en) * 2000-06-08 2002-07-16 Automotive Systems Laboratory, Inc. Track map generator
US20020093180A1 (en) * 1994-05-23 2002-07-18 Breed David S. Externally deployed airbag system
US20020097170A1 (en) * 2000-06-30 2002-07-25 Nobuhiko Yasui Rendering device
US20020168958A1 (en) * 2001-05-14 2002-11-14 International Business Machines Corporation System and method for providing personal and emergency service hailing in wireless network
US6496770B2 (en) * 2000-03-28 2002-12-17 Robert Bosch Gmbh Method and apparatus for controlling the travel speed of a vehicle
US6539294B1 (en) * 1998-02-13 2003-03-25 Komatsu Ltd. Vehicle guidance system for avoiding obstacles stored in memory
US20030060956A1 (en) * 2001-09-21 2003-03-27 Ford Motor Company Method for operating a pre-crash sensing system with object classifier in a vehicle having a countermeasure system
US6542111B1 (en) * 2001-08-13 2003-04-01 Yazaki North America, Inc. Path prediction for vehicular collision warning system
US20030174054A1 (en) * 2002-03-12 2003-09-18 Nissan Motor Co., Ltd. Method for determining object type of reflective object on track
WO2003093914A1 (en) * 2002-04-27 2003-11-13 Robert Bosch Gmbh Method and device for predicting the course of motor vehicles
FR2839933A1 (en) * 2002-05-27 2003-11-28 Bernard Jean Francois C Roquet Aid to driving motor vehicle in bad external visibility, uses video camera capturing view ahead for display on a screen, with automatic cleaning of camera and filters against oncoming bright lights or low sun
US20040051659A1 (en) * 2002-09-18 2004-03-18 Garrison Darwin A. Vehicular situational awareness system
US6864831B2 (en) * 2000-08-16 2005-03-08 Raytheon Company Radar detection method and apparatus
DE10347168A1 (en) * 2003-10-06 2005-04-21 Valeo Schalter & Sensoren Gmbh Maneuvering aid for motor vehicles
US20050179580A1 (en) * 2002-07-15 2005-08-18 Shan Cong Road curvature estimation and automotive target state estimation system
WO2005078476A1 (en) * 2004-02-12 2005-08-25 Carlos Vargas Marquez Radar system for vehicles
US20050197771A1 (en) * 2004-03-04 2005-09-08 Seick Ryan E. Potential accident detection assessment wireless alert network
US7015876B1 (en) * 1998-06-03 2006-03-21 Lear Corporation Heads-up display with improved contrast
US20060089799A1 (en) * 2004-10-21 2006-04-27 Kenjiro Endoh Vehicle detector and vehicle detecting method
US20070008211A1 (en) * 2005-03-31 2007-01-11 Denso It Laboratory, Inc. Vehicle mounted radar apparatus
US20070080825A1 (en) * 2003-09-16 2007-04-12 Zvi Shiller Method and system for providing warnings concerning an imminent vehicular collision
US20070102214A1 (en) * 2005-09-06 2007-05-10 Marten Wittorf Method and system for improving traffic safety
US20080046150A1 (en) * 1994-05-23 2008-02-21 Automotive Technologies International, Inc. System and Method for Detecting and Protecting Pedestrians
US20080119993A1 (en) * 1994-05-23 2008-05-22 Automotive Technologies International, Inc. Exterior Airbag Deployment Techniques
US20080183419A1 (en) * 2002-07-15 2008-07-31 Automotive Systems Laboratory, Inc. Road curvature estimation system
US20080306666A1 (en) * 2007-06-05 2008-12-11 Gm Global Technology Operations, Inc. Method and apparatus for rear cross traffic collision avoidance
US7532152B1 (en) 2007-11-26 2009-05-12 Toyota Motor Engineering & Manufacturing North America, Inc. Automotive radar system
US20090187335A1 (en) * 2008-01-18 2009-07-23 Mathias Muhlfelder Navigation Device
US20090326734A1 (en) * 2008-06-27 2009-12-31 Caterpillar Inc. Worksite avoidance system
US20100010699A1 (en) * 2006-11-01 2010-01-14 Koji Taguchi Cruise control plan evaluation device and method
US20100017067A1 (en) * 2007-01-29 2010-01-21 Josef Kolatschek Method and control unit for triggering passenger protection means
US20100023264A1 (en) * 2008-07-23 2010-01-28 Honeywell International Inc. Aircraft display systems and methods with obstacle warning envelopes
US20100076685A1 (en) * 2008-09-25 2010-03-25 Ford Global Technologies, Llc System and method for assessing vehicle paths in a road environment
US20100238066A1 (en) * 2005-12-30 2010-09-23 Valeo Raytheon Systems, Inc. Method and system for generating a target alert
US20100328644A1 (en) * 2007-11-07 2010-12-30 Yuesheng Lu Object Detection and Tracking System
US20110093134A1 (en) * 2008-07-08 2011-04-21 Emanuel David C Method and apparatus for collision avoidance
US20110153742A1 (en) * 2009-12-23 2011-06-23 Aws Convergence Technologies, Inc. Method and Apparatus for Conveying Vehicle Driving Information
US20110228980A1 (en) * 2009-10-07 2011-09-22 Panasonic Corporation Control apparatus and vehicle surrounding monitoring apparatus
US20120078498A1 (en) * 2009-06-02 2012-03-29 Masahiro Iwasaki Vehicular peripheral surveillance device
US20130342373A1 (en) * 2012-06-26 2013-12-26 Honeywell International Inc. Methods and systems for taxiway traffic alerting
US20140104051A1 (en) * 2002-05-20 2014-04-17 Intelligent Technologies International, Inc. Vehicular anticipatory sensor system
WO2014108233A1 (en) * 2013-01-14 2014-07-17 Robert Bosch Gmbh Creation of an obstacle map
US20140214276A1 (en) * 2013-01-28 2014-07-31 Fujitsu Ten Limited Object detector
US8818042B2 (en) 2004-04-15 2014-08-26 Magna Electronics Inc. Driver assistance system for vehicle
US8842176B2 (en) 1996-05-22 2014-09-23 Donnelly Corporation Automatic vehicle exterior light control
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US8977008B2 (en) 2004-09-30 2015-03-10 Donnelly Corporation Driver assistance system for vehicle
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US20150112570A1 (en) * 2013-10-22 2015-04-23 Honda Research Institute Europe Gmbh Confidence estimation for predictive driver assistance systems based on plausibility rules
US20150183410A1 (en) * 2013-12-30 2015-07-02 Automotive Research & Testing Center Adaptive anti-collision method for vehicle
US9076336B2 (en) 2013-03-15 2015-07-07 Volkswagen Ag Personalized parking assistant
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US9191574B2 (en) 2001-07-31 2015-11-17 Magna Electronics Inc. Vehicular vision system
US9255989B2 (en) 2012-07-24 2016-02-09 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking on-road vehicles with sensors of different modalities
US20160084952A1 (en) * 2014-09-24 2016-03-24 Nxp B.V. Personal radar assistance
US9308919B2 (en) 2013-10-22 2016-04-12 Honda Research Institute Europe Gmbh Composite confidence estimation for predictive driver assistant systems
US20160121892A1 (en) * 2013-06-18 2016-05-05 Continental Automotive Gmbh Method and device for determining a driving state of an external motor vehicle
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US9440535B2 (en) 2006-08-11 2016-09-13 Magna Electronics Inc. Vision system for vehicle
US9508014B2 (en) 2013-05-06 2016-11-29 Magna Electronics Inc. Vehicular multi-camera vision system
US20170060138A1 (en) * 2014-02-07 2017-03-02 Crown Equipment Corporation Systems, methods, and mobile client devices for supervising industrial vehicles
WO2017105320A1 (en) * 2015-12-17 2017-06-22 Scania Cv Ab Method and system for following a trail of a vehicle along a road
WO2017105319A1 (en) * 2015-12-17 2017-06-22 Scania Cv Ab Method and system for facilitating following a leader vehicle along a road
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US20180172814A1 (en) * 2016-12-20 2018-06-21 Panasonic Intellectual Property Management Co., Ltd. Object detection device and recording medium
US10310064B2 (en) * 2016-08-15 2019-06-04 Qualcomm Incorporated Saliency based beam-forming for object detection
US10429506B2 (en) * 2014-10-22 2019-10-01 Denso Corporation Lateral distance sensor diagnosis apparatus
US10448555B2 (en) 2016-05-27 2019-10-22 Cnh Industrial America Llc System and method for scouting vehicle mapping
CN111554116A (en) * 2018-12-26 2020-08-18 歌乐株式会社 Vehicle-mounted processing device
US10793067B2 (en) 2011-07-26 2020-10-06 Magna Electronics Inc. Imaging system for vehicle
EP3796031A1 (en) * 2019-09-20 2021-03-24 Veoneer Sweden AB A method for reducing the amount of sensor data from a forward-looking vehicle sensor
US20220035018A1 (en) * 2018-09-26 2022-02-03 Kyocera Corporation Electronic device, method for controlling electronic device, and electronic device control program
US11277558B2 (en) 2016-02-01 2022-03-15 Magna Electronics Inc. Vehicle vision system with master-slave camera configuration
US11366214B2 (en) * 2019-12-30 2022-06-21 Woven Planet North America, Inc. Systems and methods for adaptive clutter removal from radar scans
US11433809B2 (en) 2016-02-02 2022-09-06 Magna Electronics Inc. Vehicle vision system with smart camera video output
CN115171414A (en) * 2022-06-10 2022-10-11 哈尔滨工业大学重庆研究院 CACC following traffic flow control system based on Frenet coordinate system
CN115214706A (en) * 2022-06-09 2022-10-21 广东省智能网联汽车创新中心有限公司 Dangerous road early warning method and system based on V2X

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2748065B2 (en) * 1992-03-31 1998-05-06 株式会社小糸製作所 Radar equipment for vehicles
JP3164439B2 (en) * 1992-10-21 2001-05-08 マツダ株式会社 Obstacle detection device for vehicles
DE4319904C2 (en) * 1993-06-16 2002-11-28 Siemens Ag Warning device for displaying information in a vehicle
JP3401913B2 (en) * 1994-05-26 2003-04-28 株式会社デンソー Obstacle recognition device for vehicles
JP3132361B2 (en) * 1995-03-17 2001-02-05 トヨタ自動車株式会社 Automotive radar equipment
DE59607990D1 (en) * 1995-08-30 2001-11-29 Volkswagen Ag Obstacle detection method for speed and/or distance control of a motor vehicle
JP3314623B2 (en) * 1996-08-12 2002-08-12 トヨタ自動車株式会社 In-vehicle scanning radar
SE511013C2 (en) * 1997-12-10 1999-07-19 Celsiustech Electronics Ab Procedure for predicting the occurrence of a bend on a road section
DE19859345A1 (en) * 1998-12-22 2000-07-06 Mannesmann Vdo Ag Device for displaying a control situation determined by a distance control device of a motor vehicle
DE19944542C2 (en) * 1999-09-17 2003-01-23 Daimler Chrysler Ag Method for on-board determination of the vehicle's course
GB2358975B (en) 2000-02-05 2004-05-05 Jaguar Cars Motor vehicle trajectory measurement
AU2001241808A1 (en) 2000-02-28 2001-09-12 Veridian Engineering, Incorporated System and method for avoiding accidents in intersections
JP4081958B2 (en) * 2000-03-29 2008-04-30 松下電工株式会社 Obstacle detection device for vehicles
DE10050127B4 (en) * 2000-10-11 2014-04-30 Volkswagen Ag Method for determining the driving corridor of a vehicle
EP1347306A3 (en) 2001-11-08 2004-06-30 Fujitsu Ten Limited Scan type radar device
EP1369283B1 (en) * 2002-05-31 2009-09-02 Volkswagen AG Automatic distance control system
JP2004038877A (en) * 2002-07-08 2004-02-05 Yazaki Corp Perimeter monitoring device and image processing apparatus for vehicles
DE10257798A1 (en) * 2002-12-11 2004-07-22 Daimlerchrysler Ag Safety device for non-tracked vehicles
DE10323465A1 (en) * 2003-05-23 2004-12-30 Robert Bosch Gmbh Vehicle speed control device
JP4412337B2 (en) 2007-03-08 2010-02-10 トヨタ自動車株式会社 Ambient environment estimation device and ambient environment estimation system
DE102007047720A1 (en) * 2007-10-05 2009-04-09 Robert Bosch Gmbh Driver assistance system for a motor vehicle
US8026800B2 (en) * 2008-08-26 2011-09-27 GM Global Technology Operations LLC Methods and systems for controlling external visual indicators for vehicles
EP2306433A1 (en) 2009-10-05 2011-04-06 Nederlandse Organisatie voor toegepast-natuurwetenschappelijk onderzoek TNO Collision avoidance system and method for a road vehicle and respective computer program product
DE102009047066A1 (en) * 2009-11-24 2011-05-26 Robert Bosch Gmbh Method for warning of an object in the vicinity of a vehicle, and driver assistance system
DE102010010912A1 (en) * 2010-03-10 2010-12-02 Daimler Ag Driver assistance device for a vehicle, having a sensor unit for detecting an object in the vehicle's surroundings and a display unit for optically representing the detected object in a schematic top view of the vehicle
GB2510167B (en) * 2013-01-28 2017-04-19 Jaguar Land Rover Ltd Vehicle path prediction and obstacle indication system and method
DE102016001202B4 (en) * 2016-02-03 2021-07-08 Audi Ag Motor vehicle
DE102016001201B4 (en) * 2016-02-03 2021-07-08 Audi Ag Motor vehicle
JP6804991B2 (en) * 2017-01-05 2020-12-23 株式会社東芝 Information processing equipment, information processing methods, and information processing programs
US10803740B2 (en) 2017-08-11 2020-10-13 Cubic Corporation System and method of navigating vehicles
US10373489B2 (en) 2017-08-11 2019-08-06 Cubic Corporation System and method of adaptive controlling of traffic using camera data
US10636298B2 (en) 2017-08-11 2020-04-28 Cubic Corporation Adaptive traffic control using object tracking and identity details
US10636299B2 (en) 2017-08-11 2020-04-28 Cubic Corporation System and method for controlling vehicular traffic
US11250699B2 (en) 2017-08-14 2022-02-15 Cubic Corporation System and method of adaptive traffic management at an intersection
US10395522B2 (en) 2017-08-14 2019-08-27 Cubic Corporation Adaptive traffic optimization using unmanned aerial vehicles
US11100336B2 (en) 2017-08-14 2021-08-24 Cubic Corporation System and method of adaptive traffic management at an intersection
US10935388B2 (en) 2017-08-14 2021-03-02 Cubic Corporation Adaptive optimization of navigational routes using traffic data
US10559198B1 (en) 2018-08-08 2020-02-11 Cubic Corporation System and method of adaptive controlling of traffic using zone based occupancy
DE102020208878A1 (en) 2020-02-27 2021-09-02 Continental Automotive Gmbh Process for curve detection

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3717873A (en) * 1970-11-05 1973-02-20 Sperry Rand Corp Ship's maneuver assessment system
US4011563A (en) * 1976-02-06 1977-03-08 Rca Corporation Variable range automotive radar system
US4072945A (en) * 1975-12-02 1978-02-07 Nissan Motor Company, Limited Radar-operated collision avoidance system for roadway vehicles using stored information for determination of valid objects
US4148028A (en) * 1976-08-03 1979-04-03 Nissan Motor Company, Limited Radar system for an anti-collision system for a vehicle
US4158841A (en) * 1976-05-26 1979-06-19 Daimler-Benz Aktiengesellschaft Method and apparatus for the control of the safety distance of a vehicle relative to preceding vehicles
US4165511A (en) * 1976-10-15 1979-08-21 Robert Bosch Gmbh Reduction of echoes of irrelevant targets in a vehicle anti-collision radar system
US4195425A (en) * 1972-07-17 1980-04-01 Ernst Leitz Wetzlar Gmbh System for measuring position and/or velocity
US4197538A (en) * 1976-08-02 1980-04-08 Stocker Godfrey H Pilot's traffic monitoring system
US4313115A (en) * 1978-05-10 1982-01-26 Sperry Limited Collision avoidance apparatus
US4361202A (en) * 1979-06-15 1982-11-30 Michael Minovitch Automated road transportation system
US4623966A (en) * 1983-02-19 1986-11-18 Sperry Limited Collision avoidance apparatus
US4632543A (en) * 1983-05-06 1986-12-30 Nissan Motor Company, Limited Optical radar system for vehicles
US4673937A (en) * 1985-07-24 1987-06-16 Davis John W Automotive collision avoidance and/or air bag deployment radar
US4833469A (en) * 1987-08-03 1989-05-23 David Constant V Obstacle proximity detector for moving vehicles and method for use thereof
EP0353200A2 (en) * 1988-06-27 1990-01-31 FIAT AUTO S.p.A. Method and device for instrument-assisted vision in poor visibility, particularly for driving in fog
US4926171A (en) * 1988-11-21 1990-05-15 Kelley William L Collision predicting and avoidance device for moving vehicles
US4970653A (en) * 1989-04-06 1990-11-13 General Motors Corporation Vision method of detecting lane boundaries and obstacles
US5068654A (en) * 1989-07-03 1991-11-26 Hazard Detection Systems Collision avoidance system
US5081585A (en) * 1987-06-17 1992-01-14 Nissan Motor Company, Ltd. Control system for autonomous automotive vehicle or the like
US5122957A (en) * 1989-02-28 1992-06-16 Nissan Motor Company, Limited Autonomous vehicle for automatically/autonomously running on route of travel and its method using fuzzy control
US5128874A (en) * 1990-01-02 1992-07-07 Honeywell Inc. Inertial navigation sensor integrated obstacle detection system
US5140532A (en) * 1981-01-13 1992-08-18 Harris Corporation Digital map generator and display system
US5208750A (en) * 1987-06-17 1993-05-04 Nissan Motor Co., Ltd. Control system for unmanned automotive vehicle
US5249126A (en) * 1989-09-27 1993-09-28 Nissan Motor Company, Limited System and method for controlling steering response according to vehicle speed applicable to autonomous vehicle

Cited By (206)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5680313A (en) * 1990-02-05 1997-10-21 Caterpillar Inc. System and method for detecting obstacles in a road
US6285778B1 (en) * 1991-09-19 2001-09-04 Yazaki Corporation Vehicle surroundings monitor with obstacle avoidance lighting
US5680097A (en) * 1992-12-10 1997-10-21 Mazda Motor Corporation Vehicle run safety apparatus
US8917169B2 (en) 1993-02-26 2014-12-23 Magna Electronics Inc. Vehicular vision system
US5479173A (en) * 1993-03-08 1995-12-26 Mazda Motor Corporation Obstacle sensing apparatus for vehicles
US5585798A (en) * 1993-07-07 1996-12-17 Mazda Motor Corporation Obstacle detection system for automotive vehicle
US20080046150A1 (en) * 1994-05-23 2008-02-21 Automotive Technologies International, Inc. System and Method for Detecting and Protecting Pedestrians
US20100057305A1 (en) * 1994-05-23 2010-03-04 Automotive Technologies International, Inc. Exterior Airbag Deployment Techniques
US8447474B2 (en) 1994-05-23 2013-05-21 American Vehicular Sciences Llc Exterior airbag deployment techniques
US20020093180A1 (en) * 1994-05-23 2002-07-18 Breed David S. Externally deployed airbag system
US7630806B2 (en) 1994-05-23 2009-12-08 Automotive Technologies International, Inc. System and method for detecting and protecting pedestrians
US6749218B2 (en) * 1994-05-23 2004-06-15 Automotive Technologies International, Inc. Externally deployed airbag system
US20080119993A1 (en) * 1994-05-23 2008-05-22 Automotive Technologies International, Inc. Exterior Airbag Deployment Techniques
US8041483B2 (en) 1994-05-23 2011-10-18 Automotive Technologies International, Inc. Exterior airbag deployment techniques
US5801667A (en) * 1994-06-02 1998-09-01 Nissan Motor Co., Ltd. Vehicle display which reduces driver's recognition time of alarm display
US5786787A (en) * 1994-06-07 1998-07-28 Celsiustech Electronics Ab Method for determining the course of another vehicle
US5668739A (en) * 1994-09-02 1997-09-16 Caterpillar Inc. System and method for tracking objects using a detection system
US5587929A (en) * 1994-09-02 1996-12-24 Caterpillar Inc. System and method for tracking objects using a detection system
AU687218B2 (en) * 1994-09-02 1998-02-19 Caterpillar Inc. System and method for tracking objects using a detection system
US5689264A (en) * 1994-10-05 1997-11-18 Mazda Motor Corporation Obstacle detecting system for vehicles
US5930739A (en) * 1995-04-07 1999-07-27 Regie Nationale Des Usines Renault Method for measuring the yaw velocity of a vehicle
US5751211A (en) * 1995-12-28 1998-05-12 Denso Corporation Obstacle warning system for a vehicle
US8993951B2 (en) 1996-03-25 2015-03-31 Magna Electronics Inc. Driver assistance system for a vehicle
US6300865B1 (en) 1996-05-08 2001-10-09 Daimlerchrysler Ag Process for detecting the road conditions ahead for motor vehicles
WO1997042521A1 (en) * 1996-05-08 1997-11-13 Daimler-Benz Aktiengesellschaft Process for detecting the road conditions ahead for motor vehicles
US9131120B2 (en) 1996-05-22 2015-09-08 Magna Electronics Inc. Multi-camera vision system for a vehicle
US8842176B2 (en) 1996-05-22 2014-09-23 Donnelly Corporation Automatic vehicle exterior light control
US6070120A (en) * 1996-12-20 2000-05-30 Mannesmann Vdo Ag Method and system for the determination in advance of a travel corridor of a motor vehicle
US5995037A (en) * 1997-06-30 1999-11-30 Honda Giken Kogyo Kabushiki Kaisha Obstacle detection system for a vehicle
US5926126A (en) * 1997-09-08 1999-07-20 Ford Global Technologies, Inc. Method and system for detecting an in-path target obstacle in front of a vehicle
US6539294B1 (en) * 1998-02-13 2003-03-25 Komatsu Ltd. Vehicle guidance system for avoiding obstacles stored in memory
US7015876B1 (en) * 1998-06-03 2006-03-21 Lear Corporation Heads-up display with improved contrast
US20060125714A1 (en) * 1998-06-03 2006-06-15 Lear Automotive Dearborn, Inc. Heads-up display with improved contrast
US6256584B1 (en) * 1998-08-25 2001-07-03 Honda Giken Kogyo Kabushiki Kaisha Travel safety system for vehicle
US6369700B1 (en) 1998-08-27 2002-04-09 Toyota Jidosha Kabushiki Kaisha On-vehicle DBF radar apparatus
US6388565B1 (en) * 1999-05-08 2002-05-14 Daimlerchrysler Ag Guidance system for assisting lane change of a motor vehicle
US6377191B1 (en) * 1999-05-25 2002-04-23 Fujitsu Limited System for assisting traffic safety of vehicles
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US6496770B2 (en) * 2000-03-28 2002-12-17 Robert Bosch Gmbh Method and apparatus for controlling the travel speed of a vehicle
US6420997B1 (en) * 2000-06-08 2002-07-16 Automotive Systems Laboratory, Inc. Track map generator
EP1168241A3 (en) * 2000-06-30 2004-01-14 Matsushita Electric Industrial Co., Ltd. Rendering device
US20020097170A1 (en) * 2000-06-30 2002-07-25 Nobuhiko Yasui Rendering device
US6825779B2 (en) 2000-06-30 2004-11-30 Matsushita Electric Industrial Co., Ltd. Rendering device
US20020067287A1 (en) * 2000-08-16 2002-06-06 Delcheccolo Michael Joseph Near object detection system
US6670910B2 (en) 2000-08-16 2003-12-30 Raytheon Company Near object detection system
US6864831B2 (en) * 2000-08-16 2005-03-08 Raytheon Company Radar detection method and apparatus
US6784828B2 (en) 2000-08-16 2004-08-31 Raytheon Company Near object detection system
US7071868B2 (en) 2000-08-16 2006-07-04 Raytheon Company Radar detection method and apparatus
US6675094B2 (en) 2000-09-08 2004-01-06 Raytheon Company Path prediction system and method
US20020049539A1 (en) * 2000-09-08 2002-04-25 Russell Mark E. Path prediction system and method
US20020168958A1 (en) * 2001-05-14 2002-11-14 International Business Machines Corporation System and method for providing personal and emergency service hailing in wireless network
US7529537B2 (en) * 2001-05-14 2009-05-05 International Business Machines Corporation System and method for providing personal and emergency service hailing in wireless network
US7689202B2 (en) * 2001-05-14 2010-03-30 International Business Machines Corporation System and method for providing personal and emergency service hailing in wireless network
US9191574B2 (en) 2001-07-31 2015-11-17 Magna Electronics Inc. Vehicular vision system
US9834142B2 (en) 2001-07-31 2017-12-05 Magna Electronics Inc. Driving assist system for vehicle
US9656608B2 (en) 2001-07-31 2017-05-23 Magna Electronics Inc. Driver assist system for vehicle
US10046702B2 (en) 2001-07-31 2018-08-14 Magna Electronics Inc. Control system for vehicle
US10611306B2 (en) 2001-07-31 2020-04-07 Magna Electronics Inc. Video processor module for vehicle
US9376060B2 (en) 2001-07-31 2016-06-28 Magna Electronics Inc. Driver assist system for vehicle
US6542111B1 (en) * 2001-08-13 2003-04-01 Yazaki North America, Inc. Path prediction for vehicular collision warning system
US6859705B2 (en) * 2001-09-21 2005-02-22 Ford Global Technologies, Llc Method for operating a pre-crash sensing system with object classifier in a vehicle having a countermeasure system
US20030060956A1 (en) * 2001-09-21 2003-03-27 Ford Motor Company Method for operating a pre-crash sensing system with object classifier in a vehicle having a countermeasure system
US20030174054A1 (en) * 2002-03-12 2003-09-18 Nissan Motor Co., Ltd. Method for determining object type of reflective object on track
US6888622B2 (en) * 2002-03-12 2005-05-03 Nissan Motor Co., Ltd. Method for determining object type of reflective object on track
US8126640B2 (en) * 2002-04-27 2012-02-28 Robert Bosch Gmbh Method and device for predicting the course of motor vehicles
WO2003093914A1 (en) * 2002-04-27 2003-11-13 Robert Bosch Gmbh Method and device for predicting the course of motor vehicles
US20050228580A1 (en) * 2002-04-27 2005-10-13 Hermann Winner Method and device for predicting the course of motor vehicles
US10351135B2 (en) 2002-05-03 2019-07-16 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US10118618B2 (en) 2002-05-03 2018-11-06 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9555803B2 (en) 2002-05-03 2017-01-31 Magna Electronics Inc. Driver assistance system for vehicle
US9643605B2 (en) 2002-05-03 2017-05-09 Magna Electronics Inc. Vision system for vehicle
US10683008B2 (en) 2002-05-03 2020-06-16 Magna Electronics Inc. Vehicular driving assist system using forward-viewing camera
US11203340B2 (en) 2002-05-03 2021-12-21 Magna Electronics Inc. Vehicular vision system using side-viewing camera
US9834216B2 (en) 2002-05-03 2017-12-05 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9007197B2 (en) * 2002-05-20 2015-04-14 Intelligent Technologies International, Inc. Vehicular anticipatory sensor system
US20140104051A1 (en) * 2002-05-20 2014-04-17 Intelligent Technologies International, Inc. Vehicular anticipatory sensor system
FR2839933A1 (en) * 2002-05-27 2003-11-28 Bernard Jean Francois C Roquet Driving aid for motor vehicles in poor external visibility, using a video camera to capture the view ahead for display on a screen, with automatic camera cleaning and filtering against oncoming bright lights or low sun
US7034742B2 (en) 2002-07-15 2006-04-25 Automotive Systems Laboratory, Inc. Road curvature estimation and automotive target state estimation system
US20050179580A1 (en) * 2002-07-15 2005-08-18 Shan Cong Road curvature estimation and automotive target state estimation system
US7522091B2 (en) 2002-07-15 2009-04-21 Automotive Systems Laboratory, Inc. Road curvature estimation system
US20080183419A1 (en) * 2002-07-15 2008-07-31 Automotive Systems Laboratory, Inc. Road curvature estimation system
US7626533B2 (en) 2002-07-15 2009-12-01 Automotive Systems Laboratory, Inc. Road curvature estimation system
US20040051659A1 (en) * 2002-09-18 2004-03-18 Garrison Darwin A. Vehicular situational awareness system
US20070080825A1 (en) * 2003-09-16 2007-04-12 Zvi Shiller Method and system for providing warnings concerning an imminent vehicular collision
US7797107B2 (en) * 2003-09-16 2010-09-14 Zvi Shiller Method and system for providing warnings concerning an imminent vehicular collision
DE10347168A1 (en) * 2003-10-06 2005-04-21 Valeo Schalter & Sensoren Gmbh Maneuvering aid for motor vehicles
WO2005078476A1 (en) * 2004-02-12 2005-08-25 Carlos Vargas Marquez Radar system for vehicles
US20050197771A1 (en) * 2004-03-04 2005-09-08 Seick Ryan E. Potential accident detection assessment wireless alert network
US9008369B2 (en) 2004-04-15 2015-04-14 Magna Electronics Inc. Vision system for vehicle
US11847836B2 (en) 2004-04-15 2023-12-19 Magna Electronics Inc. Vehicular control system with road curvature determination
US9428192B2 (en) 2004-04-15 2016-08-30 Magna Electronics Inc. Vision system for vehicle
US11503253B2 (en) 2004-04-15 2022-11-15 Magna Electronics Inc. Vehicular control system with traffic lane detection
US10462426B2 (en) 2004-04-15 2019-10-29 Magna Electronics Inc. Vehicular control system
US10110860B1 (en) 2004-04-15 2018-10-23 Magna Electronics Inc. Vehicular control system
US10015452B1 (en) 2004-04-15 2018-07-03 Magna Electronics Inc. Vehicular control system
US9191634B2 (en) 2004-04-15 2015-11-17 Magna Electronics Inc. Vision system for vehicle
US9736435B2 (en) 2004-04-15 2017-08-15 Magna Electronics Inc. Vision system for vehicle
US10306190B1 (en) 2004-04-15 2019-05-28 Magna Electronics Inc. Vehicular control system
US10735695B2 (en) 2004-04-15 2020-08-04 Magna Electronics Inc. Vehicular control system with traffic lane detection
US9948904B2 (en) 2004-04-15 2018-04-17 Magna Electronics Inc. Vision system for vehicle
US8818042B2 (en) 2004-04-15 2014-08-26 Magna Electronics Inc. Driver assistance system for vehicle
US9609289B2 (en) 2004-04-15 2017-03-28 Magna Electronics Inc. Vision system for vehicle
US10187615B1 (en) 2004-04-15 2019-01-22 Magna Electronics Inc. Vehicular control system
US10623704B2 (en) 2004-09-30 2020-04-14 Donnelly Corporation Driver assistance system for vehicle
US8977008B2 (en) 2004-09-30 2015-03-10 Donnelly Corporation Driver assistance system for vehicle
US7509217B2 (en) * 2004-10-21 2009-03-24 Alpine Electronics, Inc. Vehicle detector and vehicle detecting method
US20060089799A1 (en) * 2004-10-21 2006-04-27 Kenjiro Endoh Vehicle detector and vehicle detecting method
US9940528B2 (en) 2004-12-23 2018-04-10 Magna Electronics Inc. Driver assistance system for vehicle
US9193303B2 (en) 2004-12-23 2015-11-24 Magna Electronics Inc. Driver assistance system for vehicle
US11308720B2 (en) 2004-12-23 2022-04-19 Magna Electronics Inc. Vehicular imaging system
US12118806B2 (en) 2004-12-23 2024-10-15 Magna Electronics Inc. Vehicular imaging system
US10509972B2 (en) 2004-12-23 2019-12-17 Magna Electronics Inc. Vehicular vision system
US9014904B2 (en) 2004-12-23 2015-04-21 Magna Electronics Inc. Driver assistance system for vehicle
US20070008211A1 (en) * 2005-03-31 2007-01-11 Denso It Laboratory, Inc. Vehicle mounted radar apparatus
US20070102214A1 (en) * 2005-09-06 2007-05-10 Marten Wittorf Method and system for improving traffic safety
US7782184B2 (en) * 2005-09-06 2010-08-24 Gm Global Technology Operations, Inc. Method and system for improving traffic safety
US20100238066A1 (en) * 2005-12-30 2010-09-23 Valeo Raytheon Systems, Inc. Method and system for generating a target alert
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US11148583B2 (en) 2006-08-11 2021-10-19 Magna Electronics Inc. Vehicular forward viewing image capture system
US11623559B2 (en) 2006-08-11 2023-04-11 Magna Electronics Inc. Vehicular forward viewing image capture system
US11396257B2 (en) 2006-08-11 2022-07-26 Magna Electronics Inc. Vehicular forward viewing image capture system
US11951900B2 (en) 2006-08-11 2024-04-09 Magna Electronics Inc. Vehicular forward viewing image capture system
US9440535B2 (en) 2006-08-11 2016-09-13 Magna Electronics Inc. Vision system for vehicle
US10787116B2 (en) 2006-08-11 2020-09-29 Magna Electronics Inc. Adaptive forward lighting system for vehicle comprising a control that adjusts the headlamp beam in response to processing of image data captured by a camera
US20100010699A1 (en) * 2006-11-01 2010-01-14 Koji Taguchi Cruise control plan evaluation device and method
US9224299B2 (en) 2006-11-01 2015-12-29 Toyota Jidosha Kabushiki Kaisha Cruise control plan evaluation device and method
US20100017067A1 (en) * 2007-01-29 2010-01-21 Josef Kolatschek Method and control unit for triggering passenger protection means
US20080306666A1 (en) * 2007-06-05 2008-12-11 Gm Global Technology Operations, Inc. Method and apparatus for rear cross traffic collision avoidance
US10295667B2 (en) 2007-11-07 2019-05-21 Magna Electronics Inc. Object detection system
US9383445B2 (en) 2007-11-07 2016-07-05 Magna Electronics Inc. Object detection system
US8027029B2 (en) * 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
US20100328644A1 (en) * 2007-11-07 2010-12-30 Yuesheng Lu Object Detection and Tracking System
US11346951B2 (en) 2007-11-07 2022-05-31 Magna Electronics Inc. Object detection system
US8767186B2 (en) 2007-11-07 2014-07-01 Magna Electronics Inc. Object detection system
US20090135050A1 (en) * 2007-11-26 2009-05-28 Toyota Motor Engineering & Manufacturing North America, Inc. Automotive radar system
US7532152B1 (en) 2007-11-26 2009-05-12 Toyota Motor Engineering & Manufacturing North America, Inc. Automotive radar system
US8935046B2 (en) * 2008-01-18 2015-01-13 Garmin Switzerland Gmbh Navigation device
US20090187335A1 (en) * 2008-01-18 2009-07-23 Mathias Muhlfelder Navigation Device
US8527155B2 (en) * 2008-06-27 2013-09-03 Caterpillar Inc. Worksite avoidance system
US20090326734A1 (en) * 2008-06-27 2009-12-31 Caterpillar Inc. Worksite avoidance system
US8346468B2 (en) * 2008-07-08 2013-01-01 Sky-Trax Incorporated Method and apparatus for collision avoidance
US20110093134A1 (en) * 2008-07-08 2011-04-21 Emanuel David C Method and apparatus for collision avoidance
US20100023264A1 (en) * 2008-07-23 2010-01-28 Honeywell International Inc. Aircraft display systems and methods with obstacle warning envelopes
EP2169500A1 (en) * 2008-09-25 2010-03-31 Ford Global Technologies, LLC Method of assessing vehicle paths in a road environment and a vehicle path assessment system.
US20100076685A1 (en) * 2008-09-25 2010-03-25 Ford Global Technologies, Llc System and method for assessing vehicle paths in a road environment
US8401782B2 (en) 2008-09-25 2013-03-19 Volvo Car Corporation System and method for assessing vehicle paths in a road environment
US20120078498A1 (en) * 2009-06-02 2012-03-29 Masahiro Iwasaki Vehicular peripheral surveillance device
US8571786B2 (en) * 2009-06-02 2013-10-29 Toyota Jidosha Kabushiki Kaisha Vehicular peripheral surveillance device
US20110228980A1 (en) * 2009-10-07 2011-09-22 Panasonic Corporation Control apparatus and vehicle surrounding monitoring apparatus
US9547987B2 (en) 2009-12-23 2017-01-17 Earth Networks, Inc. Method and apparatus for conveying vehicle driving information
WO2011079195A1 (en) * 2009-12-23 2011-06-30 Aws Convergence Technologies, Inc. Method and apparatus for conveying vehicle driving information
US20110153742A1 (en) * 2009-12-23 2011-06-23 Aws Convergence Technologies, Inc. Method and Apparatus for Conveying Vehicle Driving Information
US8655951B2 (en) 2009-12-23 2014-02-18 Earth Networks, Inc. Method and apparatus for conveying vehicle driving information
US11553140B2 (en) 2010-12-01 2023-01-10 Magna Electronics Inc. Vehicular vision system with multiple cameras
US9900522B2 (en) 2010-12-01 2018-02-20 Magna Electronics Inc. System and method of establishing a multi-camera image using pixel remapping
US10868974B2 (en) 2010-12-01 2020-12-15 Magna Electronics Inc. Method for determining alignment of vehicular cameras
US10793067B2 (en) 2011-07-26 2020-10-06 Magna Electronics Inc. Imaging system for vehicle
US11285873B2 (en) 2011-07-26 2022-03-29 Magna Electronics Inc. Method for generating surround view images derived from image data captured by cameras of a vehicular surround view vision system
US20130342373A1 (en) * 2012-06-26 2013-12-26 Honeywell International Inc. Methods and systems for taxiway traffic alerting
US9255989B2 (en) 2012-07-24 2016-02-09 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking on-road vehicles with sensors of different modalities
US9738278B2 (en) 2013-01-14 2017-08-22 Robert Bosch Gmbh Creation of an obstacle map
CN104919471B (en) * 2013-01-14 2019-05-07 Robert Bosch Gmbh Creation of an obstacle map
CN104919471A (en) * 2013-01-14 2015-09-16 罗伯特·博世有限公司 Creation of an obstacle map
WO2014108233A1 (en) * 2013-01-14 2014-07-17 Robert Bosch Gmbh Creation of an obstacle map
US9045085B2 (en) * 2013-01-28 2015-06-02 Fujitsu Ten Limited Object detector
US20140214276A1 (en) * 2013-01-28 2014-07-31 Fujitsu Ten Limited Object detector
US9076336B2 (en) 2013-03-15 2015-07-07 Volkswagen Ag Personalized parking assistant
US10057489B2 (en) 2013-05-06 2018-08-21 Magna Electronics Inc. Vehicular multi-camera vision system
US9508014B2 (en) 2013-05-06 2016-11-29 Magna Electronics Inc. Vehicular multi-camera vision system
US9769381B2 (en) 2013-05-06 2017-09-19 Magna Electronics Inc. Vehicular multi-camera vision system
US11050934B2 (en) 2013-05-06 2021-06-29 Magna Electronics Inc. Method for displaying video images for a vehicular vision system
US10574885B2 (en) 2013-05-06 2020-02-25 Magna Electronics Inc. Method for displaying video images for a vehicular vision system
US11616910B2 (en) 2013-05-06 2023-03-28 Magna Electronics Inc. Vehicular vision system with video display
US20160121892A1 (en) * 2013-06-18 2016-05-05 Continental Automotive Gmbh Method and device for determining a driving state of an external motor vehicle
US10246092B2 (en) * 2013-06-18 2019-04-02 Continental Automotive Gmbh Method and device for determining a driving state of an external motor vehicle
US9308919B2 (en) 2013-10-22 2016-04-12 Honda Research Institute Europe Gmbh Composite confidence estimation for predictive driver assistant systems
US9889858B2 (en) * 2013-10-22 2018-02-13 Honda Research Institute Europe Gmbh Confidence estimation for predictive driver assistance systems based on plausibility rules
US20150112570A1 (en) * 2013-10-22 2015-04-23 Honda Research Institute Europe Gmbh Confidence estimation for predictive driver assistance systems based on plausibility rules
US20150183410A1 (en) * 2013-12-30 2015-07-02 Automotive Research & Testing Center Adaptive anti-collision method for vehicle
US9254824B2 (en) * 2013-12-30 2016-02-09 Automotive Research & Testing Center Adaptive anti-collision method for vehicle
US9898010B2 (en) * 2014-02-07 2018-02-20 Crown Equipment Corporation Systems, methods, and mobile client devices for supervising industrial vehicles
US10613549B2 (en) 2014-02-07 2020-04-07 Crown Equipment Corporation Systems and methods for supervising industrial vehicles via encoded vehicular objects shown on a mobile client device
US20170060138A1 (en) * 2014-02-07 2017-03-02 Crown Equipment Corporation Systems, methods, and mobile client devices for supervising industrial vehicles
US10386854B2 (en) * 2014-02-07 2019-08-20 Crown Equipment Corporation Systems, methods, and mobile client devices for supervising industrial vehicles
US20160084952A1 (en) * 2014-09-24 2016-03-24 Nxp B.V. Personal radar assistance
US9618611B2 (en) * 2014-09-24 2017-04-11 Nxp B.V. Personal radar assistance
US10429506B2 (en) * 2014-10-22 2019-10-01 Denso Corporation Lateral distance sensor diagnosis apparatus
WO2017105320A1 (en) * 2015-12-17 2017-06-22 Scania Cv Ab Method and system for following a trail of a vehicle along a road
WO2017105319A1 (en) * 2015-12-17 2017-06-22 Scania Cv Ab Method and system for facilitating following a leader vehicle along a road
US11277558B2 (en) 2016-02-01 2022-03-15 Magna Electronics Inc. Vehicle vision system with master-slave camera configuration
US11708025B2 (en) 2016-02-02 2023-07-25 Magna Electronics Inc. Vehicle vision system with smart camera video output
US11433809B2 (en) 2016-02-02 2022-09-06 Magna Electronics Inc. Vehicle vision system with smart camera video output
US10448555B2 (en) 2016-05-27 2019-10-22 Cnh Industrial America Llc System and method for scouting vehicle mapping
TWI685672B (en) * 2016-08-15 2020-02-21 美商高通公司 Saliency based beam-forming for object detection
US10310064B2 (en) * 2016-08-15 2019-06-04 Qualcomm Incorporated Saliency based beam-forming for object detection
US10761201B2 (en) * 2016-12-20 2020-09-01 Panasonic Intellectual Property Management Co., Ltd. Object detection device and recording medium
US20180172814A1 (en) * 2016-12-20 2018-06-21 Panasonic Intellectual Property Management Co., Ltd. Object detection device and recording medium
US20220035018A1 (en) * 2018-09-26 2022-02-03 Kyocera Corporation Electronic device, method for controlling electronic device, and electronic device control program
CN111554116A (en) * 2018-12-26 2020-08-18 歌乐株式会社 Vehicle-mounted processing device
EP4310551A3 (en) * 2019-09-20 2024-03-20 Arriver Software AB A method for reducing the amount of sensor data from a forward-looking vehicle sensor
EP3796031A1 (en) * 2019-09-20 2021-03-24 Veoneer Sweden AB A method for reducing the amount of sensor data from a forward-looking vehicle sensor
US11366214B2 (en) * 2019-12-30 2022-06-21 Woven Planet North America, Inc. Systems and methods for adaptive clutter removal from radar scans
CN115214706A (en) * 2022-06-09 2022-10-21 广东省智能网联汽车创新中心有限公司 Dangerous road early warning method and system based on V2X
CN115214706B (en) * 2022-06-09 2024-03-01 广东省智能网联汽车创新中心有限公司 Dangerous road early warning method and system based on V2X
CN115171414A (en) * 2022-06-10 2022-10-11 哈尔滨工业大学重庆研究院 CACC following traffic flow control system based on Frenet coordinate system

Also Published As

Publication number Publication date
DE69113881T2 (en) 1996-06-27
EP0464821A1 (en) 1992-01-08
DE69113881D1 (en) 1995-11-23
IT9067499A0 (en) 1990-07-05
IT9067499A1 (en) 1992-01-05
EP0464821B1 (en) 1995-10-18
IT1240974B (en) 1993-12-27
JPH05101299A (en) 1993-04-23
JP3374193B2 (en) 2003-02-04

Similar Documents

Publication Publication Date Title
US5343206A (en) Method and means for avoiding collision between a motor vehicle and obstacles
CN109927719B (en) Auxiliary driving method and system based on obstacle trajectory prediction
JP3684776B2 (en) Obstacle recognition device for vehicles
EP2074380B1 (en) A method of analysing the surroundings of a vehicle
US7317987B2 (en) Vehicle navigation, collision avoidance and control system
US6631324B2 (en) Vehicle surroundings monitoring apparatus
EP1304264B1 (en) A 360 degree vision system for a vehicle
US7230640B2 (en) Three-dimensional perception of environment
CN112639849A (en) Route selection method and route selection device
JP3153839B2 (en) Preventive safety devices for vehicles
EP1273930B1 (en) A method for collision avoidance and collision mitigation
CN105717514A (en) Road surface reflectivity detection by lidar sensor
EP2012211A1 (en) A system for monitoring the surroundings of a vehicle
JP2000090243A (en) Periphery monitoring device and method therefor
CN112485784B (en) Method and device for determining risk coefficient of target in inner wheel difference area, electronic equipment and storage medium
Shimomura et al. An algorithm for distinguishing the types of objects on the road using laser radar and vision
JP3723835B2 (en) Obstacle detection method on the road
CN115019556A (en) Vehicle collision early warning method and system, electronic device and readable storage medium
JPH03111785A (en) Preceding-vehicle recognizing apparatus
JPH05273341A (en) Obstruction detecting device
Lasky et al. The advanced snowplow driver assistance system
JPH11110699A (en) Device and method for recognizing preceding vehicle
JPH05274035A (en) Obstacle detector
Lu et al. Quantitative testing of a frontal collision warning system for transit buses
JP3487016B2 (en) Obstacle detection device

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12