EP1436640A2 - Object detecting device - Google Patents

Object detecting device

Info

Publication number
EP1436640A2
EP02774378A
Authority
EP
European Patent Office
Prior art keywords
detection device
error
sensor
data
object detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP02774378A
Other languages
German (de)
French (fr)
Inventor
Goetz Braeuchle
Martin Heinebrodt
Juergen Boecker
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP1436640A2

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • B60W30/16Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K31/00Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
    • B60K31/0008Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87Combinations of radar systems, e.g. primary radar and secondary radar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/20Data confidence level
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9315Monitoring blind spots
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9321Velocity regulation, e.g. cruise control
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9325Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles for inter-vehicle distance regulation, e.g. navigating in platoons
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93271Sensor installation details in the front of the vehicles

Definitions

  • The invention relates to an object detection device for driver assistance systems in motor vehicles, with at least two sensor systems that measure data about the location and/or state of motion of objects in the vicinity of the vehicle and whose detection areas overlap one another.
  • An example of such an assistance system is a so-called ACC system (Adaptive Cruise Control), which automatically regulates the speed of the vehicle to a desired speed selected by the driver or, if a vehicle in front is present, adjusts the speed so that a suitable distance to the vehicle in front, monitored with the aid of a distance sensor, is maintained.
  • ACC system Adaptive Cruise Control
  • Other examples of driver assistance systems are collision warning devices and automatic lane guidance systems (LKS; Lane Keeping System), which recognize lane markings and keep the vehicle in its lane.
  • LKS Lane Keeping System
  • The data recorded by these devices can then be evaluated and interpreted appropriately.
  • these devices must be able to detect objects in the vicinity of the vehicle, for example other vehicles and other obstacles, and to record data which identify the location and, if appropriate, the state of motion of these objects.
  • the sensor systems and the associated evaluation units should therefore be referred to collectively as an object detection device.
  • Examples of suitable sensor systems are radar systems, their light-optical counterparts, so-called lidar systems, and stereo camera systems.
  • With radar systems, the distance of the object along the line of sight can be measured by evaluating the transit time of the radar echo. By evaluating the Doppler shift of the radar echo, the relative speed of the object along the line of sight can also be measured directly.
  • With a direction-sensitive radar system, for example a multi-beam radar, directional data on the objects can also be recorded, for example the azimuth angle relative to a reference axis defined by the adjustment of the radar sensor. With stereo camera systems, directional data and, by evaluating the parallax, distance data can also be obtained. By evaluating the raw data measured directly by these sensor systems, data can be calculated which indicate the distance of the object in the direction of travel and the transverse offset of the object relative to the center of the lane or to the current straight-ahead direction of the vehicle.
  • It is also known to subject the measured raw data to a plausibility analysis in order to decide, or at least indicate probabilities, as to whether the detected object is a relevant obstacle or an irrelevant object, for example a traffic sign at the edge of the road.
  • the implausibility of the recorded data can also be an indication of a defect in the sensor system.
  • The object of the invention is therefore achieved by an error detection device which checks the data measured by the sensor systems for freedom from contradictions and outputs an error signal when a contradiction is detected.
  • the invention is based on the consideration that, in the case of several sensor systems with overlapping detection areas, it repeatedly occurs that objects are located in the overlapping area.
  • The sensor systems, which operate independently of one another, provide redundant information which enables error detection during the running operation of the device. If the sensor systems involved work correctly, the information they provide must be compatible with one another within certain error limits. If the data contradict each other, it can be concluded that at least one of the sensor systems involved is defective, and an error signal is output.
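  • The consistency check described above can be sketched in code. The following is a hypothetical illustration (function and sensor names are invented for this sketch, not taken from the patent): readings of the same object from independent sensor systems must agree within their combined error limits, and any conflicting pair indicates a defect.

```python
def check_consistency(measurements, tolerances):
    """Pairwise consistency check of redundant sensor readings.

    measurements: dict sensor_name -> measured distance in metres
    tolerances:   dict sensor_name -> error limit in metres
    Returns the list of conflicting sensor pairs; an empty list means
    the data are free of contradictions.
    """
    names = list(measurements)
    conflicts = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            # Two readings conflict if they differ by more than the
            # combined error limits of the two sensors involved.
            if abs(measurements[a] - measurements[b]) > tolerances[a] + tolerances[b]:
                conflicts.append((a, b))
    return conflicts
```

If the returned list is non-empty, at least one of the systems involved must be defective, and an error signal would be output.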
  • In the simplest case, this error signal can be used to alert the driver to the malfunction through a visual or audible indication and possibly to trigger an automatic shutdown of the assistance system. According to a development of the invention, however, automatic error correction can also be carried out with the aid of this error signal.
  • the invention thus enables continuous self-diagnosis of the object detection device during normal driving operation and thus a substantial improvement in traffic safety of the assistance system that uses the data of the object detection device.
  • The object detection device has, in addition to a sensor system for the long range, which is formed, for example, by a 77 GHz radar system or a lidar system, a sensor system for the short range, which has a shorter range but covers a larger angular range, so that blind spots at close range are largely avoided.
  • the sensor system for the close range can also be formed by a radar system or by a lidar system or also by a video sensor system, for example a stereo camera system with two electronic cameras.
  • If objects are located in the common overlap area, there is, in addition to error detection, a simple possibility of identifying the faulty sensor system by "majority decision" and, if necessary, correcting the data, the adjustment or the calibration of the faulty system.
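  • With three or more systems seeing the same object, such a "majority decision" can be sketched as follows (a minimal illustration with invented names, assuming a single shared agreement tolerance): the sensor that disagrees with all others, while the others agree among themselves, is flagged as faulty.

```python
def majority_outlier(measurements, tol):
    """Identify a faulty sensor by majority decision.

    measurements: dict sensor_name -> reading (at least three sensors)
    tol: agreement tolerance; returns the outlier's name, or None if
    there is no clear majority verdict.
    """
    names = list(measurements)
    for candidate in names:
        others = [n for n in names if n != candidate]
        # The remaining sensors must agree among themselves ...
        rest_agree = all(abs(measurements[a] - measurements[b]) <= tol
                         for i, a in enumerate(others) for b in others[i + 1:])
        # ... while the candidate disagrees with every one of them.
        candidate_off = all(abs(measurements[candidate] - measurements[o]) > tol
                            for o in others)
        if rest_agree and candidate_off:
            return candidate
    return None
```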
  • A video system allows a relatively precise measurement of the transverse offset of a vehicle in front, while the transverse offset measurement with a radar or lidar system depends critically on the adjustment of the radar or lidar sensor. In this case, a discrepancy therefore indicates a defect in the radar or lidar system.
  • The area in which the detection areas of the sensor systems overlap will preferably be chosen such that the sensor systems for the short and long range overlap in the distance range which corresponds to the typical safety distance from a vehicle traveling in front.
  • Automatic error correction, or an improvement in measuring accuracy, can also be achieved by weighting the data supplied by the various sensor systems according to their respective reliability and then combining them into a final result.
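  • Such a reliability-weighted combination can be illustrated as follows (a sketch; the weights are assumed to express each system's reliability and need not be normalised):

```python
def fuse(values, weights):
    """Combine sensor readings into a final result by weighting each
    value with the reliability of the sensor that produced it."""
    total = sum(weights[k] for k in values)
    return sum(values[k] * weights[k] for k in values) / total
```

For example, a radar distance trusted three times as much as a video distance pulls the combined value correspondingly closer to the radar reading.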
  • It is expedient to store the error signals supplied by the error detection device together with the associated contradicting measurement data and thus to generate error statistics which facilitate diagnosis during repair or maintenance of the object detection device.
  • Fig. 1 is a schematic representation of the detection areas of several sensor systems mounted on a motor vehicle;
  • FIG. 2 is a block diagram of an object detection device according to the invention.
  • FIG. 3 shows a sketch to explain the consequences of error adjustments of different sensor systems.
  • FIG. 1 schematically shows a top view of the front part of a motor vehicle 10 which is equipped with three sensor systems working independently of one another, namely a long-range radar 12, a short-range radar 14 and a video system 16.
  • The long-range radar 12 has a detection range 18 with a range of, for example, 150 m and a detection angle of 15°, while the short-range radar 14 has a detection range 20 with a range of, for example, 50 m and a considerably larger detection angle.
  • The detection area of the video system 16 is not shown in the drawing, but includes the overlap area 22 (in good visibility conditions). An object 24, which is located in this overlap region 22, can therefore be detected by all three sensor systems.
  • The detection area 18 of the long-range radar 12 is symmetrical with respect to a reference axis 18A, which - with correct adjustment of the radar sensor - runs parallel to a main axis H extending in the longitudinal direction through the center of the vehicle 10. Correspondingly, the detection area 20 of the short-range radar 14 is symmetrical to a reference axis 20A, which is also parallel to the main axis H and to the reference axis 18A.
  • The long-range radar 12 measures the distance d1 to the object 24, the relative speed of the object 24 relative to the vehicle 10, and the azimuth angle j1 of the object 24 relative to the reference axis 18A.
  • Correspondingly, the short-range radar 14 measures the distance d2 from the object 24, the relative speed of the object 24, and the azimuth angle j2 relative to the reference axis 20A.
  • The images of the object 24 recorded by the cameras 16L, 16R are electronically evaluated.
  • The evaluation software of such stereo camera systems, known as such, is able to identify the object 24 in the images recorded by both cameras and, on the basis of the parallactic shift, to determine the position of the object 24 in a two-dimensional coordinate system (parallel to the road surface).
  • In this way, the video system 16 provides the perpendicular distance d3 of the object 24 from the vehicle 10 (i.e., from the baseline of the cameras 16L, 16R) and the transverse offset y3 of the object 24 with respect to the main axis H.
  • the location coordinates of the object 24 can thus be determined with the aid of the three sensor systems 12, 14, 16 in three mutually independent ways.
  • In the case of the radar systems, the location data measured in polar coordinates can be converted into the distance and transverse offset by a simple coordinate transformation.
  • The relative speed of the object 24 can also be determined by temporal derivation of the distance d3 measured with the video system 16. Since the visual rays from the radar sensors to the object 24, along which the relative speeds are measured with the aid of the Doppler effect, are not exactly parallel to the main axis H, small systematic deviations result which must be taken into account in the comparison.
  • Figure 2 shows a block diagram of an object detection device which comprises the long-range radar 12, the short-range radar 14, the video system 16, associated evaluation units 26, 28, 30, and also an error detection device 32 and a correction device 34.
  • The evaluation units 26, 28, 30, the error detection device 32 and the correction device 34 can be formed by electronic circuits, by microcomputers or also by software modules in a single microcomputer.
  • The evaluation unit 26 determines, from the raw data supplied by the long-range radar 12, the distances d1i, the relative speeds v1i and the azimuth angles j1i of all objects that are located in the detection area 18 of the long-range radar 12.
  • The index i is used here to distinguish the various detected objects.
  • The evaluation unit 26 also calculates the transverse offsets y1i of the various objects from the distance data and azimuth angles.
  • Correspondingly, the evaluation unit 28 determines the distances d2i, the relative speeds v2i, the azimuth angles j2i and the transverse offsets y2i of all objects that are located in the detection area 20 of the short-range radar 14.
  • The evaluation unit 30 first determines the azimuth angles jLi and jRi of the objects detected by the cameras 16L, 16R. These azimuth angles are defined analogously to the azimuth angles j1 and j2 in FIG. 1; that is, they indicate the angle between the respective line of sight to the object and a straight line parallel to the main axis H. From the azimuth angles jLi and jRi, on the basis of the known distance between the cameras 16L and 16R, the distances d3i and the transverse offsets y3i and - by temporal derivation of the distance data - the relative speeds v3i are calculated.
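  • The stereo geometry described here can be sketched as follows (an illustrative reconstruction, not the patent's own formulas; the sign convention is an assumption of this sketch): with both azimuth angles measured against lines parallel to the main axis H and the cameras mounted at ± baseline/2, the parallax yields the distance and the mean angle the transverse offset.

```python
import math

def triangulate(phi_left, phi_right, baseline):
    """Distance d3 and transverse offset y3 from the two azimuth angles
    (in radians) and the known camera baseline.

    Assumed convention: angles increase towards the left; the left
    camera sits at -baseline/2, the right camera at +baseline/2.
    """
    t_left, t_right = math.tan(phi_left), math.tan(phi_right)
    d3 = baseline / (t_left - t_right)   # parallactic shift -> distance
    y3 = d3 * (t_left + t_right) / 2.0   # mean direction -> offset
    return d3, y3
```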
  • The distance data d1i, d2i and d3i are fed to a distance module 36 of the error detection device 32.
  • The relative speed data v1i, v2i and v3i are fed to a speed module 38, and the transverse offset data y1i, y2i and y3i to a transverse offset module 40.
  • An angle module 42 of the error detection device 32 evaluates the azimuth angles j1i, j2i, jLi and jRi.
  • The various modules of the error detection device 32 are connected to one another and have access to all of the data that are fed to the error detection device 32 from any of the evaluation units.
  • the data connections shown in the drawing only relate to the data whose processing is in the foreground in the module concerned.
  • When the evaluation unit 26 reports the distance d1i of a detected object (with the index i) to the distance module 36, the distance module 36 first checks, using the associated transverse offset y1i, whether the object in question is also located in the detection area 20 of the short-range radar 14 and/or in the detection area of the video system 16. If this is the case, the distance module checks whether data for this object are also available from the evaluation units 28, 30. The identification of the objects is facilitated in that the distance module 36 can track the change in the distance data over time.
  • At the expected time, the evaluation unit 28 then reports the occurrence of a new object, which can be identified with the tracked object.
  • The criterion can also be used that the location coordinates transmitted by the various evaluation units for the same object must at least roughly match. If distance data for the same object are available from several sensor systems, the distance module 36 checks whether these distance data match within the respective error limits. It should be borne in mind that these error limits are variable.
  • For example, the cross offset data y1i are relatively imprecise at a large object distance, because the long-range radar 12 has only a limited angular resolution and even slight deviations in the measured azimuth angle lead to large deviations in the calculated transverse offset.
  • If the distance data are consistent, the matching value di is transmitted to a downstream assistance system 44, for example an ACC system.
  • The output value di can be a weighted average of the distance data d1i, d2i and d3i, the weights being greater the more reliable the data of the sensor system in question are.
  • The distance data d2i and d3i are evaluated by the distance module 36 in a manner corresponding to the data d1i from the evaluation unit 26. If, for example, an object is initially only detected by the short-range radar 14 and then enters the detection area of the long-range radar 12, the distance module 36 first tracks the change in the data arriving from the evaluation unit 28 and then checks whether corresponding data also arrive from the evaluation unit 26 at the expected time.
  • If the expected data from one of the evaluation units 26, 28, 30 fail to appear - that is, if a sensor system does not detect an object even though, based on the data of the other systems, this object would have to be in its detection range - the distance module 36 outputs an error signal Fdj. The index j identifies the sensor system from which no data were obtained. The error signal Fdj thus indicates that the sensor system in question may have failed or is "blind".
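  • This failure test can be sketched as follows (system identifiers are invented for this illustration): every system that, according to the other systems' data, should see the object but reports nothing gets an error signal naming it.

```python
def missed_detection_errors(expected_systems, reporting_systems):
    """Emit an error signal Fd_j for every sensor system j that should
    detect the object (per the other systems' data) but stays silent."""
    reporting = set(reporting_systems)
    return ["Fd_" + j for j in expected_systems if j not in reporting]
```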
  • If the distance data supplied by the various sensor systems deviate from one another beyond the error limits, the error signal Fdj is also output.
  • In this case, the error signal Fdj also indicates from which sensor systems the deviating data were obtained and how large the deviation is.
  • If the data of the remaining sensor systems are sufficiently consistent, they can still be used to form the distance value di and output it to the assistance system 44, even though an error was detected and the error signal Fdj was generated.
  • The mode of operation of the speed module 38 is analogous to that of the distance module 36 described above, except that here it is not the distance data but the speed data v1i, v2i and v3i that are compared with one another, in order to form a speed value vi, which is output to the assistance system 44, and/or to output an error signal Fvj, which indicates a discrepancy between the measured relative speeds.
  • The mode of operation of the transverse offset module 40 is largely the same as that of the distance module 36 and the speed module 38.
  • The azimuth angles j1i, j2i, jLi and jRi are compared separately in the angle module 42.
  • In this comparison, the deviations must of course be taken into account which, for a given object, inevitably result from the object distance and the different positions of the relevant sensors or cameras on the baseline. If, taking these deviations into account, there remains a discrepancy that exceeds the error limits, an error signal Fjk is output.
  • In the example shown, the error signals Fdj, Fvj and Fjk are fed to the correction device 34. If a correctable systematic error is detected, a correction signal K is output to the associated evaluation unit 26, 28 or 30.
  • Alternatively, the correction signal can also be output directly to the long-range radar 12, the short-range radar 14 or the video system 16.
  • An example of a systematic error that can be corrected by recalibration is a misalignment of a radar sensor or a camera, which leads to a pivoting of the relevant reference axis, for example 18A or 20A, and thus to an incorrect measurement of the azimuth angle.
  • the calibration in the relevant evaluation unit can be changed so that the misalignment is corrected and the correct azimuth angle is obtained again.
  • The misadjustment should nevertheless be remedied during the next repair, since it also leads to an undesired displacement of the detection area.
  • The correction device 34 has a statistics module 46 which stores the error signals Fdj, Fvj and Fjk which occurred during the operation of the device and thus documents the type and size of all errors which have occurred. This data is then available for repair or maintenance.
  • The statistics module 46 also has the function of deciding whether an error can be corrected automatically or whether there is an error that cannot be remedied, in which case an optical or acoustic error message F must be output to the driver.
  • In any case, the error message F will be output if the signals received from the error detection device 32 indicate a total failure of one of the sensor systems.
  • The functions of the statistics module 46 offer the possibility of not outputting the error message F immediately in the event of a discrepancy that occurs only once or sporadically, but instead outputting the error message only if discrepancies of the same type occur with a certain frequency. In this way, the robustness of the device is increased considerably.
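  • This suppression of sporadic discrepancies can be sketched as follows (the class name and threshold value are assumptions of this illustration): every error signal is logged, but the driver warning F is only raised once errors of the same type have recurred a certain number of times.

```python
from collections import Counter

class ErrorStatistics:
    """Minimal sketch of the statistics module 46: log every error,
    warn the driver only on repeated errors of the same type."""

    def __init__(self, threshold=3):
        self.counts = Counter()      # error statistics by signal type
        self.threshold = threshold   # assumed repetition threshold

    def record(self, error_signal):
        """Log an error signal; return True when the driver warning F
        should be output for this error type."""
        self.counts[error_signal] += 1
        return self.counts[error_signal] >= self.threshold
```

The stored counts double as the error statistics consulted during repair or maintenance.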
  • FIG. 3 shows examples of how a misadjustment of a sensor affects the measurement result.
  • If, for example, the radar sensor is misadjusted by a small angle Dj, the measured azimuth angle j1 is too large by this angle, and the long-range radar 12 does not "see" the object 24 in its actual position but in the position 24′ shown in broken lines. This results in an error Dy1 for the measured transverse offset. This error is greater the further the object 24 is away from the vehicle 10.
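  • The growth of the transverse offset error with distance follows directly from the geometry (a worked illustration, not a formula from the patent): pivoting the reference axis by a small angle Dj displaces the apparent object position sideways by roughly the distance times that angle.

```python
import math

def transverse_offset_error(distance, misalignment_rad):
    """Error Dy in the measured transverse offset caused by a reference
    axis pivoted by the angle Dj; for small angles Dy ~ d * Dj."""
    return distance * math.tan(misalignment_rad)
```

At 150 m, for instance, a misalignment of only 0.5° already shifts the apparent position sideways by more than a metre, while the same misalignment at 50 m causes only a third of that error.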
  • If the left camera 16L of the video system has a misalignment of the same size, the associated azimuth angle jL is falsified by the same angular deviation Dj, as indicated in FIG. 3 by a line of sight S drawn in broken lines.
  • The video system 16 then sees the object 24 at the intersection of the visual beams of the two cameras, that is to say in the position 24′′. It can be seen that in this case the error Dy3 measured for the transverse offset is significantly smaller.
  • On the other hand, the misadjustment of the camera 16L leads to a considerable error Dd3 in the measurement of the distance.
  • If, therefore, the transverse offset data show a large discrepancy while the distance data are largely consistent, the error is due to a misalignment of the radar system and not to a misadjustment of a camera.
  • In this case, the misalignment can even be determined quantitatively from the measured size of the error, and the error can be corrected by recalibrating the radar sensor or the associated evaluation unit 26. If, on the other hand, there is a misadjustment of the camera 16L, this can be recognized from a large discrepancy Dd3 in the distance data while the cross offset data remain largely consistent. In this case, the error can be corrected by recalibrating the video system.
  • For an object with a given transverse offset, the azimuth angle j1 is approximately inversely proportional to the object distance.
  • Accordingly, the rate of change of the azimuth angle j1 depends on the relative speed of the object, which can be measured directly with the radar system.
  • If, however, the apparent transverse offset is caused only by a misadjustment of the sensor, the measured (apparent) azimuth angle is independent of the distance and the relative speed. Accordingly, there is then a discrepancy between the measured rate of change of the azimuth angle and the rate theoretically predicted on the basis of the relative speed, even when the object actually has a transverse offset. This discrepancy suggests a misadjustment of the sensor system.
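  • This plausibility check can be made concrete (an illustrative derivation under small-angle assumptions, not the patent's own formula): for a genuinely offset object, j1 ≈ y/d, so the angle must change at the rate -y·v_rel/d² as the distance changes, whereas an apparent angle that stays constant despite a nonzero relative speed points to a misadjusted sensor.

```python
def predicted_azimuth_rate(y, d, v_rel):
    """Theoretical rate of change of the azimuth angle (rad/s) for an
    object with true transverse offset y at distance d, given the
    relative speed v_rel = dd/dt (negative while closing in):
    phi ~ y/d  =>  dphi/dt = -y * v_rel / d**2
    """
    return -y * v_rel / d ** 2
```

A constant apparent azimuth angle (measured rate near zero) despite a nonzero predicted rate would thus expose the misadjustment.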
  • If the type of object, for example a car or a truck, can be recognized with the aid of the video system, the typical actual size of such objects is at least approximately known, and this information can also be used for automatic error detection and error correction.
  • From the recognized lane markings, the transverse offset of one's own vehicle relative to the center of the lane can be determined. It is to be expected that this transverse offset has the value 0 on a statistical average. If the evaluation in the statistics module 46 shows that the measured transverse offset deviates systematically from 0, this indicates a misadjustment of the sensor system concerned.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention concerns a device for detecting objects designed for a driver assistance system in motor vehicles. The object detecting device comprises at least two sensor systems (12; 14; 16L, 16R) which measure data (d1, j1; d2, j2; d3, y3) concerning the location or the state of motion of the objects (24) in the vicinity of the vehicle (10) and whose detection zones (18, 20) mutually overlap. The object detecting device is characterized in that it comprises an error detection device which checks the coherence of the data measured by the sensor systems (12; 14; 16L, 16R) and emits an error signal when it identifies an incoherence.

Description

Object detection device
The invention relates to an object detection device for driver assistance systems in motor vehicles, having at least two sensor systems which measure data on the location and/or state of motion of objects in the vicinity of the vehicle and whose detection ranges overlap one another.
Motor vehicles are increasingly being equipped with driver assistance systems that support the driver in guiding the vehicle and reduce the driver's workload. One example of such an assistance system is a so-called ACC (Adaptive Cruise Control) system, which automatically regulates the speed of the vehicle to a desired speed selected by the driver or, if there is a vehicle ahead, adapts the speed so that a suitable distance to the vehicle ahead, monitored by means of a distance sensor, is maintained. Other examples of driver assistance systems are collision warning devices, automatic lane keeping systems (LKS), which detect lane markings and automatically keep the vehicle in the middle of the lane by intervening in the steering, sensor-based parking aids, and the like.

All of these assistance systems require a sensor system with which information about the surroundings of the vehicle can be acquired, as well as evaluation units with which this information can be suitably evaluated and interpreted. In particular, these devices must be able to detect objects in the vicinity of the vehicle, for example other vehicles and other obstacles, and to acquire data characterizing the location and, where applicable, the state of motion of these objects. The sensor systems and the associated evaluation units will therefore be referred to collectively as an object detection device.
Examples of sensor systems used in such object detection devices are radar systems and their optical counterparts, so-called lidar systems, as well as stereo camera systems. With radar systems, the distance of the object along the line of sight can be measured by evaluating the transit time of the radar echo. By evaluating the Doppler shift of the radar echo, the relative speed of the object along the line of sight can also be measured directly. With a direction-sensitive radar system, for example a multi-beam radar, it is also possible to acquire direction data on the objects, for example the azimuth angle relative to a reference axis defined by the alignment of the radar sensor. With stereo camera systems, direction data and, by evaluating the parallax, also distance data can be obtained. By evaluating the raw data measured directly by these sensor systems, data can be computed that indicate the distance of the object in the direction of travel as well as the lateral offset of the object relative to the center of the lane or to the instantaneous straight-ahead direction of the vehicle.
Since all of these known sensor systems have their particular strengths and weaknesses in acquiring the required measurement data, it is expedient to use several sensor systems that complement one another.
In ACC systems, it is known in principle to subject the measured raw data to a plausibility evaluation in order to decide, or at least to indicate probabilities, whether a detected object is a relevant obstacle or an irrelevant object, for example a traffic sign at the edge of the road. Under certain circumstances, implausibility of the acquired data can also be an indication of a defect in the sensor system.
In general, however, it is not possible with the known object detection devices to reliably detect misalignments or other defects of the sensor systems that impair the functionality of the assistance system.
The object of the invention is therefore to provide an object detection device with which defects of the sensor systems can be detected more precisely and more reliably during operation, thus improving the functional reliability of the assistance system.
According to the invention, this object is achieved by an error detection device that checks the data measured by the sensor systems for mutual consistency and outputs an error signal when a contradiction is detected.
The invention is based on the consideration that, with several sensor systems having overlapping detection ranges, it will repeatedly happen that objects are located in the overlap region. In this case, the sensor systems, which operate independently of one another, supply redundant information that enables error detection during ongoing operation of the device. If the sensor systems involved are working correctly, the information they supply must be compatible with one another within certain error limits. If this is not the case, i.e. if the data contradict one another, it can be concluded that at least one of the sensor systems involved is defective, and an error signal is output. In the simplest case, this error signal can be used to alert the driver to the malfunction by means of a visual or acoustic indication and, if appropriate, to trigger a self-shutdown of the assistance system.
According to a development of the invention, however, an automatic error correction can also be carried out with the aid of this error signal.
The invention thus enables continuous self-diagnosis of the object detection device during normal driving operation and hence a substantial improvement in the operational safety of the assistance system that uses the data of the object detection device.
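As an illustration, the core self-diagnosis idea — comparing redundant measurements within combined error limits and raising an error signal on contradiction — can be sketched as follows. All function names and tolerance values are illustrative assumptions, not taken from the patent:

```python
def consistent(a, b, tol_a, tol_b):
    """Two independent measurements of the same quantity agree if their
    difference lies within the combined error limits of the two sensors."""
    return abs(a - b) <= tol_a + tol_b

def check_object(d_radar, d_video, tol_radar=0.5, tol_video=3.0):
    """Return True (error signal) when the radar and video distance
    measurements for one object contradict each other.  The tolerances
    are hypothetical per-sensor error limits in metres."""
    return not consistent(d_radar, d_video, tol_radar, tol_video)
```

In the simplest use, a `True` result would trigger the driver warning or the self-shutdown described above.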
Advantageous refinements of the invention emerge from the dependent claims.
In a preferred embodiment, the object detection device has, in addition to a long-range sensor system, formed for example by a 77 GHz radar system or a lidar system, a short-range sensor system that has a shorter range but covers a larger angular range, so that blind spots at close range are largely avoided. The short-range sensor system can likewise be formed by a radar system or a lidar system, or else by a video sensor system, for example a stereo camera system with two electronic cameras.
In a modified embodiment, there can be three mutually independent sensor systems whose detection ranges have a common overlap region. In this case, for objects located in the common overlap region, there is, beyond mere error detection, a simple possibility of identifying the faulty sensor system by a "majority decision" and, if necessary, of correcting the data, the alignment, or the calibration of the faulty system.
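A "majority decision" among three overlapping sensors might look like the following sketch. The sensor names, the tolerance, and the algorithm itself are assumptions; the patent does not prescribe a specific procedure:

```python
def majority_fault(readings, tol=1.0):
    """Identify the faulty sensor among three independent readings of the
    same quantity: the sensor whose reading disagrees with both others,
    while those two agree with each other, is the suspect.
    Returns the sensor name, or None if no unique outlier exists."""
    names = list(readings)
    for name in names:
        others = [readings[n] for n in names if n != name]
        # The two remaining sensors must agree with each other ...
        if abs(others[0] - others[1]) <= tol and all(
                abs(readings[name] - o) > tol for o in others):
            # ... while the candidate disagrees with both.
            return name
    return None
```

For example, with distance readings `{"lrr": 50.0, "srr": 50.3, "video": 57.9}` the video system would be flagged, and its data, alignment, or calibration could then be corrected.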
Even in embodiments with only two sensor systems, automatic identification of the faulty system and automatic error correction are possible under certain circumstances, in particular by a plausibility evaluation that takes into account the particular characteristics of the physical measuring principles employed in the sensor systems involved. For example, a relatively precise distance measurement is possible with radar and lidar systems, whereas distance measurement by means of a stereo camera system is subject to larger error tolerances, particularly at greater distances, and depends critically on the camera alignment. In the event of a discrepancy, there is therefore a high probability of an error in the stereo camera system. Conversely, a video system permits a relatively precise measurement of the lateral offset of a vehicle ahead, whereas lateral offset measurement by means of a radar or lidar system depends critically on the alignment of the radar or lidar sensor. In this case, a discrepancy therefore tends to indicate a defect in the radar or lidar system.
In practice, the region in which the detection ranges of the sensor systems overlap will be a region of particular relevance for the assistance system. For example, in an ACC system, the short-range and long-range sensor systems will preferably be designed so that they overlap in the distance range corresponding to the typical safety distance to a vehicle ahead. In this case, automatic error correction, or an improvement in measuring accuracy, can also be achieved by weighting the data supplied by the various sensor systems according to their respective reliability and then combining them into a final result.
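One common way to realize such a reliability weighting is inverse-variance weighting. The following sketch assumes per-sensor standard deviations, which the patent does not specify:

```python
def fuse(values, sigmas):
    """Combine redundant measurements of the same quantity into one
    result, weighting each sensor by its reliability.  Here the weight
    is the inverse variance (1/sigma^2), a common but assumed choice."""
    weights = [1.0 / (s * s) for s in sigmas]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)
```

With `fuse([50.0, 50.4, 53.0], [0.5, 0.7, 3.0])` the result stays close to the two precise radar distances, while the less reliable video distance contributes only weakly.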
According to a development of the invention, it is expedient to store the error signals supplied by the error detection device together with the associated, mutually contradicting measurement data and thus to compile error statistics that facilitate diagnosis during repair or maintenance of the object detection device.
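Such an error store might look like the following minimal sketch; the class structure and field names are assumptions made for illustration:

```python
import time

class ErrorLog:
    """Store each error signal together with the contradicting
    measurement data, so that error statistics can be read out
    during repair or maintenance."""

    def __init__(self):
        self.entries = []

    def record(self, sensor_pair, data):
        """Log one contradiction between two sensors with a timestamp."""
        self.entries.append(
            {"t": time.time(), "sensors": sensor_pair, "data": data})

    def count_for(self, sensor):
        """How often a given sensor was involved in a contradiction."""
        return sum(sensor in e["sensors"] for e in self.entries)
```

A sensor that appears in many logged contradictions is the natural first candidate for inspection in the workshop.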
Exemplary embodiments of the invention are explained in more detail below with reference to the drawing.
In the drawing:
Fig. 1 shows a schematic representation of the detection ranges of several sensor systems installed on a motor vehicle;
Fig. 2 shows a block diagram of an object detection device according to the invention; and
Fig. 3 shows a sketch explaining the consequences of misalignments of various sensor systems.
Figure 1 schematically shows, in plan view, the front section of a motor vehicle 10 that is equipped with three sensor systems operating independently of one another, namely a long-range radar 12, a short-range radar 14, and a video system formed by two cameras 16L and 16R. The long-range radar 12 has a detection range 18 with a range of, for example, 150 m and a detection angle of 15°, while the short-range radar 14 has a detection range 20 with a range of, for example, 50 m and a detection angle of 40°. Between these detection ranges 18, 20, which are not drawn to scale, there is an overlap region 22. The detection range of the video system formed by the cameras 16L, 16R, referred to collectively by reference numeral 16, is not shown in the drawing but (under good visibility conditions) includes the overlap region 22. An object 24 located in this overlap region 22 can therefore be detected by all three sensor systems.
The detection range 18 of the long-range radar 12 is symmetrical about a reference axis 18A which, when the radar sensor is correctly aligned, runs parallel to a main axis H extending in the longitudinal direction through the center of the vehicle 10. Correspondingly, the detection range 20 of the short-range radar 14 is symmetrical about a reference axis 20A, which is likewise parallel to the main axis H and to the reference axis 18A.
The long-range radar 12 measures the distance d1 to the object 24, the relative speed of the object 24 with respect to the vehicle 10, and the azimuth angle j1 of the object 24 relative to the reference axis 18A. Correspondingly, the short-range radar 14 measures the distance d2 to the object 24, the relative speed of the object 24 along the line of sight from the radar sensor to the object, and the azimuth angle j2 of the object 24 relative to the reference axis 20A.
In the video system 16, the images of the object 24 recorded by the cameras 16L, 16R are evaluated electronically. The evaluation software of such stereo camera systems, known as such, is able to identify the object 24 in the images recorded by both cameras and, on the basis of the parallactic shift, to determine the position of the object 24 in a two-dimensional coordinate system (parallel to the road plane). In this way, the video system 16 supplies the perpendicular distance d3 of the object 24 from the vehicle 10 (i.e., from the baseline of the cameras 16L, 16R) as well as the lateral offset y3 of the object 24 with respect to the main axis H.
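The stereo geometry can be illustrated as follows, assuming both azimuth angles are measured against lines parallel to the main axis H and assuming a hypothetical camera baseline of 0.3 m (the patent gives no numeric values):

```python
from math import tan, radians

def stereo_position(jl_deg, jr_deg, baseline=0.3):
    """Recover the perpendicular distance d3 and the lateral offset y3
    of an object from the azimuth angles of the left and right cameras.
    The left camera is assumed to sit at +baseline/2 from the axis H."""
    tl = tan(radians(jl_deg))
    tr = tan(radians(jr_deg))
    d = baseline / (tr - tl)      # parallactic shift yields the distance
    y = d * tl + baseline / 2.0   # offset of the line of sight plus camera position
    return d, y
```

The smaller the parallax `tr - tl`, the larger the distance, which is also why the distance error of a stereo system grows quickly at long range.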
The location coordinates of the object 24 can thus be determined in three mutually independent ways with the aid of the three sensor systems 12, 14, 16. The polar coordinates measured by the radar systems can be converted by a simple coordinate transformation into Cartesian coordinates, as formed in the example shown by the coordinate pair (d3, y3). The three independently measured coordinate sets can now be compared with one another, and if these coordinates contradict one another, this indicates that one of the three sensor systems is malfunctioning. The faulty system can also be identified on the basis of the deviating coordinate set.
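The coordinate transformation mentioned above, from a radar's polar measurement (distance along the line of sight, azimuth angle) to a Cartesian pair comparable to (d3, y3), is a simple sketch; the sign convention for the lateral offset is an assumption:

```python
from math import sin, cos, radians

def polar_to_cartesian(d, azimuth_deg):
    """Convert a radar measurement (distance d along the line of sight,
    azimuth angle relative to the sensor's reference axis) into the
    longitudinal distance and lateral offset used for the comparison."""
    phi = radians(azimuth_deg)
    return d * cos(phi), d * sin(phi)
```

After this transformation, the three coordinate sets from radar, short-range radar, and video system can be compared directly.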
The relative speed of the object 24 can also be determined by differentiating the distance d3 measured with the video system 16 with respect to time. Since the lines of sight from the radar sensors to the object 24, along which the relative speeds are measured with the aid of the Doppler effect, are not exactly parallel to the main axis H, the three measured relative speeds will deviate slightly from one another. At the distance relationships occurring in practice, however, this deviation is generally negligible. If necessary, it can be corrected by conversion to Cartesian coordinates, so that the measured speed data can also be compared and reconciled with one another.
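The correction of the Doppler-measured line-of-sight speed can be sketched as follows, under the simplifying (and assumed) condition that the relative motion is parallel to the main axis H:

```python
from math import cos, radians

def longitudinal_speed(v_radial, azimuth_deg):
    """Convert the relative speed measured along the line of sight
    (Doppler) into the longitudinal relative speed, assuming the
    relative motion is parallel to the main axis H.  For the small
    azimuth angles occurring in practice the correction is tiny."""
    return v_radial / cos(radians(azimuth_deg))
```

At an azimuth of 8°, for example, the correction changes a 10 m/s reading by only about 1 %, confirming that the deviation is usually negligible.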
Figure 2 shows a block diagram of an object detection device comprising the long-range radar 12, the short-range radar 14, the video system 16, associated evaluation units 26, 28, 30, and furthermore an error detection device 32 and a correction device 34. The evaluation units 26, 28, 30, the error detection device 32, and the correction device 34 can be formed by electronic circuits, by microcomputers, or also by software modules in a single microcomputer.
From the raw data supplied by the long-range radar 12, the evaluation unit 26 determines the distances d1i, the relative speeds v1i, and the azimuth angles j1i of all objects located in the detection range 18 of the long-range radar 12. The index i serves here to identify the individual objects. From the distance data and azimuth angles, the evaluation unit 26 also computes the lateral offsets y1i of the various objects.
In an analogous manner, the evaluation unit 28 determines the distances d2i, the relative speeds v2i, the azimuth angles j2i, and the lateral offsets y2i of all objects located in the detection range 20 of the short-range radar 14.
The evaluation unit 30 first determines the azimuth angles jLi and jRi of the objects detected by the cameras 16L, 16R. These azimuth angles are defined analogously to the azimuth angles j1 and j2 in Figure 1, i.e. they indicate the angle between the respective line of sight to the object and a straight line parallel to the main axis H. From the azimuth angles jLi and jRi, the distances d3i and the lateral offsets y3i are computed on the basis of the known distance between the cameras 16L and 16R, and the relative speeds v3i are computed by differentiating the distance data with respect to time.
The distances d1i, d2i, and d3i determined by the three evaluation units 26, 28, 30 are fed to a distance module 36 of the error detection device 32. Correspondingly, the relative speed data v1i, v2i, and v3i are fed to a speed module 38, and the lateral offset data y1i, y2i, and y3i to a lateral offset module 40. An angle module 42 of the error detection device 32 evaluates the azimuth angles j1i, j2i, jLi, and jRi.
The various modules of the error detection device 32 are connected to one another and have access to all of the data supplied to the error detection device 32 by any of the evaluation units. The data connections shown in the drawing relate in each case only to the data whose processing is in the foreground in the module concerned.
When the evaluation unit 26 reports the distance d1i of a detected object (with the index i) to the distance module 36, the distance module 36 first checks, on the basis of the associated lateral offset y1i, whether the object in question is also located in the detection range 20 of the short-range radar 14 and/or in the detection range of the video system 16. If this is the case, the distance module checks whether data for this object are also available from the evaluation units 28, 30. The identification of the objects is facilitated by the fact that the distance module 36 can track the change in the distance data over time. If, for example, an object is initially detected only by the long-range radar 12 and then enters the detection range 20 of the short-range radar 14, it is to be expected that the evaluation unit 28 will report the appearance of a new object, which can then be identified with the tracked object. To eliminate ambiguities, the criterion can also be used that the location coordinates transmitted by the various evaluation units for the same object must agree at least roughly. If distance data for the same object are available from several sensor systems, the distance module 36 checks whether these distance data agree within the respective error limits.
Fehlergrenzen ihrerseits variabel sind. Beispielsweise sind die Querversatzdaten yli bei großem Objektabstand relativ ungenau, weil das Fernbereichsradar 12 nur ein begrenztes Winkelauflösungsvermögen hat und schon geringe Abweichungen imThe limits of error are variable. For example, the cross offset data yli are relatively imprecise at a large object distance, because the long-range radar 12 has only a limited angular resolution and even slight deviations in the
L0 gemessenen Azimutwinkel zu einer beträchtlichen Abweichung des zugehörigen Querversatzes führen. Wenn die Abstandsdaten innerhalb der Fehlergrenzen übereinstimmen, wird der übereinstimmende Wert di an ein nachgeschaltetes Assistenzsystem 44, beispielsweise ein ACC-System, übermittelt.L0 measured azimuth angle lead to a considerable deviation of the associated cross offset. If the distance data match within the error limits, the matching value di is transmitted to a downstream assistance system 44, for example an ACC system.
The output value di can be a weighted average of the distance data d1i, d2i and d3i, the weight of each sensor system's data being greater the more reliable those data are.
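The reliability-weighted averaging described above can be sketched as follows. This is an illustrative sketch, not part of the disclosure: the function name, the sensor names, and the numeric weights are assumptions.

```python
def fuse_distance(measurements):
    """Weighted average of per-sensor distance readings for one object.

    `measurements` maps a sensor name to a (distance, weight) pair; the
    weight encodes how reliable that sensor's data are assumed to be.
    """
    total = sum(w for _, w in measurements.values())
    return sum(d * w for d, w in measurements.values()) / total

# Illustrative readings d1i, d2i, d3i with assumed reliability weights:
di = fuse_distance({
    "long_range_radar": (50.0, 0.5),   # d1i
    "short_range_radar": (50.4, 0.3),  # d2i
    "video": (49.0, 0.2),              # d3i
})  # di is approximately 49.92
```

In this sketch the weights are fixed; in practice they could themselves depend on the operating conditions, e.g. giving the long-range radar more weight at large distances.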
The distance data d2i and d3i transmitted by the evaluation units 28 and 30 are evaluated by the distance module 36 in the same way as the data d1i from the evaluation unit 26. If, for example, an object is at first detected only by the short-range radar 14 and then moves into the detection range 18 of the long-range radar 12, the distance module 36 first tracks the change of the data arriving from the evaluation unit 28 and then checks whether corresponding data also arrive from the evaluation unit 26 at the expected time.
If the expected data from one of the evaluation units 26, 28, 30 fail to arrive, i.e. if a sensor system does not detect an object even though, judging by the data of the other systems, that object should lie within its detection range, the distance module 36 outputs an error signal Fdj. Here the index j identifies the sensor system from which no data were received. The error signal Fdj thus indicates that the sensor system in question may have failed or gone "blind".
If the distance module 36 receives all expected distance data, but these data deviate from one another by more than the error limits, the error signal Fdj is likewise output. In this case the error signal Fdj also indicates from which sensor systems the deviating data were received and how large the deviation is.
If distance data are available from at least two sensor systems that agree within the error limits, the distance value di can be formed from these data and output to the assistance system 44, even though an error was detected and the error signal Fdj was generated.
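The consistency check of the distance module can be sketched as follows. The patent does not specify how agreement is determined; using the median of the readings as the reference, as well as the sensor names and the error-record format, are assumptions made here for illustration.

```python
import statistics

def check_distances(readings, tolerance):
    """Cross-check per-sensor distance readings for one object.

    Readings that deviate from the median by more than `tolerance`
    produce an error record naming the offending sensor (the role of
    the signal Fdj); if at least two readings still agree, a fused
    value is returned despite the error, as described above.
    """
    ref = statistics.median(readings.values())
    errors = [(j, abs(d - ref)) for j, d in readings.items()
              if abs(d - ref) > tolerance]
    bad = {j for j, _ in errors}
    good = [d for j, d in readings.items() if j not in bad]
    fused = sum(good) / len(good) if len(good) >= 2 else None
    return fused, errors

# One deviating sensor is flagged, yet a fused value is still produced:
fused, errs = check_distances(
    {"long_range_radar": 50.0, "short_range_radar": 50.2, "video": 58.0},
    tolerance=1.0,
)
```

Here `fused` is 50.1 and `errs` names the video system together with its deviation, mirroring the behaviour described in the two preceding paragraphs.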
The speed module 38 operates analogously to the distance module 36 described above, except that here it is not the distance data but the speed data v1i, v2i and v3i that are compared with one another, in order to form from them a speed value vi, which is output to the assistance system 44, and/or to output an error signal Fvj indicating a discrepancy between the measured relative speeds.
The lateral-offset module 40 likewise operates largely analogously to the distance module 36 and the speed module 38. In the example shown, however, no error signal is output here, because the lateral-offset data are merely derived data, calculated from the measured azimuth angles, so that error detection should primarily be based on the azimuth angles themselves. Accordingly, the azimuth angles φ1i, φ2i, φLi and φRi are compared separately in the angle module 42. When comparing these azimuth angles, allowance must of course be made for the deviations that necessarily result, for a given object, from the object distance and from the different positions of the sensors or cameras concerned on the baseline. If a discrepancy exceeding the error limits remains after these deviations have been taken into account, an error signal Fφk is output. In this case the index k (k = 1 - 4) identifies the camera 16L or 16R, or the radar sensor, whose azimuth angle does not fit the other azimuth angles. If, despite the discrepancy found, a sufficiently reliable determination of the lateral offset is possible, a corresponding value yi for the lateral offset is output by the lateral-offset module 40 to the assistance system 44.
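The geometric compensation mentioned above can be sketched as follows: for a shared object hypothesis, each sensor's expected azimuth depends on where that sensor sits on the baseline. The function, the sensor names, and the mounting positions are assumptions for illustration only.

```python
import math

def check_azimuths(d, y, measured, positions, tolerance):
    """Cross-check azimuth angles from sensors mounted along a baseline.

    For an object hypothesis at distance d and lateral offset y, a
    sensor mounted at lateral position x_k should see the azimuth
    atan((y - x_k) / d); this accounts for the geometric deviations
    between sensor positions discussed above.  Sensors whose residual
    exceeds `tolerance` are returned (the role of the signal Fphi_k).
    """
    flagged = []
    for k, phi in measured.items():
        expected = math.atan2(y - positions[k], d)
        if abs(phi - expected) > tolerance:
            flagged.append(k)
    return flagged
```

For example, with a radar on the vehicle centerline and two cameras mounted 0.6 m to either side, a camera whose reported azimuth deviates from its geometrically expected value by more than the tolerance would be flagged.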
In the example shown, the error signals Fdj, Fvj and Fφk are supplied to the correction device 34. If the error signals show with sufficient certainty which of the three sensor systems is responsible for the discrepancy, and if the type and magnitude of the detected error indicate that it can be corrected by recalibrating the sensor system concerned, a correction signal K is output to the associated evaluation unit 26, 28 or 30. Optionally, the correction signal can also be output directly to the long-range radar 12, the short-range radar 14 or the video system 16.
An example of a systematic error that can be corrected by recalibration is a misalignment of a radar sensor or a camera, which pivots the reference axis concerned, e.g. 18A or 20A, and thus leads to an erroneous measurement of the azimuth angle. In this case the calibration in the relevant evaluation unit can be changed so that the misalignment is corrected and the correct azimuth angle is obtained again. The misalignment should nevertheless be remedied at the next repair, since it also causes an undesired shift of the detection range; through recalibration, however, the system can be kept functional in the meantime.
In the example shown, the correction device 34 has a statistics module 46, which stores the error signals Fdj, Fvj and Fφk that have occurred during operation of the device and thus documents the type and magnitude of all errors that have occurred. These data are then available for diagnostic purposes during repair or maintenance of the device.
In the example shown, the statistics module 46 also has the function of deciding whether an error can be corrected automatically or whether an uncorrectable error is present and a visual or acoustic error message F must be output to the driver to alert him to the malfunction. For example, the error message F will be output when the signals received from the error detection device 32 indicate a total failure of one of the sensor systems. The functions of the statistics module 46 make it possible not to output the error message F immediately upon a discrepancy that occurs only once or sporadically, but to output it only when discrepancies of the same kind occur with a certain frequency. In this way the robustness of the device is increased considerably.
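The escalation policy just described can be sketched as follows. The window length, the threshold, and the class name are illustrative assumptions; the patent only requires that a warning depend on a certain frequency of recurrence.

```python
from collections import deque

class ErrorStatistics:
    """Log error signals and escalate only recurring ones to the driver.

    A warning F is raised only when errors of the same kind occur at
    least `threshold` times among the most recent `window` signals;
    one-off or sporadic discrepancies are merely recorded.
    """
    def __init__(self, window=100, threshold=3):
        self.threshold = threshold
        self.recent = deque(maxlen=window)  # recent history for diagnostics

    def record(self, kind):
        """Store an error signal; return True if the driver must be warned."""
        self.recent.append(kind)
        return sum(1 for k in self.recent if k == kind) >= self.threshold

stats = ErrorStatistics(window=100, threshold=3)
alerts = [stats.record("Fd_radar") for _ in range(3)]  # [False, False, True]
```

Only the third occurrence of the same error kind triggers the warning, which is the robustness behaviour described above.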
FIG. 3 uses examples to illustrate how a misalignment of a sensor affects the measurement result.
If, for example, the main axis 18A is pivoted through an angle Δφ as a result of a misalignment of the long-range radar 12, the measured azimuth angle φ1 is too large by this angle, and the long-range radar 12 "sees" the object 24 not in its actual position but in the position 24' drawn in dashed lines. This results in an error Δy1 in the measured lateral offset. This error grows the further the object 24 is from the vehicle 10.
If, on the other hand, the left camera 16L of the video system has a misalignment of the same magnitude, the associated azimuth angle φL is falsified by the same angular deviation Δφ, as indicated in FIG. 3 by the line of sight S drawn in dash-dotted lines. The video system 16 then sees the object 24 at the intersection of the lines of sight of the two cameras, i.e. in position 24''. It can be seen that in this case the error Δy3 measured for the lateral offset is considerably smaller. On the other hand, the misalignment of the camera 16L leads to a considerable error Δd3 in the distance measurement.
These relationships can be exploited for automatic error correction in the device described, even when only two sensor systems are present.
If, for example, the long-range radar 12 is misaligned, a comparison of the lateral-offset data of the long-range radar 12 and the video system 16 yields a clear discrepancy Δy1, while the distance data measured with the same systems are essentially consistent. This suggests that the error is due to a misalignment of the radar system and not of a camera. From the measured magnitude of the error, the misalignment can even be determined quantitatively and corrected by recalibrating the radar sensor or the associated evaluation unit 26. If, on the other hand, the camera 16L is misaligned, this is recognizable from a large discrepancy Δd3 in the distance data while the lateral-offset data remain largely consistent. In this case the error can be corrected by recalibrating the video system.
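The attribution rule from the preceding paragraph can be sketched as a simple decision function. The tolerances and return labels are illustrative assumptions.

```python
def diagnose_misalignment(delta_y, delta_d, y_tol, d_tol):
    """Attribute a radar/stereo-video discrepancy as in FIG. 3.

    A large lateral-offset discrepancy (delta_y) with consistent
    distances points to a misaligned radar; a large distance
    discrepancy (delta_d) with consistent lateral offsets points to a
    misaligned camera.
    """
    if delta_y > y_tol and delta_d <= d_tol:
        return "radar_misaligned"
    if delta_d > d_tol and delta_y <= y_tol:
        return "camera_misaligned"
    if delta_y <= y_tol and delta_d <= d_tol:
        return "consistent"
    return "ambiguous"
```

When both quantities disagree at once, no single-sensor attribution is possible and the case is left ambiguous, which is why the text below introduces further criteria.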
In the event of contradictory measurement results, other criteria can also be used to decide which of the sensor systems involved is defective, in particular in cases in which data from only two sensor systems are available for the object, or only two sensor systems are present on the vehicle at all.
If, for example, as in FIG. 3, the lateral offset of the object 24 relative to the reference axis 18A is not 0, the azimuth angle φ1 is approximately inversely proportional to the object distance. In this case the rate of change of the azimuth angle φ1 therefore depends on the relative speed of the object, which can be measured directly with the radar system. If, however, the object 24 is actually located on the main axis 18A and the lateral offset is only simulated by a misalignment Δφ of the sensor system, the measured (apparent) azimuth angle is independent of the distance and of the relative speed. Correspondingly, even when the object actually has a lateral offset, a misalignment produces a discrepancy between the measured rate of change of the azimuth angle and the rate theoretically predicted from the relative speed. Such a discrepancy indicates a misalignment of the sensor system.
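This plausibility check can be made concrete with the small-angle approximation φ ≈ y/d: for a constant true lateral offset y, differentiating gives dφ/dt = -φ·(dd/dt)/d. The following sketch compares a measured rate against that prediction; the function and tolerance are assumptions for illustration.

```python
def azimuth_rate_plausible(phi, phi_rate, d, d_rate, tolerance):
    """Check a measured azimuth rate against the small-angle prediction.

    For a genuine, constant lateral offset y = phi * d (small angles),
    the azimuth changes as d(phi)/dt = -phi * (dd/dt) / d.  An apparent
    offset caused only by sensor misalignment stays constant (rate ~ 0)
    and therefore misses this prediction.
    """
    predicted = -phi * d_rate / d
    return abs(phi_rate - predicted) <= tolerance
```

For an object at 40 m with φ = 0.025 rad closing at 10 m/s, the predicted rate is 0.00625 rad/s; a constant azimuth (rate 0) fails the check, hinting at misalignment.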
The video system 16 also offers the possibility of measuring the distance-dependent change in the apparent size of the object 24. This apparent change in size is directly proportional to the relative speed and approximately inversely proportional to the distance. An error in the distance measurement caused by a misalignment of a camera can then be recognized by the fact that the apparent change in size does not match the measured change in distance.
Since the type of object, for example a passenger car or a truck, can also be recognized with the aid of the camera system 16, and the typical actual size of such objects is at least approximately known, it can also be checked whether the distance of the object measured with the camera system 16 is compatible with the measured apparent size of the object.
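The size/distance compatibility check can be sketched with the pinhole camera model, apparent_width = focal · true_width / distance. The class-width table, focal length, and tolerance below are assumed values, not figures from the patent.

```python
# Typical real-world widths per recognized object class (assumed values):
TYPICAL_WIDTH_M = {"car": 1.8, "truck": 2.5}

def distance_from_size(apparent_width_px, obj_class, focal_px):
    """Pinhole-model distance estimate:
    apparent_width = focal * true_width / distance."""
    return focal_px * TYPICAL_WIDTH_M[obj_class] / apparent_width_px

def size_distance_compatible(measured_d, apparent_width_px, obj_class,
                             focal_px, rel_tolerance=0.3):
    """Flag a measured distance that contradicts the apparent size."""
    expected = distance_from_size(apparent_width_px, obj_class, focal_px)
    return abs(measured_d - expected) <= rel_tolerance * expected
```

For a recognized car appearing 36 px wide at an assumed focal length of 1000 px, the expected distance is 50 m; a measured distance of 90 m would be flagged as incompatible.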
If lane markings are also recognized with the aid of the camera system 16, this information can likewise be used for automatic error detection and correction. In particular, the lateral offset of one's own vehicle relative to the center of the lane can be determined from the recognized lane markings. This lateral offset can be expected to have the value 0 on statistical average. If the evaluation in the statistics module 46 shows that the measured lateral position of the vehicle constantly deviates in one direction from the center of the lane, this indicates a misalignment of one or both cameras. This applies, of course, all the more if the lateral offset of an object measured with the camera system deviates in the same sense from the lateral offset measured with another sensor system.

Claims

1. An object detection device for driver assistance systems in motor vehicles, having at least two sensor systems (12, 14, 16) which measure data (d1, v1, φ1; d2, v2, φ2; φL, φR) on the location and/or state of motion of objects (24) in the surroundings of the vehicle (10) and whose detection ranges (18, 20) overlap one another, characterized by an error detection device (32) which checks the data measured by the sensor systems (12, 14, 16) for freedom from contradiction and, on detecting a contradiction, outputs an error signal (Fdj, Fvj, Fφk).
2. The object detection device according to claim 1, characterized in that the sensor systems (12; 14; 16) are implemented as a radar, lidar or video system, or as a combination thereof.
3. The object detection device according to claim 1, characterized in that the sensor systems comprise a sensor system (12) for the far range and a sensor system (14; 16) for the near range.
4. The object detection device according to claim 3, characterized in that the sensor device (12) for the far range is a radar system or a lidar system.
5. The object detection device according to claim 3 or 4, characterized in that the sensor system (14; 16) for the near range is a radar system, a lidar system or a video system.
6. The object detection device according to one of the preceding claims, characterized in that one of the sensor systems comprises a video system (16) with two cameras (16L, 16R) for capturing a stereo image of the object (24).
7. The object detection device according to one of the preceding claims, characterized in that the error detection device (32) uses the data of at least one sensor system (12) to check whether the object is located in the overlap region (22) of the detection range (18) of this sensor system with the detection range (20) of another sensor system (14), and outputs the error signal (Fdj, Fvj, Fφk) if the object (24) is not located by the other sensor system (14) or if the data measured by this sensor system (14) deviate, taking the error limits into account, from the data measured with the first sensor system (12).
8. The object detection device according to one of the preceding claims, characterized in that, for objects (24) located in the overlap region (22) of several sensor systems (12, 14, 16), the error detection device (32) weights the data measured by these sensor systems according to their reliability and combines the mutually corresponding weighted data into data (di, vi, yi) which are output to the assistance system (44).
9. The object detection device according to one of the preceding claims, characterized in that the error detection device (32) is designed to identify, from the type of contradiction found, the sensor system that is causing the error.
10. The object detection device according to claim 9, characterized by a correction device (34) for correcting the detected error by readjusting or recalibrating the sensor system causing the error, or an evaluation unit (26, 28, 30) belonging to this sensor system.
11. The object detection device according to claim 9 or 10, characterized by a statistics module (46) for recording and storing the detected errors.
12. The object detection device according to claim 11, characterized in that the statistics module (46) is designed to output an error message (F) to the driver when an error detected by the error detection device (32) occurs with a certain statistical frequency.
EP02774378A 2001-10-05 2002-09-17 Object detecting device Withdrawn EP1436640A2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10149115 2001-10-05
DE10149115A DE10149115A1 (en) 2001-10-05 2001-10-05 Object detection device for motor vehicle driver assistance systems checks data measured by sensor systems for freedom from conflict and outputs fault signal on detecting a conflict
PCT/DE2002/003483 WO2003031228A2 (en) 2001-10-05 2002-09-17 Object detecting device

Publications (1)

Publication Number Publication Date
EP1436640A2 true EP1436640A2 (en) 2004-07-14

Family

ID=7701470

Family Applications (1)

Application Number Title Priority Date Filing Date
EP02774378A Withdrawn EP1436640A2 (en) 2001-10-05 2002-09-17 Object detecting device

Country Status (5)

Country Link
US (1) US7012560B2 (en)
EP (1) EP1436640A2 (en)
JP (1) JP2005505074A (en)
DE (1) DE10149115A1 (en)
WO (1) WO2003031228A2 (en)


Families Citing this family (123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10201523A1 (en) * 2002-01-17 2003-07-31 Bosch Gmbh Robert Method and device for masking detection in image sensor systems
DE10241456A1 (en) * 2002-09-07 2004-03-18 Robert Bosch Gmbh Arrangement of sensors on vehicle, used for locating objects, establishes detection zones covering full vehicle width at first distance, with overlap further away
JP4223320B2 (en) * 2003-04-17 2009-02-12 富士重工業株式会社 Vehicle driving support device
KR100513523B1 (en) * 2003-05-29 2005-09-07 현대자동차주식회사 Autonomous intelligent cruise control device
US6834232B1 (en) * 2003-07-30 2004-12-21 Ford Global Technologies, Llc Dual disimilar sensing object detection and targeting system
JP3941765B2 (en) * 2003-09-11 2007-07-04 トヨタ自動車株式会社 Object detection device
JP4193765B2 (en) * 2004-01-28 2008-12-10 トヨタ自動車株式会社 Vehicle travel support device
DE102004008868A1 (en) * 2004-02-20 2005-09-08 Daimlerchrysler Ag Motor vehicle lane recognition method in which a camera is used to record an image of lane markings in the medium to far range and a LIDAR sensor is used to detect lane markings immediately in front of the vehicle
DE102004046632A1 (en) * 2004-09-25 2006-03-30 Robert Bosch Gmbh Antenna radar system with heterodyne frequency conversion (mixing) of input / output signals
DE102004047086A1 (en) * 2004-09-29 2006-03-30 Robert Bosch Gmbh Radar sensor for motor vehicles
DE102004059915A1 (en) * 2004-12-13 2006-06-14 Robert Bosch Gmbh radar system
DE102004062801A1 (en) * 2004-12-20 2006-06-22 Balluff Gmbh High frequency position and path sensor for detecting approach of an object to a detection area in near field having main transmission and reception directions at an angle to each other
DE102005003969A1 (en) * 2005-01-27 2006-08-03 Daimlerchrysler Ag Sensor e.g. radar sensor, arrangement operability determining method for motor vehicle, involves setting sensor signals from detection areas on surrounding area, where surrounding area is not completely detected by concerned sensors
DE102005005720A1 (en) * 2005-02-09 2006-08-17 Robert Bosch Gmbh Driver assistance system e.g. electronic parking assistance, for motor vehicle, has comparators consistently checking results of sensors and section modules of processor module before transmission of results to successive modules
JP4853993B2 (en) * 2005-03-11 2012-01-11 独立行政法人情報通信研究機構 Ranging system
US7253722B2 (en) * 2005-06-09 2007-08-07 Delphi Technologies, Inc. Sensor alignment detection method for an infrared blind-zone sensing system
JP4557819B2 (en) * 2005-06-21 2010-10-06 アルパイン株式会社 Vehicle periphery information providing device
DE102006007149B4 (en) * 2005-08-05 2021-06-02 Volkswagen Ag Device and method for checking the parking space measurement of parking aid devices
DE102005050576B4 (en) * 2005-10-21 2022-06-15 Robert Bosch Gmbh Parking assistance system and parking assistance method
DE102005056800A1 (en) * 2005-11-29 2007-05-31 Valeo Schalter Und Sensoren Gmbh Motor vehicle radar system operating method, involves receiving transmission signal by sensor module in monitoring mode to obtain information about operating condition of another module, where signal is transmitted from latter module
DE102006007173A1 (en) * 2006-02-08 2007-08-09 Valeo Schalter Und Sensoren Gmbh Vehicle environment recognition system, in particular for the detection of objects coming to the side of the vehicle and / or approaching intersection traffic, and method therefor
JP5042558B2 (en) * 2006-08-10 2012-10-03 富士通テン株式会社 Radar equipment
DE102006047634A1 (en) * 2006-10-09 2008-04-10 Robert Bosch Gmbh Method for detecting an environment of a vehicle
DE102006049879B4 (en) * 2006-10-23 2021-02-18 Robert Bosch Gmbh Radar system for automobiles
US8447472B2 (en) * 2007-01-16 2013-05-21 Ford Global Technologies, Llc Method and system for impact time and velocity prediction
EP2122599B1 (en) * 2007-01-25 2019-11-13 Magna Electronics Inc. Radar sensing system for vehicle
DE102007008798A1 (en) * 2007-02-22 2008-09-04 Götting jun., Hans-Heinrich Contactlessly operating protection sensor examining arrangement for detecting object i.e. person, has testing device examining protection sensor and comparing actual position data with data that is determined by protection sensor
DE102007018470A1 (en) 2007-04-19 2008-10-23 Robert Bosch Gmbh Driver assistance system and method for object plausibility
JP5234894B2 (en) * 2007-06-28 2013-07-10 富士重工業株式会社 Stereo image processing device
DE102007058242A1 (en) * 2007-12-04 2009-06-10 Robert Bosch Gmbh Method for measuring transverse movements in a driver assistance system
DE102007062566A1 (en) * 2007-12-22 2009-07-02 Audi Ag motor vehicle
EP2260322A1 (en) * 2008-03-31 2010-12-15 Valeo Radar Systems, Inc. Automotive radar sensor blockage detection apparatus and method
US20090259399A1 (en) * 2008-04-15 2009-10-15 Caterpillar Inc. Obstacle detection method and system
US8280621B2 (en) * 2008-04-15 2012-10-02 Caterpillar Inc. Vehicle collision avoidance system
DE102008026876A1 (en) * 2008-06-05 2009-12-10 Hella Kgaa Hueck & Co. Stereo camera system and method for determining at least one calibration error of a stereo camera system
JP2010064725A (en) * 2008-09-15 2010-03-25 Denso Corp On-vehicle captured image display controller, program for on-vehicle captured image display controller, and on-vehicle captured image display system
DE102009000550B4 (en) * 2009-02-02 2018-10-04 Ford Global Technologies, Llc Wide-angle imaging system for providing an image of the surroundings of a vehicle, in particular of a motor vehicle
DE102009024064A1 (en) * 2009-06-05 2010-12-09 Valeo Schalter Und Sensoren Gmbh Driver assistance means for determining a target angle of a device external object and method for correcting a target angle parameter characteristic
DE102009033854A1 (en) 2009-07-16 2011-01-20 Daimler Ag Method for observation of stereo camera arrangement in vehicle or robot, involves recording image data pixels by two cameras, where image data pixels are processed stereoscopically with corresponding pixels of image pair by processing unit
US7978122B2 (en) * 2009-08-13 2011-07-12 Tk Holdings Inc. Object sensing system
DE102009054835A1 (en) 2009-12-17 2011-06-22 Robert Bosch GmbH, 70469 object sensor
EP2431225B1 (en) 2010-09-17 2013-11-27 SMR Patents S.à.r.l. Method for an automotive hazardous detection and information system
JP5481337B2 (en) * 2010-09-24 2014-04-23 株式会社東芝 Image processing device
WO2012068064A1 (en) 2010-11-15 2012-05-24 Image Sensing Systems, Inc. Hybrid traffic sensor system and associated method
US9472097B2 (en) 2010-11-15 2016-10-18 Image Sensing Systems, Inc. Roadway sensing systems
DE102010054214A1 (en) * 2010-12-11 2012-06-14 Valeo Schalter Und Sensoren Gmbh A method of assisting a driver in driving a motor vehicle and driver assistance system
DE102011006554A1 (en) 2011-03-31 2012-10-04 Robert Bosch Gmbh Method and apparatus for providing a signal to a lighting control unit
JP2013002927A (en) * 2011-06-15 2013-01-07 Honda Elesys Co Ltd Obstacle detection apparatus and computer program
KR101251836B1 (en) * 2011-09-02 2013-04-09 현대자동차주식회사 Driver condition detecting device with IR sensor
DE102011082103B4 (en) * 2011-09-02 2017-08-24 Audi Ag Safety system for a motor vehicle
US8605949B2 (en) * 2011-11-30 2013-12-10 GM Global Technology Operations LLC Vehicle-based imaging system function diagnosis and validation
DE102011120535A1 (en) * 2011-12-08 2013-06-13 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Method for adjusting sensor during manufacture of motor car, involves comparing object positions determined relative to vehicle reference axis, to produce comparative data and adjusting preset sensor as a function of comparative data
EP2604478B2 (en) * 2011-12-13 2021-03-31 Aptiv Technologies Limited Method for recognising function errors of a multi-sensor assembly
JP2014006243A (en) * 2012-05-28 2014-01-16 Ricoh Co Ltd Abnormality diagnostic device, abnormality diagnostic method, imaging apparatus, moving body control system and moving body
CA2858309C (en) * 2012-07-10 2015-08-18 Honda Motor Co., Ltd. Failure-determination apparatus
KR101380888B1 (en) * 2012-07-24 2014-04-02 현대모비스 주식회사 Apparatus and Method for Calculating Vehicle-Distance
DE102012106860A1 (en) * 2012-07-27 2014-02-13 Jenoptik Robot Gmbh Device and method for identifying and documenting at least one object passing through a radiation field
DE102012215026A1 (en) * 2012-08-23 2014-05-28 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating a vehicle
KR102245648B1 (en) 2012-09-10 2021-04-29 에이매스, 아이엔씨. Multi-dimensional data capture of an environment using plural devices
JP6003462B2 (en) * 2012-09-24 2016-10-05 トヨタ自動車株式会社 Vehicle detection device using running sound
US20140152490A1 (en) * 2012-12-03 2014-06-05 Michael Lehning Method and Arrangement for the Detection of an Object in a Radar Field
EP2821308B1 (en) * 2013-07-03 2016-09-28 Volvo Car Corporation Vehicle system for control of vehicle safety parameters, a vehicle and a method for controlling safety parameters
US20150022634A1 (en) * 2013-07-16 2015-01-22 The Steelastic Co., Llc Object inspection system
JP5812061B2 (en) * 2013-08-22 2015-11-11 株式会社デンソー Target detection apparatus and program
US10318823B2 (en) 2013-10-14 2019-06-11 Mobileye Vision Technologies Ltd. Forward-facing multi-imaging system for navigating a vehicle
US20150120035A1 (en) * 2013-10-25 2015-04-30 Infineon Technologies Ag Systems and Methods for Linking Trace Information with Sensor Data
JP6032195B2 (en) * 2013-12-26 2016-11-24 トヨタ自動車株式会社 Sensor abnormality detection device
JP6467748B2 (en) 2014-04-08 2019-02-13 パナソニックIpマネジメント株式会社 Object detection device
EP3026458B1 (en) 2014-11-26 2021-09-01 Maritime Radar Systems Limited A system for monitoring a maritime environment
JP6457278B2 (en) 2015-01-23 2019-01-23 トヨタ自動車株式会社 Object detection apparatus and object detection method
JP2016206774A (en) * 2015-04-17 2016-12-08 トヨタ自動車株式会社 Three-dimensional object detection apparatus and three-dimensional object detection method
US10578713B2 (en) * 2015-06-24 2020-03-03 Panasonic Corporation Radar axis displacement amount calculation device and radar axis displacement calculation method
DE102015111925B4 (en) * 2015-07-22 2021-09-23 Deutsches Zentrum für Luft- und Raumfahrt e.V. Lane departure warning system for a vehicle
DE102015216888A1 (en) * 2015-09-03 2017-03-09 Conti Temic Microelectronic Gmbh Self-analysis of a radar sensor
US10267908B2 (en) 2015-10-21 2019-04-23 Waymo Llc Methods and systems for clearing sensor occlusions
US9916703B2 (en) 2015-11-04 2018-03-13 Zoox, Inc. Calibration for autonomous vehicle operation
US9606539B1 (en) * 2015-11-04 2017-03-28 Zoox, Inc. Autonomous vehicle fleet service and system
DE102016103203A1 (en) * 2016-02-24 2017-08-24 Valeo Schalter Und Sensoren Gmbh Method for detecting a blocked state of a radar sensor, radar sensor device, driver assistance system and motor vehicle
US10162046B2 (en) 2016-03-17 2018-12-25 Valeo Radar Systems, Inc. System and method for detecting blockage in an automotive radar
DE102016005058B4 (en) * 2016-04-26 2020-02-06 Audi Ag Method for operating a radar sensor in a motor vehicle and motor vehicle
EP3293543B1 (en) * 2016-09-08 2021-06-09 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH Apparatus for sensing a vehicular environment when fitted to a vehicle
EP3293542B1 (en) * 2016-09-08 2023-11-01 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH Apparatus for sensing a vehicular environment when fitted to a vehicle
EP3293667A1 (en) 2016-09-08 2018-03-14 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH An apparatus for providing vehicular environment information
JP6796653B2 (en) * 2016-09-15 2020-12-09 株式会社小糸製作所 Sensor system
US10386792B2 (en) * 2016-10-19 2019-08-20 Ants Technology (Hk) Limited Sensory systems for autonomous devices
US11119188B2 (en) 2016-10-27 2021-09-14 Hitachi Automotive Systems, Ltd. Malfunction detecting device
DE102016226312A1 (en) * 2016-12-29 2018-07-05 Robert Bosch Gmbh Method for operating a driver assistance system for motor vehicles
CN106597417A (en) * 2017-01-10 2017-04-26 北京航天计量测试技术研究所 Remote scanning laser radar measurement error correction method
JP6914065B2 (en) * 2017-03-17 2021-08-04 シャープ株式会社 Obstacle detection device, traveling device, obstacle detection system and obstacle detection method
JP6932015B2 (en) * 2017-03-24 2021-09-08 日立Astemo株式会社 Stereo image processing device
EP3415943A1 (en) * 2017-06-13 2018-12-19 Veoneer Sweden AB Error estimation for a vehicle environment detection system
US10656245B2 (en) 2017-09-05 2020-05-19 Valeo Radar Systems, Inc. Automotive radar sensor blockage detection using adaptive overlapping visibility
US10877148B2 (en) 2017-09-07 2020-12-29 Magna Electronics Inc. Vehicle radar sensing system with enhanced angle resolution using synthesized aperture
US10962641B2 (en) 2017-09-07 2021-03-30 Magna Electronics Inc. Vehicle radar sensing system with enhanced accuracy using interferometry techniques
US11150342B2 (en) 2017-09-07 2021-10-19 Magna Electronics Inc. Vehicle radar sensing system with surface segmentation using interferometric statistical analysis
US10962638B2 (en) 2017-09-07 2021-03-30 Magna Electronics Inc. Vehicle radar sensing system with surface modeling
DE112017008078B4 (en) * 2017-11-13 2022-03-17 Mitsubishi Electric Corporation FAULT DETECTION DEVICE, FAULT DETECTION METHOD AND FAULT DETECTION PROGRAM
JP2019114906A (en) * 2017-12-22 2019-07-11 ルネサスエレクトロニクス株式会社 Semiconductor device and semiconductor system
DE102018200752A1 (en) * 2018-01-18 2019-07-18 Robert Bosch Gmbh Method and device for evaluating an angular position of an object, and driver assistance system
US20190244136A1 (en) * 2018-02-05 2019-08-08 GM Global Technology Operations LLC Inter-sensor learning
WO2019172117A1 (en) * 2018-03-05 2019-09-12 株式会社小糸製作所 Sensor system, and image data generating device
JP6977629B2 (en) * 2018-03-09 2021-12-08 株式会社デンソー Vehicle driving support control device, vehicle driving support system and vehicle driving support control method
CN110271502A (en) * 2018-03-16 2019-09-24 株式会社小糸制作所 Sensing system
US10705194B2 (en) 2018-03-21 2020-07-07 Zoox, Inc. Automated detection of sensor miscalibration
DE102018206532A1 (en) * 2018-04-27 2019-10-31 Robert Bosch Gmbh A method of operating a first radar part sensor and a second radar part sensor and radar sensor system comprising a first radar part sensor and a second radar part sensor
US10878709B2 (en) * 2018-07-19 2020-12-29 The Boeing Company System, method, and computer readable medium for autonomous airport runway navigation
GB2576308B (en) * 2018-08-10 2020-12-30 Jaguar Land Rover Ltd An apparatus and method for providing driver assistance of a vehicle
DE102018216704A1 (en) * 2018-09-28 2020-04-02 Ibeo Automotive Systems GmbH Environment detection system, vehicle and method for an environment detection system
DE102018217128A1 (en) * 2018-10-08 2020-04-09 Robert Bosch Gmbh Entity discovery method
DE102018218492A1 (en) * 2018-10-29 2020-04-30 Robert Bosch Gmbh Control device, method and sensor arrangement for self-monitored localization
DE102018221427B4 (en) 2018-12-11 2020-08-06 Volkswagen Aktiengesellschaft Method for determining an existing misalignment of at least one sensor within a sensor network
US11327155B2 (en) 2018-12-21 2022-05-10 Robert Bosch Gmbh Radar sensor misalignment detection for a vehicle
JP6789341B2 (en) * 2019-04-01 2020-11-25 本田技研工業株式会社 Target recognition system, target recognition method, and program
US11548526B2 (en) * 2019-04-29 2023-01-10 Motional Ad Llc Systems and methods for implementing an autonomous vehicle response to sensor failure
DE102019217642A1 (en) * 2019-11-15 2021-05-20 Volkswagen Aktiengesellschaft Method for capturing image material for checking image evaluation systems, device and vehicle for use in the method, as well as computer program
US11247695B2 (en) * 2019-05-14 2022-02-15 Kyndryl, Inc. Autonomous vehicle detection
DE102019212279B3 (en) * 2019-08-15 2021-01-28 Volkswagen Aktiengesellschaft Method and device for checking a calibration of environmental sensors
US11609315B2 (en) * 2019-08-16 2023-03-21 GM Cruise Holdings LLC. Lidar sensor validation
JP7385412B2 (en) * 2019-09-25 2023-11-22 株式会社Subaru automatic driving system
EP4043910A4 (en) * 2019-10-10 2023-08-23 Kyocera Corporation Electronic apparatus, control method for electronic apparatus, and control program for electronic apparatus
DE102021133407A1 (en) 2021-12-16 2023-06-22 Valeo Schalter Und Sensoren Gmbh METHOD OF DETERMINING MISSING DETECTION INFORMATION, METHOD OF OPERATING A DRIVING ASSISTANCE SYSTEM, COMPUTER PROGRAM PRODUCT, DRIVING ASSISTANCE SYSTEM, VEHICLE AND ARRANGEMENT
DE102022127122A1 (en) 2022-10-17 2024-04-18 Bayerische Motoren Werke Aktiengesellschaft LIDAR system for a driver assistance system
DE102022212275A1 (en) 2022-11-18 2024-05-23 Volkswagen Aktiengesellschaft Environment detection system for a vehicle and method for detecting an environment of a vehicle

Family Cites Families (20)

Publication number Priority date Publication date Assignee Title
JPH04117600A (en) * 1990-01-22 1992-04-17 Yoji Kozuka Moving body safe running supporting system
JP2861431B2 (en) * 1991-03-04 1999-02-24 トヨタ自動車株式会社 In-vehicle distance measuring device
IL100175A (en) * 1991-11-27 1994-11-11 State of Israel Ministry of Defence Collision warning apparatus for a vehicle
DE4407757A1 (en) * 1993-03-08 1994-09-15 Mazda Motor Device for detecting obstacles for a vehicle
JPH0717347A (en) * 1993-07-07 1995-01-20 Mazda Motor Corp Obstacle detecting device for automobile
US6067110A (en) * 1995-07-10 2000-05-23 Honda Giken Kogyo Kabushiki Kaisha Object recognizing device
DE19647660B4 (en) * 1996-11-19 2005-09-01 Daimlerchrysler Ag Tripping device for occupant restraint systems in a vehicle
US6085151A (en) * 1998-01-20 2000-07-04 Automotive Systems Laboratory, Inc. Predictive collision sensing system
US6055042A (en) * 1997-12-16 2000-04-25 Caterpillar Inc. Method and apparatus for detecting obstacles using multiple sensors for range selective detection
JP3913878B2 (en) * 1998-01-14 2007-05-09 本田技研工業株式会社 Vehicle object detection device
DE19856313A1 (en) * 1998-12-07 2000-06-08 Volkswagen Ag Method for monitoring multi-beam distance measuring systems for vehicles, fitted on the vehicle front to transmit narrow bundles of measuring rays (radar, light or ultrasonic) and to identify the individual measuring lobes
WO2000040999A1 (en) * 1999-01-07 2000-07-13 Siemens Aktiengesellschaft Method for detecting targets and for determining their direction for a radar device in a motor vehicle
DE19934670B4 (en) 1999-05-26 2004-07-08 Robert Bosch Gmbh Object detection system
EP1103004A1 (en) * 1999-05-26 2001-05-30 Robert Bosch Gmbh Object detection system
DE19949409A1 (en) * 1999-10-13 2001-04-19 Bosch Gmbh Robert Pulse radar object detection for pre crash control systems has tracks object to eliminate spurious detection
JP4120114B2 (en) * 1999-11-04 2008-07-16 株式会社デンソー Road surface condition estimation device
JP2001194457A (en) * 2000-01-14 2001-07-19 Sogo Jidosha Anzen Kogai Gijutsu Kenkyu Kumiai Vehicle circumference monitor
US6882287B2 (en) * 2001-07-31 2005-04-19 Donnelly Corporation Automotive lane change aid
US6771208B2 (en) * 2002-04-24 2004-08-03 Medius, Inc. Multi-sensor system
US6873251B2 (en) * 2002-07-16 2005-03-29 Delphi Technologies, Inc. Tracking system and method employing multiple overlapping sensors

Non-Patent Citations (2)

Title
None *
See also references of WO03031228A3 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
RU2742323C2 (en) * 2018-12-29 2021-02-04 Общество с ограниченной ответственностью "Яндекс Беспилотные Технологии" Method and computer device for determining angular displacement of radar system
US11402467B2 (en) 2018-12-29 2022-08-02 Yandex Self Driving Group Llc Methods and computer devices for determining angular offset of radar system

Also Published As

Publication number Publication date
DE10149115A1 (en) 2003-04-17
WO2003031228A3 (en) 2003-08-14
US7012560B2 (en) 2006-03-14
JP2005505074A (en) 2005-02-17
US20050062615A1 (en) 2005-03-24
WO2003031228A2 (en) 2003-04-17

Similar Documents

Publication Publication Date Title
EP1436640A2 (en) Object detecting device
EP2793045B1 (en) Method for testing an environment detection system of a vehicle
DE19618922C2 (en) Device and method for measuring the vehicle distance for motor vehicles
EP1690730B1 (en) Driver assistance system comprising redundant decision unit
EP1577682B1 (en) Object locating system for vehicles to recognize lane change
DE19964020A1 (en) Method and device for misalignment detection in a motor vehicle radar system
DE102016100401A1 (en) Method for determining a misalignment of an object sensor
DE102010007468B4 (en) System and method for validating modes of adaptive cruise control
EP2046619B1 (en) Driver assistance system
EP2033013A1 (en) Lane changing aid for motor vehicles
DE102010049093A1 (en) Method for operating at least one sensor of a vehicle and vehicle with at least one sensor
EP3563173B1 (en) Method for operating a driver assistance system for motor vehicles
DE102016221440A1 (en) Method for diagnosing environmental sensor systems in vehicles
WO1998054594A1 (en) Method and device for determining the probable path to be covered by a vehicle
DE19637053A1 (en) Method and device for recognizing right-hand or left-hand traffic
EP1912844B1 (en) Method for the creation of environmental hypotheses for driver assistance functions
EP2162872A1 (en) Collision warning device having guardrail detection
EP1766431B1 (en) Method and device for compensating mounting tolerances of a proximity sensor
EP2073034B1 (en) Motor vehicle with a combination of forward-looking radars with overlapping beams
EP0899543A2 (en) Method and device for determining the yaw rate of a moving object
DE19746524B4 (en) Compensation device for compensating the installation tolerances of a distance sensor on a vehicle
DE10335898A1 Driver assistance system with means for detecting moving and stationary objects in front of the driver's vehicle, whereby evaluation of stationary objects is improved by drawing on the behavior of moving objects
EP1612125A2 (en) Lane-change assistant for motor vehicles
WO2021063567A1 (en) Method and device for guiding a motor vehicle in a lane
DE102022204776B3 (en) Method for locating a vehicle within a SAR image

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040506

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

RIN1 Information on inventor provided before grant (corrected)

Inventor name: HEINEBRODT, MARTIN

Inventor name: BRAEUCHLE, GOETZ

Inventor name: BOECKER, JUERGEN

17Q First examination report despatched

Effective date: 20061215

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20100401