GB2548465A - Method and device for driver assistance - Google Patents


Info

Publication number
GB2548465A
GB2548465A
Authority
GB
United Kingdom
Prior art keywords
distance
camera
vehicle
detected
identified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1701167.7A
Other versions
GB201701167D0 (en)
Inventor
Banko Zoltan
Losteiner David
Miklos Tamas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of GB201701167D0
Publication of GB2548465A


Classifications

    • G08G1/16 — Anti-collision systems for road vehicles
    • G08G1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G1/167 — Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G01S15/86 — Combinations of sonar systems with lidar systems; combinations of sonar systems with systems not using wave reflection
    • G01S15/931 — Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S13/867 — Combination of radar systems with cameras
    • G01S13/931 — Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S2013/9315 — Monitoring blind spots
    • G01S2013/9324 — Alternative operation using ultrasonic waves
    • G01S2013/93272 — Sensor installation details in the back of the vehicles
    • G01S2013/93274 — Sensor installation details on the side of the vehicles
    • B60R2300/301 — Viewing arrangements using cameras and displays, combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B60W2050/143 — Alarm means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2420/403 — Image sensing, e.g. optical camera
    • B60W2420/54 — Audio sensitive means, e.g. ultrasound
    • B60W2554/80 — Spatial relation or speed relative to objects (input parameters relating to objects)

Abstract

A method for driver assistance in which a rearwardly orientated distance sensor 16 (e.g. ultrasonic) detects the distance of a detected object 28, a rearward camera 20 (e.g. a reversing camera) identifies items in its field of vision 24, and the distance of an identified item 28 is calculated and/or an identified item 28 is assigned to a traffic lane 35. An object 28 detected by the distance sensor 16 is filtered out if the distance from the host vehicle 1 to the object is greater than a traffic lane width 40 and no item is detected by the camera 20, and/or if the object 28 is detected as a potential threat by the distance sensor 16 but the camera 20 detects that the corresponding item is in a non-adjacent lane 38. A warning is given if a detected object 28 has not been eliminated by the filter and is at a distance below a warning distance, for example between 3 m and 5 m. Items identified by the camera 20 may be recognised by their optical flow in the direction of travel, and their distance may be calculated from their position in the field of vision 24.

Description

Description
Title
Method and device for driver assistance
Prior art
The invention relates to a method for driver assistance, in which objects in the proximity of a vehicle are detected and a warning is given if an object is located in a blind spot region of the vehicle. Further aspects of the invention relate to a computer program which carries out the method and to a driver assistance system which is set up to carry out the method.
In the automotive field, various driver assistance systems are used which are intended to assist the driver in carrying out various driving manoeuvres. These include for example reversing assistants, which detect image data using a reversing camera, in other words a rearwardly orientated camera, and thus make better orientation possible for the driver during rearward travel. Another driver assistance system warns the driver for example of objects located in the blind spot.
For their operation, the driver assistance systems require data concerning the proximity of the vehicle, a plurality of sensors, in particular ultrasound-based sensors, being used for this purpose. The ultrasound-based sensors emit ultrasound signals and receive echoes reflected by objects in the proximity. From the transit time of the ultrasound signal and the known speed of sound in air, the distance between the sensor and the reflecting object is calculated. A sensor of this type comprises a field of vision within which it can detect objects. Since only a distance is determined using the ultrasound-based sensor, it is necessary, in particular in connection with warning of objects in the blind spot, to draw on further information so as to be able to decide whether or not an object is located in the blind spot.

DE 10 2012 204 948 A1 discloses a method for assisting a driver, in which in particular the blind spot region is monitored. It is provided that objects in the vehicle proximity are detected using a camera system. Additionally, further sensors such as ultrasound sensors may be used. To avoid incorrect warnings, which are given as a result of a vehicle a plurality of traffic lanes away, a plausibility check is provided. For this purpose, an object detected in the blind spot range is assigned to a traffic lane using distance data.

DE 10 2012 215 014 A1 discloses a method for monitoring a blind spot. For the monitoring, a first environment detection device is provided, which is implemented for example in the form of a plurality of obliquely rearwardly orientated ultrasound sensors. Moreover, a further environment detection device is provided, which is implemented for example as a camera system, so as to plausibility-check the data of the first environment detection device. For this purpose, a module evaluates the data of the first environment detection device and identifies objects. Additionally, the relative speed of the object is determined.
Subsequently, the data are plausibility-checked by a further module against data from the further environment detection device. There is only a warning if the object is located in the blind spot.

DE 10 2013 013 082 A1 discloses a blind spot warning system. The system comprises a distance measuring unit, by means of which a distance between the present vehicle and a target vehicle located in a side zone or in a rear zone is measured. The distance measuring unit is for example implemented as an ultrasound sensor. Additionally, a unit implemented as a camera is provided, and determines the distance from a line defining the traffic lane width. Using the determined traffic lane width, it can be established whether or not a given target vehicle is located on the adjacent traffic lane.
Disclosure of the invention
A method for driver assistance is proposed in which objects in the proximity of a vehicle are detected and a warning is given if an object is located in a blind spot region. In the method, it is provided that objects in a field of vision of a distance sensor are detected, the distance sensor being disposed obliquely rearwardly orientated on the vehicle, and the distance of the detected objects from the vehicle being determined. Moreover, items located in the field of vision of a camera are identified, the camera being disposed rearwardly orientated on the vehicle, and the distance of an identified item from the vehicle being calculated and/or identified items being assigned to a traffic lane. Subsequently, the objects detected by the distance sensor are filtered using the items identified by the camera. If a detected object has not been eliminated by the filter and the distance thereof as determined by the distance sensor is less than a warning distance, a warning is given.
The warning distance below which a warning is given is preferably selected in the range of 3 m to 5 m. Particularly preferably, the warning distance is selected in the range of 4 m to 4.5 m. For example, the warning distance is 4.2 m. The higher the warning distance is selected, the earlier the warning is given.
As a result of the obliquely rearward orientation of the distance sensor, it is provided that the majority of a blind spot region can be detected using just a single distance sensor. As a minimum, the covered area of the blind spot region is more than 50 %. Thus, even a single distance sensor for each monitored blind spot region is sufficient. In general, a distance sensor is used for monitoring the associated blind spot region for each of the two sides of the vehicle.
In the method, it is provided that a detected object is eliminated during filtering if the distance of the detected object from the vehicle, as determined by the distance sensor, is greater than a traffic lane width and no item has been detected by the camera in a distance interval determined on the basis of the distance determined by the distance sensor. Alternatively or additionally, detected objects are eliminated during filtering if an item which is not located on a traffic lane adjacent to the vehicle is detected by the camera in a distance interval determined on the basis of the distance determined by the distance sensor. The traffic lane width may for example be fixedly predetermined in the method or for example be determined using a sensor, such as a camera.

A blind spot region refers to a region which is located alongside the vehicle and cannot be viewed by the driver by looking in the rear-view mirror or in one of the wing mirrors. A blind spot region at least covers a region which extends as far as 3 metres behind the vehicle as considered in the longitudinal direction of the vehicle and which reaches for example as far as the level of the wing mirror of the vehicle. To the side of the vehicle, the blind spot region extends laterally away from the vehicle at least as far as a distance of 3 metres. Depending on the embodiment of the method, it may be provided that there is exactly one blind spot region, or for example there may be two blind spot regions, one blind spot region bordering the left side of the vehicle and a further blind spot region bordering the right side of the vehicle.

A detected object is understood to mean an object detected by a distance sensor, it being possible in particular for the detected object to be other vehicles, for example lorries, passenger vehicles, motorcycles or bicycles.
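The two elimination rules and the warning threshold described above can be sketched as a single predicate. The sketch below is illustrative only: the function name, the fixed lane width, the ±20 cm distance interval and the use of 4.2 m as the warning distance are assumptions chosen for the example, not values prescribed by the patent beyond the 4.2 m mentioned earlier.

```python
# Illustrative sketch of the filtering rule; names and thresholds are
# assumptions for the example, not prescribed by the patent.

LANE_WIDTH_M = 3.5      # assumed fixed traffic lane width
WARN_DISTANCE_M = 4.2   # example warning distance from the description

def should_warn(sensor_distance_m, camera_items):
    """Decide whether to warn about an object reported by the distance sensor.

    camera_items: list of (distance_m, lane_offset) tuples for items
    identified by the rear camera; lane_offset 1 = adjacent lane,
    2 or more = further away.
    """
    # Items whose camera distance falls into the interval around the
    # sensor reading (interval half-width is an assumption here).
    lo, hi = sensor_distance_m - 0.2, sensor_distance_m + 0.2
    matches = [item for item in camera_items if lo <= item[0] <= hi]

    # Variant 1: far reading, no camera confirmation -> object is at the
    # same level as the vehicle, two or more lanes away -> eliminate.
    if sensor_distance_m > LANE_WIDTH_M and not matches:
        return False
    # Variant 2: a matching item lies on a non-adjacent lane -> eliminate.
    if any(lane != 1 for _, lane in matches):
        return False
    # Not eliminated: warn only below the warning distance.
    return sensor_distance_m < WARN_DISTANCE_M
```

For example, `should_warn(4.0, [])` is filtered out (far reading, nothing seen by the camera), while `should_warn(2.5, [])` gives a warning.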
The distance sensor may for example be configured as an ultrasound sensor, radar sensor or lidar sensor, an ultrasound sensor being preferred. An ultrasound sensor detects exclusively the distance of an object, but cannot specify in what direction the object is located as seen from the vehicle. Alternatively, a plurality of distance sensors may also be used, in particular arrays.
Identifying items is understood to mean detecting items in the field of vision of a camera. Depending on what sort of algorithm is used for evaluating the images supplied by the camera, an identified item may for example correspond to another vehicle, or to part of another vehicle. Thus, an identified item may correspond to a detected object or to part of a detected object. In particular, parts of a vehicle which are prominent in the camera image, such as the edges or a number plate, may be identified as items in this context.
For assessing whether a detected object is actually located in a blind spot region, the distance determined by the distance sensor is not sufficient as a sole criterion. Specifically, if the determined distance is greater than the width of a traffic lane, it is not possible to distinguish between an object located on the adjacent traffic lane behind the vehicle and an object located at the same level as the vehicle but two traffic lanes away. It is therefore provided that filtering is carried out using items identified by the camera.
In a first filtering variant, for each object detected by a distance sensor it is checked whether an item identified by the camera is located in a distance interval determined on the basis of the distance determined by the distance sensor. If an identified item is located in the distance interval, there is a high probability that the identified item corresponds to the detected object or to part of the object detected by the distance sensor. The object detected by the distance sensor is thus located in a position at which it can both be detected by the distance sensor and be identified as an item by the camera. Since the camera is rearwardly orientated, the field of vision of the camera also points rearwards, in such a way that there is a high probability that the detected object is located on the adjacent traffic lane behind the vehicle. It is thus appropriate to give a warning that the object is in the blind spot region.
Conversely, if no item is identified by the camera in the distance interval determined on the basis of the distance determined by the distance sensor, because of the rearwardly orientated field of vision of the camera there is a high probability that the object detected by the distance sensor is located at the same level as the vehicle itself and thus is not located on an adjacent traffic lane, but rather is at least two traffic lanes away. In this case, a warning is not appropriate, since the detected object is not located in the blind spot region. It is therefore eliminated during filtering.
If identified items are assigned to a traffic lane when the items are identified, a detected object may additionally or alternatively be eliminated during filtering if an item which is not located on a traffic lane adjacent to the vehicle is identified by the camera in the distance interval determined from the distance measured by the distance sensor. In turn, there is a high probability that the object detected by the distance sensor, or part of it, corresponds to the item identified by the camera, and thus is not located in a blind spot region, but rather at least two traffic lanes away from the vehicle. A warning would not be appropriate in this case.
In the proposed filtering method, a distance interval is used. In determining the distance interval, the distance determined by the distance sensor is converted to the position of the camera, taking into account the distance between the distance sensor and the camera. Since this conversion is not necessarily unique, this results in a distance interval.
Preferably, the distance interval is obtained using a two-dimensional model and the distance determined by the distance sensor. In the two-dimensional model, the height, in other words the direction perpendicular to the plane of the street, is ignored. Since the distance sensor determines only a distance and not a direction, the detected object may be located anywhere on an arc defined by the opening angle of the ultrasound sensor, in other words the field of vision thereof, and the determined distance. In the conversion, the position of this arc is translated in the X and Y directions in the model by the distance between the camera and the distance sensor. The lower bound on the interval is thus given by the smallest distance of the arc from the camera position, and the upper bound on the distance interval is given by the largest distance of the arc from the camera position.
The determined distance interval may additionally be enlarged by a predetermined amount so as to compensate for measurement errors. For example, for this purpose the upper bound may be increased by 1 to 20 cm and/or the lower bound may be decreased by 1 to 20 cm.
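A minimal sketch of this conversion in the two-dimensional model: the arc of possible object positions around the sensor is sampled, each sample's distance from the camera position is computed, and the resulting interval is widened by a fixed margin. The positions, heading, opening angle and 10 cm margin below are illustrative assumptions, not values from the patent.

```python
import math

def camera_distance_interval(d, sensor_pos, camera_pos,
                             heading_deg, half_fov_deg, n=181):
    """Convert a range reading d (metres) into an interval of possible
    distances from the camera in the two-dimensional model.

    The object lies somewhere on an arc of radius d around the sensor,
    spanning the sensor's opening angle. All geometry parameters here
    are illustrative assumptions.
    """
    sx, sy = sensor_pos
    cx, cy = camera_pos
    dists = []
    for i in range(n):  # sample points along the arc
        ang = math.radians(heading_deg - half_fov_deg
                           + 2.0 * half_fov_deg * i / (n - 1))
        px, py = sx + d * math.cos(ang), sy + d * math.sin(ang)
        dists.append(math.hypot(px - cx, py - cy))
    margin = 0.10  # widen each bound by ~10 cm to absorb measurement error
    return min(dists) - margin, max(dists) + margin
```

If the camera and sensor coincide, the interval collapses to the sensor reading plus or minus the margin; a real offset between the two stretches the interval, as described above.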
If a detected object is not eliminated by the filter, it is to be assumed that the detected object is in fact located in a blind spot region, and a warning is given. The warning may for example be provided in the form of an optical, acoustic or haptic signal, and any desired combinations of these warning signals are also possible. For example, to give the warning a warning light may be activated, a warning tone may be produced and/or the steering wheel may vibrate.
The camera used in the context of the method is preferably a reversing camera. This camera is rearwardly orientated in any case and is already present on many vehicles, and so a further camera does not have to be installed to carry out the proposed method.
Preferably, items in the field of vision of the camera are identified by analysing the optical flow. Images generated by the camera are analysed, and movements in the images are detected.
It is preferably provided that regions in the field of vision of the camera having an optical flow in the direction of travel are respectively identified as an item. Depending on the implementation of the algorithm, it is conceivable for example for a vehicle to be identified as an item or for a plurality of prominent regions of the vehicle each to be identified as an item in the camera image. For carrying out the proposed method, it is irrelevant whether a physical item such as a vehicle is identified in the form of an individual item or identified in the form of a plurality of items. For the proposed filtering, it is merely necessary for an item to be reliably distinguishable from the background or from stationary items, such as hedges or trees, stationary items having no optical flow in the direction of travel.
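One way to realise this step, assuming a dense optical-flow field has already been computed for a pair of consecutive camera images (for example by a standard dense flow estimator), is to keep only pixels whose flow vector points sufficiently along the direction of travel. The function name and thresholds below are illustrative assumptions.

```python
import numpy as np

def moving_item_mask(flow, travel_dir=(0.0, 1.0), min_mag=0.5):
    """Mark pixels whose optical flow points in the direction of travel.

    flow: H x W x 2 array of per-pixel displacements (dx, dy), assumed to
    come from a dense optical-flow estimator run on two consecutive
    camera images. The thresholds are illustrative assumptions.
    """
    d = np.asarray(travel_dir, dtype=float)
    d /= np.linalg.norm(d)
    # Component of each flow vector along the travel direction.
    along = flow[..., 0] * d[0] + flow[..., 1] * d[1]
    mag = np.linalg.norm(flow, axis=-1)
    # Keep pixels that move fast enough and mostly along the travel direction.
    return (mag > min_mag) & (along > 0.7 * mag)
```

Stationary background such as hedges or trees produces no flow in the direction of travel and is therefore excluded by the mask, matching the distinction made above.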
Moreover, for the method it is important to determine the distance of an identified item from the vehicle. For example, the distance of an item identified in the field of vision of the camera is calculated from the position thereof in the field of vision. For example, this makes use of the fact that the camera images have a known perspective because of the known orientation of the camera. For example, it may be the case that the distance of an identified item from the vehicle is smaller the lower in the field of vision of the camera the item is detected. Conversely, the distance may thus be greater the higher in the field of vision an item is detected. In addition to the vertical position in the field of vision, it is also possible to make use of the horizontal position in the field of vision. For example, for the horizontal position, it may be the case that the distance of an identified item is greater the further away from the centre it is located.
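Under the additional assumptions of a flat road and a pinhole camera with known height, focal length and downward pitch, the vertical image position can be back-projected onto the road surface, which reproduces the behaviour described above: items lower in the field of vision come out closer. Every numeric default in this sketch is a made-up example, not a value from the patent.

```python
import math

def ground_distance_from_row(v, image_h=480, f_px=400.0,
                             cam_height=1.0, pitch_deg=15.0):
    """Estimate the ground distance (m) of a point seen at image row v
    (v = 0 is the top row), assuming a flat road and a pinhole camera of
    known height, focal length and downward pitch. All numeric defaults
    are illustrative assumptions.
    """
    cy = image_h / 2.0
    # Angle of the viewing ray below the horizontal.
    ray = math.radians(pitch_deg) + math.atan((v - cy) / f_px)
    if ray <= 0.0:
        return float('inf')  # at or above the horizon: no ground intersection
    return cam_height / math.tan(ray)
```

Rows near the bottom of the image map to short distances, rows near the horizon to large ones; the horizontal position could be used analogously to refine the estimate.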
Moreover, it is preferred for at least one image region to be defined in the field of vision of the camera, a traffic lane being assigned to the at least one image region and an item identified in the image region in question being assigned to the traffic lane assigned to the image region. This also makes use of the fact that the perspective of the camera images is also known because of the known orientation of the camera. Thus, in the camera image, each visible traffic lane is mapped to a particular image region. Each of these image regions can be defined as an individual image region, and a traffic lane can be assigned to each of these image regions. If an item is now identified in an image region of this type, it can be assumed that the identified item is located on the traffic lane assigned to this image region. In further embodiments, to improve the assignment of an identified item to a traffic lane, it is conceivable to convert the known position of said item in the camera image to a point on the road surface.
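The assignment of an identified item to a traffic lane via image regions can be sketched as a simple lookup: each region carries a lane identifier and a pair of edge functions (for example fitted to detected lane markings) giving, for each image row, the columns that lane occupies. The data structure and names below are illustrative assumptions.

```python
def assign_lane(u, v, regions):
    """Assign an item at pixel (u, v) to a traffic lane.

    regions: list of (lane_id, left_edge, right_edge), where left_edge
    and right_edge are functions row -> column bounding that lane's
    image region (e.g. fitted to detected lane markings). Returns the
    lane_id of the first matching region, or None if the pixel falls
    outside all defined regions.
    """
    for lane_id, left, right in regions:
        if left(v) <= u < right(v):
            return lane_id
    return None
```

For a straight road the edge functions could simply be lines converging towards the vanishing point; an item identified at a given pixel is then taken to be on the lane whose region contains that pixel.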
To define the image regions, traffic lane markings may be detected in the field of vision of the camera and/or information of a lane departure warning system present in the vehicle may be taken into account.
According to the invention, a computer program is further proposed, in which one of the methods disclosed herein is carried out when the computer program is executed on a programmable computer device. The computer program may for example be a module for implementing a driver assistance system or implementing a subsystem thereof in a vehicle or an application for driver assistance functions which is executable for example on a smartphone or tablet. The computer program may be stored on a machine-readable storage medium, for example on a permanent or rewritable storage medium, or in association with a computer device or on a removable CD-ROM, DVD, Blu-ray disc or USB stick. Additionally or alternatively, the computer program may be provided for download on a computer device such as a server, for example over a data network such as the internet or a communication link such as a telephone line or a wireless connection.
According to the invention, a driver assistance system is further proposed. The driver assistance system comprises a camera and at least one distance sensor and is formed and/or set up to carry out one of the methods disclosed herein. Accordingly, the features disclosed in the context of the method apply equivalently to the driver assistance system, and conversely the features disclosed in the context of the driver assistance system apply accordingly to the method.
Preferably, the camera is implemented as a reversing camera. The reversing camera is already present in any case as part of another driver assistance system, and so the reversing camera can be used jointly.
The at least one distance sensor is preferably an ultrasound sensor, radar sensor or lidar sensor, an ultrasound sensor being particularly preferred. The distance sensor is orientated in such a way that the field of vision thereof points obliquely rearwards from the vehicle. It is particularly preferred for the driver assistance system to comprise a distance sensor disposed in this manner both on the left side of the vehicle and on the right side of the vehicle.
Preferably, a warning device is further provided, by way of which a warning can be given to a driver of the vehicle. The warning device is for example configured as an optical display, a loudspeaker or a haptic output device. Moreover, any desired combinations of these devices are also possible.
Advantages of the invention
By means of the method proposed according to the invention, a reliable blind spot warning can be given even using exclusively a distance sensor and a camera, it being possible to prevent incorrect warnings. Incorrect warnings should always be avoided, since they may confuse the driver of a vehicle and if they occur repeatedly the driver may potentially ignore a legitimate warning.
In the proposed method, after an object is detected by the distance sensor, filtering is advantageously carried out before a warning is generated. For the filtering, identified items in the field of vision of a camera are drawn on, the camera in particularly advantageous embodiments being a reversing camera which is already installed in the vehicle in any case, in such a way that no further systems have to be installed. Furthermore, an extremely simple filtering method is proposed, it being checked, in a variant, whether an item is also identified by the camera at approximately the same distance. In a further variant of the filtering method, the items identified by the camera are assigned to a traffic lane, and for an object detected by a distance sensor it is checked whether an item on the adjacent traffic lane is recognised by the camera at approximately the same distance. These comparisons of the distances are simple to carry out, and so the filtering method can easily be technically implemented and major computing resources are not required. Furthermore, the filtering method is extremely robust and reliable as a result of the simplicity thereof.
In the proposed method, it is advantageously also extremely simple to detect items in the camera images. Specifically, for the operation of the filtering method it is not necessary to identify a vehicle in the camera images; it is sufficient merely to identify image regions in the camera images for which the optical flow points in the direction of travel. It is not necessary to establish whether these identified image regions belong to the same item, and so the evaluation of the camera images is also extremely simple, and robust and reliable evaluation of the camera images can take place using few computing resources.
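The identification of image regions whose optical flow points in the direction of travel can be illustrated with the following sketch. The flow-field layout and the mapping of the direction of travel onto the negative image x axis are assumptions made for this illustration, not details of the application.

```python
import numpy as np

def regions_moving_in_travel_direction(flow, threshold=1.0):
    """Mark pixels whose optical flow points in the direction of travel.

    flow: array of shape (H, W, 2) holding per-pixel (dx, dy) flow
    vectors, e.g. from a dense optical-flow estimator.  In a rear-facing
    camera the receding background flows one way while an overtaking
    vehicle flows the other, so a simple threshold on the flow component
    along the road axis is enough to flag candidate items.

    Returns a boolean mask of candidate item pixels; connected regions
    of this mask need not be grouped into whole vehicles.
    """
    # Assumption: the direction of travel projects onto the negative
    # x axis of the rear camera image.
    return flow[..., 0] < -threshold
```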
In summary, by the method according to the invention it is possible in a simple manner to generate a warning reliably even if the distance detected by the distance sensor is greater than a traffic lane width, since the proposed method can distinguish whether or not the detected object is located on an adjacent traffic lane. It is therefore not necessary to delay a warning until the determined distance is less than a traffic lane width.
Brief description of the drawings
Embodiments of the invention are shown in the drawings and described in greater detail in the following description. In the drawings:
Fig. 1 shows a first traffic situation,
Fig. 2 shows a second traffic situation,
Fig. 3 shows objects detected in a camera image,
Fig. 4 shows a third traffic situation, and
Fig. 5 shows the assignment of detected objects to a traffic lane.
In the following description of the embodiments of the invention, like components and elements are denoted by like reference numerals, and in some cases there is no repeated description of these components or elements. The drawings merely schematically represent the subject matter of the invention.
Fig. 1 is a detail of a multi-lane road 4, the detail showing three traffic lanes 35, 36, 38 in the same direction of travel. The traffic lanes 35, 36, 38 each have a traffic lane width 40 which, by way of example, is the same for all lanes. On the right traffic lane 35 there is a vehicle 1 which comprises a driver assistance system 10. The driver assistance system 10 comprises a control device 12, a distance sensor 14 and a camera 18. The driver assistance system 10 further comprises a warning device 2, by means of which the driver of the vehicle 1 is to be warned if a vehicle stays in a blind spot region 26. In the example shown, the blind spot region 26 is adjacent to the left side of the vehicle 1 and extends, as considered in the longitudinal direction of the vehicle 1, from 3 metres behind the rear edge of the vehicle 1 forwards approximately to the level of the wing mirror of the vehicle 1. The width of the blind spot region 26 is comparable with the traffic lane width 40 and is for example 3 metres.
The driver assistance system 10 is set up in such a way that the driver of the vehicle 1 receives an indication by way of the warning device 2 if an object 28, such as another vehicle 6, is located in the blind spot region 26. To monitor the blind spot region 26, the distance sensor 14 is disposed on the vehicle 1 in such a way that a field of vision 22 of the distance sensor 14 points obliquely rearwards, in other words is orientated laterally rearwards. The distance sensor 14 is additionally set up in such a way that it can determine a distance from a detected object 28. However, in the case of a single ultrasound sensor 16 for example, the distance sensor 14 cannot determine the direction in which the object 28 is located as seen from the vehicle 1. In the example shown in Fig. 1, a blind spot region 26 is merely shown on the left side of the vehicle 1, but it is preferably also provided to dispose a distance sensor 14 correspondingly on the right side of the vehicle 1, in such a way that a blind spot region can also be monitored on the right side of the vehicle.
In addition to the distance sensor 14, the vehicle 1 also comprises the camera 18, which is preferably configured as a reversing camera 20 and thus points rearwards. Accordingly, the field of vision 24 of the reversing camera 20 is orientated counter to the direction of travel of the vehicle 1. Part of the right traffic lane 35 on which the vehicle 1 is located and parts of the adjacent traffic lane 36 and a far traffic lane 38 are positioned in the field of vision 24 of the camera 18.
Once the object 28 has been detected by the distance sensor 14 and the distance of said object from the vehicle 1 has been determined, filtering is carried out, in which items identified by the camera 18 are used. The detection of items in the image of the camera 18 is described in greater detail below with reference to Fig. 3.
In the example shown in Fig. 1, the camera 18 detects items of which the distance from the vehicle 1 corresponds to the distance of the detected object 28 as determined by the distance sensor 14 and converted to the position of the camera 18. This excludes the possibility that the detected object 28 is located at the same level as the vehicle 1 on the far traffic lane 38. The detected object 28 is therefore not eliminated during the filtering.
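The conversion of the sensor distance to the position of the camera, mentioned above, can be sketched as a simple planar calculation. All mounting coordinates and the nominal detection bearing below are illustrative assumptions; a single ultrasound sensor supplies no direction, so a nominal bearing along the centre of its field of vision is used.

```python
import math

def convert_distance_to_camera(sensor_distance, sensor_pos, camera_pos,
                               bearing_deg=225.0):
    """Convert a sensor range reading to an expected camera distance.

    sensor_pos, camera_pos: (x, y) mounting points in the vehicle frame
    (x forwards, y to the left), in metres.
    bearing_deg: assumed direction of the detection in the vehicle
    frame, measured from the +x axis; the default points obliquely
    rearwards to the left, matching the sensor orientation described.
    """
    b = math.radians(bearing_deg)
    # Nominal object position in the vehicle frame
    obj_x = sensor_pos[0] + sensor_distance * math.cos(b)
    obj_y = sensor_pos[1] + sensor_distance * math.sin(b)
    # Range of that position as seen from the camera mounting point
    return math.hypot(obj_x - camera_pos[0], obj_y - camera_pos[1])
```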
Additionally or alternatively, the items identified by the camera 18 may be assigned to a traffic lane 36, 38. In this case, in the filtering it is checked whether an item located on the adjacent traffic lane 36 is detected by the camera 18 at the distance of the detected object 28 as determined by the distance sensor 14 and converted to the position of the camera 18. If the camera 18 detects an item on the far traffic lane 38, the detected object 28 is eliminated. In the example shown in Fig. 1, the detected object 28 is located on the adjacent traffic lane 36, and so it is not eliminated during the filtering.
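The lane-assignment variant of the filtering can be sketched in the same illustrative style. Representing the camera items as (distance, lane) pairs, and the tolerance value, are assumptions of this sketch, not details of the application.

```python
def filter_detection_with_lanes(sensor_distance, camera_items,
                                tolerance=0.5):
    """Lane-aware variant of the filtering step (illustrative sketch).

    camera_items: list of (distance, lane) pairs, where lane is
    'adjacent' or 'far' as assigned from the item's image position and
    distance is converted to the camera position (metres).

    A detection is kept only if the camera identifies an item on the
    adjacent traffic lane at approximately the sensor distance; an item
    seen only on the far lane at that distance eliminates it.
    """
    lanes_at_distance = [lane for dist, lane in camera_items
                         if abs(dist - sensor_distance) <= tolerance]
    if not lanes_at_distance:
        return False  # nothing seen at that distance: eliminate
    return 'adjacent' in lanes_at_distance
```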
Thus, in the traffic situation shown in Fig. 1, the method according to the invention establishes that the detected object 28 is located in the blind spot region 26, and a warning is given using the warning device 2.
Fig. 2 shows a second traffic situation, the vehicle 1 again being located on the right traffic lane 35 of the road 4.
Unlike in the traffic situation shown in Fig. 1, the ultrasound sensor 16 now detects an object 28 which is not located in the blind spot region 26. Specifically, the detected object 28 is not located on the adjacent traffic lane 36, but rather on the far traffic lane 38. However, merely using the distance determined by the ultrasound sensor 16, the driver assistance system 10 cannot distinguish whether the detected object 28 is located on the far traffic lane 38 approximately at the level of the vehicle 1 or whether it is in a region obliquely behind the vehicle 1 on the adjacent traffic lane 36. Therefore, in the filtering step it is checked whether items are detected by the reversing camera 20 at the distance as determined by the ultrasound sensor 16 and converted to the position of the reversing camera 20. Since this is not the case in the traffic situation shown in Fig. 2, the detected object 28 is eliminated during the filtering, and so no warning is given.
Fig. 3 shows an image of the reversing camera 20 by way of example. A detected object 28 has been detected by the ultrasound sensor 16, and the distance of the detected object 28 from the vehicle 1 has been determined. In Fig. 3, the distance from the detected object 28 as converted to the position of the reversing camera is indicated by an arrow having reference numeral 30. The detected object 28 is a further vehicle 6 which is located obliquely behind the vehicle 1 but still outside the blind spot region 26.
By analysing the optical flow of the images supplied by the reversing camera 20, various prominent regions are identified as items 32 in the image of the further vehicle 6. For carrying out the proposed method, it is not important to identify the further vehicle 6 as a whole; it is perfectly sufficient for prominent image regions assigned to the further vehicle 6 to be identified as items 32, since in the preferably provided method it is merely checked whether or not the detected item 32 is located in a distance interval determined on the basis of the distance 30 determined by the distance sensor 14. This greatly reduces the computing power required for the image processing.
Fig. 4 shows a third traffic situation, which substantially corresponds to the second traffic situation of Fig. 2. However, unlike in the second traffic situation of Fig. 2, the further vehicle 6 is located both in the field of vision 22 of the ultrasound sensor 16 and in the field of vision 24 of the reversing camera 20. Thus, a check as to whether items 32 identified by the reversing camera 20 are located at the distance 30 determined by the ultrasound sensor 16 and converted to the position of the reversing camera would turn out positive. Therefore, during the filtering, assignment of the identified items 32 to a traffic lane 36, 38 may additionally or alternatively be used. In the third traffic situation shown in Fig. 4, it is thus detected that although identified items 32 are identified at the distance 30 determined by the ultrasound sensor 16, these identified items 32 are assigned to the far traffic lane 38, and so in the third traffic situation shown the object 28 detected by the ultrasound sensor 16 is eliminated during the filtering. No warning is given.
Fig. 5 again shows an image of the reversing camera 20, which largely corresponds to the image described with reference to Fig. 3. Thus, the image of the reversing camera 20 again shows the further vehicle 6, which has been detected by the ultrasound sensor 16 as a detected object 28 and of which the distance 30 has been determined. Additionally, a plurality of identified items 32 have been determined by analysing the optical flow.
Additionally, however, it is provided that at least one image region 34 is defined, which in the example shown in Fig. 5 is assigned to the far traffic lane 38. Identified items 32 positioned in the image region 34 are thus also assigned to the far traffic lane 38. If appropriate, to improve the precision of this assignment it may be provided that the position of the identified item 32 is converted into a point on the traffic lane surface and it is checked whether this point is positioned in the image region 34. In further embodiments, it is possible to define further image regions; for example, a corresponding image region may also be defined for the adjacent traffic lane 36 and assigned to the adjacent traffic lane 36. It may likewise be provided that a corresponding image region is assigned to each traffic lane.
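The assignment of an identified item to the image region of a traffic lane amounts to a point-in-region test, which can be sketched with a standard ray-casting check. Representing the image region 34 as a polygon of pixel vertices is an assumption of this sketch.

```python
def point_in_region(point, polygon):
    """Ray-casting test: is an image point inside a polygonal region?

    point: (x, y) pixel coordinates, e.g. the ground-contact point of
    an identified item.  polygon: list of (x, y) vertices of the image
    region assigned to a traffic lane.
    """
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x coordinate where the edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def lane_of_item(point, lane_regions):
    """Assign an item to the first lane whose image region contains it."""
    for lane, polygon in lane_regions.items():
        if point_in_region(point, polygon):
            return lane
    return None  # item lies outside all defined image regions
```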
The invention is not limited to the embodiments disclosed herein and the aspects highlighted therein. Rather, a number of modifications within the scope of expert activity are possible within the limits set out by the claims.

Claims (10)

Claims
1. Method for driver assistance, in which objects (28) in the proximity of a vehicle (1) are detected and a warning is given if an object (28) is located in a blind spot region (26) of the vehicle (1), comprising the steps of
a) detecting objects (28) in a field of vision (22) of a distance sensor (14), the distance sensor (14) being disposed on the vehicle (1) and orientated obliquely rearwards with respect to the vehicle (1), and the distance of the detected objects (28) from the vehicle (1) being determined,
b) identifying items (32) in the field of vision (24) of a camera (18), the camera (18) being disposed on the vehicle (1) and orientated rearwards with respect to the vehicle (1), and the distance of an identified item (32) from the vehicle (1) being calculated and/or identified items (32) being assigned to a traffic lane (36, 38),
c) filtering the objects (28) detected by the distance sensor (14) using the items (32) identified by the camera (18), and
d) giving a warning if a detected object (28) has not been eliminated by the filtering and the distance thereof as determined by the distance sensor (14) is below a warning distance,
wherein
i. a detected object (28) is eliminated during the filtering of step c) if the distance (30) of the detected object (28) from the vehicle (1), as determined by the distance sensor (14), is greater than a traffic lane width (40) and no item (32) has been detected by the camera (18) in a distance interval determined on the basis of the distance (30) determined by the distance sensor (14), and/or
ii. a detected object (28) is eliminated during the filtering of step c) if an item (32) which is not located on a traffic lane (36) adjacent to the vehicle (1) is detected by the camera (18) in a distance interval determined on the basis of the distance (30) determined by the distance sensor (14).
2. Method according to claim 1, characterised in that the warning distance is selected in the range of 3 m to 5 m.
3. Method according to either claim 1 or claim 2, characterised in that in step b) items (32) are identified in the field of vision (24) of the camera (18) by analysing the optical flow.
4. Method according to claim 3, characterised in that regions in the field of vision (24) of the camera (18) having an optical flow in the direction of travel are identified as an item (32).
5. Method according to any of claims 1 to 4, characterised in that the distance of an item (32) identified in the field of vision (24) of the camera (18) is calculated from the position thereof in the field of vision (24).
6. Method according to any of claims 1 to 5, characterised in that at least one image region (34) is defined in the field of vision (24) of the camera (18), a traffic lane (36, 38) being assigned to the at least one image region (34) and an item (32) identified in the image region (34) in question being assigned to the traffic lane (36, 38) assigned to the image region (34).
7. Computer program which carries out the method according to any of claims 1 to 6 when executed on a computer.
8. Driver assistance system (10) comprising a camera (18) and at least one distance sensor (14), characterised in that the driver assistance system (10) is set up to carry out the method according to any of claims 1 to 6.
9. Driver assistance system (10) according to claim 8, characterised in that the camera (18) is a reversing camera (20).
10. Driver assistance system (10) according to either claim 8 or claim 9, characterised in that the at least one distance sensor (14) is an ultrasound sensor (16).
GB1701167.7A 2016-01-26 2017-01-24 Method and device for driver assistance Withdrawn GB2548465A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102016201070.0A DE102016201070A1 (en) 2016-01-26 2016-01-26 Method and device for driver assistance

Publications (2)

Publication Number Publication Date
GB201701167D0 GB201701167D0 (en) 2017-03-08
GB2548465A true GB2548465A (en) 2017-09-20

Family

ID=58463076

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1701167.7A Withdrawn GB2548465A (en) 2016-01-26 2017-01-24 Method and device for driver assistance

Country Status (3)

Country Link
DE (1) DE102016201070A1 (en)
FR (1) FR3047105B1 (en)
GB (1) GB2548465A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10578727B2 (en) * 2016-09-22 2020-03-03 Robert Bosch Gmbh Method and processing unit for detecting a wet or damp roadway and for object detection

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
DE102017119036A1 (en) * 2017-08-21 2019-02-21 Valeo Schalter Und Sensoren Gmbh Avoidance of false alarms during blind spot monitoring
CN109703556B (en) * 2018-12-20 2021-01-26 斑马网络技术有限公司 Driving assistance method and apparatus

Citations (4)

Publication number Priority date Publication date Assignee Title
US20080136612A1 (en) * 2006-11-16 2008-06-12 Hitachi, Ltd. Obstacle Detection Apparatus
US20090212930A1 (en) * 2005-03-03 2009-08-27 Continental Teves Ag & Co. Ohg Method and Device for Avoiding a Collision in a Lane Change Maneuver of a Vehicle
DE102008061357A1 (en) * 2008-12-10 2010-06-17 Valeo Schalter Und Sensoren Gmbh Monitoring device and method for monitoring blind spot areas of a vehicle
US20120271539A1 (en) * 2011-04-19 2012-10-25 GM Global Technology Operations LLC Device and method for driver assistance

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
ES2158827B1 (en) * 2000-02-18 2002-03-16 Fico Mirrors Sa DEVICE FOR DETECTION OF PRESENCE OF OBJECTS.
DE10305935A1 (en) * 2003-02-13 2004-08-26 Valeo Schalter Und Sensoren Gmbh Device for detecting objects in the environment of a motor vehicle
DE102005027653A1 (en) * 2005-06-15 2006-12-21 Robert Bosch Gmbh Blind spot`s object detecting device, has comparison unit ascertaining, based on preset correlations between detection signals whether vehicle following host vehicle is present, and blocking output unit from outputting warning signal
DE102012204948A1 (en) 2012-03-28 2013-10-02 Robert Bosch Gmbh Method for assisting driver when driving vehicle at changing lanes, involves assigning vehicle and detected objects to tracks of roadway, and generating warning signal when detected object regarding vehicle is located on relevant track
KR20140019571A (en) 2012-08-06 2014-02-17 주식회사 만도 Blind spot warning system and method
DE102012215014A1 (en) 2012-08-23 2014-02-27 Robert Bosch Gmbh Method for monitoring blind angle of vehicle, involves detecting surrounding laterally adjacent to vehicle, where objects in detected surrounding and differential speeds of objects to vehicle are determined



Also Published As

Publication number Publication date
DE102016201070A1 (en) 2017-07-27
GB201701167D0 (en) 2017-03-08
FR3047105B1 (en) 2020-12-18
FR3047105A1 (en) 2017-07-28


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)