EP3332351A1 - Verfahren zum betreiben eines fahrerassistenzsystems eines fahrzeugs, steuergerät und fahrzeug - Google Patents
Verfahren zum Betreiben eines Fahrerassistenzsystems eines Fahrzeugs, Steuergerät und Fahrzeug (Method for operating a driver assistance system of a vehicle, control unit and vehicle)
- Publication number
- EP3332351A1 (Application EP16731835.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- traffic
- traffic scenario
- detection algorithm
- driver assistance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- the invention is based on a device or a method according to the preamble of the independent claims.
- the subject of the present invention is also a computer program.
- Modern driver assistance systems (advanced driver assistance systems, or ADAS for short), as well as highly automated vehicle systems for automated driving in urban environments (urban automated driving, or UAD for short), presuppose detailed knowledge of the vehicle environment and of the particular traffic situation in which the vehicle is currently located.
- For this purpose, sensor measurement data are usually used. From these, objects can be extracted by means of so-called detector algorithms, with which the vehicle environment can be described and analyzed.
- Modern sensors such as stereo video cameras or laser scanners make it possible to capture a wealth of information from the vehicle environment. Due to the high information content, the information processing is usually associated with high demands on hardware resources.
- A method for operating a driver assistance system of a vehicle comprises the following steps: reading in a detection signal which represents a traffic scenario read in or recognized by means of an environment read-in device of the vehicle.
- a driver assistance system can be understood to mean an electronic system for assisting a driver, in particular in certain driving situations, to increase driving safety or to increase driving comfort.
- The driver assistance system may be coupled to an environment read-in device for detecting or reading in an environment of the vehicle by means of various sensors. These sensors can be configured, for example, as camera, lidar, radar and/or ultrasonic sensors. The environment read-in device can thus be used to detect, or recognize by sensor, the traffic scenario in which the vehicle is located.
- A traffic scenario can be understood, among other things, as a driving situation characterized by certain landmarks, i.e. static, reliably detectable entities such as infrastructure elements or abstract features. For example, a "drive in a built-up area", a "drive on a country road" or a "drive on a motorway" may be identified by a set of predefined detected objects or infrastructure elements in an image representing the environment of the vehicle.
- A set of predefined detected objects for the traffic scenario "driving in a built-up area" may be formed by street lamps, traffic light systems (such as traffic lights or warning lights at pedestrian crossings), curbs and/or the like, which appear, or are expected to appear, in an image representing the surroundings of the vehicle when the vehicle is travelling through a built-up area. Correspondingly, objects such as regularly or avenue-like planted trees or bushes, delineator posts, missing street lamps, large yellow signposts, warning signs ahead of tight curves, roadside warning signs and/or the like may be used as a set of objects for detecting the traffic scenario "driving on a country road".
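The set-based scenario identification described above can be illustrated with a minimal sketch. All object names, cue sets and the simple overlap-voting rule are illustrative assumptions and not taken from the claims:

```python
# Hypothetical sketch: infer a traffic scenario from the set of detected
# objects/infrastructure elements in an image. Cue sets follow the examples
# in the text; the voting scheme itself is an assumption.
SCENARIO_CUES = {
    "built_up_area": {"street_lamp", "traffic_light", "curb", "pedestrian_crossing_light"},
    "country_road": {"tree_row", "delineator_post", "yellow_signpost", "curve_warning_sign"},
    "motorway": {"blue_traffic_sign", "structural_lane_separation"},
}

def classify_scenario(detected_objects):
    """Return the scenario whose cue set overlaps most with the detections."""
    scores = {
        scenario: len(cues & set(detected_objects))
        for scenario, cues in SCENARIO_CUES.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify_scenario({"street_lamp", "traffic_light", "curb"}))  # built_up_area
print(classify_scenario({"blue_traffic_sign"}))                     # motorway
```

In practice, each cue would itself come from a detector running on the image; the sketch only shows how a set of detections can vote for a scenario.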
- The information about the current traffic scenario can also be obtained from the vehicle pose, i.e. the orientation and location of the vehicle, for example on the road, in combination with map information, and therefore does not necessarily require an environment detection in the true sense.
- A detection algorithm, also called a detector, can be understood as a rule for recognizing objects of a certain object type in sensor data. For example, the detection algorithm can be used to detect lane markings when the vehicle is driving on a correspondingly marked roadway.
- a detection algorithm corresponding to the detected traffic scenario can be activated.
- A scenario-dependent detection algorithm is thus an algorithm whose prioritization of the objects to be detected depends on the respective traffic scenario.
- A scenario-dependent prioritization can be understood to mean, for example, that a first algorithm in a first traffic scenario recognizes a first object with a higher urgency, and thus in more detail, than a second object, while a second algorithm in a second traffic scenario recognizes the second object with a higher priority than the first object.
- In a traffic scenario "driving in a built-up area", for example, the first algorithm may detect a pedestrian as the first object with a higher urgency than a curve of the road with a small curve radius as the second object.
- The second algorithm, in a traffic scenario "driving on a motorway" in which pedestrians are usually not expected, need not recognize the first objects (i.e. the pedestrians) with as high an urgency as the second objects (i.e. here the curves with a small curve radius, which at high motorway speeds are significantly more safety-critical than pedestrians, who occur only very rarely on the motorway).
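The urgency swap between the two scenarios can be sketched as follows; the numeric priority values and names are purely illustrative assumptions:

```python
# Illustrative scenario-dependent prioritization: the same object types get
# different detection priorities depending on the traffic scenario.
PRIORITIES = {
    "built_up_area": {"pedestrian": 1, "tight_curve": 3},  # 1 = most urgent
    "motorway":      {"pedestrian": 3, "tight_curve": 1},
}

def detection_order(scenario, object_types):
    """Order object types so the most urgent ones are detected first."""
    prio = PRIORITIES[scenario]
    return sorted(object_types, key=lambda t: prio.get(t, 99))

print(detection_order("built_up_area", ["tight_curve", "pedestrian"]))
print(detection_order("motorway", ["tight_curve", "pedestrian"]))
```

The first call puts the pedestrian first; the second puts the tight curve first, mirroring the two scenarios described above.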
- Using such a detection algorithm, in the step of detecting, for example only those objects that are suitable for a localization of the vehicle in the current traffic scenario can then be detected.
- The driver assistance system can be configured such that, depending on a detected traffic situation, one or more detectors are activated for those objects that are actually relevant within the respective traffic situation, for example for following a trajectory.
- The localization of the vehicle can thereby be carried out with significantly lower computing power.
- A traffic light detection, for example, usually involves a high computational effort.
- The localization of the vehicle system plays a central role in such systems.
- The implementation of the localization may be based, for example, on a comparison of sensor measurements with an already known map of the vehicle environment. From the sensor measurements, object hypotheses can be formed using detectors. From the nature and arrangement of the objects, a pose, i.e. a position and orientation of the vehicle system in the map, can in turn be inferred.
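Inferring a pose from the arrangement of detected objects can be sketched under strong simplifying assumptions (a flat 2-D world, exactly two matched landmarks, no measurement noise); this is a reading aid, not the patent's implementation:

```python
# Minimal 2-D pose estimate from two landmarks observed in the vehicle frame
# whose positions in the map are known.
import math

def pose_from_two_landmarks(obs, mapped):
    """obs: landmark positions in the vehicle frame; mapped: same landmarks
    in the map frame. Returns (x, y, heading) of the vehicle in the map."""
    (o1, o2), (m1, m2) = obs, mapped
    # Heading: rotation that aligns the observed segment with the map segment.
    ang_obs = math.atan2(o2[1] - o1[1], o2[0] - o1[0])
    ang_map = math.atan2(m2[1] - m1[1], m2[0] - m1[0])
    heading = ang_map - ang_obs
    c, s = math.cos(heading), math.sin(heading)
    # Translation: map position of landmark 1 minus its rotated vehicle-frame offset.
    x = m1[0] - (c * o1[0] - s * o1[1])
    y = m1[1] - (s * o1[0] + c * o1[1])
    return x, y, heading

# Vehicle at (10, 5) facing +x: a lamp 5 m ahead and a sign 5 m ahead, 2 m left.
x, y, h = pose_from_two_landmarks([(5, 0), (5, 2)], [(15, 5), (15, 7)])
print(round(x, 6), round(y, 6), round(h, 6))  # 10.0 5.0 0.0
```

A real system would fuse many noisy object hypotheses, e.g. in a least-squares or filter framework, but the geometric idea is the same.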
- Landmark detector algorithms for performing a localization can thus be activated in a scenario-dependent manner.
- This benefits a driver assistance system, which may consist of several driver assistance functions or a functional bundle, or even a comparatively inexpensive highly automated vehicle system.
- In the step of activating, at least a first detection algorithm and a second detection algorithm can be activated.
- In the step of detecting, using the first detection algorithm, at least one object of a first object type assigned to the traffic scenario can be detected for laterally localizing the vehicle.
- In the step of detecting, using the second detection algorithm, additionally or alternatively at least one object of a second object type assigned to the traffic scenario can be detected for longitudinally localizing the vehicle.
- A longitudinal localization can be understood to be the determination of the position of the vehicle in the direction of travel, for example a distance of the vehicle to an upcoming intersection or to an object located in front of the vehicle.
- In this way, the vehicle can be localized precisely and reliably with relatively little computational effort.
- The object types for lateral and longitudinal localization can be identical, so that in the minimal case exactly one object type is detected.
- In the step of detecting, for example, at least one lane marking, a traffic sign, a traffic light system, a street lamp, at least one other road user or a combination of a plurality of said objects is detected.
- Such objects offer the advantage of good visibility.
- The method may include a step of locating the vehicle using at least one object detected in the step of detecting. Additionally or alternatively, in a step of controlling, at least one function of the driver assistance system may be controlled depending on a localization performed in the step of locating.
- In the step of locating, the vehicle can further be localized using map data relating to an environment of the vehicle.
- The map data may, for example, be data stored in a memory of the vehicle.
- The map data can increase the reliability of locating the vehicle.
- In the step of activating, the detection algorithm can be activated in order to change an accuracy in locating the vehicle depending on the traffic scenario. For example, the accuracy may be varied depending on a speed of the vehicle, a number of other road users or a lane course. This embodiment also makes possible an efficient utilization of the available computing power.
- This method can be implemented, for example, in software or hardware or in a mixed form of software and hardware, for example in a control unit.
- The approach presented here also provides a control unit which is designed to carry out the steps of a variant of a method presented here.
- A control unit can be understood as an electrical device which processes sensor signals and outputs control and/or data signals in dependence thereon.
- The control unit may have an interface, which may be implemented in hardware and/or software. In a hardware implementation, the interfaces may, for example, be part of a so-called system ASIC which includes various functions of the control unit. However, it is also possible that the interfaces are separate integrated circuits or at least partially consist of discrete components. In a software implementation, the interfaces may be software modules present, for example, on a microcontroller alongside other software modules.
- Also advantageous is a computer program product or computer program with program code which can be stored on a machine-readable carrier or storage medium such as a semiconductor memory, a hard disk memory or an optical memory, and which is used for carrying out, implementing and/or triggering the steps of the method according to one of the embodiments described above.
- FIG. 1 is a schematic representation of a vehicle according to an exemplary embodiment;
- FIG. 2 shows a schematic representation of an overall system with three different detection algorithms for activation by means of a control unit according to an exemplary embodiment;
- FIG. 3 is a flowchart of a method according to an exemplary embodiment.
- FIG. 1 shows a schematic representation of a vehicle 100 according to an exemplary embodiment.
- The vehicle 100 is, by way of example, on a main road 102 and approaches an intersection 104. The traffic scenario shown in FIG. 1 can thus be understood, for example, as "driving in a built-up area", wherein a subdivision into two sub-scenarios is possible, here the driving on a main road 102 and the approach to an intersection 104.
- The two (sub-)traffic scenarios in the form of the main road 102 and the intersection 104 are detected by an environment read-in device 106 of the vehicle 100 and transferred in the form of a corresponding detection signal 108 to a control unit 110 for controlling a driver assistance system of the vehicle 100.
- the control unit 110 is configured to use the detection signal 108 to activate one or more detection algorithms corresponding to the (sub) traffic scenarios for detecting objects of one or more object types assigned to the traffic scenarios.
- Such a detection algorithm makes it possible to detect lane markings 112 defining a lane 111 of the vehicle 100, a stop line 113, a traffic light system 114, here a traffic light, a traffic sign 116, street lamps 117 or the like.
- The object types in the form of the lane markings 112 and the street lamps 117 are assigned to the traffic scenario of the main road 102, while the object types in the form of the stop line 113, the traffic light system 114 and the traffic sign 116 are assigned to the traffic scenario of the intersection 104.
- The control unit 110 is optionally configured to localize the vehicle 100 by means of the corresponding detection algorithm using the detected objects.
- For example, the vehicle 100 may be traveling on a country road, that is, for example, outside a built-up area.
- In this case, objects such as bushes or trees, mountain flanks, traffic signs indicating a wildlife crossing or traffic signs warning of curves with a small curve radius may be recognized with high priority by the detection algorithm in an image of the current environment of the vehicle 100, whereas objects such as a stop line or a traffic light system, as shown in FIG. 1 for the traffic scenario "driving in a built-up area", need only be recognized with a lower priority.
- For this purpose, the traffic scenario "driving on a country road" is signaled by the detection signal 108 and the matching detection algorithm is activated in the control unit 110. The detection algorithm will now attempt to recognize objects such as the above-mentioned bushes or trees, traffic signs indicating a wildlife crossing or warning of curves with a small curve radius with high priority, whereas objects such as a stop line or a traffic light system, which usually have a lower probability of occurrence when driving on a country road, are recognized only with a lower priority.
- By prioritizing, when recognizing objects in the current vehicle environment of the vehicle 100, the respective traffic scenario and the objects likely to occur in it, computing power can be saved in the object detection.
- Likewise, a traffic scenario "driving on a motorway" can be signaled by a detection signal 108, in which case, for example, a recognition of blue traffic signs or a structural separation of the individual lanes as infrastructure elements or objects in an image of the current environment of the vehicle 100 serves as an indication of the existence of such a traffic scenario.
- In this case, for example, a detection algorithm can be loaded in the control unit or a corresponding (detection) device 118 in the control unit, which recognizes, for example, a lane boundary line or a direction arrow painted on the road as an object with high priority and forwards it, for example, to a driver assistance system.
- The driver assistance system can then, for example, perform an automatic vehicle guidance of the vehicle 100 on the motorway on the basis of the detected objects and thereby increase the ride comfort for the occupant or occupants.
- FIG. 2 shows a schematic representation of an overall system 200 with three different detection algorithms for activation by means of a control unit according to an exemplary embodiment.
- the overall system 200 may, for example, be controlled by a control unit as described above with reference to FIG. 1.
- the detection algorithms are realized by way of example as three detector units 202, 204, 206.
- The overall system 200 includes a sensor in the form of the environment read-in device 106, which transmits the detection signal 108 to a control module 208 for function switching.
- The control module 208 is configured to generate, using the detection signal 108 and depending on the detected traffic scenario, a switching signal 209 for actuating a first detector switch 210 in a signal line connecting the environment read-in device 106 and the first detector unit 202, a second detector switch 212 in a signal line connecting the environment read-in device 106 and the second detector unit 204, or a third detector switch 214 in a signal line connecting the environment read-in device 106 and the third detector unit 206.
- Depending on the position of the detector switches 210, 212, 214, one or more of the three detector units 202, 204, 206 transmits information 216 about relevant objects relating to the traffic scenario to a processing unit 218.
- The processing unit 218 processes the information 216 using a suitable environment model.
- The processing unit 218 in turn is connected to an analysis unit 220, which serves the situation analysis, including the localization of the vehicle in the traffic scenario.
- By means of a control unit 222 connected to the analysis unit 220, the functions of the driver assistance system are controlled as a function of the situation analysis performed by the analysis unit 220. For this purpose, the control unit 222 is connected, for example, to an actuator 223 or a human-machine interface 224 of the vehicle.
- a signal flow in the overall system 200 is marked by thick arrowheads.
- a control flow in the overall system 200 is marked with small arrowheads.
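The function switching of FIG. 2 (control module, switching signal, detector switches) can be sketched as follows; the scenario-to-detector mapping is an assumed example, not taken from the patent:

```python
# Sketch of the switching idea: a control module closes only the detector
# switches matching the signaled scenario, so only those detector units
# receive sensor data.
DETECTORS_FOR_SCENARIO = {
    "main_road": {"lane_marking_detector", "street_lamp_detector"},
    "intersection": {"stop_line_detector", "traffic_light_detector", "traffic_sign_detector"},
}

class ControlModule:
    def __init__(self, all_detectors):
        self.switches = {d: False for d in all_detectors}  # all open initially

    def on_detection_signal(self, scenarios):
        """Close switches for the active scenarios, open all others."""
        active = set().union(*(DETECTORS_FOR_SCENARIO[s] for s in scenarios))
        for d in self.switches:
            self.switches[d] = d in active
        return sorted(d for d, closed in self.switches.items() if closed)

cm = ControlModule(
    {"lane_marking_detector", "street_lamp_detector", "stop_line_detector",
     "traffic_light_detector", "traffic_sign_detector"})
print(cm.on_detection_signal(["main_road", "intersection"]))  # all five detectors
print(cm.on_detection_signal(["main_road"]))                  # only the two main-road detectors
```

With only "main_road" signaled, three of the five detectors stay idle, which is the computing-power saving the description claims.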
- The detector units 202, 204, 206 realize algorithms for detecting a specific object type, such as a traffic sign. A scenario in this context is, for example, driving across an intersection or on a major road.
- As an example application for such a system, the localization of a vehicle is considered below.
- An application to other areas of automated driving is also possible.
- To design the system 200, localization accuracy requirements are formulated. On the basis of these requirements, landmarks are then identified for several traffic scenarios which are to be detected in the respective traffic scenario in order to meet the localization requirements.
- The required computing power of the control unit for the object detection can be significantly reduced by running only detectors for scenario-relevant objects.
- the system 200 exerts a favorable influence on the overall vehicle system, since the algorithms required for the detection of static objects are executed only in relevant cases. This may prove advantageous, especially in the context of highly automated vehicle systems for automated urban driving, UAD for short.
- For a stereo video system with a range of 50 m and an opening angle of 45° when aligned in the direction of travel, the following requirements result, for example:
- For a lateral localization, lane markings, for example, are suitable for this purpose.
- Street lamps are detected at a distance of about 50 m in order to perform a longitudinal localization, i.e. a localization in the direction of travel. This requirement is based on the assumption that in each time step a longitudinal localization object should be present along the direction of travel with an accuracy of approximately 5 m. This corresponds approximately to the spacing of street lamps on major roads. If the vehicle is located at an intersection without lane markings, the detection of traffic lights, traffic signs or street lamps takes place.
- The requirement is based on the assumption that an intersection scenario requires a higher accuracy in the localization in order to be able to perform a lane-accurate relative localization with respect to other road users and to be able to follow a given trajectory through the intersection.
- the lateral or longitudinal accuracy is for example 0.3 m.
- a maximum orientation error is for example at 0.2 °.
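The figures above can be cross-checked with a small back-of-the-envelope calculation (a reading aid, not part of the disclosure, and assuming one longitudinal localization object roughly every 5 m):

```python
# Sanity check of the stated sensor parameters: how many longitudinal
# localization objects fall within range, and how far the field of view
# reaches laterally at full range.
import math

SENSOR_RANGE_M = 50.0
OPENING_ANGLE_DEG = 45.0
LAMP_SPACING_M = 5.0  # assumed street-lamp spacing on major roads

# Objects along the driving direction inside sensor range:
visible_along_track = int(SENSOR_RANGE_M // LAMP_SPACING_M)

# Maximum lateral offset still inside the field of view at full range
# (half the opening angle to each side of the driving direction):
half_angle = math.radians(OPENING_ANGLE_DEG / 2)
lateral_reach_m = SENSOR_RANGE_M * math.tan(half_angle)

print(visible_along_track)        # 10
print(round(lateral_reach_m, 1))  # 20.7
```

So under these assumptions roughly ten longitudinal localization objects lie within range at any time, comfortably more than the one-per-time-step requirement.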
- The advantages of the system 200 therefore lie, on the one hand, in the reduction of the computing power required for driver assistance systems or highly automated vehicle systems: the detector algorithms are selected to match the scenario, so that in each scenario only the relevant landmarks are detected. On the other hand, the total number of detectable object types can be increased significantly with the same system requirements.
- The system 200 is particularly suitable for use in partially or highly automated assistance systems in parking facilities or on urban traffic routes that build, for example, on information from a digital map.
- For localizing the vehicle, i.e. for determining its position and orientation, a localization method described in more detail below with reference to FIG. 3 is used.
- FIG. 3 shows a flow chart of a method 300 for operating a driver assistance system of a vehicle according to one exemplary embodiment.
- the method 300 may be performed in conjunction with a vehicle or controller described with reference to FIG. 1.
- A detection signal is read in, which represents a traffic scenario detected by means of an environment read-in device of the vehicle.
- at least one detection algorithm of the driver assistance system is activated using the detection signal.
- at least one object of an object type associated with the traffic scenario is detected using the detection algorithm to locate the vehicle in the traffic scenario.
- the method 300 includes an optional step 340 in which the vehicle is located using the at least one object detected in step 330 in the traffic scenario.
- In a step 350, at least one function of the driver assistance system is controlled depending on the localization performed in step 340.
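The sequence of method 300 (reading in, activating, detecting, locating, controlling) can be sketched as a plain pipeline; all helper names and data shapes are illustrative assumptions, since the patent leaves the concrete implementation open:

```python
# Hypothetical end-to-end sketch of method 300.
def run_method_300(detection_signal, algorithms, driver_assistance):
    # Reading in: the detection signal represents the traffic scenario.
    scenario = detection_signal["traffic_scenario"]
    # Activating: enable only the detection algorithm(s) matching the scenario.
    active = [a for a in algorithms if scenario in a["scenarios"]]
    # Detecting: objects of the object types assigned to the scenario.
    objects = [obj for a in active for obj in a["detect"]()]
    # Locating (optional): localize the vehicle using the detected objects.
    pose = driver_assistance["locate"](objects)
    # Controlling (optional): control a function depending on the localization.
    driver_assistance["control"](pose)
    return pose

pose = run_method_300(
    {"traffic_scenario": "intersection"},
    [{"scenarios": {"intersection"}, "detect": lambda: ["stop_line", "traffic_light"]},
     {"scenarios": {"motorway"}, "detect": lambda: ["lane_boundary"]}],
    {"locate": lambda objs: {"x": 0.0, "y": 0.0, "objects_used": objs},
     "control": lambda pose: None},
)
print(pose["objects_used"])  # ['stop_line', 'traffic_light']
```

Note that the motorway algorithm never runs for the intersection scenario, which is the core of the claimed computing-power saving.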
- If an exemplary embodiment comprises an "and/or" link between a first feature and a second feature, this is to be read such that the embodiment according to one variant has both the first feature and the second feature, and according to a further variant has either only the first feature or only the second feature.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102015215105 | 2015-08-07 | ||
DE102015216979.0A DE102015216979A1 (de) | 2015-08-07 | 2015-09-04 | Verfahren zum Betreiben eines Fahrerassistenzsystems eines Fahrzeugs, Steuergerät und Fahrzeug |
PCT/EP2016/063768 WO2017025226A1 (de) | 2015-08-07 | 2016-06-15 | Verfahren zum betreiben eines fahrerassistenzsystems eines fahrzeugs, steuergerät und fahrzeug |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3332351A1 true EP3332351A1 (de) | 2018-06-13 |
Family
ID=57853469
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16731835.1A Withdrawn EP3332351A1 (de) | 2015-08-07 | 2016-06-15 | Verfahren zum betreiben eines fahrerassistenzsystems eines fahrzeugs, steuergerät und fahrzeug |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3332351A1 (de) |
DE (1) | DE102015216979A1 (de) |
WO (1) | WO2017025226A1 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9898008B2 (en) | 2016-03-22 | 2018-02-20 | Delphi Technologies, Inc. | Scenario aware perception system for an automated vehicle |
DE102017220139A1 (de) * | 2017-11-13 | 2019-05-16 | Robert Bosch Gmbh | Verfahren und Vorrichtung zum Bereitstellen einer Position wenigstens eines Objekts |
DE102019205481A1 (de) * | 2019-04-16 | 2020-10-22 | Zf Friedrichshafen Ag | Umfelderfassung mittels eines Sensors mit variierbarem Erfassungsbereich |
DE102021128785A1 (de) * | 2021-11-05 | 2023-05-11 | Bayerische Motoren Werke Aktiengesellschaft | Vorrichtung und Verfahren zur Ermittlung der Entfernung eines Lichtsignalgebers |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150153184A1 (en) * | 2013-12-04 | 2015-06-04 | GM Global Technology Operations LLC | System and method for dynamically focusing vehicle sensors |
-
2015
- 2015-09-04 DE DE102015216979.0A patent/DE102015216979A1/de not_active Withdrawn
-
2016
- 2016-06-15 WO PCT/EP2016/063768 patent/WO2017025226A1/de active Application Filing
- 2016-06-15 EP EP16731835.1A patent/EP3332351A1/de not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
DE102015216979A1 (de) | 2017-02-09 |
WO2017025226A1 (de) | 2017-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2562039B1 (de) | Verfahren und Vorrichtung zum Ändern einer Lichtaussendung zumindest eines Scheinwerfers eines Fahrzeugs | |
EP2583263B1 (de) | Verfahren zur fusion eines verkehrszeichenerkennungssystems und eines spurerkennungssystems eines kraftfahrzeugs | |
DE102018211941B4 (de) | Verfahren zum Ermitteln einer Kreuzungstopologie einer Straßenkreuzung | |
DE102017221407A1 (de) | Fahrassistenzsystem, das navigationsinformation verwendet, und dessen betriebsverfahren | |
DE112016007376T5 (de) | Vorrichtung zur Bestimmung von Peripherie-Informationen und Verfahren zur Bestimmung von Peripherie-Informationen | |
DE102014117751A1 (de) | System und Verfahren zum dynamischen Fokussieren von Fahrzeugsensoren | |
DE112020004949T5 (de) | Fahrzeugbordvorrichtung und fahrunterstzützungsverfahren | |
EP3649519B1 (de) | Verfahren zur verifizierung einer digitalen karte eines höher automatisierten fahrzeugs, entsprechende vorrichtung und computerprogramm | |
EP2562038A1 (de) | Verfahren und Steuergerät zum Hervorheben eines erwarteten Bewegungspfads eines Fahrzeugs | |
EP2116958B1 (de) | Verfahren und Vorrichtung zum Ermitteln des Fahrbahnverlaufs im Bereich vor einem Fahrzeug | |
DE102009036433A1 (de) | Verfahren und Vorrichtung zum Betrieb eines Fahrerassistenzsystems | |
DE102016213782A1 (de) | Verfahren, Vorrichtung und computerlesbares Speichermedium mit Instruktionen zur Bestimmung der lateralen Position eines Fahrzeuges relativ zu den Fahrstreifen einer Fahrbahn | |
DE102021103149A1 (de) | Verfahren und vorrichtung zur bestimmung der optimalen kreuzungsspur in einem unterstützten fahrsystem | |
DE102016219503A1 (de) | Verfahren und Fahrerassistenzsystem zur Erkennung der Intention eines Fußgängers zur Querung einer Ego-Fahrspur | |
WO2009149960A1 (de) | Verfahren zur kombinierten ausgabe eines bildes und einer lokalinformation, sowie kraftfahrzeug hierfür | |
DE112018007063T5 (de) | Anzeigesteuervorrichtung, Anzeigevorrichtung und Anzeigesteuerverfahren | |
EP3332351A1 (de) | Verfahren zum betreiben eines fahrerassistenzsystems eines fahrzeugs, steuergerät und fahrzeug | |
WO2019233777A1 (de) | Fahrassistenzsystem | |
DE102019208176A1 (de) | Fahrzeug-fahrassistenz-system | |
DE10311241B4 (de) | Verfahren und Einrichtung für die Spurführung eines Fahrzeugs | |
DE102015200059A1 (de) | Fahrer-Intentions-Vorhersage-Vorrichtung und Verfahren und Fahrzeug umfassend solch eine Vorrichtung | |
DE102007007540A1 (de) | Fahrstreifenkontrollierte Erkennung von Fahrzeugen beim Spurenwechsel | |
EP3440433B1 (de) | Verfahren zur bestimmung einer pose eines wenigstens teilautomatisiert fahrenden fahrzeugs mittels speziell ausgewählter und von einem backend- server übertragener landmarken | |
DE102018211368A1 (de) | Verfahren zur Beschreibung einer Umgebung eines Fahrzeugs durch die Topologie der befahrenen Straße | |
WO2018188846A1 (de) | Fahrerassistenzsystem für ein fahrzeug |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20180307 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ROBERT BOSCH GMBH |
|
17Q | First examination report despatched |
Effective date: 20200804 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20201126 |