WO2019121521A1 - Method and device - Google Patents

Method and device

Info

Publication number
WO2019121521A1
WO2019121521A1 PCT/EP2018/085218
Authority
WO
WIPO (PCT)
Prior art keywords
sensor data
backend
transmitting
vehicle
sensor
Prior art date
Application number
PCT/EP2018/085218
Other languages
German (de)
English (en)
Inventor
Martin Böld
Lutz-Wolfgang Tiede
Original Assignee
Continental Automotive Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Gmbh filed Critical Continental Automotive Gmbh
Publication of WO2019121521A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096791 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/94 Hardware or software architectures specially adapted for image or video understanding
    • G06V10/95 Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175 Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096775 Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/164 Centralised systems, e.g. external to vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625 License plates

Definitions

  • The invention relates to the field of driver assistance systems.
  • In particular, the invention relates to a method for increasing the range of sensors.
  • ADAS stands for Advanced Driver Assistance Systems.
  • ADASs include, for example, adaptive headlight settings, adaptive cruise control, lane departure warnings, cornering warnings and speed limit warnings.
  • ADASs intervene partly autonomously or autonomously in the drive, control or signaling devices of the vehicle and warn the driver shortly before or during critical situations by means of suitable human-machine interfaces.
  • Some ADASs use a range of sensors such as RADAR, infrared, ultrasound, and optical sensors such as digital video cameras and LIDAR.
  • The digital map provides information about the road network, road geometry, road conditions and terrain around a vehicle.
  • Digital map data provide valuable information that cannot be detected by sensors, such as curvature, grade, speed limits that are not signposted, lane boundaries, etc.
  • The digital map data can be enriched by correlating them with vehicle environment data collected in real time by vehicle-side sensors.
  • The digital map data may be provided to a vehicle via a backend device or a cloud.
  • The digital map data are usually associated with the navigation system.
  • eHorizon systems integrate digital topographic map data with sensor data for predictive control of vehicle systems. Future events, e.g. the slope behind the next turn, are taken into account early to adjust the control of the vehicle.
  • eHorizon systems interpret map and sensor data and, for example, automatically adjust engine and transmission management.
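The predictive-control idea behind eHorizon can be sketched minimally. The sketch below is an illustration only, not the patent's implementation: the horizon format (a list of distance/grade pairs for the road ahead), the function name, and the thresholds are all assumptions.

```python
def coast_advice(horizon, threshold_pct=-3.0, lookahead_m=500.0):
    """Decide early whether to lift off the throttle before a downhill grade.

    horizon: list of (distance_m, grade_pct) pairs describing the road ahead,
             as an eHorizon-style system might derive them from map data.
    Returns the distance to the first significant downhill within the
    lookahead window, or None if no coasting opportunity is found.
    """
    for distance_m, grade_pct in horizon:
        if distance_m > lookahead_m:
            break                       # beyond the prediction window
        if grade_pct <= threshold_pct:  # steep enough downhill to coast into
            return distance_m
    return None
```

A controller would call this continuously and, for example, reduce engine torque shortly before the returned distance is reached.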
  • The range of the sensors of a vehicle, such as radar, camera and lidar, is limited.
  • The sensor information obtained directly in the vehicle is therefore sometimes insufficient: a greater visibility range is required than can be covered by the vehicle's own sensors.
  • Vehicles driving in front, obstacles and curves restrict the visibility range of the vehicle-mounted sensors. For example, overtaking maneuvers cannot be performed if the required visibility is not available. This applies both to driver assistance during overtaking and to the control of autonomously overtaking vehicles.
  • The invention therefore relates to a method for increasing the range of sensors.
  • The method includes detecting, by at least one sensor of a first object, first sensor data; detecting, by means of a second sensor of a second object, second sensor data; recognizing the second object in the first sensor data; and providing the second sensor data to the first object.
  • The second sensor data provided to the first object can be linked to the first sensor data, so that the range of the first sensor data is effectively increased by expanding them with the second sensor data.
  • The first and second sensors can accordingly be of the same type.
  • The provision can be direct or indirect.
  • The recognition of the second object in the first sensor data can be a precursor to the exact identification of the second object, insofar as it allows the first object to establish that a second object equipped with at least one sensor lies within the detection range of the at least one sensor of the first object.
  • The second object can then be identified more precisely so that the second sensor data can be made available to the first object.
  • The first and second objects may be vehicles.
  • One aspect relates to the indirect provision of the second sensor data to the first object.
  • This can be done in particular via a backend or a cloud.
  • The first object and the second object can be mapped in the backend, which acts as a switching center; for this, the first object and the second object must be identifiable.
  • Both the first object and the second object therefore each transmit at least one identification feature to the backend, by means of which the first object and the second object are identifiable.
  • The backend comprises means for receiving records relating to the identification features and the first and second sensor data.
  • The means may comprise one or more digital data interfaces with transmitters and/or receivers that are compatible with one or more telecommunication standards and are communicatively connected to other components of the backend.
  • The backend may also include a data processing unit, communicatively connected to the digital interface(s), that analyzes received data records.
  • The data processing unit may include one or more processors which execute a corresponding computer program.
  • The first object and the second object may also include one or more digital interfaces with transmitters and/or receivers that are compatible with the one or more telecommunication standards.
  • The at least one identification feature may be a spatial position and/or speed, or, in the case of motor vehicles, the license plate number. Accordingly, both the first object and the second object either have a database that is communicatively connected to the interface and in which the at least one identification feature is stored, or have means for determining the at least one identification feature.
  • Both the first object and the second object may include a satellite receiver for determining the position in a satellite-based global positioning system.
  • The determination of the position can also be done by triangulation of mobile base station signals.
  • The direction of travel can be determined via a gyro sensor.
  • The provision of the second sensor data to the first object via the backend can take place in such a way that the first object determines the at least one identification feature of the second object in the first sensor data and transmits this to the backend.
  • The backend can then identify the second object by comparing the identification feature of the second object transmitted by the first object with the identification feature transmitted by the second object itself, and can thus identify the second sensor data to be provided to the first object.
  • The backend acts as a switching center, i.e. the second object transmits the second sensor data to the backend and the backend then transmits them to the first object.
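The switching-center role described above can be sketched as follows. This is a hedged illustration under assumed names: the in-memory registry, the class name, and the method names are not from the patent, which leaves the backend implementation open.

```python
class Backend:
    """Minimal sketch of a backend acting as a switching center:
    objects report their identification feature (e.g. a license plate)
    together with their latest sensor data; a requesting object can then
    retrieve the sensor data of an object it has recognized."""

    def __init__(self):
        # identification feature -> most recently reported sensor data
        self.registry = {}

    def report(self, ident, sensor_data):
        """Called continually by each object with its own identifier."""
        self.registry[ident] = sensor_data

    def request(self, observed_ident):
        """Called by the first object with the identification feature it
        determined in its own sensor data; returns the second object's
        sensor data, or None if no matching object has reported."""
        return self.registry.get(observed_ident)
```

In this sketch the comparison of identification features reduces to a dictionary lookup; a real backend would additionally check positions and timestamps.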
  • One aspect relates to an alternative or extended identification of the second object.
  • This type of identification can be performed by matching distinctive feature points in the first and second sensor data.
  • Camera images, for example, are considered as sensor data.
  • In the camera images taken by the first object, a search is made for distinctive feature points that correspond to feature points in the camera images taken by the second object.
  • The distinctive feature points may be landmarks, traffic signs or other distinctive items of street equipment.
  • The positions of the landmarks detected by the sensors may, for example, be referenced via the World Geodetic System 1984 (WGS 84). This is a geodetic reference system which serves as a uniform basis for position information on Earth and in near-Earth space.
  • The method may include transmitting, by the first object, the first sensor data to the backend, and transmitting, by the second object, the second sensor data to the backend.
  • The method may then include identifying the second object by comparing distinctive feature points present in both the first sensor data and the second sensor data. If the second object is uniquely identified, the backend can again act as a switching center and transmit the second sensor data, which are already present in the backend anyway, to the first object.
  • The identification of the second object on the basis of distinctive feature points can be carried out as an alternative to, or in support of, the identification by means of identification features of the objects themselves.
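The feature-point comparison above can be sketched with WGS 84-referenced landmark positions. The sketch is an assumption-laden illustration, not the patent's method: the flat-earth distance approximation, the tolerance, and the minimum-overlap threshold are all invented here for clarity.

```python
import math

def same_scene(first_points, second_points, tol_m=2.0, min_shared=3):
    """Count landmarks seen in both sensor data sets and decide whether the
    overlap is large enough to identify the second object.

    first_points, second_points: lists of (lat, lon) landmark positions in
    WGS 84 degrees. Distances are approximated in a local tangent plane:
    ~111,320 m per degree of latitude, longitude scaled by cos(latitude).
    """
    shared = 0
    for lat1, lon1 in first_points:
        for lat2, lon2 in second_points:
            dy = (lat2 - lat1) * 111_320.0
            dx = (lon2 - lon1) * 111_320.0 * math.cos(math.radians(lat1))
            if math.hypot(dx, dy) <= tol_m:
                shared += 1
                break  # each first-object landmark matched at most once
    return shared >= min_shared
```

A backend could run this check only against candidate objects whose reported position lies within the first object's sensor detection range, as the description notes.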
  • The alternative identification comes into consideration in particular if the first object recognizes a second object as such in its sensor data but, for some reason, is unable to determine at least one identification feature of the second object.
  • In this case, the first object only has to transmit to the backend a request for the provision of second sensor data, and the backend itself identifies the second object on the basis of the distinctive feature points.
  • The backend can limit this determination to those objects whose position lies within the detection range of the at least one sensor of the first object.
  • The linking can be carried out in the backend such that the linked sensor data are present as the first sensor data extended by the second sensor data, and these are transmitted to the first object by the backend as linked sensor data.
  • The linking can also include combining or extending the first and second sensor data with further data present in the backend, for example third sensor data.
  • The first and second sensor data, or the feature points extracted therefrom, can be used to enrich a digital road map.
  • One aspect relates to the direct provision of the second sensor data to the first object.
  • In this case, the provision does not have to take place via a central entity such as a backend.
  • The provision can also be made directly, for example via Car2Car communication. This is the case in particular if the first object can uniquely identify the second object.
  • The method may include determining, by the first object, the at least one identification feature of the second object in the first sensor data; identifying the second object, by the first object, based on the at least one identification feature; transmitting to the second object, by the first object, a request to provide sensor data; and transmitting, by the second object, the second sensor data to the first object.
  • One aspect relates to providing second sensor data at particular locations.
  • These locations can, for example, be danger spots or danger areas where an extension of the visibility range is necessary, for example before a crest or for an overtaking maneuver in a curve.
  • These locations or areas are defined in advance and provided with position coordinates. This definition can also be crowdsourcing-based, i.e. based on events that are already recorded in a digital map in the backend, such as completed or aborted overtaking maneuvers.
  • The second object may also be a permanently installed device, for example an infrastructure facility near a road crest.
  • In this way, an unordered collection of sensor data is avoided, since only those second sensor data are collected that can usefully be used to increase the range of the first sensor data.
  • Increasing the range increases safety in various driving situations, for example when overtaking or when driving in poor visibility conditions such as fog.
  • The method can be implemented in such a way that a targeted exchange of sensor data takes place, with a basis for possible authorization and billing.
  • Fig. 1 is a block diagram of a system for increasing the range of sensors.
  • Fig. 2 is a flowchart of a method implemented in the system.
  • Fig. 1 shows a block diagram of a system 100 for increasing the range of sensors.
  • Vehicle 106 wants to overtake truck 104, which is traveling in front of vehicle 106 on the same lane 102.
  • The roadway curves further ahead, so that vehicle 108, located in the opposite lane, is hidden from the view of vehicle 106 by truck 104.
  • Vehicle 106 has imaging sensors in its front region, indicated by the dashed cone. Due to the occlusion, these imaging sensors cannot detect vehicle 108. However, the imaging sensors arranged at the front of truck 104, again indicated by a dashed cone, do detect vehicle 108 thanks to their free field of view.
  • A method for increasing the range of the imaging sensors of vehicle 106 will now be described with reference to Fig. 2. The method is backend-supported.
  • Backend 110 is communicatively connected to vehicle 106 and truck 104, in particular via a mobile radio interface. Some of the method steps are executed in vehicle 106, others in backend 110 and others in truck 104. This is indicated by the corresponding reference numerals in Fig. 2 and the dashed vertical lines.
  • Vehicle 106 detects first sensor data in the form of camera images by means of its imaging sensor, see method step 202.
  • Truck 104 detects second sensor data in the form of camera images by means of its imaging sensor, see method step 210.
  • First and second sensor data serve as input data for an eHorizon system, which is present both in the vehicle 106 and in the truck 104.
  • Vehicle 106 recognizes a truck in the first sensor data by means of suitable recognition software which is executed in a processor in vehicle 106, see method step 204.
  • The recognition software is also able to identify at least one identification feature of the truck, here its license plate number, see method step 206.
  • Vehicle 106 then transmits the license plate number and the first sensor data to backend 110. Vehicle 106 thus makes a request to the backend for the transmission of the second sensor data, that is to say the camera images of truck 104.
  • Both vehicle 106 and truck 104 continually transmit their GPS position, referenced to their license plate number, to backend 110.
  • Backend 110 thus has knowledge of truck 104.
  • Backend 110 uniquely identifies truck 104 by matching the license plate number transmitted by vehicle 106 against the license plate numbers reported to it.
  • Backend 110 then initiates a transmission of the second sensor data.
  • Truck 104 then transmits the second sensor data to backend 110.
  • Backend 110 links the first sensor data and second sensor data, see method step 212, so that an actual range increase of the imaging sensor of vehicle 106 results.
  • Backend 110 transmits the linked sensor data back to vehicle 106, where they are displayed on a display. The driver of vehicle 106 recognizes vehicle 108 in the linked sensor data and therefore refrains from overtaking.
  • Alternatively, vehicle 106 can identify truck 104 based on its GPS position. For this purpose, vehicle 106 measures the distance to truck 104 and calculates the GPS position of truck 104 from its own GPS position and the distance in the direction of travel. In backend 110, the calculated GPS position is then compared with the GPS position transmitted by truck 104 itself to backend 110, so that truck 104 can be unambiguously identified there. Alternatively, vehicle 106 can transmit the measured distance to backend 110 in addition to its own GPS position, and backend 110 calculates the GPS position of truck 104.
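The position-based identification just described can be sketched as follows. This is a simplified illustration under stated assumptions: a flat-earth projection, an invented matching tolerance, and hypothetical function names; the patent does not specify how the calculation is performed.

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate metres per degree of latitude

def project_position(own_lat, own_lon, heading_deg, distance_m):
    """Estimate the leading truck's GPS position from the own position,
    the heading (degrees clockwise from north) and the measured distance,
    using a flat-earth approximation valid for short distances."""
    d_north = distance_m * math.cos(math.radians(heading_deg))
    d_east = distance_m * math.sin(math.radians(heading_deg))
    lat = own_lat + d_north / M_PER_DEG_LAT
    lon = own_lon + d_east / (M_PER_DEG_LAT * math.cos(math.radians(own_lat)))
    return lat, lon

def same_vehicle(calculated, reported, tol_m=10.0):
    """Backend-side check: does the position calculated by the first object
    match, within a tolerance, the position the truck reported itself?"""
    dy = (reported[0] - calculated[0]) * M_PER_DEG_LAT
    dx = (reported[1] - calculated[1]) * M_PER_DEG_LAT * math.cos(
        math.radians(calculated[0]))
    return math.hypot(dx, dy) <= tol_m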

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

La présente invention concerne un procédé pour augmenter la portée de capteurs. Le procédé comprend la détection, au moyen d'au moins un capteur, d'un premier objet, par des premières données de capteur ; la détection, au moyen d'un second capteur, d'un second objet, par des secondes données de capteur ; la reconnaissance du second objet dans les premières données de capteur ; la fourniture des secondes données de capteur au premier objet.
PCT/EP2018/085218 2017-12-21 2018-12-17 Procédé et dispositif WO2019121521A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017223575.6A DE102017223575A1 (de) 2017-12-21 2017-12-21 Verfahren und Einrichtung
DE102017223575.6 2017-12-21

Publications (1)

Publication Number Publication Date
WO2019121521A1 true WO2019121521A1 (fr) 2019-06-27

Family

ID=65010721

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/085218 WO2019121521A1 (fr) 2017-12-21 2018-12-17 Procédé et dispositif

Country Status (2)

Country Link
DE (1) DE102017223575A1 (fr)
WO (1) WO2019121521A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021213308A1 (de) 2021-11-25 2023-05-25 Volkswagen Aktiengesellschaft Verfahren und Assistenzsystem zum Unterstützen eines Überholmanövers und Kraftfahrzeug

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008061890A1 (fr) * 2006-11-23 2008-05-29 Continental Automotive Gmbh Procédé de communication hertzienne entre des véhicules
US20170263122A1 (en) * 2016-03-14 2017-09-14 International Business Machines Corporation Interactive camera viewpoint and adjustment sharing system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013220023A1 (de) * 2013-10-02 2015-04-02 Continental Automotive Gmbh System zur Bereitstellung von Daten für Fahrzeuge
DE102015221439B3 (de) * 2015-11-02 2017-05-04 Continental Automotive Gmbh Verfahren und Vorrichtung zur Auswahl und Übertragung von Sensordaten von einem ersten zu einem zweiten Kraftfahrzeug

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008061890A1 (fr) * 2006-11-23 2008-05-29 Continental Automotive Gmbh Procédé de communication hertzienne entre des véhicules
US20170263122A1 (en) * 2016-03-14 2017-09-14 International Business Machines Corporation Interactive camera viewpoint and adjustment sharing system

Also Published As

Publication number Publication date
DE102017223575A1 (de) 2019-06-27

Similar Documents

Publication Publication Date Title
DE102016217645B4 (de) Verfahren zum Bereitstellen von Information über eine voraussichtliche Fahrintention eines Fahrzeugs
DE102017107216B4 (de) Geschwindigkeitsbegrenzungsanzeigevorrichtung für ein fahrzeug
DE102015213884B4 (de) Vorrichtung zum Bestimmen einer Gefahr in einer Fahrtumgebung und Vorrichtung zum Anzeigen einer Gefahr in einer Fahrtumgebung
DE102016216335A1 (de) System und Verfahren zur Analyse von Fahrtrajektorien für einen Streckenabschnitt
DE102016112859A1 (de) Navigationsvorrichtung für ein autonom fahrendes Fahrzeug
DE102016112913A1 (de) Verfahren und Vorrichtung zum Bestimmen einer Fahrzeug-Ich-Position
EP3830523B1 (fr) Procédé pour la mise à jour d'une carte des environs, dispositifs pour l'exécution côté véhicule d'étapes du procédé, véhicule, dispositif pour l'exécution côté ordinateur central d'étapes du procédé ainsi que support de stockage lisible par ordinateur
EP3830522B1 (fr) Procédé pour estimer la qualité de la localisation lors de la localisation propre d'un véhicule, dispositif pour mettre en oeuvre le procédé, véhicule et programme informatique
DE102014220681A1 (de) Verkehrssignalvorhersage
EP3380810B1 (fr) Méthode, dispositif, installation d'administration de cartes et système pour localiser avec précision un véhicule dans son environnement
DE112017007184T5 (de) Unterstützungsvorrichtung, unterstützungsverfahren und programm
DE102008035992A1 (de) Ampelphasenassistent unterstützt durch Umfeldsensoren
WO2009027122A1 (fr) Unité de mise à jour et procédé de mise à jour d'une carte numérique
DE102009014104A1 (de) Erkennungssystem für ein Fahrzeug
EP2562735A1 (fr) Procédé et dispositif d'analyse d'un tronçon de trajet à parcourir par un véhicule
DE112014002959T5 (de) Bestimmung der Fahrspurposition
EP3151213B1 (fr) Dispositif de véhicule et procédé d'enregistrement d'une zone environnante d'un véhicule automobile
EP2953111A1 (fr) Procédé et dispositif destinés à déterminer des aires de stationnement libres sur des parkings de camions et communication au conducteur de camion
EP2662848A2 (fr) Procédé de fabrication dýun profil de conduite
DE102012207864A1 (de) Verfahren zum Reduzieren einer Staugefahr
DE112014002958T5 (de) Verwalten von Sensorerkennung in einem Fahrerassistenzsystem eines Fahrzeugs
WO2018215156A1 (fr) Procédé, dispositifs et support d'enregistrement lisible par ordinateur comprenant des instructions pour déterminer des règles de circulation à appliquer pour un véhicule automobile
DE102012220138A1 (de) Automatische Erkennung von Falschfahrern
DE112021003340T5 (de) Hindernisinformationsverwaltungsvorrichtung,hindernisinformationsverwaltungsverfahren und vorrichtung für ein fahrzeug
DE102013212010A1 (de) Verfahren und Vorrichtung zum Unterstützen einer Engstellendurchfahrung für ein Fahrzeug, Verfahren zum Unterstützen einer Engstellendurchfahrung für ein Nachfolgefahrzeug und Verfahren zum Verwalten von Vermessungsinformationen zum Unterstützen von Engstellendurchfahrungen von Fahrzeugen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18833003

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18833003

Country of ref document: EP

Kind code of ref document: A1