WO2020233902A1 - Procédé de surveillance d'une infrastructure - Google Patents


Info

Publication number
WO2020233902A1
WO2020233902A1 · PCT/EP2020/060141 · EP2020060141W
Authority
WO
WIPO (PCT)
Prior art keywords
environment
detection devices
surroundings
data
infrastructure
Prior art date
Application number
PCT/EP2020/060141
Other languages
German (de)
English (en)
Inventor
Dieter Joecker
Rolf Nicodemus
Stefan Nordbruch
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Priority to EP20719966.2A priority Critical patent/EP3973443A1/fr
Priority to CN202080037914.9A priority patent/CN113874871A/zh
Publication of WO2020233902A1 publication Critical patent/WO2020233902A1/fr

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/042Detecting movement of traffic to be counted or controlled using inductive or magnetic detectors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/027Parking aids, e.g. instruction means
    • B62D15/0285Parking performed automatically
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit

Definitions

  • The invention relates to a method for monitoring an infrastructure using at least two environment detection devices which are arranged within the infrastructure and each include an environment sensor.
  • The invention also relates to an environment detection device, a computer program and a machine-readable storage medium.
  • A parking area can be monitored, for example, by infrastructure sensors such as cameras or lidar sensors.
  • The sensors usually send their sensor data via one or more cables to a central computer system, which evaluates the sensor data.
  • Based on this evaluation, planning functions for the at least partially automated control of motor vehicles within the parking area can be carried out.
  • The computer system is usually a static network or system that is dimensioned once, before the parking lot is put into operation.
  • Such a computer system usually has to carry out many parallel calculations.
  • The object on which the invention is based is to provide a concept for the efficient monitoring of an infrastructure.
  • According to a first aspect, a method is provided for monitoring an infrastructure using at least two environment detection devices arranged within the infrastructure, each comprising an environment sensor. The environment sensors each detect their environment in order to determine environment data based on the detection, and the respective environment data of the environment sensors are evaluated by means of at least one of the environment detection devices.
  • According to a second aspect, an environment detection device is provided which comprises an environment sensor and is set up to carry out all steps of the method according to the first aspect.
  • According to a third aspect, a computer program is provided which comprises commands that, when the computer program is executed by a computer, for example by the environment detection device according to the second aspect, cause it to carry out a method according to the first aspect.
  • According to a fourth aspect, a machine-readable storage medium is provided on which the computer program according to the third aspect is stored.
  • The invention is based on the insight that the above object can be achieved by using one or more environment detection devices to evaluate the environment data from the environment sensors. This can be done instead of or in addition to a central computer system of the infrastructure.
  • A static, pre-dimensioned computer system can thus be replaced, at least for the environment evaluation, by local computing units in the environment detection devices, in particular in the environment sensors.
  • For example, an environment detection device comprises a GPU, that is to say a graphics processing unit, which is set up to evaluate environment data.
  • Such a central computer system can then be equipped with a lower memory capacity and a lower computing capacity compared to the case in which the computer system has to take over the evaluation of the environment data itself.
  • This has the technical advantage that a concept for the efficient monitoring of an infrastructure is provided.
  • An infrastructure includes, for example, a parking lot.
  • Environment data represent an environment of the respective environment sensor.
  • For example, an environment detection device comprises one or more processors which are set up to evaluate the environment data.
  • An environment detection device can comprise a local computing unit which is set up to evaluate the environment data.
  • A local computing unit comprises in particular one or more processors.
  • In one embodiment, n environment detection devices evaluate the respective environment data independently of one another, with n greater than or equal to two, the n environment detection devices checking each other with regard to their respective evaluations.
  • The n environment detection devices check each other with regard to their respective evaluations in order to detect errors in the respective evaluations and/or deviations between the evaluations, that is to say evaluations that deviate from one another.
  • In particular, n is an odd number greater than or equal to three.
  • If, for example, two of three environment detection devices arrive at the same result, the result of these two environment detection devices is considered valid.
  • The result of the third environment detection device is then, for example, discarded or ignored.
  • If n is greater than or equal to three and n is an odd number, this has the technical advantage in particular that there will always be a majority of environment detection devices which provide the same respective evaluation.
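The majority principle described above (n odd, n greater than or equal to three) can be sketched as follows; the boolean "object detected" result type and the function name are illustrative assumptions, not part of the application:

```python
from collections import Counter

def majority_result(results):
    """Return the majority evaluation among an odd number (>= 3) of
    independently produced results, e.g. booleans indicating whether an
    object was detected; deviating minority results are discarded."""
    if len(results) < 3 or len(results) % 2 == 0:
        raise ValueError("expected an odd number of results, at least 3")
    value, _count = Counter(results).most_common(1)[0]
    # With n odd, a binary evaluation always has a strict majority.
    return value

# Two devices report an object, the third does not: the deviating
# result is ignored and the object counts as detected.
print(majority_result([True, True, False]))  # True
```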
  • In one embodiment, a number of environment detection devices is determined which are to carry out the evaluation of the respective environment data, the environment detection devices corresponding to the determined number then evaluating the respective environment data.
  • The determination includes, for example, an estimation of the number.
  • In particular, a separate number of environment detection devices is determined for different areas of the infrastructure.
  • The number can thus be efficiently adapted to the current concrete situation.
  • Changing the number in the sense of this description includes, for example, increasing or decreasing it.
  • In one embodiment, the respective evaluations are analyzed over time, the determined number being changed after the start of the monitoring depending on the analyses.
  • Another embodiment provides that, based on the respective environment data, a degree of complexity is determined which indicates the complexity of a scene corresponding to the recorded environment, the determined number being changed after the start of monitoring depending on the determined degree of complexity.
  • For example, the degree of complexity is determined based on one or more complexity parameters selected from the following group, a complexity parameter in particular being an estimated complexity parameter: number of motor vehicles currently in the infrastructure, number of motor vehicles currently driving at least partially automatically in the infrastructure, number of road users other than motor vehicles currently in the infrastructure, environmental parameters (in particular weather data, light data, visibility data), performance parameters of an environment sensor, current safety requirements relating to the safe, at least partially automated driving of a motor vehicle, and current safety requirements relating to the secure processing of the environment data.
  • Complexity parameters can, for example, be estimated and/or predicted, in part or in full.
  • An estimate or a prediction is based, for example, on historical complexity parameters.
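Determining a degree of complexity from such parameters might be sketched as follows; the specific weighting is an assumption for illustration, since the application does not specify a formula:

```python
def complexity_degree(num_vehicles, num_automated, num_other_road_users,
                      poor_visibility=False):
    """Illustrative scalar degree of complexity of a scene, combining a
    few of the complexity parameters named in the text (vehicle counts,
    other road users, an environmental parameter). Weights are assumed."""
    degree = num_vehicles + 2 * num_automated + 3 * num_other_road_users
    if poor_visibility:  # e.g. derived from weather or light data
        degree *= 2
    return degree

# A scene with pedestrians in poor visibility scores higher than the
# same scene in good conditions.
print(complexity_degree(5, 2, 3, poor_visibility=True))  # 36
```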
  • In one embodiment, at least one further function for the at least partially automated driving of a motor vehicle, in particular path planning, is carried out by means of at least one of the environment detection devices in addition to the evaluation of the environment data.
  • For example, several environment detection devices execute the at least one further function, the several environment detection devices checking each other with respect to the execution of the at least one further function.
  • In one embodiment, at least two environment sensors of two environment detection devices detect a common area of the infrastructure.
  • That is, the respective detection areas of two environment sensors of two environment detection devices can at least partially, in particular completely, overlap one another.
  • An evaluation of the environment data provides, for example, a result that indicates whether or not an object was detected in the correspondingly recorded environment.
  • In a common detection area, both environment sensors should detect the same object with a certain probability, provided that this object is in the overlap area.
  • One embodiment provides that two detection areas of two environment sensors of two environment detection devices essentially completely overlap, where "essentially" means in particular at least 90 %.
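Whether two detection areas "essentially completely" overlap in the sense above (at least 90 %) can be checked with a simple geometric model; the axis-aligned rectangle representation of a detection area is an illustrative assumption:

```python
def overlap_fraction(a, b):
    """Fraction of detection area `a` that is covered by detection area
    `b`, with detection areas modeled as axis-aligned rectangles given
    as (x0, y0, x1, y1)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # x-extent of overlap
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # y-extent of overlap
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    return (ix * iy) / area_a

a = (0.0, 0.0, 10.0, 10.0)
b = (0.5, 0.0, 10.5, 10.0)   # shifted by 0.5: 95 % of `a` is covered
print(overlap_fraction(a, b) >= 0.9)  # True: essentially complete overlap
```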
  • An environment sensor is, for example, one of the following: a radar sensor, a lidar sensor, an ultrasonic sensor, an infrared sensor, a magnetic field sensor or a video sensor, in particular the video sensor of a video camera.
  • The environment sensors of the environment detection devices are, for example, the same or different.
  • An environment detection device comprises, for example, several environment sensors.
  • The phrase "at least partially automated driving" includes one or more of the following cases: assisted driving, partially automated driving, highly automated driving, fully automated driving.
  • Assisted guidance means that a driver of the motor vehicle continuously performs either the lateral or the longitudinal guidance of the motor vehicle, while the respectively other driving task, i.e. controlling the longitudinal or lateral guidance of the motor vehicle, is performed automatically.
  • Partially automated guidance means that in a specific situation (for example: driving on a motorway, driving within a parking lot, overtaking an object, driving within a lane that is defined by lane markings) and/or for a certain period of time, the longitudinal and the lateral guidance of the motor vehicle are controlled automatically.
  • The driver of the motor vehicle does not have to manually control the longitudinal and lateral guidance of the motor vehicle himself.
  • However, the driver must permanently monitor the automatic control of the longitudinal and lateral guidance in order to be able to intervene manually if necessary; the driver must be ready to take over full control of the motor vehicle at all times.
  • Highly automated guidance means that for a certain period of time in a specific situation (for example: driving on a motorway, driving within a parking lot, overtaking an object, driving within a lane that is defined by lane markings) the longitudinal and the lateral guidance of the motor vehicle are controlled automatically.
  • The driver of the motor vehicle does not have to manually control the longitudinal and lateral guidance of the motor vehicle himself.
  • The driver does not have to constantly monitor the automatic control of the longitudinal and lateral guidance in order to be able to intervene manually if necessary.
  • If necessary, a takeover request is automatically issued to the driver to take over control of the longitudinal and lateral guidance, in particular with a sufficient time reserve.
  • The driver must therefore potentially be able to take over control of the longitudinal and lateral guidance.
  • The limits of the automatic control of the lateral and longitudinal guidance are automatically recognized. In the case of highly automated guidance, however, it is not possible to automatically bring about a low-risk state in every initial situation.
  • Fully automated guidance means that in a specific situation (for example: driving on a motorway, driving within a parking lot, overtaking an object, driving within a lane that is defined by lane markings) the longitudinal and the lateral guidance of the motor vehicle are controlled automatically.
  • The driver of the motor vehicle does not have to manually control the longitudinal and lateral guidance of the motor vehicle himself.
  • The driver does not have to monitor the automatic control of the longitudinal and lateral guidance in order to be able to intervene manually if necessary.
  • The driver is automatically prompted to take over the driving task (controlling the lateral and longitudinal guidance of the motor vehicle), in particular with a sufficient time reserve. If the driver does not take over the driving task, the system automatically returns to a low-risk state.
  • The limits of the automatic control of the lateral and longitudinal guidance are automatically recognized. In all situations, it is possible to automatically return to a low-risk system state.
  • The method according to the first aspect is, in particular, a computer-implemented method.
  • Fig. 1 is a flowchart of a method for monitoring an infrastructure.
  • FIG. 6 shows the infrastructure according to FIG. 4 divided into several areas.
  • The same reference symbols can be used for the same features.
  • Fig. 1 shows a flowchart of a method for monitoring an infrastructure using at least two environment detection devices which are arranged within the infrastructure and each comprise an environment sensor.
  • The following description of the flowchart according to FIG. 1 is based, by way of example, on two environment detection devices. It is explicitly pointed out that more than two environment detection devices can be provided in exemplary embodiments that are not shown.
  • The environment sensor of the first environment detection device detects its environment, so that, according to a step 103, environment data based on the detection are determined.
  • In a step 105, it is provided that the environment sensor of the second environment detection device detects its environment, so that, in a step 107, environment data based on the detection are determined.
  • The respective environment data of the environment sensors are evaluated, according to a step 109, by means of at least one of the two environment detection devices.
  • For example, the respective environment data are evaluated by both environment detection devices.
  • A result corresponding to the evaluation is output by means of the corresponding environment detection device.
  • A result of an evaluation indicates, for example, whether or not an object was detected in the corresponding detection area of the environment sensor.
  • For example, the environment sensors of the two environment detection devices cover a common area.
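The flowchart steps can be sketched as follows; the sensor data values and the "object detected" evaluation are placeholders for whatever the real sensors and evaluation deliver:

```python
def evaluate(environment_data):
    """Step 109: evaluate environment data; here the result simply
    indicates whether an object was detected (placeholder evaluation)."""
    return any(environment_data)

# The first environment sensor detects its environment and environment
# data are determined (step 103; placeholder values).
environment_data_1 = [0, 1, 0]
# Steps 105/107: the same for the second environment sensor.
environment_data_2 = [1, 0, 0]

# Step 109: here both devices evaluate both sets of environment data
# and each outputs a result corresponding to its evaluation.
result_device_1 = (evaluate(environment_data_1), evaluate(environment_data_2))
result_device_2 = (evaluate(environment_data_1), evaluate(environment_data_2))
print(result_device_1 == result_device_2)  # True: the devices agree
```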
  • FIG. 2 shows an environment detection device 201.
  • The environment detection device 201 comprises an environment sensor 203. In an embodiment not shown, a plurality of environment sensors 203 are provided.
  • The environment detection device 201 further comprises an input 205 which is set up to receive respective environment data from the environment sensors of one or more further environment detection devices.
  • The environment detection device 201 further comprises an evaluation unit 207 which is set up to evaluate the received environment data and the environment data from its own environment sensor 203.
  • The environment detection device 201 further comprises an output 209 which is set up to output result signals which represent a result corresponding to the evaluation.
  • A result of the evaluation indicates, for example, whether or not an object was detected based on the environment data.
  • The evaluation unit 207 comprises, for example, one or more processors.
  • FIG. 3 shows a machine-readable storage medium 301.
  • A computer program 303 is stored on the machine-readable storage medium 301.
  • The computer program 303 comprises commands which, when the computer program 303 is executed by a computer, for example by the environment detection device 201, in particular by the evaluation unit 207, cause the computer to execute a method according to the first aspect.
  • FIG. 4 shows an infrastructure 401.
  • The infrastructure 401 is, for example, a parking lot.
  • A first environment detection device 403, a second environment detection device 405, a third environment detection device 407, a fourth environment detection device 409 and a fifth environment detection device 411 are arranged on a ceiling 413 of the infrastructure 401.
  • The environment detection devices each include an evaluation unit 415, which is set up to evaluate environment data from its own environment sensor and from the environment sensors of the other environment detection devices.
  • That is, each of the environment detection devices can evaluate the environment data of its own environment sensor and also the environment data of the environment sensors of the other environment detection devices.
  • For example, the environment detection devices according to FIG. 4 are designed like the environment detection device 201 of FIG. 2.
  • A motor vehicle 417 and a person 419 are located within the infrastructure 401.
  • The individual environment sensors of the environment detection devices detect these two objects; in the case of overlapping detection areas, the corresponding environment sensors should detect both objects 417, 419. This can then, for example, be a result of the corresponding evaluation.
  • Fig. 5 shows the first environment detection device 403 and the second environment detection device 405 with their respective detection areas.
  • A detection area of the environment sensor of the first environment detection device 403 is identified by the reference symbol 501.
  • A detection area of the environment sensor of the second environment detection device 405 is identified by the reference symbol 503.
  • Both environment detection devices 403, 405 detect a common area 505 of the infrastructure 401.
  • In this way, the two environment detection devices 403, 405 can check each other.
  • For this purpose, the two environment detection devices 403, 405 mutually transmit their corresponding environment data, so that the first environment detection device 403 evaluates the environment data of the second environment detection device 405 in addition to its own environment data, and vice versa.
  • Both environment detection devices should come to the same conclusion. If the two results differ from one another, it can be determined, for example, that an error has occurred. This can then lead, for example, to at least partially automated motor vehicles within the infrastructure 401 being stopped for safety reasons; for example, the motor vehicle 417 according to FIG. 4 is then stopped.
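The mutual check and the safety reaction on deviating results can be sketched as follows; the callback interface for stopping vehicles is an illustrative assumption:

```python
def cross_check(result_a, result_b, stop_vehicles):
    """Mutual check of two environment detection devices covering a
    common area: if their evaluation results deviate from one another,
    an error is determined and, for safety reasons, at least partially
    automated motor vehicles within the infrastructure are stopped."""
    if result_a != result_b:
        stop_vehicles()
        return False  # error: the evaluations deviate
    return True       # the evaluations agree

stopped = []
agreed = cross_check(True, False, stop_vehicles=lambda: stopped.append(417))
print(agreed, stopped)  # False [417]
```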
  • These explanations relate to two environment detection devices which monitor or detect a common area.
  • If three environment detection devices monitor or detect a common area, they can likewise check each other with regard to their evaluation.
  • The following case can arise, for example: based on their evaluation, two of the environment detection devices come to the result that an object is present within the common area, while the third environment detection device arrives at the opposite result. Following the majority, it is then determined that an object is present in the common area.
  • In general, n is an odd number greater than or equal to three.
  • FIG. 6 shows the infrastructure 401 according to FIG. 4 divided into a first area 601, a second area 603 and an n-th area 605, where n is a natural number greater than or equal to two.
  • Each of these areas is monitored by means of a plurality of environment detection devices; for the sake of simplicity, a common reference numeral 609 is used for these environment detection devices.
  • The environment detection devices 609 each include one or more environment sensors, which are not shown for the sake of clarity.
  • The environment detection devices 609 are designed analogously to the environment detection devices 403 to 411 according to FIG. 4 or to the environment detection device 201 according to FIG. 2.
  • For the first area 601, it is provided that a fixed number of environment detection devices 609 take over the evaluation of the environment data from those environment sensors which detect the first area 601.
  • For the second area 603, it is provided that a variable number of environment detection devices take over the evaluation of the environment data from those environment sensors which detect the second area 603.
  • This variable number can, for example, be three at a first point in time and five at a later point in time.
  • For the n-th area 605, a fixed number of environment detection devices can in turn be provided, which take over the evaluation of the environment data from those environment sensors which monitor the n-th area 605.
  • For example, the fixed number is five.
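The fixed and variable per-area numbers from the example of FIG. 6 could be configured roughly as follows; the dictionary layout, the fixed number for area 601 and the scaling rule for the variable case are illustrative assumptions:

```python
area_config = {
    601: {"mode": "fixed", "number": 3},   # assumed fixed number
    603: {"mode": "variable"},             # e.g. three now, five later
    605: {"mode": "fixed", "number": 5},   # fixed number five, as in the text
}

def number_of_devices(area, current_complexity=0):
    """Number of environment detection devices that evaluate the
    environment data for the given area."""
    cfg = area_config[area]
    if cfg["mode"] == "fixed":
        return cfg["number"]
    # Variable mode: scale with a (placeholder) complexity value, with
    # at least two devices so that mutual checking remains possible.
    return max(2, 3 + current_complexity)

print(number_of_devices(605))                        # 5
print(number_of_devices(603, current_complexity=2))  # 5
```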
  • In one embodiment, the number of environment detection devices which carry out the evaluation of the respective environment data is defined in advance. Defined in advance means, for example, that the number has already been determined before monitoring begins.
  • For example, the number is estimated or calculated in advance.
  • In general, a number of environment detection devices which check each other is determined for each area.
  • The respective number per area can be different.
  • The number depends, for example, on a result of the last evaluation or on results of the last evaluations.
  • For example, not only the result of the last evaluation but also a trend of the last evaluations is taken into account when the number is changed.
  • Alternatively or additionally, the complexity of the scene is calculated separately and the number of computing units is determined based on the result. This can be done with the existing sensors/computing units or with additional sensors/computing units. The complexity can be determined or calculated individually for all areas, or on the basis of individual areas, several areas or all areas.
  • The complexity can be determined, among other things, using complexity parameters. For example, the following complexity parameters can be defined:
  • ambient conditions, e.g. weather or lighting conditions (in poor visibility, for example, more may have to be calculated),
  • performance of the infrastructure, e.g. of the environment detection devices (this takes into account, for example, how many parallel calculations an environment detection device can carry out),
  • estimated data (predicted from historical data, e.g. for a certain day of the week, such as Monday mornings, or for vacation time, ...).
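Prediction of a complexity parameter from historical data (e.g. the Monday-morning case mentioned above) might look like this; the data layout and the simple averaging are assumptions for illustration, since the text only says that predictions are based on historical complexity parameters:

```python
from statistics import mean

def predict_complexity(history, weekday, hour):
    """Predict a complexity parameter (e.g. the number of vehicles in
    the infrastructure) for a given weekday and hour by averaging
    historical observations of that same weekday and hour."""
    samples = [value for day, h, value in history
               if day == weekday and h == hour]
    return mean(samples) if samples else 0

history = [
    ("Mon", 8, 40), ("Mon", 8, 50), ("Mon", 8, 60),  # busy Monday mornings
    ("Sun", 8, 5),                                   # quiet Sunday morning
]
print(predict_complexity(history, "Mon", 8))  # 50
```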

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for monitoring an infrastructure using at least two environment detection devices which are arranged within the infrastructure and each comprise an environment sensor, the environment sensors each detecting their environment in order to determine environment data based on the detection, the respective environment data of the environment sensors being evaluated by means of at least one environment detection device. The invention further relates to an environment detection device, a computer program and a machine-readable storage medium.
PCT/EP2020/060141 2019-05-20 2020-04-09 Procédé de surveillance d'une infrastructure WO2020233902A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20719966.2A EP3973443A1 (fr) 2019-05-20 2020-04-09 Procédé de surveillance d'une infrastructure
CN202080037914.9A CN113874871A (zh) 2019-05-20 2020-04-09 用于监控基础设施的方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019207344.1A DE102019207344A1 (de) 2019-05-20 2019-05-20 Verfahren zum Überwachen einer Infrastruktur
DE102019207344.1 2019-05-20

Publications (1)

Publication Number Publication Date
WO2020233902A1 true WO2020233902A1 (fr) 2020-11-26

Family

ID=70295099

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/060141 WO2020233902A1 (fr) 2019-05-20 2020-04-09 Procédé de surveillance d'une infrastructure

Country Status (4)

Country Link
EP (1) EP3973443A1 (fr)
CN (1) CN113874871A (fr)
DE (1) DE102019207344A1 (fr)
WO (1) WO2020233902A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020213661A1 (de) 2020-10-30 2022-05-05 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Analysieren eines Umfelds eines Kraftfahrzeugs

Citations (3)

Publication number Priority date Publication date Assignee Title
DE102015222934A1 (de) * 2015-11-20 2017-05-24 Robert Bosch Gmbh Steuern eines Kraftfahrzeugs
DE102016212195A1 (de) * 2016-07-05 2018-01-11 Robert Bosch Gmbh Verfahren zum Durchführen eines automatischen Eingriffs in die Fahrzeugführung eines Fahrzeugs
DE102016223185A1 (de) * 2016-11-23 2018-05-24 Robert Bosch Gmbh Verfahren und System zum Detektieren eines sich innerhalb eines Parkplatzes befindenden erhabenen Objekts

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
EP2858039B1 (fr) * 2013-07-25 2017-11-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Procédé de contrôle automatique de l'arrivée d'un véhicule routier dans une section de route contrôlée, système de contrôle et son système côté véhicule et programme informatique
DE102016202862A1 (de) * 2016-02-24 2017-08-24 Robert Bosch Gmbh System und Verfahren zum Erfassen eines Belegungszustands einer abgegrenzten Parkfläche eines Parkplatzes und Parkplatz


Also Published As

Publication number Publication date
DE102019207344A1 (de) 2020-11-26
EP3973443A1 (fr) 2022-03-30
CN113874871A (zh) 2021-12-31


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20719966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020719966

Country of ref document: EP

Effective date: 20211220