EP4251485A1 - Method and device for determining a virtual boundary line between two traffic lanes - Google Patents

Method and device for determining a virtual boundary line between two traffic lanes

Info

Publication number
EP4251485A1
Authority
EP
European Patent Office
Prior art keywords
lane
line
information
ego
probability
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21810655.7A
Other languages
English (en)
French (fr)
Inventor
Badreddine ABOULISSANE
Ismail Abouessire
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Stellantis Auto SAS
Original Assignee
PSA Automobiles SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PSA Automobiles SA filed Critical PSA Automobiles SA
Publication of EP4251485A1

Links

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10 Path keeping
    • B60W30/12 Lane keeping
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04 Traffic conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0027 Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404 Characteristics
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 Input parameters relating to data
    • B60W2556/20 Data confidence level
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 Input parameters relating to data
    • B60W2556/45 External transmission of data to or from the vehicle
    • B60W2556/50 External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Definitions

  • the invention is in the field of autonomous vehicle driving assistance systems.
  • the invention relates to a method and a device for determining a virtual boundary line between a traffic lane, called ego-lane, and a traffic lane adjacent to the ego-lane, called adjacent lane, for regulating a position of an autonomous vehicle traveling on the ego-lane.
  • Vehicle means any type of vehicle such as a motor vehicle, moped, motorcycle, warehouse storage robot, etc.
  • Autonomous driving of an “autonomous vehicle” means any process capable of assisting the driving of the vehicle. The method can thus consist in partially or totally steering the vehicle or providing any type of assistance to a natural person driving the vehicle. The process thus covers all autonomous driving, from level 0 to level 5 of the OICA scale (Organisation Internationale des Constructeurs d'Automobiles).
  • ADAS Advanced Driver Assistance Systems
  • LPA Lane Positioning Assist
  • these systems include on-board cameras capturing images of the vehicle's environment as well as sensors of the RADAR, LIDAR, ultrasound type, etc., capturing data from the vehicle's environment.
  • Processing of the captured images and data first detects objects (such as, for example, a vehicle, a truck, a cyclist, a pedestrian, an obstacle, etc.) present in the vehicle's environment (in a vicinity of the vehicle). Next, these detected objects are identified. Each object is tracked over time in order to assign it a probability of existence, to classify it (for example, assign a vehicle class such as car, truck, bus, motorcycle, etc., or a non-vehicle class such as pedestrian, animal, sign, etc.), assign a probability of belonging to a given class, and determine a state of motion (in particular, determine whether the object is stationary/static, moving in the same direction as the vehicle, or moving in the opposite direction to the vehicle).
  • the position of each object (in particular relative to the vehicle) is also determined.
  • this determination identifies the traffic lane on which each detected object is traveling.
  • this determination is a function of the lateral deviation of the detected object with respect to the vehicle, and also of further processing of the captured images and data. In case of doubt, the probability of existence of the object is reduced.
  • This further processing of captured images and data identifies and recognizes lane markings and other lane delineations, and thereby identifies traffic lanes.
  • ground markings are essential. Indeed, the width of traffic lanes generally depends on the country, the roads and their maximum speeds. Regulation of the position of the vehicle then takes place with respect to the center line between the line representing a marking to the left of the traffic lane and a line representing a marking to the right of said traffic lane.
  • when ground markings are worn or absent, ADAS systems regulating the position of an autonomous vehicle are no longer capable of automatically positioning the vehicle and must abruptly return control of the vehicle's operation to the driver. This situation is particularly stressful, the driver being warned only at the last moment. Furthermore, the availability of the ADAS system is reduced, which harms the brand image.
  • An object of the present invention is to remedy the aforementioned problem, in particular by virtually reconstructing a dividing line between two lanes, so that the operation of ADAS systems capable of regulating a position of an autonomous vehicle is more robust and more consistent.
  • a first aspect of the invention relates to a method for determining a virtual boundary line between a traffic lane, called ego-lane, and a traffic lane adjacent to the ego-lane, called adjacent lane, to regulate a position of an autonomous vehicle traveling on the ego-lane, said method comprising the steps of: receiving information from at least one first detected object traveling on the ego-lane and from at least one second detected object traveling on the adjacent lane, the information characterizing a probability of existence of the object and a geolocation point of the object in the path traveled; constructing, for each object, a line representing the path traveled by the object from its geolocation points; and determining, from the constructed lines, a virtual line delimiting the ego-lane from the adjacent lane.
  • the determination of the virtual line is based on the path traveled by the first objects traveling on the ego-lane and on the path traveled by the second objects traveling on the adjacent lane, thus forming two lines.
  • This virtual line can be a median line between the two lines, then naturally representing a separation between the ego-lane and the adjacent lane.
  • the determination of the virtual line is robust and reliable with respect to measurement uncertainties. This determination takes into account a plurality of geolocation points (about twenty, for example) per detected object, and each object is detected reliably thanks to a probability of existence greater than an existence threshold (for example 70%).
  • the information received for the first and for the second object further comprises a class of membership of the object, a probability of membership of the class, and/or a state of motion, and the step of constructing the line for each object is carried out only if the class of membership belongs to a list of predetermined classes, if the probability of membership of the class is greater than a predetermined membership threshold, and/or if the motion state indicates that the object is moving.
  • the determination of the virtual line is even more robust and more reliable with respect to measurement uncertainties.
  • the construction of the line for each object is based on a parametric fitting of a polynomial, the polynomial modeling the path traveled by the object.
  • when information from another object traveling on the ego-lane is received, the information received is combined with the information of the first object to complete the information of said first object.
  • when information from another object traveling on the adjacent lane is received, the information received is combined with the information of the second object to complete the information of said second object.
  • each piece of information received is timestamped, and only the data received over a predetermined time interval are used in the construction of a line for each object from the geolocation points.
  • a global, continuous and periodic processing is carried out determining a virtual line over a given time horizon, for example over 3 seconds.
  • a second aspect of the invention relates to a device comprising a memory associated with at least one processor configured to implement the method according to the first aspect of the invention.
  • the invention also relates to a vehicle comprising the device.
  • the invention also relates to a computer program comprising instructions adapted for the execution of the steps of the method, according to the first aspect of the invention, when said program is executed by at least one processor.
  • FIG. 1 schematically illustrates a device, according to a particular embodiment of the present invention.
  • FIG. 2 schematically illustrates a method for determining a virtual line, according to a particular embodiment of the present invention.
  • the invention is described below in its non-limiting application to the case of an autonomous motor vehicle traveling on a road or on a traffic lane.
  • Other applications such as a robot in a storage warehouse or a motorcycle on a country road are also possible.
  • FIG. 1 represents an example of a device 101 included in the vehicle, in a network (“cloud”) or in a server.
  • This device 101 can be used as a centralized device in charge of at least certain steps of the method described below with reference to FIG. 2. In one embodiment, it corresponds to an autonomous driving computer.
  • the device 101 is included in the vehicle.
  • This device 101 can take the form of a box comprising printed circuits, of any type of computer or else of a mobile telephone (“smartphone”).
  • the device 101 comprises a random access memory 102 for storing instructions for the implementation by a processor 103 of at least one step of the method as described above.
  • the device also comprises a mass memory 104 for storing data intended to be kept after the implementation of the method.
  • the device 101 may also include a digital signal processor (DSP) 105.
  • This DSP 105 receives data and shapes, demodulates and amplifies this data in a manner known per se.
  • Device 101 also includes an input interface 106 for receiving data implemented by the method according to the invention and an output interface 107 for transmitting data implemented by the method according to the invention.
  • FIG. 2 schematically illustrates a method for determining a virtual boundary line between a traffic lane, called ego-lane, and a traffic lane adjacent to the ego-lane, called adjacent lane, to regulate a position of an autonomous vehicle traveling on the ego-lane, according to a particular embodiment of the present invention.
  • Step 201 is a step where the device 101, for example, receives information from at least one first detected object traveling on the ego-lane, the information characterizing a probability of existence of the first object and a geolocation point of the first object in the path traveled.
  • processing of images acquired by a camera and processing of data from sensors of the RADAR, LIDAR, ultrasound type, etc. have detected an object present in the ego-lane.
  • a geolocation point of the first object in the path traveled is determined and transmitted to the device 101.
  • the geolocation point of a detected object represents, for example, the projection onto the path of the center of gravity of the image of a detected vehicle.
  • the geolocation point represents the lateral and longitudinal distance from the autonomous vehicle.
  • a probability of existence of the first object is associated with the geolocation point. It represents the plausibility of the first detected object. For example, this probability takes a value between 0 and 1 and is provided by a conventional perception module.
  • the device 101 also receives a class of membership of the first object and a probability of class membership. For example, the detected object is first classified as mobile or not, this class identifying the objects which move. Then, if necessary, the detected object is classified according to a subclass representing the number of wheels (0 wheels, 2 wheels, 4 wheels, etc.).
  • the detected object is classified according to a subclass representing a personal vehicle, a truck, a bus, etc.
  • This classification is carried out by a module external to the invention conventionally present in a perception module.
  • a class probability is associated with this classification.
  • device 101 receives a motion state that identifies whether the object is currently moving in the direction of travel of the autonomous vehicle, moving in the opposite direction to the autonomous vehicle, or currently static.
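  • By way of illustration only, the information received in steps 201 and 202 can be grouped per detected object as in the following minimal Python sketch; the type and field names are assumptions chosen for readability, not taken from the patent.

    from dataclasses import dataclass, field
    from enum import Enum, auto
    from typing import List, Tuple

    class MotionState(Enum):
        SAME_DIRECTION = auto()      # moving in the direction of travel of the autonomous vehicle
        OPPOSITE_DIRECTION = auto()  # moving in the opposite direction
        STATIC = auto()              # currently not moving

    @dataclass
    class DetectedObject:
        existence_probability: float  # plausibility of the detection, a value between 0 and 1
        object_class: str             # e.g. "car", "truck", "bus", "pedestrian"
        class_probability: float      # probability of belonging to object_class
        motion_state: MotionState
        # geolocation points: (lateral, longitudinal) distances from the
        # autonomous vehicle, each with a timestamp in seconds
        points: List[Tuple[float, float, float]] = field(default_factory=list)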
  • when information from another object traveling on the ego-lane is received, the information received is combined with the information of the first object to complete the information of said first object.
  • several objects traveling on the ego-lane are detected. The information concerning these other objects completes the data of the first object, thus increasing the amount of information received.
  • each item of information received is timestamped. This makes it possible to use data over a predetermined time interval such as 3 to 5 seconds, other values being possible.
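  • A possible realization of this time-window filtering, as a sketch under the assumption that each geolocation point carries a Unix timestamp as its third element (as in the DetectedObject sketch above):

    import time

    def points_in_window(points, window_s=3.0, now=None):
        """Keep only the geolocation points received during the last window_s seconds."""
        now = time.time() if now is None else now
        return [p for p in points if now - p[2] <= window_s]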
  • the data is stored in memory 102.
  • Step 202 is a step where the device 101, for example, receives information from at least one second detected object traveling on the adjacent lane, the information characterizing a probability of existence of the second object and a geolocation point of the second object in the path traveled. This step is similar to step 201, but only concerns objects traveling on the adjacent lane.
  • Step 203 "Test 1?" is a step that tests whether the information received during step 201 meets certain criteria. For example, to take into account the data relating to the geolocation points in a following step 205 described below, the probability of existence of an object must be greater than a predetermined existence threshold.
  • the existence threshold has a value of 0.7 (i.e. 70%), but can take any other value. In an operating mode, the existence threshold varies according to the use case, such as the type of road on which the autonomous vehicle travels.
  • a minimum number of geolocation points must be received. In an operating mode, this number is 20, but it can take any other value and varies according to the use case. For example, if only one first object is detected, the minimum number is 15. If another object traveling on the ego-lane is detected, the overall minimum number (first object and other object) is 20.
  • the detected object must belong to a class of vehicle with at least 4 wheels and the probability of belonging to this class must be greater than 70% (or another value that may vary depending on the use case).
  • the motion state of the detected object must indicate that the object is in motion. Indeed, in the next step 205, a line representing the path traveled by the object is constructed; if the object is stationary, the line cannot be constructed. In an operating mode, the state of motion also includes a direction of travel. For example, objects traveling on the ego-lane must travel in the same direction as the autonomous vehicle.
  • the different examples constitute different criteria for accepting geolocation points. If a combination of these criteria is not met, then there is not enough valid data to move on to the next step 205, and the method returns to step 201 in order to receive new information.
  • Step 204 "Test2?" is a step that tests whether the information received during step 202 meets certain criteria.
  • the probability of existence of an object must be greater than a predetermined existence threshold.
  • the existence threshold has a value of 0.7 (i.e. 70%), but can take any other value.
  • the existence threshold varies according to the use case, such as the type of road on which the autonomous vehicle travels.
  • a minimum number of geolocation points must be received. In an operating mode, this number is 20, but it can take any other value and varies according to the use case. For example, if only one second object is detected, the minimum number is 10. This number is not necessarily the same as that of step 203 because, in certain use cases, the vehicles traveling on the adjacent lane travel in the opposite direction with respect to the direction of travel of the autonomous vehicle. If another object traveling in the adjacent lane is detected, the overall minimum number (second object and other object traveling in the adjacent lane) is 15. The numbers indicated are examples and can take other values depending on the use case.
  • the detected object must belong to a class of vehicle with at least 4 wheels and the probability of belonging to this class must be greater than 70% (or another value that may vary depending on the use case).
  • the motion state of the detected object must indicate that the object is in motion. Indeed, in the next step 206, a line representing the path traveled by the object is constructed. If the object is stationary, the line cannot be constructed.
  • the state of motion also includes a direction of travel. For example, objects traveling on the adjacent lane must all travel in the same direction (the direction of the autonomous vehicle, or the opposite direction) in order to be taken into consideration. In the latter case, the information from objects performing a prolonged overtaking maneuver is not taken into account.
  • the different examples constitute different criteria for accepting geolocation points. If a combination of these criteria is not met, then there is not enough valid data to move on to the next step 206, and the method returns to step 202 in order to receive new information.
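  • The acceptance criteria of steps 203 and 204 can be gathered into a single parameterized check, as in the hedged sketch below, which reuses the DetectedObject and MotionState sketch above; the thresholds (0.7, i.e. 70%, and the minimum point counts) are the example values from the text and may vary with the use case.

    def passes_test(obj, min_points, existence_threshold=0.7,
                    class_threshold=0.7,
                    allowed_classes=("car", "truck", "bus"),
                    required_direction=None):
        """Return True if the object's information may feed the line construction."""
        if obj.existence_probability <= existence_threshold:
            return False
        if obj.object_class not in allowed_classes:
            return False
        if obj.class_probability <= class_threshold:
            return False
        if obj.motion_state is MotionState.STATIC:
            return False  # a stationary object traces no path, so no line can be built
        if required_direction is not None and obj.motion_state is not required_direction:
            return False
        return len(obj.points) >= min_points

    # Test 1 (step 203): ego-lane objects must move in the vehicle's direction.
    # test1_ok = passes_test(first_object, min_points=20,
    #                        required_direction=MotionState.SAME_DIRECTION)
    # Test 2 (step 204): adjacent-lane objects, possibly in the opposite direction.
    # test2_ok = passes_test(second_object, min_points=10)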
  • Step 205 is a step for constructing a line for the first object from the geolocation points which have been retained after step 203, the line representing the path traveled by the object.
  • the geolocation points also include the geolocation points of another object traveling on the ego-lane.
  • the line thus constructed represents the path traveled by a fictitious vehicle comprising all the geolocation points.
  • thanks to step 203, the geolocation points are reliable and are representative of a center line of the ego-lane.
  • the construction of the line for each object is based on a parametric fitting of a polynomial, the polynomial modeling the path traveled by the object. For example, on a straight road, a conventional linear regression method determines the best straight line passing through all the geolocation points retained after step 203.
  • the order of the polynomial is greater than 1, for example 4, in order to take into account the curvature of the road.
  • the order of the polynomial is to be chosen to meet the compromise between precision and speed of calculation.
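  • A minimal sketch of this parametric fitting, assuming the geolocation points are the (lateral, longitudinal, timestamp) tuples above and using an ordinary least-squares polynomial fit (numpy.polyfit):

    import numpy as np

    def fit_path_line(points, order=4):
        """Fit lateral = p(longitudinal) over the retained geolocation points.

        order=1 reduces to the linear-regression case for a straight road;
        order=4 is the example value from the text, high enough to follow
        the curvature of the road.
        """
        x = np.array([p[1] for p in points])  # longitudinal distances
        y = np.array([p[0] for p in points])  # lateral distances
        return np.polyfit(x, y, deg=order)    # coefficients, highest degree first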
  • Step 206 is a step for constructing a line for the second object from the geolocation points which have been retained after step 204, the line representing the path traveled by the object.
  • the geolocation points also include the geolocation points of another object traveling on the adjacent lane. The line thus constructed represents the path traveled by a fictitious vehicle comprising all the geolocation points.
  • thanks to step 204, the geolocation points are reliable and are representative of a center line of the adjacent lane.
  • the construction of the line for each object is based on a parametric fitting of a polynomial, the polynomial modeling the path traveled by the object. For example, on a straight road, a conventional linear regression method determines the best straight line passing through all the geolocation points retained after step 204.
  • the order of the polynomial is greater than 1, for example 4, in order to take into account the curvature of the road.
  • the order of the polynomial is to be chosen to meet the compromise between precision and speed of calculation.
  • Step 207 "VirtLi” is the step of determining a virtual line delimiting the ego-path of the adjacent path from the line constructed for the first object, line resulting from step 205, and from from the line constructed for the second object, line resulting from step 206.
  • the lines resulting from step 205 and from step 206 respectively represent the path traveled by a fictitious vehicle on the ego-lane and on the adjacent lane.
  • the virtual line is a median line between the line constructed from the first object and the line constructed from the second object, thus clearly indicating the separation between the ego-lane and the adjacent lane.
  • the method receives information on a width of the ego-lane, and the construction of the virtual line is based on a half-width of the ego-lane and on the line constructed for the first object or on the line constructed for the second object.
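  • Both constructions admit a short sketch, given as illustrative assumptions rather than the patent's literal formulation. For the median line, averaging the two coefficient sets yields, at every longitudinal position, the point halfway between the two fitted lines (assuming both polynomials were fitted with the same order); for the half-width variant, the offset is approximated here as a purely lateral shift of the constant term.

    import numpy as np

    def median_virtual_line(ego_coeffs, adjacent_coeffs):
        """Virtual boundary as the median of the two fitted path lines."""
        return 0.5 * (np.asarray(ego_coeffs) + np.asarray(adjacent_coeffs))

    def offset_virtual_line(line_coeffs, ego_lane_width, toward_adjacent=1.0):
        """Virtual boundary from one fitted line and a half ego-lane width.

        Set toward_adjacent to +1.0 or -1.0 depending on which side the
        adjacent lane lies; the shift only touches the constant term.
        """
        coeffs = np.asarray(line_coeffs, dtype=float).copy()
        coeffs[-1] += toward_adjacent * ego_lane_width / 2.0
        return coeffs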
  • the constructed virtual line replaces a line representing the detection of the ground marking delimiting the ego-lane from the adjacent lane, in order to regulate the position of the autonomous vehicle by a driving assistance system.
  • the present invention is not limited to the embodiments described above by way of examples; it extends to other variants.
  • an exemplary embodiment has been described above in which the determination of the virtual line is carried out after separate processing of the information received concerning the ego-lane and of the information received concerning the adjacent lane.
  • the virtual line is built from all the information received which respects a combination of the stated criteria (existence, number of points, belonging to a class, state of motion, etc.).
  • the determination of the virtual line can use techniques of the SVM type ("Support Vector Machine"), which are capable of solving discrimination problems at the cost of more complex calculations.
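  • As a hedged illustration of that variant (the patent names only the SVM technique; the scikit-learn API and the polynomial kernel below are assumptions), the retained ego-lane points can be labeled 0 and the adjacent-lane points 1, and the classifier's decision boundary taken as the virtual line:

    import numpy as np
    from sklearn import svm

    def svm_virtual_boundary(ego_points, adjacent_points):
        # Features: (longitudinal, lateral) position of each retained point.
        X = np.array([(p[1], p[0]) for p in ego_points + adjacent_points])
        y = np.array([0] * len(ego_points) + [1] * len(adjacent_points))
        clf = svm.SVC(kernel="poly", degree=4)  # polynomial kernel to follow road curvature
        clf.fit(X, y)
        return clf  # the set where clf.decision_function(...) == 0 traces the virtual line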
  • the information received comes from a processing of images and data from sensors.
  • all or part of this information comes from a transmission ensured by a radiofrequency link, such as vehicle-to-vehicle (V2V) communication or communication from the vehicle to any other communicating object (V2X).

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
EP21810655.7A 2020-11-25 2021-10-20 Verfahren und vorrichtung zum ermitteln einer virtuellen grenzlinie zwischen zwei fahrspuren Pending EP4251485A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2012108A FR3116501B1 (fr) 2020-11-25 2020-11-25 Procédé et dispositif de détermination d’une ligne virtuelle de délimitation entre deux voies de circulation.
PCT/FR2021/051825 WO2022112672A1 (fr) 2020-11-25 2021-10-20 Procédé et dispositif de détermination d'une ligne virtuelle de délimitation entre deux voies de circulation

Publications (1)

Publication Number Publication Date
EP4251485A1 true EP4251485A1 (de) 2023-10-04

Family

ID=75746703

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21810655.7A Pending EP4251485A1 (de) 2020-11-25 2021-10-20 Verfahren und vorrichtung zum ermitteln einer virtuellen grenzlinie zwischen zwei fahrspuren

Country Status (3)

Country Link
EP (1) EP4251485A1 (de)
FR (1) FR3116501B1 (de)
WO (1) WO2022112672A1 (de)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3668127A (en) 1968-07-01 1972-06-06 Ricoh Kk Liquid developer for electrophotography
KR101714185B1 (ko) * 2015-08-05 2017-03-22 엘지전자 주식회사 차량 운전 보조장치 및 이를 포함하는 차량
DE102016118497A1 (de) * 2016-09-29 2018-03-29 Valeo Schalter Und Sensoren Gmbh Ermittlung einer virtuellen Fahrspur für eine von einem Kraftfahrzeug befahrene Straße
FR3088280A1 (fr) * 2018-11-08 2020-05-15 Psa Automobiles Sa Construction par segmentation de voies virtuelles sur une chaussee
FR3089925B1 (fr) * 2018-12-13 2020-11-20 Psa Automobiles Sa Conduite autonome sécurisée dans le cas d’une détection d’un objet cible

Also Published As

Publication number Publication date
FR3116501B1 (fr) 2022-10-07
WO2022112672A1 (fr) 2022-06-02
FR3116501A1 (fr) 2022-05-27

Similar Documents

Publication Publication Date Title
EP3894295B1 (de) Sicheres autonomes fahren im fall der detektion eines zielobjekts
WO2020157407A1 (fr) Gestion via une vitesse équivalente d'une conduite autonome avec au moins deux objets cibles
FR3094318A1 (fr) Procédé de commande du positionnement d’un véhicule automobile sur une voie de circulation
EP3894294B1 (de) Konsolidierung eines indikators der anwesenheit eines zielobjekts für autonomes fahren
EP4251485A1 (de) Verfahren und vorrichtung zum ermitteln einer virtuellen grenzlinie zwischen zwei fahrspuren
WO2023052692A1 (fr) Procédé et dispositif de détection d'insertion dans une voie de circulation d'un véhicule.
EP3877228B1 (de) Konstruktion durch segmentierung von virtuellen fahrspuren auf einer fahrbahn
EP3924237A1 (de) Autonomes fahren basierend auf abstand und geschwindigkeit bestimmter zielobjekte
WO2020025260A1 (fr) Procédé de détermination d'un type d'emplacement de stationnement
FR3099961A1 (fr) Estimation de la vitesse moyenne d’un trafic d’au moins un vehicule sur un troncon de route
FR3080345A1 (fr) Amelioration de la detection par le suivi d’un vehicule eclaireur
FR3082044A1 (fr) Procede et dispositif de detection de la voie de circulation sur laquelle circule un vehicule, en fonction des delimitations determinees
FR3137780A1 (fr) Procédé et dispositif de détermination d’un tracé arrière d’au moins une délimitation latérale de voie de circulation
WO2023161571A1 (fr) Procédé et dispositif de contrôle de sélection d'un véhicule cible d'un système de régulation adaptative de vitesse d'un véhicule
FR3138098A1 (fr) Procédé et dispositif de détermination d’une vitesse de rotation en lacet d’un véhicule
WO2023233088A1 (fr) Procédé et dispositif de contrôle de système d'aide à la conduite d'un véhicule basé sur une limite de vitesse
FR3137781A1 (fr) Procédé et dispositif de détermination d’une largeur d’une voie latérale adjacente à une voie de circulation
EP4308880A1 (de) Verfahren und vorrichtung zur bestimmung der zuverlässigkeit einer karte mit niedriger auflösung
FR3135048A1 (fr) Procédé de suivi d’au moins une limite de bord de voie de circulation et véhicule automobile associé
FR3141911A1 (fr) Procédé et système de gestion de la vitesse longitudinale d’un véhicule
WO2022195182A1 (fr) Procede et dispositif de determination d'une fiabilite d'une cartographie basse definition
FR3080344A1 (fr) Detection fiabilisee d’un objet par un vehicule
WO2020174142A1 (fr) Assistance à la conduite d'un véhicule, par détermination fiable d'objets dans des images déformées
FR3098777A1 (fr) Procédé d’insertion dans un convoi de véhicules autonomes par un véhicule automobile
FR3143513A1 (fr) Procédé et dispositif de contrôle d’un système de régulation de vitesse adaptatif d’un véhicule autonome par distance latérale modulée.

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230504

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: STELLANTIS AUTO SAS

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20240703

RIN1 Information on inventor provided before grant (corrected)

Inventor name: ABOUESSIRE, ISMAIL

Inventor name: ABOULISSANE, BADREDDINE