WO2024012660A1 - Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle - Google Patents

Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle

Info

Publication number
WO2024012660A1
Authority
WO
WIPO (PCT)
Prior art keywords
autonomous vehicle
travelling path
along
sensor
predetermined
Prior art date
Application number
PCT/EP2022/069434
Other languages
English (en)
Inventor
Wilhelm WIBERG
Oskar Ljungqvist
Original Assignee
Volvo Autonomous Solutions AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volvo Autonomous Solutions AB filed Critical Volvo Autonomous Solutions AB
Priority to PCT/EP2022/069434 priority Critical patent/WO2024012660A1/fr
Publication of WO2024012660A1 publication Critical patent/WO2024012660A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097 Predicting future conditions
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0015 Planning or execution of driving tasks specially adapted for safety
    • B60W60/0018 Planning or execution of driving tasks specially adapted for safety by employing degraded modes, e.g. reducing speed, in response to suboptimal conditions
    • B60W2300/00 Indexing codes relating to the type of vehicle
    • B60W2300/12 Trucks; Load vehicles
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/40 Dynamic objects, e.g. animals, windblown objects
    • B60W2554/404 Characteristics
    • B60W2554/4048 Field of view, e.g. obstructed view or direction of gaze
    • B60W2720/00 Output or target parameters relating to overall vehicle dynamics
    • B60W2720/10 Longitudinal speed
    • B60W2720/103 Speed profile

Definitions

  • the invention relates to a computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle.
  • the invention also relates to a method for automatically guiding an autonomous vehicle along a predetermined travelling path.
  • the invention also relates to a control unit, an autonomous vehicle, a computer program and a computer readable medium.
  • the invention can be applied in heavy-duty vehicles, such as trucks, buses and construction equipment. Although the invention will be described with respect to a truck, the invention is not restricted to this particular vehicle, but may also be used in other vehicles such as dump trucks, wheel loaders, excavators, buses, marine vessels and passenger cars.
  • An autonomous vehicle may be defined as a vehicle which can at least partly be driven automatically without any direct human involvement. For example, at least one of steering, propulsion and braking may be automatically controlled during driving of the vehicle.
  • the vehicle may be a fully autonomous vehicle or a semi-autonomous vehicle.
  • a semi-autonomous vehicle may be defined as a vehicle which can switch between automatic driving control and manual driving control.
  • one or more autonomous vehicles may be used for performing transport missions in a confined area, such as a mining site, logistics centre or port.
  • the autonomous vehicle(s) may be configured to follow predetermined travelling paths. This simplifies the task of planning and makes the system more deterministic and less dynamic, something that is positive in a confined area.
  • the travelling paths may be determined offline, e.g. at a back-office centre, and thereafter submitted to the autonomous vehicle(s), preferably via a wireless communication link.
  • Each autonomous vehicle typically has one or more environment perception sensors which are used for detecting obstacles along the travelling paths.
  • the autonomous vehicle may brake and/or perform a steering manoeuvre for avoiding the detected obstacle.
  • the autonomous vehicle may initiate an emergency braking action when an obstacle is detected which interferes with the travelling path. Thereby, a collision can be avoided.
  • an object of the invention is to provide an improved method for determining a speed profile along a predetermined travelling path for an autonomous vehicle which alleviates at least one drawback of the prior art, or which at least provides a suitable alternative.
  • Another object of the invention is to provide an improved method for automatically guiding an autonomous vehicle along a predetermined travelling path which alleviates at least one drawback of the prior art, or which at least provides a suitable alternative.
  • Yet further objects of the invention are to provide an improved control unit, an autonomous vehicle, a computer program and/or a computer readable medium which alleviate at least one drawback of the prior art, or which at least provide suitable alternatives.
  • At least one of the objects is achieved by a method according to claim 1.
  • the autonomous vehicle comprises at least one environment perception sensor having a sensor field of view for covering a portion ahead of the autonomous vehicle during travelling along the predetermined travelling path.
  • the method comprises:
  • obtaining map information associated with the predetermined travelling path;
  • determining a sensor coverage along the predetermined travelling path, wherein the sensor coverage along the predetermined travelling path is determined based on the sensor field of view of the at least one environment perception sensor and the obtained map information; and
  • determining the speed profile by use of the determined sensor coverage along the predetermined travelling path so that the speed of the autonomous vehicle is decreased before at least one point along the predetermined travelling path where the sensor coverage is below a first sensor coverage threshold indicative of an insufficient sensor coverage caused by an occlusion at the at least one point, wherein the speed profile is determined before initiating driving of the autonomous vehicle along the predetermined travelling path.
  • an improved speed profile will be determined which takes map information and the sensor field of view into account in a proactive manner. More particularly, the present invention is based on a realization that a more optimized speed profile can be provided by proactively evaluating where an insufficient sensor coverage may appear along the predetermined travelling path.
  • a speed profile which is not proactively determined as disclosed herein would likely lead to an emergency braking action, or at least to an instant hard braking action, for the autonomous vehicle when it reaches a point where there is insufficient sensor coverage.
  • a more optimal speed profile can be provided. This implies increased efficiency, such as increased efficiency for autonomous vehicles operating in a confined area, reduced vehicle wear, more comfortable driving, increased safety, etc.
  • Map information as used herein shall be interpreted broadly.
  • the map information is adapted to be used by a computer, i.e. it is a digital map. It may be defined as a representation of an area, comprising for example information about obstacles in the area, altitude variations in the area and/or drivable road segments in the area.
  • the sensor field of view may be defined as the area in front of the sensor where the sensor can detect objects in the environment.
  • the sensor field of view may be defined by a maximum detection range, or detection distance, and a maximum detection angle.
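As a rough illustration of such a range-and-angle model, the sketch below checks whether a map point falls inside an (unoccluded) sensor field of view. It is a minimal planar sketch only; the class and function names and the symmetric half-angle are assumptions for illustration, not taken from the application.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorFov:
    """Simplified planar field of view: a maximum detection range and a maximum detection angle."""
    max_range_m: float         # maximum detection range/distance
    max_half_angle_rad: float  # half of the maximum detection angle, around the sensor axis

def point_in_fov(fov: SensorFov, sensor_xy, sensor_heading_rad, point_xy) -> bool:
    """True if point_xy lies within the field of view of a sensor located at sensor_xy and
    pointing along sensor_heading_rad. Occlusions are deliberately ignored here; they are
    what the sensor-coverage determination described in this document accounts for."""
    dx, dy = point_xy[0] - sensor_xy[0], point_xy[1] - sensor_xy[1]
    if math.hypot(dx, dy) > fov.max_range_m:
        return False
    # smallest signed angle between the sensor axis and the bearing to the point
    off_axis = (math.atan2(dy, dx) - sensor_heading_rad + math.pi) % (2 * math.pi) - math.pi
    return abs(off_axis) <= fov.max_half_angle_rad
```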
  • the map information comprises three-dimensional map information of an area in which the predetermined travelling path is located, such as three-dimensional map information of the area comprising travelling surface inclination information and/or information about objects occluding the sensor field of view around object corners.
  • three-dimensional map information implies that the determined sensor coverage will better reflect the actual sensor coverage at the at least one point along the predetermined travelling path. As such, the risk of unwanted emergency braking actions at the at least one point can be mitigated.
  • the at least one point along the predetermined travelling path is a respective point before reaching an object corner and/or a downhill slope.
  • the sensor field of view may be occluded when the autonomous vehicle approaches an object corner and/or a downhill slope. Accordingly, by decreasing the speed before the at least one point along the predetermined travelling path, the autonomous vehicle will be better prepared for any obstacles which are occluded by the object and/or by the downhill slope.
  • the speed of the autonomous vehicle is decreased so that it at least reaches a predetermined reduced speed level at the at least one point along the predetermined travelling path.
  • the speed may be decreased to different predetermined reduced speed levels in dependence on the level of determined sensor coverage at the at least one point. This implies a more flexible and efficient approach in which the speed is adapted to different sensor coverage levels.
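One way to picture this dependence is a simple lookup from the determined coverage level to a reduced speed level. The thresholds and speed values below are invented placeholders for illustration; the application does not specify numeric levels.

```python
def reduced_speed_level(coverage: float) -> float:
    """Map a determined sensor coverage (here 0.0-1.0, the unoccluded fraction of the nominal
    field of view) to a predetermined reduced speed level in m/s. All numbers are illustrative."""
    if coverage >= 0.8:
        return 15.0   # ample coverage: keep nominal speed
    if coverage >= 0.5:
        return 10.0   # moderately occluded: moderate speed
    if coverage >= 0.2:
        return 5.0    # heavily occluded: creep speed
    return 2.0        # almost no coverage: near-walking pace

print(reduced_speed_level(0.4))  # a corner hiding 60 % of the field of view -> 5.0
```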
  • the speed of the autonomous vehicle is decreased before the at least one point so that an energy efficiency threshold is reached, wherein reaching the energy efficiency threshold is indicative of a longer speed reducing distance in relation to a braking distance if performing an instant emergency braking action at the at least one point in response to the insufficient sensor coverage.
  • the speed profile according to the present invention may result in a smoother ride for the autonomous vehicle. This in turn may be used for reducing the energy consumption for the autonomous vehicle when travelling along the predetermined travelling path.
  • the speed profile may be set so that the aforementioned energy efficiency threshold is reached.
  • the energy efficiency threshold may for example be set so that a desired tradeoff between energy efficiency and time for completing a mission is achieved.
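The trade-off can be made concrete with the usual constant-deceleration relation d = (v_from² − v_to²) / (2a). The decelerations and speeds below are assumed example values; the point is only that a planned, gentle slow-down is spread over a much longer distance than an emergency stop.

```python
def speed_change_distance(v_from: float, v_to: float, decel: float) -> float:
    """Distance in metres needed to go from v_from to v_to (m/s) at a constant deceleration
    decel (m/s^2): d = (v_from^2 - v_to^2) / (2 * decel)."""
    return (v_from**2 - v_to**2) / (2.0 * decel)

v_cruise, v_reduced = 15.0, 5.0                                 # m/s, illustrative values
gentle = speed_change_distance(v_cruise, v_reduced, decel=0.7)  # planned, comfortable slow-down
panic = speed_change_distance(v_cruise, 0.0, decel=5.0)         # instant emergency stop
print(f"planned slow-down over {gentle:.1f} m vs emergency stop within {panic:.1f} m")
# ~142.9 m vs ~22.5 m: an energy efficiency threshold in this spirit would accept the
# profile when the planned speed-reducing distance exceeds the emergency braking distance.
```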
  • the speed of the autonomous vehicle is decreased before the at least one point so that a wear threshold is reached, wherein reaching the wear threshold is indicative of a reduced vehicle wear in relation to a vehicle wear if an instant emergency braking action is performed at the at least one point in response to the insufficient sensor coverage.
  • wear of friction brakes may be severe if performing too many emergency braking actions.
  • friction brake wear can be controlled in an efficient manner, implying e.g. reduced need for service or repair.
  • the friction brake wear may be avoided completely, for example by using regenerative brakes only.
  • the speed of the autonomous vehicle is decreased before the at least one point so that the autonomous vehicle is able to stop within a predetermined distance when reaching the at least one point.
  • the predetermined distance may be variable in dependence on the determined sensor coverage of the at least one environment perception sensor at the at least one point. Thus, this implies a more flexible and efficient approach.
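For the stop-within-a-distance condition, the same kinematics give the highest admissible speed at the point directly: v = sqrt(2·a·d). The deceleration value below is an assumption for illustration.

```python
import math

def max_speed_for_stop(distance_m: float, max_decel: float = 4.0) -> float:
    """Highest speed (m/s) from which the vehicle can still stop within distance_m metres,
    assuming a constant available deceleration max_decel (illustrative value)."""
    return math.sqrt(2.0 * max_decel * distance_m)

# If the occluded region starts 20 m beyond the point, the speed there should not exceed:
print(f"{max_speed_for_stop(20.0):.1f} m/s")  # ~12.6 m/s
```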
  • the method may further comprise determining the speed profile by use of the determined sensor coverage along the predetermined travelling path so that the speed of the autonomous vehicle is increased after the at least one point along the predetermined travelling path when the determined sensor coverage is above a second sensor coverage threshold indicative of a sufficient sensor field of view.
  • the time for completing a driving mission may be reduced, implying increased efficiency.
  • the sensor coverage is further determined based on a mounting position of the at least one environment perception sensor on the autonomous vehicle, such as a mounting height position with respect to a ground surface. It has been realized that also the mounting position may be taken into consideration when determining the sensor coverage. For example, an environment perception sensor mounted close to the ground surface may have a lower sensor coverage than an environment perception sensor mounted higher up from the ground surface. The environment perception sensor(s) may also be directed in different directions with respect to the autonomous vehicle, and this may also be taken into consideration when determining the sensor coverage.
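A simple planar line-of-sight construction over a crest shows why the mounting height matters on a downhill slope. The geometry, the obstacle height and all numbers below are assumptions made for illustration; they are not taken from the application.

```python
def visible_range_past_crest(dist_to_crest_m: float, sensor_height_m: float,
                             downgrade: float, obstacle_height_m: float = 0.5) -> float:
    """How far beyond a crest (the start of a downhill slope with the given downgrade,
    e.g. 0.10 for 10 %) an obstacle of obstacle_height_m is still visible to a sensor
    mounted sensor_height_m above the road and dist_to_crest_m before the crest.
    Straight-line sight over the crest edge; returns inf when the whole slope is visible."""
    critical = downgrade * dist_to_crest_m
    if sensor_height_m >= critical:
        return float("inf")  # every sight line clears the crest
    return dist_to_crest_m * obstacle_height_m / (critical - sensor_height_m)

for height in (0.5, 2.5):  # low versus high mounting position
    print(height, visible_range_past_crest(dist_to_crest_m=30.0,
                                           sensor_height_m=height, downgrade=0.10))
# 0.5 m mounting: visible ~6 m past the crest; 2.5 m mounting: ~30 m past the crest.
```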
  • the method may further comprise updating the predetermined travelling path in response to determining that the sensor coverage is below a third sensor coverage threshold at at least one second point along the predetermined travelling path so that the autonomous vehicle instead travels across at least one third point where the sensor coverage is above the third sensor coverage threshold.
  • updating the travelling path may be advantageous when the sensor coverage is below the third sensor coverage threshold and when there is at least one third point available where the sensor coverage is better. Thereby, the speed may be maintained in some situations, implying increased efficiency.
  • portions of the predetermined travelling path may be allowed to be updated in some areas and not allowed to be updated in other areas.
  • the predetermined travelling path may also be updated before initiating driving of the autonomous vehicle along the predetermined travelling path, implying a more proactive approach.
  • At least one of the objects is achieved by a method according to claim 12.
  • a method for automatically guiding an autonomous vehicle along a predetermined travelling path comprising guiding the autonomous vehicle along the predetermined travelling path and wherein the speed profile along the predetermined travelling path is determined by a method according to any one of the embodiments of the first aspect of the invention.
  • At least one of the objects is achieved by a control unit according to claim 13.
  • control unit for determining a speed profile along a predetermined travelling path for an autonomous vehicle and/or for automatically guiding an autonomous vehicle along a predetermined travelling path, wherein the control unit is configured to perform the method according to any one of the embodiments of the first and/or second aspect of the invention.
  • Advantages and effects of the third aspect are largely analogous to advantages and effects of the first and second aspects of the invention.
  • the control unit may include a microprocessor, microcontroller, programmable digital signal processor or another programmable device.
  • the control unit may also, or instead, include an application specific integrated circuit, a programmable gate array or programmable array logic, a programmable logic device, or a digital signal processor.
  • the processor may further include computer executable code that controls operation of the programmable device.
  • the control unit may comprise embedded hardware, sometimes with integrated software; examples of such physical integration are a shared casing and components mounted on one or several circuit boards.
  • the control unit may be any kind of control unit, and it may also comprise more than one control unit, i.e. the control unit may be configured by two or more sub-control units, which may be provided close to each other or be separated from each other. In some embodiments, the control unit may be denoted a computer.
  • At least one of the objects is achieved by an autonomous vehicle according to claim 14.
  • an autonomous vehicle comprising at least one environment perception sensor having a sensor field of view for covering a portion ahead of the autonomous vehicle during travelling along a predetermined travelling path, and further comprising a control unit according to any one of the embodiments of the third aspect of the invention.
  • the at least one environment perception sensor may for example be any one of a LIDAR (light detection and ranging) sensor, a RADAR (radio detection and ranging) sensor, a camera, a stereo camera, a time-of-flight camera, a SONAR (sound navigation and ranging), an ultrasonic sensor, etc.
  • a computer program comprising program code means for performing the steps of any embodiment of the first and/or second aspects of the invention when said program is run on a computer, such as on the aforementioned control unit.
  • At least one of the objects is achieved by a computer readable medium according to claim 16.
  • a computer readable medium carrying a computer program comprising program code means for performing the steps of any embodiment of the first and/or second aspects of the invention when said program product is run on a computer, such as on the aforementioned control unit.
  • Fig. 1 is a side view of an autonomous vehicle according to an embodiment of the present invention.
  • Figs. 2a-b are flowcharts of methods according to example embodiments of the invention.
  • Fig. 3 is a schematic view of an area in which an autonomous vehicle according to an example embodiment of the invention is driving.
  • Fig. 4 is another view of an autonomous vehicle according to an example embodiment of the invention when reaching a downhill slope.
  • Fig. 5 is a schematic view of an environment perception sensor and control units according to example embodiments of the invention.
  • Fig. 6 shows a graph with a speed profile according to an example embodiment of the invention.
  • Fig. 1 depicts a side view of an autonomous vehicle 1 according to an example embodiment of the present invention.
  • the autonomous vehicle 1 is an articulated truck and trailer combination comprising a towing truck 14 and a connected trailer 16.
  • the invention is not limited to this particular vehicle, but may also be used for other vehicles, such as other trucks, buses, construction equipment, etc.
  • although the invention has been shown to be advantageous for use in a confined area, such as a mining site, a construction site, a port, a logistics centre, an airport, etc., the invention is not limited only to these situations, but may also be used in e.g. public road networks.
  • the towing truck 14 comprises an environment perception sensor 12 which is directed in a forward travelling direction of the autonomous vehicle 1.
  • the environment perception sensor 12 as shown here represents a camera, even though any other type of environment perception sensor can be used. For example, it is common to equip the autonomous vehicle 1 with more than one environment perception sensor, thereby obtaining an improved sensor field of view.
  • the towing truck 14 further comprises a control unit 10, which herein is a control unit according to any embodiment of the third aspect of the invention.
  • Fig. 2a depicts a flowchart of the method.
  • fig. 3 depicts a schematic view from above of an area A in which an autonomous vehicle 1, such as the vehicle shown in fig. 1, is operating.
  • the area A is delimited by a dashed-dotted line A1, which may represent a physical barrier, such as a fence, or an artificial barrier, which defines where the autonomous vehicle 1 is allowed to operate.
  • the area A comprises one obstacle B1, which may be a building, a stationary vehicle, or any other object which may occlude the field of view of the at least one environment perception sensor 12 of the autonomous vehicle 1.
  • a control unit 100 may be associated with the area A.
  • the control unit 100, which may be in the form of a computer or server, may be a back-office function for the area, e.g. a central server or the like.
  • the control unit 100 may be configured to perform the method according to the first aspect and/or the second aspect of the invention.
  • the control unit 10 of the autonomous vehicle 1 may be configured to perform the method according to the first and/or the second aspect of the invention.
  • the control unit 100 may be arranged to communicate with the autonomous vehicle 1 via a wireless communication link (not shown), such as WiFi, Bluetooth, or any other wireless communication link.
  • the control unit 100 may additionally or alternatively be arranged to communicate with the autonomous vehicle 1 by a combined wireless and wired communication link, such as by a wired connection between the control unit 100 and a gateway/hub (not shown), and a wireless connection between the gateway/hub and the autonomous vehicle 1.
  • the method according to the first aspect is a computer-implemented method for determining a speed profile v1 along a predetermined travelling path T for the autonomous vehicle 1.
  • the autonomous vehicle 1 comprises the at least one environment perception sensor 12 which has a sensor field of view for covering a portion ahead of the autonomous vehicle 1 during travelling along the predetermined travelling path T.
  • the travelling path T, which also may be denoted a travelling trajectory T, is represented by a line T in fig. 3 which the autonomous vehicle 1 is intended to follow.
  • the method comprises:
  • S1 obtaining map information associated with the predetermined travelling path T.
  • the map information may be map information about the area A, including information about the obstacle B1.
  • the map information may comprise information about the coordinates of the obstacle B1, including its outer boundaries, its height, etc.
  • the map information may for example be obtained from the control unit 100, e.g. the central server, of the area A.
  • the control unit 100 may comprise a storage means for storing the map information, e.g. a digital memory device.
  • the map information may be obtained from any kind of database.
  • the database may be an onboard database of the autonomous vehicle 1 and/or it may be a remote database, such as part of a cloud storage system.
  • the method comprises:
  • S2 determining a sensor coverage SC along the predetermined travelling path T, wherein the sensor coverage SC along the predetermined travelling path T is determined based on the sensor field of view of the at least one environment perception sensor 12 and the obtained map information.
  • the sensor coverage SC is in fig. 3 indicated by an almost triangular-shaped area which has been reduced in size due to the obstacle B1 which occludes a portion of the area A behind the obstacle B1. Accordingly, in the example shown in fig. 3, the sensor coverage SC is at the shown point in time reduced as a consequence of the obstacle B1.
  • the method further comprises:
  • S3 determining the speed profile v1 by use of the determined sensor coverage SC along the predetermined travelling path T so that the speed v of the autonomous vehicle 1 is decreased before at least one point P1 along the predetermined travelling path T where the sensor coverage SC is below a first sensor coverage threshold indicative of an insufficient sensor coverage caused by an occlusion at the at least one point P1. Furthermore, the speed profile v1 is determined before initiating driving of the autonomous vehicle 1 along the predetermined travelling path T, thereby providing a proactive approach which reduces the risk of any unwanted instantaneous hard braking actions.
  • fig. 6 depicts a graph with the speed v as a function of distance d travelled. As shown, the speed v is decreased before reaching the point P1. As another example, the speed profile may be expressed as speed as a function of time.
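To make steps S1-S3 more concrete, the following is a minimal planar sketch of one possible realization: the three-dimensional map is stood in for by a list of obstacle edges, the sensor coverage at each path point is approximated as the fraction of clear rays inside the field of view, and the planned speed is lowered ahead of low-coverage points with a backward pass that respects a comfortable deceleration. All parameter values, names and the ray-casting simplification are assumptions for illustration, not the method as claimed.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]
Edge = Tuple[Point, Point]

def segments_intersect(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """True if segment p1-p2 strictly crosses segment q1-q2 (orientation test)."""
    def orient(a: Point, b: Point, c: Point) -> float:
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    d1, d2 = orient(q1, q2, p1), orient(q1, q2, p2)
    d3, d4 = orient(p1, p2, q1), orient(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def coverage_at(pose: Point, heading: float, obstacle_edges: List[Edge],
                max_range: float = 40.0, half_fov: float = math.radians(60),
                n_rays: int = 41) -> float:
    """S2 (sketch): fraction of rays in the field of view that reach max_range without
    hitting a mapped obstacle edge. Defaults are illustrative."""
    clear = 0
    for i in range(n_rays):
        ang = heading - half_fov + 2.0 * half_fov * i / (n_rays - 1)
        tip = (pose[0] + max_range * math.cos(ang), pose[1] + max_range * math.sin(ang))
        if not any(segments_intersect(pose, tip, a, b) for a, b in obstacle_edges):
            clear += 1
    return clear / n_rays

def speed_profile(path: List[Point], obstacle_edges: List[Edge],
                  v_nominal: float = 12.0, v_reduced: float = 4.0,
                  coverage_threshold: float = 0.6, decel: float = 1.0
                  ) -> List[Tuple[float, float]]:
    """S3 (sketch): planned speed as a function of travelled distance along the path
    (at least two points), reduced to v_reduced wherever the coverage from S2 falls below
    coverage_threshold, with a backward pass so the slow-down starts early enough."""
    dists, headings = [0.0], []
    for a, b in zip(path, path[1:]):
        dists.append(dists[-1] + math.dist(a, b))
        headings.append(math.atan2(b[1] - a[1], b[0] - a[0]))
    headings.append(headings[-1])  # reuse the last heading for the final point

    target = [v_nominal if coverage_at(p, h, obstacle_edges) >= coverage_threshold
              else v_reduced for p, h in zip(path, headings)]

    planned = target[:]
    for i in range(len(planned) - 2, -1, -1):  # never demand a deceleration harder than decel
        gap = dists[i + 1] - dists[i]
        planned[i] = min(planned[i], math.sqrt(planned[i + 1] ** 2 + 2.0 * decel * gap))
    return list(zip(dists, planned))
```

Run on a sampled version of the path T and the mapped edges of the obstacle B1, a sketch like this yields a curve of the general shape of fig. 6: nominal speed, a ramp down ahead of P1 and the reduced speed at P1.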
  • the speed profile v1 can be planned in advance before the autonomous vehicle 1 starts driving, thereby enabling a smoother and more efficient travelling of the autonomous vehicle 1. It has further been found that planning the speed profile v1 in this manner for more than one vehicle travelling in the area A is advantageous in that it may result in a better cooperation between the vehicles in the area A. For example, the risk of unwanted hard braking actions as a consequence of another vehicle braking hard can be reduced. As another example, if several autonomous vehicles are travelling one after the other in a convoy, hard braking actions which may accumulate through the convoy in an undesired manner may be avoided.
  • an actual arrival time when performing a work mission can be closer to an expected time of arrival, ETA.
  • the improved ETA estimate and the increased cooperation as mentioned in the above imply an improved flow of the vehicles travelling in the area A. This in turn implies a reduction of fuel consumption and a more deterministic flow of material, or goods, which the vehicles are carrying.
  • the map information preferably comprises three-dimensional map information of the area A in which the predetermined travelling path T is located.
  • the three-dimensional map information of the area A may comprise travelling surface inclination information and/or information about objects B1 occluding the sensor field of view around object corners B11.
  • the at least one point P1 along the predetermined travelling path T may as shown in fig. 3 be a respective point before reaching the object corner B11 of the obstacle B1.
  • Fig. 4 shows another example, where the point P1 is a point before reaching a downhill slope DS. As shown, the downhill slope DS in front of the autonomous vehicle 1 occludes the sensor field of view so that the sensor coverage SC is reduced.
  • the speed of the autonomous vehicle 1 may be decreased so that it at least reaches a predetermined reduced speed level v0 at the at least one point P1 along the predetermined travelling path T.
  • the speed of the autonomous vehicle 1 may be decreased before the at least one point P1 so that an energy efficiency threshold is reached, wherein reaching the energy efficiency threshold is indicative of a longer speed reducing distance in relation to a braking distance if performing an instant emergency braking action at the at least one point P1 in response to the insufficient sensor coverage.
  • the speed of the autonomous vehicle 1 may be decreased before the at least one point P1 so that a wear threshold is reached, wherein reaching the wear threshold is indicative of a reduced vehicle wear in relation to a vehicle wear if an instant emergency braking action is performed at the at least one point P1 in response to the insufficient sensor coverage.
  • the speed v of the autonomous vehicle 1 may be decreased before the at least one point P1 so that the autonomous vehicle 1 is able to stop within a predetermined distance when reaching the at least one point P1.
  • the predetermined distance may correspond to a distance from the point P1 to the object corner B11 of the obstacle B1.
  • the object corner B11 is in fig. 3 the corner which the autonomous vehicle 1 will pass around when travelling along the predetermined travelling path T.
  • the predetermined distance may correspond to the distance from the point P1 in fig. 4 to a point where the downhill slope DS begins.
  • the predetermined distance may additionally or alternatively be variable in dependence on the determined sensor coverage SC of the at least one environment perception sensor 12 at the at least one point P1.
  • the method may further comprise determining the speed profile v1 by use of the determined sensor coverage SC along the predetermined travelling path T so that the speed of the autonomous vehicle 1 is increased after the at least one point P1 along the predetermined travelling path T when the determined sensor coverage is above a second sensor coverage threshold indicative of a sufficient sensor field of view. For example, when the autonomous vehicle 1 in fig. 3 has passed the object corner B11, the speed may be increased again.
  • the sensor coverage SC may further be determined based on a mounting position of the at least one environment perception sensor 12 on the autonomous vehicle 1. As shown in fig. 4, a mounting height position h1 with respect to a ground surface may be used for determining the sensor coverage SC.
  • the method may further comprise the following optional step:
  • S4 updating the predetermined travelling path in response to determining that the sensor coverage is below a third sensor coverage threshold at at least one second point P2 along the predetermined travelling path T so that the autonomous vehicle 1 instead travels across at least one third point P3 where the sensor coverage is above the third sensor coverage threshold.
  • Fig. 3 depicts a situation where the predetermined travelling path T is updated so that the autonomous vehicle instead follows the updated travelling path T1 which includes the point P3. More particularly, as shown in the particular example, the autonomous vehicle 1 may take a left-turn so that the sensor coverage is not occluded as much as when following the first intended path T2 of the predetermined travelling path T. A speed profile for the updated portion T1 of the travelling path may also be determined before initiating the autonomous driving.
  • the optional step S4 is represented by a box with a dashed line in fig. 2a.
  • the points P1, P2 and P3 may be defined as coordinates in the area A.
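A corresponding sketch of the optional step S4: replace a low-coverage point such as P2 with an available alternative such as P3 when one exists, and otherwise fall back on the speed profile. The coverage function and the table of allowed alternative points are assumed inputs (for instance the coverage estimate from step S2 and a set of pre-approved lateral offsets); nothing below is prescribed by the application.

```python
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]

def maybe_update_path(path: List[Point],
                      coverage_at_point: Callable[[Point], float],
                      alternatives: Dict[int, List[Point]],
                      third_threshold: float = 0.3) -> List[Point]:
    """S4 (sketch): wherever the determined coverage at a path point (e.g. P2) is below the
    third sensor coverage threshold, substitute an alternative point (e.g. P3) whose coverage
    is above it, if one is allowed at that index; otherwise keep the original point."""
    updated = list(path)
    for i, point in enumerate(path):
        if coverage_at_point(point) >= third_threshold:
            continue
        for alternative in alternatives.get(i, []):
            if coverage_at_point(alternative) >= third_threshold:
                updated[i] = alternative  # e.g. the wider left turn past the corner B11
                break
    return updated
```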
  • Fig. 2b represents a method according to an example embodiment of the second aspect of the invention.
  • the second aspect relates to a method for automatically guiding an autonomous vehicle 1 along a predetermined travelling path T.
  • the method is preferably computer-implemented and comprises:
  • S10 guiding the autonomous vehicle 1 along the predetermined travelling path T.
  • the speed profile v1 along the predetermined travelling path T is determined by a method according to any one of the embodiments of the first aspect of the invention.
  • the guiding may be performed by the control unit 10 of the autonomous vehicle 1, which guides the vehicle 1 by sending guiding instructions to one or more actuators (not shown) of the autonomous vehicle 1 for driving the autonomous vehicle 1.
  • the actuators may be brake actuators for controlling brakes of the autonomous vehicle 1 and propulsion actuators for controlling a propulsion system of the autonomous vehicle 1.
  • the propulsion system may comprise one or more electric motors and/or an internal combustion engine.
  • the actuators may be steering actuators for controlling steering of the autonomous vehicle 1 along the predetermined travelling path T.
  • the guiding may be performed by the control unit 100, which guides the vehicle 1 by sending guiding instructions to the autonomous vehicle 1.
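A minimal sketch of such a guiding loop is shown below: the planned speed for the current travelled distance is looked up in the profile and turned into propulsion or brake requests. The localization and actuator interfaces (localize, command_propulsion, command_brake) and the proportional gains are placeholders invented for illustration, not the interfaces of the control units 10 or 100.

```python
import bisect
import time
from typing import Callable, List, Tuple

def follow_speed_profile(profile: List[Tuple[float, float]],
                         localize: Callable[[], Tuple[float, float]],
                         command_propulsion: Callable[[float], None],
                         command_brake: Callable[[float], None],
                         period_s: float = 0.05) -> None:
    """Track a (distance, speed) profile with simple proportional propulsion/brake requests
    in the range 0..1. All interfaces are assumed placeholders."""
    dists = [d for d, _ in profile]
    speeds = [v for _, v in profile]
    while True:
        travelled, measured_speed = localize()   # distance along T and current measured speed
        if travelled >= dists[-1]:
            command_propulsion(0.0)
            command_brake(1.0)                   # end of the travelling path: request a stop
            return
        i = max(bisect.bisect_right(dists, travelled) - 1, 0)
        error = speeds[i] - measured_speed
        if error >= 0.0:
            command_propulsion(min(0.5 * error, 1.0))
            command_brake(0.0)
        else:
            command_propulsion(0.0)
            command_brake(min(-0.5 * error, 1.0))
        time.sleep(period_s)
```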
  • control unit 10 and the control unit 100 may comprise a computer program comprising program code means for performing the steps of any of the methods as disclosed herein.
  • the invention also relates to a computer readable medium carrying a computer program comprising program code means for performing the steps of any of the methods as disclosed herein when said program product is run on a computer, such as on the control units 10 or 100.
  • Fig. 5 depicts a schematic view of an environment perception sensor 12, a control unit 10 and a control unit 100 as mentioned in the above.
  • the different units are configured to communicate with each other, indicated by the dashed lines between the units 12, 10 and 100.
  • the communication between the environment perception sensor 12 and the control unit 10 may be a wired communication link, such as a CAN bus communication link.
  • the communication link may additionally, or alternatively, be wireless.
  • the communication between the control unit 10 and the control unit 100 may be a wireless communication link.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

A computer-implemented method for determining a speed profile along a predetermined travelling path (T) for an autonomous vehicle (1), the autonomous vehicle (1) comprising at least one environment perception sensor (12) having a sensor field of view for covering a portion ahead of the autonomous vehicle (1) during travelling along the predetermined travelling path (T), the method comprising: - obtaining (S1) map information associated with the predetermined travelling path (T), - determining (S2) a sensor coverage (SC) along the predetermined travelling path (T), wherein the sensor coverage (SC) along the predetermined travelling path is determined based on the sensor field of view of the at least one environment perception sensor (12) and the obtained map information, and - determining (S3) the speed profile by use of the determined sensor coverage (SC) along the predetermined travelling path (T) so that the speed of the autonomous vehicle (1) is decreased before at least one point (P1) along the predetermined travelling path (T) where the sensor coverage (SC) is below a first sensor coverage threshold indicative of an insufficient sensor coverage caused by an occlusion at the at least one point (P1), wherein the speed profile is determined before initiating driving of the autonomous vehicle (1) along the predetermined travelling path (T). The invention also relates to a method for guiding an autonomous vehicle (1), a control unit (10, 100), a computer program and a computer readable medium.
PCT/EP2022/069434 2022-07-12 2022-07-12 Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle WO2024012660A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/069434 WO2024012660A1 (fr) 2022-07-12 2022-07-12 Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2022/069434 WO2024012660A1 (fr) 2022-07-12 2022-07-12 Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle

Publications (1)

Publication Number Publication Date
WO2024012660A1 true WO2024012660A1 (fr) 2024-01-18

Family

ID=82557889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/069434 WO2024012660A1 (fr) 2022-07-12 2022-07-12 Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle

Country Status (1)

Country Link
WO (1) WO2024012660A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2741269A1 (fr) * 2011-08-02 2014-06-11 Toyota Jidosha Kabushiki Kaisha Dispositif d'aide à la conduite
WO2018132608A2 (fr) * 2017-01-12 2018-07-19 Mobileye Vision Technologies Ltd. Navigation basée sur des zones de masquage
US20180259968A1 (en) * 2017-03-07 2018-09-13 nuTonomy, Inc. Planning for unknown objects by an autonomous vehicle
WO2019168530A1 (fr) * 2018-02-28 2019-09-06 Nissan North America, Inc. Infrastructure de réseau de transport en vue d'une prise de décision de véhicule autonome
US20210061269A1 (en) * 2019-08-30 2021-03-04 Argo AI, LLC Method of handling occlusions at intersections in operation of autonomous vehicle
US20220176962A1 (en) * 2020-12-09 2022-06-09 Waymo Llc Physics-informed optimization for autonomous driving systems

Similar Documents

Publication Publication Date Title
US10331135B2 (en) Systems and methods for maneuvering around obstacles in autonomous vehicles
US10513267B2 (en) Vehicle safety system
EP3782000B1 Method for controlling a chain of vehicles
EP2921362B1 Vehicle, vehicle system and method for increasing safety and/or comfort during autonomous driving
US8798841B1 (en) System and method for improving sensor visibility of vehicle in autonomous driving mode
US20210197808A1 (en) Moving body control system
JP6970807B2 Moving body control device
JP6892508B2 Travel control system
JP6825081B2 Vehicle control device and vehicle control method
WO2020116156A1 Vehicle control device
CN104925064A Vehicle, vehicle system and method for increasing safety and/or comfort during autonomous driving
US20230021615A1 (en) Vehicle control device, and vehicle control system
EP3600999B1 Distance control for a vehicle with a trailer
CN113771841A Driving assistance system and method for a vehicle fleet, computer device and storage medium
US20230035414A1 (en) A method for providing a positive decision signal for a vehicle
US20230138848A1 (en) Driver assistance system for heavy-duty vehicles with overhang
WO2024012660A1 Computer-implemented method for determining a speed profile along a predetermined travelling path for an autonomous vehicle
JP7446216B2 Vehicle control device
US11952060B2 (en) Steering assembly for a vehicle
US20230159055A1 (en) Method for planning a driving trajectory defining a travelling path for a vehicle
JP7050098B2 Vehicle control device, vehicle control method, and program
US20220351620A1 (en) Method for generating a routing map of an area to be used for guiding a vehicle in the area
US20230192102A1 (en) Control system and method for manoeuvring an automated vehicle
WO2023198281A1 Control system for controlling a fleet of autonomous vehicles
WO2023112385A1 Vehicle control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22741767

Country of ref document: EP

Kind code of ref document: A1