SE1950883A1 - Method and control unit for predicting a collision between a vehicle and a mobile object - Google Patents

Method and control unit for predicting a collision between a vehicle and a mobile object

Info

Publication number
SE1950883A1
SE1950883A1
Authority
SE
Sweden
Prior art keywords
vehicle
mobile object
trajectory
time
collision
Prior art date
Application number
SE1950883A
Other languages
Swedish (sv)
Other versions
SE543781C2 (en)
Inventor
Fredrik Nordin
Jakob Arnoldsson
Magnus Granström
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1950883A priority Critical patent/SE543781C2/en
Publication of SE1950883A1 publication Critical patent/SE1950883A1/en
Publication of SE543781C2 publication Critical patent/SE543781C2/en

Links

Classifications

    • G — PHYSICS
    • G08 — SIGNALLING
    • G08G — TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 — Traffic control systems for road vehicles
    • G08G 1/16 — Anti-collision systems
    • G08G 1/161 — Decentralised systems, e.g. inter-vehicle communication
    • G08G 1/163 — Decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G 1/166 — Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 30/00 — Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W 30/08 — Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W 30/09 — Taking automatic action to avoid collision, e.g. braking and steering
    • B60W 30/095 — Predicting travel path or likelihood of collision
    • B60W 30/0953 — Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W 30/18 — Propelling the vehicle
    • B60W 30/18009 — Propelling the vehicle related to particular drive situations
    • B60W 30/18154 — Approaching an intersection
    • B60W 30/18159 — Traversing an intersection
    • B60W 50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 50/0097 — Predicting future conditions
    • B60W 60/00 — Drive control systems specially adapted for autonomous road vehicles
    • B60W 60/001 — Planning or execution of driving tasks
    • B60W 60/0015 — Planning or execution of driving tasks specially adapted for safety
    • B60W 60/0027 — Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W 2554/00 — Input parameters relating to objects
    • B60W 2554/80 — Spatial relation or speed relative to objects

Abstract

The present disclosure relates to a method and a control unit for predicting a collision between a vehicle (1) and a mobile object (2). The disclosure also relates to a computer program, a computer-readable medium and a vehicle. According to a first aspect, the disclosure relates to a method for predicting a collision between a vehicle and a mobile object. The method comprises obtaining (S1) a first trajectory defining an intended trajectory of the vehicle and determining (S2) a current position and speed of a mobile object that the vehicle risks colliding with. The method further comprises estimating (S3), based on the current position and speed of the mobile object, a second trajectory defining a predicted future trajectory of the mobile object, and comparing (S4) a position, at one point in time, at one of the first and second trajectories with each of a plurality of positions at a plurality of consecutive discrete points in time in a time window, at the other trajectory. The method further comprises determining (S5) a future collision based on the comparison revealing at least one overlap of the position of the vehicle and the mobile object.

Description

Method and control unit for predicting a collision between a vehicle and a mobile object

Technical field

The present disclosure relates to a method and a control unit for predicting a collision between a vehicle and a mobile object. The disclosure also relates to a computer program, to a computer-readable medium and to a vehicle.
Background

Today, many vehicles use autonomous control systems to control vehicle operation. For example, vehicles may use adaptive cruise control systems to control speed based on both the operator-selected speed and the presence of and distance to objects, such as another vehicle in a projected travel path, for example. An autonomous vehicle uses sensors to detect and track other surrounding vehicles and obstacles and tries to plan its trajectory such that collisions with these other vehicles and obstacles are avoided. Typically, sensor data derived from disparate sources are combined (also known as sensor fusion), such that the resulting information has less uncertainty than when these sources were used individually.
As an example, US 2010/0228419 A1 discloses a vehicle equipped with a spatial monitoring system. Each of a plurality of objects located proximate to the vehicle is monitored. Locations of each of the objects are predicted relative to a projected trajectory of the vehicle. A collision risk level between the vehicle and each of the objects is assessed. More specifically, locations of the subject vehicle and object vehicle are compared at corresponding points in time to assess collision risk. An object vehicle is said to be a potential risk if it is determined to be longitudinally close, i.e. within an allowable margin, to the subject vehicle in the next 6 seconds.
However, object tracking may give noisy and sometimes wrong sensor readings. Bad sensor readings could potentially lead to collisions if the uncertainty is not considered. For example, if a mobile object's speed, velocity, size or position is misjudged, what might seem like a safe maneuver could end up being a collision, which might have devastating consequences. Another uncertainty is the fact that another vehicle may change its own velocity in an unexpected way, which can be difficult to predict.
One way to increase safety despite inaccurate sensors is to increase the above-mentioned safety margin (i.e. safety distance) used to predict a collision. However, for a vehicle driving on a non-straight trajectory, implementation of such a safety margin requires calculation of a safety zone having a complex geometric shape. As a consequence, the calculations required to make the collision check would typically be complicated and time consuming.

In conclusion, sensor fusion and obstacle detection are prone to suffer from noise and to be non-exact. With better sensors, more computing power and better fusion techniques the measurements might be enhanced. However, noise and non-exactness may still remain, and it seems like it will remain a "fuzzy" and non-exact topic for the time being. At the same time, the demand for safe and exact collision avoidance methods is increasing. Hence, there is a need for improved ways of safe and efficient collision detection.
Summary

It is an object of the disclosure to alleviate at least some of the drawbacks of the prior art. Thus, it is an object of this disclosure to provide an efficient way to predict and avoid collisions between vehicles and mobile objects.
According to a first aspect, the disclosure relates to a method for predicting a collision between a vehicle and a mobile object. The method comprises obtaining a first trajectory defining an intended trajectory of the vehicle and determining a current position and speed of a mobile object that the vehicle risks colliding with. The method further comprises estimating, based on the current position and speed of the mobile object, a second trajectory defining a predicted future trajectory of the mobile object, and comparing a position, at one point in time, at one of the first and second trajectories, with each of a plurality of positions, at a plurality of consecutive discrete points in time in a time window T, at the other trajectory. In some embodiments, the time window includes the one point in time. The method further comprises determining a future collision based on the comparison revealing at least one overlap of the position of the vehicle and the mobile object.

The proposed method is robust and simple. It is also versatile and can be used to safeguard against multiple types of safety issues. For example, it is very useful for handling jitter in, for example, sensors. Furthermore, it is simple to implement as it allows for fast and simple collision check algorithms between simple geometrical objects. In other words, the method makes it possible to preserve a simple rectangular form of the mobile object, keeping the actual collision check algorithm simple while at the same time safeguarding against uncertainties in a manner that is equivalent, or at least similar, to reshaping the mobile object into a more advanced geometrical shape.

In some embodiments, a size of the time window is dynamically configurable. Thereby, the method may handle different levels of required security and different levels of uncertainty. In some embodiments, the size of the time window is determined based on a collision risk requirement of the vehicle.
Hence, if a very low collision risk is required, a large window may be used, and vice versa.

In some embodiments, the size of the time window is based on an estimated uncertainty level within a system of the vehicle. Hence, the time window may be adapted to different levels of insecurity in the system. For example, if one sensor is subject to noise or jitter, the time window size may be increased.

In some embodiments, the distances between the consecutive discrete points in time are variable or static. Hence, the distances may be varied depending on the situation.

In some embodiments, the mobile object is another vehicle, a pedestrian or an animal. Hence, the method may be used to predict and prevent collision with different types of objects.

In some embodiments, the estimating is based on at least one of: map data, an acceleration of the mobile object, an angular velocity of the mobile object, a velocity of the mobile object and a known intended trajectory of the mobile object. Hence, the method may be adapted to use different sorts of data.

In some embodiments, the determining is based on a volume or area of the vehicle and/or of the mobile object. Thus, because of the time window, simple geometric shapes may be used.

In some embodiments, the future trajectory comprises a plurality of possible alternative future trajectories. Thus, the method is usable even when the trajectory of the mobile object is not known.

In some embodiments, the time window extends before and/or after the one point in time. Thereby, the method will detect a mobile object even if it is accelerating or braking.

In some embodiments, the method comprises controlling driving of the vehicle based on the determined future collision. Thereby, collisions may be avoided even when the sensor readings are inaccurate.
According to a second aspect, the disclosure relates to a control unit configured to control a vehicle, the control unit being configured to perform the method according to the first aspect.
According to a third aspect, the disclosure relates to a vehicle comprising the control unit of the second aspect.
According to a fourth aspect, the disclosure relates to a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to the first aspect.
According to a fifth aspect, the disclosure relates to a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the first aspect.
Brief description of the drawings

Fig. 1 illustrates an example collision scenario that may be detected using the proposed technique.
Fig. 2 illustrates a vehicle where the proposed technique may be implemented.

Fig. 3 is a flow chart of a method for collision detection according to the first aspect.
Fig. 4 illustrates another example collision scenario that may be detected usingthe proposed technique.
Fig. 5 illustrates a control unit of the vehicle of Fig. 2 according to an example embodiment.
Detailed description

This disclosure proposes a method, for use in a vehicle, that safely predicts collisions with mobile objects, such as other vehicles, despite noisy and inaccurate sensor readings. The method involves estimating the vehicle's own intended future trajectory as well as another mobile object's estimated future trajectory, for example by assuming that the mobile object will keep constant velocity. The method involves checking for collisions in discrete time steps. To safeguard against uncertainties in measurements used for estimating the trajectories, as well as the rather simple assumption of constant velocity, a time window is introduced. This means that for every discrete point in time in the vehicle's own intended trajectory, the method checks for collisions with the mobile object at a plurality of discrete points in time within this time window along the mobile object's estimated trajectory. If this time window is scaled appropriately it can protect against poor sensor readings, noise, jitter and possible accelerations or decelerations of the mobile object and thus safeguard against collisions. Thus, the time window replaces the safety zone, which may be problematic because of its complex geometric form.
An illustration of how the time window T would find an otherwise non-considered collision is displayed in Fig. 1. T1 represents the present point in time. In Fig. 1 the position of the own vehicle 1 is simulated forward in time at two points in time T2, T3 and a mobile object 2 (here another vehicle) is simulated forward in time at four points in time T2–T5. In this example, the position of vehicle 1 at time T2 is not only compared to the position of the mobile object 2 at time T2, but also to the positions at times T3, T4 and T5. Thus, the time window T for comparing the positions to detect a collision is set to four points in time. More specifically, the time window includes the present position of the mobile object 2 and three additional points forwards in time. The time window TT2 for comparison at time T2 is illustrated. In this case the comparison with the position of the mobile object 2 at time T5 indicates a collision. Hence, if the vehicle 1 in Fig. 1 has an erroneously estimated speed or if the mobile object is accelerating heavily, the solution might prevent a collision which would otherwise not have been prevented. This will work for both positive as well as negative accelerations, if the time window extends in both directions in relation to the time of the vehicle 1.
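The windowed comparison described above can be sketched in Python. This is a minimal illustration, not the patent's actual implementation: the function and parameter names (predict_collision, overlaps) are assumptions, and the geometric overlap test is passed in as a callable.

```python
# Sketch of the time-window collision check: for each discrete time
# index i on the ego vehicle's intended trajectory, the ego position is
# compared against the mobile object's predicted positions at indices
# i .. i + window (a forward-extending window, for simplicity).

def predict_collision(ego_traj, obj_traj, window, overlaps):
    """ego_traj, obj_traj: positions at consecutive discrete points in
    time; window: number of extra future points to compare against;
    overlaps(p, q) -> bool: geometric overlap test for the two bodies."""
    for i, ego_pos in enumerate(ego_traj):
        for j in range(i, min(i + window + 1, len(obj_traj))):
            if overlaps(ego_pos, obj_traj[j]):
                return True  # a predicted future collision at index i
    return False
```

With window = 0 this degenerates to the prior-art comparison at corresponding points in time only; a larger window also catches the Fig. 1 case where the object reaches the impact point earlier or later than predicted.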
Fig. 2 conceptually illustrates a vehicle 1, here a truck, where the proposed method for predicting a collision between a vehicle and a mobile object may be implemented. The vehicle 1 may comprise a means for transportation in a broad sense, such as e.g. a bus, a truck, a car or other similar manned or unmanned means.
The vehicle 1 comprises a plurality of electrical systems and subsystems. However, for simplicity only some parts of the vehicle 1 that are associated with the proposed method are shown in Fig. 2. Thus, the illustrated vehicle 1 of Fig. 2 comprises a control system for autonomous driving comprising a sensor arrangement 11, a positioning device 12 and a control unit 10. The vehicle also comprises other control systems 13.
The sensor arrangement 11 is for example configured to detect various features, such as landscape and road features and objects close to the vehicle 1, that the vehicle 1 needs to consider while driving autonomously. The sensor arrangement 11 is also configured to measure and/or estimate driving parameters of the vehicle 1, such as vehicle position (e.g. in relation to the road), vehicle speed, vehicle velocity, acceleration, heading, jerk-rate etc. The sensor arrangement 11 may for example be configured to determine a current position and speed of any object that the vehicle 1 risks colliding with. The sensor arrangement 11 may comprise one or more of a variety of sensors, such as a speed sensor, an accelerometer, a camera, a radar unit, a lidar unit, an ultrasonic transducer, vehicle-to-infrastructure (V2I) communications, and/or vehicle-to-vehicle (V2V) communications, just to mention some.
The positioning device 12 comprises for example a global-positioning-system (GPS) communications device that is configured to provide an absolute position of the vehicle 1. The position is for example given in a global coordinate system. In some embodiments, the positioning device 12 is configured to determine a current absolute position of the vehicle 1. The positioning system is in some embodiments configured to provide information about the surroundings of the vehicle 1, such as information about the configuration or layout of the roads and junctions by, for example, drawing information from a map database. The positioning device may be regarded as a part of the sensor arrangement 11.
The control unit 10, or ECU, is basically a digital computer that controls the autonomous driving function of the vehicle 1 based on e.g. information read from the sensor arrangement 11 and the positioning device 12. ECU is a generic term that is used in automotive electronics for any embedded system that controls one or more functions of the electrical system or subsystems in a transport vehicle. A vehicle typically comprises a plurality of ECUs that communicate over a Controller Area Network, CAN. The control system for autonomous driving may comprise a plurality of control units. However, for simplicity only one ECU is illustrated. The CAN is a network that is used to handle communication between the various control units in the vehicle 1. The CAN uses a message-based protocol. Often, several connected CAN networks are arranged in the vehicle 1. In the future the CAN networks may be replaced by Ethernet. The control unit 10 will be described in further detail in Fig. 5.
The other control systems 13 are the other systems of the vehicle 1 that are used to control the vehicle 1, for example while driving. Each control system of the vehicle 1 is typically controlled by one or more respective control units. The other control systems 13 comprise for example a brake control system for the control of, for example, service brake, secondary brake and parking brake. The term "secondary brake" can be used to denote retarders, exhaust brakes, electrical secondary brakes and similar. Typically, the other control systems 13 further comprise an engine control system configured to control an engine of the vehicle 1 and its propulsion, and a change-of-gear control system for the control of the change of gear of the vehicle 1.
The proposed technique will now be described in further detail with reference to the flow chart of Fig. 3 and the vehicle of Fig. 2. Fig. 3 illustrates a flow chart of a method for predicting a collision between a vehicle 1 and a mobile object 2. Some method steps are optional, which is indicated by dashed outlines.
The method of Fig. 3 is e.g. performed by a control unit 10 (Fig. 5) of the vehicle 1. The method is typically performed during autonomous driving of the vehicle 1. However, even though the examples in this disclosure are mainly directed to autonomous vehicles, it must be anticipated that the method may also be useful in a fully or partly manually operated vehicle.
The method may be implemented as a computer program comprising instructions which, when the program is executed by a computer (e.g. a processor in the control unit 10 (Fig. 5)), cause the computer to carry out the method. According to some embodiments the computer program is stored in a computer-readable medium (e.g. a memory or a compact disc) that comprises instructions which, when executed by a computer, cause the computer to carry out the method.

In some embodiments, the method is performed during normal driving of the vehicle 1, upon detecting S0 a mobile object that the vehicle risks colliding with. Such a mobile object may be detected S0 using commonly known techniques such as object detection using sensor data from for example a radar, lidar or camera sensor.
The method comprises obtaining S1 a first trajectory defining an intended trajectory of the vehicle 1. In other words, the path that the vehicle 1 is expected to travel within the next seconds or minutes is determined or retrieved. If the vehicle 1 is autonomous, the intended trajectory is often continuously determined by an autonomous driving function of the vehicle 1. Then, the first trajectory may be retrieved therefrom. The first trajectory is typically estimated based on a selected destination, environment data and other vehicle data. The destination may be settable by an operator of the vehicle 1. Environment data is typically map data retrieved from a global positioning system, e.g. using the positioning device 12. The map data may e.g. provide information about roads that the vehicle 1 may drive along, such as curvature, cross-sections, stops, etc. The environment data may also be data obtained by the sensor arrangement 11. For example, the environment data comprises information about objects in the environment detected using a camera, a radar and/or a lidar.
The first trajectory is typically also determined based on driving parameters, such as data provided by the sensor arrangement 11. Examples of sensor data provided by the sensor arrangement 11 are vehicle velocity, acceleration, jerk etc. Techniques for calculating the intended trajectory of a vehicle are well known in the art.
The method comprises determining S2 a current position and speed of a mobile object 2 that the vehicle 1 risks colliding with. In some embodiments, the mobile object 2 is another vehicle, a pedestrian or an animal. The position is for example defined as an expected rotation center of the vehicle 1. In some embodiments, the determining S2 comprises determining further parameters such as an acceleration of the mobile object 2, an angular velocity of the mobile object 2, a velocity of the mobile object 2, a jerk rate of the mobile object etc. The determining is for example performed using the sensor arrangement 11. For example, the position and speed may be determined using radar, lidar and/or a camera. In some embodiments, the position and speed are determined using V2V or V2I communication.
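The state determined in step S2 can be collected in a simple container. The field names below are illustrative assumptions; the disclosure does not prescribe any particular data layout:

```python
from dataclasses import dataclass

# Minimal container for the observed state of a mobile object (step S2):
# current position and speed, plus optional further parameters such as
# acceleration and angular velocity, as mentioned above.

@dataclass
class ObjectState:
    x: float                     # position in a local frame, e.g. metres
    y: float
    speed: float                 # m/s
    heading: float               # radians
    acceleration: float = 0.0    # optional further S2 parameters
    angular_velocity: float = 0.0
```

Such a record would then be the input to the trajectory estimation in step S3.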
The method then comprises estimating S3, based on the current position and speed of the mobile object 2, a second trajectory defining a predicted future trajectory of the mobile object 2. In other words, the path that a mobile object 2 that the vehicle 1 risks colliding with is expected to travel in the near future is predicted. The predicted future trajectory may be estimated based on the mobile object's current position and movement in combination with information about the environment, such as map data. The environment data may be provided by the sensor arrangement 11 and/or by the positioning device 12.

In a very simple example, the second trajectory is estimated based on the assumption that the mobile object 2 will continue to travel in the same direction with the same speed. However, additional parameters may be used to estimate S3 a predicted future trajectory of the mobile object 2. In some embodiments, the estimating S3 is based on at least one of: map data, an acceleration of the mobile object, an angular velocity of the mobile object, a velocity of the mobile object and a known intended trajectory of the mobile object 2. In other words, in some embodiments map data and traffic rules, retrieved for example from the positioning device 12, are used to estimate S3 the second trajectory.

In some embodiments, a pose of the mobile object 2 is determined and the estimation S3 of the second trajectory is based thereon. For example, a heading direction of a mobile object 2 is estimated.

In some embodiments, visual turn indications of the mobile object 2 are determined and the estimation S3 of the second trajectory is based thereon. For example, flashers of the mobile object 2 indicating a certain driving direction may be detected.

In some embodiments, the method uses a prediction model of the mobile object 2 and the estimation S3 of the second trajectory is based thereon. For example, a model representing the dynamics of the mobile object 2 may be used.
The model may be selected based on identification of the mobile object 2. Also, available historic information about an identified mobile object 2 may be used to estimate S3 the second trajectory. In other words, in some embodiments, the estimating S3 comprises simulating the mobile object's predicted future trajectory using a model taking one or more of the above-mentioned parameters as input.

In general, the estimating S3 of the second trajectory is up to implementation and any suitable method may be used.

In some embodiments, the predicted future trajectory comprises a plurality of possible alternative future trajectories. For example, the mobile object 2 may be another vehicle driving along a road. If there is a junction, then the mobile object's predicted future trajectories along each of the possible roads after the junction are determined. The following steps of the method may then need to be performed for each of the alternative possible future paths of the mobile object 2.
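The "very simple example" of constant-velocity prediction mentioned above could be sketched as follows. The function name and parameters are illustrative assumptions, and a real implementation would use one of the richer models described in the text:

```python
import math

# Constant-velocity estimate of the second trajectory (step S3): the
# mobile object is assumed to keep its current speed and heading.

def predict_trajectory(x, y, speed, heading, dt, n_points):
    """Return predicted (x, y) positions at n_points consecutive
    discrete points in time, dt seconds apart, starting dt after now."""
    vx = speed * math.cos(heading)
    vy = speed * math.sin(heading)
    return [(x + vx * dt * k, y + vy * dt * k)
            for k in range(1, n_points + 1)]
```

It is exactly the simplicity of this assumption that the time window of step S4 is meant to safeguard against.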
The method further comprises comparing S4 a position, at one point in time, at one of the first and second trajectories with each of a plurality of positions, at a plurality of consecutive discrete points in time in a time window T, at the other trajectory. Hence, to safeguard against and reduce the likelihood of a collision happening it is proposed to utilize a time window when comparing the vehicle's position with the predicted future trajectory of the mobile object. Hence, if the vehicle 1 is at position T1, then the intended position of the vehicle 1 at time T2 may be compared with the mobile object's expected position at points T2, T3, T4 and T5, as explained in Fig. 1. The comparing may be repeated for several points in time along the one trajectory. For example, the intended position of the vehicle 1 at time T3 may be compared with the mobile object's expected position at points T3, T4, T5 and T6, the intended position of the vehicle 1 at time T4 may be compared with the mobile object's expected position at points T4, T5, T6 and T7, and so on.

In some embodiments, the consecutive discrete points in time (e.g. T2, T3, T4, T5) are within a time window T that includes the one point in time (e.g. T2). However, it is also possible to check for collisions just in the future and/or in the past. This time window is used to collision check each discrete point at the intended trajectory of the vehicle 1 with multiple discrete points at the mobile object's predicted future trajectory, or vice versa. In the examples below, the time window is applied to the second trajectory, that is the estimated predicted future trajectory of the mobile object 2. However, it must be appreciated that the time window may be applied to either one of the trajectories, with the same or similar result.

If a collision, that is an overlap, is detected in any one of these checks, then it is considered as an indication that a collision of the vehicle and the mobile object will happen at that point in time.
The vehicle 1 should consequently act appropriately upon this information.

In some embodiments, a size of the time window is dynamically configurable. In other words, the width of the time window T can be tuned to increase the safety. It can also be scaled variably to account for safety uncertainty within the system. For example, it can be scaled in relation to measurement uncertainty, estimation uncertainty, situation danger level (e.g. potential collision energy), collision checking accuracy and system jitter.

In some embodiments, the size of the time window is determined based on a collision risk requirement or based on an estimated uncertainty level within a system of the vehicle 1. In some embodiments, the collision energy may be estimated based on the relative velocities of the vehicle 1 and the mobile object 2. However, further parameters may be taken into account. For example, if the goods are heavy (high collision energy) or fragile, a larger window may be configured.

In some embodiments, the time window extends before and/or after the one point in time. In some scenarios, comparison of only the current point in time at the two trajectories would indicate that the mobile object 2 has already passed the possible impact point when the vehicle 1 arrives at the impact point. However, if the mobile object 2 brakes more than expected, or if the sensors indicate too low a speed or an erroneous position, this may not be the case. Hence, it may be desirable to let the time window extend both before and after the one point in time.

In some embodiments, the distances between the consecutive discrete points in time are variable or static. Typically, the discrete points in time should be close enough that the vehicle 1 could not fit in between, as a collision might otherwise go undetected. Hence, the periodicity of the points in time could be adapted based on the size of the vehicle 1 and the size of the mobile object 2.
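The two tuning rules above (widening the window with uncertainty, and spacing the discrete points so a body cannot slip between two checks) might be sketched like this. The gains and formulas are illustrative assumptions, not values from the disclosure.

```python
def time_window_steps(sigma_pos_m, rel_speed_ms, step_s,
                      base_steps=2, k_sigma=2.0):
    """Widen the window so it covers a k_sigma position uncertainty
    at the current relative speed (illustrative heuristic)."""
    if rel_speed_ms <= 0.0:
        return base_steps
    extra = (k_sigma * sigma_pos_m) / (rel_speed_ms * step_s)
    return base_steps + int(round(extra))

def max_step_s(ego_length_m, obj_length_m, rel_speed_ms):
    """Largest time step for which the distance travelled between two
    consecutive discrete points does not exceed the combined lengths,
    so one body cannot pass undetected between two checks."""
    return (ego_length_m + obj_length_m) / rel_speed_ms
```

For example, at a 30 m/s relative speed, a 10 m truck and a 5 m car require discrete points no more than 0.5 s apart under this rule.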
The method also comprises determining S5 a future collision based on the comparison revealing at least one overlap of the position of the vehicle 1 and the mobile object 2. In other words, if one of the comparisons reveals an overlap, then a collision is detected. An overlap is defined as an overlap of the vehicle 1, when projected at the one position, and the mobile object 2, when projected at one of the plurality of positions within the time window.

In some embodiments, the determining S5 is based on a volume or area of the vehicle 1 and/or of the mobile object 2. Hence, an area or volume around the one position at the first trajectory is compared with an area or volume around one of the plurality of positions at the second trajectory. If there is an overlap between the areas or volumes, a collision is considered detected. The volume or area is typically determined based on the type of mobile object. For example, a vehicle is typically represented by a rectangular or cuboid shape, while a pedestrian may be represented by a square/cube. In other words, in some embodiments, the volume or area is a simple geometric shape. Examples of shapes are a rectangle, a circle, a sphere and a cuboid. The size of the area or volume may also vary. For example, the area or volume of a truck is typically assumed to be larger than the area or volume of a car. However, a more complex shape may also be used, such as a CAD model of the vehicle.

If a future collision is determined S5, then the vehicle 1 should be driven to avoid the collision. Typically, the vehicle 1 should be controlled to brake or turn. If the vehicle 1 is autonomous, then it is autonomously controlled to avoid a determined future collision. In other words, in some embodiments, the method comprises controlling S6 the vehicle 1 based on the determined future collision.
This is typically done by sending control data from the control unit 10 of the control system for autonomous driving to the other control systems 13, such as to the brake control system or engine control system. Alternatively, if the vehicle 1 is an at least partly manually operated vehicle, the collision risk may instead be communicated to the user, e.g. through an audible or visual alarm signal. Then the driver or operator may manually control the vehicle 1 to avoid the collision.

Fig. 4 illustrates another example collision scenario that may be detected using the proposed technique. Fig. 4 illustrates a vehicle 1 and a detected mobile object 2 (here another vehicle). Both vehicles are simulated forward in time from T2 to T5, and the time window is set to four points in time: one step back in time and two steps forward in time. The checks for collision in the vehicle 1 are made for each point in time within the time window. In Fig. 4 the time window is illustrated for T2. Hence, at point in time T2 the proposed method does not only check for collision at the position of the mobile object 2 at T2 but also at T1, T3 and T4. Furthermore, for time T3 the proposed method does not only check for collision at the position of the mobile object 2 at time T3 but also at times T2, T4 and T5. Thereby, the method safeguards against heavy accelerations, decelerations or measurement uncertainties, which would otherwise not be considered. In this example, the comparison of the vehicle's position at T3 and the mobile object's position at T4 reveals a collision. Hence, in this case a collision would be considered detected and the vehicle 1 should be operated to, for example, brake.
With the proposed method, the collision detection can easily be made by investigating overlap between the rectangular shapes representing the vehicle 1 and the mobile object 2 at different points in time. To illustrate the benefit of this, Fig. 4 also illustrates an example of a more complex geometric object (the complex geometric shape with dash-dotted lines) that would be required if using the prior art technique with a safety margin instead of a time window.
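For the rectangular representation, the per-check overlap test reduces to a box intersection. The sketch below uses axis-aligned boxes for brevity and invented names (`aabb`, `boxes_overlap`); a fuller implementation would rotate each rectangle by the body's heading (oriented boxes, e.g. via a separating-axis test).

```python
def aabb(center, length_m, width_m):
    """Axis-aligned bounding box around a centre point,
    returned as (xmin, ymin, xmax, ymax)."""
    cx, cy = center
    return (cx - length_m / 2, cy - width_m / 2,
            cx + length_m / 2, cy + width_m / 2)

def boxes_overlap(a, b):
    """True if the two boxes intersect, i.e. share any area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]
```

Such a test would be evaluated once per (ego position, windowed object position) pair, with box sizes chosen per object type as described above.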
Now turning to Fig. 5, which illustrates the control unit 10 configured to implement the proposed method in more detail. In some embodiments, the control unit 10 is a "unit" in a functional sense. Hence, in some embodiments the control unit 10 is a control arrangement comprising several physical control units that operate in cooperation. The control unit 10 comprises hardware and software. The hardware basically comprises various electronic components on a Printed Circuit Board, PCB. The most important of those components is typically a processor 101, e.g. a microprocessor, along with a memory 102, e.g. an EPROM or a Flash memory chip.
The software (also called firmware) is typically lower-level software code that runs on the microcontroller.
The control unit 10, or more specifically the processor 101 of the control unit 10, is configured to cause the control unit 10 to perform all aspects of the method described above and below. This is typically done by running computer program code stored in the memory 102 in the processor 101 of the control unit 10.
The control unit 10 may also comprise a communication interface 103 for communicating with other control units of the vehicle and/or with external systems. More particularly, the control unit 10 is configured to obtain a first trajectory defining an intended trajectory of the vehicle 1.
The control unit 10 is also configured to determine a current position and speed of a mobile object 2 that the vehicle 1 risks colliding with. The control unit 10 is further configured to estimate, based on the current position and speed of the mobile object 2, a second trajectory defining a predicted future trajectory of the mobile object 2. In some embodiments, the control unit 10 is configured to estimate the predicted future trajectory of the mobile object 2 based on at least one of: map data, an acceleration of the mobile object 2, an angular velocity of the mobile object, a velocity of the mobile object and a known intended trajectory of the mobile object 2. In some embodiments, the mobile object 2 is another vehicle, a pedestrian or an animal. In some embodiments, the predicted future trajectory comprises a plurality of possible alternative future trajectories.
The control unit 10 is further configured to compare a position, at one point in time T2, at one of the first and second trajectories with each of a plurality of positions, at a plurality of consecutive discrete points in time T2, T3, T4, T5, at the other trajectory, wherein the consecutive discrete points in time T2, T3, T4, T5 are within a time window T that includes the one point in time T2, and to determine a future collision based on the comparison revealing at least one overlap of the position of the vehicle 1 and the mobile object 2. In some embodiments, the control unit 10 is configured to determine the collision based on a volume or area of the vehicle 1 and/or of the mobile object 2. In some embodiments, the volume or area is a simple geometric shape.

In some embodiments, a size of the time window is dynamically configurable. In some embodiments, the size of the time window is determined based on a collision risk or based on an estimated uncertainty level within a system of the vehicle 1. In some embodiments, the distances between the consecutive discrete points in time are variable or static. In some embodiments, the time window extends before and/or after the one point in time.

In some embodiments, the control unit 10 is configured to drive the vehicle 1 based on the determined future collision.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method, control arrangement or computer program. Various changes, substitutions and/or alterations may be made without departing from the disclosed embodiments as defined by the appended claims.
The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims.

Claims (16)

CLAIMS
1. A method for predicting a collision between a vehicle (1) and a mobile object (2), the method comprising:
- obtaining (S1) a first trajectory defining an intended trajectory of the vehicle (1),
- determining (S2) a current position and speed of a mobile object (2) that the vehicle (1) risks colliding with,
- estimating (S3), based on the current position and speed of the mobile object (2), a second trajectory defining a predicted future trajectory of the mobile object (2),
- comparing (S4) a position, at one point in time, at one of the first and second trajectories with each of a plurality of positions, at a plurality of consecutive discrete points in time in a time window (T), at the other trajectory, and
- determining (S5) a future collision based on the comparison revealing at least one overlap of the position of the vehicle (1) and the mobile object (2).
2. The method according to claim 1, wherein a size of the time window is dynamically configurable.
3. The method according to claim 2, wherein the size of the time window is determined based on a collision risk or based on an estimated uncertainty level within a system of the vehicle (1).
4. The method according to any of the preceding claims, wherein the time window (T) includes the one point in time.
5. The method according to any of the preceding claims, wherein the distances between the consecutive discrete points in time are variable or static.
6. The method according to any of the preceding claims, wherein the mobile object (2) is another vehicle, a pedestrian or an animal.
7. The method according to any of the preceding claims, wherein the estimating (S3) of the second trajectory is based on at least one of: map data, an acceleration of the mobile object (2), an angular velocity of the mobile object, a velocity of the mobile object and a known intended trajectory of the mobile object (2).
8. The method according to any of the preceding claims, wherein the determining (S5) is based on a volume or area of the vehicle (1) and/or of the mobile object (2).
9. The method according to any of the preceding claims, wherein the volume or area is a simple geometric shape.
10. The method according to any of the preceding claims, wherein the predicted future trajectory comprises a plurality of possible alternative future trajectories.
11. The method according to any of the preceding claims, wherein the time window extends before and/or after the one point in time.
12. The method according to any of the preceding claims, comprising:
- controlling (S6) driving of the vehicle based on the determined future collision.
13. A computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to any one of the preceding claims.
14. A control unit (10) configured for controlling a vehicle (1), the control unit (10) being configured to:
- obtain a first trajectory defining an intended trajectory of the vehicle (1),
- determine a current position and speed of a mobile object (2) that the vehicle (1) risks colliding with,
- estimate, based on the current position and speed of the mobile object (2), a second trajectory defining a predicted future trajectory of the mobile object (2),
- compare a position, at one point in time, at one of the first and second trajectories with each of a plurality of positions, at a plurality of consecutive discrete points in time in a time window (T), at the other trajectory, and
- determine a future collision based on the comparison revealing at least one overlap of the position of the vehicle (1) and the mobile object (2).

15. The control unit (10) according to claim 14, wherein the control unit (10) is configured to perform the method according to any one of claims 2 to 12.
16. A vehicle (1) comprising a control unit (10) according to claim 15.
SE1950883A 2019-07-11 2019-07-11 Method and control unit for predicting a collision between a vehicle and a mobile object SE543781C2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
SE1950883A SE543781C2 (en) 2019-07-11 2019-07-11 Method and control unit for predicting a collision between a vehicle and a mobile object


Publications (2)

Publication Number Publication Date
SE1950883A1 true SE1950883A1 (en) 2021-01-12
SE543781C2 SE543781C2 (en) 2021-07-20

Family

ID=74222111


Country Status (1)

Country Link
SE (1) SE543781C2 (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100228419A1 (en) * 2009-03-09 2010-09-09 Gm Global Technology Operations, Inc. method to assess risk associated with operating an autonomic vehicle control system
US20150241880A1 (en) * 2014-02-26 2015-08-27 Electronics And Telecommunications Research Institute Apparatus and method for sharing vehicle information
DE102015214689A1 (en) * 2014-08-04 2016-02-04 Continental Teves Ag & Co. Ohg System for automated cooperative driving
EP3048022A1 (en) * 2015-01-20 2016-07-27 Toyota Jidosha Kabushiki Kaisha Collision avoidance control system and control method
DE102016218549B3 (en) * 2016-09-27 2017-12-28 Audi Ag Method for determining a collision-related forecast result
DE102016009954A1 (en) * 2016-08-16 2018-02-22 MSR ENGINEERING Heiko Evers e.K. Method for the early detection of collisions between at least two mobile objects and early warning system
US20180114443A1 (en) * 2015-04-02 2018-04-26 Denso Corporation Collision avoidance apparatus, collision avoidance system, and driving support method
WO2019034514A1 (en) * 2017-08-16 2019-02-21 Valeo Schalter Und Sensoren Gmbh Method and a system for collision avoidance of a vehicle


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4350641A1 (en) * 2022-10-04 2024-04-10 aiMotive Kft. Automatic traffic-aware semantic annotation of dynamic objects
WO2024074238A1 (en) * 2022-10-04 2024-04-11 Aimotive Kft. Automatic traffic-aware semantic annotation of dynamic objects

