WO2018165199A4 - Planning for unknown objects by an autonomous vehicle - Google Patents

Planning for unknown objects by an autonomous vehicle

Info

Publication number
WO2018165199A4
Authority
WO
WIPO (PCT)
Prior art keywords
sensors, hypothetical, vehicle, generating, environment
Application number
PCT/US2018/021208
Other languages
French (fr)
Other versions
WO2018165199A1 (en)
Inventor
Emilio FRAZZOLI
Baoxing Qin
Original Assignee
nuTonomy Inc.
Priority claimed from US15/451,703 external-priority patent/US10234864B2/en
Priority claimed from US15/451,747 external-priority patent/US10095234B2/en
Priority claimed from US15/451,734 external-priority patent/US10281920B2/en
Application filed by nuTonomy Inc. filed Critical nuTonomy Inc.
Priority to CN201880030067.6A priority Critical patent/CN114830202A/en
Priority to EP18764238.4A priority patent/EP3593337A4/en
Publication of WO2018165199A1 publication Critical patent/WO2018165199A1/en
Publication of WO2018165199A4 publication Critical patent/WO2018165199A4/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G08G 1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G 1/163 Decentralised systems, e.g. inter-vehicle communication, involving continuous checking
    • G08G 1/164 Centralised systems, e.g. external to vehicles
    • G08G 1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 2556/00 Input parameters relating to data
    • B60W 2556/05 Big data

Abstract

Among other things, a world model is maintained of an environment of a vehicle. A hypothetical object in the environment that cannot be perceived by sensors of the vehicle is included in the world model.
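
The abstract summarizes the core idea: the planner's world model holds both objects that the sensors actually perceive and hypothetical objects placed in regions the sensors cannot see, and driving decisions are checked against both. A minimal sketch of such a model follows; the class and field names (WorldModel, ObjectTrack, is_hypothetical) are illustrative assumptions and not terms taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ObjectTrack:
    """One object in the world model, real or hypothesized."""
    obj_type: str                  # e.g. "vehicle", "pedestrian", "bicycle"
    position: Tuple[float, float]  # (x, y) in a map frame
    speed: float                   # m/s
    heading: float                 # radians
    is_hypothetical: bool = False  # True if placed in an unperceived region

@dataclass
class WorldModel:
    """Environment model maintained for the vehicle."""
    known_objects: List[ObjectTrack] = field(default_factory=list)
    hypothetical_objects: List[ObjectTrack] = field(default_factory=list)

    def all_objects(self) -> List[ObjectTrack]:
        # The planner treats hypothetical objects like real ones when
        # generating and checking candidate trajectories.
        return self.known_objects + self.hypothetical_objects
```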

Claims

AMENDED CLAIMS received by the International Bureau on 10 September 2018 (10.09.2018)
1. A computer-implemented method comprising:
maintaining a model of an environment of a vehicle, the environment of the vehicle including a first section and a second section, wherein the first section is capable of being perceived by one or more sensors of the vehicle based on sensor conditions, wherein the second section is not capable of being perceived by the one or more sensors of the vehicle based on sensor conditions and the second section is adjacent to a boundary with the first section;
selectively generating a hypothetical object that cannot be perceived by the one or more sensors of the vehicle within a particular location of the second section of the environment of the vehicle; and
updating the model with the generated hypothetical object at the particular location in the environment of the vehicle.
2. The method of claim 1, in which the hypothetical object comprises a moving object.
3. The method of claim 1, in which the hypothetical object comprises an object that uses a path of travel from which the vehicle is excluded.
4. The method of claim 1, in which the hypothetical object comprises at least one of: a second vehicle, a bicycle, a bus, a train, a pedestrian, and an animal.
5. The method of claim 1, wherein selectively generating the hypothetical object comprises selecting a type of the hypothetical object and an attribute of the hypothetical object probabilistically based on objects previously observed in the environment.
6. The method of claim 5, in which the attribute comprises a size or a speed.
7. The method of claim 1, further comprising including in the model known objects in the environment that are perceived by the one or more sensors of the vehicle or are otherwise known.
8. The method of claim 7, in which the hypothetical object and the known objects maintained by the model are in different parts of the environment.
9. The method of claim 1, wherein selectively generating the hypothetical object
comprises: obtaining historical data on objects previously observed in the environment at the particular location;
selecting a type of the hypothetical object based on the historical data;
selecting an attribute of the hypothetical object probabilistically based on the objects previously observed in the environment at the particular location;
determining, based on the historical data and objects of a same selected type previously observed in the environment at the particular location, that the selected type of the hypothetical object at the particular location behaves with particular attributes; and in response to determining that the selected type of the hypothetical object at the particular location behaves with particular attributes, selecting an additional attribute of the hypothetical object probabilistically based on the objects previously observed in the environment at the particular location.
10. The method of claim 1, in which the first section and the second section are separated by a boundary.
11. The method of claim 10, comprising detecting of the boundary.
12. The method of claim 11, in which detecting of the boundary comprises using data from one or more sensors to distinguish an observable ground from a foreground that obscures a portion of the observable ground.
13. The method of claim 12, in which the one or more sensors comprise sensors of the vehicle or sensors offboard the vehicle or both.
14. The method of claim 1, further comprising:
querying traffic lane information from a road network database; and updating, based on the traffic lane information, the generated hypothetical object within the model.
15. The method of claim 1, wherein updating the hypothetical object in the model comprises using stored data to infer a possible location of the hypothetical object.
16. The method of claim 1, wherein updating the hypothetical object in the model comprises determining a location of the vehicle based on a road network database and one or more sensors.
17. The method of claim 1, wherein updating the hypothetical object in the model comprises querying traffic lane information from a database and discretizing a traffic lane into discretized points.
18. The method of claim 1, wherein updating the hypothetical object in the model comprises generating an unknown skeleton of discretized points of a lane that cannot be perceived by the one or more sensors of the vehicle.
19. The method of claim 18, wherein updating the hypothetical object in the model comprises (a) generating a representative shape at a discretized point of the unknown skeleton, and (b) evaluating if the representative shape is completely within the second section of the environment.
20. The method of claim 1, wherein updating the hypothetical object in the model comprises treating a representative shape as the hypothetical object.
21. The method of claim 1, wherein updating the hypothetical object in the model comprises applying temporal filtering to determine a location of the hypothetical object.
22. The method of claim 21, in which applying the temporal filtering comprises smoothing an unknown skeleton by a forward propagated unknown skeleton, wherein the forward propagated unknown skeleton is generated by moving forward an old unknown skeleton along a traffic lane.
23. The method of claim 1, wherein updating the hypothetical object in the model comprises associating one or more attributes with the hypothetical object.
24. The method of claim 23, in which the one or more of the attributes are related to a possible motion state of the hypothetical object.
25. The method of claim 24, in which the motion state comprises a stationary condition.
26. The method of claim 24, in which the motion state comprises a moving condition.
27. The method of claim 24, in which the motion state comprises a speed and a moving direction.
28. The method of claim 27, in which the speed is set to less than or equal to a predetermined maximum value.
29. The method of claim 28, in which the predetermined maximum value comprises a speed limit.
30. The method of claim 28, in which the predetermined maximum value comprises a quantity derived from other objects concurrently or previously observed in the environment.
31. The method of claim 28, in which the predetermined maximum value comprises a quantity derived from at least one of historical data, road configuration, traffic rules, an event, a time, and a weather condition.
32. The method of claim 1, in which the maintaining of the model comprises accessing a database comprising road network information.
33. The method of claim 1, in which the maintaining of the model comprises using data from the one or more sensors.
34. The method of claim 33, in which the one or more sensors comprise a radar sensor.
35. The method of claim 33, in which the one or more sensors comprise a lidar sensor.
36. The method of claim 33, in which the one or more sensors comprise a camera sensor.
37. The method of claim 36, in which the camera sensor comprises a stereo camera sensor.
38. The method of claim 36, in which the camera sensor comprises a monocular camera sensor.
39. The method of claim 1, comprising updating a trajectory for the vehicle based on the model.
40. The method of claim 39, comprising executing the trajectory for the vehicle.
41. The method of claim 1, in which the vehicle comprises an autonomous vehicle.
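
Claims 17 through 20 above outline a concrete procedure: query and discretize a traffic lane from the road-network database, keep the discretized points the sensors cannot perceive (the unknown skeleton), place a representative shape at each such point, and retain it as a hypothetical object only if the shape lies entirely within the unperceived section. The sketch below is one hedged reading of those steps; the helper names, the axis-aligned footprint, and the occlusion predicates (`is_point_perceivable`, `point_is_unperceived`) are assumptions introduced for illustration.

```python
import numpy as np
from typing import List, Tuple

Point = Tuple[float, float]

def discretize_lane(centerline: List[Point], spacing: float = 2.0) -> List[Point]:
    """Resample a lane centerline (e.g. queried from a road-network database)
    at a fixed spacing, giving discretized lane points."""
    pts = np.asarray(centerline, dtype=float)
    seg = np.diff(pts, axis=0)
    dist = np.concatenate(([0.0], np.cumsum(np.hypot(seg[:, 0], seg[:, 1]))))
    samples = np.arange(0.0, dist[-1], spacing)
    return list(zip(np.interp(samples, dist, pts[:, 0]),
                    np.interp(samples, dist, pts[:, 1])))

def unknown_skeleton(lane_points: List[Point], is_point_perceivable) -> List[Point]:
    """Keep only the discretized lane points the sensors cannot currently perceive."""
    return [p for p in lane_points if not is_point_perceivable(p)]

def representative_shape(point: Point, half_length: float = 2.5,
                         half_width: float = 1.0) -> List[Point]:
    """Corners of a typical vehicle footprint centered at a skeleton point
    (an axis-aligned stand-in for an oriented bounding box)."""
    x, y = point
    return [(x - half_length, y - half_width), (x + half_length, y - half_width),
            (x + half_length, y + half_width), (x - half_length, y + half_width)]

def hypothetical_objects(lane_points, is_point_perceivable, point_is_unperceived):
    """Place a representative shape at each unknown-skeleton point and keep it
    as a hypothetical object only if every corner of the shape lies inside the
    unperceived section of the environment."""
    objs = []
    for p in unknown_skeleton(lane_points, is_point_perceivable):
        shape = representative_shape(p)
        if all(point_is_unperceived(c) for c in shape):
            objs.append({"type": "vehicle", "shape": shape, "hypothetical": True})
    return objs
```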
42. A computer implemented method comprising:
receiving, from a sensor, data representing an observable part of an environment of a vehicle;
generating data representing a non-observable part of the environment, including data representing at least one hypothetical object in the non-observable part of the environment; and
generating commands for operation of the vehicle within the environment, the commands being dependent on the data representing the observable part of the environment and on the data representing the hypothetical object in the non-observable part of the environment.
43. The method of claim 42, in which the hypothetical object comprises a moving object.
44. The method of claim 42, in which the hypothetical object comprises an object that uses a path of travel from which the vehicle is excluded.
45. The method of claim 42, in which the hypothetical object comprises at least one of: a vehicle, a bicycle, a bus, a train, a pedestrian, and an animal.
46. The method of claim 42, in which generating data representing a non-observable part of the environment comprises selecting a type of the hypothetical object and an attribute of the hypothetical object probabilistically based on objects previously observed in the environment.
47. The method of claim 46, in which the hypothetical object comprises a vehicle and the attribute comprises a size and a speed.
48. The method of claim 42, in which the observable part and the non-observable part are separated by a boundary.
49. The method of claim 48, comprising detecting of the boundary.
50. The method of claim 49, in which detecting the boundary comprises using data from the sensor to distinguish the observable ground from a foreground that obscures a portion of the ground.
51. The method of claim 42, in which generating data representing the non-observable part of the environment comprises using stored data to infer a possible location of the hypothetical object.
52. The method of claim 42, in which generating data representing the non-observable part of the environment comprises querying traffic lane information from a road network database.
53. The method of claim 42, in which generating data representing the non-observable part of the environment comprises determining a location of the vehicle based on a road network database and one or more sensors.
54. The method of claim 42, in which generating data representing the non-observable part of the environment comprises querying traffic lane information from a database and discretizing the traffic lane into discretized points.
55. The method of claim 42, in which generating data representing the non-observable part of the environment comprises generating an unknown skeleton of discretized points of a lane that cannot be perceived by the sensor.
56. The method of claim 55, in which generating data representing the non-observable part of the environment comprises (a) generating a representative shape at a discretized point of the unknown skeleton, and (b) evaluating if the representative shape is completely within the non-observable part.
57. The method of claim 42, in which generating data representing the non-observable part of the environment comprises treating a representative shape as the hypothetical object.
58. The method of claim 42, in which generating data representing the non-observable part of the environment comprises applying temporal filtering to determine a location of the hypothetical object.
59. The method of claim 58, in which applying the temporal filtering comprises smoothing an unknown skeleton by a forward propagated unknown skeleton, wherein the forward propagated unknown skeleton is generated by moving forward an old unknown skeleton along a traffic lane.
60. The method of claim 42, in which generating the data representing the hypothetical object comprises associating one or more attributes with the hypothetical object.
61. The method of claim 60, in which the one or more of the attributes are related to a possible motion state of the hypothetical object.
62. The method of claim 61, in which the motion state comprises a stationary condition.
63. The method of claim 61, in which the motion state comprises a moving condition.
64. The method of claim 61, in which the motion state comprises a speed and a moving direction.
65. The method of claim 64, in which the speed is set to less than or equal to a predetermined maximum value.
66. The method of claim 65, in which the predetermined maximum value comprises a speed limit.
67. The method of claim 65, in which the predetermined maximum value comprises a quantity derived from other objects concurrently or previously observed in the environment.
68. The method of claim 65, in which the predetermined maximum value comprises a quantity derived from historical data, road configuration, traffic rules, an event, a time, a weather condition, or a combination of two or more of them.
69. The method of claim 42, comprising accessing a database comprising road network information.
70. The method of claim 69, further comprising using data from a second set of sensors.
71. The method of claim 70, in which the sensor or a second set of sensors comprises a radar sensor.
72. The method of claim 70, in which the sensor or a second set of sensors comprises a lidar sensor.
73. The method of claim 70, in which the sensor or a second set of sensors comprises a camera sensor.
74. The method of claim 70, in which the sensor or a second set of sensors comprises a stereo camera sensor.
75. The method of claim 70, in which the sensor or a second set of sensors comprises a monocular camera sensor.
76. The method of claim 42, in which generating the commands comprises updating a trajectory for the vehicle.
77. The method of claim 76, comprising executing the trajectory for the vehicle.
78. The method of claim 42, in which the vehicle comprises an autonomous vehicle.
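
Claims 21-22 and 58-59 describe temporal filtering in which the current unknown skeleton is smoothed against a forward-propagated copy of the previous one, obtained by moving the old skeleton forward along the traffic lane. The sketch below is one plausible reading, with skeleton points reduced to arclength along the lane centerline; the consistency test and its tolerance are assumptions, since the claims do not specify how the smoothing combines the two skeletons.

```python
from typing import List

def propagate_skeleton(old_skeleton_s: List[float], dt: float,
                       assumed_speed: float) -> List[float]:
    """Move every point of the previous unknown skeleton forward along the lane.
    Points are given as arclength s (meters) along the lane centerline."""
    return [s + assumed_speed * dt for s in old_skeleton_s]

def smooth_skeleton(new_skeleton_s: List[float], old_skeleton_s: List[float],
                    dt: float, assumed_speed: float, tol: float = 1.0) -> List[float]:
    """Keep a newly computed unknown-skeleton point only if it is consistent with
    the forward-propagated previous skeleton; this suppresses points that flicker
    in and out of the unperceived region between sensor frames."""
    propagated = propagate_skeleton(old_skeleton_s, dt, assumed_speed)
    if not propagated:
        return list(new_skeleton_s)
    return [s for s in new_skeleton_s
            if min(abs(s - p) for p in propagated) <= tol]
```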
79. A computer-implemented method comprising:
generating commands to cause an autonomous vehicle to drive on a road network at specified speeds and make specified turns to reach a goal position; and
updating the commands in response to current data representing a hypothetical speed and moving direction of a hypothetical vehicle also being driven on the road network, the commands being updated to reduce a risk of the autonomous vehicle colliding with another vehicle on the road network.
80. The method of claim 79, in which the hypothetical speed and moving direction are probabilistically derived based on vehicles previously observed in the environment.
81. The method of claim 79, in which the observable part and the non-observable part are separated by a boundary.
82. The method of claim 81, comprising detecting of the boundary.
83. The method of claim 82, in which detecting of the boundary comprises using data from one or more sensors to distinguish an observable ground from a foreground that obscures a portion of the ground.
84. The method of claim 83, in which the one or more sensors comprise sensors onboard the autonomous vehicle.
85. The method of claim 83, in which the one or more sensors comprise sensors offboard the autonomous vehicle.
86. The method of claim 79, comprising generating of the current data based on known objects perceived by one or more sensors.
87. The method of claim 86, in which generating of the current data comprises querying traffic lane information from a road network database.
88. The method of claim 86, in which generating of the current data comprises using stored data to infer a possible location of the hypothetical vehicle.
89. The method of claim 86, in which generating of the current data comprises determining a location of the autonomous vehicle based on a road network database and one or more sensors.
90. The method of claim 86, in which generating of the current data comprises querying traffic lane information from a database and discretizing the traffic lane into discretized points.
91. The method of claim 86, in which generating of the current data comprises generating an unknown skeleton of discretized points of a lane that cannot be perceived by sensors.
92. The method of claim 91, in which generating of the current data comprises (a) generating a representative shape at a discretized point of the unknown skeleton, and (b) evaluating if the representative shape is completely within the unperceived world.
93. The method of claim 92, in which generating of the current data comprises treating the representative shape as the hypothetical vehicle.
94. The method of claim 86, in which generating of the current data comprises applying temporal filtering to determine a location of the hypothetical vehicle.
95. The method of claim 94, in which applying the temporal filtering comprises smoothing an unknown skeleton by a forward propagated unknown skeleton, wherein the forward propagated unknown skeleton is generated by moving forward an old unknown skeleton along a traffic lane.
96. The method of claim 86, in which generating of the current data comprises associating one or more attributes with the hypothetical vehicle.
97. The method of claim 96, in which the one or more of the attributes are related to a possible motion state of the hypothetical vehicle.
98. The method of claim 97, in which the motion state comprises a stationary condition.
99. The method of claim 79, in which the hypothetical speed is set to less than or equal to a predetermined maximum value.
100. The method of claim 99, in which the predetermined maximum value comprises a speed limit.
101. The method of claim 99, in which the predetermined maximum value comprises a quantity derived from other objects concurrently or previously observed in the environment.
102. The method of claim 99, in which the predetermined maximum value comprises a quantity derived from historical data, road configuration, traffic rules, an event, a time, a weather condition, or a combination of two or more of them.
103. The method of claim 79, comprising accessing a database comprising road network information.
104. The method of claim 79, in which the sensor comprises a radar sensor.
105. The method of claim 79, in which the sensor comprises a lidar sensor.
106. The method of claim 79, in which the sensor comprises a camera sensor.
107. The method of claim 106, in which the camera sensor comprises a stereo camera sensor.
108. The method of claim 106, in which the camera sensor comprises a monocular camera sensor.
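
Claims 99-102 (and the corresponding claims in the other families) cap the hypothetical vehicle's speed at a predetermined maximum that may come from the posted speed limit, from objects observed concurrently or previously, or from historical data and conditions such as weather. A small sketch of one way to pick that cap follows; the priority order among the sources and the weather scaling factor are illustrative assumptions, not the patent's prescribed combination.

```python
def hypothetical_speed_cap(speed_limit: float,
                           observed_speeds=None,
                           historical_p95=None,
                           weather_factor: float = 1.0) -> float:
    """Pick the predetermined maximum speed assigned to a hypothetical vehicle.
    Candidate sources: speeds of vehicles observed concurrently or previously,
    a percentile from historical data at this location, or the posted speed
    limit, optionally scaled for conditions such as weather."""
    if observed_speeds:
        cap = max(observed_speeds)      # fastest traffic actually seen nearby
    elif historical_p95 is not None:
        cap = historical_p95            # e.g. 95th percentile of past observations
    else:
        cap = speed_limit               # fall back to the posted limit
    return cap * weather_factor

# Example: 13.4 m/s posted limit (about 30 mph), traffic observed at up to 11 m/s,
# light-rain scaling of 0.9 (all values illustrative).
cap = hypothetical_speed_cap(13.4, observed_speeds=[9.0, 11.0], weather_factor=0.9)
```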
109. An apparatus comprising:
an autonomous vehicle comprising:
a) steering, acceleration, deceleration, or gear selection devices or combinations of them configured to effect movement of the autonomous vehicle on a road network; and
b) a computer having a processor to execute a process (i) to generate commands to the steering, acceleration, deceleration, or gear selection devices or combinations of them to move the autonomous vehicle in accordance with driving decisions; and (ii) to update the commands (a) in response to data representing motion characteristics of a hypothetical vehicle being driven on the road network and (b) based on discretized points of a lane that cannot be perceived by one or more sensors.
110. The apparatus of claim 109, in which data representing motion characteristics of a hypothetical vehicle is probabilistically derived based on vehicles previously observed in an environment of the autonomous vehicle.
111. The apparatus of claim 109, in which regions perceivable and unperceivable by the sensors are separated by a boundary.
112. The apparatus of claim 111, in which the computer detects the boundary.
113. The apparatus of claim 112, in which detecting of the boundary is based on using data from the sensors to distinguish a perceivable ground from a foreground that obscures a portion of the ground.
114. The apparatus of claim 113, in which the sensors comprise sensors onboard the autonomous vehicle.
115. The apparatus of claim 113, in which the sensors comprise sensors offboard the autonomous vehicle.
116. The apparatus of claim 109, in which the computer generates the data based on known objects perceived by the sensors.
117. The apparatus of claim 116, in which generating of the data comprises querying traffic lane information from a road network database.
118. The apparatus of claim 116, in which generating of the data comprises using stored data to infer a possible location of the hypothetical vehicle.
119. The apparatus of claim 116, in which generating of the data comprises determining a location of the autonomous vehicle based on a road network database and the sensors.
120. The apparatus of claim 116, in which generating of the data comprises querying traffic lane information from a database and discretizing the traffic lane into discretized points.
121. The apparatus of claim 116, in which generating of the data comprises generating an unknown skeleton of discretized points of a lane that cannot be perceived by sensors.
122. The apparatus of claim 121, in which generating of the data comprises (a) generating a representative shape at a discretized point of the lane that cannot be perceived by sensors, and (b) evaluating if the representative shape is completely within an unperceivable region.
123. The apparatus of claim 122, in which generating of the data comprises treating the representative shape as the hypothetical vehicle.
124. The apparatus of claim 116, in which generating of the data comprises applying temporal filtering to determine a location of the hypothetical vehicle.
125. The apparatus of claim 124, in which applying the temporal filtering comprises smoothing an unknown skeleton by a forward propagated unknown skeleton, wherein the forward propagated unknown skeleton is generated by moving forward an old unknown skeleton along a traffic lane.
126. The apparatus of claim 116, in which generating of the data comprises associating one or more attributes with the hypothetical vehicle.
127. The apparatus of claim 126, in which the one or more of the attributes are related to a possible motion state of the hypothetical vehicle.
128. The apparatus of claim 127, in which the possible motion state comprises a stationary condition.
129. The apparatus of claim 109, in which the motion characteristics comprise a hypothetical speed being set to less than or equal to a predetermined maximum value.
130. The apparatus of claim 129, in which the predetermined maximum value comprises a speed limit.
131. The apparatus of claim 129, in which the predetermined maximum value comprises a quantity derived from other objects concurrently or previously perceived in the environment.
132. The apparatus of claim 129, in which the predetermined maximum value comprises a quantity derived from historical data, road configuration, traffic rules, an event, a time, a weather condition, or a combination of two or more of them.
133. The apparatus of claim 109, in which the computer accesses a database comprising road network information.
134. The apparatus of claim 109, in which the sensors comprise a radar sensor.
135. The apparatus of claim 109, in which the sensors comprise a lidar sensor.
136. The apparatus of claim 109, in which the sensors comprise a camera sensor.
137. The apparatus of claim 136, in which the camera sensor comprises a stereo camera sensor.
138. The apparatus of claim 136, in which the camera sensor comprises a monocular camera sensor.
139. An apparatus comprising:
an autonomous vehicle comprising:
a) steering, acceleration, deceleration, or gear selection devices or combinations of them configured to effect movement of the autonomous vehicle on a road network; and
b) a computer having a processor to execute a process (i) to generate commands to the steering, acceleration, deceleration, or gear selection devices or combinations of them to move the autonomous vehicle in accordance with driving decisions; and (ii) to update the commands (a) in response to data representing motion characteristics of a hypothetical vehicle being driven on the road network and (b) based on temporal filtering to determine a location of the hypothetical vehicle.
140. The apparatus of claim 139, in which data representing motion characteristics of a hypothetical vehicle is probabilistically derived based on vehicles previously observed in an environment of the autonomous vehicle.
141. The apparatus of claim 139, in which regions perceivable and unperceivable by the sensors are separated by a boundary.
142. The apparatus of claim 141, in which the computer detects the boundary.
143. The apparatus of claim 142, in which detecting of the boundary is based on using data from the sensors to distinguish a perceivable ground from a foreground that obscures a portion of the ground.
144. The apparatus of claim 143, in which the sensors comprise sensors onboard the autonomous vehicle.
145. The apparatus of claim 143, in which the sensors comprise sensors offboard the autonomous vehicle.
146. The apparatus of claim 139, in which the computer generates the data based on known objects perceived by the sensors.
147. The apparatus of claim 146, in which generating of the data comprises querying traffic lane information from a road network database.
148. The apparatus of claim 146, in which generating of the data comprises using stored data to infer a possible location of the hypothetical vehicle.
149. The apparatus of claim 146, in which generating of the data comprises determining a location of the autonomous vehicle based on a road network database and the sensors.
150. The apparatus of claim 146, in which generating of the data comprises querying traffic lane information from a database and discretizing the traffic lane into discretized points.
151. The apparatus of claim 146, in which generating of the data comprises generating an unknown skeleton of discretized points of a lane that cannot be perceived by sensors.
152. The apparatus of claim 146, in which generating of the data comprises (a) generating a representative shape at a discretized point of the lane that cannot be perceived by sensors, and (b) evaluating if the representative shape is completely within an unperceivable region.
153. The apparatus of claim 152, in which generating of the data comprises treating the representative shape as the hypothetical vehicle.
154. The apparatus of claim 146, in which applying the temporal filtering comprises smoothing an unknown skeleton by a forward propagated unknown skeleton, wherein the forward propagated unknown skeleton is generated by moving forward an old unknown skeleton along a traffic lane.
155. The apparatus of claim 146, in which generating of the data comprises associating one or more attributes with the hypothetical vehicle.
156. The apparatus of claim 155, in which the one or more of the attributes are related to a possible motion state of the hypothetical vehicle.
157. The apparatus of claim 156, in which the possible motion state comprises a stationary condition.
158. The apparatus of claim 146, in which the motion characteristics comprise a hypothetical speed being set to less than or equal to a predetermined maximum value.
159. The apparatus of claim 158, in which the predetermined maximum value comprises a speed limit.
160. The apparatus of claim 159, in which the predetermined maximum value comprises a quantity derived from other objects concurrently or previously perceived in the environment.
161. The apparatus of claim 159, in which the predetermined maximum value comprises a quantity derived from historical data, road configuration, traffic rules, an event, a time, a weather condition, or a combination of two or more of them.
162. The apparatus of claim 146, in which the computer accesses a database comprising road network information.
163. The apparatus of claim 146, in which the sensors comprise a radar sensor.
164. The apparatus of claim 146, in which the sensors comprise a lidar sensor.
165. The apparatus of claim 146, in which the sensors comprise a camera sensor.
166. The apparatus of claim 165, in which the camera sensor comprises a stereo camera sensor.
167. The apparatus of claim 165, in which the camera sensor comprises a monocular camera sensor.
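
Several of the claims above (12-13, 50, 83, 113, 143) detect the boundary between the perceived and unperceived sections by using sensor data to distinguish observable ground from a foreground that obscures part of it. The sketch below illustrates one such test on an assumed lidar height grid ordered near-to-far along each ray; the grid layout, ground plane, and occluder-height threshold are assumptions for illustration and are not taken from the patent.

```python
import numpy as np

def observable_ground_mask(range_image_z: np.ndarray,
                           ground_z: float = 0.0,
                           occluder_height: float = 0.3) -> np.ndarray:
    """Separate observable ground from occluding foreground along each sensor ray.
    `range_image_z` holds return heights, one column per azimuth and rows ordered
    near-to-far; a return rising more than `occluder_height` above the ground plane
    is treated as foreground, and every cell behind it on the same ray is taken to
    belong to the unperceived section."""
    foreground = range_image_z > (ground_z + occluder_height)
    n_rows = range_image_z.shape[0]
    # Index of the first occluding return per ray; rays with no occluder see
    # ground all the way out.
    first_occluder = np.where(foreground.any(axis=0),
                              foreground.argmax(axis=0), n_rows)
    rows = np.arange(n_rows)[:, None]
    return rows < first_occluder          # True where the ground is observable
```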
PCT/US2018/021208 2017-03-07 2018-03-06 Planning for unknown objects by an autonomous vehicle WO2018165199A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880030067.6A CN114830202A (en) 2017-03-07 2018-03-06 Planning for unknown objects by autonomous vehicles
EP18764238.4A EP3593337A4 (en) 2017-03-07 2018-03-06 Planning for unknown objects by an autonomous vehicle

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US15/451,703 US10234864B2 (en) 2017-03-07 2017-03-07 Planning for unknown objects by an autonomous vehicle
US15/451,703 2017-03-07
US15/451,747 US10095234B2 (en) 2017-03-07 2017-03-07 Planning for unknown objects by an autonomous vehicle
US15/451,734 2017-03-07
US15/451,747 2017-03-07
US15/451,734 US10281920B2 (en) 2017-03-07 2017-03-07 Planning for unknown objects by an autonomous vehicle

Publications (2)

Publication Number Publication Date
WO2018165199A1 (en) 2018-09-13
WO2018165199A4 (en) 2018-10-11

Family

ID=63448894

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/021208 WO2018165199A1 (en) 2017-03-07 2018-03-06 Planning for unknown objects by an autonomous vehicle

Country Status (3)

Country Link
EP (1) EP3593337A4 (en)
CN (1) CN114830202A (en)
WO (1) WO2018165199A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111683851B (en) * 2018-12-26 2023-09-12 百度时代网络技术(北京)有限公司 Mutual avoidance algorithm for self-steering lanes for autopilot
US20230159026A1 (en) * 2021-11-19 2023-05-25 Motional Ad Llc Predicting Motion of Hypothetical Agents

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006060518A2 (en) * 2004-11-30 2006-06-08 Circumnav Networks, Inc. Methods for deducing road geometry and connectivity
JP4400584B2 (en) * 2006-03-01 2010-01-20 トヨタ自動車株式会社 Obstacle detection method and obstacle detection device
JP5620147B2 (en) * 2010-05-24 2014-11-05 株式会社豊田中央研究所 Movable object prediction apparatus and program
US8970701B2 (en) * 2011-10-21 2015-03-03 Mesa Engineering, Inc. System and method for predicting vehicle location
US9633564B2 (en) * 2012-09-27 2017-04-25 Google Inc. Determining changes in a driving environment based on vehicle behavior
DE102013216994A1 (en) * 2013-08-27 2015-03-05 Robert Bosch Gmbh Speed assistant for a motor vehicle
CN105205196B (en) * 2014-06-27 2018-08-03 国际商业机器公司 Method and system for generating road network
EP3842747A1 (en) * 2015-02-10 2021-06-30 Mobileye Vision Technologies Ltd. Sparse map for autonomous vehicle navigation
EP3091370B1 (en) * 2015-05-05 2021-01-06 Volvo Car Corporation Method and arrangement for determining safe vehicle trajectories

Also Published As

Publication number Publication date
WO2018165199A1 (en) 2018-09-13
EP3593337A1 (en) 2020-01-15
CN114830202A (en) 2022-07-29
EP3593337A4 (en) 2021-01-06

Similar Documents

Publication Publication Date Title
US11714417B2 (en) Initial trajectory generator for motion planning system of autonomous vehicles
US20240140487A1 (en) Autonomous Vehicles Featuring Machine-Learned Yield Model
JP6791905B2 (en) Systems and methods for dynamic vehicle control according to traffic
US20190025843A1 (en) Systems and Methods for Speed Limit Context Awareness
US10882522B2 (en) Systems and methods for agent tracking
WO2019093190A1 (en) Information processing device, vehicle, moving body, information processing method, and program
CN109643118B (en) Influencing a function of a vehicle based on function-related information about the environment of the vehicle
US20210009119A1 (en) Method and control device for a system for the control of a motor vehicle
US10730531B1 (en) Machine-learning based vehicle motion control system
US20220188695A1 (en) Autonomous vehicle system for intelligent on-board selection of data for training a remote machine learning model
US10546499B2 (en) Systems and methods for notifying an occupant of a cause for a deviation in a vehicle
US10474149B2 (en) Autonomous behavior control using policy triggering and execution
CN112394725B (en) Prediction and reaction field of view based planning for autopilot
US11292489B2 (en) Systems and methods for information aggregation and event management in a vehicle
EP3869342A1 (en) System and method for generating simulation scenario definitions for an autonomous vehicle system
WO2019093193A1 (en) Information processing device, vehicle, mobile body, information processing method, and program
US11748995B2 (en) Object state tracking and prediction using supplemental information
JP2021130457A (en) Vehicle control device
EP4129797A1 (en) Method and system for training an autonomous vehicle motion planning model
WO2022159261A1 (en) Systems and methods for scenario dependent trajectory scoring
DE112022003364T5 (en) COMPLEMENTARY CONTROL SYSTEM FOR AN AUTONOMOUS VEHICLE
WO2018165199A4 (en) Planning for unknown objects by an autonomous vehicle
WO2018198769A1 (en) Surrounding environment recognition device, display control device
DE112020000357T5 (en) CLASSIFYING PERCEIVED OBJECTS BASED ON ACTIVITY
RU2782972C1 (en) Method for forming a recuperative energy-efficient track of an operated vehicle equipped with an electric power recuperation system during braking, when the operated vehicle is moving along a section of the path that includes a possible deceleration point

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18764238; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2018764238; Country of ref document: EP; Effective date: 20191007)