EP4059003A1 - Collision monitoring using statistical models - Google Patents

Collision monitoring using statistical models

Info

Publication number
EP4059003A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
determining
probability
additional
estimated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20887777.9A
Other languages
German (de)
English (en)
Other versions
EP4059003A4 (fr)
Inventor
Andrew Scott CREGO
Ali Ghasemzadehkhoshgroudi
Sai Anurag MODALAVALASA
Andreas Christian Reschka
Siavosh Rezvan Behbahani
Lingqiao Qin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zoox Inc
Original Assignee
Zoox Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/683,005 (US11648939B2)
Priority claimed from US16/682,971 (US11697412B2)
Application filed by Zoox Inc filed Critical Zoox Inc
Publication of EP4059003A1
Publication of EP4059003A4

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0953Predicting travel path or likelihood of collision the prediction being responsive to vehicle dynamic parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04Traffic conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0015Planning or execution of driving tasks specially adapted for safety
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • B60W60/0027Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • An autonomous vehicle can use an autonomous vehicle controller to guide the autonomous vehicle through an environment.
  • the autonomous vehicle controller can use planning methods, apparatuses, and systems to determine a drive path and guide the autonomous vehicle through the environment that contains dynamic objects (e.g., vehicles, pedestrians, animals, and the like) and static objects (e.g., buildings, signage, stalled vehicles, and the like).
  • the autonomous vehicle controller may take into account predicted behavior of the dynamic objects as the vehicle navigates through the environment.
  • FIG. 1 is an illustration of an environment that includes a vehicle performing collision monitoring using error models and/or system data, in accordance with embodiments of the disclosure.
  • FIG. 2 is an illustration of an example of a vehicle analyzing sensor data using error models in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • FIG. 3 is an illustration of another example of a vehicle analyzing sensor data using error models in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • FIG. 4 is an illustration of an example of a vehicle analyzing sensor data and system data in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • FIG. 5 is an illustration of another example of a vehicle analyzing sensor data and system data in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • FIG. 6 illustrates an example graph illustrating a vehicle determining probabilities of collision, in accordance with embodiments of the disclosure.
  • FIG. 7 illustrates generating error model data based at least in part on vehicle data and ground truth data, in accordance with embodiments of the present disclosure.
  • FIG. 8 illustrates computing device(s) generating perception error model data based at least in part on log data generated by the vehicle(s) and ground truth data, in accordance with embodiments of the present disclosure.
  • FIG. 9 illustrates generating uncertainty data based at least in part on vehicle data and ground truth data, in accordance with embodiments of the present disclosure.
  • FIG. 10 depicts a block diagram of an example system for implementing the techniques described herein, in accordance with embodiments of the present disclosure.
  • FIG. 11 depicts an example process for performing collision monitoring using error models, in accordance with embodiments of the disclosure.
  • FIG. 12 depicts an example process for using error models to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • FIGS. 13A-13B depict an example process for performing collision monitoring using uncertainties, in accordance with embodiments of the disclosure.
  • FIG. 14 depicts an example process for using uncertainties to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • an autonomous vehicle can use a controller to guide the autonomous vehicle through an environment.
  • the controller can use planning methods, apparatuses, and systems to determine a drive path and guide the autonomous vehicle through the environment that contains dynamic objects (e.g., vehicles, pedestrians, animals, and the like) and/or static objects (e.g., buildings, signage, stalled vehicles, and the like).
  • the autonomous vehicle controller may employ safety factors when operating in the environment.
  • systems and controllers may comprise complex systems which are incapable of being inspected. Despite the fact that there may not be methods for determining errors or uncertainties associated with such systems and controllers, such errors and uncertainties may be necessary for informing such a vehicle of safe operation in an environment.
  • an autonomous vehicle may use error models and/or system uncertainties to determine, at a later time, estimated locations of both the autonomous vehicle and one or more objects.
  • the estimated locations may include distributions of probability locations associated with the autonomous vehicle and the one or more objects.
  • the autonomous vehicle may then determine a probability of collision between the autonomous vehicle and the one or more objects using the estimated locations. Based at least in part on the probability of collision, the autonomous vehicle may perform one or more actions. In at least some examples, such probabilities may be determined based on determinations made according to any of the techniques described in detail herein.
  • the autonomous vehicle can traverse an environment and generate sensor data using one or more sensors.
  • the sensor data can include data captured by sensors such as time-of-flight sensors, location sensors (e.g., GPS, compass, etc.), inertial sensors (e.g., inertial measurement units (IMUs), accelerometers, magnetometers, gyroscopes, etc.), lidar sensors, radar sensors, sonar sensors, infrared sensors, cameras (e.g., RGB, IR, intensity, depth, etc.), microphone sensors, environmental sensors (e.g., temperature sensors, humidity sensors, light sensors, pressure sensors, etc.), ultrasonic transducers, wheel encoders, etc.
  • the autonomous vehicle can then analyze the sensor data using one or more components (e.g., one or more systems) when navigating through the environment.
  • one or more of the components of the autonomous vehicle can use the sensor data to generate a trajectory for the autonomous vehicle.
  • the one or more components can also use the sensor data to determine pose data associated with a position of the autonomous vehicle.
  • the one or more components can use the sensor data to determine position data, coordinate data, and/or orientation data of the vehicle in the environment.
  • the pose data can include x-y-z coordinates and/or can include pitch, roll, and yaw data associated with the vehicle.
  • the one or more components of the autonomous vehicle can use the sensor data to perform operations such as detecting, identifying, segmenting, classifying, and/or tracking objects within the environment.
  • objects such as pedestrians, bicycles/bicyclists, motorcycles/motorcyclists, buses, streetcars, trucks, animals, and/or the like can be present in the environment.
  • the one or more components can use the sensor data to determine current locations of the objects as well as estimated locations for the objects at future times (e.g., one second in the future, five seconds in the future, etc.).
  • the autonomous vehicle can then use the trajectory of the autonomous vehicle along with the estimated locations of the objects to determine a probability of collision between the autonomous vehicle and the objects. For example, the autonomous vehicle may determine if an estimated location of an object at a future time intersects with a location of the autonomous vehicle along the trajectory at the future time. To increase safety, the autonomous vehicle may use distance and/or time buffers when making the determination. For example, the autonomous vehicle may determine that there is a high probability of collision when the location of the object at the future time is within a threshold distance (e.g., a distance buffer) of the location of the autonomous vehicle, as illustrated in the sketch below.
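  • As a non-limiting illustration of such a buffer check, the following sketch (in Python, with hypothetical names and a hypothetical 2-meter buffer) flags a high probability of collision when an object's predicted location at a future time falls within a distance buffer of the vehicle's location along its trajectory at that same time.

```python
import math

def high_collision_probability(vehicle_xy, object_xy, distance_buffer_m=2.0):
    """Return True when the object's predicted location at a future time is
    within a distance buffer of the vehicle's location along its trajectory
    at that same time. The function name and buffer value are illustrative only."""
    dx = vehicle_xy[0] - object_xy[0]
    dy = vehicle_xy[1] - object_xy[1]
    return math.hypot(dx, dy) <= distance_buffer_m

# Object predicted roughly 1.5 m from the vehicle one second from now.
print(high_collision_probability((10.0, 4.0), (11.0, 5.1)))  # True: inside the 2 m buffer
```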
  • the autonomous vehicle may use error models associated with the components and/or uncertainties associated with the outputs of the components to determine the probability of collision.
  • An error model associated with a component can represent error(s) and/or error percentages associated with the output of the component.
  • a perception error model can produce a perception error associated with a perception parameter (e.g., an output) of a perception component
  • a prediction error model can produce a prediction error associated with a prediction parameter (e.g., an output) from a prediction component, and/or the like.
  • the errors may be represented by, and without limitation, look-up tables determined based at least in part on statistical aggregation using ground-truth data, functions (e.g., errors based on input parameters), or any other model or data structure which maps an output to a particular error.
  • error models may map particular errors with probabilities/frequencies of occurrence.
  • error models may be determined for certain classes of data (e.g., differing error models for a perception system for detections within a first range and for detections within a second range of distances, based on a velocity of the vehicle, of the object, etc.).
  • the error models may include static error models.
  • the error models may include dynamic error models which are updated by the autonomous vehicle and/or one or more computing devices.
  • the computing device(s) may continue to receive vehicle data from autonomous vehicles. The computing device(s) may then update the error models using the vehicle data as well as ground truth data, which is described in more detail below. After updating the error models, the computing device(s) may send the updated error models to the autonomous vehicle.
  • a component may analyze sensor data and, based at least in part on the analysis, produce an output, which may represent one or more parameters.
  • An error model can then indicate that an output of the component of the vehicle, such as a speed associated with an object, is associated with an error percentage. For instance, the component may determine that the speed of the object within the environment is 10 meters per second.
  • the autonomous vehicle may determine that the error percentage is X% (e.g., 20%) resulting in a range of speeds +/- X% (e.g., between 8 meters per second and 12 meters per second in the case of a 20% error percentage).
  • the range of speeds can be associated with a probability distribution such as a Gaussian distribution, indicating that portions of the range have a higher probability of occurring than other portions of the range.
  • the probability distribution may be binned into multiple discrete probabilities. For example, 8 meters per second and 12 meters per second may be associated with a 5% probability, 9 meters per second and 11 meters per second may be associated with a 20% probability, and 10 meters per second may be associated with a 45% probability, as in the sketch below.
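  • A minimal sketch of turning a single measured speed into such a binned distribution follows; the function name, bin layout, and probabilities (taken from the illustration above) are assumptions rather than values prescribed by this disclosure.

```python
def binned_speed_distribution(measured_speed_mps, error_pct=0.20):
    """Expand one measured speed into (speed, probability) bins, assuming a
    symmetric error percentage around the measurement. The probabilities
    mirror the example above (5%, 20%, 45%, 20%, 5%) and are illustrative."""
    lo = measured_speed_mps * (1.0 - error_pct)   # e.g., 8 m/s for 10 m/s at 20%
    hi = measured_speed_mps * (1.0 + error_pct)   # e.g., 12 m/s
    step = (hi - lo) / 4.0
    speeds = [lo, lo + step, measured_speed_mps, hi - step, hi]
    probabilities = [0.05, 0.20, 0.45, 0.20, 0.05]
    return list(zip(speeds, probabilities))

for speed, prob in binned_speed_distribution(10.0):
    print(f"{speed:5.1f} m/s -> {prob:.0%}")
```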
  • the autonomous vehicle may determine estimated locations of an object at a future time based at least in part on the outputs from the components and the error models associated with the components.
  • the estimated locations may correspond to a probability distribution such as a Gaussian distribution of locations.
  • the autonomous vehicle determines the estimated locations of the object by initially determining the respective probability distributions associated with each of the components and/or the parameters.
  • the autonomous vehicle may then determine the estimated locations using the probability distributions for all of the components and/or parameters.
  • the autonomous vehicle may aggregate or combine the probability distributions for all of the components and/or parameters to determine the estimated locations. Aggregating and/or combining the probability distributions may include multiplying the probability distributions, summing up the probability distributions, and/or applying one or more other formulas to the probability distributions.
  • the autonomous vehicle may first determine an initial estimated location associated with the object using the outputs from the components. The autonomous vehicle then uses the error models to determine total errors of each of the outputs. The autonomous vehicle may determine the total errors by aggregating and/or combining the errors from each of the error models for the components. Next, the autonomous vehicle uses the total errors and initial estimated location to determine the estimated locations. In such instances, the estimated locations may include a distribution of probable locations around the initial estimated location.
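  • As a non-limiting illustration of the first approach above (combining per-parameter probability distributions into estimated locations), the sketch below samples each parameter and propagates the samples with a motion model. The Gaussian parameter distributions, the constant-velocity motion model, and the names are assumptions for illustration; multiplying or summing the distributions directly, as noted above, are alternatives.

```python
import numpy as np

def estimated_locations(pos_xy, pos_sigma, speed_mps, speed_sigma,
                        heading_rad, heading_sigma, dt_s, n_samples=10_000):
    """Propagate per-parameter probability distributions to a distribution of
    object locations at a future time by sampling each parameter independently
    and applying a constant-velocity motion model (both assumptions)."""
    rng = np.random.default_rng(0)
    x0 = rng.normal(pos_xy[0], pos_sigma, n_samples)
    y0 = rng.normal(pos_xy[1], pos_sigma, n_samples)
    speed = rng.normal(speed_mps, speed_sigma, n_samples)
    heading = rng.normal(heading_rad, heading_sigma, n_samples)
    x1 = x0 + speed * np.cos(heading) * dt_s
    y1 = y0 + speed * np.sin(heading) * dt_s
    return np.stack([x1, y1], axis=1)   # samples of the location at t + dt

samples = estimated_locations((5.0, 0.0), 0.3, 10.0, 1.0, 0.0, 0.05, dt_s=1.0)
print(samples.mean(axis=0), samples.std(axis=0))   # center and spread of the estimate
```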
  • the autonomous vehicle may use one or more components to analyze the sensor data in order to determine parameters associated with an object.
  • the parameters may include, but are not limited to, a type of object, a current location of the object, a speed of the object, a direction of travel of the object, and/or the like.
  • the autonomous vehicle may then determine a probability distribution associated with the type of object, a probability distribution associated with the current location of the object, a probability distribution associated with the speed of the object, a probability distribution associated with the direction of travel of the object, and/or the like.
  • the autonomous vehicle may then determine the estimated locations of the object at the future time using the probability distributions for the parameters.
  • each (or any one or more) of the estimated locations may be represented as a probability distribution of locations.
  • the autonomous vehicle may use the one or more components to analyze the sensor data in order to once again determine the parameters associated with the object.
  • the autonomous vehicle may then determine an initial estimated location of the object at the future time using the parameters.
  • the autonomous vehicle may use the error models associated with the parameters to determine total errors and/or error percentages associated with determining the initial estimated location of the object.
  • the autonomous vehicle may then use the initial estimated location and the total errors and/or error percentages to determine the estimated locations for the object.
  • each (or any one or more) of the estimated locations may be represented as a probability distribution of locations.
  • the autonomous vehicle may use similar processes to determine estimated locations of one or more other objects located within the environment. Additionally, the autonomous vehicle may use similar processes to determine estimated locations of the autonomous vehicle at the future time. For example, the autonomous vehicle may use one or more components to analyze the sensor data in order to determine parameters associated with the autonomous vehicle. The parameters may include, but are not limited to, a location of the autonomous vehicle, a speed of the autonomous vehicle, a direction of travel of the autonomous vehicle, and/or the like. The autonomous vehicle may then employ error models associated with the parameters to determine estimated locations of the autonomous vehicle at the future time. In some examples, each (or any one or more) of the estimated locations may correspond to a probability distribution of locations for the autonomous vehicle at the future time.
  • the autonomous vehicle may use system data, such as uncertainty models, associated with the components and/or the outputs to determine the estimated locations.
  • An uncertainty model associated with a parameter may correspond to a distribution of how much the output should be trusted and/or a measure of how correct the system believes the output to be. For example, if a component analyzes sensor data multiple times in order to determine a location of an object, the component will output a low uncertainty if the outputs include a small distribution of values (e.g., within a first range) around the location indicated by the ground truth data.
  • the component will output a large uncertainty if the outputs include a large distribution of values around the location indicated by the ground truth data (e.g., within a second range that is greater than the first range).
  • the autonomous vehicle may use the uncertainty models to determine estimated locations of an object at a future time.
  • the autonomous vehicle may use one or more components to analyze the sensor data in order to again determine the parameters associated with an object.
  • the autonomous vehicle may then determine an uncertainty model associated with determining the type of object, an uncertainty model associated with determining the current location of the object, an uncertainty model associated with determining the speed of the object, an uncertainty model associated with determining the direction of travel of the object, and/or the like.
  • the autonomous vehicle may then determine the estimated locations of the object at a future time using the uncertainty models associated with the parameters.
  • the estimated locations may correspond to a probability distribution of locations.
  • the autonomous vehicle may use the one or more components to analyze the sensor data in order to once again determine the parameters associated with the object.
  • the autonomous vehicle may then determine an initial estimated location of the object at the future time using the parameters.
  • the autonomous vehicle may use the uncertainty models associated with the components determining the parameters and the estimated location to determine the estimated locations of the object at the future time.
  • the estimated locations may correspond to a probability distribution of locations.
  • the autonomous vehicle may use similar processes to determine estimated locations of one or more other objects located within the environment. Additionally, the autonomous vehicle may use similar processes to determine estimated locations of the autonomous vehicle at the future time.
  • the autonomous vehicle may use one or more components to analyze the sensor data in order to determine parameters associated with the autonomous vehicle.
  • the parameters may include, but are not limited to, a location of the autonomous vehicle, a speed of the autonomous vehicle, a direction of travel of the autonomous vehicle, and/or the like (any and/or all of which may be derived from an output trajectory from a planner system, for example).
  • the autonomous vehicle may then use uncertainty models associated with determining the parameters to determine estimated locations of the autonomous vehicle at the future time.
  • the estimated locations may correspond to a probability distribution of locations for the autonomous vehicle at the future time.
  • the autonomous vehicle may then determine a probability of collision using the estimated locations of the autonomous vehicle and the estimated locations of the objects. For example, the probability of collision between the autonomous vehicle and an object may be computed using an area of geometric overlap between the estimated locations (e.g., the probability distribution of locations) of the autonomous vehicle and the estimated locations (e.g., the probability distribution of locations) of the object.
  • the autonomous vehicle may determine a total probability of collision associated with the autonomous vehicle using the determined probabilities of collision for each of the objects. For example, the total probability of collision may include the sum of the probabilities of collision for each of the objects, as in the sketch below.
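  • The sketch below illustrates one such overlap computation on discretized location distributions: each distribution is a grid of cell probabilities over the same x-y area, the per-object collision probability is taken as the overlap of the two grids, and the total is the sum over objects. Representing the overlap as the sum of cell-wise products (i.e., the chance that the vehicle and the object occupy the same cell) is an assumption of this sketch, not the only possible overlap measure.

```python
import numpy as np

def collision_probability(vehicle_grid, object_grid):
    """Geometric overlap of two discretized location distributions defined
    over the same x-y grid; cell-wise products assume the two positions are
    independent, which is an assumption of this sketch."""
    return float(np.sum(vehicle_grid * object_grid))

def total_collision_probability(vehicle_grid, object_grids):
    """Total probability of collision as the sum of per-object probabilities."""
    return sum(collision_probability(vehicle_grid, grid) for grid in object_grids)

# Two toy 3x3 grids that each sum to 1; the object only partially overlaps the vehicle.
vehicle = np.array([[0.0, 0.1, 0.0], [0.1, 0.6, 0.1], [0.0, 0.1, 0.0]])
obj = np.array([[0.0, 0.0, 0.1], [0.0, 0.1, 0.6], [0.0, 0.0, 0.2]])
print(total_collision_probability(vehicle, [obj]))  # ~0.12
```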
  • the autonomous vehicle may then determine whether the probability of collision is equal to or greater than a threshold (e.g., .5%, 1%, 5%, and/or some other threshold percentage). In some instances, if the probability of collision is less than the threshold, then the autonomous vehicle may continue to navigate along the current route of the autonomous vehicle. However, in some instances, if the autonomous vehicle determines that the probability of collision is equal to or greater than the threshold, then the autonomous vehicle may take one or more actions. For example, the autonomous vehicle may change a speed (e.g., slow down) of the autonomous vehicle, change the route of the autonomous vehicle, park at a safe location, and/or the like.
  • the autonomous vehicle may determine a total uncertainty associated with navigating the autonomous vehicle based at least in part on the uncertainty models used to determine the estimated locations of the autonomous vehicle and the uncertainty models used to determine the estimated locations of the object(s). The autonomous vehicle may then generate different routes and perform similar processes for determining the total uncertainties associated with the different routes. Additionally, the autonomous vehicle may select the route that includes the lowest uncertainty.
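  • A minimal sketch of selecting among candidate routes by total uncertainty follows; summing the per-component uncertainties is an assumed aggregation, and the route names and values are hypothetical.

```python
def total_uncertainty(vehicle_uncertainties, object_uncertainties):
    """Combine the uncertainties used for the vehicle's and the objects'
    estimated locations along one candidate route (summation assumed)."""
    return sum(vehicle_uncertainties) + sum(object_uncertainties)

def select_route(candidate_routes):
    """candidate_routes maps a route id to (vehicle_uncertainties, object_uncertainties);
    the route with the lowest total uncertainty is selected."""
    return min(candidate_routes, key=lambda r: total_uncertainty(*candidate_routes[r]))

routes = {"stay_in_lane": ([0.20, 0.10], [0.30]),
          "lane_change":  ([0.25, 0.10], [0.15])}
print(select_route(routes))  # "lane_change" (total 0.50 vs 0.60)
```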
  • the autonomous vehicle and/or one or more computing devices use input data (e.g., log data and/or simulation data) to generate the error models and/or the uncertainty models.
  • the autonomous vehicle and/or the one or more computing devices may compare the input data to ground truth data.
  • the ground truth data can be manually labeled and/or determined from other, validated, machine-learned components.
  • the input data can include the sensor data and/or the output data generated by a component of the autonomous vehicle.
  • the autonomous vehicle and/or the one or more computing devices can compare the input data with the ground truth data which can indicate the actual parameters of an object in the environment. By comparing the input data with the ground truth data, the autonomous vehicle and/or the one or more computing devices can determine an error and/or uncertainty associated with a component and/or parameter and generate the corresponding error model using the error and/or the corresponding uncertainty models using the uncertainty.
  • the autonomous vehicle and/or the one or more computing devices can determine the uncertainties associated with the components. For example, the autonomous vehicle and/or the one or more computing devices may input the input data into a component multiple times in order to receive multiple outputs (e.g., parameters) from the component. The autonomous vehicle and/or the one or more computing devices may then analyze the outputs to determine a distribution associated with the outputs. Using the distribution, the autonomous vehicle and/or the one or more computing devices may determine the uncertainty. For example, if there is a large distribution, then the autonomous vehicle and/or the one or more computing devices may determine there is a large uncertainty. However, if there is a small distribution, then the autonomous vehicle and/or the one or more computing devices may determine that there is a small uncertainty.
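  • The sketch below shows one way to derive such an uncertainty by running a component repeatedly on the same input and measuring the spread of its outputs: a wide distribution of outputs yields a large uncertainty, a narrow one a small uncertainty. Using the standard deviation as the measure of spread, and the stand-in noisy component, are assumptions of this sketch.

```python
import numpy as np

def output_uncertainty(component, input_data, n_runs=20):
    """Run a component several times on the same input and return the spread
    of its outputs (standard deviation, assumed here) as the uncertainty."""
    outputs = np.array([component(input_data) for _ in range(n_runs)])
    return float(outputs.std(axis=0).mean())

rng = np.random.default_rng(1)
noisy_locator = lambda x: x + rng.normal(0.0, 0.5)   # stand-in for a perception output
print(output_uncertainty(noisy_locator, 12.0))       # roughly 0.5: larger spread, larger uncertainty
```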
  • the techniques described herein may be implemented in a number of ways. Example implementations are provided below with reference to the following figures. Although discussed in the context of an autonomous vehicle, the methods, apparatuses, and systems described herein may be applied to a variety of systems (e.g., a sensor system or a robotic platform), and are not limited to autonomous vehicles. In another example, the techniques may be utilized in an aviation or nautical context, or in any system using machine vision (e.g., in a system using image data). Additionally, the techniques described herein may be used with real data (e.g., captured using sensor(s)), simulated data (e.g., generated by a simulator), or any combination of the two.
  • FIG. 1 is an illustration of an environment 100 that includes a vehicle 102 performing collision monitoring using error models and/or system data, in accordance with embodiments of the disclosure.
  • the vehicle 102 may be navigating along a trajectory 104 within the environment 100. While navigating, the vehicle 102 may be generating sensor data 106 using one or more sensors of the vehicle 102 and analyzing the sensor data 106 using one or more components 108 (e.g., one or more systems) of the vehicle 102.
  • the component(s) 108 may include, but are not limited to, a localization component, a perception component, a prediction component, a planning component, and/or the like.
  • the vehicle 102 may identify at least a first object 110 and a second object 112 located within the environment 100.
  • the vehicle 102 may analyze the sensor data 106 using the component(s) 108 in order to determine estimated locations 114 associated with the vehicle 102, estimated locations 116 associated with the first object 110, and estimated locations 118 associated with the second object 112 at a future time.
  • the estimated locations 114 may include a probability distribution of locations associated with the vehicle 102
  • the estimated locations 116 may include a probability distribution of locations associated with the first object 110
  • the estimated locations 118 may include a probability distribution of locations associated with the second object 112.
  • the estimated locations 114 may include an estimated location 120(1) associated with the vehicle 102, a first area of estimated locations 120(2) (e.g., a first boundary) that are associated with a first probability, a second area of estimated locations 120(3) (e.g., a second boundary) that are associated with a second probability, and a third area of estimated locations 120(4) (e.g., a third boundary) that are associated with a third probability.
  • the first probability is greater than the second probability and the second probability is greater than the third probability.
  • the vehicle 102 may determine that there is a higher probability that the vehicle 102 will be located within the first area of estimated locations 120(2) than within the second area of estimated locations 120(3). Additionally, the vehicle 102 may determine that there is a higher probability that the vehicle 102 will be located within the second area of estimated locations 120(3) than within the third area of estimated locations 120(4).
  • although FIG. 1 only illustrates three separate areas of estimated locations, in other examples there may be any number of areas of estimated locations. Additionally, the areas that are located further from the estimated location 120(1) may include a lower probability than the areas that are located closer to the estimated location 120(1). This may similarly apply to each of the estimated locations of the object 110 and the object 112.
  • the estimated locations 116 may include an estimated location 122(1) associated with the first object 110, a first area (e.g., a first boundary) of estimated locations 122(2) that are associated with a first probability, a second area of estimated locations 122(3) (e.g., a second boundary) that are associated with a second probability, and a third area of estimated locations 122(4) (e.g., a third boundary) that are associated with a third probability.
  • the first probability is greater than the second probability and the second probability is greater than the third probability.
  • the vehicle 102 may determine that there is a higher probability that the first object 110 will be located within the first area of estimated locations 122(2) than within the second area of estimated locations 122(3). Additionally, the vehicle 102 may determine that there is a higher probability that the first object 110 will be located within the second area of estimated locations 122(3) than within the third area of estimated locations 122(4).
  • the estimated locations 118 may include an estimated location 124(1) associated with the second object 112, a first area of estimated locations 124(2) (e.g., a first boundary) that are associated with a first probability, a second area of estimated locations 124(3) (e.g., a second boundary) that are associated with a second probability, and a third area of estimated locations 124(4) (e.g., a third boundary) that are associated with a third probability.
  • the first probability is greater than the second probability and the second probability is greater than the third probability.
  • the vehicle 102 may determine that there is a higher probability that the second object 112 will be located within the first area of estimated locations 124(2) than within the second area of estimated locations 124(3). Additionally, the vehicle 102 may determine that there is a higher probability that the second object 112 will be located within the second area of estimated locations 124(3) than within the third area of estimated locations 124(4).
  • the vehicle 102 may determine the estimated locations 114-118 using error model(s) 126 associated with the component(s) 108. For example, and for the first object 110, the vehicle 102 may analyze the sensor data 106 using the component(s) 108 in order to determine one or more parameters 128 associated with the first object 110.
  • the parameter(s) 128 may include, but are not limited to, a type of the first object 110, a current location of the first object 110 (and/or distance to the first object 110), a speed of the first object 110, and/or the like.
  • the vehicle 102 may then determine the estimated locations 116 of the first object 110.
  • the vehicle 102 may use a first error model 126 to determine a probability distribution associated with the type of the first object 110, use a second error model 126 to determine a probability distribution associated with the current location of the first object 110, use a third error model 126 to determine a probability distribution associated with the speed of the first object 110, and/or the like. For instance, and using the speed of the first object 110, the vehicle 102 may determine that the speed of the first object 110 is 1 meter per second. The vehicle 102 may then use the third error model 126 to determine that the error percentage can be X% (e.g., 20%) resulting in a range of speeds (e.g., speeds between .8 meters per second and 1.2 meters per second at 20%).
  • the error model 126 may further indicate that portions of the range have a higher probability of occurring than other portions of the range. For example, .8 meters per second and 1.2 meters per second may be associated with a 5% probability, .9 meters per second and 1.1 meters per second may be associated with a 20% probability, and 1 meter per second may be associated with a 45% probability.
  • the vehicle 102 may use similar processes for determining the probability distributions of the other parameter(s) 128.
  • the vehicle 102 may then use the probability distributions of the parameters 128 to determine the estimated locations 116 of the first object 110. Additionally, the vehicle 102 may use similar processes to determine parameters 128 for the vehicle 102, determine the probability distributions associated with the parameters 128 for the vehicle 102, and determine the estimated locations 114 using the probability distributions. Furthermore, the vehicle 102 may use similar processes to determine parameters 128 for the second object 112, determine the probability distributions associated with the parameters 128 for the second object 112, and determine the estimated locations 118 using the probability distributions.
  • For a second example, the vehicle 102 may use the parameters 128 for the first object 110 in order to determine the estimated location 122(1) for the first object 110.
  • the vehicle 102 may then use the error models 126 associated with the parameters 128 that were used to determine the estimated location 122(1) in order to determine total errors for the parameters 128. Using the total errors and the estimated location 122(1), the vehicle 102 may determine the estimated locations 116 for the first object 110. Additionally, the vehicle 102 may use similar processes to determine the estimated locations 114 for the vehicle 102 and the estimated locations 118 for the second object 112.
  • In addition to, or as an alternative to, using the error model(s) 126 to determine the estimated locations 114-118, in other examples, the vehicle 102 may use one or more uncertainty model(s) 130 associated with the component(s) 108 and/or the parameter(s) 128.
  • the outputs from the component(s) 108 may include uncertainty model(s) 130 associated with determining the parameters 128.
  • the vehicle 102 may determine a first uncertainty model 130 associated with determining the type of the first object 110, a second uncertainty model 130 associated with determining the current location of the first object 110, a third uncertainty model 130 associated with determining the speed of the first object 110, and/or the like. The vehicle 102 may then determine the estimated locations 116 for the first object 110 using the parameters 128 and the uncertainty models 130.
  • the vehicle 102 may use the first uncertainty model 130 to determine a probability distribution associated with the type of the first object 110, use the second uncertainty model 130 to determine a probability distribution associated with the current location of the first object 110, use the third uncertainty model 130 to determine a probability distribution associated with the speed of the first object 110, and/or the like. For instance, and using the speed of the first object 110, the vehicle 102 may determine that the speed of the first object 110 is 1 meter per second. The vehicle 102 may then determine that the uncertainty for the speed of the first object is 20% and as such, the certainty is 80%. As such, the vehicle 102 may determine that the range for the speed is between .8 meters per second and 1.2 meters per second.
  • the vehicle 102 may further determine that portions of the range have a higher probability of occurring than other portions of the range. For example, .8 meters per second and 1.2 meters per second may be associated with a 5% probability, .9 meters per second and 1.1 meters per second may be associated with a 20% probability, and 1 meter per second may be associated with a 45% probability.
  • the vehicle 102 may use similar processes for determining the probability distributions of the other parameter(s) 128.
  • the vehicle 102 may then use the probability distributions of the parameters 128 to determine the estimated locations 116 of the first object 110. Additionally, the vehicle 102 may use similar processes to determine parameters 128 for the vehicle 102, determine the probability distributions associated with the parameters 128 for the vehicle 102, and determine the estimated locations 114 using the probability distributions. Furthermore, the vehicle 102 may use similar processes to determine parameters 128 for the second object 112, determine the probability distributions associated with the parameters 128 for the second object 112, and determine the estimated locations 118 using the probability distributions.
  • For a second example, the vehicle 102 may use the parameters 128 for the first object 110 in order to determine the estimated location 122(1) for the first object 110.
  • the vehicle 102 may then use the uncertainty model(s) 130 associated with the parameters 128 in order to determine a total uncertainty associated with the estimated location 122(1). Using the total uncertainty, the vehicle 102 may determine the estimated locations 116 for the first object 110. Additionally, the vehicle 102 may use similar processes to determine the estimated locations 114 for the vehicle 102 and the estimated locations 118 for the second object 112.
  • the vehicle 102 may determine a probability of collision using the estimated locations 114-118. For example, the vehicle 102 may determine the probability of collision between the vehicle 102 and the first object 110. In some instances, the vehicle 102 may determine the probability of collision using at least an area of geometric overlap between the estimated locations 114 of the vehicle 102 and the estimated locations 116 of the first object 110.
  • the estimated locations 114 of the vehicle 102 may be Gaussian with parameters μ_v, σ_v (which may be represented by N(μ_v, σ_v²)).
  • the estimated locations 116 of the first object 110 may be Gaussian with parameters μ_o, σ_o (which may be represented by N(μ_o, σ_o²)).
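  • A worked one-dimensional version of this computation is sketched below: with the vehicle's location distributed as N(μ_v, σ_v²) and the object's as N(μ_o, σ_o²), their difference is distributed as N(μ_v − μ_o, σ_v² + σ_o²), and the collision probability can be taken as the chance that this difference falls within a collision distance. Treating the two locations as independent Gaussians and using a fixed collision distance are assumptions of this sketch.

```python
import math

def collision_probability_1d(mu_v, sigma_v, mu_o, sigma_o, collision_dist_m):
    """P(|X_v - X_o| < d) with X_v ~ N(mu_v, sigma_v^2) and X_o ~ N(mu_o, sigma_o^2),
    assuming the two locations are independent (so their difference is Gaussian)."""
    mu_d = mu_v - mu_o
    sigma_d = math.sqrt(sigma_v ** 2 + sigma_o ** 2)
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return cdf((collision_dist_m - mu_d) / sigma_d) - cdf((-collision_dist_m - mu_d) / sigma_d)

# Vehicle at 0 m (sigma 0.5 m), object 3 m ahead (sigma 1.0 m), collision within 1.5 m:
print(collision_probability_1d(0.0, 0.5, 3.0, 1.0, 1.5))  # ~0.09
```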
  • the vehicle 102 may perform similar processes in order to extend the one-dimensional problem to a two-dimensional problem. Additionally, the vehicle 102 may perform similar processes in order to determine the probability of collision between the vehicle 102 and the second object 112. In some instances, the vehicle 102 may then determine a total probability of collision using the probability of collision between the vehicle 102 and the first object 110 and the probability of collision between the vehicle 102 and the second object 112. However, in the example of FIG. 1, the probability of collision between the vehicle 102 and the second object 112 may be zero since there is no geometric overlap between the estimated locations 114 and the estimated locations 118.
  • the vehicle 102 may then determine if the probability of collision is equal to or greater than a threshold. Based at least in part on determining that the probability of collision is less than the threshold, the vehicle 102 may continue to navigate along the trajectory 104. However, based at least in part on determining that the probability of collision is equal to or greater than the threshold, the vehicle 102 may take one or more actions. The one or more actions may include, but are not limited to, navigating along a new trajectory, changing a speed (e.g., slowing down), parking, and/or the like.
  • the vehicle 102 may perform similar processes in order to determine a probability of collision between the object 110 and the object 112. The vehicle 102 may then perform one or more actions based at least in part on the probability of collision. For instance, if the vehicle 102 determines that the probability of collision between the object 110 and the object 112 is equal to or greater than a threshold, the vehicle 102 may stop.
  • FIG. 2 is an illustration of an example of the vehicle 102 analyzing the sensor data 106 using the error model(s) 126 in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • sensor system(s) 202 of the vehicle 102 may generate the sensor data 106.
  • the sensor data 106 may then be analyzed by the component(s) 108 of the vehicle 102.
  • the component(s) 108 may include a localization component 204, a perception component 206, a planning component 208, and a prediction component 210.
  • the vehicle 102 may not include one or more of the localization component 204, the perception component 206, the planning component 208, or the prediction component 210.
  • the vehicle 102 may include one or more additional components.
  • One or more of the components 204-210 may then analyze the sensor data 106 and generate outputs 212-218 based at least in part on the analysis.
  • the outputs 212-218 may include parameters associated with the vehicle 102 and/or objects.
  • the output 212 from the localization component 204 may indicate the position of the vehicle 102.
  • the output 214 from the perception component 206 may include detection, segmentation, classification, and/or the like associated with objects.
  • the output 216 from the planning component 208 may include a path for the vehicle 102 to traverse within the environment.
  • one or more of the components 204-210 may use outputs 212-218 from one or more of the other components 204-210 in order to generate an output 212-218.
  • the planning component 208 may use the output 212 from the localization component 204 in order to generate the output 216.
  • the planning component 208 may use the output 214 from the perception component 206 in order to generate the output 216.
  • a component 204-210 may use the probability distributions 220-226, which are described below.
  • Error component(s) 228 may be configured to process the outputs 212-218 using the error model(s) 126 in order to generate the probability distributions 220-226 associated with the outputs 212-218.
  • the error component(s) 228 may be included within the components 204-210.
  • the localization component 204 may analyze the sensor data 106 and, based at least in part on the analysis, generate both the output 212 and the probability distribution 220 associated with the output 212.
  • the perception component 206 may analyze the sensor data 106 and, based at least in part on the analysis, generate both the output 214 and the probability distribution 222 associated with the output 214.
  • the probability distributions 220-226 may respectively be associated with the outputs 212-218.
  • the error component(s) 228 may process the output 212 using error model(s) 126 associated with the localization component 204 in order to generate the probability distribution 220.
  • the probability distribution 220 may represent estimated locations of the vehicle 102 that are based on the determined location and error(s) represented by the error model(s) 126 for the localization component 204.
  • the error component(s) 228 may process the output 214 using error model(s) 126 associated with the perception component 206 in order to generate the probability distribution 222.
  • the probability distribution 222 may represent probable speeds of the object that are based on the determined speed and error(s) represented by the error model(s) 126 for the perception component 206.
  • An estimation component 230 may be configured to process one or more of the probability distributions 220-226 and/or the sensor data 106 (not illustrated for clarity reasons) in order to generate estimated locations 232 associated with the vehicle 102 and/or objects.
  • the estimated locations 232 may include a probability distribution, such as a Gaussian distribution, of locations.
  • FIG. 3 is an illustration of another example of the vehicle 102 analyzing the sensor data 106 using the error model(s) 126 in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • the estimation component 230 may analyze one or more of the outputs 212-218 from one or more of the components 204-210 in order to determine an estimated location 302 associated with the vehicle 102 and/or an object.
  • the error component(s) 228 may then use the error model(s) 126 and the estimated location 302 to determine the estimated locations 304 of the vehicle 102 and/or the object.
  • the error component(s) 228 may use the error model(s) 126 to determine total error(s) and/or total error percentages associated with the output(s) 212-218 of the component(s) 204-210 that were used to determine the estimated location 302. The error component(s) 228 may then use the total error(s) and/or total error percentages to generate the estimated locations 304.
  • the estimated locations 304 may include a probability distribution, such as a Gaussian distribution, of locations.
  • FIG. 4 is an illustration of an example of the vehicle 102 analyzing the sensor data 106 using the uncertainty model(s) 130 in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • uncertainty component(s) 402 may be configured to process the outputs 212-218 using the uncertainty model(s) 130 in order to generate probability distributions 404-410 associated with the outputs 212-218.
  • the uncertainty component(s) 402 may be included within the components 204-210.
  • the localization component 204 may analyze the sensor data 106 and, based at least in part on the analysis, generate both the output 212 and the probability distribution 404 associated with the output 212.
  • the perception component 206 may analyze the sensor data 106 and, based at least in part on the analysis, generate both the output 214 and the probability distribution 406 associated with the output 214.
  • one or more of the components 204-210 may use outputs 212-218 from one or more of the other components 204-210 in order to generate an output 212-218.
  • the planning component 208 may use the output 212 from the localization component 204 in order to generate the output 216.
  • the planning component 208 may use the output 214 from the perception component 206 in order to generate the output 216.
  • a component 204-210 may use the probability distributions 404-410.
  • the probability distributions 404-410 may respectively be associated with the outputs 212-218.
  • the uncertainty component(s) 402 may process the output 212 using the uncertainty model(s) 130 associated with the localization component 204 in order to generate the probability distribution 404.
  • the probability distribution 404 may represent estimated locations of the vehicle 102 that are based at least in part on the determined location and uncertainty model(s) 130 for the localization component 204.
  • the uncertainty component(s) 402 may process the output 214 using uncertainty model(s) 130 associated with the perception component 206 in order to generate the probability distribution 406.
  • the probability distribution 406 may represent probable speeds of the object that are based on the determined speed and the uncertainty model(s) 130 for the perception component 206.
  • the estimation component 230 may be configured to process one or more of the probability distributions 404-410 and/or the sensor data 106 (not illustrated for clarity reasons) in order to generate estimated locations 412 associated with the vehicle 102 and/or the object.
  • the estimated locations 412 may include a probability distribution, such as a Gaussian distribution, of locations.
  • FIG. 5 is an illustration of another example of the vehicle 102 analyzing the sensor data 106 using the uncertainty model(s) 130 in order to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • the estimation component 230 may analyze one or more of the outputs 212-218 from one or more of the components 204-210 in order to determine the estimated location 302 associated with the vehicle 102 and/or an object.
  • the uncertainty component(s) 402 may then use the uncertainty model(s) 130 and the estimated location 302 to determine the estimated locations 502 of the vehicle 102 and/or the object.
  • the uncertainty component(s) 402 may use the uncertainty model(s) 130 for the components 204-210 to determine total uncertainties associated with the output(s) 212-218 of the component(s) 204-210 that were used to determine the estimated location 302. The uncertainty component(s) 402 may then use the total uncertainties to generate the estimated locations 502.
  • the estimated locations 502 may include a probability distribution, such as a Gaussian distribution, of locations.
  • FIG. 6 illustrates an example graph 600 illustrating the vehicle 102 determining probabilities of collision over a period of time, in accordance with embodiments of the disclosure.
  • the graph 600 represents probabilities 602 along the y-axis and time 604 along the x-axis.
  • the vehicle 102 may determine the probabilities of collision at time 606(1). For instance, at time 606(1), the vehicle 102 may determine the probabilities of collision at three future times, time 606(2), time 606(3), and time 606(4).
  • the probabilities of collision are associated with the vehicle 102 and a single object. In other instances, the probabilities of collision are associated with the vehicle 102 and more than one object.
  • the vehicle 102 may determine that there is a first probability of collision 608(1) at time 606(2), a second probability of collision 608(2) at time 606(3), and no probability of collision at time 606(4).
• the first probability of collision 608(1) may be associated with a low risk, while the second probability of collision 608(2) may be associated with a high risk.
• the first probability of collision 608(1) may be low risk based at least in part on the first probability of collision 608(1) being below a threshold probability, and the second probability of collision 608(2) may be high risk based at least in part on the second probability of collision 608(2) being equal to or greater than the threshold probability.
• while FIG. 6 illustrates determining the probabilities of collision at discrete times, in some instances the vehicle 102 may continuously be determining the probabilities of collision.
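• The following sketch illustrates, under assumed numbers, how probabilities of collision at several future times could be compared against a threshold probability and labeled as low or high risk; the threshold value and the dictionary layout are hypothetical.

```python
RISK_THRESHOLD = 0.3  # hypothetical threshold probability

def classify_risk(collision_probabilities):
    """Label the probability of collision at each future time as low or high
    risk by comparing it against the threshold probability."""
    return {
        time: ("high" if probability >= RISK_THRESHOLD else "low")
        for time, probability in collision_probabilities.items()
    }

# Probabilities estimated at time 606(1) for three future times (made-up values).
print(classify_risk({"606(2)": 0.12, "606(3)": 0.45, "606(4)": 0.0}))
# {'606(2)': 'low', '606(3)': 'high', '606(4)': 'low'}
```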
  • FIG. 7 illustrates an example 700 of generating error model data based at least in part on vehicle data and ground truth data, in accordance with embodiments of the present disclosure.
  • vehicle(s) 702 can generate vehicle data 704 and transmit the vehicle data 704 to an error model component 706.
  • the error model component 706 can determine an error model 126 that can indicate an error associated with a parameter.
  • the vehicle data 704 can be data associated with a component of the vehicle(s) 702 such as the perception component 206, the planning component 208, the localization component 204, the estimation component 230, and/or the like.
  • the vehicle data 704 can be associated with the perception component 206 and the vehicle data 704 can include a bounding box associated with an object detected by the vehicle(s) 702 in an environment.
  • the error model component 706 can receive ground truth data 708 which can be manually labeled and/or determined from other, validated, machine learned components.
  • the ground truth data 708 can include a validated bounding box that is associated with the object in the environment.
  • the error model component 706 can determine an error associated with the system (e.g., the component) of the vehicle(s) 702.
  • Such errors may comprise, for example, differences between the ground truth and the output, percent differences, error rates, and the like.
  • the vehicle data 704 can include one or more characteristics (also referred to as parameters) associated with a detected entity and/or the environment in which the entity is positioned.
  • characteristics associated with an entity can include, but are not limited to, an x-position (global position), a y-position (global position), a z-position (global position), an orientation, an entity type (e.g., a classification), a velocity of the entity, an extent of the entity (size), etc.
  • Characteristics associated with the environment can include, but are not limited to, a presence of another entity in the environment, a state of another entity in the environment, a time of day, a day of a week, a season, a weather condition, an indication of darkness/light, etc. Therefore, the error can be associated with the other characteristics (e.g., environmental parameters).
  • such error models may be determined for various groupings of parameters (e.g., distinct models for different combinations of classifications, distances, speeds, etc.).
  • such parameters may further comprise environmental information such as, but not limited to, the number of objects, the time of day, the time of year, weather conditions, and the like.
  • the error model component 706 can process a plurality of vehicle data 704 and a plurality of ground truth data 708 to determine error model data 710.
• the error model data 710 can include the error calculated by the error model component 706, which can be represented as errors 712(1)-(3). Additionally, the error model component 706 can determine a probability associated with the errors 712(1)-(3), represented as probabilities 714(1)-(3), which can be associated with an environmental parameter to produce error models 716(1)-(3) (which may represent error models 126).
  • the vehicle data 704 can include a bounding box associated with an object at a distance of 50 meters from the vehicle(s) 702 in an environment that includes rainfall.
  • the ground truth data 708 can provide the validated bounding box associated with the object.
• the error model component 706 can determine error model data 710 that indicates the error associated with the perception system of the vehicle(s) 702. The distance of 50 meters and the rainfall can be used as environmental parameters to determine which of the error models 716(1)-(3) to use. Once the error model is identified, the error model 716(1)-(3) can provide an error 712(1)-(3) based on the probability 714(1)-(3), where errors 712(1)-(3) associated with higher probabilities 714(1)-(3) are more likely to be selected than errors 712(1)-(3) associated with lower probabilities 714(1)-(3).
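• As a simplified, non-authoritative sketch of the selection described above, the snippet below keys hypothetical error models by environmental parameters (a distance bucket and a weather condition) and draws an error so that higher-probability errors are more likely to be selected; all names and values are made up.

```python
import random

# Hypothetical error models keyed by environmental parameters (distance bucket,
# weather); each model is a list of (error in meters, probability) pairs,
# loosely mirroring error models 716(1)-(3).
ERROR_MODELS = {
    ("0-25m", "clear"): [(0.1, 0.7), (0.3, 0.2), (0.6, 0.1)],
    ("25-75m", "rain"): [(0.3, 0.5), (0.8, 0.3), (1.5, 0.2)],
}

def sample_error(distance_bucket, weather):
    """Select the error model that matches the environmental parameters and draw
    an error, so that errors with higher probabilities are more likely to be
    selected than errors with lower probabilities."""
    errors, probabilities = zip(*ERROR_MODELS[(distance_bucket, weather)])
    return random.choices(errors, weights=probabilities, k=1)[0]

print(sample_error("25-75m", "rain"))
```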
  • FIG. 8 illustrates an example 800 of the vehicle(s) 702 generating vehicle data 704 and transmitting the vehicle data 704 to the computing device(s) 802, in accordance with embodiments of the present disclosure.
  • the error model component 706 can determine a perception error model that can indicate an error associated with a parameter.
  • the vehicle data 704 can include sensor data generated by a sensor of the vehicle(s) 702 and/or perception data generated by a perception system of the vehicle(s) 702.
  • the perception error model can be determined by comparing the vehicle data 704 against ground truth data 708.
  • the ground truth data 708 can be manually labeled and can be associated with the environment and can represent a known result.
  • a deviation from the ground truth data 708 in the vehicle data 704 can be identified as an error in a sensor system and/or the perception system of the vehicle(s) 702.
  • a perception system can identity an object as a bicyclist where the ground truth data 708 indicates that the object is a pedestrian.
  • a sensor system can generate sensor data that represents an object as having a width of 2 meters where the ground truth data 708 indicates that the object has a width of 3 meters.
• the error model component 706 can determine a classification associated with the object represented in the vehicle data 704 and determine other objects of the same classification in the vehicle data 704 and/or other log data. Then the error model component 706 can determine a probability distribution associated with a range of errors associated with the object (a simplified sketch of building such an error distribution follows the FIG. 8 discussion below). Based on the comparison and the range of errors, the error model component 706 can determine the estimated locations 502.
• an environment 804 can include objects 806(1)-(3) represented as bounding boxes generated by a perception system.
• the perception error model data 808 can indicate scenario parameters as 810(1)-(3) and the error associated with the scenario parameters as 812(1)-(3).
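• A minimal sketch, assuming bounding-box widths as the compared parameter, of how detections of a single classification could be compared against validated ground truth to obtain an empirical error distribution; the data and the helper name are hypothetical.

```python
import numpy as np

def error_distribution(detected_widths, ground_truth_widths):
    """Compare detected bounding-box widths against validated ground-truth widths
    for objects of one classification and summarize the errors as an empirical
    mean and standard deviation."""
    errors = np.asarray(detected_widths) - np.asarray(ground_truth_widths)
    return errors.mean(), errors.std()

# Hypothetical widths (meters) for objects classified as "vehicle".
mu, sigma = error_distribution([2.0, 1.8, 2.3, 2.1], [2.2, 1.9, 2.2, 2.3])
print(mu, sigma)
```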
  • FIG. 9 illustrates an example 900 of generating uncertainty data based at least in part on vehicle data and ground truth data, in accordance with embodiments of the present disclosure.
• the vehicle(s) 702 can generate the vehicle data 704 and transmit the vehicle data 704 to an uncertainty model component 902.
  • the uncertainty model component 902 can determine uncertainties associated with components determining parameters.
  • the vehicle data 704 can be data associated with a component of the vehicle(s) 702 such as the perception component 206, the planning component 208, the localization component 204, the prediction component 210, and/or the like.
  • the vehicle data 704 can be associated with the perception component 206 and the vehicle data 704 can include a bounding box associated with an object detected by the vehicle(s) 702 in an environment.
  • the uncertainty model component 902 can receive ground truth data 708 which can be manually labeled and/or determined from other, validated, machine learned components.
  • the ground truth data 708 can include a validated bounding box that is associated with the object in the environment.
• the uncertainty model component 902 can determine a consistency with which the system (e.g., the component) of the vehicle(s) 702 determines the ground truth. For instance, the consistency may indicate the percentage for which the parameters represented by the vehicle data 704 are the same as the parameters represented by the ground truth data 708.
• the uncertainty model component 902 may then use the consistency to generate uncertainty data 904 associated with the component that determined the parameter and/or associated with the component determining the parameter. For instance, if the consistency indicates that there is a low percentage, then the uncertainty data 904 may indicate a high uncertainty. However, if the consistency indicates that there is a high percentage, then the uncertainty data 904 may indicate a low uncertainty.
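• The snippet below is one possible, simplified reading of the consistency-to-uncertainty mapping described above: the fraction of outputs matching ground truth is computed and one minus that fraction is used as the uncertainty; the equality check and the example labels are assumptions.

```python
def uncertainty_from_consistency(vehicle_parameters, ground_truth_parameters):
    """Compute the percentage of outputs that match the ground truth (the
    consistency) and return one minus that value as the uncertainty: a low
    match percentage yields a high uncertainty and vice versa."""
    matches = sum(v == g for v, g in zip(vehicle_parameters, ground_truth_parameters))
    consistency = matches / len(vehicle_parameters)
    return 1.0 - consistency

# Example: classifications output by a component vs. labeled ground truth.
print(uncertainty_from_consistency(
    ["car", "pedestrian", "car", "cyclist"],
    ["car", "pedestrian", "car", "pedestrian"],
))  # 0.25
```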
  • the uncertainty model component 902 may identify one or more types of uncertainty.
  • the types of uncertainty may include, but are not limited to, epistemic uncertainty, aleatoric uncertainty (e.g., data-dependent, task-dependent, etc.), and/or the like.
• Epistemic uncertainty may be associated with ignorance about which model generated the data.
• Aleatoric uncertainty may be associated with uncertainty with respect to information that the data cannot explain.
  • the uncertainty model component 902 may then use the identified uncertainty(ies) to generate the uncertainty model(s) 130.
• the uncertainty model component 902 may input the data into a component multiple times, where one or more nodes of the component are changed when inputting the data, which causes the outputs of the component to differ. This may produce a range in the outputs from the component. In some instances, the component may further output the mean and/or the variance of the outputs. The uncertainty model component 902 may then use the distribution associated with the range of the outputs, the mean, and/or the variance to generate the uncertainty model(s) 130 for the component and/or the type of output (e.g., the parameter).
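• A hedged sketch of the repeated-inference procedure described above: the same input is run through a component several times while its internal behavior is randomly perturbed on each pass, and the mean and variance of the outputs summarize the spread; the callable interface and the toy estimator are assumptions, not the disclosed component.

```python
import numpy as np

def monte_carlo_uncertainty(component, data, num_passes=30, seed=0):
    """Run the same input through a component multiple times while the
    component's behavior is randomly perturbed on each pass, then report the
    mean and variance of the resulting outputs."""
    rng = np.random.default_rng(seed)
    outputs = np.array([component(data, rng) for _ in range(num_passes)])
    return outputs.mean(), outputs.var()

# Toy stand-in for a perturbed network: a speed estimate that jitters around 12 m/s.
noisy_speed_estimator = lambda data, rng: 12.0 + rng.normal(0.0, 0.4)
mean, variance = monte_carlo_uncertainty(noisy_speed_estimator, data=None)
print(mean, variance)
```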
  • FIG. 10 depicts a block diagram of an example system 1000 for implementing the techniques discussed herein.
  • the system 1000 can include the vehicle 102.
  • the vehicle 102 is an autonomous vehicle; however, the vehicle 102 can be any other type of vehicle (e.g., a driver-controlled vehicle that may provide an indication of whether it is safe to perform various maneuvers).
  • the vehicle 102 can include computing device(s) 1002, one or more sensor system(s) 202, one or more emitter(s) 1004, one or more communication connection(s) 1006 (also referred to as communication devices and/or modems), at least one direct connection 1008 (e.g., for physically coupling with the vehicle 102 to exchange data and/or to provide power), and one or more drive system(s) 1010.
  • the one or more sensor system(s) 202 can be configured to capture the sensor data 106 associated with an environment.
  • the sensor system(s) 202 can include time-of-flight sensors, location sensors (e.g., GPS, compass, etc.), inertial sensors (e.g., inertial measurement units (IMUs), accelerometers, magnetometers, gyroscopes, etc.), lidar sensors, radar sensors, sonar sensors, infrared sensors, cameras (e.g., RGB, IR, intensity, depth, etc.), microphone sensors, environmental sensors (e.g., temperature sensors, humidity sensors, light sensors, pressure sensors, etc.), ultrasonic transducers, wheel encoders, etc.
  • the sensor system(s) 202 can include multiple instances of each of these or other types of sensors.
• the time-of-flight sensors can include individual time-of-flight sensors located at the corners, front, back, sides, and/or top of the vehicle 102.
  • the camera sensors can include multiple cameras disposed at various locations about the exterior and/or interior of the vehicle 102.
  • the sensor system(s) 202 can provide input to the computing device(s) 1002.
  • the vehicle 102 can also include one or more emitter(s) 1004 for emitting light and/or sound.
  • the one or more emitter(s) 1004 in this example include interior audio and visual emitters to communicate with passengers of the vehicle 102.
  • interior emitters can include speakers, lights, signs, display screens, touch screens, haptic emitters (e.g., vibration and/or force feedback), mechanical actuators (e.g., seatbelt tensioners, seat positioners, headrest positioners, etc.), and the like.
  • the one or more emitter(s) 1004 in this example also include exterior emitters.
  • the exterior emitters in this example include lights to signal a direction of travel or other indicator of vehicle action (e.g., indicator lights, signs, light arrays, etc.), and one or more audio emitters (e.g., speakers, speaker arrays, horns, etc.) to audibly communicate with pedestrians or other nearby vehicles, one or more of which may comprise acoustic beam steering technology.
  • the vehicle 102 can also include one or more communication connection(s) 1006 that enable communication between the vehicle 102 and one or more other local or remote computing device(s) (e.g., a remote teleoperations computing device) or remote services.
  • the communication connection(s) 1006 can facilitate communication with other local computing device(s) on the vehicle 102 and/or the drive system(s) 1010. Also, the communication connection(s) 1006 can allow the vehicle 102 to communicate with other nearby computing device(s) (e.g., other nearby vehicles, traffic signals, etc.).
  • the communications connection(s) 1006 can include physical and/or logical interfaces for connecting the computing device(s) 1002 to another computing device or one or more external network(s) 1012 (e.g., the Internet).
  • the communications connection(s) 1006 can enable Wi-Fi-based communication such as via frequencies defined by the IEEE 802.11 standards, short range wireless frequencies such as Bluetooth, cellular communication (e.g., 2G, 3G, 4G, 4G LTE, 5G, etc.), satellite communication, dedicated short-range communications (DSRC), or any suitable wired or wireless communications protocol that enables the respective computing device to interface with the other computing device(s).
  • the communication connection(s) 1006 may comprise the one or more modems as described in detail above.
  • the vehicle 102 can include one or more drive system(s) 1010. In some examples, the vehicle 102 can have a single drive system 1010. In at least one example, if the vehicle 102 has multiple drive systems 1010, individual drive systems 1010 can be positioned on opposite ends of the vehicle 102 (e.g., the front and the rear, etc.). In at least one example, the drive system(s) 1010 can include one or more sensor system(s) 202 to detect conditions of the drive system(s) 1010 and/or the surroundings of the vehicle 102.
• the sensor system(s) 202 can include one or more wheel encoders (e.g., rotary encoders) to sense rotation of the wheels of the drive systems, inertial sensors (e.g., inertial measurement units, accelerometers, gyroscopes, magnetometers, etc.) to measure orientation and acceleration of the drive system, cameras or other image sensors, ultrasonic sensors to acoustically detect objects in the surroundings of the drive system, lidar sensors, radar sensors, etc. Some sensors, such as the wheel encoders, can be unique to the drive system(s) 1010. In some cases, the sensor system(s) 202 on the drive system(s) 1010 can overlap or supplement corresponding systems of the vehicle 102 (e.g., sensor system(s) 202).
• the drive system(s) 1010 can include many of the vehicle systems, including a high voltage battery, a motor to propel the vehicle, an inverter to convert direct current from the battery into alternating current for use by other vehicle systems, a steering system including a steering motor and steering rack (which can be electric), a braking system including hydraulic or electric actuators, a suspension system including hydraulic and/or pneumatic components, a stability control system for distributing brake forces to mitigate loss of traction and maintain control, an HVAC system, lighting (e.g., head/tail lights to illuminate an exterior surrounding of the vehicle), and one or more other systems (e.g., cooling system, safety systems, onboard charging system, other electrical components such as a DC/DC converter, a high voltage junction, a high voltage cable, charging system, charge port, etc.).
• the drive system(s) 1010 can include a drive system controller which can receive and preprocess data from the sensor system(s) 202 and control operation of the various vehicle systems.
  • the drive system controller can include one or more processor(s) and memory communicatively coupled with the one or more processor(s).
  • the memory can store one or more modules to perform various functionalities of the drive system(s) 1010.
  • the drive system(s) 1010 also include one or more communication connection(s) that enable communication by the respective drive system with one or more other local or remote computing device(s).
  • the computing device(s) 1002 can include one or more processors 1014 and memory 1016 communicatively coupled with the processor(s) 1014.
• the memory 1016 of the computing device(s) 1002 stores the localization component 204, the perception component 206, the prediction component 210, the estimation component 230, the planning component 208, the error component(s) 228, the uncertainty component(s) 402, and the one or more system controller(s) 1018.
• the localization component 204, the perception component 206, the prediction component 210, the estimation component 230, the planning component 208, the error component(s) 228, the uncertainty component(s) 402, and the one or more system controller(s) 1018 can additionally, or alternatively, be accessible to the computing device(s) 1002 (e.g., stored in a different component of the vehicle 102) and/or be accessible to the vehicle 102 (e.g., stored remotely).
  • the localization component 204 can include functionality to receive data from the sensor system(s) 202 to determine a position of the vehicle 102.
  • the localization component 204 can include and/or request/receive a three-dimensional map of an environment and can continuously determine a location of the autonomous vehicle within the map.
  • the localization component 204 can use SLAM (simultaneous localization and mapping) or CLAMS (calibration, localization and mapping, simultaneously) to receive time-of-flight data, image data, lidar data, radar data, sonar data, IMU data, GPS data, wheel encoder data, or any combination thereof, and the like to accurately determine a location of the autonomous vehicle.
  • the localization component 204 can provide data to various components of the vehicle 102 to determine an initial position of an autonomous vehicle for generating a trajectory, as discussed herein.
  • the perception component 206 can include functionality to perform object detection, segmentation, and/or classification.
  • the perception component 206 can provide processed sensor data that indicates a presence of an entity that is proximate to the vehicle 102 and/or a classification of the entity as an entity type (e.g., car, pedestrian, cyclist, building, tree, road surface, curb, sidewalk, unknown, etc.).
  • the perception component 206 can provide processed sensor data that indicates one or more characteristics (also referred to as parameters) associated with a detected entity and/or the environment in which the entity is positioned.
  • characteristics associated with an entity can include, but are not limited to, an x-position (global position), a y-position (global position), a z-position (global position), an orientation, an entity type (e.g., a classification), a velocity of the entity, an extent of the entity (size), etc.
  • Characteristics associated with the environment can include, but are not limited to, a presence of another entity in the environment, a state of another entity in the environment, a time of day, a day of a week, a season, a weather condition, a geographic position, an indication of darkness/light, etc.
  • the perception component 206 can include functionality to store perception data generated by the perception component 206. In some instances, the perception component 206 can determine a track corresponding to an object that has been classified as an object type. For purposes of illustration only, the perception component 206, using sensor system(s) 202 can capture one or more images of an environment. The sensor system(s) 202 can capture images of an environment that includes an object, such as a pedestrian. The pedestrian can be at a first position at a time T and at a second position at time T + 1 (e.g., movement during a span of time t after time T). In other words, the pedestrian can move during this time span from the first position to the second position. Such movement can, for example, be logged as stored perception data associated with the object.
  • the stored perception data can, in some examples, include fused perception data captured by the vehicle.
  • Fused perception data can include a fusion or other combination of sensor data from sensor system(s) 202, such as image sensors, lidar sensors, radar sensors, time-of-flight sensors, sonar sensors, global positioning system sensors, internal sensors, and/or any combination of these.
• the stored perception data can additionally or alternatively include classification data including semantic classifications of objects (e.g., pedestrians, vehicles, buildings, road surfaces, etc.) represented in the sensor data.
• the stored perception data can additionally or alternatively include track data (collections of historical positions, orientations, sensor features, etc. associated with the object over time) corresponding to motion of objects classified as dynamic objects through the environment.
• the track data can include multiple tracks of multiple different objects over time.
• This track data can be mined to identify images of certain types of objects (e.g., pedestrians, animals, etc.) at times when the object is stationary (e.g., standing still) or moving (e.g., walking, running, etc.).
• the computing device determines a track corresponding to a pedestrian.
  • the prediction component 210 can generate one or more probability maps representing prediction probabilities of estimated locations of one or more objects in an environment.
• the prediction component 210 can generate one or more probability maps for vehicles, pedestrians, animals, and the like within a threshold distance from the vehicle 102.
• the prediction component 210 can measure a track of an object and generate a discretized prediction probability map, a heat map, a probability distribution, a discretized probability distribution, and/or a trajectory for the object based on observed and predicted behavior.
  • the one or more probability maps can represent an intent of the one or more objects in the environment.
• the planning component 208 can determine a path for the vehicle 102 to follow to traverse through an environment. For example, the planning component 208 can determine various routes and paths and various levels of detail. In some instances, the planning component 208 can determine a route to travel from a first location (e.g., a current location) to a second location (e.g., a target location). For the purpose of this discussion, a route can be a sequence of waypoints for traveling between two locations. As non-limiting examples, waypoints include streets, intersections, global positioning system (GPS) coordinates, etc. Further, the planning component 208 can generate an instruction for guiding the vehicle 102 along at least a portion of the route from the first location to the second location.
  • the planning component 208 can determine how to guide the vehicle 102 from a first waypoint in the sequence of waypoints to a second waypoint in the sequence of waypoints.
  • the instruction can be a path, or a portion of a path.
• multiple paths can be substantially simultaneously generated (i.e., within technical tolerances) in accordance with a receding horizon technique. A single path of the multiple paths in a receding data horizon having the highest confidence level may be selected to operate the vehicle.
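• For illustration, a minimal sketch of selecting, from paths generated for a receding horizon, the path with the highest confidence level; the (path, confidence) pair structure is a hypothetical simplification, not the disclosed planner.

```python
def select_path(candidate_paths):
    """Among paths generated substantially simultaneously for the receding
    horizon, return the path with the highest confidence level."""
    return max(candidate_paths, key=lambda pair: pair[1])[0]

print(select_path([("path_a", 0.81), ("path_b", 0.93), ("path_c", 0.64)]))  # path_b
```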
• the planning component 208 can alternatively, or additionally, use data from the perception component 206 and/or the prediction component 210 to determine a path for the vehicle 102 to follow to traverse through an environment.
  • the planning component 208 and/or the prediction component 210 can receive data from the perception component 206 regarding objects associated with an environment. Using this data, the planning component 208 can determine a route to travel from a first location (e.g., a current location) to a second location (e.g., a target location) to avoid objects in an environment.
  • such a planning component 208 may determine there is no such collision free path and, in turn, provide a path which brings vehicle 102 to a safe stop avoiding all collisions and/or otherwise mitigating damage.
  • the computing device(s) 1002 can include one or more system controllers 1018, which can be configured to control steering, propulsion, braking, safety, emitters, communication, and other systems of the vehicle 102. These system controller(s) 1018 can communicate with and/or control corresponding systems of the drive system(s) 1010 and/or other components of the vehicle 102, which may be configured to operate in accordance with a path provided from the planning component 208.
• the vehicle 102 can connect to computing device(s) 802 via the network(s) 1012 and can include one or more processors 1020 and memory 1022 communicatively coupled with the one or more processors 1020.
• the processor(s) 1020 can be similar to the processor(s) 1014 and the memory 1022 can be similar to the memory 1016.
  • the memory 1022 of the computing device(s) 802 stores the vehicle data 704, the ground truth data 708, and the error model component 706.
• the vehicle data 704, the ground truth data 708, and/or the error model component 706 can additionally, or alternatively, be accessible to the computing device(s) 802 (e.g., stored in a different component of the computing device(s) 802) and/or be accessible to the computing device(s) 802 (e.g., stored remotely).
  • the processor(s) 1014 of the computing device(s) 1002 and the processor(s) 1020 of the computing device(s) 802 can be any suitable processor capable of executing instructions to process data and perform operations as described herein.
  • the processor(s) 1014 and 1020 can comprise one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), or any other device or portion of a device that processes electronic data to transform that electronic data into other electronic data that can be stored in registers and/or memory.
• integrated circuits (e.g., ASICs, etc.), gate arrays (e.g., FPGAs, etc.), and other hardware devices can also be considered processors in so far as they are configured to implement encoded instructions.
  • the memory 1016 of the computing device(s) 1002 and the memory 1022 of the computing device(s) 802 are examples of non-transitory computer-readable media.
  • the memory 1016 and 1022 can store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems.
  • the memory 1016 and 1022 can be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory capable of storing information.
• the architectures, systems, and individual elements described herein can include many other logical, programmatic, and physical components, of which those shown in the accompanying figures are merely examples that are related to the discussion herein.
• In some instances, aspects of some or all of the components discussed herein can include any models, algorithms, and/or machine learning algorithms. For example, in some instances, the components in the memory 1016 and 1022 can be implemented as a neural network.
  • FIGS. 11-14 illustrate example processes in accordance with embodiments of the disclosure. These processes are illustrated as logical flow graphs, each operation of which represents a sequence of operations that may be implemented in hardware, software, or a combination thereof.
  • the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations.
  • computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular abstract data types.
  • the order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations may be combined in any order and/or in parallel to implement the processes.
  • FIG. 11 depicts an example process 1100 for performing collision monitoring using error models, in accordance with embodiments of the disclosure.
  • the process 1100 may include receiving sensor data generated by one or more sensors.
  • the vehicle 102 may be navigating along a path from a first location to a second location. While navigating, the vehicle 102 may generate the sensor data using one or more sensors of the vehicle 102.
  • the process 1100 may include determining, using at least a first system of a vehicle, at least a parameter associated with the vehicle based at least in part on a first portion of the sensor data.
  • the vehicle 102 may analyze the first portion of the sensor data using one or more systems.
  • the one or more systems may include, but are not limited to, a localization system, a perception system, a planning system, a prediction system, and/or the like.
  • the vehicle 102 may determine the parameter associated with the vehicle 102.
  • the parameter may include, but is not limited to, a location of the vehicle 102, a speed of the vehicle 102, a direction of travel of the vehicle 102, and/or the like.
  • the process 1100 may include determining estimated locations associated with the vehicle based at least in part on the parameter associated with the vehicle and a first error model associated with the first system.
  • the vehicle 102 may process at least the parameter associated with the vehicle 102 using the first error model.
  • the first error model can represent error(s) and/or error percentages associated with the output of the first system.
  • the vehicle 102 may determine the estimated locations associated with the vehicle 102 at a later time.
  • the estimated locations may correspond to a probability distribution of locations.
  • the process 1100 may include determining, using at least a second system of the vehicle, at least a parameter associated with an object based at least in part on a second portion of the sensor data.
  • the vehicle 102 may analyze the sensor data and, based at least in part on the analysis, identify the object. The vehicle 102 may then analyze the second portion of the sensor data using the one or more systems. Based at least in part on the analysis, the vehicle 102 may determine the parameter associated with the object.
  • the parameter may include, but is not limited to, a type of the object, a location of the object, a speed of the object, a direction of travel of the object, and/or the like.
  • the process 1100 may include determining estimated locations associated with the object based at least in part on the parameter associated with the object and a second error model associated with the second system.
  • the vehicle 102 may process at least the parameter associated with the object using the second error model.
  • the second error model can represent error(s) and/or error percentages associated with the output of the second system.
  • the vehicle 102 may determine the estimated locations associated with the object at the later time.
  • the estimated locations may correspond to a probability distribution of locations.
  • the process 1100 may include determining a probability of collision based at least in part on the estimated locations associated with the vehicle and the estimated locations associated with the object.
  • the vehicle 102 may analyze the estimated locations associated with the vehicle 102 and the estimated locations associated with the object in order to determine the probability of collision.
  • the probability of collision may be based at least in part on an amount of overlap between the estimated locations associated with the vehicle 102 and the estimated locations associated with the object.
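• The sketch below shows one simplified way the amount of overlap could be quantified, assuming the estimated locations of the vehicle 102 and the object are independent 1-D Gaussians and that a collision corresponds to the two positions lying within a fixed distance of each other; the distance, means, and variances are hypothetical.

```python
from math import erf, sqrt

def normal_cdf(x, mean, std):
    """Cumulative distribution function of a Gaussian."""
    return 0.5 * (1.0 + erf((x - mean) / (std * sqrt(2.0))))

def collision_probability(veh_mean, veh_var, obj_mean, obj_var, collision_dist=2.0):
    """Probability that two independent 1-D Gaussian positions end up within
    collision_dist meters of each other, used here as a proxy for the amount
    of overlap between the two distributions of estimated locations."""
    diff_mean = veh_mean - obj_mean
    diff_std = sqrt(veh_var + obj_var)
    return (normal_cdf(collision_dist, diff_mean, diff_std)
            - normal_cdf(-collision_dist, diff_mean, diff_std))

# Hypothetical means/variances for the vehicle 102 and the object at the same future time.
print(collision_probability(18.0, 0.61, 19.5, 0.90))
```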
  • the process 1100 may include determining if the probability of collision is equal to or greater than a threshold. For instance, the vehicle 102 may compare the probability of collision to the threshold in order to determine if the probability of collision is equal to or greater than the threshold.
  • the process 1100 may include causing the vehicle to continue to navigate along a path. For instance, if the vehicle 102 determines that the probability of collision is less than the threshold, then the vehicle 102 may continue to navigate along the path.
  • the process 1100 may include causing the vehicle to perform one or more actions. For instance, if the vehicle 102 determines that the probability of collision is equal to or greater than the threshold, then the vehicle 102 may perform the one or more actions.
  • the one or more actions may include, but are not limited to, changing a path of the vehicle 102, changing a speed of the vehicle 102, parking the vehicle 102, and/or the like.
  • FIG. 12 depicts an example process 1200 for using error models to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • the process 1200 may include receiving sensor data generated by one or more sensors.
  • the vehicle 102 may be navigating along a path from a first location to a second location. While navigating, the vehicle 102 may generate the sensor data using one or more sensors of the vehicle 102.
• the process 1200 may include determining, using one or more systems of a vehicle, a first parameter associated with an object based at least in part on the sensor data.
  • the vehicle 102 may analyze the sensor data using one or more systems.
  • the one or more systems may include, but are not limited to, a localization system, a perception system, a planning system, a prediction system, and/or the like.
  • the vehicle 102 may determine the first parameter associated with the object (e.g., the vehicle or another object).
  • the first parameter may include, but is not limited to, a location of the object, a speed of the object, a direction of travel of the object, and/or the like.
  • the process 1200 may include determining a first probability distribution associated with the first parameter based at least in part on a first error model.
  • the vehicle 102 may process at least the first parameter using the first error model.
  • the first error model can represent error(s) and/or error percentages associated with the first parameter. Based at least in part on the processing, the vehicle 102 may determine the first probability distribution associated with the first parameter.
  • the process 1200 may include determining, using one or more systems of the vehicle, a second parameter associated with the object based at least in part on at least one of the sensor data or the first probability distribution.
  • the vehicle 102 may analyze the at least one of the sensor data or the first probability distribution using the one or more systems.
  • the vehicle 102 analyzes the first probability distribution when the second parameter is determined using the first parameter.
  • the vehicle 102 may determine the second parameter associated with the object (e.g., the vehicle or another object).
  • the second parameter may include, but is not limited to, a location of the object, a speed of the object, a direction of travel of the object, an estimated location of the object at a future time, and/or the like.
  • the process 1200 may include determining a second probability distribution associated with the second parameter based at least in part on a second error model.
  • the vehicle 102 may process at least the second parameter using the second error model.
  • the second error model can represent error(s) and/or error percentages associated with the second parameter. Based at least in part on the processing, the vehicle 102 may determine the second probability distribution associated with the second parameter.
  • the process 1200 may include determining estimated locations associated with the object based at least in part on at least one of the first probability distribution or the second probability distribution. For instance, the vehicle 102 may determine the estimated locations based at least in part on the first probability distribution and/or the second probability distribution. In some instances, if the first parameter and the second parameter are independent, such as the first parameter indicating a current location of the object and the second parameter indicating a speed of the object, then the vehicle 102 may determine the estimated locations using both the first probability distribution and the second probability distribution.
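• A hedged Monte Carlo sketch of combining two independent probability distributions (one over the current location, one over the speed) into estimated locations at a later time; the Gaussian parameterization, sample count, and time step are assumptions made for illustration.

```python
import numpy as np

def estimated_locations(location_dist, speed_dist, dt, num_samples=10_000, seed=0):
    """Sample the independent location and speed distributions and propagate each
    sample forward by dt seconds to obtain samples of the estimated location."""
    rng = np.random.default_rng(seed)
    locations = rng.normal(location_dist[0], location_dist[1], num_samples)
    speeds = rng.normal(speed_dist[0], speed_dist[1], num_samples)
    return locations + speeds * dt

# (mean, standard deviation) pairs for the two probability distributions.
samples = estimated_locations((5.0, 0.4), (1.5, 0.2), dt=2.0)
print(samples.mean(), samples.std())
```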
  • FIGS. 13A-13B depict an example process 1300 for performing collision monitoring using uncertainty models, in accordance with embodiments of the disclosure.
  • the process 1300 may include receiving sensor data generated by one or more sensors.
  • the vehicle 102 may be navigating along a path from a first location to a second location. While navigating, the vehicle 102 may generate the sensor data using one or more sensors of the vehicle 102.
  • the process 1300 may include determining, using at least a first system of a vehicle, at least a parameter associated with the vehicle based at least in part on a first portion of the sensor data.
  • the vehicle 102 may analyze the first portion of the sensor data using one or more systems.
  • the one or more systems may include, but are not limited to, a localization system, a perception system, a planning system, a prediction system, and/or the like.
  • the vehicle 102 may determine the parameter associated with the vehicle 102.
  • the parameter may include, but is not limited to, a location of the vehicle 102, a speed of the vehicle 102, a direction of travel of the vehicle 102, and/or the like.
  • the process 1300 may include determining a first uncertainty model associated with the first system determining the parameter associated with the vehicle. For instance, the vehicle 102 may determine the first uncertainty model. In some instances, the vehicle 102 determines the first uncertainty model by receiving the first uncertainty model from the first system. In some instances, the vehicle 102 determines the first uncertainty model using uncertainty data indicating uncertainties associated with the first system determining the first parameter.
  • the process 1300 may include determining estimated locations associated with the vehicle based at least in part on the parameter associated with the vehicle and the first uncertainty model. For instance, the vehicle 102 may process at least the parameter associated with the vehicle 102 using the first uncertainty model. Based at least in part on the processing, the vehicle 102 may determine the estimated locations associated with the vehicle 102 at a later time. As also discussed herein, the estimated locations may correspond to a probability distribution of locations.
  • the process 1300 may include determining, using at least a second system of the vehicle, at least a parameter associated with an object based at least in part on a second portion of the sensor data.
  • the vehicle 102 may analyze the sensor data and, based at least in part on the analysis, identify the object. The vehicle 102 may then analyze the second portion of the sensor data using the one or more systems. Based at least in part on the analysis, the vehicle 102 may determine the parameter associated with the object.
  • the parameter may include, but is not limited to, a type of the object, a location of the object, a speed of the object, a direction of travel of the object, and/or the like.
  • the process 1300 may include determining a second uncertainty model associated with the second system determining the parameter associated with the object. For instance, the vehicle 102 may determine the second uncertainty model. In some instances, the vehicle 102 determines the second uncertainty model by receiving the second uncertainty model from the second system. In some instances, the vehicle 102 determines the second uncertainty model using uncertainty data indicating uncertainties associated with the second system determining the second parameter.
  • the process 1300 may include determining estimated locations associated with the object based at least in part on the parameter associated with the object and the second uncertainty model. For instance, the vehicle 102 may process at least the parameter associated with the object using the second uncertainty model. Based at least in part on the processing, the vehicle 102 may determine the estimated locations associated with the object at the later time. As also discussed herein, the estimated locations may correspond to a probability distribution of locations.
  • the process 1300 may include determining a probability of collision based at least in part on the estimated locations associated with the vehicle and the estimated locations associated with the object.
  • the vehicle 102 may analyze the estimated locations associated with the vehicle 102 and the estimated locations associated with the object in order to determine the probability of collision.
  • the probability of collision may be based at least in part on an amount of overlap between the estimated locations associated with the vehicle 102 and the estimated locations associated with the object.
  • the process 1300 may include determining if the probability of collision is equal to or greater than a threshold. For instance, the vehicle 102 may compare the probability of collision to the threshold in order to determine if the probability of collision is equal to or greater than the threshold.
  • the process 1300 may include causing the vehicle to continue to navigate along a path. For instance, if the vehicle 102 determines that the probability of collision is less than the threshold, then the vehicle 102 may continue to navigate along the path.
  • the process 1300 may include causing the vehicle to perform one or more actions. For instance, if the vehicle 102 determines that the probability of collision is equal to or greater than the threshold, then the vehicle 102 may perform the one or more actions.
  • the one or more actions may include, but are not limited to, changing a path of the vehicle 102, changing a speed of the vehicle 102, parking the vehicle 102, and/or the like.
  • the vehicle 102 may perform steps 1304-1314 using multiple possible routes associated with the vehicle 102. In such examples, the vehicle 102 may select the route that includes the lowest uncertainty and/or the lowest probability of collision.
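• As a simplified illustration of that selection, the snippet below picks, from candidate routes already evaluated, the route with the lowest probability of collision, breaking ties with the lowest uncertainty; the dictionary layout and values are hypothetical.

```python
def select_route(candidate_routes):
    """Pick the route with the lowest probability of collision, breaking ties
    with the lowest uncertainty."""
    return min(candidate_routes, key=lambda r: (r["collision_probability"], r["uncertainty"]))

best = select_route([
    {"name": "route_1", "collision_probability": 0.05, "uncertainty": 0.30},
    {"name": "route_2", "collision_probability": 0.05, "uncertainty": 0.10},
    {"name": "route_3", "collision_probability": 0.20, "uncertainty": 0.05},
])
print(best["name"])  # route_2
```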
  • FIG. 14 depicts an example process 1400 for using uncertainty models to determine estimated locations associated with an object, in accordance with embodiments of the disclosure.
  • the process 1400 may include receiving sensor data generated by one or more sensors.
  • the vehicle 102 may be navigating along a path from a first location to a second location. While navigating, the vehicle 102 may generate the sensor data using one or more sensors of the vehicle 102.
• the process 1400 may include determining, using one or more systems of a vehicle, a first parameter associated with an object based at least in part on the sensor data.
  • the vehicle 102 may analyze the sensor data using one or more systems.
  • the one or more systems may include, but are not limited to, a localization system, a perception system, a planning system, a prediction system, and/or the like.
  • the vehicle 102 may determine the first parameter associated with the object (e.g., the vehicle or another object).
  • the first parameter may include, but is not limited to, a location of the object, a speed of the object, a direction of travel of the object, and/or the like.
  • the process 1400 may include determining a first probability distribution associated with the first parameter based at least in part on a first uncertainty model. For instance, the vehicle 102 may process at least the first parameter using the first uncertainty model. Based at least in part on the processing, the vehicle 102 may determine the first probability distribution associated with the first parameter.
  • the process 1400 may include determining, using one or more systems of the vehicle, a second parameter associated with the object based at least in part on at least one of the sensor data or the first probability distribution.
  • the vehicle 102 may analyze the at least one of the sensor data or the first probability distribution using the one or more systems.
• the vehicle 102 analyzes the first probability distribution when the second parameter is determined using the first parameter.
  • the vehicle 102 may determine the second parameter associated with the object (e.g., the vehicle or another object).
  • the second parameter may include, but is not limited to, a location of the object, a speed of the object, a direction of travel of the object, an estimated location of the object at a future time, and/or the like.
  • the process 1400 may include determining a second probability distribution associated with the second parameter based at least in part on a second uncertainty model. For instance, the vehicle 102 may process at least the second parameter using the second uncertainty model. Based at least in part on the processing, the vehicle 102 may determine the second probability distribution associated with the second parameter.
  • the process 1400 may include determining estimated locations associated with the object based at least in part on at least one of the first probability distribution or the second probability distribution. For instance, the vehicle 102 may determine the estimated locations based at least in part on the first probability distribution and/or the second probability distribution. In some instances, if the first parameter and the second parameter are independent, such as the first parameter indicating a current location of the object and the second parameter indicating a speed of the object, then the vehicle 102 may determine the estimated locations using both the first probability distribution and the second probability distribution.
• in other instances, such as when the second parameter already accounts for the first parameter, the vehicle 102 may determine the estimated locations using the second probability distribution.
  • An autonomous vehicle comprising: one or more sensors; one or more processors; and one or more computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: obtaining sensor data from the one or more sensors; determining, based at least in part on a first portion of the sensor data, an estimated location of the autonomous vehicle at a future time; determining, based at least in part on a system of the autonomous vehicle and a second portion of the sensor data, an estimated location of an object at the future time; determining, based at least in part on an error model and the estimated location of the object, a distribution of estimated locations associated with the object, the error model representing a probability of error associated with the system; determining a probability of collision between the autonomous vehicle and the object based at least in part on the estimated location of the autonomous vehicle and the distribution of estimated locations associated with the object; and causing the autonomous vehicle to perform one or more actions based at least in part on the probability of collision.
  • C The autonomous vehicle as recited in either of paragraphs A or B, the operations further comprising: determining, based at least in part on an additional error model and the estimated location of the vehicle, a distribution of estimated locations associated with the autonomous vehicle, and wherein determining the probability of collision between the autonomous vehicle and the object comprises at least: determining an amount of overlap between the distribution of estimated locations associated with the autonomous vehicle and the distribution of estimated locations associated with the object; and determining the probability of collision based at least in part on the amount of overlap.
  • a method comprising: receiving sensor data from one or more sensors of a vehicle; determining, based at least in part on a first portion of the sensor data, an estimated location associated with the vehicle at a time; determining, based at least in part on a system of the vehicle and a second portion of the sensor data, a parameter associated with an object; determining, based at least in part on an error model and the parameter associated with the object, an estimated location associated with the object at the time, the error model representing a probability of error associated with the system; and causing the vehicle to perform one or more actions based at least in part on the estimated location associated with the vehicle and the estimated location associated with the object.
  • G The method as recited in either paragraphs E or F, wherein the parameter comprises at least one of: an object type associated with the object; a location of the object within an environment; a speed of the object; or a direction of travel of the object within the environment.
  • determining the estimated location associated with the vehicle at the time comprises at least: determining, based at least in part on an additional system of the vehicle and the first portion of the sensor data, a parameter associated with the vehicle; and determining, based at least in part on an additional error model and the parameter associated with the vehicle, the estimated location associated with the vehicle at the time, the additional error model representing a probability of error associated with the additional system.
  • determining the estimated location associated with the object at the time comprises determining, based at least in part on the error model and the parameter associated with the object, a distribution of estimated locations associated with the object at the time.
• determining the estimated location associated with the vehicle comprises at least: determining, based at least in part on an additional system of the vehicle and the first portion of the sensor data, a parameter associated with the vehicle; and determining, based at least in part on an additional error model and the parameter associated with the vehicle, a distribution of estimated locations associated with the vehicle at the time, the additional error model representing a probability of error associated with the additional system.
  • L The method as recited in any one of paragraphs E-K, further comprising: determining an amount of overlap between the distribution of estimated locations associated with the vehicle and the distribution of estimated locations associated with the object; and determining a probability of collision based at least in part on the amount of overlap, and wherein causing the vehicle to perform the one or more actions is based at least in part on the probability of collision.
  • M The method as recited in any one of paragraphs E-L, further comprising selecting the error model based at least in part on the parameter.
• N The method as recited in any one of paragraphs E-M, further comprising: determining, based at least in part on an additional system of the vehicle and the second portion of the sensor data, an additional parameter associated with the object; and determining, based at least in part on an additional error model and the additional parameter associated with the object, an output associated with the object, the additional error model representing a probability of error associated with the additional system, and wherein determining the parameter associated with the object comprises determining, based at least in part on the system of the vehicle and the output, the parameter associated with the object.
  • P The method as recited in any one of paragraphs E-O, further comprising: determining, based at least in part on the first portion of the sensor data, an additional estimated location associated with the vehicle at an additional time that is later than the time; determining, based at least in part on the system of the vehicle and the second portion of the sensor data, an additional parameter associated with the object; determining, based at least in part on the error model and the additional parameter associated with the object, an additional estimated location associated with the object at the additional time; and causing the vehicle to perform one or more actions based at least in part on the additional estimated location associated with the vehicle and the additional estimated location associated with the object.
• R One or more non-transitory computer-readable media storing instructions that, when executed by one or more processors, cause one or more computing devices to perform operations comprising: receiving sensor data generated by a sensor associated with a vehicle; determining, based at least in part on a portion of the sensor data, an estimated location associated with an object at a time; determining, based at least in part on the estimated location, an error model from a plurality of error models; determining, based at least in part on the error model and the estimated location, a distribution of estimated locations associated with the object; and determining one or more actions for navigating the vehicle based at least in part on the distribution of estimated locations.
• S The one or more non-transitory computer-readable media as recited in paragraph R, the operations further comprising: determining, based at least in part on the portion of the sensor data, a parameter associated with the vehicle; and determining the estimated location based at least in part on the parameter, and wherein the error model is associated with the parameter.
  • T The one or more non-transitory computer-readable media as recited in either of paragraphs R or S, wherein determining the error model is further based at least in part on one or more of: a classification of the object, a speed of the object, a size of the object, a number of objects in the environment, a weather condition in the environment, a time of day, or a time of year.
• An autonomous vehicle comprising: one or more sensors; one or more processors; and one or more computer-readable media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: obtaining sensor data generated by the one or more sensors; determining, based at least in part on a first portion of the sensor data, an estimated location of the autonomous vehicle; determining, based at least in part on a second portion of the sensor data, an estimated location of an object; determining an uncertainty model associated with the estimated location of the object; determining, based at least in part on the uncertainty model and the estimated location of the object, a distribution of estimated locations associated with the object; determining a probability of collision between the autonomous vehicle and the object based at least in part on the estimated location of the autonomous vehicle and the distribution of estimated locations associated with the object; and causing the autonomous vehicle to perform one or more actions based at least in part on the probability of collision.
  • V The autonomous vehicle as recited in paragraph U, the operations further comprising: determining an additional uncertainty model associated with an additional system determining the estimated location of the autonomous vehicle; and determining, based at least in part on the additional uncertainty model and the estimated location of the autonomous vehicle, a probability of estimated locations associated with the autonomous vehicle, and wherein determining the probability of collision between the autonomous vehicle and the object comprises at least: determining an amount of overlap between the probability of estimated locations associated with the autonomous vehicle and the probability of estimated locations associated with the object; and determining the probability of collision based at least in part on the amount of overlap.
  • W The autonomous vehicle as recited in either of paragraphs U or V, wherein: the estimated location of the object is further determined based at least in part on an additional system of the autonomous vehicle; the operations further comprise determining an additional uncertainty model associated with the additional system determining the estimated location of the object; and the probability of estimated locations is further determined based at least in part on the additional uncertainty model.
• X A method comprising: receiving sensor data from one or more sensors of a vehicle; determining, based at least in part on a first portion of the sensor data, an estimated location associated with the vehicle; determining, based at least in part on a system of the vehicle and a second portion of the sensor data, a parameter associated with an object; determining an uncertainty model associated with the system determining the parameter associated with the object; determining, based at least in part on the parameter associated with the object and the uncertainty model, an estimated location associated with the object; and causing the vehicle to perform one or more actions based at least in part on the estimated location associated with the vehicle and the estimated location associated with the object.
  • determining the parameter associated with the object comprises determining, based at least in part on the system and the second portion of the sensor data, at least one of: an object type associated with the object; a location of the object within an environment; a speed of the object; or a direction of travel of the object within the environment.
  • determining the estimated location associated with the vehicle comprises at least: determining, based at least in part on an additional system of the vehicle and the first portion of the sensor data, a parameter associated with the vehicle; determining an additional uncertainty model associated with the additional system determining the parameter associated with the vehicle; and determining, based at least in part on the parameter associated with the vehicle and the additional uncertainty model, the estimated location associated with the vehicle.
  • AB The method as recited in any one of paragraphs X-AA, further comprising: determining an additional estimated location associated with the object based at least in part on the parameter, and wherein determining the estimated location associated with the object comprises determining, based at least in part on the additional estimated location associated with the object and the uncertainty model, the estimated location associated with the object.
  • determining the estimated location associated with the object comprises determining, based at least in part on the parameter associated with the object and the uncertainty model, a distribution of estimated locations associated with the object.
• AD The method as recited in any one of paragraphs X-AC, wherein determining the estimated location associated with the vehicle comprises at least: determining, based at least in part on an additional system of the vehicle and the first portion of the sensor data, a parameter associated with the vehicle; determining an additional uncertainty model associated with the additional system determining the parameter associated with the vehicle; and determining, based at least in part on the parameter associated with the vehicle and the additional uncertainty model, a distribution of estimated locations associated with the vehicle.
  • AE The method as recited in any one of paragraphs X-AD, further comprising: determining an amount of overlap between the distribution of estimated locations associated with the vehicle and the distribution of estimated locations associated with the object; and determining a probability of collision based at least in part on the amount of overlap, and wherein causing the vehicle to perform the one or more actions is based at least in part on the probability of collision.
• AF The method as recited in any one of paragraphs X-AE, further comprising: determining, based at least in part on an additional system of the vehicle and a third portion of the sensor data, an additional parameter associated with the object; and determining an additional uncertainty model associated with the additional system determining the additional parameter associated with the object, and wherein determining the estimated location associated with the object is further based at least in part on the additional parameter and the additional uncertainty model.
• AH The method as recited in any one of paragraphs X-AG, further comprising: determining, based at least in part on the system of the vehicle and a third portion of the sensor data, a parameter associated with an additional object; determining an additional uncertainty model associated with the system determining the parameter associated with the additional object; determining, based at least in part on the parameter associated with the additional object and the additional uncertainty model, an estimated location associated with the additional object; and wherein causing the vehicle to perform the one or more actions is further based at least in part on the estimated location associated with the additional object.
  • AI The method as recited in any one of paragraphs X-AH, further comprising: determining a probability of collision based at least in part on the estimated location associated with the vehicle and the estimated location associated with the object, and wherein causing the vehicle to perform the one or more actions comprises causing, based at least in part on the probability of collision, the vehicle to at least one of change a velocity or change a route.
• AJ One or more non-transitory computer-readable media storing instructions that, when executed by one or more processors, cause one or more computing devices to perform operations comprising: receiving sensor data generated by a sensor associated with a vehicle; determining, based at least in part on a portion of the sensor data, an estimated location associated with an object; determining, based at least in part on the estimated location, an uncertainty model from a plurality of uncertainty models; determining, based at least in part on the uncertainty model and the estimated location, a distribution of estimated locations associated with the object; and determining one or more actions for navigating the vehicle based at least in part on the distribution of estimated locations.
• AK The one or more non-transitory computer-readable media as recited in paragraph AJ, the operations further comprising: determining, based at least in part on the portion of the sensor data, a parameter associated with the vehicle; determining the estimated location based at least in part on the parameter, and wherein the uncertainty model is associated with the parameter.
• AL The one or more non-transitory computer-readable media as recited in either of paragraphs AJ or AK, the operations further comprising: determining, based at least in part on an additional portion of the sensor data, an estimated location associated with the vehicle; determining, based at least in part on the estimated location, an additional uncertainty model from the plurality of uncertainty models; and determining, based at least in part on the additional uncertainty model and the estimated location associated with the vehicle, a distribution of estimated locations associated with the vehicle, and wherein determining the one or more actions is further based at least in part on the distribution of estimated locations associated with the vehicle.
• AM The one or more non-transitory computer-readable media as recited in any one of paragraphs AJ-AL, the operations further comprising: determining a probability of collision based at least in part on the distribution of estimated locations associated with the vehicle and the distribution of estimated locations associated with the object, and wherein determining the one or more actions is based at least in part on the probability of collision.
  • AN The one or more non-transitory computer-readable media as recited in any one of paragraphs AJ-AM, wherein determining the uncertainty model is further based at least in part on one or more of: a classification of the object, a speed of the object, a size of the object, a number of objects in the environment, a weather condition in the environment, a time of day, or a time of year.
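The clauses above describe, among other things, selecting an error or uncertainty model based on context such as the object's classification, speed, or the weather conditions (paragraphs T and AN), and estimating a probability of collision from the overlap between distributions of estimated locations (paragraphs U, V, AE, and AM). The two Python sketches that follow illustrate one plausible reading of those steps; every name, value, and modeling choice in them is an illustrative assumption rather than the claimed implementation. First, a context-keyed lookup that reduces each error model to a single positional standard deviation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ErrorModelKey:
    """Context features used to look up an error model (illustrative only)."""
    classification: str  # e.g. "pedestrian" or "vehicle"
    speed_bucket: str    # e.g. "slow" or "fast"
    weather: str         # e.g. "clear" or "rain"

# Hypothetical table of positional standard deviations (meters), standing in
# for error models characterized offline from logged system outputs.
ERROR_MODELS = {
    ErrorModelKey("pedestrian", "slow", "clear"): 0.3,
    ErrorModelKey("pedestrian", "slow", "rain"): 0.6,
    ErrorModelKey("vehicle", "fast", "clear"): 0.8,
    ErrorModelKey("vehicle", "fast", "rain"): 1.4,
}
DEFAULT_SIGMA = 1.0  # fallback when no matching model is available

def select_error_sigma(classification: str, speed_mps: float, weather: str) -> float:
    """Return the positional standard deviation of the matching error model."""
    key = ErrorModelKey(
        classification=classification,
        speed_bucket="fast" if speed_mps > 5.0 else "slow",
        weather=weather,
    )
    return ERROR_MODELS.get(key, DEFAULT_SIGMA)
```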
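Second, a minimal sketch of computing the probability of collision as the amount of overlap between a distribution of estimated locations for the vehicle and one for the object, as recited in paragraphs U, V, and AE. It assumes both distributions are independent two-dimensional Gaussians and uses Monte Carlo sampling; the collision radius, sample count, and Gaussian form are assumptions made for illustration.

```python
import numpy as np

def collision_probability(vehicle_mean, vehicle_cov, object_mean, object_cov,
                          collision_radius=1.5, num_samples=10_000, seed=0):
    """Estimate the probability of collision as the fraction of sampled
    (vehicle, object) location pairs that lie within a collision radius."""
    rng = np.random.default_rng(seed)
    vehicle_samples = rng.multivariate_normal(vehicle_mean, vehicle_cov, num_samples)
    object_samples = rng.multivariate_normal(object_mean, object_cov, num_samples)
    distances = np.linalg.norm(vehicle_samples - object_samples, axis=1)
    return float(np.mean(distances < collision_radius))

# Illustrative usage: vehicle at the origin, object 3 m ahead; the covariances
# stand in for the outputs of the error/uncertainty models.
probability = collision_probability(
    vehicle_mean=np.array([0.0, 0.0]),
    vehicle_cov=np.diag([0.2, 0.2]),
    object_mean=np.array([3.0, 0.0]),
    object_cov=np.diag([0.8, 0.5]),
)
print(f"estimated probability of collision: {probability:.3f}")
```

An action such as the change of velocity or route recited in paragraph AI could then be triggered when the estimated probability exceeds a configured threshold.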

Abstract

The present invention relates to techniques and methods for performing collision monitoring using error models. For example, a vehicle may generate sensor data using one or more sensors. The vehicle may then analyze the sensor data using systems in order to determine parameters associated with the vehicle and parameters associated with another object. In addition, the vehicle may process the parameters associated with the vehicle using error models associated with the systems in order to determine a distribution of estimated locations associated with the vehicle. The vehicle may also process the parameters associated with the object using the error models in order to determine a distribution of estimated locations associated with the object. Using the distributions of estimated locations, the vehicle may determine the probability of collision between the vehicle and the object.
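As a rough companion to the abstract, the sketch below shows how a parameter estimate (a position and a velocity) might be expanded into a distribution of estimated locations at a later time by inflating the covariance with error-model standard deviations. The linear uncertainty-growth model and the numbers used are assumptions for illustration, not the method defined by the claims.

```python
import numpy as np

def predicted_location_distribution(position, velocity, horizon_s,
                                    position_sigma, speed_sigma):
    """Propagate an estimated location forward and return the mean and
    covariance of the distribution of estimated locations at the horizon."""
    mean = np.asarray(position, dtype=float) + np.asarray(velocity, dtype=float) * horizon_s
    # Positional error plus speed error accumulated over the prediction horizon.
    sigma = position_sigma + speed_sigma * horizon_s
    covariance = np.diag([sigma ** 2, sigma ** 2])
    return mean, covariance

mean, covariance = predicted_location_distribution(
    position=[10.0, 2.0], velocity=[-4.0, 0.0], horizon_s=1.0,
    position_sigma=0.4, speed_sigma=0.3,
)
```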
EP20887777.9A 2019-11-13 2020-11-12 Collision monitoring using statistical models Pending EP4059003A4 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/683,005 US11648939B2 (en) 2019-11-13 2019-11-13 Collision monitoring using system data
US16/682,971 US11697412B2 (en) 2019-11-13 2019-11-13 Collision monitoring using statistic models
PCT/US2020/060197 WO2021097070A1 (fr) 2019-11-13 2020-11-12 Collision monitoring using statistical models

Publications (2)

Publication Number Publication Date
EP4059003A1 true EP4059003A1 (fr) 2022-09-21
EP4059003A4 EP4059003A4 (fr) 2023-11-22

Family

ID=75912837

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20887777.9A Pending EP4059003A4 (fr) 2019-11-13 2020-11-12 Surveillance de collision au moyen de modèles statistiques

Country Status (4)

Country Link
EP (1) EP4059003A4 (fr)
JP (1) JP2023502598A (fr)
CN (1) CN114730521A (fr)
WO (1) WO2021097070A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2612632B (en) * 2021-11-08 2024-04-03 Jaguar Land Rover Ltd Control system for a vehicle and method thereof

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8509982B2 (en) * 2010-10-05 2013-08-13 Google Inc. Zone driving
US8457827B1 (en) * 2012-03-15 2013-06-04 Google Inc. Modifying behavior of autonomous vehicle based on predicted behavior of other vehicles
US9656667B2 (en) * 2014-01-29 2017-05-23 Continental Automotive Systems, Inc. Method for minimizing automatic braking intrusion based on collision confidence
WO2015155833A1 (fr) * 2014-04-08 2015-10-15 Mitsubishi Electric Corporation Collision prevention device
JP6203381B2 (ja) * 2014-04-10 2017-09-27 Mitsubishi Electric Corporation Route prediction device
JP6409680B2 (ja) * 2015-05-29 2018-10-24 Denso Corporation Driving assistance device and driving assistance method
JP6481520B2 (ja) * 2015-06-05 2019-03-13 Toyota Motor Corporation Vehicle collision avoidance assistance device
US10496766B2 (en) * 2015-11-05 2019-12-03 Zoox, Inc. Simulation system and methods for autonomous vehicles
US10029682B2 (en) * 2016-01-22 2018-07-24 Toyota Motor Engineering & Manufacturing North America, Inc. Surrounding vehicle classification and path prediction
US10515390B2 (en) * 2016-11-21 2019-12-24 Nio Usa, Inc. Method and system for data optimization

Also Published As

Publication number Publication date
CN114730521A (zh) 2022-07-08
WO2021097070A1 (fr) 2021-05-20
JP2023502598A (ja) 2023-01-25
EP4059003A4 (fr) 2023-11-22

Similar Documents

Publication Publication Date Title
US11648939B2 (en) Collision monitoring using system data
US11697412B2 (en) Collision monitoring using statistic models
US11836623B2 (en) Object detection and property determination for autonomous vehicles
CN112752988B (zh) Radar spatial estimation
US10937178B1 (en) Image-based depth data and bounding boxes
US20210339741A1 (en) Constraining vehicle operation based on uncertainty in perception and/or prediction
US10614717B2 (en) Drive envelope determination
US10984543B1 (en) Image-based depth data and relative depth data
CN112789481A (zh) Trajectory prediction for top-down scenes
JP2021527591A (ja) Occlusion-aware planning
JP2023511755A (ja) Object velocity and/or yaw rate detection and tracking
JP2022532920A (ja) Yaw rate from radar data
US11538185B2 (en) Localization based on semantic objects
US20220185331A1 (en) Calibration based on semantic objects
JP2023547988A (ja) Collision avoidance planning system
WO2023147160A1 (fr) Radar object classification based on radar cross-section data
WO2024006115A1 (fr) Determining right of way
EP4059003A1 (fr) Collision monitoring using statistical models
US20230131721A1 (en) Radar and doppler analysis and concealed object detection
WO2023009794A1 (fr) Three-dimensional object detection based on image data
US20230142674A1 (en) Radar data analysis and concealed object detection
CN117545674A (zh) Techniques for identifying curbs
US20230033177A1 (en) Three-dimensional point clouds based on images and depth data
US11915436B1 (en) System for aligning sensor data with maps comprising covariances
US20230095410A1 (en) System for detecting objects in an environment

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220420

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20231025

RIC1 Information provided on ipc code assigned before grant

Ipc: B60W 30/095 20120101ALI20231019BHEP

Ipc: G08G 1/16 20060101ALI20231019BHEP

Ipc: G08G 1/00 20060101AFI20231019BHEP