CN113692372A - Exception handling for autonomous vehicles - Google Patents

Exception handling for autonomous vehicles

Info

Publication number
CN113692372A
CN113692372A (Application CN202080028282.XA)
Authority
CN
China
Prior art keywords
vehicle, STH, time, processors, runtime
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202080028282.XA
Other languages
Chinese (zh)
Other versions
CN113692372B (en)
Inventor
D. Li
M. P. McNaughton
S. Joshua
A. Khosroshahi
I-A. Sucan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Waymo LLC
Original Assignee
Waymo LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Waymo LLC
Publication of CN113692372A
Application granted
Publication of CN113692372B
Legal status: Active (granted)

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0289 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling with means for avoiding collisions between vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/161 Decentralised systems, e.g. inter-vehicle communication
    • G08G1/163 Decentralised systems, e.g. inter-vehicle communication involving continuous checking
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0953 Predicting travel path or likelihood of collision the prediction being responsive to vehicle dynamic parameters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0956 Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02 Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/035 Bringing the control units into a predefined state, e.g. giving priority to particular actuators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408 Radar; Laser, e.g. lidar

Abstract

Aspects of the present technology relate to exception handling for a vehicle 100. For example, a current trajectory 500 of the vehicle and sensor data corresponding to one or more objects may be received. Based on the received sensor data, predicted trajectories 580, 582, 584 of the one or more objects may be determined. Potential collisions with the one or more objects may be determined based on the predicted trajectories and the current trajectory. The earliest in time of the potential collisions may be identified, and a safe time horizon (STH) may be determined based on that potential collision. When a runtime exception occurs, the vehicle waits no longer than the STH for the runtime exception to resolve before performing a preventative maneuver to avoid the collision.

Description

Exception handling for autonomous vehicles
Cross Reference to Related Applications
This application claims the benefit of U.S. Application No. 16/383,096, filed April 12, 2019, the entire disclosure of which is incorporated herein by reference.
Background
Autonomous vehicles, such as vehicles that do not require a human driver, may be used to assist in transporting passengers or items from one location to another. Such vehicles may operate in a fully autonomous mode in which a user provides some initial input, such as a pickup or destination location, and the vehicle maneuvers itself to that location. Autonomous vehicles often rely on software and hardware systems that operate in a timely and closely coupled manner to successfully and safely maneuver the vehicle from one point to another. In the event that the vehicle's computing devices encounter a runtime exception that prevents or delays a system from operating as intended, the safe operation of the vehicle may be compromised.
Disclosure of Invention
One aspect of the present disclosure provides a method of exception handling for a vehicle, the method comprising: receiving, by one or more processors, a current trajectory of the vehicle; receiving, by the one or more processors, sensor data generated by a perception system of the vehicle having sensors, wherein the sensor data corresponds to one or more objects in an area around the vehicle; determining, by the one or more processors, predicted trajectories of the one or more objects based on the received sensor data; determining, by the one or more processors, potential collisions with the one or more objects based on the predicted trajectories and the current trajectory; identifying, by the one or more processors, the earliest in time of the potential collisions; determining, by the one or more processors, a safe time horizon (STH) based on that potential collision; and when a runtime exception occurs, waiting, by the one or more processors, no longer than the STH for the runtime exception to resolve before performing a preventative maneuver to avoid the collision.
In one example, the STH is determined based on a predetermined time period before the time of the identified potential collision. In another example, the STH is determined based on an exception handling rate profile. In this example, the exception handling rate profile is a constant deceleration of the vehicle. Alternatively, the exception handling rate profile corresponds to one or more changes to a deceleration of the vehicle. Additionally or alternatively, the method further comprises, when the runtime exception has not resolved after the STH, performing the preventative maneuver by using the exception handling rate profile to control the vehicle. In another example, the method further comprises periodically re-determining the STH. In another example, the runtime exception corresponds to a communication delay from a sensor. In another example, the sensor is a radar unit. In another example, the runtime exception corresponds to an absence of communication from a sensor of the perception system for a predetermined period of time.
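As a rough illustration of the constant-deceleration case, the STH can be sketched as the latest moment at which the vehicle could still brake to a stop before the earliest predicted collision. This is not the patent's actual computation; the deceleration value, safety margin, and function names are assumptions:

```python
def stopping_time(speed_mps: float, decel_mps2: float) -> float:
    """Time needed to brake from speed_mps to a stop at constant deceleration."""
    return speed_mps / decel_mps2

def safe_time_horizon(time_to_collision_s: float,
                      speed_mps: float,
                      decel_mps2: float = 4.0,
                      margin_s: float = 0.5) -> float:
    """How long the vehicle may wait for a runtime exception to resolve
    while still leaving enough time to brake before the earliest
    predicted collision (conservative, since braking also shortens
    the distance actually covered)."""
    horizon_s = time_to_collision_s - stopping_time(speed_mps, decel_mps2) - margin_s
    return max(horizon_s, 0.0)
```

For example, at 10 m/s with a collision predicted 10 s out and a 4 m/s^2 profile, the sketch allows a 7 s wait; if the collision is imminent, the horizon collapses to zero and the preventative maneuver begins immediately.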
Another aspect of the present disclosure provides a system for exception handling for a vehicle. The system includes one or more processors configured to: receive a current trajectory of the vehicle; receive sensor data generated by a perception system of the vehicle having sensors, wherein the sensor data corresponds to one or more objects in an area around the vehicle; determine predicted trajectories of the one or more objects based on the received sensor data; determine potential collisions with the one or more objects based on the predicted trajectories and the current trajectory; identify the earliest in time of the potential collisions; determine a safe time horizon (STH) based on that potential collision; and when a runtime exception occurs, wait no longer than the STH for the runtime exception to resolve before performing a preventative maneuver to avoid the collision.
In one example, the one or more processors are further configured to determine the STH based on a predetermined time period before the time of the identified potential collision. In another example, the one or more processors are further configured to determine the STH based on an exception handling rate profile. In this example, the exception handling rate profile is a constant deceleration of the vehicle. Additionally or alternatively, the one or more processors are further configured to, when the runtime exception has not resolved after the STH, perform the preventative maneuver by using the exception handling rate profile to control the vehicle. In another example, the one or more processors are further configured to periodically re-determine the STH. In another example, the runtime exception corresponds to a communication delay from a sensor. In another example, the system further comprises the sensor, and the sensor is a radar unit. In another example, the runtime exception corresponds to an absence of communication from a sensor of the perception system for a predetermined period of time. In another example, the system further includes the vehicle.
Drawings
FIG. 1 is a functional diagram of an example vehicle, according to an example embodiment.
FIG. 2 is an example of map information according to aspects of the present disclosure.
FIG. 3 is an example exterior view of a vehicle according to aspects of the present disclosure.
FIG. 4 is an example of a vehicle traveling on a road segment according to aspects of the present disclosure.
FIG. 5 continues the example of FIG. 4 with additional data, according to aspects of the present disclosure.
FIG. 6 continues the example of FIG. 5 with additional data, according to aspects of the present disclosure.
FIG. 7 is an example diagram of aspects of the example of FIG. 6 and additional data, according to aspects of the present disclosure.
FIG. 8 continues the example of FIG. 7 with additional data, according to aspects of the present disclosure.
FIG. 9 is an example flow diagram in accordance with aspects of the present disclosure.
Detailed Description
SUMMARY
The present technology relates to handling runtime exceptions in autonomous vehicles. Autonomous vehicles often rely on software and hardware systems that operate in a timely and closely coupled manner to successfully and safely maneuver the vehicle from one point to another. In some instances, the vehicle's computing devices may encounter a runtime exception that prevents or delays a system from operating as intended. In such instances, the vehicle may be forced to mitigate the risk introduced by the root cause of the runtime exception by performing a preventative maneuver (e.g., a quick stop or pulling over). Such preventative maneuvers may result in an uncomfortable experience for the vehicle's occupants and may not mitigate all risks to surrounding road users, such as drivers of other vehicles near the vehicle. Moreover, such maneuvers may not be necessary in practice, as many runtime exceptions eventually resolve themselves. To address these issues, the autonomous vehicle may allow a period of time, or safe time horizon ("STH"), for a runtime exception to resolve itself before performing a preventative maneuver.
The computing device may control movement of the autonomous vehicle. In this regard, the computing device is capable of communicating with various components of the vehicle. Based on data received from the various system components, the computing device may control the direction, velocity, acceleration, etc. of the vehicle by sending instructions to the various components of the vehicle.
Runtime exceptions may arise when one of the autonomous vehicle's computing devices or systems encounters a condition that it is not programmed to handle or for which it does not receive enough information. Such runtime exceptions may be caused by processing delays, communication delays, a lack of communication for a certain period of time, software or hardware failures, or other conditions that prevent a computing device from operating as intended.
For example, the perception system of the vehicle may include a RADAR unit that transmits and receives signals at a rate of 10 Hz. As such, the perception system may expect, and in some cases rely on, receiving RADAR messages every 100 ms. However, due to a communication delay, the perception system may not receive a RADAR message for a period of 150 ms. This 50 ms delay may trigger a runtime exception. In another example, the perception system may not receive RADAR messages at the expected 100 ms interval due to a loss of power to the RADAR caused by a faulty power cable, which may also trigger a runtime exception because messages will not be received within a reasonable time limit (as determined, for example, by using a timer). These cases may be handled by the same or separate software modules.
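The timer-based check described above might look roughly like the following sketch. The class name, expected period, and tolerance are assumptions mirroring the 10 Hz RADAR example, not details from the patent:

```python
import time

class SensorWatchdog:
    """Flags a runtime exception when a sensor message is overdue.
    The 100 ms period and 50 ms tolerance mirror the 10 Hz RADAR
    example; both values are illustrative."""

    def __init__(self, expected_period_s: float = 0.10,
                 tolerance_s: float = 0.05):
        self.deadline_s = expected_period_s + tolerance_s
        self.last_message_s = time.monotonic()

    def on_message(self) -> None:
        """Record the arrival of a sensor message."""
        self.last_message_s = time.monotonic()

    def overdue(self, now_s=None) -> bool:
        """True when no message has arrived within the deadline."""
        if now_s is None:
            now_s = time.monotonic()
        return now_s - self.last_message_s > self.deadline_s
```

A perception loop would call `on_message()` as each RADAR message arrives and poll `overdue()` periodically; a `True` result would raise the runtime exception for the exception handling logic to act on.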
Some runtime exceptions may resolve themselves after a period of time, while others may require external intervention. For example, a runtime exception caused by a communication delay from the RADAR may be the result of an object preventing the RADAR from receiving signals, or of a processing delay in a computing device of the RADAR and/or perception system (such as when the computing device is overloaded with processing tasks). In the first case, once the object has moved, the runtime exception may resolve, allowing the RADAR to again receive signals. In the second case, the load on the computing device may return to normal, allowing the RADAR to continue providing sensor data to the perception system and/or other systems of the vehicle. Although the above examples refer to RADAR, similar runtime exceptions and resolutions may occur at other sensors and computing devices of the vehicle. Other runtime exceptions, such as those caused by a faulty power cable, may require external intervention to resolve, such as replacement of the faulty cable by a technician.
However, the vehicle's computing devices may not know the root cause of a runtime exception, and as such may not know whether the runtime exception will resolve itself or require external intervention. Even when the computing devices are aware of the cause of a runtime exception, they may not know how long it will take for the runtime exception to resolve.
The vehicle's computing devices may take advantage of the possibility that a runtime exception will resolve itself by providing a period of time (or, more precisely, an expected point in time) for the runtime exception to recover before performing a preventative maneuver. Providing this recovery window, the STH, may allow the vehicle to avoid performing a preventative maneuver if the runtime exception recovers. Thus, the autonomous vehicle may maintain its current trajectory while waiting during the STH for the runtime exception to resolve. As a result, if the runtime exception resolves, the autonomous vehicle can avoid an unnecessary maneuver, thereby also maintaining the comfort of its passengers.
The vehicle's computing devices may determine the STH based on the current trajectory of the autonomous vehicle and the predicted (projected) trajectories of objects external to the autonomous vehicle. The current trajectory of the vehicle may be generated by the vehicle's planning system. Each trajectory may include a geometric component that describes the future physical path of the vehicle and a speed profile that describes the vehicle's future speed and how that speed changes over time. The current trajectory may then be sent to and processed by various other systems of the vehicle to make driving and other decisions in order to enable the vehicle's computing devices to control the vehicle.
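A trajectory of this shape, a geometric component paired with a speed profile, could be represented for illustration only as follows. The field and class names are assumptions, not the patent's data model:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrajectoryPoint:
    t_s: float        # time along the trajectory, seconds
    x_m: float        # geometric component: position, meters
    y_m: float
    speed_mps: float  # speed profile: planned speed at this time

@dataclass
class Trajectory:
    points: List[TrajectoryPoint]

    def position_at(self, t_s: float) -> Tuple[float, float]:
        """Linearly interpolate the geometric component at time t_s."""
        pts = self.points
        if t_s <= pts[0].t_s:
            return pts[0].x_m, pts[0].y_m
        for a, b in zip(pts, pts[1:]):
            if a.t_s <= t_s <= b.t_s:
                f = (t_s - a.t_s) / (b.t_s - a.t_s)
                return (a.x_m + f * (b.x_m - a.x_m),
                        a.y_m + f * (b.y_m - a.y_m))
        return pts[-1].x_m, pts[-1].y_m
```

Both the vehicle's current trajectory and each object's predicted trajectory can be queried this way when checking where and when their paths meet.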
The behavior modeling system of the vehicle may generate one or more predicted trajectories for each observed object external to the autonomous vehicle continuously or over a predetermined period of time. The behavior modeling system may input sensor data received from the perception system into one or more models and determine or generate one or more predicted trajectories of the object. Each predicted trajectory may correspond to a possible path that an object may potentially traverse and the time at which the object is expected to be at a different point along the path. These predicted trajectories may then be sent to and processed by various other systems of the vehicle to make driving and other decisions for the vehicle.
The predicted trajectories of the objects may be compared to the current trajectory of the autonomous vehicle to identify potential collisions. Based on this comparison, the vehicle's computing devices may determine potential locations and times at which the current trajectory of the autonomous vehicle will intersect the trajectory of an object. Such locations and times correspond to potential collisions, that is, locations and times where a collision is predicted to occur at some point in the future.
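As a simplified illustration of this comparison (the sampling scheme and collision radius are assumptions, not the patent's method), potential collisions can be found by checking where the two trajectories come close on a shared time grid:

```python
import math

def potential_collisions(ego_samples, obj_samples, radius_m=2.0):
    """ego_samples and obj_samples are lists of (t_s, x_m, y_m) tuples
    sampled on the same time grid. Returns the times at which the two
    paths come within radius_m of each other, i.e. the candidate
    collision times."""
    collision_times = []
    for (t_s, ex, ey), (_, ox, oy) in zip(ego_samples, obj_samples):
        if math.hypot(ex - ox, ey - oy) <= radius_m:
            collision_times.append(t_s)
    return collision_times
```

Running this against each object's predicted trajectory yields the set of potential collisions from which the earliest in time is then selected.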
The vehicle's computing devices may then identify the earliest potential collision in time and determine the STH for it. If the runtime exception resolves itself during the STH, the vehicle's computing devices may continue to control the vehicle without taking a preventative maneuver or performing some other exception handling function. If the runtime exception does not resolve itself, the vehicle's computing devices will still have time to take a preventative maneuver or perform some other exception handling function.
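Putting these steps together, the wait-then-maneuver logic can be sketched as below. The fixed margin and polling loop are illustrative assumptions; the patent derives the STH from the collision time and an exception handling rate profile:

```python
import time

def handle_runtime_exception(collision_times_s, exception_resolved,
                             perform_preventative_maneuver,
                             margin_s=1.5, poll_s=0.01):
    """Wait no longer than the STH for a runtime exception to resolve.
    Here the STH is simply a fixed margin before the earliest predicted
    collision. exception_resolved is a callable polled for recovery;
    perform_preventative_maneuver is invoked if the STH expires."""
    sth_s = min(collision_times_s) - margin_s
    deadline_s = time.monotonic() + max(sth_s, 0.0)
    while time.monotonic() < deadline_s:
        if exception_resolved():
            return "resolved"   # keep following the current trajectory
        time.sleep(poll_s)
    perform_preventative_maneuver()
    return "maneuver"
```

If the exception clears within the horizon, the vehicle simply continues on its current trajectory; otherwise the preventative maneuver is triggered while enough time remains to execute it.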
The features described herein may allow an autonomous vehicle to avoid taking unnecessary or overly cautious preventative maneuvers when a runtime exception would resolve itself. By doing so, the autonomous vehicle may continue to operate as intended, avoiding unexpected and uncomfortable maneuvers or delays that may cause discomfort to occupants, while still maintaining the safety of the vehicle and its occupants.
Example System
As shown in fig. 1, a vehicle 100 according to an aspect of the present disclosure includes various components. While certain aspects of the present disclosure are particularly useful in conjunction with a particular type of vehicle, the vehicle may be any type of vehicle, including but not limited to cars, trucks, motorcycles, buses, recreational vehicles, and the like. The vehicle may have one or more computing devices, such as computing device 110 containing one or more processors 120, memory 130, and other components typically found in a general purpose computing device.
Memory 130 stores information accessible by one or more processors 120, including instructions 132 and data 134 that may be executed or otherwise used by processors 120. The memory 130 may be any type of memory capable of storing information accessible by the processor, including a computing device readable medium or other medium that stores data that may be read by an electronic device, such as a hard disk drive, memory card, ROM, RAM, DVD, or other optical disk, as well as other writable and read-only memories. The systems and methods may include different combinations of the foregoing, whereby different portions of the instructions and data are stored on different types of media.
The instructions 132 may be any set of instructions executed directly (such as machine code) or indirectly (such as scripts) by a processor. For example, the instructions may be stored as computing device code on a computing device readable medium. In this regard, the terms "software," "instructions," and "programs" may be used interchangeably herein. The instructions may be stored in an object code format for direct processing by a processor or in any other computing device language, including a collection of separate source code modules or scripts that are interpreted or pre-compiled as needed. The function, method and routine of the instructions are explained in more detail below.
Processor 120 may retrieve, store, or modify data 134 according to instructions 132. For example, although claimed subject matter is not limited by any particular data structure, data may be stored in a computing device register, in a relational database as a table, XML document, or flat file having a plurality of different fields and records. The data may also be formatted in any computing device readable format.
The one or more processors 120 may be any conventional processor, such as a commercially available CPU. Alternatively, one or more processors may be special purpose devices, such as an ASIC or other hardware-based processor. Although fig. 1 functionally shows the processor, memory, and other elements of the computing device 110 as being within the same block, those of ordinary skill in the art will appreciate that a processor, computing device, or memory may in fact be comprised of multiple processors, computing devices, or memories, which may or may not be housed within the same physical housing. For example, the memory may be a hard drive or other storage medium located in a different enclosure than the enclosure of the computing device 110. Thus, references to a processor or computing device are to be understood as including references to a collection of processors or computing devices or memories that may or may not operate in parallel.
Computing device 110 may include all of the components typically used in connection with computing devices, such as the processors and memories described above, as well as user input 150 (e.g., a mouse, keyboard, touch screen, and/or microphone) and various electronic displays (e.g., a monitor having a screen or any other electrical device operable to display information). In this example, the vehicle includes an internal electronic display 152 and one or more speakers 154 to provide an informational or audiovisual experience. In this regard, the internal electronic display 152 may be located within a cabin of the vehicle 100 and may be used by the computing device 110 to provide information to passengers within the vehicle 100.
Computing device 110 may also include one or more wireless network connections 156 to facilitate communication with other computing devices, such as the client and server computing devices described in detail below. The wireless network connections may include short-range communication protocols such as Bluetooth and Bluetooth Low Energy (LE), cellular connections, and various configurations and protocols including the Internet, World Wide Web, intranets, virtual private networks, wide area networks, local networks, private networks using communication protocols proprietary to one or more companies, Ethernet, WiFi, and HTTP, as well as various combinations of the foregoing.
In one example, the computing device 110 may be a control computing device of an autonomous driving computing system or incorporated into the vehicle 100. The autonomous driving computing system can communicate with various components of the vehicle to control movement of the vehicle 100 according to the autonomous vehicle control software of the memory 130, as discussed further below. For example, returning to fig. 1, the computing device 110 may communicate with various systems of the vehicle 100, such as a deceleration system 160, an acceleration system 162, a steering system 164, a signaling system 166, a planning system 168, a routing system 170, a positioning system 172, a perception system 174, a behavior modeling system 176, and a powertrain system 178 (i.e., an engine or motor of the vehicle) to control movement, velocity, etc. of the vehicle 100 according to the instructions 132 of the memory 130. Each of these systems may include various hardware (processors and memory similar to processor 120 and memory 130) and software to enable these systems to perform various tasks. Also, while these systems are shown external to computing device 110, in practice, these systems may also be incorporated into computing device 110, again as an autonomous driving computing system for controlling vehicle 100.
As an example, computing device 110 may interact with one or more actuators of deceleration system 160 and/or acceleration system 162 (such as a brake, an accelerator pedal, and/or an engine or motor of the vehicle) to control the speed of the vehicle. Similarly, the computing device 110 may use one or more actuators of the steering system 164, such as a steering wheel, a steering shaft, and/or the rack and pinion of a rack-and-pinion steering system, to control the direction of the vehicle 100. For example, if the vehicle 100 is configured for use on a road, such as a car or truck, the steering system may include one or more actuators to control the angle of the wheels to turn the vehicle. The computing device 110 may use the signaling system 166 to signal the vehicle's intent to other drivers or vehicles (e.g., by illuminating turn signals or brake lights when needed).
The computing device 110 may use the planning system 168 to determine and follow routes to locations generated by the routing system 170. For example, the routing system 170 may use map information to determine a route from the current location of the vehicle to the destination location. The planning system 168 may periodically generate trajectories or short term plans for controlling the vehicle at some time in the future in order to follow a route to a destination. In this regard, the planning system 168, routing system 170, and/or data 134 may store detailed map information, such as highly detailed maps identifying roads, lane lines, intersections, crosswalks, speed limits, traffic signals, buildings, signs, real-time traffic information, shapes and heights of vegetation, or other such objects and information.
Fig. 2 is an example of map information 200 for a section of roadway including an intersection 202. The map information 200 includes information identifying the shape, location, and other characteristics of various features including lane lines 210, 211, 212, 213, 214, traffic control devices 220, 222 (which may include, for example, traffic lights, stop signs, etc.), crosswalks 230, 232, sidewalks 240, 242, road markings including arrows 250, 251, 252, and features such as lanes 260, 261, 262, 263, 264, 265. Although only a few features are shown and identified, the map information 200 may be highly detailed and include various additional features.
Although the map information is depicted herein as an image-based map, the map information need not be entirely image-based (e.g., raster). For example, map information may include one or more road maps or graphical networks of information such as roads, lanes, intersections, and connections between these features. Each feature may be stored as graphical data and may be associated with information such as the geographic location and whether it is linked to other relevant features (e.g., stop signs may be linked to roads and intersections, etc.). In some examples, the associated data may include a grid-based index of road maps to allow efficient lookup of certain road map features.
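As a rough illustration of the graph-style map and grid-based index described above, the sketch below stores each feature with its location and its links to related features and indexes features by grid cell for efficient lookup. The class name, feature fields, and cell size are illustrative assumptions, not the patent's actual data model.

```python
from collections import defaultdict

# Hypothetical sketch of a road map stored as graph data with a grid-based
# index (names and cell size are illustrative assumptions).
class RoadGraph:
    def __init__(self, cell_size=50.0):
        self.cell_size = cell_size
        self.features = {}             # feature id -> (x, y, kind, linked ids)
        self.grid = defaultdict(list)  # (col, row) grid cell -> feature ids

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def add_feature(self, fid, x, y, kind, links=()):
        # Each feature keeps its geographic location and links to related
        # features (e.g., a stop sign linked to a road and an intersection).
        self.features[fid] = (x, y, kind, tuple(links))
        self.grid[self._cell(x, y)].append(fid)

    def features_near(self, x, y):
        """Grid-index lookup: features in the cell containing (x, y)."""
        return list(self.grid[self._cell(x, y)])
```

A lookup then touches only one grid cell rather than scanning every feature in the map.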
The computing device 110 may use the positioning system 172 to determine the relative or absolute position of the vehicle on a map or on the earth. For example, the positioning system 172 may include a GPS receiver to determine the latitude, longitude, and/or altitude location of the device. Other positioning systems, such as laser-based positioning systems, inertial assisted GPS, or camera-based positioning, may also be used to identify the location of the vehicle. The location of the vehicle may include an absolute geographic location, such as latitude, longitude, and altitude, as well as relative location information, such as the location relative to other cars immediately surrounding it, which may generally be determined with less noise than the absolute geographic location.
The positioning system 172 may also include other devices (such as an accelerometer, a gyroscope, or another direction/velocity detection device) in communication with the computing device 110 to determine the direction and speed of the vehicle or changes thereto. By way of example only, an acceleration device may determine its pitch, yaw, or roll (or changes thereof) relative to the direction of gravity or a plane perpendicular thereto. The device may also track increases or decreases in speed and the direction of such changes. Location and orientation data, as set forth herein, may be provided automatically to the computing device 110, other computing devices, and combinations of the foregoing.
The perception system 174 also includes one or more components for detecting objects external to the vehicle, such as other vehicles, obstacles in the road, traffic signals, signs, trees, and so forth. For example, perception system 174 may include lasers, sonar, radar, cameras, and/or any other detection devices that record data that may be processed by computing device 110. In the case where the vehicle is a passenger vehicle such as a minivan, the minivan may include a laser or other sensor mounted on the roof or other convenient location. For example, fig. 3 is an example exterior view of the vehicle 100. In this example, a roof-top housing 310 and a dome housing 312 may include a LIDAR sensor as well as various cameras and radar units. Further, a housing 320 located at the front end of the vehicle 100 and housings 330, 332 on the driver and passenger sides of the vehicle may each house a LIDAR sensor. For example, the housing 330 is located in front of the driver's door 350. The vehicle 100 also includes housings 340, 342 for radar units and/or cameras, also located on the roof of the vehicle 100. Additional radar units and cameras (not shown) may be located at the front and rear ends of the vehicle 100 and/or at other locations along the roof or roof-top housing 310. The vehicle 100 also includes many features of a typical passenger vehicle, such as doors 350, 352, wheels 360, 362, etc.
Various systems of the vehicle may operate using autonomous vehicle control software in order to determine how to control the vehicle and to control it. As an example, the perception system software modules of perception system 174 may use sensor data generated by one or more sensors of the autonomous vehicle (such as cameras, LIDAR sensors, radar units, sonar units, etc.) to detect and identify objects and their characteristics. These characteristics may include location, type, orientation, velocity, acceleration, change in acceleration, size, shape, and the like. In some instances, the characteristics may be input into a behavior prediction system software module that uses various models based on object type to output predicted future behavior for the detected object. In other instances, the characteristics may be input into one or more detection system software modules, such as a construction zone detection system software module configured to detect construction zones from sensor data generated by the one or more sensors of the vehicle, and an emergency vehicle detection system configured to detect emergency vehicles from sensor data generated by sensors of the vehicle. Each of these detection system software modules may use various models to output a likelihood that an area is a construction zone or that an object is an emergency vehicle. Detected objects, predicted future behaviors, the various likelihoods from the detection system software modules, map information identifying the vehicle's environment, positioning information from the positioning system 172 identifying the location and orientation of the vehicle, the destination of the vehicle, and feedback from various other systems of the vehicle (including routes generated by the routing system 170) may be input into the planning system software module of the planning system 168. The planning system may use this input to generate trajectories for the vehicle to follow for some short period of time into the future.
The control system software module of the computing device 110 may be configured to control movement of the vehicle (e.g., by controlling braking, acceleration, and steering of the vehicle) in order to follow the trajectory.
The computing device 110 may autonomously control the direction and speed of the vehicle by controlling various components. To do so, the computing device 110 may accelerate the vehicle (e.g., by increasing fuel or other energy provided to the engine by the acceleration system 162), decelerate (e.g., by decreasing fuel supplied to the engine, shifting gears, and/or by applying brakes by the deceleration system 160), change direction (e.g., by turning front or rear wheels of the vehicle 100 by the steering system 164), and signal such a change (e.g., by illuminating a turn signal of the signaling system 166). Thus, acceleration system 162 and deceleration system 160 may be part of a transmission system that includes various components between the vehicle engine and the vehicle wheels. Also, by controlling these systems, the computing device 110 may also control the driveline of the vehicle to autonomously steer the vehicle.
Example method
In addition to the operations described above and illustrated in the figures, various operations will now be described. It should be understood that the following operations need not be performed in the exact order described below. Rather, various steps may be processed in a different order or concurrently, and steps may also be added or omitted.
The computing devices of the vehicle may take advantage of the possibility that a runtime exception will resolve itself by providing a period of time (or, more precisely, an expected point in time) for the runtime exception to recover before performing a preventative maneuver. Providing an expected point in time for the runtime exception to recover, or a safe time horizon (STH), may allow the vehicle to avoid performing a preventative maneuver if the runtime exception resolves. Thus, the autonomous vehicle may maintain its current trajectory while waiting for the runtime exception to resolve during the STH. As a result, if the runtime exception is resolved, the autonomous vehicle can avoid unnecessary maneuvers, thereby also maintaining the comfort of its passengers.
Fig. 4 shows the vehicle 100 traveling on a section of roadway 400 corresponding to the map information 200. The roadway 400 includes various features identified by their shape, location, and other characteristics, including an intersection 402 corresponding to the intersection 202, lane lines 410, 411, 412, 413, 414 corresponding to the lane lines 210, 211, 212, 213, 214, traffic control devices 420, 422 corresponding to the traffic control devices 220, 222, crosswalks 430, 432 corresponding to the crosswalks 230, 232, sidewalks 440, 442 corresponding to the sidewalks 240, 242, arrows 450, 451, 452 corresponding to the arrows 250, 251, 252, and lanes 460, 461, 462, 463, 464, 465 corresponding to the lanes 260, 261, 262, 263, 264, 265. In this example, the vehicle 100 is approaching the intersection 402 in lane 465. In addition, the vehicle 480 is also approaching the intersection 402 in lane 465, and the vehicle 490 is approaching the intersection 402 in lane 463. This example depicts the vehicle 100 at position l1 at time t1.
Fig. 9 is an example flow diagram 900 of aspects of the techniques described herein for exception handling for a vehicle (such as vehicle 100), which may be executed by one or more processors of one or more computing devices of the vehicle (such as processor 120 of computing device 110). At block 910, a current trajectory of the vehicle is received. The current trajectory of the vehicle may be generated by the planning system 168 based on the route generated by the routing system 170, the predicted trajectories generated by the behavior modeling system 176, and the sensor data and other data generated by the perception system 174. Each trajectory may include a geometric component that describes a future physical path of the vehicle and a velocity profile that describes the future speed of the vehicle and changes in speed over time. The current trajectory may then be transmitted to and processed by various other systems of the vehicle (including, for example, a computing device of the vehicle such as computing device 110) in order to make driving and other decisions for the vehicle. Turning to the example of fig. 5, the vehicle 100 is currently following the trajectory 500.
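To make the two trajectory components concrete, here is a minimal, hypothetical sketch (not the patent's actual data model) of a trajectory holding a geometric path and a velocity profile, with simple linear interpolation of the planned speed:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Illustrative sketch: a trajectory as a geometric path plus a velocity
# profile describing speed over time. Field names are assumptions.
@dataclass
class Trajectory:
    path: List[Tuple[float, float]]               # future path: (x, y) waypoints
    velocity_profile: List[Tuple[float, float]]   # (time s, speed m/s) samples

    def speed_at(self, t: float) -> float:
        """Linearly interpolate the planned speed at time t."""
        samples = self.velocity_profile
        if t <= samples[0][0]:
            return samples[0][1]
        for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
            if t0 <= t <= t1:
                return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return samples[-1][1]
```

For instance, a profile ramping from 20 m/s at t = 0 down to 0 at t = 4 yields a planned speed of 10 m/s at t = 2.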
Returning to FIG. 9, at block 920, sensor data generated by a sensing system of the vehicle is received. The sensor data corresponds to one or more objects in an area surrounding the vehicle. For example, the sensing system 174 may use various sensors of the vehicle to generate sensor data. The sensor data may be raw sensor data or processed sensor data as well as other information about the characteristics of objects in the area surrounding the vehicle 100. This may include, for example, position, orientation, velocity, acceleration/deceleration, changes in acceleration/deceleration, and the like.
The behavior modeling system 176 of the vehicle may generate one or more predicted trajectories for each observed object outside of the autonomous vehicle, either continuously or at predetermined intervals (such as every 100 milliseconds, or more or less). For example, the behavior modeling system 176 may receive sensor data and other data for the object from the perception system 174. Again, the sensor data may be raw sensor data or processed sensor data as well as other information about the characteristics of objects in the area surrounding the vehicle 100.
At block 930, predicted trajectories of the one or more objects may be determined based on the received sensor data. For example, the behavior modeling system 176 may input sensor data received from the perception system 174 into one or more models and determine or generate one or more predicted trajectories for each object. These predicted trajectories may then be sent to and processed by various other systems of the vehicle (including, for example, a computing device of the vehicle such as computing device 110) in order to make driving and other decisions for the vehicle.
The models may be based on the typical operation of similar objects. For example, the predicted trajectory for a vehicle stopped at a light may be based on the typical operation (e.g., speed, acceleration, heading, etc.) of other vehicles at the same or similar lights. In some instances, the models may also account for irregular operation of similar objects. For example, the predicted trajectories for a vehicle stopped at a light may include trajectories corresponding to the stopped vehicle backing up, accelerating quickly, stopping quickly after starting to move, and the like. Predicted trajectories based on irregular operation may be limited to those that are physically feasible. In other words, the irregular operation used to generate a predicted trajectory may be an action that the object is known to be able to perform.
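A sketch of the "physically feasible" restriction above: only keep an irregular maneuver if it stays within assumed acceleration limits for the object's type. The limit values and type names here are illustrative assumptions, not figures from the patent.

```python
# Assumed per-object-type physical limits (m/s^2); illustrative values only.
PHYSICAL_LIMITS = {
    "car": {"max_accel": 4.0, "max_decel": 8.0},
    "pedestrian": {"max_accel": 2.0, "max_decel": 4.0},
}

def physically_feasible(obj_type: str, required_accel: float) -> bool:
    """Return True if the maneuver's required acceleration (negative for
    braking) is within the object type's assumed physical limits."""
    limits = PHYSICAL_LIMITS[obj_type]
    if required_accel >= 0:
        return required_accel <= limits["max_accel"]
    return -required_accel <= limits["max_decel"]
```

A behavior model could then discard, say, a hypothesized car maneuver requiring 9 m/s² of braking as not physically feasible.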
Each predicted trajectory may correspond to a possible path that an object may potentially traverse and a time at which the object is expected to be at a different point along the path. For example, the behavior modeling system may generate predicted trajectories for vehicles stopping at the intersection and vehicles traveling through the intersection using the above data provided by the perception system. Returning to FIG. 5, for a vehicle 480 that may stop at the intersection 402, the behavior modeling system may generate estimated trajectories 580, 582, 584. For a vehicle 490 that may be traveling through the intersection 402, the behavior modeling system may generate the predicted trajectories 590, 592. Although only five predicted trajectories are shown, there may be more or fewer predicted trajectories generated for each object. In some instances, stationary objects such as road signs, trees, etc. may be filtered by the behavior modeling system or otherwise ignored or not processed.
Returning to FIG. 9, at block 940, a potential collision with one or more objects may be determined based on the predicted trajectory and the current trajectory (of the vehicle). For example, the predicted trajectory of the object may be compared to a current trajectory of the autonomous vehicle to identify a potential collision. Based on the comparison, a computing device of the vehicle, such as computing device 110, may determine potential locations and times at which a current trajectory of the autonomous vehicle (e.g., trajectory 500) will intersect the trajectory of the object. Such locations and times may correspond to locations and times of potential collisions or where a collision is predicted to occur at some point in the future.
For example, turning to FIG. 6, the current trajectory of the autonomous vehicle may result in potential collisions (represented by location points 600, 610, respectively) with the predicted trajectories of the stopped vehicle and the vehicle traveling through the intersection. In other words, these location points may represent combinations of locations and times at which the trajectory 500 will intersect the predicted trajectories of the vehicles 480 and 490. For example, location point 600 may represent a potential collision at time t2 and position l2, and similarly, location point 610 may represent a potential collision at time t3 and position l3. Thus, although the location point 610 is near the predicted trajectory 582, that location point does not represent a potential collision with the vehicle 480, as the vehicle 480 and the vehicle 100 will not intersect in time (only in location).
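The distinction drawn above, that a potential collision requires the two trajectories to coincide in both location and time, can be sketched as follows. The sample format, distance threshold, and time tolerance are illustrative assumptions.

```python
import math

# Hedged sketch of the space-time check: a point only counts as a potential
# collision if the trajectories are close in BOTH position and time.
def potential_collisions(ego, other, radius=2.0, dt=0.5):
    """Each trajectory is a list of (t, x, y) samples; return the (t, x, y)
    ego samples that conflict with the other object's trajectory."""
    hits = []
    for (t1, x1, y1) in ego:
        for (t2, x2, y2) in other:
            close_in_space = math.hypot(x1 - x2, y1 - y2) <= radius
            close_in_time = abs(t1 - t2) <= dt
            if close_in_space and close_in_time:
                hits.append((t1, x1, y1))
    return hits

def earliest_collision(all_hits):
    """Identify the earliest-in-time potential collision (cf. block 950)."""
    return min(all_hits, key=lambda h: h[0]) if all_hits else None
```

A trajectory that merely passes through the same position at a different time produces no hit, matching the location point 610 discussion.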
Returning to FIG. 9, at block 950, the earliest in time one of the potential collisions may be identified. For example, a computing device of the vehicle, such as computing device 110, may identify the earliest possible collision in time. As described above, and as shown in the examples of figs. 4, 5, and 6, the current position of the vehicle at time t1 is l1. In this example, the first potential collision in time is likely to occur with the vehicle 480 at location l2 and time t2 (i.e., location point 600), and the second potential collision in time is likely to occur with the vehicle 490 at location l3 and time t3 (i.e., location point 610).
Returning to FIG. 9, at block 960, a safe time horizon is determined based on the potential collision. The computing device of the vehicle may determine the STH for the earliest possible collision in time. As in the above example, the earliest possible collision in time may occur at time t2 and position l2. In this regard, the STH may be a particular point in time t′, or the period from the current time t1 to time t′. Time t′ may be a certain predetermined time period before the time of the earliest possible collision (here, t2), in order to allow the autonomous vehicle to handle the runtime exception at least before that predetermined time. The computing device of the vehicle may then solve for the largest time t′ satisfying the following:
∫_{t1}^{t′} f(t) dt + ∫_{t′}^{ts} fe(t) dt ≤ l2 − l1

f(t′) = fe(t′)

t1 ≤ t′ ≤ t2
in this example, f (t) is a current trajectory velocity profile of the autonomous vehicle, and fe(t) is an exception handling rate profile for the autonomous vehicle.
As one example, the exception handling rate profile may apply a constant deceleration until the vehicle comes to a complete stop. For instance, the constant deceleration may be −6 m/s², or more or less. However, because this rate of deceleration would be used in an emergency, it may be rather uncomfortable for passengers. Of course, more complex velocity profiles, in which acceleration changes over time, may also be used. For example, the rate profile may be selected based on the type of data (e.g., delayed data) that caused the runtime exception. If the cause of the runtime exception is of higher priority (such as sensor data from the perception system being lost or not received in time), the rate profile may cause the vehicle to automatically "slam" on the brakes. As another example, if the runtime exception is of lower priority (such as a connection to a remote computing device), the rate profile may cause the vehicle to slow down at a more reasonable rate.
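One way to realize this priority-based choice of profile is a simple mapping from the cause of the runtime exception to a deceleration value. The cause strings and the gentler rate below are assumptions for illustration; only the −6 m/s² figure comes from the text.

```python
# Assumed deceleration values (m/s^2): the hard-brake figure comes from the
# text above; the gentle figure and the cause names are illustrative.
HARD_BRAKE = -6.0    # emergency stop, uncomfortable for passengers
GENTLE_BRAKE = -2.0  # more reasonable slowdown for lower-priority exceptions

def exception_handling_deceleration(cause: str) -> float:
    """Pick a deceleration based on the priority of the exception's cause."""
    high_priority = {"sensor_data_missing", "sensor_data_delayed"}
    return HARD_BRAKE if cause in high_priority else GENTLE_BRAKE
```

A lost connection to a remote computing device would thus map to the gentler profile, while missing perception data would trigger hard braking.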
Returning to FIG. 9, at block 970, when a runtime exception occurs, the vehicle waits no longer than the STH for the runtime exception to resolve before performing a preventative maneuver to avoid the collision. In the event of a runtime exception, the vehicle's computing device may wait for the period of the STH, or more precisely, the period from t1 to t′, for the runtime exception to be resolved. If the STH has passed and the runtime exception has not yet been resolved, the vehicle's computing device may engage in an exception handling response. The exception handling response may include the exception handling rate profile.
Fig. 7 shows example operating conditions of the vehicle 100 corresponding to the examples of figs. 4, 5, and 6. In this example, at time t1 the vehicle 100 is at position l1, traveling at a constant (i.e., neither accelerating nor decelerating) speed of 20 m/s (i.e., f(t)). The vehicle 480 is stopped at location l2, a distance D (e.g., 100 meters) ahead. As described above, the computing device of the vehicle may determine that the earliest possible collision in time is at location point 600, that is, at the current location l2 of the vehicle 480. In this example, the estimated time of the earliest possible collision, t2, may be 5 seconds from the current time t1. Further, the vehicle 100 can reliably decelerate at −5 m/s². Given this information, a computing device such as computing device 110 may determine that the vehicle 100 must decelerate from 20 m/s to 0 to avoid a collision with the vehicle 480. This level of deceleration may be the exception handling rate profile.
Using the above data, a computing device of the vehicle, such as computing device 110, may determine that the autonomous vehicle will take at least 4 seconds (20 m/s ÷ 5 m/s²) to come to a complete stop before reaching the vehicle 480 (assuming the vehicle 480 remains stationary), covering 40 meters in the process. As such, the vehicle 100 must begin decelerating no later than 40 meters behind the vehicle 480, or in other words, 60 meters ahead of its current location l1, in order to avoid a collision. In this regard, the vehicle can continue at its current speed for the next 3 seconds (60 m ÷ 20 m/s) while safely ignoring any runtime exceptions. Thus, the STH may be 3 seconds. In other words, for a runtime exception occurring at time t1, the computing device of the vehicle may wait up to 3 seconds for the runtime exception to be resolved.
As described above, if the runtime exception resolves itself during the STH, the vehicle's computing device may continue to control the vehicle without taking a preventative maneuver or performing some other exception handling function. If the runtime exception does not resolve itself, the computing device of the vehicle will still have time to take a preventative maneuver or perform some other exception handling function. For example, returning to FIG. 7, a computing device of the vehicle, such as computing device 110, may have time ts (here, ts = 2 s) to decelerate according to the exception handling rate profile and stop before reaching the vehicle 480.
Turning to FIG. 8, which depicts a time tn corresponding to 2 seconds after time t1, the vehicle 100 has moved closer to the vehicle 480 than in the example of fig. 7. If a runtime exception were to occur at this time, the STH would be only 1 second; that is, STH = 3 seconds − 2 seconds = 1 second. As such, the vehicle's computing device may wait only 1 second before beginning to decelerate according to the exception handling rate profile in order to stop before reaching the vehicle 480. In other words, once the vehicle 100 reaches location ln, it must begin decelerating according to the exception handling rate profile.
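The shrinking STH of figs. 7 and 8 can be reproduced with the example's numbers (20 m/s constant speed, 5 m/s² reliable deceleration, 100-meter gap): the STH falls by one second for each second of travel until it reaches zero. A sketch, assuming constant speed until braking begins; the function name and defaults are illustrative.

```python
# Remaining safe time horizon at time t for the FIG. 7 / FIG. 8 scenario
# (assumed constant-speed, constant-deceleration special case).
def sth_at(t: float, t1: float = 0.0, v: float = 20.0,
           a: float = 5.0, distance: float = 100.0) -> float:
    remaining = distance - v * (t - t1)   # meters still ahead at time t
    braking_distance = v * v / (2.0 * a)  # 40 m from 20 m/s at 5 m/s^2
    return max(0.0, (remaining - braking_distance) / v)
```

At t = t1 this gives the 3-second STH of fig. 7; two seconds later it gives the 1-second STH of fig. 8; at three seconds it reaches zero and braking must begin.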
The STH may be determined (or re-determined) periodically (e.g., every 100 milliseconds, or more or less, or every time new sensor data is received from the perception system). In this regard, a computing device of the vehicle, such as computing device 110, may continue to count down the STH until a new STH is determined. Alternatively, the STH may be determined in response to a runtime exception, and a countdown may be initiated once the STH is determined. In this regard, upon the occurrence of a runtime exception, the vehicle's computing device may wait up to the STH for the runtime exception to resolve itself. Still further, if the runtime exception does not resolve itself during the STH, the vehicle's computing device may automatically initiate a preventative maneuver such as stopping quickly or pulling over.
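The wait-then-fallback behavior of block 970 might be sketched as below. The function and callback names are hypothetical, and the clock is injected so the logic stays deterministic and testable.

```python
# Hedged sketch of block 970: poll until the runtime exception resolves or
# the STH expires; names and the polling structure are assumptions.
def wait_for_resolution(sth_seconds, is_resolved, clock):
    """is_resolved: callable returning True once the exception has cleared.
    clock: callable returning the current time in seconds."""
    deadline = clock() + sth_seconds
    while clock() < deadline:
        if is_resolved():
            return "resolved"          # continue on the current trajectory
    return "engage_exception_handling" # begin the preventative maneuver
```

If the exception clears within the STH the vehicle keeps its current trajectory; otherwise the exception handling response (e.g., the deceleration profile) is engaged.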
The features described herein may allow an autonomous vehicle to avoid taking unnecessary or overly cautious preventative maneuvers in the event of a runtime exception that resolves itself. By doing so, the autonomous vehicle may continue to operate as intended, avoiding unexpected and uncomfortable maneuvers or delays that may cause discomfort to occupants, while still maintaining the safety of the vehicle and its occupants.
Unless otherwise indicated, the above-described alternative examples are not mutually exclusive and may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. Furthermore, the provision of examples described herein and clauses phrased as "such as," "including," and the like, should not be interpreted as limiting the claimed subject matter to the specific examples; rather, the example is intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings may identify the same or similar elements.

Claims (20)

1. A method for exception handling for a vehicle, the method comprising:
receiving, by one or more processors, a current trajectory of a vehicle;
receiving, by one or more processors, sensor data generated by a perception system of a vehicle having sensors, wherein the sensor data corresponds to one or more objects of an area surrounding the vehicle;
determining, by the one or more processors, a predicted trajectory of the one or more objects based on the received sensor data;
determining, by the one or more processors, a potential collision with the one or more objects based on the predicted trajectory and the current trajectory;
identifying, by the one or more processors, an earliest in time one of the potential collisions;
determining, by one or more processors, a safe time horizon STH based on the one of the potential collisions; and
when a runtime exception occurs, waiting, by the one or more processors, for no longer than the STH for the runtime exception to resolve before performing a preventative maneuver to avoid the collision.
2. The method of claim 1, wherein determining the STH is based on a predetermined period of time prior to a time of the one of the potential collisions.
3. The method of claim 1, wherein determining the STH is based on an exception handling rate profile.
4. The method of claim 3, wherein the exception handling rate profile is a constant deceleration measure of the vehicle.
5. The method of claim 3, wherein the exception handling rate profile corresponds to one or more changes to a deceleration metric of a vehicle.
6. The method of claim 3, further comprising, when the runtime exception has not been resolved after the STH, performing the preventative maneuver by using the exception handling rate profile to control the vehicle.
7. The method of claim 1, further comprising periodically re-determining the STH.
8. The method of claim 1, wherein the runtime exception corresponds to a communication delay from a sensor.
9. The method of claim 1, wherein the sensor is a radar unit.
10. The method of claim 1, wherein the runtime exception corresponds to an absence of communication from a sensor of the perception system for a predetermined period of time.
11. A system for exception handling for a vehicle, the system comprising one or more processors configured to:
receiving a current trajectory of a vehicle;
receiving sensor data generated by a perception system of a vehicle having sensors, wherein the sensor data corresponds to one or more objects of an area surrounding the vehicle;
determining a predicted trajectory of the one or more objects based on the received sensor data;
determining potential collisions with one or more objects based on the predicted trajectory and the current trajectory;
identifying an earliest in time one of the potential collisions;
determining a safe time horizon STH based on the one of the potential collisions; and
when a runtime exception occurs, wait no longer than STH for the runtime exception to resolve before performing a preventative maneuver to avoid the collision.
12. The system of claim 11, wherein the one or more processors are further configured to determine the STH based on a predetermined time period before a time of the one of the potential collisions.
13. The system of claim 11, wherein the one or more processors are further configured to determine the STH based on an exception handling rate distribution.
14. The system of claim 13, wherein the exception handling rate profile is a constant deceleration measure of a vehicle.
15. The system of claim 13, wherein the one or more processors are further configured to, when the runtime exception has not been resolved after the STH, perform the preventative maneuver by using the exception handling rate profile to control the vehicle.
16. The system of claim 11, wherein the one or more processors are further configured to periodically re-determine the STH.
17. The system of claim 11, wherein the runtime exception corresponds to a communication delay from a sensor.
18. The system of claim 11, further comprising a sensor, and wherein the sensor is a radar unit.
19. The system of claim 11, wherein the runtime exception corresponds to an absence of communication from a sensor of the perception system for a predetermined period of time.
20. The system of claim 11, further comprising a vehicle.
CN202080028282.XA 2019-04-12 2020-04-10 Exception handling for autonomous vehicles Active CN113692372B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US16/383,096 2019-04-12
US16/383,096 US11327507B2 (en) 2019-04-12 2019-04-12 Exception handling for autonomous vehicles
PCT/US2020/027662 WO2020210618A1 (en) 2019-04-12 2020-04-10 Exception handling for autonomous vehicles

Publications (2)

Publication Number Publication Date
CN113692372A true CN113692372A (en) 2021-11-23
CN113692372B CN113692372B (en) 2022-09-13


JP6939322B2 (en) * 2017-09-25 2021-09-22 トヨタ自動車株式会社 Driving support device
CN116968731A (en) * 2018-03-20 2023-10-31 御眼视觉技术有限公司 System for navigating a host vehicle and braking a host vehicle
US11124185B2 (en) * 2018-11-13 2021-09-21 Zoox, Inc. Perception collision avoidance
US11327507B2 (en) * 2019-04-12 2022-05-10 Waymo Llc Exception handling for autonomous vehicles

Also Published As

Publication number Publication date
WO2020210618A8 (en) 2021-05-14
EP3934956A1 (en) 2022-01-12
EP3934956A4 (en) 2022-12-07
US20220229445A1 (en) 2022-07-21
CN113692372B (en) 2022-09-13
JP2022526376A (en) 2022-05-24
US11327507B2 (en) 2022-05-10
US20230305577A1 (en) 2023-09-28
US11709503B2 (en) 2023-07-25
JP7300516B2 (en) 2023-06-29
US20200363818A1 (en) 2020-11-19
WO2020210618A1 (en) 2020-10-15

Similar Documents

Publication Publication Date Title
CN113692372B (en) Exception handling for autonomous vehicles
CN111132884B (en) Method and system for stopping a vehicle
US11760354B2 (en) Multi-way stop intersection precedence for autonomous vehicles
EP3720750B1 (en) Method and system for maneuvering a vehicle
US10967861B2 (en) Using discomfort for speed planning in responding to tailgating vehicles for autonomous vehicles
JP7337155B2 (en) A system for implementing fallback behavior for autonomous vehicles
CN113924241B (en) Tracking vanishing object for autonomous vehicle
CN113508056A (en) Signaling for turns of autonomous vehicles
CN112424047B (en) Using discomfort for speed planning of autonomous vehicles
CN115593429A (en) Response of autonomous vehicle to emergency vehicle
US20230047336A1 (en) Time gaps for autonomous vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant